WO2020003887A1 - External-appearance inspection system, method for displaying external-appearance inspection result, and program for displaying external-appearance inspection result - Google Patents

External-appearance inspection system, method for displaying external-appearance inspection result, and program for displaying external-appearance inspection result

Info

Publication number
WO2020003887A1
Authority
WO
WIPO (PCT)
Prior art keywords
inspection
inspection target
display
imaging
inspection result
Prior art date
Application number
PCT/JP2019/021682
Other languages
French (fr)
Japanese (ja)
Inventor
加藤 豊
正広 高山
佑二 山内
伸悟 稲積
Original Assignee
オムロン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オムロン株式会社
Publication of WO2020003887A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination

Definitions

  • the present disclosure relates to an appearance inspection system that inspects an inspection object using a captured image, a method for displaying an appearance inspection result, and a display program for an appearance inspection result.
  • Patent Document 1 discloses an appearance inspection system for inspecting the quality of a substrate.
  • the appearance inspection system displays an inspection result of the board on a two-dimensional map image.
  • the serial numbers of the boards are arranged on the horizontal axis, and the component information in the boards is arranged on the vertical axis.
  • cells are provided for each combination of the serial number of the board and the component information in the board, and the inspection result of each component on each board is displayed on the corresponding cell.
  • In Patent Document 1, the appearance of a substrate is inspected by a fixed imaging device imaging the substrate as it is conveyed on a production line.
  • Because the imaging device is fixed, the number of places on the substrate that can be imaged is limited, so the number of images used for inspection is relatively small.
  • An object in one aspect is to provide an appearance inspection system capable of supporting the work of checking the results of an appearance inspection performed by imaging an inspection target from a plurality of viewpoints while moving an imaging device. An object in another aspect is to provide a display method capable of supporting such checking work. A further object is to provide a display program capable of supporting such checking work.
  • An appearance inspection system according to one aspect includes: an inspection unit configured to inspect each of a plurality of inspection parts of an inspection target for the presence or absence of a defect, based on each image obtained from the imaging device by imaging each of the plurality of inspection parts; and
  • a display control unit configured to display, on a display device, an inspection result matrix in which the inspection result of the inspection unit for each inspection part is represented for each inspection target.
  • According to this configuration, the inspection result of each inspection part of each inspection target is represented in two dimensions, so the user can immediately grasp which inspection part of which inspection target has a defect.
  • Such a two-dimensional display of inspection results becomes increasingly effective as the number of inspection results grows.
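The patent describes this matrix only in functional terms; as a minimal sketch (the class and method names below are illustrative assumptions, not taken from the patent), the inspection result matrix can be modeled as a two-dimensional mapping from (inspection target, inspection part) to a defect result:

```python
# Illustrative model of the inspection result matrix: rows are inspection
# targets (works), columns are inspection parts, and each cell holds the
# defect inspection result for that combination.
from typing import Dict, List, Tuple


class InspectionResultMatrix:
    def __init__(self, works: List[str], parts: List[str]):
        self.works = works   # row labels, e.g. work serial numbers
        self.parts = parts   # column labels, e.g. inspection part names
        # cell key = (work, part); value True means "with defect"
        self.cells: Dict[Tuple[str, str], bool] = {}

    def set_result(self, work: str, part: str, defective: bool) -> None:
        self.cells[(work, part)] = defective

    def defects(self) -> List[Tuple[str, str]]:
        """All (work, part) combinations whose inspection found a defect."""
        return [key for key, defective in self.cells.items() if defective]


m = InspectionResultMatrix(["001A", "001B"], ["A1", "A2"])
m.set_result("001A", "A1", False)
m.set_result("001A", "A2", True)
m.set_result("001B", "A1", False)
m.set_result("001B", "A2", False)
print(m.defects())  # → [('001A', 'A2')]
```

Keying cells by (work, part) mirrors the cell-per-combination layout described above; a real implementation would additionally associate each cell with the images and imaging conditions used for that inspection.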
  • each row or each column of the inspection result matrix is grouped in row units or column units according to a predetermined classification rule.
  • When the display control unit receives an aggregation instruction for a grouped set of rows or columns, it displays the inspection results of that group collectively in one row or one column; when it receives an expansion instruction for the collectively displayed inspection results, it returns their display to the state before the aggregation.
  • According to this configuration, the user can check the inspection results efficiently by switching between the aggregated display and the expanded display as needed.
  • the display control unit displays the inspection result indicating a defect among the inspection results included in the inspection result matrix in a display mode different from other inspection results.
  • According to this configuration, an inspection result indicating a defect is displayed in a display mode different from the other inspection results, so the user can immediately identify it and can more easily grasp which inspection part of which inspection target has the defect.
  • The appearance inspection system further includes an operation unit that accepts a selection operation of selecting one or more inspection results from the inspection results included in the inspection result matrix, thereby allowing one or more inspection targets and one or more inspection parts to be selected.
  • Based on the operation unit receiving the selection operation, the inspection unit or the display control unit performs processing relating to at least one of the selected inspection targets and the selected inspection parts.
  • the display control unit displays at least one of an image used for the inspection of the selected inspection part and an imaging condition of the image on the display device.
  • the user can easily confirm the imaging condition of an arbitrary inspection portion indicated in the inspection result matrix.
  • the visual inspection system further includes a storage device for storing a three-dimensional model representing a shape of the inspection object.
  • the plurality of inspection parts are set in advance for the three-dimensional model.
  • the display control unit displays the three-dimensional model on the display device, and displays an inspection result of the selected inspection part in a corresponding part on the three-dimensional model.
  • the user can check the inspection result of an arbitrary inspection part on the three-dimensional model, and can easily check which part of the inspection target has a defect.
  • the display control unit displays a statistical result of the plurality of inspection results on the display device.
  • the user can easily analyze the inspection result for an arbitrary inspection portion.
  • the inspection unit re-inspects the selected inspection part based on the image used for the inspection of the selected inspection part.
  • the user can easily execute the reexamination of an arbitrary inspection portion.
  • The display control unit displays, on the display device, a comparison result between the inspection result matrix and an expected value matrix that indicates the expected correct value for each inspection result included in the inspection result matrix.
  • According to this configuration, the user can easily determine whether each inspection result shown in the inspection result matrix is as expected. Such comparison processing becomes increasingly effective as the number of inspection results shown in the inspection result matrix grows.
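A sketch of the comparison described above, under the assumption that both matrices are stored as mappings from (work, part) to a boolean defect flag (the function name and data layout are illustrative, not from the patent):

```python
# Compare an inspection result matrix against an expected value matrix and
# report the cells whose inspection result differs from the expected value.
def compare(results, expected):
    """Both arguments map (work, part) -> bool (True = "with defect")."""
    return sorted(key for key in expected if results.get(key) != expected[key])


results = {("001A", "A1"): False, ("001A", "A2"): True}
expected = {("001A", "A1"): False, ("001A", "A2"): False}
print(compare(results, expected))  # → [('001A', 'A2')]
```

A display layer could then highlight only these mismatching cells, which is what makes the comparison useful as the matrix grows large.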
  • A method according to another aspect, for displaying the result of an appearance inspection performed by an imaging device imaging a plurality of inspection parts of an inspection target, includes: acquiring each image obtained by the imaging device imaging each of the plurality of inspection parts while a robot moves the imaging device; inspecting each of the plurality of inspection parts for the presence or absence of a defect based on each image obtained in the acquiring step; and displaying, on a display device, an inspection result matrix in which the inspection result for each of the plurality of inspection parts is represented for each inspection target.
  • According to this method, the inspection result of each inspection part of each inspection target is represented in two dimensions, so the user can immediately grasp which inspection part of which inspection target has a defect.
  • Such a two-dimensional display of inspection results becomes increasingly effective as the number of inspection results grows.
  • A display program according to yet another aspect, for the result of an appearance inspection performed by an imaging device imaging a plurality of inspection parts of an inspection target, causes a computer to execute: a step of acquiring each image obtained by the imaging device imaging each of the plurality of inspection parts while a robot moves the imaging device; a step of inspecting each of the plurality of inspection parts for the presence or absence of a defect based on each acquired image; and a step of displaying, on a display device, an inspection result matrix in which the inspection result for each of the plurality of inspection parts is represented for each inspection target.
  • According to this program, the inspection result of each inspection part of each inspection target is represented in two dimensions, so the user can immediately grasp which inspection part of which inspection target has a defect.
  • Such a two-dimensional display of inspection results becomes increasingly effective as the number of inspection results grows.
  • FIG. 1 is a schematic diagram showing an outline of a visual inspection system according to an embodiment.
  • FIG. 5 is a diagram showing an example of an inspection result displayed on a display device of the image processing device according to the embodiment.
  • FIG. 3 is a diagram showing a modification of the inspection result matrix shown in FIG. 2.
  • FIG. 4 is a diagram illustrating an example of a data structure of a classification rule.
  • FIG. 4 is a diagram illustrating an example of a data structure of a classification rule.
  • FIG. 11 is a diagram showing screen transition of an inspection result matrix when an expand / aggregate button is pressed.
  • FIG. 11 is a diagram showing screen transition of an inspection result matrix when an expand / aggregate button is pressed.
  • FIG. 2 is a schematic diagram illustrating an example of a hardware configuration of an image processing apparatus.
  • FIG. 2 is a schematic diagram illustrating an example of a hardware configuration of a PLC (Programmable Logic Controller).
  • FIG. 2 is a schematic diagram illustrating an example of a hardware configuration of a setting device.
  • FIG. 3 is a diagram illustrating an example of a data structure of a project file.
  • FIG. 4 is a diagram illustrating an example of a data structure of an inspection result file.
  • FIG. 6 is a flowchart illustrating an example of a processing flow in the setting device.
  • A diagram showing an example of a screen on which a schematic diagram representing the designed appearance of the work is displayed.
  • A diagram showing an example of a screen on which the inspection target parts are displayed.
  • A diagram showing an example of points in an inspection target area.
  • FIG. 4 is a diagram illustrating an example of a method of determining a work shooting position by a setting device.
  • FIG. 9 is a diagram illustrating an example of a shooting path determined for a shooting position.
  • 5 is a flowchart illustrating an example of a processing flow in a PLC.
  • FIG. 9 is a flowchart illustrating an example of the flow of the inspection process performed by the image processing apparatus.
  • A flowchart showing an example of the flow of the display process of the inspection result matrix.
  • FIG. 14 is a diagram illustrating a specific example 1 of a process executed in response to a cell selection operation.
  • FIG. 19 is a diagram illustrating a specific example 2 of a process performed in response to a cell selection operation.
  • FIG. 21 is a diagram illustrating a specific example 3 of a process executed in response to a cell selection operation.
  • FIG. 15 is a diagram illustrating a specific example 4 of a process executed in response to a cell selection operation.
  • FIG. 21 is a diagram illustrating a specific example 5 of a process performed in response to a cell selection operation.
  • FIG. 4 is a diagram illustrating a modification of the inspection result matrix illustrated in FIG. 3.
  • A diagram schematically showing the comparison process between an inspection result matrix and an expected value matrix.
  • A diagram showing an example of the creation process of an expected value matrix.
  • A flowchart showing an example of the flow of the three-dimensional display process of an inspection result.
  • A diagram showing specific example 1 of a process performed in response to a selection operation of an inspection target part shown on a three-dimensional model.
  • A diagram showing specific example 2 of a process performed in response to a selection operation of an inspection target part shown on a three-dimensional model.
  • FIG. 11 is a diagram showing still another mode in which the relative position between the work and the imaging device is changed.
  • FIG. 1 is a schematic diagram showing an outline of a visual inspection system 1 according to the present embodiment.
  • Appearance inspection system 1 images, for example, a plurality of inspection target parts on an inspection target (hereinafter also referred to as “work W”) placed on stage 90 in a production line for industrial products or the like, and inspects the appearance of the work W using the obtained images. In the appearance inspection, the work W is inspected for scratches, dirt, the presence or absence of foreign matter, dimensions, and the like.
  • the next work W is transported onto the stage 90.
  • the work W is placed at a predetermined position on the stage 90 in a predetermined posture.
  • the visual inspection system 1 includes an imaging device 10, an image processing device 20, a robot 30, a robot controller 40, a PLC 50, and a setting device 60.
  • the imaging device 10 is for imaging a subject existing in an imaging field of view and generating image data in accordance with a command from the image processing device 20, and captures a workpiece W to be subjected to a visual inspection as a subject.
  • the image processing device 20 outputs an imaging command to the imaging device 10 in accordance with the command from the PLC 50.
  • the image processing device 20 includes an inspection unit 21, a display control unit 22, and a display device 23.
  • the inspection unit 21 and the display control unit 22 are functional modules executed by the processor 110 (see FIG. 8) of the image processing device 20.
  • The inspection unit 21 inspects the appearance of the work W by performing predetermined processing on the image data generated by the imaging device 10.
  • the display control unit 22 causes the display device 23 to display the determination result by the inspection unit 21.
  • The inspection result may be output to the display device 23 provided in the image processing device 20, or to a display device (for example, a display 366 described later) provided in the setting device 60.
  • the robot 30 is, for example, a vertical articulated robot in which a plurality of arms 32 are connected on a base 31. Each connecting portion of the plurality of arms 32 includes a rotation shaft.
  • the imaging device 10 is attached to the distal end of the distal arm 32a.
  • the robot controller 40 controls the robot 30 according to a command from the PLC 50 to change the relative position between the work W and the imaging device 10 and the posture of the imaging device 10 with respect to the work W.
  • The robot 30 can change the relative position between the work W and the imaging device 10, and the posture of the imaging device 10 with respect to the work W, by changing the position and posture of the imaging device 10 with respect to the stage 90.
  • Specifically, the robot 30 moves the imaging device 10 using a coordinate system having a point on the stage 90 as its origin, whereby the relative position between the work W and the imaging device 10 and the posture of the imaging device 10 with respect to the work W can be changed.
  • the PLC 50 controls the robot controller 40 and the image processing device 20 so that the imaging device 10 sequentially captures a plurality of inspection target portions on the work W.
  • the PLC 50 controls the robot controller 40 according to a path that satisfies the imaging conditions set by the setting device 60. Further, the PLC 50 controls the image processing device 20 so that the imaging device 10 outputs an imaging command at a timing that satisfies the designated imaging condition.
  • the setting device 60 sets a path that satisfies imaging conditions including a relative position between the work W and the imaging device 10 for sequentially imaging a plurality of inspection target portions on the work W.
  • the setting device 60 sets a path that satisfies the imaging conditions suitable for the work W when a new product or a new type of work W needs to be visually inspected.
  • FIG. 2 is a diagram illustrating an example of an inspection result displayed on the display device 23 of the image processing device 20.
  • The inspection unit 21 of the image processing apparatus 20 checks each inspection target part of the work W for the presence or absence of a defect by performing predetermined image processing on each image obtained by photographing the work W from a plurality of directions.
  • the inspection result is represented by, for example, one of “with defect” and “without defect”.
  • the inspection process by the inspection unit 21 is executed each time the stage 90 transports the workpiece W to be inspected to a predetermined position.
  • the inspection result by the inspection unit 21 is output to the display control unit 22 after being associated with the identification information of the work W and the inspection target portion of the work W.
  • the display control unit 22 displays on the display device 23 an inspection result matrix 25 in which the inspection result of each inspection target portion of the work W is represented for each work W.
  • On the horizontal axis of the inspection result matrix 25, the inspection target parts of the work are arranged by part name. Each part name may be registered in advance or set by the user.
  • On the vertical axis of the inspection result matrix 25, identification information of the inspected works is arranged.
  • the work identification information is represented by, for example, a work serial number (hereinafter, also referred to as “work No.”) or a work name.
  • the work identification information is represented by work No.
  • the display control unit 22 displays an inspection result of each inspection target portion of each work W on a corresponding cell.
  • In this way, the inspection result of each inspection target part of each work is represented two-dimensionally, so the user can immediately grasp which part of which work has a defect.
  • Such a two-dimensional display of inspection results becomes increasingly effective as the number of inspection results grows.
  • the display control unit 22 highlights an inspection result indicating a defect among the inspection results included in the inspection result matrix 25 in a display mode different from other inspection results.
  • the cell of the inspection result indicating a defect may be represented by a specific color (for example, red) or may be represented by a blinking display.
  • Cells indicating “with defect” are hatched, and cells indicating “no defect” are not hatched. Since an inspection result indicating a defect is highlighted in a display mode different from the other inspection results, the user can easily identify the inspection results indicating defects and can more easily grasp which part of which work has a defect.
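As a rough illustration of this highlighting (purely a sketch: a console rendering with an “X” marker standing in for the red, blinking, or hatched cell described above; the function name and layout are assumptions):

```python
# Render the inspection result matrix as text, marking defective cells with
# "X" and non-defective cells with "." so that defects stand out at a glance.
def render(works, parts, cells):
    """cells maps (work, part) -> bool (True = "with defect")."""
    lines = ["      " + " ".join(f"{p:>3}" for p in parts)]  # column header
    for w in works:
        row = " ".join(
            f"{('X' if cells.get((w, p)) else '.'):>3}" for p in parts
        )
        lines.append(f"{w:>5} {row}")
    return "\n".join(lines)


print(render(["001A", "001B"], ["A1", "A2"], {("001A", "A2"): True}))
```

A GUI implementation would instead set the cell color or blink state, but the lookup per (work, part) cell is the same.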
  • The inspection target parts of the work are arranged on the horizontal axis of the inspection result matrix 25, and the work numbers are arranged on its vertical axis. However, the work numbers may instead be arranged on the horizontal axis of the inspection result matrix 25, and the inspection target parts of the work on its vertical axis.
  • FIG. 3 is a diagram showing a modification of the inspection result matrix 25 shown in FIG. 2.
  • The inspection result matrix 25 shown in FIG. 3 differs from the inspection result matrix 25 shown in FIG. 2 in that its rows and columns are grouped in row units or column units according to predetermined classification rules.
  • In FIG. 3, the work parts “A1” to “A5” are grouped as group GV1, and the work parts “B1” to “B3” are grouped as group GV2.
  • the grouping of the inspection target portion of the work is performed according to the classification rule 134A shown in FIG.
  • FIG. 4 is a diagram showing an example of the data structure of the classification rule 134A.
  • In the classification rule 134A, the inspection target parts of the work W are hierarchically associated with one another.
  • the correspondence between the parts may be registered in advance or may be arbitrarily set by the user.
  • An inspection target part defined in a higher hierarchy includes the inspection target parts defined in the hierarchy below it.
  • For example, the inspection target part “A” includes the parts “A1” to “A5”. That is, the parts “A1” to “A5” are areas within the inspection target part “A”.
  • FIG. 5 is a diagram showing an example of the data structure of the classification rule 134B.
  • each work to be inspected is hierarchically associated with the classification rule 134B.
  • the hierarchical relationship of each work may be registered in advance or may be arbitrarily set by the user.
  • each work is associated with a lot number of a production line.
  • A lot number defined in the upper hierarchy includes the works defined in the lower hierarchy.
  • For example, the lot number “001” includes the work numbers “001A” to “001E”. In other words, the works numbered “001A” to “001E” were produced in lot “001”.
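A sketch of how the two hierarchical classification rules could be represented. The dictionary contents echo the examples above; the names `CLASSIFICATION_RULE_PARTS`, `CLASSIFICATION_RULE_WORKS`, and the helper function are assumptions, not identifiers from the patent:

```python
# Hypothetical counterpart of classification rule 134A: an upper-level
# inspection target part groups the lower-level parts it contains.
CLASSIFICATION_RULE_PARTS = {
    "A": ["A1", "A2", "A3", "A4", "A5"],
    "B": ["B1", "B2", "B3"],
}

# Hypothetical counterpart of classification rule 134B: a lot number groups
# the works produced in that lot.
CLASSIFICATION_RULE_WORKS = {
    "001": ["001A", "001B", "001C", "001D", "001E"],
}


def group_of(rule, member):
    """Find the upper-level group that contains a lower-level entry."""
    for parent, children in rule.items():
        if member in children:
            return parent
    return None


print(group_of(CLASSIFICATION_RULE_PARTS, "A3"))    # → A
print(group_of(CLASSIFICATION_RULE_WORKS, "001D"))  # → 001
```

These mappings are what would drive the row/column grouping (GV1, GV2, GH1, GH2) of the inspection result matrix described next.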
  • An expand/aggregate button is assigned to each group of the inspection result matrix 25 in the vertical axis direction and the horizontal axis direction.
  • The expand/aggregate button BH1 is assigned to the group GH1.
  • The expand/aggregate button BH2 is assigned to the group GH2.
  • The expand/aggregate button BV1 is assigned to the group GV1.
  • The expand/aggregate button BV2 is assigned to the group GV2.
  • FIG. 6 is a diagram showing the screen transition of the inspection result matrix 25 when the expand/aggregate button BV2 is pressed.
  • The expand/aggregate button BV2 alternately receives an aggregation instruction and an expansion instruction for the cells in the group GV2 each time it is pressed. That is, when the expand/aggregate button BV2 is pressed while the cells of the group GV2 are expanded, the display control unit 22 of the image processing device 20 aggregates the cells of the group GV2 into one column.
  • On the other hand, when the expand/aggregate button BV2 is pressed while the cells of the group GV2 are aggregated, the display control unit 22 returns the display of the cells of the group GV2 to the state before the aggregation.
  • When aggregating, the display control unit 22 gives priority to displaying the inspection result “with defect” over the inspection result “no defect”.
  • the inspection result within the broken line AR1 includes one inspection result indicating “defective” and two inspection results indicating “no defect”.
  • the three inspection results in the broken line AR1 are aggregated into one inspection result shown in the broken line AR2.
  • the display control unit 22 expresses the inspection result after aggregation as “defective”.
  • That is, when at least one inspection result indicating “with defect” is included in the inspection results to be aggregated, the display control unit 22 sets the aggregated inspection result to “with defect”.
  • When no inspection result indicating “with defect” is included, the display control unit 22 sets the aggregated inspection result to “no defect”.
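The aggregation rule above reduces to a logical OR over the group: the collapsed cell is “with defect” exactly when at least one member of the group is. A one-line sketch (the function name is illustrative):

```python
# Aggregate one group of cells: the collapsed cell shows "with defect" as
# soon as at least one member of the group is defective, "no defect" otherwise.
def aggregate(results):
    """results: iterable of bool (True = "with defect") for one group."""
    return any(results)


print(aggregate([True, False, False]))   # → True  (shown as "with defect")
print(aggregate([False, False, False]))  # → False (shown as "no defect")
```

This choice errs on the side of never hiding a defect behind a collapsed group, which matches the priority rule stated above.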
  • FIG. 7 is a diagram showing the screen transition of the inspection result matrix 25 when the expand/aggregate button BH2 is pressed.
  • the expansion / aggregation button BH2 alternately receives an aggregation instruction and an expansion instruction for cells in the group GH2 each time the button is pressed. That is, when the expansion / aggregation button BH2 is pressed while the cells of the group GH2 are expanded, the display control unit 22 aggregates the cells of the group GH2 into one row. On the other hand, when the expansion / aggregation button BH2 is pressed in a state where the cells of the group GH2 are aggregated, the display control unit 22 returns the display of the cells of the group GH2 to the state before the aggregation.
  • the inspection results within the broken line AR3 include one inspection result indicating “defect” and nine inspection results indicating “no defect”.
  • the ten inspection results in the broken line AR3 are aggregated into one inspection result shown in the broken line AR4.
  • the display control unit 22 expresses the inspection result after aggregation as “defective”.
  • In this way, when the display control unit 22 of the image processing device 20 receives an aggregation instruction for a group of rows or a group of columns, it displays the inspection results of that row group or column group collectively in one row or one column.
  • When the display control unit 22 receives an instruction to expand inspection results that are displayed collectively, it returns their display to the state before the aggregation.
  • Note that the display control unit 22 may display the aggregated cell in a darker shade as the number of inspection results indicating “with defect” among the cells to be aggregated increases.
  • FIG. 8 is a schematic diagram illustrating an example of a hardware configuration of the image processing apparatus 20.
  • Image processing device 20 typically has a structure conforming to a general-purpose computer architecture, and realizes various kinds of processing, such as the appearance inspection process, by the processor executing programs installed in advance.
  • The image processing apparatus 20 includes a processor 110 such as a CPU (Central Processing Unit) or an MPU (Micro-Processing Unit), a RAM (Random Access Memory) 112, a display controller 114, a system controller 116, an input/output (I/O) controller 118, a storage device 120 such as a hard disk, a camera interface 122, an input interface 124, a controller interface 126, a communication interface 128, and a memory card interface 130. These units are connected to one another so as to be capable of data communication, with the system controller 116 at the center.
  • The processor 110 exchanges programs (code) with the system controller 116 and executes them in a predetermined order, thereby realizing the intended arithmetic processing.
  • The system controller 116 is connected to the processor 110, the RAM 112, the display controller 114, and the I/O controller 118 via buses, exchanges data with each of these units, and governs the processing of the image processing apparatus 20 as a whole.
  • The RAM 112 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory), and stores programs read from the storage device 120, camera images (image data) acquired by the imaging device 10, processing results for those camera images, work data, and the like.
  • the display controller 114 is connected to the display device 23, and outputs a signal for displaying various information to the display device 23 according to an internal command from the system controller 116.
  • the I / O controller 118 controls data exchange between a recording medium connected to the image processing apparatus 20 and an external device. More specifically, I / O controller 118 is connected to storage device 120, camera interface 122, input interface 124, controller interface 126, communication interface 128, and memory card interface 130.
  • the storage device 120 stores various data such as a project file 144 in addition to the image processing program 142 and the display program 143 executed by the processor 110. Details of the project file 144 will be described later.
  • The storage device 120 may be a nonvolatile magnetic storage device such as a hard disk, a semiconductor storage device such as a flash memory, or an optical storage device such as a DVD-RAM (Digital Versatile Disk Random Access Memory).
  • the image processing program 142 and the display program 143 may be provided by being incorporated in a part of another program. In that case, the image processing program 142 itself and the display program 143 itself execute a predetermined process in cooperation with another program. That is, the image processing program 142 and the display program 143 may be incorporated in such another program. Alternatively, some or all of the functions provided by executing the image processing program 142 or the display program 143 may be implemented as a dedicated hardware circuit.
  • the camera interface 122 corresponds to an input unit that receives image data generated by photographing the work W, and mediates data transmission between the imaging device 10 and the processor 110. More specifically, the camera interface 122 can be connected to one or more imaging devices 10, and a shooting instruction is output from the processor 110 to the imaging device 10 via the camera interface 122. Thus, the imaging device 10 captures an image of a subject and outputs the generated image to the processor 110 via the camera interface 122.
  • the input interface 124 mediates data transmission between the processor 110 and input devices such as a keyboard 134, a mouse, a touch panel, and a dedicated console.
  • the controller interface 126 mediates data transmission between the PLC 50 and the processor 110. More specifically, the controller interface 126 transmits information on the state of the production line controlled by the PLC 50, information on the work W, and the like to the processor 110.
  • the communication interface 128 mediates data transmission between the processor 110 and another personal computer or server device (not shown).
• the communication interface 128 typically uses Ethernet (registered trademark), USB (Universal Serial Bus), or the like.
  • the memory card interface 130 mediates data transmission between the processor 110 and the memory card 136 as a recording medium.
• the memory card 136 is distributed with the image processing program 142 and the display program 143 executed by the image processing apparatus 20 stored therein, and the memory card interface 130 reads these programs from the memory card 136.
• the memory card 136 may be a general-purpose semiconductor storage device such as an SD (Secure Digital) card, a magnetic recording medium such as a flexible disk, or an optical recording medium such as a CD-ROM (Compact Disc Read Only Memory).
  • a program downloaded from a distribution server or the like may be installed in the image processing apparatus 20 via the communication interface 128.
  • FIG. 9 is a schematic diagram illustrating an example of a hardware configuration of the PLC 50.
• the PLC 50 includes a chipset 212, a processor 214, a nonvolatile memory 216, a main memory 218, a system clock 220, a memory card interface 222, a communication interface 228, an internal bus controller 230, and a field bus controller 238.
  • the chipset 212 and other components are respectively connected via various buses.
  • the processor 214 and the chipset 212 typically have a configuration according to a general-purpose computer architecture. That is, the processor 214 interprets and executes the instruction codes sequentially supplied from the chipset 212 according to the internal clock.
  • the chipset 212 exchanges internal data with various connected components and generates an instruction code necessary for the processor 214.
  • the system clock 220 generates a system clock having a predetermined cycle and outputs the generated system clock to the processor 214.
  • the chipset 212 has a function of caching data and the like obtained as a result of execution of arithmetic processing by the processor 214.
  • the PLC 50 has a nonvolatile memory 216 and a main memory 218 as storage means.
  • the non-volatile memory 216 stores an OS, a system program, a user program, log information, and the like in a non-volatile manner.
  • the main memory 218 is a volatile storage area that holds various programs to be executed by the processor 214 and is also used as a work memory when executing various programs.
  • the PLC 50 has a communication interface 228, an internal bus controller 230, and a field bus controller 238 as communication means. These communication circuits transmit and receive data.
  • the communication interface 228 exchanges data with the image processing device 20 and the setting device 60.
  • the PLC 50 outputs an imaging instruction to the image processing device 20 via the communication interface 228.
  • the PLC 50 receives the result of the appearance inspection of the work W from the image processing device 20 via the communication interface 228.
• the internal bus controller 230 controls data exchange via the internal bus 226. More specifically, the internal bus controller 230 includes a DMA (Direct Memory Access) control circuit 232, an internal bus control circuit 234, and a buffer memory 236.
  • the memory card interface 222 connects the memory card 224 detachable to the PLC 50 and the processor 214.
  • the fieldbus controller 238 is a communication interface for connecting to a field network.
  • the PLC 50 is connected to the robot controller 40 via the field bus controller 238.
• as the field network, for example, EtherCAT (registered trademark), EtherNet/IP (registered trademark), CompoNet (registered trademark), or the like is employed.
  • FIG. 10 is a schematic diagram illustrating an example of a hardware configuration of the setting device 60.
  • the setting device 60 includes a processor 362, a main memory 363, a storage device 364, a display 366, an input device 367, and a communication interface 368. These units are connected to each other via a bus 361 so as to be able to perform data communication.
• the processor 362 loads programs (codes) including the setting program 365 installed in the storage device 364 into the main memory 363 and executes them in a predetermined order, thereby performing various operations.
  • the main memory 363 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory).
  • the storage device 364 is an internal memory included in the setting device 60 and is a nonvolatile storage device, and stores various programs such as the setting program 365. Note that the storage device 364 may be a hard disk or a semiconductor storage device such as a flash memory.
  • the setting program 365 is a program showing a procedure for setting a change path of the imaging condition by the setting device 60.
  • Various programs such as the setting program 365 need not be stored in the storage device 364, but may be stored in a server that can communicate with the setting device 60 or an external memory that can be directly connected to the setting device 60.
• the programs may be distributed in a state where the various programs executed by the setting device 60 and the various parameters used in those programs are stored in the external memory, and the setting device 60 reads the various programs and parameters from the external memory.
• the external memory records information such as the programs by an electric, magnetic, optical, mechanical, or chemical action so that a computer or other device or machine can read the recorded information.
  • programs and parameters downloaded from a server or the like communicably connected to the setting device 60 may be installed in the setting device 60.
  • the display 366 is, for example, a liquid crystal display.
  • the input device 367 includes, for example, a mouse, a keyboard, a touch pad, and the like.
  • the communication interface 368 exchanges various data between the PLC 50 and the processor 362. Note that the communication interface 368 may exchange data between the server and the processor 362. Communication interface 368 includes hardware corresponding to a network for exchanging various data with PLC 50.
  • FIG. 11 is a diagram showing an example of the data structure of the project file 144.
• the project file 144 includes an imaging condition file 146A, an image file group 146B, an inspection condition file 146C, an inspection result file 146D, a work information file 146E, a production information file 146F, a classification rule file 146G, and an expected value file 146H.
  • the imaging condition file 146A is data that defines imaging conditions for each inspection target portion of the work W. That is, by referring to the imaging condition file 146A, the image processing apparatus 20 can uniquely specify the imaging condition using the inspection target portion as a key. Alternatively, the imaging condition may be uniquely specified using a combination of the work No. and the inspection target portion as a key.
  • the imaging conditions include, for example, the relative position between the work W and the imaging device 10, the position of illumination when the work W is imaged, and the like.
  • the image file group 146B is a data group obtained from the imaging device 10 by imaging the work W according to the defined imaging conditions.
  • Each image file included in the image file group 146B is associated with a work number and an inspection target portion. That is, the image processing apparatus 20 can uniquely specify the image file used for the inspection using the combination of the work number and the inspection target portion as a key. Information on the combination of the work No. and the inspection target portion is specified, for example, in the file name of the image file or the header of the image file.
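As a rough illustration of the key described above, the combination of work No. and inspection target portion might be encoded in the image file name. The `w{no}_{part}.png` pattern below is purely an assumption for the sketch; the text only states that the key appears in the file name or in the file header.

```python
import re

def parse_key(filename):
    """Recover the (work No., inspection target portion) key from a file name.
    The naming pattern is hypothetical."""
    m = re.fullmatch(r"w(\d+)_(.+)\.png", filename)
    if m is None:
        raise ValueError("file name does not encode a key: " + filename)
    return int(m.group(1)), m.group(2)

print(parse_key("w12_side-seam.png"))  # -> (12, 'side-seam')
```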
• the inspection condition file 146C is data defining the inspection conditions for each inspection target portion of the work. That is, by referring to the inspection condition file 146C, the image processing apparatus 20 can uniquely specify the inspection condition using the inspection target portion as a key.
  • the inspection conditions include, for example, an inspection target portion in an image, an image processing flowchart to be executed, a measurement parameter read at the time of execution of the image processing, a threshold for determining the presence or absence of a defect, and the like.
  • the inspection result file 146D is data that defines measurement values, inspection results, and the like for each inspection target portion of each work. Details of the data structure of the inspection result file 146D will be described later.
  • the work information file 146E includes a three-dimensional model representing the shape of the work to be inspected, type information of the work to be inspected, and the like.
  • the production information file 146F is data for defining a lot number, a serial number, and the like of a work to be inspected.
• the classification rule file 146G is data including at least one of the above-described classification rule 134A (see FIG. 4), the above-described classification rule 134B (see FIG. 5), and the below-described classification rule 134C (see FIG. 30).
  • the expected value file 146H is data that specifies the expected value of the inspection result for each inspection target portion of each inspection target work. Details of the expected value file 146H will be described later.
• <Inspection result file 146D> The inspection result file 146D included in the project file 144 (see FIG. 11) will be described with reference to FIG. 12.
  • FIG. 12 is a diagram showing an example of the data structure of the inspection result file 146D.
• the inspection result file 146D includes identification information of the work (for example, work No.), a part name indicating a part to be inspected of the work, a measurement value obtained as an output result of the inspection processing, and an inspection result indicating the presence or absence of a defect.
  • the image processing apparatus 20 can uniquely specify the measurement value and the inspection result by using the combination of the work number and the inspection target portion as a key by referring to the inspection result file 146D.
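The key-based lookup described above can be sketched as an in-memory table keyed by the combination of work No. and inspection target portion. The field names and values are illustrative; the patent does not fix a concrete file format.

```python
# Hypothetical in-memory view of the inspection result file 146D:
# (work No., inspection target portion) -> (measurement value, defect present?)
inspection_results = {
    (1, "top-left-corner"): (0.12, False),
    (1, "side-seam"): (0.87, True),
    (2, "top-left-corner"): (0.10, False),
}

def lookup_result(work_no, part_name):
    """Uniquely specify the measurement value and inspection result
    using (work No., inspection target portion) as a key."""
    return inspection_results[(work_no, part_name)]

print(lookup_result(1, "side-seam"))  # -> (0.87, True)
```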
  • FIG. 13 is a flowchart illustrating an example of a processing flow in the setting device 60.
• by performing processing according to, for example, the flowchart shown in FIG. 13, the setting device 60 sets a change path of the imaging condition suitable for the workpiece W to be inspected.
  • the three-dimensional design data indicating the design surface of the new or new type of workpiece W is stored in the storage device 364 of the setting device 60 in advance.
• in step S1, the processor 362 of the setting device 60 reads the three-dimensional design data from the storage device 364.
• in step S2, the processor 362 displays a schematic diagram showing the design appearance of the work W indicated by the three-dimensional design data on the display 366 of the setting device 60, and determines the inspection target area on the work W according to user input.
• at this time, the processor 362 converts the coordinate system of the three-dimensional design data into an XYZ coordinate system having a point on the stage 90 as the origin. Accordingly, the inspection target area is expressed in the XYZ coordinate system having a point on the stage 90 as the origin.
• in step S3, the processor 362 determines a plurality of inspection target positions from within the inspection target region so as to satisfy the inspection requirement corresponding to the inspection target region. That is, the processor 362 sets each of the plurality of inspection target positions determined in the inspection target region as an inspection target portion.
• in step S4, the processor 362 determines an imaging condition including a relative position between the workpiece W and the imaging device 10 at the time of inspection for each of the plurality of inspection target positions.
• in step S5, the processor 362 determines an imaging route so as to pass through the relative positions between the workpiece W and the imaging device 10 determined in step S4.
• the inspection target area determined in step S2, the inspection target positions determined in step S3, the imaging conditions determined in step S4, and the imaging path determined in step S5 are transmitted to the image processing device 20 and the PLC 50.
• in the above description, a plurality of inspection target positions are determined from one inspection target region in step S3.
• alternatively, a plurality of inspection target portions may be determined in step S2, and at least one inspection target position may be determined from each of the plurality of inspection target portions in step S3. In this case as well, a plurality of inspection target positions are determined in step S3.
  • FIG. 14 is a diagram illustrating an example of a screen on which a schematic diagram illustrating a design appearance of the work W is displayed.
  • FIG. 15 is a diagram illustrating an example of a screen on which an inspection target portion is displayed.
  • the setting device 60 causes the display 366 to display a screen 61a including the three-dimensional model ML representing the shape of the work W.
  • the screen 61a includes a tool button 71 for rotating the three-dimensional model ML about a vertical direction and a tool button 72 for rotating the three-dimensional model ML about a horizontal direction. The user can appropriately rotate the three-dimensional model ML by operating the tool buttons 71 and 72.
  • the setting device 60 receives designation of a place to be inspected from the user. Specifically, the user uses the input device 367 to click a plurality of points on the three-dimensional model ML of the work W that the user wants to inspect. In the screen 61a shown in FIG. 14, a plurality of points clicked by the user are indicated by circles 74.
• the setting device 60 cuts out an area including the plurality of circles 74 specified on the three-dimensional model ML as the inspection target area. Specifically, the setting device 60 obtains the range within a predetermined distance along the surface from the point on the surface of the work W corresponding to each designated circle 74, and cuts out the union of these ranges as the inspection target area.
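A minimal sketch of this cut-out step follows: for each clicked point, gather the surface points within a predetermined distance, and take the union of those ranges as the inspection target area. Planar Euclidean distance stands in for the distance along the surface, which is an assumption made only for illustration.

```python
def cut_out_area(surface_points, clicked_points, max_dist):
    """Union of the ranges within max_dist of each clicked point."""
    area = set()
    for c in clicked_points:
        for p in surface_points:
            if ((p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2) ** 0.5 <= max_dist:
                area.add(p)  # union of the per-click ranges
    return area

# A 5x5 grid of surface points and two clicked corners
grid = [(x, y) for x in range(5) for y in range(5)]
print(len(cut_out_area(grid, [(0, 0), (4, 4)], 1.0)))  # -> 6
```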
  • the setting device 60 further adjusts the inspection target area so that the contour is a geometric figure such as a straight line or a circle.
• in the example shown in FIG. 15, the inspection target area 75 is adjusted so that its contour is a straight line parallel to one of the ridges of the workpiece W.
  • the setting device 60 receives a fine adjustment instruction of the inspection target area 75 from the user, and finely adjusts the inspection target area 75 according to the instruction.
  • the screen 61b includes tool buttons 76 and 77 for enlarging or reducing the inspection target area 75.
  • the user uses the input device 367 to select one side of the contour of the inspection target area 75 and operates the tool buttons 76 and 77 to input an instruction to enlarge or reduce the inspection target area 75.
• alternatively, the user may input an instruction to enlarge or reduce the inspection target area 75 by dragging a point 78 on one side constituting the contour of the inspection target area 75 using a mouse included in the input device 367.
• in response to such an instruction, the setting device 60 enlarges or reduces the inspection target area 75.
  • the setting device 60 determines the inspection target area 75. In the example illustrated in FIG. 15, a region in which partial regions of four surfaces among the six surfaces of the rectangular parallelepiped work W are gathered is determined as the inspection target region 75.
  • FIG. 16 is a diagram illustrating an example of a point on the inspection target area.
  • FIG. 17 is a diagram illustrating an example of the inspection target position determined from the inspection target region and the corresponding effective visual field.
  • FIG. 18 is a diagram illustrating all inspection target portions determined from the inspection target region and the effective visual field.
• the imaging field of view FOV is a field of view of the imaging device 10 that can satisfy the inspection requirement that a defect of the minimum defect size can be recognized.
• the portion of the work W captured within the imaging field of view FOV is set as the inspection target part.
• the diameter of the imaging field of view FOV is generally represented by a × D × R using a proportionality constant a.
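The product above can be evaluated directly. Note that the interpretations of D as the minimum defect size and R as a resolution-related factor are assumptions for this sketch; the text only gives the product with the proportionality constant a.

```python
def fov_diameter(a, D, R):
    # diameter of the imaging field of view FOV = a * D * R
    # (D: assumed minimum defect size, R: assumed resolution factor)
    return a * D * R

# e.g. a = 0.1, D = 0.5 mm minimum defect size, R = 400
print(fov_diameter(0.1, 0.5, 400))  # -> 20.0
```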
  • the setting device 60 regards the inspection target area 75 as a set of points, and can investigate a three-dimensional shape near a point in the inspection target area 75 using the distribution of normal vectors. For example, based on the three-dimensional design data of the workpiece W, the setting device 60 obtains a distribution of normal vectors in a range within a distance L along a surface from a point in the inspection target area 75.
• the vicinity of the point P1 is flat. Therefore, all the normal vectors within the range of the distance L from the point P1 along the surface of the work W are the vector n2.
• Point P2 is located near a ridgeline where two surfaces intersect. Therefore, the normal vectors within the range of the distance L from the point P2 along the surface of the workpiece W include the two vectors n2 and n4.
• Point P3 is located near a vertex where three surfaces intersect. Therefore, the normal vectors within the range of the distance L from the point P3 along the surface of the workpiece W include the three vectors n1, n2, and n4.
• in this way, from the distribution of the normal vectors within the distance L along the surface from a point in the inspection target area 75, the setting device 60 can determine whether the vicinity of the point is flat, whether a ridgeline exists near the point, or whether a vertex exists near the point.
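The three cases just described can be sketched by counting the distinct normal vectors in the neighborhood: one distinct normal suggests a flat vicinity, two suggest a nearby ridgeline, and three or more suggest a nearby vertex. The exact tolerance handling below is an assumption.

```python
def classify_vicinity(normals, tol=1e-6):
    """Classify a point's neighborhood from its normal-vector distribution."""
    distinct = []
    for n in normals:
        # treat two normals as equal when their squared distance is tiny
        if not any(sum((a - b) ** 2 for a, b in zip(n, d)) < tol for d in distinct):
            distinct.append(n)
    return {1: "flat", 2: "ridgeline"}.get(len(distinct), "vertex")

# Point P1: flat region, all normals equal (vector n2)
print(classify_vicinity([(0, 0, 1), (0, 0, 1)]))             # -> flat
# Point P2: near a ridgeline (vectors n2 and n4)
print(classify_vicinity([(0, 0, 1), (1, 0, 0)]))             # -> ridgeline
# Point P3: near a vertex (vectors n1, n2, and n4)
print(classify_vicinity([(0, 0, 1), (1, 0, 0), (0, 1, 0)]))  # -> vertex
```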
• first, the setting device 60 selects at random one point from the inspection target area 75 as the inspection target position Bi.
  • the setting device 60 obtains the effective field of view FOV2i of the imaging device 10 for the inspection target position Bi as the inspection target portion.
  • the effective visual field FOV2i is a visual field that includes the inspection target position Bi and can be imaged and inspected by the imaging device 10 using one imaging condition.
  • the setting device 60 determines the effective field of view FOV2i according to the three-dimensional shape near the inspection target position Bi.
  • the setting device 60 determines the effective visual field FOV2i so that the variation of the normal vector distribution in the effective visual field FOV2i falls within a predetermined range.
• when the vicinity of the inspection target position Bi is flat, the setting device 60 determines the range within the distance a × D × R along the surface from the inspection target position Bi (that is, the imaging field of view FOV) as the effective field of view FOV2i.
• when a ridgeline or a vertex exists near the inspection target position Bi, the setting device 60 determines as the effective field of view FOV2i the range within the distance a × D × R along the surface from the inspection target position Bi, excluding the range of a certain surface portion.
  • the excluded range is the range of the surface having normal vectors other than the normal vector showing the maximum distribution amount in the normal vector distribution.
• next, the setting device 60 removes the set of points belonging to the determined effective field of view FOV2i from the set of points belonging to the inspection target area 75, and selects at random one point from the set of remaining points as the inspection target position B(i+1).
  • the setting device 60 also determines the effective visual field FOV2 (i + 1) for the selected inspection target position B (i + 1).
• the setting device 60 repeats this processing until the set of points belonging to the inspection target area 75 becomes empty.
• the maximum possible diameter of the effective field of view FOV2i is the diameter of the imaging field of view FOV. Therefore, the inspection target positions are determined so that a defect of the minimum defect size can be recognized over the entire inspection target region 75.
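The iterative covering procedure above can be sketched as follows: pick a random point from the remaining inspection target area, form its effective field of view, remove the covered points, and repeat until the set is empty. A 1-D point set and a simple distance threshold stand in for the surface geodesics; both are assumptions for illustration.

```python
import random

def cover_area(points, radius, seed=0):
    """Greedy covering of the inspection target area by effective fields of view."""
    rng = random.Random(seed)
    remaining = set(points)
    targets = []
    while remaining:
        b = rng.choice(sorted(remaining))   # inspection target position Bi
        # effective field of view FOV2i around Bi
        fov = {p for p in remaining if abs(p - b) <= radius}
        targets.append(b)
        remaining -= fov                    # remove the covered points
    return targets

positions = cover_area(range(10), radius=2)
# every original point now lies within the radius of some target position
print(sorted(positions))
```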
  • the setting device 60 randomly extracts the inspection target position Bi from the inspection target region 75.
  • the setting device 60 may extract the inspection target position Bi from the inspection target region 75 according to a predetermined geometric condition.
  • the setting device 60 may extract the inspection target position Bi from the inspection target region 75 such that the plurality of extracted inspection target positions Bi are regularly aligned.
  • the inspection target position may be determined so as to satisfy the inspection requirement that the inspection near the ridge or the vertex is preferentially performed.
  • the inspection target position Bi may be preferentially extracted from a set of points in the inspection target area 75 where a ridgeline or a vertex exists within a predetermined distance.
  • FIG. 19 is a diagram illustrating an example of a method of determining the photographing position Ci of the work W by the setting device 60.
• the setting device 60 determines the photographing position Ci on the normal of the design appearance surface of the work W at the inspection target position Bi. More specifically, the photographing position Ci is determined as a position from which the effective field of view FOV2i corresponding to the inspection target position Bi can be imaged, separated from the inspection target position Bi by the optimal subject distance at which the inspection target position Bi is in focus.
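The geometric step just described amounts to stepping along the surface normal from Bi by the subject distance. The sketch below is vector arithmetic only; the optimal subject distance itself is taken as a given input.

```python
def photographing_position(b, normal, subject_distance):
    """Ci = Bi + subject_distance * (unit surface normal at Bi)."""
    length = sum(c * c for c in normal) ** 0.5
    unit = tuple(c / length for c in normal)  # normalize the normal vector
    return tuple(bc + subject_distance * uc for bc, uc in zip(b, unit))

# Bi on a horizontal face with an upward normal, 100 mm optimal subject distance
print(photographing_position((10.0, 5.0, 0.0), (0.0, 0.0, 2.0), 100.0))
# -> (10.0, 5.0, 100.0)
```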
  • the setting device 60 determines an imaging condition based on the determined imaging position Ci.
• that is, the setting device 60 determines an optimal imaging condition that includes the inspection target position Bi in the field of view and brings the inspection target position Bi into focus.
• the imaging conditions include, for example, the X coordinate, Y coordinate, and Z coordinate of the imaging device 10 in the XYZ coordinate system whose origin is a point on the stage 90, and the angles θx, θy, and θz that specify the direction of the optical axis of the imaging device 10.
• θx is the angle between the line obtained by projecting the optical axis of the imaging device 10 onto the XY plane and the X axis.
• θy is the angle between the line obtained by projecting the optical axis of the imaging device 10 onto the YZ plane and the Y axis.
• θz is the angle between the line obtained by projecting the optical axis of the imaging device 10 onto the ZX plane and the Z axis.
• the XYZ coordinates are parameters for specifying the relative position between the work W and the imaging device 10, and θx, θy, and θz are parameters for specifying the posture of the imaging device 10 with respect to the work W.
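The posture parameters can be computed from an optical-axis direction vector by projecting it onto each coordinate plane and measuring the angle to the corresponding axis. Using atan2 here is an implementation choice for the sketch, not something specified in the text.

```python
import math

def posture_angles(axis):
    """Angles of the optical axis projections, per the definitions above."""
    x, y, z = axis
    theta_x = math.degrees(math.atan2(y, x))  # projection onto XY plane vs X axis
    theta_y = math.degrees(math.atan2(z, y))  # projection onto YZ plane vs Y axis
    theta_z = math.degrees(math.atan2(x, z))  # projection onto ZX plane vs Z axis
    return theta_x, theta_y, theta_z

# optical axis pointing diagonally in the XY plane
print(posture_angles((1.0, 1.0, 0.0)))  # -> (45.0, 0.0, 90.0)
```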
  • FIG. 20 is a diagram illustrating an example of the photographing route determined for the photographing position Ci.
• the setting device 60 determines a photographing route so as to pass through the determined photographing positions C1 to C7. At this time, the setting device 60 determines the photographing route so as to satisfy a predetermined requirement. For example, when the predetermined requirement is that the travel time be minimized, the setting device 60 sets, as the imaging route, the route candidate that has the shortest travel time among route candidates that sequentially pass through the plurality of imaging positions. For example, the setting device 60 may calculate, for each route candidate, an evaluation value for evaluating the item indicated by the predetermined requirement (for example, travel time), and set the imaging route based on the calculated evaluation values.
• when there are a plurality of imaging position candidates, the setting device 60 may also select one of the plurality of imaging position candidates as the imaging position based on the evaluation value. In this way, a shooting path optimized to satisfy the predetermined requirement is set.
  • FIG. 20 shows the determined photographing route PS.
• the photographing path PS is determined so that the imaging device 10 sequentially passes through “shooting position C1” → “shooting position C4” → “shooting position C5” → “shooting position C6” → “shooting position C7” → “shooting position C3” → “shooting position C2”.
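The evaluation-value-based route selection can be sketched by scoring every route candidate and keeping the best one. Total travel distance stands in for travel time, 1-D coordinates stand in for the actual photographing positions, and brute-force permutation search is an assumption suitable only for a few positions.

```python
from itertools import permutations

def travel_cost(route, coords):
    """Evaluation value: total travel distance along the route candidate."""
    return sum(abs(coords[a] - coords[b]) for a, b in zip(route, route[1:]))

def best_route(coords):
    """Route candidate with the smallest evaluation value."""
    names = sorted(coords)
    return min(permutations(names), key=lambda r: travel_cost(r, coords))

# 1-D stand-in coordinates for photographing positions C1..C4
coords = {"C1": 0.0, "C2": 9.0, "C3": 4.0, "C4": 1.0}
route = best_route(coords)
print(route, travel_cost(route, coords))  # cost 9.0 for a monotone sweep
```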
  • FIG. 21 is a flowchart illustrating an example of the flow of processing in the PLC 50.
  • the processing illustrated in FIG. 21 is realized by the processor 214 of the PLC 50 executing a program. In other aspects, some or all of the processing may be performed by circuit elements or other hardware.
• the PLC 50 controls the robot controller 40 so that the imaging device 10 passes through the determined imaging path PS (see FIG. 20), and controls the image processing device 20 so that imaging is executed at each determined imaging position Ci (see FIG. 20).
• in step S10, the processor 214 determines whether or not the workpiece W placed on the stage 90 has been set at a predetermined position. If the processor 214 determines that the work W has been set at the predetermined position (YES in step S10), it switches the control to step S12. Otherwise (NO in step S10), the processor 214 executes the process of step S10 again.
• in step S12, the processor 214 sequentially outputs command values to the robot controller 40 according to the preset photographing path PS.
  • the robot controller 40 drives each axis of the robot 30 according to a command value from the PLC 50.
  • the imaging device 10 attached to the tip of the robot 30 sequentially moves along the imaging path PS.
• in step S20, the processor 214 determines whether or not the imaging device 10 has reached a preset imaging position Ci (see FIG. 20). Whether or not the imaging device 10 has reached the preset imaging position Ci is determined, for example, based on the command values output to the robot controller 40 in step S12. When determining that the imaging device 10 has reached the preset imaging position Ci (YES in step S20), the processor 214 switches the control to step S22. Otherwise (NO in step S20), the processor 214 switches the control to step S30.
• in step S22, the processor 214 outputs a photographing instruction to the image processing device 20. Thereby, the image processing device 20 executes the photographing process.
• in step S30, the processor 214 determines whether or not the imaging device 10 has reached the preset end point of the shooting path PS. Whether or not the imaging device 10 has reached the preset end point of the shooting path PS is determined, for example, based on the command values output to the robot controller 40 in step S12. If the processor 214 determines that the imaging device 10 has reached the preset end point of the shooting path PS (YES in step S30), the processor 214 returns the control to step S10. Otherwise (NO in step S30), the processor 214 returns the control to step S12.
  • FIG. 22 is a flowchart illustrating an example of the flow of an inspection process performed by the image processing apparatus 20.
  • the processing illustrated in FIG. 22 is realized by the processor 110 of the image processing device 20 executing the image processing program 142 (see FIG. 8). Note that part or all of the processing may be executed by a circuit element or other hardware.
• in step S50, the processor 110 determines whether or not a photographing instruction has been received from the PLC 50. As described above with reference to FIG. 21, the PLC 50 outputs a shooting instruction to the image processing device 20 when the imaging device 10 reaches a preset shooting position Ci. When determining that the photographing instruction has been received from the PLC 50 (YES in step S50), the processor 110 switches the control to step S52. Otherwise (NO in step S50), the processor 110 switches the control to step S60.
• in step S52, the processor 110, as the above-described inspection unit 21 (see FIG. 1), outputs a shooting instruction to the imaging device 10. The imaging device 10 executes the photographing process upon receiving the photographing instruction from the image processing device 20. Thereby, the image processing device 20 can cause the imaging device 10 to execute the imaging process at the predetermined imaging position Ci.
• in step S54, the processor 110, as the above-described inspection unit 21, performs the inspection processing by executing predetermined image processing on the image obtained from the imaging device 10.
  • the image processing to be executed is specified, for example, in the above-described inspection condition file 146C (see FIG. 11).
  • the inspection condition file 146C is data that defines inspection conditions for each inspection target portion of the work W. That is, by referring to the inspection condition file 146C, the image processing apparatus 20 can uniquely specify the inspection condition using the inspection target portion as a key.
• the processor 110 performs image processing on the obtained image according to the inspection condition acquired from the inspection condition file 146C, and obtains an inspection result.
• in step S56, the processor 110, as the above-described inspection unit 21, writes the inspection result obtained in step S54 into the inspection result file 146D (see FIG. 12).
• specifically, the processor 110 associates the identification information of the work W (for example, work No.), the part name indicating the inspection target portion of the work W, the measurement value obtained as an output result of the inspection processing, and the inspection result indicating the presence or absence of a defect with one another, and then writes them into the inspection result file 146D.
• in step S60, the processor 110 determines whether or not an instruction to end the inspection processing has been received.
• when determining that the end instruction has been received (YES in step S60), the processor 110 ends the inspection process shown in FIG. 22. Otherwise (NO in step S60), the processor 110 returns the control to step S50.
• in the above description, the inspection process is executed each time the imaging process of the work W is executed.
• alternatively, the imaging process of the work W and the inspection process on the images may be performed separately. That is, the image processing apparatus 20 may first accumulate images, and then inspect the accumulated images collectively.
  • FIG. 23 is a flowchart illustrating an example of the flow of the above-described inspection result matrix 25 display process.
  • the processing illustrated in FIG. 23 is realized by the processor 110 of the image processing device 20 executing the display program 143 (see FIG. 8). Note that part or all of the processing may be executed by a circuit element or other hardware.
• in step S70, the processor 110 determines whether or not a display operation of the inspection result matrix 25 has been received.
  • the display operation is performed on an operation unit provided in the image processing device 20.
  • the operation unit includes, for example, a keyboard 134 (see FIG. 8), a mouse, a touch panel, and the like.
• when determining that the display operation has been received (YES in step S70), the processor 110 switches the control to step S72. Otherwise (NO in step S70), the processor 110 executes the process of step S70 again.
• in step S72, the processor 110 refers to the above-described inspection result file 146D (see FIG. 12) to acquire the work Nos. defined in the inspection result file 146D and the inspection target portions defined in the inspection result file 146D.
  • In step S74, the processor 110, as the display control unit 22 (see FIG. 1), configures the vertical axis of the inspection result matrix 25 according to the work Nos. acquired in step S72.
  • the work numbers are arranged on the vertical axis of the inspection result matrix 25.
  • In step S76, the processor 110, as the above-described display control unit 22, configures the horizontal axis of the inspection result matrix 25 according to the inspection target portions acquired in step S72. As a result, the inspection target portions are arranged on the horizontal axis of the inspection result matrix 25.
  • In step S78, the processor 110, as the above-described display control unit 22, arranges a blank cell on the inspection result matrix 25 for each combination of a work No. and an inspection target portion acquired in step S72. Each cell is associated with a work No. and an inspection target portion.
  • the processor 110 refers to the inspection result file 146D and reflects the inspection result in each cell of the inspection result matrix 25. More specifically, the processor 110 acquires an inspection result for each combination of the work No. and the inspection target portion, and reflects the acquired inspection result in a cell corresponding to each combination. Typically, the processor 110 reflects the inspection result on each cell in such a manner that the inspection result of “defective” and the inspection result of “no defect” can be distinguished. As an example, if the inspection result indicates “defective”, the processor 110 displays the corresponding cell in a specific color (for example, red). On the other hand, if the inspection result indicates “no defect”, the processor 110 displays the corresponding cell in another color (for example, white or green).
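The matrix construction of steps S72 through S78 and the subsequent reflection of the inspection results can be sketched as follows. This is a minimal illustration, not the patent's implementation: the record layout, the colour names, and the function name are assumptions; the patent specifies only that work Nos. form the vertical axis, inspection target portions form the horizontal axis, and "defective" cells are displayed distinguishably from "no defect" cells.

```python
# Sketch of steps S72-S78: build the inspection result matrix from records
# such as those read out of the inspection result file 146D.
# The (work_no, part, result) record shape and the colours are assumptions.

def build_inspection_result_matrix(records):
    """records: iterable of (work_no, part, result) tuples,
    where result is 'defective' or 'no defect'."""
    work_nos = sorted({r[0] for r in records})   # vertical axis (cf. S74)
    parts = sorted({r[1] for r in records})      # horizontal axis (cf. S76)
    # A blank cell for every combination (cf. S78).
    matrix = {(w, p): None for w in work_nos for p in parts}
    for work_no, part, result in records:
        # 'defective' drawn in a specific colour, 'no defect' in another.
        matrix[(work_no, part)] = 'red' if result == 'defective' else 'white'
    return work_nos, parts, matrix
```

Cells for which no record exists stay blank (`None`), mirroring the blank cells arranged in step S78.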
  • In step S82, the processor 110 determines whether any cell in the inspection result matrix 25 has been selected.
  • the selection operation is performed on an operation unit provided in the image processing device 20.
  • the operation unit includes, for example, a keyboard 134 (see FIG. 8), a mouse, a touch panel, and the like.
  • If so (YES in step S82), the processor 110 switches the control to step S84. Otherwise (NO in step S82), the processor 110 switches the control to step S90.
  • In step S84, the processor 110, as the inspection unit 21 (see FIG. 1) or the display control unit 22, executes a process according to the selected cell, that is, according to the combination of the work No. and the inspection target portion specified by the cell. Details of the processing executed in response to the cell selection operation will be described later.
  • In step S90, the processor 110 determines whether or not an operation for closing the inspection result matrix 25 has been received.
  • If so (YES in step S90), the processor 110 ends the process illustrated in FIG. 23. Otherwise (NO in step S90), the processor 110 returns the control to step S82.
  • the image processing device 20 executes a process according to the selected cell based on the selection of the cell in the inspection result matrix 25. Examples of the processing that can be executed include the processing of specific examples 1 to 5 described below. Hereinafter, these processes will be described in order.
  • the image processing device 20 executes at least one of the processes shown in the following specific examples 1 to 5 based on the reception of the cell selection operation.
  • the image processing apparatus 20 may display a selection screen for the processes shown in the following specific examples 1 to 5 based on the reception of the cell selection operation, and execute the process selected on the selection screen.
  • FIG. 24 is a diagram illustrating a specific example 1 of a process performed in response to a cell selection operation.
  • the cell CE1 in the inspection result matrix 25 is selected.
  • the display control unit 22 of the image processing device 20 displays, on the display device 23, the image used for the inspection of the inspection target portion corresponding to the cell CE1 and the imaging conditions of the image. Thereby, the user can visually check the image indicating the defect, and can confirm the validity of the imaging conditions. Further, the user can easily confirm which part of the work the inspection target portion shown in the inspection result matrix 25 indicates.
  • the image processing apparatus 20 specifies a combination of the work No. and the inspection target portion associated with the selected cell CE1.
  • the image processing apparatus 20 acquires a corresponding image file from the above-described image file group 146B (see FIG. 11) using the combination of the specified work number and the inspection target portion as a key.
  • the image processing apparatus 20 acquires the corresponding imaging condition from the above-described imaging condition file 146A (see FIG. 11) using the specified inspection target portion as a key.
  • the display control unit 22 displays the acquired image file and the imaging condition on the pop-up screen PU1 associated with the cell CE1.
  • the displayed imaging condition is configured to be changeable to an arbitrary value.
  • In the example of FIG. 24, both the captured image and the imaging condition are displayed according to the cell selection operation. However, the display control unit 22 may display only one of the captured image and the imaging condition according to the cell selection operation.
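The keyed lookups of specific example 1 can be sketched as follows. The dictionary shapes and the function name are assumptions for illustration; the patent specifies only that the (work No., inspection target portion) combination keys the image file group 146B while the portion alone keys the imaging condition file 146A.

```python
# Hypothetical lookup mirroring specific example 1: the selected cell
# identifies a (work No., inspection target portion) pair.

def popup_contents(work_no, part, image_files, imaging_conditions):
    return {
        'image': image_files.get((work_no, part)),   # keyed by the combination
        'condition': imaging_conditions.get(part),   # keyed by the portion only
    }
```

The returned pair corresponds to what the display control unit 22 would show on the pop-up screen PU1.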
  • FIG. 25 is a diagram illustrating a specific example 2 of a process performed in response to a cell selection operation.
  • the cell group CE2 in the inspection result matrix 25 is selected.
  • the display control unit 22 of the image processing device 20, based on the selection of the cell group CE2, displays the three-dimensional model ML representing the shape of the work to be inspected on the display device 23, and represents the inspection result of each inspection target portion corresponding to the cell group CE2 at the corresponding portion on the three-dimensional model ML.
  • the display control unit 22 of the image processing apparatus 20 acquires the three-dimensional model ML of the work to be inspected from the work information file 146E (see FIG. 11), and displays the three-dimensional model ML on the pop-up screen PU2 associated with the cell group CE2.
  • the image processing apparatus 20 acquires an inspection result corresponding to each combination of the specified work No. and the inspection target portion from the above-described inspection result file 146D (see FIG. 12).
  • the image processing device 20 displays the inspection result of each inspection target portion in a corresponding location of the three-dimensional model ML.
  • the display control unit 22 of the image processing device 20 can reflect the inspection result of each inspection target portion on the three-dimensional model ML based on the known information.
  • the inspection target portions A3 to A5 corresponding to the selected cell group CE2 are reflected on the three-dimensional model ML.
  • the display control unit 22 displays, on the three-dimensional model ML, the inspection target portion indicating “having a defect” in a display mode different from “no defect”.
  • the inspection target portion indicating "having a defect" is represented by a specific color (for example, red) on the three-dimensional model ML, and the inspection target portion indicating "no defect" is represented by another color (for example, green) on the three-dimensional model ML.
  • the inspection target portion indicating “defect” may be represented by blinking display.
  • the display control unit 22 further shows, on the three-dimensional model ML, an inspection target area 75A including the inspection target part A3 and an inspection target area 75B including the inspection target parts A4 and A5. Since the inspection target area is as described in FIG. 15, the description will not be repeated.
  • the inspection target area 75B including the inspection target portion indicating "defective" is displayed in a different form from the inspection target area 75A not including such a portion.
  • the inspection target area 75B may be represented by a specific color (for example, red) or may be represented by blinking display.
  • the display control unit 22 further displays, on the pop-up screen PU2, a tool button 71 for rotating the three-dimensional model ML about the vertical direction and a tool button 72 for rotating the three-dimensional model ML about the horizontal direction. The user can appropriately rotate the three-dimensional model ML by operating the tool buttons 71 and 72.
  • FIG. 26 is a diagram illustrating a specific example 3 of the process performed in response to the cell selection operation.
  • the cell group CE3 in the inspection result matrix 25 is selected.
  • the display control unit 22 of the image processing device 20 displays the statistical result of the inspection result corresponding to the cell group CE3 on the display device 23 based on the selection of the cell group CE3. Thereby, the user can easily analyze the inspection result of an arbitrary inspection target portion.
  • the image processing apparatus 20 specifies a combination of the work number and the inspection target portion associated with each cell of the selected cell group CE3. Next, the image processing apparatus 20 acquires the inspection result and the measurement value corresponding to each combination of the specified work No. and the inspection target portion from the above-described inspection result file 146D. Next, the image processing device 20 performs a predetermined statistical process on the acquired measurement value.
  • the image processing device 20 generates a frequency distribution (that is, a histogram) by executing a predetermined statistical process.
  • the horizontal axis of the histogram represents the division of the measurement value, and the vertical axis of the histogram represents the frequency of the measurement values included in each division.
  • the display control unit 22 of the image processing device 20 displays the generated histogram on the pop-up screen PU3 associated with the cell group CE3.
  • the frequency distribution of the measurement values is generated for each selected inspection target portion.
  • the frequency distribution is shown for each of the selected inspection target portions A2 and A3.
  • the image processing device 20 generates a measured value transition graph by executing a predetermined statistical process.
  • the horizontal axis of the measured value transition graph represents the work number, and the vertical axis of the measured value transition graph represents the measured value.
  • the display control unit 22 of the image processing device 20 displays the generated measurement value transition graph on the pop-up screen PU3 associated with the cell group CE3.
  • the measurement value transition graph is generated for each inspection target portion corresponding to the selected cell group CE3.
  • a measured value transition graph is shown for each of the selected inspection target portions A2 and A3.
  • In FIG. 26, an example has been described in which two statistical results, a histogram and a measured value transition graph, are displayed in accordance with a cell group selection operation. However, the display control unit 22 may display at least one of these statistical results according to the cell group selection operation.
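The histogram generation of specific example 3 can be sketched as below. This is an illustrative simplification, assuming measured values are plain floats and that divisions have a fixed width; the bin width and function names are not from the patent.

```python
# Minimal sketch of the statistical processing in specific example 3.

def frequency_distribution(values, bin_width):
    """Histogram: map each division index to the count of values in it."""
    hist = {}
    for v in values:
        division = int(v // bin_width)       # which division the value falls in
        hist[division] = hist.get(division, 0) + 1
    return hist

def per_portion_histograms(measurements, bin_width):
    """measurements: {inspection target portion: [measured values]}
    for the selected cells; one histogram per portion."""
    return {part: frequency_distribution(vals, bin_width)
            for part, vals in measurements.items()}
```

A measured value transition graph would instead keep the values ordered by work No. rather than binning them.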
  • FIG. 27 is a diagram illustrating a specific example 4 of the process performed in response to the cell selection operation.
  • the cell group CE4 in the inspection result matrix 25 is selected.
  • the inspection unit 21 of the image processing device 20, based on the selection of the cell group CE4, re-inspects each inspection target portion corresponding to the cell group CE4 using the image that was used for its inspection.
  • the user can easily execute the re-inspection for an arbitrary inspection target portion.
  • the user resets the inspection conditions, and then selects the cell group CE4 in the inspection result matrix 25.
  • the inspection unit 21 specifies a combination of the work No. associated with each cell of the selected cell group CE4 and the inspection target portion.
  • the inspection unit 21 acquires image data corresponding to each combination of the specified work No. and the inspection target portion from the image file group 146B.
  • the inspection unit 21 acquires the inspection conditions corresponding to each of the specified inspection target portions from the above-described inspection condition file 146C.
  • the inspection unit 21 performs image processing on each of the acquired image data according to the corresponding inspection conditions.
  • the display control unit 22 reflects the result of the reinspection on the inspection result matrix 25.
  • the number of selected cells may be one or more.
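The re-inspection flow of specific example 4 can be sketched as follows. The data shapes are assumptions, and `inspect_fn` merely stands in for the image processing performed by the inspection unit 21 according to the inspection conditions.

```python
# Sketch of specific example 4: re-run the inspection on stored images for
# each selected cell, using the (possibly re-set) inspection conditions.

def reinspect(selected_cells, image_files, inspection_conditions, inspect_fn):
    """selected_cells: iterable of (work_no, part) pairs."""
    results = {}
    for work_no, part in selected_cells:
        image = image_files[(work_no, part)]      # cf. image file group 146B
        condition = inspection_conditions[part]   # cf. inspection condition file 146C
        results[(work_no, part)] = inspect_fn(image, condition)
    return results
```

The returned results would then be reflected back into the inspection result matrix 25 by the display control unit 22.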
  • FIG. 28 is a diagram illustrating a specific example 5 of the process performed in response to the cell selection operation.
  • the cell group CE5 in the inspection result matrix 25 is selected.
  • the display control unit 22 of the image processing device 20, based on the selection of the cell group CE5, displays the three-dimensional model ML of the work to be inspected on the display device 23, and represents each inspection target portion corresponding to the cell group CE5 at the corresponding part on the three-dimensional model ML. Thereby, the user can easily confirm the imaging condition of the selected portion.
  • the display control unit 22 of the image processing apparatus 20 acquires the three-dimensional model ML of the work to be inspected from the above-described work information file 146E (see FIG. 11), and displays the three-dimensional model ML on the pop-up screen PU5 associated with the cell group CE5.
  • the image processing apparatus 20 acquires the imaging condition corresponding to each combination of the specified work No. and the inspection target portion from the above-described imaging condition file 146A (see FIG. 11).
  • the display control unit 22 of the image processing apparatus 20 determines any one work No. (for example, the smallest work No.) from among the work Nos. corresponding to the cell group CE5, and represents each imaging condition corresponding to the determined work No. at the corresponding location of the three-dimensional model ML.
  • the pop-up screen PU5 displays a return button 81 and a forward button 82. By pressing the return button 81 or the forward button 82, the user can switch the imaging conditions to be displayed in the order of work No.
  • the displayed imaging conditions include the imaging position of the imaging device 10 with respect to the three-dimensional model ML.
  • the imaging position Ci of the imaging device 10 with respect to the three-dimensional model ML is determined by the setting process performed by the setting device 60 and is therefore known. Accordingly, the display control unit 22 of the image processing device 20 can display a schematic diagram representing the imaging device 10 at the imaging position Ci based on the known information. In the example of FIG. 28, schematic diagrams representing the imaging device 10 at the imaging positions C5 and C7 are displayed.
  • the displayed imaging condition includes a portion to be inspected on the workpiece.
  • the relationship between the three-dimensional model ML and the inspection target position Bi is known. Therefore, the display control unit 22 of the image processing device 20 can reflect the inspection result of each inspection target portion on the three-dimensional model ML based on the known information.
  • the inspection target portions A5 and B1 are represented on the three-dimensional model ML.
  • In the example of FIG. 28, the imaging position and the inspection target portion are displayed as the imaging conditions. However, the displayed imaging conditions are not limited to the imaging position and the inspection target portion. For example, an optical condition such as an imaging field of view, or an illumination condition at the time of imaging, may be displayed.
  • FIG. 29 is a diagram showing a modification of the inspection result matrix 25 shown in FIG.
  • Each row of the inspection result matrix 25 shown in FIG. 3 is grouped by the lot number of the work, whereas each row of the inspection result matrix 25 shown in FIG. 29 is grouped by the type of the work.
  • the first work type has unique inspection target portions "A1" and "A3" that are not included in the second work type.
  • In FIG. 29, the inspection results are reflected in the cells corresponding to the inspection target portions "A1" and "A3" of the first work type, while the cells corresponding to the inspection target portions "A1" and "A3" of the second work type are displayed as blanks indicating "not applicable".
  • FIG. 30 is a diagram illustrating an example of the data structure of the classification rule 134C.
  • In the classification rule 134C, the inspection target portions of the work W are hierarchically associated with each work type.
  • "A1" to "A4" are associated with the inspection target portion "A" of the first work type.
  • "B1" and "B2" are associated with the inspection target portion "B" of the first work type.
  • "A2", "A4", and "A5" are associated with the inspection target portion "A" of the second work type.
  • "B1" to "B3" are associated with the inspection target portion "B" of the second work type.
  • buttons are assigned to each group of the inspection result matrix 25 in the vertical axis direction and the horizontal axis direction.
  • an expand / consolidate button BH1 is assigned to the group GH1.
  • the group GH2 is assigned an expand / consolidate button BH2.
  • the group GV1 is assigned an expand / consolidate button BV1.
  • the group GV2 is assigned an expand / consolidate button BV2.
  • FIG. 31 is a diagram showing a screen transition of the inspection result matrix 25 when the development / aggregation button BV2 is pressed.
  • the expansion/aggregation button BV2 alternately receives an aggregation instruction and an expansion instruction for the cells in the group GV2 each time the button is pressed. That is, when the expansion/aggregation button BV2 is pressed while the cells of the group GV2 are expanded, the display control unit 22 of the image processing device 20 aggregates the cells of the group GV2 into one row.
  • On the other hand, when the expansion/aggregation button BV2 is pressed in a state where the cells of the group GV2 are aggregated, the display control unit 22 returns the display of the cells of the group GV2 to the state before the aggregation.
  • When the cells to be aggregated include the inspection result "defective", the display control unit 22 performs the aggregation processing giving priority to the display of "defective" over the displays of "no defect" and "not applicable". When the cells to be aggregated include the inspection results "no defect" and "not applicable", the display control unit 22 performs the aggregation processing giving priority to the display of "no defect" over the display of "not applicable".
  • the broken line AR10 includes one “defective” cell and two “not applicable” cells.
  • the three cells within the dashed line AR10 are aggregated into one cell indicated within the dashed line AR12.
  • the display control unit 22 expresses the inspection result after aggregation as “defective”.
  • the broken line AR11 includes two "no defect" cells and one "not applicable" cell.
  • the three cells within the dashed line AR11 are aggregated into one cell indicated within the dashed line AR13.
  • Therefore, the display control unit 22 expresses the inspection result after aggregation as "no defect".
  • FIG. 32 is a diagram showing screen transition of the inspection result matrix 25 when the deploy / aggregate button BH2 is pressed.
  • the expansion/aggregation button BH2 alternately receives an aggregation instruction and an expansion instruction for the cells in the group GH2 each time the button is pressed. That is, when the expansion/aggregation button BH2 is pressed while the cells of the group GH2 are expanded, the display control unit 22 aggregates the cells of the group GH2 into one column. On the other hand, when the expansion/aggregation button BH2 is pressed in a state where the cells of the group GH2 are aggregated, the display control unit 22 returns the display of the cells of the group GH2 to the state before the aggregation.
  • the broken line AR14 includes ten "not applicable" cells. By the aggregation processing of the cells of the group GH2, the ten cells in the broken line AR14 are aggregated into the one cell shown in the broken line AR15. At this time, since the cells before aggregation include neither "defective" nor "no defect" cells, the display control unit 22 expresses the inspection result after aggregation as "not applicable".
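The aggregation rule just described can be sketched as a small priority function: "defective" takes priority over "no defect", which takes priority over "not applicable". The numeric ranks are an implementation choice for illustration, not part of the patent.

```python
# Sketch of the expand/aggregate display rule: the aggregated cell shows
# the highest-priority result among the cells being collapsed.

_PRIORITY = {'not applicable': 0, 'no defect': 1, 'defective': 2}

def aggregate_cells(cells):
    """Collapse a group of cell results into the single displayed result."""
    return max(cells, key=_PRIORITY.__getitem__)
```

This reproduces the examples above: one "defective" among "not applicable" cells aggregates to "defective", and "no defect" cells mixed with "not applicable" aggregate to "no defect".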
  • the image processing apparatus 20 displays, on the display device 23, a comparison result between the inspection result matrix 25 and an expected value matrix indicating a true correct value (hereinafter also referred to as an "expected value") for each inspection result included in the inspection result matrix 25. Thus, the user can easily determine whether each inspection result shown in the inspection result matrix 25 is as expected. Such comparison processing becomes more effective as the number of inspection results shown in the inspection result matrix 25 increases.
  • FIG. 33 is a diagram schematically showing a comparison process between the inspection result matrix 25 and the expected value matrix 27.
  • expected values are defined for at least a part of the inspection results included in the inspection result matrix 25.
  • the expected value of each cell of the expected value matrix 27 is set, for example, by a user input.
  • the expected value that can be input is, for example, one of “defective”, “no defect”, and “invalid”.
  • the expected value matrix 27 set by the user is stored in the image processing apparatus 20 as the above-described expected value file 146H (see FIG. 11).
  • the image processing apparatus 20 compares cells in the same row and the same column between each cell in the inspection result matrix 25 and each cell in the expected value matrix 27.
  • the inspection result of the inspection result matrix 25 is “no defect” and the expected value of the expected value matrix 27 is “defective”, “missing” is output as the comparison result. “Missed” means that a defect to be detected has been missed. That is, "missing” indicates that the inspection result is not as expected.
  • One reason that the comparison result is “missed” is that the threshold value for determining whether or not there is a defect is too loose.
  • the inspection result of the inspection result matrix 25 is “defective” and the expected value of the expected value matrix 27 is “no defect”, “overdetected” is output as the comparison result.
  • “Overdetection” means that a normal part has been detected as a defect. That is, “overdetected” indicates that the inspection result is not as expected.
  • One reason that the comparison result is “overdetected” is that the threshold value for determining the presence / absence of a defect is too strict.
  • a comparison result matrix 29 is output as a comparison result between the inspection result matrix 25 and the expected value matrix 27.
  • the display control unit 22 displays the comparison result “correct answer OK”, “correct answer NG”, “missed”, “overdetected”, and “invalid” in a distinguishable manner. As an example, these comparison results may be distinguished by color or hatching type.
  • the “correct answer OK” and “correct answer NG” that are the expected comparison results may be displayed in the same color (for example, white).
  • the cell aggregation / expansion function may be implemented for the comparison result matrix 29.
  • When the cells to be aggregated include "missed" or "overdetected", the display control unit 22 sets the cell after aggregation to "missed" or "overdetected", respectively. When the cells to be aggregated include both "missed" and "overdetected", the display control unit 22 displays the aggregated cell in a display mode indicating that both "missed" and "overdetected" are included.
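The per-cell comparison between the inspection result matrix 25 and the expected value matrix 27 can be sketched as below. One assumption is called out: the patent does not define "correct answer OK" and "correct answer NG" precisely, so here they are taken to mean a correctly judged good part and a correctly judged defective part, respectively.

```python
# Sketch of the comparison rules around FIG. 33 for one pair of cells in
# the same row and column of the two matrices.

def compare_cell(result, expected):
    if expected == 'invalid':
        return 'invalid'
    if result == expected:
        # Assumed labels: OK = correctly good, NG = correctly defective.
        return 'correct OK' if result == 'no defect' else 'correct NG'
    if result == 'no defect' and expected == 'defective':
        return 'missed'          # a defect to be detected was overlooked
    return 'overdetected'        # a normal part was detected as a defect
```

A "missed" result hints that the defect-judgment threshold is too loose; an "overdetected" result hints that it is too strict.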
  • FIG. 34 is a diagram illustrating an example of a process of creating the expected value matrix 27.
  • the image processing device 20 has a function of supporting creation of the expected value matrix 27.
  • the user selects each cell of the inspection result matrix 25.
  • the cell selection operation is performed on an operation unit provided in the image processing device 20.
  • the operation unit includes, for example, a keyboard 134 (see FIG. 8), a mouse, a touch panel, and the like.
  • the user copies the cell group CE10 of the inspection result matrix 25 to the expected value matrix 27 being edited. This eliminates the need for the user to input each cell of the expected value matrix 27 one by one, and reduces the time and effort of creating the expected value matrix 27.
  • the expected value matrix 27 created by the user is stored in the image processing apparatus 20 as the above-described expected value file 146H (see FIG. 11).
  • the image processing device 20 has a function of displaying the inspection result in three dimensions by reflecting the inspection result of the work on the three-dimensional model ML. The user can easily determine the location where the defect has occurred by checking the inspection result on the three-dimensional model ML.
  • FIG. 35 is a flowchart illustrating an example of the flow of the three-dimensional display processing of the inspection result.
  • the processing illustrated in FIG. 35 is realized by the processor 110 of the image processing device 20 executing the display program 143 (see FIG. 8). Note that part or all of the processing may be executed by a circuit element or other hardware.
  • In step S90, the processor 110 determines whether or not an execution operation of the three-dimensional display processing of the inspection result has been received.
  • the display operation is performed on an operation unit provided in the image processing device 20.
  • the operation unit includes, for example, a keyboard 134 (see FIG. 8), a mouse, a touch panel, and the like.
  • In step S92, the processor 110, as the display control unit 22 (see FIG. 1), acquires the three-dimensional model ML of the work from the work information file 146E (see FIG. 11), and displays the acquired three-dimensional model ML on the display device 23.
  • In step S100, the processor 110 determines whether an instruction to update the three-dimensional display of the inspection result has been received.
  • the update instruction is issued, for example, based on a new inspection result obtained from the inspection unit 21.
  • the update instruction is issued based on the fact that the inspection unit 21 has detected the inspection result indicating the defect.
  • the update instruction is issued based on a user operation.
  • If so (YES in step S100), the processor 110 switches the control to step S102. Otherwise (NO in step S100), the processor 110 switches the control to step S120.
  • In step S102, the processor 110 refers to the above-described inspection result file 146D (see FIG. 12) to acquire the inspection result of each inspection target portion for the work No. of the work to be inspected.
  • In step S104, the processor 110, as the above-described display control unit 22, displays each inspection result acquired in step S102 at the corresponding location on the three-dimensional model ML displayed in step S92.
  • the display control unit 22 of the image processing device 20 can reflect each inspection result acquired in step S102 on the three-dimensional model ML based on the known information.
  • the processor 110 displays a portion indicating a defect in a display mode different from other portions.
  • a portion indicating a defect may be represented by a specific color (for example, red) or may be represented by a blinking display. This makes it easier for the user to determine the portion indicating the defect.
  • In step S110, the processor 110 determines whether any of the inspection target portions shown on the three-dimensional model has been selected.
  • the selection operation is performed on an operation unit provided in the image processing device 20.
  • the operation unit includes, for example, a keyboard 134 (see FIG. 8), a mouse, a touch panel, and the like. If the processor 110 determines that any of the inspection target portions represented on the three-dimensional model has been selected (YES in step S110), it switches the control to step S112. Otherwise (NO in step S110), processor 110 switches the control to step S120.
  • In step S112, the processor 110, as the inspection unit 21 or the display control unit 22 described above, executes a process corresponding to the inspection target portion selected in step S110. Details of the processing executed in accordance with the selected inspection target portion will be described later.
  • In step S120, the processor 110 determines whether or not an operation to close the three-dimensional display screen of the inspection result has been received.
  • If so (YES in step S120), the processor 110 ends the process illustrated in FIG. 35. Otherwise (NO in step S120), the processor 110 returns the control to step S100.
  • the three-dimensional display of the inspection result is updated when the instruction to update the three-dimensional display of the inspection result is issued in step S100.
  • the update instruction is issued, for example, based on the inspection result indicating the defect being detected by the inspection unit 21. That is, when a defect is detected in the work to be sequentially inspected, the processor 110 updates the inspection result represented by the three-dimensional model ML with the inspection result of each inspection target portion of the work. As a result, while no inspection result indicating a defect is detected, the inspection result displayed on the three-dimensional model ML is not updated. Thus, the latest inspection result indicating the defect is always displayed on the three-dimensional model ML. Therefore, it is difficult for the user to overlook the inspection result indicating the defect.
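The update policy just described can be sketched as follows: the result shown on the three-dimensional model ML is replaced only when the newly inspected work contains at least one defect, so the latest defect-bearing result stays on screen. The data shapes and function name are assumptions for illustration.

```python
# Sketch of the 3-D display update rule: keep showing the latest work
# whose inspection detected a defect; ignore fully defect-free works.

def maybe_update_3d_display(current, new_work_results):
    """new_work_results: {inspection target portion: result} for one work."""
    if any(r == 'defective' for r in new_work_results.values()):
        return new_work_results      # update instruction issued
    return current                   # display left unchanged
```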
  • the image processing apparatus 20 responds to the selected inspection target portion based on selection of any of the inspection target portions shown in the three-dimensional model ML. Execute the process. Examples of the processing that can be executed include the processing of specific examples 1 and 2 described below. Hereinafter, these processes will be described in order.
  • the image processing device 20 executes at least one of the processes shown in the following specific examples 1 and 2 based on receiving an operation of selecting an inspection target portion on the three-dimensional model ML. .
  • the image processing device 20 displays the selection screen of the processing shown in the following specific examples 1 and 2 based on the selection operation of the inspection target portion on the three-dimensional model ML, and makes a selection on the selection screen. The performed processing may be executed.
  • FIG. 36 is a diagram illustrating a specific example 1 of a process performed in response to a selection operation of an inspection target portion illustrated in the three-dimensional model ML.
  • the display device 23 displays the three-dimensional model ML of the work to be inspected, a tool button 71 for rotating the three-dimensional model ML about the vertical direction, and a tool button 72 for rotating the three-dimensional model ML about the horizontal direction. The user can appropriately rotate the three-dimensional model ML by operating the tool buttons 71 and 72.
  • the inspection target position Bi is set with respect to the three-dimensional model ML, and thus the relationship between the three-dimensional model ML and the inspection target position Bi is known. Therefore, the display control unit 22 of the image processing device 20 can represent the inspection target portion on the three-dimensional model ML based on the known information. In the example of FIG. 36, the inspection target portions A1 to A4 are represented in the three-dimensional model ML.
  • the display control unit 22 indicates the inspection result in the inspection target portions A1 to A4 on the three-dimensional model ML.
  • the portion indicating “having a defect” may be represented by a specific color (for example, red) or may be represented by blinking display.
  • the portion indicating “no defect” is represented by a different color (for example, green) from the portion indicating “defective”.
  • the inspection target portion A1 indicates “defect”, and the inspection target portions A2 to A4 indicate “no defect”.
  • the display control unit 22 highlights a defective portion A1_1 indicating a defect in the inspection target portion A1 more than other portions.
  • the user can select any of the inspection target portions A1 to A4.
  • the inspection target portion A1 is selected.
  • the image processing device 20 displays, on the display device 23, the image used for the inspection of the inspection target portion A1 and the imaging conditions of the image based on the selection of the inspection target portion A1.
  • the user can visually check the image indicating the defect, and can confirm the validity of the imaging conditions.
  • More specifically, the image processing apparatus 20 acquires the corresponding image file from the above-described image file group 146B (see FIG. 11), using the combination of the work No. of the inspected work and the selected inspection target portion A1 as a key. Next, the image processing apparatus 20 acquires the corresponding imaging condition from the above-described imaging condition file 146A (see FIG. 11), using the inspection target portion A1 as a key.
  • the display control unit 22 displays the acquired image file and the imaging condition on the pop-up screen PU6 associated with the selected inspection target portion A1.
  • the displayed imaging condition is configured to be changeable to an arbitrary value.
  • In this example, both the captured image and the imaging condition are displayed in response to the operation of selecting the inspection target portion.
  • Alternatively, the display control unit 22 may display only one of the captured image and the imaging condition in response to that operation.
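The retrieval in specific example 1 — image keyed by the pair (work No., inspection target portion) and imaging condition keyed by the portion — can be sketched as follows. The dictionary layouts, file names, and condition values here are illustrative assumptions standing in for the image file group 146B and the imaging condition file 146A, not the patent's actual data formats.

```python
# Minimal sketch of the lookup performed when inspection target portion A1
# is selected on the 3D model.
image_file_group = {          # stand-in for image file group 146B
    (7, "A1"): "work007_A1.png",
    (7, "A2"): "work007_A2.png",
}
imaging_condition_file = {    # stand-in for imaging condition file 146A
    "A1": {"position": "C1", "exposure_ms": 12, "illumination": "ring"},
    "A2": {"position": "C2", "exposure_ms": 8, "illumination": "bar"},
}

def on_portion_selected(work_no, portion):
    """Return what a pop-up screen like PU6 would show for the selection."""
    image = image_file_group[(work_no, portion)]       # keyed by (work, portion)
    condition = imaging_condition_file[portion]        # keyed by portion only
    return image, condition

image, condition = on_portion_selected(7, "A1")
assert image == "work007_A1.png"
assert condition["position"] == "C1"
```

Because the imaging condition is keyed by the portion alone in this sketch, all works share the same condition per portion; the text's variant that keys conditions by (work No., portion) would simply use the pair as the dictionary key.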
  • FIG. 37 is a diagram illustrating a specific example 2 of a process performed in response to an operation of selecting an inspection target portion illustrated in the three-dimensional model ML.
  • the three-dimensional model ML includes inspection target portions A1 to A4.
  • the user can select any of the inspection target portions A1 to A4.
  • the inspection target portion A1 is selected.
  • More specifically, the image processing device 20 acquires the corresponding imaging condition from the above-described imaging condition file 146A (see FIG. 11), using the combination of the work No. of the inspected work and the selected inspection target portion A1 as a key.
  • the display control unit 22 of the image processing device 20 displays the acquired imaging conditions in corresponding locations of the three-dimensional model ML.
  • the displayed imaging conditions include the imaging position of the imaging device 10 with respect to the three-dimensional model ML.
  • The shooting position Ci of the imaging device 10 with respect to the three-dimensional model ML is known from the setting process performed by the setting device 60. Based on this known information, the display control unit 22 of the image processing device 20 can display a schematic diagram 10A representing the imaging device 10 at the shooting position Ci.
  • a schematic diagram 10A of the imaging device 10 is displayed at the shooting position C1.
  • the displayed imaging condition is not limited to the imaging position.
  • an optical condition of the imaging device 10 at the time of imaging, an illumination condition at the time of imaging, and the like may be displayed.
  • FIG. 38 is a diagram showing the process of expanding/aggregating the inspection target portions shown on the three-dimensional model ML.
  • The inspection target portions shown on the three-dimensional model ML are hierarchically grouped in accordance with the above-described classification rule 134A (see FIG. 4).
  • An inspection target portion defined in a higher hierarchy includes the inspection target portions defined in the hierarchy below it.
  • The lower inspection target portions included in an inspection target portion defined in a higher hierarchy are regarded as the same group.
  • inspection target portions A to C are shown on the three-dimensional model ML. As an example, it is assumed that lower inspection target portions A1 to A3 are associated with upper inspection target portion A. The inspection target portions A1 to A3 included in the upper inspection target portion A are regarded as the same group.
  • the lower inspection target portions A1_1 to A1_4 are associated with the upper inspection target portion A1.
  • the inspection target portions A1_1 to A1_4 included in the upper inspection target portion A1 are regarded as the same group.
  • Inspection results are shown in inspection target portions A to C on the three-dimensional model ML.
  • the inspection results of the inspection target parts A to C are obtained from the inspection result file 146D (see FIG. 12).
  • the inspection target portion A indicates “with defect”, and the inspection target portions B and C indicate “without defect”.
  • An expand/aggregate button BT1 is assigned to the inspection target portion A.
  • An expand/aggregate button BT2 is assigned to the inspection target portion B.
  • An expand/aggregate button BT3 is assigned to the inspection target portion C. "+" indicates an expand button, and "−" indicates an aggregate button.
  • When the display control unit 22 receives an aggregation instruction for grouped inspection target portions, it aggregates the inspection results of those portions and represents the aggregated result at each location on the three-dimensional model ML corresponding to them. When the display control unit 22 receives an expansion instruction for aggregated inspection results, it returns the display of those results to the state before aggregation.
  • When the expand/aggregate button BT1 is pressed, the display control unit 22 expands the inspection target portion A into the lower inspection target portions A1 to A3. Next, the display control unit 22 reflects the inspection results of the inspection target portions A1 to A3 on the inspection target portions A1 to A3.
  • the inspection results of the inspection target parts A1 to A3 are obtained from the inspection result file 146D (see FIG. 12). As an example, the inspection target portion A1 indicates "having a defect", and the inspection target portions A2 and A3 indicate "no defect".
  • In addition, the display control unit 22 assigns an expand/aggregate button BT1_1 to the inspection target portion A1, an expand/aggregate button BT1_2 to the inspection target portion A2, and an expand/aggregate button BT1_3 to the inspection target portion A3.
  • the display control unit 22 expands the inspection target portion A1 into lower inspection target portions A1_1 to A1_4.
  • the display control unit 22 reflects the inspection results of the inspection target portions A1_1 to A1_4 on the inspection target portions A1_1 to A1_4.
  • the inspection results of the inspection target portions A1_1 to A1_4 are obtained from the inspection result file 146D (see FIG. 12).
  • the inspection target portions A1_1 to A1_3 indicate "no defect", and the inspection target portion A1_4 indicates "defect".
  • When the "−" of the expand/aggregate button BT1_1 is pressed, the display control unit 22 aggregates the inspection target portions A1_1 to A1_4 into the higher-level inspection target portion A1. At this time, if at least one inspection result of the portions being aggregated indicates "having a defect", the display control unit 22 sets the aggregated inspection result to "having a defect"; otherwise, it sets the aggregated inspection result to "no defect". Since the inspection target portion A1_4 indicates "having a defect", the display control unit 22 sets the inspection result of the inspection target portion A1, obtained by aggregating the inspection target portions A1_1 to A1_4, to "having a defect".
  • When the "−" of the expand/aggregate button BT1 is pressed, the display control unit 22 aggregates the inspection target portions A1 to A3 into the inspection target portion A. Since the inspection target portion A1 indicates "having a defect", the display control unit 22 sets the inspection result of the inspection target portion A, obtained by aggregating the inspection target portions A1 to A3, to "having a defect".
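The aggregation rule walked through above — a parent portion shows "defect" if at least one of its (possibly nested) child portions shows "defect", and "no defect" otherwise — can be sketched as a recursive fold over the hierarchy. The hierarchy and leaf results below mirror the A / A1 / A1_1..A1_4 example; the data layout itself is an illustrative assumption.

```python
# Sketch of the expand/aggregate result propagation described in the text.
hierarchy = {
    "A": ["A1", "A2", "A3"],
    "A1": ["A1_1", "A1_2", "A1_3", "A1_4"],
}
leaf_results = {
    "A1_1": "no defect", "A1_2": "no defect", "A1_3": "no defect",
    "A1_4": "defect",            # makes A1, and therefore A, defective
    "A2": "no defect", "A3": "no defect",
}

def aggregate(portion):
    """Result shown when `portion` is displayed in aggregated form."""
    children = hierarchy.get(portion)
    if not children:                        # leaf portion: its own result
        return leaf_results[portion]
    child_results = [aggregate(c) for c in children]
    # "defect" wins if any child (recursively) is defective.
    return "defect" if "defect" in child_results else "no defect"

assert aggregate("A1") == "defect"   # A1_4 is defective
assert aggregate("A") == "defect"    # propagates up to A
assert aggregate("A2") == "no defect"
```

Expanding a portion in the UI simply reveals the per-child results (`leaf_results` or recursive calls) instead of the single aggregated value; the underlying results are unchanged either way.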
  • FIG. 39 is a diagram illustrating a visual inspection system according to a modification.
  • the appearance inspection system shown in FIG. 39 is different from the appearance inspection system 1 shown in FIG. 1 in that it does not include the PLC 50 and includes an image processing device 20a instead of the image processing device 20.
  • the image processing device 20a has both the configuration of the image processing device 20 and the configuration of the PLC 50.
  • FIG. 40 is a diagram showing another embodiment in which the relative position between the workpiece W and the imaging device 10 is changed.
  • the robot 30 may move the work W instead of the imaging device 10.
  • the imaging device 10 is fixed. By moving the work W in this manner, the relative position between the work W and the imaging device 10 may be changed.
  • FIG. 41 is a diagram showing still another mode in which the relative position between the workpiece W and the imaging device 10 is changed.
  • the workpiece W may be placed on the turntable 91.
  • the rotary table 91 rotates according to an instruction from the robot controller 40. Thereby, the relative position between the workpiece W and the imaging device 10 can be easily changed.
  • The robot 30 may be a robot other than a vertical articulated robot (for example, a horizontal articulated robot, a Cartesian robot, or the like).
  • the imaging field of view FOV and the effective field of view FOV2 are described as being circular.
  • However, the shapes of the imaging field of view FOV and the effective field of view FOV2 are not limited to circles; they may be, for example, quadrangular (rectangular or square).
  • The embodiments described above include the following disclosure.
  • [Configuration 1] An appearance inspection system (1) for performing an appearance inspection of an inspection object (W), comprising: a display device (23); a robot (30) for moving an imaging device (10); an inspection unit (21) for inspecting each of a plurality of inspection portions of the inspection object (W) for the presence or absence of a defect, on the basis of the images obtained from the imaging device (10) by imaging each of the plurality of inspection portions while the robot (30) moves the imaging device (10); and a display control unit (22) for displaying, on the display device (23), an inspection result matrix (25) that represents, for each inspection object (W), the inspection results obtained by the inspection unit (21) for each of the plurality of inspection portions.
  • [Configuration 2] The appearance inspection system according to Configuration 1, wherein each row or each column of the inspection result matrix (25) is grouped in row units or column units according to a predetermined classification rule, and the display control unit (22), when receiving an aggregation instruction for a grouped row group or column group, aggregates the inspection results of that row group or column group into one row or one column for display, and, when receiving an expansion instruction for the aggregated inspection results, returns the display of the aggregated inspection results to the state before aggregation.
  • [Configuration 4] The appearance inspection system according to any one of Configurations 1 to 3, wherein the appearance inspection system (1) further comprises an operation unit (134) capable of selecting one or more inspection objects (W) and one or more inspection portions by receiving a selection operation of selecting one or more inspection results from among the inspection results included in the inspection result matrix (25), and the inspection unit (21) or the display control unit (22) executes a process relating to at least one of the selected inspection object (W) and the selected inspection portion based on the operation unit (134) receiving the selection operation.
  • [Configuration 6] The appearance inspection system (1) further comprises a storage device for storing a three-dimensional model (ML) representing the shape of the inspection object (W); the plurality of inspection portions are set in advance on the three-dimensional model (ML); and the display control unit (22) displays the three-dimensional model (ML) on the display device (23) and represents the inspection result of the selected inspection portion at the corresponding part on the three-dimensional model (ML).
  • [Configuration 9] The appearance inspection system according to any one of Configurations 1 to 8, wherein the display control unit (22) displays, on the display device (23), a result of comparison between the inspection result matrix (25) and an expected value matrix indicating the true correct value for each inspection result included in the inspection result matrix (25).
  • [Configuration 10] A method for displaying results of an appearance inspection performed by an imaging device (10) imaging a plurality of inspection portions of an inspection object (W), the method comprising: acquiring each image obtained by the imaging device (10) imaging each of the plurality of inspection portions while a robot (30) moves the imaging device (10); inspecting each of the plurality of inspection portions for the presence or absence of a defect based on each image obtained in the acquiring step; and displaying, on a display device (23), an inspection result matrix (25) representing, for each inspection object (W), the inspection results for each of the plurality of inspection portions.
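The display method of Configuration 10 — acquire images while the robot moves the camera, inspect each portion, arrange the results into the per-work matrix — can be outlined as a small pipeline. This is a hedged sketch: `acquire_image` and `inspect` are placeholder stand-ins for the imaging device and the inspection unit, not real APIs.

```python
# Illustrative pipeline for Configuration 10's three steps.
def acquire_image(work_no, portion):
    # Placeholder for the imaging device capturing one inspection portion
    # while the robot positions the camera.
    return f"img_{work_no}_{portion}"

def inspect(image):
    # Placeholder defect test; a real inspection unit analyses the image.
    return "defect" if image.endswith("A1") else "no defect"

def build_result_matrix(work_nos, portions):
    # Rows: inspection objects (works); columns: inspection portions —
    # the inspection result matrix displayed on the display device.
    return {
        work: {p: inspect(acquire_image(work, p)) for p in portions}
        for work in work_nos
    }

matrix = build_result_matrix([1, 2], ["A1", "A2"])
assert matrix[1]["A1"] == "defect"
assert matrix[2]["A2"] == "no defect"
```

The nested-dictionary layout makes both lookups in the text cheap: a row gives all results for one work, and iterating one key across rows gives all results for one portion.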
  • 1 appearance inspection system, 10 imaging device, 10A schematic diagram, 20, 20a image processing device, 21 inspection unit, 22 display control unit, 23 display device, 25 inspection result matrix, 27 expected value matrix, 29 comparison result matrix, 30 robot, 31 base, 32 arm, 32a tip arm, 40 robot controller, 50 PLC, 60 setting device, 61a, 61b screen, 71, 72, 76, 77 tool button, 74 circle, 75, 75A, 75B inspection target area, 81 back button, 82 forward button, 90 stage, 91 rotary table, 110, 214, 362 processor, 112 RAM, 114 display controller, 116 system controller, 118 I/O controller, 120, 364 storage device, 122 camera interface, 124 input interface, 126 controller interface, 128, 228, 368 communication interface, 130, 222 memory card interface, 134 keyboard, 134A, 134B, 134C classification rule, 136, 224 memory card, 142 image processing program, 143 display program, 144 project file, 146A imaging condition file,

Abstract

The present invention provides assistance in a procedure for confirming the results of external-appearance inspection performed by acquiring images of an inspection target (W) from a plurality of points of view while moving an image-acquisition device (10). An external-appearance inspection system (1) for performing external-appearance inspection on the inspection target (W) is provided with: a display device (23); a robot (30) for moving the image-acquisition device (10); an inspection unit (21) for inspecting, on the basis of individual images obtained from the image-acquisition device (10) as a result of the image-acquisition device (10) respectively acquiring images of a plurality of inspection sites of the inspection target (W) while the robot (30) is moving the image-acquisition device (10), each of the plurality of inspection sites for the presence/absence of a defect; and a display control unit (22) for causing the display device (23) to display an inspection-result matrix (25) representing, for the respective inspection targets (W), the inspection results for each of the plurality of inspection sites obtained by the inspection unit (21).

Description

Appearance inspection system, appearance inspection result display method, and appearance inspection result display program
 The present disclosure relates to an appearance inspection system that inspects an inspection object using captured images, a method for displaying appearance inspection results, and a program for displaying appearance inspection results.
 Many appearance inspection systems that use image processing techniques to inspect objects such as resins, metals, and substrates have been proposed.
 For example, Japanese Patent Laying-Open No. 2013-211323 (Patent Document 1) discloses an appearance inspection system for inspecting the quality of a substrate. This appearance inspection system displays the inspection results of the substrate on a two-dimensional map image, in which the serial numbers of the substrates are arranged on the horizontal axis and the component information within a substrate is arranged on the vertical axis. In the two-dimensional map image, a cell is provided for each combination of a substrate serial number and component information, and the inspection result of each component of each substrate is displayed in the corresponding cell.
JP 2013-211323 A
 The appearance inspection system disclosed in Patent Document 1 inspects the appearance of a substrate with a fixed imaging device that images the substrate while it is conveyed along a production line. Because the imaging device is fixed, the portions of the substrate that can be imaged are limited, and the number of images used for the inspection is relatively small.
 In contrast, an appearance inspection system that images an inspection object while moving the imaging device can image a single inspection object from many directions, so the number of images used for the inspection tends to be enormous; it may reach several hundred for the appearance inspection of a single object. The resulting volume of inspection results is likewise enormous, making it difficult for the user to check the inspection results one by one.
 The present disclosure has been made to solve the problems described above. An object in one aspect is to provide an appearance inspection system capable of supporting the work of checking the results of an appearance inspection performed by imaging an inspection object from a plurality of viewpoints while moving an imaging device. Objects in other aspects are to provide a display method and a display program capable of supporting the same checking work.
 In one example of the present disclosure, an appearance inspection system for performing an appearance inspection of an inspection object includes: a display device; a robot for moving an imaging device; an inspection unit for inspecting each of a plurality of inspection portions of the inspection object for the presence or absence of a defect, on the basis of the images obtained from the imaging device by imaging each of the plurality of inspection portions while the robot moves the imaging device; and a display control unit for displaying, on the display device, an inspection result matrix that represents, for each inspection object, the inspection results obtained by the inspection unit for each of the plurality of inspection portions.
 According to this disclosure, since the inspection result for each inspection portion of each inspection object is represented two-dimensionally, the user can immediately grasp which inspection portion of which inspection object has a defect. The two-dimensional display of inspection results becomes more effective as the number of inspection results grows.
 In one example of the present disclosure, each row or each column of the inspection result matrix is grouped in row units or column units according to a predetermined classification rule. When the display control unit receives an aggregation instruction for a grouped row group or column group, it aggregates the inspection results of that row group or column group into one row or one column for display; when it receives an expansion instruction for the aggregated inspection results, it returns their display to the state before aggregation.
 According to this disclosure, even when the inspection result matrix has too many rows or columns to fit on the display device, the user can check the inspection results efficiently by making use of the aggregated and expanded displays.
 In one example of the present disclosure, the display control unit displays inspection results indicating a defect in a display mode different from that of the other inspection results included in the inspection result matrix.
 According to this disclosure, because inspection results indicating a defect are displayed in a different mode, the user can identify them immediately and can more easily grasp which inspection portion of which inspection object has a defect.
 In one example of the present disclosure, the appearance inspection system includes an operation unit capable of selecting one or more inspection objects and one or more inspection portions by receiving a selection operation of selecting one or more inspection results from among those included in the inspection result matrix. Based on the operation unit receiving the selection operation, the inspection unit or the display control unit executes a process relating to at least one of the selected inspection object and the selected inspection portion.
 According to this disclosure, an interface for easily specifying a combination of an inspection object and an inspection portion can be provided.
 In one example of the present disclosure, the display control unit displays, on the display device, at least one of the image used for the inspection of the selected inspection portion and the imaging condition of that image.
 According to this disclosure, the user can easily check the imaging condition of any inspection portion shown in the inspection result matrix.
 In one example of the present disclosure, the appearance inspection system further includes a storage device for storing a three-dimensional model representing the shape of the inspection object. The plurality of inspection portions are set in advance on the three-dimensional model. The display control unit displays the three-dimensional model on the display device and represents the inspection result of the selected inspection portion at the corresponding part on the three-dimensional model.
 According to this disclosure, the user can check the inspection result of any inspection portion on the three-dimensional model, and can easily see which part of the inspection object has a defect.
 In one example of the present disclosure, when a plurality of inspection results are selected by the selection operation, the display control unit displays a statistical result of those inspection results on the display device.
 According to this disclosure, the user can easily analyze the inspection results for any inspection portions.
 In one example of the present disclosure, the inspection unit re-inspects the selected inspection portion based on the image used for its inspection.
 According to this disclosure, the user can easily re-inspect any inspection portion.
 In one example of the present disclosure, the display control unit displays, on the display device, a result of comparison between the inspection result matrix and an expected value matrix indicating the true correct value for each inspection result included in the inspection result matrix.
 According to this disclosure, the user can easily determine whether each inspection result shown in the inspection result matrix is as expected. Such comparison processing becomes more effective as the number of inspection results shown in the inspection result matrix 25 grows.
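The cell-by-cell comparison against the expected value matrix can be sketched as follows. This is an illustrative Python sketch; the flat keyed-dictionary layout and the "match"/"mismatch" labels are assumptions, not the patent's data formats.

```python
# Sketch of comparing the inspection result matrix with the expected value
# matrix: each cell is checked against its true correct value, producing a
# comparison matrix that flags cells needing review.
inspection_results = {(1, "A1"): "defect",  (1, "A2"): "no defect"}
expected_values    = {(1, "A1"): "defect",  (1, "A2"): "defect"}

def compare(results, expected):
    return {key: ("match" if results[key] == expected[key] else "mismatch")
            for key in results}

comparison = compare(inspection_results, expected_values)
assert comparison[(1, "A1")] == "match"
assert comparison[(1, "A2")] == "mismatch"  # result disagrees with true value
```

Displaying only the "mismatch" cells (or rendering them in a distinct color) is one way such a comparison matrix keeps the check tractable as the number of results grows.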
 In another example of the present disclosure, a method for displaying the results of an appearance inspection performed by an imaging device imaging a plurality of inspection portions of an inspection object includes: acquiring each image obtained by the imaging device imaging each of the plurality of inspection portions while a robot moves the imaging device; inspecting each of the plurality of inspection portions for the presence or absence of a defect based on each image obtained in the acquiring step; and displaying, on a display device, an inspection result matrix representing, for each inspection object, the inspection results for each of the plurality of inspection portions.
 According to this disclosure, since the inspection result for each inspection portion of each inspection object is represented two-dimensionally, the user can immediately grasp which inspection portion of which inspection object has a defect. The two-dimensional display of inspection results becomes more effective as the number of inspection results grows.
 In another example of the present disclosure, a program for displaying the results of an appearance inspection performed by an imaging device imaging a plurality of inspection portions of an inspection object causes a computer to execute: acquiring each image obtained by the imaging device imaging each of the plurality of inspection portions while a robot moves the imaging device; inspecting each of the plurality of inspection portions for the presence or absence of a defect based on each image obtained in the acquiring step; and displaying, on a display device, an inspection result matrix representing, for each inspection object, the inspection results for each of the plurality of inspection portions.
 According to this disclosure, since the inspection result for each inspection portion of each inspection object is represented two-dimensionally, the user can immediately grasp which inspection portion of which inspection object has a defect. The two-dimensional display of inspection results becomes more effective as the number of inspection results grows.
 In one aspect, it is possible to support the work of checking the results of an appearance inspection performed by imaging an inspection object from a plurality of viewpoints while moving an imaging device.
Brief Description of the Drawings
FIG. 1 is a schematic diagram showing an outline of a visual inspection system according to an embodiment.
FIG. 2 is a diagram showing an example of an inspection result displayed on the display device of the image processing device according to the embodiment.
FIG. 3 is a diagram showing a modification of the inspection result matrix shown in FIG. 2.
FIG. 4 is a diagram showing an example of the data structure of a classification rule.
FIG. 5 is a diagram showing an example of the data structure of a classification rule.
FIG. 6 is a diagram showing a screen transition of the inspection result matrix when an expand/aggregate button is pressed.
FIG. 7 is a diagram showing a screen transition of the inspection result matrix when an expand/aggregate button is pressed.
FIG. 8 is a schematic diagram showing an example of the hardware configuration of the image processing device.
FIG. 9 is a schematic diagram showing an example of the hardware configuration of a PLC (Programmable Logic Controller).
FIG. 10 is a schematic diagram showing an example of the hardware configuration of the setting device.
FIG. 11 is a diagram showing an example of the data structure of a project file.
FIG. 12 is a diagram showing an example of the data structure of an inspection result file.
FIG. 13 is a flowchart showing an example of the flow of processing in the setting device.
FIG. 14 is a diagram showing an example of a screen on which a schematic diagram of the design appearance of the work is displayed.
FIG. 15 is a diagram showing an example of a screen on which inspection target portions are displayed.
FIG. 16 is a diagram showing examples of points on an inspection target region.
FIG. 17 is a diagram showing examples of inspection target positions determined from the inspection target region and the effective fields of view corresponding to them.
FIG. 18 is a diagram showing all the inspection target portions determined from the inspection target region and the effective fields of view.
FIG. 19 is a diagram showing an example of a method by which the setting device determines imaging positions for the work.
FIG. 20 is a diagram showing an example of an imaging path determined for the imaging positions.
FIG. 21 is a flowchart showing an example of the flow of processing in the PLC.
FIG. 22 is a flowchart showing an example of the flow of the inspection process performed by the image processing device.
FIG. 23 is a flowchart showing an example of the flow of the display process of the inspection result matrix.
FIG. 24 is a diagram showing a first specific example of processing executed in response to a cell selection operation.
FIG. 25 is a diagram showing a second specific example of processing executed in response to a cell selection operation.
FIG. 26 is a diagram showing a third specific example of processing executed in response to a cell selection operation.
FIG. 27 is a diagram showing a fourth specific example of processing executed in response to a cell selection operation.
FIG. 28 is a diagram showing a fifth specific example of processing executed in response to a cell selection operation.
FIG. 29 is a diagram showing a modification of the inspection result matrix shown in FIG. 3.
FIG. 30 is a diagram showing an example of the data structure of a classification rule.
FIG. 31 is a diagram showing a screen transition of the inspection result matrix when an expand/aggregate button is pressed.
FIG. 32 is a diagram showing a screen transition of the inspection result matrix when an expand/aggregate button is pressed.
FIG. 33 is a diagram schematically showing a comparison process between the inspection result matrix and an expected value matrix.
FIG. 34 is a diagram showing an example of the process of creating the expected value matrix.
FIG. 35 is a flowchart showing an example of the flow of the three-dimensional display process of inspection results.
FIG. 36 is a diagram showing a first specific example of processing executed in response to a selection operation on an inspection target portion shown in a three-dimensional model.
FIG. 37 is a diagram showing a second specific example of processing executed in response to a selection operation on an inspection target portion shown in a three-dimensional model.
FIG. 38 is a diagram showing the process of expanding/aggregating inspection target portions shown in a three-dimensional model.
FIG. 39 is a diagram showing a visual inspection system according to a modification.
FIG. 40 is a diagram showing another mode of changing the relative position between the work and the imaging device.
FIG. 41 is a diagram showing still another mode of changing the relative position between the work and the imaging device.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. In the following description, the same parts and components are denoted by the same reference numerals; their names and functions are also the same. Therefore, detailed description thereof will not be repeated.
<A. Application Example>
First, an example of a scene to which the present invention is applied will be described with reference to FIGS. 1 and 2. FIG. 1 is a schematic diagram showing an outline of a visual inspection system 1 according to the present embodiment.
The appearance inspection system 1 according to the present embodiment images a plurality of inspection target portions on an inspection target (hereinafter also referred to as "work W") placed on a stage 90, for example in an industrial product production line, and inspects the appearance of the work W using the obtained images. In the appearance inspection, the work W is inspected for scratches, dirt, presence or absence of foreign matter, dimensions, and the like.
When the appearance inspection of the work W placed on the stage 90 is completed, the next work W is transported onto the stage 90. At this time, the work W is placed at a predetermined position on the stage 90 in a predetermined posture.
As shown in FIG. 1, the visual inspection system 1 includes an imaging device 10, an image processing device 20, a robot 30, a robot controller 40, a PLC 50, and a setting device 60.
In accordance with a command from the image processing device 20, the imaging device 10 images a subject present in its imaging field of view and generates image data; as the subject, it images the work W that is the target of the appearance inspection.
The image processing device 20 outputs an imaging command to the imaging device 10 in accordance with a command from the PLC 50. The image processing device 20 includes an inspection unit 21, a display control unit 22, and a display device 23. The inspection unit 21 and the display control unit 22 are functional modules executed by the processor 110 (see FIG. 8) of the image processing device 20. The inspection unit 21 determines whether the appearance of the work W is good or bad by executing predetermined processing on the image data generated by the imaging device 10. The display control unit 22 causes the display device 23 to display the determination result of the inspection unit 21. The inspection result may be output to the display device 23 provided in the image processing device 20, or to a display device provided in the setting device 60 (for example, a display 366 described later).
The robot 30 is, for example, a vertical articulated robot in which a plurality of arms 32 are connected on a base 31. Each joint of the plurality of arms 32 includes a rotation axis. The imaging device 10 is attached to the tip of the distal arm 32a. The robot controller 40 controls the robot 30 in accordance with commands from the PLC 50 to change the relative position between the work W and the imaging device 10 and the posture of the imaging device 10 with respect to the work W.
As described above, the work W is placed at a predetermined position on the stage 90 in a predetermined posture. Therefore, by changing the relative position and posture of the imaging device 10 with respect to the stage 90, the robot 30 can change the relative position between the work W and the imaging device 10 and the posture of the imaging device 10 with respect to the work W. In other words, the robot 30 can change the relative position and posture by moving the imaging device 10 in a coordinate system whose origin is a point on the stage 90.
The PLC 50 controls the robot controller 40 and the image processing device 20 so that the imaging device 10 sequentially images the plurality of inspection target portions on the work W. The PLC 50 controls the robot controller 40 according to a path that satisfies the imaging conditions set by the setting device 60. Furthermore, the PLC 50 controls the image processing device 20 so that it outputs an imaging command at the timing when the imaging device 10 satisfies the designated imaging conditions.
The setting device 60 sets a path that satisfies imaging conditions, including the relative position between the work W and the imaging device 10, for sequentially imaging the plurality of inspection target portions on the work W. When a new product or a new variety of work W needs to be visually inspected, the setting device 60 sets a path that satisfies the imaging conditions suitable for that work W.
With reference to FIG. 2, the display mode of the inspection results produced by the visual inspection system 1 shown in FIG. 1 will be described. FIG. 2 is a diagram showing an example of an inspection result displayed on the display device 23 of the image processing device 20.
The inspection unit 21 of the image processing device 20 executes predetermined image processing on each image obtained by photographing the work W from a plurality of directions, thereby inspecting each inspection target portion of the work W for the presence or absence of defects. The inspection result is expressed, for example, as either "defective" or "non-defective". The inspection process by the inspection unit 21 is executed each time the stage 90 transports a work W to be inspected to the predetermined position. The inspection results of the inspection unit 21 are associated with the identification information of the work W and with the inspection target portions of the work W, and are then output to the display control unit 22. The display control unit 22 displays on the display device 23 an inspection result matrix 25 that represents, for each work W, the inspection result of each inspection target portion of that work W.
As shown in FIG. 2, the inspection target portions of the work are arranged along the horizontal axis of the inspection result matrix 25 by portion name. Each portion name may be registered in advance or may be set by the user. The identification information of the inspected works is arranged along the vertical axis of the inspection result matrix 25. The identification information of a work is represented by, for example, a serial number of the work (hereinafter also referred to as "work No.") or a work name. In the example of FIG. 2, the identification information of each work is represented by its work No.
On the inspection result matrix 25, a cell is arranged for each combination of an inspection target portion of the work and a work No. The display control unit 22 displays the inspection result of each inspection target portion of each work W in the corresponding cell. Since the inspection result of each inspection target portion of each work is thus represented in two dimensions, the user can immediately grasp which portion of which work has a defect. This two-dimensional display becomes more effective the larger the volume of inspection results.
Typically, the display control unit 22 highlights the inspection results indicating defects, among the results included in the inspection result matrix 25, in a display mode different from the other results. As an example, a cell of an inspection result indicating a defect may be shown in a specific color (for example, red) or may blink. In the example of FIG. 2, cells indicating "defective" are hatched, and cells indicating "non-defective" are not. Because the results indicating defects are highlighted in a display mode different from the other results, the user can easily distinguish them and can grasp even more easily which portion of which work is defective.
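As a concrete illustration of the two-dimensional display described above, the following sketch renders a small inspection result matrix as text, marking defective cells. The data layout (results keyed by work No. and portion name) and the marker characters are assumptions for illustration, not the patent's actual implementation.

```python
# Minimal sketch of the inspection result matrix 25: one column per
# inspection target portion, one row per work, with defective cells
# highlighted. The dict-based layout is an illustrative assumption.

PORTIONS = ["A1", "A2", "A3", "B1", "B2"]   # inspection target portions (columns)
WORKS = ["001A", "001B", "001C"]            # inspected works (rows)

# True = "defective", False (or absent) = "non-defective"
results = {
    ("001A", "A2"): True,
    ("001C", "B1"): True,
}

def render_matrix(works, portions, results):
    """Render one text row per work; 'X' marks a defective cell."""
    lines = ["work\t" + "\t".join(portions)]
    for w in works:
        cells = ["X" if results.get((w, p), False) else "." for p in portions]
        lines.append(w + "\t" + "\t".join(cells))
    return "\n".join(lines)

print(render_matrix(WORKS, PORTIONS, results))
```

In this textual stand-in, the "X" marker plays the role of the hatching (or red/blinking display) described above.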
In the above description, an example was explained in which the inspection target portions of the work are arranged on the horizontal axis of the inspection result matrix 25 and the work Nos. on its vertical axis; conversely, the work Nos. may be arranged on the horizontal axis and the inspection target portions on the vertical axis.
<B. Expand/Aggregate Function of the Inspection Result Matrix>
The function of expanding and aggregating the cells shown in the inspection result matrix 25 will be described with reference to FIGS. 3 to 7. FIG. 3 is a diagram showing a modification of the inspection result matrix 25 shown in FIG. 2. The inspection result matrix 25 shown in FIG. 3 differs from that shown in FIG. 2 in that its rows and columns are grouped row by row or column by column according to predetermined classification rules.
In the example of FIG. 3, along the horizontal axis of the inspection result matrix 25, the work portions "A1" to "A5" are grouped as a group GV1, and the work portions "B1" to "B3" are grouped as a group GV2. The grouping of the inspection target portions of the work is performed according to the classification rule 134A shown in FIG. 4.
FIG. 4 is a diagram showing an example of the data structure of the classification rule 134A. As shown in FIG. 4, the inspection target portions of the work W are hierarchically associated in the classification rule 134A. The correspondence between portions may be registered in advance or may be set arbitrarily by the user. As an example, an inspection target portion defined in an upper layer has a relationship of encompassing the inspection target portions defined in the lower layer. In the example of FIG. 4, the inspection target portion "A" encompasses the portions "A1" to "A5"; that is, the portions "A1" to "A5" are regions within the inspection target portion "A".
Referring again to FIG. 3, along the vertical axis of the inspection result matrix 25, the work Nos. "001A" to "001E" are grouped as a group GH1, and the work Nos. "002A" to "002J" are grouped as a group GH2. The grouping of work Nos. is performed according to the classification rule 134B shown in FIG. 5.
FIG. 5 is a diagram showing an example of the data structure of the classification rule 134B. As shown in FIG. 5, the works to be inspected are hierarchically associated in the classification rule 134B. The hierarchical relationship of the works may be registered in advance or may be set arbitrarily by the user. As an example, each work is associated with a lot number of the production line. A lot number defined in an upper layer has a relationship of encompassing the works defined in the lower layer. In the example of FIG. 5, the lot number "001" encompasses the work Nos. "001A" to "001E"; that is, it indicates that the works "001A" to "001E" were produced in lot "001".
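Both classification rules can be modeled as simple mappings from an upper-layer item to the lower-layer items it encompasses; the concrete encoding below is an assumption for illustration (the patent specifies only the hierarchy itself, not its representation).

```python
# Illustrative sketch of the classification rules 134A and 134B as
# parent-to-children mappings. The dict encoding is an assumption.

classification_rule_134a = {   # inspection target portion -> sub-portions
    "A": ["A1", "A2", "A3", "A4", "A5"],
    "B": ["B1", "B2", "B3"],
}

classification_rule_134b = {   # lot number -> works produced in the lot
    "001": ["001A", "001B", "001C", "001D", "001E"],
    "002": ["002A", "002B", "002C", "002D", "002E",
            "002F", "002G", "002H", "002I", "002J"],
}

def members(rule, upper_item):
    """Return the lower-layer items encompassed by an upper-layer item."""
    return rule.get(upper_item, [])
```

The same lookup serves both axes of the matrix: columns are grouped via rule 134A and rows via rule 134B.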
Referring again to FIG. 3, an expand/aggregate button is assigned to each group along the vertical and horizontal axes of the inspection result matrix 25. In the example of FIG. 3, the expand/aggregate button BH1 is assigned to the group GH1, the expand/aggregate button BH2 to the group GH2, the expand/aggregate button BV1 to the group GV1, and the expand/aggregate button BV2 to the group GV2.
FIG. 6 is a diagram showing the screen transition of the inspection result matrix 25 when the expand/aggregate button BV2 is pressed. Each time it is pressed, the expand/aggregate button BV2 alternately receives an aggregation instruction and an expansion instruction for the cells of the group GV2. That is, when the expand/aggregate button BV2 is pressed while the cells of the group GV2 are expanded, the display control unit 22 of the image processing device 20 aggregates the cells of the group GV2 into a single column. Conversely, when the expand/aggregate button BV2 is pressed while the cells of the group GV2 are aggregated, the display control unit 22 returns the display of the cells of the group GV2 to the state before aggregation.
When the inspection results to be aggregated include both results indicating "defective" and results indicating "non-defective", the display control unit 22 performs the aggregation giving priority to displaying the "defective" results over the "non-defective" results. As an example, the inspection results within the broken line AR1 include one result indicating "defective" and two results indicating "non-defective". By the aggregation process, the three inspection results within the broken line AR1 are aggregated into the single inspection result shown within the broken line AR2. At this time, since the results before aggregation include "defective", the display control unit 22 represents the aggregated result as "defective".
In this way, when even one result indicating "defective" is included in the results before aggregation, the display control unit 22 sets the aggregated result to "defective". Conversely, when no result indicating "defective" is included in the results before aggregation, the display control unit 22 sets the aggregated result to "non-defective".
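The "defective takes priority" rule just described amounts to a logical OR over the group. A minimal sketch follows, with each cell result modeled as a boolean (True = "defective"), which is an assumption for illustration.

```python
# Sketch of the aggregation rule: the aggregated cell is "defective"
# if even one result in the group is "defective", and "non-defective"
# only if none are. Boolean modeling is an illustrative assumption.

def aggregate(group_results):
    """Aggregate a group of cell results into one displayed result."""
    return any(group_results)

# The three cells inside broken line AR1: one defective, two non-defective.
ar1 = [True, False, False]
print("defective" if aggregate(ar1) else "non-defective")  # prints "defective"
```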
FIG. 7 is a diagram showing the screen transition of the inspection result matrix 25 when the expand/aggregate button BH2 is pressed. Each time it is pressed, the expand/aggregate button BH2 alternately receives an aggregation instruction and an expansion instruction for the cells of the group GH2. That is, when the expand/aggregate button BH2 is pressed while the cells of the group GH2 are expanded, the display control unit 22 aggregates the cells of the group GH2 into a single row. Conversely, when the expand/aggregate button BH2 is pressed while the cells of the group GH2 are aggregated, the display control unit 22 returns the display of the cells of the group GH2 to the state before aggregation.
As an example, the inspection results within the broken line AR3 include one result indicating "defective" and nine results indicating "non-defective". By the aggregation process, the ten inspection results within the broken line AR3 are aggregated into the single inspection result shown within the broken line AR4. At this time, since the results before aggregation include "defective", the display control unit 22 represents the aggregated result as "defective".
As described above, when the display control unit 22 of the image processing device 20 receives an aggregation instruction for a grouped row group or a grouped column group, it aggregates the inspection results of that row group or column group into a single row or column for display. Also, when the display control unit 22 receives an expansion instruction for inspection results displayed in aggregated form, it returns the display of those results to the state before aggregation. Thus, even when the number of rows or columns of the inspection result matrix 25 is too large to fit on the display device 23, the user can efficiently check the inspection results by making use of the aggregated and expanded displays.
In the example described above, the aggregated cells are represented by the two values "defective" and "non-defective", but the aggregated cells do not necessarily have to be represented by two values. As an example, the more results indicating "defective" are included in the cells to be aggregated, the more densely the display control unit 22 may shade the display of the aggregated cell.
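One way to realize such a graded display is to shade the aggregated cell according to the fraction of "defective" results in the group; the four-step shading scale below is purely an illustrative assumption.

```python
# Sketch of a non-binary aggregated cell: map the defect ratio of the
# group to one of four shades. The scale itself is an assumption.

SHADES = [" ", "░", "▒", "█"]   # lightest ... darkest

def aggregated_shade(group_results):
    """Return a shade that darkens as the defect ratio grows."""
    if not group_results:
        return SHADES[0]
    ratio = sum(group_results) / len(group_results)
    if ratio == 0:
        return SHADES[0]          # no defects at all -> lightest shade
    return SHADES[min(3, 1 + int(ratio * 3))]
```

With this variant, the binary rule above is the special case of distinguishing only the lightest shade from the rest.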
<C. Hardware Configuration>
Next, the hardware configurations of the image processing device 20, the PLC 50, and the setting device 60 shown in FIG. 1 will be described in order with reference to FIGS. 8 to 10.
(C1. Hardware Configuration of the Image Processing Device 20)
FIG. 8 is a schematic diagram showing an example of the hardware configuration of the image processing device 20. Referring to FIG. 8, the image processing device 20 typically has a structure conforming to a general-purpose computer architecture, and realizes various kinds of image processing, such as the appearance inspection process, by having its processor execute programs installed in advance.
More specifically, the image processing device 20 includes a processor 110 such as a CPU (Central Processing Unit) or an MPU (Micro-Processing Unit), a RAM (Random Access Memory) 112, a display controller 114, a system controller 116, an I/O (Input Output) controller 118, a storage device 120 such as a hard disk, a camera interface 122, an input interface 124, a controller interface 126, a communication interface 128, and a memory card interface 130. These units are connected to one another, centered on the system controller 116, so as to be capable of data communication.
The processor 110 exchanges programs (code) and the like with the system controller 116 and executes them in a predetermined order, thereby realizing the intended arithmetic processing.
The system controller 116 is connected to the processor 110, the RAM 112, the display controller 114, and the I/O controller 118 via respective buses; it exchanges data with each unit and controls the processing of the image processing device 20 as a whole.
The RAM 112 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory), and holds the programs read from the storage device 120, camera images (image data) acquired by the imaging device 10, processing results for the camera images, work data, and the like.
The display controller 114 is connected to the display device 23, and outputs signals for displaying various kinds of information to the display device 23 in accordance with internal commands from the system controller 116.
The I/O controller 118 controls data exchange with recording media and external devices connected to the image processing device 20. More specifically, the I/O controller 118 is connected to the storage device 120, the camera interface 122, the input interface 124, the controller interface 126, the communication interface 128, and the memory card interface 130.
The storage device 120 stores various kinds of data such as a project file 144, in addition to an image processing program 142 and a display program 143 executed by the processor 110. Details of the project file 144 will be described later. Typically, the storage device 120 may be a nonvolatile magnetic storage device such as a hard disk, a semiconductor storage device such as a flash memory, or an optical storage device such as a DVD-RAM (Digital Versatile Disk Random Access Memory).
The image processing program 142 and the display program 143 may be provided incorporated as part of another program. In that case, the image processing program 142 and the display program 143 themselves execute predetermined processing in cooperation with that other program. That is, the image processing program 142 and the display program 143 may take the form of being incorporated into such another program. Alternatively, some or all of the functions provided by executing the image processing program 142 or the display program 143 may be implemented as dedicated hardware circuits.
The camera interface 122 corresponds to an input unit that receives the image data generated by photographing the work W, and mediates data transmission between the imaging device 10 and the processor 110. More specifically, the camera interface 122 can be connected to one or more imaging devices 10, and an imaging instruction is output from the processor 110 to the imaging device 10 via the camera interface 122. The imaging device 10 thereby photographs the subject and outputs the generated image to the processor 110 via the camera interface 122.
The input interface 124 mediates data transmission between the processor 110 and input devices such as a keyboard 134, a mouse, a touch panel, or a dedicated console.
 The controller interface 126 mediates data transmission between the PLC 50 and the processor 110. More specifically, the controller interface 126 transmits to the processor 110 information on the state of the production line controlled by the PLC 50, information on the work W, and the like.
 The communication interface 128 mediates data transmission between the processor 110 and another personal computer, server device, or the like (not shown). The communication interface 128 typically conforms to Ethernet (registered trademark), USB (Universal Serial Bus), or the like.
 The memory card interface 130 mediates data transmission between the processor 110 and the memory card 136, which is a recording medium. The memory card 136 is distributed with the image processing program 142, the display program 143, and the like executed by the image processing apparatus 20 stored therein, and the memory card interface 130 reads these programs from the memory card 136. The memory card 136 may be a general-purpose semiconductor storage device such as an SD (Secure Digital) card, a magnetic recording medium such as a flexible disk, or an optical recording medium such as a CD-ROM (Compact Disk Read Only Memory). Alternatively, a program downloaded from a distribution server or the like via the communication interface 128 may be installed in the image processing apparatus 20.
 (C2. Hardware configuration of the PLC 50)
 Next, the hardware configuration of the PLC 50 will be described with reference to FIG. 9. FIG. 9 is a schematic diagram illustrating an example of the hardware configuration of the PLC 50.
 The PLC 50 includes a chipset 212, a processor 214, a nonvolatile memory 216, a main memory 218, a system clock 220, a memory card interface 222, a communication interface 228, an internal bus controller 230, and a fieldbus controller 238. The chipset 212 and the other components are coupled to one another via various buses.
 The processor 214 and the chipset 212 typically have configurations according to a general-purpose computer architecture. That is, the processor 214 interprets and executes the instruction codes sequentially supplied from the chipset 212 according to an internal clock. The chipset 212 exchanges internal data with the various connected components and generates the instruction codes required by the processor 214. The system clock 220 generates a system clock of a predetermined cycle and outputs it to the processor 214. The chipset 212 has a function of caching data and the like obtained as a result of arithmetic processing executed by the processor 214.
 The PLC 50 has a nonvolatile memory 216 and a main memory 218 as storage means. The nonvolatile memory 216 holds an OS, a system program, a user program, log information, and the like in a nonvolatile manner. The main memory 218 is a volatile storage area that holds the various programs to be executed by the processor 214 and is also used as working memory during the execution of those programs.
 The PLC 50 has a communication interface 228, an internal bus controller 230, and a fieldbus controller 238 as communication means. These communication circuits transmit and receive data.
 The communication interface 228 exchanges data with the image processing device 20 and the setting device 60. As an example, the PLC 50 outputs an imaging instruction to the image processing device 20 via the communication interface 228. The PLC 50 also receives the appearance inspection result of the work W from the image processing device 20 via the communication interface 228.
 The internal bus controller 230 controls the exchange of data via the internal bus 226. More specifically, the internal bus controller 230 includes a DMA (Direct Memory Access) control circuit 232, an internal bus control circuit 234, and a buffer memory 236.
 The memory card interface 222 connects the processor 214 to a memory card 224 that is attachable to and detachable from the PLC 50.
 The fieldbus controller 238 is a communication interface for connecting to a field network. The PLC 50 is connected to the robot controller 40 via the fieldbus controller 238. For the field network, for example, EtherCAT (registered trademark), EtherNet/IP (registered trademark), or CompoNet (registered trademark) is employed.
 (C3. Hardware configuration of the setting device 60)
 Next, the hardware configuration of the setting device 60 will be described with reference to FIG. 10. FIG. 10 is a schematic diagram illustrating an example of the hardware configuration of the setting device 60.
 The setting device 60 includes a processor 362, a main memory 363, a storage device 364, a display 366, an input device 367, and a communication interface 368. These units are connected to one another via a bus 361 so as to be capable of data communication.
 The processor 362 loads programs (code), including the setting program 365 installed in the storage device 364, into the main memory 363 and executes them in a predetermined order, thereby performing various operations. The main memory 363 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory).
 The storage device 364 is an internal, nonvolatile memory of the setting device 60 and stores various programs such as the setting program 365. The storage device 364 may be a hard disk or a semiconductor storage device such as a flash memory.
 The setting program 365 is a program describing the procedure by which the setting device 60 sets the change path of the imaging conditions. Various programs such as the setting program 365 need not be stored in the storage device 364; they may instead be stored in a server that can communicate with the setting device 60 or in an external memory that can be directly connected to the setting device 60. For example, the external memory may be distributed with the various programs executed by the setting device 60 and the various parameters used by those programs stored therein, and the setting device 60 reads the programs and parameters from the external memory. The external memory is a medium that accumulates information such as programs by electrical, magnetic, optical, mechanical, or chemical action so that a computer or other device or machine can read the recorded information. Alternatively, programs and parameters downloaded from a server or the like communicably connected to the setting device 60 may be installed in the setting device 60.
 The display 366 is, for example, a liquid crystal display. The input device 367 includes, for example, a mouse, a keyboard, and a touch pad.
 The communication interface 368 exchanges various data between the PLC 50 and the processor 362. The communication interface 368 may also exchange data between a server and the processor 362. The communication interface 368 includes hardware corresponding to the network for exchanging various data with the PLC 50.
 <D. Project file 144>
 The project file 144 described above (see FIG. 8) will be described with reference to FIG. 11. FIG. 11 is a diagram showing an example of the data structure of the project file 144.
 By associating the various data relating to the inspection processing of the work W with the project file 144, those data are managed centrally.
 As an example, the project file 144 is associated with an imaging condition file 146A, an image file group 146B, an inspection condition file 146C, an inspection result file 146D, a work information file 146E, a production information file 146F, a classification rule file 146G, and an expected value file 146H.
 The imaging condition file 146A is data that defines the imaging conditions for each inspection target portion of the work W. That is, by referring to the imaging condition file 146A, the image processing apparatus 20 can uniquely identify the imaging conditions using the inspection target portion as a key. Alternatively, the imaging conditions may be uniquely identified using the combination of the work No. and the inspection target portion as a key. The imaging conditions include, for example, the relative position between the work W and the imaging device 10 and the position of the illumination when the work W is imaged.
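The key-to-condition lookup described above can be modeled as a simple mapping. This is an illustrative sketch only, not the patent's implementation; the field names and values are hypothetical.

```python
# Hypothetical in-memory form of the imaging condition file 146A.
# The inspection target portion (part name) is the lookup key, as in
# the text; field names and values are illustrative assumptions.
imaging_conditions = {
    "part_A": {"relative_position_mm": (120.0, 40.0, 250.0),
               "illumination_position_mm": (100.0, 0.0, 300.0)},
    "part_B": {"relative_position_mm": (80.0, 40.0, 250.0),
               "illumination_position_mm": (60.0, 0.0, 300.0)},
}

def lookup_imaging_condition(part_name):
    """Uniquely identify the imaging conditions using the part as a key."""
    return imaging_conditions[part_name]
```

A two-level key (work No. plus part name) would use a tuple key instead, as in the alternative the text mentions.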
 The image file group 146B is a group of data obtained from the imaging device 10 by imaging the work W according to the defined imaging conditions. Each image file included in the image file group 146B is associated with a work No. and an inspection target portion. That is, the image processing apparatus 20 can uniquely identify the image file used for an inspection using the combination of the work No. and the inspection target portion as a key. The information on this combination is specified, for example, in the file name of the image file or in its header.
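One of the two options mentioned above, encoding the (work No., inspection target portion) key in the file name, can be sketched as follows; the naming scheme itself is an assumption, since the patent does not specify one.

```python
def image_file_name(work_no, part_name):
    # Encode the (work No., inspection target portion) key in the file
    # name; the "workNNNN_part.png" pattern is a hypothetical scheme.
    return f"work{work_no:04d}_{part_name}.png"

def parse_image_file_name(file_name):
    # Recover the key from a name produced by image_file_name().
    stem = file_name.rsplit(".", 1)[0]
    work_part, part_name = stem.split("_", 1)
    return int(work_part[len("work"):]), part_name
```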
 The inspection condition file 146C is data defining the inspection conditions for each inspection target portion of the work. That is, by referring to the inspection condition file 146C, the image processing apparatus 20 can uniquely identify the inspection conditions using the inspection target portion as a key. The inspection conditions include, for example, the inspection target portion within the image, the image processing flowchart to be executed, the measurement parameters read when the image processing is executed, and the threshold for determining the presence or absence of a defect.
 The inspection result file 146D is data that defines the measurement values, inspection results, and the like for each inspection target portion of each work. Details of the data structure of the inspection result file 146D will be described later.
 The work information file 146E includes a three-dimensional model representing the shape of the work to be inspected, type information of the work to be inspected, and the like.
 The production information file 146F is data defining the lot number, serial number, and the like of the work to be inspected.
 The classification rule file 146G is data including at least one of the classification rule 134A described above (see FIG. 4), the classification rule 134B described above (see FIG. 5), and the classification rule 134C described later (see FIG. 30).
 The expected value file 146H is data that defines the expected value of the inspection result for each inspection target portion of each work to be inspected. Details of the expected value file 146H will be described later.
 <E. Inspection result file 146D>
 The inspection result file 146D included in the project file 144 described above (see FIG. 11) will be described with reference to FIG. 12.
 FIG. 12 is a diagram showing an example of the data structure of the inspection result file 146D. The inspection result file 146D includes identification information of the work (for example, a work No.), a part name indicating the inspection target portion of the work, a measurement value obtained as an output of the inspection processing, and an inspection result indicating the presence or absence of a defect.
 By referring to the inspection result file 146D, the image processing apparatus 20 can uniquely identify the measurement value and the inspection result using the combination of the work No. and the inspection target portion as a key.
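The composite-key lookup described above can be sketched with tuple keys; the record contents below are hypothetical placeholders for the measurement value and OK/NG result named in the text.

```python
# Hypothetical rows of the inspection result file 146D: the combination
# (work No., part name) is the unique key, as described in the text.
inspection_results = {
    (1, "top_edge"):  {"measurement": 0.03, "result": "OK"},
    (1, "side_face"): {"measurement": 0.41, "result": "NG"},
    (2, "top_edge"):  {"measurement": 0.05, "result": "OK"},
}

def lookup_result(work_no, part_name):
    # The (work No., part) pair uniquely identifies one record.
    return inspection_results[(work_no, part_name)]
```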
 <F. Process flow in the setting device>
 FIG. 13 is a flowchart illustrating an example of the flow of processing in the setting device 60. When the appearance inspection system 1 needs to inspect the appearance of a work W of a new product or a new type, the setting device 60 performs processing according to, for example, the flowchart shown in FIG. 13 and sets a change path of the imaging conditions suited to the new work W. Three-dimensional design data indicating the design surface of the new work W is stored in the storage device 364 of the setting device 60 in advance.
 In the example shown in FIG. 13, first, in step S1, the processor 362 of the setting device 60 reads the three-dimensional design data from the storage device 364. Next, in step S2, the processor 362 displays, on the display 366 of the setting device 60, a schematic diagram showing the design appearance of the work W indicated by the three-dimensional design data, and determines the inspection target area on the work W according to user input. At this time, on the assumption that the work W is placed at a predetermined position on the stage 90 in a predetermined posture, the processor 362 converts the coordinate system of the three-dimensional design data into an XYZ coordinate system whose origin is a point on the stage 90. The inspection target area is therefore expressed in this XYZ coordinate system.
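The coordinate conversion in step S2 amounts to applying the work's placement pose to each design point. A minimal sketch, assuming the placement reduces to a rotation about the stage's Z axis plus an offset (the general case would use a full rotation matrix):

```python
import math

def to_stage_coords(p_design, yaw_rad, offset):
    # Convert a design-data point into the XYZ coordinate system whose
    # origin is a point on the stage: rotate about Z by the assumed
    # placement yaw, then translate by the assumed placement offset.
    x, y, z = p_design
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return (c * x - s * y + offset[0],
            s * x + c * y + offset[1],
            z + offset[2])
```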
 Next, in step S3, the processor 362 determines a plurality of inspection target positions within the inspection target area so as to satisfy the inspection requirement corresponding to that area. That is, the processor 362 determines each of the determined inspection target positions as an inspection target portion.
 Next, in step S4, the processor 362 determines, for each of the inspection target positions, imaging conditions including the relative position between the work W and the imaging device 10 at the time of inspection.
 Next, in step S5, the processor 362 determines an imaging route that passes through the relative positions between the work W and the imaging device 10 determined in step S4.
 The inspection target area determined in step S2, the inspection target positions determined in step S3, the imaging conditions determined in step S4, and the imaging route determined in step S5 are transmitted to the image processing device 20 and the PLC 50.
 In the above example, a plurality of inspection target positions are determined from the inspection target area in step S3. However, a plurality of inspection target portions may be determined in step S2, and at least one inspection target position may be determined from each of the inspection target portions in step S3. In this case as well, a plurality of inspection target positions are determined in step S3.
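The S2 to S5 chain above can be sketched as four small stages feeding each other. Every implementation detail here is a hypothetical stand-in (trivial region selection, one position per point, a fixed camera offset, coordinate-order routing); only the data flow mirrors the flowchart.

```python
def determine_region(design_points, clicked):
    # S2 stand-in: keep the design points the user selected.
    return [p for p in design_points if p in clicked]

def determine_positions(region):
    # S3 stand-in: every region point becomes an inspection position.
    return list(region)

def determine_condition(position):
    # S4 stand-in: hypothetical condition placing the camera 250 mm
    # above the inspection target position.
    x, y, z = position
    return {"target": position, "camera": (x, y, z + 250.0)}

def determine_path(conditions):
    # S5 stand-in: visit the camera positions in coordinate order.
    return sorted(c["camera"] for c in conditions)
```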
 <G. Method for determining the inspection target portion>
 An example of a method of determining the inspection target portion will be described with reference to FIGS. 14 and 15. FIG. 14 is a diagram illustrating an example of a screen on which a schematic diagram showing the design appearance of the work W is displayed. FIG. 15 is a diagram illustrating an example of a screen on which the inspection target portion is displayed.
 As shown in FIG. 14, the setting device 60 causes the display 366 to display a screen 61a including a three-dimensional model ML representing the shape of the work W. The screen 61a includes a tool button 71 for rotating the three-dimensional model ML about the vertical axis and a tool button 72 for rotating the three-dimensional model ML about the horizontal axis. By operating the tool buttons 71 and 72, the user can rotate the three-dimensional model ML as appropriate.
 The setting device 60 receives, from the user, the designation of the places to be inspected. Specifically, the user uses the input device 367 to click a plurality of points to be inspected on the three-dimensional model ML of the work W. On the screen 61a shown in FIG. 14, the points clicked by the user are indicated by circles 74.
 The setting device 60 cuts out an area including the plurality of circles 74 designated on the three-dimensional model ML as the inspection target area. Specifically, for each designated circle 74, the setting device 60 obtains the range within a predetermined distance along the surface from the corresponding point on the surface of the work W, and cuts out the union of these ranges as the inspection target area.
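The union-of-ranges step above can be sketched on a sampled surface. As a simplification, straight-line distance stands in for the along-surface distance used in the text (equivalent only on a flat face), and the point sampling is an assumption.

```python
def cut_out_region(surface_points, clicked_points, radius):
    # Union of the ranges within `radius` of each clicked point.
    # Euclidean distance is a flat-face stand-in for the geodesic
    # (along-surface) distance described in the text.
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    region = set()
    for c in clicked_points:
        region |= {p for p in surface_points if dist(p, c) <= radius}
    return region
```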
 The setting device 60 further adjusts the inspection target area so that its contour becomes a geometric figure such as a straight line or a circle. The screen 61b of the display 366 shown in FIG. 15 shows the inspection target area 75 adjusted so that its contour consists of straight lines parallel to the ridges of the work W.
 The setting device 60 further receives from the user an instruction to finely adjust the inspection target area 75, and adjusts the area accordingly. The screen 61b includes tool buttons 76 and 77 for enlarging or reducing the inspection target area 75. Using the input device 367, the user selects one side of the contour of the inspection target area 75 and operates the tool buttons 76 and 77 to input an instruction to enlarge or reduce the area. Alternatively, the user may input such an instruction by dragging a point 78 on one side of the contour with a mouse included in the input device 367. The setting device 60 then enlarges or reduces the inspection target area 75 accordingly. In this way, the setting device 60 determines the inspection target area 75. In the example shown in FIG. 15, an area combining partial areas of four of the six faces of the rectangular parallelepiped work W is determined as the inspection target area 75.
 <H. Method of determining the inspection target position>
 An example of a method by which the setting device 60 determines the inspection target positions will be described with reference to FIGS. 16 to 18. FIG. 16 is a diagram illustrating examples of points on the inspection target area. FIG. 17 is a diagram illustrating an example of an inspection target position determined from the inspection target area and the corresponding effective field of view. FIG. 18 is a diagram illustrating all the inspection target portions determined from the inspection target area together with their effective fields of view.
 When the camera resolution is R (pix) and the required minimum defect size is D (mm), the largest imaging field of view FOV of the imaging device 10 that can satisfy the inspection requirement that a defect of the minimum defect size be recognizable is set as the inspection target portion. The diameter of the imaging field of view FOV is generally expressed as a × D × R using a proportionality constant a.
 The setting device 60 regards the inspection target area 75 as a set of points and can investigate the three-dimensional shape near each point in the inspection target area 75 using the distribution of normal vectors. For example, based on the three-dimensional design data of the work W, the setting device 60 obtains the distribution of normal vectors within a distance L along the surface from a point in the inspection target area 75. The distance L is, for example, the diameter of the imaging field of view FOV (= a × D × R) multiplied by a proportionality constant b (that is, L = b × a × D × R).
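The two formulas above are direct products, so they can be written down as-is; the parameter values in the usage below are arbitrary examples, not values from the patent.

```python
def fov_diameter(a, defect_size_mm, resolution_pix):
    # Diameter of the imaging field of view: a × D × R.
    return a * defect_size_mm * resolution_pix

def neighborhood_distance(b, a, defect_size_mm, resolution_pix):
    # L = b × (a × D × R): the along-surface range whose normal-vector
    # distribution is examined around each point.
    return b * fov_diameter(a, defect_size_mm, resolution_pix)
```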
 In the example shown in FIG. 16, the vicinity of the point P1 is flat. Therefore, the normal vectors within the distance L along the surface of the work W from the point P1 are all the vector n2. The point P2 is located near a ridge where two faces meet. Therefore, the normal vectors within the distance L along the surface from the point P2 include the two vectors n2 and n4. The point P3 is located near a vertex where three faces meet. Therefore, the normal vectors within the distance L along the surface from the point P3 include the three vectors n1, n2, and n4. Accordingly, from the distribution of the normal vectors within the distance L along the surface from a point in the inspection target area 75, the setting device 60 can determine whether the vicinity of that point is flat, whether a ridge exists near the point, or whether a vertex exists near the point.
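The flat/ridge/vertex distinction above reduces, in the idealized polyhedral case of FIG. 16, to counting distinct normal directions in the neighborhood. This sketch assumes exact normals; real design data would need a tolerance when comparing vectors.

```python
def classify_neighborhood(normal_vectors):
    # Distinct normal directions within distance L of the point:
    # one (like P1) -> flat, two (like P2) -> near a ridge,
    # three or more (like P3) -> near a vertex.
    distinct = len(set(normal_vectors))
    if distinct == 1:
        return "flat"
    if distinct == 2:
        return "ridge"
    return "vertex"
```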
 As shown in FIG. 17, the setting device 60 selects one point at random from the inspection target area 75 as the inspection target position Bi. For the inspection target position Bi, the setting device 60 obtains the effective field of view FOV2i of the imaging device 10 as the inspection target portion. The effective field of view FOV2i is a field of view that includes the inspection target position Bi and that the imaging device 10 can image and inspect under a single imaging condition. The setting device 60 determines the effective field of view FOV2i according to the three-dimensional shape near the inspection target position Bi. The maximum diameter that the effective field of view FOV2i can take is set to the diameter of the imaging field of view FOV (= a × D × R).
 For example, the setting device 60 determines the effective field of view FOV2i so that the variation of the normal vector distribution within FOV2i falls within a predetermined range. When the vicinity of the inspection target position Bi is flat, the setting device 60 determines the range within the distance a × D × R along the surface from the inspection target position Bi (that is, the imaging field of view FOV) as the effective field of view FOV2i. On the other hand, when a ridge or a vertex exists near the inspection target position Bi, as shown in FIG. 17, the setting device 60 sets, as the effective field of view FOV2i, the range within the distance a × D × R along the surface from the inspection target position Bi excluding a partial range. The excluded range is the range of the surface whose normal vectors differ from the normal vector accounting for the largest share of the normal vector distribution.
 Next, the setting device 60 removes the set of points belonging to the determined effective field of view FOV2i from the set of points belonging to the inspection target area 75, and selects one point at random from the remaining points as the next inspection target position B(i+1). The setting device 60 also determines the effective field of view FOV2(i+1) for the selected inspection target position B(i+1). The setting device 60 repeats this processing until the set of points belonging to the inspection target area 75 becomes empty. As a result, as shown in FIG. 18, a plurality of inspection target positions Bi (i = 1, 2, ...) are determined from the inspection target area 75. Every point in the inspection target area 75 is included in the effective field of view FOV2i of one of the inspection target positions Bi (i = 1, 2, ...). As described above, the maximum diameter that the effective field of view FOV2i can take is set to the diameter of the imaging field of view FOV. Therefore, the inspection target positions are determined so that a defect of the minimum defect size can be recognized over the entire inspection target area 75.
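The select-and-remove loop above is a greedy set-covering procedure, and can be sketched as follows. A deterministic pick replaces the random selection so the result is reproducible, and the effective-field-of-view computation is abstracted into a caller-supplied function.

```python
def choose_inspection_positions(region_points, effective_fov):
    # Greedy covering loop: pick a remaining point as Bi, remove every
    # point covered by its effective field of view FOV2i, and repeat
    # until the point set is empty. `effective_fov(b, pts)` must return
    # the covered subset of `pts`, always including b itself.
    remaining = set(region_points)
    positions = []
    while remaining:
        b = min(remaining)  # deterministic stand-in for a random pick
        positions.append(b)
        remaining -= effective_fov(b, remaining)
    return positions
```

Because each pass removes at least the chosen point, the loop terminates, and by construction every region point ends up inside some FOV2i.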
 In the above description, the setting device 60 extracts the inspection target positions Bi at random from the inspection target area 75. However, the setting device 60 may extract the inspection target positions Bi from the inspection target area 75 according to a predetermined geometric condition. Alternatively, the setting device 60 may extract the inspection target positions Bi from the inspection target area 75 so that the extracted inspection target positions Bi are regularly aligned.
For example, when defects tend to occur near ridgelines or vertices of the work W, the inspection target positions may be determined so as to satisfy an inspection requirement that the vicinity of ridgelines or vertices be inspected preferentially. For example, the inspection target positions Bi may be preferentially extracted from the set of points in the inspection target region 75 for which a ridgeline or vertex exists within a predetermined distance.
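The selection loop described above (choose a random remaining point, remove the points its effective field of view covers, repeat until the region is empty) can be sketched as follows. This is an illustrative sketch, not code from the patent; `effective_fov` is a hypothetical helper that returns the set of region points covered from a candidate position.

```python
import random

def choose_positions(region_points, effective_fov):
    """Greedy random covering: pick inspection target positions Bi until
    every point of the inspection target region is inside some effective
    field of view FOV2i. `effective_fov(b)` returns the set of region
    points covered from position b (hypothetical helper)."""
    remaining = set(region_points)
    positions = []
    while remaining:
        b = random.choice(sorted(remaining))   # pick one remaining point at random
        positions.append(b)
        remaining -= effective_fov(b)          # drop points already covered
    return positions
```

Termination is guaranteed as long as each position's effective field of view covers at least the position itself, so `remaining` shrinks on every iteration.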
<I. How to determine imaging conditions>
FIG. 19 is a diagram illustrating an example of a method by which the setting device 60 determines the imaging position Ci for the work W. The setting device 60 determines the imaging position Ci on the normal to the designed outer surface of the work W at the inspection target position Bi. Specifically, the imaging position Ci is determined at a position from which the effective field of view FOV2i corresponding to the inspection target position Bi can be imaged, and which is separated from the inspection target position Bi by the optimal subject distance at which the inspection target position Bi is in focus.
Next, the setting device 60 determines imaging conditions based on the determined imaging position Ci. Typically, the setting device 60 determines optimal imaging conditions such that the inspection target position Bi is included in the field of view and is in focus.
The imaging conditions include, for example, six parameters: the X, Y, and Z coordinates of the imaging device 10 in an XYZ coordinate system whose origin is a point on the stage 90, and θx, θy, and θz, which specify the direction of the optical axis of the imaging device 10. θx is the angle between the X axis and the projection of the optical axis of the imaging device 10 onto the XY plane, θy is the angle between the Y axis and the projection of the optical axis onto the YZ plane, and θz is the angle between the Z axis and the projection of the optical axis onto the ZX plane. The XYZ coordinates are parameters that specify the relative position between the work W and the imaging device 10, and θx, θy, and θz are parameters that specify the posture of the imaging device 10 with respect to the work W.
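As an illustration of the three posture parameters, each angle can be derived from a direction vector along the optical axis by projecting it onto the corresponding coordinate plane, as described above. The sketch below is not from the patent, and the sign convention implied by `atan2` is an assumption.

```python
import math

def pose_angles(axis):
    """Compute theta_x, theta_y, theta_z (in degrees) for an optical-axis
    direction vector (ax, ay, az), following the plane projections
    described in the text. The sign convention (via atan2) is an
    illustrative assumption."""
    ax, ay, az = axis
    theta_x = math.degrees(math.atan2(ay, ax))  # projection onto XY plane vs. X axis
    theta_y = math.degrees(math.atan2(az, ay))  # projection onto YZ plane vs. Y axis
    theta_z = math.degrees(math.atan2(ax, az))  # projection onto ZX plane vs. Z axis
    return theta_x, theta_y, theta_z

# A camera looking straight down the -Z direction:
print(pose_angles((0.0, 0.0, -1.0)))  # prints (0.0, -90.0, 180.0)
```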
<J. How to determine the imaging path>
FIG. 20 is a diagram illustrating an example of the imaging path determined for the imaging positions Ci.
The setting device 60 determines an imaging path that passes through the determined imaging positions C1 to C7. In doing so, the setting device 60 determines the imaging path so as to satisfy a predetermined requirement. For example, when the predetermined requirement is that the travel time be minimized, the setting device 60 sets, as the imaging path, the route candidate with the shortest travel time among the route candidates that pass through the plurality of imaging positions in sequence. For example, the setting device 60 may calculate, for each route candidate, an evaluation value for the item indicated by the predetermined requirement (for example, travel time), and set the imaging path based on the calculated evaluation values. Furthermore, when a plurality of imaging position candidates exist for each of the plurality of inspection target positions, the setting device 60 may select one of those candidates as the imaging position based on the evaluation values. In this way, an imaging path optimized to satisfy the predetermined requirement is set.
FIG. 20 shows the determined imaging path PS. In the example of FIG. 20, the imaging path PS is determined so as to pass through the imaging positions in the order "imaging position C1" → "imaging position C4" → "imaging position C5" → "imaging position C6" → "imaging position C7" → "imaging position C3" → "imaging position C2".
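For a small number of imaging positions, the "shortest travel time" requirement can be illustrated by exhaustively scoring every route candidate, here using total Euclidean distance as a stand-in evaluation value. This sketch is illustrative only; the patent does not prescribe a particular optimization method.

```python
from itertools import permutations
import math

def shortest_path(start, points):
    """Evaluate every visiting order of the imaging positions and return
    the one with the smallest total travel distance (a stand-in for
    travel time). Exhaustive search is feasible only for a handful of
    positions; any other optimizer could serve the same role."""
    best_order, best_cost = None, float("inf")
    for order in permutations(points):
        cost = math.dist(start, order[0]) + sum(
            math.dist(order[i], order[i + 1]) for i in range(len(order) - 1))
        if cost < best_cost:           # keep the candidate with the best evaluation value
            best_order, best_cost = order, cost
    return list(best_order), best_cost
```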
<K. Control processing by the PLC 50>
FIG. 21 is a flowchart illustrating an example of the flow of processing in the PLC 50. The processing illustrated in FIG. 21 is realized by the processor 214 of the PLC 50 executing a program. In other aspects, part or all of the processing may be executed by circuit elements or other hardware.
The PLC 50 controls the robot controller 40 so that the imaging device 10 travels along the determined imaging path PS (see FIG. 20), and controls the image processing device 20 so that the imaging device 10 executes imaging at each determined imaging position Ci (see FIG. 20).
First, in step S10, the processor 214 determines whether the work W placed on the stage 90 has been set at a predetermined position. If the processor 214 determines that the work W has been set at the predetermined position (YES in step S10), it switches the control to step S12. Otherwise (NO in step S10), the processor 214 executes the processing of step S10 again.
In step S12, the processor 214 sequentially outputs command values to the robot controller 40 according to the preset imaging path PS. The robot controller 40 drives each axis of the robot 30 according to the command values from the PLC 50. As a result, the imaging device 10 attached to the tip of the robot 30 moves sequentially along the imaging path PS.
In step S20, the processor 214 determines whether the imaging device 10 has reached a preset imaging position Ci (see FIG. 20). Whether the imaging device 10 has reached the preset imaging position Ci is determined, for example, based on the command values output to the robot controller 40 in step S12. If the processor 214 determines that the imaging device 10 has reached the preset imaging position Ci (YES in step S20), it switches the control to step S22. Otherwise (NO in step S20), the processor 214 switches the control to step S30.
In step S22, the processor 214 outputs an imaging instruction to the image processing device 20. In response, the image processing device 20 executes the imaging processing.
In step S30, the processor 214 determines whether the imaging device 10 has reached the end point of the preset imaging path PS. Whether the imaging device 10 has reached the end point of the preset imaging path PS is determined, for example, based on the command values output to the robot controller 40 in step S12. If the processor 214 determines that the imaging device 10 has reached the end point of the preset imaging path PS (YES in step S30), the processor 214 returns the control to step S10. Otherwise (NO in step S30), the processor 214 returns the control to step S12.
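The control flow of FIG. 21 (steps S10, S12, S20, S22, S30) can be sketched as the following loop for a single work. All five callbacks are hypothetical stand-ins for the PLC's actual I/O, not part of the patent.

```python
def plc_control_loop(work_is_set, move_along_path, at_imaging_position,
                     at_path_end, send_imaging_instruction):
    """Sketch of the FIG. 21 flow for one work W. The callbacks stand in
    for sensing the work (S10), outputting command values (S12),
    checking the current position (S20) and the path end (S30), and
    instructing the image processing device (S22)."""
    while not work_is_set():            # S10: wait until the work is in place
        pass
    while True:
        move_along_path()               # S12: next command value to the robot
        if at_imaging_position():       # S20: at a preset imaging position Ci?
            send_imaging_instruction()  # S22: trigger the image processing device
        if at_path_end():               # S30: path finished -> back to S10
            break
```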
<L. Inspection processing by the image processing device 20>
FIG. 22 is a flowchart illustrating an example of the flow of the inspection processing performed by the image processing device 20. The processing illustrated in FIG. 22 is realized by the processor 110 of the image processing device 20 executing the image processing program 142 (see FIG. 8). Note that part or all of the processing may be executed by circuit elements or other hardware.
In step S50, the processor 110 determines whether an imaging instruction has been received from the PLC 50. As described above with reference to FIG. 21, the PLC 50 outputs an imaging instruction to the image processing device 20 when the imaging device 10 reaches a preset imaging position Ci. If the processor 110 determines that the imaging instruction has been received from the PLC 50 (YES in step S50), it switches the control to step S52. Otherwise (NO in step S50), the processor 110 switches the control to step S60.
In step S52, the processor 110, acting as the above-described inspection unit 21 (see FIG. 1), outputs an imaging instruction to the imaging device 10. The imaging device 10 executes the imaging processing upon receiving the imaging instruction from the image processing device 20. In this way, the image processing device 20 can cause the imaging device 10 to execute the imaging processing at the predetermined imaging position Ci.
In step S54, the processor 110, acting as the inspection unit 21, executes the inspection processing by performing predetermined image processing on the image obtained from the imaging device 10. The image processing to be executed is specified, for example, in the above-described inspection condition file 146C (see FIG. 11). As described above, the inspection condition file 146C is data that defines the inspection conditions for each inspection target portion of the work W. That is, by referring to the inspection condition file 146C, the image processing device 20 can uniquely identify the inspection conditions using the inspection target portion as a key. The processor 110 performs image processing on the obtained image according to the inspection conditions acquired from the inspection condition file 146C, and obtains an inspection result.
In step S56, the processor 110, acting as the inspection unit 21, writes the inspection result obtained in step S54 into the above-described inspection result file 146D (see FIG. 12). As an example, the processor 110 associates the identification information of the work W (for example, the work No.), the part name indicating the inspection target portion of the work W, the measurement value obtained as the output of the inspection processing, and the inspection result indicating the presence or absence of a defect with one another, and writes them into the inspection result file 146D.
In step S60, the processor 110 determines whether an instruction to end the inspection processing has been received. If the processor 110 determines that an instruction to end the inspection processing has been received (YES in step S60), it ends the inspection processing shown in FIG. 22. Otherwise (NO in step S60), the processor 110 returns the control to step S50.
In the example of FIG. 22, the inspection processing is executed each time the imaging processing of the work W is executed. However, the imaging processing of the work W and the inspection processing of the images may be executed separately. That is, the image processing device 20 may first accumulate the images and then inspect the accumulated images collectively.
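Steps S54 and S56 (look up the inspection conditions keyed by the inspection target portion, inspect, and record the result) can be sketched as follows, with the inspection condition file 146C reduced to a dictionary and the inspection result file 146D to a list. The threshold-based pass/fail rule is an illustrative assumption; the patent only specifies that predetermined image processing yields a measurement value and a defect judgment.

```python
def inspect_and_record(image, work_no, part_name, conditions, results):
    """Sketch of steps S54/S56. `conditions` maps a part name to a
    (threshold, measure) pair - a simplified stand-in for the inspection
    condition file 146C, keyed by inspection target portion. `measure`
    extracts a measurement value from the image; `results` stands in
    for the inspection result file 146D."""
    threshold, measure = conditions[part_name]  # key: inspection target portion
    value = measure(image)
    defect = value > threshold                  # simplified pass/fail rule (assumption)
    results.append({"work": work_no, "part": part_name,
                    "value": value, "defect": defect})
    return defect
```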
<M. Display flow of the inspection result matrix 25>
FIG. 23 is a flowchart illustrating an example of the flow of the display processing of the above-described inspection result matrix 25. The processing illustrated in FIG. 23 is realized by the processor 110 of the image processing device 20 executing the display program 143 (see FIG. 8). Note that part or all of the processing may be executed by circuit elements or other hardware.
In step S70, the processor 110 determines whether a display operation for the inspection result matrix 25 has been received. The display operation is performed on an operation unit provided in the image processing device 20. The operation unit includes, for example, the keyboard 134 (see FIG. 8), a mouse, and a touch panel. If the processor 110 determines that the display operation for the inspection result matrix 25 has been received (YES in step S70), it switches the control to step S72. Otherwise (NO in step S70), the processor 110 executes the processing of step S70 again.
In step S72, the processor 110 refers to the above-described inspection result file 146D (see FIG. 12) and acquires the work Nos. and the inspection target portions defined in the inspection result file 146D.
In step S74, the processor 110, acting as the above-described display control unit 22 (see FIG. 1), constructs the vertical axis of the inspection result matrix 25 according to the work Nos. acquired in step S72. As a result, the work Nos. are arranged along the vertical axis of the inspection result matrix 25.
In step S76, the processor 110, acting as the display control unit 22, constructs the horizontal axis of the inspection result matrix 25 according to the inspection target portions acquired in step S72. As a result, the inspection target portions are arranged along the horizontal axis of the inspection result matrix 25.
In step S78, the processor 110, acting as the display control unit 22, places a blank cell on the inspection result matrix 25 for each combination of the work Nos. and the inspection target portions acquired in step S72. Each cell is associated with a work No. and an inspection target portion.
In step S80, the processor 110, acting as the display control unit 22, refers to the inspection result file 146D and reflects the inspection results in the cells of the inspection result matrix 25. More specifically, the processor 110 acquires the inspection result for each combination of a work No. and an inspection target portion, and reflects the acquired inspection result in the cell corresponding to that combination. Typically, the processor 110 reflects the inspection results in the cells in a manner that distinguishes a "defective" result from a "non-defective" result. As an example, when the inspection result indicates "defective", the processor 110 displays the corresponding cell in a specific color (for example, red). On the other hand, when the inspection result indicates "non-defective", the processor 110 displays the corresponding cell in another color (for example, white or green).
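Steps S72 to S80 (collect the work Nos. and inspection target portions, build the two axes, place blank cells, then color each cell by its result) can be sketched as follows. The record format is an illustrative stand-in for the inspection result file 146D, and the color strings stand in for the actual cell rendering.

```python
def build_result_matrix(records):
    """Sketch of steps S72-S80: arrange work Nos. along the vertical axis
    and inspection target portions along the horizontal axis, then color
    each cell by its inspection result. Cells with no matching record
    remain blank."""
    rows = sorted({r["work"] for r in records})   # vertical axis: work Nos. (S74)
    cols = sorted({r["part"] for r in records})   # horizontal axis: portions (S76)
    cells = {(w, p): "blank" for w in rows for p in cols}  # S78: blank cells
    for r in records:                             # S80: reflect each result
        cells[(r["work"], r["part"])] = "red" if r["defect"] else "green"
    return rows, cols, cells
```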
In step S82, the processor 110 determines whether any cell in the inspection result matrix 25 has been selected. The selection operation is performed on the operation unit provided in the image processing device 20. The operation unit includes, for example, the keyboard 134 (see FIG. 8), a mouse, and a touch panel. If the processor 110 determines that one of the cells in the inspection result matrix 25 has been selected (YES in step S82), it switches the control to step S84. Otherwise (NO in step S82), the processor 110 switches the control to step S90.
Since each cell of the inspection result matrix 25 is associated with a work No. and an inspection target portion, selecting a cell means specifying a combination of a work No. and an inspection target portion.
In step S84, the processor 110, acting as the above-described inspection unit 21 (see FIG. 1) or the display control unit 22, executes processing according to the selected cell. That is, the processor 110 executes processing according to the specified combination of a work No. and an inspection target portion. The details of the processing executed in response to the cell selection operation will be described later.
In step S90, the processor 110 determines whether an operation for closing the inspection result matrix 25 has been received. If the processor 110 determines that the closing operation has been received (YES in step S90), it ends the processing shown in FIG. 23. Otherwise (NO in step S90), the processor 110 returns the control to step S82.
<N. Function for selecting cells in the inspection result matrix>
As shown in steps S82 and S84 of FIG. 23, the image processing device 20 executes processing according to the selected cell when a cell in the inspection result matrix 25 is selected. Examples of the processing that may be executed include the processing of specific examples 1 to 5 described below. These processes are described in order below.
Typically, the image processing device 20 executes at least one of the processes shown in specific examples 1 to 5 below upon receiving a cell selection operation. Alternatively, upon receiving a cell selection operation, the image processing device 20 may display a selection screen for the processes shown in specific examples 1 to 5 below and execute the process selected on that screen.
(N1. Specific example 1)
FIG. 24 is a diagram illustrating specific example 1 of the processing executed in response to a cell selection operation. In the example of FIG. 24, the cell CE1 in the inspection result matrix 25 is selected.
When the cell CE1 is selected, the display control unit 22 of the image processing device 20 displays, on the display device 23, the image used for the inspection of the inspection target portion corresponding to the cell CE1 and the imaging conditions of that image. This allows the user to visually check the image showing the defect and to confirm, for example, the validity of the imaging conditions. The user can also easily confirm which part of the work each inspection target portion shown in the inspection result matrix 25 corresponds to.
More specifically, the image processing device 20 identifies the combination of the work No. and the inspection target portion associated with the selected cell CE1. Next, the image processing device 20 acquires the corresponding image file from the above-described image file group 146B (see FIG. 11), using the identified combination of the work No. and the inspection target portion as a key. Next, the image processing device 20 acquires the corresponding imaging conditions from the above-described imaging condition file 146A (see FIG. 11), using the identified inspection target portion as a key. The display control unit 22 displays the acquired image file and imaging conditions on the pop-up screen PU1 associated with the cell CE1. The displayed imaging conditions can be changed to arbitrary values.
In the example of FIG. 24, both the captured image and the imaging conditions are displayed in response to the cell selection operation. However, the display control unit 22 may display only one of the captured image and the imaging conditions in response to the cell selection operation.
(N2. Specific example 2)
FIG. 25 is a diagram illustrating specific example 2 of the processing executed in response to a cell selection operation. In the example of FIG. 25, the cell group CE2 in the inspection result matrix 25 is selected.
When the cell group CE2 is selected, the display control unit 22 of the image processing device 20 displays, on the display device 23, a three-dimensional model ML representing the shape of the work to be inspected, and shows the inspection results of the inspection target portions corresponding to the cell group CE2 on the corresponding portions of the three-dimensional model ML. This allows the user to easily confirm which part of the work has a defect.
More specifically, when the cell group CE2 is selected, the display control unit 22 of the image processing device 20 acquires the three-dimensional model ML of the work to be inspected from the above-described work information file 146E (see FIG. 11), and displays the three-dimensional model ML on the pop-up screen PU2 associated with the cell group CE2.
Next, the image processing device 20 identifies the combination of a work No. and an inspection target portion associated with each selected cell. The image processing device 20 then acquires the inspection result corresponding to each identified combination from the above-described inspection result file 146D (see FIG. 12), and shows the inspection result of each inspection target portion at the corresponding location on the three-dimensional model ML. As described above with reference to FIGS. 13 to 18, the inspection target positions Bi are set on the three-dimensional model ML, so the relationship between the three-dimensional model ML and the inspection target positions Bi is known. Therefore, based on this known information, the display control unit 22 of the image processing device 20 can reflect the inspection result of each inspection target portion on the three-dimensional model ML.
In the example of FIG. 25, the inspection target portions A3 to A5 corresponding to the selected cell group CE2 are reflected on the three-dimensional model ML. At this time, the display control unit 22 shows, on the three-dimensional model ML, an inspection target portion indicating "defective" in a display mode different from that of a portion indicating "non-defective". An inspection target portion indicating "defective" is shown in a specific color (for example, red) on the three-dimensional model ML, and an inspection target portion indicating "non-defective" is shown in another color (for example, green). Alternatively, an inspection target portion indicating "defective" may be shown blinking.
Preferably, the display control unit 22 further shows, on the three-dimensional model ML, an inspection target region 75A including the inspection target portion A3 and an inspection target region 75B including the inspection target portions A4 and A5. The inspection target regions are as described with reference to FIG. 15, so their description is not repeated. The inspection target region 75B, which includes an inspection target portion indicating "defective", is displayed in a mode different from that of the inspection target region 75A, which includes no such portion. As an example, the inspection target region 75B may be shown in a specific color (for example, red) or may be shown blinking.
The display control unit 22 also displays, on the pop-up screen PU2, a tool button 71 for rotating the three-dimensional model ML about the vertical axis and a tool button 72 for rotating the three-dimensional model ML about the horizontal axis. By operating the tool buttons 71 and 72, the user can rotate the three-dimensional model ML as appropriate.
(N3. Specific example 3)
FIG. 26 is a diagram illustrating specific example 3 of the processing executed in response to a cell selection operation. In the example of FIG. 26, the cell group CE3 in the inspection result matrix 25 is selected.
When the cell group CE3 is selected, the display control unit 22 of the image processing device 20 shows, on the display device 23, statistical results of the inspection results corresponding to the cell group CE3. This allows the user to easily analyze the inspection results of arbitrary inspection target portions.
More specifically, the image processing device 20 identifies the combination of a work No. and an inspection target portion associated with each cell of the selected cell group CE3. Next, the image processing device 20 acquires the inspection results and the measurement values corresponding to the identified combinations from the above-described inspection result file 146D. Next, the image processing device 20 executes predetermined statistical processing on the acquired measurement values.
As an example, the image processing device 20 generates a frequency distribution (that is, a histogram) by executing the predetermined statistical processing. The horizontal axis of the histogram represents the bins of the measurement values, and the vertical axis of the histogram represents the frequency of the measurement values included in each bin. The display control unit 22 of the image processing device 20 displays the generated histogram on the pop-up screen PU3 associated with the cell group CE3.
 Preferably, the frequency distribution of the measured values is generated for each selected inspection target portion. In the example of FIG. 26, a frequency distribution is shown for each of the selected inspection target portions A2 and A3.
 As another example, the image processing device 20 generates a measured value transition graph by executing predetermined statistical processing. The horizontal axis of the measured value transition graph represents the work No., and the vertical axis represents the measured value. The display control unit 22 of the image processing device 20 displays the generated measured value transition graph on the pop-up screen PU3 associated with the cell group CE3.
 Preferably, the measured value transition graph is generated for each inspection target portion corresponding to the selected cell group CE3. In the example of FIG. 26, a measured value transition graph is shown for each of the selected inspection target portions A2 and A3.
 In the example of FIG. 26, a case has been described in which two statistical results, a histogram and a measured value transition graph, are displayed in response to the cell group selection operation; however, the display control unit 22 may display at least one statistical result in response to the cell group selection operation.
  (N4. Specific example 4)
 FIG. 27 is a diagram illustrating specific example 4 of the processing performed in response to the cell selection operation. In the example of FIG. 27, the cell group CE4 in the inspection result matrix 25 is selected.
 Based on the selection of the cell group CE4, the inspection unit 21 of the image processing device 20 re-inspects the inspection target portions corresponding to the cell group CE4 on the basis of the images used for the inspection of those portions. This allows the user to easily re-inspect any desired inspection target portion.
 More specifically, the user resets the inspection conditions and then selects the cell group CE4 in the inspection result matrix 25. The inspection unit 21 specifies the combination of work No. and inspection target portion associated with each cell of the selected cell group CE4. Next, the inspection unit 21 acquires the image data corresponding to each of the specified combinations of work No. and inspection target portion from the above-described image file group 146B. Next, the inspection unit 21 acquires the inspection conditions corresponding to each of the specified inspection target portions from the above-described inspection condition file 146C. Next, the inspection unit 21 executes image processing on each piece of the acquired image data in accordance with the corresponding inspection conditions. Next, the display control unit 22 reflects the results of the re-inspection in the inspection result matrix 25.
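The re-inspection sequence described above may be sketched as follows. The dictionaries and the `run_inspection` callback are hypothetical stand-ins for the image file group 146B, the inspection condition file 146C, and the image processing performed by the inspection unit 21; they are not part of the disclosed implementation:

```python
def reinspect(selected_cells, image_files, inspection_conditions, run_inspection):
    """For each (work No., inspection target portion) pair associated with
    the selected cells, fetch the stored image and the (re-set) inspection
    condition, and re-run the inspection."""
    results = {}
    for work_no, part in selected_cells:
        image = image_files[(work_no, part)]      # stand-in for image file group 146B
        condition = inspection_conditions[part]   # stand-in for condition file 146C
        results[(work_no, part)] = run_inspection(image, condition)
    return results  # the caller reflects these results in the inspection result matrix

# Toy usage: a measured value above the threshold counts as a defect.
run = lambda value, threshold: "defective" if value > threshold else "no defect"
print(reinspect([(1, "A2")], {(1, "A2"): 5.0}, {"A2": 3.0}, run))
# → {(1, 'A2'): 'defective'}
```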
 In the example of FIG. 27, a case has been described in which a plurality of cells are selected in the inspection result matrix 25; however, in this example, the number of selected cells may be one or more.
  (N5. Specific example 5)
 FIG. 28 is a diagram illustrating specific example 5 of the processing performed in response to the cell selection operation. In the example of FIG. 28, the cell group CE5 in the inspection result matrix 25 is selected.
 Based on the selection of the cell group CE5, the display control unit 22 of the image processing device 20 displays the three-dimensional model ML of the work to be inspected on the display device 23 and represents the imaging conditions of the inspection target portions corresponding to the cell group CE5 at the corresponding portions on the three-dimensional model ML. This allows the user to easily check the imaging conditions of the selected portions.
 More specifically, based on the selection of the cell group CE5, the display control unit 22 of the image processing device 20 acquires the three-dimensional model ML of the work to be inspected from the above-described work information file 146E (see FIG. 11) and displays the three-dimensional model ML on the pop-up screen PU5 associated with the cell group CE5.
 Next, the image processing device 20 specifies the combination of work No. and inspection target portion associated with each selected cell. Next, the image processing device 20 acquires the imaging conditions corresponding to each of the specified combinations of work No. and inspection target portion from the above-described imaging condition file 146A (see FIG. 11).
 Next, the display control unit 22 of the image processing device 20 determines one of the work Nos. corresponding to the cell group CE5 (for example, the smallest work No.) and represents each imaging condition corresponding to the determined work No. at the corresponding location on the three-dimensional model ML. A back button 81 and a forward button 82 are displayed on the pop-up screen PU5, and by pressing the back button 81 or the forward button 82, the user can switch the imaging conditions to be displayed in order of work No.
 The displayed imaging conditions include the imaging position of the imaging device 10 with respect to the three-dimensional model ML. As described above with reference to FIG. 19, the imaging position Ci of the imaging device 10 with respect to the three-dimensional model ML is determined by the setting processing performed by the setting device 60, and is therefore known. Accordingly, based on this known information, the display control unit 22 of the image processing device 20 can display a schematic representation of the imaging device 10 at the imaging position Ci. In the example of FIG. 28, schematic representations of the imaging device 10 are displayed at the imaging positions C5 and C7.
 As another example, the displayed imaging conditions include the inspection target portions on the work. As described above with reference to FIGS. 13 to 18, the inspection target position Bi is set with respect to the three-dimensional model ML, so the relationship between the three-dimensional model ML and the inspection target position Bi is known. Accordingly, based on this known information, the display control unit 22 of the image processing device 20 can reflect the inspection result of each inspection target portion on the three-dimensional model ML. In the example of FIG. 28, the inspection target portions A5 and B1 are represented on the three-dimensional model ML.
 In the above example, a case has been described in which the imaging position and the inspection target portion are displayed as the imaging conditions; however, the displayed imaging conditions are not limited to these. For example, the optical conditions of the imaging device 10 at the time of imaging (such as the imaging field of view) and the illumination conditions at the time of imaging may be displayed.
 <O. Modified Example of Inspection Result Matrix 25>
 FIG. 29 is a diagram showing a modification of the inspection result matrix 25 shown in FIG. 3. The rows of the inspection result matrix 25 shown in FIG. 3 are grouped by the lot number of the works, whereas the rows of the inspection result matrix 25 shown in FIG. 29 are grouped by the type of the works.
 Between works of different types, some inspection target portions may be common and others may be unique to one type. Therefore, some ingenuity is required when the inspection results of works of different types are represented in a single inspection result matrix 25.
 As an example, assume that work type α has unique inspection target portions "A1" and "A3" that work type β does not have. In this case, as shown in FIG. 29, the inspection results are reflected in the cells corresponding to the inspection target portions "A1" and "A3" of work type α, whereas the cells corresponding to the inspection target portions "A1" and "A3" of work type β are left blank and displayed as "not applicable".
 Also assume that work type β has unique inspection target portions "A5" and "B3" that work type α does not have. In this case, as shown in FIG. 29, the inspection results are reflected in the cells corresponding to the inspection target portions "A5" and "B3" of work type β, whereas the cells corresponding to the inspection target portions "A5" and "B3" of work type α are left blank and displayed as "not applicable".
 Further, assume that work types α and β have the common inspection target portions "A2", "A4", "B1", and "B2". In this case, as shown in FIG. 29, the inspection results are reflected both in the cells corresponding to the inspection target portions "A2", "A4", "B1", and "B2" of work type α and in the cells corresponding to the inspection target portions "A2", "A4", "B1", and "B2" of work type β.
 The inspection target portions for each work type are defined, for example, in a classification rule 134C shown in FIG. 30. FIG. 30 is a diagram illustrating an example of the data structure of the classification rule 134C.
 As shown in FIG. 30, the inspection target portions of the work W are hierarchically associated in the classification rule 134C. In the example of FIG. 30, "A1" to "A4" are associated with the inspection target portion "A" of work type α. "B1" and "B2" are associated with the inspection target portion "B" of work type α. "A2", "A4", and "A5" are associated with the inspection target portion "A" of work type β. "B1" to "B3" are associated with the inspection target portion "B" of work type β.
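One possible in-memory representation of the hierarchical association in the classification rule 134C is the nested mapping below. The dictionary layout and names are illustrative assumptions; the actual file format of the classification rule is not described here:

```python
# Classification rule 134C (sketch): inspection target portions per work
# type, associated hierarchically under the upper-level portions "A" and "B".
CLASSIFICATION_RULE = {
    "alpha": {"A": ["A1", "A2", "A3", "A4"], "B": ["B1", "B2"]},
    "beta":  {"A": ["A2", "A4", "A5"],       "B": ["B1", "B2", "B3"]},
}

def parts_of(work_type):
    """Flatten the hierarchy into the list of inspection target portions."""
    return [p for group in CLASSIFICATION_RULE[work_type].values() for p in group]

# Portions common to both types receive inspection results in both row
# groups; portions unique to one type are "not applicable" for the other.
common = sorted(set(parts_of("alpha")) & set(parts_of("beta")))
print(common)  # → ['A2', 'A4', 'B1', 'B2']
```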
 Referring again to FIG. 29, a button is assigned to each group in the vertical-axis direction and the horizontal-axis direction of the inspection result matrix 25. In the example of FIG. 29, an expand/aggregate button BH1 is assigned to the group GH1, an expand/aggregate button BH2 is assigned to the group GH2, an expand/aggregate button BV1 is assigned to the group GV1, and an expand/aggregate button BV2 is assigned to the group GV2.
 FIG. 31 is a diagram showing a screen transition of the inspection result matrix 25 when the expand/aggregate button BV2 is pressed. The expand/aggregate button BV2 alternately receives an aggregation instruction and an expansion instruction for the cells of the group GV2 each time it is pressed. That is, when the expand/aggregate button BV2 is pressed while the cells of the group GV2 are expanded, the display control unit 22 of the image processing device 20 aggregates the cells of the group GV2 into one column. On the other hand, when the expand/aggregate button BV2 is pressed while the cells of the group GV2 are aggregated, the display control unit 22 returns the display of the cells of the group GV2 to the state before aggregation.
 When the inspection results "defective", "no defect", and "not applicable" are included in the cells to be aggregated, the display control unit 22 performs the aggregation processing giving the "defective" display priority over the "no defect" and "not applicable" displays. Similarly, when the inspection results "no defect" and "not applicable" are included in the cells to be aggregated, the display control unit 22 performs the aggregation processing giving the "no defect" display priority over the "not applicable" display.
 As an example, the broken line AR10 encloses one "defective" cell and two "not applicable" cells. By the aggregation processing, the three cells within the broken line AR10 are aggregated into the one cell shown within the broken line AR12. At this time, since the cells before aggregation include a "defective" cell, the display control unit 22 represents the aggregated inspection result as "defective".
 As another example, the broken line AR11 encloses two "no defect" cells and one "not applicable" cell. By the aggregation processing, the three cells within the broken line AR11 are aggregated into the one cell shown within the broken line AR13. At this time, since the cells before aggregation include no "defective" cell but include "no defect" cells, the display control unit 22 represents the aggregated inspection result as "no defect".
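The priority rule used in the aggregation processing above can be sketched as a small helper (the label strings and function name are illustrative):

```python
# Display priority for aggregation: "defective" takes precedence over
# "no defect", which takes precedence over "not applicable".
PRIORITY = {"defective": 2, "no defect": 1, "not applicable": 0}

def aggregate(cells):
    """Return the single display value for a group of aggregated cells."""
    return max(cells, key=PRIORITY.__getitem__)

# The two examples above (broken lines AR10 and AR11):
print(aggregate(["defective", "not applicable", "not applicable"]))  # → defective
print(aggregate(["no defect", "no defect", "not applicable"]))       # → no defect
```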
 FIG. 32 is a diagram showing a screen transition of the inspection result matrix 25 when the expand/aggregate button BH2 is pressed. The expand/aggregate button BH2 alternately receives an aggregation instruction and an expansion instruction for the cells of the group GH2 each time it is pressed. That is, when the expand/aggregate button BH2 is pressed while the cells of the group GH2 are expanded, the display control unit 22 aggregates the cells of the group GH2 into one row. On the other hand, when the expand/aggregate button BH2 is pressed while the cells of the group GH2 are aggregated, the display control unit 22 returns the display of the cells of the group GH2 to the state before aggregation.
 The broken line AR14 encloses ten "not applicable" cells. By the aggregation processing of the cells of the group GH2, the ten cells within the broken line AR14 are aggregated into the one cell shown within the broken line AR15. At this time, since the cells before aggregation include neither "defective" nor "no defect" cells, the display control unit 22 represents the aggregated inspection result as "not applicable".
 <P. Comparison function between inspection result matrix and expected value matrix>
 The image processing device 20 displays, on the display device 23, the result of comparing the inspection result matrix 25 with an expected value matrix that indicates the true correct value (hereinafter also referred to as the "expected value") for each inspection result included in the inspection result matrix 25. This allows the user to easily determine whether each inspection result shown in the inspection result matrix 25 is as expected. Such comparison processing becomes more useful as the number of inspection results shown in the inspection result matrix 25 increases.
 FIG. 33 is a diagram schematically showing the comparison processing between the inspection result matrix 25 and the expected value matrix 27.
 In the cells of the expected value matrix 27, expected values are defined for at least some of the inspection results included in the inspection result matrix 25. The expected value of each cell of the expected value matrix 27 is set, for example, by user input. The expected values that can be input are, for example, "defective", "no defect", and "invalid". The expected value matrix 27 set by the user is stored in the image processing device 20 as the above-described expected value file 146H (see FIG. 11).
 The image processing device 20 compares the cells of the inspection result matrix 25 with the cells of the expected value matrix 27 that are in the same row and the same column.
 More specifically, when the inspection result in the inspection result matrix 25 is "no defect" and the expected value in the expected value matrix 27 is "no defect", "correct answer OK" is output as the comparison result. That is, "correct answer OK" means that the inspection result is as expected.
 When the inspection result in the inspection result matrix 25 is "defective" and the expected value in the expected value matrix 27 is "defective", "correct answer NG" is output as the comparison result. That is, "correct answer NG" means that the inspection result is as expected.
 When the inspection result in the inspection result matrix 25 is "no defect" and the expected value in the expected value matrix 27 is "defective", "missed" is output as the comparison result. "Missed" means that a defect that should have been detected was missed. That is, "missed" indicates that the inspection result is not as expected. One cause of a "missed" comparison result is that the threshold for determining defective/non-defective is too loose.
 When the inspection result in the inspection result matrix 25 is "defective" and the expected value in the expected value matrix 27 is "no defect", "overdetected" is output as the comparison result. "Overdetected" means that a normal portion was detected as a defect. That is, "overdetected" indicates that the inspection result is not as expected. One cause of an "overdetected" comparison result is that the threshold for determining defective/non-defective is too strict.
 When at least one of the inspection result in the inspection result matrix 25 and the expected value in the expected value matrix 27 is "invalid", "invalid" is output as the comparison result. "Invalid" means that at least one of the inspection result and the expected value does not exist.
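The per-cell comparison rules above can be summarized in a single function (a sketch; the function name is hypothetical, and the label strings follow the translation used in this description):

```python
def compare_cell(result, expected):
    """Classify one cell of the comparison result matrix 29 from the
    inspection result and the expected value in the same row and column."""
    if result == "invalid" or expected == "invalid":
        return "invalid"  # inspection result or expected value does not exist
    if result == expected:
        # Result matches the expectation: "correct answer OK" for no defect,
        # "correct answer NG" for defective.
        return "correct answer OK" if result == "no defect" else "correct answer NG"
    # Result differs from the expectation: a missed defect or an over-detection.
    return "missed" if result == "no defect" else "overdetected"

print(compare_cell("no defect", "defective"))  # → missed
print(compare_cell("defective", "no defect"))  # → overdetected
```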
 A comparison result matrix 29 is output as the result of the comparison between the inspection result matrix 25 and the expected value matrix 27. The display control unit 22 displays the comparison results "correct answer OK", "correct answer NG", "missed", "overdetected", and "invalid" in a distinguishable manner. As an example, these comparison results may be distinguished by color or by type of hatching.
 Note that "correct answer OK" and "correct answer NG", which are comparison results matching the expectation, may be displayed in the same color (for example, white).
 Further, as with the inspection result matrix 25, a cell aggregation/expansion function may also be implemented for the comparison result matrix 29. As an example, when even one "missed" or "overdetected" result is included in the cells to be aggregated, the display control unit 22 sets the aggregated cell to "missed" or "overdetected". When the cells to be aggregated include both "missed" and "overdetected" results, the display control unit 22 represents the aggregated cell in a display mode indicating that both "missed" and "overdetected" are included.
 <Q. Expectation matrix creation support function>
 FIG. 34 is a diagram illustrating an example of the process of creating the expected value matrix 27. The image processing device 20 has a function of supporting the creation of the expected value matrix 27.
 More specifically, the user selects cells of the inspection result matrix 25. The cell selection operation is performed on an operation unit provided in the image processing device 20. The operation unit includes, for example, a keyboard 134 (see FIG. 8), a mouse, and a touch panel.
 As an example, suppose that the cell group CE10 is selected by the user. Thereafter, the user copies the cell group CE10 of the inspection result matrix 25 into the expected value matrix 27 being edited. This eliminates the need for the user to input each cell of the expected value matrix 27 one by one, reducing the effort of creating the expected value matrix 27.
 The expected value matrix 27 created by the user is stored in the image processing device 20 as the above-described expected value file 146H (see FIG. 11).
 <R. Flow of 3D display of inspection results>
 The image processing device 20 has a function of displaying inspection results three-dimensionally by reflecting the inspection results of the work on the three-dimensional model ML. By checking the inspection results on the three-dimensional model ML, the user can easily identify the locations where defects have occurred.
 Hereinafter, the three-dimensional display processing of the inspection results will be described with reference to FIG. 35. FIG. 35 is a flowchart illustrating an example of the flow of the three-dimensional display processing of the inspection results. The processing illustrated in FIG. 35 is realized by the processor 110 of the image processing device 20 executing the display program 143 (see FIG. 8). Part or all of the processing may be executed by circuit elements or other hardware.
 In step S90, the processor 110 determines whether an operation to execute the three-dimensional display processing of the inspection results has been received. The operation is performed on an operation unit provided in the image processing device 20. The operation unit includes, for example, a keyboard 134 (see FIG. 8), a mouse, and a touch panel. When determining that the operation to execute the three-dimensional display processing of the inspection results has been received (YES in step S90), the processor 110 switches the control to step S92. Otherwise (NO in step S90), the processor 110 executes the processing of step S90 again.
 In step S92, the processor 110, functioning as the above-described display control unit 22 (see FIG. 1), acquires the three-dimensional model ML of the work from the above-described work information file 146E (see FIG. 11) and displays the acquired three-dimensional model ML on the display device 23.
 In step S100, the processor 110 determines whether an instruction to update the three-dimensional display of the inspection results has been received. The update instruction is issued, for example, based on a new inspection result being obtained from the inspection unit 21. Alternatively, the update instruction is issued based on an inspection result indicating a defect being detected by the inspection unit 21. Alternatively, the update instruction is issued based on a user operation. When determining that the instruction to update the three-dimensional display of the inspection results has been received (YES in step S100), the processor 110 switches the control to step S102. Otherwise (NO in step S100), the processor 110 switches the control to step S120.
 In step S102, the processor 110 refers to the above-described inspection result file 146D (see FIG. 12) and acquires the inspection result for each inspection target portion of the work No. of the work to be inspected.
 In step S104, the processor 110, as the above-described display control unit 22, represents each inspection result acquired in step S102 at the corresponding location on the three-dimensional model ML displayed in step S92. As described above with reference to FIGS. 13 to 18, the inspection target position Bi is set with respect to the three-dimensional model ML, so the relationship between the three-dimensional model ML and the inspection target position Bi is known. Accordingly, based on this known information, the display control unit 22 of the image processing device 20 can reflect each inspection result acquired in step S102 on the three-dimensional model ML.
 Typically, when representing the inspection result of each inspection target portion at the corresponding portion on the three-dimensional model ML, the processor 110 displays a portion indicating a defect in a display mode different from the other portions. As an example, a portion indicating a defect may be displayed in a specific color (for example, red) or may be displayed blinking. This makes it easier for the user to identify the portions indicating defects.
 In step S110, the processor 110 determines whether any of the inspection target portions shown on the three-dimensional model has been selected. The selection operation is performed on an operation unit provided in the image processing device 20. The operation unit includes, for example, a keyboard 134 (see FIG. 8), a mouse, and a touch panel. When determining that one of the inspection target portions represented on the three-dimensional model has been selected (YES in step S110), the processor 110 switches the control to step S112. Otherwise (NO in step S110), the processor 110 switches the control to step S120.
 ステップS112において、プロセッサ110は、上述の検査部21または表示制御部22として、ステップS110で選択された検査対象部分に応じた処理を実行する。選択された検査対象部分に応じて実行される処理の詳細については後述する。 In step S112, the processor 110 executes a process corresponding to the inspection target portion selected in step S110 as the inspection unit 21 or the display control unit 22 described above. Details of the processing executed in accordance with the selected inspection target portion will be described later.
 ステップS120において、プロセッサ110は、検査結果の3次元表示画面を閉じる操作を受け付けたか否かを判断する。プロセッサ110は、検査結果の3次元表示画面を閉じる操作を受け付けたと判断した場合(ステップS120においてYES)、図35に示される処理を終了する。そうでない場合には(ステップS120においてNO)、プロセッサ110は、制御をステップS100に戻す。 In step S120, the processor 110 determines whether or not an operation to close the three-dimensional display screen of the inspection result has been received. When determining that an operation of closing the three-dimensional display screen of the inspection result has been received (YES in step S120), processor 110 ends the process illustrated in FIG. Otherwise (NO in step S120), processor 110 returns the control to step S100.
 以上のように、ステップS100において検査結果の3次元表示の更新指示が発せられた時点で検査結果の3次元表示が更新される。上述のように、更新指示は、たとえば、欠陥を示す検査結果が検査部21によって検出されたことに基づいて発せられる。すなわち、プロセッサ110は、順次検査されるワークに欠陥が検出された時点で、当該ワークについての各検査対象部分の検査結果で3次元モデルMLに表わされている検査結果を更新する。その結果、欠陥を示す検査結果が検出されない間は、3次元モデルML上に表示される検査結果が更新されない。これにより、常に、欠陥を示す最新の検査結果が3次元モデルML上に表示される。そのため、ユーザは、欠陥を示す検査結果を見逃しにくくなる。 As described above, the three-dimensional display of the inspection result is updated when the instruction to update the three-dimensional display of the inspection result is issued in step S100. As described above, the update instruction is issued, for example, based on the inspection result indicating the defect being detected by the inspection unit 21. That is, when a defect is detected in the work to be sequentially inspected, the processor 110 updates the inspection result represented by the three-dimensional model ML with the inspection result of each inspection target portion of the work. As a result, while no inspection result indicating a defect is detected, the inspection result displayed on the three-dimensional model ML is not updated. Thus, the latest inspection result indicating the defect is always displayed on the three-dimensional model ML. Therefore, it is difficult for the user to overlook the inspection result indicating the defect.
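The update-on-defect behavior of steps S100 to S120 can be summarized in a short sketch: the 3D display is refreshed only when the newly inspected workpiece contains at least one defect, so the latest defective result stays visible. This is a minimal illustration only; the function names, the dictionary standing in for the 3D model view, and the string verdicts are assumptions, not the actual implementation on processor 110.

```python
# Minimal sketch of the update-on-defect display logic (steps S100-S120).
# `model_view` stands in for the rendered 3D model ML; names are hypothetical.

def refresh_3d_view(model_view, results):
    """Reflect each inspection result at its known model position (cf. S104)."""
    for part_id, verdict in results.items():
        # Defective parts are drawn in a distinct style, e.g. red vs. green.
        model_view[part_id] = "red" if verdict == "defect" else "green"

def on_workpiece_inspected(model_view, results):
    """Issue a display update only when a defect is detected (cf. S100)."""
    if any(v == "defect" for v in results.values()):
        refresh_3d_view(model_view, results)
        return True   # display updated with this workpiece's results
    return False      # latest defect display is kept as-is

view = {}
on_workpiece_inspected(view, {"A1": "defect", "A2": "ok"})
```

A defect-free workpiece leaves `view` untouched, so the most recent defective workpiece remains on screen until the next defect is found.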
 <S.3次元モデルに対する検査対象部分の選択機能>
 図35のステップS110,S112に示されるように、画像処理装置20は、3次元モデルMLに示される検査対象部分のいずれかが選択されたことに基づいて、選択された検査対象部分に応じた処理を実行する。実行され得る処理としては、たとえば、以下で説明する具体例1,2の処理が挙げられる。以下では、これらの処理について順に説明する。
<S. Selection function of inspection target part for 3D model>
As shown in steps S110 and S112 in FIG. 35, the image processing device 20 executes a process corresponding to the selected inspection target portion when any of the inspection target portions shown on the three-dimensional model ML is selected. Examples of processes that can be executed include the processes of specific examples 1 and 2 described below. These processes will be described in order.
 なお、典型的には、画像処理装置20は、3次元モデルML上の検査対象部分の選択操作を受け付けたことに基づいて、下記の具体例1,2に示す処理の少なくとも1つを実行する。あるいは、画像処理装置20は、3次元モデルML上の検査対象部分の選択操作を受け付けたことに基づいて、下記の具体例1,2に示す処理の選択画面を表示し、当該選択画面において選択された処理を実行してもよい。 Note that, typically, the image processing device 20 executes at least one of the processes shown in the following specific examples 1 and 2 upon receiving an operation of selecting an inspection target portion on the three-dimensional model ML. Alternatively, the image processing device 20 may display a selection screen for the processes shown in the following specific examples 1 and 2 upon receiving the selection operation, and execute the process selected on that screen.
  (S1.具体例1)
 図36は、3次元モデルMLに示される検査対象部分の選択操作に応じて実行される処理の具体例1を示す図である。
(S1. Specific example 1)
FIG. 36 is a diagram illustrating a specific example 1 of a process performed in response to a selection operation of an inspection target portion illustrated in the three-dimensional model ML.
 図36に示されるように、表示装置23には、検査対象のワークの3次元モデルMLと、垂直方向を軸として3次元モデルMLを回転させるためのツールボタン71と、水平方向を軸として3次元モデルMLを回転させるためのツールボタン72とが表示される。ユーザは、ツールボタン71,72を操作することにより、3次元モデルMLを適宜回転させることができる。 As shown in FIG. 36, the display device 23 displays a three-dimensional model ML of the work to be inspected, a tool button 71 for rotating the three-dimensional model ML about the vertical axis, and a tool button 72 for rotating the three-dimensional model ML about the horizontal axis. The user can rotate the three-dimensional model ML as appropriate by operating the tool buttons 71 and 72.
 上述の図13~図18で説明したように、検査対象位置Biは、3次元モデルMLに対して設定されるので、3次元モデルMLと検査対象位置Biとの関係は既知である。そのため、画像処理装置20の表示制御部22は、この既知の情報に基づいて、3次元モデルML上に検査対象部分を表わすことができる。図36の例では、3次元モデルMLにおいて、検査対象部分A1~A4が表わされている。 As described with reference to FIGS. 13 to 18, the inspection target position Bi is set with respect to the three-dimensional model ML, and thus the relationship between the three-dimensional model ML and the inspection target position Bi is known. Therefore, the display control unit 22 of the image processing device 20 can represent the inspection target portion on the three-dimensional model ML based on the known information. In the example of FIG. 36, the inspection target portions A1 to A4 are represented in the three-dimensional model ML.
 また、表示制御部22は、3次元モデルML上の検査対象部分A1~A4において検査結果を表わす。一例として、「欠陥有」を示す部分は、特定の色(たとえば、赤色)で表わされてもよいし、点滅表示で表わされてもよい。「欠陥無」を示す部分は、「欠陥有」を示す部分とは異なる色(たとえば、緑色)で表わされる。 The display control unit 22 also indicates the inspection results in the inspection target portions A1 to A4 on the three-dimensional model ML. As an example, a portion indicating "defect present" may be represented by a specific color (for example, red) or by a blinking display. A portion indicating "no defect" is represented by a color (for example, green) different from that of a portion indicating "defect present".
 図36の例では、検査対象部分A1が「欠陥有」を示しており、検査対象部分A2~A4が「欠陥無」を示している。好ましくは、表示制御部22は、検査対象部分A1内の欠陥を示す欠陥部分A1_1を他の部分よりも強調表示する。 In the example of FIG. 36, the inspection target portion A1 indicates "defect present", and the inspection target portions A2 to A4 indicate "no defect". Preferably, the display control unit 22 highlights the defect portion A1_1, which indicates the defect within the inspection target portion A1, relative to the other portions.
 ユーザは、検査対象部分A1~A4のいずれかを選択することができる。図36の例では、検査対象部分A1が選択されている。画像処理装置20は、検査対象部分A1が選択されたことに基づいて、検査対象部分A1の検査に用いられた画像と、当該画像の撮像条件とを表示装置23に表示する。これにより、ユーザは、欠陥を示す画像を目視で確認したり、撮像条件の妥当性などを確認することができる。 The user can select any of the inspection target portions A1 to A4. In the example of FIG. 36, the inspection target portion A1 is selected. The image processing device 20 displays, on the display device 23, the image used for the inspection of the inspection target portion A1 and the imaging conditions of the image based on the selection of the inspection target portion A1. Thus, the user can visually check the image indicating the defect, and can confirm the validity of the imaging conditions.
 より具体的には、画像処理装置20は、検査対象部分A1が選択されたことに基づいて、検査対象のワークのワークNoと選択された検査対象部分A1との組み合わせをキーとして対応する画像ファイルを上述の画像ファイル群146B(図11参照)から取得する。次に、画像処理装置20は、検査対象部分A1をキーとして対応する撮像条件を上述の撮像条件ファイル146A(図11参照)から取得する。表示制御部22は、取得した画像ファイルと撮像条件とを、選択された検査対象部分A1に対応付けられたポップアップ画面PU6上に表示する。表示された撮像条件は、任意の値に変更可能に構成される。 More specifically, when the inspection target portion A1 is selected, the image processing device 20 acquires the corresponding image file from the above-described image file group 146B (see FIG. 11), using the combination of the work No. of the inspection target work and the selected inspection target portion A1 as a key. Next, the image processing device 20 acquires the corresponding imaging conditions from the above-described imaging condition file 146A (see FIG. 11), using the inspection target portion A1 as a key. The display control unit 22 displays the acquired image file and imaging conditions on the pop-up screen PU6 associated with the selected inspection target portion A1. The displayed imaging conditions can be changed to arbitrary values.
 なお、図36の例では、検査対象部分の選択操作に応じて撮像画像と撮像条件との両方が表示される例について説明を行なったが、表示制御部22は、検査対象部分の選択操作に応じて撮像画像および撮像条件のいずれか一方を表示してもよい。 In the example of FIG. 36, both the captured image and the imaging conditions are displayed in response to the operation of selecting the inspection target portion; however, the display control unit 22 may display only one of the captured image and the imaging conditions in response to the selection operation.
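The keyed lookups of specific example 1 — an image file retrieved by the pair (work No., inspection target portion) and imaging conditions retrieved by the inspection target portion alone — can be sketched with plain dictionaries. This is a minimal illustrative sketch only; the dictionary layouts, keys, and file names below are assumptions and do not reflect the actual formats of the imaging condition file 146A or the image file group 146B.

```python
# Illustrative sketch of the keyed lookups for specific example 1.
# image_files stands in for image file group 146B, keyed by (work No., part);
# imaging_conditions stands in for imaging condition file 146A, keyed by part.

image_files = {("W001", "A1"): "W001_A1.png"}
imaging_conditions = {"A1": {"position": "C1", "exposure_ms": 10}}

def popup_contents(work_no, part):
    """Collect what pop-up screen PU6 would show for the selected part."""
    return {
        "image": image_files[(work_no, part)],   # keyed by work No. + part
        "conditions": imaging_conditions[part],  # keyed by part only
    }

info = popup_contents("W001", "A1")
```

In this sketch, selecting inspection target portion A1 on work W001 yields both the stored image and its imaging conditions in one structure, mirroring the pop-up display described above.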
  (S2.具体例2)
 図37は、3次元モデルMLに示される検査対象部分の選択操作に応じて実行される処理の具体例2を示す図である。
(S2. Specific example 2)
FIG. 37 is a diagram illustrating a specific example 2 of a process performed in response to an operation of selecting an inspection target portion illustrated in the three-dimensional model ML.
 3次元モデルMLは、検査対象部分A1~A4を含む。ユーザは、検査対象部分A1~A4のいずれかを選択することができる。図37の例では、検査対象部分A1が選択されている。画像処理装置20は、検査対象部分A1が選択されたことに基づいて、検査対象のワークのワークNoと選択された検査対象部分A1との組み合わせをキーとして対応する撮像条件を上述の撮像条件ファイル146A(図11参照)から取得する。次に、画像処理装置20の表示制御部22は、取得した撮像条件を3次元モデルMLの対応箇所に表わす。 The three-dimensional model ML includes the inspection target portions A1 to A4. The user can select any of the inspection target portions A1 to A4. In the example of FIG. 37, the inspection target portion A1 is selected. When the inspection target portion A1 is selected, the image processing device 20 acquires the corresponding imaging conditions from the above-described imaging condition file 146A (see FIG. 11), using the combination of the work No. of the inspection target work and the selected inspection target portion A1 as a key. Next, the display control unit 22 of the image processing device 20 displays the acquired imaging conditions at the corresponding locations on the three-dimensional model ML.
 一例として、表示される撮像条件は、3次元モデルMLに対する撮像装置10の撮影位置を含む。上述の図19で説明したように、3次元モデルMLに対する撮像装置10の撮影位置Ciは、設定装置60による設定処理で決定されているので、3次元モデルMLに対する撮像装置10の撮影位置Ciは既知である。そのため、画像処理装置20の表示制御部22は、この既知の情報に基づいて、撮像装置10を表わす模式図10Aを撮影位置Ci上に表示することができる。図37の例では、撮影位置C1において撮像装置10の模式図10Aが表示されている。 As an example, the displayed imaging conditions include the shooting position of the imaging device 10 with respect to the three-dimensional model ML. As described above with reference to FIG. 19, the shooting position Ci of the imaging device 10 with respect to the three-dimensional model ML is determined by the setting process performed by the setting device 60, and is therefore known. Based on this known information, the display control unit 22 of the image processing device 20 can display a schematic diagram 10A representing the imaging device 10 at the shooting position Ci. In the example of FIG. 37, the schematic diagram 10A of the imaging device 10 is displayed at the shooting position C1.
 なお、上述の例では、撮像条件として撮影位置が表示される例について説明を行なったが、表示される撮像条件は、撮影位置に限定されない。たとえば、撮像時における撮像装置10の光学条件、撮像時における照明条件などが表示されてもよい。 In the above example, an example in which the imaging position is displayed as the imaging condition has been described, but the displayed imaging condition is not limited to the imaging position. For example, an optical condition of the imaging device 10 at the time of imaging, an illumination condition at the time of imaging, and the like may be displayed.
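The behavior of specific example 2 — placing a schematic camera at the known shooting position when a part is selected — can likewise be sketched with a simple lookup. The names below (`shooting_positions`, `show_camera_on_model`, the `scene` dictionary standing in for the rendered 3D view) are illustrative assumptions, not part of the disclosed embodiment.

```python
# Sketch of specific example 2: when an inspection target portion is selected,
# look up the known shooting position Ci and place the schematic camera (10A)
# at that position on the model view. Data layout is an assumption.

shooting_positions = {("W001", "A1"): "C1"}  # stands in for condition file 146A

def show_camera_on_model(scene, work_no, part):
    """Place a schematic camera glyph at the known shooting position Ci."""
    pos = shooting_positions[(work_no, part)]
    scene[pos] = "camera_glyph"  # e.g. schematic 10A drawn at position C1
    return pos

scene = {}
selected_pos = show_camera_on_model(scene, "W001", "A1")
```

Because the shooting position is fixed by the setting device beforehand, the display step reduces to this known-key lookup; other imaging conditions (optics, illumination) could be attached to the same key.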
 <T.検査対象部分の展開/集約機能>
 図38は、3次元モデルMLに示される検査対象部分を展開/集約する過程を示す図である。
<T. Deployment / aggregation function of inspection target part>
FIG. 38 is a diagram showing a process of developing / aggregating the inspection target portion shown in the three-dimensional model ML.
 3次元モデルMLに示される検査対象部分は、上述の分類ルール134A(図4参照)において階層的に関連付けられている。上位の階層に規定される検査対象部分は、下位の階層に規定される検査対象部分を包含する関係を有する。上位の階層に規定される検査対象部分に包含されている下位の検査対象部分は、同一グループとみなされる。 The inspection target portions shown in the three-dimensional model ML are hierarchically associated according to the above-described classification rule 134A (see FIG. 4). An inspection target portion defined in an upper hierarchy has a relationship encompassing the inspection target portions defined in the lower hierarchy. Lower inspection target portions encompassed by an inspection target portion defined in an upper hierarchy are regarded as the same group.
 図38の例では、3次元モデルML上において、検査対象部分A~Cが示されている。一例として、上位の検査対象部分Aには、下位の検査対象部分A1~A3が関連付けられているとする。上位の検査対象部分Aに包含されている検査対象部分A1~A3は、同一グループとみなされる。 In the example of FIG. 38, the inspection target portions A to C are shown on the three-dimensional model ML. As an example, it is assumed that the lower inspection target portions A1 to A3 are associated with the upper inspection target portion A. The inspection target portions A1 to A3 included in the upper inspection target portion A are regarded as the same group.
 上位の検査対象部分A1には、下位の検査対象部分A1_1~A1_4が関連付けられているとする。上位の検査対象部分A1に包含されている検査対象部分A1_1~A1_4は、同一グループとみなされる。 It is assumed that the lower inspection target portions A1_1 to A1_4 are associated with the upper inspection target portion A1. The inspection target portions A1_1 to A1_4 included in the upper inspection target portion A1 are regarded as the same group.
 3次元モデルML上の検査対象部分A~Cには検査結果が表わされる。検査対象部分A~Cの検査結果は、上述の検査結果ファイル146D(図12参照)から取得される。一例として、検査対象部分Aは「欠陥有」を示し、検査対象部分B,Cは「欠陥無」を示す。 Inspection results are shown in inspection target portions A to C on the three-dimensional model ML. The inspection results of the inspection target parts A to C are obtained from the inspection result file 146D (see FIG. 12). As an example, the inspection target portion A indicates “with defect”, and the inspection target portions B and C indicate “without defect”.
 検査対象部分Aには、展開/集約ボタンBT1が割り付けられている。検査対象部分Bには、展開/集約ボタンBT2が割り付けられている。検査対象部分Cには、展開/集約ボタンBT3が割り付けられている。「+」は展開ボタンを示し、「-」は集約ボタンを示す。 The expand/aggregate button BT1 is assigned to the inspection target portion A. The expand/aggregate button BT2 is assigned to the inspection target portion B. The expand/aggregate button BT3 is assigned to the inspection target portion C. "+" indicates an expand button, and "-" indicates an aggregate button.
 表示制御部22は、グルーピングされている検査対象部分に対する集約指示を受け付けた場合に、当該グルーピングされている検査対象部分の検査結果を集約し、当該集約後の検査結果を、当該グルーピングされている検査対象部分に対応する3次元モデルML上の各部分に表わす。また、表示制御部22は、集約された検査結果に対する展開指示を受け付けた場合に、当該集約された検査結果の表示を集約前に戻す。 When the display control unit 22 receives an aggregation instruction for grouped inspection target portions, it aggregates the inspection results of the grouped inspection target portions and displays the aggregated inspection result at each portion of the three-dimensional model ML corresponding to the grouped inspection target portions. When the display control unit 22 receives an expansion instruction for aggregated inspection results, it returns the display of the aggregated inspection results to the state before aggregation.
 たとえば、展開/集約ボタンBT1が押下された場合、表示制御部22は、検査対象部分Aを下位の検査対象部分A1~A3に展開する。次に、表示制御部22は、検査対象部分A1~A3の検査結果を検査対象部分A1~A3に反映する。検査対象部分A1~A3の検査結果は、上述の検査結果ファイル146D(図12参照)から取得される。一例として、検査対象部分A1は「欠陥有」を示し、検査対象部分A2,A3は「欠陥無」を示す。 For example, when the expand/aggregate button BT1 is pressed, the display control unit 22 expands the inspection target portion A into the lower inspection target portions A1 to A3. Next, the display control unit 22 reflects the inspection results of the inspection target portions A1 to A3 on the inspection target portions A1 to A3. The inspection results of the inspection target portions A1 to A3 are obtained from the above-described inspection result file 146D (see FIG. 12). As an example, the inspection target portion A1 indicates "defect present", and the inspection target portions A2 and A3 indicate "no defect".
 次に、表示制御部22は、検査対象部分A1に展開/集約ボタンBT1_1を割り付ける。同様に、表示制御部22は、検査対象部分A2に展開/集約ボタンBT1_2を割り付ける。同様に、表示制御部22は、検査対象部分A3に展開/集約ボタンBT1_3を割り付ける。 Next, the display control unit 22 assigns the expand/aggregate button BT1_1 to the inspection target portion A1. Similarly, the display control unit 22 assigns the expand/aggregate button BT1_2 to the inspection target portion A2 and the expand/aggregate button BT1_3 to the inspection target portion A3.
 展開/集約ボタンBT1_1の「+」が押下された場合、表示制御部22は、検査対象部分A1を下位の検査対象部分A1_1~A1_4に展開する。次に、表示制御部22は、検査対象部分A1_1~A1_4の検査結果を検査対象部分A1_1~A1_4に反映する。検査対象部分A1_1~A1_4の検査結果は、上述の検査結果ファイル146D(図12参照)から取得される。一例として、検査対象部分A1_1~A1_3は「欠陥無」を示し、検査対象部分A1_4は「欠陥有」を示す。 When the “+” of the expand / aggregate button BT1_1 is pressed, the display control unit 22 expands the inspection target portion A1 into lower inspection target portions A1_1 to A1_4. Next, the display control unit 22 reflects the inspection results of the inspection target portions A1_1 to A1_4 on the inspection target portions A1_1 to A1_4. The inspection results of the inspection target portions A1_1 to A1_4 are obtained from the inspection result file 146D (see FIG. 12). As an example, the inspection target portions A1_1 to A1_3 indicate "no defect", and the inspection target portion A1_4 indicates "defect".
 展開/集約ボタンBT1_1Aが押下された場合、表示制御部22は、検査対象部分A1_1~A1_4を上位の検査対象部分A1に集約する。このとき、「欠陥有」を示す検査結果が集約対象の検査対象部分の検査結果に1個でも含まれている場合、表示制御部22は、集約後の検査対象部分の検査結果を「欠陥有」とする。一方で、「欠陥有」を示す検査結果が集約対象の検査対象部分の検査結果に1個も含まれていない場合、表示制御部22は、集約後の検査対象部分の検査結果を「欠陥無」とする。検査対象部分A1_4は「欠陥有」を示すので、表示制御部22は、検査対象部分A1_1~A1_4を集約した検査対象部分A1の検査結果を「欠陥有」とする。 When the expand/aggregate button BT1_1A is pressed, the display control unit 22 aggregates the inspection target portions A1_1 to A1_4 into the upper inspection target portion A1. At this time, if even one inspection result indicating "defect present" is included in the inspection results of the inspection target portions to be aggregated, the display control unit 22 sets the aggregated inspection result to "defect present". On the other hand, if no inspection result indicating "defect present" is included in the inspection results of the inspection target portions to be aggregated, the display control unit 22 sets the aggregated inspection result to "no defect". Since the inspection target portion A1_4 indicates "defect present", the display control unit 22 sets the inspection result of the inspection target portion A1, into which the inspection target portions A1_1 to A1_4 are aggregated, to "defect present".
 展開/集約ボタンBT1の「-」が押下された場合、表示制御部22は、検査対象部分A1~A3を検査対象部分Aに集約する。検査対象部分A1は「欠陥有」を示すので、表示制御部22は、検査対象部分A1~A3を集約した検査対象部分Aの検査結果を「欠陥有」とする。 When the "-" of the expand/aggregate button BT1 is pressed, the display control unit 22 aggregates the inspection target portions A1 to A3 into the inspection target portion A. Since the inspection target portion A1 indicates "defect present", the display control unit 22 sets the inspection result of the inspection target portion A, into which the inspection target portions A1 to A3 are aggregated, to "defect present".
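The aggregation rule described above — a collapsed parent portion shows "defect present" if and only if at least one of its (possibly nested) lower portions shows "defect present" — amounts to a recursive OR over the hierarchy. The sketch below illustrates this rule only; the hierarchy layout and names are assumptions, and the actual embodiment reads results from the inspection result file 146D.

```python
# Recursive sketch of the aggregate ("-") rule from FIG. 38: a parent shown in
# aggregated form indicates "defect" iff some descendant leaf indicates "defect".

hierarchy = {
    "A":  ["A1", "A2", "A3"],
    "A1": ["A1_1", "A1_2", "A1_3", "A1_4"],
}
leaf_results = {"A1_1": "ok", "A1_2": "ok", "A1_3": "ok", "A1_4": "defect",
                "A2": "ok", "A3": "ok"}

def aggregated_result(part):
    """Result shown when `part` is displayed in aggregated (collapsed) form."""
    children = hierarchy.get(part)
    if not children:                # leaf: use its own inspection result
        return leaf_results[part]
    if any(aggregated_result(c) == "defect" for c in children):
        return "defect"             # one defective descendant suffices
    return "ok"
```

Under this rule, the defect in A1_4 propagates upward, so both A1 and A as a whole display "defect present" when collapsed, matching the example of FIG. 38.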
 <U.外観検査システムの変形例>
 図39は、変形例に係る外観検査システムを示す図である。図39に示される外観検査システムは、図1に示す外観検査システム1と比較して、PLC50を備えず、画像処理装置20の代わりに画像処理装置20aを備える点で相違する。画像処理装置20aは、上記の画像処理装置20の構成とPLC50の構成との両方を有する。
<U. Modification of Appearance Inspection System>
FIG. 39 is a diagram illustrating a visual inspection system according to a modification. The appearance inspection system shown in FIG. 39 is different from the appearance inspection system 1 shown in FIG. 1 in that it does not include the PLC 50 and includes an image processing device 20a instead of the image processing device 20. The image processing device 20a has both the configuration of the image processing device 20 and the configuration of the PLC 50.
 図40は、ワークWと撮像装置10との間の相対位置を変更する別の形態を示す図である。図40に示されるように、ロボット30は、撮像装置10ではなく、ワークWを移動させてもよい。図40に示す例では、撮像装置10は固定される。このようにワークWを移動させることにより、ワークWと撮像装置10との間の相対位置を変更してもよい。 FIG. 40 is a diagram showing another embodiment in which the relative position between the workpiece W and the imaging device 10 is changed. As shown in FIG. 40, the robot 30 may move the work W instead of the imaging device 10. In the example shown in FIG. 40, the imaging device 10 is fixed. By moving the work W in this manner, the relative position between the work W and the imaging device 10 may be changed.
 図41は、ワークWと撮像装置10との間の相対位置を変更するさらに別の形態を示す図である。図41に示されるように、ワークWは、回転テーブル91の上に載置されてもよい。回転テーブル91は、ロボットコントローラ40の指示に応じて回転する。これにより、ワークWと撮像装置10との間の相対位置を容易に変更することができる。 FIG. 41 is a diagram showing still another mode in which the relative position between the workpiece W and the imaging device 10 is changed. As shown in FIG. 41, the workpiece W may be placed on the turntable 91. The rotary table 91 rotates according to an instruction from the robot controller 40. Thereby, the relative position between the workpiece W and the imaging device 10 can be easily changed.
 なお、ロボット30は、垂直多関節ロボット以外のロボット(たとえば、水平多関節ロボット、直交ロボットなど)であってもよい。 Note that the robot 30 may be a robot other than the vertical articulated robot (for example, a horizontal articulated robot, an orthogonal robot, or the like).
 上記では、撮像視野FOVおよび実効視野FOV2を円形とした説明したが、撮像視野FOVおよび実効視野FOV2の形状は、円形に限定されず、たとえば矩形(長方形、正方形)であってもよい。 In the above description, the imaging field of view FOV and the effective field of view FOV2 are described as being circular. However, the shapes of the imaging field of view FOV and the effective field of view FOV2 are not limited to circles, and may be, for example, rectangular (rectangular, square).
 <V.付記>
 以上のように、本実施形態は以下のような開示を含む。
<V. Appendix>
As described above, this embodiment includes the following disclosure.
 [構成1]
 検査対象物(W)の外観検査を行なう外観検査システム(1)であって、
 表示装置(23)と、
 撮像装置(10)を移動させるためのロボット(30)と、
 前記ロボット(30)が前記撮像装置(10)を移動している間に前記撮像装置(10)が前記検査対象物(W)の複数の検査部分の各々を撮像して前記撮像装置(10)から得られた各画像に基づいて、前記複数の検査部分の各々について欠陥の有無を検査するための検査部(21)と、
 前記複数の検査部分の各々についての前記検査部(21)による検査結果を検査対象物(W)ごとに表わした検査結果マトリクス(25)を前記表示装置(23)に表示するための表示制御部(22)とを備える、外観検査システム。
[Configuration 1]
An appearance inspection system (1) for performing an appearance inspection of an inspection object (W),
A display device (23);
A robot (30) for moving the imaging device (10);
an inspection unit (21) for inspecting each of a plurality of inspection portions of the inspection object (W) for the presence or absence of a defect, based on each image obtained from the imaging device (10) by the imaging device (10) imaging each of the plurality of inspection portions while the robot (30) is moving the imaging device (10); and
a display control unit (22) for displaying, on the display device (23), an inspection result matrix (25) representing, for each inspection object (W), the inspection results of the inspection unit (21) for each of the plurality of inspection portions.
 [構成2]
 前記検査結果マトリクス(25)の各行または各列は、予め定められた分類ルールに従って行単位または列単位でグルーピングされており、
 前記表示制御部(22)は、
  グルーピングされている行群またはグルーピングされている列群に対する集約指示を受け付けた場合に、当該行群または当該列群の検査結果を一行または一列に集約して表示し、
  集約して表示されている検査結果に対する展開指示を受け付けた場合に、当該集約して表示されている検査結果の表示を集約前に戻す、構成1に記載の外観検査システム。
[Configuration 2]
Each row or each column of the inspection result matrix (25) is grouped in row units or column units according to a predetermined classification rule,
The display control unit (22) includes:
When an aggregation instruction is received for a group of rows or a group of columns that are grouped, the inspection results of the group of rows or the group of columns are aggregated and displayed in one row or one column,
The visual inspection system according to Configuration 1, wherein, when a deployment instruction for the inspection results that are collectively displayed is received, the display of the inspection results that are collectively displayed is returned to the state before aggregation.
 [構成3]
 前記表示制御部(22)は、前記検査結果マトリクス(25)に含まれる検査結果の内の、欠陥を示す検査結果を、他の検査結果とは異なる表示態様で表示する、構成1または2に記載の外観検査システム。
[Configuration 3]
The visual inspection system according to Configuration 1 or 2, wherein the display control unit (22) displays an inspection result indicating a defect, among the inspection results included in the inspection result matrix (25), in a display mode different from other inspection results.
 [構成4]
 前記外観検査システム(1)は、前記検査結果マトリクス(25)に含まれる検査結果の内から1つ以上の検査結果を選択する選択操作を受け付けることで、1つ以上の検査対象物(W)と1つ以上の検査部分とを選択することが可能な操作部(134)と、
 前記検査部(21)または前記表示制御部(22)は、前記操作部(134)が前記選択操作を受け付けたことに基づいて、選択された検査対象物(W)と選択された検査部分との少なくとも一方に関する処理を実行する、構成1~3のいずれか1項に記載の外観検査システム。
[Configuration 4]
The visual inspection system (1) further comprises an operation unit (134) capable of selecting one or more inspection objects (W) and one or more inspection portions by receiving a selection operation that selects one or more inspection results from among the inspection results included in the inspection result matrix (25),
The visual inspection system according to any one of Configurations 1 to 3, wherein the inspection unit (21) or the display control unit (22) executes a process related to at least one of the selected inspection object (W) and the selected inspection portion when the operation unit (134) receives the selection operation.
 [構成5]
 前記表示制御部(22)は、前記選択された検査部分の検査に用いられた画像と、当該画像の撮像条件との少なくとも一方を前記表示装置(23)に表示する、構成4に記載の外観検査システム。
[Configuration 5]
The visual inspection system according to Configuration 4, wherein the display control unit (22) displays, on the display device (23), at least one of an image used for inspection of the selected inspection portion and an imaging condition of the image.
 [構成6]
 前記外観検査システム(1)は、前記検査対象物(W)の形状を表わす3次元モデル(ML)を格納するための記憶装置をさらに備え、前記複数の検査部分は、前記3次元モデル(ML)に対して予め設定されており、
 前記表示制御部(22)は、前記3次元モデル(ML)を前記表示装置(23)に表示するとともに、前記選択された検査部分の検査結果を当該3次元モデル(ML)上の対応部分に表わす、構成4または5に記載の外観検査システム。
[Configuration 6]
The visual inspection system (1) further includes a storage device for storing a three-dimensional model (ML) representing the shape of the inspection object (W), and the plurality of inspection portions are set in advance with respect to the three-dimensional model (ML),
The display control unit (22) displays the three-dimensional model (ML) on the display device (23), and displays the inspection result of the selected inspection portion at the corresponding portion on the three-dimensional model (ML). The visual inspection system according to Configuration 4 or 5.
 [構成7]
 前記表示制御部(22)は、前記選択操作で複数の検査結果が選択された場合、当該複数の検査結果の統計結果を前記表示装置(23)に表示する、構成4~6のいずれか1項に記載の外観検査システム。
[Configuration 7]
The visual inspection system according to any one of Configurations 4 to 6, wherein the display control unit (22) displays a statistical result of the plurality of inspection results on the display device (23) when a plurality of inspection results are selected by the selection operation.
 [構成8]
 前記検査部(21)は、前記選択された検査部分の検査に用いられた画像に基づいて、当該選択された検査部分を再検査する、構成4~7のいずれか1項に記載の外観検査システム。
[Configuration 8]
The visual inspection system according to any one of Configurations 4 to 7, wherein the inspection unit (21) re-inspects the selected inspection portion based on the image used for inspection of the selected inspection portion.
 [構成9]
 前記表示制御部(22)は、前記検査結果マトリクス(25)に含まれる各検査結果についての真の正解値を示す期待値マトリクスと、前記検査結果マトリクス(25)との比較結果を前記表示装置(23)に表示する、構成1~8のいずれか1項に記載の外観検査システム。
[Configuration 9]
The visual inspection system according to any one of Configurations 1 to 8, wherein the display control unit (22) displays, on the display device (23), a comparison result between the inspection result matrix (25) and an expected value matrix indicating the true correct value for each inspection result included in the inspection result matrix (25).
 [構成10]
 検査対象物(W)の複数の検査部分を撮像装置(10)が撮像することによって行なわれた外観検査結果の表示方法であって、
 ロボット(30)が前記撮像装置(10)を移動している間に前記撮像装置(10)が前記複数の検査部分の各々を撮像して得られた各画像を取得するステップと、
 前記取得するステップで得られた各画像に基づいて、前記複数の検査部分の各々について欠陥の有無を検査するステップと、
 前記複数の検査部分の各々についての検査結果を検査対象物(W)ごとに表わした検査結果マトリクス(25)を表示装置(23)に表示するステップとを備える、外観検査結果の表示方法。
[Configuration 10]
A method for displaying a result of a visual inspection performed by an imaging device (10) imaging a plurality of inspection portions of an inspection object (W),
Acquiring each image obtained by the imaging device (10) imaging each of the plurality of inspection portions while the robot (30) is moving the imaging device (10);
Inspecting each of the plurality of inspection parts for the presence or absence of a defect based on each image obtained in the obtaining step;
and displaying, on a display device (23), an inspection result matrix (25) representing, for each inspection object (W), the inspection results for each of the plurality of inspection portions.
 [構成11]
 検査対象物(W)の複数の検査部分を撮像装置(10)が撮像することによって行なわれた外観検査結果の表示プログラムであって、
 前記表示プログラムは、コンピュータに、
  ロボット(30)が前記撮像装置(10)を移動している間に前記撮像装置(10)が前記複数の検査部分の各々を撮像して得られた各画像を取得するステップと、
  前記取得するステップで得られた各画像に基づいて、前記複数の検査部分の各々について欠陥の有無を検査するステップと、
  前記複数の検査部分の各々についての検査結果を検査対象物(W)ごとに表わした検査結果マトリクス(25)を表示装置(23)に表示するステップとを実行させる、外観検査結果の表示プログラム。
[Configuration 11]
A display program of an appearance inspection result obtained by imaging a plurality of inspection portions of an inspection object (W) by an imaging device (10),
The display program causes a computer to execute:
Acquiring each image obtained by the imaging device (10) imaging each of the plurality of inspection portions while the robot (30) is moving the imaging device (10);
Inspecting each of the plurality of inspection parts for the presence or absence of a defect based on each image obtained in the obtaining step;
and a step of displaying, on a display device (23), an inspection result matrix (25) representing, for each inspection object (W), the inspection results for each of the plurality of inspection portions.
 今回開示された実施の形態は全ての点で例示であって制限的なものではないと考えられるべきである。本発明の範囲は上記した説明ではなくて請求の範囲によって示され、請求の範囲と均等の意味および範囲内での全ての変更が含まれることが意図される。 The embodiments disclosed herein are to be considered in all respects as illustrative and not restrictive. The scope of the present invention is defined by the terms of the claims, rather than the description above, and is intended to include any modifications within the meaning and scope equivalent to the terms of the claims.
 1 外観検査システム、10 撮像装置、10A 模式図、20,20a 画像処理装置、21 検査部、22 表示制御部、23 表示装置、25 検査結果マトリクス、27 期待値マトリクス、29 比較結果マトリクス、30 ロボット、31 基台、32 アーム、32a 先端アーム、40 ロボットコントローラ、50 PLC、60 設定装置、61a,61b 画面、71,72,76,77 ツールボタン、74 丸印、75,75A,75B 検査対象領域、81 戻るボタン、82 進むボタン、90 ステージ、91 回転テーブル、110,214,362 プロセッサ、112 RAM、114 表示コントローラ、116 システムコントローラ、118 I/Oコントローラ、120,364 記憶装置、122 カメラインターフェイス、124 入力インターフェイス、126 コントローラインターフェイス、128,228,368 通信インターフェイス、130,222 メモリカードインターフェイス、134 キーボード、134A,134B,134C 分類ルール、136,224 メモリカード、142 画像処理プログラム、143 表示プログラム、144 プロジェクトファイル、146A 撮像条件ファイル、146B 画像ファイル群、146C 検査条件ファイル、146D 検査結果ファイル、146E ワーク情報ファイル、146F 生産情報ファイル、146G 分類ルールファイル、146H 期待値ファイル、212 チップセット、216 不揮発性メモリ、218 主メモリ、220 システムクロック、226 内部バス、230 内部バスコントローラ、232 制御回路、234 内部バス制御回路、236 バッファメモリ、238 フィールドバスコントローラ、361 バス、363 メインメモリ、365 設定プログラム、366 ディスプレイ、367 入力デバイス。 1 appearance inspection system, 10 imaging device, 10A schematic diagram, 20, 20a image processing device, 21 inspection unit, 22 display control unit, 23 display device, 25 inspection result matrix, 27 expected value matrix, 29 comparison result matrix, 30 robot, 31 base, 32 arm, 32a tip arm, 40 robot controller, 50 PLC, 60 setting device, 61a, 61b screen, 71, 72, 76, 77 tool button, 74 circle, 75, 75A, 75B inspection target area, 81 back button, 82 forward button, 90 stage, 91 rotary table, 110, 214, 362 processor, 112 RAM, 114 display controller, 116 system controller, 118 I/O controller, 120, 364 storage device, 122 camera interface, 124 input interface, 126 controller interface, 128, 228, 368 communication interface, 130, 222 memory card interface, 134 keyboard, 134A, 134B, 134C classification rule, 136, 224 memory card, 142 image processing program, 143 display program, 144 project file, 146A imaging condition file, 146B image file group, 146C inspection condition file, 146D inspection result file, 146E work information file, 146F production information file, 146G classification rule file, 146H expected value file, 212 chipset, 216 non-volatile memory, 218 main memory, 220 system clock, 226 internal bus, 230 internal bus controller, 232 control circuit, 234 internal bus control circuit, 236 buffer memory, 238 fieldbus controller, 361 bus, 363 main memory, 365 setting program, 366 display, 367 input device.

Claims (11)

  1.  An appearance inspection system that performs an appearance inspection of an inspection object, the system comprising:
     a display device;
     a robot for moving an imaging device;
     an inspection unit for inspecting each of a plurality of inspection portions of the inspection object for the presence or absence of a defect, based on the images obtained from the imaging device as the imaging device images each of the inspection portions while being moved by the robot; and
     a display control unit for displaying, on the display device, an inspection result matrix that represents, for each inspection object, the inspection results produced by the inspection unit for each of the plurality of inspection portions.
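As an informal illustration (not part of the application), the inspection result matrix described in claim 1 — one row per inspection object, one column per inspection portion — might be modeled as follows; all names are hypothetical:

```python
# Hypothetical sketch of the per-object inspection result matrix of claim 1:
# one row per inspection object, one column per inspection portion, each
# cell holding an OK/NG result.
from dataclasses import dataclass, field

OK, NG = "OK", "NG"

@dataclass
class InspectionResultMatrix:
    portions: list                              # column labels: inspection portions
    rows: dict = field(default_factory=dict)    # object id -> list of cell results

    def record(self, object_id, results):
        """Store one inspection object's results (one per inspection portion)."""
        assert len(results) == len(self.portions)
        self.rows[object_id] = list(results)

    def render(self):
        """Render a plain-text table, as a display control unit might."""
        lines = ["object\t" + "\t".join(self.portions)]
        for object_id, results in self.rows.items():
            lines.append(object_id + "\t" + "\t".join(results))
        return "\n".join(lines)

matrix = InspectionResultMatrix(portions=["P1", "P2", "P3"])
matrix.record("W001", [OK, OK, NG])
matrix.record("W002", [OK, OK, OK])
```

A display control unit would then only need to map each cell to a screen position, highlighting NG cells per claim 3.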
  2.  The appearance inspection system according to claim 1, wherein each row or each column of the inspection result matrix is grouped, row by row or column by column, according to a predetermined classification rule, and
     the display control unit
      aggregates the inspection results of a grouped set of rows or a grouped set of columns into a single row or column for display when it receives an aggregation instruction for that set, and
      restores the display of inspection results shown in aggregated form to the state before aggregation when it receives an expansion instruction for those results.
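As an informal illustration of the aggregation and expansion behavior described in claim 2 (the group labels and the any-NG aggregation rule below are assumptions, not taken from the application):

```python
# Hypothetical sketch of claim 2: rows grouped by a classification rule can
# be collapsed into a single summary row and expanded back again.
OK, NG = "OK", "NG"

def aggregate_rows(rows):
    """Collapse a group of result rows into one row: a column is NG if any
    row in the group is NG for that column (an illustrative rule)."""
    return [NG if any(r[i] == NG for r in rows) else OK
            for i in range(len(rows[0]))]

class GroupedMatrix:
    def __init__(self, groups):
        self.groups = groups        # {group label -> list of result rows}
        self.collapsed = set()      # labels currently shown in aggregated form

    def collapse(self, label):      # corresponds to an "aggregation instruction"
        self.collapsed.add(label)

    def expand(self, label):        # corresponds to an "expansion instruction"
        self.collapsed.discard(label)

    def visible_rows(self):
        """Rows as they would currently be displayed."""
        rows = []
        for label, group in self.groups.items():
            if label in self.collapsed:
                rows.append((label, aggregate_rows(group)))
            else:
                rows.extend((label, r) for r in group)
        return rows

grouped = GroupedMatrix({"lotA": [[OK, OK], [OK, NG]]})
grouped.collapse("lotA")   # lotA now displays as one summary row
```

Expanding with `grouped.expand("lotA")` restores the two original rows, matching the "return to the state before aggregation" behavior.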
  3.  The appearance inspection system according to claim 1 or 2, wherein the display control unit displays the inspection results that indicate a defect, among the inspection results included in the inspection result matrix, in a display mode different from that of the other inspection results.
  4.  The appearance inspection system according to any one of claims 1 to 3, further comprising an operation unit that makes it possible to select one or more inspection objects and one or more inspection portions by accepting a selection operation that selects one or more inspection results from among the inspection results included in the inspection result matrix,
     wherein the inspection unit or the display control unit executes processing relating to at least one of the selected inspection object and the selected inspection portion in response to the operation unit accepting the selection operation.
  5.  The appearance inspection system according to claim 4, wherein the display control unit displays, on the display device, at least one of the image used for the inspection of the selected inspection portion and the imaging conditions under which that image was captured.
  6.  The appearance inspection system according to claim 4 or 5, further comprising a storage device for storing a three-dimensional model representing the shape of the inspection object, wherein the plurality of inspection portions are set in advance on the three-dimensional model, and
     the display control unit displays the three-dimensional model on the display device and presents the inspection result of the selected inspection portion on the corresponding portion of the three-dimensional model.
  7.  The appearance inspection system according to any one of claims 4 to 6, wherein, when a plurality of inspection results are selected by the selection operation, the display control unit displays a statistical summary of the plurality of inspection results on the display device.
  8.  The appearance inspection system according to any one of claims 4 to 7, wherein the inspection unit re-inspects the selected inspection portion based on the image used for the inspection of that inspection portion.
  9.  The appearance inspection system according to any one of claims 1 to 8, wherein the display control unit displays, on the display device, the result of comparing the inspection result matrix with an expected value matrix that indicates the true correct value for each inspection result included in the inspection result matrix.
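As an informal illustration of the comparison described in claim 9, a cell-by-cell match of the inspection result matrix against the expected value matrix might look like this (the "match"/"mismatch" labels are hypothetical, not taken from the application):

```python
# Hypothetical sketch of claim 9: compare an inspection result matrix with
# an expected value (ground-truth) matrix cell by cell, producing a
# comparison result matrix that flags mismatches (i.e. false alarms on
# good parts and missed defects on bad ones).
def compare_matrices(results, expected):
    """Return a matrix of 'match' / 'mismatch' labels, same shape as inputs."""
    comparison = []
    for result_row, expected_row in zip(results, expected):
        comparison.append(["match" if r == e else "mismatch"
                           for r, e in zip(result_row, expected_row)])
    return comparison

results  = [["OK", "NG"], ["OK", "OK"]]   # what the inspection unit decided
expected = [["OK", "OK"], ["OK", "OK"]]   # true correct values
comparison = compare_matrices(results, expected)
```

Here the single mismatch marks a cell where the inspection judged NG although the expected value was OK — the kind of discrepancy the claimed display would surface.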
  10.  A method for displaying the results of an appearance inspection performed by an imaging device imaging a plurality of inspection portions of an inspection object, the method comprising:
     acquiring the images obtained by the imaging device imaging each of the plurality of inspection portions while a robot moves the imaging device;
     inspecting each of the plurality of inspection portions for the presence or absence of a defect, based on the images obtained in the acquiring step; and
     displaying, on a display device, an inspection result matrix that represents, for each inspection object, the inspection results for each of the plurality of inspection portions.
  11.  A program for displaying the results of an appearance inspection performed by an imaging device imaging a plurality of inspection portions of an inspection object, the program causing a computer to execute:
     acquiring the images obtained by the imaging device imaging each of the plurality of inspection portions while a robot moves the imaging device;
     inspecting each of the plurality of inspection portions for the presence or absence of a defect, based on the images obtained in the acquiring step; and
     displaying, on a display device, an inspection result matrix that represents, for each inspection object, the inspection results for each of the plurality of inspection portions.
PCT/JP2019/021682 2018-06-27 2019-05-31 External-appearance inspection system, method for displaying external-appearance inspection result, and program for displaying external-appearance inspection result WO2020003887A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018122068A JP7131127B2 (en) 2018-06-27 2018-06-27 APPEARANCE INSPECTION SYSTEM, APPEARANCE INSPECTION RESULT DISPLAY METHOD AND APPEARANCE INSPECTION RESULT DISPLAY PROGRAM
JP2018-122068 2018-06-27

Publications (1)

Publication Number Publication Date
WO2020003887A1 true WO2020003887A1 (en) 2020-01-02

Family

ID=68987116

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/021682 WO2020003887A1 (en) 2018-06-27 2019-05-31 External-appearance inspection system, method for displaying external-appearance inspection result, and program for displaying external-appearance inspection result

Country Status (2)

Country Link
JP (1) JP7131127B2 (en)
WO (1) WO2020003887A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021148593A (en) * 2020-03-19 2021-09-27 株式会社Screenホールディングス Inspection system, display method of inspection result, and display program
JP7409199B2 (en) * 2020-04-01 2024-01-09 株式会社プロテリアル Visual inspection route search method, inspection route search device for visual inspection robot, inspection route search program, and visual inspection robot

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62110138A (en) * 1985-11-08 1987-05-21 Nissan Motor Co Ltd Method for outputting data in surface flaw
JPH0276080A (en) * 1988-09-12 1990-03-15 Omron Tateisi Electron Co Method for displaying inspection result in substrate inspection device
JP2008227027A (en) * 2007-03-12 2008-09-25 Hitachi High-Technologies Corp Data processor, inspection system, and data processing method
JP2009210504A (en) * 2008-03-06 2009-09-17 Olympus Corp Substrate inspecting method
JP2010177293A (en) * 2009-01-27 2010-08-12 Omron Corp Information display system and information display method for quality control of component-mounted substrate
JP2013211323A (en) * 2012-03-30 2013-10-10 Omron Corp Information display system for assisting analysis work of substrate inspection result, and method of assisting analysis work
JP2014132437A (en) * 2013-01-02 2014-07-17 Boeing Co Systems and methods for stand-off inspection of aircraft structures
JP2014134496A (en) * 2013-01-11 2014-07-24 Djtech Co Ltd Method of estimating (classifying) defect factor with print inspection device
JP2017062160A (en) * 2015-09-24 2017-03-30 アイシン精機株式会社 Defect inspection device and defect inspection method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100228752A1 (en) 2009-02-25 2010-09-09 Microsoft Corporation Multi-condition filtering of an interactive summary table


Also Published As

Publication number Publication date
JP7131127B2 (en) 2022-09-06
JP2020003300A (en) 2020-01-09

Similar Documents

Publication Publication Date Title
WO2020003888A1 (en) External-appearance inspection system, method for displaying external-appearance inspection result, and program for displaying external-appearance inspection result
JP4417400B2 (en) Solder inspection line centralized management system and management device used therefor
JP2016502216A (en) Method and system for improved automated visual inspection of physical assets
KR102321768B1 (en) Display screen peripheral circuit detection method, apparatus, electronic device and storage medium
JP2018055675A (en) System and method for improved 3d pose scoring and for eliminating miscellaneous point in 3d image data
WO2020003887A1 (en) External-appearance inspection system, method for displaying external-appearance inspection result, and program for displaying external-appearance inspection result
JP2015233267A (en) Image processing device and imaging apparatus
US10133256B2 (en) Information processing apparatus and method for calculating inspection ranges
JP2014126494A (en) Inspection support device, inspection support method, robot system, control device, robot and program
JP2019158500A (en) Visual inspection system, image processing device, imaging device, and inspection method
JP4958114B2 (en) Information processing apparatus, information processing method, and computer program
JP2011222636A (en) Inspection apparatus, inspection method, and defect coordinate correction method
JP6915288B2 (en) Image processing system, image processing device, circuit reconstruction method in FPGA (Field Programmable Gate Array), and circuit reconstruction program in FPGA
JP2007103645A (en) Pattern inspection method
KR101792701B1 (en) Apparatus and method for inspecting drawing
JP2018005500A (en) Image processing system, image processing method, and image processing program
JP2019215225A (en) Image inspection system and method for controlling the same
JP2011112379A (en) Image processing device and image processing program
JP7404017B2 (en) Image processing method, image processing device, production system, article manufacturing method, program, and recording medium
JP6389120B2 (en) Data processing apparatus, data processing method, and program
JP2018124607A (en) Image processing system, information processing apparatus, information processing method, and information processing program
KR102252326B1 (en) Systems, methods and computer program products for automatically generating wafer image-to-design coordinate mapping
JP7067869B2 (en) Image processing systems, information processing equipment, information processing methods, and information processing programs
JP4812477B2 (en) Image measurement device part program generation device, image measurement device part program generation method, and image measurement device part program generation program
JP6293293B2 (en) How to establish routines for multi-sensor measurement equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19824963

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19824963

Country of ref document: EP

Kind code of ref document: A1