WO2018185893A1 - Unmeasured area extraction device, work guidance device, and work guidance method - Google Patents

Unmeasured area extraction device, work guidance device, and work guidance method

Info

Publication number
WO2018185893A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
measured
image
unit
measurement
Prior art date
Application number
PCT/JP2017/014280
Other languages
French (fr)
Japanese (ja)
Inventor
昌也 松平
Original Assignee
株式会社ニコン
Priority date
Filing date
Publication date
Application filed by 株式会社ニコン (Nikon Corporation)
Priority to PCT/JP2017/014280
Publication of WO2018185893A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B05 SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05C APPARATUS FOR APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05C11/00 Component parts, details or accessories not specifically provided for in groups B05C1/00 - B05C9/00
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B21/00 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G01B21/02 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant, for measuring length, width, or thickness
    • G01B21/08 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant, for measuring length, width, or thickness, for measuring thickness
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B21/00 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G01B21/20 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant, for measuring contours or curvatures, e.g. determining profile

Definitions

  • The present invention relates to an unmeasured region extraction device, a work guidance device, and a work guidance method.
  • A technique for measuring the film thickness of a coating film by pressing a measuring element against a painted surface is known (Patent Document 1).
  • According to one aspect, the unmeasured region extraction device includes: a position measurement unit that measures position information for each part of an object under measurement; an image acquisition unit that acquires image information of the parts of the object measured by the position measurement unit; an overall image generation unit that generates overall image information of the object under measurement from the image information for each measured part; and an unmeasured part extraction unit that extracts, based on the overall image information, the parts of the object for which the position information has not been measured.
  • FIG. 1 is a diagram for explaining a painting work system according to the first embodiment.
  • FIG. 2 schematically shows the painting work performed using the painting work system according to the first embodiment. In FIGS. 1 and 2, an aircraft 11 is illustrated as an example of a painting object to which the painting work system according to the first embodiment is applied.
  • FIG. 1 shows a hangar 2 in which the object to be painted is stored.
  • The hangar 2 is a facility where the aircraft 11 is painted and maintained.
  • The hangar 2 is provided with a gondola 3 on which an operator 4 rides to perform painting work on the object to be painted, a measuring device 10, and a marker 20.
  • FIG. 2 shows the gondola 3.
  • The gondola 3 is equipped with the measuring device 10, and is provided with a painting work guidance control unit (work guidance device) 1, a display device 100, a display device 111 (not shown), and the like.
  • The measuring apparatus 10 measures the distance to the coating target object (aircraft 11) and its three-dimensional position, and analyzes the state of the coating film formed on the coating target object.
  • The painting work guidance control unit 1 has a function of instructing the worker 4 in suitable painting work, and, to execute that function, extracts unmeasured areas that have not been measured by the measuring apparatus 10. Further, the painting work guidance control unit 1 generates information on the unmeasured areas of the painting object, and outputs the information on the unmeasured areas and their painting state to the display device 111.
  • The display device 100 displays images generated from the film thickness distribution information of the coating film applied to the painting object and from the painting work information.
  • The worker 4 can carry out re-measurement of the coating state of the object to be painted and re-painting by checking the contents displayed on the display devices 100 and 111.
  • In this embodiment the painting work guidance control unit 1 and the display devices 100 and 111 are separate, but the display devices 100 and 111 may be integrated with each other. The present invention also includes configurations in which the display devices 100 and 111 are integrated with the painting work guidance control unit 1.
  • In this embodiment the painting work guidance control unit 1, the measurement device 10, the display device 100, and the display device 111 are arranged on the gondola 3 and the marker 20 is arranged on a beam of the hangar 2, but the positions where these are arranged are not limited to this. The measuring device 10 can also be applied when the worker 4 performs the painting work without using the gondola 3, for example while standing on the ground at the work site.
  • The gondola 3 is configured to move to a desired position by supplying power to a drive unit (not shown) through an operation by the operator 4 or the like.
  • The worker 4 moves the gondola 3 appropriately along the painting target surface of the aircraft 11, which is the painting target, and operates the painting device 5 to perform the painting work.
  • Only one gondola 3 is shown to simplify the drawing, but a plurality of gondolas may be provided according to the size of the aircraft 11 or the like.
  • The coating device 5 is, for example, a spray device (painting gun) that ejects paint, and a nozzle is attached to its tip.
  • The painting device 5 is connected to a paint supply device (not shown; a tank, a paint supply pump, etc.) via a hose 6, and discharges (sprays) the paint supplied from the paint supply device when the trigger provided on the painting device 5 is operated.
  • The nozzle is exchangeable, and the paint application pattern (paint ejection pattern) can be changed by replacing the nozzle with one whose ejection opening has a different shape.
  • The measuring device 10 measures the distance from the measuring device 10 to the painting object in a non-contact manner by irradiating the painting object with light and receiving the light reflected from the painting object.
  • The measuring device 10 is a laser radar device that measures distance by the time-of-flight method, and irradiates the aircraft 11, which is the object to be coated, with laser light that is frequency-modulated so that its frequency changes with time.
  • The measuring apparatus 10 includes an azimuth angle changing unit and an elevation angle changing unit so that the direction in which the laser light is emitted can be set freely in three-dimensional space.
  • The measuring device 10 calculates the distance between the measuring device 10 and a measurement point on the aircraft 11 based on the frequency difference between the laser beam reflected from the object to be coated and a reference laser beam. Moreover, the measuring apparatus 10 can generate three-dimensional position information of the reflecting position from the measured distance together with the azimuth angle and the elevation angle.
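  • As an illustration of how a range measurement and the two scan angles can be combined into three-dimensional position information, the conversion is essentially a spherical-to-Cartesian transformation followed by an offset by the device's own position. The sketch below assumes a particular angle convention (azimuth about the vertical axis, elevation from the horizontal plane), which is an illustrative choice rather than anything specified in the patent.

```python
import math

def to_cartesian(distance, azimuth_rad, elevation_rad, device_pos=(0.0, 0.0, 0.0)):
    """Convert a range measurement plus scan angles into a 3D point.

    Assumed convention (illustrative only): azimuth is measured about the
    vertical z-axis from the x-axis, elevation from the horizontal x-y plane.
    `device_pos` is the measuring device's own 3D position in the hangar frame.
    """
    horizontal = distance * math.cos(elevation_rad)
    x = device_pos[0] + horizontal * math.cos(azimuth_rad)
    y = device_pos[1] + horizontal * math.sin(azimuth_rad)
    z = device_pos[2] + distance * math.sin(elevation_rad)
    return (x, y, z)

# Example: a point 12.5 m away at 30 deg azimuth and 10 deg elevation,
# measured from a device located at (2.0, 1.0, 3.5) in the hangar frame.
point = to_cartesian(12.5, math.radians(30.0), math.radians(10.0), (2.0, 1.0, 3.5))
```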
  • The measurement apparatus 10 may instead calculate the distance between the measurement apparatus 10 and the measurement point based on the phase difference between the reflected laser beam and the reference laser beam. Further, when there is an obstacle between the measuring device 10 and the measurement target position on the painting target, a mirror may be arranged so that the measurement light is directed to the measurement target position. In this way, the laser beam from the measuring apparatus 10 can be directed to the object to be coated via the mirror, and the reflected light from the object to be coated can also be detected by the measuring apparatus 10 via the mirror.
  • The marker 20 is arranged at a known position in the hangar 2.
  • The position of the marker 20 serves as a reference position for specifying the positions of the measuring device 10 and the aircraft 11 in the hangar 2.
  • Each measuring device 10 measures the position of the marker 20, thereby obtaining the relative position and angle of the measuring device 10 with respect to the marker 20.
  • From this, the three-dimensional spatial position of the measuring apparatus 10 in the hangar 2 is obtained.
  • As shown in FIG. 1, by installing markers 20 at a plurality of locations in the hangar 2 and using the position of each marker 20 in the hangar 2 as a reference, the positions of the measuring device 10 and of measurement points over a wide range in the hangar 2 can be obtained.
  • Using the horizontal angle (azimuth angle) and the vertical angle (elevation or depression angle) of the irradiated laser beam together with the three-dimensional spatial position information of the measuring device 10 itself, the three-dimensional spatial position in the hangar 2 can be calculated for every position on the painting target surface.
  • As the coordinate system for representing the three-dimensional spatial position, an orthogonal coordinate system or a polar coordinate system is used.
  • The measurement apparatus 10 performs measurement along the surface of the aircraft 11 by sequentially changing the irradiation angles of the laser light in the horizontal and vertical directions (azimuth angle and elevation or depression angle). That is, the measuring apparatus 10 acquires point cloud data representing the three-dimensional spatial position of each measurement point on the aircraft 11 by scanning the irradiated laser light while changing the azimuth angle and the elevation or depression angle.
  • When a mirror is used, correct position information can be calculated by correcting the point cloud data based on information indicating the mirror mounting position and the normal direction of the mirror's reflecting surface.
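  • As a sketch of what such a mirror correction can look like: a point computed as if the beam had travelled in a straight line lies at the mirror image of the true position, so it can be recovered by reflecting it across the mirror plane defined by the mounting position and the surface normal. This is a minimal illustration only, with assumed inputs, not the patent's specific correction procedure.

```python
import numpy as np

def unfold_across_mirror(apparent_point, mirror_point, mirror_normal):
    """Reflect an apparent 3D point across a flat mirror plane.

    `mirror_point` is any point on the mirror surface (its mounting position)
    and `mirror_normal` is the normal of the reflecting surface. The returned
    point is the position the measurement actually corresponds to.
    """
    n = np.asarray(mirror_normal, dtype=float)
    n = n / np.linalg.norm(n)
    p = np.asarray(apparent_point, dtype=float)
    m = np.asarray(mirror_point, dtype=float)
    d = np.dot(p - m, n)          # signed distance from the mirror plane
    return p - 2.0 * d * n

# Example: correct a small point cloud measured via one mirror.
cloud = np.array([[5.0, 0.2, 1.8], [5.1, 0.3, 1.9]])
corrected = np.array([unfold_across_mirror(q, [4.0, 0.0, 2.0], [1.0, 0.0, 0.0]) for q in cloud])
```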
  • The measuring apparatus 10 generates shape data representing the shape of the aircraft 11 based on the obtained plurality of point cloud data.
  • In addition, only one measuring device 10 is shown to simplify the drawing, but in practice a plurality of measuring devices 10 are arranged around the aircraft 11 in order to measure the entire surface of the aircraft 11.
  • The measuring device 10 can be disposed, for example, in the vicinity of the gondola 3, on a movable carriage, on a fixed base, and the like.
  • The measuring device 10 may also be disposed above or below the aircraft 11 or on a self-propelled rail.
  • Alternatively, the measuring device 10 may be attached to the operator 4.
  • The measuring device 10 measures the distance both before painting and after painting, and acquires three-dimensional position information as spatial position information of the measurement point in the hangar 2, including the distance between the measuring device 10 and the measurement point.
  • The measuring apparatus 10 transmits the acquired three-dimensional position information to the painting work guidance control unit 1 by wireless communication or the like.
  • The painting work guidance control unit 1 acquires the three-dimensional position information of the measurement points acquired by the measurement device 10 before painting and the three-dimensional position information of the measurement points acquired by the measurement device 10 after painting.
  • The measuring device 10 is programmed in advance so as to measure a plurality of positions within the surface to be painted. For example, the measuring apparatus 10 performs measurement at a predetermined pitch along route information set on the painting target surface.
  • The measuring device 10 is provided with a rotatable mirror whose tilt angle can be controlled. By irradiating the object to be coated with the laser light via this mirror, the azimuth angle and elevation angle of the laser light are changed.
  • In this way, the irradiation direction of the laser beam can be set so that the laser beam is irradiated onto a desired position on the surface to be coated.
  • The measuring apparatus 10 controls the irradiation direction of the laser light and the movement conditions of the gondola 3 so that the measurement is performed at a predetermined pitch on the surface to be coated.
  • The difference in the three-dimensional position information before and after painting corresponds to the thickness of the coating film (or the thickness of the paint) formed at the measurement point on the painting object. Therefore, the film thickness of the coating film formed at a measurement point can be calculated by taking the difference between the three-dimensional position information before painting and the three-dimensional position information after painting for the measurement points included in each painting target area.
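  • A minimal sketch of that difference computation follows. It assumes that the same measurement point can be paired before and after painting and that a unit surface normal is available at each point (for example, estimated from neighbouring points), so that the film thickness is the displacement of the surface along its normal. The function and variable names are illustrative, not taken from the patent.

```python
import numpy as np

def film_thickness(points_before, points_after, normals):
    """Estimate coating film thickness per measurement point.

    All arrays have shape (N, 3). `points_before` and `points_after` are the
    3D positions of the same N measurement points before and after painting,
    and `normals` are unit surface normals at those points. The thickness is
    the displacement projected onto the outward normal.
    """
    displacement = np.asarray(points_after) - np.asarray(points_before)
    thickness = np.einsum("ij,ij->i", displacement, np.asarray(normals))
    return thickness  # same length unit as the input positions

# Example with two measurement points and a 0.1 mm coat along +z.
before = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
after = before + np.array([0.0, 0.0, 1e-4])
n = np.tile([0.0, 0.0, 1.0], (2, 1))
print(film_thickness(before, after, n))  # ~[1e-4, 1e-4] (metres)
```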
  • The measuring device 10 can acquire three-dimensional position information at each of the timings before, during, and after painting by measuring, as needed, the vicinity of the area where the worker 4 is performing the painting work.
  • The painting work guidance control unit 1 can therefore acquire, as needed, the film thickness of the coating film formed by painting, based on the three-dimensional position information acquired during the painting work.
  • The painting work guidance control unit 1 may estimate the shape change of the painting target based on the temperature difference between before and after painting in the painting target region, and correct the acquired three-dimensional position information before painting accordingly.
  • The painting work guidance control unit 1 can then calculate the film thickness of the coating film based on the corrected three-dimensional position information before painting and the three-dimensional position information after painting.
  • Alternatively, the measuring apparatus 10 may measure the distance only over the range about to be painted immediately before it is painted, and measure the distance again immediately after that range has been painted, so that the painting work guidance control unit 1 calculates the film thickness of the coating film based on those measurement results.
  • The measuring device 10 measures the three-dimensional position of the coating device 5 and the posture of the coating device 5, in addition to measuring the three-dimensional position of the measurement point in the coating target region. In this way, information related to the position and posture of the coating apparatus 5 is acquired.
  • The information related to the posture of the coating apparatus 5 is, for example, information related to the orientation of the nozzle of the coating apparatus 5, and this information can be acquired by measuring the distances from the measuring apparatus 10 to a plurality of locations on the coating apparatus 5.
  • The measuring apparatus 10 transmits the acquired information related to the position and orientation of the painting apparatus 5 to the painting work guidance control unit 1 by wireless communication or the like.
  • A distance sensor and an inclination sensor may be provided on the coating apparatus 5 instead of having the measuring apparatus 10 perform the measurement related to the position and orientation of the coating apparatus 5.
  • In this case, the distance sensor measures the distance from the coating device 5 to the measurement point, and the inclination sensor measures the posture of the coating device 5.
  • The measuring device 10 includes an imaging device 102 (see FIG. 3).
  • The measuring device 10 acquires a captured image of the painting target surface including the measurement point using the imaging device 102, and generates image information including color information of the painting target surface.
  • The optical system that measures the distance from the measurement apparatus 10 to the measurement point and the optical system that captures the image of the surface to be coated share a common optical axis, and the distance measurement and the imaging are performed simultaneously.
  • The image data, including the captured color information of the vicinity region containing the measurement point, is associated with each measurement point and with the distance measured from the measurement apparatus 10. That is, the measuring apparatus 10 generates the three-dimensional position information of the painting target surface and the image data of the vicinity of the position corresponding to that three-dimensional position information.
  • These pieces of information are transmitted to the painting work guidance control unit 1 by wireless communication or the like and stored in the storage unit 105.
  • The imaging device 102 may be provided at a position different from the measurement device 10 instead of being provided in the measurement device 10.
  • The painting work guidance control unit 1 has, for example, an arithmetic processing circuit such as a CPU and memories such as a ROM and a RAM, and realizes its functions by executing a predetermined program. Further, the painting work guidance control unit 1 acquires painting apparatus information related to the painting apparatus 5 through an input operation by the operator 4 or the like.
  • The coating device information is, for example, discharge information, which is information on the type of nozzle of the coating device 5 and on the discharge amount and discharge distribution of the paint discharged (ejected) from the nozzle.
  • The painting work guidance control unit 1 generates work information related to the painting work to be performed by the worker 4, based on the coating apparatus information and on target film thickness information, which is information on the film thickness of the coating film to be formed.
  • When an automatic painting device such as a robot arm is used, control information such as the moving direction and moving speed at which the robot arm moves the painting device 5 along the surface to be painted, and the discharge amount of paint from the painting device 5, is generated.
  • The painting work guidance control unit 1 can also issue a warning when the painting work or the measurement work is not performed even though the painting work should be performed.
  • The painting work guidance control unit 1 generates image data for displaying a film thickness distribution image, which is an image representing the film thickness distribution information of the coating film, and a work instruction image, which is an image giving instructions for the work. Further, when painting is started in an area where no coating film has yet been formed, the painting work guidance control unit 1 may generate work information or control information to be supplied to the automatic painting device based on the painting apparatus information and the target film thickness information. Such information is transmitted to the display device 100 or the display device 111 by wireless communication or the like.
  • The display device 111 is, for example, a projector that projects and displays images, and displays images based on the image data transmitted from the painting work guidance control unit 1.
  • The display device 111 projects and displays an image on the painting target surface based on the image data output by the painting work guidance control unit 1.
  • The operator 4 can perform the painting work or the measurement work according to the work instruction image displayed by the display device 111.
  • Only one display device 111 is shown to simplify the drawing, but a plurality of display devices 111 may be arranged around the aircraft 11 in order to project images onto the entire surface of the aircraft 11.
  • The display device 111 is disposed, for example, at a position near the gondola 3, on a movable carriage, on a fixed base, at a position near a beam or column of the hangar 2, or the like.
  • The display device 111 may also be provided in the measurement device 10.
  • The work instruction image relating to the painting work is projected onto the painting target surface by the display device 111, whereby the worker 4 is instructed in the painting work.
  • A CRT, a liquid crystal display device, or the like may also be used as the display device 111.
  • The worker 4 may wear a head-mounted display (HMD) as the display device 111 so that the work instruction image related to the painting work is presented to the worker 4. Alternatively, a tablet terminal or the like having the functions of the painting work guidance control unit 1 and the display device 111 may be provided.
  • FIG. 3 is a block diagram for explaining an example of the configuration of the measuring apparatus 10, the painting work guidance control unit 1, the display apparatus 100, and the display apparatus 111 according to the first embodiment.
  • The measurement device 10 includes a position measurement unit 101 and an imaging device 102.
  • The painting work guidance control unit 1 includes an unmeasured region extraction unit 25, a work information generation unit 60, a painting work instruction image generation unit 70, a coating film information acquisition unit 108, an alignment unit 109, and an image generation unit 110.
  • The unmeasured region extraction unit 25 includes an image acquisition unit 103, a storage unit 105, an entire image generation unit 106, and an unmeasured part extraction unit 107.
  • The unmeasured region extraction unit 25 may be included in the measurement apparatus 10.
  • The position measuring unit 101 acquires relative position information of the measuring device 10 with respect to the marker 20 at the time of measurement. If the position measuring unit 101 can directly output the distance information from the measuring device 10 to the measurement point, the three-dimensional position information may be generated from the azimuth and elevation angles at the time of measuring the measurement point and from the distance information, in association with the captured image information.
  • The image acquisition unit 103 performs cut-out processing based on the measurement pitch information of the measurement device 10 so that the captured image information falls within an image range corresponding to the measurement pitch.
  • The measuring apparatus 10 outputs, to the painting work guidance control unit 1, time information on when the position measurement unit 101 generated the 3D position information of the surface to be painted and time information on when the image data near the position corresponding to that 3D position information was acquired.
  • The measuring device 10 is fixed to and arranged on the gondola 3.
  • When the gondola 3 is stopped during the painting operation and the position information of the painting target area is acquired before and after the painting operation, the position information acquired by the measuring apparatus 10 at these two points in time necessarily belongs to the same painting target area.
  • The imaging device 102 captures an image including each measurement point in the painting target range and generates image information when the position measurement unit 101 measures the three-dimensional position of each measurement point on the aircraft 11 (the object to be coated).
  • The imaging device 102 includes an imaging optical system and an imaging element that outputs a signal according to the light intensity distribution of the image formed by the imaging optical system.
  • The measuring device 10 acquires position information and image information for the same measurement point using the position measurement unit 101 and the imaging device 102. That is, the image information generated by the imaging device 102 is image information of a range including the measurement points whose three-dimensional positions are measured by the position measurement unit 101.
  • Image information captured and generated by the imaging device 102 is hereinafter referred to as captured image information.
  • The imaging device 102 generates, in addition to the captured image information, angle-of-view information related to the angle of view of the captured image information.
  • The captured image information and the angle-of-view information are transmitted from the measuring device 10 to the painting work guidance control unit 1.
  • The painting work guidance control unit 1 acquires from the measurement device 10 the captured image information and the angle-of-view information generated by the imaging device 102. Further, the painting work guidance control unit 1 acquires from the measurement device 10 the position information generated by the position measurement unit 101 and information indicating the position of the measurement device 10 at the time of the position measurement.
  • The information indicating the position of the measuring apparatus 10 is information indicating the three-dimensional position of the measuring apparatus 10 in the case where the measuring apparatus 10 performs position measurement while sequentially moving among a plurality of positions, alternating with the painting operation. For example, when the measuring device 10 is fixed to the gondola 3 and the gondola 3 is moved sequentially to perform painting work and position measurement, the information indicating the position of the measuring device 10 is, for example, the position of the gondola 3 in its traveling direction.
  • The three-dimensional position information of each measurement point and the corresponding captured image information and angle-of-view information transmitted from the measurement apparatus 10 to the painting work guidance control unit 1 are stored in the storage unit 105.
  • The image acquisition unit 103 performs a cut-out process based on the measurement pitch information of the measurement device 10 so that the captured image information falls within an image range corresponding to the measurement pitch. That is, when the image acquisition unit 103 detects that position information relating to three or more of the measurement points on the surface to be coated is stored in the storage unit 105, it calculates a cut-out range from the stored captured image information. The imaging range on the surface to be painted changes depending on the distance from a given measurement point to the measuring device 10. The imaging range on the surface to be painted also changes as the angle between the measurement direction of the position measuring unit 101 and the normal direction of the surface to be painted at the measurement point departs from the parallel state.
  • Therefore, the image generation range to be extracted from the captured image information as image information corresponding to the measurement pitch is determined based on the distance from the measurement point to the measurement apparatus 10 or on the three-dimensional position information of the measurement points around that measurement point. The image acquisition unit 103 then extracts the image information corresponding to the determined image generation range from the captured image information.
  • The image information generated by the image acquisition unit 103 is image information of a range including the measurement points measured by the position measurement unit 101. A specific example will be described with reference to FIG. 4.
  • FIG. 4 is a diagram for explaining an example of processing by the image acquisition unit 103 of the painting work guidance control unit 1 according to the first embodiment.
  • A range 80a is the range surrounded by the points A, B, C, and D, and indicates the range of the captured image information acquired by the imaging device 102 when measuring the three-dimensional position of the measurement point 81a.
  • A range 80b is the range surrounded by the points E, F, G, and H, and indicates the range of the captured image information acquired by the imaging apparatus 102 when measuring the three-dimensional position of the measurement point 81b, which is the measurement point next to the measurement point 81a.
  • Each measurement point is located at the center of the corresponding range.
  • FIG. 4 also shows the measurement pitch 83 on the measurement target surface.
  • The image acquisition unit 103 calculates the imaging range corresponding to the set measurement pitch 83 for the case where the measurement target surface is a flat surface and the measurement direction of the measurement apparatus 10 is perpendicular to the measurement target surface. For example, when the focal length of the imaging optical system of the imaging device 102 is f, the distance between adjacent measurement points is P, and the distance from the measurement device 10 to the measurement point acquired by the position measurement unit 101 is d, the length of the imaging range in the measurement-pitch direction can be calculated as d × P / f. Based on this imaging range and the information indicating the measurement pitch, the image generation ranges 82a and 82b corresponding to the measurement pitch are set.
  • The image generation range 82a is the range surrounded by the points a, b, c, and d, and the image generation range 82b is the range surrounded by the points e, f, g, and h.
  • The image acquisition unit 103 extracts the image information corresponding to the image generation range 82a from the captured image information 80a, and extracts the image information corresponding to the image generation range 82b from the captured image information 80b.
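  • As an illustration of this cut-out step, the sketch below maps a measurement pitch on the object surface to a pixel window centred on the measurement point, assuming a simple pinhole camera model with the focal length expressed in pixels. The names and numbers are illustrative assumptions, not the patent's exact convention.

```python
def pitch_crop_window(center_px, center_py, pitch_m, distance_m, focal_px):
    """Return the pixel rectangle covering one measurement pitch.

    Under a pinhole model, an object-space length `pitch_m` seen at range
    `distance_m` spans roughly focal_px * pitch_m / distance_m pixels.
    The window is centred on the pixel of the measurement point.
    """
    span_px = focal_px * pitch_m / distance_m
    half = span_px / 2.0
    left, right = center_px - half, center_px + half
    top, bottom = center_py - half, center_py + half
    return (int(round(left)), int(round(top)), int(round(right)), int(round(bottom)))

# Example: a 50 mm measurement pitch seen from 3 m with a 2000-pixel focal length
# spans about 33 pixels around the measurement point at pixel (640, 480).
window = pitch_crop_window(640, 480, 0.05, 3.0, 2000.0)
```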
  • FIG. 5 is a diagram for explaining another example of processing performed by the image acquisition unit 103 according to the first embodiment.
  • As shown in FIG. 5, the image acquisition unit 103 determines whether the measurement target surface is tilted with respect to the measurement direction of the position measurement unit 101, based on the measurement points 81a and 81b and the three-dimensional position information in the vicinity of these measurement points.
  • When the measurement target surface 84 is tilted, that is, when the angle between the measurement direction of the position measurement unit 101 and the normal direction of the coating target surface (the measurement target surface) is equal to or greater than a predetermined angle, the range for cutting out the captured image information is changed according to that angle.
  • When the surface is not tilted, the range for extracting the captured image information corresponding to the measurement point 81a is set to C1, and the range for extracting the captured image information corresponding to the next measurement point 81b is set to the same size. When the measurement target surface 84 is tilted, the range of the captured image information corresponding to the measurement point 81a is set to C21, and the range for extracting the captured image information corresponding to the next measurement point 81b′ is set to C22, which differs from C21.
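  • The patent does not give the exact scaling, but a first-order sketch of a tilt-dependent cut-out is shown below: under the same pinhole assumptions as above, an on-surface length whose patch normal makes an angle with the viewing direction is foreshortened by the cosine of that angle, so the pixel span is rescaled once the tilt exceeds the predetermined angle. The threshold and all names are illustrative.

```python
import math

def tilt_aware_crop_span(pitch_m, distance_m, focal_px, tilt_rad, threshold_rad=math.radians(10.0)):
    """Pixel span of one measurement pitch on a possibly tilted surface patch.

    `tilt_rad` is the angle between the measurement direction and the surface
    normal at the measurement point. Below `threshold_rad` the surface is
    treated as perpendicular to the measurement direction; above it the
    foreshortening factor cos(tilt) is applied to the projected span.
    """
    span_px = focal_px * pitch_m / distance_m
    if tilt_rad >= threshold_rad:
        span_px *= math.cos(tilt_rad)
    return span_px

# Example: the 50 mm pitch at 3 m from the sketch above; a 40-degree tilt
# shrinks the projected span from about 33 px to about 26 px.
print(tilt_aware_crop_span(0.05, 3.0, 2000.0, math.radians(40.0)))
```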
  • The measuring apparatus 10 performs three-dimensional position measurement at a plurality of measurement points while changing the azimuth angle and the elevation angle, as described above. By adjusting the elevation-angle pitch and the azimuth-angle pitch for each measured position in accordance with the shape of the aircraft that is the object to be coated, the plurality of measurement points are measured at a predetermined measurement pitch on the object to be painted.
  • However, the aircraft that is the object to be painted may have a shape different from the assumed shape, or the relative positional relationship between the object and the measuring device may differ from the assumed one. In such cases, the normal direction at the measurement point with respect to the measurement apparatus 10 may differ from the assumed direction, so that measurement is not performed at the required measurement pitch, resulting in measurement omissions of the coating film thickness.
  • In that case, the image acquisition unit 103 outputs to the measurement device 10 a signal for changing the scanning angle amount of the measurement device 10, and the position measuring unit 101 may then perform three-dimensional position measurement with the changed scanning range of the measuring apparatus 10.
  • The image acquisition unit 103 calculates the angle formed between the normal direction of the measurement target surface 84 and the measurement direction of the measurement apparatus 10, and based on the calculated angle, resets the setting information of the azimuth angle or elevation angle set by the position measurement unit 101 at the time of measurement, and generates information indicating the movement amount of the measuring device 10 based on that setting information.
  • The information indicating the movement amount of the measurement device 10 generated in this way is output to the measurement device 10, and the movement amount of the measurement device 10 is changed. For example, the distance between the position of the measurement device 10 when measuring the distance to the measurement point 81a and its position when measuring the distance to the measurement point 81b is reduced, and the number of measurement points is increased.
  • The unmeasured region extraction unit 25 adds the three-dimensional position information read from the storage unit 105 to the image information cut out by the image acquisition unit 103.
  • The image information provided with the position information is stored in the storage unit 105.
  • The whole image generation unit 106 combines the image information for each measurement point cut out by the image acquisition unit 103, based on the position information given to the image information, and generates the whole image information of the measured region of the painting target. Specifically, the entire image generation unit 106 generates the entire image information of the measured region of the painting target by connecting the plurality of pieces of image information cut out by the image acquisition unit 103.
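  • A minimal sketch of such stitching is shown below: each cut-out tile is pasted into a common two-dimensional canvas indexed by the position attached to it, assuming for simplicity a planar target surface already parameterised in two dimensions. Real surfaces would need a 3D-to-2D parameterisation and blending; those details, and all names used here, are illustrative assumptions.

```python
import numpy as np

def stitch_tiles(tiles, positions_m, pitch_m, tile_px):
    """Paste square image tiles into one overall image.

    `tiles` is a list of (tile_px, tile_px) grayscale arrays, `positions_m`
    the 2D position (in metres) of each tile's measurement point on a planar
    target surface, and `pitch_m` the measurement pitch. Unfilled pixels stay
    NaN, which later marks candidate unmeasured areas.
    """
    pos = np.asarray(positions_m, dtype=float)
    origin = pos.min(axis=0)
    idx = np.round((pos - origin) / pitch_m).astype(int)  # grid cell per tile
    h, w = idx[:, 1].max() + 1, idx[:, 0].max() + 1
    canvas = np.full((h * tile_px, w * tile_px), np.nan)
    for tile, (cx, cy) in zip(tiles, idx):
        canvas[cy * tile_px:(cy + 1) * tile_px, cx * tile_px:(cx + 1) * tile_px] = tile
    return canvas

# Example: two neighbouring 32x32 tiles, 50 mm apart.
tiles = [np.ones((32, 32)), np.zeros((32, 32))]
whole = stitch_tiles(tiles, [(0.00, 0.00), (0.05, 0.00)], 0.05, 32)
```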
  • The unmeasured part extraction unit 107 extracts, based on the entire image information, three-dimensional regions for which position information has not been acquired. Among the extracted three-dimensional regions, for example, a three-dimensional region corresponding to the conditions shown in (1) and (2) below is extracted as an unmeasured region.
  • (1) When shape model data of the object to be painted exists, the shape model data is compared with the entire image information, and regions where the shape model data exists but the entire image information does not are extracted. In particular, when painting target area information exists, that information is also referred to.
  • The unmeasured part extraction unit 107 stores the extracted unmeasured areas in the storage unit 105.
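  • As a sketch of condition (1), the comparison can be expressed as a set difference over some common discretisation of the target surface: cells that the shape model covers but the measured coverage does not are reported as unmeasured. The grid representation and function names below are illustrative assumptions, not the patent's data structures.

```python
import numpy as np

def unmeasured_cells(model_mask, measured_mask):
    """Return indices of cells the shape model covers but measurement does not.

    Both masks are boolean 2D arrays over the same parameterisation of the
    painting target surface: `model_mask` marks cells where the shape model
    (e.g. CAD data) says surface exists, `measured_mask` marks cells covered
    by the overall image information / position measurements.
    """
    missing = model_mask & ~measured_mask
    return np.argwhere(missing)

# Example: a 4x4 surface patch in the model with one unmeasured cell.
model = np.ones((4, 4), dtype=bool)
measured = model.copy()
measured[2, 3] = False
print(unmeasured_cells(model, measured))  # -> [[2 3]]
```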
  • The measurement device 10, the unmeasured region extraction unit 25 in the painting work guidance control unit 1, and the display device 111 can also be configured to be used independently as an unmeasured region extraction device.
  • The coating film information acquisition unit 108 calculates the thickness of the coating film formed at each measurement point by taking the difference between the three-dimensional position information of the same measurement point before and after the painting operation, acquired by the position measurement unit 101 of the measuring apparatus 10.
  • The coating film information acquisition unit 108 generates film thickness distribution information representing the distribution of the film thickness of the coating film formed on the painting object.
  • Alternatively, the coating film information acquisition unit 108 may acquire the film thickness distribution information by calculating the thickness of the coating film formed on the painting object from the difference between the position information of the painting object after the painting work and the shape model data of the painting object.
  • The alignment unit 109 performs alignment processing between the shape model data of the painting object and the entire image information of the painting object.
  • The shape model data of the painting object is, for example, the design data (CAD data) of the aircraft 11 that is the painting object.
  • The alignment unit 109 performs the alignment processing between the shape model data of the painting object and the entire image of the painting object using the position information given to the image information cut out by the image acquisition unit 103, which was used when generating the entire image information.
  • The alignment unit 109 also performs alignment processing between the shape model data of the painting object and the film thickness distribution information.
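  • The patent does not specify a particular alignment algorithm. One common choice for registering measured 3D points to model data when point correspondences are available is a rigid (rotation plus translation) fit via the Kabsch algorithm; the sketch below shows that generic technique purely as an illustration.

```python
import numpy as np

def rigid_align(source_pts, target_pts):
    """Kabsch fit: rotation R and translation t with R @ source + t ~= target.

    `source_pts` and `target_pts` are (N, 3) arrays of corresponding points,
    e.g. measured positions and the matching locations on the shape model.
    """
    src = np.asarray(source_pts, dtype=float)
    tgt = np.asarray(target_pts, dtype=float)
    src_c, tgt_c = src.mean(axis=0), tgt.mean(axis=0)
    H = (src - src_c).T @ (tgt - tgt_c)           # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Example: recover a known 90-degree rotation about z plus a shift.
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
src = np.random.default_rng(0).normal(size=(20, 3))
tgt = src @ R_true.T + np.array([0.5, -0.2, 1.0])
R, t = rigid_align(src, tgt)
```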
  • The image generation unit 110 generates image data for displaying an image representing the shape model data of the painting target and the entire image information of the painting target, based on the processing result of the alignment unit 109. The image generation unit 110 also generates, for example, image data for displaying an image representing the shape model data and the film thickness distribution information of the painting target, based on the processing result of the alignment unit 109. In addition, the image generation unit 110 generates, for example, image data for displaying an image representing the parts for which position information has not been acquired, based on the information generated by the unmeasured part extraction unit 107 regarding those parts. The image generation unit 110 outputs the generated image data to the display device 100 and/or 111. The film thickness distribution information may also be output superimposed on the entire image information, instead of being shown together with the shape model data of the object to be coated.
  • The display devices 100 and 111 are, for example, CRTs or liquid crystal display devices, and display images based on the image data transmitted from the painting work guidance control unit 1.
  • The display devices 100 and 111 are each disposed in the gondola 3 or the like.
  • The worker 4 can check the images displayed by the display devices 100 and 111 and perform the painting work.
  • The display device 100 and/or 111 displays the areas for which the shape measurement of the surface to be painted has not been performed, so that the operator 4 can be prompted to perform remeasurement or repainting work.
  • The worker 4 may wear a head-mounted display (HMD) as the display device 100 and/or 111 so that the image is presented to the worker 4. Alternatively, a tablet terminal or the like having the functions of the display devices 100 and/or 111 may be provided.
  • The operator 4 measures the film thickness of the coating film, confirms the state of the coating film, and can then move to the next painting position. However, the operator 4 may move to the next place without measuring the film thickness, due to a mistake by the operator 4 or the like.
  • This embodiment has described the case where there is a single gondola 3, but a plurality of gondolas may also be arranged around the object to be painted and the painting work may be started from different positions all at once. In such a case, an area that is not included in the movement range of any gondola may be left unpainted.
  • The film thickness of the coating film is measured by the measuring device 10 when the gondola has moved to the painting target position and the painting operation has been completed.
  • The coating target position to which the gondola has moved is acquired by associating the three-dimensional position information with the film thickness information of the coating film, so unpainted areas and poorly painted areas can be checked while the object is being painted. That is, according to the present embodiment, the painting work guidance control unit 1 extracts unmeasured areas for which shape measurement by the measuring apparatus 10 has not been performed, and displays an image indicating the positions of the unmeasured areas on the display device 100 and/or 111. This makes it possible to prompt the worker 4 to perform remeasurement or repainting work. As a result, additional painting work on unpainted areas is reduced and the time required for the painting work can be shortened. Moreover, the burden on the operator 4 can be reduced.
  • The coating film information acquisition unit 108 determines, based on the film thickness distribution information acquired using the measuring apparatus 10, whether each measurement point in the coating target area belongs to a painted area or an unpainted area. Further, the coating film information acquisition unit 108 determines, for the painted areas, whether the thickness of the coating film is within a predetermined range. The coating film information acquisition unit 108 thereby generates, from the film thickness distribution information of the coating target surface, insufficient film thickness distribution information, which is information on the regions where the coating film thickness is insufficient and on the amount by which it is insufficient.
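  • A minimal sketch of that classification step follows: each measurement point's film thickness is compared against a detection threshold (painted versus unpainted) and against the target range, and the shortfall is recorded for insufficiently coated points. The thresholds and names are illustrative assumptions.

```python
def classify_film_thickness(thickness_by_point, target_min, target_max, detect_min=1e-6):
    """Classify measurement points and report film-thickness shortfalls.

    `thickness_by_point` maps a point id to its measured film thickness (m).
    Points below `detect_min` are treated as unpainted; painted points below
    `target_min` are reported together with the amount still missing.
    """
    unpainted, within_range, shortfall = [], [], {}
    for point_id, t in thickness_by_point.items():
        if t < detect_min:
            unpainted.append(point_id)
        elif t < target_min:
            shortfall[point_id] = target_min - t   # insufficient thickness
        elif t <= target_max:
            within_range.append(point_id)
        # thickness above target_max could be flagged separately if needed
    return unpainted, within_range, shortfall

# Example: target film thickness between 80 and 120 micrometres.
result = classify_film_thickness({"p1": 0.0, "p2": 60e-6, "p3": 100e-6}, 80e-6, 120e-6)
```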
  • The storage unit 105 stores the film thickness distribution information input to the coating film information acquisition unit 108.
  • The storage unit 105 also stores, through an input operation by the operator 4 or the like, coating apparatus information regarding the coating apparatus 5 and information regarding the film thickness of the coating film to be formed (target film thickness information and the like).
  • The storage unit 105 stores discharge information for a plurality of nozzles as the coating apparatus information.
  • The storage unit 105 includes a semiconductor memory such as a RAM and a storage medium such as a hard disk device.
  • The work information generation unit 60 generates work information, which is information related to the painting work to be performed on the painting target, based on the target film thickness information, the film thickness distribution information, the coating apparatus information, and the like.
  • The work information is, for example, information related to the target position of the paint sprayed by the coating apparatus 5, information related to the position of the coating apparatus 5 and the direction of the nozzle, and information related to the velocity (speed and direction) at which the coating apparatus 5 is moved.
  • The work information generation unit 60 includes a position calculation unit 61 and a transition calculation unit 62.
  • The position calculation unit 61 calculates the target position at which the coating apparatus 5 sprays the paint onto the object to be coated. If the coating apparatus 5 is a brush, the position calculation unit 61 calculates the target contact position of the brush with respect to the coating object. If the coating apparatus 5 is an electrodeposition coating apparatus, the position calculation unit 61 calculates the position of the coating target to be immersed in the electrodeposition coating liquid. In the following description, the case where the coating apparatus 5 is a spray gun is described.
  • The position calculation unit 61 calculates the target position at which the coating apparatus 5 performs coating and the target posture of the coating apparatus 5 at that time, based on the information on the discharge amount and the discharge distribution included in the coating apparatus information. Specifically, the position calculation unit 61 calculates the spray target position with respect to the coating target and the position and orientation of the coating apparatus 5, using the insufficient film thickness distribution information, the information on the discharge amount and discharge distribution, and the like. The position calculation unit 61 may calculate the spray target position based on the shape data generated by the measurement apparatus 10, and may adjust the spray target position according to the shape and size of the painting target.
  • The transition calculation unit 62 calculates the spray target position for each work time in the series of steps of the paint spraying operation, based on, for example, the painting apparatus information. That is, the transition calculation unit 62 calculates the temporal transition of the position of the coating apparatus 5 when the painting work is performed.
  • The transition calculation unit 62 calculates the speed at which the coating apparatus 5 is to be moved, based on the information on the discharge amount of the paint discharged from the nozzle used in the coating apparatus 5 and on its discharge distribution, and on the difference between the film thickness calculated by the analysis unit 15 and the design film thickness (target film thickness).
  • The speed at which the coating apparatus 5 is moved is, for example, the moving distance of the coating apparatus 5 per unit time with respect to the coating target.
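  • As a sketch of how such a moving speed could be derived, assuming a simplified uniform-deposition model that the patent does not specify: if the gun delivers a volumetric paint flow Q over a swath of width w while moving at speed v, the thickness added per pass is roughly Q / (w * v), so the speed needed to add a given thickness increment follows by inversion. Transfer efficiency and pass overlap are ignored.

```python
def required_gun_speed(flow_m3_per_s, swath_width_m, thickness_deficit_m, v_max=0.5):
    """Speed (m/s) at which the spray gun should move to add the missing thickness.

    Simplified model: added thickness per pass ~= flow / (swath_width * speed),
    so speed = flow / (swath_width * thickness_deficit). The result is capped
    at `v_max` to keep the instruction practical for a human operator.
    """
    if thickness_deficit_m <= 0:
        return v_max  # nothing to add; move at the nominal maximum speed
    speed = flow_m3_per_s / (swath_width_m * thickness_deficit_m)
    return min(speed, v_max)

# Example: 2 cm^3/s of paint, 20 cm swath, 40 micrometres still missing.
print(required_gun_speed(2e-6, 0.2, 40e-6))  # ~0.25 m/s
```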
  • As described above, the work information generation unit 60 generates work information on the painting work to be performed, such as the paint spray target position, the position of the coating apparatus 5, and the moving speed, in accordance with the state of the film thickness of the coating film formed on the painting target surface by the painting work.
  • The work information generation unit 60 may also generate the work information in consideration of the ambient temperature and humidity of the object to be painted, the characteristics of the paint, the overall work time, and the like.
  • The work information generated by the work information generation unit 60 is output to the painting work instruction image generation unit 70.
  • The painting work instruction image generation unit 70 generates image data for displaying a film thickness distribution image and a work instruction image. For example, the painting work instruction image generation unit 70 generates image data for displaying the film thickness distribution image and the work instruction image superimposed on the painting target surface, based on the film thickness distribution information, the work information, and the shape data of the painting target.
  • The image data generated by the painting work instruction image generation unit 70 is generated based on the position and orientation of the display device 111 with respect to the painting target surface. For example, information regarding the position and orientation of the display device 111 is input to the painting work instruction image generation unit 70, which then generates the image data such as the film thickness distribution image and the work instruction image based on that information.
  • In this way, the film thickness distribution image and the work instruction image can be appropriately superimposed and displayed on the painting target.
  • The image data generated by the painting work instruction image generation unit 70 is output to the display device 111 by wireless communication or the like. Note that only one of the film thickness distribution image and the work instruction image may be displayed, without superimposition.
  • The display device 111 can display various images based on the image data generated by the painting work instruction image generation unit 70. For example, the display device 111 displays a film thickness distribution image in which the film thickness of the coating film is classified into steps and color-coded. The painting work guidance control unit 1 may also cause the display device 111 to display the film thickness value of the coating film for each location. Further, the painting work guidance control unit 1 may determine an optimum nozzle from among the replaceable nozzles, generate an image for guiding the replacement of the nozzle of the painting device 5, and cause the display device 111 to display that image.
  • FIG. 6 is a diagram illustrating an example of a display image displayed on the display device 111 in the first embodiment.
  • In FIG. 6, the film thickness distribution image and the work instruction image are superimposed and displayed on the painting target surface of the aircraft 11, which is the painting target. These images are projected and displayed in alignment with the painting target surface of the aircraft 11.
  • Colors are displayed according to the film thickness of the coating film.
  • Regions 121 and 122 are regions in which the film thickness is within a predetermined range from the target film thickness.
  • The region 122 is a region whose film thickness is less than the film thickness range of the region 121.
  • The region 123 is a region whose film thickness is less than the film thickness range of the region 122 and is below the predetermined range from the target film thickness.
  • In FIG. 6, the differences in color are expressed using dots and hatching.
  • a pointer 90 shown in FIG. 6 indicates a target position at which painting spraying is started.
  • The pointer 90 moves in the direction indicated by the arrow 91.
  • The movement indicated by the arrow 91 takes place in a direction and at a speed suitable for forming the desired film thickness.
  • By moving the coating device 5 so as to follow the movement of the arrow, the operator 4 can perform an appropriate painting operation on areas that have not been painted or areas where the film thickness of the coating film is insufficient.
  • When the discharge amount of the paint discharged from the coating apparatus 5 is set to be constant, the film thickness of the coating film formed by the painting operation can be adjusted by adjusting the moving speed of the pointer 90.
  • The operator 4 can also be instructed in advance to spray paint onto the object to be painted from the coating device 5 while the pointer 90 is displayed, and to stop spraying when the pointer 90 disappears. As a result, paint is prevented from being sprayed unnecessarily, leading to a reduction in the cost of the painting work.
  • In this way, a work instruction image can be projected onto the painting target surface of the aircraft 11, which is the painting target. Further, by displaying such a work instruction image on a see-through head-mounted display, the worker 4 can perform the painting operation while checking the work instruction image superimposed on the painting work surface. As a result, the burden on the operator can be greatly reduced.
  • FIG. 7 is a flowchart showing a flow of processing by the painting work system according to the first embodiment.
  • In step S201, the painting work guidance control unit 1 controls the movement of the gondola so as to move the gondola to the painting position on the painting object, and the process proceeds to step S203.
  • In step S203, the painting work guidance control unit 1 acquires from the measuring device 10 the three-dimensional position information of the painting target surface measured by the position measurement unit 101 and the image of the painting target surface obtained by the imaging device 102, and the process proceeds to step S205.
  • In step S205, the image acquisition unit 103 cuts out the image of the painting target surface acquired by the imaging device 102 so that it falls within the image range corresponding to the measurement pitch, and the process proceeds to step S207.
  • In step S207, it is determined whether or not the painting work has been completed. If the determination is affirmative, that is, if it is determined that the painting work has been completed, the process proceeds to step S209. If the determination is negative, that is, if it is determined that the painting work has not been completed, the process returns to step S201.
  • In step S209, the entire image generation unit 106 connects the images cut out by the image acquisition unit 103 in step S205 to generate an entire image of the painting target, and the process proceeds to step S211.
  • In step S211, the unmeasured part extraction unit 107 determines whether there is an area for which measurement position information has not been acquired, that is, whether there is an unmeasured area.
  • If an affirmative determination is made in step S211, that is, if there is an unmeasured area, the process proceeds to step S213. If a negative determination is made in step S211, that is, if it is determined that there is no unmeasured area, it means that there is no area where the painting work has not been completed, and the painting work guidance control unit 1 therefore ends this routine.
  • In step S213, for the area determined to be an unmeasured area in step S211, the painting work guidance control unit 1 moves the gondola to that area. After the painting operation has been performed, the film thickness of the coating film is measured by the same procedure as from step S201 onward, and the process proceeds to step S215.
  • In step S215, the coating film information acquisition unit 108 generates the film thickness distribution information, and the painting work guidance control unit 1 determines whether or not the film thickness of the coating film has reached a predetermined value. If an affirmative determination is made in step S215, that is, if it is determined that the film thickness of the coating film has reached the predetermined value, the painting work guidance control unit 1 ends this routine. If a negative determination is made in step S215, that is, if it is determined that the film thickness of the coating film has not reached the predetermined value, the process returns to step S213.
  • As described above, the painting work guidance control unit 1 acquires from the measurement apparatus 10 the position information for each part of the measurement object and the image information of the measured parts of the measurement object.
  • The painting work guidance control unit 1 includes the entire image generation unit 106, which generates the entire image information of the object to be measured from the image information for each measured part, and the unmeasured part extraction unit 107, which extracts, based on the entire image information, the parts of the object to be measured for which the position information has not been measured.
  • With such a configuration, it is possible to identify both the regions of the painting target where distance measurement has been performed and the regions where it has not. It is therefore possible to present information indicating the unmeasured areas to the worker and to prompt the worker to perform remeasurement or repainting work. Further, by performing the painting work based on the unmeasured areas, the areas to be painted can be reliably painted, and the time required for the painting work can be shortened.
  • (Modification 1) Mesh data based on the position information measured by the position measurement unit 101 may be used as the image information.
  • The points of the point cloud data acquired by the position measurement unit 101 are connected to generate mesh data composed of a plurality of meshes (for example, triangular meshes).
  • The unmeasured part extraction unit 107 extracts any area in which no mesh has been generated as an area where distance measurement has not been performed.
  • The image generation unit 110 may also generate a two-dimensional image based on the three-dimensional data acquired by the measuring device 10 and use the two-dimensional image data as the image information.
  • (Modification 2) In the embodiment and the modification described above, an example in which an aircraft is used as the painting target has been described.
  • However, the painting target may be an automobile, a ship, or the like, and is not particularly limited.
  • the present invention can be applied to the analysis of the coating state of various coating objects.
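The step sequence S201 to S215 summarized in this list can be read as a simple control loop. The following Python sketch is only an illustration of that flow; the callables passed in (painting_finished, measure_and_crop, stitch, find_unmeasured, repaint_and_measure) are hypothetical stand-ins for the measuring device 10 and the units of the painting work guidance control unit 1 and are not part of the disclosure.

```python
from typing import Callable, List, Tuple

def guidance_loop(
    painting_finished: Callable[[], bool],              # decision of step S207
    measure_and_crop: Callable[[], Tuple[list, object]],# steps S201-S205: positions + cropped image
    stitch: Callable[[list], object],                   # step S209: entire image generation unit 106
    find_unmeasured: Callable[[object], List[object]],  # step S211: unmeasured part extraction unit 107
    repaint_and_measure: Callable[[object], float],     # step S213: repaint a region, return film thickness
    target_thickness: float,                            # predetermined value checked in step S215
) -> None:
    tiles = []
    while not painting_finished():            # S207: keep measuring while painting continues
        tiles.append(measure_and_crop())      # S201-S205
    whole_image = stitch(tiles)               # S209
    for region in find_unmeasured(whole_image):   # S211
        # S213-S215: move to the region, repaint and re-measure until the
        # coating film reaches the predetermined thickness.
        while repaint_and_measure(region) < target_thickness:
            pass
```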

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

This unmeasured area extraction device is provided with a position measurement unit for measuring position information for each part of an object under measurement, an image acquisition unit for acquiring image information for the parts under measurement of the object under measurement that have been measured by the position measurement unit, an overall image generation unit for generating overall image information for the object under measurement from the image information for each part under measurement, and an unmeasured part extraction unit for extracting a part where position information for the object under measurement is unmeasured on the basis of the overall image information.

Description

Unmeasured region extraction device, work guidance device, and work guidance method
The present invention relates to an unmeasured region extraction device, a work guidance device, and a work guidance method.
A technique for measuring the film thickness of a coating film by pressing a measuring element against a painted surface is known (Patent Document 1).
Japanese Unexamined Patent Publication No. 59-180322
According to a first aspect of the present invention, an unmeasured region extraction device includes: a position measurement unit that measures position information for each part of an object to be measured; an image acquisition unit that acquires image information of the measured parts of the object measured by the position measurement unit; an entire image generation unit that generates entire image information of the object from the image information for each measured part; and an unmeasured part extraction unit that extracts, based on the entire image information, parts of the object for which the position information has not been measured.
FIG. 1 is a diagram for explaining the unmeasured region extraction device according to the first embodiment.
FIG. 2 is a diagram schematically showing a painting operation using the unmeasured region extraction device according to the first embodiment.
FIG. 3 is a diagram for explaining the functional configuration of the unmeasured region extraction device according to the first embodiment.
FIG. 4 is a diagram for explaining processing by the unmeasured region extraction device according to the first embodiment.
FIG. 5 is a diagram for explaining another example of processing by the unmeasured region extraction device according to the first embodiment.
FIG. 6 is a diagram showing an example of an image displayed by the display device.
FIG. 7 is a flowchart showing the flow of processing by the unmeasured region extraction device according to the first embodiment.
Embodiments of the present invention will be described with reference to the drawings, but the present invention is not limited to them. In the drawings, the scale is changed as appropriate in order to describe the embodiments, for example by enlarging or emphasizing parts.
-First embodiment-
With reference to the drawings, the painting work system (unmeasured region extraction device) according to the first embodiment will be described, taking as an example the case where a painting operation is performed on an aircraft, an example of a painting target, using the unmeasured region extraction function. The first embodiment is described specifically for the purpose of understanding the gist of the invention and does not limit the present invention unless otherwise specified.
FIG. 1 is a diagram for explaining the painting work system according to the first embodiment. FIG. 2 is a diagram schematically showing a painting operation using the painting work system according to the first embodiment. In FIGS. 1 and 2, an aircraft 11 is illustrated as an example of a painting target to which the painting work system according to the first embodiment is applied. FIG. 1 shows a hangar 2 in which the painting target is stored. The hangar 2 is a facility where the aircraft 11 is painted and maintained.
The hangar 2 is provided with a gondola 3 that a worker 4 boards to perform painting work on the painting target, a measuring device 10, and markers 20. FIG. 2 shows the gondola 3. The gondola 3 is equipped with the measuring device 10 and is provided with a painting work guidance control unit (work guidance device) 1, a display device 100, a display device 111 (not shown), and the like. In this embodiment, the measuring device 10 measures the three-dimensional position of, that is, the distance to, the painting target (the aircraft 11) and analyzes the state of the coating film formed on the painting target. The painting work guidance control unit 1 has a function of instructing the worker 4 on suitable painting work, and to carry out this function it extracts unmeasured regions that have not been measured by the measuring device 10. The painting work guidance control unit 1 further generates information on the unmeasured regions of the painting target and outputs information on the unmeasured regions and their painting state to the display device 111. The display device 100 displays images generated from the film thickness distribution information of the coating film applied to the painting target and from the painting work information. By checking the contents displayed on the display devices 100 and 111, the worker 4 can re-measure the coating state of the painting target and perform repainting work.
In the present embodiment, the painting work guidance control unit 1 and the display devices 100 and 111 are separate, but the display devices 100 and 111 may be integrated. The present invention also includes configurations in which the display devices 100 and 111 are integrated with the painting work guidance control unit 1. In the present embodiment, the painting work guidance control unit 1, the measuring device 10, the display device 100, and the display device 111 are arranged on the gondola 3, and the markers 20 are arranged on the beams of the hangar 2, but the positions where these are arranged are not limited to this. The measuring device 10 can also be applied when the worker 4 performs painting work without using the gondola 3, for example while standing on the ground at the work site.
The gondola 3 is configured to move to a desired position by supplying power to a drive unit (not shown) through an operation by the worker 4 or the like. The worker 4 moves the gondola 3 as appropriate along the painting target surface of the aircraft 11, which is the painting target, and operates the painting device 5 to perform the painting work. In the example shown in FIG. 1, only one gondola 3 is shown to simplify the drawing, but a plurality of gondolas may be provided according to the size of the aircraft 11 and the like.
The painting device 5 is, for example, a spray device (painting gun) that ejects paint, and a nozzle is attached to its tip. The painting device 5 is connected via a hose 6 to a paint supply device (not shown; a tank, a paint supply pump, and the like), and when a trigger provided on the painting device 5 is operated, the paint supplied from the paint supply device is discharged (sprayed). The nozzle is exchangeable, and the paint application pattern (paint discharge pattern) can be changed by exchanging it for a nozzle whose discharge opening has a different shape. A roller, a brush, or an electrodeposition coating device may also be used as the painting device 5.
The measuring device 10 measures the distance from the measuring device 10 to the painting target in a non-contact manner by irradiating the painting target with light and receiving the light reflected from the painting target. For example, the measuring device 10 is a laser radar device that measures distance by a time-of-flight method, and irradiates the aircraft 11, which is the painting target, with laser light that is frequency-modulated so that its frequency changes with time. The measuring device 10 includes an azimuth angle changing unit and an elevation/depression angle changing unit so that the direction in which the laser light is emitted can be set freely in three-dimensional space. The measuring device 10 calculates the distance between the measuring device 10 and a measurement point on the aircraft 11 based on the frequency difference between the laser light reflected from the painting target and a reference laser light. The measuring device 10 can also generate three-dimensional position information of the reflecting position from the azimuth angle and the elevation/depression angle.
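As a worked example of ranging from the frequency difference, the sketch below assumes a linear FMCW sweep of bandwidth B over duration T, in which the beat frequency between the reflected and reference light is proportional to the round-trip delay. The formula and the numeric example are standard FMCW relations, not values taken from the document.

```python
C = 299_792_458.0  # speed of light in m/s

def fmcw_range(beat_frequency_hz: float, sweep_bandwidth_hz: float,
               sweep_duration_s: float) -> float:
    """Distance to the target for a linear FMCW sweep.

    The round-trip delay tau = 2R/c shifts the reflected chirp so that
    f_beat = (B / T) * tau, hence R = c * f_beat * T / (2 * B)."""
    return C * beat_frequency_hz * sweep_duration_s / (2.0 * sweep_bandwidth_hz)

# Example: a 1 MHz beat with a 1 GHz sweep over 1 ms corresponds to about 150 m.
print(fmcw_range(1e6, 1e9, 1e-3))  # -> approximately 149.9
```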
The measuring device 10 may instead calculate the distance between the measuring device 10 and the measurement point based on the phase difference between the amplitudes of the reflected laser light and the reference laser light. When there is an obstacle between the measuring device 10 and the measurement target position on the painting target, a mirror may be arranged so that the measurement light is directed to the measurement target position. In this way, the laser light from the measuring device 10 can be irradiated onto the painting target via the mirror, and the light reflected from the painting target can also be detected by the measuring device 10 via the mirror.
The markers 20 are arranged at known positions in the hangar 2. The positions of the markers 20 serve as reference positions for specifying the positions of the measuring device 10, the aircraft 11, and so on in the hangar 2. When a plurality of measuring devices 10 are arranged in the hangar 2, each measuring device 10 measures the position of a marker 20 and thereby obtains its relative position and angle with respect to the marker 20. In this way, the three-dimensional spatial position of the measuring device 10 in the hangar 2 is obtained. As shown in FIG. 1, by installing markers 20 at a plurality of locations in the hangar 2 and using the position of each marker 20 as a reference, the positions of the measuring device 10 and of measurement points can be obtained over a wide range in the hangar 2.
As described above, by using the distance between the measuring device 10 and the painting target, the horizontal angle (azimuth) and vertical angle (elevation or depression) of the irradiated laser light, and the three-dimensional spatial position information of the measuring device 10 itself, the measuring device 10 can calculate the three-dimensional spatial position in the hangar 2 of any position on the painting target surface. An orthogonal coordinate system or a polar coordinate system is used as the coordinate system for representing this three-dimensional spatial position. The measuring device 10 performs measurement along the surface of the aircraft 11 by sequentially changing the irradiation angles of the laser light in the horizontal and vertical directions (azimuth and elevation or depression). That is, by scanning the irradiated laser light while changing the azimuth angle and the elevation or depression angle, the measuring device 10 acquires point cloud data representing the three-dimensional spatial position of each measurement point on the aircraft 11.
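As an illustration of how a scan sample and the pose of the measuring device 10 combine into a point in the hangar coordinate system, the sketch below first recovers the device pose from several markers 20 with known hangar coordinates (a standard Kabsch/SVD rigid registration requiring at least three non-collinear markers) and then converts one (azimuth, elevation, distance) sample into hangar coordinates. The axis convention and the use of NumPy are assumptions for illustration, not the method fixed by the document.

```python
import numpy as np

def device_pose_from_markers(markers_device: np.ndarray,
                             markers_hangar: np.ndarray):
    """Rigid transform (R, t) such that hangar_point = R @ device_point + t.

    markers_device: N x 3 marker positions measured in the device frame.
    markers_hangar: N x 3 known marker positions in the hangar frame (N >= 3)."""
    p_c, q_c = markers_device.mean(axis=0), markers_hangar.mean(axis=0)
    H = (markers_device - p_c).T @ (markers_hangar - q_c)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                      # proper rotation (no reflection)
    return R, q_c - R @ p_c

def scan_sample_to_hangar(azimuth: float, elevation: float, distance: float,
                          R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """One (azimuth, elevation, range) sample as a 3D point in hangar
    coordinates, assuming x forward, y left, z up in the device frame and
    angles in radians (elevation negative for a depression angle)."""
    p_device = distance * np.array([np.cos(elevation) * np.cos(azimuth),
                                    np.cos(elevation) * np.sin(azimuth),
                                    np.sin(elevation)])
    return R @ p_device + t
```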
When measurement is performed by the measuring device 10 using a mirror, correct position information can be calculated by correcting the point cloud data based on information indicating the mounting position of the mirror and the normal direction of its reflecting surface. The measuring device 10 generates shape data representing the shape of the aircraft 11 based on the plurality of obtained point cloud data. In the example shown in FIG. 1, one measuring device 10 is shown to simplify the drawing, but in practice a plurality of measuring devices 10 are arranged around the aircraft 11 in order to measure its entire surface. The measuring devices 10 can be arranged, for example, in the vicinity of the gondola 3, on a movable carriage, or on a fixed pedestal. A measuring device 10 may also be arranged above or below the aircraft 11 or on a self-propelled rail, or the measuring device 10 may be worn by the worker 4.
The measuring device 10 performs distance measurement in each of the states before and after painting, and acquires three-dimensional position information as spatial position information of the measurement points in the hangar 2, including the distance between the measuring device 10 and each measurement point. The measuring device 10 transmits the acquired three-dimensional position information to the painting work guidance control unit 1 by wireless communication or the like. That is, in the present embodiment, the painting work guidance control unit 1 acquires the three-dimensional position information of the measurement points acquired by the measuring device 10 before painting and the three-dimensional position information of the measurement points acquired by the measuring device 10 after painting.
The measuring device 10 is programmed in advance to measure a plurality of positions within the surface to be painted. For example, the measuring device 10 performs measurement at a predetermined pitch along route information set on the painting target surface. The measuring device 10 is provided with a rotatable mirror whose tilt angle can be controlled; by irradiating the painting target with the laser light via this mirror, the azimuth and elevation/depression angles of the laser light are changed, and the irradiation direction of the laser light can be set so that the laser light strikes a desired position on the painting target surface. The measuring device 10 controls the irradiation direction of the laser light and the movement conditions of the gondola 3 so that measurement is performed at the predetermined pitch on the surface to be painted.
The difference between the respective three-dimensional position information before and after painting corresponds to the thickness of the coating film (the thickness of the paint) formed at the measurement point on the painting target. Therefore, the film thickness of the coating film formed at a measurement point can be calculated by obtaining the difference between the three-dimensional position information before painting and the three-dimensional position information after painting for the measurement points included in each painting target region. By measuring the vicinity of the area where the worker 4 is performing the painting work as needed, the measuring device 10 can acquire three-dimensional position information before, during, and after painting. Based on this three-dimensional position information, the painting work guidance control unit 1 can acquire the film thickness of the coating film formed by painting as appropriate during the painting work.
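A minimal sketch of this thickness calculation: for measurement points paired before and after painting, the displacement between the two measured three-dimensional positions is taken as the local film thickness, as described above. The optional projection onto a surface normal is an added refinement, and the array layout is an assumption for illustration.

```python
from typing import Optional
import numpy as np

def film_thickness(points_before: np.ndarray, points_after: np.ndarray,
                   normals: Optional[np.ndarray] = None) -> np.ndarray:
    """Film thickness per measurement point.

    points_before, points_after: N x 3 positions of the same measurement
        points before and after painting, in the same order.
    normals: optional N x 3 outward unit surface normals; if given, the
        displacement is projected onto the normal, which is less sensitive
        to the angle between the laser direction and the surface."""
    displacement = points_after - points_before
    if normals is None:
        return np.linalg.norm(displacement, axis=1)
    return np.einsum("ij,ij->i", displacement, normals)
```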
When calculating the film thickness of the coating film, the painting work guidance control unit 1 may estimate a change in the shape of the painting target based on the temperature difference between before and after painting in the painting target region, and correct the pre-painting three-dimensional position information acquired from the measuring device 10. In this way, the painting work guidance control unit 1 can calculate the film thickness of the coating film based on the corrected pre-painting three-dimensional position information and the post-painting three-dimensional position information. Alternatively, the measuring device 10 may perform the pre-painting distance measurement only for the range about to be painted and measure that range again as soon as it has been painted, and the painting work guidance control unit 1 may calculate the film thickness of the coating film based on those measurement results.
In addition to measuring the three-dimensional positions of the measurement points in the painting target region, the measuring device 10 measures the three-dimensional position and the attitude of the painting device 5, thereby acquiring information related to the position and attitude of the painting device 5. The information related to the attitude of the painting device 5 is, for example, information on the orientation of the nozzle of the painting device 5, and this information can be acquired by measuring the distances from the measuring device 10 to a plurality of locations on the painting device 5. The measuring device 10 transmits the acquired information related to the position and attitude of the painting device 5 to the painting work guidance control unit 1 by wireless communication or the like. Instead of measuring the position and attitude of the painting device 5 with the measuring device 10, a distance sensor and a tilt sensor may be provided on the painting device 5: the distance sensor measures the distance from the painting device 5 to the measurement point, and the tilt sensor measures the attitude of the painting device 5.
The measuring device 10 includes an imaging device 102 (see FIG. 3). The measuring device 10 acquires a captured image of the painting target surface including the measurement point with the imaging device 102, and generates image information including color information of the painting target surface. In the measuring device 10, for example, the optical system that measures the distance from the measuring device 10 to the measurement point and the optical system that captures the image of the painting target surface share a common optical axis, and distance measurement and imaging are performed simultaneously. In this way, for the neighborhood of each measurement point, the captured image data including color information is associated with the measured distance from the measuring device 10. That is, the measuring device 10 generates the three-dimensional position information of the painting target surface and image data of the vicinity of the position corresponding to that three-dimensional position information. This information is transmitted to the painting work guidance control unit 1 by wireless communication or the like and stored in the storage unit 105. Note that the imaging device 102 may be provided at a position separate from the measuring device 10 instead of being provided in the measuring device 10.
The painting work guidance control unit 1 has, for example, an arithmetic processing circuit such as a CPU and memories such as a ROM and a RAM, and carries out its functions by executing a predetermined program. The painting work guidance control unit 1 also acquires painting device information on the painting device 5 through an input operation by the worker 4 or the like. The painting device information is, for example, discharge information such as the type of nozzle of the painting device 5 and the discharge amount and discharge distribution of the paint discharged (sprayed) from that nozzle. Based on the painting device information and target film thickness information, which is information on the film thickness of the coating film to be formed, the painting work guidance control unit 1 generates work information on the painting work to be performed by the worker 4. When automatic painting using a robot is performed, it generates control information such as the movement direction and movement speed with which the robot arm moves the painting device 5 along the painting target surface and the discharge amount of paint from the painting device 5. Furthermore, the painting work guidance control unit 1 can issue a warning when painting work or measurement work has not been performed at a position where painting work should be performed.
The painting work guidance control unit 1 generates image data for displaying a film thickness distribution image, which is an image representing the film thickness distribution information of the coating film, and a work instruction image, which is an image instructing the work. When painting is started in a region where no coating film has been formed, the painting work guidance control unit 1 may generate work information, or control information to be supplied to an automatic painting device, based on the painting device information and the target film thickness information. This information is transmitted to the display device 100 and the display device 111 by wireless communication or the like.
The display device 111 is, for example, a projector that projects and displays images, and displays images based on the image data transmitted from the painting work guidance control unit 1. Based on the image data output by the painting work guidance control unit 1, the display device 111 projects and displays an image on the painting target surface. The worker 4 can perform painting work and measurement work in accordance with the work instruction image displayed by the display device 111. In the example shown in FIG. 1, one display device 111 is shown to simplify the drawing, but a plurality of display devices 111 may be arranged around the aircraft 11 so that images can be projected onto the entire surface of the aircraft 11.
In this case, the display devices 111 are arranged, for example, in the vicinity of the gondola 3, on a movable carriage, on a fixed pedestal, or in the vicinity of beams or columns of the hangar 2. The display device 111 may also be provided in the measuring device 10. As described above, the work instruction image relating to the painting work is projected onto the painting target surface by the display device 111, whereby the worker 4 is given instructions for the painting work. A CRT, a liquid crystal display device, or the like may also be used as the display device 111. Alternatively, the worker 4 may wear a head-mounted display (HMD) serving as the display device 111, and the work instruction image relating to the painting work may be presented to the worker 4. A tablet terminal or the like having the functions of the painting work guidance control unit 1 and the display device 111 may also be provided.
FIG. 3 is a block diagram for explaining an example of the configuration of the measuring device 10, the painting work guidance control unit 1, the display device 100, and the display device 111 according to the first embodiment. The measuring device 10 includes a position measurement unit 101 and an imaging device 102. The painting work guidance control unit 1 includes an unmeasured region extraction unit 25, a work information generation unit 60, a painting work instruction video generation unit 70, a coating film information acquisition unit 108, an alignment unit 109, and an image generation unit 110. The unmeasured region extraction unit 25 includes an image acquisition unit 103, a storage unit 105, an entire image generation unit 106, and an unmeasured part extraction unit 107. The unmeasured region extraction unit 25 may be included in the measuring device 10.
The position measurement unit 101 acquires relative position information of the measuring device 10 with respect to the markers 20 at the time of measurement. If the position measurement unit 101 can directly output distance information from the measuring device 10 to a measurement point, it may generate, as the three-dimensional position information, the azimuth and elevation/depression angles at which the measurement point was measured together with the distance information, in association with the captured image information. The image acquisition unit 103 performs cropping based on the measurement pitch information of the measuring device 10 so that the captured image information falls within an image range corresponding to the measurement pitch. The measuring device 10 outputs to the painting work guidance control unit 1 the three-dimensional position information of the painting target surface generated by the position measurement unit 101 and time information on the time at which the image data of the vicinity of the position corresponding to that three-dimensional position information was acquired.
In the example shown in FIGS. 1 and 2, the measuring device 10 is fixed to the gondola 3. In such an arrangement, when the gondola 3 is kept stationary during the painting work and the position information of the painting target region is acquired before and after the painting work, the position information acquired by the measuring device 10 at the two points in time, before and after the painting work, necessarily relates to the same painting target region.
When the position measurement unit 101 measures the three-dimensional position of each measurement point on the aircraft 11 (the painting target), the imaging device 102 captures an image including each measurement point in the painting target range and generates image information. The imaging device 102 includes an imaging optical system and an imaging element that outputs a signal corresponding to the light intensity distribution of the image formed by the imaging optical system. The measuring device 10 acquires position information and image information for the same measurement point with the position measurement unit 101 and the imaging device 102. That is, the image information generated by the imaging device 102 is image information of a range including the measurement point for which three-dimensional position measurement was performed by the position measurement unit 101. The image information captured and generated by the imaging device 102 is hereinafter referred to as captured image information. In addition to the captured image information, the imaging device 102 also generates angle-of-view information on the angle of view of the captured image information. The captured image information and the angle-of-view information are transmitted from the measuring device 10 to the painting work guidance control unit 1.
The painting work guidance control unit 1 acquires from the measuring device 10 the captured image information and angle-of-view information generated by the imaging device 102. The painting work guidance control unit 1 also acquires from the measuring device 10 the position information of each measurement point generated by the position measurement unit 101 and information indicating the position of the measuring device 10 at the time of position measurement. The information indicating the position of the measuring device 10 is information indicating the three-dimensional position of the measuring device 10 when the measuring device 10 performs position measurement while intermittently and sequentially moving through a plurality of positions in the course of the painting work. For example, when the measuring device 10 is fixed to the gondola 3 and the gondola 3 is moved sequentially to perform painting work and position measurement, the information indicating the position of the measuring device 10 is used to calculate the amount of movement (movement distance) of the gondola 3 in its traveling direction. The three-dimensional position information of each measurement point and the captured image information and angle-of-view information corresponding to each measurement point, transmitted from the measuring device 10 to the painting work guidance control unit 1, are stored in the storage unit 105.
The image acquisition unit 103 performs cropping based on the measurement pitch information of the measuring device 10 so that the captured image information falls within an image range corresponding to the measurement pitch. That is, when the image acquisition unit 103 detects that position information on three or more of the measurement points on the painting target surface has been stored in the storage unit 105, it calculates a cropping range from the stored captured image information. The imaging range on the painting target surface changes depending on the distance from a given measurement point to the measuring device 10. The imaging range on the painting target surface also changes as the angle between the measurement direction of the position measurement unit 101 and the normal direction of the painting target surface at the measurement point departs from parallel. Therefore, in the present invention, the image generation range is determined so that image information corresponding to the measurement pitch is cropped from the captured image information, based on the distance between the measurement point and the measuring device 10 or on the three-dimensional position information of the surrounding measurement points. The image acquisition unit 103 extracts the image information corresponding to the determined image generation range from the captured image information. The image information generated by the image acquisition unit 103 is image information corresponding to a wide range including the measurement points measured by the position measurement unit 101. A specific example will be described with reference to FIG. 4.
FIG. 4 is a diagram for explaining an example of processing by the image acquisition unit 103 of the painting work guidance control unit 1 according to the first embodiment. In FIG. 4, a range 80a is the range surrounded by points A, B, C, and D, and indicates the range of the captured image information acquired by the imaging device 102 when the three-dimensional position of a measurement point 81a was measured. A range 80b is the range surrounded by points E, F, G, and H, and indicates the range of the captured image information acquired by the imaging device 102 when the three-dimensional position of a measurement point 81b, the measurement point following the measurement point 81a, was measured. In both the range 80a and the range 80b of captured image information, the respective measurement point is located at the center.
Reference numeral 83 in FIG. 4 denotes the measurement pitch on the measurement target surface. When the measurement target surface is a plane and the measurement direction of the measuring device 10 is perpendicular to the measurement target surface, the image acquisition unit 103 calculates the imaging range corresponding to the set measurement pitch 83. For example, when the focal length of the imaging optical system of the imaging device 102 is f, the distance between mutually adjacent measurement points is P, and the distance from the measuring device 10 to the measurement point acquired by the position measurement unit 101 is d, the length of the imaging range in the measurement pitch direction can be calculated as d × P / f. Image generation ranges 82a and 82b corresponding to the measurement pitch are set based on the imaging region and the information indicating the measurement pitch. The image generation range 82a is the range surrounded by points a, b, c, and d, and the image generation range 82b is the range surrounded by points e, f, g, and h. The image acquisition unit 103 extracts the image information corresponding to the image generation range 82a from the captured image information 80a, and extracts the image information corresponding to the image generation range 82b from the captured image information 80b.
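The relation above can be turned into an image crop as follows. This is a minimal sketch under a pinhole model with square pixels: the object-space width imaged by the sensor at distance d is obtained from the sensor width and the focal length f, and one measurement pitch is then converted to a pixel window around the image center. The sensor width parameter and the centering assumption follow the description of FIG. 4, while the function name and units are illustrative.

```python
import numpy as np

def crop_to_pitch(captured: np.ndarray, d: float, f: float,
                  sensor_width: float, pitch: float) -> np.ndarray:
    """Crop the captured image (centered on the measurement point) to the
    window corresponding to one measurement pitch on a flat surface
    perpendicular to the measurement direction. All lengths in one unit."""
    h, w = captured.shape[:2]
    ground_width = d * sensor_width / f      # object-space width imaged by the sensor
    px_per_unit = w / ground_width           # image pixels per unit length on the surface
    half = int(round(pitch * px_per_unit / 2))
    cy, cx = h // 2, w // 2                  # the measurement point lies at the image center
    return captured[cy - half: cy + half, cx - half: cx + half]
```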
FIG. 5 is a diagram for explaining another example of processing by the image acquisition unit 103 according to the first embodiment. In FIG. 5, the image acquisition unit 103 determines, based on the measurement points 81a and 81b and the three-dimensional position information in their vicinity, whether the measurement target surface 84 is inclined with respect to the measurement direction of the position measurement unit 101. When it is determined that the measurement target surface 84 is inclined, that is, when the angle between the measurement direction of the position measurement unit 101 and the normal direction of the painting target surface, which is the measurement target surface, is equal to or greater than a predetermined angle, the range over which the captured image information is cropped is changed according to that angle.
This point will be described specifically with reference to FIG. 5. When it is determined that the measurement target surface 84 is not inclined, the range over which the captured image information corresponding to the measurement point 81a is cropped is set to C1, and the range over which the captured image information corresponding to the next measurement point 81b is cropped is likewise set to C1. On the other hand, when it is determined that the measurement target surface 84 is inclined, the range over which the captured image information corresponding to the measurement point 81a is cropped is set to C21, and the range over which the captured image information corresponding to the next measurement point 81b' is cropped is set to C22, which differs from C21. For simplicity, only one direction of the cropping range has been described above, but the same applies to each of the two dimensions.
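For the tilted case of FIG. 5, a simple first-order adjustment is to foreshorten the crop window by the cosine of the angle between the measurement direction and the surface normal estimated from neighboring measurement points. This is only an approximation of the behavior described above (the asymmetric ranges C21 and C22 are not modeled), with illustrative names and parameters.

```python
import numpy as np

def crop_half_width_px(image_width_px: int, d: float, f: float,
                       sensor_width: float, pitch: float,
                       theta_rad: float) -> int:
    """Half-width in pixels of the crop window covering one measurement pitch
    on a surface tilted by theta_rad with respect to the measurement direction
    (first-order foreshortening approximation)."""
    ground_width = d * sensor_width / f
    px_per_unit = image_width_px / ground_width
    return int(round(pitch * np.cos(theta_rad) * px_per_unit / 2))
```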
As described above, the measuring device 10 performs three-dimensional position measurement at a plurality of measurement points while changing the azimuth angle and the elevation/depression angle. The pitch of the measurement points is therefore adjusted for each measured position by varying the elevation/depression angle pitch and the azimuth angle pitch according to the shape of the aircraft, which is the painting target, so that a plurality of measurement points are measured at the predetermined measurement pitch on the painting target. In practice, however, there are cases where the aircraft that is the painting target has a shape different from the assumed shape, or where the relative positional relationship between the painting target and the measuring device differs from the assumed relationship at the time of measurement. In such cases, the normal direction at a measurement point may differ from the direction assumed with respect to the measuring device 10, so that measurement is not performed at the required measurement pitch, which causes omissions in the measurement of the coating film thickness.
For example, simply cropping the captured image information for each of the measurement points 81a and 81b based on the distance information of the measurement points does not yield an image range corresponding to the measurement pitch on the measured surface. However, by setting the image generation range cropped from the captured image information according to the degree of inclination of the measurement target surface with respect to the measurement direction, image information corresponding to the desired measurement pitch can be created. When it is determined that the measurement target surface 84 is inclined with respect to the measurement direction of the position measurement unit 101, the image acquisition unit 103 may output to the measuring device 10 a signal for changing the scanning angle amount of the measuring device 10, and the position measurement unit 101 may change the scanning range of the measuring device 10 and perform the three-dimensional position measurement.
In such a case, the image acquisition unit 103 calculates the angle between the normal direction of the measurement surface 84 and the measurement direction of the measuring device 10, resets, based on the calculated angle, the azimuth or elevation/depression angle settings used by the position measurement unit 101 at the time of measurement, and generates information indicating the movement amount of the measuring device 10 based on those settings. The information indicating the movement amount of the measuring device 10 generated in this way is output to the measuring device 10, and the movement amount of the measuring device 10 is changed. For example, the interval between the position of the measuring device 10 when the distance to the measurement point 81a is measured and the position of the measuring device 10 when the distance to the measurement point 81b is measured is reduced, thereby increasing the number of measurement points.
The unmeasured region extraction unit 25 attaches the three-dimensional position information read from the storage unit 105 to the image information cropped by the image acquisition unit 103. The image information to which the position information has been attached is stored in the storage unit 105.
The entire image generation unit 106 combines the image information for each measurement point cropped by the image acquisition unit 103, based on the position information attached to that image information, and generates entire image information of the measured region of the painting target. Specifically, the entire image generation unit 106 generates the entire image information of the measured region of the painting target by joining together the plurality of pieces of image information cropped by the image acquisition unit 103.
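A minimal sketch of joining the cropped tiles into an entire image, assuming each tile has already been assigned row and column indices on the measurement grid derived from the position information attached to it; mapping the attached three-dimensional positions onto such a two-dimensional parameterization of the surface is left out. The tile size, the grid structure, and single-channel tiles are illustrative assumptions.

```python
from typing import List, Tuple
import numpy as np

def stitch_tiles(tiles: List[Tuple[int, int, np.ndarray]],
                 tile_px: int) -> Tuple[np.ndarray, np.ndarray]:
    """Paste square, single-channel tiles of size tile_px x tile_px onto one canvas.

    tiles: (row_index, col_index, tile_image) triples, where the grid indices
           come from the position information attached to each cropped image.
    Returns (whole_image, coverage); coverage marks pixels that received
    measured image data, so uncovered pixels hint at unmeasured regions."""
    rows = 1 + max(r for r, _, _ in tiles)
    cols = 1 + max(c for _, c, _ in tiles)
    whole = np.zeros((rows * tile_px, cols * tile_px), dtype=np.float32)
    coverage = np.zeros(whole.shape, dtype=bool)
    for r, c, tile in tiles:
        y, x = r * tile_px, c * tile_px
        whole[y:y + tile_px, x:x + tile_px] = tile[:tile_px, :tile_px]
        coverage[y:y + tile_px, x:x + tile_px] = True
    return whole, coverage
```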
The unmeasured part extraction unit 107 extracts, based on the entire image information, three-dimensional regions for which position information has not been acquired. Among the extracted three-dimensional regions, for example, regions that satisfy the conditions shown in the following (1) and (2) are extracted as unmeasured regions. (1) When shape model data of the painting target is available, the shape model data is compared with the entire image information, and regions where the shape model data exists but the entire image information does not exist are extracted; in particular, when painting target region information is available, that information is also referred to. (2) In the contour information of the painting target extracted from the entire image information, regions where the contour line is discontinuous, or regions enclosed by the contour line in which image information is missing, are extracted. The unmeasured part extraction unit 107 stores the extracted unmeasured regions in the storage unit 105.
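A minimal sketch of criterion (1): a mask of where the painting target should exist according to the shape model data (and, if available, the painting target region information) is compared with the coverage of the stitched entire image, and connected regions covered by the model but not by the measurements are returned. Criterion (2), based on contour discontinuities, is only noted in the docstring. The masks and the use of scipy.ndimage.label for connected components are illustrative choices, not taken from the document.

```python
import numpy as np
from scipy import ndimage

def unmeasured_regions(model_mask: np.ndarray, coverage: np.ndarray):
    """Connected regions where the shape model data exists but no measured
    image information exists (criterion (1)).

    model_mask: boolean grid, True where the painting target surface exists
                according to the shape model data (and the painting target
                region information, if available).
    coverage:   boolean grid, True where the entire image has measured data.
    Criterion (2) -- gaps inside the contour extracted from the entire image --
    could be handled similarly by filling the contour and subtracting coverage."""
    missing = model_mask & ~coverage
    labels, count = ndimage.label(missing)
    return [np.argwhere(labels == i) for i in range(1, count + 1)]
```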
In the present embodiment, the measuring device 10, the unmeasured region extraction unit 25 in the painting work guidance control unit 1, and the display device 111 together form a configuration that can be used independently as an unmeasured region extraction device.
Next, functions other than the function of extracting unmeasured regions that are implemented in the present embodiment will be described. The coating film information acquisition unit 108 calculates the thickness of the coating film formed at each measurement point by obtaining the difference between the three-dimensional position information before the painting work and the three-dimensional position information after the painting work for the same measurement point, acquired by the position measurement unit 101 of the measuring device 10. Based on the calculated coating film thickness, the coating film information acquisition unit 108 generates film thickness distribution information on the film thickness distribution of the coating film. The coating film information acquisition unit 108 may also acquire the film thickness distribution information by calculating the thickness of the coating film formed on the painting target from the difference between the post-painting position information of the painting target and the shape model data of the painting target.
The alignment unit 109 performs alignment processing between the model data of the painting target and the entire image information of the painting target. The shape model data of the painting target is, for example, design data (CAD data) of the aircraft 11, which is the painting target. For example, the alignment unit 109 performs the alignment processing between the shape model data of the painting target and the entire image of the painting target by using the position information attached to the image information cropped by the image acquisition unit 103, which was used when generating the entire image information. The alignment unit 109 also performs alignment processing between the shape model data of the painting target and the film thickness distribution information.
The image generation unit 110 generates, based on the processing result of the alignment unit 109, image data for displaying an image representing the shape model data of the painting target and the entire image information of the painting target. For example, the image generation unit 110 also generates, based on the processing result of the alignment unit 109, image data for displaying an image representing the shape model data of the painting target and the film thickness distribution information. Further, for example, the image generation unit 110 generates, based on the information generated by the unmeasured part extraction unit 107 on the parts for which position information has not been acquired, image data for displaying an image representing the parts for which position information has not been acquired. The image generation unit 110 outputs the generated image data to the display device 100 and/or 111. The film thickness distribution information may also be output superimposed on the entire image information instead of being displayed together with the shape model data of the painting target.
 表示装置100および111は、例えばCRTや液晶表示装置であり、測定装置10から塗装作業ガイダンス用制御ユニット1に送信された画像データに基づいて画像を表示する。表示装置100および111はそれぞれ、ゴンドラ3内などに配置される。作業者4は、表示装置100および111により表示された画像を確認して塗装作業を行うことが可能となる。上記説明のように、本実施の形態では、表示装置100および/または11により塗装対象面の形状測定が行われていない領域が表示させることにより、作業者4に対して再測定や再塗装作業を促すことができる。なお、表示装置100および/111として作業者4がヘッドマウントディスプレイ(HMD)を装着して、画像を作業者4に提示してもよい。表示装置100および/111としての機能を有するタブレット端末などを備えるようにしてもよい。 The display devices 100 and 111 are, for example, a CRT or a liquid crystal display device, and display an image based on the image data transmitted from the measurement device 10 to the painting work guidance control unit 1. Each of the display devices 100 and 111 is disposed in the gondola 3 or the like. The worker 4 can check the images displayed by the display devices 100 and 111 and perform the painting work. As described above, in the present embodiment, the display device 100 and / or 11 displays an area in which the shape measurement of the surface to be painted is not performed, so that the operator 4 can perform remeasurement or repainting work. Can be encouraged. The worker 4 may wear a head-mounted display (HMD) as the display devices 100 and / 111 and present the image to the worker 4. You may make it provide the tablet terminal etc. which have a function as the display apparatuses 100 and / 111.
When a worker manually paints a large object such as the aircraft 11, the worker 4 is expected to measure the film thickness of the coating, confirm its state, and then move to the next painting position. However, the worker 4 may move to the next position without measuring the film thickness, for example because of a mistake. Although the present embodiment describes the case of a single gondola 3, a plurality of gondolas may be arranged around the object to be painted and painting may be started simultaneously from different positions. In such a case, some areas may fall outside the movement range of every gondola and be left unpainted.
According to the present embodiment, when a gondola moves to a painting target position and the painting work there is completed, the film thickness of the coating is measured by the measuring apparatus 10. For each painting target position to which the gondola has moved, the three-dimensional position information and the film thickness information of the coating are acquired in association with each other, so unpainted regions or poorly painted regions can be identified while the object is still being painted. That is, according to the present embodiment, the painting work guidance control unit 1 extracts unmeasured regions for which shape measurement by the measuring apparatus 10 has not been performed and causes the display device 100 and/or 111 to display an image indicating the positions of the unmeasured regions. The worker 4 can thereby be prompted to remeasure or repaint. As a result, additional painting of unpainted regions can be reduced and the painting work time can be shortened. The burden on the worker 4 can also be reduced.
The coating information acquisition unit 108 determines, for each measurement point in the painting target region, whether the point belongs to a painted region or an unpainted region, based on the film thickness distribution information acquired by the measuring apparatus 10. For the painted regions, the coating information acquisition unit 108 also determines whether the film thickness is within a predetermined range. From the post-painting film thickness distribution information for the surface to be painted, the coating information acquisition unit 108 thereby generates thickness-shortage distribution information, that is, information on the regions where the film thickness is insufficient and on the amount by which it falls short.
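One plausible way to realize this classification is sketched below; the threshold values, the NaN convention for points without position data, and the function name are assumptions for illustration, not values taken from the description.

```python
# Sketch (assumed thresholds): deriving a thickness-shortage map from per-point
# film-thickness measurements. Label 0=OK, 1=too thin, 2=unpainted, 3=unmeasured.
import numpy as np

def shortage_distribution(thickness_um, target_um=120.0, tolerance_um=10.0,
                          unpainted_below_um=5.0):
    unmeasured = np.isnan(thickness_um)
    filled = np.where(unmeasured, np.inf, thickness_um)   # unmeasured never looks thin
    unpainted = ~unmeasured & (filled < unpainted_below_um)
    thin = ~unmeasured & ~unpainted & (filled < target_um - tolerance_um)

    label = np.zeros(thickness_um.shape, dtype=np.int8)
    label[thin], label[unpainted], label[unmeasured] = 1, 2, 3

    shortage = np.clip(target_um - thickness_um, 0.0, None)  # thickness still missing
    shortage[unmeasured] = np.nan
    return label, shortage

# Example patch of readings in micrometres (NaN = no position data at that point).
patch = np.array([[118.0, 95.0, np.nan],
                  [121.0,  2.0, 104.0]])
labels, short = shortage_distribution(patch)
print(labels)                 # [[0 1 3]
                              #  [0 2 1]]
print(np.round(short, 1))
```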
The storage unit 105 stores the film thickness distribution information input to the coating information acquisition unit 108. The storage unit 105 also stores, through input operations by the worker 4 or the like, coating apparatus information about the coating apparatus 5 and information about the film thickness of the coating to be formed (such as target film thickness information). For example, the storage unit 105 stores discharge information for a plurality of nozzles as the coating apparatus information. The storage unit 105 includes a semiconductor memory such as a RAM and a storage medium such as a hard disk drive.
The work information generation unit 60 generates work information, that is, information about the painting work to be performed on the painting object, based on the target film thickness information, the film thickness distribution information, the coating apparatus information, and the like. The work information includes, for example, information about the target spray position of the coating apparatus 5, information about the position of the coating apparatus 5 and the orientation of its nozzle, and information about the speed (magnitude and direction) at which the coating apparatus 5 is to be moved. The work information generation unit 60 has a position calculation unit 61 and a transition calculation unit 62.
If the coating apparatus 5 is a spray gun, the position calculation unit 61 calculates the target position on the painting object at which the coating apparatus 5 is to spray paint. If the coating apparatus 5 is a brush, the position calculation unit 61 calculates the target contact position of the brush on the painting object. If the coating apparatus 5 is an electrodeposition coating apparatus, the position calculation unit 61 calculates the position of the painting object to be immersed in the electrodeposition coating liquid. In the following description, the case where the coating apparatus 5 is a spray gun is described.
The position calculation unit 61 calculates the target position at which the coating apparatus 5 is to apply paint and the target posture of the coating apparatus 5 at that time, based on the information about the discharge amount and the discharge distribution included in the coating apparatus information. Specifically, the position calculation unit 61 calculates the target spray position on the painting object and the position and posture of the coating apparatus 5 using the thickness-shortage distribution information, the discharge amount and discharge distribution information, and the like. The position calculation unit 61 may calculate the target spray position based on the shape data generated by the measuring apparatus 10. The position calculation unit 61 may also adjust the target spray position according to the shape and size of the painting object.
The transition calculation unit 62 calculates, for example based on the coating apparatus information, the target spray position for each point in time over the series of steps of the spraying operation. That is, the transition calculation unit 62 calculates the temporal transition of the position of the coating apparatus 5 during the painting work. For example, the transition calculation unit 62 also calculates the speed at which the coating apparatus 5 is to be moved, based on information about the discharge amount and discharge distribution of the paint from the nozzle used in the coating apparatus 5, and on information about the difference between the film thickness calculated by the analysis unit 15 and the design film thickness (target film thickness). The speed at which the coating apparatus 5 is moved is, for example, the distance the coating apparatus 5 moves relative to the painting object per unit time.
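As a rough illustration of how such a travel speed could be derived from discharge information and a thickness deficit, the sketch below assumes a uniform spray swath, a constant discharge rate, and hypothetical values for solids fraction and transfer efficiency; none of these figures or names come from the description.

```python
# Illustrative sketch under simplifying assumptions: gun travel speed that deposits a
# desired additional dry-film thickness in one pass over a uniform swath.
def travel_speed_mm_per_s(discharge_ml_per_s: float,
                          swath_width_mm: float,
                          deficit_um: float,
                          solids_fraction: float = 0.5,
                          transfer_efficiency: float = 0.7) -> float:
    # Dry volume deposited per second, in mm^3 (1 ml = 1000 mm^3).
    dry_mm3_per_s = discharge_ml_per_s * 1000.0 * solids_fraction * transfer_efficiency
    # Added thickness = dry volume rate / (swath width * speed); solve for speed.
    deficit_mm = deficit_um / 1000.0
    return dry_mm3_per_s / (swath_width_mm * deficit_mm)

# Example: 2 ml/s discharge, 150 mm swath, 60 um still needed at this spot.
print(round(travel_speed_mm_per_s(2.0, 150.0, 60.0), 1), "mm/s")   # 77.8 mm/s
```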
As described above, the work information generation unit 60 generates work information about the painting work to be performed, such as the target spray position, the position of the coating apparatus 5, and its moving speed, according to the state of the film thickness of the coating formed on the surface to be painted. The work information generation unit 60 may also generate the work information in consideration of the ambient temperature and humidity around the painting object, the characteristics of the paint, the total work time, and the like. The work information generated by the work information generation unit 60 is output to the painting work instruction image generation unit 70.
The painting work instruction image generation unit 70 generates image data for displaying a film thickness distribution image and a work instruction image. For example, the painting work instruction image generation unit 70 generates image data for displaying the film thickness distribution image and the work instruction image superimposed on the surface to be painted, based on the film thickness distribution information, the work information, and the shape data of the painting object. The image data generated by the painting work instruction image generation unit 70 is generated based on the position and orientation of the display device 111 with respect to the surface to be painted. For example, information about the position and orientation of the display device 111 is input to the painting work instruction image generation unit 70, which generates the image data of the film thickness distribution image, the work instruction image, and so on to be displayed, based on that information. The film thickness distribution image and the work instruction image can thereby be appropriately superimposed and displayed on the painting object. The image data generated by the painting work instruction image generation unit 70 is output to the display device 111 by wireless communication or the like. Note that only one of the film thickness distribution image and the work instruction image may be displayed instead of superimposing both.
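A minimal sketch of the geometric step implied here, projecting thickness-tagged surface points into the image plane of the display from its known pose, is given below assuming a simple pinhole camera model; the intrinsics, pose, and coordinates are invented for the example and are not the patent's rendering pipeline.

```python
# Sketch: pinhole projection of 3-D surface points into display pixel coordinates,
# given the display's pose (rotation R_wc, translation t_wc) relative to the surface.
import numpy as np

def project_points(points_world: np.ndarray, R_wc: np.ndarray, t_wc: np.ndarray,
                   fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Project Nx3 world points into Nx2 pixel coordinates."""
    pts_cam = (R_wc @ points_world.T + t_wc[:, None]).T   # world -> camera frame
    z = pts_cam[:, 2]
    u = fx * pts_cam[:, 0] / z + cx
    v = fy * pts_cam[:, 1] / z + cy
    return np.stack([u, v], axis=1)

# Example: identity pose, surface points 2 m in front of the display's camera.
pts = np.array([[0.0, 0.0, 2.0], [0.1, -0.05, 2.0]])
uv = project_points(pts, np.eye(3), np.zeros(3), fx=800, fy=800, cx=640, cy=360)
print(np.round(uv, 1))   # pixel positions where the overlay would be drawn
```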
The display device 111 can display various images based on the image data generated by the painting work instruction image generation unit 70. For example, the display device 111 displays a film thickness distribution image in which the film thickness of the coating is classified into levels and color-coded. The painting work guidance control unit 1 may also cause the display device 111 to display the film thickness value of the coating at each location. Further, the painting work guidance control unit 1 may determine the optimum nozzle among the replaceable nozzles, generate an image guiding the replacement of the nozzle of the coating apparatus 5, and cause the display device 111 to display it.
FIG. 6 is a diagram showing an example of a display image displayed on the display device 111 in the first embodiment. In the example shown in FIG. 6, the film thickness distribution image and the work instruction image are displayed superimposed on the surface to be painted of the aircraft 11, which is the painting object. These images are projected and displayed in alignment with the surface to be painted of the aircraft 11. In the film thickness distribution image shown in FIG. 6, the regions are color-coded according to the film thickness of the coating. Regions 121 and 122 are regions in which the film thickness is within a predetermined range of the target film thickness. Region 122 is a region whose film thickness is smaller than the film thickness range of region 121. Region 123 is a region whose film thickness is smaller still than the film thickness range of region 122 and falls below the predetermined range of the target film thickness. In FIG. 6, the differences in color are represented by dots and hatching.
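A small sketch of the color-coding idea follows, mapping thickness values to display bands such as regions 121 to 123; the band edges and color names are hypothetical choices, not values from the description.

```python
# Sketch: classify a thickness reading into one of three display bands.
from bisect import bisect

BAND_EDGES_UM = [100.0, 110.0]             # assumed: <100 far too thin, 100-110 thin, >=110 OK
BAND_COLOURS = ["red", "yellow", "green"]  # one colour per band

def band_colour(thickness_um: float) -> str:
    return BAND_COLOURS[bisect(BAND_EDGES_UM, thickness_um)]

print([band_colour(t) for t in (95.0, 104.0, 118.0)])   # ['red', 'yellow', 'green']
```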
The pointer 90 shown in FIG. 6 indicates the target position at which paint spraying is to be started. In the display image of FIG. 6, the pointer 90 moves in the direction indicated by the arrow 91. The pointer moves in a direction and at a speed suitable for forming the desired film thickness. By moving the coating apparatus 5 so as to follow the moving pointer, the worker 4 can perform appropriate painting work on regions that were left unpainted or whose film thickness is insufficient. For example, if the discharge amount of paint from the coating apparatus 5 is set constant, the film thickness formed by the painting work can be adjusted by adjusting the speed at which the pointer 90 moves.
The worker 4 may also be instructed in advance to spray paint from the coating apparatus 5 onto the painting object while the pointer 90 is displayed and to stop spraying when the pointer 90 disappears. This prevents paint from being sprayed onto the painting object wastefully, reducing the cost of the painting work. Such a work instruction image can also be projected onto the surface to be painted of the aircraft 11, which is the painting object. Furthermore, by displaying such a work instruction image on a see-through head-mounted display, the worker 4 can perform the painting work while checking the work instruction image superimposed on the work surface. This can greatly reduce the burden on the worker.
FIG. 7 is a flowchart showing the flow of processing by the painting work system according to the first embodiment. In step S201, the painting work guidance control unit 1 controls the movement of the gondola so as to move it to the painting position on the painting object, and the process proceeds to step S203. In step S203, the painting work guidance control unit 1 acquires from the measuring apparatus 10 the three-dimensional position information of the surface to be painted measured by the position measurement unit 101 and the image of the surface to be painted acquired by the imaging device 102, and the process proceeds to step S205. In step S205, the image acquisition unit 103 cuts out the image of the surface to be painted acquired by the imaging device 102 so that it covers an image range corresponding to the measurement pitch, and the process proceeds to step S207.
In step S207, it is determined whether the painting work has been completed. If the determination is affirmative, that is, if it is determined that the painting work has been completed, the process proceeds to step S209. If the determination is negative, that is, if it is determined that the painting work has not been completed, the process returns to step S201. In step S209, the whole image generation unit 106 joins the images cut out by the image acquisition unit 103 in step S205 to generate a whole image of the painting object, and the process proceeds to step S211. In step S211, the unmeasured portion extraction unit 107 determines whether there is any portion for which measurement position information has not been acquired, that is, whether there is an unmeasured region.
If the determination in step S211 is affirmative, that is, if it is determined that there is an unmeasured region, the process proceeds to step S213. If the determination in step S211 is negative, that is, if it is determined that there is no unmeasured region, this means that no region is left with the painting work unfinished, and the painting work guidance control unit 1 ends this routine. In step S213, the painting work guidance control unit 1 moves the gondola to the region determined in step S211 to be an unmeasured region. After the painting work has been performed there, the film thickness of the coating is measured by the same procedure as from step S201, and the process proceeds to step S215. In step S215, the coating film information acquisition unit 108 generates film thickness distribution information, and the painting work guidance control unit 1 determines whether the film thickness of the coating has reached a predetermined value. If the determination in step S215 is affirmative, that is, if it is determined that the film thickness of the coating has reached the predetermined value, the painting work guidance control unit 1 ends this routine. If the determination in step S215 is negative, that is, if it is determined that the film thickness of the coating has not reached the predetermined value, the process returns to step S213.
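To make the control flow of FIG. 7 explicit, the following runnable toy sketch mirrors the step structure (measure, stitch, extract unmeasured regions, then repaint and remeasure until the thickness is acceptable). Every function and value in it is a simplified stand-in with random toy data over a one-dimensional strip of patches; none of it is an interface from the description.

```python
# Toy sketch of the FIG. 7 loop; not the patent's implementation or data model.
import random

PATCH_COUNT = 5
TARGET_UM, TOLERANCE_UM = 120.0, 10.0

def measure_patch(i):
    # S203: toy "measurement" of one patch; None stands for missing position data.
    return random.choice([None, random.uniform(90.0, 130.0)])

def guidance_loop():
    thickness = [measure_patch(i) for i in range(PATCH_COUNT)]          # S201-S207
    whole = list(thickness)                                             # S209: "stitch"
    unmeasured = [i for i, t in enumerate(whole) if t is None]          # S211
    for i in unmeasured:                                                # S213-S215
        while True:
            whole[i] = random.uniform(110.0, 130.0)   # repaint + remeasure (toy)
            if abs(whole[i] - TARGET_UM) <= TOLERANCE_UM:
                break
    return whole

random.seed(0)
print([None if t is None else round(t, 1) for t in guidance_loop()])
```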
According to the first embodiment described above, the following operational effects are obtained.
(1) The painting work guidance control unit 1 acquires, from the measuring apparatus 10, position information for each portion of the object to be measured and image information of each measured portion of the object. The painting work guidance control unit 1 includes a whole image generation unit 106 that generates whole image information of the object to be measured from the image information for each measured portion, and an unmeasured portion extraction unit 107 that extracts, based on the whole image information, the portions of the object for which the position information has not been measured. With this configuration, the regions of the painting object for which distance measurement has been performed and the regions for which it has not can be distinguished. Information indicating the unmeasured regions can therefore be presented to the worker to prompt remeasurement or repainting. Furthermore, by having the painting work carried out on the basis of the unmeasured regions, the entire painting target region can be painted reliably and the painting work time can be shortened.
(2) In general, the out-of-service period of an aircraft must be kept short in view of its cost, so the painting work must be completed within a limited time. Demand for aircraft is also expected to increase. In the present embodiment, therefore, unmeasured regions are extracted as appropriate during the painting work and the extraction results are presented to the worker. The worker can thus carry out the painting work while checking the unmeasured regions, and can complete the painting promptly with less repainting of, and returning to, unpainted regions. As a result, the operating efficiency of the aircraft can be improved. In addition, by generating film thickness distribution information from the measurement results and performing appropriate painting work based on it, it is possible to prevent degraded fuel efficiency due to excessive film thickness, as well as reduced waterproofing, reduced rust prevention, and increased airframe surface temperature due to insufficient film thickness.
The following modifications are also within the scope of the present invention, and one or more of the modifications may be combined with the embodiment described above.
(Modification 1)
In the embodiment and modifications described above, an example was described in which image information extracted from captured image information is used as the image information. However, mesh data based on the position information measured by the position measurement unit 101 may be used as the image information. For example, the points of the point cloud data acquired by the position measurement unit 101 are connected to generate mesh data composed of a plurality of meshes (for example, triangular meshes). The unmeasured portion extraction unit 107 extracts, from the generated mesh data, the regions in which no mesh has been generated as unmeasured regions for distance measurement. Alternatively, the unmeasured regions for distance measurement may be extracted by comparing the point cloud data acquired by the position measurement unit 101 with the model data of the aircraft 11. The image generation unit 110 may also generate a two-dimensional image based on the three-dimensional data acquired by the measuring apparatus 10 and use that two-dimensional image data as the image information.
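One way to realize the "regions where no mesh is generated" idea is sketched below, using scipy's Delaunay triangulation with a maximum-edge-length filter as a stand-in for deciding whether a mesh element counts as measured surface. The edge-length threshold, the synthetic point grid with a missing patch, and the function name are assumptions for illustration only.

```python
# Sketch: flag oversized Delaunay triangles as gaps in the measured point cloud.
import numpy as np
from scipy.spatial import Delaunay

def unmeshed_gap_centroids(points_2d: np.ndarray, max_edge: float) -> np.ndarray:
    """Centroids of triangles too large to count as measured surface."""
    tri = Delaunay(points_2d)
    p = points_2d[tri.simplices]                       # (n_tri, 3, 2) vertex coords
    edges = np.linalg.norm(p - np.roll(p, 1, axis=1), axis=2)
    too_large = edges.max(axis=1) > max_edge
    return p[too_large].mean(axis=1)                   # one centroid per oversized triangle

# Synthetic measured grid with a hole around (0.5, 0.5): the hole shows up as
# oversized triangles whose centroids cluster over the unmeasured patch.
xs, ys = np.meshgrid(np.linspace(0, 1, 11), np.linspace(0, 1, 11))
pts = np.stack([xs.ravel(), ys.ravel()], axis=1)
hole = (np.abs(pts[:, 0] - 0.5) < 0.2) & (np.abs(pts[:, 1] - 0.5) < 0.2)
gaps = unmeshed_gap_centroids(pts[~hole], max_edge=0.15)
print(np.round(gaps.mean(axis=0), 2))   # roughly the centre of the synthetic hole
```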
(Modification 2)
In the embodiment and modifications described above, an example was described in which an aircraft is the painting object, but the painting object may be an automobile or a ship and is not particularly limited. The present invention can be applied to the analysis of the coating state of various objects to be painted.
The requirements of the embodiments described above can be combined as appropriate. Some components may not be used. In addition, to the extent permitted by law, the disclosures of all publications and US patents relating to the detection devices and the like cited in the embodiments and modifications above are incorporated herein by reference.
Although various embodiments and modifications have been described above, the present invention is not limited to them. Other aspects conceivable within the scope of the technical idea of the present invention are also included within the scope of the present invention.
DESCRIPTION OF REFERENCE NUMERALS: 10: measuring apparatus; 101: position measurement unit; 103: image acquisition unit; 106: whole image generation unit; 107: unmeasured portion extraction unit

Claims (13)

1.  An unmeasured region extraction device comprising:
    a position measurement unit that measures position information for each portion of an object to be measured;
    an image acquisition unit that acquires image information of each measured portion of the object to be measured, the portion being measured by the position measurement unit;
    a whole image generation unit that generates whole image information of the object to be measured from the image information for each measured portion; and
    an unmeasured portion extraction unit that extracts, based on the whole image information, a portion of the object to be measured for which the position information has not been measured.
2.  The unmeasured region extraction device according to claim 1, wherein
    the image information acquired by the image acquisition unit is image information obtained by an imaging unit whose imaging range is set so as to include the measured portion and which captures an image of the object to be measured when the position measurement unit measures the measured portion,
    the device further comprises an image position acquisition unit that acquires, from the position measurement unit, position information of the image information on the object to be measured and assigns the position information to each piece of the image information, and
    the whole image generation unit positions each piece of the image information for each measured portion based on the position information assigned to the image information and generates the whole image information of the object to be measured.
3.  The unmeasured region extraction device according to claim 2, wherein
    the image information acquired by the image acquisition unit is image information in which an image of a range that includes the measured portion and is wider than the measured portion has been captured, and
    the whole image generation unit joins the plurality of pieces of image information acquired by the image acquisition unit by aligning them with one another on the basis of regions corresponding to the same position on the object to be measured, and generates the whole image information of the object to be measured.
4.  The unmeasured region extraction device according to claim 3, wherein
    the image acquisition unit sets an image generation range of the image information according to a measurement pitch of the position measurement unit.
5.  The unmeasured region extraction device according to claim 4, wherein
    the image acquisition unit sets the image generation range of the image information according to the measurement pitch of the position measurement unit and the orientation of the surface of the object to be measured in the region indicated by the image information.
6.  The unmeasured region extraction device according to claim 5, wherein
    the orientation of the surface of the object to be measured in the region indicated by the image information is calculated from the position information obtained by the position measurement unit.
7.  The unmeasured region extraction device according to any one of claims 1 to 6, wherein
    the image information is mesh data based on the position information measured by the position measurement unit.
8.  The unmeasured region extraction device according to any one of claims 1 to 7, further comprising
    a model data alignment unit that aligns the whole image of the object to be measured with model data of the object to be measured.
9.  The unmeasured region extraction device according to claim 8, wherein
    position information from the position measurement unit assigned to the plurality of pieces of image information is associated with each region of the whole image, and
    the model data alignment unit uses the position information from the position measurement unit when aligning the model data of the object to be measured with the whole image.
10.  A work guidance device comprising:
    the unmeasured region extraction device according to any one of claims 1 to 9; and
    a display unit that aligns and displays the whole image of the object to be measured and the model data of the object to be measured.
11.  The work guidance device according to claim 10, wherein
    the work guidance device provides guidance on measurement work to be performed on the object to be measured.
12.  The work guidance device according to claim 10 or 11, further comprising
    a coating film information acquisition unit that acquires thickness information of a coating film formed on the object to be measured from difference information between the position information of the object to be measured obtained by the position measurement unit and the model data of the object to be measured, wherein
    the display unit displays the thickness information of the coating film in alignment with the model data of the object to be measured.
13.  A work guidance method comprising:
    measuring position information for each portion of an object to be measured;
    acquiring image information of each measured portion of the object to be measured;
    generating whole image information of the object to be measured from the image information for each measured portion;
    acquiring thickness information of a coating film formed on the object to be measured from difference information between the position information of the object to be measured and model data of the object to be measured; and
    aligning and displaying the whole image of the object to be measured, the thickness information of the coating film, and the model data of the object to be measured.
PCT/JP2017/014280 2017-04-05 2017-04-05 Unmeasured area extraction device, work guidance device, and work guidance method WO2018185893A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/014280 WO2018185893A1 (en) 2017-04-05 2017-04-05 Unmeasured area extraction device, work guidance device, and work guidance method

Publications (1)

Publication Number Publication Date
WO2018185893A1 (en)

Family

ID=63713407

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/014280 WO2018185893A1 (en) 2017-04-05 2017-04-05 Unmeasured area extraction device, work guidance device, and work guidance method

Country Status (1)

Country Link
WO (1) WO2018185893A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020105311A1 (en) * 2018-11-21 2020-05-28 三菱重工業株式会社 Position measurement system and position measurement method
US12025422B2 (en) 2018-11-21 2024-07-02 Mitsubishi Heavy Industries, Ltd. Position measurement system and position measurement method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1047935A (en) * 1996-08-06 1998-02-20 Topcon Corp Picture superimposing type measuring system
JP2002074347A (en) * 2000-08-31 2002-03-15 Minolta Co Ltd Information aquiring system
JP2013092456A (en) * 2011-10-26 2013-05-16 Topcon Corp Image measuring device
US20130260016A1 (en) * 2012-04-02 2013-10-03 The Boeing Company Sealant Analysis System
US20150317070A1 (en) * 2014-04-11 2015-11-05 Ikegps Group Limited Mobile handheld instruments and methods
JP2016151423A (en) * 2015-02-16 2016-08-22 株式会社トプコン Posture detection device and data acquisition device


Similar Documents

Publication Publication Date Title
KR101572917B1 (en) Graphical application system
JP5824054B2 (en) Surface sputtering equipment
US9914150B2 (en) Graphical application system
US8554395B2 (en) Method and system for facilitating autonomous landing of aerial vehicles on a surface
CN111192189A (en) Three-dimensional automatic detection method and system for automobile appearance
US20100318319A1 (en) Projection apparatus
WO2018185891A1 (en) Coating film state analysis device and film thickness measurement device
US10532561B2 (en) Metrology-based path planning for inkjet printing along a contoured surface
US11865852B2 (en) Robotic livery printing system
US20230386080A1 (en) System and Method For Utilization of Displacement Sensor During Placement of Vehicle Service Fixture
CN104182095B (en) Portable self-positioning laser 3D optical projection systems
WO2018185893A1 (en) Unmeasured area extraction device, work guidance device, and work guidance method
WO2018185890A1 (en) Coating assistance device, coating device, coating work assistance method, production method for coated article, and coating assistance program
WO2018185892A1 (en) Position detecting device and painting assisting system
KR20130073161A (en) Coating apparatus applied to the surface of hull and control method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17904852

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17904852

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP