WO2020021601A1 - Image correction apparatus and image correction method - Google Patents

Image correction apparatus and image correction method

Info

Publication number
WO2020021601A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
correction
imaging target
imaging
time
Prior art date
Application number
PCT/JP2018/027529
Other languages
English (en)
Japanese (ja)
Inventor
孝一 折戸
Original Assignee
Mitsubishi Electric Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation filed Critical Mitsubishi Electric Corporation
Priority to PCT/JP2018/027529 priority Critical patent/WO2020021601A1/fr
Priority to JP2019504145A priority patent/JP6502003B1/ja
Priority to CN201880095738.7A priority patent/CN112425147B/zh
Publication of WO2020021601A1 publication Critical patent/WO2020021601A1/fr


Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B19/00Cameras
    • G03B19/02Still-picture cameras
    • G03B19/04Roll-film cameras
    • G03B19/07Roll-film cameras having more than one objective
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B5/00Adjustment of optical system relative to image or object surface other than for focusing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • The present invention relates to an image correction device and an image correction method for correcting an image obtained by an imaging device that images a moving imaging target.
  • Patent Literature 1 discloses a technique for correcting shake occurring in an image of a ladle in a steelworks captured by an infrared camera.
  • The present invention has been made in view of the above, and an object of the present invention is to provide an image correction device that appropriately corrects blur that occurs in a captured image of a moving imaging target and that is caused by the integration of luminance values.
  • To achieve the above object, the present invention provides an image correction device comprising a dividing unit that divides an image, obtained during the time in which one frame is formed by a first imaging device that images a moving imaging target, into a final time image that is the image formed at the final time of that period and a non-final time image that is the image formed before the final time, and that further divides each of the final time image and the non-final time image, based on the position of the imaging target, into a plurality of regions according to the exposure time required when each portion of the final time image and the non-final time image was formed.
  • The image correction device further comprises a correction unit configured to correct the luminance value of each of the plurality of regions included in the final time image and the non-final time image obtained by the dividing unit.
  • For the plurality of regions included in the final time image, the correction unit corrects the luminance value of a correction target region using a first correction value corresponding to the exposure time required when the correction target region was formed.
  • For the plurality of regions included in the non-final time image, the correction unit corrects the luminance value of a correction target region using the first correction value corresponding to the exposure time required when the correction target region was formed and a second correction value corresponding to the non-final time image.
  • The image correction device according to the present invention has the effect that blur occurring in a captured image of a moving imaging target and caused by the integration of luminance values can be appropriately corrected.
  • FIG. 1 is a diagram showing a configuration of an imaging system according to a first embodiment.
  • FIG. 2 is a diagram illustrating an example of images obtained when each of a first imaging device and a second imaging device included in the imaging system according to the first embodiment images a moving imaging target.
  • FIG. 3 is a diagram showing a configuration of an image correction device according to the first embodiment.
  • FIG. 4 is a diagram for explaining functions of a dividing unit included in the image correction device according to the first embodiment.
  • FIG. 5 is a diagram for explaining a first correction value stored in a correction value storage unit included in the image correction device according to the first embodiment.
  • FIG. 6 is a flowchart illustrating an operation procedure of the image correction device according to the first embodiment.
  • FIG. 7 is a diagram schematically illustrating a distribution of luminance values obtained after correction by a correction unit included in the image correction device according to the first embodiment.
  • FIG. 8 is a diagram for explaining correction performed by the image correction device according to the first embodiment.
  • FIG. 9 is a diagram for explaining functions of the dividing unit included in the image correction device according to a second embodiment.
  • FIG. 10 is a diagram for explaining a second correction value stored in the correction value storage unit included in the image correction device according to the second embodiment.
  • FIG. 11 is a diagram illustrating a configuration of an image correction device according to a third embodiment.
  • FIG. 12 is a diagram for explaining a function of an interpolation image generation unit included in the image correction device according to the third embodiment.
  • FIG. 13 is a diagram showing a situation in which the image correction devices according to the first to third embodiments are connected to an encoder and a control device.
  • FIG. 1 is a diagram illustrating a configuration of an imaging system 50 according to the first embodiment.
  • the imaging system 50 includes the image correction device 1 that corrects an image. Details of the image correction device 1 will be described later.
  • The imaging system 50 includes a first imaging device 21 that images the moving imaging target 23 at a first frame rate, and a second imaging device 22 that images the imaging target 23 at a second frame rate higher than the first frame rate at which the first imaging device 21 images the imaging target 23.
  • the first imaging device 21 and the second imaging device 22 are arranged side by side.
  • FIG. 1 also shows an imaging target 23.
  • the frame rate means the number of frames obtained by imaging per unit time.
  • An example of the first imaging device 21 is an infrared camera.
  • An example of the second imaging device 22 is a visible light camera.
  • An example of the first frame rate is 60 fps, and an example of the second frame rate is 1200 fps.
  • In the first embodiment, however, it is assumed that the first frame rate is 60 fps and the second frame rate is 180 fps.
  • the angle of view of the first imaging device 21 and the angle of view of the second imaging device 22 are the same.
  • FIG. 1 also shows a belt conveyor 24.
  • the imaging target 23 moves in the direction of the arrow 25.
  • An example of the imaging target 23 is corrugated cardboard produced by a hot-melt process. In the first embodiment, it is assumed that the imaging target 23 moves linearly at a constant speed.
  • FIG. 2 is a diagram illustrating an example of the images obtained when each of the first imaging device 21 and the second imaging device 22 included in the imaging system 50 according to the first embodiment images the moving imaging target 23. Specifically, FIG. 2A shows an example of an image obtained when the first imaging device 21 images the moving imaging target 23, and FIG. 2B shows an example of an image obtained when the second imaging device 22 images the moving imaging target 23. In each figure from FIG. 2 onward, except FIG. 13, the imaging target 23 is depicted as a triangle for convenience of explanation. In the following, it is assumed that the imaging target 23 is a triangle.
  • The second imaging device 22 images the imaging target 23 and obtains an image at each of the times 1/180 second, 2/180 second, and 3/180 second after the initial time of imaging.
  • The first imaging device 21 obtains an image of the imaging target 23 at 1/60 second, that is, 3/180 second, after the initial time of imaging.
  • The period from the initial time of imaging to 1/60 second afterward is the time in which one frame is formed by the first imaging device 21. That is, the time 3/180 second after the initial time of imaging in FIG. 2 is the final time of the period in which one frame is formed by the first imaging device 21, and the image at 3/180 second after the initial time of imaging in FIG. 2A is the image obtained during that period. In FIGS. 2A and 2B, the image of the imaging target 23 shown at the initial time of imaging is the image obtained by the second imaging device 22 at the initial time of imaging.
  • As described above, the first frame rate is 60 fps and the second frame rate is 180 fps. That is, the time in which the first imaging device 21 obtains one image is three times the time in which the second imaging device 22 obtains one image.
  • Meanwhile, the imaging target 23 is moving. Therefore, as shown in the image at 1/60 second after the initial time of imaging in FIG. 2A, at 1/60 second after the initial time of imaging the first imaging device 21 obtains a blurred image 23a showing the imaging target 23 at each of the positions it occupied from the initial time of imaging until 1/60 second afterward.
  • In other words, when the first imaging device 21 obtains the image of the imaging target 23 at 1/60 second after the initial time of imaging, it also obtains a blurred image 23a generated by the integration of the luminance values of parts of the moving imaging target 23.
  • That is, at 1/60 second after the initial time of imaging, the first imaging device 21 obtains both the image of the imaging target 23 and the blurred image 23a.
  • The blurred image 23a is an image that is not obtained by the second imaging device 22 at 1/60 second after the initial time of imaging.
  • The image of the imaging target 23 obtained by the first imaging device 21 at 1/60 second after the initial time of imaging therefore includes portions in which the luminance values of parts of the moving imaging target 23 are integrated.
  • The purpose of the image correction device 1 is to obtain, every 1/60 second, an image similar to the image obtained by the second imaging device 22.
  • To this end, the image correction device 1 corrects an image that includes the blurred image 23a and the image of the imaging target 23, such as the image at 1/60 second after the initial time of imaging in FIG. 2A, to obtain an image of the imaging target 23. That is, the image correction device 1 corrects the image obtained by the first imaging device 21 that images the moving imaging target 23.
  • FIG. 3 is a diagram illustrating a configuration of the image correction device 1 according to the first embodiment.
  • the image correction device 1 includes a trigger generation unit 2 that generates an imaging trigger for causing each of the first imaging device 21 and the second imaging device 22 to start imaging.
  • FIG. 3 also shows a first imaging device 21 and a second imaging device 22.
  • the image correction device 1 includes a communication unit 3 that simultaneously transmits the imaging trigger generated by the trigger generation unit 2 to each of the first imaging device 21 and the second imaging device 22.
  • the first imaging device 21 and the second imaging device 22 receive the imaging trigger and start imaging the imaging target 23.
  • The communication unit 3 has a function of receiving, from the first imaging device 21, the image data obtained when the first imaging device 21 images the imaging target 23, and a function of receiving, from the second imaging device 22, the image data obtained when the second imaging device 22 images the imaging target 23.
  • the image correction device 1 further includes an image storage unit 4 that stores the image data received by the communication unit 3 from the first imaging device 21 and the image data received from the second imaging device 22.
  • The image correction device 1 further includes a feature extraction unit 5 that extracts feature points of the image of the imaging target 23 based on the image data stored in the image storage unit 4 and the image data received by the communication unit 3 from the second imaging device 22.
  • An example of a feature point is an edge of the imaging target 23. In the first embodiment, the feature points are the three vertices of the triangular imaging target 23. The shape and size of the triangular imaging target 23 are specified based on the three vertices.
  • the image correction device 1 further has a feature storage unit 6 that stores data indicating feature points extracted by the feature extraction unit 5.
  • The image correction device 1 further includes a movement amount calculation unit 7 that calculates the movement amount per unit time of the feature points indicated by the data stored in the feature storage unit 6. Specifically, the movement amount calculation unit 7 calculates the movement amount of a feature point during the time in which one frame is formed by the second imaging device 22, based on the corresponding feature points in two adjacent frames imaged by the second imaging device 22.
  • the image correction apparatus 1 further includes a movement amount storage unit 8 that stores data indicating the movement amount per unit time of the feature point obtained by the movement amount calculation unit 7 performing the calculation.
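  • As an illustration of this calculation, the following sketch computes the displacement of corresponding feature points between two adjacent frames of the second imaging device 22; the function and variable names are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def movement_per_frame(prev_points: np.ndarray, curr_points: np.ndarray) -> np.ndarray:
    """Movement amount of the feature points between two adjacent frames of
    the second (high-frame-rate) imaging device, i.e. per 1/180 second here.

    prev_points and curr_points have shape (N, 2): the (x, y) coordinates of
    the same N feature points (e.g. the triangle's three vertices) in two
    adjacent frames. Returns the mean displacement vector."""
    return (curr_points - prev_points).mean(axis=0)

# Example: the three vertices of the triangular imaging target 23, which has
# moved y0 = 5 pixels along x between two adjacent frames.
prev = np.array([[10.0, 40.0], [30.0, 40.0], [20.0, 20.0]])
curr = prev + np.array([5.0, 0.0])
print(movement_per_frame(prev, curr))  # -> [5. 0.]
```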
  • The image correction device 1 further includes a corrected image specifying unit 9 that specifies a correction target image, which is an image based on the image data stored in the image storage unit 4 and received by the communication unit 3 from the first imaging device 21, by extracting feature points of that image. Specifically, the corrected image specifying unit 9 specifies as the correction target image the trapezoidal image including the image of the imaging target 23 and the blurred image 23a at 1/60 second after the initial time of imaging in FIG. 2A.
  • the feature storage unit 6 also stores data indicating the feature points extracted by the corrected image specifying unit 9. More specifically, the feature storage unit 6 also stores data indicating the correction target image.
  • The image correction device 1 further includes a dividing unit 10 that divides the correction target image, which is the image obtained during the time in which one frame is formed by the first imaging device 21 that images the moving imaging target 23, into a final time image that is the image formed at the final time of that period and a non-final time image that is the image formed before the final time.
  • The dividing unit 10 divides each of the final time image and the non-final time image, based on the position of the imaging target 23, into a plurality of regions according to the exposure time required when each part of the final time image and the non-final time image was formed.
  • the position of the imaging target 23 is specified by the image data received by the communication unit 3 from the second imaging device 22. More specifically, the position of the imaging target 23 is specified by the data stored in the movement amount storage unit 8.
  • Based on the data stored in the feature storage unit 6 and the data stored in the movement amount storage unit 8, the dividing unit 10 also identifies the shape and size of the imaging target 23 in the captured image.
  • Based on the position of the imaging target 23 at 1/60 second after the initial time of imaging in FIG. 2A, the dividing unit 10 divides the correction target image specified by the corrected image specifying unit 9 into the final time image, which is the image of the imaging target 23, and the non-final time image, which is the blurred image 23a.
  • FIG. 4 is a diagram for explaining a function of the dividing unit 10 included in the image correction device 1 according to the first embodiment. More specifically, FIG. 4 illustrates the correction target image, which includes the image of the imaging target 23 and the blurred image 23a at 1/60 second after the initial time of imaging in FIG. 2A. As described above, the imaging target 23 moves linearly at a constant speed. Therefore, during the 1/60 second from the initial time of imaging, the first imaging device 21 obtains a correction target image in which the triangular image having the three vertices P0, Q0, and R0 in FIG. 4 moves stepwise in the direction of the arrow 25a.
  • Specifically, at the initial time of imaging, the first imaging device 21 obtains a triangular image having the three vertices P0, Q0, and R0 in FIG. 4.
  • At 1/180 second after the initial time of imaging, the first imaging device 21 obtains a triangular image having the three vertices P1, Q1, and R1 in FIG. 4, and at 2/180 second after the initial time of imaging, a triangular image having the three vertices P2, Q2, and R2 in FIG. 4.
  • At 1/60 second, that is, 3/180 second after the initial time of imaging, the first imaging device 21 obtains a triangular image having the three vertices P3, Q3, and R3 in FIG. 4. The images obtained at these four times are integrated, so that the first imaging device 21 obtains a trapezoidal correction target image having the four vertices P0, Q0, R3, and P3.
  • In FIG. 4, “y0” indicates the movement amount of the triangular image in each 1/180 second obtained by dividing the 1/60 second from the initial time of imaging into three equal parts. Data indicating the movement amount is stored in the movement amount storage unit 8. Since the imaging target 23 moves linearly at a constant speed, the movement amount y0 of the triangular image in each 1/180 second is constant.
  • In the correction target image of FIG. 4 taken by the first imaging device 21, a code B is assigned to the four regions forming the triangular image having the three vertices P3, Q3, and R3.
  • The image constituted by all the regions to which the code B is assigned is the final time image, formed at the final time of the period in which the correction target image is formed.
  • The exposure time required to form each of the four regions to which the code B is assigned differs from the exposure times required to form the other regions.
  • The image constituted by all the regions to which the code A is assigned is the non-final time image, formed before the final time of that period.
  • A part of the triangular image at the initial time of imaging overlaps the triangular images at 1/180 second, 2/180 second, and 3/180 second after the initial time of imaging.
  • A part of the triangular image at 1/180 second after the initial time of imaging overlaps the triangular images at 2/180 second and 3/180 second after the initial time of imaging.
  • A part of the triangular image at 2/180 second after the initial time of imaging overlaps the triangular image at 3/180 second after the initial time of imaging.
  • The number appended to the code A or the code B is obtained by adding 1 to the number of overlaps of these triangular images in each 1/180 second obtained by dividing the 1/60 second from the initial time of imaging into three equal parts.
  • the number of overlaps corresponds to the exposure time. Specifically, the exposure time required when the area to which the code A1 is assigned is formed is shorter than the exposure time required when the area to which the code A2 is assigned is formed. The exposure time required when the area to which the code A2 is assigned is formed is shorter than the exposure time required when the area to which the code A3 is assigned is formed.
  • the exposure time required when the area to which the code B1 is assigned is formed is shorter than the exposure time required when the area to which the code B2 is assigned is formed.
  • the exposure time required when the area to which the code B2 is assigned is formed is shorter than the exposure time required when the area to which the code B3 is assigned is formed.
  • the exposure time required when the area to which the code B3 is assigned is formed is shorter than the exposure time required when the area to which the code B4 is assigned is formed.
  • As described above, the dividing unit 10 divides each of the final time image and the non-final time image, based on the position of the imaging target 23, into a plurality of regions according to the exposure time required when each part of the final time image and the non-final time image was formed.
  • The position of the imaging target 23 is specified from the image data received by the communication unit 3 from the second imaging device 22; specifically, it is specified from the data stored in the movement amount storage unit 8.
  • In the example of FIG. 4, the dividing unit 10 divides the final time image into the region to which the code B1 is assigned, the region to which the code B2 is assigned, the region to which the code B3 is assigned, and the region to which the code B4 is assigned.
  • The dividing unit 10 divides the non-final time image into the region to which the code A1 is assigned, the region to which the code A2 is assigned, and the region to which the code A3 is assigned.
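  • As an illustration of this division, the following sketch shifts a binary mask of the target by the movement amount per sub-interval and counts, per pixel, how many target positions overlap there; the count is the exposure-time label of each region. This is a one-dimensional toy example under the constant-velocity assumption of the first embodiment, and all names and values are illustrative, not taken from the patent.

```python
import numpy as np

def overlap_counts(target_mask: np.ndarray, shift_px: int, steps: int = 3) -> np.ndarray:
    """Per-pixel count of how many sub-interval positions of the target cover
    that pixel while one frame of the first imaging device is formed.

    target_mask: binary image of the target at the initial time of imaging.
    shift_px:    movement of the target per sub-interval, in pixels (y0).
    steps:       number of sub-intervals (3 for 60 fps vs. 180 fps).
    Note: np.roll wraps around; a real implementation would translate instead."""
    counts = np.zeros_like(target_mask)
    for k in range(steps + 1):  # target positions at 0, 1, ..., steps
        counts += np.roll(target_mask, k * shift_px, axis=1)
    return counts

mask = np.zeros((1, 12), dtype=int)
mask[0, 0:4] = 1                            # a 4-pixel-wide target
counts = overlap_counts(mask, shift_px=1)   # exposure-time label per pixel
final = np.roll(mask, 3, axis=1)            # target position at the final time
print(counts * final)                       # final time image regions (B1..B4)
print(counts * (1 - final))                 # non-final time regions (A1..A3)
```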
  • the image correction device 1 further includes a correction value storage unit 11 that stores a first correction value and a second correction value used when correcting an image.
  • FIG. 5 is a diagram for explaining the first correction value stored in the correction value storage unit 11 included in the image correction device 1 according to the first embodiment.
  • the first correction value corresponds to the exposure time required when the region to be corrected is formed.
  • FIG. 5 shows an example in which the first correction value decreases as the exposure time increases.
  • FIG. 5 shows four first correction values α1, α2, α3, and α4.
  • α1 is the first correction value used when correcting the luminance values of the regions to which the codes A1 and B1 are assigned.
  • α2 is the first correction value used when correcting the luminance values of the regions to which the codes A2 and B2 are assigned.
  • α3 is the first correction value used when correcting the luminance values of the regions to which the codes A3 and B3 are assigned.
  • α4 is the first correction value used when correcting the luminance value of the region to which the code B4 is assigned.
  • The image correction device 1 further includes a correction unit 12 that corrects the luminance value of each of the plurality of regions included in the final time image and the non-final time image obtained by the dividing unit 10. For the plurality of regions included in the final time image, the correction unit 12 corrects the luminance value of a correction target region using the first correction value corresponding to the exposure time required when the correction target region was formed. The first correction value is stored in the correction value storage unit 11.
  • For example, the correction unit 12 corrects the luminance value of the region to which the code B1 is assigned using the first correction value α1, which corresponds to the exposure time required when that region was formed.
  • Likewise, the correction unit 12 corrects the luminance value of the region to which the code B2 is assigned using the first correction value α2, which corresponds to the exposure time required when that region was formed.
  • For the plurality of regions included in the non-final time image, the correction unit 12 corrects the luminance value of a correction target region using the first correction value corresponding to the exposure time required when the correction target region was formed and the second correction value corresponding to the non-final time image.
  • The first correction value and the second correction value are stored in the correction value storage unit 11.
  • The second correction value is a value smaller than one.
  • For example, the correction unit 12 corrects the luminance value of the region to which the code A1 is assigned using the first correction value α1, which corresponds to the exposure time required when that region was formed, and the second correction value corresponding to the non-final time image.
  • Likewise, the correction unit 12 corrects the luminance value of the region to which the code A2 is assigned using the first correction value α2 and the second correction value corresponding to the non-final time image.
  • When the luminance value before correction of a region to which a code Xn is assigned is Xn, and the luminance value after correction is Xn', the correction unit 12 corrects the luminance value before correction according to the following equation (1), where X is A or B, n is an integer from 1 to 4, and α0 is the second correction value:
  • Xn' = αn × Xn (X = B); Xn' = α0 × αn × Xn (X = A) ... (1)
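  • A minimal sketch of this correction follows, assuming the form of equation (1) given above; the numeric correction values (αn inversely proportional to the overlap count, α0 = 0.5) are illustrative assumptions only.

```python
import numpy as np

ALPHA = {1: 1.0, 2: 1 / 2, 3: 1 / 3, 4: 1 / 4}  # first correction values α1..α4
ALPHA_0 = 0.5                                   # second correction value α0 (< 1)

def apply_equation_1(image, counts, final_mask):
    """Equation (1): Xn' = αn·Xn for a region Bn of the final time image,
    and Xn' = α0·αn·Xn for a region An of the non-final time image."""
    out = np.zeros_like(image, dtype=float)
    for n, alpha in ALPHA.items():
        b = (counts == n) & (final_mask == 1)   # region Bn
        a = (counts == n) & (final_mask == 0)   # region An
        out[b] = alpha * image[b]
        out[a] = ALPHA_0 * alpha * image[a]
    return out

# Before correction, luminance grows with the overlap count (exposure time).
counts = np.array([[1, 2, 3, 4, 3, 2, 1]])
final_mask = np.array([[0, 0, 0, 1, 1, 1, 1]])
image = counts * 80.0
print(apply_equation_1(image, counts, final_mask))
# [[40. 40. 40. 80. 80. 80. 80.]] -> a uniform level for B1..B4 and a smaller
# uniform level for A1..A3, matching the distribution shown in FIG. 7
```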
  • the image correction device 1 further includes a corrected image generation unit 13 that generates a corrected image based on the luminance value obtained by the correction unit 12.
  • the image correction device 1 further includes a display unit 14 that displays the corrected image generated by the corrected image generation unit 13.
  • An example of the display unit 14 is a liquid crystal display device.
  • An example of the image storage unit 4, the feature storage unit 6, the movement amount storage unit 8, and the correction value storage unit 11 included in the image correction device 1 is a semiconductor memory.
  • FIG. 6 is a flowchart of an operation procedure of the image correction apparatus 1 according to the first embodiment.
  • the trigger generation unit 2 generates an imaging trigger for causing each of the first imaging device 21 and the second imaging device 22 to start imaging.
  • the communication unit 3 simultaneously transmits the imaging trigger generated by the trigger generation unit 2 to each of the first imaging device 21 and the second imaging device 22 (S1).
  • Next, the communication unit 3 receives, from the first imaging device 21, the image data obtained when the first imaging device 21 imaged the imaging target 23, and receives, from the second imaging device 22, the image data obtained when the second imaging device 22 imaged the imaging target 23 (S2).
  • the feature extracting unit 5 extracts feature points of the image of the imaging target 23 which is an image based on the image data received by the communication unit 3 from the second imaging device 22 (S3).
  • the movement amount calculation unit 7 calculates the movement amount per unit time of the feature point extracted by the feature extraction unit 5 (S4).
  • the correction image specifying unit 9 specifies a correction target image by extracting feature points of an image based on the image data received by the communication unit 3 from the first imaging device 21 (S5).
  • The dividing unit 10 divides the correction target image, which is the image obtained by the first imaging device 21 that images the moving imaging target 23, into the final time image formed at the final time of the period in which the correction target image is formed and the non-final time image formed before that final time (S6).
  • The dividing unit 10 also divides each of the final time image and the non-final time image, based on the position of the imaging target 23, into a plurality of regions according to the exposure time required when each part was formed (S6).
  • The correction unit 12 corrects the luminance value of each of the plurality of regions included in the final time image and the non-final time image obtained by the dividing unit 10; that is, the correction unit 12 corrects the correction target image (S7).
  • The corrected image generation unit 13 generates a corrected image based on the luminance values obtained by the correction unit 12 (S8).
  • The display unit 14 displays the corrected image generated by the corrected image generation unit 13 (S9).
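  • The steps S1 to S9 above can be summarized in the following skeleton. The unit objects and their method names are assumptions made for illustration; the patent does not define these interfaces.

```python
def image_correction_cycle(units, dev1, dev2):
    """One pass through steps S1-S9 of the flowchart of FIG. 6 (sketch)."""
    trigger = units.trigger_generation.generate()
    units.communication.send_simultaneously(trigger, dev1, dev2)         # S1
    img1, imgs2 = units.communication.receive_image_data(dev1, dev2)     # S2
    feature_points = units.feature_extraction.extract(imgs2)             # S3
    movement = units.movement_calculation.per_unit_time(feature_points)  # S4
    target_image = units.corrected_image_specifying.specify(img1)        # S5
    regions = units.dividing.divide(target_image, movement)              # S6
    luminances = units.correction.correct(regions)                       # S7
    corrected = units.corrected_image_generation.generate(luminances)    # S8
    units.display.show(corrected)                                        # S9
    return corrected
```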
  • As described above, the image correction device 1 divides the correction target image, which is the image obtained during the time in which one frame is formed by the first imaging device 21 that images the moving imaging target 23, into the final time image formed at the final time of that period and the non-final time image formed before the final time.
  • The image correction device 1 divides each of the final time image and the non-final time image, based on the position of the imaging target 23, into a plurality of regions according to the exposure time required when each part of the final time image and the non-final time image was formed.
  • For the plurality of regions included in the final time image, the image correction device 1 corrects the luminance value of a correction target region using the first correction value corresponding to the exposure time required when the correction target region was formed.
  • For the plurality of regions included in the non-final time image, the image correction device 1 corrects the luminance value of a correction target region using the first correction value corresponding to the exposure time required when the correction target region was formed and the second correction value corresponding to the non-final time image.
  • FIG. 7 is a diagram schematically illustrating a distribution of luminance values obtained by performing correction by the correction unit 12 included in the image correction apparatus 1 according to the first embodiment.
  • FIG. 7 shows that the luminance values of the areas to which the codes B1, B2, B3, and B4 are assigned are the same.
  • FIG. 7 shows that the luminance values of the regions to which the codes A1, A2, and A3 are assigned are smaller than the luminance values of the regions to which the codes B1, B2, B3, and B4 are assigned, and are equal to one another.
  • FIG. 8 is a diagram for explaining correction performed by the image correction apparatus 1 according to the first embodiment.
  • FIG. 8A is the same as FIG. 2A, and shows the correction target image at the final time of imaging.
  • FIG. 8B illustrates an image obtained by the image correction device 1 correcting the correction target image at the final time of imaging.
  • the image shown at the final time of imaging in FIG. 8B is an image obtained based on the distribution of luminance values in FIG.
  • FIG. 8C is the same as FIG. 2B, and shows the images obtained by the second imaging device 22 at the initial time of imaging and at 1/180 second, 2/180 second, and 3/180 second after the initial time of imaging.
  • As FIG. 8B shows, the image correction device 1 can obtain an image of the imaging target 23 that does not include the blurred image 23a at 3/180 second after the initial time of imaging.
  • The image correction device 1 can thus appropriately correct blur that occurs in a captured image of a moving imaging target and that is caused by the integration of luminance values. More specifically, even in a situation where the second imaging device 22 cannot be used and only the first imaging device 21 is used, an image similar to the image obtained by the second imaging device 22 can be obtained every 1/60 second.
  • In addition, by correcting the luminance values in consideration of the exposure time after the first imaging device 21 has formed one frame, the image correction device 1 can obtain an image of the imaging target 23. That is, when the first imaging device 21 is an infrared camera used to determine whether corrugated cardboard produced by a hot-melt process has been produced properly, the image correction device 1 can generate an image for determining whether the cardboard has been properly boxed by the hot-melt process. Since the image correction device 1 performs correction not for each pixel but for each region in which a plurality of pixels are grouped, the correction can be performed relatively quickly.
  • Note that data indicating the shape and size of the imaging target 23 in the captured image may be stored in the feature storage unit 6 in advance. More specifically, data indicating the shape and size of the imaging target 23 in the image captured by the second imaging device 22 may be stored in the feature storage unit 6 in advance.
  • Embodiment 2. Next, the image correction device 1 according to the second embodiment will be described.
  • In the first embodiment, the imaging target 23 moves linearly at a constant speed.
  • In the second embodiment, the imaging target 23 undergoes accelerated motion.
  • the configuration of the image correction device 1 according to the second embodiment is the same as the configuration of the image correction device 1 according to the first embodiment.
  • In the following, the portions different from the first embodiment will be described.
  • In the second embodiment, the dividing unit 10 divides the non-final time image into a plurality of regions according to the exposure time required when each part of the non-final time image was formed and the acceleration of the imaging target 23.
  • FIG. 9 is a diagram for explaining functions of the dividing unit 10 included in the image correction device 1 according to the second embodiment.
  • In the second embodiment, at the initial time of imaging, the first imaging device 21 obtains a triangular image having the three vertices P0, Q0, and R0 in FIG. 9.
  • At 1/180 second after the initial time of imaging, the first imaging device 21 obtains a triangular image having the three vertices P1, Q1, and R1 in FIG. 9.
  • At 2/180 second after the initial time of imaging, the first imaging device 21 obtains a triangular image having the three vertices P2, Q2, and R2 in FIG. 9.
  • At 1/60 second, that is, 3/180 second after the initial time of imaging, the first imaging device 21 obtains a triangular image having the three vertices P3, Q3, and R3 in FIG. 9. The images obtained at these four times are integrated, so that the first imaging device 21 obtains a trapezoidal correction target image having the four vertices P0, Q0, R3, and P3.
  • The movement amount of the triangular image is y1 from the initial time of imaging to 1/180 second, y2 from 1/180 second to 2/180 second, and y3 from 2/180 second to 3/180 second.
  • Because the imaging target 23 undergoes accelerated motion, each of the movement amounts y1, y2, and y3 differs from the other two.
  • Each of the movement amounts y1, y2, and y3 corresponds to the acceleration of the imaging target 23.
  • In FIG. 9, as in the first embodiment, the code B is assigned to the four regions constituting the triangular image having the three vertices P3, Q3, and R3.
  • The image constituted by all the regions to which the code B is assigned is the final time image, formed at the final time of the period in which the correction target image is formed.
  • The image constituted by all the regions to which the code A is assigned is the non-final time image, formed before the final time of the period in which the correction target image is formed.
  • The dividing unit 10 divides the non-final time image into a plurality of regions according to the exposure time required when each part of the non-final time image was formed and the acceleration of the imaging target 23. That is, since the movement amounts y1, y2, and y3 each correspond to the acceleration of the imaging target 23, the dividing unit 10 divides the non-final time image into six regions, to each of which one of the codes A4 to A9 is assigned.
  • FIG. 10 is a diagram for explaining the second correction value stored in the correction value storage unit 11 included in the image correction device 1 according to the second embodiment.
  • the second correction value according to the second embodiment corresponds to the non-final time image and differs depending on the acceleration of the imaging target 23.
  • FIG. 10 shows an example in which the second correction value decreases as the moving amount of the triangle image per 1/180 second increases.
  • The second correction value when the movement amount of the triangular image per 1/180 second is y1 is β1.
  • The second correction value when the movement amount of the triangular image per 1/180 second is y2 is β2.
  • The second correction value when the movement amount of the triangular image per 1/180 second is y3 is β3.
  • For the region to which the code A4 is assigned, the first correction value is α1 and the second correction value is β1.
  • For the region to which the code A5 is assigned, the first correction value is α1 and the second correction value is β2.
  • For the region to which the code A6 is assigned, the first correction value is α1 and the second correction value is β3.
  • For the region to which the code A7 is assigned, the first correction value is α2 and the second correction value is β2.
  • For the region to which the code A8 is assigned, the first correction value is α2 and the second correction value is β3.
  • For the region to which the code A9 is assigned, the first correction value is α3 and the second correction value is β3.
  • For the plurality of regions included in the non-final time image, the correction unit 12 corrects the luminance value of a correction target region using the first correction value corresponding to the exposure time required when the region was formed and the second correction value corresponding to the acceleration of the imaging target 23 when the region was formed. For the plurality of regions included in the final time image, the correction unit 12 corrects the luminance values in the same manner as described in the first embodiment.
  • When the luminance value before correction of a region to which a code Xn is assigned is Xn and the luminance value after correction is Xn', the correction unit 12 corrects the luminance value before correction according to, for example, the following equation (2), where X is A or B, n is an integer from 1 to 9, α is the first correction value assigned to the region, and β is the second correction value assigned to the region:
  • Xn' = β × α × Xn ... (2)
  • As described above, the image correction device 1 according to the second embodiment divides the non-final time image into a plurality of regions according to the exposure time required when each part of the non-final time image was formed and the acceleration of the imaging target 23.
  • The second correction value according to the second embodiment corresponds to the non-final time image and differs depending on the acceleration of the imaging target 23.
  • For the plurality of regions included in the non-final time image, the correction unit 12 corrects the luminance value of a correction target region using the first correction value corresponding to the exposure time required when the region was formed and the second correction value corresponding to the acceleration of the imaging target 23 when the region was formed.
  • For the plurality of regions included in the final time image, the correction unit 12 corrects the luminance values in the same manner as described in the first embodiment.
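  • A minimal sketch of the correction of equation (2) follows, using the pairing of first and second correction values listed above for the codes A4 to A9; the numeric values of α and β are illustrative assumptions, not values from the patent.

```python
ALPHA = {1: 1.0, 2: 1 / 2, 3: 1 / 3}      # first correction values α1..α3
BETA = {"y1": 0.6, "y2": 0.5, "y3": 0.4}  # second correction values β1..β3;
                                          # smaller for a larger per-interval movement

# (first, second) correction value per region code, following the pairing above.
REGION_VALUES = {
    "A4": (ALPHA[1], BETA["y1"]), "A5": (ALPHA[1], BETA["y2"]),
    "A6": (ALPHA[1], BETA["y3"]), "A7": (ALPHA[2], BETA["y2"]),
    "A8": (ALPHA[2], BETA["y3"]), "A9": (ALPHA[3], BETA["y3"]),
}

def apply_equation_2(luminance: float, code: str) -> float:
    """Equation (2): Xn' = β·α·Xn for a region of the non-final time image."""
    alpha, beta = REGION_VALUES[code]
    return beta * alpha * luminance

print(apply_equation_2(240.0, "A8"))  # 0.4 * 0.5 * 240 -> 48.0
```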
  • By correcting the correction target image even when the imaging target 23 undergoes accelerated motion, the image correction device 1 according to the second embodiment can obtain, at the final time of imaging, an image of the imaging target 23 that does not include the blurred image 23a.
  • The image correction device 1 according to the second embodiment can also obtain an image of the imaging target 23 whose luminance values have been corrected in consideration of the exposure time after one frame has been formed by the first imaging device 21.
  • Note that, even when the imaging target 23 undergoes accelerated motion, the image correction device 1 may correct the correction target image by the method described in the first embodiment, without considering the acceleration of the imaging target 23. Even in that case, the image correction device 1 can generate a corrected image in which the influence of the blurred image 23a is smaller than before correction and the exposure time is taken into account.
  • Embodiment 3. FIG. 11 is a diagram illustrating a configuration of an image correction device 1A according to the third embodiment.
  • The image correction device 1A has all the components of the image correction device 1 according to the first or second embodiment, and additionally has components that the image correction device 1 does not have.
  • In the following, the portions different from the first or second embodiment will be described.
  • the image correction device 1A further includes a corrected image storage unit 15 that stores data indicating the corrected image generated by the corrected image generation unit 13.
  • An example of the corrected image storage unit 15 is a semiconductor memory.
  • The image correction device 1A includes an interpolation image generation unit 16 that generates an interpolation image based on a first image, indicated by the data stored in the corrected image storage unit 15, and the position of the imaging target 23.
  • the first image is an image obtained by the correction unit 12 performing the correction.
  • the position of the imaging target 23 is specified based on the data stored in the movement amount storage unit 8. More specifically, the position of the imaging target 23 is specified by the image data received by the communication unit 3 from the second imaging device 22.
  • the first image is an image generated by the corrected image generation unit 13 after the correction unit 12 performs the correction.
  • The interpolation image is an image of the imaging target 23 at a time between a first time, at which the uncorrected image underlying the first image was obtained, and a second time, at which the uncorrected image underlying a second image, obtained after the first image through the correction performed by the correction unit 12, was obtained.
  • the display unit 14 also displays the interpolation image generated by the interpolation image generation unit 16.
  • FIG. 12 is a diagram for explaining the function of the interpolation image generation unit 16 included in the image correction device 1A according to the third embodiment.
  • FIG. 12A illustrates a first image at a first time and a second image at a second time.
  • FIG. 12B shows, in addition to the first image and the second image, the interpolation image 23b generated by the interpolation image generation unit 16 for an intermediate time between the first time and the second time.
  • Suppose that the first imaging device 21 were to image the imaging target 23 at the intermediate time.
  • The image of the imaging target 23 at the intermediate time is assumed to be the image obtained by moving the imaging target 23 of the first image by a movement amount Z from its position in the first image toward its position in the second image.
  • the movement amount Z is specified based on the position of the imaging target 23. Specifically, the movement amount Z is specified based on the data stored in the movement amount storage unit 8. More specifically, the movement amount Z is specified by image data received by the communication unit 3 from the second imaging device 22.
  • As shown in FIG. 12B, the interpolation image generation unit 16 generates the interpolation image 23b, which is an image of the imaging target 23 at the intermediate time, based on the first image and the movement amount Z. Specifically, the interpolation image generation unit 16 generates the interpolation image 23b by moving the imaging target 23 of the first image from its position in the first image by the movement amount Z in the direction of the position of the imaging target 23 in the second image.
  • As described above, based on the first image and the position of the imaging target 23, specifically based on the first image and the movement amount Z, the image correction device 1A generates the interpolation image 23b, which is an image of the imaging target 23 at the intermediate time. That is, the image correction device 1A can present the image of the imaging target 23 at the intermediate time to the user in addition to the first image and the second image. The user can confirm the movement of the imaging target 23 by viewing the interpolation image 23b.
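  • A minimal sketch of this interpolation follows, assuming the movement amount Z is available in pixels; the function name is illustrative, and np.roll stands in for a translation.

```python
import numpy as np

def interpolation_image(first_image: np.ndarray, shift_cols: int) -> np.ndarray:
    """Generate the interpolation image 23b by moving the imaging target of
    the first image by the movement amount Z (= shift_cols pixels) along the
    movement direction. np.roll wraps around; a real implementation would
    translate with padding instead."""
    return np.roll(first_image, shift_cols, axis=1)

first = np.zeros((4, 10))
first[1:3, 0:2] = 1.0                 # imaging target 23 in the first image
print(interpolation_image(first, 3))  # the target at the intermediate time
```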
  • the image correction devices 1 and 1A may be connected to an encoder and a control device.
  • FIG. 13 is a diagram illustrating a situation where the image correction devices 1 and 1A according to the first to third embodiments are connected to the encoder 26 and the control device 27.
  • the encoder 26 is a device that detects the position of the imaging target 23.
  • the control device 27 is a device that controls the movement speed of the imaging target 23 by controlling the belt conveyor 24.
  • the communication unit 3 receives information related to the position of the imaging target 23 from one or both of the encoder 26 and the control device 27.
  • the position of the imaging target 23 or the acceleration of the imaging target 23 may be specified by the information received by the communication unit 3.
  • the encoder 26 may be replaced with a sensor that detects the position or speed of the imaging target 23.
  • the communication unit 3 receives information related to the position of the imaging target 23 from the sensor.
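  • As a sketch of how the received information could yield the target position, the following converts an incremental encoder count into the position of the imaging target 23 along the belt conveyor 24; the encoder parameters are assumptions, since the patent does not specify the encoder interface.

```python
def encoder_position_mm(counts: int, counts_per_rev: int, feed_per_rev_mm: float) -> float:
    """Position of the imaging target 23 along the belt conveyor 24, derived
    from an incremental encoder count (illustrative parameter values)."""
    return counts / counts_per_rev * feed_per_rev_mm

# e.g. a 1000-count-per-revolution encoder and 200 mm of belt feed per revolution
print(encoder_position_mm(2500, 1000, 200.0))  # -> 500.0 mm
```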
  • the first imaging device 21 may be a device that detects depth or sound and generates image data based on the detection result.
  • Some or all of the functions of the trigger generation unit 2, the communication unit 3, the feature extraction unit 5, the movement amount calculation unit 7, the corrected image specifying unit 9, the dividing unit 10, the correction unit 12, and the corrected image generation unit 13 may be implemented by a processor that executes a program stored in a memory.
  • the processor is a CPU (Central Processing Unit), a processing device, an arithmetic device, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor).
  • When part or all of the functions of the trigger generation unit 2, the communication unit 3, the feature extraction unit 5, the movement amount calculation unit 7, the corrected image specifying unit 9, the dividing unit 10, the correction unit 12, and the corrected image generation unit 13 are realized by a processor, those functions are realized by the processor together with software, firmware, or a combination of software and firmware.
  • The software or firmware is described as a program and stored in the memory. The processor reads out and executes the program stored in the memory, thereby realizing part or all of the functions of the trigger generation unit 2, the communication unit 3, the feature extraction unit 5, the movement amount calculation unit 7, the corrected image specifying unit 9, the dividing unit 10, the correction unit 12, and the corrected image generation unit 13.
  • In this case, the image correction device 1 has a memory for storing a program that results in the execution of the steps performed by some or all of the trigger generation unit 2, the communication unit 3, the feature extraction unit 5, the movement amount calculation unit 7, the corrected image specifying unit 9, the dividing unit 10, the correction unit 12, and the corrected image generation unit 13.
  • It can also be said that the program stored in the memory causes a computer to execute the procedures or methods performed by some or all of these units.
  • The memory is, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (registered trademark) (Electrically Erasable Programmable Read-Only Memory); a magnetic disk; a flexible disk; an optical disk; a compact disc; a mini disc; or a DVD (Digital Versatile Disc).
  • Some or all of the functions of the trigger generation unit 2, the communication unit 3, the feature extraction unit 5, the movement amount calculation unit 7, the corrected image specifying unit 9, the dividing unit 10, the correction unit 12, and the corrected image generation unit 13 may instead be realized by a processing circuit.
  • the processing circuit is dedicated hardware.
  • The processing circuit is, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • Some of the trigger generation unit 2, the communication unit 3, the feature extraction unit 5, the movement amount calculation unit 7, the corrected image specifying unit 9, the dividing unit 10, the correction unit 12, and the corrected image generation unit 13 may be realized by dedicated hardware that is separate from the hardware realizing the rest.
  • Of the functions of the trigger generation unit 2, the communication unit 3, the feature extraction unit 5, the movement amount calculation unit 7, the corrected image specifying unit 9, the dividing unit 10, the correction unit 12, and the corrected image generation unit 13, some may be realized by software or firmware and the rest by dedicated hardware. In this way, these functions can be realized by hardware, software, firmware, or a combination thereof.
  • Some or all of the functions of the trigger generation unit 2, the communication unit 3, the feature extraction unit 5, the movement amount calculation unit 7, the corrected image specifying unit 9, the dividing unit 10, the correction unit 12, the corrected image generation unit 13, and the interpolation image generation unit 16 included in the image correction device 1A according to the third embodiment may be realized by a processor.
  • In this case, the image correction device 1A has a memory for storing a program that results in the execution of the steps performed by some or all of the trigger generation unit 2, the communication unit 3, the feature extraction unit 5, the movement amount calculation unit 7, the corrected image specifying unit 9, the dividing unit 10, the correction unit 12, the corrected image generation unit 13, and the interpolation image generation unit 16.
  • Some or all of the functions of the trigger generation unit 2, the communication unit 3, the feature extraction unit 5, the movement amount calculation unit 7, the corrected image specifying unit 9, the dividing unit 10, the correction unit 12, the corrected image generation unit 13, and the interpolation image generation unit 16 included in the image correction device 1A according to the third embodiment may instead be realized by a processing circuit.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Details Of Cameras Including Film Mechanisms (AREA)
  • Adjustment Of Camera Lenses (AREA)

Abstract

An image correction apparatus (1) comprising: a dividing unit (10) for dividing an image, obtained by a first imaging device (21) that images a moving imaging target, into a final time image formed at the final time of the period in which the image is formed and a non-final time image formed before the final time, and for dividing each of the final time image and the non-final time image, based on the position of the imaging target, into a plurality of regions for each of the exposure times required when the parts of the final time image and the non-final time image are formed; and a correction unit (12) for correcting the luminance value of each region. For the plurality of regions included in the final time image, the correction unit (12) corrects the luminance value of a correction target region using a first correction value corresponding to the exposure time required when the correction target region is formed; for the plurality of regions included in the non-final time image, the correction unit (12) corrects the luminance value of a correction target region using the first correction value and a second correction value corresponding to the non-final time image.
PCT/JP2018/027529 2018-07-23 2018-07-23 Image correction apparatus and image correction method WO2020021601A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2018/027529 WO2020021601A1 (fr) 2018-07-23 2018-07-23 Image correction apparatus and image correction method
JP2019504145A JP6502003B1 (ja) 2018-07-23 2018-07-23 Image correction apparatus and image correction method
CN201880095738.7A CN112425147B (zh) 2018-07-23 2018-07-23 Image correction apparatus and image correction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/027529 WO2020021601A1 (fr) 2018-07-23 2018-07-23 Image correction apparatus and image correction method

Publications (1)

Publication Number Publication Date
WO2020021601A1 (fr) 2020-01-30

Family

ID=66166748

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/027529 WO2020021601A1 (fr) 2018-07-23 2018-07-23 Image correction apparatus and image correction method

Country Status (3)

Country Link
JP (1) JP6502003B1 (fr)
CN (1) CN112425147B (fr)
WO (1) WO2020021601A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10276361A (ja) * 1997-01-31 1998-10-13 Imaging apparatus and method, imaging system, image information providing apparatus and method, image data processing apparatus and method, and transmission medium
JP2009017223A (ja) * 2007-07-04 2009-01-22 Imaging apparatus, image processing apparatus, image processing method therein, and program
JP2009044236A (ja) * 2007-08-06 2009-02-26 White balance adjustment device and white balance adjustment method
JP2017187434A (ja) * 2016-04-08 2017-10-12 Infrared imaging device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3804617B2 (ja) * 2003-02-14 2006-08-02 Konica Minolta Photo Imaging, Inc. Image processing apparatus and method
CN102857696A (zh) * 2009-08-18 2013-01-02 Sharp Corporation Imaging condition determination device, imaging condition determination method, and unevenness correction system
EP2391142B1 (fr) * 2010-03-19 2013-12-25 FUJIFILM Corporation Imaging device, method and program, and recording medium therefor


Also Published As

Publication number Publication date
JP6502003B1 (ja) 2019-04-17
JPWO2020021601A1 (ja) 2020-07-30
CN112425147A (zh) 2021-02-26
CN112425147B (zh) 2021-08-27


Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2019504145

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18928100

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18928100

Country of ref document: EP

Kind code of ref document: A1