WO2020208806A1 - Image acquisition device, paper sheet handling device, and image acquisition method - Google Patents

Publication number
WO2020208806A1
WO2020208806A1 (PCT/JP2019/015949)
Authority
WO
WIPO (PCT)
Prior art keywords
light, image, wavelength region, image data, image acquisition
Application number
PCT/JP2019/015949
Other languages
French (fr)
Japanese (ja)
Inventor
晶 坊垣
了介 南
Original Assignee
グローリー株式会社 (Glory Ltd.)
Application filed by グローリー株式会社 (Glory Ltd.)
Priority: PCT/JP2019/015949 (published as WO2020208806A1)
Priority: JP2021513133A (granted as JP7218429B2)
Publication: WO2020208806A1

Classifications

    • G: PHYSICS
    • G07: CHECKING-DEVICES
    • G07D: HANDLING OF COINS OR VALUABLE PAPERS, e.g. TESTING, SORTING BY DENOMINATIONS, COUNTING, DISPENSING, CHANGING OR DEPOSITING
    • G07D 7/00: Testing specially adapted to determine the identity or genuineness of valuable papers or for segregating those which are unacceptable, e.g. banknotes that are alien to a currency
    • G07D 7/06: Testing using wave or particle radiation
    • G07D 7/12: Visible light, infrared or ultraviolet radiation
    • G07D 7/121: Apparatus characterised by sensor details

Definitions

  • The present invention relates to an image acquisition device, a paper sheet handling device, and an image acquisition method, and more specifically to an image acquisition device, a paper sheet handling device, and an image acquisition method suitable for acquiring high-resolution images of paper sheets.
  • The authenticity of a paper sheet is determined, for example, by analyzing an image of the paper sheet read by an optical line sensor.
  • The authenticity of the paper sheet is also determined based on the arrangement and the number of digits of the recognized serial number. To perform serial number recognition, it is desirable that the serial number portion of the image have high resolution.
  • For example, Patent Document 1 discloses a paper sheet identification device that controls the light emission of its light emitting means so that, within one lighting cycle, the number of emissions of at least one of first to n-th lights (n being an integer of 2 or more) having mutually different wavelengths differs from the number of emissions of the other lights. More specifically, it discloses an example of acquiring a high-resolution reflection image by increasing the number of emissions of green light used for reflection imaging.
  • Patent Document 2 discloses a serial number recognition device that generates a composite image in which a reflection image, formed by light from a first light source reflected by the paper sheet, and a transmission image, formed by light from a second light source transmitted through the paper sheet, complement each other, and that performs character recognition on the serial number region of the paper sheet in the composite image.
  • However, with the method of Patent Document 2, which acquires a composite image in which the reflection image and the transmission image complement each other, an accurate image of only one surface of the paper sheet may not be obtained. This is because the pattern on the opposite surface is also captured in the transmission image.
  • In addition, the transmittance of the light received to generate the transmission image fluctuates with the thickness, material, and the like of the paper sheet, and the fluctuation of the sensor output is generally larger when receiving transmitted light than when receiving reflected light.
  • The present invention has been made in view of the above circumstances, and its object is to provide an image acquisition device, a paper sheet handling device, and an image acquisition method capable of acquiring a variety of images as well as more accurate high-resolution images.
  • The present invention is an image acquisition device for acquiring an image of transported paper sheets, comprising: a light emitting unit that irradiates the paper sheets with light in a first wavelength region and light in a second wavelength region; a light receiving unit that receives the light in the first wavelength region emitted from the light emitting unit and reflected by the paper sheets and outputs a first image signal, and receives the light in the second wavelength region emitted from the light emitting unit and reflected by the paper sheets and outputs a second image signal; and an image generation unit that generates first image data based on the first image signal and second image data based on the second image signal, and combines the first image data and the second image data to generate a composite image.
  • In the above invention, the light emitting unit irradiates the paper sheets with the light in the first wavelength region and the light in the second wavelength region at different timings, and the image generation unit combines the first image data and the second image data in the transport direction of the paper sheets.
  • In the above invention, after irradiating the paper sheets with the light in the first wavelength region, the light emitting unit irradiates them with the light in the second wavelength region, and after irradiating them with the light in the second wavelength region, it irradiates them with the light in the first wavelength region.
  • In the above invention, the light emitting unit irradiates the paper sheets with the light in the first wavelength region and the light in the second wavelength region such that the time interval from the start of irradiation with the light in the first wavelength region to the start of irradiation with the light in the second wavelength region is equal to the time interval from the start of irradiation with the light in the second wavelength region to the start of irradiation with the light in the first wavelength region.
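The alternating, equal-interval irradiation described above can be sketched as a lighting schedule. This is a minimal illustration, not taken from the patent: the slot counts and positions are assumed values, and the equal spacing between the two wavelengths holds exactly when their slots sit half a cycle apart.

```python
def lighting_schedule(n_cycles, slots_per_cycle, first_slot, second_slot):
    """Return (time_slot, wavelength) events for an alternating schedule in
    which the interval from a first-wavelength flash to the next
    second-wavelength flash equals the interval back again."""
    # Equal intervals hold when the two slots are half a cycle apart.
    assert (second_slot - first_slot) % slots_per_cycle == slots_per_cycle // 2
    events = []
    for c in range(n_cycles):
        base = c * slots_per_cycle
        events.append((base + first_slot, "first"))
        events.append((base + second_slot, "second"))
    return sorted(events)

# Illustrative values: 6 light slots per cycle, the two wavelengths 3 slots apart.
sched = lighting_schedule(n_cycles=3, slots_per_cycle=6, first_slot=2, second_slot=5)
gaps = [b[0] - a[0] for a, b in zip(sched, sched[1:])]
# Every gap between consecutive flashes of the pair is the same (3 slots here).
```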
  • In the above invention, the light emitting unit simultaneously irradiates the paper sheets with the light in the first wavelength region and the light in the second wavelength region, and the image generation unit combines the first image data and the second image data in the main scanning direction.
  • the present invention is characterized in that, in the above invention, the first image data and the second image data are gray-converted image data.
  • The image acquisition device of the present invention further includes an image recognition unit that performs recognition on the composite image, and the image recognition unit recognizes, from the composite image, a serial number assigned to the paper sheet.
  • The image acquisition device of the present invention further includes an image recognition unit that performs recognition on the composite image, and the image recognition unit recognizes, from the composite image, a bar code attached to the paper sheet.
  • the present invention is characterized in that, in the above invention, the light in the first wavelength region is blue light and the light in the second wavelength region is green light.
  • the present invention is characterized in that, in the above invention, the light in the first wavelength region is red light, and the light in the second wavelength region is infrared light.
  • In the above invention, the light in the first wavelength region is a first infrared light, and the light in the second wavelength region is a second infrared light having a wavelength region different from that of the first infrared light.
  • The present invention is also a paper sheet handling apparatus including the above image acquisition device.
  • The present invention is also an image acquisition method for acquiring an image of transported paper sheets, including: a light receiving step of receiving light in a first wavelength region reflected by the paper sheets and outputting a first image signal, and receiving light in a second wavelength region reflected by the paper sheets and outputting a second image signal; and an image generation step of generating first image data based on the first image signal and second image data based on the second image signal, and combining the first image data and the second image data to generate a composite image.
  • According to the image acquisition device, the paper sheet handling device, and the image acquisition method of the present invention, a variety of images can be acquired, and more accurate high-resolution images can also be acquired.
  • A schematic cross-sectional view illustrating the configuration of the imaging unit included in the banknote identification device (image acquisition device) according to Embodiment 1, and a block diagram explaining the structure of that banknote identification device.
  • A timing chart showing an example of the control of each light source by the light source control unit and of the signal reading from each line sensor by the sensor control unit in Embodiment 1.
  • A timing chart showing another example of that control in Embodiment 1.
  • A timing chart showing still another example of that control in Embodiment 1.
  • A flowchart showing the procedure for acquiring the composite image, and the main processing flow thereafter, in the banknote identification device (image acquisition device) and image acquisition method according to Embodiment 1, and a timing chart showing an example of the light source and signal reading control in a modification of Embodiment 1.
  • A timing chart showing a plurality of examples of the light source and signal reading control in Embodiment 1 and its modification, and a diagram for explaining the outline of Embodiment 2.
  • A schematic plan view showing the correspondence between the arrangement of the image sensors of the light receiving unit of the optical line sensor according to Embodiment 2 and the bandpass filters, and a schematic plan view showing the arrangement of the bandpass filters in Embodiment 2 when blue light and green light are used as the light in the first and second wavelength regions.
  • Here, a reflection image means an image based on the intensity distribution of light that is emitted onto a paper sheet and reflected by it.
  • The optical line sensor repeatedly irradiates the banknote with light in the first wavelength region and light in a second wavelength region different from the first, and acquires a reflection image for each color. These reflection images are gray-converted and then combined in the banknote transport direction (the sub-scanning direction of the optical line sensor) to generate a composite image. It is therefore possible to acquire a monochrome image with higher resolution, specifically twice the resolution in the banknote transport direction, compared with each individual reflection image.
  • Suitable wavelength combinations include, for example, blue light and green light; red light and infrared light; and a first infrared light and a second infrared light having a wavelength region different from that of the first infrared light.
  • The banknote processing device 300 shown in FIG. 2 is a small banknote processing device installed and used on a table. It includes a banknote identification device (not shown in FIG. 2) that performs banknote identification processing, a hopper 301 on which a plurality of banknotes to be processed are placed as a stack, and a reject unit to which banknotes drawn from the hopper 301 into the housing 310 are discharged when they are reject notes such as counterfeit notes or notes of uncertain authenticity.
  • the method of distributing banknotes to the accumulation units 306a to 306d can be arbitrarily set.
  • the imaging unit 21 includes optical line sensors 110 and 120 arranged so as to face each other.
  • a gap for transporting the bill BN is formed between the optical line sensors 110 and 120, and this gap constitutes a part of the transport path 311 of the bill processing device according to the present embodiment.
  • the optical line sensors 110 and 120 are located above and below the transport path 311 respectively.
  • the optical line sensor 110 includes two reflection light sources 111 as light emitting units, a condenser lens 112, and a light receiving unit 113.
  • The reflection light source 111 sequentially irradiates the main surface of the bill BN on the light receiving unit 113 side (hereinafter, the A surface) with light of predetermined wavelengths (invisible light such as infrared light, and visible light such as monochromatic red, green, or blue light, or white light).
  • the condenser lens 112 collects the light emitted from the reflection light source 111 and reflected by the bill BN.
  • The light receiving unit 113 includes a plurality of image pickup elements (light receiving elements, not shown) arranged in a line in the direction (main scanning direction) orthogonal to the transport direction (sub-scanning direction) of the bill BN, receives the light collected by the condenser lens 112, and converts it into an electric signal. The electric signal is then amplified, A/D converted into digital data, and output as an image signal.
  • the optical line sensor 120 includes two reflection light sources 121 as light emitting units, a condenser lens 122, and a light receiving unit 123.
  • The reflection light source 121 irradiates the main surface of the bill BN on the light receiving unit 123 side (hereinafter, the B surface) with light of predetermined wavelengths (invisible light such as infrared light, and visible light such as monochromatic red, green, or blue light, or white light).
  • the condenser lens 122 collects the light emitted from the reflection light source 121 and reflected by the bill BN.
  • The light receiving unit 123 includes a plurality of image pickup elements (light receiving elements, not shown) arranged in a line in the direction orthogonal to the transport direction of the banknote BN, receives the light collected by the condenser lens 122, and converts it into an electric signal. The electric signal is then amplified, A/D converted into digital data, and output as an image signal.
  • The light sources 111 and 121 each include a line-shaped light guide (not shown) extending in the direction perpendicular to the plane of FIG. 3 (the main scanning direction), and a plurality of LED elements (not shown) provided at both ends of the light guide (one end is also possible).
  • Each of the light sources 111 and 121 includes, as its LED elements, a plurality of LED elements capable of emitting light in mutually different wavelength regions, including at least the light in the first and second wavelength regions, and is configured to emit light in a selected wavelength region.
  • Specifically, each of the light sources 111 and 121 includes an LED element that emits infrared light (IR) and LED elements that emit visible light (monochromatic light such as red, green, or blue, or white light), and emits infrared light and visible light toward the bill BN.
  • More specifically, each of the light sources 111 and 121 includes a plurality of LED elements that emit a plurality of types of infrared light (for example, three types of LED elements emitting first to third infrared lights), an LED element that emits red light in the 600 nm to 700 nm wavelength band, an LED element that emits green light in the 500 nm to 600 nm band, and an LED element that emits blue light in the 400 nm to 500 nm band.
  • The optical line sensors 110 and 120 each repeatedly image the banknote BN being transported in the transport direction and output image signals, whereby the banknote identification device according to the present embodiment acquires an image of the entire banknote BN.
  • The banknote identification device according to the present embodiment acquires a reflection image of the A surface of the bill BN based on the output signal of the optical line sensor 110, and a reflection image of the B surface of the bill BN based on the output signal of the optical line sensor 120.
  • the bill identification device (image acquisition device) 1 includes a control unit 10, a detection unit 20, and a storage unit 30.
  • The control unit 10 is composed of programs for realizing various processes stored in the storage unit 30, a CPU (Central Processing Unit) that executes the programs, various hardware controlled by the CPU, and logic devices such as an FPGA (Field Programmable Gate Array).
  • The control unit 10 controls each unit of the banknote identification device 1 based on the signals output from each unit of the banknote identification device 1 and on control signals generated in accordance with the programs stored in the storage unit 30. The control unit 10 also realizes the functions of the light source control unit 11, the sensor control unit 12, the image generation unit 13, the image recognition unit 14, and the identification unit 15 in accordance with those programs.
  • the detection unit 20 includes a magnetic detection unit 22, a thickness detection unit 23, and a UV detection unit 24 in addition to the above-mentioned imaging unit 21 along the bill transport path.
  • the image pickup unit 21 takes an image of the banknote as described above and outputs an image signal (image data).
  • the magnetic detection unit 22 includes a magnetic sensor (not shown) for measuring magnetism, and detects the magnetism of magnetic ink, security threads, etc. printed on banknotes by the magnetic sensor.
  • the magnetic sensor is a magnetic line sensor in which a plurality of magnetic detection elements are arranged in a line.
  • the thickness detection unit 23 includes a thickness detection sensor (not shown) for measuring the thickness of banknotes, and detects tape, double feed, etc. by the thickness detection sensor.
  • The thickness detection sensor detects, by means of sensors provided on rollers facing each other across the transport path, the amount of displacement of those rollers when a banknote passes between them.
  • The UV detection unit 24 includes an ultraviolet irradiation unit (not shown) and a light receiving unit (not shown), and detects, with the light receiving unit, the fluorescence generated when the banknote is irradiated with ultraviolet light by the ultraviolet irradiation unit and the ultraviolet light transmitted through the banknote.
  • The storage unit 30 is composed of a non-volatile storage device such as a semiconductor memory or a hard disk, and stores various programs and various data for controlling the banknote identification device 1. The storage unit 30 also stores, as imaging parameters, the wavelength regions of the light emitted from the light sources 111 and 121 during one cycle of imaging by the imaging unit 21, the timing for turning the light sources 111 and 121 on and off, the value of the forward current supplied to each LED element of the light sources 111 and 121, the timing of signal reading from the optical line sensors 110 and 120, and so on.
  • Here, one cycle of imaging refers to the preset imaging pattern, including the wavelength regions of the light emitted from the light sources 111 and 121, the turning on and off of the light sources 111 and 121, the value of the forward current supplied to each LED element, the timing of signal reading, and so on. An image of the entire banknote is acquired by continuously and repeatedly executing this imaging pattern one cycle at a time.
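The imaging parameters that define one cycle (wavelength region, on/off timing, forward current, and read timing for each light slot) could be organized as a small data structure. The field names and numeric values below are illustrative assumptions, not values from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class SlotParam:
    wavelength: str          # e.g. "IR1", "blue" (illustrative labels)
    on_us: int               # light-on timing within the cycle (microseconds)
    off_us: int              # light-off timing
    forward_current_ma: int  # LED forward current for this slot
    read_us: int             # when the line sensor signal is read out

@dataclass
class ImagingCycle:
    """One cycle of imaging: the fixed pattern repeated over the whole note."""
    slots: list = field(default_factory=list)

# Assumed six-slot cycle matching the order described for FIG. 5.
cycle = ImagingCycle(slots=[
    SlotParam("IR1", 0, 80, 120, 90),
    SlotParam("IR2", 100, 180, 120, 190),
    SlotParam("blue", 200, 280, 150, 290),
    SlotParam("IR3", 300, 380, 120, 390),
    SlotParam("red", 400, 480, 140, 490),
    SlotParam("green", 500, 580, 140, 590),
])
```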
  • The light source control unit 11 performs dynamic lighting control in which the light sources 111 and 121 are turned on in sequence in order to capture images of individual banknotes. Specifically, the light source control unit 11 controls the turning on and off of the light sources 111 and 121 based on the timing set in the imaging parameters. This control uses a mechanical clock, which varies with the banknote transport speed, and a system clock, which is always output at a constant frequency regardless of the transport speed. The light source control unit 11 also sets the magnitude of the forward current supplied to each LED element based on the imaging parameters.
  • The sensor control unit 12 controls the timing of reading image signals from the optical line sensors 110 and 120 based on the timing set in the imaging parameters, and reads the image signal from each line sensor in synchronization with the turning on and off of the light sources 111 and 121. This control also uses the mechanical clock and the system clock. The sensor control unit 12 then sequentially stores the read image signals, that is, the line data, in a ring buffer (line memory) of the storage unit 30.
  • Here, line data means the data based on the image signal obtained by one imaging operation of each optical line sensor 110, 120, and corresponds to one line of the acquired image extending in the direction orthogonal to the banknote transport direction (the main scanning direction).
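The ring buffer (line memory) into which the sensor control unit stores successive rows of line data behaves like a fixed-capacity queue that overwrites its oldest entries once full. A minimal sketch with an assumed capacity; the real line memory's size and data layout are not specified in the text:

```python
from collections import deque

class LineRingBuffer:
    """Fixed-capacity buffer in which each row of line data is stored as it
    is read from a line sensor; the oldest rows are overwritten once the
    capacity is reached (ring / line-memory behavior)."""

    def __init__(self, capacity):
        self._buf = deque(maxlen=capacity)

    def push(self, line):
        self._buf.append(line)

    def snapshot(self):
        return list(self._buf)

rb = LineRingBuffer(capacity=4)
for i in range(6):
    rb.push([i])  # pretend each one-element list is one row of line data
# Only the 4 most recent lines remain: [[2], [3], [4], [5]]
```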
  • The image generation unit 13 has the function of generating images based on the various banknote-related signals acquired from the detection unit 20. Specifically, the image generation unit 13 first separates the data (image signals) stored in the ring buffer by irradiation and light reception condition: the received-light intensity data of the light reflected under irradiation with the first infrared light, the received-light intensity data of the light reflected under irradiation with the second infrared light, the received-light intensity data of the light reflected under irradiation with the third infrared light, and so on.
  • The image generation unit 13 generates the first image data (hereinafter, first monochrome image data) by gray-converting the first raw image data based on the received-light intensity data of the light in the first wavelength region, and generates the second image data (hereinafter, second monochrome image data) by gray-converting the second raw image data based on the received-light intensity data of the light in the second wavelength region. More specifically, the raw image data of each color is converted so that the maximum intensity (255 digits) and the minimum intensity (0 digits) of the output of the corresponding light receiving unit 113 or 123 become white and black, respectively. The first monochrome image data and the second monochrome image data correspond to the first image data and the second image data, respectively.
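The gray conversion described above maps the light receiving unit's minimum output to black (0 digits) and its maximum output to white (255 digits). A minimal sketch, assuming the raw line data arrives as a NumPy array from a 10-bit A/D converter (the actual bit depth is an assumption):

```python
import numpy as np

def gray_convert(raw, sensor_min=0, sensor_max=1023):
    """Map sensor output so that sensor_max becomes 255 (white) and
    sensor_min becomes 0 (black), yielding an 8-bit monochrome image."""
    raw = np.clip(raw, sensor_min, sensor_max).astype(np.float64)
    scaled = (raw - sensor_min) / (sensor_max - sensor_min) * 255.0
    return scaled.round().astype(np.uint8)
```

The same function serves for both wavelength regions; only the source line data differs.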
  • The image generation unit 13 also functions as a super-resolution processing unit that combines two or more types of image data to increase the resolution in the sub-scanning direction: it combines the generated first monochrome image data and second monochrome image data in the banknote transport direction (sub-scanning direction) to generate the composite image. That is, the first monochrome image data and the second monochrome image data are combined column by column, with the line data constituting the first monochrome image data and the line data constituting the second monochrome image data arranged alternately in the sub-scanning direction to generate a single reflection image.
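The column-by-column combination described above alternates the line data of the two monochrome images along the sub-scanning axis, doubling the number of lines in that direction. A minimal NumPy sketch, assuming both images have the same shape and each row is one line of data:

```python
import numpy as np

def interleave_lines(first_mono, second_mono):
    """Combine two monochrome images of identical shape by alternating
    their line data along the sub-scanning (transport) axis, doubling the
    number of lines in that direction."""
    assert first_mono.shape == second_mono.shape
    lines, pixels = first_mono.shape
    out = np.empty((2 * lines, pixels), dtype=first_mono.dtype)
    out[0::2] = first_mono   # lines imaged under the first wavelength
    out[1::2] = second_mono  # lines imaged under the second wavelength
    return out
```

For example, a 300-line blue reflection image and a 300-line green reflection image would yield a 600-line composite.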
  • The image generation unit 13 may generate the composite image concurrently with the generation of the first and second raw image data and the first and second monochrome image data. That is, each time line data of raw image data is generated, the corresponding monochrome line data may be generated, and each time line data of monochrome image data is generated, it may be merged into the composite image.
  • Alternatively, the image generation unit 13 may generate the composite image after the generation of the first and second monochrome image data is complete. That is, the entire first and second monochrome image data may be generated first, and these image data may then be combined to generate the composite image.
  • The image recognition unit 14 performs recognition on the composite image generated by the image generation unit 13; that is, it analyzes the composite image, extracts features, and recognizes the target. Specifically, when a serial number is printed on the banknote, for example, the image recognition unit 14 performs character recognition on the serial number portion of the composite image to recognize the serial number of the banknote. It may also recognize the portrait of a person printed on the banknote. The image recognition unit 14 stores the recognition result in the storage unit 30.
  • the identification unit 15 performs identification processing using various signals related to banknotes acquired from the detection unit 20.
  • The identification unit 15 identifies at least the denomination and authenticity of the banknote.
  • The identification unit 15 may also have a function of determining the fitness of the banknote. In that case, the identification unit 15 detects stains, folds, tears, and the like on the banknote, and also detects tape and the like affixed to the banknote from its thickness, thereby determining whether the banknote is a fit note that can be recirculated in the market or an unfit note that is not suitable for market circulation.
  • To identify the denomination, authenticity, fitness, and so on, the identification unit 15 uses the images of the banknote captured by the imaging unit 21, as well as the composite image generated by the image generation unit 13 and the recognition results obtained by the image recognition unit 14. For example, the identification unit 15 uses the serial number information of the banknote obtained by the image recognition unit 14 to determine the authenticity of the banknote.
  • In this example, the light source 111 irradiates the banknote with a first infrared light, then with a second infrared light having a wavelength region different from that of the first infrared light, then with blue light, then with a third infrared light having a wavelength region different from those of the first and second infrared lights, then with red light, and then with green light. While the banknote is irradiated with each light, the image sensors of the light receiving unit 113 are exposed and accumulate electric charge.
  • Each time the irradiated light is switched, the image signal produced by the light before the switch is read out from the optical line sensor 110.
  • In this way, in one cycle, the optical line sensor 110 sequentially acquires one row of line data forming the reflection image of the A surface by the first infrared light (hereinafter, first infrared reflection image), one row of line data forming the reflection image of the A surface by the second infrared light (hereinafter, second infrared reflection image), one row of line data forming the reflection image of the A surface by the blue light (hereinafter, blue reflection image), one row of line data forming the reflection image of the A surface by the third infrared light (hereinafter, third infrared reflection image), one row of line data forming the reflection image of the A surface by the red light (hereinafter, red reflection image), and one row of line data forming the reflection image of the A surface by the green light (hereinafter, green reflection image), in this order.
  • The first to third infrared lights are not particularly limited, but the peak wavelengths of the first, second, and third infrared lights may increase in this order; for example, the peak wavelengths may be 800 nm for the first, 880 nm for the second, and 950 nm for the third.
  • By repeating this cycle, the first infrared reflection image, the second infrared reflection image, the blue reflection image, the third infrared reflection image, the red reflection image, and the green reflection image of the entire A surface of the banknote are acquired. A color image of the entire A surface of the banknote can also be obtained from the red, green, and blue reflection images.
  • FIG. 5 shows the case where blue light and green light are used as the light in the first and second wavelength regions, and a composite image is generated by combining the blue reflection image and the green reflection image.
  • Each monochrome image can be acquired at a resolution of 100 dpi in the sub-scanning direction. Therefore, in the example shown in FIG. 5, the monochrome image of each color and the RGB color image can be acquired at a resolution of 100 dpi in the sub-scanning direction, while the composite image obtained by combining the blue reflection image and the green reflection image can be acquired at a resolution of 200 dpi in the sub-scanning direction.
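The 100 dpi and 200 dpi figures above follow from simple sampling arithmetic: each color is flashed once per imaging cycle, so a single-color image samples one line per cycle, and interleaving two colors doubles the line density in the sub-scanning direction. A trivial check (the per-color 100 dpi value is taken from the text):

```python
def effective_dpi(single_color_dpi, images_combined):
    """Sub-scanning resolution when `images_combined` reflection images,
    each sampled once per cycle, are interleaved line by line."""
    return single_color_dpi * images_combined

blue_alone = effective_dpi(100, 1)       # one color on its own
blue_plus_green = effective_dpi(100, 2)  # blue + green composite
```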
  • FIG. 6 shows the contents and timing of light source lighting and signal reading; it is the same as FIG. 5 except that the lighting contents and timing partially differ.
  • In this example, during one cycle the transported banknote is irradiated by the light source 111 with the first infrared light, then with the second infrared light, then with the third infrared light, then with blue light, then with green light, and then with red light.
  • In one cycle, the optical line sensor 110 thus acquires one row of line data forming the first infrared reflection image, one row forming the second infrared reflection image, one row forming the third infrared reflection image, one row forming the blue reflection image, one row forming the green reflection image, and one row forming the red reflection image, in this order.
  • By repeating this cycle, the first infrared reflection image, the second infrared reflection image, the third infrared reflection image, the blue reflection image, the green reflection image, and the red reflection image of the entire A surface of the banknote are acquired.
  • FIG. 6 shows the case where the third infrared light and the red light are used as the light in the first and second wavelength regions, and a composite image is generated by combining the third infrared reflection image and the red reflection image. In this case as well, the monochrome image of each color and the RGB color image can be acquired at a resolution of 100 dpi in the sub-scanning direction, while the composite image of the third infrared reflection image and the red reflection image can be acquired at a resolution of 200 dpi in the sub-scanning direction.
  • FIG. 7 shows the content and timing of lighting the light source and reading the signal, and is the same as the case of FIG. 5 except that the content and timing of lighting the light source are partially different.
  • During one cycle, the conveyed bill is irradiated by the light source 111 with the third infrared light, then blue light, then the first infrared light, then green light, then red light, and then the second infrared light.
  • In one cycle, the optical line sensor 110 acquires, in this order, one row of line data forming the third infrared reflection image, one row forming the blue reflection image, one row forming the first infrared reflection image, one row forming the green reflection image, one row forming the red reflection image, and one row forming the second infrared reflection image.
  • As a result, the first infrared reflection image, the second infrared reflection image, the third infrared reflection image, the blue reflection image, the green reflection image, and the red reflection image of the entire A side of the bill are each acquired.
  • FIG. 7 shows a case where the first and second infrared lights are used as the light in the first and second wavelength regions, and a composite image is generated by synthesizing the first infrared reflection image and the second infrared reflection image.
  • In this case, a monochrome image of each color and an RGB color image can be acquired at a resolution of 100 dpi in the sub-scanning direction, and the composite image obtained by combining the first infrared reflection image and the second infrared reflection image can be acquired at a resolution of 200 dpi in the sub-scanning direction.
  • In each of the above examples, the time for irradiating each type of light is set to be constant; that is, the charge accumulation time of each image sensor is constant regardless of the type of light. However, the irradiation time for each type of light can be set as appropriate, and different lights may have different irradiation times.
  • Note that saying that a plurality of time intervals (T1 and T2, and t1, t2, and t3 described later) are the same covers not only the case where they are exactly equal but also the case where some deviation arises between them due to fluctuation of the mechanical clock.
  • <Processing flow by banknote identification device> Next, the processing flow of the banknote identification device 1 will be described: the flow of the image acquisition method for acquiring a composite image of a banknote, followed by the main processing thereafter.
  • Here, the image acquisition method using the optical line sensor 110 will be described; since the same applies when the optical line sensor 120 is used, that description is omitted.
  • First, the light receiving unit 113 receives the light in the first wavelength region (for example, blue light) emitted from the light source 111 and reflected by the bill, and outputs the image signal S1. It also receives the light in the second wavelength region (for example, green light) emitted from the light source 111 at a timing different from the light in the first wavelength region and reflected by the bill, and outputs the image signal S2 (light receiving step S11).
  • the image signal S1 and the image signal S2 correspond to the first image signal and the second image signal, respectively.
  • Next, the sensor control unit 12 reads the image signals S1 and S2 from the optical line sensor 110 and sequentially stores them in the ring buffer of the storage unit 30 (image signal reading step S12).
  • Next, the image generation unit 13 gray-converts the raw image data based on the image signal S1 to generate the first image data (first monochrome image data), and gray-converts the raw image data based on the image signal S2 to generate the second image data (second monochrome image data). It then combines the generated first monochrome image data and second monochrome image data in the bill transport direction (sub-scanning direction) to generate a composite image (image generation step S13).
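As an illustration of the gray conversion in step S13, the sketch below maps raw sensor intensities to 8-bit gray levels. The 10-bit input range and white-reference normalization are assumptions made for the example; the patent does not specify the exact conversion.

```python
def gray_convert(raw_line, white_ref=1023):
    """Map raw sensor intensities (assumed 10-bit) to 8-bit gray levels."""
    return [min(255, max(0, round(v * 255 / white_ref))) for v in raw_line]

print(gray_convert([0, 511, 1023]))  # [0, 127, 255]
```

Each converted line would then be appended to the corresponding monochrome image before the two images are combined in the sub-scanning direction.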
  • the image recognition unit 14 recognizes the composite image generated by the image generation unit 13 (image recognition step S14).
  • Then, the identification unit 15 determines the denomination, authenticity, fitness, and the like of the banknote (identification step S15). At this time, the identification unit 15 uses the composite image generated by the image generation unit 13 and the recognition result of the image recognition unit 14.
  • As described above, in the present embodiment, the light receiving unit 113 receives the light in the first wavelength region emitted from the light source 111 and reflected by the bill and outputs the image signal S1, and also receives the light in the second wavelength region emitted from the light source 111 at a timing different from the light in the first wavelength region and reflected by the bill and outputs the image signal S2. The image generation unit 13 then combines the first image data based on the image signal S1 and the second image data based on the image signal S2 in the bill transport direction to generate a composite image. Therefore, it is possible to acquire a monochrome image having a higher resolution than the first and second image data, specifically a monochrome image having twice the resolution in the sub-scanning direction.
  • In the present embodiment, the case where the light in the first wavelength region and the light in the second wavelength region are repeatedly irradiated onto the banknote in this order has been described. However, the light in the first and second wavelength regions need not be irradiated alternately one at a time. At least one of the lights in the first and second wavelength regions may be irradiated onto the bill consecutively, for example with the light source 111 emitting light in the order of green → green → blue → green → green → blue.
  • In the present embodiment, the case where the image generation unit 13 combines the two types of image data after gray conversion has been described, but the two types of image data may be combined without gray conversion.
  • In the present embodiment, the case where the light used for generating the composite image has two wavelengths (the first and second wavelength regions) has been described. However, the number of wavelengths is not limited to two and may be three or more. Specifically, for example, red, green, and blue light may be used, or three types of infrared light (for example, first to third infrared light) may be used.
  • Next, for the case where three wavelengths of light are used for generating the composite image, an example of the control of each light source by the light source control unit 11, and of the reading of signals from the optical line sensors 110 and 120 by the sensor control unit 12, will be described.
  • FIG. 9 shows the content and timing of lighting the light source and reading the signal, and is the same as the case of FIG. 5 except that the content and timing of lighting the light source are partially different.
  • Since the optical line sensor 120 is controlled in the same manner as the optical line sensor 110, its description is omitted.
  • During one cycle, the conveyed bill is irradiated by the light source 111 with red light, then the first infrared light, then green light, then the second infrared light, then blue light, and then the third infrared light.
  • In one cycle, the optical line sensor 110 acquires, in this order, one row of line data forming the red reflection image, one row forming the first infrared reflection image, one row forming the green reflection image, one row forming the second infrared reflection image, one row forming the blue reflection image, and one row forming the third infrared reflection image.
  • As a result, the first infrared reflection image, the second infrared reflection image, the third infrared reflection image, the blue reflection image, the green reflection image, and the red reflection image of the entire A side of the bill are each acquired.
  • FIG. 9 shows a case where the first to third infrared lights are used as the light in the three wavelength regions, and a composite image is generated by synthesizing the first infrared reflection image, the second infrared reflection image, and the third infrared reflection image.
  • In this case, a monochrome image of each color and an RGB color image can be acquired at a resolution of 100 dpi in the sub-scanning direction, as in the case of FIG. 5, and the composite image obtained by synthesizing the first infrared reflection image, the second infrared reflection image, and the third infrared reflection image can be acquired at a resolution of 300 dpi in the sub-scanning direction. It is therefore possible to acquire a reflected image having a higher resolution than in the first embodiment.
  • Also in this example, the time for irradiating each type of light is set to be constant, as in the case of FIG. 5; that is, the charge accumulation time of each image sensor is constant regardless of the type of light. However, the irradiation time for each type of light can be set as appropriate, and different lights may have different irradiation times.
  • Further, the time interval t1 from the start of irradiation of the light in the first wavelength region to the start of irradiation of the light in the second wavelength region, the time interval t2 from the start of irradiation of the light in the second wavelength region to the start of irradiation of the light in the third wavelength region (for example, the third infrared light), and the time interval t3 from the start of irradiation of the light in the third wavelength region to the start of irradiation of the light in the first wavelength region are set to the same time interval.
  • FIG. 10 shows the contents and timing of lighting the light source and reading the signal, and is the same as the cases of FIGS. 5 to 7 and 9 except that the contents and timing of lighting the light source are partially different.
  • Since the optical line sensor 120 is controlled in the same manner as the optical line sensor 110, its description is omitted.
  • During one cycle, corresponding to one cycle of the mechanical clock generated by the rotary encoder, the light source 111 irradiates the conveyed bill with blue light, red light, green light, the first infrared light, the second infrared light, and the third infrared light in the predetermined order shown in FIG. 10.
  • The banknote is conveyed by 0.254 mm during one cycle of imaging (one cycle of the mechanical clock).
  • In this example, a clock having a period of one half of the mechanical clock (hereinafter also abbreviated as 1/2MCLK) and a clock having a period of one third of the mechanical clock (hereinafter also abbreviated as 1/3MCLK) are generated. Then, based on the rising edge of each clock, the intervals T1' and T2' of the periods of irradiating the light in the first and second wavelength regions are set to be the same (including substantially the same), and the intervals t1', t2', and t3' of the periods of irradiating the light in the three wavelength regions are set to be the same (including substantially the same). As a result, the moving distances of the respective colors in the transport direction can be made nearly identical.
  • Specifically, the intervals T1' and T2' are set in the period from the rising timing of 1/2MCLK until a certain time based on the system clock has elapsed. Likewise, the intervals t1', t2', and t3' are set in the period from the rising timing of 1/3MCLK until a certain time based on the system clock has elapsed.
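The relationship between the divided clocks and the equal intervals can be checked numerically. In the sketch below the mechanical-clock period is an assumed value (the patent gives none); irradiation starts derived from the rising edges of 1/2MCLK and 1/3MCLK give equal intervals T1' = T2' and t1' = t2' = t3' around the cycle.

```python
MCLK = 600.0  # assumed mechanical-clock period in microseconds (illustrative)

half_edges = [i * MCLK / 2 for i in range(2)]   # rising edges of 1/2MCLK
third_edges = [i * MCLK / 3 for i in range(3)]  # rising edges of 1/3MCLK

# Intervals between consecutive irradiation starts, wrapping around the cycle.
T = [(half_edges[(i + 1) % 2] - half_edges[i]) % MCLK for i in range(2)]
t = [(third_edges[(i + 1) % 3] - third_edges[i]) % MCLK for i in range(3)]
print(T, t)  # [300.0, 300.0] [200.0, 200.0, 200.0]
```

Because every interval within each group is identical, each color advances by the same transport distance between exposures, which is what keeps the interleaved composite geometrically uniform.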
  • In the present embodiment, the optical line sensor irradiates the banknote simultaneously with light in the first wavelength region and light in a second wavelength region different from the first wavelength region, and acquires a reflection image for each color. These reflection images are gray-converted and then combined in the direction orthogonal to the bill transport direction (the main scanning direction of the optical line sensor) to generate a composite image. Therefore, it is possible to acquire a monochrome image having a higher resolution than each reflection image, specifically twice the resolution in the direction orthogonal to the transport direction of the banknote.
  • The light sources 111 and 121 each include an LED element that emits blue light in the wavelength range of 400 nm to 500 nm, an LED element that emits green light in the wavelength range of 500 nm to 600 nm, an LED element that emits red light in the wavelength range of 600 nm to 700 nm, and an LED element that emits infrared light in the wavelength range of 700 nm to 1000 nm, and they simultaneously irradiate the bill BN with white light, which includes the blue light, green light, and red light, and with the infrared light.
  • The light receiving units 113 and 123 each have a plurality of pixel units 130 arranged in a line in the direction (main scanning direction) orthogonal to the transport direction (sub-scanning direction) of the bill BN.
  • Each pixel unit 130 has four image pickup elements (light receiving elements) 131a to 131d arranged in a row in the main scanning direction and four bandpass filters provided corresponding to the four image pickup elements 131a to 131d, respectively.
  • The four bandpass filters are filters (optical filters) that transmit different wavelength ranges: a bandpass filter (hereinafter also referred to as a blue filter) 132B that transmits blue light in the wavelength range of 400 to 500 nm, a bandpass filter (hereinafter also referred to as a green filter) 132G that transmits green light in the wavelength range of 500 to 600 nm, a bandpass filter (hereinafter also referred to as a red filter) 132R that transmits red light in the wavelength range of 600 to 700 nm, and a bandpass filter (hereinafter also referred to as an infrared light filter) 132IR that transmits infrared light in the wavelength range of 700 to 1000 nm.
  • The blue filter 132B, the red filter 132R, the green filter 132G, and the infrared light filter 132IR are arranged in this order in the main scanning direction. Each of the image pickup elements 131a to 131d receives light of a different wavelength band transmitted through the corresponding bandpass filter.
  • the line data for one row read by the sensor control unit 12 includes a plurality of types of line data that are distinguished by the difference in the wavelength of the irradiated light and the like.
  • The image generation unit 13 decomposes the data (image signals) stored in the ring buffer into the light reception intensity data of the light reflected under infrared irradiation, the light reception intensity data of the light reflected under red irradiation, the light reception intensity data of the light reflected under green irradiation, and the light reception intensity data of the light reflected under blue irradiation.
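This decomposition can be sketched as a simple de-interleaving of each raw line. The code below is illustrative only; it assumes one sample per filter in the pixel-unit order described above (blue, red, green, infrared).

```python
def demultiplex_line(raw_line, names=("blue", "red", "green", "infrared")):
    """Split one interleaved raw line into per-filter channel lines."""
    n = len(names)
    # Samples i, i+n, i+2n, ... all passed through the same bandpass filter.
    return {name: raw_line[i::n] for i, name in enumerate(names)}

raw = [10, 20, 30, 40, 11, 21, 31, 41]  # samples from two pixel units
channels = demultiplex_line(raw)
print(channels["blue"], channels["infrared"])  # [10, 11] [40, 41]
```

Each resulting channel line is one row of the corresponding reflection image.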
  • The image generation unit 13 also functions as a super-resolution processing unit that synthesizes two or more types of image data to increase the resolution in the main scanning direction, and generates a composite image by synthesizing the generated first monochrome image data and second monochrome image data in the direction (main scanning direction) orthogonal to the bill transport direction. That is, the first monochrome image data and the second monochrome image data are combined line by line; in other words, they are alternately arranged in the main scanning direction to generate one reflection image.
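The alternating arrangement in the main scanning direction can be sketched as a column interleave. This snippet is illustrative, not from the patent; it accepts two images for the doubled-resolution case and works unchanged for three images.

```python
def interleave_columns(*images):
    """Alternate pixel columns of equally sized images, row by row."""
    out = []
    for rows in zip(*images):
        line = []
        for pixels in zip(*rows):  # one pixel from each image, kept in order
            line.extend(pixels)
        out.append(line)
    return out

img_a = [[0, 0, 0]]        # e.g. one line of the blue reflection image
img_b = [[255, 255, 255]]  # e.g. one line of the green reflection image
print(interleave_columns(img_a, img_b))  # [[0, 255, 0, 255, 0, 255]]
```

The half-pitch offset of the two filter types is what makes the alternating columns sample the bill at evenly spaced positions.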
  • FIG. 13 shows a case where blue light and green light are used as light in the first and second wavelength regions, and a composite image is generated by synthesizing a blue reflection image and a green reflection image.
  • Assuming that the pitch (distance) X of the bandpass filters of each color in the main scanning direction is 0.254 mm, that the pitch X1 of the green filter 132G and the blue filter 132B sandwiching the infrared light filter 132IR is 0.127 mm, and that the pitch X2 of the blue filter 132B and the green filter 132G sandwiching the red filter 132R is 0.127 mm, a monochrome image of each color can be acquired at a resolution of 100 dpi in the main scanning direction. Therefore, in the example shown in FIG. 13, a monochrome image of each color and an RGB color image can be acquired at a resolution of 100 dpi in the main scanning direction, and the composite image obtained by combining the blue reflection image and the green reflection image can be acquired at a resolution of 200 dpi in the main scanning direction.
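The resolution figures above follow directly from the pitches (1 inch = 25.4 mm), as this small check shows:

```python
MM_PER_INCH = 25.4

def dpi_from_pitch(pitch_mm):
    """Samples per inch implied by a sampling pitch in millimetres."""
    return MM_PER_INCH / pitch_mm

print(round(dpi_from_pitch(0.254)))   # 100 (per-color filter pitch X)
print(round(dpi_from_pitch(0.127)))   # 200 (interleaved pitch X1 = X2)
print(round(dpi_from_pitch(0.0847)))  # 300 (three filters per pitch X)
```
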
  • FIG. 14 shows a case where red light and infrared light are used as light in the first and second wavelength regions, and a composite image is generated by synthesizing a red reflection image and an infrared reflection image.
  • Assuming that the pitch (distance) X of the bandpass filters of each color in the main scanning direction is 0.254 mm, that the pitch X3 of the red filter 132R and the infrared light filter 132IR sandwiching the green filter 132G is 0.127 mm, and that the pitch X4 of the infrared light filter 132IR and the red filter 132R sandwiching the blue filter 132B is 0.127 mm, a monochrome image of each color can be acquired at a resolution of 100 dpi in the main scanning direction. Therefore, in the example shown in FIG. 14, as in the case of FIG. 13, a monochrome image of each color and an RGB color image can be acquired at a resolution of 100 dpi in the main scanning direction, and the composite image obtained by combining the red reflection image and the infrared reflection image can be acquired at a resolution of 200 dpi in the main scanning direction.
  • the two types of bandpass filters that transmit light in the first and second wavelength regions are arranged at the same (including substantially the same) pitch in the main scanning direction. This makes it possible to obtain a more natural composite image with less distortion.
  • the image acquisition method using the optical line sensor 110 will be described, but since the same applies to the case where the optical line sensor 120 is used, the description thereof will be omitted.
  • First, the light receiving unit 113 receives the light in the first wavelength region (for example, blue light) emitted from the light source 111 and reflected by the bill, and outputs the image signal S1. It also receives the light in the second wavelength region (for example, green light) emitted from the light source 111 at the same timing as the light in the first wavelength region and reflected by the bill, and outputs the image signal S2.
  • the image generation unit 13 synthesizes the generated first monochrome image data and the second monochrome image data in the main scanning direction to generate a composite image.
  • As described above, in the present embodiment, the light receiving unit 113 receives the light in the first wavelength region emitted from the light source 111 and reflected by the bill and outputs the image signal S1, and also receives the light in the second wavelength region emitted from the light source 111 at the same timing as the light in the first wavelength region and reflected by the bill and outputs the image signal S2. The image generation unit 13 then combines the first image data based on the image signal S1 and the second image data based on the image signal S2 in the main scanning direction to generate a composite image. Therefore, it is possible to acquire a monochrome image having a higher resolution than the first and second image data, specifically a monochrome image having twice the resolution in the main scanning direction.
  • the image generation unit 13 may combine the two types of image data without gray conversion.
  • the number of wavelengths of light used for generating the composite image is not particularly limited to two, and may be three or more. Specifically, for example, red, green, and blue light may be used, or three types of infrared light (for example, first to third infrared light) may be used.
  • In this case, each pixel unit 130 includes six image pickup elements (light receiving elements) 131a to 131f arranged in a row in the main scanning direction and six bandpass filters provided corresponding to the six image pickup elements 131a to 131f, respectively.
  • The six bandpass filters are filters (optical filters) that transmit different wavelength ranges and include, in addition to the above-mentioned blue filter 132B, green filter 132G, and red filter 132R, a bandpass filter (hereinafter also referred to as a first infrared light filter) 132IR1 that transmits the first infrared light, a bandpass filter (hereinafter also referred to as a second infrared light filter) 132IR2 that transmits the second infrared light having a wavelength range different from that of the first infrared light, and a bandpass filter (hereinafter also referred to as a third infrared light filter) 132IR3 that transmits the third infrared light having a wavelength range different from those of the first and second infrared lights.
  • The blue filter 132B, the first infrared light filter 132IR1, the red filter 132R, the second infrared light filter 132IR2, the green filter 132G, and the third infrared light filter 132IR3 are arranged in this order in the main scanning direction. Each of the image pickup elements 131a to 131f receives light of a different wavelength band transmitted through the corresponding bandpass filter.
  • The specific wavelength range of each of the first to third infrared lights is not particularly limited, but the peak wavelengths of the first, second, and third infrared lights increase in this order. For example, the peak wavelengths of the first to third infrared lights may be 800 nm, 880 nm, and 950 nm, respectively.
  • FIG. 16 shows a case where the first to third infrared lights are used as the light in the three wavelength regions, and a composite image is generated by synthesizing the first infrared reflection image, the second infrared reflection image, and the third infrared reflection image.
  • Assuming that the pitch (distance) X of the bandpass filters of each color in the main scanning direction is 0.254 mm, and that the pitch X5 of the first infrared light filter 132IR1 and the second infrared light filter 132IR2 sandwiching the red filter 132R, the pitch X6 of the second infrared light filter 132IR2 and the third infrared light filter 132IR3 sandwiching the green filter 132G, and the pitch of the third infrared light filter 132IR3 and the first infrared light filter 132IR1 sandwiching the blue filter 132B are each 0.0847 mm, a monochrome image of each color can be acquired at a resolution of 100 dpi in the main scanning direction. Therefore, also in the example shown in FIG. 16, a monochrome image of each color and an RGB color image can be acquired at a resolution of 100 dpi in the main scanning direction, as in the case of FIG. 13.
  • In addition, the composite image obtained by synthesizing the first, second, and third infrared reflection images can be acquired at a resolution of 300 dpi in the main scanning direction, so it is possible to acquire a reflected image having a higher resolution than in the second embodiment.
  • In the above description, white light is generated by using an LED element that emits blue light together with a red phosphor and a green phosphor that are excited by that light and emit red and green light, respectively, and the bill BN is irradiated with it. Alternatively, white light may be generated by using an LED element that emits blue light and a yellow phosphor that is excited by that light and emits yellow, the complementary color, and the bill BN may be irradiated with it.
  • In either case, the banknote BN can be irradiated simultaneously with light in a plurality of mutually different wavelength regions. Even when an LED element that emits blue light and a yellow phosphor are used, the white light usually includes green light and red light, so that, as described above, blue light and green light, or red light and infrared light, can be used as the light in the first and second wavelength regions.
  • As described above, the composite image makes it possible to acquire a monochrome image with a higher resolution than each of the individual image data. Further, since image data of at least two wavelengths are acquired, more types (wavelengths) of images can be obtained than when the number of emissions of a specific type of light is increased to raise the resolution as described in Patent Document 1. This makes it possible to acquire a color image efficiently. Further, since each image data used for the composite image constitutes a reflection image, an accurate image of only one side of the bill can be obtained, in contrast to the case where a reflection image and a transmission image are used as described in Patent Document 2.
  • In each of the above embodiments, the case where the image recognition unit 14 recognizes the serial number or the face of a person printed on the bill has been described. However, if the paper sheets to which the present invention is applied are other than bills, a code such as a barcode or a QR code (registered trademark) attached to paper sheets such as gift certificates or casino barcode tickets may be recognized by the image recognition unit according to the present invention.
  • In this case, the image recognition unit according to the present invention reads the barcode portion of the composite image and recognizes the barcode of the paper sheet.
  • The barcode may be arranged in the longitudinal direction of the paper sheet, in which case it is preferable to transport the paper sheet in the longitudinal direction and acquire its image.
  • the image recognition unit may recognize a plurality of types of features attached to paper sheets, such as a serial number, a person's face, a barcode, and a QR code.
  • As described above, the present invention provides a technique useful for acquiring accurate high-resolution images while also acquiring various types of images.


Abstract

The present invention provides an image acquisition device, a paper sheet handling device, and an image acquisition method capable of acquiring various types of images and capable of acquiring more accurate high resolution images. The present invention is an image acquisition device for acquiring an image of a paper sheet being transferred, and is provided with: a light emitting unit which illuminates a paper sheet with light of a first wavelength range and light of a second wavelength range; a light reception unit which receives the reflected light resulting from the illumination of the paper sheet with the light of the first wavelength range emitted from the light emitting unit, and outputs a first image signal, and which also receives the reflected light resulting from the illumination of the paper sheet with the light of the second wavelength range emitted from the light emitting unit, and outputs a second image signal; and an image generation unit which generates first image data based on the first image signal and second image data based on the second image signal, and combines the first image data and the second image data to generate a composite image.

Description

Image acquisition device, paper sheet handling device, and image acquisition method
The present invention relates to an image acquisition device, a paper sheet handling device, and an image acquisition method. More specifically, the present invention relates to an image acquisition device, a paper sheet handling device, and an image acquisition method suitable for acquiring a high-resolution image of paper sheets.
At present, in paper sheet handling apparatuses, identification of the type of paper sheet, fitness determination, authenticity determination, and the like are performed in the process of transporting the paper sheets. Among these, the authenticity of a paper sheet is determined, for example, by analyzing an image of the paper sheet read by an optical line sensor. In this case, for example, the authenticity of the paper sheet is determined based on the arrangement and the number of digits of its recognized serial number. For serial number recognition, it is desirable that the image of the serial number portion have a high resolution.
For example, Patent Document 1 discloses a paper sheet identification device that controls the light emission of a light emitting means so that, among first to n-th lights (n is an integer of 2 or more) having different wavelengths within one lighting cycle, the number of emissions of at least one light differs from the number of emissions of the other lights. More specifically, it discloses an example in which the number of emissions of reflected green light is increased to acquire a high-resolution reflection image.
Further, for example, Patent Document 2 discloses a serial number recognition device that generates a composite image in which a reflection image, formed by light of a first light source reflected by a paper sheet, and a transmission image, formed by light of a second light source transmitted through the paper sheet, complement each other, and performs character recognition on the serial number region of the paper sheet in this composite image.
Patent Document 1: Japanese Patent No. 5005760
Patent Document 2: Japanese Unexamined Patent Publication No. 2016-218810
In recent years, there has been a growing need to capture a color image for serial number recognition, in order to recognize the color of the serial number and to separate the serial number from the background color. That is, it is required to irradiate paper sheets with more types (wavelengths) of light and to acquire more types (wavelengths) of images.
However, if the technique described in Patent Document 1 is used to acquire, for example, a color image, the number of types of light irradiating the paper sheet increases, so the number of emissions of a specific type of light such as reflected green must be reduced, and a high-resolution image can no longer be acquired. Moreover, with the technique of Patent Document 1, increasing the number of emissions of a specific type of light to obtain a high-resolution image eliminates a corresponding number of emissions of the other lights, leading to a lower resolution of the other images or a decrease in the number of types of images acquired.
Further, as described in Patent Document 2, the method of acquiring a composite image in which a reflection image and a transmission image complement each other may fail to produce an accurate image of only one side of the paper sheet. This is because the pattern on the other side can also appear in the transmission image. In addition, the transmittance of the light received to generate the transmission image varies with the thickness, material, and so on of the paper sheet, and the variation in sensor output is generally larger when receiving transmitted light than when receiving reflected light.
The present invention has been made in view of the above circumstances, and an object thereof is to provide an image acquisition device, a paper sheet handling device, and an image acquisition method capable of acquiring various types of images as well as more accurate, high-resolution images.
In order to solve the above problems and achieve the object, the present invention is an image acquisition device that acquires an image of a transported paper sheet, comprising: a light emitting unit that irradiates the paper sheet with light in a first wavelength region and light in a second wavelength region; a light receiving unit that receives the light in the first wavelength region emitted from the light emitting unit and reflected by the paper sheet and outputs a first image signal, and that receives the light in the second wavelength region emitted from the light emitting unit and reflected by the paper sheet and outputs a second image signal; and an image generation unit that generates first image data based on the first image signal and second image data based on the second image signal, and combines the first image data and the second image data to generate a composite image.
Further, in the above invention, the light emitting unit irradiates the paper sheet with the light in the first wavelength region and the light in the second wavelength region at different timings, and the image generation unit combines the first image data and the second image data in the transport direction of the paper sheet.
Further, in the above invention, the light emitting unit irradiates the paper sheet with the light in the second wavelength region after irradiating the paper sheet with the light in the first wavelength region, and irradiates the paper sheet with the light in the first wavelength region after irradiating the paper sheet with the light in the second wavelength region.
Further, in the above invention, the light emitting unit irradiates the paper sheet with the light in the first wavelength region and the light in the second wavelength region such that the time interval from the start of irradiation with the light in the first wavelength region to the start of irradiation with the light in the second wavelength region is equal to the time interval from the start of irradiation with the light in the second wavelength region to the start of irradiation with the light in the first wavelength region.
Further, in the above invention, the light emitting unit irradiates the paper sheet with the light in the first wavelength region and the light in the second wavelength region simultaneously, and the image generation unit combines the first image data and the second image data in the main scanning direction.
Further, in the above invention, the first image data and the second image data are gray-converted image data.
Further, in the above invention, the image acquisition device further comprises an image recognition unit that recognizes the composite image, and the image recognition unit recognizes, from the composite image, a serial number printed on the paper sheet.
Further, in the above invention, the image acquisition device further comprises an image recognition unit that recognizes the composite image, and the image recognition unit recognizes, from the composite image, a barcode printed on the paper sheet.
Further, in the above invention, the light in the first wavelength region is blue light, and the light in the second wavelength region is green light.
Further, in the above invention, the light in the first wavelength region is red light, and the light in the second wavelength region is infrared light.
Further, in the above invention, the light in the first wavelength region is first infrared light, and the light in the second wavelength region is second infrared light having a wavelength region different from that of the first infrared light.
Further, the present invention is a paper sheet handling device comprising the above image acquisition device.
Further, the present invention is an image acquisition method for acquiring an image of a transported paper sheet, comprising: a light receiving step of receiving light in a first wavelength region reflected by the paper sheet and outputting a first image signal, and of receiving light in a second wavelength region reflected by the paper sheet and outputting a second image signal; and an image generation step of generating first image data based on the first image signal and second image data based on the second image signal, and of combining the first image data and the second image data to generate a composite image.
According to the image acquisition device, the paper sheet handling device, and the image acquisition method of the present invention, various types of images can be acquired, and more accurate, high-resolution images can be acquired.
A diagram for explaining an outline of Embodiment 1.
A schematic perspective view showing the appearance of the banknote handling device according to Embodiment 1.
A schematic cross-sectional view illustrating the configuration of the imaging unit included in the banknote identification device (image acquisition device) according to Embodiment 1.
A block diagram illustrating the configuration of the banknote identification device (image acquisition device) according to Embodiment 1.
A timing chart showing an example of the control of each light source by the light source control unit and of the signal readout from each line sensor by the sensor control unit in Embodiment 1.
A timing chart showing another example of the control of each light source by the light source control unit and of the signal readout from each line sensor by the sensor control unit in Embodiment 1.
A timing chart showing still another example of the control of each light source by the light source control unit and of the signal readout from each line sensor by the sensor control unit in Embodiment 1.
A flowchart showing the procedure for acquiring a composite image with the banknote identification device (image acquisition device) and the image acquisition method according to Embodiment 1, and the main processing flow that follows.
A timing chart showing an example of the control of each light source by the light source control unit and of the signal readout from each line sensor by the sensor control unit in a modification of Embodiment 1.
A timing chart showing several examples of the control of each light source by the light source control unit and of the signal readout from each line sensor by the sensor control unit in Embodiment 1 and its modification.
A diagram for explaining an outline of Embodiment 2.
A schematic plan view showing the correspondence between the arrangement of the imaging elements of the light receiving unit of the optical line sensor according to Embodiment 2 and the bandpass filters.
A schematic plan view showing the arrangement of the bandpass filters in Embodiment 2, for the case where blue light and green light are used as the light in the first and second wavelength regions.
A schematic plan view showing the arrangement of the bandpass filters in Embodiment 2, for the case where red light and infrared light are used as the light in the first and second wavelength regions.
A schematic plan view showing the correspondence between the arrangement of the imaging elements of the light receiving unit of the optical line sensor according to a modification of Embodiment 2 and the bandpass filters.
A schematic plan view showing the arrangement of the bandpass filters in a modification of Embodiment 2, for the case where first to third infrared lights are used as the light in three wavelength regions.
Hereinafter, preferred embodiments of the image acquisition device, the paper sheet handling device, and the image acquisition method according to the present invention will be described with reference to the drawings. The present invention is applicable to various kinds of paper sheets, such as banknotes, checks, gift certificates, bills of exchange, business forms, securities, and card-like media; in the following, however, the present invention will be described taking a device and a method for handling banknotes as an example. In addition, as a preferred embodiment of the image acquisition device according to the present invention, a banknote identification device that also functions as an image acquisition device will be described. Note that the following description is merely one example of a banknote identification device (image acquisition device), a banknote handling device, and an image acquisition method.
In the present specification, a reflection image means an image based on the intensity distribution of light that is irradiated onto a paper sheet and reflected by the paper sheet.
(Embodiment 1)
<Outline of this embodiment>
First, an outline of the present embodiment will be described with reference to FIG. 1. As shown in FIG. 1, in the present embodiment, an optical line sensor repeatedly irradiates a banknote with light in a first wavelength region and light in a second wavelength region different from the first wavelength region to acquire a reflection image in each of the two colors, and these reflection images are gray-converted and then combined in the banknote transport direction (the sub-scanning direction of the optical line sensor) to generate a composite image. A monochrome image can therefore be acquired with a higher resolution in the transport direction than each individual reflection image, specifically twice the resolution. In addition, since reflection images at two wavelengths are acquired, more types (wavelengths) of images can be obtained than when the number of emissions of a specific type of light is increased to raise the resolution as described in Patent Document 1. This is a great advantage when a color image is to be acquired. Furthermore, since two reflection images are combined, a more accurate image of only one side of the banknote can be acquired than when a reflection image and a transmission image are used as described in Patent Document 2.
The light in the first and second wavelength regions used for the composite image is preferably light of close wavelengths. This makes it easier to obtain similar images after gray conversion, so that the composite image looks more natural. Suitable wavelength combinations include, for example, blue light and green light; red light and infrared light; and first infrared light and second infrared light having a wavelength region different from that of the first infrared light.
<Configuration of the banknote handling device>
Next, the configuration of the banknote handling device according to the present embodiment will be described with reference to FIG. 2. The banknote handling device according to the present embodiment may have, for example, the configuration shown in FIG. 2. The banknote handling device 300 shown in FIG. 2 is a compact banknote handling device installed on a table for use, and comprises: a banknote identification device (not shown in FIG. 2) that identifies banknotes; a hopper 301 on which a stack of banknotes to be processed is placed; two reject units 302 to which a banknote fed from the hopper 301 into the housing 310 is discharged when it is a reject note, such as a counterfeit note or a note whose authenticity cannot be determined; an operation unit 303 for inputting instructions from the operator; four stacking units 306a to 306d for sorting and stacking banknotes whose denomination, authenticity, and fitness have been identified in the housing 310; and a display unit 305 for displaying information such as the banknote identification and counting results and the stacking status of each of the stacking units 306a to 306d. Based on the fitness determination by the banknote identification device, fit notes are stored in the stacking units 306a to 306c and unfit notes are stored in the stacking unit 306d, out of the four stacking units 306a to 306d. The way banknotes are distributed to the stacking units 306a to 306d can be set arbitrarily.
<Configuration of the imaging unit>
Next, the configuration of the imaging unit, which is the main part of the banknote identification device according to the present embodiment, will be described with reference to FIG. 3. As shown in FIG. 3, the imaging unit 21 comprises optical line sensors 110 and 120 arranged facing each other. A gap through which the banknote BN is transported is formed between the optical line sensors 110 and 120, and this gap constitutes a part of the transport path 311 of the banknote handling device according to the present embodiment. The optical line sensors 110 and 120 are located above and below the transport path 311, respectively.
The optical line sensor 110 comprises two reflection light sources 111 as a light emitting unit, a condenser lens 112, and a light receiving unit 113. The reflection light sources 111 sequentially irradiate the main surface of the banknote BN facing the light receiving unit 113 (hereinafter, surface A) with light of predetermined wavelengths (invisible light such as infrared light, and visible light such as red, green, or blue monochromatic light or white light). The condenser lens 112 condenses the light emitted from the reflection light sources 111 and reflected by the banknote BN. The light receiving unit 113 comprises a plurality of imaging elements (light receiving elements, not shown) arranged in a line in the direction (main scanning direction) orthogonal to the transport direction (sub-scanning direction) of the banknote BN, receives the light condensed by the condenser lens 112, and converts it into an electric signal. The electric signal is then amplified, A/D-converted into digital data, and output as an image signal.
The optical line sensor 120 comprises two reflection light sources 121 as a light emitting unit, a condenser lens 122, and a light receiving unit 123. The reflection light sources 121 irradiate the main surface of the banknote BN facing the light receiving unit 123 (hereinafter, surface B) with light of predetermined wavelengths (invisible light such as infrared light, and visible light such as red, green, or blue monochromatic light or white light). The condenser lens 122 condenses the light emitted from the reflection light sources 121 and reflected by the banknote BN. The light receiving unit 123 comprises a plurality of imaging elements (light receiving elements, not shown) arranged in a line in the direction orthogonal to the transport direction of the banknote BN, receives the light condensed by the condenser lens 122, and converts it into an electric signal. The electric signal is then amplified, A/D-converted into digital data, and output as an image signal.
Each of the light sources 111 and 121 comprises a line-shaped light guide (not shown) extending in the direction perpendicular to the page of FIG. 3 (the main scanning direction), and a plurality of LED elements (not shown) provided at both ends (or at one end) of the light guide.
Each of the light sources 111 and 121 comprises, as its LED elements, a plurality of LED elements capable of emitting light in mutually different wavelength regions including at least the light in the first and second wavelength regions, and is configured to emit light in a selected wavelength region. Specifically, for example, each of the light sources 111 and 121 comprises LED elements that emit infrared light (IR) and LED elements that emit visible light (monochromatic light such as red, green, or blue, or white light), and irradiates the banknote BN with infrared light and visible light. Preferably, each of the light sources 111 and 121 comprises a plurality of LED elements that emit a plurality of types of infrared light (for example, three types of LED elements that emit first to third infrared lights, respectively), an LED element that emits red light in the wavelength band of 600 nm to 700 nm, an LED element that emits green light in the wavelength band of 500 nm to 600 nm, and an LED element that emits blue light in the wavelength band of 400 nm to 500 nm.
The optical line sensors 110 and 120 each repeatedly image the banknote BN being transported in the transport direction and output image signals, whereby the banknote identification device according to the present embodiment acquires an image of the entire banknote BN. The banknote identification device according to the present embodiment acquires a reflection image of surface A of the banknote BN based on the output signal of the optical line sensor 110, and acquires a reflection image of surface B of the banknote BN based on the output signal of the optical line sensor 120.
<Configuration of the banknote identification device (image acquisition device)>
Next, the configuration of the banknote identification device (image acquisition device) according to the present embodiment will be described with reference to FIG. 4. As shown in FIG. 4, the banknote identification device (image acquisition device) 1 according to the present embodiment comprises a control unit 10, a detection unit 20, and a storage unit 30.
The control unit 10 is composed of programs stored in the storage unit 30 for realizing various processes, a CPU (Central Processing Unit) that executes the programs, various hardware controlled by the CPU, and logic devices such as an FPGA (Field Programmable Gate Array). The control unit 10 controls each unit of the banknote identification device 1 according to the programs stored in the storage unit 30, based on the signals output from each unit of the banknote identification device 1 and on control signals from the control unit 10. The control unit 10 also functions, via the programs stored in the storage unit 30, as a light source control unit 11, a sensor control unit 12, an image generation unit 13, an image recognition unit 14, and an identification unit 15.
The detection unit 20 comprises, along the banknote transport path, a magnetism detection unit 22, a thickness detection unit 23, and a UV detection unit 24 in addition to the imaging unit 21 described above. The imaging unit 21 images banknotes as described above and outputs image signals (image data). The magnetism detection unit 22 comprises a magnetic sensor (not shown) that measures magnetism, and uses it to detect the magnetism of the magnetic ink, security thread, and the like printed on a banknote. The magnetic sensor is a magnetic line sensor in which a plurality of magnetism detecting elements are arranged in a line. The thickness detection unit 23 comprises a thickness detection sensor (not shown) that measures the thickness of a banknote, and uses it to detect tape, double feeding, and the like. The thickness detection sensor detects, with sensors provided on each of a pair of rollers facing each other across the transport path, the displacement of the rollers when a banknote passes between them. The UV detection unit 24 comprises an ultraviolet irradiation unit (not shown) and a light receiving unit (not shown), and detects, with the light receiving unit, the fluorescence generated when the ultraviolet irradiation unit irradiates a banknote with ultraviolet light, as well as the ultraviolet light transmitted through the banknote.
The storage unit 30 is composed of a non-volatile storage device such as a semiconductor memory or a hard disk, and stores various programs and data for controlling the banknote identification device 1. The storage unit 30 also stores, as imaging parameters, the wavelength regions of the light emitted from the light sources 111 and 121 during one cycle of imaging by the imaging unit 21, the timings at which the light sources 111 and 121 are turned on and off, the forward current values supplied to the LED elements of the light sources 111 and 121, the timings at which signals are read out from the optical line sensors 110 and 120, and so on.
Here, one cycle of imaging refers to an imaging pattern in which the wavelength regions of the light emitted from the light sources 111 and 121, the turning on and off of the light sources 111 and 121, the forward current values supplied to the LED elements, the signal readout timings, and so on are set. With one cycle of imaging as one period, the cycle is executed repeatedly and continuously to acquire an image of the entire banknote.
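For illustration only (code is not part of the disclosed embodiment), the cycle-based capture described above can be sketched in Python; `read_line` and the two-entry wavelength schedule are hypothetical stand-ins for the sensor hardware and the imaging parameters:

```python
from typing import Callable, Dict, List


def capture_document(num_periods: int,
                     schedule: List[str],
                     read_line: Callable[[str], list]) -> Dict[str, list]:
    """Repeat the imaging cycle once per line period: emit the wavelength
    scheduled for the current period, read one line from the sensor, and
    group the resulting line data by illumination condition."""
    lines_by_condition: Dict[str, list] = {w: [] for w in schedule}
    for period in range(num_periods):
        wavelength = schedule[period % len(schedule)]  # round-robin schedule
        lines_by_condition[wavelength].append(read_line(wavelength))
    return lines_by_condition
```

With a schedule of ["blue", "green"], six line periods yield three lines under each illumination condition, which corresponds to the alternating emission pattern of this embodiment.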
The light source control unit 11 performs dynamic lighting control in which the light sources 111 and 121 are turned on in turn so that an individual banknote image is captured under each of the light sources 111 and 121. Specifically, the light source control unit 11 controls the turning on and off of the light sources 111 and 121 based on the timings set in the imaging parameters. This control is performed using a mechanical clock that varies with the banknote transport speed and a system clock that is always output at a constant frequency regardless of the transport speed. The light source control unit 11 also sets the magnitude of the forward current supplied to each LED element based on the imaging parameters.
The sensor control unit 12 controls the timing of reading out image signals from the optical line sensors 110 and 120 based on the timings set in the imaging parameters, and reads out the image signals from each line sensor in synchronization with the on/off timings of the light sources 111 and 121. This control is performed using the mechanical clock and the system clock. The sensor control unit 12 then sequentially stores the read-out image signals, that is, the line data, in a ring buffer (line memory) of the storage unit 30.
Here, line data means the data based on the image signal obtained by one imaging operation of each of the optical line sensors 110 and 120, and corresponds to one row of the acquired image in the lateral direction (the direction orthogonal to the banknote transport direction).
The image generation unit 13 has a function of generating images based on the various banknote signals acquired from the detection unit 20. Specifically, the image generation unit 13 first decomposes the data (image signals) stored in the ring buffer into data for each combination of light emission and reception conditions: the received-light intensity data of the reflected first infrared light, of the reflected second infrared light, of the reflected third infrared light, of the reflected red light, of the reflected green light, and of the reflected blue light. The image generation unit 13 then performs correction processing such as dark output cancellation, gain adjustment, and bright output level correction according to the characteristics of each decomposed data set, generates the various banknote image data, and stores them in the storage unit 30 as raw image data.
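For illustration only (not part of the disclosed embodiment), the decomposition of the buffered line data into per-condition data can be sketched as follows; the sketch assumes, hypothetically, that lines were written to the buffer in a fixed round-robin order of illumination conditions:

```python
import numpy as np


def split_by_condition(buffered_lines: np.ndarray, conditions: list) -> dict:
    """Split a buffer of line data (one row per sensor readout) into one
    raw image per illumination condition, assuming the readouts were
    stored in a fixed round-robin order of the given conditions."""
    return {name: buffered_lines[i::len(conditions)]
            for i, name in enumerate(conditions)}
```

For example, a buffer filled in the order red, green, blue, red, green, blue splits into three two-line raw images, one per color.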
The image generation unit 13 also gray-converts the first raw image data, based on the received-light intensity data of the light in the first wavelength region, to generate first image data (hereinafter, first monochrome image data), and gray-converts the second raw image data, based on the received-light intensity data of the light in the second wavelength region, to generate second image data (hereinafter, second monochrome image data). More specifically, the raw image data of each color is converted so that the maximum output (255 digits) and the minimum output (0 digits) of the corresponding light receiving unit 113 or 123 map to white and black, respectively. At this time, the received-light intensity (output) of the raw image data for green light may be weighted higher than that for red, blue, or other light, in accordance with the relative luminous efficiency of human vision. The first monochrome image data and the second monochrome image data correspond to the first image data and the second image data described above, respectively.
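The gray conversion described above can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the function name, the use of NumPy, and the example gain value for green are assumptions; the 0–255 digit range and the optional green weighting come from the text.

```python
import numpy as np

def to_monochrome(raw, sensor_min=0, sensor_max=255, gain=1.0):
    """Map raw received-light intensities to an 8-bit grayscale line.

    sensor_min / sensor_max: output range of the light receiving unit
    (0 and 255 digits in the description above). gain: optional
    per-color weighting, e.g. raised above 1.0 for green light to
    reflect human relative luminous efficiency.
    """
    scaled = (raw.astype(np.float64) - sensor_min) / (sensor_max - sensor_min)
    scaled = np.clip(scaled * gain, 0.0, 1.0)
    return (scaled * 255).astype(np.uint8)
```

With `gain=1.0` the sensor minimum maps to black (0) and the sensor maximum to white (255); a green line converted with, say, `gain=1.2` comes out brighter than a red or blue line of equal raw intensity.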
The image generation unit 13 also functions as a super-resolution processing unit that combines two or more types of image data to increase the resolution in the sub-scanning direction: it combines the generated first monochrome image data and second monochrome image data in the banknote transport direction (sub-scanning direction) to generate a composite image. That is, the first monochrome image data and the second monochrome image data are merged row by row. In other words, the line data constituting the first monochrome image data and the line data constituting the second monochrome image data are arranged alternately in the sub-scanning direction to generate a single reflection image.
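The alternate-row arrangement described above can be sketched as a simple interleave along the sub-scanning axis. This is an illustrative NumPy sketch under the assumption that each monochrome image is a `(rows, cols)` array; the function name is hypothetical.

```python
import numpy as np

def interleave_subscan(first_mono, second_mono):
    """Alternate the line data of two monochrome images along the
    sub-scanning axis (axis 0), doubling the row count.

    The two images are assumed to have been captured at offset
    timings, so their rows interleave spatially on the banknote.
    """
    rows, cols = first_mono.shape
    assert second_mono.shape == (rows, cols)
    composite = np.empty((2 * rows, cols), dtype=first_mono.dtype)
    composite[0::2] = first_mono   # lines from the first wavelength region
    composite[1::2] = second_mono  # lines from the second wavelength region
    return composite
```

Two 100-row inputs yield a 200-row composite, i.e. double the sub-scanning resolution, matching the 100 dpi → 200 dpi figures given later in the text.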
The image generation unit 13 may generate the composite image concurrently with the generation of the first and second raw image data and of the first and second monochrome image data. That is, each time a row of line data of either raw image data is generated, the corresponding row of monochrome line data may be generated, and each time a row of monochrome line data is generated, it may be merged into the composite image in sequence.
Alternatively, the image generation unit 13 may generate the composite image after the generation of the first and second monochrome image data is completed. That is, the entire first monochrome image data and the entire second monochrome image data may be generated first, and these image data may then be combined to generate the composite image.
The image recognition unit 14 recognizes the composite image generated by the image generation unit 13. That is, it analyzes the composite image, extracts features, and recognizes objects. Specifically, for example, when a serial number is printed on a banknote, the image recognition unit 14 performs character recognition on the serial number portion of the composite image to recognize the serial number of the banknote. It may also recognize, for example, the face of a person printed on the banknote. The image recognition unit 14 stores the recognition results in the storage unit 30.
The identification unit 15 performs identification processing using the various banknote signals acquired from the detection unit 20. The identification unit 15 identifies at least the denomination and authenticity of the banknote. The identification unit 15 may also have a function of determining the fitness of the banknote. In that case, the identification unit 15 detects stains, folds, tears, and the like on the banknote, and detects tape or the like attached to the banknote from the banknote's thickness, thereby determining whether the banknote should be processed as a fit note that can be recirculated in the market or as an unfit note unsuitable for market circulation.
When the identification unit 15 uses images of the banknote captured by the imaging unit 21 to identify the denomination, authenticity, fitness, or the like, it uses the composite image generated by the image generation unit 13 and the recognition results obtained by the image recognition unit 14. For example, the identification unit 15 uses the serial number information of the banknote obtained by the image recognition unit 14 to verify the authenticity of the banknote.
<Control method of the light sources and control method of signal readout from the line sensors>
Next, the control of each light source 111 by the light source control unit 11 and the control of signal readout from the optical line sensor 110 by the sensor control unit 12 will be described with reference to FIG. 5. FIG. 5 shows the content and timing of light source lighting and signal readout. Since the optical line sensor 120 is controlled in the same manner as the optical line sensor 110, its description is omitted.
As shown in FIG. 5, at the imaging position of the optical line sensor 110, during one cycle corresponding to two periods of the mechanical clock (master clock: MCLK) generated by the rotary encoder, the light source 111 irradiates the transported banknote with the first infrared light, then with the second infrared light, whose wavelength region differs from that of the first infrared light, then with blue light, then with the third infrared light, whose wavelength region differs from those of the first and second infrared lights, then with red light, and finally with green light. While the banknote is illuminated, each imaging element of the light receiving unit 113 is exposed and accumulates charge. Each time the illuminating light is switched, the image signal produced under the light before the switch is read out from the optical line sensor 110. As a result, in one cycle the optical line sensor 110 sequentially acquires, in this order, one row of line data forming the reflection image of side A under the first infrared light (hereinafter, first infrared reflection image), one row forming the reflection image of side A under the second infrared light (hereinafter, second infrared reflection image), one row forming the reflection image of side A under blue light (hereinafter, blue reflection image), one row forming the reflection image of side A under the third infrared light (hereinafter, third infrared reflection image), one row forming the reflection image of side A under red light (hereinafter, red reflection image), and one row forming the reflection image of side A under green light (hereinafter, green reflection image).
The specific wavelength regions of the first to third infrared lights are not particularly limited, but the peak wavelengths of the first, second, and third infrared lights may increase in this order; for example, the peak wavelengths may be 800 nm for the first, 880 nm for the second, and 950 nm for the third.
By continuously repeating this one-cycle imaging, the first infrared reflection image, second infrared reflection image, blue reflection image, third infrared reflection image, red reflection image, and green reflection image of the entire side A of the banknote are acquired. A color image of the entire side A of the banknote can also be obtained from the red, green, and blue reflection images.
FIG. 5 shows the case where blue light and green light are used as the light in the first and second wavelength regions; the blue reflection image and the green reflection image are combined to generate the composite image.
For example, when the banknote is transported 0.254 mm per imaging cycle (two periods of the mechanical clock), emitting a given type of light once per cycle yields a monochrome image of that light at a resolution of 100 dpi in the sub-scanning direction. Therefore, in the example shown in FIG. 5, the monochrome images of the individual colors and the RGB color image can be acquired at 100 dpi in the sub-scanning direction, while the composite image obtained by combining the blue reflection image and the green reflection image can be acquired at 200 dpi in the sub-scanning direction.
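The resolution figures above follow directly from the transport pitch: 0.254 mm per cycle is exactly 1/100 inch, so one line per cycle gives 100 lines per inch in the sub-scanning direction, and interleaving two offset lines per cycle halves the effective pitch. A small check with the values taken from the text:

```python
MM_PER_INCH = 25.4
transport_per_cycle_mm = 0.254   # banknote travel per imaging cycle (from the text)

# one line of a given light per cycle -> lines per inch in the sub-scan direction
single_dpi = MM_PER_INCH / transport_per_cycle_mm
# two interleaved wavelength regions -> half the line pitch, double the dpi
composite_dpi = 2 * single_dpi

assert round(single_dpi) == 100
assert round(composite_dpi) == 200
```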
Another example of the control of each light source 111 by the light source control unit 11 in the present embodiment will be described with reference to FIG. 6. FIG. 6 shows the content and timing of light source lighting and signal readout, and is the same as FIG. 5 except that the content and timing of the light source lighting differ in part.
In the example shown in FIG. 6, at the imaging position of the optical line sensor 110, during one cycle the light source 111 irradiates the transported banknote with the first infrared light, then the second infrared light, then the third infrared light, then blue light, then green light, and finally red light. As a result, in one cycle the optical line sensor 110 sequentially acquires, in this order, one row of line data forming the first infrared reflection image, one row forming the second infrared reflection image, one row forming the third infrared reflection image, one row forming the blue reflection image, one row forming the green reflection image, and one row forming the red reflection image.
By continuously repeating this one-cycle imaging, the first infrared reflection image, second infrared reflection image, third infrared reflection image, blue reflection image, green reflection image, and red reflection image of the entire side A of the banknote are acquired.
FIG. 6 shows the case where the third infrared light and red light are used as the light in the first and second wavelength regions; the third infrared reflection image and the red reflection image are combined to generate the composite image.
In the example shown in FIG. 6, as in FIG. 5, the monochrome images of the individual colors and the RGB color image can be acquired at 100 dpi in the sub-scanning direction, while the composite image obtained by combining the third infrared reflection image and the red reflection image can be acquired at 200 dpi in the sub-scanning direction.
Yet another example of the control of each light source 111 by the light source control unit 11 in the present embodiment will be described with reference to FIG. 7. FIG. 7 shows the content and timing of light source lighting and signal readout, and is the same as FIG. 5 except that the content and timing of the light source lighting differ in part.
In the example shown in FIG. 7, at the imaging position of the optical line sensor 110, during one cycle the light source 111 irradiates the transported banknote with the third infrared light, then blue light, then the first infrared light, then green light, then red light, and finally the second infrared light. As a result, in one cycle the optical line sensor 110 sequentially acquires, in this order, one row of line data forming the third infrared reflection image, one row forming the blue reflection image, one row forming the first infrared reflection image, one row forming the green reflection image, one row forming the red reflection image, and one row forming the second infrared reflection image.
By continuously repeating this one-cycle imaging, the first infrared reflection image, second infrared reflection image, third infrared reflection image, blue reflection image, green reflection image, and red reflection image of the entire side A of the banknote are acquired.
FIG. 7 shows the case where the first and second infrared lights are used as the light in the first and second wavelength regions; the first infrared reflection image and the second infrared reflection image are combined to generate the composite image.
In the example shown in FIG. 7, as in FIG. 5, the monochrome images of the individual colors and the RGB color image can be acquired at 100 dpi in the sub-scanning direction, while the composite image obtained by combining the first infrared reflection image and the second infrared reflection image can be acquired at 200 dpi in the sub-scanning direction.
In the examples shown in FIGS. 5 to 7, the irradiation time is set to be the same for every type of light. That is, the charge accumulation time of each imaging element is set to be constant regardless of the type of light. However, the irradiation time of each type of light can be set as appropriate, and the irradiation times of different lights may differ.
However, the time interval T1 from the start of irradiation with light in the first wavelength region to the start of irradiation with light in the second wavelength region, and the time interval T2 from the start of irradiation with light in the second wavelength region to the next start of irradiation with light in the first wavelength region, are preferably set to the same length. Since the line data constituting the composite image can then be acquired at a uniform pitch, a more natural composite image with less distortion can be obtained.
Here, a plurality of time intervals (T1 and T2, or t1, t2, and t3 described later) being "the same" covers not only the case where they are exactly identical but also the case where deviations on the order of mechanical clock jitter occur between them.
<Processing flow of the banknote identification device>
Next, the processing flow of the banknote identification device 1, in particular the processing flow of the image acquisition method for acquiring a composite image of a banknote and the main processing that follows it, will be described with reference to FIG. 8. The image acquisition method using the optical line sensor 110 is described here; since the same applies when the optical line sensor 120 is used, its description is omitted. As shown in FIG. 8, first, the light receiving unit 113 receives light in the first wavelength region (for example, blue light) emitted from the light source 111 and reflected by the banknote and outputs an image signal S1, and also receives light in the second wavelength region (for example, green light) emitted from the light source 111 at a timing different from the light in the first wavelength region and reflected by the banknote and outputs an image signal S2 (light receiving step S11). The image signals S1 and S2 correspond to the first image signal and the second image signal described above, respectively.
Next, the sensor control unit 12 reads the image signals S1 and S2 from the optical line sensor 110 and sequentially stores them in the ring buffer of the storage unit 30 (image signal readout step S12).
Next, the image generation unit 13 gray-converts the raw image data based on the image signal S1 to generate the first image data (first monochrome image data), gray-converts the raw image data based on the image signal S2 to generate the second image data (second monochrome image data), and then combines the generated first and second monochrome image data in the banknote transport direction (sub-scanning direction) to generate a composite image (image generation step S13).
Next, the image recognition unit 14 recognizes the composite image generated by the image generation unit 13 (image recognition step S14).
The identification unit 15 then determines the denomination, authenticity, fitness, and the like of the banknote (identification step S15). At this time, the identification unit 15 uses the composite image generated by the image generation unit 13 and the recognition results of the image recognition unit 14.
As described above, in the present embodiment, the light receiving unit 113 receives light in the first wavelength region emitted from the reflection light source 111 and reflected by the banknote and outputs the image signal S1, receives light in the second wavelength region emitted from the reflection light source 111 at a different timing and reflected by the banknote and outputs the image signal S2, and the image generation unit 13 combines the first image data based on the image signal S1 and the second image data based on the image signal S2 in the banknote transport direction to generate a composite image. A monochrome image with a higher resolution than either the first or the second image data, specifically twice the resolution in the sub-scanning direction, can therefore be acquired.
In Embodiment 1 above, the case where the banknote is repeatedly irradiated with light in the first wavelength region and light in the second wavelength region in this alternating order has been described, but the lights in the first and second wavelength regions need not be irradiated one at a time in strict alternation. For example, at least one of the lights in the first and second wavelength regions may be irradiated onto the banknote consecutively, such as the light source 111 emitting in the order green → green → blue → green → green → blue.
In Embodiment 1 above, the case where the image generation unit 13 gray-converts the two types of image data before combining them has been described, but the two types of image data may also be combined without gray conversion.
In Embodiment 1 above, the case where two wavelengths of light (the first and second wavelength regions) are used to generate the composite image has been described, but the number of wavelengths used to generate the composite image is not limited to two and may be three or more. Specifically, for example, red, green, and blue light may be used, or three types of infrared light (for example, the first to third infrared lights) may be used.
Here, an example of the control of each light source by the light source control unit 11 and of signal readout from the optical line sensors 110 and 120 by the sensor control unit 12 when three wavelengths of light are used to generate the composite image will be described with reference to FIG. 9. FIG. 9 shows the content and timing of light source lighting and signal readout, and is the same as FIG. 5 except that the content and timing of the light source lighting differ in part. Since the optical line sensor 120 is controlled in the same manner as the optical line sensor 110, its description is omitted.
In the example shown in FIG. 9, at the imaging position of the optical line sensor 110, during one cycle the light source 111 irradiates the transported banknote with red light, then the first infrared light, then green light, then the second infrared light, then blue light, and finally the third infrared light. As a result, in one cycle the optical line sensor 110 sequentially acquires, in this order, one row of line data forming the red reflection image, one row forming the first infrared reflection image, one row forming the green reflection image, one row forming the second infrared reflection image, one row forming the blue reflection image, and one row forming the third infrared reflection image.
By continuously repeating this one-cycle imaging, the first infrared reflection image, second infrared reflection image, third infrared reflection image, blue reflection image, green reflection image, and red reflection image of the entire side A of the banknote are acquired.
FIG. 9 shows the case where the first to third infrared lights are used as the light in the three wavelength regions; the first, second, and third infrared reflection images are combined to generate the composite image.
In the example shown in FIG. 9, as in FIG. 5, the monochrome images of the individual colors and the RGB color image can be acquired at 100 dpi in the sub-scanning direction. In this example, however, the composite image obtained by combining the first, second, and third infrared reflection images can be acquired at 300 dpi in the sub-scanning direction, making it possible to acquire a reflection image with an even higher resolution than in Embodiment 1.
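The three-image composition generalizes the two-image case: with n wavelength regions captured at evenly offset timings, the composite cycles row-wise through the n images, multiplying the sub-scanning resolution by n (here 3 × 100 dpi = 300 dpi). An illustrative NumPy sketch, with the function name being an assumption:

```python
import numpy as np

def interleave_n(images):
    """Cycle row-wise through n equally shaped monochrome images,
    multiplying the sub-scan row count by n."""
    n = len(images)
    rows, cols = images[0].shape
    out = np.empty((n * rows, cols), dtype=images[0].dtype)
    for i, img in enumerate(images):
        out[i::n] = img  # every n-th row comes from the i-th wavelength region
    return out
```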
In the example shown in FIG. 9, as in FIG. 5, the irradiation time is set to be the same for every type of light. That is, the charge accumulation time of each imaging element is set to be constant regardless of the type of light. However, the irradiation time of each type of light can be set as appropriate, and the irradiation times of different lights may differ.
However, the time interval t1 from the start of irradiation with light in the first wavelength region (for example, the first infrared light) to the start of irradiation with light in the second wavelength region (for example, the second infrared light), the time interval t2 from the start of irradiation with light in the second wavelength region to the start of irradiation with light in the third wavelength region (for example, the third infrared light), and the time interval t3 from the start of irradiation with light in the third wavelength region to the next start of irradiation with light in the first wavelength region are preferably set to the same length. Since the line data constituting the composite image can then be acquired at a uniform pitch, a more natural composite image with less distortion can be obtained.
In Embodiment 1 above, the case where one imaging cycle spans two periods of the mechanical clock (master clock: MCLK) has been described, but one imaging cycle may instead span one period of the mechanical clock.
Here, several examples of the control of each light source by the light source control unit 11 and of signal readout from the optical line sensors 110 and 120 by the sensor control unit 12 when one imaging cycle spans one period of the mechanical clock will be described with reference to FIG. 10. FIG. 10 shows the content and timing of light source lighting and signal readout, and is the same as FIGS. 5 to 7 and 9 except that the content and timing of the light source lighting differ in part. Since the optical line sensor 120 is controlled in the same manner as the optical line sensor 110, its description is omitted.
In each example shown in FIG. 10, at the imaging position of the optical line sensor 110, during one cycle corresponding to one period of the mechanical clock generated by the rotary encoder, the light source 111 irradiates the transported banknote with blue light, red light, green light, the first infrared light, the second infrared light, and the third infrared light in the predetermined order shown in FIG. 10. The banknote is transported 0.254 mm per imaging cycle (one period of the mechanical clock).
Here, from the reference mechanical clock, a clock whose period is one half that of the mechanical clock (hereinafter abbreviated as 1/2MCLK) and a clock whose period is one third that of the mechanical clock (hereinafter abbreviated as 1/3MCLK) are generated. Then, with the rising edge of each clock as the reference, the intervals T1' and T2' of the periods of irradiation with light in the first and second wavelength regions are set to be the same (including substantially the same), and the intervals t1', t2' and t3' of the periods of irradiation with light in the three wavelength regions are likewise set to be the same (including substantially the same). The distance each color travels in the transport direction can thereby be made nearly equal.
The intervals T1' and T2' are set to the period from the rising edge of 1/2MCLK until a fixed time based on the system clock has elapsed. Similarly, the intervals t1', t2' and t3' are set to the period from the rising edge of 1/3MCLK until a fixed time based on the system clock has elapsed.
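As a simplified illustrative model (hypothetical names; real hardware would derive these edges in logic from the encoder clock rather than in software), the rising edges of the divided clocks that anchor each strobe can be computed as:

```python
def divided_edges(mclk_period_s: float, divisor: int, n_cycles: int):
    """Rising-edge times of a clock whose period is mclk_period_s / divisor.

    1/2MCLK corresponds to divisor=2 and 1/3MCLK to divisor=3; each
    wavelength's illumination is launched a fixed (system-clock-based)
    delay after one of these edges.
    """
    sub = mclk_period_s / divisor
    return [c * mclk_period_s + k * sub
            for c in range(n_cycles) for k in range(divisor)]

# One 1 ms MCLK period yields three 1/3MCLK edges per imaging cycle,
# so the three wavelength regions are strobed at a uniform 1/3 ms spacing.
edges = divided_edges(1e-3, 3, 1)
assert len(edges) == 3
assert abs((edges[1] - edges[0]) - (edges[2] - edges[1])) < 1e-12
```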
(Embodiment 2)
In the present embodiment, the features peculiar to this embodiment are mainly described, and detailed description of content overlapping with the first embodiment is omitted. Members having the same or similar functions in this embodiment and the first embodiment are given the same reference numerals, and detailed description of those members is omitted in this embodiment.
<Outline of this embodiment>
First, the outline of the present embodiment is described with reference to FIG. 11. As shown in FIG. 11, in the present embodiment, the optical line sensor simultaneously irradiates the banknote with light in a first wavelength region and light in a second wavelength region different from the first wavelength region, acquires a reflection image for each of the two colors, gray-converts these reflection images, and then combines them in the direction orthogonal to the banknote transport direction (the main scanning direction of the optical line sensor) to generate a composite image. A monochrome image with a higher resolution than each individual reflection image, specifically twice the resolution in the direction orthogonal to the transport direction, can therefore be acquired. In addition, as in the first embodiment, images of more types (wavelengths) can be acquired, as can a more accurate image of only one side of the banknote.
<Structure of imaging unit>
In the present embodiment, each of the light sources 111 and 121 includes an LED element that emits blue light in the 400 nm to 500 nm wavelength region, an LED element that emits green light in the 500 nm to 600 nm wavelength region, an LED element that emits red light in the 600 nm to 700 nm wavelength region, and an LED element that emits infrared light in the 700 nm to 1000 nm wavelength region, and simultaneously irradiates the banknote BN with white light containing the blue, green and red light, and with the infrared light.
As shown in FIG. 12, each of the light receiving units 113 and 123 includes a plurality of pixel unit units 130 arranged in a line in the direction (main scanning direction) orthogonal to the transport direction (sub-scanning direction) of the banknote BN. Each pixel unit unit 130 includes four image pickup elements (light receiving elements) 131a to 131d arranged in a row in the main scanning direction, and four bandpass filters provided in correspondence with the four image pickup elements 131a to 131d. The four bandpass filters are filters (optical filters) that transmit different wavelength regions: a bandpass filter 132B that transmits blue light in the 400 to 500 nm wavelength region (hereinafter also called the blue filter), a bandpass filter 132G that transmits green light in the 500 to 600 nm wavelength region (hereinafter also called the green filter), a bandpass filter 132R that transmits red light in the 600 to 700 nm wavelength region (hereinafter also called the red filter), and a bandpass filter 132IR that transmits infrared light in the 700 to 1000 nm wavelength region (hereinafter also called the infrared light filter). Here, the blue filter 132B, the red filter 132R, the green filter 132G and the infrared light filter 132IR are arranged in this order in the main scanning direction. Each of the image pickup elements 131a to 131d thus receives light of a different wavelength band transmitted through its corresponding bandpass filter.
Accordingly, even when light in the four wavelength regions is emitted from each of the light sources 111 and 121 onto the banknote BN with overlapping emission timing, and the light reflected from the banknote BN for all four wavelength regions is received simultaneously by the corresponding light receiving unit 113 or 123, the light is filtered by the bandpass filter corresponding to each wavelength region, so the received light intensity for each wavelength region can be acquired by the image pickup elements 131a to 131d.
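As an illustrative sketch only (hypothetical names; the B, R, G, IR order follows the filter arrangement described above), the per-filter separation amounts to de-interleaving the raw sensor line by filter position:

```python
def split_channels(raw_line, filters=("B", "R", "G", "IR")):
    """De-interleave one raw line read out from the line sensor.

    The sensor outputs one value per image pickup element, so a line over
    pixel unit units of four filters repeats B, R, G, IR along the main
    scanning direction; taking every 4th value recovers each channel.
    """
    n = len(filters)
    assert len(raw_line) % n == 0
    return {f: raw_line[i::n] for i, f in enumerate(filters)}

# Two pixel unit units' worth of raw values split into four channels.
channels = split_channels([10, 20, 30, 40, 11, 21, 31, 41])
assert channels["B"] == [10, 11]
assert channels["IR"] == [40, 41]
```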
<Structure of banknote identification device (image acquisition device)>
In the present embodiment, one line of line data read out by the sensor control unit 12 contains a plurality of types of line data that are distinguished by, among other things, the wavelength of the irradiated light.
Also in the present embodiment, the image generation unit 13 decomposes the data (image signals) stored in the ring buffer into received light intensity data of the light reflected under infrared irradiation, received light intensity data of the light reflected under red irradiation, received light intensity data of the light reflected under green irradiation, and received light intensity data of the light reflected under blue irradiation.
The image generation unit 13 also functions as a super-resolution processing unit that combines two or more types of image data to increase the resolution in the main scanning direction, and combines the generated first monochrome image data and second monochrome image data in the direction orthogonal to the banknote transport direction (the main scanning direction) to generate a composite image. That is, the first monochrome image data and the second monochrome image data are combined row by row; in other words, the first monochrome image data and the second monochrome image data are arranged alternately in the main scanning direction to generate a single reflection image.
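As an illustrative sketch only (hypothetical names; gray conversion is assumed to have been applied already), the main-scanning-direction combination can be expressed as a column interleave of two equally sized monochrome images:

```python
def interleave_main_scanning(img1, img2):
    """Merge two monochrome images into one image of double width.

    img1 and img2 are lists of rows. Because the two filters sit half a
    pixel pitch apart in the main scanning direction, alternating their
    columns yields twice the resolution in that direction.
    """
    out = []
    for row1, row2 in zip(img1, img2):
        merged = []
        for a, b in zip(row1, row2):
            merged.extend((a, b))
        out.append(merged)
    return out

composite = interleave_main_scanning([[1, 3]], [[2, 4]])
assert composite == [[1, 2, 3, 4]]
```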
<Specific example of composite image>
Next, specific examples of the combinations of light in the first and second wavelength regions used for the composite image are described with reference to FIGS. 13 and 14. FIG. 13 shows the case where blue light and green light are used as the light in the first and second wavelength regions, and a composite image is generated by combining the blue reflection image and the green reflection image.
For example, if the pitch (distance) X of the bandpass filters of each color in the main scanning direction is 0.254 mm, the pitch X1 between the green filter 132G and the blue filter 132B across the infrared light filter 132IR is 0.127 mm, and the pitch X2 between the blue filter 132B and the green filter 132G across the red filter 132R is 0.127 mm, a monochrome image of each color can be acquired at a resolution of 100 dpi in the main scanning direction. In the example shown in FIG. 13, therefore, the monochrome images of each color and the RGB color image can be acquired at 100 dpi in the main scanning direction, while the composite image combining the blue and green reflection images can be acquired at 200 dpi in the main scanning direction.
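The resolution figures follow directly from the filter pitch (25.4 mm per inch divided by the pitch in millimetres); a quick check, with a hypothetical helper name:

```python
def dpi_from_pitch(pitch_mm: float) -> float:
    """Sampling resolution implied by a sample pitch given in millimetres."""
    return 25.4 / pitch_mm

# A per-color pitch of 0.254 mm gives 100 dpi; halving the effective pitch
# to 0.127 mm by interleaving two colors doubles it to 200 dpi.
assert round(dpi_from_pitch(0.254)) == 100
assert round(dpi_from_pitch(0.127)) == 200
```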
FIG. 14 shows the case where red light and infrared light are used as the light in the first and second wavelength regions, and a composite image is generated by combining the red reflection image and the infrared reflection image.
Likewise, if the pitch (distance) X of the bandpass filters of each color in the main scanning direction is 0.254 mm, the pitch X3 between the red filter 132R and the infrared light filter 132IR across the green filter 132G is 0.127 mm, and the pitch X4 between the infrared light filter 132IR and the red filter 132R across the blue filter 132B is 0.127 mm, a monochrome image of each color can be acquired at a resolution of 100 dpi in the main scanning direction. In the example shown in FIG. 14, therefore, as in FIG. 13, the monochrome images of each color and the RGB color image can be acquired at 100 dpi in the main scanning direction, while the composite image combining the red and infrared reflection images can be acquired at 200 dpi in the main scanning direction.
As described above, the two types of bandpass filters that transmit the light in the first and second wavelength regions are preferably arranged at the same (including substantially the same) pitch in the main scanning direction, which makes it possible to obtain a more natural composite image with less distortion.
<Processing flow by banknote identification device>
The image acquisition method using the optical line sensor 110 is described here; since the same applies when the optical line sensor 120 is used, that description is omitted. In the present embodiment, in the light receiving step S11, the light receiving unit 113 receives the light in the first wavelength region (for example, blue light) emitted from the light source 111 and reflected by the banknote and outputs the image signal S1, and also receives the light in the second wavelength region (for example, green light) emitted from the light source 111 at the same timing as the light in the first wavelength region and reflected by the banknote and outputs the image signal S2.
Also in the present embodiment, in the image generation step S13, the image generation unit 13 combines the generated first monochrome image data and second monochrome image data in the main scanning direction to generate a composite image.
As described above, in the present embodiment, the light receiving unit 113 receives the light in the first wavelength region emitted from the reflection light source 111 and reflected by the banknote and outputs the image signal S1, and also receives the light in the second wavelength region emitted from the reflection light source 111 at the same timing and reflected by the banknote and outputs the image signal S2; the image generation unit 13 then combines the first image data based on the image signal S1 and the second image data based on the image signal S2 in the main scanning direction to generate a composite image. A monochrome image with a higher resolution than the first and second image data, specifically twice the resolution in the main scanning direction, can therefore be acquired.
In the present embodiment as well, the image generation unit 13 may combine the two types of image data without gray conversion.
Also in the present embodiment, the number of light wavelengths used to generate the composite image is not limited to two and may be three or more. Specifically, for example, red, green and blue light may be used, or three types of infrared light (for example, the first to third infrared lights) may be used.
Here, specific examples of the composite image generated by the image generation unit 13 when three wavelengths of light are used are described with reference to FIGS. 15 and 16.
As shown in FIG. 15, in this example each pixel unit unit 130 includes six image pickup elements (light receiving elements) 131a to 131f arranged in a row in the main scanning direction, and six bandpass filters provided in correspondence with the six image pickup elements 131a to 131f. The six bandpass filters are filters (optical filters) that transmit different wavelength regions: in addition to the blue filter 132B, green filter 132G and red filter 132R described above, they include a bandpass filter 132IR1 that transmits the first infrared light (hereinafter also called the first infrared light filter), a bandpass filter 132IR2 that transmits a second infrared light whose wavelength region differs from the first infrared light (hereinafter also called the second infrared light filter), and a bandpass filter 132IR3 that transmits a third infrared light whose wavelength region differs from the first and second infrared lights (hereinafter also called the third infrared light filter). Here, the blue filter 132B, the first infrared light filter 132IR1, the red filter 132R, the second infrared light filter 132IR2, the green filter 132G and the third infrared light filter 132IR3 are arranged in this order in the main scanning direction. Each of the image pickup elements 131a to 131f thus receives light of a different wavelength band transmitted through its corresponding bandpass filter.
Here too, the specific wavelength region of each of the first to third infrared lights is not particularly limited, but the peak wavelengths of the first, second and third infrared lights may increase in this order; for example, the peak wavelengths of the first to third infrared lights may be 800 nm, 880 nm and 950 nm, respectively.
FIG. 16 shows the case where the first to third infrared lights are used as the light in the three wavelength regions, and a composite image is generated by combining the first infrared reflection image, the second infrared reflection image and the third infrared reflection image.
For example, if the pitch (distance) X of the bandpass filters of each color in the main scanning direction is 0.254 mm, the pitch X5 between the first infrared light filter 132IR1 and the second infrared light filter 132IR2 across the red filter 132R is 0.0847 mm, the pitch X6 between the second infrared light filter 132IR2 and the third infrared light filter 132IR3 across the green filter 132G is 0.0847 mm, and the pitch X7 between the third infrared light filter 132IR3 and the first infrared light filter 132IR1 across the blue filter 132B is 0.0847 mm, a monochrome image of each color can be acquired at a resolution of 100 dpi in the main scanning direction. In the example shown in FIG. 16, therefore, as in FIG. 13, the monochrome images of each color and the RGB color image can be acquired at 100 dpi in the main scanning direction, while the composite image combining the first, second and third infrared reflection images can be acquired at 300 dpi in the main scanning direction, making it possible to acquire a reflection image of even higher resolution than in the two-wavelength examples described above.
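As an illustrative sketch only (hypothetical names; 0.0847 mm is the one-third pitch quoted above), extending the two-image interleave to three images follows the same pattern, and the pitch arithmetic confirms the 300 dpi figure:

```python
def interleave3(imgs):
    """Column-interleave three equally sized monochrome images.

    imgs is a list of three images, each a list of rows; the output rows
    cycle through the three sources column by column, tripling the
    main-scanning resolution.
    """
    out = []
    for rows in zip(*imgs):          # corresponding row from each image
        merged = []
        for cols in zip(*rows):      # corresponding column from each row
            merged.extend(cols)
        out.append(merged)
    return out

assert interleave3([[[1, 4]], [[2, 5]], [[3, 6]]]) == [[1, 2, 3, 4, 5, 6]]
# 0.254 mm / 3 ≈ 0.0847 mm, i.e. 25.4 / 0.0847 ≈ 300 dpi.
assert round(25.4 / 0.0847) == 300
```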
In the second embodiment described above, the case where an LED element that emits blue light, an LED element that emits green light and an LED element that emits red light are made to emit simultaneously so as to irradiate the banknote BN with white light was described. However, white light may instead be generated, for example, using an LED element that emits blue light together with a red phosphor and a green phosphor that are excited by that light to emit red and green respectively, or using an LED element that emits blue light together with a yellow phosphor that is excited by that light to emit the complementary color yellow, and irradiated onto the banknote BN. In either case, light in a plurality of mutually different wavelength regions can be irradiated onto the banknote BN simultaneously. Even when an LED element emitting blue light and a yellow phosphor are used, the resulting white light usually also contains green light and red light, so it remains possible, as described above, to use blue light and green light, or red light and infrared light, as the light in the first and second wavelength regions.
As described above, in the first and second embodiments, a composite image is generated by combining a plurality of image data based on light in a plurality of wavelength regions reflected by the banknote, so a monochrome image with a higher resolution than each individual image data can be acquired. Since image data of at least two wavelengths are acquired, images of more types (wavelengths) can be acquired than when the resolution is raised by increasing the number of emissions of one particular type of light as described in Patent Document 1, which makes it possible to acquire a color image efficiently. Furthermore, since each image data used for the composite image constitutes a reflection image, a more accurate image of only one side of the banknote can be acquired than when a reflection image and a transmission image are used as described in Patent Document 2.
In the first and second embodiments, the case where the image recognition unit 14 recognizes the serial number or the face of a person printed on the banknote was described. However, if the paper sheets to which the present invention is applied are other than banknotes, a code such as a barcode or a QR code (registered trademark) attached to paper sheets such as gift certificates or casino barcode tickets may be recognized by the image recognition unit according to the present invention. For example, when a barcode is printed on the paper sheet, the image recognition unit according to the present invention reads the barcode portion of the composite image and recognizes the barcode attached to the paper sheet. The barcode may be arranged along the longitudinal direction of the paper sheet, in which case the paper sheet is preferably transported in its longitudinal direction and its image acquired; a high-resolution image of the barcode can thereby be acquired. That is, the paper sheet is preferably transported in the direction orthogonal to the bars of the barcode and its image acquired. The image recognition unit according to the present invention may recognize plural types of features attached to paper sheets, such as serial numbers, faces of persons, barcodes and QR codes.
The embodiments of the present invention have been described above with reference to the drawings, but the present invention is not limited to the above embodiments. The configurations of the embodiments may be combined or modified as appropriate without departing from the gist of the present invention.
As described above, the present invention is a technique useful for acquiring accurate, high-resolution images while acquiring various types of images.
1: Banknote identification device (image acquisition device)
10: Control unit
11: Light source control unit
12: Sensor control unit
13: Image generation unit
14: Image recognition unit
15: Identification unit
20: Detection unit
21: Imaging unit
22: Magnetic detection unit
23: Thickness detection unit
24: UV detection unit
30: Storage unit
110, 120: Optical line sensor
111, 121: Reflection light source
112, 122: Condensing lens
113, 123: Light receiving unit
131a to 131f: Image pickup element (light receiving element)
132B: Blue filter
132R: Red filter
132G: Green filter
132IR: Infrared light filter
132IR1: First infrared light filter
132IR2: Second infrared light filter
132IR3: Third infrared light filter
300: Banknote processing device
301: Hopper
302: Reject unit
303: Operation unit
305: Display unit
306a to 306d: Stacking unit
310: Housing
311: Transport path
BN: Banknote

Claims (13)

  1. An image acquisition device for acquiring an image of a transported paper sheet, comprising:
    a light emitting unit that irradiates the paper sheet with light in a first wavelength region and light in a second wavelength region;
    a light receiving unit that receives the light in the first wavelength region emitted from the light emitting unit and reflected by the paper sheet and outputs a first image signal, and receives the light in the second wavelength region emitted from the light emitting unit and reflected by the paper sheet and outputs a second image signal; and
    an image generation unit that generates first image data based on the first image signal and second image data based on the second image signal, and combines the first image data and the second image data to generate a composite image.
  2. The image acquisition device according to claim 1, wherein the light emitting unit irradiates the paper sheet with the light in the first wavelength region and the light in the second wavelength region at different timings, and
    the image generation unit combines the first image data and the second image data in the transport direction of the paper sheet.
  3. The image acquisition device according to claim 2, wherein the light emitting unit irradiates the paper sheet with the light in the second wavelength region after irradiating the paper sheet with the light in the first wavelength region, and irradiates the paper sheet with the light in the first wavelength region after irradiating the paper sheet with the light in the second wavelength region.
  4. The image acquisition device according to claim 3, wherein the light emitting unit irradiates the paper sheet with the light in the first wavelength region and the light in the second wavelength region such that the time interval from the start of irradiation with the light in the first wavelength region to the start of irradiation with the light in the second wavelength region is the same as the time interval from the start of irradiation with the light in the second wavelength region to the start of irradiation with the light in the first wavelength region.
  5. The image acquisition device according to claim 1, wherein the light emitting unit simultaneously irradiates the paper sheet with the light in the first wavelength region and the light in the second wavelength region, and
    the image generation unit combines the first image data and the second image data in the main scanning direction.
  6. The image acquisition device according to any one of claims 1 to 5, wherein the first image data and the second image data are gray-converted image data.
  7. The image acquisition device according to any one of claims 1 to 6, further comprising an image recognition unit that recognizes the composite image,
    wherein the image recognition unit recognizes, from the composite image, a serial number attached to the paper sheet.
  8. The image acquisition device according to any one of claims 1 to 6, further comprising an image recognition unit that recognizes the composite image,
    wherein the image recognition unit recognizes, from the composite image, a barcode attached to the paper sheet.
  9. The image acquisition device according to any one of claims 1 to 8, wherein the light in the first wavelength region is blue light, and
    the light in the second wavelength region is green light.
  10. The image acquisition device according to any one of claims 1 to 8, wherein the light in the first wavelength region is red light and the light in the second wavelength region is infrared light.
  11. The image acquisition device according to any one of claims 1 to 8, wherein the light in the first wavelength region is first infrared light and the light in the second wavelength region is second infrared light having a wavelength region different from that of the first infrared light.
  12. A paper sheet handling apparatus comprising the image acquisition device according to any one of claims 1 to 11.
  13. An image acquisition method for acquiring an image of a paper sheet being transported, the method comprising:
    a light receiving step of receiving light in a first wavelength region reflected by the paper sheet and outputting a first image signal, and receiving light in a second wavelength region reflected by the paper sheet and outputting a second image signal; and
    an image generation step of generating first image data based on the first image signal and second image data based on the second image signal, and synthesizing the first image data and the second image data to generate a composite image.
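The two steps of claim 13 can be sketched end to end. Signal decoding and the synthesis rule (here, interleaving lines in capture order, consistent with the alternating illumination of the earlier claims) are illustrative assumptions; the patent does not prescribe this exact realization.

```python
def acquire_composite_image(first_signal, second_signal):
    """Sketch of claim 13's image generation step.

    Turns the two image signals into image data (here a trivial copy
    stands in for real signal processing) and synthesizes them line by
    line into a composite image with doubled line count.
    """
    first_data = [list(line) for line in first_signal]    # stand-in for
    second_data = [list(line) for line in second_signal]  # data generation
    composite = []
    for l1, l2 in zip(first_data, second_data):
        composite.append(l1)  # line captured under the first wavelength
        composite.append(l2)  # line captured under the second wavelength
    return composite
```

Fed one line per signal, e.g. `[[1, 2]]` and `[[3, 4]]`, the sketch returns the two-line composite `[[1, 2], [3, 4]]`.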

PCT/JP2019/015949 2019-04-12 2019-04-12 Image acquisition device, paper sheet handling device, and image acquisition method WO2020208806A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2019/015949 WO2020208806A1 (en) 2019-04-12 2019-04-12 Image acquisition device, paper sheet handling device, and image acquisition method
JP2021513133A JP7218429B2 (en) 2019-04-12 2019-04-12 Image Acquisition Apparatus, Paper Sheet Processing Apparatus, and Image Acquisition Method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/015949 WO2020208806A1 (en) 2019-04-12 2019-04-12 Image acquisition device, paper sheet handling device, and image acquisition method

Publications (1)

Publication Number Publication Date
WO2020208806A1 true WO2020208806A1 (en) 2020-10-15

Family

ID=72751186

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/015949 WO2020208806A1 (en) 2019-04-12 2019-04-12 Image acquisition device, paper sheet handling device, and image acquisition method

Country Status (2)

Country Link
JP (1) JP7218429B2 (en)
WO (1) WO2020208806A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4050580A1 (en) 2021-02-25 2022-08-31 Glory Ltd. Sheet recognition unit and sheet recognition method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008257395A (en) * 2007-04-03 2008-10-23 Toshiba Corp Paper leaf processing unit
JP2010225013A (en) * 2009-03-25 2010-10-07 Hitachi Omron Terminal Solutions Corp Serial number recognition device, paper sheet processor, automatic transaction processor, and serial number recognition method
JP2016053783A (en) * 2014-09-03 2016-04-14 グローリー株式会社 Light receiving sensor, sensor module, and paper sheet processor
JP2016218810A (en) * 2015-05-22 2016-12-22 沖電気工業株式会社 Serial number recognition device, medium classification device, automated transaction device, serial number management device, paper sheet processing device, serial number management system, and serial number management program
JP2018169881A (en) * 2017-03-30 2018-11-01 グローリー株式会社 Serial number reading device, paper sheet identification device, paper sheet processing device, and serial number reading method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4703403B2 (en) * 2004-02-12 2011-06-15 日本電産コパル株式会社 Inspection device

Also Published As

Publication number Publication date
JPWO2020208806A1 (en) 2021-11-04
JP7218429B2 (en) 2023-02-06

Similar Documents

Publication Publication Date Title
EP2166515B1 (en) Paper-sheet recognition apparatus, paper-sheet processing apparatus, and paper-sheet recognition method
US7584890B2 (en) Validator linear array
CN102224530A (en) Determining document fitness using sequenced illumination
US10677646B2 (en) Light receiving sensor, sensor module, and paper sheet handling apparatus
JP4703403B2 (en) Inspection device
EP3598401B1 (en) Paper sheet detection device, paper sheet processing apparatus, and paper sheet detection method
US8989433B2 (en) Paper sheet recognition apparatus and paper sheet recognition method
JP2018169881A (en) Serial number reading device, paper sheet identification device, paper sheet processing device, and serial number reading method
EP3723054A1 (en) Banknote recognition unit, banknote handling device, and banknote recognition method
EP3680867B1 (en) Image acquisition device, sheet handling device, banknote handling device, and image acquisition method
WO2020208806A1 (en) Image acquisition device, paper sheet handling device, and image acquisition method
US20230015962A1 (en) Optical sensor and sheet recognition unit
JP3736028B2 (en) Bill discrimination device
JP7473677B2 (en) Optical sensor, paper sheet identification device, paper sheet processing device, and optical detection method
JP7337572B2 (en) Serial number reading device, paper sheet processing device, and serial number reading method
WO2019194152A1 (en) Light detection sensor, light detection device, sheets processing device, and light detection method
WO2019082251A1 (en) Optical sensor, optical sensor module, and paper processing device
WO2022210372A1 (en) Multifeed detection device and multifeed detection method
WO2021167004A1 (en) Optical sensor, paper sheet identification device, and paper sheet processing device
JP7496744B2 (en) Paper sheet identification device, paper sheet processing device, and paper sheet identification method
WO2023176530A1 (en) Paper sheet identifying device, paper sheet processing device, and paper sheet identification method
JP2023137760A (en) Paper sheet identification apparatus, paper sheet processing device, and paper sheet identification method
JP2022045035A (en) Paper sheet identification device, paper sheet processing device, and paper sheet identification method
JP2020047304A (en) Sensor module

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19924232

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021513133

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19924232

Country of ref document: EP

Kind code of ref document: A1