WO2020208806A1 - Image acquisition device, paper sheet handling device, and image acquisition method - Google Patents
Image acquisition device, paper sheet handling device, and image acquisition method
- Publication number
- WO2020208806A1 (PCT application PCT/JP2019/015949; JP2019015949W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- image
- wavelength region
- image data
- image acquisition
- Prior art date
Classifications
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07D—HANDLING OF COINS OR VALUABLE PAPERS, e.g. TESTING, SORTING BY DENOMINATIONS, COUNTING, DISPENSING, CHANGING OR DEPOSITING
- G07D7/00—Testing specially adapted to determine the identity or genuineness of valuable papers or for segregating those which are unacceptable, e.g. banknotes that are alien to a currency
- G07D7/06—Testing specially adapted to determine the identity or genuineness of valuable papers or for segregating those which are unacceptable, e.g. banknotes that are alien to a currency using wave or particle radiation
- G07D7/12—Visible light, infrared or ultraviolet radiation
- G07D7/121—Apparatus characterised by sensor details
Definitions
- the present invention relates to an image acquisition device, a paper sheet handling device, and an image acquisition method. More specifically, the present invention relates to an image acquisition device, a paper sheet handling device, and an image acquisition method suitable for acquiring a high-resolution image of paper sheets.
- the authenticity of paper sheets is determined, for example, by analyzing an image of the paper sheets read by an optical line sensor.
- the authenticity of a paper sheet may also be determined based on the arrangement and the number of digits of the recognized serial number. To perform serial number recognition, it is desirable that the image of the serial number portion have a high resolution.
- for example, Patent Document 1 discloses a paper sheet identification device that controls the light emission of the light emitting means so that, within one lighting cycle, the number of emissions of at least one of the first to n-th lights (n being an integer of 2 or more) having different wavelengths differs from the number of emissions of the other lights. More specifically, it discloses an example of acquiring a high-resolution reflected image by increasing the number of emissions of green light for the reflected image.
- Patent Document 2 discloses a serial number recognition device that generates a composite image in which a reflected image, obtained from light of a first light source reflected by the paper sheet, and a transmitted image, obtained from light of a second light source transmitted through the paper sheet, complement each other, and that performs character recognition on the serial number region of the paper sheet in the composite image.
- with the method of Patent Document 2, in which a composite image is acquired by complementing the reflected image with the transmitted image, an accurate image of only one surface of the paper sheet may not be obtained. This is because the pattern on the other surface can also appear in the transmitted image.
- moreover, the transmittance of the light received to generate the transmitted image fluctuates depending on the thickness, material, and the like of the paper sheet, so the fluctuation of the sensor output is generally larger than when receiving reflected light.
- the present invention has been made in view of the above situation, and aims to provide an image acquisition device, a paper sheet handling device, and an image acquisition method capable of acquiring various images as well as more accurate high-resolution images.
- the present invention is an image acquisition device for acquiring an image of a transported paper sheet, comprising: a light emitting unit that irradiates the paper sheet with light in a first wavelength region and light in a second wavelength region; a light receiving unit that receives the light in the first wavelength region emitted from the light emitting unit and reflected by the paper sheet and outputs a first image signal, and that receives the light in the second wavelength region emitted from the light emitting unit and reflected by the paper sheet and outputs a second image signal; and an image generation unit that generates first image data based on the first image signal and second image data based on the second image signal, and synthesizes the first image data and the second image data to generate a composite image.
- in the above invention, the light emitting unit irradiates the paper sheet with the light in the first wavelength region and the light in the second wavelength region at different timings, and the image generation unit combines the first image data and the second image data in the transport direction of the paper sheet.
- in the above invention, the light emitting unit irradiates the paper sheet with the light in the first wavelength region and then with the light in the second wavelength region, and after irradiating the paper sheet with the light in the second wavelength region, irradiates it again with the light in the first wavelength region.
- in the above invention, the light emitting unit irradiates the paper sheet with the light in the first wavelength region and the light in the second wavelength region such that the time interval from the start of irradiation of the light in the first wavelength region to the start of irradiation of the light in the second wavelength region is the same as the time interval from the start of irradiation of the light in the second wavelength region to the start of irradiation of the light in the first wavelength region.
- in the above invention, the light emitting unit simultaneously irradiates the paper sheet with the light in the first wavelength region and the light in the second wavelength region, and the image generation unit combines the first image data and the second image data in the main scanning direction.
- the present invention is characterized in that, in the above invention, the first image data and the second image data are gray-converted image data.
- the image acquisition device further includes an image recognition unit that recognizes the composite image, and the image recognition unit recognizes, from the composite image, a serial number assigned to the paper sheet.
- the image acquisition device further includes an image recognition unit that recognizes the composite image, and the image recognition unit recognizes, from the composite image, a barcode attached to the paper sheet.
- the present invention is characterized in that, in the above invention, the light in the first wavelength region is blue light and the light in the second wavelength region is green light.
- the present invention is characterized in that, in the above invention, the light in the first wavelength region is red light, and the light in the second wavelength region is infrared light.
- in the above invention, the light in the first wavelength region is a first infrared light, and the light in the second wavelength region is a second infrared light having a wavelength region different from that of the first infrared light.
- the present invention is also a paper sheet handling device including the above image acquisition device.
- the present invention is also an image acquisition method for acquiring an image of a transported paper sheet, including: a light receiving step of receiving light in a first wavelength region reflected by the paper sheet and outputting a first image signal, and of receiving light in a second wavelength region reflected by the paper sheet and outputting a second image signal; and an image generation step of generating first image data based on the first image signal and second image data based on the second image signal, and of synthesizing the first image data and the second image data to generate a composite image.
- according to the image acquisition device, the paper sheet handling device, and the image acquisition method of the present invention, various images can be acquired, and more accurate high-resolution images can be acquired.
- It is a schematic cross-sectional view illustrating the configuration of an imaging unit included in the banknote identification device (image acquisition device) according to the first embodiment. It is a block diagram illustrating the configuration of the banknote identification device (image acquisition device) according to the first embodiment.
- FIG. 5 is a timing chart showing an example of control of each light source by the light source control unit and control of signal reading from each line sensor by the sensor control unit in the first embodiment.
- FIG. 5 is a timing chart showing another example of control of each light source by the light source control unit and control of signal reading from each line sensor by the sensor control unit in the first embodiment.
- It is a timing chart showing still another example of control of each light source by the light source control unit and control of signal reading from each line sensor by the sensor control unit in the first embodiment.
- It is a flowchart showing the procedure for acquiring the composite image in the banknote identification device (image acquisition device) and the image acquisition method according to the first embodiment, and the main processing flow thereafter. It is a timing chart showing an example of control of each light source by the light source control unit and control of signal reading from each line sensor by the sensor control unit in a modification of the first embodiment.
- FIG. 9 is a timing chart showing a plurality of examples of control of each light source by a light source control unit and control of signal reading from each line sensor by a sensor control unit in the first embodiment and its modification. It is a diagram for explaining the outline of the second embodiment.
- It is a schematic plan view showing the correspondence between the arrangement of the image pickup elements of the light receiving unit of the optical line sensor according to the second embodiment and the bandpass filters. It is a schematic plan view showing the arrangement of the bandpass filters in the second embodiment, for the case where blue light and green light are used as the light in the first and second wavelength regions.
- the reflected image means an image based on the intensity distribution of light reflected by the paper sheet when the paper sheet is irradiated with light.
- the optical line sensor repeatedly irradiates the bill with light in the first wavelength region and light in a second wavelength region different from the first wavelength region.
- A reflection image of each color is acquired, and these reflection images are gray-converted and then combined in the banknote transport direction (the sub-scanning direction of the optical line sensor) to generate a composite image. It is therefore possible to acquire a monochrome image with a higher resolution, specifically twice the resolution in the banknote transport direction, compared with each individual reflection image.
- Suitable wavelength combinations include, for example, a combination of blue light and green light, a combination of red light and infrared light, and a combination of a first infrared light and a second infrared light having a wavelength region different from that of the first infrared light.
- the banknote processing device 300 shown in FIG. 2 is a small banknote processing device installed and used on a table, and includes a banknote identification device (not shown in FIG. 2) that performs banknote identification processing, a hopper 301 on which a stack of banknotes to be processed is placed, and a section to which banknotes drawn from the hopper 301 into the housing 310 are discharged when they are rejected notes, such as counterfeit notes or notes whose authenticity is uncertain.
- the method of distributing banknotes to the accumulation units 306a to 306d can be arbitrarily set.
- the imaging unit 21 includes optical line sensors 110 and 120 arranged so as to face each other.
- a gap for transporting the bill BN is formed between the optical line sensors 110 and 120, and this gap constitutes a part of the transport path 311 of the bill processing device according to the present embodiment.
- the optical line sensors 110 and 120 are located above and below the transport path 311 respectively.
- the optical line sensor 110 includes two reflection light sources 111 as light emitting units, a condenser lens 112, and a light receiving unit 113.
- the reflection light source 111 sequentially irradiates the main surface (hereinafter, surface A) of the banknote BN on the light receiving unit 113 side with light of predetermined wavelengths (invisible light such as infrared light, and visible light such as red, green, or blue monochromatic light or white light).
- the condenser lens 112 collects the light emitted from the reflection light source 111 and reflected by the bill BN.
- the light receiving unit 113 includes a plurality of image pickup elements (light receiving elements, not shown) arranged in a line in the direction (main scanning direction) orthogonal to the transport direction (sub-scanning direction) of the banknote BN, receives the light collected by the condenser lens 112, and converts it into an electric signal. The electric signal is then amplified, A/D converted into digital data, and output as an image signal.
- the optical line sensor 120 includes two reflection light sources 121 as light emitting units, a condenser lens 122, and a light receiving unit 123.
- the reflection light source 121 sequentially irradiates the main surface (hereinafter, surface B) of the banknote BN on the light receiving unit 123 side with light of predetermined wavelengths (invisible light such as infrared light, and visible light such as red, green, or blue monochromatic light or white light).
- the condenser lens 122 collects the light emitted from the reflection light source 121 and reflected by the bill BN.
- the light receiving unit 123 includes a plurality of image pickup elements (light receiving elements, not shown) arranged in a line in the direction orthogonal to the transport direction of the banknote BN, receives the light collected by the condenser lens 122, and converts it into an electric signal. The electric signal is then amplified, A/D converted into digital data, and output as an image signal.
- the light sources 111 and 121 each include a line-shaped light guide (not shown) extending in the direction perpendicular to the plane of FIG. 3 (main scanning direction) and a plurality of LED elements (not shown) provided at both ends (or at one end) of the light guide.
- Each of the light sources 111 and 121 includes a plurality of LED elements capable of emitting light in mutually different wavelength regions, including at least light in the first and second wavelength regions, and is configured to be able to irradiate light in a selected wavelength region.
- each of the light sources 111 and 121 includes an LED element that emits infrared light (IR) and an LED element that emits visible light (monochromatic light such as red, green, or blue, or white light), and emits the infrared light and the visible light toward the banknote BN.
- each of the light sources 111 and 121 includes a plurality of LED elements that emit a plurality of types of infrared light (for example, three types of LED elements emitting the first to third infrared lights, respectively), an LED element that emits red light in a wavelength band of 600 nm to 700 nm, an LED element that emits green light in a wavelength band of 500 nm to 600 nm, and an LED element that emits blue light in a wavelength band of 400 nm to 500 nm.
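- The LED channels listed above can be summarized in a small configuration table. The following is a minimal sketch in Python; the channel names are hypothetical, the visible bands are those stated above, and the infrared peak wavelengths are the 800/880/950 nm example values given later in the description.

```python
# Hypothetical summary of the LED channels described above.
# Visible bands are taken from the text; the IR peaks are the later example values.
LED_CHANNELS = {
    "blue":  {"band_nm": (400, 500)},
    "green": {"band_nm": (500, 600)},
    "red":   {"band_nm": (600, 700)},
    "ir1":   {"peak_nm": 800},
    "ir2":   {"peak_nm": 880},
    "ir3":   {"peak_nm": 950},
}
```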
- the optical line sensors 110 and 120 each repeatedly image the banknote BN being transported in the transport direction and output image signals, whereby the banknote identification device according to the present embodiment acquires an image of the entire banknote BN.
- the banknote identification device according to the present embodiment acquires a reflection image of surface A of the banknote BN based on the output signal of the optical line sensor 110, and acquires a reflection image of surface B of the banknote BN based on the output signal of the optical line sensor 120.
- the bill identification device (image acquisition device) 1 includes a control unit 10, a detection unit 20, and a storage unit 30.
- the control unit 10 is composed of a program stored in the storage unit 30 for realizing various processes, a CPU (Central Processing Unit) that executes the program, various hardware controlled by the CPU, and logic devices such as an FPGA (Field Programmable Gate Array).
- the control unit 10 controls each unit of the banknote identification device 1 based on the signal output from each unit of the banknote identification device 1 and the control signal from the control unit 10 according to the program stored in the storage unit 30. Further, the control unit 10 has the functions of the light source control unit 11, the sensor control unit 12, the image generation unit 13, the image recognition unit 14, and the identification unit 15 according to the program stored in the storage unit 30.
- the detection unit 20 includes a magnetic detection unit 22, a thickness detection unit 23, and a UV detection unit 24 in addition to the above-mentioned imaging unit 21 along the bill transport path.
- the image pickup unit 21 takes an image of the banknote as described above and outputs an image signal (image data).
- the magnetic detection unit 22 includes a magnetic sensor (not shown) for measuring magnetism, and detects the magnetism of magnetic ink, security threads, etc. printed on banknotes by the magnetic sensor.
- the magnetic sensor is a magnetic line sensor in which a plurality of magnetic detection elements are arranged in a line.
- the thickness detection unit 23 includes a thickness detection sensor (not shown) for measuring the thickness of banknotes, and detects tape, double feed, etc. by the thickness detection sensor.
- the thickness detection sensor detects, by means of sensors provided on each roller, the amount of displacement of the rollers facing each other across the transport path when a banknote passes between them.
- the UV detection unit 24 includes an ultraviolet irradiation unit (not shown) and a light receiving unit (not shown), and detects, with the light receiving unit, the fluorescence generated when the banknote is irradiated with ultraviolet rays by the ultraviolet irradiation unit, as well as the ultraviolet rays transmitted through the banknote.
- the storage unit 30 is composed of a non-volatile storage device such as a semiconductor memory or a hard disk, and stores various programs and various data for controlling the banknote identification device 1. Further, the storage unit 30 stores, as imaging parameters, the wavelength regions of the light emitted from the light sources 111 and 121 during one cycle of imaging by the imaging unit 21, the timing of turning the light sources 111 and 121 on and off, the value of the forward current flowing through each LED element of the light sources 111 and 121, the timing of reading signals from the optical line sensors 110 and 120, and the like.
- here, one cycle of imaging refers to an imaging pattern in which the wavelength regions of the light emitted from the light sources 111 and 121, the turning on and off of the light sources 111 and 121, the value of the forward current flowing through each LED element, the timing of signal reading, and the like are set.
- An image of the entire banknote is acquired by continuously and repeatedly executing this one cycle of imaging.
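- As a rough illustration only, the imaging parameters described above could be grouped as follows; this is a sketch in Python with hypothetical field names and example values, not the device's actual data layout.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class ImagingParameters:
    """Hypothetical container for one cycle of imaging (field names and values are illustrative)."""
    emission_order: List[str]                # wavelength regions emitted during one cycle, in order
    on_off_timing_us: List[Tuple[int, int]]  # (turn-on, turn-off) time of each emission within the cycle
    forward_current_ma: Dict[str, int]       # forward current set for each LED element
    readout_timing_us: List[int]             # when the signal is read from the line sensor after each emission

# Example only; the emission order mirrors the IR1, IR2, blue, IR3, red, green sequence of FIG. 5.
params = ImagingParameters(
    emission_order=["ir1", "ir2", "blue", "ir3", "red", "green"],
    on_off_timing_us=[(0, 50), (60, 110), (120, 170), (180, 230), (240, 290), (300, 350)],
    forward_current_ma={"ir1": 40, "ir2": 40, "blue": 60, "ir3": 40, "red": 50, "green": 50},
    readout_timing_us=[55, 115, 175, 235, 295, 355],
)
```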
- the light source control unit 11 performs dynamic lighting control in which the light sources 111 and 121 are sequentially turned on in order to capture images of individual banknotes. Specifically, the light source control unit 11 controls the turning on and off of the light sources 111 and 121 based on the timing set in the imaging parameters. This control is performed using a mechanical clock that changes according to the transport speed of the banknotes and a system clock that is always output at a constant frequency regardless of the transport speed of the banknotes. Further, the light source control unit 11 sets the magnitude of the forward current flowing through each LED element based on the imaging parameters.
- the sensor control unit 12 controls the timing of reading the image signals from the optical line sensors 110 and 120 based on the timing set in the imaging parameters, and reads the image signal from each line sensor in synchronization with the timing of turning the light sources 111 and 121 on and off. This control is performed using the mechanical clock and the system clock. The sensor control unit 12 then sequentially stores the read image signals, that is, the line data, in the ring buffer (line memory) of the storage unit 30.
- the line data means data based on an image signal obtained by one imaging operation of each of the optical line sensors 110 and 120, and corresponds to one column of data in the width direction of the acquired image (the direction orthogonal to the transport direction of the banknote).
- the image generation unit 13 has a function of generating an image based on various signals related to the banknote acquired from the detection unit 20. Specifically, the image generation unit 13 first decomposes the data (image signals) stored in the ring buffer into data for each condition of light irradiation and light reception, that is, into the received-light intensity data of the light reflected when the first infrared light is irradiated, the received-light intensity data of the light reflected when the second infrared light is irradiated, the received-light intensity data of the light reflected when the third infrared light is irradiated, and so on for each type of irradiated light.
- the image generation unit 13 generates the first image data (hereinafter, the first monochrome image data) by gray-converting the first raw image data based on the light receiving intensity data of the light in the first wavelength region.
- the second raw image data, based on the received-light intensity data of the light in the second wavelength region, is likewise gray-converted to generate the second image data (hereinafter, the second monochrome image data). More specifically, the raw image data of each color is converted so that the maximum output intensity (255 digits) of the corresponding light receiving unit 113 or 123 corresponds to white and the minimum intensity (0 digits) corresponds to black.
- the first monochrome image data and the second monochrome image data correspond to the first image data and the second image data, respectively.
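- A minimal sketch of the gray conversion described above, assuming Python/NumPy and that the raw line data arrive as an array of received-light intensity values; the scaling maps the minimum output of the light receiving unit to 0 digits (black) and the maximum output to 255 digits (white), as stated.

```python
import numpy as np

def gray_convert(raw: np.ndarray, sensor_min: float, sensor_max: float) -> np.ndarray:
    """Scale raw received-light intensity so that the minimum output of the
    light receiving unit maps to 0 (black) and the maximum maps to 255 (white)."""
    scaled = (raw.astype(np.float64) - sensor_min) / (sensor_max - sensor_min)
    return np.clip(np.round(scaled * 255.0), 0, 255).astype(np.uint8)
```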
- the image generation unit 13 also functions as a super-resolution processing unit that synthesizes two or more types of image data to increase the resolution in the sub-scanning direction, and combines the generated first monochrome image data and second monochrome image data in the banknote transport direction (sub-scanning direction) to generate a composite image. That is, the first monochrome image data and the second monochrome image data are combined column by column; in other words, the line data constituting the first monochrome image data and the line data constituting the second monochrome image data are alternately arranged in the sub-scanning direction to generate one reflected image.
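- The column-by-column synthesis can be pictured with a minimal sketch, assuming Python/NumPy and that each monochrome image is stored as a 2-D array whose first axis is the sub-scanning (transport) direction, so that each row of the array holds one line of line data.

```python
import numpy as np

def compose_sub_scanning(mono1: np.ndarray, mono2: np.ndarray) -> np.ndarray:
    """Alternate the line data of the two monochrome images along the transport
    (sub-scanning) axis: m1[0], m2[0], m1[1], m2[1], ... -> twice as many lines."""
    assert mono1.shape == mono2.shape
    lines, pixels = mono1.shape
    composite = np.empty((2 * lines, pixels), dtype=mono1.dtype)
    composite[0::2] = mono1  # lines captured under the light in the first wavelength region
    composite[1::2] = mono2  # lines captured under the light in the second wavelength region
    return composite
```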
- the image generation unit 13 may execute the generation of the composite image at the same time as the generation of the first and second raw image data and the generation of the first and second monochrome image data. That is, each time the line data of each raw image data is generated, the corresponding monochrome image data may be generated, and each time the line data of each monochrome image data is generated, the monochrome image data may be sequentially combined.
- the image generation unit 13 may execute the generation of the composite image after the generation of the first and second monochrome image data is completed. That is, the entire first monochrome image data and the entire second monochrome image data may be generated first, and then these image data may be combined to generate the composite image.
- the image recognition unit 14 recognizes the composite image generated by the image generation unit 13. That is, the composite image is analyzed to extract features and recognize the object. Specifically, for example, when a serial number is printed on a bill, the image recognition unit 14 recognizes the serial number portion of the composite image as characters and recognizes the serial number of the bill. In addition, the face of a person printed on a bill may be recognized. Further, the image recognition unit 14 stores the recognition result in the storage unit 30.
- the identification unit 15 performs identification processing using various signals related to banknotes acquired from the detection unit 20.
- the identification unit 15 identifies at least the denomination and authenticity of the banknote.
- the identification unit 15 may also have a fitness determination function. In that case, the identification unit 15 detects dirt, folds, tears, and the like on the banknote, and also detects tape or the like attached to the banknote from its thickness, thereby determining whether the banknote is a fit note that can be recirculated in the market or an unfit note that is not suitable for circulation.
- to identify the denomination, authenticity, fitness, and the like, the identification unit 15 uses the image of the banknote captured by the imaging unit 21, the composite image generated by the image generation unit 13, and the recognition result obtained by the image recognition unit 14.
- the identification unit 15 uses the serial number information of the banknotes obtained by the image recognition unit 14 in order to identify the authenticity of the banknotes.
- in one cycle, the light source 111 irradiates the banknote with a first infrared light, then with a second infrared light having a wavelength region different from that of the first infrared light, then with blue light, then with a third infrared light having a wavelength region different from those of the first and second infrared lights, then with red light, and then with green light. While the banknote is irradiated with each light, each image pickup element of the light receiving unit 113 is exposed and accumulates electric charge.
- each time the irradiated light is switched, the image signal produced by the light before the switching is read from the optical line sensor 110.
- as a result, in one cycle, the optical line sensor 110 sequentially acquires, in this order: line data for one row forming a reflection image of surface A by the first infrared light (hereinafter, first infrared reflection image), line data for one row forming a reflection image of surface A by the second infrared light (hereinafter, second infrared reflection image), line data for one row forming a reflection image of surface A by the blue light (hereinafter, blue reflection image), line data for one row forming a reflection image of surface A by the third infrared light (hereinafter, third infrared reflection image), line data for one row forming a reflection image of surface A by the red light (hereinafter, red reflection image), and line data for one row forming a reflection image of surface A by the green light (hereinafter, green reflection image).
- the specific wavelength region of each of the first to third infrared lights is not particularly limited, but the peak wavelengths of the first, second, and third infrared lights may increase in this order; for example, the peak wavelengths may be 800 nm for the first, 880 nm for the second, and 950 nm for the third.
- as a result, the first infrared reflection image, the second infrared reflection image, the blue reflection image, the third infrared reflection image, the red reflection image, and the green reflection image of the entire surface A of the banknote are acquired. A color image of the entire surface A of the banknote can also be obtained from the red, green, and blue reflection images.
- FIG. 5 shows a case where blue light and green light are used as light in the first and second wavelength regions, and a composite image is generated by synthesizing a blue reflection image and a green reflection image.
- a monochrome image of each color of light can be acquired at a resolution of 100 dpi in the sub-scanning direction. Therefore, in the example shown in FIG. 5, a monochrome image of each color and an RGB color image can be acquired at a resolution of 100 dpi in the sub-scanning direction, and the composite image obtained by combining the blue reflection image and the green reflection image can be acquired at a resolution of 200 dpi in the sub-scanning direction.
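- The resolution figures follow from the transport distance per imaging cycle (0.254 mm, the value stated later for one period of the mechanical clock): 25.4 mm / 0.254 mm = 100 lines per inch for each color, and interleaving two colors halves the effective line pitch to 0.127 mm, giving 200 dpi. A quick check in Python:

```python
MM_PER_INCH = 25.4
transport_per_cycle_mm = 0.254  # banknote travel during one imaging cycle (see the mechanical clock description)

dpi_single_color = MM_PER_INCH / transport_per_cycle_mm          # 100.0 dpi per color
dpi_composite_two = MM_PER_INCH / (transport_per_cycle_mm / 2)   # 200.0 dpi for two interleaved colors
print(dpi_single_color, dpi_composite_two)
```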
- FIG. 6 shows the content and timing of lighting the light source and reading the signal, and is the same as the case of FIG. 5 except that the content and timing of lighting the light source are partially different.
- during one cycle, the conveyed banknote is irradiated by the light source 111 with the first infrared light, followed by the second infrared light, then the third infrared light, then blue light, then green light, and then red light.
- in one cycle, the optical line sensor 110 acquires line data for one row forming the first infrared reflection image, line data for one row forming the second infrared reflection image, line data for one row forming the third infrared reflection image, line data for one row forming the blue reflection image, line data for one row forming the green reflection image, and line data for one row forming the red reflection image, in this order.
- as a result, the first infrared reflection image, the second infrared reflection image, the third infrared reflection image, the blue reflection image, the green reflection image, and the red reflection image of the entire surface A of the banknote are acquired.
- FIG. 6 shows a case where the third infrared light and red light are used as the light in the first and second wavelength regions, and a composite image is generated by synthesizing the third infrared reflection image and the red reflection image.
- also in this case, a monochrome image of each color and an RGB color image can be acquired at a resolution of 100 dpi in the sub-scanning direction, and the composite image obtained by combining the third infrared reflection image and the red reflection image can be acquired at a resolution of 200 dpi in the sub-scanning direction.
- FIG. 7 shows the content and timing of lighting the light source and reading the signal, and is the same as the case of FIG. 5 except that the content and timing of lighting the light source are partially different.
- during one cycle, the conveyed banknote is irradiated by the light source 111 with the third infrared light, then with blue light, then with the first infrared light, then with green light, then with red light, and then with the second infrared light.
- in one cycle, the optical line sensor 110 acquires line data for one row forming the third infrared reflection image, line data for one row forming the blue reflection image, line data for one row forming the first infrared reflection image, line data for one row forming the green reflection image, line data for one row forming the red reflection image, and line data for one row forming the second infrared reflection image, in this order.
- as a result, the first infrared reflection image, the second infrared reflection image, the third infrared reflection image, the blue reflection image, the green reflection image, and the red reflection image of the entire surface A of the banknote are acquired.
- FIG. 7 shows a case where the first and second infrared lights are used as the light in the first and second wavelength regions, and a composite image is generated by synthesizing the first infrared reflection image and the second infrared reflection image.
- also in this case, a monochrome image of each color and an RGB color image can be acquired at a resolution of 100 dpi in the sub-scanning direction, and the composite image obtained by synthesizing the first infrared reflection image and the second infrared reflection image can be acquired at a resolution of 200 dpi in the sub-scanning direction.
- the time for irradiating each type of light is set to be constant. That is, the charge accumulation time of each image sensor is set to be constant regardless of the type of light. However, the time for irradiating various types of light can be appropriately set, and the irradiation time may be different for different lights.
- here, the plurality of time intervals (T1 and T2, and t1, t2, and t3 described later) being "the same" includes not only the case where the plurality of time intervals are exactly equal, but also the case where some deviation occurs between the time intervals due to fluctuation of the mechanical clock.
- <Processing flow by the banknote identification device> A processing flow by the banknote identification device 1, a processing flow of the image acquisition method for acquiring a composite image of a banknote, and the main processing flow thereafter will be described below.
- the image acquisition method using the optical line sensor 110 will be described, but since the same applies to the case where the optical line sensor 120 is used, the description thereof will be omitted.
- the light receiving unit 113 receives the light of the first wavelength region (for example, blue light) emitted from the light source 111 and reflected by the bill, and outputs the image signal S1.
- the light receiving unit 113 also receives the light in the second wavelength region (for example, green light), emitted from the light source 111 at a timing different from the light in the first wavelength region and reflected by the banknote, and outputs the image signal S2 (light receiving step S11).
- the image signal S1 and the image signal S2 correspond to the first image signal and the second image signal, respectively.
- the sensor control unit 12 reads the image signals S1 and S2 from the optical line sensor 110, and sequentially stores the read image signals S1 and S2 in the ring buffer of the storage unit 30 (image signal reading step S12).
- the image generation unit 13 gray-converts the raw image data based on the image signal S1 to generate the first image data (first monochrome image data), gray-converts the raw image data based on the image signal S2 to generate the second image data (second monochrome image data), and combines the generated first monochrome image data and second monochrome image data in the banknote transport direction (sub-scanning direction) to generate a composite image (image generation step S13).
- the image recognition unit 14 recognizes the composite image generated by the image generation unit 13 (image recognition step S14).
- the identification unit 15 determines the denomination, authenticity, fitness, and the like of the banknote (identification step S15). At this time, the identification unit 15 uses the composite image generated by the image generation unit 13 and the recognition result of the image recognition unit 14.
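- The flow of steps S11 to S15 could be sketched as follows; this is a Python sketch with hypothetical objects and method names (sensor, image_gen, recognizer, identifier) standing in for the hardware and the units described above, not the device's actual interfaces.

```python
def acquisition_and_recognition_flow(sensor, image_gen, recognizer, identifier):
    """Hypothetical end-to-end flow mirroring steps S11 to S15 (names are illustrative)."""
    s1, s2 = sensor.receive_light()                     # S11: image signals for the two wavelength regions
    sensor.store_to_ring_buffer(s1, s2)                 # S12: read and buffer the image signals
    mono1 = image_gen.gray_convert(s1)                  # S13: first monochrome image data
    mono2 = image_gen.gray_convert(s2)                  #      second monochrome image data
    composite = image_gen.compose(mono1, mono2)         #      combine in the transport direction
    recognition = recognizer.recognize(composite)       # S14: e.g. serial number recognition
    return identifier.identify(composite, recognition)  # S15: denomination, authenticity, fitness
```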
- as described above, in the present embodiment, the light receiving unit 113 receives the light in the first wavelength region emitted from the reflection light source 111 and reflected by the banknote and outputs the image signal S1, and also receives the light in the second wavelength region, emitted from the reflection light source 111 at a timing different from the light in the first wavelength region and reflected by the banknote, and outputs the image signal S2; the image generation unit 13 then combines the first image data based on the image signal S1 and the second image data based on the image signal S2 in the transport direction of the banknote to generate a composite image. It is therefore possible to acquire a monochrome image with a higher resolution than the first and second image data, specifically a monochrome image with twice the resolution in the sub-scanning direction.
- in the above description, the case where the banknote is repeatedly irradiated with the light in the first wavelength region and the light in the second wavelength region alternately in this order has been described, but the lights in the first and second wavelength regions do not have to be irradiated one time each per repetition; at least one of them may be irradiated on the banknote consecutively, for example with the light source 111 emitting light in the order green → green → blue → green → green → blue.
- in the above description, the case where the image generation unit 13 combines the two types of image data after gray conversion has been described, but the two types of image data may also be combined without gray conversion.
- in the above description, the case where the light used for generating the composite image has two wavelengths (the first and second wavelength regions) has been described, but the number of wavelengths is not particularly limited to two and may be three or more. Specifically, for example, red, green, and blue light may be used, or three types of infrared light (for example, the first to third infrared lights) may be used.
- an example of the control of each light source by the light source control unit 11 and the control of signal reading from each of the optical line sensors 110 and 120 by the sensor control unit 12, in the case where light of three wavelengths is used for generating the composite image, will now be described.
- FIG. 9 shows the content and timing of lighting the light source and reading the signal, and is the same as the case of FIG. 5 except that the content and timing of lighting the light source are partially different.
- since the optical line sensor 120 is controlled in the same manner as the optical line sensor 110, the description thereof will be omitted.
- during one cycle, the conveyed banknote is irradiated by the light source 111 with red light, then with the first infrared light, then with green light, then with the second infrared light, then with blue light, and then with the third infrared light.
- in one cycle, the optical line sensor 110 acquires line data for one row forming the red reflection image, line data for one row forming the first infrared reflection image, line data for one row forming the green reflection image, line data for one row forming the second infrared reflection image, line data for one row forming the blue reflection image, and line data for one row forming the third infrared reflection image, in this order.
- as a result, the first infrared reflection image, the second infrared reflection image, the third infrared reflection image, the blue reflection image, the green reflection image, and the red reflection image of the entire surface A of the banknote are acquired.
- FIG. 9 shows a case where the first to third infrared lights are used as the light in the three wavelength regions, and a composite image is generated by synthesizing the first infrared reflection image, the second infrared reflection image, and the third infrared reflection image.
- in this case as well, a monochrome image of each color and an RGB color image can be acquired at a resolution of 100 dpi in the sub-scanning direction, and the composite image obtained by synthesizing the first infrared reflection image, the second infrared reflection image, and the third infrared reflection image can be acquired at a resolution of 300 dpi in the sub-scanning direction. It is therefore possible to acquire a reflection image with a higher resolution than in the first embodiment.
- also in this case, the time for irradiating each type of light is set to be constant as described above; that is, the charge accumulation time of each image pickup element is set to be constant regardless of the type of light. However, the irradiation time of each type of light can be set as appropriate, and different lights may have different irradiation times.
- the time interval t1 from the start of irradiation of the light in the first wavelength region to the start of irradiation of the light in the second wavelength region, the time interval t2 from the start of irradiation of the light in the second wavelength region to the start of irradiation of the light in the third wavelength region (for example, the third infrared light), and the time interval t3 from the start of irradiation of the light in the third wavelength region to the start of irradiation of the light in the first wavelength region are set to the same time interval.
- FIG. 10 shows the contents and timing of lighting the light source and reading the signal, and is the same as the cases of FIGS. 5 to 7 and 9 except that the contents and timing of lighting the light source are partially different.
- since the optical line sensor 120 is controlled in the same manner as the optical line sensor 110, the description thereof will be omitted.
- during one cycle corresponding to one period of the mechanical clock generated by a rotary encoder, the light source 111 irradiates the conveyed banknote with blue light, red light, green light, the first infrared light, the second infrared light, and the third infrared light in the predetermined order shown in FIG. 10.
- the banknote is conveyed 0.254 mm during one cycle of imaging (one period of the mechanical clock).
- in this modification, a clock with a period of one half of the mechanical clock (hereinafter also abbreviated as 1/2 MCLK) and a clock with a period of one third of the mechanical clock (hereinafter also abbreviated as 1/3 MCLK) are generated. Then, based on the rising edge of each clock, the intervals T1' and T2' between the periods of irradiating the light in the first and second wavelength regions are set to be the same (including substantially the same), and the intervals t1', t2', and t3' between the periods of irradiating the light in the three wavelength regions are set to be the same (including substantially the same). As a result, the distance moved by the banknote in the transport direction between the exposures of the respective colors can be made nearly equal. Specifically, the intervals T1' and T2' are each set as the period from the rising timing of 1/2 MCLK until a fixed time, measured by the system clock, has elapsed; the intervals t1', t2', and t3' are each set as the period from the rising timing of 1/3 MCLK until a fixed time, measured by the system clock, has elapsed.
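- A rough sketch of how equally spaced irradiation start times could be derived from the divided clocks, assuming Python; the period and offset values are illustrative, not the device's actual timing.

```python
def emission_start_times(mclk_period_us: float, n_wavelengths: int, offset_us: float = 5.0):
    """Start-of-irradiation times within one mechanical-clock period, spaced at equal
    intervals derived from a 1/n divided clock (1/2 MCLK for two wavelengths,
    1/3 MCLK for three), plus a fixed offset measured with the system clock."""
    divided_period = mclk_period_us / n_wavelengths
    return [k * divided_period + offset_us for k in range(n_wavelengths)]

# Example: two wavelengths on 1/2 MCLK (equal intervals T1' = T2') and
# three wavelengths on 1/3 MCLK (equal intervals t1' = t2' = t3').
print(emission_start_times(1000.0, 2))
print(emission_start_times(1000.0, 3))
```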
- the optical line sensor simultaneously irradiates the banknote with light in the first wavelength region and light in a second wavelength region different from the first wavelength region.
- A reflection image of each color is acquired, and these reflection images are gray-converted and then combined in the direction orthogonal to the banknote transport direction (the main scanning direction of the optical line sensor) to generate a composite image. It is therefore possible to acquire a monochrome image with a higher resolution, specifically twice the resolution in the direction orthogonal to the transport direction of the banknote, compared with each individual reflection image.
- in the second embodiment, the light sources 111 and 121 each include an LED element that emits blue light in a wavelength range of 400 nm to 500 nm, an LED element that emits green light in a wavelength range of 500 nm to 600 nm, an LED element that emits red light in a wavelength range of 600 nm to 700 nm, and an LED element that emits infrared light in a wavelength range of 700 nm to 1000 nm, and simultaneously irradiate the banknote BN with white light, which includes the blue light, green light, and red light, and with the infrared light.
- the light receiving units 113 and 123 each have a plurality of pixel units 130 arranged in a line in the direction (main scanning direction) orthogonal to the transport direction (sub-scanning direction) of the banknote BN.
- Each pixel unit 130 has four image pickup elements (light receiving elements) 131a to 131d arranged in a row in the main scanning direction and four bandpass filters provided corresponding to the four image pickup elements 131a to 131d, respectively.
- the four bandpass filters are filters (optical filters) that transmit mutually different wavelength ranges, and include a bandpass filter (hereinafter also referred to as a blue filter) 132B that transmits blue light in the wavelength range of 400 to 500 nm, a bandpass filter (hereinafter also referred to as a green filter) 132G that transmits green light in the wavelength range of 500 to 600 nm, a bandpass filter (hereinafter also referred to as a red filter) 132R that transmits red light in the wavelength range of 600 to 700 nm, and a bandpass filter (hereinafter also referred to as an infrared light filter) 132IR that transmits infrared light in the wavelength range of 700 to 1000 nm.
- the blue filter 132B, the red filter 132R, the green filter 132G, and the infrared light filter 132IR are arranged in this order in the main scanning direction. Then, each of the image pickup devices 131a to 131d receives light of different wavelength bands transmitted through the corresponding bandpass filter.
- the line data for one row read by the sensor control unit 12 includes a plurality of types of line data that are distinguished by the difference in the wavelength of the irradiated light and the like.
- the image generation unit 13 decomposes the data (image signals) stored in the ring buffer into the received-light intensity data of the light reflected under infrared irradiation, the received-light intensity data of the light reflected under red irradiation, the received-light intensity data of the light reflected under green irradiation, and the received-light intensity data of the light reflected under blue irradiation.
- in the second embodiment, the image generation unit 13 also functions as a super-resolution processing unit that synthesizes two or more types of image data to increase the resolution in the main scanning direction, and synthesizes the generated first monochrome image data and second monochrome image data in the direction orthogonal to the transport direction of the banknote (main scanning direction) to generate a composite image. That is, the first monochrome image data and the second monochrome image data are combined line by line; in other words, the first monochrome image data and the second monochrome image data are alternately arranged in the main scanning direction to generate one reflected image.
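- A minimal sketch of the main-scanning synthesis, assuming Python/NumPy and that each monochrome image is stored as a 2-D array of shape (lines, pixels), where axis 1 is the main scanning direction.

```python
import numpy as np

def compose_main_scanning(mono1: np.ndarray, mono2: np.ndarray) -> np.ndarray:
    """Alternate the two monochrome images pixel by pixel along the main scanning
    axis, doubling the number of pixels per line."""
    assert mono1.shape == mono2.shape
    lines, pixels = mono1.shape
    composite = np.empty((lines, 2 * pixels), dtype=mono1.dtype)
    composite[:, 0::2] = mono1  # pixels behind the filters for the first wavelength region
    composite[:, 1::2] = mono2  # pixels behind the filters for the second wavelength region
    return composite
```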
- FIG. 13 shows a case where blue light and green light are used as light in the first and second wavelength regions, and a composite image is generated by synthesizing a blue reflection image and a green reflection image.
- assuming that the pitch (distance) X between bandpass filters of the same color in the main scanning direction is 0.254 mm, that the pitch X1 between the green filter 132G and the blue filter 132B sandwiching the infrared light filter 132IR is 0.127 mm, and that the pitch X2 between the blue filter 132B and the green filter 132G sandwiching the red filter 132R is 0.127 mm, a monochrome image of each color can be acquired at a resolution of 100 dpi in the main scanning direction. Therefore, in the example shown in FIG. 13, a monochrome image of each color and an RGB color image can be acquired at a resolution of 100 dpi in the main scanning direction, and the composite image obtained by combining the blue reflection image and the green reflection image can be acquired at a resolution of 200 dpi in the main scanning direction.
- FIG. 14 shows a case where red light and infrared light are used as light in the first and second wavelength regions, and a composite image is generated by synthesizing a red reflection image and an infrared reflection image.
- similarly, assuming that the pitch (distance) X between bandpass filters of the same color in the main scanning direction is 0.254 mm, that the pitch X3 between the red filter 132R and the infrared light filter 132IR sandwiching the green filter 132G is 0.127 mm, and that the pitch X4 between the infrared light filter 132IR and the red filter 132R sandwiching the blue filter 132B is 0.127 mm, a monochrome image of each color can be acquired at a resolution of 100 dpi in the main scanning direction. Therefore, in the example shown in FIG. 14 as well, a monochrome image of each color and an RGB color image can be acquired at a resolution of 100 dpi in the main scanning direction, and the composite image obtained by combining the red reflection image and the infrared reflection image can be acquired at a resolution of 200 dpi in the main scanning direction.
- the two types of bandpass filters that transmit light in the first and second wavelength regions are arranged at the same (including substantially the same) pitch in the main scanning direction. This makes it possible to obtain a more natural composite image with less distortion.
- the image acquisition method using the optical line sensor 110 will be described, but since the same applies to the case where the optical line sensor 120 is used, the description thereof will be omitted.
- the light receiving unit 113 receives the light of the first wavelength region (for example, blue light) emitted from the light source 111 and reflected by the bill, and outputs the image signal S1.
- the light receiving unit 113 also receives the light in the second wavelength region (for example, green light), emitted from the light source 111 at the same timing as the light in the first wavelength region and reflected by the banknote, and outputs the image signal S2.
- the image generation unit 13 synthesizes the generated first monochrome image data and the second monochrome image data in the main scanning direction to generate a composite image.
- as described above, in the second embodiment, the light receiving unit 113 receives the light in the first wavelength region emitted from the reflection light source 111 and reflected by the banknote and outputs the image signal S1, and also receives the light in the second wavelength region, emitted from the reflection light source 111 at the same timing as the light in the first wavelength region and reflected by the banknote, and outputs the image signal S2; the image generation unit 13 then combines the first image data based on the image signal S1 and the second image data based on the image signal S2 in the main scanning direction to generate a composite image. It is therefore possible to acquire a monochrome image with a higher resolution than the first and second image data, specifically a monochrome image with twice the resolution in the main scanning direction.
- the image generation unit 13 may combine the two types of image data without gray conversion.
- the number of wavelengths of light used for generating the composite image is not particularly limited to two, and may be three or more. Specifically, for example, red, green, and blue light may be used, or three types of infrared light (for example, first to third infrared light) may be used.
- in this case, each pixel unit 130 includes six image pickup elements (light receiving elements) 131a to 131f arranged in a row in the main scanning direction, and six bandpass filters provided corresponding to the six image pickup elements 131a to 131f, respectively.
- the six bandpass filters are filters (optical filters) that transmit mutually different wavelength ranges, and include, in addition to the above-mentioned blue filter 132B, green filter 132G, and red filter 132R, a bandpass filter (hereinafter also referred to as a first infrared light filter) 132IR1 that transmits the first infrared light, a bandpass filter (hereinafter also referred to as a second infrared light filter) 132IR2 that transmits a second infrared light having a wavelength region different from that of the first infrared light, and a bandpass filter (hereinafter also referred to as a third infrared light filter) 132IR3 that transmits a third infrared light having a wavelength region different from those of the first and second infrared lights.
- the blue filter 132B, the first infrared light filter 132IR1, the red filter 132R, the second infrared light filter 132IR2, the green filter 132G, and the third infrared light filter 132IR3 are arranged in this order in the main scanning direction. Then, each of the image pickup elements 131a to 131f receives light of a different wavelength band transmitted through the corresponding bandpass filter.
- the specific wavelength range of each of the first to third infrared lights is not particularly limited, but the peak wavelengths of the first, second, and third infrared lights increase in this order.
- the peak wavelengths of the first to third infrared lights may be 800 nm for the first, 880 nm for the second, and 950 nm for the third.
- FIG. 16 shows a case where the first to third infrared lights are used as the light in the three wavelength regions, and a composite image is generated by synthesizing the first infrared reflection image, the second infrared reflection image, and the third infrared reflection image.
- assuming that the pitch (distance) X between bandpass filters of the same color in the main scanning direction is 0.254 mm, that the pitch X5 between the first infrared light filter 132IR1 and the second infrared light filter 132IR2 sandwiching the red filter 132R is 0.0847 mm, that the pitch X6 between the second infrared light filter 132IR2 and the third infrared light filter 132IR3 sandwiching the green filter 132G is 0.0847 mm, and that the pitch between the third infrared light filter 132IR3 and the first infrared light filter 132IR1 sandwiching the blue filter 132B is likewise 0.0847 mm, a monochrome image of each color can be acquired at a resolution of 100 dpi in the main scanning direction. Therefore, also in the example shown in FIG. 16, a monochrome image of each color and an RGB color image can be acquired at a resolution of 100 dpi in the main scanning direction, as in the cases described above.
- the composite image obtained by synthesizing the first, second, and third infrared reflection images can be acquired at a resolution of 300 dpi in the main scanning direction, so that a reflection image with an even higher resolution than in the second embodiment can be acquired.
- White light is generated by using an LED element that emits blue light together with a red phosphor and a green phosphor that are excited by that light and emit red light and green light, respectively, and the banknote BN is irradiated with the white light.
- Alternatively, white light may be generated by using an LED element that emits blue light and a yellow phosphor that is excited by that light and emits yellow light, which is complementary to blue, and the banknote BN may be irradiated with the white light.
- In either case, it is possible to simultaneously irradiate the banknote BN with light in a plurality of mutually different wavelength ranges. Even when an LED element that emits blue light and a yellow phosphor are used, the resulting white light usually includes green light and red light, so that, as the light in the first and second wavelength regions, blue light and green light, or red light and infrared light, can be used as described above.
- In this way, the composite image makes it possible to acquire a monochrome image with a higher resolution than each individual set of image data. Further, since image data of at least two wavelengths are acquired, more types (wavelengths) of images can be obtained than in the case where the number of light emissions of a specific type is increased in order to raise the resolution, as described in Patent Document 1. This makes it possible to acquire a color image efficiently. Furthermore, since each set of image data used for the composite image constitutes a reflection image, only one side of the banknote contributes to the image, and a more accurate image can be obtained than in the case where a reflection image and a transmission image are used as described in Patent Document 2.
- In the above description, the case where the image recognition unit 14 recognizes the serial number or the face of a person printed on the banknote has been described. However, when the paper sheets to which the present invention is applied are other than banknotes, a code such as a barcode or a QR code (registered trademark) attached to the paper sheets, for example a gift certificate or a casino barcode ticket, may be recognized by the image recognition unit according to the present invention.
- In this case, the image recognition unit according to the present invention reads the barcode portion of the composite image and recognizes the barcode attached to the paper sheet.
- The barcode may be arranged in the longitudinal direction of the paper sheets, in which case it is preferable to transport the paper sheets in the longitudinal direction and acquire an image thereof.
- The image recognition unit may recognize a plurality of types of features attached to paper sheets, such as a serial number, a person's face, a barcode, and a QR code.
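- As a hedged illustration of such code recognition (not the claimed recognition unit itself), the barcode or QR code region of the composite image could be handed to an off-the-shelf decoder; the crop coordinates, the helper name read_codes, and the use of the third-party pyzbar and Pillow libraries are assumptions of this sketch.

```python
import numpy as np
from PIL import Image
from pyzbar.pyzbar import decode  # third-party barcode/QR reader, assumed installed

def read_codes(composite: np.ndarray, region: tuple) -> list:
    """Crop a (top, bottom, left, right) region of a grayscale composite
    image and return any decoded barcode/QR payloads as strings."""
    top, bottom, left, right = region
    crop = composite[top:bottom, left:right]
    return [r.data.decode("utf-8", "replace") for r in decode(Image.fromarray(crop))]

# Example with an illustrative region; a real device would derive the region
# from the known layout of the paper sheet being processed.
composite = np.zeros((400, 1200), dtype=np.uint8)  # blank placeholder image
print(read_codes(composite, (50, 150, 100, 700)))  # -> [] for the blank placeholder
```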
- The present invention is useful as a technique for acquiring accurate, high-resolution images while also acquiring various types of images.
Landscapes
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Toxicology (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Inspection Of Paper Currency And Valuable Securities (AREA)
- Image Input (AREA)
Abstract
The present invention relates to an image acquisition device, a paper sheet handling device, and an image acquisition method that can acquire various types of images while acquiring more accurate high-resolution images. The present invention is an image acquisition device for acquiring an image of a transported paper sheet, comprising: a light emitting unit that irradiates the paper sheet with light in a first wavelength region and light in a second wavelength region; a light receiving unit that receives the reflected light resulting from irradiating the paper sheet with the light in the first wavelength region emitted by the light emitting unit and outputs a first image signal, and that also receives the reflected light resulting from irradiating the paper sheet with the light in the second wavelength region emitted by the light emitting unit and outputs a second image signal; and an image generation unit that generates first image data based on the first image signal and second image data based on the second image signal, and combines the first image data and the second image data to generate a composite image.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021513133A JP7218429B2 (ja) | 2019-04-12 | 2019-04-12 | 画像取得装置、紙葉類処理装置及び画像取得方法 |
PCT/JP2019/015949 WO2020208806A1 (fr) | 2019-04-12 | 2019-04-12 | Dispositif d'acquisition d'image, dispositif de manipulation de feuille de papier et procédé d'acquisition d'image |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2019/015949 WO2020208806A1 (fr) | 2019-04-12 | 2019-04-12 | Dispositif d'acquisition d'image, dispositif de manipulation de feuille de papier et procédé d'acquisition d'image |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020208806A1 true WO2020208806A1 (fr) | 2020-10-15 |
Family
ID=72751186
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/015949 WO2020208806A1 (fr) | 2019-04-12 | 2019-04-12 | Dispositif d'acquisition d'image, dispositif de manipulation de feuille de papier et procédé d'acquisition d'image |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7218429B2 (fr) |
WO (1) | WO2020208806A1 (fr) |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101037430B1 (ko) * | 2004-02-12 | 2011-05-30 | 니덱 코팔 가부시키가이샤 | 검사장치 |
- 2019
- 2019-04-12 JP JP2021513133A patent/JP7218429B2/ja active Active
- 2019-04-12 WO PCT/JP2019/015949 patent/WO2020208806A1/fr active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008257395A (ja) * | 2007-04-03 | 2008-10-23 | Toshiba Corp | 紙葉類処理装置 |
JP2010225013A (ja) * | 2009-03-25 | 2010-10-07 | Hitachi Omron Terminal Solutions Corp | 記番号認識装置、紙葉類処理装置、自動取引処理装置、及び記番号認識方法 |
JP2016053783A (ja) * | 2014-09-03 | 2016-04-14 | グローリー株式会社 | 受光センサ、センサモジュール及び紙葉類処理装置 |
JP2016218810A (ja) * | 2015-05-22 | 2016-12-22 | 沖電気工業株式会社 | 記番号認識装置、媒体鑑別装置、自動取引装置、記番号管理装置、紙葉類処理装置、記番号管理システム、及び記番号管理プログラム |
JP2018169881A (ja) * | 2017-03-30 | 2018-11-01 | グローリー株式会社 | 記番号読取装置、紙葉類識別装置、紙葉類処理装置、及び記番号読取方法 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP4050580A1 (fr) | 2021-02-25 | 2022-08-31 | Glory Ltd. | Unité de reconnaissance de feuilles et procédé de reconnaissance de feuilles |
Also Published As
Publication number | Publication date |
---|---|
JP7218429B2 (ja) | 2023-02-06 |
JPWO2020208806A1 (ja) | 2021-11-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2166515B1 (fr) | Dispositif d'identification de feuilles de papier, dispositif de traitement de feuilles de papier et procédé d'identification de feuille de papier | |
US7584890B2 (en) | Validator linear array | |
CN102224530A (zh) | 使用顺序照射确定文件的适合性 | |
US10677646B2 (en) | Light receiving sensor, sensor module, and paper sheet handling apparatus | |
JP4703403B2 (ja) | 検査装置 | |
EP3598401B1 (fr) | Dispositif de détection de feuilles de papier, appareil de traitement de feuilles de papier et procédé de détection de feuilles de papier | |
US8989433B2 (en) | Paper sheet recognition apparatus and paper sheet recognition method | |
JP2018169881A (ja) | 記番号読取装置、紙葉類識別装置、紙葉類処理装置、及び記番号読取方法 | |
EP3723054A1 (fr) | Unité de reconnaissance de billets de banque, dispositif de manipulation de billets de banque et procédé de reconnaissance de billets de banque | |
EP3680867B1 (fr) | Dispositif d'acquisition d'images, dispositif de manipulation de feuilles, dispositif de manipulation de billets de banque et procédé d'acquisition d'images | |
WO2020208806A1 (fr) | Dispositif d'acquisition d'image, dispositif de manipulation de feuille de papier et procédé d'acquisition d'image | |
US20230015962A1 (en) | Optical sensor and sheet recognition unit | |
JP3736028B2 (ja) | 紙幣鑑別装置 | |
JP7473677B2 (ja) | 光学センサ、紙葉類識別装置、紙葉類処理装置及び光検出方法 | |
WO2019194152A1 (fr) | Capteur de détection de lumière, dispositif de détection de lumière, dispositif de traitement de feuilles et procédé de détection de lumière | |
JP7337572B2 (ja) | 記番号読取装置、紙葉類処理装置、及び記番号読取方法 | |
WO2019082251A1 (fr) | Capteur optique, module de capteur optique et dispositif de traitement de papier | |
WO2022210372A1 (fr) | Dispositif de détection d'alimentation multiple et procédé de détection d'alimentation multiple | |
WO2021167004A1 (fr) | Capteur optique, dispositif d'identification de feuille de papier et dispositif de traitement de feuille de papier | |
JP7496744B2 (ja) | 紙葉類識別装置、紙葉類処理装置及び紙葉類識別方法 | |
WO2023176530A1 (fr) | Dispositif d'identification de feuille de papier, dispositif de traitement de feuille de papier et procédé d'identification de feuille de papier | |
JP2023137760A (ja) | 紙葉類識別装置、紙葉類処理装置及び紙葉類識別方法 | |
JP2020047304A (ja) | センサモジュール |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19924232; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2021513133; Country of ref document: JP; Kind code of ref document: A |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 19924232; Country of ref document: EP; Kind code of ref document: A1 |