WO2021182185A1 - Information processing device, system, and control method
- Publication number
- WO2021182185A1 (PCT/JP2021/007965)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- captured image
- processor
- interface
- input
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/12—Detection or correction of errors, e.g. by rescanning the pattern
- G06V30/127—Detection or correction of errors, e.g. by rescanning the pattern with the intervention of an operator
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0233—Character input methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/98—Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/14—Image acquisition
- G06V30/148—Segmentation of character regions
- G06V30/153—Segmentation of character regions using recognition of characters or words
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/19—Recognition using electronic means
- G06V30/191—Design or setup of recognition systems or techniques; Extraction of features in feature space; Clustering techniques; Blind source separation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/40—Document-oriented image-based pattern recognition
- G06V30/42—Document-oriented image-based pattern recognition based on the type of document
- G06V30/424—Postal images, e.g. labels or addresses on parcels or postal envelopes
Definitions
- An embodiment of the present invention relates to an information processing device, a system, and a control method.
- There is known a recognition system equipped with a recognition device that reads character strings such as addresses by OCR (optical character recognition) processing, and a VCS (video coding system) that accepts character-string input from an operator when the recognition device fails to recognize the character string.
- According to an embodiment, an information processing device, a system, and a control method capable of effectively acquiring a character string are provided.
- the information processing device includes an image interface, an input interface, a communication interface, and a processor.
- the image interface acquires, from an input device for inputting the character string included in a captured image for which recognition of the character string by a first algorithm has failed, a display screen image based on at least the captured image displayed on the display screen of the input device.
- the input interface inputs a character string into the input device.
- the communication interface acquires the captured image from the image acquisition device.
- the processor searches for the captured image corresponding to the display screen image, and acquires a character string based on the result of character recognition processing of the retrieved captured image by a second algorithm different from the first algorithm.
- the character string is input to the input device through the input interface.
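The claimed control method can be summarized as a short, self-contained sketch. The "search" for the captured image corresponding to the display screen image is simulated here with an exact comparison (a real system would match by image similarity), and `second_ocr` and `key_input` are stand-ins for the second recognition algorithm and the key-input emulation on the input interface; none of these names come from the patent itself.

```python
def control_method(display_screen_image, capture_store, second_ocr, key_input):
    """Find the stored captured image corresponding to the display screen
    image, recognize it with the second algorithm, and type the result
    into the input device."""
    for captured_image in capture_store:
        if captured_image == display_screen_image:   # simulated search
            text = second_ocr(captured_image)        # second algorithm
            if text:
                key_input(text)                      # emulated key input
            return text
    return None                                      # no match: leave to operator


# Stand-in usage:
typed = []
result = control_method(
    "img_b", ["img_a", "img_b"],
    second_ocr=lambda img: "TOKYO 100-0001",
    key_input=typed.append,
)
```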
- FIG. 1 is a block diagram showing a configuration example of a recognition system according to an embodiment.
- FIG. 2 is a block diagram showing a configuration example of the first recognition device according to the embodiment.
- FIG. 3 is a block diagram showing a configuration example of the VCS according to the embodiment.
- FIG. 4 is a block diagram showing a configuration example of the matching device according to the embodiment.
- FIG. 5 is a block diagram showing a configuration example of the second recognition device according to the embodiment.
- FIG. 6 is a diagram showing an example of an input screen according to the embodiment.
- FIG. 7 is a flowchart showing an operation example of the first recognition device according to the embodiment.
- FIG. 8 is a flowchart showing an operation example of the VCS according to the embodiment.
- FIG. 9 is a flowchart showing an operation example of the second recognition device according to the embodiment.
- FIG. 10 is a flowchart showing an operation example of the second recognition device according to the embodiment.
- FIG. 11 is a flowchart showing an operation example of the matching device according to the embodiment.
- the recognition system reads a destination such as an address from an article put into the sorter.
- the recognition system sets the sorting destination of the article (for example, a chute of the sorter) based on the read destination.
- the recognition system accepts the input of the destination from the operator who visually checks the destination.
- FIG. 1 shows a configuration example of the recognition system 1 according to the embodiment.
- the recognition system 1 includes a sorter 2, a camera 3, a first recognition device 10, a VCS 20 (20a to 20d), a matching device 30 (30a to 30d), and an operation unit 40 (40a and 40b).
- a display unit 50 (50a and 50b), a second recognition device 60, and the like are provided.
- the first recognition device 10 is connected to the sorter 2, the camera 3, and the VCS 20.
- the VCSs 20a to 20d are connected to the matching devices 30a to 30d, respectively.
- the matching devices 30a and 30b are connected to the operation units 40a and 40b, respectively. Further, the matching devices 30a and 30b are connected to the display units 50a and 50b, respectively.
- the second recognition device 60 is connected to the camera 3 and the matching device 30.
- the recognition system 1 may further include a configuration as required in addition to the configuration shown in FIG. 1, or a specific configuration may be excluded from the recognition system 1.
- the sorter 2 sorts input articles into sorting destinations based on the signal from the first recognition device 10.
- the sorter 2 includes a plurality of chutes as partitioned sorting destinations.
- the sorter 2 puts the article into a chute based on the signal from the first recognition device 10.
- the sorter 2 acquires, from the first recognition device 10, sorting information indicating an ID for identifying an article and the sorting destination (for example, a chute number) into which the article is to be put.
- the sorter 2 puts the article into the chute based on the sorting information.
- the camera 3 (image acquisition device) captures an article to be put into the sorter 2.
- the camera 3 captures a surface (destination surface) on which the destination of the article is described.
- the camera 3 is installed on a transport path for loading articles into the sorter 2.
- the camera 3 may capture an article from a plurality of surfaces.
- the camera 3 transmits the captured image to the first recognition device 10 and the second recognition device 60.
- the first recognition device 10 sets the sort destination of the article in the sorter 2 based on an image (captured image) or the like from the camera 3. For example, the first recognition device 10 transmits to the sorter 2 the sorting information indicating the ID for identifying the article and the partition to which the article is put.
- the first recognition device 10 will be described in detail later.
- the VCS20 is an input device for inputting a destination included in the captured image (captured image of the destination surface) in which the recognition of the destination fails when the first recognition device 10 fails to recognize the destination. VCS20 will be described in detail later.
- the matching device 30 (information processing device) inputs the destination from the second recognition device 60 to the VCS 20.
- the matching device 30 will be described in detail later.
- the operation unit 40 receives inputs for various operations from the operator.
- the operation unit 40 transmits a signal indicating the input operation to the matching device 30.
- the operation unit 40 is composed of a keyboard, buttons, a touch panel, and the like.
- the display unit 50 displays information based on the control from the matching device 30.
- the display unit 50 is composed of a liquid crystal monitor.
- the display unit 50 is composed of a liquid crystal monitor integrally formed with the operation unit 40.
- the recognition system 1 may include an operation unit and a display unit connected to the matching devices 30c and 30d, respectively.
- the second recognition device 60 recognizes the destination from the captured image by OCR processing (character recognition processing).
- the second recognition device 60 transmits the captured image and the recognized destination to the matching device 30.
- FIG. 2 shows a configuration example of the first recognition device 10.
- the first recognition device 10 includes a processor 11, a memory 12, an operation unit 13, a display unit 14, a camera interface 15, a communication interface 16, and the like.
- the processor 11 and the memory 12, the operation unit 13, the display unit 14, the camera interface 15, and the communication interface 16 are communicably connected to each other via a data bus, a predetermined interface, or the like.
- the first recognition device 10 may further include a configuration as required in addition to the configuration as shown in FIG. 2, or a specific configuration may be excluded from the first recognition device 10.
- the processor 11 controls the operation of the first recognition device 10 as a whole. For example, the processor 11 generates sorting information based on the recognition result of the destination and transmits it to the sorter 2.
- the processor 11 is composed of a CPU and the like. Further, the processor 11 may be composed of an ASIC (Application Specific Integrated Circuit) or the like. Further, the processor 11 may be composed of an FPGA (Field Programmable Gate Array) or the like.
- the memory 12 stores various data.
- the memory 12 functions as a ROM, RAM and NVM.
- the memory 12 stores a control program, control data, and the like.
- the control program and control data are preliminarily incorporated according to the specifications of the first recognition device 10.
- the control program is a program that supports the functions realized by the first recognition device 10.
- the memory 12 temporarily stores data and the like being processed by the processor 11. Further, the memory 12 may store data necessary for executing the application program, an execution result of the application program, and the like.
- the operation unit 13 receives inputs for various operations from the operator.
- the operation unit 13 transmits a signal indicating the input operation to the processor 11.
- the operation unit 13 is composed of a keyboard, buttons, a touch panel, and the like.
- the display unit 14 displays information based on the control from the processor 11.
- the display unit 14 is composed of a liquid crystal monitor.
- the display unit 14 is composed of a liquid crystal monitor integrally formed with the operation unit 13.
- the camera interface 15 is an interface for transmitting and receiving data to and from the camera 3.
- the camera interface 15 is connected to the camera 3 by wire.
- the camera interface 15 receives the captured image from the camera 3.
- the camera interface 15 transmits the received captured image to the processor 11. Further, the camera interface 15 may supply electric power to the camera 3.
- the communication interface 16 is an interface for transmitting and receiving data to and from the sorter 2 and the VCS 20.
- the communication interface 16 supports a LAN (Local Area Network) connection.
- the communication interface 16 may support a USB (Universal Serial Bus) connection.
- the communication interface 16 may be composed of an interface for transmitting and receiving data to and from the sorter 2 and an interface for transmitting and receiving data to and from the VCS 20.
- Next, the VCS 20 will be described. Since the VCSs 20a to 20d have the same configuration, they will be described as the VCS 20.
- FIG. 3 shows a configuration example of the VCS 20.
- the VCS 20 includes a processor 21, a memory 22, an operation unit interface 23, a display unit interface 24, a communication interface 25, and the like.
- the processor 21, the memory 22, the operation unit interface 23, the display unit interface 24, and the communication interface 25 are communicably connected to each other via a data bus, a predetermined interface, or the like.
- the VCS 20 may further include a configuration as required, or a specific configuration may be excluded from the VCS 20.
- the processor 21 controls the operation of the entire VCS 20. For example, the processor 21 outputs, through the display unit interface 24, a captured image for which destination recognition has failed.
- the processor 21 is composed of a CPU and the like. Further, the processor 21 may be composed of an ASIC or the like. Further, the processor 21 may be composed of an FPGA or the like.
- the memory 22 stores various data.
- the memory 22 functions as a ROM, RAM and NVM.
- the memory 22 stores a control program, control data, and the like.
- the control program and control data are preliminarily incorporated according to the specifications of the VCS 20.
- the control program is a program that supports the functions realized by the VCS 20.
- the memory 22 temporarily stores data and the like being processed by the processor 21. Further, the memory 22 may store data necessary for executing the application program, an execution result of the application program, and the like.
- the operation unit interface 23 is an interface for transmitting and receiving data to and from an input device that receives input for operations.
- the operation unit interface 23 receives an operation signal indicating an operation (key input operation) input to the keyboard from the input device.
- the operation unit interface 23 transmits the received operation signal to the processor 21.
- the operation unit interface 23 may supply electric power to the input device.
- the operating unit interface 23 supports a USB connection.
- the display unit interface 24 is an interface for transmitting and receiving data to and from a display device that displays an image.
- the display unit interface 24 outputs the image data from the processor 21 to the display device.
- the communication interface 25 is an interface for transmitting and receiving data to and from the first recognition device 10 and the matching device 30.
- the communication interface 25 supports a LAN connection.
- the communication interface 25 may support a USB connection.
- the communication interface 25 may be composed of an interface for transmitting and receiving data with the first recognition device 10 and an interface for transmitting and receiving data with the matching device 30.
- the matching device 30 will be described. Since the matching devices 30a to 30d have the same configuration, they will be described as the matching device 30.
- FIG. 4 shows a configuration example of the matching device 30.
- the matching device 30 includes a processor 31, a memory 32, an image interface 33, an input interface 34, an operation unit interface 35, a display unit interface 36, a communication interface 37, and the like.
- the processor 31 and the memory 32, the image interface 33, the input interface 34, the operation unit interface 35, the display unit interface 36, and the communication interface 37 are communicably connected to each other via a data bus or a predetermined interface.
- the matching device 30 may further include a configuration as required, or a specific configuration may be excluded from the matching device 30.
- the processor 31 controls the operation of the entire matching device 30. For example, the processor 31 inputs the destination from the second recognition device 60 to the VCS 20.
- the processor 31 is composed of a CPU and the like. Further, the processor 31 may be composed of an ASIC or the like. Further, the processor 31 may be composed of an FPGA or the like.
- the memory 32 stores various data.
- the memory 32 functions as a ROM, RAM and NVM.
- the memory 32 stores a control program, control data, and the like.
- the control program and control data are preliminarily incorporated according to the specifications of the matching device 30.
- the control program is a program that supports the functions realized by the matching device 30.
- the memory 32 temporarily stores data and the like being processed by the processor 31. Further, the memory 32 may store data necessary for executing the application program, an execution result of the application program, and the like.
- the image interface 33 is connected to the display unit interface 24 of the VCS 20.
- the image interface 33 acquires an image from the display unit interface 24. That is, the image interface 33 acquires the image that the processor 21 of the VCS 20 intends to display on a display device.
- the image interface 33 transmits the acquired image to the processor 31.
- the image interface 33 is composed of a capture board or the like.
- the input interface 34 is connected to the operation unit interface 23.
- the input interface 34 inputs an operation signal indicating a key input operation to the operation unit interface 23 according to the control from the processor 31.
- the input interface 34 supports a USB connection.
- the operation unit interface 35 is an interface for transmitting and receiving data to and from the operation unit 40.
- the operation unit interface 35 receives an operation signal indicating an operation input to the operation unit 40 from the operation unit 40.
- the operation unit interface 35 transmits the received operation signal to the processor 31.
- the operation unit interface 35 may supply electric power to the operation unit 40.
- the operating unit interface 35 supports a USB connection.
- the display unit interface 36 is an interface for transmitting and receiving data to and from the display unit 50.
- the display unit interface 36 outputs the image data from the processor 31 to the display unit 50.
- the communication interface 37 (second communication interface) is an interface for transmitting and receiving data to and from the VCS 20, other matching devices 30, and the second recognition device 60.
- the communication interface 37 supports a LAN connection.
- the communication interface 37 may support a USB connection.
- the communication interface 37 may be composed of an interface for transmitting and receiving data to and from the VCS 20, an interface for transmitting and receiving data to and from other matching devices 30, and an interface for transmitting and receiving data to and from the second recognition device 60.
- matching devices 30c and 30d do not have to include the operation unit interface 35 and the display unit interface 36.
- FIG. 5 shows a configuration example of the second recognition device 60.
- the second recognition device 60 includes a processor 61, a memory 62, an operation unit 63, a display unit 64, a camera interface 65, a communication interface 66, and the like.
- the processor 61, the memory 62, the operation unit 63, the display unit 64, the camera interface 65, and the communication interface 66 are communicably connected to each other via a data bus, a predetermined interface, or the like.
- the second recognition device 60 may further include a configuration as required in addition to the configuration shown in FIG. 5, or a specific configuration may be excluded from the second recognition device 60.
- the processor 61 controls the operation of the entire second recognition device 60. For example, the processor 61 acquires a destination from a captured image.
- the processor 61 is composed of a CPU and the like. Further, the processor 61 may be composed of an ASIC (Application Specific Integrated Circuit) or the like. Further, the processor 61 may be composed of an FPGA (Field Programmable Gate Array) or the like.
- the memory 62 stores various data.
- the memory 62 functions as a ROM, RAM and NVM.
- the memory 62 stores a control program, control data, and the like.
- the control program and control data are preliminarily incorporated according to the specifications of the second recognition device 60.
- the control program is a program that supports the functions realized by the second recognition device 60.
- the memory 62 temporarily stores data and the like being processed by the processor 61. Further, the memory 62 may store data necessary for executing the application program, an execution result of the application program, and the like.
- the operation unit 63 receives inputs for various operations from the operator.
- the operation unit 63 transmits a signal indicating the input operation to the processor 61.
- the operation unit 63 includes a keyboard, buttons, a touch panel, and the like.
- the display unit 64 displays information based on the control from the processor 61.
- the display unit 64 is composed of a liquid crystal monitor.
- the display unit 64 is composed of a liquid crystal monitor integrally formed with the operation unit 63.
- the camera interface 65 (image acquisition device interface) is an interface for transmitting and receiving data to and from the camera 3.
- the camera interface 65 is connected to the camera 3 by wire.
- the camera interface 65 receives the captured image from the camera 3.
- the camera interface 65 transmits the received captured image to the processor 61.
- the communication interface 66 (first communication interface) is an interface for transmitting and receiving data to and from the matching device 30.
- the communication interface 66 supports a LAN connection.
- the communication interface 66 may support a USB connection.
- the function realized by the first recognition device 10 is realized by the processor 11 executing a program stored in the memory 12 or the like.
- the processor 11 has a function of acquiring a captured image including the destination surface from the camera 3.
- the camera 3 captures an image at the timing when the article passes through the imaging region of the camera 3.
- the camera 3 transmits the captured image to the first recognition device 10.
- the processor 11 acquires a captured image including the destination surface from the camera 3 through the camera interface 15.
- the processor 11 may send a request to the camera 3 and receive a response including the captured image.
- the processor 11 has a function of acquiring a destination from the captured image by OCR processing.
- When the captured image is acquired, the processor 11 performs OCR processing on the captured image according to a predetermined algorithm (first algorithm). When the OCR processing is completed, the processor 11 acquires the destination described on the destination surface of the article based on the result of the OCR processing.
- the processor 11 has a function of acquiring the destination by using the VCS 20 when the OCR processing fails.
- the processor 11 transmits the captured image to the VCS 20 through the communication interface 16.
- the processor 11 selects one VCS 20 from the VCSs 20a to 20d and transmits the captured image to the selected VCS 20.
- the VCS 20 transmits the destination described on the destination surface included in the captured image to the first recognition device 10.
- the processor 11 acquires a destination from the VCS 20 through the communication interface 16.
- the processor 11 has a function of setting the sorting destination of the article based on the destination acquired by the OCR processing or the destination from the VCS 20.
- the processor 11 sets, as the sorting destination, the number of the chute of the sorter 2 into which the article is to be put, based on the destination. For example, the processor 11 sets the number of the chute corresponding to the administrative division of the destination (prefecture, municipality, etc.).
- the processor 11 transmits the sorting information indicating the ID for identifying the article and the sorting destination of the article to the sorter 2 through the communication interface 16.
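The first recognition device's flow described above (first-algorithm OCR, fallback to the VCS on failure, then mapping the destination to a chute number) can be sketched as follows. `first_ocr`, `vcs_input`, and the chute table are illustrative stand-ins, not the patent's implementation.

```python
def sort_article(article_id, captured_image, first_ocr, vcs_input, chute_by_region):
    """Recognize the destination and build the sorting information
    (article ID plus chute number) sent to the sorter."""
    destination = first_ocr(captured_image)
    if destination is None:                       # first algorithm failed
        destination = vcs_input(captured_image)   # destination typed via the VCS
    chute = chute_by_region.get(destination)
    return {"id": article_id, "chute": chute}     # sorting information


# Stand-in usage: the first OCR fails, so the VCS supplies the destination.
info = sort_article(
    article_id=7,
    captured_image="blurry_label",
    first_ocr=lambda img: None,
    vcs_input=lambda img: "Tokyo",
    chute_by_region={"Tokyo": 3, "Osaka": 5},
)
```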
- the function realized by the VCS 20 is realized by the processor 21 executing a program stored in the memory 22 or the like.
- the processor 21 has a function of acquiring a captured image including the destination surface from the first recognition device 10.
- the processor 11 of the first recognition device 10 transmits the captured image to the VCS 20 when the OCR process fails.
- the processor 21 of the VCS 20 acquires the captured image from the first recognition device 10 through the communication interface 25.
- the processor 21 has a function of transmitting the acquired captured image to the matching device 30.
- When the captured image is acquired, the processor 21 generates an input screen (display screen) that accepts input of the destination shown in the captured image.
- the input screen includes the acquired captured image.
- FIG. 6 shows an example of the input screen 100 generated by the processor 21.
- the input screen 100 includes an image area 101, an input field 102, and the like.
- the image area 101 displays a captured image (display screen image) acquired from the first recognition device 10.
- the image area 101 displays a captured image including the destination surface.
- the image resolution of the character string included in the captured image displayed by the image area 101 may be lower than the image resolution of the character string included in the captured image captured by the camera 3.
- the input field 102 is formed in the lower part of the image area 101.
- the input field 102 accepts the input of the destination described on the destination surface of the captured image displayed by the image area 101.
- the input screen 100 may include an icon or the like for confirming the input to the input field 102. Further, the input field 102 may be formed in the upper part of the image area 101.
- the configuration of the input screen is not limited to a specific configuration.
- When the input screen is generated, the processor 21 outputs the generated input screen through the display unit interface 24.
- the processor 21 outputs the input screen in the same manner as when a display device is connected to the display unit interface 24. That is, the processor 21 outputs, through the display unit interface 24, a signal similar to the signal that would be output to a display device such as a display.
- the processor 21 has a function of accepting the input of the destination through the operation unit interface 23.
- When the input screen is output, the processor 21 receives the input of the destination through the operation unit interface 23. The processor 21 acquires the same signal (an operation signal indicating a key input operation) as when an operation unit is connected to the operation unit interface 23.
- the processor 21 has a function of transmitting the destination whose input has been accepted (information indicating the destination) to the first recognition device 10.
- When the processor 21 receives an operation signal confirming the input through the operation unit interface 23 (for example, an operation signal indicating that the enter key has been pressed), the processor 21 transmits the input destination to the first recognition device 10 through the communication interface 25.
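The VCS-side behavior above can be sketched as a small session loop: emit an input screen containing the captured image on the display-unit interface, then collect key-input operation signals until the enter key confirms the input. The screen layout and the signal representation (one character per signal, newline for enter) are assumptions for illustration.

```python
ENTER = "\n"

def vcs_session(captured_image, display_out, key_signals):
    """Output the input screen, then assemble key signals into the
    destination string, returning it once input is confirmed."""
    display_out({"image_area": captured_image, "input_field": ""})  # input screen
    buffer = []
    for key in key_signals:        # operation signals from the operation unit interface
        if key == ENTER:           # input confirmed
            return "".join(buffer) # destination sent to the first recognition device
        buffer.append(key)
    return None                    # input never confirmed


frames = []
destination = vcs_session("dest_face.png", frames.append, list("OSAKA") + [ENTER])
```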
- the function realized by the second recognition device 60 is realized by the processor 61 executing a program stored in the memory 62 or the like.
- the processor 61 has a function of acquiring a captured image including the destination surface from the camera 3.
- the processor 61 acquires a captured image from the camera 3 through the camera interface 65.
- the processor 61 acquires from the camera 3 a captured image similar to the captured image acquired by the first recognition device 10.
- the processor 61 has a function of acquiring a destination from the captured image by OCR processing.
- When the captured image is acquired, the processor 61 performs OCR processing on the captured image according to a predetermined algorithm (second algorithm) different from the first algorithm.
- the second algorithm can recognize at least a part of the character image that the first algorithm cannot recognize.
- the processor 61 acquires the destination described on the destination surface of the article based on the result of the OCR processing.
- When a plurality of captured images are acquired, the processor 61 performs OCR processing on each captured image and acquires a destination based on the result of each OCR processing.
- the processor 61 associates the captured image with the destination and stores it in the memory 62.
- the processor 61 may store the destination as the file name of the captured image.
- the processor 61 may set an image ID or the like in the captured image.
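The storage step above can be sketched as a small keyed store. This is a minimal illustration, not the patent's implementation; the `DestinationStore` class, the SHA-1-derived image ID, and the record layout are all assumptions:

```python
import hashlib

class DestinationStore:
    """Associates captured images with OCR-recognized destinations (cf. memory 62)."""

    def __init__(self):
        self._records = {}  # image_id -> (image_bytes, capture_time, destination)

    def put(self, image_bytes, capture_time, destination):
        # Derive a stable image ID from the image content (one way to "set an image ID").
        image_id = hashlib.sha1(image_bytes).hexdigest()[:12]
        self._records[image_id] = (image_bytes, capture_time, destination)
        return image_id

    def get(self, image_id):
        return self._records.get(image_id)

store = DestinationStore()
img_id = store.put(b"<fake image bytes>", capture_time=100.0, destination="1-2-3 Chiyoda, Tokyo")
```

Storing the destination as the file name, as the patent also suggests, would be an equally valid key scheme.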
- the processor 61 has a function of associating the captured image with the destination and transmitting the captured image to the matching device 30.
- the processor 61 receives a request for a captured image and a destination from the matching device 30 through the communication interface 66.
- the request asks for a captured image captured by the camera 3 during a predetermined period (or acquired by the processor 61 during that period) and the destination corresponding to the captured image.
- the processor 61 acquires the captured image and the destination from the memory 62 according to the request. Upon acquiring the captured image and the destination, the processor 61 transmits a response including the captured image and the destination to the matching device 30 through the communication interface 66.
- the function realized by the matching device 30 is realized by the processor 31 executing a program stored in the memory 32 or the like.
- the processor 31 has a function of acquiring an input screen from the VCS 20.
- the processor 31 acquires an input screen through an image interface 33 connected to the display interface 24 of the VCS 20. That is, the processor 31 acquires the captured image including the destination surface from the VCS 20.
- the processor 31 has a function of acquiring the captured image and the destination corresponding to the captured image from the second recognition device 60.
- the processor 31 estimates the time at which the captured image included in the input screen was captured by the camera 3, based on the time at which the input screen was acquired. For example, the processor 31 estimates the capture time based on the time taken from when the camera 3 captures the image until the first recognition device 10 performs recognition and the VCS 20 outputs the input screen.
- the processor 31 estimates a period (estimated period) having a predetermined width as the time at which the camera 3 captured the image. For example, the processor 31 acquires, as the estimated period, the interval between a time a predetermined amount before the time at which the input screen was acquired and a time a predetermined amount after that time.
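One plausible reading of the estimation step is to step back from the screen-acquisition time by the expected pipeline delay and widen the result by a margin. A minimal sketch; the function name and the delay/margin values are illustrative, not from the patent:

```python
def estimate_capture_period(screen_acquired_at, pipeline_delay=5.0, margin=2.0):
    """Estimate when camera 3 captured the image shown on the input screen.

    screen_acquired_at: time (seconds) the matching device acquired the input screen.
    pipeline_delay: expected time from capture until the VCS outputs the input screen.
    margin: half-width of the estimated period.
    """
    center = screen_acquired_at - pipeline_delay
    return (center - margin, center + margin)

start, end = estimate_capture_period(100.0)  # (93.0, 97.0)
```

The margin trades off request size against the risk of missing the matching candidate; the patent also allows extending the period and retrying when the search fails.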
- the processor 31 transmits, through the communication interface 37, a request for the images captured during the estimated period (candidate captured images) and the destinations corresponding to those images.
- the processor 31 receives the response including the candidate captured image and the destination from the second recognition device 60 through the communication interface 37.
- the processor 31 has a function of searching the captured image included in the input screen from the acquired candidate captured images. That is, the processor 31 searches for a candidate captured image corresponding to the captured image included in the input screen.
- the processor 31 matches the candidate captured image with the captured image included in the input screen according to a predetermined algorithm.
- the processor 31 calculates the degree of similarity between each candidate captured image and the captured image included in the input screen.
- based on the similarity, the processor 31 identifies the candidate captured image corresponding to the captured image included in the input screen.
- the method by which the processor 31 searches for the captured image included in the input screen from the candidate captured images is not limited to a specific method.
- the processor 31 has a function of inputting a destination corresponding to the searched candidate captured image to the operation unit interface 23 of the VCS 20.
- the processor 31 acquires the destination corresponding to the captured image and the identified candidate captured image.
- the processor 31 inputs the acquired destination to the operation unit interface 23 of the VCS 20 through the input interface 34. That is, the processor 31 inputs an operation signal indicating a key input operation for inputting a destination to the operation unit interface 23.
- the processor 31 may input an operation signal indicating an operation for completing the input of the destination to the operation unit interface 23.
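Entering the destination through the operation unit interface amounts to replaying the key events an operator would have typed, terminated by a signal completing the input. A minimal sketch; the event tuples are hypothetical, since the actual signal format depends on the keyboard interface of the VCS:

```python
def destination_to_key_events(destination, confirm=True):
    """Translate a destination string into a sequence of key-input operation signals."""
    events = [("KEY", ch) for ch in destination]
    if confirm:
        # Operation signal indicating the operation for completing the input.
        events.append(("KEY", "ENTER"))
    return events

events = destination_to_key_events("TOKYO")
```

From the VCS side, this stream is indistinguishable from a physical keyboard on the operation unit interface 23, which is what lets the matching device stand in for an operator.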
- the processor 31 has a function of inputting an operation signal indicating an operation input to the operation unit 40 to the operation unit interface 23 when the search for the captured image included in the input screen fails.
- the processor 31 determines that the search for the captured image included in the input screen has failed.
- the processor 31 displays the input screen from the VCS 20 on the display unit 50.
- the processor 31 accepts the input to the operation unit 40.
- upon receiving an input to the operation unit 40, the processor 31 inputs an operation signal indicating the input operation to the operation unit interface 23.
- the processor 31 may update the input screen on the display unit 50. That is, the processor 31 acquires the input screen from the display unit interface 24 in real time and displays it on the display unit 50.
- the operator visually observes the image area of the input screen displayed on the display unit 50 and inputs the destination to the operation unit 40.
- the operator inputs the operation to complete the input to the operation unit 40.
- the processor 31 displays the input screen on the display unit 50 connected to the other matching device 30. Further, the processor 31 inputs an operation signal indicating an operation input to the operation unit 40 connected to the other matching device 30 to the operation unit interface 23 of the VCS 20.
- the main matching device 30 (for example, the matching device 30a) or the external control device may manage the operation unit 40 used for input and the display unit 50 for displaying the input screen.
- FIG. 7 is a flowchart for explaining an operation example of the first recognition device 10.
- the processor 11 of the first recognition device 10 acquires a captured image including the destination surface of the article through the camera interface 15 (S11).
- the processor 11 performs OCR processing on the captured image according to the first algorithm (S12).
- the processor 11 determines whether the destination was acquired by the OCR processing (S13). When the acquisition of the destination fails (S13, NO), the processor 11 transmits the captured image to the VCS 20 through the communication interface 16 (S14).
- the processor 11 determines whether or not the destination has been received from the VCS 20 through the communication interface 16 (S15).
- if it is determined that the destination has not been received from the VCS 20 (S15, NO), the processor 11 returns to S15.
- based on the destination acquired by the OCR processing or the destination received from the VCS 20, the processor 11 sets the sorting destination of the article in the sorter 2 (S16). When the sorting destination of the article is set in the sorter 2, the processor 11 ends the operation.
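The S11 to S16 flow of FIG. 7 can be summarized as: run the first-algorithm OCR, and on failure hand the captured image to the VCS and block until the destination comes back. A minimal sketch with caller-supplied stand-ins for the OCR, VCS, and sorter calls (all hypothetical names):

```python
def first_recognition_flow(captured_image, ocr_first, send_to_vcs, wait_for_vcs,
                           set_sort_destination):
    """FIG. 7 sketch: S12 OCR, S13 success check, S14/S15 VCS round-trip, S16 set sorter."""
    destination = ocr_first(captured_image)          # S12
    if destination is None:                          # S13, NO
        send_to_vcs(captured_image)                  # S14
        destination = wait_for_vcs()                 # S15 (blocks until received)
    set_sort_destination(destination)                # S16
    return destination

# Example: first-algorithm OCR fails, destination comes back from the VCS.
dest = first_recognition_flow(
    b"<image bytes>",
    ocr_first=lambda img: None,                # S13, NO: recognition failed
    send_to_vcs=lambda img: None,              # S14
    wait_for_vcs=lambda: "OSAKA",              # S15: destination entered on the VCS side
    set_sort_destination=lambda dst: None,     # S16
)
```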
- FIG. 8 is a flowchart for explaining an operation example of the VCS20.
- the processor 21 of the VCS 20 determines whether or not the captured image has been received from the first recognition device 10 through the communication interface 25 (S21). If it is determined that the captured image has not been received from the first recognition device 10 (S21, NO), the processor 21 returns to S21.
- when it is determined that the captured image has been received from the first recognition device 10 (S21, YES), the processor 21 outputs an input screen including the captured image through the display unit interface 24 (S22).
- the processor 21 determines whether or not the input of the destination has been accepted through the operation unit interface 23 (S23). If it is determined that the input of the destination is not accepted (S23, NO), the processor 21 returns to S23.
- when the input of the destination is accepted (S23, YES), the processor 21 transmits the input destination to the first recognition device 10 through the communication interface 25 (S24).
- the processor 21 ends the operation.
- FIG. 9 is a flowchart for explaining an operation example in which the second recognition device 60 acquires a destination.
- the processor 61 of the second recognition device 60 acquires a captured image including the destination surface of the article through the camera interface 65 (S31).
- the processor 61 performs OCR processing on the captured image according to the second algorithm (S32).
- the processor 61 acquires the destination based on the result of the OCR processing, associates the captured image with the destination, and stores them in the memory 62 (S34).
- FIG. 10 is a flowchart for explaining an operation example in which the second recognition device 60 transmits a captured image and a destination.
- the processor 61 of the second recognition device 60 determines whether or not a request has been received from the matching device 30 through the communication interface 66 (S41). If it is determined that the request has not been received (S41, NO), the processor 61 returns to S41.
- the processor 61 acquires the captured image and the destination from the memory 62 according to the received request (S42). Upon acquiring the captured image and the destination, the processor 61 transmits a response including the captured image and the destination to the matching device 30 through the communication interface 66 (S43). When the response is transmitted, the processor 61 ends the operation.
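Serving the matching device's request reduces to filtering the stored records by capture time. A minimal sketch; the record layout (image bytes, capture time, destination) is an assumption:

```python
def handle_candidate_request(records, period_start, period_end):
    """S41-S43 sketch: select stored (image, capture_time, destination) records
    whose capture time falls inside the requested period."""
    return {
        image_id: (image, destination)
        for image_id, (image, capture_time, destination) in records.items()
        if period_start <= capture_time <= period_end
    }

records = {
    "a": (b"img-a", 94.0, "NAGOYA"),
    "b": (b"img-b", 99.5, "SAPPORO"),
}
response = handle_candidate_request(records, 93.0, 97.0)
```

The response would then be serialized and sent back over the communication interface 66; that transport layer is out of scope here.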
- FIG. 11 is a flowchart for explaining an operation example of the matching device 30.
- the processor 31 of the matching device 30 determines whether or not the input screen has been acquired through the image interface 33 (S51). If it is determined that the input screen has not been acquired (S51, NO), the processor 31 returns to S51.
- the processor 31 acquires, from the second recognition device 60, the candidate captured images and destinations based on the time at which the input screen was acquired (S52).
- the processor 31 searches for the captured image included in the input screen from each candidate captured image (S53).
- when the captured image is found among the candidates, the processor 31 acquires the destination corresponding to the identified candidate captured image (S55).
- the processor 31 inputs an operation signal indicating a key input operation for inputting the destination through the input interface 34 to the operation unit interface 23 of the VCS 20 (S56).
- when the search for the captured image fails, the processor 31 displays the input screen on the display unit 50 (S57).
- the processor 31 inputs an operation signal indicating the operation input to the operation unit 40 to the operation unit interface 23 of the VCS 20 (S58).
- the processor 31 repeats S58 until it receives the input of an operation indicating that the input is complete.
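Steps S52 to S58 of FIG. 11 combine the pieces above: fetch candidates, search, and either inject the stored destination or fall back to operator input. A minimal sketch with hypothetical caller-supplied hooks:

```python
def matching_flow(screen_image, fetch_candidates, search, inject_destination,
                  operator_input):
    """FIG. 11 sketch: S52 fetch, S53 search, S55-S56 inject, S57-S58 fallback.

    fetch_candidates: returns {candidate_id: (image, destination)}.
    search: returns the matching candidate_id, or None when the search fails.
    inject_destination: sends key-input operation signals to the VCS.
    operator_input: displays the screen and collects the operator's typing.
    """
    candidates = fetch_candidates()                 # S52
    match = search(screen_image, candidates)        # S53
    if match is not None:
        destination = candidates[match][1]          # S55
        inject_destination(destination)             # S56
        return destination
    destination = operator_input(screen_image)      # S57-S58: manual fallback
    inject_destination(destination)
    return destination
```

The key property is that the VCS cannot tell the two branches apart: both end in the same operation signals on its operation unit interface.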
- the estimated period may be extended to acquire additional captured images and destinations from the second recognition device 60. The processor 31 then searches again for the captured image included in the input screen.
- the processor 31 of the matching device 30 may perform OCR processing according to the second algorithm on the captured image acquired by the second recognition device 60 to acquire the destination. For example, when the processor 61 of the second recognition device 60 acquires the captured image from the camera 3, it transmits the captured image to one of the matching devices 30. The processor 31 of the matching device 30 performs OCR processing on the received captured image to acquire the destination. When the destination is acquired, the processor 31 transmits the acquired destination to the second recognition device 60. The processor 61 of the second recognition device 60 stores the captured image and the received destination in the memory 62 in association with each other.
- the OCR processing by the second algorithm may be performed by both the matching device 30 and the second recognition device 60. Further, the OCR processing by the second algorithm may be executed by an external device. For example, the OCR processing by the second algorithm is executed by cloud computing. In this case, the processor 61 of the second recognition device 60 transmits the captured image to the external device. The processor 61 acquires the result of the OCR process from the external device.
- the function of the second recognition device 60 may be performed by any of the matching devices 30.
- the matching device 30 may be connected to a plurality of operation units and display units. Further, the matching device 30 may be integrally formed with the operation unit and the display unit.
- first recognition device 10 may be integrally formed with the VCS 20. Further, the first recognition device 10 may be integrally formed with the camera 3. Further, the first recognition device 10 may be integrally formed with the sorter 2.
- the VCS 20 may include an operation unit and a display unit.
- the recognition system 1 may recognize a character string other than the destination of the article.
- the character string recognized by the recognition system 1 is not limited to a specific configuration.
- the recognition system configured as described above acquires captured images in advance from the camera connected to the first recognition device.
- the recognition system 1 performs OCR processing on the captured image in advance according to a second algorithm different from the first algorithm of the first recognition device.
- the recognition system stores the destination based on the result of the OCR process in a memory or the like in advance.
- the recognition system acquires, from the VCS, the captured image for which the first recognition device failed to recognize the destination.
- the recognition system searches, from the captured images acquired in advance, for a captured image that matches the captured image from the VCS.
- the recognition system inputs the destination corresponding to the searched captured image into the VCS.
- the recognition system can effectively acquire the destination without modifying the first recognition device.
- the recognition system can input the destination to the VCS at high speed by storing the destination based on the OCR processing according to the second algorithm in advance.
- as a result, the recognition system can perform OCR processing according to the second algorithm on the high-resolution captured image.
Abstract
Description
Hereinafter, embodiments will be described with reference to the drawings.
The recognition system according to the embodiment reads a destination such as an address from an article put into the sorter. The recognition system sets the sorting destination of the article (for example, a chute of the sorter) based on the read destination. When the recognition system fails to read the destination, it accepts the input of the destination from an operator who visually checks the destination.
Next, the first recognition device 10 will be described.
FIG. 2 shows a configuration example of the first recognition device 10. As shown in FIG. 2, the first recognition device 10 includes a processor 11, a memory 12, an operation unit 13, a display unit 14, a camera interface 15, a communication interface 16, and the like. The processor 11 is communicably connected to the memory 12, the operation unit 13, the display unit 14, the camera interface 15, and the communication interface 16 via a data bus, a predetermined interface, or the like.
The memory 12 stores a control program, control data, and the like. The control program and the control data are incorporated in advance according to the specifications of the first recognition device 10. For example, the control program is a program that supports the functions realized by the first recognition device 10.
Next, the VCS 20 will be described.
Since the VCSs 20a to 20d have the same configuration, they are described as the VCS 20.
The memory 22 stores a control program, control data, and the like. The control program and the control data are incorporated in advance according to the specifications of the VCS 20. For example, the control program is a program that supports the functions realized by the VCS 20.
Next, the matching device 30 will be described.
Since the matching devices 30a to 30d have the same configuration, they are described as the matching device 30.
The memory 32 stores a control program, control data, and the like. The control program and the control data are incorporated in advance according to the specifications of the matching device 30. For example, the control program is a program that supports the functions realized by the matching device 30.
Next, the second recognition device 60 will be described.
FIG. 5 shows a configuration example of the second recognition device 60. As shown in FIG. 5, the second recognition device 60 includes a processor 61, a memory 62, an operation unit 63, a display unit 64, a camera interface 65, a communication interface 66, and the like. The processor 61 is communicably connected to the memory 62, the operation unit 63, the display unit 64, the camera interface 65, and the communication interface 66 via a data bus, a predetermined interface, or the like.
The memory 62 stores a control program, control data, and the like. The control program and the control data are incorporated in advance according to the specifications of the second recognition device 60. For example, the control program is a program that supports the functions realized by the second recognition device 60.
As will be described later, the processor 11 acquires the destination from the VCS 20 through the communication interface 16.
Further, the input field 102 may be formed above the image area 101. The configuration of the input screen is not limited to a specific configuration.
First, the processor 61 acquires a captured image from the camera 3 through the camera interface 65. The processor 61 acquires, from the camera 3, a captured image similar to the captured image acquired by the first recognition device 10.
First, the processor 31 acquires the input screen through the image interface 33 connected to the display unit interface 24 of the VCS 20. That is, the processor 31 acquires the captured image including the destination surface from the VCS 20.
Next, an operation example of the first recognition device 10 will be described. FIG. 7 is a flowchart for explaining an operation example of the first recognition device 10.
When the acquisition of the destination by the OCR processing is successful (S13, YES) or when it is determined that the destination has been received from the VCS 20 (S15, YES), the processor 11 sets the sorting destination of the article in the sorter 2 (S16). When the sorting destination of the article is set in the sorter 2, the processor 11 ends the operation.
Next, an operation example of the VCS 20 will be described. FIG. 8 is a flowchart for explaining an operation example of the VCS 20.
Next, an operation example of the second recognition device 60 will be described. First, an operation example in which the second recognition device 60 acquires a destination will be described.
When it is determined that the request has been received (S41, YES), the processor 61 acquires the captured image and the destination from the memory 62 according to the received request and transmits a response including them to the matching device 30. When the response is transmitted, the processor 61 ends the operation.
Next, an operation example of the matching device 30 will be described. FIG. 11 is a flowchart for explaining an operation example of the matching device 30.
The OCR processing by the second algorithm may be performed by both the matching device 30 and the second recognition device 60. Further, the OCR processing by the second algorithm may be executed by an external device. For example, the OCR processing by the second algorithm is executed by cloud computing. In this case, the processor 61 of the second recognition device 60 transmits the captured image to the external device. The processor 61 acquires the result of the OCR processing from the external device.
The function of the second recognition device 60 may be performed by any of the matching devices 30. Further, the matching device 30 may be formed integrally with the operation unit and the display unit.
Further, the first recognition device 10 may be formed integrally with the camera 3. Further, the first recognition device 10 may be formed integrally with the sorter 2.
Further, the recognition system 1 may recognize a character string other than the destination of the article. The character string recognized by the recognition system 1 is not limited to a specific configuration.
Claims (9)
- An information processing apparatus comprising:
an image interface configured to acquire, from an input device for inputting a character string included in a captured image for which recognition of the character string by a first algorithm has failed, a display screen image based on at least the captured image displayed on a display screen of the input device;
an input interface configured to input a character string to the input device;
a communication interface configured to acquire the captured image from an image acquisition device; and
a processor configured to:
search for the captured image corresponding to the display screen image,
acquire a character string based on a result of character recognition processing of the searched captured image by a second algorithm different from the first algorithm, and
input the character string to the input device through the input interface.
- The information processing apparatus according to claim 1, wherein
the communication interface transmits and receives data to and from a recognition device, and
the processor acquires, from the recognition device through the communication interface, the captured image and the character string based on the result of the character recognition processing of the captured image.
- The information processing apparatus according to claim 1 or 2, wherein the processor acquires, through the communication interface, the captured image captured by the image acquisition device in a predetermined period based on a time at which the display screen image was acquired.
- The information processing apparatus according to any one of claims 1 to 3, wherein an image resolution of the character string included in the display screen image is lower than an image resolution of the character string included in the captured image.
- The information processing apparatus according to any one of claims 1 to 4, wherein the processor inputs, through the input interface, an operation signal indicating a key input operation for inputting the character string.
- The information processing apparatus according to any one of claims 1 to 5, further comprising:
an operation unit interface connected to an operation unit; and
a display unit interface connected to a display unit,
wherein, when the search for the captured image corresponding to the display screen image fails, the processor displays the display screen on the display unit through the display unit interface and inputs, to the input device through the input interface, an operation signal indicating an operation input to the operation unit.
- The information processing apparatus according to any one of claims 1 to 6, wherein
the captured image is an image of a surface on which a destination of an article is described, and
the processor inputs, through the input interface, the destination as the character string.
- A system comprising a recognition device and an information processing apparatus, wherein
the recognition device comprises:
an image acquisition device interface configured to acquire a captured image from an image acquisition device;
a first communication interface configured to transmit and receive data to and from the information processing apparatus; and
a first processor configured to perform character recognition processing on the captured image according to a second algorithm different from a first algorithm, and to transmit, through the first communication interface, the captured image and a character string based on a result of the character recognition processing to the information processing apparatus, and
the information processing apparatus comprises:
an image interface configured to acquire, from an input device for inputting a character string included in the captured image for which recognition of the character string by the first algorithm has failed, a display screen image based on at least the captured image displayed on a display screen of the input device;
an input interface configured to input a character string to the input device;
a second communication interface configured to transmit and receive data to and from the recognition device; and
a second processor configured to:
acquire the captured image from the recognition device through the second communication interface,
search for the captured image corresponding to the display screen image,
acquire, from the recognition device through the second communication interface, a character string based on a result of the character recognition processing of the searched captured image, and
input the character string to the input device through the input interface.
- A control method executed by a processor, the method comprising:
acquiring a captured image from an image acquisition device;
performing character recognition processing on the captured image according to a second algorithm different from a first algorithm;
acquiring, from an input device for inputting a character string included in the captured image for which recognition of the character string by the first algorithm has failed, a display screen image based on at least the captured image displayed on a display screen of the input device;
searching for the captured image corresponding to the display screen image; and
inputting, to the input device, a character string based on a result of the character recognition processing of the searched captured image.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA3169733A CA3169733A1 (en) | 2020-03-09 | 2021-03-02 | Information processing apparatus, system, and control method |
EP21767087.6A EP4120128A4 (en) | 2020-03-09 | 2021-03-02 | Information processing device, system, and control method |
AU2021235688A AU2021235688B2 (en) | 2020-03-09 | 2021-03-02 | Information processing apparatus, system, and control method |
US17/929,845 US20220413623A1 (en) | 2020-03-09 | 2022-09-06 | Information processing apparatus, system, and control method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020039628A JP7443095B2 (en) | 2020-03-09 | 2020-03-09 | Information processing device, system and control method |
JP2020-039628 | 2020-03-09 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/929,845 Continuation US20220413623A1 (en) | 2020-03-09 | 2022-09-06 | Information processing apparatus, system, and control method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021182185A1 true WO2021182185A1 (en) | 2021-09-16 |
Family
ID=77668750
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/007965 WO2021182185A1 (en) | 2020-03-09 | 2021-03-02 | Information processing device, system, and control method |
Country Status (6)
Country | Link |
---|---|
US (1) | US20220413623A1 (en) |
EP (1) | EP4120128A4 (en) |
JP (1) | JP7443095B2 (en) |
AU (1) | AU2021235688B2 (en) |
CA (1) | CA3169733A1 (en) |
WO (1) | WO2021182185A1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005007315A (en) | 2003-06-19 | 2005-01-13 | Toshiba Corp | Dividing machine and dividing method |
JP2010009410A (en) * | 2008-06-27 | 2010-01-14 | Toshiba Corp | Video coding system, classifying system, coding method and classifying method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR3037691B1 (en) * | 2015-06-16 | 2018-04-20 | Solystic | MATCHING PICTURES OF POSTAL ARTICLES WITH DESCRIPTORS OF SINGULARITES FROM THE GRADIENT FIELD |
- 2020-03-09 JP JP2020039628 patent/JP7443095B2/en active Active
- 2021-03-02 CA CA3169733A patent/CA3169733A1/en active Pending
- 2021-03-02 AU AU2021235688A patent/AU2021235688B2/en active Active
- 2021-03-02 EP EP21767087.6A patent/EP4120128A4/en active Pending
- 2021-03-02 WO PCT/JP2021/007965 patent/WO2021182185A1/en active Application Filing
- 2022-09-06 US US17/929,845 patent/US20220413623A1/en active Pending
Non-Patent Citations (1)
Title |
---|
See also references of EP4120128A4 |
Also Published As
Publication number | Publication date |
---|---|
EP4120128A4 (en) | 2024-01-24 |
EP4120128A1 (en) | 2023-01-18 |
AU2021235688B2 (en) | 2023-12-07 |
AU2021235688A1 (en) | 2022-09-29 |
US20220413623A1 (en) | 2022-12-29 |
CA3169733A1 (en) | 2021-09-16 |
JP7443095B2 (en) | 2024-03-05 |
JP2021140631A (en) | 2021-09-16 |
Legal Events
- 121: Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21767087; Country of ref document: EP; Kind code of ref document: A1)
- WWE: Wipo information: entry into national phase (Ref document number: 2021235688; Country of ref document: AU)
- ENP: Entry into the national phase (Ref document number: 3169733; Country of ref document: CA)
- ENP: Entry into the national phase (Ref document number: 2021235688; Country of ref document: AU; Date of ref document: 20210302; Kind code of ref document: A)
- NENP: Non-entry into the national phase (Ref country code: DE)
- ENP: Entry into the national phase (Ref document number: 2021767087; Country of ref document: EP; Effective date: 20221010)