WO2020075837A1 - Information processing system - Google Patents

Information processing system

Info

Publication number
WO2020075837A1
Authority
WO
WIPO (PCT)
Prior art keywords: product, unit, camera, information, settlement
Application number: PCT/JP2019/040161
Other languages: French (fr), Japanese (ja)
Inventors: 寧 蒲原, 敏也 波川, 英揮 川端, 友洋 佐々木
Original Assignee: サインポスト株式会社
Application filed by サインポスト株式会社
Publication of WO2020075837A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 20/00: Payment architectures, schemes or protocols
    • G06Q 20/08: Payment architectures
    • G06Q 20/20: Point-of-sale [POS] network systems
    • G06Q 30/00: Commerce
    • G06Q 30/06: Buying, selling or leasing transactions
    • G07: CHECKING-DEVICES
    • G07G: REGISTERING THE RECEIPT OF CASH, VALUABLES, OR TOKENS
    • G07G 1/00: Cash registers
    • G07G 1/12: Cash registers electronically operated

Definitions

  • the present invention relates to an information processing system.
  • In a conventional store, even at a self-checkout counter the barcode of each product must still be read, so the waiting line at the cash register cannot be eliminated, and the purchaser may have to wait a long time at the cashier. Further, when a purchaser wants to buy a desired product at a store, the purchaser may give up shopping if there are many other purchasers. Therefore, in consideration of the above circumstances, a system is required that, when a purchaser purchases a product displayed in the store, automates the settlement of the price of the product and shortens the time required for that settlement. Further, in conventional stores, fraud such as shoplifting by shoppers and by cashiers is also a problem, and a system capable of preventing such fraud is also required.
  • The present invention has been made in view of such a situation, and aims, when a purchaser purchases a product, to automate the settlement of the price of the product and to improve the accuracy of identifying the product.
  • An information processing system of the present invention includes: first identifying means for obtaining a result of attempting to identify an object as a product by using a first method in which a visual inspector visually confirms an image including the object as a subject; and settlement means for performing settlement processing for the product identified based on the result obtained by the first identifying means.
  • According to the present invention, when a purchaser purchases a product, it is possible to provide an information processing system capable of automating the payment of the price of the product and of increasing the accuracy of identifying the product.
  • FIG. 1 is a diagram showing a list of the essential points of Embodiments 1 to 4 of an information processing system according to the present invention. FIG. 2 is a schematic diagram showing the outline of the system flow in Embodiments 1 and 2. FIG. 3 is a schematic diagram showing the outline of the system flow in Embodiments 3 and 4.
  • FIG. 4 is a diagram showing a layout example of a convenience store that employs the product recognition system according to the first embodiment.
  • FIG. 5 is a schematic perspective view showing an example of the external configuration of the cashier terminal used in Embodiment 1.
  • FIG. 6 is a configuration diagram showing the configuration of the product recognition system as Embodiment 1 of the information processing system of the present invention. FIG. 7 is a block diagram showing the hardware configuration of the server in the product recognition system of FIG. 6.
  • FIG. 10 is a flowchart illustrating an automatic settlement process executed by the server and the cashier terminal of FIG. 9.
  • 10 is a flowchart illustrating processing of a trade-restricted product in the automatic settlement processing executed by the server and the cashier terminal of FIG. 9.
  • FIG. 9 is a diagram showing a layout example of a bookstore that employs the product recognition system according to the second embodiment.
  • FIG. 9 is a schematic perspective view showing an example in which a book is automatically settled by a cashier terminal adopted in the second embodiment.
  • It is a block diagram which shows the structure of the goods recognition system as Embodiment 2 of the information processing system of this invention.
  • It is a block diagram which shows the hardware constitutions of a sales floor apparatus in the goods recognition system of FIG.
  • FIG. 18 is a functional block diagram showing an example of a functional configuration of the server of FIG. 7, the cashier terminal of FIG. 8, and the sales floor device of FIG. 17.
  • 19 is a functional block diagram showing a detailed functional configuration example of a moving object tracking unit provided in the sales floor device of FIG. 18.
  • FIG. 19 is a functional block diagram showing a detailed functional configuration example of a position information management unit provided in the sales floor device of FIG. 18.
  • 19 is a functional block diagram showing a detailed functional configuration example of a book number counting unit provided in the sales floor device of FIG. 18.
  • FIG. 19 is a flowchart illustrating an automatic settlement process executed by the server, the cashier terminal, and the sales floor device of FIG. 18.
  • FIG. 23 is a flowchart for verifying the volume-count information of the product against the number of volumes to be settled in step S210. It is a diagram showing a layout example of the supermarket that employs the product recognition system in Embodiment 3. It is a configuration diagram showing the configuration of the product recognition system as Embodiment 3.
  • FIG. 28 is a functional block diagram showing a detailed functional configuration example of a moving object tracking unit provided in the sales floor device of FIG. 27. It is a functional block diagram showing a detailed functional configuration example of the shelf product recognition unit provided in the sales floor device of FIG. 27.
  • FIG. 28 is a functional block diagram showing a detailed functional configuration example of a basket product recognition unit provided in the sales floor device of FIG. 27. It is a flowchart explaining the basic flow of the automatic settlement process performed by the server, the sales floor device, and the settlement machine. It is a flowchart explaining the process of recognizing the products in a basket in the automatic settlement process performed by the server, the sales floor device of FIG. 27, and the settlement machine. It is a diagram showing a layout example of the supermarket that employs the product recognition system in Embodiment 4. It is a configuration diagram showing the configuration of the product recognition system as Embodiment 4 of the information processing system of the present invention. It is a diagram showing the hardware configuration of the settlement gate in that product recognition system. It is a functional block diagram showing an example of the functional configuration of the server of FIG. 7, the sales floor device of FIG. 26, and the settlement gate.
  • FIG. 13 is a functional block diagram showing a detailed functional configuration example of a moving object tracking unit included in the sales floor device according to the fourth embodiment. It is a functional block diagram showing a detailed functional configuration example of the shelf product recognition unit provided in the sales floor device in Embodiment 4. It is a flowchart explaining the automatic settlement process performed by the server 1, the sales floor device, and the settlement gate of FIG. 36. It is a flowchart explaining the automatic settlement process performed by the server 1, the sales floor device, and the settlement gate of FIG. 36.
  • Embodiments 1 and 2 of a product recognition system including a cashier terminal for specifying a product and Embodiments 3 and 4 of a product recognition system that does not include a cashier terminal for specifying a product will be described respectively.
  • FIG. 1 is a diagram showing a list of the main points of the first to fourth embodiments of the information processing system of the present invention.
  • In the “implementation store” column in FIG. 1, the implementation store of each of Embodiments 1 to 4 is described.
  • the first embodiment is an information processing system that is mainly intended for application in convenience stores.
  • this shop is merely an example, and the application destinations of the first to fourth embodiments are not particularly limited.
  • the implementation store of Embodiment 1 may be a retail store such as a supermarket, a cafeteria, or a store where settlement is performed.
  • In the “payment place” column in FIG. 1, the place where the shopper pays in each of the first to fourth embodiments is described.
  • The settlement place in Embodiments 1 and 2 is exemplified by a cashier terminal.
  • The cashier terminal has a function of identifying an object as a product and settling the price of the product.
  • The settlement place in the third embodiment is exemplified by a cash register table.
  • The cash register table has a function of settling products that have already been specified as products from the objects.
  • The settlement place in the fourth embodiment is exemplified by a settlement gate.
  • The settlement gate has a function of settling products that have already been specified as products from the objects, without the products being placed on a cash register table.
  • the information processing system according to the first embodiment is an information processing system that performs automatic settlement of a product placed on a cashier terminal. Details of each embodiment are described in the “Details” column in FIG. 1. That is, for example, the information processing system according to the first embodiment recognizes a hand-held product placed on the cashier terminal by the cashier camera installed on the cashier terminal, and automatically pays the product.
  • the product recognition system of Embodiments 1 to 4 includes one or more sensing devices that image an object.
  • the sensing device various devices such as a temperature sensor and a distance sensor can be adopted in addition to the image sensor (camera or the like).
  • An image captured by an image sensor such as a camera will be referred to below as a “captured image”. Furthermore, a captured image including an object as a subject is hereinafter referred to as an “object captured image”. On the other hand, a captured image including a product as a subject is hereinafter referred to as a “product captured image”. Further, in the present specification, when various kinds of image processing are performed on the product captured image and the object captured image, the images are actually handled in the form of data, but in the following, the word “data” is omitted for convenience of description.
  • As the object captured image and the product captured image, for example, a captured image of the entire area of the cashier terminal where objects are placed, an image obtained by cutting out each product individually, an image of the product logo, an image of the product barcode, an image of the product label, an image of the entire product shelf, or an image of the inside of the store taken from the ceiling or the like can be used.
  • In the information processing system of the present invention, when an object is to be specified as a product, various recognition methods are applied to the object captured image to estimate which product the subject of the image is, and a list of products that match the subject of the image, together with their matching degrees (hereinafter also referred to as a “product candidate list”), is generated.
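  • As an illustration only, the following sketch shows one way a product candidate list with matching degrees could be represented and merged from several recognition methods. The names (ProductCandidate, build_candidate_list, recognizers) are assumptions for this sketch and do not come from the patent.

```python
# Hypothetical sketch of a "product candidate list": candidate products paired with
# their matching degrees, sorted so the most likely product comes first.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class ProductCandidate:
    product_code: str
    matching_degree: float  # e.g. 0.0 .. 1.0

def build_candidate_list(
    object_image: bytes,
    recognizers: List[Callable[[bytes], Dict[str, float]]],
) -> List[ProductCandidate]:
    """Apply several recognition methods to one object captured image and merge
    their scores into a single product candidate list."""
    merged: Dict[str, float] = {}
    for recognize in recognizers:
        for code, score in recognize(object_image).items():
            merged[code] = max(merged.get(code, 0.0), score)  # keep best score per product
    return sorted(
        (ProductCandidate(code, score) for code, score in merged.items()),
        key=lambda c: c.matching_degree,
        reverse=True,
    )
```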
  • For an object that cannot be specified as a product, or for a product whose sale is restricted, the object captured image or the product captured image and related information are transmitted to the visual inspection terminal, and the visual inspection terminal is used to identify the product and to determine the trade restriction (that is, to decide whether or not to cancel the trade restriction). The trade restriction on a trade-restricted product is released either by a salesclerk responding on the spot in accordance with the result of that determination, or by the system adopting the result of the visual inspection. The result of the visual inspection by the visual inspector is then referred to as appropriate, and the product is settled.
  • FIG. 2 is a schematic diagram showing an outline of the system flow in the first and second embodiments.
  • In the first and second embodiments, (1) an object captured image is captured and a product is specified by the cashier terminal; (2) when an object cannot be specified as a product, or is determined to be a trade-restricted product, (3) related information regarding the object or product is transmitted to the visual inspection terminal.
  • As the related information, for example, an object captured image, a product captured image, a recognition result log, a product candidate list, or the like can be adopted.
  • a configuration may be adopted in which the object captured images of a plurality of frames are transmitted.
  • (4) The visual inspection request (reception of the related information regarding the object to be inspected) is notified on the visual inspection terminal, and (5) the visual inspector performs the visual inspection.
  • (6) The visual inspection result by the visual inspector (the result of specifying the product or the determination result of the trade restriction) is transmitted to the cashier terminal, and (7) the visual inspection result is referred to by the cashier terminal. At this time, the cashier terminal can also be informed that the product is a trade-restricted product.
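  • As a minimal sketch of steps (3) to (7), the structures below show the kind of related information that might be sent to the visual inspection terminal and the kind of result that might come back; the class and field names are assumptions, not terms from the patent.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class InspectionRequest:                  # step (3): sent to the visual inspection terminal
    object_images: List[bytes]            # one or more frames of the object captured image
    product_candidate_list: List[str]     # candidate product codes, if any
    recognition_log: str = ""
    reason: str = "unspecified"           # "unspecified" or "trade_restricted"

@dataclass
class InspectionResult:                   # step (6): returned by the visual inspector
    product_code: Optional[str]           # product identified by the inspector, if any
    restriction_released: Optional[bool] = None  # trade-restriction decision, if requested

def apply_inspection_result(result: InspectionResult) -> str:
    """Step (7): one way the cashier terminal might refer to the result."""
    if result.product_code is None:
        return "error: product still unspecified; ask the shopper to replace the item"
    if result.restriction_released is False:
        return f"{result.product_code}: trade-restricted, do not settle"
    return f"{result.product_code}: proceed to settlement"
```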
  • FIG. 3 is a schematic diagram showing an outline of the system flow in the third and fourth embodiments.
  • In the third and fourth embodiments, (1) an object captured image is captured and a product is specified in the store (in the sales floor); (2) when an object cannot be specified as a product, or is determined to be a trade-restricted product, (3) related information about the object or product is transmitted to the visual inspection terminal.
  • As the related information, for example, an object captured image, a product captured image, a recognition result log, a product candidate list, position information indicating where the object captured image was captured, information for tracking a product or a shopper in the store, and the like can be adopted.
  • the object captured images of a plurality of frames may be transmitted.
  • (4) The visual inspection request (reception of the related information regarding the object to be inspected) is notified on the visual inspection terminal, and (5) the visual inspector performs the visual inspection.
  • (6) The visual inspection result by the visual inspector (the result of specifying the product or the determination result of the trade restriction) is transmitted to the sales floor device installed in the sales floor, and (7) the visual inspection result is referred to by the sales floor device. At this time, the sales floor device can also be informed that the product is a trade-restricted product.
  • The purchaser may be informed by a screen display on the sales floor device or the like, or by voice guidance. Specific possible cases include (A) requesting the purchaser to pick up the product again (taking it from the shelf again, or putting it back into the basket) when the product is unspecified, and (B) determining whether to cancel the trade restriction.
  • the first to fourth embodiments of the information processing system in which such a system flow is executed can be specifically realized as the following product recognition system.
  • the product recognition system recognizes the presence of an object based on an object captured image including an object placed on a cashier terminal as a subject.
  • the cashier terminal includes one or more cashier cameras as an example of a sensing device.
  • the cashier camera images a predetermined area of the cashier terminal.
  • the cash register camera images a predetermined area before an object is placed.
  • the cash register camera images the predetermined area after the object is placed in the predetermined area. Therefore, in the commodity recognition system of the first embodiment, the cashier terminal compares the captured image before the object is placed in the predetermined area of the cashier terminal with the object captured image after the object is placed in the predetermined area of the cashier terminal.
  • the cashier terminal specifies which product each of the recognized objects is by an object recognition method by image recognition.
  • As the object recognition method, for example, a method of identifying the product with high accuracy by generating product candidates by deep learning and then applying a verification function is adopted.
  • the eye examiner confirms the object placed on the cashier terminal at the eye examination terminal connected to the cashier terminal via the network according to the set conditions.
  • This confirmation result (result of visual inspection) is transmitted to the cashier terminal.
  • the “eye check” means that a person visually confirms an object and makes some conclusion, for example, specifying what kind of product the object is. Hereinafter, such a conclusion will be referred to as a “confirmation result” or an “eye examination result”.
  • The cashier terminal identifies the final product based on the identification result obtained by the object recognition method based on image recognition and on the confirmation result.
  • The final identification of the product may be performed not by the cashier terminal but by another device (not shown) or by a natural person. Then, in the first embodiment, the cashier terminal recognizes the quantity of the specified products, and the specified products are settled.
  • the product recognition system according to the second embodiment is applied to a store such as a bookstore.
  • The product recognition system according to the second embodiment recognizes, as a number of books, the objects placed in the space between shelves installed at the sales floor of a bookstore or on a table such as a wagon (hereinafter referred to as “inside the shelf”, including such tables). When a book is taken, the shopper is tracked until the book is placed on the cashier terminal; when the book is placed on the cashier terminal, the number of books placed is recognized. Then, by identifying each book, the book is recognized as a product and is automatically settled.
  • According to the set conditions, the visual inspector confirms the book placed on the cashier terminal at the visual inspection terminal connected to the cashier terminal via the network (the book is specified as a product to be sold). Therefore, the product recognition system according to the second embodiment may specify the product by further taking this confirmation result (visual inspection result) into consideration. The visual inspector may also visually check the shopper before visually checking the book. As a result, even if tracking by the product recognition system fails, the product can still be identified.
  • the product recognition system of the third embodiment is applied to retail stores such as supermarkets. Specifically, the product recognition system of the third embodiment recognizes baskets (shopping baskets and carts) placed in the sales floors of retail stores such as supermarkets, and tracks baskets moving in the sales floors.
  • In the product recognition system according to the third embodiment, when an object is taken from the shelf, the object is recognized and specified as a product; when the basket is placed on the cash register table, the list of products placed in the basket is read and the products are automatically settled.
  • According to the set conditions, the product taken from the shelf is also confirmed by the visual inspector at the visual inspection terminal connected via the network to the cameras and other devices installed in the sales floor. Therefore, the product recognition system according to the third embodiment may specify the product by further taking this confirmation result (visual inspection result) into consideration.
  • The product recognition system of Embodiment 4 is applied to retail stores such as supermarkets. Specifically, not only the shopping baskets and carts placed in the supermarket but also baskets including the shopper's own bags and shopping bags, as well as the shopper, are recognized and tracked as moving objects. Then, in the product recognition system of the fourth embodiment, the product is recognized and specified when the object is taken from the shelf, and the product can be automatically settled even if it is not placed on a cashier terminal. At this time, in the fourth embodiment, according to the set conditions, the object taken from the shelf is also confirmed by the visual inspector at the visual inspection terminal connected via the network to the cameras and other devices installed in the sales floor. Therefore, the product recognition system according to the fourth embodiment may specify the product by further taking this confirmation result (visual inspection result) into consideration.
  • the information processing system of the first embodiment is a product recognition system having a cashier terminal 2 as shown in FIG. 5 which is adopted in a store such as a convenience store as shown in FIG.
  • the information processing system according to the first embodiment enables automatic payment by placing a product on a cashier terminal.
  • FIG. 4 is a diagram illustrating a layout example when the store that employs the information processing system according to the first embodiment is a convenience store.
  • a cashier counter 12 is installed near the doorway 11 in the store 10.
  • An unmanned cashier terminal 2 is installed on the cashier counter 12 for automatically paying the merchandise.
  • a manned register 13 is installed next to the cashier terminal 2.
  • a plurality of shelf racks 14 for displaying products are installed, and the facing shelf racks 14 serve as passages 15 through which shoppers move.
  • the goods in the shelf are picked up by the shopper who has moved through the aisle 15 and placed in a predetermined area of the cashier terminal 2 (a predetermined area A in FIG. 5 described later).
  • At the cashier terminal 2, triggered by a predetermined operation of the shopper on the cashier terminal 2, a plurality of products are collectively specified and automatically settled.
  • At the manned register 13, on the other hand, the clerk reads the products one by one by barcode and performs the settlement.
  • FIG. 5 is a schematic perspective view showing an example of the external configuration of the cashier terminal 2.
  • the cashier terminal 2 includes a surrounding portion 270 that surrounds a predetermined area A in which an object is placed.
  • the surrounding portion 270 includes a top plate portion 271, a bottom plate portion 272, and a pair of side plate portions 273.
  • a cash register camera 211 that images the predetermined area A is fixed to each of the top plate portion 271 and the pair of side plate portions 273.
  • the cashier camera 211 images an object placed in the predetermined area A. Although only three cash register cameras 211 are depicted in FIG. 5, five cashier cameras may be provided as described later, and it is sufficient that at least one cashier camera is present, and the number of cashier cameras is not limited.
  • the cashier terminal 2 also includes an external camera 212 that images the face, hands, etc. of the shopper.
  • the housing portion 275 is installed on the bottom plate portion 272.
  • a front surface of the housing 275 is provided with a receipt output unit and a display unit (not shown in FIG. 5) (a receipt output unit R and a display unit D of the output unit 206 of FIG. 8 described later).
  • a semitransparent plate 276 on which an object is placed is installed on the housing 275.
  • the plate surface on the upper surface of the plate 276 is set as the predetermined area A.
  • the board surface of the plate 276 is formed in a wave shape.
  • the wave shape is not limited to a sine wave shape, and may be a rectangular wave. Furthermore, the pitch and amplitude may be not only uniform but also uneven.
  • Because the plate 276 forms the predetermined area A with concave and convex portions repeated in this manner, at least a part of a columnar or spherical object is held between convex portions, which prevents the object from rolling.
  • An illumination unit 221 that illuminates a predetermined area A is provided inside the plate 276 and the top plate portion 271 of the surrounding portion 270.
  • the lighting unit 221 may be included in the side plate portion 273 of the surrounding portion 270.
  • The illumination unit 221 can emit not only white light but also light of various colors such as blue and red; the colors are not limited.
  • the illumination unit 221 emits light so that the shadow of an object placed in the predetermined area A is not generated or reduced in the predetermined area A.
  • The surrounding portion 270 includes the presentation unit 210, whose color can be changed so that the state of the cashier terminal 2, for example normal standby, settlement in progress, operation by a store clerk, or occurrence of an abnormal situation, can be visually recognized.
  • At least the top plate portion 271 and the side plate portions 273 of the surrounding portion 270 may be configured with an instantaneous light control sheet so that they can be switched between a transparent state and an opaque state. In the transparent state, the visibility of the predetermined area A can be ensured. By making the surrounding portion 270 opaque, the captured image of the object can be acquired while suppressing the influence of external light during imaging.
  • FIG. 6 is a configuration diagram showing a configuration of a product recognition system as the first embodiment of the information processing system of the present invention.
  • the product recognition system according to the first embodiment includes a server 1, n (n is an arbitrary integer value of 1 or more) cashier terminals 2-1 to 2-n, and a visual inspection terminal Q.
  • the server 1, the n cashier terminals 2-1 to 2-n, and the inspection terminal Q are connected to each other via a network N such as the Internet.
  • the inspection terminal Q can be possessed by a store clerk distant from the cashier terminal 2, provided in a backyard of the store, or provided in a call center remote from the store.
  • the server 1 executes necessary processing so that the respective operations of the plurality of cashier terminals 2 are performed in harmony.
  • the cashier terminal 2 is placed on the cashier counter 12 shown in FIG.
  • In the cashier terminal 2, the quantity of objects placed by the shopper in the predetermined area A is recognized, the products are then specified, and the products are automatically settled.
  • The product recognition system does not necessarily need the server 1; the cashier terminal 2 can also function independently. In that case, some or all of the functions of the server 1 are provided in another information processing device (for example, the cashier terminal 2).
  • FIG. 7 is a block diagram showing the hardware configuration of the server 1 in the information processing system according to the first embodiment shown in FIG.
  • The server 1 includes a CPU (Central Processing Unit) 101, a ROM (Read Only Memory) 102, a RAM (Random Access Memory) 103, a bus 104, an input/output interface 105, an output unit 106, an input unit 107, a storage unit 108, a communication unit 109, and a drive 110.
  • the CPU 101 executes various processes according to a program stored in the ROM 102 or a program loaded from the storage unit 108 into the RAM 103.
  • the RAM 103 also stores data necessary for the CPU 101 to execute various processes.
  • the CPU 101, the ROM 102, and the RAM 103 are connected to each other via a bus 104.
  • An input / output interface 105 is also connected to the bus 104.
  • An output unit 106, an input unit 107, a storage unit 108, a communication unit 109, and a drive 110 are connected to the input / output interface 105.
  • the output unit 106 includes a display, a speaker, and the like, and outputs various kinds of information as an image or sound.
  • the input unit 107 includes a touch panel, a keyboard, a mouse, a microphone, etc., and inputs various information.
  • the storage unit 108 includes a hard disk, a DRAM (Dynamic Random Access Memory), and the like, and stores various data. As illustrated in FIG. 6, the communication unit 109 communicates with the cashier terminal 2 via the network N including the Internet.
  • a removable medium 120 including a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is appropriately mounted on the drive 110.
  • the program read from the removable medium 120 by the drive 110 is installed in the storage unit 108 as needed.
  • the removable medium 120 can also store various data stored in the storage unit 108, similarly to the storage unit 108.
  • FIG. 8 is a block diagram showing a hardware configuration of the cashier terminal 2 in the information processing system according to the first embodiment shown in FIG.
  • The cashier terminal 2 includes a CPU 201, a ROM 202, a RAM 203, a bus 204, an input/output interface 205, an output unit 206, an input unit 207, an illumination unit 221, a light shielding unit 209, a presentation unit 210, a cash register camera 211, a storage unit 208, a communication unit 213, and a drive 214.
  • the removable medium 220 is appropriately attached to the drive 214.
  • the CPU 201, the ROM 202, the RAM 203, the bus 204, the input / output interface 205, the storage unit 208, the communication unit 213, the drive 214, and the removable medium 220 of the cashier terminal 2 are configured similarly to those of the server 1.
  • the output unit 206 is provided in the housing unit 275 shown in FIG.
  • the output unit 206 includes a display unit D that displays information about products and information about settlement, a receipt output unit R that outputs a receipt, and a speaker S that outputs a voice.
  • the input unit 207 is provided in the housing unit 275 shown in FIG.
  • the input unit 207 includes a touch panel (not shown), a card reader unit C, and a microphone M. Further, a bar code reader (not shown) may be provided.
  • the light shielding unit 209 switches the surrounding unit 270 shown in FIG. 5 between a transparent state and an opaque state when the surrounding unit 270 is formed of an instant light control sheet.
  • The presentation unit 210 shown in FIG. 5 is switched so as to emit light in a different color depending on the state of the cashier terminal 2, such as normal standby, settlement in progress, operation by a store clerk, or an abnormal situation, so that the state can be recognized.
  • the presentation unit 210 is provided not only on the front surface but also on the back surface.
  • the cash register camera 211 images an object placed in the predetermined area A, and outputs one or more captured images obtained as a result as an object captured image.
  • Since the hardware configuration of the visual inspection terminal Q is the same as the hardware configuration of the server 1, refer to FIG. 7; its description is omitted here.
  • FIG. 9 is a functional block diagram showing an example of the functional configuration of the server 1, the cashier terminal 2, and the eye examination terminal Q.
  • In the CPU 101 of the server 1, the DB management unit 141 that manages personal information, product information, and the like functions.
  • the DB management unit 141 may be provided in each cashier terminal 2.
  • a product DB 131 is provided in an area of the storage unit 108 of the server 1.
  • the product DB 131 is a DB (Data Base) that stores product information.
  • In the CPU 201 of the cashier terminal 2, the light emission control unit 228, the light-shielding control unit 229, the presentation control unit 230, the personal authentication unit 231, the image acquisition unit 232, the object recognition unit 233, the object quantity recognition unit 234, the product identification unit 235, the trade-restricted product determination unit 236, the settlement unit 237, the display control unit 238, and the visual inspection result acquisition unit 239 function.
  • the cashier terminal 2 includes a DB information holding unit 241 that holds personal information, product information, and the like.
  • In the CPU 101 of the visual inspection terminal Q, the image display control unit 411 and the visual inspection result transmission unit 412 function.
  • The light emission control unit 228 of the CPU 201 of the cashier terminal 2 performs control for switching the illumination unit 221 between a state in which it emits light at the timing of capturing an image of an object and a state in which it does not emit light when no image is being captured, and also performs control for switching the emission color of the illumination unit 221 according to the status of recognition of the object placed in the predetermined area A.
  • the light-shielding control unit 229 switches the surrounding unit 270 between an opaque state and a transparent state when the surrounding unit 270 is formed of an instantaneous light control sheet.
  • the light-shielding control unit 229 controls the light-shielding unit 209 provided in the surrounding unit 270 from one of the opaque state at the timing of capturing an image of an object placed in the predetermined area A and the transparent state at the timing of not capturing the image. Execute control to switch to.
  • the presentation control unit 230 executes control so that the presentation unit 210 changes the emission color for presenting the state of the cashier terminal 2.
  • The personal authentication unit 231 performs personal authentication of the shopper by referring to the personal information managed by the DB management unit 141 during the settlement process. Specifically, before image recognition of the products, the personal authentication unit 231 performs authentication processing using authentication methods such as face authentication, card authentication, and various kinds of biometric authentication such as fingerprint authentication, vein authentication, and iris authentication.
  • the personal information managed by the DB management unit 141 includes information such as age, allergy, and halal. Therefore, the personal information acquired by the personal authentication unit 231 is utilized by the trading restricted product determination unit 236.
  • the image acquisition unit 232 acquires the object captured image captured by the cashier camera 211 with the object placed in the predetermined area A as shown in FIG. 10.
  • FIG. 10 shows an example of an image pickup screen of an object placed on the cashier terminal 2.
  • FIG. 10 is a diagram showing an example of an object imaged image obtained as a result of the objects X, Y, Z, and Z ′ placed in the predetermined area A being imaged by each of the three cash register cameras 211.
  • FIG. 10A illustrates an example of an object imaged image obtained as a result of imaging the objects X, Y, Z, and Z ′ by the cashier camera 211 fixed to the top plate portion 271 of the cashier terminal 2.
  • This object captured image contains six objects.
  • FIG. 10B shows an example of an object imaged image obtained as a result of imaging the objects X, Y, Z, and Z ′ by the cashier camera 211 fixed to one side plate portion 273 of the cashier terminal 2.
  • This object captured image includes two objects. In this object captured image, the objects X, Y, Z, and Z ′ on the cash register camera 211 side hide the objects on the back side.
  • FIG. 10C shows an example of a captured image obtained as a result of imaging the objects X, Y, Z, and Z′ by the cashier camera 211 fixed to the other side plate portion 273 of the cashier terminal 2.
  • This object captured image includes two objects.
  • the objects X, Y, Z, and Z ′ on the cash register camera 211 side hide the objects on the back side.
  • Of the two objects Z included in the respective object captured images of FIGS. 10B and 10C, one is the same object as Z′, but the other is a different object.
  • The object recognition unit 233 recognizes the presence of the objects placed in the predetermined area A from the object captured image acquired by the image acquisition unit 232, using the above-described predetermined image recognition method. That is, the object recognition unit 233 compares the background image taken before the objects were placed in the predetermined area A of the cashier terminal with the object captured image taken after the objects were placed, and recognizes the presence of each object by defining (specifying) an object region for each object through background subtraction processing. Alternatively, using a method other than background subtraction processing, the object recognition unit 233 may recognize the presence of an object by defining the object region from the object captured image alone, without comparing it with the background image taken before the object was placed.
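  • The snippet below is a minimal sketch of the background-difference step described above, using OpenCV as an assumed implementation (the patent does not name a library, and find_object_regions is a hypothetical name). The background image of the empty predetermined area A is compared with the object captured image, and each sufficiently large difference region is treated as one object region.

```python
import cv2
import numpy as np

def find_object_regions(background: np.ndarray, captured: np.ndarray,
                        min_area: int = 500) -> list:
    """Return bounding boxes (x, y, w, h) of regions that differ from the background."""
    diff = cv2.absdiff(background, captured)                  # pixel-wise difference
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 30, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))  # remove noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # each remaining contour's bounding box is taken as one object region
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
```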
  • the object quantity recognition unit 234 recognizes the quantity of objects placed in the predetermined area A by comparing the object recognition quantity with the settlement quantity.
  • the object recognition quantity is the quantity of objects recognized by the object recognition unit 233 from the object captured images captured by the plurality of cash register cameras 211 of the cashier terminal 2.
  • Settlement quantity is the quantity of the product to be settled.
  • The object recognition quantity may vary depending on the cashier camera 211, as shown in FIG. 10. That is, in FIG. 10A, six objects are imaged, but in FIGS. 10B and 10C, two objects are imaged. In such a case, the object quantity recognition unit 234 recognizes the quantity of objects by taking the logical sum, as shown in FIG. 11.
  • FIG. 11 is a diagram showing an example of a truth table for calculating the number of objects placed on the cashier terminal 2.
  • FIG. 11 shows whether each of the objects X, Y, Z, and Z′ shown in FIG. 10 was imaged by each of the first to fifth cash register cameras 211 provided in the cashier terminal 2 (FIG. 5); “1” indicates that the object was imaged, and “0” indicates that it could not be imaged.
  • the object Z and the object Z ′ are the same object, but since the cashier cameras 211 that have imaged them are different, they represent that they are imaged in different states.
  • The first cash register camera 211 is represented by “cash register camera 1”, and similarly, the second to fifth cash register cameras 211 are represented by “cash register camera 2” to “cash register camera 5”.
  • the object X is imaged by the first, fourth, and fifth cashier cameras 211.
  • the object Y is imaged by the second, fourth and fifth cashier cameras 211.
  • the object Z is imaged by the second, third and fifth cashier cameras 211.
  • the object Z ′ is imaged by the first and third cashier cameras 211.
  • In this way, the object quantity recognition unit 234 recognizes the quantity of products by the method using the logical sum. That is, even if objects are placed in the predetermined area A so that they overlap, or one object is placed so as to be hidden in the shadow of another object, any object imaged by at least one of the cash register cameras 211 is recognized, by taking the logical sum, as being placed in the predetermined area A.
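  • The following sketch reproduces the logical-sum (OR) rule of FIG. 11 under the assumption that each cash register camera reports the set of object labels it could image; the function name and data layout are illustrative, not taken from the patent.

```python
from typing import Dict, Set

def count_objects(per_camera_detections: Dict[str, Set[str]]) -> int:
    """An object counts as present if at least one camera imaged it (logical sum)."""
    present: Set[str] = set()
    for detected in per_camera_detections.values():
        present |= detected
    return len(present)

# Example consistent with the truth table of FIG. 11 (objects X, Y, Z, Z'):
detections = {
    "camera1": {"X", "Z'"},
    "camera2": {"Y", "Z"},
    "camera3": {"Z", "Z'"},
    "camera4": {"X", "Y"},
    "camera5": {"X", "Y", "Z"},
}
assert count_objects(detections) == 4   # X, Y, Z and Z' are all recognized
```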
  • When the object recognition quantity and the settlement quantity differ, the object quantity recognition unit 234 outputs to the display control unit 238 the fact that the quantities differ.
  • the product identification unit 235 matches the object whose existence is recognized by the object recognition unit 233 with the product information held in the DB information holding unit 241. That is, the product identification unit 235 first lists up product candidates by image recognition methods such as specific object recognition, general object recognition, and deep learning. The listed product candidates are called “product candidate list S”. After that, the product specifying unit 235 causes the verification function to be performed and specifies the product with high accuracy.
  • the verification function is a function of listing the "commodity candidate list P" by an algorithm different from the method of listing the product candidates described above.
  • the results of the product candidate lists S and P are matched with each other, and a product is specified when the result exceeds a predetermined threshold.
  • The method of listing a “product candidate list” may be realized, for example, by matching image information obtained from the object whose existence is recognized against the image information held in the DB information holding unit 241 or in memory.
  • When the object whose existence is recognized by the object recognition unit 233 is a product registered in the DB information holding unit 241, the product identification unit 235 specifies it as that product registered in the DB information holding unit 241.
  • The product identification unit 235 compares the product images stored in the DB information holding unit 241 with the object captured image captured in the predetermined area A, and calculates the feature points that are similar between the images (similar feature points) and their feature amounts.
  • the product identification unit 235 reads the feature points and the feature amounts of the images of the products included in the product candidate list from the product DB 131 of the server 1.
  • The product identification unit 235 compares the feature amount of each feature point of the products included in the read product candidate list with the feature amount of each feature point of the recognized object, and matches the similar feature points between the product images stored in the DB information holding unit 241 and the object captured image captured in the predetermined area A.
  • Furthermore, the product identification unit 235 compares the positional relationships using the corresponding point coordinates of each pair of similar feature points, removes points that do not correspond correctly under changes due to rotation and translation (points whose positional relationship is inconsistent), and calculates the number of remaining similar feature points. The product identification unit 235 determines that the product is unspecified when the number of similar feature points is less than the threshold value.
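  • A hedged sketch of this verification step is shown below: similar feature points between a registered product image and the object captured image are matched, points whose positional relationship is inconsistent under rotation and translation are discarded (here via RANSAC, which is one possible choice; the patent does not name an algorithm), and the product is treated as unspecified when too few similar feature points remain. OpenCV and the ORB detector are assumptions for this sketch.

```python
import cv2
import numpy as np

def count_consistent_matches(product_img, object_img):
    """Count similar feature points that survive a geometric consistency check."""
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(product_img, None)
    kp2, des2 = orb.detectAndCompute(object_img, None)
    if des1 is None or des2 is None:
        return 0
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    if len(matches) < 4:                        # too few points to test the geometry
        return len(matches)
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    _, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)   # drop inconsistent points
    return 0 if mask is None else int(mask.sum())

def is_specified(product_img, object_img, threshold=20):
    """The product is treated as unspecified below the threshold (value is illustrative)."""
    return count_consistent_matches(product_img, object_img) >= threshold
```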
  • products such as lunch boxes that are not packaged products may be unspecified products.
  • the bento may have a slightly different side dish position, and in such a case, it may be an unspecified product.
  • The product identification unit 235 may also detect various codes, such as barcodes and other multi-dimensional codes printed on a label attached to the product, and characters such as the product name, and may specify the product by reading the product name or product code against the product information stored in the DB information holding unit 241 or in the product DB 131 of the server 1.
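  • As a small illustration of this code-reading path, the sketch below decodes any barcode visible on the label image and looks the value up in the registered product information. The pyzbar library and the product_db dictionary are assumptions made for the example.

```python
from typing import Optional
from pyzbar.pyzbar import decode   # pyzbar accepts PIL images or numpy arrays

def specify_by_barcode(label_image, product_db: dict) -> Optional[str]:
    for symbol in decode(label_image):            # each barcode / 2D code found on the label
        code = symbol.data.decode("utf-8")
        if code in product_db:                    # match against registered product information
            return product_db[code]               # e.g. the product name or product code
    return None                                   # not readable: fall back to other methods
```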
  • the product identification unit 235 verifies similar products and related products (hereinafter referred to as “group products”) using the product information stored in the DB information storage unit 241 and the product DB 131 of the server 1.
  • For group products of a series that differ in size, color, and the like, the product identification unit 235 compares characteristics such as the size and color of the products against thresholds, and determines that the product is unspecified when the result is below a predetermined threshold.
  • In the product recognition system, a mode in which the visual inspection terminal Q performs visual inspection only for specific objects (hereinafter referred to as the “specific-object visual inspection mode”) and a mode in which all objects are inspected at the visual inspection terminal Q (hereinafter referred to as the “all-object visual inspection mode”) can be set (the same applies to each of the following embodiments).
  • The product identification unit 235 transmits the object captured image of an object that could not be specified as a product, together with the related information, to the visual inspection terminal Q via the visual inspection result acquisition unit 239.
  • The product identification unit 235 may also transmit the product identification results and product captured images of identified products, together with the object captured images and related information of objects not identified as products, to the visual inspection terminal Q via the visual inspection result acquisition unit 239. The product identification unit 235 then specifies the product based on the visual inspection result acquired from the visual inspection terminal Q by the visual inspection result acquisition unit 239. Specifically, for a product identified by the product identification unit 235, when the identification result is approved or corrected by the visual inspection result, the product identification unit 235 specifies the product according to the content indicated by the visual inspection result. For an object that the product identification unit 235 determined to be an unspecified product, the product identification unit 235 specifies the object as the product indicated by the visual inspection result.
  • the trade-restricted product determination unit 236 determines whether or not the product identified by the product identification unit 235 corresponds to the trade-restricted product, based on the determination information.
  • The trade-restricted products include (A) products such as cigarettes and alcoholic beverages that cannot be purchased until a certain age is reached, (B) products whose best-before date or expiration date has passed, (C) products that should not be ingested because of allergic ingredients, and (D) products restricted by religion, such as products other than halal foods.
  • the trade-restricted product determination unit 236 transmits the product imaged image of the product corresponding to the trade-restricted product to the eye inspection terminal Q via the eye inspection result acquisition unit 239.
  • In addition, according to the type of trade-restricted product ((A) to (D) above, etc.), the trade-restricted product determination unit 236 appropriately transmits an image relating to the shopper (a face image, an image of an identification card, etc.) to the visual inspection terminal Q via the visual inspection result acquisition unit 239.
  • When the image relating to the shopper is unclear, the purchaser may be informed by a screen display or voice guidance at the cashier terminal 2 and asked to have the image retaken.
  • The purchaser and the visual inspector may also talk via the microphone M and the speaker S provided in the cashier terminal 2. The trade-restricted product determination unit 236 then determines, based on the visual inspection result acquired from the visual inspection terminal Q by the visual inspection result acquisition unit 239, whether or not the sale of the trade-restricted product is permitted (whether or not the trade restriction is released).
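  • Purely as an illustration of categories (A) to (D), the sketch below combines product attributes with the shopper's personal information (age, allergies, religious restrictions) of the kind managed by the DB management unit 141. The field and function names are assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class Product:
    name: str
    minimum_age: int = 0                     # (A) age-restricted items such as alcohol
    expiration_date: Optional[date] = None   # (B) best-before / expiration date
    allergens: List[str] = field(default_factory=list)   # (C) allergic ingredients
    halal: bool = True                       # (D) religious restriction

@dataclass
class Shopper:
    age: Optional[int] = None
    allergies: List[str] = field(default_factory=list)
    requires_halal: bool = False

def restriction_reasons(product: Product, shopper: Shopper, today: date) -> List[str]:
    """Empty list: no restriction; otherwise the sale is held for visual inspection."""
    reasons = []
    if product.minimum_age and (shopper.age is None or shopper.age < product.minimum_age):
        reasons.append("(A) age restriction not cleared")
    if product.expiration_date and product.expiration_date < today:
        reasons.append("(B) past the expiration date")
    if set(product.allergens) & set(shopper.allergies):
        reasons.append("(C) contains an allergen for this shopper")
    if shopper.requires_halal and not product.halal:
        reasons.append("(D) not a halal product")
    return reasons
```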
  • the settlement unit 237 calculates the total amount of money for all the products identified by the product identification unit 235. At this time, when the trading restricted products determination unit 236 determines that there are trading restricted products, it is necessary that the restrictions are released for all of the trading restricted products. Therefore, the settlement section 237 reads out the price of the product placed in the predetermined area A from the DB information holding section 241, and displays it on the display section D (FIG. 8).
  • When the object quantity recognition unit 234 finds that the object recognition quantity and the settlement quantity do not match, the display control unit 238 controls the output unit 206 to warn the shopper or the clerk to confirm the quantity of the products and to output an instruction such as a request to reposition the products. Specifically, the purchaser may be notified by a screen display or voice guidance and asked to reposition the products. Further, the purchaser and the visual inspector may talk via the microphone M and the speaker S provided in the cashier terminal 2.
  • When an object is determined to be a trade-restricted product by the trade-restricted product determination unit 236, the display control unit 238 controls the output unit 206 to warn the shopper or the clerk that the object is a trade-restricted product. Further, when the total amount is calculated by the settlement unit 237, the display control unit 238 controls the output unit 206 to display the product names, prices, and the like of the products to the shopper and the clerk.
  • The visual inspection result acquisition unit 239 requests identification of a product by visual inspection by transmitting the object captured image, or the product identification result and the product captured image of an identified product, to the visual inspection terminal Q, and acquires the visual inspection result transmitted from the visual inspection terminal Q in response to the request.
  • the visual inspection result acquisition unit 239 outputs the acquired visual inspection result (commercial product identification result) to the commercial product identification unit 235.
  • In addition, the visual inspection result acquisition unit 239 requests determination of a trade-restricted product by visual inspection by transmitting the product captured image of the product corresponding to the trade-restricted product to the visual inspection terminal Q, and acquires the visual inspection result transmitted from the visual inspection terminal Q.
  • the visual inspection result acquisition unit 239 appropriately transmits an image (a face image, an image of an identification card, etc.) regarding the shopper to the visual inspection terminal Q according to the type of the restricted sale product.
  • the eye inspection result acquisition unit 239 outputs the acquired eye inspection result (determination result of the sale-restricted product) to the sale-restricted product determination unit 236.
  • When the trade-restricted product determination unit 236 determines that an object is a trade-restricted product and the trade restriction is not released even by the visual inspection result, a warning to that effect is presented by the presentation unit 210 and the output unit 206.
  • The clerk who notices this presentation collects the trade-restricted product so that it is not sold.
  • the product recognition system may be provided with a remote operation unit for instructing cancellation of the trading restriction for the trading restricted product, in addition to the inspection terminal Q.
  • the remote control unit can be possessed by a clerk distant from the cashier terminal 2 or provided in the backyard of the store.
  • a single remote control unit can also remotely control a plurality of cashier terminals 2.
  • When the cashier terminal 2 requests a visual inspection, the image display control unit 411 outputs to the output unit 106 the images transmitted from the cashier terminal 2, such as the object captured image or the product captured image, together with the various information (product candidate list, etc.) transmitted in association with these images.
  • The visual inspection result transmission unit 412 transmits the visual inspection result (the product identification result, the determination result for a trade-restricted product, etc.) that the visual inspector has input through the input unit 107 for the object captured image or product captured image output to the output unit 106, to the cashier terminal 2 that requested the visual inspection.
  • The product recognition system applied as the information processing system of the present specification includes a product registration system for imaging the appearance of a product sold at the store 10 and registering it together with product information such as the price of the product.
  • the product registration may be performed in the store 10 or may be performed outside the store such as the manufacturer or wholesaler of the product, and the location is not limited.
  • the product registration system includes a registration image generation unit (not shown), a product master registration unit, a captured image acquisition unit, and a recognition unit.
  • the registration image generation unit generates an image of a product that can be placed in the predetermined area A of the cashier terminal 2 as a product registration image.
  • the product master registration unit associates and registers the product registration image generated by the registration image generation unit and the product identifier uniquely assigned to the product included in the product registration image as a subject.
  • the captured image acquisition unit acquires a captured image of an object placed in a predetermined area A of the cashier terminal 2 as an object captured image.
  • the product specifying unit 235 specifies which product is the object whose existence is recognized.
  • the object recognition unit 233 recognizes the presence of an object placed in the predetermined area A based on the acquired object captured image.
  • This product recognition system matches the object captured image of the object whose existence is recognized with the product images held in the DB information holding unit 241 or in the storage unit 108 of the server 1, as described above, and the product identification unit 235 thereby identifies which product the object is.
  • In this way, a product registration image is generated and a product identifier is given to the product included in the product registration image, so that master registration by the product identifier uniquely assigned to the product becomes possible. Furthermore, a product recognition system provided with this product registration system can also manage products to which a barcode sticker cannot be attached.
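  • The sketch below illustrates the idea of master registration: a product registration image is stored in association with a uniquely assigned product identifier, so that even items without a barcode sticker can be managed. The class and method names are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict
import uuid

@dataclass
class ProductMaster:
    records: Dict[str, dict] = field(default_factory=dict)

    def register(self, registration_image: bytes, name: str, price: int) -> str:
        product_id = uuid.uuid4().hex          # identifier uniquely assigned to the product
        self.records[product_id] = {
            "image": registration_image,       # product registration image
            "name": name,
            "price": price,                    # product information registered together
        }
        return product_id

    def lookup(self, product_id: str) -> dict:
        return self.records[product_id]

# Usage: register once; the recognition side can then match captured images against
# the stored registration images and settle by product_id.
master = ProductMaster()
pid = master.register(b"<image bytes>", name="rice ball", price=120)
```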
  • FIG. 12 is a flowchart illustrating a product settlement process executed by the server 1, the cashier terminal 2, and the eye examination terminal Q in FIG. 9.
  • an image of the predetermined area A captured in advance in a state where no object is placed in the predetermined area A of the cashier terminal 2 is stored in the image acquisition unit 232.
  • the image of the predetermined area A captured in advance is updated at a predetermined timing, is not updated every time the shopper uses the cashier terminal 2, and may be shared even if the shopper changes. Then, when the shopper presses the operation button of the cashier terminal, the automatic settlement process is started.
  • step S101 the cashier camera 211 of the cashier terminal 2 captures an image after an object is placed on the predetermined area A of the cashier terminal 2.
  • the image is input to the image acquisition unit 232 as a captured image of an object as illustrated in FIG. 10, for example.
  • At this time, the inside of the cashier terminal 2 is illuminated by the illumination unit 221, so that shadows of the object are eliminated or reduced.
  • the object captured image may be input to the display control unit 238 and output from the output unit 206.
  • the external camera 212 of the cashier terminal 2 images the shopper, and the personal authentication unit 231 personally authenticates the shopper.
  • In step S102, the object recognition unit 233 of the cashier terminal 2 recognizes the presence of objects placed in the predetermined area A from the object captured image acquired by the image acquisition unit 232, using the above-described predetermined image recognition method. That is, the object recognition unit 233 compares the background image captured before any object is placed in the predetermined area A of the cashier terminal 2 with the object captured image captured after the objects are placed, performs background subtraction processing, and recognizes the presence of each object by defining (specifying) an object region for each object.
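  • A minimal sketch of this background subtraction step, assuming OpenCV (cv2) images: the stored image of the empty predetermined area A is compared with the object captured image, and each sufficiently large difference region is treated as one object region. The threshold and kernel values are illustrative.

```python
# Background subtraction sketch (OpenCV 4.x); parameter values are assumptions.
import cv2

def define_object_regions(background_bgr, captured_bgr, min_area=500):
    bg = cv2.cvtColor(background_bgr, cv2.COLOR_BGR2GRAY)
    cur = cv2.cvtColor(captured_bgr, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(bg, cur)                              # background subtraction
    _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)    # suppress small noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # one bounding box per recognized object region
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
```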
  • step S103 the product specifying unit 235 of the cashier terminal 2 determines whether or not there is an object whose product cannot be specified among the objects placed in the predetermined area A of the cashier terminal 2.
  • step S104 the product specification unit 235 of the cashier terminal 2 determines whether the specific object inspection mode or the all object inspection mode is set. If the specific object visual inspection mode is set, the process proceeds to step S106. If the all-object visual inspection mode is set, the process proceeds to step S105.
  • In step S105, the eye inspection result acquisition unit 239 of the cashier terminal 2 requests the inspection terminal Q to perform a visual inspection (identification of the product) for the target object or product, and acquires the visual inspection result.
  • When step S105 is completed, the process proceeds to step S107.
  • step S105 when there is an object whose product is not specified even by the visual inspection at the visual inspection terminal Q, the presentation unit 210 of the cashier terminal 2 notifies the error state by color, sound, or the like.
  • the object may be replaced and a predetermined operation may be performed on the cashier terminal 2 to return to step S101 and image the object again.
  • the replacement may be performed by a store clerk who has found the error state or may be performed by the shopper himself.
  • the purchaser and the eye examiner may make a call via the microphone M and the speaker S provided in the cashier terminal 2.
  • The error state occurs, for example, when the system processing of the cashier terminal 2 becomes abnormal, when a characteristic part of an object cannot be imaged, when objects overlap or fall into shadow, when a trade-restricted product is placed in the cashier terminal 2, or when a product or personal belonging is left behind in the cashier terminal 2.
  • In such a case, the objects can be repositioned so that which product each object is can be captured and the product can be specified.
  • If there is an object that the product specifying unit 235 cannot identify as a product (YES in step S103), the process proceeds to step S105.
  • If there is no object that the product specifying unit 235 cannot identify (NO in step S103), then in step S106 the product specifying unit 235 identifies each product using the product information held in the DB information holding unit 241 or the storage unit 108 of the server 1, including information such as the product name, price, and whether it is a trade-restricted product. As a result, the process proceeds to step S107.
  • the identified product information may be output to the display control unit 238.
  • step S107 the trade-restricted product determination unit 236 executes the process of the trade-restricted product.
  • the processing of the trade-restricted product is a process of determining whether the specified product is a trade-restricted product.
  • The trade-restricted product determination unit 236 determines, for example, whether the specified product is an age-restricted product, a non-halal product or a product containing an allergenic ingredient, or a product whose expiration date or best-before date has passed. Details of the trade-restricted product processing will be described later with reference to FIG. 13. Note that the trade-restricted product determination unit 236 can execute the trade-restricted product processing for each shopper by acquiring personal authentication in advance at any timing up to step S105 (including before step S101).
  • In step S108, the settlement unit 237 settles the products placed in the predetermined area A. Specifically, the settlement unit 237 obtains the price of each product specified in step S103 and totals them to settle all the products placed in the predetermined area A.
  • Product information such as the product name and price of the settled products is output from the display control unit 238 to the output unit 206, displayed on the display unit D of the output unit 206, and printed on a receipt output from the receipt output unit R.
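  • A hedged sketch of this totalling step: the price of each individually specified product is looked up and summed, and simple receipt lines are produced. The `product_db` mapping stands in for the DB information holding unit 241; the function and field names are illustrative assumptions.

```python
# Settlement total sketch; names and data layout are assumptions.
def settle(identified_product_ids, product_db):
    """identified_product_ids: product identifiers from the product specifying unit.
    product_db: mapping product_id -> {"name": str, "price": int}."""
    lines, total = [], 0
    for pid in identified_product_ids:
        info = product_db[pid]
        lines.append(f'{info["name"]}: {info["price"]}')
        total += info["price"]
    lines.append(f"TOTAL: {total}")
    return total, lines

# Example usage with made-up products
product_db = {"p1": {"name": "Tea", "price": 150}, "p2": {"name": "Bread", "price": 220}}
total, receipt = settle(["p1", "p2"], product_db)   # total == 370
```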
  • Since the cashier terminal 2 is connected to the server 1 via the communication unit 213, it can be utilized as a POS (Point Of Sale) terminal. That is, the cashier terminal 2 links the purchased-item information of the settlement and the estimated age/sex information to the POS system.
  • In step S108, when the quantity of objects recognized by the object quantity recognition unit 234 differs from the quantity of products specified by the product specification unit 235 and settled, the presentation unit 210 of the cashier terminal 2 may present that fact.
  • the cashier terminal 2 has a cancel function, so that the checkout process can be stopped.
  • FIG. 13 is a flowchart for explaining the processing of the trade-restricted product executed by the server 1, the cashier terminal 2, and the inspection terminal Q of FIG.
  • The sale restriction of a sale-restricted product is released either by a store clerk on the spot in response to the determination result, or by adopting the result of the visual inspection as the system's decision.
  • In FIG. 13, the case where the result of the visual inspection is adopted by the system will be described.
  • step S111 the trade-restricted product determination unit 236 determines whether or not the product identified by the product identification unit 235 is an alcoholic beverage or other product that requires age confirmation.
  • step S111 If it is determined in step S111 that the product identified by the product identification unit 235 is a product that requires age confirmation, that is, if YES, the process proceeds to step S112.
  • step S112 the cashier terminal 2 uses the external camera 212 to acquire an image of the purchaser's face and age confirmation document. However, if the personal information of the shopper is acquired and it is not necessary to confirm the age here, step S112 is skipped and the process proceeds to step S116.
  • step S113 the trading restricted product determination unit 236 of the cashier terminal 2 transmits the acquired image to the eye inspection terminal Q via the eye inspection result acquisition unit 239 to request the eye inspection. The eye examiner who receives this image confirms the age of the purchaser based on the image.
  • the trade-restricted product determination unit 236 determines whether or not an instruction to release the trade restriction has been received from the inspection terminal Q via the inspection result acquisition unit 239.
  • step S114 When it is determined in step S114 that the instruction to cancel the trading restriction has not been received, the process proceeds to step S115. If it is determined in step S114 that the cancel instruction for canceling the trading restriction has been accepted, that is, if YES, the process proceeds to step S116.
  • step S115 the visual inspection result transmission unit 412 of the visual inspection terminal Q transmits a warning that the trading restriction has not been released even by the visual inspection result. By receiving this warning, for example, the display control unit 238 of the cashier terminal 2 presents, via the output unit 206, a warning that the trading restriction has not been canceled even by the result of the visual inspection.
  • step S115 the trading restriction process ends.
  • step S116 the trade restriction product determination unit 236 cancels the trade restriction. In this way, when step S116 is completed, or when it is determined in step S111 that the product is not age-restricted (NO is determined), the process proceeds to step S117.
  • In step S117, the trade-restricted product determination unit 236 determines whether the product identified by the product identification unit 235 is a product other than a halal (permitted) food product, or an allergen-containing product.
  • If the trade-restricted product determination unit 236 determines that the product is a product other than a halal product (a product that is not permitted) or an allergen-containing product, the process proceeds to step S118.
  • step S118 the display control unit 238 causes the cashier display unit D to display that the product is a product other than a halal product or an allergic product.
  • If not, step S118 is skipped and the process proceeds to step S121.
  • step S119 the trading restriction product determination unit 236 determines whether or not an instruction to cancel the trading restriction has been received. If it is determined that the instruction to release the trading restriction is not received (NO is determined), the process proceeds to step S115. If it is determined that the instruction to cancel the trading restriction has been received (YES is determined), the process proceeds to step S120.
  • step S120 the trade restriction product determination unit 236 cancels the trade restriction. In this way, when step S120 is completed, or when the product is not a product other than a halal product or is not an allergic product (when it is determined to be NO) in step S117, the process proceeds to step S121.
  • step S121 the trade-restricted product determination unit 236 determines whether the product identified by the product identification unit 235 is a product whose expiration date has expired.
  • If a product whose expiration date has expired is included, the determination in step S121 is YES, and the process proceeds to step S122.
  • step S122 the display control unit 238 causes the display unit D of the cashier terminal 2 to display that there is a possibility that the expired product is included. If it is determined in step S123 that the instruction to cancel the trading restriction has not been received, NO is determined and the process proceeds to step S115.
  • step S123 If it is determined in step S123 that the instruction to cancel the trading restriction has been accepted, YES is determined and the process proceeds to step S124.
  • In step S124, the trade-restricted product determination unit 236 cancels the trade restriction.
  • When the trade-restricted product determination unit 236 determines that no product whose expiration date has expired is included, the determination in step S121 is NO, and the process ends.
  • With this, the trade restriction processing ends.
  • The process then proceeds to the settlement at the cashier terminal in step S106.
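  • The checks of FIG. 13 can be summarized by the following sketch, which tests age restriction, halal/allergen conditions, and the expiration date, and allows the restriction to be released by the inspector's decision. The field names and data layout are assumptions for illustration only, not the implementation of this specification.

```python
# Simplified trade-restriction checks; field names are illustrative assumptions.
from datetime import date

def check_trade_restrictions(product, shopper, today=None):
    """Return a list of restriction reasons; an empty list means the sale may proceed."""
    today = today or date.today()
    reasons = []
    if product.get("age_limit") and (shopper.get("age") is None or shopper["age"] < product["age_limit"]):
        reasons.append("age confirmation required")
    if shopper.get("halal_only") and not product.get("halal", False):
        reasons.append("non-halal product")
    if set(product.get("allergens", [])) & set(shopper.get("allergies", [])):
        reasons.append("contains allergen")
    if product.get("expiry") and product["expiry"] < today:
        reasons.append("expired product")
    return reasons

def settle_or_hold(product, shopper, inspector_released):
    reasons = check_trade_restrictions(product, shopper)
    if not reasons or inspector_released:   # release instruction from the inspection terminal
        return "settle"
    return "hold: " + ", ".join(reasons)
```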
  • the information processing system can recognize the product placed on the cashier terminal and automatically make a payment.
  • For the objects and products subject to visual inspection (for example, objects for which the product cannot be specified at the cashier terminal 2, products corresponding to sale-restricted products, and the like), a visual inspection is performed, and automatic settlement is carried out according to the result of the visual inspection (the product identification result, the trade-restricted product determination result, etc.). Therefore, according to the present information processing system, when a shopper purchases a product, it is possible to automate the settlement of the price of the product and to improve the accuracy of product identification.
  • the first embodiment is not limited to the above-described embodiments, and modifications, improvements, etc. within a range in which the object of the present invention can be achieved are included in the present invention.
  • the appearance configuration of the cashier terminal 2 shown in FIG. 5 is an example, and the appearance is not limited to this appearance.
  • the cashier terminal 2 only needs to include at least the predetermined area A, an image capturing unit such as the cashier camera 211, and the output unit 206, and other components may be added.
  • the product recognition system according to the first embodiment may include a transport mechanism (for example, a belt conveyor) that transports one or more objects from the upstream side to the downstream side.
  • a predetermined area having an imaging unit is arranged on the upstream side.
  • a settlement area is arranged on the downstream side. In the predetermined area, the number of objects imaged by the imaging unit is counted. This product recognition system detects an error when the number of counted objects is different from the number of products settled in the settlement area.
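  • A small sketch of this count check, with illustrative names: the number of objects counted in the upstream imaging area is compared with the number of products settled downstream, and a mismatch is reported as an error.

```python
# Conveyor-variant count check; names are illustrative.
def check_conveyor_counts(counted_upstream: int, settled_downstream: int) -> None:
    if counted_upstream != settled_downstream:
        raise RuntimeError(
            f"count mismatch: {counted_upstream} objects imaged, "
            f"{settled_downstream} products settled")
```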
  • The information processing system according to the second embodiment is a product recognition system that includes, in a store 20 such as a bookstore as shown in FIG. 14, a cashier terminal 2 as shown in FIGS. 5 and 15, and handles a book as the product to be settled.
  • FIG. 14 is a diagram illustrating a layout example of a bookstore that employs the information processing system according to the second embodiment.
  • a store 20 such as a bookstore has a doorway 22 having a gate 21.
  • the entrances / outlets 22 are not limited to the two locations as illustrated, and may be one location or three or more locations.
  • The gate 21 is not limited to the type having an opening / closing member as illustrated; a type having a function of notifying the occurrence of an abnormal situation by sound or light, such as a speaker or a lamp (not shown), may also be used so that misconduct can be dealt with.
  • the store 20 has a plurality of shelf racks 23 for displaying books.
  • the shelf rack 23 displays books by arranging a plurality of shelves in the vertical direction at intervals.
  • A passage 24 is formed between the shelf racks 23 facing each other in the horizontal direction.
  • a plurality of ceiling cameras 310 are installed on the ceiling of the passage 24.
  • the ceiling camera 310 always captures an image of a shopper who has entered the store without blind spots in the inside of the store.
  • a plurality of (only one in the drawing) shelf cameras 311 that constantly image the inside of each shelf rack 23 may be installed in each shelf rack 23.
  • a plurality of shelf cameras 311 are arranged so that the inside of the shelf can be imaged without blind spots, and even a shopper standing in front of the shelf rack 43 can also image.
  • a cashier counter 25 is installed near the doorway 22 in the store 20.
  • a manned register 26 is installed next to the cashier terminal 2.
  • a shopping cart (not shown) for inserting books may be placed near the doorway 22 or in the passage 24.
  • a clerk Mt is working in the passage 24, the cashier counter 25, or the like.
  • the store clerk Mt owns the information terminal 9.
  • the information terminal 9 is also installed in the backyard of the store 20 or the general headquarters outside the store 20.
  • The ceiling camera 310 captures the action of a shopper taking one or more books from, or returning them to, a shelf of the shelf rack 23 and grasps the number of books taken by the shopper. Then, as shown in FIG. 15, information such as the price of the books placed on the unmanned cashier terminal 2 is acquired and the books are automatically settled.
  • the shopper will be described as a “moving object Mo”.
  • FIG. 15 is a schematic perspective view showing the general configuration of the cashier terminal 2 used in the second embodiment, showing a state in which a book (without numbering) is being paid.
  • the unmanned cashier terminal 2 according to the second embodiment employs the same external configuration as the cashier terminal 2 according to the first embodiment shown in FIG. Therefore, the cashier terminal 2 includes the surrounding portion 270 that surrounds the predetermined area A in which the book is placed.
  • the surrounding portion 270 includes a top plate portion 271, a bottom plate portion 272, and a pair of side plate portions 273.
  • the surrounding portion 270 has the same structure as the surrounding portion 270 of the first embodiment shown in FIG.
  • the cash register camera 211 that images the predetermined area A is fixed to the top plate portion 271 and the pair of side plate portions 273.
  • at least one cash register camera 211 images at least the spine of a book placed in the predetermined area A.
  • In FIG. 15, the book is placed so that its spine faces one side plate portion 273 and its fore edge faces the other side plate portion 273; however, the spine may instead be directed toward the cash register camera 211 installed in the top plate portion 271, and there are no restrictions on how the books are placed.
  • FIG. 16 is a configuration diagram showing the configuration of the product recognition system as the second embodiment of the information processing system of the present invention.
  • the product recognition system includes a server 1, cash register terminals 2-1 to 2-n, a sales floor device 3, and an inspection terminal Q.
  • the sales floor device 3 has a function of recognizing the number of books from a captured image of the book captured by the ceiling camera 310.
  • the server 1 is installed in the backyard of the store 20 or outside the store in order to manage the cashier terminals 2-1 to 2-n, the sales floor device 3, and the inspection terminal Q.
  • the sales floor device 3 controls the ceiling camera 310 installed in the store 20 in order to discover and track the moving object Mo in the store 20 shown in FIG.
  • the server 1, the cashier terminals 2-1 to 2-n, the sales floor device 3, and the inspection terminal Q are connected to each other via a network N such as an Internet line.
  • the inspection terminal Q can be possessed by a store clerk distant from the cashier terminal 2, provided in a backyard of the store, or provided in a call center remote from the store.
  • the server 1 executes each process in order to manage each operation of the cashier terminal 2 and the sales floor device 3.
  • the server 1 includes a CPU 101, a ROM 102, a RAM 103, a bus 104, an input / output interface 105, an output unit 106, an input unit 107, a storage unit 108, a communication unit 109, and a drive 110. There is. These are configured similarly to the server 1 described in the first embodiment shown in FIG.
  • FIG. 17 is a block diagram showing the hardware configuration of the sales floor device 3 in the product recognition system of FIG.
  • the sales floor device 3 includes a CPU 301, a ROM 302, a RAM 303, a bus 304, an input / output interface 305, a ceiling camera 310, a shelf camera 311, a communication unit 315, and an information terminal 9.
  • the CPU 301, the ROM 302, the RAM 303, the bus 304, the input / output interface 305, and the communication unit 315 of the sales floor device are configured similarly to those of the server 1 shown in FIG. 7.
  • the ceiling camera 310 is connected to the network by a USB (Universal Serial Bus) cable.
  • the shelf camera 311 may employ a camera capable of capturing a wide angle such as a fisheye camera. Further, the shelf camera 311 is connected to the network by a USB cable.
  • the information terminal 9 is an information device such as a smartphone or a tablet including a remote operation unit 390, a display unit 391, and the like.
  • the remote control unit 390 has a function of remotely correcting an error state such as a system processing abnormality.
  • the display unit has a screen for displaying an error state, a moving object Mo, and the like.
  • the information terminal 9 also includes a voice generation unit (not shown) that notifies the error state.
  • the error state in the store includes, for example, the case where the ceiling camera 310 cannot recognize the number of books taken from the shelf, the case where an unaccounted book is about to be taken out of the store, and the like.
  • the server 1 also includes an error display unit 151 that displays such an error and an error canceling unit 152 that cancels the error state.
  • Alternatively, a visual inspection may be requested at the inspection terminal Q; by doing so, it is possible to eliminate the error state.
  • the cashier terminal 2 is connected to the sales floor device 3 through the network N.
  • The cashier terminal 2 is configured similarly to the cashier terminal of the first embodiment shown in FIG. Therefore, the cashier terminal 2 according to the second embodiment includes the CPU 201, the ROM 202, the RAM 203, the bus 204, the input / output interface 205, the output unit 206, the input unit 207, the illumination unit 221, the light shielding unit 209, the presentation unit 210, a cash register camera 211, a storage unit 208, a communication unit 213, and a drive 214.
  • the cash register camera 211 images a book placed in a predetermined area A, and outputs a captured image obtained as a result to the image acquisition unit 232 in the CPU 201 as an object captured image. If the cashier camera 211 can image in a wide angle like a fish-eye camera, or if it specializes in imaging only the spine of a book, only one cashier camera 211 may be provided.
  • Since the hardware configuration of the inspection terminal Q is the same as the hardware configuration of the server 1, refer to FIG. for the hardware configuration of the inspection terminal Q; the description is omitted here.
  • FIG. 18 is a functional block diagram showing an example of the functional configuration of the server 1, the cashier terminal 2, the sales floor device 3, and the inspection terminal Q.
  • the CPU 101 of the server 1 includes an error determination unit 150.
  • a product DB 131 and a position information management DB 132 are provided in one area of the storage unit 108 of the server 1.
  • the product DB 131 is a DB (Data Base) that stores information such as book titles, prices, author names, publishers, and the like.
  • the position information management DB 132 manages the position of the moving object Mo.
  • In the CPU 201 of the cashier terminal 2, the trade-restricted product determination unit 236, the settlement unit 237, the display control unit 238, and the visual inspection result acquisition unit 239, among others, function.
  • In the CPU of the inspection terminal Q, the image display control unit 411 and the eye inspection result transmission unit 412 function.
  • The light emission control unit 228 executes control to switch the illumination unit 221 between a state of emitting light at the timing of capturing an image of a book and a state of not emitting light at timings when no image is captured, and control to switch the emission color according to the situation, such as whether a book placed in the predetermined area A has been recognized.
  • the light-shielding control unit 229 controls the light-shielding unit 209 provided in the surrounding unit 270 to switch between an opaque state at the timing of photographing a book placed in the predetermined area A and a transparent state at the timing of not photographing the book.
  • the presentation control unit 230 executes control so that the presentation unit 210 changes the emission color for presenting the state of the cashier terminal 2.
  • the image acquisition unit 232 acquires the data of the object image captured by the cashier camera 211 with the object placed in the predetermined area A.
  • the object recognition unit 233 recognizes the presence of an object placed in the predetermined area A using the above-described predetermined image recognition method.
  • the product identification unit 235 lists up product candidates for the object whose existence is recognized by the object recognition unit 233, by an image processing method such as specific object recognition, general object recognition, character recognition, or deep learning.
  • the listed product candidates are called “product candidate list S”.
  • the product specifying unit 235 causes the verification function to be performed and specifies the product with high accuracy.
  • The “image recognition method” described below means a method of recognizing what an object is by using some kind of image. For example, specific object recognition, general object recognition, character recognition, and deep learning are examples of image recognition methods.
  • the verification function lists up the "commodity candidate list P" by an algorithm different from the method of listing up the product candidates described above.
  • the results of the product candidate lists S and P are matched with each other, and a product is specified when the result exceeds a predetermined threshold.
  • As a method of listing the “commodity candidate list”, for example, it may be realized by matching the image information of the object whose existence has been recognized against the image information held in the DB information holding unit 241 or in memory. That is, when the feature information of the two images matches (exceeds a threshold value), the object whose existence has been recognized by the object recognition unit 233 is a product registered in the DB information holding unit 241, and the product specification unit 235 therefore specifies it as that product registered in the DB information holding unit 241.
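  • The cross-verification idea can be sketched as follows: one algorithm produces product candidate list S, an independent algorithm produces list P, and a product is adopted only when both lists agree with a combined score above a threshold. The two candidate functions are stubs with made-up scores; `None` stands for the case where a visual inspection would be requested.

```python
# Dual candidate-list verification sketch; scoring functions are placeholders.
def candidates_by_feature_matching(image):
    # e.g. specific object recognition against registered images
    return {"book_A": 0.91, "book_B": 0.40}     # product_id -> confidence (list S)

def candidates_by_text_recognition(image):
    # e.g. character recognition of the title on the cover or spine
    return {"book_A": 0.85, "book_C": 0.30}     # product_id -> confidence (list P)

def specify_product(image, threshold=0.75):
    list_s = candidates_by_feature_matching(image)
    list_p = candidates_by_text_recognition(image)
    best_id, best_score = None, 0.0
    for pid in set(list_s) & set(list_p):        # candidates present in both lists
        score = (list_s[pid] + list_p[pid]) / 2
        if score > best_score:
            best_id, best_score = pid, score
    return best_id if best_score >= threshold else None   # None -> request visual inspection
```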
  • When the product cannot be specified at the cashier terminal 2, the product identification unit 235 outputs the result as a product not specified. Then, as in the first embodiment, when the specific object visual inspection mode is set, the product identification unit 235 transmits the object captured image of the object not specified as a product, together with related information, to the inspection terminal Q via the visual inspection result acquisition unit 239. On the other hand, when the all-object visual inspection mode is set, the product identification unit 235 transmits the product identification results and the product captured images of the identified products, the object captured images of the objects not identified as products, and related information to the inspection terminal Q via the visual inspection result acquisition unit 239.
  • the product specifying unit 235 transmits the object captured image and the number of books counted in the sales floor device 3 to the inspection terminal Q via the inspection result acquisition unit 239. Furthermore, the product identification unit 235 identifies the product based on the eye inspection result acquired by the eye inspection result acquisition unit 239 from the eye inspection terminal Q.
  • The product identification unit 235 identifies the product according to the content indicated by the visual inspection result.
  • the product specifying unit 235 specifies the object as a product indicated by the visual inspection result. If the number of books is not confirmed by visual inspection (disagreement of the number of books is not resolved), an error message is displayed and settlement is stopped.
  • the trade-restricted product determination unit 236 determines whether or not the product identified by the product identification unit 235 is a trade-restricted product, based on the determination information, and presents it.
  • the trade-restricted product is, for example, a book whose sale is restricted to shoppers under a certain age. When a clerk sells a book, the clerk who sees the shopper can confirm the age of the shopper and determine whether or not to sell the book.
  • this system which employs automatic settlement, rather than face-to-face sales, requires a mechanism that allows shop assistants to confirm the age of shoppers.
  • Therefore, the trade-restricted product determination unit 236 of the cashier terminal 2 that has identified the trade-restricted product transmits the product captured image of the product corresponding to the trade-restricted product to the inspection terminal Q via the visual inspection result acquisition unit 239.
  • the trade-restricted product determination unit 236 appropriately transmits the image (face image, identification image, etc.) regarding the shopper to the inspection terminal Q via the inspection result acquisition unit 239.
  • the purchaser may be informed by screen display or voice guidance at the cashier terminal 2 and the image retake may be requested.
  • the purchaser and the eye examiner may make a call via the microphone M and the speaker S provided in the cashier terminal 2. Then, the trade-restricted product determination unit 236 determines whether or not the sale of the trade-restricted product is permitted based on the eye inspection result acquired by the eye-inspection result acquisition unit 239 from the inspection terminal Q (the sale-restriction is released). Whether or not) is determined. If the sale of the trade-restricted product is permitted according to the result of the visual inspection, the payment of the product is continued. On the other hand, if the sale of the trade-restricted product is not permitted according to the result of the visual inspection, the fact that the product is a trade-restricted product is presented and the settlement process is interrupted.
  • the salesclerk who received the error state may confirm the age of the shopper on the error display unit 151, operate the cashier terminal 2, and release the restricted state. Thereby, the settlement process is restarted.
  • Alternatively, the cashier terminal 2 may include a cashier camera 211 that captures the face, hands, and the like of the shopper and estimates the age of the shopper, so that trade-restricted products are not sold to a shopper who is determined not to have reached a predetermined age.
  • the trade-restricted product determination unit 236 identifies a trade-restricted product such as an age-restricted book from the information of the DB management unit 141 of the server 1.
  • the trade-restricted product determination unit 236 may restrict the trade by associating with the shopper information obtained by the individual authentication.
  • the display unit D indicates that the book is a trade-restricted product.
  • the settlement unit 237 calculates the total price of the books specified by the product specifying unit 235 and which can be sold by the trade restriction product judging unit 236. For example, the settlement unit 237 reads out the prices of the books placed in the predetermined area A from the DB information holding unit 241, adds the prices, displays the prices on the display unit D (FIG. 8), and performs the settlement.
  • the display control unit 238 executes control to display the title and price of the book imaged by the cashier camera 211 of the cashier terminal 2 and settled by the settlement unit 237 to the purchaser and the clerk.
  • In addition, the output unit 206 is controlled so as to output a notification (such as an instruction to reposition the product). Specifically, the purchaser may be notified by a screen display or voice guidance and requested to reposition the product. Further, the purchaser and the eye examiner may make a call via the microphone M and the speaker S provided in the cashier terminal 2.
  • the eye inspection result acquisition unit 239 requests the identification of the product by the eye inspection by transmitting the object captured image or the product identification result and the product captured image of the identified product to the eye inspection terminal Q, and The eye inspection result transmitted from the eye inspection terminal Q is acquired in response to the request.
  • the visual inspection result acquisition unit 239 outputs the acquired visual inspection result (commercial product identification result) to the commercial product identification unit 235.
  • Further, the visual inspection result acquisition unit 239 requests the determination of a sale-restricted product by visual inspection by transmitting the product captured image of the product corresponding to the sale-restricted product to the inspection terminal Q, and acquires the visual inspection result transmitted from the inspection terminal Q in response.
  • the visual inspection result acquisition unit 239 appropriately transmits an image of the shopper (face image, image of identification card, etc.) to the visual inspection terminal Q.
  • the eye inspection result acquisition unit 239 outputs the acquired eye inspection result (determination result of the sale-restricted product) to the sale-restricted product determination unit 236.
  • When the cashier terminal 2 requests a visual inspection, the image display control unit 411 outputs, to the output unit 106, images such as the object captured image or the product captured image transmitted from the cashier terminal 2, together with the various information (product candidate list, etc.) transmitted in association with those images.
  • The visual inspection result transmission unit 412 transmits the visual inspection result (the product identification result, the determination result of a trade-restricted product, etc.) that the inspector has input through the input unit 107 while viewing the object captured image or product captured image output to the output unit 106, to the cashier terminal 2 that requested the visual inspection.
  • In the CPU 301 of the sales floor device 3, as shown in FIG. 18, a personal authentication unit 320, a moving object tracking unit 330, a position information management unit 340, and a book number counting unit 350 function.
  • the personal authentication unit 320 includes a personal information acquisition unit 321.
  • the personal authentication unit 320 personally authenticates who the shopper registered in the DB management unit 141 of the server 1 is based on the personal information of the shopper acquired by the personal information acquisition unit 321.
  • the personal authentication unit 320 and the personal information acquisition unit 321 may be provided in the cashier terminal 2 as in the first embodiment.
  • The personal information here includes, for example, information related to privacy, such as personal information (name, sex, date of birth, address, telephone number, etc.), biometric information (fingerprints, veins, iris, etc.), and financial information (credit card numbers, bank account numbers, etc.).
  • The personal information acquisition unit 321 is provided in, for example, the gate 21 installed at the doorway 22.
  • a reading device that touches a portable information terminal such as an IC card of a shopper, a smartphone, or a tablet, a reading device that reads biometric information such as a fingerprint, a vein, or an iris is used.
  • the personal authentication may be performed from the image of the shopper captured by the ceiling camera 310 during shopping.
  • the acquired personal information is used for trading restrictions (including cancellation) and purchase analysis.
  • FIG. 19 is a functional block diagram showing a detailed functional configuration example of the moving object tracking unit 330 provided in the sales floor device 3 of FIG. As shown in FIG. 19, the moving object tracking unit 330 discovers the moving object Mo from the image captured by the ceiling camera 310 and tracks the moving moving object Mo.
  • The moving object tracking unit 330 includes a moving object discovery unit 3302 by the ceiling camera, a moving object area definition unit 3304 by the ceiling camera, and a moving object area tracking unit 3305 by the ceiling camera.
  • the moving object tracking unit 330 using the ceiling camera is connected to the ceiling camera 310 via a USB cable, the network N, or the like. Therefore, the ceiling camera 310 is linked with another ceiling camera 310, a personal computer, or the like.
  • The moving object discovery unit 3302 by the ceiling camera estimates the state of the moving object Mo using a state space model (Bayesian filter or the like) based on the captured image taken by the ceiling camera 310, discovers the moving object Mo, and assigns it a uniquely identifiable ID.
  • the ceiling camera 310 may acquire highly accurate position information by acquiring height information of the moving object Mo using a distance sensor or the like.
  • The moving object area definition unit 3304 by the ceiling camera updates the position information of the area of the moving object Mo after it moves. Since the moving object Mo keeps moving, its area changes within the range captured by one ceiling camera 310 and also moves into the range captured by another ceiling camera 310. Each time the moving object Mo moves, the moving object area is defined again, and the position information of each moving object area is updated in the position information management DB 132 that manages position information, in memory, or the like.
  • the moving object area tracking unit 3305 estimates the position of the moving object area and continues to track the moving object area of the moving object Mo.
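  • For illustration only, the sketch below replaces the state space model (Bayesian filter) with a greedy nearest-centroid association: detections from each frame are matched to existing moving object areas, unmatched detections are given new IDs, and positions are updated. It is a simplified stand-in under assumed names, not the tracking method of this specification.

```python
# Simplified ID assignment and tracking of moving object areas.
import itertools, math

class MovingObjectTracker:
    def __init__(self, max_distance=80.0):
        self._ids = itertools.count(1)
        self.tracks = {}                      # id -> (x, y) last known centroid
        self.max_distance = max_distance

    def update(self, detections):
        """detections: list of (x, y) centroids of moving object areas in store coordinates."""
        assigned = {}
        unmatched = list(detections)
        for track_id, pos in list(self.tracks.items()):
            if not unmatched:
                break
            nearest = min(unmatched, key=lambda d: math.dist(d, pos))
            if math.dist(nearest, pos) <= self.max_distance:
                self.tracks[track_id] = nearest       # update position information
                assigned[track_id] = nearest
                unmatched.remove(nearest)
        for det in unmatched:                         # newly discovered moving object Mo
            new_id = next(self._ids)
            self.tracks[new_id] = det
            assigned[new_id] = det
        return assigned
```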
  • FIG. 20 is a functional block diagram showing a detailed functional configuration example of the position information management unit 340 provided in the sales floor device 3 in FIG.
  • the position information management unit 340 includes an inter-camera information transfer unit 341, a position definition unit 342 of each camera, and a moving object display unit 343.
  • The inter-camera information transfer unit 341 shares the image information captured by each ceiling camera 310 with the image information captured by the other ceiling cameras 310, so that even when the moving object Mo leaves the captured image of one ceiling camera 310 and is captured by another ceiling camera 310, the moving object area can be tracked continuously.
  • For example, the inter-camera information transfer unit 341 exchanges information between the ceiling cameras 310 through the server 1, which manages the information obtained by the imaging of the ceiling cameras 310 on the storage unit 108 including the product DB 131.
  • Alternatively, the inter-camera information transfer unit 341 may pass the images captured by the ceiling cameras 310 directly between the ceiling cameras 310, for example by P2P, without going through the server 1.
  • the position definition unit 342 of each camera defines the position information of where in the store each ceiling camera 310 is reflected. That is, the position definition unit 342 of each camera grasps where in the store the moving object imaged by the different ceiling cameras 310 by the inter-camera information transfer unit 341 is located.
  • the position definition unit 342 of each camera combines the captured images of the ceiling cameras 310 to create one store map. Further, the position definition unit 342 of each camera replaces the coordinates of each ceiling camera 310 and the shelf camera 311 with the coordinates on the store map. Further, the position definition unit 342 of each camera corrects the captured image captured by each ceiling camera 310 by perspective transformation so that the captured image directly faces the floor surface in the store.
  • When the ceiling camera 310 is equipped with a distance sensor so that height information is acquired, the position information management unit 340 can accurately correct the distorted captured image and accurately recognize the moving object Mo.
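  • A sketch of mapping a ceiling-camera image point to store-map coordinates with a perspective (homography) transform, in the spirit of the coordinate replacement and perspective correction described above. The four point correspondences are assumed calibration data, not values from this specification.

```python
# Mapping camera pixels to store-map coordinates (OpenCV); values are assumptions.
import numpy as np
import cv2

# pixel corners of the floor area seen by one ceiling camera
image_pts = np.float32([[120, 80], [1180, 95], [1210, 700], [90, 690]])
# the same corners expressed in store-map coordinates (e.g. centimetres)
map_pts = np.float32([[0, 0], [400, 0], [400, 300], [0, 300]])

H = cv2.getPerspectiveTransform(image_pts, map_pts)

def to_store_map(point_xy):
    src = np.float32([[point_xy]])           # shape (1, 1, 2)
    dst = cv2.perspectiveTransform(src, H)
    return tuple(dst[0, 0])                  # (x, y) on the store map

# a moving object detected at pixel (640, 400) in this camera
print(to_store_map((640, 400)))
```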
  • the moving object display unit 343 displays the position information imaged by the position definition unit 342 of each camera for the moving object Mo in the store 30.
  • the moving object display unit 343 may be adopted as the information terminal 9 held by the clerk Mt, the backyard screen of the store, or the like.
  • FIG. 21 is a functional block diagram showing a detailed functional configuration example of the book number counting unit 350 provided in the sales floor device 3 of FIG.
  • The book number counting unit 350 includes a number-of-books recognition unit 351, a moving object and number-of-books association unit 352, an unspecified number-of-books determination unit 353, a number-of-volumes management unit 354 associated with a person, a moving object area transfer recognizing unit 355 by the ceiling camera, and a number-of-books recognition unit 356 for books delivered, by the ceiling camera.
  • the number-of-books recognition unit 351 recognizes the number of books taken by the moving object Mo from the shelf and the number of books returned to the shelf from the captured images captured by the ceiling camera 310.
  • Specifically, the number-of-books recognition unit 351 detects the entry and exit of the moving object region into and out of the shelf by providing an “object entry detection line” or the like in the captured image, and, in the captured image at the time of detection, defines the areas of the objects that the moving object Mo has taken from the shelf and of the objects returned to the shelf. The number of these object areas is recognized as the number of books.
  • Since the ceiling camera 310 has a zoom function, the number-of-books recognition unit 351 may recognize the number of books after the ceiling camera zooms in, taking into consideration the distance between the ceiling camera 310 and the moving object Mo.
  • Alternatively, the number-of-books recognition unit 351 may recognize the number of books taken from the shelf and returned to the shelf by the moving object Mo using the shelf camera 311, or may recognize, from the combination of the ceiling camera 310 and the shelf camera 311, the number of books the shopper has taken from the shelf and the number of books returned to the shelf.
  • The shelf camera 311 in that case may be a camera capable of capturing an image over a wide range.
  • the moving object and the number-of-books association unit 352 associates the number of books recognized by the number-of-books recognition unit 351 with the person who took the book.
  • the unspecified number of books determining unit 353 associates the fact that the number of books cannot be recognized with the moving object Mo.
  • the number-of-volumes management unit 354 associated with a person continuously manages the number-of-volumes count list associated with the ID of the moving object Mo by utilizing the position information management DB 132 and the like.
  • When the moving object Mo takes a book from the shelf, the number of books taken is added to the count list; when the moving object Mo returns a book to the shelf, the number of returned books is subtracted.
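  • A minimal sketch of such a per-person book count list, with illustrative method names: counts are added when books are taken, subtracted when books are returned, and a "settlement completed" state can be recorded, keyed by the ID of the moving object Mo.

```python
# Book count list managed per moving object ID; names are illustrative.
class BookCountManager:
    def __init__(self):
        self._counts = {}          # moving object ID -> number of books carried
        self._settled = set()

    def on_take(self, mo_id, n=1):
        self._counts[mo_id] = self._counts.get(mo_id, 0) + n

    def on_return(self, mo_id, n=1):
        self._counts[mo_id] = max(0, self._counts.get(mo_id, 0) - n)

    def count(self, mo_id):
        return self._counts.get(mo_id, 0)

    def mark_settled(self, mo_id):
        self._settled.add(mo_id)   # "settlement completed" state

# usage: shopper 7 takes two books and puts one back
mgr = BookCountManager()
mgr.on_take(7); mgr.on_take(7); mgr.on_return(7)
assert mgr.count(7) == 1
```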
  • When a book is handed over between moving objects, the moving object area transfer recognizing unit 355 by the ceiling camera transfers the book number information associated with the handing-over moving object Mo to the moving object Mo that received the book.
  • The moving object area transfer recognizing unit 355 by the ceiling camera may recognize the handover by analyzing the movement of a person using an object recognition method such as deep learning; during the handover, the hand may be recognized, or the overlap between the moving object regions (which may include the hand) may be recognized.
  • the moving object area transfer recognition unit 355 using the ceiling camera may use the shelf camera 311 instead of the ceiling camera 310.
  • the shelf camera 311 at that time may be a camera capable of capturing an image over a wide range.
  • the number-of-books recognition unit 356 delivered by the ceiling camera recognizes the number of books when the books are delivered between the moving object regions. For example, the number-of-books recognition unit 356 delivered by the ceiling camera recognizes the number of books from the captured image at the time when the delivery is recognized.
  • the number-of-books recognition unit 356 delivered by the ceiling camera may be equipped with a ceiling camera having a zoom function to zoom up the portion where the delivery is supposed to be performed and recognize the number of books.
  • The number-of-books recognition unit 356 delivered by the ceiling camera may recognize the number of books by using the shelf camera 311 capable of capturing a wide range instead of the ceiling camera 310. Further, the number-of-books recognition unit 356 delivered by the ceiling camera associates each moving object Mo identified by the moving object area transfer recognizing unit 355 by the ceiling camera with the number of books recognized here, and updates the book count list.
  • FIG. 22 is a flowchart illustrating book settlement processing in the second embodiment.
  • step S201 when the shopper (moving object Mo) enters the store through the entrance / exit 22 of the store (FIG. 14), the ceiling camera 310 installed near the entrance / exit 22 starts imaging the shopper.
  • As the shopper moves toward the back of the store, the ceiling cameras 310 on the back side image the shopper. In this way, the plurality of ceiling cameras 310 constantly image the shopper.
  • the gate 21 may be provided near the entrance 22. When the doorway 22 is provided with the gate 21, the gate 21 is always closed, but the gate 21 is opened at the timing when the shopper enters the store and is closed after entering the store.
  • the personal authentication unit 320 may perform personal authentication of the shopper and acquire the personal information of the shopper before step S201.
  • In step S202, the moving object discovery unit 3302 by the ceiling camera defines the moving object Mo by extracting only the area of the shopper (the moving object area) imaged in step S201, issues an ID for the moving object Mo, and registers the in-store position information associated with the ID in the position information management DB 132 or the RAM 303 of the sales floor device 3.
  • In step S203, when the moving object Mo moves within the range imaged by the ceiling camera 310, the moving object area definition unit 3304 by the ceiling camera newly defines the position of the area of the moving object Mo after the movement.
  • the position information is managed by the position information management DB 132 that manages the position of a person, a memory, and the like, and is updated for each area definition. This defined position is also recognized as a position imaged by another ceiling camera 310.
  • step S204 the moving object area tracking unit 3305 using the ceiling camera estimates the position of the moving object area and keeps track of the moving object Mo based on the fact that the shopper moves through the passage 24 toward the book to be purchased.
  • Since the moving object Mo is constantly imaged in step S201, the ID is assigned to the moving object Mo in step S202, the position information of the area of the moving object Mo after the movement is updated in step S203, and position information is further exchanged by the inter-camera information transfer unit 341, the moving object area tracking unit 3305 by the ceiling camera can continue to track the moving object Mo even if the moving object Mo moves through the passage 24 and is imaged by a different ceiling camera 310.
  • step S205 the number-of-books recognizing unit 351 recognizes the number of books taken from the shelf and returned to the shelf by the moving object Mo having the ID.
  • step S205 when the number of books taken cannot be counted, an unillustrated number-of-books determination unit outputs an error.
  • step S206 the moving object and the book number association unit 352 associates the number of books recognized (counted) in step S205 with the moving object Mo. Therefore, the number-of-books recognition unit 351 recognizes how many books the uniquely identified shopper has taken.
  • step S207 the number-of-books management unit 363 associated with the person continues to manage the number of books associated with the person by utilizing the DB management unit 141 of the server 1. Therefore, even when the book picked up from the shelf by the shopper is returned to the shelf, the book quantity management unit 363 associated with the person recognizes the number of books owned by the shopper.
  • When the shopper finishes picking up books, the shopper goes to the cashier terminal 2 and places the books in the predetermined area A of the cashier terminal 2. As shown in FIG. 15, the books are placed so that the spines are captured by at least one cash register camera 211 of the cashier terminal 2. Then, the shopper presses a button, which is the input unit 207 provided in the cashier terminal 2. The cashier terminal 2 uses the pressing of this button or the like as a trigger to specify the products.
  • the product identification is performed by making full use of the image recognition method and identifying the title of the book from the cover or spine of the book.
  • In step S210, the product specification unit 235 of the cashier terminal 2 verifies whether the number-of-books information associated with the moving object Mo recognized in step S207 matches the number of books placed in the predetermined area A of the cashier terminal 2. The specific flow of this verification will be described later with reference to FIG. 23. If the number-of-books information associated with the moving object Mo does not match the number of books placed in the predetermined area A of the cashier terminal 2, a visual inspection (confirmation of the number of books) at the inspection terminal Q is requested. If the number of books is not confirmed even by the visual inspection (the disagreement in the number of books is not resolved), an error is displayed on the error display unit 151 and settlement is not possible.
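  • This verification can be sketched as follows, reusing the illustrative BookCountManager from the earlier sketch: the number of books recognized in the predetermined area A is compared with the count tracked for the shopper's moving object ID, and a mismatch triggers a visual-inspection request instead of settlement. The callback name `request_inspection` is an assumption.

```python
# Count verification at the cashier terminal; names are illustrative.
def verify_book_count(mo_id, placed_book_count, count_manager, request_inspection):
    tracked = count_manager.count(mo_id)
    if placed_book_count == tracked:
        return "proceed_to_settlement"
    # disagreement: ask the inspection terminal Q to confirm the number of books
    resolved = request_inspection(mo_id, placed_book_count, tracked)
    return "proceed_to_settlement" if resolved else "error_settlement_stopped"
```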
  • In step S210, the trade-restricted product determination unit 236 also determines whether or not any book placed on the cashier terminal is subject to a trade restriction. By obtaining the personal information in advance at any timing up to step S207 (including before step S201), the trade-restricted product determination unit 236 can determine whether the shopper is allowed to purchase a restricted book.
  • When a shopper who cannot purchase a restricted book attempts to purchase it, the error display unit 151 displays an error.
  • step S211 the settlement unit 237 of the cashier terminal 2 settles the total amount of money of the books placed in the predetermined area A. That is, as described in the first embodiment, the product specification unit 235 obtains information such as the price of a book and calculates the total price. The shopper pays with a credit card, electronic money, or the like to complete the settlement.
  • the DB management unit 141 updates the number-of-volumes count list associated with the person to the state of “settlement completed”.
  • When a shopper attempts to take an unsettled book out of the store, the take-out detection unit detects it, and the error display unit 151 near the doorway 22 and the presentation unit 210 of the cashier terminal 2 present that fact with sound, light, or the like.
  • the gate 21 when the gate 21 is installed, the gate 21 remains closed and the shopper cannot leave the store.
  • the error display unit 151 or the like may give a warning by sound, light, or the like. Thereby, shoplifting can be prevented.
  • the error display unit 151 and the presentation unit 210 of the cashier terminal 2 may be made to emit light in different colors depending on the error state.
  • FIG. 23 is a flowchart for explaining the verification of the book volume information and the booked volume number in step S210 of FIG.
  • In step S221, when the reading start button of the input unit 207 of the cashier terminal 2 is pressed, the image acquisition unit 232 of the cashier terminal 2 acquires the object captured image of the objects placed in the predetermined area A of the cashier terminal 2; then, for each object whose existence is recognized, the product specification unit 235 specifies which product it is and also recognizes the number of products.
  • In step S222, the number-of-volumes management unit associated with a person of the book number counting unit 350 of the sales floor device 3 is queried, based on the ID of the moving object Mo, for the number of books associated with the shopper who placed the books on the cashier terminal 2.
  • In step S223, the product identification unit 235 determines whether the number of books placed on the cashier terminal 2 and the number of books counted by the number-of-books management unit associated with a person match. If the number of books placed on the cashier terminal 2 and the number of books counted by the number-of-books management unit associated with the person match (YES), the process proceeds to step S226. If they do not match (NO), the process proceeds to step S224. In step S224, the product specifying unit 235 transmits the object captured image and the number of books counted in the sales floor device 3 to the inspection terminal Q via the visual inspection result acquisition unit 239, and requests a visual inspection (confirmation of the number of books) at the inspection terminal Q.
  • In step S225, the product identification unit 235 determines whether or not a visual inspection result indicating that the discrepancy in the number of books has been resolved has been obtained.
  • If such a result is not obtained, the error display unit 151 of the cashier terminal 2 displays an error and issues a warning.
  • At this time, the purchaser and the eye examiner may talk with each other via the microphone M and the speaker S provided in the cashier terminal 2. If the shopper attempts to leave the store while the warning is being given, the error display unit 151 near the doorway 22 gives a warning by sound or light.
  • In step S226, the product identification unit 235 determines whether the specific object visual inspection mode or the all-object visual inspection mode is set. If the specific object visual inspection mode is set, the process proceeds to step S227. If the all-object visual inspection mode is set, the process proceeds to step S228. In step S227, the eye inspection result acquisition unit 239 requests a visual inspection (identification of the product) at the inspection terminal Q for the target book, and acquires the visual inspection result.
  • When step S227 is completed, the process proceeds to step S229.
  • In step S228, the product identification unit 235 determines whether or not there is a book, among the books placed in the predetermined area A of the cashier terminal 2, that cannot be identified. If there is a book that the product identification unit 235 cannot identify (YES in step S228), the process proceeds to step S227.
  • If there is no book that the product specification unit 235 cannot identify in step S228 (NO in step S228), the product specification unit 235 identifies each book using the book information held in the DB information holding unit 241 or the storage unit 108 of the server 1, including information such as the title, price, and whether it is a trade-restricted product. As a result, the process proceeds to step S229. Information on the specified books may be output to the display control unit 238.
  • step S229 the trade-restricted product determination unit 236 determines whether the book identified by the product identification unit 235 is a book that needs age confirmation.
  • step S229 If it is determined in step S229 that the book identified by the product identification unit 235 is a book that requires age confirmation, that is, if YES, the process proceeds to step S230. If it is determined in step S229 that the book identified by the product identification unit 235 is not a book that requires age confirmation, that is, if NO, the process returns to the book automatic settlement process.
  • step S230 the display control unit 238 causes the display unit D of the cashier terminal 2 to display a screen for age confirmation. However, if the personal information of the shopper is acquired and it is not necessary to confirm the age here, step S230 is skipped and the process proceeds to step S234.
  • the purchaser is notified by a screen display or voice guidance at the cashier terminal 2.
  • the purchaser may be requested to retake the image.
  • the purchaser and the eye examiner may make a call via the microphone M and the speaker S provided in the cashier terminal 2.
  • the eye inspection result acquisition unit 239 requests the eye inspection terminal Q to perform eye inspection (determination of trade-restricted products) for the target book, and acquires the eye inspection result.
  • In step S232, the trade-restricted product determination unit 236 determines whether or not an instruction to cancel the trade restriction has been received.
  • If it is determined in step S232 that the instruction to cancel the trade restriction has not been received, that is, if NO, the process proceeds to step S233. If it is determined in step S232 that the instruction to cancel the trade restriction has been received, that is, if YES, the process proceeds to step S234.
  • In step S233, the eye inspection result transmission unit 412 of the eye inspection terminal Q transmits a warning that the trade restriction has not been released even as a result of the visual inspection. On receiving this warning, for example, the display control unit 238 of the cashier terminal 2 presents, via the output unit 206, a warning that the trade restriction has not been released even as a result of the visual inspection.
  • When step S233 ends, the process ends and the settlement is stopped.
  • In step S234, the trade-restricted product determination unit 236 cancels the trade restriction.
  • When step S234 ends, or when it is determined in step S229 that the book does not require age confirmation (NO), the process returns to the book automatic settlement process.
  • As described above, the information processing system recognizes the number of books taken by the shopper in the store and determines whether that number matches the number of books placed on the cashier terminal 2.
  • When necessary, a visual inspection is performed at the eye inspection terminal Q, and automatic settlement is performed according to the visual inspection result (a result of checking the number of books, a result of identifying a book, a result of determining a trade-restricted product, etc.), as sketched below. Therefore, according to the present information processing system, when the shopper purchases a product such as a book, it is possible to automate the settlement of the price of the product and to improve the product identification accuracy.
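  • The following is a minimal Python sketch of the count check and escalation described above; all names (settle_books, request_visual_inspection, InspectionResult) are hypothetical placeholders introduced for illustration, not components defined in this disclosure.

```python
# Minimal sketch of the book-count check: compare the count placed on the
# terminal with the count tracked on the sales floor, and escalate a
# mismatch to the eye examiner. All names here are hypothetical placeholders.
from dataclasses import dataclass


@dataclass
class InspectionResult:
    discrepancy_resolved: bool
    identified_books: list  # titles confirmed by the eye examiner


def settle_books(books_on_terminal: list, tracked_count: int,
                 request_visual_inspection) -> bool:
    """Return True when automatic settlement may proceed."""
    placed_count = len(books_on_terminal)
    if placed_count != tracked_count:
        # Numbers disagree: hand the images and counts to the eye examiner.
        result: InspectionResult = request_visual_inspection(
            books_on_terminal, tracked_count)
        if not result.discrepancy_resolved:
            return False  # error display / warning path
    return True  # continue with product identification and settlement


if __name__ == "__main__":
    # Stub examiner that always resolves the discrepancy.
    stub = lambda books, n: InspectionResult(True, books)
    print(settle_books(["book A", "book B"], 3, stub))  # -> True
```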
  • the system adopted in this embodiment takes various actions including the following examples.
  • (A) The image including the shopper captured by the ceiling camera 310 as a subject is transmitted to the inspection terminal Q, and detection of the shopper by visual inspection is requested.
  • When the eye examiner detects the shopper, a new moving object Mo is defined, an ID is issued, and tracking by the moving object area tracking unit 3305 using the ceiling camera is started. If the eye examiner cannot detect the shopper, the clerk is notified accordingly.
  • (B) When a shopper who is not associated with any ID registered in the position information management DB 132 is recognized, the captured image of the shopper and a list of ID candidates to be associated are transmitted to the inspection terminal Q.
  • The eye examiner associates the most appropriate ID with the shopper based on the past association information between the shopper and the IDs, and re-tracking by the moving object area tracking unit 3305 using the ceiling camera is started. If the re-tracking cannot be started for some reason, the clerk is informed accordingly. In addition, when an ID that has lost its association with a shopper is detected, that ID and a list of captured images of shoppers who are not associated with any ID are transmitted to the inspection terminal Q. The eye examiner attempts to associate a shopper with the ID based on the past association information between shoppers and IDs, and when the association succeeds, re-tracking by the moving object area tracking unit 3305 using the ceiling camera is started.
  • (C) The captured image list of each shopper and the list of IDs that should be associated with each shopper are transmitted to the inspection terminal Q.
  • The eye examiner attempts to allocate the IDs to the shoppers most appropriately based on the past information on the shoppers and the IDs.
  • When the allocation succeeds, re-tracking by the moving object area tracking unit 3305 using the ceiling camera is started. If the IDs cannot be properly allocated to the shoppers, the allocation is abandoned and the clerk is informed that the associations may have been exchanged.
  • FIG. 24 is a diagram showing a layout example of a supermarket adopting the product recognition system of the third embodiment.
  • the product recognition system of the third embodiment is a system applied to a store 30 such as a supermarket as shown in FIG.
  • the store 30 has a sales floor from the entrance 31 to the exit 32.
  • a plurality of shelf racks 33 for displaying products are installed in the sales floor.
  • a passage 34 is formed between the two shelf racks 33 facing each other.
  • the settlement area 35 is located in front of the exit 32.
  • a plurality of cash registers 36 are installed in the settlement area 35.
  • the cashier table 36 is equipped with n settlement machines 4.
  • the third embodiment does not include the cashier terminal 2 as employed in the first and second embodiments. Further, in the third embodiment, unlike the second embodiment in which a shopper is treated as a moving object Mo, baskets grasped (discovered / tracked) by the system are treated as a moving object Mo.
  • the shopper enters the store through the entrance 31, picks up the basket, and proceeds through the passage 34.
  • the shopper takes the product in the shelf, puts it in the basket, and proceeds through the aisle 34.
  • the shopper proceeds to the settlement area 35 and makes the settlement with the settlement machine 4.
  • the clerk Mt is looking around.
  • the store clerk Mt owns the information terminal 9a.
  • The information terminal 9a is a portable information processing terminal such as a smartphone, and includes a screen for displaying the state of the store. In FIG. 24, the inside of the cloud-shaped outline depicts not the inside of the store 30 but the outside of the store 30 or the backyard of the store 30.
  • The server 1 and the inspection terminal Q are installed outside the store 30, for example, in a management center existing outside the store 30, in the backyard of the store 30, or the like.
  • the store clerk Mt monitors the inside of the store 30 through the screen of a large monitor (not shown) or the screen of the information terminal 9b.
  • Hereinafter, when it is not necessary to distinguish between the information terminal 9a and the information terminal 9b, they are collectively referred to as the "information terminal 9".
  • a plurality of ceiling cameras 310 are separately installed on the ceiling above the passage 34, the shelf rack 33, and other arbitrary positions in the store 30.
  • the ceiling camera 310 captures an image of the passage 34, the shelf rack 33, and a predetermined area therebelow. That is, when the moving object Mo enters, the ceiling camera 310 images the predetermined area including the moving object Mo as a subject.
  • a shelf camera 311 is installed as an example of a sensing device at each of a plurality of positions on each shelf in each shelf rack 33.
  • the shelf camera 311 images the inside of the shelf, the products inside the shelf, and other predetermined areas. Further, the shelf camera 311 takes an image of the hand or the object in the shelf as an object imaged image when the shopper's hand or the like enters the predetermined area in the shelf or when an object is taken from the shelf.
  • a basket camera 312 (not shown in FIG. 24) may be attached to each of the baskets as an example of a sensing device. In that case, one or more basket cameras 312 of the sales floor device 3 are installed in the baskets and always image the objects put in the baskets.
  • the basket camera 312 is installed so that the inside of baskets can be imaged without blind spots.
  • The basket camera 312 captures an image of a characteristic portion of at least the front surface of an object placed in the basket.
  • the basket camera 312 is linked with the ceiling camera 310 and the shelf camera 311 via the network N.
  • the sales floor device 3 can share both the captured image captured by the basket camera 312 and the captured image of the object captured by the ceiling camera 310 or the shelf camera 311. By this sharing, it is possible to improve the accuracy of identifying the product from the object imaged in the captured image.
  • the product recognition system according to the third embodiment has a function of continuously tracking the moving object Mo even if the moving object Mo captured by the ceiling camera 310 moves.
  • the information processing system according to the third embodiment has a function of identifying which product the object picked up from the shelf rack 33 is from the captured image captured by the shelf camera 311.
  • the product recognition system of the third embodiment may have a function of identifying which product the object put in the basket is from the captured image captured by the basket camera 312.
  • The product recognition system according to the third embodiment has a function of identifying which product the object taken from the shelf is, and further automatically settling the product at the settlement machine 4.
  • the settlement machine 4 reads the information of the product associated with the moving object Mo, and the automatic settlement becomes possible.
  • the checkout machine 4 includes functions necessary for completing shopping, such as the total price of purchased commodities, points, display of details, display of trade-restricted commodity determination, and settlement function.
  • the ceiling camera 310, the shelf camera 311, and the basket camera 312 are incorporated in the sales floor device 3 as shown in FIGS. 25 and 26.
  • the sales floor device 3 has a function of specifying a product from a captured image captured by the ceiling camera 310, the shelf camera 311 and the like, and a function of discovering and tracking a moving object Mo. This sales floor device 3 is incorporated in a product recognition system as shown in FIG.
  • FIG. 25 is a configuration diagram showing the configuration of the product recognition system of the third embodiment.
  • the product recognition system according to the third embodiment includes a server 1, a sales floor device 3, a settlement machine 4, and a visual inspection terminal Q.
  • In the third embodiment, the settlement machine 4 is provided instead of the cashier terminal 2 described in the first and second embodiments.
  • In FIG. 25, only one server 1 and one sales floor device 3 are drawn, but in reality there may be a plurality of each. Further, hereinafter, when it is not necessary to individually distinguish the settlement machines 4, these are collectively referred to as the "settlement machine 4".
  • the server 1, the sales floor device 3, the settlement machine 4, and the inspection terminal Q are mutually connected via a network N such as the Internet.
  • the server 1 is configured similarly to the server 1 (FIG. 7) of the first embodiment.
  • the eye examination terminal Q has the same configuration as the eye examination terminal Q of the first embodiment.
  • FIG. 26 is a block diagram showing the hardware configuration of the sales floor device 3 in the product recognition system of FIG.
  • the sales floor device 3 includes a CPU 301, a ROM 302, a RAM 303, a bus 304, an input / output interface 305, a ceiling camera 310, a shelf camera 311, a basket camera 312, a communication unit 315, and an information terminal 9.
  • the CPU 301, the ROM 302, the RAM 303, the bus 304, the input / output interface 305, and the communication unit 315 of the sales floor device 3 are configured similarly to those of the server 1 illustrated in FIG. 7.
  • the ceiling camera 310, the shelf camera 311, the communication unit 315, and the information terminal 9 of the sales floor device 3 are configured similarly to those of the sales floor device 3 (FIG. 17) described in the second embodiment.
  • FIG. 27 is a functional block diagram showing an example of a functional configuration of the server 1, the sales floor device 3, the settlement machine 4, and the inspection terminal Q.
  • the server 1 includes a CPU 101, a storage unit 108, a communication unit 109, an error display unit 151, and an error canceling unit 152. These are configured similarly to the second embodiment shown in FIG.
  • When a request for visual inspection is made from the sales floor device 3 or the settlement machine 4, the image display control unit 411 outputs, to the output unit 106, images such as the object captured image or the product captured image transmitted from the sales floor device 3 or the settlement machine 4, together with various information (such as a product candidate list) transmitted along with these images.
  • The eye inspection result transmission unit 412 transmits the eye inspection result (the product identification result, the determination result of the trade-restricted product, and the like) that the eye examiner inputs through the input unit 107 for the object captured image or the product captured image output to the output unit 106, to the sales floor device 3 or the settlement machine 4 that requested the inspection.
  • In the CPU 301 of the sales floor device 3, as shown in FIG. 27, a personal authentication unit 320, a moving object tracking unit 330, a position information management unit 340, a shelf product recognition unit 360, a basket product recognition unit 370, a trade-restricted product determination unit 380, and an eye inspection result acquisition unit 301a function.
  • the personal authentication unit 320 includes a personal information acquisition unit 321.
  • the personal authentication unit 320 and the personal information acquisition unit 321 are configured similarly to the second embodiment shown in FIG.
  • The moving object tracking unit 330 includes a moving object finding unit 3302 using the ceiling camera, a basket finding unit 3303 using the ceiling camera, a basket area defining unit 3306 using the ceiling camera, a basket area tracking unit 3307 using the ceiling camera, a grouping unit 3308, an inter-basket area transfer recognition unit 3310 using the ceiling camera, a transferred object recognition unit 3312 using the ceiling camera, and a transferred product specifying unit 3313 using the ceiling camera.
  • the moving object tracking unit 330 of the third embodiment handles baskets that are grasped (discovered / tracked) on the system as the moving object Mo, unlike the second embodiment that handles a shopper as a moving object.
  • the moving object tracking unit 330 is connected to the ceiling camera 310 via a USB cable, a network N, or the like. Therefore, the ceiling camera 310 is linked with another ceiling camera 310, a personal computer, or the like.
  • The moving object finding unit 3302 using the ceiling camera finds objects (shoppers, baskets, carts, etc.) moving in the store by using a state space model (Bayesian filter, etc.) based on the captured images taken by the ceiling camera 310.
  • The basket finding unit 3303 using the ceiling camera discovers baskets (moving objects Mo) from among the objects moving in the store 30 discovered by the moving object finding unit 3302, and assigns an individual ID to each moving object Mo.
  • the ID of the moving object Mo is continuously used until a predetermined timing such as leaving the store or completing settlement.
  • As a method of discovering the moving object Mo from the objects moving in the store 30, for example, one or more markers holding individually identifiable information are attached to each basket, and the basket finding unit 3303 using the ceiling camera detects the moving object Mo using these markers as marks.
  • the marker is not limited as long as it can identify the moving object Mo such as a two-dimensional code or a characteristic shape.
  • the basket finding unit 3303 using the ceiling camera may use information unique to the basket such as color information and shape information of the baskets. At that time, the basket finding unit 3303 using the ceiling camera can find the moving object Mo because the colors and shapes of the baskets can be distinguished from the floor surface and the like.
  • thermography or the like may be used with the baskets kept at a low temperature. At that time, the basket finding unit 3303 using the ceiling camera finds the moving object Mo that has been cooled to a low temperature from the temperature difference between the basket and the region other than the baskets. As a method of finding the moving object Mo from the object moving in the store 30, thermography or the like may be used by generating harmless gas from the baskets. At that time, the basket finding unit 3303 using the ceiling camera detects a temperature change associated with the generation of harmless gas generated from the baskets by thermography or the like, and discovers the moving object Mo.
  • the basket finding unit 3303 using the ceiling camera finds the moving object Mo by detecting the sound generated from the baskets.
  • a sensor that applies invisible paint to a basket and can recognize the invisible paint may be used.
  • the basket finding unit 3303 using the ceiling camera finds the moving object Mo by the sensor recognizing the invisible paint applied to the baskets.
  • the baskets may be irradiated with visible light, infrared rays, or the like from the ceiling camera 310.
  • the basket finding unit 3303 with the ceiling camera finds the moving object Mo by receiving the visible light or the reflected light of infrared rays with which the baskets are irradiated.
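  • The alternatives above (markers, colour or shape information, thermography, sound, invisible paint, reflected light) all amount to locating basket-like regions in the ceiling camera image. As one concrete illustration, the following is a minimal sketch of the colour-based variant, assuming OpenCV is available; the HSV range and minimum area are placeholder values, not figures from this disclosure.

```python
# Minimal sketch: find candidate basket regions (moving objects Mo) by the
# baskets' distinctive colour in a ceiling-camera frame. The HSV range below
# is an arbitrary placeholder; a real store would calibrate it per basket.
import cv2
import numpy as np

BASKET_HSV_LOW = np.array([90, 80, 80])     # placeholder: blue-ish baskets
BASKET_HSV_HIGH = np.array([130, 255, 255])
MIN_AREA_PX = 2000                          # ignore small colour noise


def find_basket_regions(frame_bgr):
    """Return bounding boxes (x, y, w, h) of likely baskets in one frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, BASKET_HSV_LOW, BASKET_HSV_HIGH)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= MIN_AREA_PX]


if __name__ == "__main__":
    # Synthetic frame: grey floor with one blue rectangle standing in for a basket.
    frame = np.full((480, 640, 3), 128, np.uint8)
    cv2.rectangle(frame, (200, 150), (320, 260), (200, 80, 0), -1)  # BGR blue-ish
    print(find_basket_regions(frame))
```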
  • the ceiling camera basket area definition unit 3306 defines, as the area of the moving object Mo, a certain range within the captured image around the moving object Mo found by the ceiling camera basket finding unit 3303.
  • For example, when markers are used, a certain range from the position where the marker is attached is defined as the area of the moving object Mo.
  • Since each marker holds information on its own position within the area of the moving object Mo, the basket area definition unit 3306 using the ceiling camera may define the area of the moving object Mo from one or more markers.
  • the basket area definition unit 3306 using the ceiling camera may use a method that complements the above-described method for defining the basket area.
  • As a complementary method, when the baskets are kept at a low temperature as described above, the low-temperature region identified by thermography and image recognition is defined as the area of the moving object Mo.
  • Similarly, when gas is generated from the baskets, the temperature change due to the generation of the gas is detected by thermography or the like, and the area where the temperature change occurs is defined as the area of the moving object Mo by making full use of image recognition technology.
  • When the invisible paint is applied to the edge of the basket as described above, the basket area definition unit 3306 using the ceiling camera estimates the edge of the basket from the coating position of the paint, and defines the area enclosed by that edge as the area of the moving object Mo.
  • When the baskets are irradiated with visible light, infrared rays, or the like, the basket area definition unit 3306 defines the area of the moving object Mo from the measurement result of the reflected light.
  • the basket area tracking unit 3307 using the ceiling camera estimates (defines) the position of the area of the moving object Mo.
  • the ceiling camera basket area tracking unit 3307 tracks the moving object Mo with the same ID from the time when the moving object Mo is found to a predetermined time when the store is closed or the settlement is completed, and keeps track of the position information.
  • the basket area tracking unit 3307 by the ceiling camera always keeps track of the basket area in the captured image, for example, in cooperation with a large number of ceiling cameras 310.
  • For that purpose, the basket area tracking unit 3307 using the ceiling cameras may hand over the tracking from the image captured by a certain ceiling camera 310 to the images captured by the adjacent ceiling cameras 310.
  • the ceiling camera basket area tracking unit 3307 stores the position information of the moving object Mo being tracked in the position information management DB 132 of the server 1 or the storage unit 108.
  • As another method for the basket area tracking unit 3307 using the ceiling camera to track the moving object Mo, a marker holding information that identifies the moving object Mo is attached to each basket, and the ceiling camera 310 images the moving object Mo including the marker.
  • the basket area tracking unit 3307 that uses the ceiling camera discovers the moving object Mo and acquires the position information by extracting the marker from the captured image. Even if the moving object Mo moves, the basket area tracking unit 3307 using the ceiling camera can continue to track the moving object Mo by finding the marker from the captured image and acquiring the position information of the moving object Mo.
  • Further, the basket area tracking unit 3307 using the ceiling camera may track the moving object Mo by using an in-image object tracking technique such as a Bayesian filter, fast Fourier transform, or TLD.
  • the basket area tracking unit 3307 using the ceiling camera estimates that the moving objects Mo that have acquired the same feature data are the same moving object Mo based on the feature data such as the colors and shapes of the baskets. The moving object Mo may be tracked. At that time, the basket area tracking unit 3307 using the ceiling camera continues to collect the characteristic data of the tracking target.
  • the moving object Mo that is not directly facing the ceiling camera 310 is captured at an angle (from an oblique direction), and thus the position information may not be accurately acquired. Therefore, it is conceivable that the captured image is corrected so that the image is directly faced. However, even if such a correction is applied, the position information of the moving object Mo may not be acquired with high accuracy. Therefore, the basket area tracking unit 3307 using the ceiling camera acquires high-accuracy position information by acquiring height information of the moving object Mo using a distance sensor or the like. In this way, the basket area tracking unit 3307 using the ceiling camera may continue to track the moving object Mo.
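  • To make the tracking idea concrete, the following is a minimal sketch of keeping a stable ID per basket across frames, in the spirit of the basket area tracking unit 3307: each new detection is matched to the nearest previously tracked position, and unmatched detections get a fresh ID. Real tracking (Bayesian filters, markers, camera hand-over, height correction) is far richer; the matching radius and names are illustrative assumptions only.

```python
# Minimal nearest-neighbour ID tracking sketch. Tracks that find no nearby
# detection are simply dropped in this simplified version.
import math
from itertools import count

MATCH_RADIUS_PX = 80          # assumed maximum movement between frames
_id_source = count(1)


def update_tracks(tracks: dict, detections: list) -> dict:
    """tracks: {id: (x, y)} from the previous frame; detections: [(x, y), ...]."""
    updated = {}
    unclaimed = list(detections)
    for mo_id, (px, py) in tracks.items():
        if not unclaimed:
            break
        # Pick the closest detection for this existing ID.
        best = min(unclaimed, key=lambda d: math.hypot(d[0] - px, d[1] - py))
        if math.hypot(best[0] - px, best[1] - py) <= MATCH_RADIUS_PX:
            updated[mo_id] = best
            unclaimed.remove(best)
    for det in unclaimed:              # newly discovered baskets
        updated[next(_id_source)] = det
    return updated                     # would be written to the position info DB


if __name__ == "__main__":
    t = update_tracks({}, [(100, 100)])            # basket appears -> ID 1
    t = update_tracks(t, [(130, 110)])             # same basket, ID preserved
    t = update_tracks(t, [(132, 112), (400, 50)])  # second basket -> ID 2
    print(t)
```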
  • the grouping unit 3308 may associate a plurality of baskets (moving objects Mo). By being linked in this way, the product list of each moving object Mo can be collectively settled in the settlement machine 4 installed in the settlement area 35.
  • The inter-basket area transfer recognition unit 3310 using the ceiling camera recognizes, with the ceiling camera 310 and the like, that an object is transferred (moved in and out) between the moving objects Mo.
  • The inter-basket area transfer recognition unit 3310 using the ceiling camera may recognize the transfer by recognizing the overlap between the areas of the moving objects Mo.
  • The inter-basket area transfer recognition unit 3310 using the ceiling camera identifies the moving objects Mo between which the object is transferred, and reads the product list associated with the ID of each moving object Mo.
  • The transferred object recognition unit 3312 using the ceiling camera defines the area of the object from the captured image at the time when the transfer is recognized.
  • The transferred object recognition unit 3312 using the ceiling camera may use a zoomable camera as the ceiling camera 310 and zoom in on the location where the transfer is supposed to have occurred to define the area of the object.
  • The transferred product specifying unit 3313 using the ceiling camera specifies, from the image after the object area definition, which of the products in the product list associated with the moving object Mo the transferred object is.
  • The moving objects Mo specified by the inter-basket area transfer recognition unit 3310 using the ceiling camera are linked with the product specified as having been transferred, and the list of products linked to each moving object Mo is updated.
  • The transferred object recognition unit 3312 using the ceiling camera and the transferred product specifying unit 3313 using the ceiling camera may be realized by a shelf camera 311 or the like capable of capturing a wide range instead of the ceiling camera 310.
  • the position information management unit 340 is configured similarly to the position information management unit 340 described in the second embodiment. That is, the position information management unit 340 includes an inter-camera information transfer unit 341, a position definition unit 342 of each camera, and a moving object display unit 343, as shown in FIG.
  • FIG. 29 is a functional block diagram showing a detailed functional configuration of the shelf product recognition unit 360 provided in the sales floor device 3 of FIG.
  • The shelf product recognition unit 360 includes an object recognition unit 3602 using the shelf camera, a product specifying unit 3603 using the shelf camera, a basket and product association unit 3605, a product list management unit 3607 associated with the basket, an object entrance/exit detection unit 3608 using the shelf camera, a product non-identification determination unit 3609, a label recognition unit 3610, a discount sticker recognition unit 3611, and a basket entrance/exit detection unit 3612 using the shelf camera or the ceiling camera.
  • the shelf product recognition unit 360 is linked with another camera, a personal computer, or the like through the shelf camera 311 and a USB cable or a network.
  • the object recognition unit 3602 using the shelf camera compares the images before and after the image in which the object is taken from the shelf or the object is placed (returned) in the shelf, and the image area targeted for product identification is determined.
  • That is, the object recognition unit 3602 using the shelf camera compares the object captured images before and after the object is taken from the shelf or placed on (returned to) the shelf, and specifies the changed image area.
  • The object recognition unit 3602 using the shelf camera confirms the change in each of the RGB data when specifying the image area.
  • Alternatively, with the detection by the object entrance/exit detection unit 3608 as a trigger, the object recognition unit 3602 using the shelf camera may define the area of the object taken from the shelf or placed on the shelf from only one object captured image, by using a method different from the comparison of the object captured images before and after the change.
  • Further, so that a change such as an entry or exit is not judged as no change when the color data of the object and its surroundings are the same, the object recognition unit 3602 using the shelf camera may also define the area by using the shadow of the object.
  • the information on the recognized object area is passed to the product specifying unit 3603 by the shelf camera.
  • the shelf camera product identification unit 3603 identifies which product is the object in the shelf recognized by the shelf camera object recognition unit 3602.
  • the shelf camera product identification unit 3603 lists up product candidates by image processing methods such as specific object recognition, general object recognition, and deep learning. The listed product candidates are called “product candidate list S”. After that, the product specification unit 3603 using the shelf camera specifies the product with high accuracy by performing the verification function.
  • the verification function lists up the "commodity candidate list P" by an algorithm different from the method of listing up the product candidates described above.
  • the results of the product candidate lists S and P are matched with each other, and a product is specified when the result exceeds a predetermined threshold.
  • The method of listing up the "product candidate list" may be realized by, for example, a method of matching image information of the object obtained from the object whose existence is recognized with the image information held in the product DB 131 or the memory. That is, when the feature information of both images matches (exceeds a threshold value), the product specifying unit 3603 using the shelf camera specifies the object whose existence is recognized by the object recognition unit 3602 using the shelf camera as the product registered in the product DB 131.
  • a product candidate is created by deep learning, and then the verification function is exerted to specify the product with high accuracy.
  • The product specifying unit 3603 using the shelf camera may identify the product not within one frame of the captured image captured by the shelf camera 311 but over a plurality of captured images, also using the captured images captured by the ceiling camera 310. At that time, the product specifying unit 3603 using the shelf camera gives a percentage to each product candidate, adds to the percentage based on information such as purchase history, time, place, and the person's preference, and identifies the product when the percentage exceeds a certain threshold. A sketch of the candidate-list verification follows.
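  • As an illustration of the verification function described above, the following minimal sketch combines two independently produced candidate lists S and P and identifies a product only when a candidate clears a threshold in both. Taking the minimum of the two confidences is merely one plausible way to "match" the lists, and the threshold value and product IDs are assumptions.

```python
# Minimal sketch of the verification step with candidate lists S and P.
THRESHOLD = 0.75


def verify(candidates_s: dict, candidates_p: dict, threshold: float = THRESHOLD):
    """candidates_s / candidates_p map product_id -> confidence in [0, 1]."""
    best_id, best_score = None, 0.0
    for product_id in candidates_s.keys() & candidates_p.keys():
        combined = min(candidates_s[product_id], candidates_p[product_id])
        if combined > best_score:
            best_id, best_score = product_id, combined
    if best_score >= threshold:
        return best_id          # identified product
    return None                 # unresolved -> visual inspection / error path


if __name__ == "__main__":
    s = {"JAN4901234567890": 0.92, "JAN4900000000001": 0.40}
    p = {"JAN4901234567890": 0.81, "JAN4909999999999": 0.77}
    print(verify(s, p))  # -> "JAN4901234567890"
```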
  • the basket and the product association unit 3605 associate the product information of the product identified by the product identification unit 3603 by the shelf camera with the moving object Mo.
  • the basket and the product association unit 3605 specify the ID of the moving object Mo using the position information attached to the marker or the like attached to the moving object Mo.
  • Alternatively, the basket entrance/exit detection unit 3612 using the shelf camera or the ceiling camera, which will be described later, identifies the moving object Mo into which the object taken from the shelf has been put, and associates that moving object Mo with the product information.
  • That is, at least one shelf camera 311 or at least one basket camera 312 detects entry into and exit from the moving object Mo, specifies the moving object Mo, and then associates the specified moving object Mo with the product information.
  • The product list management unit 3607 associated with the basket continues to manage the product list that associates the moving object Mo with the specified products until settlement. That is, the product list management unit 3607 associated with the basket uses the position information management DB 132 and the like to constantly manage the list of products associated with the ID of the moving object Mo. When an object is taken from the shelf, the product list management unit 3607 associated with the basket adds the acquired products to the list by their number. Conversely, when a product is returned to the shelf, the product list management unit 3607 associated with the basket subtracts the returned products by their number. A sketch of this list management follows.
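  • The following is a minimal sketch of the per-basket product list kept until settlement: products are added when taken from a shelf and subtracted when returned. A Counter per moving-object ID stands in here for the position information management DB used in the text; the product IDs are placeholders.

```python
from collections import Counter, defaultdict

product_lists = defaultdict(Counter)   # moving object Mo ID -> {product: qty}


def product_taken(mo_id: int, product_id: str, qty: int = 1) -> None:
    product_lists[mo_id][product_id] += qty


def product_returned(mo_id: int, product_id: str, qty: int = 1) -> None:
    product_lists[mo_id][product_id] -= qty
    if product_lists[mo_id][product_id] <= 0:
        del product_lists[mo_id][product_id]


if __name__ == "__main__":
    product_taken(1, "milk_1L")
    product_taken(1, "milk_1L")
    product_returned(1, "milk_1L")
    print(dict(product_lists[1]))   # {'milk_1L': 1}
```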
  • the object entering / exiting detection unit 3608 by the shelf camera can act as a trigger for activating the object recognition unit 3602 by the shelf camera by detecting that an object has entered the shelf.
  • the object entry / exit detection unit 3608 by the shelf camera detects the entry of the object into the shelf from the change in the image data in the “entry detection area” set in the captured image of each shelf camera 311. Further, the object entrance / exit detection unit of the shelf camera also detects that the object exits the area by tracking the incoming object in the image.
  • the object entrance / exit detection unit 3608 by the shelf camera applies a particle filter to the entrance detection area in the shelf.
  • After an entry is detected, the particles are again spread over the "entry detection area" to prepare for the next entry.
  • the object entering / exiting detection unit 3608 by the shelf camera does not sprinkle particles in the entry detection area that is present in the area where the object already exists.
  • The object entrance/exit detection unit 3608 using the shelf camera determines "object entry" when the ratio of particles having a certain likelihood or higher within a predetermined area is equal to or more than a threshold value, and determines "object exit" when that ratio is less than the threshold value.
  • The object entrance/exit detection unit using the shelf camera detects entry and exit each time an object enters or exits, so that the area of the object estimated at one time is kept as small as possible.
  • The images before and after the change are stored in the storage unit 108 of the server 1 so that they can be used for estimating the area.
  • the object entry / exit detection unit 3608 using the shelf camera may detect object entry / exit from spectrum (wavelength) data obtained from a captured image of the object, in addition to the above example. Further, the object entrance / exit detection unit using the shelf camera may detect the object entrance / exit using a method such as a weight / pressure sensor, an infrared sensor, or an ethylene gas sensor.
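  • The following is a minimal sketch of the particle-based entry decision described above: particles are scattered over the entry detection area, each particle is scored by the frame difference at its pixel, and "entry" is declared when the fraction of high-likelihood particles exceeds a ratio threshold. The likelihood model and the threshold values are illustrative assumptions.

```python
# Minimal particle-based entry detection over an "entry detection area".
import numpy as np

rng = np.random.default_rng(0)


def scatter_particles(region, n=300):
    """region = (x, y, w, h) of the entry detection area; returns (n, 2) pixels."""
    x, y, w, h = region
    xs = rng.integers(x, x + w, size=n)
    ys = rng.integers(y, y + h, size=n)
    return np.stack([xs, ys], axis=1)


def entry_detected(prev_gray, cur_gray, particles,
                   diff_threshold=25, ratio_threshold=0.3):
    diff = np.abs(cur_gray.astype(np.int16) - prev_gray.astype(np.int16))
    likely = diff[particles[:, 1], particles[:, 0]] >= diff_threshold
    return likely.mean() >= ratio_threshold   # True -> "object entry"


if __name__ == "__main__":
    prev = np.zeros((240, 320), np.uint8)
    cur = prev.copy()
    cur[60:140, 100:200] = 200          # something moved into the area
    region = (100, 60, 100, 80)         # entry detection area (x, y, w, h)
    p = scatter_particles(region)
    print(entry_detected(prev, cur, p))  # -> True
```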
  • the product non-identification determination unit 3609 associates that the product could not be identified by the product identification unit 3603 by the shelf camera with the moving object Mo that picked up the object from the shelf.
  • The product identification unit 3603 using the shelf camera identifies the number of feature points at which the image of the object taken from the shelf and the image of the object placed on the shelf are similar. For example, the sizes of the objects captured in both images are identified and compared, and the color differences are identified and the colors compared, to determine whether the two objects are similar. If the number of matching feature points is small, the product is not identified, which makes it possible to prevent an erroneous settlement.
  • the label recognition unit 3610 recognizes the attached label according to the product specified by the product specification unit 3603 by the shelf camera.
  • The label recognition unit 3610 makes full use of image recognition methods to read characters, barcodes, multidimensional codes, and the like written on the label, and complements product identification; a sketch of barcode-based complementation follows.
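  • As one possible realization of the barcode part of this complementation, the following minimal sketch assumes the third-party `pyzbar` library (not part of this disclosure) and a placeholder product table; the lookup would normally go against the product DB 131.

```python
# Minimal sketch: read a barcode from a label crop to settle an ambiguous
# image-based identification. Assumes `pyzbar` (pip install pyzbar, which in
# turn needs the zbar native library) and an image array as read by cv2.imread.
from typing import Optional

from pyzbar.pyzbar import decode

PRODUCT_DB = {"4901234567890": "sample drink 500ml"}   # placeholder entries


def complement_by_label(label_image) -> Optional[str]:
    """label_image: numpy array cropped around the label."""
    for symbol in decode(label_image):
        code = symbol.data.decode("ascii")
        if code in PRODUCT_DB:
            return PRODUCT_DB[code]    # label reading settles the ambiguity
    return None                        # fall back to image-based candidates
```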
  • the discount sticker recognizing unit 3611 recognizes the discount sticker attached according to the product specified by the product specifying unit 3603 by the shelf camera.
  • the discount sticker recognizing unit 3611 makes full use of the image recognition method to identify the discount amount or the discount rate of the discount sticker attached to the product.
  • the discount label recognition unit 3611 is executed during the processing of the product identification unit.
  • The basket entrance/exit detection unit 3612 using the shelf camera or the ceiling camera detects, from a marker or the like attached to the basket, whether or not the moving object Mo is present in a predetermined area captured by at least one of the shelf camera 311 and the ceiling camera 310.
  • The basket entrance/exit detection unit 3612 using the shelf camera or the ceiling camera may further detect the entry of an object into the area of the moving object Mo defined by a marker or the like. At that time, entry into the area of the moving object Mo is detected by comparing the position information of the tracked object with the position information of the entry detection line (the edge of the basket or the like) of the area of the moving object Mo.
  • the information processing system of the third embodiment may further include a basket product recognition unit 370.
  • FIG. 30 is a functional block diagram showing a detailed functional configuration example of the basket product recognition unit 370 provided in the sales floor device 3 of FIG.
  • The basket product recognition unit 370 includes a basket entrance/exit detection unit 372 using the basket camera, an object recognition unit 373 using the basket camera, a product identification unit 374 using the basket camera, a basket product non-identification determination unit 375, a basket product label recognition unit 376, and a basket product discount sticker recognition unit 377.
  • The basket entrance/exit detection unit 372 using the basket camera detects the entrance of an object, which can serve as a trigger for activating the object recognition unit 373 using the basket camera.
  • The basket entrance/exit detection unit 372 using the basket camera detects the entrance of an object into the basket from the change of the image in the "entry detection area" set in the image captured by each basket camera 312.
  • The basket entrance/exit detection unit 372 using the basket camera also detects that an object leaves the frame by tracking, within the frame, the object that has entered the frame of the captured image.
  • The basket entrance/exit detection unit 372 using the basket camera tracks the object by using a particle filter.
  • Since the basket entrance/exit detection unit 372 using the basket camera tracks a plurality of objects at the same time, after the first object enters, particles are again spread over the entry detection area to prepare for the next object to enter.
  • The basket entrance/exit detection unit 372 using the basket camera determines "object entry" when the ratio of particles having a certain likelihood or higher within a predetermined area is equal to or more than a threshold value, and determines "object exit" when the ratio is less than the threshold value.
  • The basket entrance/exit detection unit using the basket camera detects entrance and exit every time an object enters or exits, so that the area of the object estimated at one time is kept as small as possible.
  • the object recognition unit 373 by the basket camera compares an image when an object such as a hand of a shopper enters the baskets (moving object Mo) with an image after exiting from the baskets (moving object Mo).
  • An image area is defined in order to specify an object which is taken from the shelf and put in the basket (moving object Mo) (or returned from the basket (moving object Mo) to the shelf).
  • The object recognition unit 373 using the basket camera confirms the change of the area by using the RGB data.
  • In addition to the above example, the object recognition unit 373 using the basket camera may recognize an object from spectrum (wavelength) data obtained from a captured image of the object, a weight sensor, a pressure sensor, an infrared sensor, an ethylene gas sensor, or the like.
  • the product identification unit 374 by the basket camera identifies which product is the object put in the basket.
  • The product identification unit 374 using the basket camera lists up product candidates for the object recognized by the object recognition unit 373 using the basket camera by image processing methods such as specific object recognition, general object recognition, and deep learning. The listed product candidates are called the "product candidate list S". After that, the product identification unit 374 using the basket camera causes the verification function to be performed and identifies the product with high accuracy.
  • The verification function lists up the "product candidate list P" by an algorithm different from the above-mentioned method of listing up product candidates.
  • The product identification unit 374 using the basket camera matches the product candidate lists S and P, and identifies a product when a predetermined threshold is exceeded.
  • As a method of listing product candidates, for example, a method of matching information of the object captured image obtained from the object whose existence is recognized with the image information held in the product DB 131 or the storage unit 108 may be used. That is, when the feature information of both images matches, that is, exceeds the threshold value, the object whose existence is recognized by the object recognition unit 373 using the basket camera is specified as the product registered in the product DB 131.
  • The product identification unit 374 using the basket camera may specify the product from captured images over a plurality of frames instead of specifying the product within one frame of the captured image. At that time, the product identification unit 374 using the basket camera gives a percentage to the product candidates, adds to the percentage based on information such as purchase history, time, place, and the person's preference, and may specify the product when the percentage exceeds a certain threshold (a sketch of this multi-frame scoring follows). In the present embodiment, there are a specific object inspection mode in which visual inspection at the inspection terminal Q is performed only for specific objects and an all-object inspection mode in which visual inspection at the inspection terminal Q is performed for all objects, and either mode can be set.
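  • The following is a minimal sketch of the multi-frame scoring idea above: per-frame candidate confidences are accumulated, a small context bonus (here, purchase history) is added, and the product is identified only once a candidate's accumulated score crosses a threshold. The weights, threshold, and bonus model are assumptions made for illustration.

```python
from collections import defaultdict

IDENTIFY_THRESHOLD = 2.5
HISTORY_BONUS = 0.2


class MultiFrameIdentifier:
    def __init__(self, purchase_history=frozenset()):
        self.scores = defaultdict(float)
        self.history = set(purchase_history)

    def add_frame(self, frame_candidates: dict):
        """frame_candidates: product_id -> confidence for one captured frame."""
        for product_id, conf in frame_candidates.items():
            bonus = HISTORY_BONUS if product_id in self.history else 0.0
            self.scores[product_id] += conf + bonus
        best = max(self.scores, key=self.scores.get)
        if self.scores[best] >= IDENTIFY_THRESHOLD:
            return best          # identified
        return None              # keep collecting frames / ask for visual inspection


if __name__ == "__main__":
    ident = MultiFrameIdentifier(purchase_history={"yogurt_plain"})
    for frame in ({"yogurt_plain": 0.7, "yogurt_sugar": 0.6},
                  {"yogurt_plain": 0.8, "yogurt_sugar": 0.5},
                  {"yogurt_plain": 0.75}):
        result = ident.add_frame(frame)
    print(result)   # -> "yogurt_plain"
```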
  • In the specific object inspection mode, the product identification unit 374 using the basket camera transmits the object captured image of an object that has not been identified as a product and the related information to the eye inspection terminal Q via the eye inspection result acquisition unit 301a.
  • In the all-object inspection mode, the product identification unit 374 using the basket camera transmits the product identification result and the product captured image of each identified product, as well as the object captured image of any object not identified as a product and the related information, to the eye inspection terminal Q via the eye inspection result acquisition unit 301a. Then, the product identification unit 374 using the basket camera identifies the product based on the eye inspection result acquired by the eye inspection result acquisition unit 301a from the eye inspection terminal Q.
  • When the identification result is approved or corrected by the eye inspection result, the product identification unit 374 using the basket camera identifies the product according to the contents indicated by the eye inspection result. Further, for an object that the product identification unit 374 using the basket camera determined to be product-unidentified, the product identification unit 374 using the basket camera identifies the object as the product indicated by the eye inspection result.
  • When the product identification unit 374 using the basket camera requests product identification by visual inspection, the request may be made after the product identification complementation by the basket product label recognition unit 376 is completed. In this case, products already identified as a result of the complementation by the basket product label recognition unit 376 can be excluded, and only the remaining objects need to be identified by visual inspection.
  • the basket product non-identification determination unit 375 associates the fact that the product placed in the basket cannot be recognized as a product by the product identification unit 374 of the basket camera with the ID of the moving object Mo.
  • The basket product label recognition unit 376 recognizes the attached label according to the product specified in the basket by the product identification unit 374 using the basket camera.
  • The basket product label recognition unit 376 makes full use of image recognition methods to read the characters, barcodes, multidimensional codes, and the like written on the label, and complements product identification.
  • The basket product discount sticker recognition unit 377 makes full use of image recognition methods to specify the discount amount or the discount rate of the discount sticker attached to the product specified by the product identification unit 374 using the basket camera.
  • the discount sticker recognizing unit for the basket product identifies the discount amount or the like during the processing by the product identifying unit using the basket camera.
  • the information processing system also includes a trade-restricted product determination unit 380 and a remote operation unit 390.
  • the trade-restricted product determination unit 380 determines whether the specified product is a trade-restricted product, as described in the first embodiment.
  • When the specified product is determined to be a trade-restricted product, the trade-restricted product determination unit 380 displays that information on the error display unit 151.
  • the trade-restricted product determination unit 380 transmits the product captured image of the product corresponding to the trade-restricted product to the eye inspection terminal Q via the eye inspection result acquisition unit 301a.
  • the trade-restricted product determination unit 380 appropriately sends information about the shopper (face image, personal information, etc.) to the inspection-use terminal Q via the inspection-result acquisition unit 301a according to the type of the sale-restricted product. Send. Then, the trade-restricted product determination unit 380 determines whether or not the sale of the trade-restricted product is permitted based on the inspection result acquired by the inspection result acquisition unit 301a from the inspection terminal Q (the sale-restriction is released). Whether or not) is determined.
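  • The following is a minimal sketch of this restriction check: the identified product is looked up in a restriction table, and restricted items are sent, with the shopper information appropriate to the restriction type, to the eye examiner, who decides whether the sale may proceed. The table contents, restriction types, and request function are placeholders, not values defined in this disclosure.

```python
# Minimal sketch of the trade-restriction check and visual-inspection request.
RESTRICTIONS = {               # product_id -> restriction type (placeholder)
    "beer_350ml": "age",
    "expired_bento": "expiry",
}


def check_restriction(product_id, shopper_info, request_inspection) -> bool:
    """Return True if the product may be sold (restriction released)."""
    restriction = RESTRICTIONS.get(product_id)
    if restriction is None:
        return True                         # not trade-restricted
    # Send the product and, for age restrictions, the shopper information.
    payload = {"product": product_id, "restriction": restriction}
    if restriction == "age":
        payload["shopper"] = shopper_info   # e.g. face image, registered age
    return request_inspection(payload)      # eye examiner's decision


if __name__ == "__main__":
    approve_all = lambda payload: True      # stub examiner
    print(check_restriction("beer_350ml", {"age": 30}, approve_all))  # True
    print(check_restriction("milk_1L", {}, approve_all))              # True
```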
  • the remote operation unit 390 is provided in the information terminal 38 and the server 1, and has a function of canceling the error state when the error state is notified, for example.
  • the eye inspection result acquisition unit 301a requests the identification of the product by the eye inspection by transmitting the object captured image or the product identification result and the product captured image of the identified product to the eye inspection terminal Q.
  • the eye inspection result transmitted from the eye inspection terminal Q is acquired in response to the request.
  • the visual inspection result acquisition unit 301a outputs the acquired visual inspection result (product identification result) to the shelf product recognition unit 360.
  • the eye inspection result acquisition unit 301a requests the determination of the sale restricted product by the eye inspection by transmitting the product imaged image of the product corresponding to the sale restricted product to the eye inspection terminal Q, and the eye inspection terminal Q Get the result of the eye inspection sent from.
  • the visual inspection result acquisition unit 301a appropriately transmits information (face image, personal information, etc.) regarding the shopper to the visual inspection terminal Q.
  • the visual inspection result acquisition unit 301a outputs the acquired visual inspection result (determination result of the trade restricted product) to the trade restricted product determination unit 380.
  • The settlement machine 4 installed in the settlement area 35 calculates the total amount of money for all the commodities put in one or more baskets, and performs checkout and settlement.
  • the settlement machine performs settlement based on the moving object Mo that has been continuously tracked and the product information associated with the moving object Mo. Therefore, as shown in FIG. 27, the settlement machine 4 includes a CPU 401, an input unit 406, an output unit 407, and a communication unit 409. In the CPU 401, the settlement unit 435, the input control unit 436, and the output control unit 437 function.
  • the input unit 406 includes a credit card, an electronic money reading unit, and the like.
  • the output unit 407 has a function of displaying a screen for displaying a payment item and a receipt.
  • the settlement unit 435 determines the settlement amount and the item to be settled.
  • the input control unit 436 inputs a signal from the input unit 406 and operates the CPU 401.
  • the output control unit 437 outputs the calculation result of the settlement unit 435 to the output unit 407.
  • The settlement machine 4 compares the position information of the moving object Mo with the predetermined position of the settlement area 35, and performs settlement with, as a trigger, the fact that the position of the moving object Mo has entered the settlement area 35 or that the moving object Mo has been placed at the settlement machine 4.
  • Alternatively, the settlement machine 4 may be provided with a button for instructing the start of settlement, and the settlement may be triggered by pressing the button or the like.
  • The settlement machine 4 may also be provided with a weight sensor (not shown), and the settlement by the settlement machine 4 may be triggered by recognizing the change in weight when the moving object Mo is placed on it.
  • the settlement machine 4 can be used not only for cash, but also for gift certificates, cash vouchers, virtual currency, and the like.
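  • The following is a minimal sketch of the position-triggered settlement described above: settlement starts when the basket's tracked position falls inside the settlement area, and the total is then computed from the product list associated with the basket's ID. The area rectangle, price table, and data shapes are illustrative assumptions.

```python
# Minimal sketch: trigger settlement on entering the settlement area and
# total the associated product list. PRICE_DB stands in for the product DB.
PRICE_DB = {"milk_1L": 198, "bread": 150, "beer_350ml": 220}   # placeholder, yen
SETTLEMENT_AREA = (520, 0, 120, 200)   # (x, y, w, h) in floor-map coordinates


def in_settlement_area(position, area=SETTLEMENT_AREA) -> bool:
    x, y = position
    ax, ay, aw, ah = area
    return ax <= x < ax + aw and ay <= y < ay + ah


def settle_if_triggered(mo_position, product_list: dict):
    """product_list: {product_id: qty} associated with the moving object Mo."""
    if not in_settlement_area(mo_position):
        return None                          # keep tracking
    total = sum(PRICE_DB[p] * qty for p, qty in product_list.items())
    return total                             # hand off to payment handling


if __name__ == "__main__":
    basket = {"milk_1L": 1, "bread": 2}
    print(settle_if_triggered((300, 100), basket))   # None -> not yet in area
    print(settle_if_triggered((560, 50), basket))    # 198 + 2*150 = 498
```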
  • an error state may occur.
  • Examples of the error state include (A) a system processing abnormality, (B) a product-unidentified object being linked to the moving object Mo, and (C) a trade-restricted product being linked to the moving object Mo (but the error state is, of course, not limited to these).
  • the system adopted by the present embodiment takes various actions according to each error state.
  • When the error state of (A) occurs, the fact that the system processing is abnormal is presented on the settlement machine 4, the information terminal 38a, the information terminal 38b, and the server 1. This allows the store clerk to eliminate the error state.
  • the elimination of the error state may be performed by the remote operation unit 390.
  • When the error state of (B) occurs, the fact that a product-unidentified object is linked to the moving object Mo is presented on the settlement machine 4, the information terminal 38a, the information terminal 38b, and the server 1.
  • a visual inspection is performed at the visual inspection terminal Q, and the settlement is continued or stopped depending on the visual inspection result. As a result, it is possible to increase the number of cases in which settlement can be performed automatically.
  • the clerk may eliminate the error state.
  • the error state can be resolved by the remote control unit 390.
  • When the error state of (C) occurs, the fact that a trade-restricted product is linked to the moving object Mo is presented on the settlement machine 4, the information terminal 38a, the information terminal 38b, and the server 1.
  • When the error state of (C) occurs, the visual inspection at the inspection terminal Q is performed, and the settlement is continued or stopped depending on the visual inspection result. As a result, it is possible to increase the number of cases in which settlement can be performed automatically.
  • When the error state of (C) occurs, for example, in the case of a trade-restricted product due to age restriction, a clerk checks the age of the shopper; in the case of a trade-restricted product whose consumption / expiration date has expired, the store clerk may exchange the product; and in the case of a trade-restricted product other than allergy / halal food, the shopper may confirm the error by himself / herself to eliminate the error state.
  • the error state can be resolved by the remote control unit 390.
  • FIG. 31 is a flowchart for explaining the basic flow of the automatic settlement processing executed by the server 1, the sales floor device 3, and the settlement machine 4 of FIG.
  • In step S301, a shopper (moving object) enters the store from the entrance 31 of the store 30 (FIG. 24), and the ceiling camera 310 installed near the entrance 31 starts imaging the shopper.
  • the ceiling camera 310 at the back starts capturing an image of the shopper.
  • the plurality of ceiling cameras 310 constantly capture images of the entire store 30 including shoppers, baskets, and carts.
  • the personal authentication unit 320 may perform personal authentication of the shopper and acquire personal information of the shopper before step S301.
  • In step S302, the moving object finding unit 3302 using the ceiling camera finds a moving object (not yet numbered) in the captured image by using a state space model (Bayesian filter or the like).
  • In step S303, the basket finding unit 3303 using the ceiling camera discovers baskets (the moving objects Mo in the third embodiment) from among the objects found by the moving object finding unit 3302 using the ceiling camera, and assigns an individual ID to each. The ID is continuously used until a predetermined timing such as leaving the store or completing settlement.
  • In step S304, the basket area definition unit 3306 using the ceiling camera defines the position of the area of the moving object Mo discovered by the basket finding unit 3303 using the ceiling camera. Further, when the moving object Mo moves within the range imaged by the ceiling camera 310, the position of the area of the moving object Mo after the movement is defined again.
  • the position information is managed in the position information management DB 132, a memory or the like in association with the ID of the moving object Mo, and is updated for each area definition. This defined position is also recognized as a position imaged by another ceiling camera 310.
  • In step S305, the basket area tracking unit 3307 using the ceiling camera estimates the position to which the moving object Mo has moved within the captured image captured by a certain ceiling camera 310. Further, the basket area defining unit 3306 using the ceiling camera defines the area of the moving object Mo for the position estimated to have been moved to, and the position information of the moving object Mo stored in the position information management DB 132 or the memory is updated.
  • In step S306, the object entrance/exit detection unit 3608 using the shelf camera detects whether an object such as the hand of the shopper has entered or exited the shelf. This detection triggers the object recognition unit 3602 using the shelf camera. The entry of an object into the shelf is detected from whether or not the image data in the entry detection area set for each shelf camera 311 has changed. The object entrance/exit detection unit 3608 using the shelf camera detects that the object exits the shelf by keeping track of the object that has entered the shelf.
  • In step S307, the object recognition unit 3602 using the shelf camera compares the captured image before the object entered the shelf with the captured image after the object left the shelf, and recognizes the object taken from the shelf.
  • In step S308, the product identification unit 3603 using the shelf camera identifies which product the detected object is.
  • the product specifying unit 3603 using the shelf camera lists up product candidates for the objects detected by the object recognizing unit 3602 by image processing methods such as specific object recognition, general object recognition, and deep learning.
  • the listed product candidates are called “product candidate list S”.
  • the product specification unit 3603 by the shelf camera causes the verification function to be performed and specifies the product with high accuracy.
  • the verification function lists up the "commodity candidate list P" by an algorithm different from the above-mentioned method of listing up product candidates.
  • the shelf camera-based product specification unit 3603 matches the product candidate lists S and P, and specifies a product when a predetermined threshold is exceeded.
  • As a method of listing product candidates, for example, a method of matching information of the object captured image obtained from the object whose existence is recognized with the image information held in the product DB 131 or the storage unit 108 may be used. That is, when the feature information of both images matches, that is, exceeds the threshold value, the object whose existence is recognized by the object recognition unit 3602 using the shelf camera is specified as the product held in the product DB 131.
  • the product non-identification determination unit 3609 transmits error information to the server 1 when the product cannot be identified. This error information is displayed on the error display unit 151 and the information terminal 38.
  • In step S309, the basket and product association unit 3605 associates the product identified by the product identification unit 3603 using the shelf camera with the ID of the moving object Mo. That is, the basket (moving object Mo) held by the shopper who picked up the identified product is identified. Further, when the product non-identification determination unit 3609 determines that the product could not be identified when the object was taken from or returned to the shelf, the product-unidentified information (error information) is associated with the ID of the moving object Mo.
  • step S310 the product list management unit 357 associated with the moving object Mo continues to manage the list of products associated with the ID of the moving object Mo. This management is continued until a predetermined timing such as when the moving object Mo moves to the settlement area 35.
• In step S311, when all the commodities associated with the moving object Mo are totalled in the settlement area 35 and settled or paid for by the settlement machine 37, the product list associated with the moving object Mo is updated to the settled status, and the automatic settlement processing ends. If the moving object Mo is associated with product-unspecified information (error information), the error information is transmitted to the server 1 without ending the automatic settlement process. This error information is displayed on the error display unit 151 and the information terminal 38. That is, the clerk can check and cancel the error by calling the shopper.
  • error information is transmitted to the server 1.
  • This error information is displayed on the error display unit 151, the information terminal 38a, and the information terminal 38b.
• If an alarm device (not shown) is provided near the exit 32, the alarm device issues a warning by sound, light, or the like. Since the automatic settlement processing of the third embodiment specifies the product based on the image captured by the shelf camera, it is not necessary to specify the product for the object placed in the basket. However, in order to verify whether the product has been mistakenly specified, it is preferable to also specify the product for the object placed in the basket.
• FIG. 32 is a flowchart for explaining the process of recognizing a product in the basket, within the automatic settlement process executed by the server, the sales floor device, and the settlement machine of FIG.
• In step S321, the entry/exit detection unit 372 by the basket camera installs particles in the entrance detection region set in advance at a predetermined position in the captured image, so that the likelihood outside the basket region is low.
• In step S322, the entry/exit detection unit 372 by the basket camera detects, based on the likelihood of the particles, that an object (a shopper's hand or the like) has entered the entrance detection area of the basket.
• After the detection, the entry/exit detection unit 372 by the basket camera installs new particles.
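• As a rough illustration of the particle-based entry detection described above, the sketch below scatters particles in a rectangular entry detection region and reports entry when enough particles observe a large change against a reference frame; the likelihood model (per-pixel differences) and all thresholds are assumptions of this sketch, not the actual implementation.

```python
import numpy as np

# Illustrative sketch of particle-based entry detection in a rectangular entry
# detection region. The likelihood model (absolute difference against a
# reference frame at each particle) and all thresholds are assumptions.

RNG = np.random.default_rng(0)

def install_particles(region, n=200):
    """Scatter n particles uniformly inside region = (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = region
    xs = RNG.integers(x0, x1, size=n)
    ys = RNG.integers(y0, y1, size=n)
    return np.stack([ys, xs], axis=1)  # (row, col) pairs

def entry_detected(reference, frame, particles, diff_thresh=30, ratio_thresh=0.3):
    """Return True when enough particles observe a large change versus the
    reference image, i.e. an object (e.g. a hand) has entered the region."""
    diffs = np.abs(frame[particles[:, 0], particles[:, 1]].astype(int)
                   - reference[particles[:, 0], particles[:, 1]].astype(int))
    changed_ratio = float(np.mean(diffs > diff_thresh))
    return changed_ratio > ratio_thresh

if __name__ == "__main__":
    region = (100, 50, 200, 150)           # x0, y0, x1, y1 of the detection area
    reference = np.zeros((480, 640), np.uint8)
    frame = reference.copy()
    frame[60:140, 110:190] = 200           # simulated hand entering the region
    particles = install_particles(region)
    print(entry_detected(reference, frame, particles))  # -> True
```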
  • step S323 the basket camera 312 images the state of the inside of the basket at the time when an object (a shopper's hand or the like) enters the entrance detection area.
• The basket camera 312 also images the area outside the entry detection area. For example, it is assumed that three products are captured in this captured image (previous image). The products captured in the previous image have already been identified as products from the objects.
  • step S324 the basket camera 312 images the state of the inside of the basket at the time when the object exits the entry detection area. For example, it is assumed that, in the image (post-image) captured at this time, one object is captured in addition to the above-described three products.
• In step S325, the object recognition unit 373 by the basket camera compares the previous image and the post-image, and defines the image area of the one added object.
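• A minimal sketch of defining the image area of the one added object by differencing the previous image and the post-image is shown below, assuming OpenCV-style frame differencing; the thresholds and minimum contour area are illustrative assumptions.

```python
import cv2
import numpy as np

# Rough sketch of defining the image area of the one added object by comparing
# the image before entry ("previous image") and after exit ("post-image").
# The difference threshold and the minimum contour area are assumptions.

def added_object_region(prev_img, post_img, diff_thresh=25, min_area=100):
    """Return the bounding box (x, y, w, h) of the largest changed region,
    or None when no sufficiently large change is found."""
    diff = cv2.absdiff(prev_img, post_img)
    if diff.ndim == 3:                       # colour input -> grayscale
        diff = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contours = [c for c in contours if cv2.contourArea(c) >= min_area]
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    return cv2.boundingRect(largest)

if __name__ == "__main__":
    prev_img = np.zeros((240, 320), np.uint8)
    post_img = prev_img.copy()
    post_img[50:120, 80:160] = 180          # simulated newly added product
    print(added_object_region(prev_img, post_img))  # -> (80, 50, 80, 70)
```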
• In step S326, the product identification unit 374 by the basket camera identifies which product the one added object is. This product can be identified by using the same method as the identification method performed by the shelf product recognition unit 360.
• In step S327, the product identification unit 374 by the basket camera determines whether the specific object visual inspection mode or the all-object visual inspection mode is set. If the specific object visual inspection mode is set, the process proceeds to step S328. If the all-object visual inspection mode is set, the process proceeds to step S329.
  • step S328 the visual inspection result acquisition unit 301a requests a visual inspection (specification of a product) at the visual inspection terminal Q for the target object, and acquires the visual inspection result. After step S328, the process proceeds to step S329.
• In step S329, the product identification unit 374 by the basket camera determines whether it is impossible to identify which product the one added object is. If the product identification unit 374 by the basket camera cannot identify the product (YES in step S329), the process proceeds to step S327.
• When the product identification unit 374 by the basket camera can identify the product in step S329 (NO in step S329), the product is specified together with information stored in the storage unit of the server 1, such as the product name, the price, and whether the product is restricted for sale. As a result, the process proceeds to step S330.
  • the identified product information may be output to the output unit 407.
• When the product identification unit 374 by the basket camera cannot identify which product the object is in step S329, the purchaser may be requested, via a screen display on a display unit (not shown) provided with the basket camera, voice guidance from a speaker (not shown), or the like, to put the product into the basket again. Further, the purchaser and the eye examiner may talk with each other via a microphone (not shown) provided with the basket camera.
• In step S330, the trade-restricted product determination unit 380 determines whether or not the product identified by the product identification unit 374 by the basket camera is a product that requires age confirmation.
• If it is determined in step S330 that the product identified by the product identification unit 374 by the basket camera is a product for which age confirmation is required, that is, if YES, the process proceeds to step S331.
• If it is determined in step S330 that the product identified by the product identification unit 374 by the basket camera is not a product that requires age confirmation, that is, if NO, the recognition process of the product in the basket ends. In step S331, the output control unit 437 causes the output unit 407 of the settlement machine 4 to display a screen for age confirmation. However, if the personal information of the shopper has been acquired and it is not necessary to confirm the age here, step S331 is skipped and the process proceeds to step S335.
  • the purchaser and the eye checker may make a call via a speaker (not shown) and a microphone (not shown) provided in the checkout machine 4.
  • the visual inspection result acquisition unit 301a requests a visual inspection (determination of a trade-restricted commercial product) at the visual inspection terminal Q for the target product, and acquires the visual inspection result.
  • the trade restriction product determination unit 380 determines whether or not an instruction to cancel the trade restriction has been received.
  • step S333 If it is determined in step S333 that the instruction to cancel the trading restriction has not been received, the process proceeds to step S334. If it is determined in step S333 that the cancel instruction for canceling the trading restriction has been received, that is, if it is determined to be YES, the process proceeds to step S335.
  • step S334 the visual inspection result transmission unit 412 of the visual inspection terminal Q transmits a warning that the trading restriction has not been released even by the visual inspection result. By receiving this warning, for example, the output control unit 437 of the settlement machine 4 presents, through the output unit 407, a warning that the trading restriction has not been canceled even by the result of the visual inspection. After step S334, the process ends and the settlement is stopped.
• In step S335, the trade-restricted product determination unit 380 cancels the trade restriction. In this way, when step S335 is completed, or when it is determined in step S330 that the product is not an age-restricted product (NO is determined), the recognition process of the product in the basket is completed.
• In this way, the information processing system identifies which product the object put in the basket is. Then, the information processing system can verify whether or not the product identified in step S308 and the product identified in step S326 match.
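• The verification of whether the shelf-camera result (step S308) and the basket-camera result (step S326) match could, for example, be reduced to comparing the two product lists of a moving object Mo, as in the following sketch; the product names and the data layout are placeholders, not part of the described system.

```python
from collections import Counter

# Simple sketch of verifying that the products identified from the shelf camera
# and the products identified from the basket camera agree for one moving
# object Mo. Product names are placeholders.

def verify_product_lists(shelf_products, basket_products):
    """Return (True, {}) when both lists match, otherwise (False, differences)."""
    shelf, basket = Counter(shelf_products), Counter(basket_products)
    missing_in_basket = shelf - basket   # taken from a shelf but not seen in the basket
    extra_in_basket = basket - shelf     # seen in the basket but never identified at a shelf
    ok = not missing_in_basket and not extra_in_basket
    return ok, {"missing_in_basket": dict(missing_in_basket),
                "extra_in_basket": dict(extra_in_basket)}

if __name__ == "__main__":
    print(verify_product_lists(["milk", "bread", "tea"], ["milk", "bread", "tea"]))
    print(verify_product_lists(["milk", "bread"], ["milk", "chocolate"]))
```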
  • the error status is displayed on the error display unit 151, the information terminal 38a, and the information terminal 38b. Then, for the objects and products to be inspected (for example, objects for which the product cannot be specified at the cashier terminal 2 and products corresponding to restricted products), the inspection at the inspection terminal Q is performed. That is, the automatic settlement is performed according to the result of the visual inspection (the result of specifying the product, the result of determining the trade prohibited product, etc.).
• With the present information processing system, when the shopper purchases a product, it is possible to automate settlement of the price of the product and to improve the accuracy of identifying the product. In addition, when an object or a product for which settlement is not appropriate is subjected to settlement, it can be prevented from being settled by mistake. Further, when the fact that a product is unspecified is displayed on the information terminals 38a and 38b, the store clerk may go to the sales floor where the basket is located and verify the product, or a store clerk inside or outside the store may release the error state by remote operation.
• Errors that can be expected in tracking the moving object Mo include the following. (A) When a shopper enters the store 30 through the entrance 31 and picks up a basket or cart, the moving object finding unit 3302 by the ceiling camera cannot detect the moving object Mo.
  • the system adopted in this embodiment takes various actions including the following examples.
  • (A) An image captured by the ceiling camera 310 and including a shopper who holds a basket or a cart as a subject is transmitted to the inspection terminal Q, and a detection of the moving object Mo is requested by the inspection.
• When the eye examiner detects the moving object Mo, a new moving object Mo corresponding to the basket or cart held by the shopper is defined, an ID is issued, and tracking by the basket area tracking unit 3307 by the ceiling camera is started. When the eye examiner cannot detect the moving object Mo, the clerk is notified of that fact.
• The eye examiner attempts to associate the basket or cart with the ID based on the past association information of baskets or carts and IDs, and if the association is successful, re-tracking by the basket area tracking unit 3307 by the ceiling camera is started. If re-tracking cannot be started for some reason, the clerk is informed accordingly.
• (C) An image list of shoppers holding their respective baskets or carts, and a list of IDs that should be associated with the respective baskets or carts, are transmitted to the inspection terminal Q. The eye examiner tries to assign the IDs to the baskets or carts most appropriately based on the past association information of baskets or carts and IDs.
• When the assignment succeeds, re-tracking by the moving object area tracking unit 3305 by the ceiling camera is started. If the IDs cannot be properly allocated to the baskets or carts, the allocation is abandoned and the clerk is notified that the associations may have been exchanged.
  • FIG. 33 is a diagram showing a layout example of a supermarket that adopts the product recognition system according to the fourth embodiment.
  • the information processing system of the fourth embodiment is a product recognition system applied to a store 40 such as a supermarket as shown in FIG.
  • the information processing system according to the fourth embodiment is configured such that the merchandise can be automatically settled only by passing through the settlement gates 5-1 to 5-3 without being placed on the cash register.
• A shopping basket, a shopping cart, and the like (not numbered) are placed.
  • the settlement gates 5-1 to 5-3 and the settlement register 6 operated by the clerk Mt are installed.
  • a plurality of shelf racks 43 for displaying products are installed.
• In the shelf rack 43, a plurality of shelves are arranged in the vertical direction at predetermined intervals, and various products are displayed on each of the plurality of shelves.
  • the space between the shelves is also referred to as “inside the shelves”.
• A passage 44 is defined between the shelf racks 43 facing each other in the horizontal direction.
• The shopper enters the store through the entrance 41, picks up a shopping basket or pushes a shopping cart, or brings his/her own bag, and proceeds through the aisle 44.
  • the shopper takes the product in the shelf, puts it in the basket, and proceeds through the aisle 44.
  • the shopper proceeds to the settlement area 45 and makes the settlement.
  • the clerk Mt is looking around.
  • the store clerk Mt owns the information terminal 9a.
  • the information terminal 9a is a portable information processing terminal such as a smartphone, and has a function of displaying a state of the store, a function of displaying an error state occurring in the store, a remote control function, and the like.
• In FIG. 33, the inside of the diagram drawn in a cloud shape represents not the inside of the store 40 but the outside of the store 40 or the backyard of the store 40.
  • the server 1 (FIG. 7) is installed outside the store 40 or in the backyard of the store. Outside the store or in the backyard, the store clerk Mt can monitor the inside of the store 40 through the screen of a large monitor (not shown) or the screen of the information terminal 9b.
• Hereinafter, the shopping basket, the shopping cart, and the my bag are collectively referred to as baskets.
  • the passage 44 including the settlement gates 5-1 to 5-3 is a settlement area 45.
  • baskets and the like and shoppers are referred to as a moving object Mo.
  • a plurality of ceiling cameras 310 are installed on the ceiling of the passage 44 between the entrance 41 and the exit 42. Further, a plurality of shelf cameras 311 are installed at a plurality of positions on each shelf in each shelf rack 43. Note that in FIG. 33, the inside of the diagram drawn in a cloud shape does not show inside the store 40 but outside the store 40 or the backyard of the store 40.
• The server 1 (FIG. 7) is installed outside the store 40. In the backyard, the store clerk Mt monitors the inside of the store 40 through the screen of a large monitor (not shown) and the screen of the information terminal 9b.
  • FIG. 34 is a configuration diagram showing the configuration of a product recognition system as the fourth embodiment of the information processing system.
  • the product recognition system of the fourth embodiment has a configuration as shown in FIG.
  • the product recognition system includes a server 1, a sales floor device 3, and n (n is an arbitrary integer) settlement gates 5-1 to 5-n.
  • the cashier terminal 2 adopted in the first and second embodiments is not adopted in the fourth embodiment.
  • the server 1, the sales floor device 3, and the settlement gates 5-1 to 5-n are connected to each other via a network N such as an Internet line. Note that, for convenience of description, only one server 1 is depicted in FIG. 34, but in reality, there may be a plurality of servers. Further, hereinafter, when it is not necessary to individually distinguish the settlement gates 5-1 to 5-n, they are collectively referred to as a “settlement gate 5”.
  • the server 1 executes each process in order to manage each operation of the sales floor device 3 and the settlement gate 5.
  • the server 1 has the same hardware configuration as the server 1 of the first embodiment shown in FIG.
• The server 1 includes the CPU 101, the ROM 102, the RAM 103, the bus 104, the input / output interface 105, the output unit 106, the input unit 107, the storage unit 108, the communication unit 109, and the drive 110.
  • the sales floor device 3 is configured similarly to the hardware configuration of the third embodiment shown in FIG. 26, but does not include the illustrated basket camera 312.
  • the sales floor device 3 includes a CPU 301, a ROM 302, a RAM 303, a bus 304, an input / output interface 305, a ceiling camera 310, a shelf camera 311, an information terminal 9, and a communication unit 315. .
• An error state occurs under various circumstances, such as when a moving object Mo associated with an unspecified object or a trade-restricted product attempts to pass through the settlement gate 5, or when a shopper (moving object Mo) who has not made the settlement attempts to leave the store.
  • a settlement gate 5 is connected to such a sales floor device 3 via a network N.
  • the settlement gate 5 is divided into a settlement gate 5-1 that uses the settlement machine 5a, a settlement gate 5-2 that uses electronic money, and a settlement gate 5-3 that can be settled only by passing through.
  • the settlement gates 5-1 to 5-3 other than the manned settlement register 6 may be provided with an opening / closing member (not numbered) that is always closed.
  • the settlement gate 5-1 using the settlement machine 5a is provided with a settlement button (not shown) to calculate the total amount of the purchased commodities, and the settlement machine 5a installed on the exit 42 side performs the settlement.
  • the settlement machine 5a is installed closer to the exit 42 than the settlement gate 5-1.
  • the settlement machine 5a is equipped with a settlement means capable of making payment by cash, credit card, electronic money, point payment, gift certificate / virtual currency, prepaid card, or the like.
• Pushing down the settlement button triggers reading out of the product information associated with the moving object area, which will be described later.
  • the settlement amount is fixed, and it becomes possible to pass through the settlement gate 5-1.
  • the opening / closing member opens.
• The shopper can then leave the store.
  • the settlement gate 5-2 which uses electronic money or the like, performs settlement by holding electronic money or the like over the gate body. That is, a settlement gate using electronic money or the like includes a card reading unit (not shown), does not include a settlement button like the settlement gate 5-1 using the settlement machine 5a, and does not use the settlement machine 5a.
  • the electronic money and the like include not only IC cards that can be settled but also credit cards, so-called point cards, prepaid cards and other narrowly defined cards, as well as portable information terminals, but these will be described below as various cards.
  • the shopper whose personal information has been acquired at the entrance 41, the passage 44, etc. only needs to pass through the checkout gate 5-3 to complete the checkout and settlement. That is, settlement and payment are completed without paying cash or reading a card.
  • FIG. 35 is a block diagram showing the hardware configuration of the settlement gate 5 in the product recognition system of FIG. 34.
• The settlement gate 5 includes a CPU 501, a ROM 502, a RAM 503, a bus 504, an input / output interface 505, an input unit 506, an output unit 507, a storage unit 508, and a communication unit 509.
  • the CPU 501, the ROM 502, the RAM 503, the bus 504, the input / output interface 505, the storage unit 508, and the communication unit 509 of the settlement gate 5 are configured similarly to those of the server 1.
• The input unit 506 is a settlement button provided in the main body of the settlement gate 5-1 that uses the settlement machine 5a, and, in the settlement gate 5-2 that uses electronic money and the settlement gate 5-3 that can be settled only by passing through, an information reading unit for detecting information on various cards.
  • the output unit 507 outputs a signal for opening and closing the opening / closing member (without numbering) provided in the settlement gates 5-1 to 5-3. Further, the settlement gate 5-1 using the settlement machine 5a outputs the amount to be settled, the product name, etc. to the settlement machine 5a.
  • FIG. 36 is a functional block diagram showing an example of the functional configuration of the server 1 of FIG. 7, the sales floor device 3 of FIG. 26, and the settlement gate 5 of FIG. 35 described above.
  • the server 1 includes a CPU 101 similar to that of the third embodiment, a storage unit 108, an error display unit 151, and an error canceling unit 152.
  • the server 1 is configured similarly to the server 1 of the third embodiment.
  • FIG. 37 is a functional block diagram showing a detailed functional configuration example of the moving object tracking unit 330 provided in the sales floor device 3 in the third embodiment.
• In the CPU 301 of the sales floor device 3, as shown in FIG. 37, a personal authentication unit 320, a moving object tracking unit 330, a position information management unit 340, a shelf product recognition unit 360, a basket product recognition unit 370, and a trade-restricted product determination unit 380 function.
  • the personal authentication unit 320, the moving object tracking unit 330, the position information management unit 340, the basket product recognition unit 370, and the trade restricted product determination unit 380 are configured similarly to the third embodiment.
  • the personal authentication unit 320 includes the same personal information acquisition unit 321 as in the third embodiment.
  • the personal authentication unit 320 acquires personal information from the personal information acquisition unit 321 or the DB management unit 141 of the server 1.
  • the moving object tracking unit 330 includes a moving object finding unit 3302 using a ceiling camera, a moving object region defining unit 3304 using a ceiling camera, a moving object region tracking unit 3305 using a ceiling camera, and a grouping unit 3308.
  • the moving object finding unit 3302 by the ceiling camera finds the moving object Mo (shopper, basket, cart, etc.) using a state space model (Bayesian filter, etc.) based on the captured image taken by the ceiling camera 310.
  • the moving object area definition unit 3304 using the ceiling camera defines the area of the moving object Mo found by the moving object finding unit 3302 using the ceiling camera as the moving object area.
  • the moving object area definition unit 3304 using the ceiling camera defines a moving object area by continuously finding changed areas around the moving object Mo. That is, the moving object area defining unit 3304 using the ceiling camera defines the moving object Mo that has been found, the moving object Mo, and a certain range of the periphery of the moving object Mo as a moving object area.
  • the moving object Mo is clarified by comparing it with the moving object region in the actually obtained image.
• An area centered on a person is referred to as a “person area”.
  • the person area is a subordinate concept of the moving object area.
  • the basket region is a subordinate concept of the moving object region.
  • the cart area is a subordinate concept of the moving object area.
  • the moving object area tracking unit 3305 using the ceiling camera tracks the movement of the moving object Mo.
  • the moving object region tracking unit 3305 using the ceiling camera also tracks the movement of the moving object Mo by collecting the characteristic data (color, shape, etc.) of the moving object Mo.
  • the moving object area tracking unit 3305 using the ceiling camera tracks the movement of the moving object Mo by using an object tracking technique in an image such as a Bayesian filter, a fast Fourier transform, and TLD (Tracking-Learning-Detection).
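• As one concrete example of the Bayesian-filter style tracking mentioned above, the following sketch uses a simple constant-velocity Kalman filter that predicts where the moving object Mo will be in the next frame and corrects the prediction with the centre of the newly defined region; the state layout and all noise parameters are illustrative assumptions, not values from the described system.

```python
import numpy as np

# Minimal constant-velocity Kalman filter sketch, as one member of the
# Bayesian-filter family. State = [x, y, vx, vy]; measurements are the (x, y)
# centre of the detected moving object region. Noise values are assumptions.

class SimpleKalmanTracker:
    def __init__(self, x, y, dt=1.0):
        self.x = np.array([x, y, 0.0, 0.0])                     # state estimate
        self.P = np.eye(4) * 10.0                               # state covariance
        self.F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
                           [0, 0, 1, 0], [0, 0, 0, 1]], float)  # motion model
        self.H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)  # observation model
        self.Q = np.eye(4) * 0.1                                # process noise
        self.R = np.eye(2) * 2.0                                # measurement noise

    def predict(self):
        """Estimate where the moving object Mo will be in the next frame."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, zx, zy):
        """Correct the estimate with the centre of the newly defined region."""
        z = np.array([zx, zy])
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]

if __name__ == "__main__":
    tracker = SimpleKalmanTracker(100, 200)
    for frame, (mx, my) in enumerate([(103, 202), (107, 205), (111, 207)]):
        predicted = tracker.predict()
        corrected = tracker.update(mx, my)
        print(frame, predicted.round(1), corrected.round(1))
```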
  • the grouping unit 3308 groups the plurality of people.
  • the grouping unit 3308 may group a plurality of people by using information such as a sense of distance (overlapping, sticking, etc.) between the moving objects Mo, a moving direction (vector), and the like.
  • the grouping unit 3308 may associate the human area with the basket area / cart area. With the function of the grouping unit 3308 in this manner, one person can make a settlement at the settlement gate 5.
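• A simplified sketch of such grouping, using only the distance between moving objects Mo and the similarity of their movement directions (vectors), is shown below; the distance and direction thresholds are assumptions for illustration only.

```python
import numpy as np

# Hedged sketch of grouping moving objects Mo by their mutual distance and the
# similarity of their movement direction. Thresholds are illustrative.

def group_moving_objects(positions, velocities, dist_thresh=80.0, cos_thresh=0.8):
    """positions/velocities: {mo_id: np.array([x, y])}. Returns a list of groups
    (sets of mo_ids) built by merging pairs that stay close and move together."""
    ids = list(positions)
    groups = {mo_id: {mo_id} for mo_id in ids}          # start with singletons
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            close = np.linalg.norm(positions[a] - positions[b]) < dist_thresh
            va, vb = velocities[a], velocities[b]
            denom = np.linalg.norm(va) * np.linalg.norm(vb)
            same_direction = denom > 0 and float(va @ vb) / denom > cos_thresh
            if close and same_direction:
                merged = groups[a] | groups[b]           # union the two groups
                for member in merged:
                    groups[member] = merged
    unique = []                                          # deduplicate shared sets
    for g in groups.values():
        if g not in unique:
            unique.append(g)
    return unique

if __name__ == "__main__":
    pos = {"parent": np.array([100.0, 100.0]), "child": np.array([130.0, 110.0]),
           "other": np.array([400.0, 300.0])}
    vel = {"parent": np.array([1.0, 0.2]), "child": np.array([1.1, 0.1]),
           "other": np.array([-1.0, 0.0])}
    print(group_moving_objects(pos, vel))  # parent and child grouped, other alone
```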
  • the blood relationship determination unit 3309 uses a face authentication method to distinguish the parent-child relationship, sibling relationship, and the like.
  • the blood relationship determination unit assists the function of the grouping unit.
  • the blood relationship determination unit may determine the degree of face similarity using a deep learning face recognition method and estimate the blood relationship.
• The moving object area transfer recognizing unit 3311 by the ceiling camera recognizes that an object has been transferred from one moving object Mo to another moving object Mo. That is, the moving object Mo that passed the object and the moving object Mo that received it are identified, and the product list associated with each moving object Mo is read.
• The moving object area transfer recognizing unit 3311 by the ceiling camera recognizes that a product has been transferred from one shopper (moving object Mo) to another shopper (moving object Mo).
• The transferring moving object Mo and the receiving moving object Mo are identified, and the product list associated with each moving object Mo is read.
• The transferred object recognition unit 3312 by the ceiling camera then defines the area of the object from the captured image at the time when the transfer is recognized. Further, from the image after the object area is defined, the transferred product identification unit by the ceiling camera determines which of the products in the product list associated with the transferring moving object Mo the transferred object is.
• Each moving object Mo identified by the moving object area transfer recognizing unit 3311 by the ceiling camera is associated with the identified product, and the product list of each moving object Mo is updated.
• The moving object area transfer recognizing unit 3311 by the ceiling camera may analyze the movement of the shopper (moving object Mo) by using an object recognition method such as deep learning to recognize the transfer. The recognition of the transfer may recognize the object in the hand, or may recognize the overlap between person areas (which may include the hands).
  • the moving object area transfer recognizing unit 3311 by the ceiling camera may be realized by a shelf camera or the like capable of capturing a wide range instead of the ceiling camera.
  • the object recognition unit 3312 delivered by the ceiling camera defines the area of the object from the captured image at the time when the delivery is recognized.
• The transferred product identification unit 3313 by the ceiling camera identifies which product in the product list associated with the read moving object Mo (the person who handed over the object) the object recognized by the transferred object recognition unit 3312 by the ceiling camera is, links each moving object Mo identified by the moving object area transfer recognizing unit 3311 by the ceiling camera with the identified product, and updates the product list of each moving object Mo.
  • the object recognition unit 3312 delivered by the ceiling camera may make the ceiling camera a zoomable camera, and may zoom up a portion where the delivery is estimated to be performed to define the area of the object.
  • the moving object area transfer recognizing unit 3311 by the ceiling camera, the object recognizing unit 3312 transferred by the ceiling camera, and the product specifying unit 3313 transferred by the ceiling camera are shelf cameras capable of capturing a wide range instead of the ceiling camera. And the like.
  • the position information management unit 340 includes an inter-camera information transfer unit 341, a position definition unit 342 of each camera, and a moving object display unit 343, as shown in FIG.
  • FIG. 38 is a functional block diagram showing a detailed functional configuration example of the shelf product recognition unit 360 provided in the sales floor device 3 according to the fourth embodiment.
• As shown in FIG. 38, the shelf product recognition unit 360 includes an object recognition unit 3602 by the shelf camera, a product identification unit 3603 by the shelf camera, a moving object and product association unit 3604, a product list management unit 3606 associated with the moving object, an object entry/exit detection unit 3608 by the shelf camera, a product non-identification determination unit 3609, a label recognition unit 3610, and a discount sticker recognition unit 3611.
  • the shelf camera product identification unit 3603 identifies which product is the object in the shelf recognized by the shelf camera object recognition unit 3602.
  • the shelf camera product identification unit 3603 lists up product candidates by image processing methods such as specific object recognition, general object recognition, and deep learning.
  • the listed product candidates are called “product candidate list S”.
  • the product specification unit 3603 using the shelf camera specifies the product with high accuracy by performing the verification function.
  • the verification function lists up the "commodity candidate list P" by an algorithm different from the method of listing up the product candidates described above.
  • the results of the product candidate lists S and P are matched with each other, and a product is specified when the result exceeds a predetermined threshold.
• The method of listing the product candidate list may be realized by, for example, a method of matching image information of an object obtained from an object whose existence is recognized with image information held in the product DB 131 or the memory. That is, when the feature information of both images matches (exceeds a threshold value), the object whose existence is recognized by the object recognition unit 3602 by the shelf camera is a product registered in the product DB 131, and thus the product identification unit 3603 by the shelf camera specifies the object as the product registered in the product DB 131.
  • a product candidate is created by deep learning, and then the verification function is exerted to specify the product with high accuracy.
• The product identification unit 3603 by the shelf camera does not have to specify the product from one frame of the captured image captured by the shelf camera 311, and may identify the product over a plurality of captured images, also using the captured image captured by the ceiling camera 310. At that time, the product identification unit 3603 by the shelf camera gives a percentage to each product candidate, adds to the percentage based on information such as purchase history, time, place, and the person's preference, and specifies the product when the percentage exceeds a certain threshold.
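• The following sketch illustrates, under assumed values, how percentages accumulated over several captured images and context-based additions (purchase history, time, place, preference) could be combined and compared with a threshold; none of the numbers or names are taken from the actual system.

```python
from collections import defaultdict

# Sketch of accumulating product-candidate percentages over several captured
# images and adding context-based bonuses before applying a threshold.
# Weights, scores, and the 0.75 threshold are assumptions for illustration.

def specify_over_frames(frame_candidates, context_bonus, threshold=0.75):
    """frame_candidates: list (one entry per captured image) of
    {product_id: percentage}. context_bonus: {product_id: bonus}. Returns the
    product whose averaged-plus-bonus score clears the threshold, else None."""
    totals, counts = defaultdict(float), defaultdict(int)
    for candidates in frame_candidates:
        for product_id, pct in candidates.items():
            totals[product_id] += pct
            counts[product_id] += 1
    best_id, best_score = None, 0.0
    for product_id, total in totals.items():
        score = total / counts[product_id] + context_bonus.get(product_id, 0.0)
        if score > best_score:
            best_id, best_score = product_id, score
    return best_id if best_score >= threshold else None

if __name__ == "__main__":
    frames = [{"green_tea_500ml": 0.55, "oolong_tea_500ml": 0.50},
              {"green_tea_500ml": 0.62, "oolong_tea_500ml": 0.48},
              {"green_tea_500ml": 0.58}]
    bonus = {"green_tea_500ml": 0.20}   # e.g. frequently bought by this shopper
    print(specify_over_frames(frames, bonus))  # -> "green_tea_500ml"
```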
• A specific object visual inspection mode in which the inspection terminal Q performs inspection only for specific objects and an all-object visual inspection mode in which inspection is performed by the inspection terminal Q for all objects can be set.
• In the specific object visual inspection mode, the product identification unit 3603 by the shelf camera transmits the captured image of an object that is not identified as a product, and related information, to the visual inspection terminal Q via the eye inspection result acquisition unit 301a.
• In the all-object visual inspection mode, the product identification unit 3603 by the shelf camera transmits the product identification results and the captured images of the identified products, as well as the captured images of objects that are not identified as products and related information, to the inspection terminal Q via the eye inspection result acquisition unit 301a.
  • the product identification unit 3603 using the shelf camera identifies the product based on the eye inspection result acquired by the eye inspection result acquisition unit 301a from the eye inspection terminal Q. Specifically, for the product identified by the product identification unit 3603 by the shelf camera, when the identification result is approved or corrected by the eye inspection result, the product identification unit 3603 by the shelf camera follows the contents indicated by the inspection result. , Specify the product. Further, for an object for which the product specification unit 3603 by the shelf camera determines that the product is unspecified, the product specification unit 3603 by the shelf camera specifies the object as the product indicated by the visual inspection result.
  • the request may be made after the product identification complement by the basket product label recognition unit 376 is completed.
• In this way, products that have already been identified can be excluded, and the remaining objects can be identified by visual inspection.
  • the other functional units in the shelf product recognition unit 360 are the same as those in the second embodiment.
• In the CPU 501 of the settlement gate 5, as shown in FIG. 35, a settlement unit 535, an input control unit 536, and an output control unit 537 function.
• When a continuously tracked moving object region enters the settlement area 45, the settlement unit 535 receives the product information (list of products) associated with one or more moving object regions from the server 1 via the communication unit 509, and settles the payment amount and the items to be settled.
  • the input control unit 536 inputs a signal from the input unit 506 such as a payment button and an information reading unit provided in the gate body.
• The output control unit 537 displays the settlement amount on the output unit 507, outputs information to the settlement machine 5a, and opens and closes the opening / closing member.
• When a continuously tracked moving object Mo, that is, a person who makes a payment, enters the settlement area 45, the settlement unit 535 settles the payment amount and the items to be settled based on the product information (list of products) associated with one or more moving objects Mo. Further, for example, when the father makes the payment with his wallet, the product information (lists of products) associated with the accompanying mother and child may be grouped so that the father can pay for all of them.
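• A minimal sketch of this settlement step, in which the product lists of the grouped moving objects Mo are merged and totalled so that one person can pay for the group, is shown below; the product names, prices, and data layout are placeholders introduced for illustration only.

```python
# Illustrative sketch of the settlement step at the gate: when the tracked
# moving objects Mo of one group enter the settlement area, their product lists
# are merged and the payment amount is totalled. All names and prices are
# placeholders, not data from the described system.

def settle_group(group_ids, product_lists, price_table):
    """group_ids: IDs grouped by the grouping unit (e.g. father, mother, child).
    product_lists: {mo_id: [product_id, ...]} managed per moving object Mo.
    Returns (items, total) so the person who pays settles for the whole group."""
    items = []
    for mo_id in group_ids:
        items.extend(product_lists.get(mo_id, []))
    total = sum(price_table[p] for p in items)
    return items, total

if __name__ == "__main__":
    product_lists = {"father": ["beer_350ml"], "mother": ["bread"], "child": ["candy"]}
    price_table = {"beer_350ml": 220, "bread": 160, "candy": 100}
    items, total = settle_group(["father", "mother", "child"], product_lists, price_table)
    print(items, total)   # -> ['beer_350ml', 'bread', 'candy'] 480
```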
  • the settlement gate 5-1 using the settlement machine 5a is a device for calculating and summing the total amount of goods and has a settlement button (not shown).
  • the settlement machine 5a is installed closer to the exit 42 than the settlement gate 5.
  • the settlement machine 5a is equipped with a settlement means capable of paying with cash, point payment, gift certificate, cash voucher, or the like.
• When the settlement button is pressed, the merchandise information associated with the moving object Mo is read out, the settlement amount is confirmed, and the settlement gate 5-1 can be passed. Then, when the person who makes the payment completes the payment at the settlement machine 5a, he or she can leave the store.
  • the settlement gate 5-1 maintains the passage-impossible state by keeping the opening / closing member closed.
  • the settlement gate 5-1 may present the fact that there is an error state by means of sound or light.
• When the moving object Mo is associated with an object (unspecified object) that could not be identified as any product, the flow is the same as that for the age-restricted product.
  • the unspecified object may be dealt with by the clerk Mt at the manned cash register 6 to specify the product.
  • the settlement gate 5-2 using electronic money is provided with a card reading unit (not shown), is not provided with a settlement button like the settlement gate 5-1, and does not use the settlement machine 5a.
  • the card reading unit is adapted to handle any of credit cards, debit cards, electronic money, prepaid cards, and the like.
• The product information associated with the moving object Mo is read, and the settlement and payment are completed, making it possible to pass through the settlement gate 5-2.
  • the settlement object product is a trade-restricted product, or when an unspecified object is tied to the moving object Mo, the same operation as that of the settlement gate 5-1 is performed.
  • the moving object Mo that has been continuously tracked needs to be personally authenticated and the payment information needs to be specified.
• At the settlement gate 5-3, settlement is performed automatically, and the shopper can pass through the settlement gate 5-3 unless there is an error.
  • the settlement object product is a trade-restricted product, or when an unspecified object is tied to the moving object Mo, the same operation as that of the settlement gate 5-1 is performed.
  • the product recognition system according to the fourth embodiment may include a remote control unit (not shown) in addition to the functions described above.
• The trade-restricted product determination unit 380 determines whether or not the identified product corresponds to a trade-restricted product. That is, the trade-restricted product determination unit 380 determines from the DB information that the identified product is a trade-restricted product subject to age restriction, such as alcoholic beverages and tobacco. Further, the trade-restricted product determination unit 380 uses image recognition or the like to determine that a product is a trade-restricted product because its best-before or use-by date has expired. In addition, the trade-restricted product determination unit 380 associates the product with the personal information obtained by the personal authentication, and determines that the product is a trade-restricted product for the shopper, such as a product containing ingredients to which the shopper is allergic or a product other than halal food.
• When the trade-restricted product determination unit 380 determines that a product is a trade-restricted product, it indicates that the product is a trade-restricted product. Further, the trade-restricted product determination unit 380 may restrict trading based on the personal information obtained from the personal authentication. Further, the trade-restricted product determination unit 380 may estimate the age and gender from face and hand recognition, and restrict the trade.
• When the remote control unit receives a notification of an error state such as a system processing abnormality, an unspecified product, or a trading restriction, the remote control unit cancels the error state by remote control.
• The ceiling camera 310, the shelf camera 311, and the settlement gate 5 that have detected the error state notify, via the network N, the information terminals 9a and 9b for remote operation inside or outside the store and the server 1 of information indicating that there is an error state, together with a captured image of the state.
  • the error condition can be eliminated.
  • the error state may be released at a manned cash register in the store, or the store clerk Mt or the like may go to a place where the error state has occurred and release the error state.
• FIGS. 39 and 40 are flowcharts illustrating the automatic settlement process executed by the server 1, the sales floor device 3, and the settlement gate 5 in FIG. 36.
  • step S401 the shopper (moving object) enters the shop through the entrance 41 of the store 40 (FIG. 33), and the ceiling camera 310 installed near the entrance 41 starts imaging the shopper.
  • the ceiling camera 310 at the back starts capturing an image of the shopper.
• The plurality of ceiling cameras 310 constantly capture images of the entire store 40 including shoppers, baskets, and carts.
  • the personal authentication unit 320 may perform personal authentication of the shopper and acquire the personal information of the shopper before step S401.
  • step S402 the moving object finding unit 3302 using the ceiling camera finds the moving object Mo and assigns an individual ID.
  • the ID is continuously used until a specific timing such as leaving the store or completing settlement.
  • the personal authentication unit 320 may perform personal authentication of the shopper at this timing to acquire the personal information of the shopper.
• The grouping unit 3308 may group a plurality of moving objects Mo as one group, and the blood relationship determination unit 3309 may determine the blood relationship of a plurality of shoppers (moving objects Mo) to complement the grouping unit 3308.
  • the moving object area defining unit 3304 with the ceiling camera defines a predetermined area including the moving object Mo found by the moving object finding unit 3302 with the ceiling camera. Further, when the moving object Mo moves within the range imaged by the ceiling camera 310, the position of the area of the moving object Mo after the movement is defined again.
  • the position information is managed in the position information management DB 132, a memory or the like in association with the ID of the moving object Mo, and is updated for each area definition. This defined position is also recognized as a position imaged by another ceiling camera 310.
• In step S404, the moving object area tracking unit 3305 by the ceiling camera estimates the position to which the moving object Mo moves within the captured image captured by the ceiling camera 310. Further, the moving object region definition unit 3304 defines the region of the moving object Mo at the position estimated to have been moved to, and updates the position information of the moving object Mo stored in the position information management DB 132 or the memory.
  • step S405 the object entry / exit detection unit 3608 by the shelf camera detects entry / exit of an object into / from the shelf, as in the third embodiment.
• In step S406, similarly to the third embodiment, detection by the object entry/exit detection unit 3608 by the shelf camera is used as a trigger, and the object recognition unit 3602 by the shelf camera compares the images before and after the object is taken or placed, and defines the image area that is the target of product identification. Further, in step S406, the object acquisition recognition unit using the ceiling camera may recognize that the moving object Mo has acquired an object from the inside of the shelf or the like.
  • step S407 the product identification unit 3603 by the shelf camera identifies which product the object is.
• When the identified product is a trade-restricted product, the trade-restricted product is linked to the moving object Mo. When the product cannot be identified, the product-unspecified object is linked to the moving object Mo.
• The information terminal 9a, the information terminal 9b, and the server 1 may be informed that the error state is due to an unspecified product, and the clerk Mt who recognizes the error state may cancel the error state by remote control (of course, the clerk Mt may go directly to the spot to release the error state).
  • the label recognition unit may recognize the label associated with the specified product.
  • the discount label recognition unit may recognize the attached discount label according to the identified product.
  • the product specifying unit using the ceiling camera may specify which product is the object region acquired by the object acquisition recognizing unit using the ceiling camera.
  • step S408 the shelf-camera product specifying unit 3603 determines whether the specific object visual inspection mode or the all object visual inspection mode is set. If the specific object visual inspection mode is set, the process proceeds to step S409. If the all-object visual inspection mode is set, the process proceeds to step S410.
  • step S410 the visual inspection result acquisition unit 301a requests a visual inspection (specification of a product) at the visual inspection terminal Q for the target object, and acquires the visual inspection result.
• In step S411, the product identification unit 3603 by the shelf camera determines whether it is impossible to identify which product the one added object is. If the product identification unit 3603 by the shelf camera cannot identify the product (YES in step S410), the process proceeds to step S409.
• If the product identification unit 3603 by the shelf camera can identify the product in step S410 (NO in step S410), the product is specified together with information stored in the storage unit of the server 1, such as the product name, the price, and whether the product is restricted for sale. As a result, the process proceeds to step S411.
  • the identified product information may be output to the output unit 507.
• When the product identification unit 3603 by the shelf camera cannot identify which product the object is in step S410, the purchaser may be requested, via a screen display on a display unit (not shown) provided in the sales floor device 3, voice guidance from a speaker (not shown), or the like, to take the product out of the shelf again. Further, the purchaser and the eye examiner may talk with each other via a microphone (not shown) provided in the sales floor device 3.
  • step S411 the trade-restricted product determination unit 380 determines whether or not the product identified by the product identification unit 3603 using the shelf camera is a product for which age confirmation is required.
• If it is determined in step S411 that the product identified by the product identification unit 3603 by the shelf camera is a product that requires age confirmation, that is, if YES, the process proceeds to step S412. If it is determined in step S411 that the product identified by the product identification unit 3603 by the shelf camera is not a product for which age confirmation is required, that is, if NO, the process proceeds to step S417. In step S412, the output control unit 537 causes the output unit 507 of the settlement gate 5 to display a screen for age confirmation. However, if the personal information of the shopper has been acquired and it is not necessary to confirm the age here, step S412 is skipped and the process proceeds to step S416.
  • the purchaser and the eye checker may make a call via a speaker (not shown) and a microphone (not shown) provided in the checkout gate 5.
  • the visual inspection result acquisition unit 301a requests a visual inspection (determination of a trade-restricted commercial product) at the visual inspection terminal Q for the target product, and acquires the visual inspection result.
  • the trading restriction product determination unit 380 determines whether or not an instruction to cancel the trading restriction has been received.
• The sale restriction of the trade-restricted product is released either by the store clerk on the spot in response to the result of the determination, or by adopting the result of the visual inspection in the system.
• Here, the case where the result of the visual inspection is adopted by the system will be described.
  • step S414 If it is determined in step S414 that the instruction to cancel the trading restriction has not been received, the process proceeds to step S415. If it is determined in step S414 that the cancel instruction for canceling the trading restriction has been accepted, that is, if YES, the process proceeds to step S416.
• In step S415, the visual inspection result transmission unit 412 of the visual inspection terminal Q transmits a warning that the trading restriction has not been released even by the visual inspection result. Upon receiving this warning, for example, the output control unit 537 of the settlement gate 5 presents, through the output unit 507, a warning that the trading restriction has not been canceled even by the result of the visual inspection. After step S415, the process ends and the settlement is stopped.
  • step S416 the trade restriction product determination unit 380 cancels the trade restriction. In this way, when step S416 is completed, or when it is determined in step S411 that the product is not age-restricted (NO is determined), the process proceeds to step S417.
  • step S417 the moving object and the product association unit 3604 associate the moving object Mo with the specified product.
• In step S418, the product list management unit 3606 associated with the moving object continues to manage the product list associated with the person until settlement.
• In step S419, the settlement gate 5 performs settlement or payment based on the product information associated with the moving object Mo.
• The error display unit 151, the information terminal 9a, the information terminal 9b, or the output unit 507 of the settlement gate 5 may notify the clerk or the like of any error state that occurs during the process up to step S419 or at the time of step S419.
  • step S410 when the trade-restricted product is associated with the moving object Mo, the error state may be displayed by keeping the gate closed without opening it.
  • the system adopted in this embodiment takes various actions including the following examples.
• (A) The image captured by the ceiling camera 310 including the shopper as a subject is transmitted to the inspection terminal Q, and detection of the shopper by visual inspection is requested.
• When the eye examiner detects the shopper, a new moving object Mo is defined, an ID is issued, and tracking by the moving object area tracking unit 3305 by the ceiling camera is started. If the eye examiner cannot detect the shopper, that fact is notified to the clerk.
• (B) When a shopper who is not associated with any ID registered in the position information management DB 132 is recognized, the captured image of the shopper and a list of ID candidates to be associated are transmitted to the inspection terminal Q.
• The eye examiner associates the most appropriate ID with the shopper based on the past association information of shoppers and IDs, and re-tracking by the moving object area tracking unit 3305 by the ceiling camera is started. If re-tracking cannot be started for some reason, the clerk is informed accordingly. In addition, when an ID that has lost its connection with a shopper is detected, the ID and a list of captured images of shoppers that are not associated with any ID are transmitted to the inspection terminal Q. The eye examiner attempts to associate the shopper with the ID based on the past association information of shoppers and IDs, and when the association is successful, re-tracking by the moving object area tracking unit 3305 by the ceiling camera is started.
• (C) The image list of each shopper and the list of IDs that should be associated with each shopper are transmitted to the inspection terminal Q.
  • the eye examiner most appropriately attempts to assign the ID to the shopper based on the past information of the shopper and the ID.
  • the re-tracking by the moving object area tracking unit 3305 by the ceiling camera is started. If the IDs cannot be properly allocated to the shoppers, the allocation is abandoned and the clerk is informed that the ties have been exchanged.
• As described above, the information processing system includes product identification means for acquiring a result of attempting to identify an object as a product by a method in which an eye examiner visually confirms an image including the object as a subject, and settlement means for performing settlement processing for the product identified based on the result of the product identification means.
  • the information processing system may include a shelf inventory management function.
  • the shelf inventory management function includes a salesclerk discrimination means and an inventory information update means.
  • the clerk discrimination means discriminates between the clerk and the shopper.
  • the salesclerk discrimination unit attaches a physical marker that can be used to identify the salesclerk to items such as hats and clothes worn by the salesclerk.
  • the clerk is identified by the ceiling camera and the shelf camera capturing images of this physical marker. This clerk discrimination means can be effectively used especially in the third and fourth embodiments.
• The salesclerk discrimination means may include a button provided in a predetermined area such as the backyard of the store. With this salesclerk discrimination means, when the button is pushed by a clerk present in the predetermined area, the clerk is assigned an ID. This clerk discrimination means can be used effectively especially in the fourth embodiment.
  • the ceiling camera continues to track the area of the person recognized as the salesclerk.
• The inventory information updating means makes full use of the object recognition unit and the product identification unit to update the inventory information of the shelf when the store clerk replenishes the shelf with products (addition) or takes out or discards products (subtraction).
  • the inventory information updating means updates the inventory information of the shelf where the product is present and the inventory information of the entire store when the product is purchased.
  • the inventory information updating means manages the inventory information of the shelves and the inventory information of the entire store, and when the inventory number falls below a threshold value, notifies the remote control unit or automatically places an order.
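• The following sketch illustrates one possible shape of such inventory information updating, with additions on replenishment, subtractions on removal or purchase, and a notification when stock falls below a threshold; the threshold value and the notification hook are assumptions of this sketch, not the actual implementation.

```python
# Sketch of the inventory information updating means: additions when a clerk
# replenishes a shelf, subtractions when products are taken, purchased, or
# discarded, and an automatic notification when stock falls below a threshold.
# The threshold and the notification hook are illustrative assumptions.

class ShelfInventory:
    def __init__(self, reorder_threshold=3, notify=print):
        self.stock = {}                        # {product_id: count} for this shelf
        self.reorder_threshold = reorder_threshold
        self.notify = notify                   # e.g. message to the remote control unit

    def replenish(self, product_id, quantity):
        """Clerk adds products to the shelf (addition)."""
        self.stock[product_id] = self.stock.get(product_id, 0) + quantity

    def remove(self, product_id, quantity=1):
        """Product taken by a shopper, purchased, or discarded (subtraction)."""
        self.stock[product_id] = max(0, self.stock.get(product_id, 0) - quantity)
        if self.stock[product_id] <= self.reorder_threshold:
            self.notify(f"reorder needed: {product_id} (remaining {self.stock[product_id]})")

if __name__ == "__main__":
    shelf = ShelfInventory()
    shelf.replenish("instant_noodles", 5)
    shelf.remove("instant_noodles")
    shelf.remove("instant_noodles")            # drops to 3 -> reorder notification
```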
  • the present invention may include a shopper attribute estimation function.
  • the shopper's attribute estimation function estimates attributes of the shopper, such as approximate age, age, and gender, from face recognition, for example.
• The attribute estimation function of the shopper is provided in the cashier terminal in the first embodiment, and is provided in the sales floor device in the second to fourth embodiments. In the second embodiment, it may also be provided in the cashier terminal.
  • the present invention also links purchase information and shopper attribute information to the POS system. As the purchase information, information such as a product name and an amount of money is converted into data, and the information is linked to the POS system after the settlement is completed.
  • the attribute information of the shopper the information obtained by the attribute estimation function of the shopper described above is linked to the POS system.
• The shopper attribute estimation function may be linked with, for example, the trade-restricted product determination unit. As a result of the attribute estimation, the shopper attribute estimation function may control the trade-restricted product determination unit not to perform age confirmation if the shopper is obviously old enough not to need it, for example in his or her thirties or older.
  • Image recognition using deep learning is promising as a product recognition method in the product recognition system in each embodiment.
• Deep learning requires re-learning when a product cannot be identified or when it is erroneously recognized, and in that case it is necessary to manually supply images of the correct product, which takes a great deal of effort.
• Therefore, the product recognition system of the present invention has: (1) a function that corrects the product recognition, at the time of product purchase, by a visual check by an inspector or by scanning a barcode; (2) a function that links and records the correct product image and the necessary information (corrected product number, specific code, etc.) when the product recognition is corrected; (3) a function that automatically performs deep learning re-learning based on the correct product image and the necessary information; and (4) a function that automatically deploys the learning results to the product recognition system.
• With these functions, the series of deep learning re-learning processes can be automated.
  • the deployment of the learning result refers to a process of updating, for example, a deep learning model or the like of the product recognition system based on the learning result so that it can be used for product recognition.
  • the re-learning process may be configured to be triggered by a predetermined condition such as product identification failure, or may be configured to be triggered by an explicit operation by a person in charge, for example, pressing a button. Good.
  • with the deep learning re-learning process as described above, it is possible to simplify the complicated work involved in deep learning re-learning, reduce manpower, and provide a product recognition system in which the product recognition accuracy is continuously improved.
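The automated re-learning flow in the preceding items could, for example, be organized as sketched below; the storage layout and the retrain_model/deploy_model hooks are assumptions, and a real system would substitute its own training and deployment steps.

```python
# Sketch of the automated re-learning loop: corrected samples are recorded,
# the deep learning model is re-trained, and the result is deployed.
# retrain_model() and deploy_model() are placeholders for real training/deployment steps.
import datetime
import json
import pathlib

CORRECTIONS_DIR = pathlib.Path("corrections")  # assumed storage location

def record_correction(image_bytes, corrected_product_id, extra_info=None):
    """Link the correct product image with the corrected product number etc. and record it."""
    CORRECTIONS_DIR.mkdir(exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y%m%d%H%M%S%f")
    (CORRECTIONS_DIR / f"{stamp}.jpg").write_bytes(image_bytes)
    (CORRECTIONS_DIR / f"{stamp}.json").write_text(
        json.dumps({"product_id": corrected_product_id, "info": extra_info or {}}))

def relearn_and_deploy(retrain_model, deploy_model, min_samples=100):
    """Trigger re-learning once enough corrected samples have accumulated."""
    samples = sorted(CORRECTIONS_DIR.glob("*.json"))
    if len(samples) < min_samples:
        return False
    model = retrain_model(samples)  # fine-tune the deep learning model on the corrections
    deploy_model(model)             # update the product recognition system with the result
    return True
```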
  • as an image recognition method applied when an object is specified as a product, it is possible to adopt, for example, object recognition using AI (Artificial Intelligence), object recognition using feature points / feature quantities, shape recognition of the product logo, and character recognition (character shape recognition using AI, character recognition using OCR (Optical Character Recognition), etc.).
  • as recognition methods other than image recognition, it is also possible to adopt, for example, weight recognition (recognition of an object by its weight using a weight sensor), product recognition by scanning identification information such as a bar code or QR code (registered trademark) (reading of a bar code by an infrared sensor or by image capture, etc.), and product recognition by reading electronically recorded identification information (for example, an RFID tag).
  • one or more of the above-mentioned recognition methods, such as weight recognition, product recognition by scanning identification information such as a bar code or QR code (registered trademark), and product recognition by reading electronically recorded identification information, may be combined with the image recognition method, or used instead of it, to recognize the object as a product.
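One possible way to combine candidates from several recognition methods into a single product decision is sketched below; the weighting scheme and acceptance threshold are assumptions, not something prescribed by the embodiments.

```python
# Sketch: combine candidate scores from several recognition methods.
# Each recognizer returns {product_id: score in [0, 1]}; the weights are illustrative.
def combine_recognizers(results, weights, accept_threshold=0.5):
    """results: {"image": {...}, "weight": {...}, "barcode": {...}} -> product_id or None."""
    combined = {}
    for method, candidates in results.items():
        w = weights.get(method, 0.0)
        for product_id, score in candidates.items():
            combined[product_id] = combined.get(product_id, 0.0) + w * score
    if not combined:
        return None
    best = max(combined, key=combined.get)
    return best if combined[best] >= accept_threshold else None

# Example: image recognition and weight recognition corroborate product "A123".
decision = combine_recognizers(
    {"image": {"A123": 0.8, "B456": 0.2}, "weight": {"A123": 0.7}, "barcode": {}},
    weights={"image": 0.5, "weight": 0.2, "barcode": 1.0})  # decision == "A123"
```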
  • the visual inspection of the object or the product is performed by the visual inspection terminal Q, but the present invention is not limited to this. That is, the eye examination by the eye examiner may be performed by another device such as the cashier terminal 2, the settlement machine 4, and the information terminal 9.
  • one eye inspection terminal Q may be provided for one store or for one cashier terminal 2, or one eye inspection terminal Q may be provided for a plurality of stores or a plurality of cashier terminals 2.
  • the hybrid check system in the above-described embodiments is not necessarily used only for product identification. That is, it can also be used to determine, for example, whether a product is (A) a product such as tobacco or liquor that cannot be purchased until a certain age is reached, (B) a product past its expiration or best-before date, (C) a product containing allergenic ingredients that should not be ingested by a particular shopper, or (D) a product other than halal food or another product subject to religious restrictions.
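A hedged sketch of how such non-identification checks for cases (A) to (D) might be expressed; the product fields and shopper profile keys are assumptions introduced for this example.

```python
# Sketch: using the checks for purposes (A)-(D) above. Product fields and
# shopper profile keys are assumptions introduced for this example.
import datetime

def check_restrictions(product, shopper, today=None):
    today = today or datetime.date.today()
    issues = []
    min_age = product.get("min_age")
    if min_age and (shopper.get("age") is None or shopper["age"] < min_age):
        issues.append("age_confirmation_required")             # (A)
    expiry = product.get("expiry_date")                         # datetime.date or None
    if expiry is not None and expiry < today:
        issues.append("expired")                                # (B)
    if set(product.get("allergens", [])) & set(shopper.get("allergies", [])):
        issues.append("allergen_warning")                       # (C)
    if shopper.get("halal_only") and not product.get("halal", False):
        issues.append("religious_restriction")                  # (D)
    return issues
```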
  • the image processing by the above-described CPU may be processing by a GPU (Graphic Processing Unit).
  • the illustrated hardware configuration and block diagram are merely examples for achieving the object of the present invention, and the present invention is not limited to the illustrated example.
  • the location of the functional block is not limited to that shown in the figure, and may be arbitrary.
  • one functional block may be configured by hardware alone, software alone, or a combination thereof.
  • the program forming the software is installed in a computer or the like from a network or a recording medium.
  • the computer may be a computer incorporated in dedicated hardware, or may be a smartphone or a personal computer. Further, the processing of trade-restricted products described in the first embodiment can also be applied to the second to fourth embodiments. In the second embodiment, however, the trade-restricted product processing is applied to age-restricted products.
  • the steps describing the program recorded on the recording medium include not only processing performed in time series according to the described order, but also processing that is executed in parallel or individually without necessarily being processed in time series.
  • the term “system” means an overall device configured by a plurality of devices, a plurality of means and the like.
  • the information processing system to which the present invention is applied can take various configurations other than the above-described first to fourth embodiments, as long as the following configuration is adopted. That is, the information processing system includes: a first specifying means (for example, a visual inspection result acquisition unit 239 and a visual inspection terminal Q) that obtains the result of attempting to specify an object as a product by using a first method in which an eye examiner visually confirms an image including the object as a subject; and a settlement means (for example, a settlement unit 237) that performs settlement processing for the product specified based on the result of the first specifying means.
  • a second specifying means (for example, an object recognition unit 233) that attempts to specify the object as a product by using a second method other than the first method may further be provided.
  • the settlement means may be configured to perform settlement processing on a product identified based on a result of at least one of the first identification means and the second identification means.
  • the second method may include at least a predetermined image recognition method (for example, deep learning).
  • an image recognition re-learning means may further be provided which links the product information of an object specified as a product using one or more methods not including the predetermined image recognition method with the image of that object, re-learns the predetermined image recognition method using the information including the linked product information and the image, and applies the learning result obtained by the re-learning to the second specifying means.
  • the second method may include at least one of weight recognition, scanning of identification information, and reading of electronically recorded identification information.
  • the product specified based on the result by the first specifying means may be the target of the payment processing by the payment means.
  • the product specified based on the result of each of the first specifying unit and the second specifying unit may be a target of the settlement process by the settlement unit.
  • ... Error canceling unit, 209 ... Light shielding unit, 210 ... Presentation unit, 211 ... Cash register camera, 212 ... External camera, 221 ... Illumination unit, 228 ... Light emission control unit, 229 ... Light shielding control unit, 230 ... Presentation control unit, 231, 320 ... Personal authentication unit, 232 ... Image acquisition unit, 233 ... Object recognition unit, 234 ... Object quantity recognition unit, 235 ... Product identification unit, 236 ... Trade-restricted product determination unit, 237, 435, 535 ... Settlement unit, 238 ... Display control unit, 239, 301a ... Visual inspection result acquisition unit, 241 ... DB information holding unit, 270 ... Surrounding unit, 271 ...

Abstract

The present invention enables the automatic payment of the price for a product and improves the accuracy of specifying the product when a buyer purchases the product. An object recognition unit (233) recognizes an object from an image including the object as a subject. A product specification unit (235) specifies the object as a product by using a method such as image recognition. A visual inspection result acquisition unit (239) transmits the image to a visual inspection terminal (Q), and acquires the result of specifying the product by visual inspection, transmitted from the visual inspection terminal (Q). A settlement unit (237) performs a settlement process on the product specified on the basis of the results from the product specification unit (235) and the visual inspection result acquisition unit (239).

Description

Information processing system
The present invention relates to an information processing system.
Conventionally, in stores such as convenience stores, supermarkets, shopping centers, and various mass retailers, a purchaser takes products from the product shelves, puts them into a shopping cart or shopping basket, carries them to a cash register provided near the exit of the store, and settles the price of the products at a self cash register (see, for example, Patent Document 1).
Patent Document 1: Japanese Patent Laid-Open No. 2001-76261
However, when paying for the products in a shopping cart or shopping basket at the cash register, the barcode of each product must be read even at a self-checkout counter, so the waiting line at the cash register cannot be eliminated and the purchaser has to wait at the cash register for a long time.
Further, when a purchaser purchases a desired product at a store, if there are many other purchasers, the purchaser may give up shopping.
Therefore, in consideration of the above circumstances, a system is required that, when a purchaser purchases products displayed in a store, can automate the settlement of the price of the products and shorten the time required for that settlement.
Further, in the conventional store, fraud such as shoplifting by shoppers and cashiers is also a problem, and a system capable of preventing such fraud is also required.
The present invention has been made in view of such a situation, and an object thereof is to enable automation of the settlement of the price of a product when a purchaser purchases the product, and to improve the accuracy of specifying the product.
In order to achieve the above object, an information processing system according to one embodiment of the present invention includes:
First identifying means for obtaining a result of attempting to identify the object as a product by using a first method in which an eye examiner visually confirms an image including the object as a subject;
Settlement means for performing settlement processing for the product identified based on the result of the first identifying means,
Is provided.
According to the present invention, when a purchaser purchases a product, it is possible to provide an information processing system capable of automating the payment of the price of the product and increasing the accuracy of identifying the product.
FIG. 1 is a diagram showing a list of the main points of Embodiments 1 to 4 of an information processing system according to the present invention.
FIG. 2 is a schematic diagram showing an outline of the system flow in Embodiments 1 and 2.
FIG. 3 is a schematic diagram showing an outline of the system flow in Embodiments 3 and 4.
FIG. 4 is a diagram showing a layout example of a convenience store that employs the product recognition system according to Embodiment 1.
FIG. 5 is a schematic perspective view showing an example of the external configuration of a cashier terminal used in Embodiment 1.
FIG. 6 is a configuration diagram showing the configuration of a product recognition system as Embodiment 1 of the information processing system of the present invention.
FIG. 7 is a block diagram showing the hardware configuration of a server in the product recognition system of FIG. 6.
FIG. 8 is a configuration diagram showing the hardware configuration of a cashier terminal in the product recognition system of FIG. 6.
FIG. 9 is a functional block diagram showing an example of the functional configuration of the server of FIG. 7 and the cashier terminal of FIG. 8.
FIG. 10 shows an example of a captured image of objects placed on the cashier terminal.
FIG. 11 is a diagram showing an example of a truth table for calculating the number of objects placed on the cashier terminal.
FIG. 12 is a flowchart illustrating the automatic settlement processing executed by the server and the cashier terminal of FIG. 9.
FIG. 13 is a flowchart illustrating the processing of trade-restricted products in the automatic settlement processing executed by the server and the cashier terminal of FIG. 9.
FIG. 14 is a diagram showing a layout example of a bookstore that employs the product recognition system according to Embodiment 2.
FIG. 15 is a schematic perspective view showing an example in which books are automatically settled by the cashier terminal adopted in Embodiment 2.
FIG. 16 is a configuration diagram showing the configuration of a product recognition system as Embodiment 2 of the information processing system of the present invention.
FIG. 17 is a block diagram showing the hardware configuration of a sales floor device in the product recognition system of FIG. 16.
FIG. 18 is a functional block diagram showing an example of the functional configuration of the server of FIG. 7, the cashier terminal of FIG. 8, and the sales floor device of FIG. 17.
FIG. 19 is a functional block diagram showing a detailed functional configuration example of a moving object tracking unit provided in the sales floor device of FIG. 18.
FIG. 20 is a functional block diagram showing a detailed functional configuration example of a position information management unit provided in the sales floor device of FIG. 18.
FIG. 21 is a functional block diagram showing a detailed functional configuration example of a book number counting unit provided in the sales floor device of FIG. 18.
FIG. 22 is a flowchart illustrating the automatic settlement processing executed by the server, the cashier terminal, and the sales floor device of FIG. 18.
FIG. 23 is a flowchart for the case of verifying the number-of-books information of the products and the number of books to be settled in step S210 of FIG. 22.
FIG. 24 is a diagram showing a layout example of a supermarket that employs the product recognition system according to Embodiment 3.
FIG. 25 is a configuration diagram showing the configuration of a product recognition system as Embodiment 3.
FIG. 26 is a block diagram showing the hardware configuration of a sales floor device in the product recognition system of FIG. 25.
FIG. 27 is a functional block diagram showing an example of the functional configuration of the server of FIG. 7, a settlement machine, and the sales floor device of FIG. 26.
FIG. 28 is a functional block diagram showing a detailed functional configuration example of a moving object tracking unit provided in the sales floor device of FIG. 27.
FIG. 29 is a functional block diagram showing a detailed functional configuration example of a shelf product recognition unit provided in the sales floor device of FIG. 27.
FIG. 30 is a functional block diagram showing a detailed functional configuration example of a basket product recognition unit provided in the sales floor device of FIG. 27.
FIG. 31 is a flowchart illustrating the basic flow of the automatic settlement processing executed by the server, the sales floor device, and the settlement machine of FIG. 27.
FIG. 32 is a flowchart illustrating the processing for recognizing products in a basket in the automatic settlement processing executed by the server, the sales floor device, and the settlement machine of FIG. 27.
FIG. 33 is a diagram showing a layout example of a supermarket that employs the product recognition system according to Embodiment 4.
FIG. 34 is a configuration diagram showing the configuration of a product recognition system as Embodiment 4 of the information processing system of the present invention.
FIG. 35 is a diagram showing the hardware configuration of a settlement gate in the product recognition system of FIG. 34.
FIG. 36 is a functional block diagram showing an example of the functional configuration of the server of FIG. 7, the sales floor device of FIG. 26, and the settlement gate of FIG. 35.
FIG. 37 is a functional block diagram showing a detailed functional configuration example of a moving object tracking unit provided in the sales floor device according to Embodiment 4.
FIG. 38 is a functional block diagram showing a detailed functional configuration example of a shelf product recognition unit provided in the sales floor device according to Embodiment 4.
FIG. 39 is a flowchart illustrating the automatic settlement processing executed by the server 1, the sales floor device, and the settlement gate of FIG. 36.
FIG. 40 is a flowchart illustrating the automatic settlement processing executed by the server 1, the sales floor device, and the settlement gate of FIG. 36.
[Overview]
The outline of the embodiment of the information processing system of the present invention will be described below.
The information processing system of the present invention is applied as a product recognition system that automatically pays for products. Hereinafter, Embodiments 1 and 2 of a product recognition system including a cashier terminal for specifying a product and Embodiments 3 and 4 of a product recognition system that does not include a cashier terminal for specifying a product will be described respectively.
FIG. 1 is a diagram showing a list of the main points of the first to fourth embodiments of the information processing system of the present invention.
In the “implementation store” column in FIG. 1, each implementation store of Embodiments 1 to 4 is described. For example, the first embodiment is an information processing system that is mainly intended for application in convenience stores.
However, this shop is merely an example, and the application destinations of the first to fourth embodiments are not particularly limited. For example, the implementation store of Embodiment 1 may be a retail store such as a supermarket, a cafeteria, or a store where settlement is performed.
In the “settlement place” column in FIG. 1, the place where the shopper pays in each of the first to fourth embodiments is described. The settlement place in Embodiments 1 and 2 is exemplified by a cashier terminal. The cashier terminal has a function of specifying an object as a product and settling that product. The settlement place in the third embodiment is a cash register counter, on which objects that have already been specified as products are placed, and which has a function of settling those products. The settlement place in the fourth embodiment is exemplified by a settlement gate, which has a function of settling products that have already been specified from objects without those products being placed on a cash register counter.
In the "Outline" column in FIG. 1, an outline of each of the first to fourth embodiments is described. For example, in summary, the information processing system according to the first embodiment is an information processing system that performs automatic settlement of a product placed on a cashier terminal.
Details of each embodiment are described in the “Details” column in FIG. 1. That is, for example, the information processing system according to the first embodiment recognizes a hand-held product placed on the cashier terminal by the cashier camera installed on the cashier terminal, and automatically pays the product.
Here, an item before it has been specified which product sold in the store it is will be called an "object". Items that are not products, for example personal belongings brought in by a shopper, are also "objects". An "object" that has been specified as a product is called a "product". Therefore, a "product" means an item sold at the store.
In addition, the product recognition system of Embodiments 1 to 4 includes one or more sensing devices that image an object. As the sensing device, various devices such as a temperature sensor and a distance sensor can be adopted in addition to the image sensor (camera or the like).
An image captured by an image sensor such as a camera will be referred to as “captured image” below. Furthermore, a captured image including an object as a subject is hereinafter referred to as an “object captured image”. On the other hand, a captured image including a product as a subject is hereinafter referred to as a “product captured image”.
Further, in the present specification, when various kinds of image processing are performed on the product imaged image and the object imaged image, they are actually handled in the form of data, but in the following, the data will be omitted for the sake of convenience of description.
Here, as the object captured image and the product captured image, for example, a captured image of the entire area of the cashier terminal on which objects are placed, an image in which a product is individually cut out, an image of a product logo, an image of a product barcode, an image of a product label, a captured image of an entire product shelf, an image of the inside of the store captured from the ceiling or the like can be used.
Further, in the information processing system of the present invention, when an object is specified as a product, it is estimated which product the subject of the object captured image is by applying various recognition methods to the object captured image, and a list of the products that the subject of the object captured image matches, together with their degrees of matching (hereinafter also referred to as a "product candidate list"), is generated.
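For illustration only, such a product candidate list could be built from the output of an image classifier as sketched below, assuming a hypothetical classify(image) that returns a mapping from product IDs to matching degrees; the cut-off values are arbitrary.

```python
# Sketch: build a product candidate list (product, matching degree) from classifier output.
# classify(image) is assumed to return {product_id: probability}.
def build_candidate_list(image, classify, top_k=5, min_degree=0.05):
    scores = classify(image)
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [(product_id, round(p, 3)) for product_id, p in ranked[:top_k] if p >= min_degree]
```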
Further, in the information processing systems of the first to fourth embodiments, when products are settled, an object captured image or a product captured image of an object that could not be specified as a product, or of a product subject to trade restrictions, is transmitted to the eye inspection terminal, where the eye examiner specifies the product or determines the trade restriction (for example, determines whether or not to release the trade restriction). The trade restriction of a trade-restricted product is released either by a store clerk responding on the spot based on the result of the determination, or by the system adopting the result of the visual inspection. Then, the visual inspection result by the eye examiner is referred to as appropriate, and the product is settled.
FIG. 2 is a schematic diagram showing an outline of the system flow in the first and second embodiments.
As shown in FIG. 2, in the first and second embodiments, (1) an object captured image is captured and product specification is performed at the cashier terminal, and (2) when it is determined that an object cannot be specified as a product or that a product is a trade-restricted product, (3) related information on the object or product is transmitted to the eye inspection terminal. As the related information, for example, an object captured image, a product captured image, a log of recognition results, a product candidate list, or the like can be adopted. At this time, object captured images of a plurality of frames may be transmitted. Then, (4) the request for visual inspection (the reception of related information on the inspection target) is notified at the eye inspection terminal, and (5) the eye examiner performs the visual inspection. This makes it possible to uniquely specify, as a product, an object that could not be specified at the cashier terminal, or to release the trade restriction of a trade-restricted product. Further, (6) the visual inspection result by the eye examiner (the product specification result, the trade restriction determination result, or the like) is transmitted to the cashier terminal, and (7) the visual inspection result is referred to at the cashier terminal. At this time, the cashier terminal can also notify that the product is a trade-restricted product.
When the result that the product could not be specified is returned even after the visual inspection, or when the captured image of the object used for determining whether to release the trade restriction of a trade-restricted product (the purchaser's face, identification card, etc.) is unclear, the purchaser may be informed by a screen display or the like at the cashier terminal, or may be guided by voice. Specific possible cases include (A) requesting the purchaser to rearrange the products (separate the products from each other, do not stack the products) when a product could not be specified, (B) requesting that the image be retaken when the captured image of the purchaser's face or identification card used for determining whether to release the trade restriction is unclear, and (C) guiding the purchaser on how to use the cashier terminal.
When providing voice guidance, prerecorded voice may be played. Further, the purchaser and the eye examiner may make a call via a microphone and a speaker provided in the cashier terminal.
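Steps (3) to (6) above could, for example, be realized as a simple request/response exchange between the cashier terminal and the eye inspection terminal; the HTTP endpoint, payload fields, and use of the Python requests library below are assumptions for illustration only.

```python
# Sketch of steps (3)-(6): send related information to the eye inspection terminal
# and receive the visual inspection result. URL and field names are assumptions.
import base64
import requests

def request_eye_inspection(images, candidate_list, reason,
                           endpoint="http://inspection-terminal.example/api/inspect"):
    payload = {
        "reason": reason,              # e.g. "unidentified" or "trade_restricted"
        "candidates": candidate_list,  # [(product_id, matching_degree), ...]
        "images": [base64.b64encode(img).decode("ascii") for img in images],
    }
    resp = requests.post(endpoint, json=payload, timeout=30)
    resp.raise_for_status()
    # Expected result, e.g.: {"product_id": "A123", "restriction_cleared": true}
    return resp.json()
```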
Further, FIG. 3 is a schematic diagram showing an outline of the system flow in the third and fourth embodiments.
As shown in FIG. 3, in the third and fourth embodiments, (1) an object captured image is captured and product specification is performed in the store (on the sales floor), and (2) when it is determined that an object cannot be specified as a product or that a product is a trade-restricted product, (3) related information on the object or product is transmitted to the eye inspection terminal. As the related information, for example, an object captured image, a product captured image, a log of recognition results, a product candidate list, position information indicating where the object captured image was captured, information for tracking a product or a shopper in the store, and the like can be adopted. At this time, object captured images of a plurality of frames may be transmitted. Then, (4) the request for visual inspection (the reception of related information on the inspection target) is notified at the eye inspection terminal, and (5) the eye examiner performs the visual inspection. This makes it possible to uniquely specify, as a product, an object that could not be specified in the store, or to release the trade restriction of a trade-restricted product. Further, (6) the visual inspection result by the eye examiner (the product specification result, the trade restriction determination result, or the like) is transmitted to the sales floor device installed on the sales floor, and (7) the visual inspection result is referred to by the sales floor device. At this time, the sales floor device can also notify that the product is a trade-restricted product.
When the result that the product could not be specified is returned even after the visual inspection, or when the captured image of the object used for determining whether to release the trade restriction of a trade-restricted product (the purchaser's face, identification card, etc.) is unclear, the purchaser may be informed by a screen display or the like on the sales floor device or the like, or may be guided by voice. Specific possible cases include (A) requesting the purchaser to pick up the product again (take it from the shelf again, put it back into the basket) when a product could not be specified, (B) requesting that the image be retaken when the captured image of the purchaser's face, identification card, or the like used for determining whether to release the trade restriction is unclear, and (C) guiding the purchaser on how to use the sales floor device or the like.
When providing voice guidance, prerecorded voice may be played. Further, the purchaser and the eye examiner may make a call via a microphone and a speaker provided in the sales floor device or the like.
The first to fourth embodiments of the information processing system in which such a system flow is executed can be specifically realized as the following product recognition system.
The product recognition system according to the first embodiment recognizes the presence of an object based on an object captured image including an object placed on a cashier terminal as a subject.
Specifically, for example, in Embodiment 1, the cashier terminal includes one or more cashier cameras as an example of a sensing device. The cashier camera images a predetermined area of the cashier terminal. Specifically, the cash register camera images a predetermined area before an object is placed. Further, the cash register camera images the predetermined area after the object is placed in the predetermined area. Therefore, in the commodity recognition system of the first embodiment, the cashier terminal compares the captured image before the object is placed in the predetermined area of the cashier terminal with the object captured image after the object is placed in the predetermined area of the cashier terminal. By doing so, the existence of the object is recognized.
Alternatively, instead of comparing the captured image before the object is placed in the predetermined area of the cashier terminal with the object captured image after the object is placed there, a segmentation technique in image recognition may be used, so that the presence of the object is recognized only from the object captured image taken after the object is placed in the predetermined area of the cashier terminal.
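A minimal sketch, assuming OpenCV is available, of recognizing the presence of objects by comparing the captured image taken before objects are placed in the predetermined area A with the one taken after; the thresholds and minimum area are illustrative values, not part of the embodiment.

```python
# Sketch: detect newly placed objects by differencing the "before" and "after"
# images of the predetermined area A. Thresholds and minimum area are assumptions.
import cv2

def detect_placed_objects(before_bgr, after_bgr, diff_thresh=30, min_area=500):
    before = cv2.cvtColor(before_bgr, cv2.COLOR_BGR2GRAY)
    after = cv2.cvtColor(after_bgr, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(before, after)
    _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # remove small specks
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Return bounding boxes of regions large enough to be placed objects.
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
```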
In the first embodiment, the cashier terminal specifies which product each of the recognized objects is by an object recognition method by image recognition. Here, for example, a method of identifying a product with high accuracy by creating a product candidate by deep learning and then activating the verification function is adopted.
Further, in the first embodiment, the eye examiner confirms the object placed on the cashier terminal at the eye examination terminal connected to the cashier terminal via the network according to the set conditions. This confirmation result (result of visual inspection) is transmitted to the cashier terminal. The “eye check” means that a person visually confirms an object and makes some conclusion, for example, specifying what kind of product the object is. Hereinafter, such a conclusion will be referred to as a “confirmation result” or an “eye examination result”.
When such a confirmation result (visual inspection result) is transmitted to the cashier terminal, the cashier terminal performs the final specification of the product based on the specification result obtained by the object recognition method using image recognition and on the confirmation result. It should be noted that the final specification of the product may be performed not by the cashier terminal but by another device (not shown) or by a natural person.
Then, in the first embodiment, the cashier terminal next recognizes the quantity of the specified products. In the first embodiment, the specified products are then settled.
The product recognition system according to the second embodiment is applied to a store such as a bookstore. Specifically, the product recognition system according to the second embodiment recognizes objects placed between shelves installed at a sales floor in a bookstore or on a stand such as a wagon (hereinafter, including such stands, referred to as "inside the shelf") as a number of books, tracks the shopper from the time a book is taken until it is placed on the cashier terminal, and, when books are placed on the cashier terminal, recognizes the number of books placed, specifies each book so as to recognize it as a product, and automatically settles the books. At this time, in the second embodiment, according to the set conditions, an eye examiner confirms the book placed on the cashier terminal (that is, specifies the book as a product) at the eye examination terminal connected to the cashier terminal via the network. Therefore, the product recognition system according to the second embodiment may specify the product by further taking this confirmation result (visual inspection result) into consideration.
The eye checker may visually check the shopper before visually checking the “book”. As a result, even if the tracking by the product recognition system fails, the product can be identified.
The product recognition system of the third embodiment is applied to retail stores such as supermarkets. Specifically, the product recognition system of the third embodiment recognizes baskets (shopping baskets and carts) placed on the sales floor of a retail store such as a supermarket, and tracks the baskets moving on the sales floor. In the product recognition system of the third embodiment, an object is recognized and specified as a product at the stage when the object is taken from inside the shelf, and when the basket is placed on the cash register counter, the list of products placed in the basket is read out and the products are automatically settled. At this time, in the third embodiment, according to the set conditions, an object taken from the shelf is confirmed by the eye examiner at the eye inspection terminal connected via the network to the cameras and the like installed on the sales floor. Therefore, the product recognition system of the third embodiment may specify the product by further taking this confirmation result (visual inspection result) into consideration.
The product recognition system of the fourth embodiment is applied to retail stores such as supermarkets. Specifically, not only the shoppers and the shopping baskets and carts placed in the supermarket, but also baskets in a broad sense including the shopper's own bag and shopping bags, together with the shoppers, are recognized and tracked as moving objects. Then, in the product recognition system of the fourth embodiment, a product is recognized and specified at the stage when the object is taken from the shelf, so that it can be automatically settled at the checkout gate even if the product is not placed on a cashier terminal. At this time, in the fourth embodiment, according to the set conditions, an object taken from the shelf is also confirmed by the eye examiner at the eye inspection terminal connected via the network to the cameras and the like installed on the sales floor. Therefore, the product recognition system of the fourth embodiment may specify the product by further taking this confirmation result (visual inspection result) into consideration.
Hereinafter, Embodiments 1 to 4 will be described with reference to the drawings.
[Embodiment 1]
The information processing system of the first embodiment is a product recognition system having a cashier terminal 2 as shown in FIG. 5 which is adopted in a store such as a convenience store as shown in FIG. The information processing system according to the first embodiment enables automatic payment by placing a product on a cashier terminal.
FIG. 4 is a diagram illustrating a layout example when the store that employs the information processing system according to the first embodiment is a convenience store.
A cashier counter 12 is installed near the doorway 11 in the store 10. An unmanned cashier terminal 2 is installed on the cashier counter 12 for automatically paying the merchandise. A manned register 13 is installed next to the cashier terminal 2.
In the store 10, a plurality of shelf racks 14 for displaying products are installed, and the spaces between facing shelf racks 14 serve as passages 15 through which shoppers move.
The products in the shelves are taken by shoppers who have moved through the passages 15 and are placed in a predetermined area of the cashier terminal 2 (a predetermined area A in FIG. 5 described later). The products placed in the predetermined area are collectively specified by the cashier terminal 2, with a predetermined operation of the shopper on the cashier terminal 2 as a trigger, and are automatically settled.
In the manned register 13, as in the conventional case, the clerk recognizes the products one by one by the barcode and makes the payment.
An example of the external configuration of the unmanned cashier terminal 2 will be described with reference to FIG.
FIG. 5 is a schematic perspective view showing an example of the external configuration of the cashier terminal 2.
The cashier terminal 2 includes a surrounding portion 270 that surrounds a predetermined area A in which an object is placed. The surrounding portion 270 includes a top plate portion 271, a bottom plate portion 272, and a pair of side plate portions 273.
A cash register camera 211 that images the predetermined area A is fixed to each of the top plate portion 271 and the pair of side plate portions 273. The cashier camera 211 images an object placed in the predetermined area A.
Although only three cash register cameras 211 are depicted in FIG. 5, five cashier cameras may be provided as described later, and it is sufficient that at least one cashier camera is present, and the number of cashier cameras is not limited. The cashier terminal 2 also includes an external camera 212 that images the face, hands, etc. of the shopper.
The housing portion 275 is installed on the bottom plate portion 272. A front surface of the housing 275 is provided with a receipt output unit and a display unit (not shown in FIG. 5) (a receipt output unit R and a display unit D of the output unit 206 of FIG. 8 described later).
A semitransparent plate 276 on which an object is placed is installed on the housing 275. The plate surface on the upper surface of the plate 276 is set as the predetermined area A. The board surface of the plate 276 is formed in a wave shape. The wave shape is not limited to a sine wave shape, and may be a rectangular wave. Furthermore, the pitch and amplitude may be not only uniform but also uneven.
Since the predetermined area A of the plate 276 is formed by repeating concave portions and convex portions in this manner, at least a part of a columnar or spherical object can be held between adjacent convex portions, which prevents the object from rolling.
An illumination unit 221 that illuminates a predetermined area A is provided inside the plate 276 and the top plate portion 271 of the surrounding portion 270. The lighting unit 221 may be included in the side plate portion 273 of the surrounding portion 270.
The illumination unit 221 emits not only white light but also various colors such as blue and red, which are not limited. The illumination unit 221 emits light so that the shadow of an object placed in the predetermined area A is not generated or reduced in the predetermined area A.
In the surrounding portion 270, the presentation unit 210 can change its color so that the state of the cashier terminal 2, for example, whether it is in a normal standby state, in a settlement state, being operated by a store clerk, or whether an abnormal situation has occurred, can be visually recognized by color.
At least the top plate portion 271 and the side plate portions 273 of the surrounding portion 270 may be formed of an instantaneous light control sheet so that they can be switched between a transparent state and an opaque state.
In that case, the visibility of the predetermined area A can be ensured by making the surrounding portion 270 transparent, and by making the surrounding portion 270 opaque, a captured image of the object can be acquired while suppressing the influence of external light at the time of imaging.
Such a cashier terminal 2 is incorporated in the product recognition system as the first embodiment of the information processing system.
FIG. 6 is a configuration diagram showing a configuration of a product recognition system as the first embodiment of the information processing system of the present invention.
The product recognition system according to the first embodiment includes a server 1, n (n is an arbitrary integer value of 1 or more) cashier terminals 2-1 to 2-n, and a visual inspection terminal Q. .
The server 1, the n cashier terminals 2-1 to 2-n, and the inspection terminal Q are connected to each other via a network N such as the Internet. The inspection terminal Q can be possessed by a store clerk distant from the cashier terminal 2, provided in a backyard of the store, or provided in a call center remote from the store.
Note that, for convenience of explanation, only one server 1 is depicted in FIG. 6, but in reality, there may be a plurality of servers.
Further, hereinafter, when it is not necessary to individually distinguish the cashier terminals 2-1 to 2-n, they are collectively referred to as a “cash register terminal 2”.
The server 1 executes necessary processing so that the respective operations of the plurality of cashier terminals 2 are performed in harmony. The cashier terminal 2 is placed on the cashier counter 12 shown in FIG. At the cashier terminal 2, the number of objects placed by the shopper in the predetermined area A of the cashier terminal 2 is specified, and then the merchandise is specified and automatically settled.
However, the cashier terminal 2 does not necessarily require the presence of the server 1 and can also function independently. In this case, some or all of the functions of the server 1 are provided by another information processing device (for example, the cashier terminal 2).
FIG. 7 is a block diagram showing the hardware configuration of the server 1 in the information processing system according to the first embodiment shown in FIG.
The server 1 includes a CPU (Central Processing Unit) 101, a ROM (Read Only Memory) 102, a RAM (Random Access Memory) 103, a bus 104, an input / output interface 105, an output unit 106, and an input unit 107. The storage unit 108, the communication unit 109, and the drive 110 are provided.
The CPU 101 executes various processes according to a program stored in the ROM 102 or a program loaded from the storage unit 108 into the RAM 103.
The RAM 103 also stores data necessary for the CPU 101 to execute various processes.
The CPU 101, the ROM 102, and the RAM 103 are connected to each other via a bus 104. An input / output interface 105 is also connected to the bus 104. An output unit 106, an input unit 107, a storage unit 108, a communication unit 109, and a drive 110 are connected to the input / output interface 105.
The output unit 106 includes a display, a speaker, and the like, and outputs various kinds of information as an image or sound.
The input unit 107 includes a touch panel, a keyboard, a mouse, a microphone, etc., and inputs various information.
The storage unit 108 includes a hard disk, a DRAM (Dynamic Random Access Memory), and the like, and stores various data.
As illustrated in FIG. 6, the communication unit 109 communicates with the cashier terminal 2 via the network N including the Internet.
A removable medium 120 including a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is appropriately mounted on the drive 110. The program read from the removable medium 120 by the drive 110 is installed in the storage unit 108 as needed.
In addition, the removable medium 120 can also store various data stored in the storage unit 108, similarly to the storage unit 108.
FIG. 8 is a block diagram showing a hardware configuration of the cashier terminal 2 in the information processing system according to the first embodiment shown in FIG.
The cashier terminal 2 includes a CPU 201, a ROM 202, a RAM 203, a bus 204, an input / output interface 205, an output unit 206, an input unit 207, an illumination unit 221, a light shielding unit 209, a presentation unit 210, a cash register camera 211, a storage unit 208, a communication unit 213, and a drive 214.
The removable medium 220 is appropriately attached to the drive 214.
The CPU 201, the ROM 202, the RAM 203, the bus 204, the input / output interface 205, the storage unit 208, the communication unit 213, the drive 214, and the removable medium 220 of the cashier terminal 2 are configured similarly to those of the server 1.
The output unit 206 is provided in the housing unit 275 shown in FIG. The output unit 206 includes a display unit D that displays information about products and information about settlement, a receipt output unit R that outputs a receipt, and a speaker S that outputs a voice.
The input unit 207 is provided in the housing unit 275 shown in FIG. The input unit 207 includes a touch panel (not shown), a card reader unit C, and a microphone M. Further, a bar code reader (not shown) may be provided.
The light shielding unit 209 switches the surrounding unit 270 shown in FIG. 5 between a transparent state and an opaque state when the surrounding unit 270 is formed of an instant light control sheet.
The presenting unit 210 can recognize whether the cashier terminal 2 is in a normal standby state, a settlement state, a store clerk in operation, an abnormal situation, or the like. Then, the presentation unit 210 shown in FIG. 5 is switched to emit light in a different color. The presentation unit 210 is provided not only on the front surface but also on the back surface.
The cash register camera 211 images an object placed in the predetermined area A, and outputs one or more captured images obtained as a result as an object captured image.
Since the hardware configuration of the eye examination terminal Q is the same as that of the server 1, FIG. 7 and its description are referred to as appropriate for the hardware configuration of the eye examination terminal Q, and its description is omitted here.
FIG. 9 is a functional block diagram showing an example of the functional configuration of the server 1, the cashier terminal 2, and the eye examination terminal Q.
In the CPU 101 of the server 1, a DB management unit 141 that manages personal information, product information, and the like functions. The DB management unit 141 may instead be provided in each of the cashier terminals 2.
A product DB 131 is provided in an area of the storage unit 108 of the server 1. The product DB 131 is a DB (database) that stores product information.
In the CPU 201 of the cashier terminal 2, as shown in FIG. 9, a light emission control unit 228, a light shielding control unit 229, a presentation control unit 230, a personal authentication unit 231, an image acquisition unit 232, an object recognition unit 233, an object quantity recognition unit 234, a product identification unit 235, a trade-restricted product determination unit 236, a settlement unit 237, a display control unit 238, and an eye examination result acquisition unit 239 function.
The cashier terminal 2 also includes a DB information holding unit 241 that holds personal information, product information, and the like.
In the CPU 101 of the eye examination terminal Q, an image display control unit 411 and an eye examination result transmission unit 412 function.
The light emission control unit 228 of the CPU 201 of the cashier terminal 2 executes control for switching between a state in which the illumination unit 221 is made to emit light at the timing of imaging an object and a state in which the illumination unit 221 is not made to emit light when no object is being imaged, and control for switching the emission color of the illumination unit 221 according to the situation in which an object placed in the predetermined area A is recognized.
The light shielding control unit 229 switches the surrounding unit 270 between an opaque state and a transparent state when the surrounding unit 270 is formed of an instant light control sheet. That is, the light shielding control unit 229 executes control for switching the light shielding unit 209 provided in the surrounding unit 270 between an opaque state at the timing when an object placed in the predetermined area A is imaged and a transparent state at the timing when no imaging is performed.
The presentation control unit 230 executes control so that the presentation unit 210 changes the emission color that presents the state of the cashier terminal 2.
The personal authentication unit 231 performs personal authentication of the shopper during the settlement process by referring to the personal information managed by the DB management unit 141. Specifically, before image recognition of the products, the personal authentication unit 231 carries out authentication processing using methods such as face authentication, card authentication, and various kinds of biometric authentication such as fingerprint authentication, vein authentication, and iris authentication. The personal information managed by the DB management unit 141 includes information such as age, allergies, and halal requirements. The personal information acquired by the personal authentication unit 231 is therefore utilized by the trade-restricted product determination unit 236.
The image acquisition unit 232 acquires the object captured images captured by the cashier camera 211 when an object is placed in the predetermined area A, as shown in FIG. 10.
FIG. 10 shows examples of images of objects placed on the cashier terminal 2. Specifically, FIG. 10 shows examples of object captured images obtained when the objects X, Y, Z, and Z' placed in the predetermined area A are imaged by each of three cashier cameras 211.
FIG. 10(a) shows an example of an object captured image obtained when the objects X, Y, Z, and Z' are imaged by the cashier camera 211 fixed to the top plate portion 271 of the cashier terminal 2. This object captured image contains six objects. Because no object lies in the shadow of another, all of the objects X, Y, Z, and Z' placed in the predetermined area A appear in this image.
FIG. 10(b) shows an example of an object captured image obtained when the objects X, Y, Z, and Z' are imaged by the cashier camera 211 fixed to one side plate portion 273 of the cashier terminal 2. This object captured image contains two objects. In this image, the objects on the cashier camera 211 side hide the objects behind them.
FIG. 10(c) shows a captured image obtained when the objects X, Y, Z, and Z' are imaged by the cashier camera 211 fixed to the other side plate portion 273 of the cashier terminal 2. This object captured image also contains two objects. In this image as well, the objects on the cashier camera 211 side hide the objects behind them.
Of the two objects contained in each of the object captured images of FIGS. 10(b) and 10(c), one is the same object Z', while the other is a different object.
The object recognition unit 233 recognizes the presence of objects placed in the predetermined area A from the object captured images acquired by the image acquisition unit 232, using the predetermined image recognition method described above. That is, the object recognition unit 233 compares a background image taken before an object is placed in the predetermined area A of the cashier terminal 2 with an object captured image taken after the object has been placed, and recognizes the presence of each object by defining (specifying) an object region for each object through background subtraction processing.
Note that the object recognition unit 233 may instead recognize the presence of objects by a method other than background subtraction processing, defining the object regions from the object captured image alone, without comparing the background image taken before the objects were placed in the predetermined area A with the object captured image taken after they were placed.
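The following is a minimal sketch of the background subtraction processing described above, written in Python with OpenCV (4.x). The function name, the difference threshold, and the minimum region area are illustrative assumptions and are not prescribed by the specification.

```python
import cv2
import numpy as np

def detect_object_regions(background_bgr, captured_bgr,
                          diff_threshold=30, min_area=500):
    """Define an object region for each object by comparing the background
    image of the empty predetermined area A with the image taken after
    objects have been placed (background subtraction processing)."""
    # Per-pixel absolute difference between the empty area and the current image.
    diff = cv2.absdiff(background_bgr, captured_bgr)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)

    # Pixels that changed by more than the threshold are treated as "object".
    _, mask = cv2.threshold(gray, diff_threshold, 255, cv2.THRESH_BINARY)
    kernel = np.ones((3, 3), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # remove small noise

    # Each connected region larger than min_area is one candidate object region.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]
```

Each returned bounding rectangle corresponds to one object region whose presence the object recognition unit 233 would recognize.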
The object quantity recognition unit 234 recognizes the quantity of objects placed in the predetermined area A by comparing the object recognition quantity with the settlement quantity. The object recognition quantity is the quantity of objects recognized by the object recognition unit 233 from the object captured images captured by the plurality of cashier cameras 211 of the cashier terminal 2.
The settlement quantity is the quantity of products to be settled.
As shown in FIG. 10, the object recognition quantity may differ depending on the cashier camera 211. That is, six objects are imaged in FIG. 10(a), whereas two objects are imaged in each of FIGS. 10(b) and 10(c).
In such a case, the object quantity recognition unit 234 recognizes the quantity of objects by taking a logical sum as shown in FIG. 11.
FIG. 11 is a diagram showing an example of a truth table for calculating the number of objects placed on the cashier terminal 2.
FIG. 11 shows whether the objects X, Y, Z, and Z' shown in FIG. 10 were imaged by the first to fifth cashier cameras 211 provided in the cashier terminal 2 (FIG. 5); "1" indicates that the object was imaged, and "0" indicates that it was not. Although the object Z and the object Z' are the same object, they are imaged in different states because the cashier cameras 211 that imaged them are different.
In FIG. 11, the first cashier camera 211 is denoted "register camera 1", and similarly the second to fifth cashier cameras 211 are denoted "register camera 2" to "register camera 5".
In the example of FIG. 11, the object X is imaged by the first, fourth, and fifth cashier cameras 211. The object Y is imaged by the second, fourth, and fifth cashier cameras 211. The object Z is imaged by the second, third, and fifth cashier cameras 211. The object Z' is imaged by the first and third cashier cameras 211.
In the example of FIG. 11, no product is imaged by all of the first to fifth cashier cameras 211. In addition, although the object Z and the object Z' are the same product, they are imaged as if they were different objects because the cashier cameras 211 that imaged them are different. This is considered to be because the imaging angles of the cashier cameras 211 differ.
Based on this, the object quantity recognition unit 234 determines the quantity of products by a method using a logical sum. That is, even when objects are placed in the predetermined area A so that they overlap, or an object is placed so that it lies in the shadow of another object, an object imaged by any one of the cashier cameras 211 is recognized, by taking the logical sum, as having been placed in the predetermined area A.
When the object recognition quantity recognized in this way from the plurality of object captured images using the truth table and the settlement quantity are not the same, the object quantity recognition unit 234 outputs information indicating that the quantities differ to the display control unit 238.
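The following is a minimal sketch of the logical-sum (OR) counting described above. The data layout (one set of object identifiers per cashier camera) and the function name are assumptions for illustration; in practice the identifiers would come from associating detections across cameras.

```python
def count_objects_by_logical_sum(detections_per_camera):
    """detections_per_camera: list of sets, one per cashier camera 211,
    each containing identifiers of the objects that camera imaged.
    An object counts as placed in area A if ANY camera imaged it."""
    recognized = set()
    for camera_detections in detections_per_camera:
        recognized |= camera_detections  # logical sum (union) over cameras
    return len(recognized), recognized

# Example corresponding to FIG. 11 (objects X, Y, Z, Z' over five cameras).
cameras = [{"X", "Z'"}, {"Y", "Z"}, {"Z", "Z'"}, {"X", "Y"}, {"X", "Y", "Z"}]
object_recognition_quantity, objects = count_objects_by_logical_sum(cameras)

settlement_quantity = 4  # quantity of products to be settled
if object_recognition_quantity != settlement_quantity:
    print("quantities differ")  # would be reported to the display control unit 238
```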
Returning to FIG. 9, the product identification unit 235 matches the objects whose presence has been recognized by the object recognition unit 233 against the product information held in the DB information holding unit 241. That is, the product identification unit 235 first lists product candidates by image recognition methods such as specific object recognition, general object recognition, and deep learning. The listed product candidates are called the "product candidate list S". The product identification unit 235 then exercises a verification function to identify the product with high accuracy.
The verification function is a function that lists a "product candidate list P" by an algorithm different from the method used to list the product candidates described above. The results of the product candidate lists S and P are matched against each other, and when a predetermined threshold is exceeded, the product is identified.
The product candidate lists may be created, for example, by matching image information of an object whose presence has been recognized against image information held in the DB information holding unit 241 or in memory. That is, when the feature information of the two images matches (exceeds a threshold), the object whose presence has been recognized by the object recognition unit 233 is a product registered in the DB information holding unit 241, and the product identification unit 235 therefore identifies it as the product registered in the DB information holding unit 241.
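A minimal sketch of the cross-check between the two candidate lists S and P described above. The scoring scheme (averaging two confidences) and the threshold are assumptions for illustration; the specification only requires that the two lists be produced by different algorithms and matched against a predetermined threshold.

```python
def identify_product(candidates_s, candidates_p, threshold=0.8):
    """candidates_s / candidates_p: dicts mapping product_id -> confidence,
    produced by two different recognition algorithms (lists S and P).
    Returns a product_id when the combined evidence exceeds the threshold,
    otherwise None (product unspecified)."""
    best_id, best_score = None, 0.0
    for product_id in candidates_s.keys() & candidates_p.keys():
        # Combine the two independent scores; here, a simple average.
        score = (candidates_s[product_id] + candidates_p[product_id]) / 2
        if score > best_score:
            best_id, best_score = product_id, score
    return best_id if best_score >= threshold else None
```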
Here, a specific example in which the product identification unit 235 outputs a result of "product unspecified" when it cannot identify the product by the image processing method will be described. First, the product identification unit 235 compares a product image stored in the DB information holding unit 241 with the object captured image taken in the predetermined area A, and calculates the feature points at which the two images are similar (similar feature points) and their feature amounts. Next, the product identification unit 235 reads the feature points and feature amounts of the images of the products included in the product candidate list from the product DB 131 of the server 1. The product identification unit 235 then compares the feature amount of each feature point of the products included in the read product candidate list with the feature amount of each feature point of the recognized object, and matches the similar feature points between the product image stored in the DB information holding unit 241 and the object captured image taken in the predetermined area A.
Next, the product identification unit 235 compares the positional relationships using the corresponding point coordinates of each pair of similar feature points, removes the pairs of similar feature points that do not correspond correctly under changes due to rotation and translation (whose positional relationships do not match), and calculates the number of remaining similar feature points. When the number of similar feature points is less than a threshold, the product identification unit 235 treats the product as unspecified.
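A minimal sketch of the feature-point matching and geometric consistency check described above, using ORB features and a RANSAC-estimated rotation/translation-like transform in OpenCV. The choice of ORB, the matcher, and the inlier threshold are assumptions for illustration, not parameters prescribed by the specification.

```python
import cv2
import numpy as np

def count_consistent_matches(product_img, object_img):
    """Match feature points between a registered product image and the
    object captured image, keep only pairs whose positions are consistent
    under a rotation/translation-like transform, and return the count."""
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(product_img, None)
    kp2, des2 = orb.detectAndCompute(object_img, None)
    if des1 is None or des2 is None:
        return 0

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    if len(matches) < 4:
        return 0

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC discards pairs whose positional relationship is inconsistent.
    _, inlier_mask = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    return 0 if inlier_mask is None else int(inlier_mask.sum())

# Treated as "product unspecified" when the count is below a threshold, e.g.:
# unspecified = count_consistent_matches(db_image, captured_image) < 20
```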
With such a method of treating products as unspecified, a product such as a boxed lunch, which is not a packaged product, may end up being treated as an unspecified product: the positions of the side dishes in a boxed lunch may differ slightly, and in such a case the product may be treated as unspecified. In such a case, the product identification unit 235 may detect various codes written on the label affixed to the product, such as a multidimensional code including a barcode, and characters such as the product name, read the product name, product code, and the like based on the product information stored in the DB information holding unit 241 or the product DB 131 of the server 1, and thereby identify the product.
Furthermore, the product identification unit 235 verifies similar products and related products (hereinafter referred to as "group products") using the product information stored in the DB information holding unit 241 and the product DB 131 of the server 1. For example, for group products belonging to a series that differ in size, color, or the like, the product identification unit 235 compares a score based on characteristics such as the size and color of the product against a threshold. When the score is less than the predetermined threshold, the product identification unit 235 treats the product as unspecified.
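A minimal sketch of one way the group-product check could compare color characteristics, using HSV color histograms in OpenCV; the similarity measure, histogram binning, and threshold are assumptions for illustration.

```python
import cv2

def group_product_score(object_img, candidate_img):
    """Compare the overall color distribution of the captured object with
    that of a registered group product; a low score suggests the product
    should be treated as unspecified."""
    def hsv_hist(img):
        hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
        hist = cv2.calcHist([hsv], [0, 1], None, [32, 32], [0, 180, 0, 256])
        return cv2.normalize(hist, hist).flatten()

    return cv2.compareHist(hsv_hist(object_img), hsv_hist(candidate_img),
                           cv2.HISTCMP_CORREL)

# Treated as unspecified when the score is below a predetermined threshold, e.g.:
# unspecified = group_product_score(captured, registered) < 0.5
```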
In the present embodiment, a mode in which eye examination by the eye examination terminal Q is performed only for specific objects (hereinafter referred to as the "specific object eye examination mode") and a mode in which eye examination by the eye examination terminal Q is performed for all objects (hereinafter referred to as the "all-object eye examination mode") can be set (the same applies to each of the following embodiments).
When the specific object eye examination mode is set, the product identification unit 235 transmits the object captured images of objects treated as unspecified, together with related information, to the eye examination terminal Q via the eye examination result acquisition unit 239. On the other hand, when the all-object eye examination mode is set, the product identification unit 235 transmits the product identification results and the product captured images of the identified products, as well as the object captured images of objects treated as unspecified and related information, to the eye examination terminal Q via the eye examination result acquisition unit 239.
The product identification unit 235 then identifies the products based on the eye examination results that the eye examination result acquisition unit 239 acquires from the eye examination terminal Q. Specifically, for a product identified by the product identification unit 235, when the identification result is approved or corrected by the eye examination result, the product identification unit 235 identifies the product in accordance with the content indicated by the eye examination result. For an object treated as unspecified by the product identification unit 235, the product identification unit 235 identifies the object as the product indicated by the eye examination result.
The trade-restricted product determination unit 236 determines, based on determination information, whether a product identified by the product identification unit 235 corresponds to a trade-restricted product.
Trade-restricted products include (A) products such as tobacco and alcoholic beverages that cannot be purchased unless the purchaser has reached a certain age, (B) products past their use-by or best-before date, (C) products containing allergenic ingredients that some people should not consume depending on their constitution, and (D) products subject to religious restrictions, such as products other than halal foods.
In the present embodiment, the trade-restricted product determination unit 236 transmits the product captured images of products corresponding to trade-restricted products to the eye examination terminal Q via the eye examination result acquisition unit 239. At this time, depending on the type of trade-restricted product ((A) to (D) above, etc.), the trade-restricted product determination unit 236 also transmits images relating to the shopper (a face image, an image of an identification document, etc.) to the eye examination terminal Q via the eye examination result acquisition unit 239 as appropriate. If the images relating to the shopper are unclear, the cashier terminal 2 may notify the purchaser by a screen display, voice guidance, or the like and request that the images be retaken. Furthermore, the purchaser and the eye examiner may talk to each other via the microphone M and the speaker S provided in the cashier terminal 2.
The trade-restricted product determination unit 236 then determines whether the sale of the trade-restricted product is permitted (whether the trade restriction has been released) based on the eye examination result that the eye examination result acquisition unit 239 acquires from the eye examination terminal Q.
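A minimal sketch of classifying a product against the restriction types (A) to (D) above using the shopper's personal information. The data structure and field names are assumptions for illustration; the actual determination information and the eye examination round trip are omitted.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Product:
    name: str
    age_restricted: bool   # (A) tobacco, alcoholic beverages, etc.
    expiry: date           # (B) use-by / best-before date
    allergens: set         # (C) allergenic ingredients it contains
    halal: bool            # (D) religious restriction

def restriction_reasons(product, shopper_allergens, shopper_requires_halal):
    """Return the reasons the product is trade-restricted for this shopper;
    an empty list means no restriction applies."""
    reasons = []
    if product.age_restricted:
        reasons.append("age confirmation required")        # type (A)
    if product.expiry < date.today():
        reasons.append("past expiry date")                 # type (B)
    if product.allergens & shopper_allergens:
        reasons.append("contains a registered allergen")   # type (C)
    if shopper_requires_halal and not product.halal:
        reasons.append("not a halal product")              # type (D)
    return reasons
```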
The settlement unit 237 calculates the total amount for all of the products identified by the product identification unit 235. At that time, when the trade-restricted product determination unit 236 has determined that there are trade-restricted products, the restrictions must have been released for all of those trade-restricted products. The settlement unit 237 then reads the prices of the products placed in the predetermined area A from the DB information holding unit 241 and displays them on the display unit D (FIG. 8).
The display control unit 238 controls the output unit 206 so as to warn the shopper and the store clerk to check the quantity of products when the object recognition quantity and the settlement quantity compared by the object quantity recognition unit 234 do not match. When a product is not identified by the product identification unit 235, or when a product is not identified by the eye examination at the eye examination terminal Q, the display control unit 238 controls the output unit 206 so as to output an alert to the shopper or the store clerk (an instruction to reposition the object, etc.). Specifically, the purchaser may be notified by a screen display, voice guidance, or the like and asked to reposition the product. Furthermore, the purchaser and the eye examiner may talk to each other via the microphone M and the speaker S provided in the cashier terminal 2. When an object is determined to be a trade-restricted product by the trade-restricted product determination unit 236, the display control unit 238 controls the output unit 206 so as to warn the shopper and the store clerk that it is a trade-restricted product. Furthermore, when the total amount is calculated by the settlement unit 237, the display control unit 238 controls the output unit 206 so as to display the product names, prices, and the like of the products to the shopper and the store clerk.
The eye examination result acquisition unit 239 requests identification of a product by eye examination by transmitting an object captured image, or a product identification result together with the product captured image of the identified product, to the eye examination terminal Q, and acquires the eye examination result transmitted from the eye examination terminal Q in response to this request. The eye examination result acquisition unit 239 outputs the acquired eye examination result (the product identification result) to the product identification unit 235. The eye examination result acquisition unit 239 also requests determination of a trade-restricted product by eye examination by transmitting the product captured image of a product corresponding to a trade-restricted product to the eye examination terminal Q, and acquires the eye examination result transmitted from the eye examination terminal Q. At this time, the eye examination result acquisition unit 239 transmits images relating to the shopper (a face image, an image of an identification document, etc.) to the eye examination terminal Q as appropriate, depending on the type of trade-restricted product. The eye examination result acquisition unit 239 outputs the acquired eye examination result (the trade-restricted product determination result) to the trade-restricted product determination unit 236.
For a product that the trade-restricted product determination unit 236 has determined to be a trade-restricted product and for which the trade restriction has not been released even by the eye examination result, a warning to that effect is presented by the presentation unit 210 and the output unit 206. A store clerk who notices this presentation takes action, such as collecting the trade-restricted product, so that it is not sold.
In addition to the eye examination terminal Q, the present product recognition system may include a remote operation unit for instructing release of the trade restriction on a trade-restricted product. The remote operation unit may be carried by a store clerk away from the cashier terminal 2 or installed in the backyard of the store. A single remote operation unit can also remotely operate a plurality of cashier terminals 2.
In the eye examination terminal Q, when an eye examination is requested by the cashier terminal 2, the image display control unit 411 outputs the images transmitted from the cashier terminal 2, such as object captured images or product captured images, and the various kinds of information transmitted with those images (the product candidate list, etc.) to the output unit 106.
The eye examination result transmission unit 412 transmits, to the cashier terminal 2 that requested the eye examination, the eye examination result (the product identification result, the trade-restricted product determination result, etc.) that the eye examiner has input via the input unit 107 for the object captured image or product captured image output to the output unit 106.
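The following is a minimal sketch of the kind of request and response that could be exchanged between the cashier terminal 2 and the eye examination terminal Q. The message fields and their names are assumptions for illustration only; the specification does not prescribe a particular message format.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class EyeExaminationRequest:
    cashier_terminal_id: str
    object_images: list                 # object / product captured images
    product_candidate_list: list        # e.g. product candidate list S
    shopper_images: list = field(default_factory=list)  # face / ID images, if needed
    kind: str = "identify"              # "identify" or "trade_restriction"

@dataclass
class EyeExaminationResult:
    approved: bool                      # identification approved / restriction released
    product_id: Optional[str] = None    # confirmed or corrected product
    note: str = ""                      # free-form remark from the eye examiner
```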
The product recognition system applied as the information processing system of the present specification includes a product registration system for imaging the appearance of products sold in the store 10 and registering the images together with product information such as product prices. Product registration may be performed inside the store 10 or outside the store, for example at the product's manufacturer or wholesaler; the location is not limited.
This product registration system includes a registration image generation unit (not shown), a product master registration unit, a captured image acquisition unit, and a recognition unit.
The registration image generation unit generates images of products that can be placed in the predetermined area A of the cashier terminal 2 as product registration images.
The product master registration unit registers each product registration image generated by the registration image generation unit in association with a product identifier uniquely assigned to the product included in the product registration image as a subject.
The captured image acquisition unit acquires a captured image of an object placed in the predetermined area A of the cashier terminal 2 as an object captured image. Based on this object captured image and the product registration images, the product identification unit 235 identifies which product the object whose presence has been recognized is.
The object recognition unit 233 recognizes the presence of an object placed in the predetermined area A based on the acquired object captured image.
In this product recognition system, the product identification unit 235 identifies which product an object is by matching the object captured image of the object whose presence has been recognized against the product images held in the DB information holding unit 241 or the storage unit 108 of the server 1, as described above.
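A minimal sketch of the product master registration described above: a product registration image is associated with a uniquely assigned product identifier and product information. The in-memory storage structure and the UUID-based identifier scheme are assumptions for illustration.

```python
import uuid

class ProductMaster:
    """In-memory stand-in for the product DB 131 / DB information holding unit 241."""

    def __init__(self):
        self._records = {}

    def register(self, registration_image, name, price):
        """Associate a product registration image with a unique product identifier."""
        product_id = str(uuid.uuid4())
        self._records[product_id] = {
            "image": registration_image,   # product registration image
            "name": name,
            "price": price,
        }
        return product_id

    def get(self, product_id):
        return self._records[product_id]

# Usage sketch:
# master = ProductMaster()
# pid = master.register(image_of_boxed_lunch, name="Boxed lunch A", price=500)
```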
In the store 10 where such a product recognition system is installed, product registration images are generated and a product identifier is assigned to the product included in each product registration image, which makes master registration using a product identifier uniquely assigned to each product possible. Furthermore, a product recognition system including this product registration system can manage products to which a barcode sticker cannot be affixed.
Here, a product settlement method in the product recognition system of Embodiment 1 will be described with reference to FIG. 12.
FIG. 12 is a flowchart illustrating the product settlement process executed by the server 1, the cashier terminal 2, and the eye examination terminal Q of FIG. 9.
Prior to this automatic product settlement process, an image of the predetermined area A captured in advance with no object placed in the predetermined area A of the cashier terminal 2 is stored in the image acquisition unit 232.
This previously captured image of the predetermined area A is updated at a predetermined timing rather than every time a shopper uses the cashier terminal 2, and may be shared even when the shopper changes.
The automatic settlement process is then started when the shopper presses an operation button of the cashier terminal.
In step S101, the cashier camera 211 of the cashier terminal 2 captures an image after objects have been placed on the predetermined area A of the cashier terminal 2. The image is input to the image acquisition unit 232 as an object captured image such as that shown in FIG. 10, for example.
For this object captured image, the interior of the cashier terminal 2 is illuminated by the illumination unit 221 so that shadows of the objects are not produced or are reduced.
The object captured image may also be input to the display control unit 238 and output from the output unit 206.
In addition, the external camera 212 of the cashier terminal 2 images the shopper, and the personal authentication unit 231 performs personal authentication of the shopper.
In step S102, the object recognition unit 233 of the cashier terminal 2 recognizes the presence of the objects placed in the predetermined area A from the object captured image acquired by the image acquisition unit 232, using the predetermined image recognition method described above. That is, the object recognition unit 233 compares the background image taken before the objects were placed in the predetermined area A of the cashier terminal 2 with the object captured image taken after they were placed, and recognizes the presence of each object by defining (specifying) an object region for each object through background subtraction processing.
In step S103, the product identification unit 235 of the cashier terminal 2 determines whether there is any object, among the objects placed in the predetermined area A of the cashier terminal 2, whose product cannot be identified.
In step S104, the product identification unit 235 of the cashier terminal 2 determines whether the specific object eye examination mode or the all-object eye examination mode is set.
When the specific object eye examination mode is set, the process proceeds to step S106.
When the all-object eye examination mode is set, the process proceeds to step S105.
In step S105, the eye examination result acquisition unit 239 of the cashier terminal 2 requests the eye examination terminal Q to perform an eye examination (product identification) for the target objects or products, and acquires the eye examination result.
After step S105, the process proceeds to step S107.
Note that, in step S105, when there is an object whose product is not identified even by the eye examination at the eye examination terminal Q, the presentation unit 210 of the cashier terminal 2 indicates the error state by color, sound, or the like.
When an error state occurs because a product is unspecified, the objects may be repositioned and a predetermined operation performed on the cashier terminal 2 to return to step S101 and image the objects again. The repositioning may be performed by a store clerk who has noticed the error state or by the shopper. The purchaser and the eye examiner may also talk to each other via the microphone M and the speaker S provided in the cashier terminal 2.
Error states occur, for example, when the system processing of the cashier terminal 2 becomes abnormal, when a characteristic part of an object cannot be imaged, when objects overlap or lie in shadow, when a trade-restricted product is placed on the cashier terminal 2, or when a product or a personal belonging is left behind on the cashier terminal 2. When objects overlap or lie in shadow, repositioning the objects makes it possible to image which product each object is and to identify the products.
When there is an object whose product the product identification unit 235 cannot identify (YES in step S103), the process proceeds to step S105.
When there is no object whose product the product identification unit 235 cannot identify (NO in step S103), in step S106 the product identification unit 235 identifies the products, including information held in the DB information holding unit 241 or the storage unit of the server 1, such as the product name, the price, and whether the product is a trade-restricted product. The process then proceeds to step S107. The identified product information may be output to the display control unit 238.
In step S107, the trade-restricted product determination unit 236 executes trade-restricted product processing. Trade-restricted product processing is processing that determines whether an identified product is a trade-restricted product. Specifically, the trade-restricted product determination unit 236 determines, for example, whether the identified product is an age-restricted product, whether it is a non-halal product or a product containing allergenic ingredients, and whether it is past its use-by or best-before date. Details of the trade-restricted product processing will be described later with reference to FIG. 13.
Note that when personal authentication is acquired in advance at any timing up to step S105 (including before step S101), the trade-restricted product determination unit 236 can execute the trade-restricted product processing for each shopper.
In step S108, the settlement unit 237 settles the products placed in the predetermined area A.
Specifically, the settlement unit 237 individually obtains the prices of the products identified in step S103 and totals them, thereby settling all of the products placed in the predetermined area A.
The product information of the settled products, such as the product names and prices, is output from the display control unit 238 to the output unit 206, displayed on the display unit D of the output unit 206, and printed on a receipt output from the receipt output unit R.
In addition, because the cashier terminal 2 is connected to the server 1 via the communication unit 213, it can be used as a POS (Point of Sale) terminal. That is, the cashier terminal 2 links the purchase information of the settled products and the estimated age and gender information to the POS system.
Note that, in step S108, when the quantity of objects determined by the object quantity recognition unit 234, the quantity of products identified by the product identification unit 235, and the settlement quantity differ, the presentation unit 210 of the cashier terminal 2 may present that fact. The cashier terminal 2 has a cancel function so that the settlement process can be stopped.
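The following is a minimal sketch of the overall settlement flow of FIG. 12 (steps S101 to S108). The callables passed in are placeholders standing in for the units of the cashier terminal 2 described above and are assumptions for illustration.

```python
def automatic_settlement(capture_images, recognize_objects, identify_products,
                         check_trade_restrictions, price_of):
    """Steps S101-S108 in outline; each argument is a callable standing in
    for one of the functional units of the cashier terminal 2."""
    images = capture_images()                           # S101: cashier cameras 211
    objects = recognize_objects(images)                 # S102: object recognition unit 233
    products, unspecified = identify_products(objects)  # S103-S106 (with eye examination)
    if unspecified:
        raise RuntimeError("product unspecified: reposition the objects and retry")
    if not check_trade_restrictions(products):          # S107: trade-restricted product processing
        raise RuntimeError("trade restriction not released")
    return sum(price_of(p) for p in products)           # S108: settlement unit 237
```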
Next, the trade-restricted product processing in step S107 will be described in detail with reference to FIG. 13.
FIG. 13 is a flowchart illustrating the trade-restricted product processing executed by the server 1, the cashier terminal 2, and the eye examination terminal Q of FIG. 9. The trade restriction on a trade-restricted product is released in one of two ways: a store clerk responds on the spot to the determination result, or the system adopts the result of the eye examination. FIG. 13 describes the case where the system adopts the result of the eye examination.
In step S111, the trade-restricted product determination unit 236 determines whether the product identified by the product identification unit 235 is a product, such as an alcoholic beverage, that requires age confirmation.
When it is determined in step S111 that the product identified by the product identification unit 235 is a product that requires age confirmation, that is, when YES is determined, the process proceeds to step S112.
In step S112, the cashier terminal 2 uses the external camera 212 to acquire images of the purchaser's face and age confirmation documents. However, when the shopper's personal information has been acquired and there is no need to confirm the age here, step S112 is skipped and the process proceeds to step S116.
In step S113, the trade-restricted product determination unit 236 of the cashier terminal 2 transmits the acquired images to the eye examination terminal Q via the eye examination result acquisition unit 239 and requests an eye examination. The eye examiner who receives these images confirms the purchaser's age based on the images.
At this time, when there is a problem with verifying the purchaser's age, for example when the images of the purchaser's face or age confirmation documents are unclear, the cashier terminal 2 may notify the purchaser by a screen display, voice guidance, or the like and request that the images be retaken. Furthermore, the purchaser and the eye examiner may talk to each other via the microphone M and the speaker S provided in the cashier terminal 2.
In step S114, the trade-restricted product determination unit 236 determines whether an instruction to release the trade restriction has been received from the eye examination terminal Q via the eye examination result acquisition unit 239.
When it is determined in step S114 that an instruction to release the trade restriction has not been received, the process proceeds to step S115.
When it is determined in step S114 that an instruction to release the trade restriction has been received, that is, when YES is determined, the process proceeds to step S116.
In step S115, the eye examination result transmission unit 412 of the eye examination terminal Q transmits a warning that the trade restriction has not been released even by the eye examination result. On receiving this warning, for example, the display control unit 238 of the cashier terminal 2 presents, via the output unit 206, a warning that the trade restriction has not been released even by the eye examination result. After step S115, the trade restriction processing ends.
In step S116, the trade-restricted product determination unit 236 releases the trade restriction.
When step S116 is completed in this way, or when it is determined in step S111 that the product is not an age-restricted product (when NO is determined), the process proceeds to step S117.
In step S117, the trade-restricted product determination unit 236 determines whether the product identified by the product identification unit 235 is a product other than a halal (permitted) food, or whether it is an allergy-relevant product.
When the trade-restricted product determination unit 236 determines that the product is a product other than a halal product (a product that is not permitted) or is an allergy-relevant product, the process proceeds to step S118.
In step S118, the display control unit 238 causes the display unit D of the cashier terminal to display that the product is a product other than a halal product or that it is an allergy-relevant product. However, when the shopper's personal information has been acquired and there is no need to determine here whether the product is a non-halal product or an allergy-relevant product, step S118 is skipped and the process proceeds to step S121.
In step S119, the trade-restricted product determination unit 236 determines whether an instruction to release the trade restriction has been received.
When it is determined that an instruction to release the trade restriction has not been received (when NO is determined), the process proceeds to step S115.
When it is determined that an instruction to release the trade restriction has been received (when YES is determined), the process proceeds to step S120.
In step S120, the trade-restricted product determination unit 236 releases the trade restriction.
When step S120 is completed in this way, or when it is determined in step S117 that the product is neither a product other than a halal product nor an allergy-relevant product (when NO is determined), the process proceeds to step S121.
In step S121, the trade-restricted product determination unit 236 determines whether the product identified by the product identification unit 235 is a product past its use-by date.
When the trade-restricted product determination unit 236 determines that an expired product is included, YES is determined in step S121 and the process proceeds to step S122.
In step S122, the display control unit 238 causes the display unit D of the cashier terminal 2 to display that an expired product may be included.
When it is determined in step S123 that an instruction to release the trade restriction has not been received, NO is determined and the process proceeds to step S115.
When it is determined in step S123 that an instruction to release the trade restriction has been received, YES is determined and the process proceeds to step S124.
In step S124, the trade-restricted product determination unit 236 releases the trade restriction.
When the trade-restricted product determination unit 236 determines that no expired product is included, NO is determined in step S121 and the processing ends.
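The following is a minimal sketch of the flow of FIG. 13 (steps S111 to S124) for a single product. The product attribute names and the callables standing in for the eye examination and the display are placeholders and assumptions for illustration.

```python
def trade_restriction_processing(product, personal_info_available,
                                 request_eye_examination, show_message):
    """Return True when all applicable trade restrictions are released.
    product is assumed to expose boolean attributes age_restricted, halal,
    allergens (truthy when allergens are present), and expired."""
    # S111-S116: age confirmation for products such as alcoholic beverages.
    if product.age_restricted and not personal_info_available:
        if not request_eye_examination("age confirmation"):        # S113-S114
            show_message("trade restriction not released")         # S115
            return False

    # S117-S120: non-halal or allergy-relevant products.
    if (not product.halal or product.allergens) and not personal_info_available:
        show_message("non-halal or allergy-relevant product")      # S118
        if not request_eye_examination("halal/allergy check"):     # S119
            show_message("trade restriction not released")         # S115
            return False

    # S121-S124: products past their use-by date.
    if product.expired:
        show_message("product may be past its use-by date")        # S122
        if not request_eye_examination("expiry check"):            # S123
            show_message("trade restriction not released")         # S115
            return False

    return True  # all restrictions released; proceed to settlement
```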
The trade restriction processing thereby ends, and the process proceeds to the settlement at the cashier terminal in step S108.
In this way, the present information processing system can recognize the products placed on the cashier terminal and settle them automatically. At this time, for objects and products that are subject to eye examination (for example, objects whose products cannot be identified at the cashier terminal 2, or products corresponding to trade-restricted products), an eye examination is performed at the eye examination terminal Q, and automatic settlement is performed according to the eye examination results (the product identification results, the trade-restricted product determination results, etc.).
Therefore, according to the present information processing system, when a shopper purchases products, it is possible to automate the settlement of the prices of the products and to improve the accuracy of product identification.
Embodiment 1 is not limited to the embodiment described above, and modifications, improvements, and the like within a range in which the object of the present invention can be achieved are included in the present invention.
For example, the external configuration of the cashier terminal 2 shown in FIG. 5 is an example, and the appearance is not limited to it. The cashier terminal 2 only needs to include at least the predetermined area A, an imaging unit such as the cashier camera 211, and the output unit 206, and other components may be added.
The product recognition system of Embodiment 1 may also be provided with a transport mechanism (for example, a belt conveyor) that transports one or more objects from an upstream side to a downstream side. A predetermined area having an imaging unit is arranged on the upstream side, and a settlement area is arranged on the downstream side.
In the predetermined area, the quantity of objects imaged by the imaging unit is counted. This product recognition system detects an error when the counted quantity of objects differs from the quantity of products settled in the settlement area.
[Embodiment 2]
The information processing system of Embodiment 2 is a product recognition system that includes the cashier terminal 2 shown in FIGS. 5 and 15 in a store 20 such as a bookstore as shown in FIG. 14. The information processing system of Embodiment 2 enables automatic settlement when books, which are the products, are placed on the register stand.
FIG. 14 is a diagram showing a layout example of a bookstore that employs the information processing system of Embodiment 2.
A store 20 such as a bookstore has entrances/exits 22 provided with gates 21. Depending on the scale of the store 20, the entrances/exits 22 are not limited to the two locations illustrated, and there may be one location or three or more locations. The gates 21 are not limited to the illustrated type provided with an opening/closing member; they may also be of a type provided with a function of announcing the occurrence of an abnormal situation by sound or light, such as a speaker or a lamp (not shown), so that fraudulent acts can be dealt with.
The store 20 is provided with a plurality of shelf racks 23 for displaying books. Each shelf rack 23 has a plurality of shelves arranged at intervals in the vertical direction to display books. The space between shelf racks 23 facing each other in the horizontal direction serves as a passage 24.
A plurality of ceiling cameras 310 (only one is shown in the drawing) are installed on the ceiling of the passage 24. The ceiling cameras 310 constantly image shoppers who have entered the store, covering the inside of the store without blind spots.
In addition, a plurality of shelf cameras 311 (only one is shown in the drawing) that constantly image the inside of each shelf rack 23 may be installed in each shelf rack 23. The shelf cameras 311 are arranged in plural so that the inside of the shelf can be imaged without blind spots and so that a shopper standing in front of the shelf rack 23 can also be imaged.
A cashier counter 25 is installed near the doorway 22 inside the store 20. On the cashier counter 25, a plurality of unmanned cashier terminals 2 for automatic settlement are installed. A manned register 26 is installed next to the cashier terminals 2. Shopping baskets (not shown) for holding books may be placed near the doorway 22 or in the passage 24.
A store clerk Mt works in the passage 24, inside the cashier counter 25, and elsewhere. The store clerk Mt carries an information terminal 9. The information terminal 9 is also installed in the backyard of the store 20, at a general headquarters outside the store 20, and the like.
In the present information processing system, the ceiling camera 310 captures the action of a shopper taking one or more books out of, or returning them to, a shelf of the shelf rack 23, and the number of books taken by the shopper is thereby grasped. Then, as shown in FIG. 15, information such as the price of each book placed on the unmanned cashier terminal 2 is acquired and settlement is performed automatically.
In the second embodiment, the shopper will be described as a "moving object Mo".
FIG. 15 is a schematic perspective view showing the general configuration of the cashier terminal 2 used in the second embodiment, and shows a state in which a book (not given a reference numeral) is being settled.
The unmanned cashier terminal 2 according to the second embodiment employs the same external configuration as the cashier terminal 2 according to the first embodiment shown in FIG.
Therefore, the cashier terminal 2 includes the surrounding portion 270 that surrounds the predetermined area A in which the book is placed. The surrounding portion 270 includes a top plate portion 271, a bottom plate portion 272, and a pair of side plate portions 273. The surrounding portion 270 has the same structure as the surrounding portion 270 of the first embodiment shown in FIG.
Therefore, the cashier cameras 211 that image the predetermined area A are fixed to the top plate portion 271 and the pair of side plate portions 273. As shown in FIG. 15, at least one cashier camera 211 images at least the spine of a book placed in the predetermined area A. Note that, in FIG. 15, the book is placed close to one side plate portion 273 with its spine facing the other side plate portion 273; however, the spine may instead be directed toward the cashier camera 211 installed on the top plate portion 271, and there is no restriction on how the book is placed.
Such a cashier terminal 2 is incorporated in a product recognition system as one embodiment of the information processing system.
FIG. 16 is a configuration diagram showing the configuration of the product recognition system as the second embodiment of the information processing system of the present invention.
The product recognition system includes a server 1, cash register terminals 2-1 to 2-n, a sales floor device 3, and an inspection terminal Q. The sales floor device 3 has a function of recognizing the number of books from a captured image of the book captured by the ceiling camera 310.
The server 1 is installed in the backyard of the store 20 or outside the store in order to manage the cashier terminals 2-1 to 2-n, the sales floor device 3, and the inspection terminal Q. Further, the sales floor device 3 controls the ceiling camera 310 installed in the store 20 in order to discover and track the moving object Mo in the store 20 shown in FIG.
The server 1, the cashier terminals 2-1 to 2-n, the sales floor device 3, and the eye inspection terminal Q are connected to each other via a network N such as an Internet line. The eye inspection terminal Q can be carried by a store clerk away from the cashier terminal 2, installed in the backyard of the store, or installed in a call center remote from the store.
Note that, for convenience of explanation, only one server 1 in FIG. 16 is drawn, but in reality, there may be a plurality of servers.
Further, hereinafter, when it is not necessary to individually distinguish the cashier terminals 2-1 to 2-n, they are collectively referred to as a “cash register terminal 2”.
The server 1 executes each process in order to manage each operation of the cashier terminal 2 and the sales floor device 3.
The server 1 includes a CPU 101, a ROM 102, a RAM 103, a bus 104, an input/output interface 105, an output unit 106, an input unit 107, a storage unit 108, a communication unit 109, and a drive 110. These are configured similarly to those of the server 1 described in the first embodiment with reference to FIG. 7.
The sales floor device 3 is connected to the server 1 via the network N.
FIG. 17 is a block diagram showing the hardware configuration of the sales floor device 3 in the product recognition system of FIG.
The sales floor device 3 includes a CPU 301, a ROM 302, a RAM 303, a bus 304, an input/output interface 305, a ceiling camera 310, a shelf camera 311, a communication unit 315, and an information terminal 9.
The CPU 301, ROM 302, RAM 303, bus 304, input/output interface 305, and communication unit 315 of the sales floor device are configured similarly to those of the server 1 shown in FIG. 7.
The ceiling camera 310 is connected to the network by a USB (Universal Serial Bus) cable.
The shelf camera 311 may employ a camera capable of capturing a wide angle such as a fisheye camera.
Further, the shelf camera 311 is connected to the network by a USB cable.
The information terminal 9 is an information device such as a smartphone or a tablet that includes a remote operation unit 390, a display unit 391, and the like. The remote operation unit 390 has a function of remotely clearing an error state such as a system processing abnormality. The display unit 391 has a screen for displaying an error state, the moving object Mo, and the like. The information terminal 9 also includes a sound generation unit (not shown) that announces an error state.
Error states in the store include, for example, a case where the ceiling camera 310 could not recognize the number of books taken from a shelf, and a case where a book that has not been settled is about to be taken out of the store. Error states at the cashier terminal include, for example, cases where the cashier terminal 2 could not identify a book or could not recognize the number of books, where an age-restricted product is about to be settled, and where a book has been left behind in the cashier terminal 2.
The server 1 also includes an error display unit 151 that displays such errors and an error canceling unit 152 that cancels the error state.
In the present embodiment, when the cashier terminal 2 could not identify a book or could not recognize the number of books, or when an age-restricted product is about to be settled, the error state can also be resolved by requesting an eye inspection by the eye inspection terminal Q.
The cashier terminal 2 is connected to the sales floor device 3 through the network N.
The cashier terminal 2 is configured similarly to the cashier terminal of the first embodiment shown in FIG.
Therefore, the cashier terminal 2 according to the second embodiment includes a CPU 201, a ROM 202, a RAM 203, a bus 204, an input/output interface 205, an output unit 206, an input unit 207, an illumination unit 221, a light shielding unit 209, a presentation unit 210, a cashier camera 211, a storage unit 208, a communication unit 213, and a drive 214.
The cashier camera 211 images the books placed in the predetermined area A and outputs the resulting captured image to the image acquisition unit 232 in the CPU 201 as an object captured image. When the cashier camera 211 can image a wide angle like a fisheye camera, or when it is dedicated to imaging only the spines of books, only one cashier camera 211 may be provided.
Since the hardware configuration of the eye inspection terminal Q is the same as that of the server 1, FIG. 7 and its description are referred to as appropriate for the hardware configuration of the eye inspection terminal Q, and its description is omitted here.
FIG. 18 is a functional block diagram showing an example of the functional configuration of the server 1, the cashier terminal 2, the sales floor device 3, and the inspection terminal Q.
The CPU 101 of the server 1 includes an error determination unit 150.
A product DB 131 and a position information management DB 132 are provided in one area of the storage unit 108 of the server 1.
The product DB 131 is a DB (Data Base) that stores information such as book titles, prices, author names, publishers, and the like. The position information management DB 132 manages the position of the moving object Mo.
In the CPU 201 of the cashier terminal 2, as shown in FIG. 18, a light emission control unit 228, a light shielding control unit 229, a presentation control unit 230, an image acquisition unit 232, an object recognition unit 233, a product identification unit 235, a trade-restricted product determination unit 236, a settlement unit 237, a display control unit 238, and an eye inspection result acquisition unit 239 function.
In the CPU 101 of the eye examination terminal Q, the image display control unit 411 and the eye examination result transmission unit 412 function.
The light emission control unit 228 executes control for switching the illumination unit 221 between a state of emitting light at the timing when a book is imaged and a state of not emitting light at timings when no book is imaged, and control for switching the emission color according to the situation, such as how the books placed in the predetermined area A are being recognized.
The light-shielding control unit 229 controls the light-shielding unit 209 provided in the surrounding unit 270 to switch between an opaque state at the timing of photographing a book placed in the predetermined area A and a transparent state at the timing of not photographing the book.
The presentation control unit 230 executes control so that the presentation unit 210 changes the emission color for presenting the state of the cashier terminal 2.
The image acquisition unit 232 acquires the data of the object image captured by the cashier camera 211 with the object placed in the predetermined area A.
The object recognition unit 233 recognizes the presence of an object placed in the predetermined area A using the above-described predetermined image recognition method.
The product identification unit 235 lists product candidates for the object whose existence has been recognized by the object recognition unit 233, using image processing methods such as specific object recognition, general object recognition, character recognition, and deep learning. The listed product candidates are referred to as a "product candidate list S". After that, the product identification unit 235 exercises a verification function and identifies the product with high accuracy. Note that the "image recognition method" described below means a method of recognizing something by using some kind of image. For example, specific object recognition, general object recognition, character recognition, and deep learning are examples of image recognition methods.
The verification function lists a "product candidate list P" using an algorithm different from the method used to list the product candidates described above. The results of the product candidate lists S and P are matched against each other, and the product is identified when the match exceeds a predetermined threshold.
The "product candidate list" may be generated, for example, by matching image information of the object whose existence has been recognized against image information held in the DB information holding unit 241 or in memory. That is, when the feature information of the two images matches (exceeds a threshold), the object whose existence has been recognized by the object recognition unit 233 is a product registered in the DB information holding unit 241, and the product identification unit 235 therefore identifies it as the product registered in the DB information holding unit 241.
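As an illustrative sketch only (the embodiment does not prescribe a concrete algorithm; the scoring and all names below are assumptions), cross-checking the two candidate lists against a threshold could look like this:

```python
# Minimal sketch: two independent recognizers each produce a scored candidate list
# ("S" and "P"); a product is accepted only when a candidate appearing in both lists
# has a combined score exceeding a predetermined threshold.

from typing import Dict, Optional

def verify_candidates(list_s: Dict[str, float],
                      list_p: Dict[str, float],
                      threshold: float = 1.5) -> Optional[str]:
    """Return the product id whose combined score passes the threshold, else None."""
    best_id, best_score = None, 0.0
    for product_id, score_s in list_s.items():
        score_p = list_p.get(product_id)
        if score_p is None:
            continue                      # the candidate must appear in both lists
        combined = score_s + score_p      # one possible way of merging the two scores
        if combined > best_score:
            best_id, best_score = product_id, combined
    return best_id if best_score >= threshold else None

# Example: the object is identified as "book-123" because both lists agree on it.
s = {"book-123": 0.9, "book-456": 0.4}
p = {"book-123": 0.8, "book-789": 0.7}
print(verify_candidates(s, p))  # -> book-123
```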
Also in the present embodiment, as in the first embodiment, when the product cannot be specified at the cashier terminal 2, the result is output as the product not specified.
Then, as in the first embodiment, when the specific object eye inspection mode is set, the product identification unit 235 transmits the object captured image of the object whose product could not be specified, together with related information and the like, to the eye inspection terminal Q via the eye inspection result acquisition unit 239. On the other hand, when the all-object eye inspection mode is set, the product identification unit 235 transmits the product identification results and the product captured images of the identified products, as well as the object captured images of objects whose products could not be specified and related information, to the eye inspection terminal Q via the eye inspection result acquisition unit 239.
Further, in the present embodiment, when the number of products (books) placed on the cashier terminal 2 does not match the number of books counted by the sales floor device 3, the result is output as a mismatch in the number of products (number of books). In this case, the product identification unit 235 transmits the object captured image and the number of books counted by the sales floor device 3 to the eye inspection terminal Q via the eye inspection result acquisition unit 239.
Furthermore, the product identification unit 235 identifies the product based on the eye inspection result acquired from the eye inspection terminal Q by the eye inspection result acquisition unit 239. Specifically, for a product identified by the product identification unit 235, when the identification result is approved or corrected by the eye inspection result, the product identification unit 235 identifies the product according to the content indicated by the eye inspection result. For an object whose product could not be specified by the product identification unit 235, the product identification unit 235 identifies that object as the product indicated by the eye inspection result. If the number of books is not confirmed by the eye inspection (the mismatch in the number of books is not resolved), an error is displayed and the settlement is stopped.
The trade-restricted product determination unit 236 determines, based on determination information, whether or not the product identified by the product identification unit 235 is a trade-restricted product, and presents the result. A trade-restricted product is, for example, a book whose sale to shoppers below a certain age is restricted. When a clerk sells a book, the clerk who sees the shopper can confirm the shopper's age and judge whether or not the book may be sold.
However, the present system, which employs automatic settlement rather than face-to-face sales, requires a mechanism that allows the shopper's age to be confirmed.
In the present embodiment, the trade-restricted product determination unit 236 of the cashier terminal 2 that has identified a trade-restricted product transmits a product captured image of the product corresponding to the trade-restricted product to the eye inspection terminal Q via the eye inspection result acquisition unit 239. At this time, the trade-restricted product determination unit 236 transmits images relating to the shopper (a face image, an image of an identification card, etc.) to the eye inspection terminal Q via the eye inspection result acquisition unit 239 as appropriate. If an image relating to the shopper is unclear, the cashier terminal 2 may notify the purchaser by screen display, voice guidance, or the like and request that the image be retaken. Further, the purchaser and the eye examiner or the like may talk with each other via the microphone M and the speaker S provided in the cashier terminal 2.
Then, the trade-restricted product determination unit 236 determines, based on the eye inspection result acquired from the eye inspection terminal Q by the eye inspection result acquisition unit 239, whether or not the sale of the trade-restricted product is permitted (whether or not the sale restriction has been released).
When the sale of the trade-restricted product is permitted by the eye inspection result, settlement of the product continues. On the other hand, when the sale of the trade-restricted product is not permitted by the eye inspection result, the fact that the product is a trade-restricted product is presented and the settlement process is interrupted. In this case, a clerk who has received the error state via the error display unit 151 may confirm the age of the shopper, operate the cashier terminal 2, and release the restricted state, whereupon the settlement process is resumed. In addition, the cashier terminal 2 may include a cashier camera 211 that images the face, hands, and the like of the shopper and may estimate the shopper's age, so that trade-restricted products are not sold to a shopper who is determined not to have reached the predetermined age.
Therefore, the trade-restricted product determination unit 236 identifies a trade-restricted product such as an age-restricted book from the information of the DB management unit 141 of the server 1.
The trade-restricted product determination unit 236 may restrict the sale by associating the product with the shopper information obtained through personal authentication. When it is determined that the book is a trade-restricted product, the display unit D presents the fact that the book is a trade-restricted product.
The settlement unit 237 calculates the total amount for the books that have been identified by the product identification unit 235 and determined to be sellable by the trade-restricted product determination unit 236. For example, the settlement unit 237 reads the prices of the books placed in the predetermined area A from the DB information holding unit 241, adds them up, displays the total on the display unit D (FIG. 8), and performs the settlement.
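A minimal sketch of this totaling, with an assumed in-memory mapping standing in for the DB information holding unit 241:

```python
# Minimal sketch (assumed data): sum the prices of the identified, sellable books.

PRODUCT_DB = {  # illustrative stand-in for the DB information holding unit 241
    "book-123": {"title": "Example Title A", "price": 1200},
    "book-456": {"title": "Example Title B", "price": 800},
}

def settle(identified_ids):
    """Return the total amount for the identified books."""
    return sum(PRODUCT_DB[i]["price"] for i in identified_ids)

print(settle(["book-123", "book-456"]))  # -> 2000
```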
The display control unit 238 executes control to display, toward the purchaser and the clerk, the titles, prices, and other information of the books imaged by the cashier camera 211 of the cashier terminal 2 and settled by the settlement unit 237. When a product could not be specified by the product identification unit 235, or when a product could not be specified by the eye inspection at the eye inspection terminal Q, the display control unit 238 controls the output unit 206 to output an alert (an instruction to reposition the object, etc.) toward the shopper or the clerk. Specifically, the purchaser may be notified by screen display, voice guidance, or the like and asked to reposition the product. Further, the purchaser and the eye examiner or the like may talk with each other via the microphone M and the speaker S provided in the cashier terminal 2.
The eye inspection result acquisition unit 239 requests identification of a product by eye inspection by transmitting the object captured image, or the product identification result together with the product captured image of the identified product, to the eye inspection terminal Q, and acquires the eye inspection result transmitted from the eye inspection terminal Q in response to the request. The eye inspection result acquisition unit 239 outputs the acquired eye inspection result (the product identification result) to the product identification unit 235. The eye inspection result acquisition unit 239 also requests determination of a trade-restricted product by eye inspection by transmitting the product captured image of the product corresponding to the trade-restricted product to the eye inspection terminal Q, and acquires the eye inspection result transmitted from the eye inspection terminal Q. At this time, the eye inspection result acquisition unit 239 transmits images relating to the shopper (a face image, an image of an identification card, etc.) to the eye inspection terminal Q as appropriate. The eye inspection result acquisition unit 239 outputs the acquired eye inspection result (the determination result of the trade-restricted product) to the trade-restricted product determination unit 236.
In the eye inspection terminal Q, when an eye inspection is requested by the cashier terminal 2, the image display control unit 411 outputs, to the output unit 106, the images transmitted from the cashier terminal 2 (the object captured image, the product captured image, etc.) together with the various information transmitted along with those images (the product candidate list, etc.).
The eye inspection result transmission unit 412 transmits, to the cashier terminal 2 that requested the eye inspection, the eye inspection result (the product identification result, the determination result of the trade-restricted product, etc.) that the eye examiner has input via the input unit 107 for the object captured image or the product captured image output to the output unit 106.
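Purely as an illustrative sketch (the embodiment does not define a message format; all field names here are assumptions), the exchange between the cashier terminal 2 and the eye inspection terminal Q could be represented like this:

```python
# Minimal sketch (assumed field names): the cashier terminal sends the captured image
# and the candidate list, and the examiner's answer comes back as a small result record.

import json

def build_eye_check_request(register_id, image_b64, candidates, reason):
    return json.dumps({
        "register_id": register_id,
        "image": image_b64,          # object/product captured image, base64-encoded
        "candidates": candidates,    # e.g. the product candidate list
        "reason": reason,            # "unidentified", "count-mismatch", "restricted"
    })

def parse_eye_check_result(payload):
    result = json.loads(payload)
    return result.get("product_id"), result.get("approved", False)

request = build_eye_check_request("register-2-1", "<base64>", ["book-123", "book-456"],
                                  "unidentified")
print(parse_eye_check_result('{"product_id": "book-123", "approved": true}'))
```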
In the CPU 301 of the sales floor device 3, as shown in FIG. 18, a personal authentication unit 320, a moving object tracking unit 330, a position information management unit 340, and a number-of-books counting unit 350 function.
The personal authentication unit 320 includes a personal information acquisition unit 321.
The personal authentication unit 320 personally authenticates who the shopper registered in the DB management unit 141 of the server 1 is based on the personal information of the shopper acquired by the personal information acquisition unit 321. The personal authentication unit 320 and the personal information acquisition unit 321 may be provided in the cashier terminal 2 as in the first embodiment.
The personal information here includes, for example, personal information such as name, sex, date of birth, address, and telephone number, as well as biometric information such as fingerprints, veins, and iris, and financial information such as credit card numbers and bank account numbers. Information about privacy, such as information about
The personal information acquisition unit 321 is provided in, for example, the gate 21 installed at the doorway 22. As the personal information acquisition unit 321, a reader onto which the shopper taps an IC card or a portable information terminal such as a smartphone or tablet, or a reader that reads biometric information such as a fingerprint, vein pattern, or iris, is employed.
When the personal authentication cannot be performed when entering the store, the personal authentication may be performed from the image of the shopper captured by the ceiling camera 310 during shopping.
The acquired personal information is used for trading restrictions (including cancellation) and purchase analysis.
FIG. 19 is a functional block diagram showing a detailed functional configuration example of the moving object tracking unit 330 provided in the sales floor device 3 of FIG.
As shown in FIG. 19, in order to discover the moving object Mo from the images captured by the ceiling cameras 310 and to track the moving object Mo as it moves, the moving object tracking unit 330 includes a moving object discovery unit 3302 using the ceiling camera, a moving object area definition unit 3304 using the ceiling camera, and a moving object area tracking unit 3305 using the ceiling camera.
The moving object tracking unit 330 using the ceiling camera is connected to the ceiling cameras 310 via a USB cable, the network N, or the like. The ceiling camera 310 is therefore linked with other ceiling cameras 310, a personal computer, or the like.
The moving object discovery unit 3302 using the ceiling camera estimates the state of the moving object Mo using a state space model (a Bayesian filter or the like) based on the captured images taken by the ceiling cameras 310, discovers the moving object Mo, and assigns it a uniquely identifiable ID.
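As a minimal illustrative sketch only (the embodiment calls for a state space model such as a Bayesian filter; the nearest-neighbour association and all names below are assumptions standing in for that), discovering moving objects and issuing IDs could be outlined as follows:

```python
# Minimal sketch (assumed logic): detections from the ceiling camera are matched to
# already-tracked moving objects by distance; unmatched detections receive a new,
# uniquely numbered ID. A real implementation would estimate the state with a
# state space model (e.g. a Kalman or particle filter); plain nearest-neighbour
# matching stands in for it here.

import itertools
import math

_id_counter = itertools.count(1)
tracks = {}  # moving object ID -> last known (x, y) position

def update_tracks(detections, max_dist=80.0):
    """Associate (x, y) detections with existing tracks or open new ones."""
    for x, y in detections:
        nearest, best = None, max_dist
        for track_id, (tx, ty) in tracks.items():
            d = math.hypot(x - tx, y - ty)
            if d < best:
                nearest, best = track_id, d
        if nearest is None:          # no nearby track: a newly discovered moving object Mo
            nearest = next(_id_counter)
        tracks[nearest] = (x, y)     # update the stored position
    return tracks

print(update_tracks([(100, 200)]))   # first shopper is assigned ID 1
print(update_tracks([(110, 205)]))   # same shopper, position updated
```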
In the images from the ceiling camera 310, the peripheral area, where the moving object Mo does not directly face the ceiling camera 310, is imaged at an angle (from an oblique direction), so the position information of the moving object Mo may not be acquired accurately. It is therefore conceivable to apply a calibration-based correction to the captured image so that it appears to be imaged head-on. However, even with such a correction, the position information may not be acquired with sufficient accuracy.
Therefore, the ceiling camera 310 may acquire highly accurate position information by acquiring height information of the moving object Mo using a distance sensor or the like.
The moving object area definition unit 3304 using the ceiling camera updates the position information of the area of the moving object Mo after it has moved. Since the moving object Mo keeps moving, its area changes within the range imaged by one ceiling camera 310 and also moves into the ranges imaged by other ceiling cameras 310. Each time the moving object Mo moves, the moving object area is defined anew, and the position information of each moving object area held in the position information management DB 132, in memory, or the like is updated.
The moving object area tracking unit 3305 estimates the position of the moving object area and continues to track the moving object area of the moving object Mo.
FIG. 20 is a functional block diagram showing a detailed functional configuration example of the position information management unit 340 provided in the sales floor device 3 of FIG. 18.
The position information management unit 340 includes an inter-camera information transfer unit 341, a position definition unit 342 for each camera, and a moving object display unit 343.
The inter-camera information transfer unit 341 shares the image information captured by each ceiling camera 310 with the other ceiling cameras 310, so that the moving object area can continue to be tracked even when the moving object Mo moves from the captured image of one ceiling camera 310 into the captured image of another ceiling camera 310.
For example, the inter-camera information transfer unit 341 exchanges information between the ceiling cameras 310 on the storage unit 108, including the product DB 131, through the server 1 that aggregates the information obtained by the imaging of the ceiling cameras 310.
As another example, in view of the large number of ceiling cameras 310, the inter-camera information transfer unit 341 may pass the images captured by each ceiling camera 310 between the ceiling cameras 310, for example by P2P, without going through the server 1.
The position definition unit 342 for each camera defines position information indicating which part of the store each ceiling camera 310 captures. That is, the position definition unit 342 for each camera grasps where in the store a moving object imaged by different ceiling cameras 310, as linked by the inter-camera information transfer unit 341, is located.
The position definition unit 342 for each camera combines the captured images of the ceiling cameras 310 to create a single store map. It also replaces the coordinates of each ceiling camera 310 and shelf camera 311 with coordinates on the store map. Further, the position definition unit 342 for each camera corrects the captured image of each ceiling camera 310 by perspective transformation so that, computationally, the captured image directly faces the floor surface of the store.
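As an illustrative sketch of such a perspective correction (using OpenCV as an assumed tool; the embodiment names no library, and the reference points are hypothetical):

```python
# Minimal sketch using OpenCV: four reference points seen by a ceiling camera are
# mapped to their known store-map coordinates, so the frame is re-rendered as if
# the camera faced the floor directly.

import cv2
import numpy as np

def correct_to_floor_plane(frame, camera_pts, map_pts, out_size=(800, 600)):
    """Warp a ceiling-camera frame onto store-map (floor) coordinates."""
    src = np.float32(camera_pts)   # 4 points in the camera image
    dst = np.float32(map_pts)      # the same 4 points on the store map
    matrix = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(frame, matrix, out_size)

# Illustrative call (values are hypothetical):
# corrected = correct_to_floor_plane(frame,
#     camera_pts=[(120, 80), (520, 90), (540, 400), (100, 410)],
#     map_pts=[(0, 0), (400, 0), (400, 300), (0, 300)])
```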
By equipping the ceiling camera 310 with a distance sensor and acquiring height information, the position information management unit 340 can accurately correct the distorted captured image and accurately recognize the moving object Mo.
The moving object display unit 343 displays the moving object Mo in the store 20 based on the position information defined by the position definition unit 342 for each camera. The moving object display unit 343 may be implemented as the information terminal 9 carried by the clerk Mt, as a screen in the backyard of the store, or the like.
FIG. 21 is a functional block diagram showing a detailed functional configuration example of the number-of-books counting unit 350 provided in the sales floor device 3 of FIG. 18.
The number-of-books counting unit 350 includes a number-of-books recognition unit 351, a moving object and number-of-books association unit 352, a number-of-books unspecified determination unit 353, a number-of-books management unit 354 associated with a person, a moving object area transfer recognition unit 355 using the ceiling camera, and a handed-over number-of-books recognition unit 356 using the ceiling camera.
The number-of-books recognition unit 351 recognizes, from the captured images taken by the ceiling camera 310, the number of books that the moving object Mo has taken from a shelf and the number of books returned to the shelf. By providing an "object entry detection line" or the like in the captured image, the number-of-books recognition unit 351 detects the entry and exit of the moving object area into and out of the shelf, and, from the captured image at the moment of that detection, defines the areas of the objects that the moving object Mo has taken from the shelf and of the objects returned to the shelf. The number of these object areas is recognized as the number of books. When the ceiling camera 310 has a zoom function, the number-of-books recognition unit 351 may perform the number-of-books recognition after the ceiling camera has zoomed in, taking into account the distance between the ceiling camera 310 and the moving object Mo.
Further, the number-of-books recognition unit 351 may use the shelf camera 311 to recognize the number of books that the moving object Mo has taken from or returned to the shelf, or may recognize these numbers from a combination of the ceiling camera 310 and the shelf camera 311. In that case, the shelf camera 311 may be a camera capable of imaging a wide range.
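A minimal sketch of the "object entry detection line" idea described above, with assumed coordinates and names (not taken from the embodiment), could be:

```python
# Minimal sketch (assumed coordinates): when the tracked hand/object region crosses a
# virtual line in front of the shelf, the frame at that moment is analysed and each
# detected object region is counted as one book taken from (or returned to) the shelf.

DETECTION_LINE_Y = 300  # pixel row of the virtual "object entry detection line"

def crossed_line(prev_center, curr_center, line_y=DETECTION_LINE_Y):
    """True when the tracked region moved across the detection line between frames."""
    return (prev_center[1] - line_y) * (curr_center[1] - line_y) < 0

def count_books(object_regions):
    """Each object region detected at the crossing moment counts as one book."""
    return len(object_regions)

if crossed_line((320, 280), (322, 310)):
    print(count_books([(300, 305, 40, 25), (350, 308, 42, 24)]))  # -> 2
```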
The moving object and the number-of-books association unit 352 associates the number of books recognized by the number-of-books recognition unit 351 with the person who took the book.
When the number of books cannot be recognized, the unspecified number of books determining unit 353 associates the fact that the number of books cannot be recognized with the moving object Mo.
The number-of-books management unit 354 associated with a person continuously manages a number-of-books count list associated with the ID of the moving object Mo, utilizing the position information management DB 132 and the like. When books are taken from a shelf, the number of books taken is added; conversely, when books are returned to the shelf, the number of returned books is subtracted.
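As an illustrative sketch (assumed data structure; cf. also the "settled" state referred to in step S211 below), the per-person count list could be kept as follows:

```python
# Minimal sketch (assumed structure): counts are added when books are taken from a
# shelf, subtracted when returned, and marked "settled" after payment (cf. step S211).

count_list = {}  # moving object ID -> {"count": int, "settled": bool}

def books_taken(mo_id, n):
    entry = count_list.setdefault(mo_id, {"count": 0, "settled": False})
    entry["count"] += n

def books_returned(mo_id, n):
    count_list[mo_id]["count"] -= n

def mark_settled(mo_id):
    count_list[mo_id]["settled"] = True

books_taken(1, 3)
books_returned(1, 1)
mark_settled(1)
print(count_list)  # {1: {'count': 2, 'settled': True}}
```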
When books are handed over between moving object areas, the moving object area transfer recognition unit 355 using the ceiling camera transfers the number-of-books information associated with each moving object Mo to the moving object Mo that received the books.
Note that the moving object area transfer recognition unit 355 using the ceiling camera may recognize the handover by analyzing the movement of a person using an object recognition method such as deep learning, may recognize a hand within the moving object area at the time of the handover, or may recognize an overlap between moving object areas (which may include hands). The moving object area transfer recognition unit 355 using the ceiling camera may use the shelf camera 311 instead of the ceiling camera 310; in that case, the shelf camera 311 may be a camera capable of imaging a wide range.
The handed-over number-of-books recognition unit 356 using the ceiling camera recognizes the number of books when books are handed over between moving object areas. For example, it recognizes the number of books from the captured image at the moment the handover is recognized.
The handed-over number-of-books recognition unit 356 using the ceiling camera may be provided with a ceiling camera having a zoom function, zoom in on the location where the handover is presumed to have taken place, and recognize the number of books.
The handed-over number-of-books recognition unit 356 using the ceiling camera may also recognize the number of books by using, instead of the ceiling camera 310, a shelf camera 311 capable of imaging a wide range.
Further, the handed-over number-of-books recognition unit 356 using the ceiling camera associates each moving object Mo identified by the moving object area transfer recognition unit 355 using the ceiling camera with the number of books recognized here, and updates the number-of-books list.
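A minimal sketch of how the count list might be adjusted on such a handover (assumed structure, continuing the illustrative count-list sketch above):

```python
# Minimal sketch (assumed structure): when a handover between two tracked shoppers is
# recognised, the recognised number of books is moved from the giver's entry to the
# receiver's entry so that the per-person count list stays consistent.

def hand_over_books(count_list, giver_id, receiver_id, n_books):
    count_list.setdefault(receiver_id, {"count": 0, "settled": False})
    count_list[giver_id]["count"] -= n_books
    count_list[receiver_id]["count"] += n_books

counts = {1: {"count": 2, "settled": False}}
hand_over_books(counts, giver_id=1, receiver_id=2, n_books=1)
print(counts)  # {1: {'count': 1, ...}, 2: {'count': 1, ...}}
```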
Next, a book settlement method in the product recognition system according to the second embodiment will be described mainly with reference to FIG. 22 and also with reference to FIG.
FIG. 22 is a flowchart illustrating book settlement processing in the second embodiment.
In step S201, when the shopper (moving object Mo) enters the store through the entrance / exit 22 of the store (FIG. 14), the ceiling camera 310 installed near the entrance / exit 22 starts imaging the shopper.
When the shopper advances along the aisle 24, the ceiling camera 310 on the back side images the shopper. In this way, the plurality of ceiling cameras 310 constantly image the shopper. The gate 21 may be provided near the entrance 22.
When the doorway 22 is provided with the gate 21, the gate 21 is always closed, but the gate 21 is opened at the timing when the shopper enters the store and is closed after entering the store. Note that the personal authentication unit 320 may perform personal authentication of the shopper and acquire the personal information of the shopper before step S201.
In step S202, the moving object discovery unit 3302 using the ceiling camera extracts only the area of the shopper imaged in step S201 (the moving object area), defines it as the moving object Mo, issues an ID for the moving object Mo, and registers the ID and the in-store position information associated with the ID in the position information management DB 132 or in the RAM 303 of the sales floor device 3.
In step S203, when the moving object Mo moves within the range imaged by the ceiling camera 310, the moving object area definition unit 3304 using the ceiling camera newly defines the position of the area of the moving object Mo after the movement. The position information is managed in the position information management DB 132 that manages the positions of persons, in memory, or the like, and is updated each time the area is defined. The defined position is also recognized at positions imaged by other ceiling cameras 310.
In step S204, since the shopper moves through the passage 24 toward the books he or she wants to purchase, the moving object area tracking unit 3305 using the ceiling camera estimates the position of the moving object area and continues to track the moving object Mo. Because the moving object Mo is constantly imaged in step S201, an ID is issued for the moving object Mo in step S202, the position information of the area of the moving object Mo after movement is updated in step S203, and the position information is exchanged by the inter-camera information transfer unit 341, the moving object area tracking unit 3305 using the ceiling camera can continue to track the moving object Mo even when the moving object Mo moves along the passage 24 and is imaged by a different ceiling camera 310.
In step S205, the number-of-books recognizing unit 351 recognizes the number of books taken from the shelf and returned to the shelf by the moving object Mo having the ID.
In step S205, when the number of books taken cannot be counted, an unillustrated number-of-books determination unit outputs an error.
In step S206, the moving object and the book number association unit 352 associates the number of books recognized (counted) in step S205 with the moving object Mo. Therefore, the number-of-books recognition unit 351 recognizes how many books the uniquely identified shopper has taken.
In step S207, the number-of-books management unit 363 associated with the person continues to manage the number of books associated with the person by utilizing the DB management unit 141 of the server 1. Therefore, even when a book picked up from the shelf by the shopper is returned to the shelf, the number-of-books management unit 363 associated with the person recognizes the number of books carried by the shopper.
When the shopper has finished picking up books, the shopper goes to the cashier terminal 2 and places the books in the predetermined area A of the cashier terminal 2. As shown in FIG. 15, each book is placed so that its spine is captured by at least one camera 211 of the cashier terminal 2. The shopper then presses a button, which is the input unit 207 provided in the cashier terminal 2. The cashier terminal 2 uses the pressing of this button or the like as a trigger to perform product identification. Product identification is performed by making full use of image recognition methods to identify the book title from the cover, the spine, or the like of the book.
In step S210, the product identification unit 235 of the cashier terminal 2 verifies whether the number-of-books information associated with the moving object Mo recognized in step S207 matches the number of books placed in the predetermined area A of the cashier terminal 2. The specific flow of this verification will be described later with reference to FIG. 23.
When the number-of-books information associated with the moving object Mo does not match the number of books placed in the predetermined area A of the cashier terminal 2, an eye inspection (confirmation of the number of books) at the eye inspection terminal Q is requested. If the number of books is not confirmed by the eye inspection (the mismatch in the number of books is not resolved), an error is displayed on the error display unit 151 and settlement cannot be performed. If the shopper attempts to leave the store in that state, the error display unit 151 installed near the doorway 22 issues a warning by sound, light, or the like. If the gate 21 is installed, it remains closed.
In step S210, the trade-restricted product determination unit 236 also determines whether or not a book placed on the cashier terminal 2 is subject to a sale restriction. Since the personal information is acquired in advance at an arbitrary timing up to step S207 (including before step S201), when a shopper who is not permitted to purchase a restricted book attempts to purchase it, the trade-restricted product determination unit 236 causes the error display unit 151 to display an error.
In step S211, the settlement unit 237 of the cashier terminal 2 settles the total amount of money of the books placed in the predetermined area A. That is, as described in the first embodiment, the product specification unit 235 obtains information such as the price of a book and calculates the total price. The shopper pays with a credit card, electronic money, or the like to complete the settlement.
When the settlement is completed, the DB management unit 141 updates the number-of-volumes count list associated with the person to the state of “settlement completed”.
Further, when a shopper attempts to take an unsettled book out of the store near the doorway 22, that is, when the number-of-books count list associated with the moving object Mo is not in the "settled" state, a take-out detection unit detects this, and the error display unit 151 near the doorway 22 and the presentation unit 210 of the cashier terminal 2 announce it by sound, light, or the like. When the gate 21 is installed, the gate 21 remains closed and the shopper cannot leave the store. In addition to keeping the gate 21 closed, the error display unit 151 or the like may issue a warning by sound, light, or the like. This makes it possible to prevent shoplifting.
The error display unit 151 and the presentation unit 210 of the cashier terminal 2 may be made to emit light in different colors depending on the error state.
FIG. 23 is a flowchart explaining the verification, in step S210 of FIG. 22, of the number-of-books information against the number of books being settled.
In step S221, when the reading start button of the input unit 207 of the cashier terminal 2 is pressed, the image acquisition unit 232 of the cashier terminal 2 acquires a captured image of the objects placed in the predetermined area A of the cashier terminal 2; then, for each object whose existence is recognized, the product identification unit 235 identifies which product it is and also recognizes the number of products.
In step S222, the person and number-of-books association unit 362 of the number-of-books counting unit 350 of the sales floor device 3 queries, from the ID of the moving object Mo, the information on the number of books placed on the cashier terminal 2.
In step S223, the product identification unit 235 determines whether the number of books placed on the cashier terminal 2 and the number of books counted by the number management unit associated with a person match.
If the number of books placed on the cashier terminal 2 and the number of books counted by the book number management unit associated with the person match (YES), the process proceeds to step S226.
If the number of books placed on the cashier terminal 2 does not match the number of books counted by the book number management unit associated with the person (NO), the process proceeds to step S224.
In step S224, the product identification unit 235 transmits the object captured image and the number of books counted by the sales floor device 3 to the eye inspection terminal Q via the eye inspection result acquisition unit 239, thereby requesting an eye inspection (confirmation of the number of books) at the eye inspection terminal Q, and acquires the result of that eye inspection.
In step S225, the product identification unit 235 determines whether or not a visual inspection result indicating that the discrepancy in the number of books has been resolved is obtained.
When it is determined in step S225 that the eye inspection result indicating that the discrepancy in the number of books has been resolved is not acquired, the error display unit 151 of the cashier terminal 2 displays an error and issues a warning.
At this time, the purchaser and the eye examiner may talk with each other via the microphone M and the speaker S provided in the cashier terminal 2.
If an attempt is made to leave the store while the warning is given, the error display section 151 near the entrance 22 gives a warning by sound or light. When the gate 21 is installed, the gate 21 remains closed and the shopper cannot exit. Thereby, shoplifting can be prevented.
If it is determined in step S225 that the visual inspection result indicating that the discrepancy in the number of books has been resolved is acquired, the process proceeds to step S226.
In step S226, the product identification unit 235 determines whether the specific object inspection mode or the all object inspection mode is set.
If the specific object visual inspection mode is set, the process proceeds to step S227.
If the all-object visual inspection mode is set, the process proceeds to step S228.
In step S227, the eye inspection result acquisition unit 239 requests eye inspection (specification of the product) at the eye inspection terminal Q for the target book, and acquires the eye inspection result.
After step S227, the process proceeds to step S229.
In step S228, the product identification unit 235 determines whether there is any book placed in the predetermined area A of the cashier terminal 2 that cannot be identified. If there is a book that the product identification unit 235 cannot identify (YES in step S228), the process proceeds to step S227.
If there is no book that the product identification unit 235 cannot identify in step S228 (NO in step S228), the product identification unit 235 identifies each book together with information held in the DB information holding unit 241 or the storage unit of the server 1, such as the book title, the price, and whether the book is a trade-restricted product. The process then proceeds to step S229. The information on the identified books may be output to the display control unit 238.
In step S229, the trade-restricted product determination unit 236 determines whether the book identified by the product identification unit 235 is a book that requires age confirmation.
If it is determined in step S229 that the book identified by the product identification unit 235 is a book that requires age confirmation, that is, if YES, the process proceeds to step S230.
If it is determined in step S229 that the book identified by the product identification unit 235 is not a book that requires age confirmation, that is, if NO, the process returns to the book automatic settlement process.
In step S230, the display control unit 238 causes the display unit D of the cashier terminal 2 to display a screen for age confirmation. However, if the personal information of the shopper is acquired and it is not necessary to confirm the age here, step S230 is skipped and the process proceeds to step S234.
At this time, if there is a problem with the purchaser's age verification, for example when the image of the purchaser's face or of the age confirmation document is unclear, the cashier terminal 2 may notify the purchaser by a screen display, voice guidance, or the like and ask the purchaser to retake the image. Further, the purchaser and the eye examiner may talk with each other via the microphone M and the speaker S provided in the cashier terminal 2.
In step S231, the eye inspection result acquisition unit 239 requests the eye inspection terminal Q to perform eye inspection (determination of trade-restricted products) for the target book, and acquires the eye inspection result.
In step S232, the trade-restricted product determination unit 236 determines whether or not an instruction to cancel the trade restriction has been received.
If it is determined in step S232 that the instruction to cancel the trading restriction has not been received, the process proceeds to step S233.
If it is determined in step S232 that the cancel instruction for canceling the trading restriction has been received, that is, if YES, the process proceeds to step S234.
In step S233, the eye inspection result transmission unit 412 of the eye inspection terminal Q transmits a warning that the trade restriction was not released even as a result of the eye inspection. Upon receiving this warning, for example, the display control unit 238 of the cashier terminal 2 presents, via the output unit 206, a warning to the effect that the trade restriction was not canceled even by the result of the eye inspection. After step S233, the process ends and the settlement is stopped.
In step S234, the trade restriction product determination unit 236 cancels the trade restriction.
In this way, when step S234 is ended or it is determined in step S229 that the product is not age-restricted (NO is determined), the process returns to the book automatic settlement process.
In this way, the information processing system recognizes the number of books taken by the shopper in the store and determines whether that number matches the number of books placed on the cashier terminal. In addition, when a case becomes subject to eye inspection (for example, when the numbers of books do not match, when a product cannot be identified at the cashier terminal 2, or when a book corresponds to a trade-restricted product), an eye inspection is performed at the eye inspection terminal Q, and automatic settlement is performed according to the eye inspection result (the result of confirming the number of books, the result of identifying a book, the result of determining a trade-restricted product, and so on).
Therefore, according to the present information processing system, when the shopper purchases a product such as a book, it is possible to automate the settlement of the price of the product and improve the product identification accuracy.
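The branching of FIG. 23 can be summarised by the following sketch. It only mirrors the decisions described above (count match, eye-inspection mode, fallback to the eye inspection terminal Q); the request_eye_inspection callback is a hypothetical stand-in for the actual communication with the terminal.

```python
def verify_books(placed_count, tracked_count, mode, unidentified,
                 request_eye_inspection):
    """Return 'proceed' when settlement may continue, 'error' otherwise.
    `request_eye_inspection(kind)` is a hypothetical callback that returns
    True when the eye examiner resolves the issue."""
    if placed_count != tracked_count:
        # Steps S224/S225: ask the eye examiner to confirm the number of books.
        if not request_eye_inspection("count"):
            return "error"
    if mode == "specific_object":
        # Step S227: eye inspection of the flagged books only.
        request_eye_inspection("identify")
    elif unidentified:
        # Step S228 YES: some books could not be identified automatically.
        request_eye_inspection("identify")
    return "proceed"

print(verify_books(3, 3, "all_object", unidentified=False,
                   request_eye_inspection=lambda kind: True))   # proceed
```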
In the second embodiment, errors expected in tracking the moving object Mo (shopper) include: (A) a case where the moving object finding unit 3302 using the ceiling camera cannot detect a moving object when the shopper enters the store 20 through the doorway 22; (B) a case where the moving object area tracking unit 3305 using the ceiling camera loses track of the moving object Mo being tracked; and (C) a case where the IDs associated with two or more different moving objects Mo are swapped during tracking. The errors are, of course, not limited to these.
Depending on each error state, the system adopted in this embodiment takes various actions including the following examples.
(A) An image captured by the ceiling camera 310 that includes the shopper as a subject is transmitted to the eye inspection terminal Q, and detection of the shopper by eye inspection is requested. When the eye examiner detects the shopper, a new moving object Mo is defined, an ID is issued, and tracking by the moving object area tracking unit 3305 using the ceiling camera is started. If the eye examiner cannot detect the shopper, a store clerk is notified to that effect.
(B) When a shopper who is not associated with any ID registered in the position information management DB 132 is recognized, the captured image of that shopper and a list of candidate IDs to be associated are transmitted to the eye inspection terminal Q. The eye examiner associates the most appropriate ID with the shopper based on past association information between shoppers and IDs, and re-tracking by the moving object area tracking unit 3305 using the ceiling camera is started. If the re-tracking cannot be started for some reason, the store clerk is informed accordingly.
In addition, when an ID that has lost its association with a shopper is detected, that ID and a list of captured images of shoppers who are not associated with any ID are transmitted to the eye inspection terminal Q. The eye examiner attempts to associate a shopper with that ID based on past association information between shoppers and IDs, and when the association succeeds, re-tracking by the moving object area tracking unit 3305 using the ceiling camera is started. If the start of re-tracking fails for some reason, the store clerk is informed accordingly.
(C) A list of images of each shopper and a list of the IDs that should be associated with each shopper are transmitted to the eye inspection terminal Q. The eye examiner tries to allocate the IDs to the shoppers as appropriately as possible based on past association information between shoppers and IDs. When the IDs can be appropriately allocated to the shoppers, re-tracking by the moving object area tracking unit 3305 using the ceiling camera is started. If the IDs cannot be properly allocated, the allocation is abandoned and the store clerk is informed that the associations remain swapped.
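A minimal sketch of how the three tracking-error cases might be routed to the eye inspection terminal Q follows; the payload keys and task names are illustrative assumptions, not part of the patent.

```python
def build_inspection_request(error_type, context):
    """Map a tracking error (A/B/C as described above) to the data that
    would be sent to the eye inspection terminal Q (hypothetical format)."""
    if error_type == "A":          # shopper entered but was not detected
        return {"task": "detect_shopper", "image": context["entrance_image"]}
    if error_type == "B":          # tracking lost / orphaned ID
        return {"task": "reassign_id",
                "shopper_image": context["shopper_image"],
                "candidate_ids": context["candidate_ids"]}
    if error_type == "C":          # IDs swapped between shoppers
        return {"task": "reallocate_ids",
                "shopper_images": context["shopper_images"],
                "ids": context["ids"]}
    raise ValueError(f"unknown error type: {error_type}")

print(build_inspection_request("A", {"entrance_image": "frame_0012.png"}))
```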
[Embodiment 3]
FIG. 24 is a diagram showing a layout example of a supermarket adopting the product recognition system of the third embodiment.
The product recognition system of the third embodiment is a system applied to a store 30 such as a supermarket as shown in FIG.
The store 30 has a sales floor extending from the entrance 31 to the exit 32. A plurality of shelf racks 33 for displaying products are installed in the sales floor. A passage 34 is formed between two shelf racks 33 facing each other.
Baskets and carts (not numbered), such as shopping baskets and shopping carts, are placed at the entrance 31. The area in front of the exit 32 is the settlement area 35. A plurality of checkout counters 36 are installed in the settlement area 35, and the checkout counters 36 are provided with n settlement machines 4. The third embodiment does not include the cashier terminal 2 employed in the first and second embodiments. Further, in the third embodiment, unlike the second embodiment in which a shopper is treated as the moving object Mo, the baskets grasped (discovered and tracked) by the system are treated as moving objects Mo.
A shopper enters the store through the entrance 31, picks up a basket, and proceeds along the passage 34. The shopper takes products from the shelves, puts them into the basket, and continues along the passage 34. After picking up all the products the shopper wishes to purchase, the shopper proceeds to the settlement area 35 and settles the payment at the settlement machine 4.
A store clerk Mt patrols the passage 34, the settlement area 35, and so on. The store clerk Mt carries an information terminal 9a. The information terminal 9a is a portable information processing terminal such as a smartphone and includes a screen for displaying the state of the store.
In FIG. 24, the inside of the cloud-shaped outline depicts not the inside of the store 30 but the outside of the store 30 and the backyard of the store 30. The server 1 (FIG. 7) is installed outside the store 30. At a management center outside the store 30, in the backyard of the store 30, or the like, a store clerk Mt monitors the inside of the store 30 through the screen of a large monitor (not shown) or the screen of an information terminal 9b. Hereinafter, when it is not necessary to distinguish the information terminal 9a from the information terminal 9b, they are collectively referred to as the "information terminal 9". Further, although not shown in FIG. 24, the eye inspection terminal Q is also installed in the present embodiment, as in the first and second embodiments.
A plurality of ceiling cameras 310 are installed at intervals on the ceiling above the passage 34, the shelf racks 33, and other arbitrary positions in the store 30. Each ceiling camera 310 captures images of the passage 34, the shelf rack 33, and the predetermined area below it. That is, when a moving object Mo enters, the ceiling camera 310 images the predetermined area including the moving object Mo as a subject.
A shelf camera 311 is installed, as an example of a sensing device, at each of a plurality of positions on each shelf of each shelf rack 33. The shelf camera 311 images the inside of the shelf, the products inside the shelf, and other predetermined areas. Further, when a shopper's hand or the like enters the predetermined area in the shelf, or when an object is taken from the shelf, the shelf camera 311 captures the hand or the object in the shelf as an object captured image.
A basket camera 312 (not shown in FIG. 24) may be attached to each of the baskets as an example of a sensing device.
In that case, one or more basket cameras 312 of the sales floor device 3 are installed on each basket and constantly image the objects put into the basket. The basket camera 312 is installed so that the inside of the basket can be imaged without blind spots. The basket camera 312 captures an image of at least a characteristic portion, such as the front surface, of an object placed in the basket.
Further, the basket camera 312 is linked with the ceiling camera 310 and the shelf camera 311 via the network N. Through this cooperation, the sales floor device 3 can share the captured images taken by the basket camera 312 and the captured images of objects taken by the ceiling camera 310 or the shelf camera 311. This sharing makes it possible to improve the accuracy of identifying a product from the object appearing in the captured images.
The product recognition system according to the third embodiment has a function of continuously tracking the moving object Mo even if the moving object Mo captured by the ceiling camera 310 moves. The information processing system according to the third embodiment has a function of identifying which product the object picked up from the shelf rack 33 is from the captured image captured by the shelf camera 311.
The product recognition system of the third embodiment may have a function of identifying which product the object put in the basket is from the captured image captured by the basket camera 312.
The product recognition system according to the third embodiment has a function of identifying which product each object taken from the shelf is, and of automatically settling the price of that product at the settlement machine 4. At the time of this automatic settlement, the settlement machine 4 reads out the product information associated with the moving object Mo, which makes the automatic settlement possible.
The settlement machine 4 includes the functions necessary for completing the purchase, such as display of the total price, the number of items, and the details of the purchased products, display of the trade-restricted product determination, and a payment function.
The ceiling camera 310, the shelf camera 311, and the basket camera 312 are incorporated in the sales floor device 3 as shown in FIGS. 25 and 26. The sales floor device 3 has a function of specifying a product from a captured image captured by the ceiling camera 310, the shelf camera 311 and the like, and a function of discovering and tracking a moving object Mo. This sales floor device 3 is incorporated in a product recognition system as shown in FIG. 25.
FIG. 25 is a configuration diagram showing the configuration of the product recognition system of the third embodiment.
The product recognition system according to the third embodiment includes the server 1, the sales floor device 3, the settlement machines 4, and the eye inspection terminal Q. In the third embodiment, since the sales floor device 3 identifies which product each object is, a settlement machine 4 is provided instead of the cashier terminal 2 described in the first and second embodiments.
In FIG. 25, only one server 1 and one sales floor device 3 are drawn, but in reality there may be a plurality of each. Hereinafter, when it is not necessary to distinguish the settlement machines 4 individually, they are collectively referred to as the "settlement machine 4".
The server 1, the sales floor device 3, the settlement machine 4, and the inspection terminal Q are mutually connected via a network N such as the Internet.
The server 1 is configured similarly to the server 1 (FIG. 7) of the first embodiment. The eye inspection terminal Q has the same configuration as the eye inspection terminal Q of the first embodiment.
FIG. 26 is a block diagram showing the hardware configuration of the sales floor device 3 in the product recognition system of FIG.
The sales floor device 3 includes a CPU 301, a ROM 302, a RAM 303, a bus 304, an input/output interface 305, a ceiling camera 310, a shelf camera 311, a basket camera 312, a communication unit 315, and an information terminal 9.
The CPU 301, the ROM 302, the RAM 303, the bus 304, the input / output interface 305, and the communication unit 315 of the sales floor device 3 are configured similarly to those of the server 1 illustrated in FIG. 7.
The ceiling camera 310, the shelf camera 311, the communication unit 315, and the information terminal 9 of the sales floor device 3 are configured similarly to those of the sales floor device 3 (FIG. 17) described in the second embodiment.
FIG. 27 is a functional block diagram showing an example of the functional configuration of the server 1, the sales floor device 3, the settlement machine 4, and the eye inspection terminal Q.
The server 1 includes a CPU 101, a storage unit 108, a communication unit 109, an error display unit 151, and an error canceling unit 152.
These are configured similarly to the second embodiment shown in FIG.
In the eye inspection terminal Q, when a request for an eye inspection is made by the sales floor device 3 or the settlement machine 4, the image display control unit 411 outputs to the output unit 106 images such as the object captured image or the product captured image transmitted from the sales floor device 3 or the settlement machine 4, together with the various information (such as the product candidate list) transmitted along with those images.
The eye inspection result transmission unit 412 transmits, to the sales floor device 3 or the settlement machine 4 that requested the eye inspection, the eye inspection result (the product identification result, the trade-restricted product determination result, or the like) that the eye examiner has input via the input unit 107 for the object captured image or the product captured image output to the output unit 106.
In the CPU 301 of the sales floor device 3, as shown in FIG. 27, a personal authentication unit 320, a moving object tracking unit 330, a position information management unit 340, a shelf product recognition unit 360, a basket product recognition unit 370, a trade-restricted product determination unit 380, and an eye inspection result acquisition unit 301a function.
The personal authentication unit 320 includes a personal information acquisition unit 321.
The personal authentication unit 320 and the personal information acquisition unit 321 are configured similarly to the second embodiment shown in FIG.
FIG. 28 is a functional block diagram showing a detailed functional configuration example of the moving object tracking unit 330 provided in the sales floor device 3 of FIG. 27.
As shown in FIG. 28, the moving object tracking unit 330 includes a moving object finding unit 3302 using the ceiling camera, a basket finding unit 3303 using the ceiling camera, a basket area definition unit 3306 using the ceiling camera, a basket area tracking unit 3307 using the ceiling camera, a grouping unit 3308, an inter-basket-area transfer recognition unit 3310 using the ceiling camera, a transferred object recognition unit 3312 using the ceiling camera, and a transferred product identification unit 3313 using the ceiling camera.
However, the moving object tracking unit 330 of the third embodiment handles baskets that are grasped (discovered / tracked) on the system as the moving object Mo, unlike the second embodiment that handles a shopper as a moving object.
The moving object tracking unit 330 is connected to the ceiling cameras 310 via USB cables, the network N, or the like. Each ceiling camera 310 is thereby linked with the other ceiling cameras 310, personal computers, and so on.
The moving object finding unit 3302 using the ceiling camera finds objects moving in the store (shoppers, baskets, carts, and the like) by using a state space model (a Bayesian filter or the like) based on the captured images taken by the ceiling cameras 310.
The ceiling camera-based basket discovery unit 3303 discovers baskets (moving objects Mo) from the objects moving in the store 30 discovered by the moving object detection unit 3302, and assigns individual IDs to the moving objects Mo. . The ID of the moving object Mo is continuously used until a predetermined timing such as leaving the store or completing settlement.
As a method of discovering the moving object Mo among the objects moving in the store 30, for example, one or more markers holding individually identifiable information are attached to each basket, and the basket finding unit 3303 using the ceiling camera detects the moving object Mo by using these markers as landmarks. Any marker, such as a two-dimensional code or a characteristic shape, may be used as long as it can identify the moving object Mo.
As another method for finding the moving object Mo from the object moving in the store 30, the basket finding unit 3303 using the ceiling camera may use information unique to the basket such as color information and shape information of the baskets. At that time, the basket finding unit 3303 using the ceiling camera can find the moving object Mo because the colors and shapes of the baskets can be distinguished from the floor surface and the like.
As a method of discovering the moving object Mo from an object moving in the store 30, thermography or the like may be used with the baskets kept at a low temperature. At that time, the basket finding unit 3303 using the ceiling camera finds the moving object Mo that has been cooled to a low temperature from the temperature difference between the basket and the region other than the baskets.
As a method of finding the moving object Mo from the object moving in the store 30, thermography or the like may be used by generating harmless gas from the baskets. At that time, the basket finding unit 3303 using the ceiling camera detects a temperature change associated with the generation of harmless gas generated from the baskets by thermography or the like, and discovers the moving object Mo.
As a method of discovering the moving object Mo from the object moving in the store 30, sound may be generated from the baskets regardless of whether or not the sound is audible. At that time, the basket finding unit 3303 using the ceiling camera finds the moving object Mo by detecting the sound generated from the baskets.
As a method of discovering the moving object Mo from an object moving in the store 30, a sensor that applies invisible paint to a basket and can recognize the invisible paint may be used. At that time, the basket finding unit 3303 using the ceiling camera finds the moving object Mo by the sensor recognizing the invisible paint applied to the baskets.
As a method of finding the moving object Mo from the object moving in the store 30, the baskets may be irradiated with visible light, infrared rays, or the like from the ceiling camera 310. At that time, the basket finding unit 3303 with the ceiling camera finds the moving object Mo by receiving the visible light or the reflected light of infrared rays with which the baskets are irradiated.
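As one concrete reading of the colour-based discovery mentioned above, the sketch below masks an assumed basket-specific colour in a ceiling-camera frame with NumPy and reports whether a basket-sized blob is present. The colour range and area threshold are tuning assumptions, not values from the patent, and the other discovery methods (markers, thermography, gas, sound, invisible paint, reflected light) are not modelled here.

```python
import numpy as np

# Assumed RGB range of the basket colour and minimum blob size (tuning values).
BASKET_LOW = np.array([0, 80, 150])
BASKET_HIGH = np.array([80, 160, 255])
MIN_PIXELS = 500

def find_basket(frame_rgb):
    """Return the centroid (row, col) of basket-coloured pixels, or None."""
    mask = np.all((frame_rgb >= BASKET_LOW) & (frame_rgb <= BASKET_HIGH), axis=-1)
    if mask.sum() < MIN_PIXELS:
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())

# Synthetic frame: a 40x40 "basket" patch on a grey floor.
frame = np.full((480, 640, 3), 120, dtype=np.uint8)
frame[200:240, 300:340] = (40, 120, 200)
print(find_basket(frame))   # approximately (219.5, 319.5)
```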
The ceiling camera basket area definition unit 3306 defines, as the area of the moving object Mo, a certain range within the captured image around the moving object Mo found by the ceiling camera basket finding unit 3303. To define the area of the moving object Mo, for example, when a marker is attached to the baskets as described above, a certain range from the attachment position where the marker is attached is defined as the area of the moving object Mo.
When a plurality of markers are attached to a basket, each marker holds information on its own position within the area of the moving object Mo. Therefore, even if the ceiling camera 310 cannot capture all of the markers, the basket area definition unit 3306 using the ceiling camera may define the area of the moving object Mo from one or more markers.
The basket area definition unit 3306 using the ceiling camera may use a method that complements the above-described method for defining the basket area. As a complementary method, when the baskets are cooled to a low temperature as described above, the region of low temperature is defined as the region of the moving object Mo from thermography and image recognition.
As another complementary method, in the case where a harmless gas is generated from the baskets as described above, the temperature change accompanying the generation of the gas is detected by thermography or the like, and the area in which the temperature change occurred is defined as the area of the moving object Mo, also making use of image recognition techniques.
As yet another complementary method, when the invisible paint is applied to the edges of the baskets as described above, the basket area definition unit 3306 using the ceiling camera estimates the edge of the basket from the positions where the paint is applied and defines the area enclosed by that edge as the area of the moving object Mo.
As yet another complementary method, when the ceiling camera 310 irradiates the baskets with visible light, infrared rays, or the like as described above, the basket area definition unit 3306 using the ceiling camera defines the area of the moving object Mo from the measurement result of the reflection.
The basket area tracking unit 3307 using the ceiling camera estimates (defines) the position of the area of the moving object Mo. The basket area tracking unit 3307 using the ceiling camera tracks the moving object Mo under the same ID, from the time the moving object Mo is found until a predetermined time such as when the shopper leaves the store or the settlement is completed, and keeps grasping its position information.
For this purpose, the basket area tracking unit 3307 using the ceiling camera keeps track of the basket area in the captured images at all times, for example by having a large number of ceiling cameras 310 cooperate. To allow the ceiling cameras 310 to cooperate, the tracking of an area captured by one ceiling camera 310 is handed over to the captured images of the adjacent ceiling cameras. The basket area tracking unit 3307 using the ceiling camera stores the position information of the moving object Mo being tracked in the position information management DB 132 of the server 1 or in the storage unit 108.
As another method for the basket area tracking unit 3307 using the ceiling camera to track the moving object Mo, a marker holding information that identifies the moving object Mo is attached to each basket, and the ceiling camera 310 images the moving object Mo including the marker. The basket area tracking unit 3307 using the ceiling camera finds the moving object Mo and acquires its position information by extracting the marker from the captured image.
Even if the moving object Mo moves, the basket area tracking unit 3307 using the ceiling camera can continue to track the moving object Mo by finding the marker from the captured image and acquiring the position information of the moving object Mo.
As another method of tracking the moving object Mo, the basket area tracking unit 3307 using the ceiling camera may track the moving object Mo by using an in-image object tracking technique such as a Bayesian filter, the fast Fourier transform, or TLD.
Alternatively, the basket area tracking unit 3307 using the ceiling camera may track the moving object Mo by estimating, based on feature data such as the colors and shapes of the baskets, that moving objects Mo from which the same feature data is obtained are the same moving object Mo. In that case, the basket area tracking unit 3307 using the ceiling camera keeps collecting the feature data of the tracking target.
In any case, in the images from the ceiling camera 310, a moving object Mo that does not directly face the ceiling camera 310 is captured at an angle (from an oblique direction), so the position information may not be acquired accurately. It is therefore conceivable to apply a calibration-based correction to the captured image so that the object appears as if imaged head-on. However, even with such a correction, the position information of the moving object Mo may not be acquired with high accuracy.
Therefore, the basket area tracking unit 3307 using the ceiling camera acquires high-accuracy position information by acquiring height information of the moving object Mo using a distance sensor or the like. In this way, the basket area tracking unit 3307 using the ceiling camera may continue to track the moving object Mo.
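A minimal nearest-centroid tracker is sketched below as one possible reading of keeping the same ID from discovery until checkout. Real implementations would use the Bayesian filters or feature matching mentioned above, and the maximum jump distance here is an assumption.

```python
import itertools
import math

class BasketTracker:
    """Assign a persistent ID to each basket centroid and keep it across frames."""
    def __init__(self, max_jump=80.0):
        self.max_jump = max_jump            # max movement (pixels) between frames
        self.tracks = {}                    # id -> last known (x, y)
        self._ids = itertools.count(1)

    def update(self, detections):
        assigned = {}
        for point in detections:
            # Find the closest existing track within the allowed jump distance.
            best = min(self.tracks.items(),
                       key=lambda kv: math.dist(kv[1], point),
                       default=None)
            if best and math.dist(best[1], point) <= self.max_jump:
                track_id = best[0]
            else:
                track_id = next(self._ids)  # new basket found: issue a new ID
            self.tracks[track_id] = point
            assigned[track_id] = point
        return assigned

tracker = BasketTracker()
print(tracker.update([(100, 100)]))         # {1: (100, 100)}
print(tracker.update([(110, 104)]))         # still ID 1
```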
The grouping unit 3308 may associate a plurality of baskets (moving objects Mo) with one another. By linking them in this way, the product lists of the individual moving objects Mo can be settled together at the settlement machine 4 installed in the settlement area 35.
The inter-basket-area transfer recognition unit 3310 using the ceiling camera recognizes, by means of the ceiling cameras 310 and the like, that an object has been transferred (moved in or out) between moving objects Mo. The inter-basket-area transfer recognition unit 3310 using the ceiling camera may recognize the transfer by recognizing an overlap between the areas of the moving objects Mo. The inter-basket-area transfer recognition unit 3310 using the ceiling camera identifies the moving objects Mo between which the object was transferred, and reads the product list associated with the ID of each moving object Mo.
The transferred object recognition unit 3312 using the ceiling camera defines the area of the object from the captured image at the time the transfer is recognized.
The transferred object recognition unit 3312 using the ceiling camera may use a zoomable camera as the ceiling camera 310 and zoom in on the location where the transfer is presumed to have taken place in order to define the area of the object.
The transferred product identification unit 3313 using the ceiling camera identifies, from the image taken after the object area has been defined, which product in the product lists read for the moving objects Mo the transferred object is, associates the product identified in the transfer with the moving objects Mo identified by the inter-basket-area transfer recognition unit 3310 using the ceiling camera, and updates the list of products associated with each moving object Mo.
The transferred object recognition unit 3312 and the transferred product identification unit 3313 may be realized by a shelf camera 311 or the like capable of imaging a wide area, instead of the ceiling camera 310.
The position information management unit 340 is configured similarly to the position information management unit 340 described in the second embodiment.
That is, the position information management unit 340 includes an inter-camera information transfer unit 341, a position definition unit 342 of each camera, and a moving object display unit 343, as shown in FIG.
FIG. 29 is a functional block diagram showing a detailed functional configuration of the shelf product recognition unit 360 provided in the sales floor device 3 of FIG.
The shelf product recognition unit 360 includes an object recognition unit 3602 using the shelf camera, a product identification unit 3603 using the shelf camera, a basket and product association unit 3605, a basket-associated product list management unit 3607, an object entry/exit detection unit 3608 using the shelf camera, a product non-identification determination unit 3609, a label recognition unit 3610, a discount sticker recognition unit 3611, and a basket entry/exit detection unit 3612 using the shelf camera or the ceiling camera.
The shelf product recognition unit 360 is connected to the shelf cameras 311 via USB cables or a network and is linked with other cameras, personal computers, and the like.
The object recognition unit 3602 using the shelf camera compares the images taken before and after an object is taken from the shelf or placed (returned) in the shelf, and defines the image area that is the target of product identification (area definition).
That is, the object recognition unit 3602 using the shelf camera compares the object captured image from before the object was taken from or placed on the shelf with the object captured image from after the object was taken from or placed on the shelf, and specifies the image area that has changed. When specifying the image area, the object recognition unit 3602 using the shelf camera checks the change in each of the RGB data.
Further, with the object entry/exit detection unit as a trigger, the object recognition unit 3602 using the shelf camera may define the area of the object from only a single object captured image, using a method different from the comparison of the object captured images taken before and after the object is taken from or placed on the shelf.
In order to avoid a case where, because the color data of the objects is the same, an actual change such as an entry or exit is wrongly judged to be no change, the object recognition unit 3602 using the shelf camera may also define the area by using the shadow of the object.
The information on the recognized object area is handed over to the product identification unit 3603 using the shelf camera.
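A minimal sketch of the before/after comparison follows: per-channel absolute differences are thresholded and the bounding box of the changed pixels is returned as the candidate product region. The threshold value is an assumption.

```python
import numpy as np

def changed_region(before_rgb, after_rgb, threshold=30):
    """Return the bounding box (top, left, bottom, right) of pixels whose
    RGB values changed by more than `threshold`, or None if nothing changed."""
    diff = np.abs(after_rgb.astype(np.int16) - before_rgb.astype(np.int16))
    changed = np.any(diff > threshold, axis=-1)   # check each RGB channel
    if not changed.any():
        return None
    rows, cols = np.nonzero(changed)
    return rows.min(), cols.min(), rows.max() + 1, cols.max() + 1

before = np.zeros((100, 100, 3), dtype=np.uint8)
after = before.copy()
after[20:40, 50:70] = 200          # a product was removed or placed here
print(changed_region(before, after))   # (20, 50, 40, 70)
```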
The product identification unit 3603 using the shelf camera identifies which product the object in the shelf recognized by the object recognition unit 3602 using the shelf camera is. The product identification unit 3603 using the shelf camera lists up product candidates by image processing techniques such as specific object recognition, general object recognition, and deep learning. The listed product candidates are referred to as the "product candidate list S". After that, the product identification unit 3603 using the shelf camera identifies the product with high accuracy by exercising a verification function.
The verification function lists up a "product candidate list P" by an algorithm different from the method used to list up the product candidates described above. The results of the product candidate lists S and P are matched against each other, and the product is identified when the result exceeds a predetermined threshold.
The method of listing up a "product candidate list" may be realized, for example, by matching the image information of an object whose presence has been recognized against the image information held in the product DB 131 or in memory. That is, when the feature information of the two images matches (exceeds a threshold), the object whose presence was recognized by the object recognition unit 3602 using the shelf camera is a product registered in the product DB 131, and the product identification unit 3603 using the shelf camera therefore identifies it as that product registered in the product DB 131. Product candidates are created by deep learning, after which the verification function is exercised to identify the product with high accuracy.
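The cross-checking of the two candidate lists can be sketched as follows; the scores, the combination rule, and the threshold are illustrative assumptions.

```python
def cross_check(list_s, list_p, threshold=1.2):
    """list_s / list_p map product IDs to confidence scores from two
    different recognition algorithms.  Return the best product whose
    combined score exceeds the threshold, else None (send to eye inspection)."""
    combined = {pid: list_s.get(pid, 0.0) + list_p.get(pid, 0.0)
                for pid in set(list_s) | set(list_p)}
    best_id, best_score = max(combined.items(), key=lambda kv: kv[1])
    return best_id if best_score >= threshold else None

S = {"book_123": 0.7, "book_456": 0.2}
P = {"book_123": 0.6, "book_789": 0.5}
print(cross_check(S, P))   # 'book_123' (0.7 + 0.6 = 1.3 >= 1.2)
```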
The product identification unit 3603 using the shelf camera may identify the product over a plurality of captured images, also making use of the captured images taken by the ceiling camera 310, rather than identifying the product from a single frame captured by the shelf camera 311. In that case, the product identification unit 3603 using the shelf camera assigns a percentage to each product candidate, adds to the percentage based on information such as purchase history, time, place, and personal preferences, and identifies the product when the percentage exceeds a certain threshold.
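The percentage-accumulation idea can be illustrated with the sketch below; the per-frame scores, the context bonus, and the threshold are illustrative values only.

```python
def accumulate_candidates(frame_scores, bonus=0.0, threshold=1.0):
    """frame_scores: list of {product_id: score} dicts, one per captured frame
    (shelf camera and ceiling camera alike).  `bonus` stands in for context
    such as purchase history or time of day.  Returns the product whose
    accumulated score first exceeds the threshold, or None."""
    totals = {}
    for scores in frame_scores:
        for pid, score in scores.items():
            totals[pid] = totals.get(pid, 0.0) + score + bonus
            if totals[pid] >= threshold:
                return pid
    return None

frames = [{"milk_1L": 0.4}, {"milk_1L": 0.35, "milk_500ml": 0.2}, {"milk_1L": 0.3}]
print(accumulate_candidates(frames))   # 'milk_1L'
```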
The basket and product association unit 3605 associates the product information of the product identified by the product identification unit 3603 using the shelf camera with the moving object Mo. As a premise, if only one moving object Mo exists within the predetermined area imaged by one shelf camera 311, the moving object Mo and the product information can be associated by specifying the ID of that moving object Mo. In this case, the basket and product association unit 3605 specifies the ID of the moving object Mo using the position information attached to the marker or the like attached to the moving object Mo.
As another premise, if there are a plurality of moving objects Mo within the predetermined area imaged by one shelf camera 311, the basket entry/exit detection unit using the shelf camera or the ceiling camera, described later, detects the entry of an object into a moving object Mo, identifies the moving object Mo into which the object taken from the shelf was put, and associates that moving object Mo with the product information.
As yet another premise, if no moving object Mo is found within the predetermined area imaged by one shelf camera 311, at least one shelf camera 311 or at least one basket camera 312 detects the entry into or exit from a moving object Mo, identifies that moving object Mo, and then associates the identified moving object Mo with the product information.
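The association logic under the three premises above can be reduced to the following sketch; the data structures are hypothetical, and an undecidable case is simply flagged for eye inspection.

```python
def associate_product(product, baskets_in_view, entry_events):
    """Return the basket ID that the product should be linked to.
    baskets_in_view: IDs of moving objects Mo seen by the shelf camera.
    entry_events: basket IDs for which an object entry was just detected."""
    if len(baskets_in_view) == 1:
        return baskets_in_view[0]          # only one basket: link directly
    for basket_id in entry_events:
        if basket_id in baskets_in_view or not baskets_in_view:
            return basket_id               # use the detected entry instead
    return None                            # cannot decide: flag for eye inspection

print(associate_product("book_123", ["basket_7"], []))                         # basket_7
print(associate_product("book_123", ["basket_7", "basket_9"], ["basket_9"]))   # basket_9
```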
The basket-associated product list management unit 3607 keeps managing, until settlement, the product list that associates the moving object Mo with the identified products. That is, the basket-associated product list management unit 3607 uses the position information management DB 132 and the like to continuously manage the list of products associated with the ID of the moving object Mo.
When objects are taken from the shelf, the basket-associated product list management unit 3607 adds the number of products taken to the list. Conversely, when products are returned to the shelf, the basket-associated product list management unit 3607 subtracts the number of products returned.
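A sketch of the per-basket product list is shown below, assuming a simple dictionary keyed by the moving object Mo's ID; the class and method names are hypothetical.

```python
class ProductListManager:
    """Keep the list of products associated with each moving object Mo ID."""
    def __init__(self):
        self.lists = {}                     # basket_id -> {product_id: quantity}

    def add(self, basket_id, product_id, qty=1):
        items = self.lists.setdefault(basket_id, {})
        items[product_id] = items.get(product_id, 0) + qty

    def remove(self, basket_id, product_id, qty=1):
        items = self.lists.get(basket_id, {})
        if items.get(product_id, 0) <= qty:
            items.pop(product_id, None)     # product fully returned to the shelf
        else:
            items[product_id] -= qty

manager = ProductListManager()
manager.add("basket_7", "book_123")
manager.add("basket_7", "book_123")
manager.remove("basket_7", "book_123")
print(manager.lists)   # {'basket_7': {'book_123': 1}}
```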
The object entering / exiting detection unit 3608 by the shelf camera can act as a trigger for activating the object recognition unit 3602 by the shelf camera by detecting that an object has entered the shelf.
For example, the object entry / exit detection unit 3608 by the shelf camera detects the entry of the object into the shelf from the change in the image data in the “entry detection area” set in the captured image of each shelf camera 311. Further, the object entrance / exit detection unit of the shelf camera also detects that the object exits the area by tracking the incoming object in the image.
In order to track objects, the object entry/exit detection unit 3608 using the shelf camera scatters the particles of a particle filter over the entry detection area in the shelf. So that a plurality of objects can be tracked simultaneously, as when several shoppers reach into the shelf at the same time, particles are scattered over the entry detection area again after the first object has entered, in preparation for the next entry.
However, on the premise that a plurality of objects do not enter the same area, the object entry/exit detection unit 3608 using the shelf camera does not scatter particles over an entry detection area that overlaps an area where an object already exists.
The object entry/exit detection unit 3608 using the shelf camera determines "object in" when, in the predetermined area, the proportion of particles having a certain likelihood is equal to or greater than a threshold, and determines "object out" when the proportion of particles having a certain likelihood is less than the threshold.
So that the area of as few objects as possible is estimated at one time, the object entry/exit detection unit using the shelf camera detects entry and exit each time a product enters.
The images from before and after the change are stored in the storage unit 108 of the server 1 so that they can be used for area estimation.
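The in/out decision described above can be illustrated with a very reduced particle model: the fraction of particles whose likelihood exceeds a fixed floor decides between "object in" and "object out". All numeric values here are illustrative assumptions.

```python
import random

def entry_state(particle_likelihoods, likelihood_floor=0.5, ratio_threshold=0.3):
    """Return 'object_in' when enough particles have a likelihood above the
    floor, otherwise 'object_out' (simplified version of the check above)."""
    confident = [p for p in particle_likelihoods if p >= likelihood_floor]
    ratio = len(confident) / len(particle_likelihoods)
    return "object_in" if ratio >= ratio_threshold else "object_out"

random.seed(0)
empty_area = [random.uniform(0.0, 0.3) for _ in range(100)]   # no hand in the area
hand_enters = [random.uniform(0.4, 1.0) for _ in range(100)]  # likelihoods rise
print(entry_state(empty_area), entry_state(hand_enters))      # object_out object_in
```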
 棚カメラによる物体入出検知部3608は、上述の例以外に、物体の撮像画像から得られるスペクトル(波長)データから物体入出を検知してもよい。さらに、棚カメラによる物体入出検知部は、重量・圧力センサー、赤外線センサー、エチレンガスセンサー等の方法を用いて物体入出を検知してもよい。 The object entry / exit detection unit 3608 using the shelf camera may detect object entry / exit from spectrum (wavelength) data obtained from a captured image of the object, in addition to the above example. Further, the object entrance / exit detection unit using the shelf camera may detect the object entrance / exit using a method such as a weight / pressure sensor, an infrared sensor, or an ethylene gas sensor.
 商品不特定判定部3609は、棚カメラによる商品特定部3603によって商品を特定できなかったことを棚から物体を取った移動物体Moと紐づける。 The product non-identification determination unit 3609 associates that the product could not be identified by the product identification unit 3603 by the shelf camera with the moving object Mo that picked up the object from the shelf.
 ここで、棚カメラによる商品特定部3603における検証機能により、商品を特定することができなかった場合について説明する。まず、棚カメラによる商品特定部3603は、物体が棚から取られた画像と物体が棚に置かれた画像とが類似する特徴点数を特定する。例えば、両画像に撮像された物体のサイズを特定したうえで比較し、色違いを特定したうえで色を比較し、両物体が類似しているかどうかを決定する。特徴点数が少ないと、商品を特定することができず、誤って精算されないようにすることができる。 Here, a case where the product cannot be specified by the verification function of the product specifying unit 3603 using the shelf camera will be described. First, the shelf camera product identification unit 3603 identifies the number of feature points at which the image of the object taken from the shelf and the image of the object placed on the shelf are similar. For example, the size of the object imaged in both images is specified and compared, the color difference is specified, and the colors are compared to determine whether the two objects are similar. If the number of characteristic points is small, the product cannot be specified, and it is possible to prevent accidental payment.
The label recognition unit 3610 recognizes a label attached to the product identified by the product identification unit 3603 using the shelf camera. Using image recognition techniques, the label recognition unit 3610 reads characters written on the label and multidimensional codes, including barcodes, to complement product identification.
The discount sticker recognition unit 3611 recognizes a discount sticker attached to the product identified by the product identification unit 3603 using the shelf camera. Using image recognition techniques, the discount sticker recognition unit 3611 identifies the discount amount or discount rate of the discount sticker attached to the product. The discount sticker recognition unit 3611 runs while the product identification unit is processing.
The basket entry/exit detection unit 3612 using the shelf camera or the ceiling camera uses at least one of the shelf camera 311 and the ceiling camera 310 to detect, based on a marker or the like attached to the basket, whether a moving object Mo is present within the predetermined area covered by the shelf camera and the ceiling camera, and to detect entry into and exit from the moving object Mo. The basket entry/exit detection unit 3612 using the shelf camera or the ceiling camera may further detect entry into the region of the moving object Mo defined by the marker or the like. In that case, entry into the region of the moving object Mo is detected by comparing the position information of the tracked product with the position information of an entry detection line (such as the rim of the basket) of the region of the moving object Mo.
The information processing system of the third embodiment may further include a basket product recognition unit 370.
FIG. 30 is a functional block diagram showing a detailed functional configuration example of the basket product recognition unit 370 provided in the sales floor device 3 of FIG. 27.
The basket product recognition unit 370 includes a basket entry/exit detection unit 372 using the basket camera, an object recognition unit 373 using the basket camera, a product identification unit 374 using the basket camera, a basket product non-identification determination unit 375, a basket product label recognition unit 376, and a basket product discount sticker recognition unit 377.
When an object such as a shopper's hand enters the basket (moving object Mo), the basket entry/exit detection unit 372 using the basket camera detects that entry, which can serve as the trigger for activating the object recognition unit 373 using the basket camera. The basket entry/exit detection unit 372 using the basket camera detects the entry of an object into the basket from a change in the image within the "entry detection area" set in the image captured by each basket camera 312.
By tracking an object that has entered the frame of the captured image within that frame, the basket entry/exit detection unit 372 using the basket camera also detects that the moving object Mo has left the frame. The basket entry/exit detection unit 372 using the basket camera tracks objects using a particle filter.
To track a plurality of moving objects Mo simultaneously, the basket entry/exit detection unit 372 using the basket camera scatters particles over the entry detection area again after the first object has entered, in preparation for the entry of the next object.
The basket entry/exit detection unit 372 using the basket camera determines "object in" when the proportion of particles having at least a certain likelihood within the predetermined area is equal to or greater than a threshold value, and determines "object out" when it falls below the threshold value. So that the region of as few objects as possible has to be estimated at one time, the basket entry/exit detection unit using the basket camera detects entry and exit every time an object enters or exits.
The object recognition unit 373 using the basket camera compares the image captured when an object such as a shopper's hand enters the basket (moving object Mo) with the image captured after it leaves the basket (moving object Mo), and defines an image region in order to identify, as a product, the object that was taken from the shelf and put into the basket (moving object Mo) (or returned from the basket (moving object Mo) to the shelf). The object recognition unit 373 using the basket camera confirms the change in the region using RGB data.
The object recognition unit 373 using the basket camera may also recognize the object from spectrum (wavelength) data obtained from a captured image of the object, or from a weight sensor, a pressure sensor, an infrared sensor, an ethylene gas sensor, or the like.
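A minimal sketch of the before/after comparison using RGB data, assuming the two frames are aligned numpy arrays of equal size; the per-pixel threshold and the bounding-box extraction are illustrative assumptions:

```python
import numpy as np

def changed_region(frame_before, frame_after, diff_threshold=40):
    """Hypothetical sketch: define the image region that changed between the
    frame captured when the hand entered the basket and the frame captured
    after it left, using per-pixel RGB differences.

    Returns (top, left, bottom, right) of the changed region, or None."""
    diff = np.abs(frame_after.astype(np.int16) - frame_before.astype(np.int16))
    changed_mask = diff.max(axis=2) > diff_threshold  # any RGB channel changed enough

    ys, xs = np.nonzero(changed_mask)
    if ys.size == 0:
        return None  # nothing was added to or removed from the basket
    return int(ys.min()), int(xs.min()), int(ys.max()), int(xs.max())
```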
The product identification unit 374 using the basket camera identifies which product the object placed in the basket is. For the object recognized by the object recognition unit 373 using the basket camera, the product identification unit 374 using the basket camera lists product candidates by image processing techniques such as specific object recognition, general object recognition, and deep learning. The listed product candidates are called "product candidate list S". After that, the product identification unit 374 using the basket camera exercises its verification function to identify the product with high accuracy.
The verification function lists up a "product candidate list P" using an algorithm different from the method used to list the product candidates described above. The product identification unit 374 using the basket camera matches the product candidate lists S and P and identifies the product when a predetermined threshold is exceeded.
As a method for listing product candidates, for example, the information of the object captured image obtained from the object whose presence has been recognized may be matched against the image information held in the product DB 131 or the storage unit 108. That is, when the feature information of both images matches, i.e., exceeds the threshold value, the object whose presence was recognized by the object recognition unit 373 using the basket camera is identified as the product registered in the product DB 131.
The product identification unit 374 using the basket camera may identify the product not from a single frame of the captured image but from captured images spanning a plurality of frames. In that case, the product identification unit 374 using the basket camera may assign a percentage to each product candidate, add to the percentage based on information such as purchase history, time, place, and personal preferences, and identify the product when a certain threshold is exceeded.
In the present embodiment, a specific object visual inspection mode, in which visual inspection on the visual inspection terminal Q is performed only for specific objects, and an all-object visual inspection mode, in which visual inspection on the visual inspection terminal Q is performed for all objects, can be set.
When the specific object visual inspection mode is set, the product identification unit 374 using the basket camera transmits the object captured image of an object whose product could not be identified, together with related information, to the visual inspection terminal Q via the visual inspection result acquisition unit 301a. On the other hand, when the all-object visual inspection mode is set, the product identification unit 374 using the basket camera transmits the product identification result and the product captured image of the identified product, as well as the object captured image of any object whose product could not be identified and its related information, to the visual inspection terminal Q via the visual inspection result acquisition unit 301a.
The product identification unit 374 using the basket camera then identifies the product based on the visual inspection result that the visual inspection result acquisition unit 301a acquires from the visual inspection terminal Q. Specifically, for a product identified by the product identification unit 374 using the basket camera, when the identification result is approved or corrected by the visual inspection result, the product identification unit 374 using the basket camera identifies the product according to the content indicated by the visual inspection result. For an object whose product could not be identified by the product identification unit 374 using the basket camera, the product identification unit 374 using the basket camera identifies that object as the product indicated by the visual inspection result. When the product identification unit 374 using the basket camera requests product identification by visual inspection, the request may be made after product identification has been complemented by the basket product label recognition unit 376. In this case, products that have been identified as a result of the complementary identification by the basket product label recognition unit 376 can be excluded from identification by visual inspection.
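A minimal sketch of matching two candidate lists S and P against a threshold, with an optional multi-frame percentage accumulation; the scoring scheme and threshold values are illustrative assumptions, not the disclosed algorithm itself:

```python
# Hypothetical sketch: combine two independently generated candidate lists.
# Each list maps a product ID to a confidence score in [0, 1].

MATCH_THRESHOLD = 0.8         # illustrative
ACCUMULATION_THRESHOLD = 1.5  # illustrative, for the multi-frame variant

def identify_from_lists(candidates_s, candidates_p):
    """Return the product ID whose combined score from lists S and P exceeds
    the threshold, or None if the product stays unidentified."""
    best_id, best_score = None, 0.0
    for product_id in set(candidates_s) & set(candidates_p):
        score = (candidates_s[product_id] + candidates_p[product_id]) / 2.0
        if score > best_score:
            best_id, best_score = product_id, score
    return best_id if best_score >= MATCH_THRESHOLD else None

def accumulate_over_frames(frame_candidates, context_bonus):
    """Multi-frame variant: add per-frame percentages (plus a bonus derived
    from purchase history, time, place, preference, etc.) until a candidate
    crosses the accumulation threshold."""
    totals = {}
    for candidates in frame_candidates:          # one dict per frame
        for product_id, score in candidates.items():
            bonus = context_bonus.get(product_id, 0.0)
            totals[product_id] = totals.get(product_id, 0.0) + score + bonus
            if totals[product_id] >= ACCUMULATION_THRESHOLD:
                return product_id
    return None
```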
The basket product non-identification determination unit 375 associates the fact that the product identification unit 374 using the basket camera could not recognize the object placed in the basket as a product with the ID of that moving object Mo.
The basket product label recognition unit 376 recognizes a label attached to the product that was placed in the basket and identified by the product identification unit 374 using the basket camera. Using image recognition techniques, the basket product label recognition unit reads characters written on the label and multidimensional codes, including barcodes, to complement product identification.
The basket product discount sticker recognition unit 377 uses image recognition techniques to identify the discount amount or discount rate of a discount sticker attached to the product identified by the product identification unit 374 using the basket camera. The basket product discount sticker recognition unit identifies the discount amount and the like while the product identification unit using the basket camera is processing.
The information processing system also includes a trade-restricted product determination unit 380 and a remote operation unit 390.
As described in the first embodiment, the trade-restricted product determination unit 380 determines whether the identified product is a trade-restricted product. When the trade-restricted product determination unit detects a trade-restricted product, it causes that information to be displayed on the error display unit 151.
In the present embodiment, the trade-restricted product determination unit 380 transmits the product captured image of a product corresponding to a trade-restricted product to the visual inspection terminal Q via the visual inspection result acquisition unit 301a. At this time, depending on the type of trade-restricted product, the trade-restricted product determination unit 380 also transmits information about the shopper (face image, personal information, etc.) to the visual inspection terminal Q via the visual inspection result acquisition unit 301a as appropriate.
The trade-restricted product determination unit 380 then determines whether the sale of the trade-restricted product is permitted (whether the trade restriction has been released) based on the visual inspection result that the visual inspection result acquisition unit 301a acquires from the visual inspection terminal Q.
The remote operation unit 390 is provided in the information terminal 38 and the server 1, and has a function of, for example, clearing an error state when the error state is notified.
The visual inspection result acquisition unit 301a requests product identification by visual inspection by transmitting the object captured image, or the product identification result and the product captured image of the identified product, to the visual inspection terminal Q, and acquires the visual inspection result transmitted from the visual inspection terminal Q in response to this request. The visual inspection result acquisition unit 301a outputs the acquired visual inspection result (product identification result) to the shelf product recognition unit 360. The visual inspection result acquisition unit 301a also requests determination of a trade-restricted product by visual inspection by transmitting the product captured image of a product corresponding to a trade-restricted product to the visual inspection terminal Q, and acquires the visual inspection result transmitted from the visual inspection terminal Q. At this time, the visual inspection result acquisition unit 301a transmits information about the shopper (face image, personal information, etc.) to the visual inspection terminal Q as appropriate. The visual inspection result acquisition unit 301a outputs the acquired visual inspection result (determination result for the trade-restricted product) to the trade-restricted product determination unit 380.
The settlement machine 4 installed in the settlement area 35 calculates the total amount of all products placed in one or more baskets, and performs settlement or payment. The settlement machine performs settlement based on the moving object Mo that has been tracked continuously and the product information associated with that moving object Mo.
For this purpose, as shown in FIG. 27, the settlement machine 4 includes a CPU 401, an input unit 406, an output unit 407, and a communication unit 409.
In the CPU 401, a settlement unit 435, an input control unit 436, and an output control unit 437 function.
The input unit 406 includes a reading unit for credit cards, electronic money, and the like.
The output unit 407 has a screen for displaying the products being settled, a function for outputting receipts, and the like.
The settlement unit 435 determines the settlement amount and the items to be settled.
The input control unit 436 receives signals from the input unit 406 and operates the CPU 401.
The output control unit 437 outputs the calculation result of the settlement unit 435 to the output unit 407.
The settlement machine 4 compares the position information of the moving object Mo with the position of the predetermined settlement area 35, and performs settlement triggered by the position information of the moving object Mo entering the settlement area 35, or by the moving object Mo being placed on the settlement machine 4.
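A minimal sketch of the settlement trigger and total calculation, assuming the moving object's position and the settlement area are given as simple 2-D coordinates and a rectangle; the data structures and coordinate values are illustrative assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class Product:
    name: str
    price: int           # price in the smallest currency unit
    discount: int = 0    # amount taken off by a recognized discount sticker

@dataclass
class MovingObject:
    mo_id: str
    position: tuple                               # (x, y) in store coordinates
    products: list = field(default_factory=list)
    errors: list = field(default_factory=list)    # e.g. unidentified or restricted items

SETTLEMENT_AREA = (10.0, 0.0, 14.0, 3.0)  # illustrative (x_min, y_min, x_max, y_max)

def in_settlement_area(position, area=SETTLEMENT_AREA):
    x, y = position
    x_min, y_min, x_max, y_max = area
    return x_min <= x <= x_max and y_min <= y <= y_max

def try_settle(mo: MovingObject):
    """Trigger settlement when the moving object enters the settlement area
    and no error (unidentified or trade-restricted item) is pending."""
    if not in_settlement_area(mo.position):
        return None
    if mo.errors:
        raise RuntimeError(f"error state for {mo.mo_id}: {mo.errors}")
    return sum(p.price - p.discount for p in mo.products)
```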
Alternatively, the settlement machine 4 may be provided with a button for instructing the start of settlement, and settlement may be triggered by pressing the button or the like; or the settlement machine 4 may be provided with a weight sensor (not shown) so that it recognizes the change in weight when the moving object Mo is placed on it and performs the settlement.
The settlement machine 4 accepts payment not only in cash but also in gift certificates, cash vouchers, virtual currency, and the like.
In such a settlement machine 4, error states may occur. Examples of error states include (A) a system processing abnormality, (B) an object whose product could not be identified being associated with the moving object Mo, and (C) a trade-restricted product being associated with the moving object Mo (the error states are, of course, not limited to these).
The system adopted in the present embodiment takes various actions depending on each error state.
In case (A), the system processing abnormality is presented to the settlement machine 4, the information terminal 38a, the information terminal 38b, and the server 1. This allows a store clerk to clear the error state. The error state may also be cleared by the remote operation unit 390.
In case (B), the fact that an object whose product could not be identified is associated with the moving object Mo is presented to the settlement machine 4, the information terminal 38a, the information terminal 38b, and the server 1. In the present embodiment, when the error state (B) occurs, a visual inspection is performed on the visual inspection terminal Q, and the settlement is continued or stopped depending on the visual inspection result. This increases the number of cases in which settlement can be performed automatically. When the error state (B) occurs, a store clerk may instead clear the error state. The error state can be cleared by the remote operation unit 390.
In case (C), the fact that a trade-restricted product is associated with the moving object Mo is presented to the settlement machine 4, the information terminal 38a, the information terminal 38b, and the server 1. In the present embodiment, when the error state (C) occurs, a visual inspection is performed on the visual inspection terminal Q, and the settlement is continued or stopped depending on the visual inspection result. This increases the number of cases in which settlement can be performed automatically. When the error state (C) occurs, the error state may also be cleared by, for example, a store clerk confirming the shopper's age in the case of a product restricted by age, a store clerk exchanging the product in the case of a product restricted because it is past its use-by or best-before date, or the shopper confirming the product themselves in the case of a product restricted for reasons other than allergy or halal food. The error state can be cleared by the remote operation unit 390.
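A minimal sketch of dispatching on these three error states, assuming simple notification and callback hooks; the enum names and handler signatures are illustrative assumptions:

```python
from enum import Enum, auto

class ErrorState(Enum):
    SYSTEM_ABNORMALITY = auto()      # (A)
    UNIDENTIFIED_PRODUCT = auto()    # (B)
    TRADE_RESTRICTED = auto()        # (C)

def notify(targets, message):
    # Placeholder for pushing the message to the settlement machine 4,
    # the information terminals 38a/38b, and the server 1.
    for target in targets:
        print(f"[{target}] {message}")

def handle_error(state: ErrorState, mo_id: str, request_inspection, remote_clear):
    targets = ["settlement machine 4", "information terminal 38a",
               "information terminal 38b", "server 1"]
    if state is ErrorState.SYSTEM_ABNORMALITY:
        notify(targets, f"system processing abnormality ({mo_id})")
        return remote_clear(mo_id)                # clerk or remote operation unit 390
    if state is ErrorState.UNIDENTIFIED_PRODUCT:
        notify(targets, f"unidentified object linked to {mo_id}")
        return request_inspection(mo_id)          # continue or stop per inspection result
    if state is ErrorState.TRADE_RESTRICTED:
        notify(targets, f"trade-restricted product linked to {mo_id}")
        return request_inspection(mo_id)
```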
Here, a method in which the product recognition system of the third embodiment identifies a product from an object and settles the payment for that product will be described with reference to FIG. 31.
FIG. 31 is a flowchart explaining the basic flow of the automatic settlement processing executed by the server 1, the sales floor device 3, and the settlement machine 4 of FIG. 27.
In step S301, a shopper (moving object) enters the store 30 (FIG. 24) through the entrance 31, and the ceiling camera 310 installed near the entrance 31 starts imaging the shopper. When the shopper picks up a basket or a cart and proceeds along the aisle 34, the ceiling cameras 310 further inside also start imaging the shopper. In this way, the plurality of ceiling cameras 310 constantly image the entire interior of the store 30, including shoppers, baskets, and carts. Before step S301, the personal authentication unit 320 may perform personal authentication of the shopper and acquire the shopper's personal information.
In step S302, the moving object finding unit 3302 using the ceiling camera finds moving objects (not numbered) in the captured images using a state space model (Bayesian filter) or the like.
Since the captured images contain not only objects moving within the store 30 but also stationary objects, the moving object finding unit 3302 using the ceiling camera finds only the objects moving within the store 30, with stationary objects removed.
In step S303, the basket finding unit 3303 using the ceiling camera finds baskets (the moving objects Mo in the third embodiment) among the objects found by the moving object finding unit 3302 using the ceiling camera, and assigns an individual ID to each. The ID continues to be used until a specific timing, such as leaving the store or completing settlement.
In step S304, the basket area definition unit 3306 using the ceiling camera defines the position of the area of the moving object Mo found by the basket finding unit 3303 using the ceiling camera. When the moving object Mo moves within the range imaged by a ceiling camera 310, the position of the area of the moving object Mo after the movement is defined anew. The position information is managed in the position information management DB 132, a memory, or the like in association with the ID of the moving object Mo, and is updated each time the area is defined. The defined position is also recognized at positions imaged by other ceiling cameras 310.
In step S305, the basket area tracking unit 3307 using the ceiling camera estimates the position to which the moving object Mo moves within the captured image of a given ceiling camera 310. The basket area definition unit 3306 using the ceiling camera then defines the area of the moving object Mo at the position to which it is estimated to have moved, and the position information of the moving object Mo stored in the position information management DB 132 or the memory is updated.
In step S306, the object entry/exit detection unit 3608 using the shelf camera detects that an object such as the shopper's hand has entered and exited the shelf. This detection serves as a trigger that activates the object recognition unit 3602 using the shelf camera. The entry of an object into the shelf is detected based on whether the image data in the entry detection area set for each shelf camera 311 has changed.
The object entry/exit detection unit 3608 using the shelf camera detects that the object has exited the shelf by continuing to track the object that entered the shelf.
In step S307, the object recognition unit 3602 using the shelf camera compares the captured image before the object entered the shelf with the captured image after it exited the shelf, and recognizes the object taken from the shelf.
In step S308, the product identification unit 3603 using the shelf camera identifies which product the detected object is. For the object detected by the object recognition unit 3602, the product identification unit 3603 using the shelf camera lists product candidates by image processing techniques such as specific object recognition, general object recognition, and deep learning. The listed product candidates are called "product candidate list S". After that, the product identification unit 3603 using the shelf camera exercises its verification function to identify the product with high accuracy.
The verification function lists up a "product candidate list P" using an algorithm different from the method used to list the product candidates described above. The product identification unit 3603 using the shelf camera matches the product candidate lists S and P and identifies the product when a predetermined threshold is exceeded.
As a method for listing product candidates, for example, the information of the object captured image obtained from the object whose presence has been recognized may be matched against the image information held in the product DB 131 or the storage unit 108. That is, when the feature information of both images matches, i.e., exceeds the threshold value, the object whose presence was recognized by the object recognition unit 3602 using the shelf camera is identified as the product held in the product DB 131.
When the product cannot be identified, the product non-identification determination unit 3609 transmits error information to the server 1. This error information is displayed on the error display unit 151 and the information terminal 38.
In step S309, the basket-product association unit 3605 associates the product identified by the product identification unit 3603 using the shelf camera with the ID of the moving object Mo. That is, it identifies which basket (moving object Mo) is held by the shopper who took the identified product.
If the product non-identification determination unit 3609 could not identify the product when a product or object was taken from or returned to the shelf, the product-unidentified information (error information) is associated with the ID of the moving object Mo.
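A minimal sketch of associating identified products (or error information) with the ID of a moving object Mo, assuming a simple in-memory mapping; the structure and function names are illustrative assumptions:

```python
# Hypothetical sketch: product list management keyed by moving object ID.
product_lists = {}   # mo_id -> {"products": [...], "errors": [...], "status": str}

def register_mo(mo_id):
    product_lists[mo_id] = {"products": [], "errors": [], "status": "shopping"}

def link_product(mo_id, item, identified=True):
    """Step S309: link an identified product, or an 'unidentified' error, to the Mo ID."""
    entry = product_lists[mo_id]
    if identified:
        entry["products"].append(item)
    else:
        entry["errors"].append({"type": "unidentified", "object": item})

def mark_settled(mo_id):
    """Step S311: after settlement in the settlement area, update the status."""
    if product_lists[mo_id]["errors"]:
        raise RuntimeError("error information is linked; automatic settlement not finished")
    product_lists[mo_id]["status"] = "settled"
```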
In step S310, the product list management unit 357 for products associated with the moving object Mo continues to manage the list of products associated with the ID of the moving object Mo. This management continues until a predetermined timing, such as when the moving object Mo moves to the settlement area 35.
In step S311, when all products of the moving object Mo are totaled in the settlement area 35 and settled or paid for at the settlement machine 4, the product list associated with the moving object Mo is updated to the settled status, and the automatic settlement processing ends. If the moving object Mo is associated with product-unidentified information (error information), the automatic settlement processing is not ended and the error information is transmitted to the server 1. This error information is displayed on the error display unit 151 and the information terminal 38, so that a store clerk can go to the shopper and confirm and clear the error.
If a moving object Mo that has not been settled passes through the exit 32, error information is transmitted to the server 1. This error information is displayed on the error display unit 151, the information terminal 38a, and the information terminal 38b. If an alarm device (not shown) is provided near the exit 32, the alarm device issues a warning by sound, light, or the like.
Since the automatic settlement processing of the third embodiment identifies products based on the images captured by the shelf cameras, it is not strictly necessary to identify the products placed in the basket.
However, in order to verify that a product has not been identified incorrectly, it is preferable to also identify the products for the objects placed in the basket.
FIG. 32 is a flowchart explaining the processing for recognizing products in the basket, which is part of the automatic settlement processing executed by the server 1, the sales floor device 3, and the settlement machine 4 of FIG. 27.
In step S321, the basket entry/exit detection unit 372 using the basket camera places particles in the entry detection area set in advance at a predetermined position within the captured image, so that the likelihood is low outside the basket region.
In step S322, the basket entry/exit detection unit 372 using the basket camera detects, from the likelihood of the particles, that an object (such as the shopper's hand) has entered the entry detection area in the basket. To prepare for the entry of multiple objects, the basket entry/exit detection unit 372 using the basket camera places new particles after an object has entered the entry detection area in the basket.
In step S323, the basket camera 312 captures the state inside the basket at the time when the object (such as the shopper's hand) enters the entry detection area. The basket camera 312 also images the area outside the entry detection area.
For example, assume that three products are captured in the captured image (pre-image). The items captured in the pre-image are products, because they have already been identified from objects.
In step S324, the basket camera 312 captures the state inside the basket at the time when the object exits the entry detection area. For example, assume that the image captured at this time (post-image) contains one object in addition to the three products mentioned above.
In step S325, the object recognition unit 373 using the basket camera compares the pre-image and the post-image and defines the image region for the one added object.
In step S326, the product identification unit 374 using the basket camera identifies which product the one added object is. For this product identification, the same technique as the identification technique used by the shelf product recognition unit 360 can be adopted.
In step S327, the product identification unit 374 using the basket camera determines whether the specific object visual inspection mode or the all-object visual inspection mode is set.
If the specific object visual inspection mode is set, the processing proceeds to step S328.
If the all-object visual inspection mode is set, the processing proceeds to step S329.
In step S328, the visual inspection result acquisition unit 301a requests a visual inspection (product identification) of the target object at the visual inspection terminal Q, and acquires the visual inspection result.
After step S328, the processing proceeds to step S329.
In step S329, the product identification unit 374 using the basket camera determines whether the one added object is an object whose product cannot be identified. If the product identification unit 374 using the basket camera cannot identify which product it is (YES in step S329), the processing proceeds to step S327.
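A minimal sketch of the visual-inspection dispatch for an object added to the basket, following one reading of the specific-object and all-object inspection modes described earlier for the product identification unit 374; the function names and payload shape are illustrative assumptions:

```python
# Hypothetical sketch: inspection-mode dispatch for an object added to the basket.

SPECIFIC_OBJECT_MODE = "specific_object"
ALL_OBJECT_MODE = "all_object"

def resolve_added_object(obj_image, mode, identify, request_inspection):
    """identify(obj_image) returns a product name or None (unidentified);
    request_inspection(payload) returns the visual inspector's answer or None."""
    product = identify(obj_image)

    if mode == ALL_OBJECT_MODE:
        # Every object (and any tentative identification) is sent for inspection.
        answer = request_inspection({"image": obj_image, "candidate": product})
        return answer if answer is not None else product

    # SPECIFIC_OBJECT_MODE: only objects the system could not identify are sent.
    if product is None:
        product = request_inspection({"image": obj_image})
    return product
```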
If the product identification unit 374 using the basket camera can identify which product it is in step S329 (NO in step S329), the product identification unit 374 using the basket camera identifies the product, including information held in the storage unit of the server 1 such as the product name, the price, and whether it is a trade-restricted product. The processing then proceeds to step S330. The identified product information may be output to the output unit 407.
If the product identification unit 374 using the basket camera cannot identify which product it is in step S329, the system may be configured to ask the purchaser to place the product into the basket again, for example through a screen display on a display unit (not shown) provided with the basket camera or voice guidance through a speaker (not shown). Furthermore, the purchaser and the visual inspector or the like may be able to talk with each other through a microphone (not shown) provided with the basket camera.
In step S330, the trade-restricted product determination unit 380 determines whether the product identified by the product identification unit 374 using the basket camera is a product that requires age confirmation.
If it is determined in step S330 that the product identified by the product identification unit 374 using the basket camera is a product that requires age confirmation, i.e., if the determination is YES, the processing proceeds to step S331.
If it is determined in step S330 that the product identified by the product identification unit 374 using the basket camera is not a product that requires age confirmation, i.e., if the determination is NO, the processing for recognizing products in the basket ends.
In step S331, the output control unit 437 causes the output unit 407 of the settlement machine 4 to display a screen for age confirmation. However, if the shopper's personal information has been acquired and age confirmation is not necessary here, this step is skipped and the processing proceeds to step S335. If there is a problem with the shopper's age verification, the purchaser and the visual inspector or the like may be able to talk with each other through a speaker (not shown) and a microphone (not shown) provided in the settlement machine 4.
In step S332, the visual inspection result acquisition unit 301a requests a visual inspection (determination of the trade-restricted product) of the target product at the visual inspection terminal Q, and acquires the visual inspection result.
In step S333, the trade-restricted product determination unit 380 determines whether an instruction to release the trade restriction has been received.
If it is determined in step S333 that an instruction to release the trade restriction has not been received, the processing proceeds to step S334.
If it is determined in step S333 that a release instruction to release the trade restriction has been received, i.e., if the determination is YES, the processing proceeds to step S335.
In step S334, the visual inspection result transmission unit 412 of the visual inspection terminal Q transmits a warning that the trade restriction was not released even by the visual inspection result. On receiving this warning, for example, the output control unit 437 of the settlement machine 4 presents, via the output unit 407, a warning that the trade restriction was not released even by the visual inspection result. After step S334, the processing ends and the settlement is stopped.
In step S335, the trade-restricted product determination unit 380 releases the trade restriction.
When step S335 ends in this way, or when it is determined in step S330 that the product is not an age-restricted product (when the determination is NO), the processing for recognizing products in the basket ends.
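A minimal sketch of the age-confirmation branch in steps S330 to S335, assuming simple callback functions for the age-confirmation screen and the visual inspection; the names and return types are illustrative assumptions:

```python
# Hypothetical sketch of steps S330-S335: handling an age-restricted product.

def handle_age_restriction(product, shopper_age_known, show_age_screen,
                           request_inspection, show_warning):
    """Returns True if the trade restriction is released and settlement may continue."""
    if not product.get("requires_age_check", False):      # S330: not age-restricted
        return True

    if shopper_age_known:                                  # age already verified from personal info
        return True                                        # skip straight to release (S335)

    show_age_screen()                                      # S331: age-confirmation screen on unit 407
    released = request_inspection(product)                 # S332-S333: inspector decides on release
    if not released:
        show_warning("trade restriction not released")     # S334: warning, settlement stopped
        return False
    return True                                            # S335: restriction released
```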
In this way, the information processing system identifies which product the object placed in the basket is.
The information processing system can then verify whether the product identified in step S308 described above and the product identified in step S326 match. When the product cannot be identified, the error state is displayed on the error display unit 151, the information terminal 38a, and the information terminal 38b. For objects and products subject to visual inspection (for example, objects whose product cannot be identified at the cashier terminal 2, or products corresponding to trade-restricted products), a visual inspection is performed on the visual inspection terminal Q, and automatic settlement is performed according to the visual inspection result (the product identification result, the determination result for the trade-restricted product, and so on).
Therefore, according to the information processing system, when a shopper purchases a product, the settlement of the price of the product can be automated, and the accuracy of product identification can be improved.
In addition, when an object or product that should not be settled is put through settlement, it can be prevented from being settled by mistake. When the fact that a product could not be identified is displayed on the information terminal 38a or the information terminal 38b, a store clerk may go to the sales floor where the basket is located and verify the product, or a clerk inside or outside the store may clear the error state by remote operation.
In the third embodiment, errors assumed in tracking the moving object Mo (a basket or a cart) include (A) the case where the moving object finding unit 3302 using the ceiling camera could not detect the moving object Mo when a shopper entered through the entrance 31 of the store 30 and picked up a basket or cart, (B) the case where the basket area tracking unit 3307 using the ceiling camera lost sight of the moving object Mo being tracked, and (C) the case where the IDs linked to two or more different moving objects Mo were swapped during tracking (the errors are, of course, not limited to these).
Depending on each error state, the system adopted in the present embodiment takes various actions, including the following examples.
(A) An image captured by the ceiling camera 310 that includes the shopper holding the basket or cart as a subject is transmitted to the visual inspection terminal Q, and detection of the moving object Mo by visual inspection is requested. If the visual inspector detects the moving object Mo, a new moving object Mo corresponding to the basket or cart held by that shopper is defined, an ID is issued, and tracking by the basket area tracking unit 3307 using the ceiling camera is started. If the visual inspector cannot detect the moving object Mo, a store clerk is notified accordingly.
(B) When a basket or cart that is not linked to any ID registered in the position information management DB 132 is found, a captured image including the shopper holding that basket or cart and a list of candidate IDs to be linked are transmitted to the visual inspection terminal Q. Based on the past linkage information between baskets or carts and IDs, the visual inspector links the most appropriate ID to that basket or cart, and re-tracking by the basket area tracking unit 3307 using the ceiling camera is started. If re-tracking cannot be started for some reason, a store clerk is notified accordingly.
When an ID that has lost its link to a basket or cart is detected, that ID and a list of captured images of shoppers holding baskets or carts that are not linked to any ID are transmitted to the visual inspection terminal Q. Based on the past linkage information between baskets or carts and IDs, the visual inspector attempts to link a basket or cart to that ID, and if the linking succeeds, re-tracking by the basket area tracking unit 3307 using the ceiling camera is started. If the start of re-tracking fails for some reason, a store clerk is notified accordingly.
(C) An image list of the shoppers holding the respective baskets or carts and a list of the IDs to be linked to the respective baskets or carts are transmitted to the visual inspection terminal Q. Based on the past linkage information between baskets or carts and IDs, the visual inspector attempts to allocate the IDs to the baskets or carts as appropriately as possible. If the IDs can be allocated to the baskets or carts appropriately, re-tracking by the moving object area tracking unit 3305 using the ceiling camera is started. If the IDs cannot be allocated appropriately, the allocation is abandoned and a store clerk is notified that the links remain swapped.
[Embodiment 4]
FIG. 33 is a diagram showing a layout example of a supermarket that adopts the product recognition system according to the fourth embodiment.
The information processing system of the fourth embodiment is a product recognition system applied to a store 40 such as a supermarket as shown in FIG. 33. The information processing system of the fourth embodiment is configured so that products can be settled automatically simply by passing through the settlement gates 5-1 to 5-3, without the products being placed on a cashier counter.
In the store 40, shopping baskets, shopping carts, and the like (not numbered) are placed at the entrance 41. In front of the exit 42, the settlement gates 5-1 to 5-3 and a settlement register 6 operated by a store clerk Mt are installed. Between the entrance 41 and the exit 42, a plurality of shelf racks 43 for displaying products are installed. In each shelf rack 43, a plurality of shelves are arranged in the vertical direction at predetermined intervals, and various products are displayed on each of the shelves. Hereinafter, the space between shelves is also referred to as the "inside of the shelf". The space between shelf racks 43 facing each other in the horizontal direction forms an aisle 44.
A shopper enters the store through the entrance 41, picks up a shopping basket, pushes a shopping cart, or proceeds along the aisle 44 with a personal shopping bag they have brought. The shopper takes products from the shelves, puts them into the basket, and proceeds along the aisle 44. When the shopper has taken all the products they wish to purchase, they proceed to the settlement area 45 and settle the payment.
Store clerks Mt patrol the aisle 44, the settlement area 45, and the like. A store clerk Mt carries an information terminal 9a. The information terminal 9a is a portable information processing terminal such as a smartphone, and has a function of displaying the state of the store, a function of displaying error states occurring in the store, a remote operation function, and the like.
In FIG. 33, the inside of the cloud-shaped outline depicts not the inside of the store 40 but the outside of the store 40 or the backyard of the store 40. The server 1 (FIG. 7) is installed outside the store 40 or in the backyard of the store. Outside the store or in the backyard, a store clerk Mt can monitor the inside of the store 40 through the screen of a large monitor (not shown) or the screen of the information terminal 9b.
The shopping baskets, shopping carts, and personal shopping bags are collectively referred to as baskets. The aisle 44 including the settlement gates 5-1 to 5-3 forms the settlement area 45. The baskets and the like together with the shoppers are referred to as moving objects Mo.
A plurality of ceiling cameras 310 are installed on the ceiling of the aisle 44 between the entrance 41 and the exit 42. A plurality of shelf cameras 311 are installed at a plurality of positions on each shelf in each shelf rack 43.
FIG. 34 is a configuration diagram showing the configuration of a product recognition system as the fourth embodiment of the present information processing system.
The product recognition system of the fourth embodiment has the configuration shown in FIG. 34.
The product recognition system includes the server 1, the sales floor device 3, and n settlement gates 5-1 to 5-n (n is an arbitrary integer). The cashier terminal 2 adopted in the first and second embodiments is not adopted in the fourth embodiment.
The server 1, the sales floor device 3, and the settlement gates 5-1 to 5-n are connected to one another via a network N such as an Internet line.
For convenience of explanation, only one server 1 is drawn in FIG. 34, but in practice there may be a plurality of servers.
Hereinafter, when it is not necessary to distinguish the settlement gates 5-1 to 5-n individually, they are collectively referred to as the "settlement gate 5".
The server 1 executes each process in order to manage the operations of the sales floor device 3 and the settlement gate 5. The server 1 is configured in the same way as the hardware configuration of the server 1 of the first embodiment shown in FIG. 7.
Accordingly, the server 1 includes the CPU 101, the ROM 102, the RAM 103, the bus 104, the input/output interface 105, the output unit 106, the input unit 107, the storage unit 108, the communication unit 109, and the drive 110.
The sales floor device 3 is configured in the same way as the hardware configuration of the third embodiment shown in FIG. 26, but does not include the basket camera 312 shown there.
Accordingly, the sales floor device 3 includes the CPU 301, the ROM 302, the RAM 303, the bus 304, the input/output interface 305, the ceiling cameras 310, the shelf cameras 311, the information terminal 9, and the communication unit 315.
An error state in the fourth embodiment, however, occurs under various circumstances, such as when an abnormality occurs in the system processing, when a product cannot be identified in shelf product recognition, when a moving object Mo associated with an unidentified object or a trade-restricted product attempts to pass through the settlement gate 5, or when a shopper (moving object Mo) who has not settled attempts to leave the store.
A settlement gate 5 is connected to such a sales floor device 3 via the network N.
The settlement gate 5 is divided into a settlement gate 5-1 that uses a settlement machine 5a, a settlement gate 5-2 that uses electronic money or the like, and a settlement gate 5-3 at which settlement is completed simply by passing through.
The settlement gates 5-1 to 5-3 other than the manned settlement register 6 may be provided with an opening/closing member (not numbered) that is normally closed.
The settlement gate 5-1 using the settlement machine 5a is provided with a settlement button (not shown), calculates the total amount of the purchased products, and the payment is made at the settlement machine 5a installed on the exit 42 side. The settlement machine 5a is installed closer to the exit 42 than the settlement gate 5-1. The settlement machine 5a is equipped with settlement means capable of accepting payment by cash, credit card, electronic money, points, gift certificates, virtual currency, prepaid cards, and the like.
At the settlement gate 5-1 using the settlement machine 5a, when the person making the payment presses the settlement button, this press triggers the reading of the product information associated with the moving object area described later, the settlement amount is fixed, and the person can pass through the settlement gate 5-1. When the opening/closing member is provided, the opening/closing member opens. Then, when the person completes the payment at the settlement machine 5a, the person can leave the store.
At the settlement gate 5-2 using electronic money or the like, settlement is performed by holding the electronic money or the like over the gate body. That is, the settlement gate using electronic money or the like includes a card reading unit (not shown), does not include a settlement button like that of the settlement gate 5-1 using the settlement machine 5a, and does not use the settlement machine 5a.
Electronic money and the like here include not only IC cards capable of payment but also cards in the narrow sense such as credit cards, so-called point cards, and prepaid cards, as well as portable information terminals; hereinafter, these are collectively described as various cards.
When the person making the payment enters the settlement area 45 of the settlement gate 5-2 and the various cards are read by the card reading unit, the product information associated with the moving object area described later is read out, settlement and payment are completed, and the person can pass through the settlement gate 5-2.
At the settlement gate 5-3, a shopper whose personal information has been acquired at the entrance 41, in the passage 44, or the like completes settlement and payment simply by passing through the settlement gate 5-3. That is, settlement and payment are completed without paying cash, having a card read, or the like.
At the manned settlement register 6, the clerk Mt individually inputs the price of each product and performs the settlement.
FIG. 35 is a block diagram showing the hardware configuration of the settlement gate 5 in the product recognition system of FIG. 34.
The settlement gate 5 includes a CPU 501, a ROM 502, a RAM 503, a bus 504, an input/output interface 505, an input unit 506, an output unit 507, a storage unit 508, and a communication unit 509.
The CPU 501, the ROM 502, the RAM 503, the bus 504, the input/output interface 505, the storage unit 508, and the communication unit 509 of the settlement gate 5 are configured similarly to those of the server 1.
The input unit 506 is, in the settlement gate 5-1 using the settlement machine 5a, a settlement button provided on the gate body, and, in the settlement gate 5-2 using electronic money and the settlement gate 5-3 at which settlement is completed simply by passing through, an information reading unit that detects information on the various cards and the like.
The output unit 507 outputs a signal for opening and closing the opening/closing member (not numbered) provided in the settlement gates 5-1 to 5-3. In addition, the settlement gate 5-1 using the settlement machine 5a outputs the amount to be settled, the product names, and the like to the settlement machine 5a.
FIG. 36 is a functional block diagram showing an example of the functional configuration of the server 1 of FIG. 7, the sales floor device 3 of FIG. 26, and the settlement gate 5 of FIG. 35 described above.
The server 1 includes the CPU 101, the storage unit 108, the error display unit 151, and the error canceling unit 152, as in the third embodiment.
The server 1 is configured similarly to the server 1 of the third embodiment.
FIG. 37 is a functional block diagram showing a detailed functional configuration example of the moving object tracking unit 330 provided in the sales floor device 3 of the fourth embodiment.
As shown in FIG. 37, the CPU 301 of the sales floor device 3 includes a personal authentication unit 320, a moving object tracking unit 330, a position information management unit 340, a shelf product recognition unit 360, a basket product recognition unit 370, and a trade-restricted product determination unit 380.
The personal authentication unit 320, the moving object tracking unit 330, the position information management unit 340, the basket product recognition unit 370, and the trade-restricted product determination unit 380 are configured similarly to those of the third embodiment.
The personal authentication unit 320 includes a personal information acquisition unit 321 similar to that of the third embodiment. The personal authentication unit 320 acquires personal information from the personal information acquisition unit 321 or from the DB management unit 141 of the server 1.
As shown in FIG. 37, the moving object tracking unit 330 includes a moving object finding unit 3302 using the ceiling camera, a moving object area definition unit 3304 using the ceiling camera, a moving object area tracking unit 3305 using the ceiling camera, a grouping unit 3308, a blood relationship determination unit 3309, a moving object area transfer recognition unit 3311 using the ceiling camera, a transferred object recognition unit 3312 using the ceiling camera, and a transferred product specification unit 3313 using the ceiling camera.
The moving object finding unit 3302 using the ceiling camera finds a moving object Mo (a shopper, basket, cart, or the like) using a state space model (a Bayesian filter or the like) based on the captured images taken by the ceiling cameras 310.
The moving object area definition unit 3304 using the ceiling camera defines the area of the moving object Mo found by the moving object finding unit 3302 using the ceiling camera as a moving object area. The moving object area definition unit 3304 using the ceiling camera defines the moving object area by continuously finding the areas that have changed around the moving object Mo. That is, the moving object area definition unit 3304 using the ceiling camera defines, as the moving object area, the found moving object Mo together with a certain range of its surroundings.
In addition, by fitting a skeleton model of the shopper as viewed from above, an approximate pose may be estimated and compared with the moving object area in the actually obtained image so as to define the moving object Mo more clearly.
Here, when an area centered on a person is called a person area, the person area is a subordinate concept of the moving object area.
Likewise, when an area centered on baskets is called a basket area, the basket area is a subordinate concept of the moving object area.
Likewise, when an area centered on a cart is called a cart area, the cart area is a subordinate concept of the moving object area.
The moving object area tracking unit 3305 using the ceiling camera tracks the movement of the moving object Mo. For example, the moving object area tracking unit 3305 using the ceiling camera may track the movement of the moving object Mo by collecting feature data (color, shape, and the like) of the moving object Mo.
Alternatively, the moving object area tracking unit 3305 using the ceiling camera tracks the movement of the moving object Mo by using an in-image object tracking technique such as a Bayesian filter, fast Fourier transform, or TLD (Tracking-Learning-Detection).
The grouping unit 3308 groups a plurality of people, such as family members or friends, when they visit the store together.
The grouping unit 3308 may also group a plurality of people by using information such as the sense of distance between moving objects Mo (overlapping, sticking together, and so on) and their directions of movement (vectors).
The grouping unit 3308 may associate a person area with a basket area or a cart area.
With the grouping unit 3308 functioning in this manner, a single person can settle for the whole group at the settlement gate 5.
The blood relationship determination unit 3309 uses a face authentication method to distinguish parent-child relationships, sibling relationships, and the like. The blood relationship determination unit assists the function of the grouping unit. The blood relationship determination unit may determine the degree of facial similarity using a deep learning face recognition method and estimate the blood relationship.
When a product is handed over from one shopper (moving object Mo) to another shopper (moving object Mo), the moving object area transfer recognition unit 3311 using the ceiling camera recognizes that an object has been handed over from one moving object Mo to another, identifies the moving object Mo that handed over the object and the moving object Mo that received it, and reads the product list associated with each moving object Mo.
The transferred object recognition unit 3312 using the ceiling camera then defines the area of the object from the captured image at the time when the transfer was recognized.
Further, from the image after the object area has been defined, the transferred product specification unit 3313 using the ceiling camera specifies which product in the product list associated with the moving object Mo that performed the transfer the transferred object corresponds to, associates each moving object Mo identified by the moving object area transfer recognition unit 3311 using the ceiling camera with the product specified through the transfer, and updates the product list of each moving object Mo.
The moving object area transfer recognition unit 3311 using the ceiling camera may analyze the movement of the shoppers (moving objects Mo) and recognize the transfer by using an object recognition method such as deep learning; it may recognize the hands within the person areas at the time of the transfer, and the transfer may be recognized from an overlap between person areas (which may include the hands).
The transferred object recognition unit 3312 using the ceiling camera may use a zoom-capable ceiling camera and zoom in on the location where the transfer is estimated to have occurred in order to define the object area.
The moving object area transfer recognition unit 3311, the transferred object recognition unit 3312, and the transferred product specification unit 3313 using the ceiling camera may be realized by a shelf camera or the like capable of capturing a wide range instead of the ceiling camera.
As in the second embodiment, the position information management unit 340 includes an inter-camera information transfer unit 341, a position definition unit 342 for each camera, and a moving object display unit 343, as shown in FIG. 20.
FIG. 38 is a functional block diagram showing a detailed functional configuration example of the shelf product recognition unit 360 provided in the sales floor device 3 of the fourth embodiment.
As shown in FIG. 38, the shelf product recognition unit 360 includes an object recognition unit 3602 using the shelf camera, a product specification unit 3603 using the shelf camera, a moving object and product association unit 3604, a product list management unit 3606 for products associated with moving objects, an object entry/exit detection unit 3608 using the shelf camera, a product non-specification determination unit 3609, a label recognition unit 3610, and a discount sticker recognition unit 3611.
Among these, the product specification unit 3603 using the shelf camera specifies which product the object in the shelf recognized by the object recognition unit 3602 using the shelf camera is. The product specification unit 3603 using the shelf camera lists product candidates by image processing methods such as specific object recognition, general object recognition, and deep learning. The product candidates listed in this way are called the "product candidate list S". After that, the product specification unit 3603 using the shelf camera specifies the product with high accuracy by exercising a verification function.
The verification function lists up a "product candidate list P" by an algorithm different from the method used to list the product candidates described above. The results of the product candidate lists S and P are matched against each other, and the product is specified when the match exceeds a predetermined threshold.
The method of listing up a "product candidate list" may be realized, for example, by matching the image information of the object whose presence has been recognized against the image information held in the product DB 131 or in memory. That is, when the feature information of the two images matches (exceeds a threshold), the object whose presence has been recognized by the object recognition unit 3602 using the shelf camera is a product registered in the product DB 131, and the product specification unit 3603 using the shelf camera therefore specifies it as that registered product. Product candidates may also be created by deep learning, after which the verification function is exercised to specify the product with high accuracy.
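The following minimal Python sketch illustrates, under stated assumptions, how the verification function could match the two candidate lists: two independent recognizers each return product candidates with confidences, and a product is specified only when the combined evidence exceeds a threshold. The averaging rule, the threshold value, and the JAN-style product identifiers are illustrative assumptions, not details taken from the embodiments.

    # A minimal sketch of the verification step over candidate lists S and P.
    def specify_product(candidates_s, candidates_p, threshold=0.8):
        """candidates_s / candidates_p: dict mapping product_id -> confidence (0..1)."""
        best_id, best_score = None, 0.0
        for product_id in set(candidates_s) & set(candidates_p):
            # combine the two algorithms' confidences (simple average as an example)
            score = (candidates_s[product_id] + candidates_p[product_id]) / 2.0
            if score > best_score:
                best_id, best_score = product_id, score
        if best_score >= threshold:
            return best_id            # product specified with high accuracy
        return None                   # left unspecified; escalate to visual inspection

    # Example: list S from feature matching, list P from a deep-learning classifier
    s = {"JAN4901234567890": 0.92, "JAN4909876543210": 0.40}
    p = {"JAN4901234567890": 0.88}
    print(specify_product(s, p))      # -> "JAN4901234567890"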
The product specification unit 3603 using the shelf camera need not specify the product from a single frame of the captured image taken by the shelf camera 311; it may also make full use of the captured images taken by the ceiling cameras 310 and specify the product across a plurality of captured images. In that case, the product specification unit 3603 using the shelf camera assigns a percentage to each product candidate, adds to the percentage based on information such as purchase history, time, place, and personal preferences, and specifies the product when a certain threshold is exceeded.
In the present embodiment, a specific object visual inspection mode, in which visual inspection by the inspection terminal Q is performed only for specific objects, and an all-object visual inspection mode, in which visual inspection by the inspection terminal Q is performed for all objects, can be set.
When the specific object visual inspection mode is set, the product specification unit 3603 using the shelf camera transmits the captured image of an object whose product could not be specified, together with related information and the like, to the inspection terminal Q via the visual inspection result acquisition unit 301a. On the other hand, when the all-object visual inspection mode is set, the product specification unit 3603 using the shelf camera transmits the product specification result and the captured image of the specified product, as well as the captured image of any object whose product could not be specified and related information and the like, to the inspection terminal Q via the visual inspection result acquisition unit 301a.
Then, the product specification unit 3603 using the shelf camera specifies the product based on the visual inspection result that the visual inspection result acquisition unit 301a acquires from the inspection terminal Q. Specifically, for a product specified by the product specification unit 3603 using the shelf camera, when the specification result is approved or corrected by the visual inspection result, the product specification unit 3603 using the shelf camera specifies the product according to the content indicated by the visual inspection result. For an object whose product could not be specified by the product specification unit 3603 using the shelf camera, the product specification unit 3603 using the shelf camera specifies the object as the product indicated by the visual inspection result. When the product specification unit 3603 using the shelf camera requests product specification by visual inspection, the request may be made after the product specification has been supplemented by the label recognition unit 3610. In this case, objects whose products have been specified as a result of the supplementation by the label recognition unit 3610 can be excluded, and product specification by visual inspection can be performed for the rest.
The other functional units of the shelf product recognition unit 360 are the same as those of the second embodiment.
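A minimal sketch of how specification results might be routed to the inspection terminal Q depending on the configured mode is shown below. The function send_to_terminal_q is a hypothetical stand-in for transmission via the visual inspection result acquisition unit 301a, and the mode constants are assumptions.

    # A minimal sketch of mode-dependent routing to the inspection terminal Q.
    SPECIFIC_OBJECT_MODE = "specific"
    ALL_OBJECT_MODE = "all"

    def send_to_terminal_q(**payload):
        # placeholder transport standing in for the visual inspection result
        # acquisition unit 301a; a real system would send this over the network N
        print("request to inspection terminal Q:", payload)

    def route_for_inspection(mode, specification_result, object_image, product_image=None):
        if specification_result is None:
            # product could not be specified: always ask the eye checker
            send_to_terminal_q(kind="unspecified", image=object_image)
        elif mode == ALL_OBJECT_MODE:
            # in all-object mode even specified products are sent for approval or correction
            send_to_terminal_q(kind="specified", image=product_image,
                               result=specification_result)
        # in specific-object mode a successfully specified product needs no inspection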
In the CPU 501 of the settlement gate 5, as shown in FIG. 36, a settlement unit 535, an input control unit 536, and an output control unit 537 function.
When a moving object area that has been continuously tracked enters the settlement area 45, the settlement unit 535 receives the product information (list of products) associated with one or more moving object areas from the server 1 via the communication unit 509, and fixes the settlement amount and the products to be settled.
The input control unit 536 inputs signals from the input unit 506, such as the settlement button and the information reading unit provided on the gate body.
The output control unit 537 displays the settlement amount on the output unit 507, outputs information to the settlement machine 5a, and opens and closes the opening/closing member.
In addition, when the moving object Mo that has been continuously tracked, that is, the person making the payment, enters the settlement area 45, the settlement unit 535 fixes the settlement amount and the products to be settled based on the product information (lists of products) associated with one or more moving objects Mo.
For example, when a father holding the wallet makes the payment, the grouping may be utilized so that the father can also pay for the product information (lists of products) associated with the accompanying mother and children.
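The following is a minimal, illustrative Python sketch of how the settlement unit 535 could fix the settlement amount from the product lists of a tracked moving object and its grouped companions; the data structures and the handling of trade-restricted products here are assumptions, not the claimed processing.

    # A minimal sketch of fixing the settlement amount for a group at the gate.
    def fix_settlement(moving_object, group_members, product_db):
        """moving_object / group_members: objects with a .products list of product IDs;
        product_db: dict product_id -> {"name": str, "price": int, "restricted": bool}."""
        payers = [moving_object] + list(group_members)
        items, total, restricted = [], 0, []
        for mo in payers:
            for product_id in mo.products:
                info = product_db[product_id]
                items.append((info["name"], info["price"]))
                total += info["price"]
                if info["restricted"]:
                    restricted.append(info["name"])   # gate stays closed until cleared
        return items, total, restricted               # shown via the output unit 507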
The settlement gate 5-1 using the settlement machine 5a is a device that calculates the total amount of the products and performs the settlement, and is provided with a settlement button (not shown). The settlement machine 5a is installed closer to the exit 42 than the settlement gate 5. The settlement machine 5a is equipped with payment means capable of accepting payment by cash, points, gift certificates, cash vouchers, and the like.
At this settlement gate 5-1, when the person making the payment presses the settlement button, this press triggers the reading of the product information associated with the moving object Mo, the settlement amount is fixed, and the person can pass through the settlement gate 5-1. Then, when the person completes the payment at the settlement machine, the person can leave the store.
However, when an error state occurs, for example because a product to be settled is an age-restricted product and the age has not been confirmed, the error state is presented on the information terminal 9a, the information terminal 9b, and the error display unit 151. At that time, the settlement gate 5-1 maintains an impassable state, for example by keeping the opening/closing member closed. The settlement gate 5-1 may also indicate the error state by means such as sound or light. When the restriction becomes releasable, for example after the clerk Mt confirms the age, the restriction is released through the operation of the error canceling unit 152 by the clerk Mt, and the settlement gate 5-1 becomes passable. The restriction can also be released by remote operation.
When an object that cannot be specified as any product (an unspecified object) is associated with the moving object Mo, the flow is the same as that for an age-restricted product. Note that the clerk Mt may handle such an unspecified object at the manned settlement register 6 and specify the product.
The settlement gate 5-2 using electronic money includes a card reading unit (not shown), is not provided with a settlement button like that of the settlement gate 5-1, and does not use the settlement machine 5a. The card reading unit is adapted to handle any card, whether a credit card, debit card, electronic money, prepaid card, or the like.
Then, when the person making the payment enters the settlement area 45 of the settlement gate 5-2 and the card is read by the card reading unit, the product information associated with the moving object Mo is read out, settlement and payment are completed, and the person can pass through the settlement gate 5-2.
When a product to be settled is a trade-restricted product, or when an unspecified object is associated with the moving object Mo, the operation is the same as that of the settlement gate 5-1 described above.
For the settlement gate 5-3, the moving object Mo that has been continuously tracked needs to have been personally authenticated and its payment information needs to have been specified.
When the person making the payment enters the settlement area 45 of the settlement gate 5-3, settlement is performed automatically unless there is an error, and the person can pass through the settlement gate 5-3.
When a product to be settled is a trade-restricted product, or when an unspecified object is associated with the moving object Mo, the operation is the same as that of the settlement gate 5-1 described above.
The product recognition system of the fourth embodiment may include a remote operation unit (not shown) in addition to the functions described above.
The trade-restricted product determination unit 380 determines whether or not a specified product corresponds to a trade-restricted product. That is, the trade-restricted product determination unit 380 determines from the DB information that a specified product is a trade-restricted product subject to age restrictions, such as alcoholic beverages and tobacco. The trade-restricted product determination unit 380 also uses image recognition and the like to determine that a product is trade-restricted because its use-by or best-before date has expired. Furthermore, by linking with the personal information obtained by personal authentication, the trade-restricted product determination unit 380 determines that a product is trade-restricted for a particular shopper, for example because of an allergy or because it is not a halal food.
When the trade-restricted product determination unit 380 determines that a product is a trade-restricted product, it presents that fact. The trade-restricted product determination unit 380 may also restrict the sale based on personal information obtained from personal authentication, or may estimate age and gender from face or hand recognition and restrict the sale accordingly.
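A minimal sketch of such a determination is shown below, combining product DB fields and the shopper's personal information; the field names, the age threshold of 20, and the rule set are illustrative assumptions only.

    # A minimal sketch of the trade-restricted product determination.
    from datetime import date

    def is_trade_restricted(product, shopper, today=None):
        """product: dict with "age_restricted", "expiry" (date or None), "allergens" (set);
        shopper: dict with optional "age" and "allergies" (set). Returns a reason or None."""
        today = today or date.today()
        if product.get("age_restricted") and shopper.get("age", 0) < 20:
            return "age restriction (alcohol/tobacco)"
        expiry = product.get("expiry")
        if expiry is not None and expiry < today:
            return "expired use-by/best-before date"
        if product.get("allergens", set()) & shopper.get("allergies", set()):
            return "allergen restriction for this shopper"
        return None   # not restricted; settlement can proceed normally

    # Example
    print(is_trade_restricted({"age_restricted": True}, {"age": 17}))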
When the remote operation unit receives a notification of an error state such as a system processing abnormality, an unspecified product, or a trade restriction, it resolves the error state by remote operation. The ceiling camera 310, the shelf camera 311, or the settlement gate 5 that has detected the error state notifies, via the network N, the information terminals 9a and 9b for remote operation inside or outside the store and the server 1 of the error state together with a captured image of that state. The error state can be resolved by operating these devices.
Naturally, the error state may also be cleared at the manned register in the store, or the clerk Mt or the like may go to the place where the error state has occurred and clear it.
Next, a product settlement method in the product recognition system of the fourth embodiment will be described with reference to FIG. 39.
FIGS. 39 and 40 are flowcharts illustrating the automatic settlement process executed by the server 1, the sales floor device 3, and the settlement gate 5 of FIG. 36.
In step S401, a shopper (moving object) enters the store through the entrance 41 of the store 40 (FIG. 33), and the ceiling camera 310 installed near the entrance 41 starts imaging the shopper. When the shopper picks up a basket or a cart and proceeds along the passage 44, the ceiling cameras 310 farther inside start imaging the shopper. In this way, as in the third embodiment, the plurality of ceiling cameras 310 constantly image the entire inside of the store 40, including the shoppers, baskets, and carts. Note that, before step S401, the personal authentication unit 320 may personally authenticate the shopper and acquire the shopper's personal information.
In step S402, the moving object finding unit 3302 using the ceiling camera finds the moving object Mo and assigns it an individual ID. The ID continues to be used until a specific timing such as leaving the store or completing the settlement. The personal authentication unit 320 may personally authenticate the shopper at this timing and acquire the shopper's personal information. Furthermore, the grouping unit 3308 may group a plurality of moving objects Mo as one group, and the blood relationship determination unit 3309 may determine the blood relationship of a plurality of shoppers (moving objects Mo) to supplement the grouping unit 3308.
In step S403, the moving object area definition unit 3304 using the ceiling camera defines a predetermined area including the moving object Mo found by the moving object finding unit 3302 using the ceiling camera. When the moving object Mo moves within the range imaged by the ceiling camera 310, the position of the area of the moving object Mo after the movement is defined anew. The position information is managed in the position information management DB 132, in memory, or the like in association with the ID of the moving object Mo, and is updated each time the area is defined. The defined position is also recognized when it is a position being imaged by another ceiling camera 310.
In step S404, the moving object area tracking unit 3305 using the ceiling camera estimates the position to which the moving object Mo moves within the captured image taken by a given ceiling camera 310. Further, the moving object area definition unit 3304 defines the area of the moving object Mo at the position to which it is estimated to have moved, and updates the position information of the moving object Mo stored in the position information management DB 132 or in memory.
In step S405, as in the third embodiment, the object entry/exit detection unit 3608 using the shelf camera detects the entry or exit of an object into or out of the shelf.
In step S406, as in the third embodiment, triggered by the activation of the object entry/exit detection unit 3608 using the shelf camera, the object recognition unit 3602 using the shelf camera compares the images before and after the image in which the object was taken or the image in which the object was placed, and defines the image area to be targeted for product specification.
Also at step S406, an object acquisition recognition unit using the ceiling camera may recognize that the moving object Mo has acquired an object from inside a shelf or the like.
In step S407, as in the third embodiment, the product specification unit 3603 using the shelf camera specifies which product the object is. When the specified product is a trade-restricted product, the trade-restricted product is associated with the moving object Mo. Also, when it cannot be specified which product the object is, the product-unspecified object is associated with the moving object Mo. The information terminal 9a, the information terminal 9b, and the server 1 may be notified that an error state due to an unspecified product has occurred, and the clerk Mt who recognizes the error state may resolve it by remote operation (naturally, the clerk Mt may also go and resolve the error state directly).
Also at step S407, the label recognition unit may recognize a label associated with the specified product.
Also at step S407, the discount sticker recognition unit may recognize a discount sticker attached to the specified product.
Also at step S407, a product specification unit using the ceiling camera may specify which product the object area acquired by the object acquisition recognition unit using the ceiling camera corresponds to.
In step S408, the product specification unit 3603 using the shelf camera determines whether the specific object visual inspection mode or the all-object visual inspection mode is set.
When the specific object visual inspection mode is set, the process proceeds to step S410.
When the all-object visual inspection mode is set, the process proceeds to step S409.
In step S409, the visual inspection result acquisition unit 301a requests a visual inspection (specification of the product) of the target object at the inspection terminal Q and acquires the visual inspection result.
After step S409, the process proceeds to step S411.
In step S410, the product specification unit 3603 using the shelf camera determines whether the single added object is one whose product cannot be specified. When the product specification unit 3603 using the shelf camera cannot specify which product it is (YES in step S410), the process proceeds to step S409.
When the product specification unit 3603 using the shelf camera can specify which product it is in step S410 (NO in step S410), the product specification unit 3603 using the shelf camera specifies the product, including information held in the storage unit of the server 1 such as the product name, the price, and whether it is a trade-restricted product. The process then proceeds to step S411. The specified product information may be output to the output unit 507.
Alternatively, when the product specification unit 3603 using the shelf camera cannot specify which product it is in step S410, the purchaser may be asked to take the product from the shelf again, for example via a screen display on a display unit (not shown) provided in the sales floor device 3 or a voice announcement from a speaker (not shown). Furthermore, the purchaser and the eye checker or the like may talk with each other via a microphone (not shown) provided in the sales floor device 3.
In step S411, the trade-restricted product determination unit 380 determines whether the product specified by the product specification unit 3603 using the shelf camera is a product for which age confirmation is required.
When it is determined in step S411 that the product specified by the product specification unit 3603 using the shelf camera is a product for which age confirmation is required, that is, when the determination is YES, the process proceeds to step S412.
When it is determined in step S411 that the product specified by the product specification unit 3603 using the shelf camera is not a product for which age confirmation is required, that is, when the determination is NO, the process proceeds to step S417.
In step S412, the output control unit 537 causes the output unit 507 of the settlement gate 5 to display a screen for age confirmation. However, when the shopper's personal information has been acquired and age confirmation is not needed here, step S412 is skipped and the process proceeds to step S416. When there is a problem with the shopper's age verification, the purchaser and the eye checker or the like may talk with each other via a speaker (not shown) and a microphone (not shown) provided in the settlement gate 5.
In step S413, the visual inspection result acquisition unit 301a requests a visual inspection (determination of the trade-restricted product) of the target product at the inspection terminal Q and acquires the visual inspection result.
In step S414, the trade-restricted product determination unit 380 determines whether an instruction to cancel the trade restriction has been received. The trade restriction on a trade-restricted product is released either by a clerk responding on the spot based on the determination result or by the system adopting the result of the visual inspection. Here, the case where the system adopts the result of the visual inspection will be described.
When it is determined in step S414 that an instruction to cancel the trade restriction has not been received, the process proceeds to step S415.
When it is determined in step S414 that an instruction to cancel the trade restriction has been received, that is, when the determination is YES, the process proceeds to step S416.
In step S415, the visual inspection result transmission unit 412 of the inspection terminal Q transmits a warning that the trade restriction has not been released even by the visual inspection result. Upon receiving this warning, for example, the output control unit 537 of the settlement gate 5 presents, via the output unit 507, a warning that the trade restriction has not been released even by the visual inspection result. After step S415, the process ends and the settlement is canceled.
In step S416, the trade-restricted product determination unit 380 releases the trade restriction.
When step S416 is completed in this way, or when it is determined in step S411 that the product is not an age-restricted product (when the determination is NO), the process proceeds to step S417.
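The age-restriction branch of steps S411 to S416 can be summarized by the following illustrative Python sketch; request_inspection is a hypothetical stand-in for the visual inspection result acquisition unit 301a, and the return values are assumptions.

    # A minimal sketch of the age-restriction branch (steps S411 to S416).
    def age_restriction_step(product, shopper, request_inspection):
        if not product.get("age_confirmation_required"):        # S411: NO
            return "proceed"                                     # go on to S417
        if shopper.get("age_verified"):                          # S412 skipped
            return "proceed"                                     # restriction released (S416)
        approved = request_inspection(product, shopper)          # S413: visual inspection
        if approved:                                             # S414: YES
            return "proceed"                                     # S416: release restriction
        return "abort"                                           # S415: warn and cancel settlement

    # Example with a stubbed inspection result
    print(age_restriction_step({"age_confirmation_required": True},
                               {"age_verified": False},
                               lambda p, s: False))              # -> "abort"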
In step S417, the moving object and product association unit 3604 associates the moving object Mo with the specified product.
In step S418, the product list management unit 3606 for products associated with moving objects continues to manage the product list associated with the person until the settlement.
In step S419, the settlement gate 5 performs the settlement and payment based on the product information associated with the moving object Mo.
Note that, at any point up to step S419 or during step S419, the error display unit 151, the information terminal 9a, the information terminal 9b, or the output unit 507 of the settlement gate 5 may notify the clerk or the like of any error state.
Further, when a trade-restricted product is associated with the moving object Mo, the gate may be kept closed without being opened and an error state may be displayed.
In the fourth embodiment, errors assumed in tracking the moving object Mo (shopper) include (A) a case where the moving object finding unit 3302 using the ceiling camera fails to detect a moving object when the shopper enters through the entrance 41 of the store 40, (B) a case where the moving object area tracking unit 3305 using the ceiling camera loses sight of a moving object Mo being tracked, and (C) a case where the IDs associated with two or more different moving objects Mo are swapped during tracking (the errors are, of course, not limited to these).
Depending on each error state, the system adopted in this embodiment takes various actions, including the following examples.
(A) An image captured by the ceiling camera 310 that includes the shopper as a subject is transmitted to the inspection terminal Q, and detection of the shopper by visual inspection is requested. When the eye checker detects the shopper, a new moving object Mo is defined, an ID is issued, and tracking by the moving object area tracking unit 3305 using the ceiling camera is started. When the eye checker cannot detect the shopper, the clerk is notified to that effect.
(B) When a shopper who is not associated with any ID registered in the position information management DB 132 is found, a captured image of that shopper and a list of candidate IDs to be associated are transmitted to the inspection terminal Q. The eye checker associates the most appropriate ID with the shopper based on past association information between shoppers and IDs, and re-tracking by the moving object area tracking unit 3305 using the ceiling camera is started. When re-tracking cannot be started for some reason, the clerk is notified to that effect.
Also, when an ID that has lost its association with a shopper is detected, that ID and a list of captured images of shoppers not associated with any ID are transmitted to the inspection terminal Q. The eye checker attempts to associate a shopper with that ID based on past association information between shoppers and IDs, and when the association succeeds, re-tracking by the moving object area tracking unit 3305 using the ceiling camera is started. When the start of re-tracking fails for some reason, the clerk is notified to that effect.
(C) A list of images of the respective shoppers and a list of the IDs that should be associated with them are transmitted to the inspection terminal Q. The eye checker attempts to assign the IDs to the shoppers as appropriately as possible based on past association information between shoppers and IDs. When the IDs can be appropriately assigned to the shoppers, re-tracking by the moving object area tracking unit 3305 using the ceiling camera is started. When the IDs cannot be appropriately assigned to the shoppers, the assignment is abandoned and the clerk is notified that the associations remain swapped.
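As an illustration of error case (B), the following minimal Python sketch packages the unassociated shopper image together with the candidate IDs for the eye checker and applies the chosen ID before re-tracking; the callback ask_eye_checker, the tracker object, and the needs_retracking flag are assumptions introduced only for the example.

    # A minimal sketch of re-associating a lost shopper with an ID (error case B).
    def reassociate_lost_shopper(shopper_image, candidate_ids, ask_eye_checker, tracker):
        """ask_eye_checker(image, candidates) -> chosen ID or None (hypothetical callback)."""
        chosen_id = ask_eye_checker(shopper_image, candidate_ids)
        if chosen_id is None:
            return "notify_clerk"                  # inspection could not resolve it
        mo = tracker.objects.get(chosen_id)
        if mo is None:
            return "notify_clerk"                  # ID is gone; re-tracking cannot start
        mo.needs_retracking = True                 # hand back to the tracking unit 3305
        return f"re-tracking started for ID {chosen_id}"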
[Other embodiments]
The present invention is not limited to the above-described embodiments, and modifications, improvements, and the like within a range in which the object of the present invention can be achieved are included in the present invention.
For example, the present information processing system may be configured to include product specification means for acquiring the result of attempting to specify an object as a product by using a method in which an eye checker visually confirms an image including the object as a subject, and settlement means for performing settlement processing for the product specified based on the result of the product specification means.
This makes it possible, when a purchaser purchases a product, to automate the settlement of the price of the product while maintaining product specification accuracy equivalent to that of a manned register.
Further, for example, the present information processing system may include a shelf inventory management function. The shelf inventory management function includes clerk discrimination means and inventory information update means.
The clerk discrimination means discriminates between clerks and shoppers. For example, the clerk discrimination means attaches a physical marker that identifies the clerk to something the clerk wears, such as a hat or clothing. The ceiling cameras and the shelf cameras image this physical marker, whereby the clerk is discriminated. This clerk discrimination means can be used effectively particularly in the third and fourth embodiments.
The clerk discrimination means may also include a button provided in a predetermined area such as the backyard of the store. When the button is pressed by a clerk present in the predetermined area, an ID is assigned to that clerk. This clerk discrimination means can be used effectively particularly in the fourth embodiment. Note that the ceiling cameras continue to track the area of a person recognized as a clerk.
The inventory information update means makes full use of the object recognition unit and the product specification unit to update the shelf inventory information in the product DB of the server when a clerk replenishes a shelf with products (addition) or takes products out or discards them (subtraction).
When a product is purchased, the inventory information update means updates the inventory of the shelf on which the product was located and the inventory information of the entire store.
The inventory information update means manages the shelf inventory and the inventory information of the entire store, and when the stock quantity falls below a threshold, it notifies the remote operation unit or automatically places an order.
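A minimal, illustrative sketch of the inventory information update means follows; the class name ShelfInventory and the reorder threshold are assumptions. Shelf stock is adjusted on replenishment, on removal or disposal, and on purchase, and an automatic order (or a notification to the remote operation unit) is triggered when the quantity falls below the threshold.

    # A minimal sketch of shelf inventory updates with a reorder threshold.
    class ShelfInventory:
        def __init__(self, reorder_threshold=5):
            self.stock = {}                          # product_id -> quantity on the shelf
            self.reorder_threshold = reorder_threshold

        def _check_reorder(self, product_id):
            if self.stock.get(product_id, 0) < self.reorder_threshold:
                print(f"auto-order / notify remote operation unit: {product_id}")

        def clerk_replenished(self, product_id, qty):     # addition by the clerk
            self.stock[product_id] = self.stock.get(product_id, 0) + qty

        def clerk_removed(self, product_id, qty):         # take-out or disposal
            self.stock[product_id] = max(0, self.stock.get(product_id, 0) - qty)
            self._check_reorder(product_id)

        def purchased(self, product_id):                  # settlement completed
            self.stock[product_id] = max(0, self.stock.get(product_id, 0) - 1)
            self._check_reorder(product_id)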
 また、本発明は、買い物客の属性推定機能を備えてもよい。買い物客の属性推定機能は、買い物客の大まかな年齢や年代、性別等の属性を例えば顔認識から推定する。買い物客の属性推定機能は、実施形態1では、レジ端末に備えられ、実施形態2乃至4では売場装置に備えられる。なお、実施形態2においては、レジ端末に備えられていてもよい。また、本発明は、購買情報や買い物客の属性情報をPOSシステムに連携する。購買情報は、商品名や金額等の情報をデータ化して、決済完了後にその情報をPOSシステムに連携する。買い物客の属性情報は、上述の買い物客の属性推定機能にて得られた情報をPOSシステム連携する。 Also, the present invention may include a shopper attribute estimation function. The shopper's attribute estimation function estimates attributes of the shopper, such as approximate age, age, and gender, from face recognition, for example. The attribute estimation function of the shopper is provided in the cashier terminal in the first embodiment, and is provided in the sales floor device in the second to fourth embodiments. In the second embodiment, the cashier terminal may be provided. The present invention also links purchase information and shopper attribute information to the POS system. As the purchase information, information such as a product name and an amount of money is converted into data, and the information is linked to the POS system after the settlement is completed. As the attribute information of the shopper, the information obtained by the attribute estimation function of the shopper described above is linked to the POS system.
The shopper attribute estimation function may be linked with, for example, the trade-restricted product determination unit. In that case, if the estimated attributes show that the shopper is clearly in their thirties or older and therefore does not need an age check, the function may control the trade-restricted product determination unit so that no age confirmation is performed.
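A minimal sketch of this linkage might look as follows; the age boundary of 30, the function names, and the fields are illustrative assumptions, and the actual interface to the trade-restricted product determination unit is not specified in this disclosure:

# Hypothetical sketch of linking attribute estimation to the age-check decision.
from dataclasses import dataclass

@dataclass
class ShopperAttributes:
    estimated_age: int
    gender: str  # e.g. "female", "male", "unknown"

AGE_CHECK_EXEMPT_AGE = 30  # assumed boundary: clearly old enough, skip the age check

def needs_age_confirmation(attributes, product_is_age_restricted):
    # Age confirmation is only requested for age-restricted products bought by
    # shoppers who are not clearly above the exemption age.
    if not product_is_age_restricted:
        return False
    return attributes.estimated_age < AGE_CHECK_EXEMPT_AGE

# Example: a shopper estimated to be 42 buying alcohol does not trigger a check.
print(needs_age_confirmation(ShopperAttributes(42, "unknown"), True))  # False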
Here, some problems relating to the above-described first to fourth embodiments and the other embodiments are supplemented.
Image recognition using deep learning is promising as the product recognition method of the product recognition system in each embodiment. However, deep learning requires re-learning when a product cannot be identified or is erroneously recognized, and in that case an image of the correct product must be supplied by hand, which takes a great deal of effort.
As a solution to the above problem, the product recognition system of the present invention implements:
a function that corrects the product recognition at the time of product purchase by visual confirmation by an eye examiner, barcode scanning, or the like;
a function that, when the product recognition is corrected, records the correct product image linked with the necessary information (the corrected product number, a specific code, and so on);
a function that automatically executes deep learning training based on the correct product image and the necessary information; and
a function that automatically deploys the learning result to the product recognition system.
By implementing these functions, the series of deep learning re-learning processes described above can be automated.
Here, deployment of the learning result refers to the process of updating, for example, the deep learning model of the product recognition system based on the learning result so that it can be used for product recognition.
The re-learning process may be configured to start when triggered by a predetermined condition such as a product identification failure, or when triggered by an explicit operation by the person in charge, such as pressing a button.
The deep learning re-learning process described above simplifies the complicated work involved in re-learning deep learning models and reduces manpower, and it provides a product recognition system whose product recognition accuracy keeps improving.
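A minimal sketch of this automated re-learning loop is given below; the class names, the model trainer interface, the batch-size trigger, and the deployment call are all illustrative assumptions rather than the patent's implementation:

# Hypothetical sketch of the automated re-learning pipeline; the trainer and
# recognition-system interfaces and the trigger policy are assumptions.
class RelearningPipeline:
    def __init__(self, model_trainer, recognition_system, retrain_batch_size=100):
        self.model_trainer = model_trainer            # assumed: train(samples) -> model
        self.recognition_system = recognition_system  # assumed: deploy(model)
        self.retrain_batch_size = retrain_batch_size
        self.correction_log = []                      # (correct product image, product info)

    def record_correction(self, image, product_info):
        # Called when the recognition result is corrected by an eye examiner's visual
        # check or a barcode scan: the correct product image is recorded linked with
        # the necessary information (corrected product number, specific code, etc.).
        self.correction_log.append((image, product_info))
        if len(self.correction_log) >= self.retrain_batch_size:
            self.retrain()  # start triggered by a predetermined condition

    def retrain(self):
        # May also be called directly, e.g. when the person in charge presses a button.
        if not self.correction_log:
            return
        model = self.model_trainer.train(self.correction_log)
        self.recognition_system.deploy(model)  # update e.g. the deep learning model
        self.correction_log.clear()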
In each of the embodiments, including the above-described first to fourth embodiments and the other embodiments, the image recognition method applied when identifying an object as a product may be, for example, object recognition using AI (Artificial Intelligence), object recognition using feature points and feature amounts, shape recognition of a product logo, or character recognition (recognition of character shapes using AI, recognition of characters using OCR (Optical Character Recognition), and so on). In addition to image recognition, an object can be recognized as a product by combining one or more of the following recognition methods: weight recognition (recognizing the weight of an object with a weight sensor), product recognition by scanning identification information such as a barcode or QR code (registered trademark) (scanning with an infrared sensor, image reading, or the like), and product recognition by reading electronically recorded identification information such as RFID. In this case, the recognition result of each method can be scored, and the object can be estimated to be the product with the highest score.
Furthermore, recognition methods such as the weight recognition, the product recognition by scanning identification information such as a barcode or QR code (registered trademark), and the product recognition by reading electronically recorded identification information can also be combined with each other, one or more at a time, without being combined with an image recognition method, to recognize an object as a product.
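The score-based combination described above could be sketched as follows; the recognizer interface (each method returning candidate products with scores) and the equal weighting of methods are illustrative assumptions:

# Hypothetical sketch of combining several recognition methods by score.
from collections import defaultdict

def identify_product(observation, recognizers):
    # Each recognition method returns {product_id: score}; scores are accumulated
    # and the object is estimated to be the product with the highest total score.
    totals = defaultdict(float)
    for recognizer in recognizers:
        for product_id, score in recognizer(observation).items():
            totals[product_id] += score
    if not totals:
        return None  # no candidate from any method; e.g. hand over to the visual check
    return max(totals, key=totals.get)

# Example with two stub recognizers standing in for image- and weight-based methods.
image_recognizer = lambda obs: {"cola_500ml": 0.8, "cola_350ml": 0.2}
weight_recognizer = lambda obs: {"cola_500ml": 0.6}
print(identify_product({}, [image_recognizer, weight_recognizer]))  # cola_500ml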
In each of the above-described embodiments, the visual inspection of an object or a product is performed by the visual inspection terminal Q, but the present invention is not limited to this. That is, the visual inspection by the eye examiner may be performed by another device such as the cashier terminal 2, the settlement machine 4, or the information terminal 9.
In each of the above-described embodiments, one visual inspection terminal Q may be provided for each store or each cashier terminal 2, or one visual inspection terminal Q may be provided for a plurality of stores or a plurality of cashier terminals.
The hybrid check system (visual inspection) in the above-described embodiments is not necessarily used only for product identification. That is, it can also be used to determine, for example, whether a product is (A) a product such as tobacco or liquor that cannot be purchased below a certain age, (B) a product past its use-by or best-before date, (C) a product containing allergenic ingredients that some shoppers should not ingest because of their constitution, or (D) a product subject to religious restrictions, such as a non-halal food.
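A small sketch of how such non-identification checks might be routed to the hybrid check is given below; the flag names and the shopper profile fields are illustrative assumptions:

# Hypothetical sketch of routing restricted or unsuitable products to the visual check.
def needs_visual_check(product_record, shopper_profile):
    # Returns the reasons, if any, why the purchase should be routed to the eye examiner.
    reasons = []
    if product_record.get("age_restricted") and not shopper_profile.get("age_verified"):
        reasons.append("age confirmation required")
    if product_record.get("expired"):
        reasons.append("past use-by / best-before date")
    shared_allergens = set(product_record.get("allergens", [])) & set(shopper_profile.get("allergies", []))
    if shared_allergens:
        reasons.append("contains allergens: " + ", ".join(sorted(shared_allergens)))
    if shopper_profile.get("halal_only") and not product_record.get("halal", True):
        reasons.append("religious restriction (non-halal food)")
    return reasons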
In the present invention, the image processing by the above-described CPU may instead be performed by a GPU (Graphics Processing Unit).
The illustrated hardware configurations and block diagrams are merely examples for achieving the object of the present invention, and the present invention is not limited to the illustrated examples.
The locations of the functional blocks are not limited to those shown in the figures and may be arbitrary. For example, one functional block may be configured by hardware alone, by software alone, or by a combination thereof. When the processing of each functional block is executed by software, the program constituting that software is installed in a computer or the like from a network or a recording medium.
The computer may be a computer incorporated in dedicated hardware, or may be a smartphone or a personal computer.
The processing of trade-restricted products described in the first embodiment can also be applied to the second to fourth embodiments. However, in the second embodiment, the handling of trade-restricted products is applied to age-restricted products.
In the present specification, the steps describing the program recorded on the recording medium include not only processing performed in time series in the described order but also processing executed in parallel or individually without necessarily being processed in time series.
In the present specification, the term "system" means an overall apparatus configured from a plurality of devices, a plurality of means, and the like.
In summary, it suffices that an information processing system to which the present invention is applied has the following configuration, and various embodiments other than the above-described first to fourth embodiments can also be adopted.
That is, the information processing system includes:
a first identifying means (for example, the visual inspection result acquisition unit 239 and the visual inspection terminal Q) that obtains the result of attempting to identify an object as a product by using a first method in which an eye examiner visually confirms an image including the object as a subject; and
a settlement means (for example, the settlement unit 237) that performs settlement processing for the product identified based on the result of the first identifying means.
As a result, when a purchaser purchases a product, the settlement of the price of the product can be automated while maintaining product identification accuracy equivalent to that of a manned cash register.
Furthermore, such an automatic settlement processing system makes it possible to complete the product purchase procedure between the purchaser and the system, which lowers the hurdle to realizing an unmanned store.
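A minimal sketch of this configuration, with hypothetical interfaces for the first identifying means and the settlement means (the disclosure does not define these signatures), might look as follows:

# Hypothetical sketch of the minimal configuration: visual-check identification plus settlement.
from typing import Optional, Protocol

class FirstIdentifyingMeans(Protocol):
    def identify(self, image: bytes) -> Optional[str]:
        ...  # returns a product ID based on the eye examiner's visual check, or None

class SettlementMeans(Protocol):
    def settle(self, product_id: str) -> None:
        ...  # performs the settlement processing for the identified product

def process_purchase(image, identifier, settlement):
    # Minimal flow: obtain the visual-check result and settle the identified product.
    product_id = identifier.identify(image)
    if product_id is None:
        return False  # the object could not be identified as a product
    settlement.settle(product_id)
    return True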
The information processing system may further include a second identifying means (for example, the object recognition unit 233) that attempts to identify the object as a product by using a second method other than the first method, and the settlement means may be configured to perform the settlement processing for the product identified based on the result of at least one of the first identifying means and the second identifying means.
The second method may include at least a predetermined image recognition method (for example, deep learning).
The information processing system may further include an image recognition re-learning means that, for the object not identified as a product by the predetermined image recognition method,
links the image with the product information of the object identified as a product by the first identifying means and the second identifying means using one or more methods not including the predetermined image recognition method,
re-learns the predetermined image recognition method using information including the linked product information and the image, and
deploys the learning result obtained by the re-learning to the second identifying means.
The second method may include at least one of weight recognition, scanning of identification information, and reading of electronically recorded identification information.
For the object not identified as a product by the second method, the product identified based on the result of the first identifying means may be the target of the settlement processing by the settlement means.
The product identified based on the results of both the first identifying means and the second identifying means may be the target of the settlement processing by the settlement means.
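As an illustration of these two configurations, the following Python sketch (hypothetical function and parameter names, not defined by the patent) settles on the automatic result when available, falling back to the eye examiner's result otherwise, or requires both results to agree under the stricter configuration:

# Hypothetical sketch of choosing which identification result is settled.
def product_to_settle(first_result, second_result, require_both=False):
    # first_result:  product identified from the eye examiner's visual check (or None)
    # second_result: product identified by the second, automatic method (or None)
    if require_both:
        # Stricter configuration: both identifying means must yield the same product.
        if first_result is not None and first_result == second_result:
            return first_result
        return None
    # Default: prefer the automatic result and fall back to the visual check.
    return second_result if second_result is not None else first_result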
1 ... Server, 2, 2-1, 2-n ... Cashier terminal, 3 ... Sales floor device, 4, 4-1, 4-n, 5a ... Settlement machine, 5, 5-1, 5-2, 5-3, 5-n ... Settlement gate, 6 ... Settlement register, 9, 9a, 9b ... Information terminal, 10, 20, 30, 40 ... Store, 11, 22 ... Doorway, 12, 25 ... Checkout counter, 13, 26 ... Manned register, 14, 23, 33, 43 ... Shelf rack, 15, 24, 34, 44 ... Aisle, 21 ... Gate, 31, 41 ... Entrance, 32, 42 ... Exit, 35, 45 ... Settlement area, 36 ... Register stand, 101, 201, 301, 401, 501 ... CPU, 102, 202, 302, 502 ... ROM, 103, 203, 303, 503 ... RAM, 104, 204, 304, 504 ... Bus, 105, 205, 305, 505 ... Input/output interface, 106, 206, 406, 506 ... Output unit, 107, 207, 407, 507 ... Input unit, 108, 208, 508 ... Storage unit, 109, 213, 315, 409, 509 ... Communication unit, 110, 214 ... Drive, 120, 220 ... Removable medium, 131 ... Product DB, 132 ... Position information management DB, 141 ... DB management unit, 150 ... Error determination unit, 151 ... Error display unit, 152 ... Error cancellation unit, 209 ... Light shielding unit, 210 ... Presentation unit, 211 ... Cash register camera, 212 ... External camera, 221 ... Illumination unit, 228 ... Light emission control unit, 229 ... Light shielding control unit, 230 ... Presentation control unit, 231, 320 ... Personal authentication unit, 232 ... Image acquisition unit, 233 ... Object recognition unit, 234 ... Object quantity recognition unit, 235 ... Product identification unit, 236 ... Trade-restricted product determination unit, 237, 435, 535 ... Settlement unit, 238 ... Display control unit, 239, 301a ... Visual inspection result acquisition unit, 241 ... DB information holding unit, 270 ... Surrounding unit, 271 ... Top plate unit, 272 ... Bottom plate unit, 273 ... Side plate unit, 275 ... Housing unit, 276 ... Plate, 310 ... Ceiling camera, 311 ... Shelf camera, 312 ... Basket camera, 321 ... Personal information acquisition unit, 330 ... Moving object tracking unit, 340 ... Position information management unit, 341 ... Inter-camera information transfer unit, 342 ... Camera position definition unit, 343 ... Moving object display unit, 350 ... Book count unit, 351 ... Book count recognition unit, 352 ... Moving object / book count association unit, 353 ... Unspecified book count determination unit, 354 ... Book count management unit associated with moving object, 355 ... Ceiling-camera recognition unit for transfers between moving object regions, 356 ... Ceiling-camera recognition unit for number of transferred books, 360 ... Shelf product recognition unit, 370 ... Basket product recognition unit, 372 ... Basket-camera basket entry/exit detection unit, 373 ... Basket-camera object recognition unit, 374 ... Basket-camera product identification unit, 375 ... Basket product non-identification determination unit, 376 ... Basket product label recognition unit, 377 ... Basket product discount sticker recognition unit, 380 ... Trade-restricted product determination unit, 390 ... Remote operation unit, 391, D ... Display unit, 411 ... Image display control unit, 412 ... Visual inspection result transmission unit, 436, 536 ... Input control unit, 437, 537 ... Output control unit, 3302 ... Ceiling-camera moving object discovery unit, 3303 ... Ceiling-camera basket discovery unit, 3304 ... Ceiling-camera moving object region definition unit, 3305 ... Ceiling-camera moving object region tracking unit, 3306 ... Ceiling-camera basket region definition unit, 3307 ... Ceiling-camera basket region tracking unit, 3308 ... Grouping unit, 3309 ... Blood relationship determination unit, 3310 ... Ceiling-camera recognition unit for transfers between basket regions, 3311 ... Ceiling-camera recognition unit for transfers between moving object regions, 3312 ... Ceiling-camera transferred object recognition unit, 3313 ... Ceiling-camera transferred product identification unit, 3602 ... Shelf-camera object recognition unit, 3603 ... Shelf-camera product identification unit, 3604 ... Moving object / product association unit, 3605 ... Basket / product association unit, 3606 ... Product list management unit associated with moving object, 3607 ... Product list management unit associated with basket, 3608 ... Shelf-camera object entry/exit detection unit, 3609 ... Product non-identification determination unit, 3610 ... Label recognition unit, 3611 ... Discount sticker recognition unit, 3612 ... Basket entry/exit detection unit by shelf camera or ceiling camera, A ... Predetermined area, C ... Card reader unit, Mo ... Moving object, Mt ... Salesclerk, M ... Microphone, N ... Network, Q ... Visual inspection terminal, R ... Receipt unit, S ... Speaker

Claims (9)

  1.  An information processing system comprising:
      a first identifying means for obtaining a result of attempting to identify an object as a product by using a first method in which an eye examiner visually confirms an image including the object as a subject; and
      a settlement means for performing settlement processing for the product identified based on the result of the first identifying means.
  2.  The information processing system according to claim 1, further comprising a second identifying means for attempting to identify the object as a product by using a second method other than the first method,
      wherein the settlement means performs the settlement processing for the product identified based on the result of at least one of the first identifying means and the second identifying means.
  3.  The information processing system according to claim 2, wherein the second method includes at least a predetermined image recognition method.
  4.  The information processing system according to claim 3, further comprising an image recognition re-learning means that, for the object not identified as a product by the predetermined image recognition method,
      links the image with the product information of the object identified as a product by the first identifying means and the second identifying means using one or more methods not including the predetermined image recognition method,
      re-learns the predetermined image recognition method using information including the linked product information and the image, and
      deploys the learning result obtained by the re-learning to the second identifying means.
  5.  The information processing system according to any one of claims 2 to 4, wherein the second method includes at least one of weight recognition, scanning of identification information, and reading of electronically recorded identification information.
  6.  The information processing system according to any one of claims 2 to 5, wherein, for the object not identified as a product by the second method, the product identified based on the result of the first identifying means is the target of the settlement processing by the settlement means.
  7.  The information processing system according to any one of claims 2 to 5, wherein the product identified based on the results of both the first identifying means and the second identifying means is the target of the settlement processing by the settlement means.
  8.  An information processing method executed by an information processing apparatus, the method comprising:
      an identifying step of obtaining a result of attempting to identify an object as a product by using a method in which an eye examiner visually confirms an image including the object as a subject; and
      a settlement step of performing settlement processing for the product identified based on the result of the identifying step.
  9.  A program for causing a computer to execute control processing comprising:
      an identifying step of obtaining a result of attempting to identify an object as a product by using a method in which an eye examiner visually confirms an image including the object as a subject; and
      a settlement step of performing settlement processing for the product identified based on the result of the identifying step.
PCT/JP2019/040161 2018-10-12 2019-10-11 Information processing system WO2020075837A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-193014 2018-10-12
JP2018193014A JP6653813B1 (en) 2018-10-12 2018-10-12 Information processing system

Publications (1)

Publication Number Publication Date
WO2020075837A1 true WO2020075837A1 (en) 2020-04-16

Family

ID=69624535

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/040161 WO2020075837A1 (en) 2018-10-12 2019-10-11 Information processing system

Country Status (2)

Country Link
JP (1) JP6653813B1 (en)
WO (1) WO2020075837A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115311830A (en) * 2021-01-19 2022-11-08 东芝泰格有限公司 Informing device and informing method

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021234842A1 (en) * 2020-05-20 2021-11-25 日本電気株式会社 Processing system, processing method, and program
WO2022003888A1 (en) * 2020-07-02 2022-01-06 日本電気株式会社 Warning apparatus, system, method, and non-transitory computer-readable medium having program stored therein
US20230005267A1 (en) * 2021-06-30 2023-01-05 Fujitsu Limited Computer-readable recording medium, fraud detection method, and fraud detection apparatus
JP7318684B2 (en) * 2021-07-30 2023-08-01 富士通株式会社 Information processing program, information processing method, and information processing apparatus
JP7318683B2 (en) * 2021-07-30 2023-08-01 富士通株式会社 Information processing program, information processing method, and information processing apparatus

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016185845A (en) * 2015-03-27 2016-10-27 日本電気株式会社 Inspection processing apparatus, inspection processing method and program
JP2017139019A (en) * 2017-05-01 2017-08-10 東芝テック株式会社 Information processing device, and program
JP2017157216A (en) * 2016-02-29 2017-09-07 サインポスト株式会社 Information processing system
JP2017220202A (en) * 2016-06-02 2017-12-14 サインポスト株式会社 Information processing system, information processing method, and program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5132732B2 (en) * 2010-08-23 2013-01-30 東芝テック株式会社 Store system and program

Also Published As

Publication number Publication date
JP6653813B1 (en) 2020-02-26
JP2020061044A (en) 2020-04-16

Similar Documents

Publication Publication Date Title
JP6968399B2 (en) Information processing system
JP6653813B1 (en) Information processing system
US11948364B2 (en) Portable computing device installed in or mountable to a shopping cart
JP7093783B2 (en) Systems and methods for a dynamic customer checkout experience in an automated shopping environment
RU2727084C1 (en) Device and method for determining order information
CN111626681B (en) Image recognition system for inventory management
JP7225434B2 (en) Information processing system
RU2739542C1 (en) Automatic registration system for a sales outlet
US10383461B2 (en) System of control and identification of goods in a shop
JP6836256B2 (en) Information processing system
CN111919233A (en) Shop management apparatus and shop management method
TWM570489U (en) Smart store shopping system
TW202006628A (en) Smart store shopping system and purchasing method using thereof
JP6735888B2 (en) Product data processing system, product data processing method
US20220270061A1 (en) System and method for indicating payment method availability on a smart shopping bin
JP2021179927A (en) Sales system, settlement device and program
JP2021128363A (en) Commodity sales data processing system, portable terminal device, and computer
JP2024037466A (en) Information processing system, information processing method and program
JP2020187776A (en) Information processing device, program and information processing method
JP2022045990A (en) Commodity sales data processing system, portable terminal, and program
JP2023028007A (en) program
CN117043809A (en) Information processing apparatus, information processing method, and recording medium
JP2022008316A (en) Check apparatus and check program
JP2022098820A (en) Item sales data processing system and program
JP2022164939A (en) Sales system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19871137

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19871137

Country of ref document: EP

Kind code of ref document: A1