WO2022190164A1 - Payment System, Payment Method, and Computer Program - Google Patents
Payment System, Payment Method, and Computer Program
- Publication number: WO2022190164A1
- Application number: PCT/JP2021/008957
- Authority: WIPO (PCT)
- Prior art keywords: customer, product, payment, camera, image
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/08—Payment architectures
- G06Q20/20—Point-of-sale [POS] network systems
- G06Q20/208—Input by product or record sensing, e.g. weighing or scanner processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/38—Payment protocols; Details thereof
- G06Q20/40—Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
- G06Q20/401—Transaction verification
- G06Q20/4014—Identity check for transactions
- G06Q20/40145—Biometric identity checks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/38—Payment protocols; Details thereof
- G06Q20/42—Confirmation, e.g. check or permission by the legal debtor of payment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/197—Matching; Classification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/40—Spoof detection, e.g. liveness detection
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07G—REGISTERING THE RECEIPT OF CASH, VALUABLES, OR TOKENS
- G07G1/00—Cash registers
- G07G1/0036—Checkout procedures
Definitions
- This disclosure relates to the technical field of payment systems, payment methods, and computer programs for product payment processing.
- As an authentication device capable of authenticating a target person, an authentication device that performs authentication processing using a plurality of types of biometric information (that is, composite biometric authentication or multimodal biometric authentication) is known.
- Japanese Patent Laid-Open No. 2002-100001 discloses a technique of performing authentication processing by fusing outputs from a plurality of biometric authentication devices.
- Japanese Patent Laid-Open No. 2002-200002 discloses a technique of performing weighted addition of authentication levels of two authentication methods and performing identity verification based on the resulting authentication level.
- This disclosure has been made in view of, for example, the above-cited documents, and aims to provide a payment system, a payment method, and a computer program that can appropriately execute product payment processing.
- One aspect of the payment system disclosed herein includes: product reading means for reading a product; product information acquisition means for acquiring product information related to the read product; confirmation information output means for outputting confirmation information for confirming the customer's intention to pay for the product; input reception means for receiving input from the customer in response to the confirmation information; face acquisition means for acquiring a face image of the customer; iris acquisition means for acquiring an iris image of the customer; and payment processing means for executing payment processing for the product based on the input from the customer and at least one of the face image and the iris image.
- One aspect of the payment method disclosed herein reads a product, acquires product information related to the read product, outputs confirmation information for confirming the customer's intention to pay for the product, receives input from the customer in response to the confirmation information, acquires a face image of the customer, acquires an iris image of the customer, and executes payment processing for the product based on the input from the customer and at least one of the face image and the iris image.
- One aspect of the computer program disclosed herein operates a computer to read a product, acquire product information related to the read product, output confirmation information for confirming the customer's intention to pay for the product, receive input from the customer in response to the confirmation information, acquire a face image of the customer, acquire an iris image of the customer, and execute payment processing for the product based on the input from the customer and at least one of the face image and the iris image.
- FIG. 1 is a block diagram showing the hardware configuration of the payment system according to the first embodiment.
- FIG. 2 is a block diagram showing the functional configuration of the payment system according to the first embodiment.
- FIG. 3 is a flow chart showing the flow of operations of the payment system according to the first embodiment.
- FIG. 4 is a schematic diagram (part 1) showing the configuration of the camera according to the second embodiment.
- FIG. 5 is a schematic diagram (part 2) showing the configuration of the camera according to the second embodiment.
- FIG. 6 is a plan view showing the relationship between the imaging range of the face camera and the imaging range of the iris camera.
- FIG. 7 is a schematic diagram showing an example of a visible light cut filter provided in the illumination unit.
- FIG. 8 is a schematic diagram showing a configuration in which the motor is fixed to a fixing portion outside the device.
- FIG. 9 is a conceptual diagram showing the driving directions of the camera according to the second embodiment.
- FIG. 10 is a flow chart showing the flow of operations of the camera according to the second embodiment.
- FIG. 11 is a conceptual diagram showing an example of a method of adjusting the imaging range based on face position.
- FIG. 12 is a block diagram showing the functional configuration of a payment system according to the third embodiment.
- FIG. 13 is a conceptual diagram showing a display example when reading a product.
- FIG. 14 is a block diagram showing the functional configuration of a payment system according to the fourth embodiment.
- FIG. 15 is a block diagram showing the functional configuration of a payment system according to the fifth embodiment.
- FIG. 16 is a conceptual diagram showing a display example of a gaze area.
- FIG. 17 is a conceptual diagram showing a display example of a gaze area considering the position of the camera.
- FIG. 18 is a block diagram showing the functional configuration of a payment system according to the sixth embodiment.
- FIG. 19 is a conceptual diagram showing a display example of a frame that gradually converges on a region of interest.
- FIG. 20 is a block diagram showing the functional configuration of a payment system according to the seventh embodiment.
- FIG. 21 is a conceptual diagram showing a display example in which the color of the gaze area is gradually changed toward the outside of the screen.
- FIG. 22 is a block diagram showing the functional configuration of a payment system according to the eighth embodiment.
- FIG. 23 is a flow chart showing the flow of operations of the payment system according to the eighth embodiment.
- FIG. 24 is a conceptual diagram showing a display example when final confirmation of payment intention is made.
- FIG. 25 is a conceptual diagram showing a display example when the distance to the camera is within the proper range.
- FIG. 26 is a conceptual diagram showing a display example for notifying the line-of-sight direction when performing iris authentication.
- FIG. 27 is a conceptual diagram showing a display example for prompting removal of a wearable item.
- FIG. 28 is a conceptual diagram showing a display example for notifying the user to open their eyes.
- FIG. 29 is a schematic diagram showing a modification of the camera.
- FIG. 30 is a conceptual diagram showing a display example of a cancel button.
- FIG. 31 is a conceptual diagram showing a display example after cancellation.
- FIG. 32 is a conceptual diagram showing a display example of a number change button.
- FIG. 33 is a conceptual diagram showing a display example of an amount change button.
- FIG. 34 is a conceptual diagram showing a display example when there are products that require age confirmation.
- FIG. 1 is a block diagram showing the hardware configuration of the payment system according to the first embodiment.
- The payment system 10 includes a processor 11, a RAM (Random Access Memory) 12, a ROM (Read Only Memory) 13, and a storage device 14. The payment system 10 may further comprise an input device 15, an output device 16, and a camera 20. The processor 11, RAM 12, ROM 13, storage device 14, input device 15, output device 16, and camera 20 are connected via a data bus 17.
- the processor 11 reads a computer program.
- The processor 11 is configured to read a computer program stored in at least one of the RAM 12, the ROM 13, and the storage device 14.
- the processor 11 may read a computer program stored in a computer-readable recording medium using a recording medium reader (not shown).
- the processor 11 may acquire (that is, read) a computer program from a device (not shown) arranged outside the payment system 10 via a network interface.
- the processor 11 controls the RAM 12, the storage device 14, the input device 15 and the output device 16 by executing the read computer program.
- the processor 11 implements functional blocks for executing processing related to product settlement.
- As the processor 11, one of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an FPGA (Field-Programmable Gate Array), a DSP (Digital Signal Processor), and an ASIC (Application Specific Integrated Circuit) may be used, or a plurality of them may be used in parallel.
- the RAM 12 temporarily stores computer programs executed by the processor 11.
- the RAM 12 temporarily stores data temporarily used by the processor 11 while the processor 11 is executing the computer program.
- the RAM 12 may be, for example, a D-RAM (Dynamic RAM).
- the ROM 13 stores computer programs executed by the processor 11 .
- the ROM 13 may also store other fixed data.
- the ROM 13 may be, for example, a P-ROM (Programmable ROM).
- the storage device 14 stores data that the payment system 10 saves over the long term.
- Storage device 14 may act as a temporary storage device for processor 11 .
- the storage device 14 may include, for example, at least one of a hard disk device, a magneto-optical disk device, an SSD (Solid State Drive), and a disk array device.
- the input device 15 is a device that receives input instructions from users of the payment system 10 .
- Input device 15 may include, for example, at least one of a keyboard, mouse, and touch panel.
- the output device 16 is a device that outputs information about the payment system 10 to the outside.
- The output device 16 may be a display device (e.g., a display) capable of displaying information about the payment system 10.
- the camera 20 is a camera capable of imaging the iris and face of a living body.
- Camera 20 may be configured, for example, as a near-infrared camera.
- the camera 20 may be arranged at a position such that its imaging range includes the periphery of the living body's face.
- The camera 20 may be a camera that captures still images, or a camera that captures moving images. Also, as will be described later, there may be two cameras instead of one: for example, a visible light camera for capturing the face image and a near-infrared camera capable of capturing an image of the iris. Alternatively, both may be near-infrared cameras, or both may be visible light cameras.
- FIG. 2 is a block diagram showing the functional configuration of the payment system according to the first embodiment.
- The payment system 10 includes a product reading unit 110, a product information acquisition unit 120, a confirmation information output unit 130, an input reception unit 140, a face image acquisition unit 150, an iris image acquisition unit 160, and a payment processing unit 170.
- The product reading unit 110, product information acquisition unit 120, confirmation information output unit 130, input reception unit 140, face image acquisition unit 150, iris image acquisition unit 160, and payment processing unit 170 may each be implemented, for example, in the processor 11 described above. Further, the product reading unit 110, the face image acquisition unit 150, and the iris image acquisition unit 160 may be configured to include the camera 20 described above.
- the input reception unit 140 may be configured including the input device 15 and the camera 20 described above.
- the confirmation information output unit 130 may be configured including the output device 16 described above (more specifically, a display device such as a display).
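The functional configuration above can be pictured as a set of plain classes. The following is a minimal illustrative sketch only: the class and method names, the toy in-memory catalog, and the confirmation-message format are all assumptions for explanation and do not appear in the patent.

```python
from dataclasses import dataclass

@dataclass
class Product:
    barcode: str
    name: str
    price: int  # price in the smallest currency unit (hypothetical)

class PaymentSystem:
    """Toy stand-in for payment system 10 and its functional units."""

    def __init__(self, catalog):
        self.catalog = catalog  # stands in for a product database
        self.scanned = []       # products read so far

    def read_product(self, barcode):
        """Product reading unit 110: read a product (here, by barcode)."""
        return barcode

    def acquire_product_info(self, barcode):
        """Product information acquisition unit 120: look up product info."""
        product = self.catalog[barcode]
        self.scanned.append(product)
        return product

    def confirmation_info(self):
        """Confirmation information output unit 130: message shown to the
        customer to confirm the intention to pay."""
        total = sum(p.price for p in self.scanned)
        names = ", ".join(p.name for p in self.scanned)
        return f"Pay {total} for: {names}?"

catalog = {"4901234567890": Product("4901234567890", "coffee", 150)}
system = PaymentSystem(catalog)
code = system.read_product("4901234567890")
info = system.acquire_product_info(code)
print(system.confirmation_info())  # Pay 150 for: coffee?
```

In a real system the catalog lookup would hit the product database mentioned below, and the confirmation message would be rendered on the display device rather than returned as a string.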
- the product reading unit 110 is configured to be able to read products.
- the product reading unit 110 may be configured to read the product by acquiring an image of the product using the camera 20 .
- the product reading unit 110 may be configured to be able to read products using various readers and scanners (for example, barcode scanners, etc.).
- the merchandise reading unit 110 may be configured to be able to read merchandise placed at a predetermined position, or may be configured to be able to read merchandise held by a customer.
- the product information acquisition unit 120 is configured to be able to acquire product information related to products read by the product reading unit 110 .
- product information include bar code information, price, product name, stock quantity, and the like.
- the product information acquisition unit 120 may read and acquire product information from a database or the like in which product information is stored in advance.
- the confirmation information output unit 130 is configured to be capable of outputting confirmation information for confirming the customer's payment intention (that is, whether or not there is an intention to purchase the product) for the product read by the product reading unit 110.
- the confirmation information output unit 130 may be configured to output image information using, for example, a display device.
- the confirmation information output unit 130 may be configured to output audio information using a speaker or the like. A specific example of the confirmation information will be described in detail in another embodiment described later.
- the input reception unit 140 is configured to be able to receive input from the customer with respect to the confirmation information (in other words, information regarding the intention of payment).
- the input reception unit 140 may receive input from the customer using, for example, a camera.
- the input reception unit 140 may receive input from the customer using the input device 15, for example. A specific method of receiving input from the customer will be described in detail in other embodiments described later.
- the facial image acquisition unit 150 is configured to be able to acquire the customer's facial image.
- a face image is an image including a customer's face, and is typically an image captured so that the entire face is included. Moreover, the face image may be an image including parts other than the customer's face.
- the face image acquiring section 150 may acquire an image captured by the camera 20 as a face image.
- the iris image acquisition unit 160 is configured to be able to acquire a customer's iris image.
- the iris image is an image including the customer's iris, and is typically a high-definition image to the extent that the pattern of the iris can be seen.
- The iris image may be an image including portions other than the customer's iris.
- the iris image acquisition section 160 may acquire an image captured by the camera 20 as an iris image.
- camera 20 may include a plurality of cameras that capture each of the face image and iris image described above. A specific example of such a camera 20 will be described in detail in another embodiment described later.
- The payment processing unit 170 is configured to be able to execute payment processing for products read by the product reading unit 110. Based on the input from the customer received by the input reception unit 140 and at least one of the face image acquired by the face image acquisition unit 150 and the iris image acquired by the iris image acquisition unit 160, the payment processing unit 170 executes the payment processing.
- the settlement processing unit 170 may determine whether or not to execute settlement processing based on the input from the customer received by the input receiving unit 140, for example.
- The payment processing unit 170 may also perform authentication processing (that is, identity verification) based on at least one of the face image acquired by the face image acquisition unit 150 and the iris image acquired by the iris image acquisition unit 160.
- the database of the payment service may store customer's biometric information (for example, information on the face image, iris image, etc. used in the authentication process) and information on the financial institution in association with each other. In this case, in the settlement process after the authentication process, the settlement amount is withdrawn from the bank account of the customer specified in the authentication process.
- The customer's biometric information may instead be stored in association with credit card information, electronic payment service account information (in this case, the payment is withdrawn from the charged balance), a mobile phone number (in this case, the payment is combined with the mobile phone usage fee for billing), or the like.
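The association between an authenticated customer and a payment method can be sketched as a simple lookup. This is an illustrative assumption only: the directory structure, field names, and `settle` helper are hypothetical and not taken from the patent.

```python
# Hypothetical directory mapping a customer identified by face/iris
# authentication to the payment method registered for that customer.
payment_directory = {
    "customer-001": {"method": "bank_account", "account": "JP-1234"},
    "customer-002": {"method": "credit_card", "card_token": "tok_abc"},
    "customer-003": {"method": "mobile_billing", "phone": "+81-90-0000-0000"},
}

def settle(customer_id, amount):
    """After authentication identifies the customer, settle the amount
    via the payment method associated with the biometric record."""
    entry = payment_directory.get(customer_id)
    if entry is None:
        return "no payment method registered"
    if entry["method"] == "bank_account":
        return f"withdraw {amount} from {entry['account']}"
    if entry["method"] == "credit_card":
        return f"charge {amount} to card {entry['card_token']}"
    return f"add {amount} to mobile bill {entry['phone']}"

print(settle("customer-001", 150))  # withdraw 150 from JP-1234
```

The point of the association is that once the authentication process has identified the customer, no further payment credentials need to be presented at the register.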
- Alternatively, as described in another embodiment below, a screen for selecting a payment method may be displayed, and the customer may be allowed to select a payment method based on line-of-sight direction estimation.
- FIG. 3 is a flow chart showing the operation flow of the payment system according to the first embodiment.
- the product reading unit 110 first reads the product (step S101). Then, the product information acquisition unit 120 acquires the product information of the product read by the product reading unit 110 (step S102). Note that the processes of steps S101 and S102 described above may be collectively executed for a plurality of products.
- the confirmation information output unit 130 outputs confirmation information for confirming the intention of payment (step S103). Then, based on the input from the customer, the input reception unit 140 determines whether or not the customer has an intention to make a payment using biometric authentication (step S104). The intention to make a payment determined here may be, for example, to confirm whether or not the products read are correct (for example, whether there are any missing products, or whether unnecessary products have been read, etc.). Note that the input receiving unit 140 may determine that there is an intention to make a payment when there is an input from the customer, and may determine that there is no intention to make a payment when there is no input from the customer.
- If it is determined that the customer has no intention of making a payment using biometric authentication (step S104: NO), the series of processes ends. That is, the payment processing unit 170 determines that the customer does not intend to purchase the product using biometric authentication, and terminates the operation without executing payment processing. In this case, the customer either cancels the payment or uses a means other than biometric authentication.
- the facial image acquiring unit 150 acquires the customer's facial image (step S105).
- the iris image acquisition unit 160 acquires the customer's iris image (step S106).
- The payment processing unit 170 then executes customer authentication processing based on at least one of the acquired face image and iris image (step S107).
- For the authentication processing using the face image and the iris image, existing techniques can be appropriately adopted, so a detailed description is omitted here. Further, steps S105 and S106 may be performed at the same time.
- If the customer authentication process does not succeed (step S107: NO), the series of processes ends. That is, the payment processing unit 170 determines that the person is not the genuine customer (for example, someone is impersonating the customer), and terminates the operation without executing the payment processing. Alternatively, the customer may be asked whether to attempt payment by biometric authentication again. On the other hand, if the customer authentication process succeeds (step S107: YES), the payment processing unit 170 executes the product payment process using the product information (step S108).
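The flow of steps S101 to S108 can be sketched as a single function. This is a minimal illustration only: the helper callables stand in for the units described above, and their names and return values are assumptions, not the patent's implementation.

```python
def run_payment_flow(read_products, get_info, confirm_intent,
                     get_face, get_iris, authenticate, settle):
    """Toy rendering of the FIG. 3 flow (steps S101-S108)."""
    products = read_products()                 # S101: read product(s)
    info = [get_info(p) for p in products]     # S102: acquire product info
    if not confirm_intent(info):               # S103/S104: confirm intent
        return "no payment intent; aborting"
    face = get_face()                          # S105: acquire face image
    iris = get_iris()                          # S106: acquire iris image
    if not authenticate(face, iris):           # S107: identity verification
        return "authentication failed; aborting"
    return settle(info)                        # S108: execute payment

# Exercise the flow with trivial stand-ins for each unit:
result = run_payment_flow(
    read_products=lambda: ["4901234567890"],
    get_info=lambda b: {"barcode": b, "price": 150},
    confirm_intent=lambda info: True,
    get_face=lambda: "face-image",
    get_iris=lambda: "iris-image",
    authenticate=lambda f, i: f is not None and i is not None,
    settle=lambda info: f"paid {sum(x['price'] for x in info)}",
)
print(result)  # paid 150
```

Note that the flow terminates without settling at either gate (S104 or S107), matching the two early-exit branches described above.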
- In the example described above, the process of confirming the payment intention (that is, step S104) is executed after reading the product (that is, after step S101), but it may instead be executed before reading the product.
- the process of confirming the payment intention may be executed after the authentication process (that is, after step S107).
- the process of confirming the payment intention may be executed in a plurality of stages. For example, as in the example described above, after performing the process of confirming the payment intention before the authentication process, the process of confirming the payment intention again may be performed after the authentication process.
- When the payment intention is confirmed in a plurality of stages, the contents of the confirmations may differ from each other. For example, the confirmation of payment intention before the authentication process may check whether the products read are correct, as described above, while the confirmation of payment intention after the authentication process may check with the customer whether the result of the biometric authentication (that is, the identified person) is correct.
- settlement processing may be automatically performed at the same time as identity verification, or the customer may be allowed to select a settlement method after identity verification. For example, the customer may be asked to choose whether to use the payment method associated with the biometric information or another method.
- As described above, in the payment system 10 according to the first embodiment, the product payment process is executed based on the input from the customer and at least one of the face image and the iris image. In this way, it is possible to avoid execution of inappropriate payment processing. For example, it is possible to prevent the payment processing from being executed even though the customer does not intend to make the payment, or from being executed by someone impersonating the customer.
- A payment system 10 according to the second embodiment will be described with reference to FIGS. 4 to 11.
- The second embodiment describes in detail the camera 20 used in the payment system 10; the overall system configuration and operation flow may be the same as in the first embodiment (see FIGS. 1 to 3). Therefore, in the following description, descriptions of portions that overlap with the already described first embodiment are omitted as appropriate.
- FIG. 4 is a schematic diagram (part 1) showing the configuration of the camera according to the second embodiment.
- FIG. 5 is a schematic diagram (part 2) showing the configuration of the camera according to the second embodiment.
- FIG. 6 is a plan view showing the relationship between the imaging range of the face camera and the imaging range of the iris camera.
- FIG. 7 is a schematic diagram showing an example of a visible light cut filter provided in the illumination unit.
- FIG. 8 is a schematic diagram showing a configuration in which the motor is fixed to a fixed portion outside the device.
- FIG. 9 is a conceptual diagram showing driving directions of the camera according to the second embodiment.
- FIG. 4 is a view of the imaging device for authentication viewed from the front side (in other words, the imaging target side), and FIG. 5 is a view of the same device viewed from the rear side (that is, the side opposite to FIG. 4).
- The camera 20 included in the payment system 10 includes an iris camera 210, a face camera 220, an illumination unit 230, a holding unit 250, an air cooling fan 260, and a motor 270.
- the face camera 220 is configured as a visible light camera for capturing a face image used for face authentication.
- the iris camera 210 is configured as a near-infrared camera for capturing an iris image used for iris authentication, and has a narrower imaging range (also referred to as a field of view) compared to the face camera 220 .
- the face camera 220 and the iris camera 210 are arranged so that their imaging ranges overlap. For example, they are adjusted so that the imaging range of the iris camera 210 is positioned near the center of the imaging range of the face camera 220 (FIG. 6).
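The geometric relationship in FIG. 6 (the iris camera's narrower range near the center of the face camera's range) can be expressed as a simple containment check. This sketch is an illustration only: the rectangle model, coordinate values, and `contains_centered` helper are assumptions, not part of the patent.

```python
def contains_centered(outer, inner, tolerance=0.1):
    """Return True if rectangle `inner` lies inside rectangle `outer`
    and its center is within `tolerance` (fraction of outer's size) of
    outer's center. Rectangles are (x, y, width, height) tuples in a
    shared coordinate system."""
    ox, oy, ow, oh = outer
    ix, iy, iw, ih = inner
    inside = (ox <= ix and oy <= iy
              and ix + iw <= ox + ow and iy + ih <= oy + oh)
    ocx, ocy = ox + ow / 2, oy + oh / 2
    icx, icy = ix + iw / 2, iy + ih / 2
    centered = (abs(ocx - icx) <= tolerance * ow
                and abs(ocy - icy) <= tolerance * oh)
    return inside and centered

# Hypothetical fields of view: the face camera (visible light) is wide,
# the iris camera (near-infrared) is narrow and centered within it.
face_range = (0.0, 0.0, 64.0, 48.0)
iris_range = (24.0, 18.0, 16.0, 12.0)
print(contains_centered(face_range, iris_range))  # True
```

Keeping the iris range near the center of the face range means a face detected by the wide camera is already close to where the narrow camera is pointing, which simplifies the drive control described below.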
- Face camera 220 and iris camera 210 are integrally configured as camera unit 225 .
- the face camera 220 and the iris camera 210 are fixed to the holding portion 250 and can be integrally driven by a motor 270 which will be described later.
- the illumination unit 230 is configured to be able to emit illumination light (that is, near-infrared light) that assists the iris camera 210 in imaging.
- the illumination section 230 may be provided with a visible light cut filter that transmits illumination light (that is, near-infrared light) and has a low visible light transmittance.
- the visible light cut filter is provided so as to cover at least a part (preferably the whole) of the light source of the illumination section 230 (see FIG. 7). In this case, it is possible to reduce the amount of visible light that escapes from the lighting section 230 to the outside of the camera 20 (in other words, to the imaging subject side). As a result, it is possible to make it difficult to recognize the existence of the lighting section 230 from the outside of the camera 20 .
- the illumination unit 230 is fixed to the holding unit 250 together with the face camera 220 and the iris camera 210, and can be driven integrally with the face camera 220 and the iris camera 210 by a motor 270, which will be described later
- the air cooling fan 260 is a fan that blows air to cool the camera 20 .
- Air cooling fan 260 may or may not be fixed to holding portion 250 .
- The air cooling fan 260 may be driven integrally with the face camera 220, the iris camera 210, and the illumination unit 230, or may be configured not to be driven integrally with them. In the latter case, the air cooling fan 260 may be fixed to a fixing member other than the holding portion 250 (for example, a member outside the device). Note that the air cooling fan 260 may be omitted if cooling is not required.
- the motor 270 is connected to the camera unit 225 (in other words, the face camera 220 and the iris camera 210), and can integrally drive the face camera 220, the iris camera 210, and the lighting unit 230 in the vertical direction (see the arrows in the figure). Specifically, when the motor 270 is driven, the face camera 220 and the iris camera 210 rotate coaxially, and the imaging ranges of the face camera 220 and the iris camera 210 are shifted vertically by the same angle. It should be noted that the driving direction described here is merely an example, and the face camera 220, the iris camera 210, and the illumination unit 230 may be driven in directions other than the vertical direction. Also, the motor 270 may have multiple drive shafts to achieve more complex movements.
- the motor 270 may or may not be fixed to the holding portion 250. If the motor 270 is not fixed to the holding portion 250, the motor 270 itself does not move when driven; only the face camera 220, the iris camera 210, and the illumination portion 230 fixed to the holding portion 250 move. In this case, the motor 270 may be fixed to, for example, the housing 280. On the other hand, when the motor 270 is fixed to the holding portion 250, the motor 270 itself moves together with the face camera 220, the iris camera 210, and the illumination portion 230 fixed to the holding portion 250 (in this case, the drive shaft of the motor need only be connected to a fixed portion outside the movable assembly). In this case, the drive shaft of the motor 270 may be fixed to, for example, a fixing portion 275 outside the device (see FIG. 8).
- the face camera 220, the iris camera 210, the illumination section 230, the holding section 250, the cooling fan 260, and the motor 270 are arranged inside a cylindrical housing 280.
- the drive by the motor 270 is a rotational movement about the central axis of the housing 280 (that is, the cylindrical central axis).
- the face camera 220 and the iris camera 210 can be driven smoothly inside the housing 280 .
- the housing 280 itself does not move.
- This configuration is realized by, for example, a tilt mechanism. In this way, it is possible to make it difficult for the person to be imaged to recognize the movement inside the housing 280 .
- At least part of the housing 280 (specifically, the part covering the face camera 220 and the iris camera 210) is made of a material that has a high transmittance to light from the outside and a low transmittance to light from the inside.
- the housing 280 may be configured as, for example, a half mirror or a smoked mirror.
- at least a part of the housing 280 functions as a cover portion that adjusts the transmittance, making the movement of the face camera 220 and the iris camera 210 difficult to recognize from the outside without degrading the quality of the captured image.
- the drive of the motor 270 is controlled by the drive control section 290.
- the drive control unit 290 calculates the drive amount of the motor 270 (in other words, the amount of movement of the imaging ranges of the face camera 220 and the iris camera 210) and controls the drive of the motor 270.
- A specific control method for the motor 270 by the drive control unit 290 will be described in detail later.
- the drive control unit 290 may be provided outside the housing 280 or may be provided inside the housing 280 .
- FIG. 10 is a flow chart showing the operation flow of the camera according to the second embodiment.
- FIG. 11 is a conceptual diagram showing an example of an imaging range adjustment method based on the face position.
- the face camera 220 first detects whether or not there is a customer to be imaged (step S201).
- the presence of the customer may be detected, for example, by a sensor (not shown) or the like, or may be detected by the face camera 220 itself. Alternatively, the presence of the customer may be detected when the customer operates the device. Note that if the customer is not detected (step S201: NO), subsequent processing is omitted, and the series of operations ends. In this case, the process of step S201 may be executed again after a predetermined period of time has elapsed.
- the face camera 220 captures the customer's facial image (step S202).
- the position of the customer's face changes depending on the customer's height, standing position, etc.
- since the imaging range of the face camera 220 is set relatively wide, the customer's face image can be captured without adjusting the imaging range.
- the customer may be guided to the imaging range using a display unit or the like (not shown).
- the face camera 220 may capture the face image again after the drive control unit 290 changes the imaging range of the iris camera 210.
- the drive control unit 290 acquires the facial image from the facial camera 220 and detects the customer's facial position (also called facial area) from the facial image (step S203). That is, it detects where the customer's face is located in the imaging range of the face camera 220 . It should be noted that existing techniques can be appropriately adopted for a specific detection method of the face position, so detailed description thereof will be omitted here.
- the drive control unit 290 estimates the customer's iris position (also called eye region) based on the detected customer's face position (step S204). This estimation can be realized, for example, by pre-storing the relationship between the customer's face position and iris position. For example, the drive control unit 290 estimates that an eye region exists near the center of the detected face region (see FIG. 11a). Alternatively, the eyes may be detected directly from the image based on the position of the face.
- the drive control unit 290 calculates the driving amount of the motor 270 so that the customer's iris is within the imaging range of the iris camera 210 (step S205). In other words, it calculates how much the imaging range of the iris camera 210 needs to be moved so that the customer's iris falls within the imaging range of the iris camera 210 .
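The eye-region estimation of step S204 and the drive-amount calculation of step S205 can be sketched as follows. This is an illustrative, non-limiting example: the assumption that the eyes lie near the vertical center of the face box, the pixel coordinate system, and the pixels-per-degree constant are all hypothetical and not specified in the text above.

```python
def estimate_eye_center_y(face_top: float, face_bottom: float) -> float:
    """Step S204 sketch: assume the eye region lies near the vertical
    center of the detected face box."""
    return (face_top + face_bottom) / 2.0

def tilt_drive_amount(eye_y: float, iris_range_center_y: float,
                      pixels_per_degree: float = 40.0) -> float:
    """Step S205 sketch: degrees the motor 270 must rotate so that the
    estimated eye region falls at the center of the iris camera's
    (narrower) imaging range."""
    offset_px = eye_y - iris_range_center_y
    return offset_px / pixels_per_degree

# Example: face box spans rows 200-600 of the wide face-camera image,
# while the iris camera's range is currently centered on row 320.
eye_y = estimate_eye_center_y(200, 600)   # 400.0
angle = tilt_drive_amount(eye_y, 320)     # 2.0 degrees of tilt needed
```

In step S206, the computed angle would then be passed to the motor driver; since the face camera, iris camera, and illumination are driven integrally, the single angle repositions all three at once.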
- the drive control unit 290 controls the drive of the motor 270 based on the calculated drive amount of the motor 270 (step S206).
- the imaging range of the iris camera 210 is changed, and the iris camera 210 can reliably capture the customer's iris image. More specifically, the estimated eye region falls within the imaging range of the iris camera 210 (see FIG. 11b).
- the iris camera 210 captures an iris image of the customer (step S207). Since the lighting unit 230 is driven together with the iris camera 210 (that is, the illuminated position also moves with the imaging range of the iris camera 210), an iris image of better quality can be captured.
- the face camera 220 may capture the face image again after the drive control unit 290 changes the imaging range of the iris camera 210. Since the iris camera 210 is driven integrally with the face camera 220, if the imaging range of the iris camera 210 is changed, the imaging range of the face camera 220 is also changed to a more appropriate position. Therefore, by capturing the face image again at this timing, the customer's face image can be captured more appropriately. In this way, even if the face image captured in step S202 cannot be used for face authentication (for example, an image in which only a part of the face is captured), face authentication can be reliably executed using the face image captured after the adjustment.
- the face camera 220 that captures face images and the iris camera 210 that captures iris images are integrally driven. In this way, it is possible to appropriately capture (acquire) the face image and iris image of the customer using the payment system 10 .
- A settlement system 10 according to the third embodiment will be described with reference to FIGS. 12 and 13.
- FIG. 12 is a block diagram showing the functional configuration of the payment system according to the third embodiment.
- the same reference numerals are given to the same elements as those shown in FIG. 2, and illustration of the elements shown in FIG. 2 that are less relevant to the present embodiment is omitted.
- the product reading unit 110 and the face image acquisition unit 150 are configured to be able to acquire the image captured by the face camera 220. That is, each of the product reading unit 110 and the facial image acquisition unit 150 is configured to be able to acquire an image from one common camera.
- the product reading unit 110 is configured to be able to read the product from the image captured by the face camera 220.
- the product reading unit 110 performs object recognition on an image containing a product captured by the face camera 220 (hereinafter referred to as a "product image" as appropriate), and recognizes that an object detected from the product image is a product.
- since existing techniques can be suitably adopted as the specific method of object recognition, detailed description thereof is omitted here.
- the product image captured by the face camera 220 does not have to include the customer's face. That is, the face camera 220 in this case may function as a camera that preferentially captures an image of the product that the customer is about to purchase rather than the customer's face.
- the face camera 220 is configured to be drivable by the drive control section 290 in order to reliably capture an image of the product. Specifically, the face camera 220 is configured to be able to move its imaging range so that the product fits within the imaging range.
- the drive control unit 290 may drive the face camera 220 vertically as described in the second embodiment (for example, see FIG. 9) to capture an image of the product.
- FIG. 13 is a conceptual diagram showing a display example when reading a product. Note that FIG. 13 shows a display example when the face camera 220 is driven in the vertical direction.
- the product image captured by the face camera 220 is displayed to the customer so that the tilt angle of the face camera 220 can be understood.
- the image captured by the face camera 220 is displayed in the area corresponding to the tilt position on the display unit.
- the “captured image display area” in the drawing is an area where the product image captured by the face camera 220 is displayed on the display unit.
- the “drivable area” is an area indicating an area in which the face camera 220 can be driven (in other words, an area in which the imaging range of the face camera 220 can be moved).
- the "information display area” is an area in which other various types of information can be displayed.
- the captured image display area is displayed in the middle of the drivable area. That is, the image is displayed so that there are drivable areas above and below the captured image display area.
- the captured image display area is displayed at the top of the drivable area. That is, the image is displayed so that the drivable area exists only below the captured image display area.
- the customer can intuitively grasp the shooting status of the product image, and can, for example, be prompted to move the product. For example, if the tilt position of the face camera 220 is already at its upper limit but the product still does not fit within the imaging range, it is desirable for the customer to move the product slightly downward so that the product image can be captured properly. In such a situation, presenting the above-described display to the customer makes it likely that the customer will voluntarily move the product downward. In addition to the above display example, a message such as "Please move the product downward" may also be displayed.
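The correspondence between the camera's tilt position and the location of the captured image display area within the drivable area can be sketched as a simple linear mapping. This is an illustrative, non-limiting example; the tilt range, pixel values, and function names are assumptions not specified in the text above.

```python
def display_area_top(tilt_pos: float, tilt_min: float, tilt_max: float,
                     track_px: int) -> int:
    """Map a tilt position in [tilt_min, tilt_max] to a pixel offset
    inside the drivable-area track on the display; the upper tilt
    limit pins the captured image display area to the top."""
    frac = (tilt_max - tilt_pos) / (tilt_max - tilt_min)
    return round(frac * track_px)

# Hypothetical tilt range of -15..+15 degrees and a 300 px track:
display_area_top(0.0, -15.0, 15.0, 300)   # 150: centered in the track
display_area_top(15.0, -15.0, 15.0, 300)  # 0: pinned to the top
```

With such a mapping, the drawn position of the image area directly reflects how much tilt headroom remains, which is what allows the customer to infer that the product should be moved downward when the area sits at the top.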
- the product is read using an image captured by the face camera 220, and the face camera 220 is driven so as to image the product. In this way, if the face camera 220 is used for reading products, there is no need to separately provide a device for reading products. Also, by driving the face camera 220 when capturing a product image, an image that includes the product can be reliably captured (even if the imaging range is normally set assuming the position of the face).
- A settlement system 10 according to the fourth embodiment will be described with reference to FIGS. 14 and 15.
- It should be noted that the fourth embodiment may differ from the first to third embodiments described above only in a part of the configuration and operation, and the other parts may be the same as those of the first to third embodiments. Therefore, descriptions of portions that overlap with the already described embodiments will be omitted as appropriate below.
- FIG. 14 is a block diagram showing the functional configuration of a payment system according to the fourth embodiment.
- the same reference numerals are attached to the same elements as those shown in the previously described drawings.
- the payment system 10 includes, as processing blocks for realizing its functions, a product reading unit 110, a product information acquisition unit 120, a confirmation information output unit 130, an input reception unit 140, a face image acquisition unit 150, an iris image acquisition unit 160, and a payment processing unit 170.
- the input reception unit 140 according to the fourth embodiment includes a line-of-sight direction estimation unit 141 .
- the line-of-sight direction estimating section 141 is configured to be able to estimate the line-of-sight direction of the customer using at least one of the face image acquired by the face image acquisition unit 150 and the iris image acquired by the iris image acquisition unit 160.
- when the line-of-sight direction estimator 141 estimates the line-of-sight direction, it is desirable to adjust the position of the camera so that the center of the coordinate system of the camera 20 coincides with the line-of-sight position (center of both eyes) of the customer.
- since existing techniques can be appropriately adopted as the method of estimating the line-of-sight direction from the face image and the iris image, detailed description thereof is omitted here.
- the input reception unit 140 receives information about the customer's line-of-sight direction estimated by the line-of-sight direction estimation unit 141 as an input from the customer indicating the payment intention.
- the input reception unit 140 may accept information that the line-of-sight direction is in a predetermined direction (for example, the right direction with respect to the front of the customer) as information that the customer intends to make a payment, and may accept information that the line-of-sight direction is in a direction other than the predetermined direction as information that the customer does not intend to make a payment.
- the input reception unit 140 may receive information regarding the line-of-sight direction as an input from the customer when the line-of-sight direction of the customer is maintained for a predetermined time. For example, if the line-of-sight direction of the customer is maintained in the same direction for a predetermined period of time (for example, several seconds), information on the line-of-sight direction may be accepted as an input from the customer.
- the predetermined time may be a preset fixed value, or may be a value that varies depending on the situation.
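The dwell-time behavior described above, in which the line-of-sight direction is accepted as an input only after it has been maintained for a predetermined time, can be sketched as follows. This is an illustrative, non-limiting example; the class name, the externally supplied clock, and the 2-second default are assumptions not taken from the text.

```python
class DwellInput:
    """Accept a gaze direction as input once it has been held for a
    predetermined dwell time (illustrative sketch)."""

    def __init__(self, dwell_seconds: float = 2.0):
        self.dwell_seconds = dwell_seconds
        self._current = None   # direction currently being held
        self._since = None     # timestamp at which it was first seen

    def update(self, direction: str, now: float):
        """Feed the latest estimated direction and the current time;
        returns the direction once it has been maintained for
        dwell_seconds, otherwise returns None."""
        if direction != self._current:
            self._current, self._since = direction, now
            return None
        if now - self._since >= self.dwell_seconds:
            return direction
        return None

d = DwellInput(dwell_seconds=2.0)
d.update("right", 0.0)            # direction first seen -> None
d.update("right", 1.0)            # held only 1 s -> None
result = d.update("right", 2.5)   # held 2.5 s -> "right" is accepted
```

A variable predetermined time, as mentioned above, would correspond to changing `dwell_seconds` at run time according to the situation.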
- a predetermined range may be provided in the predetermined direction.
- for example, the range up to 90 degrees to the right of the customer's front direction may be defined as the "right direction".
- the numerical value is not limited to this, and the direction may be left, up, or down.
- as long as the line-of-sight direction remains within the predetermined range, the state may be regarded as one in which "the line-of-sight direction is maintained in the same direction".
- the degree of difficulty in estimating the line-of-sight direction changes depending on the tilt angle of the camera. For example, the more the tilt angle of the camera 20 deviates from the horizontal, the more likely it is that the camera 20 does not face the customer's face, making it difficult to estimate the line-of-sight direction. Also, when the camera 20 looks down from above, it is difficult to estimate the line-of-sight direction because the eyes tend to look down. In such a case, the input reception unit 140 may change the above-described predetermined time according to the difficulty of determining the line-of-sight direction (that is, the tilt angle of the camera 20).
- similarly, the input reception unit may change the threshold for determining the line-of-sight direction of the customer (for example, the threshold for the angle of the line-of-sight direction) according to the difficulty of determining the line-of-sight direction (that is, the tilt angle of the camera 20).
- for example, the larger the tilt angle, the larger the threshold for determining the line-of-sight direction may be made (that is, unless the line of sight is swung greatly, the customer is not recognized as facing that direction).
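The tilt-dependent threshold described above can be sketched as a simple monotone function of the camera's tilt angle. The linear form, the base threshold, and the gain constant are illustrative assumptions, not values specified in the text.

```python
def gaze_threshold_deg(tilt_deg: float, base_deg: float = 10.0,
                       gain: float = 0.5) -> float:
    """Decision threshold for the gaze angle; grows linearly with the
    magnitude of the camera's tilt, so a larger gaze swing is required
    when estimation is harder (camera far from horizontal)."""
    return base_deg + gain * abs(tilt_deg)

gaze_threshold_deg(0)    # 10.0: camera level, a small swing suffices
gaze_threshold_deg(30)   # 25.0: steep tilt, a larger swing is required
```

The predetermined dwell time could be scaled with the tilt angle in the same way, since both are described above as responses to the same difficulty factor.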
- the method or algorithm used for line-of-sight estimation may be changed depending on the tilt angle. For example, in the case of a deep learning-based estimation method, a line-of-sight estimation engine trained for each tilt angle may be constructed and used by switching according to the tilt angle.
- if the line-of-sight direction estimation unit 141 cannot normally estimate the line-of-sight direction (for example, if the line-of-sight angle does not exceed the threshold), the customer may be guided to swing the line of sight more. For example, when a line-of-sight position marker (a pointer indicating where the customer is looking) is displayed on the display device, the sensitivity to changes in the line-of-sight angle may be reduced so that the marker does not move unless the line of sight is moved greatly. Alternatively, a voice, a message, or the like may be output to prompt the customer to swing the line of sight.
- FIG. 15 is a flow chart showing the operation flow of the payment system according to the fourth embodiment.
- the same reference numerals are given to the same processes as those shown in FIG.
- the product reading unit 110 first reads the product (step S101). Then, the product information acquisition unit 120 acquires the product information of the product read by the product reading unit 110 (step S102).
- the confirmation information output unit 130 outputs confirmation information for confirming the intention of payment (step S103).
- Confirmation of payment intention includes, for example, confirmation of whether the product list (for example, at least one of the amount, product name, quantity, etc.) is correct.
- the face image acquisition unit 150 and the iris image acquisition unit 160 acquire a face image and an iris image for estimating the line-of-sight direction of the customer (step S401).
- the line-of-sight direction estimation unit 141 estimates the line-of-sight direction of the customer based on at least one of the acquired face image and iris image (step S402).
- the input reception unit 140 determines whether the customer intends to make a payment by biometric authentication based on the customer's gaze direction (step S104). If it is determined that the customer does not intend to make payment by biometric authentication (step S104: NO), the series of processes ends. On the other hand, if it is determined that the customer intends to make a payment by biometric authentication (step S104: YES), the facial image acquiring unit 150 acquires the customer's facial image (step S105). Also, the iris image acquisition unit 160 acquires the customer's iris image (step S106). It should be noted that if the acquired face image and iris image are used when estimating the line-of-sight direction, the processing of steps S105 and S106 described above may be omitted.
- the payment processing unit 170 executes customer authentication processing based on at least one of the acquired face image and iris image (step S107). If the customer authentication process is not successful (step S107: NO), the series of processes ends. On the other hand, if the customer authentication process is successful (step S107: YES), the payment processing unit 170 executes the product payment process using the product information (step S108).
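The decision flow of steps S104 to S108 can be condensed into the following sketch. The function and its callables are illustrative placeholders for the input reception unit 140 and the payment processing unit 170; the use of the right direction as the intent direction follows the example given earlier and is not the only possible choice.

```python
def settle(gaze_direction: str, authenticate, pay,
           intent_direction: str = "right") -> str:
    """Condensed flow: gaze -> payment intention -> biometric
    authentication -> settlement (illustrative sketch)."""
    if gaze_direction != intent_direction:
        return "no-intent"       # step S104: NO -> end of processing
    if not authenticate():       # step S107: face/iris matching
        return "auth-failed"
    pay()                        # step S108: settle using product info
    return "settled"

settle("left", lambda: True, lambda: None)    # "no-intent"
settle("right", lambda: False, lambda: None)  # "auth-failed"
settle("right", lambda: True, lambda: None)   # "settled"
```

Note that, as stated above, steps S105 and S106 may be skipped when the images already acquired for line-of-sight estimation are reused for authentication; that reuse is orthogonal to this control flow.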
- in step S104, information may be output for the customer to select whether to make a payment using biometric authentication or a payment method other than biometric authentication.
- for example, the guidance "Would you like to make payment by biometric authentication?" and "Yes" and "No" buttons may be displayed. If the customer selects "Yes", the process proceeds to step S105. If "No" is selected, a screen for selecting another payment method (for example, cash, electronic money, credit card, etc.) may be displayed.
- alternatively, a selection screen including at least one button each for payment by biometric authentication and for payment other than biometric authentication (for example, "cash", "electronic money", or "credit card") may be displayed.
- the confirmation information for confirming the intention of payment may include information for allowing the customer to select whether the payment is to be made by biometric authentication or by a method other than biometric authentication.
- between step S107: YES and step S108, information for confirming whether to execute the settlement process may be output.
- a screen for asking the customer whether or not to execute the settlement process is displayed.
- This screen display may include at least one of the payment amount, the customer's personal ID, and the customer's name, or may display only information for confirming whether or not to execute payment processing.
- Information for confirming whether or not to execute payment processing is, for example, a screen displaying a confirmation button and a return button.
- a button for canceling the payment process may be displayed.
- the customer's payment intention is determined based on the customer's line-of-sight direction. In this way, the customer's intention to pay can be determined without touching and operating the device. Therefore, even if the customer's hands are full with goods, for example, it is possible to appropriately determine the customer's intention to make a payment.
- A settlement system 10 according to the fifth embodiment will be described with reference to FIGS. 16 to 18.
- The fifth embodiment may differ from the above-described fourth embodiment only in a part of configuration and operation, and the other parts may be the same as those of the first to fourth embodiments. Therefore, descriptions of portions that overlap with the already described embodiments will be omitted as appropriate below.
- FIG. 16 is a block diagram showing the functional configuration of the payment system according to the fifth embodiment.
- the same reference numerals are attached to the same elements as those shown in the previously described drawings.
- the payment system 10 includes a product reading unit 110, a product information acquisition unit 120, a confirmation information output unit 130, an input reception unit 140, a face image acquisition unit 150, an iris image acquisition unit 160, and a payment processing unit 170.
- An input reception unit 140 according to the fifth embodiment includes a line-of-sight direction estimation unit 141 as in the fourth embodiment.
- the confirmation information output section 130 includes a gaze area display section 131 .
- the gaze area display unit 131 is configured to be able to display the gaze area as confirmation information for confirming the customer's intention to make a payment.
- the gaze area display unit 131 may display the gaze area on, for example, a display, which is the output device 16 (see FIG. 1).
- the fixation area is an area displayed to encourage the customer to move in the line-of-sight direction.
- the attention area may be displayed as at least one area corresponding to the payment intention, or a plurality of attention areas may be displayed.
- for example, only one gaze area corresponding to the line-of-sight direction when there is a payment intention may be displayed, or a first gaze area corresponding to the line-of-sight direction when there is a payment intention and a second gaze area corresponding to the line-of-sight direction when there is no payment intention may both be displayed.
- FIG. 17 is a conceptual diagram showing a display example of a gaze area.
- FIG. 18 is a conceptual diagram showing a display example of the gaze area considering the position of the camera.
- the gaze areas may be displayed as the left half and right half areas of the display.
- the left half of the display is an area corresponding to the line of sight when the customer intends to make a payment; when the customer gazes at it, an input indicating that the customer intends to make a payment is accepted.
- the right half of the display is an area corresponding to the line of sight when the customer does not intend to make a payment; when the customer gazes at it, an input indicating that the customer does not intend to make a payment is accepted.
- the gaze area may be displayed on a portion of the display. Specifically, it may be displayed as an area like a button.
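The left-half/right-half interpretation of FIG. 17 can be sketched as mapping a horizontal gaze angle to one of the two regions. The sign convention (negative = left of center) and the dead zone near the center are illustrative assumptions not stated in the text.

```python
def gaze_region(gaze_angle_deg: float, dead_zone_deg: float = 5.0):
    """Interpret a horizontal gaze angle as a look at the left half
    (payment intention) or right half (no payment intention) of the
    display; angles near the center yield no selection."""
    if abs(gaze_angle_deg) < dead_zone_deg:
        return None
    return "pay" if gaze_angle_deg < 0 else "cancel"

gaze_region(-15.0)  # "pay": clearly looking at the left half
gaze_region(2.0)    # None: too close to center to decide
gaze_region(20.0)   # "cancel": clearly looking at the right half
```

When the gaze areas are button-sized regions rather than display halves, the same idea applies with a two-dimensional hit test against each button's rectangle instead of a single angle threshold.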
- the display position of the gaze area may be determined by the position of the camera 20. Specifically, it may be displayed on the side closer to the camera 20 on the display.
- the camera 20 is arranged below the display. In such cases, the gaze area may be located on the underside of the display.
- the camera 20 is arranged above the display. In such cases, the gaze area may be located on the top side of the display.
- a gaze area is displayed for confirming the customer's payment intention.
- the customer can be urged to look at the attention area, so it is possible to appropriately determine the customer's payment intention from the customer's line-of-sight direction.
- A settlement system 10 according to the sixth embodiment will be described with reference to FIGS. 19 and 20.
- The sixth embodiment may differ from the above-described fifth embodiment only in a part of configuration and operation, and the other parts may be the same as those of the first to fifth embodiments. Therefore, descriptions of portions that overlap with the already described embodiments will be omitted as appropriate below.
- FIG. 19 is a block diagram showing the functional configuration of a payment system according to the sixth embodiment.
- the same reference numerals are attached to the same elements as those shown in the previously described drawings.
- the payment system 10 includes a product reading unit 110, a product information acquisition unit 120, a confirmation information output unit 130, an input reception unit 140, a face image acquisition unit 150, an iris image acquisition unit 160, and a payment processing unit 170.
- An input receiving unit 140 according to the sixth embodiment includes a line-of-sight direction estimating unit 141, as in the fourth and fifth embodiments.
- the confirmation information output section 130 further includes a frame display section 132 in addition to the gaze area display section 131 described in the fifth embodiment.
- the frame display unit 132 is configured to be able to display a frame that gradually converges from outside the gaze area toward the outline of the gaze area according to the time that the customer gazes at the gaze area. For example, the frame display unit 132 may start displaying the frame at the timing when it can be determined that the line-of-sight direction of the customer is directed to one gaze area. The frame display unit 132 may stop displaying the frame when the size of the frame becomes the same as (overlaps with) the outline of the attention area, or may continue to display the frame as it is at that size.
- the speed at which the frame converges may be a preset value, or may be a value that changes according to the customer's line of sight. For example, if the customer's line of sight is moving toward the attention area, the frame may converge relatively quickly. Also, the speed at which the frame moves need not be constant and may be changed along the way. For example, the frame may move quickly at the beginning of convergence, and the moving speed may decrease as the frame approaches the attention area. As a result, the customer can be made to quickly recognize the gaze area, and it can be effectively impressed on the customer that the gaze area needs to be gazed at stably for a while.
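The variable convergence speed described above (fast at first, slower near the outline) can be sketched with a simple easing step applied on each animation tick. The easing factor of 0.25 and the pixel sizes are illustrative assumptions.

```python
def next_frame_size(current: float, target: float,
                    easing: float = 0.25) -> float:
    """Shrink the frame by a fixed fraction of the remaining gap per
    tick; the step size naturally decreases as the frame approaches
    the target outline, giving fast-then-slow convergence."""
    return current - easing * (current - target)

size = 200.0  # initial frame, larger than a hypothetical 100 px button
for _ in range(3):
    size = next_frame_size(size, 100.0)
# sizes per tick: 175.0 -> 156.25 -> 142.1875 (steps shrink each tick)
```

Making the easing factor depend on the customer's gaze (for example, larger while the line of sight is moving toward the attention area) would realize the gaze-dependent speed also described above.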
- FIG. 20 is a conceptual diagram showing a display example of a frame that gradually converges on the gaze area.
- information related to the product and payment is displayed near the center of the display. Also, at the lower left of the display, there is displayed a "cancel" button that is to be watched when there is no payment intention. At the bottom right of the display, a button labeled "Payment” is displayed, which the user will pay attention to when he/she intends to make a payment. That is, in this case, the button becomes the gaze area. In such a situation, when the customer gazes at the payment button, a frame larger than the payment button is first displayed outside the payment button. Then, as the customer continues to gaze at the payment button, the frames gradually converge toward the payment button.
- the frame shrinks toward the payment button. Then, it stops while overlapping the outline of the payment button, and the frame continues to be displayed for a certain period of time. It should be noted that if the customer changes the line-of-sight direction on the way, the display of the frame may be stopped. Also, when the customer's line of sight moves to another attention area, a new frame may be displayed outside the other attention area. For example, if a customer who has been gazing at the payment button begins to gaze at the cancel button halfway through, the frame displayed on the payment button side may disappear and a new frame may be displayed on the cancel button side.
- the frame may gradually increase in size toward the original size. Furthermore, in this case, when the line-of-sight direction returns to the payment button again, the frame may stop increasing at the timing when the line-of-sight direction returns, and the frame may gradually become smaller toward the payment button again. Also, the color of the frame may be changed depending on whether the pay button is gazed at or the stop button is gazed at. For example, when the user gazes at the payment button, the frame may be displayed in green, and when the user gazes at the cancel button, the frame may be displayed in orange.
- The display of this embodiment may be presented when confirming the payment intention in step S104 described above, or may be presented when outputting other confirmation information.
- In the payment system 10 according to the sixth embodiment, a frame that gradually converges on the gaze area is displayed. Because the frame is displayed on the side of the area being gazed at, the customer can see how his or her line-of-sight direction is being recognized. In addition, as the frame gradually converges toward the outline of the gaze area, the customer's line of sight is drawn further toward that area. For example, even if the line-of-sight direction initially swings only slightly toward the gaze area, the gradually converging frame induces the customer to swing the line of sight more fully toward it. It therefore becomes possible to determine the customer's line-of-sight direction more appropriately.
- A payment system 10 according to the seventh embodiment will be described with reference to FIGS. 21 and 22.
- The seventh embodiment differs from the above-described fifth and sixth embodiments only in a part of its configuration and operation; the other parts may be the same as those of the first to sixth embodiments. Therefore, descriptions of portions that overlap with the already described embodiments will be omitted below as appropriate.
- FIG. 21 is a block diagram showing the functional configuration of a payment system according to the seventh embodiment.
- In FIG. 21, the same reference numerals are attached to elements similar to those already described.
- As shown in FIG. 21, the payment system 10 includes, as processing blocks for realizing its functions, a product reading unit 110, a product information acquisition unit 120, a confirmation information output unit 130, an input reception unit 140, a face image acquisition unit 150, an iris image acquisition unit 160, and a payment processing unit 170.
- An input receiving unit 140 according to the seventh embodiment includes a line-of-sight direction estimating unit 141 as in the fourth to sixth embodiments.
- the confirmation information output unit 130 further includes an area color changing unit 133 in addition to the gaze area display unit 131 described in the fifth embodiment.
- The area color changing unit 133 is configured to gradually change the color of the gaze area toward the outside of the screen (in other words, toward the side opposite to other gaze areas) according to the time for which the customer gazes at that area. The color after the change is preferably one from which the customer can easily recognize that the color has changed.
- the color after change may be, for example, a conspicuous color such as red or yellow, or may be a complementary color of the color before change.
- the area color changing unit 133 may start changing the color of the gaze area at the timing when it can be determined that the line of sight of the customer is directed to one gaze area, for example.
- The speed at which the color of the gaze area changes may be a preset value, or may vary according to the direction of the customer's line of sight. For example, if the customer's line of sight is directed steadily at the gaze area, the color may change relatively quickly, whereas if the line of sight wavers around the gaze area, the color may change relatively slowly.
- The speed at which the color changes need not be constant and may vary partway through. For example, the color may change quickly at first, with the speed then decreasing gradually. This makes it possible to have the customer quickly notice the gaze area, while effectively impressing on the customer that he or she needs to keep gazing at it steadily for a while.
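The "fast at first, then slower" color change can be expressed as an ease-out curve over the gaze dwell time. The sketch below is illustrative only; the 2-second total time and the quadratic easing are assumptions, not values specified in the embodiment.

```python
def fill_fraction(dwell_time_s, total_time_s=2.0):
    """Fraction of the gaze area whose colour has changed after the
    customer has gazed for dwell_time_s seconds.  The ease-out curve
    changes colour quickly at first (so the customer notices the gaze
    area immediately) and more slowly toward the end (so the customer
    must keep gazing steadily for a while)."""
    t = max(0.0, min(dwell_time_s / total_time_s, 1.0))
    return 1.0 - (1.0 - t) ** 2
```

With these parameters, three quarters of the area has changed after half the total time, which reflects the decreasing speed described above.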
- FIG. 22 is a conceptual diagram showing a display example in which the color of the gaze area is gradually changed toward the outside of the screen.
- Information related to the product and the payment is displayed near the center of the display. At the lower left of the display, a "Cancel" button is displayed, which the customer gazes at when he or she has no intention to pay. At the lower right of the display, a "Payment" button is displayed, which the customer gazes at when he or she intends to pay. In this situation, when the customer gazes at the payment button, the color of the gaze area gradually changes starting from the screen-inner portion of the payment button (that is, the left end of the payment button).
- The color of the gaze area changes toward the outside of the screen, and finally the whole payment button changes color.
- In other words, the color change is displayed like a meter filling up. If the customer changes the line-of-sight direction partway through, the color change may be stopped; in that case, the color of the gaze area may return to the color before the change. In addition, when the customer's line of sight shifts to another gaze area, the color of that other area may be changed. For example, if a customer who has been gazing at the payment button starts to gaze at the cancel button partway through, the payment button may return to its original color and the color of the cancel button may start to change instead.
- When the customer looks away, the color of the gaze area may be changed back toward the inside of the screen. In this case, when the line-of-sight direction returns to the payment button, the inward color change may be stopped at that timing and the outward color change resumed. Also, the color used may differ depending on whether the payment button or the cancel button is gazed at. For example, the color may change to green when the customer gazes at the payment button and to orange when the customer gazes at the cancel button.
- The color change of the gaze area described above may be executed in combination with the frame display described in the sixth embodiment (see FIG. 20). For example, the color change by the area color changing unit 133 may be started after the frame displayed by the frame display unit 132 has converged on the gaze area.
- The display of this embodiment may be presented when confirming the payment intention in step S104 described above, or may be presented when outputting other confirmation information.
- As described above, in the payment system 10 according to the seventh embodiment, the color of the gaze area gradually changes toward the outside of the screen.
- This draws the customer's line of sight toward the outside of the screen (in other words, toward the side opposite to other gaze areas).
- In addition, the gradual change in the color of the gaze area induces the customer to swing the line-of-sight direction fully toward the gaze area. It therefore becomes possible to determine the customer's line-of-sight direction more appropriately.
- A payment system 10 according to the eighth embodiment will be described with reference to FIGS. 23 and 24.
- The payment system 10 according to the eighth embodiment differs from the above-described first to seventh embodiments only in a part of its configuration and operation; the other parts may be the same as those of the first to seventh embodiments. Therefore, descriptions of portions that overlap with the already described embodiments will be omitted below as appropriate.
- FIG. 23 is a block diagram showing the functional configuration of a payment system according to the eighth embodiment.
- In FIG. 23, the same reference numerals are attached to elements similar to those already described.
- As shown in FIG. 23, the payment system 10 includes, as processing blocks for realizing its functions, a product reading unit 110, a product information acquisition unit 120, a confirmation information output unit 130, an input reception unit 140, a face image acquisition unit 150, an iris image acquisition unit 160, a payment processing unit 170, and a biometric determination unit 180.
- That is, the payment system 10 according to the eighth embodiment further includes a biometric determination unit 180 in addition to the configuration of the first embodiment (see FIG. 2).
- The biometric determination unit 180 may be implemented, for example, in the processor 11 (see FIG. 1) described above.
- the biometric determination unit 180 is configured to be able to determine the customer's biometric-likeness based on the movement of the customer's line of sight estimated from the iris image acquired by the iris image acquisition unit 160 .
- Here, "biometric-likeness" is the degree of likelihood that the customer is a living body.
- For example, the biometric determination unit 180 may determine the biometric-likeness based on whether the movement of the customer's line of sight resembles that of a living body (in other words, whether it is a movement that cannot be reproduced by spoofing). The biometric determination unit 180 may also perform the determination using other information (for example, information obtained from the customer's face image) in addition to the movement of the customer's line of sight.
- the determination result of the biometric determination unit 180 is configured to be output to the payment processing unit 170 .
- the payment processing unit 170 is configured to change the mode of authentication processing based on the determination result (that is, the customer's biometric likelihood) in the biometric determination unit 180 . Specifically, the payment processing unit 170 is configured to change the mode of authentication processing depending on whether or not the biometric-likeness is higher than a predetermined threshold.
- the "predetermined threshold value” here is a threshold value for determining whether or not the customer is a living body (for example, not impersonating) as to whether or not it is highly likely to be a living body. is obtained and set in advance. Changes in the authentication process according to biometric-likeness will be described in detail below.
- FIG. 24 is a flow chart showing the operation flow of the payment system according to the eighth embodiment.
- In FIG. 24, the same reference numerals are given to processes similar to those already described.
- the product reading unit 110 first reads the product (step S101). Then, the product information acquisition unit 120 acquires the product information of the product read by the product reading unit 110 (step S102).
- the confirmation information output unit 130 outputs confirmation information for confirming the intention of payment (step S103). Then, based on the input from the customer, the input reception unit 140 determines whether or not the customer has an intention to make a payment using biometric authentication (step S104).
- step S104 If it is determined that the customer has no intention of making a payment using biometric authentication (step S104: NO), the series of processes ends. On the other hand, when it is determined that the customer has an intention to settle the account using biometric authentication (step S104: YES), the facial image acquiring unit 150 acquires the customer's facial image (step S105). Also, the iris image acquisition unit 160 acquires the customer's iris image (step S106).
- Next, the biometric determination unit 180 determines the customer's biometric-likeness based on the movement of the line of sight estimated from the iris image (step S801). If the customer's biometric-likeness is higher than the predetermined threshold (step S801: YES), the payment processing unit 170 performs authentication based on the face image acquired by the face image acquisition unit 150 (step S802). On the other hand, if the customer's biometric-likeness is lower than the predetermined threshold (step S801: NO), the payment processing unit 170 performs authentication based on both the face image acquired by the face image acquisition unit 150 and the iris image acquired by the iris image acquisition unit 160 (step S803).
- If the result of step S107 is NO, the series of processing ends.
- On the other hand, in the case of step S107: YES, the payment processing unit 170 executes the product payment processing using the product information (step S108). Although the case of using face authentication in step S802 has been described with reference to this figure, iris authentication may be performed instead of face authentication.
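The branch among steps S801 to S803 can be summarized as selecting an authentication mode from the liveness result. The sketch below is illustrative only; the 0.8 threshold and the function names are assumptions, not values given in the embodiment.

```python
def select_auth_mode(biometric_likeness, threshold=0.8):
    """Steps S801-S803 in outline: a high liveness score allows the
    lighter face-only authentication (S802); a low score requires the
    stricter combined face-and-iris authentication (S803)."""
    if biometric_likeness > threshold:
        return ("face",)
    return ("face", "iris")

def authenticate(biometric_likeness, match):
    """match(modality) -> bool.  Every selected modality must match
    for the overall authentication to succeed."""
    return all(match(m) for m in select_auth_mode(biometric_likeness))
```

This captures the trade-off stated below: a high score reduces processing load, while a low score forces the stricter check.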
- As described above, in the payment system 10 according to the eighth embodiment, biometric determination is performed based on the customer's line-of-sight direction, and different authentication processes are executed according to the determination result.
- Specifically, when the biometric-likeness is high, the authentication process can be simplified, reducing the processing load while still executing the payment process.
- On the other hand, when the biometric-likeness is low, a more stringent authentication process can be performed to prevent fraudulent payment processing due to impersonation or the like.
- FIG. 25 is a conceptual diagram showing an example of display for final confirmation of payment intention.
- FIG. 26 is a conceptual diagram showing a display example when the distance between the customer and the camera is set to the appropriate range.
- FIG. 27 is a conceptual diagram showing a display example for notifying the line-of-sight direction when performing iris authentication.
- FIG. 28 is a conceptual diagram showing a display example for prompting removal of the wearable item.
- FIG. 29 is a conceptual diagram showing a display example for prompting the customer to open his or her eyes.
- The payment system 10 displays the gaze area to confirm the customer's payment intention, and then confirms the payment intention once more before actually executing the payment process.
- When confirming the final payment intention, the gaze area may be displayed in the same manner as when the payment intention was first confirmed.
- Alternatively, the gaze area for the final confirmation may be displayed with the position of each area reversed compared to the first confirmation.
- a "Yes (i.e., willing to pay)" button is displayed on the left side
- a "No (i.e., no intention to pay)" button is displayed on the right side.
- The final confirmation may also be performed by a method different from that used when first confirming the payment intention (that is, by a method other than the line-of-sight direction).
- the payment intention may be confirmed by shaking the head vertically or horizontally (or tilting the head sideways).
- In this case, the face camera 220 detects the orientation of the face in addition to its position, and determines the motion from how the orientation changes over time.
- a display may be provided to adjust the distance between the face camera 220 and the customer. Specifically, an indicator showing the relationship between the customer's current position and the appropriate range may be displayed. The indicator shows the result of estimating the distance between the customer and the camera, indicating that the distance is smaller as it goes upwards and larger as it goes downwards. Then, the part corresponding to the estimated distance in the indicator is highlighted (for example, it becomes brighter, the width becomes slightly wider, or it blinks). For example, in the example shown in FIG. 26(a), the face camera 220 is too close to the customer, and the customer's face protrudes from the frame in which the face should be placed.
- the display indicating the current position of the indicator is displayed above the appropriate range.
- a message may be displayed to the customer to "Please step down".
- the face camera 220 is too far from the customer, and the customer's face is considerably smaller than the frame in which the face should be placed.
- the display indicating the current position of the indicator is displayed below the appropriate range.
- a message may be displayed to the customer to "please approach”.
- the display indicating the current position of the indicator is displayed within the proper range.
- a message may also be displayed to the customer to "keep that position". Such a display prompts the customer to move his or her standing position, making it possible to pick up a face image at an appropriate distance.
- the color of the indicator may be changed depending on whether it is within the appropriate range or out of the range. For example, the indicator may be displayed in green in the proper range and in orange otherwise.
- the distance between the customer and the camera may be measured using a distance sensor, or may be estimated from the image taken by the camera. In the latter case, estimation may be made based on the distance between feature points of the face and the size of the partial region. For example, the distance between the camera and the customer may be estimated based on the distance between the eyes, or the distance between the camera and the customer may be estimated from the size of the detected iris.
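The inter-eye estimate mentioned above follows from the pinhole camera model: the pixel distance between the eyes is inversely proportional to the camera-to-face distance. In the sketch below, the focal length in pixels, the average inter-pupillary distance, and the range limits are illustrative assumptions, not values from the embodiment.

```python
def estimate_distance_mm(inter_eye_px, focal_px=1000.0, ipd_mm=63.0):
    """Pinhole-model estimate of the camera-to-customer distance from
    the pixel distance between the detected eyes."""
    if inter_eye_px <= 0:
        raise ValueError("inter-eye distance must be positive")
    return focal_px * ipd_mm / inter_eye_px

def indicator_zone(distance_mm, near_mm=400.0, far_mm=600.0):
    """Map the estimate onto the indicator of FIG. 26: too close
    (highlight above the appropriate range), within range, or too far
    (highlight below the range)."""
    if distance_mm < near_mm:
        return "too close"   # "Please step back"
    if distance_mm > far_mm:
        return "too far"     # "Please come closer"
    return "ok"              # "Keep that position"
```

The same zone result could also select the indicator color (green in range, orange otherwise), as described above.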
- a display may be provided to appropriately guide the customer's line of sight direction.
- Specifically, a mark (for example, a double circle as shown in the figure) may be displayed at the position toward which the customer's line of sight should be directed.
- The double circle periodically increases and decreases in size; that is, its size fluctuates (oscillates).
- a message such as "Look here" may be displayed. By doing so, it is possible to prompt the customer to move the line-of-sight direction to an appropriate direction and to fix the line-of-sight direction as it is.
- the speed at which the size is vibrated may be changed according to the direction of the line of sight. For example, the closer the line of sight is to the center of the double circle, the slower the speed, and if the line of sight deviates, the speed of vibration may be increased. As a result, it is possible to provide feedback to the customer as to whether or not the direction of the line of sight is appropriate, and guide the line of sight to a more appropriate direction.
- the speed may be changed according to the distance between the customer and the iris camera 210 . That is, when the distance is too close or too far, the speed of vibration may be increased, and when appropriate, the speed of vibration may be decreased. As a result, it is possible to give feedback to the customer as to whether the distance to the camera is appropriate or not, so that the iris can be photographed at a more appropriate position.
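Both feedback rules above (the gaze error and the camera distance) could drive a single oscillation frequency for the double-circle mark. The constants in this sketch are illustrative assumptions; the embodiment only states that the speed increases as either condition worsens.

```python
def oscillation_hz(gaze_error_deg, distance_ok,
                   base_hz=0.5, max_hz=2.0, error_scale_deg=10.0):
    """Oscillation frequency of the double circle: slow when the line
    of sight is near the circle centre and the camera distance is
    appropriate, faster as the gaze error grows or the distance is
    too close or too far."""
    error = min(abs(gaze_error_deg) / error_scale_deg, 1.0)
    hz = base_hz + (max_hz - base_hz) * error
    if not distance_ok:        # too close or too far: vibrate faster
        hz = max_hz
    return hz
```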
- a display prompting the customer to remove the attachment may be provided.
- a message and an image prompting the customer to remove the glasses may be displayed.
- a message and an image prompting the customer to remove the sunglasses may be displayed.
- a message and an image prompting the customer to remove the mask may be displayed.
- Such a display may be always performed when the customer's image is captured, or may be performed only when the corresponding wearing object is detected from the customer's image.
- a display may be provided to encourage the customer to open their eyes.
- an animation may be displayed in which closed eyes gradually open.
- the animation shifts to the eye-opened state step by step.
- In the eye-opening animation, the eye-opening speed may be accelerated, with the animation stopping when the eyes reach the fully open state. This makes it possible to more effectively make the customer aware of opening his or her eyes.
- In addition, a message such as "Please open your eyes wide" may be displayed. Such a display makes the customer conscious of keeping his or her eyes open, making it possible to capture an appropriate iris image.
- The speed of the animation in which the eyes gradually open may be changed according to the detected degree of eye opening. For example, if the degree of eye opening is determined to be small, the speed of the eye-opening animation may be increased to emphasize opening the eyes. The animation may also be displayed repeatedly; for example, repeating it several times during iris capture and matching can induce the customer to open his or her eyes wider.
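Adapting the animation to the detected degree of eye opening could look like the following, where openness is a generic score in [0, 1]. The threshold, speed values, and frame count are illustrative assumptions, since the embodiment does not fix how the degree of eye opening is measured.

```python
def animation_speed(openness, slow=1.0, fast=2.0, threshold=0.5):
    """Speed multiplier for the eye-opening animation: play it faster
    when the detected degree of eye opening (0 = closed, 1 = wide open)
    is small, to emphasise opening the eyes."""
    return fast if openness < threshold else slow

def animation_frames(n_frames=5):
    """One animation cycle of eye states from closed (0.0) to fully
    open (1.0); the cycle may be repeated during iris capture and
    matching to keep prompting the customer."""
    step = 1.0 / (n_frames - 1)
    return [round(i * step, 2) for i in range(n_frames)]
```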
- FIG. 30 is a schematic diagram showing a modification of the camera.
- In FIG. 30, the same reference numerals are attached to components similar to those already described.
- As shown in FIG. 30, the camera 20 may be provided with a scanner 300.
- The scanner 300 is a scanner capable of reading products, and is configured, for example, as a barcode scanner. In this case, for example, when a product is placed below the camera 20, the scanner 300 reads the product. At this time, the camera 20 may be driven by the drive control unit 290 so that the scanner 300 can appropriately read the product (that is, the scanner 300 may be driven so as to change the scanning position).
- The scanner 300 may also be a scanner that reads an electronic tag attached to or embedded in a product.
- FIG. 31 is a conceptual diagram showing a display example of a cancel button.
- FIG. 32 is a conceptual diagram showing a display example after cancellation.
- FIG. 33 is a conceptual diagram showing a display example of the number change button.
- FIG. 34 is a conceptual diagram showing a display example of an amount change button.
- FIG. 35 is a conceptual diagram showing a display example when there are products that require age confirmation.
- a "cancel” button (a button to cancel the purchase) may be displayed next to the product information indicating the read product.
- When the customer gazes at the cancel button, the color of the cancel button gradually changes as described in the seventh embodiment (see FIG. 22).
- When the color of the cancel button has changed completely, the purchase of the product is canceled.
- For example, the canceled product is deleted from the product list.
- Alternatively, the canceled product may be grayed out without being deleted. If it is grayed out, a "Repurchase (cancel the cancellation)" button may be displayed instead of the cancel button. As with the cancel button described above, the customer can cancel the cancellation (return the product to a purchasable state) by gazing at the repurchase button. The repurchase button may also be displayed in such a way that its color gradually changes, in the same manner as the cancel button.
- the total amount before cancellation may be displayed in addition to the total amount after cancellation. That is, the total amounts before and after cancellation may be displayed side by side for comparison. Alternatively, in addition to the total amount after cancellation, the difference between the amount before cancellation may be displayed.
- a "Change Quantity” button may be displayed next to the product you are about to purchase.
- The quantity change button allows the customer to change the quantity of a product by gazing at it. Specifically, when the customer gazes at the quantity change button, its color gradually changes in the same manner as the cancel button described above. Then, when the color has changed completely, the display of the quantity column changes (for example, the number blinks, the background color of the number changes, or the column becomes blank), indicating that the quantity can be changed.
- In this state, the quantity of the product can be changed by the customer's gaze.
- For example, arrows may be displayed beside the quantity column: the quantity increases when the right arrow is gazed at and decreases when the left arrow is gazed at.
- The arrows may be displayed at the same time that the color of the quantity change button finishes changing, or the display of the arrows may be started after a certain period of time has elapsed (for example, after 1 second).
- Alternatively, the quantity may change from 1 to 2 to 3 every second as the customer continues to gaze at the quantity column.
- blinking the right eye may increase the quantity by 1 for each blink
- blinking the left eye may decrease the quantity by 1 for each blink.
- the quantity may be incremented by 1 for each shake of the head to the right, and decremented by 1 for each shake of the head to the left.
- Blinking and head shaking may also be assigned in the opposite way to the above (blinking the right eye decreases the quantity by 1 per blink and blinking the left eye increases it by 1 per blink; alternatively, shaking the head to the right decreases the quantity by 1 per shake and shaking it to the left increases it by 1 per shake).
- the quantity change button may change to a "confirm" button while the quantity is being changed.
- When the customer gazes at the confirmation button after changing the quantity, the changed quantity is confirmed and the total price is updated (alternatively, the total price may be updated each time the quantity is changed).
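The gaze, blink, and head-shake controls for the quantity could be unified as a small event handler. The event names below and the floor of 1 are illustrative assumptions; the embodiment only describes the mapping of each action to an increment or decrement.

```python
def adjust_quantity(quantity, events, minimum=1):
    """Apply gaze/blink/head-shake events to a product quantity:
    right-hand actions increment, left-hand actions decrement, and
    the quantity never falls below the assumed minimum."""
    for event in events:
        if event in ("gaze_right_arrow", "blink_right", "shake_right"):
            quantity += 1
        elif event in ("gaze_left_arrow", "blink_left", "shake_left"):
            quantity = max(minimum, quantity - 1)
    return quantity
```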
- a "change amount” button may be displayed on the screen showing the read product information. If there is a discrepancy in the read product information (for example, if there is a discrepancy between the read product and the product name displayed), the price change button will change the price displayed on the leaflet or on the price tag inside the store. This button is used when there is a discrepancy, etc.), and the customer can request a change in the amount by paying attention to it.
- a process of calling a member of staff (such as a store clerk) is executed. At this time, as shown in FIG. 35(b), a message such as "A staff member is being called. Please wait for a while" may be displayed.
- If the products to be purchased include a product that requires age confirmation, a display indicating that age confirmation is required may be made. Specifically, "unnecessary / not yet / completed" for age confirmation may be displayed on the payment screen. At this time, the customer may be notified to prompt an action for age confirmation. For example, as shown in FIG. 35(a), a message such as "There are products that require age verification. Please hold your ID over the camera." may be displayed.
- The buttons described with reference to FIGS. 31 to 35 are merely examples. Accordingly, the arrangement position, size, and the like of each button described above are not particularly limited. These buttons need not be displayed one type at a time; multiple types may be displayed in combination.
- A processing method in which a program that operates the configuration of each embodiment so as to realize the functions of the embodiments described above is recorded on a recording medium, the program recorded on the recording medium is read as code, and the code is executed on a computer is also included in the scope of each embodiment. That is, a computer-readable recording medium is also included in the scope of each embodiment. In addition, not only the recording medium on which the above program is recorded, but also the program itself is included in each embodiment.
- As the recording medium, for example, a floppy (registered trademark) disk, a hard disk, an optical disk, a magneto-optical disk, a CD-ROM, a magnetic tape, a non-volatile memory card, or a ROM can be used.
- Moreover, not only a program that executes processing by itself when recorded on the recording medium, but also a program that operates on an OS and executes processing in cooperation with other software or with the functions of an expansion board, is included in the scope of each embodiment.
- The payment system described in Supplementary Note 1 includes: product reading means for reading a product; product information acquisition means for acquiring product information related to the read product; confirmation information output means for outputting confirmation information for confirming a customer's intention to pay for the product; reception means for receiving input from the customer with respect to the confirmation information; face acquisition means for acquiring a face image of the customer; iris acquisition means for acquiring an iris image of the customer; and payment processing means for executing payment processing for the product based on the input from the customer and at least one of the face image and the iris image.
- The payment system according to Supplementary Note 2 is the payment system according to Supplementary Note 1, wherein the face acquisition means and the iris acquisition means acquire the face image and the iris image from a face camera and an iris camera that are integrally driven.
- In the payment system according to Supplementary Note 3, the product reading means reads the product using the face camera for acquiring the face image, and the face camera scans the product in at least one direction.
- The payment system according to Supplementary Note 5 is the payment system according to Supplementary Note 4, wherein the confirmation information output means displays at least one gaze area corresponding to the payment intention, and the reception means accepts, as the input from the customer, information corresponding to the gaze area at which the customer is gazing.
- In the payment system according to Supplementary Note 6, the confirmation information output means displays a frame that gradually converges from outside the gaze area toward the gaze area according to the time for which the customer gazes at the gaze area.
- The payment system according to Supplementary Note 7 is characterized in that the confirmation information output means gradually changes the color of the gaze area toward the outside of the screen according to the time for which the customer gazes at the gaze area.
- The payment system according to Supplementary Note 8 further comprises biometric determination means for determining the biometric-likeness of the customer based on the movement of the customer's line of sight estimated from the iris image, wherein the payment processing means determines whether or not to execute the payment process based on the face image if the biometric-likeness is higher than a predetermined threshold, and determines whether or not to execute the payment process based on the face image and the iris image if the biometric-likeness is lower than the predetermined threshold (the payment system according to any one of Supplementary Notes 1 to 7).
- The payment method described in Supplementary Note 9 reads a product, acquires product information related to the read product, outputs confirmation information for confirming a customer's intention to pay for the product, receives input from the customer with respect to the confirmation information, acquires a face image of the customer, acquires an iris image of the customer, and executes payment processing for the product based on the input from the customer and at least one of the face image and the iris image.
- The computer program according to Supplementary Note 10 causes a computer to read a product, acquire product information related to the read product, output confirmation information for confirming a customer's intention to pay for the product, receive input from the customer with respect to the confirmation information, acquire a face image of the customer, acquire an iris image of the customer, and execute payment processing for the product based on the input from the customer and at least one of the face image and the iris image.
- The recording medium according to Supplementary Note 11 is a recording medium on which the computer program according to Supplementary Note 10 is recorded.
Description
第1実施形態に係る決済システムについて、図1から図3を参照して説明する。
まず、図1を参照しながら、第1実施形態に係る決済システム10のハードウェア構成について説明する。図1は、第1実施形態に係る決済システムのハードウェア構成を示すブロック図である。
次に、図2を参照しながら、第1実施形態に係る決済システム10の機能的構成について説明する。図2は、第1実施形態に係る決済システムの機能的構成を示すブロック図である。
次に、図3を参照しながら、第1実施形態に係る決済システム10の動作の流れについて説明する。図3は、第1実施形態に係る決済システムの動作の流れを示すフローチャートである。
次に、第1実施形態に係る決済システム10によって得られる技術的効果について説明する。
A payment system 10 according to a second example embodiment will be described with reference to FIG. 4 to FIG. 11. The second example embodiment describes in detail the camera 20 used in the payment system 10, and the overall system configuration, flow of operations, and the like may be the same as in the first example embodiment (see FIG. 1 to FIG. 3). Accordingly, descriptions of parts overlapping with the first example embodiment already described will be omitted as appropriate below.
The configuration of the camera 20 used in the payment system according to the second example embodiment will be described with reference to FIG. 4 to FIG. 9. FIG. 4 is a schematic diagram (part 1) showing the configuration of the camera according to the second example embodiment. FIG. 5 is a schematic diagram (part 2) showing the configuration of the camera according to the second example embodiment. FIG. 6 is a plan view showing the relationship between the imaging range of the face camera and the imaging range of the iris camera. FIG. 7 is a schematic diagram showing an example of a visible-light filter provided in the illumination unit. FIG. 8 is a schematic diagram showing a configuration in which the motor is fixed to a fixing portion outside the device. FIG. 9 is a conceptual diagram showing the drive directions of the camera according to the second example embodiment. Note that, for convenience of explanation, FIG. 4 to FIG. 9 show only the main components and omit components of little relevance to the present example embodiment. FIG. 4 is a view of the imaging device for authentication seen from the front side (in other words, from the side of the person being imaged), and FIG. 5 is a view of the imaging device for authentication seen from the rear side (i.e., the side opposite to FIG. 4).
Next, the flow of operations of the camera 20 provided in the payment system 10 according to the second example embodiment will be described with reference to FIG. 10 and FIG. 11. FIG. 10 is a flowchart showing the flow of operations of the camera according to the second example embodiment. FIG. 11 is a conceptual diagram showing an example of a method of adjusting the imaging range based on the face position.
Next, technical effects obtained by the payment system 10 according to the second example embodiment will be described.
A payment system 10 according to a third example embodiment will be described with reference to FIG. 12 and FIG. 13. The third example embodiment differs from the first and second example embodiments described above only in part of its configuration and operation, and the other parts may be the same as in the first and second example embodiments. Accordingly, descriptions of parts overlapping with the example embodiments already described will be omitted as appropriate below.
First, the functional configuration of the payment system 10 according to the third example embodiment will be described with reference to FIG. 12. FIG. 12 is a block diagram showing the functional configuration of the payment system according to the third example embodiment. In FIG. 12, elements similar to the components shown in FIG. 2 are denoted by the same reference numerals, and components shown in FIG. 2 that are of little relevance to the present example embodiment are omitted from the illustration.
Next, a display example (an example of presentation to the customer) when driving the face camera 220 will be described with reference to FIG. 13. FIG. 13 is a conceptual diagram showing a display example when reading a product. FIG. 13 shows a display example in the case where the face camera 220 is driven in the vertical direction.
Next, technical effects obtained by the payment system 10 according to the third example embodiment will be described.
A payment system 10 according to a fourth example embodiment will be described with reference to FIG. 14 and FIG. 15. The fourth example embodiment differs from the first to third example embodiments described above only in part of its configuration and operation, and the other parts may be the same as in the first to third example embodiments. Accordingly, descriptions of parts overlapping with the example embodiments already described will be omitted as appropriate below.
First, the functional configuration of the payment system 10 according to the fourth example embodiment will be described with reference to FIG. 14. FIG. 14 is a block diagram showing the functional configuration of the payment system according to the fourth example embodiment. In FIG. 14, elements similar to the components shown in FIG. 2 are denoted by the same reference numerals.
Next, the flow of operations of the payment system 10 according to the fourth example embodiment will be described with reference to FIG. 15. FIG. 15 is a flowchart showing the flow of operations of the payment system according to the fourth example embodiment. In FIG. 15, processes similar to those shown in FIG. 3 are denoted by the same reference numerals.
Next, technical effects obtained by the payment system 10 according to the fourth example embodiment will be described.
A payment system 10 according to a fifth example embodiment will be described with reference to FIG. 16 to FIG. 18. The fifth example embodiment differs from the fourth example embodiment described above only in part of its configuration and operation, and the other parts may be the same as in the first to fourth example embodiments. Accordingly, descriptions of parts overlapping with the example embodiments already described will be omitted as appropriate below.
First, the functional configuration of the payment system 10 according to the fifth example embodiment will be described with reference to FIG. 16. FIG. 16 is a block diagram showing the functional configuration of the payment system according to the fifth example embodiment. In FIG. 16, elements similar to the components shown in FIG. 14 are denoted by the same reference numerals.
Next, display examples of the gaze region by the gaze region display unit 131 described above will be specifically described with reference to FIG. 17 and FIG. 18. FIG. 17 is a conceptual diagram showing a display example of the gaze region. FIG. 18 is a conceptual diagram showing a display example of the gaze region that takes the camera position into account.
Next, technical effects obtained by the payment system 10 according to the fifth example embodiment will be described.
A payment system 10 according to a sixth example embodiment will be described with reference to FIG. 19 and FIG. 20. The sixth example embodiment differs from the fifth example embodiment described above only in part of its configuration and operation, and the other parts may be the same as in the first to fifth example embodiments. Accordingly, descriptions of parts overlapping with the example embodiments already described will be omitted as appropriate below.
First, the functional configuration of the payment system 10 according to the sixth example embodiment will be described with reference to FIG. 19. FIG. 19 is a block diagram showing the functional configuration of the payment system according to the sixth example embodiment. In FIG. 19, elements similar to the components shown in FIG. 16 are denoted by the same reference numerals.
Next, a display example of the frame by the frame display unit 132 described above will be specifically described with reference to FIG. 20. FIG. 20 is a conceptual diagram showing a display example of a frame that gradually converges on the gaze region.
Next, technical effects obtained by the payment system 10 according to the sixth example embodiment will be described.
A payment system 10 according to a seventh example embodiment will be described with reference to FIG. 21 and FIG. 22. The seventh example embodiment differs from the fifth and sixth example embodiments described above only in part of its configuration and operation, and the other parts may be the same as in the first to sixth example embodiments. Accordingly, descriptions of parts overlapping with the example embodiments already described will be omitted as appropriate below.
First, the functional configuration of the payment system 10 according to the seventh example embodiment will be described with reference to FIG. 21. FIG. 21 is a block diagram showing the functional configuration of the payment system according to the seventh example embodiment. In FIG. 21, elements similar to the components shown in FIG. 16 are denoted by the same reference numerals.
Next, the color change of the gaze region by the region color change unit 133 described above will be specifically described with reference to FIG. 22. FIG. 22 is a conceptual diagram showing a display example in which the color of the gaze region is gradually changed toward the outside of the screen.
Next, technical effects obtained by the payment system 10 according to the seventh example embodiment will be described.
A payment system 10 according to an eighth example embodiment will be described with reference to FIG. 23 and FIG. 24. The payment system 10 according to the eighth example embodiment differs from the first to seventh example embodiments described above only in part of its configuration and operation, and the other parts may be the same as in the first to seventh example embodiments. Accordingly, descriptions of parts overlapping with the example embodiments already described will be omitted as appropriate below.
First, the functional configuration of the payment system 10 according to the eighth example embodiment will be described with reference to FIG. 23. FIG. 23 is a block diagram showing the functional configuration of the payment system according to the eighth example embodiment. In FIG. 23, elements similar to the components shown in FIG. 2 are denoted by the same reference numerals.
Next, the flow of operations of the payment system 10 according to the eighth example embodiment will be described with reference to FIG. 24. FIG. 24 is a flowchart showing the flow of operations of the payment system according to the eighth example embodiment. In FIG. 24, processes similar to those shown in FIG. 3 are denoted by the same reference numerals.
Next, technical effects obtained by the payment system 10 according to the eighth example embodiment will be described.
Other display examples applicable to the payment systems 10 according to the first to eighth example embodiments described above will be described with reference to FIG. 25 to FIG. 29. FIG. 25 is a conceptual diagram showing a display example for final confirmation of the intention to pay. FIG. 26 is a conceptual diagram showing a display example for bringing the distance between the customer and the camera into an appropriate range. FIG. 27 is a conceptual diagram showing a display example for notifying the gaze direction for iris authentication. FIG. 28 is a conceptual diagram showing a display example for prompting removal of worn items. FIG. 29 is a conceptual diagram showing a display example for notifying the customer to open their eyes.
A modification of the camera 20 described in detail in the second example embodiment will be described with reference to FIG. 30. FIG. 30 is a schematic diagram showing the modification of the camera. In FIG. 30, components similar to those shown in FIG. 9 are denoted by the same reference numerals.
Display examples in payment will be described with reference to FIG. 31 to FIG. 35. FIG. 31 is a conceptual diagram showing a display example of a cancel button. FIG. 32 is a conceptual diagram showing a display example after cancellation. FIG. 33 is a conceptual diagram showing a display example of a quantity change button. FIG. 34 is a conceptual diagram showing a display example of an amount change button. FIG. 35 is a conceptual diagram showing a display example when there is a product requiring age verification.
The example embodiments described above may be further described as in the following supplementary notes, but are not limited to the following.
The payment system described in Supplementary Note 1 is a payment system characterized by comprising: a product reading means for reading a product; a product information acquisition means for acquiring product information on the read product; a confirmation information output means for outputting confirmation information for confirming with a customer the intention to pay for the product; an accepting means for accepting input from the customer in response to the confirmation information; a face acquisition means for acquiring a face image of the customer; an iris acquisition means for acquiring an iris image of the customer; and a payment processing means for executing payment processing for the product based on the input from the customer and at least one of the face image and the iris image.
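As a concrete illustration, the pipeline of Supplementary Note 1 (read a product, output confirmation information, accept the customer's input, then settle using at least one biometric) can be sketched as below. This is a minimal sketch under assumed names: the `PaymentSystem` class, the product-code database, and the consent rule in `settle` are all illustrative, not taken from the specification.

```python
from dataclasses import dataclass, field


@dataclass
class PaymentSystem:
    # Hypothetical product database: product code -> (name, price in yen).
    product_db: dict
    cart: list = field(default_factory=list)

    def read_product(self, code):
        """Product reading means: look up the scanned code and add it to the cart."""
        name, price = self.product_db[code]
        self.cart.append((name, price))
        return name, price

    def confirmation_info(self):
        """Confirmation information output means: a message confirming
        the customer's intention to pay for the scanned products."""
        total = sum(price for _, price in self.cart)
        return f"Pay {total} yen for {len(self.cart)} item(s)?"

    def settle(self, customer_input, face_ok, iris_ok):
        """Payment processing means: proceed only with the customer's consent
        and at least one successful biometric match (face OR iris)."""
        return customer_input == "yes" and (face_ok or iris_ok)
```

The OR between the face and iris results mirrors the note's wording "at least one of the face image and the iris image"; how the two matches are actually combined in practice is left open by the specification.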
The payment system described in Supplementary Note 2 is the payment system described in Supplementary Note 1, characterized in that the face acquisition means and the iris acquisition means acquire the face image and the iris image from a face camera and an iris camera that are driven integrally.
The payment system described in Supplementary Note 3 is the payment system described in Supplementary Note 1 or 2, characterized in that the product reading means reads the product using a face camera with which the face acquisition means acquires the face image, and the face camera is driven in at least one direction to change its imaging range when reading the product.
The payment system described in Supplementary Note 4 is the payment system described in any one of Supplementary Notes 1 to 3, characterized in that the accepting means accepts the input from the customer based on the customer's gaze direction estimated from at least one of the face image and the iris image.
The payment system described in Supplementary Note 5 is the payment system described in Supplementary Note 4, characterized in that the confirmation information output means displays at least one gaze region corresponding to the intention to pay, and the accepting means accepts, as the input from the customer, information corresponding to the gaze region at which the customer is gazing.
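Mapping an estimated gaze point to a displayed gaze region, as in Supplementary Notes 4 and 5, reduces to a point-in-rectangle test in screen coordinates. The region layout and the "yes"/"no" labels below are assumptions made for illustration:

```python
# Hypothetical screen layout: label -> (x0, y0, x1, y1) rectangle in pixels.
REGIONS = {
    "yes": (0, 0, 200, 100),
    "no":  (300, 0, 500, 100),
}


def region_under_gaze(gaze_xy, regions=REGIONS):
    """Return the label of the gaze region containing the estimated gaze
    point, or None if the customer is not looking at any region."""
    x, y = gaze_xy
    for label, (x0, y0, x1, y1) in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return label
    return None
```

The accepting means would then treat the returned label (for example "yes") as the customer's input once the gaze has stayed in the region long enough.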
The payment system described in Supplementary Note 6 is the payment system described in Supplementary Note 5, characterized in that the confirmation information output means displays a frame that gradually converges from outside the gaze region toward the gaze region in accordance with the time for which the customer has been gazing at the gaze region.
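The converging frame of Supplementary Note 6 can be modeled as a margin around the gaze region that shrinks as gaze dwell time accumulates. The linear shrink schedule, the 2-second dwell requirement, and the 80-pixel starting margin are illustrative assumptions, not values from the specification:

```python
def frame_rect(region, dwell, dwell_needed=2.0, start_margin=80):
    """Return the (x0, y0, x1, y1) of a frame drawn around `region`.
    The margin shrinks linearly from `start_margin` to 0 as the dwell
    time approaches `dwell_needed`, so the frame visually converges
    from outside the gaze region onto the region itself."""
    x0, y0, x1, y1 = region
    progress = min(dwell / dwell_needed, 1.0)
    m = start_margin * (1.0 - progress)
    return (x0 - m, y0 - m, x1 + m, y1 + m)
```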
The payment system described in Supplementary Note 7 is the payment system described in Supplementary Note 5 or 6, characterized in that the confirmation information output means gradually changes the color of the gaze region toward the outside of the screen in accordance with the time for which the customer has been gazing at the gaze region.
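One way to realize the outward color change of Supplementary Note 7 is with concentric color bands around the gaze region whose visibility spreads toward the screen edge as gazing continues. The band count and the linear schedule below are assumed for illustration:

```python
def band_alphas(dwell, dwell_needed=2.0, bands=4):
    """Opacity (0.0 or 1.0) of `bands` concentric colour bands spreading
    from the gaze region toward the screen edge. Band i becomes visible
    once the dwell time passes (i + 1) / bands of the required time, so
    the colour change propagates outward as the customer keeps gazing."""
    progress = min(dwell / dwell_needed, 1.0)
    return [1.0 if progress >= (i + 1) / bands else 0.0 for i in range(bands)]
```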
The payment system described in Supplementary Note 8 is the payment system described in any one of Supplementary Notes 1 to 7, further comprising a liveness determination means for determining the customer's likelihood of being a living body based on the movement of the customer's gaze estimated from the iris image, characterized in that the payment processing means determines whether to execute the payment processing based on the face image when the likelihood is higher than a predetermined threshold, and determines whether to execute the payment processing based on the face image and the iris image when the likelihood is lower than the predetermined threshold.
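The branching in Supplementary Note 8 is a simple threshold decision: high liveness lets the face match alone decide, while low liveness requires both biometrics. A sketch, where the 0.8 threshold and the boolean match inputs are placeholders (the specification leaves the threshold value unspecified):

```python
def authenticate(liveness, face_match, iris_match, threshold=0.8):
    """Decide whether to execute payment processing. `liveness` is the
    likelihood of a living body estimated from gaze movement in the iris
    image; above the threshold the face match alone decides, otherwise
    both the face match and the iris match must succeed."""
    if liveness > threshold:
        return face_match
    return face_match and iris_match
```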
The payment method described in Supplementary Note 9 is a payment method characterized by: reading a product; acquiring product information on the read product; outputting confirmation information for confirming with a customer the intention to pay for the product; accepting input from the customer in response to the confirmation information; acquiring a face image of the customer; acquiring an iris image of the customer; and executing payment processing for the product based on the input from the customer and at least one of the face image and the iris image.
The computer program described in Supplementary Note 10 is a computer program characterized by causing a computer to operate so as to: read a product; acquire product information on the read product; output confirmation information for confirming with a customer the intention to pay for the product; accept input from the customer in response to the confirmation information; acquire a face image of the customer; acquire an iris image of the customer; and execute payment processing for the product based on the input from the customer and at least one of the face image and the iris image.
The recording medium described in Supplementary Note 11 is a recording medium characterized by recording the computer program described in Supplementary Note 10.
11 Processor
20 Camera
110 Product reading unit
120 Product information acquisition unit
130 Confirmation information output unit
131 Gaze region display unit
132 Frame display unit
133 Region color change unit
140 Input accepting unit
141 Gaze direction estimation unit
150 Face image acquisition unit
160 Iris image acquisition unit
170 Payment processing unit
180 Liveness determination unit
210 Iris camera
220 Face camera
225 Camera unit
230 Illumination unit
235 Visible-light filter
250 Holding unit
260 Air-cooling fan
270 Motor
275 Fixing portion
280 Housing
290 Drive control unit
300 Scanner
Claims (10)
- A product reading means for reading a product,
a product information acquisition means for acquiring product information on the read product,
a confirmation information output means for outputting confirmation information for confirming with a customer the intention to pay for the product,
an accepting means for accepting input from the customer in response to the confirmation information,
a face acquisition means for acquiring a face image of the customer,
an iris acquisition means for acquiring an iris image of the customer, and
a payment processing means for executing payment processing for the product based on the input from the customer and at least one of the face image and the iris image,
a payment system characterized by comprising the above. - The payment system according to claim 1, characterized in that the face acquisition means and the iris acquisition means acquire the face image and the iris image from a face camera and an iris camera that are driven integrally.
- The product reading means reads the product using a face camera with which the face acquisition means acquires the face image, and
the face camera is driven in at least one direction to change its imaging range when reading the product,
the payment system according to claim 1 or 2 being characterized by the above. - The payment system according to any one of claims 1 to 3, characterized in that the accepting means accepts the input from the customer based on the customer's gaze direction estimated from at least one of the face image and the iris image.
- The confirmation information output means displays at least one gaze region corresponding to the intention to pay, and
the accepting means accepts, as the input from the customer, information corresponding to the gaze region at which the customer is gazing,
the payment system according to claim 4 being characterized by the above. - The payment system according to claim 5, characterized in that the confirmation information output means displays a frame that gradually converges from outside the gaze region toward the gaze region in accordance with the time for which the customer has been gazing at the gaze region.
- The payment system according to claim 5 or 6, characterized in that the confirmation information output means gradually changes the color of the gaze region toward the outside of the screen in accordance with the time for which the customer has been gazing at the gaze region.
- A liveness determination means for determining the customer's likelihood of being a living body based on the movement of the customer's gaze estimated from the iris image is further provided, and
the payment processing means determines whether to execute the payment processing based on the face image when the likelihood is higher than a predetermined threshold, and determines whether to execute the payment processing based on the face image and the iris image when the likelihood is lower than the predetermined threshold,
the payment system according to any one of claims 1 to 7 being characterized by the above. - Reading a product,
acquiring product information on the read product,
outputting confirmation information for confirming with a customer the intention to pay for the product,
accepting input from the customer in response to the confirmation information,
acquiring a face image of the customer,
acquiring an iris image of the customer, and
executing payment processing for the product based on the input from the customer and at least one of the face image and the iris image,
a payment method characterized by executing the above processing. - Reading a product,
acquiring product information on the read product,
outputting confirmation information for confirming with a customer the intention to pay for the product,
accepting input from the customer in response to the confirmation information,
acquiring a face image of the customer,
acquiring an iris image of the customer, and
executing payment processing for the product based on the input from the customer and at least one of the face image and the iris image,
a computer program characterized by causing a computer to operate so as to execute the above processing.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/640,602 US20230196364A1 (en) | 2021-03-08 | 2021-03-08 | Payment system, payment method, and computer program |
PCT/JP2021/008957 WO2022190164A1 (ja) | 2021-03-08 | 2021-03-08 | Payment system, payment method, and computer program |
JP2023504886A JPWO2022190164A1 (ja) | 2021-03-08 | 2021-03-08 | |
EP21930023.3A EP4307196A4 (en) | 2021-03-08 | 2021-03-08 | PAYMENT SYSTEM, PAYMENT METHOD AND COMPUTER PROGRAM |
CN202180095390.3A CN116997918A (zh) | 2021-03-08 | 2021-03-08 | 支付系统、支付方法和计算机程序 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/008957 WO2022190164A1 (ja) | 2021-03-08 | 2021-03-08 | Payment system, payment method, and computer program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022190164A1 true WO2022190164A1 (ja) | 2022-09-15 |
Family
ID=83227501
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/008957 WO2022190164A1 (ja) | 2021-03-08 | 2021-03-08 | Payment system, payment method, and computer program |
Country Status (5)
Country | Link |
---|---|
US (1) | US20230196364A1 (ja) |
EP (1) | EP4307196A4 (ja) |
JP (1) | JPWO2022190164A1 (ja) |
CN (1) | CN116997918A (ja) |
WO (1) | WO2022190164A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230035711A1 (en) * | 2021-07-31 | 2023-02-02 | Qualcomm Incorporated | Satellite signal environment determination and/or position estimate selection |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005242677A (ja) * | 2004-02-26 | 2005-09-08 | Ntt Comware Corp | Composite authentication system, method, and program |
JP2009104426A (ja) * | 2007-10-24 | 2009-05-14 | Advanced Telecommunication Research Institute International | Interactive signboard system |
JP2009237643A (ja) | 2008-03-26 | 2009-10-15 | Nec Corp | Authentication system, authentication method, and authentication program |
US20150358594A1 (en) * | 2014-06-06 | 2015-12-10 | Carl S. Marshall | Technologies for viewer attention area estimation |
JP2018010625A (ja) | 2016-07-11 | 2018-01-18 | Samsung Electronics Co., Ltd. | User authentication method using multiple biometric authenticators and apparatus therefor |
US20200057847A1 (en) * | 2017-03-10 | 2020-02-20 | Crucialtec Co., Ltd. | Contactless multiple body part recognition method and multiple body part recognition device, using multiple biometric data |
JP2020166642A (ja) * | 2019-03-29 | 2020-10-08 | パナソニックIpマネジメント株式会社 | Checkout and payment device and unmanned store system |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090157481A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying a cohort-linked avatar attribute |
KR101542124B1 (ko) * | 2011-04-13 | 2015-08-13 | 엠파이어 테크놀로지 디벨롭먼트 엘엘씨 | Dynamic advertising content selection |
GB2497553B (en) * | 2011-12-13 | 2018-05-16 | Irisguard Inc | Improvements relating to iris cameras |
US20140002352A1 (en) * | 2012-05-09 | 2014-01-02 | Michal Jacob | Eye tracking based selective accentuation of portions of a display |
US9449221B2 (en) * | 2014-03-25 | 2016-09-20 | Wipro Limited | System and method for determining the characteristics of human personality and providing real-time recommendations |
CN105100633A (zh) * | 2014-05-22 | 2015-11-25 | 宁波舜宇光电信息有限公司 | Fill-light method and device in iris recognition applications |
US9741026B1 (en) * | 2014-09-30 | 2017-08-22 | Square, Inc. | Payment by use of identifier |
KR102460459B1 (ko) * | 2015-02-27 | 2022-10-28 | 삼성전자주식회사 | Card service method and device using an electronic device |
TWI607336B (zh) * | 2015-07-08 | 2017-12-01 | 台灣色彩與影像科技股份有限公司 | Method for monitoring a region |
US10044712B2 (en) * | 2016-05-31 | 2018-08-07 | Microsoft Technology Licensing, Llc | Authentication based on gaze and physiological response to stimuli |
CN109376666B (zh) * | 2018-10-29 | 2022-01-25 | 百度在线网络技术(北京)有限公司 | Product selling method and device, vending machine, and storage medium |
JP7445856B2 (ja) * | 2019-09-30 | 2024-03-08 | パナソニックIpマネジメント株式会社 | Object recognition device, object recognition system, and object recognition method |
JP2021163235A (ja) * | 2020-03-31 | 2021-10-11 | 富士通株式会社 | Information processing method, information processing system, information processing device, and information processing program |
- 2021
- 2021-03-08 CN CN202180095390.3A patent/CN116997918A/zh active Pending
- 2021-03-08 US US17/640,602 patent/US20230196364A1/en active Pending
- 2021-03-08 JP JP2023504886A patent/JPWO2022190164A1/ja active Pending
- 2021-03-08 EP EP21930023.3A patent/EP4307196A4/en active Pending
- 2021-03-08 WO PCT/JP2021/008957 patent/WO2022190164A1/ja active Application Filing
Non-Patent Citations (3)
Title |
---|
OTSUNA, RYOMA: "High-precision biometric authentication technology that can authenticate face and iris with one action", RESEARCH AND DEVELOPMENT (R&D) - INTRODUCTION OF RESEARCH ACTIVITIES - TECHNOLOGY LIST, JP, pages 1 - 3, XP009539945, Retrieved from the Internet <URL:https://jpn.nec.com/rd/technologies/202004/index.html> [retrieved on 20210601] * |
See also references of EP4307196A4 |
YAGI, DAISUKE; SHIMIZU, ATSUSHI; MATSUMOTO, KAZUNORI: "1K2-3 Research of Personal Authentication System using Gaze Data Mining", THE 30TH ANNUAL CONFERENCE OF THE JAPANESE SOCIETY OF ARTIFICIAL INTELLIGENCE (JSAI), vol. 30, 9 June 2016 (2016-06-09), pages 1 - 4, XP009540755 * |
Also Published As
Publication number | Publication date |
---|---|
EP4307196A4 (en) | 2024-05-08 |
JPWO2022190164A1 (ja) | 2022-09-15 |
CN116997918A (zh) | 2023-11-03 |
US20230196364A1 (en) | 2023-06-22 |
EP4307196A1 (en) | 2024-01-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5548042B2 (ja) | User terminal device and shopping system | |
US11157606B2 (en) | Facial recognition authentication system including path parameters | |
KR102441716B1 (ko) | Facial recognition authentication system including path parameters | |
JP3642336B2 (ja) | Eye image capturing device | |
JP2012022589A (ja) | Product selection support method | |
JP2020003873A (ja) | Biometric authentication program and biometric authentication method | |
WO2022190164A1 (ja) | Payment system, payment method, and computer program | |
US20230073410A1 (en) | Facial recognition and/or authentication system with monitored and/or controlled camera cycling | |
JP2018073265A (ja) | Vending machine, control method therefor, and program | |
WO2021245823A1 (ja) | Information acquisition device, information acquisition method, and storage medium | |
WO2021090354A1 (ja) | Iris authentication device and iris authentication method | |
JP2024003669A (ja) | Program, information terminal, and system | |
JP2015228096A (ja) | Electronic device | |
JP2012113437A (ja) | Automatic ticket vending machine | |
WO2020261423A1 (ja) | Authentication system, authentication method, control device, computer program, and recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21930023; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2023504886; Country of ref document: JP; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: 202180095390.3; Country of ref document: CN |
| WWE | Wipo information: entry into national phase | Ref document number: 2021930023; Country of ref document: EP |
| NENP | Non-entry into the national phase | Ref country code: DE |
| ENP | Entry into the national phase | Ref document number: 2021930023; Country of ref document: EP; Effective date: 20231009 |