US20170116491A1 - Commodity recognition apparatus and commodity recognition method - Google Patents
- Publication number
- US20170116491A1 (application Ser. No. 15/398,875)
- Authority
- US
- United States
- Prior art keywords
- commodity
- image
- recognized
- reason
- frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06K9/00912
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07G—REGISTERING THE RECEIPT OF CASH, VALUABLES, OR TOKENS
- G07G1/00—Cash registers
- G07G1/0036—Checkout procedures
- G07G1/0045—Checkout procedures with a code reader for reading of an identifying code of the article to be registered, e.g. barcode reader or radio-frequency identity [RFID] reader
- G07G1/0054—Checkout procedures with a code reader for reading of an identifying code of the article to be registered, e.g. barcode reader or radio-frequency identity [RFID] reader with control of supplementary check-parameters, e.g. weight or number of articles
- G07G1/0063—Checkout procedures with a code reader for reading of an identifying code of the article to be registered, e.g. barcode reader or radio-frequency identity [RFID] reader with control of supplementary check-parameters, e.g. weight or number of articles with means for detecting the geometric dimensions of the article of which the code is read, such as its size or height, for the verification of the registration
- G06K9/00624
- G06K9/4671
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/60—Static or dynamic means for assisting the user to position a body part for biometric acquisition
- G06V40/67—Static or dynamic means for assisting the user to position a body part for biometric acquisition by interactive indications to the user
- G06K2209/17
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/68—Food, e.g. fruit or vegetables
Definitions
- Embodiments described herein relate generally to a commodity recognition apparatus which recognizes a commodity from a captured image and a commodity recognition method for enabling a computer to function as the commodity recognition apparatus.
- The general object recognition technology described above has been proposed for application to a recognition apparatus which recognizes a commodity, especially a commodity to which no barcode is attached, such as vegetables and fruits purchased by a customer in a checkout system (POS system) of a retail store.
- An operator (a shop clerk or a customer) holds the recognition target commodity over the image capturing section.
- In the commodity recognition apparatus, the commodity is recognized from the captured image, and information (for example, a commodity name) indicating the recognition result is displayed on a display section.
- However, the commodity is not always recognized. For example, there is a case in which the commodity is not recognized because it is shielded by the hand of the operator. Alternatively, there is a case in which the commodity is not recognized because similar feature amount data is not registered in the recognition dictionary file. Further, there is a case in which the commodity is not recognized because the commodity recognition program is frozen.
- Conventionally, the commodity recognition apparatus shows no reaction in a case in which the commodity is not recognized. As a result, it is difficult for the operator to determine the reason why the commodity is not recognized, and many operators feel stress as they repeatedly change the position and the direction in which they hold the commodity.
- FIG. 1 is an external view of a store checkout system
- FIG. 2 is a block diagram illustrating the hardware constitution of a scanner device and a POS terminal
- FIG. 3 is a schematic view illustrating the structure of dictionary data for each commodity stored in a recognition dictionary file
- FIG. 4 is a schematic view illustrating a main memory area formed in a RAM of the scanner device
- FIG. 5 is a flowchart illustrating the procedure of a main information processing executed by a CPU of the scanner device according to a commodity recognition program
- FIG. 6 is a schematic view illustrating an example of a screen of a touch panel when a commodity recognition operation is carried out
- FIG. 7 is a schematic view illustrating an example of a screen of the touch panel when a commodity recognition operation is not carried out
- FIG. 8 is a schematic view illustrating an example of a notification screen in a case in which an object is not recognized as a commodity
- FIG. 9 is a schematic view illustrating another example of the notification screen in a case in which an object is not recognized as a commodity.
- FIG. 10 is a schematic view illustrating another example of the notification screen in a case in which an object is not recognized as a commodity.
- a commodity recognition apparatus comprises a recognition module, a specification module and a notification module.
- the recognition module recognizes, from a captured image, a commodity imaged in the captured image.
- the specification module specifies the reason in a case in which the commodity cannot be recognized by the recognition module.
- the notification module notifies an operator of the reason specified by the specification module.
- a scanner device 1 constituting a store checkout system of a retail store which deals in vegetables, fruits and the like is provided with the functions of the commodity recognition apparatus.
- FIG. 1 is an external view of the store checkout system.
- the system includes the scanner device 1 serving as a registration section for registering a commodity to be purchased by a customer and a POS (Point Of Sales) terminal 2 serving as a settlement section for processing the payment of the customer.
- the scanner device 1 is arranged on a checkout counter 3 .
- the POS terminal 2 is arranged on a register table 4 across a drawer 5 .
- the scanner device 1 and the POS terminal 2 are electrically connected with each other through a communication cable 300 (refer to FIG. 2 ).
- the scanner device 1 is provided with a keyboard 11 , a touch panel 12 and a customer display 13 as the devices required for the reading of the commodity. These display/operation devices are arranged on a thin rectangular housing 1 A constituting a main body of the scanner device 1 .
- An image capturing section 14 is arranged inside the housing 1 A.
- a rectangular-shaped reading window 1 B is formed at the front side of the housing 1 A.
- the image capturing section 14 includes a CCD (Charge Coupled Device) image capturing element as an area image sensor and a drive circuit thereof, and an image capturing lens for focusing the image of an image capturing area on the CCD image capturing element.
- the image capturing area refers to an area of a frame image focused on the area of the CCD image capturing element through the image capturing lens from the reading window 1 B.
- the image capturing section 14 outputs the image of the image capturing area focused on the CCD image capturing element through the image capturing lens.
- the image capturing section 14 may be a CMOS (complementary metal oxide semiconductor) image sensor.
- the POS terminal 2 is provided with a keyboard 21 , an operator display 22 , a customer display 23 and a receipt printer 24 as the devices required for the settlement.
- the keyboard 21 , the operator display 22 and the customer display 23 within these devices are arranged on a housing constituting a main body of the POS terminal 2 .
- the receipt printer 24 is arranged inside the housing, and a receipt printed by the printer is issued from a receipt issuing port formed at the front side of the housing.
- the checkout counter 3 is in an elongated shape along a customer passage at the rear side thereof.
- the register table 4 is arranged at a side opposite to the customer passage with respect to the checkout counter 3 at a substantially right angle to the checkout counter 3 .
- the register table 4 is located at the end of the checkout counter 3 at the downstream side of a movement direction of a customer moving along the checkout counter 3. Therefore, the checkout counter 3 and the register table 4 are arranged in an L-shape to define a space for the shop clerk in charge of settlement, i.e., a so-called cashier.
- the housing 1 A of the scanner device 1 is vertically arranged such that the keyboard 11 , the touch panel 12 and the reading window 1 B are directed to the space for a shop clerk (cashier).
- the customer display 13 of the scanner device 1 is arranged on the housing 1 A, facing the customer passage.
- a first upper surface portion of the checkout counter 3 at the upstream side thereof through the scanner device 1 in the customer movement direction serves as a space for placing a shopping basket 6 in which an unregistered commodity M purchased by a customer is held.
- a second upper surface portion at the downstream side through the scanner device 1 serves as another space for placing a shopping basket 7 in which a commodity M registered by the scanner device 1 is held.
- FIG. 2 is a block diagram illustrating the hardware constitutions of the scanner device 1 and the POS terminal 2 .
- the scanner device 1 comprises a scanner section 101 and an operation-output section 102 .
- the scanner section 101 includes a CPU (Central Processing Unit) 111 , a ROM (Read Only Memory) 112 , a RAM (Random Access Memory) 113 and a connection interface 114 which are connected with each other via a bus line 115 including an address bus and a data bus.
- the scanner section 101 further connects the image capturing section 14 with the bus line 115 through an input/output circuit (not shown).
- the CPU 111 is a central part of a computer.
- the CPU 111 controls each section to achieve various functions of the scanner device 1 according to an operating system or an application program.
- the ROM 112 is a main storage part of the computer.
- the ROM 112 stores the operating system and the application program mentioned above. As occasion demands, ROM 112 also stores data required to execute various processing by the CPU 111 .
- the RAM 113 is also a main storage part of the computer mentioned above.
- the RAM 113 stores data required to execute various processing by the CPU 111 as needed. Further, the RAM 113 is also used as a work area for the CPU 111 when various processing, for example, the processing of storing the frame image captured by the image capturing section 14 , is executed.
- the operation-output section 102 includes a connection interface 116 .
- the operation-output section 102 connects a bus line 117 with the connection interface 116, and connects the keyboard 11, the touch panel 12 and the customer display 13 with the bus line 117 through an input/output circuit (not shown) to realize the functions of the operation-output section 102.
- the touch panel 12 includes a panel type display 12 a and a touch panel sensor 12 b overlaid on the screen of the display 12 a.
- the operation-output section 102 further connects a speech synthesis section 118 with the bus line 117 .
- the speech synthesis section 118 outputs a speech or voice signal to a speaker 17 in response to a command input via the bus line 117 .
- the speaker 17 converts the voice signal into a voice to output it.
- connection interface 114 of the scanner section 101 and the connection interface 116 of the operation-output section 102 are connected with each other through the communication cable 300 . Through the connection, the data signal from the scanner section 101 is sent to the operation-output section 102 , and the operation of the operation-output section 102 is controlled.
- the POS terminal 2 also carries a CPU 201 as a main body of the control section.
- the CPU 201 is connected with a ROM 203 , a RAM 204 , an auxiliary storage section 205 , a communication interface 206 and a connection interface 207 via a bus line 202 .
- the keyboard 21 , display for operator 22 , display for customer 23 , printer 24 and drawer 5 are respectively connected with the bus line 202 via an input-output circuit (not shown).
- the communication interface 206 is connected with a store server (not shown) serving as a center of the store via a network such as a LAN (Local Area Network) and the like. Through this connection, the POS terminal 2 can perform a transmission/reception of data with the store server.
- the connection interface 207 is connected with the two connection interfaces 114 and 116 of the scanner device 1 via the communication cable 300.
- the POS terminal 2 receives information from the scanner section 101 of the scanner device 1.
- the scanner device 1 accesses the data files stored in the auxiliary storage section 205 of the POS terminal 2.
- the auxiliary storage section 205 which is, for example, a HDD (Hard Disk Drive) device or a SSD (Solid State Drive) device, further stores data files such as a recognition dictionary file 30 , a PLU file 40 and the like, in addition to various programs.
- FIG. 3 is a schematic view illustrating the structure of the dictionary data for each commodity stored in the recognition dictionary file 30 .
- a plurality of feature amount data are stored in the recognition dictionary file 30 for each recognition target commodity in association with a commodity name and a commodity ID for identifying the commodity.
- the feature amount data is obtained by extracting, from a reference image obtained by photographing a commodity identified with the corresponding commodity ID, an appearance feature amount serving as the surface information (appearance shape, tint, pattern, and concave-convex state and the like) of the commodity, and representing the appearance feature amount in the form of parameters.
- Feature amount data 1 -N obtained when a commodity is observed from various directions are respectively stored for the commodity.
- the number N of feature amount data for one commodity is not fixed.
- the number N of feature amount data differs from commodity to commodity.
- the commodity name is not necessarily contained in the dictionary data for each commodity.
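As a rough illustration of the dictionary data described above, the per-commodity records can be pictured as follows. This is a hypothetical in-memory sketch; the commodity ID, name, field names and feature values are all illustrative, since the patent does not specify a storage format.

```python
# Hypothetical sketch of the per-commodity records in the recognition
# dictionary file 30. All identifiers and values are illustrative only.
recognition_dictionary = {
    "4901234567890": {                # commodity ID (key)
        "name": "banana",             # commodity name (optional per the text)
        # N feature-amount vectors, one per observation direction;
        # N is not fixed and differs from commodity to commodity.
        "features": [
            [0.12, 0.85, 0.33],       # appearance feature, direction 1
            [0.15, 0.80, 0.31],       # appearance feature, direction 2
        ],
    },
}
```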
- the PLU file 40 stores commodity data such as a unit price serving as the price per unit in association with the commodity ID of each commodity.
- the CPU 201 reads the unit price associated with the commodity ID from the PLU file 40 and registers the sales data of the commodity for sale with the unit price. Further, it is possible to collectively manage the dictionary data for each commodity stored in the recognition dictionary file 30 and the data for each commodity stored in the PLU file 40 in one data file.
- the image capturing section 14 of the scanner device 1 functions as an image capturing module for photographing the commodity held over the reading window 1 B by the operator (shop clerk).
- the touch panel 12 functions as a display module for displaying the image captured by the image capturing module (image capturing section 14 ).
- the CPU 111 functions as a detection module, an extraction module, a calculating module, the recognition module, the specification module and the notification module.
- the detection module detects, from an image captured by the image capturing section 14 , an object imaged in the image.
- the extraction module analyzes the image of the object detected from the captured image and extracts the appearance feature amount of the object.
- the calculating module compares the appearance feature amount of the object extracted from the object image with the feature amount data of each recognition target commodity stored in the recognition dictionary file 30 to calculate, for each recognition target commodity, a similarity degree indicating how similar the appearance feature amount is to the feature amount data.
- the recognition module recognizes, from the captured image, the commodity imaged in the captured image based on the calculated similarity degree for each recognition target commodity.
- the specification module specifies the reason in a case in which the commodity cannot be recognized.
- the notification module notifies the operator of the reason specified by the specification module.
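The patent does not name a particular similarity measure; as one plausible sketch, the calculating module could score each commodity by the best cosine similarity between the extracted appearance feature amount and the commodity's N stored feature vectors. The function names here are assumptions for illustration.

```python
import math

def similarity_degree(extracted, reference):
    """Cosine similarity between the extracted feature vector and one
    stored feature vector (an illustrative metric; the patent does not
    specify how the similarity degree is computed)."""
    dot = sum(a * b for a, b in zip(extracted, reference))
    norm = (math.sqrt(sum(a * a for a in extracted))
            * math.sqrt(sum(b * b for b in reference)))
    return dot / norm if norm else 0.0

def best_similarity(extracted, feature_list):
    # A commodity's overall score is its best match over all N directions.
    return max(similarity_degree(extracted, f) for f in feature_list)
```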
- These modules are realized by executing processing according to the commodity recognition program by the CPU 111 of the scanner device 1 .
- the commodity recognition program is stored in, for example, the ROM 112 .
- the scanner device 1 forms a determined commodity memory 41 , a confirmation commodity memory 42 , a candidate commodity memory 43 , a pattern memory 44 and a color memory 45 in the RAM 113 as the memory areas required to execute the processing.
- All the determined commodity memory 41 , the confirmation commodity memory 42 and the candidate commodity memory 43 have an area for storing the commodity ID and the later described similarity degree as a pair.
- the determined commodity memory 41 and the confirmation commodity memory 42 store only one pair of commodity ID and similarity degree.
- the candidate commodity memory 43 can store a plurality of pairs of commodity ID and similarity degree.
- the determined commodity memory 41 stores the similarity degree and the commodity ID of a commodity of which the similarity degree is greater than a first threshold value A.
- the confirmation commodity memory 42 stores the similarity degree and the commodity ID of a commodity of which the similarity degree is equal to or smaller than the first threshold value A and greater than a second threshold value B smaller than the first threshold value A.
- the candidate commodity memory 43 stores the similarity degree and the commodity ID of a commodity of which the similarity degree is equal to or smaller than the second threshold value B and greater than a third threshold value C smaller than the second threshold value B.
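The three-memory scheme above can be summarised as a simple routing function. The numeric threshold values here are invented for illustration; the description only requires that the first threshold value A exceed the second threshold value B, which in turn exceeds the third threshold value C.

```python
def classify(similarity, a=0.90, b=0.75, c=0.60):
    """Route a similarity degree to the memory that should hold it.
    a, b and c stand in for the first, second and third threshold values."""
    if similarity > a:
        return "determined"      # determined commodity memory 41
    if similarity > b:
        return "confirmation"    # confirmation commodity memory 42
    if similarity > c:
        return "candidate"       # candidate commodity memory 43
    return None                  # at or below C: not a recognition result
```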
- the pattern memory 44 stores pattern data obtained by patterning various shapes of the hand of the operator of the scanner device 1 when the operator holds the commodity over the reading window 1 B.
- the hand of the operator is one reason why the commodity cannot be recognized by the recognition module. That is, there is a case in which the hand of the operator shields the commodity when the operator holds the commodity over the reading window 1 B, and as a result, the commodity cannot be recognized.
- the pattern memory 44 is arranged in the present embodiment to specify such a reason.
- the color memory 45 stores color data set for each reason when the commodity cannot be recognized.
- the reason why the commodity cannot be recognized further includes a reason that the position of the commodity is too far, or a reason that the position of the commodity is shifted, in addition to the reason that the hand of the operator shields the commodity. That is, there is a case in which the commodity cannot be recognized because the position of the commodity held over the reading window 1 B is shifted from the image capturing area. There is a case in which the commodity cannot be recognized because the commodity held over the reading window 1 B is too far away, and as a result, the resolution of the commodity image is low. Alternatively, there is a case in which the feature amount data of the commodity is not set in the recognition dictionary file 30. In the present embodiment, the differences among these reasons can be recognized according to the differences in the colors of the frame border surrounding the recognition target object.
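The color memory 45 is essentially a lookup table from the specified reason to a frame-border color. A minimal sketch follows, using the example colors (red, green, yellow) given later in the description; the reason keys are hypothetical names for the three reasons described in the text.

```python
# Reason-to-color table for the color memory 45. The reason keys are
# hypothetical names; the colors follow the examples in the description.
COLOR_MEMORY = {
    "hand_shields_commodity": "red",     # hand of the operator shields it
    "commodity_position":     "green",   # too far from / shifted off window
    "not_in_dictionary":      "yellow",  # feature amount data not registered
}

def frame_border_color(reason):
    # Look up the color used to redraw the frame border L1.
    return COLOR_MEMORY[reason]
```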
- the CPU 111 starts the procedure shown in the flowchart in FIG. 5 .
- the procedure of the processing described below including the procedure shown in FIG. 5 is just described as an example, and various processing procedures can be used properly to achieve the same result.
- the CPU 111 outputs an ON-signal of image capturing to the image capturing section 14 (ACT 1 ).
- the image capturing section 14 starts to photograph the image capturing area according to the ON-signal of image capturing.
- the frame images of the image capturing area captured by the image capturing section 14 are stored in the RAM 113 in sequence.
- After outputting the ON-signal of image capturing, the CPU 111 sequentially acquires the frame images stored in the RAM 113 and displays the images on the display 12 a (ACT 2).
- the CPU 111 detects, from the frame image, the object imaged in the frame image (ACT 3: detection module). For example, the CPU 111 extracts a contour line and the like from the binary image of the frame image. Then the CPU 111 tries to extract the contour of the object imaged in the frame image. If the contour of the object is extracted, the CPU 111 determines that the object is detected from the frame image (ACT 4).
- In a case in which the object cannot be detected from the frame image (NO in ACT 4), the CPU 111 returns to execute the processing in ACT 2. That is, the CPU 111 acquires a next frame image and tries to detect the object from that frame image.
- the CPU 111 specifies, for example, a minimum rectangular-shaped image area containing the contour of the object as the image area of the object. Then as shown in FIG. 6 or FIG. 7 , the CPU 111 displays a frame border L 1 surrounding the image area of the object (ACT 5 ).
- the frame border L 1 is formed in a rectangular shape in FIG. 6 or FIG. 7 , however, the shape of the frame border L 1 is not limited to this.
- the frame border L 1 may be formed in a rhombus, circle or ellipse. Further, the frame border L 1 is exemplified as a dotted line in FIG. 6 or FIG. 7 , however, it is not limited to this, and the frame border L 1 may be a solid line, one dotted line and the like.
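The detection in ACT 3 through ACT 5 (contour extraction, then a minimum rectangle around the object) can be sketched minimally as a bounding-box computation over a binarised frame. A real implementation would trace contour lines, but the resulting minimum rectangle is the same idea; this simplified stand-in is an assumption, not the patent's method.

```python
def bounding_box(binary_frame):
    """Minimum axis-aligned rectangle containing all foreground (1) pixels
    of a binarised frame image. Returns (top, left, bottom, right), or
    None when no object is detected (the next frame is then acquired)."""
    rows = [r for r, row in enumerate(binary_frame) if any(row)]
    cols = [c for c in range(len(binary_frame[0]))
            if any(row[c] for row in binary_frame)]
    if not rows:
        return None
    return (min(rows), min(cols), max(rows), max(cols))
```

The returned rectangle corresponds to the image area around which the frame border L 1 is drawn.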
- After displaying the frame border L 1, the CPU 111 extracts the appearance feature amount, such as the shape, the surface tint, the pattern, the concave-convex state and the like of the object, from the image in the area (ACT 6: extraction module).
- the CPU 111 accesses the POS terminal 2 through the connection interface 114. Then the CPU 111 sequentially compares the appearance feature amount with the feature amount data of each recognition target commodity stored in the recognition dictionary file 30 to calculate the similarity degree with the appearance feature amount for each recognition target commodity (ACT 7: calculating module).
- In a case in which a similarity degree greater than the first threshold value A is calculated, the CPU 111 writes the similarity degree and the commodity ID of the commodity into the determined commodity memory 41.
- If data is already written in the determined commodity memory 41, the CPU 111 retains the data with the greater similarity degree in the determined commodity memory 41.
- In a case in which a similarity degree equal to or smaller than the first threshold value A but greater than the second threshold value B is calculated, the CPU 111 writes the similarity degree and the commodity ID of the commodity into the confirmation commodity memory 42. In this case, if data is already written in the confirmation commodity memory 42, the CPU 111 retains the data with the greater similarity degree in the confirmation commodity memory 42.
- In a case in which a similarity degree equal to or smaller than the second threshold value B but greater than the third threshold value C is calculated, the CPU 111 writes the similarity degree and the commodity ID of the commodity into the candidate commodity memory 43. At this time, if data is already written in the candidate commodity memory 43, the CPU 111 sorts the data in the candidate commodity memory 43 in the descending order of similarity degree.
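The write rules above (single-slot memories that retain the greater similarity, and a candidate memory kept sorted in descending order of similarity degree) might look like this; the helper names are hypothetical.

```python
def update_single_slot(memory, commodity_id, similarity):
    """For the determined/confirmation commodity memories, which hold a
    single (commodity ID, similarity) pair: retain the greater pair."""
    if memory is None or similarity > memory[1]:
        return (commodity_id, similarity)
    return memory

def update_candidates(candidates, commodity_id, similarity):
    """The candidate commodity memory holds many pairs, re-sorted in
    descending order of similarity degree after each write."""
    candidates.append((commodity_id, similarity))
    candidates.sort(key=lambda pair: pair[1], reverse=True)
    return candidates
```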
- the CPU 111 confirms whether or not a similarity degree greater than the third threshold value C is calculated (ACT 8: recognition module). In a case in which no data is written in the determined commodity memory 41, the confirmation commodity memory 42 or the candidate commodity memory 43, a similarity degree greater than the third threshold value C has not been calculated. In this case (NO in ACT 8), the CPU 111 specifies the reason why the commodity cannot be recognized (ACT 9: specification module).
- the CPU 111 sequentially compares the image in the area surrounded by the frame border L 1 with each pattern data set in the pattern memory 44 to calculate a correspondence rate. For example, when a pattern data corresponding to the image at a probability as high as 80% is detected, the CPU 111 specifies that the reason why the commodity cannot be recognized is the hand of the operator.
- the CPU 111 measures the size of the area surrounded by the frame border L 1.
- In a case in which the size of the area is smaller than a predetermined minimum value, the CPU 111 specifies that the reason why the commodity cannot be recognized relates to the position of the commodity.
- Otherwise, the CPU 111 specifies that the reason is that the feature amount data of the commodity is not set in the recognition dictionary file 30.
- the CPU 111 may first determine whether or not the size of the area surrounded by the frame border L 1 is smaller than the minimum value, and then execute the processing of comparing with each pattern data in the pattern memory 44 to specify the reason if the size is smaller than the minimum value.
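Putting ACT 9 together: the hand-pattern check, the area-size check and the dictionary fallback can be sketched as one decision function. The 80% correspondence rate comes from the example above; the minimum size is an assumed parameter, and the returned reason names are hypothetical.

```python
def specify_reason(pattern_match_rate, region_size, min_size,
                   match_threshold=0.80):
    """Sketch of the specification module (ACT 9). The ordering follows
    the description; threshold values are illustrative."""
    if pattern_match_rate >= match_threshold:
        return "hand_shields_commodity"   # hand of the operator shields it
    if region_size < min_size:
        return "commodity_position"       # too far from or shifted off window
    return "not_in_dictionary"            # feature amount data not registered
```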
- After the reason why the commodity cannot be recognized is specified, the CPU 111 detects the color data set for the reason by referring to the color memory 45. Then the CPU 111 changes the color of the frame border L 1 to the color of the color data (ACT 10: notification module).
- the CPU 111 returns to ACT 2 to acquire a next frame image. Then the CPU 111 executes the processing following ACT 3 again.
- In a case in which a similarity degree greater than the third threshold value C is calculated (YES in ACT 8), the CPU 111 stops the notification processing carried out in ACT 10 (ACT 11). Subsequently, the CPU 111 executes a determination processing (ACT 12).
- the content of the determination processing varies in a case in which the data is written in the determined commodity memory 41 , in a case in which the data is written in the confirmation commodity memory 42 , and in a case in which the data is written in the candidate commodity memory 43 .
- In a case in which the data is written in the determined commodity memory 41, the CPU 111 determines the commodity specified with the commodity ID stored in the determined commodity memory 41 as the commodity held over the reading window 1 B. Then the CPU 111 generates a voice from the speaker 17 to notify that the commodity is determined.
- In a case in which the data is written in the confirmation commodity memory 42, the CPU 111 displays, on the display 12 a, an image of a name button which displays the name of the commodity specified with the commodity ID stored in the confirmation commodity memory 42 in a button frame. Then the CPU 111 waits until the name button is touched. When it is detected from the signal of the touch panel sensor 12 b that the name button is touched, the CPU 111 determines the commodity specified with the commodity ID stored in the confirmation commodity memory 42 as the commodity held over the reading window 1 B. Then the CPU 111 generates a voice from the speaker 17 to notify that the commodity is determined. Conversely, in a case in which it is detected that a screen area other than the name button is touched, the CPU 111 deletes the image of the name button. At this time, the commodity is not determined.
- In a case in which the data is written in the candidate commodity memory 43, the CPU 111 displays, on the display 12 a, the images of name buttons which display the names of the commodities specified with the commodity IDs stored in the candidate commodity memory 43 in button frames. Then the CPU 111 waits until any of the name buttons is touched.
- When any of the name buttons is touched, the CPU 111 determines the commodity corresponding to the name displayed in the touched name button as the commodity held over the reading window 1 B. Then the CPU 111 generates a voice from the speaker 17 to notify that the commodity is determined.
- In a case in which a screen area other than the name buttons is touched, the CPU 111 deletes the images of the name buttons. At this time, the commodity is not determined.
- the CPU 111 determines whether or not the commodity is determined (ACT 13 ). In a case in which the commodity is not determined (NO in ACT 13 ), the CPU 111 returns to ACT 2 to acquire a next frame image. Then the CPU 111 executes the processing following ACT 3 again.
- In a case in which the commodity is determined (YES in ACT 13), the CPU 111 outputs the commodity ID of the determined commodity to the POS terminal 2 through the connection interface 114 (ACT 14).
- the shop clerk holds an object M 1 over the reading window 1 B as shown in FIG. 6 .
- almost the entire object M 1 is photographed by the image capturing section 14 .
- the rectangular-shaped frame border L 1 indicated by a dotted line is displayed on the touch panel 12 in a manner of surrounding the recognition area of the object.
- the appearance feature amount of the object M 1 is extracted and compared with the feature amount data of each commodity stored in the recognition dictionary file 30 .
- the feature amount data of which the similarity degree with the appearance feature amount of the object M 1 is greater than the first threshold value A is set in the recognition dictionary file 30 as the feature amount data of a commodity “banana”.
- the object M 1 held over the reading window 1 B is recognized as the commodity "banana".
- the commodity ID of the commodity “banana” is output to the POS terminal 2 .
- the sales data of the commodity “banana” is registered in the POS terminal 2 .
- a button image N 1 of the commodity “banana” is displayed on the touch panel 12 in the scanner device 1 . If the shop clerk touches the button image N 1 , the commodity ID of the commodity “banana” is output to the POS terminal 2 . As a result, the sales data of the commodity “banana” is registered in the POS terminal 2 .
- the shop clerk holds an object M 2 over the reading window 1 B in a state in which the object M 2 is almost entirely shielded by the hand of the shop clerk, as shown in FIG. 7.
- the entire back of the hand including the object M 2 is recognized as one object in the scanner device 1 .
- the rectangular-shaped frame border L 1 indicated by a dotted line is displayed on the touch panel 12 in a manner of surrounding the recognition area of the object.
- the appearance feature amount of the object is extracted and compared with the feature amount data of each commodity stored in the recognition dictionary file 30 .
- the feature amount data of which the similarity degree with the appearance feature amount of the object is greater than the third threshold value C is not set in the recognition dictionary file 30 in most cases.
- In this case, there is no change in the screen of the touch panel 12 in a conventional apparatus.
- the shop clerk cannot determine whether the commodity is being recognized or the commodity cannot be recognized, which makes the shop clerk feel stress.
- the line type of the frame border L 1 is changed from the dotted line to a solid line in the present embodiment.
- the color of the frame border L 1 is also changed to a color corresponding to the reason why the commodity cannot be recognized.
- it is assumed in the scanner device 1 that the image in the area surrounded by the frame border L 1 is sequentially compared with each pattern data set in the pattern memory 44, and as a result, pattern data corresponding to the image at a high probability is detected.
- the hand of the shop clerk is the reason why the commodity cannot be recognized.
- the color of the frame border L 1 is changed to a color (for example, red) corresponding to the reason.
- the shop clerk is aware that the commodity N 1 cannot be recognized because his/her hand shields the commodity N 1 .
- the shop clerk holds the commodity N1 over the reading window 1B in a manner that the commodity N1 is not shielded by the hand; thus, the shop clerk does not feel stress.
- the reason why the commodity cannot be recognized is the position of the commodity.
- the color of the frame border L 1 is changed to a color (for example, green) corresponding to the reason.
- the shop clerk is aware that the commodity cannot be recognized because the commodity is too far from the reading window 1 B. At this time, the shop clerk only needs to move the commodity closer to the reading window 1 B, thus, the shop clerk does not feel stress.
- the color of the frame border L 1 is changed to, for example, yellow.
- the shop clerk is aware that the commodity cannot be recognized because the feature amount data of the commodity is not registered in the recognition dictionary file 30 .
- the shop clerk only needs to operate the keyboard 11 of the scanner device 1 to register the commodity through manual input, thus, the shop clerk does not feel stress.
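The three reasons above and their notification colors (red for the hand, green for the position, yellow for an unregistered commodity) can be sketched together as follows. The 80% correspondence rate and the existence of a preset minimum area come from the detailed description; the numeric minimum value and all function names are assumptions.

```python
HAND_MATCH_RATE = 0.80  # correspondence rate used in the detailed description
MIN_AREA = 400          # placeholder for the preset minimum area value

# Hypothetical color memory: one notification color per reason.
REASON_COLORS = {"hand": "red", "position": "green", "unregistered": "yellow"}

def specify_reason(best_correspondence_rate, frame_border_area):
    """Specify why the commodity cannot be recognized: the hand pattern
    is checked first, then the size of the area surrounded by the frame
    border; otherwise the commodity is not in the dictionary."""
    if best_correspondence_rate >= HAND_MATCH_RATE:
        return "hand"
    if frame_border_area < MIN_AREA:
        return "position"
    return "unregistered"

def frame_border_color(reason):
    """The frame border L1 is changed to the color set for the reason."""
    return REASON_COLORS[reason]
```

For instance, a high pattern correspondence yields "hand" (red), a small recognition area yields "position" (green), and a large area with no dictionary match yields "unregistered" (yellow).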
- the rectangular-shaped frame border L 1 is displayed on the touch panel 12 in a manner of surrounding the recognition area of the object.
- the color of the frame border L 1 is changed.
- the operator can easily be aware why the commodity cannot be recognized.
- the operator can confirm that the program is not frozen.
- the present invention is not limited to the embodiment described above.
- the reason why the object cannot be recognized may be notified in a display form other than the frame border.
- the operator may be notified of the reason by displaying an icon C 1 on the display 12 .
- FIG. 9 shows an example of an icon which is displayed when the commodity cannot be recognized because the hand of the operator shields the commodity.
- an icon of a mark “?” is displayed. In this way, the operator can easily be aware of the reason why the object cannot be recognized.
- the operator may be notified of the reason by displaying a message D 1 indicating the reason.
- the message D1 may be given through a voice by the speech synthesis section 118 . Further, the message D1 may be given by both voice and display.
- the present invention is not limited to the commodity recognition apparatus in which the general object recognition technology is applied.
- the present invention may also be applied to a commodity recognition apparatus which detects a barcode image from an image captured by an image capturing module and decodes the barcode of the barcode image to recognize a commodity.
- the method of recognizing the hand of the operator is not limited to the pattern recognition.
- it may be specified that the reason is the hand of the operator in a case in which the ratio of a flesh color area to the image area is greater than a given ratio.
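This flesh-color alternative can be sketched in Python as follows. The sketch is illustrative only: the RGB bounds for flesh color and the 50% default ratio are assumptions, not values given in this description.

```python
def flesh_color_ratio(pixels):
    """Return the fraction of (r, g, b) pixels that fall in a rough
    flesh-color range.  The bounds below are illustrative assumptions."""
    def is_flesh(r, g, b):
        # Reddish, moderately bright, with R > G > B ordering.
        return r > 95 and g > 40 and b > 20 and r > g > b and (r - b) > 15

    if not pixels:
        return 0.0
    return sum(1 for (r, g, b) in pixels if is_flesh(r, g, b)) / len(pixels)

def reason_is_hand(pixels, given_ratio=0.5):
    """Specify the hand of the operator as the reason when the flesh
    color area exceeds the given ratio of the image area."""
    return flesh_color_ratio(pixels) > given_ratio
```

An image area dominated by flesh-colored pixels is attributed to the operator's hand; one dominated by other colors is not.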
- the transfer of the commodity recognition apparatus is generally carried out in a state in which the programs such as the commodity recognition program are stored in the ROM.
- the present invention is not limited to this.
- the commodity recognition program and the like transferred separately from a computer device may be written in a writable storage device of the computer device through an operation of a user and the like.
- the transfer of the commodity recognition program and the like may be carried out by recording the program in a removable recording medium, or through a communication via a network.
- the form of the recording medium is not limited as long as the recording medium can store programs like a CD-ROM, a memory card and the like, and is readable by an apparatus.
- the function realized by an installed or downloaded program can also be realized through the cooperation with an OS (Operating System) installed in the apparatus.
Abstract
In accordance with one embodiment, a commodity recognition apparatus detects, from a captured image, an object imaged in the captured image and extracts an appearance feature amount of the object from the image of the object; compares the extracted appearance feature amount with feature amount data of a dictionary file in which feature amount data indicating the surface information of a commodity is stored for each recognition target commodity to calculate a similarity degree indicating how similar the appearance feature amount is to the feature amount data for each recognition target commodity; recognizes whether or not the object is a commodity based on the calculated similarity degree; and specifies and notifies the reason in a case in which the object is not recognized as a commodity.
Description
- This application is a Continuation of application Ser. No. 14/537,963 filed Nov. 11, 2014, the entire contents of which are incorporated herein by reference.
- This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2013-240021, filed Nov. 20, 2013, the entire contents of which are incorporated herein by reference.
- Embodiments described herein relate generally to a commodity recognition apparatus which recognizes a commodity from a captured image and a commodity recognition method for enabling a computer to function as the commodity recognition apparatus.
- There is a technology in which an object is recognized according to a similarity degree calculated by extracting an appearance feature amount of the object from the image data of the object photographed by an image capturing section and comparing the extracted appearance feature amount with the feature amount data of a reference image of each object pre-registered in a recognition dictionary file. Such a technology is called general object recognition, and various recognition technologies are disclosed in the following document.
- Keiji Yanai, “Current status and future direction of general object recognition”, Journal of Information Processing Society, Vol. 48, No. SIG16 [Searched on Aug. 10, 2010 (Heisei 22)], Internet <URL: http://mm.cs.uec.ac.jp/IPSJ-TCVIM-Yanai.pdf>
- In addition, a technology for carrying out general object recognition by dividing the image into areas for each object is described in the following document.
- Jamie Shotton et al., “Semantic Texton Forests for Image Categorization and Segmentation”, [Searched on Aug. 10, 2010 (Heisei 22)], Internet <URL:http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.145.3036&rep=repl&type=pdf>
- In recent years, for example, the general object recognition technology described above is proposed to be applied to a recognition apparatus for recognizing a commodity, especially a commodity on which a barcode is not applied, such as vegetables, fruits and the like purchased by a customer in a checkout system (POS system) of a retail store. In this case, an operator (a shop clerk or a customer) holds the recognition target commodity over the image capturing section. In this way, in the commodity recognition apparatus, the commodity is recognized from a captured image and the information (for example, a commodity name) indicating the recognition result is displayed on a display section.
- However, the commodity is not always recognized. For example, there is a case in which the commodity is not recognized because the commodity is shielded by the hand of the operator. Alternatively, there is a case in which the commodity is not recognized because similar feature amount data is not registered in the recognition dictionary file. Further, there is a case in which the commodity is not recognized because the commodity recognition program is frozen.
- The commodity recognition apparatus has no reaction in a case in which the commodity is not recognized. As a result, it is difficult to determine the reason why the commodity is not recognized, and most operators feel stress because they repeatedly change the position and the direction in which the commodity is held.
-
FIG. 1 is an external view of a store checkout system; -
FIG. 2 is a block diagram illustrating the hardware constitution of a scanner device and a POS terminal; -
FIG. 3 is a schematic view illustrating the structure of dictionary data for each commodity stored in a recognition dictionary file; -
FIG. 4 is a schematic view illustrating a main memory area formed in a RAM of the scanner device; -
FIG. 5 is a flowchart illustrating the procedure of a main information processing executed by a CPU of the scanner device according to a commodity recognition program; -
FIG. 6 is a schematic view illustrating an example of a screen of a touch panel when a commodity recognition operation is carried out; -
FIG. 7 is a schematic view illustrating an example of a screen of the touch panel when a commodity recognition operation is not carried out; -
FIG. 8 is a schematic view illustrating an example of a notification screen in a case in which an object is not recognized as a commodity; -
FIG. 9 is a schematic view illustrating another example of the notification screen in a case in which an object is not recognized as a commodity; and -
FIG. 10 is a schematic view illustrating another example of the notification screen in a case in which an object is not recognized as a commodity. - In accordance with one embodiment, a commodity recognition apparatus comprises a recognition module, a specification module and a notification module. The recognition module recognizes, from a captured image, a commodity imaged in the captured image. The specification module specifies the reason in a case in which the commodity cannot be recognized by the recognition module. The notification module notifies an operator of the reason specified by the specification module.
- Hereinafter, the embodiment of the commodity recognition apparatus is described with reference to the accompanying drawings. In the present embodiment, a
scanner device 1 constituting a store checkout system of a retail store which deals in vegetables, fruits and the like is provided with the functions of the commodity recognition apparatus. -
FIG. 1 is an external view of the store checkout system. The system includes the scanner device 1 serving as a registration section for registering a commodity to be purchased by a customer and a POS (Point Of Sales) terminal 2 serving as a settlement section for processing the payment of the customer. The scanner device 1 is arranged on a checkout counter 3. The POS terminal 2 is arranged on a register table 4 across a drawer 5. The scanner device 1 and the POS terminal 2 are electrically connected with each other through a communication cable 300 (refer to FIG. 2 ). - The
scanner device 1 is provided with a keyboard 11, a touch panel 12 and a customer display 13 as the devices required for the reading of the commodity. These display/operation devices are arranged on a thin rectangular housing 1A constituting a main body of the scanner device 1. - An
image capturing section 14 is arranged inside the housing 1A. A rectangular-shaped reading window 1B is formed at the front side of the housing 1A. The image capturing section 14 includes a CCD (Charge Coupled Device) image capturing element as an area image sensor and a drive circuit thereof, and an image capturing lens for focusing the image of an image capturing area on the CCD image capturing element. The image capturing area refers to an area of a frame image focused on the area of the CCD image capturing element through the image capturing lens from the reading window 1B. The image capturing section 14 outputs the image of the image capturing area focused on the CCD image capturing element through the image capturing lens. The image capturing section 14 may be a CMOS (complementary metal oxide semiconductor) image sensor. - The
POS terminal 2 is provided with a keyboard 21, an operator display 22, a customer display 23 and a receipt printer 24 as the devices required for the settlement. The keyboard 21, the operator display 22 and the customer display 23 within these devices are arranged on a housing constituting a main body of the POS terminal 2. The receipt printer 24 is arranged inside the housing, and a receipt printed by the printer is issued from a receipt issuing port formed at the front side of the housing. - The
checkout counter 3 is in an elongated shape along a customer passage at the rear side thereof. The register table 4 is arranged at a side opposite to the customer passage with respect to the checkout counter 3, at a substantially right angle to the checkout counter 3. Specifically, the register table 4 is located at the end of the checkout counter 3 at the downstream side of a movement direction of a customer moving along the checkout counter 3. Therefore, the checkout counter 3 and the register table 4 are arranged in an L-shape to define a space for a shop clerk in charge of settlement, i.e., a so-called cashier. - At the approximate center of the
checkout counter 3, the housing 1A of the scanner device 1 is vertically arranged such that the keyboard 11, the touch panel 12 and the reading window 1B are directed to the space for the shop clerk (cashier). The customer display 13 of the scanner device 1 is arranged on the housing 1A, facing the customer passage. - A first upper surface portion of the
checkout counter 3 at the upstream side of the scanner device 1 in the customer movement direction serves as a space for placing a shopping basket 6 in which an unregistered commodity M purchased by a customer is held. On the other side, a second upper surface portion at the downstream side of the scanner device 1 serves as another space for placing a shopping basket 7 in which a commodity M registered by the scanner device 1 is held. -
FIG. 2 is a block diagram illustrating the hardware constitutions of the scanner device 1 and the POS terminal 2. The scanner device 1 comprises a scanner section 101 and an operation-output section 102. The scanner section 101 includes a CPU (Central Processing Unit) 111, a ROM (Read Only Memory) 112, a RAM (Random Access Memory) 113 and a connection interface 114 which are connected with each other via a bus line 115 including an address bus and a data bus. The scanner section 101 further connects the image capturing section 14 with the bus line 115 through an input/output circuit (not shown). - The
CPU 111 is a central part of a computer. The CPU 111 controls each section to achieve various functions of the scanner device 1 according to an operating system or an application program. - The
ROM 112 is a main storage part of the computer. The ROM 112 stores the operating system and the application program mentioned above. As occasion demands, the ROM 112 also stores data required to execute various processing by the CPU 111. - The
RAM 113 is also a main storage part of the computer mentioned above. The RAM 113 stores data required to execute various processing by the CPU 111 as needed. Further, the RAM 113 is also used as a work area for the CPU 111 when various processing, for example, the processing of storing the frame image captured by the image capturing section 14, is executed. - The operation-
output section 102 includes a connection interface 116. The operation-output section 102 connects a bus line 117 with the connection interface 116, and connects the keyboard 11, the touch panel 12 and the customer display 13 with the bus line 117 through an input/output circuit (not shown) to realize the functions of the operation-output section 102. The touch panel 12 includes a panel type display 12a and a touch panel sensor 12b overlaid on the screen of the display 12a. The operation-output section 102 further connects a speech synthesis section 118 with the bus line 117. The speech synthesis section 118 outputs a speech or voice signal to a speaker 17 in response to a command input via the bus line 117. The speaker 17 converts the voice signal into a voice to output it. - The
connection interface 114 of the scanner section 101 and the connection interface 116 of the operation-output section 102 are connected with each other through the communication cable 300. Through the connection, the data signal from the scanner section 101 is sent to the operation-output section 102, and the operation of the operation-output section 102 is controlled. - The
POS terminal 2 also carries a CPU 201 as a main body of the control section. The CPU 201 is connected with a ROM 203, a RAM 204, an auxiliary storage section 205, a communication interface 206 and a connection interface 207 via a bus line 202. In addition, the keyboard 21, the display for operator 22, the display for customer 23, the printer 24 and the drawer 5 are respectively connected with the bus line 202 via an input-output circuit (not shown). - The
communication interface 206 is connected with a store server (not shown) serving as a center of the store via a network such as a LAN (Local Area Network) and the like. Through this connection, the POS terminal 2 can perform a transmission/reception of data with the store server. - The
connection interface 207 is connected with the two connection interfaces 114 and 116 of the scanner apparatus 1 via the communication cable 300. Through the connection, the POS terminal 2 receives information from the scanner section 101 of the scanner apparatus 1. On the other hand, the scanner apparatus 1 accesses the data file stored in the auxiliary storage section 205 of the POS terminal 2. - The
auxiliary storage section 205, which is, for example, an HDD (Hard Disk Drive) device or an SSD (Solid State Drive) device, further stores data files such as a recognition dictionary file 30, a PLU file 40 and the like, in addition to various programs. -
FIG. 3 is a schematic view illustrating the structure of the dictionary data for each commodity stored in the recognition dictionary file 30. As shown in FIG. 3, a plurality of feature amount data are stored in the recognition dictionary file 30 for each recognition target commodity in association with a commodity name and a commodity ID for identifying the commodity. The feature amount data is obtained by extracting, from a reference image obtained by photographing a commodity identified with the corresponding commodity ID, an appearance feature amount serving as the surface information (appearance shape, tint, pattern, concave-convex state and the like) of the commodity, and representing the appearance feature amount in the form of parameters. Feature amount data 1-N obtained when a commodity is observed from various directions are respectively stored for the commodity. The number N of feature amount data for one commodity is not fixed; it differs from commodity to commodity. In addition, the commodity name is not necessarily contained in the dictionary data for each commodity. - The
PLU file 40 stores commodity data such as a unit price serving as the price per unit in association with the commodity ID of each commodity. When the commodity ID of a commodity for sale is specified in the scanner device 1, the CPU 201 reads the unit price associated with the commodity ID from the PLU file 40 and registers the sales data of the commodity for sale with the unit price. Further, it is possible to collectively manage the dictionary data for each commodity stored in the recognition dictionary file 30 and the data for each commodity stored in the PLU file 40 in one data file. - In the store checkout system with such a constitution, the
image capturing section 14 of the scanner device 1 functions as an image capturing module for photographing the commodity held over the reading window 1B by the operator (shop clerk). The touch panel 12 functions as a display module for displaying the image captured by the image capturing module (image capturing section 14). The CPU 111 functions as a detection module, an extraction module, a calculating module, the recognition module, the specification module and the notification module. - The
detection module detects, from an image captured by the image capturing section 14, an object imaged in the image. The extraction module analyzes the image of the object detected from the captured image and extracts the appearance feature amount of the object. The calculating module compares the appearance feature amount of the object extracted from the object image with the feature amount data of each recognition target commodity stored in the recognition dictionary file 30 to calculate, for each recognition target commodity, a similarity degree indicating how similar the appearance feature amount is to the feature amount data. The recognition module recognizes, from the captured image, the commodity imaged in the captured image based on the calculated similarity degree for each recognition target commodity. The specification module specifies the reason in a case in which the commodity cannot be recognized. The notification module notifies the operator of the reason specified by the specification module. - These modules are realized by executing processing according to the commodity recognition program by the
CPU 111 of the scanner device 1. The commodity recognition program is stored in, for example, the ROM 112. As shown in FIG. 4, the scanner device 1 forms a determined commodity memory 41, a confirmation commodity memory 42, a candidate commodity memory 43, a pattern memory 44 and a color memory 45 in the RAM 113 as the memory areas required to execute the processing. - All the
determined commodity memory 41, the confirmation commodity memory 42 and the candidate commodity memory 43 have an area for storing the commodity ID and the later described similarity degree as a pair. Particularly, the determined commodity memory 41 and the confirmation commodity memory 42 store only one pair of commodity ID and similarity degree. On the other hand, the candidate commodity memory 43 can store a plurality of pairs of commodity ID and similarity degree. Specifically, the determined commodity memory 41 stores the similarity degree and the commodity ID of a commodity of which the similarity degree is greater than a first threshold value A. The confirmation commodity memory 42 stores the similarity degree and the commodity ID of a commodity of which the similarity degree is equal to or smaller than the first threshold value A and greater than a second threshold value B smaller than the first threshold value A. The candidate commodity memory 43 stores the similarity degree and the commodity ID of a commodity of which the similarity degree is equal to or smaller than the second threshold value B and greater than a third threshold value C smaller than the second threshold value B. - The
pattern memory 44 stores pattern data obtained by patterning various shapes of the hand of the operator of the scanner device 1 when the operator holds the commodity over the reading window 1B. The hand of the operator is one reason why the commodity cannot be recognized by the recognition module. That is, there is a case in which the hand of the operator shields the commodity when the operator holds the commodity over the reading window 1B, and as a result, the commodity cannot be recognized. Thus, the pattern memory 44 is arranged in the present embodiment to specify such a reason. - The
color memory 45 stores color data set for each reason when the commodity cannot be recognized. The reason why the commodity cannot be recognized further includes a reason that the position of the commodity is too far, or a reason that the position of the commodity is shifted, in addition to the reason that the hand of the operator shields the commodity. That is, there is a case in which the commodity cannot be recognized because the position of the commodity held over the reading window 1B is shifted from the image capturing area. There is a case in which the commodity cannot be recognized because the position of the commodity held over the reading window 1B is too far, and as a result, the resolution of the commodity image is low. Alternatively, there is a case in which the feature amount data of the commodity is not set in the recognition dictionary file 30. In the present embodiment, the differences of these reasons can be recognized according to the differences of the colors of frame borders surrounding the recognition target object. - When the commodity recognition program is started, the
CPU 111 starts the procedure shown in the flowchart in FIG. 5. The procedure of the processing described below, including the procedure shown in FIG. 5, is described merely as an example, and various processing procedures can be used properly to achieve the same result. - First, the
CPU 111 outputs an ON-signal of image capturing to the image capturing section 14 (ACT 1). The image capturing section 14 starts to photograph the image capturing area according to the ON-signal of image capturing. The frame images of the image capturing area captured by the image capturing section 14 are stored in the RAM 113 in sequence. - After outputting the ON-signal of image capturing, the
CPU 111 sequentially acquires the frame images stored in the RAM 113 and displays the images on the display 12a (ACT 2). The CPU 111 detects, from the frame image, the object imaged in the frame image (ACT 3: detection module). For example, the CPU 111 extracts a contour line and the like from the binary image of the frame image. Then the CPU 111 tries to extract the contour of the object imaged in the frame image. After the contour of the object is extracted, the CPU 111 regards that the object is detected from the frame image (ACT 4). - In a case in which the object cannot be detected from the frame image (NO in ACT 4), the
CPU 111 returns to execute the processing in ACT 2. That is, the CPU 111 acquires a next frame image. Then the CPU 111 tries to detect the object from the frame image. - In a case in which the object is detected from the frame image (YES in ACT 4), the
CPU 111 specifies, for example, a minimum rectangular-shaped image area containing the contour of the object as the image area of the object. Then, as shown in FIG. 6 or FIG. 7, the CPU 111 displays a frame border L1 surrounding the image area of the object (ACT 5). The frame border L1 is formed in a rectangular shape in FIG. 6 or FIG. 7; however, the shape of the frame border L1 is not limited to this. The frame border L1 may be formed in a rhombus, circle or ellipse. Further, the frame border L1 is exemplified as a dotted line in FIG. 6 or FIG. 7; however, it is not limited to this, and the frame border L1 may be a solid line, a one-dot chain line and the like. - After displaying the frame border L1, the
CPU 111 extracts the appearance feature amount, such as the shape, the surface tint, the pattern, the concave-convex state and the like of the object, from the image in the area (ACT 6: extraction module). - After the appearance feature amount is extracted, the
CPU 111 accesses the POS terminal 2 through the connection interface 114. Then the CPU 111 sequentially compares the appearance feature amount with the feature amount data of each recognition target commodity stored in the recognition dictionary file 30 to calculate the similarity degree with the appearance feature amount for each recognition target commodity (ACT 7: calculating module). - At this time, in a case in which a similarity degree greater than the first threshold value A is calculated, the
CPU 111 writes the similarity degree and the commodity ID of the commodity into the determined commodity memory 41. In a case in which data is already written in the determined commodity memory 41, the CPU 111 retains the data with the greater similarity degree in the determined commodity memory 41. - In a case in which a similarity degree equal to or smaller than the first threshold value A but greater than the second threshold value B is calculated, the
CPU 111 writes the similarity degree and the commodity ID of the commodity into the confirmation commodity memory 42. In this case, if data is already written in the confirmation commodity memory 42, the CPU 111 retains the data with the greater similarity degree in the confirmation commodity memory 42. - In a case in which a similarity degree equal to or smaller than the second threshold value B but greater than the third threshold value C is calculated, the
CPU 111 writes the similarity degree and the commodity ID of the commodity into the candidate commodity memory 43. At this time, if data is already written in the candidate commodity memory 43, the CPU 111 sorts the data in the candidate commodity memory 43 in the descending order of similarity degree. - The
CPU 111 confirms whether or not a similarity degree greater than the third threshold value C is calculated (ACT 8: recognition module). In a case in which no data is written in the determined commodity memory 41, the confirmation commodity memory 42 or the candidate commodity memory 43, no similarity degree greater than the third threshold value C is calculated. In this case (NO in ACT 8), the CPU 111 specifies the reason why the commodity cannot be recognized (ACT 9: specification module). - Specifically, the
CPU 111 sequentially compares the image in the area surrounded by the frame border L1 with each pattern data set in the pattern memory 44 to calculate a correspondence rate. For example, when pattern data corresponding to the image at a probability as high as 80% is detected, the CPU 111 specifies that the reason why the commodity cannot be recognized is the hand of the operator. - On the contrary, for example, in a case in which pattern data corresponding to the image at a probability as high as 80% is not detected, the
CPU 111 measures the size of the area surrounded by the frame border L1. In a case in which the size is smaller than a preset minimum value, the CPU 111 specifies that the reason why the commodity cannot be recognized relates to the position of the commodity. In a case in which the size of the area surrounded by the frame border L1 is greater than the minimum value, the CPU 111 specifies that the reason is that the data is not set in the recognition dictionary file 30. In ACT 9, the CPU 111 may first determine whether or not the size of the area surrounded by the frame border L1 is smaller than the minimum value, and then execute the processing of comparing with each pattern data in the pattern memory 44 to specify the reason if the size is smaller than the minimum value. - After the reason why the commodity cannot be recognized is specified, the
CPU 111 detects the color data set for the reason by reference to thecolor memory 45. Then theCPU 111 changes the color of the frame border L1 to the color of the color data (ACT 10: notification module). - Sequentially, the
CPU 111 returns toACT 2 to acquire a next frame image. Then theCPU 111 executes theprocessing following ACT 3 again. - In a case in which a similarity degree greater than the third threshold value C is calculated (YES in ACT 8), the
CPU 111 stops the notification processing carried out in ACT 10 (ACT 11). Sequentially, theCPU 111 executes a determination processing (ACT 12). The content of the determination processing varies in a case in which the data is written in thedetermined commodity memory 41, in a case in which the data is written in the confirmation commodity memory 42, and in a case in which the data is written in thecandidate commodity memory 43. - In a case in which the data is written in the
determined commodity memory 41, theCPU 111 determines the commodity specified with the commodity ID stored in thedetermined commodity memory 41 as the commodity held over the readingwindow 1B. Then theCPU 111 generates a voice from the speaker 17 to notify that the commodity is determined. - In a case in which the data is written in the confirmation commodity memory 42, the
CPU 111 displays, on the display 12a, an image of a name button which displays the name of the commodity specified with the commodity ID stored in the confirmation commodity memory 42 in a button frame. Then the CPU 111 waits until the name button is touched. When it is detected that the name button is touched according to the signal detected by the touch panel sensor 12b, the CPU 111 determines the commodity specified with the commodity ID stored in the confirmation commodity memory 42 as the commodity held over the reading window 1B. Then the CPU 111 generates a voice from the speaker 17 to notify that the commodity is determined. Conversely, in a case in which it is detected that a screen area other than the name button is touched, the CPU 111 deletes the image of the name button. At this time, the commodity is not determined. - In a case in which the data is written in the
candidate commodity memory 43, the CPU 111 displays, on the display 12a, the images of name buttons which display the names of the commodities specified with the commodity IDs stored in the candidate commodity memory 43 in button frames. Then the CPU 111 waits until any of the name buttons is touched. When it is detected that one name button is touched according to the signal detected by the touch panel sensor 12b, the CPU 111 determines the commodity corresponding to the name displayed in that name button as the commodity held over the reading window 1B. Then the CPU 111 generates a voice from the speaker 17 to notify that the commodity is determined. Conversely, in a case in which it is detected that an area other than the name buttons is touched, the CPU 111 deletes the images of the name buttons. At this time, the commodity is not determined. - The
CPU 111 determines whether or not the commodity is determined (ACT 13). In a case in which the commodity is not determined (NO in ACT 13), the CPU 111 returns to ACT 2 to acquire the next frame image. Then the CPU 111 executes the processing following ACT 3 again. - In a case in which the commodity is determined (YES in ACT 13), the
CPU 111 outputs the commodity ID of the determined commodity to the POS terminal 2 through the connection interface 115 (ACT 14). - Now assume that the shop clerk holds an object M1 over the reading
window 1B as shown in FIG. 6. In this case, almost the entire object M1 is photographed by the image capturing section 14. In the scanner device 1, the rectangular-shaped frame border L1 indicated by a dotted line is displayed on the touch panel 12 in a manner of surrounding the recognition area of the object. The appearance feature amount of the object M1 is extracted and compared with the feature amount data of each commodity stored in the recognition dictionary file 30. - Herein, it is assumed that the feature amount data of which the similarity degree with the appearance feature amount of the object M1 is greater than the first threshold value A is set in the
recognition dictionary file 30 as the feature amount data of a commodity "banana". In this case, it is determined in the scanner device 1 that the object M1 held over the reading window 1B is the commodity "banana", and the commodity ID of the commodity "banana" is output to the POS terminal 2. As a result, the sales data of the commodity "banana" is registered in the POS terminal 2. - In a case in which the similarity degree between the feature amount data of the commodity "banana" and the appearance feature amount of the object M1 is greater than the second threshold value B but smaller than the first threshold value A, as shown in
FIG. 6 , a button image N1 of the commodity “banana” is displayed on thetouch panel 12 in thescanner device 1. If the shop clerk touches the button image N1, the commodity ID of the commodity “banana” is output to thePOS terminal 2. As a result, the sales data of the commodity “banana” is registered in thePOS terminal 2. - On the other hand, assume that the shop clerk holds an object M2 over the reading
window 1B in a state in which the object M2 is almost entirely shielded by the hand of the shop clerk, as shown in FIG. 7. In this case, the entire back of the hand including the object M2 is recognized as one object in the scanner device 1. Then in the scanner device 1, the rectangular-shaped frame border L1 indicated by a dotted line is displayed on the touch panel 12 in a manner of surrounding the recognition area of the object. The appearance feature amount of the object is extracted and compared with the feature amount data of each commodity stored in the recognition dictionary file 30. - At this time, the feature amount data of which the similarity degree with the appearance feature amount of the object is greater than the third threshold value C is not set in the
recognition dictionary file 30 in most cases. In this case, conventionally, there is no change in the screen of the touch panel 12, and no voice is generated from the speaker, either. That is, the commodity recognition apparatus shows no reaction. Thus, the shop clerk cannot determine whether the commodity is still being recognized or cannot be recognized at all, which makes the shop clerk feel stress. - In contrast, as shown in
FIG. 8 , the line type of the frame border L1 is changed from the dotted line to a solid line in the present embodiment. The color of the frame border L1 is also changed to a color corresponding to the reason why the commodity cannot be recognized. For example, it is assumed in thescanner device 1 that the image in the area surrounded by the frame border L1 is sequentially compared with each pattern data set in thepattern memory 44, and as a result, a pattern data corresponding to the image at a high probability is detected. In this case, it is specified in thescanner device 1 that the hand of the shop clerk is the reason why the commodity cannot be recognized. Then the color of the frame border L1 is changed to a color (for example, red) corresponding to the reason. - In this way, as the color of the frame border L1 is changed to red, the shop clerk is aware that the commodity N1 cannot be recognized because his/her hand shields the commodity N1. At this time, the shop clerk holds the commodity N1 over the reading
window 1B in a manner that the commodity N1 is not shielded by the hand; thus, the shop clerk does not feel stress. - On the other hand, though the commodity is held over the reading
window 1B, there is a case in which the commodity cannot be recognized because the commodity is too far from the reading window 1B and, as a result, the resolution of the commodity image is low. In this case, since the size of the commodity image area is smaller than the preset minimum value, it is specified in the scanner device 1 that the reason why the commodity cannot be recognized is the position of the commodity. Then the color of the frame border L1 is changed to a color (for example, green) corresponding to the reason. In this way, as the color of the frame border L1 is changed to green, the shop clerk is aware that the commodity cannot be recognized because the commodity is too far from the reading window 1B. At this time, the shop clerk only needs to move the commodity closer to the reading window 1B; thus, the shop clerk does not feel stress. - Further, there is a case in which the commodity cannot be recognized because the feature amount data of the commodity is not registered in the
recognition dictionary file 30. In this case, the color of the frame border L1 is changed to, for example, yellow. In this way, as the color of the frame border L1 is changed to yellow, the shop clerk is aware that the commodity cannot be recognized because the feature amount data of the commodity is not registered in the recognition dictionary file 30. In this case, for example, the shop clerk only needs to operate the keyboard 11 of the scanner device 1 to register the commodity through manual input; thus, the shop clerk does not feel stress. - As stated above, in accordance with the present embodiment, the rectangular-shaped frame border L1 is displayed on the
touch panel 12 in a manner of surrounding the recognition area of the object. When the object cannot be recognized, the color of the frame border L1 is changed. As a result, the operator can easily be aware of why the commodity cannot be recognized simply from the change in the color of the frame border L1. There is another advantage that the operator can confirm that the program is not frozen. - The present invention is not limited to the embodiment described above.
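For illustration only, the flow described above (sorting a similarity degree by the three threshold values in ACT 12, specifying the reason in ACT 9, and selecting the frame-border color in ACT 10) can be sketched as follows. The concrete threshold values, the minimum area, and all function names are assumptions made for this sketch, not values taken from the disclosure.

```python
# Illustrative sketch of the recognition flow described above. The
# thresholds (A > B > C), MIN_AREA, and the reason/color tables are
# assumptions for the example, not values from the patent.

A, B, C = 0.90, 0.75, 0.60   # first, second and third threshold values
MIN_AREA = 5000              # preset minimum size of the frame area (pixels)

REASON_COLORS = {
    "hand": "red",            # commodity shielded by the operator's hand
    "position": "green",      # commodity too far from the reading window 1B
    "unregistered": "yellow", # feature data absent from dictionary file 30
}

def classify(similarity):
    """Sort a similarity degree into the memory named in the description."""
    if similarity > A:
        return "determined"    # determined commodity memory 41: auto-confirm
    if similarity > B:
        return "confirmation"  # confirmation commodity memory 42: one button
    if similarity > C:
        return "candidate"     # candidate commodity memory 43: button list
    return None                # not recognized: a reason is specified instead

def specify_reason(frame_area, looks_like_hand):
    """ACT 9: decide why the commodity could not be recognized."""
    if looks_like_hand:        # matched a pattern in pattern memory 44
        return "hand"
    if frame_area < MIN_AREA:  # object area too small, i.e. too far away
        return "position"
    return "unregistered"      # data not set in recognition dictionary 30

def frame_border_color(similarity, frame_area, looks_like_hand):
    """ACT 10: pick the frame-border color shown on the touch panel."""
    if classify(similarity) is not None:
        return None            # recognized: notification is stopped (ACT 11)
    return REASON_COLORS[specify_reason(frame_area, looks_like_hand)]
```

Under these assumptions, a hand-shielded object that stays below all thresholds yields a red frame border, while a small, distant object yields a green one, matching the examples given in the description.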
- For example, it is exemplified in the embodiment described above that not only the color of the frame border L1 but also its line type is changed from the dotted line to the solid line in a case in which the object cannot be recognized; however, the line type may remain the dotted line or the solid line without being changed.
- Further, the reason why the object cannot be recognized may be notified in a display form other than the frame border. For example, as shown in
FIG. 9 , the operator may be notified of the reason by displaying an icon C1 on thedisplay 12. Incidentally,FIG. 9 shows an example of an icon which is displayed when the commodity cannot be recognized because the hand of the operator shields the commodity. For example, in a case in which the commodity cannot be recognized because the commodity is too far from the reading window B, an icon of a mark “?” is displayed. In this way, the operator can easily be aware of the reason why the object cannot be recognized. - As shown in
FIG. 10 , the operator may be notified of the reason by displaying a message D1 indicating the reason. In this case, the message D1 may be given through a voice under the action of thespeech synthesis section 118. Further, the message D1 may be given by both voice and display. - For example, it is also applicable to display the frame border surrounding the image area of the object or change the color of the frame border when the reason is the position of the object, and to display an icon when the reason is the shielding by the hand, and vice versa.
- The present invention is not limited to the commodity recognition apparatus in which the general object recognition technology is applied. For example, the present invention may also be applied to a commodity recognition apparatus which detects a barcode image from an image captured by an image capturing module and decodes the barcode of the barcode image to recognize a commodity.
- Further, the method of recognizing the hand of the operator is not limited to the pattern recognition. For example, it may be recognized that the reason is the hand of the operator in a case in which the ratio of a flesh color area to the image area is greater than a given ratio.
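A minimal sketch of that flesh-color-ratio heuristic follows; the RGB bounds used for "flesh color" and the 0.5 ratio are illustrative assumptions, since the description does not give concrete values.

```python
# Sketch of the flesh-color-ratio heuristic mentioned above. The RGB
# bounds and the default ratio of 0.5 are assumptions for the example.

def is_flesh_color(pixel):
    """Rough RGB test for flesh-colored pixels (illustrative bounds)."""
    r, g, b = pixel
    return r > 95 and g > 40 and b > 20 and r > g and r > b

def hand_is_reason(pixels, given_ratio=0.5):
    """True when the flesh-colored share of the image area exceeds the ratio."""
    flesh = sum(1 for p in pixels if is_flesh_color(p))
    return flesh / len(pixels) > given_ratio
```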
- In addition, the transfer of the commodity recognition apparatus is generally carried out in a state in which the programs such as the commodity recognition program are stored in the ROM. However, the present invention is not limited to this. The commodity recognition program and the like transferred separately from a computer device may be written in a writable storage device of the computer device through an operation of a user and the like. The transfer of the commodity recognition program and the like may be carried out by recording the program in a removable recording medium, or through communication via a network. The form of the recording medium is not limited as long as the recording medium can store programs, like a CD-ROM or a memory card, and is readable by an apparatus. Further, the function realized by an installed or downloaded program can also be realized through cooperation with an OS (Operating System) installed in the apparatus.
- While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the invention. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the invention. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the invention.
Claims (21)
1. (canceled)
2. A commodity recognition apparatus comprising:
an image sensor which has an area image sensor to photograph a commodity and generate a captured image of the commodity;
a display configured to display the image captured by the image sensor;
a memory configured to store computer-readable instructions; and
a central processing unit which processes the captured image according to the computer-readable instructions causing the commodity recognition apparatus to:
recognize the commodity imaged in the captured image;
specify a reason why the commodity cannot be recognized in a case in which the commodity cannot be recognized; and
notify a specified reason by a category of a frame displayed in the display.
3. The commodity recognition apparatus according to claim 2, wherein
the specified reason is notified by difference of colors of the frame.
4. The commodity recognition apparatus according to claim 2, wherein
the specified reason is notified by the frame or an icon.
5. The commodity recognition apparatus according to claim 2, wherein
the central processing unit detects, from the image captured by the image sensor, an object imaged in the image, and
the frame is displayed in a manner of surrounding the object.
6. The commodity recognition apparatus according to claim 5, wherein
the central processing unit measures a size of the frame in a case in which the commodity cannot be recognized, and specifies that the reason why the commodity cannot be recognized is a position of the commodity in a case in which the size is smaller than a preset value.
7. The commodity recognition apparatus according to claim 2, further comprising:
a pattern memory storing pattern data obtained by patterning shapes of a hand of an operator.
8. The commodity recognition apparatus according to claim 7, wherein
the central processing unit detects, from the image captured by the image sensor, an object imaged in the image, and determines whether the reason why the commodity cannot be recognized is the hand of the operator by reference to the detected object and the pattern data stored in the pattern memory in a case in which the commodity cannot be recognized.
9. A commodity recognition method comprising:
generating a captured image of a commodity with an image sensor;
displaying the image captured by the image sensor in a display;
recognizing the commodity imaged in the captured image;
specifying a reason why the commodity cannot be recognized in a case in which the commodity cannot be recognized; and
notifying a specified reason by a category of a frame displayed in the display.
10. The commodity recognition method according to claim 9, wherein
notifying the specified reason is carried out by a difference of colors of the frame.
11. The commodity recognition method according to claim 9, wherein
notifying the specified reason is carried out by the frame or an icon.
12. The commodity recognition method according to claim 9, further comprising:
detecting, from the image captured by the image sensor, an object imaged in the image; and
displaying the frame in a manner of surrounding the object.
13. The commodity recognition method according to claim 12, further comprising:
measuring a size of the frame in a case in which the commodity cannot be recognized; and
specifying that the reason why the commodity cannot be recognized is a position of the commodity in a case in which the size is smaller than a preset value.
14. The commodity recognition method according to claim 9, further comprising:
storing pattern data obtained by patterning shapes of a hand of an operator in a pattern memory;
detecting, from the image captured by the image sensor, an object imaged in the image; and
determining whether the reason why the commodity cannot be recognized is the hand of the operator by reference to the detected object and the pattern data stored in the pattern memory in a case in which the commodity cannot be recognized.
15. A settlement system comprising:
the commodity recognition apparatus of claim 2; and
a POS terminal for registering sales data of the commodity recognized by the commodity recognition apparatus.
16. The settlement system of claim 15, wherein
the central processing unit of the commodity recognition apparatus outputs a commodity ID of the commodity to the POS terminal in a case in which the commodity can be recognized from the image captured by the image sensor.
17. The settlement system of claim 15, wherein
the commodity recognition apparatus notifies of the specified reason by difference of colors of the frame.
18. The settlement system of claim 15, wherein
the commodity recognition apparatus notifies of the specified reason by the frame or an icon.
19. The settlement system of claim 15, wherein
the central processing unit detects, from the image captured by the image sensor, an object imaged in the image, and
the frame is displayed in a manner of surrounding the object.
20. The settlement system of claim 19, wherein
the central processing unit measures a size of the frame in a case in which the commodity cannot be recognized, and specifies that the reason why the commodity cannot be recognized is a position of the commodity if the size is smaller than a preset value.
21. The settlement system of claim 15, wherein
the commodity recognition apparatus further comprises a pattern memory storing pattern data obtained by patterning shapes of a hand of an operator, and
the central processing unit detects, from the image captured by the image sensor, an object imaged in the image, and determines whether the reason why the commodity cannot be recognized is the hand of the operator by reference to the detected object and the pattern data stored in the pattern memory in a case where the commodity cannot be recognized.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/398,875 US20170116491A1 (en) | 2013-11-20 | 2017-01-05 | Commodity recognition apparatus and commodity recognition method |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013240021A JP2015099549A (en) | 2013-11-20 | 2013-11-20 | Article-of-commerce recognition device and article-of-commerce recognition program |
JP2013-240021 | 2013-11-20 | ||
US14/537,963 US9569665B2 (en) | 2013-11-20 | 2014-11-11 | Commodity recognition apparatus |
US15/398,875 US20170116491A1 (en) | 2013-11-20 | 2017-01-05 | Commodity recognition apparatus and commodity recognition method |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/537,963 Continuation US9569665B2 (en) | 2013-11-20 | 2014-11-11 | Commodity recognition apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170116491A1 true US20170116491A1 (en) | 2017-04-27 |
Family
ID=53173348
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/537,963 Active US9569665B2 (en) | 2013-11-20 | 2014-11-11 | Commodity recognition apparatus |
US15/398,875 Abandoned US20170116491A1 (en) | 2013-11-20 | 2017-01-05 | Commodity recognition apparatus and commodity recognition method |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/537,963 Active US9569665B2 (en) | 2013-11-20 | 2014-11-11 | Commodity recognition apparatus |
Country Status (2)
Country | Link |
---|---|
US (2) | US9569665B2 (en) |
JP (1) | JP2015099549A (en) |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015099549A (en) * | 2013-11-20 | 2015-05-28 | 東芝テック株式会社 | Article-of-commerce recognition device and article-of-commerce recognition program |
JP6274097B2 (en) * | 2014-12-17 | 2018-02-07 | カシオ計算機株式会社 | Product identification device and product recognition navigation method |
US10049273B2 (en) * | 2015-02-24 | 2018-08-14 | Kabushiki Kaisha Toshiba | Image recognition apparatus, image recognition system, and image recognition method |
JP6341124B2 (en) * | 2015-03-16 | 2018-06-13 | カシオ計算機株式会社 | Object recognition device and recognition result presentation method |
JP6747870B2 (en) * | 2016-05-23 | 2020-08-26 | 東芝テック株式会社 | Checkout system |
JP2017211880A (en) * | 2016-05-26 | 2017-11-30 | 東芝テック株式会社 | Information processing apparatus and program |
JP2018032332A (en) * | 2016-08-26 | 2018-03-01 | 東芝テック株式会社 | Information processor and program |
JP6330115B1 (en) * | 2018-01-29 | 2018-05-23 | 大黒天物産株式会社 | Product management server, automatic cash register system, product management program, and product management method |
US20210342876A1 (en) * | 2018-05-09 | 2021-11-04 | Nec Corporation | Registration system, registration method, and non-transitory storage medium |
US11595632B2 (en) * | 2019-12-20 | 2023-02-28 | Samsara Networks Inc. | Camera configuration system |
JP7451320B2 (en) * | 2020-06-18 | 2024-03-18 | 京セラ株式会社 | Information processing system, information processing device, and information processing method |
JP7360997B2 (en) * | 2020-06-18 | 2023-10-13 | 京セラ株式会社 | Information processing system, information processing device, and information processing method |
CN116308327B (en) * | 2022-12-28 | 2023-10-27 | 深圳市销邦数据技术有限公司 | Self-service cashing system and method based on RFID technology |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6236736B1 (en) * | 1997-02-07 | 2001-05-22 | Ncr Corporation | Method and apparatus for detecting movement patterns at a self-service checkout terminal |
US20130208122A1 (en) * | 2012-01-30 | 2013-08-15 | Toshiba Tec Kabushiki Kaisha | Commodity reading apparatus and commodity reading method |
US20130223680A1 (en) * | 2012-02-24 | 2013-08-29 | Toshiba Tec Kabushiki Kaisha | Recognition system, recognition method and computer readable medium |
US20160132855A1 (en) * | 2014-11-06 | 2016-05-12 | Toshiba Tec Kabushiki Kaisha | Commodity sales data processing apparatus, reading apparatus and method by the same |
US9569665B2 (en) * | 2013-11-20 | 2017-02-14 | Toshiba Tec Kabushiki Kaisha | Commodity recognition apparatus |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1965344B1 (en) * | 2007-02-27 | 2017-06-28 | Accenture Global Services Limited | Remote object recognition |
US8876001B2 (en) * | 2007-08-07 | 2014-11-04 | Ncr Corporation | Methods and apparatus for image recognition in checkout verification |
JP4460611B2 (en) * | 2008-01-31 | 2010-05-12 | 東芝テック株式会社 | Product registration system and method |
JP5138667B2 (en) * | 2009-12-22 | 2013-02-06 | 東芝テック株式会社 | Self-checkout terminal |
US20120099756A1 (en) * | 2010-10-20 | 2012-04-26 | Faiz Feisal Sherman | Product Identification |
JP5194160B1 (en) | 2011-10-19 | 2013-05-08 | 東芝テック株式会社 | Information processing apparatus, information processing method, and program |
JP5619095B2 (en) * | 2012-09-03 | 2014-11-05 | 東芝テック株式会社 | Product recognition apparatus and product recognition program |
US9239943B2 (en) * | 2014-05-29 | 2016-01-19 | Datalogic ADC, Inc. | Object recognition for exception handling in automatic machine-readable symbol reader systems |
- 2013-11-20: JP application JP2013240021A published as JP2015099549A (status: pending)
- 2014-11-11: US application US14/537,963 issued as US9569665B2 (status: active)
- 2017-01-05: US application US15/398,875 published as US20170116491A1 (status: abandoned)
Also Published As
Publication number | Publication date |
---|---|
JP2015099549A (en) | 2015-05-28 |
US20150139493A1 (en) | 2015-05-21 |
US9569665B2 (en) | 2017-02-14 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: TOSHIBA TEC KABUSHIKI KAISHA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TAKENO, YUISHI;REEL/FRAME:040858/0481 Effective date: 20141110 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |