US20150016672A1 - Commodity recognition apparatus and commodity recognition method - Google Patents

Commodity recognition apparatus and commodity recognition method

Info

Publication number
US20150016672A1
US20150016672A1 (application US14/302,788)
Authority
US
United States
Prior art keywords
commodity
frame
image
candidate
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/302,788
Other languages
English (en)
Inventor
Youji Tsunoda
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba TEC Corp
Original Assignee
Toshiba TEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba TEC Corp filed Critical Toshiba TEC Corp
Assigned to TOSHIBA TEC KABUSHIKI KAISHA reassignment TOSHIBA TEC KABUSHIKI KAISHA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: TSUNODA, YOUJI
Publication of US20150016672A1 publication Critical patent/US20150016672A1/en
Priority to US15/201,650 priority Critical patent/US10061490B2/en
Priority to US16/043,299 priority patent/US20180329614A1/en
Priority to US16/999,116 priority patent/US20200379630A1/en
Current legal status: Abandoned

Classifications

    • G06K9/6202
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/60 - Type of objects
    • G06V20/68 - Food, e.g. fruit or vegetables
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 - Selection of displayed objects or displayed text elements
    • G06K9/60
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 - 2D [Two Dimensional] image generation
    • G06T11/60 - Editing figures and text; Combining figures or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/52 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/63 - Control of cameras or camera modules by using electronic viewfinders
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 - Indexing scheme for image data processing or generation, in general
    • G06T2200/24 - Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]

Definitions

  • Embodiments described herein relate to a commodity recognition apparatus used in a dining facility such as a cafeteria and a commodity recognition method performed by the commodity recognition apparatus.
  • a dining facility such as a cafeteria
  • prepared cooked food is picked up and placed on a tray, either by the user himself or herself or by a staff member working in the cafeteria, and checkout processing is carried out after all the desired cooked food items have been taken.
  • a cashier registers the cooked food items placed on the tray one by one on a cash register to carry out the checkout processing.
  • the checkout processing takes a relatively long time because each cooked food item must be recognized, and thus various proposals have been made to solve this problem.
  • One of the conventionally proposed recognition apparatuses recognizes the container such as a dish on which the commodity (cooked food) is placed.
  • the container is limited to one that can be recognized by the recognition apparatus. The freedom to select a container is therefore limited, and such commodity recognition is hard to use.
  • FIG. 1 is an external view illustrating a commodity recognition apparatus according to one embodiment
  • FIG. 2 is a block diagram illustrating the constitution of the main portions of the same commodity recognition apparatus shown in FIG. 1 ;
  • FIG. 3 is a schematic view illustrating the data structure of a recognition dictionary file
  • FIG. 4 is a functional block diagram illustrating the same commodity recognition apparatus
  • FIG. 5 is a schematic view illustrating a main memory area formed in a RAM of the same commodity recognition apparatus
  • FIG. 6 is a flowchart illustrating the main portions of an information processing procedure executed by a CPU of the same commodity recognition apparatus according to a commodity recognition program
  • FIG. 7 is a flowchart specifically illustrating the procedure of image recognition processing shown in FIG. 6 ;
  • FIG. 8 is a flowchart illustrating an information processing procedure carried out after YES is taken in determination block ACT 13 shown in FIG. 6 ;
  • FIG. 9 is a flowchart illustrating an information processing procedure carried out after YES is taken in determination block ACT 14 shown in FIG. 6 ;
  • FIG. 10 is a flowchart illustrating an information processing procedure carried out after YES is taken in determination block ACT 7 shown in FIG. 6 ;
  • FIG. 11 is a schematic view illustrating an example of a screen displayed on a panel display module of the commodity recognition apparatus
  • FIG. 12 is a schematic view illustrating an example of a screen displayed after a cooked food item is touched through the screen shown in FIG. 11 ;
  • FIG. 13 is a schematic view illustrating an example of a screen displayed after the cooked food item is recognized from the screen shown in FIG. 12 ;
  • FIG. 14 is a schematic view illustrating an example of a screen displayed after other cooked food item is touched through the screen shown in FIG. 13 ;
  • FIG. 15 is a schematic view illustrating an example of a screen displayed after the cooked food item is recognized from the screen shown in FIG. 14 ;
  • FIG. 16 is a schematic view illustrating an example of a screen displayed after an area inside a frame is touched through the screen shown in FIG. 15 ;
  • FIG. 17 is a schematic view illustrating an example of a screen displayed after the cooked food item is recognized from the screen shown in FIG. 16 .
  • a commodity recognition apparatus comprises an image capturing module, an image display module, an input receiving module, a frame display module, a recognition module, a first output module, a change receiving module and a second output module.
  • the image capturing module photographs an image capturing area including a commodity to capture an image thereof.
  • the image display module displays the image captured by the image capturing module on a display module.
  • the input receiving module receives a selection input on any position on the image displayed on the display module.
  • the frame display module displays, on the image displayed on the display module, a frame surrounding the commodity for which the selection input is received by the input receiving module.
  • the recognition module recognizes a candidate of the commodity imaged in the frame according to the feature amount of the image in the area surrounded by the frame.
  • the first output module outputs information of a best candidate commodity among the candidates of the commodity recognized by the recognition module.
  • the change receiving module receives a change instruction for the candidate.
  • the second output module outputs information of a commodity other than the best candidate, selected from the candidates, if the change instruction for the candidate is received.
  • the commodity recognition apparatus according to one embodiment is described below.
  • the commodity recognition apparatus is arranged in a dining facility such as a cafeteria for staff or a cafeteria for students.
  • the commodity recognition apparatus adopts an object recognition technology.
  • the object recognition is a technology in which a target object is photographed by a camera to capture an image thereof, and the category and the like of the object are then recognized from the captured image data.
  • a computer extracts the appearance feature amount of the object contained in the image from the image data. Then, the computer compares the extracted appearance feature amount with the feature amount data of a reference image previously registered in a recognition dictionary file to calculate a similarity degree, and recognizes the category and the like of the object based on the similarity degree.
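  • As one concrete illustration of this comparison step, a feature vector extracted from the image can be scored against each registered reference vector with a similarity measure. The function name and the choice of cosine similarity below are illustrative assumptions, not details taken from the patent:

```python
import math

def similarity_degree(features, reference):
    # Cosine similarity between an extracted appearance feature vector and
    # the feature amount data of a reference image (one possible measure).
    dot = sum(f * r for f, r in zip(features, reference))
    norm = (math.sqrt(sum(f * f for f in features))
            * math.sqrt(sum(r * r for r in reference)))
    return dot / norm if norm else 0.0
```

A higher similarity degree indicates that the photographed object more closely resembles the registered reference image.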
  • a technology for recognizing an object contained in an image is disclosed in the following document.
  • FIG. 1 is an external view illustrating a commodity recognition apparatus 1 according to the present embodiment.
  • the commodity recognition apparatus 1 arranged on a table surface 2 A of a checkout counter 2 includes a plate-shaped body part 11 A and an eaves part 11 B.
  • the body part 11 A is vertically arranged along the edge of the rear side of the table surface 2 A in such a manner that the front side thereof faces the near side of a user to provide an area 2 B of the table surface 2 A between the front side and the user, as shown in FIG. 1 .
  • the eaves part 11 B protrudes from the upper end part of the front side of the body part 11 A in a direction substantially orthogonal to the front side.
  • the underside of the eaves part 11 B faces to the area of the table surface 2 A.
  • An image capturing module 12 is arranged inside the eaves part 11 B. Further, a reading window (not shown) is formed at the underside of the eaves part 11 B.
  • the image capturing module 12 comprises a CCD (Charge Coupled Device) image capturing element serving as an area image sensor and a drive circuit thereof, and an image capturing lens used for focusing the image of an image capturing area on the CCD image capturing element.
  • the image capturing area refers to an area of an image focused on an area of the CCD image capturing element through the image capturing lens from the reading window.
  • the image capturing module 12 outputs the image of the image capturing area focused on the CCD image capturing element through the image capturing lens.
  • the image capturing module 12 is not limited to a module which uses the CCD image capturing element.
  • the image capturing module 12 may be a module which uses a CMOS (complementary metal oxide semiconductor) image sensor.
  • the commodity recognition apparatus 1 further comprises an operation panel 13 and an IC card reader/writer (hereinafter referred to as a card RW) 18 .
  • the operation panel 13 is arranged at the top of the body part 11 A.
  • the operation panel 13 takes the front side of the body part 11 A as an operation surface.
  • the operation panel 13 includes a keyboard 14 , a touch panel 15 and a receipt issuing port 16 on the operation surface.
  • the receipt issuing port 16 issues a receipt printed by a receipt printer 17 arranged inside the operation panel 13 .
  • the card RW 18 is arranged at one side of the operation panel 13 .
  • the card RW 18 carries out data writing to and data reading from an IC card.
  • the IC card stores a unique user ID and electronic money data to realize a user ID function and an electronic money function.
  • a user (a staff member, student or the like) using the dining facility holds his or her own IC card.
  • the area 2 B on the table surface 2 A at the near side of the user serves as a space for putting a container 3 in which cooked food (commodity) is placed.
  • the area 2 B is included in the image capturing area of the image capturing module 12 .
  • the image capturing module 12 photographs the container 3 , on which cooked food is placed, in the area 2 B (image capturing area).
  • No specific limitation is given to the shape or the number of the containers 3 , although an oval tray is shown as one example of the container 3 in FIG. 1 . In short, any dish or tray can be used as long as the user can carry the dish or tray holding the prepared cooked food to the checkout counter 2 and put it in the area 2 B.
  • FIG. 2 is a block diagram illustrating the constitution of the main portions of the commodity recognition apparatus 1 .
  • the commodity recognition apparatus 1 comprises a CPU (Central Processing Unit) 21 , a ROM (Read Only Memory) 22 , a RAM (Random Access Memory) 23 and an auxiliary storage device 24 .
  • the commodity recognition apparatus 1 connects the CPU 21 with the ROM 22 , the RAM 23 and the auxiliary storage device 24 through a bus line 25 such as an address bus, a data bus and the like.
  • the CPU 21 is a central part of a computer.
  • the CPU 21 controls each module to achieve various functions of the commodity recognition apparatus 1 according to an operating system or an application program.
  • the ROM 22 is a main storage part of the computer.
  • the ROM 22 stores the operating system and the application program mentioned above. As occasion demands, the ROM 22 also stores data required to execute various processing by the CPU 21 .
  • the RAM 23 is also a main storage part of the computer mentioned above.
  • the RAM 23 stores data required to execute various processing by the CPU 21 as needed. Further, the RAM 23 is also used as a work area for the CPU 21 when various processing is executed.
  • the auxiliary storage device 24 is an auxiliary storage part of the computer.
  • the auxiliary storage device 24 is, for example, an EEPROM (electrically erasable programmable read-only memory), a hard disk drive, an SSD (solid state drive) or the like.
  • the auxiliary storage device 24 stores data used by the CPU 21 to carry out various processing and data generated in the processing carried out by the CPU 21 .
  • the auxiliary storage device 24 also stores the application programs mentioned above.
  • the commodity recognition apparatus 1 connects a communication interface 26 with the bus line 25 .
  • the commodity recognition apparatus 1 accesses a database server 41 through the communication interface 26 .
  • the database server 41 is provided with a user database 40 which stores personal information such as the sex, age and information relating to a personal usage history of the dining facility in association with the user ID of each user.
  • the commodity recognition apparatus 1 connects the image capturing module 12 , the keyboard 14 , the touch panel 15 , the receipt printer 17 and the card RW 18 with each other through an input/output circuit (not shown).
  • the touch panel 15 includes a panel display module 151 and a touch panel sensor 152 overlaid on the screen of the panel display module 151 .
  • the commodity recognition apparatus 1 with such a constitution further includes a recognition dictionary file 50 which is stored in the auxiliary storage device 24 .
  • the storage of the recognition dictionary file 50 is not limited to the auxiliary storage device 24 .
  • the recognition dictionary file 50 may also be stored in the RAM 23 .
  • the recognition dictionary file 50 may be stored in a storage device of an external machine connected with the commodity recognition apparatus 1 through the communication interface 26 .
  • FIG. 3 is a schematic view illustrating the data structure of the recognition dictionary file 50 .
  • the recognition dictionary file 50 stores recognition dictionary data for each item of the cooked food provided by the dining facility for users.
  • the recognition dictionary data contains items including a menu ID, menu name, price, calorie and appearance feature parameter.
  • the menu ID is a code for individually identifying cooked food items.
  • the menu name is a unique name of the cooked food item identified with the corresponding menu ID.
  • the price is the sales price (Yen) per unit quantity of the cooked food item identified with the corresponding menu ID.
  • the calorie is a parameter derived for the standard quantity of the cooked food item identified with the corresponding menu ID.
  • the appearance feature parameter is obtained by parameterizing the appearance feature such as the standard shape, surface hue, pattern, concave-convex state and the like of the cooked food item identified with the corresponding menu ID.
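  • The record layout described above might be modeled as follows; the field names and sample values are hypothetical, chosen only to mirror the items listed in FIG. 3:

```python
from dataclasses import dataclass, field

@dataclass
class RecognitionDictionaryRecord:
    menu_id: str    # code individually identifying the cooked food item
    menu_name: str  # unique name of the item
    price: int      # sales price (Yen) per unit quantity
    calorie: int    # calories for the standard quantity
    feature_params: list = field(default_factory=list)  # parameterized appearance features

# Hypothetical dictionary contents, for illustration only.
recognition_dictionary = [
    RecognitionDictionaryRecord("M001", "Curry rice", 400, 650, [0.8, 0.2, 0.5]),
    RecognitionDictionaryRecord("M002", "Miso soup", 100, 60, [0.1, 0.7, 0.3]),
]
```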
  • the commodity recognition apparatus 1 includes an acquisition module 61 , an image display module 62 , a frame display module 63 , a recognition module 64 , a first output module 65 , a change receiving module 66 , a second output module 67 , an input receiving module 68 and a determination module 69 .
  • the acquisition module 61 acquires an image captured by the image capturing module 12 .
  • the image display module 62 displays the image acquired by the acquisition module 61 on the touch panel 15 serving as a display module.
  • the frame display module 63 displays frames for individually surrounding commodities (cooked food) at any one or multiple positions on the image displayed on the touch panel 15 .
  • the recognition module 64 recognizes a candidate of a commodity imaged in the frame according to the feature amount of the image in the area surrounded by the frame.
  • the first output module 65 outputs information of the best or highest ranked candidate commodity among the candidates of the commodity recognized by the recognition module 64 .
  • the change receiving module 66 receives an instruction of changing the candidate of the commodity output by the first output module 65 .
  • the second output module 67 outputs information of the commodity other than the best candidate selected from the candidates of the commodity if the change receiving module 66 receives the instruction of changing the candidate of the commodity.
  • the input receiving module 68 receives an instruction input to any position on the image displayed on the touch panel 15 .
  • the determination module 69 determines whether or not the position where the instruction input is received by the input receiving module 68 is inside the frame displayed by the frame display module 63 . Then, if it is determined that the position is outside the frame, the frame display module 63 is operated, and if it is determined that the position is inside the frame, the change receiving module 66 is operated.
  • the commodity recognition apparatus 1 stores the commodity recognition program in the ROM 22 or the auxiliary storage device 24 . Further, as shown in FIG. 5 , the commodity recognition apparatus 1 forms a number of items counter 71 of a count value n, a candidate memory 72 and a detail memory 73 in the RAM 23 .
  • the candidate memory 72 includes an area 721 which stores, in the order of the frame number, menu IDs of the menu items from the first candidate to the fourth candidate recognized in the image recognition processing described later, and an area 722 which stores a candidate number m.
  • the detail memory 73 includes an area 731 which stores, in the order of the frame number, a detail record including a deleting mark coordinate (X0, Y0), the menu ID, the menu name, the price, the calorie, and an area 732 which stores the total price and the total calorie. Each of the items is described later.
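  • A rough sketch of these RAM work areas, using plain Python dictionaries (the variable names and sample contents are illustrative, not from the patent):

```python
# Number of items counter 71 (count value n).
number_of_items = 1

# Candidate memory 72: for each frame number, the menu IDs of the first to
# fourth candidates (area 721) and the candidate number m (area 722).
candidate_memory = {
    1: {"candidates": ["M001", "M004", "M002", "M003"], "m": 1},
}

# Detail memory 73: per-frame detail records (area 731) and totals (area 732).
detail_memory = {
    "records": {
        1: {"mark_xy": (120, 80), "menu_id": "M001",
            "menu_name": "Curry rice", "price": 400, "calorie": 650},
    },
    "total_price": 400,
    "total_calorie": 650,
}
```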
  • the CPU 21 starts the processing of the procedures shown in the flowcharts of FIG. 6 to FIG. 10 .
  • the content of the processing shown in FIG. 6 to FIG. 10 and described later is just an example, and the same result can be achieved by carrying out various other processing appropriately.
  • the CPU 21 waits for the card RW 18 to read the IC card (ACT 1 ). The CPU 21 does not execute the processing in ACT 2 until the IC card is read (NO in ACT 1 ). If the IC card is read (YES in ACT 1 ), the CPU 21 executes the processing in ACT 2 . The CPU 21 resets the number of items counter 71 to “0” (ACT 2 ).
  • the CPU 21 acquires the image (frame image) captured by the image capturing module 12 (ACT 3 : acquisition module 61 ). Then the CPU 21 displays the captured image on the panel display module 151 of the touch panel 15 (ACT 4 : image display module 62 ).
  • FIG. 11 is an example of a screen of the panel display module 151 on which a captured image 81 is displayed. As shown in FIG. 11 , in addition to the captured image 81 , a total amount column 82 , a total calorie column 83 , a “determine” button 84 and a “cancel” button 85 are also displayed on the screen.
  • the CPU 21 confirms whether or not the screen is touched (ACT 5 ). If the screen is not touched (NO in ACT 5 ), the CPU 21 checks the content of the number of items counter 71 (ACT 6 ). If the number of items counter 71 is reset to “0” (NO in ACT 6 ), the CPU 21 returns to the processing in ACT 3 . The CPU 21 acquires a next captured image from the image capturing module 12 (ACT 3 ), and then displays the image on the panel display module 151 (ACT 4 ).
  • the CPU 21 repeats the acquiring and displaying processing of the captured image until the screen is touched. If the screen is touched (YES in ACT 5 ), the CPU 21 checks which area of the screen is touched (ACT 7 , ACT 8 ). If neither the “determine” button 84 nor the “cancel” button 85 is touched (NO in ACT 7 ), and the area of the captured image 81 is not touched (NO in ACT 8 ), the CPU 21 returns to the processing in ACT 5 because the touch input is ignored.
  • the CPU 21 detects the coordinate (X, Y) of the touched position (ACT 9 : input receiving module 68 ).
  • the coordinate (X, Y) is defined by taking, for example, the lower-left corner of the area of the captured image 81 as the origin (0, 0) of a two-dimensional coordinate system, the rightward direction from the origin (0, 0) as the X direction, and the upward direction as the Y direction. The CPU 21 calculates the distances from the origin (0, 0) to the touched position in the X direction and the Y direction, and converts the distances into the coordinate (X, Y).
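  • Assuming the touch panel reports screen pixels with the origin at the top-left corner and y growing downward (a common convention, not stated in the patent), the conversion just described could look like this; the function and parameter names are assumptions:

```python
def touch_to_image_coords(touch_px, lower_left_px):
    # touch_px: touched position in screen pixels (y grows downward).
    # lower_left_px: screen position of the captured image's lower-left corner.
    # Returns (X, Y) with the lower-left corner as origin and Y growing upward.
    tx, ty = touch_px
    ox, oy = lower_left_px
    return (tx - ox, oy - ty)
```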
  • the CPU 21 checks the number of items counter 71 (ACT 10 ). If the number of items counter 71 is reset to “0” (NO in ACT 10 ), the CPU 21 determines the recognition area of the image, and displays a frame 91 surrounding the recognition area in the captured image 81 . As shown in FIG. 12 , the CPU 21 displays the frame 91 in a rectangular shape of which the vertical length is “a” and the horizontal length is “b” by taking the touched position coordinate (X, Y) as a center thereof (ACT 11 : frame display module 63 ). At this time, the color of the frame 91 is a default color, for example, black.
  • FIG. 7 is a flowchart illustrating a specific procedure of the image recognition processing.
  • the CPU 21 increases the number of items counter 71 by “1” (ACT 21 ). Further, the CPU 21 extracts the appearance feature amount such as the contour shape, surface hue, pattern, concave-convex state and the like from the image surrounded by the frame 91 (ACT 22 ). Then the CPU 21 calculates, for the record of each menu item registered in the recognition dictionary file 50 , a similarity degree indicating how much similar the feature amount is to the appearance feature parameter in the record (ACT 23 ).
  • the CPU 21 selects the first to the fourth menu items in the descending order of similarity degree. Then the menu IDs of the first to the fourth menu items are sequentially stored in the area 721 from the first candidate to the fourth candidate of the frame number n (n is the count value of the number of items counter 71 ) in the candidate memory 72 (ACT 24 : recognition module 64 ).
  • the menu ID of the menu item having the highest similarity degree is stored in the area of the first candidate, and the menu ID of the menu item having the second highest similarity degree is stored in the area of the second candidate. Sequentially, the menu ID of the menu item having the third highest similarity degree is stored in the area of the third candidate, and the menu ID of the menu item having the fourth highest similarity degree is stored in the area of the fourth candidate.
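  • The selection of the first to fourth candidates in descending order of similarity degree can be sketched as follows (function name and data shape are illustrative):

```python
def select_candidates(similarity_by_menu_id, max_candidates=4):
    # similarity_by_menu_id: {menu_id: similarity degree calculated in ACT 23}.
    # Returns the menu IDs ranked from the first candidate downward.
    ranked = sorted(similarity_by_menu_id,
                    key=similarity_by_menu_id.get, reverse=True)
    return ranked[:max_candidates]
```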
  • the CPU 21 sets the candidate number m of the frame number n in the candidate memory 72 to “1” (ACT 25 ). Then the CPU 21 acquires information of the menu item (menu ID, menu name, price and calorie) specified with the menu ID of the m-th candidate (m is the candidate number) from the recognition dictionary file 50 (ACT 26 ). Further, the CPU 21 calculates the coordinate of the upper-right corner of the frame 91 as the deleting mark coordinate (X0, Y0) (ACT 27 ).
  • the CPU 21 sets the deleting mark coordinate (X0, Y0) and the menu item information in the detail record of the frame number n of the detail memory 73 (ACT 28 ).
  • the CPU 21 displays a deleting mark 92 at the position of the deleting mark coordinate (X0, Y0).
  • the CPU 21 displays the menu name, the price and the calorie of the menu item information in the frame 91 (ACT 29 : first output module 65 ).
  • the CPU 21 calculates the total amount and the total calorie according to the data of the detail memory 73 , and respectively displays the total amount and the total calorie in the total amount column 82 and the total calorie column 83 of the screen (ACT 30 ). Then, the image recognition processing is ended.
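  • The totals computed here (and recomputed later in ACT 44 and ACT 58 ) reduce to a sum over the detail records; a minimal sketch with illustrative names:

```python
def recalc_totals(detail_records):
    # detail_records: iterable of per-frame records holding price and calorie.
    total_price = sum(r["price"] for r in detail_records)
    total_calorie = sum(r["calorie"] for r in detail_records)
    return total_price, total_calorie
```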
  • after the image recognition processing is ended, the CPU 21 returns to the processing in ACT 5 .
  • the CPU 21 confirms whether or not the screen of the panel display module 151 is touched (ACT 5 ). If the screen is not touched (NO in ACT 5 ), the CPU 21 checks the number of items counter 71 (ACT 6 ).
  • the CPU 21 returns to the processing in ACT 5 .
  • the CPU 21 does not acquire the captured image.
  • the former acquired captured image is maintained as a still image displayed on the panel display module 151 .
  • the CPU 21 checks which area of the screen is touched (ACT 7 , ACT 8 ). If the area of the captured image 81 is touched again (YES in ACT 8 ), the CPU 21 detects the coordinate (X, Y) of the touched position (ACT 9 ). At this time, the number of items counter 71 is increased. Thus, “YES” is taken in the processing in ACT 10 .
  • the CPU 21 determines whether the deleting mark 92 is touched, the area inside the frame 91 of the recognition area is touched, or the area outside the frame 91 is touched (ACT 13 , ACT 14 : determination module 69 ).
  • the CPU 21 compares the touched position coordinate (X, Y) with the deleting mark coordinate (X0, Y0) of each detail record stored in the detail memory 73 . Then, if there exists a deleting mark coordinate (X0, Y0) which is substantially consistent with the touched position coordinate (X, Y), it is determined that the deleting mark 92 is touched. On the other hand, if the touched position coordinate (X, Y) is consistent with any one of the coordinates in the area of the frame 91 , it is determined that the area inside the frame 91 is touched. If the touched position coordinate (X, Y) is not consistent with any one of the coordinates in the area of the frame 91 , it is determined that the area outside the frame 91 is touched.
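  • This three-way decision is essentially a hit test. A sketch under two assumptions not spelled out in the patent: frames are axis-aligned rectangles, and "substantially consistent" means within a small pixel tolerance:

```python
def classify_touch(xy, frames, marks, tolerance=10):
    # frames: {frame_number: (x, y, width, height)} for each displayed frame 91.
    # marks: {frame_number: (x0, y0)} deleting mark coordinates from the detail memory.
    x, y = xy
    for n, (x0, y0) in marks.items():
        if abs(x - x0) <= tolerance and abs(y - y0) <= tolerance:
            return ("deleting_mark", n)   # ACT 13: YES
    for n, (fx, fy, w, h) in frames.items():
        if fx <= x <= fx + w and fy <= y <= fy + h:
            return ("inside_frame", n)    # ACT 14: YES
    return ("outside", None)              # a new frame will be displayed
```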
  • the CPU 21 executes the processing in ACT 11 .
  • the CPU 21 displays a frame 91 a in a rectangular shape of which the vertical length is “a” and the horizontal length is “b” by taking the touched position coordinate (X, Y) as a center thereof (ACT 11 : frame display module 63 ).
  • the CPU 21 executes the image recognition processing for the image in the frame 91 a .
  • the content of the number of items counter (frame number n) is further increased.
  • the menu IDs of the first to the fourth menu items in the descending order of similarity degree are stored in the area 721 from the first candidate to the fourth candidate of the frame number n of the candidate memory 72 . Further, the deleting mark coordinate (X0, Y0) and information of the menu item of the first candidate (having the highest similarity degree) are set in the detail record of the frame number n of the detail memory 73 . Then, as shown in FIG. 15 , the menu name, the price and the calorie of the menu item information are displayed in the frame 91 a . Further, the total amount and the total calorie are updated.
  • the CPU 21 executes the processing shown in the flowchart in FIG. 8 .
  • the CPU 21 detects a frame number P from the detail record containing the deleting mark coordinate (X0, Y0) which is substantially consistent with the touched position coordinate (X, Y) (ACT 41 ).
  • the CPU 21 erases the frame 91 identified with the frame number P from the screen (ACT 42 : erasing module 68 ).
  • the CPU 21 erases the record of the frame number P from the detail memory 73 (ACT 43 ).
  • the CPU 21 recalculates the total amount and the total calorie according to the data of the detail memory 73 and updates the display of the total amount column 82 and the total calorie column 83 (ACT 44 ).
  • the CPU 21 decreases the number of items counter 71 by “1” (ACT 45 ). Then the CPU 21 confirms whether or not the count value n of the number of items counter 71 is “0” (ACT 46 ). If the count value n is not “0” (NO in ACT 46 ), the CPU 21 executes the processing in ACT 5 . If the count value n is “0” (YES in ACT 46 ), the CPU 21 executes the processing in ACT 3 .
  • the CPU 21 does not acquire a new captured image.
  • the captured image displayed on the panel display module 151 is not changed.
  • the CPU 21 starts to acquire a captured image again.
  • the captured image displayed on the panel display module 151 is updated to the newest image.
  • the CPU 21 detects a frame number Q from the detail record containing the deleting mark coordinate (X0, Y0) forming the frame 91 in which the touched position coordinate (X, Y) exists (ACT 51 ). Then the CPU 21 adds “1” to the candidate number m corresponding to the frame number Q of the candidate memory 72 (ACT 52 ).
  • the CPU 21 determines whether or not the candidate number m is greater than the maximum value M (“4” in the present embodiment) of the number of candidates (ACT 53 ). Then if the candidate number m is greater than the maximum value M (YES in ACT 53 ), the CPU 21 resets the candidate number m to “1” (ACT 54 ). If the candidate number m is not greater than the maximum value M (NO in ACT 53 ), the candidate number m is not changed.
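  • The wrap-around in ACT 52 to ACT 54 can be written as a one-step function (M = 4 in the present embodiment; the function name is illustrative):

```python
def next_candidate(m, max_candidates=4):
    # Advance the candidate number m; wrap back to the first candidate
    # when m would exceed the maximum number of candidates M.
    m += 1
    return 1 if m > max_candidates else m
```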
  • the CPU 21 acquires information of the menu item (menu ID, menu name, price and calorie) specified with the menu ID of the m-th candidate (m is the candidate number) from the recognition dictionary file 50 (ACT 55 ). Then the CPU 21 rewrites the menu item information in the detail record of the frame number n of the detail memory 73 to the current menu item information acquired (ACT 56 : second output module 67 ). The CPU 21 changes the menu name, the price and the calorie displayed in the frame 91 of the frame number Q to the current menu item information acquired (ACT 57 ). Then the CPU 21 recalculates the total amount and the total calorie according to the data of the detail memory 73 and updates the values of the total amount column 82 and the total calorie column 83 (ACT 58 ).
  • the CPU 21 changes the color of the frame 91 of the frame number Q to the color corresponding to the m-th candidate (ACT 59 ).
  • at least four different colors are prepared as the color of the frame 91 , and the color of the frame 91 is changed according to the candidate order of the menu item displayed in the frame 91 .
  • the CPU 21 executes the processing in ACT 21 .
  • each time the area inside the frame is touched, the menu item information in the frame is switched to information of the next menu item in the candidate order, starting from the first rank. Then, if the area inside the frame is touched while information of the menu item ranked last in the candidate order is displayed, the menu item information in the frame is switched back to information of the menu item ranked first in the candidate order.
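The candidate cycling performed in ACT 51 to ACT 59 can be sketched as follows. This is an illustrative sketch in Python, not the patented implementation; the function name `next_candidate` and the constant `M` are assumptions introduced here for clarity.

```python
# Illustrative sketch of the candidate cycling: each touch inside a frame
# advances the displayed menu item to the next candidate, wrapping past
# the last candidate back to the first (ACT 52-ACT 54 in the embodiment).
M = 4  # maximum number of candidates held per frame (per the embodiment)

def next_candidate(m, max_candidates=M):
    """Return the candidate number shown after one more touch (1-based)."""
    m += 1
    if m > max_candidates:
        m = 1  # wrap back to the first-ranked candidate
    return m

# Touching the frame repeatedly cycles 1 -> 2 -> 3 -> 4 -> 1 -> ...
sequence = []
m = 1
for _ in range(5):
    m = next_candidate(m)
    sequence.append(m)
print(sequence)  # [2, 3, 4, 1, 2]
```

The wrap-around means a user who overshoots the correct candidate can simply keep touching the frame to come back around to it.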
  • the CPU 21 confirms which one of the “determine” button 84 and the “cancel” button 85 is touched (ACT 61 ). If the “determine” button 84 is touched (“determine” in ACT 61 ), the CPU 21 executes each processing in ACT 62 -ACT 64 .
  • the CPU 21 executes settlement processing with electronic money.
  • the CPU 21 decreases data corresponding to the amount of the total price from the electronic money data of the IC card through the card RW 18 .
  • the CPU 21 edits the receipt data based on the data of the detail memory 73 . Then the CPU 21 outputs the receipt data to the printer 17 and controls the issuing of the receipt.
  • the CPU 21 edits the personal user history information using the data of the detail memory 73 and the user ID of the IC card. Then the CPU 21 sends the edited personal user history information to the database server 41 through the communication interface 26 .
  • the edited personal user history information is stored for each user ID in the user database 40 .
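The "determine" flow of ACT 62 to ACT 64 can be summarized with the sketch below. Every class and method name here (`subtract_emoney`, `print_lines`, `send_history`) is hypothetical, standing in for the card RW 18, the printer 17, and the database server 41 respectively; the detail record fields are likewise assumed.

```python
# Hypothetical sketch of the processing when the "determine" button is
# touched: settle the total with electronic money (ACT 62), issue a
# receipt (ACT 63), and send the personal usage history (ACT 64).
def on_determine(detail_records, card_rw, printer, server, user_id):
    total = sum(r["price"] for r in detail_records)       # total amount
    card_rw.subtract_emoney(total)                        # ACT 62: settlement
    receipt = [f'{r["name"]}  {r["price"]}' for r in detail_records]
    printer.print_lines(receipt + [f"TOTAL {total}"])     # ACT 63: receipt
    server.send_history(user_id, detail_records)          # ACT 64: history
    return total
```

After this, the screen and the candidate/detail memories would be cleared (ACT 65 and ACT 66), readying the apparatus for the next user.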
  • the CPU 21 initializes the screen of the panel display module 151 (ACT 65 ). Further, the CPU 21 clears the candidate memory 72 and the detail memory 73 (ACT 66 ).
  • the CPU 21 does not execute the processing in ACT 62 -ACT 64 described above.
  • the CPU 21 initializes the screen of the panel display module 151 (ACT 65 ). Further, the CPU 21 clears the candidate memory 72 and the detail memory 73 (ACT 66 ). In this way, the processing of the commodity recognition program for one user is ended.
  • the user puts the container 3 in which the cooked food is placed in the area 2 B of the checkout counter 2 . Then the user enables the card RW 18 to read the data of his or her own IC card. In this way, as shown in FIG. 11 , the captured image 81 is displayed on the screen of the touch panel 15 and then, the user touches one by one the images of the cooked food placed on the container 3 .
  • the frame 91 having the touched position coordinate (X, Y) as a center thereof is displayed on the captured image 81 .
  • the recognition processing of the cooked food in the frame 91 is executed, and the menu items ranked from the first to the fourth in the candidate order in the descending order of similarity degree are selected.
  • the name, price and calorie of the menu item ranked at the first in the candidate order are displayed in the frame 91 .
  • the deleting mark 92 is displayed in the frame 91 .
  • the total amount and the total calorie of the recognized cooked food are also displayed.
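The touch-driven recognition sequence above can be sketched as follows, under assumed data structures. The function `recognize_at`, the `similarity` callback, and the default frame dimensions are illustrative conveniences, not the actual dictionary-matching logic of the embodiment.

```python
# Minimal sketch of touch-driven recognition: a frame of height a and
# width b is centered on the touched coordinate, a similarity degree is
# computed against each recognition-dictionary entry, and the four
# highest-scoring menu items become candidates 1 through 4.
def recognize_at(touch_xy, dictionary, similarity, a=100, b=100, top=4):
    x, y = touch_xy
    region = (x - b // 2, y - a // 2, x + b // 2, y + a // 2)  # frame 91
    scored = [(similarity(region, entry), entry) for entry in dictionary]
    scored.sort(key=lambda s: s[0], reverse=True)  # descending similarity
    return [entry for _, entry in scored[:top]]    # candidates 1..4
```

The first element of the returned list corresponds to the menu item whose name, price and calorie are initially displayed in the frame.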
  • the frame 91 a having the touched position coordinate (X, Y) as a center thereof is displayed on the captured image 81 .
  • the recognition processing of the cooked food in the frame 91 a is executed, and the menu items ranked from the first to the fourth in the candidate order in the descending order of similarity degree are selected.
  • the name, price and calorie of the menu item ranked at the first in the candidate order are displayed in the frame 91 a .
  • a deleting mark 92 a is displayed in the frame 91 a .
  • the total amount and the total calorie of the recognized cooked food are also added up.
  • the frame 91 is erased. Then the user touches another position of the cooked food image to execute the recognition processing again. In this way, the frame 91 having the touched position coordinate (X, Y) as a center thereof is displayed on the captured image 81 again.
  • the recognition processing of the cooked food in the frame 91 is executed, and the menu items ranked from the first to the fourth in the candidate order in the descending order of similarity degree are selected. Then the name, price and calorie of the menu item ranked at the first in the candidate order are displayed in the frame 91 .
  • since the area of the recognition image differs from that of the last time due to the different touched position, there is a high possibility that the menu item is correctly recognized from the cooked food image through the new recognition processing.
  • the display content in the frame 91 is switched to information of the next menu item in the descending candidate order, as shown in FIG. 17 . Further, the total amount and the total calorie of the recognized cooked food are updated.
  • even if a menu item ranked at the second or lower in the candidate order is the correct menu item, it is possible to correct the information in the frame 91 to the information of the correct menu item through a simple operation.
  • the user repeats the touching operation described above until all the cooked food placed in the container 3 is correctly recognized. Then, once all the cooked food is recognized, the user touches the "determine" button 84 . In this way, the settlement processing for the cooked food placed in the container 3 is executed with electronic money, and a receipt thereof is issued from the receipt issuing port 16 . Further, the personal user history information is sent from the commodity recognition apparatus 1 to the database server 41 , and the usage history information of the user stored in the user database 40 is updated.
  • the commodity recognition apparatus 1 recognizes the items of the cooked food placed in the container 3 one by one instead of recognizing the container 3 in which the cooked food is placed.
  • a constraint such as a one-to-one correspondence between the container 3 and the cooked food, or the attachment of a special information medium to the container 3 , is not required, and thus, the dining facility such as a cafeteria can be used efficiently.
  • the commodity recognition apparatus 1 not only recognizes the cooked food placed in the container, but also adds up the prices of the cooked food to calculate the total amount. Then the commodity recognition apparatus 1 automatically settles the payment of the recognized cooked food with the electronic money data of the IC card read by the card RW 18 . The commodity recognition apparatus 1 not only recognizes the cooked food but also settles the payment, and thus, the dining facility such as a cafeteria can be used efficiently.
  • the present invention is not limited to the embodiment described above.
  • when the change receiving module 66 receives an instruction of changing the candidate, information of the next menu item ranked below the current one in descending candidate order is displayed.
  • alternatively, information of all the menu items ranked below the current one may be output and displayed for the user to select from, and information of the menu item selected by the user is then output and displayed in the frame 91 .
  • the commodity recognition apparatus 1 adopting the electronic money settlement is exemplified.
  • as the settlement method, for example, credit card settlement or cash settlement can also be applied.
  • the commodity recognition apparatus 1 does not include the settlement function.
  • the commodity recognition apparatus 1 is connected with a settlement terminal such as a POS (Point Of Sales) terminal through a communication line. Then the commodity recognition apparatus 1 outputs information of the recognized cooked food item to the POS terminal.
  • the recognition dictionary file 50 is stored in the auxiliary storage device 24 of the commodity recognition apparatus 1 .
  • the storage location of the recognition dictionary file 50 is not limited to the auxiliary storage device 24 .
  • the recognition dictionary file 50 may be stored in a storage device attached outside the commodity recognition apparatus 1 , and the CPU 21 accesses the storage device as needed to retrieve the data of the recognition dictionary file 50 .
  • the frame 91 is a rectangular shaped frame of which the vertical length is “a” and the horizontal length is “b”.
  • a function for adjusting the size and the shape of the frame 91 may be included in the apparatus. With such a function, the image of the cooked food can be surrounded by a frame correctly, and thus, a high recognition rate can be expected.
  • the menu item is recognized from the image every time the image is touched.
  • the present embodiment is not limited to this.
  • it may be programmed such that the coordinates of the touched positions are stored in sequence, and the menu items are sequentially recognized from the image areas specified by each coordinate when a recognition operation is instructed afterwards. In this way, the image recognition for all the touched cooked food is started after the user has touched a plurality of cooked food items.
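The deferred variant described in this modification might look like the following sketch; the class `DeferredRecognizer` and its method names are hypothetical, introduced only to illustrate queuing touches before a single batch recognition.

```python
# Sketch of the deferred-recognition modification: touched coordinates
# are queued, and recognition runs over all of them only when the user
# issues a recognition instruction.
class DeferredRecognizer:
    def __init__(self, recognize_fn):
        self.recognize_fn = recognize_fn
        self.pending = []             # touched coordinates, in touch order

    def on_touch(self, x, y):
        self.pending.append((x, y))   # store only; no recognition yet

    def on_recognize_instruction(self):
        # Recognize each queued area in the order it was touched.
        results = [self.recognize_fn(xy) for xy in self.pending]
        self.pending.clear()
        return results
```

Compared with recognizing on every touch, this trades immediate feedback for the ability to touch all items quickly and then wait for recognition once.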
  • the calorie is described as an example of the parameter derived for the standard quantity of a commodity.
  • the parameter is not limited to calorie.
  • the parameter may be a nutritional ingredient such as protein, lipid, or calcium.
  • the control program for realizing the function of the invention is pre-recorded in the ROM 22 or the auxiliary storage device 24 serving as program storage module inside the apparatus.
  • the present invention is not limited to this. The same programs may be downloaded to the apparatus from a network. Alternatively, the same programs recorded in a recording medium may be installed in the apparatus. The form of the recording medium is not limited as long as it can store programs, like a CD-ROM or a memory card, and is readable by the apparatus. Further, the function realized by an installed or downloaded program may also be realized through cooperation with an OS (operating system) installed in the apparatus. Further, the program of the present embodiment may be incorporated into a portable information terminal having a communication function, such as a mobile phone or a so-called PDA, to realize the function.
US14/302,788 2013-07-12 2014-06-12 Commodity recognition apparatus and commodity recognition method Abandoned US20150016672A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US15/201,650 US10061490B2 (en) 2013-07-12 2016-07-05 Commodity recognition apparatus and commodity recognition method
US16/043,299 US20180329614A1 (en) 2013-07-12 2018-07-24 Commodity recognition apparatus and commodity recognition method
US16/999,116 US20200379630A1 (en) 2013-07-12 2020-08-21 Commodity recognition apparatus and commodity recognition method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-146821 2013-07-12
JP2013146821A JP5927147B2 (ja) Commodity recognition apparatus and commodity recognition program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/201,650 Division US10061490B2 (en) 2013-07-12 2016-07-05 Commodity recognition apparatus and commodity recognition method

Publications (1)

Publication Number Publication Date
US20150016672A1 true US20150016672A1 (en) 2015-01-15

Family

ID=52277147

Family Applications (4)

Application Number Title Priority Date Filing Date
US14/302,788 Abandoned US20150016672A1 (en) 2013-07-12 2014-06-12 Commodity recognition apparatus and commodity recognition method
US15/201,650 Active US10061490B2 (en) 2013-07-12 2016-07-05 Commodity recognition apparatus and commodity recognition method
US16/043,299 Abandoned US20180329614A1 (en) 2013-07-12 2018-07-24 Commodity recognition apparatus and commodity recognition method
US16/999,116 Abandoned US20200379630A1 (en) 2013-07-12 2020-08-21 Commodity recognition apparatus and commodity recognition method

Family Applications After (3)

Application Number Title Priority Date Filing Date
US15/201,650 Active US10061490B2 (en) 2013-07-12 2016-07-05 Commodity recognition apparatus and commodity recognition method
US16/043,299 Abandoned US20180329614A1 (en) 2013-07-12 2018-07-24 Commodity recognition apparatus and commodity recognition method
US16/999,116 Abandoned US20200379630A1 (en) 2013-07-12 2020-08-21 Commodity recognition apparatus and commodity recognition method

Country Status (2)

Country Link
US (4) US20150016672A1 (ja)
JP (1) JP5927147B2 (ja)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6341124B2 (ja) * 2015-03-16 2018-06-13 Casio Computer Co., Ltd. Object recognition apparatus and recognition result presentation method
JP6886906B2 (ja) * 2017-10-10 2021-06-16 Toshiba Tec Corp Reading device and program
WO2019163096A1 (ja) * 2018-02-23 2019-08-29 NEC Corporation Registration device, registration method, and program
JP7260517B2 (ja) * 2020-09-16 2023-04-18 Yahoo Japan Corp Control program, control method, terminal device, and server device
US11681997B2 (en) * 2021-09-30 2023-06-20 Toshiba Global Commerce Solutions Holdings Corporation Computer vision grouping recognition system

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5546475A (en) * 1994-04-29 1996-08-13 International Business Machines Corporation Produce recognition system
US20120243779A1 (en) * 2011-03-25 2012-09-27 Kabushiki Kaisha Toshiba Recognition device, recognition method, and computer program product
US20130057692A1 (en) * 2011-09-06 2013-03-07 Toshiba Tec Kabushiki Kaisha Store system and method
US20130101168A1 (en) * 2011-10-19 2013-04-25 Toshiba Tec Kabushiki Kaisha Information processing apparatus and information processing method
US20130182106A1 (en) * 2012-01-13 2013-07-18 Brain Co., Ltd. Object Identification Apparatus
US20140023241A1 (en) * 2012-07-23 2014-01-23 Toshiba Tec Kabushiki Kaisha Dictionary registration apparatus and method for adding feature amount data to recognition dictionary
US20140126775A1 (en) * 2012-11-08 2014-05-08 Toshiba Tec Kabushiki Kaisha Commodity recognition apparatus and commodity recognition method
US9036870B2 (en) * 2012-09-03 2015-05-19 Toshiba Tec Kabushiki Kaisha Commodity recognition apparatus and commodity recognition method
US9165202B2 (en) * 2012-02-24 2015-10-20 Toshiba Tec Kabushiki Kaisha Recognition system, recognition method and computer readable medium for calculating feature values of an object image
US9245424B2 (en) * 2010-08-23 2016-01-26 Toshiba Tec Kabushiki Kaisha Store system and sales registration method

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001216571A (ja) 2000-02-03 2001-08-10 Glory Ltd Charge settlement method and apparatus
US7845554B2 (en) * 2000-10-30 2010-12-07 Fujitsu Frontech North America, Inc. Self-checkout method and apparatus
JP2007034570A (ja) * 2005-07-26 2007-02-08 Seiko Epson Corp Motion vector detection device, motion vector detection method, motion vector detection program, and recording medium
JP2007065878A (ja) * 2005-08-30 2007-03-15 Central Res Inst Of Electric Power Ind Method, apparatus, and program for counting moving objects on or in water
JP5544332B2 (ja) 2010-08-23 2014-07-09 Toshiba Tec Corp Store system and program
JP2012252396A (ja) 2011-05-31 2012-12-20 Toshiba Tec Corp Appearance inspection apparatus, appearance inspection program, and appearance inspection system
EP2570967A1 (en) * 2011-09-13 2013-03-20 Nederlandse Organisatie voor toegepast -natuurwetenschappelijk onderzoek TNO Semi-automatic check-out system and method
JP5450560B2 (ja) * 2011-10-19 2014-03-26 Toshiba Tec Corp Commodity data processing apparatus, commodity data processing method, and control program
JP5551140B2 (ja) * 2011-10-19 2014-07-16 Toshiba Tec Corp Information processing apparatus and program
JP5622756B2 (ja) * 2012-01-30 2014-11-12 Toshiba Tec Corp Commodity reading apparatus and commodity reading program


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150271396A1 (en) * 2014-03-24 2015-09-24 Samsung Electronics Co., Ltd. Electronic device and method for image data processing
US9560272B2 (en) * 2014-03-24 2017-01-31 Samsung Electronics Co., Ltd. Electronic device and method for image data processing
EP3121797A1 (en) * 2015-07-22 2017-01-25 Toshiba TEC Kabushiki Kaisha Image recognition system and an image-based search method
US11087302B2 (en) * 2017-07-26 2021-08-10 Jes Labs Installation and method for managing product data
AT520332A4 (de) * 2017-09-14 2019-03-15 The Moonvision Gmbh Device and method for billing dispensed food
AT520332B1 (de) * 2017-09-14 2019-03-15 The Moonvision Gmbh Device and method for billing dispensed food
US20190272648A1 (en) * 2018-03-05 2019-09-05 Toshiba Tec Kabushiki Kaisha Sales data processing apparatus, information processing apparatus, and information processing method
CN109118681A (zh) * 2018-08-22 2019-01-01 明超 Canteen tray dish pricing mechanism
US20220044221A1 (en) * 2019-03-29 2022-02-10 Panasonic Intellectual Property Management Co., Ltd. Clearing and settlement device, and unmanned store system
US20220188796A1 (en) * 2019-03-29 2022-06-16 Panasonic Intellectual Property Management Co., Ltd. Settlement payment device and unmanned store system
CN111640267A (zh) * 2020-04-16 2020-09-08 浙江口碑网络技术有限公司 Self-service settlement method and apparatus, storage medium, and computer device
US20220230514A1 (en) * 2021-01-20 2022-07-21 Nec Platforms, Ltd. Product recognition apparatus, system, and method

Also Published As

Publication number Publication date
US20180329614A1 (en) 2018-11-15
JP5927147B2 (ja) 2016-05-25
US20200379630A1 (en) 2020-12-03
US20160313897A1 (en) 2016-10-27
JP2015018506A (ja) 2015-01-29
US10061490B2 (en) 2018-08-28

Similar Documents

Publication Publication Date Title
US20200379630A1 (en) Commodity recognition apparatus and commodity recognition method
US9269005B2 (en) Commodity recognition apparatus and commodity recognition method
US9990541B2 (en) Commodity recognition apparatus and commodity recognition method
JP6999404B2 (ja) Article recognition apparatus and article recognition method
US9292748B2 (en) Information processing apparatus and information processing method
US20140023241A1 (en) Dictionary registration apparatus and method for adding feature amount data to recognition dictionary
US20150023548A1 (en) Information processing device and program
JP5572651B2 (ja) Commodity reading apparatus and commodity reading program
US9805357B2 (en) Object recognition apparatus and method for managing data used for object recognition
US20140064570A1 (en) Information processing apparatus and information processing method
US20170076698A1 (en) Image recognition system that displays a user-friendly graphical user interface
US9355395B2 (en) POS terminal apparatus and commodity specification method
US20210241356A1 (en) Information processing apparatus, control method, and program
JP6394340B2 (ja) Commodity registration apparatus, commodity registration method, and program
JP7407242B2 (ja) Information processing apparatus and program
JP2021018470A (ja) Article identification apparatus and program
JP6116717B2 (ja) Commodity recognition apparatus and commodity recognition program
JP2016177433A (ja) Commodity registration apparatus and commodity registration method
JP7325138B2 (ja) Information processing apparatus, information processing method, and program
US9092836B2 (en) Commodity selection supporting system and commodity selection supporting method
US20220092573A1 (en) Portable terminal and information processing method for a portable terminal
JP7172060B2 (ja) Information processing apparatus, information processing method, and program
JP2022117795A (ja) Display device and server device
JP2021144289A (ja) Information processing apparatus, information processing system, and information processing method

Legal Events

Date Code Title Description
AS Assignment

Owner name: TOSHIBA TEC KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TSUNODA, YOUJI;REEL/FRAME:033089/0595

Effective date: 20140528

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION