JP6241666B2 - User management device, user management system, and user management method

Info

Publication number: JP6241666B2
Application number: JP2014119296A
Authority: JP (Japan)
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Japanese (ja)
Other versions: JP2015232791A
Inventors: 邦雄 平川, 裕一 中畑
Original Assignee: パナソニックIPマネジメント株式会社 (Panasonic Intellectual Property Management Co., Ltd.)
Priority to JP2014119296A
Publication of JP2015232791A
Application granted
Publication of JP6241666B2


Description

  The present invention relates to a user management device, a user management system, and a user management method for managing the states of users who visit a facility that provides services or articles to users in turn, based on reception numbers.

  At a bank counter, to handle customers' requests in the order they arrive at the store, a ticketing machine issues each customer a reception number tag, customers are called by reception number, and each customer waits in a waiting area until called. If many customers are waiting and a customer is kept waiting for a long time, customer satisfaction falls. Grasping the status of customers who visit the store and devising improvements that eliminate problems such as long waiting times is therefore beneficial both for raising customer satisfaction and for running the store efficiently.

  Several techniques related to managing the status of customers who enter such stores are known. In one, each person entering and leaving is photographed by cameras, persons are matched between the images, and for persons judged to be the same the elapsed time from entry to exit is calculated as the residence time from the shooting times of the entry and exit images (see Patent Document 1). In another, customers entering the store, checking out, and leaving the store are photographed with cameras, face images are extracted from the captured images, and persons are associated by face matching so that information on customer behavior can be collected (see Patent Document 2). In yet another, a mobile terminal carried by a serving clerk displays information such as an image of the customer who received a reception number tag from the tag issuing device, the purpose of the visit entered by the customer on that device, the customer's waiting time calculated from the tag's issue time, and the status of customer-service actions (voice calls) entered by the clerk (see Patent Document 3).

Patent Document 1: JP 07-249138 A
Patent Document 2: JP 2002-041770 A
Patent Document 3: JP 2009-122871 A

  However, the technique of Patent Document 1 can do no more than grasp the waiting time of a customer who waits in the store for goods to be handed over, and the technique of Patent Document 2 can only collect face images of visiting customers and grasp customer behavior coarsely, for example that a customer has visited the store again; neither can grasp the state of visiting customers in sufficient detail.

  The technique of Patent Document 3, on the other hand, can acquire information such as a person image, the purpose of the visit, the waiting time, and the status of customer-service actions (voice calls). However, because it embeds an IC tag in the reception number tag and manages the customer from information read from that IC tag, it presupposes the normal behavior of a window customer: receiving a reception number tag at the ticketing machine, waiting in the waiting area, then being called and completing the procedure at the window. When a customer behaves irregularly, departing from this normal behavior, the state of that customer cannot be grasped properly.

  In other words, the behavior of visiting customers differs with the congestion in the store, the customer's own circumstances, and so on: some customers enter once but leave immediately because the store is crowded, and some step out of the store temporarily to pass the waiting time. Moreover, a money changer is installed in a bank branch, and a customer who uses it leaves the store once the exchange is complete, without ever receiving a reception number tag at the ticketing machine. A technology that can grasp the states of customers who take such anomalous behavior, without omission, is therefore desired.

  The present invention was devised to solve these problems of the prior art, and its main object is to provide a user management device, a user management system, and a user management method that allow the state of every user who has visited the facility to be grasped without omission.

The user management device of the present invention displays person images of users who have visited a facility that provides services or articles to users in turn based on reception numbers, and makes the state of each user viewable. It comprises: a number tag issuing unit that issues a reception number tag to a visiting user; a first image acquisition unit that acquires a plurality of first person images of the user taken at least at some point between entering the facility and receiving the reception number tag; a second image acquisition unit that acquires a plurality of second person images of the user taken at least at some point between being called by reception number and the end of the provision of the service or article; a third image acquisition unit that acquires a plurality of third person images of the user taken at least when leaving the facility; a user information management unit that generates and manages state display information including a person image for each user; and a user information presentation unit that presents the state display information generated by the user information management unit to the user. The user information management unit includes a person verification unit that performs person verification multiple times among the first, second, and third person images, and a user type determination unit that determines the type of each user; it detects the state of the user with respect to the provision of the service or article according to the verification results, determines the type of the user from the detected state, and generates state display information grouped by type.

In addition, the user management system of the present invention displays person images taken at such a facility and makes the state of each user viewable. It comprises cameras installed in the facility, a number tag issuing device that issues a reception number tag to a visiting user, and a plurality of information processing devices. The information processing devices comprise: a first image acquisition unit that acquires a plurality of first person images of the user taken by the cameras at least at some point between entering the facility and receiving the reception number tag; a second image acquisition unit that acquires a plurality of second person images taken at least at some point between being called by reception number and the end of the provision of the service or article; a third image acquisition unit that acquires a plurality of third person images taken at least when leaving the facility; a user information management unit that generates and manages state display information including a person image for each user; and a user information presentation unit that presents that state display information to the user. The user information management unit includes a person verification unit that performs person verification multiple times among the first, second, and third person images, and a user type determination unit that determines the type of each user; it detects the state of the user with respect to the provision of the service or article according to the verification results, determines the type of the user from the detected state, and generates state display information grouped by type.

In addition, the user management method of the present invention has an information processing device perform processing that displays person images taken at a facility that provides services or articles to users in turn based on reception numbers, and makes the state of each user viewable. It comprises: a step of issuing a reception number tag to a visiting user; a step of acquiring a plurality of first person images of the user taken at least at some point between entering the facility and receiving the reception number tag; a step of acquiring a plurality of second person images taken at least at some point between being called by reception number and the end of the provision of the service or article; a step of acquiring a plurality of third person images taken at least when leaving the facility; a step of generating and managing state display information including a person image for each user; and a step of presenting the generated state display information to the user. The generating and managing step includes performing person verification multiple times among the first, second, and third person images and determining the type of each user: the state of the user with respect to the provision of the service or article is detected according to the verification results, the type of the user is determined from the detected state, and state display information grouped by type is generated.

According to the present invention, the state of each user with respect to the provision of services or articles is detected from person images of visiting users according to the results of person verification, the type of each user is determined from the detected state, and state display information grouped by type is generated. The user of the system can therefore grasp the state of every user who has visited the facility, without omission, including users who take anomalous behavior different from normal behavior.

Overall configuration diagram of a user management system according to the first embodiment
Plan view of the store explaining the store layout, the installation of the cameras 1a to 1c, and the state of the customers
Explanatory drawing explaining the outline of the customer management processing performed by the PC 3
Explanatory drawing explaining the outline of the customer management processing performed by the PC 3
Explanatory drawing explaining the outline of the customer management processing performed by the PC 3
Explanatory drawing explaining the outline of the customer management processing performed by the PC 3
Explanatory drawing explaining the outline of the waiting time measurement processing performed by the PC 3
Functional block diagram showing the schematic configuration of the PC 3
Explanatory drawing showing the video display screen and the excluded-person registration screen displayed on the monitor 7
Explanatory drawing explaining the outline of the person verification processing performed in the person verification unit 42
Explanatory drawing showing a form presenting the totalization results output by the printer 8
Explanatory drawing showing a form presenting the totalization results output by the printer 8
Explanatory drawing showing the customer status display screen displayed on the monitor 7
Explanatory drawing showing the principal part of the customer status display screen shown in FIG.
Plan view of the store explaining the store layout, the installation of the cameras 1a to 1c, and the state of the customers in the second embodiment
Explanatory drawing explaining the outline of the customer management processing performed by the PC 3 in the second embodiment

The first invention, made to solve the above-mentioned problems, is a user management device that displays person images of users who have visited a facility that provides services or articles to users in turn based on reception numbers, and makes the state of each user viewable. It comprises: a number tag issuing unit that issues a reception number tag to a visiting user; a first image acquisition unit that acquires a plurality of first person images of the user taken at least at some point between entering the facility and receiving the reception number tag; a second image acquisition unit that acquires a plurality of second person images taken at least at some point between being called by reception number and the end of the provision of the service or article; a third image acquisition unit that acquires a plurality of third person images taken at least when leaving the facility; a user information management unit that generates and manages state display information including a person image for each user; and a user information presentation unit that presents the state display information generated by the user information management unit to the user. The user information management unit includes a person verification unit that performs person verification multiple times among the first, second, and third person images, and a user type determination unit that determines the type of each user; it detects the state of the user with respect to the provision of the service or article according to the verification results, determines the type of the user from the detected state, and generates state display information grouped by type.

According to this, the state of each user with respect to the provision of the service or article is detected from the person images of visiting users according to the results of person verification, the type of each user is determined from the detected state, and state display information grouped by type is generated. The user of the device can therefore grasp the state of every user who has visited the facility, without omission, including users who take anomalous behavior different from normal behavior, such as users who never receive a reception number tag.

  In a second aspect of the present invention, the user information management unit further includes a time measurement unit that measures, for each user, the elapsed time of each user state, based on the user states detected via the person verification unit and the shooting times of the person images.

  According to this, the user of the device can grasp the elapsed time of each user state, and can thereby evaluate the congestion in the facility, the efficiency of the work of providing services or articles, and so on.
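The elapsed-time measurement just described amounts to subtracting the shooting times of the two person images that bound a state. A minimal Python sketch, with an illustrative function name and timestamps (not taken from the patent):

```python
from datetime import datetime

def elapsed_seconds(start_shot: datetime, end_shot: datetime) -> float:
    """Elapsed time of one user state, computed from the shooting
    times of the person images that bound that state."""
    return (end_shot - start_shot).total_seconds()

entered = datetime(2014, 6, 10, 9, 0, 0)    # entry-camera shooting time
exited = datetime(2014, 6, 10, 9, 25, 30)   # exit-camera shooting time
stay = elapsed_seconds(entered, exited)
```

Here `stay` is 1530 seconds, i.e. a 25.5-minute stay in the facility.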

  In a third aspect, the time measurement unit measures, as the elapsed time for each user, the in-facility staying time from entry until the user leaves.

  According to this, the user of the device can grasp the staying time of each user in the facility.

  In a fourth aspect, the time measurement unit measures, as the elapsed time for each user, the work time from the start to the end of the work of providing the service or article.

  According to this, the user of the device can grasp the work time for each user. By totaling the per-user work times for each window handling the business and obtaining a work time per window, the work efficiency of each window can be evaluated.
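The per-window totalization just described can be sketched as follows; the record format, names, and numbers are illustrative assumptions, not the patent's data model:

```python
from collections import defaultdict

def work_time_per_window(records):
    """records: (window_id, work_seconds) pairs, one per served user.
    Totals the per-user work times for each window and returns the
    average work time per window, for comparing window efficiency."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for window_id, seconds in records:
        totals[window_id] += seconds
        counts[window_id] += 1
    return {w: totals[w] / counts[w] for w in totals}

avg = work_time_per_window([("W1", 300), ("W1", 420), ("W2", 600)])
```

With these sample records, window W1 averages 360 seconds per user and W2 averages 600, so W1 would be judged the more efficient window.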

  In a fifth aspect, the user information management unit further includes a totalization processing unit that totals the per-user elapsed times acquired by the time measurement unit and obtains an elapsed time for each predetermined period.

  According to this, the user of the device can grasp the elapsed time for each predetermined period. If the elapsed times of successive periods are displayed side by side in chronological order, the temporal transition of the elapsed time is easy to grasp; if the elapsed times of different dates and times are displayed side by side for comparison, differences in elapsed time between dates are easy to see.
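One way to realize this per-period totalization is to bucket each user's elapsed time by the hour of the shooting time; this sketch assumes an hourly period and illustrative values:

```python
from collections import defaultdict
from datetime import datetime

def average_by_hour(samples):
    """samples: (shooting_time, elapsed_seconds) pairs.
    Buckets elapsed times by hour so successive periods can be
    displayed side by side in chronological order."""
    buckets = defaultdict(list)
    for t, sec in samples:
        hour = t.replace(minute=0, second=0, microsecond=0)
        buckets[hour].append(sec)
    return {h: sum(v) / len(v) for h, v in sorted(buckets.items())}

hourly = average_by_hour([
    (datetime(2014, 6, 10, 9, 5), 600),
    (datetime(2014, 6, 10, 9, 40), 1200),
    (datetime(2014, 6, 10, 10, 15), 300),
])
```

The 09:00 bucket averages the two morning samples; the 10:00 bucket holds the single later one.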

  In a sixth aspect, when the person verification unit detects that a user for whom the provision of the service or article is unconfirmed has left and then detects that the same user has entered again, the user type determination unit determines that the user is a temporary outing person who has temporarily left the facility.

  According to this, the user of the device can identify temporary outing persons.

  In a seventh aspect, the user information management unit further includes a time measurement unit that measures, for each user, the in-facility staying time from entry to exit based on the shooting times of the person images, and the user type determination unit determines the type of the user based on the measurement results of the time measurement unit.

  According to this, types of users that cannot be determined by person verification between person images alone can be determined.

  In an eighth aspect, when the person verification unit detects that a user for whom the provision of the service or article is unconfirmed has left, and the staying time measured by the time measurement unit is within a predetermined time, the user type determination unit determines that the user is an immediate exit person who left immediately after entering without receiving the service or article.

  According to this, the user of the device can identify immediate exit persons.

  In a ninth aspect, when the person verification unit detects that a user for whom the provision of the service or article is unconfirmed has left, and the staying time measured by the time measurement unit exceeds a predetermined time, the user type determination unit determines that the user is a self-service machine user who used a self-service machine that provides the service or article in response to the user's own operation.

  According to this, the user of the device can identify self-service machine users.
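The type-determination rules of the sixth, eighth, and ninth aspects can be combined into one decision function. This is a hedged sketch: the boolean inputs stand in for the person verification results, and the threshold value is illustrative only, not specified by the patent:

```python
def classify_user(served_at_window: bool, reentered: bool,
                  stay_seconds: float, threshold_seconds: float = 180.0) -> str:
    """served_at_window: person verification matched the user in a
    window-camera image (service provision confirmed).
    reentered: the user was seen entering again after an exit.
    stay_seconds: in-facility staying time from entry to exit."""
    if served_at_window:
        return "window customer"
    if reentered:
        return "temporary outing"       # left unserved, then came back
    if stay_seconds <= threshold_seconds:
        return "immediate exit"         # left almost at once, unserved
    return "self-service machine user"  # stayed a while, e.g. money changer

kinds = [
    classify_user(True, False, 900.0),
    classify_user(False, True, 300.0),
    classify_user(False, False, 60.0),
    classify_user(False, False, 600.0),
]
```

The four sample calls produce the four user types in order: window customer, temporary outing, immediate exit, and self-service machine user.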

  In a tenth aspect, the user information management unit further includes a totalization processing unit that totals the per-user types acquired by the user type determination unit and obtains the number of users of each type for each predetermined period.

  According to this, the user of the device can grasp the number of users of each type for each predetermined period. If the per-period counts are displayed in chronological order, the temporal transition of the number of users by type is easy to grasp; if the counts for different dates and times are displayed side by side for comparison, differences between dates are easy to see.
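Counting users by type per period is a straightforward grouped tally; a sketch with an assumed hour-label input format (not from the patent):

```python
from collections import Counter, defaultdict

def counts_by_type_per_hour(visits):
    """visits: (hour_label, user_type) pairs, one per determined user.
    Totals the determined types into per-period counts suitable for
    a chronological status display."""
    per_hour = defaultdict(Counter)
    for hour, kind in visits:
        per_hour[hour][kind] += 1
    return {h: dict(c) for h, c in per_hour.items()}

table = counts_by_type_per_hour([
    ("09:00", "window customer"),
    ("09:00", "immediate exit"),
    ("10:00", "window customer"),
])
```

Displaying the resulting rows in hour order gives exactly the time-series view described above.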

In addition, the eleventh invention is a user management system that displays person images of users who have visited a facility that provides services or articles to users in turn based on reception numbers, and makes the state of each user viewable. It comprises cameras installed in the facility, a number tag issuing device that issues a reception number tag to a visiting user, and a plurality of information processing devices. The information processing devices comprise: a first image acquisition unit that acquires a plurality of first person images of the user taken by the cameras at least at some point between entering the facility and receiving the reception number tag; a second image acquisition unit that acquires a plurality of second person images taken at least at some point between being called by reception number and the end of the provision of the service or article; a third image acquisition unit that acquires a plurality of third person images taken at least when leaving the facility; a user information management unit that generates and manages state display information including a person image for each user; and a user information presentation unit that presents that state display information to the user. The user information management unit includes a person verification unit that performs person verification multiple times among the first, second, and third person images, and a user type determination unit that determines the type of each user; it detects the state of the user with respect to the provision of the service or article according to the verification results, determines the type of the user from the detected state, and generates state display information grouped by type.

  According to this, as in the first aspect, the user of the system can grasp the state of every user who has visited the facility without omission.

In addition, the twelfth invention is a user management method in which an information processing device performs processing that displays person images of users who have visited a facility that provides services or articles to users in turn based on reception numbers, and makes the state of each user viewable. It comprises: a step of issuing a reception number tag to a visiting user; a step of acquiring a plurality of first person images of the user taken at least at some point between entering the facility and receiving the reception number tag; a step of acquiring a plurality of second person images taken at least at some point between being called by reception number and the end of the provision of the service or article; a step of acquiring a plurality of third person images taken at least when leaving the facility; a step of generating and managing state display information including a person image for each user; and a step of presenting the generated state display information to the user. The generating and managing step includes performing person verification multiple times among the first, second, and third person images and determining the type of each user: the state of the user with respect to the provision of the service or article is detected according to the verification results, the type of the user is determined from the detected state, and state display information grouped by type is generated.

  According to this, as in the first aspect, the user of the method can grasp the state of every user who has visited the facility without omission.

  Hereinafter, embodiments of the present invention will be described with reference to the drawings.

(First embodiment)
FIG. 1 is an overall configuration diagram of the customer management system according to the first embodiment. This customer management system (user management system) is built for banks and comprises cameras 1a to 1c, a recorder (video storage device) 2, and a PC (user management device) 3.

  The cameras 1a to 1c are installed at appropriate places in the store; they photograph the inside of the store, and the resulting video is stored in the recorder 2.

  Connected to the PC 3 are an input device 6 such as a mouse with which a user such as an administrator performs various input operations, a monitor (display device) 7 that displays a monitoring screen, and a printer 8. The PC 3 is installed at an appropriate place in the store; on the monitoring screen displayed on the monitor 7, the user can view the in-store video captured by the cameras 1a to 1c in real time and can browse past video recorded in the recorder 2.

  The cameras 1a to 1c, the recorder 2, and the PC 3 are installed in each of a plurality of stores (facilities), and a PC 11 is installed at the headquarters that oversees the stores. On the PC 11, the in-store video captured by the cameras 1a to 1c can be viewed in real time and past in-store video recorded in the recorder 2 can be browsed, so the situation in each store can be confirmed at the headquarters.

  The PC 3 installed in the store is configured as a user management device that manages the states of customers (users) who have visited the store. The customer information (user information) generated by the PC 3 can be browsed on the PC 3 itself and is also transmitted to the PC 11 at the headquarters, where it can likewise be browsed; the PCs 3 and 11 thus serve as browsing devices for customer information.

  Next, the layout of the store shown in FIG. 1, the installation of the cameras 1a to 1c, and the customer states will be described. FIG. 2 is a plan view of the store explaining the store layout, the installation of the cameras 1a to 1c, and the customer states.

  The store has an entrance, a lobby, windows, and a backyard. Customers enter and exit through the entrance, and a customer who enters stays in the lobby. The lobby contains a waiting area, and a customer using a window waits there until called. A plurality of windows are provided; at each window a store clerk (teller) receives the customer's request and performs the necessary window work. In the backyard, store clerks (operators) perform logistical support work that backs up the window work.

  A ticketing machine 21 and a money changer 22 are installed in the lobby. The ticketing machine 21 issues reception number tags for calling customers who use the windows in the order in which they arrived at the store. The money changer 22 is operated by the customer to exchange money.

  A call display panel 23 and an operation terminal 24 are installed at the window. The call display panel 23 displays a reception number to call a customer. The operation terminal 24 is operated by the store clerk (teller) and has operation buttons (not shown). When the business with one customer is finished and the call button is operated, the next reception number is displayed on the call display panel 23. When the called customer appears at the window, the reception button is operated; if the called customer does not appear, the absent button is operated, whereupon the customer with the next reception number is called.
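The clerk-side button flow just described can be modeled as a small state machine. The class and method names below are illustrative assumptions for this sketch, not taken from the patent:

```python
class WindowTerminal:
    """Minimal sketch of the operation terminal 24 flow: the call
    button shows the next reception number on the call display panel,
    the reception button confirms the called customer appeared, and
    the absent button skips to the next number."""

    def __init__(self, waiting_numbers):
        self.waiting = list(waiting_numbers)  # reception numbers in issue order
        self.displayed = None                 # number shown on the call display panel

    def press_call(self):
        """Business with the previous customer is finished: call the next number."""
        self.displayed = self.waiting.pop(0) if self.waiting else None
        return self.displayed

    def press_reception(self):
        """The called customer appeared at the window."""
        served, self.displayed = self.displayed, None
        return served

    def press_absent(self):
        """The called customer did not appear: call the next number instead."""
        return self.press_call()

term = WindowTerminal([101, 102, 103])
term.press_call()                 # panel shows 101
first = term.press_reception()    # 101 is served
term.press_call()                 # panel shows 102
second = term.press_absent()      # 102 absent, panel moves on to 103
```

In this run, customer 101 is served, customer 102 is skipped as absent, and 103 becomes the currently called number.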

  In addition, a plurality of cameras 1a to 1c that photograph visiting customers are installed in the store. In this embodiment, an entry camera 1a that photographs customers entering through the entrance, a window camera 1b that photographs customers carrying out procedures at the window, and an exit camera 1c that photographs customers leaving through the entrance are installed. These cameras 1a to 1c are installed so as to photograph the customer's face from the front.

  In the present embodiment, customers enter and exit the store through a single entrance, but the entrance and the exit may instead be provided separately. In that case, the entry camera 1a is installed at the entrance and the exit camera 1c at the exit.

  A customer who visits the store behaves in various ways depending on the purpose of the visit, the congestion in the store, the customer's own convenience, and so on (see the route of customer C shown by the dotted line in FIG. 2). A customer who uses a window (window customer) and enters through the entrance first receives a reception number tag at the ticketing machine and waits in the waiting area until called. When called, the customer makes a request for business involving various procedures at the window and leaves through the entrance when the window work ends. If the logistical support work in the backyard takes time, the customer may temporarily leave the window after making the request, wait again in the waiting area for another call, and leave through the entrance once the window work ends.

  Further, a customer who uses the money changer 22 (money changer customer) and enters through the entrance goes to the money changer 22 without receiving a reception number tag at the ticketing machine 21, and leaves through the entrance once the exchange at the money changer 22 is complete.

  In addition, when the store is crowded, in particular when many customers are waiting for the windows, some customers who came intending to use a window leave immediately after entering (immediate exit customers), and some temporarily step out of the store to pass the waiting time outside (temporary going-out customers).

  Next, an overview of the customer management processing performed by the PC 3 shown in FIG. 1 will be described. FIGS. 3, 4, 5, and 6 are explanatory diagrams of that processing: FIG. 3 shows the case of a window customer, FIG. 4 a money changer customer, FIG. 5 an immediate exit customer, and FIG. 6 a temporary going-out customer.

  As shown in FIG. 3A, in the case of a window customer, when the customer enters through the entrance, the customer is first photographed by the entry camera 1a. Person verification is then performed between the person image extracted from the captured image and the registered person images. If the verification fails, that is, the person who entered does not exist among the registered customers and clerks, it is detected that a new customer has entered the store, and the person is newly registered as a customer.

  Next, the customer who has entered the store receives a reception number tag at the ticketing machine 21 and waits for a call in the waiting area. When the customer is called, the customer requests business from the store clerk at the window, and is photographed by the window camera 1b. Person verification is performed between the person image extracted from the photographed image and the registered person images, and if it succeeds, it is detected that the business of providing the service to the customer has been carried out, and the customer information of that customer is updated. When the window business is finished, the customer leaves through the entrance and is photographed by the exit camera 1c. Person verification is again performed between the person image extracted from the photographed image and the registered person images, and when it succeeds, it is detected that a customer who has finished receiving the service has left the store, and the customer's registration is deleted.

  The entry time and the exit time are obtained from the photographing times of the person image at entry and the person image at exit, and the time from entry to exit, that is, the in-store stay time, is measured. Further, the customer is photographed continuously by the window camera 1b while staying at the window, and the start time and the end time of the window business are acquired from the photographing times of the person images obtained during window use; from these, the time from the request at the window to the end of the business, that is, the business required time, is calculated.

  In addition, if the back-office support work performed in the backyard in relation to the window business takes time, the window business may be carried out in multiple sessions (here, two) as shown in FIG. 3B. In this case, so that the time required for the back-office work is included, the business required time is measured from the start time of the first window session to the end time of the last (here, the second) window session.

  As shown in FIG. 4, in the case of a money changer customer, when the customer enters the store through the entrance, the customer is first photographed by the entrance camera 1a, and the person detected from the photographed image is registered as a customer. The customer then heads for the money changer 22, and after exchanging money there, leaves through the entrance and is photographed by the exit camera 1c. Because this customer was never photographed by the window camera 1b, completion of service provision is not detected; person verification is performed between the person image extracted from the image photographed by the exit camera 1c and the registered person images, and if it succeeds, it is detected that a customer whose service provision is unknown has left the store, and the customer information is updated.

  In addition, the entry time and the exit time are acquired from the photographing times of the person image at entry and the person image at exit, and the in-store stay time is measured.

  As shown in FIG. 5A, in the case of an immediate exit customer, when the customer enters the store through the entrance, the customer is photographed by the entrance camera 1a, and the person detected from the photographed image is registered as a customer. When the customer looks around the store, gives up on the procedure at the window, and leaves through the entrance, the customer is photographed by the exit camera 1c. Because the customer was never photographed by the window camera 1b, completion of service provision is not detected; person verification is performed between the person image extracted from the image photographed by the exit camera 1c and the registered person images, and if it succeeds, it is detected that a customer whose service provision is unknown has left the store, and the customer information is updated. In addition, as shown in FIG. 5B, the customer may receive a reception number tag at the ticketing machine 21 and still leave immediately after looking around the store; this case is handled in the same way as the case shown in FIG. 5A.

  In addition, the entry time and the exit time are acquired from the photographing times of the person image at entry and the person image at exit, and the in-store stay time is measured.

  As shown in FIG. 6, in the case of a temporary going-out customer, when the customer enters the store through the entrance, the customer is photographed by the entrance camera 1a, and the person detected from the photographed image is registered as a customer. Next, after receiving a reception number tag at the ticketing machine 21, the customer looks around the store, judges that the call will not come soon, and temporarily leaves through the entrance; the customer at that time is photographed by the exit camera 1c. Because the customer was never photographed by the window camera 1b, completion of service provision is not detected; person verification is performed between the person image extracted from the image photographed by the exit camera 1c and the registered person images, and if it succeeds, it is detected that a customer whose service provision is unknown has left the store, and the customer information is updated.

  When the customer returns and enters the store through the entrance, the customer is photographed by the entrance camera 1a. Person verification is performed between the person image extracted from the photographed image and the registered person images, and when it succeeds, that is, when the person who entered the store exists among the registered customers, it is detected that a customer who had gone out has returned to the store. The subsequent steps are the same as those for the window customer shown in FIG. 3.

  Here, a temporary going-out customer eventually returns to the store, so such a customer can be identified when re-entry is detected for a customer who previously left the store with service provision unknown. Money changer customers and immediate exit customers, unlike temporary going-out customers, do not return to the store, and can therefore be distinguished from temporary going-out customers by the absence of re-entry.

  On the other hand, both money changer customers and immediate exit customers leave the store with service provision unknown, so the two cannot be distinguished from each other by that state alone. Therefore, in the present embodiment, money changer customers and immediate exit customers are discriminated based on the in-store stay time. That is, a money changer customer uses the money changer, whereas an immediate exit customer does nothing at all, so a money changer customer usually stays in the store longer than an immediate exit customer. The in-store stay time is therefore compared with a reference time (threshold value): if the stay time is longer than the reference time, the customer is determined to be a money changer customer, and if it is shorter, the customer is determined to be an immediate exit customer.
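  The threshold comparison described above can be sketched as follows. This is a minimal illustration only; the function name and the 60-second reference value are assumptions, not values taken from the embodiment.

```python
# Distinguish a money changer customer from an immediate exit customer by
# comparing the in-store stay time with a reference time (threshold).
# The 60-second reference value is an assumed value for illustration.

REFERENCE_STAY_SECONDS = 60.0  # assumed threshold

def classify_unknown_departure(stay_seconds: float) -> str:
    """Classify a customer who left with service provision unknown."""
    if stay_seconds >= REFERENCE_STAY_SECONDS:
        return "money_changer_customer"   # stayed long enough to use the money changer 22
    return "immediate_exit_customer"      # left without doing anything
```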

  Since a money changer customer does not receive a reception number tag, when a customer who leaves immediately has received one as shown in FIG. 5B, it can already be determined at that point that the customer is not a money changer customer; the customer may therefore be determined to be an immediate exit customer at that point.

  Next, an outline of the waiting time measurement processing performed by the PC 3 shown in FIG. 1 will be described. FIG. 7 is an explanatory diagram for explaining this outline.

  A window customer waits in the waiting area until the reception processing reaches the customer's turn and the call is received, but when many customers are waiting, a customer may be kept waiting for a long time, which is a major source of dissatisfaction. Therefore, in the present embodiment, the PC 3 manages the state of each customer who has entered the store and, in particular, measures the waiting time of each customer, prompting the store clerk to perform the required customer service action, that is, to speak to and apologize to customers who have been kept waiting for a long time, thereby improving customer satisfaction.

  Specifically, when a customer enters the store through the entrance, the customer is photographed by the entrance camera 1a. The person detected from the photographed image is registered as a customer, and measurement of the waiting time starts. When the customer's turn comes, the customer receives the call and appears at the window, where the customer is photographed by the window camera 1b. Person verification is performed between the person image extracted from the photographed image and the registered person images; when it succeeds and it is detected that the business of providing the service to the customer is being performed, the measurement of the waiting time ends.

  The waiting time, whose measurement starts when the customer enters the store, grows longer as the customer waits; when it exceeds a predetermined reference time, a notification is issued to urge the store clerk to perform customer service. The store clerk then finds the customer among those staying in the waiting area and performs the required customer service action, that is, apologizes for the long wait.

  Next, the customer management process performed by the PC 3 shown in FIG. 1 will be described in detail. FIG. 8 is a functional block diagram showing a schematic configuration of the PC 3.

  The PC 3 includes a first image acquisition unit 31, a second image acquisition unit 32, a third image acquisition unit 33, a customer information management unit (user information management unit) 34, and an input / output control unit 35.

  The first image acquisition unit 31 acquires a person image obtained by photographing a customer at some point from entering the store through the entrance until receiving the reception number tag. In the present embodiment, it acquires a person image at entry (first person image), in which a customer entering the store through the entrance is photographed by the entrance camera 1a.

  The second image acquisition unit 32 acquires a person image obtained by photographing the customer at some point from when the customer is called by reception number until the service ends. In the present embodiment, it acquires a person image at window use, in which a customer carrying out a procedure at the window is photographed by the window camera 1b, that is, a person image at service provision (second person image) of a customer who is receiving the service.

  The third image acquisition unit 33 acquires a person image at exit (third person image), in which a customer leaving the store through the entrance is photographed by the exit camera 1c.

  The first to third image acquisition units 31 to 33 detect a person in the captured images input from the cameras 1a to 1c, cut out an image of the face area of the detected person (face image) from the captured image, and output the face image as the person image.
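  The cutting-out step can be sketched as array slicing over a captured frame. This sketch assumes a bounding box supplied by a separate person/face detection process (the detector itself and the box format are assumptions, not part of the embodiment):

```python
import numpy as np

def crop_face(frame: np.ndarray, box: tuple) -> np.ndarray:
    """Cut out the face area (x, y, w, h) from a captured frame.

    `box` is assumed to come from a face detection process run earlier;
    only the cropping performed by the image acquisition units is shown.
    """
    x, y, w, h = box
    h_img, w_img = frame.shape[:2]
    # Clamp the box to the frame so a detection near the edge stays valid.
    x0, y0 = max(0, x), max(0, y)
    x1, y1 = min(w_img, x + w), min(h_img, y + h)
    return frame[y0:y1, x0:x1].copy()
```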

  The customer information management unit 34 generates and manages customer information related to the customer state based on the person images acquired by the first to third image acquisition units 31 to 33. It includes an excluded person registration unit 41, a person verification unit 42, a customer registration unit 43, a customer information storage unit 44, a time measurement unit 45, a customer type determination unit 46, a customer service necessity determination unit 47, a customer service information acquisition unit 48, and a totaling processing unit 49.

  The excluded person registration unit 41 performs processing for registering persons to be excluded from customer management. In the lobby where customers stay, a store clerk (lobby attendant) specializing in guiding customers is stationed; if such a clerk is photographed by the cameras 1a to 1c and included in the customer management targets, the accuracy of customer management is reduced. For this reason, in the present embodiment, the excluded person registration unit 41 registers non-target persons such as store clerks in advance, using person images obtained by photographing them. This prevents non-target persons such as store clerks from being registered as customers.

  When the first to third image acquisition units 31 to 33 acquire captured images from the entrance camera 1a, the window camera 1b, and the exit camera 1c, respectively, the person verification unit 42 performs person verification between the person image extracted from each captured image and the registered person images, and detects the customer's state from the verification result. In the present embodiment, the following customer states are detected: a new customer has entered the store, the business of providing the service to a customer has been performed, a customer who has finished receiving the service has left the store, a customer whose service provision is unknown has left the store, and a customer who left the store with service provision unknown has entered again.

  That is, when the first image acquisition unit 31 acquires a captured image from the entrance camera 1a, the person verification unit 42 performs person verification between the person image extracted from the captured image and the person images of the registered customers. Person verification is also performed against the person images of the persons registered by the excluded person registration unit 41. If both verifications fail, that is, if the person who entered the store matches neither a registered customer nor a registered excluded person, it is detected that a new customer has entered the store (new customer entry detection). The entry time is acquired from the photographing time of the person image at this time.

  When the second image acquisition unit 32 acquires a captured image from the window camera 1b, the person verification unit 42 performs person verification between the person image extracted from the captured image and the registered person images. When the verification succeeds, that is, when the persons are determined to be the same, it is detected that the business of providing the service to the customer has been performed (served customer detection). The start time and end time of the window business are acquired from the photographing times of the person images at this time.

  When the third image acquisition unit 33 acquires a captured image from the exit camera 1c, the person verification unit 42 performs person verification between the person image extracted from the captured image and the registered person images. When the verification succeeds, that is, when the persons are determined to be the same, it is detected that the customer has left the store. If service provision to the customer had been completed, it is detected that a customer who has finished receiving the service has left the store (served customer exit detection). If service provision had not been completed, it is detected that a customer whose service provision is unknown has left the store (unknown-service customer exit detection). The exit time is acquired from the photographing time of the person image at this time.

  Further, when the first image acquisition unit 31 acquires a captured image from the entrance camera 1a, the person verification unit 42 also performs person verification between the person image extracted from the captured image and the person images of customers who left the store with service provision unknown. When this verification succeeds, that is, when the persons are determined to be the same, it is detected that a customer who had gone out has returned to the store (going-out customer return detection). The return time is acquired from the photographing time of the person image at this time.

  A known person recognition technique may be used for this person verification; for example, feature amount data extracted from the two images may be compared to determine whether they show the same person. It is not necessary to extract feature amount data from an image every time person verification is performed: if feature amount data has already been acquired, the verification may reuse it.
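  A minimal sketch of such feature-based verification follows, assuming fixed-length feature vectors produced by some face recognition model; the cosine-similarity measure and the 0.6 threshold are illustrative assumptions, not the embodiment's specific technique.

```python
import numpy as np

MATCH_THRESHOLD = 0.6  # assumed similarity threshold

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_same_person(feat_a: np.ndarray, feat_b: np.ndarray) -> bool:
    """Compare two feature vectors; already-extracted (cached) features
    can be passed in here instead of re-extracting them from the images."""
    return cosine_similarity(feat_a, feat_b) >= MATCH_THRESHOLD
```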

  When the person verification unit 42 detects that a new customer has entered the store, the customer registration unit 43 registers the person shown in the person image as a new customer. In this customer registration processing, a serial number is given to the customer in order of entry, and customer information including the person image and the serial number is generated. The customer information generated by this processing is stored in the customer information storage unit 44. In addition to the person image and the serial number, the customer information includes information such as the photographing time of the person image at entry, that is, the entry time.
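  The registration step can be sketched with a simple record and an in-memory store; the field names and serial-number scheme below are illustrative assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class CustomerInfo:
    serial: int                 # serial number given in order of entry
    person_image: bytes         # face image cut out at entry
    entry_time: datetime        # photographing time of the entry image
    states: list = field(default_factory=list)  # later state updates

class CustomerRegistry:
    """In-memory stand-in for the customer information storage unit 44."""
    def __init__(self):
        self._next_serial = 1
        self.customers = {}     # serial -> CustomerInfo

    def register(self, person_image: bytes, entry_time: datetime) -> CustomerInfo:
        info = CustomerInfo(self._next_serial, person_image, entry_time)
        self.customers[info.serial] = info
        self._next_serial += 1
        return info
```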

  Further, when the person verification unit 42 detects that a customer whose service has been completed has left the store, the customer registration unit 43 deletes the registration of that customer. The customer information of the deregistered customer may be deleted from the customer information storage unit 44, or it may be retained as a history. When it is retained as a history, however, it is desirable to keep the customer information in a form that protects privacy; for example, it is preferable to delete the person image and keep only the information on the customer state needed for the processing in the totaling processing unit 49.

  In addition, when the person verification unit 42 detects that the business of providing the service to a customer has been performed, that a customer whose service provision is unknown has left the store, or that a customer who had gone out has returned to the store, the customer registration unit 43 performs update processing that adds state information indicating the corresponding state to the customer information of that customer.

  The time measurement unit 45 measures, for each customer, elapsed times related to the customer state based on the detection results of the person verification unit 42. In the present embodiment, as one such elapsed time, the waiting time of each customer is measured in real time. In this waiting time measurement processing, the elapsed time from the entry time acquired by the new customer entry detection processing in the person verification unit 42 up to the present is measured as the waiting time. The measurement is continued until the person verification unit 42 detects that the business of providing the service to the customer has started.

  In the present embodiment, the in-store stay time from the customer's entry to the customer's exit is also measured as an elapsed time. Further, the business required time from the start to the end of the business related to providing the service, that is, the window business for a window customer, is measured (see FIG. 3).

  The in-store stay time is calculated from the entry time acquired by the new customer entry detection processing in the person verification unit 42 and the exit time acquired by the exit detection processing. The business required time is calculated from the start time and end time of the window business acquired by the served customer detection processing in the person verification unit 42. If the window business is carried out in a single session (see FIG. 3A), the time from its start time to its end time is the business required time; if the window business is divided into multiple sessions (see FIG. 3B), the time from the start time of the first session to the end time of the last session is the business required time.
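  These two calculations can be sketched as follows; the timestamps are assumed to be supplied by the detection processes, and the function names are illustrative.

```python
from datetime import datetime

def stay_time(entry_time: datetime, exit_time: datetime) -> float:
    """In-store stay time in seconds, from entry to exit."""
    return (exit_time - entry_time).total_seconds()

def business_required_time(sessions: list) -> float:
    """Business required time in seconds over one or more window sessions.

    `sessions` is a list of (start, end) datetime pairs; with multiple
    sessions the span runs from the first start to the last end, so the
    backyard support work between sessions is included.
    """
    starts = [s for s, _ in sessions]
    ends = [e for _, e in sessions]
    return (max(ends) - min(starts)).total_seconds()
```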

  The customer type determination unit 46 determines the customer type based on the customer states detected by the person verification unit 42. In the present embodiment, it determines whether the customer type is a window customer, a money changer customer, an immediate exit customer, or a temporary going-out customer.

  That is, for a window customer, the person verification unit 42 detects that the business of providing the service has been performed at the window, so the customer is determined to be a window customer based on that detection result. For a temporary going-out customer, the person verification unit 42 detects that the customer who had gone out has returned to the store, so the customer is determined to be a temporary going-out customer based on that detection result.

  On the other hand, for both money changer customers and immediate exit customers, the person verification unit 42 detects only that a customer whose service provision is unknown has left the store; since this detection is the same for both, the two cannot be discriminated based on the detection result alone. Therefore, in the present embodiment, the in-store stay time is compared with a reference time (threshold value) to discriminate money changer customers from immediate exit customers: when the stay time is longer than the reference time, the customer is determined to be a money changer customer, and when it is shorter, the customer is determined to be an immediate exit customer.

  Based on the waiting time acquired by the time measurement unit 45, the customer service necessity determination unit 47 determines whether the customer service action required when the waiting time becomes long, that is, an apology for the long wait, is necessary. In this customer service necessity determination processing, the waiting time is compared with a predetermined reference time, and when the waiting time exceeds the reference time, it is determined that the customer service action is necessary.

  Note that when the person verification unit 42 detects that a customer whose service provision is unknown has left the store, the measurement of the waiting time may be terminated; alternatively, the measurement may be continued so that temporary going-out customers are also covered. In the latter case, even if the waiting time acquired by the time measurement unit 45 exceeds the reference time, a customer who is not currently in the store does not need to be served, so the customer service necessity determination for that customer may be skipped.
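  The waiting time measurement and the necessity determination, including the skip for customers who are currently out of the store, can be sketched as below. The 600-second reference time and the parameter names are illustrative assumptions.

```python
from datetime import datetime

REFERENCE_WAIT_SECONDS = 600.0  # assumed reference time (10 minutes)

def waiting_time(entry_time: datetime, now: datetime) -> float:
    """Elapsed waiting time in seconds, measured from store entry."""
    return (now - entry_time).total_seconds()

def needs_customer_service(entry_time: datetime, now: datetime,
                           in_store: bool = True) -> bool:
    """Return True when the customer service action (an apology for the
    long wait) should be prompted. Customers not currently in the store,
    such as temporary going-out customers, are skipped."""
    if not in_store:
        return False
    return waiting_time(entry_time, now) > REFERENCE_WAIT_SECONDS
```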

  Further, the conditions for determining the necessity of customer service may be changed and set as required on the store side. For example, when the store's policy is to prohibit customers from going out in principle, it is possible to temporarily stop the waiting time measurement in the time measurement unit 45 while a customer is out, to set a longer reference time, to defer the customer service necessity determination until the customer returns to the store, or to refrain from the customer service action (speaking to the customer) for customers who have just returned.

  The customer service information acquisition unit 48 acquires customer service information from the store clerk. As this customer service information, information on the implementation status of customer service by the clerk and information on customers who do not need customer service are acquired via the input / output control unit 35. The information acquired by the customer service information acquisition unit 48 is stored in the customer information storage unit 44 as customer information of the corresponding customer.

  The totaling processing unit 49 totals the elapsed times for each customer acquired by the time measurement unit 45 and obtains elapsed times for each predetermined period. In the present embodiment, the in-store stay times for each customer acquired by the time measurement unit 45 are totaled for each time zone to obtain the in-store stay time per time zone. Similarly, the window business required times for each customer are totaled for each time zone to obtain the business required time per time zone.

  Further, the totaling processing unit 49 totals the customer types determined by the customer type determination unit 46 and obtains the number of customers of each type for each predetermined period. In the present embodiment, the money changer customers and immediate exit customers determined by the customer type determination unit 46 are totaled, and the numbers of money changer customers and immediate exit customers for each time zone are obtained.
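  The per-time-zone totaling can be sketched with a plain dictionary keyed by hour; the hourly bucket is an assumed granularity, and the same helper serves both the elapsed-time totals and the per-type customer counts (by passing 1 as the value).

```python
from collections import defaultdict
from datetime import datetime

def total_by_hour(records: list) -> dict:
    """Aggregate per-customer values into hourly time-zone buckets.

    `records` is a list of (entry_time, value) pairs, where `value` may be
    a stay time, a business required time, or 1 to count customers of a type.
    """
    buckets = defaultdict(float)
    for entry_time, value in records:
        buckets[entry_time.hour] += value
    return dict(buckets)
```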

  The input / output control unit 35 outputs the customer information generated by the customer information management unit 34 through a GUI (Graphical User Interface) using the monitor 7 and an input device 6 such as a mouse, presents it to the store clerk, and acquires input information from the clerk's input operations. It includes a display information generation unit (user information presentation unit) 51 and an input information acquisition unit 52.

  The display information generation unit 51 generates display information for a customer status display screen (see FIG. 13) that shows a list of the customer information of the registered customers, and outputs it to the monitor 7, on which the customer status display screen is displayed. The display information generation unit 51 also generates display information for the totaling result information generated by the totaling processing unit 49 (see FIGS. 11 and 12) and outputs it to the monitor 7 and the printer 8. The input information acquisition unit 52 acquires input information in accordance with the input operations that the store clerk performs with the input device 6 on the screen displayed on the monitor 7.

  In the present embodiment, the customer information management unit 34 manages the customer information in cooperation with the window reception system. The window reception system includes the ticketing machine 21, the call display panel 23, and the operation terminal 24, and the ticketing machine 21 also contains the control unit of the window reception system.

  In this window reception system, when the window business for one customer is finished and the store clerk (teller) operates the call button of the operation terminal 24, the next reception number is displayed on the call display panel 23 and the corresponding customer is called. When the customer comes to the window, the clerk operates the reception button of the operation terminal 24. The time at which the clerk operates the call button of the operation terminal 24 (the call time) and the time at which the clerk operates the reception button (the window reception time) are sent from the ticketing machine 21 to the customer registration unit 43.

  The customer registration unit 43 associates each customer with a reception number and adds the reception number to the customer information of the corresponding customer. This association is performed based on time. That is, when the second image acquisition unit 32 acquires the person image at window use photographed by the window camera 1b and the person verification unit 42 detects that the window business has started, the person image and its photographing time are sent to the customer registration unit 43. The customer registration unit 43 associates the customer with the reception number on the condition that the window reception time acquired from the ticketing machine 21 and the photographing time are close to each other.
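  The condition that the window reception time and the photographing time "are close to each other" can be sketched as a tolerance check; the 30-second tolerance and the dictionary-based customer information are illustrative assumptions.

```python
from datetime import datetime, timedelta

MATCH_TOLERANCE = timedelta(seconds=30)  # assumed closeness tolerance

def associate_reception_number(reception_time: datetime,
                               photographing_time: datetime,
                               reception_number: int,
                               customer_info: dict) -> bool:
    """Attach the reception number to the customer information when the
    window reception time from the ticketing machine 21 and the
    photographing time of the window camera 1b are close enough."""
    if abs(reception_time - photographing_time) <= MATCH_TOLERANCE:
        customer_info["reception_number"] = reception_number
        return True
    return False
```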

  In this way, the call time and the window reception time can be acquired from the window reception system. When the call time and the window reception time are stored as customer information in the customer information storage unit 44, the time measurement unit 45 can also measure, as a waiting time, the elapsed time from the entry time to the call time or to the window reception time.

  Each part of the PC 3 shown in FIG. 8 is realized by causing the CPU of the PC 3 to execute a customer management program. This program may be pre-installed in the PC 3 as an information processing device to configure a dedicated device, or may be recorded on an appropriate program recording medium as an application program that runs on a general-purpose OS and provided to the user, possibly via a network.

  In the present embodiment, the processing of the first to third image acquisition units 31 to 33, the person verification unit 42, the customer registration unit 43, and the time measurement unit 45 is performed in real time so that customer information can be presented to the user in real time for customer service. When real-time processing for customer service is not required, the processing of these units need not be performed in real time; in that case, the first to third image acquisition units 31 to 33 may acquire images from the recorder 2.

  Next, the excluded person registration processing performed by the excluded person registration unit 41 shown in FIG. 8 will be described. FIG. 9 is an explanatory diagram showing the video display screen and the excluded person registration screen displayed on the monitor 7.

  In the present embodiment, the excluded person registration unit 41 registers persons, such as store clerks, who are to be excluded from customer management. In this excluded person registration process, the video display screen 61 and the excluded person registration screen 62 shown in FIG. 9 are displayed on the monitor 7, and when the user performs the required operations on these screens, the excluded person registration unit 41 executes the registration.

  Specifically, while the images of the cameras 1a to 1c are displayed on the video display screen 61, the person to be registered is photographed by the cameras 1a to 1c and that person's image appears on the video display screen 61. When the stop button 63 is operated to pause the video and the registration button 64 is then operated, the excluded person registration screen 62 is displayed as a pop-up; at this time, the face image of the person detected in the video is extracted and displayed on the excluded person registration screen 62. On this screen, the face image of the person to be excluded from customer management and information about the person's attributes can be registered.

  The excluded person registration screen 62 is provided with a face image display unit 71, an attribute display unit 72, and a comment display unit 73 for each person. The face image display unit 71 shows the person's face image. The attribute display unit 72 shows attributes related to the person (shooting date, name, employee status, and the like). The comment display unit 73 shows comments such as job titles.

  The excluded person registration screen 62 also includes a file reading button 74, an employee button 75, a registered face editing button 76, and a delete button 77. The file reading button 74 reads a separately prepared image file so that an excluded person can be registered using a face image stored in that file. The employee button 75 displays a screen for selecting an employee; when the corresponding person is selected from the persons registered in advance as employees, the person's attributes (name and employee status) are set. The registered face editing button 76 starts image editing for a face image. The delete button 77 deletes an unnecessary person from the excluded person registration screen 62. Necessary information may also be entered directly into the attribute display unit 72 and the comment display unit 73.

  Next, the person verification process performed by the person verification unit 42 shown in FIG. 8 will be described. FIG. 10 is an explanatory diagram for explaining the outline of the person verification process performed by the person verification unit 42.

  In this embodiment, a customer entering the store through the entrance/exit is photographed by the store entrance camera 1a, and each time a person is detected by person tracking in the first image acquisition unit 31 from the images captured by the store entrance camera 1a, that person's face image is acquired and stored in the customer information storage unit 44. As a result, the customer information storage unit 44 stores face images for a plurality of persons, and typically stores a plurality of face images for each person.

  On the other hand, a customer leaving through the entrance/exit is photographed by the store exit camera 1c, and each time a person is detected by person tracking in the third image acquisition unit 33 from the images captured by the store exit camera 1c, that person's face image is acquired. The person verification unit 42 then performs person verification between the face image acquired by the third image acquisition unit 33 and all the face images stored in the customer information storage unit 44. This is done every time the third image acquisition unit 33 acquires a face image.

  The person verification unit 42 calculates the similarity between the plurality of face images captured at the time of entering the store and the plurality of face images captured at the time of leaving the store, and when the similarity exceeds a predetermined threshold value, determines that the two sets of face images belong to the same person.
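A minimal sketch of this many-to-many comparison follows, with a toy cosine similarity standing in for a real face-matching engine; the 0.8 threshold, the two-dimensional feature vectors, and all names are illustrative assumptions:

```python
import math

def cosine(a, b):
    """Toy similarity on feature vectors; a real system would use a
    face-recognition engine instead."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_same_person(entry_faces, exit_faces, similarity, threshold=0.8):
    # Compare every entry-time image against every exit-time image and
    # accept when the best pairwise similarity clears the threshold.
    best = max(similarity(a, b) for a in entry_faces for b in exit_faces)
    return best > threshold

entry_faces = [(1.0, 0.0), (0.9, 0.1)]  # several images captured at entry
exit_faces = [(0.95, 0.05)]             # images captured at exit
print(is_same_person(entry_faces, exit_faces, cosine))  # -> True
```

Using the best pairwise score illustrates why multiple face images help: even if some captures are poor, one good entry/exit pair is enough to clear the threshold.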

  As described above, since the person verification unit 42 performs verification with a plurality of face images, erroneous verification can be suppressed. The first and third image acquisition units 31 and 33 cannot always acquire a suitable face image facing the front; however, in this embodiment, because verification is performed using a plurality of face images, the accuracy of person verification can be improved and erroneous verification can be suppressed. Performing verification with multiple face images does increase the number of comparisons, but the customers subject to verification, that is, the customers staying in the store, number only a few dozen at most, so the processing load is not large and real-time processing is possible.

  When person verification succeeds in this way, the shooting time of the first face image acquired when the person entered the store is taken as the store entry time, and the shooting time of the last face image acquired when the person left the store is taken as the store exit time.

  Note that the face image captured by the window camera 1b at the time of window use and acquired by the second image acquisition unit 32 is likewise verified against the entry-time face images stored in the customer information storage unit 44.

  Next, a summary result in the summary processing unit 49 shown in FIG. 8 will be described. FIG. 11 and FIG. 12 are explanatory diagrams showing a form representing a total result output by the printer 8 shown in FIG.

  In the present embodiment, the aggregation processing unit 49 totals the in-store stay time of each customer acquired by the time measurement unit 45 for each time zone (one hour) to obtain the in-store stay time per time zone; the display information generation unit 51 then generates the graph shown in FIG. 11A, which is output by the printer 8. In this case, the aggregation processing unit 49 extracts the customers who stayed in the store during each time zone based on each customer's in-store stay time, and averages the stay times of those customers to obtain the in-store stay time per time zone. The time used to assign a customer to a time zone may be the store entry time, the store exit time, or an appropriate time in between.
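The per-time-zone averaging described above can be sketched as follows; the record layout, names, and choice of the entry hour as the assignment time are illustrative assumptions:

```python
from collections import defaultdict

# Illustrative sketch: group each customer's stay time by the one-hour
# time zone of the entry time and average within each zone.
def stay_time_by_hour(records):
    """records: iterable of (entry_hour, stay_minutes)."""
    buckets = defaultdict(list)
    for entry_hour, stay_minutes in records:
        buckets[entry_hour].append(stay_minutes)
    return {hour: sum(v) / len(v) for hour, v in sorted(buckets.items())}

records = [(9, 20), (9, 30), (10, 15)]
print(stay_time_by_hour(records))  # -> {9: 25.0, 10: 15.0}
```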

  With this graph, the temporal transition of the in-store stay time can be grasped. Since the in-store stay time changes according to the degree of congestion in the store, the graph reflects the congestion status in the store.

  Further, in the present embodiment, the aggregation processing unit 49 totals the time required for the work at each window for each customer acquired by the time measurement unit 45 for each time zone, to obtain the required time per window per time zone; the display information generation unit 51 then generates the graph shown in FIG. 11B, which is output by the printer 8. In this case, the aggregation processing unit 49 extracts the customers who used each window during each time zone based on each customer's window usage time, and averages the work required times of those customers to obtain the required time per window. The time used to assign a customer to a time zone may be the start time or end time of the window service, or an appropriate time in between.

  With this graph, the temporal transition of the work required time for each window can be grasped. Since the required time for each window changes according to that window's business efficiency, the graph makes it possible to grasp the business efficiency of each window.

  Moreover, in this embodiment, the aggregation processing unit 49 totals, for each time zone, the customers determined by the customer type determination unit 46 to be money changer users and those determined to have left the store immediately, thereby obtaining the number of money changer users and immediately leaving customers per time zone; the display information generation unit 51 then generates the graphs shown in FIGS. 12A and 12B, which are output by the printer 8. In this case, among the customers determined to be money changer users or immediately leaving customers, the aggregation processing unit 49 extracts the customers who stayed in the store during each time zone based on each customer's in-store stay time, and counts them per time zone to obtain the number of money changer users and immediately leaving customers for each time zone.
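This per-type counting can be sketched as follows; the type labels and record layout are illustrative assumptions:

```python
from collections import Counter

# Illustrative sketch: count, per one-hour time zone, the customers judged
# to be money changer users ("changer") and those judged to have left the
# store immediately ("immediate").
def type_counts_by_hour(customers):
    """customers: iterable of (entry_hour, customer_type)."""
    counts = Counter()
    for hour, customer_type in customers:
        counts[(hour, customer_type)] += 1
    return counts

counts = type_counts_by_hour(
    [(9, "changer"), (9, "immediate"), (9, "changer"), (10, "immediate")])
print(counts[(9, "changer")], counts[(9, "immediate")])  # -> 2 1
```

A `Counter` returns zero for any (time zone, type) pair that was never observed, so every time zone can be plotted without special-casing empty zones.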

  With these graphs, it is possible to grasp the temporal transition of the number of money changer users and of customers who immediately leave the store.

  In the examples shown in FIGS. 11 and 12, the aggregation unit period is a time zone, but it may instead be one day, one week, or one month. In the present embodiment, a form representing the aggregation result is output by the printer 8, but a screen representing the result may be displayed on the monitor 7.

  Next, a customer status display screen for presenting the customer status to the store clerk will be described. FIG. 13 is an explanatory diagram showing a customer status display screen displayed on the monitor 7 shown in FIG. FIG. 14 is an explanatory diagram showing a main part of the customer status display screen shown in FIG.

  In the present embodiment, the display information generation unit 51 generates display information for a customer status display screen that lists the customer information of customers waiting to be called, and the customer status display screen shown in FIG. 13 is displayed on the monitor 7.

  The customer status display screen is provided with a customer information display unit 81 for displaying customer information for each customer. In addition, an average waiting time obtained by averaging the waiting times of all customers is displayed on the customer status display screen.

  The customer information display units 81 are grouped by the place where the customer is (the waiting area or outside the store). The waiting area column shows customers assumed to be in the waiting area, that is, customers staying in the store who have not yet been called, while the outside-the-store column shows customers who are outside the store, that is, customers who left the store without the provision of service being detected. When a customer enters the store and is registered, that customer's information display unit 81 appears in the waiting area column; when the customer leaves the store without the provision of service being detected, the unit moves from the waiting area column to the outside-the-store column; and when the customer returns to the store, it moves back from the outside-the-store column to the waiting area column.

  It should be noted that the customers who leave the store without the provision of service being detected include, in addition to temporarily-out customers, money changer users and customers who leave immediately, so these customers are also displayed in the outside-the-store column.

  As shown in FIG. 14A, the customer information display unit 81 is provided with a serial number display unit 82, a person image display unit 83, a waiting time display unit 84, an outing status display unit 85, a customer service information input display unit 86, and a delete button 87. The serial number display unit 82 shows the serial number assigned to the customer in order of entry when the customer registration unit 43 registers the customer. The person image display unit 83 shows the customer's face image registered by the customer registration unit 43. The waiting time display unit 84 shows the waiting time acquired by the time measurement unit 45.

  The outing status display unit 85 shows status information regarding going out. Specifically, the field is blank when the customer has not left the store, the characters "out" are displayed while the customer is away from the store, and the characters "was out" are displayed after the customer returns from going out.

  In the customer service information input display unit 86, the store clerk inputs the customer service status, that is, the customer service actions actually performed, and the customer service status corresponding to the input is displayed.

  Specifically, as shown in FIG. 14B, when the customer service information input display unit 86 is operated, a pull-down menu presents the selection items "unsupported", "supported once", "supported twice", and "not applicable". The unit initially displays "unsupported". When customer service is performed for the first time, "supported once" is selected, and when it is performed a second time, "supported twice" is selected. "Not applicable" is selected for customers who do not need customer service, for example when, through customer service performed by the clerk, the person is found to be a companion or to be in the waiting area for a reason other than waiting for a call. The characters displayed on the customer service information input display unit 86 switch according to this selection.

  A customer for whom "not applicable" is selected is excluded from the customer service necessity determination performed by the customer service necessity determination unit 47, so no notification that the waiting time has been exceeded is issued for that customer.

  In addition, when window service is performed in multiple sessions (see FIG. 3B), selection items representing the second and subsequent waiting states may be provided in the customer service information input display unit 86. In this way, a clerk can identify customers who are waiting for a call for the second or later time.

  As shown in FIG. 14A, the delete button 87 is used to manually delete a customer's registration. When the delete button 87 is operated, the customer registration unit 43 executes registration cancellation, and the customer's information display unit 81 is removed from the customer status display screen. Manual deletion with the delete button 87 is used for persons who are not subject to customer service, that is, persons such as companions who do not need it, for cases where the same person has been registered twice through some erroneous processing, and for cases where an unregistered clerk has been registered as a customer.

  To avoid accidentally deleting a customer's registration through an erroneous operation, a confirmation screen may be displayed as a pop-up when the delete button 87 is operated, with the deletion confirmed only when the OK button on that screen is operated. It may also be arranged that only persons for whom "not applicable" has been selected in the customer service information input display unit 86 can be deleted.

  For a customer for whom the provision of service has been detected by the person verification unit 42, the customer registration unit 43 automatically performs registration cancellation, and that customer's information display unit 81 is removed from the customer status display screen.

  Further, on the customer status display screen shown in FIG. 13, the customer information display unit 81 of a customer determined by the customer service necessity determination unit 47 to require customer service, such as a call or an apology for a long wait, is highlighted to notify the store clerk. Here the customer information display unit 81 is highlighted by reverse display, but the display color of the unit may be changed instead. In addition, a message prompting customer service may be displayed as a pop-up.

  Further, on the customer status display screen shown in FIG. 13, the display color of the customer information display unit 81 is changed together with the displayed characters according to the contents input in the customer service information input display unit 86 (unsupported, supported once, supported twice, or not applicable).

  As described above, in the present embodiment, the customer information management unit 34 can acquire the customer status relating to the provision of service from the person images captured by the cameras 1a to 1c alone. Because it does not depend on the reception number tag, it can also acquire status information for customers who behave anomalously, such as customers who do not take a reception number tag. The user can thus grasp the state of every customer who has visited the store without omission.

  In the present embodiment, the time measurement unit 45 measures, for each customer, the elapsed time related to the customer's status based on the status detected by the person verification unit 42 and the shooting times of the person images, so the user can grasp the elapsed time related to each customer's status. This makes it possible to evaluate the congestion in the store, the efficiency of the work related to providing the service, and so on.

  Further, in the present embodiment, the time measurement unit 45 measures, as the elapsed time, the in-facility stay time from the customer's entry to the customer's exit, so the user can grasp each customer's stay time.

  In the present embodiment, the time measurement unit 45 measures, as the elapsed time, the time required for each customer from the start to the end of the work related to providing the service, so the user can grasp the required time. If the required time per customer is then aggregated per business window to obtain the required time for each window, the business efficiency of each window can be evaluated.

  In the present embodiment, the aggregation processing unit 49 totals the elapsed time for each customer acquired by the time measurement unit 45 and generates customer information on the elapsed time for each predetermined period, so the user can grasp the elapsed time per period. If the elapsed time for each period is displayed side by side in chronological order, the user can easily grasp its temporal transition; and if the elapsed times of different dates and times are displayed side by side for comparison, the user can easily see how the elapsed time differs by date and time.

  In the present embodiment, the customer type determination unit 46 determines the customer type (window user, money changer user, immediately leaving customer, or temporarily-out customer) based on the customer status detected by the person verification unit 42, so the user can grasp the customer types.

  In the present embodiment, when the person verification unit 42 detects that a customer whose service provision is unknown has left and then detects that the customer has entered again, the customer type determination unit 46 determines that the customer is a temporarily-out customer who has temporarily left the facility, so the user can identify temporarily-out customers.

  Further, in the present embodiment, the time measurement unit 45 measures each customer's in-facility stay time from entry to exit based on the shooting times of the person images, and the customer type determination unit 46 determines the customer type based on this measurement, so customer types that cannot be determined by person verification between person images alone can be determined.

  Moreover, in this embodiment, when the person verification unit 42 detects that a customer whose service provision is unknown has left and the stay time measured by the time measurement unit 45 is less than a predetermined time, the customer type determination unit 46 determines that the customer is an immediately leaving customer who left right after entering without receiving the service, so the user can identify immediately leaving customers.

  Moreover, in this embodiment, when the person verification unit 42 detects that a customer whose service provision is unknown has left and the stay time measured by the time measurement unit 45 exceeds the predetermined time, the customer type determination unit 46 determines that the customer is a self-machine user who used the money changer, a device that provides services according to the customer's own operation, so the user can identify self-machine users.
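Taken together, the decision rules for a customer whose window service was never detected can be sketched as follows; the three-minute threshold and the type labels are illustrative assumptions, not values from the patent:

```python
# Illustrative sketch of the type decision for customers whose window
# service was never detected: re-entry marks a temporarily-out customer;
# otherwise a short stay marks an immediate leaver and a longer stay
# marks a self-machine (money changer) user.
def classify_unserved_customer(stay_minutes, re_entered, threshold_min=3):
    if re_entered:
        return "temporarily out"
    if stay_minutes < threshold_min:
        return "immediate leaver"
    return "self-machine user"

print(classify_unserved_customer(1, False))   # -> immediate leaver
print(classify_unserved_customer(10, False))  # -> self-machine user
print(classify_unserved_customer(1, True))    # -> temporarily out
```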

  Further, in the present embodiment, the aggregation processing unit 49 totals the type of each customer acquired by the customer type determination unit 46 and obtains the number of customers of each type for each predetermined period, so the user can grasp the number of customers by type. If the numbers per period are displayed side by side in chronological order, the user can easily grasp their temporal transition; and if the numbers for different dates and times are displayed side by side for comparison, the user can easily see how the number of customers differs by date and time.

  Further, in the present embodiment, customer information on the customer status detected by the person verification unit 42 is presented to the user in real time, so the store clerk can grasp, without omission, the status of customers who are waiting for service, that is, customers who require customer service actions. This enables the clerk to perform the necessary customer service and prevents customers from being left without receiving the service, so customer satisfaction can be improved.

(Second Embodiment)
Next, a customer management system according to the second embodiment will be described. The points not particularly mentioned here are the same as those in the first embodiment. FIG. 15 is a plan view of the store explaining the store layout, the installation status of the cameras 1a to 1c, and the customer's state. FIG. 16 is an explanatory diagram for explaining an overview of customer management processing performed by the PC 3.

  In the second embodiment, as shown in FIG. 15, a money changer camera 1d for photographing a customer who uses the money changer 22 is installed.

  In this case, as shown in FIG. 16, when a customer enters the store through the entrance, the customer is photographed by the store entrance camera 1a, and the person detected in the captured image is registered as a customer. When the customer changes money at the money changer 22, the customer is photographed by the money changer camera 1d. At this time, person verification is performed between the person image extracted from the captured image and the registered person images; when it succeeds, it is detected that the customer has used the money changer 22, that is, that the provision of service is complete, and the customer information of the corresponding customer is updated.

  When the customer finishes the exchange at the money changer 22 and leaves the store through the entrance, the customer is photographed by the store exit camera 1c. At this time, person verification is performed between the person image extracted from the captured image and the registered person images; when it succeeds, it is detected that a customer who has finished receiving the service has left the store, and the customer's registration is deleted.

  As described above, in the second embodiment, the customer's use of the money changer 22 is detected from the images of the money changer camera 1d, so money changer users can be identified more accurately. In the first embodiment, a money changer user cannot be distinguished from a temporarily-out customer at the moment the customer leaves the store, but in the second embodiment this distinction can be made at that moment.

  Although the present invention has been described above based on specific embodiments, these embodiments are merely illustrations, and the present invention is not limited by them. Moreover, not all of the components of the user management apparatus, the user management system, and the user management method according to the present invention described in the above embodiments are necessarily essential, and they can be selected as appropriate at least within the scope of the present invention.

  For example, the embodiments describe the example of a financial institution such as a bank, but the invention can be widely applied to facilities that issue a reception number tag to users with a ticketing machine, such as mobile phone shops, travel agencies, dispensing pharmacies, medical facilities such as hospitals, department stores, government offices, and event venues. The embodiments also describe providing a service to the user at a financial institution such as a bank, but the present invention can likewise be applied to a store that provides articles such as products to the user.

  In the present embodiment, a money changer is provided as the self machine (self-service type service providing device), but another device such as an ATM (automated teller machine) may be used. An ATM is usually installed in a dedicated area, but it may also be installed in the area (lobby) where customers using the counter stay.

  In this embodiment, the customer who receives the reception number tag from the ticketing machine is not photographed by a camera, but a ticketing machine camera for photographing such customers may be installed. Further, since the ticketing machine is installed near the entrance/exit, the store entrance camera 1a may be used to photograph the customer receiving the reception number tag together with the customers entering through the entrance/exit.

  In the present embodiment, the first image acquisition unit 31 acquires a person image of the customer entering the store captured by the store entrance camera 1a; however, it suffices for the first image acquisition unit 31 to acquire a person image of the customer captured at least at some point from entering the store until receiving the reception number tag. Note that money changer customers do not use the ticketing machine, so for them it is necessary to acquire at least a person image of the user captured at the time of entering the store.

  In the present embodiment, the second image acquisition unit 32 acquires a person image of the customer appearing at the window captured by the window camera 1b; however, it suffices for the second image acquisition unit 32 to acquire a person image of the customer captured at least at some point from receiving the call by the reception number until the service has been received. This makes it possible to detect that the work of providing the service to the customer has been performed.

  In the present embodiment, the first to third image acquisition units 31 to 33 are provided in the PC 3, but an image acquisition device may be provided separately from the PC 3. This image acquisition device can also be integrated with the cameras 1a to 1c to configure cameras with an image acquisition function.

  In the present embodiment, the person tracking process is performed by the PC 3, but it may instead be performed by an apparatus different from the PC 3. It is also possible to mount a person tracking function on the cameras 1a to 1c to configure cameras with a person tracking function.

  Further, in the present embodiment, the processing necessary for customer management is performed by the PC 3 provided in the store; however, as illustrated in FIG. 1, the necessary processing may instead be performed by the PC 11 installed at the headquarters or by the cloud computer 12 constituting a cloud computing system. The necessary processing may also be shared among a plurality of information processing apparatuses, with information transferred among them via a communication medium such as an IP network or a LAN. In this case, the customer management system is composed of the plurality of information processing apparatuses that share the necessary processing.

  In such a configuration, it is preferable to have the PC 3 or the like provided in the store perform at least the processing with a large calculation amount, for example the person detection processing, among the processing necessary for customer management. With this configuration, the amount of data required for the remaining processing is reduced, so even if the remaining processing is performed by an information processing apparatus installed at a location different from the store, for example the PC 11 installed at the headquarters, the communication load can be kept small, which makes it easy to operate the system in a wide-area network connection form.

  Alternatively, at least the processing with a large calculation amount, for example the person detection processing, may be performed by the cloud computer 12. With this configuration, the remaining processing requires only a small amount of calculation, so high-speed processing capability becomes unnecessary on the user side, such as in the store, and the cost borne by the user can be reduced.

  In addition, the cloud computer 12 may perform all of the necessary processing, or at least the display information generation processing among the necessary processing may be assigned to it. With this arrangement, customer information can be displayed not only on the PCs 3 and 11 but also on a portable terminal such as the smartphone 13 or the tablet terminal 14, so the customer status can be checked at any place, including places visited outside the store or the headquarters.

  In the present embodiment, the PC 3 installed in the store performs the processing necessary for customer management, and the GUI screens are displayed on the monitor 7 of the PC 3, so the necessary input and output are performed on the PC 3. However, the necessary input and output may be performed by an information processing apparatus different from the one performing the customer management processing, for example the PC 11 installed at the headquarters or a portable terminal such as the tablet terminal 14. In particular, convenience can be increased when a portable terminal such as the tablet terminal 14 is carried by a store clerk, or is installed at an appropriate place in the store so that a clerk specializing in customer service can browse and perform input operations.

  The user management device, user management system, and user management method according to the present invention have the effect of allowing the state of every user who visits the facility to be grasped without omission, and are useful as a user management device, user management system, user management method, and the like for managing the state of users who visit a facility that provides services or articles to users in order based on reception numbers.

1a Entry camera
1b Window camera
1c Exit camera
1d Currency exchange camera
2 Recorder
3 PC
6 Input device
7 Monitor
8 Printer
11 PC
12 Cloud computer
13 Smartphone
14 Tablet terminal
21 Ticket issuing machine (number tag issuing unit, number tag issuing device)
22 Currency exchange machine (self-machine)
31 First image acquisition unit
32 Second image acquisition unit
33 Third image acquisition unit
34 Customer information management unit (user information management unit)
35 Input/output control unit
41 Excluded person registration unit
42 Person verification unit
43 Customer registration unit
44 Customer information storage unit
45 Time measurement unit
46 Customer type determination unit (user type determination unit)
49 Totalization processing unit
51 Display information generation unit (user information presentation unit)

Claims (12)

  1. A user management device that displays a person image of each user who visits, in order based on reception numbers, a facility that provides services or articles to users, and that enables the state of each user to be viewed, the device comprising:
    a number tag issuing unit that issues a reception number tag to a visiting user;
    a first image acquisition unit that acquires a plurality of first person images, each taken of a user at some point between entering the facility and receiving the reception number tag;
    a second image acquisition unit that acquires a plurality of second person images, each taken of a user at some point between being called by the reception number and the end of the provision of the service or article;
    a third image acquisition unit that acquires a plurality of third person images, each taken of a user at least upon leaving the facility;
    a user information management unit that generates and manages state display information including a person image for each user; and
    a user information presentation unit that presents the state display information generated by the user information management unit to a user,
    wherein the user information management unit includes a person verification unit that performs person verification a plurality of times among the first, second, and third person images, and a user type determination unit that determines the type of each user, and
    detects, from the verification results of the person verification unit, the state of each user with respect to the provision of the service or article, determines the type of the user based on the detected state, and generates the state display information with the users displayed in groups according to type.
  2. The user management device according to claim 1, wherein the user information management unit further comprises a time measurement unit that measures, for each user, an elapsed time relating to the user's state, based on the user states detected via the person verification unit and the capture times of the person images.
  3. The user management device according to claim 2, wherein the time measurement unit measures, as the elapsed time, the stay time from when the user enters the facility until the user leaves.
  4. The user management device according to claim 2 or 3, wherein the time measurement unit measures, as the elapsed time for each user, the work time required from the start to the completion of the work relating to the provision of the service or article.
  5. The user management device according to any one of claims 2 to 4, wherein the user information management unit further comprises a totalization processing unit that totals the elapsed times acquired for each user by the time measurement unit and acquires the elapsed time for each predetermined period.
  6. The user management device according to claim 1, wherein, when the person verification unit detects that a user for whom it is unknown whether the service or article was provided has left the facility and then detects that the same user has entered again, the user type determination unit determines that the user is a temporary outing person who has temporarily left the facility.
  7. The user management device according to claim 1, wherein the user information management unit further comprises a time measurement unit that measures, based on the capture times of the person images, the stay time from when a user enters the facility until the user leaves, and
    the user type determination unit determines the type of the user based on the measurement result of the time measurement unit.
  8. The user management device according to claim 7, wherein, when the person verification unit detects that a user for whom it is unknown whether the service or article was provided has left and the stay time measured by the time measurement unit is within a predetermined time, the user type determination unit determines that the user is an immediate exit person who left immediately after entering without receiving the service or article.
  9. The user management device according to claim 7, wherein, when the person verification unit detects that a user for whom it is unknown whether the service or article was provided has left and the stay time measured by the time measurement unit exceeds the predetermined time, the user type determination unit determines that the user is a self-machine user who used a self-machine that provides the service or article in response to the user's own operation.
  10. The user management device according to any one of claims 6 to 9, wherein the user information management unit further comprises a totalization processing unit that totals the user types determined by the user type determination unit and acquires the number of users of each type for each predetermined period.
  11. A user management system that displays a person image of each user who visits, in order based on reception numbers, a facility that provides services or articles to users, and that enables the state of each user to be viewed, the system comprising:
    a camera installed in the facility;
    a number tag issuing device that issues a reception number tag to a visiting user; and
    a plurality of information processing devices,
    wherein any one of the plurality of information processing devices includes:
    a first image acquisition unit that acquires a plurality of first person images, each taken by the camera of a user at some point between entering the facility and receiving the reception number tag;
    a second image acquisition unit that acquires a plurality of second person images, each taken by the camera of a user at some point between being called by the reception number and the end of the provision of the service or article;
    a third image acquisition unit that acquires a plurality of third person images, each taken by the camera of a user at least upon leaving the facility;
    a user information management unit that generates and manages state display information including a person image for each user; and
    a user information presentation unit that presents the state display information generated by the user information management unit to a user,
    wherein the user information management unit includes a person verification unit that performs person verification a plurality of times among the first, second, and third person images, and a user type determination unit that determines the type of each user, and
    detects, from the verification results of the person verification unit, the state of each user with respect to the provision of the service or article, determines the type of the user based on the detected state, and generates the state display information with the users displayed in groups according to type.
  12. A user management method that uses an information processing device to display a person image of each user who visits, in order based on reception numbers, a facility that provides services or articles to users, and to enable the state of each user to be viewed, the method comprising the steps of:
    issuing a reception number tag to a visiting user;
    acquiring a plurality of first person images, each taken of a user at some point between entering the facility and receiving the reception number tag;
    acquiring a plurality of second person images, each taken of a user at some point between being called by the reception number and the end of the provision of the service or article;
    acquiring a plurality of third person images, each taken of a user at least upon leaving the facility;
    generating and managing state display information including a person image for each user; and
    presenting the generated state display information to a user,
    wherein the step of generating and managing the state display information performs person verification a plurality of times among the first, second, and third person images and determines the type of each user, and
    detects, from the results of the person verification, the state of each user with respect to the provision of the service or article, determines the type of the user based on the detected state, and generates the state display information with the users displayed in groups according to type.
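The type-determination rules recited in claims 6 through 9 can be sketched as a single decision function. This is a minimal illustration, not the patented implementation: the record fields, the label strings, and the 3-minute threshold are all assumptions introduced here.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical threshold: a stay no longer than this counts as an immediate exit.
IMMEDIATE_EXIT_LIMIT = timedelta(minutes=3)


@dataclass
class CustomerRecord:
    entered_at: datetime
    exited_at: Optional[datetime] = None
    served: bool = False       # matched in a second (window) person image
    re_entered: bool = False   # matched again in a later entry image


def determine_type(rec: CustomerRecord) -> str:
    """Classify a customer from the person-verification results, following
    the logic of claims 6, 8, and 9 of the patent (labels are illustrative)."""
    if rec.served:
        return "served customer"           # provision of the service was observed
    if rec.exited_at is None:
        return "in store"                  # no exit detected yet
    if rec.re_entered:
        return "temporary outing person"   # claim 6: left, then entered again
    stay = rec.exited_at - rec.entered_at
    if stay <= IMMEDIATE_EXIT_LIMIT:
        return "immediate exit person"     # claim 8: short, unserved stay
    return "self-machine user"             # claim 9: long, unserved stay
```

Per claim 10, the labels returned here would then be counted per predetermined period by the totalization processing unit.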
JP2014119296A 2014-06-10 2014-06-10 User management device, user management system, and user management method Active JP6241666B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2014119296A JP6241666B2 (en) 2014-06-10 2014-06-10 User management device, user management system, and user management method


Publications (2)

Publication Number Publication Date
JP2015232791A JP2015232791A (en) 2015-12-24
JP6241666B2 true JP6241666B2 (en) 2017-12-06

Family

ID=54934196

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2014119296A Active JP6241666B2 (en) 2014-06-10 2014-06-10 User management device, user management system, and user management method

Country Status (1)

Country Link
JP (1) JP6241666B2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190016578A (en) * 2016-06-22 2019-02-18 로렐세이키 가부시키가이샤 Counter reception system and service robot
US20190147251A1 (en) 2017-11-15 2019-05-16 Canon Kabushiki Kaisha Information processing apparatus, monitoring system, method, and non-transitory computer-readable storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07249138A (en) * 1994-03-09 1995-09-26 Nippon Telegr & Teleph Corp <Ntt> Residence time measuring method
JP4374759B2 (en) * 2000-10-13 2009-12-02 オムロン株式会社 Image comparison system and image comparison apparatus
JP5017873B2 (en) * 2006-02-07 2012-09-05 コニカミノルタホールディングス株式会社 Personal verification device and personal verification method
JP2009163417A (en) * 2007-12-28 2009-07-23 Glory Ltd Automatic reception system
JP5260233B2 (en) * 2008-10-29 2013-08-14 三菱電機インフォメーションシステムズ株式会社 Tracking device, tracking method, and tracking program
JP5356615B1 (en) * 2013-02-01 2013-12-04 パナソニック株式会社 Customer behavior analysis device, customer behavior analysis system, and customer behavior analysis method

Also Published As

Publication number Publication date
JP2015232791A (en) 2015-12-24

Similar Documents

Publication Publication Date Title
US9277185B2 (en) Intelligent video verification of point of sale (POS) transactions
US7652687B2 (en) Still image queue analysis system and method
KR20110044309A (en) Information providing apparatus, information providing method, and recording medium
CN104025573A (en) System and method for site abnormality recording and notification
JP5942278B2 (en) System and method for improving field work by detecting anomalies
US20080288276A1 (en) Method, Process and System for Survey Data Acquisition and Analysis
JP2010113692A (en) Apparatus and method for recording customer behavior, and program
JP2005267430A (en) Information providing system
US20140063256A1 (en) Queue group leader identification
JP2009301297A (en) Information providing device, computer program and store system
JP5356615B1 (en) Customer behavior analysis device, customer behavior analysis system, and customer behavior analysis method
JP2004326208A (en) Customer managing system, program for realizing system, and recording medium
DE102013213841B4 (en) Intelligent POS system
US9875481B2 (en) Capture of retail store data and aggregated metrics
US20140214484A1 (en) Customer category analysis device, customer category analysis system and customer category analysis method
JP2012208854A (en) Action history management system and action history management method
US9786113B2 (en) Investigation generation in an observation and surveillance system
JP5319260B2 (en) Work monitoring device
US10445693B2 (en) Goods monitoring device, goods monitoring system, and goods monitoring method
US20160063712A1 (en) Monitoring apparatus, monitoring system and monitoring method
US7669761B2 (en) Sales shop system
JP2010181920A (en) Area management system
US7944358B2 (en) Traffic and population counting device system and method
JP4991440B2 (en) Product sales apparatus, product sales management system, product sales management method and program
JP5728654B1 (en) Product monitoring device, product monitoring system and product monitoring method

Legal Events

Date Code Title Description
A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621); effective date: 20160229
A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131); effective date: 20170131
A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007); effective date: 20170131
A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523); effective date: 20170331
A02 Decision of refusal (JAPANESE INTERMEDIATE CODE: A02); effective date: 20170523
A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523); effective date: 20170822
A911 Transfer of reconsideration by examiner before appeal (zenchi) (JAPANESE INTERMEDIATE CODE: A911); effective date: 20170831
TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01); effective date: 20170926
A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61); effective date: 20171025
R151 Written notification of patent or utility model registration (JAPANESE INTERMEDIATE CODE: R151); ref document number: 6241666; country of ref document: JP