US20200202553A1 - Information processing apparatus - Google Patents

Information processing apparatus

Info

Publication number
US20200202553A1
Authority
US
United States
Prior art keywords
processing apparatus
information processing
target object
target
person
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/805,866
Inventor
Daisuke Ikeda
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujifilm Business Innovation Corp
Original Assignee
Fuji Xerox Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuji Xerox Co Ltd filed Critical Fuji Xerox Co Ltd
Priority to US16/805,866 priority Critical patent/US20200202553A1/en
Publication of US20200202553A1 publication Critical patent/US20200202553A1/en
Assigned to FUJIFILM BUSINESS INNOVATION CORP. reassignment FUJIFILM BUSINESS INNOVATION CORP. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: FUJI XEROX CO., LTD.
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06K9/00335
    • G06K9/00771
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201Market modelling; Market analysis; Collecting market data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20024Filtering details
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20076Probabilistic image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30242Counting objects in image

Definitions

  • the present invention relates to an information processing apparatus.
  • an information processing apparatus which includes an acquisition section that acquires first information indicating whether or not a target person performs a specific behavior on target objects disposed in plural places and second information indicating a behavior of the target person and including a stay time in the plural places, for each target person, a calculation section that calculates an evaluation value indicating a probability of the target person who has not performed the specific behavior performing the specific behavior on the target object, based on the acquired first information, and an estimation section that extracts data on the target object disposed in the place having a stay time which is smaller than a predetermined value based on the acquired second information, and estimates an opportunity loss for the target object based on the evaluation value calculated for the target object.
  • FIG. 1 is a plan view illustrating an example of a layout of an information processing system according to an exemplary embodiment of the present invention
  • FIG. 2 is a block diagram illustrating an example of a control system in the information processing system
  • FIG. 3 is a diagram illustrating a stay time
  • FIG. 4 is a diagram illustrating an example of a purchase data table
  • FIG. 5 is a diagram illustrating an example of an evaluation data table
  • FIG. 6 is a diagram illustrating an example of a behavior information table
  • FIG. 7 is a graph illustrating an example of estimating an opportunity loss
  • FIG. 8 is a flowchart illustrating an example of an operation of the information processing apparatus.
  • the target objects disposed in the plural places may be objects different from each other or may include the same object.
  • the specific behavior may be a behavior based on the preference of a target person. For example, in a case where the target object is a product, the specific behavior corresponds to purchasing, rental, and the like. In a case where the target object is an exhibit, the specific behavior corresponds to document request and the like.
  • the place having a stay time which is smaller than a predetermined value means a place at which the target person can be considered not to have stopped, for example, a place the target person does not stop by at all or passes only momentarily.
  • the predetermined value is, for example, about 1 or 2 seconds and includes zero.
  • the opportunity loss means that the target person would have performed the specific behavior if he or she had noticed the existence of the target object.
  • FIG. 1 is a plan view illustrating an example of a layout of an information processing system according to an exemplary embodiment of the present invention.
  • the information processing system 1 may be applied to, for example, a store 100 such as a convenience store, a department store, and a shopping center.
  • in the store 100, for example, plural display cases 10 in which products are displayed, a terminal device 4 used by a clerk for accounting processing, and plural (for example, three) cameras (a first camera 3A, a second camera 3B, and a third camera 3C, simply referred to collectively as “a camera 3”) are arranged.
  • the product is an example of the target object.
  • the display case 10 is an example of the place.
  • the display case 10 includes display cases 10a to 10d in which rice balls, box lunches, teas, and cup noodles are respectively displayed, and display cases 10e to 10i in which confectionery, daily necessities, bread, alcohol, and magazines are respectively displayed, as the products, for example.
  • Areas E1 to E12 in which a customer may pass are provided in the store 100.
  • a route R illustrated in FIG. 1 indicates a moving route of a customer having a person ID of “A”, which is used for identifying a person and will be described later, as an example.
  • the customer is an example of a person and a target person.
  • An image obtained by imaging of the camera 3 may be a video or still images captured plural times per second.
  • the camera 3 transmits the image obtained by imaging to an information processing apparatus 2 (see FIG. 2 ) in a wireless or wired manner.
  • the first camera 3A images an inside of the store, which includes the areas E1 to E7, E11, and E12.
  • the second camera 3B images an inside of the store, which includes the areas E7 to E12.
  • the third camera 3C images an inside of the store, which includes the areas E1, E11, and E12, the terminal device 4, and an entrance 101.
  • the terminal device 4 is a computer device called a point-of-sale (POS) register disposed on a counter.
  • a customer who enters the store 100 puts a product that he or she has picked up on the counter and performs payment.
  • in a case where a product on sale in the store 100 is purchased by the customer, the terminal device 4 performs accounting processing, issues a receipt on which purchase of the product is recorded, and generates purchase data indicating that the product has been purchased, for each product.
  • the terminal device 4 transmits the purchase data to the information processing apparatus 2 (see FIG. 2 ) in a wireless or wired manner.
  • FIG. 2 is a block diagram illustrating an example of a control system of the information processing system 1 .
  • the information processing system 1 includes the information processing apparatus 2 connected to the terminal device 4 and the cameras 3A to 3C illustrated in FIG. 1.
  • the information processing apparatus 2 includes a control unit 20 , a storage unit 21 , and a display unit 22 .
  • the control unit 20 controls units of the information processing apparatus 2 .
  • the storage unit 21 stores various kinds of information.
  • the display unit 22 is realized by a display such as a liquid crystal display, and displays various kinds of information.
  • the control unit 20 is configured with a central processing unit (CPU), an interface, and the like.
  • the CPU operates in accordance with a program 210 stored in the storage unit 21 so as to function as a purchase data receiving unit 201 , an evaluation data generating unit 202 , a behavior information acquiring unit 203 , an estimation section 204 , and the like.
  • the purchase data receiving unit 201 and the behavior information acquiring unit 203 are examples of the acquisition section.
  • the evaluation data generating unit 202 is an example of the calculation section.
  • the units 201 to 204 will be described later in detail.
  • the storage unit 21 is configured with a read only memory (ROM), a random access memory (RAM), and the like.
  • the program 210 , a purchase data table 211 (see FIG. 4 ), an evaluation data table 212 (see FIG. 5 ), a behavior information table 213 (see FIG. 6 ), and the like are stored in the storage unit 21 .
  • in the following description, “recording” or “registering” is used in a case of writing information in a table, and “storing” is used in a case of writing information in the storage unit.
  • the purchase data table 211 is an example of the first information.
  • the behavior information table 213 is an example of the second information.
  • the purchase data receiving unit 201 acquires purchase data indicating whether or not a target person purchases a product disposed in the plural display cases 10 . Specifically, the purchase data receiving unit 201 receives a person ID for identifying a customer who is performing accounting processing with the terminal device 4 , from the behavior information acquiring unit 203 . The purchase data receiving unit 201 sequentially receives purchase data of each person ID from the terminal device 4 . The purchase data receiving unit 201 records the purchase data of each person ID in the purchase data table 211 .
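The recording described above can be sketched as a small in-memory table keyed by person ID; the function name, dictionary layout, and sample purchases below are illustrative assumptions, not part of the patent.

```python
# Hypothetical in-memory purchase data table (cf. FIG. 4): one row per
# person ID, one column per product, each cell holding a purchase count.
purchase_table = {}

def record_purchase(person_id, product, quantity=1):
    """Record that the customer with `person_id` purchased `quantity`
    units of `product`, as the purchase data receiving unit 201 might."""
    row = purchase_table.setdefault(person_id, {})
    row[product] = row.get(product, 0) + quantity

# Purchases matching the example of FIG. 4.
record_purchase("A", "box lunch")
record_purchase("A", "tea")
record_purchase("B", "box lunch")
record_purchase("C", "box lunch")
record_purchase("C", "tea")
record_purchase("C", "cup noodle")
```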
  • the evaluation data generating unit 202 calculates an evaluation value indicating a probability that the customer purchases a product which has not been purchased by the customer, based on the purchase data table 211 .
  • the evaluation data generating unit records the calculated evaluation value in the evaluation data table 212 .
  • the evaluation data generating unit 202 calculates the evaluation value by well-known collaborative filtering such as GroupLens. Specifically, the evaluation data generating unit 202 calculates a similarity degree between customers, based on the purchase data table 211 . Then, the evaluation data generating unit estimates an evaluation value based on the calculated similarity degree.
  • the similarity degree indicates how similar the preferences of two customers for products are. For example, a similarity degree of 1 means that the preferences completely coincide with each other.
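A minimal sketch of such an evaluation-value calculation is shown below. It uses a cosine similarity between purchase vectors and a similarity-weighted average, which is a simplification of the GroupLens-style collaborative filtering named above; all data, names, and the exact formula are illustrative assumptions.

```python
import math

# Illustrative purchase counts per person (cf. the purchase data table of
# FIG. 4); the dictionary layout and function names are assumptions.
purchases = {
    "A": {"rice ball": 0, "box lunch": 1, "tea": 1, "cup noodle": 0},
    "B": {"rice ball": 0, "box lunch": 1, "tea": 0, "cup noodle": 0},
    "C": {"rice ball": 0, "box lunch": 1, "tea": 1, "cup noodle": 1},
}

def similarity(a, b):
    """Cosine similarity between two customers' purchase vectors; a value
    of 1 means the recorded preferences completely coincide."""
    dot = sum(purchases[a][i] * purchases[b][i] for i in purchases[a])
    norm_a = math.sqrt(sum(v * v for v in purchases[a].values()))
    norm_b = math.sqrt(sum(v * v for v in purchases[b].values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def evaluation_value(person, product):
    """Similarity-weighted average of the other customers' purchases of
    `product`: an estimate of how likely `person` is to buy it."""
    others = [p for p in purchases if p != person]
    weights = [similarity(person, p) for p in others]
    total = sum(weights)
    if total == 0.0:
        return 0.0
    return sum(w * purchases[p][product]
               for w, p in zip(weights, others)) / total
```

Note that this simplified formula will not reproduce the exact 0.2 and 0.8 figures of FIG. 5, which depend on the patent's own data and method; it only illustrates the shape of the computation.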
  • the behavior information acquiring unit 203 acquires behavior information of a customer from an image obtained by imaging of the camera 3 and records the acquired behavior information in the behavior information table 213 .
  • the behavior information includes a stay time of a customer having a person ID assigned thereto, for each display case 10 , for example.
  • the stay time is set to be a time during which the customer is directed toward the display case 10 while being in front of the display case 10 (for example, within 1 m).
  • the stay time does not include a time during which the customer faces the display case 10 from a distant position (for example, a position at a distance of 2 m or greater), or a time during which the customer is in front of the display case 10 but is not facing it.
  • in a case where the customer is in front of the display case 10 and is directed toward it, it can be considered that the customer looks at the product.
  • the behavior information acquiring unit 203 determines whether or not the customer is directed toward the display case 10 , by analyzing an image which has been obtained by imaging of the camera 3 .
  • alternatively, a time during which the customer is in front of the display case 10 may be recorded as the stay time in the behavior information table 213, without determining whether or not the customer is facing the display case 10.
  • a camera may also be disposed on each display case 10 to detect the gaze direction of the customer, so that it may be determined whether or not the customer looks at the display case 10.
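Under the conditions above (within 1 m of the case and facing it), the stay-time measurement could be sketched per camera frame as follows; the frame rate, distances, and function name are assumptions for illustration.

```python
def stay_time(frames, fps=10, max_distance=1.0):
    """Seconds during which the customer is both within `max_distance`
    metres of the display case and directed toward it.  Each element of
    `frames` is one (distance_in_metres, is_facing_the_case) observation
    taken from the camera images."""
    counted = sum(1 for distance, facing in frames
                  if distance <= max_distance and facing)
    return counted / fps

# 25 frames facing the case up close, 5 frames nearby but turned away,
# 10 frames facing it from 2.5 m away: only the first 25 frames count.
frames = [(0.8, True)] * 25 + [(0.8, False)] * 5 + [(2.5, True)] * 10
```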
  • the behavior information acquiring unit 203 sequentially takes an image obtained by imaging of the camera 3 and monitors whether or not a person is included in the image.
  • the behavior information acquiring unit 203 determines whether or not a person is included in the image, by determining whether or not a face is included in the image.
  • the behavior information acquiring unit assigns a person ID to a customer corresponding to the person in the image and tracks the moving route of the customer in the store.
  • the image obtained by imaging of the camera 3 includes the peripheral parts such as the display case 10 , products, and the floor.
  • the behavior information acquiring unit 203 may specify an area in which the customer is located among the areas E 1 to E 12 , based on the positional relationship between the peripheral parts and the customer.
  • the estimation section 204 extracts data on a product disposed in a display case 10 having a stay time which is smaller than a predetermined value (for example, one second) based on the acquired behavior information table 213 . Then, the estimation section 204 estimates an opportunity loss for the product, based on an evaluation value calculated for the product. The estimation section 204 may estimate the number of target persons counted for each evaluation value, as the opportunity loss for a target object. The estimation section 204 may estimate the number of target persons counted for each evaluation value which is equal to or greater than a predetermined value in the number of target persons counted for each evaluation value, as the opportunity loss for the target object. Further, the estimation section 204 may estimate a ratio of the number of target persons who perform the specific behavior on the target object at a predetermined probability to the number of target persons who come into an area in which the plural places are provided as the opportunity loss for the target object.
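The extraction and counting steps just described could look like the sketch below; the 1-second and 0.5 thresholds follow the examples in the text, while the data layout and names are illustrative assumptions.

```python
def low_stay_cases(stay_times, threshold=1.0):
    """Display cases whose recorded stay time is below `threshold` seconds,
    i.e. places the customer can be considered not to have stopped by."""
    return [case for case, seconds in stay_times.items()
            if seconds < threshold]

def opportunity_loss(evaluations, product, min_value=0.5):
    """Number of target persons whose evaluation value for `product` is
    equal to or greater than `min_value` (the hatched area of FIG. 7)."""
    return sum(1 for values in evaluations.values()
               if values.get(product, 0.0) >= min_value)

# Illustrative behavior and evaluation data for three customers.
stay_times = {"box lunch shelf": 12.0, "cup noodle shelf": 0.0}
evaluations = {"A": {"cup noodle": 0.8},
               "B": {"cup noodle": 0.4},
               "D": {"cup noodle": 0.6}}
```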
  • FIG. 3 is a diagram illustrating the stay time. It is assumed that a customer is in the area E4 and is directed toward the display case 10b of box lunches. In a case where the customer is directed toward a display case 10, the stay time is recorded in association with this display case 10. In the case illustrated in FIG. 3, the customer is directed toward the display case 10b of box lunches, and thus the behavior information acquiring unit 203 acquires a stay time for the display case 10b of box lunches from images. In the case illustrated in FIG. 3, the customer is not directed toward the display case 10d of cup noodles. Thus, the behavior information acquiring unit 203 does not acquire a stay time for the display case 10d of cup noodles from the images, even though the customer actually stays in front of the display case 10d.
  • FIG. 4 is a diagram illustrating an example of the purchase data table 211 .
  • Purchase data indicating whether or not a product as the target object has been purchased is recorded in the purchase data table 211 .
  • the purchase data table 211 includes an item of “person ID”, in which a person ID for identifying a customer is recorded, and items of “rice ball”, “box lunch”, “tea”, “cup noodle”, and the like as products.
  • the number of purchased products is recorded in the corresponding product item.
  • for example, it is recorded that a customer having a person ID of “A” has purchased one box lunch and one tea, a customer having a person ID of “B” has purchased one box lunch, and a customer having a person ID of “C” has purchased one box lunch, one tea, and one cup noodle.
  • a behavior of a customer purchasing a product is an example of the specific behavior.
  • FIG. 5 is a diagram illustrating an example of the evaluation data table 212 .
  • Evaluation data, obtained by evaluating the purchase data recorded in the purchase data table 211 illustrated in FIG. 4 by collaborative filtering, is recorded in the evaluation data table 212.
  • the evaluation data table 212 includes items of “person ID”, “rice ball”, “box lunch”, “tea”, “cup noodle”, and the like.
  • FIG. 5 illustrates a state where evaluation data only for the customer having a person ID of “A” is recorded.
  • FIG. 5 illustrates that the evaluation values for a rice ball and a cup noodle, which have not been purchased by the customer having a person ID of “A”, are set to 0.2 and 0.8, respectively.
  • FIG. 6 is a diagram illustrating an example of the behavior information table 213 .
  • the behavior information table 213 includes the item of “person ID” and items of “rice ball shelf”, “box lunch shelf”, “tea shelf”, “cup noodle shelf”, and the like as the title of the display case 10 .
  • a time during which the customer is in front of the display case 10 and is directed toward it is recorded as the stay time, for each title of the display case 10.
  • FIG. 7 is a graph illustrating an example of estimating the opportunity loss.
  • FIG. 7 focuses on “cup noodle” in which the stay time is smaller than a predetermined value (for example, one second).
  • the horizontal axis indicates an evaluation value, and the vertical axis indicates an integrated value obtained by integrating the number of persons corresponding to each evaluation value for the product “cup noodle”.
  • since the evaluation value of the customer having a person ID of “A” for “cup noodle” is 0.8, this customer is counted at the position where the evaluation value is 0.8.
  • the estimation section 204 may estimate the number of customers having an evaluation value which is equal to or greater than a predetermined value (for example, 0.5), that is, 70 customers (hatched area in FIG. 7), as the opportunity loss for the target object (for example, cup noodle).
  • the evaluation value indicates a probability (also referred to as a purchase probability) that the customer purchases the product.
  • the estimation section 204 may obtain the total number of visitors who have entered the store 100 during a predetermined period (for example, one week or one month). Then, the estimation section 204 may estimate a ratio of the number of persons having an evaluation value which is equal to or greater than a predetermined value (for example, 0.5), that is, 70 persons (hatched area in FIG. 7), to the obtained total number of visitors, as the opportunity loss for “cup noodle”. Even in a case where the total number of visitors varies with the counting period, a difference in opportunity loss between products may be recognized based on the ratio.
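The ratio described here is simply the high-evaluation count divided by the number of visitors in the counting period; a sketch under that reading (the figures 70 and 1,000 are illustrative):

```python
def opportunity_loss_ratio(high_evaluation_count, total_visitors):
    """Ratio of persons likely to perform the specific behavior at the
    predetermined probability to all visitors who entered the store
    during the counting period."""
    if total_visitors == 0:
        return 0.0
    return high_evaluation_count / total_visitors

# 70 persons with an evaluation value of 0.5 or greater, 1,000 visitors.
ratio = opportunity_loss_ratio(70, 1000)
```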
  • FIG. 8 is a flowchart illustrating an example of an operation of the information processing apparatus 2 .
  • the behavior information acquiring unit 203 determines whether or not a person is included in an image obtained by imaging of the camera 3 (S1). In a case where the behavior information acquiring unit determines that the person is included in the image, the behavior information acquiring unit 203 assigns a person ID to the person (S2) and starts tracking a moving route of this person in the store.
  • the behavior information acquiring unit 203 determines whether or not the person looks at a display case 10 (S3). In a case where the person looks at the display case 10 (Yes in S3), the behavior information acquiring unit 203 specifies the display case 10 at which the person looks, based on an image obtained by imaging of the camera 3. The behavior information acquiring unit 203 acquires a stay time and records a stay place and the stay time, along with the person ID, in the behavior information table 213 (S4).
  • the behavior information acquiring unit 203 determines whether or not the person is performing payment processing, based on an image obtained by imaging of the camera 3 (S5). In a case where the person is performing the payment processing (Yes in S5), the behavior information acquiring unit 203 notifies the terminal device 4 of the person ID (S6).
  • the terminal device 4 transmits purchase data of each person ID to the information processing apparatus 2.
  • the purchase data receiving unit 201 in the information processing apparatus 2 receives the purchase data of each person ID, which has been transmitted from the terminal device 4 (S7).
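Assuming the image-analysis steps are available as callbacks, one iteration of the FIG. 8 flow (S1 to S6) could be sketched as follows; every name here is hypothetical, and S7 would happen on the receiving side when the terminal sends back purchase data.

```python
def process_frame(image, state, detect_person, looked_at_case, at_register,
                  notify_terminal, fps=10):
    """One iteration of the FIG. 8 flow: S1 detect a person, S2 assign a
    person ID, S3/S4 record the stay time for the display case the person
    looks at, S5/S6 notify the POS terminal of the ID during payment."""
    person = detect_person(image)                   # S1: face in the image?
    if person is None:
        return
    if person not in state["ids"]:                  # S2: assign a person ID
        state["ids"][person] = chr(ord("A") + len(state["ids"]))
    pid = state["ids"][person]
    case = looked_at_case(image, person)            # S3: looking at a case?
    if case is not None:                            # S4: accumulate stay time
        table = state["behavior"].setdefault(pid, {})
        table[case] = table.get(case, 0.0) + 1.0 / fps
    if at_register(image, person):                  # S5: paying?
        notify_terminal(pid)                        # S6: send the person ID

# One illustrative frame in which the tracked person is at the register.
state = {"ids": {}, "behavior": {}}
notified = []
process_frame("frame-1", state,
              detect_person=lambda img: "face-1",
              looked_at_case=lambda img, p: "box lunch shelf",
              at_register=lambda img, p: True,
              notify_terminal=notified.append)
```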
  • the exemplary embodiment of the present invention has been described above.
  • exemplary embodiments of the present invention are not limited to the above-described exemplary embodiment, and various modifications and implementations may be made without departing from the gist of the present invention.
  • in the above description, a product is used as the target object.
  • the present invention may also be applied to a case using an exhibit as the target object.
  • the behavior information acquiring unit acquires a stay time of a visitor in the vicinity of each exhibit and records the acquired stay time in the behavior information table.
  • the behavior information acquiring unit acquires a behavior of the visitor having an interest in the exhibit, as the specific behavior.
  • as the specific behavior on an exhibit, giving a high evaluation in a questionnaire, requesting a document, or the like is considered.
  • different products are disposed in the display cases 10 .
  • the same product may be disposed in some display cases 10 among the plural display cases 10. In this case, it is possible to recommend a more favorable place for the product, based on a difference in purchase probability caused only by the difference in placement.
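For instance, if the same product is displayed in several cases and a purchase probability has been estimated for each, the more favorable place is simply the case with the highest probability; the case names and figures below are illustrative assumptions.

```python
def more_favorable_place(purchase_probability_by_case):
    """Recommend the display case with the highest estimated purchase
    probability for a product displayed in several cases at once."""
    return max(purchase_probability_by_case,
               key=purchase_probability_by_case.get)

# The same product shown in two cases with different observed probabilities.
recommendation = more_favorable_place({"display case 10a": 0.12,
                                       "display case 10g": 0.31})
```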
  • each unit of the control unit 20 may be configured by a hardware circuit such as a field programmable gate array (FPGA) or an application specific integrated circuit (ASIC).
  • Some of the components in the exemplary embodiment may be omitted or changed without departing from the gist of the present invention.
  • steps may be added, deleted, changed, or replaced in the flowchart without departing from the gist of the present invention.
  • a program used in the exemplary embodiment may be provided in a state of being recorded in a computer-readable recording medium such as a CD-ROM.
  • the program used in the exemplary embodiment may be stored in an external server such as a cloud server and be used via a network.

Abstract

An information processing apparatus includes an acquisition section that acquires first information indicating whether or not a target person performs a specific behavior on a target object disposed in plural places and second information indicating a behavior of the target person and including a stay time in the plural places, for each target person, a calculation section that calculates an evaluation value indicating a probability of the target person who has not performed the specific behavior performing the specific behavior on the target object, based on the acquired first information, and an estimation section that extracts data on the target object disposed in the place having a stay time which is smaller than a predetermined value, based on the acquired second information, and estimates an opportunity loss for the target object based on the evaluation value calculated for the target object.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a divisional application of and claims the priority benefit of a prior application Ser. No. 15/984,422, filed on May 21, 2018, now allowed. The prior application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2017-237612 filed Dec. 12, 2017.
  • BACKGROUND
  • Technical Field
  • The present invention relates to an information processing apparatus.
  • SUMMARY
  • According to an aspect of the invention, there is provided an information processing apparatus which includes an acquisition section that acquires first information indicating whether or not a target person performs a specific behavior on target objects disposed in plural places and second information indicating a behavior of the target person and including a stay time in the plural places, for each target person, a calculation section that calculates an evaluation value indicating a probability of the target person who has not performed the specific behavior performing the specific behavior on the target object, based on the acquired first information, and an estimation section that extracts data on the target object disposed in the place having a stay time which is smaller than a predetermined value based on the acquired second information, and estimates an opportunity loss for the target object based on the evaluation value calculated for the target object.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Exemplary embodiment(s) of the present invention will be described in detail based on the following figures, wherein:
  • FIG. 1 is a plan view illustrating an example of a layout of an information processing system according to an exemplary embodiment of the present invention;
  • FIG. 2 is a block diagram illustrating an example of a control system in the information processing system;
  • FIG. 3 is a diagram illustrating a stay time;
  • FIG. 4 is a diagram illustrating an example of a purchase data table;
  • FIG. 5 is a diagram illustrating an example of an evaluation data table;
  • FIG. 6 is a diagram illustrating an example of a behavior information table;
  • FIG. 7 is a graph illustrating an example of estimating an opportunity loss; and
  • FIG. 8 is a flowchart illustrating an example of an operation of the information processing apparatus.
  • DETAILED DESCRIPTION
  • Hereinafter, an exemplary embodiment of the present invention will be described with reference to the drawings. In the drawings, components having substantially the same function are denoted by the same reference signs and descriptions thereof will not be repeated.
  • SUMMARY OF EXEMPLARY EMBODIMENT
  • An information processing apparatus according to an exemplary embodiment of the present invention includes an acquisition section that acquires first information indicating whether or not a target person performs a specific behavior on target objects disposed in plural places and second information indicating a behavior of the target person and including a stay time in the plural places, for each target person, a calculation section that calculates an evaluation value indicating a probability of the target person who has not performed the specific behavior performing the specific behavior on the target object, based on the acquired first information, and an estimation section that extracts data on the target object disposed in the place having a stay time which is smaller than a predetermined value, based on the acquired second information, and estimates an opportunity loss for the target object based on the evaluation value calculated for the target object.
  • “The target objects disposed in the plural places” may be objects different from each other or may include the same object. “The specific behavior” may be a behavior based on the preference of a target person. For example, in a case where the target object is a product, the specific behavior corresponds to purchasing, rental, and the like. In a case where the target object is an exhibit, the specific behavior corresponds to document request and the like. “The place having a stay time which is smaller than a predetermined value” means a place in which it can be considered that the target person does not stop by, for example, a place in a case where the target person does not stop by at all or a case where the target person momentarily stops by. “The predetermined value” is, for example, about 1 or 2 seconds and includes zero. “The opportunity loss” means that, if the target person has noticed the existence of the target object, the target person performs the specific behavior.
  • FIG. 1 is a plan view illustrating an example of a layout of an information processing system according to an exemplary embodiment of the present invention.
  • The information processing system 1 may be applied to, for example, a store 100 such as a convenience store, a department store, or a shopping center. In the store 100, for example, plural display cases 10 in which products are displayed, a terminal device 4 used by a clerk to perform accounting processing, and plural (for example, three) cameras (a first camera 3A, a second camera 3B, and a third camera 3C, which are simply referred to as “a camera 3” when being collectively referred to) are arranged. The product is an example of the target object. The display case 10 is an example of the place.
  • The display case 10 includes, for example, display cases 10a to 10d in which rice balls, box lunches, teas, and cup noodles are respectively displayed, and display cases 10e to 10i in which confectionery, daily necessities, bread, alcohol, and magazines are respectively displayed, as the products.
  • Areas E1 to E12 through which a customer may pass are provided in the store 100. A route R illustrated in FIG. 1 indicates, as an example, the moving route of a customer having a person ID of “A”, which is used for identifying a person and will be described later. The customer is an example of a person and of a target person.
  • An image obtained by imaging of the camera 3 may be a video or a series of still images captured plural times per second. The camera 3 transmits the image obtained by imaging to an information processing apparatus 2 (see FIG. 2) in a wireless or wired manner. The first camera 3A images an inside of the store, which includes the areas E1 to E7, E11, and E12. The second camera 3B images an inside of the store, which includes the areas E7 to E12. The third camera 3C images an inside of the store, which includes the areas E1, E11, and E12, the terminal device 4, and an entrance 101.
  • The terminal device 4 is a computer device, called a point-of-sale (POS) register, disposed on a counter. A customer who enters the store 100 puts a product picked up with a hand on the counter and performs payment. In a case where a product on sale in the store 100 is purchased by the customer, for example, the terminal device 4 performs processing for accounting, issues a receipt on which the purchase of the product is recorded, and generates, for each product, purchase data indicating that the product has been purchased. The terminal device 4 transmits the purchase data to the information processing apparatus 2 (see FIG. 2) in a wireless or wired manner.
  • FIG. 2 is a block diagram illustrating an example of a control system of the information processing system 1. The information processing system 1 includes the information processing apparatus 2 connected to the terminal device 4 and the cameras 3A to 3C illustrated in FIG. 1.
  • The information processing apparatus 2 includes a control unit 20, a storage unit 21, and a display unit 22. The control unit 20 controls units of the information processing apparatus 2. The storage unit 21 stores various kinds of information. The display unit 22 is realized by a display such as a liquid crystal display, and displays various kinds of information.
  • The control unit 20 is configured with a central processing unit (CPU), an interface, and the like. The CPU operates in accordance with a program 210 stored in the storage unit 21 so as to function as a purchase data receiving unit 201, an evaluation data generating unit 202, a behavior information acquiring unit 203, an estimation section 204, and the like. The purchase data receiving unit 201 and the behavior information acquiring unit 203 are examples of the acquisition section. The evaluation data generating unit 202 is an example of the calculation section. The units 201 to 204 will be described later in detail.
  • The storage unit 21 is configured with a read only memory (ROM), a random access memory (RAM), and the like. The program 210, a purchase data table 211 (see FIG. 4), an evaluation data table 212 (see FIG. 5), a behavior information table 213 (see FIG. 6), and the like are stored in the storage unit 21. In this specification, recording or registering is used in a case of writing information in a table, and storing is used in a case of writing information in the storage unit. The purchase data table 211 is an example of the first information. The behavior information table 213 is an example of the second information.
  • The purchase data receiving unit 201 acquires purchase data indicating whether or not a target person purchases a product disposed in the plural display cases 10. Specifically, the purchase data receiving unit 201 receives a person ID for identifying a customer who is performing accounting processing with the terminal device 4, from the behavior information acquiring unit 203. The purchase data receiving unit 201 sequentially receives purchase data of each person ID from the terminal device 4. The purchase data receiving unit 201 records the purchase data of each person ID in the purchase data table 211.
  • The evaluation data generating unit 202 calculates, based on the purchase data table 211, an evaluation value indicating a probability that the customer purchases a product which has not been purchased by the customer. The evaluation data generating unit records the calculated evaluation value in the evaluation data table 212. The evaluation data generating unit 202 calculates the evaluation value by well-known collaborative filtering such as GroupLens. Specifically, the evaluation data generating unit 202 calculates a similarity degree between customers based on the purchase data table 211, and then estimates an evaluation value based on the calculated similarity degree. The similarity degree indicates how similar the customers' preferences for products are. For example, a similarity degree of 1 means that the preferences completely coincide with each other.
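  • As a concrete illustration, a GroupLens-style similarity degree over a purchase table like the one in FIG. 4 might be computed as follows. This is a minimal sketch under assumptions: Pearson correlation over 0/1 purchase counts is one common choice, but the patent does not fix the exact formula, and the data and function names here are hypothetical.

```python
# Minimal sketch of a GroupLens-style similarity degree between customers,
# computed as the Pearson correlation of their purchase vectors.
# The table contents mirror FIG. 4; the function name is illustrative.
from math import sqrt

# person ID -> purchase counts per product (cf. purchase data table 211)
purchases = {
    "A": {"rice ball": 0, "box lunch": 1, "tea": 1, "cup noodle": 0},
    "B": {"rice ball": 0, "box lunch": 1, "tea": 0, "cup noodle": 0},
    "C": {"rice ball": 0, "box lunch": 1, "tea": 1, "cup noodle": 1},
}

def similarity(u, v):
    """Pearson correlation of the purchase vectors of customers u and v."""
    items = sorted(purchases[u])
    xs = [purchases[u][i] for i in items]
    ys = [purchases[v][i] for i in items]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sqrt(sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys))
    return num / den if den else 0.0
```

Under this sketch, a similarity degree of 1 corresponds to identical purchase patterns, matching the text's remark that a similarity degree of 1 means the preferences completely coincide.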
  • The behavior information acquiring unit 203 acquires behavior information of a customer from an image obtained by imaging of the camera 3 and records the acquired behavior information in the behavior information table 213. The behavior information includes, for example, a stay time for each display case 10 of a customer to whom a person ID is assigned. In this specification, “the stay time” is a time during which the customer is directed toward the display case 10 while being in front of it (for example, within 1 m). Thus, the stay time includes neither a time during which the customer is directed toward the display case 10 from a position far from it (for example, at a distance of 2 m or greater) nor a time during which the customer is in front of the display case 10 but is not directed toward it. In a case where a customer has an interest in a product, the customer generally looks at the product. Thus, by setting the stay time to the time during which the customer is directed toward the display case 10 while being in front of it, data on products corresponding to the customer's preference may be extracted more accurately than in a case where the stay time is simply the time during which the customer is in front of the display case 10, regardless of whether he or she is directed toward it. The behavior information acquiring unit 203 determines whether or not the customer is directed toward the display case 10 by analyzing an image obtained by imaging of the camera 3. Alternatively, a time during which the customer is in front of the display case 10, without determination of whether or not the customer is directed toward it, may be recorded as the stay time in the behavior information table 213. A camera may also be disposed on each display case 10 to detect the gaze direction of the customer, so that it may be determined whether or not the customer looks at the display case 10.
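  • The stay-time bookkeeping described above can be sketched as follows. The frame interval, the observation format, and the shelf names are assumptions for illustration; the patent leaves the actual image analysis unspecified.

```python
# Sketch: accumulate a stay time per display case from per-frame observations.
# Each observation says which display case (if any) the customer is within
# 1 m of, and whether the customer is directed toward it. The frame rate
# and the observation tuples are illustrative assumptions.
FRAME_SECONDS = 0.5  # e.g. two images per second

def accumulate_stay_times(observations):
    """observations: list of (case_id or None, facing: bool), one per frame.
    A frame counts only when the customer is in front of a case AND
    directed toward it, per the specification's definition of stay time."""
    stay = {}
    for case_id, facing in observations:
        if case_id is not None and facing:
            stay[case_id] = stay.get(case_id, 0.0) + FRAME_SECONDS
    return stay

frames = [("box lunch shelf", True), ("box lunch shelf", True),
          ("cup noodle shelf", False),  # in front of it, but not facing it
          (None, False)]
```

Running the sketch on `frames` yields a stay time only for the box-lunch shelf: the frame in front of the cup-noodle shelf is excluded because the customer is not directed toward it, just as in the example of FIG. 3.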
  • The behavior information acquiring unit 203 sequentially takes in images obtained by imaging of the camera 3 and monitors whether or not a person is included in an image. The behavior information acquiring unit 203 determines whether or not a person is included in the image by determining whether or not a face is included in the image. In a case where the behavior information acquiring unit 203 determines that a person is included in the image, the behavior information acquiring unit assigns a person ID to the customer corresponding to the person in the image and tracks the moving route of the customer in the store. The image obtained by imaging of the camera 3 includes peripheral parts such as the display cases 10, products, and the floor. Thus, the behavior information acquiring unit 203 may specify the area in which the customer is located among the areas E1 to E12, based on the positional relationship between the peripheral parts and the customer.
  • The estimation section 204 extracts, based on the acquired behavior information table 213, data on a product disposed in a display case 10 having a stay time which is smaller than a predetermined value (for example, one second). Then, the estimation section 204 estimates an opportunity loss for the product based on an evaluation value calculated for the product. The estimation section 204 may estimate the number of target persons counted for each evaluation value as the opportunity loss for a target object. The estimation section 204 may estimate, as the opportunity loss for the target object, the number of target persons counted for each evaluation value which is equal to or greater than a predetermined value. Further, the estimation section 204 may estimate, as the opportunity loss for the target object, a ratio of the number of target persons who would perform the specific behavior on the target object at a predetermined probability to the number of target persons who come into the area in which the plural places are provided.
  • FIG. 3 is a diagram illustrating the stay time. It is assumed that a customer is in the area E4 and is directed toward the display case 10b of box lunches. In a case where the customer is directed toward a display case 10, the stay time is recorded in association with this display case 10. In the case illustrated in FIG. 3, the customer is directed toward the display case 10b of box lunches, and thus the behavior information acquiring unit 203 acquires a stay time for the display case 10b of box lunches from the images. In the case illustrated in FIG. 3, the customer is not directed toward the display case 10d of cup noodles. Thus, the behavior information acquiring unit 203 does not acquire a stay time for the display case 10d of cup noodles from the images, even though the customer actually stays in front of the display case 10d of cup noodles.
  • FIG. 4 is a diagram illustrating an example of the purchase data table 211. Purchase data indicating whether or not a product as the target object has been purchased is recorded in the purchase data table 211. The purchase data table 211 includes an item of “person ID”, in which a person ID for identifying a customer is recorded, and items of “rice ball”, “box lunch”, “tea”, “cup noodle”, and the like as products. In a case where a customer having a person ID has purchased a product, the number of purchased products is recorded in the corresponding product item. In FIG. 4, it is recorded that a customer having a person ID of “A” has purchased one box lunch and one tea, a customer having a person ID of “B” has purchased one box lunch, and a customer having a person ID of “C” has purchased one box lunch, one tea, and one cup noodle. A behavior of a customer purchasing a product is an example of the specific behavior.
  • FIG. 5 is a diagram illustrating an example of the evaluation data table 212. Evaluation data, obtained by evaluating the purchase data recorded in the purchase data table 211 illustrated in FIG. 4 by collaborative filtering, is recorded in the evaluation data table 212. Similar to the purchase data table 211 illustrated in FIG. 4, the evaluation data table 212 includes items of “person ID”, “rice ball”, “box lunch”, “tea”, “cup noodle”, and the like. FIG. 5 illustrates a state where evaluation data only for the customer having a person ID of “A” is recorded: the evaluation values for the rice ball and the cup noodle, which have not been purchased by this customer, are set to 0.2 and 0.8, respectively.
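  • A hypothetical sketch of how such an evaluation value could be filled in: a similarity-weighted average of the other customers' purchases, here using cosine similarity for brevity. The formula and data are illustrative assumptions and will not reproduce the exact values 0.2 and 0.8 shown in FIG. 5.

```python
# Hypothetical sketch: predict an evaluation value for a product the
# customer has not purchased, as a similarity-weighted average of the
# other customers' purchases. Cosine similarity is used for brevity;
# the patent does not specify the exact prediction formula.
from math import sqrt

purchases = {
    "A": {"rice ball": 0, "box lunch": 1, "tea": 1, "cup noodle": 0},
    "B": {"rice ball": 0, "box lunch": 1, "tea": 0, "cup noodle": 0},
    "C": {"rice ball": 0, "box lunch": 1, "tea": 1, "cup noodle": 1},
}

def cosine(u, v):
    """Cosine similarity of two customers' purchase vectors."""
    dot = sum(purchases[u][i] * purchases[v][i] for i in purchases[u])
    nu = sqrt(sum(x * x for x in purchases[u].values()))
    nv = sqrt(sum(x * x for x in purchases[v].values()))
    return dot / (nu * nv) if nu and nv else 0.0

def evaluation_value(u, item):
    """Similarity-weighted vote of the other customers on whether u would buy item."""
    others = [v for v in purchases if v != u]
    den = sum(cosine(u, v) for v in others)
    num = sum(cosine(u, v) * purchases[v][item] for v in others)
    return num / den if den else 0.0
```

On this toy data, the cup noodle (bought by the similar customer “C”) receives a higher predicted value for “A” than the rice ball, which no one has bought, mirroring the ordering 0.8 versus 0.2 in FIG. 5.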
  • FIG. 6 is a diagram illustrating an example of the behavior information table 213. The behavior information table 213 includes the item of “person ID” and items of “rice ball shelf”, “box lunch shelf”, “tea shelf”, “cup noodle shelf”, and the like as the title of the display case 10. A time during which the customer is in the front of the display case 10 and is directed toward this display case 10 is recorded as the stay time for each title of the display case 10.
  • FIG. 7 is a graph illustrating an example of estimating the opportunity loss. FIG. 7 focuses on “cup noodle” in which the stay time is smaller than a predetermined value (for example, one second). A horizontal axis indicates an evaluation value and a vertical axis indicates an integrated value obtained by integrating the number of persons corresponding to evaluation values for the product of “cup noodle”. In a case of the customer having a person ID of “A”, the evaluation value for “cup noodle” is 0.8. Thus, in the graph illustrated in FIG. 7, counting is performed at a place in which the evaluation value is 0.8.
  • For example, the estimation section 204 may estimate the number (hatched area in FIG. 7) of customers having an evaluation value which is equal to or greater than a predetermined value (for example, 0.5), that is, 70, as the opportunity loss for the target object (for example, cup noodles). This may be regarded as an estimate of the number of persons who would purchase a cup noodle at a probability (also referred to as a purchase probability) of 50% or greater if the customers who have entered the store 100 had noticed the existence of the cup noodles. Thus, it is possible to recommend changing the layout so that the display case 10d of cup noodles is moved to a place in which a customer easily notices the cup noodles.
  • The estimation section 204 may obtain the total number of visitors who have entered the store 100 during a predetermined period (for example, one week or one month). Then, the estimation section 204 may estimate, as the opportunity loss for “cup noodle”, the ratio of the number (hatched area in FIG. 7) of persons having an evaluation value which is equal to or greater than a predetermined value (for example, 0.5), that is, 70, to the obtained total number of visitors. Even in a case where the total number of visitors varies with the counting period, a difference in opportunity loss between products may be recognized based on the ratio.
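  • The two estimates described above reduce to a few lines. The thresholds and the customer data below are illustrative assumptions; FIG. 7's figure of 70 customers is not reproduced here.

```python
# Sketch of the estimation section's counting in FIG. 7: for a product whose
# stay time is below the threshold, count the customers whose evaluation value
# meets a purchase-probability threshold, and also express that count as a
# ratio of all visitors. All numbers here are illustrative assumptions.
STAY_THRESHOLD = 1.0   # seconds; the "predetermined value" for stay time
EVAL_THRESHOLD = 0.5   # purchase probability of 50% or greater

def opportunity_loss(eval_values, total_visitors):
    """eval_values: one evaluation value per customer for the target product.
    Returns (count of customers at or above EVAL_THRESHOLD, ratio to visitors)."""
    count = sum(1 for v in eval_values if v >= EVAL_THRESHOLD)
    return count, count / total_visitors

# e.g. five customers' evaluation values for "cup noodle", 100 visitors in total
count, ratio = opportunity_loss([0.8, 0.3, 0.6, 0.5, 0.1], total_visitors=100)
```

The count corresponds to the hatched area of FIG. 7, and the ratio to the period-independent comparison between products described in the text.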
  • Operation of Exemplary Embodiment
  • Next, an example of an operation of the information processing system 1 will be described. FIG. 8 is a flowchart illustrating an example of an operation of the information processing apparatus 2.
  • The behavior information acquiring unit 203 determines whether or not a person is included in an image obtained by imaging of the camera 3 (S1). In a case where the behavior information acquiring unit determines that the person is included in the image, the behavior information acquiring unit 203 assigns a person ID to the person (S2) and starts tracking a moving route of this person in a store.
  • The behavior information acquiring unit 203 determines whether or not the person looks at a display case 10 (S3). In a case where the person looks at the display case 10 (Yes in S3), the behavior information acquiring unit 203 specifies the display case 10 at which the person looks based on an image obtained by imaging of the camera 3. The behavior information acquiring unit 203 acquires a stay time and records a stay place and the stay time along with the person ID, in the behavior information table 213 (S4).
  • The behavior information acquiring unit 203 determines whether or not the person is performing payment processing, based on an image obtained by imaging of the camera 3 (S5). In a case where the person is performing the payment processing (Yes in S5), the behavior information acquiring unit 203 notifies the terminal device 4 of the person ID (S6).
  • The terminal device 4 transmits purchase data of each person ID to the information processing apparatus 2. The purchase data receiving unit 201 in the information processing apparatus 2 receives the purchase data of each person ID, which has been transmitted from the terminal device 4 (S7).
  • In a case where the person goes out of the store (S8), counting processing is ended.
  • In a case where purchase data has been recorded in the purchase data table 211 and behavior information has been recorded in the behavior information table 213 over a predetermined period (for example, one week or one month), the evaluation data generating unit 202 records evaluation values in the evaluation data table 212. The estimation section 204 estimates an opportunity loss for a product disposed in a place in which it is considered that customers do not stop by. Then, the estimation result is displayed on the display unit 22.
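  • The per-person flow of S1 to S8 might be sketched as follows, with placeholder per-frame observations standing in for the image analysis and the POS messaging. The observation keys and handler shapes are assumptions for illustration, not interfaces the patent defines.

```python
# Rough sketch of FIG. 8's per-person flow (S1 to S8). Each frame is a dict
# of placeholder observations; a real system would derive these from the
# camera images and the terminal device.
def process_frames(frames, behavior_table, paying_ids):
    person_id = None
    for frame in frames:
        if person_id is None and frame.get("person"):      # S1, S2: detect, assign ID
            person_id = frame["person"]
        if person_id is None:
            continue
        case = frame.get("looking_at")                     # S3: looks at a case?
        if case is not None:                               # S4: record stay time
            row = behavior_table.setdefault(person_id, {})
            row[case] = row.get(case, 0.0) + frame.get("dt", 0.5)
        if frame.get("paying"):                            # S5, S6: notify POS of ID
            paying_ids.append(person_id)
        if frame.get("left_store"):                        # S8: end counting
            break
    return person_id

frames = [
    {"person": "A", "looking_at": "tea shelf", "dt": 1.0},
    {"person": "A", "paying": True},
    {"person": "A", "left_store": True},
]
```

Steps S7 (receiving purchase data per person ID) and the later evaluation and estimation run outside this per-frame loop, as the flow in the text describes.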
  • Hitherto, an exemplary embodiment of the present invention has been described. However, exemplary embodiments of the present invention are not limited to the above-described exemplary embodiment, and various modifications and implementations may be made in a range without changing the gist of the present invention. For example, in the exemplary embodiment, the descriptions are made by using a product as the target object. However, the present invention may also be applied to a case using an exhibit as the target object. In this case, the behavior information acquiring unit acquires a stay time of a visitor in the vicinity of each exhibit and records the acquired stay time in the behavior information table. The behavior information acquiring unit acquires a behavior of the visitor having an interest in the exhibit as the specific behavior. As the specific behavior on an exhibit, giving a high evaluation in a questionnaire, requesting a document, or the like is considered.
  • In the exemplary embodiment, different products are disposed in the display cases 10. However, the same product may be disposed in some display cases 10 among the plural display cases 10. This makes it possible to recommend the more favorable place, based on a difference in purchase probability caused only by the difference in placement.
  • A portion or the entirety of each unit of the control unit 20 may be configured by a hardware circuit such as a field programmable gate array (FPGA) or an application specific integrated circuit (ASIC).
  • Some of the components in the exemplary embodiment may be omitted or be changed in a range without changing the gist of the present invention. In the flow in the exemplary embodiment, the step may be added, deleted, changed, replaced, or the like in the range without changing the gist of the present invention. A program used in the exemplary embodiment may be provided in a state of being recorded in a computer-readable recording medium such as a CD-ROM. The program used in the exemplary embodiment may be stored in an external server such as a cloud server and be used via a network.
  • The foregoing description of the exemplary embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.

Claims (20)

What is claimed is:
1. An information processing apparatus comprising:
an estimation section that estimates an opportunity loss for a target object disposed in a certain place, based on a probability of a target person who has not performed a specific behavior performing the specific behavior on the target object, and information related to a stay time of the target person in the place.
2. The information processing apparatus according to claim 1,
wherein the estimation section estimates a number of target persons counted for each evaluation value, as the opportunity loss for the target object.
3. The information processing apparatus according to claim 2,
wherein the estimation section estimates the number of target persons counted for each evaluation value which is equal to or greater than a predetermined value in the number of target persons counted for each evaluation value, as the opportunity loss for the target object.
4. The information processing apparatus according to claim 2,
wherein the estimation section estimates a ratio of the number of target persons who perform the specific behavior on the target object at a predetermined probability to the number of target persons coming into an area in which a plurality of places are provided, as the opportunity loss for the target object.
5. The information processing apparatus according to claim 1,
wherein a calculation section calculates an evaluation value by using collaborative filtering.
6. The information processing apparatus according to claim 2,
wherein a calculation section calculates an evaluation value by using collaborative filtering.
7. The information processing apparatus according to claim 3,
wherein a calculation section calculates an evaluation value by using collaborative filtering.
8. The information processing apparatus according to claim 4,
wherein a calculation section calculates an evaluation value by using collaborative filtering.
9. The information processing apparatus according to claim 1,
wherein the information related to the stay time is obtained by excluding a time during which the target person is not directed toward the target object.
10. The information processing apparatus according to claim 2,
wherein the information related to the stay time is obtained by excluding a time during which the target person is not directed toward the target object.
11. The information processing apparatus according to claim 3,
wherein the information related to the stay time is obtained by excluding a time during which the target person is not directed toward the target object.
12. The information processing apparatus according to claim 4,
wherein the information related to the stay time is obtained by excluding a time during which the target person is not directed toward the target object.
13. The information processing apparatus according to claim 5,
wherein the information related to the stay time is obtained by excluding a time during which the target person is not directed toward the target object.
14. The information processing apparatus according to claim 6,
wherein the information related to the stay time is obtained by excluding a time during which the target person is not directed toward the target object.
15. The information processing apparatus according to claim 7,
wherein the information related to the stay time is obtained by excluding a time during which the target person is not directed toward the target object.
16. The information processing apparatus according to claim 8,
wherein the information related to the stay time is obtained by excluding a time during which the target person is not directed toward the target object.
17. The information processing apparatus according to claim 9,
wherein it is determined whether or not the target person is directed toward the target object, based on an image obtained by imaging.
18. The information processing apparatus according to claim 10,
wherein it is determined whether or not the target person is directed toward the target object, based on an image obtained by imaging.
19. The information processing apparatus according to claim 1,
wherein the specific behavior is a behavior of purchasing the target object.
20. The information processing apparatus according to claim 1,
wherein the specific behavior is a behavior of showing an interest in the target object as an exhibit.
US16/805,866 2017-12-12 2020-03-02 Information processing apparatus Abandoned US20200202553A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/805,866 US20200202553A1 (en) 2017-12-12 2020-03-02 Information processing apparatus

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2017237612A JP6965713B2 (en) 2017-12-12 2017-12-12 Information processing equipment and programs
JP2017-237612 2017-12-12
US15/984,422 US10600198B2 (en) 2017-12-12 2018-05-21 Information processing apparatus
US16/805,866 US20200202553A1 (en) 2017-12-12 2020-03-02 Information processing apparatus

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/984,422 Division US10600198B2 (en) 2017-12-12 2018-05-21 Information processing apparatus

Publications (1)

Publication Number Publication Date
US20200202553A1 true US20200202553A1 (en) 2020-06-25

Family

ID=66696314

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/984,422 Active 2038-09-08 US10600198B2 (en) 2017-12-12 2018-05-21 Information processing apparatus
US16/805,866 Abandoned US20200202553A1 (en) 2017-12-12 2020-03-02 Information processing apparatus

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/984,422 Active 2038-09-08 US10600198B2 (en) 2017-12-12 2018-05-21 Information processing apparatus

Country Status (3)

Country Link
US (2) US10600198B2 (en)
JP (1) JP6965713B2 (en)
CN (1) CN109920172B (en)



Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210192903A1 (en) * 2019-12-20 2021-06-24 Invoxia Method and System for Monitoring the Presence of A Point-Of-Sale Display in A Shop, at the Sight of Consumers
US11594110B2 (en) * 2019-12-20 2023-02-28 Invoxia Method and system for monitoring the presence of a point-of-sale display in a shop, at the sight of consumers

Also Published As

Publication number Publication date
CN109920172B (en) 2022-09-20
US10600198B2 (en) 2020-03-24
JP6965713B2 (en) 2021-11-10
CN109920172A (en) 2019-06-21
JP2019105971A (en) 2019-06-27
US20190180465A1 (en) 2019-06-13

Similar Documents

Publication Publication Date Title
JP5438859B1 (en) Customer segment analysis apparatus, customer segment analysis system, and customer segment analysis method
JP4991440B2 (en) Product sales apparatus, product sales management system, product sales management method and program
US20110199486A1 (en) Customer behavior recording device, customer behavior recording method, and recording medium
WO2017085771A1 (en) Payment assistance system, payment assistance program, and payment assistance method
TWI778030B (en) Store apparatus, store management method and program
JP2004348618A (en) Customer information collection and management method and system therefor
JP2014146154A (en) Customer segment analyzer, customer segment analyzing system and customer segment analyzing method
US20130328765A1 (en) Signage system and display method by the same
US20200202553A1 (en) Information processing apparatus
JP2012088878A (en) Customer special treatment management system
JP2018005691A (en) Information processing system, information processing device and information processing method
JP7384516B2 (en) Information processing device, product recommendation method, program, and system
JP2005309951A (en) Sales promotion support system
JP2016009416A (en) Sales promotion system, sales promotion management device, and sales promotion management program
JP7011801B2 (en) Support systems, support devices, support methods and programs
JP7294663B2 (en) Customer service support device, customer service support method, and program
JP2024015277A (en) Information providing device and its control program
JP2016219065A (en) Staying analysis system and method
JP2017102846A (en) Customer servicing evaluation device and customer servicing evaluation method
CN110610358A (en) Commodity processing method and device and unmanned goods shelf system
JP5027637B2 (en) Marketing data analysis method, marketing data analysis system, data analysis server device, and program
KR101325506B1 (en) Digital display apparatus and method for displaying customized product and using system there of
JP6519833B2 (en) Information presentation device, information presentation system, and information presentation method
JP6662141B2 (en) Information processing device and program
JP2015133131A (en) Data output system and method for selling opportunity loss analysis

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: FUJIFILM BUSINESS INNOVATION CORP., JAPAN

Free format text: CHANGE OF NAME;ASSIGNOR:FUJI XEROX CO., LTD.;REEL/FRAME:056308/0360

Effective date: 20210401

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION