WO2021176552A1 - User terminal and program - Google Patents

User terminal and program

Info

Publication number
WO2021176552A1
Authority
WO
WIPO (PCT)
Prior art keywords
product
information
user terminal
user
image
Application number
PCT/JP2020/008864
Other languages
French (fr)
Japanese (ja)
Inventor
好孝 西田
由樹 源
Original Assignee
株式会社ASIAN Frontier
Application filed by 株式会社ASIAN Frontier
Priority to PCT/JP2020/008864
Publication of WO2021176552A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 - Commerce
    • G06Q 30/06 - Buying, selling or leasing transactions
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 - Manipulating 3D models or images for computer graphics

Definitions

  • The present invention relates to a search for displayed products.
  • For example, at a place where alcoholic beverages are served, such as a bar, there is a need to easily find a product that suits one's taste from among the many products on display. In relation to this, in order to support product inventory management, there is a technique for identifying the brand of each product by photographing a store's display shelves (Patent Document 1 and the like).
  • An object of the present invention is to support finding a product that suits a customer's taste from among the displayed products.
  • In one aspect, the present invention provides a user terminal having: a photographing means; a display means for displaying an image photographed by the photographing means; a storage means for storing product information and a condition related to products; a specifying means for identifying a photographed product based on the image photographed by the photographing means and the product information; and a display control means that, when the identified article satisfies the condition, processes the image of the article and displays it on the display means.
  • According to the present invention, the work of finding a product that suits a customer's taste from among the displayed products is supported.
  • Brief description of the drawings: FIG. 1 is a functional diagram of the user terminal 100; FIG. 2 is an example of user preference information; FIG. 3 is an example of a user's order history; FIG. 4 is an example of product identification information; FIG. 5 is an example of detailed product information; FIG. 6 is an example of inventory information; FIGS. 7 and 8 are operation examples of the user terminal 100; FIG. 9 is an example showing a usage state of the user terminal 100; FIGS. 10 to 14 are examples of screens displayed on the user terminal 100.
  • FIG. 1 is a functional diagram of the user terminal 100.
  • The user terminal 100 is a device carried by the user and includes a photographing means 110, a control unit 120, a display means 130, an input means 140, a storage means 150, a touch screen 160, a communication means 170, and a position acquisition means 180.
  • In the following, the case where the user terminal 100 is a smartphone is described as an example, but the call function is not essential; conversely, the user terminal 100 may be a device to which functions not related to the present invention are added.
  • The photographing means 110 is implemented as a module including a camera, a lens, a light-receiving element, and other optical elements, and photographs the surroundings of the user terminal 100 according to the user's operation. Specifically, the photographing means 110 generates data representing the photographed image and supplies it to the specifying means 121.
  • The acquired image may be a still image or a moving image. An infrared image may be acquired instead of a visible-light image.
  • The display means 130 is realized by, for example, a liquid crystal screen, and displays images according to instructions from the control unit 120.
  • The input means 140 is an input device such as a touch pad or a mouse, and accepts the user's designation of a position on the screen.
  • Preferably, the functions of the display means 130 and the input means 140 are integrated to form the touch screen 160.
  • The communication means 170 is realized as a wireless communication module such as WiFi (registered trademark), 4G, or 5G, connects to other devices via a communication network such as the Internet, and acquires, as needed, information necessary for identifying products and product information.
  • The position acquisition means 180 is implemented as a module that receives radio waves from GPS satellites, or as a communication module that receives information including position information from a base station or a wireless router, and acquires information indicating the current position of the user terminal 100.
  • The storage means 150 stores at least product information and a condition regarding product selection.
  • In a preferred embodiment, the storage means 150 stores user preference information, an order history, product identification information, detailed product information, and inventory information, as shown in FIGS. 2 to 6. The above condition is generated based at least on the user preference information.
  • In addition, the storage means 150 stores an application program (hereinafter, an app) that is executed by the processor of the user terminal 100 and realizes the functions of the user terminal 100.
  • FIG. 2 is an example of user preference information.
  • For each type of beverage, it may contain the characteristics of the user's preferences in terms of alcohol content, aroma, taste, flavor, and mouthfeel, as well as the price (the budget the user has in mind), brand, place of origin, and so on. In short, it contains the information used to determine to what extent the displayed products match (or do not match) the user's preferences. Specifically, a condition for determining whether or not a given product suits this user is generated based on the user preference information.
  • FIG. 3 is an example of the user's order history.
  • Each product in the order history is identified by, for example, a number assigned according to GTIN (Global Trade Item Number), an international standard for uniquely identifying products. Alternatively, it may be identified by the manufacturer's name, brand name, place of origin, age, type, and the like.
  • FIG. 4 is an example of product identification information.
  • In this example, for each ID that uniquely identifies a product, information necessary for distinguishing the product by its appearance (package or container) is stored.
  • As this ID, for example, a GTIN-compliant 13-digit number string can be adopted.
  • As information about the label affixed to the product, the characters on the label (typically a trademark identifying the product, such as a brand name), the pattern of the label, the shape and size of the label, and the position at which the label is affixed to the container can be included, in addition to information on the container itself (such as the shape and color of the bottle).
  • The example in the figure is merely an example; any information that characterizes the appearance of the product may be used.
  • FIG. 5 is an example of detailed product information.
  • The detailed product information may be any information about the product, such as its characteristics and properties, but is preferably information the user can consult when deciding whether to order the product.
  • In this example, associated with a product ID such as a GTIN, it includes text data, figures, illustrations, photographs, graphs, and other image information for explaining the characteristics of the product, as well as information (URL links) for accessing product information provided by the product's manufacturer, retailers, wholesalers, and so on.
  • FIG. 6 is an example of inventory information. For each store, the location information of the store and the information for identifying all the products handled in the store are stored in association with each other.
  • The control unit 120 is realized as one or more general-purpose processors or processors dedicated to image processing, and includes a specifying means 121 and a display control means 122.
  • The specifying means 121 identifies the product photographed by the photographing means 110 based on the image photographed by the photographing means 110 and the product information read from the storage means 150. When the specifying means 121 determines that a product cannot be identified because of the user's camera work, it may output a signal to that effect to the display control means 122.
  • Here, 'identifying' a product is not limited to specifying all publicly available information about it, such as its producer, place of origin, model number, type, brand, version, and variation. It also covers partial identification, for example where the brand can be specified but not which of several existing variations it is, or where the manufacturer can be specified and the candidate model numbers can only be narrowed down to three. In short, it suffices that information useful to the user's decision when selecting the product can be obtained from the photographed image.
  • When products have been identified, the specifying means 121 determines whether each identified product satisfies the condition. Specifically, the specifying means 121 refers to the user preference information and the product identification information stored in the storage means 150 to determine the degree of conformity with the condition, and outputs the determination result to the display control means 122 in association with information indicating the image area of the product.
  • The degree of conformity can be calculated, for example, based on the number of matching items when the condition consists of a plurality of items (each of which either matches or does not).
  • Based on the image data, the specifying means 121 also determines whether the user's camera work and the shooting conditions exceed predetermined values; when a product cannot be identified because of the user's camera work (for example, when the degree of camera shake caused by a sudden camera movement is outside an allowable range), it outputs a flag to that effect to the display control means 122.
  • When the article identified by the specifying means 121 satisfies the condition, the display control means 122 processes the image of the article and displays it on the display means 130. Specifically, the image area to be processed and the content of the processing are determined based on the information supplied from the specifying means 121.
  • The processing may be a process that makes the identified article stand out relative to the other products, for example an effect that imitates the product emitting light (hereinafter, 'light emission processing'). It may instead be a process that makes the image areas of articles other than the identified article (for example, products that could be identified but whose degree of conformity with the condition is at or below a predetermined value) relatively inconspicuous, in other words a process that reduces their visibility.
  • Specifically, such processing of an image area includes blackening, blurring, and reducing the lightness or saturation (hereinafter, 'mask processing').
  • The display control means 122 may change the content of the effect processing according to the degree of conformity with the condition. Specifically, at least one of the color, the brightness (luminance), and the timing of the light emission may be changed according to the degree of conformity. For example, an image with higher brightness (for example, one that appears to shine dazzlingly in gold) is added to a product with a higher degree of conformity.
  • When the effect draws an image imitating the blinking of a predetermined object, the blinking interval is set shorter the higher the degree of conformity is. Similarly, for mask processing, the degree to which the image of a product becomes inconspicuous may be varied according to the degree of conformity by controlling the brightness, saturation, lightness, edge blurring, or other aspects of the drawing.
  • The display control means 122 supplies to the display means 130 the image data obtained by applying the processing performed in this way to the image data acquired from the photographing means 110.
  • Further, the display control means 122 may generate image data in which information about the identified product is superimposed on the image photographed by the photographing means 110. In addition, when the display control means 122 receives from the specifying means 121 a flag indicating that a product cannot be identified because of the user's camera work or the like, it may generate image data in which a message to that effect (for example, the text 'The product cannot be identified because of large blurring. Please move the camera a little more slowly.') is superimposed. This encourages the user to perform appropriate camera operations.
  • FIG. 7 is an operation example of the user terminal 100.
  • A user who visits a store starts the above-mentioned app.
  • First, the position acquisition means 180 acquires position information (S501), and the store where the user is currently located is identified by referring to the inventory information stored in the storage means 150.
  • The user points the camera lens of the photographing means toward a display shelf in the store (here, a bar serving alcoholic beverages), performs a predetermined operation, and starts photographing.
  • Thereafter, the user terminal 100 continuously acquires image data at a predetermined frame rate.
  • The acquired data is displayed on the display means 130 in real time (S502). When the user changes the direction of the user terminal 100 to change the shooting target, an image corresponding to the new shooting target is displayed.
  • The specifying means 121 sequentially analyzes the acquired image data and attempts to identify each photographed product (S503). At this time, when, for example, a product cannot be narrowed down to one but several candidates are found, the specifying means 121 may refer to the storage means 150 as necessary, identify from the acquired position information the store where the user currently is, and use the information on the products handled at that store to narrow the candidates down to one, or perform other processing that improves the accuracy of product identification. Note that some products may remain unidentified or be identified incorrectly.
  • Next, the specifying means 121 determines, for each identified product, whether it meets the condition (S505). For each product whose degree of conformity is equal to or higher than a predetermined value (S506: YES), as shown in FIG. 11, a light emission mode corresponding to the degree of conformity is determined and the light emission effect for that product is superimposed on the screen (S507), and the information on that product is read from the storage means 150 and superimposed as well (S509).
  • In the example of FIG. 11, the product in product image OB1-1 is determined to satisfy the condition; as a result, the light emitting object OB4 and the product information window OB2-1 are displayed in the vicinity of product image OB1-1.
  • In the product information window OB2-1, for example, a photograph of the product, a description, and a chart showing the characteristics of its flavor are displayed.
  • On the other hand, for each product whose degree of conformity is equal to or lower than the predetermined value (S506: NO), the image of that product is masked (S508).
  • In the example of FIG. 11, the masking objects OB3-1, OB3-2, and OB3-3 are the result of mask processing applied to products whose degree of conformity was determined to be equal to or lower than the predetermined value.
  • The control unit 120 stores in the storage means 150 the products whose images were subjected to light emission processing or other conspicuous processing, and the products whose images were subjected to mask processing or other inconspicuous processing (S510).
  • When the user performs some operation on the touch screen 160 (S504: YES), the process proceeds to S550 in FIG. 8.
  • When the operation is a long-press operation (S550: YES), the specifying means 121 identifies the product shown at the position of the long press, reads the information on the identified product from the storage means 150, and superimposes it on the photographed image (S551).
  • FIG. 12 shows an example of the screen displayed at this time. In this example, when a long-press operation is performed on product image OB1-2, the product information window OB2-2 is displayed. By designating a photographed product in this way, the user can obtain information about a product that does not meet the condition but is of interest.
  • When the user performs a flick operation (for example, from the edge of the screen toward the center) (S552: YES), the guidance information window OB5 is superimposed and displayed as shown in FIG. 13 (S553).
  • The information displayed in the guidance information window may include various kinds of guidance, for example information about the current store generated from the order history and the inventory information, information about products the user has ordered at this store in the past, a list of products registered as favorites (described later), links to websites providing further details about products or offering the products by mail order, and information for changing the condition.
  • The guidance information window OB5 is closed by flicking in the opposite direction.
  • When the user performs a tap operation (S554: YES), the display control means 122 superimposes a pin image at the tapped position on the screen (S555).
  • FIG. 14 is an example of the screen displayed at this time.
  • In this example, when a tap operation is performed on product image OB1-3, the pin object OB6 is displayed. The display control means 122 then records in the storage means 150 that the product has been registered as a favorite product (S556). The process then returns to S502 of FIG. 7, and the processing of S502 to S510 (capturing the moving image, identifying the photographed products, and applying image processing according to whether the condition is met) is repeated.
  • When a flick operation is performed on a pin object, the pin object disappears from the display.
  • According to the above embodiment, the user can easily find a product that suits his or her taste from among a large number of displayed products.
  • Even if a product is of a brand the user does not know, it is visually indicated when it meets the condition, which lowers the hurdle to trying an unknown product.
  • Meanwhile, the clerk does not need abundant knowledge to answer users' questions about every product handled and can, for example, concentrate on work other than providing product knowledge to customers.
  • Even in situations where it is difficult to visually recognize the presence of a product because, for example, the label is small, the characters on the label are small, or the lighting is dim (for example, at a bar counter), as long as the shooting conditions needed for image analysis (illuminance, resolution, zoom capability, and so on) are satisfied, the user only has to photograph the display shelf to be presented with products that suit his or her taste.
  • Note that 'display' does not necessarily mean only a state in which products are arranged in an orderly manner; in short, it is sufficient that a plurality of products are present at positions where they can be captured by the photographing means.
  • The above-mentioned smartphone is one example of a user terminal; as the user terminal, for example, smart glasses (an eyeglass-type terminal) using so-called augmented reality (AR) technology may be used.
  • To designate a position on the screen in that case, an existing input mechanism such as line-of-sight detection or voice input may be used.
  • In the above embodiment the products to be searched were beverages, but they may be food products, daily necessities, or electrical appliances.
  • The present invention can be applied in any situation where searching for and selecting products from among a large number of displayed products can be supported.
  • The above-mentioned long-press, flick, and tap operations are merely examples of operations for inputting commands such as designating a product, giving a registration instruction, and displaying an information window.
  • The above-mentioned 'light emission processing' need not be something everyone perceives as the product emitting light; in short, any image processing that helps the user visually distinguish products that meet the condition (or have a high degree of conformity) from the other products will do. In the above embodiment, a product satisfying the condition was given the light emission processing and its product information was displayed, but only the light emission processing may be performed and the display of product information omitted. Alternatively, whether to perform only the light emission processing or both the light emission processing and the display of product information may be selected according to the degree of conformity with the condition.
  • For example, the degree of conformity may be divided into three levels ('high', 'medium', 'low'): when it is 'high', the light emission processing is performed and the product information is displayed; when it is 'medium', only the light emission processing is performed; and when it is 'low', the mask processing is performed.
  • Alternatively, whether to display the product information may be decided according to the order history information. For example, when the order history shows that the product has been ordered in the past, the user is likely to already know about it, so the product information is not displayed regardless of the degree of conformity. Alternatively, the user may set in advance whether the product information should be displayed in addition to the light emission processing.
  • For example, an advanced mode and a beginner mode may be provided and the user asked to specify one of them in advance: in the advanced mode, the product information is not displayed regardless of the degree of conformity with the condition, whereas in the beginner mode, the light emission processing and the display of product information are always performed as a set.
  • The notification may also be made by vibrating the housing of the user terminal 100 and/or outputting a predetermined sound.
  • The product information and/or guidance information window displayed when the condition is met, or when the user designates a product, need not be superimposed on the captured image; it may be displayed on a screen different from that showing the captured image, or in a different display area of the same screen. For example, when the user terminal 100 physically has a plurality of display screens, the captured image and the product information are shown on different screens. Alternatively, the product information may be displayed on a device other than the user terminal 100: for example, when the user carries a plurality of devices, only the photographed image is displayed on the smartphone and only the product information is displayed on a wrist-worn device. The product information may even be displayed on a device not owned by the user (for example, a screen installed in the store). In these cases, wireless communication is performed between the devices, and a product-information display command is transmitted from the user terminal 100 to the other device.
  • The user preference information may be entered by the user or generated automatically by the user terminal 100.
  • For example, the specifying means 121 automatically generates the preference information from the user's order history according to a predetermined algorithm.
  • Information on other users' preferences and order histories may also be used: for example, order histories and preference information may be collected for a plurality of users, the relationship between them learned with a method such as machine learning to build a learning model, and the condition determined by feeding the user's order history information into this model. A minimal sketch of the simpler, single-user idea is shown after this list.
  • Part of the functions of the user terminal 100 may be provided by another device (for example, a server connected to the user terminal 100 via a network).
  • In that case, all or part of the functions of the control unit 120 and the storage means 150 may be omitted from the user terminal 100.
  • The user terminal 100 then needs at least a photographing means, a display screen, an input means, and a communication means, and may receive the results of identifying the photographed products and/or of the image processing from an external device via the communication means.
  • Likewise, instead of the storage means 150 holding the information needed to identify products and to determine the content of the image processing, an external storage device may hold it and the user terminal 100 may acquire it as needed.
  • For example, the user terminal 100 may query a search server for detailed information on a product, or for information necessary to identify a product, and acquire the search result.
  • In another aspect, the present invention causes an information processing system including one or a plurality of information processing devices to execute: a step of inputting a condition related to product selection; a step of acquiring product information; a step of acquiring an image taken by a user; a step of identifying the photographed product based on the image and the product information; and a step of processing and displaying the image of the article when the identified product satisfies the condition.
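As referenced in the bullet on automatically generated preference information above, the following Kotlin sketch illustrates only the simplest single-user variant: deriving provisional preferences from the order history by taking the most frequent value of each attribute. It is a hypothetical stand-in for the "predetermined algorithm" or learned model mentioned above; the data shapes and the frequency heuristic are assumptions, not part of the embodiment.

```kotlin
// Hypothetical sketch: derive provisional preference information from an order history
// by picking, for each attribute, the value that occurs most often among past orders.
data class OrderedItem(val aroma: String, val flavor: String, val origin: String, val price: Int)

data class GeneratedPreference(
    val preferredAroma: String?,
    val preferredFlavor: String?,
    val preferredOrigin: String?,
    val assumedBudget: Int?       // here simply the highest price paid so far
)

private fun <T> mostFrequent(values: List<T>): T? =
    values.groupingBy { it }.eachCount().entries.maxByOrNull { it.value }?.key

fun generatePreference(history: List<OrderedItem>): GeneratedPreference =
    GeneratedPreference(
        preferredAroma = mostFrequent(history.map { it.aroma }),
        preferredFlavor = mostFrequent(history.map { it.flavor }),
        preferredOrigin = mostFrequent(history.map { it.origin }),
        assumedBudget = history.map { it.price }.maxOrNull()
    )
```

A learned model over many users' histories, as the bullet above allows, would replace this frequency heuristic, but the input (order history) and output (a condition or preference profile) would play the same roles.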

Abstract

A user terminal (100) has: an imaging means; a display means (130) that displays an image captured by the imaging means; a storage means (150) that stores information about a commodity and a condition concerning the commodity; a specification means (121) that, on the basis of the image captured by the imaging means and the information about the commodity, specifies the imaged commodity; and a display control means (122) that, when the specified article satisfies the condition, processes the image of the article and displays the image on the display means.

Description

User terminal and program
The present invention relates to a search for displayed products.
For example, at a place where alcoholic beverages are served, such as a bar, there is a need to easily find a product that suits one's taste from among the many products on display. In relation to this, in order to support product inventory management, there is a technique for identifying the brand of each product by photographing a store's display shelves (Patent Document 1 and the like).
Patent Document 1: JP-A-2017-117362
With the technique of Patent Document 1, even if the products that are present can be grasped, products that suit the customer's taste cannot be presented.
An object of the present invention is to support finding a product that suits a customer's taste from among the displayed products.
In one aspect, the present invention provides a user terminal having: a photographing means; a display means for displaying an image photographed by the photographing means; a storage means for storing product information and a condition related to products; a specifying means for identifying a photographed product based on the image photographed by the photographing means and the product information; and a display control means that, when the identified article satisfies the condition, processes the image of the article and displays it on the display means.
According to the present invention, the work of finding a product that suits a customer's taste from among the displayed products is supported.
Brief description of the drawings: FIG. 1 is a functional diagram of the user terminal 100; FIG. 2 is an example of user preference information; FIG. 3 is an example of a user's order history; FIG. 4 is an example of product identification information; FIG. 5 is an example of detailed product information; FIG. 6 is an example of inventory information; FIGS. 7 and 8 are operation examples of the user terminal 100; FIG. 9 is an example showing a usage state of the user terminal 100; FIGS. 10 to 14 are examples of screens displayed on the user terminal 100.
<Example>
FIG. 1 is a functional diagram of the user terminal 100. The user terminal 100 is a device carried by the user and includes a photographing means 110, a control unit 120, a display means 130, an input means 140, a storage means 150, a touch screen 160, a communication means 170, and a position acquisition means 180. In the following, the case where the user terminal 100 is a smartphone is described as an example, but the call function is not essential; conversely, the user terminal 100 may be a device to which functions not related to the present invention are added.
The photographing means 110 is implemented as a module including a camera, a lens, a light-receiving element, and other optical elements, and photographs the surroundings of the user terminal 100 according to the user's operation. Specifically, the photographing means 110 generates data representing the photographed image and supplies it to the specifying means 121. The acquired image may be a still image or a moving image. An infrared image may be acquired instead of a visible-light image.
The display means 130 is realized by, for example, a liquid crystal screen, and displays images according to instructions from the control unit 120.
The input means 140 is an input device such as a touch pad or a mouse, and accepts the user's designation of a position on the screen. Preferably, the functions of the display means 130 and the input means 140 are integrated to form the touch screen 160.
The communication means 170 is realized as a wireless communication module such as WiFi (registered trademark), 4G, or 5G, connects to other devices via a communication network such as the Internet, and acquires, as needed, information necessary for identifying products and product information.
The position acquisition means 180 is implemented as a module that receives radio waves from GPS satellites, or as a communication module that receives information including position information from a base station or a wireless router, and acquires information indicating the current position of the user terminal 100.
The storage means 150 stores at least product information and a condition regarding product selection. In a preferred embodiment, the storage means 150 stores user preference information, an order history, product identification information, detailed product information, and inventory information, as shown in FIGS. 2 to 6. The above condition is generated based at least on the user preference information.
In addition, the storage means 150 stores an application program (hereinafter, an app) that is executed by the processor of the user terminal 100 and realizes the functions of the user terminal 100.
FIG. 2 is an example of user preference information. In this example, for each type of beverage, it may contain the characteristics of the preferences of the user of the user terminal 100 in terms of alcohol content, aroma, taste, flavor, and mouthfeel, as well as the price (the budget the user has in mind), brand, place of origin, and so on. In short, it contains the information used to determine to what extent the displayed products match (or do not match) the user's preferences. Specifically, a condition for determining whether or not a given product suits this user is generated based on the user preference information.
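Purely as an illustration, the following Kotlin sketch shows one way preference information of the kind in FIG. 2 could be represented and turned into a condition consisting of per-item yes/no checks. The data classes, field names, and the specific checks are assumptions, not structures taken from the embodiment.

```kotlin
// Hypothetical data shapes for a product and for one user's preference information.
data class Product(
    val gtin: String,
    val category: String,     // e.g. "whisky"
    val aroma: String,
    val flavor: String,
    val abv: Double,          // alcohol by volume, in percent
    val price: Int,
    val origin: String
)

data class UserPreference(
    val category: String,
    val preferredAroma: String,
    val preferredFlavor: String,
    val abvRange: ClosedFloatingPointRange<Double>,
    val budget: Int,
    val preferredOrigin: String?   // null means "no preference"
)

/** One item of the condition: a name plus a yes/no check over a product. */
class ConditionItem(val name: String, val matches: (Product) -> Boolean)

/** Derive a condition (a list of yes/no items) from the user preference information. */
fun conditionFrom(pref: UserPreference): List<ConditionItem> = listOf(
    ConditionItem("category") { p -> p.category == pref.category },
    ConditionItem("aroma") { p -> p.aroma == pref.preferredAroma },
    ConditionItem("flavor") { p -> p.flavor == pref.preferredFlavor },
    ConditionItem("abv") { p -> p.abv in pref.abvRange },
    ConditionItem("price") { p -> p.price <= pref.budget },
    ConditionItem("origin") { p -> pref.preferredOrigin == null || p.origin == pref.preferredOrigin }
)
```

Representing the condition as a list of named items is one choice among many; it is used here only because it matches the later description of conformity as the number of matching items.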
FIG. 3 is an example of the user's order history. It includes the stores the user of the user terminal 100 has visited in the past, the dates and times of the visits, the products ordered at each store, and the products the user has registered (for example, products that did not lead to an order but caught the user's interest; registration may be unrelated to whether an order was placed). Each product is identified by, for example, a number assigned according to GTIN (Global Trade Item Number), an international standard for uniquely identifying products. Alternatively, it may be identified by the manufacturer's name, brand name, place of origin, age, type, and so on.
FIG. 4 is an example of product identification information. In this example, for each ID that uniquely identifies a product, information necessary for distinguishing the product by its appearance (package or container) is stored. As this ID, for example, a GTIN-compliant 13-digit number string can be adopted. In this example, as information about the label affixed to the product, the characters on the label (typically a trademark identifying the product, such as a brand name), the pattern of the label, the shape and size of the label, and the position at which the label is affixed to the container can be included, in addition to information on the container itself (such as the shape and color of the bottle). The example in the figure is merely an example; any information that characterizes the appearance of the product may be used.
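A minimal sketch of how such an appearance record and a GTIN-13 ID might look in code follows. The record fields and the naive feature-counting match score are hypothetical; only the GTIN-13 check-digit rule is the standard one (alternating weights 1 and 3 over the first twelve digits).

```kotlin
/** Hypothetical appearance features for one product (cf. FIG. 4); field names are illustrative. */
data class AppearanceRecord(
    val gtin13: String,          // 13-digit product ID
    val labelText: String,       // characters on the label, e.g. a brand name
    val labelShape: String,      // e.g. "oval"
    val labelPosition: String,   // where the label sits on the container
    val containerShape: String,  // e.g. "square bottle"
    val containerColor: String
)

/** Standard GTIN-13 check: the last digit must equal the checksum of the first twelve. */
fun isValidGtin13(id: String): Boolean {
    if (id.length != 13 || id.any { !it.isDigit() }) return false
    val digits = id.map { it - '0' }
    val sum = digits.take(12).mapIndexed { i, d -> if (i % 2 == 0) d else d * 3 }.sum()
    return (10 - sum % 10) % 10 == digits[12]
}

/** Naive appearance match: count how many observed features agree with the stored record. */
fun appearanceScore(record: AppearanceRecord, observed: Map<String, String>): Int =
    listOf(
        "labelText" to record.labelText,
        "labelShape" to record.labelShape,
        "labelPosition" to record.labelPosition,
        "containerShape" to record.containerShape,
        "containerColor" to record.containerColor
    ).count { (key, value) -> observed[key].equals(value, ignoreCase = true) }
```

In practice the image-analysis side that produces the observed features is the hard part; this sketch only shows the bookkeeping around the record.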
FIG. 5 is an example of detailed product information. The detailed product information may be any information about the product, such as its characteristics and properties, but is preferably information the user can consult when deciding whether to order the product. In this example, associated with a product ID such as a GTIN, it includes text data, figures, illustrations, photographs, graphs, and other image information for explaining the characteristics of the product, as well as information (URL links) for accessing product information provided by the product's manufacturer, retailers, wholesalers, and so on.
FIG. 6 is an example of inventory information. For each store, the location information of the store and information identifying all the products handled at that store are stored in association with each other.
The control unit 120 is realized as one or more general-purpose processors or processors dedicated to image processing, and includes a specifying means 121 and a display control means 122.
The specifying means 121 identifies the product photographed by the photographing means 110 based on the image photographed by the photographing means 110 and the product information read from the storage means 150. When the specifying means 121 determines that a product cannot be identified because of the user's camera work, it may output a signal to that effect to the display control means 122. Here, 'identifying' a product is not limited to specifying all publicly available information about it, such as its producer, place of origin, model number, type, brand, version, and variation. It also covers partial identification, for example where the brand can be specified but not which of several existing variations it is, or where the manufacturer can be specified and the candidate model numbers can only be narrowed down to three. In short, it suffices that information useful to the user's decision when selecting the product can be obtained from the photographed image; even when the product cannot be uniquely or completely identified, this still corresponds to 'identifying' the product.
When products have been identified, the specifying means 121 determines whether each identified product satisfies the condition. Specifically, the specifying means 121 refers to the user preference information and the product identification information stored in the storage means 150 to determine the degree of conformity with the condition, and outputs the determination result to the display control means 122 in association with information indicating the image area of the product. The degree of conformity can be calculated, for example, based on the number of matching items when the condition consists of a plurality of items (each of which either matches or does not).
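The count-of-matching-items idea above can be written down directly; the sketch below is a minimal, standalone version in which the per-item results are simply passed in as booleans. The 0-to-1 scale and the example threshold are assumptions, not values from the embodiment.

```kotlin
/**
 * Degree of conformity as the fraction of condition items a product satisfies.
 * The caller evaluates each yes/no item (e.g. aroma matches, price within budget)
 * and passes the results here.
 */
fun degreeOfConformity(itemResults: List<Boolean>): Double =
    if (itemResults.isEmpty()) 0.0
    else itemResults.count { it }.toDouble() / itemResults.size

fun meetsCondition(itemResults: List<Boolean>, threshold: Double = 0.5): Boolean =
    degreeOfConformity(itemResults) >= threshold

fun main() {
    // e.g. a whisky matching 4 of the 6 preference items
    val results = listOf(true, true, true, true, false, false)
    println(degreeOfConformity(results))   // 0.666...
    println(meetsCondition(results))       // true
}
```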
Based on the image data, the specifying means 121 also determines whether the user's camera work and the shooting conditions exceed predetermined values; when a product cannot be identified because of the user's camera work (for example, when the degree of camera shake caused by a sudden camera movement is outside an allowable range), it outputs a flag to that effect to the display control means 122.
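The embodiment does not say how the shake check is done; one simple, assumed way to flag a frame that is too blurred for identification is a gradient-energy sharpness score over a grayscale frame, as sketched below. The score formula and the threshold are illustrative only.

```kotlin
/**
 * Very rough blur check on a grayscale frame (values 0-255, row-major).
 * Mean squared horizontal/vertical pixel differences drop when the frame is
 * smeared by camera shake; below an (assumed) threshold, the terminal could
 * raise the "cannot identify, please move more slowly" flag.
 */
fun sharpnessScore(pixels: IntArray, width: Int, height: Int): Double {
    var energy = 0.0
    var count = 0
    for (y in 0 until height - 1) {
        for (x in 0 until width - 1) {
            val p = pixels[y * width + x]
            val dx = pixels[y * width + x + 1] - p
            val dy = pixels[(y + 1) * width + x] - p
            energy += (dx * dx + dy * dy).toDouble()
            count++
        }
    }
    return if (count == 0) 0.0 else energy / count
}

fun tooBlurryToIdentify(pixels: IntArray, width: Int, height: Int, threshold: Double = 50.0): Boolean =
    sharpnessScore(pixels, width, height) < threshold
```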
When the article identified by the specifying means 121 satisfies the condition, the display control means 122 processes the image of the article and displays it on the display means 130. Specifically, the image area to be processed and the content of the processing are determined based on the information supplied from the specifying means 121.
In a preferred embodiment, the processing is a process that makes the identified article stand out relative to the other products, for example an effect that imitates the product emitting light (shining brightly; hereinafter, 'light emission processing'). The processing may instead make the image areas of articles other than the identified article (for example, products that could be identified but whose degree of conformity with the condition is at or below a predetermined value) relatively inconspicuous, in other words reduce their visibility. Specifically, such processing of an image area includes blackening, blurring, and reducing the lightness or saturation (hereinafter, 'mask processing').
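As one concrete, assumed form of the mask processing, the brightness of the pixels inside a product's bounding box can simply be reduced. The dim factor and the ARGB pixel layout below are illustrative choices, not taken from the embodiment, and the box is assumed to lie within the image.

```kotlin
/**
 * Darken the pixels inside a rectangular region of an ARGB frame (row-major)
 * so that the corresponding article becomes relatively inconspicuous.
 */
fun maskRegion(pixels: IntArray, width: Int, left: Int, top: Int, right: Int, bottom: Int, dim: Double = 0.35) {
    for (y in top until bottom) {
        for (x in left until right) {
            val i = y * width + x
            val p = pixels[i]
            val a = p ushr 24 and 0xFF
            val r = ((p ushr 16 and 0xFF) * dim).toInt()
            val g = ((p ushr 8 and 0xFF) * dim).toInt()
            val b = ((p and 0xFF) * dim).toInt()
            pixels[i] = (a shl 24) or (r shl 16) or (g shl 8) or b
        }
    }
}
```

Blackening, blurring, or saturation reduction would follow the same pattern: a per-pixel transform applied only inside the regions reported by the specifying means.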
In a preferred embodiment, the display control means 122 may change the content of the effect processing according to the degree of conformity with the condition. Specifically, at least one of the color, the brightness (luminance), and the timing of the light emission may be changed according to the degree of conformity. For example, an image with higher brightness (for example, one that appears to shine dazzlingly in gold) is added to a product with a higher degree of conformity. Alternatively, when the effect draws an image imitating the blinking of a predetermined object, the blinking interval is set shorter the higher the degree of conformity is. Similarly, for mask processing, the degree to which the image of a product becomes inconspicuous may be varied according to the degree of conformity by controlling the brightness, saturation, lightness, edge blurring, or other aspects of the drawing.
By varying the light emission in this way, the user can intuitively grasp the degree of conformity with the condition.
The display control means 122 supplies to the display means 130 the image data obtained by applying the processing performed in this way to the image data acquired from the photographing means 110.
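A small mapping function makes the "brighter glow, faster blink for better matches; dimmer mask for worse ones" idea concrete. All thresholds and curves below are assumed example values.

```kotlin
/** Rendering decision for one product region; all numbers are illustrative. */
sealed class Rendering {
    data class Glow(val brightness: Float, val blinkIntervalMs: Long) : Rendering()
    data class Mask(val dimFactor: Float) : Rendering()
}

/**
 * Map the degree of conformity (0.0 to 1.0) to effect parameters: above the threshold,
 * a glow whose brightness rises and whose blink interval shrinks with the degree of
 * conformity; at or below it, a mask that is darker for lower conformity.
 */
fun renderingFor(conformity: Double, threshold: Double = 0.5): Rendering =
    if (conformity > threshold) {
        Rendering.Glow(
            brightness = (0.5 + 0.5 * conformity).toFloat(),        // 0.5 .. 1.0
            blinkIntervalMs = (1000 - 600 * conformity).toLong()    // shorter when higher
        )
    } else {
        Rendering.Mask(dimFactor = (0.3 + 0.4 * conformity).toFloat())
    }
```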
Further, the display control means 122 may generate image data in which information about the identified product is superimposed on the image photographed by the photographing means 110. In addition, when the display control means 122 receives from the specifying means 121 a flag indicating that a product cannot be identified because of the user's camera work or the like, it may generate image data in which a message to that effect (for example, the text 'The product cannot be identified because of large blurring. Please move the camera a little more slowly.') is superimposed. This encourages the user to perform appropriate camera operations.
FIG. 7 is an operation example of the user terminal 100. A user who visits a store starts the above-mentioned app. First, the position acquisition means 180 acquires position information (S501), and the store where the user is currently located is identified by referring to the inventory information stored in the storage means 150. As shown in FIG. 9, the user points the camera lens of the photographing means toward a display shelf in the store (here, a bar serving alcoholic beverages), performs a predetermined operation, and starts photographing. Thereafter, the user terminal 100 continuously acquires image data at a predetermined frame rate. As shown in FIG. 10, the acquired data is displayed on the display means 130 in real time (S502). When the user changes the direction of the user terminal 100 to change the shooting target, an image corresponding to the new shooting target is displayed.
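The store-identification step S501 amounts to a nearest-neighbour lookup over the inventory information of FIG. 6. The sketch below is one hedged way to do it; the data class, the flat-earth distance approximation, and the 200 m sanity bound are assumptions for illustration.

```kotlin
import kotlin.math.PI
import kotlin.math.cos
import kotlin.math.hypot

/** One entry of the inventory information (cf. FIG. 6); field names are illustrative. */
data class StoreInventory(
    val storeName: String,
    val lat: Double,
    val lon: Double,
    val handledGtins: Set<String>
)

/** Approximate distance in metres between two lat/lon points; adequate at store scale. */
fun metresBetween(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val dLat = (lat2 - lat1) * PI / 180
    val dLon = (lon2 - lon1) * PI / 180 * cos((lat1 + lat2) / 2 * PI / 180)
    return hypot(dLat, dLon) * 6_371_000.0   // Earth radius in metres
}

/** Pick the store whose registered position is closest to the acquired position. */
fun identifyStore(
    lat: Double,
    lon: Double,
    stores: List<StoreInventory>,
    maxMetres: Double = 200.0
): StoreInventory? =
    stores.minByOrNull { metresBetween(lat, lon, it.lat, it.lon) }
        ?.takeIf { metresBetween(lat, lon, it.lat, it.lon) <= maxMetres }
```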
The specifying means 121 sequentially analyzes the acquired image data and attempts to identify each photographed product (S503). At this time, when, for example, a product cannot be narrowed down to one but several candidates are found, the specifying means 121 may refer to the storage means 150 as necessary, identify from the acquired position information the store where the user currently is, and use the information on the products handled at that store to narrow the candidates down to one, or perform other processing that improves the accuracy of product identification. Note that some products may remain unidentified or be identified incorrectly.
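The candidate-narrowing part of S503 can be pictured as a simple set intersection. The helper below is hypothetical, with GTIN strings standing in for whatever identifiers the image analysis produces.

```kotlin
/**
 * Intersect the candidate product IDs from image analysis with the set of products
 * the identified store is known to handle; if exactly one candidate survives,
 * treat the product as identified. Returns null when zero or several remain.
 */
fun narrowDownCandidates(candidates: List<String>, handledAtStore: Set<String>?): String? {
    val inStore = if (handledAtStore == null) candidates
                  else candidates.filter { it in handledAtStore }
    return inStore.singleOrNull()
}
```

For example, if image analysis returns three candidate GTINs but the store's inventory contains only one of them, that one is taken as the identified product; if none or several remain, the product stays partially identified, which the embodiment explicitly allows.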
Next, the specifying means 121 determines, for each identified product, whether it meets the condition (S505). For each product whose degree of conformity is equal to or higher than a predetermined value (S506: YES), as shown in FIG. 11, the specifying means 121 determines a light emission mode corresponding to the degree of conformity, superimposes the light emission effect for that product on the screen (S507), and reads the information on that product from the storage means 150 and superimposes it as well (S509). In the example of FIG. 11, the product in product image OB1-1 is determined to satisfy the condition; as a result, the light emitting object OB4 and the product information window OB2-1 are displayed in the vicinity of product image OB1-1. In the product information window OB2-1, for example, a photograph of the product, a description, and a chart showing the characteristics of its flavor are displayed.
On the other hand, for each product whose degree of conformity is equal to or lower than the predetermined value (S506: NO), the image of that product is masked (S508). In the example of FIG. 11, the masking objects OB3-1, OB3-2, and OB3-3 are the result of mask processing applied to products whose degree of conformity was determined to be equal to or lower than the predetermined value.
The control unit 120 stores in the storage means 150 the products whose images were subjected to light emission processing or other conspicuous processing, and the products whose images were subjected to mask processing or other inconspicuous processing (S510).
When the user performs some operation on the touch screen 160 (S504: YES), the process proceeds to S550 in FIG. 8. When the operation is a long-press operation (S550: YES), the specifying means 121 identifies the product shown at the position of the long press, reads the information on the identified product from the storage means 150, and superimposes it on the photographed image (S551). FIG. 12 shows an example of the screen displayed at this time. In this example, when a long-press operation is performed on product image OB1-2, the product information window OB2-2 is displayed. By designating a photographed product in this way, the user can obtain information about a product that does not meet the condition but is of interest.
When the user performs a flick operation (for example, from the edge of the screen toward the center) (S552: YES), the guidance information window OB5 is superimposed and displayed as shown in FIG. 13 (S553). The information displayed in the guidance information window may include various kinds of guidance, for example information about the current store generated from the order history and the inventory information, information about products the user has ordered at this store in the past, a list of products registered as favorites (described later), links to websites providing further details about products, links to websites offering the products by mail order, and information for changing the condition. The guidance information window OB5 is closed by flicking in the opposite direction.
When the user performs a tap operation (S554: YES), the display control means 122 superimposes a pin image at the tapped position on the screen (S555). FIG. 14 is an example of the screen displayed at this time. In this example, when a tap operation is performed on product image OB1-3, the pin object OB6 is displayed. The display control means 122 then records in the storage means 150 that the product has been registered as a favorite product (S556). The process then returns to S502 of FIG. 7, and the processing of S502 to S510 (capturing the moving image, identifying the photographed products, and applying image processing according to whether the condition is met) is repeated. When a flick operation is performed on a pin object, the pin object disappears from the display.
According to the above embodiment, the user can easily find a product that suits his or her taste from among a large number of displayed products. Even if a product is of a brand the user does not know, it is visually indicated when it meets the condition, which lowers the hurdle to trying an unknown product. There is also no need to ask a clerk, for example, whether a product suits one's taste: the user only has to tell the clerk which product to order, decided on the basis of the products presented by the method of the present invention. This is convenient for users who do not want unnecessary conversation with the clerk, or who want to feel free to try brands they have never tried before. Meanwhile, the clerk does not need abundant knowledge to answer users' questions about every product handled and can, for example, concentrate on work other than providing product knowledge to customers.
Even when a displayed product is known to the user, there are situations in which it is difficult to visually recognize its presence because, for example, the label is small, the characters on the label are small, or the lighting is dim (for example, at a bar counter). Even in such situations, as long as the shooting conditions needed for image analysis (illuminance, resolution, zoom capability, and so on) are satisfied, the user only has to photograph the display shelf to be presented with products that suit his or her taste.
Even when there are too many displayed products to fit in the camera frame, the whole display can be searched exhaustively by panning, tilting, zooming, and otherwise changing the shooting range. When a product meeting the condition is detected, that fact is reflected on the screen in real time, so the user can quit the app as soon as one suitable product is found, or first photograph the entire display and, if several suitable products are detected, examine them before deciding what to buy. In doing so, the user can narrow down the products by referring to the differences in the light emission mode, which reflect the degree of conformity with the condition. Also, because products with a low degree of conformity (or that do not conform) are masked, it becomes easy to find the product the user wants even when many similar-looking products are displayed so densely that individual products are hard to make out. Note that 'display' does not necessarily mean only a state in which products are arranged in an orderly manner; in short, it is sufficient that a plurality of products are present at positions where they can be captured by the photographing means.
<Modification example>
 The above-described smartphone is one example of a user terminal; for example, smart glasses (an eyeglass-type terminal) using so-called augmented reality (AR) technology may also be used as the user terminal. In short, it suffices that the terminal has a function of superimposing an image of the user's surroundings (for example, a display shelf) and an image obtained by applying the above image processing to it, and presenting the result to the user. In this case, an existing input mechanism such as gaze detection or voice input may be used to specify a position on the screen.
 In the above embodiment the products to be searched were beverages, but they may equally be foodstuffs, daily necessities, or electrical appliances. In short, the present invention can be applied in any situation where searching for and selecting a product from among a large number of displayed products can be supported.
 The long-press, flick, and tap operations described above are merely examples of operations for inputting commands such as designating a product, instructing registration, and displaying the information window.
 The "light-emission processing" described above need not be something that everyone would perceive as the product actually emitting light; in short, any image processing that helps the user visually distinguish products that meet the conditions (or meet them to a high degree) from other products will do. In the above embodiment, a product that satisfies the conditions is given the light-emission processing and its product information is displayed, but only the light-emission processing may be performed and the display of the product information may be omitted.
 Alternatively, whether to perform only the light-emission processing or to perform both the light-emission processing and the display of product information may be selected according to the degree of conformity with the conditions. For example, three levels of conformity ("high", "medium", "low") are defined; when the conformity is "high", the light-emission processing and the display of product information are both performed, when it is "medium" only the light-emission processing is performed, and when it is "low" masking processing is performed.
 Alternatively, whether to display the product information may be decided according to the order history information. For example, when the order history information indicates that the user has ordered the product in the past, the user is likely to already know about that product, so the product information is not displayed regardless of the degree of conformity with the conditions.
 Alternatively, the user may set in advance whether the product information is displayed in addition to the light-emission processing. For example, an advanced mode and a beginner mode are provided and the user designates one of them beforehand; in the advanced mode the product information is not displayed regardless of the degree of conformity with the conditions, while in the beginner mode the light-emission processing and the display of product information are always performed as a set.
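 Purely as a sketch of the selection logic just described, the mapping from degree of conformity to display action could look like the following Python fragment. The level names and the order-history and mode overrides are taken from the examples above, while everything else (the Action enumeration, the function names) is a hypothetical illustration rather than the claimed implementation.

    # Sketch of the conformity-level -> display-action mapping described above.
    from enum import Enum, auto

    class Action(Enum):
        GLOW_AND_INFO = auto()   # light-emission effect plus product-information window
        GLOW_ONLY = auto()       # light-emission effect only
        MASK = auto()            # masking processing (reduced visibility)

    def choose_action(fit, already_ordered, advanced_mode):
        # fit is assumed to take the values "high", "medium", or "low"
        if fit == "low":
            return Action.MASK
        if fit == "medium":
            return Action.GLOW_ONLY
        # fit == "high": suppress the information window for previously ordered
        # products, or when the user has selected the advanced mode
        if already_ordered or advanced_mode:
            return Action.GLOW_ONLY
        return Action.GLOW_AND_INFO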
 Further, when notifying the user of the presence of a product that meets the conditions, the notification may be made not only by light emission or other image processing but also by vibrating the housing of the user terminal 100 and/or by outputting a predetermined sound.
 The product information and/or guidance information window displayed when the conditions are met, or when the user designates a product, need not be superimposed on the captured image; it may be displayed on a screen separate from the captured image or in a different display area of the same screen. For example, when the user terminal 100 physically has a plurality of display screens, the captured image and the product information are displayed on different screens. Alternatively, the product information may be displayed on a device other than the user terminal 100. For example, when the user carries a plurality of devices, only the captured image is displayed on the smartphone and only the product information is displayed on a wrist-worn device. Alternatively, the product information may be displayed on a device not owned by the user (for example, a screen installed in the store). In such cases, wireless communication is performed between the devices, and a display command for the product information is transmitted from the user terminal 100 to the other device.
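 Where the product information is handed off to another device in this way, the terminal only needs to transmit a display command over the wireless link. The sketch below assumes a plain JSON-over-TCP message with placeholder field names; the actual transport and message format are not specified by the embodiment.

    # Hypothetical sketch: user terminal 100 sends a product-information display
    # command to another device (e.g. a wrist-worn device or an in-store screen).
    import json
    import socket

    def send_display_command(host, port, product):
        message = json.dumps({
            "command": "display_product_info",   # assumed message format
            "product_id": product["id"],
            "name": product["name"],
            "description": product["description"],
        }).encode("utf-8")
        with socket.create_connection((host, port)) as conn:
            conn.sendall(message)                # receiving device renders the window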
 The user preference information may be entered by the user or generated automatically by the user terminal 100. Specifically, the specific means 121 automatically generates the preference information from the user's order history according to a predetermined algorithm. Information on the preferences and order histories of other users may also be used. For example, order histories and preference information may be collected for a plurality of users, the relationship between them learned by a technique such as machine learning to build a trained model, and the conditions determined by feeding the order history information into that model.
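 One concrete, purely illustrative way to realize such learning is a conventional supervised model trained on order histories collected from many users. The feature encoding and the use of scikit-learn's logistic regression in the following sketch are assumptions made for the sake of the example, not part of the embodiment.

    # Sketch: learn a preference model from many users' (order-history features, liked?)
    # pairs, then score candidate products for the current user. scikit-learn is assumed.
    from sklearn.linear_model import LogisticRegression

    def train_preference_model(feature_rows, liked_labels):
        # feature_rows: numeric encodings of order histories and product attributes
        # liked_labels: 1 if the product matched the user's preference, otherwise 0
        model = LogisticRegression(max_iter=1000)
        model.fit(feature_rows, liked_labels)
        return model

    def conformity_from_history(model, product_features):
        # probability that the product fits the user's taste, usable as the condition
        return model.predict_proba([product_features])[0][1]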
 Some of the functions of the user terminal 100 may be provided by another device (for example, a server connected to the user terminal 100 via a network). For example, all or part of the functions of the control unit 120 and the storage means 150 may be omitted from the user terminal 100. At a minimum, the user terminal 100 needs a photographing means, a display screen, an input means, and a communication means; the identification of the photographed products and/or the image processing may be delegated to an external device via the communication means, with the terminal receiving the processing result. In that case, the information needed to identify products and to decide the content of the image processing may be stored in an external storage device instead of the storage means 150 and acquired by the user terminal 100 as needed, or the user terminal 100 may query a search server for detailed information on a product or for the information needed to identify it, and acquire the search result.
 In short, it suffices that, in an information processing system consisting of one or more information processing apparatuses, the following are executed: a step of inputting conditions related to product selection, a step of acquiring product information, a step of acquiring an image taken by a user, a step of identifying the photographed products on the basis of the image and the product information, and a step of applying processing to the image of an identified product and displaying it when that product satisfies the conditions.
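 Under such a client-server division of roles, the terminal might upload the captured frame and receive the identification result over HTTP. The endpoint URL, request fields, and response layout in this sketch are placeholders rather than an API defined by the embodiment.

    # Hypothetical sketch of the server-assisted variant: the terminal uploads a frame
    # and receives identified products (with bounding boxes and conformity) from a server.
    import json
    import requests

    SERVER_URL = "https://example.com/identify"   # placeholder endpoint

    def identify_remotely(frame_jpeg_bytes, conditions):
        response = requests.post(
            SERVER_URL,
            files={"image": ("frame.jpg", frame_jpeg_bytes, "image/jpeg")},
            data={"conditions": json.dumps(conditions)},
        )
        response.raise_for_status()
        return response.json()["products"]        # assumed response field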
100: User terminal
110: Photographing means
120: Control unit
121: Specific means
122: Display control means
130: Display means
140: Input means
150: Storage means
160: Touch screen
170: Communication means
180: Position acquisition means

Claims (10)

  1.  A user terminal comprising:
      a photographing means;
      a display means for displaying an image photographed by the photographing means;
      a storage means for storing product information and conditions related to product selection;
      a specific means for identifying a photographed product based on the image photographed by the photographing means and the product information; and
      a display control means for, when the identified article satisfies the conditions, applying processing to an image of the article and displaying it on the display means.
  2.  The user terminal according to claim 1, wherein the processing is an effect process that imitates the identified article emitting light.
  3.  The user terminal according to claim 1 or 2, wherein the processing is a process that reduces the visibility of articles other than the identified article.
  4.  The user terminal according to claim 2, wherein at least one of the color, brightness, and timing of the light emission differs according to the degree of matching with the conditions.
  5.  The user terminal according to any one of claims 1 to 4, further comprising a means for acquiring position information of the user terminal itself, wherein the storage means further stores position information of stores and product information on products handled by each store, and the specific means identifies the photographed product further based on that product information.
  6.  The user terminal according to any one of claims 1 to 5, wherein the photographing means continuously captures images, and the specific means notifies the user when a product cannot be identified due to the user's camera work.
  7.  The user terminal according to any one of claims 1 to 6, wherein the processing is a process of displaying information about the identified product.
  8.  The user terminal according to any one of claims 1 to 7, further comprising an input means for accepting a designation of a position on the display screen by the user, wherein the specific means identifies the product shown at the designated position, and the display control means displays information on the identified product.
  9.  The user terminal according to any one of claims 1 to 8, wherein the conditions include a condition relating to the user's preferences.
  10.  A program for causing a computer to execute:
      a step of inputting conditions related to product selection;
      a step of acquiring product information;
      a step of acquiring an image taken by a user;
      a step of identifying a photographed product based on the image and the product information; and
      a step of, when the identified product satisfies the conditions, applying processing to an image of the product and displaying it.
PCT/JP2020/008864 2020-03-03 2020-03-03 User terminal and program WO2021176552A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/008864 WO2021176552A1 (en) 2020-03-03 2020-03-03 User terminal and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/008864 WO2021176552A1 (en) 2020-03-03 2020-03-03 User terminal and program

Publications (1)

Publication Number Publication Date
WO2021176552A1 true WO2021176552A1 (en) 2021-09-10

Family

ID=77614485

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/008864 WO2021176552A1 (en) 2020-03-03 2020-03-03 User terminal and program

Country Status (1)

Country Link
WO (1) WO2021176552A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011253324A (en) * 2010-06-01 2011-12-15 Sharp Corp Merchandise information providing terminal and merchandise information providing system
WO2015083450A1 (en) * 2013-12-06 2015-06-11 株式会社Nttドコモ Shopping support device and shopping support method
WO2017030177A1 (en) * 2015-08-20 2017-02-23 日本電気株式会社 Exhibition device, display control device and exhibition system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011253324A (en) * 2010-06-01 2011-12-15 Sharp Corp Merchandise information providing terminal and merchandise information providing system
WO2015083450A1 (en) * 2013-12-06 2015-06-11 株式会社Nttドコモ Shopping support device and shopping support method
WO2017030177A1 (en) * 2015-08-20 2017-02-23 日本電気株式会社 Exhibition device, display control device and exhibition system

Similar Documents

Publication Publication Date Title
US11367130B2 (en) Method for in-store object highlighting by a real world user interface
US11614803B2 (en) Individually interactive multi-view display system for non-stationary viewing locations and methods therefor
JP5280590B2 (en) Information processing system, information processing method, and program
US9418481B2 (en) Visual overlay for augmenting reality
US10026116B2 (en) Methods and devices for smart shopping
KR20200136377A (en) Customized augmented reality item filtering system
KR101620938B1 (en) A cloth product information management apparatus and A cloth product information management sever communicating to the appartus, a server recommending a product related the cloth, a A cloth product information providing method
CN112585667A (en) Intelligent platform counter display system and method
WO2018066102A1 (en) Information providing system, information providing device, information providing method, and program
US20110050900A1 (en) Image processing apparatus, wearable image processing apparatus, and method of controlling image processing apparatus
US9990665B1 (en) Interfaces for item search
US20160078056A1 (en) Data Processing Method and Data Processing System
JP6076304B2 (en) Article information providing apparatus, article information providing system, article information providing method, and article information providing program
US20170358135A1 (en) Augmenting the Half-Mirror to Display Additional Information in Retail Environments
JP2016038877A (en) Display system and display method
JP2023029520A (en) Display control device, control method, program, and storage medium
US11854068B2 (en) Frictionless inquiry processing
US9626804B2 (en) Article information providing apparatus that provides information of article, article information providing system,and article information provision method
KR20160149185A (en) Integrative image searching system and service method of the same
WO2021176552A1 (en) User terminal and program
WO2016033161A1 (en) Apparatus and method for smart photography
JP2009026112A (en) Commodity information provision system
KR101885669B1 (en) System for intelligent exhibition based on transparent display and method thereof
JP2022090355A (en) Information provision system
JP2021087029A (en) Position detection system, position detection device, and position detection method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20923090

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 11.01.2023)

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 20923090

Country of ref document: EP

Kind code of ref document: A1