WO2022201233A1 - Image processing device, image processing method, and program - Google Patents
Image processing device, image processing method, and program
- Publication number
- WO2022201233A1 (international application PCT/JP2021/011656)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- estimation
- image processing
- processing device
- product
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0201—Market modelling; Market analysis; Collecting market data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
Definitions
- the present invention relates to an image processing device, an image processing method, and a program.
- Japanese Patent Application Laid-Open No. 2002-200002 describes using image data extracted from a photographed image of an actual product or a catalog, or a moving image such as a television image, as a search key.
- Patent document 2 describes that images of products sourced from various advertising media such as magazines, leaflets, pamphlets, posters, TV commercials, advertising videos, and web advertisements are used as search keys.
- The inventor of the present invention considered that, in order to estimate the degree of influence that an actual product or such media have on consumers, it is necessary to estimate the shooting target when a person generates an image including a product.
- One example of an object of the present invention is to estimate the photographing target when a person generates an image including a product.
- According to one aspect of the present invention, there is provided an image processing apparatus comprising: image acquisition means for acquiring an image obtained by photographing a photographing target, the image including a product in a part of its area; estimation means for processing the image to generate target estimation data indicating an estimation result of the type of the photographing target; and output means for performing an output based on the target estimation data.
- According to another aspect of the present invention, there is provided an image processing method in which a computer performs: an image acquisition process for acquiring an image obtained by photographing a photographing target, the image including a product in a part of its area; an estimation process for processing the image to generate target estimation data indicating an estimation result of the type of the photographing target; and an output process for performing an output based on the target estimation data.
- According to still another aspect of the present invention, there is provided a program that causes a computer to perform: an image acquisition process for acquiring an image obtained by photographing a photographing target, the image including a product in a part of its area; an estimation process for processing the image to generate target estimation data indicating an estimation result of the type of the photographing target; and an output process for performing an output based on the target estimation data.
- According to the present invention, it is possible to estimate the photographing target when a person generates an image containing a product.
- FIG. 1 is a diagram for explaining a usage environment of an image processing apparatus according to an embodiment.
- FIG. 2 is a diagram showing an example of the functional configuration of the image processing apparatus.
- FIG. 3 is a diagram for explaining a first example of estimation processing performed by an estimation unit.
- FIG. 4 is a diagram for explaining a second example of estimation processing performed by the estimation unit.
- FIG. 5 is a diagram for explaining the second example of estimation processing performed by the estimation unit.
- FIG. 6 is a diagram for explaining a third example of estimation processing performed by the estimation unit.
- FIG. 7 is a diagram for explaining a fourth example of estimation processing performed by the estimation unit.
- FIG. 8 is a diagram showing an example of information stored in a human information storage unit.
- FIG. 9 is a diagram showing a first example of information output by an output unit.
- FIG. 10 is a diagram showing a second example of information output by the output unit.
- FIG. 11 is a diagram showing a hardware configuration example of the image processing apparatus.
- FIG. 12 is a flowchart showing an example of processing performed by the image processing apparatus together with processing performed by a terminal.
- FIG. 1 is a diagram for explaining the usage environment of the image processing device 10 according to the embodiment.
- the image processing device 10 is used together with the terminal 20 .
- the terminal 20 is, for example, a portable terminal such as a smartphone or a tablet terminal, and is operated by a person.
- the terminal 20 transmits the image to the image processing device 10 according to this operation.
- This image is generated by photographing an object to be photographed, and includes products in a part of the area.
- Examples of the object to be photographed include a product (the actual product) arranged in a real space such as a store, and a medium carrying an image of the product.
- Examples of such media include printed materials such as magazines and advertisements, a screen displayed on a display based on a broadcast, a screen displayed on a display based on an SNS (social networking service), and a screen displayed on a display based on e-mail.
- Printed materials also include advertisements placed around town.
- the image that the terminal 20 transmits to the image processing device 10 may be a still image or a moving image.
- When the image processing device 10 acquires an image from the terminal 20, it processes this image to generate data indicating the result of estimating the type of the shooting target (hereinafter referred to as target estimation data). The image processing device 10 then performs an output based on this target estimation data. This output may be the target estimation data itself. When there are multiple terminals 20 and multiple pieces of target estimation data are generated, the output unit 130 may output the result of statistically processing these pieces of target estimation data.
- The image processing device 10 also processes the image acquired from the terminal 20 to estimate the product included in the image. The image processing device 10 then executes at least part of the processing required for the user of the terminal 20 to purchase the product.
- the terminal 20 may have a shooting function. In this case, the terminal 20 may generate an image to be transmitted to the image processing device 10 .
- FIG. 2 is a diagram showing an example of the functional configuration of the image processing device 10.
- As shown in FIG. 2, the image processing device 10 includes an image acquisition unit 110, an estimation unit 120, and an output unit 130.
- the image acquisition unit 110 acquires an image from the terminal 20. This image contains products in some areas, as described above.
- the estimation unit 120 generates the above-described target estimation data by processing the image acquired by the image acquisition unit 110 .
- the estimation unit 120 generates target estimation data by processing the area around the product in the image.
- When the image is a moving image, the estimation unit 120 generates the target estimation data by processing a plurality of frame images included in the moving image. Details of the target estimation data generation processing will be described later with reference to other drawings.
- the estimation unit 120 processes the image acquired by the image acquisition unit 110 to estimate the product included in this image. At this time, the estimation unit 120 may further estimate products similar to this product (hereinafter referred to as similar products).
- This estimation result includes, for example, at least one of the product name and product code (for example, JAN code). Note that this estimation processing may be performed, for example, by feature amount matching, or may be performed using a model generated using machine learning.
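- The publication leaves the matching step open; the following is a minimal sketch of the feature-amount-matching option, assuming OpenCV ORB descriptors and a hypothetical reference catalogue (the catalogue contents, thresholds, and function names are illustrative assumptions, not taken from the publication).

```python
# Hedged sketch of product estimation by feature-amount matching (not the
# publication's implementation). CATALOGUE maps a JAN code to a reference image.
from typing import Optional

import cv2

CATALOGUE = {
    "4901234567894": "refs/example_product.jpg",  # placeholder entry
}

def estimate_product(query_path: str, min_matches: int = 25) -> Optional[str]:
    """Return the JAN code of the best-matching catalogue entry, or None."""
    orb = cv2.ORB_create()
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    query = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    _, query_desc = orb.detectAndCompute(query, None)

    best_code, best_count = None, 0
    for jan_code, ref_path in CATALOGUE.items():
        ref = cv2.imread(ref_path, cv2.IMREAD_GRAYSCALE)
        _, ref_desc = orb.detectAndCompute(ref, None)
        if query_desc is None or ref_desc is None:
            continue
        matches = matcher.match(query_desc, ref_desc)
        good = [m for m in matches if m.distance < 40]  # crude distance cut-off
        if len(good) > best_count:
            best_code, best_count = jan_code, len(good)
    return best_code if best_count >= min_matches else None
```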
- The estimation unit 120 also performs, as necessary, at least part of the processing required for the user of the terminal 20 to purchase this product and/or similar products.
- An example of this processing is to identify an online shop and/or a physical store where this product and/or similar products can be purchased, and to transmit information identifying this online shop and/or physical store (hereinafter referred to as purchase assistance information) to the terminal 20.
- the purchase assistance information may include the URL or link of the online shop, or may include information indicating the location of the physical store (eg, address and/or map). Further, the purchase assistance information may include advertisement information or coupon information regarding the product estimated by the estimation unit 120 and/or similar products.
- the estimation unit 120 uses at least one of the above-described estimation result, that is, the product name and the product code, when identifying the online shop and/or the physical store.
- the estimation unit 120 stores information indicating the estimation result in the human information storage unit 150 in association with the user of the terminal 20 . Details of the information stored in the human information storage unit 150 will be described later using other drawings.
- the human information storage unit 150 may be part of the image processing device 10 or may be located outside the image processing device 10 .
- The output unit 130 performs an output based on the target estimation data. As noted above, this output may be the target estimation data itself. When there are multiple terminals 20, the estimation unit 120 generates target estimation data for each of the multiple terminals 20. In this case, the output unit 130 may generate output data by statistically processing the plurality of pieces of target estimation data. An example of such output data is data indicating the relevance between the attribute information of the user of each of the plurality of terminals 20 and the target estimation data (hereinafter referred to as first relevance data). The user attribute information for each of the plurality of terminals 20 is stored in the human information storage unit 150. By using the first relevance data, it is possible to identify, for example, the photographing targets preferred by each age group (or gender). A specific example of the output performed by the output unit 130 will be described later using other drawings.
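- As one illustration of the "statistical processing" mentioned above, the first relevance data could be tabulated as a simple cross-table of user attributes against estimated shooting targets; the column names and sample rows below are assumptions for the sketch, not data from the publication.

```python
# Hedged sketch: building first relevance data with pandas.
import pandas as pd

records = pd.DataFrame([
    {"age_group": "20s", "gender": "F", "target": "SNS screen"},
    {"age_group": "20s", "gender": "F", "target": "TV broadcast"},
    {"age_group": "40s", "gender": "M", "target": "printed matter"},
])

# Rows: user attribute (age group); columns: estimated shooting target.
first_relevance = pd.crosstab(records["age_group"], records["target"])
print(first_relevance)
```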
- the image processing device 10 further includes a purchase result acquisition unit 140 .
- the purchase result acquisition unit 140 acquires information indicating whether or not the product included in the image acquired by the image acquisition unit 110 has been purchased (hereinafter referred to as purchase result information).
- The purchase result acquisition unit 140 acquires the purchase result information from, for example, the terminal 20, but may acquire it from another device (for example, a server that manages an online shop or a physical store). The information acquired by the purchase result acquisition unit 140 is stored in the human information storage unit 150.
- the output unit 130 outputs data indicating the relationship between the target estimation data and the purchase result information (hereinafter referred to as second relationship data). At this time, the output unit 130 uses information stored in the human information storage unit 150 . The user of the image processing apparatus 10 can recognize to what extent the photographed object is associated with the purchase by using the second relevance data.
- FIG. 3 is a diagram for explaining a first example of estimation processing performed by the estimation unit 120.
- In this example, the object to be photographed is a screen displayed on a display based on over-the-air or online broadcasting. This display may be used as a television or as digital signage.
- the user of the terminal 20 causes the terminal 20 to photograph the screen of this display.
- the terminal 20 may generate a still image or may generate a moving image.
- the image generated by the terminal 20 may include the frame of the display, or may include only the screen of the display without the frame.
- the estimating unit 120 determines that the imaging target is the display by detecting the frame of the display.
- the estimating unit 120 determines that the imaging target is the display when the image includes scanning lines specific to the display.
- When the image is a moving image, the estimation unit 120 can also process this moving image to identify the type of content displayed on the display (for example, an advertisement played in a TV broadcast or an advertisement shown on digital signage). The estimation unit 120 can also identify the imaging target in this way.
- Furthermore, the estimation unit 120 can detect a feature amount of the portion of the image outside the display and perform matching processing on this feature amount to determine the location of the display (for example, whether it is indoors or outdoors).
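- One hedged way to realize the display-frame heuristic described above is to look for a large quadrilateral contour (the bezel) in the image; the thresholds below are illustrative assumptions, not values from the publication.

```python
# Hedged sketch: detecting a display frame as a large four-sided contour.
import cv2
import numpy as np

def looks_like_display(image_bgr: np.ndarray, min_area_ratio: float = 0.2) -> bool:
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    height, width = gray.shape
    for contour in contours:
        approx = cv2.approxPolyDP(contour, 0.02 * cv2.arcLength(contour, True), True)
        if len(approx) == 4 and cv2.contourArea(approx) > min_area_ratio * height * width:
            return True  # a large quadrilateral suggests a display bezel
    return False
```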
- FIG. 4 and FIG. 5 are diagrams for explaining a second example of estimation processing performed by the estimation unit 120.
- In this example, the object to be photographed is the actual product.
- In this case, the terminal 20 preferably generates a moving image.
- The estimation unit 120 then determines that the object to be photographed is the actual product if at least a part of the surroundings of the product changes between frames, and determines that the object to be photographed is a printed matter such as a magazine if the surroundings of the product do not change.
- Note that the edge of the magazine or other printed matter may appear in the image generated by the terminal 20.
- When the estimation unit 120 detects this edge, it determines that the object to be photographed is a printed matter.
- In this case, the estimation unit 120 can also identify the place where the printed matter is located (for example, whether it is indoors or outdoors) by processing the portion of the image outside the printed matter.
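- The change-detection idea of this second example could be sketched as simple frame differencing outside the product region; the mask and threshold are assumptions, and a real system would also need the product's bounding box from the product-estimation step.

```python
# Hedged sketch: do the surroundings of the product change between frames?
import cv2
import numpy as np

def surroundings_changed(frames: list, product_box: tuple, threshold: float = 8.0) -> bool:
    """frames: list of BGR images; product_box: (x, y, w, h) of the product."""
    x, y, w, h = product_box
    diffs, prev = [], None
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        gray[y:y + h, x:x + w] = 0.0  # ignore the product region itself
        if prev is not None:
            diffs.append(float(np.abs(gray - prev).mean()))
        prev = gray
    # Changing surroundings -> likely the actual product; static -> likely printed matter.
    return bool(diffs) and max(diffs) > threshold
```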
- FIG. 6 is a diagram for explaining a third example of estimation processing performed by the estimation unit 120.
- In this example, the imaging target is a screen displayed on a display based on an SNS. In this case, the screen has a screen configuration unique to that SNS. By detecting the presence or absence of this screen configuration, the estimation unit 120 can identify that the imaging target is an SNS screen, and can also identify the service name of the SNS.
- FIG. 7 is a diagram for explaining a fourth example of estimation processing performed by the estimation unit 120.
- the object to be photographed is the screen displayed on the display based on the e-mail.
- This screen has a screen configuration unique to e-mail. By detecting the presence or absence of this screen configuration, the estimation unit 120 can identify that the subject to be photographed is an e-mail screen.
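- The "screen configuration" checks in the third and fourth examples could, as one hedged illustration, be approximated by template-matching characteristic UI elements (an SNS toolbar, a mail header bar); the template paths and score threshold below are assumptions.

```python
# Hedged sketch: classifying a photographed screen by matching known UI templates.
from typing import Optional

import cv2

UI_TEMPLATES = {
    "sns_service_a": "templates/sns_a_toolbar.png",  # placeholder template images
    "e-mail": "templates/mail_header.png",
}

def classify_screen(image_path: str, score_threshold: float = 0.8) -> Optional[str]:
    image = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    best_label, best_score = None, 0.0
    for label, template_path in UI_TEMPLATES.items():
        template = cv2.imread(template_path, cv2.IMREAD_GRAYSCALE)
        result = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, _ = cv2.minMaxLoc(result)  # best match score in the image
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= score_threshold else None
```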
- FIG. 8 is a diagram showing an example of information stored in the human information storage unit 150 of the image processing device 10.
- The human information storage unit 150 stores identification information assigned to each person (hereinafter referred to as person identification information), attribute information, and history information in association with each other.
- Attribute information includes the person's name, gender, and age, but may include other information.
- the history information includes the result of analysis by the estimation unit 120 of the image sent from the terminal 20 by the person.
- the analysis result of the estimating unit 120 includes the photographing target and the product name and/or product code.
- the history information also includes information indicating whether or not the product estimated by the estimation unit 120 has been purchased.
- the history information may further include information specifying the date and time of purchase of the product, and the online shop or physical store where the product was sold.
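- One possible shape for a record in the human information storage unit 150, mirroring the person identification information, attribute information, and history information of FIG. 8, is sketched below; the field names are assumptions, since the publication does not prescribe a schema.

```python
# Hedged sketch of a record in the human information storage unit.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class HistoryEntry:
    shooting_target: str            # e.g. "TV broadcast", "SNS screen", "printed matter"
    product_name: Optional[str]
    product_code: Optional[str]     # e.g. a JAN code
    purchased: Optional[bool] = None
    purchased_at: Optional[str] = None
    store: Optional[str] = None     # online shop URL or physical store name

@dataclass
class PersonRecord:
    person_id: str                  # person identification information
    name: str
    gender: str
    age: int
    history: List[HistoryEntry] = field(default_factory=list)
```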
- FIG. 9 is a diagram showing a first example of information output by the output unit 130 of the image processing device 10.
- the output unit 130 outputs first relevance data.
- the first relevance data indicates the relevance between the user attribute information of each of the plurality of terminals 20 and the target estimation data, as described above.
- the first relevance data indicates, by attribute, the number of times a photographing target is used for retrieval by a person having that attribute.
- the information shown in this figure may be by product name or product code, or may be by product category.
- the product category may be a large category such as clothes and food, or a small category such as coats and shirts.
- FIG. 10 is a diagram showing a second example of information output by the output unit 130 of the image processing device 10.
- the output unit 130 outputs the second relevance data.
- the second relevance data indicates the relevance between the target estimation data and the purchase result information, as described above.
- In the example shown in this figure, the second relevance data indicates, for each attribute, the number of times each type of photographing target was used by persons who went on to purchase the product.
- the information shown in this figure may also be sorted by product name or product code, or by product category.
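- As a hedged illustration, the second relevance data could be produced by filtering the stored history for purchased items and counting targets per attribute; the column names and sample rows are assumptions for the sketch.

```python
# Hedged sketch: building second relevance data with pandas.
import pandas as pd

history = pd.DataFrame([
    {"age_group": "20s", "target": "SNS screen", "purchased": True},
    {"age_group": "20s", "target": "TV broadcast", "purchased": False},
    {"age_group": "40s", "target": "printed matter", "purchased": True},
])

# Count, per attribute, which shooting targets were used by purchasers.
second_relevance = (
    history[history["purchased"]]
    .groupby(["age_group", "target"])
    .size()
    .unstack(fill_value=0)
)
print(second_relevance)
```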
- FIG. 11 is a diagram showing a hardware configuration example of the image processing apparatus 10.
- As shown in FIG. 11, the image processing apparatus 10 has a bus 1010, a processor 1020, a memory 1030, a storage device 1040, an input/output interface 1050, and a network interface 1060.
- the bus 1010 is a data transmission path for the processor 1020, the memory 1030, the storage device 1040, the input/output interface 1050, and the network interface 1060 to exchange data with each other.
- the method of connecting processors 1020 and the like to each other is not limited to bus connection.
- the processor 1020 is a processor realized by a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or the like.
- the memory 1030 is a main memory implemented by RAM (Random Access Memory) or the like.
- the storage device 1040 is an auxiliary storage device realized by a HDD (Hard Disk Drive), SSD (Solid State Drive), memory card, ROM (Read Only Memory), or the like.
- the storage device 1040 stores program modules that implement each function of the image processing apparatus 10 (for example, the image acquisition unit 110, the estimation unit 120, the output unit 130, and the purchase result acquisition unit 140). Each function corresponding to the program module is realized by the processor 1020 reading each program module into the memory 1030 and executing it.
- the storage device 1040 also functions as the human information storage unit 150 .
- the input/output interface 1050 is an interface for connecting the image processing apparatus 10 and various input/output devices.
- a network interface 1060 is an interface for connecting the image processing apparatus 10 to a network.
- This network is, for example, a LAN (Local Area Network) or a WAN (Wide Area Network).
- a method for connecting the network interface 1060 to the network may be a wireless connection or a wired connection.
- Image processing device 10 may communicate with terminal 20 via network interface 1060 .
- FIG. 12 is a flowchart showing an example of processing performed by the image processing device 10 together with processing performed by the terminal 20.
- When the user of the terminal 20 finds an item of interest, the user causes the terminal 20 to generate an image including the item (step S10).
- Examples of the object to be photographed by the terminal 20 are as explained above.
- the terminal 20 then transmits the captured image to the image processing device 10 .
- The terminal 20 also transmits the person identification information of the user to the image processing device 10 (step S20).
- the image acquisition unit 110 of the image processing device 10 acquires the image transmitted from the terminal 20.
- the estimation unit 120 then processes this image to estimate the imaging target (step S30).
- An example of the processing performed here is as described with reference to FIGS. 2 to 7.
- The estimation unit 120 also identifies products and/or similar products included in the image by processing this image (step S40). The estimation unit 120 then identifies an online shop and/or a physical store where this product and/or similar product can be purchased, and generates information on this online shop and/or physical store, that is, purchase assistance information (step S50). The estimation unit 120 then transmits this purchase assistance information to the terminal 20 (step S60).
- The estimation unit 120 also stores, in the human information storage unit 150, information indicating the imaging target estimated in step S30 and information indicating the product and/or similar product identified in step S40 (for example, at least one of a product name and a product code), in association with each other. At this time, the estimation unit 120 associates these pieces of information with the person identification information transmitted in step S20 (step S70).
- Upon acquiring the purchase assistance information from the image processing device 10, the terminal 20 displays this purchase assistance information on its display (step S80). The user of the terminal 20 uses this purchase assistance information when purchasing the product.
- The terminal 20 also generates information indicating whether or not the product has been purchased (hereinafter referred to as purchase result information) (step S90), and transmits this purchase result information to the image processing apparatus 10 together with the person identification information (step S100).
- the purchase result information includes information specifying the date and time of purchase and the online shop or physical store where the product was sold, as required.
- The purchase result acquisition unit 140 of the image processing device 10 stores the purchase result information transmitted from the terminal 20 as part of the history information in the human information storage unit 150. At this time, the purchase result acquisition unit 140 associates the purchase result information with the person identification information transmitted in step S100 (step S110).
- After that, the output unit 130 of the image processing device 10 generates and outputs output data at the necessary timing.
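- The server-side part of this flow (steps S30 to S70) could be wired together roughly as below; every helper here is a placeholder stub, since the publication does not specify an implementation, and the shop URL is a made-up example.

```python
# Hedged sketch of the image processing device 10 side of FIG. 12 (steps S30-S70).
from typing import Dict, List, Optional, Tuple

def estimate_shooting_target(image_path: str) -> str:
    return "printed matter"  # stub for the estimation unit 120 (step S30)

def estimate_product_code(image_path: str) -> Optional[str]:
    return None  # stub for product estimation (step S40)

def lookup_purchase_assistance(product_code: Optional[str]) -> Dict[str, Optional[str]]:
    # stub for identifying an online shop and/or physical store (step S50)
    return {"product_code": product_code, "shop_url": "https://shop.example/item"}

class HumanInformationStore:
    def __init__(self) -> None:
        self.records: Dict[str, List[Tuple[str, Optional[str]]]] = {}

    def append_history(self, person_id: str, target: str, code: Optional[str]) -> None:
        self.records.setdefault(person_id, []).append((target, code))  # step S70

def handle_image(person_id: str, image_path: str, store: HumanInformationStore) -> Dict:
    target = estimate_shooting_target(image_path)           # step S30
    product_code = estimate_product_code(image_path)        # step S40
    assistance = lookup_purchase_assistance(product_code)   # step S50
    store.append_history(person_id, target, product_code)   # step S70
    return assistance  # returned to the terminal 20 (step S60)
```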
- As described above, the image processing apparatus 10 estimates the shooting target when a person generates an image including a product. By using this estimation result, it is therefore possible to estimate the degree of influence that the actual product, or the medium providing the product image, has on consumer behavior. This degree of influence is indicated by, for example, the above-described first relevance data and second relevance data.
- 1. An image processing apparatus comprising: image acquisition means for acquiring an image obtained by photographing a photographing target, the image including a product in a part of its area; estimation means for generating, by processing the image, target estimation data indicating an estimation result of the type of the photographing target; and output means for performing an output based on the target estimation data.
- 2. The image processing apparatus according to 1 above, wherein the estimation means generates the target estimation data by processing a region of the image around the product.
- 3. The image processing apparatus according to 1 or 2 above, wherein the image acquisition means acquires a moving image as the image, and the estimation means generates the target estimation data by processing a plurality of frame images included in the moving image.
- 4. The image processing apparatus according to any one of 1 to 3 above, wherein the types of the photographing target include at least one of a screen displayed on a display based on a broadcast, a screen displayed on a display based on an SNS (social networking service), a screen displayed on a display based on e-mail, a printed matter, and the product arranged in a real space.
- 5. The image processing apparatus according to any one of 1 to 4 above, wherein the image is generated by a terminal, the image acquisition means acquires the images from a plurality of the terminals, the estimation means generates the target estimation data for each of the plurality of terminals, and the output means outputs first relevance data indicating relevance between attribute information of a user of each of the plurality of terminals and the target estimation data.
- 6. The image processing apparatus according to any one of 1 to 5 above, wherein the estimation means further estimates, by processing the image, the product and/or a similar product similar to the product.
- 7. The image processing apparatus according to 6 above, further comprising purchase result acquisition means for acquiring purchase result information indicating whether or not the product and/or the similar product has been purchased, wherein the output means outputs second relevance data indicating relevance between the target estimation data and the purchase result information.
- 8. An image processing method in which a computer performs: an image acquisition process for acquiring an image obtained by photographing a photographing target, the image including a product in a part of its area; an estimation process for generating, by processing the image, target estimation data indicating an estimation result of the type of the photographing target; and an output process for performing an output based on the target estimation data.
- 9. The image processing method according to 8 above, wherein, in the estimation process, the computer generates the target estimation data by processing a region of the image around the product.
- 10. The image processing method according to 8 or 9 above, wherein the computer acquires a moving image as the image in the image acquisition process, and generates the target estimation data in the estimation process by processing a plurality of frame images included in the moving image.
- 11. The image processing method according to any one of 8 to 10 above, wherein the types of the photographing target include at least one of a screen displayed on a display based on a broadcast, a screen displayed on a display based on an SNS (social networking service), a screen displayed on a display based on e-mail, a printed matter, and the product arranged in a real space.
- 12. The image processing method according to any one of 8 to 11 above, wherein the image is generated by a terminal, and the computer acquires the images from a plurality of the terminals in the image acquisition process, generates the target estimation data for each of the plurality of terminals in the estimation process, and outputs, in the output process, first relevance data indicating relevance between attribute information of a user of each of the plurality of terminals and the target estimation data.
- 13. The image processing method according to any one of 8 to 12 above, wherein, in the estimation process, the computer further estimates, by processing the image, the product and/or a similar product similar to the product.
- 14. The image processing method according to 13 above, wherein the computer further performs a purchase result acquisition process for acquiring purchase result information indicating whether or not the product and/or the similar product has been purchased, and outputs, in the output process, second relevance data indicating relevance between the target estimation data and the purchase result information.
- 15. A program for causing a computer to perform: an image acquisition process for acquiring an image obtained by photographing a photographing target, the image including a product in a part of its area; an estimation process for generating, by processing the image, target estimation data indicating an estimation result of the type of the photographing target; and an output process for performing an output based on the target estimation data.
- 16. The program according to 15 above, wherein the estimation function generates the target estimation data by processing a region of the image around the product.
- 17. The program according to 15 or 16 above, wherein the image acquisition function acquires a moving image as the image, and the estimation function generates the target estimation data by processing a plurality of frame images included in the moving image.
- 18. The program according to any one of 15 to 17 above, wherein the types of the photographing target include at least one of a screen displayed on a display based on a broadcast, a screen displayed on a display based on an SNS (social networking service), a screen displayed on a display based on e-mail, a printed matter, and the product arranged in a real space.
- 19. The program according to any one of 15 to 18 above, wherein the image is generated by a terminal, the image acquisition function acquires the images from a plurality of the terminals, the estimation function generates the target estimation data for each of the plurality of terminals, and the output function outputs first relevance data indicating relevance between attribute information of a user of each of the plurality of terminals and the target estimation data.
- 20. The program according to any one of 15 to 19 above, wherein the estimation function further estimates, by processing the image, the product and/or a similar product similar to the product.
- 21. The program according to 20 above, wherein the computer is further provided with a purchase result acquisition function for acquiring purchase result information indicating whether or not the product and/or the similar product has been purchased, and the output function outputs second relevance data indicating relevance between the target estimation data and the purchase result information.
- 10 image processing device, 20 terminal, 110 image acquisition unit, 120 estimation unit, 130 output unit, 140 purchase result acquisition unit, 150 human information storage unit
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Multimedia (AREA)
- Development Economics (AREA)
- General Physics & Mathematics (AREA)
- Accounting & Taxation (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Finance (AREA)
- Strategic Management (AREA)
- Entrepreneurship & Innovation (AREA)
- Signal Processing (AREA)
- Data Mining & Analysis (AREA)
- Economics (AREA)
- Marketing (AREA)
- General Business, Economics & Management (AREA)
- Game Theory and Decision Science (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
Description
According to one aspect of the present invention, there is provided an image processing apparatus comprising:
image acquisition means for acquiring an image obtained by photographing a photographing target, the image including a product in a part of its area;
estimation means for generating, by processing the image, target estimation data indicating an estimation result of the type of the photographing target; and
output means for performing an output based on the target estimation data.
There is also provided an image processing method in which a computer performs:
an image acquisition process for acquiring an image obtained by photographing a photographing target, the image including a product in a part of its area;
an estimation process for generating, by processing the image, target estimation data indicating an estimation result of the type of the photographing target; and
an output process for performing an output based on the target estimation data.
There is also provided a program that causes a computer to perform:
an image acquisition process for acquiring an image obtained by photographing a photographing target, the image including a product in a part of its area;
an estimation process for generating, by processing the image, target estimation data indicating an estimation result of the type of the photographing target; and
an output process for performing an output based on the target estimation data.
1. An image processing apparatus comprising: image acquisition means for acquiring an image obtained by photographing a photographing target, the image including a product in a part of its area; estimation means for generating, by processing the image, target estimation data indicating an estimation result of the type of the photographing target; and output means for performing an output based on the target estimation data.
2. The image processing apparatus according to 1 above, wherein the estimation means generates the target estimation data by processing a region of the image around the product.
3. The image processing apparatus according to 1 or 2 above, wherein the image acquisition means acquires a moving image as the image, and the estimation means generates the target estimation data by processing a plurality of frame images included in the moving image.
4. The image processing apparatus according to any one of 1 to 3 above, wherein the types of the photographing target include at least one of a screen displayed on a display based on a broadcast, a screen displayed on a display based on an SNS (social networking service), a screen displayed on a display based on e-mail, a printed matter, and the product arranged in a real space.
5. The image processing apparatus according to any one of 1 to 4 above, wherein the image is generated by a terminal, the image acquisition means acquires the images from a plurality of the terminals, the estimation means generates the target estimation data for each of the plurality of terminals, and the output means outputs first relevance data indicating relevance between attribute information of a user of each of the plurality of terminals and the target estimation data.
6. The image processing apparatus according to any one of 1 to 5 above, wherein the estimation means further estimates, by processing the image, the product and/or a similar product similar to the product.
7. The image processing apparatus according to 6 above, further comprising purchase result acquisition means for acquiring purchase result information indicating whether or not the product and/or the similar product has been purchased, wherein the output means outputs second relevance data indicating relevance between the target estimation data and the purchase result information.
8. An image processing method in which a computer performs: an image acquisition process for acquiring an image obtained by photographing a photographing target, the image including a product in a part of its area; an estimation process for generating, by processing the image, target estimation data indicating an estimation result of the type of the photographing target; and an output process for performing an output based on the target estimation data.
9. The image processing method according to 8 above, wherein, in the estimation process, the computer generates the target estimation data by processing a region of the image around the product.
10. The image processing method according to 8 or 9 above, wherein the computer acquires a moving image as the image in the image acquisition process, and generates the target estimation data in the estimation process by processing a plurality of frame images included in the moving image.
11. The image processing method according to any one of 8 to 10 above, wherein the types of the photographing target include at least one of a screen displayed on a display based on a broadcast, a screen displayed on a display based on an SNS (social networking service), a screen displayed on a display based on e-mail, a printed matter, and the product arranged in a real space.
12. The image processing method according to any one of 8 to 11 above, wherein the image is generated by a terminal, and the computer acquires the images from a plurality of the terminals in the image acquisition process, generates the target estimation data for each of the plurality of terminals in the estimation process, and outputs, in the output process, first relevance data indicating relevance between attribute information of a user of each of the plurality of terminals and the target estimation data.
13. The image processing method according to any one of 8 to 12 above, wherein, in the estimation process, the computer further estimates, by processing the image, the product and/or a similar product similar to the product.
14. The image processing method according to 13 above, wherein the computer further performs a purchase result acquisition process for acquiring purchase result information indicating whether or not the product and/or the similar product has been purchased, and outputs, in the output process, second relevance data indicating relevance between the target estimation data and the purchase result information.
15. A program for causing a computer to perform: an image acquisition process for acquiring an image obtained by photographing a photographing target, the image including a product in a part of its area; an estimation process for generating, by processing the image, target estimation data indicating an estimation result of the type of the photographing target; and an output process for performing an output based on the target estimation data.
16. The program according to 15 above, wherein the estimation function generates the target estimation data by processing a region of the image around the product.
17. The program according to 15 or 16 above, wherein the image acquisition function acquires a moving image as the image, and the estimation function generates the target estimation data by processing a plurality of frame images included in the moving image.
18. The program according to any one of 15 to 17 above, wherein the types of the photographing target include at least one of a screen displayed on a display based on a broadcast, a screen displayed on a display based on an SNS (social networking service), a screen displayed on a display based on e-mail, a printed matter, and the product arranged in a real space.
19. The program according to any one of 15 to 18 above, wherein the image is generated by a terminal, the image acquisition function acquires the images from a plurality of the terminals, the estimation function generates the target estimation data for each of the plurality of terminals, and the output function outputs first relevance data indicating relevance between attribute information of a user of each of the plurality of terminals and the target estimation data.
20. The program according to any one of 15 to 19 above, wherein the estimation function further estimates, by processing the image, the product and/or a similar product similar to the product.
21. The program according to 20 above, wherein the computer is further provided with a purchase result acquisition function for acquiring purchase result information indicating whether or not the product and/or the similar product has been purchased, and the output function outputs second relevance data indicating relevance between the target estimation data and the purchase result information.
20 terminal
110 image acquisition unit
120 estimation unit
130 output unit
140 purchase result acquisition unit
150 human information storage unit
Claims (9)
- An image processing apparatus comprising: image acquisition means for acquiring an image obtained by photographing a photographing target, the image including a product in a part of its area; estimation means for generating, by processing the image, target estimation data indicating an estimation result of the type of the photographing target; and output means for performing an output based on the target estimation data.
- The image processing apparatus according to claim 1, wherein the estimation means generates the target estimation data by processing a region of the image around the product.
- The image processing apparatus according to claim 1 or 2, wherein the image acquisition means acquires a moving image as the image, and the estimation means generates the target estimation data by processing a plurality of frame images included in the moving image.
- The image processing apparatus according to any one of claims 1 to 3, wherein the types of the photographing target include at least one of a screen displayed on a display based on a broadcast, a screen displayed on a display based on an SNS (social networking service), a screen displayed on a display based on e-mail, a printed matter, and the product arranged in a real space.
- The image processing apparatus according to any one of claims 1 to 4, wherein the image is generated by a terminal, the image acquisition means acquires the images from a plurality of the terminals, the estimation means generates the target estimation data for each of the plurality of terminals, and the output means outputs first relevance data indicating relevance between attribute information of a user of each of the plurality of terminals and the target estimation data.
- The image processing apparatus according to any one of claims 1 to 5, wherein the estimation means further estimates, by processing the image, the product and/or a similar product similar to the product.
- The image processing apparatus according to claim 6, further comprising purchase result acquisition means for acquiring purchase result information indicating whether or not the product and/or the similar product has been purchased, wherein the output means outputs second relevance data indicating relevance between the target estimation data and the purchase result information.
- An image processing method in which a computer performs: an image acquisition process for acquiring an image obtained by photographing a photographing target, the image including a product in a part of its area; an estimation process for generating, by processing the image, target estimation data indicating an estimation result of the type of the photographing target; and an output process for performing an output based on the target estimation data.
- A program for causing a computer to perform: an image acquisition process for acquiring an image obtained by photographing a photographing target, the image including a product in a part of its area; an estimation process for generating, by processing the image, target estimation data indicating an estimation result of the type of the photographing target; and an output process for performing an output based on the target estimation data.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/011656 WO2022201233A1 (ja) | 2021-03-22 | 2021-03-22 | Image processing device, image processing method, and program |
US18/282,845 US20240303990A1 (en) | 2021-03-22 | 2021-03-22 | Image processing apparatus, image processing method, and non-transitory computer-readable medium |
JP2023508155A JPWO2022201233A1 (ja) | 2021-03-22 | 2021-03-22 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/011656 WO2022201233A1 (ja) | 2021-03-22 | 2021-03-22 | Image processing device, image processing method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022201233A1 true WO2022201233A1 (ja) | 2022-09-29 |
Family
ID=83395330
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/011656 WO2022201233A1 (ja) | 2021-03-22 | 2021-03-22 | 画像処理装置、画像処理方法、及びプログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US20240303990A1 (ja) |
JP (1) | JPWO2022201233A1 (ja) |
WO (1) | WO2022201233A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006085392A (ja) * | 2004-09-15 | 2006-03-30 | Xing Inc | Television shopping method, television shopping system, terminal device, and central device |
JP2014197253A (ja) * | 2013-03-29 | 2014-10-16 | 株式会社日本総合研究所 | Mail-order sales processing device, mail-order sales processing method, and mail-order sales processing program |
US10650442B2 (en) * | 2012-01-13 | 2020-05-12 | Amro SHIHADAH | Systems and methods for presentation and analysis of media content |
-
2021
- 2021-03-22 JP JP2023508155A patent/JPWO2022201233A1/ja active Pending
- 2021-03-22 WO PCT/JP2021/011656 patent/WO2022201233A1/ja active Application Filing
- 2021-03-22 US US18/282,845 patent/US20240303990A1/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006085392A (ja) * | 2004-09-15 | 2006-03-30 | Xing Inc | Television shopping method, television shopping system, terminal device, and central device |
US10650442B2 (en) * | 2012-01-13 | 2020-05-12 | Amro SHIHADAH | Systems and methods for presentation and analysis of media content |
JP2014197253A (ja) * | 2013-03-29 | 2014-10-16 | 株式会社日本総合研究所 | Mail-order sales processing device, mail-order sales processing method, and mail-order sales processing program |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022201233A1 (ja) | 2022-09-29 |
US20240303990A1 (en) | 2024-09-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Taherdoost et al. | Marketing vs E-marketing | |
US20200265463A1 (en) | Digital Media Environment for Analysis of Components of Digital Content | |
KR101710733B1 (ko) | 객체 식별을 이용하는 타깃화된 광고 | |
CN103518215A (zh) | 用于以跨设备上下文输入为基础的电视观众验证的系统和方法 | |
JP5236762B2 (ja) | 広告表示装置、広告表示方法、広告表示プログラム、及びそのプログラムを記憶するコンピュータ読取可能な記録媒体 | |
CN107220848B (zh) | 一种广告展示方法和装置 | |
JP2016218821A (ja) | 販売情報利用装置、販売情報利用方法、およびプログラム | |
WO2016095638A1 (zh) | 在用户设备及网络设备处用于提供直达服务的方法及装置 | |
CN112396456A (zh) | 广告推送方法、装置、存储介质以及终端 | |
JP2020086658A (ja) | 情報処理装置、情報処理システム、情報処理方法および類否判断方法 | |
JP2016009416A (ja) | 販売促進システム、販売促進管理装置および販売促進管理プログラム | |
JP7110738B2 (ja) | 情報処理装置、プログラム及び情報処理システム | |
CN104954826B (zh) | 多媒体文件的生成方法及装置 | |
CN112418994B (zh) | 商品的导购方法、装置、电子设备及存储介质 | |
JP5767413B1 (ja) | 情報処理システム、情報処理方法、および情報処理プログラム | |
WO2022201233A1 (ja) | 画像処理装置、画像処理方法、及びプログラム | |
US10915938B2 (en) | Including instructions upon item procurement | |
US20230114462A1 (en) | Selective presentation of an augmented reality element in an augmented reality user interface | |
JP6905615B1 (ja) | 拡張現実システム | |
JP7355035B2 (ja) | 処理装置、処理方法及びプログラム | |
CN114842115A (zh) | 生成商品主图的方法及电子设备 | |
JP6829391B2 (ja) | 情報処理装置、情報配信方法、及び情報配信プログラム | |
JP2018147137A (ja) | 画像生成プログラム、画像生成装置、画像生成方法、広告編集プログラム、広告編集装置及び広告編集方法 | |
Rane et al. | Digital food menu application for restaurants based on augmented reality | |
JP7488484B2 (ja) | 情報処理装置、情報配信方法、及び情報配信プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21932847 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18282845 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023508155 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21932847 Country of ref document: EP Kind code of ref document: A1 |