WO2019235820A1 - Display apparatus and control method therefor - Google Patents

Display apparatus and control method therefor

Info

Publication number
WO2019235820A1
Authority
WO
WIPO (PCT)
Prior art keywords
item
image
area
processor
product
Prior art date
Application number
PCT/KR2019/006743
Other languages
English (en)
Korean (ko)
Inventor
안우람
함철희
송가영
신희원
이신욱
이태영
Original Assignee
삼성전자(주)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자(주)
Publication of WO2019235820A1 publication Critical patent/WO2019235820A1/fr

Classifications

    • H ELECTRICITY → H04 ELECTRIC COMMUNICATION TECHNIQUE → H04N PICTORIAL COMMUNICATION, e.g. TELEVISION → H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD] → H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/432 Content retrieval operation from a local storage medium, e.g. hard-disk
    • H04N21/4722 End-user interface for requesting additional data associated with the content

Definitions

  • the present invention relates to a display apparatus and a control method thereof, and more particularly, to a display apparatus for searching for an item included in image content and a control method thereof.
  • An object of the present invention is to provide a display apparatus, and a control method thereof, which acquire additional information related to an item when searching for the item in an image, thereby increasing the accuracy of the search.
  • Another object of the present invention is to provide a display apparatus and a control method which provide a search function without prior work, such as generating information on an item, when searching for the item in an image.
  • Another object of the present invention is to provide a display apparatus and a control method which provide a differentiated user experience (UX) to a user watching an image.
  • according to an aspect of the present invention, the above objects may be achieved by a display apparatus comprising: a signal receiving unit which receives a signal of an image composed of a plurality of frames; a signal processor which processes the signal; a display unit; and a processor which detects a first area of a first item to be searched for in the image and a second area of a second item related to the first item, searches for the first item of the detected first area based on information of the second item of the detected second area, and controls a search result of the searched first item to be displayed on the display unit.
  • here, the processor may transmit the information of the first item of the detected first area and of the second item of the detected second area to a server, and receive a search result of the first item from the server.
  • the first item may include a product image displayed in the image.
  • the second item may include at least one of a person image or a background image displayed on the image including the first item.
  • the information of the second item may be generated based on information on at least one of the person image or the background image related to the product image.
  • the product image, the person image, or the background image may include a plurality of different images of the corresponding item displayed on each of the plurality of frames.
  • the processor may extract a key frame from the plurality of frames and detect the first region and the second region from the extracted key frame.
  • the apparatus may further include a user input unit which receives a user input, and the processor may select a frame section comprising a part of the plurality of frames based on the received user input, and detect the first area and the second area from the selected frame section.
  • the apparatus may further include a user input unit which receives a user input, and the processor may display a plurality of first items detected in the image on the display unit in the form of a UI, and control a search result of a first item selected from among the plurality of first items based on the received user input to be displayed on the display unit.
  • according to another aspect of the present invention, a computer program product may be achieved, comprising: a memory in which a plurality of instructions are stored; and a processor, wherein the instructions, when executed by the processor, detect, in a received image composed of a plurality of frames, a first area of a first item and a second area of a second item related to the first item, search for the first item of the detected first area based on information of the second item of the detected second area, and display a search result of the searched first item.
  • the instructions may include transmitting the information of the first item of the detected first area and of the second item of the detected second area to a server, and receiving a search result of the first item from the server.
  • according to another aspect of the present invention, a control method of a display apparatus comprises: receiving a signal of an image composed of a plurality of frames; detecting a first area of a first item to be searched for in the image and a second area of a second item related to the first item; searching for the first item of the detected first area based on information of the second item of the detected second area; and displaying a search result of the searched first item.
  • the method includes transmitting information of the detected first item of the first area and the second item of the second area to a server;
  • the method may further include receiving a search result of the first item from the server.
  • the first item may include a product image displayed in the image.
  • the second item may include at least one of a person image or a background image displayed on the image including the first item.
  • the information of the second item may be generated based on information on at least one of the person image or the background image related to the product image.
  • the product image, the person image, or the background image may include a plurality of different images of the corresponding item displayed on each of the plurality of frames.
  • the detecting may include extracting a key frame of the plurality of frames;
  • the method may include detecting the first region and the second region from the extracted key frame.
  • the method includes selecting a frame section of a portion of the plurality of frames based on a user input;
  • the method may include detecting the first region and the second region from the selected frame section.
  • the method includes displaying a plurality of the first items detected in the image in the form of a UI;
  • the method may include displaying a search result of the first item selected from the plurality of first items based on a user input.
  • according to the present invention, by acquiring additional information related to an item when searching for the item in an image, the accuracy of the search can be improved.
  • in addition, when searching for an item in an image, a search function can be provided without prior work such as generating information on the item.
  • FIG. 1 is an example of providing a search result by using a first item and a second item detected from a plurality of frames of an image according to an embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating a configuration of a display apparatus according to an embodiment of the present invention.
  • FIG. 3 is a flowchart illustrating a control method of a display apparatus according to an embodiment of the present invention.
  • FIG. 4 is an example of a flow of an operation of providing a search result of a product in an image according to an embodiment of the present invention.
  • FIG. 5 is an example of providing a search result by using information on goods and persons detected from a plurality of frames of an image according to an embodiment of the present invention.
  • FIG. 6 is an example of providing a UI for selecting a frame section for searching for a product in an image according to an embodiment of the present invention.
  • FIG. 7 is an example of acquiring a plurality of images of a product detected from a plurality of frames according to an embodiment of the present invention.
  • FIG. 8 is an example of acquiring additional information related to a product by detecting a person from a frame including a product according to an embodiment of the present invention.
  • FIG. 9 is an example of acquiring additional information related to a product by detecting background items from a frame including the product according to an embodiment of the present invention.
  • FIG. 10 is an example of searching for a similar product image by using a plurality of images of the product to be searched for according to an embodiment of the present invention.
  • FIG. 11 is an example of a formula for calculating similarity when searching for a product in an image according to an embodiment of the present invention.
  • FIG. 12 illustrates an example of search results of products whose similarity is equal to or greater than a threshold according to an embodiment of the present invention.
  • the display apparatus 10 of the present invention provides a service for searching for an item such as a fashion product in an image.
  • the display apparatus 10 detects, from a plurality of frames 5 of an image, a first area 1 corresponding to a first item 3 and a second area 2 corresponding to a second item 4 related to the first item 3.
  • the display apparatus 10 searches for the first item 3 of the first area 1 using the information about the second item 4 of the second area 2 detected as described above, and provides the search result for the first item 3 on the display unit 13.
  • to search for the first item 3, the display apparatus 10 may transmit the information of the first item 3 of the first area 1 and of the second item 4 of the second area 2, detected from the plurality of frames 5, to the server (see reference numeral 20 in FIG. 2) and receive the search result from the server 20. Accordingly, the detection of an item in an image may be performed by the display apparatus 10 while the search for the detected item is performed by the server, thereby reducing the system load and increasing the search speed.
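  • The division of labor above (detection on the apparatus, search on the server) can be sketched as the request payload the apparatus might transmit. This is a minimal illustration in Python; the field names and JSON encoding are assumptions, since the patent only states that the information of the first and second items is sent to the server.

```python
import json

def build_search_request(product_crop_id, product_features, person_info, background_info):
    """Assemble the payload the display apparatus sends to the search server.

    The structure below is hypothetical; the patent specifies only that the
    information of the first item (product) and of the second item
    (person/background) is transmitted together.
    """
    return json.dumps({
        "first_item": {            # product detected in the first area (1)
            "crop_id": product_crop_id,
            "features": product_features,
        },
        "second_item": {           # related person/background from the second area (2)
            "person": person_info,
            "background": background_info,
        },
    })

request = build_search_request(
    "frame17_box2",
    {"type": "sunglasses", "color": "black"},
    {"age": "30s", "gender": "female"},
    {"place": "beach"},
)
payload = json.loads(request)
```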
  • the display apparatus 10 searches for the first item 3 using additional information extracted by analyzing the information of the first item 3 and of the second item 4 related to the first item 3.
  • the additional information may be, for example, information about the features and type of a product, the features of a person, or the season and place related to the background.
  • the acquisition of the additional information may be performed by either the display apparatus 10 or the server 20.
  • FIG. 2 is a block diagram illustrating a configuration of the display apparatus illustrated in FIG. 1.
  • the display apparatus 10 of the present invention includes a signal receiving unit 11, a signal processing unit 12, a display unit 13, a communication unit 14, a processor 15, and a user input unit 16.
  • the display apparatus 10 may be implemented as, for example, a smart TV, a smartphone, a tablet, or a PC.
  • the configuration of the display apparatus 10 of the present invention is not limited to the above-described embodiment; it may be implemented without some of these components or with additional components.
  • the display apparatus 10 may communicate with the server 20 through the communication unit 14.
  • the server 20 may be provided as a computing device, for example, a PC, which stores a database or may perform a search function by accessing an external database.
  • the communication unit 14 communicates with the server 20 using a wired or wireless communication method.
  • for example, the communication unit 14 may communicate with the server 20 through a wired communication method such as Ethernet, or through a wireless router using a wireless communication method such as Wi-Fi or Bluetooth.
  • the communication unit 14 may be provided as a printed circuit board (PCB) including a wireless communication module such as Wi-Fi.
  • the communication method of the communication unit 14 is not limited thereto, and may communicate with the external device 19 using another communication method.
  • the signal receiver 11 receives a broadcast signal or an image signal from the outside, and the image signal may include, for example, a video or graphic signal.
  • the signal receiver 11 may be provided in various formats according to the standard of the broadcast signal and the video signal to be received and the implementation form of the display apparatus 10.
  • the signal receiver 11 may be implemented as a tuner for receiving a radio frequency (RF) broadcast signal or a satellite signal transmitted from a broadcasting station.
  • the signal processor 12 performs preset signal processing on the broadcast signal or the video signal received by the signal receiver 11. Examples of the signal processing performed by the signal processor 12 include demultiplexing, decoding, de-interlacing, scaling, noise reduction, and detail enhancement, and the kinds thereof are not limited.
  • the signal processor 12 may be implemented as an SoC (System-on-Chip) integrating these various functions, or as an image processing board equipped with individual components capable of independently performing each process.
  • the display unit 13 displays an image based on the broadcast signal or the image signal processed by the signal processor 12.
  • the display method of the display unit 13 is not limited, and the display unit 13 may be implemented in various forms such as a plasma display panel (PDP), a liquid crystal display (LCD), organic light-emitting diodes (OLED), or a flexible display.
  • the user input unit 16 receives a user input for controlling at least one function of the display apparatus 10.
  • the user input unit 16 may receive a user input for selecting a part of a user interface displayed on the display unit 13.
  • the user input unit 16 may be implemented in the form of an input panel provided outside the display apparatus 10 or as a remote controller communicating with the display apparatus 10 in an infrared manner.
  • the user input unit 16 may be implemented as a keyboard, a mouse, or the like connected to the display apparatus 10, or may be implemented as a touch screen provided in the display apparatus 10.
  • the user input unit 16 may receive a user input from a mobile device (not shown) that communicates with the display apparatus 10 through Wi-Fi, Bluetooth, or Infrared communication.
  • the mobile device may be provided as, for example, a smartphone on which a remote-control app is installed; by executing the app and touching buttons for controlling the operation of the display apparatus 10, the mobile device transmits a user input to the display apparatus 10.
  • the storage unit 17 includes a first memory (not shown) and a second memory (not shown).
  • the first memory may be implemented as a nonvolatile memory, such as a flash memory, to retain data regardless of whether system power is supplied to the display apparatus 10.
  • the first memory stores a plurality of instructions of at least one executable program.
  • the first memory allows read, write, edit, delete, update, etc. to be performed on each of the plurality of stored instructions.
  • the second memory is a high speed buffer memory provided between the first memory and the processor 15 and is referred to as cache memory or local memory.
  • the second memory is faster than the flash memory and can be directly accessed by the processor 15.
  • the second memory is an area in which a plurality of instructions of data or programs frequently accessed by the processor 15 are stored for immediate use without repetitive retrieval, and may be implemented as, for example, RAM.
  • the second memory may be integrally provided inside the processor 15, for example.
  • the processor 15 performs a control process for controlling a plurality of functions that the display apparatus 10 can perform.
  • the processor 15 may be implemented as a central processing unit (CPU) and includes a control area, an arithmetic area, and registers.
  • the control region interprets a plurality of instructions stored in the first memory and instructs the operation of each configuration of the display apparatus 10 according to the meaning of the interpreted command.
  • the arithmetic area performs arithmetic operations and logical operations, and performs operations required for each component of the display apparatus 10 to operate according to the instructions of the control area.
  • the register is a storage place for storing information necessary while executing a plurality of instructions in the CPU.
  • the register stores instructions and data for each configuration of the display apparatus 10 and stores the calculated result.
  • the processor 15 executes a plurality of instructions of at least one program stored in the first memory and the second memory, for example, an operating system of the display apparatus 10, a security program such as DRM or CAS, and a client program.
  • the processor 15 may download and execute instructions stored in a separate computer program product (not shown).
  • a computer program product includes a memory in which instructions are stored and a processor.
  • the instructions, when executed by the processor 15, detect, in an image composed of a plurality of frames received by the signal receiver 11, the first area 1 of the first item 3 to be searched for and the second area 2 of the second item 4 related to the first item 3, search for the first item 3 of the detected first area 1 based on the information of the second item 4 of the detected second area 2, and display the search result of the searched first item 3.
  • the operation of the processor 15 may be shown in the flowchart of FIG. 3.
  • the processor 15 receives, through the signal receiver 11, a signal of an image composed of a plurality of frames 5.
  • the processor 15 detects the first area 1 of the first item 3, which is the search target in the image, and the second area 2 of the second item 4 related to the first item 3.
  • the first item 3 may include a product image displayed on each of the plurality of frames 5.
  • the product image may be, for example, an image related to a fashion product such as a bag or sunglasses.
  • the second item 4 may include at least one of a person image and a background image displayed in a frame including the first item 3.
  • the person image corresponds to, for example, an image including a face of a person related to the product image in a frame in which the product image is detected.
  • the background image includes, for example, an image related to an item such as a surrounding background or an object representing a season and a place in a frame in which a product image is detected.
  • in order to search for a product in the image, the product may be searched for more accurately by using the characteristics of the person and the background information displayed in the frame including the product.
  • information about a person and a background detected from a corresponding frame may be combined to narrow the search range of the product.
  • the product image, the person image, or the background image may include a plurality of different images of the corresponding item displayed in each of the plurality of frames. Accordingly, in order to search for a product in an image, the accuracy of the search may be improved by using images of various angles obtained from various frames with respect to the product, the person, and the background.
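  • One simple way to exploit several views of the same item is to pool their feature vectors; the mean pooling below is an assumed aggregation strategy, as the text states only that images from multiple frames and angles are used to improve search accuracy.

```python
def aggregate_frame_features(per_frame_vectors):
    """Mean-pool feature vectors of the same item detected in several frames.

    Mean pooling is an assumed aggregation; the patent does not specify how
    the multiple per-frame images are combined.
    """
    n = len(per_frame_vectors)
    dim = len(per_frame_vectors[0])
    return [sum(vec[i] for vec in per_frame_vectors) / n for i in range(dim)]

# three views of the same product, e.g. from three different frames
views = [[0.9, 0.1, 0.0], [0.7, 0.3, 0.0], [0.8, 0.2, 0.0]]
pooled = aggregate_frame_features(views)   # ≈ [0.8, 0.2, 0.0]
```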
  • the processor 15 detects the first area 1 of the first item 3 in the image; the first area 1 may be detected automatically, for example through recognition by a convolutional neural network (CNN) algorithm, or may be selected by a user input.
  • for example, the processor 15 may display an icon for selecting a product image on the display unit 13 while an image is displayed, and when the icon is dragged onto a partial region of the image according to a user input, select the product image corresponding to that region.
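  • The drag-to-select interaction above can be reduced to a point-in-box test against the regions the detector has already found. A sketch (the detection itself, e.g. by a CNN, is out of scope here, so the boxes are assumed given):

```python
def select_region_by_drag(detected_boxes, drop_point):
    """Return the detected region (x1, y1, x2, y2) that contains the point
    where the user dropped the selection icon, or None if no region matches."""
    x, y = drop_point
    for box in detected_boxes:
        x1, y1, x2, y2 = box
        if x1 <= x <= x2 and y1 <= y <= y2:
            return box
    return None

boxes = [(10, 10, 50, 50), (60, 20, 120, 90)]    # e.g. detector output
picked = select_region_by_drag(boxes, (70, 30))  # → (60, 20, 120, 90)
```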
  • the processor 15 searches for the first item 3 of the detected first area 1 based on the information of the second item 4 of the detected second area 2.
  • the information of the second item 4 may be generated based on information about at least one of a person image or a background image related to the product image.
  • the processor 15 uses the additional information extracted by analyzing the information of the first item 3 and the second item 4 to search for the first item 3.
  • for example, when the 'top' 83 is detected as a product image in the image 81, the processor 15 detects the position of the person 1 81 related to the 'top' 83.
  • person 1 81 corresponds to, for example, an image including a face.
  • the processor 15 may extract additional information 811 about age and gender, such as 'Age: 20s' and 'Gender: Male', from the features of the person 1 81 at the detected position. From this, the additional information 811 may be used to search for a product that is the same as or similar to the 'top' 83 in the category of 'male in twenties'.
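  • The narrowing step above can be sketched as building a search category from the extracted person attributes. The decade labels and returned fields below are illustrative assumptions; the text only gives the example 'Age: 20s', 'Gender: Male' → 'male in twenties'.

```python
def narrowed_query(product_type, age_decade, gender):
    """Build a narrowed search query such as 'top' within 'male in twenties'.

    The attribute extraction itself (from the face image) is assumed to be
    done by a separate classifier; only the narrowing is sketched here.
    """
    decades = {20: "twenties", 30: "thirties", 40: "forties"}
    category = f"{gender} in {decades[age_decade]}"
    return {"product": product_type, "category": category}

query = narrowed_query("top", 20, "male")   # → category 'male in twenties'
```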
  • as another example, when a 'jumper' 91 is detected as a product image, the processor 15 detects 'snowy mountain' 92, 'lift' 93, and the like as background items related to the 'jumper' 91. At this time, the processor 15 may extract additional information regarding the place and season, such as 'place: ski resort' and 'season: winter', by combining the detected item types. From this, the additional information may be used to search for a product that is the same as or similar to the 'jumper' 91 in the category of 'ski suit'.
  • similarly, the processor 15 may detect 'golf club' 951, 'golf field' 953, and 'golf ball' 954 as background items related to the 'shirt' 952, and extract additional information such as 'place: golf course' therefrom. From this, the additional information may be used to search for a product that is the same as or similar to the 'shirt' 952 in the category of 'golf clothing'.
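  • Combining detected background item types into place/season information can be modelled as simple rules. The rule table below is an assumption built from the two examples in the text (ski resort/winter and golf course); a real implementation would likely use a larger learned or curated mapping.

```python
SCENE_RULES = [
    # (required background items, inferred place, inferred season)
    ({"snowy mountain", "lift"}, "ski resort", "winter"),
    ({"golf club", "golf field", "golf ball"}, "golf course", None),
]

def infer_scene(detected_items):
    """Combine detected background item types into place/season information."""
    detected = {item.lower() for item in detected_items}
    for required, place, season in SCENE_RULES:
        if required <= detected:              # all required items were seen
            info = {"place": place}
            if season is not None:
                info["season"] = season
            return info
    return {}

info = infer_scene(["Snowy Mountain", "Lift"])
# → {'place': 'ski resort', 'season': 'winter'}
```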
  • the processor 15 displays the search result of the searched first item 3.
  • the search result may include information about a product having a high similarity with the searched product image by using additional information extracted from the information of the first item 3 and the second item 4.
  • the search result may include a UI that displays a link to an ordering site to facilitate the purchase of the searched product.
  • accordingly, a search function may be provided without prior work such as generating information about the item.
  • FIG. 4 is an example of a flow of an operation of providing a search result of a product in an image according to an embodiment of the present invention.
  • the illustrated example details the operation flow of the processor 15 of FIG. 3 for the case of searching for a fashion product in an image; each step corresponds to an operation of FIG. 3 as follows.
  • operation S41 corresponds to operation S31 of FIG. 3.
  • operations S42, S431, S441, and S451 correspond to operation S32 of FIG. 3.
  • operations S432, S433, S442, S443, S452, S453, and S46 correspond to operation S33 of FIG. 3.
  • operation S47 corresponds to operation S34 of FIG. 3.
  • the processor 15 plays back the video selected by the user.
  • a representative key frame is extracted from the plurality of frames of the video being played or frames are extracted at specific intervals.
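  • Of the two strategies just mentioned, sampling frames at fixed intervals is straightforward to sketch; representative key-frame selection would need a shot-change detector, which is not described here, so only the interval strategy is illustrated.

```python
def frames_at_interval(total_frames, fps, interval_seconds):
    """Indices of the frames to extract when sampling every
    `interval_seconds` of playback (the fixed-interval strategy)."""
    step = int(fps * interval_seconds)
    return list(range(0, total_frames, step))

# 10 s of 30 fps video sampled every 2 s → frames 0, 60, 120, 180, 240
indices = frames_at_interval(300, 30, 2)
```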
  • the processor 15 detects the position of the fashion product in the extracted frame.
  • the processor 15 detects the position of the person in the extracted frame.
  • the processor 15 detects the position of an item in the background of the extracted frame. Accordingly, by selecting frames representative of the plurality of frames, it is possible to obtain the product to be searched for and additional information related to the product.
  • the detection of the positions of the person and of the background items may be performed on the same frame in which the fashion product is detected, or on a plurality of frames in which the same fashion product appears.
  • the processor 15 extracts the feature of the product and the type of the product by analyzing the image of the fashion product at the position detected in operation S431.
  • operations S431 and S432 are performed N times on the N extracted frames; accordingly, in operation S433, the processor 15 obtains N pieces of information about the features and type of the same product.
  • the processor 15 extracts additional information by analyzing the person image at the position detected in operation S441. Accordingly, in operation S443, the processor 15 obtains additional information related to the person, for example, gender, age group, race, body type, country, or region.
  • the processor 15 extracts additional information by analyzing the background image at the position detected in operation S451. Accordingly, in operation S453, the processor 15 acquires additional information related to the background, such as the season or place.
  • the processor 15 searches for the fashion product based on the product types and features, the facial features, and the background information acquired in operations S433, S443, and S453.
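  • The combination of operations S433, S443, and S453 can be sketched as merging the three information sources into one search query. The field names below are assumptions; the patent only says the three kinds of information are used together for the search.

```python
def build_query(product, person, background):
    """Merge the product information (S433) with the person (S443) and
    background (S453) additional information into one search query."""
    query = {"type": product["type"], "features": product["features"]}
    filters = {}
    filters.update(person)       # e.g. gender, age group
    filters.update(background)   # e.g. place, season
    query["filters"] = filters
    return query

search_query = build_query(
    {"type": "jumper", "features": ["padded", "red"]},
    {"gender": "male"},
    {"place": "ski resort", "season": "winter"},
)
```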
  • the processor 15 sorts the search results of the fashion product by similarity and displays them on the display unit 13.
  • the search result may include information about a product that is the same as or similar to the fashion product to be searched, and may include a UI related to the order such as a link to an order site for purchasing the searched product.
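  • The similarity-based sorting and thresholding (cf. the results of FIG. 12, which keeps products at or above a similarity level) might look as follows. Cosine similarity is an assumed metric here; the patent's own formula is given only in FIG. 11 and is not reproduced in this text.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (assumed metric)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def rank_results(query_vec, candidates, threshold=0.5):
    """Sort candidate products by similarity to the searched product's
    feature vector, keeping only those at or above the threshold."""
    scored = [(cosine_similarity(query_vec, vec), name) for name, vec in candidates]
    kept = [(s, n) for s, n in scored if s >= threshold]
    return sorted(kept, reverse=True)

results = rank_results(
    [1.0, 0.0],
    [("A", [1.0, 0.0]), ("B", [0.0, 1.0]), ("C", [1.0, 1.0])],
)
# "B" falls below the threshold; "A" ranks above "C"
```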
  • FIG. 5 is an example of providing a search result by using information on goods and persons detected from a plurality of frames of an image according to an embodiment of the present invention.
  • the illustrated example illustrates a case where information about 'person' corresponding to the second item is used to search for 'product' corresponding to the first item in operations S32 and S33 of FIG. 3.
  • the processor 15 detects the positions of the products 511, 521, and 531 and of the persons 512, 522, and 532 from each of the plurality of frames 501, 502, and 503 of the image being played.
  • the processor 15 displays, on the display unit 13, a UI including images of the plurality of products 511, 521, and 531 detected as described above as a candidate to be searched together with the image.
  • the processor 15 extracts the features and type of the product from the image of the product 2 521, and extracts, for example, facial features from the image of the person 2 522 detected in the second frame 502 including the product 2 521.
  • the processor 15 searches for the product 2 521 by using the extracted information on the features and type of the product 2 521 and the facial features of the person 2 522. For example, when the product 2 521 is 'sunglasses' and the facial features of the person 2 522 indicate 'women in their 30s', the search for sunglasses may be performed in the search category of women in their 30s.
  • the processor 15 displays, as the search result 53 of the product 2 521, information about products that are the same as or similar to the product 2 521. An order UI 54 for facilitating the purchase of a displayed product may be displayed together.
  • the processor 15 may transmit the image of the selected product 2 521 and the information about the facial features of the person 2 522 to the server 20, so that the search for the product 2 521 may be performed by the server 20.
  • the processor 15 may then receive, from the server 20, and display the search result 53 of the product 2 521 searched for using the information on the facial features of the person 2 522.
• the processor 15 may display a list of products having a high similarity with the product 2 521 as the search result 53 of the product 2 521. In this case, when one of the products in the searched list is selected, the processor 15 may display a scene of the related video. Accordingly, the product search results conveniently allow the user to look up the related image again.
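The narrowing of a product search by an associated person's attributes, as in the 'sunglasses' / 'women in their 30s' example above, can be sketched roughly as follows. This is an illustrative sketch, not the patent's actual implementation; all function and field names here are hypothetical.

```python
# Hypothetical sketch: combine a detected product's features with the
# attributes of the person detected in the same frame, so the product
# search is restricted to a matching category.

def build_search_query(product_type, product_features, person_attributes):
    """Combine product info with person attributes into one search query."""
    query = {
        "type": product_type,          # e.g. "sunglasses"
        "features": product_features,  # e.g. ["black", "round frame"]
        # Person attributes act as a category filter, so results are
        # restricted to products targeted at, e.g., women in their 30s.
        "category_filter": person_attributes,
    }
    return query

query = build_search_query(
    "sunglasses",
    ["black", "round frame"],
    {"gender": "female", "age_group": "30s"},
)
```

The query could then be sent to the server 20, which would perform the actual lookup; the exact query format used by the server is not specified in the text.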
  • FIG. 6 is an example of providing a UI for selecting a frame section for searching for a product in an image according to an embodiment of the present invention.
  • the illustrated example illustrates a case where a section for performing a product search is selected while receiving an image of a plurality of frames in operation S31 of FIG. 3.
• the processor 15 stores a part of the image 60 being reproduced in the storage unit 17.
  • the size of the frame of the image 60 to be stored may be set to the minimum size required for the product search.
• the processor 15 may store the image frames from a point in time 60 seconds earlier up to the current point in time.
  • the processor 15 may execute a predetermined application to search for a product in the stored past image, or select a function of displaying the past image by using the remote control 61.
• the processor 15 displays, at the bottom of the image, a UI 62 for setting a predetermined time interval, together with the past image.
• the processor 15 selects, according to a user input through the UI 62, a section of the past image in which, for example, a fashion product of interest exists, and makes it possible to search for fashion products within the selected section. Accordingly, a user may directly set the frame section in which products are searched for in the image. In addition, a list of products detected from each frame while viewing the image may be provided together with the image.
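The buffering of the most recent portion of the image and the selection of a section through the UI 62 can be sketched as below. This is a minimal illustration assuming a fixed frame rate; the 60-second window follows the description above, and everything else (frame rate, function names) is hypothetical.

```python
# Hypothetical sketch: keep only the last 60 seconds of frames, then let a
# user-selected time interval pick a section of the buffered past image.
from collections import deque

FPS = 30
WINDOW_SECONDS = 60

# A deque with maxlen automatically discards frames older than 60 seconds.
frame_buffer = deque(maxlen=FPS * WINDOW_SECONDS)

def on_new_frame(frame):
    frame_buffer.append(frame)

def select_section(start_s, end_s):
    """Return buffered frames for a time interval, in seconds measured
    from the start of the buffer (i.e. within the stored past image)."""
    frames = list(frame_buffer)
    return frames[start_s * FPS:end_s * FPS]

for i in range(FPS * 90):          # simulate 90 seconds of video
    on_new_frame(i)
section = select_section(10, 12)   # a 2-second section chosen via the UI
```

Bounding the buffer this way also matches the note above that only the minimum frame size required for the product search needs to be stored.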
  • FIG. 7 is an example of acquiring a plurality of images of a product detected from a plurality of frames according to an embodiment of the present invention.
  • the illustrated example illustrates a case where a plurality of product images are obtained by detecting a position of a 'product' corresponding to the first item from each of the plurality of frames in operation S32 of FIG. 3.
  • the processor 15 detects a plurality of merchandise 71, 72, 73 from the image frame 70.
  • the processor 15 uses an object detection algorithm based on, for example, a convolutional neural network (CNN) for product detection in the image.
  • the convolutional neural network has a structure suitable for learning two-dimensional data and can be trained through a backpropagation algorithm.
  • the convolutional neural network is used for object detection, object classification, and the like in an image, and in particular, an object may be detected by extracting and processing a feature from data composed of a multidimensional array such as a color image.
• the processor 15 detects each of the plurality of products 71, 72, and 73 once per frame by using the convolutional neural network; for example, over 20 frames, each product is detected 20 times.
  • the processor 15 may acquire a plurality of different images 711, 721, and 731 for each of the plurality of products 71, 72, and 73.
• the plurality of shoe images 731 include images of the shoes 73 at various angles, detected from each of the 20 frames.
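The per-frame detection and accumulation of multiple views of each product, as described for FIG. 7, can be sketched as follows. The patent uses a CNN-based detector; here a stub detector stands in for it, and all names are hypothetical. The point of the sketch is only the grouping logic: 20 frames yield 20 differently-angled crops per product.

```python
# Hypothetical sketch: run a detector on every frame and group the detected
# crops by product identity, so each product ends up with one image per frame.

def detect_products(frame):
    """Stand-in for the CNN detector: returns (product_id, crop) pairs."""
    return [(pid, f"crop_of_{pid}_in_frame_{frame}")
            for pid in ("bag", "hat", "shoes")]

def collect_product_images(frames):
    images_per_product = {}
    for frame in frames:
        for product_id, crop in detect_products(frame):
            # Each frame contributes one more view (angle) of each product.
            images_per_product.setdefault(product_id, []).append(crop)
    return images_per_product

images = collect_product_images(range(20))   # 20 frames, as in the example
```

In the real system the detector would also have to associate detections across frames (tracking), which this sketch sidesteps by assuming stable product identifiers.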
  • FIG. 10 is an example of searching for a similar product image by using a plurality of images of the product to be searched according to an embodiment of the present invention.
  • a product is searched using a plurality of query images 711, 721, and 731 of each detected product 71, 72, and 73.
  • the query image includes different images of the same product detected from a plurality of frames.
  • the processor 15 performs a product search using the plurality of query images 101 detected in the image.
  • the processor 15 obtains K query images 101 to be used as a query for product search.
• the processor 15 calculates the similarity between the K query images 101 and the M product images stored for each of C product types in a database or server, that is, C * M product images 102 in total, and searches for the product having the highest matching rate with the K query images 101.
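The brute-force matching loop just described can be sketched as below: a database holds M images for each of C product types (C * M candidates in total), and every candidate is scored against the K query images. The per-pair scoring function here is only a placeholder, and all names are hypothetical; the similarity measure itself is discussed with FIG. 11.

```python
# Hypothetical sketch of the C-types x M-images x K-queries comparison.

def score(query_image, product_image):
    """Placeholder per-pair score; higher means more similar."""
    return 1.0 if query_image == product_image else 0.0

def best_match(query_images, database):
    """database: {product_type: [M product images]} -> (type, image, score)."""
    best = (None, None, -1.0)
    for product_type, images in database.items():   # C product types
        for image in images:                        # M images per type
            s = max(score(q, image) for q in query_images)  # best of K queries
            if s > best[2]:
                best = (product_type, image, s)
    return best

db = {"sunglasses": ["sg_a", "sg_b"], "bag": ["bag_a", "bag_b"]}
result = best_match(["sg_b", "sg_c"], db)
```

With C types, M images per type, and K queries, this performs C * M * K comparisons, which is why the text bounds both the query set and the stored images.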
  • FIG. 11 is an example of a formula for calculating similarity when searching for a product in an image according to an embodiment of the present invention.
  • the illustrated example shows a similarity calculation formula for finding a product having a high matching rate with K query images 101 detected in the image among the C * M product images 102 in FIG. 10.
• the processor 15 calculates the similarity between the K query images 101 and the C * M product images 102 by computing, in an n-dimensional space of n features, the distance 111 between each query image 101 and each product image 102.
  • the processor 15 calculates the distance 111 for each of the K query images 101 and obtains the similarity 112 as the maximum value of the calculated K distances 111.
• the processor 15 obtains a similarity 112 for each of the C * M product images 102 and calculates the final similarity 113 by various schemes, such as the sum, average, or maximum of the calculated C * M similarities 112.
• the processor 15 may treat product images whose final similarity 113 is 50% or more as matched images 123. Accordingly, the processor 15 may display, as a search result, the products having a final similarity of 50% or more, such as 55%, 60%, and 65%, among the 15 product images 122 compared against the four query images 101.
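The similarity calculation of FIG. 11 can be sketched under stated assumptions: features live in an n-dimensional space, a per-pair similarity is derived from Euclidean distance (the exact distance-to-similarity mapping is not given in the text, so 1 / (1 + d) is an illustrative choice), the per-image similarity takes the best value over the K query images, and images scoring 50% or more count as matches. All names below are hypothetical.

```python
# Hypothetical sketch of the FIG. 11 matching step.
import math

def distance(a, b):
    """Euclidean distance between two n-dimensional feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def similarity(query, product):
    # Illustrative distance-to-similarity mapping, in (0, 1].
    return 1.0 / (1.0 + distance(query, product))

def final_similarity(query_images, product_image):
    # Best value over the K query images, per the description of FIG. 11.
    return max(similarity(q, product_image) for q in query_images)

def matches(query_images, product_images, threshold=0.5):
    # Product images whose final similarity reaches 50% count as matched.
    return [p for p in product_images
            if final_similarity(query_images, p) >= threshold]

queries = [(0.0, 0.0), (1.0, 1.0)]    # K = 2 query feature vectors
products = [(0.1, 0.0), (5.0, 5.0)]   # candidate product feature vectors
matched = matches(queries, products)
```

Aggregating the per-image similarities across all C * M candidates by sum, average, or maximum, as the text mentions, would be a straightforward extension of `final_similarity`.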

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a display device comprising: a signal receiving unit for receiving a signal of an image composed of a plurality of frames; a signal processing unit for processing the signal; a display unit; and a processor configured to detect, in the image, a first area for a first item and a second area for a second item related to the first item, the items being search targets, to search for the first item of the detected first area on the basis of information on the second item of the detected second area, and to display a search result of the searched first item on the display unit.
PCT/KR2019/006743 2018-06-04 2019-06-04 Display device and control method therefor WO2019235820A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020180064398A KR20190138168A (ko) 2018-06-04 2019-06-04 Display apparatus and control method thereof
KR10-2018-0064398 2018-06-04

Publications (1)

Publication Number Publication Date
WO2019235820A1 true WO2019235820A1 (fr) 2019-12-12

Family

ID=68769752

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/006743 WO2019235820A1 (fr) Display device and control method therefor

Country Status (2)

Country Link
KR (1) KR20190138168A (fr)
WO (1) WO2019235820A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8649572B2 (en) * 2005-05-09 2014-02-11 Google Inc. System and method for enabling the use of captured images through recognition
KR20150044507A (ko) * 2013-10-16 2015-04-27 Industry-University Cooperation Foundation Hanyang University ERICA Smart display having a person/product search module
KR20160031226A (ko) * 2014-09-12 2016-03-22 Samsung SDS Co., Ltd. Method for searching for object-related information in a video, and video playback apparatus
KR101660803B1 (ko) * 2016-05-03 2016-09-28 IP Lab Co., Ltd. Apparatus and method for providing information
US20170289643A1 (en) * 2016-03-31 2017-10-05 Valeria Kachkova Method of displaying advertising during a video pause
US20180041807A1 (en) * 2015-02-17 2018-02-08 Jong Park Interaction system and interaction method thereof


Also Published As

Publication number Publication date
KR20190138168A (ko) 2019-12-12

Similar Documents

Publication Publication Date Title
US10643264B2 (en) Method and computer readable medium for presentation of content items synchronized with media display
WO2018174637A1 (fr) Real-time shopping method using video recognition in broadcasting, and smart device in which an application for implementing same is installed
WO2016024806A1 (fr) Method and apparatus for providing image contents
WO2016013914A1 (fr) Method, apparatus, system and computer program for providing and displaying product information
US9384408B2 (en) Image analysis system and method using image recognition and text search
WO2018128298A1 (fr) Electronic apparatus and method for controlling the same
WO2018043990A1 (fr) Method, device and computer program for providing image search information
WO2015056883A1 (fr) Content summary server, content providing system, and method for summarizing content
WO2018151507A1 (fr) Display device and method, and advertisement server
WO2016013915A1 (fr) Method, apparatus and computer program for displaying search information
WO2017142143A1 (fr) Method and apparatus for providing summary information of a video
WO2014175520A1 (fr) Display apparatus for providing recommendation information and method thereof
WO2019231138A1 (fr) Image display device and operation method thereof
WO2019216554A1 (fr) Electronic apparatus and control method thereof
WO2021085812A1 (fr) Electronic apparatus and method for controlling same
WO2021132922A1 (fr) Computing device and operating method therefor
CN114025242A (zh) Video processing method, video processing apparatus and electronic device
WO2015102248A1 (fr) Display apparatus and channel map management method thereof
WO2018043923A1 (fr) Display device and control method therefor
WO2021137507A1 (fr) Display apparatus and control method thereof
WO2012118259A1 (fr) System and method for providing image-based video service
WO2019235820A1 (fr) Display device and control method therefor
WO2015060685A1 (fr) Electronic device and method for providing advertisement data by the electronic device
WO2016036049A1 (fr) Computer program, method, system and device for providing search service
WO2020111844A2 (fr) Method and apparatus for enhancing image feature points in visual SLAM by using an object label

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19815848

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19815848

Country of ref document: EP

Kind code of ref document: A1