KR20150033448A - Method for searching object using a wearable device and device for searching object - Google Patents


Info

Publication number
KR20150033448A
KR20150033448A
Authority
KR
South Korea
Prior art keywords
data
information
character
recognizing
page
Prior art date
Application number
KR20130113483A
Other languages
Korean (ko)
Inventor
조시연
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. (삼성전자주식회사)
Priority to KR20130113483A
Publication of KR20150033448A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/002: Specific input/output arrangements not covered by G06F 3/01 - G06F 3/16
    • G06F 3/005: Input arrangements through a video camera
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304: Detection arrangements using opto-electronic means
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0483: Interaction with page-structured environments, e.g. book metaphor
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method for searching for an object using a wearable device comprises: recognizing the object using a received external signal; determining whether the object is included in received data based on a comparison between the data and the object; and outputting information related to the object based on the determination result. The retrieved information can also be easily transmitted to another external device.

Description

A method for searching for an object using a wearable device, and a device for searching for an object {Method for searching object using a wearable device and device for searching object}

A method for searching an object using a wearable device, and a device for searching for an object.

BACKGROUND ART With the development of information and communication technology, devices capable of performing new functions in addition to their conventional functions have been developed. For example, the mobile phone, which once handled only communication between physically separated users, has evolved into the smart phone, which can perform various functions including information search, and the functions a smart phone can perform continue to be developed and gradually expanded.

On the other hand, for a user to find desired information in an offline document or book, the user has had to use a separately provided device. Accordingly, techniques are being developed for building an information-search function into an accessory the user already wears, so that the user does not need a separate device.

A method for searching for an object using a wearable device, and a device for searching for an object, are provided. Another object of the present invention is to provide a computer-readable recording medium storing a program for causing a computer to execute the method. The technical problems to be solved are not limited to those described above, and other technical problems may exist.

A method for searching an object using a wearable device according to an exemplary embodiment includes: recognizing the object using a received external signal; Determining whether the object is included in the data based on a result of comparison between the received data and the object; And outputting information on the object based on the determination result.

A computer-readable recording medium according to another embodiment includes a recording medium on which a program for causing a computer to execute the above-described method is recorded.

According to another aspect of the present invention, there is provided a wearable device capable of searching for an object, the wearable device comprising: a recognition unit for recognizing the object using a received external signal; a determination unit for determining whether the object is included in the data based on a result of a comparison between the received data and the object; and an output unit for outputting information on the object based on the determination result.

As described above, it is possible to easily obtain information to be searched for within a large amount of data (e.g., an offline document or book). In addition, retrieved information can be easily transmitted to another external device. Also, information can be retrieved based on the user's voice or gesture.

FIG. 1 is a block diagram illustrating an example of an information search system according to an embodiment.
FIG. 2 is a diagram illustrating an example in which a device according to one embodiment receives data.
FIG. 3 is a diagram illustrating an example of outputting information on an object according to an embodiment.
FIG. 4 is a flow diagram illustrating an example of a method by which a device according to one embodiment retrieves information.
FIG. 5 is a block diagram showing an example of a device according to an embodiment.
FIG. 6 is a flow diagram illustrating another example of a method by which a device according to one embodiment retrieves information.

Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. The following examples are intended only to illustrate the present invention and do not limit or restrict its scope. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

FIG. 1 is a block diagram illustrating an example of an information search system according to an embodiment.

Referring to FIG. 1, an information retrieval system includes a device 110 and an external device 120 connected with the device 110 through a network 130. Here, the network 130 includes, but is not limited to, the Internet, a LAN (Local Area Network), a wireless LAN, a WAN (Wide Area Network), or any other type of network capable of transmitting and receiving information or data. In the following description, the terms "communication," "communication network," and "network" may be used interchangeably; all three refer to a wired or wireless local-area or wide-area data transmission and reception network through which the device 110 and the external device 120 can exchange information.

The external device 120 is connected to the device 110 through the network 130. The external device 120 may correspond to any device capable of transmitting and receiving signals with the device 110 and of displaying information received from the device 110. For example, the external device 120 may include a smart phone, a personal computer, and the like.

The device 110 is characterized by being a wearable device. For example, the device 110 may include, but is not limited to, wearable glasses, watches, and the like. In other words, the device 110 may include various items that the user wears in his or her daily life, rather than a device that the user must separately provide for retrieving information.

In the following, an example of how the device 110 searches for an object is described, assuming that the device 110 is worn by the user.

The device 110 recognizes the object using the received external signal.

Here, the external signal means a signal including information representing the object. For example, assuming that the object is the shape of a physical item (hereinafter referred to as a key picture), the external signal may be a signal representing an image of that shape. Meanwhile, the device 110 may be equipped with a separate camera, and may directly generate the external signal from an image of the shape photographed through the camera.

If the object is assumed to be a character string (hereinafter referred to as a keyword), the external signal may be a voice signal or a typed-character signal. For example, the device 110 may receive a voice signal using a microphone provided in the device 110, or may receive characters typed through a separate input device (e.g., a keypad, a touch screen, or a pen).

The device 110 may also receive an external signal containing information about an object (i.e., a key picture or a keyword) from the external device 120 via the network 130.

The device 110 recognizes the object using the received external signal. In other words, the device 110 recognizes a shape or a character corresponding to the received external signal. For example, if the external signal is a signal corresponding to the key picture, the device 110 can recognize the object by segmenting the shape of the key picture out of the image. Also, if the external signal is a voice signal corresponding to the keyword, the device 110 can recognize the object using voice recognition technology.

The device 110 determines whether or not the object (i.e., the key picture or the keyword) is included in the data, based on a result of a comparison between the received data and the object. Here, the data refers to data contained in an item that carries information representing the object. For example, the data may be text or image data included in a book, but is not limited thereto. Hereinafter, an example in which the device 110 receives data will be described with reference to FIG. 2.

FIG. 2 is a diagram illustrating an example in which a device according to one embodiment receives data.

Referring to FIG. 2, the device is a pair of glasses 210 worn by the user, and the data is character or image data included in a book 220, but the present invention is not limited thereto.

The user gazes at the book 220 while wearing the device 210. At this time, the device 210 uses the camera provided in the device 210 to photograph a page that the user is looking at.

On the other hand, the device 210 can photograph each of a plurality of pages included in the book 220. In other words, each time the user flips a page of the book 220, the device 210 can photograph that page using the camera. For example, the device 210 may photograph each page upon recognizing a voice signal corresponding to the user's voice (e.g., "photograph" or "next"), or upon recognizing a predetermined gesture of the user (e.g., flipping a page).

Thereafter, the device 210 recognizes the data contained in the photographed images. For example, when a picture is included in the photographed image, the device 210 can recognize the data by segmenting the shapes included in the picture. In addition, when characters are included in the photographed image, the device 210 can recognize the data using a general character recognition algorithm. Character recognition algorithms will be obvious to those skilled in the art, so a detailed description thereof is omitted.
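The shape-segmentation-and-comparison idea above can be illustrated with a deliberately simplified sketch: the "photographed page" is a small binary grid and the key picture is a smaller grid matched as a subgrid. The grid representation, the function name, and the '#'/'.' encoding are all illustrative assumptions; a real device would segment shapes out of an actual camera image.

```python
def contains_shape(page, shape):
    """Return True if the 2D pattern `shape` occurs as a subgrid of `page`.

    Both arguments are lists of equal-length strings ('#' = ink, '.' = blank).
    This is only a stand-in for the patent's segmentation-and-comparison
    step; a production device would operate on real images.
    """
    page_h, page_w = len(page), len(page[0])
    shape_h, shape_w = len(shape), len(shape[0])
    for top in range(page_h - shape_h + 1):
        for left in range(page_w - shape_w + 1):
            if all(page[top + r][left:left + shape_w] == shape[r]
                   for r in range(shape_h)):
                return True
    return False

# A toy "photographed page" and a key picture (an L-shaped mark).
page = [
    "..........",
    "..#.......",
    "..#.......",
    "..##......",
    "..........",
]
key_picture = [
    "#.",
    "#.",
    "##",
]
print(contains_shape(page, key_picture))  # True
```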

Referring again to FIG. 1, the device 110 compares the object with the received data. For example, when the object is a key picture, the device 110 determines whether the data recognized in the photographed page (i.e., the shape separated from the photographed image) is the same as the key picture. When the object is a keyword, the device 110 determines whether the data recognized in the photographed page (i.e., the characters recognized in the photographed image) is the same as the keyword.
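For the keyword case, the comparison can be reduced to a case-insensitive containment check over the recognized page text. The function below is a minimal sketch; whitespace normalization and OCR-error-tolerant (fuzzy) matching, which a production device would likely need, are omitted.

```python
def page_contains_keyword(page_text, keyword):
    """Case-insensitively check whether the recognized page text contains
    the keyword.

    `page_text` stands for the characters recognized on one photographed
    page; handling of OCR errors is deliberately left out of this sketch.
    """
    return keyword.casefold() in page_text.casefold()

recognized = "Wearable devices can photograph each page of a book."
print(page_contains_keyword(recognized, "Photograph"))  # True
print(page_contains_keyword(recognized, "network"))     # False
```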

Meanwhile, the external device 120 may recognize the data included in the image and compare the object with the recognized data. Specifically, the device 110 transmits a signal representing the image photographed using the camera to the external device 120 through the network 130. Thereafter, the external device 120 uses the transmitted signal to recognize the data and determines whether the recognized data includes the object (i.e., the key picture or the keyword). Here, the concrete method by which the external device 120 recognizes the image and compares the data with the object is the same as the method by which the device 110 does so.

When the external device 120 compares the data and the object, the external device 120 transmits the comparison result to the device 110 via the network 130.

The device 110 outputs information about the object based on the determination result. Here, the information on the object includes an image of the entire page of the book including the object (i.e., key picture or keyword), the page number of the page including the object, and the like.

Specifically, when the device 110 determines, as a result of the comparison between the data and the object, that the object is included in the data, the device 110 outputs information about the object to a display window. For example, the device 110 may display the page number of the book page containing the object, or display an image of the entire page, on the display window.

On the other hand, the device 110 may output information about the object to the display window after recognizing and comparing each page, or may output information about the object to the display window only after recognizing and comparing all the pages of the book.

For example, assume that the book consists of pages 1 to 100, and that the object is included in pages 3 and 80. In one example, the device 110 performs recognition and comparison page by page: after recognizing page 3 of the book and comparing it with the object, the device 110 can output the page number, or an image of the entire page 3, to the display window. The device 110 then performs recognition and comparison with the object for pages 4 to 79, and after recognizing page 80 and comparing it with the object, outputs the page number or an image of the entire page 80. The device 110 then performs recognition and comparison with the object for pages 81 to 100.

In another example, the device 110 first performs recognition of pages 1 to 100 of the book and comparison with the object for all pages. Then, the device 110 can output the page numbers of pages 3 and 80 of the book, or the images of pages 3 and 80, to the display window.
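The whole-book variant of this example (scan all pages first, then report every hit) can be sketched as follows, reproducing the pages-3-and-80 scenario; the dictionary of page texts stands in for the per-page recognition results.

```python
def search_book(pages, keyword):
    """Return the sorted page numbers whose recognized text contains `keyword`.

    `pages` maps page number -> recognized text; in a real device these
    texts would come from the character recognition step.
    """
    return sorted(number for number, text in pages.items()
                  if keyword.casefold() in text.casefold())

# Reproduce the example: a 100-page book with hits on pages 3 and 80.
pages = {number: "unrelated text" for number in range(1, 101)}
pages[3] = "the keyword appears here"
pages[80] = "and the KEYWORD appears again"
print(search_book(pages, "keyword"))  # [3, 80]
```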

On the other hand, when the device 110 determines that the object is included in the data as a result of the comparison between the data and the object, the device 110 may transmit information about the object to the external device 120. In other words, the device 110 may send information about the object to the external device 120 via the network 130. Thereafter, the external device 120 can output the received information to the display window of the external device 120. Here, the information on the object includes the image of the entire page of the book including the object (i.e., key picture or keyword), the page number including the object, and the like.

In addition, the device 110 or the external device 120 may store information about the object in a storage unit provided for each device.

FIG. 3 is a diagram illustrating an example of outputting information on an object according to an embodiment.

FIG. 3(a) shows information about the object displayed on the display window of the device 110, and FIG. 3(b) shows information about the object displayed on the display window of the external device 120.

As described above, the information about the object may include the entire page of the book containing the object (i.e., key picture or keyword). Therefore, as shown in FIG. 3(a), the device 110 can output an image of the entire page including the object, among the pages constituting the book, to the display window.

As shown in FIG. 3(b), when the external device 120 compares the data and the object, or when the device 110 transmits the information about the object to the external device 120, the external device 120 can output an image 310 of the entire page including the object, among the pages constituting the book, to the display window.

Referring again to FIG. 1, the device 110 may output a predetermined alarm based on the determination result. In other words, when the device 110 determines that an object is included in the data as a result of comparing the data with the object, the device 110 may output a predetermined alarm. Here, the alarm may correspond to a display window of the device 110 blinking or a sound output through a speaker provided to the device 110, but the present invention is not limited thereto.

As described above, the device 110 can easily obtain information to be searched for within a large amount of data (e.g., an offline document or book). In addition, the device 110 can easily transmit retrieved information to another external device. The device 110 may also retrieve information based on the user's voice or gesture.

FIG. 4 is a flow diagram illustrating an example of a method by which a device according to one embodiment retrieves information.

Referring to FIG. 4, a method for retrieving information is comprised of steps that are processed in a time-series manner in the information retrieval system or device 110 shown in FIG. Therefore, it is understood that the contents described above with respect to the information search system or the device 110 shown in FIG. 1 apply to the method of retrieving the information of FIG. 4, even if omitted from the following description.

In step 410, the device 110 specifies an object (i.e., a key picture or a keyword) using the received external signal. For example, the device 110 can designate an object based on an image photographed using a camera, a character input through an input device, or a voice input through a microphone.

In operation 420, the device 110 recognizes the item that contains the information representing the object. For example, assuming that the item is a book composed of a plurality of pages, the device 110 recognizes that a page of the book is being turned. For example, the device 110 may recognize that a page has been turned by recognizing the user's voice (e.g., "next") or a gesture of the user flipping a page.

In operation 430, the device 110 photographs each page using the camera provided in the device 110. For example, the device 110 may photograph each page based on the user's voice (e.g., "photograph") or a predetermined gesture of the user.

In step 440, the device 110 recognizes the data (i.e., characters or pictures) included in the photographed page. The device 110 may also transmit information about the photographed page to the external device 120 via the network 130. The external device 120 can recognize the data included in the photographed page based on the transmitted information.

In step 450, the device 110 determines whether the recognized data includes an object (i.e., a key picture or a keyword). For example, the device 110 may apply a search algorithm to determine whether an object is included in the recognized data. When the device 110 transmits information about the photographed page to the external device 120 in step 440, the external device 120 can check whether or not the object is included in the recognized data.
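The patent does not name the search algorithm applied in step 450, so as one illustrative possibility, here is Knuth-Morris-Pratt string search, which finds a keyword in recognized page text in linear time; the function name and interface are assumptions of this sketch.

```python
def kmp_find(text, pattern):
    """Return the index of the first occurrence of `pattern` in `text`, or -1.

    Knuth-Morris-Pratt: precompute, for every prefix of the pattern, the
    length of its longest proper prefix that is also a suffix, then scan
    the text without ever re-reading a character.
    """
    if not pattern:
        return 0
    # Failure table: fail[i] = longest proper prefix of pattern[:i+1]
    # that is also a suffix of it.
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan the text, falling back through the table on mismatch.
    k = 0
    for i, ch in enumerate(text):
        while k and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            return i - k + 1
    return -1

print(kmp_find("recognized page data", "page"))  # 11
```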

If it is determined in step 450 that the device 110 (or the external device 120) includes the object in the data, the process proceeds to step 460; otherwise, the process proceeds to step 470.

In step 460, the device 110 outputs information or an alarm about the object to the display window. Here, the information on the object includes the image of the entire page of the book including the object (i.e., key picture or keyword), the page number including the object, and the like. Also, the external device 120 may output information or an alarm about the object to the display window.

In step 470, the device 110 determines whether the current page has been turned to the next page. In other words, the device 110 recognizes whether a page of the book is being turned. A concrete method by which the device 110 recognizes that a page has been turned is described above with reference to step 420.

If the device 110 recognizes that a page of the book has been turned, the operation proceeds to step 430; otherwise, the operation ends.
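The loop of steps 430 to 470 (photograph, recognize, compare, output, next page) can be summarized in a short sketch; the page stream stands in for the camera-plus-recognition pipeline and the output callback for the display window, both being assumptions made for illustration.

```python
def search_while_flipping(page_stream, keyword, output):
    """Iterate over photographed pages (step 430), compare each page's
    recognized text with the keyword (steps 440-450), and report hits
    (step 460) until no further page is produced (step 470 / end).

    `page_stream` yields (page_number, recognized_text) pairs; `output`
    receives one message per page on which the keyword was found.
    """
    for page_number, text in page_stream:
        if keyword.casefold() in text.casefold():
            output(f"keyword found on page {page_number}")

hits = []
book = [(1, "introduction"), (2, "the wearable device"), (3, "appendix")]
search_while_flipping(iter(book), "device", hits.append)
print(hits)  # ['keyword found on page 2']
```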

FIG. 5 is a block diagram showing an example of a device according to an embodiment.

Referring to FIG. 5, the device 110 includes an interface unit 111, a recognition unit 112, a determination unit 113, and an output unit 114. Only the components related to the present embodiment are shown in the device 110 shown in Fig. Accordingly, it will be understood by those skilled in the art that other general-purpose components other than the components shown in FIG. 5 may be further included.

It will be understood by those of ordinary skill in the art that the interface unit 111, recognition unit 112, determination unit 113, and output unit 114 of the device 110 shown in FIG. 5 may also exist as independent devices.

The interface unit 111, the recognition unit 112, the determination unit 113, and the output unit 114 of the device 110 shown in FIG. 5 may correspond to one or a plurality of processors. A processor may be implemented as an array of a plurality of logic gates, or may be implemented as a combination of a general purpose microprocessor and a memory in which a program executable in the microprocessor is stored. It will be appreciated by those skilled in the art that the present invention may be implemented in other forms of hardware.

The interface unit 111 receives an external signal input to the device 110. The interface unit 111 transmits and receives signals to and from the external device 120 via the network 130. For example, the interface unit 111 can transmit information about an object to an external device connected to the device.

Although not shown in FIG. 5, when the device 110 includes a camera, a microphone, or a speaker, the interface unit 111 may relay signals between the camera, microphone, or speaker and the other units included in the device (i.e., the recognition unit 112, the determination unit 113, and the output unit 114).

For example, the interface unit 111 may correspond to a communication interface unit, and the communication interface unit may include a modem used for transmitting and receiving signals according to the functions of the device 110, a network module for connection with a network, a USB host module for forming a data channel with a portable storage medium, and the like.

The recognition unit 112 recognizes the object using the external signal transmitted from the interface unit 111. In other words, the recognition unit 112 recognizes a shape or a character corresponding to the received external signal. For example, if the external signal is a signal corresponding to the key picture, the recognition unit 112 can recognize the object by segmenting the shape of the key picture out of the image. Also, if the external signal is a voice signal corresponding to the keyword, the recognition unit 112 can recognize the object using voice recognition technology.

The determination unit 113 compares the data transmitted from the interface unit 111 with the object transmitted from the recognition unit 112. Then, the determination unit 113 determines whether or not the object is included in the data based on the comparison result. Here, the data refers to data contained in an item that carries information representing the object.

For example, when the object is a key picture, the determination unit 113 determines whether or not the data recognized in the photographed page (i.e., the shape separated from the photographed image) is the same as the key picture. If the object is a keyword, the determination unit 113 determines whether the data recognized in the photographed page (i.e., the character recognized in the photographed image) is the same as the keyword.

Meanwhile, the external device 120 may recognize the data included in the image and compare the object with the recognized data. In this case, the interface unit 111 transmits information about the image to the external device 120 through the network 130.

The output unit 114 outputs information about the object based on the determination result transmitted from the determination unit 113. The output unit 114 may also output a predetermined alarm based on the determination result transmitted from the determination unit 113. Here, the information on the object includes the image of the entire page of the book including the object (i.e., key picture or keyword), the page number including the object, and the like.

For example, the output unit 114 includes an output device such as a display panel, a touch screen, a monitor, a speaker, and the like, and a software module for driving them.

On the other hand, the external device 120 may output information about an object or an alarm to a display window. In this case, the interface unit 111 transmits information about the object to the external device 120 through the network 130.

Meanwhile, although not shown in FIG. 5, the device 110 may store information about an object in a storage unit (not shown). In addition, the external device 120 may store information about the object in a storage unit provided in the external device 120.

As described above, the device 110 can easily obtain information to be searched for within a large amount of data (e.g., an offline document or book). In addition, the device 110 can easily transmit retrieved information to another external device. The device 110 may also retrieve information based on the user's voice or gesture.

FIG. 6 is a flow diagram illustrating another example of a method by which a device according to one embodiment retrieves information.

Referring to FIG. 6, a method for retrieving information is comprised of steps that are processed in a time-series manner in the information retrieval system or device 110 shown in FIG. Therefore, it is understood that the contents described above with respect to the information search system or the device 110 shown in FIG. 1 are applied to the method of retrieving the information of FIG. 6, even if omitted from the following description.

In operation 610, the device 110 recognizes the object using the received external signal.

In step 620, the device 110 determines whether or not the object is included in the data based on the result of the comparison between the received data and the object.

In step 630, the device 110 outputs information about the object based on the determination result.

Meanwhile, the above-described method can be written as a program executable by a computer and implemented in a general-purpose digital computer that runs the program using a computer-readable recording medium. In addition, the data structures used in the above-described method can be recorded on a computer-readable recording medium through various means. The computer-readable recording medium may include a magnetic storage medium (e.g., a ROM, a RAM, a USB device, a floppy disk, or a hard disk), an optical reading medium (e.g., a CD-ROM or a DVD), or a transmission medium (e.g., PCI, PCI-Express, or Wi-Fi).

It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the disclosed methods should be considered from an illustrative point of view, not from a restrictive point of view. The scope of the present invention is defined by the appended claims rather than by the foregoing description, and all differences within the scope of equivalents thereof should be construed as being included in the present invention.

110: Device
120: External device
130: Network

Claims (15)

A method for searching an object using a wearable device,
Recognizing the object using the received external signal;
Determining whether the object is included in the data based on a result of comparison between the received data and the object; And
And outputting information about the object based on the determination result.
The method according to claim 1,
And transmitting information about the object to an external device connected to the device based on the determination result.
The method according to claim 1,
Wherein the determining includes determining whether the object is included in data included in an image photographed using a camera included in the device.
The method according to claim 1,
The data refers to data included in a document composed of a plurality of pages,
Wherein the determining comprises obtaining the data contained in each of the pages based on a voice or gesture indicating that a page is being turned, and determining based on a result of the comparison between the obtained data and the object.
The method according to claim 1,
Said object comprising the shape of an object,
Wherein the recognizing step recognizes the object using a signal corresponding to the image of the shape.
The method according to claim 1,
wherein the object comprises a character, and
wherein the recognizing comprises recognizing the object using either a signal corresponding to typing of the character or a voice signal corresponding to utterance of the character.
The method according to claim 1,
further comprising outputting a predetermined alarm based on the determination result.
A computer-readable recording medium storing a program for causing a computer to execute the method according to any one of claims 1 to 7.
A wearable device capable of searching for an object, the device comprising:
a recognition unit for recognizing the object using a received external signal;
a determination unit for determining whether received data includes the object, based on a result of comparing the received data with the object; and
an output unit for outputting information about the object based on the determination result.
The device according to claim 9,
further comprising an interface unit for transmitting information about the object to an external device connected to the wearable device, based on the determination result.
The device according to claim 9,
wherein the determination unit determines whether the object is included in data contained in an image captured using a camera included in the device.
The device according to claim 9,
wherein the data is data included in a document composed of a plurality of pages, and
wherein the determination unit obtains the data contained in each of the pages based on a voice or gesture indicating that a page is being turned, and determines based on a result of comparing the obtained data with the object.
The device according to claim 9,
wherein the object comprises the shape of a physical object, and
wherein the recognition unit recognizes the object using a signal corresponding to an image of the shape.
The device according to claim 9,
wherein the object comprises a character, and
wherein the recognition unit recognizes the object using either a signal corresponding to typing of the character or a voice signal corresponding to utterance of the character.
The device according to claim 9,
wherein the output unit outputs a predetermined alarm based on the determination result.
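Outside the claim language, the page-by-page search of claims 4 and 12 can be pictured as an event-driven loop. The following is a minimal sketch under stated assumptions — every name is hypothetical, and a trivial string function stands in for actual character recognition of a captured page image:

```python
# Illustrative sketch only: all names below are hypothetical and do not
# appear in the patent. It mimics claims 4 and 12, where the data of each
# page is acquired when a voice command or gesture signals a page turn.

def recognize_text(page_image):
    """Stand-in for character recognition of a captured page image;
    here the 'image' is already a string for simplicity."""
    return page_image.lower()

def search_while_reading(events, target):
    """Consume ("page_turn", page_image) events and return the first
    1-based page number whose recognized text contains the target,
    or None if the target never appears."""
    page_no = 0
    for event, image in events:
        if event != "page_turn":
            continue  # ignore unrelated events (e.g., other gestures)
        page_no += 1
        if target.lower() in recognize_text(image):
            return page_no
    return None

events = [("page_turn", "intro text"), ("page_turn", "wearable object search")]
print(search_while_reading(events, "object search"))  # → 2
```

In a real embodiment the events would come from voice or gesture recognition on the wearable device, and each page image from its camera.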
KR20130113483A 2013-09-24 2013-09-24 Method for searching object using a wearable device and device for searching object KR20150033448A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR20130113483A KR20150033448A (en) 2013-09-24 2013-09-24 Method for searching object using a wearable device and device for searching object

Publications (1)

Publication Number Publication Date
KR20150033448A true KR20150033448A (en) 2015-04-01

Family

ID=53030787

Family Applications (1)

Application Number Title Priority Date Filing Date
KR20130113483A KR20150033448A (en) 2013-09-24 2013-09-24 Method for searching object using a wearable device and device for searching object

Country Status (1)

Country Link
KR (1) KR20150033448A (en)


Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination