KR101659063B1 - Apparatus and method for providing searching service - Google Patents

Apparatus and method for providing searching service

Info

Publication number
KR101659063B1
Authority
KR
South Korea
Prior art keywords
search
search interface
display
interface
recognition
Prior art date
Application number
KR1020160006856A
Other languages
Korean (ko)
Other versions
KR20160017652A (en)
Inventor
서현주
박마리아
김여래
임정훈
Original Assignee
네이버 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 네이버 주식회사 filed Critical 네이버 주식회사
Priority to KR1020160006856A priority Critical patent/KR101659063B1/en
Publication of KR20160017652A publication Critical patent/KR20160017652A/en
Application granted granted Critical
Publication of KR101659063B1 publication Critical patent/KR101659063B1/en

Classifications

    • G06F17/30864
    • G06F17/30681
    • G06F17/30879
    • G06F17/30964
    • G06F17/30973
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

A search service providing apparatus and method are disclosed. A computer program recorded on a recording medium executes, in combination with a computer-implemented electronic apparatus, a search service providing method comprising: displaying, on a display, a first search interface corresponding to a text-based search together with a browsing area for displaying at least one content; receiving a user's touch input to the display while the first search interface is displayed; detecting, from the received touch input, a calling interaction for a second search interface corresponding to a recognition search; and displaying the second search interface on the display, wherein the second search interface includes a graphical user interface for each of a plurality of types of recognition-based searches, and wherein, as the second search interface is displayed, the first search interface, the second search interface and the browsing area are displayed together on the display.

Description

APPARATUS AND METHOD FOR PROVIDING SEARCHING SERVICE

The present invention relates to a device and method for providing a search service, and more particularly to a terminal device including a search service application that provides various services for a mobile terminal and a method of driving the same.

User experience (UX) refers to the total sum of the experiences a user has while interacting with a product, a service, and the entity providing it. A good user experience offers the user intuitive manipulation together with fast, convenient functionality and visual design.

In particular, information and communication terminals such as smart phones and tablet PCs equipped with touch-sensitive displays have recently become widespread. Accordingly, applications providing search services and the like are designed with the user experience in mind.

Meanwhile, in recent years, advanced search using image recognition, voice recognition, music recognition, or the like has been provided.

Image recognition-based search includes QR code and barcode recognition search, OCR (character recognition) search, image label recognition, and the like. To assist character recognition search, tools such as placing a specific frame around the characters to be recognized are used.

In addition, speech recognition-based search extracts text by applying a speech recognition algorithm to an input speech signal and then uses the text as a keyword, while music recognition-based search performs feature analysis on an input music signal.

An apparatus and method for providing a search service that provides a user experience efficiently and conveniently is provided.

There is provided an apparatus and method for providing a search service capable of increasing the exposure of recognition search to induce the use of the recognition search service.

There is provided an apparatus and method for providing a search service in which a conversion between a recognition search service and a normal text based search service is intuitively and quickly provided.

A computer program recorded on a recording medium executes, in combination with a computer-implemented electronic apparatus, a search service providing method comprising: displaying, on a display, a first search interface corresponding to a text-based search together with a browsing area for displaying at least one content; receiving a user's touch input to the display while the first search interface is displayed; detecting, from the received touch input, a calling interaction for a second search interface corresponding to a recognition search; and displaying the second search interface on the display, wherein the second search interface includes a graphical user interface for each of a plurality of types of recognition-based searches, and wherein, as the second search interface is displayed, the first search interface, the second search interface and the browsing area are displayed together on the display.

A computer-readable recording medium embodying an application for providing a search service, the application being executed in a terminal apparatus including a processor, is also provided. The application comprises: code for displaying, on a display, a first search interface corresponding to a text-based search and a browsing area for displaying at least one content; and code for displaying a second search interface on the display when a calling interaction for the second search interface, which corresponds to a recognition search, is detected from the user's touch input to the display while the first search interface is displayed, wherein the second search interface includes a graphical user interface for each of a plurality of types of recognition-based searches, and wherein, as the second search interface is displayed, the first search interface, the second search interface and the browsing area are displayed together on the display.

A computer-implemented method of providing a search service comprises: displaying, on a display, a first search interface corresponding to a text-based search and a browsing area for displaying at least one content; receiving a user's touch input to the display while the first search interface is displayed; detecting, from the received touch input, a calling interaction for a second search interface corresponding to a recognition search; and displaying the second search interface on the display, wherein the second search interface includes a graphical user interface for each of a plurality of types of recognition-based searches, and wherein, as the second search interface is displayed, the first search interface, the second search interface and the browsing area are displayed together on the display.

According to embodiments of the present invention, the user experience (UX) of a search service is provided efficiently and conveniently.

In addition, the use of the recognition search service can be induced by increasing the exposure of the recognition search, and the transition between the recognition search service and the normal text-based search service can be intuitively and quickly provided.

FIG. 1 illustrates a search service providing terminal according to an embodiment of the present invention.
FIG. 2 shows a plurality of application graphic user interfaces (GUI) displayed on the touch-sensitive display of the terminal device of FIG.
FIG. 3 shows a screen configuration at the time of executing a search service providing application according to an embodiment of the present invention.
Figure 4 illustrates an in-app screen configuration that may provide conventional text-based searches in accordance with an embodiment of the present invention.
FIG. 5 is a conceptual diagram for explaining how a recognition search call interaction is received by a touch input of a user according to an embodiment of the present invention.
FIG. 6 illustrates that the recognition search interface is pulled down corresponding to the recognition search call interaction described in FIG. 5 in accordance with an embodiment of the present invention.
FIG. 7 illustrates a search result for a user query in an in-web according to an embodiment of the present invention.
FIG. 8 shows a state in which the in-web of FIG. 7 is switched to the landscape mode according to an embodiment of the present invention.
FIG. 9 illustrates a process result when a recognition search call interaction is received by a user's touch input in FIG. 8 according to an embodiment of the present invention.
FIGS. 10 and 11 illustrate a code search process during recognition search according to an embodiment of the present invention.
FIGS. 12 and 13 illustrate a voice search process during recognition search according to an embodiment of the present invention.
FIGS. 14 and 15 illustrate a music search process during recognition search according to an embodiment of the present invention.
FIGS. 16 and 17 illustrate a wine label recognition search process during recognition search according to an embodiment of the present invention.
FIGS. 18 to 21 illustrate a Japanese recognition search process during recognition search according to an embodiment of the present invention.
FIGS. 22 and 23 illustrate a Green Window TM search process during recognition search according to an embodiment of the present invention.
FIG. 24 is a flowchart illustrating the operation of an application according to an embodiment of the present invention.

Hereinafter, some embodiments of the present invention will be described in detail with reference to the accompanying drawings. However, the present invention is neither limited to nor restricted by these embodiments. Like reference symbols in the drawings denote like elements.

FIG. 1 illustrates a search service providing terminal according to an embodiment of the present invention.

The terminal device 100 may be various information communication devices such as a smart phone, a tablet PC, and the like. According to an embodiment of the present invention, the terminal device 100 includes a touch-sensitive display 110 as means for providing a user interface.

According to embodiments of the present invention, the processor 120 executes the application stored in the memory 130 to render the search service application on the touch-sensitive display 110.

The process by which the processor 120 runs an application from the memory 130 to provide a search service in accordance with various embodiments of the present invention is described below with reference to FIGS. 2 to 24.

FIG. 2 shows a plurality of application graphic user interfaces (GUI) displayed on the touch-sensitive display 110 of the terminal device 100 of FIG.

Among the graphical user interfaces for executing applications, a graphical user interface 210 for executing a search service providing application according to an embodiment of the present invention is included.

When a user selects the graphical user interface 210 with a touch input or the like for the touch-sensitive display 110, a search service providing application according to an embodiment of the present invention starts.

More detailed application driving screens and operations will be described later with reference to FIG. 3 and the following.

FIG. 3 shows a screen configuration 300 at the time of executing a search service providing application according to an embodiment of the present invention.

According to an embodiment of the present invention, the search service providing application can provide both the recognition search and the normal text-based search, and one of the two is selected by the user to receive a query for providing the search service.

However, users are often not well aware that the recognition search service is available, or cannot find the recognition search service directly and intuitively. Furthermore, the party that produces or distributes the application providing the search service recognizes the need to increase the utilization of the recognition search service.

Thus, according to one embodiment of the present invention, when the application is executed for the first time after being installed or updated in the terminal device 100, the recognition search interface 310 is exposed together with the text-based search interface 320.

In this case, a browsing area 330, which can present content forwarded from a service providing server (not shown) or search results obtained using the recognition search interface 310 and/or the text-based search interface 320, is displayed together with these interfaces. In accordance with an embodiment of the present invention, the browsing area 330 may be provided in a dimmed state as shown.

In this screen configuration 300, when the user touches any one of the GUIs 211 to 216 for recognition search, the corresponding recognition search is performed; the recognition search processes are described in more detail later with reference to FIGS. 10 to 23.

For reference, in this drawing and throughout the specification, the recognition searches corresponding to the GUIs 211 to 216 are listed by way of example as code search (a concept covering both one-dimensional bar codes and two-dimensional QR codes, simply referred to as a "code"), voice search, music search, wine label search, Japanese search, and Green Window TM search, but the present invention is not limited thereto. Therefore, the types and forms of recognition search and the design of a concrete GUI can be changed without departing from the idea of the present invention.
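
For illustration only, the correspondence between the six example recognition search types and the GUIs 211 to 216 can be modeled as a simple data structure. The following Kotlin sketch is not part of the disclosure; the type RecognitionSearchType, the guiId field, and the dispatcher function are hypothetical names assumed for this example.

```kotlin
// Illustrative sketch only: models the six recognition search types exposed by
// GUIs 211-216 in FIG. 3. All names here are hypothetical.
enum class RecognitionSearchType(val guiId: Int, val label: String) {
    CODE(211, "Code (barcode / QR)"),
    VOICE(212, "Voice"),
    MUSIC(213, "Music"),
    WINE_LABEL(214, "Wine label"),
    JAPANESE(215, "Japanese"),
    GREEN_WINDOW(216, "Green Window")
}

// Dispatches a touch on one of the recognition search GUIs to the matching search flow.
fun onRecognitionGuiTouched(guiId: Int, startSearch: (RecognitionSearchType) -> Unit) {
    RecognitionSearchType.values().firstOrNull { it.guiId == guiId }
        ?.let(startSearch)
        ?: println("Touch on $guiId is not a recognition search GUI")
}

fun main() {
    onRecognitionGuiTouched(212) { type -> println("Starting ${type.label} search") }
}
```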

FIG. 4 illustrates an in-app screen configuration 400 that may provide conventional text-based searches in accordance with an embodiment of the present invention.

The term "in-app" (in-App) is a variant of "app" (App), which is itself an abbreviation of "application." The in-app screen includes a browsing area 420 for providing contents, a search interface 410 including a general-purpose navigation bar (GNB), and the like.

While this in-app screen configuration 400 is being provided, the case where the user desires to call the recognition search interface will be described with reference to FIG.

FIG. 5 is a conceptual diagram 500 illustrating how a recognition search call interaction is received by a user's touch input according to an embodiment of the present invention.

When the user touches the touch-sensitive display and then makes a gesture input 510 of dragging in a predetermined direction, the terminal device 100 according to an embodiment of the present invention interprets the gesture input 510 as the recognition search call interaction and starts providing the recognition search interface. A more detailed process will be described later with reference to FIG. 6.

FIG. 6 illustrates that the recognition search interface is pulled down corresponding to the recognition search call interaction described in FIG. 5 in accordance with an embodiment of the present invention.

The screen configuration 600 illustrates the processor 120 of the terminal device 100 rendering a transition animation on the touch-sensitive display 110 in response to the recognition search call interaction.

The recognition search interface 610, which was hidden in FIG. 5, is pulled down along with the user's drag upon receipt of the recognition search call interaction.

In this specification, pull-down refers to a visualization process in which a hidden area is revealed following the user's drag interaction, whereas drop-down refers to a visualization process in which the hidden area is revealed when the user presses a downward arrow or a specific button.

Embodiments of the present invention are described below, by way of example only, with respect to the pull-down case rather than the drop-down case; however, the pull-down embodiments can be replaced by drop-down embodiments without departing from the spirit of the present invention.

In this case, the normal text-based search interface 620 may be moved down along with the pull-down of the recognition search interface 610, and the browsing area 630 may shrink while being dimmed. After the transition animation is completed, the screen configuration 300 shown in FIG. 3 is obtained.

However, according to an embodiment of the present invention, if the drag input is interrupted (released) before a predetermined threshold is reached, for example while the exposed portion of the recognition search interface 610 is less than 1/4 of its entire height, the transition animation can be rolled back so that the screen returns to the state shown in FIG. 5.

Conversely, if the recognition search interface 610 is released after the transition animation has advanced beyond the threshold, the animation proceeds to completion, resulting in the screen configuration 300 of FIG. 3.
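
The release-threshold behavior described above can be expressed as a small decision function. The Kotlin sketch below is a minimal, framework-neutral illustration under the assumption that the drag progress is normalized to the exposed fraction of the recognition search interface; the names PullDownState, REVEAL_THRESHOLD, and onDragReleased are hypothetical and not taken from the disclosure.

```kotlin
// Minimal sketch of the pull-down release decision described for FIGS. 5 and 6.
// Assumption: progress is the exposed fraction of the recognition search interface (0.0..1.0).
enum class PullDownState { HIDDEN, SHOWN }

const val REVEAL_THRESHOLD = 0.25  // e.g. 1/4 of the interface height, per the embodiment

fun onDragReleased(progress: Double): PullDownState =
    if (progress < REVEAL_THRESHOLD) {
        // Roll the transition animation back: return to the state of FIG. 5.
        PullDownState.HIDDEN
    } else {
        // Let the animation run to completion: the screen configuration 300 of FIG. 3.
        PullDownState.SHOWN
    }

fun main() {
    println(onDragReleased(0.10))  // HIDDEN - released too early, animation rolls back
    println(onDragReleased(0.60))  // SHOWN  - past the threshold, pull-down completes
}
```

In a real terminal the same decision would typically drive the roll-back or completion of the transition animation rather than an immediate state change.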

FIG. 7 illustrates a search result for a user query in an in-web according to an embodiment of the present invention.

In contrast to the in-app described above, an in-web screen configuration 700 is shown in FIG. 7. An in-web refers to a browser within an application that provides general-purpose browsing during application execution.

Referring to FIG. 7, an interface 710, such as a search window for normal text-based search, may be presented along with a browsing area 720 and a control bar 730.

During this in-web execution, the direction of gravity applied to the terminal device 100 may be sensed to support a landscape view mode, as shown in FIG. 8.

FIG. 8 shows a state in which the in-web of FIG. 7 is switched to the landscape mode according to an embodiment of the present invention.

In the screen configuration 800, the screen configuration 700 of FIG. 7 is enlarged in the landscape view mode.

However, according to an embodiment of the present invention, the recognition search call interaction 810 can be received and processed even in the in-web landscape view state. This recognition search call interaction can be a touch-and-drag input on the touch-sensitive display, as described with reference to FIG. 5. The result of processing it is shown in FIG. 9.

FIG. 9 shows a processing result when the recognition search call interaction 810 is received by the user's touch input in FIG. 8 according to an embodiment of the present invention.

Referring to the screen configuration 900, a recognition search interface 910 is provided over the in-web screen configuration 800 in landscape view, a normal text-based search interface 920 is pushed down, and the browsing area is dimmed.

This is similar to the screen configuration 300 of FIG. 3, in which the recognition search interface is exposed in the in-app. However, since the in-web supports the landscape view mode, unlike the in-app, the screen configuration 900 is laid out in landscape view.

In another embodiment of the present invention, the landscape view mode may be supported in the In-App according to the setting.

Hereinafter, referring to Figs. 10 to 23, examples of the recognition search service corresponding to each of the GUIs in the recognition search interface 910 (or 310 in Fig. 3) will be described.

10 and 11 illustrate a code search process during recognition search according to an embodiment of the present invention.

The code search process may correspond to the touch of the GUI 211 in FIG. 3.

When a user photographs a code using the built-in camera of the terminal device 100 and places the code in the recognition area 1010, the application identifies and decodes the code and extracts the code as text information.

In this process, instead of instant photography, the user can use the photo album call button 1020 to load and recognize an image photographed in advance. When the surroundings are dark, the user can turn the flash on or off using the flash operation button 1030.

The code search result is shown in FIG. 11.

12 and 13 illustrate a voice search process during recognition search according to an embodiment of the present invention.

The voice search process may correspond to the touch of the GUI 212 in FIG. 3.

In the screen configuration 1200, the voice search is performed by the user inputting voice through the microphone built into the terminal device 100 after touching the voice input start button 1210. Guidance for the voice search may be provided in the indication area 1220.

After this voice input, a voice recognition algorithm is run on the terminal device 100 or on a server connected over the network, so that a text query can be derived; the search result is shown in FIG. 13.
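
Because the disclosure leaves open whether the voice recognition algorithm runs on the terminal device 100 or on a networked server, that choice can be abstracted behind a single interface. The Kotlin sketch below is illustrative only; the SpeechRecognizer interface, both implementations, and runVoiceSearch are hypothetical stand-ins rather than APIs named in the disclosure.

```kotlin
// Illustrative sketch: a voice query may be derived on the device or on a server,
// then reused as an ordinary text query. All names here are hypothetical.
interface SpeechRecognizer {
    fun recognize(audio: ByteArray): String   // returns the derived text query
}

class OnDeviceRecognizer : SpeechRecognizer {
    override fun recognize(audio: ByteArray) = "on-device transcript"          // placeholder
}

class ServerRecognizer(private val endpoint: String) : SpeechRecognizer {
    override fun recognize(audio: ByteArray) = "server transcript from $endpoint"  // placeholder
}

fun runVoiceSearch(audio: ByteArray, recognizer: SpeechRecognizer, search: (String) -> Unit) {
    val query = recognizer.recognize(audio)   // speech -> text
    search(query)                             // reuse the normal text-based search path
}

fun main() {
    runVoiceSearch(ByteArray(0), OnDeviceRecognizer()) { q -> println("Searching for: $q") }
}
```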

FIG. 13 illustrates a voice search GUI 1310 provided with a conventional text-based search interface in accordance with an embodiment of the present invention. As such, at least some of the recognition search interface GUIs according to an embodiment of the present invention may be provided in combination with the text search interface.

FIGS. 14 and 15 illustrate a music search process during recognition search according to an embodiment of the present invention.

The music search process may correspond to the touch of the GUI 213 in FIG. 3.

The music search process is similar to the voice search described with reference to FIGS. 12 to 13, but differs in utilizing a music recognition algorithm that may be different from the voice recognition algorithm.

In the specific operation after the screen configuration 1400 is provided, the music input start button 1410 and the indication area 1420 are similar to those in Fig. 12 for voice search.

The music search result is provided as shown in FIG. 15, together with links to application services that can play the music found, menus for connecting to an SNS service or an external application, and additional information such as lyrics and album details.

16 and 17 illustrate a wine label recognition search process during recognition search according to an embodiment of the present invention.

The wine label recognition search process may correspond to the touch of the GUI 214 in FIG. 3.

In the screen configuration 1600, the wine label is photographed in the recognition area 1610; the process is otherwise the same as a general image search, except that the image search target DB is limited to wine-related data.

The button 1620 for loading an image photographed into the photo album and the flash operation button 1630 are similar to those of the code search described above with reference to FIG. 10.

FIG. 17 shows the result of the wine label search; purchase information such as the wine price can be provided, and a shopping shortcut can also be linked, although this is not shown.

18 to 21 illustrate a Japanese recognition search process during recognition search according to an embodiment of the present invention.

The Japanese recognition search process may correspond to the touch of the GUI 215 in FIG. 3.

Referring to the screen configuration 1800, the Japanese search recognizes Japanese characters placed in the recognition area 1810; apart from a few particulars described below, conventional OCR engines can be utilized. The photo album call button 1820 and the flash operation button 1830 for instant photography are similar to those described above for code recognition.

On the other hand, the Japanese recognition search involves a step in which the user precisely targets the area to which the OCR algorithm is to be applied by adjusting the position and size of the box 1910 shown in FIG. 19.

When the user finishes targeting, as with the box 2010 of FIG. 20, and touches the completion button 2020, the OCR algorithm is activated to recognize the text within the box 2010.

Then, the result of this text recognition is used as a query, and a search result is provided from a Japanese dictionary or the like, as shown in FIG. 21.

FIGS. 22 and 23 illustrate a Green Window TM search process during recognition search according to an embodiment of the present invention.

The Green Window TM search process may correspond to the touch of the GUI 216 in FIG. 3.

As shown in the screen configuration 2200, the Green Window TM search is distinctive in that a predetermined frame placed in the recognition area 2210 is the subject of photography and recognition.

Unlike the box 1910 of FIG. 19, the text recognition area is not designated freely by the user but is limited to predetermined frames; typically, advertisers may provide the keywords of such frames through various media. The keywords may, of course, be purchased by the advertisers.

Also, a photo album call button 2220 is provided so that an image taken earlier, for example while the user was not connected to the network, can be loaded from the photo album, and a flash operation button 2230 can also be provided.

The Green Window TM search result is shown in FIG. 23.

FIG. 24 is a flowchart illustrating the operation of an application according to an embodiment of the present invention.

According to an embodiment of the present invention, after starting the application (App), in step 2410, it is judged whether it is the first execution after the installation of the application or the first execution after the update.

If it is the first execution, the recognition search interface may be provided as the screen configuration 300 of FIG. 3 along with the normal search interface and browsing area (step 2430).

In this case, the dimming and related processing of the browsing area are as described above with reference to FIG. 3.

As a result of the determination in step 2410, if it is not the first execution, the screen may be displayed with the recognition search interface hidden (step 2420). Such a display may be the in-app screen of FIG. 4, the in-web screen of FIG. 7, or the like.

Then, at step 2440, when the recognition search call interaction is received, the application pulls down the recognition search interface (step 2450).

The handling of the transition animation and its rollback in this process are as described above with reference to FIGS. 5 and 6.

While the recognition search interface is pulled down, if the recognition search canceling interaction is received in step 2460, the hiding process of step 2420 is performed again. A normal in-app or in-web screen can then be provided.

Here, the recognition search canceling interaction is the opposite of the recognition search call interaction described above and corresponds to an instruction from a user who wishes to hide the recognition search interface again. In the above embodiments, this canceling interaction can be understood as the user touching the dimmed browsing area, or any area other than the recognition search interface.

In addition, in the embodiments described above, the recognition search interface was pulled down or dropped down by a touch-and-drag input on the touch-sensitive display 110 of the terminal device 100; if a touch-and-drag input in the opposite direction is received while the interface is on the screen, that input can serve as the recognition search canceling interaction.

If no recognition search cancellation interaction is received, the current interface is maintained (2470).

Here, the recognition search canceling interaction may be an input such as a touch of the browsing area 330 in FIG. 3. Of course, the present invention is not limited to these embodiments, and the provision of the recognition search interface can be canceled in various ways.
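
The flow of FIG. 24 (first-execution check, call interaction, cancel interaction) can be summarized as a small state handler. The Kotlin sketch below is one interpretation of the flowchart rather than code from the disclosure; the step numbers are cited in comments and all identifiers are hypothetical.

```kotlin
// Interpretation of the flowchart of FIG. 24. Names are hypothetical.
enum class InterfaceState { RECOGNITION_SHOWN, RECOGNITION_HIDDEN }

// Steps 2410/2420/2430: on start, expose the recognition search interface only on
// the first execution after installation or after an update.
fun initialState(isFirstRunAfterInstallOrUpdate: Boolean): InterfaceState =
    if (isFirstRunAfterInstallOrUpdate) InterfaceState.RECOGNITION_SHOWN   // FIG. 3 layout
    else InterfaceState.RECOGNITION_HIDDEN                                 // FIG. 4 / FIG. 7 layout

sealed interface Interaction
object RecognitionSearchCall : Interaction      // e.g. downward touch-and-drag (step 2440)
object RecognitionSearchCancel : Interaction    // e.g. touch on the dimmed browsing area (step 2460)

// Steps 2440-2470: pull down on a call interaction, hide again on a cancel interaction,
// otherwise keep the current interface.
fun nextState(current: InterfaceState, interaction: Interaction?): InterfaceState = when (interaction) {
    is RecognitionSearchCall -> InterfaceState.RECOGNITION_SHOWN    // step 2450
    is RecognitionSearchCancel -> InterfaceState.RECOGNITION_HIDDEN // back to step 2420
    else -> current                                                 // step 2470
}

fun main() {
    var state = initialState(isFirstRunAfterInstallOrUpdate = false)
    state = nextState(state, RecognitionSearchCall)
    println(state)  // RECOGNITION_SHOWN
    state = nextState(state, RecognitionSearchCancel)
    println(state)  // RECOGNITION_HIDDEN
}
```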

The method according to an embodiment of the present invention can be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and constructed for the present invention, or may be those known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, flash memory, and the like. Examples of program instructions include not only machine language code such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.

While the invention has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Therefore, the scope of the present invention should not be limited to the described embodiments, but should be determined not only by the appended claims but also by their equivalents.

100: terminal device
110: Touch-sensitive display
120: Processor
130: memory

Claims (10)

  1. A computer-readable recording medium storing an application for providing a search service, which is executed in a terminal device including a processor,
    the application comprising:
    Code for displaying a first search interface corresponding to a text-based search on the display and a browsing area for displaying at least one content;
    Code for receiving a user's touch input to the display during the first search interface display;
    Code for detecting, from the received touch input, a calling interaction for a second search interface corresponding to the recognition search; and
    A code for displaying the second search interface on the display
    wherein the second search interface comprises a graphical user interface for each of a plurality of types of recognition-based searches, and
    wherein, as the second search interface is displayed, the first search interface, the second search interface and the browsing area are displayed together on the display.
  2. The computer-readable recording medium of claim 1,
    wherein, as the second search interface is displayed, the browsing area is dimmed and displayed in a reduced size.
  3. The computer-readable recording medium of claim 1,
    wherein the first search interface includes a search window for receiving a user's text input corresponding to a text-based search.
  4. The computer-readable recording medium of claim 1,
    wherein the second search interface includes a graphical user interface for each of at least one of an image recognition-based search, a speech recognition-based search, and a music recognition-based search.
  5. The computer-readable recording medium of claim 1,
    wherein the calling interaction for the second search interface is a drag input of the user on the display.
  6. The computer-readable recording medium of claim 1,
    wherein the code for displaying the second search interface displays the second search interface in the form of a pull-down menu or a drop-down menu corresponding to the user's touch input on the display.
  7. The computer-readable recording medium of claim 6,
    wherein, when the user's touch input is stopped while the progress of the visualization is less than a predetermined threshold ratio, the code for displaying the second search interface rolls back the progress of the visualization so as to return to the first search interface display state.
  8. The computer-readable recording medium of claim 6,
    wherein, when a touch input of the user to the browsing area is received while the second search interface is displayed, the code for displaying the second search interface returns the display to the first search interface display state that existed before the visualization progressed.
  9. (Deleted)
  10. A computer-implemented search service providing method, comprising:
    Displaying a first search interface corresponding to a text based search on the display and a browsing area for displaying at least one content;
    Receiving a user's touch input to the display during the first search interface display;
    Sensing, from the received touch input, a calling interaction for a second search interface corresponding to the recognition search; and
    Displaying the second search interface on the display,
    wherein the second search interface comprises a graphical user interface for each of a plurality of types of recognition-based searches, and
    wherein, as the second search interface is displayed, the first search interface, the second search interface and the browsing area are displayed together on the display.
KR1020160006856A 2016-01-20 2016-01-20 Apparatus and method for providing searching service KR101659063B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160006856A KR101659063B1 (en) 2016-01-20 2016-01-20 Apparatus and method for providing searching service

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160006856A KR101659063B1 (en) 2016-01-20 2016-01-20 Apparatus and method for providing searching service

Publications (2)

Publication Number Publication Date
KR20160017652A KR20160017652A (en) 2016-02-16
KR101659063B1 true KR101659063B1 (en) 2016-09-26

Family

ID=55448073

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160006856A KR101659063B1 (en) 2016-01-20 2016-01-20 Apparatus and method for providing searching service

Country Status (1)

Country Link
KR (1) KR101659063B1 (en)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101119161B1 (en) * 2009-09-30 2012-03-19 주식회사 하이닉스반도체 Fuse structure for high integrated semiconductor device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050022130A1 (en) 2003-07-01 2005-01-27 Nokia Corporation Method and device for operating a user-input area on an electronic display device
JP2010067032A (en) 2008-09-11 2010-03-25 Yahoo Japan Corp Device, system and method for retrieval of commodity, and program
US20110035406A1 (en) 2009-08-07 2011-02-10 David Petrou User Interface for Presenting Search Results for Multiple Regions of a Visual Query

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018182270A1 (en) * 2017-03-28 2018-10-04 삼성전자 주식회사 Electronic device and screen control method for processing user input by using same

Also Published As

Publication number Publication date
KR20160017652A (en) 2016-02-16

Similar Documents

Publication Publication Date Title
US9535600B2 (en) Touch-sensitive device and touch-based folder control method thereof
JP5933641B2 (en) Device, method and graphical user interface for managing folders
US9448694B2 (en) Graphical user interface for navigating applications
US8806369B2 (en) Device, method, and graphical user interface for managing and interacting with concurrently open software applications
US10223466B2 (en) Method for searching and device thereof
KR101859100B1 (en) Mobile device and control method for the same
JP5987054B2 (en) Device, method and graphical user interface for document manipulation
RU2635231C2 (en) User interface for mobile device application management
US8698845B2 (en) Device, method, and graphical user interface with interactive popup views
EP2510427B1 (en) Device, method, and graphical user interface for navigating through multiple viewing areas
CN103649898B (en) Starter for the menu based on context
US10126930B2 (en) Device, method, and graphical user interface for scrolling nested regions
US9146751B2 (en) Device, method, and graphical user interface for navigation of multiple applications
US9753611B2 (en) Method and apparatus for providing a user interface on a device enabling selection of operations to be performed in relation to content
CN102033710B (en) Method for managing file folder and related equipment
CN105930064B (en) The method and system and calculating equipment of item in managing user interface
JP2016522926A (en) Smart mobile application development platform
AU2014287943B2 (en) User terminal device for supporting user interaction and methods thereof
US9417781B2 (en) Mobile terminal and method of controlling the same
CN101963886B (en) Mobile terminal and method for controlling thereof
US20120123865A1 (en) Enhanced shopping experience for mobile station users
US10444979B2 (en) Gesture-based search
DE112011102383T5 (en) Touch-based gesture detection for a touch-sensitive device
TWI594178B (en) Touch and gesture input-based control method and terminal therefor
US20130227413A1 (en) Method and Apparatus for Providing a Contextual User Interface on a Device

Legal Events

Date Code Title Description
A107 Divisional application of patent
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
FPAY Annual fee payment

Payment date: 20190701

Year of fee payment: 4