US20160274789A1 - Electronic apparatus and method for executing application thereof - Google Patents
- Publication number
- US20160274789A1 (U.S. application Ser. No. 15/032,522)
- Authority
- US
- United States
- Prior art keywords
- user input
- input
- electronic device
- displaying
- menu
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72469—User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
-
- H04M1/72583—
Definitions
- Various embodiments of the present invention relate to an electronic device with a touch-screen and a method of executing applications in the electronic device.
- a touch-screen is configured to have a surface layer for sensing input actions and a display layer for performing an outputting function.
- a touch-screen-based electronic device receives a user input, or a touch gesture, analyzes and recognizes the user input, and outputs a corresponding result.
- the touch screen receives a touch input from the user and transmits a corresponding control command to the electronic device, the electronic device analyzes and recognizes the touch input via the touch sensor, processes a corresponding function, and outputs the result on the touch-screen.
- Electronic devices such as terminals, etc. are capable of executing applications to perform various functions.
- the user activates the corresponding application in the terminal and the terminal executes the function of the activated application.
- the touch-screen provides users with more intuitive and efficient operations in activating applications in the devices.
- the present invention provides an electronic device with a touch screen and a method of easily and efficiently executing applications in the electronic device.
- a method of executing applications in an electronic device includes: creating, on a touch-screen, a keyboard interface including a virtual input keypad and an output area in response to a first user input; receiving a second user input via the virtual input keypad, and displaying the second user input on the output area; determining a data attribute of the second user input; displaying a menu including one or more applications corresponding to the data attribute; and receiving a third user input selecting one of the one or more applications included in the menu and executing the selected application.
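The five claimed steps can be sketched as a short event sequence. This is an illustrative sketch only: the function names, the menu contents, and the single URL rule are assumptions for demonstration, not details taken from the patent.

```python
# Minimal sketch of the claimed method steps. All function names, menu
# contents, and the classification rule are illustrative assumptions.

def handle_first_input():
    """Step 1: a first user input (e.g., an edge drag) creates the keyboard
    interface, consisting of a virtual input keypad and an output area."""
    return {"keypad": True, "output_area": "", "menu": []}

def handle_second_input(ui, text):
    """Steps 2-4: display the typed text on the output area, determine its
    data attribute, and build a menu of matching applications."""
    ui["output_area"] = text
    attribute = "url" if text.startswith(("www", "http")) else "other"
    ui["menu"] = {"url": ["browser"], "other": ["search"]}[attribute]
    return ui

def handle_third_input(ui, choice):
    """Step 5: a third user input selects one menu entry, which is executed
    with the data shown on the output area."""
    if choice not in ui["menu"]:
        raise ValueError("selection must come from the displayed menu")
    return (choice, ui["output_area"])  # (application, launch argument)
```

For example, typing "www.example.com" as the second input yields a menu containing the browser entry, and selecting it launches the browser with that address.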
- an electronic device includes: a touch-screen for receiving user inputs and displaying a keyboard interface including a virtual input keypad and an output area; and a controller for controlling: the display of the keyboard interface in response to a first user input; the display of the second user input, received via the virtual input keypad, on the output area; the determination of a data attribute of the second user input; the display of a menu including one or more applications corresponding to the data attribute; and the execution of an application selected by a third user input, from among the one or more applications included in the menu.
- Embodiments of the present invention allow users to execute functions of an application, more conveniently, using a keyboard interface.
- FIG. 1 is a block diagram showing an electronic device according to an embodiment of the present invention.
- FIG. 2 shows screens that describe a method of executing applications via a keyboard interface according to various embodiments of the present invention.
- FIG. 3 shows screens that describe a method of executing applications via a keyboard interface according to various embodiments of the present invention.
- FIG. 4 shows screens that describe a method of executing a call application via a keyboard interface according to various embodiments of the present invention.
- FIGS. 5A and 5B show screens that describe a method of recognizing an image when a call application is executed via a keyboard interface, according to various embodiments of the present invention.
- FIG. 6 shows screens that describe a method of executing an application via a keyboard interface according to various embodiments of the present invention.
- the electronic device 100 is capable of including a touch-screen 110, a controller 120, a storage unit 130 and a communication unit 140.
- the touch-screen 110 is configured to simultaneously receive touch inputs and perform a display function.
- the touch-screen 110 is implemented to include a touch receiving unit 111 and a display unit 112 .
- the touch receiving unit 111 is capable of receiving a user's touch inputs applied to the touch-screen.
- the touch receiving unit 111 is capable of including a touch sensor for detecting a user's touch inputs.
- the touch sensor may be implemented with a resistive type, a capacitive type, an electromagnetic induction type, a pressure type, and any other types of sensors employing various touch technologies.
- the touch sensor may also be implemented to detect direct contact inputs or proximity inputs close thereto but apart therefrom within a certain distance.
- the display unit 112 displays information processed by the electronic device 100 .
- the display unit 112 is capable of displaying a keyboard interface.
- the display unit 112 is also capable of displaying applications executed via the keyboard interface.
- the controller 120 is capable of controlling all the functions of the electronic device 100 .
- the controller 120 is capable of including a keyboard interface executing unit 121 , a data attribute determining unit 122 and an application executing unit 123 .
- when a first user input is detected via the touch receiving unit 111, the keyboard interface executing unit 121 is capable of outputting a keyboard interface to the display unit 112.
- the first user input may be a dragging-input starting from an edge of the touch-screen.
- the keyboard interface executing unit 121 may display a keyboard interface on an area of the touch-screen, e.g., a bottom area of the touch-screen.
- the displayed keyboard interface may include a virtual input pad and an output area.
- the virtual input pad may be configured, based on various types of text, e.g., Korean, English, symbols, and numbers.
- the virtual input pad may include a text switching key. When the text switching key is selected, the keyboard interface executing unit 121 is capable of switching between types of text for the virtual input pad.
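The text switching key described above can be modeled as a cyclic toggle over the listed text types; the cycle order below is an assumption, since the patent only states that the key switches between types.

```python
# Sketch of the text switching key: pressing it advances the virtual input
# pad to the next text type. The cycle order here is an assumption.

LAYOUTS = ["korean", "english", "symbols", "numbers"]

def switch_layout(current):
    """Return the text type shown after one press of the switching key."""
    return LAYOUTS[(LAYOUTS.index(current) + 1) % len(LAYOUTS)]
```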
- the keyboard interface executing unit 121 is capable of receiving a second user input via the virtual input pad, and displaying the received input on the output area.
- the data attribute determining unit 122 is capable of determining a data attribute of the second user input.
- the data attribute determining unit 122 is capable of determining one or more applications corresponding to the determined data attribute and displaying the application menu on the screen.
- the data attribute may be determined based on a text type of the second user input. For example, when the text type is English and starts with ‘www’ or ‘http,’ the data attribute is considered to be a URL address. When the text type is numeric and starts with ‘010,’ ‘02,’ etc., the data attribute is considered to be a contact phone number.
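The prefix rules stated above translate directly into a small classifier. Only the ‘www’/‘http’ and ‘010’/‘02’ prefixes come from the text; the digits-only check and the fallback label are assumptions.

```python
# The prefix rules stated above, as a classifier. Only the 'www'/'http' and
# '010'/'02' prefixes come from the text; the digits-only check and the
# fallback label are assumptions.

def data_attribute(text):
    """Guess the data attribute of the second user input."""
    if text.startswith(("www", "http")):
        return "url"
    if text.isdigit() and text.startswith(("010", "02")):
        return "phone_number"
    return "unknown"
```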
- when the data attribute is a URL address, the data attribute determining unit 122 is capable of displaying an Internet browser application menu.
- when the data attribute is a contact phone number, the data attribute determining unit 122 is capable of displaying a call or message application menu.
- the data attribute determining unit 122 is capable of determining whether the received second user input corresponds to a name contained in the contact details stored in the storage unit 130.
- when the received second user input has characters corresponding to a name contained in the contact details, the data attribute determining unit 122 is capable of displaying the name on the output area in preview mode.
- when the data attribute determining unit 122 ascertains that the second user input corresponds to a name contained in the contact details, it is capable of determining that the data attribute corresponds to a contact name and displaying a call or message application menu.
- when a second user input, e.g., a number, is input, the data attribute determining unit 122 is capable of determining whether the received number corresponds to a number contained in the contact details stored in the storage unit 130. When the received second user input has characters corresponding to a number contained in the contact details, the data attribute determining unit 122 is capable of displaying the number on the output area in preview mode. When the user perceives that the number displayed in preview mode is a number that he/she wants to search for, he/she selects the displayed number. When the data attribute determining unit 122 ascertains that the second user input corresponds to a number contained in the contact details, it is capable of determining that the data attribute corresponds to a contact number and displaying a call or message application menu.
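The preview behavior above amounts to a prefix lookup over stored contact numbers. A minimal sketch, with invented sample contacts:

```python
# Preview lookup sketch: as digits are typed, contacts whose stored number
# begins with the typed prefix are shown in preview mode. The sample
# contacts are invented for illustration.

CONTACTS = {"Hong Gil Dong": "01012345678", "Office": "0277778888"}

def preview_matches(typed_digits):
    """Return (name, number) pairs whose number starts with the typed digits."""
    return [(name, number) for name, number in CONTACTS.items()
            if number.startswith(typed_digits)]
```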
- the second user input may be determined to have one data attribute or various types of attributes.
- for example, the second user input may be determined to be a contact name or a keyword.
- the data attribute determining unit 122 may display all the applications corresponding to the data attributes, e.g., a call or message application, a search application, etc.
- the data attribute determining unit 122 is capable of determining data attributes of the second user input according to various settings and displaying application menus corresponding to the determined data attributes.
- the application executing unit 123 is capable of executing an application selected by a third user input, from among the displayed application menus. For example, when a call or message application is selected, the application executing unit 123 is capable of executing a call or message function via a contact name or number displayed on the output area of the keyboard interface. When an Internet browser application is selected, the application executing unit 123 is capable of executing the Internet browser via an URL address displayed on the output area of the keyboard interface. When a search application is selected, the application executing unit 123 is capable of executing a search function via a keyword displayed on the output area of the keyboard interface.
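The application executing unit's behavior is essentially a dispatch on the selected menu entry, launching it with whatever the output area shows. A hedged sketch; the action strings are placeholders, not real platform commands:

```python
# Dispatch sketch for the application executing unit: the selected menu
# entry is executed with the data displayed on the output area. The action
# strings are placeholders, not real platform intents.

def execute(application, output_area_data):
    """Map a selected application to an action on the displayed data."""
    actions = {
        "call":    f"dial:{output_area_data}",
        "message": f"sms:{output_area_data}",
        "browser": f"open:{output_area_data}",
        "search":  f"query:{output_area_data}",
    }
    return actions[application]
```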
- the storage unit 130 is capable of storing data and applications required for functions according to embodiments of the present invention therein.
- the storage unit 130 is capable of storing contact details containing names and phone numbers therein.
- the communication unit 140 is capable of allowing the electronic device 100 to transmit/receive data to/from other devices.
- FIG. 2 shows screens that describe a method of executing applications via a keyboard interface according to various embodiments of the present invention.
- Diagram a illustrates an embodiment which displays a keyboard interface 200 in response to a first user input.
- the keyboard interface 200 may be located in at least part of the area of the touch-screen.
- the keyboard interface 200 is capable of including a virtual input pad 210 and an output area 220 .
- the virtual input pad 210 may be set in various modes according to text types, including a key for switching between text types.
- the second user input is displayed on the output area 220 .
- the data attribute of the second user input is determined and a menu 230 including one or more applications corresponding to the determined data attribute is displayed.
- the menu 230 may include call and message applications. Since the data may be determined to be of various types, the menu 230 may also include a search application, an App Market application, etc.
- the menu 230 may also include a default application according to a user's settings. As shown in diagram b, the menu 230 may be displayed on the output area 220 of the keyboard interface 200.
- the menu 230 may be displayed on an area outside the keyboard interface 200 .
- the menu 230 may be displayed in a form of a notification window 240 .
- the menu 230 may be displayed in such a way that one of the application menus is shown in place of one of the keys of the virtual input pad 210. For example, when an input, “Hong Gil Dong,” corresponds to a name contained in the contact details, the call application menu 250 is displayed in place of the Enter key.
- when the second user input corresponds to at least part of a name contained in the contact details, the corresponding name and contact number are displayed on a display box 260. It should be understood that a number of names and contact numbers may be displayed on the display box.
- FIG. 3 shows screens displaying an application menu in response to a second user input according to various embodiments of the present invention.
- as shown in diagram a, when a URL address is input as a second user input, one of the keys of the virtual input pad 210 is replaced by an Internet browser application menu 310.
- when Korean data not contained in the contact details is input as a second user input, a phone number search application menu 320 for searching for a phone number on the web may be displayed, according to settings.
- when a second user input is received, the electronic device is capable of extracting expected input data from the second user input and displaying the extracted data on the output area 220.
- the electronic device determines the attribute of the selected data 330 and displays an application menu 340 corresponding to the determined data attribute.
- when the selected data is a Korean word, a search menu, an Internet browser menu, an App Market application menu, etc. may be displayed according to settings.
- FIG. 4 shows screens that describe a method of executing a call application via a keyboard interface according to various embodiments of the present invention.
- the keyboard interface 200 may be set as a basic option to provide the virtual input pad 210 with a Korean character-based keyboard layout.
- when a text switching key 410 of the virtual input pad 210 is selected, the virtual input pad 210 switches from the Korean character-based keyboard layout to a numeric keyboard layout as shown in diagram b.
- the electronic device is capable of displaying a call application menu 420 in place of one of the keys of the virtual input pad 210, e.g., the Enter key.
- when the text switching key 410 is selected to switch the virtual input pad 210 from the Korean character-based keyboard layout to a numeric keyboard layout, a list of one or more recently made calls may be displayed on the output area 220.
- the virtual input pad 210 may display, on the numeric keys that serve as speed dial numbers, contact details corresponding to the stored phone numbers.
- the electronic device executes a call application and makes a call connection to the received contact number.
- the electronic device makes a call connection to the contact number corresponding to the speed dial number.
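The speed-dial behavior can be sketched as a lookup from a numeric key to a stored contact number; the mapping below is an assumed data structure, since the patent does not specify how speed dials are stored.

```python
# Speed-dial sketch: a numeric key resolves to a stored contact number.
# The mapping is an assumed data structure for illustration.

SPEED_DIAL = {"1": "01011112222", "2": "0233334444"}

def dial_speed(key):
    """Return the contact number for a speed-dial key, or None if unset."""
    return SPEED_DIAL.get(key)
```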
- contact numbers are displayed on the output area 220 .
- contact numbers may be displayed alone or along with contact names.
- the electronic device performs call connection to the corresponding contact number.
- the electronic device performs call connection to the selected contact.
- FIGS. 5A and 5B show screens that describe a method of recognizing an image when a call application is executed via a keyboard interface, according to various embodiments of the present invention.
- an application including an image may be executed on a touch-screen 500 .
- an image recognition menu 510 may be displayed on the keyboard interface 200 according to the settings.
- the image recognition menu 510 may be displayed only when the application including images is being executed on the touch-screen 500 .
- although the embodiment is implemented in such a way that the image recognition menu 510 is displayed on the output area 220 of the keyboard interface 200, it should be understood that the present invention is not limited thereto.
- the image recognition menu 510 may be implemented with a phone number recognition menu.
- a notification window 520 for requesting the selection of an image to be recognized may be displayed as shown in diagram 502 of FIG. 5A . Outputting the notification window 520 may be optionally omitted.
- the user may apply an image selection input 530 to the touch-screen 500 as shown in diagram 503 of FIG. 5A .
- the image selection input 530 may be an underline or a circle for selecting an image.
- the electronic device receives a phone number in the form of an image.
- the electronic device is capable of performing character recognition on the selected image. While character recognition is being performed, a notification window 540 is displayed as shown in diagram 504 of FIG. 5B. When character recognition has been completed, the notification window 540 disappears.
- the electronic device is capable of performing call connection to the recognized phone number.
- the call connection notification window 560 is displayed on the screen as shown in diagram 506 of FIG. 5B .
- FIG. 6 shows screens that describe a method of executing an application via a keyboard interface according to various embodiments of the present invention.
- when the touch-screen 500 does not have an input field requesting the input of data, e.g., when an idle screen or a home screen is displayed, the electronic device is capable of receiving a first user input and outputting a keyboard interface 200 on the touch-screen 500.
- the displayed keyboard interface 200 may be divided into a virtual input pad 210 and an output area 220 .
- the virtual input pad 210 may contain one or more application menus according to a user's settings.
- the virtual input pad 210 may be set in such a way as to contain application menus which are frequently used or essential.
- the output area 220 may display recently used application icons in the form of a history.
- the output area 220 may display icons in order of execution from the latest to the oldest, e.g., a call icon, a message icon, a camera icon, etc.
- when one of the displayed application menus or icons is selected, the electronic device is capable of executing the corresponding application. That is, according to various embodiments of the present invention, the keyboard interface 200 serves as a tool for executing applications.
- the electronic device displays the received input on the output area 220 .
- the electronic device displays, on the output area 220 , contact details corresponding to the name, e.g., a photo, a name, a phone number, etc.
- the electronic device may further display a message asking the user whether he/she wants to make a call to a corresponding contact.
- when the electronic device receives the second user input via the virtual input pad 210, it searches for an application menu starting with the data of the second user input and displays the second user input along with the found application menu on the output area 220. For example, when the user inputs the letters “car,” the electronic device displays application icons starting with “car” on the output area 220.
- the application search may be performed for applications stored in the electronic device. Alternatively, the application search may be performed with respect to servers via a network. According to an embodiment, when the electronic device has not found an application menu starting with the data of the received second user input as shown in diagram c, it determines whether the corresponding data is stored in the contact details as shown in diagram b.
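The fallback described above (search installed applications first, then consult the contact details) can be sketched as follows; the sample application and contact lists are invented for illustration.

```python
# Fallback search sketch: installed applications are searched first; when
# none match, the contact details are consulted. Sample data is invented.

APPS = ["camera", "calendar", "calculator"]
CONTACT_NAMES = ["Hong Gil Dong", "Carmen"]

def search(typed):
    """Return ('applications', [...]) or, as a fallback, ('contacts', [...])."""
    apps = [a for a in APPS if a.startswith(typed)]
    if apps:
        return ("applications", apps)
    names = [n for n in CONTACT_NAMES if n.lower().startswith(typed.lower())]
    return ("contacts", names)
```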
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
- Input From Keyboards Or The Like (AREA)
Description
- In recent years, the touch-screen market has grown fast. As mobile terminals, laptop computers, smartphones, etc., are equipped with a touch-screen panel, the portable touch-screen market is growing rapidly. In addition, home appliances start to employ a touch-screen panel. In the near future, most home appliances will be equipped with a touch-screen panel.
- Conventional electronic devices are disadvantageous in that, in order to execute functions of an application, they execute the application first and then perform the functions according to inputs that the user applies to the touch screen every time. The present invention has been made to address the above problems and disadvantages, and to provide at least the advantages described below. The present invention provides an electronic device with a touch screen and a method of easily and efficiently executing applications in the electronic device.
- Various embodiments of the present invention are described in detail referring to the accompanying drawings. The same reference numbers are used throughout the drawings to refer to the same or similar parts. Detailed descriptions of well-known functions and structures incorporated herein may be omitted to avoid obscuring the subject matter of the invention.
- The terms as used in the present disclosure are merely for the purpose of describing particular embodiments and are not intended to limit the present disclosure. Unless defined otherwise, all terms used herein, including technical terms and scientific terms, have the same meaning as commonly understood by a person of ordinary skill in the art to which the present disclosure pertains. Such terms as those defined in a generally used dictionary are to be interpreted to have the meanings equal to the contextual meanings in the relevant field of art, and are not to be interpreted to have ideal or excessively formal meanings unless clearly defined in the present disclosure.
- In the present disclosure, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms “includes,” “comprises,” “including” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
- Referring to
FIG. 1, the electronic device 100 is capable of including a touch-screen 110, a controller 120, a storage unit 130 and a communication unit 140. - The touch-
screen 110 is configured to perform both an input receiving function and a displaying function. The touch-screen 110 is implemented to include a touch receiving unit 111 and a display unit 112. - The
touch receiving unit 111 is capable of receiving a user's touch inputs applied to the touch-screen. The touch receiving unit 111 is capable of including a touch sensor for detecting a user's touch inputs. The touch sensor may be implemented with a resistive type, a capacitive type, an electromagnetic induction type, a pressure type, and any other types of sensors employing various touch technologies. The touch sensor may also be implemented to detect direct contact inputs or proximity inputs made within a certain distance of the screen without direct contact. - The
display unit 112 displays information processed by the electronic device 100. The display unit 112 is capable of displaying a keyboard interface. The display unit 112 is also capable of displaying applications executed via the keyboard interface. - The
controller 120 is capable of controlling all the functions of the electronic device 100. - The
controller 120 is capable of including a keyboard interface executing unit 121, a data attribute determining unit 122 and an application executing unit 123. - When a first user input is detected via the
touch receiving unit 111, the keyboard interface executing unit 121 is capable of outputting a keyboard interface to the display unit 112. The first user input may be a dragging input starting from an edge of the touch-screen. For example, when detecting a dragging input applied to the touch-screen in a direction from the bottom to the top, the keyboard interface executing unit 121 may display a keyboard interface on an area of the touch-screen, e.g., a bottom area of the touch-screen. - The displayed keyboard interface may include a virtual input pad and an output area. The virtual input pad may be configured based on various types of text, e.g., Korean, English, symbols, and numbers. In addition, the virtual input pad may include a text switching key. When the text switching key is selected, the keyboard
interface executing unit 121 is capable of switching between types of text for the virtual input pad. - The keyboard
interface executing unit 121 is capable of receiving a second user input via the virtual input pad, and displaying the received input on the output area. - The data
attribute determining unit 122 is capable of determining a data attribute of the second user input. The data attribute determining unit 122 is capable of determining one or more applications corresponding to the determined data attribute and displaying the application menu on the screen. The data attribute may be determined based on a text type of the second user input. For example, when the text type is English, starting with ‘www’ or ‘http,’ the data attribute is considered to be a URL address. When the text type is a number, starting with ‘010,’ ‘02,’ etc., the data attribute is considered to be a contact phone number. When the data attribute is determined as a URL address, the data attribute determining unit 122 is capable of displaying an Internet browser application menu. When the data attribute is determined as a contact phone number, the data attribute determining unit 122 is capable of displaying a call or message application menu. - When a second user input, e.g., Korean text, is input, the data attribute determining
unit 122 is capable of determining whether the received second user input corresponds to a name contained in the contact details stored in the storage unit 130. When the received second user input has characters corresponding to a name contained in the contact details, the data attribute determining unit 122 is capable of displaying the name on the output area in preview mode. When the user perceives that the name displayed in preview mode is a name that he/she wants to search for, he/she selects the displayed name. When the data attribute determining unit 122 ascertains that the second user input corresponds to a name contained in the contact details, it is capable of determining that the data attribute corresponds to a contact name and displaying a call or message application menu. - When a second user input, e.g., a number, is input, the data attribute determining
unit 122 is capable of determining whether the received number corresponds to a number contained in the contact details stored in the storage unit 130. When the received second user input has characters corresponding to a number contained in the contact details, the data attribute determining unit 122 is capable of displaying the number on the output area in preview mode. When the user perceives that the number displayed in preview mode is a number that he/she wants to search for, he/she selects the displayed number. When the data attribute determining unit 122 ascertains that the second user input corresponds to a number contained in the contact details, it is capable of determining that the data attribute corresponds to a contact number and displaying a call or message application menu. - The second user input may be determined to have one data attribute or various types of data attributes. For example, the second user input may be determined based on a contact name or a keyword. In this case, the data attribute determining
unit 122 may display all the applications corresponding to the data attributes, e.g., a call or message application, a search application, etc. - In addition, the data attribute determining
unit 122 is capable of determining data attributes of the second user input according to various settings and displaying application menus corresponding to the determined data attributes. - The
application executing unit 123 is capable of executing an application selected by a third user input, from among the displayed application menus. For example, when a call or message application is selected, the application executing unit 123 is capable of executing a call or message function via a contact name or number displayed on the output area of the keyboard interface. When an Internet browser application is selected, the application executing unit 123 is capable of executing the Internet browser via a URL address displayed on the output area of the keyboard interface. When a search application is selected, the application executing unit 123 is capable of executing a search function via a keyword displayed on the output area of the keyboard interface. - The
storage unit 130 is capable of storing data and applications required for functions according to embodiments of the present invention therein. The storage unit 130 is capable of storing contact details containing names and phone numbers therein. - The
communication unit 140 is capable of allowing the electronic device 100 to transmit/receive data to/from other devices. -
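The prefix-based determination described above for the data attribute determining unit 122 can be sketched in a few lines. This is a minimal illustration only; the function name, return labels, and menu mapping are assumptions rather than the claimed implementation.

```python
# Minimal sketch of the data-attribute determination described above.
# The prefixes ('www'/'http' for a URL address, '010'/'02' for a contact
# phone number) follow the description; names and labels are assumptions.

def determine_data_attribute(text):
    """Classify the second user input by its text type."""
    if text.startswith(("www", "http")):
        return "url"
    if text.isdigit() and text.startswith(("010", "02")):
        return "phone_number"
    return "text"

# Application menu displayed for each determined attribute (illustrative).
MENU_BY_ATTRIBUTE = {
    "url": ["Internet browser"],
    "phone_number": ["call", "message"],
    "text": ["search"],
}
```

Under this sketch, an input such as "01012345678" would surface the call and message application menus, mirroring the behavior described for a contact phone number.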
FIG. 2 shows screens that describe a method of executing applications via a keyboard interface according to various embodiments of the present invention. - Diagram a illustrates an embodiment which displays a
keyboard interface 200 in response to a first user input. The keyboard interface 200 may be located in at least part of the area of the touch-screen. The keyboard interface 200 is capable of including a virtual input pad 210 and an output area 220. The virtual input pad 210 may be set in various modes according to text types, including a key for switching between text types. - Referring to diagrams b to e, when the user inputs a second user input to the
virtual input pad 210, the second user input is displayed on the output area 220. The data attribute of the second user input is determined and a menu 230 including one or more applications corresponding to the determined data attribute is displayed. When the input, e.g., Hong Gil Dong, corresponds to a name contained in the contact details, the menu 230 may include call and message applications. Since data may be determined as various types, the menu 230 may also include a search application, App Market application, etc. The menu 230 may also include a default application according to a user's settings. As shown in diagram b, the menu 230 may be displayed on the output area 220 of the keyboard interface 200. As shown in diagram c, the menu 230 may be displayed on an area outside the keyboard interface 200. As shown in diagram d, the menu 230 may be displayed in the form of a notification window 240. In addition, the menu 230 may be displayed in such a way that one of the application menus is substituted for one of the keys of the virtual input pad 210. For example, when an input, “Hong Gil Dong,” corresponds to a name contained in the contact details, the call application menu 250 is displayed in place of the Enter key. - Referring to diagram f, when the second user input corresponds to at least part of a name contained in the contact details, the corresponding name and contact number are displayed on a
display box 260. It should be understood that a number of names and contact numbers may be displayed on the display box. -
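The partial-match behavior of diagram f, where candidate names and numbers appear as the user types, can be approximated with a simple prefix lookup. The in-memory dictionary below is a hypothetical stand-in for the contact details in the storage unit 130, and the sample entries are invented.

```python
# Hypothetical sketch of the preview lookup shown in diagram f: a partial
# second user input is matched against names and numbers in the contact
# details. The contact store stands in for the storage unit 130.

CONTACTS = {
    "Hong Gil Dong": "010-1234-5678",
    "Hong Eun Hee": "010-8765-4321",
    "Kim Young Soo": "02-987-6543",
}

def preview_matches(partial):
    """Return (name, number) pairs whose name or number starts with partial."""
    digits = partial.replace("-", "")
    return [
        (name, number) for name, number in CONTACTS.items()
        if name.lower().startswith(partial.lower())
        or (digits.isdigit() and number.replace("-", "").startswith(digits))
    ]
```

Typing "Hong" would list both Hong entries, while typing "0101" would narrow the preview to the single number beginning with those digits.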
FIG. 3 shows screens displaying an application menu in response to a second user input according to various embodiments of the present invention. As shown in diagram a, when a URL address is input as a second user input, one of the keys of the virtual input pad 210 is replaced by an Internet browser application menu 310. As shown in diagram b, when Korean text not contained in the contact details is input as a second user input, a phone number search application menu 320 for searching for a phone number via the web may be displayed, according to settings. - Referring to diagrams c and d, when a second user input is received, the electronic device is capable of extracting expected input data from the second user input and displaying the extracted data on the
output area 220. When the user selects from among the input data or expected input data displayed on the output area 220, i.e., selected data 330, the electronic device determines the attribute of the selected data 330 and displays an application menu 340 corresponding to the determined data attribute. When the selected data is a Korean word, a search menu, an Internet browser menu, an App Market application menu, etc. may be displayed according to settings. -
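The flow of diagrams c and d, selecting data on the output area, determining its attribute, and surfacing a matching application menu 340, might look like the following sketch. The attribute rules and menu names are illustrative assumptions, not the patented user interface.

```python
# Illustrative sketch of diagrams c and d: once data is selected on the
# output area, its determined attribute decides which application menu
# to display. Rules and menu names are assumptions for illustration.

def menu_for_selected_data(selected, contact_names=()):
    """Return the application menu for the selected data 330."""
    if selected.startswith(("www", "http")):
        return ["Internet browser"]
    if selected.replace("-", "").isdigit():
        return ["call", "message"]
    if selected in contact_names:
        return ["call", "message"]
    # Plain keyword: search-style menus, per the settings described.
    return ["search", "Internet browser", "App Market"]
```

A plain word such as "weather" would fall through to the search-style menus, while a selected phone number would surface the call and message menus.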
FIG. 4 shows screens that describe a method of executing a call application via a keyboard interface according to various embodiments of the present invention. - As shown in diagram a, the
keyboard interface 200 may be set as a basic option to provide the virtual input pad 210 with a Korean character-based keyboard layout. When a text switching key 410 of the virtual input pad 210 is selected, the virtual input pad 210 switches from the Korean character-based keyboard layout to a numeric-based keyboard layout as shown in diagram b. When switching to a virtual input pad 210 with a number-based keyboard layout, the electronic device is capable of displaying a call application menu 420 which is substituted for one of the keys of the virtual input pad 210, e.g., the Enter key. When the text switching key 410 is selected to switch the virtual input pad 210 from the Korean character-based keyboard layout to a numeric-based keyboard layout, a list of one or more calls that have recently been made may be displayed on the output area 220. When speed dial numbers have already been set for phone numbers, the virtual input pad 210 may display contact details corresponding to the phone numbers on the numeric keys that serve as their speed dial numbers. - As shown in diagram c, when the user inputs a second user input corresponding to a contact number and then a third user input for selecting a
call application menu 420, the electronic device executes a call application and makes a call connection to the received contact number. As shown in diagram d, when the user applies a long-push input to a speed dial number, the electronic device makes a call connection to the contact number corresponding to the speed dial number. Alternatively, when the user inputs a speed dial number and then selects a call application menu 420, the electronic device makes a call connection to the contact number corresponding to the speed dial number. - Referring to diagrams e and f, when the second user input corresponds to at least part of a contact number or contact numbers contained in the contact details, one or more contact numbers are displayed on the
output area 220. In this case, contact numbers may be displayed alone or along with contact names. When the user selects one of the displayed contact numbers, the electronic device performs call connection to the corresponding contact number. Alternatively, when the user selects one of the displayed contact numbers and then a call application menu 420, the electronic device performs call connection to the selected contact. -
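The call-target resolution described in diagrams c to f of FIG. 4 reduces to a small mapping: a speed dial key resolves to its stored number, while a fully entered number is dialed as-is. The mapping below and the sample numbers are hypothetical.

```python
# Hedged sketch of the speed-dial flow in FIG. 4: a long press on a
# numeric key, or a digit plus the call application menu 420, resolves
# to the stored contact number. The mapping is an illustrative assumption.

SPEED_DIAL = {1: "010-1111-2222", 2: "010-3333-4444"}

def resolve_call_target(entered, speed_dial=SPEED_DIAL):
    """Map a dialed entry to the number that should be called."""
    if isinstance(entered, int):      # speed dial key pressed
        return speed_dial.get(entered)
    return entered                    # fully entered contact number
```

An unassigned speed dial key resolves to nothing, in which case the device would have no contact number to connect.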
FIGS. 5A and 5B show screens that describe a method of recognizing an image when a call application is executed via a keyboard interface, according to various embodiments of the present invention. -
FIG. 5A, an application including an image, e.g., an image of a contact detail, may be executed on a touch-screen 500. When a first user input is applied to the touch-screen 500, the corresponding keyboard interface 200 is displayed on the screen. An image recognition menu 510 may be displayed on the keyboard interface 200 according to the settings. Alternatively, the image recognition menu 510 may be displayed only when an application including images is being executed on the touch-screen 500. Although the embodiment is implemented in such a way that the image recognition menu 510 is displayed on the output area 220 of the keyboard interface 200, it should be understood that the present invention is not limited thereto. For example, the image recognition menu 510 may be implemented as a phone number recognition menu. - When the
image recognition menu 510 is selected, a notification window 520 for requesting the selection of an image to be recognized may be displayed as shown in diagram 502 of FIG. 5A. Outputting the notification window 520 may be optionally omitted. After that, the user may apply an image selection input 530 to the touch-screen 500 as shown in diagram 503 of FIG. 5A. The image selection input 530 may be an underline or a circle for selecting an image. In the embodiment, the electronic device receives a phone number in the form of an image. The electronic device is capable of performing character recognition on the selected image. While character recognition is being performed, the notification window 540 is displayed as shown in diagram 504 of FIG. 5B. When character recognition has been completed, the notification window 540 disappears. After that, as shown in diagram 505 of FIG. 5B, when a call application menu 550 is selected, the electronic device is capable of performing call connection to the recognized phone number. When call connection to the recognized phone number is performed, the call connection notification window 560 is displayed on the screen as shown in diagram 506 of FIG. 5B. -
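Once the character recognition step of FIGS. 5A and 5B has produced raw text, pulling a phone number out of that text could be done with a simple pattern. The recognition step itself would require an OCR engine and is outside this sketch, and the Korean-style number format matched here is an assumption.

```python
# Illustrative post-processing for the image recognition flow: given raw
# text produced by character recognition, extract a phone number. The
# hyphenated Korean-style number format is an assumption for this sketch.

import re

def extract_phone_number(recognized_text):
    """Return the first phone-number-like token in the recognized text."""
    match = re.search(r"\b0\d{1,2}-\d{3,4}-\d{4}\b", recognized_text)
    return match.group(0) if match else None
```

If no number-like token is found, the sketch returns nothing, and the device would have no recognized phone number to dial.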
FIG. 6 shows screens that describe a method of executing an application via a keyboard interface according to various embodiments of the present invention. - Referring to diagram a, when the touch-
screen 500 does not have an input field for requesting the input of data, e.g., when an idle state screen or a home screen is displayed, the electronic device is capable of receiving a first user input and outputting a keyboard interface 200 on the touch-screen 500. The displayed keyboard interface 200 may be divided into a virtual input pad 210 and an output area 220. The virtual input pad 210 may contain one or more application menus according to a user's settings. The virtual input pad 210 may be set in such a way as to contain application menus which are frequently used or essential. The output area 220 may display recently used application icons in the form of a history. For example, the output area 220 may display icons in order of execution from the latest to the oldest, e.g., in order of a call icon, a message icon, a camera icon, etc. When the user selects an icon corresponding to an application, the electronic device is capable of executing the application. That is, according to various embodiments of the present invention, the keyboard interface 200 serves as a tool for executing applications. - Referring to diagram b, when the user inputs a second user input to the
virtual input pad 210, the electronic device displays the received input on the output area 220. When a second user input of “Hong Gil Dong” corresponds to a name contained in the contact details, the electronic device displays, on the output area 220, contact details corresponding to the name, e.g., a photo, a name, a phone number, etc. The electronic device may further display a message asking the user whether he/she wants to make a call to the corresponding contact. - Referring to diagram c, when the electronic device receives the second user input via the
virtual input pad 210, it searches for an application menu starting with the data of the second user input and displays the second user input along with the found application menu on the output area 220. For example, when the user inputs the letters “car,” the electronic device displays application icons starting with “car” on the output area 220. The application search may be performed for applications stored in the electronic device. Alternatively, the application search may be performed with respect to servers via a network. According to an embodiment, when the electronic device fails to find an application menu starting with the data of the received second user input as shown in diagram c, it determines whether a corresponding entry is stored in the contact details as shown in diagram b. - The embodiments of the present invention described in the description and drawings are merely provided to assist in a comprehensive understanding of the invention and are not intended to be limiting. Although embodiments of the invention have been described in detail above, it should be understood that many variations and modifications of the basic inventive concept herein described, which may be apparent to those skilled in the art, will still fall within the spirit and scope of the embodiments of the invention as defined in the appended claims.
Claims (24)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2013-0132807 | 2013-11-04 | ||
KR1020130132807A KR102204261B1 (en) | 2013-11-04 | 2013-11-04 | Electronic device and method for executing application thereof |
PCT/KR2014/010480 WO2015065146A1 (en) | 2013-11-04 | 2014-11-04 | Electronic apparatus and method for executing application thereof |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2014/010480 A-371-Of-International WO2015065146A1 (en) | 2013-11-04 | 2014-11-04 | Electronic apparatus and method for executing application thereof |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/022,708 Continuation US11379116B2 (en) | 2013-11-04 | 2020-09-16 | Electronic apparatus and method for executing application thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160274789A1 true US20160274789A1 (en) | 2016-09-22 |
Family
ID=53004645
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/032,522 Abandoned US20160274789A1 (en) | 2013-11-04 | 2014-11-04 | Electronic apparatus and method for executing application thereof |
US17/022,708 Active US11379116B2 (en) | 2013-11-04 | 2020-09-16 | Electronic apparatus and method for executing application thereof |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/022,708 Active US11379116B2 (en) | 2013-11-04 | 2020-09-16 | Electronic apparatus and method for executing application thereof |
Country Status (6)
Country | Link |
---|---|
US (2) | US20160274789A1 (en) |
KR (1) | KR102204261B1 (en) |
CN (1) | CN105706039A (en) |
DE (1) | DE112014005034T5 (en) |
GB (1) | GB2534100B (en) |
WO (1) | WO2015065146A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180113609A1 (en) * | 2016-02-11 | 2018-04-26 | Hyperkey, Inc. | Enhanced Keyboard Including Multiple Application Execution |
US10976923B2 (en) | 2016-02-11 | 2021-04-13 | Hyperkey, Inc. | Enhanced virtual keyboard |
US11382812B2 (en) * | 2017-06-27 | 2022-07-12 | Stryker Corporation | Patient support systems and methods for assisting caregivers with patient care |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10489768B2 (en) * | 2015-12-30 | 2019-11-26 | Visa International Service Association | Keyboard application with third party engagement selectable items |
KR102167774B1 (en) * | 2019-06-14 | 2020-10-19 | 최현준 | Telephone application image control method, program and computer readable recording medium |
WO2021033221A1 (en) * | 2019-08-16 | 2021-02-25 | ソニー株式会社 | Information processing device, information processing method, and information processing program |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060015819A1 (en) * | 1999-08-12 | 2006-01-19 | Hawkins Jeffrey C | Integrated handheld computing and telephony system and services |
US20070156747A1 (en) * | 2005-12-12 | 2007-07-05 | Tegic Communications Llc | Mobile Device Retrieval and Navigation |
US20100008490A1 (en) * | 2008-07-11 | 2010-01-14 | Nader Gharachorloo | Phone Dialer with Advanced Search Feature and Associated Method of Searching a Directory |
US20100318696A1 (en) * | 2009-06-15 | 2010-12-16 | Nokia Corporation | Input for keyboards in devices |
US20110125733A1 (en) * | 2009-11-25 | 2011-05-26 | Fish Nathan J | Quick access utility |
US20110320307A1 (en) * | 2010-06-18 | 2011-12-29 | Google Inc. | Context-influenced application recommendations |
US20130285926A1 (en) * | 2012-04-30 | 2013-10-31 | Research In Motion Limited | Configurable Touchscreen Keyboard |
US20140123018A1 (en) * | 2012-11-01 | 2014-05-01 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US8775407B1 (en) * | 2007-11-12 | 2014-07-08 | Google Inc. | Determining intent of text entry |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7793233B1 (en) * | 2003-03-12 | 2010-09-07 | Microsoft Corporation | System and method for customizing note flags |
JP2004295805A (en) | 2003-03-28 | 2004-10-21 | Clarion Co Ltd | Keyword extraction device and automatic application starting device |
JP4297442B2 (en) * | 2004-11-30 | 2009-07-15 | 富士通株式会社 | Handwritten information input device |
KR100657630B1 (en) * | 2005-05-20 | 2006-12-20 | 주식회사 케이티프리텔 | Method and apparatus for performing application using number keyword |
US9204525B2 (en) | 2006-07-07 | 2015-12-01 | Cocoon Inc. | Protective covers |
KR100832800B1 (en) | 2007-02-03 | 2008-05-27 | 엘지전자 주식회사 | Mobile communication device providing candidate telephone number and control method thereof |
KR101626461B1 (en) * | 2009-06-09 | 2016-06-02 | 삼성전자주식회사 | Method for providing UI and display apparatus applying the same |
CN102006563A (en) * | 2009-09-01 | 2011-04-06 | 中兴通讯股份有限公司 | Information file processing method and device |
KR101160543B1 (en) * | 2010-02-26 | 2012-06-28 | 에스케이플래닛 주식회사 | Method for providing user interface using key word and terminal |
KR101951257B1 (en) | 2011-09-09 | 2019-02-26 | 삼성전자주식회사 | Data input method and portable device thereof |
US9596515B2 (en) * | 2012-01-04 | 2017-03-14 | Google Inc. | Systems and methods of image searching |
KR101156610B1 (en) * | 2012-03-20 | 2012-06-14 | 라오넥스(주) | Method for input controlling by using touch type, and computer-readable recording medium with controlling program using touch type |
KR102039553B1 (en) | 2012-08-31 | 2019-11-01 | 삼성전자 주식회사 | Method and apparatus for providing intelligent service using inputted character in a user device |
KR20140043644A (en) * | 2012-10-02 | 2014-04-10 | 엘지전자 주식회사 | Mobile terminal and control method for the mobile terminal |
US9519403B2 (en) | 2013-05-21 | 2016-12-13 | Samsung Electronics Co., Ltd. | Method and apparatus for performing URL linkage function using the keypad |
KR102181895B1 (en) | 2013-05-21 | 2020-11-23 | 삼성전자 주식회사 | Method and apparatus for performing an interlocking operation to a URL using the keypad |
TWI510994B (en) | 2013-09-13 | 2015-12-01 | Acer Inc | Electronic apparatus and method for controlling the same |
-
2013
- 2013-11-04 KR KR1020130132807A patent/KR102204261B1/en active IP Right Grant
-
2014
- 2014-11-04 DE DE112014005034.2T patent/DE112014005034T5/en active Pending
- 2014-11-04 US US15/032,522 patent/US20160274789A1/en not_active Abandoned
- 2014-11-04 CN CN201480060294.5A patent/CN105706039A/en active Pending
- 2014-11-04 WO PCT/KR2014/010480 patent/WO2015065146A1/en active Application Filing
- 2014-11-04 GB GB1607775.2A patent/GB2534100B/en active Active
-
2020
- 2020-09-16 US US17/022,708 patent/US11379116B2/en active Active
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180113609A1 (en) * | 2016-02-11 | 2018-04-26 | Hyperkey, Inc. | Enhanced Keyboard Including Multiple Application Execution |
US10768810B2 (en) * | 2016-02-11 | 2020-09-08 | Hyperkey, Inc. | Enhanced keyboard including multiple application execution |
US10976923B2 (en) | 2016-02-11 | 2021-04-13 | Hyperkey, Inc. | Enhanced virtual keyboard |
US11382812B2 (en) * | 2017-06-27 | 2022-07-12 | Stryker Corporation | Patient support systems and methods for assisting caregivers with patient care |
US20220304875A1 (en) * | 2017-06-27 | 2022-09-29 | Stryker Corporation | Patient Support Systems And Methods For Assisting Caregivers With Patient Care |
Also Published As
Publication number | Publication date |
---|---|
US20210004156A1 (en) | 2021-01-07 |
KR20150051409A (en) | 2015-05-13 |
CN105706039A (en) | 2016-06-22 |
KR102204261B1 (en) | 2021-01-18 |
US11379116B2 (en) | 2022-07-05 |
WO2015065146A1 (en) | 2015-05-07 |
GB2534100A (en) | 2016-07-13 |
DE112014005034T5 (en) | 2016-09-22 |
GB201607775D0 (en) | 2016-06-15 |
GB2534100B (en) | 2021-07-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11379116B2 (en) | Electronic apparatus and method for executing application thereof | |
CN105843491B (en) | Page rapid navigation switching method and device and terminal | |
US11054988B2 (en) | Graphical user interface display method and electronic device | |
KR101152008B1 (en) | Method and device for associating objects | |
EP2523070A2 (en) | Input processing for character matching and predicted word matching | |
CN107103013B (en) | Mobile terminal and control method thereof | |
CN105630327B (en) | The method of the display of portable electronic device and control optional element | |
JP5739131B2 (en) | Portable electronic device, control method and program for portable electronic device | |
US20130120271A1 (en) | Data input method and apparatus for mobile terminal having touchscreen | |
EP2350800A1 (en) | Live preview of open windows | |
EP3531258A1 (en) | Method for searching for icon, and terminal | |
US20140240262A1 (en) | Apparatus and method for supporting voice service in a portable terminal for visually disabled people | |
EP2884382B1 (en) | Dynamic application association with hand-written pattern | |
WO2022242586A1 (en) | Application interface method and apparatus, and electronic device | |
US10627953B2 (en) | Information processing apparatus, program, and information processing system | |
EP2741194A1 (en) | Scroll jump interface for touchscreen input/output device | |
WO2012008425A1 (en) | Electronic device and method of controlling same | |
US20120200508A1 (en) | Electronic device with touch screen display and method of facilitating input at the electronic device | |
CA2766877C (en) | Electronic device with touch-sensitive display and method of facilitating input at the electronic device | |
EP2485133A1 (en) | Electronic device with touch-sensitive display and method of facilitating input at the electronic device | |
JP2018132954A (en) | Information processing device, information processing method, information processing program, and computer readable storage media | |
KR101570510B1 (en) | Method and System to Display Search Result for fast scan of Search Result using Touch type Terminal | |
KR101039927B1 (en) | Function execution method and device for application of mobile terminal | |
TW201520946A (en) | Method for fast displaying Skype contact person list and computer program product thereof and portable electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PARK, SEHWAN;PARK, SUNGWOOK;LEE, JAEYONG;REEL/FRAME:038396/0150 Effective date: 20160420 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |