US20230043168A1 - Method for outputting command method

Method for outputting command method

Info

Publication number
US20230043168A1
Authority
US
United States
Prior art keywords
item
graphic object
touch
processor
input
Prior art date
Legal status
Abandoned
Application number
US17/971,083
Inventor
Hyo June Kim
Ji Hye SEO
Hye In KIM
Ye In KIM
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to US17/971,083
Publication of US20230043168A1
Status: Abandoned

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 - Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 - Interaction techniques using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G06F3/04886 - Interaction techniques using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F3/16 - Sound input; Sound output
    • G06F3/167 - Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 - User authentication
    • G06F21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/1365 - Matching; Classification

Definitions

  • a process of finding and executing a desired command is accompanied by a plurality of touch inputs and touch input releases. Accordingly, the process of outputting and executing a command menu is complicated and requires many steps to process a command.
  • a process of executing a command requires an additional operation of applying touch input.
  • a method of outputting a command menu includes: outputting, in response to detection of a user's touch input, an upper graphic object indicating an upper layer menu among the command menus at the point from which the touch input is detected; detecting a point to which the touch point is moved in response to a touch movement input made while the touch input is maintained; selecting a target upper item corresponding to the moved point from among candidate upper items of the upper layer menu; outputting a lower graphic object indicating a lower layer menu corresponding to the selected target upper item while extending the lower graphic object from the upper graphic object; detecting a drop input of releasing the touch point from the lower layer menu; and executing, in response to a case in which the drop input is detected at a point corresponding to one target lower item among candidate lower items of the lower layer menu, an operation corresponding to the target lower item.
  • the executing of the operation corresponding to the target lower item may include outputting, before the operation corresponding to the target lower item is executed, a graphic object for requesting user approval as to whether to execute the operation.
  • the method may further include acquiring biometric information of the user in response to the touch input of the user, matching the biometric information and registered user information from a registered database, granting access authority for at least one of an application, a device, and a menu to the user in response to a case in which matching is successful, and outputting at least one of an upper graphic object and a lower graphic object based on the granted access authority.
  • the detecting of the drop input may further include selecting a target lower item corresponding to the touch point from among the candidate lower items of the lower layer menu, visualizing information on the target lower item semi-transparently, and overlaying and outputting a graphic object for the lower layer menu on the information on the target lower item.
  • the method may further include detecting an external touch input of at least one point of a region outside a graphic object for the layer menu, detecting a movement trajectory of the external touch input, and rotating the graphic object based on the movement trajectory when the touch input is maintained in a region inside a graphic object for the layer menu.
  • the method may further include detecting an external touch input at one or more points in a region outside a graphic object for the layer menu, detecting a movement trajectory of the external touch input, and moving at least a portion of the graphic object in a direction identified based on the external touch input.
  • a graphic object indicating only some items of the candidate lower items of the lower layer menu may be output, and, in response to user input distinct from the touch input, at least some of the remaining items may be exposed while output of at least some of the previously displayed items is excluded.
  • an item array combination of the lower graphic object may be output around one item of the upper layer menu based on an execution history of the user.
  • a method of outputting a layered command menu on a display includes: outputting, in response to detection of the user touch input, an upper graphic object indicating an upper layer menu among the command menus at a reference point from which the touch input is detected; detecting a touch point moved from the reference point in response to a touch movement input made while the user touch input is maintained; selecting a target upper item of the upper layer menu in response to a case in which the touch point is moved to a point indicating the target upper item among candidate upper items of the upper layer menu; detecting a touch return movement input in which the touch point returns to the reference point after the target upper item is selected; and, in response to a case in which the touch return movement input is detected, replacing the upper graphic object with a lower graphic object indicating a lower layer menu corresponding to the target upper item and outputting the replacement lower graphic object.
  • a method of outputting a layered command menu on a display includes: in response to a case in which a manipulation detector detects touch input of a user, outputting a manipulation indicator at a portion of the display indicating a touch point from which the touch input is detected; outputting an upper graphic object indicating an upper layer menu among the layered command menus on the display based on the manipulation indicator; detecting a point to which the manipulation indicator is moved in response to a touch movement input made while the user touch input is maintained; selecting a target upper item corresponding to the point to which the manipulation indicator is moved from among candidate upper items of the upper layer menu; outputting a lower graphic object indicating a lower layer menu corresponding to the target upper item while extending the lower graphic object from the upper graphic object; detecting a drop input of releasing the touch point at a point corresponding to one target lower item among candidate lower items of the lower layer menu; and, in response to a case in which the drop input is detected, executing an operation corresponding to the target lower item.
  • FIG. 1 is a flowchart of a method of outputting a command menu on a large screen depending on an embodiment.
  • FIG. 2 is a diagram showing an upper graphic object for explaining a method of outputting a command menu on a large screen depending on an embodiment.
  • FIG. 3 is a diagram showing a lower graphic object for explaining a method of outputting a command menu on a large screen depending on an embodiment.
  • FIG. 4 is a diagram showing a graphic object indicating some items of a command menu on a large screen depending on an embodiment.
  • FIG. 5 is a diagram showing rotation of a graphic object depending on an embodiment.
  • FIG. 6 is a diagram showing movement of a graphic object depending on an embodiment.
  • FIG. 7 is a flowchart of a method of outputting a command menu on a small screen depending on an embodiment.
  • FIG. 8 is a diagram showing an upper graphic object for explaining a method of outputting a command menu on a small screen depending on an embodiment.
  • FIG. 9 is a diagram showing a lower graphic object for explaining a method of outputting a command menu on a small screen depending on an embodiment.
  • FIG. 10 is a diagram showing a graphic object indicating some items of a command menu on a small screen depending on an embodiment.
  • FIG. 11 is a diagram for explaining a method of outputting a command menu when a touch detector and a display are separated from each other depending on an embodiment.
  • FIG. 12 is a diagram for explaining a method of outputting a command menu in virtual reality depending on an embodiment.
  • FIG. 13 is a block diagram showing the overall configuration of an apparatus for outputting a command menu depending on an embodiment.
  • the terms first and second are used herein merely to describe a variety of constituent elements, and only for the purpose of distinguishing one constituent element from another constituent element. For example, a first element may be termed a second element, and a second element may be termed a first element.
  • a processor may wake up a device in a power saving mode or a power off state, or may call a command menu for executing a command in a device that is already turned on.
  • the user touch input may include, but is not limited to, a case in which a device detects touch of any part of the user body and may include a case in which the device senses the part of the user body through an input device.
  • the input device may be a mouse or a sensor installed in a display, and in response to detection of user touch input, the mouse or the sensor may transmit an electrical signal to a processor.
  • FIG. 1 is a flowchart of a method of outputting a command menu on a large screen depending on an embodiment.
  • a processor may output an upper graphic object indicating an upper layer menu among layered command menus at a touch point from which the touch input is detected.
  • the layer menu may be a combination of at least two layers.
  • the upper layer menu and the lower layer menu may be relatively determined depending on a layer stage.
  • the uppermost layer menu to the lowermost layer menu may sequentially be a first layer menu, a second layer menu, and a third layer menu.
  • the first layer menu may be a higher layer than the second layer menu, and thus, between the first layer menu and the second layer menu, the first layer menu may be an upper layer menu, and the second layer menu may be a lower layer menu.
  • the second layer menu may be a higher layer than the third layer menu, and thus, between the second layer menu and the third layer menu, the second layer menu may be an upper layer menu, and the third layer menu may be a lower layer menu.
  • the processor may detect a point to which a touch point is moved in response to touch movement input of moving the touch point while the user touch input is maintained.
  • the touch movement input may indicate an input of moving a point from which the touch input is detected.
  • the touch movement input may be an input of dragging the touch point while the touch input is maintained.
  • the processor may select a target upper item corresponding to the moved touch point among candidate upper items of the upper layer menu.
  • the layer menu may include a plurality of candidate items, and the target item may be an item selected by the touch movement input among the candidate items.
  • the processor may select the corresponding item as the target item.
  • the processor may determine that the touch point has entered the item graphic object.
  • the upper layer menu may include at least one upper item, and a lower layer menu may be mapped to each of at least one upper item.
  • the lower layer menu may include at least one lower item, and a next lower layer menu may be mapped to each of at least one lower item.
  • Although FIGS. 2, 3, 5, and 6 show the case in which the number of items included in each upper layer menu and each lower layer menu is 8, the embodiments are not limited thereto and the number may be changed depending on a design.
  • the processor may output the lower graphic object indicating the lower layer menu corresponding to the selected upper item while extending the lower graphic object from the upper graphic object indicating the upper layer menu.
  • the processor may output a lower graphic object in the form of surrounding an outer boundary of the upper graphic object in an outward direction from the upper graphic object.
  • the processor may output a lower graphic object in the form of covering a part of the outer boundary of the upper graphic object.
  • the processor may extend the lower graphic object in a direction toward the touch movement input.
  • the touch movement input may be directed in one direction consistently.
  • the processor may detect a drop input of releasing the touch point from the lower layer menu.
  • the drop input may be an input of releasing the touch point and may indicate an input of terminating the touch input at an arbitrary point.
  • the processor may execute an operation corresponding to the target lower item.
  • the target lower item may be an item selected among lower candidate items depending on the touch movement input.
  • An operation corresponding to the lower item may be pre-stored in a memory, and for example, various operations such as application execution, preview display, and execution of a function of a device may be allocated to respective lower items.
  • the memory in which the operation is pre-stored may be included in a device including the processor, but is not limited thereto, and the memory may be included in an external cloud device and the processor may communicate with the external cloud device to receive an operation. For example, when “Execute Application A” is allocated to the selected target lower item, the processor may execute application A by loading an operating process related to “Execute Application A” from the memory.
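  • As a rough, non-authoritative sketch of the flow described above (touch down to show the upper menu, drag to select an upper item, extend the lower menu, drop to execute a pre-stored operation), the Python snippet below models the traversal; the class, method, and item names are illustrative assumptions, not taken from the patent.

```python
# Minimal sketch (assumed names) of the drag-and-drop command menu flow.
class CommandMenu:
    def __init__(self, layers, actions):
        # layers: layer menus keyed by parent item, e.g. {"root": ["A", "B"], "A": ["A1", "A2"]}
        # actions: pre-stored operations per item (cf. the memory described above)
        self.layers = layers
        self.actions = actions
        self.path = []        # items selected so far, from upper to lower layers
        self.active = False

    def on_touch_down(self, point):
        self.active = True
        self.path = []
        print("output upper graphic object at", point)

    def on_touch_move(self, item_under_touch):
        # item_under_touch: candidate item the moved touch point falls on, or None
        if not self.active or item_under_touch is None:
            return
        current_menu = self.path[-1] if self.path else "root"
        if item_under_touch in self.layers.get(current_menu, []):
            self.path.append(item_under_touch)
            if item_under_touch in self.layers:   # the item has a lower layer menu
                print("extend lower graphic object for", item_under_touch)

    def on_drop(self):
        # drop input: touch released; execute the operation of the last selected item
        self.active = False
        if self.path and self.path[-1] in self.actions:
            self.actions[self.path[-1]]()

menu = CommandMenu(
    layers={"root": ["A", "B"], "A": ["A1", "A2"]},
    actions={"A1": lambda: print("Execute Application A")},
)
menu.on_touch_down((120, 240))
menu.on_touch_move("A")    # target upper item
menu.on_touch_move("A1")   # target lower item
menu.on_drop()             # prints "Execute Application A"
```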
  • Although FIG. 1 shows only the graphic object including the upper layer menu and the lower layer menu, the embodiments are not limited thereto, and thus a method of outputting a command menu may also include performing a command operation by outputting a graphic object indicating layer menus of three or more stages.
  • a command menu outputting apparatus may provide a command menu based on authority while authenticating a user.
  • the processor may acquire biometric information of the user in response to the user touch input.
  • the biometric information of the user may be data related to a fingerprint of the user.
  • the data related to the fingerprint of the user may include a pattern of the fingerprint and an interval between curves of the fingerprint.
  • the data related to the fingerprint of the user may include a rate of change in the interval between the curves of the fingerprint over time.
  • the processor may determine that the finger of the user is not moved.
  • the processor may determine that the touch input is unstable and may skip a matching operation between the biometric information of the user to be described below and registered user information.
  • the processor may match the biometric information of the user and the registered user information.
  • the processor may match biometric information from a registered database and the registered user information.
  • the registered database may be stored in a memory associated with the processor, and the registered database may include user information (e.g., information on a registered fingerprint of each user) on a plurality of users.
  • the processor may calculate matching similarity between a plurality of pieces of registered user information and the user biometric information.
  • the processor may determine that the user biometric information matches the corresponding registered user information in response to the case in which the matching similarity calculated for arbitrary registered user information exceeds critical similarity.
  • the processor may determine that matching is successful when there is information that matches the user biometric information among a plurality of pieces of registered user information.
  • the processor may grant access authority for at least one of an application, a device, and a menu.
  • the access authority for at least one of an application, a device, and a menu may be individually set differently for each user.
  • the registered user information may include information on the access authority granted to the registered user, and for example, information on the application, the device, and the menu that the corresponding registered user accesses may be stored.
  • the processor may identify the application, the device, and the menu, which are permitted to the matched user, while loading the matched registered user information from the memory.
  • for example, when the matched registered user is a first user, the processor may load the stored information on the application, the device, and the menu that the first user is capable of accessing. If the first user is a minor, the application, the device, and the menu that the first user is capable of accessing may be limited.
  • the processor may output at least one of the upper graphic object or the lower graphic object based on the granted access authority. For example, the processor may output a graphic object of a layer menu (e.g., an upper layer menu or a lower layer menu) including items indicating an operation that an arbitrary user is capable of accessing. In response to the case in which a non-authorized user attempts access, the processor may also output a layer menu in a guest mode.
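  • The matching and authority-granting steps might be realized roughly as in the sketch below; the similarity function, the threshold value, and the sample user records are assumptions for illustration rather than the patent's actual algorithm.

```python
# Illustrative sketch of matching touch-derived biometric data against a
# registered database and granting access authority (names and values assumed).
CRITICAL_SIMILARITY = 0.8   # assumed value of the "critical similarity" threshold

registered_db = [
    {"user": "first_user", "fingerprint": [0.1, 0.4, 0.7],
     "authority": {"apps": ["gallery"], "menus": ["home"]}},
    {"user": "second_user", "fingerprint": [0.9, 0.2, 0.5],
     "authority": {"apps": ["gallery", "bank"], "menus": ["home", "settings"]}},
]

def similarity(a, b):
    # toy similarity: 1 minus the mean absolute difference of feature vectors
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def grant_authority(biometric):
    best = max(registered_db, key=lambda rec: similarity(biometric, rec["fingerprint"]))
    if similarity(biometric, best["fingerprint"]) > CRITICAL_SIMILARITY:
        return best["authority"]                 # matched: menus permitted to this user
    return {"apps": [], "menus": ["guest"]}      # no match: guest-mode layer menu

print(grant_authority([0.12, 0.38, 0.72]))       # close to first_user -> limited authority
```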
  • FIG. 2 is a diagram showing an upper graphic object for explaining a method of outputting a command menu on a large screen depending on an embodiment.
  • the method of outputting a command menu depending on an embodiment may be performed by an apparatus 200 including a large-screen display.
  • the apparatus 200 including a large-screen display may have a relatively wide area for outputting a graphic object related to command menus.
  • the apparatus 200 including the large-screen display may be embodied in various forms of products such as a television (TV), a personal computer, a laptop computer, an intelligent vehicle, or a kiosk.
  • a graphic object indicating a layer menu may be output in response to user touch input detected by a display of a TV. After the graphic object is called, a graphic object indicating lower layer menus may be output toward a margin space of a large-screen display in response to touch movement input.
  • the apparatus 200 including a large-screen display is not limited only to the aforementioned embodiment, and may include an apparatus including a display that is difficult to grip with one hand.
  • the processor of the apparatus 200 including a large-screen display may output an upper graphic object 210 indicating the upper layer menu among layered command menus based on a touch point 220 from which the touch input is detected.
  • the touch point 220 from which the touch input is detected may be output to a display.
  • FIG. 2 shows the touch point 220 for explanation, but depending on another embodiment, the touch point 220 may not be output.
  • the processor may detect a point to which the touch point is moved in response to touch movement input 230 from the detected touch point 220 .
  • the point to which the touch point is moved may be positioned at one item of the upper graphic object.
  • the processor may determine that a corresponding item is selected. Thus, the processor may select a target upper item 211 corresponding to the moved point among candidate upper items of the upper layer menu. For example, the processor may select the target upper item 211 depending on whether a touch region occupies a critical ratio or greater of a graphic object corresponding to the target upper item 211 . For example, in response to the case in which the touch region detected by the display occupies 50% or greater of a graphic object indicated by the target upper item 211 , the processor may select the target upper item 211 .
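  • One assumed way to implement the critical-ratio test (the touch region covering, e.g., 50% or more of an item's graphic object) is to sample the item's area and measure how much of it the touch contact covers; the annular-sector geometry and sampling approach below are illustrative assumptions.

```python
# Illustrative hit-test: fraction of an annular-sector item covered by a
# circular touch region, compared against the critical ratio (values assumed).
import math

def coverage_of_item(touch_center, touch_radius, menu_center,
                     r_inner, r_outer, angle_start, angle_end, samples=120):
    in_item = covered = 0
    for i in range(samples):
        for j in range(samples):
            # sample a grid over the bounding box of the menu's outer circle
            x = menu_center[0] + (2 * i / (samples - 1) - 1) * r_outer
            y = menu_center[1] + (2 * j / (samples - 1) - 1) * r_outer
            r = math.hypot(x - menu_center[0], y - menu_center[1])
            theta = math.atan2(y - menu_center[1], x - menu_center[0]) % (2 * math.pi)
            if not (r_inner <= r <= r_outer and angle_start <= theta <= angle_end):
                continue
            in_item += 1
            if (x - touch_center[0]) ** 2 + (y - touch_center[1]) ** 2 <= touch_radius ** 2:
                covered += 1
    return covered / in_item if in_item else 0.0

CRITICAL_RATIO = 0.5
ratio = coverage_of_item((55, 15), 30, (0, 0), 40, 80, 0, math.pi / 4)
print("item selected" if ratio >= CRITICAL_RATIO else "not selected", round(ratio, 2))
```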
  • FIG. 3 is a diagram showing a lower graphic object for explaining a method of outputting a command menu on a large screen depending on an embodiment.
  • a processor of an apparatus 300 may output a lower graphic object 320 indicating a lower layer menu corresponding to a selected upper item while extending the lower graphic object 320 from an upper graphic object 310 .
  • the upper graphic object 310 may be shaped like a circle with an empty inside, and for example, may be shaped like a donut.
  • the lower graphic object 320 may be shaped like a circle that is in contact with an outer circumference of the upper graphic object 310 and accommodates the upper graphic object 310 .
  • the shapes of the upper graphic object 310 and the lower graphic object are not limited thereto, and in another example, the lower graphic object may be shaped like a sector, and the processor may output the lower graphic object shaped like a sector in a direction extending from the upper graphic object 310 based on a target upper item 311 .
  • the processor may detect a point to which the touch point is moved in response to detection of touch movement input 330 .
  • the processor may select the target lower item 321 .
  • the processor may visualize information on the target lower item 321 semi-transparently, and may overlay and output a graphic object for the lower layer menu on the information on the target lower item 321 .
  • the information on the target lower item 321 may be preview information (e.g., a preview image) related to the target lower item 321 .
  • the information on the target lower item 321 is not limited thereto, and may include various pieces of information that a user is capable of referring to for performing an operation related to the target lower item 321 . For example, when the target lower item 321 is “missed call”, the information on the target lower item 321 may be a missed call list.
  • the processor may output a graphic object indicating a lower layer item of the target lower item 321 in a direction extending from the lower graphic object 320 .
  • the processor may detect a drop input of releasing a touch point from a point corresponding to one target lower item 321 among candidate lower items 322 of the lower layer menu.
  • the processor may execute an operation corresponding to the target lower item 321 in response to the case in which the drop input is detected.
  • a pop-up operation may be an operation of visualizing and extending graphic expression corresponding to the lower layer menu to an entire screen starting from the target lower item 321 .
  • the processor may output a graphic object for requesting user approval as to whether to execute the operation.
  • the graphic object for requesting user approval may include a graphic object (e.g., a message window) for asking the user whether to execute the operation.
  • a graphic object for allowing the user to select to execute the operation may be output.
  • the processor may determine whether to execute the corresponding operation.
  • the processor may execute the corresponding operation in response to the case in which approval manipulation (e.g., activation of an approval button) for execution of the operation is received from the user.
  • the processor may exclude execution of the corresponding operation in response to receiving rejection manipulation (e.g., activation of a reject button) for execution of the operation from the user.
  • the processor may execute an operation corresponding to the target lower item.
  • the processor may not immediately execute the operation corresponding to the target lower item, but may instead wait and execute the operation only when touch input on the corresponding target lower item is detected once again; this may prevent an erroneous operation in which an operation is executed differently from the user's intention.
  • the processor may determine an item array combination of the lower graphic object 320 based on an execution history of the user and may output the determined item array combination around one item of the upper layer menu.
  • the execution history may be, but is not limited to, a frequency of execution of an operation, a frequency of selection of an item, or the like, and may be a sequence of recent execution.
  • the processor may output lower candidate items around the target upper item in an order from a lower candidate item with the highest priority depending on the execution history. For example, with respect to the target upper item ‘A’, the processor may position the most selected or executed lower candidate item ‘A1’ around the target upper item ‘A’.
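  • The history-based arrangement could be as simple as sorting the candidate lower items by how often each has been selected or executed, as in this assumed sketch.

```python
# Assumed sketch: order lower candidate items around the target upper item
# by how often each has been executed (most frequent placed first/closest).
from collections import Counter

execution_history = ["A1", "A3", "A1", "A2", "A1", "A3"]   # hypothetical usage log
candidates = ["A1", "A2", "A3", "A4"]

counts = Counter(execution_history)
ordered = sorted(candidates, key=lambda item: counts[item], reverse=True)
print(ordered)   # ['A1', 'A3', 'A2', 'A4'] -> 'A1' placed closest to upper item 'A'
```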
  • FIG. 4 is a diagram showing a graphic object indicating some items of a command menu on a large screen depending on an embodiment.
  • the processor may output a graphic object 410 indicating some items 430 among the candidate lower items of the lower layer menu on a display 400. Then, in response to user input 420 distinct from touch input, the processor may expose at least some of the remaining items and may exclude output of at least some of the items 430.
  • the processor may output only the items 430 of the candidate lower items on the graphic object 410.
  • the graphic object 410 may be a graphic object formed by listing candidate lower items.
  • the user input 420 may be scroll input, and the scroll input may be touch movement input of moving a touch point in a direction in which candidate lower items are listed (e.g., a vertical direction in FIG. 4 ).
  • the processor may expose some of the remaining items other than the items 430 and, in response thereto, may exclude output of some of the items 430.
  • the processor may expose only the items 430 among the candidate lower items related to a phone number, may output the items 430 on the graphic object 410, and may exclude output of the remaining items.
  • the processor may expose some of the remaining items.
  • the processor may exclude output of some of the items 430 that have been exposed on the graphic object 410 as an additional portion of the graphic object 410 is exposed.
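  • Exposing only part of a long candidate list and shifting the visible window in response to scroll input might look like the following sketch; the window size and item names are assumptions.

```python
# Assumed sketch: keep a fixed-size visible window over the candidate lower
# items and shift it when a scroll (touch movement) input is detected.
items = [f"contact_{i}" for i in range(20)]   # hypothetical phone-number items
WINDOW = 5                                    # items visible on the graphic object
offset = 0

def on_scroll(steps):
    """Positive steps scroll down: newly exposed items replace hidden ones."""
    global offset
    offset = max(0, min(len(items) - WINDOW, offset + steps))
    return items[offset:offset + WINDOW]

print(on_scroll(0))   # the initially displayed items
print(on_scroll(3))   # exposes 3 remaining items, excludes 3 previously shown
```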
  • FIG. 5 is a diagram showing rotation of a graphic object depending on an embodiment.
  • the processor may detect external touch inputs 521 and 522 of at least one point of a region outside a graphic object for a layer menu.
  • the processor may detect a movement trajectory 530 of the external touch input.
  • While the touch input 520 is maintained in a region 510 inside the graphic object, the processor may rotate the graphic object based on the movement trajectory 530.
  • the region outside the graphic object may be a remaining region except for a portion occupied by the graphic object in the display.
  • the movement trajectory 530 of the external touch input may be shaped like, for example, a curve, and the processor may determine whether the movement trajectory 530 is clockwise or counterclockwise. For example, when the movement trajectory 530 is detected to be clockwise, the processor may rotate the graphic object clockwise. As the processor rotates the graphic object, a direction in which the lower layer menu is output may be adjusted. Thus, the user may position an item hidden by his or her finger into his or her field of view by rotating the graphic object. In addition, a margin of the display may be effectively utilized.
  • the processor may gradually rearrange the graphic objects corresponding to the items clockwise or counterclockwise based on a reference point (e.g., the center of a circle).
  • the processor may rearrange only the position of each graphic object while maintaining the shape of each graphic object, rather than rotating it.
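  • Whether the external trajectory sweeps clockwise or counterclockwise can be decided, for example, from the sign of its signed area about the menu center; the sketch below is an assumed implementation (note that the sign convention flips for screen coordinates with a downward y axis).

```python
# Assumed sketch: decide rotation direction from an external touch trajectory
# and rearrange item angles around the menu center accordingly.
import math

def signed_area(points, center):
    # positive -> counterclockwise sweep around the center (math coordinates)
    total = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:]):
        total += (x1 - center[0]) * (y2 - center[1]) - (x2 - center[0]) * (y1 - center[1])
    return total

def rotate_items(item_angles, trajectory, center, step=math.pi / 8):
    direction = 1 if signed_area(trajectory, center) > 0 else -1
    return {item: (angle + direction * step) % (2 * math.pi)
            for item, angle in item_angles.items()}

angles = {"A": 0.0, "B": math.pi / 2, "C": math.pi}
trajectory = [(100, 0), (90, 40), (70, 70)]   # curved external drag
print(rotate_items(angles, trajectory, center=(0, 0)))
```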
  • FIG. 6 is a diagram showing movement of a graphic object depending on an embodiment.
  • the processor may detect an external touch input 621 of at least one point of a region outside the graphic object for the layer menu.
  • the processor may detect a movement trajectory of the external touch input 621 .
  • While the touch input 620 is maintained in a region 610 inside the graphic object, the processor may move the graphic object based on a movement trajectory 630.
  • the processor may move at least a portion of the graphic object in identified directions 630 and 631 based on the external touch input 621 .
  • the processor may detect touch movement input direction of the external touch input 621 .
  • The portion of the graphic object that is moved may be, but is not limited to, the graphic objects other than the graphic object for the uppermost layer menu; alternatively, the graphic object may be moved in its entirety.
  • the processor may move the graphic object and may output the added lower layer menu depending on user input.
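  • Translating the layer-menu graphic objects by the displacement of an external drag, optionally keeping the uppermost layer fixed as mentioned above, could be sketched as follows (names and values assumed).

```python
# Assumed sketch: translate the graphic objects of the layer menus by the
# displacement of an external touch drag, optionally keeping the top layer fixed.
def move_menu(layer_positions, drag_start, drag_end, keep_top_layer=True):
    dx, dy = drag_end[0] - drag_start[0], drag_end[1] - drag_start[1]
    moved = {}
    for layer, (x, y) in layer_positions.items():
        if keep_top_layer and layer == "upper":
            moved[layer] = (x, y)            # uppermost layer menu stays in place
        else:
            moved[layer] = (x + dx, y + dy)  # lower layers follow the drag
    return moved

positions = {"upper": (100, 100), "lower": (100, 100), "lowest": (100, 100)}
print(move_menu(positions, drag_start=(300, 200), drag_end=(340, 260)))
```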
  • FIG. 7 is a flowchart of a method of outputting a command menu on a small screen depending on an embodiment.
  • Operations 700 to 720 of outputting a graphic object at a reference point at which touch input is detected and then selecting a target upper item are the same as the above description of operations 100 to 120 of FIG. 1 , and thus a detailed description thereof is omitted.
  • the processor may detect a touch return movement input in which the touch point returns to the reference point after the target upper item is selected.
  • the reference point may be a point corresponding to a touch point at which a display touch input of a user is generated.
  • the processor may detect a touch return movement input of returning from the touch point detected in response to the touch movement input to the reference point after the target upper item is selected.
  • the processor may replace a graphic object indicating an upper layer menu with a graphic object indicating a lower layer menu corresponding to the target upper item and may output the same.
  • Since the upper layer menu is replaced with the lower layer menu, the portion of the display occupied by the graphic object may not be increased, and thus it may be easier to output a command menu on a small screen with a small display margin compared with a large screen.
  • a small-screen apparatus is often held with one hand, and thus movement of the touch movement input may be shorter than in the case in which a command menu is output on a large screen.
  • Because the graphic object indicating the lower layer menu is output so as to replace the previous graphic object, the command menus from the uppermost layer menu to the lowermost layer menu may be output with relatively short touch movement input.
  • the uppermost layer menu and the lowermost layer menu may correspond to layer menus of uppermost and lowermost stages depending on an embodiment.
  • Operations 750 and 760 are the same as the above description of operations 140 and 150 of FIG. 1 , and thus a detailed description thereof is omitted.
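  • A plain sketch of the small-screen variant, in which selecting an item and returning the touch point to the reference point replaces the displayed layer menu in place, might look like this; the class and item names are assumptions.

```python
# Assumed sketch of the small-screen variant: a touch return movement input
# swaps the currently displayed layer menu for the selected item's lower menu.
class SmallScreenMenu:
    def __init__(self, layers):
        self.layers = layers          # {"root": [...], "A": [...], ...}
        self.current = "root"
        self.pending = None           # item selected but not yet descended into

    def on_select(self, item):
        if item in self.layers.get(self.current, []):
            self.pending = item

    def on_return_to_reference(self):
        # touch return movement input: replace the upper menu with the lower one
        if self.pending in self.layers:
            self.current = self.pending
            self.pending = None
            print("display replaced with lower layer menu:", self.current)

    def on_drop(self, actions):
        if self.pending in actions:
            actions[self.pending]()

menu = SmallScreenMenu({"root": ["A", "B"], "A": ["A1", "A2"]})
menu.on_select("A")
menu.on_return_to_reference()        # upper menu replaced by A's lower menu
menu.on_select("A1")
menu.on_drop({"A1": lambda: print("Execute Application A1")})
```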
  • FIG. 8 is a diagram showing an upper graphic object for explaining a method of outputting a command menu on a small screen depending on an embodiment.
  • the method of outputting a command menu depending on an embodiment may be performed by an apparatus 800 including a small-screen display.
  • the apparatus 800 including a small-screen display may have a relatively small area for outputting a graphic object related to command menus compared with a large-screen display.
  • the apparatus 800 including a small-screen display may be embodied in various forms of products such as a smartphone, a tablet PC, a smart electronic device, an intelligent vehicle, or a wearable device.
  • a graphic object indicating a layer menu may be output.
  • a graphic object indicating lower layer menus may replace a graphic object indicating upper layer menus and may be output.
  • the apparatus 800 including a small-screen display is not limited only to the above embodiments, and may be an apparatus including a display to be gripped by one hand.
  • a processor of the apparatus 800 including a small-screen display may output an upper graphic object 810 indicating an upper layer menu among layered command menus based on a reference point 820 from which touch input is detected.
  • the reference point 820 from which the touch input is detected may be output on a display.
  • FIG. 8 is a diagram for explaining the reference point 820 , but depending on another embodiment, the reference point 820 may not be output on the display.
  • the processor may detect a point to which the touch point is moved in response to touch movement input 830 from the detected reference point 820 .
  • the point to which the touch point is moved in response to the touch movement input 830 may be positioned on one item of the upper graphic object.
  • the processor may determine that the positioned item is selected.
  • the processor may select a target upper item 811 corresponding to the selected point among candidate upper items of the upper layer menu. Selection of the target upper item may be determined depending on whether the touch point occupies a critical ratio or greater of a graphic object indicated by the target upper item. For example, in response to the case in which the touch point detected by the processor corresponds to 50% or greater of the graphic object indicated by the target upper item, the processor may determine that the upper item is selected.
  • the processor may detect the touch return movement input 831 in which the touch point returns to the reference point 820 .
  • the touch return movement input 831 may be an input in a direction corresponding to the touch movement input.
  • for example, a trajectory of the touch return movement input may be formed in a direction opposite to the movement direction of the touch movement input, but is not limited thereto, and any trajectory that moves the touch point from the target upper item back to the reference point may be formed.
  • a graphic object related to a menu depending on an embodiment may be shaped like a sector, but may also be shaped like a circle that radiates from the reference point.
  • the shape of the graphic object is not limited thereto, and the graphic object may be configured for a user to select a menu item.
  • FIG. 9 is a diagram showing a lower graphic object for explaining a method of outputting a command menu on a small screen depending on an embodiment.
  • a processor of an apparatus 900 may replace a graphic object indicating an upper layer menu with a graphic object indicating a lower layer menu corresponding to the target upper item and may output the same.
  • the processor may detect a drop input of releasing a touch point from a point corresponding to one target lower item 911 among candidate lower items of the lower layer menu. In response to the case in which the drop input is detected, the processor may execute an operation corresponding to the target lower item 911 .
  • the processor may calculate a moving acceleration of the touch point moving from the target lower item 911 toward the external point.
  • based on the calculated acceleration, the processor may execute an operation corresponding to the target lower item 911.
  • execution of an operation depending on touch movement input with an acceleration may prevent the operation from being executed differently from user intention.
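  • The moving acceleration can be estimated from a few timestamped touch samples; the formula and threshold below are illustrative assumptions.

```python
# Assumed sketch: estimate the acceleration of the touch point from timestamped
# samples and execute the item's operation only if it exceeds a threshold.
def estimate_acceleration(samples):
    """samples: [(t, x, y), ...] with at least three entries."""
    (t0, x0, y0), (t1, x1, y1), (t2, x2, y2) = samples[-3:]
    v1 = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / (t1 - t0)
    v2 = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5 / (t2 - t1)
    return (v2 - v1) / (t2 - t1)

ACCEL_THRESHOLD = 500.0          # px/s^2, assumed
samples = [(0.00, 10, 10), (0.02, 12, 10), (0.04, 20, 10)]
if estimate_acceleration(samples) > ACCEL_THRESHOLD:
    print("execute operation of target lower item")   # flick toward an external point
else:
    print("ignore: movement too slow to be intentional")
```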
  • Before executing an operation corresponding to the target lower item 911, the processor may output a graphic object for requesting user approval as to whether to execute the operation.
  • the processor may execute an operation corresponding to the target lower item.
  • the processor may detect the touch return movement input 931 of selecting the target lower item 911 and then moving the touch point to return to a reference point 920 depending on touch movement input 930 .
  • the processor may replace the lower graphic object 910 with a graphic object indicating an additional lower layer menu corresponding to the target lower item 911 and may output the same.
  • FIG. 10 is a diagram showing a graphic object indicating some items of a command menu on a small screen depending on an embodiment.
  • a processor of an apparatus 1000 may output a graphic object 1010 indicating some items 1020 among the candidate lower items of the lower layer menu on a display. Then, in response to user input distinct from touch input, the processor may expose at least some of the remaining items and may exclude output of at least some of the items 1020.
  • the processor may output only the items 1020 of the candidate lower items on the graphic object 1010.
  • the graphic object 1010 may be a graphic object formed by listing candidate lower items.
  • the user input may be a two-touch round-trip input 1030 corresponding to a desired scrolling direction, distinct from the touch input and the touch return movement input.
  • the touch round-trip input may include touch movement input and a touch return movement input in a direction corresponding to a touch movement direction.
  • One touch round-trip input and the two-touch round-trip input 1030 may be distinguished depending on the number of round trip inputs detected for a predetermined time.
  • the processor may expose some of the remaining items other than the items 1020 and, in response thereto, may exclude output of some of the items 1020.
  • the processor may list only the items 1020 among the candidate lower items related to a phone number and may output the list on the graphic object 1010.
  • the processor may expose an item, among the remaining items, at an upper end of the graphic object 1010.
  • output of some of the items 1020 pushed out of the lower end of the graphic object 1010 may be excluded.
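  • Counting the round trips completed within a predetermined time is one assumed way to distinguish a single touch round-trip input from the two-touch round-trip input that triggers scrolling.

```python
# Assumed sketch: count touch round trips (out-and-back strokes) completed
# within a predetermined time to distinguish single from double round-trip input.
PREDETERMINED_TIME = 0.6   # seconds, assumed

def count_round_trips(events, now):
    """events: [(timestamp, 'out' | 'back'), ...] in chronological order."""
    recent = [e for e in events if now - e[0] <= PREDETERMINED_TIME]
    trips, expecting = 0, "out"
    for _, phase in recent:
        if phase == expecting:
            if phase == "back":
                trips += 1
            expecting = "back" if expecting == "out" else "out"
    return trips

events = [(0.10, "out"), (0.22, "back"), (0.35, "out"), (0.47, "back")]
trips = count_round_trips(events, now=0.5)
print("scroll" if trips >= 2 else "menu navigation", trips)
```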
  • FIG. 11 is a diagram for explaining a method of outputting a command menu when a touch detector and a display are separated from each other depending on an embodiment.
  • a manipulation detector for detecting user manipulation may be physically separated from a display 1100 , and an electrical signal detected by the touch detector may be transferred to a processor through a communication unit between the manipulation detector and the display 1100 .
  • the manipulation detector includes, for example, a touch detector 1120 , but is not limited thereto.
  • a processor may output a manipulation indicator on a part of the display 1100 , indicating a touch point from which touch input 1130 is detected.
  • the manipulation indicator may be a graphic object displayed on the display 1100 to correspond to a point from which the touch input is detected.
  • An operation of outputting a graphic object 1110 related to a layered menu to correspond to the touch point and executing an operation corresponding to the target lower item among lower menus is the same as the above description of FIGS. 1 to 10 , and thus a detailed description thereof is omitted.
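  • When the touch detector and the display are separate, the detector's coordinates have to be mapped onto display coordinates to place the manipulation indicator; a simple proportional mapping such as the one below is an assumption for illustration.

```python
# Assumed sketch: map a touch point reported by a separate touch detector onto
# display coordinates so a manipulation indicator can be drawn there.
DETECTOR_SIZE = (1200, 800)     # touch detector resolution, assumed
DISPLAY_SIZE = (3840, 2160)     # display resolution, assumed

def to_display(detector_point):
    x, y = detector_point
    return (x * DISPLAY_SIZE[0] / DETECTOR_SIZE[0],
            y * DISPLAY_SIZE[1] / DETECTOR_SIZE[1])

indicator = to_display((600, 400))
print("draw manipulation indicator at", indicator)   # (1920.0, 1080.0)
```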
  • FIG. 12 is a diagram for explaining a method of outputting a command menu in virtual reality depending on an embodiment.
  • a manipulation detector for detecting user input and a display may be physically separated from each other, and an electrical signal detected by the manipulation detector may be transferred to a processor through a communication unit between the manipulation detector and the display.
  • the manipulation detector may be a sensor for detecting movement of the body part of the user and may include, for example, a sensor for detecting finger joint movement.
  • the manipulation detector may include a sensor implemented in the form of a glove for sensing bending and unfolding of the user knuckles.
  • the sensor for sensing bending and unfolding of the user knuckles may be positioned in a portion corresponding to the user knuckle of the glove.
  • the sensor may include a piezoelectric sensor, and in response to a piezoelectric signal generated when the finger is bent, the sensor may detect finger joint movement.
  • the embodiments are not limited thereto, and the sensor may include a pressure sensor, and in response to the case in which a pressure generated by bending the user finger is sensed, the sensor may detect whether the finger is bent.
  • the movement of the body part is not limited to detection of the movement of the finger joint and may include blinking of an eye, movement of legs and arms, and a joint motion of the body part.
  • the manipulation detector may be a sensor for detecting movement of the body part of the user and may include an image sensor for detecting hand movement. Sensing data of the image sensor including a camera may be transmitted to the processor, and the camera may photograph the user hand.
  • the above embodiments are merely examples of detecting movement of the user's hand; methods of detecting whether the user's finger is unfolded are not limited thereto and may use any method known to one of ordinary skill in the art for detecting movement of the user's hand, such as a wearable device that measures an angle or a distance between joints, or electrical resistance.
  • the display may be a virtual display implemented through virtual reality, and virtual reality may be implemented by connecting a virtual reality device 1220 in contact with the user face to a processor.
  • the virtual reality device 1220 may limit a user view, and only the display of virtual reality may be implemented by the processor.
  • the virtual reality device 1220 may provide, for example, a right eye image to a right eye of the user and may provide a left eye image to a left eye of the user, and the right eye image and the left eye image may have disparity with each other.
  • the virtual reality device 1220 may provide the aforementioned left eye image and the right eye image to the user, and thus may visualize and provide three-dimensional content to the user.
  • the virtual reality device 1220 may not limit the user view, and the user may execute an operation related to a command item while viewing a screen on which virtual reality overlaps reality.
  • the processor may detect a state 1210 in which the user finger is unfolded and may output a graphic object 1230 indicating an upper layer menu among layered command menus with a manipulation indicator on a portion indicating a touch point corresponding to one end of the finger on the display of virtual reality.
  • the processor may detect touch movement input of moving a touch point corresponding to one end of the finger and may detect a point to which the manipulation indicator is moved as the touch point is moved while the state 1210 in which the user finger is unfolded is maintained. In response to detection of the point to which the manipulation indicator is moved, the processor may select a target upper item of the upper layer menu and may output a lower graphic object indicating a lower layer menu corresponding to the target upper item. The processor may detect a state 1200 in which the user finger is unfolded from a point corresponding to one target lower item among candidate lower items of the lower layer menu and may execute an operation corresponding to the target lower item.
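  • In the virtual-reality case, finger unfolding and bending stand in for touch and release; the adapter below is an assumed sketch that converts a bend-sensor reading into the same touch-style events used above (which state maps to touch versus release, and the threshold value, are assumptions).

```python
# Assumed sketch: translate a finger-extension signal (e.g., from a glove's
# piezoelectric or pressure sensor) into the touch-style events of the menu.
BEND_THRESHOLD = 0.5    # normalized sensor value above which the finger counts as bent

class FingerToTouchAdapter:
    def __init__(self, on_touch_down, on_touch_up):
        self.on_touch_down = on_touch_down
        self.on_touch_up = on_touch_up
        self.unfolded = False

    def on_sensor_sample(self, bend_value, fingertip_point):
        unfolded = bend_value < BEND_THRESHOLD
        if unfolded and not self.unfolded:
            self.on_touch_down(fingertip_point)    # finger unfolded: start interaction
        elif not unfolded and self.unfolded:
            self.on_touch_up(fingertip_point)      # finger bent: treated like a release
        self.unfolded = unfolded

adapter = FingerToTouchAdapter(
    on_touch_down=lambda p: print("output upper layer menu at", p),
    on_touch_up=lambda p: print("drop detected at", p),
)
adapter.on_sensor_sample(0.1, (0.2, 0.3))   # unfolded -> menu appears
adapter.on_sensor_sample(0.9, (0.4, 0.3))   # bent -> drop / execute
```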
  • FIG. 13 is a block diagram showing the overall configuration of an apparatus 1300 for outputting a command menu depending on an embodiment.
  • the apparatus 1300 for outputting a command menu may include a processor 1310 , a display 1320 , and a touch detector 1330 .
  • the processor 1310 may receive an electrical signal converted from touch input detected by the touch detector 1330 .
  • the processor 1310 may search for a point on the display 1320 , corresponding to a touch point from which the touch input is detected, based on the received electrical signal.
  • the processor 1310 may output a graphic object at a corresponding point on the display 1320 and may then perform a series of processes for executing an operation.
  • the process for executing the operation after the graphic object is output is the same as the above description of FIGS. 1 to 12 , and thus a detailed description thereof is omitted.
  • Although FIG. 13 illustrates the case in which the display 1320 and the touch detector 1330 are separate components, the embodiments are not limited thereto.
  • the display 1320 and the touch detector 1330 may be integrally implemented as a touch-sensitive display.
  • the apparatus 1300 for outputting a command menu may further include a memory of a registered database for storing registered user information that matches user biometric information.
  • the processor 1310 may grant access authority to the user based on the registered user information stored in the memory.
  • the embodiments described above may be implemented by a hardware component, a software component, and/or a combination of a hardware component and a software component.
  • the device, the method, and the components described with regard to the embodiments may be implemented using one or more general-purpose computers or a special purpose computer, for example, a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device for executing and responding to an instruction.
  • the processing device may execute an operating system (OS) and one or more software applications executed on the OS.
  • the processing device may access, store, manipulate, process, and generate data in response to execution of the software.
  • the processing device includes a plurality of processing elements and/or a plurality of types of processing elements.
  • the processing device may include a plurality of processors or one processor and one controller.
  • the processing device may also include other processing configurations, such as a parallel processor.
  • Software may include a computer program, a code, an instruction, or a combination of one or more thereof and may configure the processing device to operate as described or may independently or collectively issue a command to the processing device.
  • the software and/or data may be permanently or temporarily embodied by any type of machine, a component, a physical device, virtual equipment, a computer storage or device, or a received signal wave in order to be interpreted by the processing device or to provide a command or data to the processing device.
  • the software may be distributed over a networked computer system and may be stored or executed in a distributed manner.
  • the software and data may be stored in one or more computer-readable recording media.
  • the methods depending on the embodiments may be recorded in a computer readable medium including program commands for executing operations implemented through various computers.
  • the computer readable medium may store program commands, data files, data structures or combinations thereof.
  • the program commands recorded in the medium may be specially designed and configured for the present invention or be known to those skilled in the field of computer software.
  • Examples of a computer readable recording medium include magnetic media such as hard disks, floppy disks, and magnetic tapes, optical media such as CD-ROMs and DVDs, magneto-optical media such as floptical disks, and hardware devices such as ROMs, RAMs, and flash memories, which are specially configured to store and execute program commands.
  • Examples of the program commands include machine language code created by a compiler and high-level language code executable by a computer using an interpreter and the like.
  • the hardware device described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

Abstract

Disclosed is a processor-executed method of outputting a hierarchical command menu on a display in response to user inputs.

Description

    FIELD OF INVENTION
  • The following description relates to technology for recognizing touch input and touch release and for outputting and executing a command menu.
  • BACKGROUND OF INVENTION
  • In an electronic device that executes commands using touch input, finding and executing a desired command involves a plurality of touch inputs and touch input releases. Accordingly, the process of outputting and executing a command menu is complicated and requires many steps to process a single command.
  • In addition, after a user wakes up an electronic device from a power saving mode using a fingerprint or facial recognition, a process of executing a command requires an additional operation of applying touch input.
  • SUMMARY OF INVENTION Technical Solution
  • Depending on an embodiment, a method of outputting a command menu includes outputting an upper graphic object indicating an upper layer menu among the command menus at a point from which touch input is detected, in response to a case in which the touch input of a user is detected, detecting a point to which the touch point is moved in response to touch movement input of moving the touch point while the touch input of the user is maintained, selecting a target upper item corresponding to the moved point among candidate upper items of the upper layer menu, outputting a lower graphic object indicating a lower layer menu corresponding to the selected target upper item while extending the lower graphic object from the upper graphic object, detecting a drop input of releasing the touch point from the lower layer menu, and executing an operation corresponding to the target lower item in response to a case in which the drop input is detected from a point corresponding to one target lower item among candidate lower items of the lower layer menu.
  • Depending on an embodiment, the executing of the operation corresponding to the target lower item may include outputting a graphic object for requesting user approval as to whether to execute the operation, before the operation corresponding to the target lower item is executed.
  • Depending on an embodiment, the method may further include acquiring biometric information of the user in response to the touch input of the user, matching the biometric information and registered user information from a registered database, granting access authority for at least one of an application, a device, and a menu to the user in response to a case in which matching is successful, and outputting at least one of an upper graphic object and a lower graphic object based on the granted access authority.
  • Depending on an embodiment, the detecting of the drop input may further include selecting a target lower item corresponding to the touch point among candidate lower items of the lower layer menu, visualizing information on the target lower item semi-transparently, and overlaying and outputting a graphic object for the lower layer menu on the information on the target lower item.
  • Depending on another embodiment, the method may further include detecting an external touch input of at least one point of a region outside a graphic object for the layer menu, detecting a movement trajectory of the external touch input, and rotating the graphic object based on the movement trajectory when the touch input is maintained in a region inside a graphic object for the layer menu.
  • Depending on another embodiment, the method may further include detecting an external touch input of at least one point of a region outside a graphic object for the layer menu, detecting a movement trajectory of the external touch input, and moving at least a portion of the graphic object in a direction identified based on the external touch input.
  • In the method of outputting a command menu, when a number of items of the lower layer menu is greater than a predetermined number, a graphic object indicating some items of the candidate lower items of the lower layer menu may be output, and, in response to user input distinct from the touch input, at least some of the remaining items other than the some items may be exposed while output of at least some of the some items is excluded.
  • In the method of outputting a command menu depending on an embodiment, an item array combination of the lower graphic object may be output around one item of the upper layer menu based on an execution history of the user.
  • Depending on another embodiment, a method of outputting a layered command menu on a display includes outputting an upper graphic object indicating an upper layer menu among the command menus at a reference point from which the touch input is detected, in response to a case in which the user touch input is detected, detecting a touch point moved from the reference point in response to touch movement input of moving the touch input while the user touch input is maintained, selecting a target upper item from the upper layer menu in response to a case in which the touch point is moved to a point indicating the target upper item among candidate upper items of the upper layer menu, detecting a touch return movement input in which the touch point returns to the reference point after the target upper item is selected, and replacing the upper graphic object with a lower graphic object indicating a lower layer menu corresponding to the target upper item and outputting the replaced lower graphic object in response to a case in which the touch return movement input is detected.
  • Depending on another embodiment, a method of outputting a layered command menu on a display includes, in response to a case in which a manipulation detector detects touch input of a user, outputting a manipulation indicator at a portion of the display, indicating a touch point from which the touch input is detected, outputting an upper graphic object indicating an upper layer menu among the layered command menus on the display based on the manipulation indicator, detecting a point to which the manipulation indicator is moved, in response to touch movement input of moving the touch input while the user touch input is maintained, selecting a target upper item corresponding to the point to which the manipulation indicator is moved among candidate upper items of the upper layer menu, outputting a lower graphic object indicating a lower layer menu corresponding to the target upper item while extending the lower graphic object from the upper graphic object, detecting a drop input of releasing the touch point from a point corresponding to one target lower item among candidate lower items of the lower layer menu, and in response to a case in which the drop input is detected, executing an operation corresponding to the target lower item.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 is a flowchart of a method of outputting a command menu on a large screen depending on an embodiment.
  • FIG. 2 is a diagram showing an upper graphic object for explaining a method of outputting a command menu on a large screen depending on an embodiment.
  • FIG. 3 is a diagram showing a lower graphic object for explaining a method of outputting a command menu on a large screen depending on an embodiment.
  • FIG. 4 is a diagram showing a graphic object indicating some items of a command menu on a large screen depending on an embodiment.
  • FIG. 5 is a diagram showing rotation of a graphic object depending on an embodiment.
  • FIG. 6 is a diagram showing movement of a graphic object depending on an embodiment.
  • FIG. 7 is a flowchart of a method of outputting a command menu on a small screen depending on an embodiment.
  • FIG. 8 is a diagram showing an upper graphic object for explaining a method of outputting a command menu on a small screen depending on an embodiment.
  • FIG. 9 is a diagram showing a lower graphic object for explaining a method of outputting a command menu on a small screen depending on an embodiment.
  • FIG. 10 is a diagram showing a graphic object indicating some items of a command menu on a small screen depending on an embodiment.
  • FIG. 11 is a diagram for explaining a method of outputting a command menu when a touch detector and a display are separated from each other depending on an embodiment.
  • FIG. 12 is a diagram for explaining a method of outputting a command menu in virtual reality depending on an embodiment.
  • FIG. 13 is a block diagram showing the overall configuration of an apparatus for outputting a command menu depending on an embodiment.
  • BEST MODE
  • Particular structural and functional descriptions of embodiments are provided only for the purpose of describing particular embodiments, and the embodiments may be implemented in many different forms. Thus, the embodiments should not be construed as being limited to those set forth herein, and all changes, equivalents, and substitutes that do not depart from the technical scope are encompassed in the specification.
  • Terms such as “first” and “second” are used herein merely to describe a variety of constituent elements, but the terms are used only for the purpose of distinguishing one constituent element from another constituent element. For example, a first element may be termed a second element, and a second element may be termed a first element.
  • When an element is referred to as being “connected to” or “coupled to” another element, it may be directly connected or coupled to the other element, or intervening elements may be present.
  • The singular expressions in the present specification include plural expressions unless clearly specified otherwise in context. Also, terms such as “include” or “comprise” may be construed to denote a certain characteristic, number, step, operation, constituent element, or a combination thereof, but may not be construed to exclude the existence of or a possibility of addition of one or more other characteristics, numbers, steps, operations, constituent elements, or combinations thereof.
  • Unless otherwise defined, all terms including technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein. Hereinafter, embodiments will be described in detail with reference to the accompanying drawings. Like reference numerals in each figure indicate like elements.
  • In response to detection of user touch input, a processor depending on an embodiment may wake up a device in a power saving mode or a power off state, or may call a command menu for executing a command in a device that is already turned on. The user touch input may include, but is not limited to, a case in which a device detects touch of any part of the user body and may include a case in which the device senses the part of the user body through an input device. For example, the input device may be a mouse or a sensor installed in a display, and in response to detection of user touch input, the mouse or the sensor may transmit an electrical signal to a processor.
  • FIG. 1 is a flowchart of a method of outputting a command menu on a large screen depending on an embodiment.
  • In operation 100, in response to detection of user touch input, a processor may output an upper graphic object indicating an upper layer menu among layered command menus at a touch point from which the touch input is detected.
  • The layer menu may be a combination of at least two layers. The upper layer menu and the lower layer menu may be relatively determined depending on a layer stage. For example, when the layer menu includes three layers, the uppermost layer menu to the lowermost layer menu may sequentially be a first layer menu, a second layer menu, and a third layer menu. The first layer menu may be a higher layer than the second layer menu, and thus, between the first layer menu and the second layer menu, the first layer menu may be an upper layer menu, and the second layer menu may be a lower layer. In contrast, the second layer menu may be a higher layer than the third layer menu, and thus, between the second layer menu and the third layer menu, the second layer menu may be an upper layer menu, and the third layer menu may be a lower layer menu.
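  • For illustration only, the layered relationship described above can be modeled as a simple tree in which each item may carry the menu one layer below it. The following TypeScript sketch uses hypothetical names (MenuItem, commandMenu) that are not taken from the specification.

```typescript
// Minimal sketch of a layered command menu as a tree (names are illustrative).
// Each item may carry a child menu one layer below it.
interface MenuItem {
  label: string;
  operation?: () => void;      // leaf items map to an executable operation
  children?: MenuItem[];       // lower layer menu mapped to this item
}

// Three-stage example: first layer -> second layer -> third layer.
const commandMenu: MenuItem[] = [
  {
    label: "Contacts",
    children: [
      { label: "Missed calls", operation: () => console.log("show missed calls") },
      { label: "Favorites",    operation: () => console.log("show favorites") },
    ],
  },
  {
    label: "Apps",
    children: [
      { label: "Application A", operation: () => console.log("execute application A") },
    ],
  },
];

// Between adjacent stages, the parent array is the upper layer menu and the
// children array of the selected item is the lower layer menu.
function lowerLayerMenuOf(item: MenuItem): MenuItem[] | undefined {
  return item.children;
}

console.log(lowerLayerMenuOf(commandMenu[0])?.map(i => i.label));
```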
  • In operation 110, the processor may detect a point to which a touch point is moved in response to touch movement input of moving the touch point while the user touch input is maintained.
  • The touch movement input may indicate an input of moving a point from which the touch input is detected. For example, the touch movement input may be an input of dragging the touch point while the touch input is maintained.
  • In operation 120, the processor may select a target upper item corresponding to the moved touch point among candidate upper items of the upper layer menu. The layer menu may include a plurality of candidate items, and the target item may be an item selected by the touch movement input among the candidate items. For example, in response to the case in which the touch point enters an item graphic object indicating an arbitrary item among the candidate items, the processor may select the corresponding item as the target item. In response to the case in which an area equal to or greater than a predetermined ratio is detected from a region in which the touch is formed, the processor may determine the touch point to enter the item graphic object.
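  • As a rough illustration of the entry test described above, the following TypeScript sketch approximates the touch region and an item graphic object as rectangles (an assumption made here for simplicity) and treats the item as entered when the overlap covers a predetermined ratio of the touch region.

```typescript
// Minimal sketch of entry detection by overlap ratio, assuming both the touch
// region and each item graphic object can be approximated by axis-aligned
// rectangles (real item shapes may be rings or sectors).
interface Rect { x: number; y: number; width: number; height: number; }

function overlapArea(a: Rect, b: Rect): number {
  const w = Math.min(a.x + a.width, b.x + b.width) - Math.max(a.x, b.x);
  const h = Math.min(a.y + a.height, b.y + b.height) - Math.max(a.y, b.y);
  return w > 0 && h > 0 ? w * h : 0;
}

// The touch point is considered to "enter" the item when the overlap covers at
// least `ratio` of the touch region (0.5 mirrors the 50% example in the text).
function entersItem(touch: Rect, item: Rect, ratio = 0.5): boolean {
  const touchArea = touch.width * touch.height;
  return touchArea > 0 && overlapArea(touch, item) / touchArea >= ratio;
}

const touchRegion: Rect = { x: 98, y: 40, width: 12, height: 12 };
const itemObject: Rect = { x: 100, y: 30, width: 60, height: 40 };
console.log(entersItem(touchRegion, itemObject)); // true when >= 50% of the touch overlaps
```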
  • The upper layer menu may include at least one upper item, and a lower layer menu may be mapped to each of at least one upper item. The lower layer menu may include at least one lower item, and a next lower layer menu may be mapped to each of at least one lower item. For reference, although FIGS. 2, 3, 5, and 6 show the case in which the number of items included in each upper layer menu and each lower layer menu is 8, the embodiments are not limited thereto and may be changed depending on a design.
  • In operation 130, the processor may output the lower graphic object indicating the lower layer menu corresponding to the selected upper item while extending the lower graphic object from the upper graphic object indicating the upper layer menu. For example, the processor may output a lower graphic object in the form of surrounding an outer boundary of the upper graphic object in an outward direction from the upper graphic object. However, without being limited thereto, the processor may output a lower graphic object in the form of covering a part of the outer boundary of the upper graphic object. In this case, the processor may extend the lower graphic object in a direction toward the touch movement input. Thus, while the processor extends the lower graphic object from the upper graphic object, the touch movement input may be directed in one direction consistently.
  • In operation 140, the processor may detect a drop input of releasing the touch point from the lower layer menu. The drop input may be an input of releasing the touch point and may indicate an input of terminating the touch input at an arbitrary point.
  • In operation 150, in response to the case in which the drop input is detected at a point corresponding to one target lower item among candidate lower items of the lower layer menu, the processor may execute an operation corresponding to the target lower item. The target lower item may be an item selected among lower candidate items depending on the touch movement input. An operation corresponding to the lower item may be pre-stored in a memory, and for example, various operations such as application execution, preview display, and execution of a function of a device may be allocated to respective lower items. The memory in which the operation is pre-stored may be included in a device including the processor, but is not limited thereto, and the memory may be included in an external cloud device and the processor may communicate with the external cloud device to receive an operation. For example, when “Execute Application A” is allocated to the selected target lower item, the processor may execute application A by loading an operating process related to “Execute Application A” from the memory.
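  • The mapping from a selected lower item to a pre-stored operation can be pictured as a lookup table. The TypeScript sketch below is illustrative only; the item keys and operations are hypothetical placeholders, not entries defined by the specification.

```typescript
// Minimal sketch of executing the operation allocated to a dropped-on item.
type Operation = () => void;

const operationTable = new Map<string, Operation>([
  ["execute_application_a", () => console.log("launching application A")],
  ["show_preview",          () => console.log("displaying preview")],
  ["toggle_device_function", () => console.log("toggling a device function")],
]);

function onDropInput(targetLowerItemKey: string): void {
  const operation = operationTable.get(targetLowerItemKey);
  if (operation) {
    operation();                      // run the pre-stored operation
  } else {
    console.log(`no operation registered for ${targetLowerItemKey}`);
  }
}

onDropInput("execute_application_a");
```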
  • Although FIG. 1 shows only the graphic object including the upper layer menu and the lower layer menu, the embodiments are not limited thereto, and thus a method of outputting a command menu may also include performing a command operation by outputting a graphic object indicating layer menus of three stages or more.
  • A command menu outputting apparatus depending on an embodiment may provide a command menu based on authority while authenticating a user.
  • For example, the processor may acquire biometric information of the user in response to the user touch input. The biometric information of the user may be data related to a fingerprint of the user. The data related to the fingerprint of the user may include a pattern of the fingerprint and an interval between curves of the fingerprint. Depending on an embodiment, the data related to the fingerprint of the user may include a rate of change in the interval between the curves of the fingerprint over time. When the rate of change in the interval between the curves of the fingerprint over time is not changed, the processor may determine that the finger of the user is not moved. In contrast, when the rate of change in the interval between the curves of the fingerprint is equal to or greater than a threshold value, the processor may determine that the touch input is unstable and may skip a matching operation between the biometric information of the user to be described below and registered user information.
  • The processor may match the biometric information of the user and the registered user information. The processor may match the biometric information against registered user information from a registered database. The registered database may be stored in a memory associated with the processor, and the registered database may include user information (e.g., information on a registered fingerprint of each user) on a plurality of users. When registered user information related to a plurality of users is stored in the registered database, the processor may calculate a matching similarity between each of the plurality of pieces of registered user information and the user biometric information. The processor may determine that the user biometric information matches the corresponding registered user information in response to the case in which the matching similarity calculated for arbitrary registered user information exceeds a critical similarity. The processor may determine that matching is successful when there is information that matches the user biometric information among the plurality of pieces of registered user information.
  • In response to the case in which matching is successful, the processor may grant access authority for at least one of an application, a device, and a menu. Depending on an embodiment, the access authority for at least one of an application, a device, and a menu may be individually set differently for each user. The registered user information may include information on the access authority granted to the registered user, and for example, information on the application, the device, and the menu that the corresponding registered user accesses may be stored. When matching is successful, the processor may identify the application, the device, and the menu, which are permitted to the matched user, while loading the matched registered user information from the memory. For example, when a first user is a minor, the processor may load the stored information on the application, the device, and the menu that the first user is capable of accessing. Since the first user is a minor, the application, the device, and the menu that the first user is capable of accessing may be limited.
  • The processor may output at least one of the upper graphic object or the lower graphic object based on the granted access authority. For example, the processor may output a graphic object of a layer menu (e.g., an upper layer menu or a lower layer menu) including items indicating an operation that an arbitrary user is capable of accessing. In response to the case in which a non-authorized user attempts access, the processor may also output a layer menu in a guest mode.
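  • The matching and authority-granting flow described above might be sketched as follows. The similarity function, the threshold value, and the guest-mode item set are assumptions made purely for illustration; actual fingerprint matching is far more involved.

```typescript
// Minimal sketch of matching a biometric sample against a registered database
// and filtering the menu by granted access authority (all values illustrative).
interface RegisteredUser {
  id: string;
  template: number[];              // stored biometric template (placeholder)
  permittedItems: Set<string>;     // items this user may access
}

const CRITICAL_SIMILARITY = 0.9;   // assumed "critical similarity" threshold

function similarity(a: number[], b: number[]): number {
  // toy similarity: 1 - normalized mean absolute difference (placeholder only)
  const diff = a.reduce((s, v, i) => s + Math.abs(v - (b[i] ?? 0)), 0);
  return 1 - diff / a.length;
}

function authenticate(sample: number[], db: RegisteredUser[]): RegisteredUser | null {
  let best: RegisteredUser | null = null;
  let bestScore = 0;
  for (const user of db) {
    const score = similarity(sample, user.template);
    if (score > bestScore) { best = user; bestScore = score; }
  }
  return bestScore >= CRITICAL_SIMILARITY ? best : null;
}

function visibleMenuItems(allItems: string[], user: RegisteredUser | null): string[] {
  // An unmatched user falls back to a guest mode with a restricted item set.
  if (!user) return ["guest_help"];
  return allItems.filter(item => user.permittedItems.has(item));
}

const db: RegisteredUser[] = [
  { id: "user1", template: [0.1, 0.2, 0.3], permittedItems: new Set(["calls", "messages"]) },
];
const matched = authenticate([0.1, 0.21, 0.29], db);
console.log(visibleMenuItems(["calls", "messages", "settings"], matched));
```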
  • FIG. 2 is a diagram showing an upper graphic object for explaining a method of outputting a command menu on a large screen depending on an embodiment.
  • The method of outputting a command menu depending on an embodiment may be performed by an apparatus 200 including a large-screen display. The apparatus 200 including a large-screen display may have a relatively wide area for outputting a graphic object related to command menus. The apparatus 200 including the large-screen display may be embodied in various forms of products such as a television (TV), a personal computer, a laptop computer, an intelligent vehicle, or a kiosk. For example, depending on embodiments, a graphic object indicating a layer menu may be output in response to user touch input detected by a display of a TV. After the graphic object is called, a graphic object indicating lower layer menus may be output toward a margin space of a large-screen display in response to touch movement input. However, the apparatus 200 including a large-screen display is not limited only to the aforementioned embodiment, and may include an apparatus including a display that is difficult to grip with one hand.
  • When detecting touch input of a user, the processor of the apparatus 200 including a large-screen display may output an upper graphic object 210 indicating the upper layer menu among layered command menus based on a touch point 220 from which the touch input is detected.
  • Depending on an embodiment, as shown in FIG. 2 , the touch point 220 from which the touch input is detected may be output on the display. However, FIG. 2 shows the touch point 220 for convenience of description, and depending on another embodiment the touch point 220 may not be output.
  • While the touch input is maintained, the processor may detect a point to which the touch point is moved in response to touch movement input 230 from the detected touch point 220. Referring to FIG. 2 , in response to the touch movement input 230, the point to which the touch point is moved may be positioned at one item of the upper graphic object.
  • When the touch point is positioned on one item of the upper graphic object, the processor may determine that a corresponding item is selected. Thus, the processor may select a target upper item 211 corresponding to the moved point among candidate upper items of the upper layer menu. For example, the processor may select the target upper item 211 depending on whether a touch region occupies a critical ratio or greater of a graphic object corresponding to the target upper item 211. For example, in response to the case in which the touch region detected by the display occupies 50% or greater of a graphic object indicated by the target upper item 211, the processor may select the target upper item 211.
  • FIG. 3 is a diagram showing a lower graphic object for explaining a method of outputting a command menu on a large screen depending on an embodiment.
  • A processor of an apparatus 300 may output a lower graphic object 320 indicating a lower layer menu corresponding to a selected upper item while extending the lower graphic object 320 from an upper graphic object 310.
  • As shown in FIG. 3 , the upper graphic object 310 may be shaped like a circle with an empty inside, and for example, may be shaped like a donut. The lower graphic object 320 may be shaped like a circle that is in contact with an outer circumference of the upper graphic object 310 and accommodates the upper graphic object 310. However, the shapes of the upper graphic object 310 and the lower graphic object are not limited thereto, and in another example, the lower graphic object may be shaped like a sector, and the processor may output the lower graphic object shaped like a sector in a direction extending from the upper graphic object 310 based on a target upper item 311.
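  • The donut-shaped upper object and the surrounding or sector-shaped lower object can be laid out by distributing item positions over angles around the touch point. The TypeScript sketch below assumes specific radii and spread angles that are not specified in the text.

```typescript
// Minimal sketch of laying out ring-shaped menus: upper items on an inner ring
// around the touch point, lower items on an outer ring or a sector anchored at
// the selected upper item's angle. Radii and angles are illustrative.
interface Point { x: number; y: number; }

function ringPositions(center: Point, radius: number, count: number, startAngle = -Math.PI / 2): Point[] {
  // distribute `count` item centers evenly on a circle of the given radius
  return Array.from({ length: count }, (_, i) => {
    const angle = startAngle + (2 * Math.PI * i) / count;
    return { x: center.x + radius * Math.cos(angle), y: center.y + radius * Math.sin(angle) };
  });
}

function sectorPositions(center: Point, radius: number, count: number,
                         anchorAngle: number, spread = Math.PI / 2): Point[] {
  // distribute lower items over a sector centered on the selected item's angle
  const start = anchorAngle - spread / 2;
  return Array.from({ length: count }, (_, i) => {
    const angle = count === 1 ? anchorAngle : start + (spread * i) / (count - 1);
    return { x: center.x + radius * Math.cos(angle), y: center.y + radius * Math.sin(angle) };
  });
}

const touchPoint: Point = { x: 200, y: 200 };
const upper = ringPositions(touchPoint, 60, 8);       // donut-like upper layer with 8 items
const lower = sectorPositions(touchPoint, 120, 4, 0); // sector extending toward the selected item
console.log(upper.length, lower.length);
```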
  • After the lower graphic object 320 is output, the processor may detect a point to which the touch point is moved in response to detection of touch movement input 330. When the point to which the touch point is moved is a point corresponding to one target lower item 321 among candidate lower items of the lower layer menu, the processor may select the target lower item 321.
  • Depending on an embodiment, in response to the case in which the target lower item 321 is selected, the processor may visualize information on the target lower item 321 semi-transparently, and may overlay and output a graphic object for the lower layer menu on the information on the target lower item 321. The information on the target lower item 321 may be preview information (e.g., a preview image) related to the target lower item 321. However, the information on the target lower item 321 is not limited thereto, and may include various pieces of information that a user is capable of referring to for performing an operation related to the target lower item 321. For example, when the target lower item 321 is “missed call”, the information on the target lower item 321 may be a missed call list.
  • Depending on an embodiment, when a lower layer menu of the target lower item 321 is further present, the processor may output a graphic object indicating a lower layer item of the target lower item 321 in a direction extending from the lower graphic object 320.
  • The processor may detect a drop input of releasing a touch point from a point corresponding to one target lower item 321 among candidate lower items 322 of the lower layer menu. The processor may execute an operation corresponding to the target lower item 321 in response to the case in which the drop input is detected.
  • Depending on an embodiment, even if a lower layer menu of the target lower item 321 is further present, when the processor detects the drop input of releasing the touch point corresponding to the target lower item 321, the processor may pop up the lower layer menu of the target lower item 321. For example, a pop-up operation may be an operation of visualizing and extending graphic expression corresponding to the lower layer menu to an entire screen starting from the target lower item 321.
  • Depending on an embodiment, before executing an operation corresponding to the target lower item 321, the processor may output a graphic object for requesting user approval as to whether to execute the operation. The graphic object for requesting user approval may include a graphic object (e.g., a message window) for asking the user whether to execute the operation. In response to the graphic object for asking whether to execute the operation, a graphic object for allowing the user to select to execute the operation may be output. When detecting the touch input as to whether to execute the operation, the processor may determine whether to execute the corresponding operation. The processor may execute the corresponding operation in response to the case in which approval manipulation (e.g., activation of an approval button) for execution of the operation is received from the user. The processor may exclude execution of the corresponding operation in response to receiving rejection manipulation (e.g., activation of a reject button) for execution of the operation from the user.
  • Depending on another embodiment, when detecting the drop input of the user at a point corresponding to the target lower item and then detecting the touch input at the point corresponding to the target lower item within a predetermined time, the processor may execute an operation corresponding to the target lower item. In response to the case in which the drop input is detected from the target lower item, the processor may not immediately execute the operation corresponding to the target lower item, and may wait and execute the operation only when the touch input of the corresponding target lower item is detected once again, and accordingly, it may be possible to prevent an erroneous operation in which the operation is executed differently from user intention.
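  • The wait-and-confirm behavior described above can be sketched as a small state holder that executes only when the same item is touched again within a predetermined time. The 1500 ms window below is an assumed value.

```typescript
// Minimal sketch of confirm-by-re-touch: after a drop input on a target lower
// item, the operation runs only if the same item is touched again in time.
const CONFIRM_WINDOW_MS = 1500; // assumed "predetermined time"

interface PendingDrop { itemId: string; droppedAt: number; }

let pending: PendingDrop | null = null;

function onDrop(itemId: string, now: number): void {
  pending = { itemId, droppedAt: now };   // wait instead of executing immediately
}

function onTouch(itemId: string, now: number, execute: (id: string) => void): void {
  if (pending &&
      pending.itemId === itemId &&
      now - pending.droppedAt <= CONFIRM_WINDOW_MS) {
    execute(itemId);                      // confirmed: same item re-touched in time
  }
  pending = null;                         // anything else cancels the pending drop
}

onDrop("missed_calls", 0);
onTouch("missed_calls", 800, id => console.log(`executing ${id}`)); // within window -> executes
```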
  • Depending on an embodiment, the processor may determine an item array combination of the lower graphic object 320 based on an execution history of the user and may output the determined item array combination around one item of the upper layer menu. The execution history may be, but is not limited to, a frequency of execution of an operation, a frequency of selection of an item, or the like, and may be a sequence of recent execution. The processor may output lower candidate items around the target upper item in an order from a lower candidate item with the highest priority depending on the execution history. For example, with respect to the target upper item ‘A’, the processor may position the most selected or executed lower candidate item ‘A1’ around the target upper item ‘A’.
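  • One possible way to realize the history-based arrangement is to sort candidate lower items by execution frequency and then recency, as in the sketch below; the history fields and the tie-breaking rule are assumptions for illustration.

```typescript
// Minimal sketch of arranging candidate lower items by execution history so
// that the most frequently used item is placed closest to the target upper item.
interface ExecutionHistory {
  executionCount: Map<string, number>;   // itemId -> how often it was executed
  lastExecutedAt: Map<string, number>;   // itemId -> timestamp of last execution
}

function arrangeByHistory(candidateItems: string[], history: ExecutionHistory): string[] {
  return [...candidateItems].sort((a, b) => {
    const countDiff = (history.executionCount.get(b) ?? 0) - (history.executionCount.get(a) ?? 0);
    if (countDiff !== 0) return countDiff;                                                // primary: frequency
    return (history.lastExecutedAt.get(b) ?? 0) - (history.lastExecutedAt.get(a) ?? 0);   // secondary: recency
  });
}

const history: ExecutionHistory = {
  executionCount: new Map([["A1", 12], ["A2", 3], ["A3", 7]]),
  lastExecutedAt: new Map([["A1", 100], ["A2", 300], ["A3", 200]]),
};
console.log(arrangeByHistory(["A2", "A3", "A1"], history)); // ["A1", "A3", "A2"]
```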
  • FIG. 4 is a diagram showing a graphic object indicating some items of a command menu on a large screen depending on an embodiment.
  • When the number of items of the lower layer menu is greater than a predetermined number, the processor may output a graphic object 410 indicating some items 430 among candidate lower items of the lower layer menu on a display 400. Then, in response to user input 420 distinct from touch input, the processor may expose at least some of remaining items except for the some items 430 and may exclude output of at least some of the some items 430.
  • The processor may output only the some items 430 of the candidate lower item on the graphic object 410. The graphic object 410 may be a graphic object formed by listing candidate lower items. The user input 420 may be scroll input, and the scroll input may be touch movement input of moving a touch point in a direction in which candidate lower items are listed (e.g., a vertical direction in FIG. 4 ). In response to the case in which the touch movement input in the direction in which candidate lower items are listed is detected, the processor may expose some of remaining items except for the some items 430, and in response thereto, may exclude output of some of the some items 430.
  • As shown in an example of FIG. 4 , the processor may expose only the some items 430 among candidate lower items related to a phone number, may output the some items 430 on the graphic object 410, and may exclude output of remaining items except for the some items 430. In response to the case in which touch movement input toward an upper end from a lower end of the graphic object 410 is detected in a direction in which the candidate lower items are listed, the processor may expose some of the remaining items. The processor may exclude output of some of the some items 430 that have been exposed on the graphic object 410 in response to a portion of the graphic object 410, which is additionally exposed.
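  • The partial-exposure behavior can be viewed as sliding a fixed-size window over the full list of candidate lower items. The following sketch assumes a window size and scroll step that are not specified in the text.

```typescript
// Minimal sketch of showing only a window of candidate lower items and sliding
// that window in response to a scroll-like input.
class ItemWindow {
  private offset = 0;
  constructor(private items: string[], private windowSize: number) {}

  visible(): string[] {
    return this.items.slice(this.offset, this.offset + this.windowSize);
  }

  // A positive step exposes items below the window and excludes the same number
  // of previously exposed items at the top (and vice versa for a negative step).
  scroll(step: number): void {
    const maxOffset = Math.max(0, this.items.length - this.windowSize);
    this.offset = Math.min(maxOffset, Math.max(0, this.offset + step));
  }
}

const contacts = new ItemWindow(["Ann", "Ben", "Cho", "Dan", "Eve", "Fay"], 3);
console.log(contacts.visible()); // ["Ann", "Ben", "Cho"]
contacts.scroll(2);              // user scrolls toward the end of the list
console.log(contacts.visible()); // ["Cho", "Dan", "Eve"]
```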
  • FIG. 5 is a diagram showing rotation of a graphic object depending on an embodiment.
  • The processor may detect external touch inputs 521 and 522 of at least one point of a region outside a graphic object for a layer menu. The processor may detect a movement trajectory 530 of the external touch input. When touch input 520 is maintained in a region 510 inside the graphic object, the processor may rotate the graphic object based on the movement trajectory 530.
  • The region outside the graphic object may be the remaining region of the display except for the portion occupied by the graphic object. The movement trajectory 530 of the external touch input may be shaped like, for example, a curve, and the processor may determine whether the movement trajectory 530 is clockwise or counterclockwise. For example, when the movement trajectory 530 is detected to be clockwise, the processor may rotate the graphic object clockwise. As the processor rotates the graphic object, a direction in which the lower layer menu is output may be adjusted. Thus, the user may bring an item hidden by his or her finger into his or her field of view by rotating the graphic object. In addition, a margin of the display may be effectively utilized. For reference, when the graphic object is rotated, the processor may gradually rearrange the graphic objects corresponding to the items clockwise or counterclockwise based on a reference point (e.g., the center of a circle). The processor may rearrange only the position of each graphic object while maintaining the shape of each graphic object, rather than rotating the shapes themselves.
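  • The clockwise-or-counterclockwise decision can be made, for example, by accumulating the signed angle that the external trajectory sweeps around the menu center, as in the sketch below; the sign convention depends on the coordinate system and is an assumption here.

```typescript
// Minimal sketch of deciding rotation direction from the trajectory of an
// external touch: accumulate the signed angle swept around the menu center.
// Mathematical axes (y pointing up) are assumed; screen coordinates flip the sign.
interface Point { x: number; y: number; }

function sweptAngle(center: Point, trajectory: Point[]): number {
  let total = 0;
  for (let i = 1; i < trajectory.length; i++) {
    const a = Math.atan2(trajectory[i - 1].y - center.y, trajectory[i - 1].x - center.x);
    const b = Math.atan2(trajectory[i].y - center.y, trajectory[i].x - center.x);
    let delta = b - a;
    if (delta > Math.PI) delta -= 2 * Math.PI;     // unwrap across the ±π boundary
    if (delta < -Math.PI) delta += 2 * Math.PI;
    total += delta;
  }
  return total; // > 0: counterclockwise sweep, < 0: clockwise sweep
}

const menuCenter: Point = { x: 0, y: 0 };
const clockwiseTrajectory: Point[] = [
  { x: 100, y: 0 }, { x: 70, y: -70 }, { x: 0, y: -100 },
];
const angle = sweptAngle(menuCenter, clockwiseTrajectory);
console.log(angle < 0 ? "rotate menu clockwise" : "rotate menu counterclockwise");
```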
  • FIG. 6 is a diagram showing movement of a graphic object depending on an embodiment.
  • The processor may detect an external touch input 621 of at least one point of a region outside the graphic object for the layer menu. The processor may detect a movement trajectory of the external touch input 621. When touch input 620 is maintained in a region 610 inside the graphic object, the processor may move the graphic object based on a movement trajectory 630. The processor may move at least a portion of the graphic object in identified directions 630 and 631 based on the external touch input 621.
  • In response to the case in which the movement trajectory of the external touch input 621 is identified to be a straight line, the processor may detect the touch movement direction of the external touch input 621. The processor may move at least a portion of the graphic object in the identified directions 630 and 631 based on the external touch input 621. The moved portion may be, but is not limited to, the graphic objects other than the graphic object for the uppermost layer menu; alternatively, the graphic object may be moved in its entirety. When the display does not have enough margin left in the direction in which a lower layer menu is output to display an additional lower layer menu, the processor may move the graphic object and output the added lower layer menu depending on user input.
  • FIG. 7 is a flowchart of a method of outputting a command menu on a small screen depending on an embodiment.
  • Operations 700 to 720 of outputting a graphic object at a reference point at which touch input is detected and then selecting a target upper item are the same as the above description of operations 100 to 120 of FIG. 1 , and thus a detailed description thereof is omitted.
  • In operation 730, the processor may detect a touch return movement input in which the touch point returns to the reference point after the target upper item is selected. The reference point may be a point corresponding to a touch point at which a display touch input of a user is generated. The processor may detect a touch return movement input of returning from the touch point detected in response to the touch movement input to the reference point after the target upper item is selected.
  • In operation 740, in response to the case in which the touch return movement input is detected, the processor may replace a graphic object indicating an upper layer menu with a graphic object indicating a lower layer menu corresponding to the target upper item and may output the same. As the upper layer menu is replaced with the lower layer menu, a portion of the display, occupied by the graphic object, may not be increased, and thus, it may be easy to output a command menu on a small screen with a small display margin compared with a large screen. In addition, a small-screen apparatus is often held with one hand, and thus movement of the touch movement input may be shorter than in the case in which a command menu is output on a large screen. In response to the touch return movement input, the graphic object indicating the lower layer menu is replaced and output, and thus the command menu to the lowermost layer menu from the uppermost layer menu may be output with relatively short movement of the touch movement input. Here, the uppermost layer menu and the lowermost layer menu may correspond to layer menus of uppermost and lowermost stages depending on an embodiment.
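  • The replace-in-place behavior of operations 730 and 740 can be sketched as a small state machine that descends into the selected item's lower layer menu when the touch point comes back within some radius of the reference point; the radius below is an assumed value.

```typescript
// Minimal sketch of the small-screen flow: selecting an upper item and then
// returning the touch point to the reference point replaces the displayed menu
// with the selected item's lower layer menu.
interface Point { x: number; y: number; }
interface MenuNode { label: string; children?: MenuNode[]; }

const RETURN_RADIUS = 20; // how close to the reference point counts as "returned" (assumed)

class SmallScreenMenu {
  private current: MenuNode[];
  private selected: MenuNode | null = null;

  constructor(root: MenuNode[], private reference: Point) {
    this.current = root;
  }

  displayed(): string[] { return this.current.map(n => n.label); }

  onItemReached(item: MenuNode): void { this.selected = item; }

  onTouchMoved(p: Point): void {
    const returned = Math.hypot(p.x - this.reference.x, p.y - this.reference.y) <= RETURN_RADIUS;
    const selected = this.selected;
    if (returned && selected && selected.children) {
      this.current = selected.children; // replace upper menu with lower menu in place
      this.selected = null;
    }
  }
}

const root: MenuNode[] = [
  { label: "Contacts", children: [{ label: "Missed calls" }, { label: "Favorites" }] },
];
const menu = new SmallScreenMenu(root, { x: 50, y: 50 });
menu.onItemReached(root[0]);          // touch moves onto "Contacts"
menu.onTouchMoved({ x: 52, y: 49 });  // touch returns to the reference point
console.log(menu.displayed());        // ["Missed calls", "Favorites"]
```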
  • Operations 750 and 760 are the same as the above description of operations 140 and 150 of FIG. 1 , and thus a detailed description thereof is omitted.
  • FIG. 8 is a diagram showing an upper graphic object for explaining a method of outputting a command menu on a small screen depending on an embodiment.
  • The method of outputting a command menu depending on an embodiment may be performed by an apparatus 800 including a small-screen display. The apparatus 800 including a small-screen display may have a relatively small area for outputting a graphic object related to command menus compared with a large-screen display. The apparatus 800 including a small-screen display may be embodied in various forms of products such as a smartphone, a tablet PC, a smart electronic device, an intelligent vehicle, or a wearable device. For example, depending on embodiments, in response to user touch input detected by a display of a smartphone, a graphic object indicating a layer menu may be output. In response to a touch return movement input 831, after a graphic object is called, a graphic object indicating lower layer menus may replace a graphic object indicating upper layer menus and may be output. However, the apparatus 800 including a small-screen display is not limited only to the above embodiments, and may be an apparatus including a display to be gripped by one hand.
  • When detecting touch input of a user, a processor of the apparatus 800 including a small-screen display may output an upper graphic object 810 indicating an upper layer menu among layered command menus based on a reference point 820 from which touch input is detected.
  • Depending on an embodiment, as shown in FIG. 8 , the reference point 820 from which the touch input is detected may be output on the display. FIG. 8 shows the reference point 820 for convenience of explanation, but depending on another embodiment, the reference point 820 may not be output on the display.
  • While the touch input is maintained, the processor may detect a point to which the touch point is moved in response to touch movement input 830 from the detected reference point 820. Referring to FIG. 8 , the point to which the touch point is moved in response to the touch movement input 830 may be positioned on one item of the upper graphic object.
  • When the touch point is positioned on one item of the upper graphic object, the processor may determine that the positioned item is selected. Thus, the processor may select a target upper item 811 corresponding to the selected point among candidate upper items of the upper layer menu. Selection of the target upper item may be determined depending on whether the touch point occupies a critical ratio or greater of a graphic object indicated by the target upper item. For example, in response to the case in which the touch point detected by the processor corresponds to 50% or greater of the graphic object indicated by the target upper item, the processor may determine that the upper item is selected.
  • After the target upper item 811 is selected, the processor may detect the touch return movement input 831 in which the touch point returns to the reference point 820. The touch return movement input 831 may be an input in a direction corresponding to the touch movement input. A trajectory of the touch return movement input may be formed in a direction opposite to the movement direction of the touch movement input, but is not limited thereto, and any trajectory that moves the touch point from the target upper item back to the reference point may be formed.
  • A graphic object related to a menu depending on an embodiment may be shaped like a sector, or may be shaped like a circle that radiates from the reference point. The shape of the graphic object is not limited thereto, and the graphic object may be configured in any form that allows a user to select a menu item.
  • FIG. 9 is a diagram showing a lower graphic object for explaining a method of outputting a command menu on a small screen depending on an embodiment.
  • In response to the case in which a touch return movement input 931 is detected, a processor of an apparatus 900 may replace a graphic object indicating an upper layer menu with a graphic object indicating a lower layer menu corresponding to the target upper item and may output the same.
  • Depending on an embodiment, the processor may detect a drop input of releasing a touch point from a point corresponding to one target lower item 911 among candidate lower items of the lower layer menu. In response to the case in which the drop input is detected, the processor may execute an operation corresponding to the target lower item 911.
  • Depending on another embodiment, when the touch point deviates from a point corresponding to the target lower item 911 depending on touch movement input having an acceleration equal to or greater than a critical acceleration, the processor may execute an operation corresponding to the target lower item 911. For example, in response to the case in which the touch point is moved to an external point of a graphic object 910 of a layer menu from the target lower item 911, the processor may calculate a moving acceleration of the touch point to the external point from the target lower item 911. In response to the case in which the moving acceleration of the touch point to the external point from the target lower item 911 exceeds the critical acceleration, the processor may execute an operation corresponding to the target lower item 911. Compared with the case in which an operation is performed in response to a drop input of releasing the touch point, execution of an operation depending on touch movement input with an acceleration may prevent the operation from being executed differently from user intention.
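  • The acceleration test can be approximated from a few timed touch samples by finite differences, as in the sketch below; the units and the critical-acceleration value are assumptions for illustration.

```typescript
// Minimal sketch of flick-to-execute: estimate the acceleration of the touch
// point as it leaves the target lower item and execute only when it exceeds a
// critical acceleration.
interface Sample { x: number; y: number; t: number; } // position in px, time in ms

const CRITICAL_ACCELERATION = 0.01; // px per ms^2, illustrative threshold

function estimateAcceleration(s0: Sample, s1: Sample, s2: Sample): number {
  const v1 = Math.hypot(s1.x - s0.x, s1.y - s0.y) / (s1.t - s0.t); // speed over first interval
  const v2 = Math.hypot(s2.x - s1.x, s2.y - s1.y) / (s2.t - s1.t); // speed over second interval
  return (v2 - v1) / ((s2.t - s0.t) / 2);                          // change in speed over time
}

function maybeExecuteOnExit(samples: [Sample, Sample, Sample], execute: () => void): void {
  if (estimateAcceleration(...samples) >= CRITICAL_ACCELERATION) {
    execute(); // a fast outward flick from the item triggers the operation
  }
}

maybeExecuteOnExit(
  [{ x: 0, y: 0, t: 0 }, { x: 5, y: 0, t: 16 }, { x: 30, y: 0, t: 32 }],
  () => console.log("executing target lower item"),
);
```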
  • Before executing an operation corresponding to the target lower item 911, the processor may output a graphic object for requesting user approval as to whether to execute the operation. Depending on another embodiment, when the drop input of the user is detected from the target lower item 911 and the touch input is then detected again from a point corresponding to the target lower item, the processor may execute an operation corresponding to the target lower item.
  • Depending on another embodiment, the processor may detect the touch return movement input 931 of selecting the target lower item 911 and then moving the touch point to return to a reference point 920 depending on touch movement input 930. In response to the case in which the touch return movement input 931 is detected, the processor may replace the lower graphic object 910 with a graphic object indicating an additional lower layer menu corresponding to the target lower item 911 and may output the same.
  • FIG. 10 is a diagram showing a graphic object indicating some items of a command menu on a small screen depending on an embodiment.
  • When the number of items of the lower layer menu is greater than a predetermined number, a processor of an apparatus 1000 may output a graphic object 1010 indicating some items 1020 among candidate lower items of the lower layer menu on a display. Then, in response to user input distinct from touch input, the processor may expose at least some of remaining items except for the some items 1020 and may exclude output of at least some of the some items 1020.
  • The processor may output only the some items 1020 of the candidate lower item on the graphic object 1010. The graphic object 1010 may be a graphic object formed by listing candidate lower items. The user input may be a two-touch round-trip input 1030 corresponding to a desired scrolling direction, distinct from the touch input and the touch return movement input. The touch round-trip input may include touch movement input and a touch return movement input in a direction corresponding to a touch movement direction. One touch round-trip input and the two-touch round-trip input 1030 may be distinguished depending on the number of round trip inputs detected for a predetermined time. In response to the two-touch round-trip input 1030 corresponding to a desired scrolling direction, the processor may expose some of remaining items except for the some items 1020, and in response thereto, may exclude output of some of the some items 1020.
  • As shown in an example of FIG. 10 , the processor may list only the some items 1020 among candidate lower items related to a phone number and may output the list on the graphic object 1010. In response to the case in which the two-touch round-trip input 1030 corresponding to an upward direction from the reference point is detected, the processor may expose an item corresponding to an upper end of the graphic object 1010 among remaining items except for the some items 1020. In response to an item exposed out of an upper end of the graphic object 1010, output of some of the some items 1020, exposed out of a lower end of the graphic object 1010, may be excluded.
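  • Distinguishing one round trip from two can be done by counting round trips completed within a detection window, as sketched below with an assumed window length.

```typescript
// Minimal sketch of distinguishing a two-touch round-trip scroll gesture from a
// single round trip by counting round trips completed within a time window.
const ROUND_TRIP_WINDOW_MS = 600; // assumed "predetermined time"

class RoundTripCounter {
  private timestamps: number[] = [];

  // Called whenever the touch point leaves the reference point and comes back.
  onRoundTripCompleted(now: number): "single" | "double" {
    this.timestamps.push(now);
    // keep only round trips inside the detection window
    this.timestamps = this.timestamps.filter(t => now - t <= ROUND_TRIP_WINDOW_MS);
    if (this.timestamps.length >= 2) {
      this.timestamps = [];
      return "double"; // interpreted as the scroll gesture
    }
    return "single";   // a lone round trip keeps its original meaning
  }
}

const counter = new RoundTripCounter();
console.log(counter.onRoundTripCompleted(0));   // "single"
console.log(counter.onRoundTripCompleted(350)); // "double" -> scroll the item list
```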
  • FIG. 11 is a diagram for explaining a method of outputting a command menu when a touch detector and a display are separated from each other depending on an embodiment.
  • Depending on an embodiment, a manipulation detector for detecting user manipulation may be physically separated from a display 1100, and an electrical signal detected by the touch detector may be transferred to a processor through a communication unit between the manipulation detector and the display 1100. In FIG. 11 , the manipulation detector includes, for example, a touch detector 1120, but is not limited thereto.
  • In response to the case in which the touch detector 1120 detects user touch input 1130, a processor may output a manipulation indicator on a part of the display 1100, indicating a touch point from which touch input 1130 is detected. The manipulation indicator may be a graphic object displayed on the display 1100 to correspond to a point from which the touch input is detected.
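  • When the touch detector and the display have different resolutions, the detected touch point can be mapped onto display coordinates for the manipulation indicator by a simple proportional scaling, as in the sketch below; the sizes used are assumed values.

```typescript
// Minimal sketch of mapping a point reported by a separate touch detector onto
// display coordinates for the manipulation indicator.
interface Size { width: number; height: number; }
interface Point { x: number; y: number; }

function toDisplayPoint(touchPoint: Point, detector: Size, display: Size): Point {
  return {
    x: (touchPoint.x / detector.width) * display.width,
    y: (touchPoint.y / detector.height) * display.height,
  };
}

const detectorSize: Size = { width: 400, height: 300 };   // touch pad resolution (assumed)
const displaySize: Size = { width: 1920, height: 1080 };  // display resolution (assumed)
console.log(toDisplayPoint({ x: 200, y: 150 }, detectorSize, displaySize)); // center -> center
```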
  • An operation of outputting a graphic object 1110 related to a layered menu to correspond to the touch point and executing an operation corresponding to the target lower item among lower menus is the same as the above description of FIGS. 1 to 10 , and thus a detailed description thereof is omitted.
  • FIG. 12 is a diagram for explaining a method of outputting a command menu in virtual reality depending on an embodiment.
  • Depending on an embodiment, a manipulation detector for detecting user input and a display may be physically separated from each other, and an electrical signal detected by the manipulation detector may be transferred to a processor through a communication unit between the manipulation detector and the display.
  • The manipulation detector may be a sensor for detecting movement of the body part of the user and may include, for example, a sensor for detecting finger joint movement. The manipulation detector may include a sensor implemented in the form of a glove for sensing bending and unfolding of the user knuckles. The sensor for sensing bending and unfolding of the user knuckles may be positioned in a portion corresponding to the user knuckle of the glove. The sensor may include a piezoelectric sensor, and in response to a piezoelectric signal generated when the finger is bent, the sensor may detect finger joint movement. However, the embodiments are not limited thereto, and the sensor may include a pressure sensor, and in response to the case in which a pressure generated by bending the user finger is sensed, the sensor may detect whether the finger is bent. However, the movement of the body part is not limited to detection of the movement of the finger joint and may include blinking of an eye, movement of legs and arms, and a joint motion of the body part.
  • Depending on another embodiment, the manipulation detector may be a sensor for detecting movement of the body part of the user and may include an image sensor for detecting hand movement. Sensing data of the image sensor including a camera may be transmitted to the processor, and the camera may photograph the user hand.
  • The above embodiments are merely examples of detecting movement of the user hand, and methods of detecting whether the user finger is unfolded are not limited to the above embodiments; any method for detecting movement of the user hand at the level of ordinary skill in the art may be used, such as a wearable device that measures an angle or a distance between joints, or electrical resistance.
  • The display may be a virtual display implemented through virtual reality, and virtual reality may be implemented by connecting a virtual reality device 1220 in contact with the user face to a processor. The virtual reality device 1220 may limit a user view, and only the display of virtual reality may be implemented by the processor. The virtual reality device 1220 may provide, for example, a right eye image to a right eye of the user and may provide a left eye image to a left eye of the user, and the right eye image and the left eye image may have disparity with each other. The virtual reality device 1220 may provide the aforementioned left eye image and the right eye image to the user, and thus may visualize and provide three-dimensional content to the user. Depending on another embodiment, the virtual reality device 1220 may not limit the user view, and the user may execute an operation related to a command item while viewing a screen on which virtual reality overlaps reality.
  • The processor may detect a state 1210 in which the user finger is unfolded and may output a graphic object 1230 indicating an upper layer menu among layered command menus with a manipulation indicator on a portion indicating a touch point corresponding to one end of the finger on the display of virtual reality.
  • The processor may detect touch movement input of moving a touch point corresponding to one end of the finger and may detect a point to which the manipulation indicator is moved as the touch point is moved while the state 1210 in which the user finger is unfolded is maintained. In response to detection of the point to which the manipulation indicator is moved, the processor may select a target upper item of the upper layer menu and may output a lower graphic object indicating a lower layer menu corresponding to the target upper item. The processor may detect a state 1200 in which the user finger is unfolded from a point corresponding to one target lower item among candidate lower items of the lower layer menu and may execute an operation corresponding to the target lower item.
  • FIG. 13 is a block diagram showing the overall configuration of an apparatus 1300 for outputting a command menu depending on an embodiment.
  • The apparatus 1300 for outputting a command menu may include a processor 1310, a display 1320, and a touch detector 1330. The processor 1310 may receive an electrical signal converted from touch input detected by the touch detector 1330. The processor 1310 may search for a point on the display 1320, corresponding to a touch point from which the touch input is detected, based on the received electrical signal. The processor 1310 may output a graphic object at a corresponding point on the display 1320 and may then perform a series of processes for executing an operation. The process for executing the operation after the graphic object is output is the same as the above description of FIGS. 1 to 12 , and thus a detailed description thereof is omitted. For reference, although FIG. 13 illustrates the case in which the display 1320 and the touch detector 1330 are separate components, the embodiments are not limited thereto. The display 1320 and the touch detector 1330 may be integrally implemented as a sensitive display.
  • Depending on an embodiment, the apparatus 1300 for outputting a command menu may further include a memory of a registered database for storing registered user information that matches user biometric information. The processor 1310 may grant access authority to the user based on the registered user information stored in the memory.
  • The embodiments described above may be implemented by a hardware component, a software component, and/or a combination of a hardware component and a software component. For example, the device, the method, and the components described with regard to the embodiments may be implemented using one or more general-purpose computers or a special purpose computer, for example, a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device for executing and responding to an instruction. The processing device may execute an operating system (OS) and one or more software applications executed on the OS. The processing device may access, store, manipulate, process, and generate data in response to execution of the software. For convenience of understanding, although one processing device is described as being used, those of ordinary skill in the art would understand that the processing device may include a plurality of processing elements and/or a plurality of types of processing elements. For example, the processing device may include a plurality of processors or one processor and one controller. In addition, the processing device may also include other processing configurations such as a parallel processor.
  • Software may include a computer program, a code, an instruction, or a combination of one or more thereof and may configure the processing device to operate as described or may independently or collectively issue a command to the processing device. The software and/or data may be permanently or temporarily embodied by any type of machine, a component, a physical device, virtual equipment, a computer storage or device, or a received signal wave in order to be interpreted by the processing device or to provide a command or data to the processing device. The software may be distributed over a networked computer system and may be stored or executed in a distributed manner. The software and data may be stored in one or more computer-readable recording media.
  • The methods depending on the embodiments may be recorded in a computer readable medium including program commands for executing operations implemented through various computers. The computer readable medium may store program commands, data files, data structures, or combinations thereof. The program commands recorded in the medium may be specially designed and configured for the present invention or may be known to those skilled in the field of computer software. Examples of a computer readable recording medium include magnetic media such as hard disks, floppy disks, and magnetic tapes, optical media such as CD-ROMs and DVDs, magneto-optical media such as floptical disks, and hardware devices such as ROMs, RAMs, and flash memories, which are specially configured to store and execute program commands. Examples of the program commands include machine language code created by a compiler and high-level language code executable by a computer using an interpreter and the like. The hardware device described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.
  • As described above, although the embodiments have been described with reference to the limited drawings, those skilled in the art may apply various technical modifications and variations based on the above description. For example, an appropriate result may be achieved even if the described technologies are performed in an order different from the described method, and/or the described components of a system, a structure, an apparatus, a circuit, and the like are coupled or combined in a form different from the described method or are replaced or substituted with other components or equivalents.

Claims (2)

1. A method of outputting a layered command menu on a display, which is performed by a processor, the method comprising:
outputting, in response to a case in which a touch input of a user is detected, an upper graphic object indicating an upper layer menu of the layered command menu at a touch point from which the touch input is detected;
detecting a point to which the touch point is moved in response to touch movement input of moving the touch point while the touch input of the user is maintained;
selecting a target upper item corresponding to the moved point among candidate upper items of the upper layer menu;
outputting a lower graphic object indicating a lower layer menu corresponding to the selected target upper item while extending the lower graphic object from the upper graphic object;
selecting a target lower item among candidate lower items of the lower layer menu depending on the touch movement input from the target upper item;
when a next lower layer menu is mapped to the target lower item, outputting a graphic object indicating the next lower layer item; and
executing an operation corresponding to one of the target lower item and the next lower layer item in response to a case in which a drop input is detected from a point corresponding to one of the target lower item and the next lower layer item.
2. A command menu outputting apparatus including a processor, wherein the processor:
in response to a case in which a touch input of a user is detected:
outputs an upper graphic object indicating an upper layer menu among layered command menus at a touch point from which the touch input is detected,
in response to touch movement input of moving the touch point while the user touch input is maintained:
detects a point to which the touch point is moved, and
in response to a case in which a drop input is detected from a point corresponding to one of a target lower item and a next lower layer item:
selects a target upper item corresponding to the moved point among candidate upper items of the upper layer menu,
outputs a lower graphic object indicating a lower layer menu corresponding to the selected target upper item while extending the lower graphic object from the upper graphic object,
selects the target lower item among candidate lower items of the lower layer menu depending on the touch movement input from the target upper item,
when the next lower layer menu is mapped to the target lower item, outputs a graphic object indicating the next lower layer item, and
executes an operation corresponding to one of the target lower item and the next lower layer item.
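
Purely as an illustration and not as part of the claims, the short, self-contained script below walks through the sequence recited in claim 1 (touch down, two touch movements, and a drop) against a toy two-layer menu; the menu contents, the events list, and the printed messages are invented for this sketch.

    # Hypothetical walk-through of the claim 1 sequence; all data below is invented.
    menu = {                                  # upper layer menu
        "Edit": {                             # lower layer menu mapped to the "Edit" item
            "Copy": lambda: print("copy executed"),
            "Paste": lambda: print("paste executed"),
        },
        "Share": lambda: print("share executed"),
    }

    events = [("down", None), ("move", "Edit"), ("move", "Copy"), ("drop", None)]

    current = menu      # menu whose graphic object is currently displayed
    selected = None     # item the touch point currently corresponds to
    for kind, label in events:
        if kind == "down":
            print("show upper graphic object with items:", list(current))
        elif kind == "move":
            selected = current[label]
            if isinstance(selected, dict):    # a next lower layer menu is mapped to the item
                current = selected
                print(f"extend lower graphic object for '{label}':", list(current))
        elif kind == "drop" and callable(selected):
            selected()                        # execute the operation mapped to the item

Running it prints the upper layer items, the extended lower layer menu for "Edit", and finally "copy executed" when the drop lands on "Copy".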
US17/971,083 2019-04-09 2022-10-21 Method for outputting command method Abandoned US20230043168A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/971,083 US20230043168A1 (en) 2019-04-09 2022-10-21 Method for outputting command method

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
KR10-2019-0041341 2019-04-09
KR20190041341 2019-04-09
KR10-2019-0083089 2019-07-10
KR1020190083089A KR102086578B1 (en) 2019-04-09 2019-07-10 Method to output command menu
PCT/KR2020/003877 WO2020209520A1 (en) 2019-04-09 2020-03-20 Method for outputting command menu
US202117602978A 2021-10-11 2021-10-11
US17/971,083 US20230043168A1 (en) 2019-04-09 2022-10-21 Method for outputting command method

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
PCT/KR2020/003877 Division WO2020209520A1 (en) 2019-04-09 2020-03-20 Method for outputting command menu
US17/602,978 Division US11513662B2 (en) 2019-04-09 2020-03-20 Method for outputting command method

Publications (1)

Publication Number Publication Date
US20230043168A1 (en) 2023-02-09

Family

ID=70911958

Family Applications (2)

Application Number Title Priority Date Filing Date
US17/602,978 Active US11513662B2 (en) 2019-04-09 2020-03-20 Method for outputting command method
US17/971,083 Abandoned US20230043168A1 (en) 2019-04-09 2022-10-21 Method for outputting command method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US17/602,978 Active US11513662B2 (en) 2019-04-09 2020-03-20 Method for outputting command method

Country Status (6)

Country Link
US (2) US11513662B2 (en)
EP (1) EP3955100A4 (en)
JP (1) JP2022528771A (en)
KR (1) KR102086578B1 (en)
CN (1) CN113678097A (en)
WO (1) WO2020209520A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114780195B (en) * 2022-04-29 2024-02-13 济南浪潮数据技术有限公司 Operation visualization method, device, equipment and storage medium
GB202209690D0 (en) * 2022-07-01 2022-08-17 Kasekende Wilfred Menu navigation arrangement

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050140661A1 (en) * 2002-01-18 2005-06-30 Trigenix Limited Graphic user interface for data processing device
US20130268897A1 (en) * 2011-12-08 2013-10-10 Huawei Technologies Co., Ltd. Interaction method and interaction device
US20140033127A1 (en) * 2012-07-25 2014-01-30 Samsung Electronics Co., Ltd. User terminal apparatus and control method thereof
US20140157200A1 (en) * 2012-12-05 2014-06-05 Samsung Electronics Co., Ltd. User terminal apparatus and method of controlling the same
US20150082162A1 (en) * 2013-09-13 2015-03-19 Samsung Electronics Co., Ltd. Display apparatus and method for performing function of the same

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1510911A3 (en) 2003-08-28 2006-03-22 Sony Corporation Information processing apparatus, information processing method, information processing program and storage medium containing information processing program
JP2006139615A (en) * 2004-11-12 2006-06-01 Access Co Ltd Display device, menu display program, and tab display program
US7509348B2 (en) * 2006-08-31 2009-03-24 Microsoft Corporation Radially expanding and context-dependent navigation dial
KR100973354B1 (en) 2008-01-11 2010-07-30 성균관대학교산학협력단 Device and method for providing user interface of menu
WO2009158549A2 (en) * 2008-06-28 2009-12-30 Apple Inc. Radial menu selection
JP4632102B2 (en) * 2008-07-17 2011-02-16 ソニー株式会社 Information processing apparatus, information processing method, and information processing program
KR101004463B1 (en) 2008-12-09 2010-12-31 성균관대학교산학협력단 Handheld Terminal Supporting Menu Selecting Using Drag on the Touch Screen And Control Method Using Thereof
US20100185985A1 (en) * 2009-01-19 2010-07-22 International Business Machines Corporation Managing radial menus in a computer system
US8375329B2 (en) * 2009-09-01 2013-02-12 Maxon Computer Gmbh Method of providing a graphical user interface using a concentric menu
JP2011107823A (en) * 2009-11-13 2011-06-02 Canon Inc Display controller and display control method
US20130014053A1 (en) * 2011-07-07 2013-01-10 Microsoft Corporation Menu Gestures
US8707211B2 (en) * 2011-10-21 2014-04-22 Hewlett-Packard Development Company, L.P. Radial graphical user interface
US9875023B2 (en) * 2011-11-23 2018-01-23 Microsoft Technology Licensing, Llc Dial-based user interfaces
CN104838352B (en) * 2012-12-07 2018-05-08 优特设备有限公司 Action initialization in multi-surface device
KR102107810B1 (en) * 2013-03-19 2020-05-28 삼성전자주식회사 Display apparatus and displaying method for information regarding activity using the same
KR101529886B1 (en) * 2013-11-06 2015-06-18 김소훈 3D gesture-based method provides a graphical user interface
US20160313910A1 (en) * 2013-11-28 2016-10-27 Samsung Electronics Co., Ltd. Method and device for organizing a plurality of items on an electronic device
KR20160038413A (en) * 2014-09-30 2016-04-07 삼성전자주식회사 Contents searching apparatus and method for searching contents
TWI530886B (en) * 2015-01-29 2016-04-21 茂丞科技股份有限公司 Electronic apparatus having fingerprint sensor operating in vector mode
KR20160128739A (en) * 2015-04-29 2016-11-08 삼성전자주식회사 Display apparatus and user interface providing method thereof
KR20170040706A (en) * 2015-10-05 2017-04-13 삼성전자주식회사 Device For Providing One-Handed Manipulation User Interface and Method Thereof
CN106648330B (en) * 2016-10-12 2019-12-17 广州视源电子科技股份有限公司 man-machine interaction method and device
EP3385831A1 (en) * 2017-04-04 2018-10-10 Lg Electronics Inc. Mobile terminal
CN108399042B (en) * 2018-01-31 2020-09-01 歌尔科技有限公司 Touch identification method, device and system

Also Published As

Publication number Publication date
US11513662B2 (en) 2022-11-29
WO2020209520A1 (en) 2020-10-15
EP3955100A1 (en) 2022-02-16
CN113678097A (en) 2021-11-19
JP2022528771A (en) 2022-06-15
KR102086578B1 (en) 2020-05-29
EP3955100A4 (en) 2023-01-25
US20220113848A1 (en) 2022-04-14

Similar Documents

Publication Publication Date Title
US20230043168A1 (en) Method for outputting command method
US9710630B2 (en) Electronic device and method of providing security using complex biometric information
EP3086200B1 (en) Electronic device including rotary member and display method thereof
US10503373B2 (en) Visual feedback for highlight-driven gesture user interfaces
US10317947B2 (en) Electronic device and method for processing gesture thereof
EP2972727B1 (en) Non-occluded display for hover interactions
US10013143B2 (en) Interfacing with a computing application using a multi-digit sensor
US11281370B2 (en) Electronic device and touch gesture control method thereof
CN113748407A (en) Electronic device and method for displaying split screen providing object
US9696815B2 (en) Method, device, system and non-transitory computer-readable recording medium for providing user interface
WO2014127697A1 (en) Method and terminal for triggering application programs and application program functions
US10204265B2 (en) System and method for authenticating user
US20170131785A1 (en) Method and apparatus for providing interface interacting with user by means of nui device
US9971490B2 (en) Device control
US9424416B1 (en) Accessing applications from secured states
US11635805B2 (en) Wearable device for using external entity as controller and method
US9589126B2 (en) Lock control method and electronic device thereof
EP3571577B1 (en) Method for displaying handler and electronic device therefor
CN112534390A (en) Electronic device for providing virtual input tool and method thereof
EP3210101B1 (en) Hit-test to determine enablement of direct manipulations in response to user actions
US11714538B2 (en) Electronic device, method, and computer-readable medium for switchable bar region of user interface
US20170090606A1 (en) Multi-finger touch
KR102568550B1 (en) Electronic device for executing application using handwirting input and method for controlling thereof
KR102205235B1 (en) Control method of favorites mode and device including touch screen performing the same
TWI544400B (en) Methods and systems for application activation, and related computer program prodcuts

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION