WO2017086578A1 - Touch input method through edge screen and electronic device - Google Patents

Touch input method through edge screen and electronic device

Info

Publication number
WO2017086578A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
touch
electronic device
application
combination
Prior art date
Application number
PCT/KR2016/009678
Other languages
English (en)
Korean (ko)
Inventor
지아오루원
리우밍후이
리슈앙
콴웨이싱
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201510788114.1A external-priority patent/CN106708399A/zh
Application filed by Samsung Electronics Co., Ltd.
Priority to US15/765,858 priority Critical patent/US11003328B2/en
Priority to EP16866531.3A priority patent/EP3343341B1/fr
Publication of WO2017086578A1 publication Critical patent/WO2017086578A1/fr

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • The present disclosure relates to a technology for controlling an electronic device, and more particularly, to a touch input method using an edge screen and an electronic device therefor.
  • an operation preset in the electronic device may be efficiently performed.
  • A first aspect of the present disclosure may provide a method of controlling an electronic device having edge screens, the method comprising: detecting a first touch on a first edge screen and a second touch on a second edge screen; determining a first position on the first edge screen corresponding to the first touch and a second position on the second edge screen corresponding to the second touch; detecting a first gesture using the first position as a starting point and a second gesture using the second position as a starting point; and performing a preset operation of the electronic device according to the combination of the first gesture and the second gesture.
  • FIG. 1 and 2 are diagrams illustrating an example of an electronic device 1000 according to an embodiment of the present disclosure.
  • FIG. 3 is a flowchart illustrating a method of controlling the electronic device 1000 having both side edge screens 140 and 160 according to an exemplary embodiment.
  • FIGS. 4A and 4B are diagrams illustrating an example of a relative relationship between a first position p1 and a second position p2 and a preset condition in a control method of the electronic device 1000 according to an embodiment of the present disclosure.
  • FIGS. 5A to 5D illustrate examples of a combination of a first gesture and a second gesture in a control method of the electronic apparatus 1000 according to an embodiment of the present disclosure.
  • FIG. 6 is a diagram illustrating an example of performing a preset operation corresponding to a combination of a first gesture and a second gesture in the control method of the electronic apparatus 1000 according to an embodiment of the present disclosure.
  • FIG. 7 is a flowchart illustrating a method of performing a preset operation according to a combination of a detected gesture by the electronic apparatus 1000 according to an embodiment of the present disclosure.
  • FIG. 8 is a diagram illustrating an example of performing a preset operation corresponding to a combination of a first gesture and a second gesture in the control method of the electronic apparatus 1000 according to an embodiment of the present disclosure.
  • FIG. 9 is a diagram illustrating an example of performing a preset operation corresponding to a combination of a first gesture and a second gesture in the control method of the electronic apparatus 1000 according to an embodiment of the present disclosure.
  • FIG. 10 is a diagram illustrating an example of performing a preset operation corresponding to a combination of a first gesture and a second gesture in the control method of the electronic apparatus 1000 according to an embodiment of the present disclosure.
  • FIG. 11 is a diagram illustrating an example of performing a preset operation corresponding to a combination of a first gesture and a second gesture in the control method of the electronic apparatus 1000 according to an embodiment of the present disclosure.
  • FIG. 12 is a diagram illustrating an example of performing a preset operation corresponding to a combination of a first gesture and a second gesture in the control method of the electronic apparatus 1000 according to an embodiment of the present disclosure.
  • FIG. 13 is a block diagram of an electronic device 1000 according to an exemplary embodiment.
  • FIG. 14 is a block diagram of an electronic device 1000 according to another exemplary embodiment.
  • A first aspect of the present disclosure may provide a method of controlling an electronic device having edge screens, the method comprising: detecting a first touch on a first edge screen and a second touch on a second edge screen; determining a first position on the first edge screen corresponding to the first touch and a second position on the second edge screen corresponding to the second touch; detecting a first gesture using the first position as a starting point and a second gesture using the second position as a starting point; and performing a preset operation of the electronic device according to the combination of the first gesture and the second gesture.
  • The determining of the first position on the first edge screen corresponding to the first touch and the second position on the second edge screen corresponding to the second touch may include determining the first position and the second position when the other touch is detected while one of the first touch and the second touch is being detected.
  • the method may further include determining whether a relative relationship between the first position and the second position satisfies a preset condition.
  • The preset condition may be that a distance between a first virtual line perpendicular to the first edge screen at the first position and a second virtual line perpendicular to the second edge screen at the second position is less than or equal to a preset threshold.
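As a minimal illustration of the preset condition above (function name, coordinate convention, and units are assumptions, not from the disclosure): with vertical edge screens on the left and right sides of the device, a virtual line drawn perpendicular to an edge screen at a touch position is horizontal, so the distance between the two virtual lines reduces to the difference of the touches' vertical coordinates.

```python
def is_same_level(y1: float, y2: float, threshold: float = 20.0) -> bool:
    """Return True if the two edge touches satisfy the preset condition:
    the distance between the perpendicular virtual lines drawn at the two
    touch positions does not exceed the preset threshold (pixels assumed)."""
    # With vertical edges, the perpendicular lines are horizontal, so the
    # line-to-line distance is simply the vertical offset between touches.
    return abs(y1 - y2) <= threshold
```

A distance exactly equal to the threshold still satisfies the condition, matching the "less than or equal to" wording.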
  • the first gesture may include at least one of clicking, sliding, and maintaining, and the second gesture may include at least one of clicking, sliding, and maintaining.
  • The form of the sliding may include at least one of a one-way straight line, a back-and-forth straight line, and a curve.
  • In the performing of the preset operation of the electronic device according to the combination of the first gesture and the second gesture, the preset operation may be performed when one of the first gesture and the second gesture is detected while the other gesture is being detected.
  • The performing of the preset operation of the electronic device according to the combination of the first gesture and the second gesture may include: determining an application running on the electronic device; determining a preset operation on the application corresponding to the combination of the first gesture and the second gesture; and performing the preset operation of the running application corresponding to the combination of the first gesture and the second gesture.
  • The method may further include selecting an item corresponding to the first position and the second position in an execution window of the application running on the electronic device, and the preset operation may be to move the display position of the selected item.
  • A second aspect of the present disclosure may provide an electronic device comprising: a touch screen including a main screen and a plurality of edge screens; a memory in which at least one program is stored; and a processor configured to cause the electronic device to perform a preset operation by executing the at least one program, wherein the at least one program includes instructions for: detecting a first touch on a first edge screen and a second touch on a second edge screen; determining a first position on the first edge screen corresponding to the first touch and a second position on the second edge screen corresponding to the second touch; detecting a first gesture using the first position as a starting point and a second gesture using the second position as a starting point; and performing a preset operation of the electronic device according to the combination of the first gesture and the second gesture.
  • The program may further include instructions for determining the first position and the second position when the other touch is detected while one of the first touch and the second touch is being detected.
  • the program may further include determining whether a relative relationship between the first position and the second position satisfies a preset condition.
  • The preset condition may be that a distance between a first virtual line perpendicular to the first edge screen at the first position and a second virtual line perpendicular to the second edge screen at the second position is less than or equal to a preset threshold.
  • the first gesture may include at least one of clicking, sliding, and maintaining, and the second gesture may include at least one of clicking, sliding, and maintaining.
  • The form of the sliding may include at least one of a one-way straight line, a back-and-forth straight line, and a curve.
  • The program may further include instructions for performing the preset operation when the other gesture is detected while one of the first gesture and the second gesture is being detected.
  • The program may further include instructions for determining an application running on the electronic device, determining a preset operation on the application corresponding to the combination of the first gesture and the second gesture, and performing the preset operation of the running application corresponding to the combination of the first gesture and the second gesture.
  • The program may further include instructions for selecting an item corresponding to the first position and the second position in an execution window of the application running on the electronic device, and the preset operation may be to move the display position of the selected item.
  • A third aspect of the present disclosure may provide a computer-readable recording medium having recorded thereon a program for executing the method of the first aspect on a computer.
  • FIG. 1 and 2 are diagrams illustrating an example of an electronic device 1000 according to an embodiment of the present disclosure.
  • the electronic device 1000 refers to an electronic device that provides specific information to a user.
  • the electronic device 1000 may be a device such as a smart phone, a laptop computer, a tablet PC, a game console, a personal digital assistant, a digital multimedia player (DMP), or the like.
  • the present invention is not limited thereto and may include various touch screen electronic devices.
  • The electronic device 1000 includes a touch screen 100 that provides specific information to a user, and the touch screen 100 includes a main screen 120 on a front surface thereof and a plurality of edge screens 140 and 160 provided at both sides of the main screen 120.
  • the main screen 120 and the plurality of edge screens 140 and 160 may be physically divided, but are not limited thereto.
  • In this case, the central portion of the physically single touch screen 100 may be designated as the main screen 120 and both side portions may be designated as the edge screens 140 and 160.
  • The edge screens 140 and 160 may be provided flat on the same plane as the main screen 120.
  • the edge screens 140 and 160 may be provided on a plane that is not the same as the main screen 120.
  • the edge screens 140 and 160 may be provided in a flat plane or curved surface.
  • the edge screens 140 and 160 may be provided in directions parallel to the edges of the electronic device 1000.
  • the edge screens 140 and 160 may be provided at both corners of the electronic device 1000, or the edge screens 140 and 160 may be provided at the upper, lower, left, and right sides of the electronic device 1000.
  • When the edge screens 140 and 160 are provided at both edges of the electronic device 1000, they may include a first edge screen 140 at one edge of the main screen 120 and a second edge screen 160 at the opposite edge.
  • One of the directions parallel to the edges of the main screen 120 along which the edge screens 140 and 160 are provided is defined as the first direction, and the direction opposite to the first direction is defined as the second direction. For example, the first direction may be defined as the upward direction and the second direction as the downward direction.
  • the touch screen 100 of the electronic device 1000 may display an execution window of an application executed in the electronic device 1000.
  • the execution window may mean a screen displayed on the touch screen 100 when an application is executed.
  • the execution window of the application may be displayed on the main screen 120 and the entire edge screens 140 and 160 and may be displayed on the main screen 120 or the edge screens 140 and 160.
  • the electronic apparatus 1000 may display specific information to the user or receive specific information from the user through the execution window of the application.
  • FIG. 3 is a flowchart illustrating a method of controlling the electronic device 1000 having both side edge screens 140 and 160 according to an exemplary embodiment.
  • the electronic apparatus 1000 may perform a preset operation according to touch inputs to the plurality of edge screens 140 and 160.
  • the electronic device 1000 may detect a plurality of touches on the plurality of edge screens 140 and 160.
  • The electronic device 1000 may detect the presence or absence of a touch by sensing pressure or an electrical signal.
  • the electronic apparatus 1000 may detect a touch using fingers of both hands of the user. For example, the electronic apparatus 1000 may detect a touch using the index finger of one hand and the index finger of the other hand of the user.
  • the electronic apparatus 1000 may detect a touch using two fingers of a user's hand.
  • the electronic apparatus 1000 may detect a touch using the thumb of one hand of the user and the index finger of the same hand.
  • the electronic device 1000 may also sense a touch using an input means such as a pen in addition to the user's finger.
  • the electronic apparatus 1000 may detect a touch on the first edge screen 140 as a first touch and a touch on the second edge screen 160 as a second touch, respectively.
  • the electronic device 1000 may perform various preset operations when the first touch and the second touch are simultaneously sensed.
  • Here, "when the first touch and the second touch are simultaneously detected" includes not only the case in which the start time of the first touch and the start time of the second touch perfectly match, but also the case in which the other touch is detected while the detection of one of the first touch and the second touch continues. In other words, it may mean that there is a time interval in which the first touch and the second touch are simultaneously detected by the electronic device 1000.
  • the electronic device 1000 may determine the first position and the second position when the other touch is detected while the touch of one of the first touch and the second touch is detected.
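The "simultaneous" criterion described above can be sketched as an interval-overlap test (function and parameter names are illustrative, not from the disclosure): the two touches count as simultaneous if there is any time span during which both are held, regardless of which one started first.

```python
def touches_overlap(start1: float, end1: float,
                    start2: float, end2: float) -> bool:
    """True if the two touch intervals [start, end] share any time span,
    i.e. one touch begins while the other is still being detected."""
    return start1 <= end2 and start2 <= end1
```

This accepts both a perfect match of start times and the case where the second touch begins mid-way through the first.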
  • the electronic apparatus 1000 may determine a location where a user's touch is detected on the touch screen 100.
  • The electronic device 1000 may determine the position of the first touch detected on the first edge screen 140 as a first position p1 and the position of the second touch detected on the second edge screen 160 as a second position p2.
  • FIGS. 4A and 4B are diagrams illustrating an example of a relative relationship between a first position p1 and a second position p2 and a preset condition in a control method of the electronic device 1000 according to an embodiment of the present disclosure.
  • The control method may further include, between steps S320 and S330, determining whether a relative relationship between the first position p1 and the second position p2 satisfies a preset condition.
  • If the preset condition is satisfied, the electronic device 1000 may determine that the first position p1 and the second position p2 are located at the same level; if the preset condition is not satisfied, it may determine that the first position p1 and the second position p2 are located at different levels.
  • Satisfying the preset condition means that, when a first virtual line L1 perpendicular to the first edge screen 140 is drawn from the first position p1 and a second virtual line L2 perpendicular to the second edge screen 160 is drawn from the second position p2, the vertical distance ΔL between the first virtual line L1 and the second virtual line L2 is less than or equal to a preset threshold ΔLc. In this case, the relative relationship between the first position p1 and the second position p2 can be said to be the same level.
  • This means that the first touch on the first edge screen 140 and the second touch on the second edge screen 160 by the user are made at similar distances from the same edge of the electronic device 1000.
  • If the vertical distance ΔL exceeds the preset threshold ΔLc, the relative relationship between the first position p1 and the second position p2 can be said to be a different level. This means that the first touch on the first edge screen 140 and the second touch on the second edge screen 160 by the user are made at different distances from the same edge of the electronic device 1000.
  • a selection area may be determined on the touch screen 100 of the electronic device 1000 according to the first position p1 and the second position p2.
  • the selection area may mean an area between the first virtual line L1 and the second virtual line L2.
  • the selection area is related to whether an item is selected and the specification of a part to be enlarged, which will be described later.
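Assuming horizontal virtual lines as in the same-level check above, the selection area can be read as the full-width band between them (coordinate convention and pixel units are assumptions for illustration):

```python
def selection_area(y1: float, y2: float, screen_width: int):
    """Return (left, top, right, bottom) of the band between the two
    horizontal virtual lines L1 and L2; the band spans the full width
    of the touch screen 100."""
    top, bottom = min(y1, y2), max(y1, y2)
    return (0, top, screen_width, bottom)
```

The order of the two touches does not matter: the band always runs from the upper virtual line to the lower one.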
  • the electronic apparatus 1000 may detect a gesture of a user whose starting point is a location where a user's touch is detected on the touch screen 100.
  • the gesture refers to the movement of the user sensed on the touch screen 100, and the gesture may include sliding, clicking, and maintaining.
  • Sliding means moving, while maintaining the touch on the touch screen 100, from the position touched by the user as the starting point to a position different from the starting point. Examples of its form may include a one-way straight line, a back-and-forth straight line, and a curve.
  • Clicking means releasing the touch on the touch screen 100 at the position touched by the user as the starting point and then touching the starting point again. Examples may include single clicking, double clicking, and triple clicking.
  • Maintaining means maintaining a touch on a starting point over a preset time value based on a position touched by a user.
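The three gesture types above might be distinguished from a touch's start point, end point, and duration as follows (the thresholds and function name are illustrative assumptions; the disclosure does not specify values):

```python
def classify_gesture(start, end, duration_s,
                     hold_threshold_s=0.8, move_threshold_px=10.0):
    """Classify a single touch trace as 'sliding', 'clicking', or
    'maintaining' based on how far and how long the finger moved."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if (dx * dx + dy * dy) ** 0.5 > move_threshold_px:
        return "sliding"       # finger moved away from the starting point
    if duration_s >= hold_threshold_s:
        return "maintaining"   # touch held in place past the preset time value
    return "clicking"          # brief touch released near the starting point
```

A real implementation would also track intermediate samples to tell a one-way straight slide from a curved one; this sketch only separates the three top-level types.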
  • the electronic apparatus 1000 may detect a gesture using fingers of both hands of the user. For example, the electronic apparatus 1000 may detect a gesture using the index finger of one hand and the index finger of the other hand of the user.
  • the electronic apparatus 1000 may detect a gesture using two fingers of one hand of the user.
  • the electronic apparatus 1000 may detect a gesture using a thumb of one hand of the user and an index finger of the same hand.
  • the electronic apparatus 1000 may also sense a gesture using an input means such as a pen in addition to the user's finger.
  • the electronic apparatus 1000 may detect a first gesture having a first position p1 as a starting point and a second gesture having a second position p2 as a starting point.
  • For example, the first gesture may be any one of sliding, clicking, and maintaining, and the second gesture may be any one of sliding, clicking, and maintaining, but the gestures are not limited thereto.
  • the electronic apparatus 1000 may detect a combination of the first gesture and the second gesture.
  • the electronic apparatus 1000 may detect a first gesture of sliding and a second gesture of clicking.
  • the control method of the electronic apparatus 1000 may allow the first gesture and the second gesture to be detected only when the first touch and the second touch are simultaneously sensed.
  • the electronic apparatus 1000 may perform a plurality of preset operations to be described later.
  • Here, "when the first gesture and the second gesture are simultaneously detected" includes not only the case in which the start time of the first gesture and the start time of the second gesture perfectly match, but also the case in which the other gesture is detected while one of the first gesture and the second gesture is being detected. In other words, it means that there is a time interval in which the first gesture and the second gesture are simultaneously detected by the electronic device 1000.
  • In this case, step S330 may be performed. Otherwise, the combination of the first gesture and the second gesture may not be detected, but embodiments are not limited thereto.
  • the electronic apparatus 1000 may perform a preset operation according to the combination of the first gesture and the second gesture.
  • control method according to an embodiment of the present disclosure may be set to respond only to the combination of the first gesture and the second gesture instead of the single first gesture or the single second gesture.
  • the preset operation may be determined according to a corresponding relationship between a plurality of combinations of the first gesture and the second gesture and a plurality of operations of the electronic apparatus 1000.
  • The correspondence relationship between the plurality of combinations of the first gesture and the second gesture and the plurality of operations of the electronic device 1000 may be set in advance by associating each of the plurality of combinations with a respective one of the plurality of operations of the electronic device 1000.
  • a corresponding relationship between a plurality of combinations of a first gesture and a second gesture and a plurality of operations may be preset and stored in the electronic apparatus 1000.
  • In step S340, the combination of the first gesture and the second gesture is first determined, and then the preset operation corresponding to the combination of the first gesture and the second gesture can be retrieved from the correspondence relationship between combinations and operations.
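The correspondence relationship consulted in step S340 can be modeled as a lookup table keyed by the gesture pair; the table entries below are hypothetical placeholders, not mappings stated in the disclosure:

```python
# Hypothetical correspondence relationship between gesture combinations
# and preset operations of the device.
COMBINATION_ACTIONS = {
    ("slide_inward", "slide_inward"): "turn_off_screen",
    ("click", "click"): "enlarge_content",
    ("slide_up", "slide_down"): "enlarge_image",
    ("hold", "slide_up"): "seek_video",
}

def preset_operation(first_gesture: str, second_gesture: str):
    """Return the preset operation for a detected combination, or None
    when the combination is not registered (a single gesture on its own
    never matches a combination key)."""
    return COMBINATION_ACTIONS.get((first_gesture, second_gesture))
```

Returning None for unregistered pairs mirrors the document's point that the device responds only to combinations, not to a single first or second gesture.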
  • FIGS. 5A to 5D illustrate examples of a combination of a first gesture and a second gesture in a control method of the electronic apparatus 1000 according to an embodiment of the present disclosure.
  • The combination of the first gesture and the second gesture may include at least one of the following combinations: i) a combination of the first gesture sliding along the first direction of the first edge screen 140 with the first position p1 as a starting point and the second gesture sliding along the first direction of the second edge screen 160 with the second position p2 as a starting point; ii) a combination of the first gesture sliding toward the inside of the touch screen 100 with the first position p1 as a starting point and the second gesture sliding toward the inside of the touch screen 100 with the second position p2 as a starting point; iii) a combination of the first gesture clicked at the first position p1 and the second gesture clicked at the second position p2; iv) a combination of the first gesture sliding along the first direction of the first edge screen 140 with the first position p1 as a starting point and the second gesture sliding along the second direction of the second edge screen 160 with the second position p2 as a starting point.
  • FIG. 6 is a diagram illustrating an example of performing a preset operation corresponding to a combination of a first gesture and a second gesture in the control method of the electronic apparatus 1000 according to an embodiment of the present disclosure.
  • For example, when the combination of the first gesture sliding toward the inside of the touch screen 100 with the first position p1 as a starting point and the second gesture sliding toward the inside of the touch screen 100 with the second position p2 as a starting point is detected, the touch screen 100 of the electronic device 1000 may be turned off.
  • the touch screen 100 may be deactivated. In this case, it may not be necessary for the electronic apparatus 1000 to determine an application running in the foreground.
  • The user does not need to press the power button to turn off the touch screen 100 of the electronic device 1000; based on the above-described combination of gestures, the touch screen 100 of the electronic device 1000 can be turned off faster. Through this, the user can achieve a power-saving effect and can effectively prevent people around the user from viewing the content displayed on the electronic device 1000.
  • FIG. 7 is a flowchart illustrating a method of performing a preset operation according to a combination of a detected gesture by the electronic apparatus 1000 according to an embodiment of the present disclosure.
  • the electronic apparatus 1000 may perform a preset operation regarding an application that is running. Even in this case, the electronic apparatus 1000 may perform a preset operation regarding the application according to the combination of the first gesture and the second gesture.
  • the electronic apparatus 1000 performs a predetermined operation on an application being executed according to a combination of a first gesture and a second gesture.
  • the electronic apparatus 1000 may determine an application running in the electronic apparatus 1000.
  • the electronic apparatus 1000 may determine an application on which the execution window is displayed on the touch screen 100 among applications currently running.
  • the electronic apparatus 1000 may determine a preset operation regarding an application corresponding to the combination of the first gesture and the second gesture.
  • the preset operation of the application corresponding to the combination of the first gesture and the second gesture may be set differently for each application in the electronic apparatus 1000.
  • the same combination of the first gesture and the second gesture may correspond to different operations with respect to different applications in the electronic apparatus 1000.
  • Correspondence relationships between the plurality of combinations of the first gesture and the second gesture and the plurality of operations may be preset in the electronic apparatus 1000 according to each application.
  • the electronic apparatus 1000 may determine a preset operation of the running application corresponding to the combination of the first gesture and the second gesture.
  • the electronic apparatus 1000 may perform a preset operation of the running application corresponding to the combination of the first gesture and the second gesture.
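Per-application dispatch, where the same gesture combination maps to different operations depending on the foreground application, might look like the nested table below (application and operation names are invented for illustration):

```python
# Hypothetical per-application correspondence relationships: the same
# combination key can map to different operations in different apps.
PER_APP_ACTIONS = {
    "web_browser": {("click", "click"): "enlarge_text"},
    "image_viewer": {("slide_up", "slide_down"): "enlarge_image"},
    "video_player": {("hold", "slide_up"): "move_playback_point"},
}

def app_operation(running_app: str, combo):
    """Look up the preset operation for the running application and the
    detected gesture combination; None if nothing is registered."""
    return PER_APP_ACTIONS.get(running_app, {}).get(combo)
```

The outer key corresponds to the step of first determining which application has its execution window displayed, and the inner key to the detected combination.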
  • FIG 8 is a diagram illustrating an example of performing a preset operation corresponding to a combination of a first gesture and a second gesture in the control method of the electronic apparatus 1000 according to an embodiment of the present disclosure.
  • A preset operation corresponding to the combination of the first gesture and the second gesture may be to enlarge the display size of the content displayed on the touch screen 100 of the electronic device 1000 by an application running on the electronic device 1000.
  • When the electronic device 1000 detects a combination of the first gesture clicked at the first position p1 and the second gesture clicked at the second position p2, the display size of the content displayed by the application on the touch screen 100 of the electronic device 1000 can be enlarged.
  • the preset operation may be to enlarge the content displayed in the execution window by the application, not to enlarge the execution window of the application.
  • an application running in the electronic apparatus 1000 may be a web browser application.
  • In a touch operation, the electronic device 1000 may simultaneously receive, as the combination of the first gesture and the second gesture, the click input at the first position p1 and the click input at the second position p2.
  • the preset operation corresponding to the combination of the first gesture and the second gesture may be to enlarge the display size of the content displayed in the execution window of the web browser application.
  • The electronic device 1000 may enlarge the size of the text displayed in the selection area determined according to the first position p1 and the second position p2, which are at the same level, among the text displayed on the web page.
  • FIG. 9 is a diagram illustrating an example of performing a preset operation corresponding to a combination of a first gesture and a second gesture in the control method of the electronic apparatus 1000 according to an embodiment of the present disclosure.
  • The electronic device 1000 can detect the combination of the first gesture sliding along the first direction of the first edge screen 140 with the first position p1 as a starting point and the second gesture sliding along the second direction of the second edge screen 160 with the second position p2 as a starting point.
  • the electronic apparatus 1000 may determine an application displaying an image on the touch screen 100 as an application that is running.
  • the preset operation corresponding to the combination of the first gesture and the second gesture may be to enlarge the display size of the image displayed on the touch screen 100 of the electronic device 1000 by an application running on the electronic device 1000.
  • the electronic apparatus 1000 may enlarge the image displayed on the touch screen 100 from the selection area according to the first position p1 and the second position p2 to the selection area according to the touch endpoint of the first gesture on the first edge screen 140 and the touch endpoint of the second gesture on the second edge screen 160.
  • the selection area may mean an area within the first virtual line L1 and the second virtual line L2 on the touch screen 100.
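A minimal sketch of the enlargement implied above: the band between the two starting positions is stretched to the band between the two touch endpoints. The function name and the formula are assumptions for illustration; the disclosure does not fix an exact scaling rule:

```python
def zoom_scale(p1_y, p2_y, end1_y, end2_y):
    """Scale factor for enlarging the image: the selection band defined
    by the two starting positions (p1, p2) is stretched to the band
    defined by the two touch endpoints on the edge screens."""
    start_band = abs(p2_y - p1_y)    # band between the two starting positions
    end_band = abs(end2_y - end1_y)  # band between the two touch endpoints
    if start_band == 0:
        raise ValueError("starting positions must be at different heights")
    return end_band / start_band
```

For example, a selection band from 200 to 300 stretched to endpoints at 150 and 350 yields a scale factor of 2.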
  • FIG. 10 is a diagram illustrating an example of performing a preset operation corresponding to a combination of a first gesture and a second gesture in the control method of the electronic apparatus 1000 according to an embodiment of the present disclosure.
  • the electronic apparatus 1000 may detect a combination of a first gesture maintained at the first position p1 and a second gesture sliding along the first direction of the second edge screen 160 from the second position p2.
  • the electronic apparatus 1000 may determine an application that plays a video on the touch screen 100 as an application that is running.
  • at the bottom of the video played by the application on the touch screen 100, a bar representing the entire playback time section of the video may be displayed.
  • an object indicating the current playback time may be displayed on the bar.
  • the preset operation corresponding to the combination of the first gesture and the second gesture may be to move the playback time point of a video played on the touch screen 100 of the electronic device 1000 by an application running on the electronic device 1000.
  • the electronic apparatus 1000 may move the current playback time button of the played video in the first direction on the touch screen 100 such that the ratio of the movement distance of the second gesture to the total length of the second edge screen 160 is equal to the ratio of the movement distance of the playback time button to the total length of the bottom bar.
  • the starting point of the moved playback time point may be the current playback time point of the video.
  • if the sliding direction of the second gesture is the first direction, the current playback time point may be moved forward; if the sliding direction of the second gesture is the second direction, the current playback time point may be moved backward.
  • the electronic device 1000 may detect a combination of a first gesture sliding along the first direction of the first edge screen 140 from the first position p1 and a second gesture maintained at the second position p2.
  • the electronic apparatus 1000 may determine an application that plays a video on the touch screen 100 as an application that is running.
  • at the top of the video played by the application on the touch screen 100, a bar indicating the available playback volume range of the video may be displayed.
  • an object representing the current playback volume may be displayed on the bar.
  • the preset operation corresponding to the combination of the first gesture and the second gesture may be to change the playback volume of a video played on the touch screen 100 of the electronic device 1000 by an application running on the electronic device 1000.
  • the electronic apparatus 1000 may move the current playback volume button of the currently played video such that the ratio of the movement distance of the first gesture to the total length of the first edge screen 140 is equal to the ratio of the movement distance of the playback volume button to the total length of the top bar.
  • the starting point of the changed volume may be the current playback volume of the video.
  • if the sliding direction of the first gesture is the first direction, the playback volume may be increased; if the sliding direction of the first gesture is the second direction, the playback volume may be decreased.
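Both the playback-time operation and the playback-volume operation above apply the same ratio rule: the gesture's travel relative to the edge-screen length maps to the same fraction of the bar. A minimal sketch (the function name, clamping behavior, and units are assumptions for illustration):

```python
def ratio_move(current, total, slide_dist, edge_len, forward=True):
    """Move a bar position (playback time or volume) so that the ratio
    slide_dist / edge_len equals the button's movement over the bar's
    total length, clamped to the valid range [0, total]."""
    delta = total * (slide_dist / edge_len)
    pos = current + delta if forward else current - delta
    return min(max(pos, 0.0), total)
```

For example, sliding 80 px along a 400 px edge screen moves the playback point of a 600 s video forward by 120 s; the same helper with a 0–100 volume range raises the volume by the corresponding fraction.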
  • FIG. 11 is a diagram illustrating an example of performing a preset operation corresponding to a combination of a first gesture and a second gesture in the control method of the electronic apparatus 1000 according to an embodiment of the present disclosure.
  • a control method of the electronic apparatus 1000 may perform a preset operation on an item displayed in the execution window of an application when the first position p1 and the second position p2 are at the same level.
  • an application running in the foreground of the electronic device 1000 may be a message service application, and the displayed item may be an item in a list of messages.
  • a predetermined operation regarding the item selected in the execution window of the application corresponding to the first position p1 and the second position p2 may be performed; the item selection is described in detail below.
  • performing a preset operation according to a combination of the first gesture and the second gesture may include selecting an item corresponding to both the first position p1 and the second position p2 in the execution window of the application.
  • the selected item may mean an item corresponding to the first position p1 and the second position p2 in the execution window of the application being executed.
  • the selected item may mean an item selected on the touch screen 100 according to the first position p1 and the second position p2 by the user's touch.
  • the selected item may be a corresponding item when the first position p1 and the second position p2 are within an area displayed by the item on the touch screen 100.
  • the selected item may be the item whose display area overlaps most widely with the selection area between the first virtual line L1 and the second virtual line L2 among the items on the touch screen 100.
  • the step of selecting an item corresponding to both the first position p1 and the second position p2 in the execution window of the application may be located between steps S710 and S720.
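The "widest overlapping display area" selection rule above can be sketched as follows. This is an illustrative assumption of one possible implementation; the item names are taken from the examples in this description and the rectangle representation is assumed:

```python
def select_item(items, p1_y, p2_y):
    """Pick the list item whose display area overlaps the selection band
    (between the virtual lines through p1 and p2) most widely.
    items: list of (name, top, bottom) rectangles in screen coordinates."""
    band_top, band_bot = min(p1_y, p2_y), max(p1_y, p2_y)

    def overlap(item):
        _, top, bot = item
        return max(0, min(bot, band_bot) - max(top, band_top))

    best = max(items, key=overlap)
    return best[0] if overlap(best) > 0 else None
```

With message items "kimbb" occupying rows 100–200 and "Alice" rows 200–300, a selection band from 120 to 260 overlaps "kimbb" by 80 px and "Alice" by 60 px, so "kimbb" is selected.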
  • the preset operation corresponding to the combination of the first gesture and the second gesture may be an operation performed on the item selected in step S730.
  • the preset operation corresponding to the combination of the first gesture and the second gesture may be to change the display position of the item selected in the execution window of the application.
  • the combination of the first gesture and the second gesture may be a first gesture sliding along the first direction of the first edge screen 140 from the first position p1 and a second gesture sliding along the first direction of the second edge screen 160 from the second position p2; in this case, the selected item may be moved along the first direction to change the display position of the selected item in the execution window of the application.
  • an application running in the electronic apparatus 1000 may be assumed to be a message service application.
  • the message service application may include a plurality of message items.
  • the plurality of message items may be displayed in a list on the execution window of the message service application.
  • the selected item corresponding to both the first position p1 and the second position p2 may be a message item in contact with “kimbb”.
  • when the electronic device 1000 simultaneously receives a combination of a first gesture sliding along the first direction of the first edge screen 140 from the first position p1 and a second gesture sliding along the first direction of the second edge screen 160 from the second position p2, an operation of moving the selected item along the first direction may be performed.
  • the item may be moved to the end of the touch screen 100 in the first direction and displayed.
  • the item may be displayed on the top of the touch screen 100.
  • the operation of the item according to the combination of the first gesture and the second gesture may be to change the display position of the item to a specific position.
  • the specific position may be a position displayed by an item other than the item corresponding to both the first position p1 and the second position p2.
  • the specific position may be viewed as the position of another item corresponding to both the third position at which the first gesture ends and the fourth position at which the second gesture ends.
  • the third position may correspond to the touch endpoint of the first gesture on the first edge screen 140, and the fourth position may correspond to the touch endpoint of the second gesture on the second edge screen 160.
  • a third virtual line horizontal to the third position and a fourth virtual line horizontal to the fourth position may be defined, and the electronic apparatus 1000 may move the selected item to the specific position only when the distance between the third virtual line and the fourth virtual line is less than or equal to a preset threshold value.
  • the item corresponding to both the third position and the fourth position may mean the corresponding item when both the third position and the fourth position exist within the area displayed by the item on the touch screen 100.
  • the item corresponding to both the third position and the fourth position may mean the item whose display area overlaps most widely with the area between the third virtual line at the third position and the fourth virtual line at the fourth position.
  • the specific position may be the position to which the selected item is to be moved, and the "position to which the item is to be moved" may be a position corresponding to the third position and the fourth position.
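The move-to-specific-position logic above — locate the item at the gesture endpoints, but only if the third and fourth virtual lines are within the preset threshold — can be sketched as below. The threshold value and midpoint rule are assumptions; the disclosure fixes neither:

```python
def move_target(item_bounds, end1_y, end2_y, threshold=50):
    """Return the index of the item at the gesture endpoints (the
    'specific position'), or None if the third and fourth virtual
    lines are farther apart than the preset threshold.
    item_bounds: list of (top, bottom) rectangles in list order."""
    if abs(end1_y - end2_y) > threshold:
        return None  # virtual lines too far apart: no move is performed
    mid = (end1_y + end2_y) / 2
    for i, (top, bot) in enumerate(item_bounds):
        if top <= mid < bot:
            return i
    return None
```

Endpoints at 120 and 140 land in the second item's row, so the selected item would be moved to that item's position; endpoints 160 px apart exceed the threshold and nothing moves.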
  • FIG. 12 is a diagram illustrating an example of performing a preset operation corresponding to a combination of a first gesture and a second gesture in the control method of the electronic apparatus 1000 according to an embodiment of the present disclosure.
  • the combination of the first gesture and the second gesture may be a first gesture sliding from the first position p1 toward the inside of the touch screen 100 and a second gesture sliding from the second position p2 toward the inside of the touch screen 100; in this case, the selected item in the execution window may be hidden.
  • an application running in the electronic apparatus 1000 may be assumed to be a message service application.
  • the message service application may include a plurality of message items, and the plurality of message items may be displayed in a list on an execution window of the message service application.
  • An item corresponding to both the first location p1 and the second location p2 may be a message item in contact with “Alice”.
  • the selected item may be an item partially overlapping both the first position p1 and the second position p2.
  • when the electronic device 1000 receives a combination of a first gesture sliding from the first position p1 toward the inside of the touch screen 100 and a second gesture sliding from the second position p2 toward the inside of the touch screen 100, the electronic apparatus 1000 may perform the preset operation, corresponding to the combination of the first gesture and the second gesture, of hiding the selected item in the execution window of the application.
  • hidden message items may be viewed in a message service application using an existing operation.
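The inward-slide detection underlying the hide operation can be sketched as follows. The coordinate convention (first edge on the left, x increasing rightward) and the function name are assumptions for illustration:

```python
def is_inward_hide(g1, g2, screen_width):
    """True when both gestures slide from their edge toward the inside
    of the touch screen: g1 starts on the first (left) edge moving right,
    g2 starts on the second (right) edge moving left.
    Each gesture is an (x_start, x_end) pair."""
    (x1_start, x1_end), (x2_start, x2_end) = g1, g2
    left_inward = x1_start < screen_width / 2 and x1_end > x1_start
    right_inward = x2_start > screen_width / 2 and x2_end < x2_start
    return left_inward and right_inward
```

On a 400 px wide screen, a left gesture from 0 to 60 and a right gesture from 400 to 340 both slide inward, so the selected item would be hidden; a right gesture sliding outward would not trigger the operation.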
  • the touch method receives a touch operation using both edge screens 140 and 160 of the electronic device 1000, and in order to simplify the processing of the touch operation in the electronic device 1000 while supporting a plurality of operations, combinations of gestures each starting from a touch point may be defined.
  • the touch operation of the electronic device 1000 having both side edge screens 140 and 160 may be implemented in computer code on a computer-readable recording medium.
  • the computer code can be implemented by one of ordinary skill in the art according to the description of the method mentioned above.
  • the method of the present disclosure may be performed when the computer code is executed on a computer.
  • FIG. 13 is a block diagram of an electronic device 1000 according to an exemplary embodiment.
  • an electronic device 1000 may include a touch screen 100, a processor 200, and a memory 300.
  • the touch screen 100 may include a main screen 120 flat on the front surface, and edge screens 140 and 160 provided at both sides of the main screen 120.
  • the touch screen 100 may include a first edge screen 140 at one edge of the main screen 120 and a second edge screen 160 at the opposite edge of the main screen 120.
  • the touch screen 100 may display an execution window of an application executed in the electronic device 1000.
  • the touch screen 100 may display an execution window on both the main screen 120 and the edge screens 140 and 160, or may display an execution window on only the main screen 120 or the edge screens 140 and 160.
  • the touch screen 100 may display at least one item on the execution window.
  • the touch screen 100 may be implemented in various forms of display, such as a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, an active-matrix organic light-emitting diode (AM-OLED) display, or a plasma display panel (PDP).
  • the touch screen 100 may be implemented to be flexible, transparent, or wearable.
  • the memory 300 may store at least one program executable by the processor 200.
  • the processor 200 may perform a predetermined operation on the electronic device by executing at least one program, and may be configured of at least one processor.
  • the processor 200 may detect a user's touch and gesture.
  • the touch screen 100 may include a touch sensor for detecting a user's touch and gesture.
  • the processor 200 may detect the pressure or the electrical signal generated when the user physically contacts the touch screen 100.
  • the processor 200 may detect a touch and a gesture using the fingers of both hands of the user. For example, the processor 200 may detect a touch and a gesture using the index finger of one hand and the index finger of the other hand of the user.
  • the processor 200 may detect a touch and a gesture using two fingers of one hand of the user.
  • the processor 200 may detect a touch and a gesture using the thumb of one hand of the user and the index finger of the same hand.
  • the processor 200 may detect a touch and a gesture using an input means such as a pen in addition to the user's finger.
  • the processor 200 may detect a first touch on the first edge screen 140, and sense a second touch on the second edge screen 160.
  • the processor 200 may determine a case where the touch screen 100 simultaneously detects the first touch and the second touch.
  • the processor 200 may determine that "the touch screen 100 simultaneously detects the first touch and the second touch" includes the case where the start time of the first touch and the start time of the second touch detected by the touch screen 100 perfectly match.
  • the processor 200 may determine that it also includes the case where, while one of the first touch and the second touch detected by the touch screen 100 is being detected, the other touch is detected.
  • the processor 200 may also determine that it includes the case where there is a time interval in which the touch screen 100 detects the first touch and the second touch together.
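The three "simultaneous" cases above — identical start times, one touch beginning while the other is held, and a shared detection interval — can be sketched as one predicate over detection intervals. The interval representation is an assumption for illustration:

```python
def detected_simultaneously(t1, t2):
    """Return True if two touches count as simultaneous under any of the
    three cases described: exact start-time match, one touch starting
    while the other is still detected, or overlapping detection intervals.
    Each touch is a (start, end) pair of timestamps."""
    (s1, e1), (s2, e2) = t1, t2
    exact_match = s1 == s2                           # start times perfectly match
    during_other = s1 <= s2 <= e1 or s2 <= s1 <= e2  # one begins while the other is held
    shared_interval = max(s1, s2) <= min(e1, e2)     # a time interval detecting both
    return exact_match or during_other or shared_interval
```

Touches over [0, 0.5] and [0.2, 0.9] seconds overlap and count as simultaneous; touches over [0, 0.1] and [0.3, 0.4] do not.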
  • the processor 200 may determine a location of the touch as a location where the user's touch is detected on the touch screen 100.
  • the processor 200 may determine the positions of the first touch and the second touch, respectively; the processor 200 may determine the position where the first touch is detected on the first edge screen 140 as the first position p1, and the position where the second touch is detected on the second edge screen 160 as the second position p2.
  • the processor 200 may determine whether a relative relationship between the first position p1 and the second position p2 satisfies a preset condition.
  • when the relative relationship between the first position p1 and the second position p2 satisfies the preset condition, the processor 200 may determine that the first position p1 and the second position p2 are at the same level, and when the relative relationship does not satisfy the preset condition, the processor 200 may determine that the first position p1 and the second position p2 are at different levels.
  • the processor 200 may perform the preset operation according to the combination of the first gesture and the second gesture detected by the touch screen 100.
  • the processor 200 may perform no operation regardless of the combination of the first gesture and the second gesture detected by the touch screen 100.
  • the processor 200 may determine the selection area on the touch screen 100 according to the first position p1 and the second position p2.
  • the processor 200 may detect a gesture of a user whose starting point is a location where a touch is detected.
  • the processor 200 may detect a first gesture having a first position p1 as a starting point and a second gesture having a second position p2 as a starting point.
  • the first gesture using the first position p1 as the starting point or the second gesture using the second position p2 as the starting point may be any one of sliding, clicking, and maintaining.
  • the processor 200 may detect a combination of the first gesture and the second gesture.
  • the processor 200 may determine a case where the touch screen 100 simultaneously detects the first gesture and the second gesture.
  • the processor 200 may determine that "the touch screen 100 simultaneously detects the first gesture and the second gesture" includes the case where the start time of the first gesture and the start time of the second gesture detected by the touch screen 100 perfectly match. In addition, the processor 200 may determine that it includes the case where, while one of the first gesture and the second gesture detected by the touch screen 100 is being detected, the other gesture is detected.
  • the processor 200 may also determine that "the touch screen 100 simultaneously detects the first gesture and the second gesture" includes the case where there is a time interval in which the first gesture and the second gesture detected by the touch screen 100 are detected together.
  • the processor 200 may perform various preset operations according to various combinations of the first gesture and the second gesture, and a specific embodiment will be described below.
  • the processor 200 may perform an operation of turning off the touch screen 100 of the electronic device 1000 according to a combination of a first gesture sliding toward the inside of the touch screen 100 from the first position p1 and a second gesture sliding toward the inside of the touch screen 100 from the second position p2.
  • the processor 200 may determine an application running in the foreground.
  • the processor 200 may determine a preset operation regarding an application corresponding to the combination of gestures.
  • the processor 200 may perform various operations on an application that is running according to a combination of gestures, which will be described below in detail.
  • the processor 200 may perform an operation of enlarging the display size of the text displayed in the application of the electronic device 1000 according to a combination of a first gesture clicking the first position p1 and a second gesture clicking the second position p2.
  • the processor 200 may enlarge the size of the text in the selection area of the web page according to the first position p1 and the second position p2, which are at the same level.
  • the processor 200 may perform an operation of enlarging the display size of the image displayed in the application of the electronic device 1000 according to a combination of a first gesture sliding along the first direction of the first edge screen 140 from the first position p1 and a second gesture sliding along the first direction of the second edge screen 160 from the second position p2.
  • the processor 200 may determine the size of the selection area of the image according to the first position p1 and the second position p2, and may enlarge the image up to the selection area according to the first touch endpoint of the first gesture on the first edge screen 140 and the second touch endpoint of the second gesture on the second edge screen 160.
  • the processor 200 may perform an operation of moving the playback time point of a video executed in an application of the electronic apparatus 1000 according to a combination of a first gesture maintained at the first position p1 and a second gesture sliding in the first direction of the second edge screen 160 from the second position p2.
  • the processor 200 may perform an operation of changing the playback volume of a video executed in an application of the electronic apparatus 1000 according to a combination of a first gesture sliding in the first direction of the first edge screen 140 from the first position p1 and a second gesture maintained at the second position p2.
  • the processor 200 may select an item displayed on the execution window of the application.
  • the processor 200 may perform various operations on the selected item according to the combination of gestures, which will be described below in detail.
  • the processor 200 may perform an operation of changing the display position of the selected item in the execution window of the application of the electronic device 1000 according to a combination of a first gesture sliding in the first direction of the first edge screen 140 from the first position p1 and a second gesture sliding in the first direction of the second edge screen 160 from the second position p2.
  • the processor 200 may hide the selected item in the execution window of the electronic apparatus 1000 according to a combination of a first gesture sliding toward the inside of the touch screen 100 from the first position p1 and a second gesture sliding toward the inside of the touch screen 100 from the second position p2.
  • FIG. 14 is a block diagram of an electronic device 1000 according to another exemplary embodiment.
  • an electronic device 1000 may include a user input unit 1100, an output unit 1200, a processor 1300, a sensing unit 1400, a communication unit 1500, an A/V input unit 1600, and a memory 1700.
  • the user input unit 1100 refers to a means by which a user inputs data for controlling the electronic apparatus 1000.
  • the user input unit 1100 may include a key pad, a dome switch, a touch pad (contact capacitive type, pressure resistive type, infrared sensing type, surface ultrasonic conduction type, integral tension measurement type, piezo effect type, etc.), a jog wheel, a jog switch, and the like, but is not limited thereto.
  • the function of the user input unit 1100 and the function of the display unit 1210 to be described below may be implemented together in the form of the touch screen 100.
  • the output unit 1200 may output an audio signal, a video signal, or a vibration signal, and may include a display unit 1210, a sound output unit 1220, and a vibration motor 1230.
  • the display unit 1210 displays and outputs information processed by the electronic apparatus 1000. Meanwhile, when the display unit 1210 and the touch pad form a layer structure and are configured as a touch screen, the display unit 1210 may be used as an input device in addition to the output device.
  • the display unit 1210 may include at least one of a liquid crystal display, a thin-film-transistor liquid crystal display, an organic light-emitting diode display, a flexible display, a three-dimensional (3D) display, and an electrophoretic display.
  • the electronic apparatus 1000 may include two or more display units 1210 according to the implementation form of the electronic apparatus 1000. In this case, the two or more display units 1210 may be disposed to face each other using a hinge.
  • the sound output unit 1220 outputs audio data received from the communication unit 1500 or stored in the memory 1700.
  • the vibration motor 1230 may output a vibration signal.
  • the vibration motor 1230 may output a vibration signal corresponding to the output of audio data or video data (eg, a call signal reception sound, a message reception sound, etc.).
  • the vibration motor 1230 may output a vibration signal when a touch is input to the touch screen.
  • the processor 1300 typically controls the overall operation of the electronic apparatus 1000.
  • the processor 1300 of FIG. 14 may be an embodiment of the processor 200 of FIG. 13.
  • the processor 1300 may generally control the user input unit 1100, the output unit 1200, the sensing unit 1400, the communication unit 1500, and the A/V input unit 1600 by executing programs stored in the memory 1700.
  • the processor 1300 may perform a function of the electronic apparatus 1000 described with reference to FIGS. 1 to 14 by executing programs stored in the memory 1700.
  • the sensing unit 1400 may detect a state of the electronic device 1000 or a state around the electronic device 1000 and transmit the detected information to the processor 1300.
  • the sensing unit 1400 may include a geomagnetic sensor 1410, an acceleration sensor 1420, a temperature/humidity sensor 1430, an infrared sensor 1440, a gyroscope sensor 1450, a position sensor (e.g., GPS) 1460, a barometric pressure sensor 1470, a proximity sensor 1480, and an RGB sensor (illuminance sensor) 1490, but is not limited thereto.
  • the communicator 1500 may include one or more components that allow communication between the electronic device 1000 and an external device (not shown).
  • the communicator 1500 may include a short range communicator 1510, a mobile communicator 1520, and a broadcast receiver 1530.
  • the A / V input unit 1600 is for inputting an audio signal or a video signal, and may include a camera 1610 and a microphone 1620.
  • the memory 1700 may store a program for processing and controlling the processor 1300, and may store data input to or output from the electronic device 1000.
  • the memory 1700 of FIG. 14 may be an embodiment of the memory 300 of FIG. 13.
  • the memory 1700 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), a magnetic memory, a magnetic disk, and an optical disk.
  • Programs stored in the memory 1700 may be classified into a plurality of modules according to their functions.
  • the programs stored in the memory 1700 may be classified into a UI module 1710, a touch screen module 1720, a notification module 1730, and the like.
  • the UI module 1710 may provide a specialized UI, GUI, and the like that interoperate with the electronic device 1000 for each application.
  • the touch screen module 1720 may detect a touch gesture of the user on the touch screen and transmit information about the touch gesture to the processor 1300.
  • the touch screen module 1720 according to some embodiments may recognize and analyze a touch code.
  • the touch screen module 1720 may be configured as separate hardware including a controller.
  • Various sensors may be provided inside or near the touch screen to detect a touch or a proximity touch of the touch screen.
  • An example of a sensor for sensing a touch of a touch screen is a tactile sensor.
  • the tactile sensor refers to a sensor that senses the contact of a specific object to a degree that a person can feel, or more sensitively.
  • the tactile sensor may sense various information such as the roughness of the contact surface, the rigidity of the contact object, the temperature of the contact point, and the like.
  • an example of a sensor for sensing a touch of a touch screen is a proximity sensor.
  • the proximity sensor refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface, or an object present in the vicinity, using electromagnetic force or infrared rays without mechanical contact.
  • the user's touch gesture may include tap, touch and hold, double tap, drag, pan, flick, drag and drop, and swipe.
  • the notification module 1730 may generate a signal for notifying occurrence of an event of the electronic device 1000.
  • Computer readable media can be any available media that can be accessed by a computer and includes both volatile and nonvolatile media, removable and non-removable media.
  • Computer readable media may include both computer storage media and communication media.
  • Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
  • Communication media typically include computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transmission mechanism, and include any information delivery media.
  • the term "unit" may be a hardware component such as a processor or a circuit, and/or a software component executed by a hardware component such as a processor.


Abstract

Provided are a touch input method through an edge screen and an electronic device. The touch method includes the steps of: detecting a first touch on a first edge screen and a second touch on a second edge screen; determining a first position on the first edge screen corresponding to the first touch and a second position on the second edge screen corresponding to the second touch; detecting a first gesture having the first position as a starting point and a second gesture having the second position as a starting point; and performing a preset operation of an electronic device according to the combination of the first gesture and the second gesture.
PCT/KR2016/009678 2015-11-17 2016-08-31 Touch input method through edge screen, and electronic device WO2017086578A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/765,858 US11003328B2 (en) 2015-11-17 2016-08-31 Touch input method through edge screen, and electronic device
EP16866531.3A EP3343341B1 (fr) 2015-11-17 2016-08-31 Touch input method through edge screen, and electronic device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201510788114.1A CN106708399A (zh) 2015-11-17 2015-11-17 Touch control method and device for an electronic terminal with double-sided curved screens
CN201510788114.1 2015-11-17
KR10-2016-0106979 2016-08-23
KR1020160106979A KR102582541B1 (ko) 2015-11-17 2016-08-23 Touch input method through edge screen, and electronic device

Publications (1)

Publication Number Publication Date
WO2017086578A1 2017-05-26

Family

ID=58717557

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/009678 WO2017086578A1 (fr) 2015-11-17 2016-08-31 Touch input method through edge screen, and electronic device

Country Status (1)

Country Link
WO (1) WO2017086578A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130076649A1 (en) * 2011-09-27 2013-03-28 Scott A. Myers Electronic Devices With Sidewall Displays
KR20150007925A (ko) * 2013-07-11 2015-01-21 Samsung Electronics Co., Ltd. User terminal device providing user interaction and method therefor
KR101495967B1 (ko) * 2009-10-15 2015-02-25 Qualcomm Incorporated Method, system, and computer program product for combining gesture input from multiple touch screens into one gesture input
US20150062053A1 (en) * 2008-12-29 2015-03-05 Hewlett-Packard Development Company, L.P. Gesture detection zones
KR20150072940A (ko) * 2013-12-20 2015-06-30 LG Electronics Inc. Mobile terminal and control method thereof


Similar Documents

Publication Publication Date Title
WO2017065494A1 Portable device and method for displaying screen of portable device
WO2017095040A1 User terminal device and display method thereof
WO2016129784A1 Image display apparatus and method
WO2016111555A2 Foldable user terminal device and display method thereof
WO2018030594A1 Mobile terminal and control method thereof
WO2015053445A1 Foldable mobile device and method of controlling the same
WO2016104922A1 Wearable electronic device
WO2014157885A1 Method and device for providing a menu interface
WO2014171705A1 Method for adjusting display area and electronic device therefor
WO2015037932A1 Display apparatus and method for performing a function of the display apparatus
WO2017105018A1 Electronic apparatus and method of displaying a notification for the electronic apparatus
WO2015009110A1 Portable terminal provided with a display and method for operating the same
WO2015016628A1 Method and apparatus for displaying applications
WO2015083975A1 Method for displaying pointing information and device for implementing the method
WO2015199280A1 Mobile terminal and control method thereof
WO2015088166A1 Mobile terminal and method for controlling a rear-surface input unit of the terminal
WO2015030564A1 Display apparatus, portable device and screen display methods thereof
EP3243125A2 Foldable user terminal device and display method thereof
WO2017086559A1 Image display device and operating method thereof
WO2016089074A1 Device and method for receiving character input through the same
WO2014027818A2 Electronic device for displaying a touch region to be presented, and method thereof
WO2014007425A1 Display device comprising a touchscreen and control method thereof
WO2015012629A1 Input processing method and corresponding electronic device
WO2016137105A1 Device and method for running multiple operating systems
WO2018056642A2 Electronic device and application management method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16866531

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2016866531

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE