WO2017188840A1 - Mobile device with user interface - Google Patents

Mobile device with user interface

Info

Publication number
WO2017188840A1
WO2017188840A1 (PCT/RU2016/000255)
Authority
WO
WIPO (PCT)
Prior art keywords
screen
computer
display devices
user interactions
implemented method
Prior art date
Application number
PCT/RU2016/000255
Other languages
French (fr)
Inventor
Arseny Andreevich NIKOLAEV
Aleksey Vyacheslavovich SAZONOV
Nikita Viktorovich GLAZKOV
Mikhail Vladimirovich MALAKHOV
Dmitry Evgenevich CHALYKH
Original Assignee
Yota Devices Ipr Limited
Priority date
Filing date
Publication date
Application filed by Yota Devices Ipr Limited filed Critical Yota Devices Ipr Limited
Priority to PCT/RU2016/000255 priority Critical patent/WO2017188840A1/en
Priority to CN201680087276.5A priority patent/CN109643178A/en
Publication of WO2017188840A1 publication Critical patent/WO2017188840A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1626Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]



Abstract

A computer-implemented method, comprising: detecting two simultaneous and counter-directed sliding user interactions along two display surfaces of first and second touch-screen display devices placed on the opposite sides of a mobile device, axially directed along each proximate side of both display devices; determining whether the detected two simultaneous sliding user interactions represent movement of displayed content; and automatically moving displayed content on both display devices in the same direction as the sliding user interactions. Another method, for a mobile device furnished with two or more screens, comprises: assignment of an area in each screen for objects interscreen transfer, each interscreen transfer area on each screen corresponding to one different screen; logically connecting each interscreen transfer area with the corresponding screen; on capturing an object in the screen and moving it to the interscreen transfer area till coincidence or overlapping, the object is transferred to the corresponding different screen, where it appears.

Description

MOBILE DEVICE WITH USER INTERFACE
DESCRIPTION
The proposed utility relates generally to methods of interaction with a computer via a display device, and particularly to methods of interaction with mobile multiscreen devices.
STATE OF THE ART
A plurality of user interfaces for interaction with a mobile device are known, including touch-screen, two-screen and multiscreen interfaces.
Patent US #8553001 discloses full-text methods and apparatus for determining local coordinate frames for a human hand.
Patent US #8593421 discloses full-text local coordinate frame user interface for multitouch-enabled devices.
Patent US #8751955 discloses full-text scrollbar user interface for multitouch devices.
Patent US #8994736 discloses full-text method and apparatus for freeform deformation of 3-D models.
Patent US #9176649 discloses full-text method and apparatus of remote management of computer system using voice and gesture based input.
The main common disadvantage of such methods is the limited ability of interaction with a mobile device.
A plurality of such methods are known, and the above-mentioned are adduced only to briefly outline the field of knowledge.
A US patent application #20100048252 discloses a method of controlling the operation of a mobile terminal equipped with first and second display modules provided on first and second surfaces. The method includes displaying an image on the first display module; displaying an operation control menu on the second display module; and controlling the image according to a menu item selected from the operation control menu. Therefore, it becomes possible to increase the spatial efficiency of the first display module by displaying a screen image and an operation control menu for controlling the screen image separately on the first and second display modules.
This method is taken as the closest prior art.
The technical result attained by the proposed method is enhanced convenience, mainly through simplified gestures that are easy to use and natural for a human being, an expanded range of devices of this type, and increased consumer qualities.
Known devices do not allow this result to be reached fully or even in part.
DESCRIPTION OF INVENTION
BRIEF SUMMARY
A computer-implemented method disclosed herein according to the first embodiment comprises the following steps.
Detecting two simultaneous and counter-directed sliding user interactions along two display surfaces of first and second touch-screen display devices placed on the opposite sides of a mobile device, axially directed along each proximate side of both display devices.
Concurrently determining whether the detected two simultaneous sliding user interactions represent movement of displayed content in similar axial directions.
Automatically moving displayed content on both display devices in the same direction as said sliding user interactions.
A computer-implemented method disclosed herein according to the second embodiment, applying to a mobile device furnished with two or more display screens, comprises the following steps.
Assignment of an area in each screen for content or objects interscreen transfer (relocation), so that each said area on each screen corresponds to one different screen.
Logically connecting each said area with the corresponding screen.
An object in the screen, once captured and moved to the assigned area until coincidence or overlap, disappears from the current screen and appears in the corresponding different screen, to which it is transferred.
DESCRIPTION OF DRAWINGS
Fig. 1 shows the mobile device with first (face) and second (reverse) screens.
Fig. 2 shows the start position of a simultaneous two-finger movement and the movement trajectories thereof (upward on the face screen).
Fig. 3 shows the finish position of a simultaneous two-finger movement and the movement trajectories thereof (upward on the face screen).
Fig. 4 shows the start position of a simultaneous two-finger movement and the movement trajectories thereof (from right to left on the face screen).
Fig. 5 shows the finish position of a simultaneous two-finger movement and the movement trajectories thereof (from right to left on the face screen).
Designations used in figures.
1 - display device screen 1.
2 - display device screen 2.
3 - mobile device.
4, 5 - front and back finger movement trajectories.
6 - assigned area for screen objects interscreen transfer.
7 - content movement trajectory.
DETAILED DESCRIPTION
Proposed herein according to the first version is a computer-implemented method.
The method comprises the following.
Detecting two simultaneous and counter-directed sliding user interactions along two display surfaces of first and second touch-screen display devices placed on the opposite sides of a mobile device, axially directed along each proximate side of both display devices. These counter-directed sliding user interactions are usually made by two or more fingers as recited in Fig. 2, 3, 4, 5.
Concurrently determining whether the detected two simultaneous sliding user interactions represent movement of displayed content in similar axial directions.
If so, automatically moving displayed content on both display devices (screens) in the same direction as said sliding user interactions.
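The determination step above can be sketched in code. This is an illustrative sketch only, not the patented implementation: the `Slide` type, the axis-mirroring convention, and the tolerance parameter are all assumptions. The key idea is that the two screens face opposite directions, so slides that are counter-directed in their local coordinate frames point the same way in device space once one horizontal axis is mirrored.

```python
from dataclasses import dataclass

@dataclass
class Slide:
    """A sliding touch interaction on one screen, as a 2D displacement
    vector in that screen's local coordinates."""
    dx: float
    dy: float

def slides_move_content(front: Slide, back: Slide, tol: float = 0.2) -> bool:
    """Return True when two simultaneous slides on opposite screens
    represent movement of displayed content in similar axial directions.

    The back screen faces away from the user, so its horizontal axis is
    mirrored into the front screen's frame before comparing directions
    (a finger moving left in the back screen's local frame moves in the
    same world direction as a finger moving right on the front screen).
    """
    # Mirror the back screen's horizontal axis into the front frame.
    bx, by = -back.dx, back.dy
    # Qualify the gesture when the frame-aligned vectors are nearly
    # parallel: normalized dot product close to 1.
    dot = front.dx * bx + front.dy * by
    n1 = (front.dx ** 2 + front.dy ** 2) ** 0.5
    n2 = (bx ** 2 + by ** 2) ** 0.5
    if n1 == 0.0 or n2 == 0.0:
        return False
    return dot / (n1 * n2) > 1.0 - tol
```

With this convention, two fingers both sliding upward (Fig. 2, 3) agree vertically, and a front finger sliding left paired with a back finger sliding right in its own frame (Fig. 4, 5) agree horizontally.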
The method may further comprise logical connection of proximate sides of both display devices (screens) so that the portion of moving content disappearing on one edge of the first display device's screen immediately automatically appears on the proximate side of the second display device's screen.
The sliding user interactions may be performed along the side edge of the display device (Fig. 2 or Fig. 3).
The sliding user interactions may be performed along the top or bottom edge of the display device (Fig. 4 or Fig. 5).
The sliding user interactions may be performed along the diagonal direction of the display device. In this case the direction of the content movement is also diagonal.
At least one display device may comprise an integrated multitouch function. The method may further comprise logical connection of proximate sides of both display devices so that the proximate edges of said display devices become logically stitched and both display surfaces logically comprise a closed-loop band.
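The stitching of both screens into a closed-loop band admits a simple coordinate model: treat the two display surfaces as one circular strip whose length is the sum of both screen heights, and reduce every content offset modulo that length. The following is a hypothetical one-dimensional sketch; the function names and the reduction to a single scroll axis are assumptions, not part of the disclosure.

```python
def wrap_scroll(offset: float, delta: float, band_length: float) -> float:
    """Advance a scroll offset on the closed-loop band formed by logically
    stitching the proximate edges of both screens; the offset wraps around
    the band instead of stopping at an edge."""
    return (offset + delta) % band_length

def locate_on_band(pos: float, h1: float, h2: float) -> "tuple[int, float]":
    """Map a band position to (screen index, local coordinate): positions
    in [0, h1) fall on screen 1, positions in [h1, h1 + h2) on screen 2."""
    pos %= (h1 + h2)
    return (0, pos) if pos < h1 else (1, pos - h1)
```

Under this model, content whose band position crosses h1 disappears past the edge of screen 1 and immediately appears on the proximate edge of screen 2, matching the behaviour described above.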
Proposed herein according to the second version is a computer-implemented method applying to a mobile device furnished with two or more display devices (screens). The disclosed method comprises the following steps.
Assignment of an area 6 in each screen for content or objects interscreen transfer (relocation), so that each said area 6 on each screen corresponds to one different screen.
Logically connecting each said area with the corresponding screen.
Capturing (by finger gesture) an object in any one screen and moving it to the interscreen transfer area 6 (placed on this screen) until the coincidence or overlapping thereof (object and area 6).
After that the transferred object disappears from the current screen and is transferred to the different screen (corresponding with the interscreen transfer area), where it appears.
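The transfer step reduces to an overlap test between the dragged object and the assigned area 6, followed by moving the object between per-screen object lists. Below is a minimal sketch under assumed types (a `Rect` bounding box and plain Python lists standing in for each screen's contents); an actual device would use its windowing system's hit testing instead.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned bounding box in screen coordinates."""
    x: float
    y: float
    w: float
    h: float

    def overlaps(self, other: "Rect") -> bool:
        # Standard axis-aligned rectangle intersection test.
        return (self.x < other.x + other.w and other.x < self.x + self.w
                and self.y < other.y + other.h and other.y < self.y + self.h)

def maybe_transfer(obj: Rect, area: Rect, screens: "list[list[Rect]]",
                   current: int, target: int) -> bool:
    """If the dragged object overlaps the interscreen transfer area, remove
    it from the current screen's object list and append it to the logically
    connected target screen's list. Returns True when a transfer happened."""
    if obj.overlaps(area):
        screens[current].remove(obj)
        screens[target].append(obj)
        return True
    return False
```

Called on each drag update, this makes the object disappear from the current screen and appear on the screen logically connected to area 6 as soon as coincidence or overlap is detected.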
HOW IT WORKS.
Proposed herein according to the first version is a computer-implemented method.
The method comprises the following.
The user makes two simultaneous and counter-directed sliding movements with two or more fingers along two display surfaces of first and second touch-screen display devices placed on the opposite sides of a mobile device, axially directed along each proximate side of both display devices, as recited in figures 2, 3, 4, 5. The computer device determines whether the detected two simultaneous sliding user interactions represent movement of displayed content in similar axial directions.
If so, both screens' displayed content automatically starts moving on both display devices (screens) in the same direction as the sliding user fingers.
Proposed herein according to the second version is a computer-implemented method applying to a mobile device furnished with two or more display devices (screens). The disclosed method comprises the following steps.
An area 6 is assigned in each screen for content or objects interscreen transfer (relocation), so that each assigned area 6 on each screen corresponds to one different screen.
Each assigned area is logically connected to the corresponding different screen.
The user captures an object, with a mouse device or with a finger in the case of a touch screen, in any one screen and moves it to the interscreen transfer area 6 (placed on this screen) until the coincidence or overlapping thereof (captured object and area 6).
After that the transferred object disappears from the current screen and is transferred to the different screen (corresponding with the interscreen transfer area 6), where it appears.

Claims

1. A computer-implemented method, comprising:
• detecting two simultaneous and counter directed sliding user interactions along two display surfaces of first and second touch screen display devices placed on the opposite sides of a mobile device, axially directed along each proximate side of both display devices;
• concurrently determining: whether the detected two simultaneous sliding user interactions represent movement of displayed content in similar axial directions;
• automatically moving displayed content on both display devices in the same direction as said sliding user interactions.
2. A computer-implemented method as recited in claim 1, further comprising logically connecting proximate sides of both display devices so that the portion of moving content disappearing on one edge of the first display device's screen automatically appears on the proximate side of the second display device's screen.
3. A computer-implemented method as recited in claim 1, wherein said sliding user interactions are performed along the side edge of the display device.
4. A computer-implemented method as recited in claim 1, wherein said sliding user interactions are performed along the top or bottom edge of the display device.
5. A computer-implemented method as recited in claim 1, wherein said sliding user interactions are performed along the diagonal direction of the display device.
6. A computer-implemented method as recited in claim 1, wherein at least one display device comprises integrated multitouch function.
7. A computer-implemented method as recited in claim 1, further comprising logical connection of proximate sides of both display devices so that the proximate edges of said display devices become logically stitched and both display surfaces logically comprise a closed-loop band.
8. A computer-implemented method, applying to a mobile device furnished with two or more display devices, said method comprising:
• assignment of an area in each screen for content or objects interscreen transfer (relocation), each said area on each screen corresponding to one different screen;
• logically connecting each said area with the corresponding screen;
• on capturing an object in the screen and moving it to said area till coincidence or overlapping, said object disappears from the current screen and is transferred to the corresponding different screen, where it appears.
PCT/RU2016/000255 2016-04-28 2016-04-28 Mobile device with user interface WO2017188840A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/RU2016/000255 WO2017188840A1 (en) 2016-04-28 2016-04-28 Mobile device with user interface
CN201680087276.5A CN109643178A (en) 2016-04-28 2016-04-28 Mobile device with user interface

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/RU2016/000255 WO2017188840A1 (en) 2016-04-28 2016-04-28 Mobile device with user interface

Publications (1)

Publication Number Publication Date
WO2017188840A1 true WO2017188840A1 (en) 2017-11-02

Family

ID=60160978

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/RU2016/000255 WO2017188840A1 (en) 2016-04-28 2016-04-28 Mobile device with user interface

Country Status (2)

Country Link
CN (1) CN109643178A (en)
WO (1) WO2017188840A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110134311A (en) * 2019-04-08 2019-08-16 努比亚技术有限公司 A kind of screen display method, wearable device and computer readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100020034A1 (en) * 2008-07-25 2010-01-28 Do-Hyoung Mobile device having backpanel touchpad
US20110021251A1 (en) * 2009-07-22 2011-01-27 Sony Ericsson Mobile Communications Ab Electronic device with touch-sensitive control
US20130265284A1 (en) * 2012-04-07 2013-10-10 Samsung Electronics Co., Ltd. Object control method performed in device including transparent display, the device, and computer readable recording medium thereof

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011248784A (en) * 2010-05-28 2011-12-08 Toshiba Corp Electronic apparatus and display control method
KR101802522B1 (en) * 2011-02-10 2017-11-29 삼성전자주식회사 Apparatus having a plurality of touch screens and screen changing method thereof
CN103324435B (en) * 2013-05-24 2017-02-08 华为技术有限公司 Multi-screen display method and device and electronic device thereof
KR102034584B1 (en) * 2013-06-20 2019-10-21 엘지전자 주식회사 Portable device and controlling method thereof


Also Published As

Publication number Publication date
CN109643178A (en) 2019-04-16


Legal Events

NENP: Non-entry into the national phase (Ref country code: DE)
121: Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 16900639; Country of ref document: EP; Kind code of ref document: A1)
122: Ep: PCT application non-entry in European phase (Ref document number: 16900639; Country of ref document: EP; Kind code of ref document: A1)