WO2014178800A1 - User interface for multi-user collaboration - Google Patents

User interface for multi-user collaboration

Info

Publication number
WO2014178800A1
WO2014178800A1
Authority
WO
WIPO (PCT)
Prior art keywords
tools
workspace
same time
unbounded
user tools
Prior art date
Application number
PCT/TH2013/000023
Other languages
English (en)
Inventor
Pawin Suthapong
Original Assignee
Spoton Cognitive Co., Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Spoton Cognitive Co., Ltd filed Critical Spoton Cognitive Co., Ltd
Priority to PCT/TH2013/000023 priority Critical patent/WO2014178800A1/fr
Publication of WO2014178800A1 publication Critical patent/WO2014178800A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • Aspects of the invention generally relate to software designed for touchscreen devices or computers, with functions for multiple users working together at the same time.
  • Touchscreen software and devices have been developed throughout the evolution of computer science to enhance usability and make computers more versatile. Touchscreen software and devices change how users interact with computers: much software becomes easier to use, and touchscreens open up many capabilities that an ordinary keyboard and mouse cannot provide.
  • Examples of touchscreen devices include the iPhone and tablet computers.
  • Touchscreen software has developed rapidly. Touchscreens used to work only with a stylus pen, and only one point could be touched at a time.
  • Modern touchscreen software is very fluid, responds well to the movement of the fingers, and allows many points to be touched at the same time.
  • Numerous touchscreen devices and software packages allow more than one user to use the device at the same time.
  • One example is the iPad with the Tap Tap Revolution game.
  • However, these devices and software are usually not designed for multiple users to use the device at the same time, and many of the devices are too small.
  • The Sandbox addresses these issues: it is software built on a large touchscreen interface, designed to let many users use the touchscreen at once with ease.
  • The embedded software is essential because it is designed for many users to use at the same time. The tools provided suit a variety of environments and many groups of people. All of the tools can be customized, and many users can use different tools, or even the same tool, all at once.
  • One aspect of the invention is to optimize the multiuser touchscreen experience.
  • The software allows more than one person to work together on the touchscreen device at the same time.
  • Each user has their own workspace and is able to interact with other users' work; all of the work done in a workspace constitutes a project.
  • Another aspect of the invention is the specialized user tools: every user can simply draw a geometric shape to pull out the specialized user tools at any time, and each user can use any tool individually at the same time.
  • The tools are very flexible, and changes or adjustments to anything created with the tools are always available.
  • Yet another aspect of the invention is the unbounded and indefinite workspace.
  • The zooming function enables the workspace to expand without bound or limit.
  • The panning function enables each user to move which part of the workspace is displayed on the screen.
  • FIG. 1 is a structure of the architecture of the touchscreen technology system.
  • FIG. 2 is a scenario of multiple users working together.
  • FIG. 3 is a structure of the gesture analysis.
  • FIG. 4 is a scenario of many individualized user tools on a workspace.
  • FIG. 5 is a scenario of using the unbounded workspace.
  • FIG. 6 is a scenario of creating and changing objects.
  • This invention is created to help multiple users collaborate better when working together on computers. For multiple users to work together well on a touchscreen device, not only must the hardware (the computer with a touchscreen) be robust, but the multiuser software must also be well designed, so that hardware and software work cooperatively.
  • Sandbox is a combination of a robust multiuser touchscreen computer and software created to help multiple users collaborate better. The functions and tools of Sandbox are designed to help multiple users share and work together on the same project while each having an individual workspace.
  • The Gesture Analysis enables multiple users to use different tools at the same time. For example, one user could draw a line while another user creates a shape.
  • The specialized user tools are pulled out by drawing a geometric shape. The tools provide many features and can be pulled out as many times as needed by each user.
  • The specialized user tools are movable.
  • The unbounded and indefinite workspace provides an unlimited area for maximum creativity. The area can be moved around with the panning tool, and the zooming function can zoom in and out of an area.
  • FIG. 1 illustrates a diagram of the architecture of the touchscreen technology system.
  • The sensor accurately detects the location of each point touched on the touchscreen.
  • The ID helps the computer recognize each touch individually.
  • The driver calibrates each point to estimate its location accurately and sends that point to the Sandbox computer.
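The sensor, ID, and driver stages described above can be sketched in code. This is a minimal illustration, not the patent's actual implementation; all names (`TouchPoint`, `Driver`, the affine calibration) are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class TouchPoint:
    touch_id: int   # the ID that lets the computer track each touch individually
    x: float        # calibrated screen coordinates
    y: float

class Driver:
    """Calibrates raw sensor points and forwards them to the application."""

    def __init__(self, offset=(0.0, 0.0), scale=1.0):
        self.offset = offset
        self.scale = scale
        self._next_id = 0

    def calibrate(self, raw_x, raw_y):
        # A simple affine correction stands in for the driver's estimation step.
        return (raw_x * self.scale + self.offset[0],
                raw_y * self.scale + self.offset[1])

    def on_sensor_event(self, raw_x, raw_y):
        x, y = self.calibrate(raw_x, raw_y)
        point = TouchPoint(self._next_id, x, y)
        self._next_id += 1
        return point  # in a real system this point is sent on to the Sandbox software

driver = Driver(offset=(2.0, -1.0), scale=0.5)
p = driver.on_sensor_event(100.0, 40.0)
print(p)  # TouchPoint(touch_id=0, x=52.0, y=19.0)
```

The key design point the architecture relies on is that the ID is assigned before gesture analysis, so every later stage can keep simultaneous touches separate.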
  • FIG. 2 illustrates a Sandbox being used by multiple users at the same time.
  • FIG. 3 illustrates a diagram of how each individual touch point is analyzed by the gesture analysis. Each touch point is then interpreted into a different command.
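One way such a gesture analysis can work is to group points by touch ID and classify each finished stroke independently, so one user can draw a line while another creates a shape. The classifier below is an illustrative assumption (the patent does not disclose its algorithm), using the heuristic that a stroke ending near its start is a closed shape.

```python
import math
from collections import defaultdict

def classify_stroke(points):
    """Classify a finished stroke as a 'line', a closed 'shape', or a 'tap'."""
    if len(points) < 2:
        return "tap"
    (x0, y0), (xn, yn) = points[0], points[-1]
    closure = math.hypot(xn - x0, yn - y0)
    length = sum(math.hypot(bx - ax, by - ay)
                 for (ax, ay), (bx, by) in zip(points, points[1:]))
    # A stroke whose end returns near its start is treated as a closed shape.
    return "shape" if closure < 0.2 * length else "line"

class GestureAnalysis:
    """Groups touch points by touch ID so each user's gesture is independent."""

    def __init__(self):
        self.strokes = defaultdict(list)   # touch_id -> list of (x, y) points

    def touch_move(self, touch_id, x, y):
        self.strokes[touch_id].append((x, y))

    def touch_up(self, touch_id):
        # Each finished stroke is interpreted into a command on its own.
        return classify_stroke(self.strokes.pop(touch_id))

ga = GestureAnalysis()
for x in range(5):                                         # user A draws a line
    ga.touch_move(1, float(x), 0.0)
for x, y in [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0.05)]:   # user B draws a square
    ga.touch_move(2, float(x), float(y))
cmd_a, cmd_b = ga.touch_up(1), ga.touch_up(2)
print(cmd_a, cmd_b)  # line shape
```

Because strokes are keyed by touch ID, the two interleaved gestures never interfere with each other.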
  • FIG. 4 explains the functions of the tools used on the workspace.
  • The individualized user tools are called out by drawing a geometric shape anywhere on the workspace. Each user can freely select and use any tool from their individualized user tools without interfering with another user's individualized user tools.
  • Each attribute of a tool in the individualized user tools can be adjusted as desired. For example, the pen tool can be used with different color attributes by different users at the same time.
  • A tool from one set of individualized user tools can also be used on a creation made with another person's individualized user tools.
  • For example, the eraser tool of one set of individualized user tools can be used to erase lines drawn with another.
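A hypothetical sketch of "draw a shape to pull out your tools": when a stroke closes on itself, a movable tool palette is created at the stroke's centroid and bound to that user, with per-palette attributes such as pen color. All names and the crude closed-shape test are assumptions for illustration only.

```python
class ToolPalette:
    def __init__(self, owner, x, y):
        self.owner = owner
        self.x, self.y = x, y
        self.attributes = {"pen_color": "black"}  # adjustable per user

    def move(self, dx, dy):
        # Palettes are movable anywhere on the workspace.
        self.x += dx
        self.y += dy

def maybe_spawn_palette(owner, stroke, palettes):
    """Spawn a palette if the stroke is (approximately) a closed geometric shape."""
    (x0, y0), (xn, yn) = stroke[0], stroke[-1]
    if abs(xn - x0) + abs(yn - y0) > 0.5:      # crude closed-shape test
        return None
    cx = sum(p[0] for p in stroke) / len(stroke)
    cy = sum(p[1] for p in stroke) / len(stroke)
    palette = ToolPalette(owner, cx, cy)
    palettes.append(palette)                   # any number of palettes may coexist
    return palette

palettes = []
square = [(0, 0), (2, 0), (2, 2), (0, 2), (0, 0)]
pal_a = maybe_spawn_palette("user_a", square, palettes)
pal_a.attributes["pen_color"] = "red"          # user A's pen is red...
pal_b = maybe_spawn_palette("user_b", square, palettes)
print(len(palettes), pal_a.attributes["pen_color"], pal_b.attributes["pen_color"])
```

Since each palette carries its own attribute dictionary, one user's red pen never disturbs another user's settings, matching the non-interference described above.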
  • FIG. 5 illustrates scenarios of the workspace software in use.
  • One feature of the software is showing many workspaces at the same time. Another feature is the use of a workspace by multiple users at the same time.
  • The workspace is unbounded and indefinite and extends beyond the screen.
  • The zooming tool and the panning tool can be used to choose which part of the unbounded and indefinite workspace is shown on the screen.
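An unbounded workspace on a finite screen amounts to treating the screen as a movable, zoomable window onto infinite workspace coordinates. The viewport transform below is a standard pan/zoom mapping, assumed here for illustration rather than taken from the patent.

```python
class Viewport:
    def __init__(self):
        self.pan_x = 0.0   # workspace coordinate shown at the screen origin
        self.pan_y = 0.0
        self.zoom = 1.0    # screen pixels per workspace unit

    def pan(self, dx, dy):
        # Panning moves which part of the unbounded workspace is displayed.
        self.pan_x += dx / self.zoom
        self.pan_y += dy / self.zoom

    def zoom_at(self, factor):
        # Zooming out shows ever more of the workspace; there is no edge to hit.
        self.zoom *= factor

    def to_screen(self, wx, wy):
        # Map an (unbounded) workspace point to screen coordinates.
        return ((wx - self.pan_x) * self.zoom, (wy - self.pan_y) * self.zoom)

view = Viewport()
view.pan(100, 0)        # slide the window 100 px to the right
view.zoom_at(2.0)       # zoom in on that region
print(view.to_screen(150.0, 10.0))  # (100.0, 20.0)
```

Because the workspace only ever exists as coordinates, nothing bounds it; the screen simply samples whatever region the pan offset and zoom factor select.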
  • FIG. 6 illustrates the usage of the object functions on the workspace.
  • The first function is opening different types of files at the same time and selecting part or all of a file to be used as an object on the workspace an unlimited number of times.
  • Objects can be created with each user's individualized user tools, and the attributes of each object, including its shape, size, and color, can be changed at any time.
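The requirement that shape, size, and color remain editable after creation suggests objects that keep their attributes as mutable state. A minimal sketch, with all names assumed:

```python
class WorkspaceObject:
    """An object placed on the workspace; its attributes stay editable forever."""

    def __init__(self, shape, size, color):
        self.shape, self.size, self.color = shape, size, color

    def set_attribute(self, name, value):
        # Any attribute of an existing object can be changed at any time.
        setattr(self, name, value)

obj = WorkspaceObject("rectangle", (10, 5), "blue")
obj.set_attribute("color", "green")
obj.set_attribute("size", (20, 10))
print(obj.shape, obj.size, obj.color)  # rectangle (20, 10) green
```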

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to software embedded in a computer with a touchscreen for multiple users, consisting of tools, functions, and programs for optimizing collaboration between people. The hardware and software are designed to work together cooperatively and fluidly. The touchscreen technology, in addition to having touchscreen hardware that can detect multiple touch points, must include software enabling collaboration among multiple users.
PCT/TH2013/000023 2013-05-02 2013-05-02 Interface utilisateur pour collaboration multi-utilisateurs WO2014178800A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/TH2013/000023 WO2014178800A1 (fr) 2013-05-02 2013-05-02 Interface utilisateur pour collaboration multi-utilisateurs

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/TH2013/000023 WO2014178800A1 (fr) 2013-05-02 2013-05-02 Interface utilisateur pour collaboration multi-utilisateurs

Publications (1)

Publication Number Publication Date
WO2014178800A1 true WO2014178800A1 (fr) 2014-11-06

Family

ID=51843789

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/TH2013/000023 WO2014178800A1 (fr) 2013-05-02 2013-05-02 Interface utilisateur pour collaboration multi-utilisateurs

Country Status (1)

Country Link
WO (1) WO2014178800A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10698505B2 (en) 2016-01-13 2020-06-30 Hewlett-Packard Development Company, L.P. Executing multiple pen inputs
US10749701B2 (en) 2017-09-22 2020-08-18 Microsoft Technology Licensing, Llc Identification of meeting group and related content

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090193345A1 (en) * 2008-01-28 2009-07-30 Apeer Inc. Collaborative interface
WO2012050946A2 (fr) * 2010-09-29 2012-04-19 Bae Systems Information Solutions Inc. Procédé d'arrière-plan informatique collaboratif
US20120110431A1 (en) * 2010-11-02 2012-05-03 Perceptive Pixel, Inc. Touch-Based Annotation System with Temporary Modes
US20130047093A1 (en) * 2011-05-23 2013-02-21 Jeffrey Jon Reuschel Digital whiteboard collaboration apparatuses, methods and systems
US20130093708A1 (en) * 2011-10-13 2013-04-18 Autodesk, Inc. Proximity-aware multi-touch tabletop


Similar Documents

Publication Publication Date Title
CN108431729B (zh) Three-dimensional object tracking to augment display area
JP2015153420A (ja) Multitask switching method and system, and electronic device having the system
KR20150014083A (ko) Electronic device and input recognition method of the electronic device
JP2014215737A (ja) Information processing device, display control method, and computer program
JP6379880B2 (ja) System, method, and program enabling fine-grained user interaction with a projector-camera or display-camera system
CN105934739A (zh) Virtual mouse for a touchscreen device
KR20170009979A (ko) Method and system for touch input
Kolb et al. Towards gesture-based process modeling on multi-touch devices
JP2017534975A (ja) Interaction method for a user interface
KR20160019762A (ko) One-handed control method for a touch screen
US20160054879A1 (en) Portable electronic devices and methods for operating user interfaces
US10073612B1 (en) Fixed cursor input interface for a computer aided design application executing on a touch screen device
Foucault et al. SPad: a bimanual interaction technique for productivity applications on multi-touch tablets
WO2014178800A1 (fr) User interface for multi-user collaboration
Baldauf et al. Snap target: Investigating an assistance technique for mobile magic lens interaction with large displays
Brehmer et al. Interacting with visualization on mobile devices
US10838570B2 (en) Multi-touch GUI featuring directional compression and expansion of graphical content
Khan A survey of interaction techniques and devices for large high resolution displays
Klompmaker et al. Towards multimodal 3d tabletop interaction using sensor equipped mobile devices
US20140085197A1 (en) Control and visualization for multi touch connected devices
Aigner et al. Design Considerations for the Placement of Data Visualisations in Virtually Extended Desktop Environments
Weibel et al. Hiperface: A multichannel architecture to explore multimodal interactions with ultra-scale wall displays
Liu et al. Interactive space: a prototyping framework for touch and gesture on and above the desktop
Buda Rotation techniques for 3D object interaction on mobile devices
Lee et al. Smart and space-aware interactions using smartphones in a shared space

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13883670

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13883670

Country of ref document: EP

Kind code of ref document: A1