KR101234096B1 - Multipoint user interface device and multipoint user interfacing method - Google Patents


Info

Publication number
KR101234096B1
Authority
KR
South Korea
Prior art keywords
input
event
user
multipoint
plurality
Prior art date
Application number
KR1020090103439A
Other languages
Korean (ko)
Other versions
KR20110046786A (en)
Inventor
최은정
박준석
이전우
Original Assignee
한국전자통신연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자통신연구원
Priority to KR1020090103439A
Publication of KR20110046786A
Application granted
Publication of KR101234096B1

Abstract

Disclosed are a multipoint user interface device and method for joint collaboration among a plurality of users. According to an aspect of the present invention, there is provided a multipoint user interface device including: an input event identification unit for identifying input events generated from a plurality of input devices associated with a specific user; an input event fusion unit configured to fuse two or more of the input events to generate a new event; and an input event transmitter configured to transmit one or more of the input events and the new event to an event waiting queue for each application program so as to perform quasi-real-time processing. Accordingly, a plurality of users can effectively control a plurality of pointers.
Multipoint, Collaborative, Contactless Input

Description

Multipoint user interface device and multipoint user interfacing method

TECHNICAL FIELD The present invention relates to multi-user interfaces, and more particularly to a multipoint user interface framework technology for joint collaboration among a plurality of users.

Recently, as large display devices have become common, it is increasingly usual for many users to view a single display device. Meanwhile, a mouse and a keyboard are generally used as the user input interface of the computers currently in use.

When a plurality of users need to control a system jointly, such as during a meeting or a multi-user collaboration, it is inconvenient to take turns at a single input interface using a conventional mouse or keyboard. As the size of display devices increases, the need for a user interface supporting multi-user collaboration is also growing.

Recently, systems that recognize a plurality of mice and provide a plurality of cursors on the screen have been introduced. In such a system, however, once one mouse acquires control in a conventional window environment, the other mice cannot control the system. In other words, multiple cursors are provided, but users are forced to work sequentially because the system is ultimately controlled by only one cursor. In addition, if control is taken away by another user in the middle of a task, there is a risk of losing the work already done.

In addition to input interfaces such as a mouse and keyboard, user input through a bending sensor, a position sensor, or camera recognition is also possible. With the development of graphical user interfaces, most operations on a system can be performed without inconvenience simply by pointing at the screen. It is therefore expected that applications in which one user simultaneously uses a plurality of input interfaces will become widespread. In this case, there is an urgent need for user interface technology that can appropriately control multiple input interfaces by efficiently handling the events occurring in each of them.

An object of the present invention, devised to solve the above problems, is to facilitate joint collaboration by allowing a plurality of users to simultaneously control on-screen objects through their own interfaces during a meeting or collaborative work.

Another object of the present invention is to manage the events occurring in a plurality of input devices on a per-user basis, such as when one user employs several input devices, and to fuse those events appropriately so that multipoint user control can be performed effectively.

A further object of the present invention is to handle events occurring in a plurality of input devices efficiently by providing an event waiting queue for each application program when a plurality of application programs are executed.

According to an aspect of the present invention, there is provided a multipoint user interface device including: an input event identification unit for identifying input events generated from a plurality of input devices associated with a specific user; an input event fusion unit configured to fuse two or more of the input events to generate a new event; and an input event transmitter configured to transmit one or more of the input events and the new event to an event waiting queue for each application program so as to perform quasi-real-time processing.

In this case, the multipoint may be controlled by one or more of the input devices.

In this case, the input event fusion unit may generate a new event in consideration of user information related to the identified input events. For example, the input event fusion unit may fuse only events generated from a plurality of input devices controlled by the same user; events generated by different users may be left unfused even if, together, they would otherwise form a defined combination.

In this case, the input devices may include any one or more of a bending sensor, a position sensor, and a camera input device.

In this case, the multipoint user interface device may further include an output control unit that receives an output request from an application program and provides the output data to an output device, and a multipoint location tracking unit that tracks the locations of the multipoints so that on-screen objects related to user input can be identified. A plurality of output devices may be provided, and a corresponding output device may be set for each user.

In this case, the input event transmitter may transmit the input events and the new event to the event waiting queue in the order of occurrence.

In addition, the multipoint user interfacing method according to the present invention comprises: identifying input events occurring from a plurality of input devices, each associated with a particular user; fusing two or more of the input events to generate a new event; and delivering one or more of the input events and the new event to an event waiting queue for each application program so as to perform quasi-real-time processing.

In this case, the generating of the new event may take into consideration user information related to the input events.

In this case, the multipoint user interfacing method may further include receiving an output request from an application program and providing output data to the output device. In this case, a plurality of output devices may be provided, and a corresponding output device may be set for each user.

At this time, the step of delivering to the event waiting queue for each application may deliver the input events and the new event to the event waiting queue in the order in which they occurred.

According to the present invention, a plurality of users can simultaneously control on-screen objects through their own interfaces during a meeting or collaborative work, thereby facilitating joint collaboration.

In addition, the present invention manages the events occurring in a plurality of input devices on a per-user basis, such as when one user employs several input devices, and fuses those events appropriately so that multipoint user control is performed effectively.

The present invention can also handle events occurring in a plurality of input devices efficiently by providing an event waiting queue for each application program when a plurality of application programs are executed.

The present invention will now be described in detail with reference to the accompanying drawings. Hereinafter, repeated descriptions and detailed descriptions of known functions and configurations that may obscure the gist of the present invention will be omitted. The embodiments of the present invention are provided to describe the present invention more fully to those skilled in the art; accordingly, the shapes and sizes of elements in the drawings may be exaggerated for clarity.

Hereinafter, preferred embodiments according to the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a diagram illustrating a multipoint user interface environment according to the present invention.

Referring to FIG. 1, the multipoint user interface environment includes individual user devices 120 and 130, a collaboration device 140, a multipoint user interface device 110, and an application program 150.

The individual user devices 120 and 130 are input/output devices used by an individual, such as a tactile output device and a gesture input device, respectively. For example, each of the individual user devices 120 and 130 may include a bending sensor, a position sensor, and a camera input device for input.

The collaboration device 140 is a device that users jointly use, such as a speaker or a display. The collaboration device 140 may be composed of a plurality of devices. For example, when a plurality of displays constitute a screen shared by several users, the plurality of displays correspond to the collaboration device 140.

The application program 150 may be software for joint collaboration. For example, the application program 150 may be an interior design tool.

According to an embodiment, a plurality of application programs 150 may be executed simultaneously. In particular, since the performance of central processing units has advanced remarkably, a plurality of programs can run at the same time; multi-user control can therefore be improved by executing a plurality of applications simultaneously for multi-user collaboration, distributing user control appropriately, and managing events through a per-application event waiting queue.

Each application program 150 has an event waiting queue, sequentially processes the events stored in that queue, generates an output request, and transmits the generated request to the multipoint user interface device 110. In this case, the application program 150 is preferably executed with performance sufficient to process the input events of a plurality of users in so-called quasi-real time.
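By way of illustration only (this code is not part of the original disclosure), the per-application event waiting queue described above could be sketched in Python roughly as follows; the class and method names are hypothetical, and the queue simply preserves arrival order so that an application draining it quickly enough achieves the quasi-real-time behaviour discussed below.

    import queue

    class ApplicationEventQueue:
        """Hypothetical per-application event waiting queue."""

        def __init__(self, app_name):
            self.app_name = app_name
            self._queue = queue.Queue()   # FIFO: events keep the order in which they were posted

        def post(self, event):
            # Called by the multipoint user interface device to enqueue an event.
            self._queue.put(event)

        def process_pending(self, handler):
            # The application drains the queue sequentially; if each handler call is
            # fast enough, users perceive the processing as quasi-real-time.
            while not self._queue.empty():
                handler(self._queue.get())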

The multipoint user interface device 110 receives events from the individual user devices 120 and 130, generates a new event by fusing a plurality of events when necessary, and provides the events to the event waiting queue of an application program for quasi-real-time processing. In addition, the multipoint user interface device 110 receives an output request from an application program and provides the output data to the corresponding output device.

FIG. 2 is a block diagram illustrating an example of the multipoint user interface device illustrated in FIG. 1.

Referring to FIG. 2, the multipoint user interface device includes an input controller 210 and an output controller 220.

The input control unit 210 identifies input events occurring from a plurality of input devices, each associated with a specific user, fuses two or more of the input events to generate a new event when necessary, and delivers the identified and/or generated events to the event waiting queue of each application program so that they can be processed in quasi-real time.

The output controller 220 selects an appropriate output device according to the output request received from the application program and provides the output data. In this case, the output device may be a collaboration device or an individual user device.

In particular, the output controller 220 may provide the output data to the output device corresponding to the user who generated the input event associated with the application program's output request. When a plurality of users generate input events using a plurality of input devices and collaborate through a plurality of display devices, selecting an output device, such as a display, that is appropriate for each user becomes very important; by using the user information corresponding to each event, an appropriate output device can be selected efficiently.

The output controller 220 may also perform a multipoint location tracking function. That is, the output controller 220 may track the locations of the multipoints on the screen and, when a specific event occurs, select the object corresponding to that event.

FIG. 3 is a block diagram illustrating an example of the input controller illustrated in FIG. 2.

Referring to FIG. 3, the input control unit includes an input event identification unit 310, an input event fusion unit 320, and an input event transmitter 330.

The input event identification unit 310 identifies input events generated from a plurality of input devices, each associated with a specific user. For example, input devices A and B may be devices associated with user 1, and input devices C and D may be devices associated with user 2. The input event identification unit 310 may identify events occurring in all of the input devices A, B, C, and D, and may also identify the user corresponding to each event. In addition, the input event identification unit 310 may identify information on the type of the input device or the type of the generated event.
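As a rough illustration only (not part of the original disclosure), the device-to-user association described above could be represented as follows; the InputEvent fields, the device identifiers "A" through "D", and the class name InputEventIdentifier are all hypothetical.

    from dataclasses import dataclass
    import time

    @dataclass
    class InputEvent:
        device_id: str     # e.g. "A", "B", "C", "D"
        user_id: str       # user who controls the device
        device_type: str   # e.g. "bending_sensor", "inertial_sensor", "camera"
        event_type: str    # e.g. "finger_bend", "hand_turn_cw"
        timestamp: float

    class InputEventIdentifier:
        """Hypothetical sketch: attaches user and device information to raw events."""

        def __init__(self, device_to_user, device_to_type):
            self.device_to_user = device_to_user   # e.g. {"A": "user1", "B": "user1", "C": "user2", "D": "user2"}
            self.device_to_type = device_to_type   # e.g. {"A": "bending_sensor", "B": "inertial_sensor", ...}

        def identify(self, device_id, event_type):
            # Produce an identified event carrying the user, device type and occurrence time.
            return InputEvent(
                device_id=device_id,
                user_id=self.device_to_user[device_id],
                device_type=self.device_to_type[device_id],
                event_type=event_type,
                timestamp=time.monotonic(),
            )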

In this case, the input devices may include any one or more of a bending sensor, a position sensor, and a camera input device.

In this case, the multipoint may be controlled by one or more of the input devices. For example, the multipoint may be controlled by one inertial sensor or may be controlled by an inertial sensor and a bending sensor.

The input event fusion unit 320 generates a new event by fusing two or more of the input events if necessary.

The movement of the hand can be detected through an inertial sensor or camera recognition, and the movement of a finger through a bending sensor; an event corresponding to a combination of movements of two or more parts of the body can therefore be defined. In this case, one event may be generated by combining the events generated by each sensor.

For example, an event may be defined for the motion of turning the left hand clockwise while bending the right index finger. In this case, the input event fusion unit may combine the event generated by bending the right index finger and the event generated by turning the left hand clockwise into a new event and provide it to the application.

In particular, in the case of a contactless input device operated through the movement of the human body, rather than a simple pointing device such as a mouse, the operating efficiency of the multipoint user interface device can be increased by defining combinations of a plurality of input events and generating a new event corresponding to each combination.

In particular, when a contactless input device is used, restrictions on the user's position and movement can be minimized, and operation of large display devices becomes easier.

In this case, the input event fusion unit 320 may generate the new event in consideration of user information related to the input events identified by the input event identification unit 310. For example, the input event fusion unit 320 fuses the two events into a new event when the finger bending and the hand movement are generated by the same user, but may not perform the fusion when one user bends a finger and a different user moves a hand.
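Purely as an illustrative sketch (not part of the original disclosure), and reusing the hypothetical InputEvent above, the same-user fusion rule could look like this; the rule table, the event-type names, and the time window are invented for the example.

    class InputEventFusionUnit:
        """Hypothetical sketch: fuses co-occurring events of the same user."""

        # combination of event types -> name of the fused event (illustrative rule)
        FUSION_RULES = {
            frozenset({"right_index_bend", "left_hand_turn_cw"}): "rotate_object",
        }

        def __init__(self, window_seconds=0.3):
            self.window = window_seconds   # how close in time the events must be

        def try_fuse(self, ev_a, ev_b):
            if ev_a.user_id != ev_b.user_id:
                return None   # events of different users are never fused
            if abs(ev_a.timestamp - ev_b.timestamp) > self.window:
                return None   # too far apart in time to count as one gesture
            fused_type = self.FUSION_RULES.get(
                frozenset({ev_a.event_type, ev_b.event_type}))
            if fused_type is None:
                return None   # no fusion rule defined for this combination
            return InputEvent(
                device_id=ev_a.device_id + "+" + ev_b.device_id,
                user_id=ev_a.user_id,
                device_type="fused",
                event_type=fused_type,
                timestamp=max(ev_a.timestamp, ev_b.timestamp),
            )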

The input event transmitter 330 delivers the input events generated by the input devices and/or the new event generated by the input event fusion unit 320 to an event waiting queue for each application program so that quasi-real-time processing can be performed.

Quasi-real-time processing is not strictly real-time processing; rather, it means that by processing an event within an acceptable time, the user perceives the event as being handled at the same moment it occurs. In particular, in the case of a multipoint user interface, if event processing through the event waiting queue is fast enough, a plurality of users will feel that their inputs are processed simultaneously.

In this case, the input event transmitter 330 may deliver the input events generated by the input devices and/or the events generated by the input event fusion unit 320 to the event waiting queue of each application program in the order in which the events occurred.
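Again for illustration only (not part of the original disclosure), and reusing the hypothetical ApplicationEventQueue above, order-preserving delivery could be sketched as follows; sorting by the recorded timestamp is one simple way to honour the order of occurrence.

    class InputEventTransmitter:
        """Hypothetical sketch: delivers events to per-application queues in order of occurrence."""

        def __init__(self, app_queues):
            self.app_queues = app_queues   # e.g. {"interior_design": ApplicationEventQueue("interior_design")}

        def deliver(self, events, app_name):
            # Sort by occurrence time so the application sees events in the order they happened.
            for event in sorted(events, key=lambda e: e.timestamp):
                self.app_queues[app_name].post(event)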

FIG. 4 is a block diagram illustrating an example of the output controller illustrated in FIG. 2.

Referring to FIG. 4, the output controller illustrated in FIG. 2 includes an output controller 410 and a multipoint location tracking unit 420.

The output controller 410 receives an output request from an application program, selects a corresponding output device, and provides output data to the selected device.

In this case, there may be a plurality of output devices, and a corresponding output device may be set for each user.

The multipoint location tracking unit 420 tracks the locations of the multipoints on the screen and identifies the on-screen object at the location where a user input occurs. For example, when the user wants to move an object on the screen using a gesture, the object at the corresponding point is recognized so that it can be selected and moved to the desired position.
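As a minimal sketch only (not part of the original disclosure), multipoint location tracking with a simple rectangular hit test could look like this; the point identifiers, the object representation, and the coordinate convention are assumptions of the example.

    class MultipointLocationTracker:
        """Hypothetical sketch: tracks point positions and finds the object under a point."""

        def __init__(self):
            self.point_positions = {}   # point_id -> (x, y) in screen coordinates

        def update(self, point_id, x, y):
            self.point_positions[point_id] = (x, y)

        def object_at_point(self, point_id, objects):
            # objects: iterable of (object_id, (left, top, right, bottom)) rectangles
            x, y = self.point_positions[point_id]
            for object_id, (left, top, right, bottom) in objects:
                if left <= x <= right and top <= y <= bottom:
                    return object_id
            return None   # no object under this point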

According to an embodiment, the output controller 410 may perform output device selection and/or output data provision by using the output of the multipoint location tracking unit 420.

FIG. 5 is a block diagram illustrating input processing in the multipoint user interfacing method according to the present invention.

Referring to FIG. 5, the multipoint user interfacing method identifies input events generated from a plurality of input devices, each associated with a specific user (S510).

In this case, the multipoint may be controlled by one or more of the input devices.

In this case, the input events may be identified by determining whether each input corresponds to an input operation defined for the input device. For example, an input event may occur when the bending sensor detects bending of at least a predetermined angle, or when the inertial sensor detects a movement of at least a predetermined distance or with at least a predetermined acceleration.
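The threshold values below are purely illustrative (the patent does not specify them); this sketch only shows the shape of such a check, with hypothetical function names.

    BEND_THRESHOLD_DEG = 40.0    # illustrative threshold, not from the patent
    ACCEL_THRESHOLD_MS2 = 2.0    # illustrative threshold, not from the patent

    def detect_bend_event(bend_angle_deg):
        # Bending sensor: report an event once bending reaches the threshold angle.
        return "finger_bend" if bend_angle_deg >= BEND_THRESHOLD_DEG else None

    def detect_motion_event(acceleration_ms2, displacement_m, min_displacement_m=0.05):
        # Inertial sensor: report an event for a sufficiently large and fast movement.
        if acceleration_ms2 >= ACCEL_THRESHOLD_MS2 and displacement_m >= min_displacement_m:
            return "hand_move"
        return None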

In addition, the multipoint user interfacing method determines whether fusion of input events is necessary (S520).

That is, fusion events may be defined for combinations of two or more events for efficient event management. In this case, when events corresponding to a stored event combination are input, it may be determined that event fusion is required.

If it is determined in step S520 that fusion is necessary, the multipoint user interfacing method fuses two or more input events (S530).

For example, the multipoint user interfacing method may look up, in a database, the fusion event defined for the combination of two or more input events and output that new fusion event.

In this case, step S530 may generate a new event in consideration of user information related to the input events.

In addition, the multipoint user interfacing method transmits the events generated by the input devices and/or the events generated by fusion to an event waiting queue for each application program so that quasi-real-time processing can be performed (S540). At this time, point location or object information may be transmitted to the application program as well. For example, the point location may be the location of the point corresponding to the user associated with the input device that generated the event at the time the event occurred, and the object information may be information about the object at the location of that point at the time the event occurred.

In this case, step S540 may deliver the input events and/or the new events generated by fusion to the event waiting queue in the order in which they occurred.

If it is determined in step S520 that event fusion is not necessary, the multipoint user interfacing method proceeds to step S540 and transmits the events generated by the input devices, together with the point location and object information, to the application program.

The application program provided with the event, the point location, and the object information retrieves the event from its event waiting queue and processes it, so that the user is provided with an output corresponding to the processed event.

FIG. 6 is a block diagram illustrating output processing in the multipoint user interfacing method according to the present invention.

Referring to FIG. 6, the multipoint user interfacing method receives an output request from an application program (S610).

Upon receiving the output request, the multipoint user interfacing method identifies the user corresponding to the input event related to that request (S620).

In addition, the multipoint user interfacing method selects an output device available to the identified user (S630).

As such, it is possible to select an appropriate output device from among the plurality of output devices by using user information corresponding to an event generated in each input device.

In operation S640, the multipoint user interfacing method provides the output data to the selected output device.
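A minimal sketch of the output path S610 to S640 (not part of the original disclosure) might look as follows; the request structure, the user-to-device table, and the render() call are hypothetical.

    class OutputController:
        """Hypothetical sketch: routes output to the device configured for the requesting user."""

        def __init__(self, user_to_device):
            self.user_to_device = user_to_device   # e.g. {"user1": display_871, "user2": display_872}

        def handle_output_request(self, request):
            # S620: identify the user from the input event that triggered the request.
            user_id = request["source_event"].user_id
            # S630: select the output device configured for that user.
            device = self.user_to_device[user_id]
            # S640: provide the output data to the selected device.
            device.render(request["data"])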

Although not shown in FIG. 6, the multipoint user interfacing method may further include tracking the location of the multipoint so as to identify an object on the screen related to the user input.

The multipoint user interfacing methods illustrated in FIGS. 5 and 6 have been described separately for convenience, but they may be operations performed by a single device.

FIG. 7 is a diagram illustrating an example in which two points simultaneously perform a task.

Referring to FIG. 7, there are two points 710 and 720 on the screen, and it can be seen that two users collaborate using each point.

In the example shown in FIG. 7, the user using the point 710 can select and place the desired item on the design drawing irrespective of the point 720.

Similarly, a user using point 720 may select and place a desired item on a design drawing irrespective of point 710.

According to an embodiment, the points 710 and 720 may be controlled through one input device such as a mouse, or may be controlled through a plurality of input devices such as an inertial sensor and a bending sensor.

FIG. 8 is a diagram illustrating an example in which four users collaborate, each using two points.

Referring to FIG. 8, it can be seen that interior design work is being performed through joint collaboration using a plurality of displays 871, 872, 873, and 874.

In the example shown in FIG. 8, each user controls two points. In this case, a user can control the two points through gestures of both hands. It can also be seen that each user can work while moving between display screens.

Points 811 and 812 shown in FIG. 8 are controlled by the user 810, points 821 and 822 are controlled by the user 820, points 831 and 832 are controlled by the user 830, and points 841 and 842 are controlled by the user 840.

The user 810 controls the point 811 with his left hand to perform interior design work on the display device 871, and controls the point 812 with his right hand to perform work on the display device 872. The user 810 may move an item on the display 872 onto the display 871.

The user 820 performs interior design work using two points 821 and 822 on the display device 871. In this case, the user 820 may work by dragging items from the other display devices 872, 873, and 874 to the display device 871.

The user 830 controls the point 831 with his left hand to select an item on the display 872, and controls the point 832 with his right hand to perform interior design work on the display 874.

The user 840 controls the point 841 with his left hand to select an item on the display device 873, and controls the point 842 with his right hand to perform interior design work on the display device 874.

As illustrated in FIG. 8, when a plurality of displays are jointly used by a plurality of users, each of whom uses two or more input devices, events can be fused effectively by using the user information corresponding to the events occurring in the input devices, and an output device corresponding to the user can be selected at the time of output.

As described above, the multipoint user interface device and method according to the present invention are not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.

FIG. 1 is a diagram illustrating a multipoint user interface environment according to the present invention.

FIG. 2 is a block diagram illustrating an example of the multipoint user interface device illustrated in FIG. 1.

FIG. 3 is a block diagram illustrating an example of the input controller illustrated in FIG. 2.

FIG. 4 is a block diagram illustrating an example of the output controller illustrated in FIG. 2.

FIG. 5 is a block diagram illustrating input processing in the multipoint user interfacing method according to the present invention.

FIG. 6 is a block diagram illustrating output processing in the multipoint user interfacing method according to the present invention.

FIG. 7 is a diagram illustrating an example in which two points simultaneously perform a task.

FIG. 8 is a diagram illustrating an example in which four users collaborate, each using two points.

Claims (15)

  1. (Deleted)
  2. (Deleted)
  3. A multipoint user interface device comprising:
    an input event identification unit identifying input events generated from a plurality of input devices respectively associated with a specific user;
    an input event fusion unit configured to fuse two or more of the input events to generate a new event; and
    an input event transmitter configured to transmit one or more of the input events and the new event to an event waiting queue for each application program so as to perform quasi-real-time processing,
    wherein a multipoint is controlled by one or more of the input devices, and
    the input event fusion unit generates the new event in consideration of user information related to the input events identified by the input event identification unit, determining whether or not to fuse the input events according to whether the events occur in input devices controlled by the same user.
  4. The multipoint user interface device of claim 3, wherein the input devices comprise at least one of a bending sensor, a position sensor, and a camera input device.
  5. The multipoint user interface device of claim 3, further comprising an output control unit which receives an output request from the application program and provides output data to an output device.
  6. The multipoint user interface device of claim 5, wherein a plurality of output devices are provided and a corresponding output device is set for each user.
  7. The multipoint user interface device of claim 3, further comprising a multipoint location tracking unit for tracking the location of the multipoint so as to identify an object on the screen related to a user input.
  8. The multipoint user interface device of claim 3, wherein the input event transmitter delivers the input events and the new event to the event waiting queue in the order in which they occurred.
  9. (Deleted)
  10. (Deleted)
  11. A multipoint user interfacing method comprising:
    identifying input events occurring from a plurality of input devices each associated with a particular user;
    fusing two or more of the input events to generate a new event; and
    passing one or more of the input events and the new event to an event waiting queue for each application program so as to perform quasi-real-time processing,
    wherein a multipoint is controlled by one or more of the input devices, and
    the generating of the new event takes into consideration user information related to the input events, determining whether or not to fuse the input events according to whether the events occur in input devices controlled by the same user.
  12. The multipoint user interfacing method of claim 11, further comprising receiving an output request from the application program and providing output data to an output device.
  13. The multipoint user interfacing method of claim 12, wherein a plurality of output devices are provided and a corresponding output device is set for each user.
  14. The multipoint user interfacing method of claim 11, further comprising tracking the location of the multipoint so as to identify an object on the screen associated with a user input.
  15. The multipoint user interfacing method of claim 11, wherein the delivering to the event waiting queue for each application program delivers the input events and the new event to the event waiting queue in the order in which they occurred.
KR1020090103439A 2009-10-29 2009-10-29 Multipoint user interface device and multipoint user interfacing method KR101234096B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020090103439A KR101234096B1 (en) 2009-10-29 2009-10-29 Multipoint user interface device and multipoint user interfacing method

Publications (2)

Publication Number Publication Date
KR20110046786A KR20110046786A (en) 2011-05-06
KR101234096B1 true KR101234096B1 (en) 2013-02-19

Family

ID=44238169

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020090103439A KR101234096B1 (en) 2009-10-29 2009-10-29 Multipoint user interface device and multipoint user interfacing method

Country Status (1)

Country Link
KR (1) KR101234096B1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05108783A (en) * 1991-10-21 1993-04-30 Nec Corp Plural pointers information transmission system
JP2007114151A (en) * 2005-10-24 2007-05-10 Denso Corp On-vehicle multi-cursor system

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101554076B1 (en) 2013-12-20 2015-09-18 단국대학교 천안캠퍼스 산학협력단 System for providing user interaction based on multiple input device in tiled display system and method for using the system

Also Published As

Publication number Publication date
KR20110046786A (en) 2011-05-06


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20151223

Year of fee payment: 4

FPAY Annual fee payment

Payment date: 20170223

Year of fee payment: 5

FPAY Annual fee payment

Payment date: 20180212

Year of fee payment: 6

FPAY Annual fee payment

Payment date: 20190211

Year of fee payment: 7