KR20120062209A - User interface for handy terminal - Google Patents

User interface for handy terminal

Info

Publication number
KR20120062209A
Authority
KR
South Korea
Prior art keywords
portable terminal
bezel
user interface
screen
keyboard
Prior art date
Application number
KR1020100123368A
Other languages
Korean (ko)
Other versions
KR101213021B1 (en)
Inventor
유인오
Original Assignee
유인오
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 유인오
Priority to KR1020100123368A
Priority to PCT/KR2011/009346
Publication of KR20120062209A
Application granted
Publication of KR101213021B1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1615 Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1632 External expansion units, e.g. docking stations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1686 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B1/00 Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38 Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/40 Circuits
    • H04B1/401 Circuits for selecting or indicating operating mode

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Mathematical Physics (AREA)
  • Position Input By Displaying (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Input From Keyboards Or The Like (AREA)
  • Calculators And Similar Devices (AREA)

Abstract

PURPOSE: A motion recognition user interface for a portable terminal is provided, enabling the terminal to be operated with both hands by having sensing units on its bezel frames supply a sensing area. CONSTITUTION: A pair of bezel frames (120) is arranged along both sides of a portable terminal. A sensing unit (130) is installed on each bezel frame, between the rotation shaft of the frame and one end. A detent (140) is installed between the bezel frame and the portable terminal.

Description

Motion Recognition User Interface for Portable Terminals

The present invention relates to a motion recognition user interface for a portable terminal. More specifically, it relates to a user interface that adapts the bezel portion of a tablet-type portable terminal so that it not only props up the terminal but also recognizes motion in the space in front of it, allowing input to the terminal to be manipulated easily and simply.

Today, with the development of information technology (IT), portable terminals are equipped with many functions such as calling, music playback, video playback, schedule management, and Internet access.

However, since a portable terminal is made small for the sake of mobility, there is a limit to the input/output devices that can effectively support these many functions. Even the recently introduced medium-sized tablet terminals and portable computers, which provide input means such as a virtual keyboard on a wide screen, remain limited in terms of size and convenience.

In the related art, a touch screen capable of both receiving input and displaying output on the same screen is used. In this case, a touch panel is attached over the display, and input is performed by touching the screen, sensed capacitively or by static pressure.

However, because a touch screen is operated by direct finger contact with the screen, it is inconvenient to reach out and place a hand on the image keyboard displayed on the screen when the portable terminal is propped up, and when the terminal is laid flat it is very tiring to bend one's head over it and hold that posture for a long time just to see the screen.

In addition, since a hand placed on the screen covers part of it, the user must repeatedly lift the hand away and look at the screen to check the keys or buttons of the keyboard displayed as an image, which is cumbersome.

In addition, a so-called optical keyboard has been introduced, which receives input by projecting a frame image of a keyboard, using a laser or the like, onto an arbitrary plane outside the limited area of the portable terminal.

Such an optical keyboard projects the keyboard frame image output from an optical device onto an arbitrary plane and recognizes the input actions of the fingers placed on it.

Therefore, using an optical keyboard displayed at full width on any plane, it is possible to type freely with all of the fingers, as on a conventional computer keyboard.

However, since the optical keyboard is projected onto an arbitrary hard plane such as a desk or table, typing on it causes pain in the fingertips.

In addition, similar to the touch screen, when the user puts his or her hand on the frame image of the optical keyboard, part of the image is hidden and cannot be projected.

Recently, user interfaces operating in a three-dimensional virtual space have been introduced, applying advances in computer graphics, virtual reality, and augmented reality technologies.

Here, a three-dimensional virtual space is a three-dimensional space in which depth information is added to the concept of a two-dimensional plane, and the user interface it requires senses three-dimensional manipulations having six degrees of freedom and interacts with a virtual input device to perform input operations.

However, supporting this requires carrying a separate interface device that is expensive and bulky, which is inconvenient, and even if camera-based motion recognition technology is used, a camera embedded in today's flat terminals cannot be positioned at an appropriate orientation angle.

Therefore, there is a need for an alternative that can provide a three-dimensional virtual-space interface for recent portable terminals, which are equipped with a wide variety of functions and require a more effective interface.

The present invention has been made in view of the above circumstances. An object of the present invention is to provide a motion recognition user interface for a portable terminal in which sensing means, which make use of the bezel area of a conventional tablet-type portable terminal and recognize motion in space, can be adjusted in orientation angle toward the region in front of and spaced apart from the terminal, so that the user can operate the terminal intuitively with both hands.

In order to achieve the above object, the present invention comprises a pair of bezel frames disposed along the longitudinal direction of both sides of the portable terminal, each rotating relative to the portable terminal; and sensing means mounted to each of the pair of bezel frames and electrically connected to the portable terminal.

In addition, the sensing means is mounted on each bezel frame between the rotating shaft and one end of the frame, and is arranged to have a sensing area in front of the portable terminal.

The apparatus may further include a detent interposed between the bezel frame and the portable terminal to fix relative rotation of each of the pair of bezel frames.

The motion recognition user interface for a portable terminal according to the present invention uses the rotation of bezel frames, which adapt the bezel portion designed into a conventional tablet-type portable terminal, and the sensing means mounted on the bezel frames provide a sensing area that serves as a virtual interface space, so that the user can operate the portable terminal with both hands.

In addition, the bezel frames equipped with the sensing means can serve as a stand for the portable terminal, and by adjusting the rotation angle of the bezel frames, the position of the sensing area covered by the sensing means can be adjusted.

In addition, a user can enter keyboard input by typing in the virtual interface space as if typing on an ordinary keyboard, and can intuitively check the input status of the keyboard through a virtual hand displayed on the screen.

In addition, gestures made in the virtual interface space can be interpreted in three dimensions, so that games requiring three-dimensional manipulation can be operated intuitively and played realistically.

1 is a front view showing a conventional portable terminal in the form of a tablet.
2 is a perspective view showing a gesture recognition user interface for a portable terminal according to an embodiment of the present invention.
3 is an exploded perspective view showing a detent related to a gesture recognition user interface for a portable terminal according to an embodiment of the present invention.
4 and 5 are schematic diagrams showing the sensing area of the motion recognition user interface for a portable terminal according to an embodiment of the present invention.
6 is a perspective view showing a motion recognition user interface for a portable terminal according to another embodiment of the present invention.
7 to 11 are schematic diagrams showing states of use of the motion recognition user interface for a portable terminal according to an embodiment of the present invention.

Hereinafter, the configuration and operation of an operation recognition user interface for a portable terminal according to an embodiment of the present invention will be described in detail.

Tablet-type portable terminals are currently attracting growing interest, mainly because of their portable size, dramatically improved screen quality, and intuitive, direct responsiveness to touch. However, while users are satisfied with the experience of holding the terminal in the hand and moving content around on the screen, that appeal is interrupted by the inconvenience of entering text on the terminal. In other words, the virtual keypad appears covering a large part of the wide screen and must be operated with the hands obstructing the field of view. In addition, for many tasks the tablet must be propped up to keep it in view, and in that arrangement input by hand on the screen itself becomes impossible, so a separate keyboard is required.

Accordingly, an embodiment of the present invention provides a new type of user interface that can simultaneously solve the mounting problem and the input problem of the portable terminal.

In addition, the gesture recognition user interface for a portable terminal according to an embodiment of the present invention uses a bezel frame 120 (see FIG. 2) that adapts the bezel 12 provided in the existing portable terminal 10 shown in FIG. 1. The bezel frame props up the portable terminal and, through the natural structural arrangement that results from this mounting, allows the sensing means 130 (see FIG. 2) to maintain an optimal angle for fully recognizing both hands of the user.

Here, in its dictionary sense, a bezel refers to the rim that holds a transparent cover such as the glass of a watch; such a bezel 12 can also be found in the conventional tablet-type portable terminal 10.

That is, in the conventional tablet-type portable terminal 10 whose screen 11 is operated by touch, the bezel 12 is an area that prevents the fingers gripping the terminal from inadvertently triggering touch operations on the screen 11. When gripping a conventional tablet-type portable terminal 10, both sides are usually wrapped by the hands, and the bezel 12 provides a space the fingers can hold without affecting touch manipulation of the screen 11.

Admittedly, the bezel 12 makes the design feel considerably thick, but it keeps gripping fingers such as the thumb from interfering with the touch screen 11 and allows the terminal to be held stably in actual use. In a tablet-type portable terminal it is therefore a mandatory element of UX (User eXperience) design, and the motion recognition user interface for a portable terminal according to the present invention actively makes use of it.

In particular, an angle at which the fingers can be recognized optimally is obtained when the hands positioned to manipulate the tablet-type portable terminal are arranged much as they would be for ordinary keyboard input, that is, in a familiar posture.

To this end, the gesture recognition user interface for a portable terminal according to the present invention includes a pair of bezel frames 120 arranged along the longitudinal direction of both sides of the portable terminal 100, each rotating relative to the portable terminal 100, and sensing means 130 mounted on each of the bezel frames 120 and electrically connected to the portable terminal 100.

Here, because the portable terminal 100 is provided with the separate bezel frames 120, it does not need the same border space as the conventional bezel 12 (see FIG. 1), and the screen 11 can be arranged to extend tightly to both sides of the front face.

Both side ends of the portable terminal 100 are preferably provided with a rotating shaft 110 for rotatably connecting the pair of bezel frames 120.

However, as described below, when the pair of bezel frames 120 are provided on a separate case 160 (see FIG. 5) to and from which the portable terminal 100 can be attached and detached, the rotation shaft 110 is of course omitted from the portable terminal 100.

Next, the pair of bezel frames 120 are preferably made in a size and shape similar to the bezel 12 of the conventional tablet-type portable terminal 10 (see FIG. 1), in order to maintain the grip it provides.

To this end, the bezel frame 120 extends in the same length as the longitudinal length of the portable terminal 100 and takes the same shape as both ends of the portable terminal 100.

As a result, the bezel frames 120 can be folded to coincide with both side ends of the portable terminal 100 and provide an area where the portable terminal 100 can be easily gripped, achieving the same grip as the conventional tablet-type portable terminal 10 (see FIG. 1).

In addition, the bezel frame 120 may be provided so that its length in the longitudinal direction is adjustable.

At this time, a detent 140 is preferably further provided between the bezel frame 120 and the rotating shaft 110 to hold the bezel frame 120 at a fixed rotation relative to the portable terminal 100.

Here, as shown in FIG. 3, the detent 140 is preferably placed on the rotary shaft 110 between the portable terminal 100 and the bezel frame 120 and consists of a first detent 141 and a second detent 143; the first detent 141 is fixed to the bezel frame 120, and the second detent 143 is fixed to the side end of the portable terminal 100.

At this time, the first detent 141 is fixed to the bezel frame 120 so as to be pushed toward the portable terminal 100 by the elasticity of the spring 150, so that the toothed surfaces of the first and second detents 141 and 143 engage and their relative rotation is held fixed.

As a result, the bezel frame 120, which rotates relative to the portable terminal 100, can also be held at a fixed rotation.

Accordingly, the bezel frame 120 may be fixed at various rotation angles, both to set the portable terminal 100 at a desired angle and to adjust the later-described sensing area of the sensing means 130 to a predetermined position.
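To make the geometry concrete, the following is a minimal sketch (not taken from the patent) of how the detent-locked rotation angle of a bezel frame could determine where the sensing area sits in front of the terminal. The arm length, sensing reach, and the assumption that the camera looks along the frame's outward normal are all illustrative values and conventions.

```python
import math

def sensing_area_center(rotation_deg, arm_len=0.06, reach=0.25):
    """Rough 2D estimate of the sensing-area center for one bezel frame.

    rotation_deg -- detent-locked angle of the bezel frame relative to the
                    terminal body (0 = folded flat); assumed convention.
    arm_len      -- distance (m) from the rotation shaft 110 to the sensing
                    means 130 along the frame; assumed value.
    reach        -- distance (m) from the sensing means to the middle of its
                    sensing area along the camera axis; assumed value.
    Returns (forward, up): offset in front of and above the terminal plane.
    """
    a = math.radians(rotation_deg)
    # The sensing means rides on the rotating frame ...
    cam = (arm_len * math.cos(a), arm_len * math.sin(a))
    # ... and is taken here to look along the frame's outward normal, so
    # turning the frame sweeps the sensing area toward or away from the user.
    normal = (-math.sin(a), math.cos(a))
    return cam[0] + reach * normal[0], cam[1] + reach * normal[1]

# Locking the detent at 30 degrees would put the sensing area roughly here:
print(sensing_area_center(30.0))
```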

The motion recognition user interface for a portable terminal according to the present invention uses an image-based 3D interface system that recognizes gestures performed in three-dimensional space as six-degree-of-freedom coordinates and uses them to manipulate the GUI of the terminal intuitively. For this purpose, the sensing means 130 may preferably be a 3D camera.
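As a rough, non-authoritative illustration of what such six-degree-of-freedom coordinates from a 3D camera could look like in software, the sketch below models one tracked hand as a position plus an orientation and maps it to a toy GUI event. Every name and threshold here is a hypothetical choice, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class HandPose:
    """Six degrees of freedom for one tracked hand: 3 translation + 3 rotation."""
    x: float
    y: float
    z: float        # position in the camera frame, metres
    roll: float
    pitch: float
    yaw: float      # orientation, radians

def to_gui_event(pose: HandPose):
    """Toy mapping from a pose to a GUI action -- purely illustrative."""
    if pose.z < 0.15:                     # hand pushed forward, past an assumed threshold
        return ("select", pose.x, pose.y)
    return ("hover", pose.x, pose.y)

# Example: a hand hovering 25 cm from the camera
print(to_gui_event(HandPose(0.02, 0.10, 0.25, 0.0, 0.0, 0.0)))
```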

In addition, the sensing means 130 may include an infrared camera that distinguishes the user's two hands from other objects placed in the sensing area and detects actual movement only from the user's hands for manipulation, and may also include a projection display device.

In this case, the projection display device is, in other words, a projector, and may be a pico projector 131 (see FIG. 11) of the kind that has recently been developed in very small sizes and mounted as an image projection device in mobile phones, laptops, PMPs, and the like.

Accordingly, by forming a separate display area outside the screen 11 and recognizing the user's hand gestures there, the UI can be further expanded to provide a variety of interfaces.

The sensing means 130 is mounted between the rotary shaft 110 and one end of each of the pair of bezel frames 120 and is disposed to have a sensing area toward the front of the portable terminal 100.

That is, as shown in FIGS. 4 and 5, the sensing means 130 mounted on each bezel frame 120 faces the front side of the portable terminal 100, and their imaging regions together form, in a space spaced a predetermined distance from the portable terminal 100, a sensing area in which three-dimensional motion can be detected.

Such a sensing area becomes a kind of virtual interface space: hand gestures and the like made there are treated as manipulation commands, and the corresponding operation is actually carried out on the portable terminal 100. As a result, since the manipulation takes place in a predetermined sensing area spaced apart from the portable terminal 100, the screen 11 of the portable terminal 100 is never covered.
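The following is a minimal sketch of one way such a virtual interface space could be modeled in software: an axis-aligned box floating in front of the terminal, with points inside it mapped linearly to screen pixels. The box dimensions, coordinate convention, and screen resolution are assumptions for illustration only.

```python
# Virtual interface space modeled as an axis-aligned box in front of the
# terminal: x = sideways, y = distance from the screen, z = height (metres).
SENSE_MIN = (-0.15, 0.10, 0.05)
SENSE_MAX = ( 0.15, 0.35, 0.25)
SCREEN_W, SCREEN_H = 1280, 800            # hypothetical screen resolution

def in_sensing_area(p):
    """True if the tracked point p = (x, y, z) lies inside the box."""
    return all(lo <= v <= hi for v, lo, hi in zip(p, SENSE_MIN, SENSE_MAX))

def to_screen(p):
    """Map a 3D point in the sensing box to 2D screen pixels, or None."""
    if not in_sensing_area(p):
        return None                       # gestures outside the box are ignored
    u = (p[0] - SENSE_MIN[0]) / (SENSE_MAX[0] - SENSE_MIN[0])
    v = (p[2] - SENSE_MIN[2]) / (SENSE_MAX[2] - SENSE_MIN[2])
    return int(u * SCREEN_W), int((1.0 - v) * SCREEN_H)

print(to_screen((0.0, 0.2, 0.15)))        # a point in the middle of the box
```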

In addition, unlike the fixed camera installed in a conventional portable terminal, the sensing means 130 can place the sensing area in a position convenient for the user by adjusting the rotation of the bezel frame 120, so that a consistent sensing area can be provided regardless of the mounting angle of the portable terminal 100.

Apart from the embodiment described so far, the gesture recognition user interface for a portable terminal according to the present invention is also possible in another embodiment, as shown in FIG. 6.

That is, the portable terminal 100 can be attached to and detached from a case 160 having the rotation shafts 110 at both sides, and the bezel frames 120 are connected to the case 160 so as to rotate relative to it via the rotation shafts 110.

In addition, a connection terminal 170 is provided at an inner lower end of the case 160 to couple with the input/output terminal 17 of the portable terminal 100 and electrically connect the sensing means 130 and the portable terminal 100.

Such a motion recognition user interface for a portable terminal according to the present invention can be utilized as follows.

As shown in FIG. 7, a virtual hand 20 is displayed on the screen 11 of the portable terminal 100 over the image keyboard 13, matching the two-handed typing motion that the user makes in the sensing area, so that the input state of the keyboard can be checked intuitively.

That is, since the movement of the fingers is mirrored on the screen, continuous and smooth keyboard work is possible without the user having to look at his or her own hands.
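As an illustrative sketch only (the key layout, press threshold, and coordinate convention are all assumptions, not the patent's method), typing in the sensing area could be reduced to detecting when a fingertip crosses a virtual "key plane" and picking the nearest key center:

```python
KEY_PLANE_Y = 0.18          # distance from the screen at which a press registers (m); assumed
KEY_CENTERS = {             # a few hypothetical key positions (x, z) in the sensing box
    "f": (-0.02, 0.12),
    "j": ( 0.02, 0.12),
    "space": (0.00, 0.08),
}

def detect_keypress(fingertip, was_pressed):
    """fingertip = (x, y, z); returns (key or None, now_pressed)."""
    x, y, z = fingertip
    pressed = y < KEY_PLANE_Y                       # finger pushed past the key plane
    if pressed and not was_pressed:                 # fire only on the press transition
        key = min(KEY_CENTERS,
                  key=lambda k: (KEY_CENTERS[k][0] - x) ** 2 + (KEY_CENTERS[k][1] - z) ** 2)
        return key, pressed
    return None, pressed

# A fingertip dipping toward the screen over the "j" position:
print(detect_keypress((0.021, 0.17, 0.121), was_pressed=False))
```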

In addition, the displayed image keyboard 13 serves only to show the state of the input, so unlike a conventional touch-screen keyboard that must itself receive the input, it can be drawn at a small size. It can therefore be displayed while occupying a minimum of the screen 11 of the portable terminal 100.

In addition, since typing is performed in open space, no pain occurs at the fingertips, and input can be entered easily using all the fingers, as if striking an actual keyboard.

In addition, when reading an e-book as shown in FIG. 8, the action of turning the page 15 displayed on the screen 11 of the portable terminal 100 can be performed intuitively as a gesture in space.

In addition, when playing a cube puzzle or a car game as shown in FIGS. 9 and 10, the cube 18 or the steering wheel 19 of the car can be operated intuitively by the virtual hand 20 displayed on the screen 11, which follows the motion of the hands in space, so that a more realistic game can be enjoyed.
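Purely as an illustration of how a hand gesture sensed in space could drive the on-screen steering wheel 19, the sketch below maps the roll angle of a tracked hand to a clamped wheel angle; the gain and steering lock are invented values.

```python
import math

MAX_WHEEL_DEG = 90.0      # assumed steering lock of the on-screen wheel
GAIN = 1.5                # assumed sensitivity: hand roll -> wheel angle

def steering_angle(hand_roll_rad):
    """Map the roll of the tracked hands (radians) to a wheel angle (degrees)."""
    wheel = GAIN * math.degrees(hand_roll_rad)
    return max(-MAX_WHEEL_DEG, min(MAX_WHEEL_DEG, wheel))   # clamp to the lock

# Tilting the hands about 20 degrees turns the wheel roughly 30 degrees:
print(steering_angle(math.radians(20)))
```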

In addition, as illustrated in FIG. 11, various manipulations may be performed through another manipulation image 133 projected from the pico projector 131 separately from the screen 11.

The above description covers only a preferred embodiment of the present invention, and various modifications are possible. Any such modification that falls within the technical spirit of the present invention belongs to the scope of the present invention, and that scope can be readily understood by those skilled in the art from the following claims.

10: conventional portable terminal 11: screen
12: bezel 13: image keyboard
15: book page 17: input/output terminal
18: cube 19: steering wheel
20: virtual hand 100: portable terminal
110: rotation shaft 120: bezel frame
130: sensing means 131: pico projector
140: detent 141: first detent
143: second detent 150: spring
160: case 170: connection terminal

Claims (3)

A motion recognition user interface for a portable terminal, comprising: a pair of bezel frames disposed along the longitudinal direction of both sides of the portable terminal, each rotating relative to the portable terminal; and
sensing means respectively mounted to the pair of bezel frames and electrically connected to the portable terminal.
The motion recognition user interface for a portable terminal according to claim 1,
wherein the sensing means is mounted on each bezel frame between the rotary shaft and one end of the frame and is arranged to have a sensing area in front of the portable terminal.
The motion recognition user interface for a portable terminal according to claim 1,
further comprising a detent interposed between the bezel frame and the portable terminal to fix the relative rotation of each of the pair of bezel frames.
KR1020100123368A 2010-12-06 2010-12-06 User interface for handy terminal KR101213021B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020100123368A KR101213021B1 (en) 2010-12-06 2010-12-06 User interface for handy terminal
PCT/KR2011/009346 WO2012077942A2 (en) 2010-12-06 2011-12-05 Motion recognition user interface for portable terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020100123368A KR101213021B1 (en) 2010-12-06 2010-12-06 User interface for handy terminal

Publications (2)

Publication Number Publication Date
KR20120062209A (en) 2012-06-14
KR101213021B1 KR101213021B1 (en) 2012-12-18

Family

ID=46207579

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020100123368A KR101213021B1 (en) 2010-12-06 2010-12-06 User interface for handy terminal

Country Status (2)

Country Link
KR (1) KR101213021B1 (en)
WO (1) WO2012077942A2 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004513416A (en) * 2000-09-07 2004-04-30 カネスタ インコーポレイテッド Pseudo three-dimensional method and apparatus for detecting and locating the interaction between a user object and a virtual transfer device
KR20070009207A (en) * 2005-07-15 2007-01-18 엘지전자 주식회사 Apparatus and method for input holographic keyboard in mobile terminal
KR101153488B1 (en) * 2005-09-13 2012-06-11 주식회사 팬택 Input method for virtual keyboard and communication terminal of enabling the method
KR20080109688A (en) * 2008-09-22 2008-12-17 (주)엘티엠에이피 Step hinge for portable terminals
KR101601268B1 (en) * 2009-05-08 2016-03-08 엘지전자 주식회사 Portable Device and Method for Controlling User Interface Thereof

Also Published As

Publication number Publication date
KR101213021B1 (en) 2012-12-18
WO2012077942A3 (en) 2012-09-07
WO2012077942A2 (en) 2012-06-14

Similar Documents

Publication Publication Date Title
US10534447B2 (en) Multi-surface controller
US10444849B2 (en) Multi-surface controller
US10037052B2 (en) Finger-wearable devices and associated systems
US20210333836A1 (en) Image display device having keyboard and keyboard unit therefor
US10082862B2 (en) Input cueing emmersion system and method
JP5801656B2 (en) Information processing apparatus and information processing method
JP5749043B2 (en) Electronics
US20180150204A1 (en) Switching of active objects in an augmented and/or virtual reality environment
CN102779000B (en) User interaction system and method
EP3224691A1 (en) Removable input/output module with adjustment mechanism
TWI451231B (en) Docking apparatus of mobile product
US6885314B2 (en) Hand-held input device particularly useful as a keyboard
Dhawale et al. Bare-hand 3D gesture input to interactive systems
WO2020216106A1 (en) Wearable computing device and human-computer interaction method
KR20080102503A (en) Mobile device with input units behind the display
US20060227101A1 (en) Hand-held screen-interface device
KR101213021B1 (en) User interface for handy terminal
JPH08305471A (en) Input device
KR101371614B1 (en) User interface for handy terminal
Tiefenbacher et al. [Poster] Touch gestures for improved 3D object manipulation in mobile augmented reality
US20140152628A1 (en) Computer input device for hand-held devices
JP6027182B2 (en) Electronics
Ducher Interaction with augmented reality
TWI485550B (en) Handheld electronic device and operating method thereof
JP3180086U (en) Touch panel type small terminal operation tool

Legal Events

Date Code Title Description
A201 Request for examination
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20151214

Year of fee payment: 4

FPAY Annual fee payment

Payment date: 20170612

Year of fee payment: 5

FPAY Annual fee payment

Payment date: 20171204

Year of fee payment: 6

LAPS Lapse due to unpaid annual fee