KR20140049324A - Method and apparatus for contents display according to handwriting - Google Patents

Method and apparatus for contents display according to handwriting

Info

Publication number
KR20140049324A
Authority
KR
South Korea
Prior art keywords
area
virtual
touch
hand
pen
Prior art date
Application number
KR1020120115450A
Other languages
Korean (ko)
Inventor
윤성진
Original Assignee
삼성전자주식회사
Priority date
Filing date
Publication date
Application filed by 삼성전자주식회사 filed Critical 삼성전자주식회사
Priority to KR1020120115450A
Publication of KR20140049324A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02 - Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/023 - Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/0233 - Character input methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03545 - Pens or stylus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A content display method according to a handwriting input includes: when a touch input occurs, detecting a hand touch signal (a touch signal produced by a body part) and a pen touch signal (a touch signal produced by a touchable input means) to determine the hand touch area and the pen touch area, and setting a hidden area in which previously displayed content, including the determined hand touch and pen touch areas, is covered; and generating a virtual area of a preset size that includes the hidden area, converting the image of the content in the virtual area so that the content image inside the hidden area is located outside the hidden area, and changing the display to the converted content image.

Description

METHOD AND APPARATUS FOR CONTENTS DISPLAY ACCORDING TO HANDWRITING

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a display for displaying information input to a terminal, and more particularly, to a content display method and apparatus according to a handwriting input in a terminal to which a touch-type user interface environment, such as a touch pad or a touch screen, is applied.

In general, a display device refers to a device that visually presents data on a screen, and recently the use of touch screens equipped with a touch panel has become common.

A touch panel refers to an interface device that, without the use of a keyboard, detects the position at which an input means such as a human hand or an object touches a character or a specific location displayed on the screen, and performs specific processing using stored software.

In recent years, devices having a touch screen equipped with a touch panel, such as smartphones, tablet PCs, personal digital assistants (PDAs), and navigation devices, have become readily available.

As described above, terminals with a touch screen offer advantages in input convenience and terminal miniaturization compared to terminals that use keypad input. In addition, for devices of the same size, a terminal with a touch screen can provide a larger display screen than a terminal with a keypad.

Prior art related to displays that present information input on a terminal includes Korean Patent Publication No. 2008-0069330 (title: "Information Display Apparatus and Method"; inventors: Kang Tae-young, Ryu Dong-seok; applicant: Samsung Electronics Co., Ltd.; publication date: July 28, 2008) and Korean Patent Publication No. 2009-0041784 (title: "Variable Display Device and Display Method Thereof"; inventors: Jang Wook, Park Jun-a, Lee Hyun-jung; applicant: Samsung Electronics Co., Ltd.; publication date: April 29, 2009).

Korean Patent Publication No. 2008-0069330 relates to an information display apparatus and method, and discloses a method and apparatus in which, when a touch is detected on a touch panel and there is information in the contacted area, the information is displayed at a position outside the detected area. Korean Patent Publication No. 2009-0041784 relates to a variable display device and a display method thereof, and discloses an apparatus that detects the position of a contacting finger, extracts the finger contact area, and then rearranges the image hidden within the finger contact area.

Although these prior arts disclose methods and apparatuses for relocating the information of an area hidden by a touch so that the user can identify it, applying such relocation to handwriting input performed by touch can make checking the hidden information inconvenient. For example, when text whose contents are linked back and forth is displayed during handwriting input, relocating only the hidden portion to a new area does not place the entire previously written text in that new area, so the user cannot continuously check what was previously written. In addition, when the touch position moves in a direction according to the handwriting input, that is, when the touch position moves continuously at a constant speed, the writing in the hidden area must be continuously relocated, which can cause confusion rather than convenience when identifying the hidden information. In other words, the prior art's relocation of hidden information, whose original purpose is to allow easy confirmation of the information covered during handwriting input, falls short of that purpose and can inconvenience the user when checking the information of the covered area.

Accordingly, an aspect of the present invention is to provide a content display method and apparatus according to a handwriting input that allows a user to easily input handwriting while checking the information of the area covered by the touch during handwriting input.

In order to achieve the above object, according to one aspect of the invention, a content display method according to a handwriting input includes: when a touch input occurs, detecting a hand touch signal (a touch signal by a body part) and a pen touch signal (a touch signal by a touchable input means) to determine the hand touch area and the pen touch area, and setting a hidden area in which previously displayed content, including the determined hand touch and pen touch areas, is covered; and generating a virtual area of a preset size that includes the hidden area, converting the image of the content in the virtual area so that the content image inside the hidden area is located outside the hidden area, and changing the display to the converted content image.

According to another aspect of the present invention, a content display apparatus according to a handwriting input comprises: an input/output module including buttons for receiving a user's operation and a physical or virtual keypad; a touch screen that receives a user's manipulation and displays an execution image, an operation state, and a menu state of an application program; and a controller that collectively controls these functional units and includes an area setting unit and an image changing unit. When a touch input occurs on the touch screen, the area setting unit detects a hand touch signal (a touch signal by a body part) and a pen touch signal (a touch signal by a touchable input means) to determine the hand touch area and the pen touch area, and sets a hidden area in which content previously displayed on the touch screen, including the determined hand touch and pen touch areas, is covered. The image changing unit generates a virtual area of a preset size that includes the hidden area set by the area setting unit, converts the image of the content in the virtual area, and changes the display to the converted content image.

As described above, with the content display method and apparatus according to the handwriting input of the present invention, the content displayed in the hidden area is converted into an image and displayed outside the hidden area. Accordingly, the user can easily input handwriting while checking the area covered by the touch during handwriting input.

FIG. 1 is a block diagram of a portable terminal having a content display device according to a handwriting input according to an embodiment of the present invention.
FIG. 2 is a flowchart of a content display operation according to a handwriting input according to an embodiment of the present invention.
FIG. 3 is an exemplary diagram illustrating the area touched by a stylus pen and the area touched by a hand during a touch input using a stylus pen on a touch screen according to an embodiment of the present invention.
FIG. 4 is an exemplary diagram showing a pen touch area, a hand touch area, and a hidden area during a touch input using a stylus pen on a touch screen according to an embodiment of the present invention.
FIG. 5 is an exemplary diagram illustrating the area covered according to the writing progress direction when writing on a touch screen according to an embodiment of the present invention.
FIG. 6 is an exemplary diagram showing a virtual area that includes the area covered by a touch input on a touch screen according to an embodiment of the present invention, and a changed area created by transforming the virtual area.
FIG. 7 is an exemplary diagram showing content of an area covered by a touch input displayed outside the covered area by applying a content display operation according to a handwriting input according to an embodiment of the present invention.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. It will be apparent to those of ordinary skill in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention as defined by the appended claims.

FIG. 1 is a block diagram of a smartphone, among portable terminals, with a content display device according to a handwriting input according to an embodiment of the present invention. Referring to FIG. 1, the portable terminal 100 includes a display unit 190 and a display controller 195. In addition, the portable terminal 100 may include a control unit 110, a mobile communication module 120, a sub-communication module 130, a multimedia module 140, a camera module 150, a GPS module 155, an input/output module 160, a sensor module 170, a storage unit 175, and a power supply unit 180. The sub-communication module 130 includes at least one of a wireless LAN module 131 and a short-range communication module 132, and the multimedia module 140 includes at least one of a broadcast communication module 141, an audio play module 142, and a video play module 143. The camera module 150 includes at least one of a first camera 151 and a second camera 152. Hereinafter, the display unit 190 and the display controller 195 are referred to as a touch screen and a touch screen controller, respectively.

The controller 110 may include a CPU 111, a ROM 112 in which a control program for controlling the portable terminal 100 is stored, and a RAM 113 that stores signals or data input from outside the portable terminal 100 or serves as a storage area for tasks performed by the portable terminal 100. The CPU 111 may include a single core, a dual core, a triple core, or a quad core. The CPU 111, the ROM 112, and the RAM 113 may be interconnected via an internal bus.

The control unit 110 may control the mobile communication module 120, the sub-communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the input/output module 160, the sensor module 170, the storage unit 175, the power supply unit 180, the touch screen 190, and the touch screen controller 195. In addition, the controller 110 includes an area setting unit that, when a touch input occurs on the touch screen 190, detects a hand touch signal (a touch signal by a body part) and a pen touch signal (a touch signal by a touchable input means), determines the hand touch area and the pen touch area, and sets a hidden area in which content previously displayed on the touch screen 190, including the determined hand touch and pen touch areas, is covered. When the area setting unit detects the hand touch signal and the pen touch signal, each signal can be distinguished by determining whether the input from the touch screen 190 was made by the user's body (e.g., a finger including a thumb) or by a touchable input means (e.g., a stylus pen). The controller may also include an image changing unit that generates a virtual area of a preset size including the hidden area set by the area setting unit, converts the image of the content in the virtual area, and changes the display of the converted content image.

The mobile communication module 120 allows the portable terminal 100 to be connected to an external device using at least one or a plurality of antennas (not shown) under the control of the controller 110. The mobile communication module 120 transmits and receives radio signals for voice calls, video calls, text messages (SMS), or multimedia messages (MMS) with a mobile phone (not shown), a smartphone (not shown), a tablet PC, or another device (not shown) having a phone number input to the portable terminal 100.

The sub-communication module 130 may include at least one of the wireless LAN module 131 and the short-range communication module 132.

The wireless LAN module 131 may be connected to the Internet at a place where a wireless access point (AP) (not shown) is installed, under the control of the controller 110. The wireless LAN module 131 supports the IEEE 802.11x wireless LAN standard of the Institute of Electrical and Electronics Engineers (IEEE). The short-range communication module 132 may wirelessly perform short-range communication between the portable terminal 100 and an image forming apparatus (not shown) under the control of the controller 110.

The portable terminal 100 may include at least one of the mobile communication module 120, the wireless LAN module 131, and the short range communication module 132 according to performance. For example, the portable terminal 100 may include a combination of the mobile communication module 120, the wireless LAN module 131, and the short range communication module 132 according to performance.

The multimedia module 140 may include the audio play module 142 and the video play module 143, excluding the broadcast communication module 141. The audio play module 142 or the video play module 143 of the multimedia module 140 may also be included in the controller 110.

The camera module 150 may include at least one of a first camera 151 and a second camera 152 for capturing still images or moving images under the control of the controller 110.

The GPS module 155 receives radio waves from a plurality of GPS satellites (not shown) in Earth orbit, and can calculate the position of the portable terminal 100 using the time of arrival of the radio waves from the GPS satellites to the portable terminal 100.
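The position calculation mentioned here rests on the standard GPS principle that a signal's travel time encodes the distance to each satellite, with the terminal's position then obtained by trilateration over several such distances. The snippet below is only a generic illustration of that principle, not the patent's own method; the constant and function names are ours.

```python
# Speed of light in vacuum (m/s): GPS radio waves travel at c, so the
# travel time from a satellite implies the distance to that satellite.
C = 299_792_458.0

def pseudorange(t_transmit_s: float, t_arrive_s: float) -> float:
    """Distance in metres implied by one satellite's signal travel time."""
    return C * (t_arrive_s - t_transmit_s)
```

Given pseudoranges to at least four satellites of known positions, the receiver solves for its three position coordinates plus its clock bias.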

The input / output module 160 includes at least one of a plurality of buttons 161, a microphone 162, a speaker 163, a vibration motor 164, a connector 165, a keypad 166, and an earphone connecting jack 167. It may include.

The button 161 may be formed on the front, side, or rear of the housing of the portable terminal 100, and may include at least one of a power/lock button (not shown), a volume button (not shown), a menu button, a home button, a back button, and a search button.

The microphone 162 receives a voice or a sound under the control of the controller 110 and generates an electrical signal.

The speaker 163 may output, to the outside, sounds corresponding to various signals of the mobile communication module 120, the sub-communication module 130, the multimedia module 140, or the camera module 150 under the control of the controller 110. The speaker 163 may also output a sound corresponding to a function performed by the portable terminal 100. One or more speakers 163 may be formed at an appropriate location or locations of the housing of the portable terminal 100.

The vibration motor 164 can convert an electrical signal into a mechanical vibration under the control of the control unit 110. For example, when the portable terminal 100 in vibration mode receives a voice call from another device (not shown), the vibration motor 164 operates. One or more vibration motors may be formed in the housing of the portable terminal 100. The vibration motor 164 may operate in response to a user's touch operation on the touch screen 190 and the continuous movement of a touch on the touch screen 190.

The connector 165 may be used as an interface for connecting the portable terminal 100 to an external device (not shown) or a power source (not shown). Under the control of the controller 110, the portable terminal 100 may transmit data stored in the storage unit 175 to an external device (not shown), or receive data from an external device (not shown), through a wired cable connected to the connector 165. In addition, the portable terminal 100 may receive power from a power source (not shown) through a wired cable connected to the connector 165, or charge a battery (not shown) using that power source.

The keypad 166 receives a key input from the user for the control of the portable terminal 100. The keypad 166 includes a physical keypad (not shown) formed on the portable terminal 100 or a virtual keypad (not shown) displayed on the touch screen 190. The physical keypad formed on the portable terminal 100 may be excluded according to the performance or structure of the portable terminal 100.

An earphone may be inserted into the earphone connecting jack 167 to be connected to the portable terminal 100.

The sensor module 170 may include at least one sensor that detects a state of the portable terminal 100, and may generate a signal corresponding to the detection and transmit the signal to the controller 110.

The storage unit 175 may store, under the control of the controller 110, signals or data input and output corresponding to the operations of the mobile communication module 120, the sub-communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the input/output module 160, the sensor module 170, and the touch screen 190. The storage unit 175 may store a control program and applications for controlling the portable terminal 100 or the controller 110.

The term "storage unit" includes the storage unit 175, the ROM 112, the RAM 113, or a memory card (e.g., an SD card or a memory stick) mounted in the portable terminal 100. The storage unit may include a nonvolatile memory, a volatile memory, a hard disk drive (HDD), or a solid state drive (SSD).

The power supply unit 180 may supply power to one or a plurality of batteries (not shown) disposed in the housing of the portable terminal 100 under the control of the controller 110. The one or more batteries (not shown) supply power to the portable terminal 100. In addition, the power supply unit 180 may supply power input from an external power source (not shown) to the portable terminal 100 through a wired cable connected to the connector 165, or supply power wirelessly input from an external power source to the portable terminal 100 through wireless charging technology.

The touch screen 190 receives a user's manipulation and displays an execution image, an operation state, and a menu state of an application program.

The touch screen 190 may provide a user interface corresponding to various services (eg, a call, data transmission, broadcasting, and photographing). The touch screen 190 may transmit an analog signal corresponding to at least one touch input to the user interface to the touch screen controller 195. The touch screen 190 may receive at least one touch through a user's body (eg, a finger including a thumb) or a touchable input means (eg, a stylus pen). Also, the touch screen 190 can receive a continuous movement of one touch among at least one touch. The touch screen 190 may transmit an analog signal corresponding to the continuous movement of the input touch to the touch screen controller 195.

The touch screen may be implemented by, for example, a resistive method, a capacitive method, an infrared method, or an acoustic wave method.

Meanwhile, the touch screen controller 195 converts an analog signal received from the touch screen 190 into a digital signal (e.g., X and Y coordinates) and transmits it to the controller 110. The controller 110 may control the touch screen by using the digital signal received from the touch screen controller 195. Also, the touch screen controller 195 may be included in the control unit 110.

In addition, the touch screen 190 may include at least two touch screen panels (not shown) capable of sensing the touch or proximity of the user's body and of the touchable input means, respectively, so that inputs by the user's body and inputs by the touchable input means can be received separately. The at least two touch screen panels provide different output values to the touch screen controller 195, and the touch screen controller 195 recognizes the values input from the at least two touch screen panels differently, thereby distinguishing whether an input from the touch screen is an input by the user's body or an input by the touchable input means.
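The separation described above can be sketched in Python as follows. This is a minimal illustration, not the disclosed implementation: the panel ids, event tuples, and function names are our own assumptions (real devices typically report the source through the input driver, e.g. a capacitive panel for the body and a digitizer panel for the stylus).

```python
def classify_touch(panel_id: int) -> str:
    # Illustrative convention: panel 0 senses the user's body,
    # panel 1 senses the touchable input means (stylus pen).
    return "hand" if panel_id == 0 else "pen"

def split_touch_events(events):
    """Group raw (panel_id, x, y) samples into hand-touch and pen-touch points,
    mirroring how the controller distinguishes the two signal sources."""
    groups = {"hand": [], "pen": []}
    for panel_id, x, y in events:
        groups[classify_touch(panel_id)].append((x, y))
    return groups
```

The two point lists then feed the area setting unit, which derives the hand touch area and pen touch area from them.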

The content display method according to the handwriting input of the present invention can be largely divided into two processes. The first process sets the hidden area that is covered when a touch input occurs on the touch screen 190. That is, when a touch input occurs on the touch screen 190, each region is determined by detecting the touch signal by a body part and the touch signal by a touchable input means. Here, a touch by a body part is defined as a hand touch, a touch by a touchable input means is defined as a pen touch, and the areas corresponding to the detected touch signals are determined as the hand touch area and the pen touch area. Subsequently, a hidden area in which the content previously displayed on the touch screen 190, including the determined hand touch and pen touch areas, is covered is set. The second process, performed after the first, converts the content image so that the content image inside the hidden area can be displayed outside the hidden area. That is, a virtual area of a preset size including the hidden area set in the first process is generated, the image of the content in the virtual area is converted so that the content image in the hidden area is located outside the hidden area, and the display is changed to the converted content image.
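The two processes can be sketched with simple rectangle arithmetic. This is a minimal sketch under our own assumptions (axis-aligned rectangles, fixed margins); the class and function names are hypothetical, and the actual image warp inside the virtual area is not shown.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def expand(self, m: float) -> "Rect":
        # Grow the rectangle by margin m on every side.
        return Rect(self.x - m, self.y - m, self.w + 2 * m, self.h + 2 * m)

    def union(self, other: "Rect") -> "Rect":
        # Smallest rectangle containing both rectangles.
        x1, y1 = min(self.x, other.x), min(self.y, other.y)
        x2 = max(self.x + self.w, other.x + other.w)
        y2 = max(self.y + self.h, other.y + other.h)
        return Rect(x1, y1, x2 - x1, y2 - y1)

def set_hidden_area(pen: Rect, hand: Rect, margin: float) -> Rect:
    """Process 1: extend each touch area by a preset size, then merge them
    into the single hidden area (a3) covering the displayed content."""
    return pen.expand(margin).union(hand.expand(margin))

def make_virtual_area(hidden: Rect, extra: float) -> Rect:
    """Process 2, first step: a virtual area (a4) of preset size enclosing a3;
    the content image inside a3 is then warped to land outside a3 but inside a4."""
    return hidden.expand(extra)
```

A compositor would then remap the content pixels of the hidden area into the ring between a3 and a4, which is the display change the method describes.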

Hereinafter, the case where the touch by a body part is made by a hand and the touchable input means is a stylus pen will be described as an example.

FIG. 2 is a flowchart illustrating a content display operation according to a handwriting input according to an embodiment of the present invention. Referring to FIG. 2, first, when a touch input is generated by the user's hand and a stylus pen in step 200, the touch signal is detected in step 202. When the touch signal is detected in step 202, the touch input from the touch screen 190 can be distinguished as a hand touch input by the user's hand or a pen touch input by the stylus pen.

After step 202, the hand touch area and the pen touch area are determined in step 204. Referring to FIG. 3 or FIG. 4, as an embodiment, the hand touch area a2 and the pen touch area a1 can be identified. In this case, the hand touch area a2 and the pen touch area a1 may be areas set by extending the actually recognized areas by a predetermined size. In addition, in order to distinguish them from the images of other content displayed on the touch screen 190, a contrast may be applied to the hand touch area a2 and the pen touch area a1, or they may be displayed in different colors. The hand touch area a2 and the pen touch area a1 may also be recognized through a surface touch or hovering.

Subsequently, in step 206, a hidden area covered by the pen touch area a1 and the hand touch area a2 is set. Referring to FIG. 4, the hidden area a3 is the region made invisible by the touch of the user of the portable terminal 100, and is a larger area that includes the pen touch area a1 and the hand touch area a2. That is, the hidden area a3 is generated by extending the hand touch area a2 and the pen touch area a1 by a preset size: one new area including both a2 and a1 is created, and the generated area is set as the hidden area a3 in which the displayed content is covered.
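The construction of the hidden area a3 in step 206, one new area containing both touch areas and extended by a preset size, can be sketched as the inflated bounding box of the two rectangles. A hypothetical sketch; the tuple-based rectangle representation and the margin value are assumptions.

```python
# A rectangle is (left, top, right, bottom) in screen pixels.

def bounding_box(a, b):
    # Smallest rectangle containing both touch areas.
    return (min(a[0], b[0]), min(a[1], b[1]),
            max(a[2], b[2]), max(a[3], b[3]))

def hidden_area(pen_area, hand_area, margin=20):
    # Step 206: one new area including a1 and a2, extended outward
    # by a preset size, becomes the hidden area a3.
    l, t, r, b = bounding_box(pen_area, hand_area)
    return (l - margin, t - margin, r + margin, b + margin)

a1 = (100, 100, 120, 120)           # pen touch area
a2 = (130, 110, 220, 200)           # hand touch area
a3 = hidden_area(a1, a2)            # -> (80, 80, 240, 220)
```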

Thereafter, in step 208, it is determined whether there is a handwriting input from the user. If there is a handwriting input, the operation of displaying the input handwriting in step 209 is performed; if there is no handwriting input, the operation of step 215, which generates a virtual area a4 of a predetermined size including the hidden area a3, is performed.

After step 208, when the input handwriting is displayed in step 209, the writing progress direction is determined in step 210 according to the direction of the handwriting input by the user.

Thereafter, in step 211, whether the hand holding the pen is the left hand or the right hand is determined from the pen touch and hand touch areas determined in step 204. This determination is made by comparing the positions of the hand touch and pen touch areas: if the pen touch area is to the left of the hand touch area, the hand holding the user's pen is determined to be the right hand, and if the pen touch area is to the right of the hand touch area, the hand holding the pen is determined to be the left hand.
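The handedness determination of step 211 reduces to a horizontal comparison of the two areas. A minimal sketch under the assumption that each area is a (left, top, right, bottom) tuple and that comparing horizontal centers is an acceptable way to compare positions:

```python
def holding_hand(pen_area, hand_area):
    # Step 211: compare the horizontal positions of the pen touch
    # area and the hand touch area (each (left, top, right, bottom)).
    pen_cx = (pen_area[0] + pen_area[2]) / 2
    hand_cx = (hand_area[0] + hand_area[2]) / 2
    # Pen to the left of the hand -> the right hand holds the pen;
    # pen to the right of the hand -> the left hand holds the pen.
    return "right" if pen_cx < hand_cx else "left"

print(holding_hand((100, 100, 120, 120), (130, 110, 220, 200)))
```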

Subsequently, in step 213, based on the writing direction determined in step 210 and the handedness determined in step 211, it is determined whether the writing direction is rightward when the hand holding the pen is the left hand, or leftward when the hand holding the pen is the right hand. Referring to FIG. 5, a hidden area a3 is generated when the hand holding the pen is the left hand and the writing progress direction is rightward, as shown in FIG. 5A, or when the hand holding the pen is the right hand and the writing progress direction is leftward, as shown in FIG. 5B. In other words, step 213 determines whether the writing direction and the hand holding the user's pen produce a hidden area a3.

If, in step 213, the hand holding the pen is the left hand and the writing progress direction is rightward, or the hand holding the pen is the right hand and the writing progress direction is leftward, the process proceeds to the operation of step 215. Otherwise, that is, if the writing progress direction is not rightward for the left hand, or not leftward for the right hand, the touch input end determination of step 219 is performed.
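The branch of step 213, where occlusion arises only for the (left hand, rightward writing) and (right hand, leftward writing) combinations, can be expressed as a small predicate. A sketch, with handedness and direction encoded as illustrative strings:

```python
def occlusion_expected(hand, writing_direction):
    # Step 213: a hidden area arises when the left hand writes to
    # the right, or the right hand writes to the left; in the other
    # two combinations the hand trails the pen and hides nothing.
    return ((hand == "left" and writing_direction == "right") or
            (hand == "right" and writing_direction == "left"))
```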

After step 213, in step 215, a virtual area of a preset size including the hidden area a3 is created. Step 215 is described in detail with reference to FIG. 6A: a new virtual area a4 is created by extending the hidden area a3 by a predetermined size. The virtual area a4 is a part of the entire display area of the touch screen 190 of the portable terminal 100 and delimits the portion of the displayed image that is to be changed. The shape of the virtual area a4 may be a rectangle as shown in FIG. 6A, or may be variously set to a circle, an ellipse, or the like.

Subsequently, in step 217, a change area is generated by setting a predetermined region of the entire display area excluding the hidden area a3, and the content image inside the virtual area a4 is converted and displayed in the set change area. Here, the change area is a part of the entire display area of the touch screen of the portable terminal 100; it is a newly created area, derived by changing the shape of the virtual area a4, that does not include the hidden area a3, and is the area in which the converted content image from the virtual area a4 is displayed.

Describing step 217 in detail, first, the virtual area a4 generated in step 215 is divided into two regions, a first virtual area and a second virtual area. The reference for dividing the virtual area a4 may be preset, for example to the center of the pen area or the center of the virtual area a4, and the dividing direction may be set to vertical or horizontal in consideration of the writing direction. After the division into the first virtual area and the second virtual area, the change areas are generated by modifying parts of the divided virtual area a4. Referring to FIG. 6B, the change area is generated as follows: in the first virtual area, the hidden area inside the first virtual area is excluded, and an area having the same or a similar shape as that hidden area is generated outside the first virtual area, beyond the position facing the dividing line; this area is determined as the first change area a5. Likewise, in the second virtual area, the hidden area inside the second virtual area is excluded, and an area having the same or a similar shape as that hidden area is generated outside the second virtual area, beyond the position facing the dividing line, and is determined as the second change area a6.
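The FIG. 6B construction can be approximated for a vertical dividing line as follows. This is an interpretation offered only as a sketch: the mirror-style placement of each change area outside its half, away from the dividing line, is an assumption, not the exact patented geometry, and rectangles are illustrative (left, top, right, bottom) tuples.

```python
def split_vertical(virtual, x):
    # Divide the virtual area a4 at x into the first and second
    # virtual areas (left half, right half).
    l, t, r, b = virtual
    return (l, t, x, b), (x, t, r, b)

def change_area(half, hidden_part, side):
    # Place an area of the same size as the hidden part of this
    # half outside the half, on the side away from the dividing line.
    hl, ht, hr, hb = hidden_part
    w = hr - hl
    if side == "left":          # first virtual area: move further left
        return (half[0] - w, ht, half[0], hb)
    else:                       # second virtual area: move further right
        return (half[2], ht, half[2] + w, hb)

first, second = split_vertical((100, 100, 300, 200), x=200)
a5 = change_area(first, (180, 120, 200, 180), "left")    # left of a4
a6 = change_area(second, (200, 120, 230, 180), "right")  # right of a4
```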

After the change areas are generated, the content image in the virtual area is converted and displayed in the change areas. That is, the content image in the first virtual area is converted so as to be displayed in the first change area a5, and the converted image is displayed in the first change area a5. Likewise, the content image in the second virtual area is converted so as to be displayed in the second change area a6, and the converted image is displayed in the second change area a6. Taking FIGS. 7A and 7B as an example, the content image conversion in the virtual area is an image conversion that rearranges the text in the virtual area so that it is placed outside the hidden area. In this conversion, the character size and/or the line spacing may be kept unchanged, or conversely the character size and/or line spacing of the text may be changed. When the character size is changed, a different rate of change may be applied to each line of text, and when the line spacing is changed, the rate of change of the line spacing may likewise differ per line. The slope of the characters may also be varied. In addition, various image transformations such as reduction, translation, enlargement, and tilt may be applied with either a constant ratio or a non-uniform ratio.
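The per-line change of character size and line spacing described above can be sketched as a simple reflow pass. A hypothetical sketch: the line representation (text, size, y-position), the per-line scale factors, and the spacing value are all illustrative assumptions.

```python
def reflow_lines(lines, scales, spacing):
    # Each line is (text, character_size, y_position). Apply a
    # per-line character-size scale factor, then re-stack the lines
    # with a new line spacing, as the conversion may change size and
    # spacing differently for each line.
    out, y = [], 0
    for (text, size, _), s in zip(lines, scales):
        new_size = size * s
        out.append((text, new_size, y))
        y += new_size + spacing
    return out

lines = [("hello", 10, 0), ("world", 10, 14)]
print(reflow_lines(lines, [0.8, 1.0], spacing=2))
```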

Alternatively, the change areas in step 217 may be generated in the form shown in FIG. 6C. Referring to FIG. 6C, the first change area a5 is generated by excluding, from the first virtual area, the hidden area inside the first virtual area, and the second change area a6 is generated by excluding, from the second virtual area, the hidden area inside the second virtual area. The shape of the change area may also be variously set, for example to a wave shape or a donut shape. Furthermore, the change area may be preset to be displayed with a different contrast or in a different color.

After step 217, it is determined in step 219 whether the touch input has ended. If the touch input has ended, the content display operation according to the handwriting input of the present invention is terminated. If the touch input continues, the process returns to the touch signal detection of step 202 and repeats the operations from step 202 onward.

Steps 208 to 213 may be optionally included. If no hidden area a3 arises given the direction of the handwriting input, the process of converting the content image need not be executed. This is the case when the hand holding the pen is the right hand and the writing progress direction is rightward, or the hand holding the pen is the left hand and the writing progress direction is leftward.

In addition, the generation of the change area in step 217 may be omitted: without generating a change area, the content image inside the virtual area may be converted and displayed outside the hidden area a3 so as to overlap the previously displayed content image. In this case, the background of the converted content image should be made translucent, or given a distinguishing color, so that it can be told apart from the previously displayed content image.

It will be appreciated that embodiments of the present invention may be implemented in hardware, software, or a combination of hardware and software. Such software may be stored, whether or not erasable or re-recordable, in a volatile or non-volatile storage device such as a ROM, in a memory such as a RAM, a memory chip, a device, or an integrated circuit, or in a storage medium that is optically or magnetically recordable and machine-readable (e.g., computer-readable), such as a CD, a DVD, a magnetic disk, or a magnetic tape. It will be appreciated that the memory that may be included in the portable terminal is an example of a machine-readable storage medium suitable for storing a program or programs containing instructions that implement the embodiments of the present invention. Accordingly, the present invention includes a program comprising code for implementing the apparatus or method set forth in any claim of this specification, and a machine-readable storage medium storing such a program. Moreover, such a program may be transferred electronically through any medium, such as a communication signal transmitted via a wired or wireless connection, and the present invention appropriately includes equivalents thereof.

The configuration and operation according to an embodiment of the present invention can be realized as described above. While specific embodiments have been described, various modifications can be made without departing from the scope of the present invention. For example, although the above embodiment describes the case where the touch by a body part is a hand and the touchable input means is a stylus pen, the hand touch may include any touch by the user's body, and the pen touch may include any touch by a touchable input means. Also, although the conversion of the content image was described for the case where the content is text, various other images may be converted in the same manner. Furthermore, a menu or icon may be provided for selecting whether to apply the content display operation according to the handwriting input of the present invention, so that the hidden content image is converted only when the user selects the menu or icon. Therefore, the scope of the present invention should not be defined by the described embodiments, but by the claims and their equivalents.

Claims (14)

In the content display method according to the handwriting input,
When a touch input occurs, detecting a hand touch signal, which is a touch signal by a body part, and a pen touch signal, which is a touch signal by a touchable input means, determining the hand touch and pen touch areas, and setting a hidden area in which previously displayed content including the determined hand touch and pen touch areas is covered; and
Generating a virtual area having a preset size including the hidden area, converting the image of the content in the virtual area so that the content image in the hidden area is located outside the hidden area, and changing the display to the converted content image.
The method of claim 1, wherein the setting of the blind area comprises:
Generating an area having a predetermined size including the hand touch and pen touch areas, and setting the generated area as the hidden area in which the displayed content is covered.
The method of claim 1, wherein the setting of the blind area comprises:
Displaying the input writing and determining the writing progress direction according to the writing progress direction of the input writing;
Determining whether the hand holding the user's pen is a left hand or a right hand according to the determination result of the hand touch and the pen touch area;
And determining whether or not to execute the process of changing the display of the content image according to a result of the determining of the writing direction and determining whether the hand holding the pen is a left hand or a right hand. Content display method according to the handwriting input, characterized in that.
The method of claim 3, wherein the determining of whether the hand holding the user's pen is the left hand or the right hand comprises:
Comparing the positions of the determined hand touch and pen touch areas, determining that the hand holding the user's pen is the right hand if the pen touch area is to the left of the hand touch area, and determining that the hand holding the user's pen is the left hand if the pen touch area is to the right of the hand touch area.
The method of claim 3, wherein the determining of whether to change the display of the content image is performed.
Changing the display of the content image when the determined writing direction is right when the hand holding the pen is the left hand or when the determined writing direction is left when the hand holding the pen is the right hand ,
Changing the display of the content image when the determined writing direction is left when the hand holding the pen is the left hand or when the determined writing direction is right when the hand holding the pen is the right hand Content display method according to the handwriting input, characterized in that it does not.
The method of claim 1, wherein the changing of the display of the content image comprises:
Dividing the virtual region into two regions and dividing the virtual region into a first virtual region and a second virtual region;
Excluding the hidden area inside the first virtual area from the first virtual area, generating an area having the same or a similar shape as the hidden area inside the first virtual area at a position outside the first virtual area facing the dividing line, and determining it as the first change area; and excluding the hidden area inside the second virtual area from the second virtual area, generating an area having the same or a similar shape as the hidden area inside the second virtual area at a position outside the second virtual area facing the dividing line, and determining it as the second change area,
Converting the content image in the first virtual area and displaying the content image in the first change area, and converting the content image in the second virtual area and displaying the content image in the second change area. Content display method according to.
The method of claim 1, wherein the changing of the display of the content image comprises:
Dividing the virtual region into two regions and dividing the virtual region into a first virtual region and a second virtual region;
Generating a first change area by excluding, from the first virtual area, the hidden area inside the first virtual area, and generating a second change area by excluding, from the second virtual area, the hidden area inside the second virtual area,
Converting the content image in the first virtual area and displaying the content image in the first change area, and converting the content image in the second virtual area and displaying the content image in the second change area. Content display method according to.
In the content display device according to the handwriting input,
An input / output module including a button for receiving a user's manipulation and a physical or virtual keypad;
A touch screen that receives a user's manipulation and displays an execution image, an operation state, and a menu state of an application program;
A controller which, when a touch input occurs on the touch screen, collectively controls the functional units, and which includes: an area setting unit configured to detect a hand touch signal, which is a touch signal by a body part, and a pen touch signal, which is a touch signal by a touchable input means, to determine the hand touch and pen touch areas, and to set a hidden area in which content previously displayed on the touch screen including the determined hand touch and pen touch areas is covered; and an image changing unit configured to generate a virtual area having a preset size including the hidden area set by the area setting unit, to convert the image of the content in the virtual area, and to change the display to the converted content image.
The method of claim 8, wherein the area setting unit,
And generating an area having a predetermined size including an area of the hand touch and a pen touch, and setting the generated area as an area covered by the displayed content. Content display device.
The method of claim 8, wherein the area setting unit,
Displaying the input writing and determining the writing progress direction according to the writing progress direction of the input writing;
Determining whether the hand holding the pen of the user is the left hand or the right hand according to a result of the operation of determining the hand touch and the pen touch area;
And determining whether or not to execute the operation of the image changer according to the determining of the writing direction and the determining whether the hand holding the pen is a left hand or a right hand. Content display device according to the handwriting input.
The method of claim 10, wherein the determining whether the hand holding the pen of the user is a left hand or a right hand is performed.
Comparing the positions of the determined hand touch and pen touch area, if the pen touch area is to the left of the hand touch area, it is determined that the hand holding the user's pen is the right hand, and the pen touch area is the hand touch area. And determining that the hand holding the user's pen is the left hand if it is at the right side of the screen.
The method of claim 10, wherein the determining of the execution of the operation of the image changing unit comprises:
If the determined handwriting direction is right when the hand holding the pen of the user is the left hand, or the determined handwriting direction is left when the hand holding the pen of the user is the right hand, executing the image changing unit;
And when the determined writing direction is left when the hand holding the pen is the left hand or when the determined writing direction is right when the hand holding the pen is the right hand, the operation of the image changing unit is not executed. Content display device according to the handwriting input, characterized in that.
The method of claim 8, wherein the operation of the image changing unit,
Dividing the virtual region into two regions and dividing the virtual region into a first virtual region and a second virtual region;
Excluding the hidden area inside the first virtual area from the first virtual area, generating an area having the same or a similar shape as the hidden area inside the first virtual area at a position outside the first virtual area facing the dividing line, and determining it as the first change area; and excluding the hidden area inside the second virtual area from the second virtual area, generating an area having the same or a similar shape as the hidden area inside the second virtual area at a position outside the second virtual area facing the dividing line, and determining it as the second change area,
Converting the content image in the first virtual area and displaying the content image in the first change area, and converting the content image in the second virtual area and displaying the content image in the second change area. Content display device according to.
The method of claim 8, wherein the operation of the image changing unit,
Dividing the virtual region into two regions and dividing the virtual region into a first virtual region and a second virtual region;
Generating a first change area by excluding, from the first virtual area, the hidden area inside the first virtual area, and generating a second change area by excluding, from the second virtual area, the hidden area inside the second virtual area,
Converting the content image in the first virtual area and displaying the content image in the first change area, and converting the content image in the second virtual area and displaying the content image in the second change area. Content display device according to.
KR1020120115450A 2012-10-17 2012-10-17 Method and apparatus for contents display according to handwriting KR20140049324A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120115450A KR20140049324A (en) 2012-10-17 2012-10-17 Method and apparatus for contents display according to handwriting

Publications (1)

Publication Number Publication Date
KR20140049324A true KR20140049324A (en) 2014-04-25

Family

ID=50654962

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120115450A KR20140049324A (en) 2012-10-17 2012-10-17 Method and apparatus for contents display according to handwriting

Country Status (1)

Country Link
KR (1) KR20140049324A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017116216A1 (en) * 2015-12-31 2017-07-06 삼성전자 주식회사 Method for displaying contents on basis of smart desktop and smart terminal
CN106933465A (en) * 2015-12-31 2017-07-07 北京三星通信技术研究有限公司 A kind of content display method and intelligence desktop terminal based on intelligence desktop
CN106933465B (en) * 2015-12-31 2021-01-15 北京三星通信技术研究有限公司 Content display method based on intelligent desktop and intelligent desktop terminal
US11221745B2 (en) 2015-12-31 2022-01-11 Samsung Electronics Co., Ltd. Method for displaying contents on basis of smart desktop and smart terminal
CN107977112A (en) * 2016-10-25 2018-05-01 乐金显示有限公司 Touch display unit, active pen, touch system, touch circuit and stroke recognition method
US10963107B2 (en) 2016-10-25 2021-03-30 Lg Display Co., Ltd. Touch display device, active pen, touch system, touch circuit, and pen recognition method
CN107977112B (en) * 2016-10-25 2021-07-16 乐金显示有限公司 Touch display device, active pen, touch system, touch circuit, and pen recognition method
US11513641B2 (en) 2016-10-25 2022-11-29 Lg Display Co., Ltd. Touch display device, active pen, touch system, touch circuit, and pen recognition method
CN112650357A (en) * 2020-12-31 2021-04-13 联想(北京)有限公司 Control method and device

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination