KR20150079501A - Methode For Making Emotion-Effective E-Book By Complexing Fuctions In Mobile Terminal - Google Patents

Methode For Making Emotion-Effective E-Book By Complexing Fuctions In Mobile Terminal Download PDF

Info

Publication number
KR20150079501A
Authority
KR
South Korea
Prior art keywords
unit
touch
user interface
sound
display unit
Prior art date
Application number
KR1020150051377A
Other languages
Korean (ko)
Inventor
라호준
Original Assignee
라호준
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 라호준 filed Critical 라호준
Priority to KR1020150051377A priority Critical patent/KR20150079501A/en
Publication of KR20150079501A publication Critical patent/KR20150079501A/en
Priority to PCT/KR2016/003751 priority patent/WO2016167517A1/en

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0414 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 Scrolling or panning
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention provides a user interface device and method using a touch interface unit, a tilt sensing unit, a sound collecting unit, and a humidity collecting unit. According to the present invention, the interface device includes: the touch interface unit, which senses a touch together with its direction and speed; the tilt sensing unit, which senses the tilt of the device; the sound collecting unit, which collects sound around the device; and the humidity collecting unit, which collects humidity around the device. When a touch on a layered setting area of the touch interface unit has a predetermined value or greater, the device displays on a display unit an animation of turning over the page of the layer value corresponding to the touched layer. Within predetermined ranges, the sensed tilt, sound, and humidity influence the relation between the value of a touch on the touch interface and the value of the animation on the display unit.

Description

METHOD FOR MAKING AN EMOTION-EFFECTIVE E-BOOK BY COMBINING MULTIPLE FUNCTIONS OF A MOBILE TERMINAL

The present invention relates to a technology that implements the visual, tactile, and emotional user experience of a paper book in an e-book driven by a touch-operated display, a tilt sensing device, a microphone capable of sensing sound, and a humidity measuring device.

The development and rapid spread of devices such as smartphones, smart tablets, and e-book readers are driving the expansion of the e-book market.

Modern display technologies support colors ever closer to natural color and integrate more pixels into smaller screens. Touch interface technology has likewise matured and is used either individually or as a combination of two techniques, the electromagnetic induction method and the pressure sensing method. A conventional display tolerates only slight bending without loss of function, but terminals equipped with flexible displays that bend much further, enabled by advances in materials technology, have recently been introduced widely. Such advances in industrial technology, and the benefits that accompany them, also work in favor of the effects of the present invention.

Many smartphones and smart tablets have built-in tilt detection. These devices operate an acceleration sensor, a gyro sensor, a geomagnetic sensor, and the like independently, or combine them to detect tilt.

Smartphones and smart tablets are equipped with microphones that are used not only for voice calls but also for collecting ambient sound. There are two kinds of techniques for collecting and analyzing sound. One, volume analysis, is applied to improve call quality on devices such as smartphones by raising the speaking and listening levels according to ambient noise. The other, directional analysis, detects the direction of a sound from the time difference of the sound arriving at two or more microphones, and is applied in household appliances.
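
The directional analysis just described can be illustrated with a minimal sketch: the inter-microphone delay is found by cross-correlation and converted to an arrival angle. The sample rate, microphone spacing, and all function names here are illustrative assumptions, not part of this application.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature

def estimate_direction(mic_left, mic_right, sample_rate, mic_distance):
    """Estimate the arrival angle of a sound in radians (0 = straight
    ahead; negative = the sound reached the left microphone first)."""
    corr = np.correlate(mic_left, mic_right, mode="full")
    # Lag (in samples) at the correlation peak; negative when the right
    # channel is the delayed copy of the left one.
    lag = np.argmax(corr) - (len(mic_right) - 1)
    delay = lag / sample_rate  # seconds
    # Clamp so arcsin stays defined even with noisy delay estimates.
    ratio = np.clip(delay * SPEED_OF_SOUND / mic_distance, -1.0, 1.0)
    return np.arcsin(ratio)

# Synthetic check: the same 1 kHz tone arrives 5 samples later on the right.
rate = 48_000
t = np.arange(1024) / rate
tone = np.sin(2 * np.pi * 1000.0 * t)
left = np.pad(tone, (0, 5))
right = np.pad(tone, (5, 0))
angle = estimate_direction(left, right, rate, mic_distance=0.15)
print(f"estimated angle: {np.degrees(angle):.1f} degrees")  # about -13.8
```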

Modern smartphones, smart tablets, and other devices compete on their array of sensors, and sensor technology continues to become more integrated and advanced. The present invention assumes a device, such as a smartphone or smart tablet serving as an e-book reader, that incorporates a humidity sensor.

Ordinary paper books inherit the physical properties of the wide, thin, resilient paper they are made from, which bends and springs back. Two of these inherent properties matter here. First, when a page is pulled and bent in the direction opposite to the direction it is being turned, it stores a repulsive force, in the form of elasticity, that the hand feels at the stacked corner. Second, depending on the smoothness of the paper and on physical conditions such as ambient humidity, pages frequently turn irregularly, differently from what the reader intended. Similarly, when one flips through the edges of a book with a thumb to browse its contents quickly, the weight of the book and of the individual pages, gravity, irregular pressure, and air currents cause pages to turn unevenly, and it is difficult to control precisely how many sheets turn. These two characteristics of paper books, the repulsive force felt in the hand and the mental and physical effort required to turn exactly one page at a time, create a physical tension between the reader's hand and the book together with an experience of chance, and their combination provides a complex physical and mental experience and the emotions that result from it.

The object of the present invention is to reproduce in an e-book the sensibility felt with a paper book, by using the display touch interface technology of a terminal, a method of sensing tilt with respect to the direction of gravity, techniques for collecting and analyzing ambient sound, and humidity measurement.

The present invention uses the touch interface method of the display unit to implement in the e-book the elastic effect experienced with a paper book, in which a page pulled toward the outside of the book springs back in the direction opposite to the pull.

The present invention implements the effect of gravity on the pages when the book is tilted in the hand, through the tilt sensing method of the interface device.

The present invention analyzes the strength and direction of sound collected by the device's microphone when wind blows, and implements in the e-book the effect of a paper book's pages being turned by the wind.

The present invention implements in the e-book the characteristic of paper pages sticking together according to humidity, by measuring ambient humidity and letting it influence the degree to which pages turn, in combination with the solutions above.

Together with the means described above, the present invention establishes and implements the detailed touch interface conditions for the natural page effect in which an irregular number of pages of a paper book turn over.

E-books are often considered more convenient than paper books for portability and search. However, one of the fundamental reasons people remain reluctant to purchase e-books is that e-books alone cannot provide the emotional user experience of paper books. The present invention therefore lets readers of paper books feel in an e-book the emotional sensations they feel in real life, thereby enhancing the e-book user experience.

FIG. 1 is a block diagram illustrating the configuration of a user interface device having a touch interface unit, a tilt sensing unit, a sound collecting unit, and a humidity collecting unit according to an embodiment.
FIG. 2 is a diagram illustrating information determined from a touch by the touch interface unit according to an embodiment.
FIG. 3 is a diagram illustrating a reference line for setting the position of a layering setting area in a user interface device according to an embodiment.
FIG. 4 is a diagram illustrating an example of setting the position of a layering setting area in a user interface device according to an embodiment.
FIG. 5 is a diagram illustrating another example of setting the position of a layering setting area in a user interface device according to an embodiment.
FIG. 6 is a diagram illustrating an example in which a value detected by the tilt sensing unit is taken into account in a user interface device according to an embodiment.
FIG. 7 is a diagram illustrating an example in which the direction and volume of sound collected by the sound collecting unit are taken into account in a user interface device according to an embodiment.
FIG. 8 is a diagram illustrating an example in which a humidity value collected by the humidity collecting unit is taken into account in a user interface device according to an embodiment.
FIG. 9 is a flowchart illustrating an example in which a user interface device according to an embodiment detects a touch and controls the information to display a result on a display.
FIG. 10 is a flowchart illustrating another example in which a user interface device according to an embodiment detects a touch and controls the information to display a result on a display.
FIG. 11 is a flowchart illustrating an example in which a user interface device according to an embodiment detects tilt and controls the information to display a result on a display.
FIG. 12 is a flowchart illustrating an example in which a user interface device according to an embodiment detects and analyzes sound and controls the information to display a result on a display.
FIG. 13 is a flowchart illustrating an example in which a user interface device according to an embodiment detects humidity and controls the information to display a result on a display.
FIG. 14 is a diagram illustrating an example of a portable terminal having a touch interface unit, a display unit, a tilt sensing unit, a sound collecting unit, and a humidity collecting unit according to an embodiment.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the following description, detailed descriptions of known functions and configurations incorporated herein are omitted where they might obscure the subject matter of the present invention. The terms used below are defined in consideration of their functions in the present invention and may vary according to the intention or custom of users and operators; their definitions should therefore be based on the contents of this specification as a whole.

FIG. 1 is a block diagram illustrating the configuration of a user interface device having a touch interface unit, a tilt sensing unit, a sound collecting unit, and a humidity collecting unit according to an embodiment.

Referring to FIG. 1, the user interface device 100 includes a touch interface unit 110, a controller 120, a tilt sensing unit 130, a sound collecting unit 140, a humidity collecting unit 150, a storage unit 160, and a display unit 170.

The touch interface unit 110 may include one or more film sensors, for example conventional film-type sensors in which pressure, voltage, or both change electrically in response to an externally applied physical change, and may use them to detect physical changes.

The controller 120 determines at least one of the range and the direction of a physical change when the touch interface unit 110 senses one. At this time, the tilt sensing unit 130, the sound collecting unit 140, and the humidity collecting unit 150 may be controlled by the touch interface unit 110.

FIG. 2 is a diagram illustrating information determined by the touch interface unit according to the embodiment.

Referring to FIG. 2, the controller 120 may determine the moving direction 210, the moving speed 220, the pressure 230, and the like collected by the touch interface unit 110. The touch interface unit 110 and the display unit 170 form a matched pair, so that touches correspond to positions on the display.

The tilt sensing unit 130 may sense the degree of tilting of the user interface device 100.

The sound collecting unit 140 collects sound around the user interface device 100.

The humidity collecting unit 150 collects the humidity around the user interface device 100.

The storage unit 160 stores an operating system for controlling the overall operation of the user interface device 100, application programs corresponding to the layers, and stored data.

The touch interface unit 110, the tilt sensing unit 130, the sound collecting unit 140, and the humidity collecting unit 150 may control the operation of one another. For example, when a predetermined application is activated, the touch interface unit 110 may cause the tilt sensing unit 130, the sound collecting unit 140, and the humidity collecting unit 150 to operate; or, when the touch interface unit 110 detects a touch, the interface functions using the tilt sensing unit 130, the sound collecting unit 140, and the humidity collecting unit 150 may be made available.

When a touch is detected in the preset layering setting area, the touch interface unit 110 senses the physical change of the touch, the controller determines the sensed information, and the layers can be displayed hierarchically in the area of the display unit 170 matched with the layering setting area of the touch interface unit 110.

At this time, the hierarchically displayed layers can represent the individual pages of a book when an e-book application is running, and any application can display information in a so-called card-type interface, in which items of information are stacked and displayed hierarchically.

FIG. 3 is a diagram illustrating a reference line for setting a position of a layered setting area in a user interface device according to an embodiment.

Referring to FIG. 3, the touch interface unit 110 can recognize the left or right direction to be displayed on the display unit 170 using a reference line on the X and Y axes. The reference line, however, may be placed anywhere on the X or Y axis.

Using this method, the touch interface unit 110 may set a predetermined range as the layering setting area based on the preset reference line, or may set a corner of the book displayed on the display unit 170, or the like, as the layering setting area.

FIG. 4 is a diagram illustrating an example of setting the position of a layering setting area 430 in a user interface device according to an embodiment.

Referring to FIG. 4, when the touch sensed in the layering setting area 430 selects a plurality of layers, the touch interface unit 110 selects the layer corresponding to the uppermost of the selected layers. Alternatively, when the touch detected in the layering setting area 430 selects a plurality of layers, the touch interface unit 110 may select the layer with the largest touched area among them.
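
The two selection rules just described can be pictured with a small sketch; the Layer structure and function names are assumptions made for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Layer:
    index: int           # stacking order; larger = closer to the top
    touched_area: float  # overlap between the touch region and this layer

def select_uppermost(candidates: list[Layer]) -> Layer:
    # Rule 1: among all layers under the touch, take the topmost one.
    return max(candidates, key=lambda layer: layer.index)

def select_largest_area(candidates: list[Layer]) -> Layer:
    # Rule 2: take the layer the touch overlaps the most.
    return max(candidates, key=lambda layer: layer.touched_area)

stack = [Layer(0, 4.0), Layer(1, 9.5), Layer(2, 1.2)]
print(select_uppermost(stack).index)     # 2
print(select_largest_area(stack).index)  # 1
```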

Referring to FIGS. 3 and 4, when the touch direction detected by the touch interface unit 110 is determined, the moving direction 410 points rightward with respect to the Y axis of FIG. 3. When a touch and the scrolling that follows it are applied to this portion and the controller determines the moving direction 410 to be rightward or leftward, the controller displays the layer of each page on the portion of the display unit 170 matched with the layering setting area 430, while a layer that was selected in the layering setting area 430 and released by the movement is displayed on the left part of the display unit 170, outside the layering setting area 430, as an animation that moves sequentially to the left. At this time, the left and right animations can be divided into one or two parts and adjusted so that each page unit connects naturally.

Referring to FIGS. 3 and 4, the layering setting area 430 may be placed individually along any of the four directions of the display, right, left, top, and bottom, or in any effective combination of directions, and the operation for each of the four directional corners can be adjusted by matching the manner of operation to the arrangement of the layering setting area 430 in the corresponding direction.

FIG. 5 is a diagram illustrating another example of setting the position of a layering setting area in a user interface device according to an embodiment.

Referring to FIGS. 4 and 5, the size of the left display surface outside the layering setting area 430 may be a predetermined size or may vary with the number of layers to be included in the layering setting area 430. As shown in FIG. 5, when the touch interface unit 110 senses a physical change equal to or greater than a predetermined pressure, moving direction, and moving speed, it sets the layering setting area 520 starting from the touch position 510.

At this time, the outline corresponding to the end position of the layering setting area 520 may take into account the rightward moving direction 410 and moving speed 420 detected by the touch interface unit 110, as shown in FIG. 4, or a leftward moving direction and moving speed.
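
One way to picture the area setting of FIG. 5 is the sketch below, which spans a rectangle from the touched position to the outer edge of the touch surface. The rectangle convention and the left/right cases are illustrative assumptions, not the application's actual geometry.

```python
from typing import NamedTuple

class Rect(NamedTuple):
    left: float
    top: float
    right: float
    bottom: float

def layering_area_from_touch(touch_x: float, touch_y: float,
                             screen: Rect, direction: str = "right") -> Rect:
    """Span the area from the touched position to the screen edge on the
    side where the corner of the 'book' is being gripped."""
    if direction == "right":
        return Rect(touch_x, screen.top, screen.right, screen.bottom)
    if direction == "left":
        return Rect(screen.left, screen.top, touch_x, screen.bottom)
    raise ValueError("only the left/right arrangements are sketched here")

area = layering_area_from_touch(300.0, 500.0, Rect(0, 0, 1080, 1920))
print(area)  # Rect(left=300.0, top=0, right=1080, bottom=1920)
```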

FIG. 6 is a diagram illustrating an example in which a value detected by the tilt sensing unit is taken into account in a user interface device according to an embodiment.

Referring to FIG. 6, tilt information that takes into account the gravity direction 640, sensed by the tilt sensing unit 630 built into the user interface device 100, can act on the information of the left layers other than the one selected as the touch position 610 in the layering setting area 620 of the touch interface unit 110, so that the display unit 170 is controlled to display an animation in which the corresponding pages are turned over. Even when no touch is applied, the tilt information may cause the display unit 170 to display an animation of pages turning.

FIG. 7 is a diagram illustrating an example of taking into account the sound collected according to the direction and volume of the sound collecting unit in the user interface apparatus according to the embodiment.

Referring to FIG. 7, sound outside the interface device 100 sensed by the sound collecting unit 730 is analyzed by the controller for its volume and position, and can act on the information of the left layers other than the one selected as the touch position 710 in the layering setting area 720 of the touch interface unit 110, so that the display unit 170 is controlled to display an animation in which the corresponding pages are turned over. The volume and position information of the sound can also cause the display unit 170 to display a page-turning animation even when no touch is applied.

FIG. 8 is a diagram illustrating an example in which a humidity value collected by the humidity collecting unit is taken into account in a user interface device according to an embodiment.

Referring to FIG. 8, the ambient humidity information of the interface device 100 sensed by the humidity collecting unit 830 may, when the touch position 810 in the layering setting area 820 of the touch interface unit 110 is selected and scrolling occurs, affect the degree to which the pages in the animation displayed on the display unit 170 turn less or more than the selection released by the scrolling would indicate. Likewise, when the page-turning effect is activated by the tilt sensing unit or the sound collecting unit without a touch, the humidity information may affect whether the animation turns fewer or more pages.

As can be seen from FIGS. 5 to 8, when the present invention is applied to an application for reading a book, the physical changes detected by the touch interface unit 110 reproduce what happens when a book is held in the hand. The invention can visually convey the impulse of pages turning; provide the intuitive visual experience and feel of gripping a book and sliding through its pages with the thumb; let the user hold the book in place, support part of the corner with the thumb, and flick the wrist to flip pages one at a time, or tilt the device downward for the experience and feel of a large number of pages falling over at once; let the user blow at the device, separating pages as if wind were blowing around them; and, by combining these effects with the humidity collection information under the control described above, reproduce the way damp pages become sticky and hard to separate with a fingertip, producing the unintended slips that occur when turning the pages of a real book. As a result, the present invention realizes the unique experience a paper book provides, with physical characteristics that change in real time through interaction with the hand and the surrounding natural environment, and the distinctive sensibility felt with a book.

The controller 120 may control the overall operation of the user interface device 100, may perform some of the functions of the touch interface unit 110, the tilt sensing unit 130, the sound collecting unit 140, and the humidity collecting unit 150, or may assign processing functions to them. The controller 120, the touch interface unit 110, the tilt sensing unit 130, the sound collecting unit 140, and the humidity collecting unit 150 are shown separately only to distinguish their respective functions. The controller 120 may include at least one processor configured to perform some of the functions of the touch interface unit 110, the tilt sensing unit 130, the sound collecting unit 140, and the humidity collecting unit 150.

Hereinafter, a user interface method using the display according to the present invention will be described with reference to the drawings.

FIG. 9 is a flowchart illustrating an example of detecting a touch in a user interface device according to an embodiment of the present invention and controlling the information to display a result on a display.

Referring to FIG. 9, in step 910, the user interface device 100 determines whether the touch interface unit is touched.

If it is determined in step 910 that the touch of the touch interface unit is detected, the user interface device 100 measures the physical change of the touch interface unit in step 912.

Then, in step 914, the user interface device 100 determines whether the measured information is greater than a predetermined value.

If it is determined in step 914 that the physical change of the touch interface unit is smaller than the predetermined value, the user interface apparatus returns to step 910.

If it is determined in step 914 that the physical change of the touch interface unit is greater than a preset value, the user interface apparatus 100 sets a part of the touch interface unit as a layering setting area in step 916 based on a preset reference line of the touch interface unit.

At this time, the user interface device 100 may set the layering setting area from the touched position up to the outer edge of the touch interface unit.

In step 918, the user interface device 100 hierarchically displays the layers through a display unit matched with the layering setting area.

In step 920, the user interface device 100 detects whether the user touches the layering setting area. At this time, when the touch sensed in the layer setting area selects a plurality of layers, the user interface device 100 selects a layer corresponding to the uppermost layer among a plurality of selected layers.

The user interface device 100 then enters H of FIG. 13. However, depending on the user's settings, it may proceed directly to step 922.

FIG. 13 is a flowchart illustrating an example in which a user interface device according to an embodiment detects humidity and controls the information to display results on a display.

Referring to FIG. 13, when the humidity collecting unit is activated in step 1310 and humidity is collected in step 1312, the humidity information is measured in step 1314. If the humidity information is determined in step 1316 to be below the predetermined value, the user interface device 100 obtains in step 1318 a weight that can display the animation more accurately on the display unit; if it is equal to or greater than the predetermined value, the device obtains in step 1320 a weight that displays the animation less accurately, and returns to R of FIG. 9 through R. If the humidity collecting unit is not activated in step 1310, or humidity is not collected in step 1312, the user interface device 100 ends the algorithm of FIG. 13 and returns to R of FIG. 9.
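
A minimal sketch of this weighting follows, assuming a simple linear mapping; the threshold, range, and mapping are illustrative assumptions, not values specified in this application.

```python
def humidity_weight(humidity_pct: float,
                    threshold_pct: float = 60.0,
                    max_jitter: float = 0.5) -> float:
    """Weight in [0, max_jitter] controlling how irregularly the page-turn
    animation behaves: 0 = fully accurate turning (dry paper); larger =
    damp pages stick, so more or fewer pages turn than intended."""
    if humidity_pct < threshold_pct:   # step 1316: below the preset value
        return 0.0                     # step 1318: accurate animation
    # step 1320: past the threshold, accuracy degrades linearly.
    overshoot = min(humidity_pct, 100.0) - threshold_pct
    return max_jitter * overshoot / (100.0 - threshold_pct)

print(humidity_weight(30.0))  # 0.0
print(humidity_weight(80.0))  # 0.25
```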

In step 922, the user interface device 100, having returned to R of FIG. 9, displays on the display unit matched with the layering setting area an animation in which a corner of the page is pressed or lifted. At this time, the user interface device 100 may display the animation more or less accurately by applying the weights obtained in steps 1318 and 1320 of FIG. 13.

In step 924, the user interface device 100 enters G1 when the tilt sensing unit is activated, enters S1 when the sound collecting unit is activated, or enters G1 and S1 in parallel when both are activated. However, depending on the user's settings, it may enter A immediately.

FIG. 11 is a flowchart illustrating an example in which a user interface device according to an embodiment detects tilt and controls the information to display results on a display.

When the user interface device 100 enters from G1 of FIG. 9 to G1 of FIG. 11, the process proceeds to step 1110.

If the user interface device 100 detects tilt in step 1110, measures the tilt magnitude information in step 1112, and determines in step 1114 that the information is equal to or greater than the preset value, it obtains in step 1116 a weight for displaying the animation more quickly, determines in step 1118 the direction from which the algorithm of FIG. 11 was entered, and returns to F of FIG. 9 through F.

If no tilt is detected in step 1110, or the value determined in step 1114 is smaller than the preset value, the user interface device 100 ends the algorithm of FIG. 11 and returns to F of FIG. 9.
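
This path can be sketched minimally: the tilt magnitude is thresholded as in step 1114 and, when large enough, converted to a speed weight as in step 1116. The threshold and scaling values are assumptions for illustration.

```python
def tilt_speed_weight(tilt_deg: float,
                      threshold_deg: float = 10.0,
                      scale: float = 0.05) -> float:
    """Speed-up factor for the page-turn animation; 0.0 = tilt ignored."""
    if abs(tilt_deg) < threshold_deg:  # step 1114: below the preset value
        return 0.0                     # FIG. 11 ends without a weight
    # step 1116: the further the device tips past the threshold, the
    # faster the pages appear to fall over under gravity.
    return scale * (abs(tilt_deg) - threshold_deg)

print(tilt_speed_weight(5.0))   # 0.0
print(tilt_speed_weight(25.0))  # 0.75
```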

FIG. 12 is a flowchart illustrating an example in which a user interface device according to an embodiment detects and analyzes sound and controls the information to display results on a display.

When the user interface device 100 enters from S1 of FIG. 9 to S1 of FIG. 12, the process proceeds to step 1210.

When sound is collected in step 1210 and the sound volume and direction information are measured in step 1212, if the information is determined in step 1214 to show a sound volume equal to or greater than the preset value, and is determined in step 1216 to show a sound direction within the preset range, a weight for displaying the animation on the display unit is obtained in step 1218, the direction of entry is determined in step 1220, and the process returns to F of FIG. 9 through F.

If no sound is collected in step 1210, the value determined in step 1214 is smaller than the preset value, or the value determined in step 1216 is outside the preset range, the user interface device 100 ends the algorithm of FIG. 12 and returns to F of FIG. 9.
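
The FIG. 12 gating can be sketched the same way: a weight is produced only when the volume clears the step 1214 threshold and the direction falls within the step 1216 range. All constants and names below are assumptions for illustration.

```python
def sound_speed_weight(volume_db: float, direction_deg: float,
                       min_volume_db: float = 50.0,
                       direction_range: tuple[float, float] = (-45.0, 45.0),
                       scale: float = 0.01) -> float:
    """Speed-up factor produced only when both FIG. 12 gates pass."""
    low, high = direction_range
    if volume_db < min_volume_db:         # step 1214 gate fails
        return 0.0
    if not low <= direction_deg <= high:  # step 1216 gate fails
        return 0.0
    return scale * (volume_db - min_volume_db)  # step 1218: speed weight

print(sound_speed_weight(62.0, 10.0))  # 0.12
print(sound_speed_weight(62.0, 90.0))  # 0.0, wrong direction
```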

When the user interface device 100 has returned to F of FIG. 9, it enters FIG. 10 through A of FIG. 9.

FIG. 10 is a flowchart illustrating an example in which a user interface device according to an embodiment detects a touch and displays the result on a display by controlling the information.

In step 1010, the user interface device 100 determines whether the scroll that began without interruption from the touch sensed in step 920 of FIG. 9 shows a scroll pressure, a scroll direction and speed, or all three. If the information measured in step 1012 is equal to or greater than the preset value, the user interface device 100 displays, in step 1014, an animation of pages turning on the display unit matching the area outside the layering setting area. In this case, having confirmed that the algorithm passed through H and R of FIG. 9, the user interface device 100 may display the animation more or less accurately using the weights obtained in steps 1318 and 1320 of FIG. 13, and may display the animation more quickly by using one of the weights for displaying the animation obtained in step 1116 of FIG. 11 and step 1218 of FIG. 12, or by accumulating both. The animation displayed on the display unit matched with the layering setting area in step 922 of FIG. 9 can also be displayed more quickly at this time.
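
A sketch of how the step 1014 display might fold these weights together follows: the humidity weight perturbs how many pages turn, while the tilt and sound weights, used singly or accumulated as the text allows, speed the animation up. Every constant and formula is an assumed illustration of that combination, not the rule specified here.

```python
import random

def pages_to_turn(requested: int, humidity_w: float) -> int:
    # High humidity makes the page count deviate from what the scroll
    # requested, mimicking damp pages that stick together.
    jitter = round(random.uniform(-humidity_w, humidity_w) * requested)
    return max(1, requested + jitter)

def animation_speed(base: float, tilt_w: float, sound_w: float,
                    accumulate: bool = True) -> float:
    # Use one weight or accumulate both, as step 1014 allows.
    boost = tilt_w + sound_w if accumulate else max(tilt_w, sound_w)
    return base * (1.0 + boost)

# Example: a scroll requesting 3 pages in humid, tilted, windy conditions.
random.seed(0)
print(pages_to_turn(3, humidity_w=0.4))
print(animation_speed(1.0, tilt_w=0.25, sound_w=0.1))
```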

When the layering setting area is not activated by touch, the user interface device 100 can start the algorithm from the start step of FIG. 11. In this case, the user interface device 100 enters H of FIG. 13. However, depending on the user's settings, it may proceed directly to step 1110.

Referring to FIG. 11, when the user interface device 100 has entered H of FIG. 13, the humidity collecting unit is activated in step 1310, humidity is collected in step 1312, and the humidity information is measured in step 1314. If the humidity information is determined in step 1316 to be below the predetermined value, the user interface device 100 obtains in step 1318 a weight that can display the animation more accurately on the display unit; if it is equal to or greater than the predetermined value, the device obtains in step 1320 a weight that displays the animation less accurately, and returns to R of FIG. 11 through R. If the humidity collecting unit is not activated in step 1310, or humidity is not collected in step 1312, the user interface device 100 ends the algorithm of FIG. 13 and returns to R of FIG. 11.

If the user interface device 100, having returned to R of FIG. 11, detects tilt in step 1110, measures the tilt magnitude information in step 1112, and determines in step 1114 that the information is equal to or greater than the preset value, it obtains in step 1116 a weight for displaying the animation more quickly, determines in step 1118 the direction of entry into the algorithm, and enters S2 of FIG. 12 through S2. However, depending on the user's settings, it may proceed directly to step 1120.

If no tilt is detected in step 1110, or the value determined in step 1114 is smaller than the preset value, the user interface device 100 may terminate the algorithm of FIG. 11.

When the user interface device 100 enters from S2 of FIG. 11 to S2 of FIG. 12, if sound is collected in step 1210, the sound volume and direction information are measured in step 1212, and the information is determined in step 1214 to show a sound volume equal to or greater than the preset value and in step 1216 to show a sound direction within the preset range, a weight for displaying the animation more quickly on the display unit is obtained in step 1218, and the algorithm proceeds to step 1222, where the animation is displayed on the display unit matching the area outside the layering setting area. In this case, having confirmed that the algorithm passed through the steps of FIG. 13 via H and R of FIG. 11, the user interface device 100 may display the animation more or less accurately using the weight obtained in step 1318 or 1320 of FIG. 13, and may display the animation more quickly by using one of the weights obtained in step 1116 of FIG. 11 and step 1218 of FIG. 12, or by accumulating both and then applying them.

If no sound is collected in step 1210, the value determined in step 1214 is smaller than the preset value, or the value determined in step 1216 is outside the preset range, the user interface device 100 ends the algorithm of FIG. 12 and returns to G2 of FIG. 11.

When the layering setting area is not activated by touch, the user interface device 100 may start the algorithm from the start step of FIG. 12. In this case, it enters H of FIG. 13. However, depending on the user's settings, it may proceed directly to step 1210.

Referring to FIG. 13, when the humidity collecting unit is activated in step 1310, humidity is collected in step 1312, and the humidity information is measured in step 1314, the user interface device 100 obtains in step 1318 a weight that can display the animation more accurately on the display unit if the humidity information is determined in step 1316 to be below the predetermined value, or obtains in step 1320 a weight that displays the animation less accurately if it is equal to or greater than the predetermined value, and returns to R of FIG. 12 through R. If the humidity collecting unit is not activated in step 1310, or humidity is not collected in step 1312, the user interface device 100 ends the algorithm of FIG. 13 and returns to R of FIG. 12 without obtaining a weight.

When sound is collected in step 1210, the sound volume and direction information are measured in step 1212, and the information is determined in step 1214 to show a sound volume equal to or greater than the preset value and in step 1216 to show a sound direction within the preset range, a weight for displaying the animation more quickly on the display unit is obtained in step 1218, the direction of entry is determined in step 1220, and the device enters G2 of FIG. 11 through G2. However, depending on the user's settings, it may proceed directly to step 1222.

If no sound is collected in step 1210, the value determined in step 1214 is smaller than the preset value, or the value determined in step 1216 is outside the preset range, the user interface device 100 may end the algorithm of FIG. 12.

When the user interface device 100 enters from G2 of FIG. 12 to G2 of FIG. 11, if tilt is detected in step 1110, the tilt magnitude information is measured in step 1112, and the information is determined in step 1114 to be equal to or greater than the preset value, a weight for displaying the animation more quickly is obtained in step 1116, and an animation of pages turning is displayed on the display unit. In this case, having confirmed that the algorithm passed through the steps of FIG. 13 via H and R of FIG. 12, the user interface device 100 may display the animation more or less accurately using the weights obtained in steps 1318 and 1320 of FIG. 13, and may display the animation more quickly by using one of the weights obtained in step 1116 of FIG. 11 and step 1218 of FIG. 12, or by accumulating both.

If no tilt is detected in step 1110, or the value determined in step 1114 is smaller than the preset value, the user interface device 100 ends the algorithm of FIG. 11 and returns to S2 of FIG. 12.

An example of a portable terminal to which the present invention can be applied will be described below with reference to FIG. 14.

FIG. 14 is a diagram illustrating an example of a portable terminal having a touch interface unit, a display unit, a tilt sensing unit, a sound collecting unit, and a humidity collecting unit according to an embodiment.

Referring to FIG. 14, it can be seen that the portable terminal, which displays a page-turning animation on a display unit located beneath the touch interface unit, has a general configuration with a main body 1410 containing all other components necessary for the present invention, an ordinary home button, and a camera.

The methods according to embodiments of the present invention may be implemented in the form of program instructions that can be executed by various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be specially designed and constructed for the present invention, or may be known and available to those skilled in computer software.

While the invention has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Therefore, the scope of the present invention should not be limited to the embodiments described, but should be determined by the appended claims and their equivalents, read in light of the state of the art in which the claims are situated.

Likewise, the scope of the present invention should remain free of ranges that could only be claimed through additional or supplementary techniques that are not economically or industrially feasible because they do not rely on the background techniques commonly used at the time of filing.

100. User interface device
110. Touch interface unit
120. Controller
130. Tilt sensing unit
140. Sound collecting unit
150. Humidity collecting unit
160. Storage unit
170. Display unit
210. Moving direction
220. Moving speed
230. Pressure
410. Moving direction
420. Moving speed
430. Layering setting area
440. Pressure
510. Touch position
520. Layering setting area
610. Touch position
620. Layering setting area
630. Tilt sensing unit
640. Gravity direction
710. Touch position
720. Layering setting area
730. Sound collecting unit
810. Touch position
820. Layering setting area
830. Humidity collecting unit
1410. Main body

Claims (7)

1. A user interface method using an interface device comprising: a touch interface unit capable of measuring pressure, voltage, or both, corresponding to an externally applied physical force; a display unit for displaying information when the touch interface unit is touched; and a controller for hierarchically displaying a layer corresponding to a touched layer on the display unit when a touch is detected in a predetermined layering setting area of the touch interface unit, wherein, when a touch is equal to or greater than a preset pressure, moving direction, and moving speed, a part of the touch interface unit is set as the layering setting area and the layers are displayed hierarchically through the portion of the display unit corresponding to the layering setting area.

2. The method according to claim 1, wherein, when the layers selected by scrolling in the touch interface unit are displayed on the display unit sequentially in the order of selection, a book corner displayed as a layer on the portion of the display unit corresponding to the layering setting area is displayed with the effect of being pressed or lifted; or, in the portion outside the layering setting area where the pages are displayed, the parts of the page are displayed successively as a page-turning effect, or the parts slide continuously over the upper layers, or the parts over the upper layers disappear together, so that the selected layer is displayed.

3. The method according to claim 1, wherein, when the touch interface unit detects a touch, the layering setting area is set starting from the touched position.

4. The method according to claim 1 or 2, wherein the display unit displays, based on tilt information detected by a tilt sensing unit built into the interface device, an animation turning the pages corresponding to the layers other than the one selected as the touch position, or an animation turning the pages corresponding to the layers deselected as scrolling proceeds.

5. The method according to claim 1 or 2, wherein the display unit displays, based on the sound volume and sound position information analyzed by the controller from sound detected by a sound collecting unit built into the interface device, an animation turning the pages corresponding to the layers other than the one selected as the touch position, or an animation turning past the currently displayed page corresponding to the layers deselected as scrolling proceeds.

6. The method according to any one of claims 1, 2, 4, and 5, wherein the display unit displays the page-turning animation based on ambient humidity information of the interface device detected by a humidity collecting unit built into the interface device.

7. The method according to any one of claims 1, 2, 4, 5, and 6, wherein the display unit applies, for each interface action, a weighting of the features of two or more of claims 1, 4, 5, and 6, necessarily including one of them, according to a predefined rule or a user setting, and displays the page-turning animation accordingly.
KR1020150051377A 2015-04-12 2015-04-12 Methode For Making Emotion-Effective E-Book By Complexing Fuctions In Mobile Terminal KR20150079501A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020150051377A KR20150079501A (en) 2015-04-12 2015-04-12 Methode For Making Emotion-Effective E-Book By Complexing Fuctions In Mobile Terminal
PCT/KR2016/003751 WO2016167517A1 (en) 2015-04-12 2016-04-10 Method for making emotional e-book using complex functions of terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150051377A KR20150079501A (en) 2015-04-12 2015-04-12 Methode For Making Emotion-Effective E-Book By Complexing Fuctions In Mobile Terminal

Publications (1)

Publication Number Publication Date
KR20150079501A true KR20150079501A (en) 2015-07-08

Family

ID=53791782

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150051377A KR20150079501A (en) 2015-04-12 2015-04-12 Methode For Making Emotion-Effective E-Book By Complexing Fuctions In Mobile Terminal

Country Status (2)

Country Link
KR (1) KR20150079501A (en)
WO (1) WO2016167517A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170068800A (en) * 2015-12-10 2017-06-20 주식회사 엘지유플러스 Operation method of devices for e-book reader, application and devices for e-book reader
KR20200112368A (en) 2019-03-22 2020-10-05 신영선 System for manufacturing E-book

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3544118B2 (en) * 1998-03-31 2004-07-21 シャープ株式会社 Electronic book display device and computer-readable recording medium
KR101680349B1 (en) * 2010-04-27 2016-11-28 엘지전자 주식회사 Mobile terminal and control method thereof
KR101743632B1 (en) * 2010-10-01 2017-06-07 삼성전자주식회사 Apparatus and method for turning e-book pages in portable terminal
KR101190048B1 (en) * 2010-10-27 2012-10-12 한국과학기술연구원 Electronic book apparatus and user interface providing method of the same
KR20120084467A (en) * 2011-01-20 2012-07-30 삼성전자주식회사 Method for changing page in e-book reader and device thereof

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170068800A (en) * 2015-12-10 2017-06-20 주식회사 엘지유플러스 Operation method of devices for e-book reader, application and devices for e-book reader
KR20200112368A (en) 2019-03-22 2020-10-05 신영선 System for manufacturing E-book

Also Published As

Publication number Publication date
WO2016167517A1 (en) 2016-10-20

Similar Documents

Publication Publication Date Title
US10353570B1 (en) Thumb touch interface
EP3168713B1 (en) Method and devices for displaying graphical user interfaces based on user contact
EP2369460B1 (en) Terminal device and control program thereof
JP2020038706A (en) Multi-functional hand-held device
US20170060248A1 (en) Haptic feedback for interactions with foldable-bendable displays
CN104516675B (en) The control method and electronic equipment of a kind of folding screen
EP2796964A1 (en) System and Method for a Haptically-Enabled Deformable Surface
EP2133778A2 (en) Touch screen display device with a virtual keyboard and at least one proximity sensor
US20120297339A1 (en) Electronic device, control method, and storage medium storing control program
WO2015096020A1 (en) Adaptive enclosure for a mobile computing device
KR20130099186A (en) Display device, user interface method, and program
US9400572B2 (en) System and method to assist reaching screen content
KR20170118864A (en) Systems and methods for user interaction with a curved display
KR20090101035A (en) Electronic document player and playing method thereof
EP2759920B1 (en) Method and apparatus for controlling content playback
US20130154951A1 (en) Performing a Function
US9092198B2 (en) Electronic device, operation control method, and storage medium storing operation control program
WO2013093205A1 (en) Apparatus and method for providing transitions between screens
JP5654932B2 (en) User interface device, operation reception method using display device, and program
US20120218207A1 (en) Electronic device, operation control method, and storage medium storing operation control program
KR20150079501A (en) Methode For Making Emotion-Effective E-Book By Complexing Fuctions In Mobile Terminal
KR20130124139A (en) Control method of terminal by using spatial interaction
WO2014034549A1 (en) Information processing device, information processing method, program, and information storage medium
KR102194778B1 (en) Control method of terminal by using spatial interaction
JP5841023B2 (en) Information processing apparatus, information processing method, program, and information storage medium

Legal Events

Date Code Title Description
G15R Request for early opening
E902 Notification of reason for refusal
E902 Notification of reason for refusal