WO2015111791A1 - Dispositif et procédé pour diviser l'écran d'un terminal mobile (Device and method for splitting the screen of a mobile terminal)

Info

Publication number
WO2015111791A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
content
sub
touch
screens
Prior art date
Application number
PCT/KR2014/001860
Other languages
English (en)
Korean (ko)
Inventor
이동채
유수근
Original Assignee
주식회사 모브릭
Priority date
Filing date
Publication date
Application filed by 주식회사 모브릭
Publication of WO2015111791A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 - Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048 - Indexing scheme relating to G06F 3/048
    • G06F 2203/04803 - Split screen, i.e. subdividing the display area or the window area into separate subareas

Definitions

  • the present invention relates to a screen splitting apparatus and method for a mobile terminal, and more particularly, to a device and method for dividing a screen into a plurality of screens in accordance with a touch operation and outputting relevant content to each divided screen.
  • the portable terminal provides various functions such as a call function, a music play function, a text message transmission function, a digital broadcast reception function, a short range wireless communication function, and an internet access function.
  • the size of the display module provided in the portable terminal is also gradually increasing.
  • in the past, small screens of about 2 or 3 inches were mostly used, but in recent years the screens of portable terminals have grown to 4, 5, and 6 inches. Accordingly, it is necessary to develop a method for providing various kinds of information at the same time through such a screen.
  • the present invention has been made in an effort to provide a screen splitting apparatus and method for a mobile terminal for dividing a plurality of screens according to a touch operation and outputting related contents to each of the divided screens.
  • a screen splitting apparatus for a portable terminal comprises: a touch screen unit; and a controller configured, when a screen division operation is performed through the touch screen unit, to divide a screen into a plurality of sub screens based on the screen division operation and to output content to each of the divided sub screens.
  • the control unit may control such that first content, which was output through the screen before the screen division operation, is output on one sub screen of the plurality of sub screens, and second content determined based on the first content is output on the remaining sub screens.
  • the second content may be content of the same type as the first content but different from the first content.
  • the second content may be content stored in the folder in which the first content is stored.
  • the type of the first content may be one of an image, a video, text, and an email.
  • when the first content is a multi-angle video, the second content may be the first content at an angle different from the angle of the first content being output.
  • the first content may be a multi-angle video streamed from the outside through a streaming server.
  • the screen division operation may be a touch operation in which a plurality of different points positioned at the edge of the screen are simultaneously touched, or a drag operation in which a drag is performed with a first point and a second point located at the edge of the screen as the starting point and the ending point, respectively.
  • the controller may divide the screen into the plurality of sub screens based on a plurality of touch points according to the screen division operation.
  • when the screen division operation is a touch operation in which a plurality of different points located in the edge region of the screen are simultaneously touched, the controller may divide the screen into the plurality of sub screens based on the plurality of touched points; when the screen division operation is a drag operation in which the first point and the second point positioned in the edge region of the screen are used as the starting point and the ending point, respectively, the controller may divide the screen into the plurality of sub screens based on the starting point and the ending point.
  • when a content change operation is performed through the touch screen unit while the screen is divided into the plurality of sub screens, the controller may control the content output on the selected sub screen to be changed based on the content change operation.
  • the content change operation may be a drag operation that starts from a point located in the edge area of the screen.
  • the controller may change the content output on the sub screen at which the end point of the content change operation is located.
  • if the content output, before the content change operation, through the sub screen selected by the content change operation is a multi-angle video, the controller may control content having an angle different from the angle output before the content change operation to be output to the sub screen selected by the content change operation.
  • the control unit may control content of the same type as, but different from, the content output before the content change operation to be output to the sub screen selected by the content change operation.
  • when a screen moving operation is performed through the touch screen unit, the controller may control the selected sub screen to be moved to a position according to the screen moving operation.
  • the screen movement operation may be a touch and drag operation in which a sub screen is dragged while being touched.
  • a screen division method of a mobile terminal comprises: dividing a screen into a plurality of sub screens based on a screen division operation when the screen division operation is performed through a touch screen unit provided in the mobile terminal; and outputting content to each of the divided sub screens.
  • the outputting of the content may include outputting first content, which was output through the screen before the screen division operation, on one sub screen of the plurality of sub screens, and outputting second content determined based on the first content on the remaining sub screens.
  • the second content may be content of the same type as the first content but different from the first content.
  • the second content may be content stored in a folder in which the first content is stored.
  • the type of the first content may be one of an image, a video, text, and an email.
  • when the first content is a multi-angle video, the second content may be the first content at an angle different from the angle of the first content being output.
  • the first content may be a multi-angle video streamed from the outside through a streaming server.
  • the screen division operation may be a touch operation in which a plurality of different points positioned at the edge of the screen are simultaneously touched, or a drag operation in which a drag is performed with a first point and a second point located at the edge of the screen as the starting point and the ending point, respectively.
  • the screen dividing step may include dividing the screen into the plurality of sub screens based on a plurality of touch points according to the screen dividing operation.
  • the screen dividing step may include splitting the screen into the plurality of sub-screens based on a plurality of touched points when the screen dividing operation is a touch operation in which a plurality of different points located at an edge of the screen are simultaneously touched.
  • when the screen split operation is a drag operation in which the first point and the second point positioned in the edge area of the screen are used as the starting point and the ending point, respectively, the screen dividing step may consist of dividing the screen into the plurality of sub screens based on the starting point and the ending point.
  • the method may further include changing the content output on the selected sub screen based on the content change operation when the content change operation is performed through the touch screen unit while the screen is divided into the plurality of sub screens.
  • the content change operation may be a drag operation that starts from a point located in the edge area of the screen.
  • the content changing step may include changing the content output on the sub screen at which the end point of the content change operation is located.
  • the content changing step may include, when the content output before the content change operation through the sub screen selected by the content change operation is a multi-angle video, outputting content having an angle different from the angle output before the content change operation to the sub screen selected by the content change operation.
  • the content changing step may include outputting, to the sub screen selected by the content change operation, content of the same type as, but different from, the content output before the content change operation.
  • the method may further include, when a screen moving operation is performed through the touch screen unit, moving the selected sub screen to a position according to the screen moving operation.
  • the screen moving operation may be a touch and drag operation in which a sub screen is dragged while being touched.
  • a computer-readable medium according to the present invention records a program for executing any one of the above methods on a computer.
  • according to the present invention, the screen is divided into a plurality of screens according to a user's touch operation and relevant content is output to each of the divided screens, so that the user can obtain more information on a single screen.
  • in addition, since the screen is divided according to each user's touch operation, each user can divide the screen as desired, thereby improving convenience.
  • FIG. 1 is a block diagram illustrating an internal structure of a portable terminal according to an exemplary embodiment of the present invention.
  • FIG. 2 is a flowchart illustrating a screen division process according to an exemplary embodiment of the present invention.
  • FIGS. 3 and 4 are diagrams for explaining an example of a screen division operation according to a preferred embodiment of the present invention.
  • FIG. 5 is a diagram illustrating an example of a divided sub screen according to an exemplary embodiment of the present invention.
  • FIGS. 6 and 7 are diagrams for explaining an example of a screen division operation according to a preferred embodiment of the present invention.
  • FIG. 8 is a diagram for explaining an example of a divided sub screen according to an exemplary embodiment of the present invention.
  • FIGS. 9 and 10 are diagrams for explaining an example of screen division when a screen division operation is performed in a state where the screen is divided into two sub screens, according to an exemplary embodiment of the present invention.
  • FIGS. 11 and 12 are diagrams for explaining an example of a screen division operation according to a preferred embodiment of the present invention.
  • FIG. 13 is a diagram for explaining an example of a divided sub screen according to an exemplary embodiment of the present invention.
  • FIGS. 14 and 15 are diagrams for explaining an example of a screen division operation according to a preferred embodiment of the present invention.
  • FIG. 16 is a diagram illustrating an example of a divided sub screen according to an exemplary embodiment of the present invention.
  • FIGS. 17 and 18 are diagrams for explaining an example of screen division when a screen division operation is performed in a state where the screen is divided into two sub screens, according to an exemplary embodiment of the present invention.
  • FIGS. 19 to 21 are diagrams for describing an example in which screen division occurs continuously according to an exemplary embodiment of the present invention.
  • FIG. 22 is a diagram for explaining an example of a content change operation according to a preferred embodiment of the present invention.
  • FIG. 23 is a diagram for explaining an example of a sub screen whose content is changed according to an exemplary embodiment of the present invention.
  • FIG. 24 is a diagram for explaining an example of a content change operation according to a preferred embodiment of the present invention.
  • FIG. 25 is a diagram for explaining an example of a sub screen whose content is changed according to an exemplary embodiment of the present invention.
  • FIG. 26 is a diagram for explaining an example of a screen moving operation according to a preferred embodiment of the present invention.
  • FIG. 27 is a diagram for explaining an example of a moved screen according to a preferred embodiment of the present invention.
  • FIG. 28 is a diagram for explaining an example of a screen moving operation according to a preferred embodiment of the present invention.
  • FIGS. 29 and 30 are diagrams for explaining an example of a moved screen according to a preferred embodiment of the present invention.
  • FIG. 1 is a block diagram illustrating an internal structure of a portable terminal according to an exemplary embodiment of the present invention.
  • the mobile terminal 100 includes a touch screen unit 110, a power unit 120, a wireless communication unit 130, an audio processor 140, a storage unit 150, and a controller 160.
  • the touch screen unit 110 includes a touch sensor unit 111 and a display unit 113.
  • the touch sensor unit 111 and the display unit 113 have a mutual layer structure.
  • the touch sensor unit 111 converts a change in pressure applied to a specific portion of the display unit 113 or capacitance generated at a specific portion into an electrical input signal.
  • the touch sensor unit 111 may detect not only the position and area of the touch but also the pressure at the touch.
  • the touch sensor unit 111 detects a user's touch input, generates a detection signal, and transmits the detected signal to the controller 160.
  • the sensing signal may include coordinate information touched by the user.
  • when the user moves (drags) while touching, the touch sensor unit 111 generates a detection signal including coordinate information of the movement path and transmits it to the controller 160.
  • the touch sensor unit 111 may include a touch sensing sensor such as a capacitive overlay, resistive overlay, surface acoustic wave, or infrared beam sensor, or a pressure sensor.
  • all kinds of sensor devices capable of detecting contact or pressure of an object may be configured as the touch sensor unit 111 of the present invention.
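  • for illustration only (this sketch is not taken from the patent and all names in it are hypothetical), the detection signal described above can be modeled in Kotlin as a small record carrying the touched coordinates and, for a drag, the coordinates of the movement path:

// Hypothetical model of the detection signal described above; all names are illustrative.
data class Point(val x: Float, val y: Float)

// A plain touch carries one coordinate per simultaneously touched point;
// a drag additionally carries the coordinates of the movement path in order.
data class DetectionSignal(
    val touchedPoints: List<Point>,
    val dragPath: List<Point> = emptyList()
)

fun main() {
    // Two simultaneous edge touches, e.g. the start of a screen division operation.
    val touch = DetectionSignal(touchedPoints = listOf(Point(0f, 300f), Point(720f, 300f)))
    // A drag from one edge point to another, with intermediate samples of the path.
    val drag = DetectionSignal(
        touchedPoints = listOf(Point(0f, 300f)),
        dragPath = listOf(Point(0f, 300f), Point(360f, 300f), Point(720f, 300f))
    )
    println("touch: $touch")
    println("drag:  $drag")
}
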
  • the display unit 113 visually provides a menu, input data, function setting information, and various other information of the mobile terminal 100 to the user.
  • the display unit 113 outputs a boot screen, a standby screen, a menu screen, a call screen, and other application screens of the mobile terminal 100.
  • the display unit 113 may be formed as a liquid crystal display (LCD), an organic light emitting diode (OLED) display, an active matrix organic light emitting diode (AMOLED) display, a flexible display, a 3D display, or the like.
  • the power supply unit 120 includes a battery (not shown), and supplies power required for the operation of each component of the mobile terminal 100.
  • the battery may be an integrated battery fixed to the portable terminal 100 or a removable battery detachable to the portable terminal 100.
  • the wireless communication unit 130 performs a function of transmitting and receiving corresponding data for wireless communication of the mobile terminal 100.
  • the wireless communication unit 130 may include an RF transmitter for up-converting and amplifying a frequency of a transmitted signal, and an RF receiver for low noise amplifying and down-converting a received signal.
  • the wireless communication unit 130 may receive data through a wireless channel, output the data to the controller 160, and transmit data output from the controller 160 through the wireless channel.
  • the audio processor 140 may be configured as a codec, and the codec may be configured as a data codec for processing packet data and an audio codec for processing an audio signal such as voice.
  • the audio processor 140 converts a digital audio signal into an analog audio signal through the audio codec and plays it back through the speaker (SPK), and converts an analog audio signal input from the microphone (MIC) into a digital audio signal through the audio codec.
  • the storage unit 150 stores programs and data necessary for the operation of the mobile terminal 100 and may be divided into a program area and a data area.
  • the program area may store a program for controlling the overall operation of the mobile terminal 100, an operating system (OS) for booting the mobile terminal 100, applications necessary for playing multimedia content, and application programs required for other optional functions of the mobile terminal 100, such as a voice chat function, a camera function, a sound play function, and an image or video play function.
  • the data area is an area in which data generated according to the use of the mobile terminal 100 is stored, and may store an image, a video, a phone book, and audio data.
  • the controller 160 controls the overall operation of each component of the mobile terminal 100.
  • the controller 160 controls screen division, content change output on the screen, and screen movement according to a user's manipulation.
  • the controller 160 according to the present invention may include a screen division controller 161.
  • FIG. 2 is a flowchart illustrating a screen division process according to an exemplary embodiment of the present invention.
  • the screen division controller 161 divides the screen into a plurality of sub screens based on the screen division operation (S220). That is, when a screen division operation is performed through the touch screen unit 110, the screen division control unit 161 may divide the screen into a plurality of sub screens based on a plurality of touch points according to the screen division operation. For example, the screen division controller 161 may divide a screen into a plurality of sub screens according to a plurality of virtual regions obtained by connecting a plurality of touch points.
  • the screen division operation may be a touch operation in which a plurality of different points positioned at edges of the screen are simultaneously touched.
  • the touch operation refers to the user simultaneously touching a plurality of points with a finger or the like in the edge region of the screen, that is, the outer portion of the physical screen region or the bezel portion.
  • the screen division control unit 161 may divide the screen into a plurality of sub screens based on the plurality of touched points.
  • the screen division operation may be a drag operation of dragging the first point and the second point positioned in the edge area of the screen as starting and ending points, respectively.
  • the drag operation refers to the user touching a point (start point) in the edge area of the screen, that is, the outer portion of the physical screen area or the bezel portion, dragging while keeping the touch, and ending the touch at a point (end point) located in another edge area.
  • the screen split controller 161 may divide the screen into a plurality of sub screens based on the start point and the end point.
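  • as a rough sketch (not part of the patent; the snapping of the dividing line to horizontal or vertical is an assumption made here for illustration), the division of the screen based on two edge points, whether touched simultaneously or used as the start and end of a drag, could be computed as follows:

import kotlin.math.abs

// Illustrative sketch only: one way the controller could turn two edge points into two
// sub screens. The patent does not prescribe this exact computation.
data class Point(val x: Float, val y: Float)
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

// Splits `screen` along the virtual line connecting points p1 and p2.
// If the connecting line is closer to horizontal, the screen is split into an upper and
// a lower sub screen; otherwise into a left and a right sub screen.
fun splitScreen(screen: Rect, p1: Point, p2: Point): Pair<Rect, Rect> {
    val dx = abs(p1.x - p2.x)
    val dy = abs(p1.y - p2.y)
    return if (dx >= dy) {
        val y = (p1.y + p2.y) / 2f   // horizontal dividing line
        Rect(screen.left, screen.top, screen.right, y) to
            Rect(screen.left, y, screen.right, screen.bottom)
    } else {
        val x = (p1.x + p2.x) / 2f   // vertical dividing line
        Rect(screen.left, screen.top, x, screen.bottom) to
            Rect(x, screen.top, screen.right, screen.bottom)
    }
}

fun main() {
    val screen = Rect(0f, 0f, 720f, 1280f)
    // A simultaneous touch on the left and right edges at mid height, or equivalently
    // a drag that starts at the left edge and ends at the right edge.
    val (scrS1, scrS2) = splitScreen(screen, Point(0f, 640f), Point(720f, 640f))
    println("SCR_S1 = $scrS1")
    println("SCR_S2 = $scrS2")
}
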
  • the screen division controller 161 outputs the content to each of the plurality of divided sub screens (S230).
  • the content refers to a photo, a TV broadcast, a video, a movie, an email, an e-book, and the like.
  • the screen division controller 161 may control such that the first content, which was output through the screen before the screen division operation, is output on one of the plurality of sub screens, and second content determined based on the first content is output on the remaining sub screens.
  • the second content may be content of the same type as the first content but different from the first content. Content types include images, videos, text, email, and the like. For example, if the first content is a video with the file name 'aaa.avi', the second content may be a video with the file name 'bbb.avi', which is of the same file type as the first content but is different content.
  • the second content may be content stored in the folder in which the first content is stored. For example, suppose the first content is an image with the file name 'aaa.jpg', an image with the file name 'ccc.jpg' is stored in the same folder as the first content, and images with other file names are stored in folders other than the folder in which the first content is stored. In this case, the second content may be the image with the file name 'ccc.jpg', which is of the same type as the first content and is stored in the folder in which the first content is stored.
  • when the first content is a multi-angle video, the second content may be the first content at an angle different from the angle of the first content being output.
  • the multi-angle video includes images captured at a plurality of angles with respect to one scene.
  • for example, when the first content is a multi-angle video and the output angle of the first content is 'angle A', the second content may be the first content at 'angle B'. That is, when the screen is divided while a multi-angle video including images captured at a plurality of angles is being output, an image at a different angle is output to each sub screen.
  • the first content, that is, the multi-angle video, may be streamed from the outside through a streaming server (a sketch of the second content selection follows below).
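  • the following Kotlin sketch is illustrative only; it mirrors the selection rules described above (same folder and type but a different file, or a different angle of the same multi-angle video), and every name in it is hypothetical rather than taken from the patent:

// Illustrative sketch of choosing the second content; file names follow the examples in
// the text ('aaa.jpg'/'ccc.jpg', angle A/angle B) but the code itself is an assumption.
enum class ContentType { IMAGE, VIDEO, TEXT, EMAIL }

data class Content(
    val fileName: String,
    val type: ContentType,
    val folder: String,
    val angles: List<String> = emptyList()  // non-empty only for a multi-angle video
)

// Chooses the second content for the remaining sub screen based on the first content
// that was on screen before the split. Returns the content and, if relevant, the angle.
fun chooseSecondContent(
    first: Content,
    currentAngle: String?,     // angle of the first content currently output, if any
    library: List<Content>     // contents stored on the terminal
): Pair<Content, String?> {
    // Multi-angle video: same content, but at an angle different from the one on screen.
    if (first.angles.size > 1 && currentAngle != null) {
        val otherAngle = first.angles.first { it != currentAngle }
        return first to otherAngle
    }
    // Otherwise: same type, same folder, but a different file (throws if none exists).
    val second = library.first {
        it.folder == first.folder && it.type == first.type && it.fileName != first.fileName
    }
    return second to null
}

fun main() {
    val library = listOf(
        Content("aaa.jpg", ContentType.IMAGE, "/photos"),
        Content("ccc.jpg", ContentType.IMAGE, "/photos")
    )
    println(chooseSecondContent(library[0], null, library))          // -> ccc.jpg, same folder

    val multiAngle = Content("match.mp4", ContentType.VIDEO, "/stream",
        angles = listOf("angle A", "angle B"))
    println(chooseSecondContent(multiAngle, "angle A", listOf(multiAngle)))  // -> angle B
}
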
  • when a content change operation is performed through the touch screen unit 110 while the screen is divided into the plurality of sub screens (S240), the screen division controller 161 may change the content output on the selected sub screen based on the content change operation (S250).
  • the content change operation may be a drag operation that starts from a point located in the edge area of the screen.
  • the drag operation refers to the user touching a point (start point) in the edge area of the screen, that is, the outer portion of the physical screen area or the bezel portion, dragging while keeping the touch, and ending the touch at a point (end point) located on the screen.
  • the screen division controller 161 may change the content output on the sub screen at which the endpoint of the content change operation is located.
  • if the content output through the selected sub screen before the content change operation is a multi-angle video, the screen division controller 161 may output content having an angle different from the angle output before the content change operation to the sub screen selected by the content change operation.
  • the screen division controller 161 may also output, to the sub screen selected by the content change operation, content of the same type as, but different from, the content output before the content change operation (a sketch of this hit test and swap follows below).
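  • a minimal sketch of the content change handling described above, assuming the end point of the drag is simply hit-tested against the sub screen bounds; the names and the replacement rule are hypothetical, not taken from the patent:

// Illustrative sketch only: mapping the end point of a content change drag to a sub
// screen and replacing its content.
data class Point(val x: Float, val y: Float)
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(p: Point) = p.x in left..right && p.y in top..bottom
}

data class SubScreen(val id: String, val bounds: Rect, var content: String)

// Finds the sub screen in which the drag ended and changes its content.
// `nextContent` stands in for the selection rule described above (a different angle of a
// multi-angle video, or same-type different content).
fun applyContentChange(subScreens: List<SubScreen>, dragEnd: Point, nextContent: (String) -> String) {
    val target = subScreens.firstOrNull { it.bounds.contains(dragEnd) } ?: return
    target.content = nextContent(target.content)
}

fun main() {
    val subScreens = listOf(
        SubScreen("SCR_S1", Rect(0f, 0f, 720f, 640f), "angle A"),
        SubScreen("SCR_S2", Rect(0f, 640f, 720f, 1280f), "angle B")
    )
    // A drag that started at the bezel and ended inside the lower sub screen.
    applyContentChange(subScreens, dragEnd = Point(360f, 900f)) { old ->
        if (old == "angle B") "angle C" else "angle B"   // toy replacement rule
    }
    subScreens.forEach { println("${it.id}: ${it.content}") }
}
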
  • when a screen movement operation is performed through the touch screen unit 110 (S260), the screen division controller 161 may move the selected sub screen to a position according to the screen movement operation (S270).
  • the screen movement operation may be a touch and drag operation in which the sub screen is dragged while being touched. That is, the position of the sub screen selected by the screen moving operation moves as it is dragged. Thereafter, when the user ends the screen moving operation, the sub screen is fixed at the final position (a sketch follows below).
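  • a minimal sketch of the screen moving operation described above (hypothetical names; not code from the patent), in which the selected sub screen follows the drag path and stays at the last sampled position:

// Illustrative sketch only: moving a selected sub screen along a drag path and fixing it
// at the final position.
data class Point(val x: Float, val y: Float)
data class Rect(var left: Float, var top: Float, var right: Float, var bottom: Float) {
    fun offset(dx: Float, dy: Float) { left += dx; top += dy; right += dx; bottom += dy }
}

class MovableSubScreen(val id: String, val bounds: Rect) {
    // Moves the sub screen by the displacement between consecutive drag samples;
    // the last sample in `dragPath` determines the final (fixed) position.
    fun follow(dragPath: List<Point>) {
        dragPath.zipWithNext { prev, next -> bounds.offset(next.x - prev.x, next.y - prev.y) }
    }
}

fun main() {
    val sub = MovableSubScreen("SCR_S2", Rect(0f, 640f, 360f, 1280f))
    // The user touches inside SCR_S2, drags it up and to the right, then releases.
    sub.follow(listOf(Point(100f, 900f), Point(200f, 800f), Point(300f, 700f)))
    println("${sub.id} fixed at ${sub.bounds}")
}
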
  • in FIG. 2, the screen movement related steps S260 and S270 are illustrated as being performed after the content change related steps S240 and S250; however, this order is only for convenience of description and may be changed.
  • FIGS. 3 and 4 are views for explaining an example of a screen division operation according to a preferred embodiment of the present invention.
  • FIG. 5 is a view for explaining an example of a divided sub screen according to a preferred embodiment of the present invention.
  • an example of a screen division operation refers to a touch operation in which a user simultaneously touches the first point TP_1 and the second point TP_2 located in the edge area of the screen SCR_M with a finger or the like.
  • another example of the screen division operation is a drag operation in which the user drags (DG) to the end point TP_E while touching the start point TP_S located in the edge area of the screen SCR_M with a finger or the like.
  • when the screen division operation is performed, the screen SCR_M is divided into a plurality of sub screens SCR_S1 and SCR_S2, and content is output to each of the divided sub screens SCR_S1 and SCR_S2.
  • the first content output through the screen SCR_M before the screen division operation is output to one sub screen SCR_S1 of the plurality of sub screens SCR_S1 and SCR_S2, and second content determined based on the first content is output to the remaining sub screen SCR_S2.
  • for example, when the first content is a multi-angle video, the first content at the angle output through the screen SCR_M before the screen division operation may be output to the first sub screen SCR_S1, and the first content at an angle different from that output on the first sub screen SCR_S1 may be output to the second sub screen SCR_S2.
  • FIGS. 6 and 7 are views for explaining an example of a screen division operation according to a preferred embodiment of the present invention.
  • FIG. 8 is a view for explaining an example of a divided sub screen according to a preferred embodiment of the present invention.
  • FIGS. 9 and 10 are diagrams for explaining an example of screen division when a screen division operation is performed in a state where the screen is divided into two sub screens, according to an exemplary embodiment of the present invention.
  • an example of a screen division operation refers to a touch operation in which a user simultaneously touches the first point TP_1 and the second point TP_2 located in the edge area of the screen SCR_M with a finger or the like.
  • another example of the screen division operation is a drag operation in which the user drags (DG) to the end point TP_E while touching the start point TP_S located in the edge area of the screen SCR_M with a finger or the like.
  • when the screen division operation is performed, the screen SCR_M is divided into a plurality of sub screens SCR_S1 and SCR_S2, and content is output to each of the divided sub screens SCR_S1 and SCR_S2.
  • when the user again simultaneously touches the first point TP_1 and the second point TP_2 positioned in the edge area of the screen with a finger or the like, that is, performs a screen division operation by a touch operation, each of the sub screens SCR_S1 and SCR_S2 may be divided again.
  • a total of four sub-screens SCR_S11, SCR_S12, SCR_S21, and SCR_S22 are created.
  • FIGS. 11 and 12 are views for explaining an example of a screen division operation according to a preferred embodiment of the present invention.
  • FIG. 13 is a view for explaining an example of a divided sub screen according to a preferred embodiment of the present invention.
  • an example of a screen division operation refers to a touch operation in which a user simultaneously touches the first point TP_1 and the second point TP_2 located in the edge area of the screen SCR_M through a finger or the like.
  • another example of the screen division operation is a drag operation in which the user drags (DG) to the end point TP_E while touching the start point TP_S located in the edge area of the screen SCR_M with a finger or the like.
  • when the screen division operation is performed, the screen SCR_M is divided into a plurality of sub screens SCR_S1 and SCR_S2, and content is output to each of the divided sub screens SCR_S1 and SCR_S2.
  • the first content output through the screen SCR_M before the screen division operation is output to one sub screen SCR_S1 of the plurality of sub screens SCR_S1 and SCR_S2, and second content determined based on the first content is output to the remaining sub screen SCR_S2.
  • for example, when the first content output through the screen SCR_M before the screen division operation is an image, that image is displayed on the first sub screen SCR_S1, and the second sub screen SCR_S2 may output second content that is of the same type as, but different from, the first content output on the first sub screen SCR_S1.
  • the second content may be content stored in a folder in which the first content is stored.
  • FIGS. 19 to 21 are views for explaining an example in which screen division according to a preferred embodiment of the present invention occurs continuously.
  • an example of a screen division operation refers to a touch operation in which a user simultaneously touches the first point TP_1 and the second point TP_2 located in the edge area of the screen SCR_M through a finger or the like.
  • another example of the screen division operation is a drag operation in which the user drags (DG) to the end point TP_E while touching the start point TP_S located in the edge area of the screen SCR_M with a finger or the like.
  • when the screen division operation is performed, the screen SCR_M is divided into a plurality of sub screens SCR_S1 and SCR_S2, and content is output to each of the divided sub screens SCR_S1 and SCR_S2.
  • when the user again simultaneously touches the first point TP_1 and the second point TP_2 positioned in the edge area of the screen with a finger or the like, that is, performs a screen division operation by a touch operation, each of the sub screens SCR_S1 and SCR_S2 may be divided again.
  • a total of four sub-screens SCR_S11, SCR_S12, SCR_S21, and SCR_S22 are created.
  • the first sub screen SCR_S11 is divided into two sub screens SCR_S111 and SCR_S112, and the second sub screen SCR_S21 is also divided into two sub screens SCR_S211 and SCR_S212.
  • when a screen division operation is performed on the second sub screen SCR_S12 and the fourth sub screen SCR_S22 while the screen is divided into six sub screens SCR_S111, SCR_S112, SCR_S12, SCR_S211, SCR_S212, and SCR_S22, the second sub screen SCR_S12 is divided into two sub screens SCR_S121 and SCR_S122, and the fourth sub screen SCR_S22 is also divided into two sub screens SCR_S221 and SCR_S222 (the sketch below models this repeated division).
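  • the repeated division illustrated in FIGS. 19 to 21 can be modeled, purely as an illustration with hypothetical names, by replacing each selected sub screen with two halves each time a screen division operation is applied:

// Illustrative sketch only: repeated screen division modeled as replacing selected sub
// screens with two halves, reproducing the 2 -> 4 -> 6 -> 8 progression described above.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

// Splits one rectangle in half along its longer side.
fun halve(r: Rect): List<Rect> =
    if (r.right - r.left >= r.bottom - r.top) {
        val midX = (r.left + r.right) / 2f
        listOf(Rect(r.left, r.top, midX, r.bottom), Rect(midX, r.top, r.right, r.bottom))
    } else {
        val midY = (r.top + r.bottom) / 2f
        listOf(Rect(r.left, r.top, r.right, midY), Rect(r.left, midY, r.right, r.bottom))
    }

// Applies a screen division operation to the sub screens at the given indices.
fun divide(subScreens: List<Rect>, indices: Set<Int>): List<Rect> =
    subScreens.flatMapIndexed { i, r -> if (i in indices) halve(r) else listOf(r) }

fun main() {
    var screens = listOf(Rect(0f, 0f, 720f, 1280f))
    screens = divide(screens, setOf(0))        // 1 -> 2 sub screens
    screens = divide(screens, setOf(0, 1))     // 2 -> 4 sub screens
    screens = divide(screens, setOf(0, 2))     // 4 -> 6 sub screens
    screens = divide(screens, setOf(2, 5))     // 6 -> 8 sub screens
    println("${screens.size} sub screens")
    screens.forEach(::println)
}
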
  • FIG. 22 is a view for explaining an example of a content change operation according to a preferred embodiment of the present invention
  • FIG. 23 is a view for explaining an example of a sub screen in which content is changed according to a preferred embodiment of the present invention.
  • an example of a content change operation refers to a drag operation of dragging (DG) to an end point TP_E while a user touches a start point TP_S positioned in an edge region of a screen with a finger or the like.
  • by the content change operation, that is, the drag operation, the content output on the sub screen SCR_S2 where the end point TP_E is located may be changed.
  • for example, if the content output through the sub screen SCR_S2 before the content change operation is a multi-angle video, content at an angle different from the angle output before the content change operation may be output to the sub screen SCR_S2, as shown in FIG. 23.
  • FIG. 24 is a view for explaining an example of a content change operation according to a preferred embodiment of the present invention
  • FIG. 25 is a view for explaining an example of a sub screen in which content is changed according to a preferred embodiment of the present invention.
  • an example of a content change operation refers to a drag operation of dragging (DG) to an end point TP_E while a user touches a start point TP_S positioned in an edge region of a screen with a finger or the like.
  • by the content change operation, that is, the drag operation, the content output on the sub screen SCR_S2 where the end point TP_E is located may be changed.
  • in this case, content of the same type as, but different from, the content output before the content change operation may be output to the sub screen SCR_S2, as shown in FIG. 25.
  • FIG. 26 is a view for explaining an example of a screen moving operation according to a preferred embodiment of the present invention
  • Figure 27 is a view for explaining an example of a moved screen according to a preferred embodiment of the present invention.
  • an example of a screen moving operation refers to a touch and drag operation in which the user drags a point TP positioned in the sub screen SCR_S2 while touching it with a finger.
  • when the screen moving operation is performed, the position of the sub screen SCR_S2 may move as it is dragged.
  • FIGS. 29 and 30 are views for explaining an example of a moved screen according to a preferred embodiment of the present invention.
  • the sub screen SCR_S2 may be output in PIP (Picture In Picture) format.
  • by the screen moving operation, that is, dragging, the position of the sub screen SCR_S2 may be moved, as shown in FIG. 30.
  • the invention can also be embodied as computer readable code on a computer readable recording medium.
  • the computer-readable recording medium includes all kinds of recording devices in which data that can be read by a computer device is stored. Examples of computer-readable recording media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and also include media implemented in the form of carrier waves (for example, transmission over the Internet).
  • the computer-readable recording medium can also be distributed over computer devices connected over a wired or wireless communication network so that the computer-readable code is stored and executed in a distributed fashion.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a device and a method for splitting the screen of a mobile terminal. The present invention divides a screen into a plurality of areas in response to a touch operation and delivers relevant content to each of the respective divided screens. According to the present invention, the screen is divided into the plurality of areas in response to a user's touch operation and content relevant to the other screens is delivered on each of the divided screens, so that the user can obtain more information on a single screen. The screen is divided in response to the user's touch operation so that each user can divide the screen as desired, which improves ease of use.
PCT/KR2014/001860 2014-01-22 2014-03-06 Dispositif et procédé pour diviser l'écran d'un terminal mobile WO2015111791A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20140007897 2014-01-22
KR10-2014-0007897 2014-01-22

Publications (1)

Publication Number Publication Date
WO2015111791A1 true WO2015111791A1 (fr) 2015-07-30

Family

ID=53681577

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2014/001860 WO2015111791A1 (fr) 2014-01-22 2014-03-06 Dispositif et procédé pour diviser l'écran d'un terminal mobile

Country Status (1)

Country Link
WO (1) WO2015111791A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090249235A1 (en) * 2008-03-25 2009-10-01 Samsung Electronics Co. Ltd. Apparatus and method for splitting and displaying screen of touch screen
KR20100032660A (ko) * 2008-09-18 2010-03-26 삼성전자주식회사 휴대단말기의 터치스크린 동작 제어 방법 및 장치
KR20110023404A (ko) * 2009-08-31 2011-03-08 김기수 방송프로그램 메타데이터에 기반한 실시간 콘텐츠 제공시스템 및 방법
KR20110050248A (ko) * 2009-11-06 2011-05-13 엘지전자 주식회사 이동 단말기 및 그 화면 분할 방법
KR20130090467A (ko) * 2012-02-06 2013-08-14 엘지전자 주식회사 이동 단말기 및 그 제어방법, 이를 위한 기록매체
US20130222321A1 (en) * 2011-06-20 2013-08-29 Alexander Buening Method And System To Launch And Manage An Application On A Computer System Having A Touch Panel Input Device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090249235A1 (en) * 2008-03-25 2009-10-01 Samsung Electronics Co. Ltd. Apparatus and method for splitting and displaying screen of touch screen
KR20100032660A (ko) * 2008-09-18 2010-03-26 삼성전자주식회사 휴대단말기의 터치스크린 동작 제어 방법 및 장치
KR20110023404A (ko) * 2009-08-31 2011-03-08 김기수 방송프로그램 메타데이터에 기반한 실시간 콘텐츠 제공시스템 및 방법
KR20110050248A (ko) * 2009-11-06 2011-05-13 엘지전자 주식회사 이동 단말기 및 그 화면 분할 방법
US20130222321A1 (en) * 2011-06-20 2013-08-29 Alexander Buening Method And System To Launch And Manage An Application On A Computer System Having A Touch Panel Input Device
KR20130090467A (ko) * 2012-02-06 2013-08-14 엘지전자 주식회사 이동 단말기 및 그 제어방법, 이를 위한 기록매체

Similar Documents

Publication Publication Date Title
WO2021098678A1 (fr) Procédé de commande de vidéocapture d'écran et dispositif électronique
WO2017082519A1 (fr) Dispositif de terminal utilisateur pour recommander un message de réponse et procédé associé
WO2017111358A1 (fr) Dispositif de terminal d'utilisateur et procédé de conversion de mode ainsi que système sonore permettant de régler le volume de haut-parleur de ce dernier
WO2015119480A1 (fr) Dispositif terminal utilisateur et son procédé d'affichage
WO2016195291A1 (fr) Appareil terminal d'utilisateur et son procédé de commande
WO2016060514A1 (fr) Procédé pour partager un écran entre des dispositifs et dispositif l'utilisant
WO2015005605A1 (fr) Utilisation à distance d'applications à l'aide de données reçues
WO2014092476A1 (fr) Appareil d'affichage, appareil de commande à distance, et procédé pour fournir une interface utilisateur les utilisant
WO2015030364A1 (fr) Procédé pour partager des données multimédias et dispositif électronique associé
WO2015119485A1 (fr) Dispositif de terminal utilisateur et son procédé d'affichage
WO2013191462A1 (fr) Terminal et procédé d'exploitation du terminal
WO2017119664A1 (fr) Appareil d'affichage et ses procédés de commande
WO2016028042A1 (fr) Procédé de fourniture d'une image visuelle d'un son et dispositif électronique mettant en œuvre le procédé
WO2020220999A1 (fr) Procédé de capture d'écran et dispositif terminal
WO2015005628A1 (fr) Dispositif portable pour fournir un composant iu combiné, et procédé de commande de celui-ci
WO2020162709A1 (fr) Dispositif électronique pour la fourniture de données graphiques basées sur une voix et son procédé de fonctionnement
WO2020020126A1 (fr) Procédé de traitement d'informations et terminal
EP3039563A1 (fr) Procédé d'affichage multiple, support de stockage et dispositif électronique
WO2014092469A1 (fr) Appareil de lecture de contenu, procédé de fourniture d'une interface utilisateur (ui) d'un appareil de lecture de contenu, serveur de réseau et procédé de commande par un serveur de réseau
WO2014073847A1 (fr) Terminal d'utilisateur, appareil externe, système d'émission-réception de données, et procédé d'émission-réception de données
WO2014112847A1 (fr) Procédé et dispositif électronique de fourniture d'un guide
WO2014104658A1 (fr) Procédé et système d'exécution d'une application
WO2014030956A1 (fr) Appareil de téléchargement en amont de contenus, appareil terminal d'utilisateur de téléchargement en aval de contenus, serveur, système de partage de contenus et leur procédé de partage de contenus
WO2014021627A1 (fr) Appareil et procédé permettant de fournir un service multiécran dans un système de radiodiffusion
WO2014035171A1 (fr) Procédé et appareil permettant de transmettre un fichier pendant un appel vidéo sur un dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14879309

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14879309

Country of ref document: EP

Kind code of ref document: A1