WO2019198844A1 - Method and system for controlling a media player - Google Patents

Method and system for controlling a media player

Info

Publication number
WO2019198844A1
Authority
WO
WIPO (PCT)
Prior art keywords
objects
media player
computer
touch input
touch
Prior art date
Application number
PCT/KR2018/004287
Other languages
English (en)
French (fr)
Korean (ko)
Inventor
정재헌
류대원
조민경
조성용
최진원
Original Assignee
라인플러스 주식회사
Priority date
Filing date
Publication date
Application filed by 라인플러스 주식회사
Priority to PCT/KR2018/004287 (WO2019198844A1)
Priority to KR1020207027238A (KR102512879B1)
Priority to JP2020555857A (JP7183295B6)
Publication of WO2019198844A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0484 Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0487 Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on GUIs using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47217 End-user interface for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks

Definitions

  • The following description relates to a method and system for controlling a media player; more specifically, to a control method capable of providing a user experience (UX) and/or a user interface (UI) through which a user can easily process various operations for content provided by the media player, such as playing, stopping, selecting content, seeking within content, and adjusting the volume; to a computer device performing the control method; and to a computer program, stored in a computer-readable recording medium, coupled with the computer device to execute the control method on the computer.
  • Conventionally, a user experience (UX) and/or user interface (UI) for a media player takes the form of providing a physical key button corresponding to each of the various operations, or of arranging virtual buttons corresponding to each of those operations on the screen.
  • For example, Korean Patent Laid-Open Publication No. 10-2010-0019241 relates to a technology for controlling a media player, and discloses that a built-in media player can be controlled by receiving user input on an icon for controlling the media player displayed on a standby screen of a provided display means.
  • Provided are a control method capable of providing such a UX and/or UI, a computer device performing the control method, and a computer program coupled to a computer and stored in a computer-readable recording medium for executing the control method on the computer.
  • The control method may include: managing a plurality of layered objects for control of the media player; and displaying, through the touch screen, objects of a higher layer among the plurality of objects upon recognizing that a touch input is maintained for at least a first predetermined time at a first position on the touch screen of the computer device associated with execution of the media player. The objects of the lower layer may be managed and displayed in a corresponding manner.
  • The categories may include two or more of: (1) a first category including a previous-content selection operation, a next-content selection operation, and an operation for playing and stopping playback of the selected content; (2) a second category including a forward seek operation and a reverse seek operation for the selected content; and (3) a third category including a volume-increase operation and a volume-decrease operation.
  • Applying the operation corresponding to the second object to the media player may include adjusting the degree to which the operation is applied to the media player based on the distance between the third position and the second position: as the third position approaches the second position, the degree may be relatively decreased, and as the third position moves away from the second position, the degree may be relatively increased. Depending on the type of the second object, the degree may be any one of a decrease in seek speed, an increase in seek speed, a decrease in volume, and an increase in volume.
  • Displaying the objects of the upper layer may include dynamically determining the display positions of the objects of the upper layer based on the first position.
  • Displaying the objects of the lower layer may include dynamically determining the display positions of the objects of the lower layer based on the second position.
  • The control method may further include outputting at least one of visual, auditory, and tactile feedback for at least one of the touch input being maintained at the first position and the touch input being moved to the second position.
  • In combination with a computer, a computer program stored on a computer-readable recording medium for executing the control method on the computer is provided.
  • A computer-readable recording medium having recorded thereon a program for executing the control method on a computer is provided.
  • A computer device is provided that includes at least one processor implemented to execute computer-readable instructions. The at least one processor: manages a plurality of layered objects for control of a media player; displays, on the touch screen, objects of a higher layer among the plurality of objects upon recognizing that a touch input is maintained for at least a first predetermined time at a first position on a touch screen associated with execution of the media player; displays the objects of the lower layer corresponding to a first object upon recognizing that the touch input has moved from the first position to a second position corresponding to the first object among the displayed objects of the upper layer and has been maintained for at least a second predetermined time; and applies the operation corresponding to a selected second object of the lower layer to the media player.
  • FIG. 1 is a diagram illustrating an example of a network environment according to an embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating an example of a computer device according to an embodiment of the present invention.
  • FIG. 3 is a flowchart illustrating a control method according to an embodiment of the present invention.
  • FIG. 4 is a diagram illustrating an example of displaying an object of a higher layer according to an embodiment of the present invention.
  • FIGS. 5 to 7 are diagrams illustrating examples of displaying objects of a lower layer according to an embodiment of the present invention.
  • the control method according to embodiments of the present invention may be implemented through a computer device such as an electronic device to be described later.
  • a computer program according to an embodiment of the present invention may be installed and run on the computer device, and the computer device may perform the control method according to the embodiments of the present invention under the control of the driven computer program.
  • the above-described computer program may be stored in a computer-readable recording medium in combination with a computer device for causing the computer to execute the control method.
  • FIG. 1 is a diagram illustrating an example of a network environment according to an embodiment of the present invention.
  • the network environment of FIG. 1 illustrates an example including a plurality of electronic devices 110, 120, 130, and 140, a plurality of servers 150 and 160, and a network 170.
  • FIG. 1 is only an example for describing the invention; the numbers of electronic devices and servers are not limited to those shown in FIG. 1.
  • the plurality of electronic devices 110, 120, 130, and 140 may be a fixed terminal or a mobile terminal implemented with a computer device.
  • Examples of the plurality of electronic devices 110, 120, 130, and 140 include a smartphone, a mobile phone, a navigation device, a computer, a notebook computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), and a tablet PC.
  • Although FIG. 1 illustrates the shape of a smartphone as an example of the electronic device 110, the electronic device 110 may be substantially any of various physical computer devices capable of communicating with the other electronic devices 120, 130, and 140 and/or the servers 150 and 160 over the network 170 using a wireless or wired communication method.
  • the communication method is not limited and may include short-range wireless communication between devices as well as a communication method utilizing a communication network (for example, mobile communication network, wired internet, wireless internet, and broadcasting network) that the network 170 may include.
  • For example, the network 170 may include any one or more of networks such as a personal area network (PAN), a local area network (LAN), a campus area network (CAN), a metropolitan area network (MAN), a wide area network (WAN), a broadband network (BBN), and the Internet. The network 170 may also include any one or more of network topologies including a bus network, a star network, a ring network, a mesh network, a star-bus network, and a tree or hierarchical network, but is not limited thereto.
  • Each of the servers 150 and 160 may be implemented as a computer device or a plurality of computer devices that communicate with the plurality of electronic devices 110, 120, 130, and 140 over the network 170 to provide commands, code, files, content, services, and the like.
  • For example, the server 150 may be a system that provides a service (e.g., a content providing service, a social network service, a messaging service, a search service, a mail service, etc.) to the plurality of electronic devices 110, 120, 130, and 140 connected through the network 170.
  • The control method according to the exemplary embodiments of the present invention may be performed, as one example, in a case where the electronic device 110 receives content from the server 150 and plays it through a media player, and may also cover a case where the electronic device 110 plays content stored in its local storage through the media player without separate communication with the server 150.
  • FIG. 2 is a block diagram illustrating an example of a computer device according to an embodiment of the present invention.
  • Each of the plurality of electronic devices 110, 120, 130, and 140 or each of the servers 150 and 160 described above may be implemented by the computer device 200 shown in FIG. 2.
  • A computer program according to an embodiment may be installed and driven in the computer device 200, and the computer device 200 may perform the control method according to the embodiments of the present disclosure under the control of the driven computer program.
  • the computer device 200 may include a memory 210, a processor 220, a communication interface 230, and an input / output interface 240.
  • The memory 210 is a computer-readable recording medium and may include random access memory (RAM), read-only memory (ROM), and a permanent mass storage device such as a disk drive. A permanent mass storage device such as a ROM or disk drive may also be included in the computer device 200 as a separate permanent storage device distinct from the memory 210.
  • the memory 210 may store an operating system and at least one program code. These software components may be loaded into the memory 210 from a computer-readable recording medium separate from the memory 210.
  • Such a separate computer-readable recording medium may include a computer-readable recording medium such as a floppy drive, disk, tape, DVD / CD-ROM drive, memory card, and the like.
  • the software components may be loaded into the memory 210 via the communication interface 230 rather than the computer readable recording medium.
  • software components may be loaded into memory 210 of computer device 200 based on a computer program installed by files received via network 170.
  • the processor 220 may be configured to process instructions of a computer program by performing basic arithmetic, logic, and input / output operations. Instructions may be provided to the processor 220 by the memory 210 or the communication interface 230. For example, the processor 220 may be configured to execute a command received according to a program code stored in a recording device such as the memory 210.
  • The communication interface 230 may provide a function for the computer device 200 to communicate with other devices (e.g., the storage devices described above) through the network 170. For example, a request, command, data, or file generated by the processor 220 of the computer device 200 according to program code stored in a recording device such as the memory 210 may be transferred to other devices over the network 170 under the control of the communication interface 230. Conversely, signals, commands, data, files, and the like from other devices may be received by the computer device 200 through the communication interface 230 of the computer device 200 via the network 170. Signals, commands, data, and the like received through the communication interface 230 may be transferred to the processor 220 or the memory 210, and files and the like may be stored in a storage medium (the permanent storage device described above) that the computer device 200 may further include.
  • the input / output interface 240 may be a means for interfacing with the input / output device 250.
  • the input device may include a device such as a microphone, a keyboard or a mouse
  • the output device may include a device such as a display or a speaker.
  • the input / output interface 240 may be a means for interfacing with a device in which functions for input and output are integrated into one, such as a touch screen.
  • The input/output device 250 may also be configured as a single device together with the computer device 200.
  • The computer device 200 may include fewer or more components than those shown in FIG. 2. However, most conventional components need not be clearly illustrated.
  • the computer device 200 may be implemented to include at least some of the input and output devices 250 described above, or may further include other components such as a transceiver, a database, and the like.
  • The control method according to the present embodiment may be performed by the computer device 200 implementing any one of the plurality of electronic devices 110, 120, 130, and 140 described above, and the computer device 200 may include a touch-screen environment.
  • the processor 220 of the computer device 200 may be implemented to execute a control instruction according to a code of an operating system included in the memory 210 or a code of at least one program.
  • the program code described above may be code of an application providing a media player.
  • The processor 220 may control the computer device 200 to perform steps 310 through 340 included in the control method of FIG. 3 according to control commands provided by the code stored in the computer device 200.
  • In step 310, the computer device 200 may manage a plurality of layered objects for controlling the media player. For example, the computer device 200 may manage objects of a higher layer corresponding to each of the categories that classify the plurality of operations defined for the media player, and objects of a lower layer corresponding to each of the plurality of operations.
  • As described above, the categories may include two or more of: (1) a first category including a previous-content selection operation, a next-content selection operation, and an operation for playing and stopping playback of the selected content; (2) a second category including a forward seek operation and a reverse seek operation for the selected content; and (3) a third category including a volume-increase operation and a volume-decrease operation.
  • For example, a first operation for playing and stopping playback of the currently selected content, a second operation for selecting the previous content of the currently selected content, and a third operation for selecting the next content of the currently selected content may be included in the first category. In this case, the first category may correspond to object 1-1 of the upper layer, and object 1-1 may correspond to object 2-1 of the lower layer for the first operation, object 2-2 of the lower layer for the second operation, and object 2-3 of the lower layer for the third operation.
  • Likewise, a fourth operation for forward seeking within the selected content and a fifth operation for reverse seeking within the selected content may be included in the second category. In this case, the second category may correspond to object 1-2 of the upper layer, and object 1-2 may correspond to object 2-4 of the lower layer for the fourth operation and object 2-5 of the lower layer for the fifth operation.
  • Similarly, a sixth operation for increasing the volume and a seventh operation for decreasing the volume may be included in the third category. In this case, the third category may correspond to object 1-3 of the upper layer, and object 1-3 may correspond to object 2-6 of the lower layer for the sixth operation and object 2-7 of the lower layer for the seventh operation.
  • The layering of such objects may be configured according to the operations defined for the media player and, depending on the embodiment, may comprise three or more layers or may include additional objects. For example, the operation for playing the selected content and the operation for stopping playback may be separated into individual objects through a third layer, and an operation for increasing or decreasing screen brightness may be added for video playback.
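To make this two-layer structure concrete, the following is a minimal sketch in Python of how the category objects (upper layer) and operation objects (lower layer) described above might be modeled. The class and field names are hypothetical; the embodiments do not prescribe any particular data structure.

```python
from dataclasses import dataclass, field


@dataclass
class OperationObject:
    """Lower-layer object: one concrete media-player operation."""
    object_id: str   # e.g. "2-1"
    operation: str   # e.g. "play_pause"


@dataclass
class CategoryObject:
    """Upper-layer object: a category grouping related operations."""
    object_id: str   # e.g. "1-1"
    name: str
    children: list = field(default_factory=list)


# The three categories described above, each with its lower-layer objects.
LAYERED_OBJECTS = [
    CategoryObject("1-1", "playback", [
        OperationObject("2-1", "play_pause"),        # first operation
        OperationObject("2-2", "previous_content"),  # second operation
        OperationObject("2-3", "next_content"),      # third operation
    ]),
    CategoryObject("1-2", "seek", [
        OperationObject("2-4", "seek_forward"),      # fourth operation
        OperationObject("2-5", "seek_backward"),     # fifth operation
    ]),
    CategoryObject("1-3", "volume", [
        OperationObject("2-6", "volume_up"),         # sixth operation
        OperationObject("2-7", "volume_down"),       # seventh operation
    ]),
]
```

A third layer (e.g., splitting play and stop into separate objects, or adding a brightness category) would simply nest another list of child objects.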
  • In step 320, when the computer device 200 recognizes that a touch input is maintained for at least a first predetermined time at a first position on the touch screen of the computer device associated with execution of the media player, the objects of the upper layer among the plurality of objects may be displayed on the touch screen.
  • Here, the computer device 200 may dynamically determine the positions at which the objects of the upper layer are displayed on the touch-screen display, that is, the display positions of the objects of the upper layer, based on the first position, which is the initial position of the touch input. The display position may be a position corresponding to any one point (for example, a center point) among the points included in the area in which an object is displayed, distinguishing it from other objects and other image elements on the touch screen.
  • For example, the computer device 200 may dynamically calculate the positions of objects 1-1, 1-2, and 1-3 of the upper layer, corresponding to the first, second, and third categories described above, such that their center points are located at the same distance from the initial coordinates of the touch input (e.g., coordinates on the touch screen) and form the vertices of an equilateral triangle. If there are two objects, their positions may be calculated such that their center points are located at the same distance from the position where the touch input is maintained, for instance above, below, to the left of, or to the right of that position. If there are four objects, the positions may be calculated such that the center point of each of the four objects forms a vertex of a square at the same distance from the initial coordinates of the touch input.
  • the arrangement of the objects may be variously set according to an embodiment, and the display position thereof may be dynamically calculated according to the position of the touch input.
  • Each of objects 1-1, 1-2, and 1-3 may be displayed at its dynamically calculated position while the touch input is maintained. If the touch input is released in step 320, the computer device 200 may stop displaying the objects of the upper layer.
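The equal-distance arrangements described above amount to spacing the object center points evenly on a circle around the touch coordinates. A minimal sketch of that computation follows; the radius and starting angle are assumed parameters, not values taken from the embodiments.

```python
import math


def radial_positions(center, count, radius=120.0, start_deg=-90.0):
    """Place `count` object center points at an equal distance from `center`,
    evenly spaced on a circle: 3 objects form an equilateral triangle,
    4 objects form a square, 2 objects sit on opposite sides."""
    cx, cy = center
    positions = []
    for i in range(count):
        angle = math.radians(start_deg + 360.0 * i / count)
        positions.append((cx + radius * math.cos(angle),
                          cy + radius * math.sin(angle)))
    return positions


# Upper-layer objects 1-1, 1-2, 1-3 around the first (touch) position:
print(radial_positions((180.0, 520.0), 3))
```

The same function can be reused in step 330 for the lower-layer objects, with the second position (the current touch coordinates) as the new center.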
  • In step 330, when the computer device 200 recognizes that the touch input has moved from the first position to a second position corresponding to a first object among the displayed objects of the upper layer and has been maintained there for at least a second predetermined time, the objects of the lower layer corresponding to the first object may be displayed on the touch screen. For example, if, while object 1-1, object 1-2, and object 1-3 are displayed as the objects of the upper layer, the position of the touch input moves to the second position within the area where object 1-1 is displayed, then object 2-1, object 2-2, and object 2-3 of the lower layer corresponding to object 1-1 may be displayed instead of object 1-1, object 1-2, and object 1-3.
  • Here, the computer device 200 may dynamically determine the display positions of the objects of the lower layer based on the second position. For example, the position to which the touch input has moved in step 320 may be on object 1-1 among objects 1-1, 1-2, and 1-3, and based on that position (the second position) the computer device 200 may dynamically calculate the positions of object 2-1, object 2-2, and object 2-3, which are the objects of the lower layer of object 1-1. As a concrete example, the computer device 200 may calculate the positions of objects 2-1, 2-2, and 2-3 such that their center points are located at the same distance from the position where the touch input is maintained and form the vertices of an equilateral triangle. As described above, the arrangement of the objects may be set in various ways, and the display positions may be dynamically calculated according to the current coordinates at which the moved touch input is maintained.
  • In step 340, when the computer device 200 recognizes that the touch input has moved from the second position to a third position corresponding to a second object among the displayed objects of the lower layer and has been maintained there for at least a third predetermined time (for example, maintained for at least 0.5 seconds after moving from the second position to the third position corresponding to the second object), the operation corresponding to the second object may be applied to the media player.
  • Consider, for example, a case in which object 1-1 is selected in step 320 so that object 2-1, object 2-2, and object 2-3 are displayed in step 330, and object 2-1 is then selected in step 340.
  • Object 2-1 may correspond to the first operation for playing and stopping playback of the selected content, and accordingly a play or stop operation may be applied to the media player: if the content selected through the media player is currently playing, playback may be stopped; if it is stopped, playback may be started. In other words, simply by touching the touch-screen display and moving the touch position to object 1-1 and then on to object 2-1, the user can have the play or stop operation processed.
  • Because the display positions of the objects are based on the user's touch position, the user can easily handle such play or stop operations with one hand.
  • Alternatively, consider a case in which object 2-2 is selected in step 340.
  • Object 2-2 may correspond to the second operation for selecting the previous content of the currently selected content, so the user can select the previous content simply by touching the touch-screen display, moving the touch position to object 1-1, and then to object 2-2. If the user moves the touch to object 1-1 and then to object 2-3, the next content can be selected.
  • Similarly, the user may move the touch to object 1-2 or object 1-3 and then to one of the corresponding objects of the lower layer (e.g., object 2-4 or object 2-5 of the lower layer corresponding to object 1-2 of the higher layer) to apply seek or volume operations; a sketch of this hold-and-drag sequence follows.
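Read as a whole, steps 310 through 340 form a small state machine driven by touch events: hold to reveal the upper layer, drag onto a category to reveal its lower layer, drag onto an operation to apply it. The sketch below is one possible reading, with the hold threshold and the hit-testing function as assumptions; `player` stands in for any object exposing an `apply(operation)` method.

```python
IDLE, UPPER_SHOWN, LOWER_SHOWN = "idle", "upper_shown", "lower_shown"


class GestureController:
    """Drives the layered-object UI from (position, held_for) touch samples."""

    def __init__(self, player, hit_test, hold_s=0.5):
        self.player = player      # anything with an apply(operation) method
        self.hit_test = hit_test  # (position, layer) -> object id or None
        self.hold_s = hold_s      # assumed hold threshold in seconds
        self.state = IDLE
        self.selected_upper = None

    def on_touch(self, position, held_for):
        """Called while the touch is down; `held_for` is how long the touch
        has stayed over its current target (initial position or object)."""
        if self.state == IDLE and held_for >= self.hold_s:
            self.state = UPPER_SHOWN            # step 320: show upper layer
            print("display upper-layer objects around", position)
        elif self.state == UPPER_SHOWN:
            obj = self.hit_test(position, 1)
            if obj is not None and held_for >= self.hold_s:
                self.selected_upper = obj
                self.state = LOWER_SHOWN        # step 330: show lower layer
                print("display lower-layer objects of", obj, "around", position)
        elif self.state == LOWER_SHOWN:
            obj = self.hit_test(position, 2)
            if obj is not None and held_for >= self.hold_s:
                self.player.apply(obj)          # step 340: apply operation

    def on_release(self):
        """Releasing the touch dismisses any displayed objects."""
        self.state = IDLE
        self.selected_upper = None
```

Because every transition is keyed to the current touch position, the whole sequence can be completed in a single one-handed drag, which is the point of the design.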
  • FIG. 4 is a diagram illustrating an example of displaying an object of a higher layer according to an embodiment of the present invention.
  • FIG. 4 illustrates a smartphone 400 as an example of the computer device 200 described above: a first screen 410 displays a track image 420 of an album through a media player, and the user touches a first position 430 on the first screen 410.
  • In this case, as shown in a second screen 440, the smartphone 400 may display, as the objects of the upper layer, an object 450 for playback operations, an object 460 for seek operations, and an object 470 for volume operations.
  • the user's touch may be maintained at the first position 430, and when the user's touch is released, the display of the objects 450, 460, and 470 may also be released.
  • Recognition of the touch associated with displaying the objects 450, 460, and 470 may be performed over the entire area of the first screen 410 or, depending on the embodiment, may be limited to a predetermined area of the first screen 410.
  • The user may select one of the objects of the upper layer by moving the held touch to the position of one of the displayed objects 450, 460, and 470, in which case the smartphone 400 may display the objects of the lower layer corresponding to the selected object of the upper layer.
  • FIGS. 5 to 7 are diagrams illustrating examples of displaying objects of a lower layer according to an embodiment of the present invention.
  • A first screen 510 of FIG. 5 shows an example in which the touch held by the user moves from the first position 430 on the second screen 440 of FIG. 4 to a second position 520 on the object 450 for playback operations. In this case, objects of the lower layer corresponding to the object 450 may be provided based on the second position 520. A second screen 530 of FIG. 5 shows an example in which, as the objects of the lower layer corresponding to the object 450, an object 540 for playing and stopping playback of the selected content, an object 550 for selecting the previous content, and an object 560 for selecting the next content are provided based on the second position 520.
  • In this case, as the user moves the held touch onto one of the objects of the lower layer, the operation corresponding to that object may be applied to the media player. For example, when the user moves the touch from the second position 520 onto the object 540, playback of the selected content may be started or stopped; moving the touch onto the object 550 selects the previous content in the playlist, and moving it onto the object 560 selects the next content in the playlist.
  • A first screen 610 of FIG. 6 shows an example in which the touch held by the user moves from the first position 430 on the second screen 440 of FIG. 4 to a third position 620 on the object 460 for seek operations. In this case, objects of the lower layer corresponding to the object 460 may be provided based on the third position 620. A second screen 630 of FIG. 6 shows an example in which, as the objects of the lower layer corresponding to the object 460, an object 640 for forward seeking and an object 650 for reverse seeking are provided based on the third position 620. As the user moves the held touch onto one of these objects, the corresponding operation may be applied to the media player.
  • For example, when the user moves the touch onto the object 640, forward seeking within the selected content may be performed. In one embodiment, the degree of forward seeking may be determined based on how long the user's touch remains positioned on the object 640. In another embodiment, the degree may be determined according to the position of the touch on the object 640: the closer the touch is to the third position 620, the lower the forward seek speed, and the farther the touch is from the third position 620 (e.g., the closer it is to the edge of the object 640), the higher the forward seek speed.
  • Likewise, the degree of reverse seeking may be determined based on how long the user's touch remains positioned on the object 650, or according to the position of the touch on the object 650: the closer the touch is to the third position 620, the lower the reverse seek speed, and the farther the touch is from the third position 620, the higher the reverse seek speed.
  • FIG. 7 shows an example in which the touch held by the user moves from the first position 430 on the second screen 440 of FIG. 4 to a fourth position 720 on the object 470 for volume operations. In this case, objects of the lower layer corresponding to the object 470 may be provided based on the fourth position 720. A second screen 730 of FIG. 7 shows an example in which, as the objects of the lower layer corresponding to the object 470, an object 740 for increasing the volume and an object 750 for decreasing the volume are provided based on the fourth position 720. As the user moves the held touch onto one of these objects, the corresponding operation may be applied to the media player.
  • For example, the degree of volume increase may be determined based on how long the user's touch remains positioned on the object 740, or according to the position of the touch on the object 740: the closer the touch is to the fourth position 720 (e.g., the closer it is to the center of the object 740), the smaller the volume increase, and the farther the touch is from the fourth position 720 (e.g., the closer it is to the edge of the object 740), the larger the volume increase.
  • Likewise, the degree of volume decrease may be determined based on how long the user's touch remains positioned on the object 750, or according to the position of the touch on the object 750: the closer the touch is to the fourth position 720, the smaller the volume decrease, and the farther the touch is from the fourth position 720, the larger the volume decrease. A sketch of this distance-based scaling follows.
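The seek-speed behavior of objects 640 and 650 and the volume behavior of objects 740 and 750 share one rule: the farther the touch is from the reference position where the lower layer was opened, the stronger the effect. A minimal sketch of that mapping, with the maximum distance and output range as assumed parameters:

```python
def degree_from_distance(touch, reference, max_distance=120.0,
                         min_degree=0.2, max_degree=2.0):
    """Linearly map the touch's distance from the reference position to a
    degree of application: near the reference -> weaker effect, near the
    object's edge (max_distance away) -> stronger effect."""
    dx = touch[0] - reference[0]
    dy = touch[1] - reference[1]
    distance = min((dx * dx + dy * dy) ** 0.5, max_distance)
    t = distance / max_distance
    return min_degree + t * (max_degree - min_degree)


# e.g. a forward/reverse seek-speed multiplier, or a per-tick volume step:
speed = degree_from_distance((260.0, 520.0), (180.0, 520.0))
print(speed)  # farther from the reference -> larger degree
```

Only the interpretation of the returned degree differs per object type (seek speed for objects 640/650, volume step for objects 740/750); the time-based variant described above would instead accumulate the effect for as long as the touch stays on the object.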
  • An object may be highlighted by applying a dominant color to the displayed object, and feedback on the user's touch moving onto an object may be output in at least one of visual, auditory, and tactile form.
  • For example, the smartphone 400 may provide the user with feedback such as vibration, adjustment of screen brightness, or object animation (e.g., displaying the object shaking on the screen).
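Such feedback is a fan-out of UI events to whichever of the three modalities the device supports. A small sketch follows; the event names and the callbacks standing in for platform vibration/sound/drawing APIs are all assumptions:

```python
def notify_feedback(event, visual=None, auditory=None, tactile=None):
    """Fan a UI event (e.g. 'upper_shown', 'object_entered', 'applied')
    out to whichever feedback channels are available."""
    for channel in (visual, auditory, tactile):
        if channel is not None:
            channel(event)


# Example wiring with placeholder handlers:
notify_feedback(
    "object_entered",
    visual=lambda e: print("highlight / shake the object for", e),
    tactile=lambda e: print("vibrate briefly for", e),
)
```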
  • As described above, the layered objects for manipulating the media player are provided layer by layer based on the user's touch position, so that the user can easily select them with one hand. Accordingly, the embodiments of the present invention can provide a user experience (UX) and/or a user interface (UI) through which the user can easily process, with one hand, the various operations provided by the media player, such as playing, stopping, selecting content, seeking within content, and adjusting the volume.
  • the system or apparatus described above may be implemented as a hardware component, a software component or a combination of hardware components and software components.
  • The devices and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions.
  • the processing device may execute an operating system (OS) and one or more software applications running on the operating system.
  • the processing device may also access, store, manipulate, process, and generate data in response to the execution of the software.
  • Although a processing device may be described as being used singly, a person skilled in the art will appreciate that the processing device may include a plurality of processing elements and/or a plurality of types of processing elements.
  • the processing device may include a plurality of processors or one processor and one controller.
  • other processing configurations are possible, such as parallel processors.
  • The software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired or may command the processing device independently or collectively.
  • Software and/or data may be embodied in any type of machine, component, physical device, virtual equipment, or computer storage medium or device so as to be interpreted by, or to provide instructions or data to, the processing device.
  • the software may be distributed over networked computer systems so that they are stored or executed in a distributed manner.
  • Software and data may be stored on one or more computer readable media.
  • the method according to the embodiment may be embodied in the form of program instructions that can be executed by various computer means and recorded in a computer readable medium.
  • the computer readable medium may include program instructions, data files, data structures, and the like, alone or in combination.
  • the program instructions recorded on the media may be those specially designed and constructed for the purposes of the embodiments, or they may be of the kind well-known and available to those having skill in the computer software arts.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; and magneto-optical media such as floptical disks.
  • Such a recording medium may be any of various recording means or storage means in the form of a single piece of hardware or a combination of several pieces of hardware, and is not limited to a medium directly connected to a particular computer system; it may also be distributed over a network.
  • Examples of program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
PCT/KR2018/004287 2018-04-12 2018-04-12 Method and system for controlling a media player WO2019198844A1 (ko)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/KR2018/004287 WO2019198844A1 (ko) 2018-04-12 2018-04-12 Method and system for controlling a media player
KR1020207027238A KR102512879B1 (ko) 2018-04-12 2018-04-12 Method and system for controlling a media player
JP2020555857A JP7183295B6 (ja) 2018-04-12 2018-04-12 Method and system for controlling a media player

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2018/004287 WO2019198844A1 (ko) 2018-04-12 2018-04-12 Method and system for controlling a media player

Publications (1)

Publication Number Publication Date
WO2019198844A1 (ko) 2019-10-17

Family

ID=68164332

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/004287 WO2019198844A1 (ko) 2018-04-12 2018-04-12 Method and system for controlling a media player

Country Status (3)

Country Link
JP (1) JP7183295B6 (ja)
KR (1) KR102512879B1 (ko)
WO (1) WO2019198844A1 (ko)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101004463B1 (ko) * 2008-12-09 2010-12-31 성균관대학교산학협력단 Portable terminal supporting menu selection by dragging on a touch screen, and control method therefor
KR20120031776A (ko) * 2010-09-27 2012-04-04 엘지전자 주식회사 Mobile terminal and method of controlling the same
KR20150059517A (ko) * 2013-11-22 2015-06-01 엘지전자 주식회사 Mobile terminal and control method thereof
KR20150094478A (ko) * 2014-02-10 2015-08-19 삼성전자주식회사 User terminal device and display method thereof
JP6245788B2 (ja) * 2011-02-11 2017-12-13 ソニーモバイルコミュニケーションズ株式会社 Information input device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8074178B2 (en) * 2007-06-12 2011-12-06 Microsoft Corporation Visual feedback display
KR20150092630A (ko) * 2014-02-05 2015-08-13 엘지전자 주식회사 Electronic device and method of controlling the same
JP2016115208A (ja) * 2014-12-16 2016-06-23 シャープ株式会社 Input device, wearable terminal, portable terminal, method of controlling input device, and control program for controlling operation of input device


Also Published As

Publication number Publication date
JP7183295B2 (ja) 2022-12-05
KR20200132884A (ko) 2020-11-25
JP2021528713A (ja) 2021-10-21
JP7183295B6 (ja) 2022-12-20
KR102512879B1 (ko) 2023-03-22

Similar Documents

Publication Publication Date Title
US11635869B2 (en) Display device and method of controlling the same
US11803564B2 (en) Method and system for keyword search using messaging service
KR102137240B1 (ko) Method for adjusting a display area and electronic device for processing the method
US11899903B2 (en) Display device and method of controlling the same
KR102184269B1 (ko) Display apparatus, portable apparatus, and screen display method thereof
WO2011083962A2 (en) Method and apparatus for setting section of a multimedia file in mobile device
US20170315721A1 (en) Remote touchscreen interface for virtual reality, augmented reality and mixed reality devices
JP2021044804A (ja) Method and apparatus for providing a 360-degree panoramic background for use during a video call
WO2014116056A1 (ko) Method, system, and computer-readable recording medium for generating a motion sequence of an animation
US9836134B1 (en) Touchscreen input device based content sharing
WO2022183887A1 (zh) Video editing and playback method, apparatus, device, and medium
US20220300144A1 (en) Method, system, and non-transitory computer readable record medium for providing chatroom in 3d form
WO2016108544A1 (ko) Method and device for providing a conversation service
CN113546419A (zh) Game map display method and apparatus, terminal, and storage medium
KR20200113834A (ko) Apparatus and method for providing application information
WO2024037563A1 (zh) Content display method, apparatus, device, and storage medium
KR20220154825A (ko) Note generation method and electronic device
WO2019198844A1 (ko) Method and system for controlling a media player
WO2019245062A1 (ko) Method and system for providing content based on user reaction, and non-transitory computer-readable recording medium
US9086746B1 (en) Stylus based profile management
US20220391046A1 (en) Method and system for exposing online content
WO2011037408A2 (ko) Terminal having a virtual-space interface and method of controlling the virtual-space interface
KR20210029635A (ko) Method and apparatus for providing a 360-degree panoramic background for use during a video call
WO2011037409A2 (ko) Method of controlling a virtual-space interface executed by a terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18914878

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020555857

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18914878

Country of ref document: EP

Kind code of ref document: A1