WO2023282519A1 - Method for authoring a multimedia work based on touch input - Google Patents

Method for authoring a multimedia work based on touch input

Info

Publication number
WO2023282519A1
Authority
WO
WIPO (PCT)
Prior art keywords
authoring
multimedia
touch input
interface
frame
Prior art date
Application number
PCT/KR2022/009144
Other languages
English (en)
Korean (ko)
Inventor
박재범
이윤영
Original Assignee
박재범
이윤영
Priority date
Filing date
Publication date
Application filed by 박재범 and 이윤영
Publication of WO2023282519A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0362 Pointing devices with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
    • G06F 3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 Services

Definitions

  • The technology described below relates to techniques for authoring a multimedia work.
  • More specifically, the technology described below relates to a method of authoring a multimedia work based on touch input received at a user terminal.
  • Multimedia works such as videos are mostly produced using professional authoring programs.
  • Professional authoring programs require high-end computer equipment.
  • Meanwhile, as hardware performance has improved, portable personal terminals such as smartphones have become capable of playing high-resolution video and running high-resolution games.
  • However, a portable personal terminal has a relatively small screen and receives most user commands through touch input. When authoring a multimedia work on such a terminal, the range of editing functions the user can select is therefore limited.
  • The technology described below aims to provide multimedia authoring tools and interfaces that enable multimedia works to be authored through touch input on a portable personal terminal.
  • A method for authoring a multimedia work based on touch input includes the steps of: executing, by a multimedia authoring device, a multimedia authoring tool; receiving, by the multimedia authoring device, a command to select specific content; receiving, by the multimedia authoring device, a first touch input calling a first authoring interface, and displaying the first authoring interface on a screen; receiving, by the multimedia authoring device, a first control command for the first authoring interface output on the screen; authoring the specific content according to the first control command; receiving, by the multimedia authoring device, a second touch input calling a second authoring interface, and outputting the second authoring interface to the screen; receiving a second control command for the second authoring interface output on the screen; and authoring, by the multimedia authoring device, the specific content according to the second control command.
  • The technology described below provides various authoring interfaces through touch input so that a user can easily author a multimedia work on a device such as a portable terminal.
  • The technology described below also makes authoring multimedia works more convenient by relying on intuitive touch inputs.
  • FIG. 1 is an example of a multimedia authoring system.
  • FIG. 2 is an example of an execution screen of a multimedia authoring tool.
  • FIG. 3 is an example of a touch interface and command execution in a multimedia authoring tool.
  • FIG. 4 is another example of a touch interface and command execution in a multimedia authoring tool.
  • FIG. 5 is another example of a touch interface and command execution in a multimedia authoring tool.
  • FIG. 6 is an example of frame selection using a jog shuttle based on a touch input.
  • FIG. 7 is another example of frame selection using a touch input-based jog shuttle.
  • FIG. 8 is an example of a process of selecting a frame to be removed by a touch input.
  • FIG. 9 is an example of a content authoring process using a multimedia authoring tool based on a touch input.
  • FIG. 10 is an example of a multimedia authoring device.
  • Terms such as first, second, A, and B may be used to describe various elements, but the elements are not limited by these terms; the terms are used only to distinguish one element from another. For example, without departing from the scope of the technology described below, a first element may be referred to as a second element, and similarly, a second element may be referred to as a first element.
  • The term "and/or" includes any combination of a plurality of related recited items, or any one of the plurality of related recited items.
  • Components described below may be combined into a single component, or one component may be divided into two or more components, each with a more subdivided function.
  • In addition to its main function, each component described below may also perform some or all of the functions of other components, and some of the main functions of a component may instead be performed entirely by another component.
  • Each process constituting a method may occur in an order different from the stated order unless a specific order is clearly described in context. That is, the processes may be performed in the stated order, substantially simultaneously, or in the reverse order.
  • The technology described below is a technology for authoring a multimedia work using a personal terminal device.
  • Multimedia works include individual contents such as images, videos, image clips, video clips, and audio, as well as contents in which a plurality of such contents are combined.
  • Multimedia works encompass video works, photographic works, music works, and the like composed of digital data.
  • Personal terminal devices correspond to computer devices used by individuals.
  • the personal terminal device may be at least one of devices such as a PC, a laptop computer, a smart phone, and a tablet device.
  • The personal terminal device may support the creation of multimedia works by using a program for authoring or editing multimedia works.
  • The personal terminal device may also support authoring of multimedia works by using a web-based program provided by a server on the network.
  • A multimedia authoring program or application supported by a personal terminal device or server is referred to below as a multimedia authoring tool.
  • the technology to be described below mainly relates to an interface technique for authoring a multimedia work using a personal terminal device having a limited screen size.
  • the personal terminal device may be a device that is portable and has a relatively small screen size, such as a smart phone, a tablet PC, and a laptop computer.
  • the personal terminal device is a device that supports touch input.
  • FIG. 1 is an example of a multimedia authoring system 100.
  • FIG. 1 shows personal terminal devices 110 such as smartphones, tablet PCs, and laptops.
  • the multimedia authoring system 100 may be composed of only the personal terminal device 110 .
  • the personal terminal device 110 may author a specific multimedia work using a pre-installed multimedia authoring tool.
  • the personal terminal device 110 calls a certain authoring menu with a touch input from the user, and supports the user to author or edit a current multimedia work by using the authoring menu.
  • The multimedia authoring system 100 may also include a service server 150 for multimedia authoring.
  • The personal terminal device 110 may access the service server 150 through the Internet and use a web-based multimedia authoring tool provided by the service server 150.
  • the user may perform user registration and authentication in advance to use the web-based multimedia authoring tool.
  • the personal terminal device 110 may execute a web-based multimedia authoring tool through a basic web browser or a pre-installed dedicated program.
  • FIG. 2 is an example of an execution screen 200 of a multimedia authoring tool.
  • the image frame area 210 outputs a specific frame of an image that is a current authoring target.
  • the thumbnail area 220 outputs available video contents in the form of thumbnails (thumbnail 1 to thumbnail 3) when different video contents are used.
  • The timeline area 230 provides a timeline interface for editing (deleting, inserting, etc.) the selected videos (video 1, video 2) and audio (audio 1) for multimedia content authoring.
  • The content menu area 240 may output a multimedia content selection menu, video-playback-related menus, and the like.
  • For example, the menu area 240 may output icons such as a content object, a content list, video play, and video pause.
  • The authoring interface area 250 outputs a specific authoring interface for authoring multimedia content. FIG. 2 shows an example in which an equalizer for sound control is output to the interface area 250.
  • the touch input area 260 is an area that calls a specific authoring interface for authoring multimedia content.
  • The touch input area 260 receives certain touch inputs from the user.
  • the multimedia authoring tool outputs a specific authoring interface to the interface area 250 according to the user's touch input.
  • the touch input area 260 may receive a touch input for controlling the outputted specific authoring interface.
  • That is, a touch input for controlling the specific authoring interface displayed in the interface area 250 may be received.
  • the multimedia authoring tool may not provide a separate touch input area 260 for interface calling on the screen.
  • the multimedia authoring tool may utilize an area where no other touch input occurs, such as the image frame area 210, as a touch input area.
  • FIG. 3 is an example of a touch interface and command execution in a multimedia authoring tool.
  • FIG. 3(A) is an example of a user's touch input.
  • the touch input of FIG. 3(A) is an input of drawing a circle in one direction.
  • the touch input is an input that draws a single circle.
  • the touch input need not be a perfect circle, but may be a spiral shape.
  • one direction may be clockwise or counterclockwise.
  • FIG. 3(B) is an example of an authoring interface called according to the touch input of FIG. 3(A).
  • The authoring interface in FIG. 3(B) is a rotation tool.
  • A rotation tool is a tool that rotates multimedia content or an object in a certain direction.
  • The rotation tool is called into the authoring interface area 250 according to the touch input of FIG. 3(A). Thereafter, the multimedia content or the object may be rotated by a corresponding amount according to the rotation direction and degree of rotation applied to the rotation tool.
  • FIG. 3(C) illustrates an example in which multimedia content is rotated according to the operation of the rotation tool R.
  • FIG. 3(C) shows an example in which one image frame F is rotated clockwise by 90 degrees.
  • U (up), D (down), R (right), and L (left) are displayed to indicate the orientation of the frame.
  • FIG. 3(C) is an example of rotating the image frame itself, but if the user selects a part of the image frame or a specific object, only the corresponding region or object may be rotated according to the adjustment of the rotation tool.
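  • The publication does not specify how the traced circle is recognized. As one illustration, the hypothetical sketch below (Python) estimates the trace direction from the accumulated turning angle around the path centroid and counts revolutions, so that one circle calls the rotation tool and, as described with FIG. 4 below, two circles could call a different interface; all function names are assumptions.

```python
import math

def classify_circle_gesture(points):
    """Classify a traced path as a circle gesture.

    points: list of (x, y) touch samples in screen coordinates
    (y grows downward). Returns (direction, revolutions).
    Illustrative only; the publication does not define a
    concrete recognizer.
    """
    # Use the centroid of the path as the rotation center.
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)

    # Accumulate the signed turning angle around the centroid.
    total = 0.0
    prev = math.atan2(points[0][1] - cy, points[0][0] - cx)
    for x, y in points[1:]:
        cur = math.atan2(y - cy, x - cx)
        delta = cur - prev
        # Unwrap so each step takes the shorter angular path.
        if delta > math.pi:
            delta -= 2 * math.pi
        elif delta < -math.pi:
            delta += 2 * math.pi
        total += delta
        prev = cur

    # With screen y pointing down, a positive accumulated angle
    # corresponds to a clockwise trace.
    direction = "clockwise" if total > 0 else "counterclockwise"
    revolutions = round(abs(total) / (2 * math.pi))
    return direction, revolutions

def interface_for_circles(points):
    """One full circle calls the rotation tool (FIG. 3); two
    circles call the jog shuttle described with FIG. 4 below."""
    _, revolutions = classify_circle_gesture(points)
    if revolutions == 1:
        return "rotation_tool"
    if revolutions >= 2:
        return "jog_shuttle"
    return None
```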
  • FIG. 4 is another example of a touch interface and command execution in a multimedia authoring tool.
  • FIG. 4(A) is an example of a user's touch input.
  • the touch input of FIG. 4(A) is an input of drawing a circle in one direction.
  • the touch input is an input that draws a circle twice.
  • the touch input need not be a perfect circle, but may be a spiral shape.
  • one direction may be clockwise or counterclockwise.
  • FIG. 4(B) is an example of an authoring interface called according to the touch input of FIG. 4(A).
  • The authoring interface of FIG. 4(B) is a jog shuttle (jog & shuttle).
  • The jog shuttle can be used to find a specific part of continuous multimedia content.
  • The jog shuttle interface may include an icon for adjusting the degree of movement, as shown in FIG. 4(B).
  • The jog shuttle is called into the authoring interface area 250 according to the touch input of FIG. 4(A). Thereafter, the current position moves to a specific part of the multimedia content according to the rotation direction and degree of rotation of the jog shuttle.
  • the multimedia content may be a video composed of continuous frames, a continuous sound source, and the like.
  • FIG. 4(C) shows an example in which the video frame F moves according to the operation of the jog shuttle J. When the jog shuttle J is rotated clockwise, the view moves from frame 1, the starting point, to frame 5.
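  • As an illustration of this behavior, the sketch below maps an accumulated jog shuttle rotation angle to a frame offset; the frames-per-revolution ratio and the function name are assumptions, not values from the publication.

```python
def frame_after_jog(current_frame, rotation_degrees, total_frames,
                    frames_per_revolution=30):
    """Map jog shuttle rotation to a new frame index.

    rotation_degrees: accumulated rotation of the jog wheel;
    positive = clockwise (forward), negative = counterclockwise
    (backward). frames_per_revolution is an assumed tuning knob.
    """
    offset = round(rotation_degrees / 360.0 * frames_per_revolution)
    # Clamp the result to the valid frame range of the clip.
    return max(0, min(total_frames - 1, current_frame + offset))

# Example: rotating the shuttle clockwise by 48 degrees from
# frame 1 moves forward 4 frames, to frame 5 (cf. FIG. 4(C)).
print(frame_after_jog(1, 48, 100))  # -> 5
```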
  • FIG. 5 is another example of a touch interface and command execution in a multimedia authoring tool.
  • FIG. 5(A) is an example of a user's touch input.
  • The touch input of FIG. 5(A) is an input drawing a cross (+) shape. That is, the touch input is composed of straight-line drags that cross each other.
  • The order in which the two straight lines are input may follow an order set in advance. In other words, different authoring interfaces may be called according to the order in which the two straight lines are input. For example, as shown in FIG. 5(A), an audio equalizer may be called when a vertical line (1) is drawn before a horizontal line (2). Alternatively, a fader used for image editing may be called when a horizontal line (1) is drawn before a vertical line (2).
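  • The stroke-order rule above could be implemented as a simple dispatch on the classified orientation of each drag, as in the following hypothetical sketch; stroke classification by dominant axis is an assumption.

```python
def stroke_orientation(start, end):
    """Classify a straight drag as 'vertical' or 'horizontal'
    by its dominant axis of movement."""
    dx = abs(end[0] - start[0])
    dy = abs(end[1] - start[1])
    return "horizontal" if dx >= dy else "vertical"

def interface_for_cross(stroke1, stroke2):
    """Pick an authoring interface from the order of two strokes.

    Each stroke is a (start, end) pair of (x, y) points. The
    mapping mirrors the example in the text: vertical then
    horizontal calls the equalizer, horizontal then vertical
    calls the fader.
    """
    order = (stroke_orientation(*stroke1), stroke_orientation(*stroke2))
    if order == ("vertical", "horizontal"):
        return "equalizer"
    if order == ("horizontal", "vertical"):
        return "fader"
    return None

# A vertical stroke first, then a horizontal one: the equalizer.
print(interface_for_cross(((50, 0), (50, 100)), ((0, 50), (100, 50))))
```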
  • FIG. 5(B) is an example of an authoring interface called according to the touch input of FIG. 5(A).
  • The authoring interface called according to the touch input of FIG. 5(A) may be an interface having a slide bar.
  • The authoring interface of FIG. 5(B) shows a fader S1 and an equalizer S2. Of course, other authoring interfaces may also be invoked.
  • The fader S1 or the equalizer S2 is called into the authoring interface area 250 according to the touch input of FIG. 5(A). The multimedia content is then adjusted according to the level control of the fader S1 or the equalizer S2.
  • FIG. 5(C) shows an example of giving a fading effect to a specific frame F by adjusting the level of the fader S1. If the equalizer S2 is adjusted instead, the output level of each frequency band of the sound signal in the selected part is adjusted accordingly.
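  • As an illustration of the fading effect of FIG. 5(C), the sketch below applies a linear fade-out by scaling pixel intensity across a run of frames; the frame representation is a deliberately simplified assumption.

```python
def fade_out(frames, start, end):
    """Apply a linear fade-out over frames[start:end].

    frames: list of 2D pixel-intensity grids (lists of lists),
    a simple stand-in for real video frames.
    """
    span = max(1, end - start - 1)
    for i in range(start, end):
        gain = 1.0 - (i - start) / span  # 1.0 down to 0.0
        frames[i] = [[int(px * gain) for px in row] for row in frames[i]]
    return frames

frames = [[[200, 200]] for _ in range(3)]  # three tiny 1x2 "frames"
print(fade_out(frames, 0, 3))  # -> [[[200, 200]], [[100, 100]], [[0, 0]]]
```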
  • FIG. 6 is an example of frame selection using a jog shuttle based on touch input. FIG. 6 further describes touch input related to the jog shuttle, and assumes a state in which the jog shuttle has already been called. That is, it is assumed that the multimedia authoring tool has called the jog shuttle into the authoring interface area 250 after receiving a certain touch input (e.g., an input as shown in FIG. 4(A)) from the user.
  • FIG. 6(A) is an example of adjusting the called jog shuttle counterclockwise. The description is based on video. If the jog shuttle is turned counterclockwise, the view moves from the current frame (t) back in time to a previous frame (t-i), where i is a natural number representing the frame offset.
  • The frames can continue to change at a constant rate (e.g., -5 frames per second) for a certain amount of time.
  • At some point, the user may need to select a specific frame.
  • FIG. 6(B) is an example of a touch input for selecting a specific frame while frames are being changed using the jog shuttle.
  • The touch input may be a single tap on the left area of the jog shuttle.
  • The area where the tap can be input may be the jog shuttle itself or a specific area around the jog shuttle.
  • When the tap is input, the frame displayed at that moment may be selected.
  • FIG. 6(C) is an example showing the video clip at the selected time point.
  • FIG. 6 may represent a process of selecting (marking) the start frame in order to remove (trim) a specific region from a video.
  • In other words, the user can move to a specific time, or to the earliest part of the video clip, with the input of FIG. 6(A), and can select the start frame at which removal begins with the input of FIG. 6(B).
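  • The marking behavior of FIG. 6 amounts to capturing the frame index shown at the instant of the tap while the shuttle keeps advancing frames. A hypothetical sketch, with the scrub rate and class name assumed:

```python
import time

class ShuttleScrubber:
    """Advances frames at a fixed rate and lets a tap mark the
    frame shown at that instant. A sketch; the rate and names
    are assumptions, not taken from the publication."""

    def __init__(self, start_frame, frames_per_second=-5):
        self.origin = start_frame
        self.rate = frames_per_second  # negative = backward (FIG. 6)
        self.t0 = time.monotonic()
        self.marked = []

    def current_frame(self):
        # Frame currently displayed, given the elapsed scrub time.
        elapsed = time.monotonic() - self.t0
        return max(0, self.origin + int(self.rate * elapsed))

    def on_tap(self):
        # A single tap on (or near) the jog shuttle selects the
        # frame displayed at the moment of the tap.
        frame = self.current_frame()
        self.marked.append(frame)
        return frame
```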
  • FIG. 7 is another example of frame selection using a touch-input-based jog shuttle. FIG. 7 is an example of an operation opposite to that of FIG. 6, and likewise assumes a state in which the jog shuttle has been called.
  • FIG. 7(A) is an example of adjusting the called jog shuttle clockwise. The description is based on video. If the jog shuttle is turned clockwise, the view moves from the current frame (t) forward in time to a later frame (t+i).
  • The frames can continue to change at a constant rate (e.g., +5 frames per second) for a certain amount of time.
  • At some point, the user may need to select a specific frame.
  • FIG. 7(B) is an example of a touch input for selecting a specific frame while frames are being changed using the jog shuttle.
  • The touch input may be a double tap on the right area of the jog shuttle.
  • The area where the tap can be input may be the jog shuttle itself or a specific area around the jog shuttle.
  • When the tap is input, the frame displayed at that moment may be selected.
  • FIG. 7(C) is an example showing the video clip at the selected time point.
  • FIG. 7 may represent a process of selecting the last frame in order to remove a specific region from a video.
  • In other words, the user can move to the video clip at a specific time with the input of FIG. 7(A) and select the last frame of the region to be removed with the input of FIG. 7(B).
  • FIG. 8 is an example of a process of selecting a frame to be removed by a touch input.
  • FIG. 8(A) is an example of selecting a plurality of deletion sections by moving frames with the jog shuttle. It is assumed that the user calls the jog shuttle with a touch input and moves the current frame to the right (later in time) on the time axis by turning the called jog shuttle clockwise. The user can remove multiple sections at once by specifying multiple {start frame, last frame} pairs defining the sections to be removed. FIG. 8(A) is an example in which two sections T1 and T2 are selected. Here, the inputs for selecting the start frame and the last frame may differ; for example, the start frame may be selected by tapping once and the last frame by tapping twice. Alternatively, the same input may select both; for example, a single tap may select either a start frame or a last frame.
  • FIG. 8(B) is another example of selecting a deletion section by moving a frame with a jog shuttle.
  • FIG. 8(B) assumes that the inputs for selecting a start frame and a last frame differ. For example, the user selects start frame 1 with a single tap. After that, while watching the advancing frames, the user selects start frame 2 with another single tap. Finally, the user selects the last frame with a double tap. In this case, start frame 1, which the user selected first, is disregarded, and the section T3 running from start frame 2 (the start frame closest to the last frame) to the last frame is finally selected. That is, by differentiating the inputs for selecting the start frame and the last frame, the section to be removed can be selected more dynamically. A sketch of this rule follows below.
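  • The rule of FIG. 8(B), where only the start frame nearest the last frame survives, might look like the following sketch; the event representation is an assumption.

```python
def collect_removal_sections(events):
    """Build removal sections from a stream of marking events.

    events: list of ('start', frame) and ('end', frame) tuples in
    the order the user tapped them. A later 'start' tap before any
    'end' tap replaces the earlier one, so each section keeps the
    start frame closest to its last frame (cf. FIG. 8(B)).
    """
    sections = []
    pending_start = None
    for kind, frame in events:
        if kind == "start":
            pending_start = frame  # earlier start frames are disregarded
        elif kind == "end" and pending_start is not None:
            sections.append((pending_start, frame))
            pending_start = None
    return sections

# Start frame 1 is tapped, then start frame 2, then the last frame:
# only (start 2, end) survives, matching interval T3 in FIG. 8(B).
print(collect_removal_sections([("start", 10), ("start", 40), ("end", 70)]))
# -> [(40, 70)]
```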
  • FIG. 9 is an example of a content authoring process 300 using a multimedia authoring tool based on a touch input.
  • One scenario of authoring multimedia contents will be described as an example.
  • the multimedia authoring tool is executed (310).
  • the multimedia authoring tool may receive a command for selecting content to be used for multimedia authoring from the user (320).
  • the multimedia authoring tool may receive a command to select a specific video.
  • The following description is based on a video.
  • the multimedia authoring tool receives a user input for calling a specific authoring interface (330).
  • the multimedia authoring tool may receive a touch input that calls a rotation tool that rotates an image frame.
  • the multimedia authoring tool outputs a rotation tool.
  • The multimedia authoring tool may rotate a specific image through the rotation tool according to user input (340).
  • For example, the multimedia authoring tool may rotate a horizontal image into a vertical orientation.
  • The multimedia authoring tool may also receive a touch input that calls a tool for inverting (mirroring left and right) an image frame.
  • In that case, the multimedia authoring tool outputs an inversion tool.
  • The multimedia authoring tool may then invert a specific image left-to-right through the inversion tool according to user input (340).
  • The multimedia authoring tool temporarily stores the video currently being edited.
  • Alternatively, the multimedia authoring tool may store the series of editing commands without storing the actual editing results, as sketched below.
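  • Storing a series of editing commands instead of rendered results is essentially an edit decision list. A minimal sketch under that assumption (the command vocabulary and method names are illustrative, not the publication's implementation):

```python
class EditSession:
    """Records editing commands without touching the source video;
    the commands are applied in one pass when authoring finishes."""

    def __init__(self, source_path):
        self.source_path = source_path
        self.commands = []  # e.g. ("rotate", 90), ("delete_section", 10, 50)

    def record(self, *command):
        self.commands.append(command)

    def render(self, output_path):
        # Apply every recorded command in order to produce the
        # final work; here we only report what would be done.
        for command in self.commands:
            print("applying", command, "to", self.source_path)
        print("writing result to", output_path)

session = EditSession("clip.mp4")
session.record("rotate", 90)
session.record("delete_section", 10, 50)
session.render("work.mp4")
```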
  • the multimedia authoring tool monitors whether a new authoring interface is called (350). If the multimedia authoring tool detects a touch input calling a new authoring interface from the user (YES in 350), the multimedia authoring tool outputs a new authoring interface corresponding to the touch input.
  • the multimedia authoring tool may output the jog shuttle on the screen in response to a touch input calling the jog shuttle. Thereafter, the multimedia authoring tool may receive a command to select a specific section according to user input (340). In addition, the multimedia authoring tool may receive a delete command for the selected section and delete the section.
  • For example, the delete input may be a touch input of pressing the jog shuttle.
  • The multimedia authoring tool may also receive a command to select another specific frame or section according to the user's input controlling the jog shuttle (340).
  • The multimedia authoring tool again monitors whether a new authoring interface is called (350). If it detects a touch input calling a new authoring interface from the user (YES in 350), it outputs the new authoring interface corresponding to that touch input.
  • For example, the multimedia authoring tool may detect a touch input that calls a fading tool (e.g., the touch input of FIG. 5(A)) and output the fading tool to the screen.
  • The multimedia authoring tool may then receive and process a fading command (fading in or fading out) for the currently selected frame or section (340).
  • The multimedia authoring tool continues to monitor whether a new authoring interface is called (350); if there is no new interface call, it checks whether authoring is finished (360), and either terminates the multimedia authoring or continues the authoring operation.
  • The video authoring described in FIG. 9 proceeds in the following order: (1) selection of the specific video to be authored, (2) frame rotation or frame inversion, (3) deletion of a specific section of the video, and (4) application of a fading effect to a specific frame or section of the video.
  • The scenario described in FIG. 9 is only an example; the user can author a wide variety of multimedia content by calling various authoring interfaces with touch inputs. The overall flow is sketched below.
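  • The overall flow of FIG. 9 can be summarized as a skeleton, driven here by a scripted list of interface names instead of real touch input; none of these names come from the publication itself.

```python
def authoring_process(interfaces):
    """Skeleton of process 300 in FIG. 9. Step numbers in the
    comments refer to FIG. 9; all names are illustrative."""
    print("multimedia authoring tool executed")   # step 310
    content = "clip.mp4"                          # step 320: content selected
    for interface in interfaces:                  # steps 330/350: interface called
        # Step 340: author the content with the current interface
        # until a new interface is called or authoring ends (360).
        print(f"output {interface}; authoring {content} with it")
    print("authoring finished")                   # step 360

# The scenario of FIG. 9: rotation/inversion, then jog-shuttle
# section deletion, then fading.
authoring_process(["rotation_tool", "jog_shuttle", "fading_tool"])
```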
  • FIG. 10 is an example of a multimedia authoring device 400. The multimedia authoring device 400 corresponds to a personal terminal device in which a multimedia authoring tool is executed.
  • the multimedia authoring device 400 may include various types of devices in which a multimedia authoring tool based on touch input is executed.
  • the multimedia authoring device 400 may be physically implemented in various forms.
  • the multimedia authoring device 400 may have a form of a PC, a laptop computer, a smart device, or a network server.
  • the multimedia authoring device 400 may include a storage device 410, a memory 420, an arithmetic device 430, an interface device 440, a communication device 450, and an output device 460.
  • the storage device 410 stores a program corresponding to a multimedia authoring tool.
  • the storage device 410 may store multimedia content to be authored, various contents used in authoring, and the like.
  • The memory 420 may store data and information generated while the multimedia authoring device 400 authors a multimedia work.
  • the interface device 440 is a device that receives a predetermined command from a user.
  • the interface device 440 may be a touch panel.
  • the communication device 450 refers to a component that receives and transmits certain information through a wired or wireless network.
  • When the multimedia authoring device 400 is a server, the communication device 450 may receive the user's touch input, or a control command corresponding to the touch input.
  • the communication device 450 may transmit the authored multimedia work to an external object.
  • the output device 460 is a device that outputs certain information.
  • the output device 460 may output the screen of the multimedia authoring tool described in FIG. 2 .
  • The arithmetic device 430 controls the operation of authoring a multimedia work using the program stored in the storage device 410.
  • The arithmetic device 430 matches a user touch input received at the interface device 440 with the control command corresponding to that touch input. To do this, the multimedia authoring device 400 must hold information in which user touch inputs and their corresponding control commands are mapped in advance.
  • For example, the storage device 410 may store the mapping between touch inputs and control commands in the form of a table. In this case, the arithmetic device 430 may look up the control command corresponding to the currently input user touch input by referring to the table, as in the sketch below.
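  • Such mapping information might be as simple as a dictionary keyed by a normalized gesture descriptor, as in this hypothetical sketch; the publication only requires that such a mapping be stored in advance, not this particular form.

```python
# Hypothetical gesture-to-command table; keys are normalized
# gesture descriptors, values are control commands.
GESTURE_COMMANDS = {
    ("circle", 1): "call_rotation_tool",
    ("circle", 2): "call_jog_shuttle",
    ("cross", "vertical_first"): "call_equalizer",
    ("cross", "horizontal_first"): "call_fader",
}

def command_for(gesture_key):
    """Look up the control command for a classified touch input;
    returns None if the gesture is not mapped."""
    return GESTURE_COMMANDS.get(gesture_key)

print(command_for(("circle", 2)))  # -> call_jog_shuttle
```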
  • the arithmetic device 430 outputs an authoring interface corresponding to the confirmed control command to the output device 460 .
  • the arithmetic device 430 monitors a user input for the authoring interface output on the screen.
  • User input for the authoring interface is as described in FIGS. 3 to 9 .
  • the computing device 430 performs an authoring operation on the selected multimedia content according to a user input on the authoring interface.
  • the authoring operation may include frame rotation, frame inversion, selection of a video section, deletion of a selected section, fading of a frame or video section, sound adjustment of a frame or video section, and the like.
  • The arithmetic device 430 may output, sequentially or simultaneously, the various authoring interfaces called by the user's touch inputs on the screen.
  • The computing device 430 may author the multimedia content step by step according to user input to the authoring interface(s) output on the screen. When the series of authoring commands is complete, the computing device 430 may generate, edit, or convert the multimedia content in a single batch.
  • The arithmetic device 430 may be a device that processes data and performs certain arithmetic operations, such as a processor, an AP, or a chip with an embedded program.
  • The multimedia authoring method and the touch interface method for the multimedia authoring tool described above may be implemented as a program (or application) including an executable algorithm that can be run on a computer.
  • The program may be stored and provided in a temporary or non-transitory computer-readable medium.
  • A non-transitory readable medium is not a medium that stores data for a brief moment, such as a register, cache, or memory, but a medium that stores data semi-permanently and is readable by a device.
  • Specifically, the various applications or programs described above may be stored and provided in a non-transitory readable medium such as a CD, DVD, hard disk, Blu-ray disc, USB drive, memory card, ROM (read-only memory), PROM (programmable ROM), EPROM (erasable PROM), EEPROM (electrically erasable PROM), or flash memory.
  • Temporary readable media include static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synclink DRAM (SLDRAM), and direct Rambus RAM (DRRAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method for authoring a multimedia work based on touch input, comprising the steps of: receiving, by a multimedia authoring device, a first touch input for calling a first authoring interface, and receiving a first control command for the first authoring interface output on a screen; authoring, by the multimedia authoring device, specific content according to the first control command; receiving, by the multimedia authoring device, a second touch input for calling a second authoring interface, and receiving a second control command for the second authoring interface output on the screen; and authoring, by the multimedia authoring device, the specific content according to the second control command.
PCT/KR2022/009144 2021-07-09 2022-06-27 Method for authoring a multimedia work based on touch input WO2023282519A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020210090157A KR102447307B1 (ko) 2021-07-09 2021-07-09 Method for authoring a multimedia work based on touch input
KR10-2021-0090157 2021-07-09

Publications (1)

Publication Number Publication Date
WO2023282519A1 (fr) 2023-01-12

Family

ID=83452685

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/009144 WO2023282519A1 (fr) 2021-07-09 2022-06-27 Method for authoring a multimedia work based on touch input

Country Status (2)

Country Link
KR (1) KR102447307B1 (fr)
WO (1) WO2023282519A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090116435A (ko) * 2008-05-07 2009-11-11 성균관대학교산학협력단 선 입력 기반 영상 처리 방법 및 그 장치
KR20110006243A (ko) * 2009-07-14 2011-01-20 삼성전자주식회사 휴대용 단말기에서 매뉴얼 포커싱 방법 및 장치
KR20110077236A (ko) * 2009-12-30 2011-07-07 주식회사 아이리버 조작툴을 이용하여 원하는 재생 시점으로 이동할 수 있는 멀티미디어 디바이스 및 그 방법
KR20120124173A (ko) * 2011-05-03 2012-11-13 엘지전자 주식회사 전자 기기 및 전자 기기의 오디오 재생 방법
KR101328199B1 (ko) * 2012-11-05 2013-11-13 넥스트리밍(주) 동영상 편집 방법 및 그 단말기 그리고 기록매체
KR101477486B1 (ko) * 2013-07-24 2014-12-30 (주) 프람트 동영상 재생 및 편집을 위한 사용자 인터페이스 장치 및 그 방법

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102230389B1 (ko) 2021-02-23 2021-03-22 스튜디오씨드코리아 주식회사 Method for automatically switching the working-screen mode of a digital content authoring tool


Also Published As

Publication number Publication date
KR102447307B1 (ko) 2022-09-26

Similar Documents

Publication Publication Date Title
  • WO2011099806A2 Method and apparatus for providing information of a plurality of applications
  • WO2013077537A1 Flexible display apparatus and method of providing a user interface using the same
  • WO2013085327A1 Display apparatus for displaying a screen divided into a plurality of areas, and method therefor
  • WO2012091289A1 Method for moving an object between pages, and interface apparatus
  • WO2014173036A1 Wireless communications device and method of adding a widget thereto
  • WO2013172607A1 Method of using a display unit and terminal supporting the same
  • WO2016048024A1 Display apparatus and displaying method thereof
  • WO2011083975A2 Mobile device and method for using content displayed on a transparent display panel
  • WO2013162181A1 Method and apparatus for sharing presentation and annotation data
  • WO2014119886A1 Method and apparatus for multitasking operation
  • WO2012053801A2 Method and apparatus for controlling a touch screen in a mobile terminal in response to multi-point touch inputs
  • WO2012108715A2 Method and apparatus for inputting user commands using relative movements of device panels
  • WO2012039587A1 Method and apparatus for editing the home screen in a touch device
  • CN105930031A Information processing device, display control method, and display control program
  • KR20120107356A Method and apparatus for providing a clipboard function in a portable terminal
  • US20090049386A1 Inter-Device Operation Interface, Device Control Terminal, and Program
  • WO2015131451A1 Mobile terminal and ringtone playback method therefor
  • WO2015020362A1 Page configuration method and electronic device supporting the same
  • WO2014003448A1 Terminal device and control method thereof
  • WO2023282519A1 Method for authoring a multimedia work based on touch input
  • WO2011059227A2 Method for providing contents to an external apparatus
  • WO2014007504A1 Apparatus and method for controlling an electronic book in a terminal
  • WO2013105759A1 Method and apparatus for managing content, and computer-readable recording medium storing a program for executing the content management method
  • WO2013062324A1 Method for applying supplementary attribute information to e-book content and mobile device adapted thereto
  • WO2019045362A1 Display apparatus for providing a preview UI and method of controlling the display apparatus

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22837871

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE