KR20110111565A - Method for communication programming of robot and multimedia object, medium recording program for operating the method - Google Patents

Method for communication programming of robot and multimedia object, medium recording program for operating the method

Info

Publication number
KR20110111565A
Authority
KR
South Korea
Prior art keywords
multimedia
robot
multimedia content
editing
tool
Prior art date
Application number
KR1020100030695A
Other languages
Korean (ko)
Inventor
조혜경
Original Assignee
한성대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한성대학교 산학협력단 filed Critical 한성대학교 산학협력단
Priority to KR1020100030695A priority Critical patent/KR20110111565A/en
Publication of KR20110111565A publication Critical patent/KR20110111565A/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635 - G06F 1/1675
    • G06F 1/1694 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635 - G06F 1/1675, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/85 Assembly of content; Generation of multimedia applications
    • H04N 21/854 Content authoring
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Architecture (AREA)
  • Human Computer Interaction (AREA)
  • Computer Security & Cryptography (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to an interactive programming method that allows a user to easily author his or her own animations or games by interworking a multimedia object in a virtual space with a real robot, and to a medium on which a program written through the interactive programming method is recorded. An interactive programming method for a robot and a multimedia object according to an embodiment of the present invention includes: generating, through a multimedia content editing tool, multimedia content that includes an individual motion of a real robot, an individual motion of a multimedia object or a virtual environment, and a linked operation produced by interaction between the robot and the multimedia object or the virtual environment; and, in the course of executing the multimedia content, providing multimedia information that includes the individual motions of the robot and the multimedia object or the virtual environment contained in the content, together with the linked operation between the robot and the multimedia object or the virtual environment.

Description

Method for communication programming of robot and multimedia object, medium recording program for operating the method

The present invention relates to an interactive programming method for a robot and a multimedia object, and to a recording medium on which the resulting program is recorded. In particular, it relates to an interactive programming method that allows a user to easily create his or her own animation or game by interworking a real robot with a multimedia object in a virtual space, and to a recording medium on which a program written through the interactive programming method is recorded.

Various authoring tools have been released for producing video. For example, authoring tools such as a Flash editor or Adobe Premiere may be used to edit or create videos, animated videos, and the like. Such video editing and authoring tools handle images through text that describes the images in written form, so content can be produced only when the author understands the meaning of that text.

Meanwhile, in recent years Internet users have been able to create, modify, and process content through various editing tools, producing increasingly professional content that has value as information. User Created Content (UCC) is one example: UCC is information voluntarily generated by Internet users and takes various forms such as text, drawings, photographs, and videos.

However, as described above, the authoring of multimedia content such as animation and game videos remains limited to advanced users who can handle computer programs skillfully. In particular, most users who are proficient with computer programs belong to the younger generations in their 20s and 30s, so the authoring of multimedia content is concentrated in this particular group.

Accordingly, there is a need for a programming method that allows even users who are not skilled with computer programs to generate their own multimedia content relatively easily.

In addition, conventional multimedia content is authored around the behavior of virtual objects in cyberspace; that is, it has not been produced in a form that is linked to the behavior of offline objects such as a real robot.

Accordingly, there is a need for an authoring method in which the behavior of offline objects such as a robot is included in the multimedia content, or in which the behavior of offline objects can affect the process of authoring that content. Taking a game as an example, if the actions of an online virtual robot and an offline real robot affect each other while a game or sports match is played, it becomes possible to author a type of game different from existing games, which were played only through offline interaction between real robots or online interaction between virtual robots.

Accordingly, the present invention provides an interactive programming method for a robot and a multimedia object in which multimedia content can be authored under the condition that an offline real object such as a robot and a virtual object in the multimedia interact, and in which the multimedia content can then be used by exploiting the interaction between the offline real object and the multimedia virtual object.

The present invention also provides an interactive programming method for a robot and a multimedia object that allows multimedia content to be authored while manipulating the various operations or states of offline real objects such as robots and of virtual objects in the multimedia, through a relatively easy process of combining a plurality of icon or block menus.

The present invention further provides an interactive programming method for a robot and a multimedia object in which multimedia content is authored while the operation of an offline physical object such as a robot and the operation of a virtual object in the multimedia are interconnected, and in which this authoring can be performed through intuitive manipulation on a multimedia device such as a mobile phone or a TV.

The present invention also provides a recording medium for recording a program authored by the interactive programming method for the robot and the multimedia object.

In order to achieve the above object, the interactive programming method for a robot and a multimedia object according to the present invention includes: generating, through a multimedia content editing tool, multimedia content that includes an individual motion of a real robot, an individual motion of a multimedia object or a virtual environment, and a linked operation produced by interaction between the robot and the multimedia object or the virtual environment; and, in the course of executing the multimedia content, providing multimedia information that includes the individual motions of the robot and the multimedia object or the virtual environment contained in the content, together with the linked operation between the robot and the multimedia object or the virtual environment.

The generating of the multimedia content may include: executing the multimedia content editing tool; displaying a tool area and an editing area through the multimedia content editing tool; editing multimedia content by manipulating the tool area and the editing area; incorporating a motion signal into the multimedia content being edited when the motion signal is input from a motion sensor included in the robot; and finalizing the multimedia content on the basis of the editing.

The generating of the multimedia content may further include: displaying a corresponding tool list in the tool area when the user selects a specific tool from the tool area; selecting a specific item from the displayed tool list and displaying multimedia editing information about the corresponding image, sound, and so on in the editing area; and editing the multimedia content under creation by manipulating the displayed multimedia editing information.

The multimedia content editing tool may control the multimedia content, generated from the manipulation of the tool area and the editing area together with a motion signal input from the motion sensor of the robot, so that it is stored in at least one of the robot and the multimedia device.

The multimedia content editing tool may edit the multimedia content to be displayed as graphic information based on a physics engine.

The multimedia content editing tool may edit the multimedia content through intuitive programming of the robot and the multimedia object or the virtual environment.

The multimedia content editing tool may edit the multimedia content in such a way that the robot responds to an external action applied to the robot and simultaneously reacts to an external reaction of the multimedia object or the virtual environment, and likewise in such a way that the multimedia object or the virtual environment responds to an external action applied to it and simultaneously reacts to an external reaction of the robot.

The multimedia content editing tool may be configured to edit the multimedia content in a form in which the multimedia object or the virtual environment operates or reacts according to a physical law.

The multimedia content editing tool may include: an intuitive interaction editor for intuitively editing the interaction between the multimedia object or the virtual environment and the robot; an interaction executor for executing the contents edited by the intuitive interaction editor; a component gallery for providing various previously generated and stored components to the edited contents of the multimedia content running in the interaction executor; and a communication module.

In addition, the interaction editor programs the interaction between the robot and the multimedia object or the virtual environment in an "if ..., then ..." form, in which the subject that receives a predetermined primary action responds to that primary action with a secondary action.

The interaction executor may include a virtual machine for the PC-based multimedia object or virtual environment, a physical interaction processor, and a graphic interaction processor.

In addition, the physical interaction processor uses the physics engine to realistically express the movement of the XML UCR and to represent the physical characteristics of the UCRC on the physics engine.

In addition, the graphic interaction processor classifies and renders the motion of the robot according to physical laws separately from the motion of the multimedia object or the change of the multimedia virtual environment; the motion of the robot under physical laws is expressed by receiving position information, the operation of the multimedia object or the change of the multimedia virtual environment is expressed by interpreting the ILB and receiving position information, image changes of the robot or the multimedia object are handled for animation, and multiple XML UCRs are loaded and operated.

In addition, the multimedia content editing tool outputs the multimedia object or virtual environment in a state reflecting at least one condition among an image change, an applied force, a change of a physical property, a sound output, a text output, a movement in a predetermined pattern, and a rotation in a predetermined pattern, and the multimedia content is displayed in that state.

The multimedia content editing tool may be mounted on at least one of the robot and the multimedia device for outputting the multimedia information.

The present invention also provides a computer-readable recording medium on which a program for executing the interactive programming method for the robot and the multimedia object is recorded.

According to the present invention, multimedia content can be authored under the condition that an offline real object such as a robot and a virtual object in the multimedia interact with each other, and the multimedia content can be used by exploiting the interaction between the offline real object and the multimedia virtual object.

In addition, it is possible to author multimedia content while manipulating the various operations or states of offline real objects such as robots and of virtual objects in the multimedia, through a relatively easy process of combining a plurality of icon or block menus.

In addition, the multimedia content is authored while the operation of an offline physical object such as a robot and the operation of a virtual object in the multimedia are linked to each other, and such authoring can be performed through intuitive manipulation on a multimedia device such as a mobile phone or a TV.

In addition, it is possible to provide a recording medium for recording a program authored by the interactive programming method for the robot and the multimedia object.

FIG. 1 is a block diagram showing an interactive programming method for a robot and a multimedia object according to an embodiment of the present invention.
FIG. 2 is a block diagram for implementing an interactive programming method for a robot and a multimedia object according to an embodiment of the present invention.
FIGS. 3 through 6 illustrate screens of an interactive programming authoring tool for a robot and a multimedia object according to an embodiment of the present invention.
FIG. 7 is a flowchart illustrating the procedure of an interactive programming method for a robot and a multimedia object according to an embodiment of the present invention.
FIGS. 8 and 9 illustrate a multimedia screen executed from a recording medium of an interactive program for a robot and a multimedia object according to an embodiment of the present invention.

Hereinafter, the present invention will be described in more detail with reference to the accompanying drawings.

FIG. 1 is a block diagram showing an interactive programming method for a robot and a multimedia object according to an embodiment of the present invention.

As shown, with the interactive programming method for a robot and a multimedia object according to an embodiment of the present invention (hereinafter abbreviated as the interactive programming method), the motion of the real robot 100 (hereinafter, the robot) is transmitted to the multimedia terminal 200, and the virtual object included in the multimedia information exhibits the same motion as the robot 100. Here, a motion sensor unit 140 consisting of one or more motion sensors is installed in the robot 100; the motion of the robot 100 is detected by these motion sensors and transmitted to the multimedia terminal 200, and the multimedia content editing tool 230 included in the terminal 200 incorporates the transmitted motion of the robot 100 into the multimedia content being edited.
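
As a concrete illustration of this data path, the following Python sketch shows one way a stream of motion samples from the robot could be mirrored onto a virtual object on the terminal side. The JSON-over-TCP transport, the message fields, and the class names are illustrative assumptions, not part of the disclosed system.

```python
import json
import socket

# Hypothetical sketch: the robot-side motion sensor unit (140) streams pose
# samples to the multimedia terminal (200), whose editing tool mirrors the
# motion onto a virtual object. Names and the TCP transport are assumptions.

class VirtualObject:
    """Multimedia object in the virtual space that mirrors the real robot."""
    def __init__(self):
        self.x, self.y, self.heading = 0.0, 0.0, 0.0

    def apply_robot_motion(self, sample: dict) -> None:
        # Reproduce the robot's detected motion on the virtual object.
        self.x += sample["dx"]
        self.y += sample["dy"]
        self.heading += sample["dtheta"]

def terminal_loop(host: str = "0.0.0.0", port: int = 9000) -> None:
    obj = VirtualObject()
    with socket.create_server((host, port)) as srv:
        conn, _ = srv.accept()
        with conn, conn.makefile("r") as stream:
            for line in stream:                      # one JSON pose sample per line
                sample = json.loads(line)
                obj.apply_robot_motion(sample)       # virtual object tracks the robot
```

A line-delimited JSON stream is used here only to keep the example self-contained; any transport between the transceiver 110 and the terminal 200 would serve the same role.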

In addition, although FIG. 1 illustrates the real robot 100 and the multimedia object exhibiting the same motion, the present invention is not limited thereto, and the content may also be displayed in a form in which the virtual object reacts to the motion of the robot 100. This will be described later.

FIG. 2 is a block diagram for implementing an interactive programming method for a robot and a multimedia object according to an embodiment of the present invention.

As illustrated, the robot 100 includes a transceiver 110, a controller 120, a storage 130, a motion sensor 140, a power supply 150, and an input / output unit 160.

The transceiver 110 functions to transmit and receive data and electrical signals with the multimedia terminal 200. The robot 100 exchanges various data and electrical signals with the multimedia terminal 200 to be described later through the transceiver 110, and in particular, transmits a signal from the motion sensor unit 140 to the multimedia terminal 200.

The storage unit 130 provides a storage space for various information, and the power supply 150 supplies power to the robot 100.

The controller 120 accesses data stored in the storage 130 and controls the overall operation of the robot 100. In particular, the controller 120 processes a signal input from the motion sensor unit 140 and transmits the signal to the multimedia terminal 200 through the transceiver 110.

The motion sensor unit 140 includes one or more motion sensors, detects the motion of the robot 100, and inputs the detected motion to the controller 120.
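
For illustration, a minimal robot-side loop corresponding to this description might look as follows; the simulated sensor readings and the networking details are assumptions made only to keep the sketch runnable.

```python
import json
import random
import socket
import time

# Hypothetical robot-side sketch: the motion sensor unit (140) produces samples,
# the controller (120) packages them, and the transceiver (110) sends them to
# the multimedia terminal (200). The random sensor values and JSON-over-TCP
# framing are illustrative assumptions, not the patent's actual protocol.

def read_motion_sensor() -> dict:
    # Stand-in for reading accelerometer/gyro values from the motion sensor unit.
    return {"dx": random.uniform(-1, 1),
            "dy": random.uniform(-1, 1),
            "dtheta": random.uniform(-0.1, 0.1)}

def robot_loop(terminal_host: str = "localhost", port: int = 9000, hz: float = 20.0) -> None:
    with socket.create_connection((terminal_host, port)) as transceiver:
        while True:                                          # runs until interrupted
            sample = read_motion_sensor()                    # motion sensor unit 140
            packet = (json.dumps(sample) + "\n").encode()    # controller 120 packages the signal
            transceiver.sendall(packet)                      # transceiver 110 transmits
            time.sleep(1.0 / hz)
```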

The multimedia terminal 200 includes a transceiver 210, a controller 220, a multimedia content editing tool 230, a display 240, and an input / output unit 250.

The multimedia content editing tool 230 corresponds to the main feature of the present invention: it can edit and author multimedia content in a form that links the individual motions of the robot 100, the individual motions or changes of the virtual objects or virtual environment included in the multimedia information, and the interaction between the robot 100 and those virtual objects or that virtual environment.

FIGS. 3 to 5 illustrate screens of an interactive programming authoring tool for robots and multimedia objects according to an embodiment of the present invention.

As illustrated in FIG. 3, the multimedia content editing tool 230 is displayed through the display unit 240 in a form that includes a tool area 241 and an editing area 242. The user selects a predetermined editing condition in the tool area 241, and multimedia information such as video or audio is displayed in the editing area 242 according to the selected editing condition.

The multimedia content generated by the manipulation of the tool area 241 and the editing area 242, together with the motion signal input from the motion sensor unit 140 of the robot 100, is controlled so as to be stored in at least one of the robot 100 and the multimedia terminal 200.
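
A minimal sketch of this storage step is shown below, assuming a simple JSON container for the edit operations and the recorded motion signals; the file format and field names are hypothetical and chosen only for illustration.

```python
import json
from dataclasses import dataclass, field, asdict
from pathlib import Path

# Hypothetical sketch of persisting edited content: manipulations from the tool
# area (241) / editing area (242) and recorded motion-sensor signals are bundled
# and written to the terminal's storage and/or mirrored to the robot's storage
# unit (130). The JSON layout below is an assumption, not the patent's format.

@dataclass
class MultimediaContent:
    edits: list = field(default_factory=list)           # tool/editing-area operations
    motion_signals: list = field(default_factory=list)  # samples from motion sensor unit 140

    def add_edit(self, operation: str, params: dict) -> None:
        self.edits.append({"op": operation, "params": params})

    def add_motion_signal(self, sample: dict) -> None:
        self.motion_signals.append(sample)

def store(content: MultimediaContent, terminal_path: Path, robot_path: Path | None = None) -> None:
    payload = json.dumps(asdict(content), indent=2)
    terminal_path.write_text(payload)         # stored on the multimedia terminal
    if robot_path is not None:
        robot_path.write_text(payload)        # optionally mirrored to the robot's storage
```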

The multimedia content editing tool 230 edits the multimedia content so that it is displayed as graphic information based on a physics engine. That is, when a physical force is applied to the robot, the robot moves, and the motion of the robot is transmitted to the multimedia object, so that the resulting multimedia content takes a form in which the multimedia object intuitively reacts to that motion.

In other words, the multimedia content editing tool 230 edits the multimedia content in such a way that the robot 100 responds to an external action applied to the robot 100 and simultaneously reacts to an external reaction of the multimedia object or the virtual environment, and in such a way that the multimedia object or the virtual environment responds to an external action applied to it and simultaneously reacts to an external reaction of the robot 100.

To this end, the multimedia content editing tool 230 includes an interaction editor, an interaction launcher, a component gallery, and a communication module.

The interaction editor functions to intuitively edit the interaction between the multimedia object or the virtual environment and the robot 100. Here, the interaction between the robot 100 and the multimedia object or the virtual environment is programmed in an "if ..., then ..." form, in which the subject that receives a predetermined primary action responds to that primary action with a secondary action.
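
The "if <primary action>, then <secondary action>" form described here could be represented, for example, as a small rule table. The following sketch is one possible encoding; it is not the patent's actual editor syntax, and the event fields and class names are assumptions.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch of the "if <primary action>, then <secondary action>" form:
# the subject that receives a primary action responds with a programmed secondary
# action. The rule representation below is illustrative only.

@dataclass
class InteractionRule:
    primary: Callable[[dict], bool]    # condition on an observed event ("if ...")
    secondary: Callable[[dict], None]  # response of the acted-on subject ("then ...")

class InteractionEditor:
    def __init__(self):
        self.rules: list[InteractionRule] = []

    def when(self, primary, secondary) -> None:
        self.rules.append(InteractionRule(primary, secondary))

    def dispatch(self, event: dict) -> None:
        for rule in self.rules:
            if rule.primary(event):
                rule.secondary(event)

# Example: if the real robot is pushed, then the virtual ball in the scene rolls.
editor = InteractionEditor()
editor.when(lambda e: e.get("source") == "robot" and e.get("type") == "push",
            lambda e: print("virtual ball rolls with force", e.get("force", 1.0)))
editor.dispatch({"source": "robot", "type": "push", "force": 2.5})
```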

The interaction launcher functions to execute the contents edited by the intuitive interaction editor. It comprises a virtual machine for the PC-based virtual object or virtual environment, a physical interaction processor, and a graphic interaction processor. The physical interaction processor uses the physics engine to realistically represent the movement of the XML UCR and to express the physical characteristics of the UCRC on the physics engine. The graphic interaction processor classifies and renders the motion of the robot according to physical laws separately from the motion of the multimedia object or the change of the multimedia virtual environment; the motion of the robot 100 under physical laws is expressed by receiving position information, the operation of the multimedia object or the change of the multimedia virtual environment is interpreted and expressed by receiving position information, image changes of the robot or the multimedia object are processed for animation, and several XML UCRs are loaded and operated.
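
As a rough illustration of the two processors, the sketch below advances a virtual body under a hand-rolled physical law and hands the result to a rendering stub. A real interaction launcher would delegate to an actual physics engine and renderer; the numeric integration here is only a stand-in, and all names are assumptions.

```python
from dataclasses import dataclass

# Minimal sketch of the interaction launcher's two processors: a physical
# interaction processor that advances a virtual body under a simple physical
# law, and a graphic interaction processor that turns the resulting positions
# into render calls. The damping-based integration replaces a full physics engine.

@dataclass
class VirtualBody:
    x: float = 0.0      # position of the virtual object
    vx: float = 0.0     # velocity
    mass: float = 1.0

class PhysicalInteractionProcessor:
    def __init__(self, friction: float = 0.2):
        self.friction = friction

    def apply_robot_impulse(self, body: VirtualBody, impulse: float) -> None:
        body.vx += impulse / body.mass           # robot's measured push becomes an impulse

    def step(self, body: VirtualBody, dt: float) -> None:
        body.vx -= self.friction * body.vx * dt  # simple damping in place of a full engine
        body.x += body.vx * dt

class GraphicInteractionProcessor:
    def render(self, body: VirtualBody) -> None:
        print(f"draw virtual object at x={body.x:.2f}")  # placeholder for real rendering

ball = VirtualBody()
physics, graphics = PhysicalInteractionProcessor(), GraphicInteractionProcessor()
physics.apply_robot_impulse(ball, impulse=3.0)   # e.g. the real robot bumped an obstacle
for _ in range(3):
    physics.step(ball, dt=0.1)
    graphics.render(ball)
```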

The component gallery functions to provide various previously created and stored components to the edited contents of the multimedia content running in the interaction launcher.

In addition, the multimedia content editing tool outputs the multimedia object or virtual environment in a state reflecting at least one condition among an image change, an applied force, a change of a physical property, a sound output, a text output, a movement in a predetermined pattern, and a rotation in a predetermined pattern, so that the multimedia content can be displayed.

The multimedia content editing tool 230 is mounted on at least one of the robot 100 and the multimedia terminal 200 for outputting multimedia information.

FIG. 7 is a flowchart illustrating the procedure of an interactive programming method for a robot and a multimedia object according to an embodiment of the present invention.

The interactive programming method according to an embodiment of the present invention includes a multimedia content generating step and a multimedia content providing step.

In the multimedia content generating step, multimedia content is generated through the multimedia content editing tool 230 so as to include the individual motion of the robot 100, the individual motion of the multimedia object or the virtual environment, and the linked operation produced by the interaction between the robot 100 and the multimedia object or the virtual environment.

In the multimedia content providing step, while the multimedia content is executed, multimedia information is provided that includes the individual motion or virtual environment of the robot 100 and the multimedia object contained in the content, together with the linked operation between the robot and the multimedia object or the virtual environment.

Here, the generating of the multimedia content includes: executing the multimedia content editing tool 230; displaying the tool area 241 and the editing area 242 through the multimedia content editing tool 230; editing multimedia content by manipulating the tool area 241 and the editing area 242; incorporating a motion signal into the multimedia content being edited when the motion signal is input from the motion sensor provided in the robot 100; and finally generating the multimedia content from the edited result.

The generating of the multimedia content further includes: displaying a tool list in the tool area when the user selects a specific tool in the tool area 241 of the multimedia content editing tool 230; selecting a specific item from the displayed tool list and displaying the corresponding multimedia editing information about the video, sound, and so on in the editing area; and editing the multimedia content under creation by manipulating the displayed multimedia editing information.

Referring to FIG. 7, the multimedia content editing tool is executed (S120), and the tool area and the editing area are accordingly displayed on the multimedia terminal (S130). The multimedia content is edited through the tool area and the editing area. When a motion signal from the motion sensor of the robot is input (S150), the multimedia content editing tool selects an operation of the multimedia object to be associated with the input motion signal from a menu such as that illustrated in Table 1 below, and edits the multimedia content in a state that reflects this selection (S160).

Table 1 (the icon images accompanying each entry in the original table are omitted):

Classification          Examples
Single-axis rotation    Keep turning; CCW; CW; half turn; 1/4 turn
Car movement            Follow the line; control; deceleration; acceleration; maze
Arm-type movement       Swipe; push hard; cast; control
Sensor condition        According to time; according to brightness; according to sound; according to resistance; as seen

Finally, through this series of processes, the editing of the multimedia content is completed and the multimedia content is generated (S170).
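
To make steps S150 to S160 concrete, the following sketch shows one way the Table 1 menu could be represented and consulted when a motion signal arrives. The data structures and the toy motion classifier are illustrative assumptions, not part of the disclosed editing tool.

```python
# Hypothetical sketch of steps S150-S160: when a motion signal arrives, the
# editing tool offers the operations of Table 1 for the matching category and
# the selected one is written into the content being edited. The menu entries
# paraphrase Table 1; the data structures are assumptions for illustration.

TABLE_1_MENU = {
    "single_axis_rotation": ["keep turning", "CCW", "CW", "half turn", "1/4 turn"],
    "car_movement":         ["follow the line", "control", "deceleration", "acceleration", "maze"],
    "arm_movement":         ["swipe", "push hard", "cast", "control"],
    "sensor_condition":     ["by time", "by brightness", "by sound", "by resistance", "as seen"],
}

def classify_motion(signal: dict) -> str:
    # Toy classifier standing in for the tool's interpretation of the sensor signal.
    if abs(signal.get("dtheta", 0.0)) > abs(signal.get("dx", 0.0)):
        return "single_axis_rotation"
    return "car_movement"

def edit_step(content: list, signal: dict, chosen_index: int = 0) -> None:
    category = classify_motion(signal)                 # S150: motion signal received
    operation = TABLE_1_MENU[category][chosen_index]   # user picks from the Table 1 menu
    content.append({"motion": signal, "object_action": operation})  # S160: reflected in the edit

content: list = []
edit_step(content, {"dx": 0.1, "dtheta": 0.8})
print(content)
```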

In addition, the interactive programming method for a robot and a multimedia object using the content editing tool according to the present invention may be implemented in the form of program instructions that can be executed by various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be specially designed and constructed for the purposes of the present invention, or they may be of the kind well known and available to those skilled in the computer software arts. Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory.

100: robot 110: transceiver
120: controller 130: storage unit
140: motion sensor unit 150: power supply
160: input/output unit 200: multimedia terminal
210: transceiver 220: controller
230: multimedia content editing tool 240: display unit
241: tool area 242: editing area
250: input/output unit

Claims (16)

Generating a multimedia content including an individual operation of a real robot, an individual operation or a virtual environment of a multimedia object, and a linked operation by interaction between the robot and the multimedia object or the virtual environment through a multimedia content editing tool;
Providing multimedia content by providing, in the process of executing the multimedia content, multimedia information that includes the individual operation or virtual environment of the robot and the multimedia object included in the content and the linked operation between the robot and the multimedia object or the virtual environment,
the interactive programming method for a robot and a multimedia object being characterized in that it comprises these steps.
The method of claim 1,
The generating of the multimedia content comprises:
executing the multimedia content editing tool;
displaying a tool area and an editing area through the multimedia content editing tool;
editing multimedia content by manipulating the tool area and the editing area;
incorporating a motion signal into the multimedia content being edited when the motion signal is input from the motion sensor provided in the robot; and
finally generating the multimedia content from the edited multimedia content.
The method of claim 2,
The generating of the multimedia content further comprises:
displaying a corresponding tool list in the tool area when a user selects a specific tool in the tool area;
selecting a specific item from the displayed tool list and displaying multimedia editing information for the corresponding image, sound, and so on in the editing area; and
editing the multimedia content under creation by manipulating the displayed multimedia editing information.
The method of claim 2,
wherein the multimedia content editing tool controls the multimedia content, generated from the manipulation of the tool area and the editing area together with a motion signal input from the motion sensor of the robot, to be stored in at least one of the robot and the multimedia device.
The method of claim 1,
And the multimedia content editing tool edits the multimedia content to be displayed as graphic information based on a physics engine.
The method of claim 1,
And the multimedia content editing tool edits the multimedia content through intuitive programming of the robot and the multimedia object or the virtual environment.
The method of claim 1,
wherein the multimedia content editing tool
edits the multimedia content in such a way that the robot responds to an external action applied to the robot and simultaneously reacts to an external reaction of the multimedia object or the virtual environment, and
edits the multimedia content in such a way that the multimedia object or the virtual environment responds to an external action applied to the multimedia object or the virtual environment and simultaneously reacts to an external reaction of the robot.
The method of claim 1,
And the multimedia content editing tool edits the multimedia content in a manner in which the multimedia object or the virtual environment operates or reacts according to a physical law.
The method of claim 1,
wherein the multimedia content editing tool comprises:
an intuitive interaction editor for intuitively editing the interaction of the multimedia object or the virtual environment with the robot;
an interaction executor for executing the contents edited by the intuitive interaction editor;
a component gallery for providing various previously generated and stored components to the edited contents of the multimedia content running in the interaction executor; and
a communication module.
The method of claim 9,
wherein the interaction editor programs the interaction between the robot and the multimedia object or the virtual environment in an "if ..., then ..." form, in which the subject that receives a predetermined primary action responds to that primary action with a secondary action.
The method of claim 9,
wherein the interaction launcher comprises a virtual machine for the PC-based virtual object or virtual environment, a physical interaction processor, and a graphic interaction processor.
The method of claim 11,
And the physical interaction processor realistically expresses the movement of the XML UCR using a physics engine and expresses the physical characteristics of the UCRC on the physics engine.
The method of claim 11,
wherein the graphic interaction processor renders the motion of the robot according to physical laws separately from the motion of the multimedia object or the change of the multimedia virtual environment; the motion of the robot according to physical laws is expressed by receiving position information, the operation of the multimedia object or the change of the multimedia virtual environment is expressed by interpreting the ILB and receiving position information, image changes of the robot or the multimedia object are processed for animation, and several XML UCRs are loaded and operated.
The method of claim 1,
wherein the multimedia content editing tool outputs the multimedia object or virtual environment in a state reflecting at least one condition among an image change, an applied force, a change of a physical property, a sound output, a text output, a movement in a predetermined pattern, and a rotation in a predetermined pattern, so that the multimedia content is displayed.
The method of claim 1,
And the multimedia content editing tool is mounted on at least one of the robot and the multimedia device for outputting the multimedia information.
A computer-readable recording medium having recorded thereon a program for executing the method of any one of claims 1 to 15.
KR1020100030695A 2010-04-05 2010-04-05 Method for communication programming of robot and multimedia object, medium recording program for operating the method KR20110111565A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020100030695A KR20110111565A (en) 2010-04-05 2010-04-05 Method for communication programming of robot and multimedia object, medium recording program for operating the method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020100030695A KR20110111565A (en) 2010-04-05 2010-04-05 Method for communication programming of robot and multimedia object, medium recording program for operating the method

Publications (1)

Publication Number Publication Date
KR20110111565A true KR20110111565A (en) 2011-10-12

Family

ID=45027554

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020100030695A KR20110111565A (en) 2010-04-05 2010-04-05 Method for communication programming of robot and multimedia object, medium recording program for operating the method

Country Status (1)

Country Link
KR (1) KR20110111565A (en)


Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination