WO2022255572A1 - System for providing responsive picture production service and control method thereof - Google Patents
System for providing responsive picture production service and control method thereof
- Publication number
- WO2022255572A1 WO2022255572A1 PCT/KR2021/016773 KR2021016773W WO2022255572A1 WO 2022255572 A1 WO2022255572 A1 WO 2022255572A1 KR 2021016773 W KR2021016773 W KR 2021016773W WO 2022255572 A1 WO2022255572 A1 WO 2022255572A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- dynamic image
- picture production
- providing system
- service providing
- Prior art date
Links
- 238000004519 manufacturing process Methods 0.000 title claims abstract description 45
- 238000000034 method Methods 0.000 title claims description 23
- 230000003993 interaction Effects 0.000 claims abstract description 31
- 230000002452 interceptive effect Effects 0.000 claims abstract 2
- 239000011324 bead Substances 0.000 claims description 14
- 230000014509 gene expression Effects 0.000 claims description 11
- 230000010399 physical interaction Effects 0.000 claims description 10
- 230000004044 response Effects 0.000 claims description 8
- 238000005096 rolling process Methods 0.000 claims description 3
- 230000003542 behavioural effect Effects 0.000 claims 1
- 230000000694 effects Effects 0.000 abstract description 7
- 230000006399 behavior Effects 0.000 abstract description 3
- 230000000007 visual effect Effects 0.000 abstract 1
- 238000010586 diagram Methods 0.000 description 14
- 241000196324 Embryophyta Species 0.000 description 3
- 241001465754 Metazoa Species 0.000 description 3
- 238000005516 engineering process Methods 0.000 description 3
- 235000013399 edible fruits Nutrition 0.000 description 2
- 239000004973 liquid crystal related substance Substances 0.000 description 2
- 230000003068 static effect Effects 0.000 description 2
- 238000003860 storage Methods 0.000 description 2
- 241000191291 Abies alba Species 0.000 description 1
- 241000272525 Anas platyrhynchos Species 0.000 description 1
- 241000272517 Anseriformes Species 0.000 description 1
- 241000282472 Canis lupus familiaris Species 0.000 description 1
- 206010011469 Crying Diseases 0.000 description 1
- 241000282326 Felis catus Species 0.000 description 1
- 238000004040 coloring Methods 0.000 description 1
- 239000004579 marble Substances 0.000 description 1
- 230000001151 other effect Effects 0.000 description 1
- 238000010422 painting Methods 0.000 description 1
- 239000002245 particle Substances 0.000 description 1
- 238000005293 physical law Methods 0.000 description 1
- 239000010409 thin film Substances 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/101—Collaborative creation, e.g. joint development of products or services
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/01—Social networking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/40—Business processes related to the transportation industry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/20—Drawing from basic elements, e.g. lines or circles
- G06T11/203—Drawing of straight lines or curves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/80—2D [Two Dimensional] animation, e.g. using sprites
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B11/00—Teaching hand-writing, shorthand, drawing, or painting
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B11/00—Teaching hand-writing, shorthand, drawing, or painting
- G09B11/10—Teaching painting
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B5/00—Electrically-operated educational appliances
- G09B5/02—Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/40—Support for services or applications
- H04L65/401—Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference
- H04L65/4015—Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference where at least one of the additional parallel sessions is real time or time sensitive, e.g. white board sharing, collaboration or spawning of a subconference
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Definitions
- the present invention relates to a responsive picture production service providing system, and a control method thereof, that can assist a user's work more efficiently by implementing a dynamic image that can be interacted with using a picture drawn by the user and by providing a guide image for each category of the dynamic image to be created.
- a dynamic image refers to an image produced to move according to an interaction, unlike an image reproduced according to a predetermined flow, such as a video.
- the dynamic image may show an object that moves in response to a click or touch, an object whose color or shape changes over time, or an object that physically interacts with another object.
- these dynamic images can easily attract users' attention and can be used in various fields such as advertising, education, and content production.
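The distinction drawn above can be sketched in code. The following is a minimal illustration (class and method names are my own, not from the patent) of an image object whose state updates in response to interaction events rather than along a fixed timeline:

```python
from dataclasses import dataclass

@dataclass
class DynamicObject:
    """An image object that updates its state in response to interactions."""
    x: float = 0.0
    y: float = 0.0
    color: str = "red"

    def on_touch(self, dx: float, dy: float) -> None:
        # Move the object in response to a click or touch drag.
        self.x += dx
        self.y += dy

    def on_time_elapsed(self, seconds: float) -> None:
        # Example of a time-based change: the color shifts after a while.
        if seconds > 5:
            self.color = "blue"

obj = DynamicObject()
obj.on_touch(10, 5)      # user drags the object
obj.on_time_elapsed(6)   # time passes; the color changes
```

A video, by contrast, would advance regardless of `on_touch` ever being called; here every state change is driven by an interaction or elapsed-time event.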
- an object of the present invention is to provide a responsive picture production service providing system, and a control method thereof, that implement a dynamic image that can be interacted with using a picture drawn by the user and provide a guide image for each category of the dynamic image to be created, thereby assisting the user's work more efficiently.
- to this end, the system includes: a terminal capable of receiving event information related to a behavioral response for the user's picture production; a control unit that establishes a network with the terminal, the web, and an external server to communicate with them, generates arbitrary custom information by matching first data for a dynamic image, provided through at least one of the web and the external server according to the event information input from the terminal, with second data for a signal using a drawing tool of a pre-stored picture production program, and outputs a preset notification signal according to the generated custom information; and a notification unit provided in the terminal that visually displays the notification signal on an arbitrary screen according to the notification signal output from the control unit.
- the picture production program may be graphical software configured to support an environment for performing picture production using a drawing tool capable of pictorial expression of at least one of point, line, plane, shape, color, light, texture, and volume on the canvas.
- the control unit determines a category of the dynamic image provided through at least one of the web and the external server according to the event information, displays at least one of a plurality of guide images corresponding to the determined category on the canvas area of the picture production program, saves as data a drawing image drawn by the user with the drawing tool based on the guide image, and can create a new dynamic image based on the saved drawing image.
- the dynamic image may include a bead image that is flexibly expressed within the canvas area of the picture production program.
- the picture production service providing system may further include: a gyro sensor provided in the terminal to sense orientation; and a position sensor provided in the terminal to sense position.
- the event information may include a user's first manipulation signal for changing the inclination of the terminal.
- the control unit may additionally match third data for terminal tilt information, provided through at least one of the gyro sensor and the position sensor, and reflect it in the custom information.
- in response to the tilt value of the terminal, the bead image may flow as if rolling within the canvas area, and a notification signal may be output in which the trace of the bead image is displayed in real time according to the second data in which an attribute value for the pictorial expression of the drawing tool is set.
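The tilt-to-trace behavior can be sketched as follows. This is only an illustration under my own assumptions (pitch/roll angles in degrees standing in for the "third data", a scalar gain, and a point list standing in for the drawn stroke); it is not the patent's implementation:

```python
class BeadCanvas:
    """Sketch: a bead that rolls with terminal tilt, leaving a trace.

    Tilt angles (pitch, roll) stand in for the 'third data' from the
    gyro/position sensors; the trace plays the role of the drawn stroke.
    """

    def __init__(self, width: int, height: int):
        self.width, self.height = width, height
        self.x, self.y = width / 2, height / 2   # bead starts at the center
        self.trace = [(self.x, self.y)]

    def on_tilt(self, pitch_deg: float, roll_deg: float, gain: float = 0.5):
        # Tilt maps to displacement; the bead is clamped to the canvas.
        self.x = min(max(self.x + roll_deg * gain, 0), self.width)
        self.y = min(max(self.y + pitch_deg * gain, 0), self.height)
        self.trace.append((self.x, self.y))

canvas = BeadCanvas(200, 100)
canvas.on_tilt(pitch_deg=10, roll_deg=20)   # tilt right and forward
canvas.on_tilt(pitch_deg=-300, roll_deg=0)  # extreme tilt is clamped
```

Rendering the `trace` with the drawing tool's current attribute values (the "second data") would produce the real-time stroke the text describes.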
- the picture production program may be configured to set a thickness according to the event information for a first icon representing a formative element of a point, line, or plane having an arbitrary thickness among the drawing tools.
- the thickness setting may be performed by a stroke that touches (clicks) the first icon, and the control unit may output a notification signal that causes the notification unit to provide a screen in which the first icon is activated at a size corresponding to the set thickness.
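One plausible reading of the thickness-by-stroke behavior, sketched with illustrative values (the thickness steps and icon sizing below are my assumptions, not from the patent), is an icon that cycles through thicknesses on each touch while its displayed size tracks the selection:

```python
class ThicknessIcon:
    """Sketch: each touch (click) on the first icon cycles the stroke
    thickness, and the icon's displayed size tracks the selection."""

    THICKNESSES = [2, 4, 8, 16]  # illustrative thickness values in pixels

    def __init__(self):
        self._index = 0

    @property
    def thickness(self) -> int:
        return self.THICKNESSES[self._index]

    @property
    def icon_size(self) -> int:
        # The icon is drawn at a size proportional to the set thickness,
        # mirroring the "activated at a corresponding size" notification.
        return 16 + self.thickness * 2

    def on_touch(self) -> None:
        self._index = (self._index + 1) % len(self.THICKNESSES)

icon = ThicknessIcon()
icon.on_touch()  # thickness 2 -> 4; the icon grows accordingly
```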
- the picture production program may be configured to set a color according to the event information for a second icon representing a formative element for color among the drawing tools, and the color setting may be performed by a stroke that first touches (clicks) the second icon and then secondarily touches (clicks) a desired color.
- in this case, a notification signal for providing a screen activated in the same color as the set color may be output.
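The two-stroke color selection can be modeled as a small state machine: the first touch arms the palette, the second commits a color. The names and return strings below are illustrative assumptions, not the patent's signals:

```python
class ColorIcon:
    """Sketch of two-stroke color selection: the first touch selects the
    color icon (arming a palette), the second touch picks a color, after
    which related icons are shown in the chosen color."""

    def __init__(self):
        self.armed = False
        self.color = "#000000"

    def on_touch(self, picked_color=None):
        if not self.armed:            # first touch: activate the palette
            self.armed = True
            return "palette-open"
        if picked_color is not None:  # second touch: commit the color
            self.color = picked_color
            self.armed = False
            return "icons-activated:" + picked_color
        return "noop"

icon = ColorIcon()
icon.on_touch()                      # first stroke: open the palette
result = icon.on_touch("#ff8800")    # second stroke: pick a color
```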
- the interaction of the dynamic image may be related to a category of the dynamic image.
- the interaction of the dynamic image may include an operation of moving an object in the dynamic image in response to a user's input, indicating a physical interaction with other objects on the screen of the notification unit, or outputting voice data related to the dynamic image.
- the movement of the object in the dynamic image, or the manner of its physical interaction with other objects on the screen of the notification unit, may be determined based on a pre-stored physics engine program.
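The patent does not specify which physics engine is used; as a stand-in, a toy integration step (gravity plus a damped floor bounce, with all constants illustrative) shows the kind of motion rule such an engine would supply:

```python
def step(pos, vel, dt=0.1, gravity=-9.8, floor=0.0, restitution=0.6):
    """One integration step of a toy physics engine: gravity pulls the
    object down, and it bounces off the floor, losing some energy.

    This is only a stand-in for the 'pre-stored physics engine program'
    mentioned in the text; a real system would use a full 2D engine.
    """
    vel = vel + gravity * dt
    pos = pos + vel * dt
    if pos < floor:               # collision with the floor
        pos = floor
        vel = -vel * restitution  # bounce with energy loss
    return pos, vel

pos, vel = 1.0, 0.0
for _ in range(20):               # let the object fall and bounce
    pos, vel = step(pos, vel)
```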
- the guide image may include an object image having a dotted outline, and an object of the guide image may be related to a category of the dynamic image.
- step f) may include f-1) loading image data received through a path where the dynamic image is stored.
- the present invention implements a dynamic image that can be interacted with using a picture drawn by a user, and provides a guide image for each category of the dynamic image to be created, so that the user's work can be assisted more efficiently.
- in addition, by providing an effect of drawing a picture by rolling beads on a tray using the position sensor and gyro sensor pre-installed in the terminal, a picture can be drawn according to the tilt of the terminal without touching the display.
- a more interesting picture-making environment can be achieved.
- FIG. 1 is a diagram schematically showing the electronic configuration of a responsive picture production service providing system according to an embodiment of the present invention.
- FIG. 2 is a block diagram showing the configuration of a control unit in the picture production service providing system according to FIG. 1.
- FIGS. 3 and 4 are views showing an embodiment of pictorial expression using a bead image according to the action of a gyro sensor and a position sensor.
- FIG. 5 is a view showing an embodiment in which the size of an icon constituting the drawing tool is correspondingly changed and displayed as its property setting is changed.
- FIG. 6 is a diagram showing an embodiment in which the color of an icon constituting the drawing tool is correspondingly changed and displayed as its property setting is changed.
- FIG. 7 is a flowchart illustrating a control method of a responsive picture production service providing system according to an embodiment of the present invention.
- FIG. 8 is a diagram illustrating a screen of a notification unit and a guide image displayed thereon.
- FIG. 9 illustrates an example guide image that may be provided to a user.
- FIG. 10 is a diagram illustrating a state in which a user draws a picture along a guide image on a screen of a notification unit.
- FIG. 11 is a diagram showing how a dynamic image generated by a user interacts with another object on the screen of a notification unit.
- terms such as "first" and "second" are used to distinguish one component from another, and the scope of rights should not be limited by these terms.
- for example, a first element may be termed a second element, and similarly, a second element may be termed a first element.
- when an element is referred to as being "connected" to another element, it may be directly connected to the other element, but intervening elements may also exist.
- in contrast, when an element is referred to as being "directly connected" to another element, it should be understood that no intervening elements exist.
- other expressions describing the relationship between components such as “between” and “immediately between” or “adjacent to” and “directly adjacent to” should be interpreted similarly.
- FIG. 1 is a diagram schematically showing the electronic configuration of a responsive picture production service providing system according to an embodiment of the present invention.
- FIG. 2 is a block diagram showing the configuration of a control unit in the picture production service providing system according to FIG. 1.
- FIGS. 3 and 4 are diagrams showing an embodiment of pictorial expression using a bead image according to the action of a gyro sensor and a position sensor.
- FIG. 5 is a diagram showing an embodiment in which the size of an icon constituting the drawing tool is correspondingly changed and displayed as its property setting is changed.
- FIG. 6 is a diagram showing an embodiment in which the color of an icon constituting the drawing tool is correspondingly changed and displayed as its property setting is changed.
- FIG. 8 is a diagram showing a screen of a notification unit and a guide image displayed thereon.
- FIG. 9 is an exemplary view of a guide image that can be provided to a user.
- FIG. 10 is a diagram showing how a user draws a picture along a guide image on the screen of a notification unit.
- FIG. 11 is a diagram showing a state in which a dynamic image created by the user interacts with another object on the screen of the notification unit.
- the responsive picture production service providing system 100, which provides a picture production service capable of real-time interaction between a user and an object, includes a terminal 110, a web 120, an external server 130, a control unit 140, and a notification unit 150.
- the components shown in FIG. 1 are only the minimum components necessary for the present invention, and a picture production service providing system having additional components may be implemented.
- the terminal 110 is configured to include various known input devices (not shown) such as a keyboard, a pad, and a mouse. According to a preferred embodiment of the present invention, the terminal 110 is located on the user's side, and event information related to a behavioral response for the user to create a picture may be input through it to the web 120, the external server 130, and the control unit 140.
- the terminal 110 may be configured in plurality, and preferably, a tablet may be applied, but is not limited thereto, and a PC, a smartphone, and the like may be applied.
- since the configuration of the terminal 110 corresponds to a commonly known terminal device, its detailed configuration is not shown in the drawings and a detailed description thereof is omitted.
- the web 120 is an Internet service configured to support various multimedia such as text, graphics, images, sound, and video, and may be configured to provide dynamic images as well as various known media related to picture production.
- the external server 130 is configured to correspond to the above-described web 120, forms a network with the terminal 110, the web 120, and the control unit 140, and, in conjunction with the web 120, performs a function of serving a program preset in the control unit 140.
- the external server 130 may be configured to provide dynamic images in the same way as the web 120 described above, and the program may be an editor program composed of various data related to picture production; as this is well-known technology, it can be freely modified and designed by those skilled in the relevant field.
- the control unit 140, in which a picture production program corresponding to the web 120 and the external server 130 is stored in advance, is configured to provide a dynamic image capable of interaction between a user and an object by actively responding to the event information. As a control server, it is preferably configured to establish a network with the terminal 110, the web 120, and the external server 130 to communicate with them.
- the control unit 140 performs a function of generating arbitrary custom information by matching first data for a dynamic image, provided through at least one of the web 120 and the external server 130 according to the event information input from the terminal, with second data for a signal using the drawing tool of the previously stored picture production program, and of outputting a preset notification signal according to the generated custom information.
- more specifically, referring to FIG. 2, the control unit 140 may be configured to include: a data generation unit 141 that converts event information input from the terminal into data; a database unit 142 configured to pre-store and manage the picture production program for reacting in real time according to the data of the event information; a data processing unit 143 that generates custom information by matching the first and second data according to the data of the event information; and a notification signal output unit 144 that outputs a preset notification signal according to the custom information generated through the data processing unit 143.
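The four sub-units above form a simple pipeline from terminal event to notification signal. A minimal sketch (all names, keys, and data are illustrative stand-ins, not from the patent) might look like:

```python
def data_generation(event):
    """Convert raw event information from the terminal into data (unit 141)."""
    return {"type": event[0], "payload": event[1]}

# Pre-stored picture production program data (unit 142), purely illustrative.
DATABASE = {
    "first_data": {"touch": "move-object", "tilt": "bead-image"},
    "second_data": {"tool": "brush", "thickness": 4},
}

def data_processing(event_data):
    """Match first and second data into 'custom information' (unit 143)."""
    first = DATABASE["first_data"].get(event_data["type"], "none")
    return {
        "dynamic_image": first,
        "tool": DATABASE["second_data"],
        "payload": event_data["payload"],
    }

def notification_signal(custom):
    """Produce the preset notification signal for the notification unit (144)."""
    return "display:" + custom["dynamic_image"]

# A tilt event flows through the whole pipeline.
signal = notification_signal(data_processing(data_generation(("tilt", {"pitch": 10}))))
```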
- the custom information means information corresponding to data about a new dynamic image, which will be described later.
- the control unit 140 determines a category of the dynamic image provided through at least one of the web 120 and the external server 130 according to the event information, displays at least one of a plurality of guide images corresponding to the determined category on the canvas area of the picture production program, saves as data a drawing image drawn directly by the user with the drawing tool based on the guide image, and can create a new dynamic image based on the saved drawing image.
- the category of the dynamic image of the present invention is a concept that includes, for example, all major categories such as animals, plants, objects, and humans, as well as detailed sub-categories such as dogs, cats, ducks, trees, flowers, fruits, men, and women.
- Guide image data corresponding to each category and data describing an interaction method of each category may be stored in the database unit 142 of the control unit 140 in advance.
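The category-to-guide lookup described above can be sketched as a dictionary held by the database unit. The file names and interaction labels below are invented for illustration; the patent only says such data is stored in advance:

```python
# Illustrative mapping from categories to pre-stored guide image files and
# interaction methods, as they might be kept in the database unit (142).
GUIDE_DB = {
    "dog":  {"guide": "guide_dog.png",  "interaction": "bark-on-touch"},
    "cat":  {"guide": "guide_cat.png",  "interaction": "meow-on-touch"},
    "tree": {"guide": "guide_tree.png", "interaction": "sway-on-drag"},
}

def guides_for_category(category: str) -> dict:
    """Look up the guide image and interaction method for a category,
    falling back to a generic guide when the category is unknown."""
    return GUIDE_DB.get(category, {"guide": "guide_generic.png", "interaction": "none"})

entry = guides_for_category("cat")
```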
- the picture production program may be graphical software configured to support an environment for performing picture production using a drawing tool capable of pictorial expression of at least one of point, line, plane, shape, color, light, texture, and volume on a canvas.
- the notification unit 150 visually displays preset text and images on an arbitrary screen according to a notification signal output from the notification signal output unit 144 of the control unit 140, and is preferably provided in the terminal 110.
- the notification unit 150 may be configured to include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.
- the dynamic image may include a bead image that is flexibly expressed within the canvas area of the picture production program, and the picture production service providing system 100 of the present invention may be configured to further include a gyro sensor 160 provided in the terminal 110 to sense orientation, and a position sensor 170 provided in the terminal 110 to sense position.
- The event information may include a first manipulation signal of the user that changes the inclination of the terminal 110, and the control unit 140 may additionally match third data on terminal tilt information provided through at least one of the gyro sensor 160 and the position sensor 170 and reflect it in the custom information.
- Referring to FIGS. 3 and 4, when the first data represents a value in which a bead image is selected as the dynamic image and the third data represents a tilt change value of the terminal 110 caused by the above-described first manipulation signal, the control unit 140 may output a second notification signal in which the bead image moves as if rolling within the canvas area of the picture production program in response to the tilt change value, while a trace of the bead image's movement is expressed in real time according to the second data in which attribute values for the pictorial expression of the drawing tool are set.
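As a loose illustration of the tilt-driven bead behavior (not the patent's actual implementation), mapping sensor tilt to a clamped rolling motion could be sketched as follows; `update_bead`, the speed constant, and the (pitch, roll) convention are all assumptions of the sketch:

```python
import math

def update_bead(position, tilt_deg, canvas_size, speed=5.0):
    """Move the bead in the direction of the terminal tilt.

    position: (x, y) current bead centre; tilt_deg: (pitch, roll) in
    degrees from the gyro/position sensors; canvas_size: (width, height).
    Returns the new position, clamped so the bead stays on the canvas.
    """
    pitch, roll = tilt_deg
    # Steeper tilt -> faster roll; sin() gives a smooth response near level.
    dx = speed * math.sin(math.radians(roll))
    dy = speed * math.sin(math.radians(pitch))
    x = min(max(position[0] + dx, 0.0), canvas_size[0])
    y = min(max(position[1] + dy, 0.0), canvas_size[1])
    return (x, y)

# The visited positions stand in for the real-time trace that the
# drawing tool renders according to the second data.
trail = [(100.0, 100.0)]
for _ in range(3):
    trail.append(update_bead(trail[-1], (30.0, 0.0), (400.0, 400.0)))
```

Rendering the accumulated `trail` with the drawing tool's attribute values (thickness, color) would give the real-time trace expression described above.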
- The picture production program enables setting, according to the event information, the thickness for a first icon representing a formative element of a point, line, or plane having an arbitrary thickness among the drawing tools. The thickness setting may be performed by a stroke in which the first icon is touched (clicked), and the control unit 140 may output a third notification signal causing the notification unit 150 to provide a screen in which the first icon is activated with a size corresponding to the set thickness.
- The picture production program may be configured to enable color setting, according to the event information, for a second icon representing a formative element for color among the drawing tools. The color setting may be performed by a stroke of first touching (clicking) the second icon and then secondarily touching (clicking) a desired color, and the control unit 140 may output a fourth notification signal causing the notification unit 150 to provide a screen in which at least some icons of the drawing tool, including the second icon, are activated in the same color as the set color.
- event information is input by the terminal 110 in real time.
- According to the event information input in the event information input step (S100), the control unit 140 causes a determination to be made on the category of the dynamic image provided through at least one of the web 120 and the external server 130.
- the controller displays at least one of the plurality of guide images corresponding to the category determined in the category determining step (S200) on the canvas area of the picture production program.
- The controller 140 loads one or more pre-stored guide images of the corresponding category and displays them through the notification unit 150.
- FIG. 8 shows a guide image displayed on the screen of the notification unit 150.
- a guide image corresponding to a category selected by the user is displayed on the screen, and drawing tools necessary for the user to draw are displayed on the left side.
- icons for selecting the thickness, type, and color of drawing lines may be displayed on the drawing tool.
- the guide image is an image for assisting the user to easily draw along the picture, and since each image is tagged with a category, it is a basis for determining an interaction method of a dynamic image to be created later.
- For example, when an umbrella-shaped guide image is displayed as shown in FIG. 8, a dynamic image created from it may represent an interaction of raindrops bouncing off the umbrella on the screen, as shown in FIG. 11.
- the first image may correspond to categories such as 'plant', 'tree', and 'Christmas tree'
- the second image may correspond to categories such as 'fruit' and 'apple'.
- the third image may correspond to categories such as 'animal', 'bird', and 'duck'.
- an outline of an object in the guide image may be displayed as a dotted line, and a user may be encouraged to perform an activity such as drawing or coloring along the dotted line.
- The control unit 140 stores, as data, the drawing image drawn directly by the user with the drawing tool based on the guide image.
- FIG. 10 shows a state in which a user draws a picture along a guide image on the canvas area. Specifically, the user can select from the left drawing tool the thickness, color, and type (e.g., brush, crayon, colored pencil, highlighter) of the line needed for the drawing image, and can freely draw along the guide image on the screen using an input device (e.g., mouse or touch pen).
- the controller 140 creates a new dynamic image based on the drawing image stored in the drawing image storage step (S400).
- The dynamic image may indicate an interaction in response to a user's input within the screen of the notification unit, and may be loaded into another application through a reference to its stored path; specific interaction methods and rules are described later.
- the notification unit 150 displays the dynamic image generated in the dynamic image generation step (S500) on an arbitrary screen.
- the step of displaying the notification unit (S600) may include a data loading step (not shown) of loading image data received through a path where a dynamic image is stored.
- Dynamic images can be loaded into various applications (e.g., SNS messengers, e-mail, image processing apps, Internet browsers) through a reference to the stored path, and if an application supports the format of the dynamic image, interaction with the dynamic image is possible within that application.
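As a minimal sketch of loading by stored path, assuming the path is a plain file path on the host (the function name and error handling are illustrative, not from the patent):

```python
from pathlib import Path

def load_dynamic_image(stored_path):
    """Resolve the stored path and return the raw image bytes, so a host
    application can embed the dynamic image by reference."""
    path = Path(stored_path)
    if not path.is_file():
        raise FileNotFoundError(f"dynamic image not found: {stored_path}")
    return path.read_bytes()
```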
- The interaction of the dynamic image may be related to the category of the dynamic image, and may include an operation of moving an object in the dynamic image in response to a user's input, indicating a physical interaction with other objects in the screen of the notification unit 150, or outputting voice data related to the dynamic image.
- the motion of the object in the dynamic image or the method of physical interaction with other objects in the screen of the notification unit 150 may be determined based on a pre-stored physics engine program, and as described above, the guide image may include an object image whose outline is indicated by a dotted line, and the object of the guide image may be related to a category of a dynamic image.
- FIG. 11 shows an example of how a dynamic image generated by a user interacts with another object on the screen of the notification unit 150.
- the interaction of the dynamic image may be implemented in various forms, and at least some of the interaction rules are related to the category of the dynamic image (ie, the category of the guide image that is the basis of the dynamic image, determined in step S200).
- For example, the dynamic image may be designed to move its object in response to a user's input (touch, mouse click, keyboard input, voice input, etc.), to show a physical interaction with other objects in the screen (e.g., two objects colliding), or to output a sound related to the dynamic image (e.g., a tree shaking, an animal crying).
- the motion of the object or the method of physical interaction with other objects may be determined based on a pre-stored physics engine program.
- Physical laws of reality, such as objects bouncing off or destroying one another, can be implemented in virtual space.
- the user can feel as if the image they drew themselves is alive and moving.
- Objects such as raindrops and umbrellas may be approximated as circles of a certain size, and a collision may be determined when the distance between the centers of two objects is smaller than the sum of their radii.
- A more complex physical interaction, for example the direction and appearance of raindrops bouncing after a collision, may be implemented by considering the shape (incline, etc.) of the umbrella, the speed at which the raindrops move, and the like.
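Under the circle approximation described above, the collision test and a simple bounce could look like the following sketch; the reflection formula is a standard physics-engine idealization, not taken from the patent:

```python
import math

def circles_collide(center_a, radius_a, center_b, radius_b):
    """The rule described above: a collision occurs when the distance
    between the centres is smaller than the sum of the radii."""
    dist = math.hypot(center_b[0] - center_a[0], center_b[1] - center_a[1])
    return dist < radius_a + radius_b

def reflect(velocity, unit_normal):
    """Bounce a raindrop off a surface: v' = v - 2 (v . n) n, where n is
    the unit normal of the umbrella surface at the contact point."""
    dot = velocity[0] * unit_normal[0] + velocity[1] * unit_normal[1]
    return (velocity[0] - 2.0 * dot * unit_normal[0],
            velocity[1] - 2.0 * dot * unit_normal[1])
```

The umbrella's incline enters through the surface normal passed to `reflect`, which is how the bounce direction would depend on the shape the user drew.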
- An image drawn by a user may be displayed in real time according to the user's input. For example, various forms of interaction can be implemented, such as a dynamic image (e.g., flower or plant pictures) appearing along the line a user draws on the screen by dragging a mouse or touching.
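One way to realize images "appearing along the drawn line" is to resample the drag path at even arc-length intervals and stamp a sprite at each sample; this sketch assumes the stroke arrives as a list of (x, y) points, and `spawn_points` is a hypothetical helper, not the patent's code:

```python
import math

def spawn_points(stroke, spacing):
    """Resample a dragged stroke at roughly even arc-length intervals.

    stroke: list of (x, y) points sampled from the drag; spacing: distance
    between stamps. A flower/plant sprite would be drawn at each returned
    point, so images 'appear along the drawn line'.
    """
    points, travelled = [stroke[0]], 0.0
    for (x0, y0), (x1, y1) in zip(stroke, stroke[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        travelled += seg
        while seg > 0.0 and travelled >= spacing:
            overshoot = travelled - spacing
            t = 1.0 - overshoot / seg  # fraction along the current segment
            points.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
            travelled = overshoot
    return points
```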
Abstract
Description
Claims (13)
- A responsive picture production service providing system that provides a picture production service enabling real-time interaction between a user and an object, the system comprising: a terminal configured to allow input of event information related to the user's behavioral responses for picture production; a control unit that, while establishing a network and communicating with the terminal, a web, and an external server, generates arbitrary custom information by matching first data on a dynamic image provided through at least one of the web and the external server according to the event information input from the terminal with second data on a signal using a drawing tool of a pre-stored picture production program, and outputs a preset notification signal according to the generated custom information; and a notification unit provided in the terminal to display visually on an arbitrary screen according to the notification signal output from the control unit.
- The interactive participatory picture production service providing system of claim 1, wherein the picture production program is graphical software configured to support an environment for producing pictures on a canvas using a drawing tool capable of pictorial expression of at least one of point, line, plane, shape, color, light, texture, and volume.
- The system of claim 2, wherein the control unit causes a determination to be made on the category of the dynamic image provided through at least one of the web and the external server according to the event information, causes at least one of a plurality of guide images corresponding to the determined category to be displayed on the canvas area of the picture production program, stores, as data, a drawing image drawn directly by the user using the drawing tool based on the guide image, and generates a new dynamic image based on the stored drawing image.
- The system of claim 2, wherein the dynamic image includes a bead image expressed so as to move freely within the canvas area of the picture production program; the picture production service providing system further comprises a gyro sensor provided in the terminal to detect orientation and a position sensor provided in the terminal to detect position; the event information includes a first manipulation signal of the user that changes the inclination of the terminal; and the control unit additionally matches third data on terminal tilt information provided through at least one of the gyro sensor and the position sensor and reflects it in the custom information.
- The system of claim 4, wherein, when the first data represents a value in which the bead image is selected as the dynamic image and the third data represents a tilt change value of the terminal caused by the first manipulation signal, the control unit outputs a notification signal in which the bead image moves as if rolling within the canvas area of the picture production program in response to the tilt change value, while a trace of the bead image's movement is expressed in real time according to the second data in which attribute values for the pictorial expression of the drawing tool are set.
- The system of claim 2, wherein the picture production program is configured to enable thickness setting, according to the event information, for a first icon representing a formative element of a point, line, or plane having an arbitrary thickness among the drawing tools; the thickness setting is performed by a stroke in which the first icon is touched (clicked); and the control unit outputs a notification signal causing the notification unit to provide a screen in which the first icon is activated with a size corresponding to the set thickness.
- The system of claim 2, wherein the picture production program is configured to enable color setting, according to the event information, for a second icon representing a formative element for color among the drawing tools; the color setting is performed by a stroke of first touching (clicking) the second icon and then secondarily touching (clicking) a desired color; and the control unit outputs a notification signal causing the notification unit to provide a screen in which at least some icons of the drawing tool, including the second icon, are activated in the same color as the set color.
- A control method of a responsive picture production service providing system that provides a picture production service enabling real-time interaction between a user and an object, the method comprising: a) inputting event information by a terminal; b) causing, by a control unit, a determination to be made on the category of a dynamic image provided through at least one of a web and an external server according to the event information input in step a); c) causing, by the control unit, at least one of a plurality of guide images corresponding to the category determined in step b) to be displayed on a canvas area of a picture production program; d) storing, by the control unit, as data a drawing image drawn directly by a user using a drawing tool based on the guide image; e) generating, by the control unit, a dynamic image based on the drawing image stored in step d); and f) displaying, by a notification unit, the dynamic image generated in step e) on an arbitrary screen, wherein the dynamic image indicates an interaction in response to the user's input within the screen of the notification unit.
- The control method of claim 8, wherein the interaction of the dynamic image is related to the category of the dynamic image.
- The control method of claim 9, wherein the interaction of the dynamic image includes an operation of moving an object in the dynamic image in response to a user's input, indicating a physical interaction with other objects in the screen of the notification unit, or outputting voice data related to the dynamic image.
- The control method of claim 10, wherein the movement of the object in the dynamic image or the method of physical interaction with other objects in the screen of the notification unit is determined based on a pre-stored physics engine program.
- The control method of claim 8, wherein the guide image includes an object image whose outline is indicated by a dotted line, and the object of the guide image is related to the category of the dynamic image.
- The control method of claim 8, wherein step f) includes f-1) loading image data received through a path where the dynamic image is stored.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/916,266 US20230334734A1 (en) | 2021-05-31 | 2021-11-16 | Systems for providing responsive-type drawing making service and method for controlling the same |
JP2022560310A JP7459415B2 (ja) | 2021-05-31 | 2021-11-16 | 反応型絵製作サービス提供システム及びこの制御方法 |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2021-0070154 | 2021-05-31 | ||
KR20210070154 | 2021-05-31 | ||
KR1020210155877A KR102450263B1 (ko) | 2021-05-31 | 2021-11-12 | 반응형 그림 제작 서비스 제공 시스템 및 이의 제어방법 |
KR10-2021-0155877 | 2021-11-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022255572A1 true WO2022255572A1 (ko) | 2022-12-08 |
Family
ID=82407069
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2021/016773 WO2022255572A1 (ko) | 2021-05-31 | 2021-11-16 | 반응형 그림 제작 서비스 제공 시스템 및 이의 제어방법 |
PCT/KR2021/016771 WO2022255571A1 (ko) | 2021-05-31 | 2021-11-16 | 쌍방향 참여형 그림 제작 서비스 제공 시스템 및 이의 제어방법 |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2021/016771 WO2022255571A1 (ko) | 2021-05-31 | 2021-11-16 | 쌍방향 참여형 그림 제작 서비스 제공 시스템 및 이의 제어방법 |
Country Status (4)
Country | Link |
---|---|
US (2) | US12141433B2 (ko) |
JP (2) | JP7446555B2 (ko) |
KR (8) | KR102420577B1 (ko) |
WO (2) | WO2022255572A1 (ko) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102526254B1 (ko) | 2023-02-03 | 2023-04-26 | 이가람 | 반응형 포스터 콘텐츠의 생성 및 이의 상호작용 제공 방법, 장치 및 시스템 |
KR20240125240A (ko) | 2023-02-10 | 2024-08-19 | 주식회사 콘파파 | 협업 과정에서 발생하는 충돌을 해결하기 위한 방법, 프로그램 및 장치 |
KR102680717B1 (ko) * | 2023-06-29 | 2024-07-02 | 주식회사 테스트뱅크 | 멀티 레이어 구조에서 사용자와의 상호작용을 하기 위한 방법 및 장치 |
CN118334155B (zh) * | 2024-04-16 | 2024-12-06 | 宿州学院 | 一种交互式计算机椭球面电子地图制图系统 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100927866B1 (ko) * | 2007-01-25 | 2009-11-23 | 한성대학교 산학협력단 | 상호작용형 이미지 콘텐츠 생성방법 및 이를 위한 장치와시스템 |
KR101334865B1 (ko) * | 2012-05-23 | 2013-12-03 | 이예찬 | 창의력 개발을 위한 유, 아동용 인터랙티브 그림 제작 방법 |
KR101761233B1 (ko) * | 2015-09-02 | 2017-07-25 | (주)옐리펀트 | 어린이의 자작그림을 이용한 인터랙티브 애니메이션 제작서비스 제공시스템 |
KR102068428B1 (ko) * | 2010-04-23 | 2020-02-11 | 임머숀 코퍼레이션 | 햅틱 효과를 제공하는 시스템 및 방법 |
KR102184162B1 (ko) * | 2019-11-13 | 2020-12-11 | 유시영 | 반응형 웹툰 제작 시스템 및 방법 |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10124689A (ja) * | 1996-10-15 | 1998-05-15 | Nikon Corp | 画像記録再生装置 |
JP2003067585A (ja) | 2001-08-27 | 2003-03-07 | Fumio Onuma | 似顔絵商品流通サービスシステム |
JP2004110714A (ja) | 2002-09-20 | 2004-04-08 | Toyota Motor Corp | 電子資料共同編集システム |
JP2004139289A (ja) | 2002-10-17 | 2004-05-13 | Hitachi Ltd | 図面編集システムおよび方法 |
US20040174365A1 (en) * | 2002-12-24 | 2004-09-09 | Gil Bub | Method and system for computer animation |
KR20090001498A (ko) | 2007-04-19 | 2009-01-09 | 스토리블렌더 인코퍼레이티드 | 컨텐츠 공동제작 서비스 제공방법 및 그 시스템 |
US20120066577A1 (en) * | 2010-09-09 | 2012-03-15 | Microsoft Corporation | Concurrent Editing of Online Drawings |
US20130031463A1 (en) * | 2011-07-29 | 2013-01-31 | Denny Jaeger | Personal workspaces in a computer operating environment |
US20140026036A1 (en) * | 2011-07-29 | 2014-01-23 | Nbor Corporation | Personal workspaces in a computer operating environment |
KR101423524B1 (ko) * | 2011-12-21 | 2014-08-01 | 주식회사 더클락웍스 | 그림 그리기 사용자 인터페이스 및 그림 공유 시스템과 그 방법 |
US8918453B2 (en) * | 2012-01-03 | 2014-12-23 | Qualcomm Incorporated | Managing data representation for user equipments in a communication session |
US8970476B2 (en) * | 2012-02-09 | 2015-03-03 | Vtech Electronics Ltd. | Motion controlled image creation and/or editing |
KR101434728B1 (ko) * | 2012-05-15 | 2014-08-29 | 조현근 | 영상 통화 시스템에서의 영상 통화 중 드로잉 협업 방법 |
US20140189593A1 (en) * | 2012-12-27 | 2014-07-03 | Kabushiki Kaisha Toshiba | Electronic device and input method |
KR20140085048A (ko) * | 2012-12-27 | 2014-07-07 | 삼성전자주식회사 | 멀티 디스플레이 장치 및 제어 방법 |
KR102183448B1 (ko) * | 2013-04-26 | 2020-11-26 | 삼성전자주식회사 | 사용자 단말 장치 및 그 디스플레이 방법 |
KR20150066687A (ko) | 2013-12-09 | 2015-06-17 | 홍충식 | 실시간 화면 공유 메신저 |
US20150341399A1 (en) * | 2014-05-23 | 2015-11-26 | Samsung Electronics Co., Ltd. | Server and method of providing collaboration services and user terminal for receiving collaboration services |
US20180095653A1 (en) * | 2015-08-14 | 2018-04-05 | Martin Hasek | Device, method and graphical user interface for handwritten interaction |
WO2017053440A1 (en) * | 2015-09-23 | 2017-03-30 | Edoardo Rizzi | Communication device and method |
KR20170089252A (ko) * | 2016-01-26 | 2017-08-03 | 박재동 | 그래픽 디자인 사용자 인터페이스와 그래픽 소스 공유 시스템 및 방법 |
US9832308B1 (en) * | 2016-05-12 | 2017-11-28 | Google Inc. | Caller preview data and call messages based on caller preview data |
US11126325B2 (en) * | 2017-10-23 | 2021-09-21 | Haworth, Inc. | Virtual workspace including shared viewport markers in a collaboration system |
US10616666B1 (en) * | 2018-02-27 | 2020-04-07 | Halogen Networks, LLC | Interactive sentiment-detecting video streaming system and method |
KR102576909B1 (ko) * | 2018-08-08 | 2023-09-11 | 삼성전자 주식회사 | 드로잉 환경을 제공하는 전자 장치 및 방법 |
KR102159326B1 (ko) | 2018-12-06 | 2020-09-23 | 주식회사 마이더스터치 | 작가체학습단말기 및 시스템 |
KR102280805B1 (ko) * | 2018-12-06 | 2021-07-23 | 주식회사 파블로아트컴퍼니 | 단말기 |
KR102558566B1 (ko) * | 2021-03-09 | 2023-07-24 | (주)비케이 | 벡터 이미지의 재생방법 |
-
2021
- 2021-11-12 KR KR1020210155876A patent/KR102420577B1/ko active IP Right Grant
- 2021-11-12 KR KR1020210155877A patent/KR102450263B1/ko active IP Right Grant
- 2021-11-16 JP JP2022560262A patent/JP7446555B2/ja active Active
- 2021-11-16 WO PCT/KR2021/016773 patent/WO2022255572A1/ko active Application Filing
- 2021-11-16 US US17/916,259 patent/US12141433B2/en active Active
- 2021-11-16 JP JP2022560310A patent/JP7459415B2/ja active Active
- 2021-11-16 WO PCT/KR2021/016771 patent/WO2022255571A1/ko active Application Filing
- 2021-11-16 US US17/916,266 patent/US20230334734A1/en active Pending
-
2022
- 2022-07-01 KR KR1020220081342A patent/KR102476443B1/ko active
- 2022-07-01 KR KR1020220081341A patent/KR102476442B1/ko active
- 2022-07-01 KR KR1020220081343A patent/KR102476444B1/ko active
- 2022-09-22 KR KR1020220120156A patent/KR102484310B1/ko active IP Right Grant
- 2022-09-22 KR KR1020220120155A patent/KR102484309B1/ko active IP Right Grant
- 2022-09-22 KR KR1020220120157A patent/KR102484311B1/ko active IP Right Grant
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100927866B1 (ko) * | 2007-01-25 | 2009-11-23 | 한성대학교 산학협력단 | 상호작용형 이미지 콘텐츠 생성방법 및 이를 위한 장치와시스템 |
KR102068428B1 (ko) * | 2010-04-23 | 2020-02-11 | 임머숀 코퍼레이션 | 햅틱 효과를 제공하는 시스템 및 방법 |
KR101334865B1 (ko) * | 2012-05-23 | 2013-12-03 | 이예찬 | 창의력 개발을 위한 유, 아동용 인터랙티브 그림 제작 방법 |
KR101761233B1 (ko) * | 2015-09-02 | 2017-07-25 | (주)옐리펀트 | 어린이의 자작그림을 이용한 인터랙티브 애니메이션 제작서비스 제공시스템 |
KR102184162B1 (ko) * | 2019-11-13 | 2020-12-11 | 유시영 | 반응형 웹툰 제작 시스템 및 방법 |
Also Published As
Publication number | Publication date |
---|---|
KR102476442B1 (ko) | 2022-12-15 |
KR102484311B1 (ko) | 2023-01-04 |
KR20220162103A (ko) | 2022-12-07 |
KR20220162101A (ko) | 2022-12-07 |
WO2022255571A1 (ko) | 2022-12-08 |
KR20220162098A (ko) | 2022-12-07 |
KR102484310B1 (ko) | 2023-01-04 |
KR20220162099A (ko) | 2022-12-07 |
JP2023534089A (ja) | 2023-08-08 |
KR102450263B1 (ko) | 2022-10-04 |
KR102484309B1 (ko) | 2023-01-04 |
US20230325070A1 (en) | 2023-10-12 |
US12141433B2 (en) | 2024-11-12 |
US20230334734A1 (en) | 2023-10-19 |
JP7446555B2 (ja) | 2024-03-11 |
KR102476444B1 (ko) | 2022-12-15 |
JP2023532620A (ja) | 2023-07-31 |
KR20220162100A (ko) | 2022-12-07 |
JP7459415B2 (ja) | 2024-04-02 |
KR20220162102A (ko) | 2022-12-07 |
KR102476443B1 (ko) | 2022-12-15 |
KR102420577B1 (ko) | 2022-07-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022255572A1 (ko) | 반응형 그림 제작 서비스 제공 시스템 및 이의 제어방법 | |
US12093502B2 (en) | Interactive presentation controls | |
CN114375435A (zh) | 增强物理活动表面上的有形内容 | |
JP2022540315A (ja) | 人工現実環境において周辺デバイスを使用する仮想ユーザインターフェース | |
US20130198653A1 (en) | Method of displaying input during a collaboration session and interactive board employing same | |
Lal | Digital design essentials: 100 ways to design better desktop, web, and mobile interfaces | |
WO2014116056A1 (ko) | 애니메이션의 모션 시퀀스를 생성하기 위한 방법, 시스템 및 컴퓨터 판독 가능한 기록 매체 | |
CN108958731B (zh) | 一种应用程序界面生成方法、装置、设备和存储介质 | |
Ledo et al. | Astral: Prototyping mobile and smart object interactive behaviours using familiar applications | |
Schreiber et al. | New interaction concepts by using the wii remote | |
Kühn et al. | WiKnectVR: A gesture-based approach for interacting in virtual reality based on WiKnect and gestural writing | |
Vroegop | Microsoft HoloLens Developer's Guide | |
WO2018056653A1 (ko) | 번역문과 함께 이미지를 제공하는 방법, 장치 및 컴퓨터 프로그램 | |
CN114629800A (zh) | 工控网络靶场的可视化生成方法、装置、终端及存储介质 | |
Mu et al. | Interface And Interaction: The Symbolic Design for Bridge Conning System | |
CN111651102B (zh) | 在线教学交互方法、装置、存储介质以及电子设备 | |
Sinclair | Integrating hypermedia techniques with augmented reality environments | |
Ledo et al. | Astral: Prototyping Mobile and IoT Interactive Behaviours via Streaming and Input Remapping | |
Zhang | Innovative Applications of Image Media and Intelligent Service Systems | |
WO2024076088A1 (en) | Methods and electronic device for generating xr environment | |
WO2024085701A1 (ko) | 스케치되는 탈것의 구동 방법 및 장치 | |
Van den Bergh et al. | Wearable Touchscreens to Integrate Augmented Reality and Tablets for Work Instructions? | |
Yeh et al. | Interactive Gigapixel Prints: Large Paper Interfaces for Visual Context, Mobility, and Collaboration | |
CN118210376A (zh) | 在增强现实环境下基于徒手交互的低代码可视化创建系统 | |
CN117389410A (zh) | Vr交互方法、装置、终端设备及存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2022560310 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21944325 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 23/04/2024) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21944325 Country of ref document: EP Kind code of ref document: A1 |