KR101423524B1 - User interface for drawing, and system and method for sharing drawings - Google Patents

User interface for drawing, and system and method for sharing drawings

Info

Publication number
KR101423524B1
Authority
KR
South Korea
Prior art keywords
picture
effect
drawing
user
canvas
Prior art date
Application number
KR20120144762A
Other languages
Korean (ko)
Other versions
KR20130072134A (en)
Inventor
이정현
Original Assignee
주식회사 더클락웍스
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to KR1020110139213
Priority to KR20110139213
Application filed by 주식회사 더클락웍스
Publication of KR20130072134A
Application granted
Publication of KR101423524B1

Abstract

A drawing user interface and a picture sharing system and method are disclosed. A terminal providing the drawing interface detects, through a sensor, at least one kind of physical change applied by the user, then identifies and displays the picture effect corresponding to the type and magnitude of that physical change.

Description

Drawing User Interface and Picture Sharing System and Method Thereof

The present invention relates to a user interface that enables varied and creative drawing, and to a picture sharing system and method for sharing and evaluating pictures with others.

Drawing programs on computers and smartphones provide a number of predefined drawing tools, such as a pen, paint, a brush, and an eraser; the user selects a tool and draws with a mouse or on a touch screen. These conventional drawing tools, however, restrict varied and creative drawing because of their limited input methods.

In addition, pictures drawn on various terminals can conventionally be sent to a designated user by e-mail or the like, or posted to the user's SNS account for others to view, but there is no shared system in which the pictures can be browsed and evaluated together.

Patent Laid-Open Publication No. 2001-0000496
Patent Publication No. 2010-0069882

An object of the present invention is to provide an interface that allows a picture to be drawn using a variety of creative methods beyond simple inputs such as a mouse or a touch of the touch screen.

Another object of the present invention is to provide a picture sharing system and method in which users can share the pictures drawn on their terminals with many people and evaluate one another's work.

According to an aspect of the present invention, a terminal implementing the drawing interface comprises: an input unit for measuring, through a sensor, at least one kind of physical change applied by the user; a control unit for identifying a picture effect corresponding to the type and magnitude of the physical change; and a display unit for displaying the picture effect.

According to another aspect of the present invention, a picture server comprises: a member database for storing member information; a picture database for storing the members' picture images; an evaluation information database for storing evaluation information on the members' picture images; and a picture arrangement unit for providing a first screen that displays simplified images of the picture images in a mosaic form, and a second screen that, when a simplified image is selected, displays the corresponding picture image together with its evaluation information and member information.

According to the present invention, pictures can be drawn using a variety of input methods such as the user's voice, gestures, or blowing of wind, which enables creative drawing. Sharing pictures and evaluating one another's work deepens users' immersion in drawing, and sharing pictures with peers in other countries gives them diverse perspectives on their pictures.

BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic diagram of a picture sharing system according to the present invention.
FIG. 2 illustrates an example of a drawing interface according to the present invention.
FIG. 3 illustrates an example of a terminal in which a drawing interface according to the present invention is implemented.
FIG. 4 illustrates an example of a drawing method using the user's voice or the intensity of wind in a drawing interface according to the present invention.
FIG. 5 illustrates an example of a drawing method using motion of the terminal in a drawing interface according to the present invention.
FIG. 6 illustrates an example of a drawing method using motion of an object in a drawing interface according to the present invention.
FIG. 7 illustrates an example of a method of generating a sticker image in a drawing interface according to the present invention.
FIG. 8 illustrates an example of a method of setting a canvas in a drawing interface according to the present invention.
FIGS. 9 and 10 illustrate examples of screens for browsing and evaluating pictures on the picture server according to the present invention.
FIG. 11 illustrates an example of the configuration of a picture server for sharing pictures according to the present invention.
FIG. 12 illustrates an example of a method of displaying a friend list according to the present invention.
FIG. 13 illustrates an example of a cooperative drawing method in the picture sharing service according to the present invention.

Hereinafter, a drawing user interface and a picture sharing system and method according to the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a diagram showing a schematic configuration of a picture sharing system according to the present invention.

Referring to FIG. 1, a plurality of user terminals 100, 102, and 110 are connected to a picture server 130 through a network 120. The terminals 100, 102, and 110 may be any terminals capable of transmitting and receiving data over a wired/wireless communication network, for example a computer, a laptop, a smartphone, or a tablet PC.

The picture server 130 receives pictures from the terminals 100, 102, and 110, stores and manages them, and allows users to share and evaluate them with other people. The picture server 130 registers members in advance so that pictures can be managed per user; each user logs in to the picture server and then uploads pictures. The picture server 130 may consist of a plurality of servers in order to distribute the load from the connected terminals.

FIG. 2 is a diagram illustrating an example of a drawing interface according to the present invention.

Referring to FIG. 2, the drawing interface 200 includes a function tool bar 210 (for example, save, undo, and screen capture), a drawing tool bar 220 containing various drawing tools (pen, brush, eraser, etc.), and a canvas 230 on which the picture is drawn. The user can draw not only by the conventional touch-screen touch method but also through various other inputs such as sound, gestures, and movement of the terminal, which is described in detail below.

FIG. 3 is a diagram illustrating an example of a terminal on which a drawing interface according to the present invention is implemented.

Referring to FIG. 3, the terminal on which the drawing interface is implemented includes an input unit 300, a canvas setting unit 310, a drawing tool setting unit 320, a control unit 330, a display unit 340, a speaker 350, and a networking unit 360. The control operations of the canvas setting unit 310, the drawing tool setting unit 320, and the control unit 330 that govern drawing can be implemented by an application.

The input unit 300 includes a touch sensor 302 for sensing touches on the touch screen, a motion sensor 304 for sensing the direction, speed, and tilt of the terminal using an acceleration sensor or the like, a sound detection sensor 306 for detecting the intensity of sound or wind, and a camera 308 for capturing the motion of an object. Of course, the input unit 300 may also include conventional input devices such as a mouse. Through the touch sensor 302, the input unit 300 can identify when a touch on the canvas begins and when it ends.

The canvas setting unit 310 applies an effect to the canvas (230 of FIG. 2) on which the picture is drawn. Beyond setting a specific pattern or color, the canvas setting unit 310 can present the canvas in various forms, such as a moving picture, a crumpled shape, folded paper, or a canvas with motion. The canvas setting unit 310 preferably offers the available canvas types as a menu so that the user can set the canvas easily.

The drawing tool setting unit 320 lets the user define additional drawing tools beyond those preset in the drawing interface (200 of FIG. 2). For example, the user can create a sticker that applies various textures, shapes, and images to a picture. A sticker can take various forms (a square, a heart, or a shape contained in an image, such as a puppy, a house, or an apple) and can be made from a photograph taken by the user, a part of such a photograph, or other pictures and images. In addition, stickers can be made from images of various materials, such as a tire image, a vinyl image, or a rough-textured image, to provide drawing effects that mimic those materials.

A sticker can also be mapped to a sound. The drawing tool setting unit 320 receives from the user a sticker image (for example, a heart-shaped or puppy-shaped image to be used for the sticker effect) and a sound, maps and stores them together, and displays the sticker on the drawing tool bar (220 of FIG. 2). The user can select a sticker displayed on the drawing interface 200 and attach it to the canvas. When a sticker image in a picture is later selected, the sound mapped to that sticker image is played.

The display unit 340 displays the various picture effects on the screen. The speaker 350 outputs sounds, such as the sound mapped to a sticker image, and the networking unit 360 provides wired/wireless communication with the outside.

The control unit 330 controls the components such as the input unit 300, the canvas setting unit 310, and the drawing tool setting unit 320, and performs the control operations required for drawing, such as generating picture effects.

FIG. 4 is a diagram illustrating an example of a drawing method using the sound of the user or the intensity of the wind in the drawing interface according to the present invention.

Referring to FIGS. 3 and 4, the input unit 300 senses the user's voice or the intensity of wind through the sound detection sensor 306 (S400). The control unit 330 measures the intensity of the sound or wind sensed by the input unit 300 and generates a picture effect (for example, a paint spread effect) whose strength depends on that intensity (S410, S420), and the display unit 340 displays the picture effect on the canvas (230 of FIG. 2) (S430). The control unit 330 may apply the effect to the entire canvas, or only to a portion selected by the user or drawn with a specific drawing tool (for example, a portion drawn with a pen, with paint, with a sticker, or with a tool representing a specific material such as powder or sand).
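
The S400 to S430 flow can be pictured with a short sketch. The Kotlin fragment below is only an illustration under assumed names (Region, PaintSpreadEffect, and SoundDrivenDrawing do not appear in the patent): a normalized sound or wind amplitude is mapped to a spread radius, and the effect is applied either to the whole canvas or to a user-selected region.

```kotlin
// Hypothetical sketch of steps S400-S430: sound or wind intensity drives a paint-spread effect.
// Names and the linear radius scaling are illustrative assumptions, not taken from the patent.
data class Region(val x: Int, val y: Int, val width: Int, val height: Int)

class PaintSpreadEffect(private val maxRadiusPx: Float = 200f) {
    // Map a normalized amplitude (0.0..1.0) to a spread radius in pixels.
    fun radiusFor(amplitude: Float): Float = maxRadiusPx * amplitude.coerceIn(0f, 1f)
}

class SoundDrivenDrawing(private val effect: PaintSpreadEffect) {
    fun onSoundMeasured(amplitude: Float, target: Region? = null) {
        val radius = effect.radiusFor(amplitude)   // S410-S420: size the effect from the intensity
        applyToCanvas(radius, target)              // S430: display the effect on the canvas
    }

    private fun applyToCanvas(radius: Float, target: Region?) {
        val where = target?.let { "region $it" } ?: "entire canvas"
        println("Spreading paint with radius ${"%.1f".format(radius)} px over $where")
    }
}

fun main() {
    val drawing = SoundDrivenDrawing(PaintSpreadEffect())
    drawing.onSoundMeasured(0.3f)                           // gentle blow: small spread
    drawing.onSoundMeasured(0.9f, Region(10, 10, 100, 80))  // strong blow applied to one region
}
```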

For example, to realize the effect of blowing paint across the canvas, the user drops or applies paint on the canvas and then blows toward the terminal's input unit 300 as if blowing on real paint, and the spreading of the paint on the canvas is rendered accordingly. As another example, the intensity or pitch of the sound made by the user can be detected through the sound detection sensor 306 of the input unit 300 and used to control effects such as how far the paint spreads or how thick a drawn line is.

As another example, after the intensity of wind or sound is detected through the input unit 300, the effect of disturbing a specific portion of the drawn picture, or the portion drawn with a specific drawing tool (a pen, paint, a sticker, a vinyl-shape tool, a styrofoam-shape tool, etc.), can be realized.

As another example, a drawing tool representing dry ice can be selected on the canvas, and a drawing effect in which the dry ice spreads according to the detected wind or sound intensity can be realized.

The control unit 330 defines in advance which picture effect is displayed for each kind of physical change (sound, wind intensity, motion, temperature, etc.) measured by the input unit 300. For example, the control unit 330 maps motion of the terminal to the effect of paint flowing as if the canvas were tilted or lifted; when that physical change is later measured, the mapped picture effect is generated with a strength that depends on the magnitude of the change. Of course, the user may also set which picture effect corresponds to each kind of physical change.
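
This pre-configured mapping can be sketched as a simple lookup table. The Kotlin fragment below is an assumption for illustration only (the enum values and EffectMapper are invented names): each kind of physical change has a default picture effect, the user may override it, and the measured magnitude scales the chosen effect.

```kotlin
// Illustrative mapping from physical-change type to a default picture effect; the patent only
// states that such a mapping is set in advance and may be changed by the user.
enum class PhysicalChange { SOUND, WIND, TILT, SHAKE, TEMPERATURE }
enum class PictureEffect { PAINT_SPREAD, PAINT_FLOW, POWDER_SCATTER, LAYER_DISTURB, DRY_ICE }

class EffectMapper {
    private val mapping = mutableMapOf(
        PhysicalChange.SOUND to PictureEffect.PAINT_SPREAD,
        PhysicalChange.WIND to PictureEffect.PAINT_SPREAD,
        PhysicalChange.TILT to PictureEffect.PAINT_FLOW,
        PhysicalChange.SHAKE to PictureEffect.POWDER_SCATTER,
    )

    // The user may reassign which effect a given physical change produces.
    fun reassign(change: PhysicalChange, effect: PictureEffect) { mapping[change] = effect }

    // The magnitude (normalized to 0..1) scales the strength of whichever effect is mapped.
    fun effectFor(change: PhysicalChange, magnitude: Float): Pair<PictureEffect, Float>? =
        mapping[change]?.let { it to magnitude.coerceIn(0f, 1f) }
}
```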

FIG. 5 is a diagram illustrating an example of a drawing method using motion of the terminal in a drawing interface according to the present invention.

Referring to FIGS. 3 and 5, the input unit 300 senses the direction, speed, and tilt of the terminal using the motion sensor 304 or the like (S500). The control unit 330 selects the picture effect corresponding to the sensed motion and determines its degree, for example how far the applied paint flows, according to the direction, speed, and tilt (S510), and the display unit 340 displays the picture effect on the screen (S520). As described with reference to FIG. 4, the control unit can apply the motion-based picture effect not to the entire canvas but only to a specific portion or to a portion drawn with a specific drawing tool.

For example, if the user tilts the terminal by about 10 degrees, the terminal renders the paint flowing downward at a speed corresponding to that angle in the drawing interface; if the user tilts it by 20 degrees, the paint flows downward faster. When the user rotates the terminal by 90 degrees, the direction in which the paint flows also turns by 90 degrees. Such picture effects are generated in real time by the control unit 330 and displayed on the screen through the display unit 340.
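
As a rough illustration of this behavior, the sketch below (plain Kotlin; the linear speed scaling and the names are assumptions the patent does not specify) derives a flow vector from the tilt angle and the rotation of the terminal: a steeper tilt gives a faster flow, and rotating the device by 90 degrees turns the flow direction by 90 degrees.

```kotlin
import kotlin.math.cos
import kotlin.math.sin

// Illustrative sketch: tilt sets how fast the paint flows, device rotation sets the direction.
data class FlowVector(val dx: Float, val dy: Float)

fun paintFlow(tiltDegrees: Float, rotationDegrees: Float, maxSpeedPxPerSec: Float = 300f): FlowVector {
    // Assumed scaling: 0 degrees -> no flow, 90 degrees -> maximum flow speed.
    val speed = maxSpeedPxPerSec * (tiltDegrees.coerceIn(0f, 90f) / 90f)
    val rad = Math.toRadians(rotationDegrees.toDouble())
    // The default flow is straight down (positive y); rotating the device rotates the flow.
    return FlowVector((speed * sin(rad)).toFloat(), (speed * cos(rad)).toFloat())
}

fun main() {
    println(paintFlow(tiltDegrees = 10f, rotationDegrees = 0f))   // slow flow, straight down
    println(paintFlow(tiltDegrees = 20f, rotationDegrees = 0f))   // faster flow, straight down
    println(paintFlow(tiltDegrees = 20f, rotationDegrees = 90f))  // same speed, turned 90 degrees
}
```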

As another example, the terminal lets the user select a specific area of the canvas to which the motion-based picture effect is to be applied. When the user then selects a powder-type drawing tool, sprays it on the canvas, and shakes the terminal, the terminal senses the shaking and scatters the powder everywhere except the selected area, expressing the effect of the powder being blown away.

FIG. 6 is a diagram illustrating an example of a drawing method using motion of an object in the drawing interface according to the present invention.

Referring to FIGS. 3 and 6, the input unit 300 captures an object through the camera 308 (S600). The control unit 330 tracks the movement of the captured object (S610), and the display unit 340 displays the tracked movement trajectory on the canvas (S620). The control unit 330 can track the motion of the object in the captured images using various existing image recognition techniques. The control unit 330 may also recognize and track only a specific part of the object, for example a person's pupil, hand, or finger. The user can therefore draw creatively through body movements such as the motion of the pupils or gestures.
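
A minimal sketch of this idea, assuming an external tracker (the patent defers to existing image recognition techniques) already reports the tracked feature's position for each frame: every reported point is appended to the stroke and the newest segment is drawn on the canvas. TrackedPoint and TrajectoryDrawing are illustrative names.

```kotlin
// Sketch of S610-S620: accumulate tracked positions into a stroke and draw each new segment.
data class TrackedPoint(val x: Float, val y: Float, val timestampMs: Long)

class TrajectoryDrawing {
    private val stroke = mutableListOf<TrackedPoint>()

    fun onFrameTracked(point: TrackedPoint) {
        stroke.add(point)    // S610: extend the motion trajectory
        drawLatestSegment()  // S620: render the newest segment on the canvas
    }

    private fun drawLatestSegment() {
        if (stroke.size >= 2) {
            val (a, b) = stroke.takeLast(2)
            println("Draw line (${a.x}, ${a.y}) -> (${b.x}, ${b.y})")
        }
    }
}

fun main() {
    val drawing = TrajectoryDrawing()
    drawing.onFrameTracked(TrackedPoint(10f, 10f, 0))
    drawing.onFrameTracked(TrackedPoint(14f, 18f, 33))  // prints the segment between the two points
}
```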

As another example, the input unit 300 can recognize, through the touch sensor, the point at which the user starts touching the canvas and the point at which the touch ends, and the control unit 330 stores the picture drawn between those two points as one layer. That is, a picture drawn by touching the canvas again after a previous touch has ended is stored as a layer separate from the earlier picture. When the input unit 300 then senses movement of the terminal (for example, shaking, tilting, or speed) through the motion sensor, a picture effect such as disturbing the individual layers can be realized.
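
The per-stroke layering can be sketched as follows; the class and field names are illustrative assumptions. Everything drawn between one touch-down and the next touch-up becomes its own layer, and a shake of the terminal displaces each layer independently to produce the "disturbed" effect.

```kotlin
import kotlin.random.Random

// Sketch: one layer per touch stroke; shaking offsets each layer independently.
data class Point(val x: Float, val y: Float)

class Layer(val points: MutableList<Point> = mutableListOf(), var offset: Point = Point(0f, 0f))

class LayeredCanvas {
    private val layers = mutableListOf<Layer>()
    private var current: Layer? = null

    fun onTouchStart() { current = Layer().also { layers.add(it) } }  // a new layer begins
    fun onTouchMove(x: Float, y: Float) { current?.points?.add(Point(x, y)) }
    fun onTouchEnd() { current = null }                               // the stroke is frozen as a layer

    // Shake intensity (0..1) scatters each layer by a random offset proportional to it.
    fun onShake(intensity: Float, maxOffsetPx: Float = 40f) {
        val scale = intensity.coerceIn(0f, 1f) * maxOffsetPx
        for (layer in layers) {
            layer.offset = Point(
                (Random.nextFloat() - 0.5f) * 2 * scale,
                (Random.nextFloat() - 0.5f) * 2 * scale
            )
        }
    }
}
```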

FIG. 7 is a diagram illustrating an example of a method of generating a sticker image in the drawing interface according to the present invention.

Referring to FIGS. 3 and 7, the drawing tool setting unit 320 receives from the user the image to be used as a sticker image (S700). For example, the user can select a photograph, another picture, or a specific part of an image (for example, a puppy in a photograph) as the sticker image. When the user selects a specific part of an image, the drawing tool setting unit 320 recognizes the outline of that part and extracts only its image. Outline extraction can be performed with a variety of existing methods, so a detailed description is omitted.

The drawing tool setting unit 320 also receives the sound to be mapped to the sticker image (for example, a bell sound, a dog barking, or a song) (S710). The drawing tool setting unit 320 maps and stores the sticker image and the sound (S720) and displays the sticker on the drawing tool bar (220 of FIG. 2) for the user (S730). In a picture that uses the sticker image, when the user clicks the sticker image, the mapped sound is output (S740).
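
Steps S700 to S740 amount to a mapping from a sticker to its image and sound. The Kotlin sketch below uses assumed names (Sticker, StickerRegistry, and the file paths are illustrative): registering a sticker stores the image-sound pair, and clicking a placed sticker looks up and plays the mapped sound.

```kotlin
// Minimal sketch of S700-S740; all names and paths are illustrative assumptions.
data class Sticker(val id: String, val imagePath: String, val soundPath: String)

class StickerRegistry(private val playSound: (String) -> Unit) {
    private val stickers = mutableMapOf<String, Sticker>()

    fun register(id: String, imagePath: String, soundPath: String) {  // S700-S730: map and store
        stickers[id] = Sticker(id, imagePath, soundPath)
    }

    fun onStickerClicked(id: String) {                                // S740: play the mapped sound
        stickers[id]?.let { playSound(it.soundPath) }
    }
}

fun main() {
    val registry = StickerRegistry(playSound = { path -> println("Playing $path") })
    registry.register("puppy", "images/puppy.png", "sounds/bark.wav")
    registry.onStickerClicked("puppy")  // prints: Playing sounds/bark.wav
}
```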

FIG. 8 is a diagram illustrating an example of a method of setting a canvas in a drawing interface according to the present invention.

Referring to FIGS. 3 and 8, the canvas setting unit 310 displays the available canvas types so that the user can select one (S800). When the user selects a canvas type, the canvas setting unit 310 applies it to the screen (S810), and the user draws a picture on it (S820). The canvas (230 of FIG. 2) may be set to a simple color or pattern, or may be displayed with effects such as a moving picture, an aged appearance, folding, or tearing.

FIG. 9 and FIG. 10 are views showing an example of a screen for searching and evaluating a picture according to the present invention.

Referring to FIG. 9, when a user connects through a terminal and requests to browse shared pictures, the picture server displays thumbnail images 910 in a mosaic form. In the simplified image 920, brief information about the picture (for example, the member who drew it and the member's country) is displayed. The shared pictures may be sorted by a selected sorting method 930, for example by most recent registration or by evaluation.

Referring to FIG. 10, a screen 1000 appears when the user selects a simplified image 910. When the simplified image 910 is selected, the full picture image 1016 is displayed together with its title 1012, member information 1018, evaluation result 1014, evaluation tool 1040, and evaluator screen 1050. The user can also move the screen up and down to easily browse the previous picture 1020 or the next picture 1030.

The member information 1018 shows the user's name, location, images registered by the user, and the like. An icon (not shown) for adding or removing the member as a friend is also displayed; for example, (+) and (-) shaped icons may be used. If the member who drew the picture is already the user's friend, the icon is shown as (-); if not, it is shown as (+), and the user can add or remove the friend simply by clicking the icon.

The evaluation tool 1040 provides a means for the user to evaluate the picture. The user can evaluate by selecting among the various emoticons displayed in the evaluation tool 1040 (for example, a heart, a smiling face, a frowning face, or a crying face). The user can also write a comment.

The evaluation result 1014 shows the combined result of all evaluations of the picture so far. If the evaluations are numeric, their sum or average is displayed; if they are emoticons, the count of each emoticon is displayed. The evaluator screen 1050 shows information (ID, country of connection) about the members who have evaluated or commented on the picture so far. The user can select a specific member on the evaluator screen 1050 to view that member's pictures: the selected member's pictures are then displayed in a mosaic form as in FIG. 9, and selecting one of them displays the detailed view of FIG. 10. The user can browse his or her own pictures in the same way, through the display methods shown in FIGS. 9 and 10.
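
The aggregation of evaluation results could look like the following sketch; the storage shape and names are assumptions. Numeric scores are averaged, while emoticon reactions are counted per type.

```kotlin
// Illustrative aggregation: average numeric scores, count emoticon reactions per kind.
sealed class Evaluation {
    data class Score(val value: Int) : Evaluation()
    data class Emoticon(val kind: String) : Evaluation()  // e.g. "heart", "smile", "frown"
}

fun summarize(evaluations: List<Evaluation>): String {
    val scores = evaluations.filterIsInstance<Evaluation.Score>().map { it.value }
    val reactions = evaluations.filterIsInstance<Evaluation.Emoticon>()
        .groupingBy { it.kind }.eachCount()
    val average = if (scores.isEmpty()) "n/a" else "%.1f".format(scores.average())
    return "average score: $average, reactions: $reactions"
}

fun main() {
    println(summarize(listOf(
        Evaluation.Score(4), Evaluation.Score(5),
        Evaluation.Emoticon("heart"), Evaluation.Emoticon("heart"), Evaluation.Emoticon("smile")
    )))  // average score: 4.5, reactions: {heart=2, smile=1}
}
```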

FIG. 10 may also include an icon (not shown) for viewing on a map the locations of the evaluators 1050 who have evaluated the picture 1010. When the user selects this icon, the locations of the evaluators 1050 are displayed on a map 1210 in the form shown in FIG. 12, and a detailed list of the evaluators is displayed in the friend list 1220. The icon for viewing evaluator locations may be placed on one side of the screen or combined with the friend icon described above, so that selecting the icon brings up a menu with options such as adding or removing a friend and viewing the evaluators' locations.

FIG. 11 is a diagram illustrating an example of the configuration of a picture server for sharing pictures according to the present invention.

Referring to FIG. 11, the picture server includes a member database 1100, a picture database 1110, an evaluation information database 1120, a picture arrangement unit 1130, and a friend list display unit 1140.

The member database 1100 stores information about the members who have joined the picture sharing service. The member database 1100 can receive and store location information (for example, country information) from a user's terminal at registration. Alternatively, the picture server can receive location information from the connecting terminal each time a member logs in, and thereby identify and update the member's connection location (for example, the country of connection).

The picture database 1110 stores and manages the pictures uploaded by members after logging in. The evaluation information database 1120 stores the users' evaluation results for each picture. Although the picture database 1110 and the evaluation information database 1120 are shown as separate structures for convenience of explanation, they can of course be stored and managed in a single database.
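
As a loose illustration of how the three stores relate, the sketch below keeps members, pictures, and evaluation records in memory under assumed names; as noted above, the same data could equally live in a single database.

```kotlin
// Illustrative in-memory shape of the member, picture, and evaluation stores.
data class Member(val id: String, val name: String, var lastLoginCountry: String? = null)
data class Picture(val id: String, val ownerId: String, val imagePath: String)
data class EvaluationRecord(val pictureId: String, val evaluatorId: String, val emoticon: String, val comment: String? = null)

class PictureStore {
    val members = mutableMapOf<String, Member>()
    val pictures = mutableMapOf<String, Picture>()
    val evaluations = mutableListOf<EvaluationRecord>()

    // Evaluations shown on the detailed screen (FIG. 10) for one picture.
    fun evaluationsFor(pictureId: String): List<EvaluationRecord> =
        evaluations.filter { it.pictureId == pictureId }

    // Pictures shown in mosaic form (FIG. 9) for one member.
    fun picturesBy(memberId: String): List<Picture> =
        pictures.values.filter { it.ownerId == memberId }
}
```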

When a user requests to browse shared pictures or his or her own pictures, the picture arrangement unit 1130 arranges the pictures for viewing through the interfaces of FIGS. 9 and 10.

When a user requests a friend list, the friend list display unit 1140 identifies the members added as friends through the icon described with reference to FIG. 10 and displays them on a map. In the example of the friend list display shown in FIG. 12, the friend list display unit 1140 shows on the map 1210 how many friends are in each country and also provides a detailed friend list 1220.

FIG. 13 is a diagram illustrating an example of a cooperative drawing method in the picture sharing service according to the present invention.

Referring to FIG. 13, when a cooperative drawing service is requested, the picture server registers the users who will participate in the cooperative drawing (S1300). The picture server then selects the cooperative drawing mode (S1310). The cooperative drawing modes include a mode in which one canvas is divided into sections, a mode in which several people draw in the same area, and a mode in which the canvas is drawn on in turn. The picture server provides the canvas to the participants (S1320), collects the results, and creates and shares the completed picture (S1330).
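
The three cooperative modes can be sketched as a small enum plus a region-assignment rule; the participant handling below is an illustrative assumption, not the patent's method.

```kotlin
// Sketch of the cooperative drawing modes named in S1310; region assignment is illustrative.
enum class CoopMode { SPLIT_CANVAS, SHARED_AREA, TAKE_TURNS }

data class Session(val mode: CoopMode, val participants: List<String>)

fun assignRegions(session: Session, canvasWidth: Int): Map<String, IntRange> = when (session.mode) {
    // Each participant gets an equal vertical strip of the canvas.
    CoopMode.SPLIT_CANVAS -> {
        val strip = canvasWidth / session.participants.size
        session.participants.mapIndexed { i, user -> user to (i * strip until (i + 1) * strip) }.toMap()
    }
    // Everyone may draw over the whole canvas at the same time.
    CoopMode.SHARED_AREA -> session.participants.associateWith { 0 until canvasWidth }
    // Whole canvas for everyone, but only the participant whose turn it is draws (turn order = list order).
    CoopMode.TAKE_TURNS -> session.participants.associateWith { 0 until canvasWidth }
}
```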

The present invention can also be embodied as computer-readable code on a computer-readable recording medium. A computer-readable recording medium includes any kind of storage device in which data readable by a computer system is stored. Examples of the computer-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage. The computer-readable recording medium may also be distributed over networked computer systems so that the computer-readable code is stored and executed in a distributed manner.

The present invention has been described with reference to preferred embodiments. Those skilled in the art will understand that various changes in form and detail may be made without departing from the spirit and scope of the invention as defined by the appended claims. The disclosed embodiments should therefore be considered illustrative rather than restrictive. The scope of the present invention is defined by the appended claims rather than by the foregoing description, and all differences within the scope of their equivalents should be construed as being included in the present invention.

Claims (11)

  1. delete
  2. A terminal in which a drawing interface is implemented, the terminal comprising: an input unit for measuring at least one kind of physical change applied by a user through a sensor;
    a control unit for identifying a picture effect corresponding to the kind and magnitude of the physical change; and
    a display unit for displaying the picture effect,
    wherein the input unit includes a sound detection sensor for detecting sound or wind intensity,
    wherein the control unit selects a paint spreading effect, a coloring disturbance effect, or a dry ice effect and determines the degree of the effect based on the sound or wind intensity measured by the sound detection sensor, and
    wherein the display unit displays the effect on the canvas.
  3. A terminal in which a drawing interface is implemented, the terminal comprising: an input unit for measuring at least one kind of physical change applied by a user through a sensor;
    a control unit for identifying a picture effect corresponding to the kind and magnitude of the physical change; and
    a display unit for displaying the picture effect,
    wherein the input unit includes a motion sensor for measuring the speed and tilt of motion,
    wherein the control unit selects a paint bleeding effect or a dusting effect and determines the degree of the effect according to the speed and tilt of the motion measured by the motion sensor, and
    wherein the display unit displays the effect on the canvas.
  4. A terminal in which a drawing interface is implemented, the terminal comprising: an input unit for measuring at least one kind of physical change applied by a user through a sensor;
    a control unit for identifying a picture effect corresponding to the kind and magnitude of the physical change; and
    a display unit for displaying the picture effect,
    wherein the input unit senses, through a touch sensor, the touch start point and the touch end point on the canvas, and measures the movement speed or tilt of the terminal through a motion sensor,
    wherein the control unit stores the picture drawn from the touch start point to the touch end point of the canvas as one layer, storing pictures drawn in separate touches as different layers, and determines a picture effect of disturbing the layers according to the measured movement speed or tilt of the terminal, and
    wherein the display unit displays the effect on the canvas.
  5. delete
  6. A terminal in which a drawing interface is implemented, the terminal comprising: an input unit for measuring at least one kind of physical change applied by a user through a sensor;
    a control unit for identifying a picture effect corresponding to the kind and magnitude of the physical change;
    a display unit for displaying the picture effect; and
    a drawing tool setting unit for mapping and storing a sticker image and a sound for implementing a sticker effect on the drawing screen, and for displaying the sticker image as a drawing tool of the drawing screen.
  7. The terminal according to claim 6,
    wherein the control unit outputs, through a speaker, the sound mapped to the sticker image when the sticker image displayed on the drawing screen is selected.
  8. delete
  9. delete
  10. delete
  11. delete
KR20120144762A 2011-12-21 2012-12-12 User interface for drawing, and system and method for sharing drawings KR101423524B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020110139213 2011-12-21
KR20110139213 2011-12-21

Publications (2)

Publication Number Publication Date
KR20130072134A KR20130072134A (en) 2013-07-01
KR101423524B1 (en) 2014-08-01

Family

ID=48986942

Family Applications (1)

Application Number Title Priority Date Filing Date
KR20120144762A KR101423524B1 (en) 2011-12-21 2012-12-12 User interface for drawing, and system and method for sharing drawings

Country Status (1)

Country Link
KR (1) KR101423524B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170027499A (en) 2015-09-02 2017-03-10 (주)옐리펀트 System for interactive animation production service using self made painting of child

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016003007A1 (en) * 2014-07-03 2016-01-07 예스튜디오 주식회사 Picture-based sns service method and platform

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060084945A (en) * 2005-01-21 2006-07-26 엘지전자 주식회사 Method of generating brush touch effect using touch pad
KR20090085068A (en) * 2006-11-13 2009-08-06 마이크로소프트 코포레이션 Shared space for communicating information
KR20090029098A (en) * 2007-09-17 2009-03-20 정관호 The system and method to share the explanation and opinion information : about web service and contents through the internet
KR20090106077A (en) * 2008-04-04 2009-10-08 (주)미래디피 Touch pen system using stereo vision

Also Published As

Publication number Publication date
KR20130072134A (en) 2013-07-01

Legal Events

Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment (payment date: 20180504; year of fee payment: 5)
FPAY Annual fee payment (payment date: 20190508; year of fee payment: 6)