US20180018021A1 - Method, device, system and non-transitory computer-readable recording medium for providing user interface - Google Patents


Info

Publication number
US20180018021A1
US20180018021A1 (Application No. US15/667,887)
Authority
US
United States
Prior art keywords
device
user
invention
haptic feedback
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/667,887
Inventor
Sung Jae Hwang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FUTUREPLAY Inc
Original Assignee
FUTUREPLAY Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to KR10-2014-0132943
Priority to KR10-2015-0078494
Priority to KR10-2015-0105128 (granted as KR101680698B1)
Priority to US14/826,922 (granted as US9753539B2)
Application filed by FUTUREPLAY Inc
Priority to US15/667,887 (published as US20180018021A1)
Assigned to FUTUREPLAY INC. (Assignor: HWANG, SUNG JAE)
Publication of US20180018021A1
Application status: Abandoned

Classifications

    All classifications fall under G (Physics) > G06 (Computing; Calculating; Counting) > G06F (Electric digital data processing):
    • G06F3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F1/1626: Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F1/163: Wearable computers, e.g. on a belt
    • G06F1/1698: Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being a sending/receiving arrangement to establish a cordless communication link, e.g. radio or infrared link, integrated cellular phone
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders, dials
    • G06F3/04883: Interaction techniques using a touch-screen or digitiser for entering handwritten data, e.g. gestures, text

Abstract

According to one aspect of the present invention, there is provided a method for providing a user interface. The method comprises acquiring information on a position where at least one of a first device and a second device contacts a body of a user, and upon the occurrence of a triggering event that causes haptic feedback in at least one of the first device and the second device, controlling properties of haptic feedback provided in at least one of the first device and the second device, with reference to the acquired information on the body contact position, information on a display state associated with the triggering event, and information on contents or functions associated with the triggering event.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 14/826,922, filed on Aug. 14, 2015, which claims the benefit of Korean Patent Application No. 10-2014-0132943, filed on Oct. 2, 2014, Korean Patent Application No. 10-2015-0078494, filed on Jun. 3, 2015, and Korean Patent Application No. 10-2015-0105128, filed on Jul. 24, 2015, the entire contents all of which are incorporated herein by reference for all purposes as if fully set forth herein.
  • FIELD OF THE INVENTION
  • The present invention relates to a method, device, system and non-transitory computer-readable recording medium for providing a user interface.
  • BACKGROUND
  • The statements in this section merely provide background information related to the present disclosure and may not constitute prior art.
  • Recently, mobile smart devices having various communication and sensing capabilities and powerful computing capabilities, such as smart phones and smart pads, are being widely used. Among such mobile smart devices, there are relatively small-sized ones that may be worn and carried on a body of a user (e.g., a smart glass, a smart watch, a smart band, a smart device in the form of a ring or a brooch, a smart device directly worn on or embedded in a body or a garment, etc.).
  • In this situation, a user may desire to perform a task using two or more (different kinds of) smart devices of the user, or may desire a task to be performed in which smart devices of the user and another user are involved together. Further, the user may desire to intuitively receive information on the performance state of the task. However, these (latent) intentions and needs of the user have not been properly supported in the prior art.
  • SUMMARY OF THE INVENTION
  • One object of the present invention is to fully solve the above problem.
  • Another object of the invention is to intuitively provide a user, via haptic sensation, with information on a task performed in a state in which two or more devices are associated, by acquiring information on a position where at least one of a first device and a second device contacts a body of the user, and upon the occurrence of a triggering event that causes haptic feedback in at least one of the first device and the second device, controlling properties of haptic feedback provided in at least one of the first device and the second device, with reference to the acquired information on the body contact position, information on a display state associated with the triggering event, and information on contents or functions associated with the triggering event.
  • According to one aspect of the invention to achieve the objects as described above, there is provided a method for providing a user interface, comprising the steps of: acquiring information on a position where at least one of a first device and a second device contacts a body of a user; and upon the occurrence of a triggering event that causes haptic feedback in at least one of the first device and the second device, controlling properties of haptic feedback provided in at least one of the first device and the second device, with reference to the acquired information on the body contact position, information on a display state associated with the triggering event, and information on contents or functions associated with the triggering event.
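  • The claimed steps might be sketched as follows (a purely illustrative interpretation, not a claim limitation; the specific mappings from contact position, display brightness, and content kind to haptic properties are assumptions invented for the sketch):

```python
# Hypothetical sketch of the claimed method: haptic properties are chosen
# with reference to (1) the body contact position, (2) a display state
# associated with the triggering event, and (3) the associated content.
def control_haptic_feedback(body_contact_position, display_state, content_info):
    """Return haptic property dicts for the first and second devices."""
    # Illustrative heuristic only: feedback at the wrist is made stronger.
    base_intensity = 1.0 if body_contact_position == "wrist" else 0.6
    # Illustrative mapping: a brighter display region routes feedback to
    # the first device, a darker one to the second.
    target = "first" if display_state.get("brightness", 0) > 0.5 else "second"
    # Illustrative mapping: longer feedback for data-transfer tasks.
    duration_ms = 200 if content_info.get("kind") == "data_transfer" else 80
    return {
        "first": {"intensity": base_intensity if target == "first" else 0.0,
                  "duration_ms": duration_ms},
        "second": {"intensity": base_intensity if target == "second" else 0.0,
                   "duration_ms": duration_ms},
    }

props = control_haptic_feedback("wrist", {"brightness": 0.8},
                                {"kind": "data_transfer"})
```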
  • According to another aspect of the invention, there is provided a device for providing a user interface, comprising: a sensing module for acquiring information on a position where the device contacts a body of a user, and sensing the occurrence of a triggering event that causes haptic feedback in at least one of the device and another device associated with the device; and a program module for, upon the occurrence of the triggering event, controlling properties of haptic feedback provided in at least one of the device and the another device, with reference to the acquired information on the body contact position, information on a display state associated with the triggering event, and information on contents or functions associated with the triggering event.
  • According to yet another aspect of the invention, there is provided a system for providing a user interface, comprising: a control unit for acquiring information on a position where at least one of a first device and a second device contacts a body of a user, and upon the occurrence of a triggering event that causes haptic feedback in at least one of the first device and the second device, controlling properties of haptic feedback provided in at least one of the first device and the second device, with reference to the acquired information on the body contact position, information on a display state associated with the triggering event, and information on contents or functions associated with the triggering event; and a storage for storing information provided from at least one of the first device and the second device.
  • In addition, there are further provided other methods, devices and systems to implement the invention, as well as non-transitory computer-readable recording media having stored thereon computer programs for executing the methods.
  • According to the invention, information on a task performed in a state in which two or more devices are associated may be intuitively provided to a user via haptic sensation.
  • According to the invention, various patterns of haptic feedback may be provided to enable direct sensing of information on a task performed between a device worn on a body of a user and a device not worn thereon, or a task performed between two or more devices worn on the body of the user.
  • Further areas of applicability will become apparent from the description provided herein. It should be understood that the description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In order that the disclosure may be well understood, there will now be described various forms thereof, given by way of example, reference being made to the accompanying drawings, in which:
  • FIG. 1 schematically shows the configuration of an entire system for providing a user interface according to one embodiment of the invention.
  • FIGS. 2 and 3 illustratively show how haptic feedback is provided in a first device not worn on a body of a user and a second device worn thereon according to one embodiment of the invention.
  • FIGS. 4 and 5 illustratively show how haptic feedback is provided in two or more devices worn on a body of a user according to one embodiment of the invention.
  • FIGS. 6 and 7 illustratively show patterns of haptic feedback provided in two or more devices according to one embodiment of the invention.
  • FIGS. 8 to 14 illustratively show how patterns of haptic feedback provided in two or more devices are determined on the basis of a display state of a region associated with a triggering event according to one embodiment of the invention.
  • FIGS. 15 and 16 illustratively show how times at which haptic feedback is provided in two or more devices are determined according to one embodiment of the invention.
  • FIGS. 17 to 19 illustratively show how haptic illusion is provided in two or more devices according to one embodiment of the invention.
  • FIGS. 20 to 22 illustratively show how haptic feedback is provided on the basis of various tasks performed in a state in which two or more devices are associated according to one embodiment of the invention.
  • FIG. 23 illustratively shows how acoustic feedback is provided in two or more devices according to another embodiment of the invention.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • In the following detailed description of the present invention, references are made to the accompanying drawings that show, by way of illustration, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. It is to be understood that the various embodiments of the invention, although different from each other, are not necessarily mutually exclusive. For example, specific shapes, structures and characteristics described herein may be implemented as modified from one embodiment to another without departing from the spirit and scope of the invention. Furthermore, it shall be understood that the locations or arrangements of individual elements within each of the disclosed embodiments may also be modified without departing from the spirit and scope of the invention. Therefore, the following detailed description is not to be taken in a limiting sense, and the scope of the invention, if properly described, is limited only by the appended claims together with all equivalents thereof. In the drawings, like reference numerals refer to the same or similar functions throughout the several views.
  • Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings to enable those skilled in the art to easily implement the invention.
  • Configuration of an Entire System
  • FIG. 1 schematically shows the configuration of an entire system for providing a user interface according to one embodiment of the invention.
  • As shown in FIG. 1, the entire system according to one embodiment of the invention may comprise a communication network 100, a user interface provision system 200, and multiple devices 310, 320.
  • First, the communication network 100 according to one embodiment of the invention may be implemented regardless of communication modality such as wired and wireless communications, and may be constructed from a variety of communication networks such as local area networks (LANs), metropolitan area networks (MANs), and wide area networks (WANs). Preferably, the communication network 100 described herein may be the Internet or the World Wide Web (WWW). However, the communication network 100 is not necessarily limited thereto, and may at least partially include known wired/wireless data communication networks, known telephone networks, or known wired/wireless television communication networks.
Next, the user interface provision system 200 according to one embodiment of the invention may be digital equipment having a memory means and a microprocessor for computing capabilities. The user interface provision system 200 may be a server system. The user interface provision system 200 may function as a mediator so that, via the communication network 100, either of the devices 310, 320 may transmit information or a control command to, or receive information or a control command from, the other.
  • To this end, as will be described in detail below, the user interface provision system 200 may function to intuitively provide a user, via haptic sensation, with information on a task performed in a state in which two or more devices are associated, by acquiring information on a position where at least one of a first device and a second device contacts a body of the user, and upon the occurrence of a triggering event that causes haptic feedback in at least one of the first device and the second device, controlling properties of haptic feedback provided in at least one of the first device and the second device, with reference to the acquired information on the body contact position, information on a display state associated with the triggering event, and information on contents or functions associated with the triggering event.
The provision of the user interface may be performed by a control unit (not shown) included in the user interface provision system 200. The control unit may reside in the user interface provision system 200 in the form of a program module. The program module may be in the form of an operating system, an application program module, or other program modules. Further, the program module may also be stored in a remote storage device that may communicate with the user interface provision system 200. Meanwhile, such a program module may include, but is not limited to, a routine, a subroutine, a program, an object, a component, a data structure and the like for performing a specific task or executing a specific abstract data type as will be described below in accordance with the invention.
  • Further, the user interface provision system 200 may further function to store information on the occurrence or position of body contact provided from at least one of the multiple devices 310, 320 and allow the information to be used by at least one of the multiple devices 310, 320. Furthermore, the user interface provision system 200 may further function to store information constituting contents or functions provided in at least one of the multiple devices 310, 320 and allow the information to be used by at least one of the multiple devices 310, 320. The storing may be performed by a storage (not shown) included in the user interface provision system 200. The storage encompasses a computer-readable recording medium, and may refer not only to a database in a narrow sense but also to a database in a broad sense including file-system based data records and the like.
  • The function of the user interface provision system 200 will be discussed in more detail below. Meanwhile, although the user interface provision system 200 has been described as above, the above description is illustrative and it is apparent to those skilled in the art that at least some of the functions or components required for the user interface provision system 200 may be implemented or included in at least one of the multiple devices 310, 320 to be operated, as necessary.
  • Lastly, the multiple devices 310, 320 according to one embodiment of the invention are digital equipment that may function to connect to and then communicate with the user interface provision system 200 or a counterpart of the multiple devices 310, 320 (which may preferably be separated or externalized from each other), and any type of digital equipment having a memory means and a microprocessor for computing capabilities may be adopted as the devices 310, 320 according to the invention. The devices 310, 320 may be so-called smart devices such as a smart phone, a smart pad, a smart glass, a smart watch, a smart band, a smart ring, and a smart necklace, or may be somewhat traditional devices such as a desktop computer, a notebook computer, a workstation, a personal digital assistant (PDA), a web pad, a mobile phone, buttons, a mouse, a keyboard, and an electronic pen. Further, the devices 310, 320 may be Internet of Things (IoT) devices such as a remote control and a home appliance.
  • Particularly, according to one embodiment of the invention, the devices 310, 320 may include at least one technical means for generating haptic feedback (e.g., vibration, etc.) provided to a user. Examples of the technical means may include commonly known components such as an actuator and a vibration motor.
  • Further, according to one embodiment of the invention, the devices 310, 320 may include at least one technical means for receiving an operation from a user. Examples of the technical means may include sensing modules which are commonly known components such as a touch panel, a pointing tool (e.g., a mouse, a stylus, an electronic pen, etc.), a graphical object operable by the user, a keyboard, a toggle switch, a biometrics (like fingerprints) sensor, a distance sensor, and the like.
  • Furthermore, according to one embodiment of the invention, the devices 310, 320 may include at least one technical means for acquiring physical information on postures or motions of the devices 310, 320. Examples of the technical means may include sensing modules which are commonly known components such as a motion sensor, an acceleration sensor, a gyroscope, a magnetic sensor, a positioning module (a GPS module, a beacon-based positioning (position identification) module, etc.), a barometer, a distance sensor, a camera, and the like.
  • Moreover, according to one embodiment of the invention, the devices 310, 320 may include a technical means for acquiring physical information on postures or motions of the devices 310, 320 on the basis of biometrics acquired from a body of a user carrying the devices 310, 320. Examples of the technical means may include sensing modules such as an electromyogram (EMG) signal measurement apparatus and the like.
  • In addition, the devices 310, 320 may further include an application program for processing the above physical information to transmit information or a control command to another device (310, 320, or the like), to receive information or a control command from another device (310, 320, or the like), or to generate the information or control command. The application may reside in the corresponding devices 310, 320 in the form of a program module. The nature of the program module may be generally similar to that of the aforementioned control unit of the user interface provision system 200. Here, at least a part of the application may be replaced with a hardware or firmware device that may perform a substantially equal or equivalent function, as necessary.
  • Meanwhile, according to one embodiment of the invention, when it is recognized that the first device 310 and the second device 320 have an association (e.g., indicating that they belong to the same user, that they function for the sake of the same user, that they are located substantially close to each other, or that one of them is competent to authenticate or permit the other), a connection may be formed between the first device 310 and the second device 320. The recognition or connection may be performed by the user interface provision system 200 or by the first device 310 and the second device 320.
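  • The association check described above might be sketched as follows (a hypothetical illustration; the `user_id` and `pos` fields, the two-dimensional positions, and the proximity threshold are all assumptions, not details from the specification):

```python
# Illustrative sketch: a connection may be formed when two devices are
# recognized as associated, e.g. they belong to the same user or are
# located substantially close to each other.
def are_associated(dev_a, dev_b, proximity_threshold_m=1.0):
    # Same-user association.
    if dev_a["user_id"] == dev_b["user_id"]:
        return True
    # Proximity-based association (Euclidean distance in meters).
    dx = dev_a["pos"][0] - dev_b["pos"][0]
    dy = dev_a["pos"][1] - dev_b["pos"][1]
    return (dx * dx + dy * dy) ** 0.5 <= proximity_threshold_m

phone = {"user_id": "u1", "pos": (0.0, 0.0)}
watch = {"user_id": "u1", "pos": (5.0, 0.0)}
stranger = {"user_id": "u2", "pos": (5.0, 0.0)}
```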
  • EMBODIMENTS
  • Hereinafter, specific examples will be discussed in detail wherein the user interface provision system 200 according to the invention provides a user interface in which the multiple devices 310, 320 are involved according to various embodiments of the invention.
  • According to one embodiment of the invention, the user interface provision system 200 may acquire information on a position where at least one of the first device and the second device contacts a body of a user, and upon the occurrence of a triggering event that causes haptic feedback in at least one of the first device 310 and the second device 320, control properties of haptic feedback provided in at least one of the first device 310 and the second device 320, with reference to the acquired information on the body contact position, information on a display state associated with the triggering event, and information on contents or functions associated with the triggering event. Here, only one of the first device 310 and the second device 320 may contact the body of the user, or both of the first device 310 and the second device 320 may contact the body of the user.
  • According to one embodiment of the invention, a triggering event may encompass all types of events that may be sensed by the first device 310, the second device 320, or the user interface provision system 200, in which the first device 310 and the second device 320 may be involved. For example, the triggering event may include an event in which a user makes a touch operation on the first device 310 or the second device 320. For another example, the triggering event may include an event in which a task such as data transmission, payment, and security authentication is performed between the first device 310 and the second device 320 as the first device 310 and the second device 320 interact with each other. For yet another example, the triggering event may include an event in which a relative relationship between postures or motions of the first device 310 and the second device 320 corresponds to a predetermined relationship (e.g., a relationship where a display screen of the first device 310 and that of the second device 320 face opposite directions).
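  • The third example above, a predetermined relative posture relationship, could be detected roughly as follows (an illustrative sketch; the unit display-normal vectors and the tolerance value are assumptions):

```python
# Illustrative check for one triggering event named in the text: the
# display normals of the two devices face (nearly) opposite directions.
def screens_face_opposite(normal_a, normal_b, tolerance=0.1):
    """normal_a, normal_b: unit vectors normal to each display screen."""
    dot = sum(a * b for a, b in zip(normal_a, normal_b))
    # For unit vectors, a dot product of -1 means exactly opposite.
    return dot <= -1.0 + tolerance

up = (0.0, 0.0, 1.0)
down = (0.0, 0.0, -1.0)
```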
  • Specifically, in response to the occurrence of a triggering event, the user interface provision system 200 according to one embodiment of the invention may cause at least a part of haptic feedback provided in the first device 310 to be provided in the second device 320, or cause at least a part of haptic feedback provided in the second device 320 to be provided in the first device 310.
  • Further, according to one embodiment of the invention, in response to the occurrence of a triggering event, the user interface provision system 200 may function to determine properties (i.e., patterns) of haptic feedback respectively provided in the first device 310 and the second device 320, with reference to a display state of a region associated with the triggering event. For example, the color or brightness of a graphical object displayed in the region touched by the user may determine in which of the first device 310 and the second device 320 haptic feedback will be generated.
  • Furthermore, according to one embodiment of the invention, in response to the occurrence of a triggering event, the user interface provision system 200 according to one embodiment of the invention may function to determine patterns of haptic feedback respectively provided in the first device 310 and the second device 320, with reference to contents or functions associated with the triggering event. For example, when some data are transmitted from the first device 310 to the second device 320, the intensity of vibration provided in the first device 310 may be reduced while that of vibration provided in the second device 320 may be increased, so that a user may be provided with a user experience (UX) in which the user feels as if the vibration having been provided in the first device 310 has been brought to the second device 320.
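  • The data-transmission example above can be sketched as a simple cross-fade of vibration intensities (illustrative only; the linear fade and the `progress` parameter are assumptions about how the handoff might be scheduled):

```python
# Illustrative "handoff" pattern: as data moves from the first device to
# the second, vibration fades out on the sender and in on the receiver,
# so the user feels the vibration being "brought" across.
def handoff_intensities(progress):
    """progress in [0, 1]; returns (sender_intensity, receiver_intensity)."""
    progress = max(0.0, min(1.0, progress))
    return (1.0 - progress, progress)

# Sample the fade at five points during the transfer.
samples = [handoff_intensities(p / 4) for p in range(5)]
```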
  • Moreover, in response to the occurrence of a triggering event, the user interface provision system 200 according to one embodiment of the invention may function to determine patterns of haptic feedback respectively provided in the first device 310 and the second device 320, with reference to information on a position where the first device 310 or the second device 320 contacts a body of a user. For example, when the first device 310 and the second device 320 respectively contact a first point and a second point of the body of the user, patterns of haptic feedback respectively provided in the first device 310 and the second device 320 may be appropriately adjusted, so that the user may be provided with a user experience in which the user feels as if the haptic feedback is generated at a point located between the first and second points.
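  • The effect described above resembles the well-known "phantom sensation" in haptics, where weighting the amplitudes at two actuators shifts the perceived vibration location to a point between them. A minimal sketch, assuming a one-dimensional body axis and linear amplitude panning (both assumptions, not details from the specification):

```python
# Illustrative phantom sensation: intensities at two body contact points
# are weighted so the user perceives vibration at a point between them.
def phantom_intensities(target, point_a, point_b, max_intensity=1.0):
    """Positions on a 1-D body axis (e.g., cm along the forearm);
    returns (intensity_at_a, intensity_at_b)."""
    span = point_b - point_a
    t = (target - point_a) / span  # 0 at point_a, 1 at point_b
    t = max(0.0, min(1.0, t))
    # Linear amplitude panning: the closer contact point vibrates harder.
    return (max_intensity * (1.0 - t), max_intensity * t)

# A target midway between contacts at 10 cm and 20 cm.
mid = phantom_intensities(15.0, 10.0, 20.0)
```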
  • Here, the patterns of haptic feedback respectively provided in the first device 310 and the second device 320 may be specified by at least one of a time, order, interval, and intensity in which the haptic feedback is provided. According to one embodiment of the invention, the pattern of haptic feedback provided in the first device 310 may have a complementary relationship with that of haptic feedback provided in the second device 320.
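  • The property set named above (time, order, interval, and intensity) might be encoded as follows (illustrative; modeling the "complementary relationship" as identical timing with inverted intensity is an assumption made for the sketch):

```python
from dataclasses import dataclass

# Illustrative encoding of a haptic pattern as specified by at least one
# of a time, order, interval, and intensity.
@dataclass
class HapticPattern:
    start_ms: int      # time at which the feedback begins
    order: int         # position in the sequence of devices
    interval_ms: int   # spacing between pulses
    intensity: float   # 0.0 .. 1.0

    def complement(self):
        """Pattern for the paired device: same timing, inverted intensity."""
        return HapticPattern(self.start_ms, self.order + 1,
                             self.interval_ms, 1.0 - self.intensity)

first = HapticPattern(start_ms=0, order=0, interval_ms=100, intensity=0.8)
second = first.complement()
```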
  • FIGS. 2 and 3 illustratively show how haptic feedback is provided in a first device not worn on a body of a user and a second device worn thereon according to one embodiment of the invention.
  • Referring to FIGS. 2 and 3, upon the occurrence of a triggering event in which a user wearing the second device 320 worn on a wrist touches a touch panel of the first device 310, the user interface provision system 200 according to one embodiment of the invention may cause vibration to be generated in the first device 310 and then in the second device 320, so that the user may be provided with a user experience in which the user feels as if the vibration having been provided in the first device 310 has been brought to the second device 320 worn on the user's wrist.
  • FIGS. 4 and 5 illustratively show how haptic feedback is provided in two or more devices worn on a body of a user according to one embodiment of the invention.
  • Referring to FIG. 4, it may be assumed that a user holds a smart phone 310 in a hand with a smart watch 320, a smart band 330, and a smart necklace 340 worn on a wrist, a forearm, and a neck, respectively. In this case, the user interface provision system 200 according to one embodiment of the invention may cause haptic feedback (i.e., vibration) to be generated in the smart phone 310, the smart watch 320, the smart band 330, and the smart necklace 340 in the above-listed order or in the reverse order, thereby providing the user with spatiotemporal haptic feedback.
  • Referring to FIG. 5, the user interface provision system 200 according to one embodiment of the invention may determine times at which haptic feedback is respectively generated in the smart phone 310, the smart watch 320, the smart band 330, and the smart necklace 340, with reference to positions where the smart phone 310, the smart watch 320, the smart band 330, and the smart necklace 340 are respectively worn on the body of the user. For example, the intervals between the times at which haptic feedback is generated in the respective devices may be determined in proportion to the distances between the positions where the devices are worn on the body.
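The distance-proportional scheduling described above can be sketched as follows, assuming each device's worn position is reduced to a distance along the body (in cm) and a nominal propagation speed; the function name and parameters are illustrative assumptions:

```python
def firing_times(positions_cm, speed_cm_per_ms=0.2):
    """Schedule one vibration per device so that the delay between
    neighbouring devices grows with the physical distance between
    the points where the devices are worn on the body."""
    times = [0.0]
    for prev, cur in zip(positions_cm, positions_cm[1:]):
        times.append(times[-1] + abs(cur - prev) / speed_cm_per_ms)
    return times
```

With positions for, say, a phone in the hand, a watch on the wrist, and a necklace at the neck, the resulting schedule fires devices in order with gaps that track the body distances between them.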
  • FIGS. 6 and 7 illustratively show patterns of haptic feedback provided in two or more devices according to one embodiment of the invention.
  • Referring to (a) to (i) of FIG. 6, the user interface provision system 200 according to one embodiment of the invention may cause a variety of haptic feedback to be provided in the first device 310 being a smart phone and the second device 320 being a smart watch, and specifically, a time, cycle, interval, intensity, and the like in which the haptic feedback is provided may be variously determined. Referring to (a) and (b) of FIG. 7, even when there are three or more devices that provide haptic feedback according to the invention, a time, cycle, interval, intensity, and the like in which the haptic feedback is provided in each of the three or more devices may be variously determined.
  • FIGS. 8 to 14 illustratively show how patterns of haptic feedback provided in two or more devices are determined on the basis of a display state of a region associated with a triggering event according to one embodiment of the invention.
  • First, referring to FIGS. 8 to 10, it may be assumed that various graphical objects of different colors or brightness levels are displayed on a display screen of the first device 310 being a smart phone, and a user wearing the second device 320 being a smart watch makes a touch operation on the display screen of the first device 310.
  • In this case, according to one embodiment of the invention, vibration may be generated in the first device 310 when the user touches a graphical object with a relatively bright color, and in the second device 320 when the user touches a graphical object with a relatively dark color (see FIG. 8). Further, according to one embodiment of the invention, when the user makes a drag operation over an area where graphical objects with relatively bright and relatively dark colors are alternately displayed, vibration may be alternately generated in the first device 310 and the second device 320 (see FIG. 9). Furthermore, according to one embodiment of the invention, the intensities of vibration respectively generated in two or more devices may be adjusted, while maintaining continuity between them, according to the brightness of a graphical object displayed in a region being touched by the user. For example, the higher the brightness of the graphical object displayed in the region being touched by the user, the greater the intensity of vibration generated in the first device 310 and the lower the intensity of vibration generated in the second device 320 (see FIGS. 10 and 11).
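The brightness-driven crossfade in the last example above can be sketched as a complementary mapping from normalized brightness to a pair of intensities; the function name and the linear mapping are illustrative assumptions rather than the patent's method:

```python
def brightness_crossfade(brightness, max_intensity=1.0):
    """Map the brightness (0.0 dark .. 1.0 bright) of the touched
    graphical object to a complementary pair of vibration
    intensities: the brighter the object, the stronger the phone's
    vibration and the weaker the watch's."""
    phone = max_intensity * brightness
    watch = max_intensity * (1.0 - brightness)
    return phone, watch
```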
  • Next, referring to FIGS. 12 to 14, it may be assumed that graphical objects with various patterns are displayed on a display screen of the first device 310 being a smart phone, and a user wearing the second device 320 being a smart watch makes a touch operation on the display screen of the first device 310.
  • In this case, according to one embodiment of the invention, haptic feedback (i.e., vibration) corresponding to a pattern of a graphical object displayed on the display screen of the first device 310 may be generated in the first device 310 or the second device 320. Specifically, according to one embodiment of the invention, in response to the user touching the display screen of the first device 310 which is displaying a graphical object having a rough texture, the first device 310 and the second device 320 may be alternately vibrated with a short cycle, thereby allowing the user to intuitively feel the rough texture of the graphical object being displayed on the display screen of the first device 310 (see FIG. 12).
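The short-cycle alternation used to convey a rough texture could be generated as a simple pulse schedule; this is a sketch under assumed names, returning (time, device-index) pairs rather than driving real actuators:

```python
def alternating_pulses(duration_ms, cycle_ms):
    """Build an alternating pulse schedule: device 0 and device 1
    take turns vibrating every `cycle_ms`, which a user may perceive
    as a rough, rapidly textured surface."""
    schedule = []
    t, device = 0, 0
    while t < duration_ms:
        schedule.append((t, device))
        t += cycle_ms
        device ^= 1   # alternate between the two devices
    return schedule
```

A shorter `cycle_ms` would correspond to a rougher-feeling texture, a longer one to a smoother feel.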
  • Further, according to one embodiment of the invention, the patterns of vibration generated in the first device 310 and the second device 320 may be adjusted according to the properties of a graphical object being displayed in a region touched by the user. For example, vibration may be generated in a pattern in which the intensity thereof is sharply changed in response to the user touching a region where hard and rough stones are being displayed, and in a pattern in which the intensity thereof is smoothly changed in response to the user touching a region where gentle waves are being displayed (see FIGS. 13 and 14).
  • FIGS. 15 and 16 illustratively show how times at which haptic feedback is provided in two or more devices are determined according to one embodiment of the invention.
  • Referring to FIGS. 15 and 16, it may be assumed that a triggering event occurs in which a user wearing the second device 320 on a wrist performs a drag while touching a touch panel of the first device 310 and then performs a release without touching the touch panel. In this case, vibration may be generated in the first device 310 and then in the second device 320, and specifically, vibration may be continuously generated in the second device 320 even in a release state in which the user is no longer touching the touch panel of the first device 310 (see FIG. 15). Further, the intensity of the vibration generated in the release state may be reduced over time (see FIG. 16).
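The fade-out after release shown in FIG. 16 could be modeled as an exponential decay; the function name, the exponential form, and the half-life parameter are illustrative assumptions, since the specification states only that the intensity is reduced over time:

```python
def release_intensity(t_ms, initial=1.0, half_life_ms=200.0):
    """Intensity of the vibration that keeps running on the worn
    device after the finger is released from the touch panel,
    decaying toward zero with the given half-life."""
    return initial * 0.5 ** (t_ms / half_life_ms)
```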
  • FIGS. 17 to 19 illustratively show how haptic illusion is provided in two or more devices according to one embodiment of the invention.
  • According to one embodiment of the invention, when vibration is sequentially generated at a predetermined interval in the first device 310 and the second device 320, which respectively contact a wrist and an elbow of a user, the user may feel that the vibration is transmitted in the direction from the wrist to the elbow (or in the opposite direction). When vibration is generated in both of the first device 310 and the second device 320, the user may feel as if the vibration is generated at a point between the points of contact of the first device 310 and the second device 320, and this phenomenon is referred to as haptic illusion or hopping rabbit illusion.
  • Referring to FIGS. 17 and 18, according to one embodiment of the invention, the user interface provision system 200 may sequentially generate vibration in the first device 310 and the second device 320, or may vary the intensities of vibration in the opposite directions, thereby providing the user with a user experience corresponding to haptic illusion or hopping rabbit illusion. Further, according to one embodiment of the invention, the user interface provision system 200 may adjust the patterns of haptic feedback generated in two or more devices that contact a body of the user, thereby allowing the user to intuitively perceive even complex patterns, as shown in (a) to (d) of FIG. 19, via haptic sensation.
  • FIGS. 20 to 22 illustratively show how haptic feedback is provided on the basis of various tasks performed in a state in which two or more devices are associated according to one embodiment of the invention.
  • Referring to FIG. 20, it may be assumed that a user wearing the second device 320 on a wrist remains touching a region corresponding to certain contents on a touch panel of the first device 310 so that the contents may be transmitted from the first device 310 to the second device 320. In this case, according to one embodiment of the invention, the user interface provision system 200 may use haptic feedback to provide information on a transmission status of the contents transmitted from the first device 310 to the second device 320. For example, as the transmission of the contents progresses, the intensity of vibration generated in the first device 310 may be reduced, whereas that of vibration generated in the second device 320 may be increased.
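The progress-driven crossfade described above (and its mirror image in the next paragraph) can be sketched as a mapping from transfer progress to a pair of intensities; the function name and linear mapping are illustrative assumptions:

```python
def transfer_feedback(bytes_sent, total_bytes, max_intensity=1.0):
    """Vibration intensities for the source and destination devices
    while content is transferred: the source fades out as the
    destination fades in, in proportion to transfer progress."""
    progress = bytes_sent / total_bytes
    return max_intensity * (1.0 - progress), max_intensity * progress
```

Swapping the two returned values covers the reverse transfer direction (contents moving from the second device to the first).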
  • Referring to FIG. 21, it may be assumed that a user wearing the second device 320 on a wrist remains touching a touch panel of the first device 310 so that contents being provided in the second device 320 may be transmitted from the second device 320 to the first device 310. In this case, according to one embodiment of the invention, the user interface provision system 200 may use haptic feedback to provide information on a transmission status of the contents transmitted from the second device 320 to the first device 310. For example, as the transmission of the contents progresses, the intensity of vibration generated in the second device 320 may be reduced, whereas that of vibration generated in the first device 310 may be increased.
  • Referring to FIG. 22, it may be assumed that haptic feedback is provided in the first device 310 and the second device 320 as a user wearing the second device 320 on a wrist makes an adjustment to a function of the first device 310. In this case, according to one embodiment of the invention, the user interface provision system 200 may provide haptic feedback to allow the user to directly sense a state in which the function of the first device 310 is adjusted. For example, as the volume of the first device 310 is increased, the intensity of vibration generated in the first device 310 may be reduced, whereas that of vibration generated in the second device 320 may be increased.
  • It is noted that although the embodiments in which haptic feedback is provided via two or more devices associated with a triggering event have been mainly described above, the present invention is not necessarily limited thereto, and a variety of feedback including acoustic feedback may also be provided as long as the objects of the invention may be achieved.
  • FIG. 23 illustratively shows how acoustic feedback is provided in two or more devices according to another embodiment of the invention.
  • Referring to FIG. 23, the user interface provision system 200 according to another embodiment of the invention may provide acoustic feedback in at least one of the first device 310 and the second device 320, with reference to information on positions of the devices, information on a display state associated with a triggering event, and information on contents or functions associated with the triggering event. For example, when a triggering event occurs in which a user wearing the second device 320 on a wrist touches a touch panel of the first device 310, the user interface provision system 200 according to another embodiment of the invention may cause sound to be generated in the first device 310 and then in the second device 320, so that the user may be provided with a user experience in which the user feels as if the sound having been provided in the first device 310 has been brought to the second device 320 worn on the user's wrist.
  • Further, it is noted that although the embodiments in which the first device 310 is a smart phone held in the user's hand and the second device 320 is a smart watch worn on the user's wrist have been mainly described above, the present invention is not necessarily limited thereto, and the first and second devices may also be implemented in any other forms such as a smart pad, a smart glass, a smart band, and a smart ring, as long as the objects of the invention may be achieved.
  • The embodiments according to the invention as described above may be implemented in the form of program instructions that can be executed by various computer components, and may be stored on a non-transitory computer-readable recording medium. The non-transitory computer-readable recording medium may include program instructions, data files, data structures and the like, separately or in combination. The program instructions stored on the non-transitory computer-readable recording medium may be specially designed and configured for the present invention, or may also be known and available to those skilled in the computer software field. Examples of the non-transitory computer-readable recording medium include the following: magnetic media such as hard disks, floppy disks and magnetic tapes; optical media such as compact disk-read only memory (CD-ROM) and digital versatile disks (DVDs); magneto-optical media such as floptical disks; and hardware devices such as read-only memory (ROM), random access memory (RAM) and flash memory, which are specially configured to store and execute program instructions. Examples of the program instructions include not only machine language codes created by a compiler or the like, but also high-level language codes that can be executed by a computer using an interpreter or the like. The above hardware devices may be configured to operate as one or more software modules to perform the processes of the present invention, and vice versa.
  • Although the present invention has been described in terms of specific items such as detailed elements as well as the limited embodiments and the drawings, they are only provided to help more general understanding of the invention, and the present invention is not limited to the above embodiments. It will be appreciated by those skilled in the art to which the present invention pertains that various modifications and changes may be made from the above description.
  • Therefore, the spirit of the present invention shall not be limited to the above-described embodiments, and the entire scope of the appended claims and their equivalents will fall within the scope and spirit of the invention.

Claims (1)

What is claimed is:
1. A method for providing a user interface, comprising the steps of:
acquiring information on a position where each of a first device and a second device contacts a body of a user; and
upon occurrence of a triggering event that causes haptic feedback in at least one of the first device and the second device, controlling properties of haptic feedback provided in at least one of the first device and the second device, with reference to the acquired information on the body contact position, wherein the triggering event comprises an event in which a touch operation occurs on a screen of at least one of the first device and the second device,
wherein in the controlling step, the properties of haptic feedback provided in at least one of the first device and the second device are controlled, with further reference to information on a display state of a region of the screen of at least one of the first device and the second device associated with the triggering event.
US15/667,887 2014-10-02 2017-08-03 Method, device, system and non-transitory computer-readable recording medium for providing user interface Abandoned US20180018021A1 (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
KR10-2014-0132943 2014-10-02
KR20140132943 2014-10-02
KR10-2015-0078494 2015-06-03
KR20150078494 2015-06-03
KR10-2015-0105128 2015-07-24
KR1020150105128A KR101680698B1 (en) 2014-10-02 2015-07-24 Method, device, system and non-transitory computer-readable recording medium for providing user interface
US14/826,922 US9753539B2 (en) 2014-10-02 2015-08-14 Method, device, system and non-transitory computer-readable recording medium for providing user interface
US15/667,887 US20180018021A1 (en) 2014-10-02 2017-08-03 Method, device, system and non-transitory computer-readable recording medium for providing user interface


Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/826,922 Continuation US9753539B2 (en) 2014-10-02 2015-08-14 Method, device, system and non-transitory computer-readable recording medium for providing user interface

Publications (1)

Publication Number Publication Date
US20180018021A1 true US20180018021A1 (en) 2018-01-18

Family

ID=55632797

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/826,922 Active US9753539B2 (en) 2014-10-02 2015-08-14 Method, device, system and non-transitory computer-readable recording medium for providing user interface
US15/667,887 Abandoned US20180018021A1 (en) 2014-10-02 2017-08-03 Method, device, system and non-transitory computer-readable recording medium for providing user interface

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/826,922 Active US9753539B2 (en) 2014-10-02 2015-08-14 Method, device, system and non-transitory computer-readable recording medium for providing user interface

Country Status (1)

Country Link
US (2) US9753539B2 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8493354B1 (en) * 2012-08-23 2013-07-23 Immersion Corporation Interactivity model for shared feedback on mobile devices
US20160259432A1 (en) * 2013-08-13 2016-09-08 Samsung Electronics Company, Ltd. Electromagnetic Interference Signal Detection

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4402145B2 (en) 2007-10-03 2010-01-20 キヤノン株式会社 Calculation method, generation method, a program, an exposure method and mask fabrication methods
US7839269B2 (en) 2007-12-12 2010-11-23 Immersion Corporation Method and apparatus for distributing haptic synchronous signals
US20130198625A1 (en) * 2012-01-26 2013-08-01 Thomas G Anderson System For Generating Haptic Feedback and Receiving User Inputs
KR101521253B1 (en) 2012-11-15 2015-05-18 주식회사 포스코 Grain-orinented electrical steel sheet and method for manufacturing the same
US9466187B2 (en) 2013-02-04 2016-10-11 Immersion Corporation Management of multiple wearable haptic devices
US9671826B2 (en) * 2013-11-27 2017-06-06 Immersion Corporation Method and apparatus of body-mediated digital content transfer and haptic feedback
WO2015083183A1 (en) 2013-12-03 2015-06-11 Verma Abhinav S Hand wearable haptic feedback based navigation device
EP3095023A1 (en) 2014-01-15 2016-11-23 Sony Corporation Haptic notification on wearables
WO2015127059A2 (en) 2014-02-24 2015-08-27 Sony Corporation Smart wearable devices and methods with attention level and workload sensing
US10032345B2 (en) * 2014-04-02 2018-07-24 Immersion Corporation Wearable device with flexibly mounted haptic output device
US9690370B2 (en) 2014-05-05 2017-06-27 Immersion Corporation Systems and methods for viewport-based augmented reality haptic effects
US9986086B2 (en) * 2014-07-31 2018-05-29 Samsung Electronics Co., Ltd. Mobile terminal and method of operating the same
US9588588B2 (en) 2014-09-22 2017-03-07 Disney Enterprises, Inc. Customized haptic effects
US20160187976A1 (en) 2014-12-29 2016-06-30 Immersion Corporation Systems and methods for generating haptic effects based on eye tracking


Also Published As

Publication number Publication date
US9753539B2 (en) 2017-09-05
US20160098084A1 (en) 2016-04-07

Similar Documents

Publication Publication Date Title
Chen et al. Duet: exploring joint interactions on a smart phone and a smart watch
TWI546724B (en) Device and method for transmitting information between projects of communications equipment
JP6323862B2 (en) User gesture input to wearable electronic devices, including device movement
KR101933289B1 (en) Devices and methods for a ring computing device
JP6421911B2 (en) Transition and interaction model for wearable electronic devices
US9400489B2 (en) Smart watch and control method thereof
US20140198035A1 (en) Wearable muscle interface systems, devices and methods that interact with content displayed on an electronic display
KR20140128275A (en) System and methods for haptically-enabled conformed and multifaceted displays
Esteves et al. Orbits: Gaze interaction for smart watches using smooth pursuit eye movements
US9582076B2 (en) Smart ring
JP2014102843A (en) Wearable electronic device
JP2014112222A (en) Placement of optical sensor on wearable electronic device
US10061387B2 (en) Method and apparatus for providing user interfaces
US20130271390A1 (en) Multi-segment wearable accessory
JP2014102840A (en) User gesture input to wearable electronic device involving movement of device
US20110191707A1 (en) User interface using hologram and method thereof
KR20160077070A (en) Wristband device input using wrist movement
US8570273B1 (en) Input device configured to control a computing device
US20110234488A1 (en) Portable engine for entertainment, education, or communication
US20140181750A1 (en) Input device, input operation method, control program, and electronic device
US20120075196A1 (en) Apparatus and method for user input
US20100090949A1 (en) Method and Apparatus for Input Device
US9978261B2 (en) Remote controller and information processing method and system
CA3051912A1 (en) Gesture recognition devices and methods
WO2007053116A1 (en) Virtual interface system

Legal Events

Date Code Title Description
AS Assignment

Owner name: FUTUREPLAY INC., KOREA, DEMOCRATIC PEOPLE'S REPUBL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HWANG, SUNG JAE;REEL/FRAME:043186/0805

Effective date: 20150803

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION