WO2005003948A1 - Control system and control method - Google Patents

Control system and control method

Info

Publication number
WO2005003948A1
WO2005003948A1 · PCT/JP2004/006643 · JP2004006643W
Authority
WO
WIPO (PCT)
Prior art keywords
user
input
input part
shape
instruction
Prior art date
Application number
PCT/JP2004/006643
Other languages
English (en)
Japanese (ja)
Inventor
Shigeru Enomoto
Yoshinori Washizu
Ryuji Yamamoto
Munetaka Tsuda
Tomonori Shimomura
Junichi Rekimoto
Original Assignee
Sony Computer Entertainment Inc.
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Inc., Sony Corporation filed Critical Sony Computer Entertainment Inc.
Publication of WO2005003948A1 publication Critical patent/WO2005003948A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/041012.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present invention relates to control technology for electronic devices and the like, and particularly relates to a control system and a control method having a user interface capable of inputting a user's instruction based on the shape or motion of an input part including a part of the user's body or an object operated by the user, or the distance to the input part.
  • the present invention has been made in view of such a situation, and its object is to provide a user interface with excellent operability.
  • the control system includes a detection unit that detects a shape or an operation of one or more input parts including at least a part of a user's body or at least a part of an object operated by the user, or a distance to the input part.
  • an analysis unit determines the user's instruction by analyzing the shape or motion of the input part detected by the detection unit, or the distance to the input part, and a control unit executes a function corresponding to the user's instruction determined by the analysis unit.
  • the detection unit may be an imaging device capable of acquiring distance information to an input part.
  • the detection unit may be an input device that includes a plurality of electrodes and detects a change in capacitance between the electrodes due to approach of an input site.
  • FIG. 1 shows a configuration of a control system 10 according to the first embodiment.
  • the control system 10 includes an input device 20 for inputting an instruction from a user, a control device 40 that controls the operation of an application or the like according to the instruction input from the input device 20, and a display device 30 that displays an image output from the control device 40.
  • the control system 10 of the present embodiment provides a user interface (hereinafter referred to as a "gesture interface") that allows the user to input an instruction by an action (gesture) using a part of the body such as a finger, hand, foot, or head.
  • the input device 20 has the function of a detection unit that detects the shape or movement of an input part including at least a part of the user's body, or the distance to the input part.
  • FIG. 2 shows a configuration example of the input device 20 and the display device 30 according to the embodiment.
  • an arbitrary display device 32 such as a liquid crystal display device, a plasma display device, or a cathode ray tube (CRT) display device is used.
  • the control device 40 analyzes the operation of the user photographed by the imaging device 22 by image processing, determines a gesture performed by the user, and acquires a user instruction.
  • a wide range of the user's body can be imaged and its motion determined, so the user can give an instruction by a gesture using not only a finger but also a hand, a foot, and the like.
  • This method is suitable when the user makes a gesture at a position some distance from the imaging device 22.
  • an imaging device having no distance measuring function may be used as the input device 20, as described later.
  • FIG. 3 shows another configuration example of the input device 20 and the display device 30 according to the embodiment.
  • a projector 36 that projects an image on a screen 38 is used as the display device 30, and an imaging device 22 having a distance measuring function is used as the input device 20.
  • an image is projected by a projector 36, provided above and behind the user, onto a transparent or translucent screen 38 made of glass or the like placed in front of the user, and the user making a gesture toward the image displayed on the screen 38 is photographed by the imaging device 22 provided on the opposite side of the screen 38.
  • since the imaging device 22 can be arranged at a position distant from the screen 38, the distance to the user's body part can be detected with high accuracy even when the user makes a gesture near the screen 38.
  • FIG. 4 shows another configuration example of the input device 20 and the display device 30 according to the embodiment.
  • an arbitrary display device 32 such as a liquid crystal display device, a plasma display device, or a CRT display device is used as the display device 30, and a touch panel 24 provided on the display screen 34 of the display device 32 is used as the input device 20. Alternatively, an image may be projected on the surface of the touch panel 24 by a projector.
  • the touch panel 24 may be of any type, such as a resistive pressure-sensitive type or an infrared detection type. According to this configuration example, the user can input an instruction while directly touching an object or the like displayed on the display screen 34.
  • FIG. 5 shows still another configuration example of the input device 20 and the display device 30 according to the embodiment.
  • an arbitrary display device 32 such as a liquid crystal display device, a plasma display device, or a CRT display device is used as the display device 30, and a non-contact input device 26 provided on the inner side of the display screen 34 of the display device 32 is used as the input device 20.
  • a configuration in which an image is projected on the surface of the non-contact input device 26 by a projector may be employed.
  • the non-contact input device 26 is an input device capable of detecting the shape of an approaching object, such as a user's fingertip, and the distance to that object; an example is disclosed in Japanese Patent Application Laid-Open No. 2002-342033.
  • the non-contact type input device disclosed in Japanese Patent Application Laid-Open No. 2002-342033 is provided with a plurality of linear electrodes arranged vertically and horizontally; when a conductive object such as a user's fingertip approaches the electrodes, it detects the change in capacitance according to the degree of approach and thereby obtains three-dimensional position information of the object near the input device.
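As a rough illustration of how such an electrode matrix could yield three-dimensional position information, the sketch below (not taken from the patent; the function name, electrode pitch, and the inverse distance model are assumptions) estimates the X/Y position as a capacitance-weighted centroid over the grid and derives a coarse Z value from the total capacitance change.

```python
# Hypothetical sketch: estimating a 3D position from a grid of capacitance
# changes measured between row and column electrodes. The pitch value and the
# inverse z-model are illustrative assumptions, not values from the patent.

def estimate_position(delta_c, pitch_mm=5.0):
    """delta_c: 2D list [row][col] of capacitance changes (arbitrary units)."""
    total = sum(sum(row) for row in delta_c)
    if total <= 0:
        return None  # nothing close enough to the surface to detect

    rows, cols = len(delta_c), len(delta_c[0])
    # Capacitance-weighted centroid gives the X/Y position of the object.
    x = sum(delta_c[r][c] * c for r in range(rows) for c in range(cols)) / total
    y = sum(delta_c[r][c] * r for r in range(rows) for c in range(cols)) / total
    # A larger total change means the object is closer; map it to a coarse
    # distance with an assumed inverse relation.
    z = 1.0 / total
    return (x * pitch_mm, y * pitch_mm, z)
```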
  • the non-contact input device 26 can be provided close to the display screen 34, and it is possible to accurately detect the user's movement and the shape of the body part near the display screen 34. Therefore, the user can input an instruction with an input part such as a finger near the image while viewing the image displayed on the display screen 34.
  • with the non-contact input device 26, only the part of the user's body in the vicinity of the input device is detected, which eliminates the need to extract a specific body part for motion analysis and simplifies the processing.
  • since the shape and distance can be detected when an object merely approaches, instructions can be input without touching the display screen 34, so even a user who is reluctant to touch the display screen 34 directly can use the system comfortably.
  • since the user does not need to press the display screen 34 strongly, and the approach can be detected before the screen is touched, a responsive user interface with an excellent operational feel can be provided.
  • Which of the plurality of configuration examples described above is adopted may be determined according to the environment of the place where the control system 10 is installed, the type of application or content to be controlled, and the like. For example, in an application in which a plurality of users share one display device 30, or in an application that plays content such as a movie, the user is expected to input instructions from a position relatively distant from the display device 30, so the configuration example shown in FIG. 2 or FIG. 3 may be adopted. In the case of an application used personally by one user, for example when editing data such as images or documents, the distance between the user and the display device 30 is expected to be relatively short, so the configuration example shown in FIG. 4 or FIG. 5 may be adopted. The control system 10 may also be constructed by combining a plurality of these configuration examples; in that case, the type of gesture interface to be used may be selected for each application or content, and the input devices 20 may be switched as appropriate.
  • FIG. 6 shows an internal configuration of the control device 40.
  • in terms of hardware, this configuration can be realized by the CPU, memory, and other LSIs of an arbitrary computer, and in terms of software it is realized by programs loaded into the memory and the like; here, the functional blocks realized by their cooperation are drawn. Therefore, it is understood by those skilled in the art that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof.
  • the control device 40 includes an acquisition unit 42 that acquires an input signal detected by the input device 20, an analysis unit 44 that analyzes the user's operation from the input signal acquired by the acquisition unit 42 and determines the user's instruction, and a control unit 46 that executes a function corresponding to the user's instruction determined by the analysis unit 44.
  • the analysis unit 44 acquires an image having distance information captured by the imaging device, and determines a user operation by image processing.
  • the analysis unit 44 may use a shape recognition technique to extract a part of the user's body, such as the head, eyes, hands, fingers, or feet, and determine the user's motion by analyzing the motion of the extracted body part.
  • the analysis unit 44 may determine the user's operation by analyzing the shape of the input signal or its change over time.
  • the analysis unit 44 may also determine the user's motion by analyzing the shape and distance of the input signal and their changes over time.
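The division of labor among the acquisition unit 42, the analysis unit 44, and the control unit 46 can be pictured as a simple acquire-analyze-execute loop. The sketch below is only an illustration of that flow; the class names, the dictionary-based signal, and the instruction names are hypothetical.

```python
# Illustrative-only sketch of the acquisition -> analysis -> control flow
# described for the control device 40. All names here are hypothetical.

class AcquisitionUnit:
    def __init__(self, input_device):
        self.input_device = input_device

    def acquire(self):
        # Pull the latest raw detection (shape, motion, or distance signal).
        return self.input_device.read()

class AnalysisUnit:
    def analyze(self, signal):
        # Turn the raw signal into a user instruction, e.g. by looking at how
        # the shape of the contact region changes over time.
        if signal.get("split_into_two") and signal.get("moving_apart"):
            return "open_fingers"
        return None

class ControlUnit:
    def __init__(self, bindings):
        self.bindings = bindings  # instruction name -> callable

    def execute(self, instruction):
        if instruction in self.bindings:
            self.bindings[instruction]()

# One iteration of the control loop would then be:
#   signal = acquisition.acquire()
#   control.execute(analysis.analyze(signal))
```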
  • FIG. 7 shows hardware components of the control device 40.
  • the control device 40 includes a CPU 120, an input interface 122, a display interface 124, a memory 130, a hard disk 132, and a drive device 128. These components are electrically connected by a signal transmission path such as a bus 126.
  • the input interface 122 acquires an input signal detected by the input device 20.
  • the display interface 124 outputs an image signal to be displayed on the display device 30.
  • the hard disk 132 is a large-capacity magnetic storage device, and stores various files.
  • the recording medium 140 records a program for causing the CPU 120 to realize the functions of the control device 40 described above. When the recording medium 140 is inserted into the drive device 128, the program is read into the memory 130 or the hard disk 132, and the CPU 120 performs the control processing of the present embodiment by the read program.
  • the recording medium 140 is a computer-readable medium such as a CD-ROM, a DVD, and an FD.
  • the program may also be transmitted from an external server, whether by wireless or wired communication.
  • it is well understood by those skilled in the art that the control function of the present embodiment may be realized not only by a program provided from the outside but also by a program stored in the hard disk 132 in advance.
  • the function of the control unit 46 may be realized by an operating system (OS) executed by the CPU or the like of the computer, by an input/output control application, or the like.
  • when an OS is running, the user's instruction may be notified to the application whose window is displayed at the topmost position near the location that the input part, such as the user's finger, has approached, and that application may execute the function associated with the user's instruction. If there is no window at that position, the OS or the input/output control application may execute the function associated with the instruction.
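A minimal sketch of this dispatch rule, with hypothetical window and handler interfaces, might look as follows: the instruction is delivered to the topmost window containing the approach point, and otherwise falls back to the OS or input/output control application.

```python
# Hedged sketch (names hypothetical) of routing an instruction to the
# application whose window is topmost at the point the input part approached,
# falling back to the OS / input-output control application otherwise.

def dispatch(instruction, point, windows, os_handler):
    """windows: window objects ordered from topmost to bottommost."""
    for window in windows:
        if window.contains(point):          # window under the approach point
            window.application.handle(instruction)
            return
    os_handler(instruction)                 # no window at that position
```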
  • in a conventional user interface, an instruction is input with a single pointer, but in the present embodiment the user can give an instruction using a plurality of input parts, such as fingers, hands, and feet.
  • FIG. 8 and FIG. 9 are diagrams for explaining an example of inputting an instruction by the movement of a plurality of fingers of the user.
  • with the thumb and index finger closed as shown in FIG. 8, the user brings the fingers close to the icon 200 displayed on the screen of the display device 30, and then performs an operation of opening the thumb and index finger as shown in FIG. 9.
  • the acquisition unit 42 sends the input signal detected by the input device 20 to the analysis unit 44, and the analysis unit 44 analyzes the movement of the user's fingers and determines that the user has performed the operation of opening the fingers.
  • for example, the analysis unit 44 extracts the user's hand using a technique such as shape recognition, tracks the movement of the fingers, and determines that the user has performed the operation of opening the fingers.
  • the analysis unit 44 may determine that the user has performed the operation of opening the fingers when a set of input parts approaching or touching coordinates near the icon 200 on the display screen divides into two. Alternatively, it may determine that the operation of opening the fingers has been performed only when the group of input parts divides into two and the two input parts move in directions away from each other.
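As a concrete illustration of the second, stricter judgement, the sketch below (thresholds, frame format, and helper names are assumptions, not from the patent) reports the gesture only when a single cluster of contact or approach points splits into two groups whose centroids subsequently move apart.

```python
# Hedged sketch of the "open fingers" judgement: one cluster of contact/approach
# points splits into two groups, and the two groups then move away from each
# other. Group correspondence across frames is assumed to be stable.

from math import dist

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def detect_open_fingers(frames, min_gain_px=10.0):
    """frames: per-frame lists of point groups near the icon, oldest first.
    Returns True once a single group has split into two groups that move apart."""
    split_seen = False
    for prev, curr in zip(frames, frames[1:]):
        if len(prev) == 1 and len(curr) == 2:
            split_seen = True
        if split_seen and len(prev) == 2 and len(curr) == 2:
            gain = dist(centroid(curr[0]), centroid(curr[1])) - \
                   dist(centroid(prev[0]), centroid(prev[1]))
            if gain > min_gain_px:
                return True
    return False
```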
  • the control unit 46 executes a function associated with the operation of opening the fingers. For example, the control unit 46 may execute a function of activating the application associated with the icon 200 displayed near the user's fingers. If the icon is associated with a file, an application capable of handling the file may be started and a function of opening the file may be executed. By associating the "open fingers" operation with functions such as "open an application" and "open a file" in this way, the user's operation and the application's behavior correspond intuitively, so a more intuitive and familiar user interface can be realized.
  • the operation of opening the fingers may also be associated with functions such as "start" and "determine".
  • each application may also implement its own function corresponding to the operation of opening the fingers. For example, in an image processing application, when the user performs the operation of opening the fingers on an image, a function of enlarging that portion, or of stretching it in the direction in which the fingers are opened, may be executed.
  • similarly, when the user performs an operation of closing the fingers, the control unit 46 may execute a function associated with the finger-closing operation.
  • the control unit 46 may execute a function of terminating the application associated with the icon 200 or the window displayed near the user's fingers, or may execute a function of closing the file associated with the icon 200 or the window.
  • in an image processing application, when the user performs the operation of closing the fingers on an image, a function of reducing that portion, or of shrinking it in the direction in which the fingers are closed, may be executed.
  • the control unit 46 may receive an instruction given using a plurality of input parts, such as the user's fingers and hands, and execute the function associated with that operation.
  • in a book browsing application, when the user performs an operation of pinching a corner of a displayed book page, the page may be stored and a function of "pinching a bookmark" may be executed.
  • when the user performs an operation of pinching a displayed item, a function of "picking up" the item may be executed.
  • FIG. 10 is a diagram for explaining an example of inputting an instruction based on the shape of the user's hand. As shown in FIG. 10, it is assumed that the user places the palm on the display screen with the hand open. If the input device 20 is an imaging device with a distance measuring function, the analysis unit 44 determines the shape of the hand by extracting the user's hand using a shape recognition technique. If the input device 20 is a touch panel or a non-contact input device, the analysis unit 44 determines the shape of the object approaching or touching the display screen by extracting feature points from that shape using a known technique, or by evaluating it with a predetermined evaluation formula.
  • alternatively, typical hand shapes may be stored in a database, and the shape of the hand may be determined by finding a matching shape in the database.
  • the shape may also be determined based on the area of the detected object. For example, when the user places a hand on the display screen, the area is largest when the hand is open and smallest when the hand is closed into a fist; utilizing this fact, the shape of the hand may be determined.
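A minimal sketch of that area-based judgement is given below; the thresholds and the unit (square centimetres) are illustrative assumptions, and intermediate shapes would still need feature points or a database match.

```python
# Illustrative area-based hand-shape judgement: an open hand covers the largest
# area on the screen, a closed fist the smallest. Threshold values are assumed.

def classify_hand_by_area(contact_area_cm2, open_threshold=80.0, fist_threshold=30.0):
    if contact_area_cm2 >= open_threshold:
        return "open_hand"
    if contact_area_cm2 <= fist_threshold:
        return "fist"
    return "unknown"  # fall back to feature points or a shape database
```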
  • the control unit 46 executes a function corresponding to the shape; for example, it executes a function of "quitting" the application corresponding to the window 210 displayed at the topmost position where the palm is placed.
  • FIG. 11 is a diagram illustrating an example of mapping an object to a hand shape.
  • the user forms a specific shape with one hand, here the left hand, and with the other hand, here the right hand, moves a finger from the position of the object 220 on the display screen toward the vicinity of the left hand.
  • the method of determining the shape of the hand is the same as in the example described above.
  • the control unit 46 executes a function of storing a file corresponding to the object 220 moved by the finger of the right hand in a storage location of the hard disk 132 corresponding to the shape of the left hand.
  • the storage location corresponding to the shape of the left hand may be a directory or a folder in the file system, or may be a virtual folder. A file itself may also be associated with the shape of the hand. It is not necessary to associate only one storage location with one shape; for example, a storage location may be associated with each finger of an open hand.
  • FIG. 12 shows an example of a table for mapping an object to a hand shape.
  • the table 230 is held in the memory 130 or the hard disk 132 of the control device 40.
  • the table 230 has a shape column 232 and a storage location column 234.
  • the shape column 232 holds an image representing the hand shape, parameters, and the like. When the hand shape is recognized as an image, the file name of the image file may be stored; when the hand shape is recognized as parameters using feature points or an evaluation formula, the parameters, or the file name of a file in which the parameters are stored, may be held.
  • the storage location column 234 holds the storage location of the object.
  • the hand shape and the storage location corresponding to the shape may be registered in advance, or, if the hand shape is an unregistered shape when the user performs the operation shown in FIG. 11, the control unit 46 may register the shape and a storage location in the table 230. At this time, the user may specify the storage location, or an appropriate storage location may be assigned automatically.
  • when a file itself is associated with a hand shape, the storage location column 234 holds the file name of that file.
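The table 230 can be pictured as a simple mapping from a recognized shape key to a storage location, with automatic registration of unregistered shapes as one possible policy. The sketch below is an assumption-laden illustration; the class name, the key format, and the auto-assigned path are hypothetical.

```python
# Sketch of a table in the spirit of table 230: a shape key (an image file name
# or a parameter vector) mapped to a storage location. All names are hypothetical.

class ShapeTable:
    def __init__(self):
        self.rows = {}  # shape key -> storage location (directory, folder, or file)

    def register(self, shape_key, storage_location):
        self.rows[shape_key] = storage_location

    def storage_for(self, shape_key, auto_assign=True):
        if shape_key not in self.rows and auto_assign:
            # Unregistered shape: assign an appropriate location automatically.
            self.rows[shape_key] = f"/home/user/shapes/{len(self.rows):03d}"
        return self.rows.get(shape_key)

# Storing: the file behind object 220 is saved under
#   table.storage_for(recognized_left_hand_shape)
# Retrieval (FIG. 13) looks up the same key and opens what is stored there.
```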
  • FIG. 13 is a diagram for explaining an example of retrieving an object stored by the operation shown in FIG. 11.
  • the user forms, with one hand, here the left hand, the hand shape corresponding to the storage location of the object to be retrieved, and with the other hand, here the right hand, performs an operation of moving a finger away from the position near the left hand.
  • the control unit 46 specifies the object stored in the storage location corresponding to the shape of the left hand, and displays it on the display screen.
  • the control unit 46 may execute a function of opening a file corresponding to the object.
  • FIG. 14 is a diagram for explaining an example of an instruction input according to a distance.
  • the control unit 46 displays a moving image in which a fish is swimming on the display device.
  • when the user's hand approaches, the control unit 46 switches the display to a moving image in which the fish escapes from the vicinity of the user's hand.
  • the speed at which the user's hand approaches may be calculated; when the speed is high, the threshold distance for switching the display may be increased so that the fish starts to escape while the hand is still far away, and when the speed is low, the threshold may be reduced so that the fish does not escape until the hand comes close.
  • alternatively, the control unit 46 may display a moving image in which the fish swims freely while the distance to the user's hand is larger than a predetermined value, and display a moving image in which the fish swims while avoiding the position of the user's hand when the distance becomes smaller than the predetermined value. Since the non-contact input device can accurately detect distance information near the display screen, it is preferable to use the non-contact input device as the input device 20 in this example.
  • when an imaging device is used as the input device 20, if the user is too far from the imaging device, the user's instruction may not be determined accurately because of the limits of the distance measuring function and the accuracy of the image processing. When a non-contact input device is used as the input device 20, the capacitance does not change, and nothing can be detected, unless a part such as the user's hand approaches the input device 20. Since there is thus an upper limit on the distance at which the user can input an instruction, determined by the characteristics of the input device 20, the above-described moving image of a swimming fish may be used as a means of indicating to the user whether or not an instruction can be input.
  • while the user is at a distance at which input is not possible, a moving image of the fish swimming freely is displayed to show the user that input is not possible; when the user approaches the input device and reaches a distance at which an instruction can be input, a moving image in which the fish swims so as to avoid the user's hand is displayed to show the user that input is now possible.
  • the control unit 46 holds in advance a distance threshold for determining whether or not an instruction can be input to the input device 20.
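The distance- and speed-dependent behaviour described above can be summarised in a couple of lines; the sketch below uses assumed constants and state names purely for illustration.

```python
# Hedged sketch of a speed-adaptive escape threshold: a fast-approaching hand
# makes the fish flee earlier (larger threshold), a slow hand can get closer.
# The constants and state names are illustrative assumptions.

def escape_threshold_mm(approach_speed_mm_s, base_mm=50.0, gain_s=0.1, max_mm=200.0):
    return min(base_mm + gain_s * approach_speed_mm_s, max_mm)

def fish_state(distance_mm, approach_speed_mm_s):
    if distance_mm < escape_threshold_mm(approach_speed_mm_s):
        return "avoid_hand"    # also tells the user that instructions can be input
    return "swim_freely"       # tells the user the hand is still out of range
```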
  • FIG. 15 and FIG. 16 are diagrams for explaining the function of moving the object displayed on the screen.
  • the user tries to move the object displayed on the display device 30 by shifting it with a finger.
  • the control unit 46 detects the movement amount, speed, and acceleration of a part such as a user's finger, and determines the movement amount of the object accordingly.
  • the weight of each object, that is, a virtual energy value required to move the object, is set in advance, and the moving state of the object is controlled based on this energy value and the amount, speed, or acceleration of the movement of the user's finger when the user tries to move the object. As shown in FIG. 15, the control unit 46 moves a light object even when the user moves the finger quickly, but, as shown in FIG. 16, does not move a heavy object if the finger is moved quickly. A heavy object starts moving when the user moves the finger slowly, and then follows the movement of the finger only gradually. This makes it possible to simulate a moving state that imitates actual static and dynamic frictional resistance, and to give the user an operational feeling that reflects the weight of the object.
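A small sketch of such a friction-style rule follows. The start-speed threshold, the lag model, and the dictionary-based object representation are assumptions made only to illustrate the idea of a weight-dependent moving state.

```python
# Illustrative friction-style movement rule: each object has a virtual "weight".
# A heavy object ignores a fast flick, starts moving only under a slow push,
# and then lags behind the finger; a light object tracks the finger directly.

def move_object(obj, finger_speed, finger_delta, max_start_speed_per_weight=100.0):
    # Static-friction-like rule: the heavier the object, the slower the finger
    # must be moving for the object to start moving at all.
    if not obj["moving"]:
        if finger_speed > max_start_speed_per_weight / obj["weight"]:
            return obj  # the finger flicked too fast; the object stays put
        obj["moving"] = True

    # Dynamic-friction-like rule: heavy objects follow the finger only gradually.
    follow_ratio = 1.0 / obj["weight"]
    obj["x"] += finger_delta[0] * follow_ratio
    obj["y"] += finger_delta[1] * follow_ratio
    return obj

# Example: a light object (weight 1) follows the finger one-to-one, while a
# heavy object (weight 5) needs a slow start and then moves at a fifth of the pace.
```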
  • the technology described in this example can be used in a user interface using a conventional pointing device.
  • the control unit 46 may determine the magnitude of the force applied to the object by the user and control the moving state of the object according to the magnitude of the force.
  • the magnitude of the force may be determined based on the distance between the input device 20 and the user's hand. For example, the closer the user's hand is to the display screen, the more force may be applied to the object.
  • the magnitude of the force may be determined based on the magnitude of the pressing force.
  • the magnitude of the force may be determined based on the degree of pressing of a pressure-sensitive mouse or the like.
  • FIG. 17 is a diagram for explaining a paper handling function.
  • the control unit 46 displays an image of a book opened on the display device 30.
  • the control unit 46 displays a moving image in which the page turns gradually, following the movement of the user's finger, and turns over to the next page when the page passes a predetermined position.
  • when the finger moves too fast for the paper to follow, the control unit 46 displays an image in which the page turns halfway and then falls back. As a result, it is possible to give the user a sense of presence, as if he or she were actually reading a book.
  • the control unit 46 may hold in advance a threshold value of the speed used to determine whether the paper can follow the movement of the user's finger.
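The page-turning decision can be reduced to two thresholds, one on finger speed and one on page position; the values and state names in the sketch below are illustrative assumptions.

```python
# Sketch of the page-follow decision: the paper follows the finger only up to a
# speed threshold, and a page commits to turning once it passes a set position.

def page_animation(finger_speed, page_angle_deg, max_follow_speed=400.0,
                   commit_angle_deg=90.0):
    if finger_speed > max_follow_speed:
        return "flip_halfway_and_fall_back"  # paper cannot keep up with the finger
    if page_angle_deg > commit_angle_deg:
        return "turn_to_next_page"           # past the predetermined position
    return "follow_finger"
```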
  • when the user performs an operation such as crumpling, pinching, or tearing the paper, the control unit 46 may display the paper as being wrinkled or torn.
  • as another example of instruction input according to the amount, speed, or acceleration of movement, when the user moves an object with a finger or the like and passes it over another object, a predetermined process may be performed according to the moving speed. For example, a function of opening the file corresponding to the moved object with the application corresponding to the object it passed over may be executed.
  • FIG. 18 is a diagram for describing an example of handling a three-dimensional object.
  • when the user performs an operation of turning a hand, the control unit 46 rotates the three-dimensional object 240 in the same direction as the direction in which the user turns the hand.
  • with the gesture interface of the present embodiment, it is possible to give an operational feeling as if the three-dimensional object were actually being handled by hand, and the operability of a user interface for handling three-dimensional space can be dramatically improved.
  • the configuration of the control system 10 of the present embodiment and the internal configurations of the input device 20, the display device 30, and the control device 40 are the same as those of the first embodiment.
  • a configuration example using the non-contact input device 26 shown in FIG. 5 will be mainly described, but the same applies to a case where another configuration example is used.
  • with the non-contact input device 26 that detects a change in capacitance, when a conductive object is placed on the display screen and the user touches the object, the capacitance changes and the shape of the object is detected. Utilizing this, a user interface can be constructed by associating predetermined functions with the shapes and movements of conductive objects.
  • FIG. 19 is a diagram for explaining an example in which an instruction is input based on the shape and movement of an object.
  • a window 250 of a music reproduction application controlled by the control unit 46 is displayed on the display device 30.
  • the analysis unit 44 analyzes the shape of the input signal, detects that the object is the volume control unit 260, analyzes the movement of its knob 262, and notifies the control unit 46.
  • the control unit 46 controls the volume in the music playback application according to the amount of movement of the knob 262.
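A tiny sketch of that last step is given below; the shape label, the angle range, and the player interface are assumptions, not details from the patent.

```python
# Hedged sketch: once the contact shape is recognized as the volume controller
# 260, the rotation of its knob 262 is mapped onto the player volume.

def handle_knob(contact_shape, knob_angle_deg, player):
    if contact_shape != "volume_controller":
        return
    # Map an assumed 0-300 degree knob sweep onto a 0-100 volume scale.
    volume = max(0, min(100, int(knob_angle_deg / 3)))
    player.set_volume(volume)
```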
  • since the non-contact input device that detects a change in capacitance detects only conductive objects, a specific shape may be drawn with a conductive wire on the bottom surface of an insulating object, and that shape may be associated with a predetermined function.
  • control system 10 corresponds to an electronic device such as a personal computer.
  • a non-contact input device 26 may be used as the input device 20, and the display device 30 and the input device 20 may be provided on the tabletop of the table so that a game or the like can be enjoyed thereon.
  • the display device 30 and the input device 20 may be provided on a floor of a passage or the like to display a footprint of a user walking or to navigate a destination of the user by using an image or light.
  • the present invention can be applied to a user interface for controlling an electronic device or the like.
  • FIG. 1 is a diagram showing a configuration of a control system according to an embodiment.
  • FIG. 2 is a diagram showing a configuration example of an input device and a display device according to an embodiment.
  • FIG. 3 is a diagram showing another configuration example of the input device and the display device according to the embodiment.
  • FIG. 4 is a diagram showing still another configuration example of the input device and the display device according to the embodiment.
  • FIG. 5 is a diagram showing still another configuration example of the input device and the display device according to the embodiment.
  • FIG. 6 is a diagram showing an internal configuration of a control device.
  • FIG. 7 is a diagram showing hardware components of the control device.
  • FIG. 8 is a diagram for explaining an example of inputting an instruction by movement of a plurality of fingers of a user.
  • FIG. 9 is a diagram for explaining an example in which an instruction is input by movement of a plurality of fingers of a user.
  • FIG. 10 is a diagram for explaining an example of inputting an instruction according to the shape of a user's hand.
  • FIG. 11 is a diagram illustrating an example of mapping an object to a hand shape.
  • FIG. 12 is a diagram showing an example of a table for mapping objects to hand shapes.
  • FIG. 13 is a diagram for explaining an example of retrieving an object stored by the operation shown in FIG.
  • FIG. 14 is a diagram for explaining an example of an instruction input according to a distance.
  • FIG. 15 is a diagram for explaining a function of moving an object displayed on a screen.
  • FIG. 16 is a diagram for explaining a function of moving an object displayed on a screen.
  • FIG. 17 is a diagram for explaining a function for handling paper.
  • FIG. 18 is a diagram for describing an example of handling a three-dimensional object.
  • FIG. 19 is a diagram for explaining an example of inputting an instruction based on the shape and movement of an object.
  • Reference numerals: 10 Control system, 20 Input device, 22 Imaging device, 24 Touch panel, 26 Non-contact input device, 30 Display device, 40 Control device, 42 Acquisition unit, 44 Analysis unit, 46 Control unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention provides a user interface with excellent ease of use. The invention further provides a control system comprising, on the inner side of a display screen (34) of a display device (32), a non-contact input device (26) that detects the shape or motion of an input part, which includes at least a part of a user's body or at least a part of an object operated by the user, or detects the distance to the input part. An analysis unit analyzes the shape or motion of the input part, or the distance to it, as detected by the non-contact input device (26), to determine the user's instruction. A control unit executes a function corresponding to the user's instruction determined by the analysis unit.
PCT/JP2004/006643 2003-07-08 2004-05-18 Systeme de commande et procede de commande WO2005003948A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2003193738A JP4723799B2 (ja) 2003-07-08 2003-07-08 制御システムおよび制御方法
JP2003-193738 2003-07-08

Publications (1)

Publication Number Publication Date
WO2005003948A1 true WO2005003948A1 (fr) 2005-01-13

Family

ID=33562479

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2004/006643 WO2005003948A1 (fr) 2003-07-08 2004-05-18 Systeme de commande et procede de commande

Country Status (2)

Country Link
JP (1) JP4723799B2 (fr)
WO (1) WO2005003948A1 (fr)

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2124138A2 (fr) * 2008-05-20 2009-11-25 LG Electronics Inc. Terminal mobile utilisant un détecteur de proximité et procédé de contrôle de fonds d'écran
JP2010539590A (ja) * 2007-09-14 2010-12-16 インテレクチュアル ベンチャーズ ホールディング 67 エルエルシー ジェスチャベースのユーザインタラクションの処理
US8166421B2 (en) 2008-01-14 2012-04-24 Primesense Ltd. Three-dimensional user interface
US8249334B2 (en) 2006-05-11 2012-08-21 Primesense Ltd. Modeling of humanoid forms from depth maps
US8565479B2 (en) 2009-08-13 2013-10-22 Primesense Ltd. Extraction of skeletons from 3D maps
US8582867B2 (en) 2010-09-16 2013-11-12 Primesense Ltd Learning-based pose estimation from depth maps
US8594425B2 (en) 2010-05-31 2013-11-26 Primesense Ltd. Analysis of three-dimensional scenes
EP2703949A1 (fr) * 2011-04-28 2014-03-05 NEC System Technologies, Ltd. Dispositif de traitement d'informations, procédé de traitement d'informations et support d'enregistrement
US8744137B2 (en) 2010-09-07 2014-06-03 Sony Corporation Information processing device and information processing method
US8787663B2 (en) 2010-03-01 2014-07-22 Primesense Ltd. Tracking body parts by combined color image and depth processing
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd Zoom-based gesture user interface
US8933876B2 (en) 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
US9002099B2 (en) 2011-09-11 2015-04-07 Apple Inc. Learning-based estimation of hand and finger pose
US9019267B2 (en) 2012-10-30 2015-04-28 Apple Inc. Depth mapping with enhanced resolution
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
US9035876B2 (en) 2008-01-14 2015-05-19 Apple Inc. Three-dimensional user interface session control
US9047507B2 (en) 2012-05-02 2015-06-02 Apple Inc. Upper-body skeleton extraction from depth maps
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US9128519B1 (en) 2005-04-15 2015-09-08 Intellectual Ventures Holding 67 Llc Method and system for state-based control of objects
US9158375B2 (en) 2010-07-20 2015-10-13 Apple Inc. Interactive reality augmentation for natural interaction
US9201501B2 (en) 2010-07-20 2015-12-01 Apple Inc. Adaptive projector
US9218063B2 (en) 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
US9229107B2 (en) 2007-11-12 2016-01-05 Intellectual Ventures Holding 81 Llc Lens system
US9229534B2 (en) 2012-02-28 2016-01-05 Apple Inc. Asymmetric mapping for tactile and non-tactile user interfaces
US9247236B2 (en) 2008-03-07 2016-01-26 Intellectual Ventures Holdings 81 Llc Display with built in 3D sensing capability and gesture control of TV
US9285874B2 (en) 2011-02-09 2016-03-15 Apple Inc. Gaze detection in a 3D mapping environment
US9377865B2 (en) 2011-07-05 2016-06-28 Apple Inc. Zoom-based gesture user interface
US9377863B2 (en) 2012-03-26 2016-06-28 Apple Inc. Gaze-enhanced virtual touchscreen
US9459758B2 (en) 2011-07-05 2016-10-04 Apple Inc. Gesture-based interface with enhanced features
WO2016210292A1 (fr) 2015-06-25 2016-12-29 Children's Medical Center Corporation Procédés et compositions se rapportant à l'expansion, l'enrichissement et la conservation de cellules souches hématopoïétiques
WO2017161001A1 (fr) 2016-03-15 2017-09-21 Children's Medical Center Corporation Procédés et compositions concernant l'expansion de cellules souches hématopoïétiques
US10043279B1 (en) 2015-12-07 2018-08-07 Apple Inc. Robust detection and classification of body parts in a depth map
US10366278B2 (en) 2016-09-20 2019-07-30 Apple Inc. Curvature-based face detector
CN110543234A (zh) * 2018-05-29 2019-12-06 富士施乐株式会社 信息处理装置和非暂时性计算机可读介质
CN111490840A (zh) * 2019-12-18 2020-08-04 蔡晓青 分散式播放设备管理平台
US10782788B2 (en) 2010-09-21 2020-09-22 Saturn Licensing Llc Gesture controlled communication

Families Citing this family (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4627052B2 (ja) 2006-07-06 2011-02-09 株式会社ソニー・コンピュータエンタテインメント 画像に連携した音声出力方法および装置
WO2008007372A2 (fr) 2006-07-12 2008-01-17 N-Trig Ltd. Détection par effleurement et détection tactile pour numériseur graphique
US8686964B2 (en) 2006-07-13 2014-04-01 N-Trig Ltd. User specific recognition of intended user interaction with a digitizer
JP4730784B2 (ja) * 2006-08-09 2011-07-20 アルパイン株式会社 車載表示システム
WO2008020446A1 (fr) * 2006-08-15 2008-02-21 N-Trig Ltd. Détection de geste pour un numériseur graphique
JP2010515170A (ja) * 2006-12-29 2010-05-06 ジェスチャー テック,インコーポレイテッド 機能強化したインタラクティブシステムを用いた仮想オブジェクトの操作
US7855718B2 (en) * 2007-01-03 2010-12-21 Apple Inc. Multi-touch input discrimination
US8933892B2 (en) 2007-11-19 2015-01-13 Cirque Corporation Touchpad combined with a display and having proximity and touch sensing capabilities to enable different functions or interfaces to be displayed
JPWO2009069392A1 (ja) * 2007-11-28 2011-04-07 日本電気株式会社 入力装置、サーバ、表示管理方法および記録媒体
JP4318056B1 (ja) * 2008-06-03 2009-08-19 島根県 画像認識装置および操作判定方法
US8154524B2 (en) * 2008-06-24 2012-04-10 Microsoft Corporation Physics simulation-based interaction for surface computing
JP2010140300A (ja) * 2008-12-12 2010-06-24 Sharp Corp 表示装置、制御方法、制御プログラムおよび記録媒体
KR101609388B1 (ko) * 2009-03-04 2016-04-05 엘지전자 주식회사 3차원 메뉴를 표시하는 이동 단말기 및 이동 단말기의 제어방법
JP5287403B2 (ja) 2009-03-19 2013-09-11 ソニー株式会社 情報処理装置、情報処理方法、およびプログラム
JP2010244132A (ja) * 2009-04-01 2010-10-28 Mitsubishi Electric Corp タッチパネル付きユーザインタフェース装置、ユーザインタフェース制御方法およびユーザインタフェース制御プログラム
JP5187280B2 (ja) 2009-06-22 2013-04-24 ソニー株式会社 操作制御装置および操作制御方法
US20110148801A1 (en) * 2009-12-18 2011-06-23 Bateman Steven S Touch panel region of interest reporting scheme
EP2544079A4 (fr) * 2010-03-05 2016-07-13 Nec Corp Dispositif de terminal portable
JP5118719B2 (ja) * 2010-03-31 2013-01-16 株式会社エヌ・ティ・ティ・ドコモ 情報端末及び文書編集方法
WO2011136783A1 (fr) * 2010-04-29 2011-11-03 Hewlett-Packard Development Company L. P. Système et procédé permettant de fournir des informations d'objets
JP5675196B2 (ja) * 2010-07-24 2015-02-25 キヤノン株式会社 情報処理装置及びその制御方法
JP5625643B2 (ja) 2010-09-07 2014-11-19 ソニー株式会社 情報処理装置、および情報処理方法
CN103270479B (zh) * 2010-11-22 2017-05-24 株式会社Ip舍路信 信息输入系统、程序、介质
JP5479414B2 (ja) * 2010-11-24 2014-04-23 キヤノン株式会社 情報処理装置およびその制御方法
JP5724422B2 (ja) * 2011-02-07 2015-05-27 富士通株式会社 操作制御装置,操作制御プログラムおよび操作制御方法
JP5852346B2 (ja) 2011-07-11 2016-02-03 京セラ株式会社 表示機器、制御システムおよび制御プログラム
EP2575007A1 (fr) * 2011-09-27 2013-04-03 Elo Touch Solutions, Inc. Mise à l'échelle d'entrées basées sur les gestes
EP2575006B1 (fr) 2011-09-27 2018-06-13 Elo Touch Solutions, Inc. Interaction utilisateur avec contact et sans contact avec un dispositif
JP2013222317A (ja) * 2012-04-17 2013-10-28 Toshiba Mach Co Ltd 数値制御装置
JP5510529B2 (ja) * 2012-11-16 2014-06-04 ソニー株式会社 情報処理装置、記憶媒体、情報処理システム、および情報処理方法、並びにプログラム
US9836199B2 (en) 2013-06-26 2017-12-05 Panasonic Intellectual Property Corporation Of America User interface device and display object operating method
KR20150031384A (ko) * 2013-09-13 2015-03-24 현대자동차주식회사 맞춤형 인터페이스 시스템 및 그 동작 방법
WO2015045090A1 (fr) * 2013-09-27 2015-04-02 株式会社 東芝 Procédé et dispositif électronique
US10080963B2 (en) 2014-03-28 2018-09-25 Sony Interactive Entertainment Inc. Object manipulation method, object manipulation program, and information processing apparatus
JP2015191480A (ja) * 2014-03-28 2015-11-02 株式会社ソニー・コンピュータエンタテインメント 情報処理装置、オブジェクトの操作方法、及びオブジェクトの操作プログラム
JPWO2015159548A1 (ja) * 2014-04-18 2017-04-13 Necソリューションイノベータ株式会社 投影制御装置、投影制御方法及び投影制御プログラム
JP2016042383A (ja) * 2015-11-25 2016-03-31 カシオ計算機株式会社 ユーザ操作処理装置、ユーザ操作処理方法及びプログラム

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0536715A2 (fr) * 1991-10-07 1993-04-14 Fujitsu Limited Appareil pour la manipulation d'un objet affiché sur une unité d'affichage
US5483261A (en) * 1992-02-14 1996-01-09 Itu Research, Inc. Graphical input controller and method with rear screen image detection
JPH1138949A (ja) * 1997-07-15 1999-02-12 Sony Corp 描画装置、描画方法及び記録媒体
JP2000163031A (ja) * 1998-11-25 2000-06-16 Seiko Epson Corp 携帯情報機器及び情報記憶媒体
JP2001216069A (ja) * 2000-02-01 2001-08-10 Toshiba Corp 操作入力装置および方向検出方法
JP2002342033A (ja) * 2001-05-21 2002-11-29 Sony Corp 非接触型ユーザ入力装置

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07182101A (ja) * 1993-10-26 1995-07-21 Itu Res Inc グラフィック入力装置および方法、グラフィックオブジェクト操作方法、グラフィック入力信号供給方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0536715A2 (fr) * 1991-10-07 1993-04-14 Fujitsu Limited Appareil pour la manipulation d'un objet affiché sur une unité d'affichage
US5483261A (en) * 1992-02-14 1996-01-09 Itu Research, Inc. Graphical input controller and method with rear screen image detection
JPH1138949A (ja) * 1997-07-15 1999-02-12 Sony Corp 描画装置、描画方法及び記録媒体
JP2000163031A (ja) * 1998-11-25 2000-06-16 Seiko Epson Corp 携帯情報機器及び情報記憶媒体
JP2001216069A (ja) * 2000-02-01 2001-08-10 Toshiba Corp 操作入力装置および方向検出方法
JP2002342033A (ja) * 2001-05-21 2002-11-29 Sony Corp 非接触型ユーザ入力装置

Cited By (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9128519B1 (en) 2005-04-15 2015-09-08 Intellectual Ventures Holding 67 Llc Method and system for state-based control of objects
US8249334B2 (en) 2006-05-11 2012-08-21 Primesense Ltd. Modeling of humanoid forms from depth maps
US10564731B2 (en) 2007-09-14 2020-02-18 Facebook, Inc. Processing of gesture-based user interactions using volumetric zones
JP2010539590A (ja) * 2007-09-14 2010-12-16 インテレクチュアル ベンチャーズ ホールディング 67 エルエルシー ジェスチャベースのユーザインタラクションの処理
US9058058B2 (en) 2007-09-14 2015-06-16 Intellectual Ventures Holding 67 Llc Processing of gesture-based user interactions activation levels
US9811166B2 (en) 2007-09-14 2017-11-07 Intellectual Ventures Holding 81 Llc Processing of gesture-based user interactions using volumetric zones
US10990189B2 (en) 2007-09-14 2021-04-27 Facebook, Inc. Processing of gesture-based user interaction using volumetric zones
US9229107B2 (en) 2007-11-12 2016-01-05 Intellectual Ventures Holding 81 Llc Lens system
US9035876B2 (en) 2008-01-14 2015-05-19 Apple Inc. Three-dimensional user interface session control
US8166421B2 (en) 2008-01-14 2012-04-24 Primesense Ltd. Three-dimensional user interface
US9247236B2 (en) 2008-03-07 2016-01-26 Intellectual Ventures Holdings 81 Llc Display with built in 3D sensing capability and gesture control of TV
US10831278B2 (en) 2008-03-07 2020-11-10 Facebook, Inc. Display with built in 3D sensing capability and gesture control of tv
EP2124138A2 (fr) * 2008-05-20 2009-11-25 LG Electronics Inc. Terminal mobile utilisant un détecteur de proximité et procédé de contrôle de fonds d'écran
EP2124138A3 (fr) * 2008-05-20 2014-12-24 LG Electronics Inc. Terminal mobile utilisant un détecteur de proximité et procédé de contrôle de fonds d'écran
US8565479B2 (en) 2009-08-13 2013-10-22 Primesense Ltd. Extraction of skeletons from 3D maps
US8787663B2 (en) 2010-03-01 2014-07-22 Primesense Ltd. Tracking body parts by combined color image and depth processing
US8824737B2 (en) 2010-05-31 2014-09-02 Primesense Ltd. Identifying components of a humanoid form in three-dimensional scenes
US8781217B2 (en) 2010-05-31 2014-07-15 Primesense Ltd. Analysis of three-dimensional scenes with a surface model
US8594425B2 (en) 2010-05-31 2013-11-26 Primesense Ltd. Analysis of three-dimensional scenes
US9201501B2 (en) 2010-07-20 2015-12-01 Apple Inc. Adaptive projector
US9158375B2 (en) 2010-07-20 2015-10-13 Apple Inc. Interactive reality augmentation for natural interaction
US8744137B2 (en) 2010-09-07 2014-06-03 Sony Corporation Information processing device and information processing method
US8582867B2 (en) 2010-09-16 2013-11-12 Primesense Ltd Learning-based pose estimation from depth maps
US10782788B2 (en) 2010-09-21 2020-09-22 Saturn Licensing Llc Gesture controlled communication
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US8933876B2 (en) 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US9454225B2 (en) 2011-02-09 2016-09-27 Apple Inc. Gaze-based display control
US9342146B2 (en) 2011-02-09 2016-05-17 Apple Inc. Pointing-based display interaction
US9285874B2 (en) 2011-02-09 2016-03-15 Apple Inc. Gaze detection in a 3D mapping environment
EP2703949A4 (fr) * 2011-04-28 2014-10-22 Nec Solution Innovators Ltd Dispositif de traitement d'informations, procédé de traitement d'informations et support d'enregistrement
US9329673B2 (en) 2011-04-28 2016-05-03 Nec Solution Innovators, Ltd. Information processing device, information processing method, and recording medium
EP2703949A1 (fr) * 2011-04-28 2014-03-05 NEC System Technologies, Ltd. Dispositif de traitement d'informations, procédé de traitement d'informations et support d'enregistrement
US9377865B2 (en) 2011-07-05 2016-06-28 Apple Inc. Zoom-based gesture user interface
US9459758B2 (en) 2011-07-05 2016-10-04 Apple Inc. Gesture-based interface with enhanced features
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd Zoom-based gesture user interface
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
US9218063B2 (en) 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US9002099B2 (en) 2011-09-11 2015-04-07 Apple Inc. Learning-based estimation of hand and finger pose
US9229534B2 (en) 2012-02-28 2016-01-05 Apple Inc. Asymmetric mapping for tactile and non-tactile user interfaces
US9377863B2 (en) 2012-03-26 2016-06-28 Apple Inc. Gaze-enhanced virtual touchscreen
US11169611B2 (en) 2012-03-26 2021-11-09 Apple Inc. Enhanced virtual touchpad
US9047507B2 (en) 2012-05-02 2015-06-02 Apple Inc. Upper-body skeleton extraction from depth maps
US9019267B2 (en) 2012-10-30 2015-04-28 Apple Inc. Depth mapping with enhanced resolution
WO2016210292A1 (fr) 2015-06-25 2016-12-29 Children's Medical Center Corporation Procédés et compositions se rapportant à l'expansion, l'enrichissement et la conservation de cellules souches hématopoïétiques
US10043279B1 (en) 2015-12-07 2018-08-07 Apple Inc. Robust detection and classification of body parts in a depth map
WO2017161001A1 (fr) 2016-03-15 2017-09-21 Children's Medical Center Corporation Procédés et compositions concernant l'expansion de cellules souches hématopoïétiques
EP4049665A1 (fr) 2016-03-15 2022-08-31 Children's Medical Center Corporation Procédés et compositions associées à l'expansion de cellules souches hématopoïétiques
US10366278B2 (en) 2016-09-20 2019-07-30 Apple Inc. Curvature-based face detector
CN110543234A (zh) * 2018-05-29 2019-12-06 富士施乐株式会社 信息处理装置和非暂时性计算机可读介质
CN110543234B (zh) * 2018-05-29 2024-03-08 富士胶片商业创新有限公司 信息处理装置和非暂时性计算机可读介质
CN111490840A (zh) * 2019-12-18 2020-08-04 蔡晓青 分散式播放设备管理平台

Also Published As

Publication number Publication date
JP2005031799A (ja) 2005-02-03
JP4723799B2 (ja) 2011-07-13

Similar Documents

Publication Publication Date Title
JP4723799B2 (ja) 制御システムおよび制御方法
JP5184384B2 (ja) 制御システムおよび制御方法
JP6074170B2 (ja) 近距離動作のトラッキングのシステムおよび方法
US11048333B2 (en) System and method for close-range movement tracking
Hinckley et al. Sensor synaesthesia: touch in motion, and motion in touch
EP2717120B1 (fr) Appareil, procédés et produits de programme informatique fournissant des commandes gestuelles à partir de la main ou d'un doigt pour applications de dispositif électronique portable
KR101270847B1 (ko) 터치 감지 입력 장치용 제스처
KR101544364B1 (ko) 듀얼 터치 스크린을 구비한 휴대 단말기 및 그 컨텐츠 제어방법
Cao et al. ShapeTouch: Leveraging contact shape on interactive surfaces
US8941600B2 (en) Apparatus for providing touch feedback for user input to a touch sensitive surface
US9389722B2 (en) User interface device that zooms image in response to operation that presses screen, image zoom method, and program
US9348458B2 (en) Gestures for touch sensitive input devices
JP2013037675A5 (fr)
US9696882B2 (en) Operation processing method, operation processing device, and control method
TW200847001A (en) Gesturing with a multipoint sensing device
GB2509599A (en) Identification and use of gestures in proximity to a sensor
CN104049734A (zh) 基于用户接触显示图形用户界面的方法和设备
KR20140010003A (ko) 컴퓨팅 디바이스와 사용작용할 때 생성된 입력 이벤트의 해석을 향상시키기 위해 컴퓨팅 디바이스의 이동을 사용하는 방법
KR102194778B1 (ko) 공간상의 상호 작용을 이용한 단말의 제어 방법 및 그 단말
JP2017157079A (ja) 情報処理装置、表示制御方法、及び表示制御プログラム
TW201437844A (zh) 輸入裝置及其切換輸入方式的方法
Shittu et al. A review on interaction techniques on mobile phones
Zhai The Computer Mouse and Related Input Devices
Wu Study and design of interaction techniques to facilitate object selection and manipulation in virtual environments on mobile devices
KR20200143346A (ko) 공간상의 상호 작용을 이용한 단말의 제어 방법 및 그 단말

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase