WO2017029749A1 - Information processing apparatus, control method therefor, program, and storage medium - Google Patents

Information processing apparatus, control method therefor, program, and storage medium

Info

Publication number
WO2017029749A1
Authority
WO
WIPO (PCT)
Prior art keywords
proximity
degree
touch
condition
touch input
Prior art date
Application number
PCT/JP2015/073332
Other languages
English (en)
Japanese (ja)
Inventor
勇樹 釜森
Original Assignee
キヤノン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by キヤノン株式会社 filed Critical キヤノン株式会社
Priority to JP2017509073A priority Critical patent/JP6711817B2/ja
Priority to PCT/JP2015/073332 priority patent/WO2017029749A1/fr
Priority to US15/237,182 priority patent/US10156938B2/en
Publication of WO2017029749A1 publication Critical patent/WO2017029749A1/fr

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04105Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • the present invention relates to a technique for detecting a touch input by an operator.
  • UI (user interface)
  • A distance image capturing unit capable of obtaining distance information, such as a stereo camera or an infrared camera, is used, and the degree of proximity between a target surface and a part of a predetermined operating body (for example, an operator's fingertip) is detected.
  • A predetermined threshold is provided for the distance between the target surface and the operator's fingertip, and when the two approach each other beyond the threshold, it is determined that the fingertip has touched the target surface.
  • "Touch": the fingertip and the target surface coming into contact from a non-contact state.
  • "Release": the fingertip and the target surface returning to a non-contact state from the contact state.
  • Patent Document 1 uses, as the condition for recognizing a release, not the crossing of a threshold but the reversal of the movement direction of the operating body touching the target surface.
  • Patent Document 1 does not consider such a case.
  • An object of the present invention is to reduce misrecognition of the end of touch input, which occurs when an error is included in the detection result of the degree of proximity between the operating body and the target surface.
  • To this end, the present invention comprises: acquisition means for acquiring information indicating the degree of proximity of an operating body to a predetermined surface; reference determination means for determining, based on the transition of the degree of proximity after a start of touch input to the predetermined surface by the operating body is recognized, a first reference value relating to the degree of proximity; condition determination means for determining, based on the first reference value determined by the reference determination means, a first condition relating to the degree of proximity for recognizing the end of the touch input; and recognition means for recognizing the end of the touch input in response to the degree of proximity represented by the information acquired by the acquisition means satisfying the first condition determined by the condition determination means.
  • According to the present invention, it is possible to reduce erroneous recognition of the end of touch input, which occurs when an error is included in the detection result of the degree of proximity between the operating body and the target surface.
  • Brief description of the drawings: (a) the hardware configuration of the information processing apparatus according to the first embodiment; (b) a block diagram showing an example of the functional configuration of the first embodiment; and a flowchart showing an example of the flow of the operation recognition process in the first embodiment.
  • touch is an operation of starting touch input for an item by bringing a fingertip into contact with or approaching the display area of the item projected on the table surface.
  • Release is an operation of ending touch input for an item by releasing the fingertip that is in contact with or close to the item after the touch operation from the item.
  • The information processing apparatus recognizes touch operations such as taps, moves, and flicks based on the position, movement, and speed of the fingertip from touch to release. For example, an input in which the fingertip touches the target surface and is then released almost without moving in the direction along the target surface is called a "tap operation" and is often recognized as an instruction to select the displayed item.
  • In the following, an information processing apparatus that recognizes each of "touch" and "release" as an input of a single operation, a "touch operation" and a "release operation", and provides feedback to the user will be described as an example. For example, when the touch operation is recognized, the color of the item being touched is changed. Further, when the release operation is recognized, the touched item is enlarged. In the present embodiment, the state of the operating body from touch to release is expressed as touch input.
  • the touch sensor provided on the display screen directly detects that the fingertip of the operator has touched the display screen and recognizes touch and release.
  • In that case, a touch or release is rarely recognized at a timing significantly different from the user's intention to input a touch operation.
  • On the other hand, when the distance between the operating body and the target surface is acquired and touch and release are detected by comparing that distance with a threshold, the distance may not be acquired with sufficient accuracy, and the measured distance may cross the threshold at a timing that does not match the user's intention.
  • For this reason, a threshold on the distance from the touch target surface for detecting a touch (hereinafter referred to as the touch threshold) is often set sufficiently large. Further, the threshold on the distance from the touch target surface for detecting the release (hereinafter referred to as the release threshold) may be set equal to the touch threshold or larger than the touch threshold. If the release threshold is set larger, the release is not recognized until the operating body is sufficiently separated from the target surface, so that it is possible to prevent recognition of the operation from being interrupted by detection blur that occurs during touch input. However, if both the touch threshold and the release threshold are set large, the touch and release cannot be recognized unless the user performs the touch operation with an unnaturally large movement, and operability is degraded as a result.
  • In this embodiment, therefore, the release threshold is dynamically determined based on the transition of the degree of proximity between the operating body and the touch target surface during the subsequent touch input. Specifically, the degree of proximity at the time when the operating body is closest to the touch target surface during the touch input is used as a reference, and the release is recognized when the operating body moves away from the touch target surface by a predetermined threshold from that reference. Further, in this embodiment, the touch threshold applied when touch input is resumed after a release has been recognized is dynamically determined based on the transition of the degree of proximity between the operating body and the touch target surface after the release is recognized.
  • FIG. 1 is an example of an appearance of a tabletop interface system in which an information processing apparatus 100 according to the present embodiment is installed. Also shown are coordinate axes that define position information in the three-dimensional space.
  • the target surface 101 is a table portion of the table top interface, and the user can input a touch operation by touching the target surface 101.
  • the distance image sensor 102 is installed above the target surface 101 so as to look down on the target surface.
  • A distance image is an image in which the value of each pixel reflects information corresponding to the distance from a reference position (for example, the center of the lens of the imaging means used for imaging) to the subject surface captured by that pixel.
  • In this embodiment, the pixel values of the distance image captured by the distance image sensor 102 reflect the distance from the distance image sensor 102 to the target surface 101 or to the surface of an object existing above the target surface 101.
  • the captured distance image is input to the information processing apparatus 100 as an input image.
  • The information processing apparatus 100 acquires the three-dimensional position of the user's hand 106 by analyzing the input image, and recognizes the input operation. Therefore, the user can also input a space gesture operation by moving a predetermined object such as a hand in the space between the target surface 101 and the distance image sensor 102, within the range that can be imaged by the distance image sensor 102.
  • As the method for acquiring the distance image, a pattern irradiation method using infrared light or a TOF (Time-of-Flight) sensor is used.
  • the visible light camera 103 is also installed so as to look down at the target surface 101 from above.
  • the information processing apparatus 100 can function as a document camera that controls the visible light camera 103 to capture an image of an object placed on the target surface 101 and obtain a read image thereof.
  • the information processing apparatus 100 detects and further identifies an object existing in the space on the target surface 101 based on a visible light image obtained by the visible light camera 103 and a distance image obtained by the distance image sensor 102.
  • the object includes, for example, a user's hand, a paper medium, a document such as a book, and other three-dimensional objects.
  • the projector 104 projects an image on the upper surface of the target surface 101.
  • the user performs an operation by touch or space gesture on the item 105 included in the projected image.
  • the distance image acquired using the distance image sensor 102 is used for detecting the hand 106 and recognizing the operation.
  • Instead of projecting with the projector 104, a liquid crystal display may be used as the target surface 101. In that case, it is also possible to detect the hand without being affected by the projection light, for example by detecting a skin color region in the visible light image and thereby detecting a human hand from the image.
  • The distance image sensor 102 and the visible light camera 103 themselves do not necessarily have to be installed above, as long as an image of the target surface 101 viewed from above can be obtained. For example, even a configuration that images a mirror placed above the target surface can obtain an image with an angle of view equivalent to viewing the target surface 101 from above.
  • Likewise, the projector 104 in this example projects onto the target surface 101 while looking down obliquely from above, but projection light projected in a different direction may instead be reflected onto the target surface 101 using a mirror or the like.
  • the x, y, and z axes shown in FIG. 1 are defined in a three-dimensional space on the target surface 101, and position information is handled.
  • the two dimensions parallel to the upper surface of the table are the xy plane, and the direction perpendicular to the table upper surface and extending upward is the positive direction of the z axis.
  • the z-axis direction corresponds to the height direction in the world coordinate system.
  • the present embodiment can also be applied to a system in which a non-horizontal surface such as a whiteboard or a wall surface is the target surface 101.
  • FIG. 2A is a hardware configuration diagram of a tabletop interface including the information processing apparatus 100 according to the present embodiment.
  • The central processing unit (CPU) 200 uses the RAM 202 as a work memory, executes the OS and the programs stored in the ROM 201 and the storage device 203, performs various processing calculations and logical determinations, and controls each configuration connected to the system bus 204.
  • the processing executed by the CPU 200 includes touch operation and release operation recognition processing, which will be described later.
  • the storage device 203 is a hard disk drive, an external storage device connected by various interfaces, and the like, and stores programs and various data related to the operation recognition processing of the embodiment.
  • the distance image sensor 102 captures a distance image of a space including a table on which an item is displayed and a user's hand operating the item according to the control of the CPU 200, and outputs the captured distance image to the system bus 204.
  • As the distance image acquisition method, a sensor using infrared light, which is less affected by ambient light or by the display on the table surface, is described here as an example. It is also possible to use a method that uses the reflection time of the light.
  • the projector 104 projects an image item to be operated on a table according to the control of the CPU 200.
  • the visible light camera 103, the distance image sensor 102, and the projector 104 are external devices connected to the information processing apparatus 100 via an input / output interface, and cooperate with the information processing apparatus 100. To construct an information processing system. However, these devices may be integrated with the information processing apparatus 100.
  • FIG. 2B is a block diagram illustrating an example of a functional configuration of the information processing apparatus 100 according to the present embodiment.
  • Each functional unit is realized by the CPU 200 developing a program stored in the ROM 201 in the RAM 202 and executing processing according to each flowchart described below. Further, for example, when hardware is configured as an alternative to software processing using the CPU 200, arithmetic units and circuits corresponding to the processing of each functional unit described here may be configured.
  • the image acquisition unit 210 acquires a distance image captured by the distance image sensor 102 as an input image at regular intervals, and stores it in the RAM 202 as needed.
  • What the image acquisition unit 210 acquires and exchanges with each functional unit is actually a signal corresponding to the image data, but in this specification this is simply expressed as "acquiring a distance image" or "acquiring an input image".
  • the region extraction unit 211 performs threshold determination and noise reduction processing on each pixel of the input image acquired by the image acquisition unit 210 according to the background difference method, and extracts a hand region in the distance image.
  • As the background image, an input image capturing a state in which no object exists on the table 101 is used.
  • the hand region is a region in which the hand used by the user as the operation tool is shown in the input distance image.
  • the position detection unit 212 detects the fingertip position of the user's hand based on the contour information of the hand region extracted by the region extraction unit 211, and specifies the coordinate value.
  • The distance acquisition unit 213 acquires the distance between the fingertip position detected by the position detection unit 212 and the table upper surface, which is the touch target surface, as the information indicating the degree of proximity. Since each pixel value of the input image of this embodiment corresponds to a z coordinate, the difference between the pixel values at the fingertip position detected by the position detection unit 212 in the background image and in the latest input image is equivalent to the distance between the fingertip and the upper surface of the table 101. As described above, in the present embodiment, the degree of proximity between the operating body and the touch target surface corresponds to the value of the z coordinate of the coordinate axes shown in FIG. 1, that is, the height of the fingertip from the table surface.
  • the degree of proximity between the operating tool and the touch target surface is not limited to the distance information obtained based on the pixel value of the distance image.
  • it may be information on the amount of change in capacitance caused by the operation body approaching the touch target surface, or information on the amount of change in temperature, pressure, or contact area.
  • The reference determination unit 214 determines a first reference value, which serves as a reference for determining the release threshold, based on the transition of the degree of proximity of the operating body to the target surface after a touch is recognized by the recognition unit 216 described later.
  • the first reference value is specified based on information when the operating tool is closest to the target surface.
  • the degree of proximity is the distance between the touch target surface and the operating body (the height of the fingertip from the table). Therefore, the reference determination unit 214 determines the minimum distance (minimum height) as the first reference value in the transition of the distance between the touch target surface and the operating body after the touch input is started.
  • the reference determination unit 214 determines the second reference value based on the transition of the degree of proximity of the operation body to the target surface after the release is recognized by the recognition unit 216 described later.
  • the second reference value is a reference in the process of determining a touch threshold value for recognizing a touch input again after release.
  • the second reference value is specified based on information at the time when the operating tool is most distant from the target surface after release.
  • In this embodiment, the reference determination unit 214 determines, as the second reference value, the maximum distance (maximum height) in the transition of the distance between the touch target surface and the operating body (the height of the fingertip from the table) after the release is recognized.
  • the condition determining unit 215 uses the reference value determined by the reference determining unit 214 to determine a condition used for detecting the touch operation and the release operation.
  • a first condition for recognizing at least the end of touch input a release threshold for the degree of proximity between the touch target surface and the operating tool is determined.
  • a touch threshold is determined as a second condition for recognizing the restart of touch input.
  • In the present embodiment, the release threshold and the touch threshold are distance (height) thresholds.
  • The release threshold, which is the first condition, is determined by adding a predetermined value to the distance determined as the first reference value.
  • The touch threshold, which is the second condition, is determined by subtracting a predetermined value from the value determined as the second reference value. However, when the result is larger than the initial value of the touch threshold, the touch threshold remains at the initial value.
  • The recognition unit 216 recognizes the touch operation and the release operation based on comparisons between the distance acquired by the distance acquisition unit 213 and either the touch threshold given as a predetermined initial value or the thresholds that are the first and second conditions determined by the condition determination unit 215. Specifically, a touch is recognized when the operating body, from a position sufficiently away from the touch target surface, approaches to within the distance given by the touch threshold. A release is recognized when the operating body moves away from the touch target surface beyond the distance given by the release threshold.
  • the display control unit 217 uses the information stored in the ROM 201 and the storage device 203 to generate and output an image to be projected on the target surface 101 of the table by the projector 104 which is the display unit of the present embodiment.
  • the output image is projected and displayed on the target surface 101 on the table by the projector 104.
  • the projected image includes a GUI screen including at least a plurality of items to be selected. Selection or movement may be possible for each item, and the entire image may be the object of selection or movement.
  • the operation recognition process of the first embodiment will be described with reference to the flowchart of FIG.
  • the processing of the flowchart of FIG. 3 is realized by the CPU 200 that constitutes each functional unit of the information processing apparatus developing the program recorded in the ROM 201 on the RAM 202 and executing it.
  • the process of the flowchart of FIG. 3 is started in response to the distance image captured by the distance image sensor 102 being input to the information processing apparatus 100.
  • the process of the flowchart in FIG. 3 is repeated each time a distance image is input from the distance image sensor 102. Therefore, the cycle in which the process is repeated matches the frame rate of the captured image of the distance image sensor 102.
  • step S301 the image acquisition unit 210 acquires, from the distance image sensor 102, a distance image obtained by capturing a space including the table on which the image is projected and the user's hand as an input image. For each pixel of the acquired distance image, a value corresponding to the distance from the distance image sensor 102 to the subject is held as a pixel value.
  • coordinate conversion based on the lens characteristics of the distance image sensor 102 and the relative positional relationship with the target surface 101 is performed on the position and pixel value of each pixel in the input image. As a result, information on each pixel in the input image is mapped to a real-world three-dimensional coordinate system defined on the table 101 and handled as three-dimensional position information.
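  • The document does not give the conversion formulas themselves; as an illustrative sketch only, a standard pinhole back-projection followed by a calibrated rigid transform into the table coordinate system could look like the following (the intrinsics fx, fy, cx, cy and the rotation/translation R, t are assumed to come from a prior calibration and are not values from the document).

```python
import numpy as np

def pixel_to_world(u, v, depth, fx, fy, cx, cy, R, t):
    """Map one pixel of the distance image to the coordinate system on the table.

    u, v   : pixel column / row in the distance image
    depth  : distance from the sensor stored in that pixel
    fx..cy : pinhole intrinsics of the distance image sensor (assumed known)
    R, t   : 3x3 rotation and 3-vector translation from the sensor frame to the
             world frame of Fig. 1, obtained by prior calibration (assumed)
    Returns (x, y, z); z corresponds to the height above the table surface.
    """
    # Back-project the pixel into the sensor coordinate system.
    x_s = (u - cx) * depth / fx
    y_s = (v - cy) * depth / fy
    p_sensor = np.array([x_s, y_s, depth], dtype=float)

    # Rigid transform into the world coordinate system defined on the table.
    return np.asarray(R) @ p_sensor + np.asarray(t)
```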
  • the region extraction unit 211 scans each pixel of the distance image, and extracts a hand region in which the user's hand is reflected from the distance image.
  • Specifically, the difference between the input distance image including the user's hand and the background distance image (a distance image obtained by capturing only the table) is obtained for each pixel, and pixels whose difference is larger than a threshold are extracted as the hand region.
  • the extracted region is subjected to expansion / contraction processing, and the hand region is corrected by removing minute noise derived from the error of the distance image sensor.
  • a pixel group that satisfies the condition that the distance from the sensor represented by the pixel value is closer than the distance represented by the predetermined threshold can be extracted as a hand region. In this case, it is not necessary to previously acquire a distance image obtained by capturing only a table for use as a background image.
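  • A minimal sketch of the background-difference extraction described for step S302 is shown below; the difference threshold and the structuring element are placeholder values, and keeping only the largest connected component is an added simplification that assumes a single hand is in view.

```python
import numpy as np
from scipy import ndimage

def extract_hand_region(input_depth, background_depth, diff_threshold=15.0):
    """Extract the hand region (step S302) by background subtraction (sketch).

    input_depth      : 2-D array, distance from the sensor per pixel (current frame)
    background_depth : 2-D array captured with nothing on the table
    diff_threshold   : minimum difference treated as foreground (placeholder,
                       depends on the sensor's units and error)
    Returns a boolean mask of the hand region.
    """
    # A pixel is foreground when it is sufficiently closer to the sensor than
    # the empty table was at the same position.
    diff = background_depth - input_depth
    mask = diff > diff_threshold

    # Morphological opening removes minute noise caused by sensor error,
    # corresponding to the expansion/contraction processing in the description.
    mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))

    # Keep only the largest connected component as the hand region (assumption).
    labels, n = ndimage.label(mask)
    if n == 0:
        return mask
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return labels == (np.argmax(sizes) + 1)
```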
  • step S303 the position detection unit 212 searches for a position corresponding to the fingertip based on the shape of the hand region extracted in step S302, and detects its three-dimensional coordinates.
  • contour points of the hand region are extracted, and for each contour point, an angle between vectors formed with adjacent contour points is calculated.
  • a contour point having a vector-to-vector angle smaller than the threshold is extracted as a fingertip position candidate point, and a position where a plurality of fingertip position candidate points are dense is specified as the fingertip. That is, the tip of the part representing the protruding end with a relatively thin outline is specified as the fingertip position.
  • the three-dimensional coordinates of the fingertip in the real world coordinate system can be calculated based on the specified fingertip position, the pixel value of the distance image, and the coordinate conversion parameter to the real world coordinate system.
  • the fingertip position detection method is not limited to this, and it is also possible to use a method of narrowing down the position using circular template matching or restrictions based on the structure of the hand.
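  • As an illustrative sketch of the contour-angle approach of step S303 (the offset k and the angle threshold are placeholders, and the "dense candidate points" rule is simplified to counting nearby candidates along the contour):

```python
import numpy as np

def detect_fingertip(contour, k=8, angle_threshold_deg=60.0):
    """Pick a fingertip point on the hand contour (simplified sketch of step S303).

    contour : (N, 2) array of ordered contour points of the hand region
    k       : index offset to the neighbouring contour points used to form the
              two vectors (placeholder value)
    Returns the index of the chosen contour point, or None if no candidate.
    """
    n = len(contour)
    candidates = []
    for i in range(n):
        p = contour[i]
        v1 = contour[(i - k) % n] - p            # vector to the preceding neighbour
        v2 = contour[(i + k) % n] - p            # vector to the following neighbour
        cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
        angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
        if angle < angle_threshold_deg:          # sharp angle -> thin protruding tip
            candidates.append(i)

    if not candidates:
        return None
    # The description picks a position where candidate points are dense; here we
    # take the candidate with the most other candidates within k contour steps.
    cand = np.array(candidates)
    density = [np.sum(np.minimum(np.abs(cand - c), n - np.abs(cand - c)) <= k) for c in cand]
    return int(cand[int(np.argmax(density))])
```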
  • step S304 the distance acquisition unit 213 acquires the distance between the fingertip detected in step S303 and the touch target surface as information indicating the degree of proximity.
  • the distance between the fingertip and the touch target surface is calculated as the z coordinate defined on the table.
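  • Since both the background image and the input image store sensor distances per pixel, the height used as the degree of proximity can be sketched as the pixel-value difference at the fingertip position (variable names are illustrative):

```python
def fingertip_height(background_depth, input_depth, fingertip_px):
    """Step S304 (sketch): height of the fingertip above the table surface.

    With the sensor looking straight down, the height equals the distance to the
    empty table minus the distance measured at the fingertip pixel.
    """
    row, col = fingertip_px
    return float(background_depth[row, col] - input_depth[row, col])
```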
  • step S305 the recognition unit 216 determines whether it is currently in a non-touch state.
  • In this embodiment, two states, a touch state and a non-touch state, are defined, and for each frame a determination result as to which state the operating body is in is obtained. In step S305, whether the current frame is during touch input is determined by referring to which state the operating body was in at the previous frame.
  • The touch state is a state in which the distance between the operating body and the touch target surface is below a touch threshold (either the touch threshold previously defined as an initial value or the touch threshold determined as the second condition by the condition determination unit 215).
  • the non-touch state is a state in which the distance between the operation body and the touch target surface exceeds the release threshold when the operation body in the touch state moves away from the touch target surface.
  • the initial state in which touch recognition has not been performed since the information processing apparatus 100 is activated is included in the non-touch state.
  • A touch is recognized when a state transition from the non-touch state to the touch state occurs.
  • a release is recognized when a state transition from a touch state to a non-touch state occurs.
  • the information processing apparatus 100 can recognize a so-called touch operation as in the case of a device equipped with a touch sensor even when an arbitrary surface that does not have a touch sensor function is the target surface.
  • the recognition unit 216 holds information indicating the current state in the RAM 202 every time a state transition occurs. Accordingly, in step S305, the recognition unit 216 performs determination with reference to information held in the RAM 202 as a result of processing the previous frame. If it is determined that the current state is the non-touch state (Yes in step S305), the process proceeds to step S306. If it is determined that the current state is not the non-touch state (No in step S305), the process proceeds to step S307. The case where it is not in a non-touch state is a case where it is in a touch state.
  • step S306 touch operation recognition processing is executed.
  • the touch operation is recognized based on the magnitude relationship between the distance between the operation body currently in the non-touch state and the touch target surface and the touch threshold.
  • the flowchart of FIG. 4A represents an example of a flow of processing executed by the recognition unit 216 in step S306.
  • the recognition unit 216 determines whether the distance acquired in step S304 is smaller than the touch threshold.
  • the touch threshold used for comparison is either the touch threshold held by the information processing apparatus 100 as an initial value, or the touch threshold determined as the second condition in step S310 described later.
  • If it is determined that the acquired distance is smaller than the touch threshold (Yes in step S401), the process proceeds to step S402. If it is not determined that the acquired distance is smaller than the touch threshold (No in step S401), the touch operation is not recognized, and the process returns to the flowchart of FIG. 3.
  • step S402 the recognition unit 216 holds information indicating that the state of the fingertip has transitioned from the non-touch state to the touch state in the RAM 202.
  • step S403 the recognition unit 216 notifies each functional unit that performs subsequent processing, such as the condition determination unit 215 and the display control unit 217, that the touch input has been started.
  • step S307 a release operation recognition process is executed.
  • the release operation recognition process a release operation is detected based on the magnitude relationship between the distance between the operating body in the touched state and the touch target surface and the release threshold.
  • the flowchart of FIG. 4B represents an example of a flow of processing executed by the recognition unit 216 in step S307.
  • the recognition unit 216 determines whether the distance acquired in step S304 is greater than a release threshold.
  • the release threshold value used for comparison is the release threshold value determined as the first condition in step S311 to be described later.
  • If it is determined that the acquired distance is greater than the release threshold (Yes in step S411), the process proceeds to step S412. If it is not determined that the acquired distance is greater than the release threshold (No in step S411), the release operation is not recognized, and the process returns to the flowchart of FIG. 3.
  • step S412 the recognition unit 216 holds information indicating that the state of the fingertip has transitioned from the touch state to the non-touch state in the RAM 202. At this time, the recognition unit 216 holds the current time in the RAM 202 as the time when the release operation was last recognized. In step S413, the recognition unit 216 notifies the functional units that perform subsequent processing, such as the condition determination unit 215 and the display control unit 217, that the touch input has been completed.
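  • Putting the two recognition processes together, a minimal sketch of the state transitions of steps S305, S401-S403, and S411-S413 might look as follows; the thresholds are supplied from outside because they are updated per frame by the threshold-determination processing of steps S310 and S311 described below (class and method names are illustrative, not from the document).

```python
class TouchStateMachine:
    """Sketch of the touch / non-touch state transitions (steps S305-S307)."""

    def __init__(self):
        self.in_touch = False                    # initial state is the non-touch state

    def update(self, distance, touch_threshold, release_threshold):
        """Process one frame; returns "touch", "release", or None.

        distance          : fingertip height acquired in step S304
        touch_threshold   : initial value or the second condition of step S310
        release_threshold : first condition determined in step S311
        """
        if not self.in_touch:
            if distance < touch_threshold:       # S401
                self.in_touch = True             # S402: transition to the touch state
                return "touch"                   # S403: notify that touch input started
        else:
            if distance > release_threshold:     # S411
                self.in_touch = False            # S412: transition to the non-touch state
                return "release"                 # S413: notify that touch input ended
        return None
```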
  • step S308 the display control unit 217 controls output to the projector 104 that is a display unit based on the recognition result of the recognition unit 216. For example, when the touch operation is recognized, the color of a part of the displayed image is changed according to the fingertip position detected in step S303. When the release operation is recognized, a part of the displayed image is enlarged according to the fingertip position.
  • step S309 the condition determination unit 215 refers to information representing the state of the fingertip held in the RAM 202, and determines whether or not it is currently in a non-touch state. Note that the determination in step S309 is based on the result of the recognition process for the touch operation and the release operation for the current frame, and thus the result is not necessarily the same as the determination in step S305. If it is determined that the current state is the non-touch state (Yes in step S309), the process proceeds to step S310. If it is determined that the current state is not the non-touch state (No in step S309), the process proceeds to step S311. In step S310, processing for determining a touch threshold value used in processing of the next frame is executed. In step S311, processing for determining a release threshold used in processing of the next frame is executed. The touch threshold value determination process in step S310 and the release threshold value determination process in step S311 will be described later.
  • FIG. 5 shows an example of the transition of the degree of proximity of the fingertip that continuously inputs the touch operation and the release operation to the touch target surface.
  • the vertical axis represents the z coordinate.
  • the horizontal axis is time.
  • The trajectory 501 in FIG. 5(a) is an ideal trajectory of the fingertip 500 when the user performs a touch operation by bringing the fingertip 500 of the index finger of the hand 106 into contact with the target surface 101 and then performs a release operation (this input corresponds to the tap operation described above). In the case of the ideal trajectory 501, as time progresses the z coordinate decreases uniformly and then increases after passing its lowest point.
  • However, the acquired fingertip position trajectory is not necessarily ideal. Actually, as shown in FIG. 5(b), it varies depending on individual differences or trial-to-trial differences in the movement of the user's hand and on the influence of the distance accuracy of the sensor.
  • the locus 502 is a locus when the fingertip 500 is not sufficiently separated after the touch operation.
  • the locus 503 is a locus when the fingertip and the target surface are not sufficiently close.
  • a locus 504 is a locus when z is detected as a negative value.
  • a trajectory 505 is a trajectory when a fine vertical movement has been detected as a result of the fluctuation of the value.
  • Suppose that a fixed z coordinate, indicated by the broken line 506, is used as the release threshold. Then the touch operation may not be detected in the first place, the release operation may not be recognized even though the touch operation was recognized, or the touch may erroneously be detected multiple times.
  • In this embodiment, therefore, the release threshold used for recognizing the release operation is not fixed, but is determined dynamically according to the transition of the degree of proximity between the fingertip and the target surface after the start of touch input.
  • In step S601, the condition determination unit 215 determines whether the distance acquired in step S304 is smaller than the first reference value held in the RAM 202.
  • a minimum value is selected as the first reference value in the transition of the distance between the operating tool and the touch target surface after the touch input is started.
  • When the first reference value is not yet held, for example immediately after the touch is recognized, step S601 is always determined as Yes. If it is determined that the acquired distance is smaller than the held first reference value (Yes in step S601), the process proceeds to step S602.
  • step S602 the condition determination unit 215 holds the distance acquired in step S304 as the first reference value.
  • step S603 the condition determination unit 215 determines a value obtained by adding a predetermined value ⁇ ( ⁇ > 0) to the first reference value held in the RAM 202 as a release threshold.
  • the determined release threshold is a first condition relating to the degree of proximity between the operating body and the target surface for recognizing the end of the current touch input.
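  • A sketch of steps S601 to S603, where the first reference value is the minimum fingertip height observed since the touch was recognized and the margin alpha corresponds to the predetermined value α chosen to absorb the sensor's distance error (function and variable names are illustrative):

```python
def update_release_threshold(distance, first_reference, alpha):
    """Steps S601-S603 (sketch): dynamic release threshold during touch input.

    distance        : fingertip height acquired in step S304 for the current frame
    first_reference : smallest height seen so far in this touch input
                      (None right after the touch is recognized)
    alpha           : margin added to the first reference value (alpha > 0)
    Returns (new_first_reference, release_threshold).
    """
    if first_reference is None or distance < first_reference:   # S601
        first_reference = distance                               # S602
    return first_reference, first_reference + alpha              # S603
```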
  • FIG. 7 shows an example of the release threshold determined by this embodiment.
  • the vertical axis represents the z coordinate and the horizontal axis represents time.
  • a broken line 700 is a touch threshold value given as an initial value (for example, the magnitude of the touch threshold value is represented by ⁇ > 0).
  • When the distance between the fingertip and the target surface falls below the touch threshold, the state transition of the fingertip to the touch state is detected. Thereby, it is recognized that touch input has started.
  • This time point is shown in the figure as a touch detection time point 701.
  • a line 702 represents the locus of the first reference value held for each frame.
  • a value obtained by adding a predetermined value ⁇ (width 703 in the z direction) to the first reference value (line 702) is determined as the release threshold value.
  • the determined release threshold is used for the determination process in step S411 when the release operation recognition process in step S307 is executed for the next frame.
  • a broken line 704 represents a release threshold locus determined for each frame.
  • the predetermined value ⁇ (the width 703 in the z direction) is determined so as to allow the magnitude of the distance error of the sensor, so that erroneous detection of the release operation can be suppressed. Since the distance error of the sensor may vary depending on the shape of the fingertip and the surface characteristics, it is desirable to refer to these in determining the predetermined value ⁇ .
  • the release operation can be recognized at an appropriate timing by dynamically setting the release threshold in this way. For example, in order to recognize a touch operation even when the operating body is not sufficiently close to the target surface as in the locus 503, it is necessary to set the touch threshold ⁇ to be large to some extent. Even in such a case, if the method for determining the release threshold according to the present embodiment is used, the release threshold does not become unnecessarily large.
  • FIG. 6B is a flowchart illustrating an example of a flow of touch threshold determination processing executed in the present embodiment.
  • the condition determination unit 215 determines whether this time is within a predetermined time after the release operation.
  • the process in step S611 is a process for surely returning the threshold value to the initial state when the user is changed, the application to be executed is changed, or when the apparatus is left in the activated state. Therefore, the predetermined time determined in step S611 is set as a sufficiently long time with respect to the average length of time for which one user performs continuous operation on the application. Depending on the environment, step S611 may be omitted.
  • In step S611, the condition determination unit 215 makes this determination using the time at which the release operation was last recognized, stored in the RAM 202 in step S412. If it is determined that the current time is within the predetermined time after the release operation (Yes in step S611), the process proceeds to step S613. If it is not determined that the current time is within the predetermined time after the release operation (No in step S611), the process proceeds to step S612. In this embodiment, when No is determined in step S611, the condition determination unit 215 deletes from the RAM 202 the information on the time at which the release operation was last recognized, which was stored in step S412.
  • When the process of step S611 is executed in a state where no touch operation or release operation has been recognized since the information processing apparatus 100 was activated (the initial state), the determination result is No.
  • step S612 the condition determination unit 215 determines the touch threshold value to be a value ⁇ ( ⁇ > 0) given as the initial value of the touch threshold value. Then, returning to the flowchart of FIG. 3, the series of processing ends.
  • step S613 the condition determination unit 215 determines whether the distance acquired in step S304 is larger than the second reference value held in the RAM 202.
  • the maximum value is selected as the second reference value in the transition of the distance between the operating body and the touch target surface in the non-touch state.
  • When the process of step S613 is performed for the first time and the information of the second reference value is not yet held in the RAM 202, the determination is always Yes.
  • step S614 the condition determination unit 215 holds the distance acquired in step S304 as the second reference value.
  • step S615 the condition determination unit 215 determines a value obtained by subtracting the predetermined value ⁇ ( ⁇ > 0) from the second reference value held in the RAM 202 as the touch threshold.
  • In step S616, the condition determination unit 215 determines whether the touch threshold determined in step S615 is greater than the value θ (θ > 0) given as the initial value of the touch threshold. If it is greater (Yes in step S616), the process proceeds to step S612, and the touch threshold is set to the initial value θ. If it is not greater (No in step S616), the process returns to the flowchart of FIG. 3.
  • the touch threshold value determined as described above in step S310 is a second condition regarding the degree of proximity between the operating body and the target surface for recognizing the start or restart of the next touch input.
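  • Similarly, steps S611 to S616 can be sketched as follows; theta is the initial touch threshold θ, beta the margin β for the sensor error, and the elapsed-time check of step S611 uses a timestamp stored when the release was recognized (all names and the default time limit are illustrative assumptions).

```python
import time

def update_touch_threshold(distance, second_reference, last_release_time,
                           theta, beta, reset_after_s=30.0):
    """Steps S611-S616 (sketch): touch threshold used for the next frame.

    distance          : fingertip height acquired in step S304
    second_reference  : largest height seen since the last release (None if unset)
    last_release_time : time.time() value stored when the release was recognized
    theta, beta       : initial touch threshold and margin (both > 0)
    reset_after_s     : placeholder for the "predetermined time" of step S611
    Returns (new_second_reference, touch_threshold).
    """
    # S611: long after the last release, fall back to the initial threshold.
    if last_release_time is None or time.time() - last_release_time > reset_after_s:
        return None, theta                                        # S612

    if second_reference is None or distance > second_reference:  # S613
        second_reference = distance                               # S614

    candidate = second_reference - beta                           # S615
    if candidate > theta:                                         # S616: cap at the initial value
        return second_reference, theta                            # S612
    return second_reference, candidate
```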
  • FIG. 8 shows an example of the touch threshold determined by the present embodiment.
  • FIG. 8 shows the continuation of the movement of the operating body shown in FIG. Therefore, the same elements as those in FIG. 7 are denoted by the same reference numerals, and detailed description thereof is omitted.
  • FIG. 8 illustrates an example in which touch input is started again after the touch operation and the release operation are recognized immediately before. Such a situation occurs, for example, when an operation called “double tap operation” that inputs a tap operation twice in succession is input.
  • the distance between the fingertip 500 in the touch state and the target surface 101 exceeds the release threshold value (broken line 704), and the state transitions to the non-touch state.
  • a line 800 represents a locus of the second reference value held for each frame.
  • a value obtained by subtracting the predetermined value ⁇ (the width 801 in the z direction) from the second reference value (line 800) is determined as the touch threshold value.
  • A broken line 802 represents the locus of the touch threshold determined for each frame.
  • For example, when the fingertip 500 moves along the locus 502, the touch operation is recognized at the time point 803 when the distance (z coordinate) between the fingertip 500 and the touch target surface 101 falls below the touch threshold (broken line 802).
  • Here, the trajectory 502 of the fingertip 500 approaches the touch target surface 101 again before rising above the threshold 700, which is the touch threshold given as the initial value. When the user tries to input touch and release repeatedly, as when inputting a double tap operation, the fingertip may move slowly and the height of the finger may not rise sufficiently after the release, as in this example.
  • Even in such a case, by determining the touch threshold again as described above, the touch operation can be recognized when the fingertip is brought close to the touch target surface again after the release operation, even before the fingertip has risen above the touch threshold θ given as the initial value.
  • the predetermined value ⁇ is determined so as to allow the distance error of the sensor, so that it is possible to suppress erroneous detection of the touch operation. Since the distance error of the sensor may vary depending on the shape of the fingertip and the surface characteristics, it is desirable to refer to these in determining the predetermined value ⁇ .
  • the release operation can be recognized at a timing according to the user's intention. Furthermore, according to the present embodiment, the touch operation input again immediately after the release operation can be easily recognized even when the user's finger moves slowly.
  • the direction and shape of the touch target surface are not limited to the upward plane.
  • this embodiment can be applied to a curved surface or a point as a touch target surface instead of a plane on the table.
  • a virtual surface viewed through HMD (Head Mount Display) or the like in the MR system may be used as the touch target surface.
  • the case where the fingertip does not sufficiently approach the target surface, or the case where the fingertip penetrates the defined touch target surface is likely to occur.
  • the release operation according to the user's intention can be easily recognized by dynamically determining the release threshold according to the present embodiment.
  • the distance when the fingertip is closest to the inner side of the virtual surface is held as the first reference value, and a release threshold value obtained by adding a predetermined value to the reference value is determined.
  • the touch target surface is a virtual surface, it is possible to reduce erroneous recognition of the touch operation that is input again after the release operation by determining the touch threshold according to the present embodiment.
  • the present invention is applicable even when the capacitance or pressure sensitive amount of the touch sensor, the contact area between the fingertip and the target surface, or the like is used as the degree of proximity for touch operation recognition instead of the distance.
  • When the degree of proximity between the fingertip and the target surface is acquired as the amount of change in capacitance caused by the fingertip approaching the touch target surface, a larger change amount means that the fingertip and the target surface are closer to each other.
  • In this case, the first reference value is specified based on the change amount at the time when the capacitance becomes maximum after the start of touch input. Similarly, when the degree of proximity is acquired as the amount of pressure applied to the touch target surface by the fingertip, a larger pressure amount means that the fingertip and the target surface are closer to each other, so when determining the release threshold, the first reference value is specified based on the information at the time when the pressure becomes maximum after the start of touch input. Likewise, when the degree of proximity is acquired as the contact area between the fingertip and the target surface, a larger contact area means that the fingertip and the target surface are pressed more closely against each other, and the first reference value is specified based on the information at the time when the contact area becomes maximum after the start of touch input. The method for specifying the second reference value in determining the touch threshold is similar.
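  • The same reference-value logic carries over to these measures if the sign convention is kept in mind: for capacitance change, pressure, or contact area, a larger value means closer, so the first reference value becomes a running maximum and the release condition is a drop below that maximum by a margin. The following is a hedged sketch of this generalization, not a procedure stated in the document.

```python
def update_release_condition_larger_is_closer(proximity, first_reference, margin):
    """Release condition when a larger proximity value means "closer".

    Applies to capacitance change, pressure amount, or contact area: the first
    reference value is the maximum observed since the touch started, and the
    release is recognized when the current value falls below that maximum by
    more than the margin.
    Returns (new_first_reference, release_recognized).
    """
    if first_reference is None or proximity > first_reference:
        first_reference = proximity
    return first_reference, proximity < first_reference - margin
```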
  • an operation of moving the operating body in a direction parallel to the touch target surface between the input of the touch operation and the input of the release operation is referred to as a move operation.
  • the operation body is not limited to the user's hand or finger.
  • the appearance of the interface system according to the modification and the configuration of the apparatus are the same as those of the first embodiment shown in FIGS. Therefore, detailed description of each element is omitted.
  • FIG. 2C is a block diagram illustrating an example of a functional configuration of the information processing apparatus 100 according to the modification.
  • Each functional unit is realized by the CPU 200 developing a program stored in the ROM 201 into the RAM 202 and executing processing according to each flowchart, as in the first embodiment. Hardware may also be used as an alternative to the software processing.
  • The same reference numerals are attached to the elements that are the same as in the first embodiment, and the following description focuses on the differences from the first embodiment.
  • the movement amount acquisition unit 220 acquires the movement distance of the fingertip position detected by the position detection unit 212 from the time when the fingertip state transitions from the non-touch state to the touch state.
  • The time acquisition unit 221 acquires, as the touch duration, the elapsed time from the time when the recognition unit 216 determines that the state of the fingertip has transitioned from the non-touch state to the touch state.
  • The condition determination unit 215 of the modification determines the first condition for recognizing the end of touch input using the touch duration acquired by the time acquisition unit 221 and the movement distance of the operating body (fingertip), acquired by the movement amount acquisition unit 220, from the time when the touch operation is recognized. Also in the modification, the first condition is defined by setting a release threshold relating to the degree of proximity of the operating body to the touch target surface.
  • FIG. 9A is a flowchart of the touch operation recognition process executed in step S306 of the modified example.
  • step S402 when the recognition unit 216 holds information indicating that the state of the fingertip has transitioned to the touch state in the RAM 202, the process proceeds to step S901.
  • step S901 the time acquisition unit 221 starts acquiring the touch duration time.
  • the touch continuation time is an elapsed time from the time when the state of the fingertip transitions from the non-touch state to the touch state.
  • the movement amount acquisition unit 220 starts acquiring the movement distance of the fingertip position detected in step S303.
  • Specifically, the coordinates of the fingertip position in the current frame are held as the touch start position and used as the reference for calculating the amount of movement of the fingertip position detected in subsequent frames.
  • FIG. 9B is a flowchart showing the release threshold value determination process executed in step S311 of the modification.
  • the time acquisition unit 221 determines whether or not the touch duration time that has started measurement in the process of step S901 exceeds a predetermined time. If it is determined that the touch duration has exceeded the predetermined time (Yes in step S911), the process proceeds to step S913. On the other hand, if it is not determined that the touch duration has exceeded the predetermined time (No in step S911), the process proceeds to step S912.
  • In step S912, the movement amount acquisition unit 220 acquires the distance between the touch start position held in step S902 and the fingertip position detected in step S303 as the movement distance in the touch state, and determines whether it exceeds a predetermined distance. If it is determined that the movement distance has exceeded the predetermined distance (Yes in step S912), the process proceeds to step S913. On the other hand, if it is not determined that the movement distance exceeds the predetermined distance (No in step S912), the process proceeds to step S601. In step S913, the condition determination unit 215 sets the release threshold to the predetermined value γ and ends the release threshold determination process. In the present embodiment, the predetermined value γ is assumed to be the same as the value θ given as the initial value of the touch threshold.
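  • A sketch of the modified release-threshold determination (steps S911-S913 combined with S601-S603): once the touch duration or the movement distance from the touch start position exceeds its limit, the fixed value γ is used instead of the dynamically determined threshold (function and parameter names are illustrative; the limits are placeholders supplied by the caller).

```python
def release_threshold_modified(distance, first_reference, alpha, gamma,
                               touch_duration, moved_distance,
                               max_duration, max_distance):
    """Modified release-threshold determination (sketch of Fig. 9(b)).

    touch_duration : elapsed time since the touch was recognized (step S901)
    moved_distance : distance of the fingertip from the touch start position (S902/S912)
    Returns (new_first_reference, release_threshold).
    """
    # S911 / S912: during a long touch or a large move, use the fixed value.
    if touch_duration > max_duration or moved_distance > max_distance:
        return first_reference, gamma                             # S913

    # Otherwise fall back to the dynamic determination of steps S601-S603.
    if first_reference is None or distance < first_reference:
        first_reference = distance
    return first_reference, first_reference + alpha
```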
  • FIG. 10 shows an example of the release threshold determined by the modification.
  • the vertical axis represents the z coordinate and the horizontal axis represents time.
  • the same elements as those shown in FIG. 7 are given the same numbers.
  • a broken line 700 is a touch threshold ⁇ ( ⁇ > 0) given as an initial value.
  • the fixed release threshold value ⁇ ( ⁇ > 0) set during the movement of the operating body in the touch state matches the threshold value ⁇ . Therefore, in the modified example, the broken line 700 also represents the release threshold set during the movement of the operating tool in the touch state.
  • The time point at which the distance between the fingertip 500 in the non-touch state and the target surface 101 falls below the touch threshold (broken line 700) and the fingertip 500 transitions to the touch state is shown in the figure as a touch detection time point 701.
  • At the touch detection time point 701, acquisition of the touch duration time and of the movement distance of the fingertip is started.
  • After the touch detection time point 701, every time a distance image is input according to the frame rate, the distance (z coordinate) between the fingertip 500 and the target surface 101 is acquired, and the minimum distance 702 observed so far is specified as the first reference value.
  • For each frame, a value obtained by adding a predetermined margin to the first reference value 702 is determined as the release threshold of that frame (indicated by a broken line 704 in the figure).
  • While the operating body moves in the touch state, the detected distance between the fingertip and the target surface fluctuates because of detection error, and may therefore frequently approach or exceed such a dynamically determined release threshold (broken line 704), which can lead to the end of the touch input being recognized erroneously.
  • In the modification, therefore, the release threshold is set to the fixed predetermined value (broken line 700) at the time point 900 when the touch duration exceeds the predetermined time or the movement distance of the fingertip exceeds the predetermined distance.
  • The release operation is then recognized at the time point 901, when the distance between the fingertip 500, whose movement is indicated by the locus 502, and the target surface 101 exceeds the release threshold indicated by the broken line 700.
  • As a result, the release operation can be recognized at a timing that matches the user's intention; in particular, misrecognition of a release while the operating body moves in the touch state can be reduced.
  • In the modification described above, the duration of the touch state and the movement distance of the fingertip are used as criteria for deciding to use a fixed release threshold instead of determining the threshold dynamically.
  • Alternatively, the movement speed of the fingertip or a change in the posture of the hand may be used as a criterion.
  • For example, the release threshold may be set to the fixed value when the movement speed of the operating body in the touch state reaches a predetermined speed.
  • The present invention can also be realized by a process in which a program that implements one or more functions of the above-described embodiments is supplied to a system or an apparatus via a network or a storage medium, and one or more processors in a computer of the system or apparatus read and execute the program. It can also be realized by a circuit (for example, an ASIC) that implements one or more functions.
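
The threshold selection in steps S911 to S913, together with the dynamic threshold illustrated in FIG. 10, can be summarized in the following Python sketch. It is illustrative only: the class name TouchTracker, the constants (TOUCH_THRESHOLD for the initial touch threshold, RELEASE_MARGIN for the margin added to the first reference value, FIXED_RELEASE for the fixed release value, and the duration/travel limits), and the per-frame update interface are assumptions introduced here for clarity, not identifiers or values taken from the embodiment.

```python
import time
from dataclasses import dataclass

# Illustrative constants (assumed, not from the embodiment); units follow
# whatever coordinate system the distance image provides.
TOUCH_THRESHOLD = 15.0      # initial touch threshold: touch starts below this z distance
RELEASE_MARGIN = 5.0        # margin added to the first reference value (minimum distance)
FIXED_RELEASE = 15.0        # fixed release value, assumed equal to the touch threshold
MAX_TOUCH_DURATION = 0.5    # predetermined time (s) before switching to the fixed threshold
MAX_TOUCH_TRAVEL = 20.0     # predetermined movement distance before switching

@dataclass
class TouchTracker:
    touching: bool = False
    touch_start_time: float = 0.0
    touch_start_xy: tuple = (0.0, 0.0)
    min_distance: float = float("inf")  # first reference value: minimum z since touch start

    def update(self, x: float, y: float, z: float, now: float = None):
        """Process one frame of the fingertip position; return 'touch', 'release', or None."""
        now = time.monotonic() if now is None else now

        if not self.touching:
            # Touch is recognized when the fingertip comes closer than the touch threshold;
            # the duration timer is started and the touch start position is held.
            if z < TOUCH_THRESHOLD:
                self.touching = True
                self.touch_start_time = now
                self.touch_start_xy = (x, y)
                self.min_distance = z
                return "touch"
            return None

        # Track the minimum distance observed in the touch state (first reference value).
        self.min_distance = min(self.min_distance, z)

        # Release-threshold determination: once the touch has lasted long enough or the
        # fingertip has moved far enough, fall back to the fixed release value;
        # otherwise use the dynamically determined threshold.
        duration = now - self.touch_start_time
        travel = ((x - self.touch_start_xy[0]) ** 2 +
                  (y - self.touch_start_xy[1]) ** 2) ** 0.5
        if duration > MAX_TOUCH_DURATION or travel > MAX_TOUCH_TRAVEL:
            release_threshold = FIXED_RELEASE
        else:
            release_threshold = self.min_distance + RELEASE_MARGIN

        # Release is recognized when the fingertip moves farther than the release threshold.
        if z > release_threshold:
            self.touching = False
            return "release"
        return None
```

The speed-based variant mentioned above could be sketched by replacing the duration/travel test with a check of the fingertip's frame-to-frame velocity; the rest of the logic would stay the same.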

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The purpose of the present invention is to reduce erroneous recognition of the end of a touch input, which can occur when the detection result of the degree of proximity between an operating body and a target surface contains an error. As means for achieving the above object, the present invention comprises: a distance acquisition unit (214) for acquiring information indicating a degree of proximity of an operating body to a predetermined surface; a reference determination unit (215) for determining a first reference value related to the degree of proximity on the basis of the transition of the degree of proximity after the start of a touch input on the predetermined surface with the operating body has been recognized; a condition determination unit (215) for determining a first condition, related to the degree of proximity, for recognizing the end of the touch input on the basis of the determined first reference value; and a recognition unit (217) for recognizing the end of the touch input in accordance with whether the degree of proximity indicated by information obtained by the distance acquisition unit (214) satisfies the first condition determined by the condition determination unit (215).
PCT/JP2015/073332 2015-08-20 2015-08-20 Dispositif de traitement d'informations, son procédé de commande, programme et support de stockage WO2017029749A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2017509073A JP6711817B2 (ja) 2015-08-20 2015-08-20 情報処理装置、その制御方法、プログラム、及び記憶媒体
PCT/JP2015/073332 WO2017029749A1 (fr) 2015-08-20 2015-08-20 Dispositif de traitement d'informations, son procédé de commande, programme et support de stockage
US15/237,182 US10156938B2 (en) 2015-08-20 2016-08-15 Information processing apparatus, method for controlling the same, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/073332 WO2017029749A1 (fr) 2015-08-20 2015-08-20 Dispositif de traitement d'informations, son procédé de commande, programme et support de stockage

Publications (1)

Publication Number Publication Date
WO2017029749A1 true WO2017029749A1 (fr) 2017-02-23

Family

ID=58050742

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/073332 WO2017029749A1 (fr) 2015-08-20 2015-08-20 Dispositif de traitement d'informations, son procédé de commande, programme et support de stockage

Country Status (3)

Country Link
US (1) US10156938B2 (fr)
JP (1) JP6711817B2 (fr)
WO (1) WO2017029749A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018156533A (ja) * 2017-03-21 2018-10-04 株式会社東海理化電機製作所 触覚呈示装置

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9880267B2 (en) * 2015-09-04 2018-01-30 Microvision, Inc. Hybrid data acquisition in scanned beam display
US9933851B2 (en) * 2016-02-22 2018-04-03 Disney Enterprises, Inc. Systems and methods for interacting with virtual objects using sensory feedback
US10891044B1 (en) * 2016-10-25 2021-01-12 Twitter, Inc. Automatic positioning of content items in a scrolling display for optimal viewing of the items
JP2018106535A (ja) * 2016-12-27 2018-07-05 ソニー株式会社 情報処理装置、情報処理方法及びコンピュータプログラム
US10671219B2 (en) 2018-02-12 2020-06-02 Microvision, Inc. Scanning time of flight 3D sensing with smart pulsing
US10474248B2 (en) * 2018-02-12 2019-11-12 Microvision, Inc. Smart pulsing in regions of interest in scanned beam 3D sensing systems
KR102469722B1 (ko) * 2018-09-21 2022-11-22 삼성전자주식회사 디스플레이 장치 및 그 제어 방법
US11288733B2 (en) * 2018-11-14 2022-03-29 Mastercard International Incorporated Interactive 3D image projection systems and methods

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013506905A (ja) * 2009-09-30 2013-02-28 フリースケール セミコンダクター インコーポレイテッド 静電容量式タッチセンサデバイス設定システムおよび方法
JP2013254331A (ja) * 2012-06-06 2013-12-19 Panasonic Corp 入力装置、入力支援方法及びプログラム

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4052498B2 (ja) 1999-10-29 2008-02-27 株式会社リコー 座標入力装置および方法
US8164573B2 (en) * 2003-11-26 2012-04-24 Immersion Corporation Systems and methods for adaptive interpretation of input from a touch-sensitive input device
US8381135B2 (en) * 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US7619616B2 (en) * 2004-12-21 2009-11-17 Microsoft Corporation Pressure sensitive controls
EP2104024B1 (fr) * 2008-03-20 2018-05-02 LG Electronics Inc. Terminal portable capable de détecter un toucher de proximité et procédé pour écran de contrôle l'utilisant
US8363019B2 (en) * 2008-05-26 2013-01-29 Lg Electronics Inc. Mobile terminal using proximity sensor and method of controlling the mobile terminal
KR20100136649A (ko) * 2009-06-19 2010-12-29 삼성전자주식회사 휴대단말기의 근접 센서를 이용한 사용자 인터페이스 구현 방법 및 장치
JP2011053971A (ja) * 2009-09-02 2011-03-17 Sony Corp 情報処理装置、情報処理方法およびプログラム

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013506905A (ja) * 2009-09-30 2013-02-28 フリースケール セミコンダクター インコーポレイテッド 静電容量式タッチセンサデバイス設定システムおよび方法
JP2013254331A (ja) * 2012-06-06 2013-12-19 Panasonic Corp 入力装置、入力支援方法及びプログラム

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018156533A (ja) * 2017-03-21 2018-10-04 株式会社東海理化電機製作所 触覚呈示装置

Also Published As

Publication number Publication date
US20170052632A1 (en) 2017-02-23
JP6711817B2 (ja) 2020-06-17
JPWO2017029749A1 (ja) 2018-06-07
US10156938B2 (en) 2018-12-18

Similar Documents

Publication Publication Date Title
WO2017029749A1 (fr) Dispositif de traitement d'informations, son procédé de commande, programme et support de stockage
US9916043B2 (en) Information processing apparatus for recognizing user operation based on an image
US9529527B2 (en) Information processing apparatus and control method, and recording medium
US9069386B2 (en) Gesture recognition device, method, program, and computer-readable medium upon which program is stored
JP6482196B2 (ja) 画像処理装置、その制御方法、プログラム、及び記憶媒体
JP2016520946A (ja) 人間対コンピュータの自然な3次元ハンドジェスチャベースのナビゲーション方法
WO2014106219A1 (fr) Interface centrée utilisateur pour une interaction avec un écran de visualisation qui reconnaît les intentions d'un utilisateur
US9880684B2 (en) Information processing apparatus, method for controlling information processing apparatus, and storage medium
US10346992B2 (en) Information processing apparatus, information processing method, and program
JP6452369B2 (ja) 情報処理装置とその制御方法、プログラム、記憶媒体
JP2015022624A (ja) 情報処理装置およびその制御方法、コンピュータプログラム、記憶媒体
GB2530150A (en) Information processing apparatus for detecting object from image, method for controlling the apparatus, and storage medium
US10379678B2 (en) Information processing device, operation detection method, and storage medium that determine the position of an operation object in a three-dimensional space based on a histogram
JP2017084307A (ja) 情報処理装置、その制御方法、プログラム、及び記憶媒体
JP5558899B2 (ja) 情報処理装置、その処理方法及びプログラム
JP6452658B2 (ja) 情報処理装置、およびその制御方法ならびにプログラム
JP6555958B2 (ja) 情報処理装置、その制御方法、プログラム、および記憶媒体
JP6579866B2 (ja) 情報処理装置とその制御方法、プログラム、記憶媒体
JP6618301B2 (ja) 情報処理装置、その制御方法、プログラム、及び記憶媒体
KR102107182B1 (ko) 손 제스처 인식 시스템 및 방법
JP6570376B2 (ja) 情報処理装置、その制御方法、プログラム、記憶媒体
US10175825B2 (en) Information processing apparatus, information processing method, and program for determining contact on the basis of a change in color of an image
JP2017228216A (ja) 情報処理装置、その制御方法、プログラム、及び記憶媒体
JP2017022590A (ja) 画像処理装置、画像処理装置の制御方法、及びプログラム
JP2016110517A (ja) 情報処理装置とその制御方法、プログラム、記憶媒体

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2017509073

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15901731

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15901731

Country of ref document: EP

Kind code of ref document: A1