WO2019171635A1 - Operation input device, operation input method, and computer-readable recording medium - Google Patents

Operation input device, operation input method, and computer-readable recording medium

Info

Publication number
WO2019171635A1
Authority
WO
WIPO (PCT)
Prior art keywords
operation input
sensor
aerial projection
projection plane
depth
Prior art date
Application number
PCT/JP2018/034490
Other languages
English (en)
Japanese (ja)
Inventor
夏美 鈴木
Original Assignee
Necソリューションイノベータ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Necソリューションイノベータ株式会社 filed Critical Necソリューションイノベータ株式会社
Priority to JP2020504657A priority Critical patent/JP6898021B2/ja
Priority to CN201880090820.0A priority patent/CN111886567B/zh
Publication of WO2019171635A1 publication Critical patent/WO2019171635A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means

Definitions

  • The present invention relates to an operation input device and an operation input method that enable an input operation by touching a screen displayed in the air, and further relates to a computer-readable recording medium on which a program for realizing these is recorded.
  • In Patent Document 1, an operation device has been proposed in which an operation screen is projected in the air, and a user can perform an input operation by touching the screen projected in the air (hereinafter referred to as the “aerial projection plane”).
  • The operation device proposed in Patent Document 1 includes a display device, an image imaging plate that forms an image of the screen of the display device in the air, a camera, a distance sensor, and a control unit.
  • The image imaging plate has a function of collecting the light emitted from an image at a position at the same distance on the opposite side, as viewed from the image imaging plate, so as to form the same image there (see, for example, Patent Document 2). For this reason, when the light emitted from the screen displayed on the display device passes through the image imaging plate, the screen is projected into the air.
  • The camera captures the aerial projection plane and the user's finger, and inputs the captured image to the control unit.
  • The distance sensor measures the distance to the user's fingertip and inputs the measured distance to the control unit.
  • The control unit first calculates the coordinates of the user's finger on the aerial projection plane by substituting the position of the user's fingertip in the captured image and the distance measured by the distance sensor into a conversion formula.
  • The conversion formula used at this time is determined in advance from the position coordinates of the aerial projection plane and camera information (position, angle of view, focal length, and so on).
  • The control unit then determines the overlap between the user's finger and the operation icon displayed on the aerial projection plane based on the calculated coordinates, and specifies the user's input operation based on the determination result.
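  • The conversion formula itself is not reproduced in this text. Purely as an illustration, the following Python sketch shows one way a fingertip pixel from the camera and a distance from the distance sensor could be mapped to coordinates on the aerial projection plane, assuming a pinhole camera model; every name, intrinsic parameter, and the plane placement below are hypothetical and not taken from Patent Document 1.

```python
# Hypothetical sketch only: maps a fingertip pixel (u, v) and a measured
# distance d to 2D coordinates on the aerial projection plane, assuming a
# pinhole camera whose intrinsics and plane placement were calibrated offline.
import numpy as np

def pixel_and_distance_to_plane(u, v, d, fx, fy, cx, cy,
                                plane_origin, plane_x_axis, plane_y_axis):
    """Back-project pixel (u, v) to the 3D point at range d, then express it
    in the 2D coordinate system of the aerial projection plane."""
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    ray /= np.linalg.norm(ray)              # unit ray through the pixel
    point = d * ray                         # 3D point at the measured distance
    rel = point - plane_origin              # position relative to the plane origin
    return float(rel @ plane_x_axis), float(rel @ plane_y_axis)

# Example with made-up intrinsics and a plane 0.4 m in front of the camera.
x, y = pixel_and_distance_to_plane(
    u=320, v=260, d=0.42,
    fx=600.0, fy=600.0, cx=320.0, cy=240.0,
    plane_origin=np.array([0.0, 0.0, 0.4]),
    plane_x_axis=np.array([1.0, 0.0, 0.0]),
    plane_y_axis=np.array([0.0, 1.0, 0.0]),
)
print(f"finger at ({x:.3f}, {y:.3f}) m on the aerial projection plane")
```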
  • Thus, the user can perform an input operation by touching a screen displayed in the air.
  • However, the device of Patent Document 1 has a problem: when the user's fingertip moves past the operation screen in the air, the coordinates of the contact position of the user's finger on the aerial projection plane can no longer be detected accurately.
  • This is because, in that case, the distance measured by the distance sensor becomes shorter than the distance at the contact position intended by the user.
  • For example, when the distance sensor is installed so that the distance to the center position of the aerial projection plane is the shortest, the detected coordinates of the user's finger on the aerial projection plane shift toward the center of the screen.
  • An example of an object of the present invention is to provide an operation input device, an operation input method, and a computer-readable recording medium that solve the above-described problem and suppress a decrease in the detection accuracy of a touch position caused by an erroneous operation when the user performs an input operation by touching a screen displayed in the air.
  • To achieve the above object, an operation input device according to one aspect of the present invention includes: a display device that displays an operation screen; an optical plate that projects the operation screen into the air to generate an aerial projection plane; a sensor device for detecting, in a three-dimensional space, the position of an object that contacts the aerial projection plane; and a control device that identifies an operation input performed on the operation screen. The sensor device outputs sensor data including information for identifying the two-dimensional coordinates of the object in a sensing area and the depth from the sensor device to the object.
  • The control device detects the position of the object on the aerial projection plane from the sensor data. Further, when it detects from the sensor data that a part of the object is located on the sensor device side of the aerial projection plane, the control device sets a figure surrounding that part of the object in the sensor data and corrects the position of the object on the aerial projection plane using the depth at the outer edge of the set figure.
  • To achieve the above object, an operation input method according to one aspect of the present invention is a method carried out in an operation input device that includes a display device that displays an operation screen, an optical plate that projects the operation screen into the air to generate an aerial projection plane, a sensor device for detecting, in a three-dimensional space, the position of an object that contacts the aerial projection plane, and a computer for specifying an operation input performed on the operation screen, the sensor device outputting sensor data including information for specifying the two-dimensional coordinates of the object in a sensing area and the depth from the sensor device to the object.
  • Similarly, a computer-readable recording medium according to one aspect of the present invention records a program for use in an operation input device having the same display device, optical plate, sensor device, and computer, the sensor device outputting the same sensor data.
  • In these aspects, the computer executes: (a) detecting the position of the object on the aerial projection plane from the sensor data; and (b) when detecting from the sensor data that a part of the object is located on the sensor device side of the aerial projection plane, setting a figure surrounding that part of the object in the sensor data.
  • FIG. 1 is a configuration diagram showing the configuration of the operation input device according to the embodiment of the present invention.
  • FIG. 2 is a diagram illustrating the function of the sensor device used in the operation input device according to the embodiment of the present invention.
  • FIG. 3 is a diagram illustrating an example of a sensing result of the sensor device provided in the operation input device according to the embodiment of the present invention.
  • FIG. 4 is a flowchart showing the operation of the operation input device according to the embodiment of the present invention.
  • FIG. 5 is a block diagram illustrating an example of a computer that implements the control device for the operation input device according to the embodiment of the present invention.
  • FIG. 1 is a configuration diagram showing the configuration of the operation input device according to the embodiment of the present invention.
  • the operation input device 100 in the present embodiment shown in FIG. 1 is a device that enables an input operation by touching a screen displayed in the air.
  • the operation input device 100 includes a display device 10, an optical plate 20, a sensor device 30, and a control device 40.
  • Display device 10 displays an operation screen for input.
  • the optical plate 20 projects the operation screen into the air and generates an aerial projection surface 21.
  • the sensor device 30 is a device for detecting the position of the object 50 in contact with the aerial projection plane 21 in a three-dimensional space.
  • the object 50 is a user's finger that performs an operation input.
  • the sensor device 30 is disposed on the back side of the aerial projection surface 21.
  • the sensor device 30 outputs sensor data including information for specifying the two-dimensional coordinates of the object 50 in the sensing area and the depth from the sensor device 30 to the object 50.
  • the control device 40 includes an operation input specifying unit 41, a figure setting unit 42, and a depth correction unit 43.
  • the operation input specifying unit 41 detects the position of the object 50 on the aerial projection plane 21 from the sensor data output by the sensor device 30. Then, the operation input specifying unit 41 specifies an operation input performed on the operation screen according to the detected position of the object 50.
  • the figure setting unit 42 detects that a part of the object 50 is located on the sensor device 30 side of the aerial projection surface 21 from the sensor data, the figure setting unit 42 sets a figure surrounding the part of the object in the sensor data. To do.
  • the depth correction unit 43 corrects the position of the object 50 on the aerial projection plane 21 using the depth of the outer edge of the graphic set by the graphic setting unit 42.
  • In this way, when the user's finger is positioned on the sensor device side of the aerial projection plane, the position of the finger is corrected. Therefore, according to the present embodiment, when the user performs an input operation by touching the operation screen displayed in the air, a situation in which the detection accuracy of the touch position is lowered by the user's erroneous operation is avoided.
  • FIG. 2 is a diagram illustrating the function of the sensor device used in the operation input device according to the embodiment of the present invention.
  • FIG. 3 is a diagram illustrating an example of a sensing result of the sensor device provided in the operation input device according to the embodiment of the present invention.
  • the display device 10 is a liquid crystal display device or the like.
  • As the optical plate 20, the image imaging plate disclosed in Patent Document 2 described above is used.
  • The optical plate 20 has a function of collecting the light emitted from the image displayed on the screen of the display device 10 at a position at the same distance on the opposite side, as viewed from the plate, so as to form the same image there.
  • In the present embodiment, the sensor device 30 is a depth sensor.
  • When sensing, the depth sensor outputs, as sensor data, image data of the sensed image together with a depth added to each pixel of the image.
  • As the sensor device 30, a device including a camera and a distance sensor may also be used.
  • From the image data included in the sensor data, the two-dimensional coordinates of the object 50 in the sensing area, that is, its position on the image, can be specified.
  • In addition, from the depth added to each pixel, the distance between the object 50 and the sensor device 30 in the Z-axis direction can be specified.
  • the Z axis is an axis along the normal line of the sensing surface of the sensor device 30.
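  • As a concrete illustration of this sensor data, the short sketch below treats a frame as a depth image and picks out the pixel closest to the sensor device 30, i.e. the most distal portion of the object 50. The array layout, the invalid-pixel convention, and the helper name are assumptions made only for this example.

```python
# Illustrative only: the sensor data is modelled as a depth image in which each
# pixel carries the depth from the sensor device 30 along the Z axis. The pixel
# with the smallest depth is the part of the object 50 nearest the sensor.
import numpy as np

def most_distal_pixel(depth_image, invalid=0.0):
    """Return (h, v, depth) of the valid pixel closest to the sensor."""
    depth = depth_image.astype(float).copy()
    depth[depth == invalid] = np.inf        # ignore pixels with no measurement
    v, h = np.unravel_index(np.argmin(depth), depth.shape)
    return int(h), int(v), float(depth[v, h])

# Toy 4x4 frame in metres; 0.0 marks pixels where nothing was sensed.
frame = np.array([
    [0.00, 0.90, 0.91, 0.00],
    [0.88, 0.52, 0.50, 0.87],
    [0.89, 0.51, 0.49, 0.86],
    [0.00, 0.85, 0.84, 0.00],
])
print(most_distal_pixel(frame))             # -> (2, 2, 0.49)
```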
  • When the operation input specifying unit 41 receives sensor data, it specifies, from the image data and the depth added to each pixel of the image, the position in the three-dimensional space of the most distal (sensor device 30 side) portion of the object 50. Then, the operation input specifying unit 41 converts the specified position into a position on the aerial projection plane 21.
  • Specifically, the operation input specifying unit 41 first extracts the most distal portion of the object 50.
  • Then, the operation input specifying unit 41 substitutes the position of the most distal portion of the object 50 in the three-dimensional space into a conversion formula obtained from the positional relationship between the sensor device 30 and the aerial projection plane 21, thereby detecting the position of the object 50 in the vertical direction of the aerial projection plane 21 and its position in the horizontal direction of the aerial projection plane 21 (see the sketch below).
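  • The text does not state the conversion formula explicitly; the sketch below assumes that the positional relationship between the sensor device 30 and the aerial projection plane 21 is available as a calibrated rigid transform and that the depth sensor has pinhole-like intrinsics. All names and numbers are placeholders.

```python
# Hedged sketch of the conversion step: a sensor-data point (h, v, depth) is
# turned into a 3D point in the sensor frame and then expressed in the frame of
# the aerial projection plane 21 (z = 0 on the plane, z < 0 beyond it).
import numpy as np

def sensor_point_to_plane(h, v, depth, fx, fy, ch, cv, T_plane_from_sensor):
    """Return (x, y, z) of the point in the aerial-projection-plane frame."""
    p_sensor = np.array([(h - ch) / fx * depth,
                         (v - cv) / fy * depth,
                         depth,
                         1.0])
    x, y, z, _ = T_plane_from_sensor @ p_sensor
    return float(x), float(y), float(z)

# Hypothetical calibration: the plane frame sits 0.4 m in front of the sensor.
T = np.eye(4)
T[2, 3] = -0.4
print(sensor_point_to_plane(h=330, v=250, depth=0.38,
                            fx=580.0, fy=580.0, ch=320.0, cv=240.0,
                            T_plane_from_sensor=T))
# z < 0 here, i.e. the fingertip is already past the plane towards the sensor.
```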
  • When a part of the object 50 is located on the sensor device 30 side of the aerial projection plane 21, the figure setting unit 42 detects this from the sensor data. Specifically, the figure setting unit 42 determines, from the sensor data, whether or not the Z coordinate (depth) of the most distal portion of the object 50 is equal to or greater than a threshold value. If, as a result of the determination, the Z coordinate is not equal to or greater than the threshold value (that is, it is less than the threshold value), the figure setting unit 42 determines that a part of the object 50 is located on the sensor device 30 side of the aerial projection plane 21.
  • The threshold value is set according to the position of the object 50. For example, in FIG. 2, the threshold value when the object 50 is on the upper side of the aerial projection plane 21 is smaller than the threshold value when the object 50 is on the lower side of the aerial projection plane 21; a simple sketch of such a position-dependent threshold follows.
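```python
# Illustrative assumption only: the threshold for a sensor row v is taken to be
# the depth at which the aerial projection plane 21 lies for that row,
# interpolated between calibrated depths at the top and bottom of the sensing
# area. The patent states only that the threshold depends on the position of
# the object 50; this particular formula and the numbers are hypothetical.
def plane_threshold(v, v_top, v_bottom, depth_at_top, depth_at_bottom):
    """Depth of the aerial projection plane for sensor row v."""
    t = (v - v_top) / (v_bottom - v_top)
    return depth_at_top + t * (depth_at_bottom - depth_at_top)

# Plane tilted relative to the sensor: closer (smaller depth) towards the top.
print(plane_threshold(v=120, v_top=0, v_bottom=480,
                      depth_at_top=0.35, depth_at_bottom=0.50))  # -> 0.3875
```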
  • In this case, the figure setting unit 42 sets a rectangle that surrounds the most distal portion of the object 50 in the sensor data.
  • The rectangle is set in this way because, although the X-axis direction of the aerial projection plane 21 matches the H-axis direction of the sensor data, the Y-axis direction of the aerial projection plane 21 does not match the V-axis direction of the sensor data. Consequently, if a part of the object 50 is positioned on the sensor device 30 side of the aerial projection plane 21, an error occurs in the position in the Y-axis direction.
  • Therefore, the depth correction unit 43 uses the depth at one of the two sides of the set rectangle that lie at the ends in the V-axis direction to correct the position, detected by the operation input specifying unit 41, of the object 50 in the vertical direction of the aerial projection plane 21.
  • Which of these two sides is used is determined according to the position of the object 50. For example, when the object 50 is below, in the vertical direction, the intersection of the normal line passing through the center of the sensing surface of the sensor device 30 with the aerial projection plane 21, the correction is performed using the depth of the upper side in the V-axis direction. On the other hand, when the object 50 is above the intersection in the vertical direction, the correction is performed using the depth of the lower side in the V-axis direction.
  • Depending on the positional relationship, a side on the H-axis direction side may be used instead, or a figure other than a rectangle may be set. Furthermore, according to these positional relationships, only the position of the object 50 in the horizontal direction of the aerial projection plane 21 may be corrected, or the positions in both the vertical and horizontal directions may be corrected. A sketch of the rectangle setting and edge-depth correction follows.
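```python
# Illustrative sketch combining the roles of the figure setting unit 42 and the
# depth correction unit 43: pixels whose depth falls below the threshold are
# enclosed in an axis-aligned rectangle, and the depth along one V-direction
# side of that rectangle replaces the fingertip depth. Using the mean edge
# depth as the corrected value is an assumption of this example; the patent
# only says that the depth at the outer edge of the figure is used.
import numpy as np

def bounding_rectangle(mask):
    """Axis-aligned rectangle (v_min, v_max, h_min, h_max) around True pixels."""
    vs, hs = np.nonzero(mask)
    return vs.min(), vs.max(), hs.min(), hs.max()

def corrected_depth(depth_image, threshold, object_below_center):
    """If part of the object is past the plane (depth < threshold), return a
    depth taken from one V-direction side of the surrounding rectangle."""
    intruding = depth_image < threshold            # pixels past the aerial plane
    if not intruding.any():
        return float(depth_image.min())            # nothing to correct
    v_min, v_max, h_min, h_max = bounding_rectangle(intruding)
    # Object below the centre of the plane -> use the upper side, else the lower.
    edge_row = v_min if object_below_center else v_max
    edge = depth_image[edge_row, h_min:h_max + 1]
    return float(edge.mean())                      # representative edge depth

depths = np.array([
    [0.41, 0.40, 0.41, 0.42],
    [0.40, 0.36, 0.35, 0.41],
    [0.41, 0.34, 0.33, 0.40],
    [0.42, 0.35, 0.34, 0.41],
])
print(corrected_depth(depths, threshold=0.38, object_below_center=True))  # 0.355
```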
  • FIG. 4 is a flowchart showing the operation of the operation input device according to the embodiment of the present invention.
  • FIGS. 1 to 3 are referred to as appropriate.
  • the operation input method is implemented by operating the operation input device 100. Therefore, the description of the operation input method in the present embodiment is replaced with the following description of the operation of the operation input device 100.
  • When the sensor device 30 outputs sensor data, the control device 40 receives the sensor data and executes the following processing.
  • First, the operation input specifying unit 41 extracts the most distal (sensor device 30 side) portion of the object 50 from the image data included in the sensor data and the depth added to each pixel (step A1).
  • Next, the operation input specifying unit 41 specifies the position in the three-dimensional space of the portion extracted in step A1, and detects, from the specified position, the position on the aerial projection plane 21 of the most distal portion of the object 50 (step A2).
  • Specifically, the operation input specifying unit 41 specifies, from the sensor data, the position of the extracted portion in the three-dimensional space, that is, its coordinate on the X axis, its coordinate on the Y axis, and its depth. Then, the operation input specifying unit 41 substitutes the specified three-dimensional position into the conversion formula obtained from the positional relationship, and detects the position of the most distal portion of the object 50 in the vertical direction of the aerial projection plane 21 and its position in the horizontal direction of the aerial projection plane 21.
  • Next, the figure setting unit 42 determines, from the sensor data, whether or not the Z coordinate (depth) of the most distal portion of the object 50 is equal to or greater than the threshold value (step A3).
  • If the result of the determination in step A3 is that the Z coordinate is equal to or greater than the threshold value, the figure setting unit 42 notifies the operation input specifying unit 41 of this fact. The operation input specifying unit 41 then specifies the user's operation input based on the position detected in step A2 (step A4).
  • If, on the other hand, the result of the determination in step A3 is that the Z coordinate is not equal to or greater than the threshold value (i.e., it is smaller than the threshold value), the most distal portion of the object 50 is located on the sensor device 30 side of the aerial projection plane 21. Therefore, as shown in FIG. 3, the figure setting unit 42 sets a rectangle that surrounds the most distal portion of the object 50 in the sensor data (step A5).
  • Next, the depth correction unit 43 selects one side of the set rectangle according to the position of the object 50, and uses the depth at the selected side to correct the position on the aerial projection plane 21 of the most distal portion of the object 50 detected in step A2 (step A6).
  • After step A6, the depth correction unit 43 notifies the operation input specifying unit 41 that the position has been corrected.
  • The operation input specifying unit 41 then specifies the user's operation input based on the corrected position. A compact sketch of this whole flow is given below.
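  • Putting steps A1 to A6 together, a purely illustrative sketch of the flow of FIG. 4 follows. The reduction of the plane conversion to a single depth-to-vertical-position mapping, the averaging of the edge depth, and all numeric values are assumptions; only the order of the steps follows the text.

```python
# Illustrative end-to-end flow over one depth frame (steps A1-A6 of FIG. 4).
import numpy as np

def process_frame(depth, threshold, depth_to_y, object_below_center=True):
    # A1: extract the most distal (sensor-side) portion of the object.
    v, h = np.unravel_index(np.argmin(depth), depth.shape)
    tip_depth = float(depth[v, h])
    # A2: detect its position on the aerial projection plane.
    y = depth_to_y(tip_depth)
    # A3: is the tip still on the user side of the plane?
    if tip_depth >= threshold:
        return ("operation input at", y)           # A4: specify the input as-is
    # A5: rectangle around the region that crossed the plane.
    vs, hs = np.nonzero(depth < threshold)
    v_min, v_max, h_min, h_max = vs.min(), vs.max(), hs.min(), hs.max()
    # A6: correct the vertical position with the depth along one V-side.
    edge_row = v_min if object_below_center else v_max
    corrected = depth_to_y(float(depth[edge_row, h_min:h_max + 1].mean()))
    return ("operation input at", corrected)

frame = np.array([[0.41, 0.40, 0.41],
                  [0.40, 0.35, 0.36],
                  [0.41, 0.33, 0.34]])
print(process_frame(frame, threshold=0.38, depth_to_y=lambda d: round(10 * d, 2)))
```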
  • As described above, in the present embodiment, when a part of the object 50 is positioned on the sensor device 30 side of the aerial projection plane 21, the position of the object 50 on the aerial projection plane 21 is corrected. Therefore, according to the present embodiment, when the user performs an input operation by touching the operation screen on the aerial projection plane 21, a decrease in the detection accuracy of the touch position is suppressed even if the user performs an erroneous operation.
  • The program in the present embodiment may be a program that causes a computer to execute steps A1 to A6 shown in FIG. 4.
  • By installing this program in a computer and executing it, the control device 40 of the operation input device 100 in the present embodiment can be realized.
  • In this case, the processor of the computer functions as the operation input specifying unit 41, the figure setting unit 42, and the depth correction unit 43 to perform the processing.
  • The program in the present embodiment may also be executed by a plurality of computers; in this case, each computer may function as any one of the operation input specifying unit 41, the figure setting unit 42, and the depth correction unit 43.
  • FIG. 5 is a block diagram illustrating an example of a computer that implements the control device for the operation input device according to the embodiment of the present invention.
  • The computer 110 includes a CPU (Central Processing Unit) 111, a main memory 112, a storage device 113, an input interface 114, a display controller 115, a data reader/writer 116, and a communication interface 117. These units are connected to each other via a bus 121 so that data communication is possible.
  • The computer 110 may include a GPU (Graphics Processing Unit) or an FPGA (Field-Programmable Gate Array) in addition to or instead of the CPU 111.
  • the CPU 111 performs various operations by developing the program (code) in the present embodiment stored in the storage device 113 in the main memory 112 and executing them in a predetermined order.
  • the main memory 112 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory).
  • the program in the present embodiment is provided in a state of being stored in a computer-readable recording medium 120. Note that the program in the present embodiment may be distributed on the Internet connected via the communication interface 117.
  • the storage device 113 includes a hard disk drive and a semiconductor storage device such as a flash memory.
  • the input interface 114 mediates data transmission between the CPU 111 and an input device 118 such as a keyboard and a mouse.
  • the display controller 115 is connected to the display device 119 and controls display on the display device 119.
  • the data reader / writer 116 mediates data transmission between the CPU 111 and the recording medium 120, and reads a program from the recording medium 120 and writes a processing result in the computer 110 to the recording medium 120.
  • the communication interface 117 mediates data transmission between the CPU 111 and another computer.
  • Specific examples of the recording medium 120 include general-purpose semiconductor storage devices such as CF (Compact Flash (registered trademark)) and SD (Secure Digital), magnetic recording media such as a flexible disk, and optical recording media such as a CD-ROM (Compact Disk Read Only Memory).
  • The control device 40 in the present embodiment can be realized not only by using a computer in which the program is installed but also by using hardware corresponding to each unit. Furthermore, a part of the control device 40 may be realized by the program, and the remaining part may be realized by hardware.
  • An operation input device including: a display device that displays an operation screen; an optical plate that projects the operation screen into the air to generate an aerial projection plane; a sensor device for detecting, in a three-dimensional space, the position of an object that contacts the aerial projection plane; and a control device that identifies an operation input performed on the operation screen, wherein the sensor device outputs sensor data including information for identifying the two-dimensional coordinates of the object in a sensing area and the depth from the sensor device to the object, and the control device detects the position of the object on the aerial projection plane from the sensor data, and further, when detecting from the sensor data that a part of the object is located on the sensor device side of the aerial projection plane, sets a figure surrounding that part of the object in the sensor data and corrects the position of the object on the aerial projection plane using the depth at the outer edge of the set figure.
  • (Appendix 4) The operation input device according to any one of Appendices 1 to 3, wherein the control device determines that a part of the object is located on the sensor device side of the aerial projection plane when, in the sensor data, the depth of the portion of the object closest to the sensor device is smaller than a threshold value.
  • The operation input device according to any one of Appendices 1 to 4, wherein the sensor device is a depth sensor.
  • An operation input method carried out in an operation input device that includes a display device that displays an operation screen, an optical plate that projects the operation screen into the air to generate an aerial projection plane, a sensor device for detecting, in a three-dimensional space, the position of an object that contacts the aerial projection plane, and a computer for specifying an operation input performed on the operation screen, the sensor device outputting sensor data including information for specifying the two-dimensional coordinates of the object in a sensing area and the depth from the sensor device to the object, the method including: (a) detecting, by the computer, the position of the object on the aerial projection plane from the sensor data; and (b) when the computer detects from the sensor data that a part of the object is located on the sensor device side of the aerial projection plane, setting a figure surrounding that part of the object in the sensor data.
  • A computer-readable recording medium recording a program for use in an operation input device that includes a display device that displays an operation screen, an optical plate that projects the operation screen into the air to generate an aerial projection plane, a sensor device for detecting, in a three-dimensional space, the position of an object that contacts the aerial projection plane, and a computer for specifying an operation input performed on the operation screen, the sensor device outputting sensor data including information for specifying the two-dimensional coordinates of the object in a sensing area and the depth from the sensor device to the object, the program including instructions that cause the computer to carry out: (a) detecting the position of the object on the aerial projection plane from the sensor data; and (b) when detecting from the sensor data that a part of the object is located on the sensor device side of the aerial projection plane, setting a figure surrounding that part of the object in the sensor data.
  • (Appendix 15) The computer-readable recording medium according to any one of Appendices 11 to 14, wherein the sensor device is a depth sensor.
  • According to the present invention, when a user performs an input operation by touching a screen displayed in the air, it is possible to suppress a decrease in the detection accuracy of the touch position due to the user's erroneous operation.
  • the present invention is useful in various devices that perform input on an aerial projection plane.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention relates to an operation input device 100 comprising a display device 10 for displaying an operation screen, an optical plate 20 for projecting the operation screen into the air to generate an aerial projection plane 21, a sensor device 30, and a control device for identifying an operation input. The sensor device 30 outputs sensor data that include information for identifying the two-dimensional coordinates of an object in a sensing area and the depth to the object 50. From the sensor data, the control device 40 detects the position of the object 50 on the aerial projection plane 21, and when a part of the object is positioned on the sensor device side of the aerial projection plane 21, the control device sets, in the sensor data, a figure surrounding the part of the object 50 and uses the depth of the outer edge of the figure to correct the position of the object 50 on the aerial projection plane 21.
PCT/JP2018/034490 2018-03-07 2018-09-18 Dispositif d'entrée d'opération, procédé d'entrée d'opération, et support d'enregistrement lisible par ordinateur WO2019171635A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020504657A JP6898021B2 (ja) 2018-03-07 2018-09-18 操作入力装置、操作入力方法、及びプログラム
CN201880090820.0A CN111886567B (zh) 2018-03-07 2018-09-18 操作输入装置、操作输入方法及计算机可读的记录介质

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-041162 2018-03-07
JP2018041162 2018-03-07

Publications (1)

Publication Number Publication Date
WO2019171635A1 true WO2019171635A1 (fr) 2019-09-12

Family

ID=67846002

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/034490 WO2019171635A1 (fr) 2018-03-07 2018-09-18 Dispositif d'entrée d'opération, procédé d'entrée d'opération, et support d'enregistrement lisible par ordinateur

Country Status (3)

Country Link
JP (1) JP6898021B2 (fr)
CN (1) CN111886567B (fr)
WO (1) WO2019171635A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023016352A1 (fr) * 2021-08-13 2023-02-16 安徽省东超科技有限公司 Procédé de détection de positionnement, appareil de détection de positionnement et dispositif terminal d'entrée

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003131785A (ja) * 2001-10-22 2003-05-09 Toshiba Corp インタフェース装置および操作制御方法およびプログラム製品
JP2014179072A (ja) * 2013-03-14 2014-09-25 Honda Motor Co Ltd 三次元指先トラッキング
JP2015060296A (ja) * 2013-09-17 2015-03-30 船井電機株式会社 空間座標特定装置

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009131128A1 (fr) * 2008-04-22 2009-10-29 Fujishima Tomohiko Dispositif d’imagerie optique et procede d’imagerie optique mettant en œuvre ce dispositif
JP5353596B2 (ja) * 2009-09-18 2013-11-27 セイコーエプソン株式会社 投写型表示装置、キーストン補正方法
JP5560771B2 (ja) * 2010-02-26 2014-07-30 セイコーエプソン株式会社 画像補正装置、画像表示システム、画像補正方法
JP2012073659A (ja) * 2010-09-01 2012-04-12 Shinsedai Kk 操作判定装置、指先検出装置、操作判定方法、指先検出方法、操作判定プログラム、及び、指先検出プログラム
JP5197777B2 (ja) * 2011-02-01 2013-05-15 株式会社東芝 インターフェイス装置、方法、およびプログラム
JPWO2012173001A1 (ja) * 2011-06-13 2015-02-23 シチズンホールディングス株式会社 情報入力装置
WO2013124901A1 (fr) * 2012-02-24 2013-08-29 日立コンシューマエレクトロニクス株式会社 Appareil d'affichage de type à projection optique, terminal portatif et programme
KR102517425B1 (ko) * 2013-06-27 2023-03-31 아이사이트 모빌 테크놀로지 엘티디 디지털 디바이스와 상호작용을 위한 다이렉트 포인팅 검출 시스템 및 방법
WO2015092905A1 (fr) * 2013-12-19 2015-06-25 日立マクセル株式会社 Dispositif d'affichage d'image de projection et procédé d'affichage d'image de projection
CN103677274B (zh) * 2013-12-24 2016-08-24 广东威创视讯科技股份有限公司 一种基于主动视觉的互动投影方法及系统
JP6510213B2 (ja) * 2014-02-18 2019-05-08 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America 投影システム、半導体集積回路、および画像補正方法
CN104375638A (zh) * 2014-07-17 2015-02-25 深圳市钛客科技有限公司 感知设备、移动终端及空中感知系统
CN104375639A (zh) * 2014-07-17 2015-02-25 深圳市钛客科技有限公司 一种空中感应设备
US10359883B2 (en) * 2014-12-26 2019-07-23 Nikon Corporation Detection device, electronic apparatus, detection method and program
WO2016121708A1 (fr) * 2015-01-26 2016-08-04 Necソリューションイノベータ株式会社 Système d'entrée, dispositif d'entrée, procédé d'entrée et support d'enregistrement
JP2017062709A (ja) * 2015-09-25 2017-03-30 新光商事株式会社 ジェスチャー操作装置

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003131785A (ja) * 2001-10-22 2003-05-09 Toshiba Corp インタフェース装置および操作制御方法およびプログラム製品
JP2014179072A (ja) * 2013-03-14 2014-09-25 Honda Motor Co Ltd 三次元指先トラッキング
JP2015060296A (ja) * 2013-09-17 2015-03-30 船井電機株式会社 空間座標特定装置

Also Published As

Publication number Publication date
JP6898021B2 (ja) 2021-07-07
CN111886567A (zh) 2020-11-03
CN111886567B (zh) 2023-10-20
JPWO2019171635A1 (ja) 2021-02-12

Similar Documents

Publication Publication Date Title
JP4820285B2 (ja) 自動位置合わせタッチシステムおよび方法
US10254893B2 (en) Operating apparatus, control method therefor, and storage medium storing program
US10042426B2 (en) Information processing apparatus for detecting object from image, method for controlling the apparatus, and storage medium
JP2019215811A (ja) 投影システム、画像処理装置および投影方法
WO2020027818A1 (fr) Détermination de l'emplacement d'un contact sur des surfaces tactiles
US20190325593A1 (en) Image processing apparatus, system, method of manufacturing article, image processing method, and non-transitory computer-readable storage medium
TW201621454A (zh) 投影對準技術
US10146331B2 (en) Information processing system for transforming coordinates of a position designated by a pointer in a virtual image to world coordinates, information processing apparatus, and method of transforming coordinates
WO2019171635A1 (fr) Dispositif d'entrée d'opération, procédé d'entrée d'opération, et support d'enregistrement lisible par ordinateur
EP3032380B1 (fr) Appareil de projection d'image et système utilisant une capacité d'entrée-sortie interactive
JP6065433B2 (ja) 投影装置、投影システム、プログラム
JP6555958B2 (ja) 情報処理装置、その制御方法、プログラム、および記憶媒体
JP6417939B2 (ja) 手書きシステム及びプログラム
US9483125B2 (en) Position information obtaining device and method, and image display system
US20140201687A1 (en) Information processing apparatus and method of controlling information processing apparatus
JP2018063555A (ja) 情報処理装置、情報処理方法及びプログラム
JP2016153996A (ja) 座標取得システム、ディスプレイ装置、座標取得方法およびプログラム
JP7452917B2 (ja) 操作入力装置、操作入力方法及びプログラム
US10373324B2 (en) Measurement apparatus that scans original, method of controlling the same, and storage medium
US10044904B2 (en) Apparatus, image reading method, and storage medium for reading an image of a target object
TWI522871B (zh) 光學觸控系統之物件影像之處理方法
EP3059664A1 (fr) Procédé pour commander un dispositif par des gestes et système permettant de commander un dispositif par des gestes
US20240129443A1 (en) Display method, projector, and non-transitory computer-readable storage medium storing program
WO2021258506A1 (fr) Procédé et appareil tactiles de sous-zone, dispositif électronique et support d'enregistrement
CN117952928A (zh) 图像处理方法及装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18908818

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020504657

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18908818

Country of ref document: EP

Kind code of ref document: A1