WO2016121708A1 - Input system, input device, input method, recording medium - Google Patents

Input system, input device, input method, recording medium Download PDF

Info

Publication number
WO2016121708A1
Authority
WO
WIPO (PCT)
Prior art keywords
contact
angle
target surface
operation screen
projection target
Prior art date
Application number
PCT/JP2016/052054
Other languages
French (fr)
Japanese (ja)
Inventor
竜太郎 谷村
Original Assignee
Necソリューションイノベータ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Necソリューションイノベータ株式会社
Priority to JP2016572027A (JPWO2016121708A1)
Publication of WO2016121708A1

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus

Definitions

  • The present invention relates to an input system, an input device, an input method, and a recording medium.
  • Patent Document 1 discloses an input system that includes a monitor device arranged in front of an operator, and an input device that inputs data (including numerical values and instructions) in accordance with the user's operation on a virtual plane (operation surface) arranged between the operator and the monitor device.
  • Patent Document 2 discloses an information processing apparatus that displays an image of an operation screen on the upper surface of a desk, generates a distance image whose pixel values indicate the distance to a detection target, determines from the distance image that an object has contacted the upper surface of the desk, and inputs data corresponding to the contact position.
  • Patent Document 3 discloses an input device that displays a virtual operation surface three-dimensionally, identifies the positional relationship between the virtual operation surface and the user's hand or fingers, and inputs data based on the identified positional relationship.
  • In this input device, the placement of the virtual operation surface is determined based on the position of a preset, specific part of the operator, such as the position of a joint of the operator.
  • The technique of Patent Document 3 identifies the operator's posture only from the position of a specific part of the operator, so the operator's actual posture may not be sufficiently determined. As a result, the placement of the operation surface may not suit the operator's posture, and the operator may find the screen difficult to operate.
  • The present invention has been made in view of the above circumstances, and an object thereof is to provide an input system with high operability.
  • An input system according to a first aspect includes: an input device having contact detection means for detecting contact of an object with a projection target surface of an operation screen on a target object, angle detection means for detecting the contact elevation angle and contact azimuth angle, relative to the projection target surface, of the object in contact with the projection target surface, and display control means for transmitting to a projection device, based on the contact elevation angle and contact azimuth angle detected by the angle detection means, instruction information for displaying the operation screen at a predetermined position referenced to the position at which the object contacts the projection target surface; and the projection device, which projects the operation screen onto the target object in accordance with the instruction information received from the input device.
  • An input device according to a second aspect is an input device connected to a projection device, and includes: contact detection means for detecting contact of an object with a projection target surface of an operation screen on a target object; angle detection means for detecting the contact elevation angle and contact azimuth angle, relative to the projection target surface, of the object in contact with the projection target surface; and display control means for transmitting to the projection device, based on the contact elevation angle and contact azimuth angle detected by the angle detection means, instruction information for displaying the operation screen at a predetermined position referenced to the position at which the object contacts the projection target surface.
  • An input method according to a third aspect includes: a contact detection step, executed by an input device, of detecting contact of an object with a projection target surface of an operation screen on a target object; an angle detection step of detecting the contact elevation angle and contact azimuth angle, relative to the projection target surface, of the object in contact with the projection target surface; a display control step of transmitting to a projection device, based on the contact elevation angle and contact azimuth angle detected in the angle detection step, instruction information for displaying the operation screen at a predetermined position referenced to the position at which the object contacts the projection target surface; and a step, executed by the projection device, of projecting the operation screen onto the target object in accordance with the instruction information received from the input device.
  • A recording medium according to a fourth aspect stores a program that causes a computer connected to a projection device to function as: contact detection means for detecting contact of an object with a projection target surface of an operation screen on a target object; angle detection means for detecting the contact elevation angle and contact azimuth angle, relative to the projection target surface, of the object in contact with the projection target surface; and display control means for transmitting to the projection device, based on the contact elevation angle and contact azimuth angle detected by the angle detection means, instruction information for displaying the operation screen at a predetermined position referenced to the position at which the object contacts the projection target surface.
  • According to the present invention, the operation screen is displayed at a predetermined position referenced to the position at which the object (for example, the operator's hand) contacts the projection target surface, based on the contact elevation angle and contact azimuth angle. The operation screen can therefore be displayed at a position that is easy for the operator to operate, which improves the operability of the input device.
  • As shown in FIG. 1, the input system 100 includes an input device 1, a depth sensor 2, and a projection device 3.
  • The input device 1 is connected to the depth sensor 2 and the projection device 3.
  • The depth sensor 2 detects the depth (distance from the depth sensor 2) of the objects in the scene, including the upper surface 41 of the table onto which the operation screen is projected, generates distribution information of the detected depth values, and transmits it to the input device 1.
  • The input device 1 analyzes the depth value distribution information received from the depth sensor 2 and determines whether the user's hand has touched the upper surface 41 of the table. When the input device 1 detects that the hand has touched the upper surface 41, it detects a contact point, a contact line, or a contact area. When it detects that the tip of one finger has contacted the upper surface 41, it detects the tip of the finger as a contact point. When it detects that the side of the hand has contacted the upper surface 41, it detects the side of the hand as a contact line. When it detects that the palm has contacted the upper surface 41, it detects the palm as a contact area.
  • When the input device 1 detects that the hand has touched the upper surface 41, it also detects the elevation angle and azimuth angle that the finger, palm, or arm forms with the upper surface 41 (hereinafter, the contact elevation angle and contact azimuth angle).
  • From the detected contact elevation angle and contact azimuth angle, the input device 1 determines the direction in which the user can easily move the finger, palm, or arm (hereinafter, the operation direction).
  • The input device 1 transmits to the projection device 3 instruction information instructing it to display an operation screen suited to the determined operation direction at a predetermined position referenced to the position of the contact point, contact line, or contact area.
  • The operation screen is, for example, a list (menu) screen of selection items (menu items) associated with functions provided by the input device 1.
  • The projection device 3 projects the operation screen (menu) onto the upper surface 41 of the table in accordance with the instruction information received from the input device 1.
  • While the operation screen is projected, the user can select (indicate) a selection item by moving the contact point, contact line, or contact area on the upper surface 41 of the table.
  • The input device 1 analyzes the depth value distribution information received from the depth sensor 2 and identifies the indicated selection item.
  • The input device 1 then transmits to the projection device 3 instruction information instructing it to display the screen corresponding to the selected item at a predetermined position referenced to the position of the contact point, contact line, or contact area.
  • The projection device 3 projects the screen corresponding to the selected item onto the upper surface 41 of the table in accordance with the instruction information received from the input device 1.
  • In FIG. 1, the input system 100 includes one depth sensor 2 and one projection device 3.
  • The present invention is not limited to this; a plurality of depth sensors 2 and a plurality of projection devices 3 may be provided. It may also be detected that the hands of a plurality of users are in contact with the upper surface of the table.
  • As shown in FIG. 2, the input device 1 includes a depth acquisition unit 11, a contact detection unit 12, an angle detection unit 13, a storage unit 14, and a display control unit 15.
  • The depth acquisition unit 11 receives depth value distribution information from the depth sensor 2 and sends it to the contact detection unit 12.
  • The contact detection unit 12 analyzes the depth value distribution information received from the depth acquisition unit 11 and extracts the user's skeleton from the user region. From the user region and skeleton, the contact detection unit 12 determines whether a hand has touched the upper surface 41 of the table: it judges that the hand has touched the upper surface 41 when the difference between the depth value of the upper surface 41 and the depth value of the hand's contour is equal to or less than a threshold. This threshold is set, for example, to a value corresponding to the depth sensor 2's resolution plus a small margin α.
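  • The threshold test above is straightforward to express in code. The following is a minimal sketch, assuming the depth map arrives as a NumPy array and a hand-contour mask has already been extracted; the function name, parameters, and the value of the α margin are illustrative assumptions, not taken from the patent.

        import numpy as np

        def touching_table(depth, hand_contour_mask, table_depth,
                           resolution, alpha=0.005):
            """Mark the hand-contour pixels judged to be touching the table.

            depth             -- H x W array of depth values (meters)
            hand_contour_mask -- H x W boolean array of hand-contour pixels
            table_depth       -- depth value of the table's upper surface
            resolution        -- depth resolution of the sensor; alpha is
                                 the small margin added to it
            """
            threshold = resolution + alpha  # the "resolution + alpha" rule
            # A contour pixel touches the table when its depth differs from
            # the table's depth by no more than the threshold.
            return hand_contour_mask & (np.abs(depth - table_depth) <= threshold)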
  • When it judges that the hand has touched the upper surface 41, the contact detection unit 12 detects a contact point, contact line, or contact area from the user region and skeleton (the detection method is described later).
  • The contact detection unit 12 sends contact position information indicating the position of the detected contact point, contact line, or contact area to the display control unit 15.
  • The contact detection unit 12 also sends analysis result information, indicating the results of analyzing the depth value distribution information, to the angle detection unit 13.
  • The angle detection unit 13 detects the contact elevation angle and contact azimuth angle from the analysis result information received from the contact detection unit 12, as follows.
  • For a contact point, the angle detection unit 13 obtains the contact elevation angle θ1 and the contact azimuth angle φ1 that the direction V1 along the finger, with the contact point P1 as its base point, forms with the upper surface 41 of the table (the projection target surface or its tangent plane).
  • The contact elevation angle θ1 and the contact azimuth angle φ1 are angles with respect to a reference direction V0 along the upper surface 41 of the table.
  • For a contact line, the contact elevation angle θ2 and the contact azimuth angle φ2 that the direction V2 along the side of the hand, with the contact line as its base point, forms with the tangent plane (the upper surface 41 of the table) are obtained.
  • For a contact area, the angle detection unit 13 obtains the contact elevation angle θ3 and the contact azimuth angle φ3 that the direction V3 along the user's forearm, with the contact area as its base point, forms with the tangent plane (the upper surface 41 of the table).
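  • If the direction V1 (similarly V2 or V3) is expressed as a 3-D vector in a coordinate frame whose x-y plane is the tangent plane and whose +x axis is the reference direction V0, the two angles reduce to spherical coordinates. The sketch below uses that frame convention, which is our assumption and not stated in the patent.

        import numpy as np

        def contact_angles(v):
            """Contact elevation and azimuth angles of a direction vector.

            Assumes a frame whose x-y plane is the tangent plane of the
            projection target surface (z is its normal) and whose +x axis
            is the reference direction V0. Returns angles in degrees.
            """
            v = np.asarray(v, dtype=float)
            v = v / np.linalg.norm(v)
            elevation = np.degrees(np.arcsin(np.clip(v[2], -1.0, 1.0)))
            azimuth = np.degrees(np.arctan2(v[1], v[0])) % 360.0
            return elevation, azimuth

        # Example: a finger rising 30 degrees off the table, pointing along +y:
        # contact_angles([0.0, np.cos(np.radians(30)), np.sin(np.radians(30))])
        # -> (30.0, 90.0)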
  • The angle detection unit 13 sends angle information indicating the detected contact elevation angle and contact azimuth angle to the display control unit 15. Based on the angle information received from the angle detection unit 13, the display control unit 15 determines the operation direction by a method described later, and generates an operation screen suited to that direction: a screen inclined from a preset reference direction by the angle between the determined operation direction and the reference direction.
  • Upon receiving the contact position information from the contact detection unit 12, the display control unit 15 generates instruction information for displaying the operation screen at a predetermined position (a position where the user can easily move the hand) referenced to the position of the contact point, contact line, or contact area indicated by the contact position information.
  • For example, the display control unit 15 sets the position of the operation screen to a position above the contact point, contact line, or contact area by a preset distance. The display control unit 15 then transmits the generated instruction information to the projection device 3.
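  • One way to realize this tilt-and-offset behavior is to rotate a pre-rendered menu image by the angle between the operation direction and the reference direction and to anchor it a fixed distance from the contact position. A hedged sketch with OpenCV; the image, the pixel offset, and the coordinate conventions are illustrative assumptions rather than the patent's method.

        import cv2
        import numpy as np

        def build_menu_instruction(menu_img, contact_xy, op_dir_deg,
                                   ref_dir_deg=0.0, offset_px=120):
            """Tilt the menu toward the operation direction and offset it
            from the contact point/line/area by a preset distance."""
            tilt = op_dir_deg - ref_dir_deg
            h, w = menu_img.shape[:2]
            rot = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), tilt, 1.0)
            tilted = cv2.warpAffine(menu_img, rot, (w, h))
            # Anchor the screen offset_px away from the contact position,
            # along the operation direction (image y grows downward).
            x = int(contact_xy[0] + offset_px * np.cos(np.radians(op_dir_deg)))
            y = int(contact_xy[1] - offset_px * np.sin(np.radians(op_dir_deg)))
            return {"image": tilted, "position": (x, y)}  # sent to projector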
  • The projection device 3 projects the operation screen (menu) onto the upper surface 41 of the table in accordance with the instruction information received from the input device 1.
  • When the contact detection unit 12 determines that the user's hand has touched the upper surface of the table while the operation screen is projected on the upper surface 41, it identifies the selection item selected by the user from the contact position information indicating the position of the detected contact point, contact line, or contact area. The contact detection unit 12 sends selection information indicating the selected item to the display control unit 15.
  • The display control unit 15 generates instruction information instructing the projection device 3 to display the screen corresponding to the selection item indicated by the selection information received from the contact detection unit 12, and transmits it to the projection device 3.
  • The projection device 3 projects the screen corresponding to the selected item onto the upper surface 41 of the table in accordance with the instruction information received from the input device 1.
  • FIG. 4 is a diagram illustrating an example of a method for detecting the contact point, contact elevation angle, and contact azimuth angle of an object that has contacted the target object, according to the embodiment.
  • Here, the user performs a touch operation, contacting the upper surface 41 of the table with one finger.
  • The depth value distribution information includes a table region T and a floor region F, each having a substantially constant depth value.
  • From the depth value distribution information, the contact detection unit 12 extracts the points at which the depth value differs by a predetermined value or more from the substantially constant depth value of the continuous table region T or floor region F.
  • The contact detection unit 12 connects the extracted points to extract the user's contour.
  • The contact detection unit 12 then separates out the user region U enclosed by the user's contour. In the example of FIG. 4, the user's head is indicated by a broken line for ease of understanding.
  • The contact detection unit 12 replaces each pixel value in the user region U with a larger value the farther the pixel is from the contour.
  • From this map, the user's skeleton S is extracted.
  • The method of extracting the skeleton S is not limited to this; thinning or medial-axis transformation, as commonly used in image processing, may be used instead.
  • The contact detection unit 12 detects the end points P1 to P4 of the skeleton S. When exactly one of the end points P1 to P4 has a depth value whose difference from that of the table region T is equal to or less than the threshold, the contact detection unit 12 detects the position of that end point as the contact point.
  • In FIG. 4, the end point P1 is the user's fingertip in contact with the upper surface of the table, and the position of the end point P1 is the contact point.
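  • The contour-to-skeleton pipeline above can be prototyped with standard image-processing libraries. The sketch below assumes the user region U is already available as a binary mask; using OpenCV's distance transform and scikit-image's skeletonize is our substitution, which the text permits by allowing thinning in place of the pixel-replacement step.

        import cv2
        import numpy as np
        from scipy.ndimage import convolve
        from skimage.morphology import skeletonize

        def skeleton_and_endpoints(user_mask):
            """Extract the skeleton S of the user region U and its end points."""
            # Pixel-replacement step: each pixel gets a value that grows with
            # its distance from the contour of the user region.
            dist = cv2.distanceTransform(user_mask.astype(np.uint8),
                                         cv2.DIST_L2, 5)
            # The text also allows plain thinning; skeletonize() is one such
            # method and yields a one-pixel-wide skeleton.
            skel = skeletonize(user_mask > 0)
            # An end point is a skeleton pixel with exactly one skeleton
            # neighbour (the 3x3 sum counts the pixel itself plus one).
            neighbours = convolve(skel.astype(np.uint8), np.ones((3, 3)),
                                  mode="constant")
            endpoints = skel & (neighbours == 2)
            return dist, skel, np.argwhere(endpoints)  # end points (row, col)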
  • The contact detection unit 12 sends analysis result information indicating these analysis results to the angle detection unit 13.
  • The angle detection unit 13 detects the contact elevation angle and contact azimuth angle of the user's finger from the direction running from the end point P1 (the fingertip) of the skeleton S indicated by the analysis result information to the base of the finger.
  • The angle detection unit 13 sends angle information indicating the contact elevation angle and contact azimuth angle of the user's finger to the display control unit 15.
  • The display control unit 15 determines the operation direction based on the angle information received from the angle detection unit 13.
  • FIG. 5 is a diagram illustrating an example of a method for detecting the contact area, contact elevation angle, and contact azimuth angle of an object that has contacted the target object, according to the embodiment.
  • As in FIG. 4, the contact detection unit 12 determines the user region U using the depth value distribution information and extracts the skeleton S.
  • The contact detection unit 12 detects the end points P5 to P9 from the skeleton S.
  • When the depth values of these end points differ from that of the table region T by no more than the threshold, the contact detection unit 12 determines that a palm-placing operation has been performed, and detects as the contact area the region of the user's palm whose depth values differ from that of the table region T by no more than the threshold.
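  • Taken together, the one-endpoint case of FIG. 4 and the multi-endpoint case of FIG. 5 suggest a simple classifier over the skeleton's end points. The rule below is our reading of the two examples, not an explicit rule stated in the patent.

        def classify_contact(endpoints, depth, table_depth, threshold):
            """Classify a contact as a fingertip touch or a palm placement.

            endpoints -- list of (row, col) skeleton end points
            depth     -- H x W depth image; table_depth is the table's value
            """
            touching = [(r, c) for (r, c) in endpoints
                        if abs(depth[r, c] - table_depth) <= threshold]
            if len(touching) == 1:
                return "contact_point", touching  # fingertip touch (FIG. 4)
            if len(touching) >= 2:
                return "contact_area", touching   # palm placed flat (FIG. 5)
            return "no_contact", []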
  • The contact detection unit 12 sends analysis result information indicating these analysis results to the angle detection unit 13.
  • The angle detection unit 13 detects the contact elevation angle and contact azimuth angle of the user's forearm from the direction running from the wrist to the elbow of the skeleton S indicated by the analysis result information received from the contact detection unit 12.
  • The angle detection unit 13 sends angle information indicating the contact elevation angle and contact azimuth angle of the user's forearm to the display control unit 15.
  • The display control unit 15 determines the operation direction based on the angle information received from the angle detection unit 13. Next, display of the operation screen is described.
  • FIG. 6A is a diagram illustrating a display example of the operation screen according to the embodiment.
  • Based on the angle information received from the angle detection unit 13, the display control unit 15 sets, as shown in FIG. 6A, the direction in which the user's forearm extends along the tangent plane as up, and determines the top, bottom, left, and right of the operation screen accordingly.
  • Upon receiving the contact position information from the contact detection unit 12, the display control unit 15 generates instruction information instructing the projection device 3 to display the operation screen at a predetermined position referenced to the position of the contact area (the palm region) indicated by the contact position information, and transmits it to the projection device 3. For example, the operation screen is displayed at a position 10 cm above the contact area.
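  • Placing the screen "10 cm above the contact area" implies a calibration that converts physical distance on the table to projector pixels. A small sketch with an assumed pixels-per-centimeter scale factor; the names and the scale value are hypothetical.

        import math

        def menu_anchor(contact_xy, up_dir_deg, cm_offset=10.0, px_per_cm=8.0):
            """Anchor the menu a fixed physical distance 'up' from the contact area.

            up_dir_deg -- the 'up' direction fixed from the forearm direction
            px_per_cm  -- projector calibration (assumed known from setup)
            """
            d = cm_offset * px_per_cm
            x = contact_xy[0] + d * math.cos(math.radians(up_dir_deg))
            y = contact_xy[1] - d * math.sin(math.radians(up_dir_deg))  # y grows down
            return int(x), int(y)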
  • The projection device 3 projects the operation screen shown in FIG. 6A onto the predetermined position on the upper surface 41 of the table in accordance with the instruction information received from the input device 1.
  • On the operation screen, the user can select the selection items A, B, C, and D, each associated with a function of the input device 1.
  • When the user touches an item, the contact detection unit 12 of the input device 1 identifies, from the contact position information indicating the position of the detected contact point, that selection item B has been selected.
  • The contact detection unit 12 sends selection information indicating the selected selection item B to the display control unit 15.
  • The display control unit 15 generates instruction information instructing the projection device 3 to display the screen corresponding to selection item B indicated by the selection information received from the contact detection unit 12, and transmits it to the projection device 3.
  • The projection device 3 projects the screen corresponding to the selected item onto the upper surface of the table in accordance with the instruction information received from the input device 1.
  • FIG. 7 is a diagram illustrating a display example of the screen corresponding to the selected selection item according to the embodiment.
  • The screen shown in FIG. 7 is a screen for changing the temperature of an air conditioner.
  • The user raises or lowers the temperature of the air conditioner by operating the operation buttons (the open arrows in FIG. 7) displayed on this screen.
  • As in the cases of FIGS. 4 and 5, the contact detection unit 12 of the input device 1 determines the user region U using the depth value distribution information and extracts the skeleton S.
  • Here, the contact detection unit 12 detects a contact line along the side of the user's hand.
  • The contact detection unit 12 detects, as the contact line, the line connecting the points on the user's contour whose depth values differ from that of the table region T by no more than the threshold.
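  • As an illustration, the contact line can be recovered by collecting the contour pixels that pass the same depth test and fitting a line through them; using cv2.fitLine is our choice of fitting method, not one named in the patent.

        import cv2
        import numpy as np

        def detect_contact_line(contour_pts, depth, table_depth, threshold):
            """Fit a line through contour points whose depth matches the table's.

            contour_pts -- iterable of (x, y) pixel coordinates on the contour
            """
            pts = np.array([(x, y) for (x, y) in contour_pts
                            if abs(depth[y, x] - table_depth) <= threshold],
                           dtype=np.float32)
            if len(pts) < 2:
                return None  # no contact line present
            # fitLine returns a unit direction (vx, vy) and a point (x0, y0).
            vx, vy, x0, y0 = cv2.fitLine(pts, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
            return (float(vx), float(vy)), (float(x0), float(y0))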
  • The contact detection unit 12 sends analysis result information indicating these analysis results to the angle detection unit 13.
  • The angle detection unit 13 detects the contact elevation angle and contact azimuth angle of the user's palm from the skeleton S indicated by the analysis result information received from the contact detection unit 12, or from the distribution information of the depth values of the palm region of the user region U.
  • The angle detection unit 13 sends angle information indicating the contact elevation angle and contact azimuth angle of the user's palm to the display control unit 15.
  • The display control unit 15 determines the operation direction based on the angle information received from the angle detection unit 13.
  • Here, the side of the user's right hand is in contact with the table, and the user tilts the right palm to the left or right in a patting operation. Whether the tilt is to the left or the right is determined from the angle that the normal of the plane along the user's right palm forms with the tangent plane: an obtuse angle is treated as a tilt to the left, and an acute angle as a tilt to the right. Which hand is used, and which tilt direction performs which operation, is assumed to be determined in advance.
  • As in FIG. 6A, the direction in which the user's forearm extends along the tangent plane may be set as up, and the top, bottom, left, and right determined accordingly.
  • Alternatively, the wrist direction may be set as down (front), the right as high (large), and the left as low (small).
  • The display control unit 15 generates an operation screen in accordance with the determined operation direction, and transmits to the projection device 3 instruction information instructing it to display the screen at a predetermined position referenced to the position of the contact line.
  • The projection device 3 projects the operation screen in accordance with the instruction information.
  • The operation screen is displayed, for example, at a position 20 cm away from the contact line along the tangent plane in the direction of the normal of the plane along the user's right palm, and 10 cm away on the side where that normal forms an acute angle with the tangent plane.
  • If, for example, a patting operation to the left lowers the temperature and a patting operation to the right raises it, a left arrow labeled "-" is displayed on the left side of the contact line and a right arrow labeled "+" on the right side.
  • Because hand shadows tend to fall on the side where the normal of the plane along the user's right palm forms an obtuse angle with the tangent plane, the operation direction indicators may be displayed on one side only.
  • The functions of the input device 1 are not limited to displaying the screen corresponding to the selection item selected by the user; they may be any functions, such as controlling an electric appliance.
  • The input device 1 may then perform the function corresponding to the selected selection item.
  • FIG. 8 is a flowchart showing an example of the operation of the input process according to the embodiment.
  • The input process of FIG. 8 starts when the depth sensor 2, the input device 1, and the projection device 3 are powered on.
  • The depth sensor 2 detects the depth of the objects in the scene (step S11), generates distribution information of the detected depth values, and transmits it to the input device 1 (step S12). If the power is not turned off (step S13; NO), the depth sensor 2 returns to step S11 and repeats steps S11 to S13. When the power is turned off (step S13; YES), the depth sensor 2 ends the process.
  • The depth acquisition unit 11 of the input device 1 receives the depth value distribution information from the depth sensor 2 (step S21) and sends it to the contact detection unit 12.
  • The contact detection unit 12 analyzes the depth value distribution information received from the depth acquisition unit 11 and determines whether the user's hand has touched the upper surface of the table (step S22).
  • If it is not determined that the user's hand has touched the upper surface of the table (step S22; NO), the process returns to step S21, and steps S21 and S22 are repeated.
  • If contact is detected (step S22; YES), the contact detection unit 12 detects a contact point, contact line, or contact area (step S23).
  • The contact detection unit 12 sends contact position information indicating the position of the detected contact point, contact line, or contact area to the display control unit 15, and sends the depth value distribution information to the angle detection unit 13.
  • It is then determined whether the operation screen is currently being displayed (step S24).
  • If the operation screen is not being displayed (step S24; NO), the angle detection unit 13 analyzes the depth value distribution information received from the contact detection unit 12 and detects the angles with respect to the tangent plane (the contact elevation angle and contact azimuth angle) (step S25). The angle detection unit 13 sends angle information indicating the contact elevation angle and contact azimuth angle to the display control unit 15. The display control unit 15 determines the operation direction based on the angle information received from the angle detection unit 13 (step S26).
  • The display control unit 15 generates an operation screen in accordance with the determined operation direction, generates instruction information for displaying the operation screen at a predetermined position referenced to the position of the contact point, contact line, or contact area indicated by the contact position information received from the contact detection unit 12, and transmits it to the projection device 3 (step S27). The process then proceeds to step S30.
  • If the operation screen is being displayed (step S24; YES), the contact detection unit 12 identifies the selected selection item from the contact position information indicating the position of the detected contact point, contact line, or contact area (step S28).
  • The input device 1 executes the function corresponding to the selected selection item (step S29).
  • The contact detection unit 12 sends selection information indicating the selected selection item to the display control unit 15.
  • The display control unit 15 generates the screen corresponding to the selection item indicated by the selection information received from the contact detection unit 12, generates instruction information instructing the projection device 3 to display that screen at a predetermined position, and transmits it to the projection device 3.
  • If the power is not turned off (step S30; NO), the input device 1 returns to step S21 and repeats steps S21 to S30.
  • When the power is turned off (step S30; YES), the input device 1 ends the process.
  • The projection device 3 receives the instruction information for displaying the operation screen from the input device 1 (step S31).
  • The projection device 3 projects the operation screen onto the upper surface 41 of the table in accordance with the instruction information received from the input device 1 (step S32).
  • When a selection has been made, the projection device 3 receives from the input device 1 instruction information instructing it to display the screen corresponding to the selected selection item.
  • The projection device 3 then projects the screen corresponding to the selected selection item onto the upper surface of the table in accordance with the instruction information received from the input device 1.
  • If the power is not turned off (step S33; NO), the projection device 3 returns to step S31 and repeats steps S31 to S33.
  • When the power is turned off (step S33; YES), the projection device 3 ends the process.
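  • Read as pseudocode, the input-device branch of the flowchart (steps S21 to S30) maps onto a simple polling loop. In the sketch below, `app` bundles the units of FIG. 2, and every method name is a hypothetical stand-in rather than an API defined by the patent.

        def input_device_loop(app):
            """Steps S21 to S30 of FIG. 8 as a polling loop (illustrative)."""
            menu_shown = False
            while not app.power_off():                          # S30 guard
                frame = app.receive_depth_map()                 # S21
                pos = app.detect_contact(frame)                 # S22 and S23
                if pos is None:                                 # S22; NO
                    continue
                if not menu_shown:                              # S24; NO
                    theta, phi = app.detect_angles(frame, pos)  # S25
                    op_dir = app.decide_direction(theta, phi)   # S26
                    app.send_menu_instruction(pos, op_dir)      # S27
                    menu_shown = True
                else:                                           # S24; YES
                    item = app.identify_selection(pos)          # S28
                    app.execute(item)                           # S29
                    app.send_item_screen_instruction(pos, item)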
  • With the input system 100 of the present embodiment, the operation screen can be displayed at an arbitrary place on the target object in accordance with the operation direction of the user (operator), which improves operability.
  • In the embodiment above, the target object is the upper surface of a table and the object contacting it is the user's hand, but the invention is not limited to these.
  • The shape of the target object is not limited to a horizontal plane such as the upper surface of a table; it may be any shape onto which an operation screen can be projected, such as a vertical plane like a wall surface, a cylinder, a sphere, or a staircase.
  • The display control unit 15 of the input device 1 may generate an operation screen matched to the shape of the target object.
  • The object that contacts the target object may be anything whose shape can define a direction extending from the tangent plane, such as a stick or a pen.
  • In the embodiment, the depth sensor 2 and the projection device 3 are provided on the ceiling, but the present invention is not limited thereto.
  • The depth sensor 2 need only be provided at a position from which it can generate depth value distribution information that allows the contact elevation angle and contact azimuth angle of an object contacting the target object to be detected.
  • The projection device 3 need only be provided at a position from which the operation screen can be projected onto the target object.
  • The embodiment describes the case where the display position of the operation screen is determined based on the contact elevation angle and contact azimuth angle.
  • The present invention is not limited to this; parameters other than the contact elevation angle and contact azimuth angle may also be used.
  • For example, the display position of the operation screen may be determined based on the angle of the elbow or the number of contacting fingers.
  • In the embodiment, the operation screen is displayed when it is determined that an object (such as the user's hand) has touched the display surface (such as the upper surface of the table) of the target object.
  • Alternatively, the operation screen may already be displayed, and its direction and position may be changed upon detecting that an object (such as the user's hand) has approached the display surface. For example, when the hand is brought close to the display surface, the direction of the operation screen is changed to match the direction in which the hand and arm extend, and the operation screen is moved closer to the tip of the hand. In this case, because the operation screen is already displayed, the user can easily understand the operation and begin operating.
  • FIG. 9 is a block diagram illustrating an example of the hardware configuration of the input device according to the embodiment.
  • The control unit 31 includes a CPU (Central Processing Unit) and the like, and executes the processing of the contact detection unit 12, the angle detection unit 13, and the display control unit 15 in accordance with a control program 39 stored in the external storage unit 33.
  • The main storage unit 32 includes a RAM (Random-Access Memory) or the like; the control program 39 stored in the external storage unit 33 is loaded into it, and it is used as the work area of the control unit 31.
  • The external storage unit 33 includes a nonvolatile memory such as a flash memory, a hard disk, a DVD-RAM (Digital Versatile Disc Random-Access Memory), or a DVD-RW (Digital Versatile Disc ReWritable). The external storage unit 33 stores in advance the program that the control unit 31 executes for the processing of the input device 1, supplies data stored by this program to the control unit 31 in accordance with instructions from the control unit 31, and stores data supplied from the control unit 31.
  • The storage unit 14 is implemented in the external storage unit 33.
  • The operation unit 34 includes a keyboard, a pointing device such as a mouse, and an interface device that connects the keyboard and pointing device to the internal bus 30.
  • The display unit 35 includes a CRT or an LCD. When a system administrator inputs information to the input device 1, the display unit 35 displays an operation screen.
  • The transmission/reception unit 36 includes a network termination device or a wireless communication device connected to a network, and a serial interface or a LAN (Local Area Network) interface connected to them.
  • The transmission/reception unit 36 functions as the depth acquisition unit 11 and the display control unit 15.
  • The processing of the depth acquisition unit 11, the contact detection unit 12, the angle detection unit 13, the storage unit 14, and the display control unit 15 shown in FIG. 2 is executed by the control program 39 using the control unit 31, the main storage unit 32, the external storage unit 33, the operation unit 34, the display unit 35, and the transmission/reception unit 36 as resources.
  • The central part that performs the control processing, comprising the control unit 31, the main storage unit 32, the external storage unit 33, the internal bus 30, and so on, can be realized using an ordinary computer system rather than a dedicated system.
  • For example, a computer program for executing the above operations may be stored and distributed on a computer-readable recording medium (a flexible disk, CD-ROM, DVD-ROM, or the like) and installed on a computer to configure the input device 1 that executes the above-described processing.
  • Alternatively, the computer program may be stored in a storage device of a server on a communication network such as the Internet and downloaded by an ordinary computer system to configure the input device 1.
  • When the functions of the input device 1 are realized by sharing between an OS and an application program, or by cooperation between the OS and an application program, only the application program portion may be stored in the recording medium or storage device.
  • The computer program may also be posted on a bulletin board system (BBS) on a communication network and distributed via the network.
  • The above-described processing may then be executed by starting this computer program and running it, under the control of the OS, in the same manner as other application programs.
  • In the input system according to appendix 1, when the contact detection means detects a contact point, the angle detection means detects the contact elevation angle and the contact azimuth angle as the elevation angle and azimuth angle of a one-dimensional direction extending with reference to the contact point of the object that has contacted the target object.
  • In the input system according to appendix 1 or 2, when the contact detection means detects a contact line, the angle detection means detects the contact elevation angle and the contact azimuth angle as the elevation angle and azimuth angle, with respect to the tangent plane, of the normal of a two-dimensional plane extending with reference to the contact line of the object that has contacted the target object.
  • In the input system according to any one of appendices 1 to 3, when the contact detection means detects a contact area, the angle detection means detects the contact elevation angle and the contact azimuth angle as the elevation angle and azimuth angle of a one-dimensional direction extending with reference to the contact area of the object that has contacted the target object.
  • In the input system according to any one of appendices 1 to 4, the contact detection means detects contact of the object with the projection target surface of the operation screen on the target object by analyzing distribution information of depth values indicating the depth of objects in a region including the target object.
  • In the input system according to any one of appendices 1 to 5, the angle detection means detects the contact elevation angle and the contact azimuth angle, relative to the projection target surface, of the object in contact with the projection target surface by analyzing distribution information of depth values indicating the depth of objects in a region including the target object.
  • An input device connected to a projection device comprises: contact detection means for detecting contact of an object with a projection target surface of an operation screen on a target object; angle detection means for detecting the contact elevation angle and contact azimuth angle, relative to the projection target surface, of the object in contact with the projection target surface; and display control means for transmitting to the projection device, based on the contact elevation angle and contact azimuth angle detected by the angle detection means, instruction information for displaying the operation screen at a predetermined position referenced to the position at which the object contacts the projection target surface.
  • In the input device according to appendix 7, when the contact detection means detects a contact point, the angle detection means detects the contact elevation angle and the contact azimuth angle as the elevation angle and azimuth angle of a one-dimensional direction extending with reference to the contact point of the object that has contacted the target object.
  • In the input device according to appendix 7 or 8, when the contact detection means detects a contact line, the angle detection means detects the contact elevation angle and the contact azimuth angle as the elevation angle and azimuth angle, with respect to the tangent plane, of the normal of a two-dimensional plane extending with reference to the contact line of the object that has contacted the target object.
  • In the input device according to any one of appendices 7 to 9, when the contact detection means detects a contact area, the angle detection means detects the contact elevation angle and the contact azimuth angle as the elevation angle and azimuth angle of a one-dimensional direction extending with reference to the contact area of the object that has contacted the target object.
  • In the input device according to any one of appendices 7 to 10, the contact detection means detects contact of the object with the projection target surface of the operation screen on the target object by analyzing distribution information of depth values indicating the depth of objects in a region including the target object.
  • In the input device according to any one of appendices 7 to 11, the angle detection means detects the contact elevation angle and the contact azimuth angle, relative to the projection target surface, of the object in contact with the projection target surface by analyzing distribution information of depth values indicating the depth of objects in a region including the target object.
  • An input method comprises: a contact detection step, executed by an input device, of detecting contact of an object with a projection target surface of an operation screen on a target object; an angle detection step of detecting the contact elevation angle and contact azimuth angle, relative to the projection target surface, of the object in contact with the projection target surface; a display control step of transmitting to a projection device, based on the contact elevation angle and contact azimuth angle detected in the angle detection step, instruction information for displaying the operation screen at a predetermined position referenced to the position at which the object contacts the projection target surface; and a step, executed by the projection device, of projecting the operation screen onto the target object in accordance with the instruction information received from the input device.
  • A recording medium stores a program that causes a computer connected to a projection device to function as: contact detection means for detecting contact of an object with a projection target surface of an operation screen on a target object; angle detection means for detecting the contact elevation angle and contact azimuth angle, relative to the projection target surface, of the object in contact with the projection target surface; and display control means for transmitting to the projection device, based on the contact elevation angle and contact azimuth angle detected by the angle detection means, instruction information for displaying the operation screen at a predetermined position referenced to the position at which the object contacts the projection target surface.

Abstract

An input device (1) has: a contact detection unit (12) that detects contact of a user's hand with the upper surface of a table; an angle detection unit (13) that detects the contact elevation angle and contact azimuth angle, relative to the upper surface of the table, of the user's hand in contact with that surface; and a display control unit (15) that, based on the contact elevation angle and contact azimuth angle detected by the angle detection unit (13), transmits to a projection device (3) instruction information for displaying an operation screen at a prescribed position referenced to the contact position of the user's hand on the upper surface of the table. The projection device (3) projects the operation screen onto the upper surface of the table in accordance with the instruction information received from the input device (1).

Description

INPUT SYSTEM, INPUT DEVICE, INPUT METHOD, AND RECORDING MEDIUM
 The present invention relates to an input system, an input device, an input method, and a recording medium.
 Patent Document 1 discloses an input system that includes a monitor device arranged in front of an operator, and an input device that inputs data (including numerical values and instructions) in accordance with the user's operation on a virtual plane (operation surface) arranged between the operator and the monitor device.
 Patent Document 2 discloses an information processing apparatus that displays an image of an operation screen on the upper surface of a desk, generates a distance image whose pixel values indicate the distance to a detection target, determines from the distance image that an object has contacted the upper surface of the desk, and inputs data corresponding to the contact position.
 Patent Document 3 discloses an input device that displays a virtual operation surface three-dimensionally, identifies the positional relationship between the virtual operation surface and the user's hand or fingers, and inputs data based on the identified positional relationship. In this input device, the placement of the virtual operation surface is determined based on the position of a preset, specific part of the operator, such as the position of a joint of the operator.
Patent Document 1: JP 2011-039844 A. Patent Document 2: JP 2011-253255 A. Patent Document 3: JP 2014-056462 A.
 In the technique described in Patent Document 1, the placement of the virtual operation surface is fixed by the placement of the monitor. Likewise, in the technique described in Patent Document 2, the position of the operation screen is fixed to the upper surface of the desk. The operator may therefore find the operation screen difficult to operate.
 In addition, the technique described in Patent Document 3 identifies the operator's posture only from the position of a specific part of the operator, so the operator's actual posture may not be sufficiently determined. As a result, the placement of the operation surface may not suit the operator's posture, and the operator may find it difficult to operate.
 The present invention has been made in view of the above circumstances, and an object thereof is to provide an input system with high operability.
 An input system according to a first aspect of the present invention includes: an input device having contact detection means for detecting contact of an object with a projection target surface of an operation screen on a target object, angle detection means for detecting the contact elevation angle and contact azimuth angle, relative to the projection target surface, of the object in contact with the projection target surface, and display control means for transmitting to a projection device, based on the contact elevation angle and contact azimuth angle detected by the angle detection means, instruction information for displaying the operation screen at a predetermined position referenced to the position at which the object contacts the projection target surface; and the projection device, which projects the operation screen onto the target object in accordance with the instruction information received from the input device.
 An input device according to a second aspect of the present invention is an input device connected to a projection device, and includes: contact detection means for detecting contact of an object with a projection target surface of an operation screen on a target object; angle detection means for detecting the contact elevation angle and contact azimuth angle, relative to the projection target surface, of the object in contact with the projection target surface; and display control means for transmitting to the projection device, based on the contact elevation angle and contact azimuth angle detected by the angle detection means, instruction information for displaying the operation screen at a predetermined position referenced to the position at which the object contacts the projection target surface.
 An input method according to a third aspect of the present invention includes: a contact detection step, executed by an input device, of detecting contact of an object with a projection target surface of an operation screen on a target object; an angle detection step of detecting the contact elevation angle and contact azimuth angle, relative to the projection target surface, of the object in contact with the projection target surface; a display control step of transmitting to a projection device, based on the contact elevation angle and contact azimuth angle detected in the angle detection step, instruction information for displaying the operation screen at a predetermined position referenced to the position at which the object contacts the projection target surface; and a step, executed by the projection device, of projecting the operation screen onto the target object in accordance with the instruction information received from the input device.
 A recording medium according to a fourth aspect of the present invention stores a program that causes a computer connected to a projection device to function as: contact detection means for detecting contact of an object with a projection target surface of an operation screen on a target object; angle detection means for detecting the contact elevation angle and contact azimuth angle, relative to the projection target surface, of the object in contact with the projection target surface; and display control means for transmitting to the projection device, based on the contact elevation angle and contact azimuth angle detected by the angle detection means, instruction information for displaying the operation screen at a predetermined position referenced to the position at which the object contacts the projection target surface.
 According to the present invention, the operation screen is displayed at a predetermined position referenced to the position at which the object (for example, the operator's hand) contacts the projection target surface, based on the contact elevation angle and contact azimuth angle. The operation screen can therefore be displayed at a position that is easy for the operator to operate, which improves the operability of the input device.
FIG. 1 is a block diagram showing the configuration of an input system according to an embodiment of the present invention. FIG. 2 is a block diagram showing the functional configuration of the input device according to the embodiment. FIGS. 3A to 3C are diagrams for explaining the contact elevation angle and contact azimuth angle according to the embodiment. FIG. 4 is a diagram showing an example of a method of detecting the contact point, contact elevation angle, and contact azimuth angle of an object that has contacted the target object according to the embodiment. FIG. 5 is a diagram showing an example of a method of detecting the contact area, contact elevation angle, and contact azimuth angle of an object that has contacted the target object according to the embodiment. FIGS. 6A and 6B are diagrams showing the vicinity of the operator's hand in FIG. 3. FIG. 7 is a diagram showing a display example of the screen corresponding to a selected selection item according to the embodiment. FIG. 8 is a flowchart showing an example of the operation of the input process according to the embodiment. FIG. 9 is a block diagram showing an example of the hardware configuration of the input device according to the embodiment.
 Embodiments of the present invention are described below. In the present embodiment, a case is described in which an operation screen is projected onto the upper surface of a table and the user performs an input operation by bringing a hand into contact with the upper surface of the table.
 As shown in FIG. 1, the input system 100 includes an input device 1, a depth sensor 2, and a projection device 3. The input device 1 is connected to the depth sensor 2 and the projection device 3. The depth sensor 2 detects the depth (distance from the depth sensor 2) of the objects in the scene, including the upper surface 41 of the table onto which the operation screen is projected, generates distribution information of the detected depth values, and transmits it to the input device 1.
 The input device 1 analyzes the depth value distribution information received from the depth sensor 2 and determines whether the user's hand has touched the upper surface 41 of the table. When it detects that the hand has touched the upper surface 41, the input device 1 detects a contact point, a contact line, or a contact area: the tip of one finger is detected as a contact point, the side of the hand as a contact line, and the palm as a contact area. When it detects that the hand has touched the upper surface 41, the input device 1 also detects the elevation angle and azimuth angle that the finger, palm, or arm forms with the upper surface 41 (hereinafter, the contact elevation angle and contact azimuth angle). From the detected contact elevation angle and contact azimuth angle, the input device 1 determines the direction in which the user can easily move the finger, palm, or arm (hereinafter, the operation direction). The input device 1 transmits to the projection device 3 instruction information instructing it to display an operation screen suited to the determined operation direction at a predetermined position referenced to the position of the contact point, contact line, or contact area.
The operation screen is, for example, a screen showing a list (menu) of selection items (menu items), each associated with a function of the input device 1. The projection device 3 projects the operation screen (menu) onto the upper surface 41 of the table in accordance with the instruction information received from the input device 1. Once the operation screen is projected, the user can select (designate) a selection item by moving the contact point, contact line, or contact area over the upper surface 41 of the table. The input device 1 analyzes the depth-value distribution information received from the depth sensor 2 and identifies the designated selection item.
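The item identification can be pictured as a simple hit test of the contact position against the projected item rectangles. The following Python sketch is a minimal illustration; the function name, the coordinate convention, and the rectangle representation are assumptions for illustration, not part of the embodiment:

```python
def hit_test(contact_xy, menu_items):
    """Return the label of the menu item whose rectangle contains the
    contact position, or None if no item was hit.

    menu_items: list of (label, (x_min, y_min, x_max, y_max)) tuples,
    expressed in the projected screen's table-plane coordinates."""
    x, y = contact_xy
    for label, (x0, y0, x1, y1) in menu_items:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return label
    return None
```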
The input device 1 transmits to the projection device 3 instruction information that instructs it to display the screen corresponding to the selected item at a predetermined position relative to the position of the contact point, contact line, or contact area, and the projection device 3 projects that screen onto the upper surface 41 of the table accordingly. Although the input system 100 of FIG. 1 includes one depth sensor 2 and one projection device 3, the invention is not limited to this; a plurality of depth sensors 2 and a plurality of projection devices 3 may be provided. The system may also detect that the hands of a plurality of users have touched the upper surface of the table.
As shown in FIG. 2, the input device 1 includes a depth acquisition unit 11, a contact detection unit 12, an angle detection unit 13, a storage unit 14, and a display control unit 15.
The depth acquisition unit 11 receives the depth-value distribution information from the depth sensor 2 and passes it to the contact detection unit 12.
The contact detection unit 12 analyzes the depth-value distribution information received from the depth acquisition unit 11 and extracts the user's skeleton from the user region. From the user region and skeleton, the contact detection unit 12 determines whether a hand has touched the upper surface 41 of the table: it judges that a hand has touched the upper surface 41 when the difference between the depth value of the upper surface 41 and the depth value of the hand's contour is at or below a threshold. This threshold is set, for example, to a value corresponding to the resolution of the depth sensor 2 plus a small margin. When the contact detection unit 12 determines that a hand has touched the upper surface 41, it detects the contact point, contact line, or contact area from the user region and skeleton (the detection method is described later). The contact detection unit 12 sends contact position information indicating the position of the detected contact point, contact line, or contact area to the display control unit 15, and sends analysis result information indicating the result of analyzing the depth-value distribution information to the angle detection unit 13.
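As a minimal sketch of this contact test, assuming the depth frame arrives as a NumPy array and that the table depth and sensor resolution are known (the margin value below is an arbitrary placeholder), the threshold comparison could look as follows:

```python
import numpy as np

def contact_mask(depth, table_depth, sensor_resolution, margin=0.005):
    """Flag pixels whose depth differs from the table surface by no more
    than the threshold (sensor resolution plus a small margin, in the
    same units as the depth values)."""
    threshold = sensor_resolution + margin
    return np.abs(depth - table_depth) <= threshold
```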
From the analysis result information received from the contact detection unit 12, the angle detection unit 13 detects the contact elevation angle and the contact azimuth angle as follows. When the tip (a point) of one finger touches the table, the angle detection unit 13 obtains, as shown in FIG. 3A, the contact elevation angle φ1 and contact azimuth angle θ1, with respect to the upper surface 41 of the table (the projection target surface, i.e., the tangent plane), of the direction V1 along the finger originating at the contact point P1. Here, the contact azimuth angle θ1 is measured from a reference direction V0 lying along the upper surface 41, and the contact elevation angle φ1 is measured from the surface itself. When the side of the hand touches the table along a line, the angle detection unit 13 obtains, as shown in FIG. 3B, the contact elevation angle φ2 and contact azimuth angle θ2, with respect to the tangent plane (the upper surface 41), of the normal V2 of the plane PL2 along the user's palm, referenced to the contact line L1 of the side of the hand. When the palm touches the upper surface 41 over an area, the angle detection unit 13 obtains, as shown in FIG. 3C, the contact elevation angle φ3 and contact azimuth angle θ3, with respect to the tangent plane (the upper surface 41), of the direction V3 along the user's forearm originating at the contact area.
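The decomposition in FIGS. 3A to 3C amounts to splitting a 3-D direction vector into the angle it makes with the surface plane and the angle of its in-plane component relative to the reference direction V0. A minimal sketch, with names and unit conventions assumed for illustration:

```python
import numpy as np

def elevation_azimuth(direction, surface_normal, reference_dir):
    """Decompose a 3-D direction into an elevation angle measured from the
    surface plane and an azimuth angle measured from a reference direction
    lying in that plane. Returns both angles in degrees."""
    d = direction / np.linalg.norm(direction)
    n = surface_normal / np.linalg.norm(surface_normal)
    # Elevation: angle between the vector and the plane (not the normal).
    elevation = np.arcsin(np.clip(np.dot(d, n), -1.0, 1.0))
    # Azimuth: signed angle of the in-plane component from the reference.
    in_plane = d - np.dot(d, n) * n
    r = reference_dir / np.linalg.norm(reference_dir)
    azimuth = np.arctan2(np.dot(np.cross(r, in_plane), n),
                         np.dot(r, in_plane))
    return np.degrees(elevation), np.degrees(azimuth)
```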
The angle detection unit 13 sends angle information indicating the detected contact elevation angle and contact azimuth angle to the display control unit 15. Based on this angle information, the display control unit 15 determines the operation direction by a method described later and generates an operation screen suited to that direction: the generated screen is rotated away from a preset reference direction by the angle between the determined operation direction and that reference direction.
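Rotating the screen by the difference between the operation direction and the reference direction is a plain 2-D rotation of the screen geometry about its centre. A sketch under assumed conventions (the corner-array layout and angle sign are illustrative):

```python
import numpy as np

def rotate_screen(corners, operation_deg, reference_deg=0.0):
    """Rotate the operation screen's corner coordinates (an Nx2 array in
    table-plane coordinates) about the screen centre so that the screen's
    'up' matches the determined operation direction."""
    angle = np.radians(operation_deg - reference_deg)
    c, s = np.cos(angle), np.sin(angle)
    rot = np.array([[c, -s], [s, c]])
    centre = corners.mean(axis=0)
    return (corners - centre) @ rot.T + centre
```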
The display control unit 15 receives the contact position information from the contact detection unit 12 and generates instruction information that causes the operation screen to be displayed at a predetermined position relative to the contact point, contact line, or contact area indicated by that information (a position where the user can move the hand easily). The display control unit 15 sets the position of the operation screen to a position offset from the contact point, contact line, or contact area by a preset distance in the 'up' direction, and transmits the generated instruction information to the projection device 3. The projection device 3 projects the operation screen (menu) onto the upper surface 41 of the table in accordance with the instruction information received from the input device 1.
When the contact detection unit 12 determines that the user's hand has touched the upper surface of the table while the operation screen is projected on the upper surface 41, it identifies the selection item selected by the user from the contact position information indicating the position of the detected contact point, contact line, or contact area, and sends selection information indicating the selected item to the display control unit 15.
The display control unit 15 generates instruction information instructing display of the screen corresponding to the selection item indicated by the selection information received from the contact detection unit 12, and transmits it to the projection device 3. The projection device 3 projects the screen corresponding to the selected item onto the upper surface 41 of the table in accordance with that instruction information. Next, a method of detecting, from the depth-value distribution information, the user's hand touching the upper surface of the table is described.
FIG. 4 is a diagram showing an example of a method of detecting the contact point, contact elevation angle, and contact azimuth angle of an object that has touched a target object according to the embodiment. In the example of FIG. 4, the user performs a touch operation, touching the upper surface 41 of the table with one finger. The depth-value distribution information contains a table region T and a floor region F, in each of which the depth values are nearly constant.
First, the contact detection unit 12 extracts, from the depth-value distribution information, the points at which the depth departs by a predetermined amount or more from the nearly constant depth values of the continuous table region T or floor region F. The contact detection unit 12 connects the extracted points to obtain the user's contour and separates out the user region U enclosed by that contour. In the example of FIG. 4, the user's head is drawn with a broken line for ease of understanding.
Within the user region U, the contact detection unit 12 replaces each pixel with a value that is smaller the closer the pixel is to the contour of the user region U and larger the farther it is. Extracting and connecting the points at which this value is locally maximal yields the user's skeleton S. The method of extracting the skeleton S is not limited to this; thinning, medial axis transformation, or other techniques commonly used in image processing may be used instead.
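This pixel-value replacement is a distance transform, and the locus of its local maxima is a medial-axis-style ridge. One possible off-the-shelf realization is sketched below; SciPy and scikit-image are assumptions of this sketch, as the embodiment names no library:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt
from skimage.morphology import skeletonize

def extract_skeleton(user_mask):
    """user_mask: 2-D boolean array, True inside the user region U.

    The distance transform gives each pixel its distance to the region
    contour (small near the edge, large near the centre); the thinning-
    based skeletonize stands in for extracting the ridge of that field."""
    dist = distance_transform_edt(user_mask)  # distance-to-contour field
    skeleton = skeletonize(user_mask)         # medial-axis-like ridge
    return dist, skeleton
```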
The contact detection unit 12 detects the end points P1 to P4 of the skeleton S. When exactly one of the end points P1 to P4 differs from the depth value of the table region T by no more than the threshold, the contact detection unit 12 detects the position of that end point as the contact point. In the example of FIG. 4, the end point P1 is the fingertip of the user touching the upper surface of the table, and the position of the end point P1 is the contact point. The contact detection unit 12 sends analysis result information indicating these analysis results to the angle detection unit 13.
In the example of FIG. 4, the angle detection unit 13 detects the contact elevation angle and contact azimuth angle of the user's finger from the direction running from the end point P1 (the fingertip) of the skeleton S, indicated by the analysis result information received from the contact detection unit 12, toward the base of the finger. The angle detection unit 13 sends angle information indicating these angles to the display control unit 15, which determines the operation direction based on that angle information.
FIG. 5 is a diagram showing an example of a method of detecting the contact area, contact elevation angle, and contact azimuth angle of an object that has touched a target object according to the embodiment. In the example of FIG. 5, the parts other than the user's hand and forearm are omitted. As in the case of FIG. 4, the contact detection unit 12 determines the user region U from the depth-value distribution information and extracts the skeleton S. When the user performs a lap operation, touching the upper surface of the table with the palm as shown in FIG. 5, the contact detection unit 12 detects the end points P5 to P9 from the skeleton S.
For example, when five end points differ from the depth value of the table region T by no more than the threshold, the contact detection unit 12 determines that a lap operation is being performed. In the example of FIG. 5, the end points P5 to P9 all differ from the depth value of the table region T by no more than the threshold, so the contact detection unit 12 determines that the operation is a lap operation and detects, as the contact area, the region of the user's palm whose depth differs from that of the table region T by no more than the threshold. The contact detection unit 12 sends analysis result information indicating these analysis results to the angle detection unit 13.
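The touch/lap distinction therefore reduces to counting how many skeleton end points lie at table depth. A minimal sketch of that rule (the names and the 'unknown' fall-through are assumptions):

```python
def classify_contact(endpoint_depths, table_depth, threshold):
    """One end point at table depth -> fingertip touch (contact point);
    five end points at table depth -> palm (lap) operation, per the rule
    described in the text."""
    touching = [d for d in endpoint_depths
                if abs(d - table_depth) <= threshold]
    if len(touching) == 1:
        return "touch"
    if len(touching) == 5:
        return "lap"
    return "unknown"  # other counts are not classified here
```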
The angle detection unit 13 detects the contact elevation angle and contact azimuth angle of the user's forearm from the wrist-to-elbow direction of the skeleton S indicated by the analysis result information received from the contact detection unit 12, and sends angle information indicating these angles to the display control unit 15. The display control unit 15 determines the operation direction based on the received angle information. Next, display of the operation screen is described.
FIG. 6A is a diagram showing a display example of the operation screen according to the embodiment. For example, when the user performs a lap operation on the upper surface of the table as in FIG. 5, the display control unit 15 takes, based on the angle information received from the angle detection unit 13 and as shown in FIG. 6A, the direction in which the user's forearm extends toward the tangent plane as 'up', and determines up, down, left, and right accordingly. Upon receiving the contact position information from the contact detection unit 12, the display control unit 15 generates instruction information that causes the operation screen to be displayed at a predetermined position relative to the contact area (the palm region) indicated by that information, and transmits it to the projection device 3. The operation screen is displayed, for example, at a position 10 cm above the contact area.
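Placing the menu then comes down to offsetting the contact area's position along the determined 'up' direction. A sketch, assuming metric table-plane coordinates and the 10 cm offset of the example:

```python
import numpy as np

def menu_position(contact_centre, up_direction, offset_m=0.10):
    """Return the menu anchor: the contact-area centre shifted a fixed
    distance along the unit 'up' vector of the operation direction."""
    up = up_direction / np.linalg.norm(up_direction)
    return np.asarray(contact_centre) + offset_m * up
```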
In accordance with the instruction information received from the input device 1, the projection device 3 projects an operation screen like that of FIG. 6A at the predetermined position on the upper surface 41 of the table. In the example of FIG. 6A, the operation screen allows the user to select the selection items A, B, C, and D, each associated with a function of the input device 1.
When the user selects the selection item B as shown in FIG. 6B, the contact detection unit 12 of the input device 1 identifies, from the contact position information indicating the position of the detected contact point, that the selection item B has been selected, and sends selection information indicating the selected item B to the display control unit 15. The display control unit 15 generates instruction information instructing display of the screen corresponding to the selection item B and transmits it to the projection device 3, which projects the screen corresponding to the selected item onto the upper surface of the table accordingly.
FIG. 7 is a diagram showing a display example of the screen corresponding to the selected selection item according to the embodiment. For example, if the function corresponding to the selection item B selected in FIG. 6B is a function for changing the temperature of an air conditioner, the screen shown in FIG. 7 is a screen for changing that temperature. In this case, the user raises or lowers the air conditioner's temperature by operating the operation buttons (the open arrows in FIG. 7) displayed on the screen.
As in the cases of FIGS. 4 and 5, the contact detection unit 12 of the input device 1 determines the user region U from the depth-value distribution information and extracts the skeleton S. When the side of the user's hand touches the upper surface of the table, the contact detection unit 12 detects the contact line of the side of the hand; for example, it detects as the contact line the line connecting the points of the user's contour whose depth differs from that of the table region T by no more than the threshold. The contact detection unit 12 sends analysis result information indicating these analysis results to the angle detection unit 13.
The angle detection unit 13 detects the contact elevation angle and contact azimuth angle of the user's palm from the skeleton S indicated by the analysis result information received from the contact detection unit 12, or from the depth-value distribution of the palm portion of the user region U, and sends angle information indicating these angles to the display control unit 15. The display control unit 15 determines the operation direction based on the received angle information.
Here, the user performs a pat operation with the right palm tilted to the left; the side on which the normal of the plane along the user's right palm forms an obtuse angle with the tangent plane is taken as left, and the side on which it forms an acute angle is taken as right. Which hand is used, and to which side it is tilted for the pat operation, is assumed to be determined in advance. As with the method of determining the operation direction for the lap operation described above, for the pat operation the direction in which the user's forearm extends toward the tangent plane may be taken as 'up' and up, down, left, and right determined accordingly. Alternatively, with the user's palm as the reference, the wrist direction may be taken as down (near side), the right as the high (increase) direction, and the left as the low (decrease) direction.
The display control unit 15 generates an operation screen matching the determined operation direction and transmits to the projection device 3 instruction information instructing its display at a predetermined position relative to the position of the contact line; the projection device 3 projects the operation screen accordingly. The operation screen is displayed, for example, at a position 20 cm from the contact line on the side where the normal of the plane along the user's right palm forms an obtuse angle with the tangent plane, and at a position 10 cm from the contact line on the side where it forms an acute angle.
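This asymmetric placement can be sketched as two offsets from the contact line, larger on the shadow-prone obtuse side, per the 20 cm / 10 cm example; the names and direction convention below are assumptions:

```python
import numpy as np

def pat_screen_positions(line_centre, obtuse_dir,
                         obtuse_offset_m=0.20, acute_offset_m=0.10):
    """Return the anchors of the two screen halves: farther from the
    contact line on the obtuse (shadow-prone) side, nearer on the acute
    side. obtuse_dir is a table-plane vector pointing to the obtuse side."""
    d = obtuse_dir / np.linalg.norm(obtuse_dir)
    minus_anchor = np.asarray(line_centre) + obtuse_offset_m * d
    plus_anchor = np.asarray(line_centre) - acute_offset_m * d
    return minus_anchor, plus_anchor
```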
In the example of FIG. 7, a leftward pat operation lowers the temperature and a rightward pat operation raises it; the projected operation screen shows a left-pointing arrow marked '-' on the left side of the contact line and a right-pointing arrow marked '+' on its right side. In the pat operation, a shadow of the hand tends to fall on the side where the normal of the plane along the user's right palm forms an obtuse angle with the tangent plane, so projecting the operation screen farther from the contact line on that side prevents the screen from being hidden in the hand's shadow. Since the positive and negative directions of the pat operation are uniquely determined, the operation direction may be displayed on only one side.
The functions of the input device 1 are not limited to displaying the screen corresponding to the selection item selected by the user; any function may be provided, for example a function for controlling electrical equipment. The input device 1 simply executes the function corresponding to the selected selection item.
FIG. 8 is a flowchart showing an example of the operation of the input processing according to the embodiment. The input processing of FIG. 8 starts when the depth sensor 2, the input device 1, and the projection device 3 are powered on.
The depth sensor 2 detects the depth of objects (step S11), generates distribution information of the detected depth values, and transmits it to the input device 1 (step S12). While the power remains on (step S13; NO), the depth sensor 2 returns to step S11 and repeats steps S11 to S13. When the power is turned off (step S13; YES), the depth sensor 2 ends the processing.
The depth acquisition unit 11 of the input device 1 receives the depth-value distribution information from the depth sensor 2 (step S21) and passes it to the contact detection unit 12. The contact detection unit 12 analyzes the received distribution information and determines whether the user's hand has touched the upper surface of the table (step S22).
If it is not determined that the user's hand has touched the upper surface of the table (step S22; NO), the processing returns to step S21 and repeats steps S21 and S22. If it is determined that the user's hand has touched the upper surface of the table (step S22; YES), the contact detection unit 12 detects the contact point, contact line, or contact area (step S23). The contact detection unit 12 sends contact position information indicating the detected position to the display control unit 15 and sends the depth-value distribution information to the angle detection unit 13. Next, the angle detection unit 13 determines whether the operation screen is currently displayed (step S24).
If the operation screen is not yet displayed (step S24; NO), the angle detection unit 13 analyzes the depth-value distribution information received from the contact detection unit 12 and detects the angles with respect to the tangent plane (the contact elevation angle and the contact azimuth angle) (step S25). The angle detection unit 13 sends angle information indicating these angles to the display control unit 15, which determines the operation direction based on the received angle information (step S26).
The display control unit 15 generates an operation screen matching the determined operation direction, generates instruction information that causes the operation screen to be displayed at a predetermined position relative to the position of the contact point, contact line, or contact area indicated by the contact position information received from the contact detection unit 12, and transmits it to the projection device 3 (step S27). The processing then proceeds to step S30.
If, on the other hand, the operation screen is being displayed (step S24; YES), the contact detection unit 12 identifies the selected selection item from the contact position information indicating the position of the detected contact point, contact line, or contact area (step S28). The input device 1 executes the function corresponding to the selected item (step S29). When that function is the display of a screen corresponding to the selected item, the contact detection unit 12 sends selection information indicating the selected item to the display control unit 15, which generates the corresponding screen and instruction information instructing its display at a predetermined position and transmits them to the projection device 3.
While the power remains on (step S30; NO), the input device 1 returns to step S21 and repeats steps S21 to S30. When the power is turned off (step S30; YES), the input device 1 ends the processing.
The projection device 3 receives the instruction information for displaying the operation screen from the input device 1 (step S31) and projects the operation screen onto the upper surface 41 of the table in accordance with it (step S32). In the case of displaying a screen corresponding to the selected selection item, the projection device 3 receives from the input device 1 instruction information instructing display of that screen and projects it onto the upper surface of the table accordingly.
While the power remains on (step S33; NO), the projection device 3 returns to step S31 and repeats steps S31 to S33. When the power is turned off (step S33; YES), the projection device 3 ends the processing.
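The input-device branch of FIG. 8 (steps S21 to S30) can be summarized by the following structural sketch; every object and method name here is an illustrative assumption, not an API of the embodiment:

```python
def input_device_loop(sensor, projector, device):
    """Skeleton of steps S21-S30: receive depth data, detect contact,
    then either show the menu (first contact) or execute the selected
    item (menu already displayed)."""
    while device.powered_on():
        depth = sensor.receive_distribution()                    # S21
        contact = device.detect_contact(depth)                   # S22, S23
        if contact is None:
            continue
        if not device.screen_displayed:                          # S24
            elev, azim = device.detect_angles(depth)             # S25
            direction = device.operation_direction(elev, azim)   # S26
            projector.show(device.screen_instruction(contact, direction))  # S27
            device.screen_displayed = True
        else:
            item = device.identify_item(contact)                 # S28
            device.execute(item)                                 # S29
```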
According to the input system 100 of the present embodiment, the operation screen can be displayed at an arbitrary location on the target object, oriented to match the operation direction of the user (operator), which improves operability.
In the embodiment described above, the target object is the upper surface of a table and the object touching it is the user's hand, but the invention is not limited to this. The target object need not be a horizontal plane such as a table top; it may have any shape onto which an operation screen can be projected, such as a vertical plane like a wall, a cylinder, a sphere, or a staircase. In such cases, the display control unit 15 of the input device 1 may generate an operation screen matched to the shape of the target object. The object touching the target object may be anything whose direction extending away from the tangent plane can be defined, such as a pointer stick or a pen.
In the embodiment described above, the depth sensor 2 and the projection device 3 are mounted on the ceiling, but the invention is not limited to this. The depth sensor 2 need only be placed at a position from which it can generate depth-value distribution information that allows the contact elevation angle and contact azimuth angle of an object touching the target object to be detected, and the projection device 3 need only be placed at a position from which it can project the operation screen onto the target object. Furthermore, although the embodiment above determines the display position of the operation screen based on the contact elevation angle and contact azimuth angle, the display position may instead be determined based on other parameters, for example the bend angle of the elbow or the number of fingers in contact.
In the embodiment described above, the operation screen is displayed when it is determined that an object (such as the user's hand) has touched the display surface of the target object (such as the upper surface of the table). Alternatively, the operation screen may be displayed in advance, and its orientation and position changed upon detecting that an object (such as the user's hand) approaches the display surface. For example, as the hand approaches the display surface, the orientation of the operation screen is changed to match the direction in which the hand and arm extend, and the screen is moved toward the fingertips. In this case, because the operation screen is already displayed, the operation is easy for the user to understand and to begin.
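A rough sketch of this proximity-driven variant, with the approach threshold and the fingertip gap chosen arbitrarily for illustration:

```python
import numpy as np

def update_screen_pose(screen_pose, fingertip_xy, arm_dir_in_plane,
                       hand_to_surface_m,
                       approach_threshold_m=0.15, gap_m=0.05):
    """Once the hand comes within the approach threshold of the surface,
    reorient the pre-displayed screen to the arm direction and move it
    just ahead of the fingertips; otherwise leave the pose unchanged."""
    if hand_to_surface_m > approach_threshold_m:
        return screen_pose
    up = arm_dir_in_plane / np.linalg.norm(arm_dir_in_plane)
    centre = np.asarray(fingertip_xy) + gap_m * up
    return {"centre": centre, "up": up}
```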
FIG. 9 is a block diagram showing an example of the hardware configuration of the input device according to the embodiment. The control unit 31 comprises a CPU (Central Processing Unit) and the like, and executes the processing of the contact detection unit 12, the angle detection unit 13, and the display control unit 15 in accordance with a control program 39 stored in the external storage unit 33.
The main storage unit 32 comprises a RAM (Random-Access Memory) and the like; the control program 39 stored in the external storage unit 33 is loaded into it, and it is used as the work area of the control unit 31.
The external storage unit 33 comprises non-volatile memory such as flash memory, a hard disk, a DVD-RAM (Digital Versatile Disc Random-Access Memory), or a DVD-RW (Digital Versatile Disc ReWritable). It stores in advance the program that causes the control unit 31 to perform the processing of the input device 1, supplies data stored by this program to the control unit 31 in accordance with instructions from the control unit 31, and stores data supplied from the control unit 31. The storage unit 14 is implemented in the external storage unit 33.
The operation unit 34 comprises a keyboard, a pointing device such as a mouse, and an interface device connecting them to the internal bus 30. When a system administrator inputs information to the input device 1, the input information is supplied to the control unit 31 via the operation unit 34.
The display unit 35 comprises a CRT, an LCD, or the like. When a system administrator inputs information to the input device 1, the display unit 35 displays an operation screen.
The transmission/reception unit 36 comprises a network termination device or wireless communication device connected to a network, and a serial interface or LAN (Local Area Network) interface connected to it. The transmission/reception unit 36 functions as the depth acquisition unit 11 and the display control unit 15.
The processing of the depth acquisition unit 11, the contact detection unit 12, the angle detection unit 13, the storage unit 14, and the display control unit 15 shown in FIG. 2 is executed by the control program 39 using the control unit 31, the main storage unit 32, the external storage unit 33, the operation unit 34, the display unit 35, and the transmission/reception unit 36 as resources.
The hardware configuration and flowcharts described above are examples and may be changed and modified arbitrarily.
The central portion that performs the control processing, comprising the control unit 31, the main storage unit 32, the external storage unit 33, the internal bus 30, and so on, can be realized with an ordinary computer system rather than a dedicated system. For example, the input device 1 that executes the above processing may be constituted by storing a computer program for executing the above operations on a computer-readable recording medium (a flexible disk, CD-ROM, DVD-ROM, or the like), distributing it, and installing the computer program on a computer. Alternatively, the input device 1 may be constituted by storing the computer program in a storage device of a server apparatus on a communication network such as the Internet and having an ordinary computer system download it.
When the functions of the input device 1 are realized by sharing between an OS and an application program, or by cooperation between the OS and the application program, only the application program portion may be stored on the recording medium or in the storage device.
It is also possible to superimpose the computer program on a carrier wave and distribute it via a communication network. For example, the computer program may be posted on a bulletin board system (BBS) on a communication network and distributed via the network. The system may then be configured so that the above processing is executed by starting this computer program and running it under the control of the OS in the same manner as other application programs.
The present invention admits various embodiments and modifications without departing from its broad spirit and scope. The embodiments described above are for explaining the present invention and do not limit its scope; that is, the scope of the present invention is indicated not by the embodiments but by the claims, and various modifications made within the scope of the claims, and within the scope of the meaning of inventions equivalent thereto, are regarded as falling within the scope of the present invention.
Some or all of the above embodiments can also be described as in the following appendices, but are not limited to them.
(Appendix 1)
An input system comprising:
an input device having contact detection means for detecting contact of an object with a projection target surface of an operation screen on a target object, angle detection means for detecting a contact elevation angle and a contact azimuth angle, with respect to the projection target surface, of the object in contact with the projection target surface, and display control means for transmitting to a projection device, based on the contact elevation angle and the contact azimuth angle detected by the angle detection means, instruction information that causes the operation screen to be displayed at a predetermined position relative to the position at which the projection target surface is in contact with the object; and
the projection device, which projects the operation screen onto the target object in accordance with the instruction information received from the input device.
(Appendix 2)
The input system according to Appendix 1, wherein, when the contact detection means detects a contact point, the angle detection means detects, as the contact elevation angle and the contact azimuth angle, the elevation angle and the azimuth angle of the one-dimensional direction in which the object in contact with the target object extends from the contact point.
(Appendix 3)
The input system according to Appendix 1 or 2, wherein, when the contact detection means detects a contact line, the angle detection means detects, as the contact elevation angle and the contact azimuth angle, the elevation angle and the azimuth angle, with respect to the tangent plane, of the normal of the two-dimensional surface that extends from the contact line of the object in contact with the target object.
(Appendix 4)
The input system according to any one of Appendices 1 to 3, wherein, when the contact detection means detects a contact area, the angle detection means detects, as the contact elevation angle and the contact azimuth angle, the elevation angle and the azimuth angle of the one-dimensional direction in which the object in contact with the target object extends from the contact area.
(Appendix 5)
The input system according to any one of Appendices 1 to 4, wherein the contact detection means detects the contact of the object with the projection target surface of the operation screen on the target object by analyzing distribution information of depth values indicating the depth of the object in a region including the target object.
(Appendix 6)
The input system according to any one of Appendices 1 to 5, wherein the angle detection means detects the contact elevation angle and the contact azimuth angle, with respect to the projection target surface, of the object in contact with the projection target surface by analyzing distribution information of depth values indicating the depth of the object in a region including the target object.
(Appendix 7)
An input device connected to a projection device, comprising:
contact detection means for detecting contact of an object with a projection target surface of an operation screen on a target object;
angle detection means for detecting a contact elevation angle and a contact azimuth angle, with respect to the projection target surface, of the object in contact with the projection target surface; and
display control means for transmitting to the projection device, based on the contact elevation angle and the contact azimuth angle detected by the angle detection means, instruction information that causes the operation screen to be displayed at a predetermined position relative to the position at which the projection target surface is in contact with the object.
(Appendix 8)
The input device according to Appendix 7, wherein, when the contact detection means detects a contact point, the angle detection means detects, as the contact elevation angle and the contact azimuth angle, the elevation angle and the azimuth angle of the one-dimensional direction in which the object in contact with the target object extends from the contact point.
(Appendix 9)
The input device according to Appendix 7 or 8, wherein, when the contact detection means detects a contact line, the angle detection means detects, as the contact elevation angle and the contact azimuth angle, the elevation angle and the azimuth angle, with respect to the tangent plane, of the normal of the two-dimensional surface that extends from the contact line of the object in contact with the target object.
(Appendix 10)
The input device according to any one of Appendices 7 to 9, wherein, when the contact detection means detects a contact area, the angle detection means detects, as the contact elevation angle and the contact azimuth angle, the elevation angle and the azimuth angle of the one-dimensional direction in which the object in contact with the target object extends from the contact area.
(Appendix 11)
The input device according to any one of Appendices 7 to 10, wherein the contact detection means detects the contact of the object with the projection target surface of the operation screen on the target object by analyzing distribution information of depth values indicating the depth of the object in a region including the target object.
(Appendix 12)
The input device according to any one of Appendices 7 to 11, wherein the angle detection means detects the contact elevation angle and the contact azimuth angle, with respect to the projection target surface, of the object in contact with the projection target surface by analyzing distribution information of depth values indicating the depth of the object in a region including the target object.
(Appendix 13)
An input method comprising:
a contact detection step, executed by an input device, of detecting contact of an object with a projection target surface of an operation screen on a target object;
an angle detection step, executed by the input device, of detecting a contact elevation angle and a contact azimuth angle, with respect to the projection target surface, of the object in contact with the projection target surface;
a display control step, executed by the input device, of transmitting to a projection device, based on the contact elevation angle and the contact azimuth angle detected in the angle detection step, instruction information that causes the operation screen to be displayed at a predetermined position relative to the position at which the projection target surface is in contact with the object; and
a step, executed by the projection device, of projecting the operation screen onto the target object in accordance with the instruction information received from the input device.
(Appendix 14)
The input method according to Appendix 13, wherein, when a contact point is detected in the contact detection step, the angle detection step detects, as the contact elevation angle and the contact azimuth angle, the elevation angle and the azimuth angle of the one-dimensional direction in which the object in contact with the target object extends from the contact point.
(Appendix 15)
The input method according to Appendix 13 or 14, wherein, when a contact line is detected in the contact detection step, the angle detection step detects, as the contact elevation angle and the contact azimuth angle, the elevation angle and the azimuth angle, with respect to the tangent plane, of the normal of the two-dimensional surface that extends from the contact line of the object in contact with the target object.
(Appendix 16)
The input method according to any one of Appendices 13 to 15, wherein, when a contact area is detected in the contact detection step, the angle detection step detects, as the contact elevation angle and the contact azimuth angle, the elevation angle and the azimuth angle of the one-dimensional direction in which the object in contact with the target object extends from the contact area.
(Appendix 17)
The input method according to any one of Appendices 13 to 16, wherein the contact detection step detects the contact of the object with the projection target surface of the operation screen on the target object by analyzing distribution information of depth values indicating the depth of the object in a region including the target object.
(Appendix 18)
The input method according to any one of Appendices 13 to 17, wherein the angle detection step detects the contact elevation angle and the contact azimuth angle, with respect to the projection target surface, of the object in contact with the projection target surface by analyzing distribution information of depth values indicating the depth of the object in a region including the target object.
(Appendix 19)
A recording medium storing a program that causes a computer connected to a projection device to function as:
contact detection means for detecting contact of an object with a projection target surface of an operation screen on a target object;
angle detection means for detecting a contact elevation angle and a contact azimuth angle, with respect to the projection target surface, of the object in contact with the projection target surface; and
display control means for transmitting to the projection device, based on the contact elevation angle and the contact azimuth angle detected by the angle detection means, instruction information that causes the operation screen to be displayed at a predetermined position relative to the position at which the projection target surface is in contact with the object.
The present invention is based on Japanese Patent Application No. 2015-012683 filed on January 26, 2015. The specification, claims, and entire drawings of Japanese Patent Application No. 2015-012683 are incorporated herein by reference.
1 Input device
2 Depth sensor
3 Projection device
11 Depth acquisition unit
12 Contact detection unit
13 Angle detection unit
14 Storage unit
15 Display control unit
30 Internal bus
31 Control unit
32 Main storage unit
33 External storage unit
34 Operation unit
35 Display unit
36 Transmission/reception unit
39 Control program
41 Upper surface
100 Input system
F Floor region
P1 to P9 End points
S Skeleton
T Table region
U User region

Claims (9)

1. An input system comprising:
an input device having contact detection means for detecting contact of an object with a projection target surface of an operation screen on a target object, angle detection means for detecting a contact elevation angle and a contact azimuth angle, with respect to the projection target surface, of the object in contact with the projection target surface, and display control means for transmitting to a projection device, based on the contact elevation angle and the contact azimuth angle detected by the angle detection means, instruction information that causes the operation screen to be displayed at a predetermined position relative to the position at which the projection target surface is in contact with the object; and
the projection device, which projects the operation screen onto the target object in accordance with the instruction information received from the input device.
2. The input system according to claim 1, wherein, when the contact detection means detects a contact point, the angle detection means detects, as the contact elevation angle and the contact azimuth angle, the elevation angle and the azimuth angle of the one-dimensional direction in which the object in contact with the target object extends from the contact point.
3. The input system according to claim 1 or 2, wherein, when the contact detection means detects a contact line, the angle detection means detects, as the contact elevation angle and the contact azimuth angle, the elevation angle and the azimuth angle, with respect to the tangent plane, of the normal of the two-dimensional surface that extends from the contact line of the object in contact with the target object.
  4.  The input system according to any one of claims 1 to 3, wherein, when the contact detection means detects a contact region, the angle detection means detects, as the contact elevation angle and the contact azimuth angle, the elevation angle and the azimuth angle of the one-dimensional direction in which the object in contact with the target object extends from the contact region.
  5.  The input system according to any one of claims 1 to 4, wherein the contact detection means detects the contact of the object with the projection target surface of the operation screen on the target object by analyzing distribution information of depth values indicating the depth of the object in a region containing the target object.
  6.  The input system according to any one of claims 1 to 5, wherein the angle detection means detects the contact elevation angle and the contact azimuth angle, with respect to the projection target surface, of the object in contact with the projection target surface by analyzing distribution information of depth values indicating the depth of the object in a region containing the target object.
  7.  An input device connected to a projection device, the input device comprising:
     contact detection means for detecting contact of an object with a projection target surface of an operation screen on a target object;
     angle detection means for detecting a contact elevation angle and a contact azimuth angle, with respect to the projection target surface, of the object in contact with the projection target surface; and
     display control means for transmitting, to the projection device, instruction information for displaying the operation screen at a predetermined position referenced to the position at which the object contacts the projection target surface, based on the contact elevation angle and the contact azimuth angle detected by the angle detection means.
  8.  An input method comprising:
     executed by an input device,
     a contact detection step of detecting contact of an object with a projection target surface of an operation screen on a target object;
     an angle detection step of detecting a contact elevation angle and a contact azimuth angle, with respect to the projection target surface, of the object in contact with the projection target surface; and
     a display control step of transmitting, to a projection device, instruction information for displaying the operation screen at a predetermined position referenced to the position at which the object contacts the projection target surface, based on the contact elevation angle and the contact azimuth angle detected in the angle detection step; and
     executed by the projection device,
     a step of projecting the operation screen onto the target object in accordance with the instruction information received from the input device.
  9.  A recording medium storing a program for causing a computer connected to a projection device to function as:
     contact detection means for detecting contact of an object with a projection target surface of an operation screen on a target object;
     angle detection means for detecting a contact elevation angle and a contact azimuth angle, with respect to the projection target surface, of the object in contact with the projection target surface; and
     display control means for transmitting, to the projection device, instruction information for displaying the operation screen at a predetermined position referenced to the position at which the object contacts the projection target surface, based on the contact elevation angle and the contact azimuth angle detected by the angle detection means.
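Claims 5 and 6 state that both the contact itself and the contact angles are obtained by analyzing the distribution of depth values over the region containing the target object. As a minimal sketch of that idea, assuming a depth camera registered to the projection target surface and using NumPy (every name, threshold, and the SVD line fit below are assumptions of this sketch, not values taken from the application), a detector could flag pixels lying within a narrow height band above the surface as touching, then fit a principal direction to the finger pixels above that band to recover the contact elevation and azimuth angles:

    import numpy as np

    def detect_contact_and_angles(depth, surface_depth, touch_band=(2.0, 12.0)):
        # Toy illustration of claims 5-6 (assumed names and thresholds):
        # find a touch point and the elevation/azimuth of the touching object
        # from a depth image registered to the projection target surface.
        height = surface_depth - depth                 # height above the surface, in mm
        touching = (height > touch_band[0]) & (height < touch_band[1])
        if not touching.any():
            return None                                # nothing in contact

        ys, xs = np.nonzero(touching)
        contact = (xs.mean(), ys.mean())               # centroid of the touch blob

        # Pixels somewhat higher above the surface are treated as the finger shaft.
        finger = (height >= touch_band[1]) & (height < 80.0)
        fy, fx = np.nonzero(finger)
        if fx.size < 10:
            return contact, None, None                 # too few points to fit a direction

        pts = np.column_stack([fx, fy, height[fy, fx]]).astype(float)
        pts -= pts.mean(axis=0)
        _, _, vt = np.linalg.svd(pts, full_matrices=False)
        dx, dy, dz = vt[0]                             # principal direction of the finger

        azimuth = np.arctan2(dy, dx)                   # direction within the surface plane
        elevation = np.arctan2(abs(dz), np.hypot(dx, dy))  # tilt away from the surface
        return contact, elevation, azimuth

In practice the segmentation, filtering, and calibration would be considerably more involved; the sketch only shows that both angles are recoverable from the depth-value distribution alone, which is all the claims require.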
PCT/JP2016/052054 2015-01-26 2016-01-25 Input system, input device, input method, recording medium WO2016121708A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016572027A JPWO2016121708A1 (en) 2015-01-26 2016-01-25 INPUT SYSTEM, INPUT DEVICE, INPUT METHOD, AND PROGRAM

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015012683 2015-01-26
JP2015-012683 2015-01-26

Publications (1)

Publication Number Publication Date
WO2016121708A1 true WO2016121708A1 (en) 2016-08-04

Family

ID=56543321

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/052054 WO2016121708A1 (en) 2015-01-26 2016-01-25 Input system, input device, input method, recording medium

Country Status (2)

Country Link
JP (1) JPWO2016121708A1 (en)
WO (1) WO2016121708A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04257014A (en) * 1991-02-12 1992-09-11 Matsushita Electric Ind Co Ltd Input device
JP2010140321A (en) * 2008-12-12 2010-06-24 Sony Corp Information processing apparatus, information processing method, and program
JP2013125552A (en) * 2011-12-15 2013-06-24 Toshiba Corp Information processor and image display program
US20140055396A1 (en) * 2012-08-27 2014-02-27 Microchip Technology Incorporated Input Device with Hand Posture Control
JP2014216006A (en) * 2013-04-22 2014-11-17 富士ゼロックス株式会社 System, program and method for receiving input of user

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YOSHIKI TAKEOKA ET AL.: "Z-touch: 3D Multi-touch System which Detects Spatial Posture of Fingertips", TRANSACTIONS OF INFORMATION PROCESSING SOCIETY OF JAPAN, vol. 53, no. 4, 15 April 2012 (2012-04-15), pages 1338-1348 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111886567A (en) * 2018-03-07 2020-11-03 日本电气方案创新株式会社 Operation input device, operation input method, and computer-readable recording medium
CN111886567B (en) * 2018-03-07 2023-10-20 日本电气方案创新株式会社 Operation input device, operation input method, and computer-readable recording medium

Also Published As

Publication number Publication date
JPWO2016121708A1 (en) 2017-11-24

Similar Documents

Publication Publication Date Title
EP3015955B1 (en) Controlling multiple devices with a wearable input device
JP5552772B2 (en) Information processing apparatus, information processing method, and computer program
JP5507773B1 (en) Element selection device, element selection method, and program
US8860679B2 (en) Pointing to a desired object displayed on a touchscreen
US8456421B2 (en) Selection device and method
JP6325659B2 (en) Operation screen display device, operation screen display method and program
US20160191875A1 (en) Image projection apparatus, and system employing interactive input-output capability
CN108153475B (en) Object position switching method and mobile terminal
JP6569794B1 (en) Information processing apparatus and program
KR20190059726A (en) Method for processing interaction between object and user of virtual reality environment
WO2016121708A1 (en) Input system, input device, input method, recording medium
WO2017163566A1 (en) Program, computer device, program execution method, and system
JP2015141686A (en) Pointing device, information processing device, information processing system, and method for controlling pointing device
WO2019150430A1 (en) Information processing device
KR101513343B1 (en) Method for controlling motions of an object in a 3-dimensional virtual environment
JP6411067B2 (en) Information processing apparatus and input method
JP2017188140A (en) Program, computer device, program execution method, and system
JP2016119019A (en) Information processing apparatus, information processing method, and program
JP2015153197A (en) Pointing position deciding system
JP2013109538A (en) Input method and device
JP2013134549A (en) Data input device and data input method
JP2012089083A (en) Information input device and method for electronic apparatus
KR20080017194A (en) Wireless mouse and driving method thereof
JP2020062376A (en) Information processor and program
JP6248723B2 (en) Coordinate detection system, coordinate detection method, information processing apparatus, and program

Legal Events

Date Code Title Description
121   Ep: the epo has been informed by wipo that ep was designated in this application
      Ref document number: 16743311; Country of ref document: EP; Kind code of ref document: A1
ENP   Entry into the national phase
      Ref document number: 2016572027; Country of ref document: JP; Kind code of ref document: A
NENP  Non-entry into the national phase
      Ref country code: DE
122   Ep: pct application non-entry in european phase
      Ref document number: 16743311; Country of ref document: EP; Kind code of ref document: A1