WO2018154714A1 - Operation input system, operation input method, and operation input program - Google Patents

Operation input system, operation input method, and operation input program Download PDF

Info

Publication number
WO2018154714A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
wave
display screen
operation input
reflected wave
Prior art date
Application number
PCT/JP2017/007123
Other languages
French (fr)
Japanese (ja)
Inventor
洋海 澤田
Original Assignee
洋海 澤田
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 洋海 澤田 filed Critical 洋海 澤田
Priority to PCT/JP2017/007123 priority Critical patent/WO2018154714A1/en
Publication of WO2018154714A1 publication Critical patent/WO2018154714A1/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Definitions

  • the present invention relates to an operation input system, an operation input method, and an operation input program for inputting a coordinate position on a display screen of an information processing apparatus and an operation command related to the coordinate position.
  • Conventionally, a method of designating a coordinate position on a display screen with a pointing device such as a mouse, a touch pad, or a touch panel, and of inputting an operation command at that coordinate position, such as a button click, tap, or swipe, has become widespread.
  • There is known an operation input device in which a camera provided in the information processing terminal captures an image of the operator's hand, an operation area corresponding to the screen of the terminal is set in the air near the position of the finger, the movement of the finger in the operation area is detected by image analysis, and the on-screen cursor is moved, or an icon is highlighted and designated, from that movement (see Patent Document 1). According to the operation input device disclosed in Patent Document 1, an operation input can be determined with little burden on the operator, without requiring direct finger contact as with a mouse, a touch pad, or a touch panel.
  • There is also known an information input device that, as disclosed in Patent Document 2, detects the minimum point in a distance image (a distribution of distances from the camera to the object surface), obtains the position of the operator's fingertip, and enables pointing on the screen by its movement.
  • An object of the present invention is to provide an operation input system, an operation input method, and an operation input program that can accurately acquire an operation input.
  • the present invention is an operation input system for inputting an operation command related to a coordinate position on the display screen of the information processing apparatus and the coordinate position,
  • a transmitter that transmits electromagnetic waves or sound waves as a transmitted wave to a monitoring space area including a part or all of the visual field range in which the display screen is visible;
  • a receiving unit that receives the transmitted wave reflected by the object as a reflected wave;
  • a contact/separation detection unit that detects the movement of the object toward or away from the transmission unit or the reception unit and its speed; and
  • a signal processing unit that detects, based on the transmitted wave and the received reflected wave, position information on the position and displacement in three-dimensional space of the object that generated the reflected wave, generates an input signal related to a coordinate position on the display screen based on that position information and the detection result of the contact/separation detection unit, and provides the input signal to the information processing apparatus.
  • The present invention is also an operation input method for inputting a coordinate position on the display screen of the information processing apparatus and an operation command related to the coordinate position, in which an electromagnetic wave or sound wave is transmitted as a transmitted wave from the transmitter toward a monitoring space area including part or all of the visual field range in which the display screen is visible, and the transmitted wave reflected by the object is received as a reflected wave.
  • the position of the object in the three-dimensional space and its displacement are detected based on the transmitted wave and the reflected wave, and input signals relating to cursor movement and basic command operations are provided to the information processing apparatus.
  • a pointing operation by an operator can be input without requiring a device such as a mouse, a touch pad, or a touch panel that is directly operated.
  • the approach or separation of the object relative to the transmitting unit or the receiving unit can be detected without performing heavy processing such as image analysis, and an operation input that places little burden on the operator, such as a fine and quick movement of a fingertip, can be accurately acquired.
  • Filtering: In the above invention, it is preferable to further include a filtering unit that extracts in advance only the minimum data necessary for subsequent processing of the received reflected wave, that is, only the data from the initial wave of the reflected wave up to a predetermined time, and sends it to the subsequent processing flow.
  • the signal processing unit generates the input signal based on the reflected wave extracted by the filtering unit.
  • Preferably, the signal processing unit includes an operation surface setting unit that sets, in the monitoring space region, a virtually continuous, innumerable set of operation surfaces (a "3D touch panel") that are parallel to the display screen or at a predetermined angle to it, have the same or a similar shape to the display screen, and change in size according to the distance from the display screen to the monitoring target point, and generates the pointing input signal according to the coordinate position of the monitoring target point on the virtual operation surface of the 3D touch panel.
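As an illustration of how such a distance-scaled virtual operation surface could map a fingertip to screen coordinates, the following Python sketch may help. The linear scaling model, the millimetre units, and all parameter names are assumptions introduced for illustration; the patent only states that the surface is similar in shape to the screen and changes size with the distance to the monitoring target point.

```python
def to_screen_coords(finger_xyz, screen_w_px, screen_h_px,
                     surface_w0_mm, surface_h0_mm, scale_per_mm):
    """Map a fingertip position to pixel coordinates on the display.

    The virtual operation surface at depth z is modelled as a rectangle
    similar in shape to the display whose physical size grows linearly
    with z (hypothetical linear model).
    """
    x, y, z = finger_xyz                           # mm, origin at surface centre
    w = surface_w0_mm * (1.0 + scale_per_mm * z)   # surface width at depth z
    h = surface_h0_mm * (1.0 + scale_per_mm * z)   # surface height at depth z
    u = min(max(x / w + 0.5, 0.0), 1.0)            # normalised [0, 1] position
    v = min(max(y / h + 0.5, 0.0), 1.0)
    return round(u * (screen_w_px - 1)), round(v * (screen_h_px - 1))
```

Because the surface widens with depth, the same angular sweep of the finger covers the whole screen at any distance, which is the practical point of the "innumerable" similar surfaces.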
  • Subcommand key: In the above invention, since the Doppler commands are limited to two types, button down and button up, and combinations thereof, only the basic commands executable with current mice, touch pads, and touch panels can be realized. It is therefore preferable to further provide one subcommand key alongside the pointing gesture and diversify the input commands in addition to the Doppler command. This subcommand key can also be replaced by other methods such as voice input.
  • In the distance image, the shape of the finger at the time of the pointing gesture appears schematically as a columnar shape. It is also possible to adopt an operation method in which a pointer is displayed at the coordinate position where a vector along the axis of this columnar shape intersects the display screen, treating that position as the monitoring target point. In this case too, it is preferable that the contact/separation detection unit monitors the Doppler effect using the reflected wave used for the distance image.
  • The above-described apparatus and method according to the present invention can be realized by executing the operation input program of the present invention, described in a predetermined language, on a computer. That is, by installing the program of the present invention in a mobile terminal device, a smartphone, a wearable terminal, a mobile personal computer, another information processing terminal, or an IC chip or memory device of a general-purpose computer such as a personal computer or a server computer, and executing it on the CPU, a system having each of the functions described above can be constructed.
  • the program of the present invention is an operation input program for inputting a coordinate position on the display screen of the information processing apparatus and an operation command related to the coordinate position,
  • a transmitter that transmits electromagnetic waves or sound waves as a transmitted wave to a monitoring space area including a part or all of the visual field range in which the display screen is visible,
  • a receiving unit that receives the transmitted wave reflected by the object as a reflected wave,
  • a signal processing unit that detects, based on the transmitted wave and the received reflected wave, position information on the three-dimensional position of the object that generated the reflected wave and position information regarding its displacement, generates an input signal related to a coordinate position on the display screen based on that position information, and provides it to the information processing apparatus, and
  • a contact/separation detection unit that compares the wavelength of the transmitted wave with the wavelength of the reflected wave and detects the movement and speed of the object in the direction of approaching or leaving the transmitter or the receiver.
  • The operation input program of the present invention can be provided, for example, as a package application that can be distributed through a communication line, or that runs on a stand-alone computer by being recorded on a computer-readable recording medium.
  • The program can be recorded on various recording media, such as magnetic recording media such as flexible disks and cassette tapes, optical disks such as CD-ROMs and DVD-ROMs, and RAM cards.
  • By using the computer-readable recording medium on which the program is recorded, the above-described system and method can be easily implemented on a general-purpose or dedicated computer, and the program can be easily stored, transported, and installed.
  • Conventionally, on a desktop or on the display screen of an information processing terminal device such as a notebook personal computer, a tablet personal computer, or a smartphone, pointing and command input are performed by direct contact with a device such as a mouse, a touch pad, or a touch panel.
  • It is a block diagram showing the functional modules constructed on the CPU of the operation input system according to the first embodiment. It is an explanatory diagram showing the detection process of the fingertip monitoring target point of the operation input system according to the first embodiment. It is an explanatory diagram showing the outline
  • FIG. 1 is a conceptual diagram showing the external configuration of an information processing terminal device for realizing the operation input system according to the present embodiment.
  • FIG. 2 is a block diagram showing the hardware configuration of the information processing terminal device for realizing the operation input system according to the present embodiment.
  • The term “module” used in the description refers to a functional unit that is configured by hardware such as an apparatus or a device, by software having the function, or by a combination thereof, and that achieves a predetermined operation.
  • the information processing terminal device can be realized by a general-purpose computer or a dedicated device.
  • The notebook personal computer 1 is an information processing terminal device that includes a display screen 5b such as a monitor, integrated so that it folds in half against a main body having a keyboard, a pointing device, and the like; the transmitter 5c and the receivers 6a0 to 6a2 are arranged on the lower side of the display screen 5b, facing the operator.
  • The present invention is not limited to this; any information processing terminal that can be equipped with or connected to the transmission unit 5c and the reception units 6a0 to 6a2 may be used. It can be realized by a general-purpose computer such as a desktop personal computer or by a dedicated device specialized in function, and a tablet personal computer, smartphone, mobile phone, wearable terminal device, or the like can also be adopted.
  • The hardware configuration of the notebook personal computer 1 includes a CPU 11, a memory 12, an input interface 16, a storage device 14, an output interface 15, and a communication interface 13.
  • these devices are connected via the CPU bus 10 and can exchange data with each other.
  • The input interface 16 is a module that receives an operation signal from an operation device such as the keyboard 6b, a pointing device, a touch panel, or a button. The received operation signal is transmitted to the CPU 11, enabling operations on the OS and each application. Input devices such as a CCD camera 6d and a microphone 6c can also be connected to the input interface 16, and the receiving units 6a0 to 6a2 of the present invention are likewise connected to the input interface 16.
  • The receiving units 6a0 to 6a2 are receiving devices with directivity that blocks re-reflection from nearby surfaces and selectively receives only the direct reflected wave from the monitoring target in the air. More specifically, the transmitted wave irradiates, with a predetermined directivity, the space W containing nothing but the monitoring target and the operator; among the waves reflected by the target object, the receiving units receive as the reflected wave W2 only the direct reflected wave that reaches them, their directivity preventing re-reflections from the direction of the keyboard from being received. The received reflected wave W2 is input to the signal processing unit 112 and the contact/separation detection unit 113 through the synchronization processing unit 111 shown in FIG.
  • the receiving units 6a0 to 6a2 are a plurality of infrared sensors that are physically separated and arranged adjacent to each other, and are arranged in a triangle on the lower right side of the display screen 5b. If there is sufficient resolution, various sensors corresponding to the types of electromagnetic waves and sound waves transmitted by the transmitter 5c, such as electromagnetic sensors other than infrared rays and ultrasonic sensors, can be employed.
  • the output interface 15 is a module that transmits video signals and audio signals in order to output video and audio from output devices such as the display screen 5b and the speaker 5a.
  • a transmitter 5c that transmits electromagnetic waves or sound waves as a transmitted wave is also connected to the output interface 15.
  • The transmitter 5c is a transmitting device that transmits electromagnetic waves or sound waves as a transmitted wave, with a predetermined directivity, toward the monitoring space region R3 including part or all of the visual field range R2 in which the display screen 5b is visible, and transmits the transmitted wave in accordance with control by the synchronization processing unit 111 shown in FIG.
  • The transmitter 5c employs an infrared emitter that irradiates infrared laser pulses, and is arranged at the center of the receivers 6a0 to 6a2, which are arranged in a triangle at the lower right of the display screen 5b.
  • The transmitter 5c is not limited to infrared; various transmitters can be employed corresponding to the types of electromagnetic waves and sound waves that the receivers 6a0 to 6a2 can detect, such as an ultrasonic transmitter or a radio transmitter.
  • the communication interface 13 is a module that transmits and receives data to and from other communication devices.
  • As a communication method, for example, a public line such as a telephone line, an ISDN line, an ADSL line, or an optical line, a dedicated line, or a wireless communication network such as WCDMA (registered trademark), 3G (3rd generation), 4G (4th generation), 5G (5th generation), WiFi (registered trademark), or Bluetooth (registered trademark) can be used.
  • the storage device 14 is a device that accumulates data in a recording medium and reads out the accumulated data in response to a request from each device.
  • It can be configured with a hard disk drive (HDD), a solid state drive (SSD), a memory card, or the like.
  • the CPU 11 is a device that performs various arithmetic processes necessary for controlling each unit, and virtually constructs various modules on the CPU 11 by executing various programs.
  • The CPU 11 executes the OS (Operating System) program, thereby managing and controlling the basic functions of the notebook personal computer 1; various applications can be executed on the OS, and by the CPU 11 executing application programs, various functional modules are virtually constructed on the CPU.
  • FIG. 3 is a block diagram showing functional modules constructed on the CPU of the operation input system according to the first embodiment.
  • three modules of the synchronization processing unit 111, the signal processing unit 112, and the contact / separation detection unit 113 are constructed on the CPU 11.
  • The synchronization processing unit 111 is a module that synchronizes the transmission process of the transmitted wave W1 in the transmission unit 5c with the reception process of the reflected wave W2 in the reception unit 6a, and associates the transmitted wave W1 with the reflected wave W2.
  • In the present embodiment, the transmission unit 5c transmits intermittent pulsed waves as the transmitted wave W1 and the reflected waves corresponding to the transmitted pulses are received; based on the order, the times of transmission and reception, and the transmit-to-receive time length, each transmitted wave signal and its corresponding reflected wave signal are selected, associated as a set, and sent to the signal processing unit 112.
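The pulse-to-echo association performed by the synchronization processing unit can be sketched as follows. This is a simplification of the order/time/length matching described above: it assumes pulses are spaced further apart than the maximum plausible round-trip time, so each echo belongs to the latest pulse transmitted before it. All names and the time-window parameter are illustrative.

```python
def pair_pulses(tx_times, rx_times, max_tof):
    """Associate each received echo with the transmitted pulse it answers.

    tx_times / rx_times are timestamps in seconds; max_tof is the
    longest plausible round-trip time (an assumed cutoff).
    """
    pairs = []
    for rx in rx_times:
        # latest transmission that precedes this echo
        candidates = [tx for tx in tx_times if tx <= rx]
        if not candidates:
            continue                    # echo before any pulse: noise
        tx = max(candidates)
        if rx - tx <= max_tof:          # discard implausibly late echoes
            pairs.append((tx, rx))
    return pairs
```

Each resulting (transmit, receive) pair is what the signal processing unit would consume as one "set" of transmitted and reflected wave signals.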
  • The signal processing unit 112 is a module that detects the spatial position information of the object that generated the reflected wave W2 and its displacement, based on the transmitted wave W1 transmitted by the transmitting unit 5c and the reflected wave W2 received by the receiving unit 6a, generates an input signal related to a coordinate position on the display screen 5b, and provides it to the OS or an application running on the information processing apparatus.
  • The pointing gesture recognition unit 112b determines the presence or absence of the pointing gesture, as shown in FIGS. 1, 4 to 9, 13, 18, and 19.
  • The reflected wave has a strong amplitude at the fingertip B1, which is perpendicular to the traveling direction of the transmitted wave, and little to no amplitude at the pad of the finger, which is parallel to the traveling direction of the transmitted wave or meets it at a predetermined angle (for example, an obtuse angle); the reflected wave therefore forms an amplitude-time profile like that seen in the A1 region of FIG. 10. Conversely, if such a characteristic amplitude-time profile is observed in the reflected wave, it can be presumed that a pointing gesture was present.
  • the three distance values calculated in this way are input to the position information calculation unit 112d.
  • Spatial position information is obtained by three-point ranging from the three reception units arranged in a triangle, and the accuracy of the three-dimensional position information of the target point can be improved by devising the arrangement.
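The three-point ranging described above can be sketched as standard trilateration. The following Python/NumPy example is illustrative only: it assumes, for simplicity, that each distance is already a one-way receiver-to-target range (for example r = c·t/2 with a co-located transmitter), and it resolves the two mirror-image solutions by taking the one in front of the receiver plane.

```python
import numpy as np

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Recover a 3-D target point from its distances to three receivers.

    Standard closed-form trilateration.  The sign of the out-of-plane
    coordinate is ambiguous; the solution in front of the receiver
    plane is returned, matching a target in the monitoring space.
    """
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)      # local x axis
    i = ex.dot(p3 - p1)
    ey = p3 - p1 - i * ex
    ey = ey / np.linalg.norm(ey)                  # local y axis
    ez = np.cross(ex, ey)                         # local z axis
    d = np.linalg.norm(p2 - p1)
    j = ey.dot(p3 - p1)
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))    # clamp small negatives
    return p1 + x * ex + y * ey + z * ez
```

With the three receivers in a triangle, any fingertip in the monitoring space yields three ranges and hence one point, which is why the triangular arrangement matters for accuracy.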
  • In the present embodiment the number of transmitting units is one, but it can be increased to the same number as the receiving units by synchronizing each with its receiving unit.
  • the combination of the number and arrangement of the transmitters and receivers can be optimized in accordance with the required accuracy of the monitoring target point position information.
  • In the present embodiment, the most basic configuration of one transmission unit and three reception units is adopted, but as shown in FIG. 18, three transmission units may be paired with three reception units, one corresponding to each.
  • The operation surface setting unit 112e is a module for setting, in the monitoring space region R3, a virtually continuous, innumerable set of virtual operation screens VP (a 3D touch panel) that are parallel to, or at an obtuse angle to, the display screen 5b, have the same or a similar shape to it, and change in size according to the distance to the target point.
  • The virtual operation screen VP or VP2 at distance D1 or D2, forming the bottom surface of the quadrangular pyramid, may be set, for example, perpendicular to the normal of the display screen 5b, perpendicular to (or at a predetermined angle to) an extension line passing through the transmission unit or the reception unit, or parallel to the display screen 5b.
  • the framework 3D grid of the 3D touch panel set by the operation surface setting unit 112e is input to the pointing signal generation unit 112f.
  • The pointing signal generation unit 112f is a module that generates, as a pointing signal, a GUI (Graphical User Interface) element such as a cursor, a pointer, or a scroll bar displayed at the corresponding coordinates on the screen.
  • The generated signal is input to the input signal generation unit 112g. This concludes the description of the signal processing unit.
  • The contact/separation detector 113 is a module that detects the Doppler effect by comparing the wavelength of the transmitted wave with the wavelength of the reflected wave, thereby detecting the movement of the object in the direction of approach or separation and its speed.
  • the contact / separation detection unit 113 includes a Doppler effect detection unit 113a and a Doppler command signal generation unit 113b.
  • the average of the maximum speeds is input to the Doppler command signal generator 113b.
  • Doppler commands: In the present embodiment, a blue shift, where the wavelength of the reflected wave is shortened, is interpreted as a button-down operation, and a red shift, where the wavelength is lengthened, as a button-up operation. For example, a rapid button-down operation is determined to be a mouse click, and two rapid button-down operations in succession are determined to be a double-click. A rapid button up is determined to start a drag operation, and a rapid button down after selecting the drag range is determined to complete the drag operation.
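The wavelength comparison and the button-down/up interpretation above can be sketched as follows. This is a simplified model: the low-speed two-way Doppler approximation, the example wave speed, and the double-click time window are illustrative assumptions, not values from the patent.

```python
def radial_velocity(lambda_tx, lambda_rx, wave_speed):
    """Radial speed from the Doppler wavelength shift of a reflected wave.

    Positive = approaching (blue shift), negative = receding (red shift).
    Low-speed two-way approximation: reflection doubles the shift, so
    v ~ c * (lambda_tx - lambda_rx) / (2 * lambda_tx).
    """
    return wave_speed * (lambda_tx - lambda_rx) / (2.0 * lambda_tx)


def classify_shifts(shifts, double_click_window=0.4):
    """Map a time-stamped sequence of shifts to Doppler commands.

    `shifts` is a list of (timestamp_s, 'blue' | 'red').  Blue shift =
    button down, red shift = button up; two rapid downs = double click.
    The 0.4 s window is an illustrative threshold, not from the patent.
    """
    commands = []
    last_down = None
    for t, kind in shifts:
        if kind == 'blue':
            if last_down is not None and t - last_down <= double_click_window:
                commands.append('double_click')
            else:
                commands.append('button_down')
            last_down = t
        else:
            commands.append('button_up')
    return commands
```

A down/up pair in quick succession thus reads as a click, and a second rapid down upgrades it to a double-click, mirroring the interpretation described above.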
  • the Doppler command generated here is input to the input signal generation unit.
  • The input signal generation unit 112g integrates the pointer coordinates (x, y) on the screen obtained by the pointing signal generation unit 112f and the command signal obtained by the Doppler command signal generation unit 113b, and generates an input signal translated as a comprehensive operation command for the GUI.
  • While the Doppler command signal is input to the input signal generation unit 112g, the input of the pointing signal to the input signal generation unit 112g is interrupted, and the pointer coordinates (x, y) remain for a certain time at the position where the command is executed.
  • the input signal generated by the input signal generation unit 112g is provided to the notebook personal computer 1 through the OS executed on the CPU 11 and other applications.
  • The input signal generation unit 112g may generate various commands by incorporating operation signals from other input devices in addition to the Doppler command. For example, with the pointer displayed on the display screen by a pointing gesture, the sub-command key 1a in the left front corner of the information processing terminal shown in FIG. 1 can be briefly pressed once to display the menu screen; or, while the same key is held down, the pointer can be moved back and forth to zoom in and out, or moved parallel to the screen to scroll. In this way, by adding only one subcommand key, almost all current mouse and touch panel commands can be realized together with the Doppler command. It is more convenient still if the GUI mode can be tailored to the operator's preference, for example by adding voice-input assistance through the microphone 6c with words such as "click", "drag", and "zoom in".
  • a character string such as “click” or “double click” may be displayed as a message for about 1 second, or “drag”, “zoom”, and “scroll” may be continuously displayed during operation.
  • FIG. 11 is a flowchart showing the procedure of the operation input method according to the present embodiment.
  • In step S101, infrared rays or other electromagnetic waves or sound waves are transmitted as a directional transmitted wave from the transmission unit 5c toward the monitoring space region R3 shown in FIG. 4, and the reflected waves reflected by the object are received by the receiving units 6a0 to 6a2 so that re-reflections are not mixed in.
  • In step S102, the transmission process of the transmitted wave W1 in the transmission unit shown in FIG. 6 and the reception process of the reflected wave W2 in the reception unit are synchronized, and the transmitted wave W1 and the reflected wave W2 are associated with each other.
  • In step S103, in order to remove unnecessary information in advance and reduce the processing load, the filtering unit 112a performs filtering to select only the reflected wave signal in the range corresponding to a depth of 3 centimeters from the first-arriving wave, and sends it to the next step.
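The 3-centimeter depth window of step S103 can be sketched as follows. The sample representation, the use of the earliest arrival as the initial wave, and the example wave speed (ultrasound in air) are illustrative assumptions.

```python
SOUND_SPEED_AIR = 340.0  # m/s, example for an ultrasonic transmitted wave

def filter_to_depth_window(samples, wave_speed=SOUND_SPEED_AIR, depth_m=0.03):
    """Keep only echo samples within `depth_m` of the first arrival.

    `samples` is a list of (arrival_time_s, amplitude).  A reflector
    `depth_m` behind the first one adds a round-trip delay of
    2 * depth_m / wave_speed after the initial wave.
    """
    if not samples:
        return []
    t0 = min(t for t, _ in samples)           # first-arriving echo
    window = 2.0 * depth_m / wave_speed       # round-trip time for depth_m
    return [(t, a) for t, a in samples if t - t0 <= window]
```

Everything later than the window is discarded before gesture recognition, which is what keeps the subsequent processing light.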
  • In step S104, the presence or absence of a pointing gesture is determined for the filtered signal. When the characteristic reflected wave pattern shown in area A1 of FIG. 10 is detected simultaneously, for a predetermined number or more of pulse waves, from the three receiving units, it is determined that a pointing gesture is present.
  • If it is determined in step S105 that there is a pointing gesture ("Y" in S105), the process proceeds to the next step S106, and the reflected wave data of the three receiving units shown in the A1 area of FIG. 10 is sent on. If the pointing gesture is not recognized ("N" in S105), the previous steps are repeated until it is recognized. In step S106, the first-arriving wave of the reflected wave data in the A1 region of FIG. 10 is interpreted as the fingertip B1, and the distance to the fingertip is calculated from the time taken for the transmitted wave to return to each of the three receiving units.
  • Next, the monitoring target point detection unit 112c compares the wavelength of the first-arriving reflected wave data of the three receiving units with the wavelength of the transmitted pulse wave and checks for the presence of the Doppler effect, detecting the back-and-forth movement of the fingertip monitoring target point.
  • If the Doppler effect is detected, the Doppler command signal is generated in step S111.
  • The input of the pointing signal to the input signal generation unit 112g is interrupted, and the pointer coordinates (x, y) remain for a certain time at the position where the command is executed.
  • If the Doppler effect is not detected ("N" in step S108), the normal pointing operation is continued (S109).
  • In step S110, the pointing position information input from the pointing signal generation unit in step S109 and the Doppler command signal from step S111 are combined, and an input signal representing a Doppler command at a specific pointer position (x, y) is output and provided to the notebook personal computer 1 through the OS and other applications running on the computer.
  • an operator's pointing operation can be input without using a device such as a mouse, a touch pad, or a touch panel that is operated by direct contact.
  • The so-called Doppler effect is monitored by comparing the wavelength of the transmitted wave with that of its reflected wave, so that the approach and separation of the target object and its moving speed can be detected easily and accurately without heavy processing such as image analysis, and a minute, quick movement of a fingertip or the like can be accurately detected and converted into a command.
  • In the present embodiment, the most basic setting of one transmitting unit and three receiving units is adopted; the transmitting unit irradiates the transmitted wave into a space containing no objects other than the monitoring target and the operator, and the receiving units are configured from the outset to receive only reflected waves from that space, so unnecessary signal noise is suppressed.
  • The combination of the 3D touch panel and the Doppler command allows the operator to perform natural, unconstrained operations using the space in front of the display screen of the information processing terminal.
  • Operability can be further improved by combining this with a Doppler command that produces the effect of an actual touch through a touching gesture.
  • FIG. 12 is a block diagram illustrating functional modules constructed on the CPU of the operation input system according to the second embodiment.
  • the same components as those in the first embodiment described above are denoted by the same reference numerals, and the functions and the like are the same unless otherwise specified, and the description thereof is omitted.
  • the signal processing unit 112 includes a columnar detection unit 112i and a vector determination unit 112h as characteristic components.
  • In the second embodiment, one receiving unit 6a provided at the lower right of the display screen 5b is a so-called distance image camera.
  • A distance image camera captures an image with a semiconductor chip such as a CCD, and measures and images the arrival distance of the reflected waves received by its many sensor elements to generate a so-called distance image.
  • The role of the pointing gesture recognition unit 112b is the same as in the first embodiment, but in this embodiment the pointing gesture is extracted based on the shape obtained from the image information. As shown in FIG. 14, since a finger directed at the display screen can be approximated by a long, narrow cylindrical shape, when a cylindrical shape of a predetermined diameter facing the display screen is recognized in the filtered image data, it is recognized that a pointing gesture is present.
  • When the pointing gesture is viewed in the distance image, a shadow is created on the side opposite to the direction of transmitted wave irradiation, so the pointing gesture is actually recognized on the basis of an elongated rectangle projected onto the x-z plane, that is, onto the keyboard and a horizontal plane.
  • the distance image data that has been recognized with the pointing gesture in this manner is input to the columnar detection unit 112i.
  • the vector determination unit 112h derives an approximate expression for the axis vector V1 (FIGS. 13 to 15) of the columnar shape Ob (FIGS. 14 and 15B) detected by the columnar detection unit 112i.
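The patent does not specify how the vector determination unit 112h derives the axis expression; a standard way to estimate the axis of a columnar point cloud is principal component analysis, where the eigenvector with the largest eigenvalue of the points' covariance matrix gives the direction of greatest extent. The following Python sketch, with hypothetical names, illustrates this under that assumption:

```python
import numpy as np

def fit_axis(points):
    """Fit a 3D line (centroid + unit direction) to a columnar point
    cloud via PCA: the principal eigenvector of the covariance matrix
    of the centred points approximates the cylinder's axis."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # 3x3 covariance of the centred cloud; its largest-eigenvalue
    # eigenvector is the direction of greatest extent, i.e. the axis.
    cov = np.cov((pts - centroid).T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    direction = eigvecs[:, np.argmax(eigvals)]
    return centroid, direction / np.linalg.norm(direction)

# Synthetic finger-like cloud stretched along the z axis
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 8.0, 200)[:, None]
cloud = t * np.array([0.0, 0.0, 1.0]) + rng.normal(0.0, 0.05, (200, 3))
c, d = fit_axis(cloud)
```

Real range-image data would first be segmented into the candidate columnar region by the columnar detection unit 112i; the sign of the returned direction is arbitrary and would be fixed by requiring it to point toward the screen.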
  • the operation surface setting unit 112e sets a 3D grid with the display screen 5b as the XY axis and the direction orthogonal thereto as the Z axis in the monitoring space area.
  • the 3D grid set by the operation surface setting unit 112e is input as a framework to the position information calculation unit 112d.
  • the position information calculation unit 112d calculates the coordinate position (x, y) at which the approximate straight line of the vector V1, expressed in the 3D grid, intersects the display screen 5b, and delivers this to the pointing signal generation unit 112f as the pointer position. Based on this, the pointing signal generation unit 112f generates an input signal specifying the position and movement of the pointer.
  • when a second pointing gesture is recognized during the above process, it is captured as a subcommand signal and the content of the input signal is changed. For example, as shown in FIG. 16, if a gesture such as raising the little finger is performed in combination with a pointing operation by the index finger, this is determined to be a subcommand signal and a subcommand dialog is displayed on the screen.
  • this expands the range of selection operations available to the operator, and corresponds to the subcommand key (1a in FIG. 1) of the first embodiment.
  • here, that key is replaced with a gesture by exploiting the abundant information contained in the range image.
  • FIG. 17 is a flowchart showing the procedure of the operation input method according to this embodiment.
  • steps identical to those of the first embodiment are omitted; only the steps unique to this embodiment are described.
  • in step S202, the object is monitored with the range image camera, and in step S204 the presence or absence of a pointing gesture is determined from the range image data.
  • in step S205, if it was determined in the preceding step that a pointing gesture is present, the columnar detection unit 112i geometrically approximates the columnar shape Ob shown in FIGS. 14 and 15 from the range image data of the pointing gesture.
  • in step S206, the vector determination unit 112h calculates the direction of the axis of the columnar shape Ob detected by the columnar detection unit 112i, that is, an approximate expression for the vector V1, as shown in FIGS. 13 to 15.
  • in step S207, the position information calculation unit 112d calculates the coordinate position at which the extension of the vector V1 intersects the display screen 5b and transfers it to the pointing signal generation unit 112f. Based on this information, the pointing signal generation unit 112f generates a pointing signal designating the pointer position and movement (S209) and inputs it to the input signal generation unit 112g.
  • Step S208 and Step S211 are basically the same as those in the first embodiment.
  • whereas the first embodiment uses the subcommand key, in this embodiment the subcommand signal is generated by a second pointing gesture, again exploiting the abundant information in the range image.
  • when the subcommand signal is detected, a process for displaying the subcommand menu is executed, after which a pointing signal for moving the pointer over the subcommand menu is generated.
  • the signal processing unit 112 generates and outputs an input signal related to the coordinate position on the display screen 5b.
  • in the embodiments above, a notebook personal computer is used as the information processing terminal device; in this modified example, the operation input program of the present invention is instead installed on a smartphone 1'.
  • the transmitting/receiving units are arranged in a triangle at the upper left of the display screen 5b, facing the operator.
  • this modified example assumes that the operator holds the smartphone in the left hand and performs the pointing gesture with the right index finger, or holds it in the right hand and performs the pointing gesture with the right thumb.
  • since smartphones have few keys that can be freely assigned, a subcommand button 1b is newly provided at the lower left.
  • the function of this button 1b is the same as that of the subcommand key of the first embodiment (1a in FIG. 1); by combining this button operation with the pointing gesture operation, various commands can be executed in addition to the Doppler command.
  • 112e ... Operation surface setting unit, 112f ... Pointing signal generation unit, 112g ... Input signal generation unit, 112h ... Vector determination unit, 112i ... Columnar detection unit, 113 ... Contact/separation detection unit, 113a ... Doppler effect detection unit, 113b ... Doppler command generation unit

Abstract

[Problem] To reliably obtain operation input made via fine, quick movement of a fingertip or the like, without the need for a directly manipulated device such as a mouse or touch panel, thus reducing the burden on the operator. [Solution] This operation input system is provided with: a transmission unit 5c which transmits electromagnetic waves or sound waves, as transmitted waves, toward a monitored space region R3 including part or all of a viewing range R2 in which a display screen 5b is visible; reception units 6a0 to 2 which receive the waves reflected from a target object irradiated with the transmitted waves; a signal processing unit 112 which, on the basis of the transmitted and reflected waves, detects position information relating to the three-dimensional spatial position of the wave-reflecting target object and to changes in that position, generates an input signal associated with a coordinate position on the display screen 5b, and provides the generated input signal to an information processing terminal device; and an approach/departure detection unit 113 which compares the wavelengths of the reflected waves with those of the transmitted waves and thereby detects the movement and speed of the target object in the direction in which it approaches or departs from the display screen 5b.

Description

Operation input system, operation input method, and operation input program
 The present invention relates to an operation input system, an operation input method, and an operation input program for inputting a coordinate position on the display screen of an information processing apparatus and operation commands related to that coordinate position.
 Conventionally, devices for inputting operation commands to an information processing apparatus such as a personal computer include, besides keyboards for entering characters, widely used devices such as mice, touch pads, and touch panels, which designate a coordinate position on the display screen and input operation commands at that position, such as button clicks, taps, and swipes.
 There has also been proposed an operation input device that, to improve convenience without requiring special equipment such as the mouse, touch pad, or touch panel mentioned above, photographs the operator's hand with a camera built into the information processing terminal, sets an operation region in the air near the finger in correspondence with the terminal's screen, detects movement of the finger within that region by image analysis, and from that movement moves the cursor on the screen or highlights and selects an icon (see Patent Document 1). According to the operation input device disclosed in Patent Document 1, operation input that places little burden on the operator can be determined without requiring an operation in which the finger directly touches a mouse, touch pad, touch panel, or the like.
 However, because the operation input device disclosed in Patent Document 1 detects the finger position by analyzing images captured with a camera, forward and backward movement of the finger must be judged from changes in the apparent size of the fingertip in the image, so fine movements cannot be detected.
 In response, an information input device has also been proposed that, as disclosed for example in Patent Document 2, uses a distance image (the distribution of distances from the camera to object surfaces), detecting local minimum points in the distance image to find the position of the operator's fingertip and enabling pointing on the screen by its movement.
Patent Document 1: JP 2013-171529 A; Patent Document 2: JP 7-334299 A
 However, the devices disclosed in the above patent documents all detect the fingertip by image analysis and judge pointing operations from its movement, so it is difficult to detect fine, quick fingertip motion, and malfunctions may occur.
 The present invention therefore solves the above problems, and aims to provide an operation input system, operation input method, and operation input program capable of accurately acquiring operation input that places little burden on the operator, such as fine, quick fingertip movements, without requiring a directly operated device such as a mouse, touch pad, or touch panel.
 To solve the above problems, the present invention is an operation input system for inputting a coordinate position on the display screen of an information processing apparatus and operation commands related to that coordinate position, comprising:
 a transmission unit that transmits electromagnetic waves or sound waves, as transmitted waves, toward a monitoring space region including part or all of the visual field range in which the display screen is visible;
 a reception unit that receives the transmitted waves reflected by an object as reflected waves;
 a contact/separation detection unit that compares the wavelength of the transmitted waves with the wavelength of the reflected waves to detect movement of the object, and its speed, in the direction approaching or leaving the transmission unit or the reception unit; and
 a signal processing unit that, based on the transmitted waves and the received reflected waves, detects position information on the three-dimensional position of the object that produced the reflected waves and on its displacement, generates an input signal related to a coordinate position on the display screen based on the position information and the detection result of the contact/separation detection unit, and provides it to the information processing apparatus.
 The present invention is also an operation input method for inputting a coordinate position on the display screen of an information processing apparatus and operation commands related to that coordinate position, comprising the steps of:
 transmitting electromagnetic waves or sound waves, as transmitted waves, from a transmission unit toward a monitoring space region including part or all of the visual field range in which the display screen is visible, and receiving, at a reception unit, the transmitted waves reflected by an object as reflected waves;
 detecting, based on the transmitted waves and the received reflected waves, position information on the three-dimensional position of the object that produced the reflected waves and on its displacement, and, based on that position information, generating at a signal processing unit an input signal related to a coordinate position on the display screen and providing it to the information processing apparatus; and
 comparing the wavelength of the transmitted waves with the wavelength of the reflected waves so that a contact/separation detection unit detects movement of the object, and its speed, in the direction approaching or leaving the transmission unit or the reception unit.
 According to these inventions, the position of the object in three-dimensional space and its displacement are detected from the transmitted and reflected waves, and input signals for cursor movement and basic command operations are provided to the information processing apparatus, so a pointing operation by the operator can be input without a directly operated device such as a mouse, touch pad, or touch panel. Because the change in wavelength between the transmitted wave and its reflection is compared in order to monitor the so-called Doppler effect, the approach or departure of the object relative to the transmission or reception unit, and its speed, can be detected without burdensome processing such as image analysis, and operation input that places little burden on the operator, such as fine, quick fingertip movements, can be acquired accurately.
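As a rough illustration of the Doppler monitoring described here: for a sound wave of speed c reflected off a target moving with radial speed v, the received frequency is f_rx = f_tx (c + v)/(c − v), so v can be recovered from the measured shift. The following Python sketch (function names and command thresholds are hypothetical, not taken from the patent) shows how a fingertip push/pull could be mapped to the button-down/button-up Doppler commands discussed later:

```python
def doppler_velocity(f_tx, f_rx, c=343.0):
    """Radial speed of the reflector from the Doppler shift between
    transmitted and received frequency. For a wave reflected off a
    moving target, f_rx = f_tx * (c + v) / (c - v), hence
    v = c * (f_rx - f_tx) / (f_rx + f_tx).
    Positive v means the target is approaching the sensor;
    c defaults to the speed of sound in air (m/s)."""
    return c * (f_rx - f_tx) / (f_rx + f_tx)

def classify(v, press=0.15, release=-0.15):
    """Map radial speed to a Doppler command (illustrative thresholds
    in m/s): a quick push toward the screen reads as button-down, a
    quick pull away as button-up."""
    if v > press:
        return "button-down"
    if v < release:
        return "button-up"
    return "none"

v = doppler_velocity(40_000.0, 40_100.0)  # 100 Hz up-shift at 40 kHz
```

Note that this frequency comparison needs no image analysis at all, which is the source of the low processing load claimed above.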
(Directivity of the transmission and reception units)
 In the above invention, it is preferable to further provide a transmission unit that irradiates pulse waves with directivity from the display screen direction toward unobstructed space, and a reception unit that receives with directivity only the waves reflected directly from the fingertip in the air (the monitored object), preventing contamination by re-reflections from nearby surfaces, and to generate the input signal based on the reflected waves returning directly from the monitored point.
(Filter)
 In the above invention, it is preferable to further provide a filtering unit that extracts in advance, from the received reflected waves, only the minimum data needed for subsequent processing — that is, only the data from the initial reflected wave up to a fixed time — and passes it to the subsequent processing flow, with the signal processing unit generating the input signal based on the reflected waves extracted by the filtering unit.
 By selecting the data to be processed at this early stage, the filtering reduces the amount of computation, lightens the load, speeds up processing, and improves the responsiveness of input operations.
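The early clipping step can be sketched as follows: locate the first sample of the echo that exceeds a detection threshold (the initial reflected wave) and keep only a fixed time window after it, discarding later multipath and re-reflections. This is an illustrative Python sketch with hypothetical names, not the patent's implementation:

```python
import numpy as np

def clip_to_first_echo(samples, threshold, window, fs):
    """Keep only the slice starting at the first sample whose magnitude
    reaches `threshold` (the initial reflected wave), for `window`
    seconds at sample rate `fs`. Everything later — re-reflections from
    nearby surfaces — is discarded before further processing."""
    samples = np.asarray(samples, dtype=float)
    above = np.nonzero(np.abs(samples) >= threshold)[0]
    if above.size == 0:
        return None                       # no echo detected
    start = int(above[0])
    n = int(round(window * fs))
    return samples[start:start + n]

sig = np.zeros(1000)
sig[300:320] = 1.0      # simulated first echo (the fingertip)
sig[700:720] = 0.6      # later re-reflection that should be cut off
kept = clip_to_first_echo(sig, 0.5, 0.002, 100_000)  # 2 ms @ 100 kHz
```

Only `kept` would be handed to the downstream gesture-recognition and ranging stages, which is how the computational load stays low.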
(Pointing gesture recognition and fingertip detection)
 In the above invention, it is preferable that the signal processing unit further comprise a module that recognizes a pointing gesture when the amplitude-time profile of the filtered reflected wave shows a characteristic pattern — the initial wave has the highest amplitude and the subsequent waves have no or only weak amplitude below a threshold — and a module that, only when this pointing gesture is recognized, detects the initial wave of the amplitude-time profile (the point showing the highest amplitude and the shortest distance to the reception unit) as the fingertip, i.e. the monitored point, so that the fingertip of the operator's pointing gesture is detected with the minimum necessary equipment.
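The profile test described above can be sketched directly: the global maximum of the profile must be the initial return, and everything after it must fall to no or weak amplitude. The sketch below uses normalised amplitudes and an illustrative threshold (the patent fixes neither):

```python
import numpy as np

def is_pointing_gesture(profile, weak=0.3):
    """Return True when the amplitude-time profile matches the pattern
    described for a pointing gesture: the initial wave carries the
    highest amplitude and every later sample stays at or below the
    weak-amplitude threshold. A flat hand or arm, by contrast,
    produces several comparable echoes and fails the test."""
    p = np.abs(np.asarray(profile, dtype=float))
    if p.size == 0:
        return False
    first_peak = int(np.argmax(p))
    tail = p[first_peak + 1:]
    return bool(p[first_peak] > weak and np.all(tail <= weak))

finger = [0.0, 0.1, 1.0, 0.2, 0.1, 0.05]   # one strong early echo
palm   = [0.0, 0.9, 0.7, 0.8, 0.6, 0.4]    # several strong echoes
```

In the system, only profiles passing this test would proceed to fingertip detection and ranging.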
(Fingertip detection + three-point ranging)
 In the above invention, it is preferable to further provide a module that, when the signal processing unit has recognized the pointing gesture and detected the initial wave as the fingertip (monitored point), obtains the distance between the fingertip and each of at least three reception units, and from these at least three distances acquires the spatial coordinate position of the fingertip by the three-point ranging method, thereby obtaining the operator's pointing coordinates while omitting complex image processing. Further, it is preferable that the contact/separation detection unit detect the forward and backward movement of the fingertip and its speed, and generate a command signal at these pointing coordinates.
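Three-point ranging here is standard trilateration. With the three receivers placed in the screen plane (receiver 1 at the origin, receiver 2 on the x axis, receiver 3 elsewhere in the plane), the three measured distances determine the fingertip position up to a mirror image, and the root in front of the screen (positive z) is taken. A minimal sketch under that coordinate assumption:

```python
import math

def trilaterate(d1, d2, d3, p2x, p3x, p3y):
    """Three-point ranging: receiver 1 at the origin, receiver 2 at
    (p2x, 0, 0), receiver 3 at (p3x, p3y, 0), all in the screen plane.
    d1..d3 are the measured fingertip distances; the positive-z root
    (the half-space in front of the screen) is returned."""
    x = (d1**2 - d2**2 + p2x**2) / (2 * p2x)
    y = (d1**2 - d3**2 + p3x**2 + p3y**2 - 2 * p3x * x) / (2 * p3y)
    z2 = d1**2 - x**2 - y**2
    z = math.sqrt(max(z2, 0.0))   # clamp tiny negatives from noise
    return x, y, z

# Receivers 20 cm apart; fingertip actually at (5, 8, 25) cm
tip = (5.0, 8.0, 25.0)
rx = [(0.0, 0.0, 0.0), (20.0, 0.0, 0.0), (10.0, 20.0, 0.0)]
d = [math.dist(tip, r) for r in rx]
est = trilaterate(d[0], d[1], d[2], 20.0, 10.0, 20.0)
```

The distances themselves would come from the echo round-trip time (distance = c · t / 2), which is already available from the filtered profiles.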
(3D touch panel)
 In the above invention, it is preferable that the signal processing unit further comprise an operation surface setting unit that sets, within the monitoring space region, a virtually continuous, unlimited series of operation surfaces — a "3D touch panel" — parallel to or at a predetermined angle to the display screen, identical or similar in shape to it, and varying in size according to the distance from the display screen to the monitored point, and that the pointing input signal be generated according to the coordinate position of the monitored point on a virtual operation surface of this 3D touch panel.
 By defining countless virtual operation surfaces continuous in the air between the operator and the display screen, the 3D touch panel makes pointing operations such as cursor movement easy to visualize. Moreover, since the size of the virtual operation surface varies with the distance between the fingertip (monitored point) and the display screen (larger when farther away), adjusting that distance changes the ratio between the speed of the fingertip when moved parallel to the display screen and the speed of the pointer on the screen (slower when farther away), enabling more flexible pointing operations.
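The distance-dependent gain described above amounts to a linear scaling of the fingertip's lateral coordinates by the ratio of a calibration depth to the current depth. This sketch (all names and the calibration depth z_ref are illustrative assumptions, not values from the patent) maps a fingertip position on a virtual operation surface to screen coordinates:

```python
def to_screen(finger_x, finger_y, finger_z, screen_w, screen_h,
              z_ref=20.0):
    """Map a fingertip at lateral offset (finger_x, finger_y) and depth
    finger_z to screen coordinates. The virtual operation surface is
    similar in shape to the screen and grows with distance, so the same
    lateral finger motion moves the pointer less when the hand is
    farther away; z_ref is an assumed depth where surface and screen
    match 1:1."""
    scale = z_ref / max(finger_z, 1e-6)   # farther -> smaller gain
    x = screen_w / 2 + finger_x * scale
    y = screen_h / 2 + finger_y * scale
    # Clamp to the physical screen extent
    x = min(max(x, 0.0), screen_w)
    y = min(max(y, 0.0), screen_h)
    return x, y

near = to_screen(10.0, 0.0, 20.0, 192.0, 108.0)  # hand close in
far = to_screen(10.0, 0.0, 40.0, 192.0, 108.0)   # same motion, farther
```

The same 10-unit lateral offset moves the pointer 10 units at the calibration depth but only 5 units at twice that depth, matching the "slower when farther away" behaviour described above.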
 Since the 3D touch panel is a so-called non-contact device that performs pointing operations using the space between the screen and the operator, command operations by actually touching with the fingertip (monitored point) are difficult. To compensate for this weakness, it is preferable to improve the operability of the information processing terminal by adopting button operations using the Doppler effect (Doppler commands).
(Subcommand key)
 In the above invention, Doppler commands are limited to two types — button-down and button-up — and their combinations, so only the basic commands executable with current mice, touch pads, and touch panels can be realized. It is therefore preferable to provide one additional subcommand key alongside the pointing gesture, diversifying the input commands available beyond the Doppler commands. The subcommand key can also be replaced by other methods such as voice input.
(Distance image + 3D touch panel)
 In the above invention, the modules may be changed so that, instead of the three-point ranging method, the pointing gesture is recognized from its form in a distance image and the spatial coordinates of the fingertip (monitored point) are obtained directly, acquiring the operator's pointing operation in a more visually intuitive way. In this variation, too, it is preferable that pointing operations be performed in combination with the 3D touch panel, and that the contact/separation detection unit detect the forward and backward movement of the fingertip and its speed by the Doppler effect.
(Distance image + vector projection)
 In the above invention, when the distance image is used, an operating method may also be adopted in which, instead of pointing by combining the fingertip's position coordinates with the 3D touch panel, the shape of the finger during the pointing gesture is modeled from the distance image as a columnar form serving as the monitored object, and the pointer is displayed at the coordinate position where the vector along the axis of this columnar form intersects the display screen. In this case as well, it is preferable that the contact/separation detection unit monitor the Doppler effect using the reflected waves used for the distance image.
(Subcommand (second columnar shape) detection)
 In the above invention, when the distance image is used, a fingertip or columnar shape whose distance to the reception unit is the second shortest within the monitoring space region is detected as a second monitored object, and it is preferable that the signal processing unit change the content of the input signal according to the presence or absence of this second monitored point. In this case, an additional gesture such as raising the little finger can be detected in combination with, for example, pointer operation by the index finger, diversifying the input commands.
(Operation input program)
 The apparatus and method according to the present invention described above can be realized by executing the operation input program of the present invention, written in a suitable language, on a computer. That is, by installing the program of the present invention on an IC chip or memory device of a mobile terminal device, smartphone, wearable terminal, mobile personal computer, other information processing terminal, or a general-purpose computer such as a personal computer or server computer, and executing it on the CPU, a system having each of the functions described above can be constructed and the method of the present invention carried out.
 That is, the program of the present invention is an operation input program for inputting a coordinate position on the display screen of an information processing apparatus and operation commands related to that coordinate position, causing a computer to function as:
 a transmission unit that transmits electromagnetic waves or sound waves, as transmitted waves, toward a monitoring space region including part or all of the visual field range in which the display screen is visible;
 a reception unit that receives the transmitted waves reflected by an object as reflected waves;
 a signal processing unit that, based on the transmitted waves and the received reflected waves, detects position information on the three-dimensional position of the object that produced the reflected waves and on its displacement, generates an input signal related to a coordinate position on the display screen based on that position information, and provides it to the information processing apparatus; and
 a contact/separation detection unit that compares the wavelength of the transmitted waves with the wavelength of the reflected waves to detect movement of the object, and its speed, in the direction approaching or leaving the transmission unit or the reception unit.
 Such an operation input program of the present invention can be distributed, for example, over a communication line, and can also be transferred as a package application that runs on a stand-alone computer by being recorded on a computer-readable recording medium. Specifically, it can be recorded on various recording media: magnetic recording media such as flexible disks and cassette tapes, optical discs such as CD-ROMs and DVD-ROMs, RAM cards, and the like. With a computer-readable recording medium on which this program is recorded, the system and method described above can be implemented easily using a general-purpose or dedicated computer, and the program can be stored, transported, and installed with ease.
 According to these inventions, when pointing or entering commands on the display screen of a desktop, notebook, or tablet personal computer, smartphone, or other information processing terminal device, operation input that places little burden on the operator — such as fine, quick fingertip movements — can be acquired accurately without a directly touched and operated device such as a mouse, touch pad, or touch panel.
FIG. 1 is a perspective view showing the external configuration of an information processing terminal device for realizing the operation input system according to the first embodiment. FIG. 2 is a block diagram showing the hardware configuration of the information processing terminal device according to the first embodiment. FIG. 3 is a block diagram showing the functional modules constructed on the CPU of the operation input system according to the first embodiment. FIG. 4 is an explanatory diagram showing the fingertip (monitored point) detection process of the operation input system according to the first embodiment. FIG. 5 is an explanatory diagram outlining the three-point ranging method of the operation input system according to the first embodiment. FIG. 6 is an explanatory diagram showing the Doppler effect detection process of the operation input system according to the first embodiment. FIGS. 7 and 8 are explanatory diagrams showing the 3D touch panel of the operation input system according to the first embodiment. FIG. 9 is an explanatory diagram showing the filtering process and pointing gesture recognition of the operation input system according to the first embodiment.
FIG. 10 is an explanatory diagram showing an example of the amplitude-time profile used for pointing gesture recognition and fingertip (monitored point) detection in the operation input system according to the first embodiment. FIG. 11 is a flowchart showing the procedure of the operation input method according to the first embodiment. FIG. 12 is a block diagram showing the functional modules constructed on the CPU of the operation input system according to the second embodiment. FIG. 13 is an explanatory diagram showing the operating method of the operation input system according to the second embodiment. FIG. 14 is an explanatory diagram showing the vector projection process of the operation input system according to the second embodiment. FIG. 15 is an explanatory diagram showing the columnar shape detection process of the operation input system according to the second embodiment. FIG. 16 is an explanatory diagram showing the subcommand recognition process of the operation input system according to the second embodiment. FIG. 17 is a flowchart showing the procedure of the operation input method according to the second embodiment. FIG. 18 is an explanatory diagram showing a 3D touch panel according to a modified example. FIG. 19 is an explanatory diagram outlining the three-point ranging in the fingertip (monitored point) detection process of an operation input system according to a modified example.
[First Embodiment]
(Overall configuration of the operation input system)
 Hereinafter, a first embodiment of an operation input system according to the present invention will be described in detail with reference to the accompanying drawings. FIG. 1 is a conceptual diagram showing the external configuration of an information processing terminal device for realizing the operation input system according to the present embodiment, and FIG. 2 is a block diagram showing the hardware configuration of that information processing terminal device. A "module" as used in this description is a functional unit for achieving a predetermined operation, configured by hardware such as an apparatus or a device, by software having the corresponding function, or by a combination thereof.
 The information processing terminal device according to the present embodiment can be realized by a general-purpose computer or by a dedicated device. In the present embodiment, as shown in FIG. 1, the operation input program of the present invention is installed and executed on a general-purpose notebook personal computer 1. The notebook personal computer 1 is an information processing terminal device that includes a display screen 5b such as a monitor and is integrated with a main body having a keyboard, a pointing device, and the like so as to fold in half; a transmission unit 5c and reception units 6a0 to 6a2 are arranged below the display screen 5b, facing the operator. Although the present embodiment is described by way of example as applied to a notebook personal computer, the present invention is not limited to this: any information processing terminal that can be equipped or connected with the transmission unit 5c and the reception units 6a0 to 6a2 may be used, including a general-purpose computer such as a desktop personal computer, a dedicated device with specialized functions, a tablet personal computer, a smartphone, a mobile phone, or a wearable terminal device.
 As shown in FIG. 2, the hardware configuration inside the notebook personal computer 1 includes a CPU 11, a memory 12, an input interface 16, a storage device 14, an output interface 15, and a communication interface 13. In the present embodiment, these devices are connected via a CPU bus 10 and can exchange data with one another.
 The input interface 16 is a module that receives operation signals from operation devices such as the keyboard 6b, a pointing device, a touch panel, or buttons; the received operation signals are passed to the CPU 11, enabling operations on the OS and on each application. Input devices such as a CCD camera 6d and a microphone 6c can also be connected to the input interface 16, and the reception units 6a0 to 6a2 of the present invention are likewise connected to it.
 As shown in FIGS. 4 to 6, the reception units 6a0 to 6a2 are receiving devices with a directivity that blocks re-reflections from nearby surfaces and selectively receives only the direct reflected wave from the monitoring target in the air. More specifically, the transmitted wave W1 is emitted with a predetermined directivity into a space that contains nothing in its vicinity other than the monitoring target and the operator, and of the transmitted wave reflected by the target, only the direct reflection reaching the reception units is received as the reflected wave W2, their directivity preventing contamination by re-reflections from the keyboard or other directions. The received reflected wave W2 is input to the signal processing unit 112 and the contact/separation detection unit 113 through the synchronization processing unit 111 shown in FIG. 3.
 In the present embodiment, the reception units 6a0 to 6a2 are a plurality of infrared sensors that are physically separated yet arranged adjacent to one another, laid out in a triangle at the lower right of the display screen 5b. Provided sufficient resolution is available, various sensors matching the type of electromagnetic wave or sound wave emitted by the transmission unit 5c may be adopted instead, such as electromagnetic-wave sensors other than infrared, or ultrasonic sensors.
 The output interface 15 is a module that sends out video and audio signals so that video and audio are output from output devices such as the display screen 5b and the speaker 5a. The transmission unit 5c, which emits an electromagnetic wave or a sound wave as the transmitted wave, is also connected to this output interface 15.
 As shown in FIG. 4, the transmission unit 5c is a device that emits an electromagnetic wave or a sound wave as the transmitted wave, with a predetermined directivity, toward a monitoring space region R3 that includes part or all of the visual field range R2 from which the display screen 5b is visible, and it emits the transmitted wave under the control of the synchronization processing unit 111 in FIG. 3. In the present embodiment in particular, the transmission unit 5c is an infrared light that emits pulsed infrared laser radiation, and it is placed at the centre of the reception units 6a0 to 6a2 arranged in a triangle at the lower right of the display screen 5b. Provided the resolution is sufficient to detect the fingertip (monitoring target point), various transmitters matching the type of electromagnetic wave or sound wave detectable by the reception units 6a0 to 6a2 may be adopted, such as an ultrasonic transmitter or a radio transmitter outside the infrared band.
 The communication interface 13 is a module that transmits and receives data to and from other communication devices. Applicable communication systems include public lines such as telephone lines, ISDN lines, ADSL lines, and optical lines; dedicated lines; third-generation (3G) systems such as WCDMA (registered trademark) and CDMA2000; fourth-generation (4G) systems such as LTE; fifth-generation (5G) and later systems; and wireless communication networks such as WiFi (registered trademark) and Bluetooth (registered trademark).
 The storage device 14 is a device that accumulates data on a recording medium and reads out the accumulated data in response to requests from each device; it can be configured, for example, as a hard disk drive (HDD), a solid state drive (SSD), or a memory card.
 The CPU 11 is a device that performs the various arithmetic operations needed to control each unit, and by executing various programs it virtually constructs various modules on itself. An OS (Operating System) is started and executed on the CPU 11, and this OS manages and controls the basic functions of the notebook personal computer 1. Various applications can furthermore be executed on this OS: execution of the OS program by the CPU 11 manages and controls the basic functions of the notebook personal computer 1, while execution of application programs by the CPU 11 virtually constructs various functional modules on the CPU.
(Configuration of the functional modules)
 FIG. 3 is a block diagram showing the functional modules constructed on the CPU of the operation input system according to the first embodiment. In the present embodiment, executing the operation input program of the present invention constructs three modules on the CPU 11: a synchronization processing unit 111, a signal processing unit 112, and a contact/separation detection unit 113.
 As shown in FIGS. 4 to 6, the synchronization processing unit 111 is a module that synchronizes the transmission of the transmitted wave W1 by the transmission unit 5c with the reception of the reflected wave W2 by the reception units 6a, and associates the transmitted wave W1 with the reflected wave W2. In the present embodiment, the transmission unit 5c emits intermittent pulse waves as the transmitted wave W1, and the reflected wave corresponding to each emitted pulse is received; based on the order, the times, and the intervals from transmission to reception, each transmitted-wave signal is matched with its corresponding reflected-wave signal, the two are associated as a pair, and the pair is sent to the signal processing unit 112.
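The transmit/receive association described above can be sketched, for illustration only, as follows. This is a minimal sketch under stated assumptions: echoes are attributed to the latest pulse whose round-trip window has not yet elapsed, an assumed 1 m maximum range bounds that window, and the function names and the use of Python are expository, not part of the embodiment.

```python
C = 299_792_458.0  # speed of light [m/s]

def pair_pulses(tx_times, rx_events, max_range=1.0):
    """Attribute each received echo (arrival_time, amplitude) to the latest
    pulse transmitted at or before it whose round-trip window (2*max_range/c)
    has not yet elapsed, mirroring the pairing by order and time interval."""
    window = 2.0 * max_range / C
    pairs = {tx: [] for tx in tx_times}
    for t, amp in rx_events:
        candidates = [tx for tx in tx_times if tx <= t <= tx + window]
        if candidates:
            pairs[max(candidates)].append((t, amp))
    return pairs
```

Each transmitted pulse thus carries its own bucket of echoes, which is the "transmitted wave and reflected wave as a set" handed to the signal processing stage.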
 The signal processing unit 112 is a module that, based on the transmitted wave W1 emitted by the transmission unit 5c and the reflected wave W2 received by the reception units 6a, detects the spatial position information of the object that produced the reflected wave W2 and its displacement, generates an input signal relating to a coordinate position on the display screen 5b, and provides it to the OS and applications running on the information processing device.
 Specifically, the signal processing unit 112 according to the present embodiment comprises a filtering unit 112a, a pointing-gesture recognition unit 112b, a fingertip (monitoring target point) detection unit 112c, a position information calculation unit 112d, an operation surface setting unit 112e, a pointing signal generation unit 112f, and an input signal generation unit 112g.
 The filtering unit 112a is a module that, when the monitoring space region R3 shown in FIG. 4 is monitored, excludes in advance all signals other than the reflected waves required by the subsequent processing operations. Those operations, namely pointing-gesture recognition and fingertip (monitoring target point) detection, require only the reflected waves in the range (A1) corresponding to a depth of about 3 cm from the initial wave shown in FIG. 10, so all other reflected-wave data is discarded at this stage and only the remaining data is input to the pointing-gesture recognition unit 112b.
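The depth-window filter just described can be sketched as follows; a minimal illustration, assuming each pulse's echoes arrive as (time, amplitude) samples and that a 3 cm depth corresponds to an extra round-trip delay of 2 × 0.03 m divided by the speed of light (names and the Python rendering are expository assumptions):

```python
C = 299_792_458.0  # speed of light [m/s]

def filter_depth_window(samples, depth=0.03):
    """Keep only echoes within `depth` metres beyond the first arrival.

    `samples` is a list of (arrival_time_s, amplitude) pairs for one pulse;
    the earliest sample is the initial wave, and anything arriving more than
    2*depth/c later is discarded before gesture recognition."""
    if not samples:
        return []
    t0 = min(t for t, _ in samples)   # initial (first-arrival) wave
    window = 2.0 * depth / C          # round-trip delay for the 3 cm depth
    return [(t, a) for t, a in samples if t - t0 <= window]
```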
 The pointing-gesture recognition unit 112b determines whether a pointing gesture as shown in FIGS. 1, 4 to 9, 13, 14, 18, and 19 is present. When the pointing gesture shown in FIG. 9 is made, the reflected wave has a strong amplitude at the fingertip B1, which is perpendicular to the travelling direction of the transmitted wave, and little or no amplitude at the pad of the finger, which is parallel to that direction or meets it at a predetermined (for example obtuse) angle; the reflected wave therefore forms an amplitude-time profile like that seen in region A1 of FIG. 10. Conversely, if this characteristic amplitude-time profile is observed in the reflected wave, it can be inferred that a pointing gesture has been made. In the present embodiment, a pointing gesture is judged to be present when such a profile is observed simultaneously in at least a predetermined number of pulse waves at all three reception units 6a0 to 6a2, and the reflected-wave data is then input to the fingertip (monitoring target point) detection unit 112c.
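As a rough illustration of the profile test for a single pulse, the check could look like the sketch below: a strong initial echo (the fingertip, normal to the wave) followed only by weak echoes (the finger's side, at grazing incidence). The amplitude thresholds here are illustrative placeholders, not values from the embodiment, and the specification's cross-receiver and repeated-pulse conditions are left out for brevity.

```python
def looks_like_pointing(samples, strong=0.8, weak=0.3):
    """Heuristic test of one pulse's amplitude-time profile for the
    pointing-gesture signature: strong first arrival, weak remainder.
    `samples` is a list of (arrival_time_s, amplitude); thresholds are
    hypothetical normalised-amplitude values."""
    if not samples:
        return False
    samples = sorted(samples)            # order by arrival time
    first_amp = samples[0][1]
    rest = [a for _, a in samples[1:]]
    return first_amp >= strong and all(a <= weak for a in rest)
```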
 The fingertip (monitoring target point) detection unit 112c identifies, within the reflected-wave data accompanying the pointing gesture in range A1 of FIG. 10, the initial wave as the fingertip (monitoring target point) indicated by B1 in FIGS. 9 and 10, and computes the distance from the fingertip to each of the three reception units by multiplying the round-trip time of the transmitted pulse by the speed of light and dividing by two. The three distance values thus computed are input to the position information calculation unit 112d.
 Next, the position information calculation unit 112d obtains, from the three distances input from the fingertip (monitoring target point) detection unit 112c, the relative position information (x, y, z) of the fingertip (monitoring target point) by three-point ranging, and inputs it to the pointing signal generation unit 112f.
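The three-point ranging step can be sketched with the standard trilateration construction; a minimal sketch under stated assumptions: the three receiver positions are known, the fingertip lies on the operator's side (taken as the positive side) of the receiver plane, and the Python rendering and function name are expository, not from the embodiment.

```python
import math

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Recover the fingertip position from its distances r1..r3 to three
    receivers at known positions p1..p3 (each an (x, y, z) tuple, metres).
    Builds an orthonormal frame in the receiver plane, solves for the
    in-plane offsets, and takes the front (positive) out-of-plane root."""
    sub = lambda a, b: tuple(ai - bi for ai, bi in zip(a, b))
    dot = lambda a, b: sum(ai * bi for ai, bi in zip(a, b))
    norm = lambda a: math.sqrt(dot(a, a))
    cross = lambda a, b: (a[1]*b[2] - a[2]*b[1],
                          a[2]*b[0] - a[0]*b[2],
                          a[0]*b[1] - a[1]*b[0])

    d = norm(sub(p2, p1))
    ex = tuple(c / d for c in sub(p2, p1))
    i = dot(ex, sub(p3, p1))
    ey_raw = sub(sub(p3, p1), tuple(i * c for c in ex))
    ey = tuple(c / norm(ey_raw) for c in ey_raw)
    ez = cross(ex, ey)
    j = dot(ey, sub(p3, p1))

    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
    z = math.sqrt(max(r1**2 - x**2 - y**2, 0.0))   # front-side solution

    return tuple(p1[k] + x*ex[k] + y*ey[k] + z*ez[k] for k in range(3))
```

The two mirror solutions of the three spheres differ only in the sign of z; since the monitored point is always in front of the screen, the positive root is the physically meaningful one here.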
 In the present embodiment, the spatial position information is obtained by three-point ranging from three reception units arranged in a triangle, but the accuracy of the three-dimensional position information of the monitoring target point improves as the number of reception units increases and as their arrangement is refined. Likewise, although the present embodiment uses a single transmission unit, the number of transmission units can be increased up to the number of reception units by synchronizing one with each reception unit. Placing a reception unit close to its transmission unit raises the accuracy of the distance measurement and thus the reliability of the spatial position information of the fingertip (monitoring target point). The combination of the number and arrangement of transmission and reception units can be optimized to the required accuracy of the monitoring target point's position information. The first embodiment adopts the most basic configuration of one transmission unit and three reception units, as shown in FIG. 1, but three transmission units paired with three corresponding reception units may be used instead, as shown in FIG. 18.
 As shown in FIGS. 7 and 8, the operation surface setting unit 112e is a module that sets, within the monitoring space region R3, a "3D touch panel": a set of countless virtually continuous operation screens VP that are parallel to (or at an obtuse angle to) the display screen 5b, identical or similar in shape to it, and whose size varies with the distance from the display screen to the monitoring target point.
 In the 3D touch panel, the size of a VP varies with the distance from the display screen 5b to the monitoring target point 3a. That is, because the size of the virtual operation screen VP or VP2 varies with the distance D1 or D2 between the operator's fingertip (3a or 3a', the monitoring target point) and the display screen 5b (the greater the distance, the larger the screen), adjusting the distance D1 (or D2) changes the ratio between the speed at which the fingertip (monitoring target point 3a or 3a') is moved parallel to the display screen 5b and the speed of the pointer on the display screen. To move the pointer quickly over large distances, the operator works close to the screen; to move it slowly and precisely, the fingertip is moved away from the screen. The virtual operation screen VP or VP2 is the base of a quadrangular pyramid, and such bases are formed without limit along the pyramid's height, that is, along the direction of the central axis of the transmitted wave (z in the figure). The virtual operation screen VP or VP2 serving as the base may, for example, be set perpendicular to a normal orthogonal to the display screen 5b, perpendicular to or at a predetermined angle to an extension line passing through the transmission or reception unit, or parallel to the display screen 5b.
 That is, in a three-dimensional space in which the horizontal and vertical directions of the display screen 5b are the x axis and y axis respectively and the direction perpendicular to the screen is the z axis, one VP of the same shape but different scale is assigned to each z value. If the fingertip (monitoring target point 3a) is moved forward so that z decreases, the virtual operation screen VP shrinks; if it is moved backward so that z increases, the VP grows. The 3D touch panel framework (3D grid) set by the operation surface setting unit 112e is input to the pointing signal generation unit 112f.
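The z-dependent mapping from fingertip position to pointer position can be sketched as follows. This is an illustrative model only: it assumes the virtual plane at depth z is a rectangle similar to the screen and k·z metres wide, with (x, y) the fingertip's lateral offsets in metres from the plane's centre; the scale factor k and the function name are hypothetical, not values from the embodiment.

```python
def to_pointer(x, y, z, screen_w_px, screen_h_px, k=0.6):
    """Map a fingertip position (x, y, z) onto pixel coordinates through
    the z-dependent virtual operation screen: the farther the fingertip,
    the larger the plane, so the same lateral motion moves the pointer
    less (slow and precise far away, fast and coarse up close)."""
    plane_w = k * z                                  # plane width [m]
    plane_h = plane_w * screen_h_px / screen_w_px    # similar rectangle
    u = min(max(x / plane_w + 0.5, 0.0), 1.0)        # normalised, clamped
    v = min(max(y / plane_h + 0.5, 0.0), 1.0)
    return u * screen_w_px, v * screen_h_px
```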
 The pointing signal generation unit 112f is a module that generates, from coordinates on the screen, a GUI (Graphical User Interface) element displayed on the screen, such as a cursor, pointer, or scroll bar, as a pointing signal. Here, the spatial position information (x, y, z) of the fingertip (monitoring target point) input from the position information calculation unit 112d is taken into the 3D touch panel framework (3D grid) set by the operation surface setting unit 112e. The single VP corresponding to the fingertip's z value is selected, the pointer position on the display screen 5b is determined from the fingertip's position (x, y) on that VP, and the result is input to the input signal generation unit 112g. This concludes the description of the signal processing unit.
 Next, the contact/separation detection unit is described. The contact/separation detection unit 113 is a module that compares the wavelength of the transmitted wave with that of the reflected wave to detect the Doppler effect, and thereby detects the object's movement toward or away from the transmission/reception units and the speed of that movement. The contact/separation detection unit 113 comprises a Doppler effect detection unit 113a and a Doppler command signal generation unit 113b.
 As shown in FIG. 6, the Doppler effect detection unit 113a detects the Doppler effect by comparing the wavelength T1 of the transmitted pulse wave W1 with the wavelength T2 of the reflected wave W2. As the reflected wave W2, the initial wave at the fingertip B1 input from the fingertip (monitoring target point) detection unit 112c, shown in FIG. 10, is used. When the reflected wavelength T2 shortens, that is, when a blue shift is detected, the fingertip (monitoring target point) is judged to be approaching; when T2 lengthens, that is, when a red shift is detected, it is judged to be receding. When the shift directions at the three reception units agree and every reception unit recognizes, at least a specified number of times, pulse waves for which the movement speed of the fingertip (monitoring target point) is at or above a threshold, the average of the maximum speeds of those pulse waves at each reception unit is input to the Doppler command signal generation unit 113b.
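The speed estimate implied by the wavelength comparison can be sketched with the first-order Doppler relation for reflection off a slowly moving target. This is a minimal sketch under stated assumptions: for radial speed v much smaller than c, the echo wavelength is approximately the transmitted wavelength times (1 − 2v/c), so the sign of the shift gives the direction and its magnitude the speed; the function name is illustrative.

```python
C = 299_792_458.0  # speed of light [m/s]

def radial_velocity(tx_wavelength, rx_wavelength):
    """Estimate the fingertip's approach/retreat speed from the Doppler
    shift of the reflected pulse (v << c approximation):
      v > 0  -> blue shift (wavelength shortened, target approaching)
      v < 0  -> red shift  (wavelength lengthened, target receding)"""
    return C * (tx_wavelength - rx_wavelength) / (2.0 * tx_wavelength)
```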
 The Doppler command signal generation unit 113b is a module that converts the forward/backward movement direction and speed of the fingertip (monitoring target point), obtained by the Doppler effect detection unit 113a, into commands (Doppler commands). In the present embodiment, a blue shift (shortening wavelength) is interpreted as a button-down operation and a red shift (lengthening wavelength) as a button-up operation. For example, a rapid button-down is interpreted as a mouse click, and two rapid button-downs in quick succession as a double click. A rapid button-up starts a drag operation, and a rapid button-down after selecting the drag range ends it. The Doppler command generated here is input to the input signal generation unit.
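The event-to-command mapping just described can be sketched as a small state machine; a simplified illustration only, assuming a time-ordered list of (timestamp, 'down' | 'up') shift events, with the double-click gap an illustrative placeholder rather than a value from the embodiment.

```python
def interpret(events, gap=0.3):
    """Translate Doppler shift events into GUI commands following the
    mapping above: rapid down = click, two rapid downs = double click,
    rapid up = drag start, next rapid down while dragging = drag end."""
    commands = []
    dragging = False
    last_down = None
    for t, kind in events:
        if kind == 'down':
            if dragging:
                commands.append('drag-end')
                dragging = False
            elif last_down is not None and t - last_down <= gap:
                commands[-1] = 'double-click'   # upgrade the previous click
            else:
                commands.append('click')
            last_down = t
        else:  # 'up'
            commands.append('drag-start')
            dragging = True
    return commands
```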
 The input signal generation unit 112g integrates the on-screen pointer coordinates (x, y) obtained by the pointing signal generation unit 112f with the command signal obtained by the Doppler command signal generation unit 113b, and generates an input signal translated into a comprehensive operation command for the GUI. When a Doppler command signal is input to the input signal generation unit 112g, input of the pointing signal to it is suspended, and the pointer coordinates (x, y) remain stationary for a fixed time at the position where the command was executed. Furthermore, so that the pointer coordinates do not wobble during the brief interval before the Doppler command is recognized, it is preferable to provide a time lag between the input times of the pointing signal and the Doppler command signal, excluding from the pointing operation the finger movement between the moment the fingertip (monitoring target point) starts moving and the moment the Doppler effect is detected.
 The input signal generated by the input signal generation unit 112g is provided to the notebook personal computer 1 through the OS and other applications running on the CPU 11. In addition to the Doppler commands, the input signal generation unit 112g may incorporate operation signals from other input devices to generate a variety of commands. For example, with the pointer displayed on the screen by a pointing gesture, the subcommand key 1a at the front-left corner of the information processing terminal shown in FIG. 1 can be pressed briefly once to display a menu screen, or held down while the pointer is moved back and forth to zoom in and out, or moved parallel to the screen to scroll it. Thus, adding just one subcommand key makes it possible, together with the Doppler commands, to realize almost all of the commands of current mice and touch panels. It is more convenient still if the behaviour of the GUI can be optimized to the operator's preferences as needed, for example by adding assistance through voice input, speaking "click", "drag", "zoom in", and so on into the microphone 6c.
 It is also more convenient if, when the input signal is generated, it includes an image signal or text information that displays the content of the command near the pointer on the screen. For example, a character string such as "click" or "double click" may be displayed as a message for about one second, or "drag", "zoom", or "scroll" may be displayed continuously during the operation.
(Procedure of the operation input method)
 The operation input method of the present invention can be carried out by operating the operation input system described above. FIG. 11 is a flowchart showing the procedure of the operation input method according to the present embodiment.
 First, in step S101, infrared radiation or another electromagnetic wave or sound wave is transmitted as a directional transmitted wave from the transmission unit 5c toward the monitoring space region R3 shown in FIG. 4, and the reflected wave from the object is received by the reception units 6a0 to 6a2 in such a way that no re-reflections are mixed in.
 In step S102, the transmission of the transmitted wave W1 by the transmission unit shown in FIG. 6 and the reception of the reflected wave W2 by the reception units are synchronized, and the transmitted wave W1 is associated with the reflected wave W2.
 In step S103, to remove unnecessary information in advance and reduce the processing load, the filtering unit 112a performs filtering that selects only the reflected-wave signals in the range corresponding to a depth of 3 centimetres from the initial wave, and sends them to the next step.
 In step S104, the filtered signals are examined for the presence of a pointing gesture. If the characteristic reflected-wave pattern shown in region A1 of FIG. 10 is detected simultaneously in at least a predetermined number of pulse waves at the three reception units, a pointing gesture is judged to be present.
 また、ステップS105で指差しゼスチャーありと判断されれば(S105における「Y」)、次のステップS106に進み、図10のA1領域に示す3つの受信部の反射波データを次のステップに送る。指差しゼスチャーが認められない場合は(S105における「N」)、認められるまで前のステップが繰り返される。
 ステップS106では、図10のA1領域の反射波データの初動波を指先B1と解釈し、発信波が照射されてから3つの受信部それぞれに戻ってくるまでの時間から指先までの距離を計算する。この3つの距離に基づいて位置情報算出部112dにて3点測距法に基づいて指先=監視対象点の空間的位置情報(x、y、z)が求められる。
 ステップS107では、指先=監視対象点検出部112cから入力される3つの受信部の初動反射波データの波長と、パルス発信波の波長とを比較し、ドップラー効果の有無を調べて指先=監視対象点の前後の移動を検出する。
If it is determined in step S105 that there is a pointing gesture (“Y” in S105), the process proceeds to the next step S106, and the reflected wave data of the three receiving units shown in the A1 area of FIG. 10 is sent to the next step. . If the pointing gesture is not recognized (“N” in S105), the previous step is repeated until it is recognized.
In step S106, the initial motion wave of the reflected wave data in area A1 of FIG. 10 is interpreted as the fingertip B1, and the distance to the fingertip is calculated from the time between transmission of the wave and its return to each of the three receiving units. Based on these three distances, the position information calculation unit 112d obtains the spatial position information (x, y, z) of the fingertip = monitoring target point by the three-point distance measurement method.
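The three-point distance measurement of step S106 is a three-sphere intersection. The sketch below assumes each measured distance runs from a receiver at a known position to the fingertip (as with the co-located transmitter-receiver pairs of the smartphone variant) and picks the solution on the operator's side of the screen; the receiver coordinates in the test are invented for illustration. In the first-embodiment geometry, where the wave travels transmitter → fingertip → receiver, each raw time of flight would first be converted to a receiver-fingertip distance before this step.

```python
import numpy as np

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Return the point at distances r1, r2, r3 from receivers
    p1, p2, p3 (standard three-sphere intersection)."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)          # local x axis
    i = ex.dot(p3 - p1)
    ey = p3 - p1 - i * ex
    ey = ey / np.linalg.norm(ey)                      # local y axis
    ez = np.cross(ex, ey)                             # local z axis
    d = np.linalg.norm(p2 - p1)
    j = ey.dot(p3 - p1)
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))        # side facing the operator
    return p1 + x * ex + y * ey + z * ez
```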
In step S107, the wavelengths of the initial reflected waves from the three receiving units, input from the fingertip = monitoring target point detection unit 112c, are compared with the wavelength of the pulsed transmission wave to check for the presence or absence of a Doppler effect, thereby detecting back-and-forth movement of the fingertip = monitoring target point.
If the shift direction agrees across a predetermined number or more of the pulse waves at each receiving unit for the fingertip = monitoring target point, and its speed exceeds a threshold value (“Y” in step S108), a Doppler command signal is generated in step S111.
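The wavelength comparison and the threshold-and-agreement test of step S108 can be sketched as follows. The speed of sound, the 0.15 m/s threshold, and the command names are assumptions for illustration; the factor of two accounts for the double Doppler shift on reflection from a moving target.

```python
C_SOUND = 343.0  # m/s -- assuming an ultrasonic transmitted wave

def radial_velocity(tx_wavelength, rx_wavelength, c=C_SOUND):
    """Speed of the reflector toward (+) or away from (-) the sensor,
    estimated from the wavelength shift of the reflected pulse."""
    return c * (tx_wavelength - rx_wavelength) / (2 * tx_wavelength)

def doppler_command(tx_wl, rx_wls, v_threshold=0.15, min_agreeing=3):
    """Issue a command only when enough receivers agree on the shift
    direction and the speed exceeds the threshold ("Y" in S108)."""
    vs = [radial_velocity(tx_wl, wl) for wl in rx_wls]
    if sum(1 for v in vs if v > v_threshold) >= min_agreeing:
        return "touch"    # fingertip moving toward the screen
    if sum(1 for v in vs if v < -v_threshold) >= min_agreeing:
        return "release"  # fingertip moving away from the screen
    return None           # no Doppler command: normal pointing continues
```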
When the Doppler command signal is input to the input signal generation unit 112g, input of the pointing signal to the input signal generation unit 112g is interrupted, and the pointer coordinates (x, y) remain stationary for a fixed time at the position where the command was executed. Furthermore, so that the pointer coordinates do not waver during the brief interval before the Doppler command is recognized, it is preferable to provide a time lag between the input times of the pointing signal and the Doppler command signal, excluding from the pointing operation the finger movement between the moment the fingertip = monitoring target point starts to move and the moment the Doppler effect is detected. On the other hand, if no Doppler effect is detected (“N” in step S108), the normal pointing operation continues (S109).
In step S110, the pointing position information input from the pointing signal generation unit in step S109 and the Doppler command signal from step S111 are combined, and an input signal representing a Doppler command at a specific pointer position (x, y) is output and provided to the notebook personal computer 1 through the OS and other applications running on the CPU 11.
(Action / Effect)
According to the present embodiment, the spatial position of the fingertip = monitoring target point and its displacement are detected based on the synchronized transmitted and reflected waves, and input signals such as pointer movement and other operations can be output. An operator's pointing operation can therefore be input without requiring a directly touched device such as a mouse, touch pad, or touch panel. Moreover, by comparing changes in wavelength between the transmitted wave and its reflection and monitoring the so-called Doppler effect, the approach or withdrawal of the object and its speed can be detected simply and accurately, without burdensome processing such as image analysis, so that fine, quick movements of a fingertip can be reliably detected and converted into commands.
For other commands not covered by Doppler commands alone, adding a single button operation combined with the pointing gesture makes it possible to cover almost all commands available on current mice, touch pads, and touch panels. Furthermore, by adding voice commands, various specifications matching the operator's needs can be supported.
In the present embodiment, obtaining the three-dimensional position of the fingertip = monitoring target point by the three-point distance measurement method yields spatial position information comparable to a distance image with as few as three receiving sensors. By increasing the number of receiving units, and by adding transmission units linked to each of them, the accuracy of fingertip = monitoring target point detection can be raised further. This embodiment adopts the most basic configuration of one transmission unit and three receiving units, while the smartphone modification shown in FIG. 18 adopts a more advanced configuration with three linked transmitter-receiver pairs, each pair at the same position.
Instead of the three-point distance measurement method, it is also possible to recognize the pointing gesture from a distance image, acquire the spatial position information of the fingertip = monitoring target point, and perform pointing and command operations in combination with the 3D touch panel and Doppler commands. If a distance image is already used for another purpose, it can simply be reused here.
In the present embodiment, pointing gesture recognition and Doppler effect detection adopt a result only when the independent pieces of information provided by multiple pulse reflections at the three different receiving units agree; this statistical cross-check keeps the possibility of misrecognition low. As noted above, three reception points is the most basic specification: as the number of receiving units and the pulse wave density increase, the number of cross-checks increases and the reliability of the determination improves further.
In the present embodiment, the transmission unit irradiates the transmitted wave into a space in which no objects other than the monitoring target and the operator exist nearby, and the receiving units receive only reflected waves from that space, so unnecessary signals (noise) are suppressed from the outset. In addition, filtering and pointing gesture recognition are performed so that the goals of fingertip = monitoring target point detection and Doppler effect detection are achieved while retaining only the necessary minimum of information. Unlike conventional gesture GUIs, this requires neither massive data processing nor complex image processing on high-spec hardware, so the CPU load can be greatly reduced.
In the present embodiment, the combination of the 3D touch panel and Doppler commands allows the operator to use the space in front of the display screen of the information processing terminal freely and naturally. On the 3D touch panel, the size of the virtual operation screen VP changes according to the distance of the fingertip = monitoring target point from the display screen: to move the pointer over a large range, bring the finger closer to the screen; for more precise operation, pull it back. Combining this with Doppler commands, in which a touching gesture produces the effect of an actual touch, further improves operability.
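The distance-dependent virtual operation screen amounts to a distance-dependent pointer gain. In the sketch below, the VP is a rectangle similar to the screen whose width grows linearly with the fingertip's distance z; the linear law and every numeric constant are illustrative assumptions, not values from the text.

```python
def pointer_position(fx, fy, z, screen_w=1920, screen_h=1080,
                     vp_w0=0.10, k=1.0):
    """Map fingertip coordinates (fx, fy, in metres, origin at the VP
    centre) at distance z from the screen to pixel coordinates.

    The virtual plane widens with z, so the same finger displacement
    moves the pointer less when the finger is farther away (precise
    work) and more when it is close (large pointer sweeps)."""
    vp_w = vp_w0 * (1.0 + k * z)           # assumed linear growth law
    vp_h = vp_w * screen_h / screen_w      # same aspect ratio as screen
    px = int(round((fx / vp_w + 0.5) * screen_w))
    py = int(round((fy / vp_h + 0.5) * screen_h))
    # Clamp the pointer to the screen:
    return min(max(px, 0), screen_w - 1), min(max(py, 0), screen_h - 1)
```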
[Second Embodiment]
Next, a second embodiment of the present invention will be described. In the present embodiment, based on the three-dimensional shape of the finger in a distance image, which is the distribution of distances to the object surface, the pointer is displayed at the point on the display screen that lies on the extension line of the pointing direction. FIG. 12 is a block diagram illustrating the functional modules constructed on the CPU of the operation input system according to the second embodiment. In the present embodiment, the same components as in the first embodiment described above are denoted by the same reference numerals; their functions are the same unless otherwise noted, and their description is omitted.
(Operation input system configuration)
As shown in FIG. 12, the signal processing unit 112 according to the present embodiment includes a columnar detection unit 112i and a vector determination unit 112h as characteristic components.
In the present embodiment, the single receiving unit 6a provided at the lower right of the display screen 5b is a so-called distance image camera. A distance image camera is an image-capturing semiconductor chip, such as a CCD, that measures the travel distance of the reflected wave received at each of its many sensor elements and images the result to generate a so-called distance image.
The role of the pointing gesture recognition unit 112b is the same as in the first embodiment, but in the present embodiment the pointing gesture is extracted based on the shape obtained from the image information. As shown in FIG. 14, a finger directed at the display screen can be modeled as an elongated cylinder, so a pointing gesture is recognized when a cylindrical shape of a predetermined diameter facing the display screen is found in the filtered image data.
When the pointing gesture is viewed in the distance image, a shadow appears on the side opposite the irradiation direction of the transmitted wave, so in practice the pointing gesture is recognized on the basis of an elongated rectangle projected onto the xz plane, that is, a plane parallel to the keyboard. The distance image data in which the pointing gesture has been recognized in this way is input to the columnar detection unit 112i.
When a pointing gesture is recognized, the columnar detection unit 112i geometrically approximates and renders in three dimensions a cylindrical shape such as Ob in FIGS. 14 and 15(b), based on the input distance image data, from the portion of the image that contains no shadow (FIG. 15(a)).
The vector determination unit 112h forms an approximate expression for the axis vector V1 (FIGS. 13 to 15) of the columnar shape Ob (FIGS. 14 and 15(b)) detected by the columnar detection unit 112i. The operation surface setting unit 112e sets, within the monitoring space region, a 3D grid whose XY axes lie on the display screen 5b and whose Z axis is orthogonal to it. The 3D grid set by the operation surface setting unit 112e is input to the position information calculation unit 112d as a framework.
The position information calculation unit 112d calculates the coordinate position (x, y) at which the approximated straight line of the vector V1, expressed within the 3D grid, intersects the display screen 5b, and passes this to the pointing signal generation unit 112f as the pointer position. Based on this, the pointing signal generation unit 112f generates an input signal designating the position and movement of the pointer.
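The columnar-axis fit and the screen intersection can be sketched together. Taking the screen as the plane z = 0 of the 3D grid follows the text; approximating the axis vector V1 by the principal component (SVD) of the sampled cylinder points is an illustrative choice of fitting method.

```python
import numpy as np

def pointing_intersection(cyl_points):
    """cyl_points: Nx3 points sampled from the finger's columnar shape,
    expressed in the 3D grid (display screen = plane z = 0).
    Returns the (x, y) where the axis line meets the screen, or None."""
    pts = np.asarray(cyl_points, dtype=float)
    centroid = pts.mean(axis=0)
    # First right-singular vector = principal axis, approximating V1.
    _, _, vt = np.linalg.svd(pts - centroid)
    v1 = vt[0]
    if abs(v1[2]) < 1e-12:
        return None  # finger parallel to the screen: no intersection
    t = -centroid[2] / v1[2]
    x, y, _ = centroid + t * v1
    return x, y
```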
In the present embodiment, if a second pointing gesture is recognized during the above process, it is captured as a subcommand signal and the content of the input signal is changed. For example, as shown in FIG. 16, if a gesture such as raising the little finger is performed together with the index-finger pointing gesture, it is judged to be a subcommand signal, a subcommand dialog is displayed on the screen, and the range of selection operations available to the operator is expanded. This corresponds to the subcommand key of the first embodiment (1a in FIG. 1); here, the rich information content of the distance image allows it to be replaced by a gesture.
(Operation input procedure)
The operation input method of the present invention can be implemented by operating the operation input system according to the present embodiment described above. FIG. 17 is a flowchart showing the procedure of the operation input method according to this embodiment. The basic process is the same as in the first embodiment, but the pointing operation is performed based on the vector of the pointing gesture obtained by distance image analysis, instead of the position information of the fingertip = monitoring target point. In the following, content identical to the first embodiment is omitted, and only the steps unique to this embodiment are described.
In step S202, the object is monitored with a distance image camera, and in step S204, the presence or absence of a pointing gesture is determined based on the distance image data.
In step S205, if it is determined in the preceding step that a pointing gesture is present, the columnar detection unit 112i geometrically approximates the columnar shape Ob shown in FIGS. 14 and 15, based on the distance image data of the pointing gesture.
In step S206, the vector determination unit 112h calculates the direction of the axis of the columnar shape Ob detected by the columnar detection unit 112i, that is, the approximate expression of the vector V1, as shown in FIGS.
In step S207, the position information calculation unit 112d calculates the coordinate position at which the extension line of the vector V1 intersects the display screen 5b, and passes it to the pointing signal generation unit 112f. Based on this information, the pointing signal generation unit 112f generates a pointing signal designating the pointer position and movement (S209) and inputs it to the input signal generation unit 112g.
The processes relating to Doppler command generation in steps S208 and S211 are also basically the same as in the first embodiment. The present embodiment differs in that the reflected wave samples whose wavelengths are compared with the transmitted wave to detect the Doppler effect are extracted directly from the distance image data, rather than from the initial motion wave corresponding to the fingertip = monitoring target point after pointing gesture recognition.
While the first embodiment used a subcommand key, the present embodiment exploits the rich information content of the distance image and generates the subcommand signal by the second pointing gesture. When the subcommand gesture is judged to be recognized, processing to display the subcommand menu is executed, after which a pointing signal for moving the pointer over the subcommand menu is generated. Thereafter, in step S210, based on the position information, the signal processing unit 112 generates and outputs an input signal relating to the coordinate position on the display screen 5b.
[Modifications]
 The above description of the embodiments is only an example of the present invention. The present invention is therefore not limited to the embodiments described above, and various modifications are possible according to design and other factors, provided they do not depart from the technical idea of the invention.
In the first and second embodiments described above, a notebook personal computer was used as the information processing terminal, but the operation input program of the present invention may also be installed in a smartphone 1'. In this modification, as shown in FIG. 18, the transmitter-receiver units are arranged in a triangle at the upper left of the display screen 5b, facing the operator. There are as many transmission units as receiving units, co-located with them, and each linked transmitter-receiver pair measures the distance to the fingertip = monitoring target point at its own location. Because transmitter and receiver are at the same location, the accuracy of the distance measurement, and hence of the fingertip = monitoring target point position information, is high. With this configuration of transmitter-receiver units, excellent operability can be achieved even on a small screen such as a smartphone's.
This modification assumes that the operator holds the smartphone in the left hand and performs the pointing gesture with the right index finger, or holds it in the right hand and performs the pointing gesture with the right thumb. Unlike a notebook personal computer, a smartphone has few freely usable keys, so a subcommand button 1b is newly provided at the lower part of the left side. The function of this button 1b is the same as the subcommand key of the first embodiment shown at 1a in FIG. 1; by combining operation of this button with the pointing gesture, various commands can be executed in addition to Doppler commands.
 VP … virtual operation screen
 W1 … transmitted wave
 W2 … reflected wave
 1 … notebook personal computer
 1' … smartphone
 1a … subcommand key
 3a … fingertip (monitoring target point)
 3b … fingertip (second monitoring target point)
 6a, 6a0 to 6a2 … receiving units
 10 … CPU bus
 11 … CPU
 12 … memory
 13 … communication interface
 14 … storage device
 15 … output interface
 16 … input interface
 51 … pointer
 111 … synchronization processing unit
 112 … signal processing unit
 112a … filtering unit
 112b … pointing gesture recognition unit
 112c … fingertip = monitoring target point detection unit
 112d … position information calculation unit
 112e … operation surface setting unit
 112f … pointing signal generation unit
 112g … input signal generation unit
 112h … vector determination unit
 112i … columnar detection unit
 113 … contact/separation detection unit
 113a … Doppler effect detection unit
 113b … Doppler command generation unit

Claims (12)

  1.  An operation input system for inputting a coordinate position on a display screen of an information processing device, and an operation command relating to that coordinate position, the system comprising:
     a transmission unit that transmits electromagnetic waves or sound waves, as a transmitted wave, toward a monitoring space region including part or all of the visual field range from which the display screen is visible;
     a receiving unit that receives, as a reflected wave, the transmitted wave reflected by an object;
     a contact/separation detection unit that compares the wavelength of the transmitted wave with the wavelength of the reflected wave to detect the movement, and the speed, of the object in a direction approaching or leaving the transmission unit or the receiving unit; and
     a signal processing unit that detects, based on the transmitted wave and the received reflected wave, position information relating to the position in three-dimensional space of the object that generated the reflected wave and to its displacement, and that, based on the position information and the detection result of the contact/separation detection unit, generates an input signal relating to a coordinate position on the display screen and provides it to the information processing device.
  2.  The operation input system according to claim 1, wherein
     at least one transmission unit and at least three receiving units, spaced apart from one another, or at least three linked transmitter-receiver pairs, are provided;
     the signal processing unit further comprises a position information calculation unit that determines the spatial position of the fingertip = monitoring target point by the three-point distance measurement method, based on the time length from transmission of the transmitted wave to reception of the reflected wave, and generates the input signal on that basis; and
     the contact/separation detection unit detects, as the object, the movement of the monitoring target point and its speed.
  3.  The operation input system according to claim 1, further comprising a monitoring target point detection unit that, based on a distance image that is the distribution of distances between the receiving unit and the object surface, detects, as the monitoring target point, the coordinate point within the monitoring space region at which the distance to the receiving unit is shortest,
     wherein the contact/separation detection unit detects, as the object, the movement of the monitoring target point and its speed.
  4.  The operation input system according to claim 2 or 3, wherein
     the signal processing unit further comprises an operation surface setting unit that sets, within the monitoring space region, innumerable continuous, mutually parallel virtual operation surfaces that are parallel to the display screen or at a predetermined angle to it and are of the same or similar shape as the display screen, and that changes the size of the virtual operation surface according to the distance from the receiving unit to the monitoring target point; and
     the input signal is generated according to the coordinate position of the monitoring target point on the virtual operation surface.
  5.  The operation input system according to claim 2 or 3, further comprising a filtering unit that extracts only the reflected waves from the coordinate point at which the distance to the receiving unit is shortest up to a predetermined distance, and sends them to the subsequent processing flow,
     wherein the signal processing unit generates the input signal based on the reflected waves extracted by the filtering unit.
  6.  The operation input system according to claim 2 or 3, wherein the signal processing unit recognizes the operator's pointing gesture based on morphological features of the amplitude-time profile of the reflected wave in the case of the three-point distance measurement method, or on the three-dimensional shape of the finger in the case of the distance image, and detects, as the fingertip = monitoring target point, the initial motion wave of the amplitude-time profile or the point closest to the receiving unit, respectively.
  7.  The operation input system according to claim 2 or 3, wherein the signal processing unit detects an arbitrary operation by the operator and changes the content of the input signal according to the presence or absence of that operation.
  8.  The operation input system further comprising a columnar detection unit and a vector determination unit that detect, in the distance image, a columnar shape modeling the shape of the finger directed at the display screen in the pointing gesture, and generate, as an input signal, a designation of the coordinate position at which an extension line of the axis of the detected columnar shape intersects the display screen,
     wherein the contact/separation detection unit detects the movement of the monitoring target point and its speed.
  9.  The operation input system according to claim 3 or 8, wherein, in the pointing gesture recognition, a second pointing gesture, the one whose distance to each receiving unit within the monitoring space region is the second shortest, is detected based on the distance image; and
     the signal processing unit changes the content of the input signal according to the presence or absence of the second pointing gesture.
  10.  The operation input system according to claim 1, wherein
     the transmission unit irradiates the transmitted pulse wave with directivity toward an unobstructed space; and
     the receiving unit has directivity that blocks re-reflections from nearby objects and selectively receives only waves reflected directly from the monitoring object in the air.
  11.  An operation input method for inputting a coordinate position on a display screen of an information processing device, and an operation command relating to that coordinate position, the method comprising the steps of:
     transmitting electromagnetic waves or sound waves, as a transmitted wave, from a transmission unit toward a monitoring space region including part or all of the visual field range from which the display screen is visible, and receiving, at a receiving unit, the transmitted wave reflected by an object as a reflected wave;
     detecting, based on the transmitted wave and the received reflected wave, position information relating to the position in three-dimensional space of the object that generated the reflected wave and to its displacement, and generating, at a signal processing unit and based on the position information, an input signal relating to a coordinate position on the display screen and providing it to the information processing device; and
     comparing, at a contact/separation detection unit, the wavelength of the transmitted wave with the wavelength of the reflected wave to detect the movement, and the speed, of the object in a direction approaching or leaving the transmission unit or the receiving unit.
  12.  An operation input program for inputting a coordinate position on a display screen of an information processing apparatus and an operation command relating to that coordinate position, the program causing a computer to function as:
     a transmitting unit that transmits an electromagnetic wave or a sound wave as a transmitted wave toward a monitored spatial region including part or all of the visual field range from which the display screen is visible;
     a receiving unit that receives, as a reflected wave, the transmitted wave reflected by an object;
     a signal processing unit that detects, based on the transmitted wave and the received reflected wave, position information concerning the position in three-dimensional space, and the displacement, of the object that generated the reflected wave, generates an input signal relating to a coordinate position on the display screen based on the position information, and provides the input signal to the information processing apparatus; and
     an approach/separation detection unit that compares the wavelength of the transmitted wave with the wavelength of the reflected wave to detect movement of the object, and its speed, in a direction approaching or receding from the transmitting unit or the receiving unit.
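    The signal processing unit's role in claim 12 is to turn a detected position in the monitored space into a coordinate position on the display screen. One simple way to do that is a linear mapping from the monitored region to screen pixels; the sketch below illustrates this under assumed values (the region extents, 1920x1080 screen, and all names are hypothetical, not specified by the application).

```python
# Illustrative sketch (not from the patent): map an object's detected
# position in the monitored 3-D space onto display-screen coordinates
# with a clamped linear mapping.

from dataclasses import dataclass

@dataclass
class MonitoredRegion:
    # Extent of the monitored plane in metres, sensor-centred.
    x_min: float
    x_max: float
    y_min: float
    y_max: float

def to_screen(x: float, y: float, region: MonitoredRegion,
              screen_w: int = 1920, screen_h: int = 1080) -> tuple:
    """Linearly map a point in the monitored plane to pixel coordinates,
    clamping to the screen edges so out-of-region points stay on screen."""
    u = (x - region.x_min) / (region.x_max - region.x_min)
    v = (y - region.y_min) / (region.y_max - region.y_min)
    px = min(max(int(u * (screen_w - 1)), 0), screen_w - 1)
    py = min(max(int(v * (screen_h - 1)), 0), screen_h - 1)
    return px, py

# A 60 cm x 40 cm monitored region in front of the screen.
region = MonitoredRegion(-0.3, 0.3, 0.0, 0.4)
print(to_screen(0.0, 0.2, region))  # centre of region -> (959, 539)
```

The third spatial coordinate (distance from the sensor) is what the approach/separation detection unit consumes, e.g. to distinguish a hover from a "press" gesture.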
PCT/JP2017/007123 2017-02-24 2017-02-24 Operation input system, operation input method, and operation input program WO2018154714A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/007123 WO2018154714A1 (en) 2017-02-24 2017-02-24 Operation input system, operation input method, and operation input program


Publications (1)

Publication Number Publication Date
WO2018154714A1 true WO2018154714A1 (en) 2018-08-30

Family

ID=63252556

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/007123 WO2018154714A1 (en) 2017-02-24 2017-02-24 Operation input system, operation input method, and operation input program

Country Status (1)

Country Link
WO (1) WO2018154714A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005280396A (en) * 2004-03-29 2005-10-13 Alpine Electronics Inc Operation instruction device
JP2013536493A (en) * 2010-06-29 2013-09-19 クゥアルコム・インコーポレイテッド Touchless sensing and gesture recognition using continuous wave ultrasound signals


Similar Documents

Publication Publication Date Title
KR102230630B1 (en) Rapid gesture re-engagement
US8593398B2 (en) Apparatus and method for proximity based input
US9020194B2 (en) Systems and methods for performing a device action based on a detected gesture
CN105229582B (en) Gesture detection based on proximity sensor and image sensor
EP2908215B1 (en) Method and apparatus for gesture detection and display control
US20140189579A1 (en) System and method for controlling zooming and/or scrolling
WO2019033957A1 (en) Interaction position determination method and system, storage medium and smart terminal
US20140237401A1 (en) Interpretation of a gesture on a touch sensing device
US20160274732A1 (en) Touchless user interfaces for electronic devices
KR101194883B1 (en) system for controling non-contact screen and method for controling non-contact screen in the system
US20140267142A1 (en) Extending interactive inputs via sensor fusion
KR20140114913A (en) Apparatus and Method for operating sensors in user device
JP2014219938A (en) Input assistance device, input assistance method, and program
US10346992B2 (en) Information processing apparatus, information processing method, and program
WO2014146516A1 (en) Interactive device and method for left and right hands
TWI522853B (en) Navigation device and image display system
TWI486815B (en) Display device, system and method for controlling the display device
US10394442B2 (en) Adjustment of user interface elements based on user accuracy and content consumption
US8749488B2 (en) Apparatus and method for providing contactless graphic user interface
TWI510082B (en) Image capturing method for image rcognition and system thereof
KR101019255B1 (en) wireless apparatus and method for space touch sensing and screen apparatus using depth sensor
WO2018154714A1 (en) Operation input system, operation input method, and operation input program
US10338692B1 (en) Dual touchpad system
TW201429217A (en) Cell phone with contact free controllable function
KR101394604B1 (en) method for implementing user interface based on motion detection and apparatus thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17898016

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: 1205A 20.01.2020

122 Ep: pct application non-entry in european phase

Ref document number: 17898016

Country of ref document: EP

Kind code of ref document: A1