WO2015160231A1 - Multifunctional human interface apparatus - Google Patents

Multifunctional human interface apparatus

Info

Publication number
WO2015160231A1
WO2015160231A1 (application PCT/KR2015/003913)
Authority
WO
WIPO (PCT)
Prior art keywords
input
pointer
location information
information input
mode
Prior art date
Application number
PCT/KR2015/003913
Other languages
English (en)
Korean (ko)
Inventor
조은형
Original Assignee
조은형
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 조은형 filed Critical 조은형
Priority to CN201580001841.7A (CN105659193B)
Publication of WO2015160231A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02 — Input arrangements using manually operated switches, e.g. using keyboards or dials

Definitions

  • The present invention relates to a human interface that receives text information or pointing location information from a user and transmits it to a digital device capable of handling such information, such as a computer, laptop, tablet PC, or mobile phone.
  • Text input devices such as keyboards, and pointing devices such as mice, which control the position of a pointer and execute functions on the digital device, have been disclosed.
  • Conventionally, the text input device and the pointing device are provided as separate devices, or the pointing input area is configured at a location separate from the text input area of the text input device.
  • As a result, in a work environment that frequently switches between text input, pointing location information input, and pointer execution commands, the user's hand moves more than necessary, reducing work efficiency.
  • In the present invention, a pointing position information input area of a pointing device is provided over the text input area of a text input device, and a switching unit is provided to switch between a text input mode and a pointing position information input mode.
  • Work efficiency can be improved by enabling pointing input with minimal hand movement.
  • FIG. 1 is an exemplary view of a pointing device integrated text input device.
  • FIG. 2 is a flowchart illustrating an example of an operation procedure according to mode switching between a pointing device and a text device.
  • FIG. 3 is a diagram illustrating embodiments of a text input device and a pointing device.
  • FIG. 4 illustrates exemplary embodiments of a pointer execution command unit integrated mode switching unit.
  • FIG. 5 illustrates embodiments of a structure of a pointer location information input device.
  • FIG. 6 is a configuration diagram of an infrared laser sensor module.
  • FIG. 7 is a diagram illustrating an example of displaying a pointer location information input area.
  • FIG. 8 is a first embodiment of a pointer location information input device that can be separated from a text input device.
  • FIG. 9 is a second embodiment of a pointer location information input device that can be separated from a text input device.
  • FIG. 10 is a third embodiment of a pointer location information input device that can be separated from a text input device.
  • FIG. 11 is a front view and a perspective view of a human interface device to which the bottom positional pointer location information input device is applied.
  • FIG. 12 is an embodiment of a pointer execution command unit integrated mode switching unit.
  • FIG. 13 illustrates an embodiment using a cover of a human interface device to which a lower position pointer type location information input device is applied.
  • FIG. 14 is an embodiment of a human interface device applied to a portable notebook.
  • FIG. 15 is an embodiment in which the lower position type pointer position information input device and the lower position type pointer execution command unit are applied.
  • The present invention relates to a human interface that receives text information or pointing location information from a user and transmits it to a digital device capable of handling such information, such as a computer, laptop, tablet PC, or mobile phone.
  • a keyboard composed of a plurality of physical buttons connected to an elastic body and a switch is widely used.
  • Alternatively, a virtual keyboard may be displayed on a display or projected onto a surface; when a part of the user's body touches the virtual keyboard, a gesture or electrical signal is detected and the corresponding text is input.
  • A touch interface can recognize the user's touch by sensing movement of a part of the user's body, detecting contact with a specific surface, detecting current flowing through the user's body, or sensing light or sound waves being blocked or interfered with by a part of the user's body.
  • Examples of the touch interface include a pressure sensitive touch screen, a capacitive touch screen, an optical touch screen, and an ultrasonic touch screen.
  • the resistive touch screen or the pressure sensitive touch screen is a touch screen that operates by sensing pressure.
  • Resistive touch screens are inexpensive and allow precise input in small areas with a stylus pen, but because they sense pressure they require a firm press to register, and their touch response is known to be slightly duller than that of the capacitive touch method.
  • Resistive touchscreens consist of multiple layers: two conductive layers face each other across an air gap.
  • Capacitive sensing or capacitive touch is a method of sensing an operation using a capacitive coupling effect.
  • Unlike pressure-sensitive touchscreens, capacitive touchscreens are made of glass coated with indium tin oxide, a highly conductive transparent material.
  • Sensors are attached to the four corners of the glass so that a current flows through the glass surface.
  • the capacitive touch method is known to have a smooth feeling of operation and scrolling as compared to a pressure sensitive touch screen because it is recognized only by touching the screen, not by pressing the screen with force.
  • the capacitive touch method is capable of multi-touch to touch several places.
  • However, the capacitive touch method does not respond to non-conductive inputs such as leather gloves, fingernails, or ordinary stylus pens.
  • In addition, the sensor is sensitive and may be affected by nearby devices.
  • An optical touchscreen works by using infrared cameras mounted at the corners of the touchscreen and infrared light sources; coordinates are measured from the shadow cast by the object touching the screen.
  • the ultrasonic touch screen operates by measuring the coordinates by emitting an ultrasonic wave on the screen to detect the interference effect of the user's touch.
  • Any of these touch input technologies may be used in the present invention to detect the location of a part of the user's body by sensing the user's contact or movement, and to control the location of the pointer.
  • FIG. 1 is an exemplary view of a pointing device integrated text input device.
  • the pointing device integrated text input device may have a housing 100 for supporting the text input device and the pointer location information input area.
  • The housing has sufficient strength to withstand the pressure of user input, and receives text input information and pointer location information and transmits them by wire or wirelessly to a digital device connected to the pointing device integrated text input device.
  • For this purpose, the housing may include a controller, a memory unit, a battery unit, an encoder, a transmitter, and the like.
  • the pointing device integrated text input device may include a plurality of buttons 109 for receiving text input information from a user.
  • the plurality of buttons 109 may be configured as physical buttons or virtual buttons.
  • A physical button may be connected to an elastic body, or the button itself may be elastic, so that it moves when it receives the user's input and returns to its original position when the user's pressure is removed.
  • The physical button may be connected to an electrical switch so that the user's pressure moves the button, changes the state of the switch, and generates the text input value of that button.
  • Alternatively, the physical button may have an elastic structure without an electrical switch: it moves under the user's pressure and returns to its original position when the pressure is removed, while the user's text input information is detected by a touch input device.
  • In this case, the text input information may be generated based on the location at which the user's pressure or gesture is recognized.
  • the virtual button may be configured by displaying a text input button on a display device.
  • the virtual button may display an arbitrary button by projecting light onto a transparent, translucent, or opaque object.
  • Alternatively, the virtual button may generate the corresponding text input information from the user's pressure or gesture information using unique location information for each character, without being visible to the user.
  • the pointing device integrated text input device may have a pointing position information input area 108a or 108b having at least a portion of the text input area 107 as a common area.
  • The pointing location information input areas 108a and 108b are located on, above, or below the buttons for text input, and, as shown in FIG. 1, may share at least a part of the text input area.
  • the pointing location information input area may include the text input area or the text input area may include a point location information input area.
  • The pointing location information input area and the text input area share at least a part; beyond the shared region, some portions may be used only as the pointing location information input area and others only as the text input area.
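The overlap relationship described above, where the pointing input area and the text input area share some region while each may keep exclusive parts, can be sketched with simple axis-aligned rectangles. All coordinates below are illustrative assumptions, not dimensions from the patent:

```python
def intersect(a, b):
    """Return the overlapping rectangle of two (x, y, w, h) rectangles,
    or None if they do not overlap."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    x1, y1 = max(ax, bx), max(ay, by)
    x2, y2 = min(ax + aw, bx + bw), min(ay + ah, by + bh)
    if x1 < x2 and y1 < y2:
        return (x1, y1, x2 - x1, y2 - y1)
    return None

# Illustrative layout: a 30 cm x 12 cm text input area with a 20 cm x 8 cm
# pointing area placed over part of it.
text_area = (0, 0, 30, 12)
pointing_area = (5, 2, 20, 8)
shared = intersect(text_area, pointing_area)  # the region serving both roles
```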
  • The integrated text input device may include a pointer location information input device 105 for forming a text input area, a pointing location information input area, or a virtual area serving both purposes; this device may be configured on the pointing device integrated text input device or external to it.
  • The pointer location information input device 105, such as an infrared generator and infrared receiver, an RGB camera, an ultrasonic generator and ultrasonic receiver, or an infrared generator and infrared camera, may detect a part of the user's body on the surface of the housing 100 or above it to receive location information and gesture information.
  • the pointer location information input device may be configured with a plurality of devices to expand the pointer location information input area or to improve accuracy and sensitivity.
  • it may have a pointer location information input device 105b for configuring a text input or pointing location information input area 108b of the right hand.
  • the device may have a pointer location information input device 105a for configuring a text input or pointing location information input area 108a of the left hand.
  • the text input area or the pointing location information input area 108b for the right hand may include an area of the J button of the English standard keyboard.
  • the text input area or the pointing location information input area 108a for the left hand may include an area of the F button of the English standard keyboard.
  • the pointer location information input device may include both the J and F buttons of the English standard keyboard.
  • When the pointing device integrated text input device is connected to a plurality of digital devices having display units, the pointing location information input area may move the pointer position across the plurality of display units.
  • For this, the pointer location information input area may be divided to match each display unit, a separate button indicating a display unit may be operated so that pointer position information is sent to the corresponding display unit, or the plurality of display units may be treated as a single virtual display unit so that the pointer can move across it.
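The "virtual single display unit" option described above can be sketched as a coordinate mapping: normalized coordinates from the input area are projected onto several displays treated as one wide screen. The display sizes and the side-by-side arrangement are assumptions for illustration:

```python
def map_to_virtual_display(x_norm, y_norm, displays):
    """Map normalized input-area coordinates (0..1) onto displays arranged
    side by side as one virtual display.

    displays: list of (width, height) pixel tuples, left to right.
    Returns (display_index, x_px, y_px).
    """
    total_width = sum(w for w, _ in displays)
    x_virtual = x_norm * total_width
    offset = 0
    for i, (w, h) in enumerate(displays):
        # The pointer lands on the first display whose span contains x_virtual.
        if x_virtual < offset + w or i == len(displays) - 1:
            return i, x_virtual - offset, y_norm * h
        offset += w

# Example: two 1920x1080 monitors; x_norm = 0.75 lands on the right monitor.
idx, x, y = map_to_virtual_display(0.75, 0.5, [(1920, 1080), (1920, 1080)])
```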
  • The integrated text input device includes a pointing location information input device that constitutes a pointing input area and receives pointer location information from a user, and may have pointer execution command units 101 and 102 for executing at least one function on a button, image, space, icon, or text input window at which the pointer is located.
  • The pointer execution command unit may consist of one or two buttons serving a first function and a second function, and may be located at the left side, right side, or center of the housing.
  • the first function may, for example, function as a left click of a computer mouse and the second function may, for example, function as a right click of a computer mouse.
  • The pointer execution command unit consisting of one or two buttons may be configured on both the left and right sides of the housing for the convenience of both left-handed and right-handed users.
  • the pointer execution command unit may be configured to be operated by using the above-described touch technology to recognize the touch or light blocking of the user's body part or the interference of ultrasonic waves or the shadow of the user's body part.
  • the pointer execution command unit may be configured of an elastic physical button.
  • the pointer execution command unit may operate by using at least one button of the text button located in the text input area in addition to the pointing position information input area.
  • the pointer execution command unit may operate by selecting a physical or virtual text button on a pointing location information input area.
  • For example, when a virtual input device is used for pointing location information input and physical buttons are used for text input, the pointer location information is input on the virtual pointing location information input area, and a pointer execution command can be issued by pressing the physical text button at that location.
  • the pointer execution command unit may receive pointer position information in response to a user first gesture in a pointing position information input area and generate a pointer execution command by a user second gesture at the same position.
  • the pointer execution command unit may be configured to execute a first function by a first gesture or a first voice of the user's body, a first blink, a first mouth shape, or the like.
  • the pointer execution command unit may be configured to execute the second function by a second gesture or a second voice of the user's body, a second blink, a second mouth shape, or the like.
  • the pointing device integrated text input device may be configured to include a text input mode for receiving text information through the text input device and a pointing location information input mode for receiving pointing location information through the pointing location information input device.
  • the text input mode and the pointing position information input mode may be switched by the mode switching unit 103.
  • the mode switching unit 103 may be configured as a switch located separately on the housing.
  • The mode switching unit 103 may perform mode switching by sensing that at least one designated text input button, or a plurality of text input buttons simultaneously, is pressed.
  • the mode switching unit 103 may switch the mode by receiving control information from a digital device connected to the pointing device integrated text input device by wire or wirelessly.
  • the mode switching unit 103 may be integrally formed with the pointer execution command unit 102.
  • The pointer execution command unit 102 may have a first response to a first touch or first pressure and a second response to a second touch or second pressure: a first gesture, such as contact by a part of the user's body, is sensed as the first touch or first pressure and switches the mode, while a second gesture, such as pressing the button, generates a pointer execution command in response to the second touch or second pressure.
  • the mode switching unit 103 may be configured of a temporary mode switching mode and a permanent mode switching mode.
  • Switching from the text input mode to the pointing position information input mode in response to the first touch or first pressure, and returning to the text input mode when the first touch or first pressure is removed, may be set as the temporary switching mode.
  • the switching from the pointing position information input mode to the text input mode in response to the first touch or the first pressure and returning to the pointing position information input mode when the first touch or the first pressure is removed may be set as a temporary switching mode.
  • the temporary mode switching may be performed by the first control information received from the digital device connected to the pointing device integrated text input device and wired or wirelessly.
  • the permanent mode switching may be performed by the second control information received from the digital device connected to the pointing device integrated text input device and wired or wirelessly.
  • the mode switching unit 103 may be integrated with the pointer execution command unit.
  • When the first touch or first pressure is detected, the mode is temporarily switched from the text input mode to the pointer position information input mode, and when the second touch or second pressure is detected, the pointer execution command is executed.
  • When a third touch or third pressure is applied and then removed, the permanent switching mode may be set so that the device continues to operate in the pointer position information input mode even after the third touch or third pressure is removed.
  • the pointer execution command can be input even during the permanent switching mode.
  • The mode switching unit may be configured on the left or right side of the housing (106a, 106b).
  • the mode switching units 106a and 106b configured on the left side or the right side or the left side and the right side of the housing may be configured as virtual buttons or physical buttons to sense and operate a user's touch input or pressure.
  • the mode switching units 106a and 106b configured on the left side or the right side or the left side and the right side of the housing may be configured to have an input area of 3 centimeters or more and less than 15 centimeters along the side of the housing.
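The temporary and permanent mode switching described above can be sketched as a small state machine. The method names and the toggle behavior are a reading of the text, not an implementation from the patent:

```python
TEXT, POINTING = "text", "pointing"

class ModeSwitch:
    """Minimal sketch of the temporary/permanent mode switching unit."""

    def __init__(self):
        self.mode = TEXT
        self.saved_mode = None  # mode to restore after a temporary switch

    def first_touch_down(self):
        # Temporary switch: holding the first touch/pressure enters the
        # other mode.
        self.saved_mode = self.mode
        self.mode = POINTING if self.mode == TEXT else TEXT

    def first_touch_up(self):
        # Removing the first touch returns to the saved mode.
        if self.saved_mode is not None:
            self.mode = self.saved_mode
            self.saved_mode = None

    def permanent_toggle(self):
        # Third touch/pressure: switch modes and stay switched.
        self.mode = POINTING if self.mode == TEXT else TEXT
        self.saved_mode = None

sw = ModeSwitch()
sw.first_touch_down()   # temporarily in pointing mode
sw.first_touch_up()     # back to text mode
sw.permanent_toggle()   # stays in pointing mode until toggled again
```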
  • The pointing device integrated text input device may have a transmitter 104 for transmitting data by wire or wirelessly to an external digital device or to a digital device incorporating the pointing device integrated text input device.
  • the digital device may receive a text input or receive pointer location information.
  • FIG. 2 is a flowchart illustrating an example of an operation procedure according to mode switching between a pointing device and a text device.
  • the pointing device-integrated text input device may have a separate power supply unit or may receive power from the outside by wire or wirelessly and may have a separate switch for controlling the power supply unit.
  • When power is supplied to the pointing device integrated text input device by the switch controlling the power supply unit, it may be determined whether the device is currently in the text input mode or the pointing location information input mode (200).
  • In the text input mode, the text input device is activated and text input may be received from the user (201).
  • The user's text input may be transmitted to a digital device connected by wire or wirelessly (202).
  • the text input mode may be switched to the pointing position information input mode.
  • it may be configured to input pointing position information simultaneously with text input.
  • the pointing location information may be received by the user's input (204).
  • the input pointer position information may be transmitted to a digital device connected by wire or wirelessly (205).
  • the pointing device integrated text input device may transmit the pointer first execution command to a digital device connected by wire or wirelessly (207).
  • Likewise, the pointing device integrated text input device may transmit the pointer second execution command to a digital device connected by wire or wirelessly (209).
  • the pointing device integrated text input device may switch to the text input mode.
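The Fig. 2 procedure, checking the current mode, collecting the corresponding input, and forwarding it to the connected device, can be sketched as a small event loop. The event dictionary shapes and the transmit callback are illustrative assumptions:

```python
def process_event(mode, event, transmit):
    """Handle one input event according to the current mode.
    Returns the (possibly switched) mode."""
    if event["type"] == "mode_switch":
        return "pointing" if mode == "text" else "text"
    if mode == "text" and event["type"] == "key":
        transmit(("text", event["key"]))             # steps 201-202
    elif mode == "pointing":
        if event["type"] == "move":
            transmit(("pointer_pos", event["pos"]))  # steps 204-205
        elif event["type"] == "execute":
            transmit(("pointer_exec", event["n"]))   # steps 207/209
    return mode

sent = []
mode = "text"
for ev in [{"type": "key", "key": "a"},
           {"type": "mode_switch"},
           {"type": "move", "pos": (3, 4)},
           {"type": "execute", "n": 1}]:
    mode = process_event(mode, ev, sent.append)
```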
  • FIG. 3 is a diagram illustrating embodiments of a text input device and a pointing device.
  • the pointing device integrated text input device may include a first housing 301 including a power supply unit, a controller, or a communication unit, and a second housing 302 for configuring a text input area and a pointing location information input area.
  • the pointing device integrated text input device may have a text input device and a text input area 303 for receiving text input from a user.
  • the text input area 303 may be configured as a virtual button or a physical button.
  • the pointing location information input device may be configured in a pressure sensitive or capacitive touch pad type and positioned on a physical button of the text input device (304).
  • When the text input device is configured in the form of a physical touch pad, such as a pressure sensitive or capacitive touch panel, the touch pad may serve as both the text input device and the pointing location information input device, with the mode switching unit switching between the two modes.
  • A physical touch pad such as a pressure sensitive or capacitive touch panel may be configured as a large-area touch pad type pointing device integrated text input device 300, in which a single touch pad 304 covers a plurality of text button areas of the text input unit.
  • Alternatively, a physical touch pad such as a pressure sensitive or capacitive touch sensor may be configured as a multiple touch pad type pointing device integrated text input device 310, in which each of a plurality of touch pads 311 covers one text button area of the text input unit.
  • The pointing location information input may also be configured as an upper camera type pointing device integrated text input device 320, in which the input is received (321) by an infrared camera or RGB camera 222 positioned above the pointing location information input area.
  • the upper camera pointing device-integrated text input device 320 may configure the virtual text input button 303 using the upper camera.
  • Alternatively, a lower camera type pointing device integrated text input device 330 may be configured by placing a camera 332 at the bottom of the second housing to receive the pointing position information input.
  • the lower camera-type pointing device-integrated text input device 330 may configure a virtual text input button 303 to replace the physical text button using the lower camera.
  • A transmission/reception type pointing device integrated text input device 340 may be configured with a virtual pointing location information input area 341 composed of a pair of infrared or ultrasonic receivers that detect where the infrared or ultrasonic waves emitted by a transmitter are blocked or interfered with by a part of the user's body.
  • the pair of virtual pointing position information input areas may be used as a virtual text button input means in place of the physical text button by the mode switching unit.
  • FIG. 4 is a diagram illustrating embodiments of the pointer execution command unit integrated mode switching unit.
  • In a button type pointer execution command unit having switches 404 and 405 for generating a pointer execution command, a touch sensor located on top of the button may detect the user's touch input to switch between the text mode and the pointing position information input mode, forming the pointer execution command unit integrated mode switching unit 400.
  • Alternatively, a pressure-sensitive pointer execution command unit integrated mode switching unit 410 may be configured with second switches 411 and 412, which engage at a second user pressure lower than the first user pressure that engages the first switches 404 and 405 generating the execution command, so that the second switches generate the mode switch command.
  • The button unit 402 may also be moved and fixed so that it does not return to its original position under the force of the elastic body 403.
  • The fixed position may be set so that the second switches 411 and 412 are engaged while the first switches 404 and 405 are not, so that fixing the button operates the device in the permanent mode switching mode.
  • Alternatively, a permanent mode switching switch 414 may be provided and operated by moving the button unit, for example by sliding it.
  • the pointer execution command switches 404 and 405 may be configured to operate when additional pressure is applied in the operated state.
  • The pointer execution command unit integrated mode switching unit may also be configured as a touch pad 421: when the contact area of the part of the user's body touching the pad is within a first predetermined range it acts as a mode switching unit, and when it is within a second predetermined range it acts as a pointer execution command.
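The touch pad variant 421 distinguishes a mode switch from a pointer execution command by the contact area of the touch. The numeric ranges below are placeholders; the patent specifies only a "first" and a "second" predetermined range:

```python
# Assumed thresholds (mm^2): a light fingertip touch vs. a firm, flattened
# press. These values are illustrative, not from the patent.
MODE_SWITCH_RANGE = (5.0, 40.0)
EXEC_RANGE = (40.0, 150.0)

def classify_contact(area_mm2):
    """Classify a touch-pad contact by its area."""
    lo, hi = MODE_SWITCH_RANGE
    if lo <= area_mm2 < hi:
        return "mode_switch"
    lo, hi = EXEC_RANGE
    if lo <= area_mm2 < hi:
        return "pointer_execute"
    return "ignore"

# A light touch switches modes; a firm press issues an execution command.
classify_contact(10.0)
classify_contact(80.0)
```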
  • FIG. 5 illustrates embodiments of a structure of a pointer location information input device.
  • the pointer location information input device of the human interface device of the present invention may include a sensor module 501 including a light emitter and a camera.
  • The light emitter of the sensor module 501 forms a light plane from the emitted light; on the pointing location information input area 108 configured on this light plane, the path of the light is blocked or reflected by an obstacle, for example the user's finger.
  • the camera of the sensor module includes an optical sensor for detecting the light blocked or reflected by the obstacle.
  • the optical sensor may use a line camera that can recognize a line image.
  • A line camera whose image sensor consists of 400 to 1000 elements arranged in a line is suitable; 500 to 700 elements is ideal.
  • The light emitter is composed of a light emitting element that constructs a light plane approximately 1 millimeter above the top of the text input buttons 303.
  • the pointing location information input area 108 is composed of a part of the light plane.
  • As the light emitting element, for example, an infrared emitter, an infrared laser emitter, a laser emitter, an ultraviolet emitter, a visible light emitter, or the like can be used.
  • An infrared laser emitter is ideal as the light emitting element because it does not require a reflector on the edge.
  • For the infrared laser emitter, a laser with a wavelength of 800 to 850 nanometers driven at 0.3 to 1 milliwatt is suitable.
  • the light emitter lets the light from the infrared emitter pass through the line lens to spread in a plane.
  • a line lens may be further included in the light emitter of the light emitter so that the light emitted from the light emitting device may form a light plane.
  • The line lens may be aligned so that the light emitted from the light emitting device forms a light plane parallel to the text input unit.
  • The pointer location information input device may further include an optical filter 504 that transmits only light in a specific wavelength band.
  • The optical filter 504 may be installed so that the light emitted from the light emitter passes through it, and also so that light reflected or blocked by an obstacle passes through it before reaching the camera, preventing light of other wavelengths from being received by the camera.
  • Separate optical filters may be configured for the light emitter and the camera, but it is preferable to place the light emitter and the camera close together so that one optical filter covers both the transmitted and the received light.
  • the optical filter may be installed to be perpendicular to the traveling direction of the light of the light emitter.
  • Alternatively, for design reasons, the optical filter may be installed at 30 to 60 degrees or 120 to 150 degrees to the direction of travel of the emitted light.
  • the sensor module 501 may be installed so that the light of the light emitter is directly radiated to the light plane.
  • the sensor module 501 may further include one reflector 503 such that the light of the light emitter is refracted once and radiated to the light plane.
  • the light blocked or reflected by the obstacle may be refracted by the reflector 503 once and received by the camera.
  • the sensor module 501 may include two reflecting plates 503a and 503b such that the light of the light emitter is refracted twice and radiated to the light plane.
  • the light blocked or reflected by the obstacle may be refracted twice by the two reflecting plates 503a and 503b and received by the camera.
  • the sensor module may be installed in the first housing 301.
  • an optical filter may be coated on the reflecting plate 503 to replace a separate optical filter 504.
  • The user inputs pointing location information by swiping a finger across the text input buttons 303.
  • The gap between the text input buttons 303 is ideally 0.1 to 1 millimeter so that the user's finger is not caught between the buttons.
  • the plurality of text input buttons 303 disposed under the pointing location information input region 108 are positioned so that their upper surfaces lie on a flat plane.
  • the light emitted from the sensor module 501 is reflected by a reflector or an obstacle, and the light received by the camera passes through the light tunnel 502.
  • if the upper surface of the light tunnel is replaced with a portion of the first housing 301, the thickness of the light tunnel can be reduced.
  • the lower surface and the side surface of the light tunnel may be designed as a separate structure from the first housing.
  • FIG. 6 is a configuration diagram of an infrared laser sensor module.
  • the infrared sensor module operates on the principle of emitting light covering the pointing location information area by using the light source 601 and calculating the location of an obstacle by analyzing, with one or two cameras 603, the light reflected or blocked by the obstacle.
  • a line lens 602 may be used to scatter light so that light emitted from the light source 601 may be emitted to a desired area.
  • a line camera may be used as the camera for recognizing the state in which the light from the light source is reflected or blocked by an obstacle.
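The two-camera location principle above can be sketched as a simple ray intersection. This is an illustrative geometry only: the camera positions, the baseline, and the angle convention are assumptions for the sketch, not calibration details from the original.

```python
import math

def triangulate(baseline, angle_left, angle_right):
    """Locate an obstacle from the viewing angles of two line cameras.

    Assumed geometry: the cameras sit at (0, 0) and (baseline, 0), and
    each angle is measured in radians from the baseline toward the
    obstacle. The obstacle is the intersection of the two viewing rays.
    """
    # Intersect y = x*tan(aL) with y = (baseline - x)*tan(aR).
    t_l, t_r = math.tan(angle_left), math.tan(angle_right)
    x = baseline * t_r / (t_l + t_r)
    y = x * t_l
    return x, y
```

For example, with a 2-unit baseline and both cameras seeing the obstacle at 45 degrees, the obstacle lies at the midpoint, one unit above the baseline.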
  • a guide tunnel may be installed in front of the camera to help the camera receive light only from the desired area.
  • the light source may be infrared rays, ultraviolet rays, lasers, or the like.
  • the light source and the camera may be replaced by an ultrasonic emitter and an ultrasonic receiver.
  • an infrared laser may be used as the light source so that a reflector need not be provided on the edge of the keyboard.
  • the sensor module may have an auxiliary control board that controls the light source and the camera and transmits the received signal of the camera to the main control board.
  • the pointer location information input unit may be positioned to include a portion of an extended surface of the text input unit plane.
  • the text input unit includes a plurality of buttons for receiving at least one text; the buttons may be configured as physical buttons that generate a text selection signal while moving from a first position to a second position under the pressure applied by the user and return to the first position by an elastic body.
  • the text input unit includes F4 and F5 buttons, and the pointer position information input unit may be designed so that, of its two cameras, the button of the text input unit closest to the left camera is the F4 or F5 button.
  • the index finger of the left hand can obstruct the camera when inputting the pointer position information with the right hand.
  • when the pointer execution command unit is assigned to the button having the largest area among the text input buttons, for example the space bar, it is convenient to execute a pointer command with the left hand while inputting mouse position information with the right hand.
  • the mode switching unit and the pointer execution command unit preferably have a shortest distance of 8 to 15 centimeters.
  • the mode switching unit may be provided separately from the buttons of the text input unit and may recognize that a part of the user's body is touching a part of the human interface body in order to switch between the text input mode and the pointer position information input mode; it operates in the pointer location information mode while touched and in the text input mode when not touched.
  • the pointer execution command unit may include two buttons, performing a first function, for example a left click of the mouse, when the first button is pressed, and a second function, for example a right click of the mouse, when the second button is pressed.
  • the first button is a button having the largest area among the text input buttons, for example, a space bar button
  • the second button is the button located at the left or right side of the first button.
  • this arrangement is convenient to control with the thumb and provides an easy user experience.
  • a display indicating the text input mode or the pointer location information input mode may be displayed.
  • the pointer position information input mode may be indicated by temporarily displaying the pointer position information input region at the moment a touch occurs in the mode switching unit, or by displaying it with a lighting signal from the time the touch of the mode switching unit occurs until it is released.
  • the text input area of the text input part may be designed to be the same as the pointer location information input area of the pointer location information input part or to include a pointer location information input area of the pointer location information input part as shown in FIG. 1.
  • one or two optical emitters and a camera may be configured.
  • the optical signal received by the camera may pass through a light tunnel to block an external noise light source.
  • the light tunnel may be configured before the optical signal is emitted from the optical emitter to the pointer location information input area, and may be installed in front of a camera that receives a user input signal from the pointer location information input area.
  • the reflection plate provides freedom to the installation position and the direction of the pointer position information input unit, thereby obtaining space utilization and design benefits.
  • the pointer location information input unit may configure, by means of the optical emitter and the camera, a pointer location information input area 0.1 to 5 millimeters above the text input unit.
  • in this way, the pointer location information is received only when the user has a clear intention to touch.
  • when an infrared laser having a wavelength of 800 to 850 nanometers is used as the optical emitter and designed to operate at 0.3 to 1 milliwatt, it exhibits excellent power consumption, cost, stability, safety, and visual characteristics.
  • the optical emitter may further include a line lens for scattering the optical signal such that the optical signal constitutes a pointer location information input area parallel to the text input unit.
  • the pointer location information input unit may be designed to block optical external noise by using an optical filter that transmits only an optical signal in an infrared region.
  • when the text input unit is designed to have a distance of 0.1 to 1 millimeter between physical buttons, it may provide the user with a sufficiently soft and flat feeling while inputting the pointer location information.
  • FIG. 7 is a diagram illustrating an example of displaying a pointer location information input area.
  • the human interface device of the present invention may be designed to further include pointer location information input area display units 701 and 702 for visually displaying the pointer location information input area in the pointer location information input mode.
  • the mode switching unit of the present invention may be provided separately from the buttons of the text input unit and may recognize that a part of the user's body is touching a part of the human interface body in order to switch between the text input mode and the pointer position information input mode; it may be designed to operate in the pointer location information input mode while touched and in the text input mode while not touched.
  • the hand for the pointer position information input 108a may be designed to be the left hand.
  • in this case, the pointer position information input area is designed to be closer to the edge opposite the corner where the mode switching unit is located; the same principle applies when the left hand and the right hand are reversed.
  • when the part of the body used for the mode switching is the right hand, the hand for inputting the pointer position information may also be designed to be the right hand.
  • in this case, the pointer location information input area is designed to be closer to the corner where the mode switching part is located than to the opposite corner; the left-hand case may be designed on the same principle.
  • the mode switching unit may determine the text input mode and the pointer location information input mode based on the number of user fingers recognized by the pointer location information input unit.
  • the number of recognized user fingers for determining the pointer position information input mode is smaller than the number of recognized user fingers for determining the text input mode.
  • for example, the fingers of the left hand may be lifted from the text input buttons while only the index finger of the right hand touches to input the pointer location information; in this case the number of recognized fingers is one.
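The finger-count rule above can be sketched as a one-line decision. This is a minimal illustration: the exact threshold of one finger for pointer input follows the example in the text, but treating every other count as text input is an assumption.

```python
def select_mode(finger_count):
    """Decide the input mode from the number of fingers the sensor sees.

    Illustrative rule: a single detected finger means pointer input;
    any other count (e.g. a typing posture with several fingers down)
    means text input. The threshold is an assumption for this sketch.
    """
    return "pointer" if finger_count == 1 else "text"
```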
  • the pointer position information input mode display unit may visually display the pointer position information input area temporarily when the mode switching unit switches to the pointer position information input mode, or continuously from the time of switching to the pointer position information input mode until the time it is released.
  • the pointer position information input mode may be displayed by a visible light generator beneath the text input buttons, by light reflected by the text input buttons, or through the gaps between the text input buttons.
  • the pointer position information input mode display unit includes invisible light generators 701a and 701b and surfaces 702a and 702b coated with dyes that optically react to the invisible light to emit visible light; the dye may be applied to the text input buttons or to the gaps between the text input buttons.
  • the mode switching unit may further include a handside determination unit that determines whether the user uses the pointer position information input with the left hand or the right hand.
  • the display area of the pointer location information input display unit may be flexibly displayed according to the handside determination unit.
  • depending on the determination of the handside determination unit, the button arrangement and operation of the pointer execution command unit can be switched flexibly.
  • the button for performing right-click and left-click may be switched according to the determination of the handside determiner.
  • the pointer location information input area display unit may display different pointer location information input areas according to the determination result of the handside determination unit 702a or 702b.
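The handside-dependent button swap described above can be sketched as a small mapping. This is a hypothetical illustration: the function name, the mode labels, and the choice of which button defaults to left click are assumptions, not details from the original.

```python
def button_functions(handside):
    """Map the two pointer execution buttons to click functions.

    Hypothetical mapping: for a right-handed user the first button
    (e.g. the large space-bar button) acts as left click; for a
    left-handed user the handside determiner swaps the roles.
    """
    if handside == "right":
        return {"first_button": "left_click", "second_button": "right_click"}
    return {"first_button": "right_click", "second_button": "left_click"}
```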
  • the mode switching unit may be designed to automatically switch to the text input mode when the pointer location information input is not input from the user for a predetermined time or when an input is received through a text input button in the pointer location information input mode.
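The automatic fallback to text mode can be sketched as follows. This is a minimal sketch under stated assumptions: the `ModeSwitcher` class, the 3-second timeout, and the injectable clock are illustrative; the original only says "a predetermined time".

```python
import time

class ModeSwitcher:
    """Fall back to text mode after a period without pointer input."""

    def __init__(self, timeout=3.0, clock=time.monotonic):
        self.timeout = timeout          # assumed value; original says "predetermined"
        self.clock = clock              # injectable for testing
        self.mode = "text"
        self.last_pointer_input = None

    def on_pointer_input(self):
        self.mode = "pointer"
        self.last_pointer_input = self.clock()

    def on_text_button(self):
        # A press on a text input button immediately restores text mode.
        self.mode = "text"

    def poll(self):
        # Revert to text mode once the inactivity timeout has elapsed.
        if (self.mode == "pointer" and self.last_pointer_input is not None
                and self.clock() - self.last_pointer_input > self.timeout):
            self.mode = "text"
        return self.mode
```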
  • a dye may be applied to the text input buttons or to a portion of the text input area so that the pointer input area is visually displayed regardless of the mode of the mode switching unit and can be recognized even during the text input mode.
  • FIG. 8 is a first embodiment of a pointer location information input device that can be separated from a text input device.
  • the pointer location information input device 810 of the present invention is designed to be separable from the text input device 800 and may be combined with (801) or separated from (802) the text input device.
  • the power connection unit 820 included in the pointer location information input device may receive power from a power source of the text input device 800.
  • the pointer location information input device 801 includes at least one pointer location information input device 105a, 105b, and may further include a controller and an optical filter 504.
  • the detachable pointer position information input device 810 may be combined to form a pointing position information input area on a plane parallel to the text input area (801), on the left side 803, or on the right side 804 of the text input device.
  • the pointing location information input area may be configured on the bottom surface of the text input device.
  • the pointing position information input device and the power connection unit 820 are designed to deform flexibly with changes in the position of the pointing position information input device.
  • FIG. 9 is a second embodiment of a pointer location information input device that can be separated from a text input device.
  • the pointer location information input device 900 includes a built-in power supply.
  • when the pointer location information input device 900 is in the first location information input mode 902, the housing can be designed so that the device is positioned at a constant height and angle such that the pointer location information input area is parallel to the text input area of the text input device.
  • in the second location information input mode, the housing of the pointer position information input device 900 can be designed so that the pointer position information input area is formed at a constant height and angle parallel to the bottom surface on which the device 900 is placed.
  • the height of the optical filter 504 in the first positional information input mode is higher than the height of the optical filter 504 in the second positional information input mode.
  • when the pointer location information input device 900 is in the third location information input mode 902, the housing of the device can be designed so that it is positioned at a predetermined height and angle such that the pointer location information input area is perpendicular to the floor on which the device is placed.
  • the pointer location information input apparatus 900 may include a location information input mode sensor unit that distinguishes at least two of the first, second, and third location information input modes according to the state in which the device is placed on the floor.
  • the location information input mode sensor unit may be configured as a sensor or a switch utilizing gravity.
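A gravity-based mode sensor could be sketched as a classifier over an accelerometer reading. The axis convention, the thresholds, and the mapping of face-up versus face-down to the first and second modes are all assumptions for illustration; the original only says the sensor "utilizes gravity".

```python
def position_mode(gravity_vector):
    """Classify the device posture from a gravity (accelerometer) reading.

    Assumed convention: z is perpendicular to the device top. A dominant
    z component means the device lies flat (first or second mode,
    separated here by face-up vs face-down as an illustrative rule);
    a dominant in-plane component means it stands upright (third mode).
    """
    gx, gy, gz = gravity_vector
    if abs(gz) >= max(abs(gx), abs(gy)):
        return "first" if gz > 0 else "second"
    return "third"
```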
  • FIG. 10 is a third embodiment of a pointer location information input device that can be separated from a text input device.
  • the pointer location information input apparatus 1000 may have a housing 1001 surrounding at least two sides of the text input apparatus so that the text input apparatus 800 may be mounted.
  • the pointer location information input device 1000 includes pointer location information input units 105a and 105b, a mode switching unit 106, a controller 1001, a housing 1001, and an optical filter 504 (1010).
  • a pointing location information input area parallel to the top of the text input device 800 input area through the optical filter 504 is provided.
  • the optical filter is positioned higher than the text input device 800 so that the light source can pass therethrough.
  • at least one surface 1002 of the housing 1001 surrounding the text input device may be treated, according to the type of the pointer location information input device, so that the light source emitted from the pointer location information input device 105 is reflected, transmitted, or absorbed; the respective materials may be a mirror, transparent glass or plastic, or a black light absorber.
  • a pointing position information input apparatus detachable from a text input apparatus may have the following components.
  • the pointer location information input unit receives information related to the pointer location from the user.
  • the pointer location information input unit receives the user's pointing location information input through the pointing location information input area configured by the pointer location information input device.
  • the pointer execution command receiving unit receives a signal of a pointer execution command unit that receives a user's command to perform at least one function where the pointer is located.
  • the pointer execution command unit may be configured as a button or a touch switch included in the pointer location information input unit, or may be configured as a control unit that switches to a pointer execution command when a predetermined input among user inputs received from the text input device is received.
  • a mode switching command receiver for receiving a signal of a mode switching unit for switching to the pointer position information input mode.
  • the mode switching unit may be configured in a pointing location information input device, a text input device, or may be configured as a separate device.
  • a power supply unit is configured to transfer power to the pointer location information input unit, the pointer execution command receiver, and the mode switching command receiver.
  • the power supply unit may have its own battery or may be a contact supplied with power from the outside.
  • a pointer location information transmitter for transmitting information related to the pointer location input to the pointer location information input unit to a digital device connected to the human interface device by wire or wirelessly.
  • the pointer location information input area of the pointer location information input unit may be set at a predetermined height and angle so that it is parallel to the upper end of at least a portion of the text input area of the text input device including a plurality of physical buttons.
  • when the mode switching unit is provided separately from the text input buttons of the text input device, it may be fixed to at least one surface of the text input device and may recognize that a user input has been received in order to switch to the pointer position information input mode.
  • the signal to the mode switching command receiver may be transmitted by wire or wirelessly.
  • the pointer location information input unit includes an optical emitter and a camera, and the optical signal received by the camera may be designed to pass through an optical filter through a light tunnel.
  • the apparatus may have a position state recognition unit for recognizing position state information on the gravity direction of the human interface, position state information on the floor on which the human interface is placed, or position state information of the human interface entered through a user input.
  • the position state consists of at least two types; in the first position state the apparatus switches to the first pointer position information input mode, and in the second position state it switches to the second pointer position information input mode.
  • the pointer location information input area in the first location state refers to a position parallel to at least a portion of the text input area of the text input device.
  • the pointer position information input area in the second position state means that the pointer position information input area is parallel to the bottom surface on which the pointing position information input device is placed.
  • in the first pointer position information input mode the upper part of the pointing position information input device is located toward the floor; when switched to the second pointer position information input mode, the ordinate of the pointing position input is calculated in the same way, but the direction in which the abscissa increases or decreases is reversed.
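The coordinate handling described above (same ordinate, reversed abscissa direction) can be sketched as a mirror of the x axis. The function name, the mode labels, and the `width` parameter used to mirror x are assumptions for this illustration.

```python
def map_pointer(x, y, mode, width):
    """Map a raw sensor coordinate to pointer coordinates per input mode.

    Per the description, switching to the second mode keeps the ordinate
    (y) calculation but reverses the direction in which the abscissa (x)
    increases. `width` is an assumed extent of the input area.
    """
    if mode == "second":
        return width - x, y   # mirror x; y is unchanged
    return x, y
```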
  • the pointer execution command unit may use a space bar among text input buttons of the text input device, and another button adjacent to the space bar may be used as the pointer execution command unit.
  • the power supply may receive power from the text input device.
  • the predefined height and angle of the pointer position input area in the first position state may be adjustable by the user within a predefined range by means of an adjuster.
  • the adjuster may adjust the angle by adjusting a bonding angle of at least one surface of the housing of the pointing position information input device with an adjacent surface.
  • the adjuster may adjust the height by adjusting the height of at least one surface of the housing of the pointing position information input device.
  • the controller may adjust the predefined height and angle by adjusting the height and angle of the optical module of the pointing location information input unit of the pointing location information input device.
  • FIG. 11 is a front view of a human interface device to which a bottom-positioned pointer location information input device is applied.
  • the human interface device may include a text input unit including a plurality of physical buttons; a pointer location information input unit for receiving information regarding a pointer position from a user; a pointer execution command receiving unit for receiving a signal of a pointer execution command unit that receives a user's command to perform at least one function at the location of the pointer; a mode switching command receiving unit for receiving a signal of a mode switching unit for switching to the pointer position information input mode; a power supply unit for supplying power to the pointer position information input unit, the pointer execution command receiving unit, and the mode switching command receiving unit; and a pointer position information transmitter for transmitting information related to the pointer position input to the pointer position information input unit to a digital device connected to the human interface device by wire or wirelessly. The pointer position information input area of the pointer position information input unit is positioned parallel to the top of at least a portion of the text input area of the text input unit, and the pointer location information input unit is composed of at least two sensor modules, located on the lower left side 1102 and the lower right side 1103 of the human interface device.
  • the pointer location information input unit includes an optical emitter and a camera, and the optical signal received by the camera passes through a light tunnel.
  • the mode switching unit may be provided separately from a text input button of the text input device.
  • the mode switching unit is located on the left side of the text input area of the text input device, recognizes that a user input has been received, and transmits the switch to the pointer position information input mode to the mode switching command receiver.
  • the physical button is characterized by consisting of a button upper part pressed by the user's finger and an elastic body that applies a physical force to return the upper part of the button when the user's finger pressure is removed.
  • the human interface device may include a reflector or an absorber 1101 for the light source generated by the pointer location information input unit on the left, right, and top sides of the text input area, but not on at least part of the bottom side.
  • a reflector is used when the light source generated by the pointer position information input unit is infrared, and an absorber is used when an infrared laser is used.
  • At least two sensor modules 1102 and 1103 of the pointer location information input unit may be positioned to include at least a portion of the left outer area and the right outer area of the text input area.
  • the left sensor includes at least a part of the left side of the control button of a standard Windows keyboard, and the housing is lowered toward the corner where there is no key button.
  • this sensor module arrangement enables a minimum-volume configuration of the complex human interface device, maximizing its aesthetic effect.
  • positioning the sensor near a vertex of the text input area 1110 and directing it along the diagonal of the text input area allows a pointer location information input area of larger area to be configured than directing the sensor along the horizontal and vertical directions of the text input area.
  • the sensors may instead be positioned 1120 below the space bar of a standard keyboard, narrower than the width of the keyboard.
  • the width between the sensors determines the width of the pointer location information input area; this reduces the area of the pointer location information input area but obtains a relatively higher pointer location information detection resolution than the corner location 1110.
  • this sensor placement eliminates the reflector or absorber for the light source and lowers the height of the housing, so that the thumb is not caught by the housing when pressing the space bar of a standard keyboard.
  • the pointer execution command units 1104 and 1105 may be located at positions separate from the text input area and the mode switching unit of the text input device.
  • FIG. 12 is an embodiment of a pointer execution command unit integrated mode switching unit.
  • the pointer execution command unit receives the user's input when pressure is applied to the mode switching unit while the user inputs the pointer position information with the right hand, the other hand resting on the mode switching unit.
  • the user can share the position of the mode switch unit for the mode switch and the pointer execution command unit for the pointer execution command.
  • the mode switching unit is configured as a touch switch that can accept a touch input of the user's hand, and the pointer execution command unit is configured as a switch that responds to pressure, such as a tact switch, located below the mode switching unit.
  • the first pointer execution command unit 1104 and the second pointer execution command unit 1105 are connected to each other by a conductive material capable of recognizing a touch from the user's hand, or are commonly connected to the mode switching unit 1201, so that the mode is switched in the same manner whether the user touches the first or the second pointer execution command unit.
  • the mode switch operates in the pointer location information input mode while the touch is being made, and in the text input mode when the touch is released.
  • mode switching by touch may also be configured with a sensor that can sense the position of the user's finger, or with another switch that operates at a pressure lower than the pressure for the pointer execution command.
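The two-stage behavior above (touch switches the mode, a harder press fires the command) can be sketched as a small state holder. The class name, the mode labels, and the returned command string are assumptions for illustration.

```python
class TouchPressButton:
    """Two-stage button: a touch layer over a pressure (tact) switch.

    Touching either pointer-execution button switches to pointer mode;
    releasing the touch returns to text mode; pressing fires the command.
    """

    def __init__(self):
        self.mode = "text"

    def on_touch(self, touched):
        # The touch layer doubles as the mode switching unit.
        self.mode = "pointer" if touched else "text"

    def on_press(self):
        # A full press implies touch, so pointer mode is active too.
        self.mode = "pointer"
        return "execute"   # e.g. a left or right click
```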
  • the mode switching and pointer execution command unit is located in an area distinct from the text input area.
  • for example, they are located in the outer left area of the text input area; when the mode switching unit and the pointer execution command unit are located outside the text input region, the possibility of confusion with the pointer position information input is eliminated and the pointer position information input region can be expanded.
  • the complex human interface device of the present invention may include a second mode switch 1202.
  • the second mode switching unit is a switch that operates like a toggle switch, switching between the text input mode and the pointer position information input mode each time an input from the user is received.
  • this allows the pointer position information to be input using only the right hand.
  • when the composite human interface device of the present invention is operating in the pointer position information input mode by the second mode switching unit, the pointer position information input mode is maintained even if a mode switching command by touch is received and then released; the switch to the text input mode is made by a text input.
  • when the text input mode is entered by a text input, at least the first text input is ignored, and the switch to the text input mode occurs when at least two text inputs are received.
  • alternatively, the composite human interface device transmits the text input, including the at least first text that was ignored, to the digital device, and then transmits newly input text information to the digital device.
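The keystroke-buffering variant above can be sketched as follows: the first keystroke seen in pointer mode is held back, and a second keystroke confirms typing intent, switches the mode, and releases the buffered key together with the new one. The class and method names, and the two-keystroke threshold, follow the description but are otherwise illustrative.

```python
class TextModeSwitch:
    """Switch from pointer to text mode on typing, without losing keys."""

    def __init__(self):
        self.mode = "pointer"
        self.pending = []   # keys held back while intent is unconfirmed

    def on_key(self, key):
        sent = []
        if self.mode == "pointer":
            self.pending.append(key)
            if len(self.pending) >= 2:   # second keystroke confirms intent
                self.mode = "text"
                sent, self.pending = self.pending, []
        else:
            sent = [key]
        return sent   # keys to transmit to the connected digital device
```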
  • FIG. 13 illustrates an embodiment utilizing a cover of the composite human interface device.
  • the complex human interface device requires at least one to two millimeters of space above the text input plane composed of physical buttons in order for the pointer position input device to form a pointer position input area, and a border for positioning an absorber that absorbs, or a reflector that reflects, the optical signal may be required on at least three edges surrounding the text input area.
  • as a result, a difference of at least 1 to 2 millimeters occurs between the height of the edges on at least three sides and the height of the text input area plane, forming a substantially rectangular parallelepiped space.
  • the complex human interface device of the present invention may further include a multipurpose cover 1322 that fits in the substantially rectangular parallelepiped space and protects the text input area from external impact.
  • the multipurpose cover 1322 is separable from the composite human interface device 1320 and may be attached and detached by a magnet or a physical structure.
  • the multipurpose cover 1322 may have a structure that folds over several times.
  • in a multiple-folding structure, the width of at least one of the divided areas of the multipurpose cover may be smaller than the width of the other areas (1341).
  • a portion of the folded surface that comes into contact with the ground may include an inclined surface 1342 so that the area of contact with the ground may be widened.
  • the multipurpose cover 1322 allows the user to adjust the composite human interface device to the desired tilt.
  • the multi-purpose cover may include a rechargeable battery 1323 in the cover.
  • a power electrode of the rechargeable battery 1323 may be located at a portion (1331) of the underside of the multipurpose cover that covers the protruding edge region rather than the portion covering the text input area; when the cover covers the text input device (1310), it is connected to an electrode provided on the top of the composite human interface device.
  • the cover when the multi-purpose cover is folded and mounted on the lower end to adjust the tilt of the composite human interface device, the cover may be connected to an electrode provided on the bottom of the composite human interface device.
  • the composite human interface device may be connected to an external power source to supply power to the rechargeable battery 1323 inside the multipurpose cover or to separately charge the multipurpose cover.
  • the bottom surface of the composite human interface device may have a magnet, a physical coupling device, or the like so that the cover can be docked at an accurate position, matching the area 1332 that covers the protruding edge area where the electrode 1331 is located and the text input area having a height difference between them, and may have a groove into which the cover is inserted.
  • the edge of the multipurpose cover may be coated with a material capable of removing fine dust, oil, moisture, and the like, so that foreign matter on the absorber plate, reflector plate, optical emitter, or the front of the camera is removed whenever the cover is attached to the complex human interface device.
  • when the multi-purpose cover covers the text input area, the composite human interface device can detect this and control its power supply.
  • when the multi-purpose cover is mounted on the bottom of the composite human interface device, the device may be turned on.
  • when the cover covers the text input area, the power may be turned off or the device may enter standby mode.
  • when the multi-purpose cover covers the text input area 1310, the composite human interface device 1321 has a substantially rectangular parallelepiped shape with no tilt when placed on the ground, which maximizes the aesthetic effect and increases portability.
  • when the multi-purpose cover is folded and mounted under the composite human interface device, the device can be given a tilt like a normal keyboard.
  • the multi-purpose cover may be configured to be detachable, or may be designed so that it is folded over a hinge from the top to the bottom of the composite human interface device.
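As a rough illustration, the cover-position power behavior described in the bullets above can be modeled as simple state logic. The names `CoverState` and `power_state` and the returned strings are illustrative assumptions, not part of the disclosed device:

```python
from enum import Enum

class CoverState(Enum):
    COVERING_TEXT_AREA = 1   # cover lies over the text input area
    MOUNTED_ON_BOTTOM = 2    # cover folded and mounted under the device as a tilt stand
    DETACHED = 3             # cover removed entirely

def power_state(cover: CoverState) -> str:
    """Return the device power state implied by the cover position."""
    if cover is CoverState.COVERING_TEXT_AREA:
        # Covering the text input area turns the device off or puts it in standby.
        return "off_or_standby"
    if cover is CoverState.MOUNTED_ON_BOTTOM:
        # Mounting the cover underneath powers the device on.
        return "on"
    return "unchanged"
```

A detached cover leaves the power state unchanged in this sketch; the source text does not specify that case.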
  • FIG. 14 is an embodiment of a human interface device applied to a portable notebook.
  • the human interface device 1420 of the present invention can be utilized as an input device of a portable notebook.
  • a display unit composed of a display panel 1411 and a frame 1410 supporting the panel may be bound to the human interface device of the present invention by the hinge 1427, so that the display unit can be opened and closed over the device.
  • the display unit is inserted inside the wall formed by the reflector or absorber plate 1424 of the human interface device of the present invention, which makes it possible to minimize the thickness of the portable notebook using the human interface device of the present invention.
  • the display unit of the portable notebook should be designed to have a width smaller than the width of the human interface device 1420 of the present invention by at least twice the thickness of the reflector or absorber plates 1423 and 1424.
  • the display unit has a feature 1413 that allows both edges to be rounded or diagonally cut.
  • the pointer location information input devices 1421 and 1422 may be positioned toward the outer edge of the rounded or diagonally cut portion.
  • the mode switching unit and the pointer execution command unit may be located outside the side of the human interface device.
  • the mode switching unit and the pointer execution command unit are preferably formed integrally as shown in FIG. 12, but may also be provided separately.
  • the pointer location information input area used by the pointer location information input devices 1421 and 1422 should be set to exclude the area where the mode switching unit or the pointer execution command unit is located, so that operating those units does not cause an erroneous pointer location input.
  • the optical signal generated from the pointer position information input device may be adjusted to prevent the optical signal from reaching the mode switching unit or the pointer execution command unit.
  • the optical signal reception angle of the camera may be adjusted, or reception may be limited to the corresponding direction, so that the camera does not receive an optical signal produced by reflection, interference, or blocking between the user's finger and the mode switching unit or the pointer execution command unit; alternatively, such an optical signal can simply be ignored.
  • the portable notebook is designed so that at least three of its sides have an absorbing plate or a reflecting plate for absorbing or reflecting the light source generated by the pointer position information input device.
  • two of these surfaces are located on the sides of the human interface device (1423, 1424), and on the surface where the display unit contacts the human interface device, the reflector or absorber is positioned in a fixed region 1412 so that, when the display unit is open within a predetermined angle, the light source generated by the pointer position information input devices 1421 and 1422 is sufficiently reflected or absorbed.
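The exclusion of the mode switching unit and pointer execution command unit from the pointer location information input area amounts to a coordinate filter. The rectangle representation and function names below are assumptions chosen for illustration:

```python
# Each region is an axis-aligned rectangle: (x_min, y_min, x_max, y_max).
Rect = tuple[float, float, float, float]

def in_rect(x: float, y: float, r: Rect) -> bool:
    x0, y0, x1, y1 = r
    return x0 <= x <= x1 and y0 <= y <= y1

def accept_pointer_input(x: float, y: float,
                         input_area: Rect,
                         excluded: list[Rect]) -> bool:
    """Accept a pointer coordinate only if it lies inside the pointer
    location information input area and outside every excluded region
    (mode switching unit, pointer execution command unit)."""
    if not in_rect(x, y, input_area):
        return False
    return not any(in_rect(x, y, r) for r in excluded)
```

Filtering in the coordinate domain is one of several options; the text also describes optical alternatives (adjusting the emitter or the camera's reception angle).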
  • FIG. 15 shows an embodiment in which a lower-position type pointer position information input device and a lower-position type pointer execution command unit are applied.
  • the human interface device may include: a text input unit 1501 including a plurality of physical buttons; a pointer location information input unit for receiving information regarding the pointer position from a user; a pointer execution command receiving unit for receiving signals from the pointer execution command units 1505 and 1506, which accept a user's command to perform at least one function at the location of the pointer; a mode switching command receiving unit for receiving the signal of a mode switching unit for switching to the pointer position information input mode; and a pointer location information transmitter for transmitting pointer position information input to the pointer location information input unit to a digital device connected to the human interface device in a wired or wireless manner.
  • the pointer location information input area of the pointer location information input unit overlaps at least a portion of the text input area of the text input unit including the plurality of physical buttons.
  • an optical signal reflector or absorber 1503a, 1503b, and 1503c is positioned on first to third surfaces 1502a, 1502b, and 1502c surrounding the text input unit, these surfaces being formed higher than the height of the text input unit; the fourth surface 1502d surrounding the text input unit is formed lower than the height of the first to third surfaces, and the fourth surface may include the pointer execution command units 1505 and 1506.
  • the first surface and the third surface may be located at the left side and the right side of the text input unit, respectively, and the second surface may be located above the text input unit, and the fourth surface may be positioned below the text input unit.
  • the pointer location information input unit may include at least two sensor modules, and each of the two sensor modules may be located at a lower left corner and a lower right corner of the text input unit, respectively.
  • the mode switching unit may be operated by a first input of a user input to a first button, and the pointer execution command receiving unit may be operated by a second input of a user input to the first button.
  • the first button may be made of a material capable of sensing an electrical signal generated by a finger touch, and the first input may be generated by recognizing the electrical signal.
  • the pointer position information input mode may be operated while the first input is maintained, and when the first input is released, the pointer position information input mode may be released; the second input may be generated by physical pressure.
  • the mode switching unit may be operated by a second input of a user input to a second button.
  • the mode switching unit toggles the pointer location information input mode: when the mode is released, operating the unit activates it, and when the mode is activated, operating the unit releases it.
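The dual-input first button described above behaves like a small state machine: a capacitive touch (first input) holds the pointer location information input mode, and a physical press (second input) issues an execution command. A minimal sketch, with all class and method names assumed:

```python
class FirstButton:
    """Sketch of the dual-input first button: the touch signal holds the
    pointer location information input mode, and a physical press issues
    a pointer execution command (e.g. a click at the pointer position)."""

    def __init__(self):
        self.pointer_mode = False  # pointer location information input mode
        self.executed = 0          # count of execution commands issued

    def touch_down(self):          # first input: electrical (touch) signal
        self.pointer_mode = True   # mode active while the touch is maintained

    def touch_up(self):            # first input released
        self.pointer_mode = False  # mode released immediately

    def press(self):               # second input: physical pressure
        if self.pointer_mode:
            self.executed += 1     # execute a function at the pointer location
```

In this sketch a press outside pointer mode does nothing; the source leaves that case to the text input behavior.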
  • the light source and the image receiving device generated by the pointer location information input devices 1421 and 1422 are preferably directed toward the center of the keyboard.
  • the absorber plates or reflector plates 1423 and 1424 located on the sides of the human interface device may be reflector plates instead of absorber plates, and the light sources and image receiving devices of the pointer position information input devices 1421 and 1422 may be installed to face the reflecting plates 1423 and 1424.
  • the pointer location information input device 1421 may receive the pointer location information reflected by the reflection plate 1423.
  • the pointer position information input device 1422 may receive the pointer position information reflected by the reflector plate 1424.
  • the angle of the reflecting plates may be set to open at a predetermined angle of 1 to 15 degrees; that is, the lower ends of the reflector plates 1423 and 1424 may be spaced more widely apart than the upper ends.
  • this arrangement has the effect of virtually moving the installation position of the pointer location information input devices outward beyond the left and right sides of the actual composite human interface device, and of moving it upward by a certain distance so that it is not disturbed by the pointer execution command input device located at the bottom of the keyboard.
  • the complex human interface device may include reflection units that reflect the light source generated from the pointer location information input units, and at least two pointer location information input units may be installed at the left and right sides of the human interface device, respectively; the pointer position information input unit on the left side is installed toward the reflector on the left side to receive a light source input through the left reflector, and the pointer position information input unit on the right side is installed toward the reflector on the right side to receive a light source input through the right reflector.
  • the reflection parts on the left and right sides are not parallel to each other, and may be provided to be open toward a portion where the pointer location information input parts on the left and right sides are located.
  • the pointer location information input area may be divided into a first area and a second area. That is, for example, the pointer location information input signal input from the right hand may be received from the first area, and the pointer location information input signal input from the left hand may be received from the second area.
  • in multi-touch control, a control command is determined according to the subsequent movement pattern of a plurality of touch input signals after those signals are received.
  • the first pointer location information input signal received from the first pointer location information input area may be used to determine the number of touch inputs.
  • the second pointer location information input signal received from the second pointer location information input area may be used to receive a movement signal of a touch input.
  • this signal can replace an up-and-down sliding touch input made with two touch signals in a general multi-touch function.
  • one example is performing an up-and-down scroll input in an Internet browser.
  • if there are two touch inputs from the left hand, the same function as tapping with two fingers can be performed.
  • similarly, the signal can replace a left-and-right sliding touch input made with three touch signals in a general multi-touch function, such as flicking between full-screen apps or dragging with three fingers.
  • the function may be performed in the same manner as tapping with three fingers. The roles of the left hand and the right hand may be interchanged, and the number of input signals used to determine the touch count is not limited to the two and three described above, but may include one, four, or five touches.
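The split-area multi-touch scheme above (touch count taken from one area, movement taken from the other) can be sketched as a lookup table. The gesture names and movement encodings are illustrative assumptions, not terms from the disclosure:

```python
def interpret_gesture(touch_count: int, movement: str) -> str:
    """Map (number of touches in the counting area, movement pattern in
    the motion area) to an emulated multi-touch gesture, following the
    examples above: two touches plus a vertical slide act like a
    two-finger scroll, three touches plus a horizontal slide act like a
    three-finger swipe, and taps act like multi-finger taps."""
    table = {
        (2, "vertical"): "two_finger_scroll",     # e.g. browser up/down scroll
        (2, "tap"): "two_finger_tap",
        (3, "horizontal"): "three_finger_swipe",  # e.g. flick between full-screen apps
        (3, "tap"): "three_finger_tap",
    }
    return table.get((touch_count, movement), "unmapped")
```

Combinations outside the table return "unmapped"; the text notes that one, four, or five touches could be handled the same way by extending the table.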
  • the information related to the pointer position received by the pointer position information input unit may consist of information related to a first pointer position, used to move the pointer, and information related to a second pointer position, used to switch the human interface device from the text input mode to the pointer position information input mode; when the latter is received, the device may switch to the pointer position information input mode.
  • FIGS. 1 to 15 illustrate examples of the pointing-device-integrated text input device described in the present invention; the types and technologies of the text input device and the pointer location information input device may be changed or replaced by those skilled in the art without departing from the basic purpose, and such variants are included.
  • the present invention can be applied in whole or in part to a composite human interface device having a text input device and a pointer location information input device.
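A minimal dispatcher illustrating the two kinds of pointer position information described above (the first kind moves the pointer, the second kind switches the device into pointer position information input mode); the event encoding and names are assumptions for illustration:

```python
def handle_pointer_info(kind: str, device: dict) -> str:
    """Dispatch received pointer position information. 'first' kind
    moves the pointer (only meaningful in pointer mode); 'second' kind
    switches the device from text input mode to pointer position
    information input mode."""
    if kind == "second" and device["mode"] == "text":
        device["mode"] = "pointer"
        return "switched_to_pointer_mode"
    if kind == "first" and device["mode"] == "pointer":
        return "pointer_moved"
    return "ignored"
```

The sketch leaves the return path to text input mode to the mode switching unit described earlier.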

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention relates to a human interface apparatus characterized in that it comprises the steps of: receiving input of a first gesture from a user through a text input region; generating first text-related information according to the first gesture; transmitting the text information to a digital device connected, with or without wires, to the human interface apparatus; and receiving input of a second gesture from a user.
PCT/KR2015/003913 2011-11-15 2015-04-20 Multifunctional human interface apparatus WO2015160231A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201580001841.7A CN105659193B (zh) A digital device including a human-computer interaction device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR20110119218 2011-11-15
KR10-2014-0047073 2014-04-19
KR1020140047073A KR20140075651A (ko) Composite human interface device having a display unit

Publications (1)

Publication Number Publication Date
WO2015160231A1 true WO2015160231A1 (fr) 2015-10-22

Family

ID=54327010

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/003913 WO2015160231A1 (fr) 2011-11-15 2015-04-20 Appareil d'interface humaine multifonction

Country Status (2)

Country Link
KR (1) KR20140075651A (fr)
WO (1) WO2015160231A1 (fr)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101757539B1 (ko) * 2015-08-12 2017-07-12 Korea Electronics Technology Institute Multi-input device
KR102015313B1 (ko) * 2017-09-01 2019-08-28 Innopresso, Inc. Electronic device having composite human interface and control method therefor
KR102015309B1 (ko) * 2017-09-01 2019-08-28 Innopresso, Inc. Electronic device having composite human interface and control method therefor

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20030019813A * 2001-08-31 2003-03-07 Korea SMK Corporation Key switch
US20100149099A1 * 2008-12-12 2010-06-17 John Greer Elias Motion sensitive mechanical keyboard
KR20110023654A * 2009-08-31 2011-03-08 Ewha Womans University Industry-University Cooperation Foundation Finger mouse
KR101042285B1 * 2010-10-28 2011-06-17 Widget Co., Ltd. Keyboard in which a touchpad having mouse and multi-touch functions is integrated into the keycaps


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10635192B2 (en) 2016-05-01 2020-04-28 Innopresso, Inc. Electronic device having multi-functional human interface
US11009990B2 (en) 2016-05-01 2021-05-18 Innopresso, Inc. Electronic device having multi-functional human interface
US11068079B2 (en) 2016-05-01 2021-07-20 Innopresso, Inc. Electronic device having multi-functional human interface
US11586299B2 (en) 2016-05-01 2023-02-21 Mokibo, Inc. Electronic device having multi-functional human interface
US11747916B2 (en) 2016-05-01 2023-09-05 Mokibo, Inc. Electronic device having multi-functional human interface
US10635187B2 (en) 2016-06-23 2020-04-28 Innopresso, Inc. Electronic device having multi-functional human interface
US10921902B2 (en) 2016-06-23 2021-02-16 Innopresso, Inc. Electronic device having multi-functional human interface
US10921901B2 (en) 2016-06-23 2021-02-16 Innopresso, Inc. Electronic device having multi-functional human interface
US10976832B2 (en) 2016-06-23 2021-04-13 Innopresso, Inc. Electronic device having multi-functional human interface
US11526213B2 (en) 2016-06-23 2022-12-13 Mokibo, Inc. Electronic device having multi-functional human interface

Also Published As

Publication number Publication date
KR20140075651A (ko) 2014-06-19

Similar Documents

Publication Publication Date Title
WO2017222346A1 (fr) Electronic device with complex human interface
WO2015160231A1 (fr) Multifunctional human interface apparatus
WO2018080006A1 (fr) Electronic apparatus equipped with complex human interface and control method therefor
WO2015076463A1 (fr) Mobile terminal and control method therefor
WO2015016628A1 (fr) Method and apparatus for displaying applications
WO2018034402A1 (fr) Mobile terminal and control method therefor
WO2015030452A1 (fr) Mobile terminal comprising a stylus and a touch screen
WO2015142023A1 (fr) Method and wearable device for providing a virtual input interface
WO2016125962A1 (fr) Mobile terminal comprising a stylus and a touch screen, and control method therefor
WO2015020283A1 (fr) Mobile terminal and control method therefor
WO2017119531A1 (fr) Mobile terminal and control method therefor
WO2021101189A1 (fr) Method and device for providing a user interface in an electronic device having a foldable screen
WO2015119484A1 (fr) User terminal device and display method therefor
WO2015020284A1 (fr) Mobile terminal and associated control method
WO2021096196A1 (fr) Foldable electronic device
WO2014030812A1 (fr) Flexible apparatus and associated control method
WO2016032045A1 (fr) Mobile terminal and control method therefor
WO2017047854A1 (fr) Mobile terminal and control method therefor
WO2017039094A1 (fr) Mobile terminal and control method therefor
WO2013077667A1 (fr) Electronic device
WO2017183803A1 (fr) Method, device and non-transitory computer-readable medium for controlling a touch interface device capable of interacting with a user
WO2015093667A1 (fr) Electronic device and method for controlling the electronic device
WO2018034496A1 (fr) Stylus, touch sensing system, touch controller and touch sensing method
WO2015093666A1 (fr) Electronic device and method for controlling the electronic device
WO2015167128A1 (fr) Mobile terminal and control method therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15779707

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 03.05.2017)

122 Ep: pct application non-entry in european phase

Ref document number: 15779707

Country of ref document: EP

Kind code of ref document: A1