WO2013086718A1 - Input device and method - Google Patents

Input device and method

Info

Publication number: WO2013086718A1
Authority: WO
Grant status: Application
Prior art keywords: input, device, information, direction, imaging
Application number: PCT/CN2011/084051
Other languages: French (fr), Chinese (zh)
Inventor: 王德元 (Wang Deyuan)
Original Assignee: Wang Deyuan
Priority date: 2011-12-15 (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date: 2011-12-15
Publication date: 2013-06-20

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; accessories therefor
    • G06F3/0346: Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Abstract

An input device and method are provided. The input device comprises an imaging module (7) configured to image a scene or an object. The input device is configured to be movable, so that the image obtained by the imaging module (7) changes, the change indicating direction information of the movement. A direction information determination module (9) may further be provided, configured to determine the direction information according to the change. The direction information determination module (9) is disposed in the input device, or in a computing device serving as the host of the input device.

Description

Input device and method

FIELD

Embodiments of the present disclosure relate generally to the field of input devices, and more particularly to an input apparatus and a method for inputting direction information.

BACKGROUND

Computer systems today generally run windowed operating systems ("window systems"). When operating in a window system, a "mouse" is generally used to control a cursor on a display. Such a mouse is usually a sliding device that can slide on a plane. As the sliding device slides in a direction on the plane, direction information is detected and transmitted to the computer system, which controls the cursor on the display to move accordingly.

To date, two methods of detecting direction have been most widely used. One is mechanical detection. Specifically, a ball is mounted at the bottom of the sliding device. As the device slides, it drives the ball to roll, and the rolling ball in turn drives conducting means inside the device. Four conducting means are provided, corresponding to the different sliding directions of the device, and each obtains different rolling information from the ball. The rolling information from these four conducting means is converted into direction information and transferred to the computer system to control the movement of the cursor. The other is optical detection. Specifically, an illuminating light source and a receiving means are mounted at the bottom of the sliding device. As the device slides, part of the light emitted by the source onto the plane on which the device slides is reflected, and part of the reflected light is received by the receiving means. The reflected light carries movement information; processing it yields the direction information of the slide, which is transmitted to the computer system to control the movement of the cursor. Optical devices are less prone to failure than mechanical ones.

However, both methods share a common drawback: there must be a suitable plane on which the sliding device can slide. The environments in which these two methods can be used are therefore restricted.

SUMMARY

In view of the above problems, the present disclosure provides an input device and an input method by which direction information can be input more conveniently, for example in order to control the movement of a target (e.g., a cursor on a display).

Other aspects of the present disclosure will be set forth in part in the description that follows, will in part be apparent from that description, or may be learned by practice of the disclosure.

According to one aspect of the present disclosure, there is provided an input device, comprising: an imaging module configured to image a scene or object, wherein the input device is configured to be movable, so that the image formed by the imaging module changes, the change indicating information about the direction of movement.

According to one embodiment, the input device may further comprise a direction information determination module configured to determine the direction information according to the change. Alternatively, according to another embodiment, the input device may be provided as an input device of a computing device, and the computing device may comprise the direction information determination module configured to determine the direction information according to the change.

According to one embodiment, the input device may also be integrated with a camera, with the camera performing imaging by means of the imaging module.

According to one embodiment, the imaging module may include an array of imaging pixels for imaging the scene or object at high resolution. Alternatively, according to another embodiment, the imaging module may include several discrete imaging pixels for imaging the scene or object coarsely.

According to another aspect of the present disclosure, there is provided an input method, comprising: imaging a scene or object; and, when the image changes due to movement, determining information about the direction of the movement according to the change.

According to one embodiment, both the imaging and the determination of the direction information may be performed in the input device. Alternatively, according to another embodiment, the imaging may be performed in the input device, while the direction information may be determined by a computing device serving as the host of the input device.

According to an embodiment of the present disclosure, the direction information may be used to control the movement of a target. The target may be, for example, a cursor on a display, or a controlled object such as a robot or a remote-controlled toy. In the case where the target is a cursor, the scene or object being imaged may include the display.

According to an embodiment of the present disclosure, the imaging may be performed by one or more of visible light, infrared rays, ultraviolet rays, electromagnetic waves, sound waves and other rays.

BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the present disclosure are described below with reference to the accompanying drawings, in which the above and other features and advantages of the embodiments will become more apparent:

FIG. 1 shows a schematic view of the working principle of an input device according to an embodiment of the disclosure;

FIG. 2 shows a block diagram of an input device according to an embodiment of the disclosure; and

FIG. 3 shows a schematic view of an application of an input device according to an embodiment of the disclosure.

DETAILED DESCRIPTION

Hereinafter, embodiments of the present disclosure, which are illustrated in the accompanying drawings, will be described in detail. It should be understood that these descriptions are exemplary only and are not intended to limit the scope of the present disclosure.

According to the imaging principle, a subject (or scene) can be imaged through a lens or an aperture. Cameras and video cameras are based on this principle, by which a subject (or scene) can be imaged and recorded. During imaging, if the camera moves or its imaging direction deflects, the image of the subject (or scene) will change correspondingly. According to this principle, movement information, such as a change in direction and/or a change in size, can be obtained from the change in the image of the subject (or scene) and used as direction input information for computing equipment such as a computer.

FIG. 1 shows a schematic view of the working principle of an input device according to an embodiment of the disclosure. FIG. 1 shows three states 8a, 8b and 8c. State 8a is a state in which the subject (or scene) 3a, the lens (or aperture) 5a of the imaging apparatus 7a, and the imaging surface 6a are coaxial; state 8b is a state in which the pointing direction 4b of the imaging apparatus 7b departs to the right from the subject (or scene) 3b; and state 8c is a state in which the pointing direction 4c of the imaging apparatus 7c departs to the left from the subject (or scene) 3c. As shown in FIG. 1, in state 8a the image of the subject (or scene) 3a is projected at the center of the imaging surface 6a. In state 8b, the image of the subject (or scene) 3b is projected on the right side of the imaging surface 6b, shifted to the right with respect to the image formed in state 8a. In state 8c, the image of the subject (or scene) 3c is projected on the left side of the imaging surface 6c, shifted to the left with respect to the image formed in state 8a. That is, movement of the imaging apparatus 7b, 7c (or of its pointing direction 4b, 4c) causes the image of the subject (or scene) 3b, 3c to move on the imaging surface 6b, 6c.
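For intuition, the magnitude of such an image shift can be estimated with a simple pinhole-camera model. The model and formula below are an illustration added here; the patent itself gives no formula:

```latex
% Pinhole-camera approximation (illustrative assumption, not from the patent):
% deflecting the pointing direction by an angle \theta shifts the image on the
% imaging surface by approximately
\Delta x \approx f \tan\theta \approx f\,\theta \quad (\text{for small } \theta),
% where f is the distance from the lens (or aperture) to the imaging surface.
```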

Although FIG. 1 only illustrates the case in which the imaging apparatus 7b, 7c (or its pointing direction 4b, 4c) moves left or right, those skilled in the art will understand that when the imaging apparatus moves back and forth (in FIG. 1, up and down along the plane of the page), the image of the subject (or scene) on the imaging surface will also change (e.g., the image size will change).

That is, when the imaging apparatus (or its pointing direction) moves in any direction, the image of the subject (or scene) on the imaging surface changes correspondingly, for example in position or in size. Thus, information about the direction in which the imaging apparatus (or its pointing direction) has moved can be obtained from the changes between the images formed before and after the movement. This information can be input to a computing device (e.g., a computer), for example to control the movement of a target (e.g., a cursor on the display of the computer).
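The patent does not prescribe any particular algorithm for recovering this shift from the before-and-after images. As one hypothetical illustration, the shift between two consecutive grayscale frames could be estimated by phase correlation; the Python sketch below assumes NumPy and equal-sized frames, and the function name is illustrative:

```python
import numpy as np

def estimate_shift(prev: np.ndarray, curr: np.ndarray) -> tuple[int, int]:
    """Estimate the (dx, dy) displacement of curr relative to prev
    using FFT-based phase correlation on two 2-D grayscale frames."""
    f_prev = np.fft.fft2(prev)
    f_curr = np.fft.fft2(curr)
    # Normalized cross-power spectrum; the peak of its inverse FFT
    # sits at the translation between the two frames.
    cross = np.conj(f_prev) * f_curr
    cross /= np.abs(cross) + 1e-12
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrap-around peaks to negative displacements.
    h, w = prev.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dx), int(dy)
```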

FIG. 2 shows a block diagram of an input device according to an embodiment of the disclosure. As shown, the input device 2 according to this embodiment comprises an imaging module 7 and a direction information determination module 9.

The imaging module 7 may be, for example, the imaging apparatus 7a, 7b, 7c described with reference to FIG. 1, and may include an imaging optical system (e.g., the above-described lens or aperture 5a) and an imaging surface (e.g., the above-described imaging surfaces 6a, 6b, 6c). The imaging surface may include a photoelectric device, for example a CCD (Charge Coupled Device) or the like, for converting the optical signal captured by the optical system from the subject (or scene) 3 into an electrical signal. According to some embodiments of the present disclosure, the imaging surface may comprise an array of imaging pixels to achieve high-resolution imaging and obtain a clear image of the subject (or scene) 3. Alternatively, according to other embodiments of the present disclosure, the imaging surface may comprise several discrete pixels that image the subject (or scene) 3 only coarsely, as long as the direction information can still be determined from the image formed. For example, the imaging surface may include only a few photodiodes.
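To make the coarse-imaging variant concrete, even a handful of photodiodes can reveal which way the light distribution drifts between two readings. The sketch below is a hypothetical illustration using a 2x2 photodiode quad; the layout, function name and threshold are assumptions rather than details from the patent:

```python
def coarse_direction(prev_quad, curr_quad, thresh=0.05):
    """Infer coarse image motion from two readings of a 2x2 photodiode quad,
    each given as (top_left, top_right, bottom_left, bottom_right)
    normalized intensities. The threshold is illustrative."""
    def balance(quad):
        tl, tr, bl, br = quad
        horiz = (tr + br) - (tl + bl)  # right-minus-left intensity balance
        vert = (bl + br) - (tl + tr)   # bottom-minus-top intensity balance
        return horiz, vert

    h0, v0 = balance(prev_quad)
    h1, v1 = balance(curr_quad)
    moves = []
    if h1 - h0 > thresh:
        moves.append("image moved right")
    elif h0 - h1 > thresh:
        moves.append("image moved left")
    if v1 - v0 > thresh:
        moves.append("image moved down")
    elif v0 - v1 > thresh:
        moves.append("image moved up")
    return moves or ["no significant motion"]
```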

The imaging module 7 may image the subject (or scene) 3 with visible light. However, the present disclosure is not limited thereto. Alternatively, the imaging module 7 may use infrared rays, ultraviolet rays, electromagnetic waves, sound waves or other types of radiation for imaging. For example, the subject (or scene) 3 itself may emit visible light, infrared rays, ultraviolet rays, electromagnetic waves, sound waves or various other rays, or may reflect ambient visible light, infrared rays, ultraviolet rays, electromagnetic waves, sound waves or various other rays. Further, for example, the imaging module 7 may include a device that emits visible light, infrared rays, ultraviolet rays, electromagnetic waves, sound waves or various other rays, and may perform imaging using the rays reflected from the subject (or scene) 3.

Here, the subject (or scene) 3 may be any object or scene, and is not necessarily a particular object or scene. For example, the subject may be the display of the computing device. Further, the subject (or scene) and the target to be controlled may be the same object. For example, the subject may be the cursor on the display itself.

When the user wants to input direction information to a host computing device such as a computer, the user can move the input device 2, so that the imaging module 7 included in the input device 2 also moves. As described above with reference to FIG. 1, moving the imaging module 7 causes the image of the subject (or scene) 3 on the imaging surface to change, for example in position or in size.

The direction information determination module 9 is configured to determine information about the direction of movement according to the change that occurs in the image formed by the imaging module. The direction information determination module 9 may be dedicated hardware, firmware or software. For example, the direction information determination module 9 may be implemented as a microprocessor running a program, or as a field programmable gate array (FPGA). Specifically, from the variations in position and/or size between the images formed before and after the movement, the direction information determination module 9 can determine whether the input device 2 has moved forward, backward, left or right, and thereby obtain the corresponding direction information for input into the computing device.
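As a hypothetical sketch of this classification step (the patent states only that position and size variations indicate the movement direction; the thresholds and mapping below are assumptions), the shift from a routine such as estimate_shift above could be combined with an apparent-size ratio:

```python
def classify_motion(dx: float, dy: float, scale_ratio: float,
                    t_shift: float = 2.0, t_scale: float = 0.02) -> list[str]:
    """Map image-plane changes to device motion. dx and dy are the image
    shift in pixels (image y grows downward); scale_ratio is the ratio of
    the current to the previous apparent image size. Following FIG. 1, a
    rightward image shift is read as a rightward movement of the device,
    and a growing image as movement toward the subject."""
    motion = []
    if dx > t_shift:
        motion.append("right")
    elif dx < -t_shift:
        motion.append("left")
    if dy > t_shift:
        motion.append("down")
    elif dy < -t_shift:
        motion.append("up")
    if scale_ratio > 1.0 + t_scale:
        motion.append("forward")   # image grew: device moved toward the subject
    elif scale_ratio < 1.0 - t_scale:
        motion.append("backward")  # image shrank: device moved away
    return motion
```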

In the block diagram shown in FIG. 2, the direction information determination module 9 is included in the input device 2. However, it should be noted that the present disclosure is not limited to this structure. For example, the direction information determination module may instead be included in the computing device, so that the input device (in this case, an input device without the direction information determination module) outputs the images formed before and after the movement to the computing device, and the direction information determination module of the computing device processes the images to determine the direction information. In this case, the direction information determination module may be implemented in a processor of the computing device, such as a central processing unit (CPU).

The computing device may control the movement of the target in accordance with the direction information input by the input device 2. For example, the target may be a cursor displayed on the display of the computing device. In this case, the cursor is controlled to move up, down, left or right on the display according to the direction information. Alternatively, the target may be a controlled object, such as a robot or a remote-controlled toy. In this case, the direction of the action of the controlled object is controlled according to the direction information, for example the robot or remote-controlled toy is controlled to move in the corresponding direction.
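On the host side, the received direction information could then drive the cursor. A minimal sketch follows, with the gain and screen dimensions as illustrative assumptions:

```python
def update_cursor(x: float, y: float, dx: float, dy: float, gain: float = 1.5,
                  screen_w: int = 1920, screen_h: int = 1080) -> tuple[float, float]:
    """Move the cursor in proportion to the reported image shift,
    clamping the result to the screen bounds."""
    x = min(max(x + gain * dx, 0.0), screen_w - 1.0)
    y = min(max(y + gain * dy, 0.0), screen_h - 1.0)
    return x, y
```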

FIG. 3 shows a schematic view of an application of an input device according to an embodiment of the present disclosure. In FIG. 3, the input device 2 is configured to control the movement of a cursor on the display of a computing device. The input device 2 can use the display as its subject.

According to one embodiment of the present disclosure, since the input device uses an imaging module, the input device can be integrated with a camera in the same device. Specifically, the camera can use the imaging function of the imaging module to implement the camera function.

In addition to the above-described input device, according to one embodiment of the present disclosure, an input method is also provided. The input method may include: imaging an object or scene; and, when the image formed changes due to movement, determining information about the direction of the movement according to the change. Both the imaging and the determination of the direction information may be performed in the input device; alternatively, the imaging may be performed in the input device, while the direction information is determined by a computing device serving as the host of the input device.

The present disclosure has been described above with reference to embodiments thereof. However, these embodiments are merely for illustrative purposes and are not intended to limit the scope of the disclosure, which is defined by the appended claims and their equivalents. Without departing from that scope, those skilled in the art may make various modifications and substitutions, all of which are intended to fall within the scope of the disclosure.

Claims

1. An input device, comprising:
an imaging module configured to image a scene or object,
wherein the input device is configured to be movable, so that the image formed by the imaging module changes, the change indicating information about the direction of movement.
2. The input device according to claim 1, further comprising:
a direction information determination module configured to determine the direction information according to the change.
3. The input device according to claim 1, wherein the input device is provided as an input device of a computing device, and wherein said computing device comprises:
a direction information determination module configured to determine the direction information according to the change.
4. The input device according to claim 1, wherein said input device is further integrated with a camera, the camera performing imaging by means of the imaging module.
5. The input device according to claim 1, wherein the imaging module comprises an array of imaging pixels.
6. The input device according to claim 1, wherein the imaging module comprises a plurality of discrete imaging pixels.
7. An input method, comprising:
imaging a scene or object; and
when the image formed changes due to movement, determining information about the direction of the movement according to the change.
8. The method according to claim 7, wherein the imaging and the determination of the direction information are performed in an input device.
9. The method according to claim 7, wherein the imaging is performed in an input device, and the direction information is determined in a computing device serving as a host of the input device.
10. The input method according to claim 7 or the input device according to claim 1, wherein the direction information is used to control the movement of a cursor on a display.
11. The input device or input method according to claim 10, wherein the scene or object includes the display.
12. The input method according to claim 7 or the input device according to claim 1, wherein one or more of visible light, infrared rays, ultraviolet rays, electromagnetic waves, sound waves and other rays are used in the imaging.
13. The input method according to claim 7 or the input device according to claim 1, wherein the direction information is used to control the direction of an action of a controlled object.
14. The input device or input method according to claim 13, wherein said controlled object includes a robot or a remote-controlled toy.
PCT/CN2011/084051 2011-12-15 2011-12-15 Input device and method WO2013086718A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2011/084051 WO2013086718A1 (en) 2011-12-15 2011-12-15 Input device and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2011/084051 WO2013086718A1 (en) 2011-12-15 2011-12-15 Input device and method

Publications (1)

Publication Number Publication Date
WO2013086718A1 (en) 2013-06-20

Family

ID=48611823

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2011/084051 WO2013086718A1 (en) 2011-12-15 2011-12-15 Input device and method

Country Status (1)

Country Link
WO (1) WO2013086718A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1834878A (en) * 2005-03-18 2006-09-20 Agilent Technologies, Inc. Optical navigation system
CN1975646A (en) * 2005-12-02 2007-06-06 ATLab Inc. Optical navigation device and method of operating the same
US20080129704A1 (en) * 1995-06-29 2008-06-05 Pryor Timothy R Multipoint, virtual control, and force based touch screen applications
CN201311700Y (en) * 2008-09-02 2009-09-16 TCL Corporation Remote controller with a camera
WO2010082499A1 (en) * 2009-01-15 2010-07-22 Sharp Kabushiki Kaisha Optical pointing device and electronic equipment mounted with same


Similar Documents

Publication Publication Date Title
US7557935B2 (en) Optical coordinate input device comprising few elements
US6947032B2 (en) Touch system and method for determining pointer contacts on a touch surface
US20020085097A1 (en) Computer vision-based wireless pointing system
US20080259053A1 (en) Touch Screen System with Hover and Click Input Methods
US20080030458A1 (en) Inertial input apparatus and method with optical motion state detection
US20090009469A1 (en) Multi-Axis Motion-Based Remote Control
US20060158437A1 (en) Display device
US20150002734A1 (en) Electronic Device with Modulated Light Flash Operation for Rolling Shutter Image Sensor
US20120200495A1 (en) Autostereoscopic Rendering and Display Apparatus
US20090244097A1 (en) System and Method for Providing Augmented Reality
US20070273842A1 (en) Method And Apparatus For Inhibiting A Subject's Eyes From Being Exposed To Projected Light
US20100103099A1 (en) Pointing device using camera and outputting mark
US8723789B1 (en) Two-dimensional method and system enabling three-dimensional user interaction with a device
US20140240492A1 (en) Depth sensor using modulated light projector and image sensor with color and ir sensing
US8686943B1 (en) Two-dimensional method and system enabling three-dimensional user interaction with a device
US20110291988A1 (en) Method and system for recognition of user gesture interaction with passive surface video displays
US20090091553A1 (en) Detecting touch on a surface via a scanning laser
US20100053324A1 (en) Egomotion speed estimation on a mobile device
CN102799318A (en) Human-machine interaction method and system based on binocular stereoscopic vision
US8773512B1 (en) Portable remote control device enabling three-dimensional user interaction with at least one appliance
US20130147711A1 (en) Camera-based multi-touch interaction apparatus, system and method
US20110187820A1 (en) Depth camera compatibility
US20110181553A1 (en) Interactive Projection with Gesture Recognition
Berman et al. Sensors for gesture recognition systems
US20090296991A1 (en) Human interface electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11877388

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct app. not ent. europ. phase

Ref document number: 11877388

Country of ref document: EP

Kind code of ref document: A1