WO2013104315A1 - Method and system for mapping the motion trajectory of an emission light source to its application trajectory - Google Patents

Method and system for mapping the motion trajectory of an emission light source to its application trajectory

Info

Publication number
WO2013104315A1
WO2013104315A1 (PCT/CN2013/070287, CN2013070287W)
Authority
WO
WIPO (PCT)
Prior art keywords
light source
application
input
trajectory
mouse
Prior art date
Application number
PCT/CN2013/070287
Other languages
English (en)
French (fr)
Inventor
李东舸
王玮
Original Assignee
西安智意能电子科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 西安智意能电子科技有限公司 filed Critical 西安智意能电子科技有限公司
Priority to US14/371,421 priority Critical patent/US20150084853A1/en
Publication of WO2013104315A1 publication Critical patent/WO2013104315A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03542 Light pens for emitting or receiving light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/285 Analysis of motion using a sequence of stereo image pairs

Definitions

  • The present invention relates to the field of intelligent control technologies, and in particular to a technique for mapping the motion trajectory of an emission light source to its application trajectory.
  • Background Technique
  • Typically, a signal sent by an input device, such as an electromagnetic signal, a sound signal, or an optical signal, is detected by a detecting device to perform the corresponding input mapping and is displayed on the screen.
  • Application trajectories corresponding to the trajectory of the input device tend to be simple mappings, such as acceleration-based mapping by MEMS sensors or simple two-dimensional mapping by gravity sensors, and the user experience is poor.
  • A method of mapping the motion trajectory of an emission light source to its application trajectory, comprising the steps of:
  • the method further includes:
  • the detecting the input mode of the emission light source comprises:
  • the application mapping curve comprises a three-dimensional application mapping curve.
  • the amplification factor of the three-dimensional application mapping curve is adjusted based on the distance of the emission source.
  • the three-dimensional application mapping curve comprises a three-dimensional application mapping curve based on a three-dimensional rotational position of the emission source.
  • step b includes:
  • the application mapping curve is adjusted based on historical state information of the transmitted light source.
  • the method further comprises:
  • the method further comprises:
  • the step d includes:
  • the method further comprises:
  • the predetermined input operation correction stop condition includes at least one of the following: - the movement time of the emission light source reaches a predetermined correction delay time threshold;
  • the step b further includes:
  • Determining, based on historical motion characteristic information of the motion trajectory, predicted position information of the emission light source for smoothing the motion trajectory.
  • the input mode of the light source comprises a handwriting input mode.
  • the application mapping curve comprises a linear curve.
  • the method further comprises:
  • the method further includes:
  • the input mode of the light source includes a mouse input mode.
  • the method further comprises:
  • Determining control information transmitted by the emission light source according to imaging information of the emission light source, and obtaining a mouse operation corresponding to the control information by querying a predetermined control information table;
  • A system for mapping the motion trajectory of an emission light source to its application trajectory, wherein the system includes an emission light source, a camera for acquiring imaging information of the emission light source, a processing device, and an output device;
  • processing device is used to:
  • the output device is configured to output the application track to an external device.
  • The present invention determines a corresponding application mapping curve according to the input mode of the emission light source, and then obtains the application trajectory of the emission light source according to its motion trajectory, thereby adaptively matching the application mapping curve to the different input modes of the emission light source and obtaining the application trajectory, which improves the user experience.
  • FIG. 1 shows a system diagram of a system for mapping an application trajectory of a motion trajectory of an emission source in accordance with an aspect of the present invention
  • FIG. 2 is a schematic diagram showing a two-dimensional mouse application mapping curve according to the present invention
  • FIG. 3 is a schematic view showing three-dimensional rotational position information indicating an emission source according to the present invention
  • FIG. 4 is a flow chart showing a method for mapping a motion trajectory of a light source to its application trajectory according to another aspect of the present invention
  • Figure 5 is a flow chart showing a method for mapping the motion trajectory of an emission source to its applied trajectory in accordance with a preferred embodiment of the present invention
  • FIG. 6 shows a flow chart of a method for mapping a motion trajectory of an emission source to its application trajectory in accordance with another preferred embodiment of the present invention
  • FIG. 7 is a flow chart showing a method for mapping the motion trajectory of a light source to its application trajectory according to still another preferred embodiment of the present invention.
  • FIG. 8 is a flow chart showing a method for mapping a motion trajectory of an emission source to an application trajectory thereof according to still another preferred embodiment of the present invention.
  • Figure 9 illustrates a motion trajectory of a light source in accordance with a preferred embodiment of the present invention.
  • Figure 1 is a schematic illustration, in accordance with one aspect of the present invention, of a system for mapping the motion trajectory of an emission light source to its application trajectory.
  • the input detection system 100 includes an input device 110 and an application detection device 120, wherein the input device 110 and the application detection device 120 are respectively placed at both ends.
  • Input device 110 includes at least one emission source 111.
  • the application detecting device 120 includes at least one processing device 122 and at least one output device 123.
  • the application detecting device 120 further includes or externally connects at least one camera 121.
  • the camera 121 captures the transmitting light source 111 to obtain imaging information of the transmitting light source 111.
  • The output device 123 is also connected to the external device 130.
  • the camera 121 captures the emission light source 111 and acquires the imaging information of the emission light source 111.
  • the processing device 122 detects the input mode of the emission light source 111, determines an application mapping curve corresponding to the input mode, and acquires the emission according to the imaging information of the emission light source 111.
  • a motion trajectory of the light source 111, and an application trajectory corresponding to the motion trajectory is obtained by the application mapping curve; the output device 123 outputs the application trajectory to the external device 130.
  • the motion track includes one or more positional information of the light source 111
  • the application track includes one or more display positions corresponding to the light source 111 on the screen of the external device 130.
  • the transmitting light source 111 is disposed on the input device 110
  • The position and motion trajectory of the input device 110 are characterized by the position and motion trajectory of the emission light source 111, and the two are used interchangeably.
  • the camera 121 captures the emission light source 111 and acquires a multi-frame image of the emission light source 111.
  • The processing device 122 determines the input mode of the emission light source 111 as the mouse input mode according to the system default setting, and determines the mouse application mapping curve corresponding to the mouse input mode.
  • The binocular stereo vision algorithm described above is only one example of obtaining the three-dimensional translational position of the emission light source; the example is merely for the purpose of illustrating the invention and should not be construed as any limitation of the invention. Other existing or future ways of calculating the three-dimensional translational position of the emission light source, as applicable to the present invention, are also intended to be included within the scope of the present invention and are incorporated herein by reference.
  • the manner in which the processing device 122 detects the input mode of the light source 111 can be varied.
  • For example, the input mode of the emission light source 111 is determined according to a control signal of the input device 110, such as by querying a predetermined control information table according to the control information to determine the corresponding input mode; or the input mode of the emission light source 111 is determined according to the current application of the external device 130.
  • For example, if the current application is an input box, the corresponding input mode is the handwriting input mode; if the current application is a program menu, the corresponding input mode is the mouse input mode.
  • the processing device 122 may detect its corresponding input mode at the initial moment of the motion of the transmitting light source 111, or may switch the input mode of the transmitting light source 111 when the current application of the external device 130 changes.
  • the application detecting device 120 may include a mapping curve library for storing application mapping curves corresponding to various input modes, such as a mouse application mapping curve, a handwriting application mapping curve, and the like.
  • Figure 2 shows a plurality of two-dimensional mouse application mapping curves.
  • The two-dimensional mouse application mapping curve may be a first-order curve (i.e., a linear transformation curve), a quadratic curve, or a curve divided into multiple segments.
  • the x or y direction uses the same or different mapping curves to determine the moving position or speed of the mouse.
  • The moving distance of the imaging spot in the x and y directions of the image is mapped to a moving distance on the screen of the external device 130; the smaller the moving distance of the imaging spot, the gentler the mapping curve (i.e., the smaller its slope) so as to suppress jitter, and the larger the moving distance of the imaging spot, the larger the slope of the mapping curve.
  • a two-dimensional mouse application mapping curve can also be used to map the absolute position of the imaging spot to the display position of the screen.
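As a non-limiting illustrative sketch of the segmented mapping described above (all constants and the function name are hypothetical, not from the patent), a piecewise-linear curve can apply a small slope to small spot displacements to damp jitter and a larger slope to large displacements:

```python
def map_mouse_delta(dx_px, small_thresh=2.0, low_gain=0.5, high_gain=3.0):
    """Map the imaging spot's per-frame displacement (pixels) to a screen
    displacement: gentle slope for small movements (jitter suppression),
    steeper slope beyond the segment boundary (illustrative constants)."""
    magnitude = abs(dx_px)
    if magnitude <= small_thresh:
        return dx_px * low_gain  # small slope near zero suppresses hand shake
    # beyond the threshold, continue from the segment boundary with a larger slope
    sign = 1.0 if dx_px >= 0 else -1.0
    return sign * (small_thresh * low_gain + (magnitude - small_thresh) * high_gain)
```

The same or different such curves could be applied independently in the x and y directions, as the text notes.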
  • the mouse application mapping curve may further include a three-dimensional mouse application mapping curve, and the x, y, and z directions may respectively determine the moving position or speed of the mouse by the same or different mapping curves.
  • X, Y, Z can also be changes in the position of the mouse, such as the moving distance or speed of the mouse.
  • x, y, z can also be changes in the position of the emission light source 111, such as the moving distance or speed of the imaging spot.
  • The application mapping curve of the corresponding input mode may be further set according to a specific application. For example, for a general application such as web browsing, the display position of the mouse may be mapped according to the position of the emission light source 111, and for applications with high precision and sensitivity requirements, the positional change of the mouse may be mapped according to the positional change of the emission light source 111.
  • X = f(x, y, z, α, β, γ)
  • Y = g(x, y, z, α, β, γ)
  • Z = h(x, y, z, α, β, γ)
  • The three-dimensional rotational position of the emission light source 111 is denoted by (α, β, γ), where α is the horizontal direction angle of the axis through the centroid of the emission light source 111, β is the vertical direction angle of the axis through the centroid of the emission light source 111, and γ is the rotation angle of the emission light source 111 about its centroid axis, that is, the roll angle of the emission light source 111.
  • The three-dimensional rotational position of the emission light source 111 may also be denoted by θ or (θ, γ), where θ is the angle between the axis of the light source 111 and the line connecting the light source 111 to the camera 121.
  • The corresponding angle θ is obtained by various sample interpolation algorithms.
  • the sample interpolation algorithm includes, but is not limited to, nearest neighbor interpolation, linear weighted interpolation, bicubic interpolation, etc., any existing or future possible interpolation that may be suitable for use in the present invention. algorithm.
  • A sufficient number of samples, i.e., values of r and I (or other available spot properties), may be measured at different angles in a certain step size to establish the aforementioned spot property-angle sample table, or one, two or more curves may be fitted to the mapping relationship between r, I and θ according to a minimum-error criterion to obtain the aforementioned angle fitting curve.
  • the optical characteristics should be selected within the effective working range.
  • the combination of r and I can be used to uniquely determine the angle of the LED light source.
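A minimal sketch of the table lookup described above, using nearest-neighbour interpolation (one of the algorithms the text names); the table format and measured values are hypothetical:

```python
def estimate_angle(r, i, sample_table):
    """Nearest-neighbour lookup in a pre-measured spot-property/angle table.
    sample_table: list of (angle_deg, r_sample, i_sample) tuples (assumed
    format). r, i: the measured spot radius and intensity."""
    best_angle, best_dist = None, float("inf")
    for angle, rs, is_ in sample_table:
        # squared distance in (r, I) space; the (r, I) pair uniquely
        # determines the angle within the effective working range
        d = (r - rs) ** 2 + (i - is_) ** 2
        if d < best_dist:
            best_angle, best_dist = angle, d
    return best_angle
```

Linear-weighted or bicubic interpolation between the nearest samples, as also mentioned in the text, would refine the returned angle rather than snapping to the closest sample.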
  • f_p and g_p are mapping functions for the three-dimensional translational position;
  • f_r and g_r are mapping functions for the three-dimensional rotational position;
  • w1 and w2 are the influence weights of the three-dimensional translational position and the three-dimensional rotational position, respectively.
  • x, y, z, α, β, γ can also be changes in the corresponding direction, such as translational or rotational speed, rather than actual position values; this is more helpful for applications such as 3D TV or 3D games, for example, to rotate a menu according to the rotational speed of the light source 111, or to more accurately map the motion of a character in a 3D game according to the translational and rotational speeds of the emission light source 111.
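A sketch of how the translational and rotational contributions might be combined with the influence weights w1 and w2; the linear mapping functions and all constants here are illustrative assumptions, not the patent's actual curves:

```python
def map_3d(pos, rot, w1=0.7, w2=0.3):
    """Combine a translational (x, y, z) and a rotational (alpha, beta, gamma)
    contribution into one screen coordinate, weighted by w1 and w2.
    pos: metres; rot: radians. Mapping functions are hypothetical."""
    x, y, z = pos
    alpha, beta, gamma = rot
    f_p = 100.0 * x / max(z, 1e-6)  # translational part, scaled by depth
    f_r = 400.0 * alpha             # rotational part from the pointing angle
    return w1 * f_p + w2 * f_r
```

An analogous pair of functions would produce the Y coordinate; using velocities instead of positions as inputs gives the speed-driven variant mentioned for 3D TV and games.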
  • the application mapping curve can also be adjusted based on historical state information of the transmitting light source 111.
  • Historical state information, such as the most recent use state, is used to adjust the magnification factor of the mapping curve. It should be noted that those skilled in the art should understand that the most recent use state can be used not only to adjust the aforementioned amplification factor, but also, in some applications, to select different mapping curves f, g, h so as to obtain an optimal positioning experience.
  • the magnification factor of the mouse application mapping curve is adjusted by detecting the size of the imaging spot or the distance of the emission source 111 relative to the camera 121.
  • When the emission light source 111 is close, the amplification factor of the mapping curve is small; when the emission light source 111 is far, the amplification factor of the mapping curve is large, so that the user's experience of using the input device at different distances is consistent.
  • the distance of the emission source 111 can also be estimated by face detection to adjust the amplification factor of the mapping curve.
  • the distance of the light source 111 is estimated based on the face feature information in the imaging information that is closest to the motion track of the imaging spot, such as the size of the face, the distance between the eyes, the pixel width, and the like.
  • the amplification factor is calculated as follows:
  • curF = (1 - s) · preF + s · (z / Db), where:
  • curF: the magnification factor used in the current frame;
  • preF: the magnification factor used in the previous frame; if it is the first frame, 1 is used;
  • s: a parameter set by the user; the larger its value, the faster the amplification factor changes and the less it is affected by the accumulation of previous frames;
  • z: the distance from the input device 110 to the application detecting device 120, that is, the depth coordinate of the emission light source 111 relative to the spatial origin;
  • Db: the mean of multiple distances z, preset, for example, to 3.0 meters.
  • Once curF is obtained by the above formula, it is multiplied separately with the mapping output in the X direction and the mapping output in the Y direction to obtain a three-dimensional application mapping curve based on the most recent use state.
  • The magnification factor of the mouse application mapping curve is also adjusted based on the moving speed of the input device 110 over a recent period of time. If the recent moving speed of the input device 110 is small, the amplification factor of the mouse application mapping curve becomes smaller; if the recent moving speed of the input device 110 is large, the amplification factor becomes larger. Thus, when the user continuously performs small-range fine operations, a small amplification factor contributes to accurate positioning; when the user is moving quickly, a large amplification factor is beneficial for rapid movement.
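A minimal sketch of the frame-by-frame magnification update, assuming the exponentially-smoothed form suggested by the parameter descriptions (an average of the previous factor and the distance ratio z/Db; the smoothing form and default values are assumptions):

```python
def update_magnification(pre_f, z, s=0.2, d_b=3.0):
    """curF = (1 - s) * preF + s * (z / Db): larger s tracks distance
    changes faster and depends less on previous frames; preF is 1 for the
    first frame; Db defaults to 3.0 m per the description."""
    return (1.0 - s) * pre_f + s * (z / d_b)
```

At the reference distance z = Db the factor settles at 1; a user standing farther away accumulates a factor above 1, matching the stated goal of a consistent feel at different distances.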
  • The handwriting application mapping curve can be a linear curve, including a two-dimensional application mapping curve and a three-dimensional application mapping curve. Similar to the mouse application mapping curve, the input of the handwriting application mapping curve can also be the position of the emission light source 111 or its change (e.g., moving distance or speed), mapped correspondingly to a screen position or position change of the handwritten input.
  • The transform coefficient of the handwriting application mapping curve, that is, the slope of the aforementioned linear curve, can be set according to different applications.
  • For example, for an input-box application the corresponding transform coefficient may be 5, that is, the moving distance of the emission light source 111 is mapped to 5 times the moving distance of the input focus in the screen; for a handwriting input application such as a drawing board, the corresponding transform coefficient may be 1, that is, the position and motion trajectory of the emission light source 111 are directly mapped to the position of the input focus in the screen and the application trajectory.
  • the position information of the light source 111 can be directly calculated to obtain a corresponding display position.
  • a table may be generated in advance, and the corresponding display position is obtained according to the position information of the transmitting light source 111 in a table lookup manner.
  • the light source 111 includes, but is not limited to, any of various point light sources, surface light sources, and the like, such as LED visible light sources, LED infrared light sources, OLED light sources, and the like, which are applicable to the present invention.
  • The present invention mostly exemplifies the light source 111 by taking an LED light source as an example; the examples are merely for explaining the present invention in a simple manner and should not be construed as any limitation of the invention.
  • The camera 121 includes, but is not limited to, any image acquisition device capable of sensing and acquiring images of, e.g., LED visible light, infrared light, or gestures, as applicable to the present invention; for example, the camera 121 provides 1) a sufficiently high acquisition frame rate, such as 15 fps or more, 2) a suitable resolution, such as 640x480 or above, and 3) a sufficiently short exposure time, such as 1/500 s or less.
  • The processing device 122 includes, but is not limited to, any electronic device capable of automatically performing numerical calculation and/or various information processing according to a pre-stored program; its hardware includes, but is not limited to, microprocessors, FPGAs, DSPs, embedded devices, and the like. Further, in the present invention, the application detecting device 120 may include one or more processing devices 122; when there are multiple processing devices 122, each processing device 122 may be assigned to execute a specific information processing operation to achieve parallel computing, thereby improving detection efficiency.
  • the external device 130 includes, but is not limited to, a television set, a set top box, or a mobile device.
  • The output device 123 and the external device 130 transmit data and/or information through various wired or wireless communication methods; for example, the output device 123 communicates with the external device 130 in a wired manner through a hardware interface such as a VGA interface or a USB interface, or the output device 123 communicates with the external device 130 via wireless means such as Bluetooth or WiFi.
  • FIG. 4 is a flow chart of a method in accordance with another aspect of the present invention showing a process for mapping the applied trajectory of a motion trajectory of a light source.
  • the application input system 100 includes an input device 110 and an application detecting device 120, wherein the input device 110 and the application detecting device 120 are respectively placed at both ends.
  • Input device 110 includes at least one emission source 111.
  • the application detecting device 120 has built-in or externally connected at least one camera head 121; the camera 121 captures the emission light source 111 to obtain imaging information of the emission light source 111; and the application detecting device 120 is also connected to the external device 130.
  • In step S401, the camera 121 captures imaging information of the emission light source 111; in step S402, the application detecting device 120 detects the input mode of the emission light source 111 to determine the application mapping curve corresponding to the input mode; in step S403, the application detecting device 120 acquires the motion trajectory of the emission light source 111 according to the imaging information of the emission light source 111; in step S404, the application detecting device 120 obtains, from the motion trajectory and through the determined application mapping curve, an application trajectory corresponding to the motion trajectory; in step S405, the application detecting device 120 outputs the application trajectory to the external device 130.
  • the light source 111 is an LED light source that is mounted on an input control device 110, such as a remote controller; the user manipulates the remote controller to perform various actions in space in the direction of the camera 121.
  • the camera 121 is built in the application detecting device 120.
  • the camera 121 captures an image of the LED light source at a frame rate three times the blinking frequency of the LED light source to obtain imaging information of the LED light source.
  • In step S402, the application detecting device 120 queries a predetermined input mode mapping table according to the blinking frequency of the LED light source, determines the current input mode of the LED light source, such as the mouse input mode, and acquires the application mapping curve corresponding to the mouse input mode; in step S403, the application detecting device 120 acquires the motion trajectory of the LED light source, such as a plurality of position information of the LED light source, according to the imaging information of the LED light source; in step S404, the application detecting device 120 obtains the mouse motion trajectory through the aforementioned application mapping curve according to the motion trajectory of the LED light source; in step S405, the application detecting device 120 outputs the mouse motion trajectory through the VGA interface connected to the external device 130, so that the external device 130 displays the mouse motion trajectory corresponding to the LED light source on its screen.
  • Preferably, the application detecting device 120 further detects the current input state of the emission light source 111.
  • When the waiting time corresponding to the input state expires, the imaging information of the emission light source 111 is detected and the motion trajectory is obtained, thereby obtaining the application trajectory corresponding to the motion trajectory.
  • Alternatively, the application detecting device 120 detects the current input state of the emission light source 111, and when the waiting time corresponding to the input state expires, obtains the application trajectory through the application mapping curve corresponding to the input mode according to the motion trajectory of the emission light source 111.
  • the application detecting device 120 can detect the current input state of the transmitting light source 111, such as an input state or a waiting state, by using the screen input position of the emitting light source 111 or the moving mode of the transmitting light source 111.
  • The application detecting device 120 can detect the input state and the waiting state using the movement mode of the emission light source 111: the state is the input state when the moving speed or distance of the emission light source 111 or the input cursor is greater than a threshold, and the waiting state otherwise.
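The threshold test just described can be sketched as follows; the threshold values and the state labels are illustrative assumptions:

```python
def detect_input_state(speed, distance, speed_thresh=1.5, dist_thresh=5.0):
    """Classify the emission light source as being in the 'input' state when
    its moving speed or accumulated moving distance exceeds a threshold,
    and in the 'waiting' state otherwise (thresholds are hypothetical)."""
    if speed > speed_thresh or distance > dist_thresh:
        return "input"
    return "waiting"
```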
  • Figure 5 is a flow diagram of a method in accordance with a preferred embodiment of the present invention showing a process for mapping the applied trajectory of a motion trajectory of a transmitted light source.
  • the application input system 100 includes an input device 110 and an application detecting device 120, wherein the input device 110 and the application detecting device 120 are respectively placed at both ends.
  • Input device 110 includes at least one emission source 111.
  • the application detecting device 120 has built-in or externally connected at least one camera head 121; the camera 121 captures the emission light source 111 to obtain imaging information of the emission light source 111; and the application detecting device 120 is also connected to the external device 130.
  • In step S501, the camera 121 captures and obtains the imaging information of the emission light source 111; in step S502, the application detecting device 120 detects the input mode of the emission light source 111 to determine the application mapping curve corresponding to the input mode; in step S503, the application detecting device 120 obtains the motion trajectory of the emission light source 111 according to its imaging information; in step S506, the application detecting device 120 corrects the starting point of the motion trajectory according to the peak of a motion feature of the motion trajectory within a predetermined search time range, for correcting the starting point of the application trajectory; in step S507, the application detecting device 120, from the start time of the input operation of the input device 110, performs the corresponding input operation correction according to the operation related information of the input device 110 to obtain the corrected input operation, until the predetermined input operation correction stop condition is satisfied, wherein the predetermined input operation correction stop condition includes the movement time of the emission light source 111 reaching a predetermined correction delay time threshold.
  • The application detecting device 120 corrects the starting point of the motion trajectory based on the peak of a motion feature of the motion trajectory of the imaging spot within the predetermined search time range, to correct, for example, the mouse position and the handwriting position.
  • Taking mouse position correction as an example: the position of the imaging spot in each frame image detected in the most recent period, such as 500 ms, is recorded; when control information from the user is received, such as information indicating a mouse click operation, the application detecting device 120 calculates the motion features of the imaging spot in each recorded frame within a search time range before the click time, such as 100 ms or 200 ms, and determines the mouse click start time according to the motion features, such as using the frame with the peak motion feature value within the search time range, or the frame before it, as the click start time; the mouse position corresponding to the position of the imaging spot at that time is then taken as the true position of the mouse click.
  • The motion features include, but are not limited to, various motion characteristics of the imaging spot.
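A simplified sketch of the click start-point correction: search the buffered frames in a window before the click time for the peak motion feature, and report that frame's position as the true click position (the data layout is an assumption; the patent also allows selecting the frame before the peak):

```python
def correct_click_position(history, click_t, search_ms=200):
    """history: list of (t_ms, position, motion_feature) per buffered frame.
    Returns the position at the frame with the largest motion feature within
    the search window before click_t, or None if no frame qualifies."""
    window = [h for h in history if click_t - search_ms <= h[0] <= click_t]
    if not window:
        return None
    peak = max(window, key=lambda h: h[2])  # frame with the peak motion feature
    return peak[1]
```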
  • The application detecting device 120, from the start time of the input operation of the input device 110, performs the corresponding input operation correction according to the operation related information of the input device 110 to obtain the corrected input operation, such as interpreting a mouse drag operation as a mouse click operation, or interpreting a mouse drag + click operation as a mouse double-click operation, until the predetermined input operation correction stop condition is satisfied, such as the movement time of the emission light source 111 reaching the predetermined correction delay time threshold and/or the motion feature value of the motion trajectory of the emission light source 111 reaching its corresponding motion feature value threshold.
  • the operation related information includes, but is not limited to, any subsequent related operations or movements performed by the input device 110 in the current input operation state, such as the input device 110 performing motion in a mouse click state, Thereby, the mouse click operation is converted into a mouse drag operation, or the input device 110 clicks again in the mouse drag state, thereby converting the mouse drag operation into a mouse click operation or the like.
• a predetermined input operation mapping relationship maps one or more input operations of the user to other input operations, such as interpreting a mouse drag operation as a mouse click operation, or interpreting a mouse drag + click operation as a mouse double click operation, etc., to prevent the mouse or input focus from jittering on the screen and degrading the user experience.
• from the moment the application detecting device 120 obtains the start of an input operation of the input device 110, e.g. after determining the mouse click position in step S506, the user may shake the input device 110 slightly while in the mouse click state, whereby the mouse click operation is converted into a mouse drag operation; the application detecting device 120 then maps the mouse drag operation back to a mouse click operation according to the predetermined input operation mapping relationship, and simultaneously detects whether the predetermined input-operation-correction stop condition is satisfied; when the motion time of the emitting light source 111 reaches the predetermined correction delay time threshold and/or a motion feature value of its motion trajectory reaches its corresponding motion feature value threshold, the application detecting device 120 stops the input operation correction and resumes the previous calculation of the motion trajectory of the emitting light source 111.
• the application detecting device 120 calculates the motion feature of the imaged spot in each frame of image after the start time of the input operation of the input device 110; when one or more motion features exceed their corresponding predetermined thresholds, the input operation correction is stopped, e.g. when the displacement of the imaged spot is sufficiently large. Alternatively, a maximum anti-shake delay time, such as 100 to 200 ms, may be preset; timing starts from the start of the movement of the emitting light source 111, and the input operation correction is stopped when the maximum anti-shake delay time is reached.
• the motion features include, but are not limited to, the speed or acceleration of the imaged spot in each frame of image, its speed or acceleration in the vertical direction, the amounts of change of these in adjacent frames, or the like, or the displacement of the imaged spot relative to its initial position at the click time, or the horizontal or vertical component of that displacement.
• the application detecting device 120 determines, according to the current application of the external device 130, such as web browsing, that the input mode of the emitting light source 111 is the mouse input mode; the application detecting device 120 acquires imaging information of the emitting light source 111 from the camera 121 and, according to that imaging information, calculates the motion trajectory of the emitting light source 111; the application detecting device 120 calculates the speed of the imaged spot in each frame within a maximum search time range, e.g. the first 100 ms, from the starting time of the motion trajectory, and uses the position of the frame corresponding to the speed peak, or the position of the previous frame, as the starting position of the motion trajectory, thereby correcting the motion trajectory so that the application trajectory is correspondingly corrected; subsequently, based on the re-determined motion trajectory, the application detecting device 120 obtains the corresponding application trajectory through the mouse application mapping curve and outputs it to the external device 130.
• the current position of the emitting light source 111 is interpreted by the application detecting device 120 as the current position of the mouse. If the user's hand shakes slightly while manipulating the input device 110, the corresponding mouse position may also shake slightly, which may cause the application detecting device 120 to perform a mouse click operation at the wrong position or to interpret a mouse click operation as a mouse drag operation.
• Steps S506 and S507 can perform click position correction and click jitter correction for these two problems respectively.
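As a rough illustration of the click-position correction of step S506, the sketch below records recent spot positions, searches backward from the click time for the speed peak, and returns the position just before it as the true click position. The class name, frame rate, and window lengths are illustrative assumptions, not values fixed by the specification.

```python
from collections import deque

class ClickPositionCorrector:
    """Sketch of the click-position correction described above; frame rate,
    history window and search window are illustrative assumptions."""

    def __init__(self, fps=100, history_ms=500, search_ms=200):
        self.dt = 1000.0 / fps                         # milliseconds per frame
        self.search_frames = int(search_ms / self.dt)  # backward search window
        self.history = deque(maxlen=int(history_ms / self.dt))

    def record(self, x, y):
        """Called once per frame with the imaged-spot position."""
        self.history.append((x, y))

    def click_position(self):
        """On a click event, search backward over the search window for the
        per-frame speed peak and return the position just before it."""
        frames = list(self.history)
        n = min(len(frames) - 1, self.search_frames)
        if n < 1:
            return frames[-1] if frames else None
        start = len(frames) - n
        speeds = []
        for i in range(start, len(frames)):
            (x0, y0), (x1, y1) = frames[i - 1], frames[i]
            speeds.append(((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / self.dt)
        peak = max(range(len(speeds)), key=speeds.__getitem__)
        return frames[max(start + peak - 1, 0)]
```

If the spot is stationary and then jerks just before the click event, the returned position is the one recorded before the jerk rather than the jittered end position.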
• FIG. 6 is a flow chart of a method in accordance with another preferred embodiment of the present invention, showing a process for mapping the motion trajectory of an emitting light source to its application trajectory.
  • the application input system 100 includes an input device 110 and an application detecting device 120, wherein the input device 110 and the application detecting device 120 are respectively disposed at both ends.
• the input device 110 includes at least one emitting light source 111.
• the application detecting device 120 has at least one built-in or external camera 121; the camera 121 photographs the emitting light source 111 to obtain imaging information of the emitting light source 111; the application detecting device 120 is also connected to the external device 130.
• in step S601, the camera 121 captures imaging information of the emitting light source 111.
• in step S602, the application detecting device 120 detects the input mode of the emitting light source 111 to determine the application mapping curve corresponding to the input mode; in step S603, the application detecting device 120 acquires the motion trajectory of the emitting light source 111 according to its imaging information; in step S604, the application detecting device 120 obtains, from the motion trajectory and through the determined application mapping curve, the application trajectory corresponding to the motion trajectory;
• in step S606, the application detecting device 120 corrects the starting point of the application trajectory according to a motion feature peak of the application trajectory within a predetermined search time range; in step S607, from the start time of the input operation of the input device 110, the application detecting device 120 performs the corresponding input operation correction according to operation-related information of the input device 110 to obtain the corrected input operation, until the predetermined input-operation-correction stop condition is satisfied;
• the predetermined input-operation-correction stop condition includes the motion time of the emitting light source 111 reaching a predetermined correction delay time threshold and/or a motion feature value of the application trajectory of the emitting light source 111 reaching its corresponding motion feature value threshold; in step S605, the application detecting device 120 outputs the corrected application trajectory to the external device 130.
• the application detecting device 120 corrects the starting point of the application trajectory according to the peak of a motion feature of the imaged spot's application trajectory within the predetermined search time range, so as to correct, e.g., the mouse position or the handwriting position.
• the detected mouse position is recorded over the most recent period, e.g. 500 ms; upon receiving control information from the user, e.g. information indicating a mouse click operation, the application detecting device 120 searches backward from the click time.
• the mouse motion features include, but are not limited to, the speed or acceleration of the mouse movement in each frame of image, its speed or acceleration in the vertical direction, the amounts of change of these in adjacent frames, etc.
• from the start time of the input operation of the input device 110, the application detecting device 120 performs the corresponding input operation correction according to operation-related information of the input device 110 to obtain a corrected input operation, such as interpreting a mouse drag operation as a mouse click operation, or interpreting a mouse drag + click operation as a mouse double click operation, until the predetermined input-operation-correction stop condition is satisfied, such as the motion time of the emitting light source 111 reaching a predetermined correction delay time threshold and/or a motion feature value of the application trajectory of the emitting light source 111 reaching its corresponding motion feature value threshold.
• from the moment the application detecting device 120 obtains the start of an input operation of the input device 110, such as a mouse drag operation, the user may manipulate the input device 110 to perform a mouse click operation, whereby the mouse drag operation is converted into a mouse click operation at the drag stop position; according to the predetermined input operation mapping relationship, the application detecting device 120 then maps the mouse drag + click operation to a mouse double click operation at the start position of the original mouse drag.
• the application detecting device 120 simultaneously detects whether the predetermined input-operation-correction stop condition is satisfied; when the motion time of the emitting light source 111 reaches the predetermined correction delay time threshold and/or a motion feature value of the application trajectory of the emitting light source 111 reaches its corresponding motion feature value threshold, the application detecting device 120 stops the input operation correction and resumes the previous calculation of the application trajectory of the emitting light source 111.
• the application detecting device 120 calculates the mouse motion features corresponding to each frame of image after the start time of the input operation of the input device 110; when one or more mouse motion features exceed their corresponding predetermined thresholds, the input operation correction is stopped, e.g. when the mouse movement displacement is large enough. Alternatively, a maximum anti-shake delay time, such as 100 to 200 ms, may be preset; timing starts from the start of the movement of the emitting light source 111, and the input operation correction is stopped when the maximum anti-shake delay time is reached.
• the mouse motion features include, but are not limited to, the speed or acceleration of the mouse movement corresponding to each frame of image, its speed or acceleration in the vertical direction, the amounts of change of these in adjacent frames, or the displacement relative to the mouse click position, or the horizontal or vertical component of that displacement.
• the application detecting device 120 determines, according to the current application of the external device 130, such as web browsing, that the input mode of the emitting light source 111 is the mouse input mode, and determines the corresponding mouse application mapping curve; according to the imaging information of the emitting light source 111, the application detecting device 120 obtains its motion trajectory and calculates the corresponding application trajectory through the mouse application mapping curve; the application detecting device 120 calculates the recorded mouse motion feature in each frame of image within a maximum search time range, e.g. 100 ms, from the starting time of the application trajectory, such as the mouse movement speed in each frame of the first 100 ms, and uses the frame corresponding to the speed peak, or the mouse position of the previous frame, as the starting position of the application trajectory, to correct it.
• the application detecting device 120 obtains the mouse click operation of the input device 110.
• a movement the user performs while manipulating the input device 110 converts the mouse click operation into a mouse drag operation.
• the application detecting device 120 maps the mouse drag operation back to the mouse click operation, and detects whether the predetermined input-operation-correction stop condition is satisfied; when the displacement of a position in the application trajectory of the emitting light source 111 relative to the motion start position, or its horizontal or vertical component, exceeds the corresponding threshold, the input operation correction is stopped, and the previous calculation of the application trajectory is resumed.
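The drag-to-click anti-shake correction above can be sketched as a small decision function: while the stop condition is not yet met, a drag produced by slight shaking is reported as a click. The operation names and numeric thresholds below are invented for illustration; the specification leaves the concrete values open.

```python
def correct_operation(op, elapsed_ms, displacement_px,
                      max_delay_ms=200, disp_threshold_px=15):
    """Sketch of the anti-shake input-operation correction: map 'drag' back
    to 'click' until the correction stop condition is satisfied (elapsed
    time or displacement exceeds its threshold). Thresholds are illustrative."""
    stop = elapsed_ms >= max_delay_ms or displacement_px >= disp_threshold_px
    if not stop and op == "drag":
        return "click"        # small shake right after a click stays a click
    return op                 # correction stopped: report the real operation
```

A small shake shortly after the click (`correct_operation("drag", 50, 3)`) is reported as a click, while a large or long movement is reported as a genuine drag.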
• Figure 7 is a flow chart of a method according to a further preferred embodiment of the present invention, showing a process for mapping the motion trajectory of an emitting light source to its application trajectory.
  • the application input system 100 includes an input device 110 and an application detecting device 120, wherein the input device 110 and the application detecting device 120 are respectively placed at both ends.
  • Input device 110 includes at least one emission source 111.
• the application detecting device 120 has at least one built-in or external camera 121; the camera 121 photographs the emitting light source 111 to obtain imaging information of the emitting light source 111; the application detecting device 120 is also connected to the external device 130.
• in step S701, the camera 121 captures imaging information of the emitting light source 111.
• in step S702, the application detecting device 120 detects the input mode of the emitting light source 111 to determine the application mapping curve corresponding to the input mode; in step S7031, the application detecting device 120 acquires the motion trajectory of the emitting light source 111 according to its imaging information; in step S7032, the application detecting device 120 determines, according to historical motion feature information of the motion trajectory, expected position information of the emitting light source 111 for smoothing the motion trajectory; in step S704, the application detecting device 120 obtains, from the motion trajectory and through the determined application mapping curve, the application trajectory corresponding to the motion trajectory; in step S705, the application detecting device 120 outputs the application trajectory to the external device 130.
• the application detecting device 120 performs an interpolation smoothing operation on the motion trajectory. Specifically, a maximum output time interval, such as 10 ms, is predetermined; when this maximum output time interval is exceeded and the application detecting device 120 still has no application trajectory of the emitting light source 111 to output, the application detecting device 120 determines the expected position information of the emitting light source 111 according to historical motion feature information of its motion trajectory, such as its position, velocity and acceleration.
• according to the expected position information, the application detecting device 120 obtains, through the corresponding application mapping curve, the expected application trajectory corresponding to the expected position information.
• the 2D/3D motion trajectory detected by the application detecting device 120 may have an insufficient sampling rate, which may degrade the user experience.
• the mouse application trajectory generated from the 2D/3D motion trajectory may stutter and not be smooth enough; interpolating with the expected position information according to the above process increases the smoothness of the motion trajectory, so that the corresponding mouse application trajectory is also smooth and fluid, without stutter.
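The expected-position interpolation can be sketched as a short extrapolation from the recorded historical motion features (position, velocity, acceleration) when no new sample has arrived within the maximum output interval. The function below is a minimal, hypothetical illustration under a constant-acceleration assumption.

```python
def expected_position(pos, vel, acc, dt_ms):
    """Predict the spot position dt_ms ahead from its last known position,
    velocity (units/s) and acceleration (units/s^2), assuming constant
    acceleration over the short gap. Names and units are illustrative."""
    dt = dt_ms / 1000.0
    return tuple(p + v * dt + 0.5 * a * dt * dt
                 for p, v, a in zip(pos, vel, acc))
```

For example, with a last position of (100, 50), velocity (10, -5) and zero acceleration, the position expected 10 ms later is roughly (100.1, 49.95), which can be emitted as an interpolated trajectory point.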
• Figure 8 is a flow chart of a method in accordance with yet another preferred embodiment of the present invention, showing a process for mapping the motion trajectory of an emitting light source to its application trajectory.
  • the application input system 100 includes an input device 110 and an application detecting device 120, wherein the input device 110 and the application detecting device 120 are respectively placed at both ends.
  • the input device 110 includes at least one emission source 111, and the input mode of the emission source 111 includes a handwriting input mode.
  • the application detecting device 120 has built-in or externally connected at least one camera 121; the camera 121 captures the emission light source 111 to obtain imaging information of the emission light source 111; and the application detecting device 120 is also connected to the external device 130.
• in step S801, the camera 121 captures and obtains imaging information of the emitting light source 111; in step S802, the application detecting device 120 detects the input mode of the emitting light source 111 to determine the application mapping curve corresponding to the input mode; in step S803, the application detecting device 120 obtains the motion trajectory of the emitting light source 111 according to its imaging information; in step S804, the application detecting device 120 obtains, from the motion trajectory and through the determined application mapping curve, the application trajectory corresponding to the motion trajectory; the application detecting device 120 then outputs the application trajectory to the external device 130; in step S808, the application detecting device 120 queries a predetermined character library according to the application trajectory to obtain the character corresponding to the application trajectory; in step S809, the application detecting device 120 outputs the character to the external device 130.
• the application detecting device 120 detects that the input mode of the emitting light source 111 is the handwriting input mode, and determines that the corresponding application mapping curve is a linear curve with a slope (i.e., transform coefficient) of 5; according to the imaging information of the emitting light source 111, such as the position information of the imaged spot of the LED light source in each frame of image, the application detecting device 120 obtains the motion trajectory of the emitting light source 111 from the motion trajectory formed by the imaged spot in the continuous image sequence by means of a target tracking method; and after calculating and outputting the corresponding application trajectory according to the motion trajectory of the emitting light source 111, the application detecting device 120 further queries a predetermined character library according to the application trajectory to obtain the character corresponding to the application trajectory, and outputs the character to the external device 130.
• the transform coefficient may be determined according to the statistical habits of multiple users, preset by the user or the application detecting device 120, or determined by the application detecting device 120 according to the current user's usage habits.
• the application mapping curve may determine the corresponding screen input position according to the position of the imaged spot relative to a fixed point (such as the upper-left point of the image); or it may determine the corresponding screen trajectory according to the moving distance or speed of the imaged spot, e.g. the moving distance or speed of the emitting light source 111 corresponds linearly to the position and length of the input stroke.
• the application detecting device 120 further detects the current input state of the emitting light source 111, such as an input state or a waiting state.
• when the waiting time expires, the predetermined character library is queried according to the determined application trajectory, the character corresponding to the application trajectory is obtained, and the character is output to the external device 130.
• when the user keeps the input cursor within the handwriting input area, the waiting time between strokes is T1;
• when the user moves the input cursor out of the handwriting input area, the waiting time between strokes is T2, and T2 < T1;
• when the waiting time expires, it is considered that the user has finished writing a character, and character recognition starts automatically;
• the predetermined character library is queried to obtain the character corresponding to the application trajectory;
• the waiting-state time may be recorded from the end of the last stroke input, to prevent the system from waiting indefinitely, i.e., the longest waiting time between strokes does not exceed T1.
• the application detecting device 120 detects the input state and the waiting state using the screen input position of the emitting light source 111:
• the screen input position is, for example, the position of the input cursor;
• when the user keeps the input cursor within the handwriting input area, the waiting time between strokes is T1;
• when the user moves the input cursor out of the handwriting input area, the waiting time between strokes is T2, and T2 < T1;
• when the waiting time expires, the user is considered to have finished writing a character, and character recognition starts automatically;
• that is, when the user keeps the input cursor within the handwriting input area, the waiting time is long; and when the user moves the input cursor out of the handwriting input area, the waiting time is short.
  • the handwriting input area may be a fixed area on the screen of the external device 130, such as a central area of the screen, or may be an area dynamically determined according to the starting point of the application track.
• the handwriting input area corresponding to the handwriting input mode is determined according to the starting point of the application trajectory, i.e., the initial pen-down of the handwriting input, extended upward, downward, leftward and rightward by a certain displacement;
• the size of the area gives the user enough space to write a character.
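The waiting-time logic above (a long wait T1 while the cursor stays within the handwriting area, a shorter wait T2 once it leaves, with T2 < T1) can be sketched as a small timer class. The class name and the millisecond values are illustrative assumptions, not values fixed by the specification.

```python
class HandwritingWaiter:
    """Sketch of the inter-stroke waiting logic: wait up to T1 while the
    cursor stays in the handwriting input area, only T2 once it leaves
    (T2 < T1); when the wait expires, character recognition starts."""

    def __init__(self, t1_ms=1500, t2_ms=600):
        assert t2_ms < t1_ms
        self.t1, self.t2 = t1_ms, t2_ms
        self.waited = 0

    def stroke_ended(self):
        # Restart the waiting-state timer at the end of each stroke.
        self.waited = 0

    def tick(self, elapsed_ms, cursor_in_area):
        # Returns True when the wait expires and recognition should start.
        self.waited += elapsed_ms
        limit = self.t1 if cursor_in_area else self.t2
        return self.waited >= limit
```

Moving the cursor out of the input area shortens the effective wait, so recognition of the finished character is triggered sooner.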
• Figure 9 is a flow chart of a method in accordance with still another preferred embodiment of the present invention, showing a process for mapping the motion trajectory of an emitting light source to its application trajectory.
  • the application input system 100 includes an input device 110 and an application detecting device 120, wherein the input device 110 and the application detecting device 120 are respectively placed at both ends.
  • the input device 110 includes at least one emission source 111, and the input mode of the emission source 111 includes a mouse input mode.
  • the application detecting device 120 has built-in or externally connected at least one camera 121; the camera 121 captures the emission light source 111 to obtain imaging information of the emission light source 111; and the application detecting device 120 is also connected to the external device 130.
• in step S901, the camera 121 captures imaging information of the emitting light source 111.
• in step S902, the application detecting device 120 detects the input mode of the emitting light source 111 to determine the application mapping curve corresponding to the input mode.
• the application detecting device 120 acquires the motion trajectory of the emitting light source 111 according to its imaging information.
• the application detecting device 120 obtains the corresponding application trajectory through the determined mapping curve according to the motion trajectory.
• in step S9011, the application detecting device 120 acquires the control information and obtains the mouse operation corresponding to the control information by querying a predetermined control information table; in a subsequent step, the application detecting device 120 outputs the execution instruction of the mouse operation to the external device 130, so that
• the input focus corresponding to the emitting light source 111 performs the mouse operation, and the execution result corresponding to the mouse operation is presented at the external device 130.
• the user performs various mouse operations through the buttons provided on the input device 110, which control the emitting light source 111 to emit light at a certain blinking frequency, so that the application detecting device 120 obtains the corresponding
• mouse operation by detecting the blinking frequency; the application detecting device 120 acquires and outputs the corresponding application trajectory according to the motion trajectory of the emitting light source 111, and, according to the imaging information of the emitting light source 111, analyzes the imaged spot of the emitting light source 111 over a period of time to
• obtain the blinking frequency of the emitting light source 111, and queries a predetermined control information table according to the blinking frequency to obtain the corresponding mouse operation, such as a click operation; subsequently, the application detecting device 120 outputs the execution
• instruction of the mouse operation to the external device 130 to perform the click operation at the current mouse position, and the corresponding execution result is presented on the screen of the external device 130.
• it should be noted that the present invention may be implemented in software and/or a combination of software and hardware, for example using an ASIC (application-specific integrated circuit), a
• general-purpose computer, or any other similar hardware device;
• the software program of the present invention can be executed by a processor to implement the steps or functions described above;
• likewise, the software program of the present invention (including related data structures) can be stored in a computer-readable recording medium, such as a RAM memory, a magnetic or optical drive, a floppy disk, or the like;
• in addition, some steps or functions of the present invention may be implemented in hardware, for example as a circuit that cooperates with the processor to perform the various functions or steps.
• in addition, a portion of the present invention may be embodied as a computer program product, such as computer program instructions which, when executed by a computer, can invoke or provide the method and/or technical solution in accordance with the present invention;
• the program instructions invoking the method of the present invention may be stored in a fixed or removable recording medium, and/or transmitted via a data stream in a broadcast or other signal-bearing medium, and/or stored in the working memory of a computer device that runs according to the program instructions;
• here, one embodiment according to the present invention includes a device comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein, when the computer program instructions are executed by the processor, the device is triggered to operate based on the aforementioned methods and/or technical solutions according to the various embodiments of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The object of the present invention is to provide a method and device for mapping the motion trajectory of an emitting light source to its application trajectory, wherein an application detecting device acquires imaging information of the emitting light source; detects the input mode of the emitting light source to determine the application mapping curve corresponding to that input mode; acquires the motion trajectory of the emitting light source according to the imaging information; obtains, according to the motion trajectory and through the application mapping curve, the application trajectory corresponding to the motion trajectory; and outputs the application trajectory to an external device. Compared with the prior art, the present invention adaptively matches application mapping curves to the different input modes of an emitting light source and acquires the application trajectory accordingly, improving the user experience.

Description

Method and system for mapping the motion trajectory of an emitting light source to its application trajectory
Technical Field
The present invention relates to the field of intelligent control technology, and in particular to a technique for mapping the motion trajectory of an emitting light source to its application trajectory. Background Art
In intelligent control fields such as smart TV, motion-sensing interaction and virtual reality, a detecting device usually detects a certain signal sent by an input device, such as an electromagnetic, acoustic or optical signal, performs the corresponding input mapping, and displays on the screen an application trajectory corresponding to the motion trajectory of the input device. However, such input mappings are often simple mappings, such as acceleration-based mapping via a MEMS sensor or simple two-dimensional mapping via a gravity sensor, and the user experience is poor.
Therefore, how to overcome the above shortcomings and provide a method for mapping the motion trajectory of an emitting light source to its application trajectory has become one of the technical problems that those skilled in the art urgently need to solve. Summary of the Invention
The object of the present invention is to provide a method and system for mapping the motion trajectory of an emitting light source to its application trajectory.
According to one aspect of the present invention, a method for mapping the motion trajectory of an emitting light source to its application trajectory is provided, wherein the method comprises the following steps:
- acquiring imaging information of the emitting light source;
wherein the method further comprises:
a. detecting the input mode of the emitting light source, to determine the application mapping curve corresponding to the input mode;
b. acquiring the motion trajectory of the emitting light source according to the imaging information;
c. obtaining, according to the motion trajectory and through the application mapping curve, the application trajectory corresponding to the motion trajectory;
d. outputting the application trajectory to an external device. Preferably, the operation of detecting the input mode of the emitting light source comprises:
- determining the input mode of the emitting light source according to the current application of the external device. Preferably, the application mapping curve comprises a three-dimensional application mapping curve.
More preferably, the magnification factor of the three-dimensional application mapping curve is adjusted based on the distance of the emitting light source.
More preferably, the three-dimensional application mapping curve comprises a three-dimensional application mapping curve based on the three-dimensional rotational position of the emitting light source.
Further, step b comprises:
- acquiring the three-dimensional rotational motion trajectory of the emitting light source according to the imaging information. In one preferred embodiment of the method of the present invention, the application mapping curve is adjusted based on historical state information of the emitting light source.
In one preferred embodiment of the method of the present invention, before step b, the method further comprises:
- detecting the current input state of the emitting light source, so as to start subsequent operations when the waiting time corresponding to the input state expires.
In one preferred embodiment of the method of the present invention, the method further comprises:
- correcting the starting point of the application trajectory according to a motion feature peak of the motion trajectory or of the application trajectory within a predetermined search time range;
wherein step d comprises:
- outputting the corrected application trajectory to the external device.
In one preferred embodiment of the method of the present invention, before step d, the method further comprises:
- from the moment the start of an input operation of an input device is obtained, performing the corresponding input operation correction according to operation-related information of the input device, to obtain the corrected input operation, until a predetermined input-operation-correction stop condition is satisfied;
wherein the predetermined input-operation-correction stop condition comprises at least any one of the following: - the motion time of the emitting light source reaches a predetermined correction delay time threshold;
- a motion feature value of the motion trajectory of the emitting light source reaches its corresponding motion feature value threshold; - a motion feature value of the application trajectory of the emitting light source reaches its corresponding motion feature value threshold.
In one preferred embodiment of the method of the present invention, step b further comprises:
- determining expected position information of the emitting light source according to historical motion feature information of the motion trajectory, for use in smoothing the motion trajectory.
In one preferred embodiment of the method of the present invention, the input mode of the emitting light source comprises a handwriting input mode.
Preferably, the application mapping curve comprises a first-order linear curve.
Preferably, the method further comprises:
- querying a predetermined character library according to the application trajectory, to obtain the character corresponding to the application trajectory;
- outputting the character to the external device.
Preferably, in the handwriting input mode, the method further comprises:
- determining the input area corresponding to the handwriting input mode according to the starting point of the application trajectory.
In one preferred embodiment of the method of the present invention, the input mode of the emitting light source comprises a mouse input mode.
Preferably, the method further comprises:
- acquiring, according to the imaging information of the emitting light source, control information emitted by the emitting light source, and obtaining the mouse operation corresponding to the control information by querying a predetermined control information table;
- outputting an execution instruction of the mouse operation to the external device, so as to perform the mouse operation at the input focus corresponding to the emitting light source, and presenting on the external device the execution result corresponding to the mouse operation.
According to another aspect of the present invention, a system for mapping the motion trajectory of an emitting light source to its application trajectory is also provided, wherein the system comprises the emitting light source, a camera for acquiring imaging information of the emitting light source, a processing device and an output device;
wherein the processing device is configured to:
- detect the input mode of the emitting light source, to determine the application mapping curve corresponding to the input mode;
- acquire the motion trajectory of the emitting light source according to the imaging information;
- obtain, according to the motion trajectory and through the application mapping curve, the application trajectory corresponding to the motion trajectory;
wherein the output device is configured to output the application trajectory to an external device.
Compared with the prior art, the present invention determines the corresponding application mapping curve according to the input mode of the emitting light source, and then obtains the application trajectory of the emitting light source from its motion trajectory through that application mapping curve, thereby adaptively matching application mapping curves for the different input modes of the emitting light source and acquiring the application trajectory, which improves the user experience. Brief Description of the Drawings
Other features, objects and advantages of the present invention will become more apparent by reading the following detailed description of non-limiting embodiments made with reference to the following drawings:
Fig. 1 is a schematic diagram of a system for mapping the motion trajectory of an emitting light source to its application trajectory according to one aspect of the present invention;
Fig. 2 is a schematic diagram of two-dimensional mouse application mapping curves according to the present invention; Fig. 3 is a schematic diagram indicating three-dimensional rotational position information of the emitting light source according to the present invention;
Fig. 4 is a flow chart of a method for mapping the motion trajectory of an emitting light source to its application trajectory according to another aspect of the present invention;
Fig. 5 is a flow chart of a method for mapping the motion trajectory of an emitting light source to its application trajectory according to one preferred embodiment of the present invention;
Fig. 6 is a flow chart of a method for mapping the motion trajectory of an emitting light source to its application trajectory according to another preferred embodiment of the present invention;
Fig. 7 is a flow chart of a method for mapping the motion trajectory of an emitting light source to its application trajectory according to a further preferred embodiment of the present invention;
Fig. 8 is a flow chart of a method for mapping the motion trajectory of an emitting light source to its application trajectory according to yet another preferred embodiment of the present invention;
Fig. 9 is a flow chart of a method for mapping the motion trajectory of an emitting light source to its application trajectory according to still another preferred embodiment of the present invention.
The same or similar reference numerals in the drawings denote the same or similar components. Detailed Description of the Embodiments
The present invention is described in further detail below with reference to the accompanying drawings.
Fig. 1 is a system diagram according to one aspect of the present invention, showing a system for mapping the motion trajectory of an emitting light source to its application trajectory.
Here, the input detection system 100 comprises an input device 110 and an application detecting device 120, wherein the input device 110 and the application detecting device 120 are placed at the two ends respectively. The input device 110 comprises at least one emitting light source 111. The application detecting device 120 comprises at least one processing device 122 and at least one output device 123; the application detecting device 120 also has at least one built-in or external camera 121; the camera 121 photographs the emitting light source 111 to obtain imaging information of the emitting light source 111; the output device 123 is also connected to an external device 130.
The camera 121 photographs the emitting light source 111 and acquires its imaging information; the processing device 122 detects the input mode of the emitting light source 111 to determine the application mapping curve corresponding to that input mode, acquires the motion trajectory of the emitting light source 111 according to its imaging information, and obtains, through the application mapping curve, the application trajectory corresponding to the motion trajectory; the output device 123 outputs the application trajectory to the external device 130.
In the present invention, the motion trajectory comprises one or more pieces of position information of the emitting light source 111, and the application trajectory comprises one or more display positions on the screen of the external device 130 corresponding to the emitting light source 111. In addition, since the emitting light source 111 is mounted on the input device 110, the position and motion trajectory of the input device 110 are characterized by the position and motion trajectory of the emitting light source 111, and the two are used interchangeably.
For example, the camera 121 photographs the emitting light source 111 and acquires multiple frames of images of the emitting light source 111; the processing device 122 determines, according to the system default settings, that the input mode of the emitting light source 111 is the mouse input mode, and determines the mouse application mapping curve corresponding to that mouse input mode; for each frame of image, the processing device 122 obtains, through a binocular stereo vision algorithm, the three-dimensional translational position (x, y, z) of the emitting light source 111 corresponding to that frame, i.e., the three-dimensional translational motion trajectory of the emitting light source 111, where x is the horizontal coordinate of the emitting light source 111 relative to the spatial origin, y is its vertical coordinate, and z is its depth coordinate; for each three-dimensional translational position (x, y, z) in the three-dimensional motion trajectory, the processing device 122 calculates the corresponding mouse translational position (X, Y, Z) through the mouse application mapping curve, e.g. X = f(x,y,z), Y = g(x,y,z), Z = h(x,y,z), thereby obtaining the three-dimensional translational application trajectory of the emitting light source 111; the output device 123 outputs this three-dimensional translational application trajectory, i.e., each mouse translational position (X, Y, Z), to the external device 130, so that the external device 130 presents the three-dimensional translational application trajectory.
Those skilled in the art should understand that the above binocular stereo vision algorithm is only one example of obtaining the three-dimensional translational position of an emitting light source; this example is merely for conveniently illustrating the present invention and should not be construed as any limitation thereof. Other existing or future ways of calculating the three-dimensional translational position of an emitting light source, if applicable to the present invention, should also be included within the protection scope of the present invention and are incorporated herein by reference.
The processing device 122 may detect the input mode of the emitting light source 111 in various ways. For example, the input mode of the emitting light source 111 may be determined according to a control signal of the input device 110, e.g. by querying a predetermined control information table according to the control information to determine the corresponding input mode; or the input mode of the emitting light source 111 may be determined according to the current application of the external device 130: e.g., if the current application is an input box, the corresponding input mode is the handwriting input mode, and if the current application is a program menu, the corresponding input mode is the mouse input mode. The processing device 122 may detect the corresponding input mode at the initial moment of motion of the emitting light source 111, or may switch the input mode of the emitting light source 111 when the current application of the external device 130 changes.
Here, the application detecting device 120 may comprise a mapping curve library for storing application mapping curves corresponding to various input modes, such as mouse application mapping curves and handwriting application mapping curves.
For example, Fig. 2 shows several two-dimensional mouse application mapping curves. In the present invention, a two-dimensional mouse application mapping curve may be a first-order curve (i.e., a linear transformation curve), a second-order curve, or a curve divided into multiple segments. Usually, the same or different mapping curves are used in the x and y directions respectively to determine the mouse movement position or speed. In one example, for the imaged light spot of the emitting light source 111, the movement distance of the imaged spot between two adjacent frames in the x and y directions of the image is mapped to a movement distance on the screen of the external device 130; the smaller the movement distance of the imaged spot, the gentler the mapping curve, i.e., the smaller the slope, so as to suppress jitter, while the larger the movement distance of the imaged spot, the larger the slope of the mapping curve. In another example, a two-dimensional mouse application mapping curve may also be used to map the absolute position of the imaged spot to a display position on the screen.
In the present invention, the mouse application mapping curve may also comprise a three-dimensional mouse application mapping curve, whose x, y and z directions may each use the same or different mapping curves to determine the mouse movement position or speed. The general expression of a three-dimensional mouse application mapping curve may be written as: X = f(x,y,z), Y = g(x,y,z), Z = h(x,y,z), where X, Y, Z is the three-dimensional mouse position on the three-dimensional display or operation interface, x, y, z is the detected three-dimensional translational position of the emitting light source, and f, g, h are the mapping curves in the respective directions, which may be first-order curves (i.e., linear transformation curves), second-order curves, or curves divided into multiple segments. X, Y, Z may also be position changes of the mouse, such as the mouse movement distance or speed; likewise, x, y, z may be position changes of the emitting light source 111, such as the movement distance or speed of the imaged spot. Preferably, the application mapping curve of the corresponding input mode may further be set according to the specific application: for example, for ordinary applications such as web browsing, the display position of the mouse may be mapped from the position of the emitting light source 111, while for applications with higher accuracy and sensitivity requirements, such as games, the position change of the mouse may be mapped from the position change of the emitting light source 111.
Further, for three-dimensional application scenarios with still higher accuracy and sensitivity requirements, the present invention may also provide a mouse application mapping curve based on both the three-dimensional translational position and the three-dimensional rotational position of the emitting light source 111, whose general expression may be written as: X = f(x,y,z,α,β,γ), Y = g(x,y,z,α,β,γ), Z = h(x,y,z,α,β,γ). Here, referring to Fig. 3, the three-dimensional rotational position of the emitting light source 111 is denoted (α, β, γ), where α is the horizontal direction angle of the emitting light source 111 through its centroid axis, β is the vertical direction angle of the emitting light source 111 through its centroid axis, and γ is the rotation angle of the emitting light source 111 around its centroid axis, i.e., its self-rotation angle. Further, the three-dimensional rotational position of the emitting light source 111 may also be denoted θ or (θ, γ), where θ is the angle between the axis of the emitting light source 111 and the line from the emitting light source 111 to the camera 122. After the angle θ is obtained, the horizontal direction angle α and vertical direction angle β of the emitting light source 111 can be determined in combination with its three-dimensional translational position.
在此, 处理装置 122根据发射光源 111的成像信息, 获取发射光源 111在每一帧图像中的三维转动位置, 进而获得三维转动位置的三维转动运动轨迹。 例如, 按照预定的夹角拟合曲线 θ=h(r, I), 根据发射光源 111的成像光点的圆半径 r和亮度 I, 计算获得相应的夹角 θ; 或者, 根据发射光源 111的成像光点的圆半径 r和亮度 I, 通过查询预定的光点属性-夹角样本表, 获得相应的夹角 θ, 如前述样本表中尚未包括前述圆半径 r和亮度 I, 则通过各种样本内插算法, 计算获得相应的夹角 θ。 所述样本内插算法包括但不限于最近邻域内插法、 线性加权内插法、 双三次内插法 (bicubic interpolation) 等任何可适用于本发明的, 现有的或将来可能实现的内插算法。
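上述查询光点属性-夹角样本表的过程可用最近邻内插示意如下, 其中样本表的内容与数值均为假设的示例:

```python
def theta_from_samples(r, i, samples):
    """samples: [(r, I, theta), ...] 形式的光点属性-夹角样本表。
    表中无完全匹配项时, 以最近邻内插取 (r, I) 属性空间中
    距离最近的样本所对应的夹角 θ。"""
    return min(samples, key=lambda s: (s[0] - r) ** 2 + (s[1] - i) ** 2)[2]
```

例如, 对于样本表 [(3.0, 100.0, 0.0), (2.5, 80.0, 30.0), (2.0, 60.0, 60.0)], 测得 (r, I) = (2.1, 62.0) 时取最近样本的 θ = 60.0。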
其中, 可按一定步长在不同夹角 θ下测定足够多的样本, 即 r和 I 的值 (或其他可用的光点属性), 以建立前述光点属性-夹角样本表, 或者, 以一次、 二次或多次曲线按照最小误差准则拟合 r、 I与 θ的映射关系, 以获得前述夹角拟合曲线。 采样时, 应选用在有效工作范围内其光学特性可通过 r和 I的组合唯一确定夹角 θ的 LED光源。
本领域的技术人员应能理解, 上述夹角拟合曲线及样本内插算法 仅为获得发射光源的三维转动位置的示例,该种举例仅为简便地阐述本 发明之用, 而不应理解为对本发明的任何限制, 其他现有的或今后可 能出现的计算发射光源的三维转动位置的方式如可适用于本发明, 也应 包含在本发明保护范围以内, 并以引用方式包含于此。
在一个示例中, Z=1, 即相对于二维操作界面, 鼠标只在 X,Y方向移动; 此时映射曲线可表示为:

X = w1·fp(x,y,z) + w2·fz(α,β,γ), Y = w1·gp(x,y,z) + w2·gz(α,β,γ)

其中, fp、gp为对三维平动位置的映射函数, fz、gz为对三维转动位置的映射函数, w1、w2分别为三维平动位置和三维转动位置的影响权值。 同样地, x,y,z,α,β,γ也可为在相应方向的变化, 如平动或转动速度, 而不是实际的位置值; 这对于如 3D TV或 3D游戏之类的应用更有帮助, 如根据发射光源 111的转动速度, 进行菜单的转动, 或者根据发射光源 111的平动和转动速度, 更为精确地映射 3D游戏中人物的运动。
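该平动与转动加权组合的映射可示意如下。 其中 fp、gp、fz、gz 取假设的线性形式, 权值 w1、w2 亦仅为示意值, 非本发明的限定实现:

```python
def map_with_rotation(x, y, z, alpha, beta, w1=0.7, w2=0.3):
    """X = w1*fp(x,y,z) + w2*fz(α,β,γ) 的一个线性示例;
    返回 Z=1, 对应鼠标只在 X,Y 方向移动的二维操作界面。"""
    fp, gp = 2.0 * x, 2.0 * y            # 平动位置的映射函数(假设)
    fz, gz = 10.0 * alpha, 10.0 * beta   # 转动位置的映射函数(假设)
    return w1 * fp + w2 * fz, w1 * gp + w2 * gz, 1.0
```

例如, 平动位置 (1, 2, 3) 与转动角 (α, β) = (1.0, 2.0) 组合后得到 X ≈ 4.4, Y ≈ 8.8, Z = 1。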
优选地, 应用映射曲线还可以基于发射光源 111 的历史状态信息来调整。 在此, 以三维应用映射曲线的调整为例进行说明。 基于相关历史状态信息调整的三维应用映射曲线可进一步表示为: X'=Dx*X=Dx*f, Y'=Dy*Y=Dy*g, Z'=Dz*Z=Dz*h; 其中 Dx, Dy, Dz是由发射光源 111 的历史状态信息, 如其最近使用状态, 所调整的映射曲线放大因子。 需要说明的是, 本领域技术人员应能理解, 最近使用状态不仅可用于调整前述放大因子, 在一些应用中也可用于选取不同的映射曲线 f、 g、 h, 从而获得最佳的定位体验。
例如, 通过检测成像光点的大小或发射光源 111相对摄像头 121的 距离来调整鼠标应用映射曲线的放大因子。 当发射光源 111的距离近, 则映射曲线的放大因子小; 当发射光源 111的距离远, 则映射曲线的放 大因子大,从而用户在不同距离使用输入设备的体验是一致的。优选地, 还可通过人脸检测来估计发射光源 111的距离, 以调整映射曲线的放大 因子。 例如, 根据成像信息中与成像光点的运动轨迹距离最近的人脸特 征信息, 如人脸的大小、 双眼间的距离、 像素宽度等, 来估计发射光源 111的距离。
在一个示例中, 放大因子的计算公式如下:

curF = (1−λ)*preF + λ*(z/Db)

其中:
curF: 本帧使用的放大因子;
preF: 上一帧使用的放大因子, 若为第一帧则取 1;
λ: 用户设定的一个参数, 其越大, 放大因子的变化越快, 其越小, 放大因子受前帧的积累影响越大;
z: 输入设备 110到应用检测设备 120的距离, 即发射光源 111相对空间原点的纵深坐标;
Db: 多个距离 z的均值, 如可预设为 3.0米。
在通过上述公式计算获得 curF后, 将其与 X方向的 f和 Y方向的 g 分别做乘法运算, 以获得基于最近使用状态的三维应用映射曲线。
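按上述 (由原文残缺公式重构的) 计算式, 放大因子的逐帧更新可示意如下, 其中参数 λ 与 Db 的取值为假设的示例:

```python
def update_factor(pre_f, z, lam=0.3, d_b=3.0):
    """curF = (1-λ)*preF + λ*(z/Db): 距离 z 越远, 放大因子越大;
    λ 越小, 放大因子受前帧的积累影响越大。 第一帧时 preF 取 1。"""
    return (1.0 - lam) * pre_f + lam * (z / d_b)
```

例如, 当 z 恰等于均值 Db 时, 放大因子保持为 1; 当 z 大于 Db 时, 放大因子逐帧增大, 使用户在较远距离的操作体验与近距离一致。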
在另一示例中, 还根据最近一段时间输入设备 110的移动速度来调 整鼠标应用映射曲线的放大因子。如果输入设备 110最近的移动速度小, 则鼠标应用映射曲线的放大因子随之变小, 而如果输入设备 110最近的 移动速度大, 则鼠标应用映射曲线的放大因子随之变大。 从而, 当用户 连续进行小范围的精细操作时, 小的放大因子有助于精确定位; 而当用 户大范围快速移动时, 大的放大因子又有利于迅速移动。
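按最近移动速度调整放大因子的策略可示意如下, 其中速度区间与因子范围均为假设值:

```python
def speed_factor(speed, lo=0.5, hi=3.0, v_min=1.0, v_max=20.0):
    """移动速度小时放大因子小(利于小范围精确定位),
    速度大时放大因子大(利于大范围快速移动); 区间内线性过渡。"""
    if speed <= v_min:
        return lo
    if speed >= v_max:
        return hi
    t = (speed - v_min) / (v_max - v_min)
    return lo + t * (hi - lo)
```

该设计使放大因子随速度单调不减: 精细操作时定位稳定, 快速移动时光标响应迅速。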
此外, 对于手写应用映射曲线, 其可为一次线性曲线, 包括二维应用映射曲线和三维应用映射曲线。 与鼠标应用映射曲线类似地, 手写应用映射曲线的输入同样可以发射光源 111的位置或位置变化 (如移动距离或速度), 相应映射为手写输入的屏幕位置或位置变化。 手写应用映射曲线的变换系数, 即前述一次线性曲线的斜率, 可根据不同的应用进行设置。 例如, 对于通常的手写输入应用, 如在输入框中输入文字, 其相应的变换系数为 5, 即将发射光源 111的移动距离 5倍映射为屏幕中输入焦点的移动距离; 对于如画板等手写输入应用, 其相应的变换系数可为 1, 即将发射光源 111的位置以及运动轨迹直接映射为屏幕中输入焦点的位置及应用轨迹。
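手写应用映射曲线作为一次线性曲线的形式可示意如下, 变换系数依应用取 5 (输入框) 或 1 (画板):

```python
def handwriting_map(dx, dy, coeff=5.0):
    """一次线性曲线: 屏幕输入焦点位移 = 变换系数 × 发射光源位移。"""
    return coeff * dx, coeff * dy
```

例如, 变换系数为 5 时, 光源位移 (2, -1) 映射为屏幕位移 (10, -5); 变换系数为 1 时直接等比映射。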
优选地, 本发明中, 对于简单的应用映射曲线, 可以根据发射光源 111 的位置信息, 直接计算获得相应的显示位置。 而对于复杂的应用映射曲线, 可以预先生成表格, 以查表方式根据发射光源 111的位置信息, 获得相应的显示位置。
发射光源 111 包括但不限于任何可适用于本发明的, 各种点光源、 面光源等发光物, 诸如 LED可见光源、 LED红外光源、 OLED光源等。 为简化说明起见,本发明多以 LED光源为例对发射光源 111进行阐述, 然而, 本领域的技术人员应能理解, 该种举例仅为简便地阐述本发明 之用, 而不应理解为对本发明的任何限制。
摄像头 121 包括但不限于任何可适用于本发明的, 能够感应和采集 诸如 LED可见光、 红外线或手势等图像的图像采集设备; 例如, 摄像头 121具备 1 )足够高的采集帧率, 如 15fps或以上, 2 )合适的分辨率如 640x480或以上, 3 )足够短的曝光时间, 如 1/500或更短。
处理装置 122包括但不限于任何可适用于本发明的, 能够按照事先存储的程序, 自动进行数值计算和 /或各种信息处理的电子设备, 其硬件包括但不限于微处理器、 FPGA、 DSP、 嵌入式设备等。 进一步地, 本发明中, 应用检测设备 120可以包括一个或多个处理装置 122, 当处理装置 122有多个时, 每个处理装置 122可以被分配执行一个特定的信息处理操作, 以实现并行计算, 从而提高检测效率。
此外, 外接设备 130包括但不限于电视机、 机顶盒或移动设备等。 输出装置 123与外接设备 130通过各种有线或无线的通信方式传输数据和 /或信息, 诸如输出装置 123通过 VGA接口、 USB接口等硬件接口以有线方式与外接设备 130进行通信, 或者输出装置 123通过蓝牙、 WIFI 等无线方式与外接设备 130进行通信。 本领域技术人员应能理解上述外接设备及其与输出装置之间的通信方式仅为举例, 其他现有的或今后可能出现的外接设备或其与输出装置之间的通信方式如可适用于本发明, 也应包含在本发明保护范围以内, 并以引用方式包含于此。
图 4为根据本发明另一个方面的方法流程图, 示出一种为发射光源 的运动轨迹映射其应用轨迹的过程。
在此, 应用输入系统 100包括输入设备 110和应用检测设备 120, 其中, 输入设备 110和应用检测设备 120分别置于两端。 输入设备 110 包括至少一个发射光源 111。 应用检测设备 120内置或外接至少一个摄 像头 121 ; 摄像头 121拍摄发射光源 111 , 以获得发射光源 111的成像 信息; 该应用检测设备 120还与外接设备 130相连接。
配合参阅图 1和图 4, 在步骤 S401中, 摄像头 121拍摄获得发射光 源 111的成像信息; 在步骤 S402中, 应用检测设备 120检测该发射光 源 111的输入模式, 以确定该输入模式所对应的应用映射曲线; 在步骤 S403中, 应用检测设备 120根据该发射光源 111的成像信息, 获取该发 射光源 111的运动轨迹; 在步骤 S404中, 应用检测设备 120根据该运 动轨迹, 通过所确定的应用映射曲线, 获得与该运动轨迹相对应的应用 轨迹; 在步骤 S405中, 应用检测设备 120将该应用轨迹输出至外接设 备 130。
例如,发射光源 111为 LED光源,其安装于一输入控制设备 110上, 如遥控器; 用户通过操纵该遥控器, 以面对摄像头 121的方向在空间中 进行各种动作。 摄像头 121内置于应用检测设备 120。 在步骤 S401中, 摄像头 121采用三倍于该 LED光源闪烁频率的帧率, 拍摄该 LED光源 的图像, 以获得该 LED光源的成像信息; 在步骤 S402中, 应用检测设 备 120根据该 LED光源的闪烁频率, 查询预定的输入模式映射表,确定 该 LED光源当前的输入模式,如鼠标输入模式, 并获取鼠标输入模式所 对应的应用映射曲线; 在步骤 S403中, 应用检测设备 120根据该 LED 光源的成像信息, 获取该 LED光源的运动轨迹, 如该 LED光源的多个 位置信息; 在步骤 S404中, 应用检测设备 120根据该 LED光源的运动 轨迹, 通过前述应用映射曲线, 获得该运动轨迹所对应的应用轨迹, 如 在外接设备 130呈现的鼠标运动轨迹; 在步骤 S405中, 应用检测设备 120通过其与外接设备 130连接的 VGA接口,将该鼠标运动轨迹输出至 该外接设备 130, 以在该外接设备 130的屏幕上呈现该 LED光源所对应 的鼠标运动轨迹。
优选地, 应用检测设备 120还检测发射光源 111 当前的输入状态, 当该输入状态所对应的等待时间期满, 再检测该发射光源 111的成像信 息, 获取其运动轨迹, 进而获得与该运动轨迹相对应的应用轨迹。
或者, 应用检测设备 120检测发射光源 111 当前的输入状态, 当该 输入状态所对应的等待时间期满, 再根据该发射光源 111的运动轨迹, 通过其应用模式所对应的应用映射曲线, 获得相应的应用轨迹。
在此, 应用检测设备 120可利用发射光源 111的屏幕输入位置或发 射光源 111的移动模式来检测发射光源 111当前的输入状态, 如输入态 或等待态。 例如, 应用检测设备 120可以使用发射光源 111的移动模式 来检测输入态和等待态: 当发射光源 111或输入光标的移动速度或距离 大于一阈值时为输入态, 否则为等待态。
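上述基于移动速度阈值的输入态/等待态检测可示意如下, 其中阈值为假设值:

```python
def input_state(speed, threshold=1.5):
    """发射光源或输入光标的移动速度大于阈值时判定为输入态,
    否则判定为等待态。"""
    return "input" if speed > threshold else "wait"
```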
图 5为根据本发明一个优选实施例的方法流程图, 示出一种为发射 光源的运动轨迹映射其应用轨迹的过程。
在此, 应用输入系统 100包括输入设备 110和应用检测设备 120, 其中, 输入设备 110和应用检测设备 120分别置于两端。 输入设备 110 包括至少一个发射光源 111。 应用检测设备 120内置或外接至少一个摄 像头 121 ; 摄像头 121拍摄发射光源 111 , 以获得发射光源 111的成像 信息; 该应用检测设备 120还与外接设备 130相连接。
配合参阅图 1和图 5, 在步骤 S501中, 摄像头 121拍摄获得发射光 源 111的成像信息; 在步骤 S502中, 应用检测设备 120检测该发射光 源 111的输入模式, 以确定该输入模式所对应的应用映射曲线; 在步骤 S503中, 应用检测设备 120根据该发射光源 111的成像信息, 获取该发 射光源 111的运动轨迹; 在步骤 S506中, 应用检测设备 120根据预定 搜索时间范围内所述运动轨迹的运动特征峰值, 纠正所述运动轨迹的起 始点, 以用于纠正所述应用轨迹的起始点; 在步骤 S507 中, 应用检测 设备 120自获得输入设备 110的输入操作的起始时刻,根据输入设备 110 的操作相关信息,进行相应的输入操作纠正, 以获得纠正后的输入操作, 直至满足预定的输入操作纠正停止条件, 其中, 所述预定的输入操作纠 正停止条件包括发射光源 111的运动时间达到预定的纠正延迟时间阈值 和 /或发射光源 111的运动轨迹的运动特征值达到其相应的运动特征值阈 值; 在步骤 S504中, 应用检测设备 120根据该纠正后的运动轨迹, 通 过所确定的应用映射曲线, 获得与该纠正后的运动轨迹相对应的应用轨 迹; 在步骤 S505中, 应用检测设备 120将该应用轨迹输出至外接设备 130。
本发明中, 应用检测设备 120根据预定搜索时间范围内成像光点的 运动轨迹的运动特征峰值, 纠正所述运动轨迹的起始点, 以实现对诸如 鼠标位置、 手写位置的纠正。 以鼠标位置纠正为例, 记录最近一段时间 内, 如 500ms, 所检测到的每一帧图像中成像光点的位置; 当收到来自 用户的控制信息, 如指示鼠标点击操作时, 应用检测设备 120从该点击 时刻起向前以一个最大搜索时间范围, 如 100ms或 200ms, 计算已记录 的每一帧图像中该成像光点的运动特征; 根据该等运动特征, 计算鼠标 点击开始时刻并取此刻该成像光点的位置所对应的鼠标位置作为鼠标 点击的真正位置, 如采用搜索时间范围内出现所用运动特征值的峰值处 或其前一帧作为点击开始时刻, 并将相应的鼠标位置作为真正的鼠标点 击位置。 其中, 所述运动特征包括但不限于, 每一帧图像中成像光点的 速度、 加速度、 或垂直方向的速度、 加速度, 以及其在相邻帧的变化量 等。
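上述按运动特征峰值回溯点击开始时刻的搜索可示意如下: 在最大搜索时间范围 (此处以帧数近似) 内取速度峰值处的前一帧作为点击开始时刻, 帧数与速度值均为假设的示例:

```python
def click_start_index(speeds, search=10):
    """speeds: 按时间顺序记录的每帧成像光点速度(运动特征之一)。
    在最近 search 帧内搜索速度峰值, 取峰值的前一帧作为点击开始时刻,
    其对应的鼠标位置即作为鼠标点击的真正位置。"""
    window = speeds[-search:]
    offset = len(speeds) - len(window)
    peak = offset + max(range(len(window)), key=lambda k: window[k])
    return max(peak - 1, 0)
```

例如, 速度序列 [0.1, 0.1, 0.2, 3.0, 0.5] 的峰值在第 3 帧 (从 0 计), 点击开始时刻取其前一帧, 即第 2 帧。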
此外, 应用检测设备 120 自获得输入设备 110的输入操作的起始时 刻, 根据输入设备 110的操作相关信息, 进行相应的输入操作纠正, 以 获得纠正后的输入操作, 如将鼠标拖拽操作解释为鼠标点击操作, 或将 鼠标拖拽 +点击操作解释为鼠标双击操作等, 直至满足预定的输入操作 纠正停止条件, 如发射光源 111的运动时间达到预定的纠正延迟时间阈 值和 /或发射光源 111的运动轨迹的运动特征值达到其相应的运动特征值 阈值。
其中, 所述操作相关信息包括但不限于任何可适用于本发明的, 输入设备 110在当前的输入操作状态下进行的后续相关操作或运动, 诸如输入设备 110在鼠标点击状态下进行运动, 从而将鼠标点击操作转换为鼠标拖拽操作, 或者, 输入设备 110在鼠标拖拽状态下再次点击, 从而将鼠标拖拽操作转换为鼠标点击操作等。 所述输入操作纠正按照预定的输入操作映射关系, 将用户的一项或多项输入操作映射为其他的输入操作, 诸如将鼠标拖拽操作解释为鼠标点击操作, 或将鼠标拖拽+点击操作解释为鼠标双击操作等, 以防止鼠标或输入焦点在屏幕上出现影响用户使用体验的抖动。
例如, 应用检测设备 120 自获得输入设备 110的输入操作的起始时 刻, 如在步骤 S506 中确定鼠标点击位置后, 在该鼠标点击状态下, 用 户操控输入设备 110发生轻微抖动, 从而将该鼠标点击操作转换为鼠标 拖拽操作, 应用检测设备 120按照预定的输入操作映射关系, 将该鼠标 拖拽操作映射回鼠标点击操作, 同时检测是否满足预定的输入操作纠正 停止条件, 当发射光源 111的运动时间达到预定的纠正延迟时间阈值和 / 或发射光源 111 的运动轨迹的运动特征值达到其相应的运动特征值阈 值, 应用检测设备 120停止输入操作纠正, 并恢复先前对发射光源 111 的运动轨迹的计算。
关于预定的输入操作纠正停止条件, 应用检测设备 120计算输入设 备 110 的输入操作的起始时刻之后的每一帧图像中成像光点的运动特 征, 当一个或多个运动特征超过其对应的预定阈值时, 停止输入操作纠 正, 如成像光点的移动位移足够大时停止输入操作纠正。 或者, 预设一 个最大防抖延迟时间, 如 100至 200ms, 自发射光源 111开始运动起, 当达到最大防抖延迟时间时, 停止输入操作纠正。 其中, 所述运动特征 包括但不限于, 每一帧图像中成像光点的速度、 加速度、 或垂直方向的 速度、 加速度, 以及其在相邻帧的变化量等, 或者相对于点击时刻该成 像光点的初始位置的位移、 或该位移的水平或垂直分量。
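输入操作纠正停止条件的判定可示意如下, 其中延迟阈值与位移阈值为假设值:

```python
def should_stop_correction(elapsed_ms, displacement,
                           max_delay_ms=200.0, max_disp=8.0):
    """达到最大防抖延迟时间(如 100 至 200ms), 或成像光点相对
    点击时刻初始位置的位移超过阈值时, 停止输入操作纠正。"""
    return elapsed_ms >= max_delay_ms or abs(displacement) >= max_disp
```

两个条件任一满足即停止纠正, 并恢复对发射光源运动轨迹的正常计算。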
需要说明的是, 本领域技术人员应能理解, 上述运动轨迹起始点的 纠正操作与输入操作纠正不是必须在本发明的一个实施例中执行的, 上 述运动轨迹起始点的纠正操作与输入操作纠正可以分别适用于本发明 的不同实施例, 以在各具体实施例中实现对运动轨迹起始点的纠正或输 入操作的纠正。
例如, 应用检测设备 120根据外接设备 130的当前应用, 如网页浏 览, 确定发射光源 111的输入模式为鼠标输入模式; 应用检测设备 120 自摄像头 121获取该发射光源 111的成像信息, 并根据该成像信息, 计 算发射光源 111的运动轨迹; 应用检测设备 120自该运动轨迹的起始时 刻, 向前以一个最大搜索时间范围, 如 100ms, 计算前 100ms中成像光 点在每一帧的速度, 将速度峰值所对应的帧或其前一帧的位置作为该运 动轨迹的起始位置, 以修正该运动轨迹, 以用于后续相应修正应用轨迹; 随后, 应用检测设备 120根据该重新确定的运动轨迹, 通过鼠标应用映 射曲线, 获得相应的应用轨迹, 并输出至外接设备 130。
其中, 在鼠标输入模式下, 发射光源 111 的当前位置会被应用检测 设备 120解释为鼠标当前位置。 如果用户在操控输入设备 110的过程中 有轻微抖动, 相应的鼠标位置也会发生轻微抖动, 这可能会导致应用检 测设备 120在错误位置执行鼠标点击操作或将鼠标点击操作解释为鼠标 拖拽操作。 步骤 S506和步骤 S507可针对这两个问题分别进行点击位置 纠正和点击抖动纠正。
图 6为根据本发明另一个优选实施例的方法流程图, 示出一种为发 射光源的运动轨迹映射其应用轨迹的过程。
在此, 应用输入系统 100包括输入设备 110和应用检测设备 120, 其中, 输入设备 110和应用检测设备 120分别置于两端。 输入设备 110 包括至少一个发射光源 111。 应用检测设备 120内置或外接至少一个摄 像头 121 ; 摄像头 121拍摄发射光源 111 , 以获得发射光源 111的成像 信息; 该应用检测设备 120还与外接设备 130相连接。
配合参阅图 1和图 6, 在步骤 S601中, 摄像头 121拍摄获得发射光 源 111的成像信息; 在步骤 S602中, 应用检测设备 120检测该发射光 源 111的输入模式, 以确定该输入模式所对应的应用映射曲线; 在步骤 S603中, 应用检测设备 120根据发射光源 111的成像信息, 获取该发射 光源 111的运动轨迹;在步骤 S604中,应用检测设备 120根据该运动轨 迹,通过所确定的应用映射曲线,获得与该运动轨迹相对应的应用轨迹; 在步骤 S606中, 应用检测设备 120根据预定搜索时间范围内所述应用 轨迹的运动特征峰值, 纠正所述应用轨迹的起始点; 在步骤 S607 中, 应用检测设备 120自获得输入设备 110的输入操作的起始时刻, 根据输 入设备 110的操作相关信息, 进行相应的输入操作纠正, 以获得纠正后 的输入操作, 直至满足预定的输入操作纠正停止条件, 其中, 所述预定 的输入操作纠正停止条件包括发射光源 111的运动时间达到预定的纠正 延迟时间阈值和 /或发射光源 111的应用轨迹的运动特征值达到其相应的 运动特征值阈值; 在步骤 S605中, 应用检测设备 120将该纠正后的应 用轨迹输出至外接设备 130。
本发明中, 应用检测设备 120根据预定搜索时间范围内成像光点的 应用轨迹的运动特征峰值, 纠正所述应用轨迹的起始点, 以实现对诸如 鼠标位置、 手写位置的纠正。 以鼠标位置纠正为例, 记录最近一段时间 内, 如 500ms, 所检测到的鼠标位置; 当收到来自用户的控制信息, 如 指示鼠标点击操作时, 应用检测设备 120从该点击时刻起向前以一个最 大搜索时间范围, 如 100ms或 200ms, 计算已记录的每一帧图像所对应 的鼠标运动特征; 根据该等鼠标运动特征, 计算鼠标点击开始时刻并取 此刻的鼠标位置作为鼠标点击的真正位置, 如采用搜索范围内出现所用 鼠标运动特征值的峰值处或其前一帧作为点击开始时刻, 并将相应的鼠 标位置作为真正的鼠标点击位置。 其中, 所述鼠标运动特征包括但不限 于, 每一帧图像中鼠标移动的速度、 加速度、 或垂直方向的速度、 加速 度, 以及其在相邻帧的变化量等。
此外, 应用检测设备 120 自获得输入设备 110的输入操作的起始时 刻, 根据输入设备 110的操作相关信息, 进行相应的输入操作纠正, 以 获得纠正后的输入操作, 如将鼠标拖拽操作解释为鼠标点击操作, 或将 鼠标拖拽 +点击操作解释为鼠标双击操作等, 直至满足预定的输入操作 纠正停止条件, 如发射光源 111的运动时间达到预定的纠正延迟时间阈 值和 /或发射光源 111的应用轨迹的运动特征值达到其相应的运动特征值 阈值。
例如, 应用检测设备 120 自获得输入设备 110的输入操作的起始时 刻, 如该输入操作为鼠标拖拽操作, 在该鼠标拖拽状态下, 用户再次操 控输入设备 110进行鼠标点击操作, 从而将该鼠标拖拽操作转换为在拖 拽停止位置的鼠标点击操作, 应用检测设备 120按照预定的输入操作映 射关系, 将该鼠标拖拽 +点击操作映射为在原鼠标拖拽起始位置的鼠标 双击操作, 同时检测是否满足预定的输入操作纠正停止条件, 当发射光 源 111的运动时间达到预定的纠正延迟时间阈值和 /或发射光源 111的应 用轨迹的运动特征值达到其相应的运动特征值阈值, 应用检测设备 120 停止输入操作纠正, 并恢复先前对发射光源 111的应用轨迹的计算。
关于预定的输入操作纠正停止条件, 应用检测设备 120计算输入设 备 110的输入操作的起始时刻之后的每一帧图像中所对应的鼠标运动特 征, 当一个或多个鼠标运动特征超过其对应的预定阈值时, 停止输入操 作纠正, 如鼠标移动位移足够大时停止输入操作纠正。 或者, 预设一个 最大防抖延迟时间, 如 100至 200ms, 自发射光源 111开始运动起, 当 达到最大防抖延迟时间时, 停止输入操作纠正。 其中, 所述鼠标运动特 征包括但不限于, 每一帧图像所对应的鼠标移动的速度、 加速度、 或垂 直方向的速度、 加速度, 以及其在相邻帧的变化量等, 或者相对于鼠标 点击位置的位移、 或位移的水平或垂直分量。
需要说明的是, 本领域技术人员应能理解, 上述应用轨迹起始点的 纠正操作与输入操作纠正不是必须在本发明的一个实施例中执行的, 上 述应用轨迹起始点的纠正操作与输入操作纠正可以分别适用于本发明 的不同实施例, 以在具体实施例中实现的对应用轨迹起始点的纠正或输 入操作的纠正。
例如, 应用检测设备 120根据外接设备 130的当前应用, 如网页浏览, 确定发射光源 111的输入模式为鼠标输入模式, 并确定相应的鼠标应用映射曲线; 应用检测设备 120根据该发射光源 111的成像信息, 获取其运动轨迹, 通过该鼠标应用映射曲线, 计算对应的应用轨迹; 应用检测设备 120自该应用轨迹的起始时刻, 向前以一个最大搜索时间范围, 如 100ms, 计算已记录的每一帧图像中的鼠标运动特征, 如计算前 100ms中每一帧图像中的鼠标移动速度, 将速度峰值所对应的帧或其前一帧的鼠标位置作为该应用轨迹的起始位置, 以修正该应用轨迹; 随后, 应用检测设备 120在获得输入设备 110的鼠标点击操作后, 用户操控该输入设备 110进行的运动使得该鼠标点击操作转换为鼠标拖拽操作, 应用检测设备 120将该鼠标拖拽操作映射回鼠标点击操作, 并检测是否满足预定的输入操作纠正停止条件, 当发射光源 111的应用轨迹中某一位置相对于运动起始位置的位移或其水平或垂直分量超过对应阈值时, 停止输入操作纠正, 并恢复先前对发射光源 111的应用轨迹的计算; 最后, 应用检测设备 120将纠正后的应用轨迹输出至外接设备 130。

图 7为根据本发明再一个优选实施例的方法流程图, 示出一种为发射光源的运动轨迹映射其应用轨迹的过程。
在此, 应用输入系统 100包括输入设备 110和应用检测设备 120, 其中, 输入设备 110和应用检测设备 120分别置于两端。 输入设备 110 包括至少一个发射光源 111。 应用检测设备 120内置或外接至少一个摄 像头 121 ; 摄像头 121拍摄发射光源 111 , 以获得发射光源 111的成像 信息; 该应用检测设备 120还与外接设备 130相连接。
配合参阅图 1和图 7, 在步骤 S701中, 摄像头 121拍摄获得发射光 源 111的成像信息; 在步骤 S702中, 应用检测设备 120检测该发射光 源 111的输入模式, 以确定该输入模式所对应的应用映射曲线; 在步骤 S7031中, 应用检测设备 120根据该发射光源 111的成像信息, 获取该 发射光源 111的运动轨迹; 在步骤 S7032中, 应用检测设备 120根据所 述运动轨迹的历史运动特征信息, 确定发射光源 111的预期位置信息, 以用于平滑所述运动轨迹; 在步骤 S704中, 应用检测设备 120根据该 运动轨迹, 通过所确定的应用映射曲线, 获得与该运动轨迹相对应的应 用轨迹; 在步骤 S705中, 应用检测设备 120将该应用轨迹输出至外接 设备 130。
例如, 在步骤 S7032中, 应用检测设备 120执行关于所述运动轨迹的内插平滑操作。 具体地, 预定一个最大输出时间间隔, 如 10ms, 当超过该最大输出时间间隔时, 应用检测设备 120仍无发射光源 111的应用轨迹输出; 应用检测设备 120根据该发射光源 111的运动轨迹的历史运动特征信息, 诸如最近一次检测到的发射光源 111的位置、 速度、 加速度等, 确定该发射光源 111的预期位置信息, 如 x'=x+vx*t, y'=y+vy*t, 其中 v为运动速度, x'、 y'为预期位置信息; 随后, 应用检测设备 120根据该预期位置信息, 通过相应的应用映射曲线, 获得与该预期位置信息相对应的预期应用轨迹。
由于摄像头可采集图像的帧率是有限的,在帧率较低而发射光源 111 高速运动时,应用检测设备 120检测获得的二维 /三维运动轨迹会有抽样 率不足的情形, 这可能会导致用户体验下降。 例如, 由二维 /三维运动轨 迹所产生的鼠标应用轨迹会有顿挫感而不够流畅, 根据上述过程通过预 期位置信息进行内插, 以增加运动轨迹的平滑度, 从而使得相应的鼠标 应用轨迹也平滑、 顺畅, 而不会有顿挫感。
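上述以预期位置内插平滑运动轨迹的做法可示意如下:

```python
def predict_position(x, y, vx, vy, dt):
    """当超过最大输出时间间隔仍无新的检测结果时,
    按最近一次位置与速度外推预期位置: x'=x+vx*t, y'=y+vy*t。"""
    return x + vx * dt, y + vy * dt
```

例如, 最近检测到位置 (1, 2)、 速度 (10, -5), 经过 0.5 个时间单位后的预期位置为 (6.0, -0.5); 以此内插可消除低帧率下应用轨迹的顿挫感。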
图 8为根据本发明又一个优选实施例的方法流程图, 示出一种为发 射光源的运动轨迹映射其应用轨迹的过程。
在此, 应用输入系统 100包括输入设备 110和应用检测设备 120, 其中, 输入设备 110和应用检测设备 120分别置于两端。 输入设备 110 包括至少一个发射光源 111 , 发射光源 111 的输入模式包括手写输入模 式。 应用检测设备 120内置或外接至少一个摄像头 121 ; 摄像头 121拍 摄发射光源 111 , 以获得发射光源 111的成像信息;该应用检测设备 120 还与外接设备 130相连接。
配合参阅图 1和图 8, 在步骤 S801中, 摄像头 121拍摄获得发射光 源 111的成像信息; 在步骤 S802中, 应用检测设备 120检测该发射光 源 111的输入模式, 以确定该输入模式所对应的应用映射曲线; 在步骤 S803中, 应用检测设备 120根据该发射光源 111的成像信息, 获取该发 射光源 111的运动轨迹; 在步骤 S804中, 应用检测设备 120根据该运 动轨迹, 通过所确定的应用映射曲线, 获得与该运动轨迹相对应的应用 轨迹; 在步骤 S805中, 应用检测设备 120将该应用轨迹输出至外接设 备 130; 在步骤 S808中, 应用检测设备 120根据所述应用轨迹, 查询预 定的字符库, 以获得与所述应用轨迹相对应的字符; 在步骤 S809 中, 应用检测设备 120将所述字符输出至该外接设备 130。
例如, 应用检测设备 120检测该发射光源 111的输入模式为手写输入模式, 并确定相应的应用映射曲线为一条斜率为 5 (即变换系数) 的一次线性曲线; 应用检测设备 120根据该发射光源 111的成像信息, 如 LED光源的成像光点在每一帧图像中的位置信息, 通过目标跟踪方法, 由成像光点在连续的图像序列中形成的运动轨迹获得该发射光源 111的运动轨迹; 在根据该发射光源 111的运动轨迹计算及输出相应的应用轨迹后, 应用检测设备 120还根据该应用轨迹, 查询预定的字符库, 以获得与该应用轨迹相对应的字符, 并将所述字符输出至该外接设备 130。
在此, 所述变换系数可以根据多个用户的统计习惯确定, 或由用户或应用检测设备 120预设, 或由应用检测设备 120根据当前用户的使用习惯对默认值调整确定。 所述应用映射曲线可以根据成像光点相对于一固定点 (如图像左上点) 的位置, 确定相应的屏幕输入位置; 也可以根据成像光点的移动距离或速度, 来确定相应的屏幕轨迹的移动距离或速度, 如将发射光源 111的移动距离或速度线性对应于输入笔画的位置和长度。
优选地, 应用检测设备 120还检测发射光源 111当前的输入状态, 如输入态或等待态, 当该输入状态所对应的等待时间期满, 根据所确定的应用轨迹, 查询预定的字符库, 以获得与该应用轨迹相对应的字符, 并将字符输出至外接设备 130。

例如, 在输入态, 笔画间等待时间为 T1, 在等待态, 笔画间等待时间为 T2, 且 T2<T1。 当等待时间期满, 则认为用户写完了一个字, 开始自动进行字符识别, 如根据由发射光源 111的运动轨迹所确定的应用轨迹, 查询预定的字符库, 以获得与该应用轨迹相对应的字符。 当从等待态转为输入态, 而用户并没有输入笔画, 等待时间可从上次笔画输入结束开始记录等待态时间, 以防止系统无限等待, 即最长笔画间等待时间不超过 T1。
又如, 当应用检测设备 120使用发射光源 111的屏幕输入位置来检测输入态和等待态时: 当该屏幕输入位置, 如输入光标的位置, 在手写输入区域以内, 笔画间等待时间为 T1, 当屏幕输入位置在手写输入区域以外, 笔画等待时间为 T2, 且 T2<T1。 当等待时间期满, 则认为用户写完了一个字, 开始自动进行字符识别。 从而, 当用户还在输入且屏幕输入位置处于手写输入区域内, 等待时间长; 而当用户把输入光标移出手写输入区域, 等待时间短。
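依屏幕输入位置选取笔画间等待时间的逻辑可示意如下, 其中 T1、T2 为满足 T2<T1 的假设值:

```python
def stroke_wait_time(in_input_region, t1=1.0, t2=0.4):
    """屏幕输入位置在手写输入区域内时, 用较长的笔画间等待时间 T1;
    区域外时用较短的 T2; 等待时间期满即开始自动字符识别。"""
    return t1 if in_input_region else t2
```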
在此, 手写输入区域可为外接设备 130屏幕上的一个固定区域, 如 屏幕的中心区域, 也可为根据应用轨迹的起始点来动态确定的区域。 例 如, 根据应用轨迹的起始点, 即手写输入的初始落笔处, 往上、 下、 左、 右扩展一定位移的区域, 来确定手写输入模式所对应的手写输入区域。 区域大小可为用户写一个字所需的足够空间。
图 9为根据本发明还一个优选实施例的方法流程图, 示出一种为发 射光源的运动轨迹映射其应用轨迹的过程。
在此, 应用输入系统 100包括输入设备 110和应用检测设备 120, 其中, 输入设备 110和应用检测设备 120分别置于两端。 输入设备 110 包括至少一个发射光源 111 , 发射光源 111 的输入模式包括鼠标输入模 式。 应用检测设备 120内置或外接至少一个摄像头 121 ; 摄像头 121拍 摄发射光源 111 , 以获得发射光源 111的成像信息;该应用检测设备 120 还与外接设备 130相连接。
配合参阅图 1和图 9, 在步骤 S901中, 摄像头 121拍摄获得发射光 源 111的成像信息; 在步骤 S902中, 应用检测设备 120检测该发射光 源 111的输入模式, 以确定该输入模式所对应的应用映射曲线; 在步骤
S903中, 应用检测设备 120根据该发射光源 111的成像信息, 获取该发 射光源 111的运动轨迹; 在步骤 S904中, 应用检测设备 120根据该运 动轨迹, 通过所确定的应用映射曲线, 获得与该运动轨迹相对应的应用 轨迹; 在步骤 S905中, 应用检测设备 120将该应用轨迹输出至外接设 备 130; 在步骤 S9010中, 应用检测设备 120根据发射光源 111的成像 信息, 获取发射光源 111发射的控制信息, 并通过查询预定的控制信息 表, 获得与所述控制信息相对应的鼠标操作; 在步骤 S9011中, 应用检 测设备 120将所述鼠标操作的执行指令输出至外接设备 130, 以在发射 光源 111所对应的输入焦点执行所述鼠标操作, 并在该外接设备 130呈 现与所述鼠标操作相对应的执行结果。
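步骤 S9010中由闪烁频率查询控制信息表获得鼠标操作的过程可示意如下, 其中频率与操作的对应关系为假设的示例, 非预定控制信息表的实际内容:

```python
def mouse_op_from_blink(freq_hz, table=None):
    """根据检测到的发射光源闪烁频率, 查询预定的控制信息表,
    获得相应的鼠标操作; 表中无对应项时返回 None。"""
    if table is None:
        table = {10: "click", 20: "double_click", 30: "drag"}
    return table.get(freq_hz)
```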
例如, 在鼠标输入模式下, 用户通过设置于输入设备 110上的按键进行各项鼠标操作, 并控制发射光源 111按照一定的闪烁频率发光, 从而使得应用检测设备 120通过检测该闪烁频率获得相应的鼠标操作。 应用检测设备 120除了根据发射光源 111的运动轨迹, 获取并输出相应的应用轨迹之外, 该应用检测设备 120还根据发射光源 111的成像信息, 通过计算该发射光源 111的成像光点在一段时间内出现亮的次数, 获得该发射光源 111的闪烁频率, 并根据该闪烁频率查询预定的控制信息表, 以获得相应的鼠标操作, 如点击操作; 随后, 应用检测设备 120将该鼠标操作的执行指令输出至外接设备 130, 以在当前鼠标位置执行该点击操作, 并在该外接设备 130的屏幕上呈现相应的执行结果。

需要注意的是, 本发明可在软件和 /或软件与硬件的组合体中被实施, 例如, 可采用专用集成电路 (ASIC)、 通用目的计算机或任何其他类似硬件设备来实现。
本发明的软件程序可以通过处理器执行以实现上文所述步骤或功能。 同样地, 本发明的软件程序 (包括相关的数据结构) 可以被存储到计算机可读记录介质中, 例如, RAM存储器, 磁或光驱动器或软磁盘及类似设备。 另外, 本发明的一些步骤或功能可采用硬件来实现, 例如, 作为与处理器配合从而执行各个功能或步骤的电路。
另外, 本发明的一部分可被应用为计算机程序产品, 例如计算机 程序指令, 当其被计算机执行时, 通过该计算机的操作, 可以调用或 提供根据本发明的方法和 /或技术方案。而调用本发明的方法的程序指 令,可能被存储在固定的或可移动的记录介质中,和 /或通过广播或其 他信号承载媒体中的数据流而被传输,和 /或被存储在根据所述程序指 令运行的计算机设备的工作存储器中。 在此, 根据本发明的一个实施 例, 其包括一个装置, 该装置包括用于存储计算机程序指令的存储器 和用于执行程序指令的处理器, 其中, 当该计算机程序指令被该处理 器执行时, 触发该装置运行基于前述根据本发明的多个实施例的方法 和 /或技术方案。
对于本领域技术人员而言, 显然本发明不限于上述示范性实施例 的细节, 而且在不背离本发明的精神或基本特征的情况下, 能够以其 他的具体形式实现本发明。 因此, 无论从哪一点来看, 均应将实施例 看作是示范性的, 而且是非限制性的, 本发明的范围由所附权利要求 而不是上述说明限定, 因此旨在将落在权利要求的等同要件的含义和 范围内的所有变化涵括在本发明内。 不应将权利要求中的任何附图标 记视为限制所涉及的权利要求。 此外, 显然"包括"一词不排除其他单 元或步骤, 单数不排除复数。 系统权利要求中陈述的多个单元或装置 也可以由一个单元或装置通过软件或者硬件来实现。 第一, 第二等词 语用来表示名称, 而并不表示任何特定的顺序。

Claims

权 利 要 求 书
1. 一种为发射光源的运动轨迹映射其应用轨迹的方法, 其中, 该方法包括以下步骤:
- 获取发射光源的成像信息;
其中, 该方法还包括:
a检测所述发射光源的输入模式, 以确定与所述输入模式相对应的 应用映射曲线;
b根据所述成像信息, 获取所述发射光源的运动轨迹;
c根据所述运动轨迹, 通过所述应用映射曲线, 获得与所述运动轨 迹相对应的应用轨迹;
d将所述应用轨迹输出至外接设备。
2. 根据权利要求 1所述的方法, 其中, 所述检测所述发射光源的输 入模式的操作包括:
-根据所述外接设备的当前应用, 确定所述发射光源的输入模式。
3. 根据权利要求 1或 2所述的方法, 其中, 该方法还包括:
-根据预定搜索时间范围内所述运动轨迹或所述应用轨迹的运动特征峰值, 纠正所述应用轨迹的起始点;
其中, 所述步骤 d包括:
- 将所述纠正后的应用轨迹输出至所述外接设备。
4. 根据权利要求 1至 3中任一项所述的方法, 其中, 在所述步骤 d 之前, 该方法还包括:
- 自获得输入设备的输入操作的起始时刻, 根据所述输入设备的操 作相关信息, 进行相应的输入操作纠正, 以获得纠正后的输入操作, 直 至满足预定的输入操作纠正停止条件;
其中, 所述预定的输入操作纠正停止条件包括以下至少任一项:
- 所述发射光源的运动时间达到预定的纠正延迟时间阈值;
- 所述发射光源的运动轨迹的运动特征值达到其相应的运动特征值阈值;
- 所述发射光源的应用轨迹的运动特征值达到其相应的运动特征值阈值。
5. 根据权利要求 1至 4中任一项所述的方法, 其中, 所述步骤 b还 包括:
-根据所述运动轨迹的历史运动特征信息, 确定所述发射光源的预 期位置信息, 以用于平滑所述运动轨迹。
6. 根据权利要求 1至 5中任一项所述的方法, 其中, 所述应用映射 曲线包括三维应用映射曲线。
7. 根据权利要求 6所述的方法, 其中, 所述三维应用映射曲线的放 大因子基于所述发射光源的距离来调整。
8. 根据权利要求 6或 7所述的方法, 其中, 所述三维应用映射曲线 包括基于所述发射光源的三维转动位置的三维应用映射曲线。
9. 根据权利要求 8所述的方法, 其中, 所述步骤 b包括:
-根据所述成像信息, 获取所述发射光源的三维转动运动轨迹。
10. 根据权利要求 1至 9中任一项所述的方法, 其中, 所述应用映 射曲线基于所述发射光源的历史状态信息来调整。
11. 根据权利要求 1至 10中任一项所述的方法, 其中, 在所述步骤 b之前, 该方法还包括:
-检测所述发射光源当前的输入状态, 以在所述输入状态所对应的 等待时间期满时, 启动后续操作。
12. 根据权利要求 1至 11 中任一项所述的方法, 其中, 所述发射光源的输入模式包括手写输入模式。
13. 根据权利要求 12所述的方法, 其中, 所述应用映射曲线包括一 次线性曲线。
14. 根据权利要求 12所述的方法, 其中, 该方法还包括:
- 根据所述应用轨迹, 查询预定的字符库, 以获得与所述应用轨迹相对应的字符;
- 将所述字符输出至所述外接设备。
15. 根据权利要求 12至 14中任一项所述的方法, 其中, 该方法还 包括:
-根据所述应用轨迹的起始点, 确定所述手写输入模式所对应的输 入区域。
16. 根据权利要求 1至 11中任一项所述的方法, 其中, 所述发射光源的输入模式包括鼠标输入模式。
17. 根据权利要求 16所述的方法, 其中, 该方法还包括:
-根据所述发射光源的成像信息, 获取所述发射光源发射的控制信 息, 并通过查询预定的控制信息表, 获得与所述控制信息相对应的鼠标 操作;
- 将所述鼠标操作的执行指令输出至所述外接设备, 以在所述发射 光源所对应的输入焦点执行所述鼠标操作, 并在所述外接设备呈现与所 述鼠标操作相对应的执行结果。
18. 一种为发射光源的运动轨迹映射其应用轨迹的系统, 其中, 该 系统包括发射光源、 用于获取所述发射光源的成像信息的摄像头、 处理 装置及输出装置;
其中, 所述处理装置用于:
-检测所述发射光源的输入模式, 以确定与所述输入模式相对应的 应用映射曲线;
-根据所述成像信息, 获取所述发射光源的运动轨迹;
-根据所述运动轨迹, 通过所述应用映射曲线, 获得与所述运动轨 迹相对应的应用轨迹;
其中, 所述输出装置用于将所述应用轨迹输出至外接设备。
PCT/CN2013/070287 2012-01-09 2013-01-09 一种为发射光源的运动轨迹映射其应用轨迹的方法与系统 WO2013104315A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/371,421 US20150084853A1 (en) 2012-01-09 2013-01-09 Method and System for Mapping for Movement Trajectory of Emission Light Source Application Trajectory Thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2012100048105A CN103197774A (zh) 2012-01-09 2012-01-09 一种为发射光源的运动轨迹映射其应用轨迹的方法与系统
CN201210004810.5 2012-01-09

Publications (1)

Publication Number Publication Date
WO2013104315A1 true WO2013104315A1 (zh) 2013-07-18

Family

ID=48720430

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2013/070287 WO2013104315A1 (zh) 2012-01-09 2013-01-09 一种为发射光源的运动轨迹映射其应用轨迹的方法与系统

Country Status (3)

Country Link
US (1) US20150084853A1 (zh)
CN (1) CN103197774A (zh)
WO (1) WO2013104315A1 (zh)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI556142B (zh) * 2015-10-07 2016-11-01 原相科技股份有限公司 導航軌跡校正方法及其光學導航裝置
CN109479361B (zh) * 2016-07-14 2022-08-19 昕诺飞控股有限公司 光照控制系统及方法
WO2019000430A1 (en) * 2017-06-30 2019-01-03 Guangdong Virtual Reality Technology Co., Ltd. ELECTRONIC SYSTEMS AND TEXT INPUT METHODS IN A VIRTUAL ENVIRONMENT
CN110044334B (zh) * 2018-01-16 2020-04-21 京东方科技集团股份有限公司 基于维诺图的室内空间定位
CN108844529A (zh) * 2018-06-07 2018-11-20 青岛海信电器股份有限公司 确定姿态的方法、装置及智能设备
US10944912B2 (en) * 2019-06-04 2021-03-09 Ford Global Technologies, Llc Systems and methods for reducing flicker artifacts in imaged light sources
US11120313B2 (en) * 2019-07-15 2021-09-14 International Business Machines Corporation Generating search determinations for assortment planning using visual sketches
CN113965692A (zh) * 2020-11-30 2022-01-21 深圳卡多希科技有限公司 一种光源点控制摄像头装置转动的方法和装置

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1816792A (zh) * 2003-07-02 2006-08-09 新世代株式会社 信息处理装置、信息处理系统、操作物、信息处理方法、信息处理程序以及游戏系统
CN101320291A (zh) * 2008-07-11 2008-12-10 华南理工大学 一种基于可见光检测的虚拟文字识别方法
CN101794174A (zh) * 2010-03-31 2010-08-04 程宇航 利用光源的输入装置、图形用户设备和方法

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7003136B1 (en) * 2002-04-26 2006-02-21 Hewlett-Packard Development Company, L.P. Plan-view projections of depth image data for object tracking
US7961909B2 (en) * 2006-03-08 2011-06-14 Electronic Scripting Products, Inc. Computer interface employing a manipulated object with absolute pose detection component and a display
US20050212753A1 (en) * 2004-03-23 2005-09-29 Marvit David L Motion controlled remote controller
JP2009505201A (ja) * 2005-08-11 2009-02-05 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ ポインティング装置の動きを判定する方法
US8279168B2 (en) * 2005-12-09 2012-10-02 Edge 3 Technologies Llc Three-dimensional virtual-touch human-machine interface system and method therefor
JP2008219788A (ja) * 2007-03-07 2008-09-18 Toshiba Corp 立体画像表示装置、方法およびプログラム
US8542907B2 (en) * 2007-12-17 2013-09-24 Sony Computer Entertainment America Llc Dynamic three-dimensional object mapping for user-defined control device
US8310547B2 (en) * 2008-12-05 2012-11-13 Electronics And Telecommunications Research Institue Device for recognizing motion and method of recognizing motion using the same
US8432305B2 (en) * 2009-09-03 2013-04-30 Samsung Electronics Co., Ltd. Electronic apparatus, control method thereof, remote control apparatus, and control method thereof
CN201570011U (zh) * 2010-01-13 2010-09-01 北京视博数字电视科技有限公司 一种终端控制装置及终端
CN102221888A (zh) * 2011-06-24 2011-10-19 北京数码视讯科技股份有限公司 基于遥控器的控制方法及系统


Also Published As

Publication number Publication date
CN103197774A (zh) 2013-07-10
US20150084853A1 (en) 2015-03-26


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13735579

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 14371421

Country of ref document: US

122 Ep: pct application non-entry in european phase

Ref document number: 13735579

Country of ref document: EP

Kind code of ref document: A1