WO2013067849A1 - Trigger control method and system of human-computer interaction operation command and laser emission device - Google Patents
Trigger control method and system of human-computer interaction operation command and laser emission device (人机交互操作指令的触发控制方法和系统及激光发射装置)
- Publication number
- WO2013067849A1 (PCT/CN2012/081405; CN2012081405W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- laser
- human
- image
- captured
- laser spot
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0325—Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
- G06F3/0386—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry for light pen
Definitions
- the present invention relates to human-computer interaction system technology, and in particular to a trigger control method and system for human-computer interaction operation instructions and a laser emitting device for use therewith.
- human-computer interaction technology refers to technology for interacting with a data processing device in an efficient manner through its input and output devices: the machine provides information and prompts through an output or display device, and the user inputs information and operation instructions to the machine through an input device.
- an embodiment of the present invention provides a trigger control method and system for human-computer interaction operation instructions, so as to facilitate human-computer interaction at medium to long range.
- the embodiment of the invention further provides a laser emitting device matched with this trigger control system, which can accurately emit the laser coded signal corresponding to an operation instruction, thereby improving operation precision in medium- and long-range human-computer interaction.
- a trigger control method for human-computer interaction operation instructions includes:
- detecting a laser spot in the display area captured by the imaging device, determining the coordinates of the detected laser spot, and, according to the coordinate mapping transformation relationship between the display area captured by the imaging device and the original picture output by the image output device, converting the detected laser spot coordinates into coordinates in the original picture output by the image output device;
- identifying the coded signal of the laser spot and, when it is recognized that the laser spot emits a coded signal corresponding to a human-computer interaction operation instruction, triggering the human-computer interaction operation instruction corresponding to the coded signal at the coordinate position in the original picture converted from the laser spot coordinates.
- a trigger control system for human-computer interaction operation instructions comprising:
- an image output module configured to provide an original picture for output by the image output device;
- a camera acquisition module configured to acquire the display area, output by the image output device, that is captured by the imaging device;
- a mapping relationship module configured to determine the coordinate mapping transformation relationship between the display area captured by the imaging device and the original picture output by the image output device;
- a laser spot detecting module configured to detect a laser spot in the display area captured by the imaging device;
- a positioning module configured to determine the coordinates of the detected laser spot and, according to the coordinate mapping transformation relationship between the display area captured by the imaging device and the original picture output by the image output device, convert the detected laser spot coordinates into coordinates in the original picture output by the image output device;
- a code identification module configured to identify the coded signal of the laser spot and, when it is recognized that the laser spot emits a coded signal corresponding to a human-computer interaction operation instruction, trigger the corresponding instruction at the converted coordinate position.
- a laser emitting device matched with the above-mentioned trigger control system for human-computer interaction operation instructions, comprising:
- a human-computer interaction operation instruction trigger key configured to trigger a corresponding human-computer interaction operation instruction;
- a signal coding unit configured to store the laser coding modes corresponding to human-computer interaction operation instructions;
- a laser emitter configured to emit a laser beam;
- a laser emission controller configured to read the corresponding laser coding mode from the signal coding unit according to the human-computer interaction operation instruction triggered by the trigger key, and to control the laser emitter to emit a laser beam carrying the corresponding laser coded signal.
- the solution provided by the present invention uses a laser and an imaging device: by detecting and recognizing the laser signal that the user emits toward the display area, it completes both the positioning of the laser signal and the triggering of the operation instruction corresponding to that position; since the laser signal can encode and simulate a plurality of operation instructions, the user can perform human-computer interaction in medium- and long-range scenarios.
- the laser emitting device of the present invention can also accurately emit the laser coded signal corresponding to an operation instruction, thereby improving operation precision in medium- and long-range human-computer interaction.
- FIG. 1 is a schematic diagram of a device system connection according to an application scenario of the method of the present invention
- FIG. 2 is a schematic diagram of calibration of a projection area captured by a camera according to the present invention
- FIG. 3 is a calibration screen captured by a camera.
- FIG. 4 is a schematic diagram of a process of detecting a laser spot in a picture taken by a camera
- FIG. 5 is a schematic diagram of a flash code of a laser beam
- FIG. 6 is a schematic diagram of a trigger control system for human-computer interaction operation instructions according to the present invention
- FIG. 7a is a schematic diagram of a specific composition of the mapping relationship module in the trigger control system
- FIG. 7b is a schematic diagram of a specific composition of the laser spot detection module in the trigger control system
- FIG. 7c is a schematic diagram of a specific composition of the code recognition module in the trigger control system
- FIG. 8 is a schematic diagram of a laser emitting device according to the present invention.
- the coordinate mapping transformation relationship is represented by two parts of data: first, the coordinates of the reference calibration points in the photographing picture; second, the length ratio and width ratio between the original picture and the photographing picture;
- the image output device of the present invention may be a projector, in which case the corresponding display area is the projection area projected by the projector onto a screen or a wall; the image output device may also be a display, in which case the corresponding display area is the display screen of the display.
- the present invention can simulate the encoding of a plurality of operation instructions through the coded signal of the laser.
- the present invention will be described by taking a laser-simulated mouse operation as an example in the following embodiments.
- the present invention can also be applied to simulate more human-computer operation modes, such as simulating a single-touch operation, or simulating a multi-touch operation using more than one laser emitting device, thereby enabling medium- and long-range human-computer interaction.
- FIG. 1 is a schematic diagram of a device system connection in an application scenario of the method according to the present invention.
- FIG. 1 illustrates an example of a typical device connection configuration for implementing the present invention.
- the present invention is not limited to this connection scenario, and other connection manners may be used.
- the projector is not a mandatory device; it may be replaced by a display, with the laser operating directly on the display screen of the display.
- the data processing device 105 is connected to the camera 101 through a camera interface 107, which may use various industry-proven connection solutions such as a universal serial bus (USB) connection or a Wi-Fi wireless connection.
- the camera 101 may not be a separate device, but rather a built-in camera in the data processing device 105.
- the projector 102 is connected to the data processing device 105 through a projector interface 104; the connection may be VGA, composite video, high definition multimedia interface (HDMI), or any other wired or wireless connection with video transmission capability.
- the projector 102 projects a projection area 103 (i.e., the display area of the present invention), and the camera 101 completely captures the projection area 103 and is sharply focused by manual setting or automatic adjustment.
- when a display is used instead of the projector, the camera 101 completely captures the display area of the display (equivalent to the projection area 103) and is likewise sharply focused by manual setting or automatic adjustment.
- the laser beam emitted by the laser 108 strikes the projection area 103 to form a laser beam spot 109.
- the trigger control system 106 on the data processing device 105 can be activated after the camera 101 has completely captured the projection area 103 and is in focus.
- the laser beam emitted by the laser 108 can be an infrared laser; in this case, an infrared filter can be added to the camera 101 so that the camera 101 can capture the infrared laser spot.
- the data processing device 105 may be any computing system having a central processing unit (CPU), a memory, and an operating environment provided by an operating system; typical examples are desktop computers, notebook computers, tablets, televisions, handheld devices such as smartphones, and robotic devices with computing power.
- the trigger control system 106 running on the data processing device 105 is a software system that acquires the video picture of the projection area 103 through the camera 101 and performs video image analysis to detect the laser beam spot 109 emitted by the laser 108.
- the present invention will now be described in detail through the trigger control system 106, which simulates mouse operations by detecting the laser beam spot.
- Step s01: providing an original picture through the projector interface 104 for output by the projector (i.e., the image output device of the present invention); and simultaneously acquiring, through the camera interface 107, the display area projected by the projector and captured by the camera, that is, the projection area 103.
- Step s02: determine the coordinate mapping transformation relationship between the projection area 103 captured by the camera and the original picture projected by the projector.
- the coordinate mapping transformation relationship is represented by two parts of data: first, the calibration data of the projection area, that is, the coordinates of the reference calibration points in the photographing picture; second, the length ratio and width ratio between the original picture and the photographing picture.
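As an illustration only, the two parts of mapping data described above could be held in a structure like the following minimal Python sketch; the class and field names are assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class CalibrationData:
    """Coordinate mapping data per step s02 (names are illustrative)."""
    # Coordinates of the reference calibration points in the camera's
    # photographing picture, e.g. [(s1x, s1y), ..., (s4x, s4y)].
    reference_points: list[tuple[float, float]]
    # Length (width) ratio and height ratio between the original
    # picture and the photographing picture.
    width_ratio: float
    height_ratio: float
```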
- a specific calibration method of an embodiment of the present invention may be:
- the trigger control system 106 controls the projector 102 to project a calibration picture.
- the projection area 103 illustrated in FIG. 2 is an original calibration picture projected by the projector.
- the calibration picture may be a default picture with a single-color background; it includes at least four reference calibration points, and the more reference calibration points there are, the more accurate the coordinate transformation is.
- in this example, four reference calibration points are used, namely the four corners of the picture, labeled reference calibration points 11, 12, 13, and 14.
- Figure 3 shows a schematic diagram of the calibration picture captured by the camera.
- w and h are the width and height of the photographing screen 301 taken by the camera.
- the present invention uses the camera photographing picture 301 as the coordinate system, with the horizontal axis Y and the vertical axis X shown in FIG. 3; as is customary in computing, the vertical axis X points downward.
- the coordinate origin (0, 0) is the intersection of X and Y, that is, the upper left corner of the photographing picture 301.
- the area 302 within the captured picture 301 is the projection area (or, in another embodiment, the display area of the display) output by the projector 102.
- the projection area output by the projector 102 would be rectangular under ideal conditions, but since in practice the camera and the projector are not necessarily coaxial and aligned 1:1, the projection area 302 captured by the camera (or, in another embodiment, the display area of the display) often appears nearly trapezoidal.
- the coordinates (s1x, s1y), (s2x, s2y), (s3x, s3y), (s4x, s4y) shown in FIG. 3 are the coordinates of the four corners of the projection area 302 in the camera video picture; they are the coordinate values of the four reference calibration points 11, 12, 13, and 14 in the calibration picture 302 captured by the camera, expressed in the reference coordinate system of the photographing picture 301.
- the method for determining the coordinate values of the reference calibration points is as follows: the trigger control system 106 analyzes the captured calibration picture, in which the color of the reference calibration points clearly differs from the background color (for example, the background of the calibration picture is white and the reference calibration points are red); the trigger control system can further perform background-weakening image processing on the captured picture, removing image information unrelated to the reference calibration points so as to highlight them.
- the reference calibration points can then be captured very conveniently using existing image coordinate analysis techniques, and their coordinate values (s1x, s1y), (s2x, s2y), (s3x, s3y), (s4x, s4y) in the coordinate system of the video picture 301 are calculated.
- the calibration data of the projection area, that is, the coordinates (s1x, s1y), (s2x, s2y), (s3x, s3y), (s4x, s4y) of the reference calibration points in the captured picture, are stored together with the length ratio and width ratio between the original picture and the captured picture.
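To make this calibration step concrete, the following is a minimal Python/OpenCV sketch that locates red calibration dots against a light background and returns their centroids in the photographing-picture coordinate system; the color thresholds are illustrative assumptions, not values from the patent:

```python
import cv2
import numpy as np

def find_calibration_points(frame_bgr):
    """Locate red reference calibration dots in a captured calibration
    picture and return their (x, y) centroids (illustrative sketch)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two hue bands.
    mask = cv2.inRange(hsv, (0, 120, 80), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 80), (180, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    points = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:  # skip degenerate blobs
            points.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return points  # expected: one centroid per reference calibration point
```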
- the present invention can also use other mature transformation algorithms to determine the coordinate mapping transformation relationship between the display area captured by the imaging device and the original picture output by the image output device.
- alternatively, the positioning may be performed in a calibration-free manner, in which the position of the laser beam spot in the captured picture is used directly to simulate the mouse motion.
- the calibration-free method can be used with an infrared laser, to avoid troubling the user with a visible laser spot that does not coincide with the mouse cursor position.
- the reference calibration points in the calibration pictures shown in FIGS. 2 and 3 are also only one typical calibration implementation; other reference calibration point arrangements may be used, such as setting reference calibration points at three corners and the center point.
- Step s03: detect the position of the laser spot in the display area captured by the camera.
- a laser is an ultra-high-brightness light source with excellent beam-concentrating ability, which makes it very suitable as a pointing device.
- the key technical feature of the present invention is to use a light spot formed by a high-intensity laser beam as a remote operation control point.
- the position of the laser spot represents the position of the mouse cursor.
- FIG. 4 is a schematic view showing the process of detecting a laser spot in a picture taken by the camera.
- sub-picture 401 represents the picture seen by the human eye, which includes the picture projected by the projector (or displayed by the display) and the laser spot formed by the beam emitted from the user's laser.
- the upper dot in the figure indicates the laser point.
- the trigger control system needs to perform background-weakening image processing on the captured picture to remove image information irrelevant to the laser spot and highlight the laser spot.
- the trigger control system highlights the laser spot by controlling the exposure of the camera so as to suppress image information unrelated to the laser spot; a typical way is to minimize the camera exposure, so that the brightness of the projected picture becomes much lower than that of the laser: the picture shot by the camera is dimmed while the laser spot remains clear due to its high brightness, as shown in sub-picture 402.
- the trigger control system may further process the image of sub-picture 402; a typical manner is to further weaken the residual image information by adjusting the image gray scale, that is, removing the remaining dim image signal, which further highlights the bright laser spot, as shown in sub-picture 403.
- the image processing techniques involved here are well-known common techniques.
- the present invention can also use other image processing methods to remove image information irrelevant to the laser spot and thereby highlight the laser spot information.
- the control program processes the picture taken by the camera to obtain a result picture similar to the final sub-picture shown in FIG. 4.
- the result picture contains only the laser spot information 400; based on it, the laser spot can be captured very easily using conventional image coordinate analysis techniques.
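As a concrete illustration of the detection pipeline of step s03 (and the average-center calculation of step s04 below), here is a minimal Python/OpenCV sketch; it assumes the camera exposure has already been minimized as described above, and the brightness threshold is an illustrative assumption:

```python
import cv2
import numpy as np

def detect_laser_spot(frame_bgr, brightness_threshold=240):
    """Detect the bright laser spot in a low-exposure camera frame and
    return its average-center coordinates (px, py), or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Gray-scale adjustment: keep only near-saturated pixels, which
    # removes the residual dim image signal (cf. sub-picture 403).
    _, bright = cv2.threshold(gray, brightness_threshold, 255,
                              cv2.THRESH_BINARY)
    ys, xs = np.nonzero(bright)
    if xs.size == 0:
        return None  # no laser spot in this frame
    # Average center of the highlighted pixels.
    return float(xs.mean()), float(ys.mean())
```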
- Step s04: once the laser spot is captured, the coordinates of the detected laser spot in the photographing picture 301 can be calculated; for more precision, the coordinates of the average center of the laser spot in the photographing picture 301 are calculated.
- then, according to the coordinate mapping transformation relationship between the display area captured by the camera and the original picture output by the projector, the detected laser spot coordinates are converted into coordinates in the original picture output by the projector.
- let (px, py) be the coordinates of the laser spot in the camera photographing picture 301 obtained by the processing procedure shown in FIG. 4; from the stored coordinates (s1x, s1y), (s2x, s2y), (s3x, s3y), (s4x, s4y) of the reference calibration points of the projection area in the photographing picture, together with the stored length ratio and width ratio between the original picture and the photographing picture, the coordinates of the laser spot in the original picture output by the projector can be calculated by conversion.
- the specific calculation method is a conventional technique in the art; one such method is sketched below.
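The worked formula is not reproduced in this text; one conventional choice, sketched below as an assumption rather than as the patent's exact method, is a perspective (homography) transform computed from the four reference-point correspondences:

```python
import cv2
import numpy as np

def spot_to_original_coords(px, py, ref_points_cam, original_w, original_h):
    """Map the laser spot (px, py) in the camera picture to coordinates
    in the original picture via a homography from the four reference
    points. ref_points_cam: [(s1x, s1y), (s2x, s2y), (s3x, s3y),
    (s4x, s4y)], assumed ordered top-left, top-right, bottom-right,
    bottom-left."""
    src = np.float32(ref_points_cam)
    dst = np.float32([(0, 0), (original_w, 0),
                      (original_w, original_h), (0, original_h)])
    h_matrix = cv2.getPerspectiveTransform(src, dst)
    mapped = cv2.perspectiveTransform(np.float32([[[px, py]]]), h_matrix)
    ox, oy = mapped[0, 0]
    return float(ox), float(oy)
```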
- the coordinate position of the laser spot in the original picture is taken as the position of the mouse cursor in the original picture;
- the trigger control system can control the mouse cursor to be displayed at that position;
- the trigger control system processes each frame of the video obtained by the camera to obtain the position of the laser spot on the picture;
- the control program processes the camera picture in real time and moves the mouse cursor to the position of the laser spot in real time, thereby simulating the effect of a laser-driven mouse cursor.
- Step s05: identify the coded signal of the laser spot, and, when it is recognized that the laser spot emits a coded signal corresponding to a human-computer interaction operation instruction, trigger the human-computer interaction operation instruction corresponding to the coded signal at the coordinate position in the original picture converted from the laser spot coordinates.
- the laser beam highlight flashes according to specific coding modes, corresponding to mouse action commands such as click, right-click, double-click and drag-and-drop.
- the present invention is not limited to the flicker coding of laser spots; more complex coding methods can be devised and interpreted in accordance with the principles of the present invention.
- FIG. 5 is a schematic diagram of the flicker coding of a laser beam. Referring to FIG. 5, the ordinate is the on/off state of the laser beam: the high level of the square wave indicates that the laser is turned on, the low level indicates that the laser is turned off, and different laser flicker coding modes correspond to different mouse operations.
- the specific method for identifying the coded signal of the laser spot is:
- the control program obtains an image sequence of the laser spot according to the method described in steps s03 and s04, continuously detecting the laser spot in each frame of the captured picture, and determines the flicker code of the laser spot over the consecutive frames within a predetermined detection time interval.
- the flicker code of the laser spot is matched against the preset flicker modes (such as those shown in FIG. 5) representing human-computer interaction operation instructions; if a human-computer interaction operation instruction is matched, it is determined that the coded signal corresponding to that instruction has been recognized.
- the recognized coded signal serves as the basis for the trigger control system to simulate mouse click, double-click, long-press or release of long-press, and the corresponding mouse operation instruction is triggered at the coordinate position of the laser spot in the original picture.
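A minimal sketch of such flicker-code matching follows; the per-frame on/off patterns and the sampling assumptions are illustrative stand-ins for the coding modes of FIG. 5, not the patent's actual timings:

```python
# Illustrative code library: per-frame on/off patterns (assumed, not
# the patent's FIG. 5 timings), sampled at the camera frame rate.
CODE_LIBRARY = {
    (1, 0, 1, 0): "click",            # two short flashes
    (1, 0, 1, 0, 1, 0): "double_click",
    (1, 1, 1, 1): "long_press",
}

def recognize_code(spot_history):
    """spot_history: on/off detection results (True/False) for the laser
    spot over the consecutive frames of one detection time interval.
    Returns the matched operation instruction, or None."""
    observed = tuple(int(on) for on in spot_history)
    for pattern, command in CODE_LIBRARY.items():
        if observed == pattern:
            return command
    return None

# Example: frames where the spot was detected on, off, on, off.
assert recognize_code([True, False, True, False]) == "click"
```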
- FIG. 6 is a schematic diagram of a trigger control system 106 for human-computer interaction operation instructions according to the present invention.
- the trigger control system 106 is mainly used to implement the foregoing processing method of the present invention, and specifically includes:
- An image output module 601 is coupled to the projector interface 104 for providing an original picture for output by the image output device.
- the camera acquisition module 602 is connected to the camera interface 107 and is configured to acquire a display area output by the image output device captured by the camera.
- the mapping relationship module 603 is configured to determine a coordinate mapping transformation relationship between the display area captured by the imaging device and the original picture output by the image output device.
- the laser spot detection module 604 is configured to detect a laser spot in a display area captured by the imaging device.
- the positioning module 605 is configured to determine the coordinates of the detected laser spot and, according to the coordinate mapping transformation relationship between the display area captured by the imaging device and the original picture output by the image output device, to convert the detected laser spot coordinates into coordinates in the original picture output by the image output device;
- a code identification module 606 configured to identify the coded signal of the laser spot and, when it is recognized that the laser spot emits a coded signal corresponding to a human-computer interaction operation instruction, to trigger the human-computer interaction operation instruction corresponding to the coded signal at the coordinate position in the original picture converted from the laser spot coordinates.
- the mapping relationship module 603 includes: a calibration sub-module 631, configured to control the image output module to provide an original calibration picture, where the calibration picture includes at least three reference calibration points, and to determine the coordinates, in the captured picture, of the reference calibration points captured by the imaging device.
- the ratio determining sub-module 632 is configured to determine a length ratio and a width ratio of a picture taken by the camera and an original picture output by the image output device.
- the storage submodule 633 is configured to store coordinates of the reference calibration point in the shooting picture, and length ratio and width ratio of the original picture and the shooting picture.
- the laser spot detection module 604 specifically includes: an image processing sub-module 641, configured to perform background-weakening image processing on the captured picture and to remove image information unrelated to the laser spot so as to highlight the laser spot.
- the capture sub-module 642 is configured to capture the highlighted laser spot from the captured image processed by the image processing sub-module 641.
- the code identification module 606 specifically includes:
- a code library 661 configured to store the laser coding modes corresponding to human-computer interaction operation instructions; and a code recognition sub-module 662, which takes the laser spot continuously detected in each frame by the laser spot detection module 604, determines the flicker code of the laser spot over the consecutive frames within a predetermined detection time interval, and compares it with the laser coding modes stored in the code library; if the laser coding mode corresponding to a human-computer interaction operation instruction is matched, it is determined that the coded signal corresponding to that instruction has been recognized;
- an instruction triggering module 663 configured to trigger the human-computer interaction operation instruction corresponding to the coded signal recognized by the code recognition sub-module 662, at the coordinate position of the laser spot in the original picture as determined by the positioning module 605.
- each of the foregoing functional modules may be built into an integrated intelligent terminal device, such as a mobile phone, a tablet computer, a television, a projector, or another handheld terminal.
- the spot detected by the laser spot detection module 604 is emitted by a laser emitting device, which may be a separate device or may be integrated in the smart terminal described above; that is, the trigger control system 106 for human-computer interaction operation instructions may further include a laser emitting device. For further details regarding the laser emitting device, reference may be made to FIG. 8 and the related description below.
- the present invention also discloses a laser emitting device for use with the above-described trigger control system for human-computer interaction operation instructions.
- Fig. 8 is a schematic view of the laser emitting device.
- the laser emitting device includes: a human-computer interaction operation trigger key 801, configured to trigger a corresponding human-computer interaction operation instruction;
- a signal coding unit 802 configured to store the laser coding modes corresponding to human-computer interaction operation instructions;
- a laser emitter 803 configured to emit the laser beam;
- a laser emission controller 804 configured to read the corresponding laser coding mode from the signal coding unit according to the human-computer interaction operation instruction triggered by the trigger key, and to control the laser emitter to emit a laser beam carrying the corresponding laser coded signal;
- a power supply and switch 805 are also included.
- the human-computer interaction operation trigger key 801 may include at least one of the following trigger keys: a mouse operation key, configured to trigger mouse operation instructions;
- a multi-touch operation key, configured to trigger multi-touch operation instructions.
- a working module of a wireless mouse (Bluetooth, 2.4 GHz, etc.) can be integrated alongside the laser emitter 803, so that the left-button, right-button and double-click operations of the wireless mouse can be reused directly.
- in FIG. 8, the human-computer interaction operation trigger key is a mouse operation key, which includes, for example: a long-press operation key 811 for triggering a long-press operation instruction, and a click operation key 812 for triggering a click operation instruction.
- the laser coded signal emitted by the laser emitter is a laser flashing signal.
- the laser coding mode in the signal coding unit 802 may be, for example, the coding mode shown in FIG. 5, which is completely consistent with the coding mode stored in the coding library 661 of the trigger control system 106.
- when a key is pressed, the laser emission controller 804 controls the laser emitter 803 to emit the laser flicker signal corresponding to the operation instruction represented by that key, in the coding mode shown in FIG. 5, that is, a laser beam containing the flicker code.
- the trigger control system 106 recognizes the laser flicker signal and matches the corresponding laser coding mode from the code library 661 to determine which operation instruction it corresponds to, thereby finally triggering that operation instruction.
- the present invention is not limited to the flicker coded signal of the laser spot; more complex coding schemes can be devised and interpreted in accordance with the principles of the present invention.
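On the emitting side, the controller essentially replays a stored on/off pattern. The following minimal Python sketch illustrates this under stated assumptions: the pattern timings are invented for illustration, and set_laser is a hypothetical stand-in for the real driver of laser emitter 803:

```python
import time

# Illustrative signal coding unit 802: on/off pattern and per-step
# duration in seconds (assumed values; real timings are those of FIG. 5).
SIGNAL_CODING_UNIT = {
    "click": ([1, 0, 1, 0], 0.05),
    "long_press": ([1, 1, 1, 1], 0.05),
}

def set_laser(on: bool):
    """Drive the laser emitter 803 (stub; replace with a real driver)."""
    print("laser", "on" if on else "off")

def emit_coded_signal(command: str):
    """Laser emission controller 804: read the coding mode for the
    triggered instruction and flash the laser emitter accordingly."""
    pattern, step = SIGNAL_CODING_UNIT[command]
    for state in pattern:
        set_laser(bool(state))
        time.sleep(step)
    # A real device would typically return to steady-on for pointing.
    set_laser(False)

emit_coded_signal("click")  # e.g. click operation key 812 pressed
```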
- the laser emitting device may also be integrated into a smart terminal to form an integrated device, such as a mobile phone, a tablet computer, a television, a projector, or another handheld terminal.
- in summary, the picture of the data processing device projected by the projector is monitored by the camera, and the trigger control system on the data processing device analyzes the content captured by the camera, performing image analysis to determine the position on the projected picture that the laser points to; the trigger control system manages the position of the mouse cursor on the data processing device, and obtains the simulated mouse click, double-click, right-click or long-press drag-and-drop by parsing the flicker control code emitted by the laser.
- this makes it convenient for a user holding the laser emitting device to control the interface of the computer remotely without being at the computer's side; operation is convenient, and the operation instructions can be diversified: to add a control operation instruction, it is only necessary to add the corresponding laser coding mode to the code library 661 and to the signal coding unit 802.
- the present invention can also simulate a single-touch operation of a touch screen, and can simulate multi-touch operations of a touch screen using one or more laser emitting devices.
- for multi-touch, more than one laser emitter is required so as to project more than one laser spot onto the projection screen; the lasers can be integrated in the same laser emitting device, and the signal coding unit 802 stores the multi-point coding modes corresponding to multi-touch operation instructions.
- for example, two laser spots flashing twice simultaneously at the same frequency indicates the zoom-in gesture operation instruction in the multi-touch operation, while two laser spots flashing three times simultaneously at the same frequency indicates the zoom-out gesture operation instruction, and so on.
- the laser emission controller 804 reads the corresponding multi-point laser coding mode from the signal coding unit and controls the one or more laser emitters to emit laser beams carrying the corresponding laser coded signal; for example, the zoom-in gesture operation instruction requires two laser emitters to simultaneously emit laser beams that each flash twice at the same frequency.
- the code library 661 in the trigger control system 106 likewise stores the multi-touch operation instructions represented by the multi-point laser coding modes; for example, two laser spots flashing twice simultaneously at the same frequency indicates the zoom-in gesture operation instruction in the multi-touch operation, and two laser spots flashing three times simultaneously at the same frequency indicates the zoom-out gesture operation instruction.
- when two laser spots are detected flashing twice simultaneously at the same frequency, the coded signal triggering the zoom-in gesture operation instruction is determined to have been recognized, thereby triggering the zoom-in operation.
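For illustration, the multi-point entries of such a code library might look like the following sketch; the (spot count, flash count) encoding is an assumption mirroring the zoom examples above, not the patent's actual data layout:

```python
# Hypothetical multi-point code library: (number of laser spots,
# simultaneous same-frequency flash count) -> multi-touch instruction.
MULTI_TOUCH_CODES = {
    (2, 2): "zoom_in_gesture",   # two spots flash twice together
    (2, 3): "zoom_out_gesture",  # two spots flash three times together
}

def match_multi_touch(num_spots: int, flash_count: int):
    """Return the multi-touch operation instruction matching the
    observed spot count and synchronized flash count, if any."""
    return MULTI_TOUCH_CODES.get((num_spots, flash_count))

assert match_multi_touch(2, 2) == "zoom_in_gesture"
```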
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
- Studio Devices (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/350,622 US20140247216A1 (en) | 2011-11-08 | 2012-11-14 | Trigger and control method and system of human-computer interaction operation command and laser emission device |
IN1012MUN2014 IN2014MN01012A (zh) | 2011-11-08 | 2014-05-26 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201110349911.1 | 2011-11-08 | ||
CN201110349911.1A CN103092432B (zh) | 2011-11-08 | 2011-11-08 | 人机交互操作指令的触发控制方法和系统及激光发射装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013067849A1 true WO2013067849A1 (zh) | 2013-05-16 |
Family
ID=48205083
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2012/081405 WO2013067849A1 (zh) | 2011-11-08 | 2012-09-14 | 人机交互操作指令的触发控制方法和系统及激光发射装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20140247216A1 (zh) |
CN (1) | CN103092432B (zh) |
IN (1) | IN2014MN01012A (zh) |
WO (1) | WO2013067849A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110502129A (zh) * | 2019-08-29 | 2019-11-26 | 王国梁 | 交互控制系统 |
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9417754B2 (en) | 2011-08-05 | 2016-08-16 | P4tents1, LLC | User interface system, method, and computer program product |
CN103729610B (zh) * | 2013-12-24 | 2017-01-11 | 北京握奇智能科技有限公司 | 一种二维码聚焦显示方法及系统 |
CN104978077B (zh) * | 2014-04-08 | 2020-01-31 | 联想(北京)有限公司 | 一种交互方法及系统 |
CN105323517A (zh) * | 2014-07-16 | 2016-02-10 | 腾讯科技(深圳)有限公司 | 投影画面自动校准方法及装置 |
CN105430308B (zh) * | 2014-09-17 | 2020-04-03 | 索尼公司 | 交互式投影装置及其曝光值自动调整方法 |
CN104270664B (zh) * | 2014-10-29 | 2017-09-05 | 上海联彤网络通讯技术有限公司 | 光笔遥控器、实现智能操作平台输入控制的系统及方法 |
CN106445090B (zh) * | 2015-08-12 | 2021-02-23 | 中兴通讯股份有限公司 | 一种控制光标的方法、装置及输入设备 |
CN106993146A (zh) * | 2016-01-21 | 2017-07-28 | 中兴通讯股份有限公司 | 控制方法、控制装置、投影机 |
CN107229377A (zh) * | 2016-03-26 | 2017-10-03 | 谭登峰 | 大视角反射成像触控系统 |
CN106325614A (zh) * | 2016-08-28 | 2017-01-11 | 上海纬而视科技股份有限公司 | 一种用红外触控或书写的显示操控方法及其装置 |
CN108628487A (zh) * | 2017-03-24 | 2018-10-09 | 西安中兴通讯终端科技有限公司 | 一种位置信息确定方法、投影设备和计算机存储介质 |
TWI629617B (zh) * | 2017-04-19 | 2018-07-11 | 中原大學 | 投影幕雷射筆偵測定位系統與方法 |
US10802585B2 (en) | 2018-07-12 | 2020-10-13 | Apple Inc. | Electronic devices with display operation based on eye activity |
CN109144375B (zh) * | 2018-10-09 | 2022-08-19 | 中天智领(北京)科技有限公司 | 一种屏幕控制方法及装置 |
CN111046150B (zh) | 2018-10-15 | 2023-04-25 | 阿里巴巴集团控股有限公司 | 人机交互处理系统及其方法、存储介质、电子设备 |
CN109412689B (zh) * | 2018-10-19 | 2023-06-27 | 苏州融萃特种机器人有限公司 | 一种基于图像处理的机器人激光通信系统及其方法 |
CN109828695B (zh) * | 2018-12-29 | 2022-02-18 | 合肥金诺数码科技股份有限公司 | 一种基于激光雷达定位的大屏幕交互系统 |
CN110221796A (zh) * | 2019-05-28 | 2019-09-10 | 上海寰视网络科技有限公司 | 多屏拼接系统的控制方法及控制系统 |
CN110297556B (zh) * | 2019-07-02 | 2023-03-31 | 沈阳理工大学 | 一种基于图像识别技术的电子投影画板系统及其处理方法 |
CN110427122A (zh) * | 2019-07-10 | 2019-11-08 | 北京云迹科技有限公司 | 基于激光传感器的触摸控制方法 |
CN110347273B (zh) * | 2019-07-12 | 2023-04-28 | 哈尔滨工业大学(威海) | 基于激光的人机交互方法 |
CN111107406A (zh) * | 2019-12-20 | 2020-05-05 | 视联动力信息技术股份有限公司 | 一种显示终端的控制方法、装置和存储介质 |
CN111462247B (zh) * | 2020-03-13 | 2024-04-02 | 中天智领(北京)科技有限公司 | 一种用于屏幕交互的光标位置校准方法及装置 |
CN111427452B (zh) * | 2020-03-27 | 2023-10-20 | 海信视像科技股份有限公司 | 控制器的追踪方法及vr系统 |
CN112328158A (zh) * | 2020-07-23 | 2021-02-05 | 深圳Tcl新技术有限公司 | 交互方法、显示装置、发射装置、交互系统及存储介质 |
CN112099028B (zh) * | 2020-09-03 | 2024-07-30 | 深圳市迈测科技股份有限公司 | 激光点自动追踪方法、装置、存储介质及激光测距装置 |
CN114428571A (zh) * | 2020-10-29 | 2022-05-03 | 深圳Tcl新技术有限公司 | 一种交互方法、计算机设备、计算机可读存储介质 |
CN112346644A (zh) * | 2020-11-19 | 2021-02-09 | 深圳Tcl新技术有限公司 | 基于激光感应的交互方法、终端设备及可读存储介质 |
CN112506384B (zh) * | 2020-12-18 | 2024-07-09 | 深圳Tcl新技术有限公司 | 基于激光信号的交互方法、装置、设备及可读存储介质 |
CN112700463A (zh) * | 2020-12-30 | 2021-04-23 | 上海幻维数码创意科技股份有限公司 | 基于图像检测的多媒体展厅交互方法、装置及存储介质 |
CN112822468B (zh) * | 2020-12-31 | 2023-02-17 | 成都极米科技股份有限公司 | 一种投影控制方法、装置、投影设备及激光控制器 |
CN113849073A (zh) * | 2021-08-25 | 2021-12-28 | 中国船舶重工集团公司第七0九研究所 | 一种面向远程操控的鼠标与回传画面的同步方法及系统 |
CN114527922A (zh) * | 2022-01-13 | 2022-05-24 | 珠海视熙科技有限公司 | 一种基于屏幕识别实现触控的方法及屏幕控制设备 |
CN116185243B (zh) * | 2023-04-28 | 2023-07-21 | 苏州市世为科技有限公司 | 一种人机交互数据处理评估预警系统 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1534544A (zh) * | 2003-04-01 | 2004-10-06 | 中国科学院电子学研究所 | 大屏幕非接触式控制方式 |
CN1912816A (zh) * | 2005-08-08 | 2007-02-14 | 北京理工大学 | 一种基于摄像头的虚拟触摸屏系统 |
CN101419513A (zh) * | 2008-12-09 | 2009-04-29 | 安徽大学 | 一种红外激光笔遥指虚拟触摸系统 |
US20110128258A1 (en) * | 2009-11-30 | 2011-06-02 | Hui-Hu Liang | Mouse Pen |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5825350A (en) * | 1996-03-13 | 1998-10-20 | Gyration, Inc. | Electronic pointing apparatus and method |
US6292171B1 (en) * | 1999-03-31 | 2001-09-18 | Seiko Epson Corporation | Method and apparatus for calibrating a computer-generated projected image |
US6275214B1 (en) * | 1999-07-06 | 2001-08-14 | Karl C. Hansen | Computer presentation system and method with optical tracking of wireless pointer |
JP4040046B2 (ja) * | 2003-03-14 | 2008-01-30 | 富士通株式会社 | ポインタ領域検出装置、方法及びプログラム、画像の対応付け装置、方法及びプログラム、並びにコンテンツ配信サーバ、コンテンツ配信方法 |
US7427758B2 (en) * | 2003-05-28 | 2008-09-23 | Opto-Knowledge Systems, Inc. | Cryogenically cooled adjustable apertures for infra-red cameras |
CN101027679B (zh) * | 2004-09-09 | 2010-04-21 | 奥普提克斯晶硅有限公司 | 表达通用的二维空间变换的系统和方法 |
JP2006121240A (ja) * | 2004-10-20 | 2006-05-11 | Sharp Corp | 画像投射方法、プロジェクタ、及びコンピュータプログラム |
JP3953500B1 (ja) * | 2006-02-07 | 2007-08-08 | シャープ株式会社 | 画像投影方法及びプロジェクタ |
JP3880609B1 (ja) * | 2006-02-10 | 2007-02-14 | シャープ株式会社 | 画像投影方法及びプロジェクタ |
EP1830246A1 (en) * | 2006-03-01 | 2007-09-05 | STMicroelectronics (Research & Development) Limited | Device and system for presenting information |
CN1952851A (zh) * | 2006-10-13 | 2007-04-25 | 广东威创日新电子有限公司 | 一种实现交互显示的电子装置和方法 |
GB0622451D0 (en) * | 2006-11-10 | 2006-12-20 | Intelligent Earth Ltd | Object position and orientation detection device |
US8089455B1 (en) * | 2006-11-28 | 2012-01-03 | Wieder James W | Remote control with a single control button |
TW201044226A (en) * | 2009-06-10 | 2010-12-16 | Weistech Technology Co Ltd | Integrated wired/wireless virtual unit control apparatus and method |
CN101714033B (zh) * | 2009-09-04 | 2014-06-18 | 谭登峰 | 一种多光点触摸控制装置 |
CN102103435B (zh) * | 2009-12-18 | 2013-04-17 | 深圳市巨龙科教高技术股份有限公司 | 一种交互式电子白板装置及其定位方法 |
US20110230238A1 (en) * | 2010-03-17 | 2011-09-22 | Sony Ericsson Mobile Communications Ab | Pointer device to navigate a projected user interface |
KR101726607B1 (ko) * | 2010-10-19 | 2017-04-13 | 삼성전자주식회사 | 휴대 단말기의 화면 제어 방법 및 장치 |
CN102073395B (zh) * | 2011-02-25 | 2012-08-29 | 上海交通大学 | 基于fpga的无线激光笔互动系统 |
CN102221933B (zh) * | 2011-07-03 | 2013-04-17 | 吉林大学 | 电子白板中失真投影面内触摸点屏幕坐标的精确计算方法 |
- 2011-11-08: CN CN201110349911.1A patent CN103092432B (zh), Active
- 2012-09-14: WO PCT/CN2012/081405 patent WO2013067849A1 (zh), Application Filing
- 2012-11-14: US US14/350,622 patent US20140247216A1 (en), Abandoned
- 2014-05-26: IN IN1012MUN2014 patent IN2014MN01012A (en), unknown
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1534544A (zh) * | 2003-04-01 | 2004-10-06 | 中国科学院电子学研究所 | 大屏幕非接触式控制方式 |
CN1912816A (zh) * | 2005-08-08 | 2007-02-14 | 北京理工大学 | 一种基于摄像头的虚拟触摸屏系统 |
CN101419513A (zh) * | 2008-12-09 | 2009-04-29 | 安徽大学 | 一种红外激光笔遥指虚拟触摸系统 |
US20110128258A1 (en) * | 2009-11-30 | 2011-06-02 | Hui-Hu Liang | Mouse Pen |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110502129A (zh) * | 2019-08-29 | 2019-11-26 | 王国梁 | 交互控制系统 |
Also Published As
Publication number | Publication date |
---|---|
CN103092432B (zh) | 2016-08-03 |
CN103092432A (zh) | 2013-05-08 |
US20140247216A1 (en) | 2014-09-04 |
IN2014MN01012A (zh) | 2015-07-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013067849A1 (zh) | 人机交互操作指令的触发控制方法和系统及激光发射装置 | |
US9684385B2 (en) | Display device, display system, and data supply method for display device | |
TWI423096B (zh) | 具可觸控投影畫面之投影系統 | |
CN102662498B (zh) | 一种投影演示的无线控制方法及系统 | |
US8943231B2 (en) | Display device, projector, display system, and method of switching device | |
US9645678B2 (en) | Display device, and method of controlling display device | |
CN103365549B (zh) | 输入装置、显示系统及输入方法 | |
US9830023B2 (en) | Image display apparatus and method of controlling image display apparatus | |
WO2013078989A1 (zh) | 人机交互操作指令的触发控制方法和系统 | |
CN101114200A (zh) | 功能命令系统、功能命令装置、功能命令分析系统、演示系统及计算机可读介质 | |
TW201349029A (zh) | 具光點辨識之互動投影系統以及控制方法 | |
JP2012238293A (ja) | 入力装置 | |
US20150261385A1 (en) | Picture signal output apparatus, picture signal output method, program, and display system | |
CN104064022A (zh) | 遥控方法和系统 | |
WO2018150569A1 (ja) | ジェスチャー認識装置、ジェスチャー認識方法、ジェスチャー認識装置を備えたプロジェクタおよび映像信号供給装置 | |
WO2014048031A1 (zh) | 一种光线遥控定位的方法、装置及系统 | |
CN104270664B (zh) | 光笔遥控器、实现智能操作平台输入控制的系统及方法 | |
US20230384868A1 (en) | Display apparatus | |
JP6273671B2 (ja) | プロジェクター、表示システム、及びプロジェクターの制御方法 | |
KR100843586B1 (ko) | 무접점 기능을 수행하는 장치 및 방법 | |
TWI518553B (zh) | 具多模式之互動投影系統及其指示裝置與控制方法 | |
KR100849532B1 (ko) | 무접점 마우스 기능을 가지는 장치 및 방법 | |
EP2296081A1 (en) | Image processing apparatus and method of controlling the same | |
US20110285624A1 (en) | Screen positioning system and method based on light source type | |
CN118411814B (zh) | 基于投影仪摄像头的类触控遥控方法及系统 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12848143 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 14350622 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205N DATED 15/07/2014) |
122 | Ep: pct application non-entry in european phase |
Ref document number: 12848143 Country of ref document: EP Kind code of ref document: A1 |