CN107368200B - Remote control device - Google Patents
- Publication number
- CN107368200B CN107368200B CN201710507743.1A CN201710507743A CN107368200B CN 107368200 B CN107368200 B CN 107368200B CN 201710507743 A CN201710507743 A CN 201710507743A CN 107368200 B CN107368200 B CN 107368200B
- Authority
- CN
- China
- Prior art keywords
- image
- light source
- remote control
- control device
- preset
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0308—Detection arrangements using opto-electronic means comprising a plurality of distinctive and separately oriented light emitters or reflectors associated to the pointing device, e.g. remote cursor controller with distinct and separately oriented LEDs at the tip whose radiations are captured by a photo-detector associated to the screen
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
- Selective Calling Equipment (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present invention proposes a remote control device for controlling a controlled device having a light source, the remote control device comprising: an image sensor to acquire a first image and a second image including the light source at different times; and a processor configured to form a two-dimensional space from an image acquired by the image sensor, calculate a current movement vector of the controlled device in the two-dimensional space from an imaging position of the light source in the first image and the second image in the two-dimensional space, calculate a pointing vector in the two-dimensional space from the imaging position of the light source in the second image in the two-dimensional space and a preset pointing position in the two-dimensional space, and determine a moving direction of the controlled device according to the current movement vector and the pointing vector in the two-dimensional space.
Description
Technical Field
The present invention relates to a remote control system, and more particularly, to a remote control device.
Background
A known remote control system includes, for example, a remote-controlled car that moves according to radio signals transmitted from a remote controller. The remote-controlled car generally includes a switch for activating the receiver so that the car stands by or moves linearly at a predetermined speed. The remote controller carries a plurality of keys, such as direction keys for controlling the moving direction of the remote-controlled car. To improve pointing accuracy, some remote controllers replace the direction keys with a joystick, which offers finer directional control; the user can then steer the remote-controlled car at any angle through the joystick of the remote controller.
However, since the user's line of sight does not generally coincide with the current direction of travel of the controlled device, the user must mentally translate every turn into the perspective of the controlled device during control. For example, when the user sends a left-turn instruction to the remote-controlled car, the car appears to turn left if the user's line of sight matches its traveling direction; conversely, if the user faces the oncoming car, the car appears to turn right. Direction control in the known remote control system is therefore unintuitive for the user.
Another remote control system uses a laser guide beam instead of the mechanical keys described above. This remote control system includes a controlled device and a remote control device. The remote control device is provided with a laser; the controlled device is provided with a camera that images the point illuminated by the laser guide beam and moves toward that point. Compared with key-based control, this greatly simplifies the button layout and improves the user experience. However, the camera of the controlled device must perform a 360-degree panoramic scan to image the laser guide beam, and the identification process is easily disturbed by ambient light sources, leading to slow response times and poor accuracy.
Accordingly, the present invention provides a remote control system capable of moving a controlled device efficiently and precisely without complicated buttons.
Disclosure of Invention
The present invention provides a remote control device for controlling a controlled device having a light source, the remote control device comprising: an image sensor to acquire a first image and a second image including the light source at different times; and a processor configured to form a two-dimensional space from an image acquired by the image sensor, calculate a current movement vector of the controlled device in the two-dimensional space from an imaging position of the light source in the first image and the second image in the two-dimensional space, calculate a pointing vector in the two-dimensional space from the imaging position of the light source in the second image in the two-dimensional space and a preset pointing position in the two-dimensional space, and determine a moving direction of the controlled device according to the current movement vector and the pointing vector in the two-dimensional space.
The present invention also provides a remote control device for controlling a controlled device having a first light source and a second light source, the remote control device comprising: an image sensor to acquire an image including the first light source and the second light source; and the processor is used for calculating the current moving direction of the controlled device in the image according to the imaging positions of the first light source and the second light source in the image, calculating a pointing vector in the image according to the imaging position of the first light source or the second light source in the image and a preset pointing position in the image, and determining the moving direction of the controlled device according to the current moving direction, the pointing vector and the imaging position in the image.
The present invention also provides a remote control device for controlling a controlled device having a light source with a preset pattern, the remote control device comprising: an image sensor for acquiring an image including the preset pattern; and the processor is used for judging the current moving direction of the controlled device in the image according to the preset pattern in the image, calculating a pointing vector in the image according to the imaging position of the light source in the image and a preset pointing position in the image, and determining the steering angle of the controlled device according to the current moving direction, the pointing vector and the imaging position in the image.
The remote control system of the pointing robot in the embodiments of the invention can determine the steering angle, moving direction, and/or moving distance of the controlled device through vector operations on images of the controlled device's light source. The user can therefore control the controlled device more intuitively.
In order to make the aforementioned and other features and advantages of the invention more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
FIG. 1 shows a schematic diagram of a remote control system according to a first embodiment of the present invention;
FIG. 2 is a schematic diagram of a two-dimensional space formed by images acquired by an image sensor according to a first embodiment of the present invention;
FIG. 3 shows a schematic diagram of a remote control system according to a second embodiment of the present invention;
FIG. 4 is a schematic diagram of an image captured by an image sensor according to a second embodiment of the present invention;
FIG. 5 shows a schematic diagram of a remote control system according to a third embodiment of the present invention;
FIG. 6 is a schematic diagram of an image captured by an image sensor according to a third embodiment of the present invention;
Description of the reference numerals
1, 2, 3 remote control system
10, 10' controlled device
12, 12' light source
121 first light source
122 second light source
14, 14' receiver
20 remote control device
25 image sensor
27 transmitter
29 processor
2DS two-dimensional space
C1, C2, P image center
F image
F1 first image
F2 second image
I12, I12' light source image
PC center of two-dimensional space
P12 imaging position
P1, P121 first imaging position
P2, P122 second imaging position
SC control signal
T1, T2 time
V viewing angle
θ steering angle
Detailed Description
In order that the above recited and other objects, features, and advantages of the present invention may be more readily understood, embodiments of the invention are described in detail below with reference to the appended drawings. Throughout the description, the same components are denoted by the same reference numerals and are described only once.
In the following description, the remote control system of the present invention will be described by way of example, in which a remote control device uses an image sensor instead of a known laser-guided beam. However, embodiments of the invention are not limited to any particular environment, application, or implementation. Accordingly, the following examples are illustrative only and are not intended to limit the present invention. It is to be understood that components not directly related to the present invention have been omitted and are not shown in the following embodiments and drawings.
Fig. 1 shows a schematic view of a remote control system 1, for example a pointing robot, according to a first embodiment of the present invention. The remote control system 1 includes a controlled apparatus 10 and a remote control apparatus 20. The user can control the controlled device 10 through the remote control device 20, such as controlling the traveling direction, moving speed, moving distance, steering angle, and/or operation intensity of the controlled device 10.
The controlled device 10 has a light source 12 and a receiver 14. The light source 12 blinks at an adjustable lighting frequency or lights continuously, serving as a position detection reference point for the controlled device 10. The receiver 14 is used for one-way or two-way communication with the remote control device 20. It should be noted that the position of the light source 12 of the controlled device 10 shown in fig. 1 is not intended to limit the present invention. In this embodiment, the controlled device 10 may be a pointing robot, such as a cleaning robot, that executes a predetermined function, but the invention is not limited thereto, as long as its operation can be controlled by a remote controller.
The remote control device 20 includes an image sensor 25, a transmitter 27, and a processor 29, wherein the processor 29 is electrically connected to the image sensor 25 and the transmitter 27. In one embodiment, the remote control device 20 further comprises at least one switch (not shown) for the user to operate; the switch may be a mechanical button or a capacitive switch for triggering the image sensor 25 to acquire an image and the transmitter 27 to send a control signal SC.
In this embodiment, the image sensor 25 is preferably located at the front of the remote control device 20, so that when the user holds the device, the remote control device 20 points substantially along the direction of the user's arm. The image sensor 25 acquires images within a predetermined viewing angle V, so as to capture the light source 12 of the controlled device 10. The processor 29 forms a two-dimensional space from the image for vector operation; for example, fig. 2 shows a schematic diagram of a two-dimensional space 2DS formed from images acquired by the image sensor 25.
Referring to FIGS. 1 and 2, assume that at time T1 the image sensor 25 acquires a first image F1 containing the light source 12, with an image center C1 (e.g., the pointing position of the user). The processor 29 then identifies the light source image I12 of the light source 12 in the first image F1 and maps the first image F1 to a two-dimensional coordinate space 2DS, such as a polar or rectangular coordinate system; in this case the image center C1 of the first image F1 corresponds to the two-dimensional space center PC, and the position of the light source image I12 in the first image F1 is recorded as a first imaging position P1. Similarly, assume that at time T2 the image sensor 25 acquires a second image F2 containing the light source 12, with an image center C2 (e.g., the pointing position of the user). The processor 29 then identifies the light source image I12' of the light source 12 in the second image F2 and maps the second image F2 to the two-dimensional coordinate space 2DS; in this case the image center C2 of the second image F2 corresponds to the two-dimensional space center PC, and the position of the light source image I12' in the second image F2 is recorded as a second imaging position P2. It must be noted that the first imaging position P1 and the second imaging position P2 may each be the center position, the center-of-gravity position, or another predetermined position of the object image formed by the light source, as long as the same definition is used for both.
It should be noted that reference signs 12, 12' in fig. 1 are only used to indicate the light source at different times and do not indicate different light sources; similarly, reference signs 10, 10' and 14, 14' indicate the same elements at different times, not different elements.
In this embodiment, the processing unit 29 transforms the first image F1 and the second image F2 into the same two-dimensional space to perform the vector operation. In one embodiment, the second image F2 is converted into the two-dimensional space formed by the first image F1; in another embodiment, the first image F1 is converted into the two-dimensional space formed by the second image F2; in yet another embodiment, the first image F1 and the second image F2 are each converted into a further two-dimensional space for the subsequent vector operation. For example, when the vector operation is carried out in the two-dimensional space formed by the first image F1, the image center C1 serves as the two-dimensional space center PC; when the vector operation is carried out in the two-dimensional space formed by the second image F2, the image center C2 serves as the two-dimensional space center PC; when a two-dimensional space other than the first image and the second image is used, the image centers C1 and C2 are both mapped to the two-dimensional space center PC. In the simplest case, the first image F1 and the second image F2 are directly superposed to determine the subsequent vector change between the first imaging position P1 and the second imaging position P2.
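As a concrete illustration of the superposition just described, mapping a pixel position into a two-dimensional space whose origin is the image center can be sketched as follows. This is a minimal sketch, not the patented implementation; the resolution values and the function name are assumptions for illustration only.

```python
def to_centered_space(pixel_pos, image_size):
    """Map a pixel position to coordinates whose origin is the image center,
    so positions from two frames can be compared in one two-dimensional space."""
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    return (pixel_pos[0] - cx, pixel_pos[1] - cy)

# Superposing two frames of the same resolution: both image centers map to (0, 0),
# so the two imaging positions share one coordinate frame.
p1 = to_centered_space((400, 300), (640, 480))  # first imaging position
p2 = to_centered_space((420, 280), (640, 480))  # second imaging position
```

With both frames expressed in this shared frame, the later vector operations reduce to simple coordinate arithmetic.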
Thus, in the two-dimensional space 2DS, the processor 29 obtains a current movement vector from the first imaging position P1 to the second imaging position P2, and obtains a pointing vector from the second imaging position P2 to the two-dimensional space center PC. According to the current movement vector, the pointing vector, and the second imaging position P2, the moving direction or steering angle θ of the controlled device 10 can then be determined by vector operation, for example toward the two-dimensional space center PC as shown in fig. 2.
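The vector operation can be sketched in a few lines: the steering angle is the signed angle between the current movement vector (P1 to P2) and the pointing vector (P2 to PC). This is only an illustrative sketch under those definitions; the signed-angle convention and the function name are assumptions, not part of the patent.

```python
import math

def steering_angle(p1, p2, pc):
    """Signed angle (radians) from the current movement vector P1->P2 to the
    pointing vector P2->Pc; positive means a counter-clockwise turn."""
    mx, my = p2[0] - p1[0], p2[1] - p1[1]   # current movement vector
    px, py = pc[0] - p2[0], pc[1] - p2[1]   # pointing vector toward the space center
    # atan2(cross, dot) yields the signed angle between the two vectors
    cross = mx * py - my * px
    dot = mx * px + my * py
    return math.atan2(cross, dot)

# Device moving along +x; pointing position offset perpendicular to the motion
theta = steering_angle((0, 0), (1, 0), (1, 1))
```

A zero result means the device is already heading toward the pointing position and needs no turn.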
In this embodiment, the transmitter 27 may send the moving direction to the receiver 14 by infrared light or radio waves (e.g., Bluetooth). It will be appreciated that figs. 1 and 2 merely show exemplary relative positions of the light source 12, the receiver 14, the transmitter 27, and the image sensor 25, and an exemplary shape of the viewing angle V; the shape and size of the viewing angle V are determined by the image capturing angle and image capturing distance of the image sensor 25. Preferably, the processing unit 29 converts the image acquired by the image sensor 25 into a rectangular or square two-dimensional space for the vector operation.
In one embodiment, if the processor 29 has a gray-level or color recognition function, the light source 12 of the controlled device 10 can be designed to emit light of different brightness or color corresponding to different operation modes of the controlled device 10: in the normal mode the controlled device 10 operates at a preset speed and preset intensity; in the mute mode it operates below the preset speed and intensity; and in the fast mode it operates above the preset speed and intensity. It should be noted that the functions of the various modes described here may be preset before shipment according to the functions executed by the controlled device 10, and are not limited to those disclosed in this description. Thus, when the processor 29 determines the moving direction of the controlled device 10, it may also determine whether to change the operation mode of the controlled device 10. In addition, the moving direction and the operation mode can be determined separately, and the number of operation modes can depend on the application.
In one embodiment, the light source 12 of the controlled device 10 may be designed to blink at different frequencies corresponding to different operating modes of the controlled device 10. Since the processor 29 determines the current operation mode of the controlled device 10 from a plurality of images, the image sensor 25 may continuously acquire a plurality of images each time it is triggered (for example, by pressing the switch), and is not limited to two images. Furthermore, the first image F1 and the second image F2 are not limited to two consecutive images; they may be two images separated by one or more frames within a continuously acquired sequence. In this embodiment, the light source 12 can be designed with different flashing frequencies corresponding to different operation modes of the controlled device 10 (such as the normal mode, the mute mode, or the fast mode); when the processor 29 determines the moving direction of the controlled device 10, it can also determine the flashing frequency of the light source 12 from the acquired images to decide whether to change the operation mode of the controlled device 10. In addition, if the controlled device 10 is set to operate in the mute mode, the controlled device 10 or the processor 29 may ignore the mode change instruction.
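One plausible way to recover the flashing frequency from a continuously acquired sequence is to threshold the light source's brightness per frame and count off-to-on transitions. The frame rate, brightness threshold, and the frequency-to-mode table below are all assumptions for illustration, not values from the patent.

```python
def blink_frequency(brightness_samples, frame_rate_hz, on_threshold=128):
    """Estimate the light source's blink frequency from per-frame brightness
    by counting off->on transitions over the sampled window."""
    states = [b >= on_threshold for b in brightness_samples]
    rises = sum(1 for prev, cur in zip(states, states[1:]) if cur and not prev)
    duration_s = len(brightness_samples) / frame_rate_hz
    return rises / duration_s

def mode_from_frequency(freq_hz):
    """Hypothetical mapping from frequency bands to operation modes."""
    if freq_hz < 5:
        return "normal"
    elif freq_hz < 15:
        return "mute"
    return "fast"

# Eight frames at 40 fps, light toggling every frame
freq = blink_frequency([200, 0, 200, 0, 200, 0, 200, 0], frame_rate_hz=40)
```

In practice the sampling window would need to satisfy Nyquist with respect to the fastest configured blink rate.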
In one embodiment, the processor 29 may also determine the moving speed and/or moving distance of the controlled device 10 according to the magnitude of at least one of the current movement vector and the pointing vector. For example, when the magnitude of the pointing vector is greater than or less than a threshold value, the control signal SC sent by the processor 29 may contain both the moving direction and mode change information, wherein the threshold value can be a fixed value or a multiple of the magnitude of the current movement vector. Further, a plurality of threshold values may be used, according to the number of modes that can be changed. The processor 29 may also determine whether the magnitude of the movement vector matches the user's setting to decide whether to perform a mode change, or may directly control the moving speed of the controlled device 10.
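The threshold comparison can be sketched as follows, here taking the threshold as a multiple of the current movement vector's magnitude (one of the two options the text allows). The multiplier value and the fields of the returned command are assumptions for illustration.

```python
import math

def control_from_vectors(move_vec, point_vec, ratio_threshold=2.0):
    """Build a control decision from the two vectors: the pointing vector gives
    the direction; its magnitude, compared against a threshold derived from the
    movement vector, decides whether a mode change is also signalled."""
    threshold = ratio_threshold * math.hypot(*move_vec)  # multiple of movement magnitude
    change_mode = math.hypot(*point_vec) > threshold
    return {
        "direction": math.atan2(point_vec[1], point_vec[0]),
        "change_mode": change_mode,
    }

# Pointing vector three times the movement magnitude -> mode change requested
cmd = control_from_vectors((1, 0), (0, 3))
```

A fixed threshold would simply replace the `ratio_threshold * ...` expression with a constant.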
Fig. 3 shows a schematic view of a remote control system 2 according to a second embodiment of the invention, which may also be a remote control system of a pointing robot and comprises a controlled device 10 and a remote control device 20. The user can also control the controlled device 10 via the remote control device 20, such as controlling the operation mode, the traveling direction, the steering angle, the moving speed, the moving distance, and/or the operation intensity of the controlled device 10.
The controlled device 10 has a first light source 121, a second light source 122, and a receiver 14, and moves in a preset direction (e.g., toward the front of the controlled device 10). The first light source 121 and the second light source 122 have different features (described in detail later) that allow the remote control device 20 to distinguish them. The receiver 14 is used for one-way or two-way communication with the remote control device 20. It should be noted that the positions of the first light source 121 and the second light source 122 of the controlled device 10 shown in fig. 3 are not intended to limit the present invention. As described in the first embodiment, the controlled device 10 may be a pointing robot that performs a predetermined function.
The remote control device 20 includes an image sensor 25, a transmitter 27, and a processor 29; wherein the processor 29 is electrically connected to the image sensor 25 and the transmitter 27. As in the first embodiment, the remote control device 20 may further comprise at least one switch (not shown).
In this embodiment, the processor 29 determines the moving direction of the controlled device 10 from a single image, acquired by the image sensor 25, that covers both the first light source 121 and the second light source 122; the processor 29 can therefore perform the vector operation directly in the two-dimensional space formed by this one image. The vector operation is described below directly in terms of the image acquired by the image sensor 25; for example, fig. 4 shows a schematic diagram of the image F acquired by the image sensor 25. It can be understood that, when the image capturing angle of the image sensor 25 causes the image F not to be rectangular, a spatial transformation may be used to convert the image F into a rectangular two-dimensional space for the vector operation.
Referring to fig. 3 and 4, the image sensor 25 acquires an image F containing the first light source 121 and the second light source 122. The processor 29 identifies the first light source 121 and the second light source 122 and records their positions as a first imaging position P121 and a second imaging position P122; the image F has an image center P, as shown in fig. 4. It must be noted that the first imaging position P121 and the second imaging position P122 may each be the center position, the center-of-gravity position, or another preset position of the object image, as long as the same definition is used for both.
Thus, the processor 29 obtains the current moving direction of the controlled device 10 from the first imaging position P121 and the second imaging position P122 in the image F. It is assumed here that the current moving direction and the preset moving direction are preset to the same direction, so the processor 29 can take the current moving direction as the preset moving direction of the controlled device 10. At the same time, the processor 29 obtains a pointing vector from the second imaging position P122 to the image center P in the image F. Thus, according to the current moving direction, the pointing vector, and the imaging position P122, the moving direction or steering angle θ of the controlled device can be determined by vector operation, as shown in fig. 4. The difference between this embodiment and the first embodiment is that the first embodiment determines the current moving direction of the controlled device 10 from the position change of its light source across two images, whereas the second embodiment determines it from the two light sources in the same image; the other parts are similar to the first embodiment and are therefore not described again.
It must be noted that setting the current moving direction and the preset moving direction to the same direction is only an example. In other embodiments, the current moving direction and the preset moving direction may differ, as long as their relationship is preset at the factory and converted by the processor 29. In this embodiment, as long as the relative positions of the first light source 121 and the second light source 122 can be determined, the current moving direction can be obtained.
In addition, this embodiment only illustrates the case in which the pointing vector runs from the second imaging position P122 to the image center P, with the starting point of the preset moving direction being the second imaging position P122; in another embodiment, the pointing vector may instead run from the first imaging position P121 to the image center P, and the starting point of the preset moving direction may be selected as the first imaging position P121.
In this embodiment, the first light source 121 and the second light source 122 of the controlled device 10 have different characteristics, such as different brightness, color, area, or shape. For example, if the processor 29 has a gray-level (or color) recognition function, the first light source 121 may consist of 3 LED lamps and the second light source 122 of 1 LED lamp; assuming the LED lamps have the same brightness, the processor 29 can determine the positions of the first light source 121 and the second light source 122 by gray-level recognition. The processor 29 can therefore identify the relative positions of the first light source 121 and the second light source 122 according to these light source characteristics to obtain the current moving direction of the controlled device 10.
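A minimal sketch of telling the two light sources apart by imaged area (e.g., a 3-LED cluster versus a single LED) might look like the following. The blob representation and function name are assumptions; a real implementation would first segment the bright regions out of the acquired image.

```python
def identify_sources(blobs):
    """Given detected light blobs as (x, y, pixel_area) tuples, label the
    larger blob (e.g. the 3-LED cluster) as the first light source and the
    smaller one as the second, returning their (x, y) positions."""
    ordered = sorted(blobs, key=lambda b: b[2], reverse=True)
    first, second = ordered[0], ordered[1]
    return (first[0], first[1]), (second[0], second[1])

# Two blobs: the 12-pixel cluster is taken as the first light source
p121, p122 = identify_sources([(10, 20, 4), (50, 60, 12)])
```

With both positions labelled, the current moving direction follows directly as the vector from P121 to P122.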
In addition, the combination of different brightness or color of the first light source 121 and the second light source 122 may correspond to different operation modes of the controlled device (e.g., the normal mode, the silent mode or the fast mode), so that when the processor 29 determines the moving direction of the controlled device 10, it may also determine whether to change the operation mode of the controlled device 10.
As described in the first embodiment, the first light source 121 and the second light source 122 can be designed to have different flashing frequencies to correspond to different operation modes of the controlled device 10 (e.g. the normal mode, the mute mode or the fast mode) or to distinguish different light sources, so that the processor 29 can also determine whether to change the operation mode of the controlled device 10 when determining the moving direction of the controlled device 10.
As described in the first embodiment, the processor 29 may also determine the moving speed and/or moving distance of the controlled device 10 according to the magnitude of the pointing vector, e.g., deciding whether to change the operation mode according to a comparison of the pointing vector's magnitude with at least one threshold value. This is similar to the first embodiment and is not described again here.
Fig. 5 shows a schematic view of a remote control system 3 according to a third embodiment of the invention, which may also be a remote control system of a pointing robot and comprises a controlled device 10 and a remote control device 20. The user can also control the controlled device 10 by means of the remote control device 20, for example, to control the direction of travel, steering angle, speed of movement, distance of movement, and/or intensity of operation of the controlled device 10.
The controlled device 10 has a light source 12 and a receiver 14 and moves in a preset direction, wherein the light source 12 has a preset pattern corresponding to the preset moving direction (e.g., an arrow pointing in the preset moving direction, as shown in fig. 5). The light source 12 blinks at an adjustable lighting frequency or lights continuously, serving as a position detection reference point for the controlled device 10. The receiver 14 is used for one-way or two-way communication with the remote control device 20. It should be noted that the position of the light source 12 of the controlled device 10 shown in fig. 5 is not intended to limit the present invention. As described in the first embodiment, the controlled device 10 may be a pointing robot that performs a predetermined function.
The remote control device 20 includes an image sensor 25, a transmitter 27, and a processor 29; wherein the processor 29 is electrically connected to the image sensor 25 and the transmitter 27. As in the first embodiment, the remote control device 20 may further comprise at least one switch (not shown).
In this embodiment, the processor 29 determines the moving direction or steering angle of the controlled device 10 from a single image, acquired by the image sensor 25, that covers the light source 12; the processor 29 can therefore perform the vector operation directly in the two-dimensional space formed by this image. The vector operation is described below directly in terms of the image acquired by the image sensor 25; for example, fig. 6 shows a schematic diagram of the image F acquired by the image sensor 25. It can be understood that, when the image capturing angle of the image sensor 25 causes the image F not to be rectangular, a spatial transformation may be used to convert the image F into a rectangular two-dimensional space for the vector operation.
Referring to fig. 5 and 6, the image sensor 25 acquires the image F containing the light source 12, and the processor 29 identifies the light source 12 and records its position as the imaging position P12, the image F having an image center P, as shown in fig. 6. At the same time, the processor 29 also determines the shape of the preset pattern of the light source 12. It must be noted that the imaging position P12 may be the center position, the center-of-gravity position, or another preset position of the object image (i.e., the light source image); for example, in the present embodiment the imaging position P12 is defined as the center position of the preset pattern.
Thus, from the imaging position P12 in the image F and the shape of the preset pattern, the processor 29 can obtain the preset moving direction of the controlled device 10. For example, if the shape of the preset pattern is an arrow and the arrow is preset to point in the preset moving direction, as shown in fig. 5, the processor 29 can take the preset moving direction indicated by the preset pattern as the current moving direction. Meanwhile, the processor 29 obtains a pointing vector toward the image center P in the image F; the steering angle θ or the moving direction of the controlled device can then be determined by a vector operation from the current moving direction, the pointing vector, and the imaging position, as shown in fig. 6. The difference between this embodiment and the first embodiment is that the first embodiment determines the current moving direction of the controlled device 10 from the position change of the light source of the controlled device 10 between two images, while this third embodiment determines the current moving direction of the controlled device 10 from the shape of the light source; other parts are similar to the first embodiment and are therefore not described again herein.
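The vector operation described above can be sketched as follows: the pointing vector runs from the imaging position of the light source to the preset pointing position (here the image center P), and the steering angle θ is the signed angle from the current moving direction to that vector. The function names and the choice of atan2 are illustrative assumptions; the patent only specifies that the result follows from a vector operation.

```python
import math

def steering_angle(current_dir, imaging_pos, pointing_pos):
    """Signed angle (degrees) from the current moving direction to the
    pointing vector. current_dir is the 2D direction of the controlled
    device; imaging_pos is the light source's imaging position P12;
    pointing_pos is the preset pointing position (e.g. image center P)."""
    # Pointing vector: from the light-source image position to the preset position.
    vx = pointing_pos[0] - imaging_pos[0]
    vy = pointing_pos[1] - imaging_pos[1]
    # Signed angle between current_dir and the pointing vector via atan2
    # of the 2D cross product over the dot product.
    cross = current_dir[0] * vy - current_dir[1] * vx
    dot = current_dir[0] * vx + current_dir[1] * vy
    return math.degrees(math.atan2(cross, dot))

# If the device currently moves along +x and the target lies along +y,
# the required turn is 90 degrees.
print(steering_angle((1, 0), (0, 0), (0, 5)))  # → 90.0
```

The same quantities (pointing vector and imaging position) are what the description reuses to derive the moving direction in the other embodiments.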
It has to be noted that the arrow shape of the preset pattern is only an example; in other embodiments, the preset pattern may be a triangle, a pentagon, or another asymmetrical pattern, as long as the processor 29 can recognize it and associate it with the preset direction. That is, in the present embodiment, as long as the processor 29 can recognize the shape of the light source image, the current moving direction (i.e. the preset direction) can be recognized.
In this embodiment, since the processor 29 has a pattern recognition function, the preset pattern of the light source 12 can be formed by arranging a plurality of light emitting diodes and controlling which of them emit light, so that different patterns correspond to different operation modes of the controlled device 10 (such as the normal mode, the mute mode, or the fast mode). Thus, when the processor 29 determines the moving direction of the controlled device 10, it can also determine whether to change the operation mode of the controlled device 10.
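The pattern-to-mode association described above can be sketched as a simple look-up once the recognizer has identified which diodes are lit. The bitmask encodings and mode names below are assumptions for illustration; the patent does not specify how patterns are encoded.

```python
# Hypothetical mapping from a recognized LED arrangement (encoded as a
# bitmask of which diodes are lit) to an operation mode of the
# controlled device. These encodings are assumptions, not taken from
# the patent.
PATTERN_TO_MODE = {
    0b111: "normal",
    0b101: "silent",
    0b011: "fast",
}

def mode_from_pattern(lit_mask):
    """Return the operation mode associated with a recognized pattern,
    or 'unknown' if the pattern is not in the table."""
    return PATTERN_TO_MODE.get(lit_mask, "unknown")

print(mode_from_pattern(0b101))  # → silent
```

The same look-up structure applies when the distinguishing feature is brightness, color, or flashing frequency instead of diode arrangement, as in the following paragraphs.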
In another embodiment, if the processor 29 has a color gradation or color recognition function, the light source 12 of the controlled device 10 can be designed to have different brightness or color to correspond to different operation modes of the controlled device 10 (such as the normal mode, the silent mode or the fast mode), so that when the processor 29 determines the moving direction of the controlled device 10, it can also determine whether to change the operation mode of the controlled device 10. At this time, the light source 12 may continuously emit light with different brightness or color to indicate the current operation mode.
As described in the first embodiment, the light source 12 may have different flashing frequencies to correspond to different operation modes of the controlled device 10 (e.g., the normal mode, the silent mode, or the fast mode), so that when the processor 29 determines the moving direction of the controlled device 10, it may also determine whether to change the operation mode of the controlled device 10.
As described in the first embodiment, the processor 29 may also determine the moving speed and/or the moving distance of the controlled device 10 according to the magnitude of the pointing vector; for example, the comparison result of the magnitude of the pointing vector with at least one threshold value may determine whether to change the operation mode. This is similar to the first embodiment and is therefore not described again herein.
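A minimal sketch of the threshold comparison: the magnitude of the pointing vector is compared against one or more thresholds, and the result selects an operation mode. The specific mode names reuse the examples from the description; the threshold values themselves are assumptions.

```python
import math

def pick_mode(pointing_vector, thresholds=(50.0, 150.0)):
    """Choose an operation mode from the magnitude of the pointing
    vector, compared against at least one threshold. Threshold values
    are illustrative assumptions."""
    magnitude = math.hypot(pointing_vector[0], pointing_vector[1])
    if magnitude < thresholds[0]:
        return "silent"   # target is close: move quietly/slowly
    if magnitude < thresholds[1]:
        return "normal"
    return "fast"         # target is far: move quickly

print(pick_mode((30, 40)))    # magnitude 50 → normal
print(pick_mode((300, 400)))  # magnitude 500 → fast
```

The same magnitude can equally be scaled into a moving speed or distance rather than bucketed into modes, as the paragraph above allows.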
In the above embodiments, the remote control device 20 is preferably a hand-held remote control device; in other embodiments, the remote control device 20 may also be a portable electronic device, such as a smart phone with infrared light or bluetooth function, wherein the camera function of the smart phone may correspond to the function of the image sensor 25 of the present invention.
The field of view of the image sensor 25 in the above embodiments preferably covers the entire controlled device 10, to ensure that all of the light sources can be captured by the image sensor 25. The image sensor 25 may be a Complementary Metal Oxide Semiconductor (CMOS) image sensor or a Charge Coupled Device (CCD) image sensor.
In one embodiment, the image sensor 25 of the above embodiments may be divided into a plurality of sub-regions according to the coordinate system applied by the processor 29. Taking a rectangular coordinate system as an example, please refer to fig. 2, which shows a schematic diagram of a matrix in which an operation area is divided into a plurality of sub-areas. That is, the position information of at least one light point in the image may be obtained from a look-up table formed by a matrix dividing the field of view of the image sensor into a plurality of sub-regions.
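The sub-region look-up scheme above can be sketched as a direct index computation: dividing the sensor's field of view into a rows × cols matrix and mapping a pixel coordinate to the index of the sub-region containing it. The grid dimensions below are illustrative assumptions.

```python
def subregion_index(x, y, width, height, cols, rows):
    """Map a pixel coordinate (x, y) to the index of the sub-region it
    falls in, when the field of view (width x height) is divided into a
    rows x cols matrix of sub-regions. Indices run row-major from the
    top-left corner."""
    col = min(x * cols // width, cols - 1)   # clamp edge pixels into the grid
    row = min(y * rows // height, rows - 1)
    return row * cols + col

# A 640x480 frame divided into an 8x6 matrix of sub-regions:
print(subregion_index(0, 0, 640, 480, 8, 6))      # → 0  (top-left sub-region)
print(subregion_index(639, 479, 640, 480, 8, 6))  # → 47 (bottom-right)
```

A precomputed table indexed this way gives the coarse position information of a light point without per-frame arithmetic on every pixel.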
The light source can be any known light source, such as a Light Emitting Diode (LED), a Laser Diode (LD), or another active light source, but the invention is not limited thereto; the shape of the light source can be defined by transmitting the light emitted by the active light source through a translucent cover. If an infrared light source is used, interference with the user's vision can be avoided.
In the description of the present invention, the pointing position may be defined as an image center, and when the pointing position is mapped to a two-dimensional space for vector operation, the pointing position may be mapped to the two-dimensional space center. However, according to different embodiments, the pointing position may also be defined as other preset positions in the acquired image, such as corners of the image; when mapping to a two-dimensional space for vector operations, the pointing position may be mapped to a relative position of the two-dimensional space.
As described above, known remote control systems that have a plurality of keys, or that are operated by a laser-guided beam, suffer from poor accuracy and slow response time, respectively. Accordingly, the present invention provides a remote control system for a pointing robot that can determine the turning angle, the moving direction, the moving speed, and/or the moving distance of a controlled device through a vector operation based on a light source image of the controlled device. Thereby, the user can control the controlled device more intuitively with the remote control device.
Although the present invention has been disclosed in the foregoing embodiments, it should be understood that they have been presented by way of example only, and various changes and modifications can be made by one skilled in the art without departing from the spirit and scope of the invention. Therefore, the protection scope of the present invention is subject to the scope defined by the appended claims.
Claims (19)
1. A remote control device for controlling a controlled device having a light source, the remote control device comprising:
an image sensor to acquire a first image and a second image including the light source at different times; and
a processor configured to form a two-dimensional space from the image acquired by the image sensor, calculate a current movement vector of the controlled device in the two-dimensional space from the imaging position of the light source in the first image and the second image in the two-dimensional space, calculate a pointing vector in the two-dimensional space from the imaging position of the light source in the second image in the two-dimensional space and a preset pointing position in the two-dimensional space, and determine a moving direction of the controlled device from the current movement vector and the pointing vector in the two-dimensional space,
wherein the preset pointing position is a relative position in the two-dimensional space to which a preset position in the image acquired by the image sensor is mapped, and the processor calculates a vector of the imaging position of the light source in the second image in the two-dimensional space to the relative position as the pointing vector.
2. The remote control device of claim 1, wherein the remote control device further comprises a transmitter by which the processor transmits the direction of movement as infrared light or radio waves.
3. The remote control device of claim 1, wherein the light source has a blinking frequency.
4. The remote control device of claim 1, wherein the preset position is an image center of the image acquired by the image sensor.
5. The remote control device of claim 1, wherein the processor also determines a speed of movement and/or a distance of movement of the controlled device based on a magnitude of at least one of the present movement vector and the pointing vector.
6. The remote control device of claim 1, wherein the remote control device further comprises at least one switch to trigger the image sensor to acquire the image.
7. The remote control apparatus according to claim 1, wherein the imaging position of the light source is a center position, a center of gravity position, or a preset position of an object image formed by the light source.
8. A remote control device for controlling a controlled device having a first light source and a second light source, the remote control device comprising:
an image sensor to acquire an image including the first light source and the second light source; and
a processor for calculating a current moving direction of the controlled device in the image according to the imaging positions of the first light source and the second light source in the image, calculating a pointing vector in the image according to the imaging position of the first light source or the second light source in the image and a preset pointing position in the image, and determining a moving direction of the controlled device according to the current moving direction, the pointing vector and the imaging position in the image,
wherein the preset pointing position is a preset position of the image acquired by the image sensor, and the processor calculates a vector of the imaging position of the first light source or the second light source in the image to the preset position as the pointing vector.
9. The remote control device of claim 8, wherein the remote control device further includes a transmitter by which the processor transmits the direction of movement as infrared light or radio waves.
10. The remote control device of claim 8, wherein the first light source and the second light source have different characteristics.
11. The remote control device of claim 8, wherein the preset position is an image center of the image acquired by the image sensor.
12. The remote control device of claim 8, wherein the remote control device further comprises at least one switch to trigger the image sensor to acquire the image.
13. The remote control apparatus according to claim 8, wherein the imaging positions of the first light source and the second light source are a center position, a center of gravity position, or a preset position of an object image formed by the first light source and the second light source, respectively.
14. A remote control device for controlling a controlled device having a light source with a preset pattern, the remote control device comprising:
an image sensor for acquiring an image including the preset pattern; and
a processor for determining a current moving direction of the controlled device in the image according to the preset pattern in the image, calculating a pointing vector in the image according to an imaging position of the light source in the image and a preset pointing position in the image, and determining a steering angle of the controlled device according to the current moving direction, the pointing vector and the imaging position in the image,
wherein the preset pointing position is a preset position of the image acquired by the image sensor, and the processor calculates a vector of the imaging position of the light source in the image to the preset position as the pointing vector.
15. The remote control device of claim 14, wherein the remote control device further includes a transmitter by which the processor transmits the steering angle in infrared light or radio waves.
16. The remote control device of claim 14, wherein the preset pattern of light sources is an arrow or an asymmetrical pattern.
17. The remote control device of claim 14, wherein the preset pattern is a variable pattern.
18. The remote control device of claim 14, wherein the preset position is an image center of the image acquired by the image sensor.
19. The remote control device of claim 14, wherein the remote control device further comprises at least one switch to trigger the image sensor to acquire the image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710507743.1A CN107368200B (en) | 2013-06-18 | 2013-06-18 | Remote control device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710507743.1A CN107368200B (en) | 2013-06-18 | 2013-06-18 | Remote control device |
CN201310240883.9A CN104238555B (en) | 2013-06-18 | 2013-06-18 | The remote control system of directional type robot |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310240883.9A Division CN104238555B (en) | 2013-06-18 | 2013-06-18 | The remote control system of directional type robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107368200A CN107368200A (en) | 2017-11-21 |
CN107368200B true CN107368200B (en) | 2020-06-02 |
Family
ID=52226847
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710507743.1A Active CN107368200B (en) | 2013-06-18 | 2013-06-18 | Remote control device |
CN201310240883.9A Active CN104238555B (en) | 2013-06-18 | 2013-06-18 | The remote control system of directional type robot |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310240883.9A Active CN104238555B (en) | 2013-06-18 | 2013-06-18 | The remote control system of directional type robot |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN107368200B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108536165A (en) * | 2018-04-02 | 2018-09-14 | 深圳小趴智能科技有限公司 | A kind of posture induction remote control control robot movement technique |
CN109375628A (en) * | 2018-11-28 | 2019-02-22 | 南京工程学院 | A kind of Intelligent Mobile Robot air navigation aid positioned using laser orientation and radio frequency |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101154110A (en) * | 2006-09-29 | 2008-04-02 | 三星电子株式会社 | Method, apparatus, and medium for controlling mobile device based on image of real space including the mobile device |
CN101909379A (en) * | 2009-06-05 | 2010-12-08 | 原相科技股份有限公司 | Light source control system and control method thereof |
CN102169366A (en) * | 2011-03-18 | 2011-08-31 | 汤牧天 | Multi-target tracking method in three-dimensional space |
CN102903227A (en) * | 2011-07-26 | 2013-01-30 | 原相科技股份有限公司 | Optical remote control system |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020085097A1 (en) * | 2000-12-22 | 2002-07-04 | Colmenarez Antonio J. | Computer vision-based wireless pointing system |
CN1198243C (en) * | 2001-12-29 | 2005-04-20 | 原相科技股份有限公司 | Moving distance detecting method of image sensor |
CN100416476C (en) * | 2004-01-20 | 2008-09-03 | 原相科技股份有限公司 | Hand held direction indicator and method for evaluating displacement quantity |
JP4142073B2 (en) * | 2006-10-13 | 2008-08-27 | 株式会社コナミデジタルエンタテインメント | Display device, display method, and program |
CN101542550A (en) * | 2006-11-27 | 2009-09-23 | 皇家飞利浦电子股份有限公司 | Remote control pointing technology |
CN101388138B (en) * | 2007-09-12 | 2011-06-29 | 原相科技股份有限公司 | Interaction image system, interaction apparatus and operation method thereof |
CN102289304B (en) * | 2011-08-17 | 2014-05-07 | Tcl集团股份有限公司 | Method and system for realizing mouse movement simulation of remote controller as well as remote controller |
2013
- 2013-06-18 CN CN201710507743.1A patent/CN107368200B/en active Active
- 2013-06-18 CN CN201310240883.9A patent/CN104238555B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101154110A (en) * | 2006-09-29 | 2008-04-02 | 三星电子株式会社 | Method, apparatus, and medium for controlling mobile device based on image of real space including the mobile device |
CN101909379A (en) * | 2009-06-05 | 2010-12-08 | 原相科技股份有限公司 | Light source control system and control method thereof |
CN102169366A (en) * | 2011-03-18 | 2011-08-31 | 汤牧天 | Multi-target tracking method in three-dimensional space |
CN102903227A (en) * | 2011-07-26 | 2013-01-30 | 原相科技股份有限公司 | Optical remote control system |
Also Published As
Publication number | Publication date |
---|---|
CN107368200A (en) | 2017-11-21 |
CN104238555B (en) | 2017-09-22 |
CN104238555A (en) | 2014-12-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10112295B2 (en) | Remote control system for pointing robot | |
US9301372B2 (en) | Light control method and lighting device using the same | |
US20170270924A1 (en) | Control device and method with voice and/or gestural recognition for the interior lighting of a vehicle | |
CN110893085B (en) | Cleaning robot and charging path determining method thereof | |
WO2013104312A1 (en) | System for use in remote controlling controlled device | |
JP2019527430A (en) | Tactile guidance system | |
JPH09265346A (en) | Space mouse, mouse position detection device and visualization device | |
KR102627014B1 (en) | electronic device and method for recognizing gestures | |
US11442274B2 (en) | Electronic device and method for displaying object associated with external electronic device on basis of position and movement of external electronic device | |
CN112451962B (en) | Handle control tracker | |
JP7294350B2 (en) | Information processing device, information processing method, and program | |
EP2928269A1 (en) | Lighting system and control method thereof | |
CN107368200B (en) | Remote control device | |
US10902627B2 (en) | Head mounted device for virtual or augmented reality combining reliable gesture recognition with motion tracking algorithm | |
JP2005529427A (en) | Method and apparatus for sensing the spatial position of an object | |
KR20160051360A (en) | Gesture recognition apparatus for vehicle | |
CN110543234B (en) | Information processing apparatus and non-transitory computer readable medium | |
US20170168592A1 (en) | System and method for optical tracking | |
TW201435656A (en) | Information technology device input systems and associated methods | |
CN113453147A (en) | Gesture interaction control system based on UWB | |
US20140132500A1 (en) | Method and apparatus for recognizing location of moving object in real time | |
EP3795898B1 (en) | Electronic device for providing visual effect by using light-emitting device on basis of user location, and method therefor | |
KR20040027561A (en) | A TV system with a camera-based pointing device, and an acting method thereof | |
WO2016050828A1 (en) | User interface system based on pointing device | |
JP2017092761A (en) | Imaging System |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |