US9978261B2 - Remote controller and information processing method and system - Google Patents
Info
- Publication number
- US9978261B2 (application US14/493,520)
- Authority
- US
- United States
- Prior art keywords
- terminal device
- remote controller
- gesture
- control signal
- movement
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0384—Wireless input, i.e. hardware and software details of wireless interface arrangements for pointing devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/20—Binding and programming of remote control devices
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/30—User interface
- G08C2201/32—Remote control based on movements, attitude of remote control device
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/50—Receiving or transmitting feedback, e.g. replies, status updates, acknowledgements, from the controlled devices
- G08C2201/51—Remote controlling of devices based on replies, status thereof
Definitions
- the present invention relates to the field of electronic technologies, and in particular, to a remote controller and an information processing method and system.
- a remote controller is an apparatus for remotely controlling a terminal device.
- Current remote controllers are mainly pushbutton remote controllers on which multiple convex pushbuttons are arranged.
- a user may hold a pushbutton remote controller with a hand, and control, by pressing different pushbuttons, a terminal device to execute corresponding response instructions.
- the pushbutton remote controller requires the user to control the terminal device by an action such as clicking or pressing and holding a pushbutton, and is not easy to carry. Therefore, when the pushbutton remote controller is used, user experience is relatively poor.
- Embodiments of the present invention provide a remote controller and an information processing method and system, which can improve user experience, where the remote controller is easy to carry.
- a remote controller including a remote control main body, where the remote control main body is a wearable ring body; a sensor configured to detect a signal generated by an operation gesture input by a user; a processor, where the processor is connected to the sensor, and is configured to analyze the signal generated by the operation gesture, and generate a corresponding control signal according to an analysis result; and a first wireless communications module, connected to the processor, and configured to send the control signal to a terminal device, so that the terminal device performs a corresponding operation according to the control signal.
- the sensor includes a touch pad, where the touch pad is located on an outer surface of the remote control main body, and is configured to receive a touch gesture input by the user, where the touch gesture is one of the operation gestures; and/or, an acceleration sensor, where the acceleration sensor is configured to convert an acceleration, which is generated when the user shakes the remote controller, to a voltage output signal; and/or, a first sensor, where the first sensor is located on an inner surface of the remote control main body, and is configured to detect a movement direction and a movement distance in the movement direction when the remote controller moves.
- the processor is further configured to when the operation gesture received by the touch pad is a gesture indicating pairing, activate a first near field communication (NFC) module, so that the first NFC module establishes a connection to and communicates with a second NFC module of the terminal device within a preset time threshold, where a distance between the first NFC module and the second NFC module is within a specific distance range; and the first wireless communications module is further configured to establish a connection between the first wireless communications module of the remote controller and a second wireless communications module of the terminal device by means of communication between the first NFC module and the second NFC module, where the connection between the first wireless communications module of the remote controller and the second wireless communications module of the terminal device is a Wireless Fidelity (WIFI) or Bluetooth connection.
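As an illustrative aid (not part of the patent text), the following Python sketch shows one way the gesture-triggered pairing flow above could be organized; the `nfc_module` and `wireless_module` interfaces, their method names, and the 10-second timeout are assumptions.

```python
import time

# Illustrative value; the patent only speaks of "a preset time threshold"
# and a specific NFC distance range (typically 0-10 cm).
PAIRING_TIMEOUT_S = 10.0


class PairingController:
    """Sketch of the gesture-triggered NFC pairing flow."""

    def __init__(self, nfc_module, wireless_module):
        self.nfc = nfc_module            # first NFC module (assumed interface)
        self.wireless = wireless_module  # first wireless communications module

    def on_operation_gesture(self, gesture):
        # The first NFC module stays inactive until a pairing gesture arrives,
        # which is what prevents misconnections from unintended touches.
        if gesture != "pairing":
            return False
        self.nfc.activate()
        deadline = time.monotonic() + PAIRING_TIMEOUT_S
        while time.monotonic() < deadline:
            peer = self.nfc.poll_peer()   # returns a peer only within NFC range
            if peer is not None:
                # Exchange link information over NFC, then bring up the
                # longer-range WiFi or Bluetooth connection with the device.
                link_info = self.nfc.exchange_link_info(peer)
                self.wireless.connect(link_info)
                return True
            time.sleep(0.1)
        self.nfc.deactivate()             # give up after the preset threshold
        return False
```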
- the first wireless communications module is further configured to send a first operation instruction table to the second wireless communications module, so that the terminal device updates an operation instruction table in the terminal device according to the first operation instruction table, where the first operation instruction table records relationships between operation gestures and response instructions;
- the processor is configured to generate, according to the signal generated by the operation gesture input by the user, a control signal indicating the operation gesture;
- the first wireless communications module is configured to send the control signal to the terminal device, so that the terminal device queries the first operation instruction table according to the operation gesture indicated by the control signal, and executes, according to a query result, a response instruction corresponding to the operation gesture.
- the first wireless communications module is further configured to receive a second operation instruction table sent by the second wireless communications module; the first wireless communications module is further configured to update an operation instruction table in the remote controller according to the second operation instruction table, where the second operation instruction table records relationships between control signals and response instructions; the processor is configured to query the second operation instruction table according to the operation gesture, and generate a control signal that carries a response instruction corresponding to the operation gesture; and the first wireless communications module is configured to send the control signal to the terminal device, so that the terminal device executes the response instruction carried in the control signal.
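The two table-driven modes above can be contrasted with a short sketch. The table contents, gesture names, and signal identifiers below are hypothetical; the patent only gives examples such as a tap gesture corresponding to opening a file.

```python
# First operation instruction table: operation gesture -> response instruction.
# The remote sends this table to the terminal device, then only reports gestures.
FIRST_OPERATION_INSTRUCTION_TABLE = {
    "tap": "open_file",
    "cover_and_rotate": "scroll_down",
}

# Second operation instruction table: control signal -> response instruction.
# The terminal device sends this table to the remote, which then resolves the
# instruction itself and sends it inside the control signal.
SECOND_OPERATION_INSTRUCTION_TABLE = {
    "control_signal_1": "open_file",
    "control_signal_2": "scroll_down",
}

# Assumed mapping used by the remote in the second mode to turn a recognized
# gesture into a control-signal identifier.
GESTURE_TO_SIGNAL = {"tap": "control_signal_1", "cover_and_rotate": "control_signal_2"}


def make_control_signal_gesture_mode(gesture: str) -> dict:
    """Mode 1: the control signal only indicates the operation gesture; the
    terminal device queries the first table and executes the result."""
    return {"kind": "gesture", "value": gesture}


def terminal_device_handle(signal: dict) -> str:
    """Terminal-device side of mode 1."""
    return FIRST_OPERATION_INSTRUCTION_TABLE.get(signal["value"], "no_op")


def make_control_signal_instruction_mode(gesture: str) -> dict:
    """Mode 2: the remote queries the second table and sends a control signal
    that already carries the response instruction."""
    signal_id = GESTURE_TO_SIGNAL.get(gesture)
    instruction = SECOND_OPERATION_INSTRUCTION_TABLE.get(signal_id, "no_op")
    return {"kind": "instruction", "value": instruction}
```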
- the operation gesture includes a gesture of touching and holding after sliding, a gesture of covering, a gesture of covering and rotating, a gesture of covering and translation, or a gesture of holding after covering and rotating.
- the processor is configured to receive touch point information reported by the touch pad, where touch points are continuous; record start position coordinates and end position coordinates of the operation gesture; record an operation residence time of the user at the end position coordinates; and when a distance between the start position coordinates and the end position coordinates is larger than or equal to a first preset distance, and the operation residence time is longer than or equal to a first time threshold, determine that the operation gesture is the gesture of touching and holding after sliding.
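A minimal sketch of the slide-then-hold decision described above, assuming the touch pad reports continuous touch points as (x, y, timestamp) tuples; the threshold values and the jitter tolerance are illustrative, since the patent leaves the preset distance and time threshold unspecified.

```python
import math

FIRST_PRESET_DISTANCE = 30.0   # assumed touch-pad units
FIRST_TIME_THRESHOLD = 0.5     # assumed seconds


def is_touch_and_hold_after_sliding(touch_points):
    """touch_points: list of (x, y, timestamp) from one continuous touch.
    Returns True when the start/end separation and the residence time at the
    end position both reach their thresholds."""
    if len(touch_points) < 2:
        return False
    x0, y0, _ = touch_points[0]
    xn, yn, t_end = touch_points[-1]
    distance = math.hypot(xn - x0, yn - y0)
    # Residence time: how long the touch stayed near the end position.
    residence_start = t_end
    for x, y, t in reversed(touch_points):
        if math.hypot(x - xn, y - yn) > 1.0:   # small jitter tolerance (assumed)
            break
        residence_start = t
    residence_time = t_end - residence_start
    return distance >= FIRST_PRESET_DISTANCE and residence_time >= FIRST_TIME_THRESHOLD
```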
- the processor is configured to obtain a touch area detected by the touch pad and generated by a user operation; obtain a spacing between two points, between which an annular distance is the largest, in the touch area as a first length; and obtain a spacing between two points, between which an axial distance is the largest, in the touch area as a second length.
- the processor is configured to when a movement distance of the remote controller in a first movement direction is smaller than a first preset movement distance and a movement distance in a second movement direction is smaller than a second preset movement distance, and the first length is larger than or equal to a first preset length and the second length is larger than or equal to a second preset length, determine that the operation gesture is the gesture of covering, where the first movement direction is parallel to a circumferential direction of the ring body, and the second movement direction is parallel to an axis direction of the remote control main body.
- the processor is configured to when a movement distance of the remote controller in a first movement direction is larger than or equal to a first preset movement distance, and a movement time is shorter than a first preset movement time, and at the same time, an operation residence time of the user at a movement end position is shorter than or equal to a second time threshold, the first length is larger than or equal to a first preset length, and the second length is larger than or equal to a second preset length, determine that the operation gesture is the gesture of covering and rotating, where the first movement direction is parallel to a circumferential direction of the ring body.
- the processor is configured to when a movement distance of the remote controller in a second movement direction is larger than or equal to a second preset movement distance, and a movement time is shorter than a second preset movement time, and at the same time, an operation residence time of the user at a movement end position is shorter than or equal to a second time threshold, the first length is larger than or equal to a first preset length, and the second length is larger than or equal to a second preset length, determine that the operation gesture is the gesture of covering and translation, where the second movement direction is parallel to an axis direction of the remote control main body.
- the processor is configured to when a movement distance of the remote controller in a first movement direction is larger than or equal to a first preset movement distance, and a movement time is shorter than a first preset movement time, and at the same time, an operation residence time of the user at a movement end position is longer than a second time threshold, the first length is larger than or equal to a first preset length, and the second length is larger than or equal to a second preset length, determine that the operation gesture is the gesture of holding after covering and rotating, where the first movement direction is parallel to a circumferential direction of the ring body.
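The covering-family decisions in the preceding paragraphs can be summarized in one classification function. This is a sketch under assumed threshold values; the patent names the thresholds but does not fix them.

```python
# Illustrative threshold values; the patent only calls them "preset".
FIRST_PRESET_LENGTH = 40.0        # minimum annular extent of the touch area
SECOND_PRESET_LENGTH = 20.0       # minimum axial extent of the touch area
FIRST_PRESET_MOVE_DIST = 15.0     # circumferential (rotation) distance
SECOND_PRESET_MOVE_DIST = 15.0    # axial (translation) distance
FIRST_PRESET_MOVE_TIME = 1.0      # seconds
SECOND_PRESET_MOVE_TIME = 1.0     # seconds
SECOND_TIME_THRESHOLD = 0.3       # residence time at the movement end position


def classify_covering_gesture(first_len, second_len,
                              dist_dir1, dist_dir2,
                              move_time, end_residence):
    """Sketch of the decision rules above. dist_dir1 is the movement distance
    along the circumferential direction, dist_dir2 along the axial direction."""
    covered = first_len >= FIRST_PRESET_LENGTH and second_len >= SECOND_PRESET_LENGTH
    if not covered:
        return None
    if dist_dir1 < FIRST_PRESET_MOVE_DIST and dist_dir2 < SECOND_PRESET_MOVE_DIST:
        return "covering"
    if dist_dir1 >= FIRST_PRESET_MOVE_DIST and move_time < FIRST_PRESET_MOVE_TIME:
        if end_residence <= SECOND_TIME_THRESHOLD:
            return "covering_and_rotating"
        return "holding_after_covering_and_rotating"
    if dist_dir2 >= SECOND_PRESET_MOVE_DIST and move_time < SECOND_PRESET_MOVE_TIME:
        if end_residence <= SECOND_TIME_THRESHOLD:
            return "covering_and_translation"
    return None
```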
- the remote controller further includes a microphone, where the microphone is located on the outer surface of the remote control main body, and is configured to receive an external voice input.
- the remote controller further includes an indicator light, where the indicator light is located on the outer surface of the remote control main body, and is configured to provide indication signals of different colors for the user.
- an information processing method applied to the foregoing remote controller, and including receiving a signal generated by an operation gesture input by a user, where the operation gesture includes a gesture of touching and holding after sliding, a gesture of covering, a gesture of covering and rotating, a gesture of covering and translation, or a gesture of holding after covering and rotating; analyzing the signal generated by the operation gesture; generating a control signal according to an analysis result; and sending the control signal to a terminal device, so that the terminal device performs a corresponding operation according to the control signal.
- the method before the receiving a signal generated by an operation gesture input by a user, the method further includes receiving an operation gesture indicating pairing; activating a first NFC module of the remote controller; within a preset time threshold, when a distance between the first NFC module and a second NFC module of the terminal device is shortened to be within a specific distance range, establishing, by the first NFC module, a connection to and communicating with the second NFC module; and establishing a connection between a first wireless communications module of the remote controller and a second wireless communications module of the terminal device by means of communication between the first NFC module and the second NFC module, where the connection between the first wireless communications module of the remote controller and the second wireless communications module of the terminal device is a WIFI or Bluetooth connection.
- the control signal includes the operation gesture, after the establishing a connection between a first wireless communications module of the remote controller and a second wireless communications module of the terminal device by means of communication between the first NFC module and the second NFC module, the method further includes sending a first operation instruction table to the terminal device, so that the terminal device updates an operation instruction table in the terminal device according to the first operation instruction table, where the first operation instruction table records relationships between operation gestures and response instructions; and the sending the control signal to a terminal device, so that the terminal device performs a corresponding operation according to the control signal includes sending the control signal to the terminal device, so that the terminal device queries the first operation instruction table according to the operation gesture indicated by the control signal, and executes a corresponding response instruction according to a query result.
- the method further includes receiving a second operation instruction table sent by the second wireless communications module; and updating an operation instruction table in the remote controller according to the second operation instruction table, where the second operation instruction table records relationships between control signals and response instructions; and the sending the control signal to a terminal device, so that the terminal device performs a corresponding operation according to the control signal includes sending the control signal to the terminal device, so that the terminal device executes the response instruction carried in the control signal.
- the analyzing the operation gesture includes receiving touch point information reported by the touch pad, where touch points are continuous; recording start position coordinates and end position coordinates of the operation gesture; recording an operation residence time of the user at the end position coordinates; and when a distance between the start position coordinates and the end position coordinates is larger than or equal to a first preset distance, and the operation residence time is longer than or equal to a first time threshold, determining that the operation gesture is the gesture of touching and holding after sliding.
- the analyzing the operation gesture includes obtaining a touch area detected by the touch pad and generated by a user operation; obtaining a spacing between two points, between which an annular distance is the largest, in the touch area as a first length; and obtaining a spacing between two points, between which an axial distance is the largest, in the touch area as a second length.
- the analyzing the operation gesture further includes when a movement distance of the remote controller in a first movement direction is smaller than a first preset movement distance and a movement distance in a second movement direction is smaller than a second preset movement distance, and the first length is larger than or equal to a first preset length and the second length is larger than or equal to a second preset length, determining that the operation gesture is the gesture of covering, where the first movement direction is parallel to a circumferential direction of the ring body, and the second movement direction is parallel to an axis direction of the remote control main body.
- the analyzing the operation gesture further includes when a movement distance of the remote controller in a first movement direction is larger than or equal to a first preset movement distance, and a movement time is shorter than a first preset movement time, and at the same time, an operation residence time of the user at a movement end position is shorter than or equal to a second time threshold, the first length is larger than or equal to a first preset length, and the second length is larger than or equal to a second preset length, determining that the operation gesture is the gesture of covering and rotating, where the first movement direction is parallel to a circumferential direction of the ring body.
- the analyzing the operation gesture further includes when a movement distance of the remote controller in a second movement direction is larger than or equal to a second preset movement distance, and a movement time is shorter than a second preset movement time, and at the same time, an operation residence time of the user at a movement end position is shorter than or equal to a second time threshold, the first length is larger than or equal to a first preset length, and the second length is larger than or equal to a second preset length, determining that the operation gesture is the gesture of covering and translation, where the second movement direction is parallel to an axis direction of the remote control main body.
- the analyzing the operation gesture further includes when a movement distance of the remote controller in a first movement direction is larger than or equal to a first preset movement distance, and a movement time is shorter than a first preset movement time, and at the same time, an operation residence time of the user at a movement end position is longer than a second time threshold, the first length is larger than or equal to a first preset length, and the second length is larger than or equal to a second preset length, determining that the operation gesture is the gesture of holding after covering and rotating, where the first movement direction is parallel to a circumferential direction of the ring body.
- an information processing system including any remote controller described above and a terminal device, where the terminal device is configured to perform a corresponding operation according to a control instruction sent by the remote controller.
- the terminal device is a mobile phone, a television set, or a computer.
- the embodiments of the present invention provide a remote controller and an information processing method and system, where the remote controller includes a remote control main body, where the remote control main body is a wearable ring body; a sensor configured to detect a signal generated by an operation gesture input by a user; a processor, where the processor is connected to the sensor, and is configured to analyze the signal generated by the operation gesture, and generate a corresponding control signal according to an analysis result; and a first wireless communications module, connected to the processor, and configured to send the control signal to a terminal device, so that the terminal device performs a corresponding operation according to the control signal.
- the remote controller includes a remote control main body, where the remote control main body is a wearable ring body; a sensor configured to detect a signal generated by an operation gesture input by a user; a processor, where the processor is connected to the sensor, and is configured to analyze the signal generated by the operation gesture, and generate a corresponding control signal according to an analysis result; and a first wireless communications module, connected to the processor, and configured to send the control signal to a terminal device, so that the terminal device performs a corresponding operation according to the control signal.
- the remote control main body is a wearable ring body
- the remote controller can be worn on the body of a user; moreover, the user controls, by inputting an operation gesture to the sensor, the remote controller to send a control signal to a terminal device, and neither a pushbutton operation nor a large-amplitude hand action is required. Therefore, the remote controller can improve user experience and is easy to carry.
- FIG. 1 is a schematic structural diagram of a remote controller according to an embodiment of the present invention
- FIG. 2 is a diagram of an annular axis section of a remote control main body of a remote controller according to an embodiment of the present invention
- FIG. 3 is a schematic structural diagram of a sensor according to an embodiment of the present invention.
- FIG. 4 is a schematic diagram of a gesture of touching and holding after sliding according to an embodiment of the present invention.
- FIG. 5 is a schematic diagram of a gesture of covering according to an embodiment of the present invention.
- FIG. 6 is a schematic diagram of a gesture of covering and rotating or a gesture of holding after covering and rotating according to an embodiment of the present invention
- FIG. 7 is a schematic diagram of a gesture of covering and translation according to an embodiment of the present invention.
- FIG. 8 is a schematic diagram of a touch area according to an embodiment of the present invention.
- FIG. 9 is another schematic structural diagram of a remote controller according to an embodiment of the present invention.
- FIG. 10 is still another schematic structural diagram of a remote controller according to an embodiment of the present invention.
- FIG. 11 is a schematic diagram of connection manners between modules in a remote controller according to an embodiment of the present invention.
- FIG. 12 is a flowchart of an information processing method according to an embodiment of the present invention.
- the term “and/or” only describes an association relationship between associated objects, and indicates that three relationships may exist.
- a and/or B may indicate three cases: only A exists; both A and B exist; and only B exists.
- the symbol “/” generally indicates that the associated objects before and after the symbol are in an “or” relationship.
- an embodiment of the present invention provides a remote controller 10 , including a remote control main body 101 , where the remote control main body 101 is a wearable ring body, the ring body may be enclosed or not enclosed, and the remote control main body 101 may be worn around a body part such as a waist or a wrist, for example, the remote control main body 101 in FIG. 1 is worn around a wrist;
- the ring body mentioned in the embodiment of the present invention may be bangle-shaped, including an inner surface and an outer surface, and may also be a cylindrical hollow barrel-shaped body, including an inner surface, an outer surface, and an upper bottom surface and a lower bottom surface; an annular axis section of the ring body may be that shown in FIG. 2 ;
- the hatched part is solid, and the blank part, that is, the part enclosed by a circle A, is hollow; a body part such as the wrist can pass through this blank part so that the remote control main body 101 can be worn;
- the circle A and a circle B may have a common circle center o, and a radius r of the circle A is shorter than a radius R of the circle B;
- a curved surface H, which follows the circle A and faces toward the inside of the circle A, is the inner surface of the remote control main body, and a curved surface K, which follows the circle B and faces toward the outside of the circle B, is the outer surface of the remote control main body;
- the inner surface of the remote control main body 101 may contact the body of a user;
- a sensor 102 configured to detect a signal generated by an operation gesture input by the user;
- a processor 103 , where the processor is connected to the sensor 102 , and is configured to analyze the signal generated by the operation gesture and generate a corresponding control signal according to an analysis result; and a first wireless communications module 104 , connected to the processor 103 , and configured to send the control signal to a terminal device, so that the terminal device performs a corresponding operation according to the control signal.
- shapes and positions of the sensor 102 , the processor 103 , and the first wireless communications module 104 are merely examples for description; and in an actual application, the sensor 102 , the processor 103 , and the first wireless communications module 104 may be located on the outer surface of the remote control main body 101 , or located on the inner surface of the remote control main body 101 , or embedded into the remote control main body 101 .
- the remote control main body 101 is a wearable ring body
- the remote controller can be worn on the body of a user; moreover, the user controls, by inputting an operation gesture to the sensor 102 , the remote controller to send a control signal to a terminal device, and no pushbutton operation is required. Therefore, the remote controller can improve user experience and is easy to carry.
- the sensor 102 may include a touch pad 1021 , where the touch pad 1021 is located on the outer surface of the remote control main body 101 ; may cover the outer surface of the remote control main body 101 , or may be embedded into the outer surface of the remote control main body 101 ; and is configured to receive a touch gesture input by the user, where the touch gesture is one of the operation gestures; and/or, an acceleration sensor 1022 , where the acceleration sensor 1022 is configured to convert an acceleration, which is generated when the user shakes the remote controller 10 , to a voltage output signal, and the acceleration sensor 1022 may be connected to the processor 103 , so that the processor 103 analyzes the voltage output signal to obtain the acceleration and a direction of shaking the remote controller 10 ; and/or, a first sensor 1023 , where the first sensor 1023 may be located on the inner surface of the remote control main body 101 , and is configured to detect a movement direction and a movement distance in the movement direction when the remote controller 10 moves, and output a detection result to the processor 103 .
- the first sensor 1023 may be formed by a light emitting diode, an optical lens component, an optical sensor, an image processing chip, and so on, which is similar to an induction principle of an optical mouse.
- the light emitting diode may continuously emit light to an area such as skin on the wrist, which can be contacted by the inner surface of the ring-shaped remote controller, on the body of the user, so that the light is reflected by the contactable area to the optical sensor through the optical lens component, the optical sensor converts a light signal to an electrical signal and sends the electrical signal to the image processing chip, and the image processing chip processes the electrical signal and outputs the movement direction of the remote controller and the movement distance in the movement direction to the processor.
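Assuming the image processing chip reports a per-frame displacement (dx, dy) much like an optical mouse, a simple accumulation such as the sketch below could yield the movement direction and the movement distance in that direction; the axis assignment is an assumption.

```python
def accumulate_motion(frame_displacements):
    """frame_displacements: iterable of per-frame (dx, dy) values assumed to be
    reported by the image processing chip, with dx along the circumferential
    direction and dy along the axial direction.
    Returns the dominant movement direction and the distance in it."""
    total_dx = sum(dx for dx, _ in frame_displacements)
    total_dy = sum(dy for _, dy in frame_displacements)
    if abs(total_dx) >= abs(total_dy):
        return "circumferential", abs(total_dx)
    return "axial", abs(total_dy)
```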
- the first sensor may also be one of other multiple sensors configured to detect the movement direction and the movement distance, which is not limited in the present invention.
- the remote controller further includes a first NFC module 105 , which is configured to perform NFC communication with another device with an NFC function.
- the processor 103 may be further configured to when the operation gesture received by the touch pad 1021 is a gesture indicating pairing, activate a first NFC module 105 , so that the first NFC module 105 establishes a connection to and communicates with a second NFC module 201 of the terminal device 20 within a preset time threshold, where a distance between the first NFC module 105 and the second NFC module 201 is within a specific distance range, and the specific distance range is generally 0 to 10 centimeters (cm).
- the first NFC module 105 can establish a connection with the second NFC module 201 of the terminal device 20 when the distance between the first NFC module 105 and the second NFC module 201 of the terminal device 20 is within the specific distance range.
- the terminal device 20 is a mobile phone.
- the first wireless communications module 104 is further configured to establish a connection between the first wireless communications module 104 of the remote controller 10 and a second wireless communications module (not shown in FIG. 3 ) of the terminal device 20 by means of communication between the first NFC module 105 and the second NFC module 201 , where the connection between the first wireless communications module 104 of the remote controller 10 and the second wireless communications module of the terminal device is a WIFI or Bluetooth connection.
- relevant information of the first wireless communications module 104 and the second wireless communications module may be transmitted through the connection, and pairing may be performed on communications links of the first wireless communications module 104 and the second wireless communications module according to the relevant information, where the first wireless communications module 104 and the second wireless communications module may be Bluetooth communications modules or WIFI modules, and so on.
- a wireless connection is established between the first wireless communications module 104 and the second wireless communications module.
- the two modules can communicate with each other, and the first wireless communications module 104 can control, by sending the control signal to the second wireless communications module, the terminal device 20 to perform the corresponding operation.
- only after the gesture indicating pairing is received is the pairing connection performed between the first NFC module 105 and the second NFC module 201 ; if an operation gesture indicating pairing is not received from the user, even if the distance between the first NFC module 105 and the second NFC module 201 is shortened to be within the specific distance range, the connection cannot be established because the first NFC module 105 is not activated. Therefore, misconnections caused by unintended touches by the user are effectively reduced.
- the remote controller can control multiple terminal devices, for example, a mobile phone, a portable computer, a palmtop computer, and so on.
- a screen of the terminal device 20 may display that the connection succeeds, to remind the user that the connection has succeeded.
- the first wireless communications module 104 may be further configured to send a first operation instruction table to the second wireless communications module, so that the terminal device 20 updates an operation instruction table in the terminal device 20 according to the first operation instruction table, where the first operation instruction table records correspondences between operation gestures and response instructions, for example, an operation gesture is a tap gesture, and a corresponding response instruction is to open a file.
- the processor is configured to generate, according to the signal generated by the operation gesture input by the user, a control signal indicating the operation gesture, that is, the processor determines a specific operation gesture according to the detected signal generated by the operation gesture, and generates, according to the operation gesture, a control signal indicating the operation gesture.
- the first wireless communications module 104 is configured to send the control signal to the terminal device 20 , so that the terminal device 20 queries the first operation instruction table according to the operation gesture indicated by the control signal, and executes a corresponding response instruction according to a query result. Because the terminal device updates its operation instruction table to the first operation instruction table, in an actual application the control signal may include only the operation gesture, which reduces the processing of the operation gesture performed by the processor and simplifies the process of controlling the terminal device by the remote controller.
- the first wireless communications module 104 may be further configured to receive a second operation instruction table sent by the second wireless communications module.
- the first wireless communications module 104 is further configured to update an operation instruction table in the remote controller according to the second operation instruction table, where the second operation instruction table records relationships between control signals and response instructions, for example, a control signal is a first control signal, and a corresponding response instruction is to open a file.
- the processor is configured to query the second operation instruction table according to the operation gesture, and generate a control signal that carries a response instruction corresponding to the operation gesture, where, the processor may analyze the signal detected by the sensor and generated by the operation gesture input by the user, query the second operation instruction table according to an analysis result, to obtain a response instruction corresponding to the operation gesture, and generate the control signal that carries the response instruction.
- the first wireless communications module 104 is configured to send the control signal to the terminal device 20 , so that the terminal device 20 executes the response instruction carried in the control signal. In this way, by updating the operation instruction table to the second operation instruction table, the remote controller can be connected to multiple terminal devices, which expands a scope in which the remote controller can be used.
- the terminal device 20 may send a feedback signal to the remote controller through the established wireless connection, where the feedback signal indicates that the operation succeeds or the operation fails.
- the processor 103 may control, according to the feedback signal, a corresponding module in the remote controller 10 to send a sound signal or a color signal to remind the user.
- the operation gesture may include the tap gesture, a gesture of touching and holding, a double-tap gesture, a triple-tap gesture, a slide gesture, a gesture of touching and holding after sliding, a drag gesture, a gesture with two fingers, a gesture of covering, a gesture of covering and rotating, a gesture of covering and translation, or a gesture of holding after covering and rotating, and so on.
- the tap gesture, the gesture of touching and holding, the double-tap gesture, the triple-tap gesture, the slide gesture, the drag gesture, and the gesture with two fingers are touch gestures, and are mainly detected using a touch pad, which is the same as that in the prior art, and no details are provided in the present invention. Operations for the operation gesture involve small-amplitude actions and are easy to learn; and the user controls the terminal device using a simple operation gesture only without pressing different pushbuttons. Therefore, user experience can be improved.
- the operation gesture is the gesture of touching and holding after sliding, as shown in FIG. 4 , a finger 40 slides starting from a point Q in a direction of an arrow t to a point P, and stays for a certain period of time at the point P.
- the arrow t is located on the outer surface of the remote control main body 101 , and therefore is a curved arrow; the arrow t indicates a sliding track and a sliding direction of the finger 40 on the touch pad (not shown in FIG. 4 ), and the direction of the arrow t in FIG. 4 is merely exemplary for description; and in an actual application, the arrow t may indicate any direction, which is not limited in the present invention.
- the processor 103 is configured to determine, according to touch point information reported by the touch pad and according to the fact that touch points are continuous, that the operation gesture is sliding, and record start position coordinates and end position coordinates of the operation gesture, that is, position coordinates of the point Q and the point P in FIG. 4 ; record an operation residence time of the user at the end position coordinates, that is, the operation residence time at the point P; and when a distance between the start position coordinates and the end position coordinates is larger than or equal to a first preset distance, and the operation residence time is longer than or equal to a first time threshold, determine that the operation gesture is the gesture of touching and holding after sliding.
- a hand 50 of the user covers the outer surface of the remote control main body 101 , where the fingers and palm cover the outer surface along a bending direction of the outer surface of the remote control main body 101 , and touch the touch pad as much as possible.
- the hand 50 of the user covers the outer surface of the remote control main body 101 , where the fingers and palm cover along the bending direction of the outer surface of the remote control main body 101 , and touch the touch pad as much as possible; and then rotate in an annular direction of the remote control main body 101 , for example, rotate in a direction of an arrow in FIG. 6 , or first rotate in a direction of an arrow, and then rotate in a direction of another arrow. After the rotation ends, the hand 50 of the user immediately leaves the outer surface of the remote control main body 101 .
- the hand 50 of the user covers the outer surface of the remote control main body 101 , where the fingers and palm cover along the bending direction of the outer surface of the remote control main body 101 , and touch the touch pad as much as possible; and then the fingers and palm hold the remote control main body 101 and perform translation in an axial direction of the remote control main body 101 , for example, perform translation in a direction of an arrow in FIG. 7 , or first move in a direction of an arrow, and then perform translation in a direction of another arrow.
- the hand 50 of the user immediately leaves the outer surface of the remote control main body 101 .
- the arrow direction in FIG. 7 is parallel to the arm of the user.
- the hand 50 of the user covers the outer surface of the remote control main body 101 , where the fingers and palm cover along the bending direction of the outer surface of the remote control main body 101 , and touch the touch pad as much as possible; and then rotate in the annular direction of the remote control main body 101 , for example, rotate in a direction of an arrow in FIG. 6 , or first rotate in a direction of an arrow, and then rotate in a direction of another arrow.
- the hand 50 of the user stays for a period of time on the outer surface of the remote control main body 101 .
- the processor 103 is configured to first obtain a touch area detected by the touch pad and generated by a user operation, and then analyze the touch area generated on the touch pad by the user operation. It is assumed that, after being unfolded, the touch pad 1021 takes on a rectangle shown in FIG. 8 , and after being unfolded, a touch area 60 is an irregular area shown in FIG. 8 .
- the processor 103 may obtain a spacing between two points, between which an annular distance is the largest, in the touch area 60 as a first length x; and obtain a spacing between two points, between which an axial distance is the largest, in the touch area 60 as a second length y. Then, the processor 103 determines whether the remote controller moves or not, where the movement includes rotation movement and translation movement.
- when the remote controller does not move and the first length and the second length reach the corresponding preset lengths, the operation gesture is the gesture of covering, where the first movement direction is parallel to a circumferential direction of the ring body, and the second movement direction is parallel to an axis direction of the remote control main body.
- shape approximating processing may further be performed on the touch area 60 first, to approximate the touch area 60 to a shape such as a rectangle or an ellipse, so as to simplify the process of obtaining the first length x and the second length y. It should be noted that, whether the remote controller moves or not may be determined according to a signal generated by the first sensor 1023 .
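A sketch of obtaining the first length x and the second length y from the touch points on the unfolded touch pad, using the bounding-box style approximation mentioned above; the coordinate convention (u along the annular direction, v along the axial direction) is an assumption.

```python
def touch_area_extents(touch_points):
    """touch_points: (u, v) coordinates of touched cells on the unfolded touch
    pad, with u along the annular (circumferential) direction and v along the
    axial direction. Returns (first_length, second_length) as the largest
    annular and axial spacings, in the spirit of FIG. 8."""
    if not touch_points:
        return 0.0, 0.0
    us = [u for u, _ in touch_points]
    vs = [v for _, v in touch_points]
    first_length = max(us) - min(us)    # largest annular spacing (x in FIG. 8)
    second_length = max(vs) - min(vs)   # largest axial spacing (y in FIG. 8)
    return first_length, second_length
```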
- when the rotation movement occurs and the foregoing covering conditions are met, the operation gesture is the gesture of covering and rotating, where the first movement direction (that is, the annular direction) is parallel to the circumferential direction of the ring body.
- when the translation movement occurs and the foregoing covering conditions are met, the operation gesture is the gesture of covering and translation, where the second movement direction (that is, the axial direction) is parallel to the axis direction of the remote control main body.
- the first preset movement distance may be equal or unequal to the second preset movement distance, which is not limited in the present invention.
- the second movement direction may be parallel to the axial direction of the remote control main body, and the first preset movement time and the second preset movement time are set to determine that the remote controller is being operated by a user. Movement in a specified direction, within a specified period of time, and within a specified distance range is determined, and it is avoided that movement caused by the user by doing exercises such as dancing and running is determined, thereby reducing misoperations.
- the first preset movement time may be equal or unequal to the second preset movement time, which is not limited in the present invention.
- when the rotation movement occurs, the foregoing covering conditions are met, and the hand stays at the movement end position for longer than the second time threshold, the operation gesture is the gesture of holding after covering and rotating, where the first movement direction (that is, the annular direction) is parallel to the circumferential direction of the ring body.
- the user can control the terminal device to execute different operations, and the different operations executed by the terminal device may be set according to an actual situation.
- the user may select an icon by sliding on the touch pad using the slide gesture.
- a cursor on the television set moves on the multiple icons along with the slide gesture.
- the slide gesture stops the icon on which the cursor stays is a selected icon. Then, the user may open the icon by staying at any position on the touch pad using the gesture of touching and holding.
- when the user taps any point on the touch pad using the tap gesture, it may indicate to start to play a video; when the user taps any point on the touch pad again, it may indicate to pause playback of the video; and when the user holds the remote controller using the gesture of covering, it may indicate to start to operate content of the video.
- when the user controls, using the remote controller, a mobile phone to browse a microblog, the user may trigger, using the gesture of covering and rotating, the microblog on a screen of the mobile phone to roll down; and after several posts are rolled, the microblog stops rolling when the gesture of covering and rotating stops.
- the user may also trigger, using the gesture of holding after covering and rotating, the microblog on the screen of the mobile phone to roll down; during holding, the microblog keeps rolling; and only when the user releases the hand to stop the holding does the microblog gradually stop.
- the user may further perform volume adjustment and so on using the sliding gesture, which is not limited in the present invention.
- the first sensor 1023 is further configured to send the movement direction and the movement distance in the movement direction to the processor 103 , so that the processor 103 determines, according to the movement direction and the movement distance in the movement direction, whether the remote controller 10 moves or not.
- the remote controller 10 may move in multiple cases, where movement includes rotation movement and translation movement, and therefore movement needs to be determined by the processor 103 .
- the processor 103 may determine, when the movement distance of the remote controller 10 in the first movement direction is smaller than the first preset movement distance and the movement distance in the second movement direction is smaller than the second preset movement distance, that the remote controller 10 does not move; and determine, when the movement distance is larger than or equal to the first preset movement distance and the movement direction is the first movement direction, that the rotation movement occurs on the remote controller 10 .
- the first movement direction (that is, the annular direction) may be parallel to the circumferential direction of the ring body, for example, a direction of an arrow z, where the arrow z is a curved arrow.
- the remote controller When the user applies a force to the remote controller in the first movement direction, the remote controller may rotate around an axial line n.
- the first movement direction (that is, the annular direction) may also be a direction that is parallel to a tangential direction of an annular axis section T (which is perpendicular to the axial line n).
- the movement distance in the first movement direction is a projection value, in the first movement direction, of an actual movement distance on an outer surface W of the remote control main body.
- the second movement direction (that is, the axial direction) may be parallel to a direction of the axial line n, for example, a direction of an arrow e.
- the remote controller performs translation in the direction parallel to the axial line n.
- an included angle may exist between the movement direction measured by the first sensor 1023 and the first movement direction, and another between the measured movement direction and the second movement direction.
- when the included angle between the movement direction and the first movement direction is within a preset included angle range, it may be considered that the movement direction is the first movement direction.
- when the included angle between the movement direction and the second movement direction is within the preset included angle range, it may be considered that the movement direction is the second movement direction.
- the preset included angle is generally an angle smaller than 45 degrees.
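The included-angle rule can be sketched as follows, assuming the measured movement and the two reference directions are expressed as 2-D vectors in a common frame; the default reference vectors are placeholders.

```python
import math

PRESET_INCLUDED_ANGLE_DEG = 45.0   # the patent says generally smaller than 45 degrees


def classify_movement_direction(move_vec, circ_dir=(1.0, 0.0), axial_dir=(0.0, 1.0)):
    """move_vec: measured movement vector from the first sensor, in the same
    2-D frame as the reference directions (the defaults here are assumptions).
    Returns which preset direction the movement is attributed to, or None if
    the included angle exceeds the preset angle for both."""
    def angle_between(a, b):
        dot = a[0] * b[0] + a[1] * b[1]
        na = math.hypot(*a)
        nb = math.hypot(*b)
        if na == 0 or nb == 0:
            return 180.0
        cos_t = max(-1.0, min(1.0, dot / (na * nb)))
        return math.degrees(math.acos(cos_t))

    if angle_between(move_vec, circ_dir) < PRESET_INCLUDED_ANGLE_DEG:
        return "first_movement_direction"    # circumferential
    if angle_between(move_vec, axial_dir) < PRESET_INCLUDED_ANGLE_DEG:
        return "second_movement_direction"   # axial
    return None
```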
- the remote controller 10 may further include a microphone 106 , where the microphone 106 is located on the outer surface of the remote control main body 101 (not shown), and is configured to receive an external voice input; a loudspeaker (not shown in the figure), where the loudspeaker is located on the outer surface of the remote control main body 101 , and is configured to output a prompt tone; if there is no loudspeaker in the remote controller 10 , the prompt tone may be output using a loudspeaker of the terminal device 20 ; an indicator light 107 , where the indicator light 107 is located on the outer surface of the remote control main body 101 , and is configured to provide indication signals of different colors for the user.
- the remote controller 10 further includes a power supply module 108 configured to provide power to modules in the remote controller 10 .
- the sensor 102 may include the touch pad 1021 , the acceleration sensor 1022 , and the first sensor 1023 , where connection manners between modules in the remote controller 10 are shown in FIG. 11 ; and the processor 103 , the first wireless communications module 104 , the touch pad 1021 , the acceleration sensor 1022 , the first sensor 1023 , the microphone 106 , and the indicator light 107 are all located on the remote control main body 101 .
- the power supply module 108 is connected to the modules in the remote controller 10 , and is configured to provide power to the modules.
- the touch pad 1021 , the first sensor 1023 , the acceleration sensor 1022 , the microphone 106 , the first wireless communications module 104 , and the indicator light 107 are separately connected to the processor 103 , where the touch pad 1021 provides, for the processor 103 , a signal generated by a touch gesture in the operation gestures, the first sensor 1023 provides the movement direction of the remote controller 10 and the movement distance in the movement direction for the processor 103 , and the acceleration sensor 1022 provides an acceleration signal for the processor 103 ; the processor 103 may perform determining according to the foregoing signals to obtain a corresponding operation gesture, generate, according to the operation gesture, a control signal indicating the operation gesture, or generate a control signal that carries a response instruction, and transmit the control signal to the first wireless communications module 104 ; the first wireless communications module 104 sends the control signal to a second wireless communications module 202 of the terminal device 20 , and the second wireless communications module 202 transmits the control signal to a corresponding module of the terminal device 20 , so that the terminal device 20 performs the corresponding operation according to the control signal.
- the user may also control, using different actions, the terminal device to execute different operations, and the different operations executed by the terminal device may be set according to an actual situation.
- the user may control playback of music by shaking the remote controller; when the user shakes the remote controller once for the first time, it may indicate to start to play the music randomly; when the user shakes the remote controller once again, it may indicate to stop playing the music; when the user shakes the remote controller twice, it may indicate to play a next song; when the user shakes the remote controller thrice, it may indicate to play a previous song; the user may also control playback progress of a video or an image by blowing the microphone; and so on.
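As an illustration of the shake-based control described above, the sketch below groups shakes that occur close together and maps the count of the final group to a playback command; the grouping window and the command names are assumptions, and a single shake is treated as a play/stop toggle.

```python
SHAKE_WINDOW_S = 1.0   # assumed grouping window for consecutive shakes

# Hypothetical mapping of shake counts to playback commands, following the
# example above.
SHAKE_ACTIONS = {1: "toggle_play", 2: "next_song", 3: "previous_song"}


def interpret_shakes(shake_timestamps):
    """shake_timestamps: times at which the acceleration sensor reported a
    shake. Groups shakes that fall within SHAKE_WINDOW_S of the previous one
    and returns the playback command for the final group."""
    if not shake_timestamps:
        return None
    count = 1
    for prev, cur in zip(shake_timestamps, shake_timestamps[1:]):
        if cur - prev <= SHAKE_WINDOW_S:
            count += 1
        else:
            count = 1   # a pause starts a new shake group
    return SHAKE_ACTIONS.get(count)
```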
- with the remote controller provided by the embodiment of the present invention, the user can control, using the operation gestures and different body actions, the terminal device to execute different operations, which provides multiple control manners for the user to select from, thereby improving the user experience.
- the distance in the embodiment of the present invention not only may be a distance on a curved surface, but also may be a linear distance, which is not limited in the present invention.
- the remote controller can be worn on the body of a user; moreover, the user controls, by inputting an operation gesture to the sensor, the remote controller to send a control signal to a terminal device, and no pushbutton operation is required; in addition, by updating an operation instruction table, the remote controller can control multiple terminal devices, and provide multiple control manners on a terminal device for selection. Therefore, the remote controller can improve the user experience and is easy to carry.
- an embodiment of the present invention provides an information processing method, which is applied to the remote controller, and includes the following steps:
- Step 701 Receive a signal generated by an operation gesture input by a user, where the operation gesture includes a gesture of touching and holding after sliding, a gesture of covering, a gesture of covering and rotating, a gesture of covering and translation, or a gesture of holding after covering and rotating.
- Step 702 Analyze the signal generated by the operation gesture.
- Step 703 Generate a control signal according to an analysis result.
- Step 704 Send the control signal to a terminal device, so that the terminal device performs a corresponding operation according to the control signal.
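A schematic sketch of steps 701 to 704 (receive, analyze, generate, send). The helper names are hypothetical and the analysis is reduced to a table lookup purely for illustration.

```python
def receive_signal() -> dict:                        # Step 701
    return {"source": "touch_pad", "pattern": "slide_then_hold"}

def analyze_signal(signal: dict) -> str:             # Step 702
    patterns = {
        "slide_then_hold": "touch_and_hold_after_sliding",
        "cover": "covering",
        "cover_rotate": "covering_and_rotating",
    }
    return patterns.get(signal["pattern"], "unknown_gesture")

def generate_control_signal(gesture: str) -> dict:   # Step 703
    return {"gesture": gesture}

def send_to_terminal(control_signal: dict) -> None:  # Step 704
    print("control signal sent:", control_signal)

send_to_terminal(generate_control_signal(analyze_signal(receive_signal())))
```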
- the user controls, using the operation gesture, the remote controller to send the control signal to the terminal device, and no pushbutton operation is required. Therefore, user experience can be improved.
- a pairing action may further be first performed, which comprises receiving an operation gesture indicating pairing; activating a first NFC module of the remote controller; within a preset time threshold, when a distance between the first NFC module and a second NFC module of the terminal device is shortened to be within a specific distance range, establishing, by the first NFC module, a connection to and communicating with the second NFC module; and establishing a connection between a first wireless communications module of the remote controller and a second wireless communications module of the terminal device by means of communication between the first NFC module and the second NFC module, that is, establishing a wireless connection between the first wireless communications module 104 and the second wireless communications module 202 .
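The sketch below illustrates the pairing sequence just described under stated assumptions: after the pairing gesture is received, the first NFC module is activated, and if the two NFC modules come within range inside a preset time window, the NFC exchange is used to bootstrap the wireless link. The timing values, distance, and callback names are illustrative only.

```python
import time

PAIRING_TIME_THRESHOLD_S = 10.0   # assumed preset time threshold
NFC_RANGE_CM = 4.0                # assumed "specific distance range"

def wait_for_nfc_proximity(read_distance_cm, deadline: float) -> bool:
    """Poll the distance between the two NFC modules until in range or timeout."""
    while time.monotonic() < deadline:
        if read_distance_cm() <= NFC_RANGE_CM:
            return True
        time.sleep(0.1)
    return False

def pair(read_distance_cm, exchange_credentials, open_wireless_link) -> bool:
    deadline = time.monotonic() + PAIRING_TIME_THRESHOLD_S
    if not wait_for_nfc_proximity(read_distance_cm, deadline):
        return False                        # pairing window expired
    credentials = exchange_credentials()    # NFC link carries the pairing parameters
    return open_wireless_link(credentials)  # e.g. Wi-Fi or Bluetooth connection

# Example wiring with stubbed hardware callbacks:
ok = pair(read_distance_cm=lambda: 3.0,
          exchange_credentials=lambda: {"ssid": "rc-20", "key": "demo"},
          open_wireless_link=lambda cred: True)
print("paired" if ok else "pairing failed")
```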
- the connection between the first wireless communications module of the remote controller and the second wireless communications module of the terminal device is a WIFI or Bluetooth connection.
- the method may further include sending a first operation instruction table to the terminal device, so that the terminal device updates an operation instruction table in the terminal device according to the first operation instruction table, where the first operation instruction table records relationships between operation gestures and response instructions; and the sending the control signal to a terminal device, so that the terminal device performs a corresponding operation according to the control signal includes sending the control signal indicating the operation gesture to the terminal device, so that the terminal device queries the first operation instruction table according to the operation gesture indicated by the control signal, and executes a corresponding response instruction according to a query result.
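A sketch, not the patent's data format, of how the first operation instruction table might work: the remote controller pushes a gesture-to-instruction table to the terminal device, which later resolves incoming gesture-indicating control signals against it. All gesture and instruction names are hypothetical.

```python
FIRST_OPERATION_INSTRUCTION_TABLE = {
    "touch_and_hold_after_sliding": "unlock_screen",
    "covering": "pause_playback",
    "covering_and_rotating": "adjust_volume",
}

class TerminalDevice:
    def __init__(self):
        self.instruction_table = {}

    def update_instruction_table(self, table: dict) -> None:
        # Update the terminal's stored table with the one received from the remote.
        self.instruction_table.update(table)

    def on_control_signal(self, control_signal: dict) -> str:
        # Query the table by the gesture indicated in the control signal.
        gesture = control_signal["gesture"]
        return self.instruction_table.get(gesture, "ignore")

terminal = TerminalDevice()
terminal.update_instruction_table(FIRST_OPERATION_INSTRUCTION_TABLE)
print(terminal.on_control_signal({"gesture": "covering"}))  # -> pause_playback
```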
- the method further includes receiving a second operation instruction table sent by the second wireless communications module of the terminal device; updating an operation instruction table in the remote controller according to the second operation instruction table, where the second operation instruction table records relationships between control signals and response instructions; and the sending the control signal to a terminal device, so that the terminal device performs a corresponding operation according to the control signal includes sending the control signal that carries the response instruction to the terminal device, so that the terminal device executes the response instruction carried in the control signal.
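The complementary case, sketched under the same assumptions: when the terminal device has sent a second operation instruction table to the remote controller, the remote controller resolves the gesture locally and sends a control signal that already carries the response instruction.

```python
SECOND_OPERATION_INSTRUCTION_TABLE = {
    "covering_and_translation": "seek_forward",
    "holding_after_covering_and_rotating": "switch_channel",
}

class RemoteController:
    def __init__(self):
        self.instruction_table = {}

    def update_instruction_table(self, table: dict) -> None:
        self.instruction_table.update(table)

    def build_control_signal(self, gesture: str) -> dict:
        instruction = self.instruction_table.get(gesture)
        if instruction is not None:
            return {"response_instruction": instruction}  # terminal executes it directly
        return {"gesture": gesture}                       # fall back to a gesture-indicating signal

rc = RemoteController()
rc.update_instruction_table(SECOND_OPERATION_INSTRUCTION_TABLE)
print(rc.build_control_signal("covering_and_translation"))  # carries seek_forward
```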
- the analyzing the operation gesture may include receiving touch point information reported by the touch pad, where touch points are continuous; recording start position coordinates and end position coordinates of the operation gesture; recording an operation residence time of the user at the end position coordinates; and when a distance between the start position coordinates and the end position coordinates is larger than or equal to a first preset distance, and the operation residence time is longer than or equal to a first time threshold, determining that the operation gesture is the gesture of touching and holding after sliding.
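A minimal sketch of the slide-then-hold test described above. The concrete threshold values are placeholders, since the patent names the first preset distance and first time threshold only abstractly.

```python
import math

FIRST_PRESET_DISTANCE = 20.0   # assumed units as reported by the touch pad
FIRST_TIME_THRESHOLD_S = 0.8   # assumed residence-time threshold

def is_touch_and_hold_after_sliding(touch_points, residence_time_s: float) -> bool:
    """touch_points: continuous (x, y) samples from start to end of the gesture."""
    if len(touch_points) < 2:
        return False
    (x0, y0), (x1, y1) = touch_points[0], touch_points[-1]
    distance = math.hypot(x1 - x0, y1 - y0)
    return distance >= FIRST_PRESET_DISTANCE and residence_time_s >= FIRST_TIME_THRESHOLD_S

print(is_touch_and_hold_after_sliding([(0, 0), (10, 5), (25, 12)], residence_time_s=1.2))  # True
```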
- the analyzing the operation gesture may further include obtaining a touch area reported by the touch pad and generated by a user operation; obtaining a spacing between two points, between which an annular distance is the largest, in the touch area as a first length; obtaining a spacing between two points, between which an axial distance is the largest, in the touch area as a second length; and determining whether the remote controller moves or not, where movement includes rotation movement and translation movement.
- the determining whether the remote controller moves or not may include obtaining a movement distance of the remote controller in a first movement direction, that is, an annular direction, and a movement distance of the remote controller in a second movement direction, that is, an axial direction; when the movement distance is larger than or equal to a first preset movement distance and the movement direction is the first movement direction, it is determined that rotation movement occurs on the remote controller; and when the movement distance is larger than or equal to a second preset movement distance and the movement direction is the second movement direction, it is determined that translation movement occurs on the remote controller.
- when the first length is larger than or equal to a first preset length, the second length is larger than or equal to a second preset length, and the remote controller does not move, it is determined that the operation gesture is the gesture of covering, where the first movement direction is parallel to a circumferential direction of the ring body, and the second movement direction is parallel to an axis direction of the remote control main body.
- when the movement direction is the first movement direction, a movement time is shorter than a first preset movement time, an operation residence time of the user at a movement end position is shorter than or equal to a second time threshold, the first length is larger than or equal to the first preset length, and the second length is larger than or equal to the second preset length, it is determined that the operation gesture is the gesture of covering and rotating.
- when the movement direction is the second movement direction, the movement time is shorter than a second preset movement time, the operation residence time of the user at the movement end position is shorter than or equal to the second time threshold, the first length is larger than or equal to the first preset length, and the second length is larger than or equal to the second preset length, it is determined that the operation gesture is the gesture of covering and translation.
- when the movement direction is the first movement direction, the movement time is shorter than the first preset movement time, the operation residence time of the user at the movement end position is longer than the second time threshold, the first length is larger than or equal to the first preset length, and the second length is larger than or equal to the second preset length, it is determined that the operation gesture is the gesture of holding after covering and rotating.
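A consolidated sketch of the decision rules reconstructed above. The threshold values and the function name are illustrative assumptions; the patent defines the quantities (first and second length, movement direction, movement time, residence time) but not concrete numbers.

```python
FIRST_PRESET_LENGTH = 15.0        # largest annular spacing inside the touch area
SECOND_PRESET_LENGTH = 10.0       # largest axial spacing inside the touch area
FIRST_PRESET_MOVEMENT_TIME = 1.0  # annular (rotation) movement-time bound
SECOND_PRESET_MOVEMENT_TIME = 1.0 # axial (translation) movement-time bound
SECOND_TIME_THRESHOLD = 0.5       # residence time separating "rotate" from "hold after rotate"

def classify_covering_gesture(first_length, second_length, moved, direction,
                              movement_time, residence_time):
    covered = first_length >= FIRST_PRESET_LENGTH and second_length >= SECOND_PRESET_LENGTH
    if not covered:
        return "not_a_covering_gesture"
    if not moved:
        return "covering"
    if direction == "annular" and movement_time < FIRST_PRESET_MOVEMENT_TIME:
        if residence_time <= SECOND_TIME_THRESHOLD:
            return "covering_and_rotating"
        return "holding_after_covering_and_rotating"
    if direction == "axial" and movement_time < SECOND_PRESET_MOVEMENT_TIME \
            and residence_time <= SECOND_TIME_THRESHOLD:
        return "covering_and_translation"
    return "not_a_covering_gesture"

print(classify_covering_gesture(18.0, 12.0, moved=True, direction="annular",
                                movement_time=0.6, residence_time=0.3))
# -> covering_and_rotating
```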
- a user controls, by inputting an operation gesture, a remote controller to send a control signal to a terminal device; the operation gestures involve only small-amplitude actions, are the same as or similar to existing gesture operations, and are easy to learn; the user controls the terminal device using a simple operation gesture, so neither a pushbutton operation nor a large-amplitude hand action is required; and the remote controller can control multiple terminal devices. Therefore, the user experience can be improved.
- An information processing system in an embodiment of the present invention includes a remote controller provided by any embodiment of the present invention and a terminal device, where the terminal device is configured to perform a corresponding operation according to a control instruction sent by the remote controller.
- the terminal device may be a mobile phone, a television set, a computer, and so on.
- the disclosed system, apparatus, and method may be implemented in other manners.
- the described apparatus embodiment is merely schematic.
- the unit division is merely logical function division and may be other division in actual implementation.
- a plurality of units or components may be combined or integrated into another system, or some features may be ignored or not executed.
- the displayed or discussed mutual couplings or direct couplings or communication connections may be implemented through some interfaces. Indirect couplings or communication connections between the apparatuses or units may be implemented in electronic, mechanical, or other forms.
- the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they may be located in one position or distributed on a plurality of network units. A part or all of the units may be selected according to an actual need to achieve the objectives of the solutions of the embodiments.
- functional units in the embodiments of the present invention may be integrated into one processing unit, or each of the units may exist alone physically, or two or more units are integrated into one unit.
- the integrated unit may be implemented in a form of hardware, or may also be implemented in a form of hardware plus a software functional unit.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Computer Networks & Wireless Communication (AREA)
- User Interface Of Digital Computer (AREA)
- Selective Calling Equipment (AREA)
- Telephone Function (AREA)
- Details Of Television Systems (AREA)
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2013/082180 WO2015024252A1 (zh) | 2013-08-23 | 2013-08-23 | 一种遥控器、信息处理方法及系统 |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2013/082180 Continuation WO2015024252A1 (zh) | 2013-08-23 | 2013-08-23 | 一种遥控器、信息处理方法及系统 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20150054630A1 US20150054630A1 (en) | 2015-02-26 |
US9978261B2 true US9978261B2 (en) | 2018-05-22 |
Family
ID=50363923
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/493,520 Active 2034-03-19 US9978261B2 (en) | 2013-08-23 | 2014-09-23 | Remote controller and information processing method and system |
Country Status (6)
Country | Link |
---|---|
US (1) | US9978261B2 (de) |
EP (1) | EP3037946B1 (de) |
JP (1) | JP6275839B2 (de) |
KR (1) | KR101793566B1 (de) |
CN (1) | CN103703495B (de) |
WO (1) | WO2015024252A1 (de) |
Families Citing this family (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6275839B2 (ja) * | 2013-08-23 | 2018-02-07 | 華為技術有限公司Huawei Technologies Co.,Ltd. | リモコン装置、情報処理方法およびシステム |
KR20160117479A (ko) * | 2014-02-10 | 2016-10-10 | 애플 인크. | 광학 센서를 사용해 검출된 모션 제스처 입력 |
CN104008635B (zh) * | 2014-04-18 | 2017-10-10 | 小米科技有限责任公司 | 设备控制方法及装置 |
CN105320326A (zh) * | 2014-07-25 | 2016-02-10 | 南京瀚宇彩欣科技有限责任公司 | 智能网络系统 |
CN105320370A (zh) * | 2014-07-25 | 2016-02-10 | 南京瀚宇彩欣科技有限责任公司 | 穿戴式智能装置 |
CN105320419A (zh) * | 2014-07-25 | 2016-02-10 | 南京瀚宇彩欣科技有限责任公司 | 智能环带的处理电路 |
CN105286224A (zh) * | 2014-07-25 | 2016-02-03 | 南京瀚宇彩欣科技有限责任公司 | 智能环带 |
KR20160075079A (ko) * | 2014-12-19 | 2016-06-29 | 삼성전자주식회사 | 다른 전자 장치를 제어하는 전자 장치 및 제어 방법 |
CN104639966A (zh) * | 2015-01-29 | 2015-05-20 | 小米科技有限责任公司 | 遥控方法及装置 |
CN104615358A (zh) * | 2015-02-06 | 2015-05-13 | 掌赢信息科技(上海)有限公司 | 一种应用程序启动方法和电子设备 |
CN105005378B (zh) * | 2015-04-29 | 2018-04-24 | 杭州喵隐科技有限公司 | 一种可以控制远程电器的手势识别系统及控制方法 |
CN106325079A (zh) * | 2015-06-16 | 2017-01-11 | 中兴通讯股份有限公司 | 家电设备的控制方法和装置 |
US10166123B2 (en) * | 2015-06-29 | 2019-01-01 | International Business Machines Corporation | Controlling prosthetic devices with smart wearable technology |
US9804679B2 (en) * | 2015-07-03 | 2017-10-31 | Google Inc. | Touchless user interface navigation using gestures |
CN105083147A (zh) * | 2015-07-08 | 2015-11-25 | 北汽福田汽车股份有限公司 | 车载控制器及具有其的汽车控制系统 |
KR20170035547A (ko) * | 2015-09-23 | 2017-03-31 | 엘지이노텍 주식회사 | 원격 제어장치, 원격 제어방법 및 원격 제어시스템 |
CN105068664B (zh) * | 2015-09-23 | 2019-01-04 | 谢小强 | 交互系统及交互控制方法 |
CN105404801B (zh) * | 2015-11-03 | 2018-06-19 | 张亚东 | 控制重力传感器进行身份安全访问认证的系统及方法 |
CN105302307B (zh) * | 2015-11-03 | 2018-01-12 | 深圳市尚锐科技有限公司 | 通过加速度传感器获取方向信息进行行为匹配的方法 |
CN105261187A (zh) * | 2015-11-13 | 2016-01-20 | 南京物联传感技术有限公司 | 一种可替换普通电池的智能电池 |
CN105472424A (zh) * | 2015-11-23 | 2016-04-06 | 晨星半导体股份有限公司 | 终端设备及其遥控方法 |
US20170189803A1 (en) | 2016-01-04 | 2017-07-06 | Sphero, Inc. | Task-oriented feedback using a modular sensing device |
TWI729064B (zh) * | 2016-01-28 | 2021-06-01 | 日商日本鼎意股份有限公司 | 包括內置有感應器的球的系統、行動終端的程式及經由行動終端監控球的動向之方法 |
CN105741525B (zh) * | 2016-02-24 | 2019-10-01 | 北京小米移动软件有限公司 | 遥控器绑定的处理方法、装置和设备 |
US9996163B2 (en) | 2016-06-09 | 2018-06-12 | International Business Machines Corporation | Wearable device positioning based control |
CN106774934A (zh) * | 2017-01-06 | 2017-05-31 | 广东小天才科技有限公司 | 多媒体播放进度控制方法及系统 |
CN107092360A (zh) * | 2017-05-08 | 2017-08-25 | 电子科技大学 | 一种利用手腕扭动感应的定位输入装置 |
CN107605464A (zh) * | 2017-09-22 | 2018-01-19 | 中国石油集团西部钻探工程有限公司 | 气井控制柜控制系统及其控制方法 |
JP6629819B2 (ja) * | 2017-11-10 | 2020-01-15 | ファナック株式会社 | 操作端末とのペアリング機能を有する外付けデバイス |
US10866652B2 (en) | 2017-11-13 | 2020-12-15 | Samsung Electronics Co., Ltd. | System and method for distributed device tracking |
GB2574886A (en) | 2018-06-22 | 2019-12-25 | Ecole Polytechnique Fed Lausanne Epfl | Teleoperation with a wearable sensor system |
KR20210014397A (ko) | 2019-07-30 | 2021-02-09 | 삼성전자주식회사 | 스타일러스 펜에 의한 제스처를 확인하는 전자 장치 및 그 동작 방법 |
KR20210014401A (ko) | 2019-07-30 | 2021-02-09 | 삼성전자주식회사 | 스타일러스 펜에 의한 제스처를 확인하는 전자 장치 및 그 동작 방법 |
CN110557741B (zh) * | 2019-08-05 | 2021-08-13 | 华为技术有限公司 | 终端交互的方法及终端 |
JP2024048680A (ja) * | 2022-09-28 | 2024-04-09 | キヤノン株式会社 | 制御装置、制御方法、プログラム |
Citations (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5909183A (en) | 1996-12-26 | 1999-06-01 | Motorola, Inc. | Interactive appliance remote controller, system and method |
JP2002358149A (ja) | 2001-06-01 | 2002-12-13 | Sony Corp | ユーザ入力装置 |
CN1422400A (zh) | 2000-02-08 | 2003-06-04 | 松下电器产业株式会社 | 便携终端 |
JP2003209399A (ja) | 2002-01-11 | 2003-07-25 | Matsushita Electric Ind Co Ltd | 部品実装機 |
CN1525389A (zh) | 2003-02-27 | 2004-09-01 | 宜 王 | 光电输入笔 |
JP2005217645A (ja) | 2004-01-28 | 2005-08-11 | Sanyo Electric Co Ltd | リモコン端末、サーバ装置およびネットワーク対応機器 |
US20050212911A1 (en) * | 2004-03-23 | 2005-09-29 | Marvit David L | Gesture identification of controlled devices |
WO2009024971A2 (en) | 2007-08-19 | 2009-02-26 | Saar Shai | Finger-worn devices and related methods of use |
US20090143877A1 (en) * | 2003-12-23 | 2009-06-04 | Koninklijke Philips Electronic, N.V. | Method of controlling a portable user device |
US20090146198A1 (en) | 2007-12-11 | 2009-06-11 | Samsung Electronics Co., Ltd | Photodiodes, image sensing devices and image sensors |
CN101600009A (zh) | 2008-06-04 | 2009-12-09 | 深圳富泰宏精密工业有限公司 | 无线控制装置及具有该控制装置的无线通信装置 |
JP2010054293A (ja) | 2008-08-27 | 2010-03-11 | Fujitsu Microelectronics Ltd | 車載用画像データ転送装置 |
CN101819463A (zh) | 2009-02-27 | 2010-09-01 | 株式会社电装 | 输入系统及用于该系统的可佩戴电气设备 |
JP2010267220A (ja) | 2009-05-18 | 2010-11-25 | Nara Institute Of Science & Technology | ウェアラブルコンピュータに用いるリング型インタフェース、インタフェース装置、およびインタフェース方法 |
CN102023731A (zh) | 2010-12-31 | 2011-04-20 | 北京邮电大学 | 一种适用于移动终端的无线微型指环鼠标 |
US20120025946A1 (en) | 2010-07-29 | 2012-02-02 | Hon Hai Precision Industry Co., Ltd. | Electronic device with remote function |
WO2012099428A2 (ko) | 2011-01-19 | 2012-07-26 | 삼성전자 주식회사 | 방송 시스템에서의 어플리케이션 서비스 장치 및 방법 |
CN102681727A (zh) | 2012-05-09 | 2012-09-19 | 闻泰通讯股份有限公司 | 一种触摸及动作感应结合的电子设备控制系统及方法 |
CN102737489A (zh) | 2012-06-12 | 2012-10-17 | 康佳集团股份有限公司 | 一种遥控器及遥控系统及基于该遥控系统的控制方法 |
CN103076918A (zh) | 2012-12-28 | 2013-05-01 | 深圳Tcl新技术有限公司 | 基于触摸终端的远程控制方法及系统 |
CN103116973A (zh) | 2013-01-22 | 2013-05-22 | 周万荣 | 一种基于应用的遥控设备和系统 |
US20130173064A1 (en) * | 2011-10-21 | 2013-07-04 | Nest Labs, Inc. | User-friendly, network connected learning thermostat and related systems and methods |
CN103197529A (zh) | 2013-02-06 | 2013-07-10 | 方科峰 | 一种体感腕表及应用方法 |
CN203120084U (zh) | 2012-12-12 | 2013-08-07 | 康佳集团股份有限公司 | 一种基于触摸板的遥控系统 |
CN103703495A (zh) | 2013-08-23 | 2014-04-02 | 华为技术有限公司 | 一种遥控器、信息处理方法及系统 |
2013
- 2013-08-23 JP JP2016528291 patent/JP6275839B2/ja active Active
- 2013-08-23 EP EP13891790.1A patent/EP3037946B1/de active Active
- 2013-08-23 WO PCT/CN2013/082180 patent/WO2015024252A1/zh active Application Filing
- 2013-08-23 CN CN201380001602.2A patent/CN103703495B/zh active Active
- 2013-08-23 KR KR1020157035139A patent/KR101793566B1/ko not_active Application Discontinuation
2014
- 2014-09-23 US US14/493,520 patent/US9978261B2/en active Active
Patent Citations (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5909183A (en) | 1996-12-26 | 1999-06-01 | Motorola, Inc. | Interactive appliance remote controller, system and method |
CN1422400A (zh) | 2000-02-08 | 2003-06-04 | 松下电器产业株式会社 | 便携终端 |
US20030122804A1 (en) | 2000-02-08 | 2003-07-03 | Osamu Yamazaki | Portable terminal |
JP2002358149A (ja) | 2001-06-01 | 2002-12-13 | Sony Corp | ユーザ入力装置 |
JP2003209399A (ja) | 2002-01-11 | 2003-07-25 | Matsushita Electric Ind Co Ltd | 部品実装機 |
CN1525389A (zh) | 2003-02-27 | 2004-09-01 | 宜 王 | 光电输入笔 |
US20090143877A1 (en) * | 2003-12-23 | 2009-06-04 | Koninklijke Philips Electronic, N.V. | Method of controlling a portable user device |
JP2005217645A (ja) | 2004-01-28 | 2005-08-11 | Sanyo Electric Co Ltd | リモコン端末、サーバ装置およびネットワーク対応機器 |
US20050212911A1 (en) * | 2004-03-23 | 2005-09-29 | Marvit David L | Gesture identification of controlled devices |
WO2009024971A2 (en) | 2007-08-19 | 2009-02-26 | Saar Shai | Finger-worn devices and related methods of use |
JP2010537302A (ja) | 2007-08-19 | 2010-12-02 | リングボウ エルティディ. | 指に装着される装置とその使用方法 |
KR20100072198A (ko) | 2007-08-19 | 2010-06-30 | 링보우 리미티드 | 반지형 장치와 그 사용방법 |
US20090146198A1 (en) | 2007-12-11 | 2009-06-11 | Samsung Electronics Co., Ltd | Photodiodes, image sensing devices and image sensors |
CN101459185A (zh) | 2007-12-11 | 2009-06-17 | 三星电子株式会社 | 光电二极管、图像感测装置及图像传感器 |
CN101600009A (zh) | 2008-06-04 | 2009-12-09 | 深圳富泰宏精密工业有限公司 | 无线控制装置及具有该控制装置的无线通信装置 |
US20090303074A1 (en) | 2008-06-04 | 2009-12-10 | Chi Mei Communication Systems, Inc. | Wireless control device and wireless communication device using the same |
JP2010054293A (ja) | 2008-08-27 | 2010-03-11 | Fujitsu Microelectronics Ltd | 車載用画像データ転送装置 |
CN101819463A (zh) | 2009-02-27 | 2010-09-01 | 株式会社电装 | 输入系统及用于该系统的可佩戴电气设备 |
JP2010204724A (ja) | 2009-02-27 | 2010-09-16 | Denso Corp | 入力システム及び電気機器 |
JP2010267220A (ja) | 2009-05-18 | 2010-11-25 | Nara Institute Of Science & Technology | ウェアラブルコンピュータに用いるリング型インタフェース、インタフェース装置、およびインタフェース方法 |
US20120025946A1 (en) | 2010-07-29 | 2012-02-02 | Hon Hai Precision Industry Co., Ltd. | Electronic device with remote function |
CN102023731A (zh) | 2010-12-31 | 2011-04-20 | 北京邮电大学 | 一种适用于移动终端的无线微型指环鼠标 |
WO2012099428A2 (ko) | 2011-01-19 | 2012-07-26 | 삼성전자 주식회사 | 방송 시스템에서의 어플리케이션 서비스 장치 및 방법 |
US20130173064A1 (en) * | 2011-10-21 | 2013-07-04 | Nest Labs, Inc. | User-friendly, network connected learning thermostat and related systems and methods |
CN102681727A (zh) | 2012-05-09 | 2012-09-19 | 闻泰通讯股份有限公司 | 一种触摸及动作感应结合的电子设备控制系统及方法 |
CN102737489A (zh) | 2012-06-12 | 2012-10-17 | 康佳集团股份有限公司 | 一种遥控器及遥控系统及基于该遥控系统的控制方法 |
CN203120084U (zh) | 2012-12-12 | 2013-08-07 | 康佳集团股份有限公司 | 一种基于触摸板的遥控系统 |
CN103076918A (zh) | 2012-12-28 | 2013-05-01 | 深圳Tcl新技术有限公司 | 基于触摸终端的远程控制方法及系统 |
CN103116973A (zh) | 2013-01-22 | 2013-05-22 | 周万荣 | 一种基于应用的遥控设备和系统 |
CN103197529A (zh) | 2013-02-06 | 2013-07-10 | 方科峰 | 一种体感腕表及应用方法 |
CN103703495A (zh) | 2013-08-23 | 2014-04-02 | 华为技术有限公司 | 一种遥控器、信息处理方法及系统 |
Non-Patent Citations (20)
Title |
---|
Foreign Communication From a Counterpart Application, Chinese Application No. 201380001602.2, Dec. 30, 2015, 13 pages. |
Foreign Communication From a Counterpart Application, European Application No. 13891790.1, Extended European Search Report dated Aug. 25, 2016, 8 pages. |
Foreign Communication From a Counterpart Application, Japanese Application No. 2016528291, English Translation of Japanese Office Action dated Feb. 21, 2017, 6 pages. |
Foreign Communication From a Counterpart Application, Japanese Application No. 2016528291, Japanese Office Action dated Feb. 21, 2017, 5 pages. |
Foreign Communication From a Counterpart Application, Korean Application No. 10-2015-7035139, English Translation of Korean Office Action dated Jan. 22, 2017, 5 pages. |
Foreign Communication From a Counterpart Application, Korean Application No. 10-2015-7035139, Korean Office Action dated Jan. 22, 2017, 6 pages. |
Foreign Communication From a Counterpart Application, PCT Application No. PCT/CN2013/082180, International Search Report dated May 30, 2014, 7 pages. |
Foreign Communication From a Counterpart Application, PCT Application No. PCT/CN2013/082180, Written Opinion dated May 30, 2014, 6 pages. |
Machine Translation and Abstract of Japanese Publication No. JP2003209399, Jul. 25, 2003, 29 pages. |
Machine Translation and Abstract of Japanese Publication No. JP2005217645, Aug. 11, 2005, 32 pages. |
Machine Translation and Abstract of Japanese Publication No. JP2010204724, Sep. 16, 2010, 34 pages. |
Machine Translation and Abstract of Japanese Publication No. JP2010267220, Nov. 25, 2010, 23 pages. |
Machine Translation and Abstract of Japanese Publication No. JP2010054293, Mar. 11, 2010, 23 pages. |
Partial English Translation and Abstract of Chinese Patent Application No. CN101819463A, Sep. 12, 2014, 7 pages. |
Partial English Translation and Abstract of Chinese Patent Application No. CN102023731A, Sep. 12, 2014, 17 pages. |
Partial English Translation and Abstract of Chinese Patent Application No. CN102681727A, Sep. 12, 2014, 39 pages. |
Partial English Translation and Abstract of Chinese Patent Application No. CN103076918, Feb. 19, 2016, 11 pages. |
Partial English Translation and Abstract of Chinese Patent Application No. CN103197529, Feb. 19, 2016, 10 pages. |
Partial English Translation and Abstract of Chinese Patent Application No. CN103703495, Sep. 12, 2014, 16 pages. |
Partial English Translation and Abstract of Chinese Patent Application No. CN1525389, Oct. 10, 2016, 9 pages. |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11400366B2 (en) | 2017-01-11 | 2022-08-02 | Sony Interactive Entertainment Inc. | Controller |
Also Published As
Publication number | Publication date |
---|---|
CN103703495B (zh) | 2017-11-17 |
WO2015024252A1 (zh) | 2015-02-26 |
EP3037946A4 (de) | 2016-09-28 |
US20150054630A1 (en) | 2015-02-26 |
JP6275839B2 (ja) | 2018-02-07 |
KR101793566B1 (ko) | 2017-11-03 |
KR20160007612A (ko) | 2016-01-20 |
EP3037946B1 (de) | 2018-01-03 |
CN103703495A (zh) | 2014-04-02 |
JP2016533577A (ja) | 2016-10-27 |
EP3037946A1 (de) | 2016-06-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9978261B2 (en) | Remote controller and information processing method and system | |
US20220083149A1 (en) | Computing interface system | |
US11543887B2 (en) | User interface control of responsive devices | |
KR102437106B1 (ko) | 마찰음을 이용하는 장치 및 방법 | |
US11409371B2 (en) | Systems and methods for gesture-based control | |
KR20170054423A (ko) | 다-표면 컨트롤러 | |
KR102297473B1 (ko) | 신체를 이용하여 터치 입력을 제공하는 장치 및 방법 | |
JP2018504798A (ja) | ジェスチャ制御方法、デバイス、およびシステム | |
WO2021160000A1 (zh) | 可穿戴设备及控制方法 | |
WO2016049842A1 (zh) | 一种便携或可穿戴智能设备的混合交互方法 | |
KR20160039589A (ko) | 손가락 센싱 방식을 이용한 무선 공간 제어 장치 | |
US20170269697A1 (en) | Under-wrist mounted gesturing | |
WO2016121034A1 (ja) | ウェアラブル装置、入力方法及びプログラム | |
CN105117026A (zh) | 带有自检功能的手势识别装置及手势识别装置的自检方法 | |
US20160139628A1 (en) | User Programable Touch and Motion Controller | |
Chen et al. | MobiRing: A Finger-Worn Wireless Motion Tracker | |
CN118012325A (zh) | 一种手势识别装置 | |
KR20190023658A (ko) | 멀티 터치 패드를 활용한 화면 컨트롤 시스템 | |
KR20120135124A (ko) | 휴대 단말에서 포인팅 디바이스를 이용하여 게임 캐릭터의 움직임을 제어하는 방법 및 장치 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: HUAWEI TECHNOLOGIES CO., LTD., CHINA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:XU, LIFU;ZHONG, SHAN;XIA, ZHAOJIE;REEL/FRAME:033796/0846 Effective date: 20140923 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
CC | Certificate of correction | ||
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |