WO2013056452A1 - Apparatus and method for remote control - Google Patents

Apparatus and method for remote control

Info

Publication number
WO2013056452A1
WO2013056452A1 (PCT/CN2011/081063)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
remote control
control device
processing device
user
Prior art date
Application number
PCT/CN2011/081063
Other languages
English (en)
French (fr)
Inventor
刘运亮
Original Assignee
深圳市协力拓展科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市协力拓展科技有限公司
Priority to PCT/CN2011/081063
Publication of WO2013056452A1

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08C - TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C 17/00 - Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08C - TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C 2201/00 - Transmission systems of control signals via wireless link
    • G08C 2201/30 - User interface
    • G08C 2201/32 - Remote control based on movements, attitude of remote control device

Definitions

  • Embodiments of the present invention generally relate to the field of information technology and, more particularly, to an apparatus and method for remote control.
  • Remote control is one of the basic functions of modern digital devices. From household TV sets, set-top boxes, VCD/DVD (Video Compact Disc / Digital Video Disc) players, Blu-ray players, hard-disk players, and audio equipment, and even home appliances such as refrigerators and washing machines, to remote-controlled toys and model aircraft, and on to the various remote working equipment, machine tools, and instruments used in industry, people often need to control these target devices through remote control devices (or simply "remote controls"). In the past, users could only perform simple functions with a remote control device, such as switching television channels, adjusting audio volume, controlling the state of an industrial instrument, and the like.
  • With the rapid development of digital technology, the interactions that users need to perform with devices through remote control devices have become increasingly diverse and complex. For example, as the traditional TV set evolves into the set-top box, users not only need remote control devices for basic functions such as switching channels and adjusting volume, but may also need to perform content retrieval, queries, on-demand playback, commenting, and even online interaction, games, and other increasingly elaborate functions. This is just one example; other remote control devices face the same problem. In the face of this rapidly growing demand, however, traditional remote control devices and the ways they are used cannot meet users' needs.
  • Traditionally, remote control devices are equipped with buttons, knobs, joysticks, and similar parts that generate corresponding control signals in response to user manipulation, for controlling the behavior of the target device.
  • Such remote control devices have significant drawbacks during use.
  • First, operating such a remote control device requires a certain level of cognitive and operational skill from the user.
  • For example, the text on the buttons of a remote control device is usually in English letters, and several different letters may be multiplexed onto a single button at the same time.
  • Operating such a remote control device therefore requires not only certain cognitive skills but also learning and mastering the multiplexed buttons. These cognitive and operational requirements limit use by children, the elderly, and similar users.
  • Another type of remote control device controls the target device through the movement of the remote control itself, making little or no use of buttons.
  • For example, the Wii game console developed by Nintendo of Japan provides a handheld remote control device; when the user holds it, a set of predefined gestures can manipulate the motion of the target device or virtual objects within it (e.g., tennis balls, bowling balls, etc.).
  • However, such remote control devices have their own limitations. For example, they often rely on their own displacement to control the motion of the target object, which requires the user to clear and maintain a certain amount of operating space for the device. The user may have to perform additional actions for this purpose, such as cleaning up and/or rearranging other items on a table. This is inconvenient for the user, especially when the available space is limited.
  • In one aspect of the present invention, a processing device is provided for use in conjunction with a remote control device for remotely controlling at least one target device.
  • The processing device includes: a touch processing device configured to generate a control signal for the target device according to a gesture input provided by the user to the remote control device; and a non-touch processing device configured to generate a control signal for the target device according to a non-gesture input provided by the user to the remote control device, the control signal to be transmitted by the remote control device to the target device.
  • In another aspect of the present invention, a remote control device for remotely controlling at least one target device is provided.
  • The remote control device includes: at least one touch component disposed on a housing of the remote control device, for receiving gesture input from a user; at least one non-touch component disposed on the housing and independent of the touch component, for receiving non-gesture input from the user; a processing device as described above, coupled to the touch component and the non-touch component; and a transmitting device coupled to the processing device and configured to transmit a control signal generated by the processing device to the target device.
  • In yet another aspect, a method of remotely controlling at least one target device using a remote control device is provided. The method includes: providing gesture input to the remote control device using at least one touch component disposed on a housing of the remote control device; and providing non-gesture input to the remote control device using at least one non-touch component disposed on the housing and independent of the touch component.
  • In a further aspect, a computer program product is provided, comprising machine-executable instructions tangibly stored on a computer-readable medium which, when executed by a processor, implement the methods described above.
  • As will be understood from the description below, according to embodiments of the present invention a touch component for gesture input (e.g., a touchpad) and a non-touch component (e.g., a button or a knob) can be disposed on the remote control device at the same time.
  • In this way, the user can flexibly use gesture input, non-gesture input, or a combination thereof to control the remote device according to actual needs.
  • As a result, the target device can be remotely controlled more efficiently and accurately.
  • FIG. 1 is a block diagram showing the structure of a remote control device according to an exemplary embodiment of the present invention
  • FIGS. 2 to 4 illustrate schematic arrangements of touch components and non-touch components on a housing according to exemplary embodiments of the present invention;
  • FIG. 5 illustrates a flow chart of a method of controlling at least one target device using a remote control device in accordance with an exemplary embodiment of the present invention.
  • Referring first to FIG. 1, there is shown a block diagram of a remote control device (or remote controller) 100 in accordance with an exemplary embodiment of the present invention.
  • the remote control device 100 can include a touch component 102 that is configured to receive gesture input from a user of the remote control device 100.
  • the remote control device 100 typically has a housing.
  • Touch device 102 can be disposed at any suitable location on the housing in accordance with an embodiment of the present invention.
  • the touch component 102 can occupy a major portion and/or area of the housing of the remote control device 100. In such an embodiment, the touch component 102 can be operated by the user more easily.
  • As used herein, the term "touch component" refers to a component that achieves information input by using direct contact (or proximity) to generate an electrical signal.
  • Any suitable device currently known or developed in the future can be used as the touch component 102.
  • touch component 102 can be a touchpad, such as a capacitive touchpad, an inductive touchpad, or any other suitable touchpad.
  • a device such as a TrackPoint may also function as touch component 102.
  • an interactive screen that allows direct manipulation thereof can also be used as the touch component 102.
  • One example of an interactive screen is a touch screen, which allows the user to input information by touching the screen and performing corresponding actions on it (e.g., tapping, moving, etc.).
  • Another example of an interactive screen is a proximity screen where the user simply has to move the interactive tool close to the screen without actually making contact.
  • When a touch/proximity screen is employed as the touch component 102, it can additionally be used to provide visual feedback to the user.
  • virtual interactive devices such as soft-keys may also be present on the screen.
  • As used herein, "gesture" refers to input provided by the user through a trajectory produced by the movement of the hand.
  • the gesture input can be provided to the touch component 102 directly by the user's finger.
  • any suitable tool may be utilized in conjunction with touch component 102 to receive gesture input from the user.
  • such a tool may include a dedicated stylus or a conventional pen, and the like.
  • Note that one or more touch components 102 may be disposed on the housing of the remote control device 100 as needed, and the types of the respective touch components 102 may differ.
  • A non-touch component 104, independent of the touch component 102, is also disposed on the housing of the remote control device 100.
  • the term "non-touch component” as used herein refers to any suitable component capable of converting a user's physical operation of the component to an electrical signal.
  • the non-touch component can include one or more non-touch components such as buttons, joysticks, knobs, and the like.
  • the non-touch component 104 is for receiving non-gesture input from a user.
  • such mechanical operation can be converted to an electrical signal when the user presses the button a single or multiple times, as is known in the art.
  • This electrical signal constitutes the "non-gesture input” as referred to herein.
  • electrical signals generated by mechanical manipulation of components such as joysticks, knobs, etc. can also produce non-gesture inputs.
  • the remote control device 100 also includes a processing device 106 that is typically disposed within the housing of the remote control device 100.
  • In accordance with embodiments of the present invention, processing device 106 is coupled to the touch component 102 and the non-touch component 104 and is configured to generate a control signal based on gesture input from the touch component 102 and/or non-gesture input from the non-touch component 104.
  • Processing device 106 may be implemented by any suitable device currently known or developed in the future, including but not limited to one or more of the following: a dedicated or general-purpose microcontroller (MCU), a microprocessor, a central processing unit (CPU), and so on. Moreover, processing device 106 can be implemented using any suitable hardware, software, firmware, and/or combination thereof.
  • the processing device 106 in accordance with an embodiment of the present invention includes a touch processing device 1062 configured to generate a control signal for a target device based on a gesture input provided by the user to the remote control device 100 via the touch component 102.
  • In operation, the touch processing device 1062 can determine the user's operational trajectory based on the electrical signals generated by the user's touching and manipulation of the touch component 102, and from the trajectory determine the user's gesture. The touch processing device 1062 can then generate a corresponding control signal according to the input gesture, for controlling the operation of the target device.
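The trajectory-to-gesture determination described above is not tied to any particular algorithm by this disclosure. As a purely illustrative sketch (all names hypothetical, and assuming coordinates with y increasing upward), a stroke could be classified by its dominant net displacement:

```python
# Illustrative sketch only: turn an operational trajectory (a sequence
# of (x, y) contact points read from the touch component) into a named
# gesture by looking at the stroke's dominant net displacement.
# Assumes y grows upward; a real device could use any richer method.
def classify_gesture(trajectory):
    (x0, y0), (x1, y1) = trajectory[0], trajectory[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "slide_right" if dx >= 0 else "slide_left"
    return "slide_up" if dy > 0 else "slide_down"

# A mostly-vertical upward stroke is recognized as a "slide up",
# which might then map to a volume-increase control signal.
print(classify_gesture([(5, 0), (6, 4), (5, 9)]))  # slide_up
```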
  • a correspondence between a user action (ie, a gesture) and a target device operation may be predetermined and stored.
  • such correspondence may be stored in a memory (not shown) accessible by processing device 106.
  • For example, to remotely control an audio device, the following correspondence may be predetermined and stored: the user making a "slide up" gesture on the touch component 102 indicates that the user wishes to increase the volume of the target audio device.
  • Note that the correspondence between gestures and target device operations does not have to be one-to-one.
  • a gesture may implement multiple different operations in different scenarios; conversely, multiple gestures may also be used to achieve the same operation.
  • this correspondence can be dynamically customized according to factors such as application scenarios, requirements, device status, etc., which will be described in detail below.
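The predetermined, dynamically customizable correspondence between user actions and target device operations described above could be stored as a simple lookup table. The following sketch is hypothetical (the gesture names and operation codes are invented for illustration):

```python
# Hypothetical sketch of the stored correspondence between gestures
# and target-device operations. A plain dict allows the mapping to be
# dynamically customized per application scenario, as described above.
DEFAULT_MAP = {
    "slide_up": "VOLUME_UP",
    "slide_down": "VOLUME_DOWN",
    "slide_left": "CHANNEL_PREV",
    "slide_right": "CHANNEL_NEXT",
}

class GestureMapper:
    def __init__(self, mapping=None):
        # Copy, so customization never mutates the shared default table.
        self.mapping = dict(mapping or DEFAULT_MAP)

    def control_signal(self, gesture):
        # Unknown gestures produce no control signal.
        return self.mapping.get(gesture)

    def customize(self, gesture, operation):
        # Dynamic customization: rebind a gesture to a new operation.
        self.mapping[gesture] = operation

mapper = GestureMapper()
print(mapper.control_signal("slide_up"))   # VOLUME_UP
mapper.customize("slide_up", "PAGE_UP")    # e.g. in a browsing scenario
print(mapper.control_signal("slide_up"))   # PAGE_UP
```

Note that nothing here enforces a one-to-one relation: several gestures may map to the same operation, and customization can rebind a gesture at any time.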
  • As used herein, "single touch" means that at any given moment only one point (for example, one of the user's fingers, or the tip of a stylus) is in contact with or in proximity to the touch component 102 in order to realize information input to the remote control device.
  • Typical single touch operations include, for example, "drag and drop".
  • To support single touch operation, touch processing device 1062 can include a single touch processing device configured to generate a control signal based on single touch gesture input provided by a user via touch component 102.
  • In accordance with certain embodiments of the present invention, the remote control device 100 supports handwriting recognition of gesture input provided by a user via the touch component 102.
  • the user can input text through the touch part of the remote control device.
  • the remote control device can recognize the gesture input representing the text and transmit the recognized result to the target device to be controlled for various purposes.
  • To this end, the touch processing device 1062 can include a handwriting recognition device. Because handwritten content is usually input in single-point operation mode, the handwriting recognition device is typically coupled to the single touch processing device.
  • the handwriting recognition device can receive a single touch gesture input from the single touch processing device and perform an identification operation thereon.
  • the touch processing device 1062 can then use the identified result in the process of generating the control signal.
  • the remote control device can instruct the target device to display the recognized characters on its display.
  • the handwriting recognition device in touch processing device 1062 may employ any handwriting recognition technology currently known or developed in the future to identify a single touch input provided by a user.
  • Such techniques include, for example and without limitation, matching-based recognition, statistical-model-based recognition, neural-network-based recognition, genetic-method-based recognition, and the like. These are merely examples and are not intended to limit the scope of the invention.
  • these identification techniques can be implemented in software or hardcoded into any suitable hardware.
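As a concrete illustration of the matching-based family mentioned above (a toy sketch only; the disclosure does not prescribe any particular algorithm, and the templates and names here are invented), a stroke can be resampled to a fixed number of points, translated to its centroid, and compared to stored templates by mean point-to-point distance:

```python
# Toy matching-based recognizer: resample a stroke to N points,
# center it on its centroid, and pick the nearest stored template by
# mean point-to-point distance. Templates and names are invented.
import math

N = 16  # points per normalized stroke

def normalize(stroke):
    # Resample by linear interpolation over the point index, then
    # subtract the centroid so absolute position does not matter.
    out = []
    for i in range(N):
        t = i * (len(stroke) - 1) / (N - 1)
        j = min(int(t), len(stroke) - 2)
        f = t - j
        x = stroke[j][0] + f * (stroke[j + 1][0] - stroke[j][0])
        y = stroke[j][1] + f * (stroke[j + 1][1] - stroke[j][1])
        out.append((x, y))
    cx = sum(p[0] for p in out) / N
    cy = sum(p[1] for p in out) / N
    return [(x - cx, y - cy) for x, y in out]

def distance(a, b):
    return sum(math.dist(p, q) for p, q in zip(a, b)) / N

def recognize(stroke, templates):
    norm = normalize(stroke)
    return min(templates, key=lambda name: distance(norm, templates[name]))

# Two trivial single-stroke templates: "-" and "|".
templates = {
    "-": normalize([(0, 0), (10, 0)]),
    "|": normalize([(0, 0), (0, 10)]),
}
print(recognize([(1, 5), (9, 5)], templates))  # -
```

A production recognizer would also normalize scale and rotation and use many more templates per character, but the structure (normalize, then nearest template) is the same.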
  • As used herein, "multi-touch" means that at any given moment at least two points (e.g., two or more of the user's fingers) are in contact with or in proximity to the touch component 102 in order to realize information input to the remote control device.
  • Typical multi-touch operations include, for example, a one-handed, two-finger "magnify" operation, and the like.
  • multi-touch input can provide a more convenient and rich gesture language for the user, and is therefore desirable.
  • To support multi-touch operation, touch processing device 1062 can include a multi-touch processing device configured to generate a control signal based on multi-touch gesture input provided by a user via touch component 102.
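Single touch and multi-touch processing can be told apart simply by the number of simultaneous contact points. A hypothetical sketch (the names and the derived quantity are invented for illustration):

```python
# Hypothetical dispatch between single-touch and multi-touch handling
# based on the number of simultaneous contact points, plus a derived
# two-finger "magnify" factor as an example multi-touch quantity.
import math

def process_touch(points):
    # points: the (x, y) contacts present at this instant.
    if len(points) == 0:
        return ("no_touch", None)
    if len(points) == 1:
        return ("single_touch", points[0])
    # Two-finger spread distance; comparing it between two instants
    # yields the magnify/shrink factor of a pinch gesture.
    return ("multi_touch", math.dist(points[0], points[1]))

before = process_touch([(0, 0), (3, 4)])  # spread 5.0
after = process_touch([(0, 0), (6, 8)])   # spread 10.0
zoom = after[1] / before[1]
print(zoom)  # 2.0
```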
  • As shown in FIG. 1, the processing device 106 further includes a non-touch processing device 1064 configured to generate a control signal for the target device based on non-gesture input provided by the user to the remote control device 100 via the non-touch component 104.
  • the non-gesture input referred to here comes from the user's mechanical operation of non-touch components such as buttons, knobs, and joysticks.
  • the electrical signals generated thereby can be received by the non-touch processing device 1064.
  • the non-touch processing device 1064 generates a corresponding control signal, for example, based on the corresponding relationship between the user action and the target device operation.
  • Optionally, the processing device 106 may further include a customization device 1066 configured to customize the touch processing device 1062 and/or the non-touch processing device 1064 in response to a user command, thereby dynamically altering the functionality of the touch component 102 and/or the non-touch component 104.
  • In other words, the target device operations corresponding to the touch component 102 and/or the non-touch component 104 can be dynamically modified by providing commands (gesture and/or non-gesture inputs) to the remote control device 100.
  • the customization or modification of such functionality is accomplished by reconfiguration of touch processing device 1062 and/or non-touch processing device 1064.
  • the customization device 1066 can modify the correspondence between user actions and target device operations to customize the functionality of the touch component 102 and/or the non-touch component 104.
  • Alternatively or additionally, the customization device 1066 can enable or disable a particular touch component 102 and/or non-touch component 104.
  • processing device 106 can generate a control signal based on any of a gesture input from touch component 102 and a non-gesture input from non-touch component 104.
  • The processing device 106 can also generate control signals based on both types of input simultaneously; that is, the touch processing device 1062 and the non-touch processing device 1064 can work independently or in cooperation.
  • the remote control device 100 allows the user to use the touch component 102 or the non-touch component 104 alone, and may also allow the user to use the touch component 102 and the non-touch component 104 simultaneously. This provides users with a richer and more natural means of interaction.
  • remote control device 100 includes a transmitting device 108 coupled to processing device 106 for transmitting control signals generated by processing device 106 from remote control device 100 to a target device.
  • The transmitting device 108 can send control signals over a wireless connection, including but not limited to: a Bluetooth connection, an infrared (IR) connection, a wireless local area network (WLAN) connection, a radio frequency (RF) connection, a wireless universal serial bus (WUSB) connection, and so on.
  • Alternatively or additionally, the transmitting device 108 can transmit control signals to the target device via a wired connection, including but not limited to: a universal serial bus (USB) connection, an RS-232 serial connection, and so on.
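The disclosure does not specify how a control signal is framed on the wire. As an illustration only, a transmitting device might serialize each control signal into a small checksummed frame before handing it to the Bluetooth/IR/WLAN/USB layer (the layout below is invented):

```python
# Invented, minimal wire format for a control signal: 1-byte opcode,
# 1-byte payload length, payload bytes, then a single XOR checksum.
def encode_frame(opcode, payload=b""):
    body = bytes([opcode, len(payload)]) + payload
    checksum = 0
    for b in body:
        checksum ^= b
    return body + bytes([checksum])

def decode_frame(frame):
    body, checksum = frame[:-1], frame[-1]
    c = 0
    for b in body:
        c ^= b
    if c != checksum:
        raise ValueError("corrupted frame")
    length = body[1]
    return body[0], body[2:2 + length]

frame = encode_frame(0x10, b"VOL+")  # e.g. opcode 0x10 = "key event"
print(decode_frame(frame))  # (16, b'VOL+')
```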
  • The components shown in FIG. 1 can be implemented using software and/or firmware. Alternatively or additionally, they may be implemented partially or completely in hardware, including but not limited to: integrated circuit (IC) chips, application-specific integrated circuits (ASICs), systems on a chip (SoC), and the like. Other means now known or later developed are also possible, and the scope of the invention is not limited in this respect.
  • Note that a single device shown in FIG. 1 may be implemented by multiple devices in alternative embodiments; conversely, multiple devices shown in FIG. 1 may be implemented by a single device. As an example, in the embodiment shown in FIG. 1, a handwriting recognition device is included in the touch processing device 1062; in some other embodiments, the two may be independent of one another. In general, based on the teachings of the present invention, functions can be flexibly distributed among the devices according to actual needs, and the scope of the present invention is not limited in this respect.
  • the touch member and the non-touch member may be disposed on the housing of the remote control device in various positional relationships.
  • Several exemplary possible arrangements are described below with reference to Figures 2 through 4.
  • As shown in FIG. 2, a touch component 202 (in this case, a touchpad) and a set of non-touch components 204 (in this case, buttons) are disposed on the same side of the housing of the remote control device.
  • the non-touch component 204 can be a function button that implements any suitable function.
  • Figure 3 shows another possible example.
  • In this example, the touch component 302 and the non-touch components 304-1 and 304-2 are disposed on the same side of the housing of the remote control device 300, while the non-touch component 304-3 is disposed on a different side of the housing.
  • Note that although the non-touch component 304-3 is disposed on one side of the housing in the example shown in FIG. 3, this is merely exemplary.
  • One or more touch components and/or non-touch components may also be disposed on other sides and/or back of the housing. The scope of the invention is not limited in this respect.
  • FIG. 4 illustrates the front and back of an exemplary remote control device 400 housing.
  • In this example, a touch component 402-1 and three sets of non-touch components 404-1, 404-2, and 404-3 are disposed on the front of the housing of the remote control device 400, while a set of non-touch components 404-4 is disposed on the sides of the housing.
  • another touch component 402-2 and another set of non-touch components 404-5 are disposed on the back of the housing of the remote control device 400.
  • Referring now to FIG. 5, there is shown a flow diagram of a method 500 of controlling a target device using a remote control device in accordance with an exemplary embodiment of the present invention.
  • At step S502, the user can provide gesture input to the remote control device using at least one touch component disposed on the housing of the remote control device.
  • the gesture input provided herein may include at least one of: a single touch input, and a multi-touch input.
  • the single touch input can include handwriting input.
  • the touch component disposed on the remote control device may include at least one of the following: a touch pad, a pointing stick, a touch screen, and a proximity screen.
  • At step S504, the user can provide non-gesture input to the remote control device using at least one non-touch component disposed on the housing of the remote control device and independent of the touch component.
  • the remote control device can convert such mechanical operation into an electrical signal to provide a corresponding non-gestural input.
  • the non-touch component disposed on the remote control device may include at least one of the following: a button, a joystick, and a knob.
  • The method 500 may also proceed to an optional step S506, where the user provides a command to the remote control device to customize or modify the functionality of at least one of the touch component and the non-touch component.
  • Such a command can be provided using gesture input and/or non-gesture input.
  • the method 500 ends after step S506.
  • Note that although step S504 is illustrated as being performed after step S502, this is for illustrative purposes only. There is no order dependency between steps S502 and S504; they can be performed in any order, or even simultaneously, and may be repeated alternately. Likewise, step S506 has no order dependency with the steps shown before it. Moreover, method 500 can include additional steps and/or omit some of the steps shown.
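Because steps S502, S504, and S506 have no fixed order, the method reduces to an event loop that handles whichever kind of input arrives. A hypothetical sketch (gesture names and operation codes are invented):

```python
# Sketch of the control flow implied by method 500: gesture events
# (S502) and non-gesture events (S504) may arrive in any order and
# may repeat; customization commands (optional S506) may arrive at
# any point and rebind the mapping used for later events.
def run_remote(events):
    mapping = {"slide_up": "VOLUME_UP", "power_button": "POWER"}
    signals = []
    for kind, data in events:
        if kind == "customize":  # optional step S506
            mapping.update(data)
        else:                    # "gesture" (S502) or "non_gesture" (S504)
            op = mapping.get(data)
            if op is not None:
                signals.append(op)
    return signals

print(run_remote([
    ("gesture", "slide_up"),
    ("non_gesture", "power_button"),
    ("customize", {"slide_up": "PAGE_UP"}),
    ("gesture", "slide_up"),
]))  # ['VOLUME_UP', 'POWER', 'PAGE_UP']
```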
  • The remote control device according to embodiments of the present invention is applicable to devices in many fields, including home use, industry, and scientific experimentation. Users can flexibly use gesture input, non-gesture input, or a combination of the two to achieve various complex interactive functions according to actual needs. For example, to interact with a TV set-top box, a user can replace a gamepad with such a remote control device.
  • The remote control device described above and the corresponding methods can be used to control set-top-box-type devices that can run various operating systems such as Android, Windows, iOS, MeeGo, and the like.
  • the main functions of such set-top box devices are Internet browsing (service), video playback, and games (based on the corresponding operating system).
  • With a remote control device according to an embodiment of the present invention, the operation of the above applications can be effectively simplified, thereby providing a better user experience.
  • The communication between the remote control device and the target device can be based on Bluetooth, 2.4 GHz wireless technology, and the like, in order to support a high-rate connection.
  • The processing device of the remote control device (for example, in the form of a chip) or the remote control device itself may be provided.
  • A driver and application interface adapted to the operating system of the target device, or even a custom-developed user interface (UI), may also be provided.
  • a component for inputting a gesture and a component for non-gesture input can be simultaneously disposed on the remote control device.
  • the user can flexibly utilize gestures, non-gestures, or a combination thereof to control the remote device according to actual needs.
  • This not only reduces the cognitive burden and interaction difficulty when using the remote control device, but also makes complex application operations easy to realize.
  • the target device can be controlled more efficiently and accurately.
  • embodiments of the present invention may be implemented by hardware, software, or a combination of software and hardware.
  • The hardware portion can be implemented using dedicated logic; the software portion can be stored in memory and executed by a suitable instruction execution system, such as a microprocessor, or by specially designed hardware.
  • For example, embodiments may be implemented as processor control code provided on a carrier medium such as a magnetic disk, CD, or DVD-ROM, on programmable memory such as read-only memory (firmware), or on a data carrier such as an optical or electronic signal carrier.
  • The devices and modules of the present invention can be implemented by hardware circuitry, for example very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices; they can also be implemented in software executed by various types of processors, or by a combination of such hardware circuitry and software, such as firmware.
  • The communication network mentioned in the specification may include various types of networks, including but not limited to: a local area network ("LAN"), a wide area network ("WAN"), a network based on the IP protocol (for example, the Internet), and a peer-to-peer network (for example, an ad hoc peer-to-peer network).


Abstract

Embodiments of the present invention relate to an apparatus and method for remote control. Specifically, disclosed is a processing device for use in conjunction with a remote control device for remotely controlling at least one target device, the processing device comprising: a touch processing device configured to generate a control signal for the target device according to gesture input provided by a user to the remote control device; and a non-touch processing device configured to generate a control signal for the target device according to non-gesture input provided by the user to the remote control device, the control signal to be transmitted by the remote control device to the target device. A corresponding remote control device and method are also disclosed. According to embodiments of the present invention, more effective, natural, and accurate remote control of a remote device can be achieved.

Description

用于遥控的设备和方法 技术领域
本发明的实施方式总体上涉及信息技术领域, 更具体地, 涉及用 于遥控的设备和方法。 背景技术
Remote control is one of the basic functions of modern digital equipment. From household appliances such as televisions, set-top boxes, VCD/DVD (Video Compact Disc / Digital Video Disc) players, Blu-ray players, hard-disk players, audio equipment, and even refrigerators and washing machines, to remote-controlled toys and model aircraft, and on to the various remote operation devices, machine tools, and instruments used in industry, people frequently need to control such target devices by means of a remote control device (or simply a "remote control"). In the past, users could accomplish only simple functions with a remote control device, such as switching television channels, adjusting the volume of an audio system, controlling the state of an industrial instrument, and so on.

With the rapid development of digital technology, the interactions that users need to carry out with devices through a remote control device have become increasingly diverse and complex. For example, as the traditional television evolves into the set-top box, a user not only needs the remote control device for basic functions such as switching channels and adjusting volume, but may also need increasingly elaborate functions such as content retrieval, querying, on-demand playback, commenting, and even online interaction and gaming. This is merely one example; other remote control devices face the same problem. Faced with this rapid growth in demand, however, traditional remote control devices and their methods of use cannot satisfy users' needs.
Traditionally, a remote control device is usually equipped with components such as buttons, knobs, and joysticks, which generate corresponding control signals in response to the user's manipulation so as to control the behavior of the target device. Such remote control devices have obvious drawbacks in use. First, operating them requires a certain level of cognitive and operational skill from the user. For example, the text on the buttons of a remote control device is usually in English letters, and a single button may multiplex several different letters; operating such a device not only requires certain cognitive skills but also requires learning and mastering the use of multiplexed buttons. These requirements limit use by children, the elderly, and other users. Moreover, the process of using such remote control devices is not sufficiently user-friendly. Implementing ever more numerous and complex interactive functions with a wired keyboard increases the user's interaction burden and distracts the user's attention from the target device or the task itself. From the perspective of human-computer interaction, this inevitably harms the end-user experience.
Another class of remote control devices controls the target device through the motion of the remote control device itself, using few or even no buttons. For example, the Wii game console developed by Nintendo of Japan provides a handheld remote control device; when the user holds it, a set of predefined gestures can be used to manipulate the target device or the motion of virtual objects in it (for example, a tennis ball or a bowling ball). However, such remote control devices have their own limitations. For example, they often rely on the displacement of the device itself to control the motion of the target object, which requires the user to clear and maintain a certain operating space for the remote control device. The user may have to perform many additional actions for this purpose, such as clearing and/or rearranging other items on a desk. This is inconvenient for the user, and the problem is especially acute when the available space is limited. For example, many current means of transport (for example, aircraft, trains, taxis) provide users with dedicated interactive electronic devices for program on-demand, information query, map navigation, and the like. In such cases the space available to the user is relatively cramped (for example, because of the seat layout or other nearby users), and moving the remote control device itself becomes even more inconvenient or even unrealistic.

Summary of the Invention

In view of the problems above, there is currently a need in the art for a remote control device and method that allow a user to remotely control a target device more naturally, efficiently, and accurately. The present invention provides such a device and method.
In one aspect of the present invention, a processing apparatus for use with a remote control device is provided, the remote control device being used to remotely control at least one target device. The processing apparatus includes: a touch processing apparatus configured to generate a control signal for the target device according to gesture input provided by a user to the remote control device; and a non-touch processing apparatus configured to generate a control signal for the target device according to non-gesture input provided by the user to the remote control device, the control signal to be transmitted by the remote control device to the target device.

In another aspect of the present invention, a remote control device for remotely controlling at least one target device is provided. The remote control device includes: at least one touch component arranged on the housing of the remote control device, for receiving gesture input from a user; at least one non-touch component arranged on the housing and independent of the touch component, for receiving non-gesture input from the user; the processing apparatus described above, coupled to the touch component and the non-touch component; and a transmitting apparatus, coupled to the processing apparatus, configured to transmit the control signal generated by the processing apparatus to the target device.

In yet another aspect of the present invention, a method of using a remote control device to remotely control at least one target device is provided. The method includes: providing gesture input to the remote control device using at least one touch component arranged on the housing of the remote control device; and providing non-gesture input to the remote control device using at least one non-touch component arranged on the housing of the remote control device and independent of the touch component.

In a further aspect of the present invention, a computer program product is provided. The computer program product includes machine-executable instructions tangibly stored on a computer-readable medium which, when executed by a processor, implement the method described above.
As will be understood from the description below, according to embodiments of the present invention a touch component for gesture input (for example, a touchpad) and a non-touch component (for example, buttons or knobs) can be arranged on a remote control device at the same time. In this way, the user can flexibly use gesture input, non-gesture input, or a combination of the two to control the remote device according to actual needs. This not only reduces the user's cognitive burden and interaction difficulty when using the remote control device, but also makes complex operations on the target device easy to accomplish. The target device can thus be remotely controlled more efficiently and accurately.

Brief Description of the Drawings

The above and other objects, features, and advantages of embodiments of the present invention will become readily understood by reading the detailed description below with reference to the accompanying drawings. In the drawings, several embodiments of the present invention are shown by way of example and not limitation, in which:

Fig. 1 shows a structural block diagram of a remote control device according to an exemplary embodiment of the present invention;

Figs. 2-4 show schematic arrangements of touch components and non-touch components on the housing according to exemplary embodiments of the present invention; and

Fig. 5 shows a flowchart of a method of using a remote control device according to an exemplary embodiment of the present invention to control at least one target device.

In the drawings, identical or corresponding reference numerals denote identical or corresponding parts.

Detailed Description
The principles and spirit of the present invention will be described below with reference to several exemplary embodiments. It should be understood that these embodiments are given merely to enable those skilled in the art to better understand and thereby implement the present invention, and do not limit the scope of the present invention in any way.

Reference is first made to Fig. 1, which shows a structural block diagram of a remote control device (or remote control) 100 according to an exemplary embodiment of the present invention.
As shown, the remote control device 100 may include a touch component 102 configured to receive gesture input from a user of the remote control device 100. It will be appreciated that the remote control device 100 typically has a housing. According to embodiments of the present invention, the touch component 102 may be arranged at any suitable location on the housing. According to some embodiments, the touch component 102 may occupy the major portion and/or area of the housing of the remote control device 100. In such embodiments the user can operate the touch component 102 more easily.

As used herein, the term "touch component" refers to a component that generates an electrical signal through direct contact (or proximity) to accomplish information input. Any suitable device, whether currently known or developed in the future, may be used as the touch component 102. For example, the touch component 102 may be a touchpad, such as a capacitive touchpad, an inductive touchpad, or any other suitable touchpad. Alternatively or additionally, a device such as a pointing stick (TrackPoint) may also serve as the touch component 102.

In addition, an interactive screen that allows direct operation on it may also be used as the touch component 102. One typical example of such a screen is a touch screen, on which the user can input information by touching the screen and performing corresponding actions on it (for example, tapping or moving). Another example of an interactive screen is a proximity screen, which the user can operate simply by bringing an interaction tool close to the screen without any actual contact. It will be appreciated that in embodiments in which a touch/proximity screen serves as the touch component 102, the touch component 102 can also be used to provide visual feedback to the user. Moreover, in embodiments in which a screen is used as the touch component 102, virtual interaction means such as soft keys may also be present on the screen.
The term "gesture" refers to input provided by the user through the trajectory produced by the movement of the hand. Gesture input may be provided to the touch component 102 directly through the user's fingers. Alternatively or additionally, any suitable tool may work in conjunction with the touch component 102 to receive the user's gesture input. For example, such a tool may include a dedicated stylus or an ordinary pen.

It should be noted that, although only one touch component 102 is shown in the figure, the scope of the present invention is not limited thereto. As needed, more than one touch component 102 may be arranged on the housing of the remote control device 100, and the individual touch components 102 may be of different types.
In addition to the touch component 102, as shown in the figure, at least one non-touch component 104 independent of the touch component 102 is also arranged on the housing of the remote control device 100. As used herein, the term "non-touch component" refers to any suitable component capable of converting the user's physical operation of the component into an electrical signal. For example, the non-touch component may include one or more components such as buttons, joysticks, and knobs.

In embodiments of the present invention, the non-touch component 104 is used to receive non-gesture input from the user. For example, in the case of a button, when the user presses the button once or several times, this mechanical operation can be converted into an electrical signal, as is known in the art. Such an electrical signal constitutes the "non-gesture input" referred to herein. Similarly, electrical signals produced by mechanical operation of components such as joysticks and knobs can also produce non-gesture input.
Continuing with Fig. 1, the remote control device 100 further includes a processing apparatus 106, which is usually arranged inside the housing of the remote control device 100. According to embodiments of the present invention, the processing apparatus 106 is coupled to the touch component 102 and the non-touch component 104, and is configured to generate a control signal based on gesture input from the touch component 102 and/or non-gesture input from the non-touch component 104.

In embodiments of the present invention, the processing apparatus 106 may be implemented by means of any suitable device, whether currently known or developed in the future, including but not limited to one or more of the following: a dedicated or general-purpose microcontroller (MCU), a microprocessor, a central processing unit (CPU), and so on. Moreover, the processing apparatus 106 may be implemented using any suitable hardware, software, firmware, and/or combination thereof.
Specifically, as shown in the figure, the processing apparatus 106 according to an embodiment of the present invention includes a touch processing apparatus 1062 configured to generate a control signal for the target device according to gesture input provided by the user to the remote control device 100 through the touch component 102.

The touch processing apparatus 1062 can determine the user's operation trajectory from the electrical signals produced by the user's touches and operations on the touch component 102, and thereby determine the user's gesture. The touch processing apparatus 1062 can then generate a corresponding control signal according to the input gesture, for controlling the operation of the target device. According to embodiments of the present invention, the correspondence between user actions (i.e., gestures) and target-device operations may be determined and stored in advance. For example, such a correspondence may be stored in a memory (not shown) accessible to the processing apparatus 106. As an example, when remotely controlling an audio system, the following correspondence may be determined and stored in advance: the user performing a "swipe up" gesture on the touch component 102 indicates that the user wishes to raise the volume of the target audio system.
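The pre-stored correspondence between gestures and target-device operations described above can be sketched as a simple lookup table. This is a minimal illustration only; the gesture names and operation codes below are assumptions for the example, not part of the invention:

```python
# Hypothetical gesture-to-operation table for an audio target device.
# Gesture names and operation codes are illustrative assumptions.
GESTURE_MAP = {
    "swipe_up": "VOLUME_UP",
    "swipe_down": "VOLUME_DOWN",
    "tap": "PLAY_PAUSE",
}

def control_signal_for(gesture):
    """Return the control signal for a recognized gesture, or None
    if the gesture has no stored correspondence."""
    return GESTURE_MAP.get(gesture)
```

As the description notes, such a table need not be one-to-one and can be replaced or extended dynamically.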
Note that what has been described above is merely an example. In fact, the correspondence between gestures and target-device operations need not be one-to-one. In other words, one gesture may accomplish several different operations in different scenarios; conversely, several gestures may be used to accomplish the same operation. Moreover, this correspondence may also be dynamically customized according to factors such as the application scenario, requirements, and device state, as will be detailed below.
In particular, according to some embodiments of the present invention, while using the touch component 102 of the remote control device 100, the user is allowed to provide gesture input through single-touch operations. "Single touch" here means that at any given moment only one point (for example, one of the user's fingers, or the tip of a stylus) touches or approaches the touch component 102 to accomplish information input to the remote control device. A typical single-touch operation is, for example, a "drag" operation.

Correspondingly, in these embodiments the touch processing apparatus 1062 may include a single-touch processing apparatus configured to generate the control signal based on single-touch gesture input provided by the user through the touch component 102. Note that various known techniques for processing single-touch operations already exist in the art, and all of them may be used in conjunction with the single-touch processing apparatus. In addition, apparatus developed in the future for processing single-touch operations to generate control signals may also serve as the single-touch processing apparatus. The scope of the present invention is not limited in this respect.
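One simple way to turn a single-touch drag trajectory into a discrete gesture is to compare its start and end points. The sketch below is an assumption-laden illustration (the y-axis growing downward, as on many touch panels, is assumed), not a technique specified by the invention:

```python
def classify_drag(points):
    """Classify a single-touch trajectory (a list of (x, y) samples)
    as a swipe direction by comparing its start and end points.
    Assumes y grows downward, as on many touch panels."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"
```

A real single-touch processing apparatus would also debounce samples and apply a minimum-distance threshold before classifying.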
In particular, according to some embodiments of the present invention, the remote control device 100 supports recognition of gesture input provided by the user through the touch component 102. In this way, the user can input text through the touch component of the remote control device. Correspondingly, the remote control device can recognize gesture input representing text and transmit the recognition result to the target device to be controlled, for various purposes.

To this end, as shown in Fig. 1, the touch processing apparatus 1062 may include a handwriting recognition apparatus. It will be appreciated that the user's handwritten content is usually input in single-point operation mode. Therefore, according to embodiments of the present invention, the handwriting recognition apparatus is usually coupled to the single-touch processing apparatus. The handwriting recognition apparatus can receive single-touch gesture input from the single-touch processing apparatus and perform a recognition operation on it. The touch processing apparatus 1062 can then use the recognition result in generating the control signal. For example, the remote control device may instruct the target device to display the recognized characters on its display.
According to embodiments of the present invention, the handwriting recognition apparatus in the touch processing apparatus 1062 may employ any handwriting recognition technique, whether currently known or developed in the future, to recognize the single-touch input provided by the user. Such techniques include, but are not limited to: matching-based recognition, recognition based on statistical models, recognition based on neural networks, recognition based on genetic methods, and so on. These are merely examples and are not intended to limit the scope of the present invention. Moreover, it should be noted that these recognition techniques may be implemented in software or hard-coded into any suitable hardware.
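To make the matching-based recognition mentioned above concrete, here is a deliberately tiny sketch that quantizes a stroke into 4-way direction codes and matches them against stored templates. The two templates and their character labels are invented for illustration; real recognizers store many templates per class and use more robust distance measures:

```python
def directions(points):
    """Quantize a stroke (list of (x, y) samples, y growing downward)
    into a string of 4-way direction codes: R, L, D, U."""
    out = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) >= abs(dy):
            out.append("R" if dx > 0 else "L")
        else:
            out.append("D" if dy > 0 else "U")
    return "".join(out)

# Toy templates for two shapes; purely illustrative.
TEMPLATES = {"RD": "7", "DR": "L"}

def recognize(points):
    """Return the template character matching the stroke, or None."""
    return TEMPLATES.get(directions(points))
```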
In addition to or as a supplement to single-touch operations, according to some embodiments of the present invention, while using the touch component 102 of the remote control device 100 the user is also allowed to provide gesture input through multi-touch operations. "Multi-touch" here means that at any given moment at least two points (for example, two or more of the user's fingers) simultaneously touch or approach the touch component 102 to accomplish information input to the remote control device. A typical multi-touch operation is, for example, a "zoom in" operation using two fingers of one hand. In some cases, multi-touch input can provide the user with a more convenient and richer gesture language and is therefore desirable.

Correspondingly, in these embodiments the touch processing apparatus 1062 may include a multi-touch processing apparatus configured to generate the control signal based on multi-touch gesture input provided by the user through the touch component 102. Note that various known techniques for processing multi-touch operations already exist in the art, and all of them may be used in conjunction with the multi-touch processing apparatus. In addition, apparatus developed in the future for processing multi-touch operations to generate control signals may also serve as the multi-touch processing apparatus. The scope of the present invention is not limited in this respect.
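The two-finger "zoom in" operation mentioned above is commonly detected by comparing the distance between the two touch points across samples. This is a minimal sketch of that idea, not a method defined by the invention:

```python
import math

def pinch_scale(t0, t1):
    """Given two two-finger samples t0 and t1, each a pair of (x, y)
    touch points, return the zoom factor: >1 means the fingers moved
    apart ('zoom in'), <1 means they moved together ('zoom out')."""
    def spread(pair):
        (ax, ay), (bx, by) = pair
        return math.hypot(bx - ax, by - ay)
    return spread(t1) / spread(t0)
```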
Continuing with Fig. 1, in addition to the touch processing apparatus 1062, the processing apparatus 106 further includes a non-touch processing apparatus 1064 configured to generate a control signal for the target device according to non-gesture input provided by the user to the remote control device 100 through the non-touch component 104. The non-gesture input referred to here comes from the user's mechanical operation of non-touch components such as buttons, knobs, and joysticks. The electrical signal thus produced can be received by the non-touch processing apparatus 1064, which then generates a corresponding control signal, for example according to the correspondence between user actions and target-device operations.
Optionally, according to some embodiments of the present invention, the processing apparatus 106 may further include a customization apparatus 1066 configured to customize the touch processing apparatus 1062 and/or the non-touch processing apparatus 1064 in response to a command from the user, thereby dynamically changing the functions of the touch component 102 and/or the non-touch component 104.

In other words, from the user's perspective, the target-device operations corresponding to the touch component 102 and/or the non-touch component 104 can be dynamically modified by providing a command (gesture input and/or non-gesture input) to the remote control device 100. From the implementation perspective, this customization or modification of function is accomplished by reconfiguring the touch processing apparatus 1062 and/or the non-touch processing apparatus 1064.

As an example, the customization apparatus 1066 may modify the correspondence between user actions and target-device operations, thereby customizing the functions of the touch component 102 and/or the non-touch component 104. Alternatively or additionally, the customization apparatus 1066 may enable/disable particular touch components 102 and/or non-touch components 104. Note that, according to embodiments of the present invention, the processing apparatus 106 may generate the control signal according to either of the gesture input from the touch component 102 and the non-gesture input from the non-touch component 104. Alternatively or additionally, the processing apparatus 106 may also generate the control signal based on both kinds of input at the same time. That is, the touch processing apparatus 1062 and/or the non-touch processing apparatus 1064 may work individually or cooperate with each other. In this way, the remote control device 100 allows the user to use the touch component 102 or the non-touch component 104 alone, and may also allow the user to use the touch component 102 and the non-touch component 104 together at the same time. This provides the user with richer and more natural means of interaction.
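The two customization behaviors just described — rebinding an action to a different operation, and disabling a component — can be sketched as edits to the action-to-operation table. The action and operation names are hypothetical:

```python
def customize(mapping, user_action, new_operation):
    """Return a copy of the action-to-operation table with user_action
    rebound to new_operation; passing None disables that action.
    All names here are illustrative, not defined by the invention."""
    updated = dict(mapping)
    if new_operation is None:
        updated.pop(user_action, None)   # disable the component/action
    else:
        updated[user_action] = new_operation
    return updated
```

Returning a fresh copy keeps the previous configuration available, which is one plausible way to let the user revert a customization.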
According to embodiments of the present invention, after the processing apparatus 106 generates a control signal according to the gesture input and/or non-gesture input from the touch component 102 and/or the non-touch component 104, the control signal is transmitted to one or more target devices associated with the remote control device. As shown in Fig. 1, the remote control device 100 includes a transmitting apparatus 108, coupled to the processing apparatus 106, configured to transmit the control signal generated by the processing apparatus 106 from the remote control device 100 to the target device. The transmitting apparatus 108 may send the control signal over a wireless connection, including but not limited to: a Bluetooth connection, an infrared (IR) connection, a wireless local area network (WLAN) connection, a radio frequency (RF) connection, a Wireless USB (WUSB) connection, and so on. Alternatively or additionally, the transmitting apparatus 108 may transmit the control signal to the target object over a wired connection, including but not limited to: a Universal Serial Bus (USB) connection, an RS-232 serial connection, and so on.
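Before handing a control signal to the transmitting apparatus, it must be serialized into bytes for whichever link (Bluetooth, IR, USB, ...) is in use. The patent does not define a wire format, so the one-byte-type, two-byte-length layout below is purely an illustrative assumption:

```python
import struct

# Hypothetical signal type codes; not defined by the invention.
TYPE_CODES = {"VOLUME_UP": 0x01, "VOLUME_DOWN": 0x02, "TEXT": 0x10}

def frame(signal, payload=b""):
    """Pack a control signal as: 1 type byte, 2-byte big-endian
    payload length, then the payload (e.g. recognized text)."""
    return struct.pack("!BH", TYPE_CODES[signal], len(payload)) + payload
```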
It should be understood that what is shown in Fig. 1 and described above covers only the components and functions relevant to the present invention. The remote control device may of course include any other suitable components and supporting circuitry. The block diagram above is therefore merely schematic and is not intended to limit embodiments of the present invention in any way.

Moreover, the various components shown in Fig. 1 may be implemented using software and/or firmware. Alternatively or additionally, these components may be implemented partly or entirely in hardware, including but not limited to: integrated circuit (IC) chips, application-specific integrated circuits (ASICs), systems on chip (SOCs), and so on. Other approaches now known or developed in the future are also feasible, and the scope of the present invention is not limited in this respect.

It should also be noted that a single apparatus shown in Fig. 1 may, in alternative embodiments, be implemented jointly by multiple apparatus. Conversely, multiple apparatus shown in Fig. 1 may, in alternative embodiments, be implemented by a single apparatus. As an example, in the embodiment shown in Fig. 1 the handwriting recognition apparatus is included in the touch processing apparatus 1062; in certain other embodiments, however, the two may be independent of each other. In general, based on the teachings given by the present invention, functions may be flexibly distributed among the various apparatus according to actual needs, and the scope of the present invention is not limited in this respect.
According to embodiments of the present invention, the touch components and non-touch components may be arranged on the housing of the remote control device in various positional relationships. Several exemplary feasible arrangements are described below with reference to Figs. 2-4.

As shown in Fig. 2, on the same face of the housing of a remote control device 200 are arranged one touch component 202 (in this example, a touchpad) and a group of non-touch components 204 (in this example, buttons). In particular, the non-touch components 204 may be function buttons implementing any suitable function.

Alternatively, Fig. 3 shows another feasible example. In the example of Fig. 3, the touch component 302 is arranged on the same face of the housing of the remote control device 300 as the non-touch components 304-1 and 304-2, and on a different face from the non-touch component 304-3. Note that, although in the example shown in Fig. 3 the non-touch component 304-3 is arranged on one side face of the housing, this is merely exemplary. One or more touch components and/or non-touch components may also be arranged on other side faces and/or the back of the housing. The scope of the present invention is not limited in this respect.

As yet another example, Fig. 4 shows the front and back of the housing of an exemplary remote control device 400. As shown in the upper part of Fig. 4, one touch component 402-1 and three groups of non-touch components 404-1, 404-2, and 404-3 are arranged on the front of the housing of the remote control device 400, and a group of non-touch components 404-4 is arranged on a side face of the housing. As shown in the lower part of Fig. 4, another touch component 402-2 and another group of non-touch components 404-5 are arranged on the back of the housing.

It should be noted that what has been described above in connection with Figs. 2-4 is only a few feasible examples of the remote control device. The types, arrangements, shapes, and numbers of the components shown in these figures, as well as the orientation of the remote control device, are all exemplary and do not constitute any limitation on the scope of the present invention. Reference is now made to Fig. 5, which shows a flowchart of a method 500 of using a remote control device according to an exemplary embodiment of the present invention to control a target device.
After the method 500 starts, in step S502 the user may provide gesture input to the remote control device using a touch component arranged on the housing of the remote control device. According to some embodiments of the present invention, the gesture input provided here may include at least one of: single-touch input and multi-touch input. In particular, the single-touch input may include handwriting input. Moreover, the touch component arranged on the remote control device may include at least one of: a touchpad, a pointing stick, a touch screen, and a proximity screen.

Next, the method 500 proceeds to step S504, in which the user may provide non-gesture input to the remote control device using at least one non-touch component arranged on the housing of the remote control device and independent of the touch component. It will be appreciated that, in response to the user's mechanical operation of the non-touch component, the remote control device can convert this mechanical operation into an electrical signal, thereby providing the corresponding non-gesture input. Moreover, the non-touch component arranged on the remote control device may include at least one of: a button, a joystick, and a knob.
In some optional embodiments of the present invention, the method 500 may further proceed to an optional step S506, in which the user provides a command to the remote control device to customize or modify the function of at least one of the touch component and the non-touch component. Note that the command provided by the user may be realized with gesture input and/or non-gesture input.

The method 500 ends after step S506.

Note that the individual steps recited in the method 500 may be performed in different orders and/or in parallel. For example, although step S504 is shown as being performed after step S502, this is only for purposes of illustration. There is no order dependency between steps S504 and S502; they may be performed in any order or even simultaneously. Moreover, steps S502 and S504 may be performed alternately and repeatedly many times. Likewise, step S506 has no order dependency on the steps shown as preceding it. Furthermore, the method 500 may also include additional steps and/or omit performing the steps shown.
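One way to see why steps S502 and S504 have no order dependency is to model the remote control as processing input events in whatever order they arrive, looking each one up in its own table. The event and table names below are illustrative assumptions:

```python
def process_events(events, gesture_map, key_map):
    """Turn an arbitrarily interleaved stream of gesture and key
    events into control signals.  events is a list of
    ("gesture", name) or ("key", name) tuples; unknown names are
    ignored.  Order of the output follows order of arrival."""
    signals = []
    for kind, name in events:
        table = gesture_map if kind == "gesture" else key_map
        if name in table:
            signals.append(table[name])
    return signals
```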
The remote control device according to embodiments of the present invention is suitable for a wide variety of devices in household, industrial, and scientific fields. According to actual needs, the user can flexibly use gesture input, non-gesture input, or a combination of the two to accomplish various complex interactive functions. For example, taking an interactive television set-top box as an example, the user can even use the remote control device to replace a game controller.

Note that embodiments of the present invention may be applied in various fields. For example, the remote control device and corresponding method described above may be used to control set-top-box-type devices, which may run various operating systems such as Android, Windows, iOS, MeeGo, and so on. The main functions provided by such set-top-box devices are Internet browsing (services), audio/video playback, and games (based on the corresponding operating system). By using a remote control device according to embodiments of the present invention, the operation of the above applications can be effectively simplified, yielding a better user experience. When operating such devices, the communication between the remote control device and the target device may preferably be based on Bluetooth, 2.4 GHz wireless technology, or the like, so as to provide a connection capable of high-rate transmission. The processing apparatus of the remote control device (for example, in the form of a chip) or the remote control device itself may be provided. Alternatively or additionally, drivers and application interfaces adapted to the operating system of the target device, or even user UI interfaces for secondary development (for example, in the form of software), may also be provided. In summary, according to embodiments of the present invention, a component for gesture input and a component for non-gesture input can be arranged on the remote control device at the same time. In this way, the user can flexibly use gestures, non-gestures, or a combination thereof to control the remote device according to actual needs. This not only reduces the user's cognitive burden and interaction difficulty when using the remote control device, but also makes complex application operations easy to accomplish. The target device can thus be controlled more efficiently and accurately.
It should be noted that embodiments of the present invention may be implemented in hardware, in software, or in a combination of software and hardware. The hardware part may be implemented with dedicated logic; the software part may be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those of ordinary skill in the art will understand that the above devices and methods may be implemented using computer-executable instructions and/or embodied in processor control code, such code being provided, for example, on a carrier medium such as a magnetic disk, CD, or DVD-ROM, in a programmable memory such as a read-only memory (firmware), or on a data carrier such as an optical or electronic signal carrier. The device of the present invention and its modules may be implemented by hardware circuitry such as very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices; they may also be implemented by software executed by various types of processors, or by a combination of the above hardware circuitry and software, such as firmware.

The communication networks mentioned in the specification may include various types of networks, including but not limited to local area networks ("LANs"), wide area networks ("WANs"), networks based on the IP protocol (for example, the Internet), and peer-to-peer networks (for example, ad hoc peer-to-peer networks).
It should be noted that, although several apparatus or sub-apparatus of the device are mentioned in the detailed description above, this division is not mandatory. In fact, according to embodiments of the present invention, the features and functions of two or more apparatus described above may be embodied in one apparatus. Conversely, the features and functions of one apparatus described above may be further divided and embodied in multiple apparatus.

Furthermore, although the operations of the method of the present invention are described in a particular order in the drawings, this does not require or imply that these operations must be performed in that particular order, or that all of the operations shown must be performed, to achieve the desired results. On the contrary, the steps depicted in the flowchart may change their order of execution. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step, and/or one step may be decomposed into multiple steps.

Although the present invention has been described with reference to several specific embodiments, it should be understood that the present invention is not limited to the specific embodiments disclosed. The present invention is intended to cover the various modifications and equivalent arrangements included within the spirit and scope of the appended claims. The scope of the appended claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.

Claims

1. A processing apparatus for use with a remote control device, the remote control device being used to remotely control at least one target device, the processing apparatus comprising:

a touch processing apparatus configured to generate a control signal for the target device according to gesture input provided by a user to the remote control device; and

a non-touch processing apparatus configured to generate a control signal for the target device according to non-gesture input provided by the user to the remote control device,

the control signal to be transmitted by the remote control device to the target device.
2. The processing apparatus according to claim 1, wherein the touch processing apparatus comprises at least one of:

a single-touch processing apparatus configured to generate the control signal according to single-touch gesture input provided by the user to the remote control device; and

a multi-touch processing apparatus configured to generate the control signal according to multi-touch gesture input provided by the user to the remote control device.
3. The processing apparatus according to claim 1, further comprising:

a handwriting recognition apparatus, coupled to the single-touch processing apparatus, configured to perform recognition on the single-touch gesture input, the result of the recognition to be used in generating the control signal.
4. The processing apparatus according to claim 1, further comprising:

a customization apparatus, coupled to at least one of the touch processing apparatus and the non-touch processing apparatus, configured to customize at least one of the touch processing apparatus and the non-touch processing apparatus in response to a command from the user.
5. A remote control device for remotely controlling at least one target device, comprising:

at least one touch component arranged on the housing of the remote control device, for receiving gesture input from a user;

at least one non-touch component arranged on the housing and independent of the touch component, for receiving non-gesture input from the user;

the processing apparatus according to claim 1, coupled to the touch component and the non-touch component; and

a transmitting apparatus, coupled to the processing apparatus, configured to transmit the control signal generated by the processing apparatus to the target device.
6. The remote control device according to claim 5, wherein the touch component comprises at least one of: a touchpad, a pointing stick, a touch screen, and a proximity screen.

7. The remote control device according to claim 5, wherein the non-touch component comprises at least one of: a button, a joystick, and a knob.

8. The remote control device according to claim 5, wherein the touch component and the non-touch component are arranged on the same face of the housing.

9. The remote control device according to claim 5, wherein the touch component and the non-touch component are arranged on different faces of the housing.
10. A method of using a remote control device to remotely control at least one target device, comprising:

providing gesture input to the remote control device using at least one touch component arranged on the housing of the remote control device; and

providing non-gesture input to the remote control device using at least one non-touch component arranged on the housing of the remote control device and independent of the touch component.
11. The method according to claim 10, wherein the gesture input comprises at least one of: single-touch input and multi-touch input.

12. The method according to claim 11, wherein the single-touch input comprises handwriting input.
13. The method according to claim 10, further comprising:

providing a command to the remote control device to customize the function of at least one of the touch component and the non-touch component.
14. The method according to claim 10, wherein the touch component comprises at least one of: a touchpad, a pointing stick, a touch screen, and a proximity screen.

15. The method according to claim 10, wherein the non-touch component comprises at least one of: a button, a joystick, and a knob.

16. A computer program product, comprising machine-executable instructions tangibly stored on a computer-readable medium which, when executed by a processor, implement the method according to any one of claims 10-15.
PCT/CN2011/081063 2011-10-20 2011-10-20 用于遥控的设备和方法 WO2013056452A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2011/081063 WO2013056452A1 (zh) 2011-10-20 2011-10-20 用于遥控的设备和方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2011/081063 WO2013056452A1 (zh) 2011-10-20 2011-10-20 用于遥控的设备和方法

Publications (1)

Publication Number Publication Date
WO2013056452A1 true WO2013056452A1 (zh) 2013-04-25

Family

ID=48140333

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2011/081063 WO2013056452A1 (zh) 2011-10-20 2011-10-20 用于遥控的设备和方法

Country Status (1)

Country Link
WO (1) WO2013056452A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201035752Y (zh) * 2007-04-24 2008-03-12 中兴通讯股份有限公司 带有输入功能的电视机顶盒—遥控器装置
CN101609599A (zh) * 2008-06-17 2009-12-23 上海晨兴电子科技有限公司 触摸式遥控器及其遥控方法
CN201443822U (zh) * 2009-04-30 2010-04-28 海信(山东)空调有限公司 具有手写功能的空调器
US20100127914A1 (en) * 2007-08-01 2010-05-27 Fm Marketing Gmbh Remote control

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201035752Y (zh) * 2007-04-24 2008-03-12 中兴通讯股份有限公司 带有输入功能的电视机顶盒—遥控器装置
US20100127914A1 (en) * 2007-08-01 2010-05-27 Fm Marketing Gmbh Remote control
CN101609599A (zh) * 2008-06-17 2009-12-23 上海晨兴电子科技有限公司 触摸式遥控器及其遥控方法
CN201443822U (zh) * 2009-04-30 2010-04-28 海信(山东)空调有限公司 具有手写功能的空调器

Similar Documents

Publication Publication Date Title
CN102884491B (zh) 用于基于触摸屏的电子设备的可动作对象控制器和数据录入附件
JP4933027B2 (ja) 継ぎ目なく組み合わされた自由に動くカーソルと切り替わるハイライトナビゲーション
JP6275839B2 (ja) リモコン装置、情報処理方法およびシステム
US9547421B2 (en) Apparatus and method for managing operations of accessories
TWI478010B (zh) 使用合作輸入來源及有效率的動態座標重對映之雙指標管理方法
US10960311B2 (en) Method and apparatus for adapting applications over multiple devices
US20140098038A1 (en) Multi-function configurable haptic device
KR101930225B1 (ko) 터치스크린 동작모드의 제어방법 및 제어장치
JP2008140182A (ja) 入力装置、送受信システム、入力処理方法、および制御プログラム
US20110034248A1 (en) Apparatus for associating physical characteristics with commands
US20140329593A1 (en) Text entry using game controller
US20160253087A1 (en) Apparatus and method for controlling content by using line interaction
US20120005615A1 (en) Method for executing an input by means of a virtual keyboard displayed on a screen
KR101451941B1 (ko) 아이콘 관련 화면 제어 방법 및 이를 구현하는 셋톱박스
CN103064506A (zh) 用于遥控的设备和方法
KR20170009302A (ko) 디스플레이장치 및 그 제어방법
CN202281975U (zh) 用于遥控的设备
CN103150024B (zh) 一种计算机操作方法
WO2013056452A1 (zh) 用于遥控的设备和方法
JP3211484U (ja) 触感型コントローラー
EP4302166A1 (en) A computer a software module arrangement, a circuitry arrangement, a user equipment and a method for an improved user interface controlling multiple applications simultaneously
CN202404523U (zh) 一种动作控制装置
KR100654086B1 (ko) 마우스포인팅 방식의 리모콘
KR100812998B1 (ko) 블루투스 통신을 이용한 마우스 제어 시스템
WO2018113031A1 (zh) 一种手机

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11874356

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 10/09/2014)

122 Ep: pct application non-entry in european phase

Ref document number: 11874356

Country of ref document: EP

Kind code of ref document: A1