WO2016134591A1 - Method and apparatus for controlling a smart device - Google Patents

Method and apparatus for controlling a smart device (智能设备的操控方法及装置)

Info

Publication number
WO2016134591A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture action
mapping relationship
controlled device
control
gesture
Prior art date
Application number
PCT/CN2015/088583
Other languages
English (en)
French (fr)
Inventor
梁欣
樊理
Original Assignee
小米科技有限责任公司
Priority date
Filing date
Publication date
Application filed by 小米科技有限责任公司
Priority to JP2017501458A (JP6229096B2)
Priority to MX2016000471A (MX362892B)
Priority to RU2016101234A (RU2633367C2)
Priority to KR1020157030749A (KR101736318B1)
Publication of WO2016134591A1

Classifications

    • G06F 3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/0481 — Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment
    • G06F 3/04847 — Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0425 — Digitisers using opto-electronic means, using a single imaging device (e.g. a video camera) for tracking the absolute position of one or more objects with respect to an imaged reference surface
    • G06F 9/06 — Arrangements for program control using stored programs
    • G05B 19/00 — Programme-control systems
    • G08C 17/02 — Arrangements for transmitting signals via a wireless electrical link using a radio link
    • G08C 2201/32 — Remote control based on movements or attitude of the remote control device
    • G08C 2201/70 — Device selection
    • G08C 2201/92 — Universal remote control
    • G08C 2201/93 — Remote control using other portable devices, e.g. mobile phone, PDA, laptop
    • H04L 12/282 — Controlling appliance services of a home automation network based on user interaction within the home
    • H04L 12/2809 — Exchanging configuration information on appliance services in a home automation network, indicating that an appliance service is present

Definitions

  • The present disclosure relates to the field of terminal technologies, and in particular, to a method and an apparatus for controlling a smart device.
  • In the related art, the manipulation of the smart device cannot always be realized, resulting in an inflexible control mode.
  • When the smart device is operated by a remote controller, each smart device corresponds to its own controller, so manipulating a different smart device requires switching to that device's remote controller, which makes the manipulation cumbersome. For example, if the user wants to control the air conditioner after controlling the TV through the TV remote control, the air conditioner's remote controller must first be found and then operated.
  • Accordingly, the present disclosure provides a method and an apparatus for controlling a smart device.
  • According to a first aspect, a method for controlling a smart device is provided, the method comprising:
  • when a first gesture action is detected, preparing the control relationship switch according to the first gesture action and a first mapping relationship, where the first mapping relationship is a mapping between gesture actions and control functions;
  • when a second gesture action is detected, determining the target controlled device according to the second gesture action and a second mapping relationship, where the second mapping relationship is a mapping between gesture actions and controlled-device locations; establishing a control relationship with the target controlled device; acquiring a third gesture action; and controlling the target controlled device, according to the third gesture action and the first mapping relationship, to perform the operation corresponding to the third gesture action.
  • Optionally, the method further includes: acquiring a first positioning straight line and a second positioning straight line, where the first and second positioning straight lines are the extension lines of the user's finger when the user points to the controlled device from a first position and a second position, respectively; determining the location of the controlled device from the intersection of the two positioning lines; and establishing the second mapping relationship between gesture actions and controlled-device locations.
  • Optionally, the first gesture action and the second gesture action are performed by the user with two hands: the control relationship switch is prepared according to a first gesture action made with one hand and the first mapping relationship, and when a second gesture action made with the other hand is detected while the first gesture action is still detected, the target controlled device is determined according to the second gesture action and the second mapping relationship.
  • Optionally, the first gesture action and the second gesture action are performed by the user with one hand: the control relationship switch is prepared according to a first gesture action made with that hand and the first mapping relationship, and when, after the first gesture action has been detected for a specified duration, a second gesture action made with the same hand or the other hand is detected, the target controlled device is determined according to the second gesture action and the second mapping relationship.
  • Optionally, the method further includes: disconnecting the control relationship with the target controlled device, and switching the control relationship back to the controlled device controlled before the first gesture action was detected.
  • According to a second aspect, a control apparatus for a smart device is provided, the apparatus comprising:
  • a pre-switching module configured to prepare the control relationship switch, when a first gesture action is detected, according to the first gesture action and a first mapping relationship, where the first mapping relationship is a mapping between gesture actions and control functions;
  • a first determining module configured to determine, when a second gesture action is detected, the target controlled device according to the second gesture action and a second mapping relationship, where the second mapping relationship is a mapping between gesture actions and controlled-device locations;
  • a first establishing module configured to establish a control relationship with the target controlled device;
  • a first acquiring module configured to acquire a third gesture action; and
  • a control module configured to control, according to the third gesture action and the first mapping relationship, the target controlled device to perform the operation corresponding to the third gesture action.
  • Optionally, the apparatus further includes: a second acquiring module configured to acquire a first positioning straight line and a second positioning straight line, where the first and second positioning straight lines are the extension lines of the user's finger when the user points to the controlled device from the first position and the second position, respectively; a second determining module configured to determine the location of the controlled device from the intersection of the first and second positioning lines; and a second establishing module configured to establish the second mapping relationship between gesture actions and controlled-device locations.
  • Optionally, the apparatus further includes a third establishing module configured to establish the first mapping relationship between gesture actions and control functions, wherein the same gesture action is used to control the same control function of different controlled devices.
  • Optionally, the pre-switching module is configured to prepare the control relationship switch according to a first gesture action made with one hand of the user and the first mapping relationship; and the first determining module is configured to determine the target controlled device according to the second gesture action and the second mapping relationship when a second gesture action made with the user's other hand is detected while the first gesture action is detected.
  • Optionally, the first determining module is configured to determine the target controlled device according to the second gesture action and the second mapping relationship when, after the first gesture action made with one hand of the user has been detected for a specified duration, a second gesture action made with the same hand or the other hand is detected.
  • Optionally, the apparatus further includes a switching module configured to disconnect the control relationship with the target controlled device and switch the control relationship back to the controlled device controlled before the first gesture action was detected.
  • According to a further aspect, a terminal is provided, the terminal comprising:
  • a processor; and
  • a memory for storing processor-executable instructions;
  • wherein the processor is configured to: when a first gesture action is detected, prepare the control relationship switch according to the first gesture action and a first mapping relationship, where the first mapping relationship is a mapping between gesture actions and control functions; when a second gesture action is detected, determine the target controlled device according to the second gesture action and a second mapping relationship, where the second mapping relationship is a mapping between gesture actions and controlled-device locations; establish a control relationship with the target controlled device; acquire a third gesture action; and control the target controlled device accordingly.
  • FIG. 1 is a schematic diagram of an implementation environment involved in a method for controlling a smart device according to an exemplary embodiment.
  • FIG. 2 is a flowchart of a method for controlling a smart device according to an exemplary embodiment.
  • FIG. 3 is a flowchart of a method for controlling a smart device according to an exemplary embodiment.
  • FIG. 4 is a schematic diagram of a process of acquiring a positioning straight line, according to an exemplary embodiment.
  • FIG. 5 is a block diagram of a manipulation device of a smart device, according to an exemplary embodiment.
  • FIG. 6 is a block diagram of a manipulation device of a smart device, according to an exemplary embodiment.
  • FIG. 7 is a block diagram of a manipulation device of a smart device, according to an exemplary embodiment.
  • FIG. 8 is a block diagram of a manipulation device of a smart device, according to an exemplary embodiment.
  • FIG. 9 is a block diagram of a terminal, according to an exemplary embodiment.
  • FIG. 1 is a schematic diagram of an implementation environment involved in a method for controlling a smart device according to an exemplary embodiment.
  • the implementation environment includes a control device 101 and respective controlled devices 102.
  • The control device 101 and each of the controlled devices 102 are in the same local area network, and the control device 101 is connected to each controlled device 102 through a communication network, which may be the Internet or a Bluetooth network. The network may be a wired network or a wireless network.
  • the control device 101 can be configured to recognize a gesture action issued by the user, and after determining the control function corresponding to the gesture action, control the controlled device 102 to perform an operation corresponding to the gesture action.
  • Each of the controlled devices 102 communicates with the control device 101 such that the control device 101 can control the respective controlled devices 102 to perform operations corresponding to the gesture actions of the user.
  • The control device 101 may be a stand-alone device, a function module installed in a controlled device 102, or a function module on a router in the local area network; this is not specifically limited in the embodiments of the present disclosure. In addition, the control device 101 may include a camera for recognizing the gesture actions issued by the user, and may further include a data processing apparatus for converting an acquired gesture action into the corresponding control function. The composition of the control device 101 is likewise not specifically limited by the embodiments of the present disclosure.
  • Each of the controlled devices 102 may be various types of smart devices.
  • The smart devices may be smart home devices, such as smart TVs, smart refrigerators, smart air conditioners, smart phones, smart sound systems, and smart lights, or smart terminals, such as mobile phones, tablet computers, PCs (Personal Computers), desktop computers, and portable computers; the embodiments of the present disclosure do not limit the type of the controlled device 102.
  • In view of the above implementation environment, the embodiments of the present disclosure provide a method for controlling a smart device, described in detail below.
  • FIG. 2 is a flowchart of a method for controlling a smart device according to an exemplary embodiment. As shown in FIG. 2, the method is used in a terminal and includes the following steps.
  • In step S201, when the first gesture action is detected, preparation for the control relationship switch is made according to the first gesture action and the first mapping relationship, where the first mapping relationship is a mapping between gesture actions and control functions.
  • In step S202, when the second gesture action is detected, the target controlled device is determined according to the second gesture action and the second mapping relationship, where the second mapping relationship is a mapping between gesture actions and controlled-device locations.
  • In step S203, a control relationship with the target controlled device is established.
  • In step S204, a third gesture action is acquired.
  • In step S205, according to the third gesture action and the first mapping relationship, the target controlled device is controlled to perform the operation corresponding to the third gesture action.
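  • A minimal sketch of this five-step flow follows. All names, gesture labels, and device labels are illustrative assumptions, not identifiers from the patent; the network handshake of step S203 is omitted.

```python
# Hedged sketch of the S201-S205 flow; names are illustrative assumptions.

FIRST_MAPPING = {            # first mapping relationship: gesture -> control function
    "draw_circle": "prepare_switch",
    "swipe_right": "volume_up",
    "ok_sign": "power_off",
}

SECOND_MAPPING = {           # second mapping relationship: pointing gesture -> device at that location
    "point_northeast": "smart_tv",
    "point_west": "smart_air_conditioner",
}

def handle_gestures(gestures):
    """Emulate steps S201-S205 over a sequence of detected gesture actions."""
    it = iter(gestures)
    first = next(it)
    if FIRST_MAPPING.get(first) != "prepare_switch":          # S201: prepare the switch
        return None
    target = SECOND_MAPPING[next(it)]                         # S202: determine the target
    # S203: establish a control relationship with `target` (handshake omitted)
    third = next(it)                                          # S204: acquire the third gesture
    return target, FIRST_MAPPING[third]                       # S205: operation to perform

print(handle_gestures(["draw_circle", "point_northeast", "swipe_right"]))
# -> ('smart_tv', 'volume_up')
```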
  • In the method provided by this embodiment of the present disclosure, the control relationship switch is prepared using the first gesture action and the first mapping relationship; the target controlled device is determined according to the second gesture action and the second mapping relationship; a control relationship with the target controlled device is established; and, by acquiring a third gesture action, the target controlled device is controlled to perform the corresponding operation. Since the control relationship can be switched to the target controlled device through a gesture action, and the target controlled device is then also controlled by gesture, the manipulation is simpler and its flexibility is improved.
  • Optionally, before determining the target controlled device according to the second gesture action and the second mapping relationship, the method further includes: acquiring the first positioning straight line and the second positioning straight line, which are the extension lines of the user's finger when the user points to the controlled device from the first position and the second position, respectively; determining the location of the controlled device from their intersection; and establishing the second mapping relationship.
  • Optionally, before preparing the control relationship switch according to the first gesture action and the first mapping relationship, the method further includes establishing the first mapping relationship between gesture actions and control functions.
  • Optionally, the first gesture action and the second gesture action are performed by the user with two hands: the control relationship switch is prepared according to a first gesture action made with one hand and the first mapping relationship, and when a second gesture action made with the other hand is detected while the first gesture action is still detected, the target controlled device is determined according to the second gesture action and the second mapping relationship.
  • Optionally, the first gesture action and the second gesture action are performed by the user with one hand: when, after the first gesture action made with that hand has been detected for a specified duration, a second gesture action made with the same hand or the other hand is detected, the target controlled device is determined according to the second gesture action and the second mapping relationship.
  • Optionally, the method further includes: disconnecting the control relationship with the target controlled device, and switching the control relationship back to the controlled device controlled before the first gesture action was detected.
  • FIG. 3 is a flowchart of a method for controlling a smart device according to an exemplary embodiment. As shown in FIG. 3, the method is used in a terminal and includes the following steps.
  • In step S301, a first mapping relationship between gesture actions and control functions is established, wherein the same gesture action is used to control the same control function of different controlled devices.
  • For the control device to know, after detecting a gesture action issued by the user, which operation the user wants the controlled device to perform, the first mapping relationship between gesture actions and control functions must be established before the controlled device is manipulated by gesture. The control device then determines what operation the controlled device should perform by querying this first mapping relationship. Therefore, before the control method of the smart device is executed, the first mapping relationship is established and stored.
  • A gesture action may be a dynamic action, a static action, or a combination of the two. When the gesture action is dynamic, such as the user drawing a circle, the control device can recognize it from the gesture's motion track; when the gesture action is static, such as the user posing an "ok"-shaped gesture, the control device can recognize it from the gesture's shape. A gesture action can also combine both, for example the user first posing an "ok"-shaped gesture and then making a circling motion; in that case, the control device recognizes the gesture action from the gesture shape and the motion track together.
  • Depending on the type of gesture action, the first mapping relationship established between the gesture action and the control function also differs. For a dynamic gesture, the first mapping relationship may map the feature track constituting the gesture action to the control function; for a static gesture, it may map the set of feature points constituting the gesture shape to the control function.
  • For example, if the gesture action is a circling motion and its corresponding control function is the start operation, the feature track of the circling motion is a circular track, and the established first mapping relationship maps the circular track to the start operation. If the gesture action is an "ok"-shaped gesture and its corresponding control function is the close operation, the set of contour points constituting the "ok" shape can be acquired and used as the feature point set of that shape, and the established first mapping relationship maps this contour point set to the close operation.
  • The first mapping relationship between gesture actions and control functions can be established in either of the following ways.
  • The first way: the first mapping relationship between gesture actions and control functions is preset when the control device is shipped from the factory.
  • In this way, the controlled device is manipulated according to the factory-preset first mapping relationship. For example, if the start operation was set at the factory to correspond to a circling motion, the user starts a controlled device through the control device by drawing a circle.
  • Since the same local area network usually includes multiple controlled devices, and the controlled devices often share the same control functions, such as the start, confirm, and close functions, the first mapping relationship in the embodiments of the present disclosure can assign the same gesture action to the same control function of different controlled devices. For example, the volume-increase operations of different controlled devices, such as a smart TV, may correspond to the same gesture action; the "confirm" operations of different controlled devices may correspond to the same gesture action; and for the "up", "down", "left", and "right" direction operations of different controlled devices, the same per-direction gesture actions can be defined.
  • Because the same control function of different controlled devices is triggered by the same gesture action, the user can use a single gesture for that function on any device. Compared with the related art, where the same control function of different devices is controlled by different gesture actions, the control is not only simpler but also saves time, as sketched below.
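  • A hedged illustration of this device-independent mapping (the gesture labels and device names are assumptions, not from the patent): the first mapping is keyed by gesture alone, so the device is resolved separately and the same gesture drives the same function everywhere.

```python
# Illustrative only: because the first mapping is keyed by gesture alone,
# one gesture drives the same control function on any controlled device.
FIRST_MAPPING = {"swipe_right": "volume_up", "swipe_left": "volume_down", "ok_sign": "confirm"}

def control(device_name, gesture):
    function = FIRST_MAPPING[gesture]      # resolved without consulting the device type
    return f"{device_name} performs {function}"

print(control("smart_tv", "swipe_right"))       # smart_tv performs volume_up
print(control("smart_speaker", "swipe_right"))  # smart_speaker performs volume_up
```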
  • The second way: after the control device is installed, the first mapping relationship between gesture actions and control functions is entered according to the user's operations.
  • In this way, the first mapping relationship is customized within the local area network according to the user's operations. For example, after the control device newly joins home A, it may first scan all the control functions included in all the controlled devices in home A and display the scanned control functions on screen. The control device may then prompt the user, via the screen, to enter a gesture action for each control function: after detecting that the user has selected a control function, it records the gesture action the user sets for that function and stores the mapping between them. Performing this entry for every control function establishes the first mapping relationship between all control functions and their corresponding gesture actions.
  • Prompting the user to enter the gesture action for each control function can also be done by voice: after scanning all control functions, the control device broadcasts one control function at a time, and once the user has entered the gesture action for it, broadcasts the next control function and continues entering the corresponding gesture actions.
  • A gesture action entered in this way may apply to a single controlled device, or to all the controlled devices that share the same control function. To this end, the control functions may first be classified to obtain the control functions common to different controlled devices, so that the control device can set the same gesture action for the same control function of all controlled devices in the local area network.
  • In this second way, the user can customize the mapping according to personal needs, preferences, and operating habits, which makes establishing the first mapping relationship more flexible. For example, a left-handed user can perform the customization with the left hand.
  • The user may also modify the first mapping relationship as needed. For example, if a gesture action chosen when setting the first mapping relationship turns out to be strenuous or poorly recognized by the control device, it can be replaced; the gesture corresponding to the volume-increase operation might be changed to sliding a line of at least a preset length from left to right. A sketch of this enrollment flow follows.
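  • A hedged sketch of the enrollment flow just described. The scanning, prompting, and gesture-capture primitives are assumed stubs standing in for the control device's camera and UI, not an API specified by the patent.

```python
# Hedged sketch of the second (user-enrollment) way of building the first mapping.
def enroll_first_mapping(scan_control_functions, prompt, capture_gesture):
    """Build the first mapping relationship from user-entered gestures."""
    mapping = {}
    for function in scan_control_functions():        # every function found in the LAN's devices
        prompt(f"Perform the gesture for: {function}")   # on-screen or voice prompt
        gesture = capture_gesture()                      # e.g. camera-based gesture recognition
        mapping[gesture] = function                      # runtime lookups go gesture -> function
    return mapping

# Example with trivial stand-ins:
captured = iter(["gesture_volume_up", "gesture_confirm"])
demo = enroll_first_mapping(
    scan_control_functions=lambda: ["volume_up", "confirm"],
    prompt=print,
    capture_gesture=lambda: next(captured),
)
print(demo)   # {'gesture_volume_up': 'volume_up', 'gesture_confirm': 'confirm'}
```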
  • Establishing the first mapping relationship is a prerequisite for manipulating the controlled device by gesture. That is, this step is performed before the control method below and is not executed every time the method runs; when the control method of the smart device is executed, the first mapping relationship has already been established.
  • In step S302, when the first gesture action is detected, preparation for the control relationship switch is made according to the first gesture action and the first mapping relationship.
  • The control relationship is the basis for transmitting control data and control commands between the control device and a controlled device. Before the control device manipulates a controlled device, it must first establish a control relationship with it; the transmission of control data and control commands is realized on that basis.
  • If the control device is currently manipulating one controlled device and detects that the user wants to control another, it needs to prepare for the control relationship switch. Here, preparing for the switch may mean disconnecting the control relationship with the currently controlled device; if the control device is not currently controlling any device, preparing for the switch means setting the current state to the to-be-switched state.
  • The control functions in the first mapping relationship include a function for preparing the control relationship switch, and in the embodiments of the present disclosure this prepare-to-switch function corresponds to the first gesture action. That is, after detecting the first gesture action, the control device determines, by querying the first mapping relationship, that the first gesture action corresponds to the prepare-to-switch function, and then prepares the switch according to the first gesture action and the first mapping relationship.
  • The first gesture action may be a circling motion, or a gesture posed in a certain shape, such as a heart shape. Correspondingly, recognition may be based on the trajectory of the first gesture action or on its shape; this is not specifically limited.
  • Since the same gesture action controls the same control function of different controlled devices, switching the control relationship away from any controlled device can use the same first gesture action. For example, if the first gesture action is a circling motion, then whether a smart TV or a smart air conditioner is currently being manipulated, the user's circling motion prepares the switch of the current control relationship, as sketched below.
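  • A small sketch of the prepare-to-switch state from step S302. The class and attribute names are assumptions; remembering the previous device anticipates the switch-back of step S306 described later.

```python
# Sketch of step S302's "prepare to switch" state; names are assumptions.
class ControlDevice:
    def __init__(self):
        self.current_device = None    # device currently under control, if any
        self.previous_device = None   # remembered so step S306 can switch back
        self.pending_switch = False

    def on_gesture(self, gesture, first_mapping):
        if first_mapping.get(gesture) == "prepare_switch":
            # Disconnect from the currently controlled device (remembering it),
            # or simply enter the to-be-switched state if nothing is controlled.
            self.previous_device = self.current_device
            self.current_device = None
            self.pending_switch = True

ctl = ControlDevice()
ctl.on_gesture("draw_circle", {"draw_circle": "prepare_switch"})
print(ctl.pending_switch)   # True
```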
  • In step S303, when the second gesture action is detected, the target controlled device is determined according to the second gesture action and the second mapping relationship, where the second mapping relationship is a mapping between gesture actions and controlled-device locations.
  • The target controlled device is the controlled device the user currently wants to control. The control device determines the target controlled device when triggered by the user's second gesture action: after it has prepared the control relationship switch, upon detecting the second gesture action it determines the target controlled device from the second gesture action and the second mapping relationship between gesture actions and controlled-device locations.
  • For the control device to determine the target in this way when it detects the second gesture action, the second mapping relationship between gesture actions and controlled-device positions must also have been established beforehand.
  • Establishing the second mapping relationship includes, but is not limited to, the following steps S3031 to S3033:
  • In step S3031, a first positioning straight line and a second positioning straight line are acquired, where the first and second positioning straight lines are the extension lines of the user's finger when the user points to the controlled device from the first position and the second position, respectively.
  • The control device can determine the user's current position relative to the controlled device from the relative distance between them. To locate a controlled device, the control device may first prompt the user, on screen or by voice, to point to it from two different positions. When the control device detects that the user, following the prompt, points to the controlled device from the first position, it acquires the first position and the direction of the user's finger, obtains the extension line of the finger as pointed from the first position, and takes that extension line as the first positioning straight line; when it detects the user pointing to the controlled device from the second position, it likewise acquires the second position and the finger's direction and takes the resulting extension line as the second positioning straight line.
  • FIG. 4 shows a schematic diagram of acquiring the positioning straight lines. Points A and B are the first and second positions of the user; from them, the first positioning straight line L and the second positioning straight line L' are determined, and their intersection C is the position of the controlled device.
  • In step S3032, the position of the controlled device is determined from the intersection of the first positioning line and the second positioning line. The spatial coordinates of the intersection may be determined first and taken as the position of the controlled device.
  • The position of every controlled device in the local area network can be determined through the above steps S3031 and S3032. When a controlled device is newly added to the network, its location may be determined in the same manner; recognizing the newly added device itself can be handled by the controlled device's automatic identification protocol, which is not limited by the embodiments of the present disclosure. A worked sketch of the intersection computation follows.
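  • A minimal, self-contained sketch of the positioning computation, assuming NumPy. Two measured rays rarely intersect exactly in 3-D, so this uses a least-squares closest-approach midpoint, which coincides with the intersection C of FIG. 4 whenever the lines do meet; the patent itself only specifies taking the intersection.

```python
import numpy as np

def locate_device(a, d1, b, d2):
    """Estimate the controlled device's position from two pointing rays.

    a, b   -- the user's first and second positions
    d1, d2 -- direction vectors of the finger's extension line at each position
    Returns the midpoint of the segment of closest approach between the two
    positioning lines, i.e. their intersection when they meet (cf. FIG. 4).
    """
    a, d1, b, d2 = (np.asarray(v, dtype=float) for v in (a, d1, b, d2))
    m = np.stack([d1, -d2], axis=1)                    # solve t*d1 - s*d2 ≈ b - a
    (t, s), *_ = np.linalg.lstsq(m, b - a, rcond=None)
    return (a + t * d1 + b + s * d2) / 2

# Rays from A=(0,0,1) and B=(6,0,1) that both point at (3,3,1):
print(locate_device((0, 0, 1), (1, 1, 0), (6, 0, 1), (-1, 1, 0)))   # [3. 3. 1.]
```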
  • In step S3033, the second mapping relationship between gesture actions and controlled-device positions is established.
  • With the position of each controlled device in the local area network determined through steps S3031 and S3032, and in order to quickly determine which controlled device a user's gesture targets, a second mapping relationship between gesture actions and controlled-device positions is established; on that basis, once the second gesture action is obtained, the target controlled device can be determined quickly.
  • The gesture actions used when establishing the second mapping relationship include, but are not limited to, pointing gestures toward a certain direction; other gesture actions may also be used, as long as the mapping between the second gesture action and the controlled device's position is ensured. One way to resolve a pointing gesture against the stored positions is sketched below.
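  • As a hedged illustration (the helper name, device names, and nearest-ray heuristic are assumptions, not prescribed by the patent), a pointing gesture can be resolved by choosing the device whose stored position, from steps S3031-S3032, lies closest to the finger's ray:

```python
import numpy as np

def target_from_pointing(user_pos, direction, device_positions):
    """Pick the device whose stored location lies closest to the pointing ray.

    device_positions: {name: (x, y, z)} -- the second mapping relationship's
    location side, built from the positions determined in S3031-S3032.
    """
    p = np.asarray(user_pos, dtype=float)
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)

    def dist(loc):
        v = np.asarray(loc, dtype=float) - p
        t = max(v @ d, 0.0)               # project onto the ray, staying in front of the user
        return np.linalg.norm(v - t * d)  # perpendicular distance to the ray
    return min(device_positions, key=lambda name: dist(device_positions[name]))

devices = {"smart_tv": (3, 3, 1), "smart_lamp": (-2, 4, 2)}
print(target_from_pointing((0, 0, 1), (1, 1, 0), devices))   # -> smart_tv
```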
  • Steps S302 and S303 together realize preparing the control relationship switch according to the first gesture action and the first mapping relationship, and determining the target controlled device according to the second gesture action and the second mapping relationship.
  • For the control device to prepare the switch and determine the target, the user must complete both the first gesture action and the second gesture action, either simultaneously with two hands or sequentially with one hand. Correspondingly, the control device prepares the control relationship switch and determines the target controlled device in one of, but not limited to, the following two modes:
  • The first mode: the first gesture action and the second gesture action are performed by the user with two hands. In this mode, the control device prepares the control relationship switch according to the first gesture action made with one of the user's hands and the first mapping relationship; when, while still detecting the first gesture action, it detects the second gesture action made with the user's other hand, it determines the target controlled device according to the second gesture action and the second mapping relationship. That is, the preparation of the switch and the determination of the target are performed simultaneously.
  • For example, if the first gesture action is a circling motion and the second gesture action is a finger-pointing motion, then when the control device detects that the user's two hands are performing the circling motion and the pointing motion at the same time, it performs the switch preparation and the target determination simultaneously.
  • The second mode: the first gesture action and the second gesture action are completed by the user with one hand. In this mode, the control device prepares the control relationship switch according to the first gesture action made with one hand and the first mapping relationship; when, after detecting that first gesture action for a specified duration, it detects a second gesture action made with the same hand or the other hand, it determines the target controlled device according to the second gesture action and the second mapping relationship. That is, the switch preparation and the target determination are performed sequentially.
  • The specified duration is not specifically limited by the embodiments of the present disclosure; it may be, for example, 1 s (second) or 3 s. For instance, the switch preparation is performed once the user draws a circle; after the control device has detected the user's circling hand for the specified duration and then detects a finger-pointing action from that hand, it determines the target controlled device.
  • Since the first mode detects the two gesture actions simultaneously rather than in sequence, it is relatively time-saving; the second mode, however, lets the user complete the operation with one hand and is therefore relatively simple to perform. The one-hand timing check is sketched below.
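  • A hedged sketch of the one-hand (sequential) mode. The 1 s figure matches the example durations in the text; the event format and gesture labels are assumptions. In the two-hand mode, the check would instead require the two gestures to be detected simultaneously on different hands.

```python
# Hedged sketch of the one-hand mode's timing check; names are assumptions.
SPECIFIED_DURATION = 1.0   # seconds the first gesture action must persist

def one_hand_target_gesture(events):
    """events: iterable of (timestamp_s, gesture) detections.

    Returns the second gesture action once the first gesture action has been
    detected for at least SPECIFIED_DURATION, mirroring the second mode above.
    """
    first_seen = None
    for ts, gesture in events:
        if gesture == "draw_circle":                  # the first gesture action
            if first_seen is None:
                first_seen = ts
        elif first_seen is not None and ts - first_seen >= SPECIFIED_DURATION:
            return gesture                            # the second gesture action
    return None

print(one_hand_target_gesture([(0.0, "draw_circle"), (1.2, "point_northeast")]))
# -> point_northeast
```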
  • In step S304, a control relationship with the target controlled device is established.
  • To manipulate the target controlled device, a control relationship with it must be established. Once the control relationship with the target controlled device is established, detecting a gesture action issued by the user triggers sending the corresponding control command to the target controlled device, thereby controlling it to perform the corresponding operation.
  • When establishing the control relationship, if the control device has just started, that is, it is not yet manipulating any controlled device, it can directly establish the control relationship with the target controlled device; if it is currently manipulating a controlled device, it switches the control relationship from the currently controlled device to the target controlled device.
  • In step S305, a third gesture action is acquired, and according to the third gesture action and the first mapping relationship, the target controlled device is controlled to perform the operation corresponding to the third gesture action.
  • After acquiring the third gesture action, the control device may first query the first mapping relationship with it to obtain the corresponding control function, then determine from that control function what operation the target controlled device should perform, and so control the target controlled device to perform the operation corresponding to the third gesture action. As with the other gesture actions, recognition may rely on the trajectory of the third gesture action or on its shape; the embodiments of the present disclosure do not specifically limit this.
  • Controlling the target controlled device to perform the operation can be realized by sending it a control instruction that includes the operation content; after receiving the control instruction, the target controlled device performs the operation corresponding to the operation content. For example, if the target controlled device is a smart TV and the third gesture action corresponds to the volume-increase operation, the control device sends the smart TV an operation notification message whose operation content is "increase volume", and the smart TV performs the volume-increase operation, as in the sketch below.
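  • An illustrative operation-notification message and handler. The JSON shape and field names are assumptions; the patent only specifies that the control instruction carries the operation content.

```python
import json

# Illustrative operation-notification message; field names are assumptions.
notification = json.dumps({"target": "smart_tv", "operation": "volume_up"})

def on_operation_notification(raw_message):
    """Hypothetical handler on the controlled-device (smart TV) side."""
    operation = json.loads(raw_message)["operation"]
    print(f"performing operation: {operation}")   # e.g. the TV increases its volume

on_operation_notification(notification)   # performing operation: volume_up
```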
  • In step S306, the control relationship with the target controlled device is disconnected, and the control relationship is switched back to the controlled device that was controlled before the first gesture action was detected.
  • If, in step S304, the control device switched the control relationship from a currently controlled device to the target controlled device, then after the target controlled device has performed the operation corresponding to the third gesture action, the control device may disconnect the control relationship with the target and switch the control relationship back to the previously controlled device. In other words, the control relationship is only temporarily switched to the target controlled device, and after the target has been manipulated, control returns to the device controlled before.
  • For example, if the control device was manipulating a smart TV, it can switch the control relationship from the smart TV to a smart air conditioner through steps S302 to S304, operate the air conditioner by the method of step S305, and then switch the control relationship back to the smart TV; upon acquiring subsequent gesture actions, it continues to control the smart TV according to those gestures and the first mapping relationship. This lend-and-restore pattern is sketched below.
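  • A short sketch of the lend-and-restore semantics of steps S304-S306, assuming a `control_device` object with a `current_device` attribute (a stand-in, not an identifier from the patent):

```python
from contextlib import contextmanager

@contextmanager
def temporary_control(control_device, target):
    """Control passes to `target` for the duration of the block, then returns
    to the device controlled before the first gesture action was detected."""
    previous = control_device.current_device
    control_device.current_device = target        # S304: switch to the target
    try:
        yield target                              # S305: manipulate the target
    finally:
        control_device.current_device = previous  # S306: switch back

# Usage sketch:
#   with temporary_control(ctl, "smart_air_conditioner") as ac:
#       send_operation(ac, "power_on")            # hypothetical helper
```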
  • In the method provided by this embodiment of the present disclosure, the control relationship switch is prepared using the first gesture action and the first mapping relationship; the target controlled device is determined according to the second gesture action and the second mapping relationship; a control relationship with the target controlled device is established; and, by acquiring a third gesture action, the target controlled device is controlled to perform the corresponding operation. Since the control relationship can be switched to the target controlled device through a gesture action, and the target controlled device is then also controlled by gesture, the manipulation is simpler and its flexibility is improved.
  • FIG. 5 is a block diagram of a control device of a smart device, configured to perform the method for controlling a smart device provided by the embodiments corresponding to FIG. 2 or FIG. 3, according to an exemplary embodiment.
  • Referring to FIG. 5, the control device of the smart device includes a pre-switching module 501, a first determining module 502, a first establishing module 503, a first acquiring module 504, and a control module 505, wherein:
  • the pre-switching module 501 is configured to prepare the control relationship switch, when a first gesture action is detected, according to the first gesture action and a first mapping relationship, where the first mapping relationship is a mapping between gesture actions and control functions;
  • the first determining module 502 is configured to determine, when a second gesture action is detected, the target controlled device according to the second gesture action and a second mapping relationship, where the second mapping relationship is a mapping between gesture actions and controlled-device locations;
  • the first establishing module 503 is configured to establish a control relationship with the target controlled device;
  • the first acquiring module 504 is configured to acquire a third gesture action; and
  • the control module 505 is configured to control, according to the third gesture action and the first mapping relationship, the target controlled device to perform the operation corresponding to the third gesture action.
  • In the device provided by this embodiment of the present disclosure, the control relationship switch is prepared using the first gesture action and the first mapping relationship; the target controlled device is determined according to the second gesture action and the second mapping relationship; a control relationship with the target controlled device is established; and, by acquiring a third gesture action, the target controlled device is controlled to perform the corresponding operation. Since the control relationship can be switched to the target controlled device through a gesture action, and the target controlled device is then also controlled by gesture, the manipulation is simpler and its flexibility is improved.
  • Optionally, the control device of the smart device further includes a second acquiring module 506, a second determining module 507, and a second establishing module 508, wherein:
  • the second acquiring module 506 is configured to acquire a first positioning straight line and a second positioning straight line, where the first and second positioning straight lines are the extension lines of the user's finger when the user points to the controlled device from the first position and the second position, respectively;
  • the second determining module 507 is configured to determine the location of the controlled device from the intersection of the first positioning line and the second positioning line; and
  • the second establishing module 508 is configured to establish the second mapping relationship between gesture actions and controlled-device locations.
  • Optionally, the control device of the smart device further includes a third establishing module 509, which is configured to establish the first mapping relationship between gesture actions and control functions, wherein the same gesture action is used to control the same control function of different controlled devices.
  • Optionally, the pre-switching module 501 is configured to prepare the control relationship switch according to a first gesture action made with one hand of the user and the first mapping relationship; and the first determining module 502 is configured to determine the target controlled device according to the second gesture action and the second mapping relationship when a second gesture action made with the user's other hand is detected while the first gesture action is detected.
  • Optionally, the first determining module 502 is configured to determine the target controlled device according to the second gesture action and the second mapping relationship when, after the first gesture action made with one hand of the user has been detected for a specified duration, a second gesture action made with the same hand or the other hand is detected.
  • Optionally, the control device of the smart device further includes a switching module 510, which is configured to disconnect the control relationship with the target controlled device and switch the control relationship back to the controlled device controlled before the first gesture action was detected.
  • FIG. 9 is a block diagram of a terminal 900, which may be used to perform the method for controlling a smart device provided by the embodiment corresponding to FIG. 2 or FIG. 3, according to an exemplary embodiment.
  • the terminal 900 can be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, and the like.
  • Referring to FIG. 9, the terminal 900 may include one or more of the following components: a processing component 902, a memory 904, a power component 906, a multimedia component 908, an audio component 910, an I/O (Input/Output) interface 912, a sensor component 914, and a communication component 916.
  • The processing component 902 typically controls the overall operation of the terminal 900, such as operations associated with display, telephone calls, data communications, camera operation, and recording. The processing component 902 may include one or more processors 920 to execute instructions to perform all or part of the steps of the methods described above. In addition, the processing component 902 may include one or more modules to facilitate interaction between the processing component 902 and the other components; for example, it may include a multimedia module to facilitate interaction between the multimedia component 908 and the processing component 902.
  • The memory 904 is configured to store various types of data to support operation at the terminal 900. Examples of such data include instructions for any application or method operating on the terminal 900, contact data, phone book data, messages, pictures, videos, and the like. The memory 904 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as SRAM (Static Random Access Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), EPROM (Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), ROM (Read-Only Memory), magnetic memory, flash memory, magnetic disk, or optical disk.
  • Power component 906 provides power to various components of terminal 900.
  • Power component 906 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for terminal 900.
  • The multimedia component 908 includes a screen that provides an output interface between the terminal 900 and the user.
  • the screen may include an LCD (Liquid Crystal Display) and a TP (Touch Panel). If the screen includes a touch panel, the screen can be implemented as a touch screen to receive input signals from the user.
  • the touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor can sense not only the boundaries of the touch or sliding action, but also the duration and pressure associated with the touch or slide operation.
  • the multimedia component 908 includes a front camera and/or a rear camera. When the terminal 900 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front and rear camera can be a fixed optical lens system or have focal length and optical zoom capabilities.
  • the audio component 910 is configured to output and/or input an audio signal.
  • the audio component 910 includes a MIC (Microphone) that is configured to receive an external audio signal when the terminal 900 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode.
  • the received audio signal may be further stored in memory 904 or transmitted via communication component 916.
  • the audio component 910 also includes a speaker for outputting an audio signal.
  • the I/O interface 912 provides an interface between the processing component 902 and the peripheral interface module, which may be a keyboard, a click wheel, a button, or the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
  • Sensor component 914 includes one or more sensors for providing status assessments of various aspects of terminal 900.
  • For example, sensor component 914 can detect an open/closed state of terminal 900 and the relative positioning of components (e.g., the display and keypad of terminal 900); sensor component 914 can also detect a change in position of terminal 900 or of a component of terminal 900, the presence or absence of user contact with terminal 900, the orientation or acceleration/deceleration of terminal 900, and changes in the temperature of terminal 900.
  • Sensor assembly 914 can include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
  • Sensor component 914 can also include a light sensor, such as a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge-Coupled Device) image sensor, for use in imaging applications.
  • In some embodiments, the sensor component 914 can also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • Communication component 916 is configured to facilitate wired or wireless communication between terminal 900 and other devices.
  • The terminal 900 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof.
  • In an exemplary embodiment, communication component 916 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel.
  • In an exemplary embodiment, the communication component 916 further includes an NFC (Near Field Communication) module to facilitate short-range communication.
  • The NFC module can be implemented based on RFID (Radio Frequency Identification) technology, IrDA (Infrared Data Association) technology, UWB (Ultra Wideband) technology, BT (Bluetooth) technology, and other technologies.
  • The terminal 900 may be implemented by one or more ASICs (Application Specific Integrated Circuits), DSPs (Digital Signal Processors), DSPDs (Digital Signal Processing Devices), PLDs (Programmable Logic Devices), FPGAs (Field Programmable Gate Arrays), controllers, microcontrollers, microprocessors, or other electronic components, to perform the method for controlling a smart device provided by the embodiment shown in FIG. 2 or FIG. 3.
  • There is also provided a non-transitory computer-readable storage medium comprising instructions, such as the memory 904 comprising instructions executable by processor 920 of terminal 900, to perform the method for controlling a smart device provided by the embodiment shown in FIG. 2 or FIG. 3.
  • The non-transitory computer-readable storage medium may be a ROM, a RAM (Random Access Memory), a CD-ROM (Compact Disc Read-Only Memory), a magnetic tape, a floppy disk, an optical data storage device, or the like.
  • A non-transitory computer-readable storage medium: when the instructions in the storage medium are executed by a processor of a terminal, the terminal is enabled to perform a method for controlling a smart device, the method comprising:
  • when a first gesture action is detected, preparing for a control relationship switch according to the first gesture action and a first mapping relationship, where the first mapping relationship is a mapping relationship between gesture actions and control functions;
  • when a second gesture action is detected, determining a target controlled device according to the second gesture action and a second mapping relationship, where the second mapping relationship is a mapping relationship between gesture actions and controlled device locations;
  • establishing a control relationship with the target controlled device;
  • acquiring a third gesture action;
  • controlling, according to the third gesture action and the first mapping relationship, the target controlled device to perform an operation corresponding to the third gesture action.
  • The memory of the terminal further includes instructions for performing the following operations: before determining the target controlled device according to the second gesture action and the second mapping relationship when the second gesture action is detected,
  • the method further includes:
  • acquiring a first positioning line and a second positioning line, where the first positioning line and the second positioning line are the extension lines along which the user's finger points when the user points at the controlled device from a first position and a second position, respectively; determining the location of the controlled device according to the intersection of the first positioning line and the second positioning line; and establishing the second mapping relationship between gesture actions and controlled device locations.
  • The memory of the terminal further includes instructions for performing the following operation: before preparing for the control relationship switch according to the first gesture action and the first mapping relationship when the first gesture action is detected, the method further includes: establishing the first mapping relationship between gesture actions and control functions, where a same gesture action is used to control the same control function of different controlled devices.
  • The memory of the terminal further includes instructions for performing the following operations: the first gesture action and the second gesture action are completed by the user with two hands, and preparing for the control relationship switch according to the first gesture action and the first mapping relationship includes: preparing for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship;
  • determining the target controlled device according to the second gesture action and the second mapping relationship includes:
  • when the second gesture action made by the user's other hand is detected while the first gesture action is being detected, determining the target controlled device according to the second gesture action and the second mapping relationship.
  • The memory of the terminal further includes instructions for performing the following operations: the first gesture action and the second gesture action are completed by the user with one hand, and preparing for the control relationship switch according to the first gesture action and the first mapping relationship includes: preparing for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship;
  • determining the target controlled device according to the second gesture action and the second mapping relationship includes:
  • when the second gesture action made by the same hand or the other hand of the user is detected after the first gesture action made by one hand of the user has been detected for a specified duration, determining the target controlled device according to the second gesture action and the second mapping relationship.
  • The memory of the terminal further includes instructions for performing the following operation: after controlling the target controlled device to perform the operation corresponding to the third gesture action, the method further includes:
  • disconnecting the control relationship with the target controlled device, and switching the control relationship back to the controlled device that was controlled before the first gesture action was detected.
  • With the non-transitory computer-readable storage medium, a control relationship switch is prepared through the first gesture action and the first mapping relationship; after the target controlled device is determined according to the second gesture action and the second mapping relationship, a control relationship with the target controlled device is established; and by acquiring the third gesture action, the target controlled device is controlled to perform the operation corresponding to the third gesture action, thereby achieving control of the target controlled device. Since the control relationship can be switched to the target controlled device through gesture actions, and the target controlled device is controlled through gesture actions, the control manner is simpler and its flexibility is also improved.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Signal Processing (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Selective Calling Equipment (AREA)

Abstract

A method and apparatus for controlling a smart device, belonging to the field of terminal technology. The method includes: when a first gesture action is detected, preparing for a control relationship switch (S201); when a second gesture action is detected, determining a target controlled device (S202); establishing a control relationship with the target controlled device (S203); acquiring a third gesture action (S204); and controlling the target controlled device to perform an operation corresponding to the third gesture action (S205). A control relationship switch is prepared through the first gesture action; after the target controlled device is determined according to the second gesture action, a control relationship with the target controlled device is established; and by further acquiring the third gesture action, the target controlled device is controlled to perform the operation corresponding to the third gesture action, thereby achieving control of the target controlled device. Since the control relationship can be switched to the target controlled device through gesture actions, and the control of the target controlled device is completed through gesture actions, the control manner is simpler and its flexibility is improved.

Description

Method and apparatus for controlling a smart device
This application is based on and claims priority to Chinese Patent Application No. 201510087958.3, filed on February 26, 2015, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of terminal technology, and in particular to a method and apparatus for controlling a smart device.
Background
With the rapid development of terminal technology, more and more smart devices have emerged, such as smart mobile terminals like smartphones and wearable devices, and smart home devices like smart televisions and smart refrigerators. A single user often owns multiple smart devices at the same time, so a method for controlling the various smart devices simply and quickly is essential.
In the related art, smart devices are usually controlled through contact-based operation. For example, the user controls a smart device by touching function buttons on the smart device or on its remote control. For instance, the user operates a smart television by using a hand to press function buttons on the television or on the television remote control.
In the process of implementing the present disclosure, the inventors found that the related art has at least the following problems:
Since the related art relies on the user's manual contact to control a smart device, the smart device cannot be controlled if it or its remote control is not within the user's reach, which makes the control manner inflexible. In addition, when smart devices are controlled through remote controls, one smart device corresponds to one remote control; to control another smart device, the user has to switch to that device's remote control, which makes the control manner complicated. For example, if the user wants to operate the air conditioner after operating the television with the television remote control, the user must first find the air conditioner remote control and operate it, which is very cumbersome.
Summary
To overcome the problems in the related art, the present disclosure provides a method and apparatus for controlling a smart device.
According to a first aspect of the embodiments of the present disclosure, a method for controlling a smart device is provided, the method including:
when a first gesture action is detected, preparing for a control relationship switch according to the first gesture action and a first mapping relationship, the first mapping relationship being a mapping relationship between gesture actions and control functions;
when a second gesture action is detected, determining a target controlled device according to the second gesture action and a second mapping relationship, the second mapping relationship being a mapping relationship between gesture actions and controlled device locations;
establishing a control relationship with the target controlled device;
acquiring a third gesture action;
controlling, according to the third gesture action and the first mapping relationship, the target controlled device to perform an operation corresponding to the third gesture action.
With reference to the first aspect, in a first possible implementation of the first aspect, before determining the target controlled device according to the second gesture action and the second mapping relationship when the second gesture action is detected, the method further includes:
acquiring a first positioning line and a second positioning line, the first positioning line and the second positioning line being the extension lines along which the user's finger points when the user points at the controlled device from a first position and a second position, respectively;
determining the location of the controlled device according to the intersection of the first positioning line and the second positioning line;
establishing the second mapping relationship between gesture actions and the location of the controlled device.
With reference to the first aspect, in a second possible implementation of the first aspect, before preparing for the control relationship switch according to the first gesture action and the first mapping relationship when the first gesture action is detected, the method further includes:
establishing the first mapping relationship between gesture actions and control functions, where a same gesture action is used to control the same control function of different controlled devices.
With reference to the first aspect, in a third possible implementation of the first aspect, the first gesture action and the second gesture action are completed by the user with two hands, and preparing for the control relationship switch according to the first gesture action and the first mapping relationship includes:
preparing for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship;
determining the target controlled device according to the second gesture action and the second mapping relationship when the second gesture action is detected includes:
when the second gesture action made by the user's other hand is detected while the first gesture action is being detected, determining the target controlled device according to the second gesture action and the second mapping relationship.
With reference to the first aspect, in a fourth possible implementation of the first aspect, the first gesture action and the second gesture action are completed by the user with one hand, and preparing for the control relationship switch according to the first gesture action and the first mapping relationship includes:
preparing for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship;
determining the target controlled device according to the second gesture action and the second mapping relationship when the second gesture action is detected includes:
when the second gesture action made by the same hand or the other hand of the user is detected after the first gesture action made by one hand of the user has been detected for a specified duration, determining the target controlled device according to the second gesture action and the second mapping relationship.
With reference to the first aspect, in a fifth possible implementation of the first aspect, after controlling the target controlled device to perform the operation corresponding to the third gesture action, the method further includes:
disconnecting the control relationship with the target controlled device, and switching the control relationship back to the controlled device that was controlled before the first gesture action was detected.
According to a second aspect of the embodiments of the present disclosure, an apparatus for controlling a smart device is provided, the apparatus including:
a pre-switching module, configured to, when a first gesture action is detected, prepare for a control relationship switch according to the first gesture action and a first mapping relationship, the first mapping relationship being a mapping relationship between gesture actions and control functions;
a first determining module, configured to, when a second gesture action is detected, determine a target controlled device according to the second gesture action and a second mapping relationship, the second mapping relationship being a mapping relationship between gesture actions and controlled device locations;
a first establishing module, configured to establish a control relationship with the target controlled device;
a first acquiring module, configured to acquire a third gesture action;
a control module, configured to control, according to the third gesture action and the first mapping relationship, the target controlled device to perform an operation corresponding to the third gesture action.
With reference to the second aspect, in a first possible implementation of the second aspect, the apparatus further includes:
a second acquiring module, configured to acquire a first positioning line and a second positioning line, the first positioning line and the second positioning line being the extension lines along which the user's finger points when the user points at the controlled device from a first position and a second position, respectively;
a second determining module, configured to determine the location of the controlled device according to the intersection of the first positioning line and the second positioning line;
a second establishing module, configured to establish the second mapping relationship between gesture actions and the location of the controlled device.
With reference to the second aspect, in a second possible implementation of the second aspect, the apparatus further includes:
a third establishing module, configured to establish the first mapping relationship between gesture actions and control functions, where a same gesture action is used to control the same control function of different controlled devices.
With reference to the second aspect, in a third possible implementation of the second aspect, the pre-switching module is configured to prepare for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship;
the first determining module is configured to, when the second gesture action made by the user's other hand is detected while the first gesture action is being detected, determine the target controlled device according to the second gesture action and the second mapping relationship.
With reference to the second aspect, in a fourth possible implementation of the second aspect, the pre-switching module is configured to prepare for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship;
the first determining module is configured to, when the second gesture action made by the same hand or the other hand of the user is detected after the first gesture action made by one hand of the user has been detected for a specified duration, determine the target controlled device according to the second gesture action and the second mapping relationship.
With reference to the second aspect, in a fifth possible implementation of the second aspect, the apparatus further includes:
a switching module, configured to disconnect the control relationship with the target controlled device and switch the control relationship back to the controlled device that was controlled before the first gesture action was detected.
According to a third aspect of the embodiments of the present disclosure, a terminal is provided, the terminal including:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
when a first gesture action is detected, prepare for a control relationship switch according to the first gesture action and a first mapping relationship, the first mapping relationship being a mapping relationship between gesture actions and control functions;
when a second gesture action is detected, determine a target controlled device according to the second gesture action and a second mapping relationship, the second mapping relationship being a mapping relationship between gesture actions and controlled device locations;
establish a control relationship with the target controlled device;
acquire a third gesture action;
control, according to the third gesture action and the first mapping relationship, the target controlled device to perform an operation corresponding to the third gesture action.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects:
A control relationship switch is prepared through the first gesture action and the first mapping relationship; after the target controlled device is determined according to the second gesture action and the second mapping relationship, a control relationship with the target controlled device is established; and by further acquiring the third gesture action, the target controlled device is controlled to perform the operation corresponding to the third gesture action, thereby achieving control of the target controlled device. Since the control relationship can be switched to the target controlled device through gesture actions, and the control of the target controlled device is completed through gesture actions, the control manner is simpler and its flexibility is improved.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief Description of the Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present invention and, together with the description, serve to explain the principles of the present invention.
FIG. 1 is a schematic diagram of an implementation environment involved in a method for controlling a smart device according to an exemplary embodiment.
FIG. 2 is a flowchart of a method for controlling a smart device according to an exemplary embodiment.
FIG. 3 is a flowchart of a method for controlling a smart device according to an exemplary embodiment.
FIG. 4 is a schematic diagram of a process of acquiring positioning lines according to an exemplary embodiment.
FIG. 5 is a block diagram of an apparatus for controlling a smart device according to an exemplary embodiment.
FIG. 6 is a block diagram of an apparatus for controlling a smart device according to an exemplary embodiment.
FIG. 7 is a block diagram of an apparatus for controlling a smart device according to an exemplary embodiment.
FIG. 8 is a block diagram of an apparatus for controlling a smart device according to an exemplary embodiment.
FIG. 9 is a block diagram of a terminal according to an exemplary embodiment.
Detailed Description
Exemplary embodiments will be described in detail here, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present invention; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present invention as detailed in the appended claims.
FIG. 1 is a schematic diagram of an implementation environment involved in a method for controlling a smart device according to an exemplary embodiment. As shown in FIG. 1, the implementation environment includes a control device 101 and various controlled devices 102. The control device 101 and the controlled devices 102 are in the same local area network, and the control device 101 is connected to each controlled device 102 through a communication network, which may be the Internet or a Bluetooth network, among others. When the communication network is the Internet, it may be a wired network or a wireless network.
The control device 101 can be used to recognize a gesture action made by the user and, after determining the control function corresponding to the gesture action, control a controlled device 102 to perform the operation corresponding to that gesture action. Each controlled device 102 communicates with the control device 101 so that the control device 101 can control each controlled device 102 to perform the operation corresponding to the user's gesture action.
The control device 101 may be an independent device, a functional module installed inside one of the controlled devices 102, or a functional module on a router in the local area network, which is not specifically limited in the embodiments of the present disclosure. In addition, the control device 101 may include a camera to recognize gesture actions made by the user, and may further include a data processing apparatus for converting acquired gesture actions into corresponding control functions, among other things. The composition of the control device 101 is not specifically limited in the embodiments of the present disclosure.
Each controlled device 102 may be any of various types of smart device: a smart home device such as a smart television, smart refrigerator, smart air conditioner, smart radio, smart audio system, or smart light, or a smart terminal such as a mobile phone, tablet computer, PC (Personal Computer), desktop computer, or portable computer. The embodiments of the present disclosure do not limit the type of the controlled device 102.
Usually, the same local area network includes many controlled devices 102. Therefore, when the controlled devices 102 are controlled through gesture recognition, how to switch the control relationship of the control device 101 to a particular controlled device 102 and further control that controlled device 102 has received widespread attention in the industry. To solve this technical problem, the embodiments of the present disclosure provide a method for controlling a smart device; see the following embodiments for details:
With reference to the schematic diagram of the implementation environment shown in FIG. 1 and the above content, FIG. 2 is a flowchart of a method for controlling a smart device according to an exemplary embodiment. As shown in FIG. 2, the method is used in a terminal and includes the following steps.
In step S201, when a first gesture action is detected, a control relationship switch is prepared according to the first gesture action and a first mapping relationship, where the first mapping relationship is a mapping relationship between gesture actions and control functions.
In step S202, when a second gesture action is detected, a target controlled device is determined according to the second gesture action and a second mapping relationship, where the second mapping relationship is a mapping relationship between gesture actions and controlled device locations.
In step S203, a control relationship with the target controlled device is established.
In step S204, a third gesture action is acquired.
In step S205, according to the third gesture action and the first mapping relationship, the target controlled device is controlled to perform an operation corresponding to the third gesture action.
With the method provided by the embodiments of the present disclosure, a control relationship switch is prepared through the first gesture action and the first mapping relationship; after the target controlled device is determined according to the second gesture action and the second mapping relationship, a control relationship with the target controlled device is established; and by further acquiring the third gesture action, the target controlled device is controlled to perform the operation corresponding to the third gesture action, thereby achieving control of the target controlled device. Since the control relationship can be switched to the target controlled device through gesture actions, and the control of the target controlled device is completed through gesture actions, the control manner is simpler and its flexibility is improved.
In another embodiment, before determining the target controlled device according to the second gesture action and the second mapping relationship when the second gesture action is detected, the method further includes:
acquiring a first positioning line and a second positioning line, where the first positioning line and the second positioning line are the extension lines along which the user's finger points when the user points at the controlled device from a first position and a second position, respectively;
determining the location of the controlled device according to the intersection of the first positioning line and the second positioning line;
establishing the second mapping relationship between gesture actions and controlled device locations.
In another embodiment, before preparing for the control relationship switch according to the first gesture action and the first mapping relationship when the first gesture action is detected, the method further includes:
establishing the first mapping relationship between gesture actions and control functions, where a same gesture action is used to control the same control function of different controlled devices.
In another embodiment, the first gesture action and the second gesture action are completed by the user with two hands, and preparing for the control relationship switch according to the first gesture action and the first mapping relationship includes:
preparing for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship;
determining the target controlled device according to the second gesture action and the second mapping relationship when the second gesture action is detected includes:
when the second gesture action made by the user's other hand is detected while the first gesture action is being detected, determining the target controlled device according to the second gesture action and the second mapping relationship.
In another embodiment, the first gesture action and the second gesture action are completed by the user with one hand, and preparing for the control relationship switch according to the first gesture action and the first mapping relationship includes:
preparing for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship;
determining the target controlled device according to the second gesture action and the second mapping relationship when the second gesture action is detected includes:
when the second gesture action made by the same hand or the other hand of the user is detected after the first gesture action made by one hand of the user has been detected for a specified duration, determining the target controlled device according to the second gesture action and the second mapping relationship.
In another embodiment, after controlling the target controlled device to perform the operation corresponding to the third gesture action, the method further includes:
disconnecting the control relationship with the target controlled device, and switching the control relationship back to the controlled device that was controlled before the first gesture action was detected.
All the above optional technical solutions may be combined in any manner to form optional embodiments of the present invention, which will not be described one by one here.
With reference to the content of the embodiment corresponding to FIG. 2, FIG. 3 is a flowchart of a method for controlling a smart device according to an exemplary embodiment. As shown in FIG. 3, the method is used in a terminal and includes the following steps.
In step S301, a first mapping relationship between gesture actions and control functions is established, where a same gesture action is used to control the same control function of different controlled devices.
After the control device detects a gesture action made by the user, in order to learn what operation the user wants the controlled device to perform through that gesture action, the control device needs to first establish the first mapping relationship between gesture actions and control functions before it can control controlled devices through gesture actions. On this basis, after subsequently acquiring a gesture action made by the user, the control device can determine, by querying the first mapping relationship, what operation the controlled device should be made to perform. Therefore, before the method for controlling a smart device is performed, the first mapping relationship between gesture actions and control functions needs to be established and stored.
A gesture action may be a dynamic action or a static action. When the gesture action is dynamic, for example a circle-drawing action by the user, the control device can recognize it by the motion trajectory of the gesture; when the gesture action is static, for example the user forming an "ok"-shaped gesture, the control device can recognize it by the shape of the gesture. Of course, a gesture action may also be a combination of dynamic and static actions, for example the user first forming an "ok"-shaped gesture and then making a circle-drawing action; in that case, the control device can recognize the gesture action by both the gesture shape and the motion trajectory.
Depending on these different kinds of gesture action, the established first mapping relationship between gesture actions and control functions also differs. For example, when the control device recognizes gesture actions by motion trajectory, the first mapping relationship may be a mapping between the characteristic trajectories composing the gesture actions and the control functions; when the control device recognizes gesture actions by gesture shape, the first mapping relationship may be a mapping between the feature point sets composing the gesture shapes and the control functions.
For example, if the gesture action is a circle-drawing action and the control function corresponding to that gesture action is a start operation, then, since the characteristic trajectory of the circle-drawing action is a circular trajectory, the established first mapping relationship is the one between the circular trajectory and the start operation.
For another example, if the gesture action is the "ok" gesture shape and the control function corresponding to that gesture action is a shutdown operation, the set of contour points constituting the "ok" gesture shape can be acquired and used as the feature point set of the "ok" gesture shape; in that case, the established first mapping relationship is the one between the contour point set of the "ok" gesture shape and the shutdown operation.
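Purely by way of illustration of the trajectory-based recognition described above (a minimal sketch, not part of the disclosure; the tolerance value and helper name are assumptions), a circle-drawing action could be detected by checking how evenly the sampled trajectory points surround their centroid:

```python
import math

def is_circle_trajectory(points, tolerance=0.15):
    """Heuristic check for a circular characteristic trajectory: every
    sampled point should lie at roughly the same distance from the
    centroid. `points` is a list of (x, y) samples from the tracker."""
    if len(points) < 8:            # too few samples to judge a closed shape
        return False
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    mean_r = sum(radii) / len(radii)
    if mean_r == 0:
        return False
    # Relative spread of the radii: a small spread means a near-circular path.
    return (max(radii) - min(radii)) / mean_r <= tolerance
```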
The first mapping relationship between gesture actions and control functions can be established in the following ways:
First way: the first mapping relationship between gesture actions and control functions is preset when the control device leaves the factory.
Specifically, the first mapping relationship is preset when the control device leaves the factory, and when the control device is subsequently used to control controlled devices, the factory-preset first mapping relationship is used. For example, if the start operation has been set at the factory to correspond to the circle-drawing action, then whenever the user performs a start operation on a controlled device through the control device, the user does so with a circle-drawing action.
In addition, the same local area network usually includes multiple controlled devices, and the controlled devices often share the same control functions, such as a start function, a confirm function, and a shutdown function. To make it easy for the user to operate the same control function on different controlled devices, in the embodiments of the present disclosure, when the first mapping relationship is set, a same gesture action may be set to control the same control function of different controlled devices.
For example, the volume-up operation of a smart television, the volume-up operation of a smart radio, and the temperature-up operation of a smart air conditioner may correspond to the same gesture action; the "confirm" operation of different controlled devices may also correspond to the same gesture action; and for direction-indicating operations such as "up", "down", "left", and "right" on different controlled devices, the same gesture action may be defined for each direction, and so on.
By setting the same control function of different controlled devices to use the same gesture action, the user can operate the same control function of different controlled devices with a single gesture action. Compared with the related-art approach of controlling the same control function of different controlled devices through different gesture actions, this control manner is not only simpler but also more time-saving.
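As an illustrative sketch of such a first mapping relationship (the gesture labels and function names below are hypothetical, not taken from the disclosure), a single table can serve every device type, because each device interprets the shared function name in its own terms:

```python
# First mapping relationship: gesture label -> control function.
# The same gesture maps to one function name; a television applies
# "increase" to its volume, an air conditioner to its temperature.
FIRST_MAPPING = {
    "draw_circle": "prepare_switch",   # the first gesture action
    "point":       "select_device",    # the second gesture action
    "swipe_right": "increase",
    "swipe_left":  "decrease",
    "ok_shape":    "confirm",
}

def control_function_for(gesture_label):
    """Query the first mapping relationship for a detected gesture."""
    return FIRST_MAPPING.get(gesture_label)
```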
Second way: after the control device is installed, the first mapping relationship between gesture actions and control functions is recorded according to the user's operations.
Specifically, after the control device is installed in a local area network, the first mapping relationship is customized according to the operations of the users in that local area network. For example, after the control device newly joins household A, it can first scan all the control functions of all the controlled devices in household A and display the scanned control functions on a screen. Further, the control device can prompt the user, via the on-screen display, to record the gesture action corresponding to each control function. Upon detecting that the user has selected a control function, the control device records the gesture action set by the user for that control function and stores the mapping relationship between the gesture action and the control function. By performing this recording operation for every control function, the first mapping relationship covering all control functions and their corresponding gesture actions is established.
In addition, the user may also be prompted by voice to record the gesture action corresponding to each control function. For example, after all control functions have been scanned, one control function is announced by voice at a time; after it is confirmed that the user has recorded the gesture action corresponding to that control function, the next control function is announced and its corresponding gesture action is recorded in turn.
When the first mapping relationship is established in this way, a recorded gesture action may apply to a single controlled device, or to those controlled devices that share the same control function. For the case where different controlled devices include the same control function, in order to facilitate the user's subsequent control of all controlled devices in the local area network, after all control functions of all controlled devices have been collected, the control functions can first be classified and the control functions shared by different controlled devices can be obtained. Further, the control device can set a same gesture action to control the same control function of all controlled devices in the local area network. For details on using the same gesture action to control the same control function of different controlled devices, see the description of the first way above, which is not repeated here.
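The recording flow might be sketched as follows (illustrative only; `prompt` and `record_gesture` are stand-ins for the on-screen or voice prompting and the camera-based gesture capture, which the disclosure leaves open):

```python
def prompt(message):
    print(message)                   # stand-in for screen/voice prompting

def record_gesture():
    return input("gesture label> ")  # stand-in for camera-based capture

def enroll_first_mapping(devices):
    """Build the first mapping relationship by prompting the user to
    record one gesture per control function found on the network.
    `devices` is an iterable of objects with a .control_functions list."""
    # De-duplicate, so a function shared by several devices is recorded once.
    functions = sorted({f for d in devices for f in d.control_functions})
    mapping = {}
    for function in functions:
        prompt(f"Perform the gesture you want for '{function}'")
        mapping[record_gesture()] = function
    return mapping
```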
When the first mapping relationship is established in this way, the user can customize it according to their own needs, preferences, and operating habits, which makes establishing the first mapping relationship more flexible. For example, a left-handed user can customize the first mapping relationship with the left hand.
Further, after the first mapping relationship has been established in this way, the user can also modify the first mapping relationship between gesture actions and control functions as needed. For example, after setting the first mapping relationship according to preference, if the user finds the chosen gesture action strenuous to perform, or the control device's recognition success rate for it is low, the user can modify that gesture action.
For example, if the user initially set the gesture action for the volume-up operation to pointing upward from the base of the ear, and later finds that posture strenuous, the user can change the gesture action for the volume-up operation, for example to sliding a straight line of no less than a preset length from left to right.
It should be noted that this step is the basis for controlling controlled devices: only after the first mapping relationship has been established in advance can controlled devices subsequently be controlled through gesture actions. That is, this step precedes the controlling of a controlled device and does not need to be performed every time the method for controlling a smart device is performed; it suffices that the first mapping relationship has already been established when the method is performed.
In step S302, when a first gesture action is detected, a control relationship switch is prepared according to the first gesture action and the first mapping relationship.
A control relationship is the basis for transmitting control data and control instructions between the control device and a controlled device. When the control device is to control a controlled device, it needs to first establish a control relationship with that device, on the basis of which control data and control instructions are transmitted.
If the control device is currently controlling one controlled device and detects that the user wants to control another controlled device, the control device needs to prepare for a control relationship switch. In this case, preparing for the control relationship switch may refer to the process of disconnecting the control relationship with the currently controlled device. If the control device is not currently controlling any controlled device, preparing for the control relationship switch refers to the process of setting the current state to a to-be-switched state.
In combination with the above and the first mapping relationship between gesture actions and control functions established in advance in step S301, the control functions in the first mapping relationship include the function of preparing for a control relationship switch, and in the embodiments of the present disclosure, this prepare-to-switch control function corresponds to the same first gesture action for all controlled devices. That is, after detecting the first gesture action, the control device can determine, by querying the first mapping relationship, that the first gesture action corresponds to the control function of preparing for a control relationship switch, and at this point it prepares for the control relationship switch according to the first gesture action and the first mapping relationship.
There can be many types of first gesture action. For example, the first gesture action may be a circle-drawing action by the user, or an action in which the user forms a certain shape, such as forming a circle or forming a heart. On this basis, preparing for the control relationship switch according to the first gesture action and the first mapping relationship may be implemented according to the trajectory of the first gesture action or according to its shape, which is not specifically limited in the embodiments of the present disclosure.
It should be noted that since, in the embodiments of the present disclosure, a same gesture action is used to control the same control function of different controlled devices, switching the control relationship between different controlled devices can always be done through the first gesture action. For example, if the first gesture action is a circle-drawing action and the smart television is currently being controlled, then when the user makes the circle-drawing action, the switch of the control relationship away from the smart television is prepared; if the smart air conditioner is currently being controlled, then when the user likewise makes the circle-drawing action, the switch of the control relationship away from the smart air conditioner is prepared.
By assigning the first gesture action to the prepare-to-switch function of all controlled devices, it is ensured that whenever the first gesture action is acquired, the operation of preparing for the control relationship switch is performed, avoiding the cumbersome operation caused in the related art when different controlled devices use different gesture actions.
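For illustration, the prepare-to-switch behaviour can be pictured as a small state machine inside the control device (a sketch only; the state names and the "prepare_switch" label reuse the hypothetical mapping above and are not prescribed by the disclosure):

```python
from enum import Enum, auto

class ControlState(Enum):
    IDLE = auto()            # controlling no device yet
    CONTROLLING = auto()     # a control relationship is established
    TO_BE_SWITCHED = auto()  # first gesture detected, awaiting the second

def handle_control_function(state, control_function):
    """Step S302: the control function looked up in the first mapping
    relationship drives the state transition; 'prepare_switch' is the
    function mapped to the first gesture action."""
    if control_function == "prepare_switch":
        return ControlState.TO_BE_SWITCHED
    return state
```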
In step S303, when a second gesture action is detected, a target controlled device is determined according to the second gesture action and a second mapping relationship, where the second mapping relationship is a mapping relationship between gesture actions and controlled device locations.
The target controlled device is the controlled device that the user currently wants to control. When smart devices are controlled through gesture actions, determining the target controlled device also needs to be triggered by a certain gesture action of the user. In the embodiments of the present disclosure, the control device is triggered to determine the target controlled device by the second gesture action made by the user. Therefore, after preparing for the control relationship switch, when the control device detects the second gesture action, it determines the target controlled device according to the second gesture action and the second mapping relationship between gesture actions and controlled device locations.
Determining the target controlled device is usually achieved by positioning according to the second gesture action. Therefore, before the control device determines the target controlled device according to the second gesture action and the second mapping relationship when the second gesture action is detected, the second mapping relationship between gesture actions and controlled device locations should also be established. The control device establishes the second mapping relationship through, but not limited to, the following steps S3031 to S3033:
In step S3031, a first positioning line and a second positioning line are acquired, where the first positioning line and the second positioning line are the extension lines along which the user's finger points when the user points at the controlled device from a first position and a second position, respectively.
It should be noted that the control device can determine the user's current position relative to a controlled device according to the relative distance between the user and the controlled device. When determining the location of a controlled device, the control device may first prompt the user, via an on-screen display or by voice, to point at the controlled device from two different positions. When the control device detects that, following the prompt, the user points at the controlled device from the first position, it acquires the first position where the user is located and the direction in which the user's finger points, thereby obtaining the extension line of the finger direction when the user points at the controlled device from the first position, which is taken as the first positioning line; likewise, when the control device detects that the user points at the controlled device from the second position, it acquires the second position and the finger direction, thereby obtaining the extension line of the finger direction when the user points at the controlled device from the second position, which is taken as the second positioning line.
FIG. 4 shows a schematic diagram of acquiring the positioning lines. In FIG. 4, points A and B are the first position and the second position where the user is located. After the user points at a controlled device from point A and from point B, the first positioning line L and the second positioning line L' can be determined according to the extension lines of the user's finger direction. Further, the intersection C of L and L' is the location of the controlled device.
In step S3032, the location of the controlled device is determined according to the intersection of the first positioning line and the second positioning line.
When determining the location of the controlled device, the spatial coordinates of the intersection of the first positioning line and the second positioning line can be determined first, and those spatial coordinates can be used as the location of the controlled device.
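Because two lines measured in space rarely intersect exactly, an implementation would typically take the midpoint of the shortest segment between the two positioning lines as the intersection point. A minimal sketch, assuming each positioning line is represented by an origin (the user's hand position) and a direction vector (the finger direction):

```python
import numpy as np

def locate_device(p1, d1, p2, d2):
    """Estimate a controlled device's location from two positioning
    lines, each given as (origin p, direction d). Measured lines are
    usually skew, so return the midpoint of the shortest segment."""
    p1, d1, p2, d2 = map(np.asarray, (p1, d1, p2, d2))
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:              # lines (nearly) parallel
        raise ValueError("positioning lines are parallel; re-measure")
    t = (b * e - c * d) / denom        # parameter on line 1
    s = (a * e - b * d) / denom        # parameter on line 2
    q1, q2 = p1 + t * d1, p2 + s * d2  # closest points on each line
    return (q1 + q2) / 2

# Example: two lines that intersect exactly at (0, 0, 2)
print(locate_device([1, 0, 0], [-1, 0, 2], [-1, 0, 0], [1, 0, 2]))
```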
Through the above steps S3031 and S3032, the location of every controlled device in the local area network can be determined. In addition, when a controlled device is newly added to the local area network, its location can also be determined in the manner provided by steps S3031 and S3032. A newly added controlled device can be discovered through the controlled device's automatic identification protocol, which is not limited in the embodiments of the present disclosure.
In step S3033, the second mapping relationship between gesture actions and controlled device locations is established.
The location of each controlled device in the local area network can be determined through the above steps S3031 and S3032. Further, in the process of controlling controlled devices through gesture recognition, in order to quickly determine which target controlled device a gesture action is aimed at once the user makes the gesture, the second mapping relationship between gesture actions and controlled device locations needs to be established. On this basis, once the second gesture action is acquired, the target controlled device can be determined quickly.
In addition, since a pointing gesture can usually determine the location of a controlled device, the gesture actions used in establishing the second mapping relationship include, but are not limited to, gesture actions pointing in a certain direction. Other gesture actions may also be used in establishing the second mapping relationship, as long as a mapping relationship is established between the second gesture action and controlled device locations.
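On the same assumptions, resolving a pointing-type second gesture action against the stored locations can be as simple as choosing the registered device whose location lies closest to the ray cast along the pointing direction (the device names here are hypothetical):

```python
import numpy as np

def resolve_target(origin, direction, device_locations):
    """Second mapping lookup for a pointing gesture: return the device
    whose stored location is nearest to the pointing ray."""
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    direction /= np.linalg.norm(direction)
    best_name, best_dist = None, float("inf")
    for name, location in device_locations.items():
        v = np.asarray(location, dtype=float) - origin
        t = v @ direction
        if t < 0:          # the device is behind the user; ignore it
            continue
        dist = np.linalg.norm(v - t * direction)   # distance to the ray
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name

devices = {"tv": (0, 0, 3), "air_conditioner": (2, 1, 3)}
print(resolve_target((0, 0, 0), (0, 0, 1), devices))   # -> tv
```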
Through the above steps S302 and S303, the control relationship switch is prepared according to the first gesture action and the first mapping relationship, and the target controlled device is determined according to the second gesture action and the second mapping relationship. As described above, preparing for the control relationship switch and determining the target controlled device require the user to complete the first gesture action and the second gesture action. The user may complete the first gesture action and the second gesture action with two hands or with one hand; accordingly, the control device implements the preparation for the control relationship switch and the determination of the target controlled device in, but not limited to, the following two ways:
First way: the first gesture action and the second gesture action are completed by the user with two hands. In this case, the control device prepares for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship; when the control device detects the second gesture action made by the user's other hand while detecting the first gesture action, it determines the target controlled device according to the second gesture action and the second mapping relationship. That is, the operation of preparing for the control relationship switch and the operation of determining the target controlled device are performed simultaneously.
For example, if the first gesture action is a circle-drawing action and the second gesture action is a finger-pointing action, then when the control device detects that the user's two hands simultaneously make the circle-drawing action and the finger-pointing action, it simultaneously performs the operation of preparing for the control relationship switch and the operation of determining the target controlled device.
When the preparation for the control relationship switch and the determination of the target controlled device are implemented in this way, the control device can detect the first gesture action and the second gesture action at the same time, so the operation process is relatively time-saving.
Second way: the first gesture action and the second gesture action are completed by the user with one hand. In this case, the control device prepares for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship; when the control device detects the second gesture action made by the same hand or the other hand of the user after the first gesture action made by one hand of the user has been detected for a specified duration, it determines the target controlled device according to the second gesture action and the second mapping relationship. That is, the operation of preparing for the control relationship switch and the operation of determining the target controlled device are performed sequentially.
The specific value of the specified duration is not limited in the embodiments of the present disclosure; for example, the specified duration may be 1 s (second), 3 s, and so on.
For example, if the first gesture action is a circle-drawing action and the second gesture action is a finger-pointing action, then when the control device detects that one hand of the user makes the circle-drawing action, it performs the operation of preparing for the control relationship switch. When, after the circle-drawing action made by that hand has been detected for the specified duration, the control device detects that the user then makes the finger-pointing action with the same hand, it performs the operation of determining the target controlled device.
When the preparation for the control relationship switch and the determination of the target controlled device are implemented in this way, the control device needs to detect the first gesture action and the second gesture action one after the other, so the operation process takes more time than the first way. However, in this way the user can complete the operation with one hand, so the operation is simpler.
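The difference between the two ways reduces to a timing rule over the detected gesture stream, which can be sketched as follows (illustrative; the record fields and the 1 s value are assumptions):

```python
from dataclasses import dataclass

@dataclass
class Gesture:
    label: str
    hand: str     # "left" or "right"
    start: float  # seconds
    end: float

SPECIFIED_DURATION = 1.0  # the "specified duration", e.g. 1 s

def two_hand_trigger(first: Gesture, second: Gesture) -> bool:
    """First way: the second gesture comes from the other hand while the
    first gesture is still being detected."""
    return first.hand != second.hand and first.start <= second.start <= first.end

def one_hand_trigger(first: Gesture, second: Gesture) -> bool:
    """Second way: the second gesture (same or other hand) arrives only
    after the first gesture has been detected for the specified duration."""
    return second.start - first.start >= SPECIFIED_DURATION
```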
In step S304, a control relationship with the target controlled device is established.
To control the target controlled device, the control relationship needs to be switched to the target controlled device, so a control relationship with the target controlled device needs to be established. After the control relationship with the target controlled device is established, when a gesture action made by the user is subsequently detected, the control instruction corresponding to that gesture action is sent to the target controlled device, thereby controlling the target controlled device to perform the corresponding operation.
When establishing the control relationship with the target controlled device, if the control device has just started, i.e., it is not yet controlling any controlled device, the control device can directly establish a control relationship with the target controlled device; if the control device is currently controlling a controlled device, the control device can switch the control relationship from the currently controlled device to the target controlled device.
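For illustration, the establish-or-switch logic of this step, together with the fallback needed when the control relationship is later switched back, might be kept in a small bookkeeping object (a sketch, not the claimed apparatus; session setup and teardown are abstracted away):

```python
class ControlRelationship:
    """Tracks which controlled device currently receives instructions."""

    def __init__(self):
        self.current = None   # device controlled right now
        self.previous = None  # device to fall back to after a switch

    def establish(self, target):
        # Step S304: either a fresh relationship (nothing controlled yet)
        # or a switch away from the currently controlled device.
        self.previous = self.current
        self.current = target

    def switch_back(self):
        # Optional step S306: restore the pre-switch relationship.
        self.current, self.previous = self.previous, None
```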
In step S305, a third gesture action is acquired, and according to the third gesture action and the first mapping relationship, the target controlled device is controlled to perform an operation corresponding to the third gesture action.
For example, after the control relationship with the target controlled device is established, if a third gesture action is subsequently acquired, the control device can first query the first mapping relationship according to the third gesture action to obtain the control function corresponding to the third gesture action, and then determine, according to that control function, what operation the target controlled device should perform, thereby controlling the target controlled device to perform the operation corresponding to the third gesture action.
When controlling the target controlled device to perform the operation corresponding to the third gesture action according to the third gesture action and the first mapping relationship, the control device may do so according to the trajectory of the third gesture action or according to its shape, which is not specifically limited in the embodiments of the present disclosure.
Further, when controlling the target controlled device to perform the operation corresponding to the third gesture action, the control device may send a control instruction to the target controlled device, the control instruction including the operation content. After receiving the control instruction, the target controlled device performs the operation corresponding to the operation content.
For example, if the target controlled device is a smart television and the operation corresponding to the third gesture action is a volume-up operation, then after acquiring the third gesture action, the control device can send an operation notification message to the target controlled device, the operation content of which is volume-up. After receiving the operation notification message, the smart television performs the volume-up operation.
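A plausible shape for such an operation notification message on an IP-based local area network is a small JSON payload (purely illustrative; the disclosure does not fix any wire format, and the port number is an assumption):

```python
import json
import socket

def send_control_instruction(device_ip, operation, port=9000):
    """Send an operation notification message to the target controlled
    device, e.g. send_control_instruction("192.168.1.20", "volume_up")."""
    message = json.dumps({"type": "control", "operation": operation})
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message.encode("utf-8"), (device_ip, port))
```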
In step S306, the control relationship with the target controlled device is disconnected, and the control relationship is switched back to the controlled device that was controlled before the first gesture action was detected.
This step is optional. If, in step S304, the control device established the control relationship with the target controlled device by switching it from the currently controlled device to the target controlled device, then after controlling the target controlled device to perform the operation corresponding to the third gesture action, the control device can disconnect the control relationship with the target controlled device and switch the control relationship back to the controlled device that was controlled before the first gesture action was detected.
Through this optional step, in the method provided by the embodiments of the present disclosure, the control device only temporarily switches the control relationship to the target controlled device and, after finishing controlling the target controlled device, switches back to the previously controlled device.
For example, if the user is currently watching a television program and feels that the room is a bit hot, and therefore wants to lower the temperature of the smart air conditioner, the control device can switch the control relationship from the smart television to the smart air conditioner through steps S302 to S304, control the smart air conditioner in the manner of step S305, and then switch the control relationship back to the smart television. When a gesture action is subsequently acquired, the control device continues to control the smart television according to that gesture action and the first mapping relationship.
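Putting the pieces together, this television-to-air-conditioner example could run through the `ControlRelationship` sketch above as follows:

```python
relationship = ControlRelationship()
relationship.establish("smart_tv")               # watching television

# Steps S302/S303: circle gesture, then pointing at the air conditioner
relationship.establish("smart_air_conditioner")

# Step S305: third gesture action, e.g.
# send_control_instruction(ac_ip, "decrease")    # lower the temperature

# Step S306: hand control back to the television
relationship.switch_back()
assert relationship.current == "smart_tv"
```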
With the method provided by the embodiments of the present disclosure, a control relationship switch is prepared through the first gesture action and the first mapping relationship; after the target controlled device is determined according to the second gesture action and the second mapping relationship, a control relationship with the target controlled device is established; and by further acquiring the third gesture action, the target controlled device is controlled to perform the operation corresponding to the third gesture action, thereby achieving control of the target controlled device. Since the control relationship can be switched to the target controlled device through gesture actions, and the control of the target controlled device is completed through gesture actions, the control manner is simpler and its flexibility is improved.
FIG. 5 is a block diagram of an apparatus for controlling a smart device according to an exemplary embodiment. The apparatus is used to perform the method for controlling a smart device provided by the embodiment corresponding to FIG. 2 or FIG. 3. Referring to FIG. 5, the apparatus includes a pre-switching module 501, a first determining module 502, a first establishing module 503, a first acquiring module 504, and a control module 505. Wherein:
The pre-switching module 501 is configured to, when a first gesture action is detected, prepare for a control relationship switch according to the first gesture action and a first mapping relationship, where the first mapping relationship is a mapping relationship between gesture actions and control functions;
the first determining module 502 is configured to, when a second gesture action is detected, determine a target controlled device according to the second gesture action and a second mapping relationship, where the second mapping relationship is a mapping relationship between gesture actions and controlled device locations;
the first establishing module 503 is configured to establish a control relationship with the target controlled device;
the first acquiring module 504 is configured to acquire a third gesture action;
the control module 505 is configured to control, according to the third gesture action and the first mapping relationship, the target controlled device to perform an operation corresponding to the third gesture action.
With the apparatus provided by the embodiments of the present disclosure, a control relationship switch is prepared through the first gesture action and the first mapping relationship; after the target controlled device is determined according to the second gesture action and the second mapping relationship, a control relationship with the target controlled device is established; and by further acquiring the third gesture action, the target controlled device is controlled to perform the operation corresponding to the third gesture action, thereby achieving control of the target controlled device. Since the control relationship can be switched to the target controlled device through gesture actions, and the control of the target controlled device is completed through gesture actions, the control manner is simpler and its flexibility is improved.
In another embodiment, as shown in FIG. 6, the apparatus further includes a second acquiring module 506, a second determining module 507, and a second establishing module 508. Wherein:
The second acquiring module 506 is configured to acquire a first positioning line and a second positioning line, where the first positioning line and the second positioning line are the extension lines along which the user's finger points when the user points at the controlled device from a first position and a second position, respectively;
the second determining module 507 is configured to determine the location of the controlled device according to the intersection of the first positioning line and the second positioning line;
the second establishing module 508 is configured to establish the second mapping relationship between gesture actions and controlled device locations.
In another embodiment, referring to FIG. 7, the apparatus further includes a third establishing module 509. Wherein:
The third establishing module 509 is configured to establish the first mapping relationship between gesture actions and control functions, where a same gesture action is used to control the same control function of different controlled devices.
In another embodiment, the pre-switching module 501 is configured to prepare for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship;
the first determining module 502 is configured to, when the second gesture action made by the user's other hand is detected while the first gesture action is being detected, determine the target controlled device according to the second gesture action and the second mapping relationship.
In another embodiment, the pre-switching module 501 is configured to prepare for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship;
the first determining module 502 is configured to, when the second gesture action made by the same hand or the other hand of the user is detected after the first gesture action made by one hand of the user has been detected for a specified duration, determine the target controlled device according to the second gesture action and the second mapping relationship.
In another embodiment, referring to FIG. 8, the apparatus further includes a switching module 510. Wherein:
The switching module 510 is configured to disconnect the control relationship with the target controlled device and switch the control relationship back to the controlled device that was controlled before the first gesture action was detected.
All the above optional technical solutions may be combined in any manner to form optional embodiments of the present invention, which will not be described one by one here.
With regard to the apparatuses in the above embodiments, the specific manner in which each module performs operations has been described in detail in the embodiments of the related method, and will not be elaborated here.
FIG. 9 is a block diagram of a terminal 900 according to an exemplary embodiment. The terminal can be used to perform the method for controlling a smart device provided by the embodiment corresponding to FIG. 2 or FIG. 3. For example, the terminal 900 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, or the like.
Referring to FIG. 9, the terminal 900 may include one or more of the following components: a processing component 902, a memory 904, a power component 906, a multimedia component 908, an audio component 910, an I/O (Input/Output) interface 912, a sensor component 914, and a communication component 916.
The processing component 902 typically controls the overall operations of the terminal 900, such as operations associated with display, telephone calls, data communication, camera operation, and recording operation. The processing component 902 may include one or more processors 920 to execute instructions to perform all or part of the steps of the above-described method. In addition, the processing component 902 may include one or more modules to facilitate interaction between the processing component 902 and other components. For example, the processing component 902 may include a multimedia module to facilitate interaction between the multimedia component 908 and the processing component 902.
The memory 904 is configured to store various types of data to support operation of the terminal 900. Examples of such data include instructions for any application or method operating on the terminal 900, contact data, phone book data, messages, pictures, videos, and the like. The memory 904 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as SRAM (Static Random Access Memory), EEPROM (Electrically-Erasable Programmable Read-Only Memory), EPROM (Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), ROM (Read-Only Memory), magnetic memory, flash memory, a magnetic disk, or an optical disc.
The power component 906 provides power to the various components of the terminal 900. The power component 906 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the terminal 900.
The multimedia component 908 includes a screen that provides an output interface between the terminal 900 and the user. In some embodiments, the screen may include an LCD (Liquid Crystal Display) and a TP (Touch Panel). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 908 includes a front camera and/or a rear camera. When the terminal 900 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each of the front camera and the rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 910 is configured to output and/or input audio signals. For example, the audio component 910 includes a MIC (Microphone), which is configured to receive external audio signals when the terminal 900 is in an operation mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signals may be further stored in the memory 904 or transmitted via the communication component 916. In some embodiments, the audio component 910 further includes a speaker for outputting audio signals.
The I/O interface 912 provides an interface between the processing component 902 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
The sensor component 914 includes one or more sensors for providing status assessments of various aspects of the terminal 900. For example, the sensor component 914 may detect an open/closed state of the terminal 900 and the relative positioning of components (e.g., the display and keypad of the terminal 900); the sensor component 914 may also detect a change in position of the terminal 900 or of a component of the terminal 900, the presence or absence of user contact with the terminal 900, the orientation or acceleration/deceleration of the terminal 900, and changes in the temperature of the terminal 900. The sensor component 914 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 914 may also include a light sensor, such as a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge-Coupled Device) image sensor, for use in imaging applications. In some embodiments, the sensor component 914 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 916 is configured to facilitate wired or wireless communication between the terminal 900 and other devices. The terminal 900 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 916 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 916 further includes an NFC (Near Field Communication) module to facilitate short-range communication. For example, the NFC module may be implemented based on RFID (Radio Frequency Identification) technology, IrDA (Infrared Data Association) technology, UWB (Ultra Wideband) technology, BT (Bluetooth) technology, and other technologies.
In an exemplary embodiment, the terminal 900 may be implemented by one or more ASICs (Application Specific Integrated Circuits), DSPs (Digital Signal Processors), DSPDs (Digital Signal Processing Devices), PLDs (Programmable Logic Devices), FPGAs (Field Programmable Gate Arrays), controllers, microcontrollers, microprocessors, or other electronic components, to perform the method for controlling a smart device provided by the embodiment shown in FIG. 2 or FIG. 3.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium comprising instructions, such as the memory 904 comprising instructions executable by the processor 920 of the terminal 900, to perform the method for controlling a smart device provided by the embodiment shown in FIG. 2 or FIG. 3. For example, the non-transitory computer-readable storage medium may be a ROM, a RAM (Random Access Memory), a CD-ROM (Compact Disc Read-Only Memory), a magnetic tape, a floppy disk, an optical data storage device, or the like.
A non-transitory computer-readable storage medium: when the instructions in the storage medium are executed by a processor of a terminal, the mobile terminal is enabled to perform a method for controlling a smart device, the method comprising:
when a first gesture action is detected, preparing for a control relationship switch according to the first gesture action and a first mapping relationship, where the first mapping relationship is a mapping relationship between gesture actions and control functions;
when a second gesture action is detected, determining a target controlled device according to the second gesture action and a second mapping relationship, where the second mapping relationship is a mapping relationship between gesture actions and controlled device locations;
establishing a control relationship with the target controlled device;
acquiring a third gesture action;
controlling, according to the third gesture action and the first mapping relationship, the target controlled device to perform an operation corresponding to the third gesture action.
Assuming the above is a first possible implementation, in a second possible implementation provided on the basis of the first possible implementation, the memory of the terminal further includes instructions for performing the following operations: before determining the target controlled device according to the second gesture action and the second mapping relationship when the second gesture action is detected, the method further includes:
acquiring a first positioning line and a second positioning line, where the first positioning line and the second positioning line are the extension lines along which the user's finger points when the user points at the controlled device from a first position and a second position, respectively;
determining the location of the controlled device according to the intersection of the first positioning line and the second positioning line;
establishing the second mapping relationship between gesture actions and controlled device locations.
In a third possible implementation provided on the basis of the first possible implementation, the memory of the terminal further includes instructions for performing the following operation: before preparing for the control relationship switch according to the first gesture action and the first mapping relationship when the first gesture action is detected, the method further includes:
establishing the first mapping relationship between gesture actions and control functions, where a same gesture action is used to control the same control function of different controlled devices.
In a fourth possible implementation provided on the basis of the first possible implementation, the memory of the terminal further includes instructions for performing the following operations: the first gesture action and the second gesture action are completed by the user with two hands, and preparing for the control relationship switch according to the first gesture action and the first mapping relationship includes:
preparing for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship;
determining the target controlled device according to the second gesture action and the second mapping relationship when the second gesture action is detected includes:
when the second gesture action made by the user's other hand is detected while the first gesture action is being detected, determining the target controlled device according to the second gesture action and the second mapping relationship.
In a fifth possible implementation provided on the basis of the first possible implementation, the memory of the terminal further includes instructions for performing the following operations: the first gesture action and the second gesture action are completed by the user with one hand, and preparing for the control relationship switch according to the first gesture action and the first mapping relationship includes:
preparing for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship;
determining the target controlled device according to the second gesture action and the second mapping relationship when the second gesture action is detected includes:
when the second gesture action made by the same hand or the other hand of the user is detected after the first gesture action made by one hand of the user has been detected for a specified duration, determining the target controlled device according to the second gesture action and the second mapping relationship.
In a sixth possible implementation provided on the basis of the first possible implementation, the memory of the terminal further includes instructions for performing the following operation: after controlling the target controlled device to perform the operation corresponding to the third gesture action, the method further includes:
disconnecting the control relationship with the target controlled device, and switching the control relationship back to the controlled device that was controlled before the first gesture action was detected.
With the non-transitory computer-readable storage medium provided by the embodiments of the present disclosure, a control relationship switch is prepared through the first gesture action and the first mapping relationship; after the target controlled device is determined according to the second gesture action and the second mapping relationship, a control relationship with the target controlled device is established; and by further acquiring the third gesture action, the target controlled device is controlled to perform the operation corresponding to the third gesture action, thereby achieving control of the target controlled device. Since the control relationship can be switched to the target controlled device through gesture actions, and the control of the target controlled device is completed through gesture actions, the control manner is simpler and its flexibility is improved.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed here. This application is intended to cover any variations, uses, or adaptations of the invention that follow its general principles and include such departures from the present disclosure as come within known or customary practice in the art. The specification and examples are to be considered as exemplary only, with the true scope and spirit of the invention being indicated by the following claims.
It should be understood that the present invention is not limited to the exact construction that has been described above and illustrated in the accompanying drawings, and that various modifications and changes can be made without departing from its scope. The scope of the invention is limited only by the appended claims.

Claims (13)

  1. A method for controlling a smart device, characterized in that the method comprises:
    when a first gesture action is detected, preparing for a control relationship switch according to the first gesture action and a first mapping relationship, the first mapping relationship being a mapping relationship between gesture actions and control functions;
    when a second gesture action is detected, determining a target controlled device according to the second gesture action and a second mapping relationship, the second mapping relationship being a mapping relationship between gesture actions and controlled device locations;
    establishing a control relationship with the target controlled device;
    acquiring a third gesture action;
    controlling, according to the third gesture action and the first mapping relationship, the target controlled device to perform an operation corresponding to the third gesture action.
  2. The method according to claim 1, characterized in that before determining the target controlled device according to the second gesture action and the second mapping relationship when the second gesture action is detected, the method further comprises:
    acquiring a first positioning line and a second positioning line, the first positioning line and the second positioning line being the extension lines along which the user's finger points when the user points at the controlled device from a first position and a second position, respectively;
    determining the location of the controlled device according to the intersection of the first positioning line and the second positioning line;
    establishing the second mapping relationship between gesture actions and the location of the controlled device.
  3. The method according to claim 1, characterized in that before preparing for the control relationship switch according to the first gesture action and the first mapping relationship when the first gesture action is detected, the method further comprises:
    establishing the first mapping relationship between gesture actions and control functions, wherein a same gesture action is used to control the same control function of different controlled devices.
  4. The method according to claim 1, characterized in that the first gesture action and the second gesture action are completed by the user with two hands, and preparing for the control relationship switch according to the first gesture action and the first mapping relationship comprises:
    preparing for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship;
    determining the target controlled device according to the second gesture action and the second mapping relationship when the second gesture action is detected comprises:
    when the second gesture action made by the other hand of the user is detected while the first gesture action is being detected, determining the target controlled device according to the second gesture action and the second mapping relationship.
  5. The method according to claim 1, characterized in that the first gesture action and the second gesture action are completed by the user with one hand, and preparing for the control relationship switch according to the first gesture action and the first mapping relationship comprises:
    preparing for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship;
    determining the target controlled device according to the second gesture action and the second mapping relationship when the second gesture action is detected comprises:
    when the second gesture action made by the same hand or the other hand of the user is detected after the first gesture action made by one hand of the user has been detected for a specified duration, determining the target controlled device according to the second gesture action and the second mapping relationship.
  6. The method according to claim 1, characterized in that after controlling the target controlled device to perform the operation corresponding to the third gesture action, the method further comprises:
    disconnecting the control relationship with the target controlled device, and switching the control relationship back to the controlled device that was controlled before the first gesture action was detected.
  7. An apparatus for controlling a smart device, characterized in that the apparatus comprises:
    a pre-switching module, configured to, when a first gesture action is detected, prepare for a control relationship switch according to the first gesture action and a first mapping relationship, the first mapping relationship being a mapping relationship between gesture actions and control functions;
    a first determining module, configured to, when a second gesture action is detected, determine a target controlled device according to the second gesture action and a second mapping relationship, the second mapping relationship being a mapping relationship between gesture actions and controlled device locations;
    a first establishing module, configured to establish a control relationship with the target controlled device;
    a first acquiring module, configured to acquire a third gesture action;
    a control module, configured to control, according to the third gesture action and the first mapping relationship, the target controlled device to perform an operation corresponding to the third gesture action.
  8. The apparatus according to claim 7, characterized in that the apparatus further comprises:
    a second acquiring module, configured to acquire a first positioning line and a second positioning line, the first positioning line and the second positioning line being the extension lines along which the user's finger points when the user points at the controlled device from a first position and a second position, respectively;
    a second determining module, configured to determine the location of the controlled device according to the intersection of the first positioning line and the second positioning line;
    a second establishing module, configured to establish the second mapping relationship between gesture actions and the location of the controlled device.
  9. The apparatus according to claim 7, characterized in that the apparatus further comprises:
    a third establishing module, configured to establish the first mapping relationship between gesture actions and control functions, wherein a same gesture action is used to control the same control function of different controlled devices.
  10. The apparatus according to claim 7, characterized in that the pre-switching module is configured to prepare for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship;
    the first determining module is configured to, when the second gesture action made by the other hand of the user is detected while the first gesture action is being detected, determine the target controlled device according to the second gesture action and the second mapping relationship.
  11. The apparatus according to claim 7, characterized in that the pre-switching module is configured to prepare for the control relationship switch according to the first gesture action made by one hand of the user and the first mapping relationship;
    the first determining module is configured to, when the second gesture action made by the same hand or the other hand of the user is detected after the first gesture action made by one hand of the user has been detected for a specified duration, determine the target controlled device according to the second gesture action and the second mapping relationship.
  12. The apparatus according to claim 7, characterized in that the apparatus further comprises:
    a switching module, configured to disconnect the control relationship with the target controlled device and switch the control relationship back to the controlled device that was controlled before the first gesture action was detected.
  13. A terminal, characterized in that the terminal comprises:
    a processor;
    a memory for storing instructions executable by the processor;
    wherein the processor is configured to:
    when a first gesture action is detected, prepare for a control relationship switch according to the first gesture action and a first mapping relationship, the first mapping relationship being a mapping relationship between gesture actions and control functions;
    when a second gesture action is detected, determine a target controlled device according to the second gesture action and a second mapping relationship, the second mapping relationship being a mapping relationship between gesture actions and controlled device locations;
    establish a control relationship with the target controlled device;
    acquire a third gesture action;
    control, according to the third gesture action and the first mapping relationship, the target controlled device to perform an operation corresponding to the third gesture action.
PCT/CN2015/088583 2015-02-26 2015-08-31 Method and apparatus for controlling smart device WO2016134591A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2017501458A JP6229096B2 (ja) 2015-02-26 2015-08-31 スマート機器の操作制御方法、装置、プログラム及び記憶媒体
MX2016000471A MX362892B (es) 2015-02-26 2015-08-31 Metodo y aparato para operar y controlar dispositivos inteligentes.
RU2016101234A RU2633367C2 (ru) 2015-02-26 2015-08-31 Способ и устройство для оперирования и управления интеллектуальным устройством
KR1020157030749A KR101736318B1 (ko) 2015-02-26 2015-08-31 스마트 기기의 조작 제어 방법, 장치, 프로그램 및 기록매체

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510087958.3 2015-02-26
CN201510087958.3A CN104699244B (zh) 2015-02-26 2015-02-26 智能设备的操控方法及装置

Publications (1)

Publication Number Publication Date
WO2016134591A1 true WO2016134591A1 (zh) 2016-09-01

Family

ID=53346454

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/088583 WO2016134591A1 (zh) 2015-02-26 2015-08-31 智能设备的操控方法及装置

Country Status (8)

Country Link
US (1) US10007354B2 (zh)
EP (1) EP3062196B1 (zh)
JP (1) JP6229096B2 (zh)
KR (1) KR101736318B1 (zh)
CN (1) CN104699244B (zh)
MX (1) MX362892B (zh)
RU (1) RU2633367C2 (zh)
WO (1) WO2016134591A1 (zh)

Families Citing this family (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104699244B (zh) 2015-02-26 2018-07-06 小米科技有限责任公司 智能设备的操控方法及装置
CN105204742B (zh) * 2015-09-28 2019-07-09 小米科技有限责任公司 电子设备的控制方法、装置及终端
CN105425597B (zh) * 2015-11-02 2020-03-03 小米科技有限责任公司 控制智能灯的方法、装置及系统
CN105491234B (zh) * 2015-11-30 2019-02-01 深圳还是威健康科技有限公司 一种智能终端交互的方法及智能终端
WO2017201697A1 (en) 2016-05-25 2017-11-30 SZ DJI Technology Co., Ltd. Techniques for image recognition-based aerial vehicle navigation
CN107526500A (zh) * 2016-06-22 2017-12-29 斑马网络技术有限公司 功能调节方法、装置、设备、界面系统及控制设备
CN107801413B (zh) * 2016-06-28 2020-01-31 华为技术有限公司 对电子设备进行控制的终端及其处理方法
CN106406725A (zh) * 2016-08-24 2017-02-15 深圳前海弘稼科技有限公司 一种基于种植箱的控制方法及装置
CN106355852B (zh) * 2016-08-25 2020-01-07 北京小米移动软件有限公司 设备控制方法及装置
CN106354263A (zh) * 2016-09-09 2017-01-25 电子科技大学 基于面部特征追踪的实时人机交互系统及其工作方法
CN106444415B (zh) * 2016-12-08 2019-10-01 湖北大学 智能家居控制方法及系统
CN107272427B (zh) * 2017-06-16 2021-01-05 北京小米移动软件有限公司 智能设备的控制方法及装置
CN107315355B (zh) * 2017-06-30 2021-05-18 京东方科技集团股份有限公司 一种电器控制设备及方法
US10620721B2 (en) * 2018-01-29 2020-04-14 Google Llc Position-based location indication and device control
CN108717271A (zh) * 2018-05-30 2018-10-30 辽东学院 人机交互控制方法、装置、系统及可读存储介质
US20200192485A1 (en) * 2018-12-12 2020-06-18 Lenovo (Singapore) Pte. Ltd. Gaze-based gesture recognition
US10701661B1 (en) 2019-04-02 2020-06-30 Google Llc Location determination for device control and configuration
US11082249B2 (en) 2019-04-02 2021-08-03 Google Llc Location determination for device control and configuration
CN110471296B (zh) * 2019-07-19 2022-05-13 深圳绿米联创科技有限公司 设备控制方法、装置、系统、电子设备及存储介质
CN110557741B (zh) 2019-08-05 2021-08-13 华为技术有限公司 终端交互的方法及终端
CN111479118A (zh) * 2019-10-09 2020-07-31 王东 一种电子设备的控制方法、装置及电子设备
CN110764616A (zh) * 2019-10-22 2020-02-07 深圳市商汤科技有限公司 手势控制方法和装置
CN113495617A (zh) * 2020-04-01 2021-10-12 深圳绿米联创科技有限公司 设备控制的方法、装置、终端设备以及存储介质
CN111857525A (zh) * 2020-06-23 2020-10-30 佳格科技(浙江)股份有限公司 一种手势切换屏幕的方法及系统
JP7322824B2 (ja) * 2020-07-01 2023-08-08 トヨタ自動車株式会社 情報処理装置、情報処理方法、および制御システム
US11789542B2 (en) 2020-10-21 2023-10-17 International Business Machines Corporation Sensor agnostic gesture detection

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110296353A1 (en) * 2009-05-29 2011-12-01 Canesta, Inc. Method and system implementing user-centric gesture control
CN103793055A (zh) * 2014-01-20 2014-05-14 华为终端有限公司 一种手势响应方法及终端
CN104102335A (zh) * 2013-04-15 2014-10-15 中兴通讯股份有限公司 一种手势控制方法、装置和系统
CN104360736A (zh) * 2014-10-30 2015-02-18 广东美的制冷设备有限公司 基于手势的终端控制方法和系统
CN104699244A (zh) * 2015-02-26 2015-06-10 小米科技有限责任公司 智能设备的操控方法及装置
CN104866084A (zh) * 2014-02-25 2015-08-26 中兴通讯股份有限公司 手势识别方法、装置和系统

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1394727B1 (en) * 2002-08-30 2011-10-12 MVTec Software GmbH Hierarchical component based object recognition
JP2005047331A (ja) * 2003-07-31 2005-02-24 Nissan Motor Co Ltd 制御装置
JP3900136B2 (ja) * 2003-10-27 2007-04-04 オンキヨー株式会社 遠隔操作システム
JP2006155244A (ja) * 2004-11-29 2006-06-15 Olympus Corp 情報表示装置
US8199113B2 (en) * 2006-09-13 2012-06-12 Savant Systems, Llc Programmable on screen display and remote control
JP5207513B2 (ja) * 2007-08-02 2013-06-12 公立大学法人首都大学東京 制御機器操作ジェスチャ認識装置、制御機器操作ジェスチャ認識システムおよび制御機器操作ジェスチャ認識プログラム
KR100977443B1 (ko) * 2008-10-01 2010-08-24 숭실대학교산학협력단 제스쳐 기반의 가전기기 제어장치 및 방법
KR101563487B1 (ko) * 2009-05-11 2015-10-27 엘지전자 주식회사 가전기기를 제어하는 휴대 단말기
KR20110010906A (ko) * 2009-07-27 2011-02-08 삼성전자주식회사 사용자 인터랙션을 이용한 전자기기 제어 방법 및 장치
CN102117117A (zh) * 2010-01-06 2011-07-06 致伸科技股份有限公司 利用图像提取装置辨识使用者姿势进行控制的系统及方法
CN101777250B (zh) * 2010-01-25 2012-01-25 中国科学技术大学 家用电器的通用遥控装置及方法
US8150384B2 (en) * 2010-06-16 2012-04-03 Qualcomm Incorporated Methods and apparatuses for gesture based remote control
US9304592B2 (en) * 2010-11-12 2016-04-05 At&T Intellectual Property I, L.P. Electronic device control based on gestures
US9778747B2 (en) * 2011-01-19 2017-10-03 Hewlett-Packard Development Company, L.P. Method and system for multimodal and gestural control
EP2650754A3 (en) * 2012-03-15 2014-09-24 Omron Corporation Gesture recognition apparatus, electronic device, gesture recognition method, control program, and recording medium
JP2013205983A (ja) * 2012-03-27 2013-10-07 Sony Corp 情報入力装置及び情報入力方法、並びにコンピューター・プログラム
US9170674B2 (en) * 2012-04-09 2015-10-27 Qualcomm Incorporated Gesture-based device control using pressure-sensitive sensors
US9529439B2 (en) * 2012-11-27 2016-12-27 Qualcomm Incorporated Multi device pairing and sharing via gestures
JP6202810B2 (ja) * 2012-12-04 2017-09-27 アルパイン株式会社 ジェスチャ認識装置および方法ならびにプログラム
JP6030430B2 (ja) * 2012-12-14 2016-11-24 クラリオン株式会社 制御装置、車両及び携帯端末
US8886399B2 (en) * 2013-03-15 2014-11-11 Honda Motor Co., Ltd. System and method for controlling a vehicle user interface based on gesture angle
CN103246347A (zh) * 2013-04-02 2013-08-14 百度在线网络技术(北京)有限公司 控制方法、装置和终端
CN103870196B (zh) * 2014-03-06 2018-02-09 美卓软件设计(北京)有限公司 一种切换对象的方法及装置
US10782657B2 (en) * 2014-05-27 2020-09-22 Ultrahaptics IP Two Limited Systems and methods of gestural interaction in a pervasive computing environment
KR101556521B1 (ko) * 2014-10-06 2015-10-13 현대자동차주식회사 휴먼 머신 인터페이스 장치, 그를 가지는 차량 및 그 제어 방법

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110296353A1 (en) * 2009-05-29 2011-12-01 Canesta, Inc. Method and system implementing user-centric gesture control
CN104102335A (zh) * 2013-04-15 2014-10-15 中兴通讯股份有限公司 一种手势控制方法、装置和系统
CN103793055A (zh) * 2014-01-20 2014-05-14 华为终端有限公司 一种手势响应方法及终端
CN104866084A (zh) * 2014-02-25 2015-08-26 中兴通讯股份有限公司 手势识别方法、装置和系统
CN104360736A (zh) * 2014-10-30 2015-02-18 广东美的制冷设备有限公司 基于手势的终端控制方法和系统
CN104699244A (zh) * 2015-02-26 2015-06-10 小米科技有限责任公司 智能设备的操控方法及装置

Also Published As

Publication number Publication date
US10007354B2 (en) 2018-06-26
US20160252967A1 (en) 2016-09-01
CN104699244B (zh) 2018-07-06
JP2017516242A (ja) 2017-06-15
KR101736318B1 (ko) 2017-05-16
EP3062196B1 (en) 2018-02-07
KR20160113544A (ko) 2016-09-30
RU2633367C2 (ru) 2017-10-12
CN104699244A (zh) 2015-06-10
MX362892B (es) 2019-02-22
JP6229096B2 (ja) 2017-11-08
RU2016101234A (ru) 2017-07-20
MX2016000471A (es) 2016-10-26
EP3062196A1 (en) 2016-08-31

Similar Documents

Publication Publication Date Title
WO2016134591A1 (zh) 智能设备的操控方法及装置
JP6488375B2 (ja) デバイス制御方法及び装置
US10613498B2 (en) Method for controlling device by remote control device
KR101814161B1 (ko) 전자기기 제어 방법 및 장치
US10453331B2 (en) Device control method and apparatus
WO2017024711A1 (zh) 智能家居设备的控制方法、装置、系统及设备
WO2017008398A1 (zh) 智能设备控制方法和装置
WO2017201860A1 (zh) 视频直播方法及装置
WO2017084269A1 (zh) 控制智能设备的方法和装置
EP2985989B1 (en) Method and device for acquiring multimedia data stream
WO2016119458A1 (zh) 遥控方法及装置
US20160316007A1 (en) Method and apparatus for grouping smart device in smart home system
WO2017101497A1 (zh) 设备控制方法及装置
WO2017071077A1 (zh) 界面显示方法及装置
WO2017008394A1 (zh) 一种下载控制程序的方法及装置
US9865161B2 (en) Method, remote controller and electrical applicance for releasing a binding of a remote controller
WO2018040325A1 (zh) 设备标识方法及装置
WO2023115777A1 (zh) 光标控制方法、装置、电子设备和存储介质
US11782523B2 (en) Method for controlling Internet of Things device, and terminal device
CN106527954B (zh) 设备控制方法及装置和移动终端
US20240056921A1 (en) Connection method and apparatus for wireless smart wearable device and storage medium
CN105592337A (zh) 显示方法、装置、发送设备及控制设备

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 20157030749

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2017501458

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: MX/A/2016/000471

Country of ref document: MX

ENP Entry into the national phase

Ref document number: 2016101234

Country of ref document: RU

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15883028

Country of ref document: EP

Kind code of ref document: A1

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112016001498

Country of ref document: BR

ENP Entry into the national phase

Ref document number: 112016001498

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20160122

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15883028

Country of ref document: EP

Kind code of ref document: A1