CN103961869A - Device control method - Google Patents

Device control method

Info

Publication number
CN103961869A
Authority
CN
China
Prior art keywords
information
user
control instruction
equipment
control information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410149230.4A
Other languages
Chinese (zh)
Inventor
林云帆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201410149230.4A priority Critical patent/CN103961869A/en
Priority to CN201710420565.9A priority patent/CN107346593B/en
Priority to CN201710420621.9A priority patent/CN107320948B/en
Publication of CN103961869A publication Critical patent/CN103961869A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07F COIN-FREED OR LIKE APPARATUS
    • G07F17/00 Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/32 Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
    • G07F17/3202 Hardware aspects of a gaming system, e.g. components, construction, architecture thereof
    • G07F17/3204 Player-machine interfaces
    • G07F17/3209 Input means, e.g. buttons, touch screen
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07F COIN-FREED OR LIKE APPARATUS
    • G07F17/00 Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/32 Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
    • G07F17/326 Game play aspects of gaming systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention belongs to the field of control and provides a device control method. The method includes the steps of: receiving characteristic control information input by a user, the characteristic control information being one or more of body part information, voice information, and facial information; finding the control instruction corresponding to the characteristic control information input by the user according to a preset correspondence between characteristic control information and control instructions; and controlling a device according to the corresponding control instruction. Because control over the device can be achieved through one or more of body part information, voice information, and facial information, the input device suffers no physical wear, cost is saved, the input method is more flexible, and the user's health benefits.

Description

Device control method
Technical field
The invention belongs to the field of control, and in particular relates to a device control method.
Background art
As people's lives have diversified, electronic games have become a popular form of entertainment and recreation. Electronic games that involve a person's body parts or mental activity not only relax body and mind but also provide exercise, and are therefore well liked.
At present, electronic games are generally controlled manually, through touch-based input devices such as keyboards, mice, and game controllers. Touch-based input devices allow game control instructions to be entered conveniently, but because an electronic game requires repeated operation, wired input devices wear out easily; the material requirements for the input devices are therefore high and their cost is high. On the other hand, because input on such devices generally consists of small-range finger movements, prolonged operation is harmful to the user's health.
Summary of the invention
The object of the embodiments of the present invention is to provide a device control method, so as to solve the problems in the prior art that the input devices of electronic games wear out easily, are costly, and are harmful to the user's health.
The embodiments of the present invention are achieved as follows. A device control method comprises:
receiving characteristic control information input by a user, the characteristic control information being one or more of body part information, voice information, and facial information;
finding, according to a predefined correspondence between characteristic control information and control instructions, the control instruction corresponding to the characteristic control information input by the user; and
controlling the device according to the corresponding control instruction.
In the embodiments of the present invention, one or more items of characteristic control information among body part information, voice information, and facial information input by the user are received, the control instruction corresponding to the input characteristic control information is looked up, and the device is controlled by the control instruction to perform the corresponding action. Because the present invention can control the device through one or more of body part information, voice information, and facial information, it causes no physical wear to an input device, saves cost, offers a more flexible input method, and benefits the user's health.
Brief description of the drawings
Fig. 1 is a flowchart of the implementation of the device control method provided by the first embodiment of the present invention;
Fig. 2 is a flowchart of the implementation of finding the control instruction corresponding to the characteristic control information input by the user, provided by the second embodiment of the present invention;
Fig. 3 is a schematic diagram of establishing coordinates in an image, provided by the second embodiment of the present invention;
Fig. 4 is a flowchart of the implementation of finding the control instruction corresponding to the characteristic control information input by the user, provided by the third embodiment of the present invention;
Fig. 4a is a schematic diagram of generating the corresponding control instruction according to the change direction of the user's body part, provided by the third embodiment of the present invention;
Fig. 5 is a flowchart of the implementation of finding the control instruction corresponding to the characteristic control information input by the user, provided by the fourth embodiment of the present invention;
Fig. 6 is a flowchart of the implementation of finding the control instruction corresponding to the characteristic control information input by the user, provided by the fifth embodiment of the present invention;
Fig. 7 is a schematic flowchart of the device start-up provided by the seventh embodiment of the present invention.
Detailed description of the embodiments
To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention and are not intended to limit it.
The method described in the embodiments of the present invention can be used for various kinds of controlled equipment, and is particularly suitable for electronically controlled game devices, such as the doll machines and ball-based interactive games that are widely used at present. With the traditional handle-based input, the intensity of the game makes the user press too hard or use the handle too frequently when entering control commands, which easily damages the game controller. Moreover, an electronic game controlled through a game controller realizes input mainly through finger activity; this mode is rather monotonous, is unsuitable for prolonged operation, and fails to achieve the purpose of electronic games, namely relaxing body and mind and providing exercise. When the device control method of the present invention is applied to electronic game devices, the use of multiple control modes saves cost and at the same time is more conducive to relaxing the user's body and mind.
Embodiment 1:
Fig. 1 shows the implementation flow of a device control method provided by the first embodiment of the present invention, detailed as follows:
In step S101, characteristic control information input by the user is received, the characteristic control information being one or more of body part information, voice information, and facial information.
Specifically, the body part information may be position information or motion information of a part such as the user's hand, foot, or head.
For the hand, the position information may include positions such as up, down, left, right, upper left, lower left, upper right, and lower right.
When detecting the position information, a recognition time threshold needs to be set in the system. When the dwell time of the user's hand at a fixed position exceeds the threshold, the system recognizes the position information of the hand; when the dwell time at the fixed position is less than the threshold, the system does not recognize it as a control instruction.
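To illustrate the dwell-time recognition described above, the following is a minimal Python sketch, assuming a helper get_hand_region() that returns the hand's current region for each camera frame; the 0.5-second threshold is an illustrative value, since the patent does not specify one:

```python
import time

RECOGNITION_TIME_THRESHOLD = 0.5  # seconds; illustrative value, not specified in the patent

def recognize_hand_position(get_hand_region):
    """Return the hand's region only after it has stayed there longer than the threshold.

    get_hand_region: callable returning the region label ("up", "down", ...) of the
    user's hand in the current camera frame, or None if no hand is visible.
    """
    last_region = None
    dwell_start = None
    while True:
        region = get_hand_region()
        now = time.time()
        if region != last_region:
            # The hand moved to a new position: restart the dwell timer.
            last_region, dwell_start = region, now
        elif region is not None and now - dwell_start >= RECOGNITION_TIME_THRESHOLD:
            # Position held long enough: recognize it as control input.
            return region
```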
The motion information may include motions such as moving up, down, left, right, toward the upper left, toward the lower left, toward the upper right, and toward the lower right.
For the motion information, a reference point needs to be set, for example the centre of the user's two hands. When the user's hand is extended, it can be recognized as a control instruction; when the user's hand is withdrawn and disappears from the camera's field of view, the motion is not recognized as a control instruction.
For the voice information, the voice uttered by the user can be recognized against voice signals stored in the device. Common voice control commands include: up, down, left, right, toward the upper left, toward the lower left, toward the upper right, and toward the lower right. In addition, considering that different users speak different languages, multiple different voice signals can be pre-stored in the device, so that the same device can meet the requirements of multilingual users. Voice signals can also be stored adaptively according to the distribution of the user population of the device.
For the facial information, facial expressions can be recognized by a face recognition algorithm; different expressions such as joy, anger, sorrow, and happiness can represent different instructions, or the direction of eye rotation can be used to input characteristic control information.
The facial information can also be recognized through predefined position-change information, for example defining a smile control instruction by the amplitude by which the corners of the mouth rise.
Specifically, the above characteristic control information can be received by acquiring an image of the user's body part, or the user's facial information, through the device's camera. As other ways in which the present invention can receive control information, the user's body part information can also be obtained through position sensors such as infrared sensors, and the user's voice information can be received through a microphone.
In step S102, according to the predefined correspondence between characteristic control information and control instructions, the control instruction corresponding to the characteristic control information input by the user is looked up.
A correspondence database between characteristic control information and control instructions is pre-stored in the device. The correspondence database contains the control instruction corresponding to each specific state or motion of a body part, the control instructions corresponding to the voice signals defined as control instructions, and the control instructions corresponding to other inputs such as facial expressions.
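By way of illustration only, the correspondence database could be realized as a simple lookup table; the Python sketch below uses invented instruction names and information types, which are not defined in the patent:

```python
# Hypothetical in-memory correspondence table between characteristic control
# information and control instructions (the patent leaves the storage format
# and the instruction names unspecified).
CORRESPONDENCE = {
    ("hand_position", "upper_left"): "MOVE_UPPER_LEFT",
    ("hand_position", "center"):     "STOP",
    ("voice", "forward"):            "MOVE_FORWARD",
    ("face", "smile"):               "GRAB",
}

def find_control_instruction(info_type, info_value):
    """Look up the control instruction for the received characteristic control information."""
    return CORRESPONDENCE.get((info_type, info_value))  # None if no match is defined
```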
In step S103, the device is controlled according to the corresponding control instruction.
After the control instruction corresponding to the characteristic control information is found, the drive motor is driven or stopped by the control instruction, thereby moving or stopping the device. Virtual control instructions can also be sent, for example to a game: by detecting changes in the shape of the user's limbs and mapping them to changes of the game character's limbs, the control becomes more vivid and a more lifelike control effect is obtained.
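A minimal sketch of step S103 follows, assuming hypothetical motor_driver and game interfaces; the patent does not prescribe any particular driver API:

```python
def execute_instruction(instruction, motor_driver, game=None):
    """Dispatch a looked-up control instruction to the device (assumed driver API)."""
    if instruction is None:
        return  # unrecognized input: do nothing
    if instruction.startswith("MOVE_"):
        # Drive the motor in the named direction, e.g. "MOVE_UPPER_LEFT" -> "upper_left".
        motor_driver.move(direction=instruction.removeprefix("MOVE_").lower())
    elif instruction == "STOP":
        motor_driver.stop()
    elif game is not None:
        # Virtual control command, e.g. mapping a limb-shape change onto the game character.
        game.send_virtual_command(instruction)
```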
In the embodiment of the present invention, one or more items of characteristic control information among body part information, voice information, and facial information input by the user are received, the control instruction corresponding to the input characteristic control information is looked up, and the device is controlled by the control instruction to perform the corresponding action. Because the present invention can control the device through one or more of body part information, voice information, and facial information, it causes no physical wear to an input device, saves cost, offers a more flexible input method, and benefits the user's health.
Embodiment 2:
Fig. 2 shows the implementation flow, provided by the second embodiment of the present invention, of finding the control instruction corresponding to the characteristic control information input by the user, where the control instruction includes a moving-direction control instruction; the details are as follows.
The steps of receiving the characteristic control information input by the user and controlling the device according to the control instruction are the same as in Embodiment 1 and are not repeated here.
In step S201, image information of a preset position is acquired through a camera.
Specifically, the preset position is generally located at the front of the device and aimed at the place where the user stands; when the user is not at the preset position, it may not be possible to acquire all of the user's body part information, or only part of it may be acquired.
The camera can acquire images of different body parts of the user. For the user's hand, the camera can be set to aim at the upper part of the user's body and capture an image of the upper body; for the foot or head, the height of the camera can be set accordingly. In addition, the focal range of the camera can be adjusted according to the distance between the user and the device and the range over which the body part may move. Alternatively, the body part can be recognized directly without knowing which part it is; for example, if only a hand is visible it can still be recognized, because at close range other parts of the body do not appear in the image.
In step S202, the coordinate position where the user's body part is located is looked up according to a coordinate division of the image made in advance.
In step S203, a control instruction for the corresponding azimuth direction is generated according to the positional relationship between the coordinate position of the body part and the centre of the image.
A coordinate system is established on the captured image. The origin can be chosen in various ways, for example at the centre of the image or at a corner of the image; Fig. 3 takes the centre of the image as an example.
As shown in Fig. 3, the length of the image is 6a and the width is 6b, displayed at the true proportions of the image obtained by the camera. The image is divided into the nine regions shown in Fig. 3, whose coordinate ranges are respectively:
First region (-3a<x<-a, b<y<3b);
Second region (-a<x<a, b<y<3b);
Third region (a<x<3a, b<y<3b);
Fourth region (-3a<x<-a, -b<y<b);
Fifth region (-a<x<a, -b<y<b);
Sixth region (a<x<3a, -b<y<b);
Seventh region (-3a<x<-a, -3b<y<-b);
Eighth region (-a<x<a, -3b<y<-b);
Ninth region (a<x<3a, -3b<y<-b);
When the body part to be detected, such as the user's hand, is located in the first region, which lies to the upper left of the coordinate origin, i.e. the centre of the image, a control instruction oriented toward the upper left is generated. For game devices such as doll machines, a corresponding drive instruction is sent to control the drive motor to move in that direction. For the second, third, fourth, sixth, seventh, eighth, and ninth regions, drive instructions for the corresponding directions are generated in the same way.
In addition, the distance to be moved under the control instruction can correspond to the distance between the detected position of the body part and the centre of the body.
When the body part to be detected is located in the fifth region, a control instruction to stop moving is sent, and the device stops moving.
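The region-to-direction mapping of Fig. 3 can be expressed compactly; the following Python sketch is a minimal illustration, assuming the image coordinates are already centred on the origin as described above:

```python
def region_to_instruction(x, y, a, b):
    """Map a body-part coordinate to a direction instruction using the 3x3 grid of Fig. 3.

    The image spans -3a..3a horizontally and -3b..3b vertically with the origin at
    the image centre; the central (fifth) region (-a<x<a, -b<y<b) means "stop".
    """
    col = -1 if x < -a else (1 if x > a else 0)   # left / centre / right column
    row = 1 if y > b else (-1 if y < -b else 0)   # top / centre / bottom row
    directions = {
        (-1, 1): "upper_left",  (0, 1): "up",    (1, 1): "upper_right",
        (-1, 0): "left",        (0, 0): "stop",  (1, 0): "right",
        (-1, -1): "lower_left", (0, -1): "down", (1, -1): "lower_right",
    }
    return directions[(col, row)]
```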
As a further optimized embodiment of the present invention, when the device is a grabbing control device, the open or closed state of the fingers can additionally be detected, so that a corresponding drive instruction is generated according to the open/closed state of the fingers to control the device to pick up or release. Of course, this is one concrete application; the finger state can be replaced by another detected body-part state or action, such as a smile or another expression, to generate the instruction that triggers the device to pick up or release.
Likewise, the pick-and-release operation can be triggered by other control-type actions, such as shaking, which can be associated with instructions according to the specific device.
The following takes a doll machine as an example. At the place where the user stands in front of the doll machine, a camera is arranged that can acquire the user's body position information, and this information is shown on a display (of course, all of the body part information can also be acquired). Coordinates are established in the picture and the picture is divided into the nine regions shown in Fig. 3. Using an image recognition algorithm, the gestures in each region are recognized: when both hands are in the fifth region at the same time, the current instruction is a stop instruction; when only one hand is in some region N other than the fifth region, the instruction is judged to be the control instruction for the direction of region N relative to the fifth region, for example an upward control instruction if region N is above the fifth region. In addition, when both hands are outside the fifth region and not in the same region, the control instruction is judged to be invalid. When a fist is clenched and then opened, the grabbing action of the doll machine's claw is triggered.
When the device is a launching control device, other control-type actions can further be detected, and the corresponding launch action is performed when the detection condition is met, as in various shooting game devices.
In the embodiment of the present invention, control instructions are generated from the position of a body part, so that the device can be controlled conveniently without contact; this reduces cost and also allows the user to generate control instructions through varied body movements.
Embodiment 3:
Fig. 4 shows the implementation flow, provided by the third embodiment of the present invention, of finding the control instruction corresponding to the characteristic control information input by the user, where the control instruction includes a moving-direction control instruction or a state control instruction; the details are as follows.
The steps of receiving the characteristic control information input by the user and controlling the device according to the control instruction are the same as in Embodiment 1 and are not repeated here.
In step S401, image information of a preset position is acquired through a camera.
In step S402, the change direction of the user's body part is detected in the image information and a control instruction for that change direction is generated, or the user's form is detected in the image information and a corresponding state control instruction is generated.
Fig. 4a is a schematic diagram, provided by the third embodiment of the present invention, of generating the corresponding control instruction according to the change direction of the user's body part. As shown in Fig. 4a, when a body part of the user, such as the hand, head, or foot, is moved, the picture shows the speed, angle, and amplitude of the corresponding movement, which are converted, according to a corresponding proportional relationship, into the corresponding movement control instruction for the movement controller in the device. Depending on how the specific device controls movement, the movement angle can also be divided into movement control instructions for four or more directions, and the movement amplitude can also be fixed, so that each received movement control instruction moves the device by a fixed, preset distance.
The body part described in the embodiment of the present invention includes the hand, foot, or head.
The allowed directions of hand motion are the outward directions of extension centred on the body (because the body cannot extend outward indefinitely, inward withdrawal of a body part is defined as corresponding to no control instruction, which makes the control operation more convenient). When a movement control instruction is sent to the device, a control instruction for the corresponding movement speed can be generated according to the speed of the hand's motion, and a control instruction for the corresponding movement amplitude can be generated according to the amplitude of the hand's motion.
The foot can move in directions such as forward, backward, left, and right, and the head can move left or right, thereby generating control instructions for the corresponding directions. A sketch of converting such movement into an instruction is given below.
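The following Python sketch illustrates converting two successive positions of a body part into a movement control instruction; the proportionality factor and the dictionary form of the instruction are assumptions, since the patent leaves the exact proportional relationship to the specific device:

```python
import math

def movement_to_instruction(p_prev, p_curr, dt, scale=1.0):
    """Turn successive body-part positions into a move instruction with speed and amplitude.

    p_prev, p_curr: (x, y) positions in two frames; dt: frame interval in seconds;
    scale: illustrative proportionality factor between on-image motion and device motion.
    """
    dx, dy = p_curr[0] - p_prev[0], p_curr[1] - p_prev[1]
    amplitude = math.hypot(dx, dy)
    if amplitude == 0:
        return None  # no motion detected, so no instruction is generated
    return {
        "angle_deg": math.degrees(math.atan2(dy, dx)),  # direction of motion
        "speed": amplitude / dt * scale,                # proportional movement speed
        "distance": amplitude * scale,                  # proportional movement amplitude
    }
```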
The difference between this embodiment and Embodiment 2 is that Embodiment 2 generates the corresponding control instruction from the position of the body part, while this embodiment generates the corresponding control instruction from the direction, speed, and amplitude of the body part's movement, making the operation more flexible.
Of course, state information of the user's body part can also be included: if the user's gesture is a fist, a stop control instruction is generated; if the user's gesture is a "V" shape or an "OK" gesture, a grab control instruction is generated. In the same way as the definitions in Embodiment 2, other similar body-part actions or states can also be defined, together with the control instructions they should trigger.
Embodiment 4:
Fig. 5 shows the implementation flow, provided by the fourth embodiment of the present invention, of finding the control instruction corresponding to the characteristic control information input by the user, where the control instruction includes a moving-direction control instruction or a state control instruction; the details are as follows.
The steps of receiving the characteristic control information input by the user and controlling the device according to the control instruction are the same as in Embodiment 1 and are not repeated here.
In step S501, the voice signal uttered by the user is acquired through a microphone.
The voice signal corresponding to each control instruction is generally preset in the device; for example, the voice signal corresponding to the "move forward" control instruction may be the word "forward" or the phrase "move forward". Of course, the correspondence between voice signals and instructions can be defined in more varied ways according to the specific application.
The microphone converts the detected voice signal into an electrical signal and generates voice data for subsequent comparison.
In step S502, the control instruction corresponding to the uttered voice signal is obtained according to the preset correspondence between voice signals and control instructions.
Voice data sequences are pre-stored in the device; by comparing changes in the voice data, it is determined whether the voice signal received by the microphone corresponds to a control instruction that needs to be executed.
When the voice data detected by the microphone is identical or substantially similar to the voice data stored in the device, the corresponding control instruction is generated and the device is controlled to perform the corresponding operation.
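A minimal sketch of the comparison in step S502 follows, assuming the voice signal has already been converted into a feature vector; the feature extraction and the 0.8 similarity threshold are illustrative assumptions, as the patent does not specify them:

```python
import numpy as np

def match_voice_command(features, templates, threshold=0.8):
    """Compare extracted voice features against pre-stored templates (cosine similarity).

    features: 1-D feature vector of the detected utterance; templates: dict mapping
    instruction names to stored feature vectors of the same length.
    """
    best_name, best_score = None, threshold
    for name, template in templates.items():
        score = float(np.dot(features, template) /
                      (np.linalg.norm(features) * np.linalg.norm(template)))
        if score > best_score:  # "identical or substantially similar"
            best_name, best_score = name, score
    return best_name  # None when no stored command is similar enough
```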
This embodiment controls the device by receiving the user's voice signal. It will be appreciated that this embodiment can likewise be combined with the technical features of Embodiment 2 and Embodiment 3 to jointly realize varied control of the device.
Embodiment 5:
Fig. 6 shows the implementation flow, provided by the fifth embodiment of the present invention, of finding the control instruction corresponding to the characteristic control information input by the user, where the control instruction includes a moving-direction control instruction or a state control instruction; the details are as follows.
The steps of receiving the characteristic control information input by the user and controlling the device according to the control instruction are the same as in Embodiment 1 and are not repeated here.
In step S601, image information of a preset position is acquired through a camera.
To improve the accuracy of detection, the camera can be placed close to the user's head, and the resolution of the camera can also be increased so that clearer facial information is obtained.
In step S602, the user's facial expression is detected in the image information, and the control instruction corresponding to the user's facial expression is generated according to the correspondence between facial expressions and control instructions.
In the embodiment of the present invention, facial expressions corresponding to face shapes can be predefined; for example, moods such as smiling, laughing, crying, and normal can be defined from the coordinate relationships of face recognition points, and control instructions are defined for the different moods. The user's facial information is acquired through the camera and matched against the corresponding expression, so that the corresponding control instruction is generated.
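As an illustration of matching face recognition points to an expression and then to an instruction, the following Python sketch uses invented landmark names and thresholds; the patent only requires that the coordinate relationships of the recognition points distinguish the expressions:

```python
def classify_expression(landmarks):
    """Very rough expression classifier from face-recognition point coordinates.

    landmarks: dict of named (x, y) facial points (mouth corners, mouth centre, ...).
    The geometry rule and the threshold of 5 pixels below are illustrative assumptions.
    """
    left, right = landmarks["mouth_left"], landmarks["mouth_right"]
    centre = landmarks["mouth_centre"]
    lift = centre[1] - (left[1] + right[1]) / 2.0  # mouth-corner lift relative to the mouth centre
    if lift > 5:
        return "smile"
    if lift < -5:
        return "sad"
    return "normal"

# Illustrative mapping from expressions to control instructions.
EXPRESSION_TO_INSTRUCTION = {"smile": "GRAB", "sad": "RELEASE", "normal": None}
```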
This embodiment controls the device by receiving the user's facial information. It will be appreciated that this embodiment can likewise be combined with the technical features of Embodiments 2, 3, and 4 to jointly realize varied control of the device.
Embodiment 6:
Before any one of Embodiments 1 to 5 of the present invention, a step of starting the device described in the embodiments of the present invention may also be included:
The device described in the embodiment of the present invention comprises an operating system connected to the Internet. The device carries a graphic identifier that can be used to identify the device, so that the user can scan the graphic identifier and send interaction information associated with the device.
The graphic identifier may comprise a two-dimensional code or another identifier of the current device and of the interaction platform corresponding to the current device. By scanning the two-dimensional code, or in another way, the information of the interaction platform corresponding to the current device is obtained.
When new interaction information containing the device identifier is detected to have been generated, a start instruction for the device is generated.
The interaction information containing the device identifier may include following the WeChat account corresponding to the device, or forwarding or commenting on a microblog (Weibo) post related to the device, and the like. After receiving interaction information that meets the requirements, the server sends the start instruction to the device over the network, completing the start-up of the device.
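A server-side sketch of this start-up flow follows, with assumed interfaces server.fetch_new_interactions() and device.send_start_instruction(); the patent specifies only the overall flow, not a concrete API:

```python
def check_interactions_and_start(server, device_id, device):
    """Start the device when new interaction information mentioning it appears.

    server.fetch_new_interactions is assumed to return recently generated interaction
    records (follows, forwards, comments) as dictionaries; device.send_start_instruction
    is assumed to deliver the start instruction to the device over the network.
    """
    for interaction in server.fetch_new_interactions():
        if device_id in interaction.get("mentioned_devices", []):
            device.send_start_instruction()
            return True
    return False
```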
The embodiment of the present invention provides an interaction platform containing device information and generates the start instruction according to the received interaction information, completing the start-up control of the device; this makes the ways of starting the device more varied.
Embodiment 7:
Before any one of Embodiments 1 to 5 of the present invention, a step of starting the device described in the embodiments of the present invention may also be included, as shown in Fig. 7:
In step S701, a network password input by the user is received.
Optionally, the step of receiving the network password input by the user comprises:
scanning, through a camera, a graphic identifier or data message obtained on a user terminal, where the graphic identifier or data message is generated by a server and sent to the user terminal for display, or is generated by the user terminal after the server sends an information command to the user terminal; and
parsing the password information corresponding to the graphic identifier or data message.
The graphic identifier may be a two-dimensional code.
Optionally, the step of receiving the network password input by the user comprises:
receiving, through a microphone, sound information sent by the user terminal, where the sound information is generated by the server and sent to the user terminal, or is generated by the user terminal after the server sends a certain information command to the user terminal; and
parsing the password information corresponding to the sound information.
The sound information may carry the password information through changes in the frequency of the sound, or through the rate or amplitude of changes in the sound intensity.
In step S702, the server is queried over a network connection as to whether the network password is valid.
After the password information is parsed from the input sound information, graphic identifier, or data message, it is judged whether it meets the preset requirements.
In step S703, when the network password is valid, a start instruction for the device is generated.
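Steps S701 to S703 can be summarized in a short Python sketch; read_password, server.is_password_valid, and device.send_start_instruction are assumed interfaces, not defined in the patent:

```python
def start_with_network_password(read_password, server, device):
    """Sketch of steps S701-S703: receive a network password, verify it, start the device.

    read_password: callable that returns the password parsed from a scanned two-dimensional
    code, data message, or sound signal, or None if nothing was read.
    """
    password = read_password()                            # S701: receive the network password
    if password and server.is_password_valid(password):  # S702: query the server over the network
        device.send_start_instruction()                   # S703: generate the start instruction
        return True
    return False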
In addition, similarly to the embodiment of the present invention, when the device is connected to the Internet, before the step of receiving the characteristic control information input by the user, the characteristic control information being one or more of body part information, voice information, and facial information, the method also comprises:
querying, over a network connection to the server, the status result of the interaction information between the user and the server, where the interaction information may include the status result after one or more operations such as a social media payment, a follow, a forward or share, a comment, or a game-points interaction; and
when the status result is valid, generating a start instruction for the device.
For example, the user obtains the right to start the device by paying with networked electronic virtual currency or real currency, or through interaction information; it is judged whether the number of interactions and the interaction content meet the preset requirements, whether the payment is sufficient, and so on, so as to start the device. Alternatively, payment can be made directly through the short-range communication of a wearable electronic device.
In addition, to further improve convenience for the user, the device is provided with an audio-visual system that can play various videos, music, or pictures, so that the user can watch or listen while controlling the device, and/or display interaction data exchanged with the user.
The interaction data may include data such as points and level data in a game, or follows, forwards, and comments on an interaction platform such as a microblog (Weibo), WeChat, or a personal homepage. The data from the social platform is shown on the display screen, and after the user completes the interaction, the game starts.
The above is only a preferred embodiment of the present invention and is not intended to limit the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

Claims (13)

1. A device control method, characterized in that the method comprises:
receiving characteristic control information input by a user, the characteristic control information being one or more of body part information, voice information, and facial information;
finding, according to a predefined correspondence between characteristic control information and control instructions, the control instruction corresponding to the characteristic control information input by the user; and
controlling the device according to the corresponding control instruction.
2. The method according to claim 1, characterized in that the characteristic control information is body part information, the control instructions of the device comprise moving-direction control instructions, and the step of finding, according to the predefined correspondence between characteristic control information and control instructions, the control instruction corresponding to the characteristic control information input by the user comprises:
acquiring image information of a preset position through a camera;
looking up the coordinate position where the user's body part is located, according to a coordinate division of the image made in advance; and
generating a control instruction for the corresponding azimuth direction according to the positional relationship between the coordinate position of the body part and the centre of the image.
3. The method according to claim 1, characterized in that the characteristic control information is body part information, the control instructions of the device comprise moving-direction or state control instructions, and finding, according to the predefined correspondence between characteristic control information and control instructions, the control instruction corresponding to the characteristic control information input by the user comprises:
acquiring image information of a preset position through a camera; and
detecting, in the image information, the change direction of the user's body part and generating a control instruction for the change direction, or detecting the user's form in the image information and generating a corresponding state control instruction.
4. The method according to claim 2 or 3, characterized in that the body part is a hand, a head, or a foot.
5. The method according to claim 1, characterized in that the characteristic control information is voice, and finding, according to the predefined correspondence between characteristic control information and control instructions, the control instruction corresponding to the characteristic control information input by the user comprises:
acquiring, through a microphone, the voice signal uttered by the user; and
obtaining the control instruction corresponding to the uttered voice signal according to a preset correspondence between voice signals and control instructions.
6. The method according to claim 1, characterized in that the characteristic control information is facial information, and finding, according to the predefined correspondence between characteristic control information and control instructions, the control instruction corresponding to the characteristic control information input by the user further comprises:
acquiring image information of a preset position through a camera; and
detecting the user's facial expression in the image information, and generating the control instruction corresponding to the user's facial expression according to the correspondence between facial expressions and control instructions.
7. The method according to any one of claims 1, 2, 3, 5, and 6, characterized in that the device is a grabbing or launching control device, the control instructions of the device further comprise grab or launch instructions, and finding, according to the predefined correspondence between characteristic control information and control instructions, the control instruction corresponding to the characteristic control information input by the user further comprises:
detecting the open or closed state of the user's fingers, and generating a corresponding drive instruction according to the open/closed state of the fingers to control the picking, placing, or launching of the device.
8. The method according to claim 1, characterized in that the device is connected to the Internet, and before the step of receiving the characteristic control information input by the user, the characteristic control information being one or more of body part information, voice information, and facial information, the method further comprises:
providing the device with a graphic identifier that can be used to identify the device, so that the user can scan the graphic identifier and send interaction information associated with the device; and
when new interaction information containing the device identifier is detected to have been generated, generating a start instruction for the device.
9. The method according to claim 1, characterized in that the device is connected to the Internet, and before the step of receiving the characteristic control information input by the user, the characteristic control information being one or more of body part information, voice information, and facial information, the method further comprises:
receiving a network password input by the user;
querying a server over a network connection as to whether the network password is valid; and
when the network password is valid, generating a start instruction for the device.
10. The method according to claim 9, characterized in that the step of receiving the network password input by the user comprises:
scanning, through a camera, a graphic identifier or data message obtained on a user terminal, where the graphic identifier or data message is generated by a server and sent to the user terminal for display, or is generated by the user terminal after the server sends an information command to the user terminal; and
parsing the password information corresponding to the graphic identifier or data message.
11. The method according to claim 9, characterized in that the step of receiving the network password input by the user comprises:
receiving, through a microphone, sound information sent by a user terminal, where the sound information is generated by a server and sent to the user terminal, or is generated by the user terminal after the server sends a certain information command to the user terminal; and
parsing the password information corresponding to the sound information.
12. The method according to claim 1, characterized in that the device is connected to the Internet, and before the step of receiving the characteristic control information input by the user, the characteristic control information being one or more of body part information, voice information, and facial information, the method further comprises:
querying, over a network connection to a server, the status result of the interaction information between the user and the server; and
when the status result is valid, generating a start instruction for the device.
13. The method according to any one of claims 1, 2, 3, 5, 6, 8, 9, 10, 11, and 12, characterized in that the device is provided with an audio-visual system, the audio-visual system being used to play various videos, music, or pictures, so that the user can watch or listen while controlling the device, and/or to display interaction data exchanged with the user.
CN201410149230.4A 2014-04-14 2014-04-14 Device control method Pending CN103961869A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201410149230.4A CN103961869A (en) 2014-04-14 2014-04-14 Device control method
CN201710420565.9A CN107346593B (en) 2014-04-14 2014-04-14 Equipment starting control method
CN201710420621.9A CN107320948B (en) 2014-04-14 2014-04-14 Equipment starting control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410149230.4A CN103961869A (en) 2014-04-14 2014-04-14 Device control method

Related Child Applications (2)

Application Number Title Priority Date Filing Date
CN201710420565.9A Division CN107346593B (en) 2014-04-14 2014-04-14 Equipment starting control method
CN201710420621.9A Division CN107320948B (en) 2014-04-14 2014-04-14 Equipment starting control method

Publications (1)

Publication Number Publication Date
CN103961869A true CN103961869A (en) 2014-08-06

Family

ID=51232162

Family Applications (3)

Application Number Title Priority Date Filing Date
CN201710420565.9A Active CN107346593B (en) 2014-04-14 2014-04-14 Equipment starting control method
CN201410149230.4A Pending CN103961869A (en) 2014-04-14 2014-04-14 Device control method
CN201710420621.9A Active CN107320948B (en) 2014-04-14 2014-04-14 Equipment starting control method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201710420565.9A Active CN107346593B (en) 2014-04-14 2014-04-14 Equipment starting control method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201710420621.9A Active CN107320948B (en) 2014-04-14 2014-04-14 Equipment starting control method

Country Status (1)

Country Link
CN (3) CN107346593B (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104463152A (en) * 2015-01-09 2015-03-25 京东方科技集团股份有限公司 Gesture recognition method and system, terminal device and wearable device
CN104536563A (en) * 2014-12-12 2015-04-22 林云帆 Electronic equipment control method and system
CN105528080A (en) * 2015-12-21 2016-04-27 魅族科技(中国)有限公司 Method and device for controlling mobile terminal
CN106325112A (en) * 2015-06-25 2017-01-11 联想(北京)有限公司 Information processing method and electronic equipment
CN107670277A (en) * 2017-08-14 2018-02-09 武汉斗鱼网络科技有限公司 A kind of method, electronic equipment and the storage medium of real-time live broadcast game
CN107804590A (en) * 2017-10-26 2018-03-16 台山市彼思捷礼品有限公司 One kind classification bonbon box
CN107863103A (en) * 2017-09-29 2018-03-30 珠海格力电器股份有限公司 A kind of apparatus control method, device, storage medium and server
CN107948052A (en) * 2017-11-14 2018-04-20 福建中金在线信息科技有限公司 Information crawler method, apparatus, electronic equipment and system
CN110075512A (en) * 2018-01-26 2019-08-02 北京云点联动科技发展有限公司 A method of grabbing doll machine and board light and player interaction game
CN110119700A (en) * 2019-04-30 2019-08-13 广州虎牙信息科技有限公司 Virtual image control method, virtual image control device and electronic equipment
WO2020051723A1 (en) * 2018-09-14 2020-03-19 朱恩辛 Vending machine and voice interaction system and method applied to vending machine
CN111209050A (en) * 2020-01-10 2020-05-29 北京百度网讯科技有限公司 Method and device for switching working mode of electronic equipment

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108704306A (en) * 2018-05-22 2018-10-26 广州数娱信息科技有限公司 Recreation amusement facility and its startup control system and startup control method
CN108635830A (en) * 2018-05-22 2018-10-12 广州数娱信息科技有限公司 Recreation amusement facility and its startup control system and startup control method
CN109224431A (en) * 2018-09-14 2019-01-18 苏州乐聚堂电子科技有限公司 Grab doll machine
CN112089596A (en) * 2020-05-22 2020-12-18 未来穿戴技术有限公司 Friend adding method of neck massager, neck massager and readable storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1988703A (en) * 2006-12-01 2007-06-27 深圳市飞天网景通讯有限公司 Method for realizing information interactive operation based on shootable mobile terminal
CN101902554A (en) * 2009-05-25 2010-12-01 戴维 Intelligent set top box and image processing method thereof
CN103246351A (en) * 2013-05-23 2013-08-14 刘广松 User interaction system and method
CN103268769A (en) * 2013-02-06 2013-08-28 方科峰 Application method of video-audio system based on voice keyboard
CN103390123A (en) * 2012-05-08 2013-11-13 腾讯科技(深圳)有限公司 User authentication method, user authentication device and intelligent terminal
CN103475991A (en) * 2013-08-09 2013-12-25 刘波涌 Role play realization method and system

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6766036B1 (en) * 1999-07-08 2004-07-20 Timothy R. Pryor Camera based man machine interfaces
CN1444382A (en) * 2002-03-08 2003-09-24 株式会社唯红 Game system
CN1565688A (en) * 2003-07-09 2005-01-19 黄贤达 Single-tone controlled controlling device
US7887404B2 (en) * 2005-01-27 2011-02-15 Igt Lottery and gaming systems with single representation for multiple instant win game outcomes
CN101393599B (en) * 2007-09-19 2012-02-08 中国科学院自动化研究所 Game role control method based on human face expression
CN102098567B (en) * 2010-11-30 2013-01-16 深圳创维-Rgb电子有限公司 Interactive television system and control method thereof
CN102671383A (en) * 2011-03-08 2012-09-19 德信互动科技(北京)有限公司 Game implementing device and method based on acoustic control
CN103368925B (en) * 2012-04-10 2017-10-27 天津米游科技有限公司 A kind of method of device authentication
CN102880691B (en) * 2012-09-19 2015-08-19 北京航空航天大学深圳研究院 A kind of mixing commending system based on user's cohesion and method
CN103252088B (en) * 2012-12-25 2015-10-28 上海绿岸网络科技股份有限公司 Outdoor scene scanning game interactive system
CN103327109A (en) * 2013-06-27 2013-09-25 腾讯科技(深圳)有限公司 Method, terminals and system for accessing game and method and servers for processing game
CN103529945A (en) * 2013-10-18 2014-01-22 深圳市凡趣科技有限公司 Method and system for having control over computer game
CN103678613B (en) * 2013-12-17 2017-01-25 北京启明星辰信息安全技术有限公司 Method and device for calculating influence data

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1988703A (en) * 2006-12-01 2007-06-27 深圳市飞天网景通讯有限公司 Method for realizing information interactive operation based on shootable mobile terminal
CN101902554A (en) * 2009-05-25 2010-12-01 戴维 Intelligent set top box and image processing method thereof
CN103390123A (en) * 2012-05-08 2013-11-13 腾讯科技(深圳)有限公司 User authentication method, user authentication device and intelligent terminal
CN103268769A (en) * 2013-02-06 2013-08-28 方科峰 Application method of video-audio system based on voice keyboard
CN103246351A (en) * 2013-05-23 2013-08-14 刘广松 User interaction system and method
CN103475991A (en) * 2013-08-09 2013-12-25 刘波涌 Role play realization method and system

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104536563A (en) * 2014-12-12 2015-04-22 林云帆 Electronic equipment control method and system
CN104463152B (en) * 2015-01-09 2017-12-08 京东方科技集团股份有限公司 A kind of gesture identification method, system, terminal device and Wearable
CN104463152A (en) * 2015-01-09 2015-03-25 京东方科技集团股份有限公司 Gesture recognition method and system, terminal device and wearable device
CN106325112B (en) * 2015-06-25 2020-03-24 联想(北京)有限公司 Information processing method and electronic equipment
CN106325112A (en) * 2015-06-25 2017-01-11 联想(北京)有限公司 Information processing method and electronic equipment
CN105528080A (en) * 2015-12-21 2016-04-27 魅族科技(中国)有限公司 Method and device for controlling mobile terminal
CN107670277A (en) * 2017-08-14 2018-02-09 武汉斗鱼网络科技有限公司 A kind of method, electronic equipment and the storage medium of real-time live broadcast game
CN107863103A (en) * 2017-09-29 2018-03-30 珠海格力电器股份有限公司 A kind of apparatus control method, device, storage medium and server
CN107804590A (en) * 2017-10-26 2018-03-16 台山市彼思捷礼品有限公司 One kind classification bonbon box
CN107948052A (en) * 2017-11-14 2018-04-20 福建中金在线信息科技有限公司 Information crawler method, apparatus, electronic equipment and system
CN110075512A (en) * 2018-01-26 2019-08-02 北京云点联动科技发展有限公司 A method of grabbing doll machine and board light and player interaction game
WO2020051723A1 (en) * 2018-09-14 2020-03-19 朱恩辛 Vending machine and voice interaction system and method applied to vending machine
CN110119700A (en) * 2019-04-30 2019-08-13 广州虎牙信息科技有限公司 Virtual image control method, virtual image control device and electronic equipment
CN110119700B (en) * 2019-04-30 2020-05-15 广州虎牙信息科技有限公司 Avatar control method, avatar control device and electronic equipment
CN111209050A (en) * 2020-01-10 2020-05-29 北京百度网讯科技有限公司 Method and device for switching working mode of electronic equipment

Also Published As

Publication number Publication date
CN107346593B (en) 2021-06-08
CN107346593A (en) 2017-11-14
CN107320948A (en) 2017-11-07
CN107320948B (en) 2021-01-29

Similar Documents

Publication Publication Date Title
CN103961869A (en) Device control method
US11826636B2 (en) Depth sensing module and mobile device including the same
US20200099960A1 (en) Video Stream Based Live Stream Interaction Method And Corresponding Device
US20200142490A1 (en) Haptic gloves for virtual reality systems and methods of controlling the same
US10474238B2 (en) Systems and methods for virtual affective touch
US20110154266A1 (en) Camera navigation for presentations
US20130113829A1 (en) Information processing apparatus, display control method, and program
CN104281260A (en) Method and device for operating computer and mobile phone in virtual world and glasses adopting method and device
CN202150897U (en) Body feeling control game television set
CN111045511B (en) Gesture-based control method and terminal equipment
EP3726843B1 (en) Animation implementation method, terminal and storage medium
CN105324736A (en) Techniques for touch and non-touch user interaction input
WO2012119371A1 (en) User interaction system and method
CN102939574A (en) Character selection
CN105872723A (en) Video sharing method and device based on virtual reality system
CN107291221A (en) Across screen self-adaption accuracy method of adjustment and device based on natural gesture
CN108379817A (en) Limb rehabilitation training system and method
CN106873760A (en) Portable virtual reality system
CN103558913A (en) Virtual input glove keyboard with vibration feedback function
EP3367216A1 (en) Systems and methods for virtual affective touch
CN113419634A (en) Display screen-based tourism interaction method
Vyas et al. Gesture recognition and control
CN104460962A (en) 4D somatosensory interaction system based on game engine
CN106303722A (en) The method and device that a kind of animation is play
CN206475183U (en) Robot

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20140806