CN107320948B - Equipment starting control method - Google Patents

Equipment starting control method

Info

Publication number
CN107320948B
Authority
CN
China
Prior art keywords
information
control
user
equipment
control command
Prior art date
Legal status
Active
Application number
CN201710420621.9A
Other languages
Chinese (zh)
Other versions
CN107320948A (en)
Inventor
Lin Yunfan (林云帆)
Current Assignee
Guangzhou D&e Information Technology Co ltd
Original Assignee
Guangzhou D&e Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou D&e Information Technology Co ltd filed Critical Guangzhou D&e Information Technology Co ltd
Priority to CN201710420621.9A priority Critical patent/CN107320948B/en
Publication of CN107320948A publication Critical patent/CN107320948A/en
Application granted granted Critical
Publication of CN107320948B publication Critical patent/CN107320948B/en

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07F COIN-FREED OR LIKE APPARATUS
    • G07F17/00Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/32Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
    • G07F17/3202Hardware aspects of a gaming system, e.g. components, construction, architecture thereof
    • G07F17/3204Player-machine interfaces
    • G07F17/3209Input means, e.g. buttons, touch screen
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07F COIN-FREED OR LIKE APPARATUS
    • G07F17/00Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/32Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
    • G07F17/326Game play aspects of gaming systems

Abstract

The invention provides a device start control method, which comprises: receiving interaction information, adapted to the device, that a user sends after scanning a graphic identifier configured on the device; when new interaction information containing the graphic identifier is detected, generating a start instruction for the device to complete its start-up; receiving feature control information input by the user, wherein the feature control information is one or more of body part information, voice information, or face information; looking up the control instruction corresponding to the input feature control information according to a preset correspondence between feature control information and control instructions; and controlling the device according to the corresponding control instruction. By providing an interaction platform containing the device information and generating the start instruction from the received interaction information, the invention completes start-up control of the device, promotes interactivity between user and device, diversifies the start-up mode, makes the user's input more flexible, and benefits the user's physical health.

Description

Equipment starting control method
This application is a divisional application of the invention patent application filed on April 14, 2014, with application number 201410149230.4 and entitled "A device control method".
Technical Field
The invention belongs to the field of control, and particularly relates to a device starting control method.
Background
As people's lives diversify, electronic games have become a popular form of leisure and entertainment. Electronic games that engage the player's body or mind not only help people relax but also give them exercise, and are therefore well liked.
Currently, an electronic game device is typically started by inserting coins (cash or physical game tokens) and then controlled manually, that is, through touch input devices such as a keyboard, mouse, or joystick. This approach has the following disadvantages:
starting the device by coin insertion lacks interaction between the user and the electronic game device, and the start-up mode is monotonous; control commands can be entered well through the touch input device, but electronic games require repeated operation, which easily wears the wired input device, places high demands on its materials, and raises its cost.
Disclosure of Invention
The embodiment of the invention aims to provide a device start control method to solve the problems that prior-art electronic game devices lack interactivity, have a monotonous start-up mode, easily wear out their input devices, cost much, and do not benefit users' physical health.
The embodiment of the invention is realized as a method for controlling the start-up of a device, wherein the device is connected to the Internet and provided with a graphic identifier that can be used to identify the device, and the method comprises the following steps:
receiving interaction information, adapted to the device, sent by a user after scanning the graphic identifier;
when new interaction information containing the graphic identifier is detected, generating a start instruction for the device to complete its start-up;
receiving feature control information input by the user, wherein the feature control information is one or more of body part information, voice information, or face information;
looking up the control instruction corresponding to the input feature control information according to a preset correspondence between feature control information and control instructions;
and controlling the device according to the corresponding control instruction.
In the embodiment of the invention, start-up control of the device is completed by providing an interaction platform containing the device information and generating a start instruction from the received interaction information, which promotes interactivity between user and device and diversifies the start-up mode. The device can be controlled through one or more of body part information, voice information, or face information, which causes no physical wear on the input device, saves cost, makes the user's input more flexible, and benefits the user's health.
Drawings
Fig. 1 is a flowchart of an implementation of a device start control method according to a first embodiment of the present invention;
fig. 2 is a flowchart of searching for the control instruction corresponding to the feature control information input by the user according to a second embodiment of the present invention;
FIG. 3 is a schematic diagram of establishing coordinates in an image according to the second embodiment of the present invention;
fig. 4 is a flowchart of searching for the control instruction corresponding to the feature control information input by the user according to a third embodiment of the present invention;
fig. 4a is a schematic diagram of generating a corresponding control command from the direction of change of a user's body part according to the third embodiment of the present invention;
fig. 5 is a flowchart of searching for the control instruction corresponding to the feature control information input by the user according to a fourth embodiment of the present invention;
fig. 6 is a flowchart of searching for the control instruction corresponding to the feature control information input by the user according to a fifth embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The method of the embodiment of the invention can be used for various controlled devices, in particular electronically controlled game devices such as the widely used doll (claw) machines and interactive ball games. In the traditional approach, the electronic game device is started by inserting coins (cash or physical game tokens), which lacks interactivity between the user and the device and makes the start-up mode monotonous. In the traditional joystick-input approach, the intensity of play means users press too hard or too frequently when entering control commands, and the game controller is easily damaged. Moreover, an electronic game controlled through a game controller takes its control input mainly from finger movement, which is a single-mode input, ill-suited to long sessions, and defeats the purpose of using electronic games to relax the body and mind and build fitness. When the start control method of the invention is applied to an electronic game device, the device is connected to the Internet, information interaction between the user and the device is added, an interaction platform containing the device information is provided, and a start instruction is generated from the received interaction information to complete the device's start-up, which promotes interactivity between user and device and diversifies start-up; and the use of multiple control modes saves cost and helps the user relax body and mind.
Example one:
Fig. 1 is a flowchart illustrating an implementation of a device start control method according to the first embodiment of the present invention, where the device includes an operating system connected to the Internet. The implementation process is detailed as follows:
in step S101, receiving interaction information adapted to the device sent from a user after scanning the graphic identifier;
specifically, a graphical identifier for identifying the device is provided on the device, so that a user can scan the graphical identifier and send interactive information adapted to the device. I.e. the device receives the interaction information. The graphic identifier may be a two-dimensional code or other identifiers including the current device identifier and the interactive platform corresponding to the current device. And scanning the two-dimensional code or the identification of the interactive platform by a user to acquire the interactive information corresponding to the current equipment.
In step S102, when it is detected that new interaction information including the graphic identifier is generated, a start instruction of the device is generated, and start of the device is completed;
specifically, the interaction information including the device identifier may include attention to a wechat account corresponding to the device, forwarding of a related microblog article of the device, comment information, and the like. And after receiving the interaction information meeting the requirements, the server sends a starting instruction to the equipment through the network to finish the starting of the equipment.
In step S103, feature control information input by a user is received, wherein the feature control information is one or more of body part information, voice information or face information.
Specifically, the body part information may be position information or motion information of a part of the user, such as a hand, a foot, or a head.
For the hand, the position information may include, for example: up, down, left, right, upper left, lower left, upper right, lower right, and so on.
When detecting position information, a recognition time threshold must be set in the system: when the user's hand stays at a position for longer than the threshold, the system recognizes the hand's position information; when the dwell time falls short of the threshold, the system does not recognize it as a control instruction.
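A minimal sketch of this dwell-time rule, assuming the upstream vision code already reports which position or region the hand occupies; the threshold value and all names are illustrative, since the patent does not fix them.

```python
import time

class DwellDetector:
    """Recognize a hand position only after it has stayed put longer than a threshold."""

    def __init__(self, threshold_s=0.8):               # threshold value is illustrative
        self.threshold_s = threshold_s
        self.last_position = None
        self.entered_at = None

    def update(self, position, now=None):
        # Feed the position (e.g. a region label) the hand currently occupies.
        # Returns the position once the dwell time exceeds the threshold, else None.
        now = time.monotonic() if now is None else now
        if position != self.last_position:
            self.last_position, self.entered_at = position, now
            return None
        if now - self.entered_at >= self.threshold_s:
            return position                            # recognized as position information
        return None

detector = DwellDetector()
detector.update("upper_left", now=0.0)                 # hand enters a position
print(detector.update("upper_left", now=1.0))          # held past threshold -> "upper_left"
```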
The motion information may include movements up, down, left, right, upper left, lower left, upper right, lower right, and so on.
Motion information requires a reference point, for which the center position between the user's two hands is used. When the user's hand extends outward, a control instruction can be recognized; when the hand retracts and leaves the camera's visible area, no control instruction is recognized.
The voice information may be matched against voice signals stored in the device so as to recognize speech uttered by the user. Common voice control instructions may include: up, down, left, right, upper left, lower left, upper right, lower right, and so on. In addition, considering that different users speak different languages, several different voice signals may be prestored in the device so that the same device can serve users of several languages. The stored voice signals can also be adapted to the population that actually uses the device.
The face information can be mapped to different instructions by a face recognition algorithm that recognizes facial expressions, such as the differences among happiness, anger, sorrow, and joy, or feature control information can be input according to the rotation direction of the user's eyes.
The face information may also be identified through preset part-variation information, such as a smile control command defined by how far the mouth corners rise.
Specifically, the feature control information may be received by capturing an image of the user's position through the device's camera, or by capturing the user's facial information through the camera. As another way of receiving control information in the present invention, a position-sensing sensor such as an infrared sensor may be used to obtain the user's body part information. The user's voice information may be received through a microphone.
In step S104, a control command corresponding to the feature control information input by the user is searched for according to a preset correspondence between the feature control information and the control command.
A database of correspondences between feature control information and control instructions is stored in the device in advance. This database contains the control instruction corresponding to each specific body part state or action, the control instructions corresponding to the voice signals designated as control instructions, and the control instructions corresponding to facial expressions and other actions.
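The following sketch shows one plausible shape for such a correspondence database, as a simple lookup table; the keys, entries, and instruction names are illustrative, not taken from the patent.

```python
# Feature control information -> control instruction (all entries illustrative).
CORRESPONDENCE = {
    ("hand_position", "upper_left"): "MOVE_UP_LEFT",
    ("hand_position", "center"):     "STOP",
    ("gesture", "fist_then_open"):   "GRAB",
    ("voice", "forward"):            "MOVE_FORWARD",
    ("expression", "smile"):         "GRAB",
}

def lookup_instruction(kind, value):
    # Return the stored control instruction, or None if no correspondence exists.
    return CORRESPONDENCE.get((kind, value))

print(lookup_instruction("voice", "forward"))          # -> MOVE_FORWARD
```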
In step S105, the device is controlled according to the corresponding control instruction.
After the control instruction corresponding to the feature control information is found, the instruction drives or stops the drive motor, thereby controlling the device's movement and stopping. A virtual control instruction for a game can also be sent: for example, changes in the user's limb posture are detected and mirrored in the limb posture of the game character, which makes the control more intuitive and the control effect more vivid.
The embodiment of the invention adds information interaction between the user and the electronic game device: by providing an interaction platform containing the device information and generating a start instruction from the received interaction information, it completes start-up control of the device and further diversifies the device's start-up mode. The device can be controlled through one or more of body part information, voice information, or face information, which causes no physical wear on the input device, saves cost, makes the input mode more flexible, and benefits the user's health.
Example two:
fig. 2 shows an implementation process of finding a control instruction corresponding to the feature control information input by the user according to the second embodiment of the present invention, where the control instruction includes a moving direction control instruction, which is described in detail below.
The steps of receiving the feature control information input by the user and controlling the device according to the control instruction are the same as those of the first embodiment, and are not repeated herein.
In step S201, image information of a preset position is acquired by a camera.
Specifically, the preset position is generally at the front of the device, aligned with where the user stands; when the user is not at the preset position, none or only part of the user's body part information may be captured.
The camera may capture images of different body parts of the user. For the hands, the camera can be aimed at the user's upper body to capture its image; for the feet or head, the camera's height can be set accordingly. In addition, the camera's focal range is adjusted according to the distance between the user and the device and the possible range of motion of the body part. Alternatively, the hand can be recognized directly without identifying the body part, since at close range no other body parts appear in the image.
In step S202, the coordinate position of the body part of the user is searched according to the coordinate division of the image in advance.
In step S203, a control command of the azimuth direction is generated according to the azimuth relationship between the coordinate position of the body part and the center of the image.
Coordinates are established on the captured image. The origin can be chosen in various ways, such as the center point of the image or a corner of the image; fig. 3 illustrates coordinates established with the image center as the origin.
As shown in fig. 3, the image is 6a long and 6b wide and is displayed at the real scale of the image captured by the camera. The image is divided into the 9 areas shown in fig. 3, with the following coordinate ranges:
a first region (-3a < x < -a, b < y < 3 b);
a second region (-a < x < a, b < y < 3 b);
a third region (a < x < 3a, b < y < 3 b);
a fourth region (-3a < x < -a, -b < y < b);
a fifth region (-a < x < a, -b < y < b);
a sixth region (a < x < 3a, -b < y < b);
a seventh region (-3a < x < -a, -3b < y < -b);
an eighth region (-a < x < a, -3b < y < -b);
a ninth region (a < x < 3a, -3b < y < -b);
when the body part which needs to be detected by the user is located in the first area, because the first area is located at the upper left relative to the origin of coordinates, namely the central position of the image, a control command with the upper left direction is generated, and for game machine equipment such as a doll machine and the like, a corresponding driving command is sent to control the driving motor to move towards the direction. Driving instructions of corresponding directions are generated also for the second, third, fourth, sixth, seventh, eighth, and ninth regions.
In addition, the movement distance required by the control command can correspond to the distance between the detected body part position and the image center.
When the body part to be detected is located in the fifth area, a stop control instruction is sent, and the device is controlled to stop moving.
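A small sketch of this nine-region mapping, using the coordinate ranges of Fig. 3 above; the direction names are illustrative.

```python
def region_of(x, y, a, b):
    # Return the region number (1-9) of Fig. 3 for a body-part coordinate,
    # with the origin at the image center and the image spanning 6a x 6b.
    col = 0 if x < -a else (1 if x < a else 2)         # left / middle / right
    row = 0 if y > b else (1 if y > -b else 2)         # top / middle / bottom
    return row * 3 + col + 1

DIRECTION = {1: "UP_LEFT",   2: "UP",   3: "UP_RIGHT",
             4: "LEFT",      5: "STOP", 6: "RIGHT",
             7: "DOWN_LEFT", 8: "DOWN", 9: "DOWN_RIGHT"}

# A hand detected at (-2a, 2b) lies in the first region, so the command is UP_LEFT.
a, b = 1.0, 1.0
print(DIRECTION[region_of(-2.0, 2.0, a, b)])           # -> UP_LEFT
```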
As a further optimization of the embodiment of the present invention, when the device is a grab-control device, the method may further include detecting the open/closed state of the user's fingers and generating a corresponding driving instruction from that state to control the device's grabbing and releasing. This is of course just one specific application; the finger state may be replaced by other detected body part states or actions, such as a smile or another expression, to generate the command that triggers the device to grab and release.
Similarly, the grab-and-release operation can be mapped to other control actions, such as shaking, associated according to the specific device.
The following takes a doll (claw) machine as an example. A camera that can capture the user's body part information is installed facing the user's standing position, and the captured body parts are shown on a display (not all body part information need be captured, of course). By establishing coordinates, the picture is divided into the nine areas shown in fig. 3, and an image recognition algorithm identifies gestures in each area. When both hands are in the fifth area, the current instruction is a stop instruction. When only one hand is in an nth area other than the fifth, the instruction is judged to be a control instruction in the direction of the nth area relative to the fifth; for example, if the nth area is above the fifth, it corresponds to an upward control instruction. When both hands are outside the fifth area and not in the same area, the control command is judged invalid. When a fist is clenched and the fingers then open, the doll machine's grab action is triggered.
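The two-hand rule set just described can be sketched as follows; region numbering follows Fig. 3, and the instruction names are illustrative.

```python
DIRECTION = {1: "UP_LEFT",   2: "UP",   3: "UP_RIGHT",
             4: "LEFT",      6: "RIGHT",
             7: "DOWN_LEFT", 8: "DOWN", 9: "DOWN_RIGHT"}

def interpret_hands(left_region, right_region, fist_opened):
    # Fist clenched then opened triggers the grab action.
    if fist_opened:
        return "GRAB"
    # Both hands in the fifth region: stop instruction.
    if left_region == 5 and right_region == 5:
        return "STOP"
    outside = [r for r in (left_region, right_region) if r != 5]
    # One hand outside region 5 (or both in the same region): move in that direction.
    if len(outside) == 1 or outside[0] == outside[1]:
        return DIRECTION[outside[0]]
    return None                                        # different regions: invalid command

print(interpret_hands(5, 2, False))                    # one hand above center -> UP
```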
When the device is a launch-control device, other control actions can likewise be detected, and when the detection conditions are met the corresponding launch action is executed, as in various shooting game devices.
By generating control instructions from the position of a body part, the embodiment of the invention lets the user control the device conveniently without touching it, which reduces cost and diversifies the bodily input available to the user.
Example three:
fig. 4 is a flow chart illustrating an implementation of searching for a control instruction corresponding to the feature control information input by the user according to the third embodiment of the present invention, where the control instruction includes a moving direction control instruction or a state control instruction, which is described in detail below.
The steps of receiving the feature control information input by the user and controlling the device according to the control instruction are the same as those of the first embodiment, and are not repeated herein.
In step S401, image information of a preset position is acquired by a camera.
In step S402, the direction of change of the user's body part is detected in the image information and a control command for that direction is generated, or the user's shape is detected in the image information and a corresponding state control command is generated.
Fig. 4a is a schematic diagram of generating a corresponding control command from the direction of change of a user's body part according to the third embodiment of the present invention. As shown in fig. 4a, when a body part such as a hand, head, or foot moves, the corresponding movement speed, movement angle, and movement amplitude are displayed on the screen and converted, by a proportional relationship, into the corresponding movement control command for the movement control apparatus in the device. Depending on the movement control scheme of the specific device, the movement angle can be further quantized into movement control instructions in four or more directions, and the movement amplitude can also be fixed, moving a set distance each time a movement control instruction is received.
In the embodiment of the invention, the body part comprises a hand, a foot or a head.
The hand's motion direction can include the outward extension direction (since the body cannot extend outward indefinitely, inward retraction of a body part is defined as corresponding to no stored control instruction, which makes the device easier to operate). When a movement control instruction is sent to the device, a control instruction for the corresponding movement speed can be generated from the speed of the hand's motion, and a control instruction for the corresponding movement amplitude from the amplitude of the hand's motion.
The feet can move forward, backward, left, and right, and the head can turn left and right, generating control commands in the corresponding directions.
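A sketch of the proportional conversion described above, assuming two successive hand positions (in pixels) and their timestamps are supplied by upstream tracking; the scale factor and the four-direction quantization are illustrative choices.

```python
import math

def motion_command(p0, p1, t0, t1, px_per_unit=100.0):
    # Convert hand displacement between two frames into direction, speed, amplitude.
    # Note: y is taken to grow upward here; flip dy for typical image coordinates.
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    amplitude = math.hypot(dx, dy) / px_per_unit       # proportional mapping to device units
    speed = amplitude / max(t1 - t0, 1e-6)
    angle = math.degrees(math.atan2(dy, dx)) % 360
    # Quantize the angle into four movement directions.
    direction = ["RIGHT", "UP", "LEFT", "DOWN"][int(((angle + 45) % 360) // 90)]
    return {"direction": direction, "speed": speed, "amplitude": amplitude}

print(motion_command((100, 100), (180, 100), 0.0, 0.4))  # a fast move to the right
```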
This embodiment differs from the second in that the second generates the corresponding control instruction from the position of a body part, whereas this embodiment generates it from the body part's moving direction, speed, and amplitude, making the operation more flexible.
Of course, state information of the user's body part may also be used: for example, a fist-shaped gesture generates the control instruction for stop, while a 'V'-shaped or 'OK' gesture generates a grab control instruction.
Example four:
fig. 5 is an implementation process of searching for a control instruction corresponding to the feature control information input by the user according to a fourth embodiment of the present invention, where the control instruction includes a moving direction control instruction or a state control instruction, which is described in detail below.
The steps of receiving the feature control information input by the user and controlling the device according to the control instruction are the same as those of the first embodiment, and are not repeated herein.
In step S501, a voice signal transmitted by a user is acquired through a microphone.
The device generally has voice signals set in correspondence with control instructions; for example, the voice signal corresponding to the forward-movement control instruction is the word "forward" or "move forward". Of course, depending on the specific application, the correspondences can be defined more richly.
The microphone converts the detected voice signal into an electric signal and generates sound data for subsequent comparison.
In step S502, a control instruction corresponding to the transmitted voice signal is obtained according to a preset correspondence between the voice signal and the control instruction.
The device compares the received sound data against the voice data sequences prestored in it to find whether the voice signal received by the microphone corresponds to a control instruction to be executed.
When the voice data detected by the microphone is the same as, or substantially similar to, voice data stored in the device, the corresponding control instruction is generated, and the device is controlled to execute the corresponding operation.
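As a stand-in for the comparison described above, the sketch below matches an incoming feature vector against prestored templates by Euclidean distance; a real implementation would use proper acoustic features (e.g. MFCCs) and a stricter matcher, and all values here are illustrative.

```python
import numpy as np

TEMPLATES = {                                          # illustrative feature templates
    "MOVE_FORWARD": np.array([0.9, 0.1, 0.3, 0.7]),
    "STOP":         np.array([0.2, 0.8, 0.5, 0.1]),
}
MATCH_THRESHOLD = 0.5                                  # "same or substantially similar"

def match_voice(features):
    # Return the instruction whose template is closest, if it is close enough.
    best_cmd, best_dist = None, float("inf")
    for cmd, template in TEMPLATES.items():
        dist = float(np.linalg.norm(features - template))
        if dist < best_dist:
            best_cmd, best_dist = cmd, dist
    return best_cmd if best_dist <= MATCH_THRESHOLD else None

print(match_voice(np.array([0.85, 0.15, 0.3, 0.65])))  # -> MOVE_FORWARD
```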
In this embodiment, the control of the device is realized by receiving the voice signal of the user, and it can be understood that the embodiments of the present invention can also combine the technical features in the second embodiment and the third embodiment to jointly realize the diversified control of the device.
Example five:
fig. 6 is an implementation flow of finding a control instruction corresponding to the feature control information input by the user according to a fifth embodiment of the present invention, where the control instruction includes a moving direction control instruction or a state control instruction, which is described in detail below.
The steps of receiving the feature control information input by the user and controlling the device according to the control instruction are the same as those of the first embodiment, and are not repeated herein.
In step S601, image information of a preset position is acquired by a camera.
To improve detection accuracy, the camera can be placed near the user's head, and its resolution can be increased so that it captures clearer facial information.
In step S602, a facial expression of the user in the image information is detected, and a control instruction corresponding to the facial expression of the user is generated according to a correspondence between the facial expression and the control instruction.
In the embodiment of the invention, facial expressions corresponding to mouth shapes can be predefined; for example, emotions such as smiling, happiness, crying, and neutral can be defined from the coordinate relations of the face recognition points, each corresponding to a different control instruction. The camera captures the user's facial information and matches it to the corresponding expression, thereby generating the corresponding control instruction.
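A minimal sketch of the mouth-corner rule just described, assuming face-landmark coordinates are already available from a face recognition step that is not shown; the threshold and command names are illustrative.

```python
EXPRESSION_TO_COMMAND = {"smile": "GRAB", "neutral": "STOP"}   # illustrative mapping

def classify_mouth(left_corner_y, right_corner_y, mouth_center_y, rise_threshold=3.0):
    # Image y grows downward, so raised mouth corners have smaller y than the center.
    rise = mouth_center_y - (left_corner_y + right_corner_y) / 2.0
    return "smile" if rise >= rise_threshold else "neutral"

expression = classify_mouth(96.0, 95.0, 100.0)         # corners 4.5 px above the center
print(EXPRESSION_TO_COMMAND[expression])               # -> GRAB
```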
In this embodiment, control of the device is realized by recognizing the user's facial information. It can be understood that embodiments of the present invention can also combine the technical features of the second, third, and fourth embodiments to jointly realize diversified control of the device.
In addition, to further improve user convenience, the device is provided with an audio-visual system that can play various videos, music, or pictures for the user to watch or listen to while controlling the device, and/or display interaction data between the user and the device.
The interaction data can include points and level data from game interaction data, or interaction information from the interaction platform such as follows, forwards, and comments on microblogs, WeChat, and personal homepages. The data from the social platform is shown on the display screen, and the game starts after the user completes the interaction action.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (9)

1. A method for controlling the start-up of a device, said device being connected to an internet network, said device being provided with a graphical identification which can be used to identify the device, said method comprising:
receiving interaction information which is sent from a user after scanning the graphic identifier and is matched with the equipment;
when detecting that new interaction information containing the graphic identifier is generated, generating a starting instruction of the equipment to finish starting the equipment; the interaction information of the graphic identifier comprises following the WeChat account corresponding to the equipment, forwarding a related microblog article of the equipment, or comment information;
receiving feature control information input by a user, wherein the feature control information is one or more of body part information, voice information or face information;
searching a control instruction corresponding to the characteristic control information input by the user according to the preset corresponding relation between the characteristic control information and the control instruction;
and controlling the equipment according to the corresponding control instruction.
2. The method of claim 1, wherein the graphic identifier may be a two-dimensional code or an identifier of an interaction platform corresponding to a current device identifier and a current device, and the receiving the interaction information adapted to the device and sent from the user after scanning the graphic identifier includes: and receiving and scanning the two-dimensional code or the identification of the interactive platform to acquire the interactive information corresponding to the current equipment.
3. The method according to claim 1 or 2, wherein the device is a capture or launch control device, the control command of the device further includes a capture or launch command, and the searching for the control command corresponding to the feature control information input by the user according to the preset corresponding relationship between the feature control information and the control command further includes:
and detecting the opening and closing state of the fingers of the user, and generating a corresponding driving instruction according to the opening and closing state of the fingers to control the grabbing and releasing or emission of the equipment.
4. The method of claim 3, wherein the device is a motion sensing doll.
5. The method according to claim 1 or 2, wherein the characteristic control information is body part information, the control command of the device includes a moving direction control command, and the step of searching the control command corresponding to the characteristic control information input by the user according to the preset corresponding relationship between the characteristic control information and the control command includes:
acquiring image information of a preset position through a camera;
searching the coordinate position of the body part of the user according to the coordinate division of the image in advance;
and generating a control instruction of the azimuth direction according to the azimuth relation between the coordinate position of the body part and the center of the image.
6. The method according to claim 1 or 2, wherein the feature control information is body part information, the control command of the device includes a moving direction or state control command, and the searching for the control command corresponding to the feature control information input by the user according to the preset corresponding relationship between the feature control information and the control command includes:
acquiring image information of a preset position through a camera;
and detecting the change direction of the body part of the user in the image information to generate a control command of the change direction, or detecting the shape of the user in the image information to generate a corresponding state control command.
7. The method according to claim 1 or 2, wherein the feature control information is voice, and the searching for the control command corresponding to the feature control information input by the user according to the preset correspondence between the feature control information and the control command comprises:
acquiring a voice signal sent by a user through a microphone;
and obtaining a control instruction corresponding to the sent voice signal according to the corresponding relation between the preset voice signal and the control instruction.
8. The method according to claim 1 or 2, wherein the feature control information is face information, and the searching for the control command corresponding to the feature control information input by the user according to the preset correspondence between the feature control information and the control command further comprises:
acquiring image information of a preset position through a camera;
and detecting the facial expression of the user in the image information, and generating a control instruction corresponding to the facial expression of the user according to the corresponding relation between the facial expression and the control instruction.
9. Method according to claim 1 or 2, characterized in that the device is provided with an audiovisual system for playing various videos, music or pictures for the user to watch or listen to during the control of the device and/or for displaying interactive data with the user.
CN201710420621.9A 2014-04-14 2014-04-14 Equipment starting control method Active CN107320948B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710420621.9A CN107320948B (en) 2014-04-14 2014-04-14 Equipment starting control method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710420621.9A CN107320948B (en) 2014-04-14 2014-04-14 Equipment starting control method
CN201410149230.4A CN103961869A (en) 2014-04-14 2014-04-14 Device control method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201410149230.4A Division CN103961869A (en) 2014-04-14 2014-04-14 Device control method

Publications (2)

Publication Number Publication Date
CN107320948A CN107320948A (en) 2017-11-07
CN107320948B true CN107320948B (en) 2021-01-29

Family

ID=51232162

Family Applications (3)

Application Number Title Priority Date Filing Date
CN201710420621.9A Active CN107320948B (en) 2014-04-14 2014-04-14 Equipment starting control method
CN201410149230.4A Pending CN103961869A (en) 2014-04-14 2014-04-14 Device control method
CN201710420565.9A Active CN107346593B (en) 2014-04-14 2014-04-14 Equipment starting control method

Family Applications After (2)

Application Number Title Priority Date Filing Date
CN201410149230.4A Pending CN103961869A (en) 2014-04-14 2014-04-14 Device control method
CN201710420565.9A Active CN107346593B (en) 2014-04-14 2014-04-14 Equipment starting control method

Country Status (1)

Country Link
CN (3) CN107320948B (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104536563A (en) * 2014-12-12 2015-04-22 林云帆 Electronic equipment control method and system
CN104463152B (en) * 2015-01-09 2017-12-08 京东方科技集团股份有限公司 A kind of gesture identification method, system, terminal device and Wearable
CN106325112B (en) * 2015-06-25 2020-03-24 联想(北京)有限公司 Information processing method and electronic equipment
CN105528080A (en) * 2015-12-21 2016-04-27 魅族科技(中国)有限公司 Method and device for controlling mobile terminal
CN107670277A (en) * 2017-08-14 2018-02-09 武汉斗鱼网络科技有限公司 A kind of method, electronic equipment and the storage medium of real-time live broadcast game
CN107863103A (en) * 2017-09-29 2018-03-30 珠海格力电器股份有限公司 A kind of apparatus control method, device, storage medium and server
CN107804590A (en) * 2017-10-26 2018-03-16 台山市彼思捷礼品有限公司 One kind classification bonbon box
CN107948052A (en) * 2017-11-14 2018-04-20 福建中金在线信息科技有限公司 Information crawler method, apparatus, electronic equipment and system
CN110075512A (en) * 2018-01-26 2019-08-02 北京云点联动科技发展有限公司 A method of grabbing doll machine and board light and player interaction game
CN108635830A (en) * 2018-05-22 2018-10-12 广州数娱信息科技有限公司 Recreation amusement facility and its startup control system and startup control method
CN108704306A (en) * 2018-05-22 2018-10-26 广州数娱信息科技有限公司 Recreation amusement facility and its startup control system and startup control method
WO2020051723A1 (en) * 2018-09-14 2020-03-19 朱恩辛 Vending machine and voice interaction system and method applied to vending machine
CN109224431A (en) * 2018-09-14 2019-01-18 苏州乐聚堂电子科技有限公司 Grab doll machine
CN110119700B (en) * 2019-04-30 2020-05-15 广州虎牙信息科技有限公司 Avatar control method, avatar control device and electronic equipment
CN111209050A (en) * 2020-01-10 2020-05-29 北京百度网讯科技有限公司 Method and device for switching working mode of electronic equipment
CN112089596A (en) * 2020-05-22 2020-12-18 未来穿戴技术有限公司 Friend adding method of neck massager, neck massager and readable storage medium

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6766036B1 (en) * 1999-07-08 2004-07-20 Timothy R. Pryor Camera based man machine interfaces
CN1444382A (en) * 2002-03-08 2003-09-24 株式会社唯红 Game system
CN1565688A (en) * 2003-07-09 2005-01-19 黄贤达 Single-tone controlled controlling device
US7887404B2 (en) * 2005-01-27 2011-02-15 Igt Lottery and gaming systems with single representation for multiple instant win game outcomes
CN1988703A (en) * 2006-12-01 2007-06-27 深圳市飞天网景通讯有限公司 Method for realizing information interactive operation based on shootable mobile terminal
CN101393599B (en) * 2007-09-19 2012-02-08 中国科学院自动化研究所 Game role control method based on human face expression
CN101902554A (en) * 2009-05-25 2010-12-01 戴维 Intelligent set top box and image processing method thereof
CN102098567B (en) * 2010-11-30 2013-01-16 深圳创维-Rgb电子有限公司 Interactive television system and control method thereof
CN102671383A (en) * 2011-03-08 2012-09-19 德信互动科技(北京)有限公司 Game implementing device and method based on acoustic control
CN103368925B (en) * 2012-04-10 2017-10-27 天津米游科技有限公司 A kind of method of device authentication
CN103390123B (en) * 2012-05-08 2018-01-09 腾讯科技(深圳)有限公司 User authen method, user authentication device and intelligent terminal
CN102880691B (en) * 2012-09-19 2015-08-19 北京航空航天大学深圳研究院 A kind of mixing commending system based on user's cohesion and method
CN103252088B (en) * 2012-12-25 2015-10-28 上海绿岸网络科技股份有限公司 Outdoor scene scanning game interactive system
CN103198843A (en) * 2013-02-06 2013-07-10 方科峰 Audio video system application method based on voice keyboard
CN103246351B (en) * 2013-05-23 2016-08-24 刘广松 A kind of user interactive system and method
CN103327109A (en) * 2013-06-27 2013-09-25 腾讯科技(深圳)有限公司 Method, terminals and system for accessing game and method and servers for processing game
CN103475991A (en) * 2013-08-09 2013-12-25 刘波涌 Role play realization method and system
CN103529945A (en) * 2013-10-18 2014-01-22 深圳市凡趣科技有限公司 Method and system for having control over computer game
CN103678613B (en) * 2013-12-17 2017-01-25 北京启明星辰信息安全技术有限公司 Method and device for calculating influence data

Also Published As

Publication number Publication date
CN107346593B (en) 2021-06-08
CN107346593A (en) 2017-11-14
CN103961869A (en) 2014-08-06
CN107320948A (en) 2017-11-07

Similar Documents

Publication Publication Date Title
CN107320948B (en) Equipment starting control method
US9519989B2 (en) Visual representation expression based on player expression
US8843857B2 (en) Distance scalable no touch computing
KR101643020B1 (en) Chaining animations
US9244533B2 (en) Camera navigation for presentations
US9400548B2 (en) Gesture personalization and profile roaming
Van den Hoven et al. Grasping gestures: Gesturing with physical artifacts
US20100302138A1 (en) Methods and systems for defining or modifying a visual representation
US20110007079A1 (en) Bringing a visual representation to life via learned input from the user
US20140068526A1 (en) Method and apparatus for user interaction
CN104246661A (en) Interacting with a device using gestures
JP2014238828A (en) User interface device, and user interface control program
Vyas et al. Gesture recognition and control
CN109144598A (en) Electronics mask man-machine interaction method and system based on gesture
Taheri et al. Exploratory design of a hands-free video game controller for a quadriplegic individual
Spanogianopoulos et al. Human computer interaction using gestures for mobile devices and serious games: A review
JP2014050742A (en) Game device, game device control method and program
JP2013210875A (en) Information input apparatus, information input method and computer program
Alabdulkarim Towards hand-gesture frustration detection in interactive systems
Wu et al. Interface design for somatosensory interaction
Fan Practical ad hoc tangible interactions in augmented reality
Ross Gesture control: Let me see you wave your hands in the air
Koskela et al. User experiences in throwit: A natural ui for sharing objects between mobile devices
SINGH Gesture based interaction: a survey
Chinaemerem Gesture Controlled Input Devices: Past and the Future

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201223

Address after: 510000 3rd floor, building 1, 20 taihegang, Yuexiu District, Guangzhou City, Guangdong Province

Applicant after: GUANGZHOU D&E INFORMATION TECHNOLOGY Co.,Ltd.

Address before: Building 5106, guangshangyuan, Guangzhou, Guangdong

Applicant before: Lin Yunfan

GR01 Patent grant