CN109597405A - Method for controlling robot movement, and robot - Google Patents
Method for controlling robot movement, and robot
- Publication number
- CN109597405A (application CN201710919203.4A)
- Authority
- CN
- China
- Prior art keywords
- sensor
- user gesture
- robot
- movement speed
- blocked
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0242—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0221—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
- G05D1/0259—Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means
- G05D1/0263—Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means using magnetic strips
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
- G05D1/028—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal
- G05D1/0285—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using signals transmitted via a public communication network, e.g. GSM network
Abstract
Embodiments of the present application provide a method for controlling robot movement, and a robot. In the embodiments, a photosensitive sensor array is mounted on the robot. State change information generated when a user gesture blocks the photosensitive sensor array is captured, the moving direction and moving speed of the user gesture are obtained from that state change information, and the robot's movement is controlled according to the moving direction and moving speed of the user gesture. This gesture-based control mode is no longer affected by additional accessories, which greatly increases the freedom of human-computer interaction and improves the real-time performance of robot movement control.
Description
Technical field
This application relates to the field of artificial intelligence, and in particular to a method for controlling robot movement, and a robot.
Background technique
A robot is a complex system that integrates multiple functions such as environment sensing, dynamic decision-making and planning, behavior control, and execution. A robot can move forward, backward, left, and right on a plane such as the ground or a desktop according to a person's instructions.
In the prior art, a user must open a mobile phone application (App) or pick up an infrared remote handle to control the robot's movement. Since the phone and the handle are both additional accessories, they greatly reduce the freedom of human-computer interaction and impair the real-time performance of robot movement control.
Summary of the invention
Various aspects of the present application provide a method for controlling robot movement, and a robot, so as to increase the freedom of human-computer interaction and improve the real-time performance of robot movement control.
An embodiment of the present application provides a method for controlling robot movement, comprising:
capturing state change information generated when a user gesture blocks a photosensitive sensor array, the photosensitive sensor array being mounted on a robot;
obtaining the moving direction and moving speed of the user gesture according to the state change information generated when the user gesture blocks the photosensitive sensor array; and
controlling the robot to move according to the moving direction and moving speed of the user gesture.
An embodiment of the present application further provides a robot, comprising: a machine body, a photosensitive sensor array, a processor, and a memory. The photosensitive sensor array is mounted on the surface of the machine body, and the processor is connected to the memory and the photosensitive sensor array.
The memory is configured to store a computer program.
The processor is configured to execute the computer program so as to:
capture state change information generated when a user gesture blocks the photosensitive sensor array;
obtain the moving direction and moving speed of the user gesture according to that state change information; and
control the robot to move according to the moving direction and moving speed of the user gesture.
An embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when executed, implements the steps of the above method embodiments.
An embodiment of the present application further provides a method for controlling robot movement, comprising:
obtaining the moving speed of a user gesture based on a photosensitive sensor array, the photosensitive sensor array being mounted on a robot;
determining the movement amplitude and/or movement speed of the robot according to the moving speed of the user gesture; and
controlling the robot to move with the determined movement amplitude and/or movement speed.
An embodiment of the present application further provides a robot, comprising: a machine body, a photosensitive sensor array, a processor, and a memory. The photosensitive sensor array is mounted on the surface of the machine body, and the processor is connected to the memory and the photosensitive sensor array.
The memory is configured to store a computer program.
The processor is configured to execute the computer program so as to:
obtain the moving speed of a user gesture based on the photosensitive sensor array;
determine the movement amplitude and/or movement speed of the robot according to the moving speed of the user gesture; and
control the robot to move with the determined movement amplitude and/or movement speed.
In the embodiments of the present application, a photosensitive sensor array is mounted on the robot. State change information generated when a user gesture blocks the array is captured, the moving direction and moving speed of the user gesture are obtained from that information, and the robot's movement is controlled according to them. This gesture-based control mode is no longer affected by additional accessories, greatly increases the freedom of human-computer interaction, and improves the real-time performance of robot movement control.
Brief description of the drawings
The drawings described herein are provided for a further understanding of the present application and constitute a part of this application. The illustrative embodiments of the present application and their descriptions are used to explain the application and do not constitute an undue limitation on it. In the drawings:
Fig. 1 is a schematic structural diagram of a robot provided by an exemplary embodiment of the present application;
Fig. 2a-Fig. 2c show several exemplary layout patterns of a photosensitive sensor array provided by an exemplary embodiment of the present application;
Fig. 3a is a schematic diagram of a user gesture for controlling a robot to turn left, provided by another exemplary embodiment of the present application;
Fig. 3b is a schematic diagram of a robot turning left, provided by another exemplary embodiment of the present application;
Fig. 3c is a schematic diagram of a user gesture for controlling a robot to turn right, provided by another exemplary embodiment of the present application;
Fig. 3d is a schematic diagram of a robot turning right, provided by another exemplary embodiment of the present application;
Fig. 3e is a schematic diagram of a user gesture for controlling a robot to move forward, provided by another exemplary embodiment of the present application;
Fig. 3f is a schematic diagram of a robot moving forward, provided by another exemplary embodiment of the present application;
Fig. 3g is a schematic diagram of a user gesture for controlling a robot to move backward, provided by another exemplary embodiment of the present application;
Fig. 3h is a schematic diagram of a robot moving backward, provided by another exemplary embodiment of the present application;
Fig. 3i is a schematic diagram of a user gesture for controlling a robot to stop moving, provided by another exemplary embodiment of the present application;
Fig. 4a is a schematic flowchart of a method for controlling robot movement, provided by another exemplary embodiment of the present application;
Fig. 4b is a schematic flowchart of an optional implementation of step 401, provided by another exemplary embodiment of the present application;
Fig. 4c is a schematic flowchart of another method for controlling robot movement, provided by another exemplary embodiment of the present application;
Fig. 4d is a schematic flowchart of yet another method for controlling robot movement, provided by another exemplary embodiment of the present application;
Fig. 5a is a schematic structural diagram of a device for controlling robot movement, provided by another exemplary embodiment of the present application;
Fig. 5b is a schematic structural diagram of an electronic device, provided by another exemplary embodiment of the present application;
Fig. 6 is a schematic structural diagram of another electronic device, provided by another exemplary embodiment of the present application.
Detailed description of the embodiments
To make the objectives, technical solutions, and advantages of the present application clearer, the technical solutions of the present application are described clearly and completely below with reference to specific embodiments and the corresponding drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. Based on the embodiments of the present application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present application.
In view of the problems of low freedom of human-computer interaction and poor real-time performance when controlling robot movement in the prior art, the embodiments of the present application provide a solution whose basic idea is as follows: a photosensitive sensor array is added to the robot; state change information generated when a user gesture blocks the array is captured; the moving direction and moving speed of the user gesture are obtained from that information; and the robot's movement is controlled according to the moving direction and moving speed of the user gesture. This gesture-based control mode is no longer affected by additional accessories, greatly increases the freedom of human-computer interaction, and improves the real-time performance of robot movement control.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the drawings.
Fig. 1 is a schematic structural diagram of a robot provided by an exemplary embodiment of the present application. As shown in Fig. 1, the robot 100 comprises a machine body 101, a processor 102, a memory 103, and a photosensitive sensor array 104.
As shown in Fig. 1, the memory 103 and the processor 102 may be disposed inside the machine body 101, but this is not limiting; they may also be disposed on the surface of the machine body 101.
The machine body 101 is the executing mechanism through which the robot 100 carries out its job tasks; it performs the operations specified by the processor 102 in a given environment. The machine body 101 mainly includes, but is not limited to, mechanical structures such as the head, hands, wrists, arms, waist, and base of the robot 100. The machine body 101 may use an articulated structure, typically with six degrees of freedom: three are used to determine the position of the end effector of the structure, and the other three are used to determine its orientation.
The memory 103 is mainly used to store the computer program and data related to the job tasks of the robot 100. The job tasks that the robot 100 must complete differ from one application scenario to another, and so does the task-related data. For example, a learning robot outputs knowledge content to the user according to the user's learning instructions, so data such as the knowledge content, or links to it, stored inside the robot 100 are the data related to its job tasks. As another example, a sweeping robot identifies the floor to be cleaned and completes cleaning tasks according to the user's sweeping instructions, so data such as the space map and the cleaning times stored inside the robot 100 are the data related to its job tasks.
The processor 102 can be regarded as the control system of the robot 100. It is mainly used to execute the computer program stored in the memory 103, so as to process job instruction information and internal and external environment information, make decisions according to a predetermined body model, environment model, and control program, generate the corresponding control signals, and drive the articulated structures of the machine body 101 to move in the required sequence, along determined positions or trajectories, to complete specific job tasks.
While the robot 100 carries out its job tasks, and in some other scenarios, it is often necessary to control the robot 100 to move, for example forward, backward, left, or right. To control the movement of the robot 100 more flexibly, the present embodiment adds a photosensitive sensor array 104 on the surface of the machine body 101. The photosensitive sensor array 104 is mainly used to sense user gestures, thereby enabling a gesture-based control mode.
Optionally, the photosensitive sensor array 104 may be mounted directly on the surface of the machine body 101. Alternatively, the array 104 may be embedded in a board, which is then fixed to the surface of the machine body 101. The mounting position of the array 104 on the surface of the machine body 101 is not limited; for example, it may be mounted on the front chest of the machine body 101, on its rear chest (i.e., the back), on the surface of an arm, and so on. One or more photosensitive sensor arrays 104 may be mounted on the surface of the machine body 101.
Optionally, the photosensitive sensor array 104 may be implemented with various photosensitive components, such as infrared sensors, photoelectric switches, or photoresistors. The present embodiment does not limit the layout pattern of the photosensitive sensors in the array 104; for example, a row-column layout may be used, or a cross ("十") or cross-like layout.
Fig. 2a shows one exemplary layout pattern of the photosensitive sensor array 104. In Fig. 2a, the array 104 uses a cross layout and contains five photosensitive sensors, A1, A2, S0, B1, and B2, where S0 is the center point of both B1B2 and A1A2.
Fig. 2b shows another exemplary layout pattern, in which the array 104 uses a cross-like layout and contains four photosensitive sensors.
Fig. 2c shows yet another exemplary layout pattern, in which the array 104 uses a cross-like layout and contains seven photosensitive sensors: two on each of the left and right sides, and three in the middle from top to bottom.
Optionally, in the exemplary arrays 104 shown in Fig. 2a-Fig. 2c, each photosensitive sensor comprises a light-receiving unit and a light-emitting unit; a white lamp denotes the light-emitting unit and a black lamp the light-receiving unit, but this is not limiting. Those skilled in the art will readily understand that when a photosensitive sensor is implemented with a different type of light-sensitive device, its structure will differ; the sensor structures shown in Fig. 2a-Fig. 2c are only examples.
In addition, the embodiments of the present application do not limit the distance between adjacent photosensitive sensors in the array 104, which can be set according to the implementation form of the robot 100. Taking Fig. 2a as an example, the distance between sensors A1 and A2 is A1A2 = X millimeters, and the distance between sensors B1 and B2 is B1B2 = Y millimeters. Optionally, A1A2 ≥ 500 millimeters and B1B2 ≥ 700 millimeters.
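The cross layout of Fig. 2a can be sketched as a small data structure. This is only an illustrative model, not part of the patent: the coordinates, names, and origin placement are assumptions, with the spacings taken from the optional minimums stated above (A1A2 ≥ 500 mm, B1B2 ≥ 700 mm).

```python
# Hypothetical model of the Fig. 2a cross ("十") layout: five sensors
# A1, A2, S0, B1, B2, with S0 at the center of both the A1-A2 and B1-B2 axes.
A1A2_MM = 500.0  # assumed horizontal span between A1 and A2 (minimum per text)
B1B2_MM = 700.0  # assumed vertical span between B1 and B2 (minimum per text)

# (x, y) positions in millimeters, with S0 at the origin
LAYOUT = {
    "A1": (-A1A2_MM / 2, 0.0),
    "A2": (+A1A2_MM / 2, 0.0),
    "B1": (0.0, +B1B2_MM / 2),
    "B2": (0.0, -B1B2_MM / 2),
    "S0": (0.0, 0.0),
}

def spacing(a: str, b: str) -> float:
    """Euclidean distance between two sensors, in millimeters."""
    (ax, ay), (bx, by) = LAYOUT[a], LAYOUT[b]
    return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5

# The optional spacing constraints from the text hold for this model:
assert spacing("A1", "A2") >= 500 and spacing("B1", "B2") >= 700
```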
Among the functions realized when the processor 102 executes the computer program stored in the memory 103 is controlling the movement of the robot 100. The process of controlling the movement of the robot 100 based on the photosensitive sensor array 104 is as follows:
When the user needs to control the robot's movement, the user can make a corresponding gesture, referred to as a user gesture, toward the photosensitive sensor array 104 on the robot 100, according to the desired manner of movement. The user gesture is a gesture made by the user to control the movement of the robot 100. Different gestures can control the robot 100 to move in different ways; for example, different user gestures can make the robot 100 move at different speeds, or in different directions.
The processor 102 needs to recognize the user gesture and then, based on it, control the robot 100 to move in the way the user wants. For example, if the user needs the robot 100 to move left, the user makes the gesture for controlling the robot 100 to move left; the processor 102 recognizes that gesture and, based on it, issues a move-left instruction to the robot 100, driving the robot 100 to move left.
When a user gesture moves over the photosensitive sensor array 104, the state of the array 104 changes; for example, it may change from completely unblocked to partially blocked, and then from partially blocked back to completely unblocked. The state changes of the array 104 are closely related to the motion trajectory, moving direction, and/or speed of the user gesture. The motion trajectory of a user gesture refers to the spatial features formed by the route the gesture travels from start to end. A motion trajectory can be represented by a sequence of moving directions; therefore, the embodiments of the present application are mainly described in terms of moving direction and moving speed, but controlling the robot based on the motion trajectory also falls within the protection scope of the embodiments of the present application.
On this basis, the processor 102 can capture the state change information generated when the user gesture blocks the photosensitive sensor array 104. The state change information here mainly includes which parts of the array 104 are blocked by the user gesture and for how long. The processor 102 can then obtain the moving direction and moving speed of the user gesture from that state change information. The moving direction and moving speed are the underlying attributes of a user gesture; identifying them is equivalent to recognizing the gesture. Next, the processor 102 can control the robot 100 to move accordingly, based on the moving direction and moving speed of the user gesture.
In the present embodiment, a photosensitive sensor array is mounted on the robot, and the user can make robot-control gestures toward it. The processor in the robot captures the state change information generated when a user gesture blocks the array, obtains the moving direction and moving speed of the gesture from it, and controls the robot's movement based on them. This realizes a gesture-based control mode that is no longer affected by additional accessories, greatly increases the freedom of human-computer interaction, and improves the real-time performance of robot movement control.
In addition, in the present embodiment, combining the moving direction and moving speed of the user gesture to control the robot's movement can provide richer control modes, and also helps evolve from coarse control toward precise control, improving control precision. Furthermore, because photosensitive sensors are relatively cheap, the implementation cost of the robot can be reduced while its movement is still controlled flexibly and in real time.
In some exemplary embodiments, the property of a photosensitive sensor that it converts optical signals into electrical signals can be used to capture the state change information generated when a user gesture blocks the array 104. A photosensitive sensor converts an optical signal into an electrical signal, and the magnitude of the output electrical signal differs with the intensity of the optical signal. For a photosensitive sensor in the array 104, the collected optical signal intensity differs depending on whether it is blocked by the user gesture, so the output electrical signal also differs. In general, when not blocked, the sensor collects a relatively strong optical signal and outputs a high level, 1; when blocked, it collects a relatively weak optical signal and outputs a low level, 0. That is, the sensor's output jumps between the blocked and unblocked cases.
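The high/low jump described above can be sketched as a simple threshold on the sensor reading. This is a minimal illustration, not the patent's implementation: the normalized intensity values and the threshold are assumptions standing in for whatever ADC interface the actual hardware provides.

```python
# Minimal sketch: map a photosensitive sensor's light-intensity reading to a
# binary level. Strong light (not blocked) -> 1; weak light (blocked) -> 0.
BLOCK_THRESHOLD = 0.5  # assumed normalized intensity cut-off

def to_level(intensity: float, threshold: float = BLOCK_THRESHOLD) -> int:
    """Return 1 when the sensor is unblocked, 0 when blocked."""
    return 1 if intensity >= threshold else 0

# A gesture passing over a sensor makes its output jump 1 -> 0 -> 1:
readings = [0.9, 0.8, 0.2, 0.1, 0.3, 0.85]
levels = [to_level(r) for r in readings]
print(levels)  # -> [1, 1, 0, 0, 0, 1]
```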
On this basis, when the user needs to control the robot's movement, the user can make the corresponding gesture toward the photosensitive sensor array 104 on the robot 100, according to the desired manner of movement. The processor 102 can collect the electrical signal output by each photosensitive sensor in the array 104 while the user gesture moves and, from how each sensor's output changes during the gesture, obtain the state change information generated when the gesture blocks the array 104.
The processor 102 can collect each sensor's output at a set time interval or collection period. This interval or period is much shorter than the time scale of the gesture's movement, which means that multiple electrical signals output by each sensor at different collection moments can be collected during the gesture. For each sensor, the change across the signals collected at these moments can then be observed.
From how each sensor's output changes during the gesture, it can be determined which photosensitive sensors the gesture blocked, the order in which they were blocked, and how long each was blocked; such information reflects the state changes generated when the gesture blocks the array 104.
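The recovery of blocking order and duration from the sampled levels can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the patent's code: the sensor names, the sampling period, and the simplification that each sensor is blocked over one contiguous run of samples are all assumptions of the example.

```python
# Illustrative sketch: from per-sensor level samples collected at a fixed
# period (1 = unblocked, 0 = blocked), recover which sensors were blocked,
# in what order, and for how long.
SAMPLE_PERIOD_S = 0.01  # assumed collection period, much shorter than a gesture

def blocked_events(samples: dict, period: float = SAMPLE_PERIOD_S):
    """Return (sensor, start_time, duration) tuples sorted by blocking start."""
    events = []
    for name, levels in samples.items():
        if 0 not in levels:
            continue  # this sensor was never blocked by the gesture
        start = levels.index(0)                       # first blocked sample
        end = len(levels) - levels[::-1].index(0)     # one past last blocked sample
        events.append((name, start * period, (end - start) * period))
    return sorted(events, key=lambda e: e[1])

# A left-to-right sweep blocks A1, then S0, then A2:
samples = {
    "A1": [0, 0, 1, 1, 1, 1],
    "S0": [1, 1, 0, 0, 1, 1],
    "A2": [1, 1, 1, 1, 0, 0],
}
order = [name for name, _, _ in blocked_events(samples)]
print(order)  # -> ['A1', 'S0', 'A2']
```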
Next, the processor 102 obtains the moving direction and moving speed of the user gesture from the state change information generated when the gesture blocks the array 104, which is equivalent to recognizing the gesture. The moving direction and moving speed of the user gesture characterize how the user wants the robot 100 to move, so the robot 100 can be controlled to move accordingly. The user can thus control the robot's movement with gestures alone, without opening a phone App or operating an infrared handle. Free of the influence of such accessories, this increases the freedom of human-computer interaction and allows the robot to be controlled in time.
In the above embodiments, there are multiple ways to obtain the state change information from how each photosensitive sensor's output changes during the gesture. For example, the change in each sensor's output during the gesture can be analyzed independently; or the changes in the outputs of several sensors can be analyzed jointly; or the changes can be analyzed region by region, for the sensors located in each region. An optional embodiment below takes the joint-analysis approach as an example and describes it in detail.
In an optional embodiment, the photosensitive sensors in the array 104 can be grouped into different sensor combinations, in light of the layout pattern of the array 104 and the movement requirements of the robot 100. Each sensor combination corresponds to one movement control mode, and a mapping between sensor combinations and movement control modes is established. Different sensor combinations may contain the same photosensitive sensors but differ in the blocking order; a sensor combination in this embodiment therefore comprises not only the sensors themselves but also the order in which they are blocked.
The mapping between sensor combinations and movement control modes is illustrated with the layout pattern of the array 104 shown in Fig. 2a. Suppose an application scenario requires controlling the robot 100 to move left, move right, move forward, move backward, and stop. Accordingly, the photosensitive sensors in the array 104 can form five sensor combinations: A1, S0, and A2 in that order form one combination, and a user gesture blocking A1, S0, and A2 in sequence controls the robot to move left; A2, S0, and A1 in that order form another, and a gesture blocking them in sequence controls the robot to move right; B1, S0, and B2 in that order form a combination whose gesture controls the robot to move forward; B2, S0, and B1 in that order form a combination whose gesture controls the robot to move backward; and A1, S0, and A2 form a further combination in which a gesture blocking all of them simultaneously controls the robot to stop moving. The correspondence between these five sensor combinations and the five movement control modes is shown in Table 1 below:
Table 1

| Sensor combination (blocking order) | A1 | S0 | A2 | B1 | B2 | Movement control mode |
| --- | --- | --- | --- | --- | --- | --- |
| A1 → S0 → A2, in sequence | ✓ | ✓ | ✓ | X | X | Move left (turn left) |
| A2 → S0 → A1, in sequence | ✓ | ✓ | ✓ | X | X | Move right (turn right) |
| B1 → S0 → B2, in sequence | X | ✓ | X | ✓ | ✓ | Move forward |
| B2 → S0 → B1, in sequence | X | ✓ | X | ✓ | ✓ | Move backward |
| A1, S0, A2 blocked simultaneously | ✓ | ✓ | ✓ | X | X | Stop moving |
In Table 1 above, "X" indicates that the corresponding photosensitive sensor is not considered. As Table 1 shows, when the user gesture sweeps from A1 through S0 to A2, the robot 100 can be controlled to turn left; the user gesture and the robot's movement are shown in Fig. 3a and Fig. 3b, respectively. When the gesture sweeps from A2 through S0 to A1, the robot 100 can be controlled to turn right; see Fig. 3c and Fig. 3d. When the gesture sweeps from B1 through S0 to B2, the robot 100 can be controlled to move forward; see Fig. 3e and Fig. 3f. When the gesture sweeps from B2 through S0 to B1, the robot 100 can be controlled to move backward; see Fig. 3g and Fig. 3h. When the user's hands hold A1, S0, and A2 blocked, the robot 100 can be controlled to stop moving; the user gesture is shown in Fig. 3i. Note that, for clarity of illustration, the memory 103 and the processor 102 are not shown in the robots of Fig. 3b, Fig. 3d, Fig. 3f, and Fig. 3h.
Based on the above, for a user gesture, the processor 102 can determine, from the electric signals output by each photosensitive sensor while the user gesture moves, the electrical signal sequence output during the gesture by each sensor combination in the mapping between sensor combinations and movement control modes; and then, from the changes of the electrical signal sequence output by each sensor combination during the gesture, determine whether each sensor combination is blocked by the user gesture and the duration of the blocking process. For any sensor combination, the electric signals output at the same sampling instant by the photosensitive sensors of the combination form one electrical signal sequence, and the signals collected at different sampling instants form different electrical signal sequences. In Table 1, taking the combination of B1, S0 and B2 as an example, "111", "011", "001" and so on are the electrical signal sequences of different instants.
Optionally, in conjunction with Table 1, a signal state change rule can be preset for each sensor combination under its corresponding movement control mode. For example, the signal state change rule for the combination of B1, S0 and B2 when moving forward is: change from 111 to 011, then to 001, then to 101, then to 100, then to 110, and finally back to 111. On this basis, whether each sensor combination is blocked by the user gesture can be determined by comparing the state changes of the electrical signal sequence output by the combination during the gesture with the signal state change rule of the combination's movement control mode. For any sensor combination, it is judged whether the state changes of the electrical signal sequence output by the combination during the gesture match the signal state change rule corresponding to the combination; if they match, the combination is determined to be blocked by the user gesture; if not, the combination is determined not to be blocked. For example, suppose the state changes of the electrical signal sequence output by the combination of B1, S0 and B2 during the gesture are, in order, 111, 011, 001, 101, 100, 110 and 111. These state changes are identical to the signal state change rule of this combination, so it can be determined that the combination of B1, S0 and B2 is blocked by the user gesture in the order B1, then S0, then B2.
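As a minimal sketch, the matching of an observed electrical signal sequence against a preset signal state change rule can be expressed as follows. The function names and the idea of collapsing repeated samples (since the array is polled faster than the hand moves) are illustrative assumptions, not the patent's implementation:

```python
from itertools import groupby

# Preset rule for the B1/S0/B2 combination when moving forward.
# Each state is the bits (B1, S0, B2): 1 = high level (not blocked), 0 = low level (blocked).
EXPECTED_FORWARD = ["111", "011", "001", "101", "100", "110", "111"]

def compress(samples):
    # Collapse consecutive identical samples: polling repeats each state
    # several times while the gesture passes over a sensor.
    return [state for state, _ in groupby(samples)]

def matches_rule(samples, rule):
    # The combination is blocked in the rule's order iff the compressed
    # observed states equal the preset state change rule.
    return compress(samples) == rule

samples = ["111", "111", "011", "011", "001", "101", "100", "110", "111"]
print(matches_rule(samples, EXPECTED_FORWARD))  # True -> blocked in B1 -> S0 -> B2 order
```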
For ease of description, a sensor combination determined to be blocked by the user gesture is denoted as a first sensor combination, e.g. the combination of B1, S0 and B2. On this basis, the duration of the blocking process of the first sensor combination can be determined from the state change times of the electrical signal sequence output by the first sensor combination during the gesture. The first sensor combination can be any of the sensor combinations; it may be a single sensor combination or several sensor combinations. In general, the first sensor combination is a single combination.
Optionally, the processor 102 can collect the electric signals output by the photosensitive sensors in the photosensitive sensor array 104 at a set time interval or sampling frequency. With no user gesture present, the electric signals output by these photosensitive sensors are all the same, namely high level 1; when a user gesture appears, some photosensitive sensors output low level 0 because they are blocked. On this basis, while collecting the electric signals output by the photosensitive sensors, the processor can monitor whether any electric signal changes; when a change is detected, it determines that a user gesture has appeared, starts the timer corresponding to each sensor combination, and enters the stage of collecting the electric signals output by the photosensitive sensors during the gesture. When, according to the state changes of the electrical signal sequences output by the sensor combinations during the gesture, a first sensor combination blocked by the user gesture is determined, the timers are stopped, and the timing result of the timer corresponding to the first sensor combination is taken as the duration of its blocking process. The duration of the blocking process of the first sensor combination is the time from when the combination starts to be blocked until it is completely unblocked.
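A hypothetical per-combination timer matching this description might look like the sketch below; the class name and method names are assumptions for illustration:

```python
import time

class CombinationTimer:
    """Per-combination timer: started when an electric-signal change is first
    detected, stopped when the combination's sequence is fully unblocked again."""
    def __init__(self):
        self._start = None
        self.duration = None

    def start(self):
        # Called when any photosensitive sensor's output first changes.
        self._start = time.monotonic()

    def stop(self):
        # Called once the first sensor combination is identified and its
        # sequence has returned to all-high; elapsed time = blocked duration.
        self.duration = time.monotonic() - self._start
        return self.duration

t = CombinationTimer()
t.start()
time.sleep(0.05)   # stand-in for the gesture sweeping across the array
elapsed = t.stop()
```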
In the above embodiments, each photosensitive sensor in a sensor combination acts on its own. In addition, to improve gesture recognition precision, the photosensitive sensor array 104 may include a larger number of photosensitive sensors, as shown in Fig. 2c. In a scenario with many photosensitive sensors, they can be managed by first grouping them and then clustering within each group. For example, the photosensitive sensors are first combined into several sensor groups, and within each group the sensors are further combined into sensor clusters, each sensor cluster playing the same role as a single photosensitive sensor in the previous embodiments. The photosensitive sensor array 104 can thus include several sensor groups, each sensor group including several sensor clusters, and each sensor cluster including at least one photosensitive sensor. In the photosensitive sensor array 104 shown in Fig. 2c, the two photosensitive sensors on the left of the array form a cluster A1', the two photosensitive sensors on the right form a cluster A2', and the three middle sets of three photosensitive sensors, from top to bottom, form clusters B1', S0' and B2', respectively. Increasing the number of photosensitive sensors improves gesture recognition precision.
In the array structure based on sensor clusters, before determining whether each sensor combination is blocked by the user gesture, the electrical signal sequence output by each sensor combination during the gesture can be obtained in advance. For each sensor cluster in each sensor combination, the shielding rate of the cluster is calculated from the changes of the electric signals output during the gesture by the at least one photosensitive sensor the cluster includes; from the cluster's shielding rate, the whole electric signal output by the cluster during the gesture is determined. Then, for each sensor combination, the electrical signal sequence output by the combination during the gesture is obtained from the whole electric signals output during the gesture by the sensor clusters the combination includes.
For example, for any sensor cluster in any sensor combination, the number of electric signals that change from high level to low level among the signals output during the gesture by the at least one photosensitive sensor the cluster includes can be counted as the number of blocked photosensitive sensors; the shielding rate of the cluster is then calculated from the number of blocked photosensitive sensors and the total number of photosensitive sensors the cluster includes.
If the shielding rate is greater than a set shielding rate threshold, the cluster can be determined to be blocked, and its whole electric signal is determined to be low level 0; if the shielding rate is less than or equal to the threshold, the cluster can be determined not to be blocked, and its whole electric signal is determined to be high level 1.
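The shielding rate and whole-signal rules above reduce to a few lines of code. As a sketch, with function names and the 0.5 threshold chosen for illustration:

```python
def cluster_shielding_rate(blocked_count, total_sensors):
    """Shielding rate = (sensors that went high -> low, i.e. blocked) / total."""
    return blocked_count / total_sensors

def cluster_whole_signal(shielding_rate, threshold=0.5):
    # Rate above the threshold: cluster blocked, whole signal is low level 0;
    # rate at or below the threshold: not blocked, whole signal is high level 1.
    return 0 if shielding_rate > threshold else 1

# A two-sensor cluster (like A1' in Fig. 2c) with both sensors blocked:
rate = cluster_shielding_rate(2, 2)   # 1.0
print(cluster_whole_signal(rate))     # 0 (blocked)
```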
After the whole electric signals of the sensor clusters are obtained, the whole electric signals of the clusters included in each sensor combination are combined to obtain the electrical signal sequence of the combination. Then, from the changes of the electrical signal sequence of each sensor combination, it can be determined whether each combination is blocked by the user gesture and the duration of the blocking process.
It is worth noting that the number of photosensitive sensors included in the photosensitive sensor array 104 can be set adaptively according to the application demand. Relative to the photosensitive sensor array 104 shown in Fig. 2a, besides increasing the number of photosensitive sensors as in Fig. 2c, the number can also be reduced as in Fig. 2b. In application scenarios with low recognition precision requirements, reducing the number of photosensitive sensors reduces implementation cost. It is also worth noting that, with appropriate adaptation, the control logic shown in Table 1 above can be applied to the array structures shown in Fig. 2b and Fig. 2c, which is not repeated here.
Further, in the sensor combination embodiments, the first sensor combination blocked by the user gesture and the duration of its blocking process are determined. On this basis, when determining the moving direction of the user gesture, the processor 102 can determine it from the order in which the photosensitive sensors of the first sensor combination are blocked. With reference to Fig. 2a and Table 1, suppose the first sensor combination is the combination of B1, S0 and B2; then, from the positions of B1, S0 and B2 in the photosensitive sensor array and their blocking order, the moving direction of the user gesture can be determined to be downward. The moving direction here is relative to the photosensitive sensor array, i.e. the user gesture moves downward relative to the array. Correspondingly, when determining the movement speed of the user gesture, the processor 102 can calculate it from the duration of the blocking process of the first sensor combination and the arrangement width of the first sensor combination along the blocking order direction. Still taking the combination of B1, S0 and B2 as the first sensor combination, suppose the duration of the blocking process of the first sensor combination is s seconds and the arrangement width of the first sensor combination along the blocking order direction, i.e. the distance between B1 and B2, is Y millimeters; then the user gesture has swept a distance of Y millimeters in s seconds, giving a speed v = Y/s.
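The speed computation v = Y/s is directly expressible in code; the 60 mm arrangement width below is an assumed example value, not from the patent:

```python
def gesture_speed(width_mm, duration_s):
    """v = Y / s: arrangement width Y (mm) of the blocked first sensor
    combination along the blocking direction, over the blocked duration s (seconds)."""
    return width_mm / duration_s

# If B1 and B2 are, say, 60 mm apart and the blocking lasted 0.5 s:
v = gesture_speed(60.0, 0.5)   # 120.0 mm/s
```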
Further, in the above embodiments, once the moving direction and movement speed of the user gesture are determined, the robot 100 can be controlled to move according to them. In one embodiment, the moving direction of the robot 100 is determined from the moving direction of the user gesture; the movement amplitude and/or movement speed of the robot 100 are determined from the movement speed of the user gesture; and the robot 100 is then controlled to move in the moving direction with the movement amplitude and/or movement speed. This control mode has high precision: it controls not only the moving direction of the robot 100 but also its movement amplitude or speed, which helps meet the user's control demands. This is merely an exemplary control mode and is not limiting.
It is noted here that, for a robot with the hardware structure shown in Fig. 1, besides moving under the control logic described in the preceding embodiments, the robot can also be moved under other control logic; the control logics can be used in combination or alone. The following embodiment gives another control logic for moving the robot 100, which can be realized by the processor 102 in the robot 100 executing the computer program in the memory 103.
In an exemplary embodiment, the processor 102 in the robot 100 is further configured to execute the computer program in the memory 103 so as to:

obtain the movement speed of a user gesture based on the photosensitive sensor array 104 in the robot 100;

determine the movement amplitude and/or movement speed of the robot 100 according to the movement speed of the user gesture; and

control the robot 100 to move with the movement amplitude and/or movement speed.
The implementation by which the processor 102 obtains the movement speed of the user gesture based on the photosensitive sensor array 104 can be found in the preceding embodiments and is not repeated here.
Depending on the application demand, the way the movement amplitude and/or movement speed of the robot 100 are determined from the movement speed of the user gesture can differ. In an exemplary embodiment, multiple gesture movement speed ranges can be set, with different ranges corresponding to different robot movement amplitudes and/or robot movement speeds. On this basis, after obtaining the movement speed of the user gesture, the processor 102 can determine, from the multiple gesture movement speed ranges, the first gesture movement speed range to which the movement speed of the user gesture belongs, and take the robot movement amplitude and/or robot movement speed corresponding to the first gesture movement speed range as the movement amplitude and/or movement speed of the robot 100.
Further, in implementation, the maximum and minimum values of each gesture movement speed range can be set directly, or at least one gesture speed threshold can be set so as to divide gesture movement speed into multiple gesture movement speed ranges.
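The threshold-based division can be sketched as a sorted-threshold lookup; the threshold values and robot speeds below are illustrative assumptions, not values from the patent:

```python
import bisect

# Two hypothetical gesture speed thresholds (mm/s) divide speeds into three ranges.
SPEED_THRESHOLDS = [50.0, 150.0]
# One assumed robot movement speed (m/s) per range: slow, medium, fast gestures.
ROBOT_SPEEDS = [0.1, 0.3, 0.6]

def robot_speed_for(gesture_speed_mm_s):
    # bisect_right finds which range the gesture speed falls into.
    idx = bisect.bisect_right(SPEED_THRESHOLDS, gesture_speed_mm_s)
    return ROBOT_SPEEDS[idx]

print(robot_speed_for(40.0))   # 0.1 (slow gesture -> slow robot)
print(robot_speed_for(200.0))  # 0.6 (fast gesture -> fast robot)
```

A decreasing mapping (faster gesture, slower robot), also contemplated below, would simply reverse the `ROBOT_SPEEDS` list.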
In addition, it is worth noting that, to add interest or diversity when controlling the movement of the robot 100, or to meet specific application demands, this embodiment does not limit the proportional relationship between the movement speed of the user gesture and the movement amplitude and/or movement speed of the robot. For example, the movement amplitude and/or movement speed of the robot can increase as the movement speed of the user gesture increases, or decrease as it increases, or follow a proportional relationship set according to usage demands.
After the movement amplitude and/or movement speed of the robot are determined, the processor 102 can control the robot to move with that movement amplitude and/or movement speed. Controlling the robot in combination with the user gesture speed, as in this embodiment, better meets the user's control demands and can improve the user experience. For example, in an urgent situation the user can, through gesture speed, control the robot to move quickly; for another example, in a more complex environment, to make it easier for the robot to maneuver around obstacles, the user can, through gesture speed, control the robot to move at a slower speed or with a smaller amplitude.
In an exemplary embodiment, besides controlling the movement amplitude and/or movement speed of the robot 100 according to the movement speed of the user gesture, the processor 102 can also obtain the moving direction or motion track of the user gesture based on the photosensitive sensor array 104, and determine the moving direction of the robot 100 from the moving direction or motion track of the user gesture. On this basis, the processor 102 can combine the moving direction of the robot 100 with the movement amplitude and/or movement speed of the robot 100 to control the movement of the robot 100, i.e. control the robot 100 to move in the determined moving direction with the determined robot movement amplitude and/or robot movement speed. If the moving direction of the user gesture changes continuously, the motion track of the user gesture can be obtained, achieving the effect of the robot moving along a certain track. The implementation by which the processor 102 obtains the moving direction of the user gesture based on the photosensitive sensor array 104 can be found in the preceding embodiments and is not repeated here.
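The idea that a continuously changing gesture direction yields a track the robot can follow can be illustrated by integrating a sequence of (direction, speed) samples into a 2-D path; the sampling interval and integration scheme here are illustrative assumptions:

```python
import math

def trajectory(heading_speed_samples, dt=0.1):
    """Integrate (heading_deg, speed) samples into a 2-D track.
    A continuously changing direction yields a curved path, e.g. a circle."""
    x = y = 0.0
    track = [(x, y)]
    for heading, speed in heading_speed_samples:
        x += speed * dt * math.cos(math.radians(heading))
        y += speed * dt * math.sin(math.radians(heading))
        track.append((x, y))
    return track

# Headings sweeping a full 360 degrees at constant speed trace a closed circle.
circle = trajectory([(h, 1.0) for h in range(0, 360, 10)])
```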
The embodiments of the present application realize movement control of a robot using a photosensitive sensor array, so the implementation cost of the robot is low; and by combining the direction and speed of the user gesture, more versatile control can be realized. For example, the robot can be controlled to move linearly in a certain direction, e.g. forward, backward, left or right. For another example, when the moving direction of the user gesture changes continuously, the robot can be controlled to move non-linearly, e.g. drawing a circle or a semicircle, enriching the diversity of gesture control.
Fig. 4a is a flow diagram of a method of controlling robot movement provided by another exemplary embodiment of the present application. As shown in Fig. 4a, the method comprises:

401. Capture the state change information generated by a photosensitive sensor array as it is blocked by a user gesture, the photosensitive sensor array being installed on a robot.

402. Obtain the moving direction and movement speed of the user gesture according to the state change information generated by the photosensitive sensor array as it is blocked by the user gesture.

403. Control the robot to move according to the moving direction and movement speed of the user gesture.
In this embodiment, a photosensitive sensor array is provided on the robot and is mainly used to sense user gestures, so as to realize a control mode based on user gestures. When the user needs to control the robot to move, the user can make, toward the photosensitive sensor array on the robot, a gesture corresponding to the desired way of moving, denoted a user gesture. The user gesture is a gesture made by the user to control the movement of the robot. Different gestures can control the robot to move in different ways; for example, different user gestures can control the robot to move at different speeds, or in different directions.
When the user gesture moves over the photosensitive sensor array, the state of the array changes, e.g. from not blocked to partially or completely blocked, and then from partially blocked back to completely unblocked. The state changes of the photosensitive sensor array are closely related to the moving direction and/or speed of the user gesture.
On this basis, the state change information generated by the photosensitive sensor array as it is blocked by the user gesture can be captured; the state change information here mainly includes how the photosensitive sensor array is blocked by the user gesture and the duration of the blocking. Afterwards, the moving direction and movement speed of the user gesture can be obtained from this state change information. The moving direction and movement speed are the underlying attributes of the user gesture; recognizing them is equivalent to recognizing the user gesture. Next, the robot can be controlled to move accordingly, based on the moving direction and movement speed of the user gesture.
In this embodiment, a photosensitive sensor array is provided on the robot, and the user can make gestures toward the array to control the robot's movement. By capturing the state change information generated by the photosensitive sensor array as it is blocked by the user gesture, the moving direction and movement speed of the user gesture are obtained, and the robot's movement is controlled based on them. This realizes a control mode based on user gestures that no longer depends on extra accessories, which can greatly improve the freedom of human-computer interaction and improve the real-time performance of controlling the robot's movement.
In some exemplary embodiments, the property of a photosensitive sensor to convert an optical signal into an electric signal can be used to capture the state change information generated by the photosensitive sensor array as it is blocked by the user gesture. On this basis, one implementation of step 401 comprises: collecting the electric signals output during the movement of the user gesture by each photosensitive sensor in the photosensitive sensor array; and obtaining, from the changes of the electric signals output by each photosensitive sensor during the gesture, the state change information generated by the array as it is blocked by the gesture. From the changes of the electric signals output during the gesture, it can be determined which photosensitive sensors are blocked by the user gesture, the order in which they are blocked, and how long they are blocked; this information reflects the state changes generated by the photosensitive sensor array as it is blocked by the user gesture.

There can be various implementations of obtaining the state change information from the changes of the electric signals output by the photosensitive sensors during the gesture. For example, the state change information can be obtained by jointly analyzing the changes of the electric signals output by the photosensitive sensors during the gesture. On this basis, as shown in Fig. 4b, one implementation of step 401 comprises:
4011. Collect the electric signals output during the movement of the user gesture by each photosensitive sensor in the photosensitive sensor array.

4012. Determine, from the electric signals output by each photosensitive sensor during the gesture, the electrical signal sequence output during the gesture by each sensor combination in the mapping between sensor combinations and movement control modes.

4013. Determine, from the changes of the electrical signal sequence output by each sensor combination during the gesture, whether each sensor combination is blocked by the user gesture and the duration of the blocking process.
In the embodiment shown in Fig. 4b, the photosensitive sensors in the photosensitive sensor array are combined, in view of their layout and the movement demands of the robot, into different sensor combinations. Each sensor combination corresponds to a movement control mode, and a mapping between sensor combinations and movement control modes is established. Then, from the changes of the electrical signal sequence output by each sensor combination during the gesture, whether each combination is blocked by the user gesture and the duration of the blocking process are determined, providing the conditions for obtaining the moving direction and movement speed of the user gesture.
In some exemplary embodiments, one implementation of step 4013 comprises: determining whether each sensor combination is blocked by the user gesture from the state changes of the electrical signal sequence output by the combination during the gesture and the signal state change rule of the combination's movement control mode; and, for a first sensor combination blocked by the user gesture, determining the duration of its blocking process from the state change times of the electrical signal sequence output by the first sensor combination during the gesture.

Further, while collecting the electric signals output by the photosensitive sensors during the gesture, when a change in an electric signal is detected, the timer corresponding to each sensor combination is started; when the first sensor combination blocked by the user gesture is determined, the timing result of the timer corresponding to the first sensor combination is obtained as the duration of its blocking process.
For example, the electric signals output by the photosensitive sensors in the photosensitive sensor array can be collected at a set time interval or sampling frequency. With no user gesture present, the electric signals output by these photosensitive sensors are all the same, namely high level 1; when a user gesture appears, some photosensitive sensors output low level 0 because they are blocked. On this basis, while collecting the electric signals output by the photosensitive sensors, it can be monitored whether any electric signal changes; when a change is detected, it is determined that a user gesture has appeared, the timer corresponding to each sensor combination is started, and the stage of collecting the electric signals output by the photosensitive sensors during the gesture is entered. When, according to the state changes of the electrical signal sequences output by the sensor combinations during the gesture, the first sensor combination blocked by the user gesture is determined, the timers are stopped, and the timing result of the timer corresponding to the first sensor combination is taken as the duration of its blocking process. The duration of the blocking process of the first sensor combination is the time from when the combination starts to be blocked until it is completely unblocked.
In some exemplary embodiments, each sensor combination includes several sensor clusters, and each sensor cluster includes at least one photosensitive sensor. On this basis, before determining whether each sensor combination is blocked by the user gesture, the method further includes obtaining the electrical signal sequence output by each sensor combination during the gesture. Optionally, for each sensor cluster in each sensor combination, the shielding rate of the cluster is calculated from the changes of the electric signals output during the gesture by the at least one photosensitive sensor the cluster includes; from the cluster's shielding rate, the whole electric signal output by the cluster during the gesture is determined; and, for each sensor combination, the electrical signal sequence output by the combination during the gesture is obtained from the whole electric signals output during the gesture by the sensor clusters the combination includes.
Optionally, one implementation of calculating the shielding rate of a sensor cluster from the changes of the electric signals output during the gesture by the at least one photosensitive sensor the cluster includes comprises: counting the number of electric signals that change from high level to low level among the signals output during the gesture by the at least one photosensitive sensor the cluster includes, as the number of blocked photosensitive sensors; and calculating the shielding rate of the cluster from the number of blocked photosensitive sensors and the total number of photosensitive sensors the cluster includes.
For example, if the shielding rate is greater than a set shielding rate threshold, the cluster can be determined to be blocked and its whole electric signal determined to be low level 0; if the shielding rate is less than or equal to the threshold, the cluster can be determined not to be blocked and its whole electric signal determined to be high level 1. After the whole electric signals of the sensor clusters are obtained, the whole electric signals of the clusters included in each sensor combination are combined to obtain the electrical signal sequence of the combination. Then, from the changes of the electrical signal sequence of each sensor combination, it can be determined whether each combination is blocked by the user gesture and the duration of the blocking process.
As shown in Fig. 4c, a method of controlling robot movement provided by another exemplary embodiment of the present application comprises the following steps:

41. Collect the electric signals output during the movement of the user gesture by each photosensitive sensor in the photosensitive sensor array.

42. Determine, from the electric signals output by each photosensitive sensor during the gesture, the electrical signal sequence output during the gesture by each sensor combination in the mapping between sensor combinations and movement control modes.

43. Determine, from the changes of the electrical signal sequence output by each sensor combination during the gesture, whether each sensor combination is blocked by the user gesture and the duration of the blocking process.

44. Determine the moving direction of the user gesture from the order in which the photosensitive sensors in the first sensor combination, i.e. the sensor combination blocked by the user gesture, are blocked.

45. Calculate the movement speed of the user gesture from the duration of the blocking process of the first sensor combination and the arrangement width of the first sensor combination along the blocking order direction.

46. Determine the moving direction of the robot from the moving direction of the user gesture.

47. Determine the movement amplitude and/or movement speed of the robot from the movement speed of the user gesture.

48. Control the robot to move in the moving direction with the movement amplitude and/or movement speed.
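Steps 44 to 48 can be sketched end to end as follows. The blocking orders mirror Table 1, but the arrangement widths, the amplitude scaling and the function names are assumptions for illustration, not the patent's implementation:

```python
# Blocking order of each combination -> movement direction (per Table 1).
RULES = {
    "left":     ("A1", "S0", "A2"),
    "right":    ("A2", "S0", "A1"),
    "forward":  ("B1", "S0", "B2"),
    "backward": ("B2", "S0", "B1"),
}
# Assumed arrangement width (mm) of each combination along its blocking direction.
WIDTH_MM = {"left": 60.0, "right": 60.0, "forward": 60.0, "backward": 60.0}

def classify(block_order):
    """Step 44: map the order in which sensors were blocked to a direction."""
    for direction, rule in RULES.items():
        if tuple(block_order) == rule:
            return direction
    return None  # no known combination matched

def control(block_order, blocked_duration_s):
    direction = classify(block_order)           # steps 44 and 46
    if direction is None:
        return None
    speed = WIDTH_MM[direction] / blocked_duration_s   # step 45: v = Y/s
    amplitude = min(1.0, speed / 100.0)                # step 47 (assumed scaling)
    return direction, amplitude                        # step 48 would drive the motors

print(control(["B1", "S0", "B2"], 0.6))  # ('forward', 1.0)
```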
In this embodiment, a photosensitive sensor array is provided on the robot, and the user can control the robot's movement by making gestures toward the array. The control mode based on user gestures is no longer affected by extra accessories, which can greatly improve the freedom of human-computer interaction and improve the real-time performance of controlling the robot's movement. Moreover, this embodiment controls the robot in combination with the moving direction and movement speed of the user gesture, with high control precision: it controls not only the moving direction of the robot but also its movement amplitude or speed, which helps meet the user's control demands.
Fig. 4d is a flow diagram of another method of controlling robot movement provided by another exemplary embodiment of the present application. As shown in Fig. 4d, the method comprises:

4a. Obtain the movement speed of a user gesture based on a photosensitive sensor array, the photosensitive sensor array being installed on a robot.

4b. Determine the movement amplitude and/or movement speed of the robot according to the movement speed of the user gesture.

4c. Control the robot to move with the movement amplitude and/or movement speed.
For the implementation of obtaining the movement speed of the user gesture based on the photosensitive sensor array, reference may be made to the foregoing embodiments; details are not repeated here.
Depending on application requirements, the manner of determining the movement amplitude and/or movement speed of the robot according to the movement speed of the user gesture may differ.
In an exemplary embodiment, multiple gesture movement speed ranges may be set, with different ranges corresponding to different robot movement amplitudes and/or robot movement speeds. On this basis, after the movement speed of the user gesture is obtained, the first gesture movement speed range to which it belongs can be determined from the multiple gesture movement speed ranges; the robot movement amplitude and/or robot movement speed corresponding to the first gesture movement speed range is then taken as the movement amplitude and/or movement speed of the robot.
Further, in implementation, the maximum and minimum values of each gesture movement speed range may be set directly, or at least one gesture speed threshold may be set so as to divide the gesture movement speed into multiple ranges.
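The threshold-based division just described can be sketched as follows. The specific threshold values and the robot speeds assigned to each range are invented for the illustration; the patent does not prescribe them.

```python
import bisect

# Hypothetical gesture speed thresholds (m/s) dividing gesture speed into
# three ranges, and the robot movement speed assigned to each range.
GESTURE_SPEED_THRESHOLDS = [0.3, 0.8]
ROBOT_SPEEDS = [0.2, 0.5, 1.0]  # one entry per range

def robot_speed_for_gesture(gesture_speed):
    """Find which gesture movement speed range the measured speed belongs
    to, and return the robot movement speed configured for that range."""
    idx = bisect.bisect_right(GESTURE_SPEED_THRESHOLDS, gesture_speed)
    return ROBOT_SPEEDS[idx]
```

With two thresholds, three ranges result; a gesture at 0.5 m/s falls in the middle range and maps to the middle robot speed.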
In addition, it is worth noting that, to make controlling the robot more interesting, to support diverse movement behaviors, or to meet specific application requirements, this embodiment does not limit the proportional relationship between the movement speed of the user gesture and the movement amplitude and/or movement speed of the robot. For example, the movement amplitude and/or movement speed of the robot may increase as the movement speed of the user gesture increases, i.e. the faster the gesture, the larger the robot's movement amplitude and the faster its movement speed. Alternatively, the movement amplitude and/or movement speed of the robot may decrease as the movement speed of the user gesture increases, i.e. the faster the gesture, the smaller the robot's movement amplitude and the slower its movement speed. Alternatively, the proportional relationship between the movement amplitude and/or movement speed of the robot and the movement speed of the user gesture may be set according to application requirements.
After the movement amplitude and/or movement speed of the robot is determined, the robot can be controlled to move with that movement amplitude and/or movement speed. This embodiment controls the robot in combination with the speed of the user gesture, which can largely meet the user's control needs and improve the user experience. For example, in a relatively urgent situation, the user can make the robot move faster through the gesture speed. As another example, in a more complicated environment, to make it easier for the robot to maneuver around obstacles, the user can make the robot move at a slower speed or with a smaller amplitude through the gesture speed.
In an exemplary embodiment, in addition to controlling the movement amplitude and/or movement speed of the robot according to the movement speed of the user gesture, the moving direction or motion track of the user gesture may also be obtained based on the photosensitive sensor array, and the moving direction of the robot determined according to the moving direction or motion track of the user gesture. On this basis, the robot's movement can be controlled by combining its moving direction with its movement amplitude and/or movement speed, i.e. the robot can be controlled to move in the determined moving direction with the determined robot movement amplitude and/or robot movement speed. If the moving direction of the user gesture changes continuously, the motion track of the user gesture can be obtained, achieving the effect of the robot moving along a certain track.
For the implementation of obtaining the moving direction of the user gesture based on the photosensitive sensor array, reference may be made to the foregoing embodiments; details are not repeated here.
The embodiments of the present application implement movement control of the robot using a photosensitive sensor array, so the implementation cost of the robot is relatively low; moreover, by combining the direction and speed of the user gesture, more versatile control can be achieved. For example, the robot can be controlled to move linearly in a certain direction, such as forward, backward, left or right. As another example, when the moving direction of the user gesture changes continuously, the robot can be controlled to move non-linearly, for example to draw a circle or a semicircle, among other motion modes, which enriches the diversity of gesture control.
It should be noted that the execution subject of each step of the methods provided by the above embodiments may be the same device, or the methods may be executed by different devices. For example, the execution subject of steps 401 to 403 may be device A; as another example, the execution subject of steps 401 and 402 may be device A and the execution subject of step 403 may be device B; and so on.
In addition, some of the processes described in the above embodiments and drawings contain multiple operations that appear in a particular order, but it should be clearly understood that these operations need not be executed in the order in which they appear herein and may be executed in parallel. Serial numbers of operations, such as 401 and 402, are only used to distinguish different operations; the serial numbers themselves do not represent any execution order. In addition, these processes may include more or fewer operations, and these operations may be executed in order or in parallel. It should be noted that descriptions such as "first" and "second" herein are used to distinguish different messages, devices, modules, etc.; they do not represent a sequence, nor do they limit "first" and "second" to being of different types.
Fig. 5a is a schematic structural diagram of a device for controlling robot movement provided by another exemplary embodiment of the present application. The device may be implemented as a functional module of the robot, or may be implemented independently of the robot while communicating with it. As shown in Fig. 5a, the device includes: a capture module 51, an obtaining module 52 and a control module 53.
The capture module 51 is used for capturing state change information generated by a photosensitive sensor array being blocked by a user gesture, the photosensitive sensor array being mounted on the robot.
The obtaining module 52 is used for obtaining the moving direction and movement speed of the user gesture according to the state change information generated by the photosensitive sensor array being blocked by the user gesture.
The control module 53 is used for controlling the robot to move according to the moving direction and movement speed of the user gesture.
In some exemplary embodiments, the capture module 51 is specifically used for:
collecting the electric signal output by each photosensitive sensor in the photosensitive sensor array during movement of the user gesture; and
obtaining, according to changes in the electric signal output by each photosensitive sensor during movement of the user gesture, the state change information generated by the photosensitive sensor array being blocked by the user gesture.
Further, when obtaining the state change information, the capture module 51 is specifically used for:
determining, according to the electric signal output by each photosensitive sensor during movement of the user gesture, the electric signal sequence output during the movement by each sensor combination in the mapping relations between sensor combinations and movement control modes; and
determining, according to changes in the electric signal sequence output by each sensor combination during the movement, whether each sensor combination is blocked by the user gesture and the duration of the blocking process.
Further, when determining whether each sensor combination is blocked by the user gesture and the duration of the blocking process, the capture module 51 is specifically used for:
determining whether each sensor combination is blocked by the user gesture according to the state change of the electric signal sequence output by each sensor combination during the movement and the signal state change rule of each sensor combination under its corresponding movement control mode; and
for a first sensor combination blocked by the user gesture, determining the duration of the blocking process of the first sensor combination according to the state change time of the electric signal sequence output by the first sensor combination during the movement.
In some exemplary embodiments, the capture module 51 is also used for:
during collection of the electric signal output by each photosensitive sensor during movement of the user gesture, starting a timer corresponding to each sensor combination to begin timing when a change in an electric signal is detected; and
when the first sensor combination blocked by the user gesture is determined, taking the timing result of the timer corresponding to the first sensor combination as the duration of the blocking process of the first sensor combination.
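The timer behavior described here can be sketched with a per-combination start timestamp. This is a simplified illustration under assumed names; real firmware would likely use interrupts or polling, and the class and method names are invented.

```python
import time

class BlockTimer:
    """Records, per sensor combination, when its signal first changed,
    so the blocked duration can be read once blocking is confirmed."""

    def __init__(self):
        self._start = {}

    def on_signal_change(self, combo_id, now=None):
        # Start timing on the first signal change for this combination;
        # later changes must not reset the start time.
        self._start.setdefault(combo_id,
                               now if now is not None else time.monotonic())

    def blocked_duration(self, combo_id, now=None):
        # Timing result for a combination identified as blocked.
        end = now if now is not None else time.monotonic()
        return end - self._start[combo_id]
```

The `now` parameter exists only to make the sketch testable with fixed timestamps; in use, the monotonic clock would be read directly.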
In some exemplary embodiments, each sensor combination includes multiple sensor clusters, and each sensor cluster includes at least one photosensitive sensor. On this basis, before determining whether each sensor combination is blocked by the user gesture and the duration of the blocking process, the capture module 51 is also used for:
for each sensor cluster in each sensor combination, calculating the shielding rate of the sensor cluster according to changes in the electric signal output during movement of the user gesture by the at least one photosensitive sensor included in the cluster, and determining, according to the shielding rate of the sensor cluster, the overall electric signal output by the sensor cluster during the movement; and
for each sensor combination, obtaining the electric signal sequence output by the sensor combination during the movement according to the overall electric signals output during the movement by the multiple sensor clusters it includes.
Further, when calculating the shielding rate of a sensor cluster, the capture module 51 is specifically used for:
counting, among the electric signals output during movement of the user gesture by the at least one photosensitive sensor included in the sensor cluster, the number of signals that change from high level to low level, as the number of blocked photosensitive sensors; and
calculating the shielding rate of the sensor cluster according to the number of blocked photosensitive sensors and the total number of photosensitive sensors included in the sensor cluster.
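The shielding-rate computation just described (blocked sensors over total sensors in a cluster) can be sketched as follows. The representation of each sensor's output as a list of high (1) / low (0) samples is a simplification made for the illustration.

```python
def went_high_to_low(trace):
    """True if the sensor's output fell from high (1) to low (0),
    i.e. the sensor was blocked at some point during the gesture."""
    return any(a == 1 and b == 0 for a, b in zip(trace, trace[1:]))

def shielding_rate(cluster_traces):
    """Shielding rate of a cluster: number of blocked photosensitive
    sensors divided by the total number of sensors in the cluster."""
    blocked = sum(went_high_to_low(t) for t in cluster_traces)
    return blocked / len(cluster_traces)
```

For a four-sensor cluster in which two sensors show a high-to-low transition, the shielding rate is 0.5; the overall signal of the cluster would then be derived from this rate (e.g. by thresholding), as the embodiment describes.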
In some exemplary embodiments, the obtaining module 52 is specifically used for:
determining the moving direction of the user gesture according to the order in which the photosensitive sensors in the first sensor combination are blocked; and
calculating the movement speed of the user gesture according to the duration of the blocking process of the first sensor combination and the arrangement width of the first sensor combination in the blocking-order direction.
In some exemplary embodiments, the control module 53 is specifically used for:
determining the moving direction of the robot according to the moving direction of the user gesture;
determining the movement amplitude and/or movement speed of the robot according to the movement speed of the user gesture; and
controlling the robot to move in the moving direction with the movement amplitude and/or movement speed.
In some exemplary embodiments, the photosensitive sensor array adopts a cross-shaped ("十"-character) layout or a quasi-cross-shaped layout.
The device for controlling robot movement provided by this embodiment can recognize a user gesture based on the photosensitive sensor array on the robot and control the robot to move according to the user gesture, providing technical support for gesture-based movement control; this greatly improves the freedom of human-computer interaction and the real-time performance of movement control.
The foregoing describes the internal functions and structure of the device for controlling robot movement. As shown in Fig. 5b, in practice, the device may be implemented as an electronic apparatus comprising: a memory 501 and a processor 502.
The memory 501 may be configured to store various other data to support operations on the electronic apparatus. Examples of such data include instructions for any application or method operated on the electronic apparatus, contact data, messages, pictures, videos, etc.
The memory 501 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk.
The processor 502 is coupled with the memory 501 and executes the program in the memory 501 in order to:
capture state change information generated by a photosensitive sensor array being blocked by a user gesture, the photosensitive sensor array being mounted on the robot;
obtain the moving direction and movement speed of the user gesture according to the state change information generated by the photosensitive sensor array being blocked by the user gesture; and
control the robot to move according to the moving direction and movement speed of the user gesture.
In some exemplary embodiments, the processor 502 is specifically used for:
collecting the electric signal output by each photosensitive sensor in the photosensitive sensor array during movement of the user gesture; and
obtaining, according to changes in the electric signal output by each photosensitive sensor during movement of the user gesture, the state change information generated by the photosensitive sensor array being blocked by the user gesture.
Further, when obtaining the state change information, the processor 502 is specifically used for:
determining, according to the electric signal output by each photosensitive sensor during movement of the user gesture, the electric signal sequence output during the movement by each sensor combination in the mapping relations between sensor combinations and movement control modes; and
determining, according to changes in the electric signal sequence output by each sensor combination during the movement, whether each sensor combination is blocked by the user gesture and the duration of the blocking process.
Further, when determining whether each sensor combination is blocked by the user gesture and the duration of the blocking process, the processor 502 is specifically used for:
determining whether each sensor combination is blocked by the user gesture according to the state change of the electric signal sequence output by each sensor combination during the movement and the signal state change rule of each sensor combination under its corresponding movement control mode; and
for a first sensor combination blocked by the user gesture, determining the duration of the blocking process of the first sensor combination according to the state change time of the electric signal sequence output by the first sensor combination during the movement.
In some exemplary embodiments, the processor 502 is also used for:
during collection of the electric signal output by each photosensitive sensor during movement of the user gesture, starting a timer corresponding to each sensor combination to begin timing when a change in an electric signal is detected; and
when the first sensor combination blocked by the user gesture is determined, taking the timing result of the timer corresponding to the first sensor combination as the duration of the blocking process of the first sensor combination.
In some exemplary embodiments, each sensor combination includes multiple sensor clusters, and each sensor cluster includes at least one photosensitive sensor. On this basis, before determining whether each sensor combination is blocked by the user gesture and the duration of the blocking process, the processor 502 is also used for:
for each sensor cluster in each sensor combination, calculating the shielding rate of the sensor cluster according to changes in the electric signal output during movement of the user gesture by the at least one photosensitive sensor included in the cluster, and determining, according to the shielding rate of the sensor cluster, the overall electric signal output by the sensor cluster during the movement; and
for each sensor combination, obtaining the electric signal sequence output by the sensor combination during the movement according to the overall electric signals output during the movement by the multiple sensor clusters it includes.
Further, when calculating the shielding rate of a sensor cluster, the processor 502 is specifically used for:
counting, among the electric signals output during movement of the user gesture by the at least one photosensitive sensor included in the sensor cluster, the number of signals that change from high level to low level, as the number of blocked photosensitive sensors; and
calculating the shielding rate of the sensor cluster according to the number of blocked photosensitive sensors and the total number of photosensitive sensors included in the sensor cluster.
In some exemplary embodiments, the processor 502 is specifically used for:
determining the moving direction of the user gesture according to the order in which the photosensitive sensors in the first sensor combination are blocked; and
calculating the movement speed of the user gesture according to the duration of the blocking process of the first sensor combination and the arrangement width of the first sensor combination in the blocking-order direction.
In some exemplary embodiments, the processor 502 is specifically used for:
determining the moving direction of the robot according to the moving direction of the user gesture;
determining the movement amplitude and/or movement speed of the robot according to the movement speed of the user gesture; and
controlling the robot to move in the moving direction with the movement amplitude and/or movement speed.
In some exemplary embodiments, the photosensitive sensor array adopts a cross-shaped ("十"-character) layout or a quasi-cross-shaped layout.
Further, as shown in Fig. 5b, the electronic apparatus also includes other components such as a communication component 503, a display 504, a power supply component 505 and an audio component 506. Only some components are schematically shown in Fig. 5b; this does not mean that the electronic apparatus includes only the components shown.
Correspondingly, the embodiments of the present application also provide a computer-readable storage medium storing a computer program which, when executed by a computer, can implement the method steps or functions of the above embodiments; details are not repeated here.
Fig. 6 is a schematic structural diagram of another electronic apparatus provided by another exemplary embodiment of the present application. As shown in Fig. 6, the electronic apparatus includes: a memory 601 and a processor 602.
The memory 601 may be configured to store various other data to support operations on the electronic apparatus. Examples of such data include instructions for any application or method operated on the electronic apparatus, contact data, messages, pictures, videos, etc.
The memory 601 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk.
The processor 602 is coupled with the memory 601 and executes the program in the memory 601 in order to:
obtain the movement speed of a user gesture based on a photosensitive sensor array, the photosensitive sensor array being mounted on the robot;
determine the movement amplitude and/or movement speed of the robot according to the movement speed of the user gesture; and
control the robot to move with the movement amplitude and/or movement speed.
In some exemplary embodiments, when determining the movement amplitude and/or movement speed of the robot, the processor 602 is specifically used for:
determining, from multiple gesture movement speed ranges, the first gesture movement speed range to which the movement speed of the user gesture belongs; and
taking the robot movement amplitude and/or robot movement speed corresponding to the first gesture movement speed range as the movement amplitude and/or movement speed of the robot.
Further, the processor 602 is also used for:
obtaining the moving direction or motion track of the user gesture based on the photosensitive sensor array; and
determining the moving direction of the robot according to the moving direction or motion track of the user gesture.
Correspondingly, when controlling the robot to move, the processor 602 is specifically used for: controlling the robot to move in the moving direction with the movement amplitude and/or movement speed.
For the implementation in which the processor 602 obtains the movement speed of the user gesture based on the photosensitive sensor array, reference may be made to the foregoing embodiments; details are not repeated here. Similarly, for the implementation in which the processor 602 obtains the moving direction of the user gesture based on the photosensitive sensor array, reference may be made to the foregoing embodiments; details are not repeated here.
Further, as shown in Fig. 6, the electronic apparatus also includes other components such as a communication component 603, a display 604, a power supply component 605 and an audio component 606. Only some components are schematically shown in Fig. 6; this does not mean that the electronic apparatus includes only the components shown.
Correspondingly, the embodiments of the present application also provide a computer-readable storage medium storing a computer program which, when executed by a computer, can implement the method steps or functions of the embodiment shown in Fig. 4d; details are not repeated here.
In Fig. 5b and Fig. 6, the communication component may be configured to facilitate wired or wireless communication between the device to which it belongs and other devices. The device may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
In Fig. 5b and Fig. 6, the display may include a screen, which may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
In Fig. 5b and Fig. 6, the power supply component provides power for the various components of the device to which it belongs. The power supply component may include a power management system, one or more power supplies, and other components associated with generating, managing and distributing power for the device.
In Fig. 5b and Fig. 6, the audio component is configured to output and/or input audio signals. For example, the audio component includes a microphone (MIC) which is configured to receive external audio signals when the device to which it belongs is in an operation mode such as a call mode, a recording mode or a voice recognition mode. The received audio signal may be further stored in the memory or transmitted via the communication component. In some embodiments, the audio component further includes a speaker for outputting audio signals.
Those skilled in the art should understand that the embodiments of the present invention may be provided as a method, a system or a computer program product. Therefore, the present invention may take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to magnetic disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems) and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce a device for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a particular manner, such that the instructions stored in the computer-readable memory produce a manufactured article including an instruction device which implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, such that a series of operation steps are executed on the computer or other programmable device to produce computer-implemented processing, so that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, a network interface and memory.
The memory may include non-permanent memory in a computer-readable medium, random access memory (RAM) and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
It should also be noted that the terms "include", "comprise" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the sentence "including a ..." does not exclude the existence of other identical elements in the process, method, article or device including that element.
The above description is only an example of the present application and is not intended to limit the present application. For those skilled in the art, various modifications and changes to the present application are possible. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application shall be included within the scope of the claims of the present application.
Claims (20)
1. A method for controlling movement of a robot, characterized by comprising:
capturing state change information generated by a photosensitive sensor array being blocked by a user gesture, the photosensitive sensor array being mounted on the robot;
obtaining a moving direction and a movement speed of the user gesture according to the state change information generated by the photosensitive sensor array being blocked by the user gesture; and
controlling the robot to move according to the moving direction and movement speed of the user gesture.
2. The method according to claim 1, characterized in that the capturing of state change information generated by a photosensitive sensor array being blocked by a user gesture comprises:
collecting an electric signal output by each photosensitive sensor in the photosensitive sensor array during movement of the user gesture; and
obtaining, according to changes in the electric signal output by each photosensitive sensor during movement of the user gesture, the state change information generated by the photosensitive sensor array being blocked by the user gesture.
3. The method according to claim 2, characterized in that the obtaining, according to changes in the electric signal output by each photosensitive sensor during movement of the user gesture, of the state change information generated by the photosensitive sensor array being blocked by the user gesture comprises:
determining, according to the electric signal output by each photosensitive sensor during movement of the user gesture, an electric signal sequence output during the movement by each sensor combination in mapping relations between sensor combinations and movement control modes; and
determining, according to changes in the electric signal sequence output by each sensor combination during the movement, whether each sensor combination is blocked by the user gesture and the duration of the blocking process.
4. The method according to claim 3, characterized in that determining, according to changes in the electric signal sequence output by each sensor combination during movement of the user gesture, whether each sensor combination is blocked by the user gesture and the duration of the blocking comprises:
determining whether each sensor combination is blocked by the user gesture according to the state changes of the electric signal sequence output by each sensor combination during movement of the user gesture and the signal state change rules corresponding to each sensor combination under its corresponding movement control manner;
for a first sensor combination blocked by the user gesture, determining the duration for which the first sensor combination is blocked according to the state change time of the electric signal sequence output by the first sensor combination during movement of the user gesture.
5. The method according to claim 4, characterized by further comprising:
while collecting the electric signals output by each photosensitive sensor during movement of the user gesture, starting a timer corresponding to each sensor combination when a change in an electric signal is detected;
when the first sensor combination blocked by the user gesture is determined, obtaining the timing result of the timer corresponding to the first sensor combination as the duration for which the first sensor combination is blocked.
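As an illustrative sketch only (the claim does not specify an implementation; the class name and use of `time.monotonic` are assumptions), the per-combination timer of claim 5 could look like this:

```python
import time

class CombinationTimer:
    """Per-sensor-combination stopwatch: started on the first signal
    change, read out once the combination is identified as blocked."""

    def __init__(self):
        self._start = None

    def on_signal_change(self):
        # Start timing on the first detected electric-signal change only.
        if self._start is None:
            self._start = time.monotonic()

    def blocked_duration(self):
        # 0.0 until a signal change has been observed.
        if self._start is None:
            return 0.0
        return time.monotonic() - self._start
```

One such timer would be kept per sensor combination; `time.monotonic` is preferred over wall-clock time for interval measurement because it cannot jump backwards.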
6. The method according to claim 4, characterized in that each sensor combination comprises a plurality of sensor clusters, and each sensor cluster comprises at least one photosensitive sensor;
before determining whether each sensor combination is blocked by the user gesture, the method further comprises:
for each sensor cluster in each sensor combination, calculating a shielding rate of the sensor cluster according to changes in the electric signals output during movement of the user gesture by the at least one photosensitive sensor included in the sensor cluster; and determining, according to the shielding rate of the sensor cluster, an overall electric signal output by the sensor cluster during movement of the user gesture;
for each sensor combination, obtaining the electric signal sequence output by the sensor combination during movement of the user gesture according to the overall electric signals output during movement of the user gesture by the plurality of sensor clusters included in the sensor combination.
7. The method according to claim 6, characterized in that calculating the shielding rate of the sensor cluster according to changes in the electric signals output during movement of the user gesture by the at least one photosensitive sensor included in the sensor cluster comprises:
counting, among the electric signals output during movement of the user gesture by the at least one photosensitive sensor included in the sensor cluster, the number of electric signals that change from high level to low level, as the number of blocked photosensitive sensors;
calculating the shielding rate of the sensor cluster according to the number of blocked photosensitive sensors and the total number of photosensitive sensors included in the sensor cluster.
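Purely as an illustration of claims 6-7 (the function names and the 0.5 threshold are assumptions, not from the application), the shielding rate and the cluster's overall signal could be computed as:

```python
def shielding_rate(cluster_signals):
    """Fraction of blocked sensors in one cluster.

    cluster_signals: list of (previous_level, current_level) tuples, one
    per photosensitive sensor, where 1 = high (lit) and 0 = low (shadowed).
    A high-to-low transition means that sensor was blocked by the hand.
    """
    blocked = sum(1 for prev, cur in cluster_signals if prev == 1 and cur == 0)
    return blocked / len(cluster_signals)

def cluster_overall_signal(cluster_signals, threshold=0.5):
    """Collapse a cluster to a single logical level: report the cluster
    as blocked (0) when its shielding rate reaches the threshold."""
    return 0 if shielding_rate(cluster_signals) >= threshold else 1
```

Concatenating the overall signals of a combination's clusters then yields the electric signal sequence used in claim 3.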
8. The method according to any one of claims 4-7, characterized in that obtaining the moving direction and movement speed of the user gesture according to the state change information generated when the photosensitive sensor array is blocked by the user gesture comprises:
determining the moving direction of the user gesture according to the order in which the photosensitive sensors in the first sensor combination are blocked; and
calculating the movement speed of the user gesture according to the duration for which the first sensor combination is blocked and the arrangement width of the first sensor combination along the direction of the blocking order.
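A minimal sketch of the claim-8 computation, assuming the arrangement width is known in metres and the blocked duration in seconds (the function names and direction labels are illustrative):

```python
def gesture_direction(block_order):
    """Direction from the order in which the sensors of the first
    combination were blocked; indices are assumed to increase left
    to right along the array."""
    return "left_to_right" if block_order[-1] > block_order[0] else "right_to_left"

def gesture_speed(arrangement_width_m, blocked_duration_s):
    """Speed = arrangement width along the blocking-order direction,
    divided by the time the combination stayed blocked."""
    return arrangement_width_m / blocked_duration_s
```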
9. The method according to any one of claims 1-7, characterized in that controlling the robot to move according to the direction and speed of the user gesture comprises:
determining the moving direction of the robot according to the moving direction of the user gesture;
determining the movement amplitude and/or movement speed of the robot according to the movement speed of the user gesture;
controlling the robot to move in the moving direction with the movement amplitude and/or movement speed.
10. A robot, characterized by comprising: a machine body, a photosensitive sensor array, a processor, and a memory; the photosensitive sensor array is mounted on the surface of the machine body, and the processor is connected to the memory and the photosensitive sensor array;
the memory is configured to store a computer program;
the processor is configured to execute the computer program so as to:
capture state change information generated when the photosensitive sensor array is blocked by a user gesture;
obtain a moving direction and a movement speed of the user gesture according to the state change information generated when the photosensitive sensor array is blocked by the user gesture;
control the robot to move according to the moving direction and movement speed of the user gesture.
11. The robot according to claim 10, characterized in that, when capturing the state change information, the processor is specifically configured to:
collect electric signals output by each photosensitive sensor in the photosensitive sensor array during movement of the user gesture;
obtain the state change information generated when the photosensitive sensor array is blocked by the user gesture according to changes in the electric signals output by each photosensitive sensor during movement of the user gesture.
12. The robot according to claim 11, characterized in that the processor is specifically configured to:
determine, according to the electric signals output by each photosensitive sensor during movement of the user gesture, the electric signal sequence output during movement of the user gesture by each sensor combination in a mapping relationship between sensor combinations and movement control manners;
determine, according to changes in the electric signal sequence output by each sensor combination during movement of the user gesture, whether each sensor combination is blocked by the user gesture and the duration of the blocking.
13. The robot according to claim 12, characterized in that the processor is specifically configured to:
determine whether each sensor combination is blocked by the user gesture according to the state changes of the electric signal sequence output by each sensor combination during movement of the user gesture and the signal state change rules corresponding to each sensor combination under its corresponding movement control manner;
for a first sensor combination blocked by the user gesture, determine the duration for which the first sensor combination is blocked according to the state change time of the electric signal sequence output by the first sensor combination during movement of the user gesture.
14. The robot according to claim 13, characterized in that each sensor combination comprises a plurality of sensor clusters, and each sensor cluster comprises at least one photosensitive sensor;
the processor is further configured to: for each sensor cluster in each sensor combination, calculate a shielding rate of the sensor cluster according to changes in the electric signals output during movement of the user gesture by the at least one photosensitive sensor included in the sensor cluster; and determine, according to the shielding rate of the sensor cluster, an overall electric signal output by the sensor cluster during movement of the user gesture;
for each sensor combination, obtain the electric signal sequence output by the sensor combination during movement of the user gesture according to the overall electric signals output during movement of the user gesture by the plurality of sensor clusters included in the sensor combination.
15. The robot according to any one of claims 10-14, characterized in that, when controlling the robot to move, the processor is specifically configured to:
determine the moving direction of the robot according to the moving direction of the user gesture;
determine the movement amplitude and/or movement speed of the robot according to the movement speed of the user gesture;
control the robot to move in the moving direction with the movement amplitude and/or movement speed.
16. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed, implements the steps of the method according to any one of claims 1-9.
17. A method for controlling movement of a robot, characterized by comprising:
obtaining a movement speed of a user gesture based on a photosensitive sensor array, the photosensitive sensor array being mounted on the robot;
determining a movement amplitude and/or movement speed of the robot according to the movement speed of the user gesture;
controlling the robot to move with the movement amplitude and/or movement speed.
18. The method according to claim 17, characterized in that determining the movement amplitude and/or movement speed of the robot according to the movement speed of the user gesture comprises:
determining, from a plurality of gesture movement speed ranges, a first gesture movement speed range to which the movement speed of the user gesture belongs;
taking the robot movement amplitude and/or robot movement speed corresponding to the first gesture movement speed range as the movement amplitude and/or movement speed of the robot.
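By way of illustration only (the specific ranges and motion values below are invented, not taken from the application), the claim-18 lookup from gesture speed range to robot amplitude and speed might be:

```python
# Hypothetical mapping: gesture speed range (m/s) -> robot motion parameters.
SPEED_RANGES = [
    ((0.0, 0.3), {"amplitude_m": 0.1, "robot_speed_mps": 0.2}),
    ((0.3, 0.8), {"amplitude_m": 0.3, "robot_speed_mps": 0.5}),
    ((0.8, float("inf")), {"amplitude_m": 0.6, "robot_speed_mps": 1.0}),
]

def robot_motion_for(gesture_speed_mps):
    """Return the robot amplitude/speed for the half-open interval
    [lo, hi) that contains the gesture speed."""
    for (lo, hi), motion in SPEED_RANGES:
        if lo <= gesture_speed_mps < hi:
            return motion
    raise ValueError("gesture speed must be non-negative")
```

A faster hand sweep thus selects a larger movement amplitude and/or a higher robot speed, which is the behaviour the claim describes.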
19. The method according to claim 17 or 18, characterized by further comprising:
obtaining a moving direction or motion track of the user gesture based on the photosensitive sensor array;
determining the moving direction of the robot according to the moving direction or motion track of the user gesture;
wherein controlling the robot to move with the movement amplitude and/or movement speed comprises:
controlling the robot to move in the moving direction with the movement amplitude and/or movement speed.
20. A robot, characterized by comprising: a machine body, a photosensitive sensor array, a processor, and a memory; the photosensitive sensor array is mounted on the surface of the machine body, and the processor is connected to the memory and the photosensitive sensor array;
the memory is configured to store a computer program;
the processor is configured to execute the computer program so as to:
obtain a movement speed of a user gesture based on the photosensitive sensor array, the photosensitive sensor array being mounted on the robot;
determine a movement amplitude and/or movement speed of the robot according to the movement speed of the user gesture;
control the robot to move with the movement amplitude and/or movement speed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710919203.4A CN109597405A (en) | 2017-09-30 | 2017-09-30 | Method for controlling robot movement, and robot |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710919203.4A CN109597405A (en) | 2017-09-30 | 2017-09-30 | Method for controlling robot movement, and robot |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109597405A true CN109597405A (en) | 2019-04-09 |
Family
ID=65956685
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710919203.4A Pending CN109597405A (en) | 2017-09-30 | 2017-09-30 | Method for controlling robot movement, and robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109597405A (en) |
Citations (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080291160A1 (en) * | 2007-05-09 | 2008-11-27 | Nintendo Co., Ltd. | System and method for recognizing multi-axis gestures based on handheld controller accelerometer outputs |
CN101730874A (en) * | 2006-06-28 | 2010-06-09 | 诺基亚公司 | Touchless gesture based input |
CN102711935A (en) * | 2009-11-12 | 2012-10-03 | 解放顾问有限公司 | Toy systems and position systems |
CN103529941A (en) * | 2013-07-15 | 2014-01-22 | 李华容 | Gesture recognition device and method based on two-dimensional graph |
CN103520935A (en) * | 2012-09-21 | 2014-01-22 | 徐志强 | Two-player adversarial remote control electric toy |
CN103777758A (en) * | 2014-02-17 | 2014-05-07 | 深圳市威富多媒体有限公司 | Method and device for interaction with mobile terminal through infrared lamp gestures |
CN103929167A (en) * | 2014-04-24 | 2014-07-16 | 石海工 | Sliding gesture photoelectric inductive switch |
CN104182037A (en) * | 2014-06-17 | 2014-12-03 | 惠州市德赛西威汽车电子有限公司 | Gesture recognition method based on coordinate conversion |
CN104780677A (en) * | 2015-04-03 | 2015-07-15 | 彭云 | Inductive switch and control method thereof |
CN104784938A (en) * | 2015-04-14 | 2015-07-22 | 广东奥飞动漫文化股份有限公司 | Toy car double-mode inductive control system |
CN104834244A (en) * | 2014-11-05 | 2015-08-12 | 电子科技大学 | Gesture switch and gesture control method |
CN204731580U (en) * | 2015-06-10 | 2015-10-28 | 广州大学 | A kind of non-touch gesture control |
CN105094423A (en) * | 2015-07-03 | 2015-11-25 | 施政 | Interactive system and method of electronic plate |
CN105117005A (en) * | 2015-08-17 | 2015-12-02 | 湖南迪文科技有限公司 | Light sensing based gesture identification system and method |
CN105511631A (en) * | 2016-01-19 | 2016-04-20 | 北京小米移动软件有限公司 | Gesture recognition method and device |
CN205243912U (en) * | 2015-09-04 | 2016-05-18 | 宋彦震 | Interactive gesture control fan with stall protection |
CN105787471A (en) * | 2016-03-25 | 2016-07-20 | 南京邮电大学 | Gesture identification method applied to control of mobile service robot for elder and disabled |
CN105824404A (en) * | 2015-11-20 | 2016-08-03 | 维沃移动通信有限公司 | Gesture operation and control method and mobile terminal |
CN205438579U (en) * | 2015-12-17 | 2016-08-10 | 广东德泷智能科技有限公司 | Intelligent robot system |
CN105881548A (en) * | 2016-04-29 | 2016-08-24 | 北京快乐智慧科技有限责任公司 | Method for waking up intelligent interactive robot and intelligent interactive robot |
CN205750723U (en) * | 2016-04-27 | 2016-11-30 | 深圳市前海万象智慧科技有限公司 | The control system of gesture identification based on human-computer interaction device |
CN106383583A (en) * | 2016-09-23 | 2017-02-08 | 深圳奥比中光科技有限公司 | Method and system capable of controlling virtual object to be accurately located and used for air man-machine interaction |
CN106976098A (en) * | 2017-06-05 | 2017-07-25 | 游尔(北京)机器人科技股份有限公司 | A kind of service robot neck structure |
2017-09-30: Application CN201710919203.4A filed in China; published as CN109597405A (en); status: Pending
Non-Patent Citations (1)
Title |
---|
GUO FENG: "Research and Implementation of a Gesture-Controlled Robot Teaching System Based on the ANDROID Platform", China Masters' Theses Full-text Database, Information Science and Technology Series * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105472469B (en) | Video playing progress adjustment method and device | |
CN105640443B (en) | The mute working method and device of automatic cleaning equipment, electronic equipment | |
CN104603763B (en) | Information transferring method and system, device and its computer readable recording medium storing program for performing | |
CN104394312B (en) | Filming control method and device | |
CN104535721B (en) | Air quality data display method and device | |
CN106814639A (en) | Speech control system and method | |
CN105555194B (en) | Activity measuring device, portable terminal, information sharing assisting system, information sharing system, movable assisting system and system | |
CN104301528B (en) | The method and device of display information | |
CN108536099A (en) | A kind of information processing method, device and mobile terminal | |
CN108073437A (en) | Method and mobile terminal are recommended in a kind of application | |
CN104170360A (en) | Intelligent response method of user equipment, and user equipment | |
CN104598076B (en) | Touch information screen method and device | |
CN105117008B (en) | Guiding method of operating and device, electronic equipment | |
CN109814952A (en) | A kind of application interface quickly starting control processing method, device and mobile terminal | |
CN106339384A (en) | Conversion method and device for storage procedures | |
CN105912190A (en) | Interface operation method and mobile terminal | |
CN104315664B (en) | Control the method and device of air purifier work | |
CN104598119A (en) | Screen capture method and device | |
CN103309520A (en) | Screen operation trace and sound input synchronous storage and processing method, system and terminal | |
CN107688385A (en) | A kind of control method and device | |
CN107529699A (en) | Control method of electronic device and device | |
CN104008764A (en) | Multimedia information marking method and relevant device | |
US20160112279A1 (en) | Sensor-based Distributed Tangible User Interface | |
CN110147318A (en) | It is a kind of to generate the method and device of test data, electronic equipment | |
CN109069900A (en) | A kind of method and device of treadmill step counting |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2019-04-09