CN110653813A - Robot control method, robot and computer storage medium

Robot control method, robot and computer storage medium

Info

Publication number
CN110653813A
Authority
CN
China
Prior art keywords: touch, robot, data, information, touch information
Prior art date
2018-06-29
Legal status
Pending
Application number
CN201810714715.1A
Other languages
Chinese (zh)
Inventor
熊友军
伍禄林
郑晓敏
杨敬
黄青春
肖兴
李昕
徐海波
周桓宇
Current Assignee
Shenzhen Ubtech Technology Co., Ltd.
Original Assignee
Shenzhen Ubtech Technology Co., Ltd.
Priority date
2018-06-29
Filing date
2018-06-29
Publication date
2020-01-07
Application filed by Shenzhen Ubtech Technology Co., Ltd.
Priority to CN201810714715.1A
Publication of CN110653813A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The application relates to the technical field of robots and discloses a robot control method, a robot, and a computer storage medium. The method comprises: acquiring touch information through a touch sensing point; analyzing the touch information to obtain touch data; and, in response to the touch information, randomly executing an interactive action or executing an interactive action corresponding to the touch data. Because the interactive action is chosen at random or according to the touch data, touching the same place on the robot can trigger different voices or different actions, which enriches the interaction function between the robot and humans.

Description

Robot control method, robot and computer storage medium
Technical Field
The present application relates to the field of robots, and in particular, to a robot control method, a robot, and a computer storage medium.
Background
Robotics is a multidisciplinary technology that emerged almost in step with artificial intelligence. In recent years, with the development of image processing, voice processing, wireless networking, Internet technology, automatic control, and computing power, the functions of robots have improved dramatically, and robots play an increasingly important role in human life.
Human-robot interaction functions are gradually developing, but when a present-stage robot is touched in the same place, it can only play the same voice or perform the same action. For example, when its head is touched, the robot can only repeat "hello"; when its hand is touched, it repeatedly raises the hand. The interaction between robot and human is therefore simple and tedious.
Disclosure of Invention
The technical problem mainly solved by the application is to provide a robot control method, a robot, and a computer storage medium, which can solve the problem that interaction between existing robots and humans is tedious.
In order to solve the technical problem, the application adopts a technical scheme of providing a robot control method including: acquiring touch information through touch sensing points; analyzing the touch information to obtain touch data; and responding to the touch information by randomly executing an interactive action or executing an interactive action corresponding to the touch data.
In order to solve the above technical problem, another technical solution adopted by the present application is to provide a robot comprising a sensor and a processor. The sensor is coupled to the processor and is used for acquiring touch information through a touch sensing point. The processor is used for analyzing the touch information to obtain touch data, and, in response to the touch information, the processor randomly executes an interactive action or executes an interactive action corresponding to the touch data.
In order to solve the above technical problem, the present application adopts another technical solution: there is provided a computer storage medium for storing program data executable to implement a method as described above.
The beneficial effect of this application is as follows. Different from the prior art, touch information is obtained through the touch sensing points, the touch information is analyzed to obtain touch data, and, in response to the touch information, an interactive action is executed at random or an interactive action corresponding to the touch data is executed. As a result, touching the same place on the robot can play different voices or trigger different actions, which enriches the interaction function between the robot and humans.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic flow chart diagram of a first embodiment of a robot control method of the present application;
FIG. 2 is a schematic flow chart diagram of a second embodiment of a robot control method of the present application;
FIG. 3 is a schematic flow chart diagram of a third embodiment of a robot control method of the present application;
FIG. 4 is a schematic structural diagram of an embodiment of the robot of the present application;
FIG. 5 is a schematic structural diagram of an embodiment of a robot computer storage medium according to the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions of the embodiments will be described clearly and completely below with reference to the drawings. It is obvious that the described embodiments are only some, not all, embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without inventive work fall within the scope of protection of the present application.
As shown in fig. 1, a first embodiment of a robot control method of the present application includes:
S1: Acquiring touch information through the touch sensing points.
The touch sensing point is disposed on the robot surface and may be located at any position of the surface, such as the head or the torso, which is not particularly limited herein.
The touch information may be an electric signal or an optical signal acquired by the robot. If the user touches the robot, the robot detects a change in the electric or optical signal, that is, it acquires touch information; if the user does not touch the robot, no touch information is acquired.
The touch sensing points of the robot support various touch sensing functions, such as side keys, slide bars, fingerprint recognition, and combinations thereof, which are not limited herein. There may be a plurality of touch sensing points, and their number is not particularly limited herein.
The side key can be arranged at any position on the robot and can be a push key or a slide key. In one embodiment, the side key provides two output states, active (touched) and inactive (not touched), which the processor maps to an ON state and an OFF state respectively; the key enters the ON state after wake-up information is acquired. When touch information is acquired through a touch sensing point, the processor records the corresponding touch sensing point as side key ON; when no touch information is acquired, it records the point as side key OFF. Judging whether the robot has acquired touch information through the touch sensing points therefore amounts to judging whether the user has pressed or slid the side key: when the user presses or slides it, the processor records the corresponding touch sensing point as side key ON, and the robot has acquired the touch information.
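As a minimal illustrative sketch only (the patent itself gives no code), the ON/OFF bookkeeping described above can be modeled as follows; the threshold value and all names are hypothetical, and Python is used here and in the later sketches purely for illustration.

```python
def side_key_states(samples, on_threshold=0.5):
    """Map raw side-key readings to the ON/OFF states the processor
    records: a reading above the threshold counts as a touch."""
    return ["ON" if s > on_threshold else "OFF" for s in samples]

# Example: the user presses the key during the middle two samples
print(side_key_states([0.0, 0.1, 0.9, 0.8, 0.2]))
# -> ['OFF', 'OFF', 'ON', 'ON', 'OFF']
```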
The slide bar, also called sliding touch, is composed of adjacently placed slide-bar-segment touch sensing points, each corresponding to a sensor. Touching one segment also activates several adjacent segments, and the geometric center of the finger touch, i.e. the centroid position, can be calculated from the touched segment and its neighbors. When the sensor senses a touch, the processor records the segment's touch sensing point as slide bar ON; when it senses no touch, the processor records it as slide bar OFF. Judging whether the robot has acquired touch information through the touch sensing points amounts to judging whether the user has performed a sliding touch: when the user slides, the processor records the slide-bar-segment touch sensing points as slide bar ON, and the robot has acquired the touch information.
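The patent does not specify how the centroid is computed; a common approach, sketched here under that assumption, is a signal-weighted average over the activated slide bar segments (the threshold and all names are illustrative).

```python
def slider_centroid(segment_signals, threshold=0.2):
    """Estimate the geometric center (centroid) of a finger touch, in
    segment units, from per-segment sense readings; returns None when
    no segment is activated (slide bar OFF state)."""
    active = [(i, s) for i, s in enumerate(segment_signals) if s > threshold]
    if not active:
        return None
    total = sum(s for _, s in active)
    return sum(i * s for i, s in active) / total

# Example: a finger pressing between segments 2 and 3
print(slider_centroid([0.0, 0.1, 0.8, 0.6, 0.05]))  # ~2.43
```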
The fingerprint identification device can acquire touch information by comparing the minutiae of different fingerprints, and can also judge whether the touching user has the authority to start the robot; different fingerprints can start different functions. In one embodiment, the user's fingerprints are entered in advance and acquired through the fingerprint identification device, so that the robot can be started only by an entered fingerprint, and touching the robot with different fingers makes the robot show different actions.
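Purely as an assumed illustration of "different fingers start different functions", a lookup from enrolled fingerprint identifiers to actions might look like this; the identifiers and action names are invented for the example.

```python
# Hypothetical enrollment table: fingerprint ID -> robot action
FINGER_ACTIONS = {
    "owner_right_index": "dance",
    "owner_right_thumb": "play_greeting",
}

def on_fingerprint(finger_id):
    """Start the robot only for an enrolled fingerprint and pick the
    action associated with that specific finger."""
    if finger_id not in FINGER_ACTIONS:
        return None  # unknown fingerprint: the robot is not started
    return FINGER_ACTIONS[finger_id]

print(on_fingerprint("owner_right_index"))  # dance
print(on_fingerprint("unknown_finger"))     # None
```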
S2: Analyzing the touch information to obtain touch data.
After the touch information is acquired through the touch sensing points, it is analyzed to obtain touch pressure data, touch frequency data, or both simultaneously; the touching finger can also be analyzed and recognized from the touch information.
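The following sketch shows one assumed way to derive pressure and frequency data from raw touch events, using a sliding time window for the preset duration; the class and its parameters are illustrative rather than part of the patent.

```python
import time

class TouchParser:
    """Derive touch pressure and touch frequency data from raw touch
    events observed within a preset duration (sliding window)."""

    def __init__(self, window_s=2.0):
        self.window_s = window_s
        self.events = []  # list of (timestamp, pressure) pairs

    def feed(self, pressure, now=None):
        now = time.monotonic() if now is None else now
        self.events.append((now, pressure))
        # drop events that fall outside the preset duration
        self.events = [(t, p) for t, p in self.events
                       if now - t <= self.window_s]

    def touch_frequency(self):
        return len(self.events)  # number of touches in the window

    def touch_pressure(self):
        return max((p for _, p in self.events), default=0.0)

parser = TouchParser(window_s=2.0)
parser.feed(0.4, now=0.0)
parser.feed(0.9, now=1.5)
print(parser.touch_frequency(), parser.touch_pressure())  # 2 0.9
```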
S3: Responding to the touch information, and randomly executing the interactive action or executing the interactive action corresponding to the touch data.
The interactive action includes at least one of a motion of the robot, a played voice, or a played video. After responding to the touch information, the robot randomly executes a pre-stored interactive action or executes the interactive action corresponding to the touch data. Random execution may mean executing all the interactive actions in sequence within a preset time, shuffling the sequence and executing the actions in the shuffled order, or executing only part of the actions. Alternatively, after responding to the touch information, the robot executes an interactive action corresponding to the touch data; the same touch information may correspond to several interactive actions, which can be a combination of motion and voice, several motions, or several voices. Whether the robot executes an interactive action at random or the action corresponding to the touch data, and the number of pre-stored interactions, are set by the user.
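The three variants of random execution named above (execute everything in shuffled order, execute only a subset, or make a single random pick) can be sketched as follows; the action names match the embodiment below, and the print call stands in for actual motion or voice playback.

```python
import random

INTERACTIONS = ["dance", "say_weather", "say_nice_to_meet_you"]

def respond_to_touch(mode="shuffle"):
    """Execute pre-stored interactive actions in one of the random
    modes described above (an assumed reading of the patent text)."""
    if mode == "shuffle":      # all actions, in shuffled order
        order = random.sample(INTERACTIONS, k=len(INTERACTIONS))
    elif mode == "subset":     # only part of the actions
        k = random.randint(1, len(INTERACTIONS))
        order = random.sample(INTERACTIONS, k=k)
    else:                      # a single random pick per touch
        order = [random.choice(INTERACTIONS)]
    for action in order:
        print("executing:", action)  # stand-in for motion/voice playback

respond_to_touch("shuffle")
```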
Specifically, in one embodiment, three interactions are pre-stored: dancing, playing the voice "the weather is really good today", and playing the voice "happy to know you". The robot is set to execute the interactions in shuffled order: for example, the user touches the robot's head and the robot dances; the user touches the head again and the robot plays "the weather is really good today"; the user touches the head once more and the robot dances again.
Specifically, in another embodiment, three interactions are pre-stored: dancing, doing push-ups, and playing the voice "happy to know you". The robot is set to execute the interactions in a fixed order: the user touches the robot's head and the robot dances; the user touches the head again and the robot does a push-up; touching the head once more, the robot plays "happy to know you".
Of course, in other embodiments, the robot may also perform multiple interactions simultaneously for the same touch action, and the sequence of interactions performed on each touch may change; this is not limited here.
In this embodiment, touch information is obtained through a touch sensing point, the touch information is analyzed to obtain touch data, and, in response to the touch information, an interactive action is executed at random or an interactive action corresponding to the touch data is executed. As a result, touching the same place on the robot can play different voices or trigger different actions, which enriches the interaction function between the robot and humans.
As shown in fig. 2, a second embodiment of the robot control method according to the present application is based on the first embodiment, and further defines that step S2 includes:
S21: Analyzing the touch information to obtain touch pressure data and/or touch frequency data.
The touch data includes at least one of touch pressure data and touch frequency data, where the touch frequency is the number of touches within a preset duration set by the user. After touch information is acquired through the touch sensing points, it is analyzed to obtain the touch pressure, and different preset instructions are executed according to the pressure range; touches within the same pressure range trigger the same preset instruction. Alternatively, the touch information is analyzed to obtain the touch frequency, and likewise different preset instructions are executed according to the frequency range, with the same instruction for the same range. In a combined analysis, the touch pressure and touch frequency are evaluated together, different preset instructions are executed according to the combined range, and the same combined range triggers the same instruction. A preset instruction is a pre-stored instruction and may include an action instruction, a voice instruction, or both. The pressure, frequency, and combined ranges are preset and can be obtained in advance through an algorithm; there may be several ranges, which is not specifically limited here.
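A sketch of the range-based dispatch described above: every pressure (or frequency) value inside the same preset range triggers the same preset instruction. The boundary values and instruction names are invented for illustration.

```python
PRESSURE_RANGES = [
    ((0.0, 2.0), "play_that_tickles"),  # "low" range
    ((2.0, 10.0), "play_that_hurts"),   # "high" range
]

def dispatch(pressure):
    """Return the preset instruction whose range contains the pressure."""
    for (lo, hi), instruction in PRESSURE_RANGES:
        if lo <= pressure < hi:
            return instruction
    return None  # outside all preset ranges

print(dispatch(0.5))  # play_that_tickles
print(dispatch(3.0))  # play_that_hurts
```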
The interaction of the robot may also carry an emotional characteristic value. The touch pressure is positively correlated with the emotional characteristic value: the larger the touch pressure, the larger the robot's emotional characteristic value. The touch frequency is likewise positively correlated with it: the higher the touch frequency, the larger the emotional characteristic value; touch pressure and touch frequency can also be analyzed together. The robot's emotional characteristics are expressed through its expressions and movements, for example an angry expression or dancing.
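One simple way to realize the positive correlation, offered only as an assumption, is a monotone weighted combination of touch pressure and touch frequency; the weights are arbitrary.

```python
def emotion_value(pressure, frequency, w_pressure=0.6, w_frequency=0.4):
    """Larger touch pressure or higher touch frequency yields a larger
    emotional characteristic value (positive correlation)."""
    return w_pressure * pressure + w_frequency * frequency

print(emotion_value(1.0, 2))  # 1.4
print(emotion_value(5.0, 6))  # 5.4: harder and more frequent, "angrier"
```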
Specifically, in one embodiment, the preset pressure ranges include "low" and "high". The user lightly touches the robot; the touch information is analyzed to obtain touch pressure data, the pressure falls in the "low" range, the robot plays "that tickles", and its expression is happy. The user touches the robot slightly harder; the pressure still falls in the "low" range, the robot still plays "that tickles", but its expression starts to change and is less happy than before. The user touches the robot heavily; the pressure now falls in the "high" range, the robot plays "that hurts", and its expression is angry. The user touches the robot harder still; the pressure remains in the "high" range, the robot still plays "that hurts", but its expression is angrier than before.
As shown in fig. 3, a third embodiment of the robot control method according to the present application is based on the first embodiment, and further includes:
S4: In the process of executing the interactive action, acquiring touch information through the touch sensing points and suspending at least part of the interactive action.
The robot executes the interactive action after acquiring touch information through the touch sensing point. If touch information is acquired again while the interactive action is not yet completed, the robot suspends part or all of the unfinished interactive action. After suspension, the action can either be resumed by the user's voice or terminated completely; which of the two happens is set by the user.
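The pause/resume behavior can be summarized as a small state machine; the state names are illustrative and the resume word, as stated above, would be set by the user.

```python
class InteractionController:
    """idle -> running on a first touch; running -> paused on a touch
    during execution; paused -> running on the configured voice command."""

    def __init__(self, resume_word="continue"):
        self.resume_word = resume_word
        self.state = "idle"

    def on_touch(self):
        if self.state == "idle":
            self.state = "running"  # first touch starts the interaction
        elif self.state == "running":
            self.state = "paused"   # touch during execution suspends it

    def on_voice(self, utterance):
        if self.state == "paused" and utterance == self.resume_word:
            self.state = "running"  # resume the suspended action

ctrl = InteractionController()
ctrl.on_touch(); ctrl.on_touch()
print(ctrl.state)        # paused
ctrl.on_voice("continue")
print(ctrl.state)        # running
```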
Specifically, in one embodiment, the user touches the robot, the robot dances, the user touches the robot again, and the robot terminates dancing.
Specifically, in another embodiment, the user touches the robot and the robot dances; the user touches the robot again and the robot pauses dancing; after the user says "continue", the robot resumes dancing. Of course, the voice command is not limited to "continue"; the specific voice is set by the user.
In the present embodiment, step S4 is executed concurrently with or after step S3.
As shown in fig. 4, the present application further provides a robot 100. The robot 100 includes a sensor 10 and a processor 20; the sensor 10 is coupled to the processor 20 and is used for acquiring touch information through touch sensing points. The sensor 10 may be, for example, a pressure sensor or a frequency sensor, which is not specifically limited. The processor 20 is configured to analyze the touch information to obtain touch data, and, in response to the touch information, randomly executes an interactive action or executes an interactive action corresponding to the touch data.
The processor 20 is used to control the operation of the robot and may also be referred to as a Central Processing Unit (CPU). The processor 20 may be an integrated circuit chip with signal processing capability. It may also be a general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components; a general-purpose processor may be a microprocessor or any conventional processor. By acquiring the touch information and obtaining the touch data, the interactive action is executed at random or the interactive action corresponding to the touch data is executed, so that touching the same place on the robot can play different voices or trigger different actions, enriching the interaction function between the robot and humans.
Since the program data is stored in a storage medium, as shown in fig. 5, the present application also provides a computer storage medium 30. The storage medium 30 stores program data 31, which can be executed to implement the robot control method described above. The storage medium 30 may be a floppy disk, a hard disk, an optical disk, a memory card, or the like, read and written through an interface connection; it may also be a server that is read and written over a network connection.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (10)

1. A robot control method, characterized in that the method comprises:
acquiring touch information through touch sensing points;
analyzing the touch information to obtain touch data;
and, in response to the touch information, randomly executing an interactive action or executing an interactive action corresponding to the touch data.
2. The method of claim 1, wherein the touch data comprises at least one of touch pressure data and touch frequency data, wherein the touch frequency is a number of touches within a preset time period.
3. The method of claim 1, wherein analyzing the touch information to obtain the touch data comprises:
analyzing the touch information to obtain the touch pressure data; and/or
analyzing the touch information to obtain the touch frequency data.
4. The method of claim 1, further comprising:
in the process of executing the interactive action, acquiring the touch information through the touch sensing point, and suspending at least part of the interactive action.
5. The method of claim 1, wherein the interactive action has an emotional characteristic value, and the touch pressure and/or the touch frequency is positively correlated with the emotional characteristic value.
6. A robot, characterized in that the robot comprises a sensor and a processor;
the sensor is coupled to the processor and is used for acquiring touch information through a touch sensing point;
the processor analyzes the touch information to obtain touch data;
and the processor, in response to the touch information, randomly executes an interactive action or executes an interactive action corresponding to the touch data.
7. The robot of claim 6, wherein the touch data includes at least one of touch pressure data and touch frequency data, wherein the touch frequency is a number of touches within a preset time period.
8. The robot of claim 6, wherein the processor is configured to analyze the touch information to obtain the touch data, and is further configured to:
analyze the touch information to obtain the touch pressure data; and/or
analyze the touch information to obtain the touch frequency data.
9. The robot of claim 6, wherein the sensor is configured to acquire touch information through touch sensing points, and wherein:
in the process of the processor executing the interactive action, the sensor acquires the touch information through the touch sensing point, and the processor suspends executing at least part of the interactive action.
10. A computer storage medium for storing program data executable to implement the method of any one of claims 1-5.
CN201810714715.1A 2018-06-29 2018-06-29 Robot control method, robot and computer storage medium Pending CN110653813A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810714715.1A CN110653813A (en) 2018-06-29 2018-06-29 Robot control method, robot and computer storage medium

Publications (1)

Publication Number Publication Date
CN110653813A 2020-01-07

Family

ID=69027171

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810714715.1A Pending CN110653813A (en) 2018-06-29 2018-06-29 Robot control method, robot and computer storage medium

Country Status (1)

Country Link
CN (1) CN110653813A

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1304346A * 1999-05-10 2001-07-18 Sony Corporation Robot device and method for controlling same
EP2105263A2 * 2008-03-27 2009-09-30 Institutul de Mecanica Solidelor al Academiei Romane Real time control method and device for robots in virtual projection
CN104216299A * 2014-08-22 2014-12-17 iFLYTEK Co., Ltd. Control device of intelligent marketing robot
CN104769645A * 2013-07-10 2015-07-08 哲睿有限公司 Virtual companion
CN106393113A * 2016-11-16 2017-02-15 上海木爷机器人技术有限公司 Robot and interactive control method for robot
CN107463291A * 2017-07-28 2017-12-12 上海木爷机器人技术有限公司 Touch-based robot with anthropomorphic performance
CN108073336A * 2016-11-18 2018-05-25 The Chinese University of Hong Kong User emotion detecting system and method based on touch

Similar Documents

Publication Publication Date Title
KR101932210B1 (en) Method, system for implementing operation of mobile terminal according to touching signal and mobile terminal
US8958631B2 (en) System and method for automatically defining and identifying a gesture
CN104090652B (en) A kind of pronunciation inputting method and device
US9807559B2 (en) Leveraging user signals for improved interactions with digital personal assistant
JP4357935B2 (en) Information processing apparatus and signature data input program
KR100978929B1 (en) Registration method of reference gesture data, operation method of mobile terminal and mobile terminal
KR20190082140A (en) Devices and methods for dynamic association of user input with mobile device actions
US20070274591A1 (en) Input apparatus and input method thereof
CN108538298A (en) voice awakening method and device
CN108647055A (en) Application program preloads method, apparatus, storage medium and terminal
Iyer et al. Emotion based mood enhancing music recommendation
CN103529934A (en) Method and apparatus for processing multiple inputs
TW201218023A (en) Efficient gesture processing
CN108920202A (en) Using preloading management method, device, storage medium and intelligent terminal
WO2021159896A1 (en) Video processing method, video processing device, and storage medium
CN110544468B (en) Application awakening method and device, storage medium and electronic equipment
WO2017219450A1 (en) Information processing method and device, and mobile terminal
CN109360551B (en) Voice recognition method and device
WO2016168982A1 (en) Method, apparatus and terminal device for setting interrupt threshold for fingerprint identification device
WO2022007544A1 (en) Device control method and apparatus, and storage medium and electronic device
US20220391697A1 (en) Machine-learning based gesture recognition with framework for adding user-customized gestures
WO2019228149A1 (en) Collection method and apparatus for prediction sample, and storage medium and smart terminal
WO2015131590A1 (en) Method for controlling blank screen gesture processing and terminal
KR102450763B1 (en) Apparatus and method for user classification by using keystroke pattern based on user posture
CN111310725A (en) Object identification method, system, machine readable medium and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200107)