CN113997285A - Robot head control method and device and electronic equipment - Google Patents

Robot head control method and device and electronic equipment

Info

Publication number
CN113997285A
Authority
CN
China
Prior art keywords
head
robot
robot head
target object
initial position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111266675.7A
Other languages
Chinese (zh)
Inventor
张明星
周华强
常耕林
臧超
陆寅
徐之东
张威
朱晓东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guoqi Pujin Intelligent Technology Hefei Co ltd
Original Assignee
Guoqi Pujin Intelligent Technology Hefei Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guoqi Pujin Intelligent Technology Hefei Co ltd
Priority to CN202111266675.7A
Publication of CN113997285A
Legal status: Pending

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a robot head control method and device, and electronic equipment. The core idea is as follows: after the robot is powered on, the head is independently driven through two-dimensional motion to initialize its position, while the robot simultaneously monitors for a target object to interact with. When such an object is detected, its position information is obtained and compared against a preset central area of the robot head, and the head is then steered by a two-dimensional independent motion mechanism until it directly faces the target. After the current round of interaction ends, the head is again independently driven horizontally and/or vertically to return to its initial position and await the next round of interaction. The invention achieves fine-grained control of the robot head: target following and posture adjustment are performed by the head alone, without rotating the robot chassis for orientation control, so the facing direction of the robot can be adjusted flexibly while greatly improving the user's interaction experience.

Description

Robot head control method and device and electronic equipment
Technical Field
The present invention relates to the field of robots, and in particular, to a method and an apparatus for controlling a robot head, and an electronic device.
Background
With the rapid development of artificial intelligence technology, robots have been applied in many fields. In practice, when a robot interacts with a target object, its head needs to be rotated so that the head-mounted pan-tilt unit faces the object, enabling the interaction.
In the prior art, however, most robots are aimed at the target object by rotating the chassis, that is, the whole robot body is turned toward the target object's direction.
Disclosure of Invention
In view of the above, the present invention is directed to a robot head control method and device, and electronic equipment, so as to overcome the drawback of orienting the robot head toward a target object by rotating the chassis.
The technical scheme adopted by the invention is as follows:
in a first aspect, the present invention provides a robot head control method, including:
after power-on, detecting the current position of the robot head;
comparing the current position with a preset initial position;
if the current position is not consistent with the initial position, independently controlling the robot head to rotate horizontally and/or vertically so as to enable the robot head to rotate to the initial position;
monitoring a target object to be interacted in real time;
when the target object is monitored, determining the position information of the target object;
comparing the position information with a current central area, wherein the central area represents a front face orientation area of the robot head;
if the position information is not in accordance with the central area, independently controlling the head of the robot to rotate horizontally and/or vertically so as to enable the target object to be located in the range of the central area;
and after the interaction is finished, the robot head is independently controlled to rotate horizontally and/or vertically so as to reset the robot head to the initial position.
In at least one possible implementation, the resetting the robot head to the initial position includes:
when the interaction is finished and other target objects to be interacted are not monitored, if the head of the robot is not at the initial position, timing is started;
in the timing process, judging whether a preset time threshold value is exceeded or not;
and if the time is out, controlling the head of the robot to reset to the initial position.
In at least one possible implementation, the time threshold includes a robot head rest time.
In at least one possible implementation manner, the comparing the location information with the current central area includes:
when the position of the target object is within a preset tolerance range of the central area, judging that the position information is consistent with the central area; otherwise, judging that the position information is not in accordance with the central area.
In at least one possible implementation manner, the control method further includes: and detecting the current horizontal position and the current vertical position of the robot head in real time in the process of independently controlling the robot head to rotate horizontally and/or vertically.
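The claimed method can be read as a simple state flow: initialize, monitor, track, reset. The Python sketch below is a hypothetical rendering of that flow; the state labels, tuple conventions, and callback signature are illustrative assumptions, not part of the patent.

```python
def head_control_step(state, head_pos, initial_pos, target_pos, in_center):
    """One iteration of the claimed head-control flow.

    state:       'INIT', 'IDLE', or 'TRACKING' (hypothetical labels)
    head_pos:    (horizontal, vertical) angles read from the head sensors
    initial_pos: preset initial position of the head
    target_pos:  position of a monitored target object, or None if absent
    in_center:   predicate telling whether target_pos lies in the central area
    Returns (next_state, (d_horizontal, d_vertical)) rotation command.
    """
    if state == 'INIT':
        # Steps S1-S3: rotate back to the initial position if needed.
        if head_pos != initial_pos:
            return 'INIT', (initial_pos[0] - head_pos[0],
                            initial_pos[1] - head_pos[1])
        return 'IDLE', (0.0, 0.0)
    if state == 'IDLE':
        # Steps S4-S5: a target object to interact with was monitored.
        if target_pos is not None:
            return 'TRACKING', (0.0, 0.0)
        return 'IDLE', (0.0, 0.0)
    # TRACKING, steps S6-S7: keep the target inside the central area.
    if target_pos is None:
        # Step S8: interaction over, go back and reset to the initial position.
        return 'INIT', (0.0, 0.0)
    if not in_center(target_pos):
        return 'TRACKING', (target_pos[0] - head_pos[0],
                            target_pos[1] - head_pos[1])
    return 'TRACKING', (0.0, 0.0)
```

Each call consumes one sensor reading and emits one head-only rotation command; the chassis is never involved, which is the point of the claim.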
In a second aspect, the present invention provides a robot head control device, comprising:
the current position detection module is used for detecting the current position of the head of the robot after being electrified;
the initial position comparison module is used for comparing the current position with a preset initial position;
the head two-dimensional control module is used for independently controlling the robot head to horizontally rotate and/or vertically rotate when the initial position comparison module outputs that the current position does not accord with a preset initial position, so that the robot head rotates to the initial position;
the target object monitoring module is used for monitoring a target object to be interacted in real time;
the target position calculation module is used for determining the position information of the target object when the target object is monitored;
the head orientation comparison module is used for comparing the position information with a current central area, wherein the central area represents a front orientation area of the head of the robot;
the head two-dimensional control module is further configured to, when the head orientation comparison module outputs that the position information does not coincide with the central region, individually control the robot head to perform horizontal rotation and/or vertical rotation, so that the target object is located within the range of the central region;
and the head two-dimensional control module is also used for independently controlling the robot head to horizontally rotate and/or vertically rotate after the interaction is finished so as to reset the robot head to the initial position.
In at least one possible implementation manner, the head two-dimensional control module includes a head resetting unit, and the head resetting unit specifically includes:
the timing component is used for starting timing if the head of the robot is not at the initial position when the interaction is finished and other target objects to be interacted are not monitored;
the overtime detection component is used for judging whether a preset time threshold value is exceeded or not in the timing process;
and the head two-dimensional control component is used for controlling the robot head to reset to the initial position when the output of the overtime detection component is overtime.
In at least one possible implementation manner, the control device further includes: and the head posture detection module is used for detecting the current horizontal position and the current vertical position of the robot head in real time in the process of independently controlling the robot head to rotate horizontally and/or vertically.
In a third aspect, the present invention provides an electronic device, comprising:
one or more processors, a memory (which may employ a non-volatile storage medium), and one or more computer programs stored in the memory, the one or more computer programs comprising instructions which, when executed by the device, cause the device to perform the method of the first aspect or any possible implementation of the first aspect.
The main idea of the present invention is as follows: after the robot is powered on, the head is independently driven through two-dimensional motion to initialize its position, while the robot monitors for a target object to interact with. When such an object is detected, its position information is obtained and compared against a preset central area of the robot head, and the head is steered by a two-dimensional independent motion mechanism until it directly faces the target. After the current round of interaction ends, the head is again independently driven horizontally and/or vertically to return to its initial position for the next round of interaction. The invention achieves fine-grained control of the robot head: target following and posture adjustment are performed by the head alone, without rotating the robot chassis for orientation control, so the facing direction of the robot can be adjusted flexibly while greatly improving the user's interaction experience.
Drawings
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be further described with reference to the accompanying drawings, in which:
fig. 1 is a flowchart of an embodiment of a robot head control method provided in the present invention;
FIG. 2 is a schematic diagram of an embodiment of a robot head control apparatus provided in the present invention;
fig. 3 is a schematic diagram of an embodiment of an electronic device provided in the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative only and should not be construed as limiting the invention.
The invention provides at least one embodiment of a robot head control method, as shown in fig. 1, which specifically includes:
and step S1, after power-on, detecting the current position of the robot head.
In actual operation, the relevant modules of the robot can be initialized, and then the control module of the robot monitors the current position of the head of the robot through the horizontal position sensor and the vertical position sensor arranged on the head.
And step S2, comparing the current position with a preset initial position.
Specifically, the control module may determine whether the robot head is at the initial position by comparing the current position information with preset initial position information. The initial position can be set per robot type; for an anthropomorphic robot, for example, the head position centered on the trunk and facing straight ahead may be set as the initial position.
And step S3, if the current position is not in accordance with the initial position, the robot head is controlled to rotate horizontally and/or vertically independently so as to rotate to the initial position.
Specifically, this process comprises two branches: (1) if the head is not at the initial position, the control module enables the head's horizontal motor and/or vertical motor to move the head, and stops the motion once the head reaches the initial position; (2) if the head is already at the initial position, this control is skipped and the next step is entered.
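The per-axis decision in branch (1) can be sketched as follows. This is an illustrative fragment: the function name, the (horizontal, vertical) tuple convention, and the deadband value are assumptions, not taken from the patent.

```python
def motors_to_enable(current, initial, deadband=0.5):
    """Decide, per axis, whether the horizontal and/or vertical motor must be
    enabled to bring the head back to the initial position.

    current/initial are (horizontal_deg, vertical_deg) head positions;
    `deadband` is a hypothetical tolerance (degrees) below which an axis
    already counts as in position and its motor stays disabled.
    """
    enable_horizontal = abs(current[0] - initial[0]) > deadband
    enable_vertical = abs(current[1] - initial[1]) > deadband
    return enable_horizontal, enable_vertical
```

Only the axes that are actually off-position get their motor enabled, which matches the "horizontal and/or vertical" wording of the claim.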
And step S4, monitoring the target object to be interacted in real time.
It should be noted that the foregoing steps can all be regarded as the initialization stage of the robot's head pose. Once initialization is complete, or in parallel with it, the robot monitors for interaction objects, i.e., identifies whether the current scene contains a target object that intends to interact with the robot. The target object is not limited to a human; it may also be another computing device, another robot, and so on. The interaction can be triggered by existing mature techniques, such as waking up the robot or issuing an interaction instruction in any of several modalities, which are not detailed here. In practice, the target object may be identified by the application module through monitoring sensors, such as (but not limited to) visual image detection or audio localization analysis.
And step S5, when the target object is monitored, determining the position information of the target object.
In practice, the application module may compute the position of the target object from the monitoring sensors' input data, yielding position information such as (but not limited to) azimuth and height. The lock on the target's position can be refined further using features of the target such as its face or sound source.
And step S6, comparing the position information with the current central area.
In actual operation, the application module may continuously compare the computed position of the target object with a central area of the robot head. The central area represents the front-facing region of the head: a certain spatial range directly in front of the head is pre-calibrated as the central area. The calibration method and result can be set per robot type; for a robot with an anthropomorphic face, for example, the spatial range corresponding to the nose, eyes, and mouth may be designated as the central area. In practical applications, a tolerance threshold may additionally be set for the central area (for example, with the span between the robot's two eyes serving as the central area, the tolerance range may extend 2° to 5° beyond each eye). Thus, if the target object's position falls within the central area plus its preset tolerance, the position is judged to coincide with the central area; otherwise, it is judged not to coincide.
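The tolerance comparison described above reduces to a one-line angular test. The sketch below is a hypothetical rendering of it; the function name and the treatment of all quantities as horizontal azimuth angles in degrees are illustrative assumptions.

```python
def matches_central_area(target_azimuth_deg, center_deg, half_width_deg,
                         tolerance_deg=2.0):
    """Return True if the target lies within the central area plus tolerance.

    Mirrors the eye-spacing example in the text: the span between the robot's
    eyes (width 2 * half_width_deg around center_deg) is the central area,
    extended by a 2-5 degree tolerance on each side. All values illustrative.
    """
    return abs(target_azimuth_deg - center_deg) <= half_width_deg + tolerance_deg
```

In the vertical direction the same test would apply with elevation angles; a target passing both tests needs no head motion, anything else triggers the two-dimensional adjustment of step S7.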
And step S7, if the position information does not match the central area, controlling the robot head to rotate horizontally and/or vertically individually so that the target object is located within the central area.
In practical operation, after receiving the comparison result and command from the application module, the control module enables the head's horizontal motor and/or vertical motor to move the head, dynamically adjusting it until the target object lies within the central area (plus tolerance). During the current round of interaction, the head's attitude angle can also be continuously monitored and adaptively adjusted as the target moves: if the monitoring sensors detect that the target has moved to another position during the interaction, the new position is computed in real time and the head's facing direction is adjusted by the horizontal and/or vertical two-dimensional control mechanism in the manner described above.
And step S8, after the interaction is finished, independently controlling the robot head to rotate horizontally and/or vertically so as to reset the robot head to the initial position.
After this round of interaction ends, the control module can return the robot head to the initial position to await the next round of interaction or a power-off procedure. In practice, the following scheme may be adopted: when the current round of interaction has ended and no other target object to interact with is detected, and the head is not at the initial position, timing starts; during timing, the module checks whether a preset time threshold has been exceeded (the head's rest time in this scenario can be preset, e.g., 5 to 10 seconds). If the stationary posture exceeds the time threshold, the control module resets the head to the initial position through the two-dimensional motion control mechanism.
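The timeout-based reset decision can be captured in a small helper. This is a sketch under stated assumptions: the class name, method names, and the 5-second default are hypothetical, and the clock is injected so the logic can be exercised without real waiting.

```python
import time


class RestTimer:
    """Decides when the head should reset to its initial position after an
    interaction ends, per the rest-time scheme described in the text."""

    def __init__(self, rest_time_s=5.0, clock=time.monotonic):
        self._clock = clock
        self._rest_time_s = rest_time_s
        self._started_at = None  # None means timing has not started

    def update(self, at_initial_position, target_present):
        """Call once per control cycle; returns True when the reset should fire."""
        if target_present or at_initial_position:
            # A new target appeared, or the head is already home: cancel timing.
            self._started_at = None
            return False
        if self._started_at is None:
            self._started_at = self._clock()  # start timing now
        return self._clock() - self._started_at >= self._rest_time_s
```

When `update` returns True, the caller would command the two-dimensional motion mechanism to drive the head back to the initial position.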
Finally, it should be added that while the head is independently rotated horizontally and/or vertically during initialization, target following, resetting, and so on, the head's current horizontal and vertical positions can be detected in real time by a horizontal position sensor and a vertical position sensor mounted on the head, so as to decide when to stop the motion. Specifically, once the target object's horizontal position is determined, the head's pan-tilt unit can be rotated horizontally toward the target according to the horizontal relative angle between the head's front face and the target, with the head's horizontal position continuously monitored until the front of the head faces the target. Likewise, once the target's vertical position is determined, the pan-tilt unit can be rotated vertically toward the target according to the vertical relative angle between the head's front face and the target, with the vertical position continuously monitored until the front of the head faces the target.
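The rotate-while-monitoring loop described for each axis can be sketched as below. The callback names (`read_angle` onto the position sensor, `rotate_step` onto the motor), the 1-degree step bound, and the stop tolerance are all illustrative assumptions.

```python
def rotate_axis_until_facing(read_angle, rotate_step, target_angle,
                             tolerance_deg=1.0, max_steps=360):
    """Drive one axis (horizontal or vertical) toward `target_angle`,
    re-reading the position sensor every step to decide when to stop, as the
    text describes. Returns True once the axis faces the target, False if
    max_steps is exhausted (e.g., a stalled motor)."""
    for _ in range(max_steps):
        error = target_angle - read_angle()
        if abs(error) <= tolerance_deg:
            return True  # front of the head now faces the target on this axis
        # Command a bounded step toward the target (clamped to +/- 1 degree).
        rotate_step(min(max(error, -1.0), 1.0))
    return False
```

Running this loop once with the horizontal sensor/motor pair and once with the vertical pair realizes the independent two-dimensional control, with no chassis rotation involved.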
In summary, the main idea of the present invention is that after the robot is powered on, the head is independently driven through two-dimensional motion to initialize its position, while the robot monitors for a target object to interact with. When such an object is detected, its position information is obtained and compared against a preset central area of the robot head, and the head is steered by a two-dimensional independent motion mechanism until it directly faces the target. After the current round of interaction ends, the head is again independently driven horizontally and/or vertically to return to its initial position and await the next round. The invention achieves fine-grained control of the robot head: target following and posture adjustment are performed by the head alone, without rotating the robot chassis for orientation control, so the facing direction of the robot can be adjusted flexibly while greatly improving the user's interaction experience.
Corresponding to the above embodiments and preferred solutions, the present invention further provides an embodiment of a robot head control device, as shown in fig. 2, which may specifically include the following components (the following modules are different concepts from the control module, the application module, and the like mentioned in the foregoing examples):
the current position detection module 1 is used for detecting the current position of the head of the robot after being electrified;
an initial position comparison module 2, configured to compare the current position with a preset initial position;
the head two-dimensional control module 3 is used for independently controlling the robot head to horizontally rotate and/or vertically rotate when the initial position comparison module outputs that the current position does not accord with a preset initial position, so that the robot head rotates to the initial position;
the target object monitoring module 4 is used for monitoring a target object to be interacted in real time;
the target position calculation module 5 is used for determining the position information of the target object when the target object is monitored;
the head orientation comparison module 6 is configured to compare the position information with a current central area, where the central area represents a front orientation area of the robot head;
the head two-dimensional control module 3 is further configured to, when the head orientation comparison module outputs that the position information does not coincide with the central region, individually control the robot head to perform horizontal rotation and/or vertical rotation, so that the target object is located within the range of the central region;
and the head two-dimensional control module 3 is also used for independently controlling the robot head to horizontally rotate and/or vertically rotate after the interaction is finished, so as to reset the robot head to the initial position.
In at least one possible implementation manner, the head two-dimensional control module includes a head resetting unit, and the head resetting unit specifically includes:
the timing component is used for starting timing if the head of the robot is not at the initial position when the interaction is finished and other target objects to be interacted are not monitored;
the overtime detection component is used for judging whether a preset time threshold value is exceeded or not in the timing process;
and the head two-dimensional control component is used for controlling the robot head to reset to the initial position when the output of the overtime detection component is overtime.
In at least one possible implementation manner, the control device further includes: and the head posture detection module is used for detecting the current horizontal position and the current vertical position of the robot head in real time in the process of independently controlling the robot head to rotate horizontally and/or vertically.
It should be understood that the above division of the components in the robot head control device shown in fig. 2 is merely a logical division, and the actual implementation may be wholly or partially integrated into one physical entity or may be physically separated. And these components may all be implemented in software invoked by a processing element; or may be implemented entirely in hardware; and part of the components can be realized in the form of calling by the processing element in software, and part of the components can be realized in the form of hardware. For example, a certain module may be a separate processing element, or may be integrated into a certain chip of the electronic device. Other components are implemented similarly. In addition, all or part of the components can be integrated together or can be independently realized. In implementation, each step of the above method or each component above may be implemented by an integrated logic circuit of hardware in a processor element or an instruction in the form of software.
For example, the above components may be one or more integrated circuits configured to implement the above methods, such as one or more Application-Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field-Programmable Gate Arrays (FPGAs). As another example, these components may be integrated together and implemented in the form of a System-on-a-Chip (SoC).
In view of the foregoing examples and their preferred embodiments, those skilled in the art will appreciate that in practice the technical idea underlying the present invention may be applied in a variety of embodiments, which the present invention schematically illustrates through the following carriers:
(1) an electronic device is provided. The device may specifically include: one or more processors, memory, and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions, which when executed by the apparatus, cause the apparatus to perform the steps/functions of the foregoing embodiments or an equivalent implementation.
The electronic device may specifically be a computer-related electronic device, such as but not limited to a robot control core, a robot remote control platform, a background server, and the like.
Fig. 3 is a schematic structural diagram of an embodiment of an electronic device provided in the present invention, and specifically, the electronic device 900 includes a processor 910 and a memory 930. Wherein, the processor 910 and the memory 930 can communicate with each other and transmit control and/or data signals through the internal connection path, the memory 930 is used for storing computer programs, and the processor 910 is used for calling and running the computer programs from the memory 930. The processor 910 and the memory 930 may be combined into a single processing device, or more generally, separate components, and the processor 910 is configured to execute the program code stored in the memory 930 to implement the functions described above. In particular implementations, the memory 930 may be integrated with the processor 910 or may be separate from the processor 910.
In addition, to further enhance the functionality of the electronic device 900, the device 900 may further include one or more of an input unit 960, a display unit 970, an audio circuit 980, a camera 990, a sensor 901, and the like, which may further include a speaker 982, a microphone 984, and the like. The display unit 970 may include a display screen, among others.
Further, the apparatus 900 may also include a power supply 950 for providing power to various devices or circuits within the apparatus 900.
It should be understood that the operation and/or function of the various components of the apparatus 900 can be referred to in the foregoing description with respect to the method, system, etc., and the detailed description is omitted here as appropriate to avoid repetition.
It should be understood that the processor 910 in the electronic device 900 shown in fig. 3 may be a system-on-chip (SoC); the processor 910 may include a Central Processing Unit (CPU) and may further include other types of processors, such as a Graphics Processing Unit (GPU).
In summary, various portions of the processors or processing units within the processor 910 may cooperate to implement the foregoing method flows, and corresponding software programs for the various portions of the processors or processing units may be stored in the memory 930.
(2) A computer data storage medium having stored thereon a computer program or the above apparatus which, when executed, causes a computer to perform the steps/functions of the preceding embodiments or equivalent implementations.
In the several embodiments provided by the present invention, any of the functions, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-accessible data storage medium. Based on this understanding, the part of the technical solution of the present invention that contributes substantially to the prior art may be embodied in the form of a software product.
In particular, it should be noted that the storage medium may refer to a server or a similar computer device, and specifically, the aforementioned computer program or the aforementioned apparatus is stored in a storage device in the server or the similar computer device.
(3) A computer program product (which may include the above apparatus) which, when run on a terminal device, causes the terminal device to perform the robot head control method of the preceding embodiment or equivalent embodiments.
From the above description of the embodiments, it is clear to those skilled in the art that all or part of the steps of the above method can be implemented by software plus a necessary general-purpose hardware platform. With this understanding, the above computer program product may include, but is not limited to, an app.
In the foregoing, the device/terminal may be a computer device, and the hardware structure of the computer device may further specifically include: at least one processor, at least one communication interface, at least one memory, and at least one communication bus; the processor, the communication interface, and the memory can all communicate with one another through the communication bus. The processor may be a Central Processing Unit (CPU), a DSP, or a microcontroller, and may further include a Graphics Processing Unit (GPU), an embedded Neural-network Processing Unit (NPU), and an Image Signal Processor (ISP); it may further include an Application-Specific Integrated Circuit (ASIC) or one or more integrated circuits configured to implement the embodiments of the present invention. The processor may run one or more software programs, which may be stored in a storage medium such as the memory. The aforementioned memory/storage medium may comprise non-volatile memories such as non-removable magnetic disks, U-disks, removable hard disks, and optical disks, as well as Read-Only Memories (ROM), Random Access Memories (RAM), and the like.
In the embodiments of the present invention, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association between associated objects and covers three relationships: for example, "A and/or B" may mean that A exists alone, that A and B exist simultaneously, or that B exists alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. "At least one of the following" and similar expressions refer to any combination of the listed items, including any combination of single or plural items. For example, "at least one of a, b, and c" may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be singular or plural.
Those skilled in the art will appreciate that the modules, units, and method steps described in the embodiments disclosed in this specification can be implemented as electronic hardware, computer software, or a combination of the two. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
Moreover, modules, units, and the like described herein as separate components may or may not be physically separate; they may be located in one place or distributed across multiple places, for example across the nodes of a system network. Some or all of the modules and units can be selected according to actual needs to achieve the purpose of the above embodiments, which can be understood and carried out by those skilled in the art without inventive effort.
The structure, features, and effects of the present invention have been described in detail with reference to the embodiments shown in the drawings, but the above embodiments are merely preferred embodiments of the present invention. It should be understood that the technical features of the above embodiments and their preferred modes can be reasonably combined and configured into various equivalent schemes by those skilled in the art without departing from or changing the design idea and technical effects of the present invention. Therefore, the invention is not limited to the embodiments shown in the drawings; all modifications and equivalent embodiments conceived according to the idea of the invention fall within the scope of the invention, as long as they do not depart from the spirit of the description and the drawings.

Claims (10)

1. A robot head control method, comprising:
after power-on, detecting the current position of the robot head;
comparing the current position with a preset initial position;
if the current position is not consistent with the initial position, independently controlling the robot head to rotate horizontally and/or vertically so as to enable the robot head to rotate to the initial position;
monitoring a target object to be interacted in real time;
when the target object is monitored, determining the position information of the target object;
comparing the position information with a current central area, wherein the central area represents a front face orientation area of the robot head;
if the position information is not in accordance with the central area, independently controlling the head of the robot to rotate horizontally and/or vertically so as to enable the target object to be located in the range of the central area;
and after the interaction is finished, the robot head is independently controlled to rotate horizontally and/or vertically so as to reset the robot head to the initial position.
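As an informal illustration only (not part of the claims), the control flow recited in claim 1 might be sketched as follows. The `SimHead` class, the pan/tilt angle representation, and all numeric tolerances are hypothetical assumptions introduced for the sketch, not taken from the patent:

```python
# Hypothetical sketch of the claim-1 control flow: compare the detected
# pose with a preset initial position, face a monitored target, and
# reset after the interaction ends. All values are illustrative.

INITIAL_POSE = (0.0, 0.0)   # preset initial position: (pan_deg, tilt_deg)
POSE_TOL = 1.0              # tolerance for treating two poses as consistent
CENTER_TOL = 5.0            # half-width of the front-facing central region


class SimHead:
    """Stand-in for a two-axis (pan/tilt) robot head."""

    def __init__(self, pan, tilt):
        self.pan, self.tilt = pan, tilt

    def rotate_to(self, pan, tilt):
        # The two axes are driven independently, as the claim requires;
        # an axis already within tolerance is left alone.
        if abs(self.pan - pan) > POSE_TOL:
            self.pan = pan
        if abs(self.tilt - tilt) > POSE_TOL:
            self.tilt = tilt


def target_in_center(head, target_pan, target_tilt):
    """Is the target inside the head's front-facing central region?"""
    return (abs(target_pan - head.pan) <= CENTER_TOL
            and abs(target_tilt - head.tilt) <= CENTER_TOL)


def control_cycle(head, target=None):
    # Steps 1-3: after power-on, compare the detected pose with the
    # preset initial position and rotate back if they differ.
    head.rotate_to(*INITIAL_POSE)
    # Steps 4-7: if a target is detected outside the central region,
    # rotate the head so the target falls within that region.
    if target is not None and not target_in_center(head, *target):
        head.rotate_to(*target)
    # Step 8: after the interaction ends, reset to the initial position.
    head.rotate_to(*INITIAL_POSE)
    return head.pan, head.tilt
```

Driving the pan and tilt axes through separate calls (rather than one coupled motion command) is one way to realize the "independently controlling ... horizontal rotation and/or vertical rotation" wording.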
2. The robot head control method of claim 1, wherein resetting the robot head to the initial position comprises:
when the interaction is finished and other target objects to be interacted are not monitored, if the head of the robot is not at the initial position, timing is started;
in the timing process, judging whether a preset time threshold value is exceeded or not;
and if the time threshold is exceeded, controlling the robot head to reset to the initial position.
3. A robot head control method according to claim 2, characterized in that the time threshold comprises a robot head rest time.
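Informally (again, not part of the claims), the timed reset of claims 2 and 3 amounts to a guarded timeout check. The function below is a sketch; the rest-time value and the way idle time is passed in are illustrative assumptions:

```python
# Hypothetical sketch of the claims-2/3 timed reset: timing starts only
# once the interaction has ended, no further target is monitored, and
# the head is away from the initial position; the reset fires when the
# preset time threshold (the head rest time) is exceeded.

REST_TIME_S = 5.0  # hypothetical head rest time threshold, in seconds


def should_reset(head_at_initial, now, idle_since, interacting=False):
    """Return True when the head should be reset to its initial position.

    `now` and `idle_since` are timestamps from the same monotonic clock;
    `idle_since` marks when the interaction ended with no new target.
    """
    if interacting or head_at_initial:
        return False  # nothing to time: still busy, or already home
    return (now - idle_since) > REST_TIME_S
```

In a real controller the timestamps would typically come from a monotonic clock so that wall-clock adjustments cannot trigger or suppress the reset.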
4. The method of claim 1, wherein comparing the position information to a current center region comprises:
when the position of the target object is within a preset tolerance range of the central area, judging that the position information is consistent with the central area; otherwise, judging that the position information is not consistent with the central area.
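The tolerance comparison of claim 4 can be read as a simple bounding check, sketched informally below; the 2-D coordinate representation and the tolerance value are illustrative assumptions:

```python
# Hypothetical sketch of the claim-4 comparison: the position
# information "matches" the central area when the target lies within a
# preset tolerance of the area's center on both axes.


def position_matches_center(target_xy, center_xy, tolerance):
    """Return True when the target is within `tolerance` of the center."""
    dx = abs(target_xy[0] - center_xy[0])
    dy = abs(target_xy[1] - center_xy[1])
    return dx <= tolerance and dy <= tolerance
```

A non-zero tolerance avoids chattering: the head is not re-aimed for every small movement of a target that is already roughly front-facing.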
5. A robot head control method according to any of claims 1-4, characterized in that the control method further comprises: and detecting the current horizontal position and the current vertical position of the robot head in real time in the process of independently controlling the robot head to rotate horizontally and/or vertically.
6. A robot head control device, comprising:
the current position detection module is used for detecting the current position of the head of the robot after being electrified;
the initial position comparison module is used for comparing the current position with a preset initial position;
the head two-dimensional control module is used for independently controlling the robot head to horizontally rotate and/or vertically rotate when the initial position comparison module outputs that the current position does not accord with a preset initial position, so that the robot head rotates to the initial position;
the target object monitoring module is used for monitoring a target object to be interacted in real time;
the target position calculation module is used for determining the position information of the target object when the target object is monitored;
the head orientation comparison module is used for comparing the position information with a current central area, wherein the central area represents a front orientation area of the head of the robot;
the head two-dimensional control module is further configured to, when the head orientation comparison module outputs that the position information does not coincide with the central region, individually control the robot head to perform horizontal rotation and/or vertical rotation, so that the target object is located within the range of the central region;
and the head two-dimensional control module is also used for independently controlling the robot head to horizontally rotate and/or vertically rotate after the interaction is finished so as to reset the robot head to the initial position.
7. The robot head control device of claim 6, wherein the head two-dimensional control module comprises a head repositioning unit, the head repositioning unit specifically comprising:
the timing component is used for starting timing if the head of the robot is not at the initial position when the interaction is finished and other target objects to be interacted are not monitored;
the timeout detection component is used for judging, during timing, whether a preset time threshold value is exceeded;
and the head two-dimensional control component is used for controlling the robot head to reset to the initial position when the timeout detection component indicates a timeout.
8. A robot head control device according to claim 6 or 7, characterized in that the control device further comprises: and the head posture detection module is used for detecting the current horizontal position and the current vertical position of the robot head in real time in the process of independently controlling the robot head to rotate horizontally and/or vertically.
9. An electronic device, comprising:
one or more processors, memory, and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the electronic device, cause the electronic device to perform the robotic head control method of any of claims 1-5.
10. A computer data storage medium having a computer program stored therein, which when run on a computer causes the computer to perform the robot head control method of any one of claims 1 to 5.
CN202111266675.7A 2021-10-28 2021-10-28 Robot head control method and device and electronic equipment Pending CN113997285A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111266675.7A CN113997285A (en) 2021-10-28 2021-10-28 Robot head control method and device and electronic equipment


Publications (1)

Publication Number Publication Date
CN113997285A (en) 2022-02-01

Family

ID=79924793

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111266675.7A Pending CN113997285A (en) 2021-10-28 2021-10-28 Robot head control method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN113997285A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090234499A1 (en) * 2008-03-13 2009-09-17 Battelle Energy Alliance, Llc System and method for seamless task-directed autonomy for robots
CN206140532U (en) * 2016-09-19 2017-05-03 苏州小璐机器人有限公司 Inspection robot head initial position and control its head pivoted device
CN107471247A (en) * 2017-08-25 2017-12-15 歌尔科技有限公司 Robot head rotating method and robot
CN108177146A (en) * 2017-12-28 2018-06-19 北京奇虎科技有限公司 Control method, device and the computing device of robot head
CN109108968A (en) * 2018-08-17 2019-01-01 深圳市三宝创新智能有限公司 Exchange method, device, equipment and the storage medium of robot head movement adjustment
CN109955248A (en) * 2017-12-26 2019-07-02 深圳市优必选科技有限公司 A kind of robot and its face follower method
CN112711331A (en) * 2020-12-28 2021-04-27 京东数科海益信息科技有限公司 Robot interaction method and device, storage equipment and electronic equipment
CN214067747U (en) * 2020-12-25 2021-08-27 广州小鹏自动驾驶科技有限公司 Vehicle-mounted interactive robot and vehicle



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220201