CN111376244A - Robot awakening method and system and robot - Google Patents

Robot awakening method and system and robot Download PDF

Info

Publication number
CN111376244A
CN111376244A (application CN201811608993.5A)
Authority
CN
China
Prior art keywords
radar data
robot
awakening
data
condition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811608993.5A
Other languages
Chinese (zh)
Other versions
CN111376244B (en)
Inventor
熊友军
谢文学
黄高波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ubtech Robotics Corp filed Critical Ubtech Robotics Corp
Priority to CN201811608993.5A priority Critical patent/CN111376244B/en
Publication of CN111376244A publication Critical patent/CN111376244A/en
Application granted granted Critical
Publication of CN111376244B publication Critical patent/CN111376244B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention is applicable to the technical field of intelligent robots and provides a robot wake-up method, a robot wake-up system, and a robot. The robot wake-up method includes: collecting radar data in real time and monitoring whether the radar data meet a pre-wake-up condition; if the radar data meet the pre-wake-up condition, acquiring first radar data within a preset angle on both sides of the front the robot faces and within a first preset distance from the robot; judging whether a user has entered a wake-up range according to the first radar data; if the user has entered the wake-up range, acquiring second radar data within the preset angle on both sides of the front the robot faces and within a second preset distance from the robot; judging whether a wake-up condition is met according to the second radar data; and if the wake-up condition is met, waking up the robot. Using multiple segments of radar data improves the wake-up recognition accuracy and effectively solves the problem that existing robots have a low wake-up accuracy in noisy environments.

Description

Robot awakening method and system and robot
Technical Field
The invention belongs to the technical field of intelligent robots, and particularly relates to a robot awakening method, a robot awakening system and a robot.
Background
With the continuous development of science and technology, service robots have been widely deployed in shopping malls, hospitals, administrative service halls, and other venues, providing intelligent services through voice interaction with people. For effective interaction, a user is usually required to wake the robot before speaking with it, and in the prior art the robot is generally woken by a wake-up word. In a noisy environment, however, wake-up word recognition is easily disturbed by background speech and noise.
In summary, existing robots suffer from a low wake-up accuracy in noisy environments.
Disclosure of Invention
In view of this, embodiments of the present invention provide a robot wake-up method, a robot wake-up system, and a robot, so as to solve the problem that existing robots have a low wake-up accuracy in noisy environments.
The first aspect of the invention provides a robot awakening method, which comprises the following steps:
collecting radar data in real time, and monitoring whether the radar data meet a pre-wake-up condition;
if the radar data meet the pre-wake-up condition, acquiring first radar data within a preset angle on both sides of the front the robot faces and within a first preset distance from the robot;
judging whether a user has entered a wake-up range according to the first radar data;
if the user has entered the wake-up range, acquiring second radar data within the preset angle on both sides of the front the robot faces and within a second preset distance from the robot, the second preset distance being smaller than the first preset distance;
judging whether a wake-up condition is met according to the second radar data;
and if the wake-up condition is met, waking up the robot.
A second aspect of the present invention provides a robot wake-up system, including:
a monitoring module, used for collecting radar data in real time and monitoring whether the radar data meet a pre-wake-up condition;
a first acquisition module, used for acquiring first radar data within a preset angle on both sides of the front the robot faces and within a first preset distance from the robot if the radar data meet the pre-wake-up condition;
a first judging module, used for judging whether a user has entered a wake-up range according to the first radar data;
a second acquisition module, used for acquiring second radar data within the preset angle on both sides of the front the robot faces and within a second preset distance from the robot if the user has entered the wake-up range, the second preset distance being smaller than the first preset distance;
a second judging module, used for judging whether a wake-up condition is met according to the second radar data;
and a wake-up module, used for waking up the robot if the wake-up condition is met.
A third aspect of the invention provides a robot comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the following steps when executing the computer program:
collecting radar data in real time, and monitoring whether the radar data meet a pre-wake-up condition;
if the radar data meet the pre-wake-up condition, acquiring first radar data within a preset angle on both sides of the front the robot faces and within a first preset distance from the robot;
judging whether a user has entered a wake-up range according to the first radar data;
if the user has entered the wake-up range, acquiring second radar data within the preset angle on both sides of the front the robot faces and within a second preset distance from the robot, the second preset distance being smaller than the first preset distance;
judging whether a wake-up condition is met according to the second radar data;
and if the wake-up condition is met, waking up the robot.
A fourth aspect of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, performs the following steps:
collecting radar data in real time, and monitoring whether the radar data meet a pre-wake-up condition;
if the radar data meet the pre-wake-up condition, acquiring first radar data within a preset angle on both sides of the front the robot faces and within a first preset distance from the robot;
judging whether a user has entered a wake-up range according to the first radar data;
if the user has entered the wake-up range, acquiring second radar data within the preset angle on both sides of the front the robot faces and within a second preset distance from the robot, the second preset distance being smaller than the first preset distance;
judging whether a wake-up condition is met according to the second radar data;
and if the wake-up condition is met, waking up the robot.
According to the robot wake-up method, robot wake-up system, and robot provided above, radar data are acquired over multiple distance bands, the acquired radar data are analysed to determine whether a user is approaching the robot and therefore wants to wake it, and the robot is actively woken when the user comes close. Using multiple segments of radar data improves the wake-up recognition accuracy and effectively solves the problem that existing robots have a low wake-up accuracy in noisy environments.
Drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed for the embodiments or the prior-art descriptions are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart illustrating an implementation process of a robot wake-up method according to an embodiment of the present invention;
fig. 2a is a schematic diagram of radar data distribution when a user does not enter a wake-up range in the method according to an embodiment of the present invention;
fig. 2b is a schematic diagram illustrating distribution of radar data when a user enters a wake-up range and does not satisfy a wake-up condition in the method according to an embodiment of the present invention;
fig. 2c is a schematic diagram illustrating distribution of radar data when a wake-up condition is satisfied in the method according to the first embodiment of the present invention;
fig. 3 is a schematic flow chart of an implementation of step S101 according to a second embodiment of the present invention;
fig. 4 is a schematic flow chart of an implementation of step S103 according to a third embodiment of the present invention;
fig. 5 is a schematic flow chart of an implementation of step S105 according to a fourth embodiment of the present invention;
fig. 6 is a schematic structural diagram of a robot wake-up system according to a fifth embodiment of the present invention;
fig. 7 is a schematic structural diagram of a monitoring module 101 according to a sixth embodiment of the present invention;
fig. 8 is a schematic structural diagram of the first determining module 103 according to the seventh embodiment of the present invention;
fig. 9 is a schematic structural diagram of the second determining module 105 according to the eighth embodiment of the present invention;
fig. 10 is a schematic view of a robot according to the ninth embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In order to explain the technical means of the present invention, the following description will be given by way of specific examples.
The first embodiment is as follows:
as shown in fig. 1, the present embodiment provides a robot wake-up method, which specifically includes:
step S101: and collecting radar data in real time, and monitoring whether the radar data meets a pre-awakening condition.
In a specific application, radar data are collected in real time by a radar monitoring module arranged in the robot body, and whether the pre-wake-up condition is met is judged from the radar data. The pre-wake-up condition is that the number of data points of the radar data increases by a first number threshold; when it does, a user may have entered the wake-up range, so it is necessary to further confirm whether the user has actually done so. The wake-up range is a sector area centred on a point of the longitudinal axis of the robot body, with the first preset distance as its radius and the preset angle on the left and right of straight ahead as its boundaries. For example, with a first preset distance of 1.5 m and a preset angle of 30 degrees, the wake-up range is a sector centred on a point of the longitudinal axis of the robot body, with a radius of 1.5 m, bounded by two lines each at 30 degrees to the front of the robot (so that the angle between the two boundary lines is 60 degrees).
It should be noted that the radar transmitter of the radar monitoring module emits a radar signal in real time; the signal is reflected when it encounters an obstacle (a user or another object), and the radar monitoring module monitors the received reflected radar data in real time.
It should also be noted that the closer an obstacle is to the robot, the more data points of radar data are collected.
In a specific application, as shown in fig. 2a, a circle represents a robot, a cone represents the orientation of the robot, and when no user enters a wake-up range, no reflected radar data is collected in the wake-up range of the robot.
As shown in fig. 2b, where a circle represents a robot, and a cone represents an orientation of the robot, when a user enters a wake-up range (or an object enters the wake-up range), the radar monitoring module can acquire corresponding radar data, and the number of data points of the radar data is increased as the user approaches the robot.
As shown in fig. 2c, a circle represents the robot and a cone represents its orientation. When a user enters the wake-up range and meets the wake-up condition, that is, the user approaches the robot until the distance between them is less than the second preset distance, this indicates that the user wants to interact with the robot, and the robot is woken. At this point the radar monitoring module acquires still more radar data.
Step S102: if the radar data meet the pre-wake-up condition, acquiring first radar data within the preset angle on both sides of the front the robot faces and within a first preset distance from the robot.
In a specific application, when the radar data meet the pre-wake-up condition, a user may have entered the wake-up range; first radar data within the preset angle on both sides of the front the robot faces and within the first preset distance from the robot are therefore acquired, and the data points of the first radar data are counted.
It should be noted that the preset angle may be set according to practical applications, and for example, the preset angle is 30 degrees. The first preset distance may also be set according to practical applications, and for example, the first preset distance is 1.5 m.
In a specific application, if the radar data does not meet the pre-wake-up condition, the radar data continues to be collected.
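The acquisition of the first radar data described above can be sketched as a simple sector filter. This is an illustrative sketch, not the patent's implementation: the `(distance, angle)` point format and the function name are assumptions, and the 1.5 m and 30-degree defaults are the example values from the text.

```python
def points_in_wake_range(points, max_dist=1.5, half_angle_deg=30.0):
    """Keep only the radar returns inside the wake-up range: a sector
    bounded by the preset angle on each side of straight ahead and by
    the first preset distance (the text's example values are 30 degrees
    and 1.5 m).

    `points` is a list of (distance_m, angle_deg) pairs, with angle 0
    meaning straight ahead of the robot; this representation is an
    assumption for illustration, not the patent's data format.
    """
    return [(d, a) for d, a in points
            if d <= max_dist and abs(a) <= half_angle_deg]

# Two returns fall inside the sector; one is too far, one too far off-axis.
returns = [(1.2, 10.0), (1.4, -25.0), (2.0, 5.0), (1.0, 45.0)]
first_radar_data = points_in_wake_range(returns)
print(first_radar_data)  # -> [(1.2, 10.0), (1.4, -25.0)]
```

Counting `len(first_radar_data)` then gives the data-point number compared against the thresholds in the following steps.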
Step S103: and judging whether the user enters an awakening range or not according to the first radar data.
In a specific application, judging whether the user has entered the wake-up range according to the first radar data specifically means counting the number of data points of the first radar data and comparing it against a second number threshold; if the number of data points meets the second number threshold, the user has entered the wake-up range.
Step S104: if the user enters the wake-up range, acquiring second radar data within the preset angle on both sides of the front the robot faces and within a second preset distance from the robot; the second preset distance is smaller than the first preset distance.
In a specific application, once the user has entered the wake-up range, the radar data meet the wake-up condition if the user continues to approach the robot until the distance between them is equal to or less than the second preset distance. Second radar data within the second preset distance and the preset angle on both sides of the front the robot faces are therefore acquired, and whether the user keeps approaching after entering the wake-up range is judged from these data.
In a specific application, the user approaching the robot means that the user continues to move toward the robot within the wake-up range, that is, approaches the robot from the front.
It should be noted that the preset angle may be set according to practical applications; for example, it is 30 degrees. The second preset distance may also be set according to practical applications; for example, the second preset distance is 0.8 m.
Step S105: and judging whether the awakening condition is met or not according to the second radar data.
In a specific application, judging whether the wake-up condition is met according to the second radar data specifically means counting the number of data points of the second radar data and comparing it against a third number threshold; if the number of data points meets the third number threshold, the wake-up condition is met.
Step S106: and if the awakening condition is met, awakening the robot.
In specific application, when the awakening condition is met, the robot is awakened through the awakening module, so that the robot can effectively interact with a user.
In one embodiment, the robot wake-up method further includes, after step S106, the following steps:
step S107: and performing man-machine interaction with a user through a man-machine interaction module.
In a specific application, after waking up the robot, a user can perform human-computer interaction with the robot through the human-computer interaction module, such as voice conversation, instruction control, and the like, which is not described herein again.
According to the robot wake-up method provided by this embodiment, radar data are acquired over multiple distance bands, the acquired radar data are analysed to determine whether a user is approaching the robot and therefore wants to wake it, and the robot is actively woken when the user comes close. Using multiple segments of radar data improves the wake-up recognition accuracy and effectively solves the problem that existing robots have a low wake-up accuracy in noisy environments.
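The three checks of steps S101 to S106 can be condensed into a single decision function. This is a sketch under stated assumptions: the function name, the state labels, and the idea of passing in pre-counted point totals are all illustrative, and the default thresholds 2, 4, and 4 are simply the example values given later in embodiments two to four.

```python
def robot_wake_step(delta_points, first_count, second_count,
                    first_threshold=2, second_threshold=4,
                    third_threshold=4):
    """One pass through the method's three checks.

    delta_points : increase in radar data points since the last scan
        (pre-wake-up condition, step S101).
    first_count  : data points within the first preset distance and the
        preset angle (step S103).
    second_count : data points within the second preset distance and the
        preset angle (step S105).

    Returns one of "idle", "pre-wake", "in-range", "wake".
    """
    if delta_points < first_threshold:
        return "idle"       # pre-wake-up condition not met; keep collecting
    if first_count < second_threshold:
        return "pre-wake"   # user has not actually entered the wake-up range
    if second_count < third_threshold:
        return "in-range"   # in range but not yet close enough
    return "wake"           # wake-up condition met: wake the robot

print(robot_wake_step(0, 0, 0))  # -> idle
print(robot_wake_step(3, 5, 1))  # -> in-range
print(robot_wake_step(3, 5, 6))  # -> wake
```

Only the final "wake" state triggers the wake-up module; every other state leaves the robot asleep and the loop collecting radar data.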
Example two:
as shown in fig. 3, in the present embodiment, the step S101 in the first embodiment specifically includes:
step S201: and radar data are collected in real time through a radar monitoring module.
In specific application, reflected radar data are collected in real time through a radar monitoring module arranged in a robot body.
Step S202: monitoring a change in the number of data points of the radar data.
Step S203: determining whether the radar data meet the pre-wake-up condition according to the change in the number of data points of the radar data; if the number of data points of the radar data increases by the first number threshold, determining that the radar data meet the pre-wake-up condition.
In a specific application, the change in the number of data points of the radar data is analysed, and when the number of data points increases by the first number threshold, the radar data are determined to meet the pre-wake-up condition. Such an increase in reflected data points indicates that a user may have entered the wake-up range; it is then necessary to further confirm whether the user has actually entered it, so as to avoid false wake-ups. The first number threshold may be set according to actual requirements; for example, it is 2.
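The pre-wake-up monitoring of steps S201 to S203 amounts to watching the scan-to-scan change in the data-point count. A minimal sketch, assuming each scan reports a simple point count; the class and method names are hypothetical, and the default threshold 2 is the example value from the text.

```python
class PreWakeMonitor:
    """Track the change in radar data-point count between scans.

    The pre-wake-up condition is met when the number of reflected data
    points increases by at least `first_threshold` (example value: 2).
    The real radar monitoring module is hardware-specific; this class
    only illustrates the threshold logic.
    """

    def __init__(self, first_threshold=2):
        self.first_threshold = first_threshold
        self.last_count = 0

    def update(self, point_count):
        """Feed the data-point count of the latest scan; return True
        when the pre-wake-up condition is satisfied."""
        increase = point_count - self.last_count
        self.last_count = point_count
        return increase >= self.first_threshold

monitor = PreWakeMonitor()
print(monitor.update(0))  # empty scene, no increase -> False
print(monitor.update(3))  # count jumped by 3 >= 2   -> True
```

When `update` returns True, the method proceeds to acquire the first radar data and confirm whether the user really entered the wake-up range.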
Example three:
as shown in fig. 4, in the present embodiment, the step S103 in the first embodiment specifically includes:
step S301: first radar data is acquired by a radar monitoring module.
In specific application, when a pre-wake-up condition is met, first radar data in a wake-up range are acquired through a radar monitoring module.
Step S302: and counting the number of data points of the first radar data.
Step S303: and if the number of data points of the first radar data meets a second number threshold, determining that the user enters an awakening range.
In a specific application, the number of data points of the first radar data in the wake-up range is counted. And judging whether the data point quantity meets a second quantity threshold value or not according to the data point quantity of the first radar data, and if the data point quantity of the first radar data meets the second quantity threshold value, determining that the user enters an awakening range. The second quantity threshold may be set according to actual demand, and is set to 4 for example.
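Steps S301 to S303 reduce to a count-and-compare. A one-line sketch with a hypothetical function name; the default threshold 4 is the example value above, and `first_radar_data` is assumed to already contain only points inside the wake-up range.

```python
def user_in_wake_range(first_radar_data, second_threshold=4):
    """The user is considered to have entered the wake-up range when the
    number of data points of the first radar data meets the second
    number threshold (example value: 4)."""
    return len(first_radar_data) >= second_threshold

print(user_in_wake_range([(1.2, 10.0), (1.3, -5.0)]))  # only 2 points -> False
print(user_in_wake_range([(1.2, 10.0), (1.3, -5.0),
                          (1.1, 0.0), (0.9, 15.0)]))   # 4 points -> True
```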
Example four:
as shown in fig. 5, in the present embodiment, the step S105 in the first embodiment specifically includes:
step S401: and acquiring second radar data through the radar monitoring module.
In specific application, second radar data which are within a second preset distance range and have preset angles with two sides of the front side where the robot faces are obtained through the radar monitoring module.
Step S402: and counting the number of data points of the second radar data.
Step S403: and if the number of data points of the second radar data meets a third number threshold, determining that a wake-up condition is met.
In a specific application, the number of data points of the second radar data, i.e., the points within the second preset distance and the preset angle on both sides of the front the robot faces, is counted and compared against the third number threshold; if the number of data points of the second radar data meets the third number threshold, the wake-up condition is determined to be met. The third number threshold may be set according to actual requirements; for example, it is set to 4.
It should be noted that even when the number of data points of the first radar data does not meet the second number threshold, whether the number of data points of the second radar data meets the third number threshold continues to be determined, and the wake-up condition is considered satisfied, and the robot woken, only when the third number threshold is met. This effectively avoids the situation where a user approaches the robot but the robot is not woken, improving the wake-up accuracy.
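The wake-up check of steps S401 to S403 can be sketched as counting the second radar data points that fall inside the near sector. The `(distance, angle)` format and the function name are illustrative assumptions; 0.8 m, 30 degrees, and the threshold 4 are the example values from the text.

```python
def wake_condition_met(second_radar_data, third_threshold=4,
                       second_dist=0.8, half_angle_deg=30.0):
    """Count the data points within the second preset distance (example
    0.8 m) and the preset angle (example 30 degrees each side of
    straight ahead), and compare against the third number threshold
    (example 4)."""
    count = sum(1 for d, a in second_radar_data
                if d <= second_dist and abs(a) <= half_angle_deg)
    return count >= third_threshold

# Four returns close in front of the robot satisfy the wake-up condition.
close_points = [(0.5, 5.0), (0.6, -10.0), (0.7, 0.0), (0.4, 20.0)]
print(wake_condition_met(close_points))  # -> True
```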
Example five:
as shown in fig. 6, the present embodiment provides a robot wake-up system 100 for executing the method steps in the first embodiment, which includes a monitoring module 101, a first obtaining module 102, a first determining module 103, a second obtaining module 104, a second determining module 105, and a wake-up module 106.
The monitoring module 101 is configured to collect radar data in real time and monitor whether the radar data meets a pre-wake-up condition.
The first obtaining module 102 is configured to obtain first radar data within the preset angle on both sides of the front the robot faces and within a first preset distance from the robot if the radar data meet the pre-wake-up condition.
The first judging module 103 is configured to judge whether the user has entered the wake-up range according to the first radar data.
The second obtaining module 104 is configured to obtain second radar data within the preset angle on both sides of the front the robot faces and within a second preset distance from the robot if the user has entered the wake-up range; the second preset distance is smaller than the first preset distance.
The second determining module 105 is configured to determine whether the wake-up condition is satisfied according to the second radar data.
The wake-up module 106 is configured to wake up the robot if the wake-up condition is satisfied.
In one embodiment, the robot wake-up system further comprises an interaction module.
The interaction module is used for performing man-machine interaction with a user through the man-machine interaction module.
It should be noted that, since the robot wake-up system provided in the embodiment of the present invention is based on the same concept as the method embodiment shown in fig. 1 of the present invention, the technical effect thereof is the same as the method embodiment shown in fig. 1 of the present invention, and specific contents may refer to the description in the method embodiment shown in fig. 1 of the present invention, and are not described herein again.
The robot wake-up system provided by this embodiment therefore likewise improves the wake-up recognition accuracy through multiple segments of radar data.
Example six:
as shown in fig. 7, in this embodiment, the monitoring module 101 in the fifth embodiment includes a structure for executing the method steps in the embodiment corresponding to fig. 3, and includes an acquisition unit 201, a monitoring unit 202, and a determination unit 203.
The acquisition unit 201 is used for acquiring radar data in real time through a radar monitoring module.
The monitoring unit 202 is configured to monitor a change in the number of data points of the radar data.
The judging unit 203 is configured to determine whether the radar data meet the pre-wake-up condition according to the change in the number of data points of the radar data, and to determine that the radar data meet the pre-wake-up condition if the number of data points increases by the first number threshold.
Example seven:
as shown in fig. 8, in this embodiment, the first determining module 103 in the fifth embodiment includes a structure for executing the method steps in the embodiment corresponding to fig. 4, and includes a first obtaining unit 301, a first counting unit 302, and a first determining unit 303.
The first obtaining unit 301 is configured to obtain first radar data through a radar monitoring module.
The first statistical unit 302 is configured to count a number of data points of the first radar data.
The first determining unit 303 is configured to determine that the user enters an awake range if the number of data points of the first radar data satisfies a second number threshold.
Example eight:
as shown in fig. 9, in this embodiment, the second determining module 105 in the fifth embodiment includes a structure for executing the method steps in the embodiment corresponding to fig. 3, and includes a second obtaining unit 401, a second counting unit 402, and a second determining unit 403.
The second obtaining unit 401 is configured to obtain second radar data through the radar monitoring module.
The second statistical unit 402 is configured to count the number of data points of the second radar data.
The second determining unit 403 is configured to determine that a wake-up condition is satisfied if the number of data points of the second radar data satisfies a third number threshold.
Example nine:
fig. 10 is a schematic view of a robot according to the ninth embodiment of the present invention. As shown in fig. 10, the robot 9 of this embodiment includes: a processor 90, a memory 91, and a computer program 92 stored in the memory 91 and executable on the processor 90. When executing the computer program 92, the processor 90 implements the steps of the robot wake-up method embodiments described above, such as steps S101 to S106 shown in fig. 1. Alternatively, when executing the computer program 92, the processor 90 implements the functions of the modules/units in the above system embodiments, such as the functions of modules 101 to 106 shown in fig. 6.
Illustratively, the computer program 92 may be partitioned into one or more modules/units, which are stored in the memory 91 and executed by the processor 90 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, the instruction segments describing the execution of the computer program 92 in the robot 9. For example, the computer program 92 may be divided into a monitoring module, a first acquisition module, a first judging module, a second acquisition module, a second judging module, and a wake-up module, whose specific functions are as follows:
the monitoring module is used for collecting radar data in real time and monitoring whether the radar data meet a pre-wake-up condition;
the first acquisition module is used for acquiring first radar data within a preset angle on both sides of the front the robot faces and within a first preset distance from the robot if the radar data meet the pre-wake-up condition;
the first judging module is used for judging whether a user has entered a wake-up range according to the first radar data;
the second acquisition module is used for acquiring second radar data within the preset angle on both sides of the front the robot faces and within a second preset distance from the robot if the user has entered the wake-up range, the second preset distance being smaller than the first preset distance;
the second judging module is used for judging whether a wake-up condition is met according to the second radar data;
and the wake-up module is used for waking up the robot if the wake-up condition is met.
The robot 9 may be a desktop computer, a notebook, a palm computer, a cloud server, or another computing device. The robot may include, but is not limited to, the processor 90 and the memory 91. Those skilled in the art will appreciate that fig. 10 is merely an example of the robot 9 and does not constitute a limitation; the robot may include more or fewer components than shown, a combination of some components, or different components, e.g., the robot may also include input/output devices, network access devices, buses, etc.
The processor 90 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 91 may be an internal storage unit of the robot 9, such as a hard disk or a memory of the robot 9. The memory 91 may also be an external storage device of the robot 9, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), or the like provided on the robot 9. Further, the memory 91 may also include both an internal storage unit and an external storage device of the robot 9. The memory 91 is used for storing the computer program and other programs and data required by the robot. The memory 91 may also be used to temporarily store data that has been output or is to be output.
It will be clear to those skilled in the art that the foregoing division into functional units and modules is merely illustrative, for convenience and simplicity of description. In practical applications, the above functions may be allocated to different functional units and modules as needed; that is, the internal structure of the system may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from one another and are not intended to limit the protection scope of the present application. For the specific working process of the units and modules in the above system, reference may be made to the corresponding process in the foregoing method embodiments, and details are not repeated here.
Each of the above embodiments is described with its own emphasis; for parts that are not detailed or recited in a certain embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the exemplary units and algorithm steps described in connection with the embodiments disclosed herein may be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed system/robot and method may be implemented in other ways. For example, the system/robot embodiments described above are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, systems or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the method according to the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or system capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, the computer-readable medium does not include electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
The above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and are intended to be included within the protection scope of the present invention.

Claims (10)

1. A robot wake-up method, comprising:
collecting radar data in real time, and monitoring whether the radar data meets a pre-awakening condition;
if the radar data meet the pre-awakening condition, acquiring first radar data within a range defined by a preset angle on each side of the direction right ahead of the robot and a first preset distance from the robot;
judging whether the user enters an awakening range or not according to the first radar data;
if the user enters the awakening range, acquiring second radar data within a range defined by a preset angle on each side of the direction right ahead of the robot and a second preset distance from the robot, wherein the second preset distance is smaller than the first preset distance;
judging whether a wake-up condition is met according to the second radar data;
and if the awakening condition is met, awakening the robot.
2. The robot wake-up method according to claim 1, further comprising, after waking up the robot if the wake-up condition is satisfied:
and performing man-machine interaction with a user through a man-machine interaction module.
3. A robot wakeup method according to claim 1, wherein the collecting radar data in real time and monitoring whether the radar data meets a pre-wakeup condition includes:
collecting radar data in real time through a radar monitoring module;
monitoring the change of the data point quantity of the radar data;
determining whether the radar data meet the pre-awakening condition according to the change of the number of data points of the radar data, wherein if the number of data points of the radar data increases to a first quantity threshold, it is determined that the radar data meet the pre-awakening condition.
4. The robot wakeup method according to claim 1, wherein determining whether a user enters a wakeup range according to the first radar data includes:
acquiring first radar data through a radar monitoring module;
counting the number of data points of the first radar data;
and if the number of data points of the first radar data reaches a second quantity threshold, determining that the user enters the awakening range.
5. A robot wake-up method according to claim 1, wherein determining whether a wake-up condition is met based on the second radar data comprises:
acquiring second radar data through a radar monitoring module;
counting the number of data points of the second radar data;
and if the number of data points of the second radar data reaches a third quantity threshold, determining that the wake-up condition is met.
6. A robotic wake-up system, comprising:
the monitoring module is used for collecting radar data in real time and monitoring whether the radar data meet a pre-awakening condition;
the first acquisition module is used for acquiring, if the radar data meet the pre-awakening condition, first radar data within a range defined by a preset angle on each side of the direction right ahead of the robot and a first preset distance from the robot;
the first judgment module is used for judging whether the user enters an awakening range or not according to the first radar data;
the second acquisition module is used for acquiring, if the user enters the awakening range, second radar data within a range defined by a preset angle on each side of the direction right ahead of the robot and a second preset distance from the robot, the second preset distance being smaller than the first preset distance;
the second judgment module is used for judging whether the awakening condition is met or not according to the second radar data;
and the awakening module is used for awakening the robot if the awakening condition is met.
7. A robotic wake-up system according to claim 6, characterized in that the monitoring module comprises:
the acquisition unit is used for acquiring radar data in real time through the radar monitoring module;
the monitoring unit is used for monitoring the change condition of the data point quantity of the radar data;
the judging unit is used for determining whether the radar data meet the pre-awakening condition according to the change of the number of data points of the radar data, wherein if the number of data points of the radar data increases to a first quantity threshold, it is determined that the radar data meet the pre-awakening condition.
8. A robotic wake-up system according to claim 6, characterized in that the first decision module comprises:
the first acquisition unit is used for acquiring first radar data through the radar monitoring module;
a first statistical unit for counting the number of data points of the first radar data;
a first determining unit, configured to determine that the user enters the awakening range if the number of data points of the first radar data reaches a second quantity threshold.
9. A robot comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the steps of the method according to any of the claims 1 to 5 are implemented when the computer program is executed by the processor.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 5.
CN201811608993.5A 2018-12-27 2018-12-27 Robot awakening method and system and robot Active CN111376244B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811608993.5A CN111376244B (en) 2018-12-27 2018-12-27 Robot awakening method and system and robot


Publications (2)

Publication Number Publication Date
CN111376244A true CN111376244A (en) 2020-07-07
CN111376244B CN111376244B (en) 2021-10-29

Family

ID=71214628

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811608993.5A Active CN111376244B (en) 2018-12-27 2018-12-27 Robot awakening method and system and robot

Country Status (1)

Country Link
CN (1) CN111376244B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007147174A1 (en) * 2006-06-14 2007-12-21 Robonica (Pty) Ltd Targeting system for a robot gaming environment
US20110185370A1 (en) * 2007-04-30 2011-07-28 Eliezer Tamir Method and System for Configuring a Plurality of Network Interfaces That Share a Physical Interface
CN102741705A (en) * 2009-12-04 2012-10-17 Tp视觉控股有限公司 Method and apparatus for controlling the status of a device
CN103760964A (en) * 2014-01-02 2014-04-30 深圳宝龙达信息技术股份有限公司 Method and device for adjusting and controlling equipment sleep through infrared induction
CN104298945A (en) * 2013-07-15 2015-01-21 联想(北京)有限公司 Information processing method and electronic equipment
CN106886766A (en) * 2017-02-23 2017-06-23 维沃移动通信有限公司 A kind of fingerprint identification method, fingerprint recognition circuit and mobile terminal
CN107168539A (en) * 2017-06-27 2017-09-15 乐视致新电子科技(天津)有限公司 A kind of equipment awakening method, device and electronic equipment
CN107251122A (en) * 2015-02-17 2017-10-13 罗伯特·博世有限公司 Method and sensor device for running sensor device
CN107801125A (en) * 2017-12-04 2018-03-13 深圳市易探科技有限公司 A kind of intelligent sound box control system with microwave radar sensing
CN108972592A (en) * 2018-08-09 2018-12-11 北京云迹科技有限公司 Intelligent awakening method and device for robot


Also Published As

Publication number Publication date
CN111376244B (en) 2021-10-29


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant