CN112518741B - Robot control method, device, robot and storage medium - Google Patents


Info

Publication number
CN112518741B
CN112518741B (application CN202011230187.6A)
Authority
CN
China
Prior art keywords
robot
local area
user
information
use request
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011230187.6A
Other languages
Chinese (zh)
Other versions
CN112518741A (en)
Inventor
顾震江
梁朋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Uditech Co Ltd
Original Assignee
Uditech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Uditech Co Ltd filed Critical Uditech Co Ltd
Priority to CN202011230187.6A priority Critical patent/CN112518741B/en
Publication of CN112518741A publication Critical patent/CN112518741A/en
Application granted granted Critical
Publication of CN112518741B publication Critical patent/CN112518741B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The application pertains to the field of robot control and provides a robot control method, a device, a robot, and a storage medium. The robot control method comprises the following steps: upon receiving a use request requesting use within a local area, verifying whether the use request satisfies an authorization condition; and if the use request satisfies the authorization condition, determining that the robot is to be used in the local area. Embodiments of the application can reduce the risk of a user's privacy being revealed while using the robot, thereby improving privacy security.

Description

Robot control method, device, robot and storage medium
Technical Field
The present application relates to the field of robot control, and in particular, to a robot control method, apparatus, robot, and storage medium.
Background
Because of portability, cost, and other constraints, the hardware of mobile terminals such as mobile phones and notebook computers is currently limited and struggles to meet users' needs. An existing robot, by contrast, can be equipped with a large display screen and high-quality sound equipment, so using the robot to provide services can satisfy user needs better. However, existing methods of controlling a robot to provide services to a user easily compromise the user's privacy, so privacy security is low.
Disclosure of Invention
Embodiments of the present application provide a robot control method and apparatus, a robot, and a storage medium, which can solve the problem that controlling a robot to provide services to users easily affects the users' privacy security.
In a first aspect, an embodiment of the present application provides a robot control method, including:
upon receiving a usage request requesting use within a local area, verifying whether the usage request satisfies an authorization condition;
and if the use request satisfies the authorization condition, determining that the robot is to be used in the local area.
In a possible implementation manner of the first aspect, the verifying, when a usage request requesting usage in a local area is received, whether the usage request satisfies an authorization condition includes: when a use request requesting to be used in a local area is received, analyzing the use request to obtain application terminal information for sending the use request; and verifying whether the use request meets an authorization condition or not according to the application terminal information.
In a possible implementation manner of the first aspect, the application end information includes configuration information of a sending terminal and user information of the sending terminal; the verifying whether the use request meets the authorization condition according to the application terminal information comprises: if the configuration information of the sending terminal and/or the user information of the sending terminal comprise authorization information, the use request meets authorization conditions; otherwise, the use request does not meet the authorization condition.
In one possible implementation manner of the first aspect, the determining that the robot is used in a local area includes: acquiring a global map of the robot, wherein the global map is a global map of a space area where the robot is located, and the global map comprises a local map; determining the local map according to the use request in the global map; and taking the local area corresponding to the local map as the use range of the robot.
In a possible implementation manner of the first aspect, the taking the local area corresponding to the local map as the usage area of the robot includes: and planning the local map by arranging a virtual wall in the global map, and taking a local area corresponding to the local map as the use range of the robot.
In a possible implementation manner of the first aspect, after the determining that the robot is to be used in the local area if the use request satisfies the authorization condition, the method further includes: acquiring data to be played; and playing the data to be played in the local area based on the robot.
In a possible implementation manner of the first aspect, the playing the data to be played in the local area based on the robot includes: identifying facial features of a user providing the data to be played; and according to the facial features, adjusting the pose of the robot and/or adjusting the playing parameters of the data to be played in the local area so as to play the data.
A robot control apparatus provided in a second aspect of an embodiment of the present application includes:
a verification unit configured to verify, when a usage request requesting usage in a local area is received, whether the usage request satisfies an authorization condition;
a determination unit, configured to determine that the robot is used in a local area if the usage request satisfies an authorization condition.
A third aspect of the embodiments of the present application provides a robot, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the above method when executing the computer program.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the steps of the above method.
A fifth aspect of embodiments of the present application provides a computer program product, which when run on a robot causes the robot to perform the steps of the method.
It is understood that, for the beneficial effects of the second to fifth aspects, reference may be made to the related description of the first aspect, which is not repeated here.
According to embodiments of the application, when a use request requesting use in a local area is received, it is verified whether the use request satisfies an authorization condition. If the use request satisfies the authorization condition, the robot is determined to be used in the local area. Thus, before the robot provides a service to a user in the local area, it can be verified whether the robot may be used there. On one hand, this prevents the robot from leaving the local area while serving the user, which could expose the service content to other users; on the other hand, it prevents the robot from entering an area that does not satisfy the authorization condition, which would affect the privacy security of other users. Embodiments of the application therefore reduce the risk of privacy leakage and improve privacy security.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart of an implementation of a robot control method provided in an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating an implementation flow of verifying whether an authorization condition is satisfied according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of an implementation of determining a usage range according to an embodiment of the present application;
fig. 4 is a schematic view of an implementation flow for playing data to be played according to an embodiment of the present application;
fig. 5 is a schematic flow chart illustrating an implementation of playing data according to facial features according to an embodiment of the present application;
fig. 6 is a schematic flow chart illustrating implementation of playing data at a reminding time according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a robot control device according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of a robot provided in an embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit it. All other embodiments that a person skilled in the art can derive from the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
In order to explain the technical means of the present application, the following description will be given by way of specific examples.
Fig. 1 shows a schematic flow chart of an implementation of a robot control method provided in an embodiment of the present application, where the method may be applied to a robot.
Specifically, the robot control method may include the following steps S101 to S102.
Step S101, when receiving a use request requesting use in a local area, verifying whether the use request satisfies an authorization condition.
In an embodiment of the present application, an administrator may deploy the robot in a certain spatial area to provide services to users in that area; for example, a hotel administrator deploys one or more robots in a hotel, and each robot may move inside the hotel building. A use request is a request, sent when a user needs a service provided by the robot, instructing that the robot be used in a local area.
The user can select the mode of sending the use request according to the actual situation, and the robot can also receive the use request sent by the user in different modes.
In some embodiments of the present application, a user may establish a connection between a mobile terminal and the robot based on a short-range communication technology such as Bluetooth or Wi-Fi, and operate the mobile terminal to send a use request to the robot. For example, the user may connect a smartphone to the Wi-Fi of the current spatial area and download application software associated with the robot on the smartphone. The user then operates the application software to trigger a use request, which is sent to the robot through the smartphone.
In other embodiments of the present application, the user may also directly operate on the display screen of the robot to generate the usage request.
The local area is at least a part of the spatial area where the robot is located. In practical applications, owing to factors such as user privacy and ease of robot maintenance, the robot does not have access rights to every part of the spatial area. For example, in a hotel, a robot typically cannot enter a user's room without permission.
Therefore, in the embodiment of the present application, before the robot provides the service, it is necessary to verify whether the usage request satisfies the authorization condition. The authorization condition is used for judging whether the robot can be used by a user in a local area.
And step S102, if the use request meets the authorization condition, determining the robot to be used in the local area.
In the embodiment of the application, if the use request satisfies the authorization condition, the robot has the right to move in the local area; it can therefore be determined that the robot is to be used in the local area, that is, the robot can provide services to the user there.
In some embodiments of the application, if the usage request does not satisfy the authorization condition, it indicates that the robot does not have the right to move in the local area, and at this time, the robot may send a prompt message to remind the user that the authorization condition is not currently satisfied. For example, the prompt information may be displayed in the form of characters on the display screen of the robot, or the prompt information may be fed back to the mobile terminal that sent the request for use.
To better illustrate the effect of verifying the authorization condition, take as an example a hotel as the spatial area where the robot is located, with six different areas: room A, room B, room C, room D, a gym, and a hallway. When no use request has been received, the robot, out of consideration for user privacy, can only move around and provide services in public areas such as the hallway or the gym, and cannot be used in a room. When the robot receives a use request requesting use in room D, and the use request satisfies the authorization condition, the robot can determine that it is to be used in room D.
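The two steps above (S101 and S102) can be sketched in code roughly as follows. This is a minimal illustration only; all names (`Robot`, `authorized_areas`, `local_area`, the string return values) are hypothetical and not taken from the patent.

```python
# Hypothetical sketch of steps S101-S102: verify a use request against
# an authorization condition, then confine the robot to the local area.

class Robot:
    def __init__(self, authorized_areas):
        # Areas the requester has been authorized for, e.g. {"room_d"}
        self.authorized_areas = authorized_areas
        self.active_area = None  # None => public areas only

    def satisfies_authorization(self, request):
        """Step S101: verify the use request against the authorization condition."""
        return request["local_area"] in self.authorized_areas

    def handle_use_request(self, request):
        """Step S102: determine the robot to be used in the local area if authorized."""
        if self.satisfies_authorization(request):
            self.active_area = request["local_area"]
            return "accepted"
        # Otherwise, send prompt information (e.g. on the display screen)
        return "prompt: authorization condition not satisfied"

robot = Robot(authorized_areas={"room_d"})
assert robot.handle_use_request({"local_area": "room_d"}) == "accepted"
assert robot.handle_use_request({"local_area": "room_a"}) != "accepted"
```

In the hotel example above, only a request for room D would pass the check; a request for any other room would trigger the prompt branch.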
In practical applications, the local area is not necessarily matched with a pre-divided real space area. The size and position of the local region may be determined according to a specific acquisition mode of the local region.
In some embodiments of the present application, if a space region in which the robot is located is pre-divided into a plurality of sub-regions, and each sub-region is marked with corresponding region information such as an identification number or a name, the robot may determine the region information provided by the user or a region in which the robot executes a task instruction as the local region. In this case, the local area is the same as the real space area divided in advance.
In other embodiments of the present application, a user may perform a circling operation on a global map of a spatial area where the robot is located, and the robot may confirm an area pointed by the circling operation as a local area. In this case, the local area may be different from the real space area divided in advance.
In an embodiment of the application, when the robot is determined to be used in the local area, the robot needs to go to the local area and move in the local area for the user to use.
According to the embodiment of the application, when a use request requesting use in a local area is received, it is verified whether the use request satisfies an authorization condition. If the use request satisfies the authorization condition, the robot is determined to be used in the local area. Thus, before the robot provides a service to a user in the local area, it can be verified whether the robot may be used there, which on one hand prevents the robot from leaving the local area while serving the user and exposing the service content to other users, and on the other hand prevents the robot from entering an area that does not satisfy the authorization condition and affecting the privacy security of other users. The embodiment of the application reduces the risk of privacy leakage and improves privacy security.
In practical applications, for convenience, the user often sends the use request to the robot through a terminal such as a mobile phone or a laptop computer. In a hotel or KTV scenario, for example, a user may prefer to send the use request directly from a mobile terminal in a room or a private box to instruct the robot to provide a service.
At this time, as shown in fig. 2, in some embodiments of the present application, verifying whether the usage request satisfies the authorization condition when the usage request requesting the usage in the local area is received may include the following steps S201 to S202.
Step S201, when receiving a request for use in a local area, parsing the request for use to obtain information of an application end sending the request for use.
That is, in some embodiments of the present application, when a user sends a use request to the robot through another terminal, the use request carries application end information for verifying whether an authorization condition is satisfied. The robot can acquire the application terminal information carried in the use request by analyzing the use request.
Step S202, according to the application terminal information, whether the use request meets the authorization condition is verified.
The specific content of the application terminal information can be selected according to the actual situation, and correspondingly, the verification mode of the authorization condition can be adjusted according to the specific content of the application terminal information.
For example, in some embodiments of the present application, the application side information may include configuration information of the sending terminal and user information of the sending terminal.
The sending terminal is the terminal that sends the use request to the robot. Verifying whether the use request satisfies the authorization condition according to the application-end information may include: if the configuration information of the sending terminal and/or the user information of the sending terminal includes authorization information, the use request satisfies the authorization condition; otherwise, the use request does not satisfy the authorization condition.
That is to say, in some embodiments of the present application, when a user sends a usage request, authorization information is carried in the usage request and sent to the robot, and when the robot parses the authorization information, the robot confirms that the usage request satisfies authorization conditions. The authorization information may be included in the configuration information of the sending terminal, or may be included in the user information of the sending terminal, and may be specifically determined according to an obtaining manner of the authorization information.
In some embodiments of the application, the user may obtain authorization by using configuration information such as a device identifier, a device model, or network access configuration information of the transmitting terminal, and when the user transmits a use request through the mobile terminal, the configuration information of the transmitting terminal is transmitted to the robot. The robot can confirm that the use request satisfies the authorization condition by analyzing the authorization information included in the configuration information of the transmitting terminal.
In other embodiments of the application, the user may also obtain authorization by using user information such as a mobile phone number, an identification number, a user identification number, or a face image of the user, and when the user sends the use request through the mobile terminal, the user information is sent to the robot. The robot can confirm that the use request satisfies the authorization condition when analyzing the authorization information included in the user information.
When the configuration information of the sending terminal and/or the user information of the sending terminal matches an entry in the robot's authorization information record table, the configuration information and/or user information is considered to include authorization information, and the use request can be confirmed to satisfy the authorization condition. The contents of the authorization information record table can be entered through pre-registration; for example, when a hotel guest checks in, the guest's user information is recorded, and this user information is later used to verify that a use request sent by the guest satisfies the authorization condition. If neither the configuration information of the sending terminal nor the user information includes authorization information, the user has not obtained authorization for the robot to be used in the local area, and it can be confirmed that the use request does not satisfy the authorization condition.
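The record-table lookup described above might look like the following sketch. The table structure and the field names (`device_id`, `phone_number`) are illustrative assumptions; the patent does not prescribe concrete fields.

```python
# Illustrative sketch of step S202: match the sending terminal's
# configuration info and/or user info against a pre-registered
# authorization information record table (e.g. filled in at check-in).

AUTHORIZATION_RECORDS = {
    "device_ids": {"AB-1234"},          # hypothetical registered terminal
    "phone_numbers": {"13800000000"},   # hypothetical registered guest
}

def satisfies_authorization(config_info, user_info):
    """The condition is met if the configuration information AND/OR the
    user information appears in the record table."""
    return (
        config_info.get("device_id") in AUTHORIZATION_RECORDS["device_ids"]
        or user_info.get("phone_number") in AUTHORIZATION_RECORDS["phone_numbers"]
    )

assert satisfies_authorization({"device_id": "AB-1234"}, {})      # by config info
assert satisfies_authorization({}, {"phone_number": "13800000000"})  # by user info
assert not satisfies_authorization({}, {})                        # unregistered
```

Either source of authorization information is sufficient on its own, matching the "and/or" wording of the implementation above.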
In the embodiment of the application, the application-end information of the sender is obtained by parsing the use request; if the configuration information of the sending terminal and/or the user information of the sending terminal in the application-end information includes authorization information, the use request satisfies the authorization condition; otherwise, it does not. That is, when the user has obtained authorization through the configuration information and/or user information of the sending terminal, and that authorization information is carried in the use request sent to the robot, the robot can be determined to be used in the local area. The robot therefore does not execute instructions from unauthorized users and cannot enter a local area arbitrarily, which improves privacy security.
In some embodiments of the present application, in order to provide a service to a user in a local area, the robot needs to determine a usage range of the robot.
Specifically, as shown in fig. 3, the determination of the robot as being used in the local area may include the following steps S301 to S303.
Step S301, a global map of the robot is obtained.
The global map is a global map of a space area where the robot is located. The space area where the robot is located is an area where the robot moves, and may be different space areas such as hotels and KTVs. The size of the spatial region can be set by the skilled person according to the actual requirements. For example, it may refer to a floor or a whole floor in a hotel, etc.
In some embodiments of the present application, when the robot is deployed in an environment that is known and has a fixed layout, a technician may store the global map in the robot's storage space in advance. For example, in an environment such as a hotel, the global map is generally known and the layout rarely changes greatly, so the global map can be stored in advance. The robot may then retrieve this stored global map each time it needs to provide a service.
In other embodiments of the present application, the robot may also send a request to a server for scheduling the robot to obtain a global map fed back by the server each time the robot needs to provide a service.
Step S302, in the global map, a local map is determined according to the use request.
In some embodiments of the present application, the global map includes a local map. The local map is the map of the area in which the robot provides the service, and the local area is described in the local map. The local map can be determined in the global map according to the use request.
Specifically, in some embodiments of the application, if a spatial region where the robot is located is pre-divided into a plurality of sub-regions, and a local map corresponding to each sub-region is divided on the global map, and each local map is marked with corresponding region information such as an identification number or a name, the robot may search the local map corresponding to the region information from the global map according to the region information provided by the user.
In other embodiments of the present application, a user may perform a circle operation on a global map on a display screen of a robot or a mobile terminal, and the robot determines a part of the map circled in the global map as a local map.
In step S303, the local area corresponding to the local map is used as the range of use of the robot.
Since the local area is recorded in the local map, in some embodiments of the present application, the local area corresponding to the local map may be used as the use range of the robot. Namely, the robot can move in a local area according to the local map and is used by a user.
In some embodiments of the present application, when the local area corresponding to the local map is used as the robot's usage range, the robot may directly replace its navigation map with the local map so that the robot moves only within the local area.
In another embodiment of the present application, the setting of the local area corresponding to the local map as the use range of the robot may further include: a virtual wall is arranged in the global map to plan a local map, and a local area corresponding to the local map is used as a use range of the robot.
It is understood that the global map may record actual boundaries in the spatial area, such as walls and doors, which the robot cannot cross when moving. The boundary of a local area, however, is not necessarily an actual boundary. Therefore, in some embodiments of the present application, a virtual wall needs to be drawn in the global map, corresponding to the boundary of the local area described by the local map. By planning the local map in the global map in this way, the robot cannot pass through the virtual wall and is used by the user only inside the local area.
In the embodiment of the application, because the virtual wall is drawn on the global map, the robot can modify the same global map each time it provides a service; when the user sends a request to stop using the robot, the robot removes the virtual wall from the global map to recover the original global map and moves according to it. This reduces the storage pressure on the robot.
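On an occupancy-grid representation, the virtual-wall idea above can be sketched as marking the boundary cells of the local area as occupied in a copy of the global map, and restoring the original map when service ends. The grid layout and the `(r0, r1, c0, c1)` bounds are illustrative assumptions, not taken from the patent.

```python
import copy

# Sketch: 0 = free cell, 1 = wall (actual or virtual).

def add_virtual_wall(global_map, bounds):
    """Draw a virtual wall around bounds = (r0, r1, c0, c1), inclusive."""
    r0, r1, c0, c1 = bounds
    walled = copy.deepcopy(global_map)   # keep the original for restoration
    for c in range(c0, c1 + 1):
        walled[r0][c] = 1                # top edge of the local area
        walled[r1][c] = 1                # bottom edge
    for r in range(r0, r1 + 1):
        walled[r][c0] = 1                # left edge
        walled[r][c1] = 1                # right edge
    return walled

grid = [[0] * 10 for _ in range(10)]     # original global map
walled = add_virtual_wall(grid, (2, 6, 2, 6))
assert walled[2][4] == 1 and walled[4][4] == 0   # wall on edge, interior free
assert grid[2][4] == 0   # original map untouched, so it can be restored later
```

Because the original grid is kept intact, removing the virtual wall when the user stops using the robot amounts to discarding the walled copy, consistent with the storage argument above.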
In the embodiment of the application, the robot acquires its global map and determines a local map in the global map according to the use request. The local area corresponding to the local map is then used as the robot's usage range. The robot can thus determine its usage range from the local map, which prevents it from leaving the local area while providing services to the user and improves privacy security.
In some embodiments of the present application, if the usage request satisfies the authorization condition, after the robot is determined to be used in the local area, the robot may provide the service for the user according to the actual situation.
Specifically, as shown in fig. 4, in some embodiments of the present application, after the robot is determined to be used in the local area, steps S401 to S402 may be included.
Step S401, acquiring data to be played.
The data to be played is data that the user needs the robot to play, and may take different forms such as video, music, or a presentation. That is, when the user needs the robot to play certain data, the user sends a use request to the robot; after the robot determines that it is to be used in the local area, the user sends the data to be played to the robot. After receiving the data, the robot can play it to serve the user.
The acquisition mode of the data to be played can be selected according to actual conditions. For example, the user may establish communication between the mobile terminal and the robot in a wireless or wired manner, and send data to be played on the mobile terminal to the robot. Or, the user may also input information of the data to be played, and the robot downloads the corresponding data to be played from the internet according to the information.
Step S402, based on the robot, playing the data to be played in the local area.
In the embodiment of the present application, since the local area is the robot's range of movement while playing data, the robot can play the data to be played within the local area using its own hardware. Compared with the mobile terminal carried by the user, the robot is generally equipped with better hardware and can provide a better audio-visual experience.
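Steps S401 and S402 can be sketched as follows: the robot plays the received data only while its position lies inside the authorized local area. The coordinate scheme and all names here are hypothetical illustrations.

```python
# Minimal sketch of steps S401-S402: acquire data to be played, then
# play it only within the local area (given here as an axis-aligned
# rectangle (x0, y0, x1, y1) for simplicity).

def inside(position, area):
    (x, y), (x0, y0, x1, y1) = position, area
    return x0 <= x <= x1 and y0 <= y <= y1

def play(data, robot_position, local_area):
    if not inside(robot_position, local_area):
        return "refuse: robot is outside the local area"
    return f"playing {data}"

assert play("video.mp4", (3, 3), (0, 0, 5, 5)) == "playing video.mp4"
assert "refuse" in play("video.mp4", (9, 9), (0, 0, 5, 5))
```

A real robot would combine this gate with the virtual-wall navigation constraint, so it never reaches a position outside the local area in the first place.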
In order to enable the robot to more intelligently serve the user, in some embodiments of the present application, as shown in fig. 5, the playing of the data to be played in the local area based on the robot may include the following steps S501 to S502.
In step S501, facial features of a user who provides data to be played are identified.
The facial features of the user are features of the user's face while listening to or watching the data being played, such as the user's facial expression, the duration for which the eyes are closed, or the number of blinks. In some embodiments of the present application, the robot may capture an image of the user's face through its camera and identify the facial features in it.
And step S502, adjusting the pose of the robot and/or adjusting the playing parameters of the data to be played in the local area according to the facial features so as to play the data.
The playing parameters may be brightness of the display screen of the robot, volume of playing, or parameters such as stopping playing, starting playing, and playing the next video.
Specifically, in some embodiments of the present application, the robot may recognize a facial expression of the user, determine an emotion of the user, and if the user is in a negative emotion, decrease brightness of the display screen or decrease a volume of the playing. For example, when the facial expression of the user is recognized to accord with the preset fear expression characteristics, the emotion of the user can be judged to be fear, and the brightness of the display screen can be reduced or the volume of the playing can be reduced.
In other embodiments of the present application, the robot may measure how long the user's eyes remain closed. If this duration exceeds a preset time period, the user may be in a fatigued state; the brightness of the display screen can then be reduced, or the playback volume lowered. Playback may also be paused and the screen switched off directly. Alternatively, the pose of the robot can be adjusted so that its display screen faces away from the user, allowing the user to fall asleep.
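The fatigue-handling behaviour described above can be sketched as simple decision logic. The threshold values, the `PlaybackState` fields, and the `pause_on_sleep` option are illustrative assumptions, not values given in this application.

```python
from dataclasses import dataclass

EYE_CLOSED_LIMIT_S = 5.0   # assumed "preset time period"
DIM_BRIGHTNESS = 0.2       # assumed dimmed screen level (0.0-1.0)
LOW_VOLUME = 0.3           # assumed reduced volume level (0.0-1.0)

@dataclass
class PlaybackState:
    brightness: float = 1.0
    volume: float = 1.0
    playing: bool = True
    screen_on: bool = True

def handle_eye_closure(state: PlaybackState, eyes_closed_s: float,
                       pause_on_sleep: bool = False) -> PlaybackState:
    """If the user's eyes stay closed longer than the preset period, treat
    it as fatigue: dim the screen and lower the volume, or optionally pause
    playback and switch the screen off so the user can fall asleep."""
    if eyes_closed_s > EYE_CLOSED_LIMIT_S:
        if pause_on_sleep:
            state.playing = False
            state.screen_on = False
        else:
            state.brightness = min(state.brightness, DIM_BRIGHTNESS)
            state.volume = min(state.volume, LOW_VOLUME)
    return state
```

In practice the `eyes_closed_s` input would come from the facial-feature recognition of step S501.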
In other embodiments of the present application, the robot may identify the user's eyes to determine the user's line of sight, and adjust its own pose according to the line of sight and the user's relative position, so that the included angle between the user's line of sight and the display screen falls within a preset angle range.
The preset angle range is used to judge whether the user can properly view the information displayed on the display screen. That is, when the included angle between the user's line of sight and the display screen is outside the preset angle range, the user cannot properly view the displayed information, and the pose of the robot needs to be adjusted so that the user can. When the included angle is within the preset angle range, the user can view the displayed information normally. The specific value of the preset angle range may be adjusted by an operator or set by the user.
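The pose check can be expressed as an angle computation. Measuring the included angle against the display's outward normal (so 0 degrees means the user faces the screen head-on) and the 30-degree preset range are illustrative assumptions; the application does not fix either convention.

```python
import math

MAX_VIEW_ANGLE_DEG = 30.0  # assumed preset angle range

def off_axis_angle(gaze_dir, screen_normal):
    """Angle in degrees between the user's line of sight and the display's
    outward normal. gaze_dir points from the user's eyes toward what they
    look at; screen_normal points out of the display toward the room."""
    # Reverse the gaze so both vectors point away from the screen.
    rg = tuple(-c for c in gaze_dir)
    dot = sum(a * b for a, b in zip(rg, screen_normal))
    norm = (math.sqrt(sum(a * a for a in rg))
            * math.sqrt(sum(b * b for b in screen_normal)))
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def needs_pose_adjustment(gaze_dir, screen_normal):
    """True when the viewing angle is outside the preset range, i.e. the
    robot should move or rotate its display."""
    return off_axis_angle(gaze_dir, screen_normal) > MAX_VIEW_ANGLE_DEG
```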
In this embodiment of the application, by identifying the user's facial features and, according to those features, adjusting the robot's pose in the local area and/or the playing parameters of the data to be played, the robot can serve the user better. For example, when the user is fatigued, the robot can adjust the playing parameters and/or its own pose accordingly, giving the user a better experience.
In other embodiments of the present application, as shown in fig. 6, the playing the data to be played in the local area based on the robot may further include the following steps S601 to S602.
Step S601, acquiring a reminding time.
The reminding time may be a time point at which playback of the data to be played starts, that is, playback begins at the reminding time; or it may be a time period during which the data to be played is played, that is, playback takes place within the reminding time.
The data to be played may be audio/video data, which may be audio or video from the network, or video sent to the robot by the user through a mobile terminal. In one possible implementation, the audio/video data may also be a recording of the user made, with the user's permission, using the robot's camera and microphone.
Step S602, at the reminding time, based on the robot, the data to be played is played in the local area.
Specifically, taking audio/video data as an example of the data to be played: after the reminding time and the audio/video data are obtained, they can be associated with each other. For example, when the user sends the reminding time, the associated audio/video data can be selected at the same time, so that the robot associates the obtained reminding time with that audio/video data. Then, at the reminding time, the robot plays the audio/video data in the local area.
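The association between a reminding time and its audio/video data can be sketched as a small scheduler for the point-in-time variant described above. The class and method names are illustrative assumptions.

```python
from datetime import datetime

class ReminderPlayer:
    """Associates each reminding time with a media item and reports which
    items are due, so the robot can play them in the local area."""

    def __init__(self):
        self.schedule = []  # list of (reminding_time, media_id) pairs

    def associate(self, reminding_time: datetime, media_id: str):
        """Link a reminding time to the audio/video data chosen by the user."""
        self.schedule.append((reminding_time, media_id))

    def due(self, now: datetime):
        """Return and remove every media item whose reminding time has
        arrived; the robot would then start playing these."""
        ready = [m for t, m in self.schedule if t <= now]
        self.schedule = [(t, m) for t, m in self.schedule if t > now]
        return ready
```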
Therefore, in this embodiment of the application, by acquiring a reminding time and having the robot play the data to be played, at that time, in the local area recorded on the map, the user can be reminded at the preset time.
In some embodiments of the application, combining the method of fig. 5, the robot may play the data to be played in the local area at the reminding time and then identify the user's facial features. If the facial features do not change within a period of time, the user may not have received the reminder, so the brightness of the display screen or the playback volume can be increased to remind the user.
In other embodiments of the present application, the robot may further receive a control instruction issued by the user, where the instruction is used to control the robot to adjust the playing parameters of the data to be played. After receiving the control instruction, the robot may adjust the playing parameters of the data to be played according to the control instruction.
Specifically, the user may send a control instruction to the robot through the mobile terminal. Alternatively, after starting to play the data to be played, the robot may continuously collect voice information uttered by the user and, when such voice information is collected, generate the corresponding control instruction from it.
The playing parameters may be the brightness and volume used when playing the data to be played, or control parameters such as stopping playback, starting playback, or playing the next video. For example, the user may say "pause" in the local area; on receiving this voice message, the robot generates the corresponding control instruction and pauses playback of the data to be played.
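Generating a control instruction from collected voice information can be sketched as a phrase-to-command lookup applied to the recognized transcript. Apart from "pause", which the application gives as an example, the phrases and command names below are illustrative assumptions.

```python
# Hypothetical phrase-to-command table; extend with whatever commands the
# playback unit actually supports.
VOICE_COMMANDS = {
    "pause": "PAUSE",
    "play": "RESUME",
    "next": "NEXT_VIDEO",
    "louder": "VOLUME_UP",
    "quieter": "VOLUME_DOWN",
}

def to_control_instruction(transcript: str):
    """Turn recognized speech into a playback control instruction, or
    None if the utterance matches no known command phrase."""
    lowered = transcript.lower()
    for phrase, command in VOICE_COMMANDS.items():
        if phrase in lowered:
            return command
    return None
```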
In the embodiment of the application, the parameters can be adjusted in real time according to the instruction of the user by receiving the control instruction sent by the user, so that the user can have better use experience.
In other embodiments of the present application, the robot may acquire a relative position of the robot and a user, and adjust a position of the robot or an angle of a display screen of the robot according to the relative position.
The relative position may include several kinds of information, selected according to the actual situation: for example, the distance between the robot and the user, or the angle between the robot's display screen and the user's line of sight. The display screen is used to play the data to be played.
Specifically, the robot may capture images of the user in real time through its camera and derive the relative position from those images, or obtain the relative position using a laser sensor, an infrared sensor, or the like. According to the relative position, the robot can then move within the local area so that its distance from the user is less than a preset distance, and can adjust its own pose so that the display screen faces the user.
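One planning step of this follow behaviour can be sketched in the plane as below. The preset distance and the step size are illustrative assumptions; the returned heading simply points the display at the user.

```python
import math

PRESET_DISTANCE_M = 1.5  # assumed comfortable viewing distance

def follow_step(robot_xy, user_xy, step=0.2):
    """One planning step: if the user is farther away than the preset
    distance, move `step` metres toward them; in every case return the
    heading (radians) that points the display screen at the user."""
    dx, dy = user_xy[0] - robot_xy[0], user_xy[1] - robot_xy[1]
    dist = math.hypot(dx, dy)
    heading = math.atan2(dy, dx)  # face the user
    if dist > PRESET_DISTANCE_M:
        scale = step / dist
        robot_xy = (robot_xy[0] + dx * scale, robot_xy[1] + dy * scale)
    return robot_xy, heading
```

A real controller would also clip the motion to the boundary of the local area, since the robot may only move within it.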
In other embodiments of the present application, environmental information at the robot's position may also be obtained, and the position of the robot or the angle of its display screen adjusted according to that information.
The environment information refers to the environmental conditions sensed by the robot at its position, for example the ambient brightness or the noise intensity.
Specifically, in some embodiments of the present application, when the environment information is ambient brightness, the brightness of the display screen may be adjusted according to the ambient brightness so that the user can view the displayed information normally. For example, when the ambient brightness exceeds a brightness threshold, the display screen is set to a preset brightness. Alternatively, the position of the robot or the angle of the display screen can be adjusted according to the ambient brightness, so that the screen is not placed where the light is too strong and the user can view the displayed information normally.
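The brightness rule can be sketched as a threshold comparison. The lux threshold and the brightness levels are illustrative assumptions; the application only specifies the comparison against a threshold.

```python
BRIGHTNESS_THRESHOLD_LUX = 500.0  # assumed "brightness threshold"
PRESET_BRIGHTNESS = 0.9           # assumed "preset brightness" (0.0-1.0)
DEFAULT_BRIGHTNESS = 0.5          # assumed normal level

def screen_brightness_for(ambient_lux: float) -> float:
    """When the ambient brightness exceeds the threshold, raise the screen
    to the preset level so the content stays readable; otherwise keep the
    default level."""
    if ambient_lux > BRIGHTNESS_THRESHOLD_LUX:
        return PRESET_BRIGHTNESS
    return DEFAULT_BRIGHTNESS
```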
In other embodiments of the present application, when the environment information is noise intensity, the position of the robot may be adjusted away from the noise. For example, when the noise intensity exceeds a preset decibel value, the robot may move within the local area until the noise intensity falls below that value. Alternatively, the position with the lowest noise intensity in the whole local area may be used as the position from which the robot plays the data to be played.
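Choosing a playback position by noise intensity can be sketched as follows. Sampling noise at a discrete set of candidate positions, and the 60 dB limit, are illustrative assumptions.

```python
PRESET_DECIBEL = 60.0  # assumed noise limit

def pick_play_position(noise_by_position):
    """noise_by_position maps candidate positions in the local area to the
    measured noise intensity in dB. Prefer the quietest position that is
    already below the preset decibel value; if none is, fall back to the
    position with the minimum noise intensity in the whole local area."""
    quiet = {p: n for p, n in noise_by_position.items() if n < PRESET_DECIBEL}
    pool = quiet or noise_by_position
    return min(pool, key=pool.get)
```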
In practical applications, to improve the user experience, the robot can adjust its position, the angle of its display screen, and the playing parameters according to several factors at once, such as the user's facial features, the relative position of the robot and the user, and the environment information, so that the user obtains a better audio-visual experience.
It should be noted that, for simplicity of description, the foregoing method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts, as some steps may, in accordance with the present application, occur in other orders.
Fig. 7 is a schematic structural diagram of a robot control device 700 according to an embodiment of the present disclosure, where the robot control device 700 is configured on a robot. The robot control device 700 may include: a verification unit 701 and a determination unit 702.
A verification unit 701, configured to, when receiving a usage request requesting usage in a local area, verify whether the usage request satisfies an authorization condition;
a determining unit 702, configured to determine that the robot is used in a local area if the usage request meets an authorization condition.
In some embodiments of the present application, the verification unit 701 is further specifically configured to, when receiving a use request requesting use in a local area, parse the use request to obtain application end information for sending the use request; and verifying whether the use request meets an authorization condition or not according to the application terminal information.
In some embodiments of the present application, the application-side information includes configuration information of the sending terminal and user information of the sending terminal; the verification unit 701 is further specifically configured to determine that the usage request meets the authorization condition if the configuration information and/or the user information of the sending terminal includes authorization information, and otherwise that the usage request does not meet the authorization condition.
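The authorization check performed by the verification unit can be sketched as follows. Representing the parsed application-side information as dictionaries, and looking for an "authorization" key, are illustrative assumptions, not details from the application.

```python
def satisfies_authorization(config_info: dict, user_info: dict) -> bool:
    """The usage request meets the authorization condition if the sending
    terminal's configuration information and/or its user information
    carries authorization information (modelled here as an "authorization"
    key in the parsed dictionaries)."""
    return "authorization" in config_info or "authorization" in user_info
```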
In some embodiments of the present application, the determining unit 702 is further specifically configured to obtain a global map of the robot, where the global map is a global map of a spatial area where the robot is located, and the global map includes a local map; determining the local map according to the use request in the global map; and taking the local area corresponding to the local map as the use range of the robot.
In some embodiments of the application, the determining unit 702 is further specifically configured to set a virtual wall in the global map to plan the local map, and use a local area corresponding to the local map as the use range of the robot.
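A virtual wall drawn on the global map can be modelled as a closed polygon whose interior is the robot's range of use, tested with standard ray casting. This representation is an illustrative assumption, since the application does not specify how the virtual wall is stored.

```python
def inside_local_area(point, virtual_wall):
    """virtual_wall is a closed polygon (list of (x, y) vertices) planned
    on the global map; the enclosed region is the local area the robot may
    use. Standard ray-casting point-in-polygon test: count how many edges
    a rightward ray from `point` crosses."""
    x, y = point
    inside = False
    n = len(virtual_wall)
    for i in range(n):
        x1, y1 = virtual_wall[i]
        x2, y2 = virtual_wall[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal line at y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

The navigation stack would then reject any goal for which `inside_local_area` is false, keeping the robot within its use range.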
In some embodiments of the present application, the robot control device 700 further includes a playing unit, configured to obtain data to be played; and play the data to be played in the local area based on the robot.
In some embodiments of the present application, the playing unit is further specifically configured to identify a facial feature of a user providing the data to be played; and according to the facial features, adjusting the pose of the robot and/or adjusting the playing parameters of the data to be played in the local area so as to play the data.
It should be noted that, for convenience and simplicity of description, the specific working process of the robot control device 700 may refer to the corresponding process of the method described in fig. 1 to fig. 6, and is not repeated herein.
Fig. 8 is a schematic view of a robot according to an embodiment of the present disclosure. The robot 8 may include: a processor 80, a memory 81 and a computer program 82, such as a robot control program, stored in said memory 81 and executable on said processor 80. The processor 80, when executing the computer program 82, implements the steps in the various robot control method embodiments described above, such as steps S101 to S102 shown in fig. 1. Alternatively, the processor 80, when executing the computer program 82, implements the functions of each module/unit in each device embodiment described above, for example, the functions of the units 701 to 702 shown in fig. 7.
The computer program may be divided into one or more modules/units, which are stored in the memory 81 and executed by the processor 80 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program in the robot.
For example, the computer program may be divided into: an authentication unit and a determination unit. The specific functions of each unit are as follows: a verification unit configured to verify, when a usage request requesting usage in a local area is received, whether the usage request satisfies an authorization condition; a determination unit, configured to determine that the robot is used in a local area if the usage request satisfies an authorization condition.
The robot may include, but is not limited to, a processor 80, a memory 81. Those skilled in the art will appreciate that fig. 8 is merely an example of a robot and is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or different components, e.g., the robot may also include input and output devices, network access devices, buses, etc.
The processor 80 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 81 may be an internal storage unit of the robot, such as a hard disk or a memory of the robot. The memory 81 may also be an external storage device of the robot, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the robot. Further, the memory 81 may also include both an internal storage unit and an external storage device of the robot. The memory 81 is used for storing the computer program and other programs and data required by the robot. The memory 81 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. For the specific working processes of the units and modules in the system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/robot and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/robot are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the methods in the embodiments above can be implemented by a computer program, which can be stored in a computer-readable storage medium and which, when executed by a processor, implements the steps of the method embodiments above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include any entity or device capable of carrying the computer program code: a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be increased or decreased as required by legislation and patent practice in each jurisdiction; for example, in some jurisdictions, computer-readable media may not include electrical carrier signals or telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (9)

1. A robot control method, applied to a robot, comprising:
upon receiving a usage request requesting use within a local area, verifying whether the usage request satisfies an authorization condition;
if the use request meets an authorization condition, determining the robot to be used in a local area;
controlling the robot to go to a local area and move in the local area for a user to use;
the verifying whether the use request meets the authorization condition when receiving the use request requesting to use in the local area comprises:
when a use request requesting to be used in a local area is received, analyzing the use request to obtain application end information for sending the use request, wherein the application end information comprises configuration information and user information of a sending end;
and verifying whether the use request meets an authorization condition according to whether the application end information is analyzed to obtain authorization information.
2. The robot control method according to claim 1, wherein the application-side information includes configuration information of a transmitting terminal and user information of the transmitting terminal;
the verifying whether the use request meets the authorization condition according to whether the authorization information is analyzed from the application terminal information comprises:
if the configuration information of the sending terminal and/or the user information of the sending terminal comprises authorization information, the use request meets an authorization condition;
otherwise, the use request does not satisfy the authorization condition.
3. The robot control method of claim 1, wherein said determining the robot to be used within a local area comprises:
acquiring a global map of the robot, wherein the global map is a global map of a space area where the robot is located, and the global map comprises a local map;
determining the local map according to the use request in the global map;
and taking the local area corresponding to the local map as the use range of the robot.
4. The robot control method according to claim 3, wherein the setting the local area corresponding to the local map as the range of use of the robot includes:
and planning the local map by arranging a virtual wall in the global map, and taking a local area corresponding to the local map as the use range of the robot.
5. The robot control method according to claim 1, wherein the determining that the robot is used in a local area after the usage request satisfies an authorization condition includes:
acquiring data to be played;
and playing the data to be played in the local area based on the robot.
6. The robot control method according to claim 5, wherein the playing the data to be played in the local area based on the robot includes:
identifying facial features of a user providing the data to be played;
and according to the facial features, adjusting the pose of the robot and/or adjusting the playing parameters of the data to be played in the local area so as to play the data.
7. A robot control apparatus, comprising:
a verification unit configured to verify, when a usage request requesting usage in a local area is received, whether the usage request satisfies an authorization condition;
the determining unit is used for determining the robot to be used in a local area if the use request meets an authorization condition, and is also used for controlling the robot to move to the local area and move in the local area for a user to use;
the verification unit is further specifically configured to, when receiving a use request requesting use in a local area, parse the use request to obtain application end information for sending the use request; and verifying whether the use request meets an authorization condition according to whether the application end information is analyzed to obtain authorization information, wherein the application end information comprises configuration information and user information of a sending end.
8. A robot comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the steps of the method according to any of claims 1 to 6 are implemented when the computer program is executed by the processor.
9. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
CN202011230187.6A 2020-11-06 2020-11-06 Robot control method, device, robot and storage medium Active CN112518741B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011230187.6A CN112518741B (en) 2020-11-06 2020-11-06 Robot control method, device, robot and storage medium

Publications (2)

Publication Number Publication Date
CN112518741A CN112518741A (en) 2021-03-19
CN112518741B true CN112518741B (en) 2022-09-09

Family

ID=74979760

Country Status (1)

Country Link
CN (1) CN112518741B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116631153B (en) * 2023-05-12 2023-11-03 天津医药集团众健康达医疗器械有限公司 Indoor space-oriented regional inspection alarm method, device, equipment and medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106250533A (en) * 2016-08-05 2016-12-21 北京光年无限科技有限公司 A kind of Rich Media's played data treating method and apparatus towards intelligent robot
CN108090474A (en) * 2018-01-17 2018-05-29 华南理工大学 A kind of hotel service robot system linked up based on cloud voice with mood sensing
CN108764507A (en) * 2018-06-01 2018-11-06 深圳乐易住智能科技股份有限公司 One kind is self-service to move in formula Hospitality management system and method
CN108803589A (en) * 2017-04-28 2018-11-13 深圳乐动机器人有限公司 Robot virtual wall system
CN110450174A (en) * 2019-08-02 2019-11-15 深圳市三宝创新智能有限公司 A kind of navigation of foreground robot is led the way method
US10482550B1 (en) * 2013-03-27 2019-11-19 Vecna Robotics, Inc. Mobile robot for performing hospitality service(s) for guest(s) of a hospitatlity business
CN110722568A (en) * 2019-11-01 2020-01-24 北京云迹科技有限公司 Robot control method, device, service robot and storage medium
CN111222939A (en) * 2019-11-05 2020-06-02 盟广信息技术有限公司 Robot-based hotel service method and device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103631202A (en) * 2012-08-20 2014-03-12 北京威控科技发展有限公司 Hotel guest room intelligent monitoring system and method based on internet of things
CN107545354A (en) * 2017-05-22 2018-01-05 圆动(上海)信息技术服务有限公司 A kind of hotel service and management method
CN111245622A (en) * 2018-11-29 2020-06-05 吴德松 Remote identity authentication method
KR20190098926A (en) * 2019-08-05 2019-08-23 엘지전자 주식회사 Robot and Method for providing service providing of the robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant