CN112692827A - Robot control method and system and robot - Google Patents

Robot control method and system and robot

Info

Publication number
CN112692827A
Authority
CN
China
Prior art keywords
robot
task
display
display device
displaying
Prior art date
Legal status
Pending
Application number
CN202011472141.5A
Other languages
Chinese (zh)
Inventor
罗沛 (Luo Pei)
梁朋 (Liang Peng)
Current Assignee
Uditech Co Ltd
Original Assignee
Uditech Co Ltd
Priority date
Filing date
Publication date
Application filed by Uditech Co Ltd
Priority to CN202011472141.5A
Publication of CN112692827A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21 LIGHTING
    • F21V FUNCTIONAL FEATURES OR DETAILS OF LIGHTING DEVICES OR SYSTEMS THEREOF; STRUCTURAL COMBINATIONS OF LIGHTING DEVICES WITH OTHER ARTICLES, NOT OTHERWISE PROVIDED FOR
    • F21V33/00 Structural combinations of lighting devices with other articles, not otherwise provided for

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Manipulator (AREA)

Abstract

The application relates to the technical field of robot control and provides a robot control method for a robot whose body is fitted with a display device and a light strip. The method comprises: when the robot executes a task, displaying the execution process on the display device, and displaying a visual effect on the light strip according to the movement state of the robot caused by executing the task. Correspondingly, the application also provides a robot control system and a robot. In this application, when a user uses the robot, the display device and the light strip operate in linkage to produce visual effects, thereby improving the user experience.

Description

Robot control method and system and robot
Technical Field
The present application relates to the field of robot control technologies, and in particular, to a robot control method, system, and robot.
Background
With technological progress, robots are now widely used in many industries. In some airports, hotels and other public places, robots undertake reception tasks such as answering visitors' questions and guiding visitors, greatly relieving the workload of reception staff. At the same time, robots in these settings must offer good interactive functions so that users receive a good service experience.
At present, however, robots interact with users in only a single mode, which affects the user experience to a certain extent.
Disclosure of Invention
The application aims to provide a robot control method, a robot control system and a robot that improve the user experience when a user uses the robot.
To achieve this aim, the application adopts the following technical solutions:
In a first aspect, the present application provides a robot control method, wherein a display device and a light strip are mounted on the body of a robot, the method comprising:
when the robot executes a task, displaying the execution process on the display device, and displaying a visual effect on the light strip according to the movement state of the robot caused by executing the task.
As an improvement of the above solution, before displaying the execution process on the display device and displaying the visual effect on the light strip according to the movement state of the robot caused by executing the task, the method comprises:
acquiring an operation instruction triggered by a user on the display device and/or the handheld terminal;
generating a corresponding task according to the operation instruction, and storing the task in a preset task list;
and acquiring the tasks in the preset task list and executing the tasks.
As an improvement of the above solution, displaying the execution process on the display device comprises:
displaying, on the display device, a 3D model effect corresponding to the movement state of the robot caused by executing the task.
As an improvement of the above solution, displaying the execution process on the display device further comprises:
displaying progress information and result information of executing the task on the display device, the progress information and result information being displayed in combination with an animated visual effect.
As an improvement of the above solution, displaying the execution process on the display device further comprises:
acquiring the target position of the task;
acquiring the current position of the robot in real time;
determining a remaining distance according to the current position and the target position;
and displaying a reminding effect of the task execution on the display device according to the remaining distance.
As a modification of the above, the movement state of the robot includes a straight movement state, a turning movement state and a stopped movement state;
displaying a visual effect on the light strip according to the movement state of the robot caused by executing the task comprises:
controlling the light strip to present different visual effects according to the straight, turning and stopped movement states of the robot, respectively.
As an improvement of the above scheme, the visual effect includes one or more of a display range, a display color and a display brightness of the light strip;
displaying a visual effect on the light strip according to the movement state of the robot caused by executing the task further comprises:
when the robot is in the straight or turning movement state, controlling one or more of the display range, display color and display brightness of the light strip according to the moving speed of the robot.
As an improvement of the above, the method further comprises:
acquiring the system time;
when the system time is within a preset dimming time range, adjusting the display brightness of the display device and of the light strip to a target brightness corresponding to that time range;
or, alternatively,
acquiring the ambient light intensity;
and when the light intensity is within a preset dimming intensity range, adjusting the display brightness of the display device and of the light strip to a target brightness corresponding to the brightness grade of that intensity range.
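The time-based branch of this dimming rule can be sketched as follows. This is a minimal illustration, not the patent's implementation: the schedule, time ranges and brightness values are hypothetical examples.

```python
from datetime import time

# Hypothetical preset dimming time ranges; each carries the target
# brightness (percent) applied to both the display device and the light
# strip. The values are illustrative, not taken from the patent.
DIMMING_SCHEDULE = [
    (time(19, 0), time(22, 59), 40),   # evening: dim to 40%
    (time(23, 0), time(23, 59), 15),   # late night: dim to 15%
]

def target_brightness(now, schedule=DIMMING_SCHEDULE, default=100):
    """Return the target brightness for the given system time;
    full brightness outside every preset dimming range."""
    for start, end, brightness in schedule:
        if start <= now <= end:
            return brightness
    return default
```

The light-intensity branch would follow the same shape, with intensity bands in place of time ranges.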
In a second aspect, the present application further provides a robot control system, wherein a display device and a light strip are mounted on the body of a robot, the system comprising:
a first display module, configured to display the execution process on the display device when the robot executes a task;
and a second display module, configured to display a visual effect on the light strip according to the movement state of the robot caused by executing the task.
In a third aspect, the present application further provides a robot, with a display device and a light strip mounted on its body, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of any of the above robot control methods when executing the computer program.
Compared with the prior art, the present application has the following beneficial effects:
when the robot executes a task, the display device displays the execution process while the light strip displays a visual effect according to the movement state of the robot caused by executing the task. Different display-device and light-strip visual effects are thus presented in different situations, and the response to and execution of instructions are fed back to the user through the linked visual effects of the display device and the light strip, improving the user's service experience.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed for the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of a robot according to a first embodiment of the robot control method of the present application;
Fig. 2 is an expanded flowchart of the first embodiment of the robot control method of the present application;
Fig. 3 is a first flowchart of the display execution process according to the first embodiment of the robot control method of the present application;
Fig. 4 is a second flowchart of the display execution process according to the first embodiment of the robot control method of the present application;
Fig. 5 is a flowchart of displaying a visual effect on the light strip according to the movement state of the robot caused by executing a task, according to the first embodiment of the robot control method of the present application;
Fig. 6 is a first flowchart of adjusting the display brightness of the light strip according to a second embodiment of the robot control method of the present application;
Fig. 7 is a second flowchart of adjusting the display brightness of the light strip according to the second embodiment of the robot control method of the present application;
Fig. 8 is a flowchart of adjusting the idle visual effect of the light strip according to the second embodiment of the robot control method of the present application;
Fig. 9 is a flowchart of adjusting the fault visual effect of the light strip according to the second embodiment of the robot control method of the present application;
Fig. 10 is a block diagram of an embodiment of the robot control system of the present application;
Fig. 11 is a block diagram of an extended architecture of an embodiment of the robot control system of the present application;
Fig. 12 is a block diagram of a robot according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to the embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein the same or similar reference numerals refer to the same or similar elements or to elements having the same or similar functions throughout. The embodiments described below with reference to the drawings are exemplary, are intended to explain the present application, and should not be construed as limiting it.
The robot control method provided by the embodiments of the application can be applied to terminal equipment such as a robot. The robot can be a mobile service robot; the embodiments of the application do not limit the specific type of robot.
In order to explain the technical means of the present application, specific embodiments are described below.
The first embodiment is as follows:
referring to fig. 1, an embodiment of the present application provides a robot R, and a display device a and a light strip b are mounted on a body of the robot R. The application provides a robot control method, which is applied to a robot R and comprises the following steps of S101:
and S101, when the robot executes a task, displaying an execution process on the display device, and displaying a visual effect on the lamp strip according to the movement state of the robot caused by the execution of the task.
Referring to fig. 2, in an embodiment, when the robot performs a task, displaying an execution process on the display device, and before the light strip displays a visual effect according to a movement state of the robot caused by the task, the method includes steps S102 to S104:
step S102, acquiring an operation instruction triggered by a user on the display device and/or the handheld terminal;
step S103, generating a corresponding task according to the operation instruction, and storing the task in a preset task list;
and step S104, acquiring the tasks in the preset task list and executing the tasks.
In application, the handheld terminal can be a mobile phone, a tablet computer or a similar device. The user scans a two-dimensional code with the mobile phone or tablet to authenticate and establish a communication connection with the robot; a robot interactive operation interface is then displayed in the application program of the connected handheld terminal, and the user sends specific operation instructions to the robot through this interface.
Of course, the operation instruction may also be sent from other devices, such as a desktop computer or a self-service terminal. These fixed-position devices connect to the robot through a pre-installed application, and the communication connection can be realized through wireless communication equipment such as a wireless gateway. Operation instructions issued from the desktop computer, self-service terminal and the like are thereby relayed to the robot.
The display device can be a human-computer interaction device that includes a display interface and an operation interface and realizes information exchange between the user and the robot. The user can input an operation instruction through the human-computer interaction device, so that the robot acquires the instruction directly without any other equipment. Taking a hotel service robot as an example, a group of operation buttons is shown on the operation interface of the human-computer interaction device, each corresponding to an operation instruction, for example querying remaining room information or querying a room price. A customer sends the corresponding operation instruction by tapping a specific button.
In application, the operation instructions may include query instructions and control instructions. A query instruction is used to query content in a local or cloud database. A control instruction drives the robot to perform a related action or task, such as a cargo-handling instruction or a guiding instruction; for example, a hotel robot guides a guest to a room, or an airport robot loads and unloads cargo from a freighter. The robot can receive multiple tasks at once and execute them one by one according to task scheduling.
Managers, maintenance staff and other personnel with setting authority over the robot can also send setting instructions through various terminal devices: for example, by hard-wiring a desktop computer to the robot to set the visual effects of the display device and the light strip directly, or by establishing a communication connection with the robot through a mobile phone or tablet and then sending the setting instructions. In addition, an identity verification mechanism can be established so that the visual effects of the display device and the light strip are changed only when the sender of the setting instruction is verified as an authorized person, ensuring the safety of the robot's data and settings.
In one embodiment, after receiving the operation instruction, the data may be checked and the identity verified. Taking the hotel service robot as an example, when a user sends an operation instruction to query certain data, the query task is placed in the task list only if the data exists and the customer has access rights to it; otherwise it is not. This preprocessing keeps unnecessary tasks out of the task list and improves task execution efficiency.
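The flow of steps S102 to S104, including the preprocessing just described, can be sketched as below. All names, the instruction format and the sample data are hypothetical, chosen only to illustrate the enqueue-then-execute pattern.

```python
from collections import deque

class TaskQueue:
    """Illustrative sketch of steps S102-S104 with the described
    preprocessing; names and data structures are hypothetical."""

    def __init__(self, database, permissions):
        self.database = database        # data available to query tasks
        self.permissions = permissions  # key -> set of users with access
        self.tasks = deque()            # the "preset task list"

    def receive(self, user, instruction):
        # Steps S102/S103: turn a valid operation instruction into a task.
        if instruction["type"] == "query":
            key = instruction["key"]
            # Preprocessing: drop the task if the data does not exist
            # or the user lacks access rights.
            if key not in self.database or user not in self.permissions.get(key, set()):
                return False
        self.tasks.append(instruction)
        return True

    def run_next(self):
        # Step S104: fetch the next task from the list and execute it.
        if not self.tasks:
            return None
        task = self.tasks.popleft()
        if task["type"] == "query":
            return self.database[task["key"]]
        return "executed " + task["type"]
```

For example, a query from a user without access rights is rejected before it ever reaches the task list, while permitted tasks execute in arrival order.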
Referring to fig. 3, in one embodiment, displaying the execution process on the display device includes:
Step S1011: displaying, on the display device, a 3D model effect corresponding to the movement state of the robot caused by executing the task.
The 3D model effect intuitively reflects the robot's movement state and conveys task details to the user, reminding the user and surrounding people of the current movement state in a visual, friendly way so that they can follow or avoid the robot. For example, when the robot performs a cargo-handling task, the 3D model on the display device may indicate that the robot is turning left; people nearby, seeing the prompt, move to the right of the robot's direction of travel and avoid colliding with the turning robot.
In one embodiment, displaying the execution process on the display device further comprises:
Step S1012: displaying progress information and result information of executing the task on the display device, the progress information and result information being displayed in combination with an animated visual effect.
In application, the progress information may be, for example, the number of goods already carried, or the distance already travelled in a guiding task: while loading and unloading goods, the robot may display how many warehouses it has visited and how many goods it has handled. The result information indicates whether the task succeeded or failed; for example, when the weight of the goods exceeds the load the robot can carry, result information of task failure is shown on the display device.
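The distinction between progress information and result information can be sketched as one small formatter; the message wording and the function name are hypothetical.

```python
def execution_message(handled, total, error=None):
    """Hypothetical formatter for the progress/result text shown
    alongside the animated visual effect on the display device."""
    if error is not None:
        return "Task failed: " + error                        # result (failure)
    if handled >= total:
        return "Task complete: %d items handled" % total      # result (success)
    return "In progress: %d/%d items handled" % (handled, total)  # progress
```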
Referring to fig. 4, in an embodiment, displaying the execution process on the display device further includes steps S1013 to S1016:
Step S1013: acquiring the target position of the task.
In application, the target position may be carried in the operation instruction. It is the information that identifies the destination of a guiding task, and may be a position coordinate or a location number, such as a room number in a hotel, a gate number in an airport, or a scenic-spot code in a scenic area.
Step S1014: acquiring the current position in real time.
The current position is the information that determines the robot's real-time location. In application, a positioning device may be installed in the robot to acquire its positioning coordinates in real time as the current position information. The positioning device may be a global positioning terminal such as a Global Positioning System (GPS) receiver, or an indoor positioning terminal; alternatively, the robot may determine its current position from environmental information against its current navigation map.
Step S1015: determining a remaining distance according to the robot's current position and the target position.
In application, when calculating the remaining guiding distance, the current position and the target position may both be converted into geographic coordinates, and the straight-line distance between the two taken as the remaining distance. Taking a guest-room guiding task as an example, the current position is the robot's geographic coordinate acquired in real time by the positioning device and needs no conversion, while the target position is a room number that must be converted into the room's geographic coordinates. The mapping between room numbers and their coordinates may be stored in the robot's memory in advance and used for the conversion. Once the current coordinates and the room coordinates are determined, the straight-line distance between them is calculated as the remaining distance.
It will be appreciated that during execution of the guiding instruction, a corresponding remaining distance must be calculated after each acquisition of the current position, so that the remaining distance is displayed in real time.
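The room-number conversion and straight-line distance of step S1015 can be sketched as follows; the stored mapping and the example coordinates are hypothetical, and a planar Euclidean distance stands in for whatever geographic distance the robot actually uses.

```python
import math

# Hypothetical room-number -> coordinate mapping stored in the robot's
# memory in advance (example planar coordinates, not from the patent).
ROOM_COORDS = {"1208": (120.0, 30.0)}

def remaining_distance(current_xy, room_number, room_coords=ROOM_COORDS):
    """Step S1015 sketch: convert the room number to coordinates and
    return the straight-line distance from the current position."""
    tx, ty = room_coords[room_number]
    cx, cy = current_xy
    return math.hypot(tx - cx, ty - cy)
```

In use, this function would be re-evaluated after every position update so the displayed remaining distance stays current.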
Step S1016: displaying a reminding effect of the task execution on the display device according to the remaining distance.
In application, the reminding effect can be dynamic text, i.e. the number and unit of the remaining distance displayed directly in real time; it can also be realized through grading and dynamic change of the display range.
For the remaining-distance effect, the lit area of the display device can be divided into four parts to grade the display range, and grading is conveniently done by calculating the remaining guiding distance as a percentage of the total guiding distance. The relationship between the display range level, the number of bright portions of the display and the remaining distance as a percentage of the total guiding distance is shown in the following table:

Display range level | Bright portions of display | Remaining distance (% of total)
First level | 1 | >75%
Second level | 2 | 50%-75%
Third level | 3 | 25%-50%
Fourth level | 4 | <25%

TABLE 1

As the table shows, the smaller the remaining guiding distance, the higher the display range level and the more of the display is lit; when the remaining distance is less than 25% of the total distance, the entire display is fully bright.
In one embodiment, the reminding effect for the remaining distance can also be displayed through the light strips. For example, for a robot with 8 light strips, the display range may be divided into four levels; the relationship between the display range level, the number of lit light strips and the remaining distance as a percentage of the total guiding distance is as follows:

Display range level | Lit light strips | Remaining distance (% of total)
First level | 2 | >75%
Second level | 4 | 50%-75%
Third level | 6 | 25%-50%
Fourth level | 8 | <25%

TABLE 2

As the table shows, the smaller the remaining guiding distance, the higher the display range level and the more light strips are lit; when the remaining distance is less than 25% of the total distance, all the light strips are fully bright.
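The percentage-to-level grading just described can be sketched as a single mapping, assuming quartile band edges consistent with the stated <25% fully-bright threshold; the function names are illustrative.

```python
def display_range_level(remaining, total):
    """Grade the remaining guiding distance (as a share of the total)
    into a display range level; quartile band edges are assumed from
    the <25% fully-bright threshold described above."""
    pct = 100.0 * remaining / total
    if pct < 25:
        return 4        # fully bright
    if pct < 50:
        return 3
    if pct < 75:
        return 2
    return 1

def lit_strips(level, total_strips=8):
    """With 8 light strips, each level lights one more quarter of them."""
    return level * total_strips // 4
```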
In one embodiment, the remaining-distance visual effect on the display device can also be realized through grading and dynamic change of the display color. For example, the display color is divided into four levels, the first to fourth levels corresponding to blue, green, yellow and red respectively, and grading is again done by the remaining guiding distance as a percentage of the total guiding distance. The relationship between the display color level, the display color and the remaining distance percentage is shown in the following table:

Display color level | Display color | Remaining distance (% of total)
First level | Blue | >75%
Second level | Green | 50%-75%
Third level | Yellow | 25%-50%
Fourth level | Red | <25%

TABLE 3

As the table shows, the display color changes as the remaining guiding distance shrinks; when the remaining distance is less than 25% of the total distance, the display color is red.
The remaining-distance visual effect on the display device can likewise be realized through grading and dynamic change of the display brightness. For example, the display brightness is divided into four levels, rising from dark to bright from the first to the fourth level, and grading is again done by the remaining guiding distance as a percentage of the total guiding distance. The relationship between the display brightness level, the display brightness and the remaining distance percentage is shown in the following table:

Display brightness level | Display brightness | Remaining distance (% of total)
First level | Darkest | >75%
Second level | Darker | 50%-75%
Third level | Brighter | 25%-50%
Fourth level | Brightest | <25%

TABLE 4

As the table shows, the display brightness rises as the remaining guiding distance shrinks, reaching its highest value when the remaining distance is less than 25% of the total distance.
In application, any one of the three reminding effects may be used alone, or any two or all three may be used simultaneously.
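Combining all three graded reminding effects for one remaining-distance reading can be sketched as below; the quartile band edges above 25% are an assumption consistent with the stated <25% threshold, and the returned structure is illustrative.

```python
def reminder_effects(remaining, total):
    """Sketch of the combined reminding effects (display range, color,
    brightness) for one remaining-distance reading."""
    pct = 100.0 * remaining / total
    level = 4 if pct < 25 else 3 if pct < 50 else 2 if pct < 75 else 1
    colors = {1: "blue", 2: "green", 3: "yellow", 4: "red"}  # per the color grading
    return {
        "range_level": level,        # more of the display lit as level rises
        "color": colors[level],
        "brightness_level": level,   # dark -> bright from level 1 to 4
    }
```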
Referring to fig. 5, in one embodiment, the movement states of the robot include a straight movement state, a turning movement state and a stopped movement state. Displaying a visual effect on the light strip according to the movement state of the robot caused by executing the task includes steps S1021 to S1022:
Step S1021: controlling the light strip to present different visual effects according to the straight, turning and stopped movement states of the robot, respectively.
In application, the visual effect may include one or more of the display range, display color and display brightness of the light strip. There may be one light strip, or two or more. For example, a number of light strips matching the number of light-strip visual effects can be installed, each strip dedicated to one effect; or, with a fixed number of strips, a single strip can be configured to display one or more visual effects as required.
For the movement-state visual effect, the lit range of the light strip can be divided into an upper, a middle and a lower part, with one movement state corresponding to one lit part, for example as follows:

Lit part of light strip | Movement state
Upper part | Straight
Middle part | Turning
Lower part | Stop

TABLE 5
When there are multiple light strips, the movement-state visual effect can also be shown by lighting different strips. For example, for a robot with three light strips (a first, a second and a third light strip), the relationship between the lit strip and the robot's movement state is shown in the following table:

Lit light strip | Movement state
First light strip | Straight
Second light strip | Turning
Third light strip | Stop

TABLE 6
The movement-state visual effect can also be shown through the display color; for example, the light strip's display color includes blue, green and red. The relationship between the display color of the light strip and the movement state of the robot is shown in the following table:

Display color | Movement state
Green | Straight
Blue | Turning
Red | Stop

TABLE 7

Following common light-color conventions, such as traffic lights where green means go and red means stop, the straight state can be set to green light, the turning state to blue light and the stopped state to red light. Of course, these settings may be changed as needed.
The movement-state visual effect can also change dynamically through graded display brightness. For example, the display brightness is divided into three levels, rising from dark to bright from the first to the third level. The relationship between the display brightness level, the light strip brightness and the movement state of the robot is shown in the following table:

Display brightness level | Light strip brightness (cd/m²) | Movement state
First level | 50 | Stop
Second level | 100 | Straight
Third level | 150 | Turning

TABLE 8

In application, any one of the three types of visual-effect dynamics may be used alone, or any two or all three may be used simultaneously.
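A minimal sketch of step S1021, combining the three effect types for each movement state with the example values of Tables 6 to 8; the data layout is illustrative.

```python
# Example mapping of each movement state to its light-strip visual
# effect, following the example values of Tables 6-8 above.
STATE_EFFECTS = {
    "straight": {"lit_strip": "first",  "color": "green", "brightness_cd_m2": 100},
    "turning":  {"lit_strip": "second", "color": "blue",  "brightness_cd_m2": 150},
    "stop":     {"lit_strip": "third",  "color": "red",   "brightness_cd_m2": 50},
}

def strip_effect(state):
    """Return the combined visual effect for the robot's movement state."""
    return STATE_EFFECTS[state]
```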
In one embodiment, the visual effect comprises one or more of the display range, display color and display brightness of the light strip. Displaying a visual effect on the light strip according to the movement state of the robot caused by executing the task further comprises:
Step S1022: when the robot is in the straight or turning movement state, controlling one or more of the display range, display color and display brightness of the light strip according to the moving speed of the robot.
Similar to the remaining-distance visual effect on the display device, the moving-speed visual effect on the light strip can grade the display range by dividing the strip's lit range into four parts. The relationship between the display range level, the number of lit parts of the strip and the moving speed is shown in the following table:

Display range level | Lit parts of light strip | Robot moving speed (m/s)
First level | 1 | <1
Second level | 2 | 1-1.5
Third level | 3 | 1.5-2
Fourth level | 4 | >2

TABLE 9

As the table shows, the faster the robot moves, the higher the display range level and the more parts of the strip are lit; when the speed exceeds 2 m/s, the entire strip is fully bright.
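The speed grading of Table 9 can be sketched as a simple threshold function; the name is illustrative.

```python
def speed_range_level(speed_m_s):
    """Grade the robot's moving speed into a display range level,
    equal to the number of lit parts of the light strip (Table 9)."""
    if speed_m_s < 1.0:
        return 1
    if speed_m_s <= 1.5:
        return 2
    if speed_m_s <= 2.0:
        return 3
    return 4
```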
When the number of the lamp belts is multiple, the lamp belt visual effect of the moving speed can be correspondingly set with the display range through the number of the lamp belts. For example, in the case of a robot having 8 light bands, the display range level may be divided into four levels, and the relationship between the display range level, the number of light bands and the moving speed is shown in the following table:
Display range level | Number of lit light strips | Robot moving speed (m/s)
First level | 2 | < 1
Second level | 4 | 1-1.5
Third level | 6 | 1.5-2
Fourth level | 8 | > 2

TABLE 10
As can be seen from the above table, the higher the robot's moving speed, the higher the display range level and the more light strips are lit; when the moving speed exceeds 2 meters per second, all light strips are lit.
The moving-speed visual effect of the light strip can also be realized through graded, dynamic changes of display color. For example, the display color may be divided into four levels, with the first to fourth levels corresponding to blue, green, yellow, and red light respectively. The relationship between the display color level, the display color, and the moving speed is shown in the following table:
Display color level | Light color displayed | Robot moving speed (m/s)
First level | Blue | < 1
Second level | Green | 1-1.5
Third level | Yellow | 1.5-2
Fourth level | Red | > 2

TABLE 11
As can be seen from the above table, the higher the robot's moving speed, the higher the display color level and the display color of the light strip changes accordingly; when the moving speed exceeds 2 meters per second, the light strip displays red.
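Table 11's speed-to-color mapping can be sketched the same way; the level boundaries at exactly 1, 1.5, and 2 m/s are assumptions, as the table is ambiguous there, and the names are illustrative.

```python
# Upper speed bound (m/s) of each color level, per Table 11.
SPEED_COLOR_LEVELS = [
    (1.0, "blue"),    # first level: < 1 m/s
    (1.5, "green"),   # second level: 1-1.5 m/s
    (2.0, "yellow"),  # third level: 1.5-2 m/s
]

def color_for_speed(speed_mps: float) -> str:
    """Return the light-strip display color for a given moving speed."""
    for upper_bound, color in SPEED_COLOR_LEVELS:
        if speed_mps <= upper_bound:
            return color
    return "red"  # fourth level: > 2 m/s
```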
The moving-speed visual effect of the light strip can likewise be realized through graded, dynamic changes of display brightness. For example, the display brightness may be divided into four levels, with the brightness corresponding to the first to fourth levels increasing from dark to bright. The relationship between the display brightness level, the light-strip brightness value, and the robot's moving speed is shown in the following table:
(The table relating display brightness level, light-strip display brightness value, and robot moving speed appears only as an image in the original publication.)
TABLE 12
As can be seen from the above table, the higher the robot's moving speed, the higher the display brightness level and the brighter the light strip; when the moving speed exceeds 2 meters per second, the light strip is at its maximum display brightness.
In application, the three types of moving-speed light-strip effects may be used individually, any two may be used together, or all three may be used simultaneously.
The display time of the display-device and light-strip visual effects can be determined by the duration of the execution process; for example, if the display interface shows the queried information continuously, the display device and light strip can be controlled to show the corresponding visual effect continuously. The display time may also be set according to other needs. For example, the query operation interface may be set to turn off after no new operation has occurred for a preset time; when the query operation interface turns off, the visual effects of the display device and/or light strip are no longer displayed, or their brightness is reduced, and when the query operation interface is operated again, the visual effects are displayed again.
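The timeout behaviour described above can be sketched with a monotonic clock; the 30-second value and the class name are illustrative, since the patent only specifies "a preset time".

```python
import time

IDLE_TIMEOUT_S = 30.0  # illustrative; the patent only says "a preset time"

class EffectTimeout:
    """Keep the display/strip effects active only while the query
    interface has seen a user operation within the last timeout."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock
        self._last_op = clock()

    def on_user_operation(self) -> None:
        # Any new operation on the query interface resets the timer.
        self._last_op = self._clock()

    def effects_active(self) -> bool:
        # Effects stay on (or at full brightness) until the timeout.
        return (self._clock() - self._last_op) < IDLE_TIMEOUT_S
```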
According to this embodiment of the robot control method, when the robot executes a task, the display device displays the execution process while the light strip displays a visual effect according to the movement state of the robot caused by executing the task. Different display-device and light-strip visual effects are thus presented in different situations, feeding back instruction response and execution information to visitors and improving their service experience.
Example two:
an embodiment of the present application provides a robot control method that includes step S101 of the first embodiment and further develops it; where this embodiment is the same as or similar to the first embodiment, reference may be made to the related description there, and details are not repeated here.
Referring to fig. 6 to 9, the second embodiment further includes steps S202 to S207 in addition to the steps of the first embodiment.
Specifically, the robot control method in the present embodiment includes:
step S201, when the robot executes a task, displaying an execution process on the display device, and displaying a visual effect on the light strip according to a movement state of the robot caused by the execution of the task.
Step S201 is the same as step S101 in the first embodiment.
In one embodiment, the robot control method of the present application further includes:
step S202, system time is acquired.
And step S203, when the system time is within a preset dimming time range, adjusting the display brightness of the display device and the display brightness of the lamp strip to a target brightness corresponding to a time period within the preset dimming time range.
In one embodiment, the system time may be the time of an internal timer of the robot or the time that the robot synchronizes over the network. The preset dimming time range may be set to two ranges: a daytime dimming range and a night dimming range. The correspondence between the preset dimming time range, the time period, and the target brightness is as follows:
Preset dimming time range | Time period | Target brightness (cd/m²)
Daytime dimming range | 6:00-18:00 | 250
Night dimming range | 18:00-6:00 | 100

TABLE 13
When the system time is within the daytime dimming range, the display brightness of the display device and light strip is adjusted to the higher target brightness; when it is within the night dimming range, the brightness is adjusted to the lower target brightness, so that the light strip remains distinguishable under different lighting conditions.
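The day/night ranges of Table 13 can be sketched as below. Note that the night range (18:00-6:00) wraps past midnight, so it is easiest to test membership in the daytime range and treat everything else as night; the function and constant names are illustrative, not from the patent.

```python
from datetime import time

DAY_START = time(6, 0)     # start of the daytime dimming range (Table 13)
NIGHT_START = time(18, 0)  # start of the night dimming range

def target_brightness(now: time) -> int:
    """Return the target brightness in cd/m^2 for a given system time.

    The night range wraps past midnight, so rather than testing
    18:00 <= now < 6:00 directly (which is always false), we test
    the daytime range and fall through to night otherwise.
    """
    if DAY_START <= now < NIGHT_START:
        return 250  # daytime dimming range
    return 100      # night dimming range
```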
In one embodiment, the robot control method of the present application further includes:
step S204, obtaining the light intensity;
and step S205, when the light intensity is within a preset dimming intensity range, adjusting the display brightness of the display device and the display brightness of the lamp strip to a target brightness corresponding to the display brightness level within the preset dimming intensity range.
In one embodiment, the light intensity can be obtained from a light sensor built into the robot, or obtained by a light sensor installed in the surrounding environment and sent to the robot. The light intensity here refers to the ambient light intensity around the robot, which may be indoor or outdoor. The display brightness levels of the preset dimming intensity range may be set to four. The correspondence between the display brightness level, the light intensity, and the target brightness is as follows:
(The table relating display brightness level, light intensity, and target brightness appears only as an image in the original publication.)
TABLE 14
As can be seen from the above table, the higher the ambient light intensity, the higher the corresponding display brightness level and target brightness, so that the light strip remains distinguishable under different lighting conditions.
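Since Table 14 survives only as an image, the lux thresholds and brightness values below are purely illustrative; the sketch just shows the graded lookup the text describes.

```python
import bisect

# Illustrative assumptions; Table 14's actual values are not
# reproduced in the text of the publication.
LUX_THRESHOLDS = [100, 500, 2000]       # upper bounds of levels 1-3 (lux)
LEVEL_BRIGHTNESS = [60, 120, 180, 250]  # target brightness (cd/m^2), levels 1-4

def brightness_for_lux(lux: float) -> int:
    """Map ambient light intensity to a target display brightness
    via a graded threshold lookup."""
    level = bisect.bisect_right(LUX_THRESHOLDS, lux)
    return LEVEL_BRIGHTNESS[level]
```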
The control modes of steps S202 to S203 and of steps S204 to S205 may be used alone or together. When both are used and the resulting target brightnesses differ, the target brightness from one of the two modes may be selected to set the display brightness of the light strip. These two brightness-control modes apply to all of the visual effects described above, including the visual effects of the display device and those of the light strip.
In addition, when a visual effect that is itself set through display-brightness changes, such as the remaining-distance or moving-speed effect, conflicts with the brightness adjusted by the above two modes, one of the brightness settings may be selected as the priority setting as needed.
The two actions of acquiring the system time in step S202 and acquiring the light intensity in step S204 are parallel and have no fixed order. Both steps are performed continuously to obtain the real-time system time and light intensity.
The light strip visual effect further includes an idle visual effect, and the robot control method in this embodiment further includes:
and step S206, when the light strip is in an idle state, controlling the light strip to display an idle visual effect.
In one embodiment, the idle state may be a state in which no operation instruction has been obtained and no operation instruction is being executed. The idle visual effect may be a static or dynamic display color or display brightness effect. When the robot is provided with a human-computer interaction interface, an idle prompt can also be displayed on the display interface.
The light strip visual effect further includes a fault visual effect, and the robot control method in this embodiment further includes:
and step S207, when a fault is detected, controlling the light strip to display a fault visual effect.
In one embodiment, the fault may be a hardware fault, for example an acquisition module, communication module, or action execution module of the robot failing to operate normally, or a software fault, for example a module being unable to connect to the cloud database for queries. The fault visual effect can be a static display color or brightness effect, such as a steady red light, or a dynamic one, such as rapid flashing, so as to alert nearby maintenance personnel. When the robot is provided with a human-computer interaction interface, the display interface can show a fault message, and the operation interface can be locked to disallow user operations and avoid triggering further faults.
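A minimal sketch of the fault handling just described, assuming hypothetical `strip` and `display` interfaces (the patent does not define an API):

```python
class FaultHandler:
    """On a detected fault: show a fault effect on the light strip,
    display the fault text, and lock the touch interface."""

    def __init__(self, strip, display):
        self.strip = strip      # hypothetical light-strip interface
        self.display = display  # hypothetical display-device interface

    def on_fault(self, description: str) -> None:
        # Fast red flashing to alert nearby maintenance personnel.
        self.strip.set_effect(color="red", mode="fast_flash")
        # Show the fault text and lock the operation interface so no
        # further user operations can trigger additional faults.
        self.display.show_text(f"FAULT: {description}")
        self.display.lock_interaction()
```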
According to this embodiment of the robot control method, when the robot executes a task, the display device displays the execution process while the light strip displays a visual effect according to the movement state of the robot caused by executing the task. Different display-device and light-strip visual effects are thus presented in different situations, feeding back instruction response and execution information to visitors and improving their service experience.
Example three:
fig. 10 shows a block diagram of a robot control system 100 provided in an embodiment of the present application, corresponding to the robot control method described in the above embodiments. The control system may be a virtual apparatus within the robot, run by the robot's processor, or may be integrated in the robot itself. For convenience of explanation, only the portions related to the embodiments of the present application are shown.
In the robot control system 100 of this embodiment of the application, a display device and a light strip are mounted on the body of the robot, and the system includes:
the first display module 1 is used for displaying an execution process on the display device when the robot executes a task;
and the second display module 2 is used for displaying a visual effect on the lamp strip according to the moving state of the robot caused by the execution of the task when the robot executes the task.
Referring to fig. 11, in an embodiment, the robot control system 100 further includes:
the operation instruction acquisition module 3 is used for acquiring an operation instruction triggered by a user on the display device and/or the handheld terminal;
the task list module 4 is used for generating a corresponding task according to the operation instruction and storing the task in a preset task list;
and the task execution module 5 is used for acquiring the tasks in the preset task list and executing the tasks.
In application, the operation instruction acquisition module 3, the task list module 4, and the task execution module 5 operate before the execution process is displayed on the display device and before the visual effect is displayed on the light strip according to the movement state of the robot caused by executing the task.
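The pipeline formed by modules 3 to 5 (instruction in, task generated and queued, task fetched and executed) can be sketched as a small queue; the task representation and method names are illustrative assumptions, not the patent's API.

```python
from collections import deque
from typing import Optional

class TaskList:
    """Minimal sketch of modules 3-5: operation instructions become
    tasks in a preset task list, which is then drained in order."""

    def __init__(self) -> None:
        self._tasks: deque = deque()

    def add_from_instruction(self, instruction: str) -> None:
        # Module 4: generate a corresponding task from the operation
        # instruction and store it in the preset task list.
        self._tasks.append({"instruction": instruction, "done": False})

    def execute_next(self) -> Optional[dict]:
        # Module 5: fetch the next task from the list and execute it;
        # "execution" here is just marking the task done.
        if not self._tasks:
            return None
        task = self._tasks.popleft()
        task["done"] = True
        return task
```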
In one embodiment, the first display module 1 is further configured to display, on the display device, a 3D model effect corresponding to a movement state of the robot caused by the execution of the task.
In one embodiment, the first display module 1 is further configured to display progress information of performing the task and result information of performing the task on the display device.
In application, the progress information and the result information are displayed on the display device in combination with an animated visual effect.
In one embodiment, the first display module 1 is further configured to obtain a target location to which the task is to be executed;
the first display module 1 is further used for acquiring the current position in real time;
the first display module 1 is further configured to determine a remaining distance according to the current position and the target position of the robot;
the first display module 1 is further configured to display a reminding effect of executing the task on the display device according to the remaining distance.
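The remaining-distance steps above can be sketched as follows; the patent does not say whether the distance is Euclidean or measured along the planned path, so straight-line distance and the 5-metre reminder threshold are assumptions here.

```python
import math

def remaining_distance(current: tuple, target: tuple) -> float:
    """Straight-line distance (map coordinates, metres) between the
    robot's current position and the task's target position.
    Path-based distance would require the planned route, which the
    patent does not detail, so Euclidean distance is assumed."""
    return math.hypot(target[0] - current[0], target[1] - current[1])

def reminder_needed(current: tuple, target: tuple,
                    threshold_m: float = 5.0) -> bool:
    # threshold_m is an illustrative value, not specified in the patent.
    return remaining_distance(current, target) <= threshold_m
```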
In one embodiment, the movement states of the robot include a straight movement state, a turning movement state, and a stop movement state. The second display module 2 is further configured to control the lamp strip to respectively present different visual effects according to the straight movement state, the turning movement state and the stop movement state of the robot.
In an application, the visual effect includes one or more of a display range, a display color, and a display brightness of the light strip.
In one embodiment, the second display module 2 is further configured to correspondingly control one or more of a display range, a display color, and a display brightness of the light strip according to the moving speed of the robot when the robot is in a straight-moving state or a turning-moving state.
Referring to fig. 11, in an embodiment, the robot control system 100 further includes:
a system time obtaining module 6, configured to obtain system time;
and the time brightness adjusting module 7 is configured to adjust the display brightness of the display device and the display brightness of the light band to a target brightness corresponding to a time period within a preset dimming time range when the system time is within the preset dimming time range.
The light intensity acquisition module 8 is used for acquiring light intensity;
and the light intensity and brightness adjusting module 9 is configured to adjust the display brightness of the display device and the display brightness of the lamp strip to a target brightness corresponding to the brightness level of the preset dimming intensity range when the light intensity is within the preset dimming intensity range.
In the embodiment of the robot control system of the present application, the contents of the first display module 1 and the second display module 2, etc. are already described in the first embodiment and the second embodiment, and are not described again here.
According to this embodiment of the robot control system, when the robot executes a task, the display device displays the execution process while the light strip displays a visual effect according to the movement state of the robot caused by executing the task. Different display-device and light-strip visual effects are thus presented in different situations, feeding back instruction response and execution information to visitors and improving their service experience.
Example four:
as shown in fig. 12, the present application also provides a robot 200 comprising a memory 201, a processor 202 and a computer program 203 stored in said memory and executable on said processor, such as a control program of the robot. The processor 202, when executing the computer program 203, implements the steps in the various robot control method embodiments described above, such as the method steps in embodiment one or embodiment two. The processor 202, when executing the computer program 203, implements the functions of the modules in the above device embodiments, for example, the functions of the modules and units in the third embodiment.
Illustratively, the computer program 203 may be divided into one or more modules, and the one or more modules are stored in the memory 201 and executed by the processor 202 to implement the first, second or third embodiments of the present application. The one or more modules may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 203 in the robot 200. For example, the computer program 203 may be divided into a first display module, a second display module, and the like, and specific functions of the modules are described in the third embodiment, which is not described herein again.
The robot 200 may be a reception robot. The robot may include, but is not limited to, the memory 201 and the processor 202. Those skilled in the art will appreciate that fig. 12 is merely an example of the robot 200 and does not constitute a limitation on the robot 200, which may include more or fewer components than shown, combine certain components, or use different components; for example, the robot may also include input/output devices, network access devices, buses, etc.
The memory 201 may be an internal storage unit of the robot 200, such as a hard disk or a memory of the robot 200. The memory 201 may also be an external storage device of the robot 200, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), or the like, provided on the robot 200. Further, the memory 201 may also include both an internal storage unit and an external storage device of the robot 200. The memory 201 is used for storing the computer program and other programs and data required by the robot. The memory 201 may also be used to temporarily store data that has been output or is to be output.
The Processor 202 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
It is obvious to those skilled in the art that, for convenience and simplicity of description, the division into the foregoing functional units and modules is merely illustrative; in practical applications, the functions may be allocated to different functional units and modules as needed. Each functional unit and module in the embodiments may be integrated in one processing unit, each unit may exist alone physically, or two or more units may be integrated in one unit, and the integrated unit may be implemented in the form of hardware or of a software functional unit.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus/robot and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/robot are merely illustrative, and for example, the division of the modules or units is only a logical division, and there may be other divisions when the actual implementation is performed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units.
All or part of the flow of the methods of the embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium and executed by a processor to instruct related hardware to implement the steps of the method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (10)

1. A robot control method characterized in that a body of the robot is mounted with a display device and a lamp strip, the method comprising:
when the robot executes a task, displaying an execution process on the display device, and displaying a visual effect on the lamp strip according to the moving state of the robot caused by the execution of the task.
2. The robot control method according to claim 1, wherein displaying an execution process on the display device while the robot performs a task and before displaying a visual effect on the light bar according to a movement state of the robot caused by performing the task, comprises:
acquiring an operation instruction triggered by a user on the display device and/or the handheld terminal;
generating a corresponding task according to the operation instruction, and storing the task in a preset task list;
and acquiring the tasks in the preset task list and executing the tasks.
3. The robot control method according to claim 1, wherein the displaying the execution procedure on the display device includes:
displaying, on the display device, a 3D model effect corresponding to a movement state of the robot caused by the execution of the task.
4. The robot control method according to claim 3, wherein the displaying the execution procedure on the display device further comprises:
displaying progress information for executing the task and result information for executing the task on the display device, wherein the progress information and the result information are combined with an animation visual effect to be displayed on the display device.
5. The robot control method according to claim 3, wherein the displaying the execution procedure on the display device further comprises:
acquiring a target position to which the task is executed;
acquiring a current position in real time;
determining a remaining distance according to the current position and the target position of the robot;
and displaying a reminding effect of executing the task on the display device according to the residual distance.
6. The robot control method according to claim 1, wherein the movement state of the robot includes a straight movement state, a turning movement state, and a stop movement state;
the displaying a visual effect in the light strip according to the movement state of the robot caused by the execution of the task includes:
and controlling the lamp strip to respectively present different visual effects according to the straight moving state, the turning moving state and the stopping moving state of the robot.
7. The robot control method according to claim 6, wherein the visual effect includes one or more of a display range, a display color, and a display brightness of a light strip;
the displaying a visual effect in the light strip according to a movement state of the robot caused by the execution of the task, further comprising:
and when the robot is in a straight-moving state or a turning moving state, correspondingly controlling one or more of the display range, the display color and the display brightness of the lamp strip according to the moving speed of the robot.
8. The robot control method of claim 1, further comprising:
acquiring system time;
when the system time is within a preset dimming time range, adjusting the display brightness of the display device and the display brightness of the lamp strip to a target brightness corresponding to a time period of the preset dimming time range;
or, alternatively,
acquiring the light intensity;
and when the light intensity is within a preset dimming intensity range, adjusting the display brightness of the display device and the display brightness of the lamp strip to a target brightness corresponding to the brightness grade of the preset dimming intensity range.
9. A robot control system, characterized in that a display device and a lamp strip are mounted on a body of the robot, the system comprising:
the first display module is used for displaying an execution process on the display device when the robot executes a task;
and the second display module is used for displaying a visual effect on the lamp strip according to the moving state of the robot caused by the execution of the task when the robot executes the task.
10. A robot having a body with a display device and a light strip mounted thereon, the robot comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the robot control method according to any one of claims 1 to 8 when executing the computer program.
CN202011472141.5A 2020-12-15 2020-12-15 Robot control method and system and robot Pending CN112692827A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011472141.5A CN112692827A (en) 2020-12-15 2020-12-15 Robot control method and system and robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011472141.5A CN112692827A (en) 2020-12-15 2020-12-15 Robot control method and system and robot

Publications (1)

Publication Number Publication Date
CN112692827A true CN112692827A (en) 2021-04-23

Family

ID=75507974

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011472141.5A Pending CN112692827A (en) 2020-12-15 2020-12-15 Robot control method and system and robot

Country Status (1)

Country Link
CN (1) CN112692827A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140302931A1 (en) * 2011-11-09 2014-10-09 Marta Isabel santos Paiva Ferraz CONCEIÇÂO Interactive embodied robot videogame through the use of sensors and physical objects
CN206426105U (en) * 2016-10-17 2017-08-22 深圳优地科技有限公司 A kind of new services humanoid robot
US20170330407A1 (en) * 2016-05-13 2017-11-16 Universal Entertainment Corporation Attendant device and gaming machine
CN107578153A (en) * 2017-08-25 2018-01-12 深圳增强现实技术有限公司 Machine operates the workshop configuring management method of monitor system
CN110712221A (en) * 2019-10-14 2020-01-21 北京云迹科技有限公司 Robot state indicating system and indicating method, electronic equipment and robot
CN210704830U (en) * 2019-01-04 2020-06-09 北京三快在线科技有限公司 Distribution robot
CN111347441A (en) * 2020-03-27 2020-06-30 广州美术学院 Children accompany robot

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140302931A1 (en) * 2011-11-09 2014-10-09 Marta Isabel santos Paiva Ferraz CONCEIÇÂO Interactive embodied robot videogame through the use of sensors and physical objects
US20170330407A1 (en) * 2016-05-13 2017-11-16 Universal Entertainment Corporation Attendant device and gaming machine
CN206426105U (en) * 2016-10-17 2017-08-22 深圳优地科技有限公司 A kind of new services humanoid robot
CN107578153A (en) * 2017-08-25 2018-01-12 深圳增强现实技术有限公司 Machine operates the workshop configuring management method of monitor system
CN210704830U (en) * 2019-01-04 2020-06-09 北京三快在线科技有限公司 Distribution robot
CN110712221A (en) * 2019-10-14 2020-01-21 北京云迹科技有限公司 Robot state indicating system and indicating method, electronic equipment and robot
CN111347441A (en) * 2020-03-27 2020-06-30 广州美术学院 Children accompany robot

Similar Documents

Publication Publication Date Title
US10743134B2 (en) System and method for providing dynamic supply positioning for on-demand services
US10206268B2 (en) Interlaced data architecture for a software configurable luminaire
US11629823B2 (en) Wireless lighting control system
US11908026B2 (en) Determining user interface information based on location information
US20210097905A1 (en) Vehicle with context sensitive information presentation
AU2017337335B2 (en) Dynamically modifiable user interface
CN104798109A (en) Modifying virtual object display properties
CN204374996U (en) A kind of hotel CUSTOM HOUSE HOTEL
WO2016054122A1 (en) Displaying content on a display in power save mode
CN103634978A (en) Lighting control system
CN106462411A (en) User interface for application and device
US6940528B2 (en) Display-service providing system and image display apparatus
CN102385482A (en) Methods and apparatuses for enhancing wallpaper display
CN112692827A (en) Robot control method and system and robot
US20220044172A1 (en) Method, system and terminal device for operation management of aircrew
CN117238236A (en) Display control method, system, medium and equipment based on lamp strip
US10460478B2 (en) System comprising providing means for providing data to a user
CN109905614A (en) Video Controller and its with carrying control method, display system and storage medium
CN107426898A (en) A kind of super electricity consumption management-control method of business combined with path navigation
CN108108246A (en) A kind of terrain scheduling method for airborne Synthetic vision
US7525558B2 (en) Display-service providing system and image display apparatus
CN202976705U (en) Intelligent control system for electronic bus guideboard
CN106249889A (en) Infrared gesture recognition system and control method thereof and wearable device, identification equipment
WO2020181529A1 (en) Display screen configuration method, apparatus and system
US20220404954A1 (en) Equipment management apparatus and equipment management screen generating method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210423

RJ01 Rejection of invention patent application after publication