CN114347061A - Atmosphere lamp setting method and setting device for robot and robot - Google Patents


Publication number
CN114347061A
CN114347061A (application CN202210082375A)
Authority
CN
China
Prior art keywords: action, robot, input, target, atmosphere lamp
Prior art date
Legal status
Pending
Application number
CN202210082375.1A
Other languages
Chinese (zh)
Inventor
蔡汉嘉
王大鹏
奉飞飞
Current Assignee
Midea Group Co Ltd
Midea Group Shanghai Co Ltd
Original Assignee
Midea Group Co Ltd
Midea Group Shanghai Co Ltd
Priority date
Filing date
Publication date
Application filed by Midea Group Co Ltd, Midea Group Shanghai Co Ltd filed Critical Midea Group Co Ltd

Abstract

The application relates to the technical field of robots and provides an atmosphere lamp setting method, an atmosphere lamp setting device, and a robot. The atmosphere lamp setting method comprises: determining at least one target action of the robot and a target color corresponding to the target action; and acquiring the current action of the robot and, if the current action corresponds to the target action, lighting the atmosphere lamp of the robot in the target color. Because the target action and its corresponding target color are either randomly generated by the system or set by the user, the atmosphere lamp of the robot is lit in the target color whenever the robot executes the target action. This greatly increases the usability of the robot's atmosphere lamp, broadens its applicable scenes, and improves the user experience.

Description

Atmosphere lamp setting method and setting device for robot and robot
Technical Field
The application relates to the technical field of robots, and in particular to an atmosphere lamp setting method and setting device for a robot, and to the robot itself.
Background
In the related art, a robot can create a specific atmosphere effect through its actions and atmosphere lamp colors, enriching people's lives. However, existing robots can create only a few types of atmosphere, cannot achieve suitable atmosphere effects for different scenes, and therefore offer a poor user experience.
Disclosure of Invention
The present application is directed to solving at least one of the problems in the prior art. To that end, it proposes an atmosphere lamp setting method for a robot that increases the types of atmosphere the robot can create, broadens the robot's applicable scenes, and improves the user experience.
The application also provides an atmosphere lamp setting device of the robot.
The application also provides a robot.
The application also provides an electronic device.
The present application also proposes a non-transitory computer-readable storage medium.
The atmosphere lamp setting method of the robot according to the embodiment of the first aspect of the application comprises the following steps:
determining at least one target action of the robot and a target color corresponding to the target action;
and acquiring the current action of the robot, and if the current action corresponds to the target action, lighting the atmosphere lamp of the robot in the target color.
According to the atmosphere lamp setting method of the robot, the target action and its corresponding target color are randomly generated by the system or set by the user, and the atmosphere lamp of the robot is lit in the target color when the robot executes the target action. This greatly increases the usability of the robot's atmosphere lamp, broadens its applicable scenes, and improves the user experience.
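The two claimed steps can be sketched as a minimal mapping from target actions to target colors. All names here (`ACTION_COLORS`, `set_target`, `on_current_action`) are illustrative assumptions, not the patent's terminology or API:

```python
# Minimal sketch of the two claimed steps; names are illustrative only.

ACTION_COLORS: dict[str, str] = {}  # target action -> target color

def set_target(action: str, color: str) -> None:
    # Step 1: determine a target action and its corresponding target
    # color (randomly generated by the system or set by the user).
    ACTION_COLORS[action] = color

def on_current_action(current_action: str):
    # Step 2: acquire the robot's current action; if it corresponds to
    # a target action, return the target color to light the lamp in.
    # None means the lamp state is left unchanged.
    return ACTION_COLORS.get(current_action)

set_target("turn left", "green")
print(on_current_action("turn left"))   # prints "green"
print(on_current_action("walk back"))   # prints "None"
```

The lookup returning `None` for an unregistered action models the method's condition that the lamp only changes when the current action corresponds to a target action.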
According to an embodiment of the application, before said determining at least one target action of the robot and a target color corresponding to said target action, the method further comprises:
receiving a first input of a user to the atmosphere lamp self-defining interface, wherein the first input is used for determining at least one target action of the robot and a target color corresponding to the target action.
According to an embodiment of the present application, the atmosphere lamp customization interface comprises a blank action color pair, wherein the blank action color pair comprises a blank action bar and a blank color bar corresponding to the blank action bar;
the determining at least one target action of the robot and a target color corresponding to the target action includes:
receiving a second input of the user to the blank action bar;
determining the target action in response to the second input;
receiving a third input of the user to the blank color bar;
in response to the third input, the target color is determined.
According to an embodiment of the application, the receiving a second input of the user to the blank action bar and the determining the target action in response to the second input include: receiving a first sub-input of the user to the blank action bar; displaying a plurality of alternative actions in response to the first sub-input; receiving a second sub-input of the user selecting a target action from the plurality of alternative actions; and determining the target action in response to the second sub-input.
The receiving a third input of the user to the blank color bar and the determining the target color in response to the third input include: receiving a third sub-input of the user to the blank color bar; displaying a palette in response to the third sub-input; receiving a fourth sub-input of the user selecting a target color from the palette; and determining the target color in response to the fourth sub-input.
According to an embodiment of the application, before the receiving the first input of the atmosphere lamp customization interface from the user, the method further comprises:
receiving a fourth input of the atmosphere lamp setting interface from the user;
and responding to the fourth input, and displaying the atmosphere lamp self-defining interface.
According to one embodiment of the application, the atmosphere lamp setting interface comprises a plurality of machine selection action color pairs, and the machine selection action color pairs comprise machine selection actions and machine selection colors corresponding to the machine selection actions;
the atmosphere lamp self-defining interface comprises a blank action color pair and the plurality of machine selection action color pairs;
the receiving a first input of a user to the atmosphere lamp self-defining interface comprises:
and receiving a first input of the user to the blank action color pair or receiving a first input of the user to the machine selection action color pair.
According to an embodiment of the present application, after the displaying the ambience light customization interface, the method further comprises:
receiving a fifth input of the user to the atmosphere lamp self-defined interface;
in response to the fifth input, displaying an atmosphere lamp setting interface, wherein the atmosphere lamp setting interface comprises a plurality of machine selection action color pairs, and the machine selection action color pairs comprise machine selection actions and machine selection colors corresponding to the machine selection actions;
and the machine selection action color pair is the same as the machine selection action color pair displayed last time, or the machine selection action color pair is a re-randomly generated action color pair.
According to one embodiment of the application, the method further comprises determining an action sequence to be performed, the action sequence comprising a plurality of target actions;
the acquiring a current action of the robot, and if the current action corresponds to the target action, lighting an atmosphere lamp of the robot in the target color, including:
and detecting the current action of the robot in real time, and lighting the atmosphere lamp by using a target color corresponding to the current action.
The atmosphere lamp setting device of the robot according to the embodiment of the second aspect of the application comprises:
the robot comprises a first acquisition module, a second acquisition module and a control module, wherein the first acquisition module is used for determining at least one target action of the robot and a target color corresponding to the target action;
and the first processing module is used for acquiring the current action of the robot, and if the current action corresponds to the target action, the atmosphere lamp of the robot is lightened by the target color.
A robot according to an embodiment of the third aspect of the present application, comprising:
an atmosphere lamp for lighting in different colors;
an action module for making different actions;
and the control device is connected with the atmosphere lamp and the action module and is used for executing the atmosphere lamp setting method of the robot.
An electronic device according to an embodiment of the fourth aspect of the present application comprises a processor, a memory and a program or instructions stored on the memory and executable on the processor, which program or instructions, when executed by the processor, implement the steps of the atmosphere light setting method of the robot.
A readable storage medium according to an embodiment of the fifth aspect of the application, on which a program or instructions are stored which, when executed by a processor, carry out the steps of the atmosphere lamp setting method of the robot.
A computer program product according to an embodiment of the sixth aspect of the application comprises a computer program which, when being executed by a processor, carries out the steps of the atmosphere lamp setting method of the robot.
One or more technical solutions in the embodiments of the present application have at least one of the following technical effects:
by setting a target action and a target color corresponding to the target action, the atmosphere lamp of the robot can be lit in the target color when the robot executes the target action, which greatly increases the usability of the robot's atmosphere lamp, broadens its applicable scenes, and improves the user experience.
Furthermore, by customizing action color pairs or randomly regenerating machine selection action color pairs, the user can flexibly associate robot actions with atmosphere lamp colors, enriching the robot's usage scenes and improving the user experience.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed for the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and that those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic flowchart of an atmosphere lamp setting method for a robot according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of an atmosphere lamp customization interface of a robot according to an embodiment of the present application;
FIG. 3 is a second schematic view of an atmosphere lamp customization interface of a robot according to an embodiment of the present application;
FIG. 4 is a third schematic diagram of an atmosphere lamp customization interface of a robot according to an embodiment of the present application;
FIG. 5 is a fourth schematic view of an atmosphere lamp customization interface of a robot according to an embodiment of the present disclosure;
FIG. 6 is a fifth schematic view of an atmosphere lamp customization interface of a robot according to an embodiment of the present application;
FIG. 7 is a sixth schematic view of an atmosphere lamp customization interface of a robot according to an embodiment of the present application;
FIG. 8 is a seventh schematic view of an atmosphere lamp customization interface of a robot according to an embodiment of the present application;
FIG. 9 is an eighth schematic view of an atmosphere lamp customization interface of a robot according to an embodiment of the present application;
FIG. 10 is a ninth illustration of an atmosphere lamp customization interface of a robot according to an embodiment of the present application;
FIG. 11 is a tenth schematic diagram of an atmosphere lamp customization interface of a robot according to an embodiment of the present application;
FIG. 12 is an eleventh schematic diagram illustrating an atmosphere lamp customization interface of a robot according to an embodiment of the present disclosure;
FIG. 13 is a twelfth schematic view of an atmosphere lamp customization interface of a robot according to an embodiment of the present application;
FIG. 14 is a schematic view of an atmosphere lamp setting interface of a robot according to an embodiment of the present disclosure;
FIG. 15 is a schematic diagram of an atmosphere lamp customization interface of a robot according to an embodiment of the present application;
FIG. 16 is a second schematic view of an atmosphere lamp setting interface of a robot according to an embodiment of the present disclosure;
FIG. 17 is a third schematic view of an atmosphere lamp setting interface of a robot according to an embodiment of the present disclosure;
FIG. 18 is a detailed flowchart of an atmosphere lamp setting method of a robot according to an embodiment of the present disclosure;
FIG. 19 is a schematic structural diagram of an atmosphere lamp setting device of a robot according to an embodiment of the present disclosure;
fig. 20 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. It should be appreciated that data so designated may be interchanged under appropriate circumstances, so that embodiments of the application can be practiced in sequences other than those illustrated or described herein. Moreover, "first", "second", and the like do not limit quantity: a "first" object may be one or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and succeeding objects.
The embodiments of the present application are described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
The atmosphere lamp setting method of the robot can be applied to a terminal, and can specifically be executed by hardware or software in the terminal. The execution subject of the method may be the terminal, or a control device of the terminal, and the like.
The terminal includes, but is not limited to, a mobile service robot or another intelligent robot, such as an emotional robot, that has a touch-sensitive surface (e.g., a touch screen display and/or a touch pad) and a lighting device (e.g., an atmosphere lamp and/or an indicator light). It should also be understood that in some embodiments the terminal may not be an intelligent robot, but rather another smart device having a touch-sensitive surface (e.g., a touch screen display and/or a touch pad) and a lighting device (e.g., an atmosphere lamp and/or an indicator light).
In the following various embodiments, a terminal including a display and a touch-sensitive surface is described. However, it should be understood that the terminal may include one or more other physical user interface devices such as a physical keyboard, mouse, and joystick.
In the atmosphere lamp setting method for a robot provided in the embodiments of the present application, the execution subject may be an intelligent device, or a functional module or functional entity in the intelligent device capable of implementing the method. The intelligent devices mentioned in the embodiments of the present application include, but are not limited to, mobile service robots, emotional robots, and the like. The method is described below with the intelligent device as the execution subject.
The atmosphere lamp setting method of the robot can be applied to various scenes, such as dance meetings, banquet welcomes or action early warning and the like. Under different scenes, the atmosphere lamp setting method of the robot is used for setting different association relations between the robot action and the atmosphere lamp of the robot.
As shown in fig. 1, the atmosphere lamp setting method of the robot includes: step 110 and step 120.
The atmosphere lamp of the robot in this embodiment may be a light emitting device that can be lit in different colors, and the atmosphere lamp of the robot may display different lighting effects according to the scene needs.
The setting method in the present embodiment includes setting of the association relationship between the robot action and the atmosphere light color of the robot, which will be described in detail below.
Step 110, determining at least one target action of the robot and a target color corresponding to the target action.
Wherein the mood light of the robot is lit in a target color when the robot performs the target action.
In this step, the system randomly generates, or the user defines, at least one target action of the robot and a target color corresponding to the target action.
The target motion of the robot may be different motions made by the robot according to user settings, such as forward walking, backward walking, left turning, right turning, and the like.
The target color can be a color of the robot atmosphere lamp which is correspondingly lightened according to different actions of the robot.
In the present application, the target action and the target color corresponding to the target action are randomly generated by the system or set by the user, so that the robot can make different actions on different occasions while displaying the atmosphere lamp color corresponding to each action, which can greatly improve the user experience.
And 120, acquiring the current action of the robot, and if the current action corresponds to the target action, lighting the atmosphere lamp of the robot in the target color.
In this step, the operation of the robot is detected, and if the operation corresponds to the target operation, the atmosphere lamp of the robot is turned on in the target color.
The current motion of the robot is the motion the robot is performing.
And if the current action of the robot corresponds to the action randomly generated by the system or the user-defined action, the atmosphere lamp of the robot correspondingly lights the target color.
Target actions correspond to target colors one-to-one; when the robot makes different actions, the atmosphere lamp lights up in the color corresponding to each action.
For example, at a dance, the robot may perform actions such as rotating, raising its hands, backing up, and lowering its head, and the atmosphere lamp may light up in colors such as green, yellow, pink, and red according to the robot's action: rotating corresponds to green, raising hands to yellow, backing up to pink, and lowering the head to red, which greatly increases coherence and entertainment value.
As another example, when welcoming guests at a banquet, the robot may raise its hands, advance, or retreat, and the atmosphere lamp may correspondingly light up in blue, green, or red: raising hands corresponds to blue, advancing to green, and retreating to red, which likewise increases coherence and entertainment value.
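The two example scenes above can be written as action color tables; the scene names and the `lamp_color` helper are illustrative assumptions, not from the patent:

```python
# Hypothetical per-scene action color tables for the two examples.
DANCE = {"rotate": "green", "raise hands": "yellow",
         "back up": "pink", "lower head": "red"}
BANQUET_WELCOME = {"raise hands": "blue", "advance": "green", "retreat": "red"}

def lamp_color(scene: dict, current_action: str):
    # Return the scene's color for this action, or None (lamp unchanged)
    # if the action has no target color in this scene.
    return scene.get(current_action)
```

Switching scenes then amounts to swapping which table is consulted, which is how the method supports different association relations for different occasions.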
In the related art, the atmosphere lamp color of a robot can only be defined as a whole and cannot be changed flexibly; the robot can create only a few types of atmosphere and cannot meet the lighting-color needs of different scenes. As a result, the usability of the atmosphere lamp is low, its functions are wasted, and the user experience is poor.
In the present application, the target action and its corresponding target color are randomly generated by the system or set by the user, so that the atmosphere lamp of the robot is lit in the target color when the robot executes the target action. This greatly increases the usability of the robot's atmosphere lamp, broadens its applicable scenes, and improves the user experience.
In some embodiments, the atmosphere lamp setting method of the robot may further include the steps of: a sequence of actions to be performed is determined.
In this step, an action sequence of a plurality of target actions to be performed by the robot is determined.
Wherein the sequence of actions may be an order of execution of a plurality of target actions.
The robot sequentially executes a plurality of target motions according to the motion sequence.
The robot detects the current action in real time and lights the atmosphere lamp according to the target color corresponding to the current action.
In this step, the system may detect a currently performed motion of the robot when the robot performs the motion, and if the current motion of the robot corresponds to the target motion, the atmosphere lamp of the robot may be lit in a corresponding color.
When the robot performs an action sequence of a combination of a plurality of target actions, the robot atmosphere lamps may be sequentially lit in corresponding colors.
In this embodiment, by making the robot's action execution sequence correspond to the lighting sequence of the atmosphere lamp colors, the association between the robot's actions and the atmosphere lamp colors is strengthened, the usability of the atmosphere lamp is increased, more application scenes are supported, and the user experience is improved.
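Executing an action sequence while lighting the lamp per action can be sketched as a simple loop; `run_sequence` and its return value are illustrative assumptions, and the dictionary lookup stands in for real-time action detection:

```python
# Sketch of running an action sequence against a set of target
# action-color pairs, recording the lamp colors lit along the way.
def run_sequence(sequence, action_colors):
    lit = []
    for action in sequence:                # robot performs actions in order
        color = action_colors.get(action)  # stand-in for real-time detection
        if color is not None:
            lit.append(color)              # lamp lights in the target color
    return lit

colors = run_sequence(["rotate", "back up", "rotate"],
                      {"rotate": "green", "back up": "pink"})
```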
In some embodiments, prior to step 110, the atmosphere light setting method of the robot further comprises: and receiving a first input of the atmosphere lamp self-defining interface from a user, wherein the first input is used for determining at least one target action of the robot and a target color corresponding to the target action.
In this embodiment, the atmosphere lamp customization interface is one of the robot's setting interfaces.
The user makes the first input on the atmosphere lamp customization interface of the terminal, where the robot's actions and the corresponding atmosphere lamp colors can be configured.
In this embodiment, the motion of the robot and the color of the atmosphere lamp of the robot corresponding to the motion of the robot are a pair of motion colors of the robot.
The action color pair of the robot may represent a correlation between the action of the robot and the color of the atmosphere light of the robot.
The atmosphere lamp custom interface refers to a user custom interface for setting the robot action color pair in the robot setting interface, and a user can set the association relationship between the robot action and the atmosphere lamp color of the robot corresponding to the robot action in the atmosphere lamp custom interface in a related manner.
In this step, as shown in fig. 2, the first input is used for setting the action color pair of the robot on the atmosphere lamp custom interface by the user.
Wherein the first input may be expressed in at least one of the following ways:
first, the first input may be represented as a touch operation, including but not limited to a click operation, a slide operation, a press operation, and the like.
In this embodiment, receiving the first input of the user may be represented by receiving a touch operation of the user on a display area of a display screen of the terminal.
To reduce accidental operations by the user, the active area of the first input may be limited to a specific area, such as the upper middle area of the robot action bar; alternatively, while the atmosphere lamp customization interface is displayed, a target control may be displayed on the current interface, and the first input is realized by touching the target control; or the first input may be set as a continuous multi-tap operation on the display area within a target time interval.
Second, the first input may be represented as a physical key input.
In the embodiment, the body of the terminal is provided with physical keys for movement and confirmation, and the robot target action and the atmosphere lamp target color are selected by operating the keys and moving a cursor to the robot action or the atmosphere lamp color bar on the atmosphere lamp custom interface.
Receiving a first input of a user, which may be expressed as receiving a first input of the user pressing a corresponding entity key; the first input may also be a combined operation of pressing a plurality of physical keys simultaneously.
Third, the first input may be represented as a voice input.
In this embodiment, the terminal may trigger the alternative action list interface when receiving a voice such as "set robot action"; or when the atmosphere lamp color corresponding to the robot action is set, the color palette interface is triggered.
Of course, in other embodiments, the first input may also be in other forms, including but not limited to character input, and the like, which may be determined according to actual needs, and this is not limited in this application.
The atmosphere lamp custom interface can be expressed in two ways:
first, the atmosphere lamp self-defining interface may include a blank action color pair, where the blank action color pair includes a blank action bar and a blank color bar corresponding to the blank action bar.
In some embodiments, as shown in FIG. 2, the ambience light customization interface includes a blank action bar and a blank color bar corresponding to the blank action bar.
Wherein, the action color pair comprises a robot action and an atmosphere lamp color corresponding to the robot action.
The blank action color pair may be a robot action bar blank, and an atmosphere lamp color bar corresponding to the robot action blank.
In this embodiment, the robot action and the corresponding atmosphere lamp color in the atmosphere lamp customization interface are both blank, and the user can set them from scratch, which makes the setup more flexible.
In some embodiments, the setting of the robot action may include: the terminal receives a second input of the user to the blank action bar; the terminal determines a target action in response to the second input.
In this embodiment, the second input is used for the user to select a target action in the ambience light customization interface.
Wherein the second input may be expressed in at least one of the following ways:
first, the second input may be represented as a touch operation including, but not limited to, a click operation, a slide operation, a press operation, and the like.
In this embodiment, receiving the second input of the user may be represented by receiving a touch operation of the user on a display area of a display screen of the terminal.
To reduce accidental operations by the user, the active area of the second input may be limited to a specific area, such as the upper middle area of the robot action bar; alternatively, while the atmosphere lamp customization interface is displayed, a target control may be displayed on the current interface, and the second input is realized by touching the target control; or the second input may be set as a continuous multi-tap operation on the display area within a target time interval.
Second, the second input may be represented as a physical key input.
In the embodiment, the body of the terminal is provided with physical keys for movement and confirmation, and the cursor is moved to the robot action bar by operating the keys on the atmosphere lamp custom interface, so that the selection of the target action of the robot is realized.
Receiving a second input from the user, which may be expressed as receiving a second input from the user pressing the corresponding physical key; the second input may also be a combined operation of pressing a plurality of physical keys simultaneously.
Third, the second input may be presented as a speech input.
In this embodiment, the terminal may trigger the display of the action setting interface upon receiving a voice such as "open action setting interface".
Of course, in other embodiments, the second input may also be represented in other forms, including but not limited to character input, and the like, which may be determined according to actual needs, and this is not limited in this application.
The user sets the target action of the robot in the atmosphere lamp customization interface as follows:
the terminal receives a first sub-input of a user to the blank action bar.
In this embodiment, as shown in fig. 2, the blank action bar is a blank bar adjacent to the robot action bar; it indicates that no robot action has been set.
In this step, the first sub-input is used by the user to call up the alternative action list interface within the atmosphere lamp customization interface.
The alternative action list may include all actions the robot can perform, such as walking forward, walking backward, turning left, and turning right.
Because the list shows every action the robot can perform, the user can select actions intuitively according to different needs.
In response to the first sub-input, the terminal displays the plurality of alternative actions.
In this step, the terminal displays the alternative action list according to the user's first sub-input, so that the user can intuitively see all actions the robot can perform.
A second sub-input, by which the user selects the target action from the plurality of alternative actions, is received.
In this step, as shown in fig. 3, the second sub-input is used by the user to select a target action from the alternatives in the alternative action list.
The target action may be any of the alternative actions, such as "turn left".
The robot then performs the corresponding action according to the target action determined by the user.
In response to the second sub-input, determining a target action;
in this step, as shown in fig. 4, the robot action bar of the atmosphere lamp customization interface displays the target action determined by the terminal based on the user's second sub-input, for example "turn left".
In this embodiment, selecting the robot's target action from the alternative action list lets the user intuitively see all actions the robot can perform, simplifies the setting procedure, and improves the user experience.
In some embodiments, the setting of the target color may include: receiving a third input from the user on the blank color bar; and determining the target color in response to the third input.
In this embodiment, the third input is used by the user to select the target color of the atmosphere lamp on the atmosphere lamp customization interface.
Wherein the third input may be expressed in at least one of the following ways:
first, the third input may be represented as a touch operation including, but not limited to, a click operation, a slide operation, a press operation, and the like.
In this embodiment, the receiving of the third input by the user may be represented by receiving a touch operation of the user on a display area of a display screen of the terminal.
To reduce the rate of user misoperation, the active area of the third input may be limited to a specific region, such as the upper middle area of the atmosphere lamp color setting bar; alternatively, while the atmosphere lamp customization interface is displayed, a target control may be shown on the current interface, and touching that control constitutes the third input; or the third input may be defined as a continuous multi-tap operation on the display area within a target time interval.
Second, the third input may be represented as a physical key input.
In this embodiment, the terminal body is provided with physical keys for moving and confirming; on the atmosphere lamp customization interface, the user operates these keys to move a cursor to the corresponding atmosphere lamp color bar and thereby select the target color of the atmosphere lamp.
Receiving the third input from the user may thus mean receiving a press of the corresponding physical key; the third input may also be a combined operation of pressing several physical keys simultaneously.
Third, the third input may be presented as a speech input.
In this embodiment, the terminal may trigger the display of the color selection interface upon receiving a voice such as "open color selection interface".
Of course, in other embodiments, the third input may also be represented in other forms, including but not limited to character input, and the like, which may be determined according to actual needs, and this is not limited in this application.
The procedure by which the user sets the target color of the robot's atmosphere lamp on the atmosphere lamp customization interface is as follows:
the terminal receives a third sub-input from the user on the blank color bar.
In this embodiment, the blank color bar is a blank bar adjacent to the atmosphere lamp color bar, which itself corresponds to the robot action bar.
A blank color bar indicates that the robot's atmosphere lamp color has not been set.
In this step, as shown in figs. 4 and 5, the third sub-input is used by the user to call up, within the atmosphere lamp customization interface, the palette on which the robot's atmosphere lamp color can be set.
In the palette, the user can customize the atmosphere lamp color by adjusting its hue, saturation, and brightness.
The robot's atmosphere lamp color may be any color in the palette.
In this step, setting the lamp color through a palette greatly enriches the available color choices and broadens the applicable scenarios of the atmosphere lamp.
In response to the third sub-input, the palette is displayed.
In this step, the terminal displays the palette according to the user's third sub-input on the atmosphere lamp customization interface, so that the user can choose the lamp color from a much wider range, greatly broadening the applicable scenarios of the atmosphere lamp.
A fourth sub-input, by which the user selects the target color in the palette, is received.
In this step, as shown in fig. 5, the fourth sub-input may be used by the user to confirm the target color on the palette.
The target color may be any color the user selects after adjusting the palette's color parameters, such as "color 1".
The target color is determined in response to the fourth sub-input.
In this step, as shown in fig. 6, the atmosphere lamp color bar of the customization interface displays the target color determined by the terminal based on the user's fourth sub-input, for example "color 1".
In this embodiment, selecting the robot's target color from the palette greatly enriches the color choices of the atmosphere lamp, broadens its applicable scenarios, and thereby improves the user experience.
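The palette described above exposes hue, saturation, and brightness. As an illustration only — the function name and value ranges are assumptions, not taken from the patent — such an HSB selection could be converted to the RGB value a lamp driver typically expects:

```python
import colorsys

def palette_to_rgb(hue: float, saturation: float, brightness: float) -> tuple:
    """Convert an HSB palette selection (each component in [0, 1]) to 8-bit RGB."""
    r, g, b = colorsys.hsv_to_rgb(hue, saturation, brightness)
    return round(r * 255), round(g * 255), round(b * 255)

# Pure red at full saturation and brightness:
print(palette_to_rgb(0.0, 1.0, 1.0))  # (255, 0, 0)
```

Python's standard `colorsys` module does the HSV-to-RGB conversion; the 8-bit scaling is only one plausible convention for addressing LED hardware.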
Second, the atmosphere lamp customization interface includes a blank action color pair and a plurality of machine selection action color pairs.
In some embodiments, as shown in FIG. 7, the atmosphere lamp customization interface may include a blank action color pair and the system's default machine selection action color pairs.
The blank action color pair consists of a blank action bar and a corresponding blank color bar, which together can be used to add a new action color pair.
The machine selection action color pairs carry the default associations between actions and atmosphere lamp colors set by the system before the robot leaves the factory, and the user may delete an existing machine selection action color pair.
In this embodiment, the user may modify an existing machine selection action color pair, or configure a new pair in the blank action bar and its corresponding blank color bar, so the user does not need to set every action color pair from scratch, which simplifies the setting procedure.
In some embodiments, when the user configures the robot's action color pairs on the atmosphere lamp customization interface, at least the following three setting modes are available:
first, a first input from the user on the blank action color pair is received.
In this embodiment, the blank action color pair consists of a blank action bar and its corresponding blank color bar.
The user performs a first input on the blank action color pair, and the terminal, in response, sets the target action of the robot and the target color corresponding to that action.
The procedure by which the user configures the blank action color pair is as follows:
receiving a first sub-input from the user on the atmosphere lamp customization interface; the terminal displays the alternative action list in response to the first sub-input.
In this embodiment, as shown in fig. 7, the user makes the first sub-input on the blank action bar.
As shown in FIG. 8, the atmosphere lamp customization interface calls up the alternative action list based on the user's first sub-input.
The first sub-input is used to call up the alternative action list of robot actions.
Here the alternative action list contains only the actions the robot can perform that are not already present in the robot action bar.
That is, in this setting mode, the alternative action list consists of the actions remaining after subtracting those already listed in the robot action bar from the full set of actions the robot can perform.
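The filtering described in this setting mode — the full set of supported actions minus those already bound to a color — can be sketched as follows; the action names are illustrative, not taken from the patent's figures:

```python
def candidate_actions(supported, assigned):
    """Return the actions the robot supports that are not yet bound to a lamp color."""
    assigned_set = set(assigned)
    return [a for a in supported if a not in assigned_set]

supported = ["go forward", "go backward", "turn left", "turn right", "put down the hand"]
assigned = ["go forward", "turn left"]  # already shown in the robot action bar
print(candidate_actions(supported, assigned))
# ['go backward', 'turn right', 'put down the hand']
```

A list comprehension keeps the original display order of the supported actions, which matters if the alternative action list is rendered in a fixed order.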
A second sub-input from the user on the atmosphere lamp customization interface is received; in response, the target action is determined.
In this embodiment, as shown in fig. 8, the user makes a second sub-input to the alternative action list, selecting a target action such as "put down the hand".
After the user confirms the target action, as shown in fig. 9, the target action, e.g. "put down the hand", is displayed in the robot action bar of the atmosphere lamp customization interface.
A third sub-input from the user on the atmosphere lamp customization interface is received; in response, the palette is displayed.
In this embodiment, as shown in fig. 9, the user makes the third sub-input on the blank color bar.
The third sub-input is used by the user to call up the palette on the atmosphere lamp customization interface.
A fourth sub-input from the user on the palette is received; in response, the target color is determined.
In this embodiment, as shown in fig. 10, the user makes a fourth sub-input on the palette to determine a target color, for example "color 9".
The fourth sub-input is used by the user to confirm the target color on the palette.
In this setting mode, as shown in fig. 11, after the user completes the setting, the formerly blank action bar displays the target action, such as "put down the hand", and the formerly blank color bar displays the corresponding target color, such as "color 9".
In this setting mode, the user adds custom action color pairs on top of the original machine selection action color pairs, which simplifies the operating steps, enriches the robot's action color pairs, and improves the user experience.
Second, a first input from the user on a machine selection action color pair is received.
In this embodiment, the user performs the first input on a factory-default machine selection action color pair of the robot; this setting mode includes at least the following two variants:
one, delete
In the setting mode, the user can delete and input the existing machine selection action color pairs on the atmosphere lamp custom interface.
For example, as shown in FIG. 12, the user makes a first input to "head right turn," deleting the "head right turn" action event, while the system automatically deletes the atmosphere light color "color 8" corresponding to the "head right turn" action.
In this setting, as shown in fig. 13, after the user completes the first input to the atmosphere lamp customization interface, the action color pair of "head-right turn-color 8" in the robot action column and the atmosphere lamp color column is deleted.
Second, changing the atmosphere lamp color.
In this variant, the user can modify the association between an action in an existing machine selection action color pair and its atmosphere lamp color.
For example, the user inputs on the color bar showing "color 1", which corresponds to "go forward", calls up the palette, selects "color 10", and thereby changes the pair "go forward - color 1" to "go forward - color 10".
Similarly, the user inputs on the color bar showing "color 5", which corresponds to "head up", calls up the palette, selects "color 11", and changes the pair "head up - color 5" to "head up - color 11".
In this setting mode, the user directly modifies existing machine selection action color pairs, which further simplifies the operating steps, adapts the robot to different scenarios, and improves the user experience.
Third, first inputs from the user on both the blank action color pair and the machine selection action color pairs are received.
In this embodiment, the user may modify both the blank action color pair and the robot's existing default machine selection action color pairs; this setting mode is a combination of the first two and is not described again here.
In this setting mode, the user can both add custom pairs and directly modify existing machine selection action color pairs, which greatly simplifies the operating steps and further improves the user experience.
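Across these three setting modes, the add, delete, and re-color operations all act on a single action-to-color mapping. A hedged sketch in Python — class and method names are hypothetical, not from the patent:

```python
class AtmosphereLampConfig:
    """Holds the robot's action -> atmosphere-lamp-color pairs (illustrative)."""

    def __init__(self, machine_selected=None):
        # Start from the factory-default (machine selection) pairs, if any.
        self.pairs = dict(machine_selected or {})

    def add_pair(self, action, color):
        """First mode: bind a new action to a lamp color."""
        self.pairs[action] = color

    def delete_pair(self, action):
        """Second mode (deletion): removing the action also removes its color."""
        self.pairs.pop(action, None)

    def change_color(self, action, new_color):
        """Second mode (re-coloring): keep the action, swap its lamp color."""
        if action in self.pairs:
            self.pairs[action] = new_color

cfg = AtmosphereLampConfig({"go forward": "color 1", "head right turn": "color 8"})
cfg.add_pair("put down the hand", "color 9")   # add a custom pair
cfg.delete_pair("head right turn")             # delete, color goes with it
cfg.change_color("go forward", "color 10")     # re-color an existing pair
print(cfg.pairs)
# {'go forward': 'color 10', 'put down the hand': 'color 9'}
```

The third setting mode is simply a sequence of these calls mixing additions with modifications of the default pairs.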
In some embodiments, before the terminal receives the first input from the user on the atmosphere lamp customization interface, the atmosphere lamp setting method of the robot further includes:
the terminal receives a fourth input from the user on the atmosphere lamp setting interface.
The atmosphere lamp setting interface is the factory-provided interface of the robot setting interface for action color pairs; on this interface the user cannot configure the association between robot actions and atmosphere lamp colors.
In this step, as shown in fig. 14, the fourth input is used by the user to call up the atmosphere lamp customization interface from the terminal's atmosphere lamp setting interface.
In response to the fourth input, the atmosphere lamp customization interface is displayed.
In this step, the terminal responds to the user's fourth input on the atmosphere lamp setting interface by calling up the customization interface.
The customization interface may include a blank action color pair alone, or a blank action color pair together with a plurality of machine selection action color pairs.
In some embodiments, as shown in FIG. 2, the atmosphere lamp customization interface may include a blank action color pair.
In some embodiments, as shown in FIG. 15, it may include a blank action color pair and a plurality of machine selection action color pairs.
Wherein the fourth input may be expressed in at least one of the following ways:
first, the fourth input may be represented by a touch operation, including but not limited to a click operation, a slide operation, a press operation, and the like.
In this embodiment, the receiving of the fourth input by the user may be represented by receiving a touch operation of the user on a display area of a display screen of the terminal.
To reduce the rate of user misoperation, the active area of the fourth input may be limited to a specific region, such as the upper middle area of the atmosphere lamp setting interface; alternatively, while the setting interface is displayed, a target control may be shown on the current interface, and touching that control constitutes the fourth input; or the fourth input may be defined as a continuous multi-tap operation on the display area within a target time interval.
Second, the fourth input may be represented as a physical key input.
In this embodiment, the terminal body is provided with physical keys for moving and confirming; operating these keys on the atmosphere lamp setting interface turns on the atmosphere lamp customization switch.
Receiving the fourth input from the user may thus mean receiving a press of the corresponding physical key; the fourth input may also be a combined operation of pressing several physical keys simultaneously.
Third, the fourth input may appear as a voice input.
In this embodiment, the terminal may trigger the display of the atmosphere lamp custom interface when receiving a voice such as "turn on the atmosphere lamp custom interface".
Of course, in other embodiments, the fourth input may also be in other forms, including but not limited to character input, and the like, which may be determined according to actual needs, and this is not limited in this application.
In this embodiment, the user may either keep a default machine selection action color pair on the atmosphere lamp setting interface or customize an action color pair on the atmosphere lamp customization interface. These multiple setting modes make the association between robot actions and atmosphere lamp colors flexible, enrich the robot's usage scenarios, and improve the user experience.
In some embodiments, after displaying the atmosphere lamp customization interface, the atmosphere lamp setting method of the robot further comprises:
and receiving a fifth input of the atmosphere lamp self-defining interface from the user.
In this embodiment, the ambience light customization interface refers to one interface of the robot setting interface.
And the user performs fifth input on the atmosphere lamp user-defined interface of the terminal and calls out an atmosphere lamp setting interface.
In this step, as shown in fig. 15, the fifth input is used for the user to turn off the atmosphere lamp customization interface and call up the atmosphere lamp setting interface on the terminal.
Wherein the fifth input may be expressed in at least one of the following ways:
first, the fifth input may be represented by a touch operation, including but not limited to a click operation, a slide operation, a press operation, and the like.
In this embodiment, the receiving of the fifth input by the user may be represented by receiving a touch operation of the user on a display area of a display screen of the terminal.
To reduce the rate of user misoperation, the active area of the fifth input may be limited to a specific region, such as the upper middle area of the atmosphere lamp customization interface; alternatively, while the customization interface is displayed, a target control may be shown on the current interface, and touching that control constitutes the fifth input; or the fifth input may be defined as a continuous multi-tap operation on the display area within a target time interval.
Second, the fifth input may be represented as a physical key input.
In this embodiment, the terminal body is provided with physical keys for moving and confirming; operating these keys on the atmosphere lamp customization interface turns off the atmosphere lamp customization switch.
Receiving the fifth input from the user may thus mean receiving a press of the corresponding physical key; the fifth input may also be a combined operation of pressing several physical keys simultaneously.
Third, the fifth input may be presented as a voice input.
In this embodiment, the terminal may trigger display of the atmosphere lamp setting interface upon receiving a voice command such as "open the atmosphere lamp setting interface".
Of course, in other embodiments, the fifth input may also be in other forms, including but not limited to character input, and the like, which may be determined according to actual needs, and this is not limited in this application.
In response to the fifth input, the atmosphere lamp setting interface is displayed.
In this step, the terminal responds to the user's fifth input on the customization interface by calling up the atmosphere lamp setting interface.
As shown in fig. 16, the atmosphere lamp setting interface includes a plurality of machine selection action color pairs, each consisting of a machine selection action and the machine selection color corresponding to that action.
In this embodiment, the user performs two inputs: one to turn on the atmosphere lamp customization switch and one to turn it off. Turning the switch on calls up the customization interface from the atmosphere lamp setting interface; turning it off returns from the customization interface to the setting interface. The action color pairs then shown in the setting interface may be the same as, or different from, the system's default pairs. The atmosphere lamp setting interface therefore has at least the following two representations:
and one, the machine selection action color pair is the same as the machine selection action color pair displayed last time.
In this representation, the machine-selected color pair in the second recalled ambience light setting interface is the same as the machine-selected color pair in the last ambience light setting interface.
This kind of expression, the user can carry out the fifth input to atmosphere lamp self-defined interface when not satisfying to the self-defined setting of doing the action colour right, resumes original acquiescence machine selection colour right, sets up for user-defined and increases error correction function, can increase user experience and feel.
Second, the machine selection action color pairs are newly randomly generated.
In this representation, as shown in fig. 16, the machine selection action color pairs in the recalled atmosphere lamp setting interface differ from those shown last time: the contents of the robot action bar and the atmosphere lamp color bar are assigned randomly. Alternatively, as shown in fig. 17, the actions remain the same as in the previous setting interface, but the atmosphere lamp colors corresponding to those actions differ.
In this representation, when the user is unsatisfied with the system's default machine selection action color pairs, making a fourth input on the setting interface and then a fifth input on the customization interface randomly re-assigns the pairs. This realizes the correspondence between robot actions and atmosphere lamp colors while simplifying the setting steps, broadening the robot's applicable scenarios, and further improving the user experience.
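The random re-assignment described here can be sketched as a draw from the palette for each action; the function name, action names, and palette contents are illustrative, not from the patent:

```python
import random

def regenerate_machine_pairs(actions, palette):
    """Randomly assign one palette color to each robot action (illustrative)."""
    return {action: random.choice(palette) for action in actions}

actions = ["go forward", "turn left", "head up"]
palette = [f"color {i}" for i in range(1, 12)]  # "color 1" .. "color 11"
pairs = regenerate_machine_pairs(actions, palette)
# Every action receives some color from the palette; the assignment varies per call.
```

The variant in fig. 16, where the actions themselves also change, would additionally re-sample the `actions` list before building the mapping.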
As shown in fig. 18, the specific implementation flow of the atmosphere lamp setting method of the robot is as follows:
the robot initially controls and automatically generates a machine selection color pair;
and the machine selection color pair is generated by default of the system and is displayed on the atmosphere lamp setting interface.
When the user needs to set the machine selection color pair, two setting modes can be provided:
firstly, turning on an atmosphere lamp self-defined switch, and setting a target action and a target color corresponding to the target action.
This setting mode includes at least the following three operations:
first, adding an action color pair.
The user selects one robot action event, such as "turn left", from the plurality of alternative action events on the atmosphere lamp customization interface.
The target action of the robot, such as "turn left", is displayed in the robot action bar of the customization interface.
The user then inputs on the atmosphere lamp color area of the customization interface; the interface displays the color palette, and the user selects an atmosphere lamp color.
The target color of the robot, such as "color 7", is displayed in the atmosphere lamp color bar of the customization interface.
Second, deleting an action color pair.
The user clicks to delete the relevant action event bar in the robot action bar of the customization interface; the action event is deleted, and the atmosphere lamp color corresponding to it is deleted as well.
Third, changing a machine selection action color pair.
The user modifies an existing machine selection action color pair in the robot action bar of the customization interface, changing the association between an action in a default pair and its atmosphere lamp color.
For example, inputting on one of the robot action bars to be changed displays the alternative action list, from which a new target action is determined;
or inputting on one of the atmosphere lamp color bars to be changed displays the palette, from which a new target color is determined.
Second, turn the atmosphere lamp customization switch on and then off again.
The user makes a fourth input on the atmosphere lamp setting interface to call up the customization interface, then makes a fifth input on the customization interface to call the setting interface back up; the user may then choose to have the machine selection action color pairs randomly regenerated.
In the regenerated pairs, either the robot actions stay the same while their corresponding atmosphere lamp colors change, or both the actions and the colors change.
In this setting mode, by customizing action color pairs or randomly re-assigning the machine selection action color pairs, the user makes the association between robot actions and atmosphere lamp colors flexible, enriching the robot's usage scenarios and improving the user experience.
The embodiment of the application also provides an atmosphere lamp setting device of the robot.
As shown in fig. 19, the atmosphere lamp setting apparatus of the robot includes: a first obtaining module 1910 and a first processing module 1920.
A first obtaining module 1910 configured to determine at least one target action of the robot and a target color corresponding to the target action;
the first processing module 1920 is configured to obtain a current motion of the robot, and light an atmosphere lamp of the robot in a target color if the current motion corresponds to the target motion.
The atmosphere lamp setting device of the robot provided by the embodiments of the present application can implement each process of the method embodiments of figs. 2 to 18; the details are not repeated here.
An embodiment of the present application further provides a robot, including: atmosphere lamp, action module and controlling means.
The atmosphere lamp is used to light the atmosphere lamp in different colors.
The color of the atmosphere lamp can be any color selected in the color palette according to the requirements of a user.
The action module is used for making different actions.
The action module supports the actions the robot can perform, such as raising or lowering the head, or turning left.
The control device is connected to the robot's atmosphere lamp and action module and can execute the atmosphere lamp setting method of the robot described above.
For example, the correspondence between robot actions and atmosphere lamp colors can be changed by adding, deleting, or modifying action color pairs.
With the robot provided by the present application, the user can customize action color pairs or have the machine selection action color pairs randomly re-assigned, so that the atmosphere lamp lights in different colors according to the robot's actions, enriching the robot's usage scenarios and improving the user experience.
Fig. 20 illustrates a physical structure diagram of an electronic device, and as shown, the electronic device 2000 may include: a memory 2010(memory) and a processor 2020 (processor). Processor 2020 may invoke logic instructions in memory 2010 to perform a robot atmosphere light setting method comprising: determining at least one target action of the robot and a target color corresponding to the target action; and acquiring the current action of the robot, and if the current action corresponds to the target action, lighting the atmosphere lamp of the robot in the target color.
Furthermore, the logic instructions in the memory 2010 may be implemented in the form of software functional units and, when sold or used as a stand-alone product, stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, or the portion thereof that substantially contributes over the prior art, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Further, an embodiment of the application discloses a computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform the atmosphere lamp setting method of the robot, the method comprising: determining at least one target action of the robot and a target color corresponding to the target action; and acquiring the current action of the robot, and if the current action corresponds to the target action, lighting the atmosphere lamp of the robot in the target color.
In another aspect, embodiments of the present application further provide a non-transitory computer-readable storage medium having stored thereon a computer program, which when executed by a processor, implements an atmosphere lamp setting method for a robot, the method including: receiving a first input of a user to the atmosphere lamp self-defining interface; in response to the first input, a target action and a target color corresponding to the target action are determined, wherein an ambience light of the robot is lit at the target color when the robot performs the target action.
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions; all such variations fall within the scope of the claims of the present application.

Claims (10)

1. An atmosphere lamp setting method of a robot, comprising:
determining at least one target action of the robot and a target color corresponding to the target action;
and acquiring a current action of the robot and, if the current action corresponds to the target action, lighting an atmosphere lamp of the robot in the target color.
2. The atmosphere lamp setting method of a robot according to claim 1, wherein before the determining at least one target action of the robot and a target color corresponding to the target action, the method further comprises:
receiving a first input of a user to an atmosphere lamp customization interface, wherein the first input is used to determine the at least one target action of the robot and the target color corresponding to the target action.
3. The atmosphere lamp setting method of a robot according to claim 2, wherein the atmosphere lamp customization interface includes a blank action-color pair, the blank action-color pair including a blank action bar and a blank color bar corresponding to the blank action bar;
the determining at least one target action of the robot and a target color corresponding to the target action includes:
receiving a second input of the user to the blank action bar;
determining the target action in response to the second input;
receiving a third input of the user to the blank color bar;
determining the target color in response to the third input.
4. The atmosphere lamp setting method of a robot according to claim 3, wherein the receiving a second input of the user to the blank action bar and the determining the target action in response to the second input comprise: receiving a first sub-input of the user to the blank action bar; displaying a plurality of alternative actions in response to the first sub-input; receiving a second sub-input of the user selecting the target action from the plurality of alternative actions; and determining the target action in response to the second sub-input;
and wherein the receiving a third input of the user to the blank color bar and the determining the target color in response to the third input comprise: receiving a third sub-input of the user to the blank color bar; displaying a palette in response to the third sub-input; receiving a fourth sub-input of the user to the palette; and determining the target color in response to the fourth sub-input.
5. The atmosphere lamp setting method of a robot according to any one of claims 2 to 4, wherein before the receiving a first input of a user to the atmosphere lamp customization interface, the method further comprises:
receiving a fourth input of the user to an atmosphere lamp setting interface;
and displaying the atmosphere lamp customization interface in response to the fourth input.
6. The atmosphere lamp setting method of a robot according to claim 1, further comprising:
determining an action sequence to be performed, the action sequence comprising a plurality of target actions;
wherein the acquiring a current action of the robot and, if the current action corresponds to the target action, lighting an atmosphere lamp of the robot in the target color comprises:
detecting the current action of the robot in real time, and lighting the atmosphere lamp in the target color corresponding to the current action.
7. An atmosphere lamp setting device of a robot, comprising:
a first acquisition module configured to determine at least one target action of the robot and a target color corresponding to the target action; and
a first processing module configured to acquire a current action of the robot and, if the current action corresponds to the target action, light the atmosphere lamp of the robot in the target color.
8. A robot, comprising:
an atmosphere lamp for lighting in different colors;
an action module for making different actions;
a control device connected with the atmosphere lamp and the action module, and configured to execute the atmosphere lamp setting method of a robot according to any one of claims 1 to 6.
9. A readable storage medium, wherein the readable storage medium stores a program or instructions which, when executed by a processor, implement the steps of the atmosphere lamp setting method of a robot according to any one of claims 1 to 6.
10. A computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the steps of the atmosphere lamp setting method of a robot according to any one of claims 1 to 6.
CN202210082375.1A 2022-01-24 2022-01-24 Atmosphere lamp setting method and setting device for robot and robot Pending CN114347061A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210082375.1A CN114347061A (en) 2022-01-24 2022-01-24 Atmosphere lamp setting method and setting device for robot and robot


Publications (1)

Publication Number Publication Date
CN114347061A (en) 2022-04-15

Family

ID=81092771

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210082375.1A Pending CN114347061A (en) 2022-01-24 2022-01-24 Atmosphere lamp setting method and setting device for robot and robot

Country Status (1)

Country Link
CN (1) CN114347061A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115316900A (en) * 2022-07-15 2022-11-11 深圳市正浩创新科技股份有限公司 Lamp effect control method, self-moving equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130094058A (en) * 2012-02-15 2013-08-23 주식회사 케이티 Communication system, apparatus and computer-readable storage medium
CN103931207A (en) * 2011-11-07 2014-07-16 大和房屋工业株式会社 Information notification system
KR20140115902A (en) * 2013-03-22 2014-10-01 주식회사 케이티 Apparatus and method for developing robot contents
CN109035100A (en) * 2018-08-21 2018-12-18 浙江爱创智能科技有限公司 A kind of data management system based on educational robot
CN109070356A (en) * 2016-04-28 2018-12-21 富士通株式会社 robot
CN110919669A (en) * 2019-12-25 2020-03-27 青岛合启立智能科技有限公司 Emotion robot and emotion robot system
CN111208837A (en) * 2020-03-20 2020-05-29 重庆德新机器人检测中心有限公司 Autonomous navigation robot and autonomous navigation robot interaction method



Similar Documents

Publication Publication Date Title
CN102722334B (en) The control method of touch screen and device
US20230218997A1 (en) Information processing method and apparatus, storage medium, electronic device
CN104679427B (en) Terminal split-screen display method and system
JPH0282307A (en) Information input method and user interface constituting method using the method
AU2015201798B2 (en) Graph display control apparatus, graph display control method, and graph display control program
CA2231699A1 (en) Graphical user interface, apparatus and method
JP2015537314A (en) Method and system for realizing a suspension type global button in a touch screen terminal interface
CN111760274A (en) Skill control method and device, storage medium and computer equipment
US20070091095A1 (en) Computer executable graphic method of generating animation elements
CN114347061A (en) Atmosphere lamp setting method and setting device for robot and robot
CN105630375A (en) Auxiliary implementation method and system of information input on the basis of graphical interface
JP2021517328A (en) Method and system for remote control of PC by virtual input device
CN113262476B (en) Position adjusting method and device of operation control, terminal and storage medium
CN110174984B (en) Information processing method and electronic equipment
CN109091864B (en) Information processing method and device, mobile terminal and storage medium
JP2007148589A (en) Character input device and character input method
JP2009032121A (en) Color generation support device and color generation support program
JP3093605B2 (en) Image editing device
CN109976652B (en) Information processing method and electronic equipment
CN117193540B (en) Control method and system of virtual keyboard
CN112068710A (en) Small-size keyboard for carrying out alternative layout input on numeric editing function key area
CN112346619B (en) Configuration software control method and device
JPH0916314A (en) Editing processing method
JP2000194498A (en) Mouse type input device
KR102087042B1 (en) Control system and terminal comprising the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination