CN114384903A - Prompting method and prompting system for detecting obstacle by robot - Google Patents

Prompting method and prompting system for detecting obstacle by robot

Info

Publication number
CN114384903A
Authority
CN
China
Prior art keywords
obstacle
robot
display
display lamp
controlling
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011119373.2A
Other languages
Chinese (zh)
Inventor
李少海
郭盖华
Other inventors have requested that their names not be disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen LD Robot Co Ltd
Original Assignee
Shenzhen LD Robot Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen LD Robot Co Ltd filed Critical Shenzhen LD Robot Co Ltd
Priority to CN202011119373.2A priority Critical patent/CN114384903A/en
Publication of CN114384903A publication Critical patent/CN114384903A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0234 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons
    • G05D1/0236 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons in combination with a laser
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G05D1/0242 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0255 Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
    • G05D1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/0285 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using signals transmitted via a public communication network, e.g. GSM network

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • Manipulator (AREA)

Abstract

The application is applicable to the technical field of robots and provides a prompting method and system for obstacle detection, wherein the method comprises the following steps: acquiring obstacle information detected by a robot; and controlling a display lamp set to display based on the obstacle information. After the robot detects an obstacle, the display lamp set associated with the robot displays according to a control command, and the display prompts that the robot has detected an obstacle. The user can thereby observe from the display lamp set whether the robot is malfunctioning, discover in time when a sensor of the robot fails, and prevent the robot from suffering secondary damage due to undetected obstacles.

Description

Prompting method and prompting system for detecting obstacle by robot
Technical Field
The application belongs to the technical field of robots, and particularly relates to a prompting method and a prompting system for detecting an obstacle by a robot.
Background
With the progress of science and technology, robots have gradually entered people's daily lives and brought convenience to them.
For a movable robot, for example a cleaning robot, the robot needs to sense through its sensors whether obstacles exist around it while moving, so that when an obstacle is present the robot can avoid it in time and continue working. If a sensor in the robot is damaged, the user cannot discover the damage in time; the robot then cannot sense obstacles while working, cannot avoid them, and may suffer secondary damage. Therefore, displaying to the user whether the sensors in the robot are working normally ensures that sensor damage is discovered promptly and prevents secondary damage to the robot.
Disclosure of Invention
The embodiments of the application provide a method and a system for prompting obstacle detection, which can solve the problem that a user currently cannot discover in time that a sensor in a robot is damaged.
In a first aspect, an embodiment of the present application provides a method for prompting that a robot detects an obstacle, including:
acquiring obstacle information detected by a robot;
and controlling the display lamp set to display based on the obstacle information.
In a second aspect, an embodiment of the present application provides a system for prompting a robot to detect an obstacle, including:
obstacle detection means for detecting an obstacle and transmitting detected obstacle information to the control means;
the control device is used for controlling the display lamp set to display according to the obstacle information;
and the display lamp group is used for displaying lamplight under the control of the control device.
In a third aspect, an embodiment of the present application provides a control apparatus, including: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method for prompting the robot to detect an obstacle according to any one of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the method for prompting that a robot detects an obstacle according to any one of the above first aspects.
In a fifth aspect, the present application provides a computer program product, when the computer program product runs on a control device, the control device is caused to execute the method for prompting that the robot detects an obstacle according to any one of the first aspect.
It is understood that the beneficial effects of the second aspect to the fifth aspect can be referred to the related description of the first aspect, and are not described herein again.
Compared with the prior art, the embodiments of the application have the following advantages: obstacle information detected by a robot is first obtained, and the display lamp set is controlled to display based on the obstacle information. After the robot detects an obstacle, the display lamp set associated with the robot displays according to a control command generated from the detected obstacle information, and the display prompts that the robot has detected an obstacle. The user can thereby observe from the display lamp set whether the obstacle-detecting sensor in the robot has failed, discover a sensor failure in time, and prevent secondary damage to the robot, improving the intelligence of the robot.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1 is a schematic structural diagram of a system for prompting a robot to detect an obstacle according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a display lamp set when the obstacle information includes the size and orientation of an obstacle according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a display lamp set when the obstacle information includes the orientation of an obstacle according to an embodiment of the present application;
Fig. 4 is a schematic flowchart of a method for prompting that a robot detects an obstacle according to an embodiment of the present application;
FIG. 5 is a flowchart of a method for controlling a display lamp set to display according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a control device according to an embodiment of the present disclosure;
fig. 7 is a block diagram of a partial structure of a computer according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to" determining "or" in response to detecting ". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The embodiments of the application provide a prompting system for a robot that detects obstacles. The system can prompt when the robot detects an obstacle, allowing the user to conveniently observe whether the robot is operating normally, to discover in time when the obstacle-detecting sensor in the robot fails, and to prevent secondary damage to the robot caused by a sensor fault.
Fig. 1 is a schematic structural diagram of a system 100 for prompting a robot to detect an obstacle according to an embodiment of the present application, where the system includes an obstacle detection device 110, a control device 120, and a display lamp set 130, the obstacle detection device 110 is connected to the control device 120, and the control device 120 is connected to the display lamp set 130, and specifically, the detailed description of the system is as follows:
and an obstacle detection device 110 for detecting an obstacle and transmitting the detected obstacle information to the control device 120.
In the present embodiment, the obstacle detection device 110 may perform detection at a preset interval, for example, every two minutes. The obstacle detection device 110 may be a sensor provided in the robot to detect obstacles. There may be one sensor or several; when several sensors are used, each may detect obstacles in one direction. The sensors may also be used to detect obstacles at different heights. The sensors may be single-line, multi-line, solid-state, single-point, or multi-point laser radars, depth cameras, ultrasonic sensors, millimeter-wave radars, and the like. When multiple sensors are used, the information from sensors in various directions may be fused, as may the output of a sensor with omnidirectional detection.
The robot may be a cleaning robot, a meal-delivery robot, a service robot, a mowing robot, or the like. The obstacle may be an obstacle in the robot's direction of motion or in front of the robot, or an obstacle within a preset range around the robot; the preset range may be set as needed, or set according to the detection distance of the sensor used to detect obstacles.
In the present embodiment, the obstacle information detected by the obstacle detecting device 110 includes: at least one of a size of the obstacle, an orientation of the obstacle relative to the robot, and a distance of the obstacle from the robot.
Specifically, the size of the obstacle may be determined from the contour of the obstacle, and may be the radius, perimeter, volume, height, length, width, or aspect ratio of the obstacle, set as needed. The orientation of the obstacle relative to the robot may be its position relative to the front of the robot, and may include directly ahead, directly behind, directly left, directly right, front-left, rear-left, front-right, rear-right, and the like.
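As an illustrative sketch only (the patent does not specify a data format), the obstacle information described above could be represented as follows; all type and field names are assumptions:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Orientation(Enum):
    """Orientation of the obstacle relative to the front of the robot."""
    FRONT = "front"
    REAR = "rear"
    LEFT = "left"
    RIGHT = "right"
    FRONT_LEFT = "front_left"
    REAR_LEFT = "rear_left"
    FRONT_RIGHT = "front_right"
    REAR_RIGHT = "rear_right"

@dataclass
class ObstacleInfo:
    """At least one of size, orientation, and distance, per the description."""
    size_cm: Optional[float] = None        # e.g. a contour dimension, in cm
    orientation: Optional[Orientation] = None
    distance_cm: Optional[float] = None    # distance from robot to obstacle
```

Any subset of the three fields may be populated, matching "at least one of" in the description above.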
And a control device 120, configured to control the display lamp set to display according to the obstacle information.
Specifically, the control device 120 may generate a control command according to the obstacle information and transmit the control command to the display lamp group 130. The control command is used to control the display lamp set 130 to display according to the control command.
In this embodiment, the control device 120 may generate a control instruction according to one or more of the obstacle information. The control device 120 may be a CPU or a processor, etc.
And the display lamp group 130 is used for displaying light under the control of the control device.
Specifically, the display lamp set 130 may perform a light display based on a control command of the control device 120.
In this embodiment, a display light set 130 is associated with the robot for light prompting when the robot detects an obstacle. The display lamp group 130 may include a plurality of lamps, and each lamp may include a plurality of LED lamp groups or a plurality of light bulbs. The brightness of the lamp can be conveniently adjusted by arranging a plurality of LED lamp groups or a plurality of bulbs in each lamp.
In this embodiment, the display lamp set 130 may also be used to display the remaining power of the robot, lighting lamps of a corresponding length or area according to the remaining power.
It should be noted that, if the robot detects several obstacles at the same time, the obstacle information of the obstacle closest to the robot may be selected, and the display lamp set may be controlled to display according to the selected information. If the display lamp set only needs to indicate whether the robot has detected an obstacle at all, the information of any one obstacle is sufficient to show that an obstacle has been detected and that the obstacle-detecting sensor in the robot is operating normally. If different sensors are arranged for different directions, obstacle information should be selected per sensor.
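The nearest-obstacle selection described above could be sketched as follows; the function name and the dict-based representation are illustrative, not from the patent:

```python
def select_nearest(obstacles):
    """Pick the obstacle closest to the robot; return None if none detected.

    Each obstacle is assumed to be a dict with a 'distance_cm' key
    (an illustrative representation, not specified by the patent).
    """
    if not obstacles:
        return None
    return min(obstacles, key=lambda o: o["distance_cm"])
```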
In the embodiment of the present application, the obstacle detection device 110 detects obstacles and sends the detected obstacle information to the control device 120. The control device 120 receives that information and controls the display lamp set accordingly. The display lamp set 130 displays light under the control of the control device: it lights when the obstacle detection device 110 detects an obstacle and stays dark when no obstacle is detected. This light display lets the user observe whether the robot can accurately detect obstacles, so that a failure to detect obstacles is discovered in time and secondary damage to the robot is avoided.
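The detection-to-display flow of the system of Fig. 1 could be sketched as one cycle; the three callables stand in for the obstacle detection device, control device, and display lamp set, and are illustrative assumptions rather than APIs from the patent:

```python
def prompt_cycle(detect, control, display):
    """One detection-prompt cycle: obstacle detection device ->
    control device -> display lamp set.

    `detect` returns obstacle info (or None when no obstacle is found),
    `control` turns the info into a control command, and `display`
    drives the lamp set. Returns True if an obstacle was prompted.
    """
    info = detect()
    if info is None:
        display(None)        # no obstacle: the lamp set stays dark
        return False
    command = control(info)
    display(command)         # light the lamps per the control command
    return True
```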
In one possible implementation, a display light set is arranged above or on the side of the robot, and the display light set is provided with a protective cover.
In this embodiment, the display lamp set may be arranged on the robot or on a device other than the robot, with one robot corresponding to one display lamp set. If the display lamp set is not arranged on the robot, it should be marked with the serial number of the corresponding robot, so that the user can conveniently check the operating state of that robot from the display lamp set.
In this embodiment, the protective cover on the display lamp set may be made of plastic, glass, or the like, and should have good light transmittance. A colorless, transparent cover makes the light color of the display lamp set easier to recognize when the color changes.
In a possible implementation manner, the distribution angle of the display lamp group on the robot is not less than 90 degrees.
In this embodiment, the display lamp sets may be distributed on the robot, along its circumference or on its upper surface.
As shown in fig. 1, in one possible implementation, the system 100 for prompting the robot to detect the obstacle may further include:
a sensing device 140 for detecting the intensity of ambient light and sending the intensity of ambient light to the control device.
In the present embodiment, the sensing device 140 is connected to the control device 120. The sensing device 140 may be a sensor that senses the intensity of illumination. The sensing device 140 can sense the ambient light intensity of the environment in which the robot is located.
The control device controls the display lamp set to display corresponding color and/or brightness based on the intensity of the ambient light sent by the sensing device 140.
Specifically, the color and/or brightness of the light used for lighting the display lamp set can be determined according to the intensity of the ambient light; and controlling the display lamp set to display according to the color and/or the brightness of the light and the obstacle information.
Specifically, a control instruction can be generated according to the color and/or brightness of the light and the obstacle information, and the control instruction is sent to the display lamp group, and the display lamp group performs light display based on the control instruction.
In this embodiment, the ambient light intensity may be a numerical value. The preset range in which the current ambient light intensity falls may be looked up, and the mode of the current environment determined from that range, with one preset range corresponding to one mode: for example, a first preset range corresponds to a daytime mode and a second preset range to a nighttime mode. The color of the lamp set is then determined by the mode corresponding to the current ambient light intensity; for example, the light color in daytime mode may be red or green, and the light color in nighttime mode white.
Alternatively, the ambient light intensity may be classified into levels, for example level 1 for extra-strong illumination, level 2 for strong illumination, level 3 for normal illumination, and so on. Different levels correspond to different light colors, for example level 1 to red; the color to be lit by the display lamp set is determined from the color corresponding to the current level.
In this embodiment, the brightness of the display lamp set may also be selected according to the mode corresponding to the ambient light intensity; for example, daytime mode may use brightness levels 3-5 and nighttime mode levels 1-2, where a higher level means a brighter light.
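The mapping from ambient light intensity to light color and brightness could be sketched as below. The numeric threshold, the chosen colors, and the exact brightness levels are assumptions; the description only requires that preset ranges map to modes, daytime colors be red or green, nighttime color be white, and daytime brightness use levels 3-5 versus 1-2 at night:

```python
def select_display(ambient_lux):
    """Map an ambient light reading to (mode, color, brightness level).

    The 100-lux day/night threshold is an illustrative assumption.
    """
    if ambient_lux >= 100:      # assumed daytime preset range
        mode = "day"
        color = "red"           # daytime example colors: red or green
        brightness = 4          # daytime example range: levels 3-5
    else:                       # assumed nighttime preset range
        mode = "night"
        color = "white"         # nighttime example color: white
        brightness = 2          # nighttime example range: levels 1-2
    return mode, color, brightness
```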
In one possible implementation, the system 100 for prompting the robot to detect an obstacle may further include a sensor for detecting the brightness of the display light set.
In this embodiment, if the robot is moving in a straight line toward an obstacle, the detected size of the obstacle increases as the distance decreases. The brightness of the display lamp set can be measured by the sensor for detecting its brightness and adjusted accordingly as the distance changes.
In one possible implementation, when the obstacle information includes the size of the obstacle, the control device 120 is further configured to:
determine the length and/or area of the display lamp set to be lit based on the size of the obstacle; and
control the display lamp set to light the corresponding length and/or area.
In this embodiment, the display lamp set is controlled to light the corresponding length and/or area: for example, if the length or area to be lit is 5 centimeters or 5 square centimeters, the display lamp set is controlled (by driving the light emission, or by blocking light) to light 5 centimeters or 5 square centimeters.
Alternatively, the size of the obstacle may be detected by a sensor installed in the robot, and the sensor may estimate the length, width, and height of the obstacle by detecting the contour of the obstacle, thereby determining the volume of the obstacle. The control device 120 determines the length and/or area of the display lamp set to be lit according to the volume of the obstacle.
Specifically, the control device 120 presets a mapping from obstacle sizes (for example, the volume of the obstacle) to lengths and/or areas to be lit. For example, when the size of the obstacle exceeds a preset threshold, 10 centimeters and/or 10 square centimeters of lamps in the display lamp set are lit, indicating a large obstacle; or, when the size of the obstacle is 20, 8 centimeters and/or 8 square centimeters of lamps are lit, and when the size is 10, 4 centimeters and/or 4 square centimeters are lit. The mapping from obstacle size to the length and/or area to be lit may be set as needed. By lighting different lengths and/or areas, the display lamp set indicates the size of the obstacle detected by the robot.
In this embodiment, if the control instruction is generated, the control instruction may include lighting a lamp of the above-described length and/or area.
As an example, if the size of the obstacle is 5 cm and the length of the corresponding lamp to be lit is 5 cm, the control device 120 controls the display lamp set to be lit by 5 cm.
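The size-to-length mapping could be sketched as a simple lookup. The threshold value and the fallback length for small obstacles are assumptions; the breakpoints at sizes 20 and 10 and their lit lengths follow the examples in the description:

```python
def lit_length_cm(obstacle_size, threshold=30):
    """Map a detected obstacle size to the lamp length (in cm) to light.

    `threshold` (assumed value 30) is the "preset threshold" for a
    large obstacle; the units of `obstacle_size` are left abstract in
    the description, so this is an illustrative sketch only.
    """
    if obstacle_size > threshold:
        return 10   # large obstacle: light 10 cm
    if obstacle_size >= 20:
        return 8    # size-20 example: light 8 cm
    if obstacle_size >= 10:
        return 4    # size-10 example: light 4 cm
    return 2        # assumed minimum indication for small obstacles
```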
In the embodiment of the application, whether the robot has detected an obstacle can be judged from the length and/or area of the lit lamps in the display lamp set. If the robot does not avoid the obstacle in time, the detected size of the obstacle increases as the distance decreases, so the lit length and/or area of the display lamp set also changes as the distance decreases.
It should be noted that, if the robot detects several obstacles, the lamp set display may be performed according to the size of one of them. The display may also take the orientation of each obstacle and/or its distance into account; for example, the obstacle information of the largest obstacle may be selected, or that of the obstacle closest to the robot.
In one possible implementation, when the obstacle information includes the orientation of the obstacle relative to the robot, the control device 120 is further configured to:
determine the position area where the lamps to be lit in the display lamp group are located based on the orientation of the obstacle relative to the robot, the position area being the area corresponding to that orientation; and control the lamps in that position area to be lit.
In this embodiment, the position areas where the lamps in the display lamp group are located are pre-divided, and the division of the position areas may be performed according to the possible orientations of an obstacle relative to the robot. When the lamps in a position area are lit according to the position area alone, rather than according to the size of the obstacle, all of the lamps in the position area may be lit, or only some of them, as set according to need.
Optionally, each position area in the display lamp group may be numbered to obtain an area number for each position area; the lamps within an area are then lit according to the area number of the position area where the lamps to be lit are located.
Alternatively, if the orientation of the obstacle with respect to the robot may be front, rear, left, right, front left, rear left, front right, or rear right, the position areas of the lamps in the display lamp group may include a position area corresponding to the front (front display lamp group), a position area corresponding to the rear (rear display lamp group), a position area corresponding to the left (left display lamp group), a position area corresponding to the right (right display lamp group), a position area corresponding to the front left (front-left display lamp group), a position area corresponding to the rear left (rear-left display lamp group), a position area corresponding to the front right (front-right display lamp group), and a position area corresponding to the rear right (rear-right display lamp group). Optionally, the position area corresponding to the front left may instead comprise both the front-left display lamp group and the front display lamp group, the position area corresponding to the rear left both the rear-left display lamp group and the rear display lamp group, the position area corresponding to the front right both the front display lamp group and the front-right display lamp group, and the position area corresponding to the rear right both the rear display lamp group and the rear-right display lamp group.
For example, if the orientation of the obstacle with respect to the robot is front left, the position area corresponding to the front left is looked up, and all of the lamps in that position area are turned on.
As an example, if the orientation of the obstacle 1 with respect to the robot 2 is directly in front, the lamps at the front of the display lamp group 130 are lit, as shown in fig. 2 (a). If the orientation is directly behind, the lamps at the rear of the display lamp group 130 are lit, as shown in fig. 2 (b). If the orientation is directly to the left, the lamps on the left of the display lamp group 130 are lit, as shown in fig. 2 (c). If the orientation is directly to the right, the lamps on the right of the display lamp group 130 are lit, as shown in fig. 2 (d). If the orientation is front left, the lamps at the front left of the display lamp group 130 are lit, as shown in fig. 2 (g). If the orientation is rear left, the lamps at the rear left of the display lamp group 130 are lit, as shown in fig. 2 (h). If the orientation is front right, the lamps at the front right of the display lamp group 130 are lit, as shown in fig. 2 (e). If the orientation is rear right, the lamps at the rear right of the display lamp group 130 are lit, as shown in fig. 2 (f).
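The orientation-to-region lookup above can be sketched as follows. The eight orientation names and the optional composite scheme, where a diagonal orientation lights both the diagonal group and the adjacent front or rear group, mirror the two variants described; the function name and string identifiers are assumptions.

```python
# Hypothetical sketch of the orientation-to-position-area lookup.
# Region identifiers are illustrative assumptions.

ORIENTATIONS = {"front", "rear", "left", "right",
                "front_left", "rear_left", "front_right", "rear_right"}

def regions_to_light(orientation, composite=False):
    """Return the display-lamp region(s) to light for an obstacle orientation.

    With composite=True, a diagonal orientation also lights the adjacent
    front/rear display lamp group, per the optional scheme above.
    """
    if orientation not in ORIENTATIONS:
        raise ValueError("unknown orientation: %s" % orientation)
    if composite and "_" in orientation:
        # "front_left" -> ["front_left", "front"], etc.
        return [orientation, orientation.split("_")[0]]
    return [orientation]
```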
In one possible implementation, when the obstacle information includes the distance between the obstacle and the robot, the control device 120 is further configured to:
determine the brightness and/or flicker frequency of the lamps to be lit in the display lamp group based on the distance between the obstacle and the robot; and control the display lamp group to display the corresponding brightness and/or flicker frequency.
In this embodiment, the shorter the distance between the obstacle and the robot, the higher the brightness and/or the faster the flicker frequency of the display lamp set. For example, when the robot is 1 meter from the obstacle, the display lamp set flashes once every two seconds, and when the robot is 0.5 meter from the obstacle, the display lamp set flashes once every second.
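The distance-to-flicker relationship in the example is linear, and a minimal sketch of it follows. The 2-seconds-per-meter factor reproduces the worked example (1 m flashing every 2 s, 0.5 m every 1 s); the lower bound on the period is an assumption added to keep the blink rate sane at very close range.

```python
# Hypothetical sketch: nearer obstacles flash faster.
# The factor matches the worked example; the floor is an assumption.

def blink_interval_s(distance_m, seconds_per_m=2.0, floor_s=0.2):
    """Return the blink period in seconds for a given obstacle distance."""
    return max(floor_s, seconds_per_m * distance_m)
```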
In this embodiment, when the control instruction is determined from the distance between the obstacle and the robot rather than from the size of the obstacle, all of the lamps in the display lamp group may be lit, or lamps of a preset length or area may be lit.
In this embodiment, the color of the light when the display lamp group is turned on may be selected according to the distance between the obstacle and the robot, and the display lamp group may be controlled to display a corresponding color, and/or brightness, and/or flicker frequency.
In one possible implementation, when the obstacle information includes the distance between the obstacle and the robot and the size of the obstacle, the control device 120 is further configured to:
determining the color of a lamp to be lighted in the display lamp group and/or the brightness and/or the flicker frequency of the lighting based on the distance between the obstacle and the robot; determining the length and/or area to be lighted in the display lamp group according to the size of the obstacle; and lighting the lamps with the corresponding length or area in the display lamp group according to the color, and/or the lighting brightness and/or the flashing frequency, and the length and/or the area to be lighted.
Optionally, a control instruction may be generated according to the color, and/or the lighting brightness and/or the flashing frequency, and the length or the area to be lit, where the control instruction is used to control the display lamp group to light the lamps of the corresponding length or area according to the color, and/or the lighting brightness and/or the flashing frequency.
In this embodiment, the control instruction includes illuminating a lamp of a first length or area, which is determined based on the size of the obstacle, in accordance with the color, and/or the brightness and/or the blinking frequency of the illumination.
As an example, if the obstacle is 3 meters from the robot and its size is a radius of 5 centimeters: the length of lamps to be lit corresponding to a 5-centimeter radius is 5 centimeters, and the lighting brightness corresponding to a distance of 3 meters is level 3. The control instruction is then to light a 5-centimeter length of the display lamp group at brightness level 3.
In one possible implementation, when the obstacle information includes the distance between the obstacle and the robot and the orientation of the obstacle relative to the robot, the control device 120 is further configured to:
determining the color of a lamp to be lighted in the display lamp group and/or the brightness and/or the flicker frequency of the lighting based on the distance between the obstacle and the robot; determining a position area where a lamp to be lit in the display lamp group is located based on the position of the obstacle relative to the robot; and controlling the lighting of the lamps in the position area according to the color, and/or the lighting brightness and/or the flashing frequency and the position area.
In this embodiment, a position area of a lamp to be lit is determined based on the orientation, and then the lamp in the position area may be lit according to the color, and/or the lit brightness and/or the blinking frequency.
As an example, if the obstacle is 4 meters from the robot and is directly behind the robot: the lighting brightness corresponding to a distance of 4 meters is level 4, and the position area to be lit is the rear area of the display lamp group. The control instruction is then to light the lamps in the rear area of the display lamp group at brightness level 4.
In one possible implementation, when the obstacle information includes the size of the obstacle and the orientation of the obstacle relative to the robot, the control device 120 is further configured to:
determining the length or the area of a lamp to be lighted in the display lamp group based on the size of the obstacle; determining a position area where a lamp to be lit in the display lamp group is located based on the position of the obstacle relative to the robot; and according to the length or the area of the lamp to be lightened and the position area, lightening the lamp with the corresponding length or area in the position area.
In the present embodiment, the position region of the lamps to be lit is determined based on the orientation, and then lamps of the above-described length or area within that position region are lit, while the remaining lamps are not lit.
By way of example, suppose each position area in the display lamp group corresponds to an area number, the size of the obstacle 1 is a radius of 5 centimeters, and the orientation of the obstacle 1 with respect to the robot 2 is directly in front. A radius of 5 cm corresponds to a length to be lit of 4 cm, and the position area directly in front in the display lamp group 130 has area number 5. The control instruction is then to light lamps 4 cm in length in area 5 of the display lamp group, as shown in fig. 3 (i).
As the robot 2 moves forward, the detected size of the obstacle 1 becomes larger. If the size of the obstacle 1 is 8 cm in radius, lamps 8 cm in length in area 5 of the display lamp group 130 need to be lit, as shown in fig. 3 (j). If the size of the obstacle 1 is 10 cm in radius, lamps 10 cm in length in area 5 of the display lamp group 130 need to be lit, as shown in fig. 3 (k).
In one possible implementation, when the obstacle information includes the distance between the obstacle and the robot, the size of the obstacle, and the orientation of the obstacle relative to the robot, the control device 120 is further configured to:
determining the color of a lamp to be lighted in the display lamp group and/or the brightness and/or the flicker frequency of the lighting based on the distance between the obstacle and the robot; determining the length and/or area to be lighted in the display lamp group according to the size of the obstacle; determining a position area where a lamp to be lit in the display lamp group is located based on the position of the obstacle relative to the robot; and controlling the display lamp group to display according to the color, and/or the lighting brightness and/or the flashing frequency, and the length and/or the area and the position area to be lighted in the display lamp group.
In this embodiment, if a control command for controlling the lighting of the display lamp set is generated according to the color, and/or the lighting brightness and/or the flashing frequency, the length and/or the area to be lit in the display lamp set, and the position region, the control command includes lighting the lamps in the position region according to the color, and/or the lighting brightness and/or the flashing frequency, and the lighting lamps have the length and/or the area corresponding to the size of the obstacle.
By way of example, suppose the distance between the obstacle and the robot is 3 meters, the size of the obstacle is a radius of 5 centimeters, and the obstacle is directly in front of the robot. The lit length corresponding to a 5-centimeter radius is 5 centimeters, the lighting brightness corresponding to a distance of 3 meters is level 3, and the position area to be lit is the front area of the display lamp group. The control device then lights a 5-centimeter length of lamps in the front area of the display lamp group at brightness level 3.
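Putting the three factors together, a control instruction can be sketched as a small record built from the mappings in the examples. Lit length equal to the radius in centimeters and brightness level equal to the distance in meters rounded to an integer both follow the worked examples (3 m giving level 3, 4 m giving level 4); the field names and the level-1 floor are assumptions.

```python
# Hypothetical sketch of assembling a three-factor control instruction.
# Field names and the brightness floor are assumptions.

def build_control_instruction(distance_m, radius_cm, orientation):
    """Combine distance, size, and orientation into one instruction."""
    return {
        "region": orientation,                      # from orientation
        "length_cm": radius_cm,                     # from obstacle size
        "brightness_level": max(1, round(distance_m)),  # from distance
    }
```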
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The method for prompting the robot to detect an obstacle according to the embodiment of the present application will be described in detail below with reference to fig. 1.
Fig. 4 shows a schematic flowchart of a method for prompting that the robot detects an obstacle, which is applied to the control device 120 and is described in detail below with reference to fig. 4:
S101, acquiring obstacle information detected by the robot.
S102, controlling a display lamp set to display based on the obstacle information.
In one possible implementation, the obstacle information includes: at least one of a size of the obstacle, an orientation of the obstacle relative to the robot, and a distance of the obstacle from the robot.
In the present embodiment, the control instruction may be generated from one or more items of the obstacle information.
As shown in fig. 5, in a possible implementation manner, if the obstacle information includes the size of the obstacle, the implementation process of step S102 may include:
S1021, controlling the display lamp set to display the corresponding length and/or area based on the size of the obstacle.
In this embodiment, the length and/or area to be lit may be determined according to the size of the obstacle, the numbers of the lamps to be lit may then be determined, and the lamps with those numbers may be lit. For example, if 3 cm of lamps need to be lit, the lamps numbered 1, 2, and 3 may be selected, or the lamps numbered 3, 4, and 5. A control instruction is then generated according to the numbers of the lamps to be lit.
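The number selection described above can be sketched as picking a consecutive run of lamp numbers covering the required length. The assumption of one lamp per centimeter and of consecutive numbering is illustrative; the example in the text only requires that 3 cm may map to lamps 1, 2, 3 or to lamps 3, 4, 5.

```python
# Hypothetical sketch of choosing lamp numbers for a required lit length.
# One lamp per centimetre and consecutive numbering are assumptions.

def lamps_to_light(length_cm, lamps_per_cm=1, start=1):
    """Return consecutive lamp numbers covering the required length."""
    count = int(length_cm * lamps_per_cm)
    return list(range(start, start + count))
```

Varying `start` yields the alternative selections the text allows, such as lamps 3, 4, 5 instead of 1, 2, 3.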
In a possible implementation manner, if the obstacle information includes the position of the obstacle relative to the robot, the implementation process of step S102 may include:
and S1022, controlling the display lamp set to display a corresponding position area based on the direction of the obstacle relative to the robot, wherein the position area is an area corresponding to the direction.
In this embodiment, the position area to be lit may be determined according to the orientation of the obstacle with respect to the robot. And selecting the number of the lamp to be lightened in the position area where the lamp to be lightened is positioned, and generating a control command according to the number of the lamp to be lightened.
In a possible implementation manner, if the obstacle information includes a distance between the obstacle and the robot, the implementation process of step S102 may include:
S1023, controlling the display lamp set to display corresponding brightness and/or flicker frequency based on the distance between the obstacle and the robot.

In this embodiment, the brightness and/or flicker frequency of the display lamp group may be determined according to the distance between the obstacle and the robot. The numbers of the lamps to be lit are selected from the display lamp group, and a control instruction is generated according to the numbers of the lamps to be lit and the brightness and/or flicker frequency.
In a possible implementation manner, if the obstacle information includes a distance between the obstacle and the robot, the implementation process of step S102 may include:
and controlling the display lamp set to display corresponding color, brightness and/or flicker frequency based on the distance between the obstacle and the robot.
In this embodiment, the color of the light when the display lamp group is turned on may be selected according to the distance between the obstacle and the robot.
In a possible implementation manner, the distribution angle of the display lamp group on the robot is not less than 90 degrees.
In a possible implementation manner, before step S102, the method may further include:
s201, acquiring the ambient light intensity of the environment where the robot is located;
s202, controlling the display lamp set to display corresponding color and/or brightness based on the ambient light intensity.
In this embodiment, after the light color is determined, the display light group is controlled to display based on the obstacle information and the light color and/or brightness.
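Steps S201 and S202 can be sketched as a simple ambient-light policy. The text only says that color and/or brightness are chosen from the ambient light intensity, so the lux threshold, the color choices, and the brightness levels below are all assumptions, not values from the document.

```python
# Hypothetical sketch of choosing lamp color/brightness from ambient light.
# Threshold, colors, and levels are illustrative assumptions.

def ambient_display_params(ambient_lux, bright_threshold_lux=300):
    """Pick display color and brightness for the current ambient light."""
    if ambient_lux >= bright_threshold_lux:
        # bright surroundings: raise brightness so the prompt stays visible
        return {"color": "white", "brightness_level": 5}
    # dim surroundings: lower brightness to avoid glare
    return {"color": "amber", "brightness_level": 2}
```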
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
The embodiment of the present application further provides a control device, and referring to fig. 6, the control device 120 may include: at least one processor 410, a memory 420, and a computer program stored in the memory 420 and executable on the at least one processor 410, wherein the processor 410 when executing the computer program implements the steps of any of the method embodiments described above, such as the steps S101 to S102 in the embodiment shown in fig. 4.
Illustratively, a computer program may be partitioned into one or more modules/units, which are stored in the memory 420 and executed by the processor 410 to accomplish the present application. The one or more modules/units may be a series of computer program segments capable of performing specific functions, which are used to describe the execution of the computer program in the control device 120.
Those skilled in the art will appreciate that fig. 6 is merely an example of a control device and is not intended to be limiting and may include more or fewer components than shown, or some components in combination, or different components such as input output devices, network access devices, buses, etc.
The Processor 410 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 420 may be an internal storage unit of the control apparatus, or may be an external storage device of the control apparatus, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. The memory 420 is used for storing the computer programs and other programs and data required for controlling the apparatus. The memory 420 may also be used to temporarily store data that has been output or is to be output.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, the buses in the figures of the present application are not limited to only one bus or one type of bus.
The method for prompting the robot to detect the obstacle provided by the embodiment of the application can be applied to control devices such as computers, tablet computers, notebook computers, netbooks and Personal Digital Assistants (PDAs).
Take the control device as a computer as an example. Fig. 7 is a block diagram illustrating a partial structure of a computer provided in an embodiment of the present application. Referring to fig. 7, the computer includes: a communication circuit 510, a memory 520, an input unit 530, a display unit 540, an audio circuit 550, a wireless fidelity (WiFi) module 560, a processor 570, and a power supply 580.
The following describes each component of the computer in detail with reference to fig. 7:
the communication circuit 510 may be used for receiving and transmitting signals during information transmission and reception or during a call, and in particular, receives an image sample transmitted by the image capturing device and then processes the image sample to the processor 570; in addition, the image acquisition instruction is sent to the image acquisition device. Typically, the communication circuit includes, but is not limited to, an antenna, at least one Amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the communication circuit 510 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE)), e-mail, Short Messaging Service (SMS), and the like.
The memory 520 may be used to store software programs and modules, and the processor 570 performs various functional applications of the computer and data processing by operating the software programs and modules stored in the memory 520. The memory 520 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the computer, etc. Further, the memory 520 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The input unit 530 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the computer. Specifically, the input unit 530 may include a touch panel 531 and other input devices 532. The touch panel 531, also called a touch screen, can collect touch operations of a user on or near the touch panel 531 (for example, operations of the user on or near the touch panel 531 by using any suitable object or accessory such as a finger or a stylus pen), and drive the corresponding connection device according to a preset program. Alternatively, the touch panel 531 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, and sends the touch point coordinates to the processor 570, and can receive and execute commands sent by the processor 570. In addition, the touch panel 531 may be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The input unit 530 may include other input devices 532 in addition to the touch panel 531. In particular, other input devices 532 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 540 may be used to display information input by the user or information provided to the user, and the various menus of the computer. The Display unit 540 may include a Display panel 541; optionally, the Display panel 541 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 531 may cover the display panel 541, and when the touch panel 531 detects a touch operation on or near it, the touch operation is transmitted to the processor 570 to determine the type of the touch event, and the processor 570 then provides a corresponding visual output on the display panel 541 according to the type of the touch event. Although in fig. 7 the touch panel 531 and the display panel 541 are two independent components implementing the input and output functions of the computer, in some embodiments the touch panel 531 and the display panel 541 may be integrated to implement the input and output functions of the computer.
The audio circuit 550 may provide an audio interface between a user and a computer. The audio circuit 550 may transmit the received electrical signal converted from the audio data to a speaker, and convert the electrical signal into a sound signal for output; on the other hand, the microphone converts the collected sound signal into an electrical signal, which is received by the audio circuit 550 and converted into audio data, which is then processed by the audio data output processor 570, and then transmitted to, for example, another computer via the communication circuit 510, or the audio data is output to the memory 520 for further processing.
WiFi belongs to a short-distance wireless transmission technology, and a computer can help a user send and receive e-mails, browse webpages, access streaming media and the like through the WiFi module 560, which provides wireless broadband internet access for the user. Although fig. 7 shows the WiFi module 560, it is understood that it does not belong to the essential constitution of the computer, and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 570 is a control center of the computer, connects various parts of the entire computer using various interfaces and lines, performs various functions of the computer and processes data by operating or executing software programs and/or modules stored in the memory 520 and calling data stored in the memory 520, thereby monitoring the entire computer. Optionally, processor 570 may include one or more processing units; preferably, the processor 570 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 570.
The computer also includes a power supply 580 (e.g., a battery) for powering the various components, and preferably, the power supply 580 is logically coupled to the processor 570 via a power management system that provides management of charging, discharging, and power consumption.
The embodiment of the application also provides a computer-readable storage medium, which stores a computer program, and the computer program, when executed by a processor, implements the steps in the various embodiments of the method for prompting that the robot detects an obstacle.
The embodiment of the application further provides a computer program product; when the computer program product runs on a mobile terminal, the mobile terminal, upon executing it, implements the steps in each embodiment of the method for prompting that a robot detects an obstacle.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, can implement the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to a photographing apparatus/control device, a recording medium, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB disk, a removable hard disk, or a magnetic or optical disk. In certain jurisdictions, computer-readable media may not include electrical carrier signals or telecommunications signals, in accordance with legislation and patent practice.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A method for prompting that a robot detects an obstacle, comprising:
acquiring obstacle information detected by a robot;
and controlling a display lamp group to display based on the obstacle information.
2. The method for prompting that a robot detects an obstacle according to claim 1, wherein the obstacle information comprises at least one of: a size of the obstacle, an orientation of the obstacle relative to the robot, and a distance between the obstacle and the robot.
3. The method for prompting that a robot detects an obstacle according to claim 2, wherein if the obstacle information comprises the size of the obstacle, the controlling the display lamp group to display based on the obstacle information comprises:
controlling the display lamp group to display a corresponding length and/or area based on the size of the obstacle.
4. The method for prompting that a robot detects an obstacle according to claim 2 or 3, wherein if the obstacle information comprises the orientation of the obstacle relative to the robot, the controlling the display lamp group to display based on the obstacle information comprises:
controlling the display lamp group to display a corresponding position area based on the orientation of the obstacle relative to the robot, wherein the position area is an area corresponding to the orientation.
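As a non-authoritative illustration of the position-area display in claim 4, the sketch below lights the segment of a circular lamp ring that faces the detected obstacle. The LED count, segment width, and function names are assumptions for illustration; the patent does not fix concrete values.

```python
# Hypothetical sketch of claim 4: light the lamp-ring segment facing the
# obstacle. A ring of 24 LEDs evenly distributed over 360 degrees is assumed.

NUM_LEDS = 24

def leds_for_bearing(bearing_deg: float, spread_deg: float = 30.0) -> list[int]:
    """Return indices of the LEDs covering the obstacle's bearing.

    bearing_deg: obstacle orientation relative to the robot's heading (degrees).
    spread_deg:  angular width of the lit position area (assumed value).
    """
    deg_per_led = 360.0 / NUM_LEDS
    half = spread_deg / 2.0
    lit = []
    for i in range(NUM_LEDS):
        led_angle = i * deg_per_led
        # smallest angular difference between this LED and the obstacle bearing
        diff = abs((led_angle - bearing_deg + 180.0) % 360.0 - 180.0)
        if diff <= half:
            lit.append(i)
    return lit
```

An obstacle straight ahead (bearing 0) lights the LEDs around index 0, while one behind the robot (bearing 180) lights the segment on the opposite side of the ring.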
5. The method for prompting that a robot detects an obstacle according to claim 2 or 3, wherein if the obstacle information comprises the distance between the obstacle and the robot, the controlling the display lamp group to display based on the obstacle information comprises:
controlling the display lamp group to display a corresponding brightness and/or flicker frequency based on the distance between the obstacle and the robot.
6. The method for prompting that a robot detects an obstacle according to claim 4, wherein if the obstacle information comprises the distance between the obstacle and the robot, the controlling the display lamp group to display based on the obstacle information comprises:
controlling the display lamp group to display a corresponding color, brightness, and/or flicker frequency based on the distance between the obstacle and the robot.
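The distance-to-display mapping in claims 5 and 6 can be sketched as follows: closer obstacles yield brighter, faster-blinking light. The 2 m range and the brightness/frequency scales are illustrative assumptions, not values taken from the patent.

```python
# Hedged sketch of claims 5-6: map obstacle distance to lamp brightness
# and flicker frequency. Thresholds and ranges are assumed for illustration.

def display_for_distance(distance_m: float,
                         max_range_m: float = 2.0) -> tuple[int, float]:
    """Map obstacle distance to (brightness 0-255, flicker frequency in Hz)."""
    # clamp the distance into [0, max_range_m]
    d = min(max(distance_m, 0.0), max_range_m)
    closeness = 1.0 - d / max_range_m          # 1.0 at contact, 0.0 at range limit
    brightness = int(round(255 * closeness))   # nearer -> brighter
    flicker_hz = 0.5 + 4.5 * closeness         # nearer -> faster flicker
    return brightness, flicker_hz
```

At contact this yields full brightness and the fastest flicker; at or beyond the range limit the lamp dims to zero and flickers slowly.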
7. The method for prompting that a robot detects an obstacle according to claim 1, wherein the distribution angle of the display lamp group on the robot is not less than 90 degrees.
8. The method for prompting that a robot detects an obstacle according to claim 1, further comprising, before controlling the display lamp group to display based on the obstacle information:
acquiring the ambient light intensity of the environment where the robot is located;
and controlling the display lamp group to display a corresponding color and/or brightness based on the ambient light intensity.
9. A system for prompting that a robot detects an obstacle, comprising:
an obstacle detection device, configured to detect an obstacle and send the detected obstacle information to a control device;
the control device, configured to control a display lamp group to display according to the obstacle information;
and the display lamp group, configured to display light under the control of the control device.
10. The system for prompting that a robot detects an obstacle according to claim 9, further comprising:
a sensing device, configured to detect the ambient light intensity and send the ambient light intensity to the control device;
correspondingly, the control device is further configured to:
control the display lamp group to display a corresponding color and/or brightness based on the ambient light intensity,
wherein the obstacle information comprises at least one of: a size of the obstacle, an orientation of the obstacle relative to the robot, and a distance between the obstacle and the robot;
when the obstacle information comprises the size of the obstacle, the control device is further configured to:
control the display lamp group to display a corresponding length and/or area based on the size of the obstacle;
when the obstacle information comprises the orientation of the obstacle relative to the robot, the control device is further configured to:
control the display lamp group to display a corresponding position area based on the orientation of the obstacle relative to the robot, wherein the position area is an area corresponding to the orientation;
when the obstacle information comprises the distance between the obstacle and the robot, the control device is further configured to:
control the display lamp group to display a corresponding brightness and/or flicker frequency based on the distance between the obstacle and the robot.
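The system of claims 9 and 10, in which a detection device and a light sensor feed a control device that drives the lamp group, can be sketched as below. All class and method names, the 2 m detection range, and the 500 lux ambient threshold are hypothetical assumptions for illustration only.

```python
# Illustrative composition of the claimed system: obstacle information and
# ambient light intensity go into a control device, which computes a command
# for the display lamp group. Names and thresholds are assumptions.

from dataclasses import dataclass

@dataclass
class ObstacleInfo:
    size_m: float        # size of the obstacle
    bearing_deg: float   # orientation of the obstacle relative to the robot
    distance_m: float    # distance between the obstacle and the robot

class ControlDevice:
    def __init__(self, lamp_count: int = 24):
        self.lamp_count = lamp_count  # assumed number of lamps in the group

    def command(self, info: ObstacleInfo, ambient_lux: float) -> dict:
        """Compute a lamp-group command from obstacle and ambient-light input."""
        # nearer obstacle -> brighter display (assumed 2 m detection range)
        closeness = max(0.0, 1.0 - info.distance_m / 2.0)
        brightness = int(255 * closeness)
        # claim 10: adapt brightness to ambient light so the prompt stays
        # visible in bright surroundings (500 lux threshold is an assumption)
        if ambient_lux > 500.0:
            brightness = min(255, int(brightness * 1.5))
        # pick the lamp segment that corresponds to the obstacle's orientation
        segment = int(info.bearing_deg / 360.0 * self.lamp_count) % self.lamp_count
        return {"segment": segment, "brightness": brightness}
```

For an obstacle 1 m away at a bearing of 90 degrees, the control device lights the segment a quarter of the way around the ring at half brightness, and boosts the brightness when the room is brightly lit.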
CN202011119373.2A 2020-10-19 2020-10-19 Prompting method and prompting system for detecting obstacle by robot Pending CN114384903A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011119373.2A CN114384903A (en) 2020-10-19 2020-10-19 Prompting method and prompting system for detecting obstacle by robot


Publications (1)

Publication Number Publication Date
CN114384903A true CN114384903A (en) 2022-04-22

Family

ID=81194284

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011119373.2A Pending CN114384903A (en) 2020-10-19 2020-10-19 Prompting method and prompting system for detecting obstacle by robot

Country Status (1)

Country Link
CN (1) CN114384903A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5347273A (en) * 1993-01-14 1994-09-13 Kamyar Katiraie Adjustable, ultrasonic collision warning system
CN103587465A (en) * 2013-11-15 2014-02-19 观致汽车有限公司 Vehicle obstacle prompting system and method
US20150002620A1 (en) * 2012-03-09 2015-01-01 Lg Electronics Inc. Image display device and method thereof
CN205149654U (en) * 2015-11-12 2016-04-13 安徽理工大学 Car monitoring device that backs a car that moves ahead based on embedded linux


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
DU LONGYIN: "Obstacle Avoidance of an Intelligent Vacuum Cleaner Based on Chaotic Genetic Algorithms", Computer Measurement & Control, vol. 21, no. 3, pages 762-765 *
YANG ZHI, et al.: "Fault Diagnosis and Repair of Automotive Safety and Comfort Systems", vol. 1, Tongji University Press, page 75 *

Similar Documents

Publication Publication Date Title
US20200241687A1 (en) Switchable input modes for external display operation
US20230194055A1 (en) Wireless lighting control system
CN105868696B (en) A kind of method and apparatus detecting multilane lane line
EP3952269A1 (en) Camera module, and mobile terminal and control method therefor
CN106843730A (en) A kind of mobile terminal double screen changing method and mobile terminal
CN104871393A (en) Light control method, device and user terminal for user equipment
CN108604409B (en) Dynamically configurable traffic controller and method of use
CN109314727B (en) Method and device for controlling screen of mobile terminal
CN108072368B (en) Navigation method and device
CN107665079A (en) The display methods and display device of a kind of user interface
CN107451443A (en) Iris identification method and related product
CN104463105A (en) Guide board recognizing method and device
CN107122729A (en) Generation method, device and the mobile terminal of image
CN109153352B (en) Intelligent reminding method and device for automobile
WO2014022142A1 (en) Adaptive keyboard lighting
CN109104689B (en) Safety warning method and terminal
CN107707749A (en) A kind of communication information output intent, device, terminal and readable storage medium storing program for executing
WO2018214695A1 (en) Fingerprint acquisition method and related product
CN106506815A (en) A kind of application enables method and relevant device
CN104349543A (en) Apparatus for controlling lamp for vehicle and method for controlling lamp for vehicle using same
CN107168791A (en) Terminal Memory Optimize Method, device and mobile terminal under bright screen state
WO2022142713A1 (en) Method and apparatus for monitoring vehicle driving information
CN114384903A (en) Prompting method and prompting system for detecting obstacle by robot
CN106980525A (en) Using startup method, device and mobile terminal
CN110278028B (en) Information transmission method and related product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 518000 room 1601, building 2, Vanke Cloud City phase 6, Tongfa South Road, Xili community, Xili street, Nanshan District, Shenzhen City, Guangdong Province (16th floor, block a, building 6, Shenzhen International Innovation Valley)

Applicant after: Shenzhen Ledong robot Co.,Ltd.

Address before: 518000 room 1601, building 2, Vanke Cloud City phase 6, Tongfa South Road, Xili community, Xili street, Nanshan District, Shenzhen City, Guangdong Province (16th floor, block a, building 6, Shenzhen International Innovation Valley)

Applicant before: SHENZHEN LD ROBOT Co.,Ltd.
