CN112192562A - Drawing guiding method and chip of intelligent drawing robot and intelligent drawing robot

Drawing guiding method and chip of intelligent drawing robot and intelligent drawing robot

Info

Publication number
CN112192562A
CN112192562A (application number CN202010883623.3A)
Authority
CN
China
Prior art keywords
painting
intelligent
robot
path
line
Prior art date
Legal status
Granted
Application number
CN202010883623.3A
Other languages
Chinese (zh)
Other versions
CN112192562B (en)
Inventor
许登科
Current Assignee
Zhuhai Amicro Semiconductor Co Ltd
Original Assignee
Zhuhai Amicro Semiconductor Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhuhai Amicro Semiconductor Co Ltd filed Critical Zhuhai Amicro Semiconductor Co Ltd
Priority to CN202010883623.3A priority Critical patent/CN112192562B/en
Publication of CN112192562A publication Critical patent/CN112192562A/en
Application granted granted Critical
Publication of CN112192562B publication Critical patent/CN112192562B/en
Status: Active

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1656 - Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 - Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 - Manipulators not otherwise provided for

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a drawing guiding method for an intelligent drawing robot, a chip and the intelligent drawing robot. The drawing guiding method comprises the following steps: step 1, controlling the intelligent drawing robot to move along a pre-planned drawing path and obtaining a pre-configured drawing cycle; step 2, controlling the intelligent drawing robot to keep moving along the pre-planned drawing path within the current drawing cycle to draw a line, until it moves to a preset inflection point of the drawing path and completes one description of the drawing path, the line drawn in the current drawing cycle being parallel to the line drawn in the previous drawing cycle, and then entering step 3; step 3, judging whether the drawing description count currently tallied by the intelligent drawing robot has reached a preset number of descriptions; if so, stopping line drawing, otherwise returning to step 2. The method gives the intelligent robot the ability to draw lines with a sense of space and depth, so that it can trace contour lines more accurately and quickly than a human.

Description

Drawing guiding method and chip of intelligent drawing robot and intelligent drawing robot
Technical Field
The invention relates to the technical field of robot autonomous drawing, in particular to a drawing guiding method and chip of an intelligent drawing robot and the intelligent drawing robot.
Background
With the development of science and technology, intelligent robots are being applied in more and more fields, and robots that can draw are increasingly welcome. A sketch can express things in the visual world through monochrome lines, shaded surfaces and similar means, and can also express ideas, concepts, attitudes, emotions, fantasies, symbols and even abstract forms.
At present, when an intelligent robot is required in practice to draw an image with simple lines, it traces the outer edge of an object rather mechanically: the two ends of a stroke tend to be joined tightly together, so the object depicted appears isolated in the picture and loses its connection with the background of the same picture. The object drawn by the intelligent robot therefore has no sense of space, and it is difficult for the robot to make a drawn object convey spatial depth.
Disclosure of Invention
In order to overcome the technical defects, the invention discloses a drawing guiding method of an intelligent drawing robot, a chip and the intelligent drawing robot.
A drawing guiding method for an intelligent drawing robot comprises the following steps. Step 1: controlling the intelligent drawing robot to move along a pre-planned drawing path and obtaining a pre-configured drawing cycle; the pre-planned drawing path is obtained by converting the contour information of the outer edge of a pre-input drawing object. Step 2: controlling the intelligent drawing robot to keep moving along the pre-planned drawing path within the current drawing cycle to draw a line, until it moves to a preset inflection point of the drawing path and completes one description of the drawing path, the line drawn in the current drawing cycle being parallel to the line drawn in the previous drawing cycle, and then entering step 3. Lines drawn in different drawing cycles are arranged in parallel, the number of pixel points contained in the lines drawn within one drawing cycle decreases over time, and a preset inflection point marks a change in the trend of the pre-planned drawing path. Step 3: judging whether the drawing description count currently tallied by the intelligent drawing robot has reached a preset number of descriptions; if so, stopping line drawing, otherwise returning to step 2. The drawing description count represents the number of preset inflection points the intelligent drawing robot has traversed. Compared with the prior art, this technical scheme controls the robot to draw groups of mutually parallel lines according to the configured drawing cycles and uses the preset inflection points together with those cycles to render a sense of space and depth, giving the intelligent robot a capacity for drawing creation; once implemented, it can trace contour lines more accurately and quickly than a human.
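For illustration only, the control flow of steps 1 to 3 can be sketched as the following simulation over a polyline path. The function and variable names, the representation of the path and its preset inflection points, and the example values are editorial assumptions; the patent specifies only the loop structure (draw to the next preset inflection point, count one description, stop once the preset number of descriptions is reached).

```python
from typing import List, Tuple

Point = Tuple[float, float]

def run_drawing_guide(path: List[Point],
                      inflection_idx: List[int],
                      preset_description_count: int) -> List[List[Point]]:
    """Return the drawn line segments, one stretch per traversed preset inflection point."""
    drawn: List[List[Point]] = []
    description_count = 0                      # "drawing description count" of step 3
    start = 0
    for idx in inflection_idx:                 # step 2: draw up to the next preset inflection point
        if description_count >= preset_description_count:
            break                              # step 3: preset number reached -> stop line drawing
        drawn.append(path[start:idx + 1])      # the line drawn during this stretch
        description_count += 1                 # one description counted per inflection point
        start = idx                            # continue from the inflection point just reached
    return drawn

if __name__ == "__main__":
    square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0), (0.0, 0.0)]
    segments = run_drawing_guide(square, inflection_idx=[1, 2, 3, 4],
                                 preset_description_count=3)
    print(len(segments), "segments drawn")     # -> 3 segments drawn
```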
Further, between step 1 and step 2 the method also comprises: step 11, controlling the intelligent drawing robot to collect the number of pixel points at its current position and judging whether the collected number is greater than a first predetermined value; if so, entering step 12, otherwise entering step 2; and step 12, controlling the intelligent drawing robot to move along the pre-planned drawing path to a position where the pixel points have a different gray value, and then entering step 2. In this technical scheme the robot obtains richer light and dark levels in the drawn line groups when depicting an object, enhancing the sense of space of the picture.
Further, when the drawing description count currently tallied by the intelligent drawing robot has not reached the preset number of descriptions, it is first judged whether the number of collected pixel points is smaller than a second predetermined value; if so, the line drawing of the next drawing cycle is started and the method returns to step 11, otherwise the method returns to step 2 and keeps moving along the pre-planned drawing path within the current drawing cycle; the second predetermined value is smaller than the first predetermined value. In this technical scheme the robot is controlled, according to the drawing description count and the number of pixel points collected in real time, to distinguish the boundary between the drawn object and the picture background at a reasonable inflection point of the drawing path, enriching the sense of space and layering of the drawn object.
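The two thresholds can be read as a single three-way decision, sketched below. The threshold values and the source of the pixel count are assumptions; the disclosure fixes only that the second predetermined value is smaller than the first and the branching shown in the comments.

```python
def next_action(pixel_count: int, first_value: int, second_value: int) -> str:
    """Decide what the robot does next from the pixel count sampled at its position."""
    assert second_value < first_value          # relation required by the scheme above
    if pixel_count > first_value:
        # Dense lines: the robot is on a dark (backlit) face of the object, so it first
        # moves along the path to a position with a different gray value (step 12).
        return "move_to_different_gray_value"
    if pixel_count < second_value:
        # Sparse lines: the current cycle has rendered enough, so the line drawing
        # of the next drawing cycle is started.
        return "start_next_cycle"
    # Otherwise keep moving along the pre-planned path within the current cycle.
    return "continue_current_cycle"

print(next_action(pixel_count=120, first_value=100, second_value=40))   # move_to_different_gray_value
```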
Further, the start point of the drawing path traveled by the intelligent drawing robot in the current drawing cycle is not connected with the end point of the drawing path traveled in the previous drawing cycle, and the end point of the drawing path traveled within one drawing cycle includes a preset inflection point. After several drawing cycles the object drawn by the intelligent drawing robot therefore shows the spatial sense of its contour, the drawn lines appear relaxed and free, and the user can readily grasp the overall feel of the whole picture.
Further, when the connecting-line distance between the start pixel point of the line drawn in the current drawing cycle and the end pixel point of the line drawn in the previous drawing cycle is smaller than or equal to a preset threshold, the two lines are regarded as overlapping and the line drawn in the current drawing cycle is erased; the line drawn in the current drawing cycle and the line drawn in the previous drawing cycle are never joined end to end. Noise points produced along the drawing path are thus removed, reducing the storage space the robot consumes for drawn lines.
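A minimal sketch of this overlap test follows, assuming lines are stored as lists of (x, y) pixel coordinates; the point format and the threshold value are illustrative, not taken from the patent.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def dedupe_line(prev_line: List[Point], cur_line: List[Point],
                preset_threshold: float) -> List[Point]:
    """Return cur_line, or an empty list if it is regarded as overlapping the previous line."""
    if not prev_line or not cur_line:
        return cur_line
    end_prev, start_cur = prev_line[-1], cur_line[0]
    gap = math.dist(end_prev, start_cur)       # connecting-line distance between the two pixel points
    if gap <= preset_threshold:
        return []                              # overlap: erase the line drawn in the current cycle
    return cur_line                            # keep it; the two lines remain unconnected

print(dedupe_line([(0.0, 0.0), (1.0, 0.0)],
                  [(1.01, 0.02), (2.0, 0.0)],
                  preset_threshold=0.05))      # -> [] (treated as noise and erased)
```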
Further, the intelligent drawing robot marks the drawing path it has traversed in real time in the environment map it has built; the lines drawn by the robot are also collected and captured and, according to the successively executed drawing cycles, are reflected around the contour of the drawing path marked in real time in the environment map. This makes the object drawn by the intelligent drawing robot more intuitive to inspect.
A chip stores the program code corresponding to the above drawing guiding method and is used to guide and control an intelligent drawing robot to draw, quickly and accurately, contour lines with spatial layering along the pre-planned drawing path. On the premise of guaranteeing the line-drawing effect, this greatly improves the efficiency of drawing-path computation, reduces the storage space consumed by drawn lines, and improves the performance and smoothness of the robot's drawing.
An intelligent drawing robot has the above chip built in. The intelligent drawing robot is a floor-sweeping robot: a fixed arm is mounted on a side cover of its base, a camera is mounted at a matching angle to collect pixel-point information of the drawn lines, and the fixed arm clamps a line pen or pencil. The robot invokes the electrical guidance signal that the chip outputs according to the drawing guiding method, and this signal controls the fixed arm to clamp the line pen or pencil and draw lines along the pre-planned drawing path; the ink marks produced by the line pen or pencil change gradually within each drawing cycle.
Compared with the prior art, the intelligent drawing robot disclosed by this technical scheme, while executing the drawing guiding method disclosed above, uses the fixed arm on its base to clamp a line pen or pencil whose ink marks change gradually, draws the contour lines of the pre-input drawing object, and combines the preset inflection points with the drawing cycles to express the form of the object and the relation between its light and dark faces. This enriches the sense of space and layering of the drawn object, builds up the skeleton of a sketch work layer by layer, and gives the robot a drawing creation capability superior to that of a human.
Further, the intelligent drawing robot remotely sends information on the drawing path and the drawn lines to a client on a mobile intelligent terminal, so that the user can obtain, from the touch screen of the terminal, the navigation map built in real time and the drawing path marked on it; the robot is also used to receive instructions converted from lines the user draws on the touch screen with a finger or a dedicated touch-screen tool. The mobile intelligent terminal disclosed by this technical scheme can thus interact visually with the user, improving the interactivity of drawing.
Further, after the information on the drawing path and the drawn lines has been sent to the mobile intelligent terminal, the client on the terminal reads and parses the pixel-point information of the lines drawn in each drawing cycle and the position points of the drawing path traveled by the intelligent drawing robot, converts the parsed pixel points and position points into coordinate information, displays it on the touch screen and stores it in the client for quick retrieval by the user. The user monitors the robot's drawing process per drawing cycle rather than watching the drawing effect in real time, which keeps the processing load on the chip low and the working efficiency high, and lets the user obtain the drawn line groups rendered in light and dark levels more directly.
Drawings
Fig. 1 is a flowchart of a drawing guidance method of an intelligent drawing robot according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to overcome the technical defects, the invention discloses a drawing guiding method of an intelligent drawing robot, a chip and the intelligent drawing robot. The drawing guiding method comprises the following steps:
step 1, controlling the intelligent drawing robot to move along a pre-planned drawing path and obtaining a pre-configured drawing cycle; the pre-planned drawing path is obtained by converting the contour information of the outer edge of a pre-input drawing object;
step 2, controlling the intelligent drawing robot to move along the pre-planned drawing path within the current drawing cycle to draw a line, until it moves to a preset inflection point of the drawing path and completes one description of the drawing path, the line drawn in the current drawing cycle being parallel to the line drawn in the previous drawing cycle, and then entering step 3; the number of pixel points contained in the lines drawn within one drawing cycle decreases over time; a preset inflection point marks a change in the trend of the pre-planned drawing path;
step 3, judging whether the drawing description count currently tallied by the intelligent drawing robot has reached a preset number of descriptions; if so, stopping line drawing, otherwise returning to step 2; the drawing description count represents the number of preset inflection points traversed by the intelligent drawing robot, and each time the robot traverses a preset inflection point the count is increased by one.
Compared with the prior art, this technical scheme controls the robot to draw groups of mutually parallel lines at the preset inflection points in combination with the pixel counts, and thereby renders line combinations with a sense of space and depth; it gives the intelligent robot a capacity for drawing creation and, once implemented, lets it trace contour lines more accurately and quickly than a human.
As an embodiment, a drawing guiding method of an intelligent drawing robot, as shown in fig. 1, includes the following steps:
s1, controlling the intelligent painting robot to move along a pre-planned painting path to obtain a pre-configured painting period, and then entering S2; the pre-planned drawing path is obtained by converting the pre-input outline edge contour information of the drawing object, and the pre-configured drawing cycle is distributed according to the pre-input outline edge contour information of the drawing object, so that the intelligent drawing robot finishes line drawing of the outline edge contour information of the drawing object in sequence according to different configured drawing cycles.
Step S2: controlling the intelligent drawing robot to collect the number of pixel points at its current position, and then entering step S3. The number of pixel points collected at the current position includes the pixel points of the line the robot is drawing in real time and/or of the lines it has drawn previously.
Step S3: judging whether the number of collected pixel points is greater than a first predetermined value; if so, entering step S4, otherwise entering step S5. Lines whose pixel count exceeds the first predetermined value are densely distributed in the picture; the first predetermined value is therefore used to mark a dark face, that is, a backlit face, of the drawing object.
Step S4: controlling the intelligent drawing robot to move along the pre-planned drawing path to a position where the pixel points have a different gray value, and then entering step S5. By judging the number of collected pixel points together with the difference in gray value, the robot is controlled to draw line groups with light and dark levels, enhancing the sense of space of the picture.
It should be noted that the drawing guiding method described in the foregoing steps S2 to S4 is performed between step 1 and step 2 of the foregoing embodiment.
Step S5: controlling the intelligent drawing robot to keep moving along the pre-planned drawing path within the current drawing cycle to draw a line, until it moves to a preset inflection point of the drawing path and completes one description of the drawing path, the line drawn in the current drawing cycle being parallel to the line drawn in the previous drawing cycle; then entering step S6. In step S5, whenever the intelligent drawing robot reaches a preset inflection point of the drawing path within the current drawing cycle, the currently tallied drawing description count is increased by one; since the robot may traverse several preset inflection points along the pre-planned drawing path within one drawing cycle, several drawing descriptions may be counted within a single cycle. Lines drawn in different drawing cycles are arranged in parallel, the number of pixel points contained in the lines drawn within one drawing cycle decreases over time, and a preset inflection point marks a change in the trend of the pre-planned drawing path.
Step S6: judging whether the drawing description count currently tallied by the intelligent drawing robot has reached the preset number of descriptions; if so, entering step S9 to stop line drawing, otherwise entering step S7. The drawing description count represents the number of preset inflection points of the drawing path the robot has traversed; this number is the condition for the robot to finish drawing, and the preset number of descriptions is set as the threshold for forming line groups of alternating light and dark on the picture, which reduces the area to be drawn and saves drawing time.
Step S7: judging whether the number of collected pixel points is smaller than a second predetermined value; if so, entering step S8 to start the line drawing of the next drawing cycle, otherwise returning to step S5 and keeping moving along the pre-planned drawing path within the current drawing cycle. Under the control of this drawing guiding method, the areas of layered appearance enclosed by lines drawn in different drawing cycles help the intelligent drawing robot reproduce a sense of reality consistent with the edge contour, tonal shading and other features of the drawing object, and the change of trend of the pre-planned drawing path between two adjacent drawing cycles is used to handle the joining, interpenetration, combination and progression of the object's turning contour lines. The second predetermined value is set so that the pigment is sufficient to render the edge contour and the dark face of the object.
In the foregoing steps the second predetermined value is smaller than the first predetermined value. When the number of pixel points collected in the current drawing cycle is greater than the second predetermined value, the robot keeps moving along the pre-planned drawing path within the current drawing cycle towards a new preset inflection point, so that the contour lines of the drawing object are drawn layer by layer. Each time the intelligent drawing robot traverses a preset inflection point, the drawing description count is incremented, until the count and the pixel number of the drawn lines satisfy the layered description requirement of the drawing object; at that point the robot does not care which drawing cycle it is in, so more than one drawing description may be counted within a single cycle. In this way the robot is controlled, according to the drawing description count and the number of pixel points collected in real time, to distinguish the boundary between the drawn object and the picture background at a reasonable inflection point of the drawing path, showing the alternating light and dark of the drawn object's contour features.
Step S8: controlling the intelligent drawing robot to start the line-drawing state of the next drawing cycle. Here the number of collected pixel points is smaller than the second predetermined value: along part of the pre-planned drawing path the robot has fully expressed the edge contour, dark face and bright face of the drawn object and so completed the layered line-drawing task of the current drawing cycle; it then returns to step S2 to collect the number of pixel points at the start of the new drawing path. The end point of the drawing path traveled within one drawing cycle includes the preset inflection point, but the start point of the path traveled in the current drawing cycle is not connected with the end point of the path traveled in the previous drawing cycle; equivalently, the end point of the path traveled in the current drawing cycle is not connected with the start point of the path traveled in the next drawing cycle. This avoids tracing the outer edge of the object too mechanically, prevents a single stroke whose head and tail are joined tightly together, keeps the depicted object in contact with the background of the same picture, and gives the object drawn by the intelligent drawing robot a sense of space.
It should be noted that the drawing paths traveled by the intelligent drawing robot in different drawing cycles are not continuous, so the drawn lines are not continuous either. Therefore, after a number of preset inflection points have been traversed and the number of collected pixel points has fallen below the second predetermined value, the object drawn by the intelligent drawing robot shows the spatial sense of its contour, the drawn lines appear relaxed and free, and the user can grasp the overall feel of the whole picture.
It should be noted that the drawing guiding method described in the foregoing steps S7 to S8 is executed after step 3 and before returning to step 2 in the foregoing embodiment.
Step S9: stopping line drawing. This indicates that the intelligent drawing robot has traversed enough preset inflection points on the pre-planned drawing path (no fewer than the preset number of drawing descriptions) to describe the contour lines of the current light and dark levels.
In this embodiment, the number of pixel points contained in the lines drawn within one drawing cycle decreases over time, and the intelligent drawing robot may traverse several preset inflection points along the pre-planned drawing path within one drawing cycle. Therefore, before the robot detects that the number of currently collected pixel points has fallen below the second predetermined value, that is, while that number is still greater than or equal to the second predetermined value, it can still traverse a preset inflection point and add one to the currently tallied drawing description count; whether to stop line drawing can thus be decided before the pixel count is judged to be smaller than the second predetermined value, and the path-inflection meaning carried by the currently tallied count speeds up the drawing of lines within the same drawing cycle. After implementing these steps the intelligent drawing robot can trace contour lines more accurately and quickly than a human.
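The S2-S9 loop of this embodiment can be condensed into the following sketch, which shows the two points stressed above: several drawing descriptions may be counted within one drawing cycle, and the stop test of step S6 runs before the new-cycle test of step S7. The pixel sampler is a simulated stand-in for the camera, and all numeric values are editorial assumptions, not values from the patent.

```python
from typing import Callable, Tuple

def drawing_session(preset_description_count: int,
                    first_value: int,
                    second_value: int,
                    sample_pixels: Callable[[], int]) -> Tuple[int, int]:
    """Run the S2-S9 loop; return (description count, number of drawing cycles used)."""
    assert second_value < first_value
    description_count, cycle = 0, 1
    while True:
        pixels = sample_pixels()               # S2: collect the pixel count at the current position
        if pixels > first_value:
            pass                               # S3/S4: dark (backlit) face -> shift to a different gray value
        description_count += 1                 # S5: the next preset inflection point has been reached
        if description_count >= preset_description_count:
            return description_count, cycle    # S6 -> S9: stop line drawing
        if pixels < second_value:
            cycle += 1                         # S7 -> S8: start the next drawing cycle
        # otherwise S7 -> S5: keep moving within the current drawing cycle

def make_sampler() -> Callable[[], int]:
    """Simulated camera: the sampled pixel count shrinks as drawing goes on."""
    state = {"n": 0}
    def sample() -> int:
        state["n"] += 1
        return max(10, 130 - 25 * state["n"])
    return sample

counts, cycles = drawing_session(preset_description_count=6, first_value=100,
                                 second_value=40, sample_pixels=make_sampler())
print("descriptions counted:", counts, "| drawing cycles used:", cycles)   # -> 6 and 3
```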
Preferably, when the connecting-line distance between the pixel point at the start of the line drawn in the current drawing cycle and the pixel point at the end of the line drawn in the previous drawing cycle is smaller than or equal to a preset threshold, the two lines are regarded as overlapping and the line drawn in the current drawing cycle is erased; the preset threshold is set to remove noise points produced along the drawing path and to reduce the storage space the robot consumes for drawn lines. The lines drawn in the current and previous drawing cycles are not joined end to end but form an ordered combination of lines distributed at intervals, which helps in matching the outer and inner contour lines of the drawing object. On the basis that the drawing path itself is not adjusted, this embodiment optimizes the drawn lines according to the distance between the lines of two adjacent drawing cycles, reducing the demands on the robot's hardware.
Preferably, the intelligent drawing robot marks the drawing path it has traversed in real time in the environment map it has built; the lines it draws are also collected and captured and, according to the successively executed drawing cycles, are reflected around the contour of the drawing path marked in real time in the environment map, which makes it easy for the user to compare the robot's drawing with a hand drawing.
The embodiment of the invention also discloses a chip that stores the program code corresponding to the drawing guiding method disclosed in the above embodiment and is configured to guide and control the intelligent drawing robot to draw, quickly and accurately, contour lines with spatial layering along the pre-planned drawing path. On the premise of guaranteeing the line-drawing effect, this greatly improves the efficiency of drawing-path computation, reduces the storage space consumed by drawn lines, and improves the performance and smoothness of the robot's drawing.
Based on the foregoing embodiment, this embodiment further discloses an intelligent drawing robot with the chip embedded in it. The intelligent drawing robot is a floor-sweeping robot: a fixed arm is mounted on a side cover of its base, a camera is mounted at a matching angle to collect pixel-point information of the drawn lines, and the fixed arm clamps a line pen or pencil. The robot invokes the electrical guidance signal output by the chip and corresponding to the drawing guiding method, and this signal controls the fixed arm to clamp the line pen or pencil and draw lines along the pre-planned drawing path of the foregoing embodiment. The ink marks produced by the line pen or pencil while drawing along the pre-planned path change gradually, so that the lines drawn by the intelligent drawing robot in different drawing cycles are arranged in parallel and the number of pixel points contained in the lines drawn within one drawing cycle decreases over time. Compared with the prior art, on the basis of executing the drawing guiding method disclosed in the above embodiment, the intelligent drawing robot disclosed here uses the gradually changing ink marks of the line pen or pencil clamped by the fixed arm on its base to draw the contour lines of the pre-input drawing object, and combines the preset inflection points with the drawing cycles to express the form of the object and the relation between its light and dark faces; this enriches the sense of space and layering of the drawn object, builds up the skeleton of a sketch work layer by layer, and gives the robot a drawing creation capability superior to that of a human.
Based on the above embodiment, the intelligent drawing robot is configured to remotely send information on the drawing path and the drawn lines to a client on a mobile intelligent terminal, so that the user can obtain, from the touch screen of the mobile intelligent terminal, the navigation map constructed in real time and the drawing path marked on it; the robot is also configured to receive instructions converted from lines the user draws on the touch screen with a finger or a dedicated touch-screen tool, giving a novel and interesting drawing style with a low entry threshold. The mobile intelligent terminal disclosed in this preferred embodiment can draw the pixel points of the contour lines around the drawing path line by line, which reduces the area to be drawn, saves drawing time and improves drawing efficiency, and it can interact visually with the user, improving the interactivity of drawing.
Preferably, after the information on the drawing path and the drawn lines has been sent to the mobile intelligent terminal, the client on the terminal reads and parses the pixel-point information of the lines drawn in each drawing cycle and the position points of the drawing path traveled by the intelligent drawing robot, converts the parsed pixel points and position points into coordinate information, displays the converted coordinates on the touch screen of the mobile intelligent terminal, and stores them in the client. This lets the user retrieve them quickly and monitor the robot's drawing process per drawing cycle instead of watching the drawing effect in real time, which keeps the processing load on the chip low and the working efficiency high, and lets the user obtain the drawn line groups rendered in light and dark levels more directly.
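The coordinate conversion mentioned here might look like the following sketch, which scales points from the robot's map frame (meters, y up) to touch-screen pixels (y down). The frame conventions, map size and screen resolution are assumptions; the patent states only that the parsed pixel points and position points are converted to coordinate information before display.

```python
from typing import List, Tuple

MapPoint = Tuple[float, float]        # meters in the robot's environment map, y pointing up
ScreenPoint = Tuple[int, int]         # pixels on the mobile terminal's touch screen, y pointing down

def to_screen(points: List[MapPoint],
              map_size_m: Tuple[float, float],
              screen_px: Tuple[int, int]) -> List[ScreenPoint]:
    """Scale map-frame points into touch-screen pixel coordinates."""
    sx = screen_px[0] / map_size_m[0]
    sy = screen_px[1] / map_size_m[1]
    # Flip the y axis so the map's upward direction maps to the screen's downward pixel rows.
    return [(round(x * sx), round(screen_px[1] - y * sy)) for x, y in points]

path_points = [(0.0, 0.0), (1.2, 0.5), (2.4, 1.0)]
print(to_screen(path_points, map_size_m=(4.0, 3.0), screen_px=(1080, 720)))
# -> [(0, 720), (324, 600), (648, 480)]
```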
In the embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form. The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.

Claims (10)

1. A drawing guiding method of an intelligent drawing robot is characterized by comprising the following steps:
step 1, controlling the intelligent drawing robot to move along a pre-planned drawing path and obtaining a pre-configured drawing cycle; the pre-planned drawing path is obtained by converting the contour information of the outer edge of a pre-input drawing object;
step 2, controlling the intelligent drawing robot to move along the pre-planned drawing path within the current drawing cycle to draw a line, until it moves to a preset inflection point of the drawing path and completes one description of the drawing path, the line drawn in the current drawing cycle being parallel to the line drawn in the previous drawing cycle, and then entering step 3; the number of pixel points contained in the lines drawn within one drawing cycle decreases over time; a preset inflection point marks a change in the trend of the pre-planned drawing path;
step 3, judging whether the drawing description count currently tallied by the intelligent drawing robot has reached a preset number of descriptions; if so, stopping line drawing, otherwise returning to step 2; the drawing description count represents the number of preset inflection points traversed by the intelligent drawing robot.
2. A drawing guidance method according to claim 1, further comprising, between said step 1 and said step 2:
step 11, controlling the intelligent drawing robot to collect the number of pixel points at its current position and judging whether the collected number is greater than a first predetermined value; if so, entering step 12, otherwise entering step 2;
and step 12, controlling the intelligent drawing robot to move along the pre-planned drawing path to a position where the pixel points have a different gray value, and then entering step 2.
3. A drawing guiding method according to claim 2, characterized in that when the drawing description count currently tallied by the intelligent drawing robot has not reached the preset number of descriptions, it is first judged whether the number of collected pixel points is smaller than a second predetermined value; if so, the line drawing of the next drawing cycle is started and the method returns to step 11, otherwise the method returns to step 2 and keeps moving along the pre-planned drawing path within the current drawing cycle; wherein the second predetermined value is smaller than the first predetermined value.
4. A drawing guiding method according to claim 3, characterized in that the start point of the drawing path traveled by the intelligent drawing robot in the current drawing cycle is not connected with the end point of the drawing path traveled in the previous drawing cycle, wherein the end point of the drawing path traveled by the intelligent drawing robot in one drawing cycle includes the preset inflection point.
5. A drawing guiding method according to claim 4, characterized in that when the connecting-line distance between the start pixel point of the line drawn in the current drawing cycle and the end pixel point of the line drawn in the previous drawing cycle is smaller than or equal to a preset threshold, the line drawn in the current drawing cycle and the line drawn in the previous drawing cycle are regarded as overlapping, and the line drawn in the current drawing cycle is erased;
and the line drawn in the current drawing cycle and the line drawn in the previous drawing cycle are not joined end to end.
6. A drawing guiding method according to claim 5, characterized in that the intelligent drawing robot marks the drawing path it has traversed in real time in the constructed environment map, wherein the lines drawn by the intelligent drawing robot are also collected and captured and, according to the successively executed drawing cycles, are reflected around the contour of the drawing path marked in real time in the environment map.
7. A chip, characterized in that, the chip is used to store the program code corresponding to the drawing guiding method of any one of claims 1 to 6.
8. An intelligent drawing robot, characterized in that the chip of claim 7 is built into the intelligent drawing robot, the intelligent drawing robot is a floor-sweeping robot, a fixed arm is mounted on a side cover of the base of the floor-sweeping robot, a camera is mounted at a matching angle to collect pixel-point information of the drawn lines, and the fixed arm clamps a line pen or a pencil; the robot is used for invoking the electrical guidance signal output by the chip of claim 7 and corresponding to the drawing guiding method of any one of claims 1 to 6, and the electrical guidance signal controls the fixed arm to clamp the line pen or pencil and draw lines along a pre-planned drawing path;
wherein the ink marks produced by the line pen or pencil in the course of drawing along the pre-planned drawing path change gradually.
9. The intelligent drawing robot according to claim 8, characterized in that the intelligent drawing robot is configured to remotely send information on the drawing path and the drawn lines to a client on a mobile intelligent terminal, so that a user can obtain, from the touch screen of the mobile intelligent terminal, the navigation map constructed in real time and the drawing path marked on it, and the intelligent drawing robot is further configured to receive instructions converted from lines drawn by the user on the touch screen of the mobile intelligent terminal with a finger or a dedicated touch-screen tool.
10. The intelligent drawing robot according to claim 9, characterized in that after the information on the drawing path and the drawn lines has been sent to the mobile intelligent terminal, the client on the mobile intelligent terminal reads and parses the pixel-point information of the lines drawn in each drawing cycle and the position points of the drawing path traveled by the intelligent drawing robot, converts the parsed pixel points and position points into coordinate information, displays the coordinate information on the touch screen of the mobile intelligent terminal, and stores it in the client.
CN202010883623.3A 2020-08-28 2020-08-28 Drawing guiding method and chip of intelligent drawing robot and intelligent drawing robot Active CN112192562B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010883623.3A CN112192562B (en) 2020-08-28 2020-08-28 Drawing guiding method and chip of intelligent drawing robot and intelligent drawing robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010883623.3A CN112192562B (en) 2020-08-28 2020-08-28 Drawing guiding method and chip of intelligent drawing robot and intelligent drawing robot

Publications (2)

Publication Number Publication Date
CN112192562A true CN112192562A (en) 2021-01-08
CN112192562B CN112192562B (en) 2021-08-24

Family

ID=74006497

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010883623.3A Active CN112192562B (en) 2020-08-28 2020-08-28 Drawing guiding method and chip of intelligent drawing robot and intelligent drawing robot

Country Status (1)

Country Link
CN (1) CN112192562B (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1132423A (en) * 1997-07-10 1999-02-02 Hitachi Zosen Corp Cable treating mechanism
CN108182664A (en) * 2017-12-26 2018-06-19 努比亚技术有限公司 A kind of image processing method, mobile terminal and computer readable storage medium
CN109523603A (en) * 2018-10-24 2019-03-26 广东智媒云图科技股份有限公司 A kind of drawing method based on texturing methods or types of texture strokes style, device, terminal device and storage medium
CN109993810A (en) * 2019-03-19 2019-07-09 广东智媒云图科技股份有限公司 A kind of intelligence sketch drawing method, device, storage medium and terminal device
CN110555894A (en) * 2019-07-19 2019-12-10 广东智媒云图科技股份有限公司 Intelligent robot painting method, electronic equipment and storage medium
CN110570480A (en) * 2019-07-19 2019-12-13 广东智媒云图科技股份有限公司 Sketch drawing method of drawing robot, electronic equipment and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114471286A (en) * 2021-12-17 2022-05-13 苏州镁伽科技有限公司 Control method and device for mobile device, storage medium and blending equipment
CN114471286B (en) * 2021-12-17 2024-05-10 苏州镁伽科技有限公司 Control method and device of mobile device for mixing equipment and mixing equipment

Also Published As

Publication number Publication date
CN112192562B (en) 2021-08-24

Similar Documents

Publication Publication Date Title
CN107728792B (en) Gesture recognition-based augmented reality three-dimensional drawing system and drawing method
CN109199240B (en) Gesture control-based sweeping robot control method and system
CN112198962B (en) Method for interacting with virtual reality equipment and virtual reality equipment
CN112192562B (en) Drawing guiding method and chip of intelligent drawing robot and intelligent drawing robot
CN105867630A (en) Robot gesture recognition method and device and robot system
CN105500370B (en) A kind of robot off-line teaching programing system and method based on body-sensing technology
CN103246290B (en) A kind of cloud platform control method and system thereof
CN106681354A (en) Flight control method and flight control device for unmanned aerial vehicles
CN104407694A (en) Man-machine interaction method and device combining human face and gesture control
Guo et al. Skywork-daVinci: A novel CPSS-based painting support system
CN102810015B (en) Input method based on space motion and terminal
CN112192563B (en) Painting control method and chip of intelligent painting robot and intelligent painting robot
CN106272409A (en) Mechanical arm control method based on gesture identification and system
CN106023308A (en) Somatosensory interaction rapid three-dimensional modeling auxiliary system and method thereof
CN108318050B (en) Central controller and the system and method for utilizing the central controller mobile navigation
CN110142769B (en) ROS platform online mechanical arm demonstration system based on human body posture recognition
CN105096387A (en) Intelligent three-dimensional processing method of two-dimensional sketch
WO2014048170A1 (en) Method and device for in-air gesture identification applied in terminal
CN105014675A (en) Intelligent mobile robot visual navigation system and method in narrow space
CN105892668B (en) Apparatus control method and device
CN110221689A (en) A kind of space drawing method based on augmented reality
CN106468993A (en) The control method of virtual reality terminal unit and device
CN205750354U (en) A kind of expression robot
CN109741418B (en) Low polygon style drawing acquisition method and device
CN109102571B (en) Virtual image control method, device, equipment and storage medium thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 519000 2706, No. 3000, Huandao East Road, Hengqin new area, Zhuhai, Guangdong

Patentee after: Zhuhai Yiwei Semiconductor Co.,Ltd.

Country or region after: China

Address before: Room 105-514, No.6 Baohua Road, Hengqin New District, Zhuhai City, Guangdong Province

Patentee before: AMICRO SEMICONDUCTOR Co.,Ltd.

Country or region before: China