CN111179148B - Data display method and device - Google Patents
- Publication number: CN111179148B (application CN201911398395.4A)
- Authority
- CN
- China
- Prior art keywords
- image
- walked
- path
- projection
- path information
- Prior art date
- Legal status: Active
Classifications
- G06T1/0014 — Image feed-back for automatic industrial control, e.g. robot with camera
- G06T3/40 — Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T5/80 — Geometric correction
- G06T2207/10004 — Still image; Photographic image
- G06T2210/21 — Collision detection, intersection
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Robotics (AREA)
- Manipulator (AREA)
- Processing Or Creating Images (AREA)
Abstract
The application is applicable to the technical field of data processing and provides a data display method. The data display method is applied to a robot provided with a projection device and comprises the following steps: acquiring an image comprising the path information to be walked; and sending the image comprising the path information to be walked to the projection device, and instructing the projection device to project the image. By this method, the path prompt effect corresponding to the path to be walked of the robot can be effectively improved.
Description
Technical Field
The application belongs to the technical field of data processing, and particularly relates to a data display method and device.
Background
At present, when a robot is walking, voice prompts can be used to remind people of the robot's path to be walked, so as to avoid collisions with them. However, if the robot is in a noisy environment, the prompt voice is easily drowned out by surrounding sounds, so the path prompt effect corresponding to the robot's path to be walked is poor.
Disclosure of Invention
The embodiments of the present application provide a data display method and a data display device, which can solve the current problem that, in a noisy environment, the path prompt effect corresponding to the path to be walked of a robot is poor.
In a first aspect, an embodiment of the present application provides a data display method, where the data display method is applied to a robot, and the robot is provided with a projection device, and the data display method includes:
acquiring an image comprising path information to be walked;
and sending the image comprising the path information to be walked to the projection device, and instructing the projection device to project the image comprising the path information to be walked.
In a first possible implementation manner of the first aspect, the width of the image comprising the path information to be walked equals the path width occupied by the robot on the path to be walked. Correspondingly, the acquiring of the image comprising the path information to be walked includes:
and generating an image comprising the path information to be walked according to the path width occupied by the robot on the path to be walked and the path information to be walked.
In a second possible implementation manner of the first aspect, the path information to be walked includes shape information of a path to be walked, and correspondingly, the acquiring an image including the path information to be walked includes:
acquiring an image comprising shape information of a path to be walked;
correspondingly, the sending the image including the path information to be walked to the projection device includes:
and sending the image comprising the shape information of the path to be walked to the projection device.
In a third possible implementation manner of the first aspect, after the acquiring the image including the path information to be walked, the method includes:
correcting the image comprising the path information to be walked according to an image scaling factor;
correspondingly, the sending the image including the path information to be walked to the projection device includes:
and sending the corrected image comprising the information of the path to be walked to the projection device.
In a fourth possible implementation manner of the first aspect, based on the third possible implementation manner, an included angle between the optical axis of the lens of the projection device and the ground is smaller than 90 degrees. Correspondingly, before the correcting of the image comprising the path information to be walked according to the image scaling factor, the method includes:
and obtaining an image scaling factor according to a first projection image and a second projection image, wherein the first projection image is a projection image projected on the ground by the projection device, and the second projection image is a projection image projected on a plane perpendicular to the optical axis of the lens by the projection device.
In a fifth possible implementation manner of the first aspect, the image scaling factor includes an image length scaling factor and an image width scaling factor. Correspondingly, the obtaining of the image scaling factor according to the first projection image and the second projection image includes:
an image length scaling factor is determined from a maximum length of the first projection image and a maximum length of the second projection image, and an image width scaling factor is determined from a maximum width of the first projection image and a maximum width of the second projection image.
In a sixth possible implementation manner of the first aspect, before the sending the image including the path information to be walked to the projection device, the method includes:
acquiring environmental parameters;
adjusting image characteristics of the image comprising the path information to be walked according to the environmental parameters;
correspondingly, the sending the image including the path information to be walked to the projection device includes:
and sending the image with the image characteristics adjusted and including the information of the path to be walked to the projection device.
In a second aspect, an embodiment of the present application provides a data display device, where the data display device is applied to a robot, and the robot is provided with a projection device, and the data display device includes:
an acquisition unit configured to acquire an image including path information to be walked;
the indication unit is used for sending the image comprising the path information to be walked to the projection device and indicating the projection device to project the image comprising the path information to be walked.
In a third aspect, an embodiment of the present application provides a robot, including: a projection device, a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the data display method.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the data display method as described above.
In a fifth aspect, an embodiment of the present application provides a computer program product which, when run on a robot, causes the robot to perform the steps of the data display method according to any one of the first aspect.
It will be appreciated that the advantages of the second to fifth aspects can be found in the relevant description of the first aspect, and are not repeated here.
Compared with the prior art, the embodiments of the present application have the following beneficial effects: because the robot can acquire the image comprising the path information to be walked, send it to the projection device, and instruct the projection device to project it, people can learn the robot's path to be walked from the projected image, and this information is not affected by surrounding sounds. The path prompt effect corresponding to the path to be walked of the robot is therefore effectively improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments or the description of the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a data display method according to an embodiment of the application;
FIG. 2 is a schematic illustration of an arrow diagram provided by an embodiment of the present application;
FIG. 3 is a flowchart of a data display method according to another embodiment of the present application;
FIG. 4 is a schematic view of a first projection image showing a divergence phenomenon according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a designated camera according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a data display device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a robot according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in the present description and the appended claims, the term "if" may be interpreted as "when", "once", "in response to determining", or "in response to detecting", depending on the context. Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as meaning "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present specification and the appended claims, the terms "first," "second," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
Embodiment one:
fig. 1 shows a flowchart of a first data display method according to an embodiment of the present application, where the data display method is applied to a robot, and the robot is provided with a projection device, and the details are as follows:
step S101, an image including path information to be walked is acquired.
Specifically, step S101 includes: generating the image comprising the path information to be walked, or acquiring the image comprising the path information to be walked from a designated terminal.
As an example and not by way of limitation, the designated terminal may be a server, and the path information to be walked may be global path information to be walked or local path information to be walked.
Optionally, since the path width occupied by the robot on the path to be walked affects the movement of people, and in order to enable people to intuitively understand that width and conveniently plan their own movement paths, the width of the image comprising the path information to be walked is set to the path width occupied by the robot on the path to be walked. Correspondingly, step S101 includes: generating the image comprising the path information to be walked according to the path width occupied by the robot on the path to be walked and the path information to be walked.
As an example and not by way of limitation, assume the robot occupies a path width of 0.5 meters on the path to be walked, i.e. the width of the image comprising the path information to be walked is 0.5 meters. Correspondingly, the generating of the image comprising the path information to be walked is specifically: generating an image which comprises the path information to be walked and has a width of 0.5 meters.
As an example and not by way of limitation, the generating an image including the path information to be walked according to the path width occupied by the robot on the path to be walked and the path information to be walked specifically includes: if the path width occupied by the robot on the path to be walked is smaller than or equal to the actual width of the path to be walked, generating an image comprising the path information to be walked according to the path width occupied by the robot on the path to be walked and the path information to be walked.
In some embodiments, in practice the path width occupied by the robot on the path to be walked may be greater than the actual width of the path to be walked. Therefore, in order to ensure that the robot can walk smoothly, the generating of the image comprising the path information to be walked includes: if the path width occupied by the robot on the path to be walked is greater than the actual width of the path to be walked, re-planning the path to be walked, and generating an image comprising the path information corresponding to the re-planned path according to that path information and the path width occupied by the robot on the re-planned path, wherein the actual width of the re-planned path is greater than or equal to the path width occupied by the robot.
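The width-matching and re-planning check described above can be sketched as follows; the function name, pixel scale, and grid representation are illustrative assumptions, not part of the patent:

```python
# Hypothetical sketch: rasterize the path to be walked into a simple 2D grid
# whose lit strip is exactly as wide as the robot's footprint on the path.

def generate_path_image(path_length_m, robot_width_m, actual_path_width_m,
                        px_per_m=100):
    """Return a grid (list of rows) of 1s marking the path strip, or None
    when the robot does not fit and the path must be re-planned."""
    if robot_width_m > actual_path_width_m:
        return None                          # caller re-plans the path
    h = int(path_length_m * px_per_m)        # image length follows the path
    w = int(robot_width_m * px_per_m)        # image width = occupied path width
    return [[1] * w for _ in range(h)]

img = generate_path_image(2.0, 0.5, 1.0)     # 0.5 m robot on a 1 m wide path
```

The key design point mirrored from the embodiment is that the image width is derived from the robot's occupied width, so the projected strip shows people exactly how much of the path the robot will take up.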
Step S102, transmitting the image including the path information to be walked to the projection device, and instructing the projection device to project the image including the path information to be walked.
Specifically, the instructing the projection device to project the image including the path information to be walked includes: and sending a projection starting instruction to the projection device, wherein the projection starting instruction is used for instructing the projection device to project the image comprising the path information to be walked.
Optionally, in order to enable the user to learn the shape of the robot's path to be walked through the projected image, the path information to be walked includes shape information of the path to be walked. Correspondingly, step S101 includes: acquiring an image comprising the shape information of the path to be walked. Correspondingly, the sending of the image comprising the path information to be walked to the projection device includes: sending the image comprising the shape information of the path to be walked to the projection device.
In some embodiments, the shape information of the path to be walked may be embodied in the image as text. For example, if the shape information is specifically an "S-shaped" path, the image comprising the shape information may contain the text "S-shaped path". Because text is a simple form of presentation, the user can take in the shape information of the path to be walked in a short time, which greatly improves the transmission efficiency of that information.
In some embodiments, the shape information of the path to be walked may be embodied in the image as a graphic. Since a graphic is more vivid, the shape information of the path to be walked leaves a deeper impression on the user, which greatly improves the transmission effect of that information.
By way of example and not limitation, the shape information of the path to be walked may be embodied as an arrow graphic, as shown in fig. 2, comprising an arrow end and a line end. Before the acquiring of the image comprising the shape information of the path to be walked, the method includes: determining arrow graphic parameters according to the shape information of the path to be walked. Correspondingly, the acquiring of the image comprising the shape information of the path to be walked includes: generating the image according to the arrow graphic parameters, wherein the arrow graphic parameters include any of the following: the pointing of the arrow end, and line end shape parameters, which may include, but are not limited to, the line end length and/or a bending degree parameter of the line end. The pointing of the arrow end may represent the direction of the robot's destination relative to the robot.
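The arrow graphic parameters above can be sketched as a small data structure derived from the planned path; the class, field names, and waypoint representation are illustrative assumptions:

```python
from dataclasses import dataclass
import math

@dataclass
class ArrowParams:
    """Hypothetical container for the arrow graphic parameters."""
    heading_deg: float     # arrow-end pointing: destination direction vs. robot
    line_length_m: float   # line end length
    bend_deg: float = 0.0  # bending degree of the line end (0 = straight)

def params_from_path(waypoints):
    """Derive arrow parameters from a list of (x, y) waypoints."""
    (x0, y0), (x1, y1) = waypoints[0], waypoints[-1]
    heading = math.degrees(math.atan2(y1 - y0, x1 - x0))
    return ArrowParams(heading, math.hypot(x1 - x0, y1 - y0))

p = params_from_path([(0.0, 0.0), (0.0, 2.0)])  # destination 2 m straight ahead
```

A renderer would then draw the arrow end at `heading_deg` and the line end at `line_length_m`, bent by `bend_deg` for curved paths.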
Optionally, in order to avoid adverse effects of the environment on the projection effect of the image, before step S102 the method includes: acquiring environmental parameters; and adjusting image characteristics of the image comprising the path information to be walked according to the environmental parameters. Correspondingly, the sending of the image comprising the path information to be walked to the projection device includes: sending the image comprising the path information to be walked, with its image characteristics adjusted, to the projection device.
By way of example and not limitation, the environmental parameters may include at least one of: illumination intensity, ground humidity. The image features may include at least one of: color characteristics of the image, texture characteristics of the image.
Specifically, the adjusting the image feature of the image including the path information to be walked according to the environmental parameter includes: and if the value corresponding to the environmental parameter is not in the preset value range, adjusting the image characteristics of the image comprising the information of the path to be walked according to the environmental parameter.
In some embodiments, if the value corresponding to the environmental parameter is within the preset value range, step S102 is performed directly, without adjusting the image characteristics according to the environmental parameter, so as to improve the image projection efficiency.
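The in-range check and adjustment above can be sketched as follows; the threshold values, gain model, and function name are illustrative assumptions, not values from the patent:

```python
# Hypothetical sketch of the environmental-parameter check: adjust pixel
# brightness only when ambient illumination is outside a preset range.

LUX_RANGE = (50.0, 500.0)   # assumed preset value range for illumination

def adjust_for_environment(pixel, lux):
    """Return the (possibly adjusted) pixel value for projection."""
    lo, hi = LUX_RANGE
    if lo <= lux <= hi:
        return pixel                      # in range: project unchanged (S102)
    gain = 1.5 if lux > hi else 0.8       # bright room: boost; dark room: dim
    return min(255, int(pixel * gain))
```

The same pattern extends to the other listed parameters and features, e.g. ground humidity driving a change in the image's color characteristics.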
In the embodiments of the present application, the robot can acquire the image comprising the path information to be walked, send it to the projection device, and instruct the projection device to project it. People can thus learn the robot's path to be walked from the projected image, unaffected by surrounding sounds, so the path prompt effect corresponding to the path to be walked of the robot is effectively improved.
Embodiment two:
Fig. 3 is a schematic flowchart of a second data display method according to an embodiment of the present application. The data display method is applied to a robot provided with a projection device. Step S301 of this embodiment is the same as step S101 of the first embodiment and is not repeated here:
step S301, an image including path information to be walked is acquired.
Step S302, correcting the image comprising the information of the path to be walked according to the image scaling factor.
Specifically, the region of interest in the image including the path information to be walked is corrected according to an image scaling factor.
Optionally, an included angle between the optical axis of the lens of the projection device and the ground is smaller than 90 degrees, and correspondingly, before the step S302, the method includes: and obtaining an image scaling factor according to a first projection image and a second projection image, wherein the first projection image is a projection image projected on the ground by the projection device, and the second projection image is a projection image projected on a plane perpendicular to the optical axis of the lens by the projection device.
When the included angle between the optical axis of the lens of the projection device and the ground is smaller than 90 degrees, the first projection image may exhibit a divergence phenomenon (similar to keystone distortion). For example, as shown in fig. 4, the arrow graphic on the left of fig. 4 is a second projection image in which no divergence occurs, and the arrow graphic on the right is a first projection image in which divergence occurs; the upper end of the right arrow graphic is the end far from the projection device, and the lower end is the end near it. Because the image scaling factors are obtained from the first projection image and the second projection image, the projection effect of the corrected image comprising the path information to be walked in the subsequent step can be made the same as that of a projection image without divergence.
Specifically, an image scaling factor is obtained from the image shape parameter of the first projection image and the image shape parameter of the second projection image.
As an example and not by way of limitation, as shown in fig. 5, the designated camera is a camera suspended directly above the middle of the first projection image; the first projection image may be captured by this camera. The image shape parameters include any one of the following: the length of the image, the width of the image, and the radian of the rounded corners in the image.
Optionally, the image scaling factor includes an image length scaling factor and an image width scaling factor. Correspondingly, the obtaining of the image scaling factor according to the first projection image and the second projection image includes: determining the image length scaling factor according to the maximum length of the first projection image and the maximum length of the second projection image, and determining the image width scaling factor according to the maximum width of the first projection image and the maximum width of the second projection image. Because features such as the maximum length and maximum width of the two projection images are obvious, they can be determined without particularly complex operations, which effectively improves the efficiency of determining the image scaling factor.
By way of example and not limitation, a quotient of the maximum length of the first projection image divided by the maximum length of the second projection image is determined as an image length scaling factor, and a quotient of the maximum width of the first projection image divided by the maximum width of the second projection image is determined as an image width scaling factor.
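The two quotients and the resulting correction can be sketched as follows; the function names and the pre-shrink correction model are illustrative assumptions consistent with the quotients described above:

```python
# Hypothetical sketch of steps S302's scaling factors and correction.

def scaling_factors(first_len, first_wid, second_len, second_wid):
    """first_*: ground projection (diverged); second_*: projection on a plane
    perpendicular to the lens optical axis (undistorted). Returns the
    (length, width) scaling factors as the quotients described above."""
    return first_len / second_len, first_wid / second_wid

def corrected_size(img_len, img_wid, len_factor, wid_factor):
    # pre-shrink the image so that, after divergence on the ground, it lands
    # at the size the undistorted projection would have had
    return img_len / len_factor, img_wid / wid_factor

len_f, wid_f = scaling_factors(3.0, 2.0, 2.0, 1.0)  # ground image stretched
```

Dividing by the factors before projection compensates for the stretch the oblique projection introduces, matching the stated goal of reproducing the undiverged projection effect.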
Step S303, the corrected image comprising the information of the path to be walked is sent to the projection device.
As an example and not by way of limitation, assuming that the corrected image including the path information to be walked is an image a, the image a is transmitted to the projection device.
In the embodiments of the present application, the robot can correct the image comprising the path information to be walked according to the image scaling factor and project the corrected image, so the path prompt effect corresponding to the path to be walked of the robot can be effectively improved.
Embodiment III:
corresponding to the second embodiment, fig. 6 shows a schematic structural diagram of a data display device provided by the embodiment of the present application, where the data display device is applied to a robot, and the robot is provided with a projection device, and for convenience of explanation, only the portion related to the embodiment of the present application is shown.
The data display device includes: an acquisition unit 601 and an instruction unit 602.
The acquiring unit 601 is configured to acquire an image including information of a path to be walked.
Optionally, since the path width occupied by the robot on the path to be walked may affect the movement of people, and in order to enable people to intuitively understand that width and conveniently plan their own movement paths, the width of the image comprising the path information to be walked is the path width occupied by the robot on the path to be walked. Correspondingly, when acquiring the image comprising the path information to be walked, the acquiring unit 601 is specifically configured to: generate the image comprising the path information to be walked according to the path width occupied by the robot on the path to be walked and the path information to be walked.
The indicating unit 602 is configured to send the image including the path information to be walked to the projection device, and instruct the projection device to project the image including the path information to be walked.
Optionally, in order to enable the user to learn the shape of the robot's path to be walked through the projected image, the path information to be walked includes shape information of the path to be walked. When acquiring the image comprising the path information to be walked, the acquiring unit 601 is specifically configured to: acquire an image comprising the shape information of the path to be walked. Correspondingly, when sending the image comprising the path information to be walked to the projection device, the indicating unit 602 is specifically configured to: send the image comprising the shape information of the path to be walked to the projection device.
By way of example and not limitation, the shape information of the path to be walked may be embodied in the image as an arrow graphic comprising an arrow end and a line end. The data display device then further includes an arrow graphic parameter determining unit, configured to: before the acquiring unit 601 acquires the image comprising the shape information of the path to be walked, determine arrow graphic parameters according to the shape information of the path to be walked. Correspondingly, when acquiring the image comprising the shape information of the path to be walked, the acquiring unit 601 is specifically configured to: generate the image according to the arrow graphic parameters, wherein the arrow graphic parameters include any of the following: the pointing of the arrow end, and line end shape parameters, which may include, but are not limited to, the line end length and/or a bending degree parameter of the line end. The pointing of the arrow end may represent the direction of the robot's destination relative to the robot.
Optionally, the data display device further includes: and an image characteristic adjusting unit.
The image characteristic adjusting unit is used for: acquiring environmental parameters before the indication unit 602 performs the sending of the image including the path information to be walked to the projection device; and adjusting image characteristics of the image including the path information to be walked according to the environmental parameters. Correspondingly, when the indication unit 602 performs the sending of the image including the path information to be walked to the projection device, it is specifically configured to: send the image including the path information to be walked, with its image characteristics adjusted, to the projection device.
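As one possible reading of the environmental adjustment described above, the sketch below raises a color feature (brightness) of the projected image under strong ambient illumination so that the projection remains visible. The function name, threshold, and gain are assumed values for illustration, not values from the patent.

```python
def adjust_image_brightness(brightness, illumination_lux,
                            bright_threshold=300.0, gain=1.5):
    # Illustrative only: under bright ambient light (an environmental
    # parameter), boost the projected image's brightness (a color
    # feature), clamped to the 8-bit range. Threshold and gain are
    # assumed values, not taken from the patent.
    if illumination_lux > bright_threshold:
        return min(255, int(brightness * gain))
    return brightness
```

An analogous rule could map the ground-humidity parameter to a texture feature, e.g. switching to a higher-contrast pattern on wet, reflective floors.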
Optionally, the data display device further includes: and a correction unit.
The correction unit is used for: after the acquiring unit 601 performs the acquiring of the image including the path information to be walked, correcting the image including the path information to be walked according to an image scaling factor; correspondingly, when the indication unit 602 performs the sending of the image including the path information to be walked to the projection device, the indication unit is specifically configured to: and sending the corrected image comprising the information of the path to be walked to the projection device.
Optionally, an included angle between an optical axis of a lens of the projection device and the ground is smaller than 90 degrees, and the data display device further includes: and a scaling factor acquisition unit.
The scaling factor acquisition unit is used for: before the correction unit performs the correction on the image including the path information to be walked according to the image scaling factor, the image scaling factor is acquired according to a first projection image and a second projection image, wherein the first projection image is a projection image projected on the ground by the projection device, and the second projection image is a projection image projected on a plane perpendicular to the optical axis of the lens by the projection device.
Optionally, the image scaling factor includes an image length scaling factor and an image width scaling factor. Correspondingly, when performing the acquiring of the image scaling factor according to the first projection image and the second projection image, the scaling factor acquisition unit is specifically configured to: determine the image length scaling factor according to the maximum length of the first projection image and the maximum length of the second projection image, and determine the image width scaling factor according to the maximum width of the first projection image and the maximum width of the second projection image.
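The scaling-factor step can be sketched under one plausible reading: because the lens optical axis meets the ground at less than 90 degrees, the ground projection is stretched relative to the perpendicular-plane projection, and the ratio of the two projection images' maximum dimensions gives factors that pre-compensate the stretch. The ratio direction (second over first) and all names are assumptions.

```python
def image_scaling_factors(first_max_len, first_max_wid,
                          second_max_len, second_max_wid):
    # First image: projected on the ground; second image: projected on
    # a plane perpendicular to the lens optical axis. Dividing the
    # second image's maximum dimensions by the first's yields factors
    # below 1 along the stretched direction (the ratio direction is an
    # assumed reading of the patent).
    return (second_max_len / first_max_len,
            second_max_wid / first_max_wid)

def correct_image(img_len, img_wid, len_factor, wid_factor):
    # Pre-scale the image containing the path information so that the
    # oblique projection restores its intended proportions on the ground.
    return img_len * len_factor, img_wid * wid_factor
```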
In the embodiment of the application, the robot can acquire an image including the path information to be walked and send it to the projection device, thereby instructing the projection device to project the image. People can thus learn the robot's path to be walked from the projected image, and the prompt is not affected by surrounding sounds; therefore, the path prompting effect corresponding to the path to be walked of the robot is effectively improved.
Embodiment four:
Fig. 7 is a schematic structural diagram of a robot according to an embodiment of the present application. As shown in fig. 7, the robot 7 of this embodiment includes: at least one processor 70 (only one is shown in fig. 7), a memory 71, and a computer program 72 stored in the memory 71 and executable on the at least one processor 70. The processor 70 implements the steps in any of the data presentation method embodiments described above when executing the computer program 72.
The robot is provided with a projection device. The robot 7 may include, but is not limited to, the processor 70 and the memory 71. It will be appreciated by those skilled in the art that fig. 7 is merely an example of the robot 7 and does not limit the robot 7, which may include more or fewer components than shown, combine certain components, or have different components; for example, it may also include input-output devices, network access devices, etc.
The processor 70 may be a central processing unit (Central Processing Unit, CPU), and may also be another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor, etc.
The memory 71 may in some embodiments be an internal storage unit of the robot 7, such as a hard disk or a memory of the robot 7. The memory 71 may in other embodiments also be an external storage device of the robot 7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash Card (Flash Card) or the like, which are provided on the robot 7. Further, the memory 71 may also include both an internal memory unit and an external memory device of the robot 7. The memory 71 is used for storing an operating system, application programs, boot loader (BootLoader), data, other programs, etc., such as program codes of the computer program. The memory 71 may also be used for temporarily storing data that has been output or is to be output.
It should be noted that, because the content of information interaction and execution process between the above units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
Embodiments of the present application also provide a computer readable storage medium storing a computer program which, when executed by a processor, implements steps for implementing the various method embodiments described above.
Embodiments of the present application provide a computer program product enabling a robot to carry out the steps of the various method embodiments described above when the computer program product is run on the robot.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program, which may be stored in a computer readable storage medium and, when executed by a processor, implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, etc. The computer readable medium may include at least: any entity or device capable of carrying the computer program code to the robot, a recording medium, a computer memory, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunication signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In some jurisdictions, in accordance with legislation and patent practice, computer readable media may not include electrical carrier signals and telecommunication signals.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts that are not described or illustrated in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; e.g., the division of the modules or units is merely a logical functional division, and there may be other divisions in actual implementation; e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, and may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.
Claims (7)
1. A data display method, wherein the data display method is applied to a robot provided with a projection device, the data display method comprising:
acquiring an image comprising path information to be walked;
obtaining an image scaling factor according to a first projection image and a second projection image, wherein the first projection image is a projection image projected on the ground by the projection device, and the second projection image is a projection image projected on a plane perpendicular to the optical axis of the lens by the projection device;
correcting the image comprising the path information to be walked according to the image scaling factor;
acquiring environmental parameters, wherein the environmental parameters comprise at least one of illumination intensity and ground humidity;
adjusting image characteristics of the image comprising the path information to be walked according to the environment parameters, wherein the image characteristics comprise at least one of color characteristics and texture characteristics;
and sending the corrected image, which includes the path information to be walked and whose image characteristics have been adjusted, to the projection device, and instructing the projection device to project the image including the path information to be walked.
2. The data display method according to claim 1, wherein the width of the image including the path information to be walked is a path width occupied by the robot on the path to be walked, and correspondingly, the acquiring the image including the path information to be walked includes:
and generating an image comprising the path information to be walked according to the path width occupied by the robot on the path to be walked and the path information to be walked.
3. The data display method according to claim 1, wherein the path information to be walked includes shape information of a path to be walked, and correspondingly, the acquiring an image including the path information to be walked includes:
acquiring an image comprising shape information of a path to be walked;
correspondingly, the sending of the corrected image, which includes the path information to be walked and whose image characteristics have been adjusted, to the projection device includes:
sending the corrected image, which includes the shape information of the path to be walked and whose image characteristics have been adjusted, to the projection device.
4. The data presentation method of claim 1, wherein the image scaling factor comprises: the image length scaling factor and the image width scaling factor, respectively, the obtaining the image scaling factor according to the first projection image and the second projection image, includes:
an image length scaling factor is determined from a maximum length of the first projection image and a maximum length of the second projection image, and an image width scaling factor is determined from a maximum width of the first projection image and a maximum width of the second projection image.
5. A data presentation device, characterized in that the data presentation device is applied to a robot, the robot is provided with a projection device, the data presentation device comprises:
an acquisition unit configured to acquire an image including path information to be walked;
an image scaling factor obtaining unit, configured to obtain an image scaling factor according to a first projection image and a second projection image, where the first projection image is a projection image projected on the ground by the projection device, and the second projection image is a projection image projected on a plane perpendicular to an optical axis of the lens by the projection device;
the correction unit is used for correcting the image comprising the path information to be walked according to the image scaling factor;
the image characteristic adjusting unit is used for acquiring environmental parameters, wherein the environmental parameters comprise at least one of illumination intensity and ground humidity; and adjusting image features of the image including the path information to be walked according to the environmental parameters, the image features including at least one of color features and texture features;
and the indicating unit is used for sending the corrected image, which includes the path information to be walked and whose image characteristics have been adjusted, to the projection device, and for instructing the projection device to project the image including the path information to be walked.
6. A robot comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the robot is provided with a projection device, and that the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 4.
7. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to any one of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911398395.4A CN111179148B (en) | 2019-12-30 | 2019-12-30 | Data display method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111179148A CN111179148A (en) | 2020-05-19 |
CN111179148B true CN111179148B (en) | 2023-09-08 |
Family
ID=70655884
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911398395.4A Active CN111179148B (en) | 2019-12-30 | 2019-12-30 | Data display method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111179148B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103400392A (en) * | 2013-08-19 | 2013-11-20 | 山东鲁能智能技术有限公司 | Binocular vision navigation system and method based on inspection robot in transformer substation |
WO2018120011A1 (en) * | 2016-12-30 | 2018-07-05 | 深圳前海达闼云端智能科技有限公司 | Projected image correction method and device, and robot |
CN108303972A (en) * | 2017-10-31 | 2018-07-20 | 腾讯科技(深圳)有限公司 | The exchange method and device of mobile robot |
CN109782962A (en) * | 2018-12-11 | 2019-05-21 | 中国科学院深圳先进技术研究院 | A kind of projection interactive method, device, system and terminal device |
CN109996050A (en) * | 2017-12-29 | 2019-07-09 | 深圳市优必选科技有限公司 | Control method and control device of projection robot |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9840003B2 (en) * | 2015-06-24 | 2017-12-12 | Brain Corporation | Apparatus and methods for safe navigation of robotic devices |
US20190289206A1 (en) * | 2018-03-15 | 2019-09-19 | Keiichi Kawaguchi | Image processing apparatus, image capturing system, image processing method, and recording medium |
Non-Patent Citations (1)
Title |
---|
Ding Doujian; Zhao Xiaolin; Wang Changgen; Gao Guangen; Kou Lei. Vision-based autonomous localization and obstacle detection method for robots. Journal of Computer Applications. 2019, (Issue 06), full text. *
Also Published As
Publication number | Publication date |
---|---|
CN111179148A (en) | 2020-05-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7038853B2 (en) | Image processing methods and devices, electronic devices and computer-readable storage media | |
US20200021729A1 (en) | Control method, control device and computer device | |
US20200093460A1 (en) | Method, device, ultrasonic probe and terminal for adjusting detection position | |
CN109166156B (en) | Camera calibration image generation method, mobile terminal and storage medium | |
CN110850961B (en) | Calibration method of head-mounted display device and head-mounted display device | |
CN109889730B (en) | Prompting method and device for adjusting shooting angle and electronic equipment | |
CN109672871B (en) | White balance information synchronization method, white balance information synchronization device and computer readable medium | |
CN108682030B (en) | Face replacement method and device and computer equipment | |
CN108882025B (en) | Video frame processing method and device | |
JP2019191145A (en) | Identification method for charging stand, device, robot, and computer readable storage | |
CN111427417B (en) | Time acquisition method and device and electronic equipment | |
CN111145315A (en) | Drawing method, drawing device, toy robot and readable storage medium | |
CN111179148B (en) | Data display method and device | |
CN104077768A (en) | Method and device for calibrating fish-eye lens radial distortion | |
CN114758055A (en) | Three-dimensional model generation method, XR device and storage medium | |
EP4434488A1 (en) | Scanning apparatus, connecting method and apparatus therefor, electronic device, and medium | |
CN116134476A (en) | Plane correction method and device, computer readable medium and electronic equipment | |
CN115661493B (en) | Method, device, equipment and storage medium for determining object pose | |
CN116245731A (en) | Method, device, equipment and medium for splicing scanning data | |
CN108965715B (en) | Image processing method, mobile terminal and computer readable storage medium | |
US10664948B2 (en) | Method and apparatus for processing omni-directional image | |
CN110839151A (en) | Game projection optimization method and related device | |
CN113110414B (en) | Robot meal delivery method, meal delivery robot and computer readable storage medium | |
CN115601316A (en) | Image processing method, image processing device, electronic equipment and computer readable storage medium | |
CN105229706A (en) | Image processing apparatus, image processing method and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CP03 | Change of name, title or address | ||
Address after: Unit 7-11, 6th Floor, Building B2, No. 999-8 Gaolang East Road, Wuxi Economic Development Zone, Wuxi City, Jiangsu Province, China 214000
Patentee after: Youdi Robot (Wuxi) Co.,Ltd.
Country or region after: China
Address before: 5D, Building 1, Tingwei Industrial Park, No. 6 Liufang Road, Xingdong Community, Xin'an Street, Bao'an District, Shenzhen City, Guangdong Province
Patentee before: UDITECH Co.,Ltd.
Country or region before: China