CN111203862B - Data display method and device, electronic equipment and storage medium - Google Patents

Data display method and device, electronic equipment and storage medium

Info

Publication number
CN111203862B
Authority
CN
China
Prior art keywords
sensor
coordinate system
data
image
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010012703.1A
Other languages
Chinese (zh)
Other versions
CN111203862A (en)
Inventor
杜辉辉
张禹翔
陈侃
秦宝星
程昊天
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Gaussian Automation Technology Development Co Ltd
Original Assignee
Shanghai Gaussian Automation Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Gaussian Automation Technology Development Co Ltd filed Critical Shanghai Gaussian Automation Technology Development Co Ltd
Priority to CN202010012703.1A priority Critical patent/CN111203862B/en
Publication of CN111203862A publication Critical patent/CN111203862A/en
Application granted granted Critical
Publication of CN111203862B publication Critical patent/CN111203862B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/08Programme-controlled manipulators characterised by modular constructions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/087Controls for manipulators by means of sensing devices, e.g. viewing or touching devices for sensing other physical parameters, e.g. electrical or chemical properties
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures

Abstract

The application discloses a data display method and device, electronic equipment and a storage medium. The display method comprises the following steps: obtaining contour parameters of a sensor, establishing a first coordinate system according to the contour parameters, obtaining detection data of the sensor, generating first coordinates of the detection data in the first coordinate system, and generating an image of the sensor data according to the first coordinates. In the data display method according to the embodiment of the application, the contour parameters of the sensor are acquired to establish a coordinate system, the detection data acquired by the sensor are used to generate corresponding coordinate points in the coordinate system according to their relative positions, and finally the image of the sensor data is generated according to the coordinate points of the detection data, so that the working state of the sensor can be judged according to the image of the sensor data. Therefore, the user can accurately grasp the working state of the sensor, and corresponding countermeasures can be taken according to the working state of the sensor.

Description

Data display method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of robots, and in particular, to a method and an apparatus for displaying data, an electronic device, and a storage medium.
Background
Generally, a robot includes various sensors such as an image sensor and a laser sensor, and the robot can generate corresponding action commands according to the environmental information data acquired by the sensors. In the related art, the data provided by a sensor are difficult for a user to interpret, so the working state of the sensor cannot be confirmed; enabling the user to accurately grasp the working state of the sensor has therefore become an urgent problem to be solved.
Disclosure of Invention
In view of the above, the present invention is directed to solving, at least to some extent, one of the problems in the related art. Therefore, an object of the present invention is to provide a data display method, a display device, an electronic apparatus, and a storage medium.
The data display method of the embodiment of the application comprises the following steps:
acquiring contour parameters of the sensor;
establishing a first coordinate system according to the contour parameters;
acquiring detection data of the sensor;
generating first coordinates of the detection data in the first coordinate system; and
an image of the sensor data is generated from the first coordinates.
According to the data display method, the contour parameters of the sensor are acquired to establish a coordinate system, the detection data acquired by the sensor are used to generate corresponding coordinate points in the coordinate system according to their relative positions, and finally the image of the sensor data is generated according to the coordinate points of the detection data, so that the working state of the sensor can be judged according to the image of the sensor data. In this way, the user can accurately grasp the working state of the sensor and take corresponding countermeasures according to it.
In some embodiments, the generating first coordinates of the detection data in the first coordinate system comprises:
removing the detection data outside of a region threshold and the detection data overlapping within the region threshold to generate target detection data; and
generating first coordinates of the target detection data in the first coordinate system.
In this way, redundant detection data is optimized to remove overlapping detection data and detection data outside the region threshold, so that only the necessary detection data is retained to generate the first coordinates in the first coordinate system.
In some embodiments, the generating the image of the sensor data from the first coordinates comprises:
converting the first coordinates to generate second coordinates corresponding to the first coordinates in a second coordinate system; and
generating an image of the sensor data from the second coordinates.
In this manner, by converting the coordinate points of the detection data of the sensor in the first coordinate system into the coordinate points in the second coordinate system, an image of the sensor data can be generated from the coordinate points of the detection data of the sensor in the second coordinate system.
In some embodiments, the converting the first coordinates to generate second coordinates corresponding to the first coordinates in a second coordinate system includes:
and carrying out y-axis overturning on the first coordinate system to obtain the second coordinate system.
In this way, the second coordinate system is obtained by flipping the first coordinate system about the y-axis, so that each first coordinate has a corresponding second coordinate in the second coordinate system.
In some embodiments, the display method further comprises:
an image of the sensor data is rendered on an image display area.
In this manner, an image of the sensor data may be displayed on the image display area to facilitate the user to understand the operational status of the sensor.
In some embodiments, the image of the sensor data includes an image of the sensor contour parameters and an image of the detection data, and the rendering of the image of the sensor data on the image display area includes:
if the image of the detection data and the image of the sensor contour parameters have an overlapping area, generating prompt information in the image display area.
In this way, when the working state of the sensor is abnormal, the image display area can generate prompt information to prompt a user.
The data display device according to the embodiment of the present application includes:
the first acquisition module is used for acquiring contour parameters of the sensor;
the building module is used for building a first coordinate system according to the contour parameters;
the second acquisition module is used for acquiring detection data of the sensor;
a first generating module for generating first coordinates of the detection data in the first coordinate system; and
a second generation module to generate an image of the sensor data from the first coordinates.
In this way, through the arrangement of the first acquisition module, the building module, the second acquisition module, the first generating module and the second generating module included in the display device, an image can be generated from the sensor data, so that the working state of the sensor can be determined according to the image.
In some embodiments, the first generating module comprises:
a removal unit configured to remove the detection data outside a region threshold and the detection data overlapping within the region threshold to generate target detection data;
a first generation unit configured to generate first coordinates of the target detection data in the first coordinate system.
In this way, the first generating module is arranged so that it can optimize redundant detection data by removing overlapped detection data and detection data outside the region threshold, thereby retaining only the necessary detection data to generate the first coordinates in the first coordinate system.
In some embodiments, the second generating module comprises:
the conversion unit is used for converting the first coordinate to generate a second coordinate corresponding to the first coordinate in a second coordinate system;
a second generation unit for generating an image of the sensor data from the second coordinates.
In this way, the second generation module is configured to convert the coordinate point of the detection data of the sensor in the first coordinate system into a coordinate point in the second coordinate system, and then generate an image of the sensor data according to the coordinate point of the detection data of the sensor in the second coordinate system.
In some embodiments, the conversion unit may be further configured to perform y-axis flipping on the first coordinate system to obtain the second coordinate system.
In this way, the conversion unit flips the first coordinate system about the y-axis to generate the second coordinate system, so that each first coordinate directly yields a corresponding second coordinate in the second coordinate system.
In some embodiments, the display device further comprises a rendering module for rendering an image of the sensor data on an image display area.
Therefore, through the arrangement of the drawing module, the image of the sensor data can be displayed on the image display area, and a user can know the working state of the sensor conveniently.
In some embodiments, the rendering module may be further configured to generate prompt information in the image display area if there is an overlapping area between the image of the detection data and the image of the sensor contour parameters.
Therefore, when the working state of the sensor is abnormal, the drawing module can generate prompt information in the image display area to prompt a user.
The electronic device of the embodiment of the application comprises a processor, and the processor is used for:
acquiring contour parameters of the sensor;
establishing a first coordinate system according to the contour parameters;
acquiring detection data of the sensor;
generating first coordinates of the detection data in the first coordinate system; and
an image of the sensor data is generated from the first coordinates.
In this manner, the processor may generate a display image of the sensor data to facilitate a user to accurately understand the operational state of the sensor.
The electronic device of the embodiment of the application comprises:
one or more processors, memory, and
one or more programs, wherein the one or more programs are stored in the memory and executed by the one or more processors, the programs comprising instructions for performing the method of displaying data of any of the above.
In this manner, the program implementing the data display method is executed by the processor so that a displayable image can be generated from the sensor data, allowing the user to judge the working state of the sensor from the image of the sensor data.
The computer-readable storage medium of the embodiments of the present application stores computer-executable instructions that, when executed by one or more processors, cause the processors to perform a method of displaying data as described in any one of the above.
In this way, the data display method of any of the above embodiments can be realized by the processor executing the computer-executable instructions, so that a displayable image can be generated from the sensor data, and the working state of the sensor can be judged according to the image of the sensor data.
In the data display method, the data display device, the electronic device and the storage medium according to the embodiments of the application, the contour parameters of the sensor are acquired to establish a coordinate system, the detection data acquired by the sensor are used to generate corresponding coordinate points in the coordinate system according to their relative positions, and finally the image of the sensor data is generated according to the coordinate points of the detection data, so that the working state of the sensor can be judged according to the image of the sensor data. Therefore, the user can accurately grasp the working state of the sensor so as to take corresponding measures according to the working state of the sensor.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flow chart of a data display method according to some embodiments of the present disclosure.
FIG. 2 is a block diagram of a data display device according to some embodiments of the present application.
FIG. 3 is a block diagram of an electronic device according to some embodiments of the present application.
FIG. 4 is a schematic view of a first coordinate system of certain embodiments of the present application.
FIG. 5 is an image of sensor data according to certain embodiments of the present application.
Fig. 6 is a schematic flow chart of a data display method according to some embodiments of the present disclosure.
Fig. 7 is a further flowchart of a method of displaying data according to some embodiments of the present application.
FIG. 8 is a schematic diagram of a map coordinate system in accordance with certain embodiments of the present application.
FIG. 9 is a schematic diagram of coordinate system conversion in accordance with certain embodiments of the present application.
Fig. 10 is a further schematic flow chart of a method of displaying data according to some embodiments of the present application.
Fig. 11 is a further flowchart of a method of displaying data according to some embodiments of the present application.
FIG. 12 is a schematic illustration of sensor data in an image display area according to certain embodiments of the present application.
FIG. 13 is yet another schematic illustration of an image of sensor data in a display area according to certain embodiments of the present application.
FIG. 14 is a block diagram of a storage medium according to some embodiments of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, it is to be understood that the terms "first", "second" and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, features defined as "first", "second", may explicitly or implicitly include one or more of the described features. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
The following disclosure provides many different embodiments or examples for implementing different features of the invention. To simplify the disclosure of the present invention, the components and arrangements of specific examples are described below. Of course, they are merely examples and are not intended to limit the present invention. Furthermore, the present invention may repeat reference numerals and/or letters in the various examples, such repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed. In addition, the present invention provides examples of various specific processes and materials, but one of ordinary skill in the art may recognize applications of other processes and/or uses of other materials.
With the development of science and technology, robots become more and more intelligent. The robot acquires environmental information data around the robot by various sensors such as an image sensor and a laser sensor, and the robot can process the data to generate corresponding action commands. Thus, the operating state of the sensor determines the behavior rules of the robot.
In the related art, the data provided by the sensor generally contain a large amount of redundancy and are difficult for a user to interpret, so the working state of the sensor cannot be determined, and the state of the robot is in turn difficult to assess.
Referring to fig. 1, the present application provides a data display method, including:
S10, acquiring contour parameters of the sensor;
s20, establishing a first coordinate system according to the contour parameters;
s30, acquiring detection data of the sensor;
s40, generating first coordinates of the detection data in a first coordinate system; and
s50, an image of the sensor data is generated from the first coordinates.
Referring to fig. 2, the present embodiment provides a data display device 100. The display device 100 includes a first acquisition module 11, a building module 12, a second acquisition module 13, a first generating module 14, and a second generating module 15. S10 may be implemented by the first acquisition module 11, S20 may be implemented by the building module 12, S30 may be implemented by the second acquisition module 13, S40 may be implemented by the first generating module 14, and S50 may be implemented by the second generating module 15. In other words, the first acquisition module 11 may be used to acquire the contour parameters of the sensor. The building module 12 may be used to establish a first coordinate system according to the contour parameters. The second acquisition module 13 may be used to acquire detection data of the sensor. The first generating module 14 may be configured to generate first coordinates of the detection data in the first coordinate system. The second generating module 15 may generate an image of the sensor data from the first coordinates.
Referring to fig. 3, an electronic device 1000 according to an embodiment of the present application is further provided, where the electronic device 1000 includes a processor 200, and the processor 200 is configured to acquire contour parameters of a sensor, establish a first coordinate system according to the contour parameters, acquire detection data of the sensor, generate first coordinates of the detection data in the first coordinate system, and generate an image of the sensor data according to the first coordinates.
Specifically, the electronic device 1000 may include a communication module 400, through which the electronic device 1000 establishes communication with the server of the robot. The communication module 400 may be connected with the server of the robot in a wired or wireless manner to maintain close real-time communication; the connection manner is not limited and may include a long connection, a polling call, a websocket connection, or the like. A long connection refers to a communication technique in which the communication module 400 of the electronic device 1000 maintains the connected state after establishing communication with the server of the robot, so that subsequent communication does not need to repeat the connection step. This ensures that the electronic device 1000 can receive data from and transmit data to the server of the robot in real time. The communication module 400 is connected to the first acquisition module 11: the first acquisition module 11 may send a request to the communication module 400, the communication module 400 obtains the contour parameters of the sensor, such as its shape and size, from the server of the robot, and then sends the contour parameters to the first acquisition module 11. In other embodiments, the communication module 400 may also establish communication with a cloud server through a wired or wireless connection, so as to obtain the contour parameters of the sensor from the cloud server.
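As an illustration only, the following minimal sketch polls a hypothetical HTTP endpoint of the robot server for the contour parameters; the URL, the response format, and the class name are assumptions rather than part of the application, and a long connection or websocket channel could be used instead of polling:
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ContourParamClient {
    // Hypothetical endpoint on the robot server returning the sensor shape and size.
    private static final String CONTOUR_URL = "http://robot.local/api/sensor/contour";

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(CONTOUR_URL)).GET().build();
        // A single polling call; a long connection would instead keep the channel open.
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("contour parameters: " + response.body());
    }
}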
Referring to fig. 4 and 5, further, the first coordinate system is established by the building module 12 based on the contour center of the sensor, that is, the center of the sensor contour in a plan view. The origin of the first coordinate system does not coincide with the contour center of the sensor; in an embodiment of the present application, the contour center coordinates of the sensor may be set to (50, 50) to establish the first coordinate system.
Further, the second acquisition module 13 is also connected with the communication module 400. The communication module 400 can constantly acquire, from the server of the robot, detection data collected by, for example, a laser sensor or an image sensor, and transmit the detection data to the second acquisition module 13. The detection data refers to position information about obstacles acquired by the sensor, such as the relative position between the sensor and a wall, a table, a chair, a barricade, or a step. It should be noted that, in the plan view of the robot, if the distance between the sensor and an article, a person, an animal, or the like is smaller than a safe distance, the sensor regards it as an obstacle. For example, if the safe distance of the sensor is set to 2 m, and the distance between a wall, a table, or a person and the sensor is less than 2 m, the sensor regards the wall, the table, or the person as an obstacle. In this way, scrapes and collisions between the robot and the identified obstacles can be avoided. The first generating module 14 processes the detection data of the sensor to generate a plurality of points, and generates different coordinate points in the first coordinate system according to the relative positions of the different points and the sensor, that is, the first coordinates of the detection data in the first coordinate system. It should be noted that, since the detection data may include a plurality of pieces of detection data that differ in their relative positions to the sensor, the first coordinates may include a plurality of coordinate points in the first coordinate system, and since the detection data may change from moment to moment, the first coordinates may also change. Finally, the second generating module 15 generates an image of the sensor data according to the first coordinates of the detection data in the first coordinate system, so that the user can determine the working state of the sensor according to the image of the sensor data.
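As a sketch of how a detection point's position relative to the sensor might be mapped to a first coordinate, assume the 100 × 100 first coordinate system of the embodiments with the sensor contour center at (50, 50), together with an assumed scale of coordinate units per meter; the class name, method name, and scale value are illustrative only:
public class FirstCoordinateMapper {
    // Contour center of the sensor in the first coordinate system (see the embodiment above).
    private static final double CENTER_X = 50.0;
    private static final double CENTER_Y = 50.0;
    // Assumed scale of coordinate units per meter (illustrative value, not from the application).
    private static final double UNITS_PER_METER = 20.0;

    // Maps an obstacle position relative to the sensor (in meters) to a first coordinate.
    public static double[] toFirstCoordinate(double dxMeters, double dyMeters) {
        return new double[] { CENTER_X + dxMeters * UNITS_PER_METER,
                              CENTER_Y + dyMeters * UNITS_PER_METER };
    }

    public static void main(String[] args) {
        // Example: an obstacle 1 m ahead of and 0.5 m to the left of the sensor.
        double[] first = toFirstCoordinate(-0.5, 1.0);
        System.out.println("first coordinate: (" + first[0] + ", " + first[1] + ")");
    }
}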
In the data display method, the display device 100, and the electronic device 1000 according to the embodiments of the present application, the contour parameters of the sensor are obtained to establish the coordinate system, the detection data obtained by the sensor are used to generate the corresponding coordinate points in the coordinate system according to their relative positions, and finally the image of the sensor data is generated according to the coordinate points of the detection data, so that the working state of the sensor can be determined according to the image of the sensor data. Therefore, the user can accurately grasp the working state of the sensor and take corresponding measures according to it.
Referring to fig. 6, in some embodiments, S40 includes:
S41, removing detection data outside the region threshold and detection data overlapping within the region threshold to generate target detection data;
S42, generating first coordinates of the target detection data in the first coordinate system.
In some embodiments, the first generating module 14 includes a removal unit 141 and a first generating unit 142; S41 may be implemented by the removal unit 141, and S42 may be implemented by the first generating unit 142. In other words, the removal unit 141 is configured to remove the detection data outside the region threshold and the detection data overlapping within the region threshold to generate the target detection data. The first generating unit 142 is configured to generate first coordinates of the target detection data in the first coordinate system.
In some embodiments, the processor 200 is configured to remove detection data outside the region threshold and detection data overlapping within the region threshold to generate target detection data, and to generate first coordinates of the target detection data in the first coordinate system.
It is understood that different pieces of detection data obtained by the sensor may have different relative positions to the sensor; for example, the first piece of detection data obtained by the laser sensor is 2 m away from the laser sensor, while the second piece is 7 m away. Accordingly, if all the detection data were used to generate first coordinates in the first coordinate system and then an image, it would be difficult for the user to recognize the content of the image, and thus difficult to determine the working state of the sensor. The removal unit 141 therefore includes a preset region threshold. The region is centered on the contour center of the sensor, and its size is not limited; in this application, the size of the region threshold in the first coordinate system is 100 × 100. The removal unit 141 determines the relative position between the acquired detection data and the sensor, judges whether the detection data is within the region threshold, and removes the detection data if it is outside the region threshold.
Further, the sensor may acquire the same detection data a plurality of times; for example, detection data acquired in two successive measurements may have the same relative position to the sensor. Therefore, the removal unit 141 is also configured to remove detection data that is repeated within the region threshold, so as to generate the target detection data. The first generating unit 142 then generates the corresponding first coordinates in the first coordinate system according to the relative positions of the target detection data and the sensor. In this way, redundant detection data is optimized to remove overlapping detection data and detection data outside the region threshold, so that only the necessary detection data is retained to generate the first coordinates in the first coordinate system.
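A minimal sketch of this filtering step is given below, assuming first coordinates inside the 100 × 100 region threshold described above; removing duplicates by comparing rounded coordinates is an implementation choice of the sketch, not something specified by the application:
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class DetectionFilter {
    private static final double MIN = 0.0;   // lower bound of the 100 x 100 region threshold
    private static final double MAX = 100.0; // upper bound of the 100 x 100 region threshold

    // Removes detection points outside the region threshold and points repeated inside it.
    public static List<double[]> toTargetDetectionData(List<double[]> firstCoordinates) {
        Set<String> seen = new LinkedHashSet<>();
        List<double[]> target = new ArrayList<>();
        for (double[] p : firstCoordinates) {
            boolean inside = p[0] >= MIN && p[0] <= MAX && p[1] >= MIN && p[1] <= MAX;
            if (!inside) {
                continue; // outside the region threshold: removed
            }
            if (seen.add(Math.round(p[0]) + "," + Math.round(p[1]))) {
                target.add(p); // first occurrence within the threshold: kept
            }
        }
        return target;
    }
}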
Referring to fig. 7, in some embodiments, S50 includes:
s51, converting the first coordinates to generate second coordinates corresponding to the first coordinates in a second coordinate system;
s52, an image of the sensor data is generated from the second coordinates.
In some embodiments, the second generating module 15 includes a conversion unit 151 and a second generating unit 152, where S51 may be implemented by the conversion unit 151 and S52 may be implemented by the second generating unit 152. That is, the conversion unit 151 may be configured to convert the first coordinates to generate second coordinates corresponding to the first coordinates in the second coordinate system, and the second generating unit 152 may be configured to generate an image of the sensor data from the second coordinates.
In some embodiments, the processor 200 is configured to convert the first coordinates to generate second coordinates corresponding to the first coordinates in a second coordinate system. An image of the sensor data is generated from the second coordinates.
Referring to fig. 8 and 9, it should be noted that the robot includes a navigation map and a map coordinate system established from the navigation map. The position coordinates of the robot and the actual position coordinates of the detection data can be determined in the map coordinate system. If the coordinate axis directions of the first coordinate system are not consistent with those of the map coordinate system, generating the image of the sensor data directly from the first coordinates would make the displayed positions of the sensor data inconsistent with their positions in the map coordinate system. The display device 100 therefore further includes a second coordinate system whose coordinate axis directions are the same as those of the map coordinate system: the conversion unit 151 converts each first coordinate in the first coordinate system into a corresponding second coordinate in the second coordinate system, and the second generating unit 152 then generates the image of the sensor data from the second coordinates. In this way the coordinate positions of the sensor data correspond to the positions displayed on the navigation map, ensuring the accuracy of the image.
Referring to fig. 10, in some embodiments, S51 further includes:
and S511, carrying out y-axis overturning on the first coordinate system to obtain a second coordinate system.
In some embodiments, S511 may be implemented by the conversion unit 151. That is, the conversion unit 151 may be configured to perform y-axis inversion on the first coordinate system to obtain the second coordinate system.
In some embodiments, the processor 200 is configured to perform a y-axis flip on the first coordinate system to obtain the second coordinate system.
Referring further to fig. 9, in particular, compared with the map coordinate system, the first coordinate system has the opposite y-axis direction and the same x-axis direction, whereas the coordinate axis directions of the second coordinate system are consistent with those of the map coordinate system. Thus, the first coordinate system is flipped about the y-axis. The specific calculation is as follows:
canvas.setMatrix(matrix);
canvas.scale(1, -1, 50, 50);
canvas.setMatrix(matrix) prepares the matrix of the first coordinate system for the transformation. canvas.scale(1, -1, 50, 50) performs the y-axis flip with the contour center coordinates of the sensor in the first coordinate system as the pivot point: (50, 50) are the contour center coordinates of the sensor in the first coordinate system, 1 is the scaling ratio of the x-axis, and -1 flips the y-axis. After the flip, the first coordinate system is converted into the second coordinate system; the two coordinate systems have the same x-axis direction and opposite y-axis directions, and each first coordinate in the first coordinate system correspondingly yields a coordinate in the second coordinate system. For example, if the coordinate of one piece of detection data in the first coordinate system is (40, 30), its coordinate in the second coordinate system is (40, 70). In this way, the first coordinate system is flipped about the y-axis and converted into the second coordinate system, so that the first coordinates of the detection data in the first coordinate system also yield second coordinates in the second coordinate system, and finally the image of the sensor data is generated according to the second coordinates.
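Written out per point, the flip keeps x unchanged and maps y to 2 × 50 − y, which reproduces the (40, 30) → (40, 70) example above; the class and method names in the following sketch are illustrative only:
public class CoordinateFlip {
    // y of the sensor contour center in the first coordinate system, used as the pivot.
    private static final double PIVOT_Y = 50.0;

    // Converts a first coordinate into the corresponding second coordinate by flipping the y-axis.
    public static double[] toSecondCoordinate(double x, double y) {
        return new double[] { x, 2.0 * PIVOT_Y - y };
    }

    public static void main(String[] args) {
        double[] second = toSecondCoordinate(40.0, 30.0);
        System.out.println("(" + second[0] + ", " + second[1] + ")"); // prints (40.0, 70.0)
    }
}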
Referring to fig. 11, in some embodiments, the display method further includes:
s60, an image of the sensor data is rendered on the image display area.
In some embodiments, the display device 100 further includes the drawing module 16, and S60 may be implemented by the drawing module 16. Alternatively, the rendering module 16 may be configured to render an image of the sensor data on the image display area.
In some embodiments, the processor 200 may also be used to render an image of the sensor data on an image display area.
Specifically, the image of the sensor data includes an image of the sensor contour and an image of the detection data. The electronic device 1000 also includes an image display area. The rendering module 16 may render the image of the sensor contour and the image of the detection data onto the image display area. In this way, the user can see the image of the sensor contour and the image of the detection data in the image display area and judge the working state of the sensor accordingly.
In certain embodiments, S60 includes:
S61, generating prompt information in the image display area when there is an overlapping area between the image of the detection data and the image of the sensor contour parameters.
In some embodiments, S61 may be implemented by the rendering module 16. In other words, the rendering module 16 may be further configured to generate prompt information in the image display area if there is an overlapping area between the image of the detection data and the image of the sensor contour parameters.
In some embodiments, the processor 200 may be further configured to generate prompt information in the image display area if there is an overlapping area between the image of the detection data and the image of the sensor contour parameters.
Referring to fig. 12 and 13, in particular, the image display area further includes an information prompt module, which can generate corresponding prompt information according to the image of the detection data and the image of the sensor contour. It will be appreciated that the robot may be moving in real time, so the relative position between the detection data acquired by the sensor and the sensor changes, and thus the displayed image of the detection data changes as well. The closer the image of the detection data is to the contour image of the sensor, the closer the detection data is to the sensor. If the image of the detection data and the contour image of the sensor have an overlapping area, the detection data is so close that the sensor is blocked, and the information prompt module displays abnormal information of the sensor data. If the contour image of the sensor and the image of the detection data do not overlap, the image display area displays normal information of the sensor data.
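As a sketch of the overlap check, assume purely for illustration that the sensor contour is approximated by a circle of some radius around the contour center (50, 50) in the second coordinate system; a detection point falling inside the circle counts as an overlap and triggers the abnormal prompt. The radius value and the names below are assumptions, not part of the application:
import java.util.List;

public class OverlapPrompt {
    private static final double CENTER_X = 50.0;
    private static final double CENTER_Y = 50.0;
    private static final double CONTOUR_RADIUS = 10.0; // illustrative radius of the sensor contour

    // Returns the prompt information to show in the image display area.
    public static String promptFor(List<double[]> secondCoordinates) {
        for (double[] p : secondCoordinates) {
            double distance = Math.hypot(p[0] - CENTER_X, p[1] - CENTER_Y);
            if (distance <= CONTOUR_RADIUS) {
                return "sensor data abnormal"; // detection data overlaps the sensor contour
            }
        }
        return "sensor data normal"; // no overlap between detection data and sensor contour
    }
}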
Referring further to fig. 3, the electronic device 1000 according to the embodiment of the present application includes one or more processors 200, a memory 300, and one or more programs 301. Wherein one or more programs 301 are stored in the memory 300 and executed by the one or more processors 200, the programs 301 comprising instructions for performing the method of displaying data of any of the above.
The electronic device 1000 may be implemented in various forms. For example, the electronic device 1000 described in the present application may include a mobile phone, a computer, a camera, and the like.
Referring to fig. 14, embodiments of the present application also provide one or more non-volatile computer-readable storage media 500, where the computer-readable storage media 500 includes computer-executable instructions 501. The computer-executable instructions 501, when executed by the one or more processors 200, cause the processors 200 to perform the method of displaying data of any of the embodiments described above.
The above embodiments are merely representative of several embodiments of the present application, and the description thereof is more specific and detailed, but not to be construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium. The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.

Claims (15)

1. A method for displaying data, the method comprising:
acquiring contour parameters of a sensor;
establishing a first coordinate system according to the contour parameters;
acquiring detection data of the sensor;
generating first coordinates of the detection data in the first coordinate system; and
an image of the sensor data is generated from the first coordinates.
2. The display method according to claim 1, wherein the generating the first coordinates of the detection data in the first coordinate system comprises:
removing the detection data outside of a region threshold and the detection data overlapping within the region threshold to generate target detection data;
first coordinates of the target detection data are generated in the first coordinate system.
3. The display method of claim 2, wherein the generating the image of the sensor data from the first coordinates comprises:
converting the first coordinate to generate a second coordinate corresponding to the first coordinate in a second coordinate system;
an image of the sensor data is generated from the second coordinates.
4. The display method according to claim 3, wherein the converting the first coordinates to generate second coordinates corresponding to the first coordinates in a second coordinate system comprises:
and carrying out y-axis overturning on the first coordinate system to obtain the second coordinate system.
5. The display method according to claim 1, further comprising:
an image of the sensor data is rendered on an image display area.
6. The display method according to claim 5, wherein the rendering the image of the sensor data on the image display area includes:
and if the image of the detection data and the image of the sensor outline parameter have an overlapping area, generating prompt information in the image display area.
7. A data display device, comprising:
the first acquisition module is used for acquiring contour parameters of the sensor;
the building module is used for building a first coordinate system according to the contour parameters;
the second acquisition module is used for acquiring detection data of the sensor;
a first generating module for generating first coordinates of the detection data in the first coordinate system; and
a second generation module to generate an image of the sensor data from the first coordinates.
8. The display device of claim 7, wherein the first generating module comprises:
a removal unit configured to remove the detection data outside a region threshold and the detection data overlapping within the region threshold to generate target detection data;
a first generation unit configured to generate first coordinates of the target detection data in the first coordinate system.
9. The display apparatus of claim 8, wherein the second generation module comprises:
the conversion unit is used for converting the first coordinate to generate a second coordinate corresponding to the first coordinate in a second coordinate system;
a second generation unit for generating an image of the sensor data from the second coordinates.
10. The display apparatus of claim 9, wherein the transformation unit is further configured to perform y-axis flipping on the first coordinate system to obtain the second coordinate system.
11. The display device of claim 7, wherein the display device further comprises a rendering module to render an image of the sensor data on an image display area.
12. The display device of claim 11, wherein the rendering module is further configured to generate prompt information in the image display area if there is an overlapping area between the image of the detection data and the image of the sensor contour parameters.
13. An electronic device, comprising a processor configured to:
acquiring contour parameters of a sensor;
establishing a first coordinate system according to the contour parameters;
acquiring detection data of the sensor;
generating first coordinates of the detection data in the first coordinate system; and
an image of the sensor data is generated from the first coordinates.
14. An electronic device, comprising:
one or more processors, memory; and
one or more programs, wherein the one or more programs are stored in the memory and executed by the one or more processors, the programs comprising instructions for performing the method of displaying data according to any of claims 1-6.
15. A computer-readable storage medium storing computer-executable instructions that, when executed by one or more processors, cause the processors to perform a method of displaying data according to any one of claims 1-6.
CN202010012703.1A 2020-01-07 2020-01-07 Data display method and device, electronic equipment and storage medium Active CN111203862B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010012703.1A CN111203862B (en) 2020-01-07 2020-01-07 Data display method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010012703.1A CN111203862B (en) 2020-01-07 2020-01-07 Data display method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111203862A (en) 2020-05-29
CN111203862B (en) 2021-03-23

Family

ID=70780605

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010012703.1A Active CN111203862B (en) 2020-01-07 2020-01-07 Data display method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111203862B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114866762A (en) * 2022-03-15 2022-08-05 中国第一汽车股份有限公司 Visual detection method, device and system of camera sensor

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2686351B2 (en) * 1990-07-19 1997-12-08 ファナック株式会社 Vision sensor calibration method
CN102538859A (en) * 2011-05-19 2012-07-04 广东迅通科技股份有限公司 Method for monitoring and processing various sensors
CN104364746B (en) * 2012-06-29 2017-08-18 日立麦克赛尔株式会社 Display system, display device, display terminal, the display methods of display terminal
CN102999901B (en) * 2012-10-17 2016-06-29 中国科学院计算技术研究所 Based on the processing method after the Online Video segmentation of depth transducer and system
CN107543540B (en) * 2016-06-27 2020-05-15 杭州海康机器人技术有限公司 Data fusion and flight mode switching method and device for flight equipment

Also Published As

Publication number Publication date
CN111203862A (en) 2020-05-29

Similar Documents

Publication Publication Date Title
CN102449568B (en) Robot control system, robot control terminal, robot control method and program
CN109141347A (en) Vehicle-mounted vidicon distance measuring method and device, storage medium and electronic equipment
KR101995223B1 (en) System, module and method for detecting pedestrian, computer program
WO2022142786A1 (en) Driving behavior recognition method, and device and storage medium
CN111203862B (en) Data display method and device, electronic equipment and storage medium
CN104167109A (en) Detection method and detection apparatus for vehicle position
CN112381889A (en) Camera inspection method, device, equipment and storage medium
CN112288882A (en) Information display method and device, computer equipment and storage medium
JP2018200558A (en) Shape recognition device and shape recognition method and a program
CN109031205B (en) Robot positioning device, method and robot
CN113607064A (en) Target object distance measuring and calculating method, device and equipment and readable storage medium
KR102174035B1 (en) Object inspection method using an augmented-reality
CN111832347B (en) Method and device for dynamically selecting region of interest
CN111401423A (en) Data processing method and device for automatic driving vehicle
CN110659626A (en) Image detection method, device and equipment
CN115902977A (en) Transformer substation robot double-positioning method and system based on vision and GPS
CN111638715B (en) Robot control method, robot, electronic device, and storage medium
CN113911918B (en) Fault emergency dispatch control method and system for intelligent tower crane cluster
JP2005331285A (en) Image processing method and image processing device
US20220027653A1 (en) Simultaneous Diagnosis And Shape Estimation From A Perceptual System Derived From Range Sensors
KR20180119344A (en) Region monitoring apparatus and method for monitoring region thereby
CN115082565A (en) Camera calibration method, device, server and medium
CN113808260A (en) Equipment information processing method and device based on augmented reality display device
JP2018141716A (en) Position estimation device, control method, and program
KR101964227B1 (en) Apparatus and method for control military strategy

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant