CN117472243A - Display device, display method, and computer-readable storage medium - Google Patents


Info

Publication number
CN117472243A
Authority
CN
China
Prior art keywords
display
display device
line
control unit
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310615027.0A
Other languages
Chinese (zh)
Inventor
浪越孝宏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nidec Corp
Original Assignee
Nidec Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nidec Corp filed Critical Nidec Corp
Publication of CN117472243A
Legal status: Pending


Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/147Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Manipulator (AREA)
  • Numerical Control (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a display device, a display method, and a computer-readable storage medium. The display device includes a control unit and a display unit. The control unit includes: a determination unit that determines a viewpoint position and a line-of-sight direction based on a plane including two or three specific coordinates; and a display control unit that causes the display unit to display a 3D display including a coordinate image representing the specific coordinates, based on the determined viewpoint position and the determined line-of-sight direction.

Description

Display device, display method, and computer-readable storage medium
Technical Field
The present invention relates to a display device, a display method, and a computer-readable storage medium storing a program.
Background
Conventionally, display devices such as head mounted displays for displaying a virtual space are known (for example, Patent Document 1).
Patent Document 1: Japanese Patent Laid-Open Publication No. 2019-139672
Disclosure of Invention
For example, in the case where the virtual space is a 3D (three-dimensional) space in which the robot is disposed, it is desirable to display the virtual space so that the positional relationship of the so-called teaching points of the robot can be easily understood.
In view of the above, an object of the present invention is to provide a display device, a display method, and a program that make it easy to grasp a coordinate image representing a specific coordinate in 3D display.
An exemplary display device of the present invention includes a control unit and a display unit. The control unit includes: a determination unit that determines a viewpoint position and a line-of-sight direction based on a plane including two or three specific coordinates; and a display control unit that causes the display unit to display a 3D display including a coordinate image representing the specific coordinate, based on the determined viewpoint position and the determined line-of-sight direction.
An exemplary display method of the present invention is a display method of a display device including a control unit and a display unit, the display method including:
a determination step of determining, by the control unit, a viewpoint position and a line-of-sight direction based on a plane including two or three specific coordinates; and
a display control step of causing, by the control unit, the display unit to display a 3D display including a coordinate image indicating the specific coordinates, based on the determined viewpoint position and the determined line-of-sight direction.
An exemplary program of the present invention is a program for use in a display device including a control unit and a display unit, the program causing the control unit to execute: a determination step of determining a viewpoint position and a line-of-sight direction based on a plane including two or three specific coordinates; and a display control step of causing the display unit to display a 3D display including a coordinate image indicating the specific coordinates, based on the determined viewpoint position and the determined line-of-sight direction.
According to the exemplary display device, display method, and program of the present invention, it is easy to grasp a coordinate image representing a specific coordinate.
Drawings
Fig. 1 is a diagram showing a configuration of a robot system according to an exemplary embodiment of the present invention.
Fig. 2 is a flowchart of exemplary 3D display control according to the present invention.
Fig. 3 is a diagram showing an example of the initial 3D display.
Fig. 4 is a diagram showing an example of teaching points and areas.
Fig. 5 is a diagram showing an example in which instruction blocks are arranged in a region.
Fig. 6A is a diagram showing an example of a method for determining the viewpoint position and the line-of-sight direction based on a plane including the pseudo teaching points (parallel projection).
Fig. 6B is a diagram showing an example of a method for determining the viewpoint position and the line-of-sight direction based on a plane including the pseudo teaching point (perspective projection).
Fig. 7 is a diagram showing an example of 3D display when the second region is selected.
Fig. 8 is a diagram showing an example of a plane including 3 selected teaching points.
Fig. 9 is a diagram showing an example in which instruction blocks are arranged in an area.
Fig. 10 is a diagram showing a pair of a movement start block and a movement end block.
Fig. 11 is a diagram showing an example of the area display.
Symbol description
1 robot
2 robot controller
3 program generating device (display device)
4 region
5 instruction block
6 line
10 robot system
21 teaching point storage unit
22 robot program storage unit
23 control unit
31 input unit
32 display unit
33 control unit
33A display control unit
33B calculation unit
33C determination unit
33D program generating unit
41 to 43 regions
51 movement start block
52 movement end block
Ed line-of-sight direction
Ep viewpoint position
G gravity direction
P teaching point
Pos1 to Pos3 teaching points
PosA, PosB pseudo teaching points
S1 to S3 planes
T triangle
Detailed Description
Hereinafter, exemplary embodiments of the present invention will be described with reference to the accompanying drawings.
<1. Robot system>
Fig. 1 is a diagram showing a configuration of a robot system 10 according to an exemplary embodiment of the present invention. The robot system 10 includes a robot 1, a robot controller 2, and a program generating device 3.
The robot 1 is an industrial multi-joint robot or the like. The robot 1 includes motors mounted on the respective joint axes, and the front end portion of the robot 1 is controlled to a predetermined position and orientation by driving the motors. In addition, the robot 1 is not limited to the multi-joint robot.
The robot controller 2 is a device for controlling the robot 1, and includes a teaching point storage unit 21, a robot program storage unit 22, and a control unit 23. The robot controller 2 is constituted by, for example, a PC (personal computer).
An operation device called a pendant (not shown) is connected to the robot controller 2. Using the pendant, positions and orientations of the robot 1 can be registered in the robot controller 2. For example, the joints of the robot 1 are controlled by operations on the pendant, and when a registration operation is performed at a desired position and orientation, the position and orientation at that time are registered in the robot controller 2. Such a registration task is called teaching, and a registered position and orientation is called a teaching point. Teaching points are registered in the teaching point storage section 21. A teaching point is position and orientation information of the tip of the robot 1.
The robot program storage unit 22 stores a robot program for causing the robot 1 to perform a specific operation. The robot program is written in a programming language such as BASIC or Python. The control unit 23 controls the robot 1 according to the robot program stored in the robot program storage unit 22.
The program generating device 3 is a device for generating a robot program stored in the robot program storage unit 22, and is constituted by a PC, for example. The program generating device 3 can display various images on a display unit 32 described later, and corresponds to an exemplary display device of the present invention.
The program generating device 3 includes an input unit 31, a display unit 32, and a control unit 33. The input unit 31 is configured to generate an input signal to be input to the control unit 33 according to an operation. The input section 31 includes, for example, a mouse and a keyboard.
The display unit 32 displays various images and is constituted by, for example, a liquid crystal display.
The control unit 33 includes a CPU, a memory, and the like. The CPU executes programs stored in the memory to realize various functional blocks (display control unit 33A and the like) of the control unit 33 shown in fig. 1.
<2. 3D display>
In the program generating device 3 (display device) according to the present invention, the control unit 33 can cause the display unit 32 to perform 3D display of a virtual 3D space. Position coordinates in the 3D space are defined with respect to a predetermined fixed coordinate system (XYZ coordinate system). The display unit 32 can perform 3D display including the robot 1; that is, the position coordinates of the robot 1 are predetermined in the 3D space. Position coordinates of teaching points may also be specified, and an image called an area, described later, may be displayed on the display unit 32 together with the teaching points in the 3D display. As described later, an area is a region for arranging instruction blocks, each representing an instruction to the robot 1.
In addition, once the viewpoint position and the line-of-sight direction are determined as described later, 3D display from the determined viewpoint position along the determined line-of-sight direction can be performed.
<3. 3D display control method>
Next, an example of a 3D display control method is described with reference to the flowchart shown in fig. 2. In the following, a teaching point refers to the position information within the registered position and orientation information.
The control unit 33 includes a display control unit 33A (fig. 1). When the process of fig. 2 is started, first, in step S1, the display control unit 33A causes the display unit 32 to perform 3D display of the 3D space in which the robot 1 is disposed, with the viewpoint position above in the vertical direction and the line-of-sight direction pointing vertically downward (the gravity direction).
Fig. 3 schematically shows an example of the 3D display at this time. In the 3D display, the area 4 is displayed together with the teaching point P. For example, as shown in fig. 3, the area 4 is displayed such that the teaching point P lies at its upper left corner; that is, the area 4 is an image associated with the teaching point P. In the example of fig. 3, the area 4A corresponding to the teaching point PA overlaps the area 4B corresponding to the teaching point PB. To prevent areas 4 from overlapping, the position of the teaching point within the area 4 may be changed, as in the area 4AA shown in fig. 3: the teaching point PA lies at the lower right corner of the area 4AA, so the teaching point is not limited to the upper left corner of the area 4. The area 4 may also be displayed without displaying the teaching point P itself.
In step S1, all the teaching points P registered in the teaching point storage section 21 and the areas 4 corresponding to them are displayed; in the example of fig. 3, the areas 4 shown correspond to all of these teaching points P. The teaching points P and areas 4 shown here are candidates for the selection performed from step S2 onward; that is, in step S1, no area (teaching point) has been selected yet.
That is, when the specific coordinates (teaching points) are not selected, the display control unit 33A displays the coordinate images (teaching points P and areas 4) indicating all the registered coordinates including the specific coordinates on the display unit 32. Thus, by selecting an arbitrary coordinate image from all the coordinate images, an arbitrary specific coordinate can be selected.
Next, the process advances to step S2, where one of the areas 4 displayed in step S1 is selected; that is, the first area 4 is selected. The selection is performed by an operation on the input unit 31. The order in which the areas 4 are selected corresponds to the order in which the robot 1 moves through the corresponding teaching points. That is, the movement path of the robot 1 can be specified by the order in which the areas 4 are selected.
Here, an instruction block 5, described later, can be arranged in the selected area 4. Fig. 4 shows an area 41 corresponding to the teaching point Pos1, an area 42 corresponding to the teaching point Pos2, and an area 43 corresponding to the teaching point Pos3. The area 41 is the first area 4, selected in step S2. The area 42 is the second area 4, selected in step S3 described later. The area 43 is the third area 4, selected in step S6 described later. In fig. 4, for convenience of explanation, the areas 41 to 43 are drawn side by side in the lateral direction; this does not represent the actual positional relationship of the teaching points Pos1 to Pos3.
The display control unit 33A performs display control for arranging the instruction block 5 in the selected area 4, based on operations on the input unit 31. The instruction block 5 is a block-shaped image representing an instruction (an operation of the robot 1) to the robot 1. That is, for example, when the first area 41 is selected in step S2, instruction blocks 5 can be arranged in the area 41.
Fig. 5 shows an example of the arrangement of instruction blocks in the areas 41, 42, and 43 shown in fig. 4. In the example shown in fig. 5, instruction blocks 5 representing instructions A, B, and C are stacked in this order from top to bottom in the area 41. Instruction blocks 5 can thus be arranged in a stacked configuration, and the stacking order is the order of the instructions to the robot 1. An instruction block 5 can be dragged from outside an area into the area with a mouse, for example.
Next, the process advances to step S3, where another area 4 displayed in step S1 is selected; that is, the second area 4 is selected.
The process then advances to step S4, which is described with reference to fig. 6A. Fig. 6A illustrates the teaching points Pos1 and Pos2 of the first and second areas 41 and 42. The control unit 33 includes a calculation unit 33B (fig. 1). As shown in fig. 6A, the calculation unit 33B determines a pseudo teaching point PosA located in the vertical direction with respect to the teaching point Pos2 of the second area 42, and calculates a plane S1 including the teaching points Pos1 and Pos2 and the pseudo teaching point PosA. Fig. 6A also shows the gravity direction G (vertically downward). That is, the control unit 33 includes a calculation unit 33B that calculates, from the specific coordinates, a plane S1 including the two specific coordinates (Pos1, Pos2).
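The patent does not spell out the plane calculation itself; the following is a minimal sketch, assuming NumPy-style 3D vectors and hypothetical coordinate values, of how the plane S1 (a point plus a unit normal) could be derived from Pos1, Pos2, and the pseudo teaching point PosA:

```python
import numpy as np

def plane_from_points(p1, p2, p3):
    """Return (point, unit normal) of the plane through three non-collinear 3D points."""
    p1, p2, p3 = np.asarray(p1), np.asarray(p2), np.asarray(p3)
    normal = np.cross(p2 - p1, p3 - p1)  # perpendicular to both in-plane edge vectors
    return p1, normal / np.linalg.norm(normal)

# Hypothetical teaching-point coordinates (meters). PosA is Pos2 shifted along
# the vertical axis, i.e. opposite to the gravity direction G.
pos1 = np.array([0.4, 0.0, 0.2])
pos2 = np.array([0.1, 0.3, 0.2])
pos_a = pos2 + np.array([0.0, 0.0, 0.1])  # pseudo teaching point PosA above Pos2

point_on_s1, n1 = plane_from_points(pos1, pos2, pos_a)
```

The same helper applies unchanged in step S10 described later, where the plane S3 passes through three real teaching points (Pos1, Pos2, Pos3).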
Next, the process advances to step S5. The control unit 33 includes a determination unit 33C (fig. 1). The determination unit 33C determines the viewpoint position Ep and the line-of-sight direction Ed based on the plane S1 calculated in step S4. That is, the control unit 33 includes a determination unit 33C that determines the viewpoint position Ep and the line-of-sight direction Ed based on the plane S1 including the two specific coordinates (Pos1, Pos2). The line-of-sight direction Ed is determined to be a direction perpendicular to the plane S1.
For example, the viewpoint position Ep and the line-of-sight direction Ed are determined as follows. The determination method described here is also applied to steps S8 and S11 described later.
Case of 3D display by parallel projection
The case of performing 3D display by parallel projection is explained taking fig. 6A as an example. In parallel projection, the size of an object in the projection plane does not change with the object's distance from the viewpoint. The viewpoint and line of sight are determined in the following steps (a code sketch follows the steps).
1) An arbitrary rectangular region T1 including the 3 teaching points (Pos1, Pos2, PosA) is set on the plane S1. The size of the rectangular region T1 is arbitrary. The rectangular region T1 becomes the projection plane (display region).
2) A perpendicular line PL1 perpendicular to the plane S1 is drawn from the midpoint TP1 of the rectangular region T1. The perpendicular line PL1 becomes the line of sight.
3) The viewpoint Ep is an arbitrary position on the perpendicular line PL1.
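A minimal sketch of steps 1) to 3), reusing `plane_from_points` and the hypothetical points from the sketch above; since the rectangular region T1 is arbitrary, the centroid of the three teaching points stands in for the midpoint TP1:

```python
def parallel_view(points, unit_normal, distance=1.0):
    """Viewpoint position Ep and line-of-sight direction Ed for parallel projection."""
    pts = np.asarray(points)
    center = pts.mean(axis=0)             # stand-in for the midpoint TP1 of region T1
    ep = center + distance * unit_normal  # any position on the perpendicular PL1 works
    ed = -unit_normal                     # look back toward the plane S1
    return ep, ed

ep, ed = parallel_view([pos1, pos2, pos_a], n1)
```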
Case of 3D display by perspective projection
The case of 3D display by perspective projection is explained taking fig. 6B as an example. In perspective projection, the farther an object is, the smaller it appears in the projection plane. The viewpoint and line of sight are determined in the following steps (a code sketch follows the steps).
1) The center of gravity CG1 of the 3 teaching points (Pos1, Pos2, PosA) is obtained.
2) A perpendicular line PL1 perpendicular to the plane S1 is drawn from the center of gravity CG1. The perpendicular line PL1 becomes the line of sight.
3) The viewpoint Ep is located on the perpendicular line PL1. The rectangular region T1, which is the intersection of the quadrangular pyramid having the viewpoint Ep as its apex with the plane S1, becomes the projection plane. The opening angle of the quadrangular pyramid is the angle of view θ.
4) The viewpoint position on the perpendicular line PL1 is adjusted so that the rectangular region T1 includes the 3 teaching points. The angle of view θ may also be adjusted.
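A corresponding sketch of steps 1) to 4): the viewpoint is backed off along the perpendicular PL1 from the center of gravity CG1 until a frustum with view angle θ contains all three points (the margin factor and function names are assumptions, not from the patent):

```python
def perspective_view(points, unit_normal, fov_deg=60.0, margin=1.2):
    """Viewpoint Ep and line of sight Ed for perspective projection with view angle θ."""
    pts = np.asarray(points)
    cg = pts.mean(axis=0)                         # center of gravity CG1 (step 1)
    r = max(np.linalg.norm(p - cg) for p in pts)  # in-plane spread of the points
    # Step 4: back off until a frustum with half-angle θ/2 spans radius r.
    dist = margin * r / np.tan(np.radians(fov_deg / 2.0))
    ep = cg + dist * unit_normal                  # viewpoint Ep on perpendicular PL1
    ed = -unit_normal                             # line of sight toward the plane (step 2)
    return ep, ed

ep, ed = perspective_view([pos1, pos2, pos_a], n1)
```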
With the viewpoint position Ep and the line-of-sight direction Ed determined in step S5, the display control unit 33A causes the display unit 32 to perform 3D display of the 3D space as viewed from the determined viewpoint position Ep along the determined line-of-sight direction Ed. At this time, as shown in the example of fig. 7, 3D display including the teaching points Pos1 and Pos2 and the areas 41 and 42 is performed. That is, the control unit 33 includes a display control unit 33A that causes the display unit 32 to perform 3D display including coordinate images (teaching points P and areas 4) representing the specific coordinates (Pos1, Pos2), based on the determined viewpoint position Ep and line-of-sight direction Ed.
In this way, by performing 3D display with the viewpoint position Ep and the line-of-sight direction Ed determined from the plane S1, the teaching points Pos1 and Pos2 and the areas 41 and 42 are easy to see. Without this, depending on the viewpoint position and line-of-sight direction, the teaching points P and areas 4 could be displayed overlapping and become difficult to grasp, forcing the user to change the viewpoint position or line-of-sight direction by manual operation; in the present embodiment, no such operation is needed. That is, the coordinate images representing the specific coordinates (Pos1, Pos2) are easy to grasp. In particular, the positional relationship between the teaching points Pos1 and Pos2 and the direction in which the robot 1 moves (the direction from Pos1 to Pos2) can be grasped easily.
In addition, the robot 1 need not be displayed while no teaching point is selected, as in step S1; when a teaching point is selected, as in steps S2 and S3, the robot 1 may be displayed with the coordinates of its tip matching the selected teaching point. For example, in fig. 7, the robot 1 may be displayed with its tip coordinates matching the selected teaching point Pos2.
The specific coordinates (Pos1, Pos2) are teaching points of the robot 1. This makes it easy to grasp the positional relationship of the teaching points of the robot 1.
The coordinate image includes an area 4 in which an instruction block 5 indicating an instruction to the robot 1 is arranged. Thus, the instruction to the robot at the teaching point can be easily grasped.
The determination unit 33C determines the viewpoint position and the line-of-sight direction based on the plane S1 including the two specific coordinates (Pos1, Pos2) and the pseudo specific coordinate (PosA) located in the vertical direction with respect to one of the two specific coordinates (Pos2). Accordingly, the display unit 32 displays the image with the gravity direction G pointing downward, so the 3D display appears natural (fig. 7).
The determined line-of-sight direction Ed is perpendicular to the plane S1. The 3D display is therefore performed with the plane S1 parallel to the display surface of the display unit 32, so the coordinate images are displayed without depth, and the movement direction between the specific coordinates is particularly easy to grasp.
After step S5, the process proceeds to step S6, where the third area 4 is selected. Next, in step S7, similarly to step S4, a pseudo teaching point PosB located in the vertical direction with respect to the teaching point Pos3 corresponding to the third area 43 is determined, and the plane S2 is calculated from the teaching points Pos2, Pos3, and PosB. Then, in step S8, the viewpoint position and the line-of-sight direction are determined based on the calculated plane S2, and 3D display based on the determined viewpoint position and line-of-sight direction is performed.
After that, when the second area 42 is selected again in step S9, the process proceeds to step S10, and as shown in fig. 8, a plane S3 is calculated that includes the teaching point Pos2 corresponding to the second area 42 and the teaching points Pos1 and Pos3 corresponding to the areas 41 and 43 selected before and after the area 42. Then, in step S11, the viewpoint position and the line-of-sight direction are determined based on the calculated plane S3, and 3D display based on the determined viewpoint position and line-of-sight direction is performed.
That is, the control unit 33 includes a determination unit 33C that determines the viewpoint position Ep and the line-of-sight direction Ed from the plane S3 including 3 specific coordinates (Pos1, Pos2, Pos3).
The determination unit 33C determines the viewpoint position and the line-of-sight direction from the plane S3, which includes the specific coordinate (Pos2) corresponding to the currently selected first coordinate image, the specific coordinate (Pos1) corresponding to the second coordinate image preceding the first coordinate image in the selection order, and the specific coordinate (Pos3) corresponding to the third coordinate image following the first coordinate image in the selection order. Thus, the 3 selected coordinate images can be easily grasped.
When the viewpoint position and the line-of-sight direction are determined from a plane including 3 teaching points (pseudo teaching points included), the control unit 33 does not necessarily have to calculate the plane explicitly; for example, the viewpoint position and the line-of-sight direction may be determined from the position of the center of gravity of the triangle connecting the 3 teaching points.
<4. Movement start/end blocks>
Fig. 9 shows an example of the arrangement of instruction blocks 5 in the areas 41 to 43. As shown in fig. 9, the instruction blocks 5 include a movement start block 51 and a movement end block 52. The movement start block 51 represents an instruction to start moving from the teaching point of the area in which it is arranged. The movement end block 52 represents an instruction to end moving at the teaching point of the area in which it is arranged.
As shown in fig. 10, a movement start block 51 and a movement end block 52 are handled as a pair of instruction blocks 5. The movement start block 51 and the movement end block 52 can each be moved freely and independently of the other. The lower side of the movement start block 51 and the upper side of the movement end block 52 are joined by a line 6. Instruction blocks 5 may be stacked above the movement start block 51 and below the movement end block 52. The length and shape of the line 6 change according to the movement of the movement start block 51 or the movement end block 52.
In the example of fig. 9, a movement start block 51 is arranged in the area 41, and a movement end block 52 is arranged in the area 42. This instructs the robot to start moving from the teaching point Pos1 and to end moving at the teaching point Pos2.
By arranging such a movement start block 51 and movement end block 52 in areas, the movement start position and movement end position of the robot 1 can be specified intuitively. In such an embodiment, an area 4 is regarded as selected at the point in time when the movement start block 51 or the movement end block 52 is arranged in it.
In the display on the display unit 32, the movement start block 51 and the movement end block 52 are coupled by the line 6, which makes their association easy to recognize. The movement start block 51 and the movement end block 52 may also be displayed in the same color, for example, to indicate the association.
Instruction blocks 5 can be stacked above the movement start block 51 and below the movement end block 52. This makes the order of the instructions easy to recognize.
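One way to keep the two halves of a pair associated, as described above, is a shared identifier that the display can use both for drawing the line 6 and for same-color highlighting; a sketch with hypothetical structures, not the patent's actual data model:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MovePairBlock:
    """One half of a movement start/end pair (illustrative only)."""
    pair_id: int                 # shared by both halves; lets the UI draw line 6 between them
    kind: str                    # "start" (movement start block 51) or "end" (block 52)
    area: Optional[str] = None   # name of the area the block is currently placed in

start = MovePairBlock(pair_id=7, kind="start", area="Pos1")
end = MovePairBlock(pair_id=7, kind="end", area="Pos2")
# A renderer would join the bottom edge of `start` to the top edge of `end`
# with line 6 whenever both halves share pair_id, updating the line's length
# and shape as either block is moved.
```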
After the instruction blocks 5 are arranged in the areas, a program generating unit 33D (fig. 1) in the control unit 33 generates a robot program according to the arrangement of the instruction blocks 5. The control unit 33 then transmits the generated robot program to the robot controller 2. The transmitted robot program is stored in the robot program storage unit 22, and the control unit 23 controls the robot 1 according to it. The robot 1 thereby performs the operations corresponding to the instruction contents of the instruction blocks 5 arranged in the areas.
In the example of fig. 9, the robot 1 performs instructions A, B, and C in order at the teaching point Pos1, then starts moving from the teaching point Pos1 and ends moving at the teaching point Pos2. At the teaching point Pos2, it performs instructions D, E, and F in order, then starts moving from the teaching point Pos2 and ends moving at the teaching point Pos3. A sketch of this block-to-program translation follows.
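The patent leaves the generated program's syntax open (BASIC and Python are mentioned only as example languages), so the sketch below emits a neutral pseudo-program from the block arrangement of fig. 9; the list encoding of areas and blocks is an assumption:

```python
# Each area is (teaching point, blocks ordered top-to-bottom); "MOVE_START" and
# "MOVE_END" stand for the movement start block 51 and movement end block 52.
areas = [
    ("Pos1", ["A", "B", "C", "MOVE_START"]),
    ("Pos2", ["MOVE_END", "D", "E", "F", "MOVE_START"]),
    ("Pos3", ["MOVE_END"]),
]

def generate_program(areas):
    lines, move_from = [], None
    for point, blocks in areas:
        for block in blocks:
            if block == "MOVE_START":
                move_from = point                    # motion will start from this point
            elif block == "MOVE_END":
                lines.append(f"MOVE {move_from} -> {point}")
            else:
                lines.append(f"AT {point}: {block}")  # ordinary instruction block
    return "\n".join(lines)

print(generate_program(areas))
# AT Pos1: A ... AT Pos1: C, MOVE Pos1 -> Pos2, AT Pos2: D ... MOVE Pos2 -> Pos3
```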
<5. Area display>
The area display may be performed as shown in fig. 11. In fig. 11, the areas 41, 42, and 43 are arranged outside the triangle T formed by connecting the teaching points Pos1, Pos2, and Pos3. That is, the display control unit 33A causes the display unit 32 to display the coordinate images (41, 42, 43) arranged outside the triangle that is formed by the specific coordinates (Pos1, Pos2, Pos3) and is included in the plane S3. This allows the coordinate images (areas 41 to 43) to be displayed without overlapping.
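The patent does not fix the exact placement rule; a minimal sketch under the assumption that each area's anchor is its teaching point pushed away from the triangle's centroid within the plane (function name, offset, and the Pos3 value are illustrative):

```python
def anchor_outside_triangle(vertex, vertices, offset=0.3):
    """Anchor for a region: its teaching point pushed away from the triangle centroid."""
    vertex = np.asarray(vertex, dtype=float)
    centroid = np.asarray(vertices, dtype=float).mean(axis=0)
    out = vertex - centroid                       # outward direction, within the plane
    return vertex + offset * out / np.linalg.norm(out)

pos3 = np.array([0.3, 0.5, 0.4])                  # hypothetical Pos3 completing triangle T
anchors = [anchor_outside_triangle(v, [pos1, pos2, pos3]) for v in [pos1, pos2, pos3]]
```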
<6. Others>
The embodiments of the present invention have been described above. The scope of the present invention is not limited to the above embodiment. The present invention can be implemented by variously changing the above-described embodiments within a range not departing from the gist of the present invention. The matters described in the above embodiments can be appropriately combined in any range where no contradiction occurs.
For example, when all the teaching points P and their corresponding areas 4 are displayed as shown in fig. 3, any 3 areas 4 may be selected, and the viewpoint position and the line-of-sight direction may be determined from the plane including the 3 teaching points P of the selected areas 4. This makes it easy to grasp the positional relationship of arbitrarily selected teaching points. That is, when any 3 coordinate images are selected from the coordinate images (areas 4) representing all the coordinates, the determination unit 33C determines the viewpoint position and the line-of-sight direction based on the plane including the specific coordinates corresponding to the 3 coordinate images.
<7. Supplementary notes>
As described above, the display device 3 according to one embodiment of the present invention includes the control unit 33 and the display unit 32. The control unit 33 includes: a determination unit 33C that determines a viewpoint position and a line-of-sight direction based on a plane including two or three specific coordinates; and a display control unit 33A that causes the display unit to display a 3D display (first configuration) including a coordinate image representing the specific coordinates, based on the determined viewpoint position and the determined line-of-sight direction.
In the first configuration, the specific coordinates are teaching points of the robot (second configuration).
In the second configuration, the coordinate image includes an area for disposing a command block indicating a command to the robot (third configuration).
In the third configuration, the instruction block includes: a movement start block that represents starting movement from the teaching point of the configured area; and a movement ending block that indicates ending movement at the teaching point of the configured area, the movement starting block and the movement ending block being associated with each other (fourth configuration).
In any one of the first to fourth configurations, the determined line-of-sight direction is a direction perpendicular to the plane (fifth configuration).
In any one of the first to fifth configurations, the determination unit determines the viewpoint position and the line-of-sight direction based on a plane including the specific coordinates corresponding to a currently selected first coordinate image, the specific coordinates corresponding to a second coordinate image preceding the first coordinate image in the selected order, and the specific coordinates corresponding to a third coordinate image following the first coordinate image in the selected order (a sixth configuration).
In any one of the first to fifth configurations, the determination unit determines the viewpoint position and the viewing direction based on the plane including the two specific coordinates and a pseudo specific coordinate located in a vertical direction with respect to one of the two specific coordinates (seventh configuration).
In any one of the first to seventh configurations, the display control unit may be configured to display the coordinate image on the display unit so as to be arranged outside a triangle formed by the specific coordinates and included in the plane (eighth configuration).
In addition, in any one of the first to eighth configurations, when the specific coordinates are not selected, the display control unit causes the display unit to display a coordinate image indicating all the registered coordinates including the specific coordinates (a ninth configuration).
In the ninth configuration, when any 3 coordinate images are selected from among the coordinate images indicating all the coordinates, the determination unit determines the viewpoint position and the line-of-sight direction based on a plane including the specific coordinates corresponding to the 3 coordinate images (tenth configuration).
The display method according to one embodiment of the present invention is a display method of a display device (3) provided with a control unit (33) and a display unit (32), and the display method includes the steps of: a determination step in which the control unit determines a viewpoint position and a line-of-sight direction on the basis of a plane including two or three specific coordinates; and a display control step of causing the display unit to display a 3D display including a coordinate image representing the specific coordinate based on the determined viewpoint position and the determined line-of-sight direction (eleventh configuration).
Further, a program according to an embodiment of the present invention is used in a display device (3) including a control unit (33) and a display unit (32), and causes the control unit to execute: a determination step of determining a viewpoint position and a line-of-sight direction based on a plane including two or three specific coordinates; and a display control step of causing the display unit to display a 3D display including a coordinate image indicating the specific coordinates, based on the determined viewpoint position and the determined line-of-sight direction (a twelfth configuration).
Industrial applicability
The technique of the present invention can be used for example in a system including various robots.

Claims (12)

1. A display device is characterized in that,
the display device comprises a control unit and a display unit,
the control unit includes:
a determination unit that determines a viewpoint position and a line-of-sight direction based on a plane including two or three specific coordinates; and
a display control unit that causes the display unit to display a 3D display including a coordinate image indicating the specific coordinates, based on the determined viewpoint position and the determined line-of-sight direction.
2. The display device according to claim 1, wherein
the specific coordinates are teaching points of the robot.
3. The display device according to claim 2, wherein
the coordinate image includes an area for configuring an instruction block representing an instruction to the robot.
4. The display device according to claim 3, wherein
included in the instruction block is:
a movement start block that represents starting movement from the teaching point of the configured area; and
a movement end block indicating that movement is ended at the teaching point of the configured area,
the movement start block and the movement end block are associated with each other.
5. The display device according to claim 1, wherein
the determined line-of-sight direction is a direction perpendicular to the plane.
6. The display device according to claim 1, wherein
the determination unit determines the viewpoint position and the line-of-sight direction based on a plane including the specific coordinates corresponding to a currently selected first coordinate image, the specific coordinates corresponding to a second coordinate image preceding the first coordinate image in a selected order, and the specific coordinates corresponding to a third coordinate image following the first coordinate image in the selected order.
7. The display device according to claim 1, wherein
the determination unit determines the viewpoint position and the line-of-sight direction based on the plane including the two specific coordinates and a pseudo specific coordinate located in a vertical direction with respect to one of the two specific coordinates.
8. The display device according to claim 1, wherein
the display control unit causes the display unit to display the coordinate image so as to be arranged outside a triangle formed by the specific coordinates and included in the plane.
9. The display device according to claim 1, wherein
when the specific coordinates are not selected, the display control unit causes the display unit to display a coordinate image indicating all the registered coordinates including the specific coordinates.
10. The display device according to claim 9, wherein
when any 3 coordinate images are selected from among the coordinate images representing all the coordinates, the determination unit determines the viewpoint position and the line-of-sight direction based on a plane including the specific coordinates corresponding to the 3 coordinate images.
11. A display method of a display device including a control unit and a display unit, the display method comprising:
a determination step of determining, by the control unit, a viewpoint position and a line-of-sight direction based on a plane including two or three specific coordinates; and
a display control step of causing, by the control unit, the display unit to display a 3D display including a coordinate image indicating the specific coordinates, based on the determined viewpoint position and the determined line-of-sight direction.
12. A computer-readable storage medium storing a program for use in a display device including a control unit and a display unit, the program causing the control unit to perform:
a determination step of determining a viewpoint position and a line-of-sight direction based on a plane including two or three specific coordinates; and
a display control step of causing the display unit to display a 3D display including a coordinate image indicating the specific coordinates, based on the determined viewpoint position and the determined line-of-sight direction.
CN202310615027.0A 2022-07-27 2023-05-29 Display device, display method, and computer-readable storage medium Pending CN117472243A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-119557 2022-07-27
JP2022119557A JP2024017119A (en) 2022-07-27 2022-07-27 Display device, display method, and program

Publications (1)

Publication Number Publication Date
CN117472243A 2024-01-30

Family

ID=89628117

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310615027.0A Pending CN117472243A (en) 2022-07-27 2023-05-29 Display device, display method, and computer-readable storage medium

Country Status (2)

Country Link
JP (1) JP2024017119A (en)
CN (1) CN117472243A (en)

Also Published As

Publication number Publication date
JP2024017119A (en) 2024-02-08


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination