WO2022145014A1 - Display device, control system, and drawing method - Google Patents


Info

Publication number
WO2022145014A1
Authority
WO
WIPO (PCT)
Prior art keywords
display device
image
abnormality
unit
information
Application number
PCT/JP2020/049246
Other languages
English (en)
Japanese (ja)
Inventor
晶仁 山本
直希 森山
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Application filed by Mitsubishi Electric Corporation
Priority to CN202080102240.6A (patent CN115803785B)
Priority to JP2021527967A (patent JP6991396B1)
Priority to PCT/JP2020/049246 (publication WO2022145014A1)
Publication of WO2022145014A1

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B23/00: Testing or monitoring of control systems or parts thereof
    • G05B23/02: Electric testing or monitoring
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics

Definitions

  • The present disclosure relates to a display device that can be connected to a control device controlling a controlled device, and to a related control system and drawing method.
  • When an abnormality occurs, a worker performing restoration work identifies its location based on the information displayed on the display device.
  • The display device is connected to a control device that controls the device.
  • Together, the display device and the control device constitute a control system that controls the controlled device.
  • Patent Document 1 discloses display means, included in a sequence controller system, that displays the portion where an abnormality has occurred within a three-dimensional figure representing the system. According to the technique of Patent Document 1, the displayed three-dimensional figure rotates in response to operations on a terminal, so the operator can search for the abnormal location by rotating the figure.
  • In Patent Document 1, however, the operator must rotate the three-dimensional figure manually to search for the abnormal location. Because the search proceeds while the direction of rotation is changed repeatedly, a worker with little experience in restoration work needs more time to find the location. The conventional technique therefore has the problem that identifying where an abnormality has occurred may take a long time.
  • The present disclosure has been made in view of the above, and its purpose is to obtain a display device that can report the occurrence of an abnormality in the controlled device so that the operator can easily identify where in the controlled device the abnormality occurred.
  • The display device includes: a communication unit capable of communicating with the control device that controls the controlled device; an image generation unit that generates a three-dimensional image for rotating and displaying the controlled device on a screen; a superimposed object generation unit that generates an object indicating the presence or absence of an abnormality in a portion of the controlled device, to be superimposed at the position of that portion in the three-dimensional image; a rotation information acquisition unit that acquires rotation information indicating how the controlled device displayed on the screen is rotated; and an image display unit that displays the three-dimensional image with the object superimposed. The image generation unit rotates the three-dimensional image based on the rotation information.
  • The display device thus has the effect of reporting an abnormality in the controlled device in a way that lets the operator easily identify where the abnormality occurred.
  • FIG. 3 is a diagram showing an example in which a transparently displayed two-dimensional object is superimposed on a three-dimensional image by the display device according to the third embodiment.
  • A flowchart shows the processing procedure when the display device according to the third embodiment generates a two-dimensional object.
  • A flowchart shows the processing procedure when the display device according to the fourth embodiment displays the rotation of the controlled device in a window of the alarm screen.
  • FIG. 1 is a diagram showing a functional configuration of the display device 1 according to the first embodiment.
  • FIG. 2 is a diagram showing a hardware configuration of the display device 1 according to the first embodiment.
  • FIG. 3 is a diagram showing a configuration of a control system 100 including a display device 1 according to the first embodiment.
  • The control system 100 includes the display device 1, the control device 2, which is a connection device that controls the controlled device, and the robot 3, which is the controlled device.
  • The control system 100 is a system that controls the robot 3.
  • The display device 1 is an HMI (Human Machine Interface) such as a programmable display.
  • The control device 2 is a controller such as a programmable logic controller (PLC).
  • The display device 1 is connected to the control device 2.
  • The robot 3 is connected to the control device 2.
  • The display device 1 communicates with the control device 2.
  • The display device 1 has a function of displaying information about the operating state of the control device 2 and a function of accepting operations.
  • The control device 2 controls the robot 3 by executing a control program such as a ladder program.
  • The screen displayed on the display device 1 is created by a drawing device, which will be described later.
  • The display device 1 has: a communication unit 10 that can communicate with the control device 2; a rotation information acquisition unit 11 that acquires rotation information indicating how the controlled device displayed on the screen is rotated; an image generation unit 12 that generates a three-dimensional image for rotating and displaying the controlled device on the screen; and an abnormality information acquisition unit 13 that acquires abnormality information indicating the presence or absence of an abnormality in a portion of the controlled device. The display device 1 further has a superimposed object generation unit 14 that generates a two-dimensional object indicating the presence or absence of an abnormality in a portion of the controlled device, a data storage unit 15 that holds various data, and an image display unit 16 that displays the three-dimensional image with the two-dimensional object superimposed.
  • The display device 1 includes a processor 20 that executes various processes, a memory 21 that is built-in memory, a storage device 22 that holds various information, a communication interface 23 that communicates with the control device 2, a display 24 that displays information, and a touch panel 25 for inputting information to the display device 1.
  • The processor 20 is a CPU (Central Processing Unit).
  • The processor 20 may instead be a processing device, an arithmetic unit, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor).
  • The memory 21 is a volatile memory such as a RAM (Random Access Memory).
  • The storage device 22 is a non-volatile memory such as a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (registered trademark) (Electrically Erasable Programmable Read Only Memory).
  • The communication interface 23 is an interface circuit for connecting the display device 1 to external devices.
  • The control device 2 and the drawing device are connected to the communication interface 23 via a network using Ethernet (registered trademark), USB (Universal Serial Bus), or the like.
  • The display 24 is, for example, a liquid crystal display (LCD).
  • The touch panel 25 detects contact by an object such as a finger and outputs touch information.
  • The touch information includes an identification number for each finger, coordinates, information indicating the contact state, and the like.
  • Each function of the rotation information acquisition unit 11, the image generation unit 12, the abnormality information acquisition unit 13, and the superimposed object generation unit 14 is realized by a combination of the processor 20 and software.
  • Each function may instead be realized by a combination of the processor 20 and firmware, or by a combination of the processor 20, software, and firmware.
  • The software or firmware is written as a program and stored in the storage device 22.
  • The function of the communication unit 10 is realized by the communication interface 23.
  • The function of the image display unit 16 is realized by the display 24.
  • The program executed by the display device 1 may be stored in a storage medium readable by a computer system.
  • The display device 1 may store the program recorded on the storage medium in the memory 21.
  • The storage medium may be a portable storage medium such as a flexible disk, or a semiconductor memory such as a flash memory.
  • FIG. 4 is a diagram showing the hardware configuration of the control device 2 connected to the display device 1 according to the first embodiment.
  • The control device 2 has a processor 26 that executes various processes, a memory 27 that is built-in memory, a storage device 28 that holds various information, and a communication interface 29 that communicates with the display device 1 and the robot 3.
  • The storage device 28 is an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
  • The control program is stored in the storage device 28.
  • The processor 26 reads the control program stored in the storage device 28 into the memory 27 and executes it.
  • The processor 26 performs processing such as generating commands output to the robot 3.
  • Description of the contents common to the processor 20, the memory 21, the storage device 22, and the communication interface 23 is omitted.
  • FIG. 5 is a diagram showing how a three-dimensional image is displayed on the display device 1 according to the first embodiment.
  • The work support screen 30 is one of the screens displayed on the display 24.
  • The display device 1 displays the work support screen 30 to report the occurrence of an abnormality in the controlled device.
  • The work support screen 30 is also displayed to support restoration work by the worker.
  • Restoration work is work for resuming the operation of a controlled device that was interrupted by the occurrence of an abnormality.
  • An abnormality is a state in which normal operation is impossible because of a defect such as deterioration or failure of a part.
  • A plurality of parts to be monitored for the presence or absence of an abnormality are preset in the robot 3.
  • The storage device 28 of the control device 2 reserves a data area that holds a value indicating the presence or absence of an abnormality for each of the monitored parts.
  • A variable used in programming the ladder program is associated with each data area. Such a variable or data area may be referred to as a "device", and the value stored in the data area as a "device value".
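The device concept above can be sketched as a small table of named data areas. This is an illustrative model only, not the controller's actual implementation; the class and method names are assumptions made here.

```python
# Minimal sketch (not an actual PLC API): a table of "devices" -- named
# data areas that each hold a value indicating whether the monitored part
# is normal (0) or abnormal (1).

class DeviceTable:
    def __init__(self, device_names):
        # Every monitored part gets a device, initialised to 0 (normal).
        self._values = {name: 0 for name in device_names}

    def set_abnormal(self, name):
        self._values[name] = 1   # value indicating an abnormality

    def set_normal(self, name):
        self._values[name] = 0   # value indicating normal operation

    def value(self, name):
        return self._values[name]   # the "device value"

# Hypothetical devices for two monitored parts of the robot.
table = DeviceTable(["M1000", "M1001"])
table.set_abnormal("M1000")
```

A control program would rewrite these values on detecting an abnormality, as the following paragraphs describe.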
  • The control device 2 detects the occurrence of an abnormality based on a signal received from the robot 3.
  • The control device 2 rewrites the device value corresponding to the portion where the abnormality occurred from the value indicating normality to the value indicating an abnormality.
  • The control device 2 notifies the display device 1 that an abnormality has occurred.
  • The control device 2 transmits rotation information and abnormality information indicating the presence or absence of an abnormality to the display device 1.
  • Rotation information is preset for each of the monitored parts.
  • The storage device 28 of the control device 2 holds the preset rotation information.
  • The control device 2 reads the rotation information corresponding to the portion where the abnormality occurred from the storage device 28 and transmits it to the display device 1.
  • The communication unit 10 of the display device 1 receives the rotation information and the abnormality information.
  • The communication unit 10 outputs the received rotation information to the rotation information acquisition unit 11.
  • The communication unit 10 outputs the received abnormality information to the abnormality information acquisition unit 13.
  • The rotation information acquisition unit 11 acquires the rotation information transmitted from the control device 2.
  • The abnormality information acquisition unit 13 acquires the abnormality information transmitted from the control device 2.
  • The rotation information acquisition unit 11 outputs the acquired rotation information to the image generation unit 12 and the superimposed object generation unit 14.
  • The abnormality information acquisition unit 13 outputs the acquired abnormality information to the superimposed object generation unit 14.
  • Computer-aided design (CAD) data, which is the design data of the robot 3, is stored in the data storage unit 15.
  • The image generation unit 12 reads the CAD data, which is three-dimensional data, from the data storage unit 15.
  • The image generation unit 12 generates a three-dimensional image for rotating and displaying the robot 3 on the work support screen 30 based on the CAD data and the rotation information.
  • The rotation information includes information such as the direction, angle, and period of rotation.
  • The image generation unit 12 rotates the three-dimensional image based on the rotation information.
  • The image generation unit 12 outputs the generated three-dimensional image data to the image display unit 16.
  • The data storage unit 15 stores two-dimensional data, which is the data of the two-dimensional objects.
  • The superimposed object generation unit 14 reads the two-dimensional data from the data storage unit 15.
  • The superimposed object generation unit 14 generates a two-dimensional object indicating the presence or absence of an abnormality in each part of the robot 3 based on the two-dimensional data and the abnormality information.
  • For a part in which an abnormality has occurred, an object 33 indicating the abnormality is generated.
  • The superimposed object generation unit 14 outputs the data of the generated object 33 to the image display unit 16.
  • The image display unit 16 displays the three-dimensional image with the object 33 superimposed on the work support screen 30.
  • The object 33 is superimposed at the position of the portion of the image 31 where the abnormality occurred.
  • The object 33 is a two-dimensional highlighting object indicating an abnormality.
  • FIG. 5 shows a state in which the image 31 of the robot 3, in its posture at installation, is rotated about a vertical rotation axis.
  • The state of the image 31 shown on the left of FIG. 5 is the reference state.
  • The image 31 shown on the right of FIG. 5 is rotated 180 degrees from the reference state.
  • The work support screen 30 displays the image 31 rotating between the reference state and the state rotated 180 degrees from it.
  • The rotation information is set so that the position of the portion where the abnormality occurred is displayed in an easily visible manner by rotating the image 31 from the reference state.
  • Rotation information that rotates the image 31 into a view suitable for visual recognition is set for each monitored portion of the robot 3.
  • The rotation information is described in the control program.
  • The rotation information may instead be information other than that described in the control program.
  • An object 32 indicating normality may be superimposed at the position of a portion of the image 31 where no abnormality has occurred, that is, a portion in the normal state.
  • The control device 2 transmits abnormality information indicating that the normal portion has no abnormality to the display device 1.
  • The superimposed object generation unit 14 generates the object 32 indicating normality as the two-dimensional object superimposed on a normal portion.
  • The object 32 is superimposed at the position of the normal portion of the image 31.
  • The object 32 is a two-dimensional highlighting object indicating normality.
  • The object 32 and the object 33 are visually distinguishable from each other.
  • The superimposed object generation unit 14 generates the object 32 and the object 33 in different colors, for example a blue object 32 and a red object 33. In this way, the superimposed object generation unit 14 generates two-dimensional objects from which abnormality and normality can be clearly recognized at a glance.
  • The object 32 and the object 33 may differ in elements other than color, as long as they can be visually distinguished from each other.
  • The display device 1 can show on the work support screen 30 not only an abnormality in a portion represented in the reference state of the image 31, but also an abnormality in a portion not represented in that reference state.
  • The display device 1 can automatically rotate the image 31 in a manner suited to displaying the portion where the abnormality occurred. The operator can therefore easily confirm that portion without performing any operation to search for it. Even a worker with little experience in restoration work can identify the abnormal location in a short time.
  • The display device 1 acquires, from the control device 2, the rotation information preset for each monitored part.
  • The display device 1 therefore does not need to hold rotation information for each portion in advance in order to rotate the image 31 into a view suitable for visual recognition.
  • The display device 1 can display a three-dimensional image in which an abnormality in each portion is easily visible, without increasing its storage capacity.
  • FIG. 6 is a diagram for explaining the creation of the work support screen 30 displayed on the display device 1 according to the first embodiment.
  • The drawing device 4 is a computer on which a drawing program is installed.
  • The display 44 of the drawing device 4 displays an editing screen that accepts operations for creating screens.
  • The creator of the screens of the display device 1 performs editing work, arranging objects, which are display elements, on the editing screen.
  • The drawing device 4 creates a monitor screen for displaying information about the operating state of the control device 2, an operation screen for accepting operations, and the work support screen 30, according to the creator's editing work.
  • A still image of the robot 3 is displayed on the editing screen for creating the work support screen 30.
  • A window 34 for setting the two-dimensional objects to be superimposed on the three-dimensional image is also displayed.
  • The creator places a two-dimensional object shown in the window 34 at the position of the monitored portion in the still image of the robot 3.
  • The drawing device 4 associates the two-dimensional data with the coordinates indicating the position of that portion in the three-dimensional data of the robot 3.
  • The creator can place the two-dimensional objects while rotating the still image of the robot 3 as appropriate.
  • The drawing device 4 associates the two-dimensional data, already associated with the coordinates of the three-dimensional data, with the data area holding the value indicating the presence or absence of an abnormality in that portion, that is, with a device.
  • For example, when a certain part of the robot 3 is normal, the device "M1000" is off, that is, the device value of "M1000" is set to "0"; when the part is abnormal, "M1000" is on, that is, the device value of "M1000" is set to "1". This processing is described in the control program.
  • The drawing device 4 associates "M1000" with the two-dimensional data of the relevant portion.
  • Through this association, the display device 1 selects one of the two objects 32 and 33 according to the device value of "M1000". In this way, the display device 1 uses the device value of the associated device as abnormality information.
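The selection between objects 32 and 33 by device value can be sketched as follows. The object identifiers echo the reference numerals in the text; the function name and string labels are illustrative assumptions, not identifiers from the source.

```python
# Illustrative only: choosing which two-dimensional object to superimpose
# for one monitored part, based on its device value.

NORMAL_OBJECT = "object_32_blue"    # superimposed when the part is normal
ABNORMAL_OBJECT = "object_33_red"   # superimposed when the part is abnormal

def select_object(device_value: int) -> str:
    """Return the object to superimpose for one monitored part.

    A device value of 0 means the part is normal; 1 means abnormal,
    as written by the control program (e.g. into device "M1000").
    """
    return ABNORMAL_OBJECT if device_value == 1 else NORMAL_OBJECT
```

With this mapping, the device value itself serves as the abnormality information, as the text states.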
  • The display device 1 downloads the screen data 35, which is the screen data created by the drawing device 4, from the drawing device 4.
  • The screen data 35 includes the data of the work support screen 30, the monitor screen, and the operation screen.
  • The display device 1 stores the downloaded screen data 35 in the data storage unit 15.
  • The image display unit 16 displays each screen by reading its data from the data storage unit 15 as needed.
  • FIG. 7 is a diagram showing the hardware configuration of the drawing device 4 that creates the screens of the display device 1 according to the first embodiment.
  • The drawing device 4 includes a processor 40 that executes various processes, a memory 41 that is built-in memory, a storage device 42 that holds various information, a communication interface 43 that communicates with the display device 1, a display 44 that displays information, and an input device 45 for inputting information to the drawing device 4.
  • The drawing program is stored in the storage device 42.
  • The processor 40 reads the drawing program stored in the storage device 42 into the memory 41 and executes it.
  • The processor 40 executes processing for creating screens according to input.
  • The input device 45 is a device such as a keyboard, a mouse, or a touch panel.
  • FIG. 8 is a flowchart showing the procedure of the preparation process up to the start of display of the work support screen 30 by the display device 1 according to the first embodiment.
  • In step S1, the drawing device 4 creates the work support screen 30.
  • In step S2, the display device 1 downloads the created screen data 35 from the drawing device 4.
  • In step S3, the display device 1 is connected to the control device 2.
  • The display device 1 is connected to the control device 2 via a communication cable.
  • In step S4, the control device 2 and the display device 1 are activated. The preparation process according to the procedure shown in FIG. 8 is thereby completed.
  • FIG. 9 is a flowchart showing the procedure of the drawing method for creating the work support screen 30 displayed on the display device 1 according to the first embodiment.
  • The drawing device 4 accepts an operation that associates the three-dimensional data representing the three-dimensional shape of the robot 3 with the two-dimensional data representing an object indicating the presence or absence of an abnormality in a portion of the robot 3.
  • In step S5, the drawing device 4 associates the two-dimensional data with the coordinates indicating the position of the portion monitored for abnormality.
  • In step S6, the drawing device 4 associates the two-dimensional data with the data area holding the value indicating the presence or absence of an abnormality in that portion. That is, the drawing device 4 further associates a device with the two-dimensional data already associated with the coordinates.
  • In step S7, the drawing device 4 determines whether the association between two-dimensional data and data areas has been completed for all monitored locations.
  • If not (step S7, No), the drawing device 4 repeats steps S5 to S7 for each remaining location.
  • If so (step S7, Yes), the drawing device 4 ends the creation of the work support screen 30 according to the procedure shown in FIG. 9.
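The association loop of steps S5 to S7 can be sketched as below. The data structures (tuples of part name, coordinates, and device name) are assumptions made for illustration; the source does not specify how the drawing device stores these associations.

```python
# A sketch of the association loop in FIG. 9 (steps S5-S7), assuming each
# monitored location is described by a coordinate on the 3D model and a
# device name. The dict layout is hypothetical.

def build_associations(locations):
    """locations: iterable of (part_name, (x, y, z), device_name)."""
    associations = []
    for part, coords, device in locations:
        associations.append({
            "part": part,
            "coords": coords,    # step S5: bind the 2D object to 3D coordinates
            "device": device,    # step S6: bind the 2D object to the data area
        })
    # Step S7: the loop ends once every monitored location is associated.
    return associations

# Hypothetical monitored location on the robot.
assocs = build_associations([("arm_joint", (1.0, 2.0, 3.0), "M1000")])
```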
  • FIG. 10 is a flowchart showing the processing procedure when the display device 1 according to the first embodiment displays the rotation of the controlled device.
  • In step S11, the display device 1 starts communication with the control device 2.
  • In step S12, the display device 1 determines whether a notification of the occurrence of an abnormality has been received. If not (step S12, No), the display device 1 repeats step S12 until such a notification is received.
  • If a notification has been received (step S12, Yes), the display device 1 acquires rotation information through the rotation information acquisition unit 11 in step S13.
  • The rotation information acquisition unit 11 outputs the acquired rotation information to the image generation unit 12 and the superimposed object generation unit 14.
  • The rotation axis is an axis perpendicular to a reference plane.
  • The reference plane is a plane of the robot 3 parallel to the installation plane on which the robot 3 is installed.
  • The direction of the rotation axis can be any direction in the virtual three-dimensional space in which the image 31 of the robot 3 is rotated.
  • Here, the direction of the rotation axis is the vertical direction in the three-dimensional space.
  • The rotation information includes information indicating the direction of the rotation axis as the information indicating the direction of rotation.
  • The rotation information further includes information indicating the angle of rotation and information indicating the period of rotation.
  • The rotation angle can be set to any angle from 0 to 360 degrees relative to the reference state. For example, when the angle is 180 degrees, the image 31 of the robot 3 rotates within a range of 180 degrees from the reference state. Any number of seconds can be set as the rotation period. For example, if the period is 1 second, the image 31 of the robot 3 rotates with a period of 1 second.
  • When the rotation information includes the direction, angle, and period of rotation, the display device 1 can, for each monitored part, rotate the display so that the position of the abnormal portion is shown in an easy-to-understand manner.
  • The rotation information may also include information other than the direction, angle, and period of rotation.
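One way to turn a sweep angle and period into a displayed angle at a given time is sketched below. The back-and-forth triangular sweep is an assumption about what "rotates within a range of 180 degrees at a period of 1 second" means; the source does not specify the exact motion profile.

```python
# Hedged sketch: computing the display angle of the 3D image from the
# rotation information (sweep angle in degrees, period in seconds).

def display_angle(t_seconds, sweep_deg=180.0, period_s=1.0):
    """Angle (degrees) at time t, oscillating between 0 and sweep_deg."""
    phase = (t_seconds % period_s) / period_s   # 0.0 .. 1.0 within one period
    # Ramp up during the first half of the period, back down during the second.
    if phase <= 0.5:
        return sweep_deg * (2.0 * phase)
    return sweep_deg * (2.0 * (1.0 - phase))
```

At t = 0 the image is in the reference state, at a quarter period it is halfway through the sweep, and at half a period it reaches the full 180 degrees before returning.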
  • In step S14, the display device 1 generates a three-dimensional image through the image generation unit 12.
  • In step S15, the display device 1 generates a two-dimensional object through the superimposed object generation unit 14.
  • The display device 1 superimposes the generated two-dimensional object on the generated three-dimensional image through the image display unit 16.
  • The image display unit 16 displays the image newly generated by superimposing the two-dimensional object on the three-dimensional image on the work support screen 30.
  • In step S16, the display device 1 updates the work support screen 30 through the image display unit 16.
  • The data of the three-dimensional image, the data of the superimposed two-dimensional object, and the rotation information are stored in the data storage unit 15.
  • The display device 1 displays the rotation of the robot 3 on the work support screen 30.
  • The display device 1 then ends the process according to the procedure of FIG. 10.
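The flow of steps S13 through S16 can be summarized in a short sketch. The callables stand in for the units described above; all names here are assumptions for illustration, not identifiers from the source.

```python
# A rough sketch of the flow in FIG. 10 (steps S13-S16), after an
# abnormality notification has been received in step S12.

def handle_abnormality(get_rotation_info, generate_3d, generate_2d, update):
    rotation_info = get_rotation_info()        # step S13: acquire rotation info
    image_3d = generate_3d(rotation_info)      # step S14: generate the 3D image
    objects_2d = generate_2d(rotation_info)    # step S15: generate 2D objects
    update(image_3d, objects_2d)               # step S16: refresh the screen
    return image_3d, objects_2d

# Exercise the flow with stand-in callables.
updates = []
result = handle_abnormality(
    lambda: {"axis": "vertical", "angle": 180, "period": 1.0},
    lambda info: ("3d-image", info["angle"]),
    lambda info: ["object33"],
    lambda img, objs: updates.append((img, objs)),
)
```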
  • FIG. 11 is a flowchart showing the processing procedure when the image generation unit 12 of the display device 1 according to the first embodiment generates a three-dimensional image.
  • In step S21, the image generation unit 12 determines whether the rotation information acquired by the rotation information acquisition unit 11 has changed.
  • The data storage unit 15 stores the rotation information used for the previous display of the work support screen 30.
  • The image generation unit 12 compares the newly acquired rotation information with the previous rotation information and determines whether it has changed.
  • If the rotation information has not changed (step S21, No), the image generation unit 12 ends the process according to the procedure shown in FIG. 11. In this case, the image generation unit 12 does not generate a three-dimensional image from the newly acquired rotation information, but reads the three-dimensional image data stored in the data storage unit 15. That is, the display device 1 reuses the three-dimensional image used for the previous display without updating it. If the rotation information has changed (step S21, Yes), the image generation unit 12 proceeds to step S22.
  • In step S22, the image generation unit 12 acquires the three-dimensional data by reading the CAD data of the robot 3 from the data storage unit 15.
  • In step S23, the image generation unit 12 executes drawing processing of the three-dimensional data based on the rotation information, thereby generating the three-dimensional image.
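The redraw-only-on-change behavior of FIG. 11 is essentially a cache keyed on the rotation information. The sketch below illustrates that behavior; the class and attribute names are assumptions, not the patented implementation.

```python
# Illustrative caching behaviour from FIG. 11: the 3D image is redrawn only
# when the rotation information differs from that of the previous display.

class ImageCache:
    def __init__(self):
        self.last_rotation = None
        self.image = None
        self.renders = 0   # counts how often a redraw actually happened

    def get_image(self, rotation_info, render):
        if rotation_info != self.last_rotation:   # step S21: has it changed?
            self.image = render(rotation_info)    # steps S22-S23: redraw
            self.last_rotation = rotation_info
            self.renders += 1
        return self.image                         # otherwise reuse the stored image

cache = ImageCache()
img1 = cache.get_image({"angle": 180}, lambda info: ("render", info["angle"]))
img2 = cache.get_image({"angle": 180}, lambda info: ("render", info["angle"]))
```

The second call with identical rotation information returns the stored image without invoking the renderer again.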
  • FIG. 12 is a flowchart showing a processing procedure when a two-dimensional object is generated by the display device 1 according to the first embodiment.
  • step S31 the abnormality information acquisition unit 13 acquires abnormality information.
  • the abnormality information acquisition unit 13 acquires abnormality information for each portion where the presence or absence of an abnormality is monitored.
  • the abnormality information acquisition unit 13 outputs the acquired abnormality information to the superimposed object generation unit 14.
  • In step S32, the superimposed object generation unit 14 determines whether or not the acquired abnormality information has changed. The data storage unit 15 stores the abnormality information used for the previous display of the work support screen 30. The superimposed object generation unit 14 compares the abnormality information acquired this time with the previous abnormality information for each portion, and determines for each portion whether or not the abnormality information has changed.
  • If there is no change in the abnormality information (step S32, No), the superimposed object generation unit 14 ends the process according to the procedure shown in FIG. In this case, the superimposed object generation unit 14 does not generate a two-dimensional object based on the abnormality information acquired this time, but reads out the data of the two-dimensional object stored in the data storage unit 15. That is, the display device 1 does not update the two-dimensional object used for the previous display, but uses it as it is. On the other hand, if there is a change in the abnormality information (step S32, Yes), the superimposed object generation unit 14 advances the procedure to step S33.
  • In step S33, the superimposed object generation unit 14 determines, for each portion, whether or not the value of the acquired abnormality information indicates an abnormality.
  • If the value indicates an abnormality (step S33, Yes), the superimposed object generation unit 14 reads the two-dimensional data of the object 33 representing an abnormality from the data storage unit 15 in step S34, thereby acquiring the two-dimensional data of the object 33. The superimposed object generation unit 14 also acquires the coordinates indicating the position where the object 33 is to be arranged.
  • If the value does not indicate an abnormality (step S33, No), the superimposed object generation unit 14 reads the two-dimensional data of the object 32 representing normality from the data storage unit 15 in step S35, thereby acquiring the two-dimensional data of the object 32. The superimposed object generation unit 14 also acquires the coordinates indicating the position where the object 32 is to be arranged.
  • The superimposed object generation unit 14 then acquires the rotation information from the rotation information acquisition unit 11. The abnormality information acquired this time is stored in the data storage unit 15, and the procedure proceeds to step S36.
  • In step S36, the superimposed object generation unit 14 executes the superimposed drawing process of the two-dimensional object based on the two-dimensional data, the coordinates, and the rotation information. In this way, the superimposed object generation unit 14 generates a two-dimensional object superimposed on the three-dimensional image.
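The selection logic of steps S31 to S36 can be sketched as follows: for each monitored portion whose abnormality information has changed, either object 33 (abnormal) or object 32 (normal) is chosen together with its placement coordinates. The function name, dictionary keys, and data values are illustrative assumptions, not taken from the disclosure.

```python
# Illustrative sketch of steps S31-S36: a 2D object is (re)generated only for
# portions whose abnormality information changed since the previous display.
def generate_overlay_objects(current, previous, object_data, coords):
    """current/previous: {portion: abnormality value (True = abnormal)}.
    Returns {portion: (2D object data, placement coordinates)}."""
    overlays = {}
    for portion, abnormal in current.items():
        # Step S32: skip portions whose abnormality information did not change.
        if previous.get(portion) == abnormal:
            continue
        # Steps S33-S35: pick object 33 (abnormal) or object 32 (normal).
        key = "object33" if abnormal else "object32"
        overlays[portion] = (object_data[key], coords[portion])
    return overlays

object_data = {"object32": "OK-mark", "object33": "NG-mark"}
coords = {"axis1": (10, 20), "axis2": (30, 40)}
prev = {"axis1": False, "axis2": False}
curr = {"axis1": True, "axis2": False}   # axis1 newly abnormal
result = generate_overlay_objects(curr, prev, object_data, coords)
```

Only the portion whose value changed produces a new overlay entry; the unchanged portion keeps its previously drawn object, as described above.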
  • FIG. 13 is a diagram for explaining a process of superimposing a two-dimensional object indicating normality on a three-dimensional image by the display device 1 according to the first embodiment.
  • FIG. 13 shows the object 32 generated for a portion of the image 31 of the robot 3 for which a value indicating normality was acquired as abnormality information, and an image in which the object 32 is superimposed on that portion.
  • the superimposed object generation unit 14 acquires the two-dimensional data of the object 32 among the two-dimensional objects associated with the device corresponding to the portion.
  • the superimposed object generation unit 14 performs a superimposed drawing process in which the object 32 is superimposed on the surface of the portion of the robot 3 by the texture mapping method.
  • the superimposed object generation unit 14 acquires the parameters required for texture mapping from the rotation information. As a result, the superimposed object generation unit 14 performs the superimposed drawing process of superimposing the object 32 indicating that it is normal on the three-dimensional image.
  • FIG. 14 is a diagram for explaining a process of superimposing a two-dimensional object indicating an abnormality on a three-dimensional image by the display device 1 according to the first embodiment.
  • FIG. 14 shows the object 33 generated for a portion of the image 31 of the robot 3 for which a value indicating an abnormality was acquired as abnormality information, and an image in which the object 33 is superimposed on that portion.
  • the superimposed object generation unit 14 acquires the two-dimensional data of the object 33 among the two-dimensional objects associated with the device corresponding to the portion.
  • the superimposed object generation unit 14 performs a superimposed drawing process in which the object 33 is superimposed on the surface of the portion of the robot 3 by the texture mapping method.
  • the superimposed object generation unit 14 acquires the parameters required for texture mapping from the rotation information. As a result, the superimposed object generation unit 14 performs the superimposed drawing process of superimposing the object 33 indicating that it is abnormal on the three-dimensional image.
  • the superimposed object generation unit 14 performs the superimposed drawing process of superimposing the two-dimensional object on the surface of the robot 3 in the three-dimensional image based on the rotation information.
  • the display device 1 can clearly display the abnormal portion and the normal portion of the robot 3.
  • The display device 1 generates a three-dimensional image for rotating and displaying the controlled device based on the rotation information, generates a two-dimensional object to be superimposed on the three-dimensional image, and rotates and displays the three-dimensional image on which the object is superimposed.
  • the display device 1 can automatically rotate the image 31 of the controlled device in a manner suitable for displaying the portion where the abnormality has occurred. By visually observing the screen of the display device 1, the operator can identify the place where the abnormality has occurred in a short time. Further, since the display device 1 acquires the rotation information stored in the control device 2, it is possible to display a three-dimensional image in which abnormalities in each portion can be easily visually recognized without increasing the storage capacity.
  • the display device 1 has the effect of being able to notify the occurrence of an abnormality in the controlled device so that the operator can easily identify the location where the abnormality has occurred in the controlled device. Further, the display device 1 can display the location where the abnormality has occurred in an easy-to-understand manner without increasing the storage capacity.
  • FIG. 15 is a diagram showing a functional configuration of the display device 1A according to the second embodiment.
  • The display device 1A according to the second embodiment stops the automatic rotation of the image 31 of the controlled device in response to an operation, and rotates the image 31 manually while the automatic rotation is stopped.
  • the same components as those in the first embodiment are designated by the same reference numerals, and the configurations different from those in the first embodiment will be mainly described.
  • the display device 1A has the same functional configuration as the display device 1 according to the first embodiment. Further, the display device 1A has an operation unit 51 that accepts an operation of stopping the rotation of the controlled device on the work support screen 30, a coordinate determination unit 52, and a rotation information changing unit 53. The function of the operation unit 51 is realized by using the touch panel 25. Each function of the coordinate determination unit 52 and the rotation information changing unit 53 is realized by using a combination of the processor 20 and software.
  • the operation unit 51 accepts the first operation of stopping the rotation of the robot 3 on the work support screen 30.
  • the first operation is an operation of touching a certain point on the screen with a contact object.
  • the operation unit 51 accepts a second operation following the first operation.
  • the second operation is an operation of moving the contact object from the touched position on the screen, that is, a swipe operation.
  • the second operation is an operation for rotating the image of the robot 3 stopped by the first operation in a direction designated by the movement of the contact object.
  • the first operation and the second operation are collectively referred to as a touch operation.
  • the operation unit 51 outputs touch information to the coordinate determination unit 52 when there is an operation.
  • the coordinate determination unit 52 determines the coordinates indicating the position of the contact object from the touch information.
  • the coordinate determination unit 52 determines whether or not the above touch operation has been performed based on the coordinates.
  • the coordinate determination unit 52 outputs the coordinates to the rotation information changing unit 53.
  • the rotation information changing unit 53 performs a process of changing the rotation information according to the coordinates.
  • FIG. 16 is a diagram for explaining an operation of manually rotating an image 31 of a controlled device in the display device 1A according to the second embodiment.
  • the image 31 shown on the left side of FIG. 16 is an image 31 when the rotation is stopped by the finger which is a contact object touching one point on the work support screen 30.
  • the automatic rotation of the image 31 is stopped when an arbitrary position on the work support screen 30 is touched by a finger.
  • The right side of FIG. 16 shows the state after the finger has been moved to the right on the work support screen 30 from the state shown on the left side of FIG. 16.
  • When the finger is moved to the right, the image 31 displayed on the work support screen 30 rotates clockwise from the state shown on the left of FIG. 16. In the image 31 shown on the left side of FIG. 16, the upper part of the robot 3 as installed is tilted so as to face the right. From this state, the image 31 is rotated so that the lower part of the robot 3 moves to the right, and the state shown on the right of FIG. 16 is obtained.
  • the amount of rotation of the image 31 is the amount of rotation according to the amount of movement of the finger touching the work support screen 30. The worker can move the finger touching the work support screen 30 in any direction.
  • the image 31 rotates in the direction in which the finger is moved from the state in which the rotation is stopped.
  • As described above, the display device 1A stops the automatic rotation of the image 31 in response to the operator's operation, so that the operator can check the location where the abnormality has occurred in detail. Further, since the display device 1A can rotate the image 31 according to the direction and amount of movement of the finger from the state in which the rotation of the image 31 is stopped, the operability for the operator can be improved.
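One plausible sketch of this manual rotation: the first touch holds the current automatic rotation as a manual base, and each subsequent finger movement adds a rotation increment. The 1-degree-per-pixel gain and the class name are assumptions; the disclosure does not specify the exact mapping from movement amount to rotation amount.

```python
GAIN_DEG_PER_PX = 1.0  # assumed mapping from movement amount to rotation amount

class ManualRotation:
    def __init__(self, auto_rotation_deg):
        # First operation (touch): hold the current automatic rotation as the
        # manual base, freezing the automatic rotation of the image.
        self.rotation_deg = auto_rotation_deg
        self.last_x = None

    def on_touch(self, x):
        self.last_x = x               # remember where the finger touched down

    def on_move(self, x):
        # Second operation (swipe): add the change in coordinates, converted to
        # degrees, to the held rotation.
        delta = x - self.last_x
        self.rotation_deg = (self.rotation_deg + delta * GAIN_DEG_PER_PX) % 360
        self.last_x = x

m = ManualRotation(auto_rotation_deg=90.0)
m.on_touch(100)
m.on_move(130)   # move 30 px to the right -> rotate 30 degrees clockwise
```

A vertical coordinate and a second rotation axis could be handled the same way; one axis is kept here for brevity.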
  • FIG. 17 is a flowchart showing a procedure for the display device 1A according to the second embodiment to rotate the image 31 of the controlled device according to the operation.
  • In step S41, the display device 1A starts communication with the control device 2.
  • In step S42, the display device 1A determines whether or not a notification of the occurrence of an abnormality has been received. If there is no notification of the occurrence of an abnormality (step S42, No), the display device 1A repeats the procedure of step S42 until the notification is received.
  • When notified of the occurrence of an abnormality (step S42, Yes), the display device 1A acquires the rotation information by the rotation information acquisition unit 11 in step S43.
  • the rotation information acquisition unit 11 outputs the acquired rotation information to the image generation unit 12 and the superimposed object generation unit 14.
  • the display device 1A changes the rotation information according to the touch operation in step S44.
  • the display device 1A generates a three-dimensional image based on the changed rotation information.
  • the display device 1A creates a two-dimensional object.
  • the display device 1A superimposes the generated two-dimensional object on the generated three-dimensional image.
  • the image display unit 16 displays an image newly generated by superimposing a two-dimensional object on the three-dimensional image on the work support screen 30.
  • the display device 1A updates the work support screen 30 by the image display unit 16.
  • the display device 1A ends the process according to the procedure of FIG.
  • FIG. 18 is a flowchart showing a procedure for changing rotation information in the display device 1A according to the second embodiment.
  • The operation unit 51 outputs, to the coordinate determination unit 52, the touch information generated when the contact object touches the work support screen 30.
  • the coordinate determination unit 52 acquires touch information.
  • The coordinate determination unit 52 determines, based on the touch information, whether or not the above touch operation has been performed.
  • If the touch operation has not been performed, the display device 1A ends the process according to the procedure shown in FIG.
  • If the touch operation has been performed, the display device 1A advances the procedure to step S53.
  • the coordinate determination unit 52 outputs the coordinates of the position where the touch operation was performed to the rotation information changing unit 53.
  • the rotation information changing unit 53 holds the previous coordinates, which are the display coordinates before the rotation due to the movement amount is applied.
  • In step S53, the rotation information changing unit 53 determines whether or not the current coordinates, which are the coordinates of the touched position, have changed from the previous coordinates, which are the held coordinates.
  • If the coordinates have changed, the display device 1A proceeds to step S54.
  • If the coordinates have not changed, the display device 1A proceeds to step S57.
  • In step S54, the rotation information changing unit 53 calculates the amount of change between the previous coordinates and the coordinates of the touched position.
  • In step S55, the rotation information changing unit 53 turns off the initial flag. The initial flag is a flag for setting whether automatic rotation or manual rotation is performed.
  • In step S56, the rotation information changing unit 53 adds the amount of change calculated in step S54 to the manual rotation information, which is the rotation information set by the operation on the operation unit 51. That is, the rotation information changing unit 53 reflects the rotation corresponding to the amount of movement of the swipe operation in the rotated display.
  • In step S57, the rotation information changing unit 53 determines whether or not the initial flag is off. If the initial flag is off (step S57, Yes), the display device 1A proceeds to step S61. If the initial flag is not off (step S57, No), the display device 1A advances the procedure to step S59.
  • In step S59, the rotation information changing unit 53 updates the held coordinates to the coordinates of the touched position.
  • In step S60, the rotation information changing unit 53 holds the rotation information acquired from the control device 2 as the manual rotation information. After completing the procedure of step S60, the display device 1A proceeds to step S61.
  • In step S61, the rotation information changing unit 53 replaces the rotation information with the manual rotation information. That is, the rotation information acquired by the rotation information acquisition unit 11 is changed to the manual rotation information. The display device 1A then ends the process for changing the rotation information.
  • the display device 1A stops the automatic rotation of the image 31 of the controlled device on the work support screen 30 according to the operation to the operation unit 51. Further, the display device 1A rotates the image 31 according to the direction in which the contact object is moved and the movement amount of the contact object from the state in which the automatic rotation of the image 31 is stopped. As a result, the display device 1A can display a three-dimensional image so that the location where the abnormality has occurred can be confirmed in detail, and the operability by the operator can be improved.
  • FIG. 19 is a diagram showing a functional configuration of the display device 1B according to the third embodiment.
  • the display device 1B generates a two-dimensional object that has undergone transparency processing.
  • the same components as those in the first or second embodiment are designated by the same reference numerals, and the configurations different from those in the first or second embodiment will be mainly described.
  • The display device 1B has the same functional configuration as the display device 1A according to the second embodiment, and further has a transparency processing unit 61 for generating a two-dimensional object to be transparently displayed by the superimposed object generation unit 14. The function of the transparency processing unit 61 is realized by using a combination of the processor 20 and software.
  • The display device 1B is not limited to one in which the transparency processing unit 61 is added to the same functional configuration as the display device 1A; the transparency processing unit 61 may instead be added to the same functional configuration as the display device 1 according to the first embodiment.
  • the transparency processing unit 61 determines whether or not the transparency processing of the two-dimensional object superimposed on the three-dimensional image is possible.
  • The transparency processing unit 61 sets a transparency flag indicating whether or not transparency processing is to be performed, based on the result of this determination.
  • If the format of the two-dimensional data is a format capable of transparency processing, for example, the ARGB8888 format, the transparency processing unit 61 determines that transparency processing of the two-dimensional object is possible. In this case, the transparency processing unit 61 turns on the transparency flag.
  • If the format of the two-dimensional data is a format incapable of transparency processing, for example, the RGB888 format, the transparency processing unit 61 determines that transparency processing of the two-dimensional object is not possible. In this case, the transparency processing unit 61 turns off the transparency flag.
  • When the transparency flag is on, the superimposed object generation unit 14 performs the superimposed drawing process of a two-dimensional object to which transparency processing has been applied, thereby superimposing a transparently displayed two-dimensional object on the three-dimensional image. When the transparency flag is off, the superimposed object generation unit 14 performs the superimposed drawing process of a two-dimensional object without transparency processing, thereby superimposing an opaque two-dimensional object on the three-dimensional image.
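The flag decision can be sketched as a simple check of the pixel format: formats that carry an alpha channel permit transparency processing, while RGB-only formats do not. The set of format names below is illustrative, not an exhaustive list from the disclosure.

```python
# Hedged sketch of the transparency-flag decision: an alpha-capable pixel
# format (e.g. ARGB8888) allows transparency processing; RGB888 does not.
FORMATS_WITH_ALPHA = {"ARGB8888", "RGBA8888"}   # assumed alpha-capable formats

def set_transparency_flag(pixel_format):
    """Return True (flag on) when the 2D data format supports transparency."""
    return pixel_format in FORMATS_WITH_ALPHA

flag_on = set_transparency_flag("ARGB8888")
flag_off = set_transparency_flag("RGB888")
```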
  • FIG. 20 is a diagram showing an example in which a two-dimensional object having an opaque display is superimposed on a three-dimensional image by the display device 1B according to the third embodiment.
  • When the transparency flag is off, the superimposed object generation unit 14 generates the object 33 as an opaquely displayed two-dimensional object.
  • the superimposed object generation unit 14 performs a superimposed drawing process for superimposing the object 33 on the three-dimensional image.
  • Since the display device 1B can generate a transparently displayed two-dimensional object, the operator can easily confirm the presence or absence of an abnormality and, with the two-dimensional object displayed, can also confirm the surface condition of the portion where the abnormality has occurred. As a result, the display device 1B can further improve the visibility of the location where the abnormality has occurred.
  • FIG. 21 is a diagram showing an example in which a transparent display two-dimensional object is superimposed on a three-dimensional image by the display device 1B according to the third embodiment.
  • When the transparency flag is on, the superimposed object generation unit 14 generates the object 62, which is a transparently displayed two-dimensional object.
  • the superimposed object generation unit 14 performs a superimposed drawing process for superimposing the object 62 on the three-dimensional image.
  • FIG. 22 is a flowchart showing a processing procedure when a two-dimensional object is generated by the display device 1B according to the third embodiment.
  • In step S71, the superimposed object generation unit 14 acquires the abnormality information from the abnormality information acquisition unit 13. The superimposed object generation unit 14 acquires the abnormality information for each portion where the presence or absence of an abnormality is monitored.
  • In step S72, the superimposed object generation unit 14 determines whether or not the acquired abnormality information has changed. If there is no change in the abnormality information (step S72, No), the superimposed object generation unit 14 ends the process according to the procedure shown in FIG. On the other hand, if there is a change in the abnormality information (step S72, Yes), the superimposed object generation unit 14 advances the procedure to step S73.
  • In step S73, the superimposed object generation unit 14 determines whether or not the value of the abnormality information indicates an abnormality.
  • If the value indicates an abnormality (step S73, Yes), the superimposed object generation unit 14 acquires the two-dimensional data of the object 33 representing an abnormality in step S74. After completing the procedure of step S74, the display device 1B proceeds to step S76.
  • If the value does not indicate an abnormality (step S73, No), the superimposed object generation unit 14 acquires the two-dimensional data of the object 32 representing normality in step S75. After completing the procedure of step S75, the display device 1B proceeds to step S76.
  • In step S76, the transparency processing unit 61 sets the transparency flag. If the format of the two-dimensional data is capable of transparency processing, the transparency processing unit 61 turns the transparency flag on; if not, it turns the transparency flag off.
  • In step S77, the superimposed object generation unit 14 determines whether or not the transparency flag set in step S76 is off.
  • If the transparency flag is off, the superimposed object generation unit 14 executes the superimposed drawing process of the two-dimensional object in step S78. That is, the superimposed object generation unit 14 superimposes the opaque two-dimensional object on the three-dimensional image.
  • If the transparency flag is on, the superimposed object generation unit 14 executes the transparency processing and the superimposed drawing process of the two-dimensional object in step S79. That is, the superimposed object generation unit 14 superimposes the transparently displayed two-dimensional object on the three-dimensional image.
  • The superimposed object generation unit 14 performs the transparency processing of the two-dimensional object by adjusting the transparency, that is, the alpha value, of each pixel in the two-dimensional data.
  • In this way, the display device 1B generates a transparently displayed or opaquely displayed two-dimensional object to be superimposed on the three-dimensional image.
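A minimal sketch of the transparency processing itself, assuming pixels are held as (A, R, G, B) tuples: scaling the alpha value of every pixel makes the superimposed object partially see-through. The tuple representation and the 50% factor are assumptions for illustration.

```python
# Illustrative alpha adjustment: reduce each pixel's alpha so the object 62
# lets the surface of the 3D image show through it.
def apply_transparency(pixels, opacity):
    """pixels: list of (A, R, G, B) tuples; opacity: 0.0 (invisible) to 1.0."""
    return [(int(a * opacity), r, g, b) for (a, r, g, b) in pixels]

opaque = [(255, 200, 0, 0), (255, 0, 200, 0)]    # fully opaque pixels
translucent = apply_transparency(opaque, 0.5)     # 50% transparent display
```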
  • the transparency processing unit 61 may be able to switch the transparency flag on and off for the two-dimensional data in which the transparency flag is turned on.
  • For example, the transparency processing unit 61 switches the transparency flag on and off according to an operation on the operation unit 51.
  • the display device 1B can arbitrarily change whether the two-dimensional object superimposed on the three-dimensional image is transparently displayed or opaquely displayed.
  • the display device 1B can further improve the visibility of the place where the abnormality has occurred because the two-dimensional object to be transparently displayed can be generated by the superimposed object generation unit 14.
  • FIG. 23 is a diagram showing a functional configuration of the display device 1C according to the fourth embodiment.
  • When alarm information, which is information on each abnormality that has occurred, is selected from the alarm list, the display device 1C displays the portion where the abnormality indicated by the selected alarm information has occurred.
  • the same components as those in the first to third embodiments are designated by the same reference numerals, and the configurations different from those in the first to third embodiments will be mainly described.
  • The display device 1C has the same functional configuration as the display device 1B according to the third embodiment, and further has an alarm selection unit 71 that identifies the alarm information selected by operation, a window generation unit 72 that generates a window to be displayed on the alarm screen, and an alarm display unit 73 that displays the alarm list on the alarm screen. Each function of the alarm selection unit 71 and the window generation unit 72 is realized by using a combination of the processor 20 and software. The function of the alarm display unit 73 is realized by using the display 24.
  • The display device 1C is not limited to one in which the alarm selection unit 71, the window generation unit 72, and the alarm display unit 73 are added to the same functional configuration as the display device 1B; these units may instead be added to the same functional configuration as the display device 1 according to the first embodiment or the display device 1A according to the second embodiment.
  • the alarm list is a list of alarm information, which is information for each abnormality that has occurred.
  • the display device 1C displays an alarm list on the alarm screen.
  • When the alarm information is selected, the alarm selection unit 71 identifies the selected alarm information and notifies the window generation unit 72, the image generation unit 12, and the superimposed object generation unit 14 of the identified alarm information.
  • Upon receiving the notification of the alarm information, the window generation unit 72 generates a window to be displayed on the alarm screen.
  • the image generation unit 12 specifies 3D data for generating a 3D image based on the notified alarm information, and reads the specified 3D data from the data storage unit 15.
  • the superimposed object generation unit 14 specifies the two-dimensional data for generating the two-dimensional object based on the notified alarm information, and reads the specified two-dimensional data from the data storage unit 15.
  • the alarm display unit 73 displays a three-dimensional image on which a two-dimensional object is superimposed in a window displayed on the alarm screen when the alarm information is selected from the alarm list. In this way, the display device 1C associates the display for rotating the image 31 of the controlled device with the alarm list.
  • FIG. 24 is a diagram showing a display of an alarm list on the display device 1C according to the fourth embodiment and a state in which a three-dimensional image is displayed on the display device 1C.
  • FIG. 24 shows a state in which the alarm screen 74 is displayed on the display device 1C and a state in which the image 31 of the robot 3 is rotated and displayed by selecting the alarm information from the alarm screen 74.
  • the alarm information includes the time when the abnormality occurred, the message indicating the content of the abnormality, and the time when the abnormality was recovered.
  • the alarm information may include information such as an error code indicating the content or cause of the abnormality, or the cumulative number of occurrences of the abnormality.
  • the operation unit 51 accepts an operation for selecting alarm information from the alarm list on the alarm screen 74.
  • the operator selects the alarm information from the alarm list by touching the line 76 in which the alarm information is described on the alarm screen 74.
  • the alarm display unit 73 displays the window 75 on the alarm screen 74.
  • the alarm display unit 73 displays the image 31 of the robot 3 that rotates according to the rotation information on the window 75.
  • a two-dimensional object indicating an abnormality is superimposed on the position of the portion of the image 31 where the abnormality indicated in the alarm information has occurred.
  • a state in which the image 31 on which the two-dimensional object is superimposed is rotated according to the rotation information is displayed.
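The behaviour when a line of the alarm list is selected might be sketched as a lookup from the selected alarm record to the data used for the window display. The record fields, lookup tables, and data values below are assumptions, not taken from the disclosure.

```python
# Sketch of alarm selection: choosing one alarm entry determines which
# portion's 3D data and 2D object to load and opens the display window.
alarms = [
    {"time": "10:02:15", "message": "Axis 2 servo error", "portion": "axis2"},
    {"time": "11:30:08", "message": "Axis 1 overheat",    "portion": "axis1"},
]
three_d_data = {"axis1": "cad-axis1", "axis2": "cad-axis2"}
two_d_data = {"axis1": "ng-mark-axis1", "axis2": "ng-mark-axis2"}

def select_alarm(index):
    """Identify the selected alarm and prepare the window content."""
    alarm = alarms[index]
    portion = alarm["portion"]
    return {
        "window_open": True,
        "model": three_d_data[portion],    # 3D data for the rotating image
        "overlay": two_d_data[portion],    # 2D object marking the abnormal portion
    }

window = select_alarm(0)
```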
  • When an alarm occurs, the worker may not be able to instantly grasp the details of the abnormality. From the alarm list, the operator can check information about the abnormality, such as the time when it occurred. Further, by selecting the alarm information from the alarm list, the operator can identify the portion where the abnormality has occurred in the rotated image 31. This makes it possible to improve the operator's work efficiency.
  • the alarm list may include not only alarm information about currently occurring abnormalities but also alarm information about abnormalities that have occurred in the past.
  • In that case, the operator can also identify, from the rotated image 31, a portion where an abnormality is currently occurring or a portion where an abnormality occurred in the past.
  • FIG. 25 is a flowchart showing a processing procedure when the display device 1C according to the fourth embodiment performs a rotation display of the controlled device in the window 75 of the alarm screen 74.
  • In step S81, the display device 1C starts communication with the control device 2.
  • In step S82, the display device 1C determines whether or not a notification of the occurrence of an abnormality has been received. If there is no notification of the occurrence of an abnormality (step S82, No), the display device 1C repeats the procedure of step S82 until the notification is received. When notified of the occurrence of an abnormality (step S82, Yes), the display device 1C advances the procedure to step S83.
  • In step S83, the display device 1C displays the alarm list on the alarm screen 74.
  • In step S84, the display device 1C determines whether or not alarm information has been selected. When the alarm information is not selected (step S84, No), the display device 1C repeats the procedure of step S84 until the alarm information is selected. When the alarm information is selected (step S84, Yes), the display device 1C displays the window 75 on the alarm screen 74 in step S85.
  • In step S86, the display device 1C acquires the rotation information by the rotation information acquisition unit 11.
  • the rotation information acquisition unit 11 outputs the acquired rotation information to the image generation unit 12 and the superimposed object generation unit 14.
  • In step S87, the display device 1C generates a three-dimensional image by the image generation unit 12.
  • In step S88, the display device 1C generates a two-dimensional object by the superimposed object generation unit 14.
  • the display device 1C superimposes the generated two-dimensional object on the generated three-dimensional image by the alarm display unit 73.
  • the alarm display unit 73 displays an image newly generated by superimposing a two-dimensional object on the three-dimensional image on the window 75.
  • In step S89, the display device 1C updates the display of the window 75 by the alarm display unit 73. In this way, the display device 1C rotates and displays the robot 3 in the window 75 of the alarm screen 74.
  • In step S90, the display device 1C determines whether or not the window 75 has been closed.
  • One operation for closing the window 75 is touching a button in the window 75. If no operation to close the window 75 has been performed (step S90: No), the display device 1C returns to step S86.
  • When the window 75 has been closed (step S90: Yes), the display device 1C clears the display of the window 75 and closes the window 75 in step S91. The display device 1C then ends the process of the procedure shown in FIG.
  • By selecting alarm information from the alarm list, an operator can have the display device 1C display a three-dimensional image on which a two-dimensional object is superimposed.
  • The display device 1C therefore has the effect of improving the operator's work efficiency.
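The procedure of steps S81 to S91 can be sketched as an event loop. The following Python sketch is illustrative only: `StubController`, its method names, and the logged strings are assumptions standing in for the control device 2 and the image-generation units, not the patent's actual implementation.

```python
# Hypothetical sketch of the alarm-display procedure (steps S81-S91).
# All names here are illustrative assumptions, not taken from the patent.

class StubController:
    """Scripted stand-in for the control device 2: the abnormality
    notification arrives on the third poll."""
    def __init__(self):
        self._notices = iter([False, False, True])

    def connect(self):              # step S81: start communication
        return True

    def abnormality_notified(self):
        return next(self._notices, True)

    def rotation_info(self):        # rotation information acquisition unit 11
        return {"axis": "J1", "angle": 15.0}


def alarm_display_procedure(ctrl, selections, updates_before_close=2):
    """Steps S81-S91: wait for an abnormality, show the alarm list,
    then redraw the rotating 3D view until the window is closed."""
    log = []
    ctrl.connect()                              # S81
    while not ctrl.abnormality_notified():      # S82: poll until notified
        log.append("polling")
    log.append("show alarm list")               # S83
    while not selections.pop(0):                # S84: wait for a selection
        log.append("waiting for selection")
    log.append("open window 75")                # S85
    for _ in range(updates_before_close):       # S86-S90 redraw loop
        info = ctrl.rotation_info()             # S86
        image_3d = f"3D image at {info['angle']} deg"      # S87: image generation unit 12
        overlay = f"2D object on {info['axis']}"           # S88: superimposed object unit 14
        log.append(f"window 75: {image_3d} + {overlay}")   # S89: update the display
    log.append("close window 75")               # S90 Yes -> S91: clear and close
    return log
```

Running `alarm_display_procedure(StubController(), [False, True])` walks the full path: two polls in step S82, one rejected selection in step S84, two redraws of the window 75, and a close.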
  • The controlled device may be any device that can be controlled by the control device 2 and is not limited to the robot 3.
  • The display devices 1, 1A, 1B, and 1C are not limited to programmable display devices and may each be a computer system such as a personal computer or a general-purpose computer.
  • A program for realizing the functions of the display device 1 is installed in the computer system.
  • The storage device 22 shown in FIG. 2 may be an HDD or an SSD, and the program is stored in the storage device 22.
  • The processor 20 reads the program stored in the storage device 22 into the memory 21 and executes it.
  • Each of the above embodiments shows an example of the contents of the present disclosure.
  • The configuration of each embodiment can be combined with other known techniques.
  • The configurations of the respective embodiments may be combined as appropriate, and a part of the configuration of each embodiment may be omitted or changed without departing from the gist of the present disclosure.

Abstract

The invention relates to a display device (1) comprising: a communication unit (10) capable of communicating with a control device that controls a controlled device; an image generation unit (12) that generates a three-dimensional image for rotating and displaying the controlled device on a screen; a superimposed object generation unit (14) that generates an object that indicates whether or not an abnormality has occurred in a part of the controlled device and is superimposed at the position of that part in the three-dimensional image; a rotation information acquisition unit (11) that acquires rotation information indicating the rotation mode of the controlled device displayed on the screen; and an image display unit (16) that displays the three-dimensional image on which the object is superimposed. The image generation unit (12) rotates and displays the three-dimensional image on the basis of the rotation information.

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202080102240.6A CN115803785B (zh) 2020-12-28 2020-12-28 Display device, control system, and drawing method
JP2021527967A JP6991396B1 (ja) 2020-12-28 2020-12-28 Display device, control system, and drawing method
PCT/JP2020/049246 WO2022145014A1 (fr) 2020-12-28 2020-12-28 Display device, control system, and drawing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/049246 WO2022145014A1 (fr) 2020-12-28 2020-12-28 Display device, control system, and drawing method

Publications (1)

Publication Number Publication Date
WO2022145014A1 (fr)

Family

ID=80447954

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/049246 WO2022145014A1 (fr) 2020-12-28 2020-12-28 Display device, control system, and drawing method

Country Status (3)

Country Link
JP (1) JP6991396B1 (fr)
CN (1) CN115803785B (fr)
WO (1) WO2022145014A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09212219A (ja) * 1996-01-31 1997-08-15 Fuji Facom Corp 三次元仮想モデル作成装置及び制御対象物の監視制御装置
JP2004227175A (ja) * 2003-01-21 2004-08-12 Sony Corp メンテナンスシステム
WO2016051544A1 (fr) * 2014-09-30 2016-04-07 株式会社牧野フライス製作所 Dispositif de commande pour machine-outil
WO2016067342A1 (fr) * 2014-10-27 2016-05-06 株式会社牧野フライス製作所 Procédé de commande de machine-outil et dispositif de commande de machine-outil
JP2016107379A (ja) * 2014-12-08 2016-06-20 ファナック株式会社 拡張現実対応ディスプレイを備えたロボットシステム
JP2017181666A (ja) * 2016-03-29 2017-10-05 ソニー株式会社 情報処理装置、情報処理方法およびプログラム

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020187488A (ja) * 2019-05-13 2020-11-19 株式会社リコー 状態監視装置及びプログラム

Also Published As

Publication number Publication date
JPWO2022145014A1 (fr) 2022-07-07
JP6991396B1 (ja) 2022-01-14
CN115803785B (zh) 2024-03-15
CN115803785A (zh) 2023-03-14

Similar Documents

Publication Publication Date Title
JP4952401B2 (ja) PLC
EP0940739A2 (en) Robot control apparatus
US10139805B2 (en) Ladder diagram monitoring device capable of additionally displaying operation situation of CNC in comment
JPH09212219A (ja) Three-dimensional virtual model creation device and monitoring control device for a controlled object
TWI465868B (zh) Sequence program design support device
JP6991396B1 (ja) Display device, control system, and drawing method
JPWO2020110933A1 (ja) Programming device and program
JP2006209381A (ja) Control display device, program therefor, and recording medium
JP6337810B2 (ja) Information processing device, information processing method, and program
CN110442482A (zh) Error-proofing architecture for high-speed industrial camera firmware and control method therefor
JP5862236B2 (ja) Control program editing device and control program creation support program
JP6374456B2 (ja) Electronic device and numerical control device
CN106155519B (zh) Screen information generation device
JP5830975B2 (ja) Operation control device
JP4043742B2 (ja) Ladder monitor device, program therefor, and recording medium
JP3847665B2 (ja) Control program search device and program therefor
JP7425215B2 (ja) Method for replacing a storage device in a human-machine interface system
JP2782864B2 (ja) Terminal device
JP5581514B2 (ja) Drawing device and drawing program
US20240100688A1 (en) Information processing apparatus, information processing method, robot system, manufacturing method for article using robot system, program, and recording medium
JP5840550B2 (ja) Operation procedure manual creation device, operation procedure manual creation method, and operation procedure manual creation program
JP3729457B2 (ja) Alarm history display system
JP3890917B2 (ja) Production equipment monitoring system
JP2008293392A (ja) Ladder programming editor
JP5485751B2 (ja) Ladder diagram contact comment information input method and computer having the method

Legal Events

Date Code Title Description
ENP Entry into the national phase (Ref document number: 2021527967; Country of ref document: JP; Kind code of ref document: A)
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20968025; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20968025; Country of ref document: EP; Kind code of ref document: A1)