CN113284233A - Visual monitoring method, device, system, electronic device and storage medium - Google Patents

Visual monitoring method, device, system, electronic device and storage medium Download PDF

Info

Publication number
CN113284233A
Authority
CN
China
Prior art keywords
monitoring
visual
workshop
parameters
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110671873.5A
Other languages
Chinese (zh)
Inventor
郑海滨
洪晖旭
邱进忠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhishou Technology Hangzhou Co ltd
Original Assignee
Zhishou Technology Hangzhou Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhishou Technology Hangzhou Co ltd filed Critical Zhishou Technology Hangzhou Co ltd
Priority to CN202110671873.5A priority Critical patent/CN113284233A/en
Publication of CN113284233A publication Critical patent/CN113284233A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application relates to a visual monitoring method, device, system, electronic device and storage medium, wherein the method comprises the following steps: acquiring workshop data, monitoring images and monitoring parameters in real time; establishing a virtual three-dimensional scene model under the monitoring parameters according to the workshop data and the monitoring parameters; and processing the monitoring images onto the visual scene model and creating real-time visual scene information based on the monitoring parameters and the processing result, so as to monitor the workshop site in real time. This solves the problems that the existing monitoring mode is not intuitive enough, presents far less information than a three-dimensional view, and cannot vividly show the workshop operation site; visual management of the production workshop is achieved and the workshop operation site is displayed fully and intuitively.

Description

Visual monitoring method, device, system, electronic device and storage medium
Technical Field
The present application relates to the field of monitoring technologies, and in particular, to a visual monitoring method, apparatus, system, electronic apparatus, and storage medium.
Background
With the advent of the information era, more and more enterprises are carrying out informatization. Enterprise informatization means deploying computer technology to improve an enterprise's production and operation efficiency and to reduce operating risk and cost, thereby raising the overall management level and the capacity for continuous operation. The workshop is the basic unit for organizing production in manufacturing, and technical progress on the digital twin workshop is significant for promoting virtual-real fusion across the whole manufacturing industry. With increasingly fierce market competition and increasingly complex product requirements, workshop operation faces pressures such as shorter product delivery cycles, higher reliability requirements and more frequent product changes, so the workshop management layer needs to grasp the field operation situation of the workshop in time and discover abnormal disturbances in production promptly, in order to reasonably adjust the production plan and resource allocation and improve production efficiency and reliability.
At present, workshop monitoring mainly relies on monitoring equipment that produces two-dimensional data or on graphic displays on electronic billboards. This approach is not intuitive enough, and the amount of information it presents is far less than that of a three-dimensional view, so the workshop operation site cannot be displayed vividly.
No effective solution has yet been proposed for the problems in the related art that the monitoring mode is not intuitive enough, the amount of information displayed is far less than that of a three-dimensional presentation, and the workshop operation site cannot be displayed vividly.
Disclosure of Invention
The embodiments of the present application provide a visual monitoring method, device, system, electronic device and storage medium, to at least solve the problems in the related art that the monitoring mode is not intuitive enough, the amount of information displayed is far less than that of a three-dimensional presentation, and the workshop operation site cannot be displayed vividly.
In a first aspect, an embodiment of the present application provides a visual monitoring method, including:
acquiring workshop data, monitoring images and monitoring parameters in real time;
establishing a virtual three-dimensional scene model under monitoring parameters according to the workshop data and the monitoring parameters;
and processing the monitoring image to the visual scene model, and creating real-time visual scene information based on the monitoring parameters and the processing result so as to monitor the workshop site in real time.
In some of these embodiments, further comprising:
and acquiring monitoring data, and processing the monitoring data by using a video transcoding technology to obtain a monitoring image.
In some embodiments, establishing a virtual three-dimensional scene model under monitoring parameters according to the plant data and the monitoring parameters includes:
establishing an original virtual three-dimensional scene model by using a modeling tool according to the workshop data;
and calibrating the camera attitude in the original virtual three-dimensional scene model according to the camera attitude parameters in the monitoring parameters to establish the virtual three-dimensional scene model under the monitoring parameters.
In some embodiments, calibrating the camera pose in the original virtual three-dimensional scene model according to the camera pose parameters in the monitoring parameters includes:
comparing the monitoring image with a model image in the original virtual three-dimensional scene model, and adjusting the pose of the virtual camera in the original virtual three-dimensional scene model; and when the two pictures coincide, taking the pose information of the current virtual camera as the initial pose, with the rotation angle ranging from 0 to 360 degrees.
In some embodiments, processing the monitoring image to the visual scene model, and creating real-time visual scene information based on the monitoring parameter and the processing result to monitor the workshop site in real time, includes:
synchronizing the poses of the real camera and the virtual camera;
synchronizing values of real camera optical zoom and virtual camera field angle;
and creating real-time visual scene information based on the monitoring parameters and the synchronization result so as to monitor the workshop site in real time.
In some of these embodiments, further comprising:
and pushing the visual scene information to the page end by using a WebSocket technology, and monitoring the workshop site in real time in the page end.
In a second aspect, an embodiment of the present application provides a visual monitoring apparatus, which includes an obtaining module, a creating module, and a visual monitoring module;
the acquisition module is used for acquiring workshop data, monitoring images and monitoring parameters in real time;
the creating module is used for creating a virtual three-dimensional scene model under monitoring parameters according to the workshop data and the monitoring parameters;
the visual monitoring module is used for processing the monitoring image to the visual scene model and creating real-time visual scene information based on the monitoring parameters and the processing result so as to monitor the workshop site in real time.
In a third aspect, an embodiment of the present application provides a visual monitoring system, including: a terminal device, a transmission device and a server device; the terminal equipment is connected with the server equipment through the transmission equipment;
the terminal equipment is used for acquiring workshop data, monitoring images and monitoring parameters in real time;
the transmission equipment is used for transmitting the workshop data, the monitoring image and the monitoring parameters;
the server device is configured to implement the visual monitoring method according to the first aspect when executed.
In a fourth aspect, an embodiment of the present application provides an electronic device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, and when the processor executes the computer program, the visualization monitoring method according to the first aspect is implemented.
In a fifth aspect, the present application provides a storage medium, on which a computer program is stored, where the program is executed by a processor to implement the visual monitoring method according to the first aspect.
Compared with the related art, the visual monitoring method, device, system, electronic device and storage medium provided by the embodiments of the application acquire workshop data, monitoring images and monitoring parameters in real time; establish a virtual three-dimensional scene model under the monitoring parameters according to the workshop data and the monitoring parameters; and process the monitoring images onto the visual scene model and create real-time visual scene information based on the monitoring parameters and the processing result, so as to monitor the workshop site in real time. This solves the problems that the existing monitoring mode is not intuitive enough, presents far less information than a three-dimensional view, and cannot vividly show the workshop operation site; visual management of the production workshop is achieved and the workshop operation site is displayed fully and intuitively.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below to provide a more thorough understanding of the application.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic structural diagram of a visual monitoring system provided in an embodiment of the present application;
FIG. 2 is a flow chart of a visual monitoring method provided in an embodiment of the present application;
FIG. 3 is a schematic diagram illustrating the relationship between the focal length and the field angle according to an embodiment of the present application;
fig. 4 is a block diagram of a visual monitoring apparatus according to an embodiment of the present application.
In the figure: 100. an acquisition module; 200. a creation module; 300. and a visual monitoring module.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms referred to herein shall have the ordinary meaning as understood by those of ordinary skill in the art to which this application belongs. Reference to "a," "an," "the," and similar words throughout this application are not to be construed as limiting in number, and may refer to the singular or the plural. The present application is directed to the use of the terms "including," "comprising," "having," and any variations thereof, which are intended to cover non-exclusive inclusions; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to the listed steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. Reference to "connected," "coupled," and the like in this application is not intended to be limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. Reference herein to "a plurality" means greater than or equal to two. "and/or" describes an association relationship of associated objects, meaning that three relationships may exist, for example, "A and/or B" may mean: a exists alone, A and B exist simultaneously, and B exists alone. Reference herein to the terms "first," "second," "third," and the like, are merely to distinguish similar objects and do not denote a particular ordering for the objects.
The method provided by the embodiment can be executed in a terminal, a computer or a similar operation device. Taking the operation on the terminal as an example, fig. 1 is a hardware structure block diagram of the terminal of the visual monitoring method according to the embodiment of the present invention. As shown in fig. 1, the terminal 10 may include one or more (only one shown in fig. 1) processors 102 (the processor 102 may include, but is not limited to, a processing device such as a microprocessor MCU or a programmable logic device FPGA) and a memory 104 for storing data, and optionally may also include a transmission device 106 for communication functions and an input-output device 108. It will be understood by those skilled in the art that the structure shown in fig. 1 is only an illustration and is not intended to limit the structure of the terminal. For example, the terminal 10 may also include more or fewer components than shown in FIG. 1, or have a different configuration than shown in FIG. 1.
The memory 104 may be used to store a computer program, for example, a software program and a module of application software, such as a computer program corresponding to the visualization monitoring method in the embodiment of the present invention, and the processor 102 executes the computer program stored in the memory 104 to execute various functional applications and data processing, i.e., to implement the method described above. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the terminal 10 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the terminal 10. In one example, the transmission device 106 includes a Network adapter (NIC) that can be connected to other Network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used to communicate with the internet in a wireless manner.
The present embodiment provides a visual monitoring method, and fig. 2 is a flowchart of the visual monitoring method according to the embodiment of the present application, and as shown in fig. 2, the flowchart includes the following steps:
step S210, acquiring workshop data, monitoring images and monitoring parameters in real time;
step S220, establishing a virtual three-dimensional scene model under monitoring parameters according to the workshop data and the monitoring parameters;
and step S230, processing the monitoring image to the visual scene model, and creating real-time visual scene information based on the monitoring parameters and the processing result so as to monitor the workshop site in real time.
It should be noted that the workshop data may include device status, production data and mapping data. The device status and production data may be collected from the corresponding devices or obtained from a digital information system in a smart factory; this is not limited here. Mapping data refers to objects that need to be represented by digital information, such as machine operation interfaces and electronic production boards. Real cameras are arranged in the production workshop; monitoring data are obtained through the cameras, and the monitoring images and corresponding monitoring parameters can be obtained by processing the monitoring data. For example, a video transcoding technique is used to process the monitoring data into monitoring images. The monitoring parameters mainly refer to the camera pose parameters corresponding to the monitoring images: whenever the camera pose parameters change, the captured monitoring image changes accordingly.
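By way of illustration of the transcoding step mentioned above (the patent does not name a specific transcoding tool), one common way to turn an RTSP camera stream into a web-viewable picture is to drive ffmpeg from a small Node.js/TypeScript service. The stream URL and output endpoint below are hypothetical placeholders, not values from the patent.

```typescript
// Hypothetical sketch: transcode an RTSP camera stream so a web page can display it.
// The RTSP URL and RTMP output endpoint are placeholders.
import { spawn } from "child_process";

function startTranscode(rtspUrl: string, rtmpOut: string) {
  // ffmpeg pulls the camera stream over TCP and re-encodes it to H.264/FLV.
  const ffmpeg = spawn("ffmpeg", [
    "-rtsp_transport", "tcp",
    "-i", rtspUrl,
    "-c:v", "libx264",
    "-preset", "veryfast",
    "-an",               // drop audio; only the monitoring picture is needed
    "-f", "flv",
    rtmpOut,
  ]);
  ffmpeg.stderr.on("data", (d) => console.log(String(d)));
  return ffmpeg;
}

startTranscode("rtsp://camera.example/stream1", "rtmp://media.example/live/shop1");
```

An FLV/RTMP output is used here purely as an example; HLS or WebRTC delivery would serve the same purpose of showing the camera picture in the page.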
Establishing the virtual three-dimensional scene model under the monitoring parameters directly from the workshop data and the monitoring parameters simplifies the modeling process: unlike traditional 3D SCADA, no detailed layout modeling is needed; only the data related to the cameras and to the objects in the scene that need to be mapped and represented by digital information are required. Finally, real-time visual scene information can be created by combining the monitoring images and the monitoring parameters, so as to monitor the workshop site in real time.
Through the above steps, the problems that the existing monitoring mode is not intuitive enough, presents far less information than a three-dimensional view, and cannot vividly show the workshop operation site are solved; visual management of the production workshop is achieved and the workshop operation site is displayed fully and intuitively.
In some embodiments, establishing the virtual three-dimensional scene model under the monitoring parameters according to the workshop data and the monitoring parameters comprises the following steps:
step S221, establishing an original virtual three-dimensional scene model by using a modeling tool according to workshop data;
step S222, calibrating the camera pose in the original virtual three-dimensional scene model according to the camera pose parameters in the monitoring parameters to establish a virtual three-dimensional scene model under the monitoring parameters.
In this embodiment, the modeling tool may be 3ds Max, Cinema 4D, Blender, Dimension, Maya, etc. The scene in the original virtual three-dimensional scene model only requires the following entities: a virtual camera whose position corresponds to that of the real camera, and virtual entities whose positions are consistent with the objects in the real scene that need to be represented by digital information. That is, the original virtual three-dimensional scene model after modeling only comprises the virtual cameras and the virtual entities.
Calibrating the camera pose in the original virtual three-dimensional scene model is specifically done as follows: acquire a monitoring image from the real camera in its initial state and build some reference object models; compare the monitoring image with the model image of the original virtual three-dimensional scene model and adjust the pose of the virtual camera in the model; when the two pictures coincide, take the pose information of the current virtual camera as the initial pose, with the rotation angle ranging from 0 to 360 degrees.
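As a rough sketch of this calibration loop (assuming a Three.js virtual scene, which the patent does not mandate), an operator could nudge the virtual camera while visually comparing the rendered frame with the real monitoring image, then record the pose once the two coincide:

```typescript
// Hypothetical calibration sketch with Three.js (not specified by the patent):
// the virtual camera is nudged until the rendered picture overlaps the monitoring image,
// then its current pose is stored as the initial pose.
import * as THREE from "three";

const camera = new THREE.PerspectiveCamera(45, 16 / 9, 0.1, 1000);

// Called interactively (e.g. from arrow-key handlers) while an operator compares
// the rendered frame with the real monitoring image shown underneath it.
function nudge(dPanDeg: number, dTiltDeg: number): void {
  camera.rotation.order = "YXZ";
  camera.rotation.y += THREE.MathUtils.degToRad(dPanDeg);
  camera.rotation.x += THREE.MathUtils.degToRad(dTiltDeg);
}

// Once the two pictures coincide, record the current pose as the initial pose.
interface InitialPose { position: THREE.Vector3; rotation: THREE.Euler; }
function captureInitialPose(): InitialPose {
  return { position: camera.position.clone(), rotation: camera.rotation.clone() };
}
```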
In some embodiments, processing the monitoring image onto the visual scene model and creating real-time visual scene information based on the monitoring parameters and the processing result, so as to monitor the workshop site in real time, comprises the following steps:
step S231, synchronizing the poses of the real camera and the virtual camera;
step S232, synchronizing the values of the real camera optical zoom and the virtual camera field angle;
and step S233, creating real-time visual scene information based on the monitoring parameters and the synchronization result so as to monitor the workshop site in real time.
It should be appreciated that the above steps can be handled within a web page: the picture from the real camera is output to the web page through video transcoding; the virtual three-dimensional scene model is loaded through WebGL, and the content seen by each virtual camera (the digital-information display entities) is superimposed on the picture output by the real camera; and the digital information is pushed to the page side for display through conventional SCADA and WebSocket techniques. Finally, real-time visual scene information is created based on the monitoring parameters and the synchronization result, so as to monitor the workshop site in real time. To facilitate remote management, remote communication and remote operation, operation and display can be carried out at the page end: the visual scene information is pushed to the page end using WebSocket, and the workshop site is monitored in real time at the page end.
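A minimal page-side sketch of this WebSocket push is given below; the endpoint, message shape and DOM structure are assumptions made for illustration and are not specified by the patent.

```typescript
// Hypothetical page-side sketch: receive workshop digital information pushed over
// WebSocket and refresh the overlay shown on top of the camera picture.
interface ShopFloorMessage {
  deviceId: string;
  status: string;                  // e.g. running / idle / fault
  metrics: Record<string, number>; // arbitrary production metrics
}

const ws = new WebSocket("wss://server.example/monitor"); // placeholder endpoint

ws.onmessage = (event: MessageEvent<string>) => {
  const msg: ShopFloorMessage = JSON.parse(event.data);
  const panel = document.getElementById(`panel-${msg.deviceId}`);
  if (panel) {
    panel.textContent = `${msg.deviceId}: ${msg.status}`;
  }
};

ws.onclose = () => {
  // A production page would reconnect; omitted in this sketch.
  console.warn("monitoring socket closed");
};
```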
Synchronizing the poses of the real camera and the virtual camera is specifically done as follows:
the real camera sends its rotation angle P around its y axis and its rotation angle T around its x axis to the virtual scene, and the virtual camera rotates by the same angles so that it stays consistent with the pose of the real camera. At the same time, the pose of the virtual camera coordinate frame relative to the world coordinate frame can be described, which facilitates subsequent calculation.
The composed rotation is:

$$R = R_z(\alpha)\,R_y(\beta)\,R_x(\gamma) = \begin{bmatrix} \cos\alpha\cos\beta & \cos\alpha\sin\beta\sin\gamma-\sin\alpha\cos\gamma & \cos\alpha\sin\beta\cos\gamma+\sin\alpha\sin\gamma \\ \sin\alpha\cos\beta & \sin\alpha\sin\beta\sin\gamma+\cos\alpha\cos\gamma & \sin\alpha\sin\beta\cos\gamma-\cos\alpha\sin\gamma \\ -\sin\beta & \cos\beta\sin\gamma & \cos\beta\cos\gamma \end{bmatrix}$$

In the formula, α = P is the rotation angle of the real camera around its y axis; β = T is the rotation angle of the real camera around its x axis; γ = 0. R_z(α) denotes the rotation matrix of the virtual camera about the z axis, R_y(β) the rotation matrix about the y axis, and R_x(γ) the rotation matrix about the x axis (since γ = 0, the virtual camera does not rotate about the x axis).
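For illustration, the pose synchronization could be transcribed as follows in Three.js (an assumed rendering library); the Euler order "ZYX" reproduces the composition R_z(α)·R_y(β)·R_x(γ) used above.

```typescript
// Hypothetical sketch of the pose synchronization step using Three.js.
// P and T arrive from the real camera in degrees; the virtual camera is
// rotated by the same angles so that the two poses agree.
import * as THREE from "three";

function syncPose(camera: THREE.PerspectiveCamera, panP: number, tiltT: number): void {
  const alpha = THREE.MathUtils.degToRad(panP);   // corresponds to α = P in the formula
  const beta  = THREE.MathUtils.degToRad(tiltT);  // corresponds to β = T in the formula
  const gamma = 0;                                // γ = 0: no roll

  // Euler order "ZYX" composes the rotation as R_z(α)·R_y(β)·R_x(γ).
  camera.setRotationFromEuler(new THREE.Euler(gamma, beta, alpha, "ZYX"));
}
```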
The values of the real camera optical zoom Z and the virtual camera field angle (FOV) are synchronized as follows:
assume the base focal length of the real camera is bf; then the actual focal length of the camera is f = bf · Z. Fig. 3 is a schematic diagram showing the relationship between the focal length f and the field angle α.
The relationship between the focal length f and the field angle α is then:

$$\alpha = 2\arctan\left(\frac{w}{2f}\right)$$
w is the sensor/film size and bf is the fundamental focal length of the camera, both values being obtainable from the camera parameters as known quantities. In one embodiment W may be 27.
The field angle is expressed in radians in the virtual three-dimensional scene, and the relationship between FOV and Z can finally be expressed as:

$$\mathrm{FOV} = 2\arctan\left(\frac{w}{2\,bf\,Z}\right)$$
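This FOV/zoom relationship translates directly into code; in the sketch below the base focal length value is a hypothetical example, and Three.js is again assumed as the rendering library.

```typescript
// Sketch of the FOV/zoom synchronization: FOV = 2·arctan(w / (2·bf·Z)).
// w = 27 follows the embodiment above; bf = 4.8 is a hypothetical example value.
import * as THREE from "three";

const W_SENSOR = 27;    // sensor/film size w
const BASE_FOCAL = 4.8; // base focal length bf (assumed example)

function syncFov(camera: THREE.PerspectiveCamera, zoomZ: number): void {
  const focal = BASE_FOCAL * zoomZ;                     // f = bf · Z
  const fovRad = 2 * Math.atan(W_SENSOR / (2 * focal)); // FOV in radians
  camera.fov = THREE.MathUtils.radToDeg(fovRad);        // Three.js expects degrees
  camera.updateProjectionMatrix();
}
```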
it should be noted that the steps illustrated in the above-described flow diagrams or in the flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flow diagrams, in some cases, the steps illustrated or described may be performed in an order different than here.
The embodiment also provides a visual monitoring device, which is used for implementing the above embodiments and preferred embodiments, and the description of the device is omitted. As used hereinafter, the terms "module," "unit," "subunit," and the like may implement a combination of software and/or hardware for a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
Fig. 4 is a block diagram of a visualization monitoring apparatus according to an embodiment of the present application, and as shown in fig. 4, the apparatus includes an obtaining module 100, a creating module 200, and a visualization monitoring module 300;
the acquisition module 100 is used for acquiring workshop data, monitoring images and monitoring parameters in real time;
the creating module 200 is used for creating a virtual three-dimensional scene model under the monitoring parameters according to the workshop data and the monitoring parameters;
and the visual monitoring module 300 is used for processing the monitoring image to the visual scene model and creating real-time visual scene information based on the monitoring parameters and the processing result so as to monitor the workshop site in real time.
Through the visual monitoring device, the problems that the existing monitoring mode is not intuitive enough, presents far less information than a three-dimensional view, and cannot vividly show the workshop operation site are solved; visual management of the production workshop is achieved and the workshop operation site is displayed fully and intuitively.
In some embodiments, the creating module 200 is further configured to create an original virtual three-dimensional scene model according to the plant data by using a modeling tool;
and calibrating the camera attitude in the original virtual three-dimensional scene model according to the camera attitude parameters in the monitoring parameters to establish the virtual three-dimensional scene model under the monitoring parameters.
In some of these embodiments, the visualization monitoring module 300 is further configured to synchronize the poses of the real camera and the virtual camera;
synchronizing values of real camera optical zoom and virtual camera field angle;
and creating real-time visual scene information based on the monitoring parameters and the synchronization result so as to monitor the workshop site in real time.
In some embodiments, on the basis of the embodiment of fig. 4, a pushing module is further included; and the pushing module is used for pushing the visual scene information to the page end by using the WebSocket technology and monitoring the workshop site in real time in the page end.
The above modules may be functional modules or program modules, and may be implemented by software or hardware. For a module implemented by hardware, the modules may be located in the same processor; or the modules can be respectively positioned in different processors in any combination.
The present embodiment also provides an electronic device comprising a memory having a computer program stored therein and a processor configured to execute the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
s1, acquiring workshop data, monitoring images and monitoring parameters in real time;
s2, establishing a virtual three-dimensional scene model under monitoring parameters according to the workshop data and the monitoring parameters;
and S3, processing the monitoring image to the visual scene model, and creating real-time visual scene information based on the monitoring parameter and the processing result so as to monitor the workshop site in real time.
It should be noted that, for specific examples in this embodiment, reference may be made to examples described in the foregoing embodiments and optional implementations, and details of this embodiment are not described herein again.
In addition, in combination with the visual monitoring method in the foregoing embodiments, the embodiments of the present application may provide a storage medium to implement. The storage medium having stored thereon a computer program; the computer program, when executed by a processor, implements any of the visual monitoring methods of the above embodiments.
It should be understood by those skilled in the art that various features of the above-described embodiments can be combined in any combination, and for the sake of brevity, all possible combinations of features in the above-described embodiments are not described in detail, but rather, all combinations of features which are not inconsistent with each other should be construed as being within the scope of the present disclosure.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. A visual monitoring method, comprising:
acquiring workshop data, monitoring images and monitoring parameters in real time;
establishing a virtual three-dimensional scene model under monitoring parameters according to the workshop data and the monitoring parameters;
and processing the monitoring image to the visual scene model, and creating real-time visual scene information based on the monitoring parameters and the processing result so as to monitor the workshop site in real time.
2. The visual monitoring method of claim 1, further comprising:
and acquiring monitoring data, and processing the monitoring data by using a video transcoding technology to obtain a monitoring image.
3. The visual monitoring method according to claim 1, wherein the establishing of the virtual three-dimensional scene model under the monitoring parameters according to the workshop data and the monitoring parameters comprises:
establishing an original virtual three-dimensional scene model by using a modeling tool according to the workshop data;
and calibrating the camera attitude in the original virtual three-dimensional scene model according to the camera attitude parameters in the monitoring parameters to establish the virtual three-dimensional scene model under the monitoring parameters.
4. The visual monitoring method of claim 3, wherein calibrating the camera pose in the original virtual three-dimensional scene model according to the camera pose parameters in the monitoring parameters comprises:
comparing the monitoring image with a model image in the original virtual three-dimensional scene model, and adjusting the posture of a virtual camera in the original virtual three-dimensional scene model; and when the two pictures are overlapped, taking the attitude information of the current virtual camera as an initial attitude, and taking the rotation angle of 0-360 degrees.
5. The visual monitoring method of claim 1, wherein the monitoring image is processed to a visual scene model, and real-time visual scene information is created based on the monitoring parameter and the processing result to monitor the workshop site in real time, comprising:
synchronizing the poses of the real camera and the virtual camera;
synchronizing values of real camera optical zoom and virtual camera field angle;
and creating real-time visual scene information based on the monitoring parameters and the synchronization result so as to monitor the workshop site in real time.
6. The visual monitoring method of claim 1, further comprising:
and pushing the visual scene information to the page end by using a WebSocket technology, and monitoring the workshop site in real time in the page end.
7. A visual monitoring device is characterized by comprising an acquisition module, a creation module and a visual monitoring module;
the acquisition module is used for acquiring workshop data, monitoring images and monitoring parameters in real time;
the creating module is used for creating a virtual three-dimensional scene model under monitoring parameters according to the workshop data and the monitoring parameters;
the visual monitoring module is used for processing the monitoring image to the visual scene model and creating real-time visual scene information based on the monitoring parameters and the processing result so as to monitor the workshop site in real time.
8. A visual monitoring system, comprising: a terminal device, a transmission device and a server device; the terminal equipment is connected with the server equipment through the transmission equipment;
the terminal equipment is used for acquiring workshop data, monitoring images and monitoring parameters in real time;
the transmission equipment is used for transmitting the workshop data, the monitoring image and the monitoring parameters;
the server device is configured to perform the visual monitoring method according to any one of claims 1 to 6.
9. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and the processor is configured to execute the computer program to perform the visual monitoring method of any one of claims 1 to 6.
10. A storage medium, in which a computer program is stored, wherein the computer program is configured to execute the visual monitoring method according to any one of claims 1 to 6 when running.
CN202110671873.5A 2021-06-17 2021-06-17 Visual monitoring method, device, system, electronic device and storage medium Pending CN113284233A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110671873.5A CN113284233A (en) 2021-06-17 2021-06-17 Visual monitoring method, device, system, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110671873.5A CN113284233A (en) 2021-06-17 2021-06-17 Visual monitoring method, device, system, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN113284233A (en) 2021-08-20

Family

ID=77284925

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110671873.5A Pending CN113284233A (en) 2021-06-17 2021-06-17 Visual monitoring method, device, system, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN113284233A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109375595A (en) * 2018-10-25 2019-02-22 北京理工大学 A kind of workshop method for visually monitoring, device and equipment
CN110866978A (en) * 2019-11-07 2020-03-06 辽宁东智威视科技有限公司 Camera synchronization method in real-time mixed reality video shooting
CN110992484A (en) * 2019-11-20 2020-04-10 中电科新型智慧城市研究院有限公司 Method for displaying traffic dynamic video in real scene three-dimensional platform
CN111737518A (en) * 2020-06-16 2020-10-02 浙江大华技术股份有限公司 Image display method and device based on three-dimensional scene model and electronic equipment
CN111966068A (en) * 2020-08-27 2020-11-20 上海电机系统节能工程技术研究中心有限公司 Augmented reality monitoring method and device for motor production line, electronic equipment and storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113673107A (en) * 2021-08-23 2021-11-19 山东中图软件技术有限公司 Visual chemical monitoring method, equipment and medium based on industrial simulation
CN113673107B (en) * 2021-08-23 2024-05-31 山东中图软件技术有限公司 Visual chemical monitoring method, equipment and medium based on industrial simulation
CN113992890A (en) * 2021-10-22 2022-01-28 北京明略昭辉科技有限公司 Monitoring method, monitoring device, storage medium and electronic equipment
CN115742562A (en) * 2023-01-05 2023-03-07 东方合智数据科技(广东)有限责任公司 Intelligent monitoring method, device and equipment for printing and packaging equipment and storage medium

Similar Documents

Publication Publication Date Title
CN109448099B (en) Picture rendering method and device, storage medium and electronic device
CN113284233A (en) Visual monitoring method, device, system, electronic device and storage medium
US20010056477A1 (en) Method and system for distributing captured motion data over a network
CN105892643A (en) Multi-interface unified display system and method based on virtual reality
CN111984114A (en) Multi-person interaction system based on virtual space and multi-person interaction method thereof
CN113313818A (en) Three-dimensional reconstruction method, device and system
CN107437273A (en) Six degree of freedom three-dimensional reconstruction method, system and the portable terminal of a kind of virtual reality
JP6398114B2 (en) Communication protocol between platform and image device
CN107610213A (en) A kind of three-dimensional modeling method and system based on panorama camera
CN114004927A (en) 3D video model reconstruction method and device, electronic equipment and storage medium
CN108765084B (en) Synchronous processing method and device for virtual three-dimensional space
CN111710020B (en) Animation rendering method and device and storage medium
CN105898272A (en) 360-degree image loading method, loading module and mobile terminal
CN113936121B (en) AR label setting method and remote collaboration system
CN109801351B (en) Dynamic image generation method and processing device
CN109660508A (en) Data visualization method, electronic device, computer equipment and storage medium
CN115797522A (en) Real-time visualization method and device for digital content creation
CN103605826B (en) A kind of fabric finished product 3D effect is shown and updates system and method
CN113515193B (en) Model data transmission method and device
CN115908670A (en) Digital twin model distributed rendering method, device and system in production scene
CN111917983B (en) Home decoration interior work management method and system based on panoramic image
CN114900743A (en) Scene rendering transition method and system based on video plug flow
CN114900742A (en) Scene rotation transition method and system based on video plug flow
CN110740274B (en) System and method for unified access to framework based on image video algorithm
CN104284077A (en) Image processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210820)