CN108010079B - State information remote monitoring system and method based on projection fusion and image recognition - Google Patents

State information remote monitoring system and method based on projection fusion and image recognition

Info

Publication number
CN108010079B
CN108010079B (Application CN201710978943.5A)
Authority
CN
China
Prior art keywords
information
dimensional
model
scene
equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710978943.5A
Other languages
Chinese (zh)
Other versions
CN108010079A (en)
Inventor
高杰
姜广文
秦远辉
王明辉
申宇
赵辉
万然
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CSSC Systems Engineering Research Institute
Original Assignee
CSSC Systems Engineering Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CSSC Systems Engineering Research Institute filed Critical CSSC Systems Engineering Research Institute
Priority to CN201710978943.5A priority Critical patent/CN108010079B/en
Publication of CN108010079A publication Critical patent/CN108010079A/en
Application granted granted Critical
Publication of CN108010079B publication Critical patent/CN108010079B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/75Determining position or orientation of objects or cameras using feature-based methods involving models
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00Electrically-operated educational appliances
    • G09B5/02Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00Simulators for teaching or training purposes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30232Surveillance

Abstract

The invention relates to a state information monitoring system based on projection fusion and image recognition. The system comprises an equipment main body, a perception model, and one or more state information monitoring units. The equipment main body comprises a high-performance computer, an ultra-short-focus projection device, a projection imaging and information acquisition screen, a video fusion device, and a high-speed image recognition device, which together perform scene display, model information acquisition, and fusion processing and display. In use, the perception model is placed on the projection imaging and information acquisition screen and marks the equipment to be observed. Each state information monitoring unit is connected to the equipment main body over Ethernet and provides remote parameter configuration, state information interaction, and real-time monitoring of the three-dimensional model scene. The system supports distributed publishing of real-time deployment and working-state information, eliminates information islands, and presents the real-time state in an intuitive, vivid three-dimensional form.

Description

State information remote monitoring system and method based on projection fusion and image recognition
Technical Field
The invention relates to the technical field of human-computer interaction, and in particular to a state information monitoring system and control method based on projection fusion and image recognition.
Background
In recent years, advances in computing, optics, graphics, computer vision, virtual reality and augmented reality have produced a variety of human-computer interaction technologies and devices. A general-purpose interaction device based on ultra-short-focus projection fusion and image recognition and tracking is a technical solution that combines laser ultra-short-focus projection, image recognition and tracking, virtual reality and augmented reality to support equipment assurance.
Earlier interactive equipment of this type mainly relies on physical scale models to assist the relevant personnel and uses traditional means such as voice, video and visual observation to maintain information on equipment positions and working conditions. Operation remains manual: the overall working situation is built up on the interactive equipment, but the equipment cannot automatically identify each scale model and lacks intelligent, automated means. Different physical models must be built for different working scenes, so a single device cannot be generalized; such equipment is typically used in a single scene and does not support collaborative operation by multiple departments at different locations.
Disclosure of Invention
In view of the above analysis, the present invention aims to provide a state information monitoring system and control method based on projection fusion and image recognition, so as to overcome the shortcomings of the prior art, which cannot provide a general, scene-configurable solution, cannot automatically and intuitively present the global working situation in real time at different locations, does not satisfy ergonomic requirements, and does not offer a new mode of human-computer interaction.
The purpose of the invention is mainly achieved by the following technical solutions:
In one aspect of the embodiments of the present invention, a state information monitoring system based on projection fusion and image recognition is provided, comprising an equipment main body, a perception model, and one or more state information monitoring units. The equipment main body comprises a high-performance computer, a projection imaging and information acquisition screen, a video fusion device, an ultra-short-focus projection device, and a high-speed image recognition device. In use, the perception model is placed on the projection imaging and information acquisition screen and marks the equipment to be observed. The high-performance computer, ultra-short-focus projection device, projection imaging and information acquisition screen, video fusion device, and high-speed image recognition device perform scene display, model information acquisition, and fusion processing and display. The state information monitoring unit is connected to the equipment main body over Ethernet and provides remote parameter configuration, state information interaction, and monitoring of the real-time state information of the three-dimensional model scene.
In another embodiment of the system, each state information monitoring unit includes a host, a three-dimensional display screen, and a control unit. The host exchanges data with the high-performance computer of the equipment main body, the three-dimensional display screen displays a three-dimensional simulation scene of the observation site and the equipment to be observed, and the control unit handles remote parameter configuration and state information interaction.
In another embodiment, the system further comprises a console body that mounts the component devices and provides a mounting frame for the equipment main body.
In another embodiment, for scene display, the high-performance computer outputs a scene image to the video fusion device, the video fusion device splits the video according to the required resolution, and the ultra-short-focus projection device projects the image onto the projection imaging and information acquisition screen for display;
for model information acquisition, two or more high-speed image recognition devices simultaneously capture the perception model placed on the projection imaging and information acquisition screen and transmit the captured information to the high-performance computer, which performs fusion processing, image recognition and tracking, and computes the position of the perception model in real time;
for fusion processing and display, the high-performance computer constructs a virtual three-dimensional model of the perception model from the captured information, fuses it with the three-dimensional simulation scene, and outputs the result to the three-dimensional display screen of the state information monitoring unit for display in the three-dimensional scene.
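The data flow just described can be illustrated with a minimal sketch. The class and field names below (MarkerObservation, DevicePose3D, the pixel-to-metre factor) are illustrative assumptions for exposition, not part of the claimed system; the fusion step is reduced to averaging redundant camera observations of the same marker.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class MarkerObservation:
    """One perception-model marker as seen by a single recognition camera (illustrative)."""
    marker_id: int        # dedicated identifier carried by the perception model
    x_px: float           # position on the acquisition screen, in pixels
    y_px: float
    heading_deg: float    # in-plane orientation of the marker

@dataclass
class DevicePose3D:
    """Fused pose used to place the corresponding virtual three-dimensional model."""
    device_type: str
    x_m: float
    y_m: float
    heading_deg: float

def fuse_observations(obs: List[MarkerObservation], px_to_m: float) -> DevicePose3D:
    """Average redundant camera observations of one marker and convert screen pixels
    to scene metres (a stand-in for the real fusion and tracking step)."""
    assert obs and all(o.marker_id == obs[0].marker_id for o in obs)
    n = len(obs)
    return DevicePose3D(
        device_type=f"device-{obs[0].marker_id}",   # placeholder for the id -> type lookup
        x_m=sum(o.x_px for o in obs) / n * px_to_m,
        y_m=sum(o.y_px for o in obs) / n * px_to_m,
        heading_deg=sum(o.heading_deg for o in obs) / n,   # naive average; ignores angle wrap-around
    )
```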
In another embodiment, the system further includes a light adjustment module that keeps the illumination along the internal light path of the console uniform, ensuring that the internal high-speed image recognition devices acquire and recognize image information accurately.
In another embodiment, the perception model is a model that marks a particular piece of equipment or device and carries a dedicated identifier. In operation, the perception model is placed on the projection imaging and information acquisition screen, and dragging it controls the synchronous display of the corresponding three-dimensional model of the equipment to be observed in the three-dimensional simulation scene.
In another aspect of the embodiments of the present invention, a control method is provided, comprising:
step S1: the system is started, and the ultra-short-focus projection device generates a two-dimensional scene of the global working state on the projection imaging and information acquisition screen according to video display information provided by the high-performance computer;
step S2: according to the unique identifier carried by the perception model placed in the two-dimensional scene, the high-performance computer acquires the entity type and state information of the equipment to be observed and dynamically maintains that state information;
step S3: the high-performance computer builds a three-dimensional simulation scene, builds a three-dimensional model of the equipment to be observed from the acquired perception model information, fuses it with the three-dimensional simulation scene, and publishes model information comprising the three-dimensional model and the three-dimensional simulation scene over Ethernet;
step S4: the state information monitoring unit subscribes to the model information published by the high-performance computer in step S3, parses the information according to an interface protocol, and displays the three-dimensional simulation scene and the three-dimensional model on its three-dimensional display screen;
step S5: the state information monitoring unit updates the model information in real time as the content of the subscribed information changes;
step S6: an operator remotely adjusts the state of the perception model by operating the control unit, and the resulting effect is displayed on the local three-dimensional display screen.
In another embodiment, the method further comprises the operator acting on the equipment to be observed, according to working requirements, based on its real-time state in the three-dimensional simulation scene, specifically:
step S01: assurance deduction and teaching: the recorded historical data is used to reproduce the global working situation and replay the workflow;
step S02: scene roaming: by placing and controlling the perception model in the interaction area of the perception desktop, the scene is roamed from a 3D viewpoint to gain a comprehensive understanding of the working conditions.
The invention has the beneficial effects that:
the method has the advantages that the global working situation information can be integrated and displayed in a concise manner by adopting an interactive platform manner, so that the interaction cost among personnel is reduced, the overall operation efficiency is improved, and the possibility of errors is reduced;
secondly, a real-time natural interaction mode is adopted, the actual working situation is reflected in a 'what you see is what you get' mode, information heard in video or audio is changed into a seen actual physical model and a virtual guarantee situation, the overall working situation can be known more visually, and a user can conveniently master the overall information to make a decision;
the interaction equipment is generalized, the display scene and the base map can be changed by configuring the interaction equipment according to different application requirements, and one equipment adapts to different applications by deploying different scenes;
fourthly, the method can carry out deduction and reproduction of related processes, can be used for assisting users to quickly deduct related plans, is convenient to find problems, and can also quickly assist the users to be familiar with the operation processes;
fifthly, the method is simple, convenient, free of complex operation and easy to operate, and accords with human body operation and interaction habits;
and sixthly, the real-time arrangement state and the working state information can be distributed and published, the problem of information isolated island is solved, the real-time state is displayed in a visual and visual three-dimensional mode, a distributed information monitoring mode is provided, the use scene of the system is expanded, the information sharing degree is improved, and the use benefit of the equipment is improved.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The drawings are only for purposes of illustrating particular embodiments and are not to be construed as limiting the invention, wherein like reference numerals are used to designate like parts throughout.
FIG. 1 is a schematic diagram of a state information monitoring system based on projection fusion and image recognition;
FIG. 2 is an elevation view of an optical path of an optoelectronic device in accordance with one embodiment of the present invention;
FIG. 3 is a side view of an optical path of an optoelectronic device in accordance with one embodiment of the present invention;
fig. 4 is an operation diagram of the perception model.
Detailed Description
The preferred embodiments of the present invention will now be described in detail with reference to the accompanying drawings, which form a part hereof, and which together with the embodiments of the invention serve to explain the principles of the invention.
According to an embodiment of the present invention, a state information monitoring system based on projection fusion and image recognition is disclosed. As shown in fig. 1, the system comprises an equipment main body, a perception model, and one or more state information monitoring units. The equipment main body comprises a high-performance computer, a projection imaging and information acquisition screen, a video fusion device, an ultra-short-focus projection device, and a high-speed image recognition device. In use, the perception model is placed on the projection imaging and information acquisition screen and marks the equipment to be observed. The high-performance computer, ultra-short-focus projection device, projection imaging and information acquisition screen, video fusion device, and high-speed image recognition device are used for scene display, model information acquisition, and fusion processing and display. The state information monitoring unit is connected to the equipment main body over Ethernet and provides remote parameter configuration, state information interaction, and remote monitoring of the real-time state information of the three-dimensional model scene.
The state information monitoring unit comprises a host, a three-dimensional display screen, and a control unit, and is connected to the equipment main body over Ethernet. The host exchanges data with the high-performance computer of the equipment main body; the three-dimensional display screen and the control unit are connected to the host through connecting hardware, and the three-dimensional display screen displays the three-dimensional simulation scene of the observation site and the equipment to be observed. Monitoring units deployed at several locations can display the real-time state information of the three-dimensional model scene remotely and simultaneously, so every change in the state of the equipment to be observed appears on each monitoring unit's display; this makes it convenient to monitor the global state remotely, facilitates collaborative operation by multiple departments, allows the real-time state of the equipment to be observed to be grasped promptly, and helps prevent accidents caused by manual operation.
The control unit is used for parameter configuration, state information interaction, and acquisition of assurance condition information. For ease of operation, the control unit is a mobile terminal such as a portable handheld tablet connected to the host over a wireless network, making remote control convenient and fast. Switches provided on the control unit can also toggle the states of the three-dimensional scene, such as day and night scenes, light switching, and sunlight illumination, to simulate the real environment. On top of the projection fusion and image recognition interactive system, distributed Internet technology publishes the real-time deployment state and assurance condition information of the interactive system to multiple state information monitoring units; each monitoring unit can be deployed at a different location, and, using a subscribe-publish mechanism, the operating scene of the interactive system is reconstructed in three dimensions and its real-time working state is monitored. The state information monitoring unit can display the deployment information of the interactive system, the state information of the deployed units, and the working state of the equipment. The control unit exchanges information with the interactive system host through a wireless communication module, so it is not constrained by the equipment space and can be carried by personnel to perform various operations, improving the interaction capability, ease of use, and overall utility of the equipment.
The system also comprises a state information publishing unit connected to the equipment main body over Ethernet. It automatically acquires global information about the observation site and the equipment to be observed, such as fault state, power-on state, and lighting state, and automatically publishes the acquired state information to the equipment main body for simulating the global scene. State information is received and sent automatically and the states of the relevant equipment are maintained automatically, reducing the operators' workload and improving efficiency and accuracy.
The system also comprises a console body that mounts the component devices and provides a mounting frame for the equipment main body. In one embodiment of the invention, the console case is made of high-strength sheet metal or aluminum alloy, with fittings of stainless steel, high-grade steel, or high-strength aluminum alloy; the outer walls are powder-coated and sealed with conductive adhesive strips, and the interior of the case adopts a filtering design. The console body measures 2000 mm x 1000 mm, and the interaction area is no smaller than 1780 mm x 540 mm. Ultra-short-focus projection and video fusion confine the projected display within the interaction space of a console of roughly 2000 mm x 1000 mm, miniaturizing the equipment and improving its applicability.
The system further includes a light adjustment module, which adjusts the lighting so that the internal light path of the console is uniform, ensuring that the internal cameras acquire and recognize image information correctly.
The main functions of the equipment main body are scene display, model information acquisition, and fusion processing and display; its core control device is the high-performance computer.
For scene display, the high-performance computer outputs the scene image to the video fusion device, which splits the video according to the required resolution; the projection devices then project the image onto the projection imaging and information acquisition screen for display. The high-performance computer supplies the video to the ultra-short-focus projection devices, and a virtual-reality interaction area is realized using the high-speed image recognition devices, the ultra-short-focus projection devices, and the projection imaging and information acquisition screen; the interaction area displays a two-dimensional top view.
For model information acquisition, a perception model carrying a dedicated identifier is placed on the projection imaging and information acquisition screen. Two or more high-speed image recognition devices capture the video images and, at the same time, the identifier information on the perception model, and send the captured data to the video fusion device for video fusion and then on to the high-performance computer. The high-performance computer analyzes the acquired data and computes the state information of the perception model in the two-dimensional scene, including its position and orientation.
For fusion processing and display, the state information of the perception model collected in the two-dimensional scene is converted into three-dimensional information, from which a virtual three-dimensional model of the perception model is reconstructed in the three-dimensional simulation scene. Each perception model carries a unique identifier; parameters that describe the actual equipment to be observed, such as the model type and rotor folding state, are bound to that identifier, establishing a one-to-one mapping with the unique virtual three-dimensional model of the equipment to be observed. This realizes the reconstruction from a two-dimensional model to a three-dimensional model, which is then fused with the three-dimensional simulation scene to achieve a mixed virtual-real display with good interaction; finally, the three-dimensional simulation scene and the three-dimensional state information of the perception model are published to the three-dimensional display screen. In implementation, to improve the recognition rate of the perception model, the light adjustment module inside the equipment supplements the internal light path, ensuring efficient recognition and tracking of the perception model identifier.
In a specific embodiment of the invention, the high-performance computer uses a high-performance processor capable of processing and computing on video images in real time, with at least 16 GB of memory and a discrete graphics card with at least 8 GB of video memory; it supports wide aspect ratios at resolutions of at least 2560 x 1600, with custom resolutions and backward compatibility, and is configured with Ethernet and high-speed USB interfaces for peripheral connections.
The projection imaging and information acquisition screen displays the projected image while also capturing information from the perception model; in one embodiment, 5 mm coated toughened glass serves as the substrate, a dedicated projection film is applied to its rear surface to display the projected image, and the entire glass receives an ITO coating.
When there are two or more ultra-short-focus projection devices, the video fusion device processes the graphics signal output by the high-performance computer and sends it to the projection devices for stitched display; when there are two or more high-speed image recognition devices, it fuses the images they acquire by region and performs image target recognition. In one embodiment, the video input resolution supports 2560 x 1600.
Two or more ultra-short-focus projection devices perform ultra-short-throw projection and stitched display of the high-definition image output by the computer; in one embodiment, an ultra-short-focus lens with a throw ratio of 0.248 is used, and a laser projector ensures the uniformity of the projected image, with brightness of at least 3000 lumens and a contrast ratio of at least 10000:1.
The high-speed image recognition devices acquire images; the light adjustment module keeps the illumination along the internal light path of the console uniform, ensuring that the internal cameras acquire and recognize image information accurately.
In an embodiment of the invention, figs. 2-3 show the optical paths of the projection devices and the optoelectronic (camera) devices inside the interactive equipment based on projection fusion and image recognition. Constrained by the projection light path and the camera acquisition light path, 2 ultra-short-focus projection devices and 3 high-speed image recognition devices are used; the solid lines indicate the projection areas and the dashed lines the camera acquisition areas. The projection areas overlap, and the projected image is fused for display by the video fusion device; in the overlap region, a video fusion algorithm merges the 3 camera images, and recognition and tracking of the perception model identifier are computed on the fused image. In addition, the structural design includes adjustment mechanisms for each component, which allow fine adjustment during production.
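The fusion of overlapping camera images can be sketched as below, assuming a simple linear cross-fade (feather) blend of already-rectified, horizontally adjacent images; the patent does not specify the actual video fusion algorithm.

```python
import numpy as np

def feather_blend(left: np.ndarray, right: np.ndarray, overlap_px: int) -> np.ndarray:
    """Blend two already-rectified camera images whose last/first `overlap_px`
    columns cover the same area, using a linear cross-fade in the overlap."""
    h, w_l = left.shape[:2]
    w_r = right.shape[1]
    out_w = w_l + w_r - overlap_px
    out = np.zeros((h, out_w) + left.shape[2:], dtype=np.float32)
    out[:, :w_l - overlap_px] = left[:, :w_l - overlap_px]   # left-only region
    out[:, w_l:] = right[:, overlap_px:]                      # right-only region
    alpha = np.linspace(0.0, 1.0, overlap_px)[None, :]        # 0 -> left, 1 -> right
    if left.ndim == 3:
        alpha = alpha[..., None]                              # broadcast over colour channels
    out[:, w_l - overlap_px:w_l] = (1 - alpha) * left[:, w_l - overlap_px:] \
                                   + alpha * right[:, :overlap_px]
    return out.astype(left.dtype)
```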
The perception model is a physical model that marks a particular piece of equipment or device. It carries a dedicated identifier so that the optoelectronic devices can recognize and track it; it can also be referenced to perceive and interact from a first-person viewpoint, restoring the two-dimensional situation information shown on the console in three-dimensional form on the associated three-dimensional display devices. In operation, the perception model is placed on the projection imaging and information acquisition screen and dragged to control the synchronous display of the associated three-dimensional model; supported operations include placement, free dragging, rotation, horizontal movement, and vertical movement. In addition, the state of the perception model can be adjusted through the control unit to simulate a real scene, and through data interaction with the observed object in the real scene, the actual state of the observed object can be set as the state of the equipment or model marked by the perception model.
Fig. 4 is an operation diagram of the perception model. The general interaction device based on ultra-short-focus projection fusion and image recognition and tracking mainly uses the projection imaging and information acquisition screen region and the control unit as input devices; the screen region applies a natural interaction language to model the perception model and visualize the information of each function. The user can directly drag the perception model placed on the perception area to control what is displayed in the virtual two-dimensional/three-dimensional scene, and can place and control virtual scenery, objects, and even virtual people in the scene in real time, including lighting effects, sunlight, and colour schemes. Functions such as emergency drills, real-time monitoring, and command can also be performed in the virtual scene. The state of the corresponding equipment in the scene is controlled either by clicking the perception model label generated on the projection imaging and information acquisition screen and changing the related state information, or by entering the state information through the control unit. Control selection is performed by clicking the perception model label in the perception area or by selecting the corresponding entity on the control unit and setting its state.
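A minimal sketch of the state maintenance triggered by such label clicks or control-unit input follows; the in-memory store and field names are hypothetical.

```python
# Hypothetical in-memory state store maintained by the high-performance computer.
DEVICE_STATE = {17: {"type": "helicopter", "power": "off", "fault": False}}

def apply_control_command(marker_id: int, field: str, value):
    """Apply a state change coming either from a click on the perception-model label
    or from the handheld control unit, and return the updated record."""
    record = DEVICE_STATE.setdefault(marker_id, {})
    record[field] = value
    return record

# Example: apply_control_command(17, "power", "on")
```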
The control method of the state information monitoring system based on the ultra-short-focus projection fusion and image recognition tracking technology comprises the following steps:
Step S1: the system is started; the high-performance computer provides video display information to the ultra-short-focus projection devices, and a two-dimensional scene of the global working state is generated on the projection imaging and information acquisition screen.
Specifically, the high-performance computer outputs the scene image to the video fusion device, which splits the video according to the required resolution; the projection devices project the image onto the projection imaging and information acquisition screen for display. The high-performance computer supplies the video to the ultra-short-focus projection devices, and a virtual-reality interaction area is realized using the high-speed image recognition devices, the ultra-short-focus projection devices, and the projection imaging and information acquisition screen; the interaction area displays a two-dimensional top view.
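The video split step can be sketched as cutting one wide scene frame into overlapping per-projector tiles that the fusion device later edge-blends back together; the tile count and overlap width below are illustrative assumptions.

```python
import numpy as np

def split_for_projectors(frame: np.ndarray, n_projectors: int = 2, overlap_px: int = 120):
    """Cut one wide scene frame into per-projector tiles that overlap by `overlap_px`
    columns, so adjacent projections can be edge-blended into a seamless image."""
    h, w = frame.shape[:2]
    tile_w = (w + (n_projectors - 1) * overlap_px) // n_projectors
    tiles = []
    for i in range(n_projectors):
        x0 = i * (tile_w - overlap_px)          # each tile starts overlap_px before the previous ends
        tiles.append(frame[:, x0:x0 + tile_w].copy())
    return tiles
```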
Step S2: a perception model is placed in the two-dimensional scene; according to the unique identifier of the perception model, the high-performance computer acquires the entity type and state information of the equipment to be observed and dynamically maintains that state information.
Specifically, the operator places a perception model carrying a unique identifier on the projection imaging and information acquisition screen, and the high-speed image recognition devices inside the console capture the identifier on the bottom of the perception model. The high-speed recognition cameras transmit the captured information to the high-performance computer for analysis, which computes the entity type and state information of the equipment to be observed corresponding to the perception model; the state information mainly comprises the position and orientation of the perception model.
The operator can also set the entity type and state information of the equipment to be observed corresponding to the perception model by clicking the perception model label or via the control unit, and dynamically maintain that state information. Based on the received change information and the monitored actual situation, the user can place the marker so that its position is computed, and maintain the state through interactive operations in the perception area.
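The patent does not name the marker technology used for the perception model identifier; the sketch below assumes ArUco-style fiducials and the OpenCV (>= 4.7) ArUco API, with a hypothetical table mapping marker ids to entity types.

```python
import cv2
import numpy as np

# Assumes ArUco-style fiducials and OpenCV >= 4.7; the marker dictionary is illustrative.
DICTIONARY = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
DETECTOR = cv2.aruco.ArucoDetector(DICTIONARY, cv2.aruco.DetectorParameters())

ENTITY_TABLE = {7: "crane", 12: "forklift"}   # hypothetical id -> entity-type table

def detect_perception_models(gray_frame: np.ndarray):
    """Return (entity_type, centre_xy, heading_deg) for each marker seen in one camera frame."""
    corners, ids, _rejected = DETECTOR.detectMarkers(gray_frame)
    results = []
    if ids is None:
        return results
    for quad, marker_id in zip(corners, ids.flatten()):
        pts = quad.reshape(4, 2)                 # marker corner pixels, clockwise from top-left
        centre = pts.mean(axis=0)
        top_mid = (pts[0] + pts[1]) / 2.0        # midpoint of the marker's top edge
        heading = np.degrees(np.arctan2(top_mid[1] - centre[1], top_mid[0] - centre[0]))
        results.append((ENTITY_TABLE.get(int(marker_id), "unknown"),
                        tuple(centre), float(heading)))
    return results
```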
In another specific embodiment, the entity type and state information of the equipment to be observed are acquired by a state information publishing unit. Connected to the equipment main body over Ethernet, it collects global information about the observation site and the equipment to be observed, such as fault state, power-on state, and lighting state, and publishes the collected state information to the equipment main body for simulating the global scene.
Step S3: the high-performance computer builds a three-dimensional simulation scene, builds a three-dimensional model of the equipment to be observed from the acquired perception model information, fuses it with the three-dimensional simulation scene, and publishes model information comprising the three-dimensional model and the three-dimensional simulation scene over Ethernet.
Specifically, the high-performance computer converts the state information of the perception model acquired in the two-dimensional scene into three-dimensional information and, using three-dimensional modelling, reconstructs a virtual three-dimensional model of the perception model in the three-dimensional simulation scene. Each perception model carries a unique identifier; parameters that describe the actual equipment to be observed, such as the model type and rotor folding state, are bound to that identifier, establishing a one-to-one mapping with the unique virtual three-dimensional model of the equipment to be observed. This realizes the reconstruction from the two-dimensional model to the three-dimensional model, which is fused with the three-dimensional simulation scene to achieve a mixed virtual-real display with good interaction; finally, the three-dimensional simulation scene and the three-dimensional state information of the perception model are published to the three-dimensional display screen. In implementation, to improve the recognition rate of the perception model, the light adjustment module inside the equipment supplements the internal light path, ensuring efficient recognition and tracking of the perception model identifier.
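The patent specifies publication over Ethernet with a subscribe-publish mechanism but not the wire protocol; the sketch below assumes JSON messages over UDP multicast, with an illustrative group address and message schema.

```python
import json
import socket
import time

MCAST_GRP, MCAST_PORT = "239.1.1.7", 5007   # illustrative multicast group and port

def publish_scene_state(models, sock=None):
    """Serialize the fused scene state and publish it on the LAN.
    `models` is a list of dicts such as
    {"id": 17, "type": "helicopter", "x": 3.2, "y": 1.1, "yaw": 90.0, "rotor_folded": True}."""
    if sock is None:
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    message = {"timestamp": time.time(), "scene": "default", "models": models}
    sock.sendto(json.dumps(message).encode("utf-8"), (MCAST_GRP, MCAST_PORT))
    return sock
```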
Step S4: the state information monitoring unit subscribes to the model information published by the high-performance computer in step S3, parses the information according to an interface protocol, and displays the three-dimensional simulation scene and the three-dimensional model on its three-dimensional display screen.
The three-dimensional display screen displays the three-dimensional simulation scene of the observation site and the equipment to be observed. In a specific embodiment, several state information monitoring units are deployed at different locations so that the real-time state information of the three-dimensional model scene is displayed remotely and simultaneously at all of them; every change in the state of the equipment to be observed appears on each monitoring unit's display, making it convenient to monitor the global state remotely, facilitating multi-department collaboration, allowing the real-time state of the equipment to be grasped promptly, and preventing accidents caused by manual operation.
Step S5: the state information monitoring unit updates the model information in real time as the content of the subscribed information changes.
Step S6: an operator remotely adjusts the state of the perception model by operating the control unit, and the resulting effect is displayed on the local three-dimensional display screen.
At the same time, switches provided on the control unit can toggle the states of the three-dimensional scene, such as day and night scenes, light switching, and sunlight illumination, to simulate the real environment.
The method further comprises the following step:
The operator acts on the equipment to be observed, according to working requirements, based on its real-time state in the three-dimensional simulation scene; the operations may include:
Step S01: assurance deduction and teaching: the recorded historical data is used to reproduce the global working situation and replay the workflow.
Step S02: scene roaming: a first-person perception model is controlled and placed in the interaction area of the perception desktop, so that the view from the current position is obtained from a 3D viewpoint; moving the perception model roams the scene and gives a comprehensive understanding of the working conditions.
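Scene roaming can be sketched as converting the roaming perception model's desktop pose into a first-person virtual camera pose; the eye height and the level-gaze assumption below are illustrative, not prescribed by the patent.

```python
import math

def roaming_camera_pose(x_m: float, y_m: float, heading_deg: float,
                        eye_height_m: float = 1.7):
    """Turn the roaming perception model's position and heading on the desktop
    (already mapped to scene metres) into a first-person camera pose:
    eye position plus a unit look-direction vector in the scene frame."""
    yaw = math.radians(heading_deg)
    eye = (x_m, y_m, eye_height_m)
    look_dir = (math.cos(yaw), math.sin(yaw), 0.0)   # level gaze along the marker heading
    return eye, look_dir

# Example: eye, look = roaming_camera_pose(3.2, 1.1, 45.0)
```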
Beneficial effects:
the method has the advantages that the global working situation information can be integrated and displayed in a concise manner by adopting an interactive platform manner, so that the interaction cost among personnel is reduced, the overall operation efficiency is improved, and the possibility of errors is reduced;
secondly, a real-time natural interaction mode is adopted, the actual working situation is reflected in a 'what you see is what you get' mode, information heard in video or audio is changed into a seen actual physical model and a virtual guarantee situation, the overall working situation can be known more visually, and a user can conveniently master the overall information to make a decision;
the interaction equipment is generalized, the display scene and the base map can be changed by configuring the interaction equipment according to different application requirements, and one equipment adapts to different applications by deploying different scenes;
fourthly, the method can carry out deduction and reproduction of related processes, can be used for assisting users to quickly deduct related plans, is convenient to find problems, and can also quickly assist the users to be familiar with the operation processes;
fifthly, the method is simple, convenient, free of complex operation and easy to operate, and accords with human body operation and interaction habits;
and sixthly, the real-time arrangement state and the working state information of the interactive system can be distributed and published, the problem of information isolated island is solved, the real-time state of the interactive system is displayed in a visual and visual three-dimensional mode, a distributed information monitoring mode is provided, the use scene of the interactive system is expanded, the information sharing degree is improved, and the equipment use benefit is improved.
Those skilled in the art will appreciate that all or part of the flow of the methods in the above embodiments may be implemented by a computer program that instructs the related hardware and is stored in a computer-readable storage medium, such as a magnetic disk, an optical disk, a read-only memory, or a random access memory.
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention.

Claims (4)

1. A state information monitoring system based on projection fusion and image recognition, characterized by comprising an equipment main body, a perception model, and one or more state information monitoring units; the equipment main body comprises a high-performance computer, a projection imaging and information acquisition screen, a video fusion device, an ultra-short-focus projection device, and a high-speed image recognition device; in use, the perception model is placed on the projection imaging and information acquisition screen and marks the equipment to be observed; dragging the perception model controls the synchronous display of the associated three-dimensional model, and the operation modes of the perception model comprise placement, free dragging, rotation, horizontal movement, and vertical movement;
the perception model marks a particular piece of equipment or device and carries a dedicated identifier; in operation, the perception model is placed on the projection imaging and information acquisition screen, and dragging it controls the synchronous display of the three-dimensional model corresponding to the equipment to be observed in the three-dimensional simulation scene;
the high-performance computer, the ultra-short-focus projection device, the projection imaging and information acquisition screen, the video fusion device, and the high-speed image recognition device are used for scene display, model information acquisition, and fusion processing and display; the state information monitoring units are distributed, deployed at different locations, and synchronize information through a message subscribe-publish mechanism;
each state information monitoring unit comprises a host, a three-dimensional display screen, and a control unit; the host exchanges data with the high-performance computer of the equipment main body, the three-dimensional display screen displays a three-dimensional simulation scene of the observation site and the equipment to be observed, and the control unit is used for remote parameter configuration and state information interaction; switches provided on the control unit switch the states of the three-dimensional scene to simulate the real environment; the control unit adjusts the state of the perception model;
a console body mounts the component devices and provides a mounting frame for the equipment main body;
for scene display, the high-performance computer outputs a scene image to the video fusion device, the video fusion device splits the video according to the required resolution, and the ultra-short-focus projection device projects the image onto the projection imaging and information acquisition screen for two-dimensional display;
for model information acquisition, two or more high-speed image recognition devices capture the video images and, at the same time, the identifier information of the perception model placed on the projection imaging and information acquisition screen, transmit the captured information to the video fusion device for video fusion, and then to the high-performance computer; the high-performance computer analyzes the acquired data and computes the state information of the perception model in the two-dimensional scene, including its position and orientation; the projection imaging and information acquisition screen displays the projected image and also acquires the information of the perception model;
for fusion processing and display, the high-performance computer constructs a virtual three-dimensional model of the perception model from the captured information, fuses it with the three-dimensional simulation scene, and outputs the result to the three-dimensional display screen of the state information monitoring unit for display in the three-dimensional scene.
2. The system of claim 1, further comprising a light adjustment module that keeps the illumination along the internal light path of the console uniform, ensuring that the internal high-speed image recognition devices acquire and recognize image information accurately.
3. A control method based on the system of any one of claims 1-2, comprising:
step S1: the system is started, and the ultra-short-focus projection device generates a two-dimensional scene of the global working state on the projection imaging and information acquisition screen according to video display information provided by the high-performance computer;
step S2: according to the unique identifier carried by the perception model placed in the two-dimensional scene, the high-performance computer acquires the entity type and state information of the equipment to be observed and dynamically maintains that state information;
step S3: the high-performance computer builds a three-dimensional simulation scene, builds a three-dimensional model of the equipment to be observed from the acquired perception model information, fuses it with the three-dimensional simulation scene, and publishes model information comprising the three-dimensional model and the three-dimensional simulation scene over Ethernet;
step S4: the state information monitoring unit subscribes to the model information published by the high-performance computer in step S3, parses the information according to an interface protocol, and displays the three-dimensional simulation scene and the three-dimensional model on its three-dimensional display screen;
step S5: the state information monitoring unit updates the model information in real time as the content of the subscribed information changes;
step S6: an operator remotely adjusts the state of the perception model by operating the control unit, and the resulting effect is displayed on the local three-dimensional display screen.
4. The method of claim 3, further comprising the operator acting on the equipment to be observed, according to working requirements, based on its real-time state in the three-dimensional simulation scene, specifically:
step S01: assurance deduction and teaching: the recorded historical data is used to reproduce the global working situation and replay the workflow;
step S02: scene roaming: by placing and controlling the perception model in the interaction area of the perception desktop, the scene is roamed from a 3D viewpoint to gain a comprehensive understanding of the working conditions.
CN201710978943.5A 2017-10-19 2017-10-19 State information remote monitoring system and method based on projection fusion and image recognition Active CN108010079B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710978943.5A CN108010079B (en) 2017-10-19 2017-10-19 State information remote monitoring system and method based on projection fusion and image recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710978943.5A CN108010079B (en) 2017-10-19 2017-10-19 State information remote monitoring system and method based on projection fusion and image recognition

Publications (2)

Publication Number Publication Date
CN108010079A CN108010079A (en) 2018-05-08
CN108010079B (en) 2021-11-02

Family

ID=62051007

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710978943.5A Active CN108010079B (en) 2017-10-19 2017-10-19 State information remote monitoring system and method based on projection fusion and image recognition

Country Status (1)

Country Link
CN (1) CN108010079B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111586351A (en) * 2020-04-20 2020-08-25 上海市保安服务(集团)有限公司 Visual monitoring system and method for fusion of three-dimensional videos of venue

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105303615A (en) * 2015-11-06 2016-02-03 中国民航大学 Combination method of two-dimensional stitching and three-dimensional surface reconstruction of image
CN105549725A (en) * 2016-02-03 2016-05-04 深圳市中视典数字科技有限公司 Three-dimensional scene interaction display device and method

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005009051A1 (en) * 2003-07-16 2005-01-27 Aurora Digital Advertising Inc. Three dimensional display method, system and apparatus
WO2009076303A2 (en) * 2007-12-11 2009-06-18 Bbn Technologies, Corp. Methods and systems for marking and viewing stereo pairs of images
CN101510074B (en) * 2009-02-27 2010-12-08 河北大学 High present sensation intelligent perception interactive motor system and implementing method
CN101646067B (en) * 2009-05-26 2011-06-15 华中师范大学 Digital full-space intelligent monitoring system and method
CN103034755B (en) * 2012-11-29 2016-03-30 北京科东电力控制系统有限责任公司 Based on the substation visual method for inspecting of virtual reality technology
CN103632581B (en) * 2013-12-17 2015-12-30 国家电网公司 Electric energy acquisition terminal debugging O&M analog simulation method
CN103824318B (en) * 2014-02-13 2016-11-23 西安交通大学 A kind of depth perception method of multi-cam array
CN107014375B (en) * 2017-02-22 2020-05-22 上海谦尊升网络科技有限公司 Indoor positioning system and method with ultra-low deployment

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105303615A (en) * 2015-11-06 2016-02-03 中国民航大学 Combination method of two-dimensional stitching and three-dimensional surface reconstruction of image
CN105549725A (en) * 2016-02-03 2016-05-04 深圳市中视典数字科技有限公司 Three-dimensional scene interaction display device and method

Also Published As

Publication number Publication date
CN108010079A (en) 2018-05-08

Similar Documents

Publication Publication Date Title
CN111526118B (en) Remote operation guiding system and method based on mixed reality
WO2017075932A1 (en) Gesture-based control method and system based on three-dimensional displaying
CN108334199A (en) The multi-modal exchange method of movable type based on augmented reality and device
CN105549725B (en) A kind of three-dimensional scenic interaction display device and method
US11244511B2 (en) Augmented reality method, system and terminal device of displaying and controlling virtual content via interaction device
CN104268939A (en) Transformer substation virtual-reality management system based on three-dimensional panoramic view and implementation method of transformer substation virtual-reality management system based on three-dimensional panoramic view
CN113011723B (en) Remote equipment maintenance system based on augmented reality
CN108153502B (en) Handheld augmented reality display method and device based on transparent screen
WO2020020102A1 (en) Method for generating virtual content, terminal device, and storage medium
CN108594999A (en) Control method and device for panoramic picture display systems
CN207883270U (en) A kind of 3D holographic interaction display systems for museum
CN109032357A (en) More people's holography desktop interactive systems and method
US10665034B2 (en) Imaging system, display apparatus and method of producing mixed-reality images
CN103559007A (en) Method and device for dynamically generating screen wallpaper
Schütt et al. Semantic interaction in augmented reality environments for microsoft hololens
CN113253842A (en) Scene editing method and related device and equipment
CN108010079B (en) State information remote monitoring system and method based on projection fusion and image recognition
CN206946194U (en) A kind of holographic 3D interactive exhibition systems based on artificial intelligence Visual identification technology
WO2020151256A1 (en) Intelligent browsing method and system applied to cultural tourism
CN204576482U (en) Hologram showcase
CN206639510U (en) Sand table system based on VR interactions
CN111047713B (en) Augmented reality interaction system based on multi-vision positioning and operation method thereof
CN109841196B (en) Virtual idol broadcasting system based on transparent liquid crystal display
CN103777915A (en) Immersed type interaction system
CN106652712A (en) Display system and display method for human model data under virtual reality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant