CN114598790A - Subjective visual angle posture capturing and real-time image system - Google Patents

Subjective visual angle posture capturing and real-time image system

Info

Publication number
CN114598790A
Authority
CN
China
Prior art keywords
controller
transmitter
signal
visual effect
workstation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210277850.0A
Other languages
Chinese (zh)
Other versions
CN114598790B
Inventor
李洪新
蔡震宇
魏敬鹏
卢中兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dison Digital Entertainment Technology Co ltd
Original Assignee
Beijing Dison Digital Entertainment Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dison Digital Entertainment Technology Co ltd filed Critical Beijing Dison Digital Entertainment Technology Co ltd
Priority to CN202210277850.0A priority Critical patent/CN114598790B/en
Publication of CN114598790A publication Critical patent/CN114598790A/en
Application granted granted Critical
Publication of CN114598790B publication Critical patent/CN114598790B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N 5/2224 Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/76 Television signal recording
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/76 Television signal recording
    • H04N 5/91 Television signal processing therefor
    • H04N 5/93 Regeneration of the television signal or of selected parts thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The invention provides a subjective visual angle posture capturing and real-time image system. The system comprises: a signal marker tree, which receives optical motion-capture data through a terminal collector and transmits it to a workstation; a controller, which sends orientation-control instructions and shooting instructions to the virtual camera through a wireless control-end handle; a workstation, which selects the corresponding virtual scene according to the data signal, shoots in the virtual scene through the virtual camera, and combines the virtual video data acquired by the virtual camera with the data signal to generate composite data; an image transmitter, which resolves the composite data into video data; and a visual effect controller, which displays the video data transmitted by the image transmitter. The system removes the limitation of the shooting-site environment, offers good identifiability and stability without outside interference, and allows the preview effect to be controlled while the real-time effect preview is captured from the subjective viewing pose. It is mainly applied to rapid previsualization of visual effects in three-dimensional production of films, animations, games and the like.

Description

Subjective visual angle posture capturing and real-time image system
Technical Field
The invention relates to the technical field of subjective visual angle posture capture, and in particular to a subjective visual angle posture capturing and real-time image system.
Background
With the continuous development of science and technology, 5G technology has matured. As two-dimensional (ACG) and metaverse technologies come before the public, the rise of the film, television, animation and game industries has drawn great attention, and Previz (previsualization) technology has emerged rapidly alongside them. Previsualization increases the production speed of film, television, animation and games and saves redundant production steps and cost overhead. The subjective visual angle posture capturing and real-time image system can integrate the existing motion-capture technologies and, through decoding and secondary conversion, turn motion-capture data into data that a three-dimensional production engine can recognize.
The subjective visual angle posture capturing and real-time image system combines optical motion-capture technology with the MotionBuilder three-dimensional production engine and can create, in the three-dimensional virtual space, a full virtual camera that behaves exactly like a real camera. Different places in the virtual three-dimensional space can be viewed through this virtual camera. At the same time, the pictures seen by the virtual camera are transmitted in real time to a visual effect controller, so that directors, investors, producers and the like can view and direct the performances of on-set performers, their blocking, and the lights and props in the virtual scene. The system also has a network cloud function, so that personnel who cannot reach the site can view its real-time images through a mobile-client APP.
At present, the subjective view pose capturing and real-time imaging systems of the prior art have the following disadvantages: much time is needed to debug the equipment during early preparation, and picture delay exists when the pictures of the virtual-engine three-dimensional production software are transmitted over the network cloud.
Disclosure of Invention
Embodiments of the present invention provide a subjective view pose capturing and real-time imaging system to overcome the problems of the prior art.
To achieve the above object, the invention adopts the following technical solution.
A subjective visual angle pose capturing and real-time imaging system, comprising: a signal marker tree, a controller, an image transmitter, a visual effect controller and a workstation, wherein the signal marker tree is connected to the workstation by wire, the workstation is connected to the signal marker tree, the controller and the image transmitter by wire, and the image transmitter is connected to the visual effect controller by wire;
the signal marker tree is used for receiving optical motion-capture data through a terminal collector, encoding and resolving the optical motion-capture data to form a data signal, and transmitting the data signal to the workstation;
the controller is used for controlling the orientation movement and shooting of the virtual camera in the workstation through a wireless control-end handle, sending push, pull, pan and move orientation-control instructions to the virtual camera, and sending shooting instructions;
the workstation is used for selecting a corresponding virtual scene according to the data signal transmitted by the signal marker tree, shooting in the virtual scene with the virtual camera according to the orientation-control instructions and shooting instructions transmitted by the controller, combining the virtual video data acquired by the virtual camera with the data signal to generate composite data, and transmitting the composite data to the image transmitter;
the image transmitter is used for resolving the composite data transmitted by the workstation into video data through video-stream soft decoding and transmitting the video data to the visual effect controller;
and the visual effect controller is used for displaying the video data transmitted by the image transmitter.
Preferably, the signal marker tree is installed on top of the visual effect controller and connected to the visual effect controller with a data cable; the visual effect controller is installed at the head of the keel frame and fixed with a buckle and a bolt clamp; two controllers are installed on the left and right handles of the keel frame and fixed with buckle bolts; the transmitter end and the receiver end of the image transmitter are installed on the central region of the keel frame; the image transmitter is connected to the workstation with an HDMI transmission line, one end of which is connected to the transmitter and the other end of which is inserted into the HDMI female port of the workstation; and the image transmitter receiver is then fixed to the central position of the keel frame with an articulated magic arm and a clamp.
Preferably, the signal marker tree comprises a receiving terminal ball, a signal transmission skeleton and a signal processor which are connected with each other;
the receiving terminal ball is a spheroid made of PC material and is used for receiving, from all directions, the infrared light signals emitted by the motion-capture system; the receiving terminal ball reflects the infrared light signals back to the motion-capture system, and the motion-capture system obtains the position information of the receiving terminal ball from the reflected infrared light signals;
the signal transmission skeleton is made of titanium alloy, corresponds one-to-one with the receiving terminal balls, and is used for transmitting the infrared light signals received by the receiving terminal balls to the signal processor and for transmitting the data signal generated by the signal processor to the workstation;
the signal processor comprises an IC bidirectional intelligent chip, a rheostat, a miniature circuit board and a rigid outer box, and is used for intelligently analyzing, computing and compiling the infrared light signals through the IC bidirectional chip and regenerating data signals that the three-dimensional production engine in the workstation can recognize and use;
the position and motion-state information of the user's handheld equipment is determined through the receiving terminal ball, and the signal processor sends the position and motion-state information of the user's handheld equipment to the workstation.
Preferably, the bottom of each receiving terminal ball is soldered, and the spring buckle at the bottom of the receiving terminal ball is buckled onto one section of the signal transmission skeleton; the ends of the signal transmission skeletons connected to the receiving terminal balls are connected one by one to the reserved holes on the signal processor and firmly soldered, and the other ends are installed in the signal-receiving holes above the visual effect controller and firmly soldered.
Preferably, the controller includes: a controller receiver and a controller transmitter;
the controller receiver is used for transmitting the operation-instruction signals received by the receiver terminal to the three-dimensional production virtual engine in the workstation, and performing the corresponding orientation-control and/or shooting-control operations on the virtual camera according to the operation-instruction signals;
and the controller transmitter is used for sending push, pull, pan, move and shoot operation-instruction signals to the receiver terminal of the controller receiver through the transmitter terminal of the controller operating handle.
Preferably, the controller receiver is inserted into a USB port of the workstation host, the connection plug-in of the controller is installed on the workstation, the controller receiver is wirelessly paired with the controller operating handle in the controller transmitter, the controller operating handle is installed on the handheld grip of the rigid keel frame in the visual effect controller, and its orientation is fixed with a fixing clip buckle.
Preferably, the image transmitter includes: a wireless antenna, a power supply, an image-transmission switch, a matching module and a signal processing module;
the wireless antenna is used for transmitting and receiving data signals, with two wireless antennas installed on each of the transmitter and the receiver of the image transmitter;
the power supply is installed in the power-supply sockets of the image transmitter's transmitter and receiver and supplies power to them;
the image-transmission switch is soldered to the power-supply position of the circuit board inside the image transmitter and fixed in the reserved switch hole of the image transmitter housing; when the image-transmission switch is on, the power supply powers the image transmitter normally, and when it is off, the power supply stops powering the image transmitter;
the matching module's indicator lamp flashes orange-yellow while the transmitting end of the image transmitter is being matched with the receiving end, shows green when matching succeeds, and shows red when matching fails;
and the signal processing module is used for resolving the composite data transmitted by the workstation and received at the receiving end of the image transmitter into video data through video-stream soft decoding, and transmitting the video data to the visual effect controller through the transmitting end of the image transmitter.
Preferably, the visual effect controller comprises: a display window, a power supply, a switch, keys, input/output terminals, a cloud module and a rigid keel bracket;
the display window is used for displaying the video data transmitted by the image transmitter and the operation information of the virtual camera;
the power supply in the visual effect controller is installed in the power-supply socket of the visual effect controller and supplies electric energy to it;
the switch in the visual effect controller is used for controlling the working states of the power supply and the display window of the visual effect controller;
the keys in the visual effect controller are used for controlling what video content is displayed in the display window and comprise a record button, a playback button, a pause button and previous-frame/next-frame buttons;
the input/output terminals in the visual effect controller receive the video data transmitted by the image transmitter through the input terminal and transmit video data to other equipment through the output terminal; they adopt high-definition HDMI interfaces, the two ports being soldered respectively to the data-signal receiving end and the signal output end of the circuit board inside the visual effect controller, and the image transmitter is connected to the input terminal of the visual effect controller through an HDMI transmission line;
the cloud module in the visual effect controller is used for storing the video data transmitted by the image transmitter and the virtual video data acquired by the virtual camera, and for preview viewing of the video data through a custom APP;
and the rigid keel bracket in the visual effect controller is used for fixing and supporting the controller, the visual effect controller and the image transmitter in the system.
Preferably, two hollow tubes of the rigid keel bracket are respectively installed in reserved holes of the same diameter in the shoulder pad, the other ends of the two hollow tubes are fixed in a faucet buckle, the faucet buckle is a rectangular rigid body with reserved mounting holes, and left and right sweat-proof plastic grips are embedded below the faucet buckle.
Preferably, the workstation is configured to receive the data signal, containing the position and motion-state information of the user's handheld device, transmitted by the signal marker tree; match and bind the position and motion-state information of the user's handheld device to the virtual camera in the three-dimensional production software; run the three-dimensional production engine; synthesize the data signal with the picture of the virtual camera to generate composite data; and transmit the composite data to the image transmitter.
It can be seen from the technical solutions provided by the embodiments of the present invention that the subjective visual angle posture capture and real-time image system of the embodiments can receive the collected motion-capture data signals, resolve and convert them into data signals that the three-dimensional production software can read and recognize, and assign the converted data signals to the virtual camera in the three-dimensional engine. At the same time, the image signal viewed by the virtual camera can be transmitted to the visual effect controller through the image transmitter, so that the user can operate the orientation of the virtual camera and preview its picture in real time.
Additional aspects and advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a schematic diagram of a subjective visual angle pose capturing and real-time imaging system according to an embodiment of the present invention;
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
As used herein, the singular forms "a", "an", "the" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or coupled. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be understood by those skilled in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
For the convenience of understanding the embodiments of the present invention, the following description will be further explained by taking several specific embodiments as examples in conjunction with the drawings, and the embodiments are not to be construed as limiting the embodiments of the present invention.
A schematic structural diagram of a subjective view pose capturing and real-time imaging system according to an embodiment of the present invention is shown in FIG. 1. It includes: a signal marker tree, a controller, an image transmitter, a visual effect controller and a workstation. The signal marker tree is connected to the workstation by wire, the workstation is connected to the signal marker tree, the controller and the image transmitter by wire, and the image transmitter is connected to the visual effect controller by wire;
the signal marker tree is used for receiving optical motion-capture data through a terminal collector, encoding and resolving the optical motion-capture data to form a data signal, and transmitting the data signal to the workstation;
the controller is used for controlling the orientation movement and shooting of a virtual camera created in the three-dimensional production software on the workstation through a wireless control-end handle, sending push, pull, pan and move orientation-control instructions to the virtual camera, and sending shooting instructions;
the workstation is used for selecting a corresponding virtual scene according to the data signal transmitted by the signal marker tree, shooting in the virtual scene with the virtual camera created in the three-dimensional production software according to the orientation-control and shooting instructions transmitted by the controller, combining the virtual video data acquired by the virtual camera with the data signal to generate composite data, and transmitting the composite data to the image transmitter;
the image transmitter is used for resolving the composite data transmitted by the workstation into video data through video-stream soft decoding and transmitting the video data to the visual effect controller;
and the visual effect controller is used for displaying the video data transmitted by the image transmitter.
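As an aid to reading the embodiment, the following minimal Python sketch models the data flow just described (signal marker tree to workstation to image transmitter to visual effect controller). The class and field names (PoseSignal, CameraCommand, Workstation.compose and so on) are illustrative assumptions and not part of the patented design.

```python
from dataclasses import dataclass

# Hypothetical data carriers and class names; the patent does not define a software API.
@dataclass
class PoseSignal:            # encoded output of the signal marker tree
    position: tuple          # (x, y, z) of the handheld rig
    rotation: tuple          # (rx, ry, rz)

@dataclass
class CameraCommand:         # orientation/shooting command from the controller
    kind: str                # "push" | "pull" | "pan" | "move" | "shoot"

class Workstation:
    def compose(self, signal: PoseSignal, command: CameraCommand) -> dict:
        """Drive the virtual camera with the pose and command, then return composite data."""
        virtual_frame = {"scene": "default", "camera_pose": signal, "last_command": command.kind}
        return {"virtual_video": virtual_frame, "data_signal": signal}   # composite data

class ImageTransmitter:
    def resolve(self, composite: dict) -> dict:
        """Stand-in for the video-stream soft decoding of the composite data."""
        return composite["virtual_video"]

class VisualEffectController:
    def display(self, video: dict) -> None:
        print("previewing frame:", video)

# One pass through the pipeline described in the embodiment.
signal = PoseSignal(position=(1.0, 1.5, 3.0), rotation=(0.0, 30.0, 0.0))
command = CameraCommand(kind="push")
video = ImageTransmitter().resolve(Workstation().compose(signal, command))
VisualEffectController().display(video)
```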
Overall connection of the subjective visual angle posture capturing and real-time image system: first, the signal marker tree is installed on top of the visual effect controller and connected to it with a transmission line; the visual effect controller is installed at the head of the keel frame and fixed with a buckle and a bolt clamp. Two controllers are then installed on the left and right handles of the keel frame and fixed with buckle bolts, while the transmitter end and the receiver end of the image transmitter are installed on the central region of the keel frame. The image transmitter is connected to the workstation with an HDMI (High Definition Multimedia Interface) transmission line, one end of which is connected to the transmitter and the other end of which is inserted into the HDMI female port of the workstation. The image transmitter receiver is then fixed to the central position of the keel frame with an articulated magic arm and a clamp.
Overall working principle of the subjective visual angle posture capturing and real-time image system: the signal marker tree receives the optical motion-capture data through the terminal collector, and at the same time encodes and computes the captured data to form computer binary data, which is transmitted to the workstation and attached to the virtual camera in the MotionBuilder three-dimensional production software. The workstation picture is transmitted from the image transmitter's base-station transmitting end to the image transmitter's receiving end using stream-code compression, resolved into video data by H.264 video-stream soft decoding, and transmitted to the visual effect controller through the HDMI transmission line. The controller mainly controls the virtual camera created in MotionBuilder on the workstation and performs orientation operations such as push, pull, pan and move on it.
The signal marker tree is used for receiving the motion-capture data signal and comprises: a receiving terminal ball, a signal transmission skeleton and a signal processor.
The receiving terminal ball is a spheroid of 1.2 cm diameter made of a special PC material and is used for receiving the infrared light signals emitted by the motion-capture system; the motion-capture system emits invisible 850 nm infrared light through dedicated capture cameras. The infrared light falls on the receiving terminal balls of the signal marker tree, which reflect it back to the motion-capture system, so that infrared-light interaction takes place between the receiving terminal balls and the capture cameras, and the motion-capture system obtains the position of each receiving terminal ball from the reflected infrared light. The receiving terminal ball is made spherical so that the infrared signals emitted and fed back by the motion-capture cameras can be received from any direction; even if part of the sphere is occluded, the infrared signal emitted by the motion-capture system is not lost entirely.
The signal transmission skeleton is made of titanium alloy, which is hard and not easily damaged, and forms an efficient signal-transmission medium through special manufacturing of, from inside to outside, an optical fibre, an insulating layer, a flame-retardant layer and a titanium-alloy mesh layer. To match the receiving terminal balls efficiently, the signal transmission skeletons all use titanium-alloy rods of 3 mm diameter and 10 cm to 25 cm length. The number of signal transmission skeletons is the same as the number of receiving terminal balls, with a one-to-one correspondence. The signal transmission skeleton transmits the infrared light signals received by the receiving terminal ball to the signal processor.
The signal processor comprises an IC bidirectional intelligent chip, a rheostat, a miniature circuit board and a rigid outer box. The signal processor processes the infrared light signals: it intelligently analyzes, computes and compiles the infrared light signals received by the receiving terminal balls from all directions through the IC bidirectional chip and regenerates data signals that the three-dimensional production engine in the workstation can recognize and use.
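The patent does not specify the format of the regenerated data signal. The sketch below assumes a simple binary packet, one (x, y, z) float triple per receiving terminal ball behind a hypothetical magic header, purely to illustrate the compile-and-regenerate step.

```python
import struct

# Illustrative packet layout (not specified in the patent): a little-endian header
# followed by one (x, y, z) float triple per receiving terminal ball.
HEADER = b"SMT1"  # "signal marker tree", hypothetical magic bytes

def encode_marker_frame(markers):
    """Compile reflected-marker positions into a binary data signal for the workstation."""
    payload = struct.pack("<I", len(markers))
    for x, y, z in markers:
        payload += struct.pack("<3f", x, y, z)
    return HEADER + payload

def decode_marker_frame(packet):
    """Inverse operation, as the three-dimensional production engine side might read it."""
    assert packet[:4] == HEADER
    (count,) = struct.unpack_from("<I", packet, 4)
    offset, markers = 8, []
    for _ in range(count):
        markers.append(struct.unpack_from("<3f", packet, offset))
        offset += 12
    return markers

packet = encode_marker_frame([(0.1, 1.2, 2.3), (0.4, 1.5, 2.6)])
print(decode_marker_frame(packet))
```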
The signal marker tree is assembled as follows: the bottom of each receiving terminal ball is soldered (the solder also conducts), and the spring buckle at the bottom of the receiving terminal ball is buckled onto one section of the signal transmission skeleton. The ends of the signal transmission skeletons connected to the receiving terminal balls are then connected one by one to the reserved holes on the signal processor and firmly soldered. The other ends of the signal transmission skeletons are installed in the signal-receiving holes above the visual effect controller 4 and firmly soldered.
The position of the user's handheld device (the subjective visual angle posture capturing and real-time image system rig) is determined through the receiving terminal balls, and the motion-state information of the device, such as X-axis, Y-axis and Z-axis movement, is determined as well. The position and motion-state information of the handheld device is passed together to the signal processor, which sends it to the workstation to be bound to the virtual camera.
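As an illustration of the motion-state information mentioned above (X-axis, Y-axis and Z-axis movement), the following sketch derives per-axis movement labels from two consecutive rig positions; the threshold and function name are assumptions, not part of the patent.

```python
# Derive per-axis movement labels for the handheld device from consecutive positions.
def motion_state(prev_pos, curr_pos, threshold=0.001):
    """Return per-axis movement labels, or 'stationary' if nothing moved."""
    labels = []
    for axis, (p, c) in zip("XYZ", zip(prev_pos, curr_pos)):
        if abs(c - p) > threshold:
            labels.append(f"{axis}-axis movement {'+' if c > p else '-'}{abs(c - p):.3f}")
    return labels or ["stationary"]

print(motion_state((0.0, 1.0, 2.0), (0.05, 1.0, 1.8)))
# ['X-axis movement +0.050', 'Z-axis movement -0.200']
```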
The controller is used for controlling the virtual camera in the three-dimensional production engine in the workstation, performing a series of orientation and position operations on it as well as virtual-camera shooting. The controller includes: a controller receiver and a controller transmitter.
The controller and the signal marker tree are two independent modules, each responsible for its own work. The controller controls the position of the camera in the three-dimensional software on the workstation, for example moving the camera (up, down, left, right, and panning), and also controls the camera's lens, for example its aperture and focal length. The device held by the user does not move when the controller issues commands to the virtual camera; what moves is the virtual camera, whose real-time picture is transmitted to the visual effect controller and the cloud through the signal processing module and the image transmission module. What the controller controls is the virtual camera created in the three-dimensional engine, not the equipment held by the user.
The controller receiver is used for transmitting the operation-instruction signals received at the receiver terminal to the three-dimensional production virtual engine in the workstation and applying the transmitted instructions to the virtual camera.
The controller transmitter is used for sending push, pull, pan, move and shoot operation-instruction signals to the receiver terminal of the controller receiver through the transmitter terminal of the controller transmitter.
When the user presses the push button, the virtual camera lens picture in the workstation's three-dimensional production engine pushes in; when the pull button is pressed, the virtual camera lens in the engine pulls out. The push and pull buttons have two working modes, click and long press: in long-press mode, holding the push button keeps pushing the picture in until the focal length of the virtual camera reaches its maximum, and holding the pull button keeps pulling the picture out until the focal length of the virtual camera reaches its minimum.
When the user presses the pan button, the virtual camera in the three-dimensional production engine pans clockwise or counterclockwise according to the user's operation instruction. When the user presses the move button, the virtual camera in the three-dimensional production engine moves in the corresponding direction (left, right, front, back, up or down).
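The push/pull and move behaviour described above can be summarised by the following sketch. The focal-length range, step sizes and function names are illustrative assumptions; only the click versus long-press logic and the six movement directions follow the text.

```python
# Push/pull zoom control: a click nudges the focal length, a long press drives it
# to the corresponding limit. Limits and step are hypothetical values in millimetres.
FOCAL_MIN, FOCAL_MAX, STEP = 18.0, 200.0, 2.0

def apply_zoom(focal_length, button, long_press=False):
    """Return the new focal length after a 'push' (zoom in) or 'pull' (zoom out)."""
    if long_press:
        return FOCAL_MAX if button == "push" else FOCAL_MIN
    delta = STEP if button == "push" else -STEP
    return min(FOCAL_MAX, max(FOCAL_MIN, focal_length + delta))

def apply_move(position, direction, step=0.1):
    """Shift the virtual camera for the 'move' command (left/right/front/back/up/down)."""
    dx, dy, dz = {"left": (-step, 0, 0), "right": (step, 0, 0),
                  "front": (0, 0, step), "back": (0, 0, -step),
                  "up": (0, step, 0),   "down": (0, -step, 0)}[direction]
    x, y, z = position
    return (x + dx, y + dy, z + dz)

print(apply_zoom(50.0, "push"))          # 52.0
print(apply_zoom(50.0, "pull", True))    # 18.0 (zoomed out to the minimum)
print(apply_move((0, 0, 0), "left"))     # (-0.1, 0, 0)
```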
The virtual camera in the three-dimensional production engine shoots pictures and records video; pictures shot in the three-dimensional production engine and recorded video can also be reviewed with the previous-frame and next-frame keys.
The controller is connected as follows: the controller receiver is first inserted into a USB 3.0 port of the workstation host and the controller's connection plug-in is installed on the workstation; the controller receiver is then wirelessly paired with the controller operating handle. While the controller receiver and transmitter are being matched, the indicator lights on both flash yellow continuously; after pairing succeeds the indicators show steady green, and after pairing fails they flash red. The user can therefore judge from the indicator colour whether the controller connection is normal. The controller handle, the transmitter part of the controller, is installed on the handheld grip of the rigid keel frame in the visual effect controller and its orientation is fixed with a fixing clip buckle. Mounting the controller on the handheld grip of the keel frame makes it convenient to operate in practice and allows the virtual camera to be previewed and operated without affecting the stability of the subjective visual angle posture capturing and real-time image system.
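The indicator colours described above map to connection states as in the following small sketch; the mapping function itself is an illustrative assumption used only to make the states explicit.

```python
# Interpret the controller pairing indicator (colour, pattern) pairs from the text.
def pairing_status(indicator):
    """indicator: ('yellow', 'flashing') | ('green', 'steady') | ('red', 'flashing')."""
    colour, pattern = indicator
    if (colour, pattern) == ("yellow", "flashing"):
        return "pairing in progress"
    if (colour, pattern) == ("green", "steady"):
        return "paired: controller receiver linked to the operating handle"
    if (colour, pattern) == ("red", "flashing"):
        return "pairing failed: retry wireless pairing"
    return "unknown state"

print(pairing_status(("green", "steady")))
```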
The image transmitter comprises: a wireless antenna, a power supply, an image-transmission switch, a matching module and a signal processing module.
Working principle of the wireless antenna: it is mainly used for digital-signal transmission, transmitting and receiving for the image transmitter. There are four wireless antennas in total, two on the transmitter and two on the receiver. Briefly, the radio-frequency signal output by the radio transmitter is carried to the antenna through a feeder (cable) and radiated by the antenna as electromagnetic waves; at the receiving location the electromagnetic waves are picked up by the antenna and carried by the feeder to the radio receiver. The antenna is thus the essential radio device for transmitting and receiving electromagnetic waves; without an antenna there is no radio communication. Antennas come in many varieties for different frequencies, purposes, occasions and requirements, and can be classified by use (communication, television, radar antennas and so on), by working frequency band (short-wave, ultra-short-wave, microwave antennas and so on), by directionality (omnidirectional, directional antennas and so on), and by shape (linear and planar antennas).
Connection of the wireless antenna: the four wireless antennas are installed on the signal-transmitting port at the top of the image transmitter's transmitter and on the receiving port at the top of the image transmitter's receiver, and fixed with the outer hexagonal metal screws on the wireless antennas.
Working principle of the power supply in the image transmitter: the power supply comes in two forms. One is a dedicated rechargeable battery, mainly a lithium battery of 10 cm length, 8 cm width and 5 cm height, which can be recharged with the matching charger when depleted. The other is a 9 V DC power supply. The difference is one of convenience: the lithium battery allows the device to be used anywhere, unconstrained by mains power, while the DC adapter can power the image transmitter only where mains power is available.
Connection of the power supply: power supplies are installed in the power-supply sockets at the transmitting end and the receiving end of the image transmitter. The power-supply mode is chosen by the user according to the way and place of use.
Working principle of the image-transmission switch in the image transmitter: it controls whether the device is on or off. It is a toggle switch, ON to the left and OFF to the right. When it is on, the power supply powers the image transmitter normally; when it is off, the power supply stops powering the image transmitter.
Connection of the image-transmission switch: the toggle switch is soldered, with attention to its positive and negative poles, to the power-supply position of the circuit board inside the image transmitter with an electric soldering iron, and is then fixed in the reserved switch hole of the image transmitter housing.
Working principle of the matching module in the image transmitter: while the transmitting end of the image transmitter is being matched with the receiving end, the matching-module indicator flashes orange-yellow; when matching succeeds, the indicator shows green; when matching fails, it shows red. A homing (reset) button sits below the matching-module indicator. Pressing the homing button on the image transmitter disconnects it from the image transmitter receiver. To re-pair, the homing buttons on the matching modules at both the transmitting end and the receiving end are pressed at the same time until the matching indicators flash orange-yellow rapidly; the buttons are then released and the devices quickly enter the matching state.
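The matching and homing behaviour can be modelled as a small state machine, as in the sketch below. The state names follow the indicator colours in the text; the class and method names are assumptions.

```python
# State machine for the image transmitter matching module described above.
class MatchingModule:
    def __init__(self):
        self.state = "idle"          # indicator off

    def start_matching(self):
        self.state = "matching"      # orange-yellow flashing

    def result(self, success):
        self.state = "paired" if success else "failed"   # green / red indicator

    def press_homing(self, other_end_pressed=False):
        if other_end_pressed:
            self.start_matching()    # both ends pressed: re-enter the matching state
        elif self.state == "paired":
            self.state = "idle"      # single press: disconnect transmitter from receiver

m = MatchingModule()
m.start_matching(); m.result(True)
m.press_homing()                     # link dropped, indicator back to idle
print(m.state)
```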
The signal processing module in the image transmitter is used for resolving the composite data transmitted by the workstation and received at the receiving end of the image transmitter into video data through video-stream soft decoding, and transmitting the video data to the visual effect controller through the transmitting end of the image transmitter.
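The patent only states that the received composite stream is software-decoded into video data. As one possible illustration, the sketch below uses OpenCV to decode and display frames pulled from a stream endpoint; the endpoint URL and the choice of OpenCV are assumptions, not part of the claimed design.

```python
# Minimal soft-decode-and-display loop, assuming an OpenCV build with FFmpeg support.
import cv2

cap = cv2.VideoCapture("udp://0.0.0.0:5000")   # hypothetical receiver endpoint
while cap.isOpened():
    ok, frame = cap.read()                     # soft-decoded frame (BGR image)
    if not ok:
        break
    cv2.imshow("visual effect controller preview", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):      # press 'q' to stop the preview
        break
cap.release()
cv2.destroyAllWindows()
```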
The visual effect controller comprises: a display window, a power supply, a switch, keys, input/output terminals, a cloud module and a rigid keel bracket.
The display window of the visual effect controller is used for displaying the picture from the workstation's three-dimensional production engine and the operation information of the virtual camera. The display window uses a high-definition 4K liquid-crystal screen, so that the film, animation and game production pictures in the workstation's three-dimensional production engine appear smoother and clearer. The push, pull, pan and move operation information applied to the virtual camera by the controller is also shown in the display window.
The power supply in the visual effect controller supplies electric energy to the visual effect controller and comes in two forms. One is a dedicated rechargeable battery, mainly a lithium battery of 15 cm length, 10 cm width and 13 cm height, which can be recharged with the matching charger when depleted. The other is a 12 V DC power supply. The difference is one of convenience: the lithium battery allows the device to be used anywhere, unconstrained by mains power, while the DC adapter can power the device only where mains power is available.
Connection of the power supply: the power supply is installed in the power-supply socket of the visual effect controller. The power-supply mode is chosen by the user according to the way and place of use.
Working principle of the switch in the visual effect controller: it controls the power supply of the visual effect controller and the working state of its display window. The switch controlling the power supply is mounted on the front and back of the visual effect controller and is a toggle type: when pressed towards ON, the power supply provides steady electric power to the visual effect controller; when pressed towards OFF, the power supply cuts off the power. The display-window switch is located at the lower right of the display window and is the same toggle type; the only difference is that the display-window switch can turn the display window on or off only while the power switch is on.
Working principle of the keys in the visual effect controller: they control what is displayed in the display window. The main keys work as follows. When the user presses the record button, a red border appears around the display window and blinks regularly, and the action picture recorded by the virtual camera of the three-dimensional production engine is saved to the external memory card in MOV/MP4/AVI format; to finish recording and return to the preview state, the record button is pressed again. When the playback key is pressed, the recorded picture is played back for viewing; the player supports fast forward/rewind and adjustment of the fast-forward/rewind speed, together with a progress bar and time display, and pressing the exit key at the end of playback returns to the real-time preview picture. When the pause key is pressed, the current content of the display window is paused: if the display window is showing the real-time preview, pressing pause stops the reception of real-time signal data and stays on the current frame; if a recorded picture is being played back, pressing pause stops playback, and pressing the pause key (or the return key) again restores the state before the pause. The previous-frame/next-frame keys step to the previous or next recorded item during playback; they are also related to shooting with the controller: when the controller commands the virtual camera (VCS) in the three-dimensional production engine to shoot a picture, the shot picture is transmitted to the visual effect controller through the image transmitter, displayed in the display window, and saved to the external memory card in jpg/png format, and the previous-frame/next-frame keys are used to browse and page through the virtual camera's shot pictures.
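The key behaviour of the display window can be summarised by the following sketch; the DisplayWindow class, its mode names and the toggle conventions are illustrative assumptions that follow the record/playback/pause/frame-step description above.

```python
# Sketch of the display-window key handling in the visual effect controller.
class DisplayWindow:
    def __init__(self):
        self.mode = "preview"          # "preview" | "recording" | "playback" | "paused"
        self.frame_index = 0
        self._before_pause = "preview"

    def press_record(self):
        # First press starts recording (red blinking border); pressing again stops
        # recording and returns to the live preview.
        self.mode = "recording" if self.mode != "recording" else "preview"

    def press_playback(self):
        self.mode = "playback"         # play back the recorded MOV/MP4/AVI clip

    def press_pause(self):
        # Pause freezes the current content; pressing pause again restores the
        # state that was active before the pause.
        if self.mode != "paused":
            self._before_pause = self.mode
            self.mode = "paused"
        else:
            self.mode = self._before_pause

    def press_prev_frame(self):
        self.frame_index = max(0, self.frame_index - 1)   # previous recorded item / picture

    def press_next_frame(self):
        self.frame_index += 1                             # next recorded item / picture

w = DisplayWindow()
w.press_record(); w.press_record()     # start, then stop recording: back to preview
w.press_playback(); w.press_pause()
print(w.mode)                          # paused
```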
Working principle of the input/output terminals in the visual effect controller: the input terminal carries the data signal received at the receiving end of the image transmitter into the visual effect controller, and the output terminal can pass the data signal received by the visual effect controller on to other devices, such as a liquid-crystal television.
Connection of the input/output terminals: the input/output terminals use high-definition HDMI interfaces, the two ports being soldered respectively to the data-signal receiving end and the signal output end of the circuit board inside the visual effect controller, and a 4K high-definition HDMI transmission line interconnects the image transmitter and the input terminal of the visual effect controller.
Working principle of the cloud module in the visual effect controller: the virtual three-dimensional world picture from the virtual camera of the three-dimensional production engine is stored intelligently in a 5G cloud and can be previewed through a custom APP. With the 5G cloud in the visual effect controller, users of the real-time imaging system are no longer limited to previewing the virtual production of the three-dimensional production engine on site: as long as a network is available, the virtual production picture can be watched in real time from elsewhere.
Working connection of the cloud: the virtual production picture is transmitted to the cloud module in the visual effect controller through the controller and image transmitter system, digitally converted by the cloud module, and pushed in real time to a network cloud server; the user enters the corresponding port number in the designated APP to pull the network data stream in real time to their own mobile client device and preview the virtual production picture.
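The description does not name the streaming protocol used between the cloud module, the cloud server and the mobile APP. As one common arrangement, the sketch below pushes the preview feed to a cloud media server over RTMP with FFmpeg, which a client could then pull by stream address; the server URL, stream key, input file and the choice of RTMP/FFmpeg are assumptions.

```python
# Push the preview feed to a hypothetical cloud ingest endpoint using the ffmpeg CLI.
import subprocess

CLOUD_URL = "rtmp://cloud.example.com/live/previz01"   # hypothetical ingest endpoint

push = subprocess.run([
    "ffmpeg",
    "-re",                      # read the input at its native frame rate (live-style push)
    "-i", "preview_feed.mp4",   # placeholder for the visual effect controller output
    "-c:v", "libx264",          # H.264 encode before pushing
    "-f", "flv",
    CLOUD_URL,
], check=False)
print("push exited with", push.returncode)
```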
Working principle of the rigid keel bracket in the visual effect controller: it is mainly used to fix and support the controller, the visual effect controller and the image transmitter. It is assembled from round hollow aluminium-alloy tubes of 1.3 cm diameter, rubber anti-slip grips, a shoulder pad and a faucet buckle.
Connection of the rigid keel bracket: two hollow tubes of 55 cm length and 1.3 cm diameter are installed in the reserved holes of the same diameter in the shoulder pad; the shoulder pad is made of soft plastic and fits the hollow tubes closely. The other ends of the two hollow tubes are fixed in the faucet buckle, a rectangular rigid body of 40 cm length, 3 cm width and 3 cm height with mounting holes reserved at the corresponding positions. Left and right sweat-proof plastic grips are embedded below the faucet buckle.
The workstation is mainly responsible for running the three-dimensional production engine, receiving the captured position information of the signal marker tree, and matching and binding the signal marker tree's data to the virtual camera in the three-dimensional production software. After the signal marker tree data is composited with the picture of the virtual camera, the result is transmitted to the image transmitter receiver, where H.264 codec processing is performed for use by the visual effect controller.
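Since the description names MotionBuilder as the three-dimensional production software, the binding step can be illustrated with MotionBuilder's Python SDK (pyfbsdk), as below. The sketch assumes it runs inside MotionBuilder and that decoded poses arrive as position/rotation tuples; the camera name and update function are illustrative.

```python
# Minimal binding sketch for MotionBuilder's Python environment (pyfbsdk).
from pyfbsdk import FBCamera, FBVector3d

previz_cam = FBCamera("SubjectiveViewCam")   # the full virtual camera in the scene
previz_cam.Show = True

def on_marker_tree_update(position, rotation):
    """Apply the decoded signal-marker-tree pose to the bound virtual camera."""
    previz_cam.Translation = FBVector3d(*position)
    previz_cam.Rotation = FBVector3d(*rotation)

# Example update: units are assumed to be centimetres and degrees.
on_marker_tree_update((120.0, 95.0, 340.0), (0.0, 45.0, 0.0))
```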
In summary, the subjective view pose capturing and real-time imaging system according to the embodiments of the present invention fully embodies all of the system's constituent structures. It is mainly applied to three-dimensional production of film, animation, games and the like, where visual effects can be quickly previewed with Previz.
The subjective visual angle posture capturing and real-time image system provided by the embodiments of the invention removes the limitation of the shooting-site environment, offers good identifiability and stability without outside interference, and allows the preview effect to be controlled while the real-time effect preview is captured from the subjective viewing pose.
The various embodiments in this specification are described in a progressive manner; the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on its differences from the other embodiments. Since the apparatus embodiments are basically similar to the method embodiments, they are described simply, and the relevant points can be found in the corresponding descriptions of the method embodiments. The above-described apparatus embodiments are merely illustrative: the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment. Those of ordinary skill in the art can understand and implement it without inventive effort.
Those of ordinary skill in the art will understand that: the figures are merely schematic representations of one embodiment, and the blocks or flow diagrams in the figures are not necessarily required to practice the present invention.
From the above description of the embodiments, it is clear to those skilled in the art that the present invention can be implemented by software plus necessary general hardware platform. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments.
The embodiments in the present specification are described in a progressive manner; the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on its differences from the other embodiments. In particular, since the apparatus and system embodiments are substantially similar to the method embodiments, they are described relatively simply, and the relevant points can be found in the corresponding descriptions of the method embodiments. The above-described apparatus and system embodiments are merely illustrative: the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment. Those of ordinary skill in the art can understand and implement it without inventive effort.
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A subjective visual angle pose capturing and real-time imaging system, comprising: a signal marker tree, a controller, an image transmitter, a visual effect controller and a workstation, wherein the signal marker tree is connected to the workstation by wire, the workstation is connected to the signal marker tree, the controller and the image transmitter by wire, and the image transmitter is connected to the visual effect controller by wire;
the signal marker tree is used for receiving optical motion-capture data through a terminal collector, encoding and resolving the optical motion-capture data to form a data signal, and transmitting the data signal to the workstation;
the controller is used for controlling the orientation movement and shooting of the virtual camera in the workstation through a wireless control-end handle, sending push, pull, pan and move orientation-control instructions to the virtual camera, and sending shooting instructions;
the workstation is used for selecting a corresponding virtual scene according to the data signal transmitted by the signal marker tree, shooting in the virtual scene with the virtual camera according to the orientation-control instructions and shooting instructions transmitted by the controller, combining the virtual video data acquired by the virtual camera with the data signal to generate composite data, and transmitting the composite data to the image transmitter;
the image transmitter is used for resolving the composite data transmitted by the workstation into video data through video-stream soft decoding and transmitting the video data to the visual effect controller;
and the visual effect controller is used for displaying the video data transmitted by the image transmitter.
2. The system of claim 1, wherein the signal marker tree is installed on top of the visual effect controller and connected to the visual effect controller by a transmission line; the visual effect controller is installed at the head of the keel frame and fixed with a buckle and a bolt clamp; two controllers are installed on the left and right handles of the keel frame and fixed with buckle bolts; the transmitter end and the receiver end of the image transmitter are installed on the central region of the keel frame; the image transmitter is connected to the workstation with an HDMI transmission line, one end of which is connected to the transmitter and the other end of which is inserted into the HDMI female port of the workstation; and the image transmitter receiver is fixed to the central position of the keel frame with an articulated magic arm and a clamp.
3. The system of claim 1 or 2, wherein the signal marker tree comprises a receiving terminal ball, a signal transmission skeleton and a signal processor which are connected to one another;
the receiving terminal ball is a spheroid made of PC (polycarbonate) material and is used for receiving, in all directions, infrared light signals emitted by a motion capture system; the receiving terminal ball reflects the infrared light signals back to the motion capture system, and the motion capture system obtains position information of the receiving terminal ball from the reflected infrared light signals;
the signal transmission skeleton is made of titanium alloy, corresponds to the receiving terminal balls one by one, and is used for transmitting the infrared light signals received by the receiving terminal balls to the signal processor and for transmitting the data signal generated by the signal processor to the workstation;
the signal processor comprises an IC bidirectional intelligent chip, a rheostat, a miniature circuit board and a rigid outer box, and is used for analyzing, calculating and compiling the infrared light signals through the IC bidirectional intelligent chip and regenerating data signals that can be recognized and used by the three-dimensional production engine in the workstation;
the position and motion-state information of the user's handheld device is determined through the receiving terminal ball, and the signal processor sends the position and motion-state information of the user's handheld device to the workstation.
4. The system of claim 3, wherein the bottom of the receiving terminal ball is tinned with solder, and a spring clip at the bottom of the receiving terminal ball is fastened to a section of the signal transmission skeleton; one end of the signal transmission skeleton to which the receiving terminal ball is connected is connected one by one to reserved holes on the signal processor and fixed by soldering, and the other end is mounted in signal receiving holes above the visual effect controller and fixed by soldering.
5. The system of claim 1 or 2, wherein the controller comprises: a controller receiver and a controller transmitter;
the controller receiver is used for transmitting the operation instruction signal received by its receiver terminal to the three-dimensional production engine in the workstation, and performing the corresponding orientation-control and/or camera-shooting control operation on the virtual camera according to the operation instruction signal;
and the controller transmitter is used for transmitting operation instruction signals for push, pull, pan, truck and shooting to the receiver terminal of the controller receiver through a transmitter terminal on the controller operating handle.
6. The system of claim 5, wherein the controller receiver is plugged into a USB port of the workstation host so that the connecting end of the controller is mounted on the workstation; the controller receiver is wirelessly paired with the controller operating handle of the controller transmitter; and the controller operating handle is mounted on a handheld grip of the rigid keel frame of the visual effect controller, with its orientation fixed by a fixing clip buckle.
7. The system of claim 1 or 2, wherein the image transmitter comprises: a wireless antenna, a power supply, an image-transmission switch, a pairing module and a signal processing module;
the wireless antennas are used for transmitting and receiving data signals, two wireless antennas being respectively arranged on the transmitting end and the receiving end of the image transmitter;
the power supply is used for being installed in the power supply sockets of the transmitting end and the receiving end of the image transmitter and supplying power to the transmitting end and the receiving end;
the image-transmission switch is soldered at the power supply position of the circuit board in the image transmitter and is fixed in a reserved switch hole of the housing of the image transmitter; when the image-transmission switch is in the on state, the power supply normally supplies power to the image transmitter, and when the image-transmission switch is in the off state, the power supply stops supplying power to the image transmitter;
the pairing module is used for setting its indicator lamp to a flashing orange-yellow state while the transmitting end of the image transmitter is being paired with the receiving end of the image transmitter, to a green state when pairing succeeds, and to a red state when pairing fails;
and the signal processing module is used for decoding, through video-stream soft decoding, the composite data transmitted by the workstation and received at the receiving end of the image transmitter into video data, and transmitting the video data to the visual effect controller through the transmitting end of the image transmitter.
8. The system of claim 1 or 2, wherein the visual effect controller comprises: a display window, a power supply, a switch, keys, input/output terminals, a cloud end and a rigid keel support;
the display window is used for displaying the video data transmitted by the image transmitter and operation information of the virtual camera;
the power supply device in the visual effect controller is used for being installed in a power supply socket of the visual effect controller and supplying electric energy to the visual effect controller;
the switch in the visual effect controller is used for controlling the working states of a power supply device and a display window of the visual effect controller;
the keys in the visual effect controller are used for controlling the content of the video data displayed on the display window and comprise a recording button, a playback button, a pause button and a previous frame/next frame button;
the input/output terminals in the visual effect controller are used for receiving, through the input terminal, the video data transmitted by the image transmitter and for transmitting the video data to other equipment through the output terminal; the terminals adopt high-definition HDMI interfaces, the two ports being soldered respectively to the data-signal receiving end and the signal output end of the circuit board in the visual effect controller, and the image transmitter is connected to the input terminal of the visual effect controller through an HDMI transmission line;
the cloud end in the visual effect controller is used for storing the video data transmitted by the image transmitter and the virtual video data acquired by the virtual camera, and for previewing and viewing the video data through a customized APP;
and the rigid keel support in the visual effect controller is used for fixing and supporting the controller, the visual effect controller and the image transmitter in the system.
9. The system of claim 8, wherein the rigid keel support is installed, by means of two hollow tubes, in prepared holes of the same diameter in a shoulder pad; the other ends of the two hollow tubes are fixed in a keel-head buckle, the keel-head buckle being a rectangular rigid body provided with mounting holes; and left and right sweat-proof plastic handles are respectively embedded under the keel-head buckle.
10. The system of claim 3, wherein the workstation is configured to receive the data signal transmitted by the signal marker tree and containing the position and motion-state information of the user's handheld device, match and bind the position and motion-state information of the user's handheld device with the virtual camera in the three-dimensional production engine, process it through the three-dimensional production engine, synthesize the data signal with the picture of the virtual camera to generate composite data, and transmit the composite data to the image transmitter.
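For orientation only, the following minimal Python sketch mirrors the data path recited in claim 1: the signal marker tree produces a pose data signal, the workstation drives the virtual camera and combines its output with that signal into composite data, the image transmitter soft-decodes the composite data into video data, and the visual effect controller displays it. Every class, field and method name below is hypothetical and is not drawn from the specification.

# Illustrative sketch of the claim-1 data path; all names are invented here.
from dataclasses import dataclass


@dataclass
class DataSignal:
    """Encoded pose data produced by the signal marker tree."""
    position: tuple   # (x, y, z) of the handheld rig
    rotation: tuple   # (pitch, yaw, roll) in degrees


@dataclass
class CompositeData:
    """Virtual-camera frames combined with the pose data signal."""
    frames: list
    signal: DataSignal


class SignalMarkerTree:
    def capture(self) -> DataSignal:
        # Receive optical motion-capture data, encode and resolve it.
        return DataSignal(position=(0.0, 0.0, 1.5), rotation=(0.0, 90.0, 0.0))


class Workstation:
    def render(self, signal: DataSignal, instructions: list) -> CompositeData:
        # Select the virtual scene, apply the orientation-control instructions
        # to the virtual camera, then combine its output with the data signal.
        frames = [f"frame@{signal.position}/{op}" for op in instructions]
        return CompositeData(frames=frames, signal=signal)


class ImageTransmitter:
    def soft_decode(self, composite: CompositeData) -> list:
        # Resolve the composite data into displayable video data.
        return composite.frames


class VisualEffectController:
    def display(self, video: list) -> None:
        for frame in video:
            print("display:", frame)


# One pass through the pipeline.
tree, workstation = SignalMarkerTree(), Workstation()
transmitter, vfx = ImageTransmitter(), VisualEffectController()
composite = workstation.render(tree.capture(), ["push", "pan"])
vfx.display(transmitter.soft_decode(composite))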
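The push, pull, pan and truck instructions of claims 1 and 5 correspond to standard camera moves. The sketch below shows one plausible way such instructions could be applied to a virtual-camera transform; the apply_instruction function, the camera dictionary layout and the step sizes are assumptions made purely for illustration and are not taken from the specification.

# Hypothetical mapping of orientation-control instructions onto a camera pose.
import math


def apply_instruction(cam, op, step=0.1, pan_deg=5.0):
    """Update a camera dict {'pos': [x, y, z], 'yaw': degrees} for one instruction."""
    yaw = math.radians(cam["yaw"])
    forward = (math.sin(yaw), 0.0, math.cos(yaw))   # viewing direction
    right = (math.cos(yaw), 0.0, -math.sin(yaw))    # strafe direction
    if op == "push":     # dolly in, along the viewing direction
        cam["pos"] = [p + step * f for p, f in zip(cam["pos"], forward)]
    elif op == "pull":   # dolly out
        cam["pos"] = [p - step * f for p, f in zip(cam["pos"], forward)]
    elif op == "pan":    # rotate about the vertical axis
        cam["yaw"] = (cam["yaw"] + pan_deg) % 360.0
    elif op == "truck":  # translate sideways
        cam["pos"] = [p + step * r for p, r in zip(cam["pos"], right)]
    return cam


camera = {"pos": [0.0, 1.6, 0.0], "yaw": 0.0}
for op in ["push", "pan", "truck"]:
    camera = apply_instruction(camera, op)
print(camera)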
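Claim 7 also describes the pairing-indicator behaviour of the image transmitter: the lamp flashes orange-yellow while the transmitting and receiving ends are being paired, turns green on success and red on failure. The fragment below is a minimal sketch of that indicator logic; the indicator_state and pair names, and the simulated handshake, are invented here for illustration.

# Minimal sketch of the pairing-indicator states described in claim 7.
import random
from typing import Optional


def indicator_state(pairing_in_progress: bool, paired: Optional[bool]) -> str:
    """Return the lamp state of the pairing module."""
    if pairing_in_progress:
        return "orange-yellow flashing"
    return "green" if paired else "red"


def pair() -> bool:
    # Stand-in for the wireless pairing handshake between the transmitting
    # and receiving ends of the image transmitter.
    return random.random() > 0.1


print(indicator_state(True, None))      # lamp state while pairing is under way
print(indicator_state(False, pair()))   # green on success, red on failure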
CN202210277850.0A 2022-03-21 2022-03-21 Subjective visual angle posture capturing and real-time image system Active CN114598790B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210277850.0A CN114598790B (en) 2022-03-21 2022-03-21 Subjective visual angle posture capturing and real-time image system

Publications (2)

Publication Number Publication Date
CN114598790A true CN114598790A (en) 2022-06-07
CN114598790B CN114598790B (en) 2024-02-02

Family

ID=81811023

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210277850.0A Active CN114598790B (en) 2022-03-21 2022-03-21 Subjective visual angle posture capturing and real-time image system

Country Status (1)

Country Link
CN (1) CN114598790B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101586651B1 (en) * 2015-02-25 2016-01-20 (주) 라온엔터테인먼트 Multiplayer Robot Game System using Augmented Reality
CN109345635A (en) * 2018-11-21 2019-02-15 北京迪生数字娱乐科技股份有限公司 Unmarked virtual reality mixes performance system
CN111447340A (en) * 2020-05-29 2020-07-24 深圳市瑞立视多媒体科技有限公司 Mixed reality virtual preview shooting system
CN114051129A (en) * 2021-11-09 2022-02-15 北京电影学院 Film virtualization production system and method based on LED background wall

Also Published As

Publication number Publication date
CN114598790B (en) 2024-02-02

Similar Documents

Publication Publication Date Title
CN106454321A (en) Panoramic video processing method, device and system
CN107529021B (en) Tunnel type panoramic video acquisition, distribution, positioning and tracking system and method thereof
CN113625869B (en) Large-space multi-person interactive cloud rendering system
CN109997364A (en) Method, equipment and the stream of the instruction of the mapping of omni-directional image are provided
CN206237504U (en) A kind of intelligent tutoring tracking camera
CN108200394A (en) A kind of UAV system that multiway images is supported to transmit
CN203104668U (en) Wireless focus tracking video system
WO2021173009A1 (en) Active marker strobing for performance capture communication
CN114598790B (en) Subjective visual angle posture capturing and real-time image system
CN111246115A (en) Wireless transmission system and transmission control device
CN103338361A (en) Shooting system
Fiala Pano-presence for teleoperation
CN207397518U (en) Wireless mobile objects projection system
CN218103326U (en) 360 degree panorama live online shopping system based on tall and erect box of ann
CN207150730U (en) A kind of surgical operation images data management system main frame
CN202068512U (en) Stereo video live device watched with naked eyes
CN216291234U (en) Third person weighing visual angle shooting system
CN114286121A (en) Method and system for realizing picture guide live broadcast based on panoramic camera
CN210297905U (en) Monitoring system and video monitoring equipment compatible with synchronous control of multiple users
CN209375839U (en) Transmission system
CN209627539U (en) Wireless video control and editing system
CN207560215U (en) A kind of hand-held closed-circuit television underground piping check device with positioning function
CN203340216U (en) Shooting system
CN219761128U (en) Receiving apparatus
CN212435823U (en) Infant teaching recording and broadcasting system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant