CN108255454B - Splicing processor and visual interaction method of splicing processor - Google Patents


Info

Publication number
CN108255454B
CN108255454B (application CN201810100501.5A)
Authority
CN
China
Prior art keywords
processing module
information
image information
control instruction
control
Prior art date
Legal status
Active
Application number
CN201810100501.5A
Other languages
Chinese (zh)
Other versions
CN108255454A (en)
Inventor
陆旸
Current Assignee
Shanghai Mview Information Technology Co ltd
Original Assignee
Shanghai Mview Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Mview Information Technology Co ltd filed Critical Shanghai Mview Information Technology Co ltd
Priority to CN201810100501.5A priority Critical patent/CN108255454B/en
Publication of CN108255454A publication Critical patent/CN108255454A/en
Application granted granted Critical
Publication of CN108255454B publication Critical patent/CN108255454B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1423: Digital output to display device; controlling a plurality of local displays, e.g. CRT and flat-panel displays
    • G06F 3/1446: Digital output to display device; controlling a plurality of local displays composed of modules, e.g. video walls

Abstract

The invention relates to a splicing processor and a visual interaction method thereof. The splicing processor comprises a main control module, an expansion processing module, an input processing module, and an output processing module. The expansion processing module acquires control operation data and converts it to obtain a corresponding target control instruction; the target control instruction comprises control information and spatial information corresponding to the control information. The input processing module acquires first image information. The output processing module processes the first image information according to the target control instruction to obtain corresponding second image information. The main control module is connected to the expansion processing module, the input processing module, and the output processing module respectively; it controls the expansion processing module to acquire the target control instruction, controls the input processing module to acquire the first image information, and controls the output processing module to process the first image information into the second image information. The invention improves the convenience and interactivity of operating a large screen and the response efficiency of interactive control.

Description

Splicing processor and visual interaction method of splicing processor
Technical Field
The invention relates to the technical field of image processing, in particular to a splicing processor and a visual interaction method of the splicing processor.
Background
Splicing processors are widely applied in formal settings such as command and dispatch, information release, real-time monitoring, and conferences. A conventional splicing processor generally includes a switch card (or switch module); an input card (or input module) connected to it for data processing of input signals, including signal adaptation and frame-rate and resolution adjustment; an output card (or output module) for data processing of output signals, including window superimposition; and a control card (or control module), which is also connected to the input card and the output card. During splicing, the input card converts the raw data of each input source into data the switch card can process; the switch card, under the direction of the control card, creates a mapping channel between the input card and the output card; the output card then performs the relevant data processing on selected input sources and sends the processed data to the display devices, completing the splicing function.
In display applications that require a large screen (multiple screens or one large screen), a splicing processor is often used to handle operations such as access, arrangement, transformation, and roaming of multiple video image signals onto the large screen. Current splicing processors handle these operations in one of three ways. First, a separate computer runs control software; the greatest defect of this mode is its lack of interactivity, because a user's operations on that computer are only presented back to the user as the corresponding result on the large screen. Second, the large-screen content is transcoded or compressed and transmitted back to a computer or console display, which improves interactivity somewhat but still cannot satisfy the interaction requirements of the large screen. Third, the control card of the splicing processor is directly connected to external devices, which then operate the large screen directly. This improves interaction capability, but the splicing processor is usually placed in an equipment-room cabinet far from the operator; even with interactive external devices, the expansion of interaction capability is limited by the processor's limited number of interfaces. Moreover, when image information is generated and displayed directly from control information obtained over such a direct connection, the generated image information cannot adapt to splicing at different scales, and the visual interaction effect degrades when the image information or the splicing scale is too large.
Disclosure of Invention
The invention aims to provide a splicing processor and a visual interaction method of the splicing processor that optimize the convenience and interactivity of operating a large screen and improve the response efficiency of interactive control.
The technical scheme provided by the invention is as follows:
a splicing processor, comprising: a main control module, an expansion processing module, an input processing module, and an output processing module. The expansion processing module acquires control operation data and converts it to obtain a corresponding target control instruction; the target control instruction comprises control information and spatial information corresponding to the control information. The input processing module acquires first image information. The output processing module processes the first image information according to the target control instruction to obtain corresponding second image information. The main control module is connected to the expansion processing module, the input processing module, and the output processing module respectively; it controls the expansion processing module to acquire the target control instruction, controls the input processing module to acquire the first image information, and controls the output processing module to process the first image information into the second image information.
Further, the main control module includes: the first communication unit is used for acquiring the target control instruction; the first communication unit sends the target control instruction to the input processing module and/or the output processing module.
Further, the input processing module is connected with the main control module to obtain the target control instruction; and/or the input processing module is connected with the output processing module to acquire the target control instruction.
Further, the input processing module includes: an input unit that acquires the first image information; the second communication unit is used for acquiring the target control instruction; and the first processing unit is used for processing the part corresponding to the spatial information in the first image information according to the control information in the target control instruction to obtain the corresponding processed first image information.
Further, the input unit includes: the first connecting port is in wired connection with the image acquisition equipment through a connecting wire; and/or the first communication subunit is wirelessly connected with the image acquisition equipment through a network.
Further, the extended processing module includes: a third communication unit, connected with the external device, for acquiring the control operation data input by the user on the external device, the control operation data comprising operation information and coordinate information corresponding to the operation information; and a second processing unit, which looks up the operation information in a control instruction table to obtain the corresponding control information and obtains spatial information corresponding to the large screen from the coordinate information, thereby obtaining the target control instruction. The third communication unit sends the target control instruction to the output processing module and/or the main control module.
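As a hedged illustration of this lookup-and-mapping step, the following Python sketch converts operation information and device coordinates into a target control instruction. The table contents, function names, device resolution, and wall resolution are assumptions for illustration, not taken from the patent:

```python
# Illustrative sketch of the extended processing module's conversion step.
# Table entries and resolutions are invented examples.

CONTROL_INSTRUCTION_TABLE = {
    "single_tap": "select_window",
    "double_tap": "maximize_window",
    "pinch": "zoom_window",
}

def convert_operation(operation_info, coord_info,
                      device_res=(1920, 1080), wall_res=(7680, 4320)):
    """Look up the operation info in the control instruction table and scale
    the device coordinates into large-screen (video wall) coordinate space."""
    control_info = CONTROL_INSTRUCTION_TABLE[operation_info]
    sx = coord_info[0] * wall_res[0] // device_res[0]
    sy = coord_info[1] * wall_res[1] // device_res[1]
    return {"control_info": control_info, "spatial_info": (sx, sy)}

# A pinch at the center of a 1080p touch device maps to the wall's center.
instr = convert_operation("pinch", (960, 540))
```

The integer scaling keeps the instruction independent of the particular splicing scale, which is the adaptability the patent attributes to converting coordinates into large-screen spatial information.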
Further, the third communication unit includes: the second connecting port is in wired connection with the external equipment through a connecting wire; and/or the second communication subunit is wirelessly connected with the external equipment through a network.
Further, the expansion processing module is connected with the output processing module and the main control module through a network; the input processing module is connected with the main control module and the output processing module through a bus.
Further, the main control module further includes: and the distribution unit is used for setting the unique identification number and the priority of each expansion processing module.
Further, the expansion processing module further includes: and the negotiation unit sets the unique identification number and the priority of each extended processing module through a network.
Further, the output processing module includes: the fourth communication unit is used for acquiring the target control instruction and the first image information; the third processing unit is used for processing the part corresponding to the spatial information in the first image information according to the control information in the target control instruction to obtain corresponding second image information; and the output unit transmits the second image information to a display device so that the display device displays the second image information.
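A minimal sketch of the third processing unit's role, assuming a toy frame representation (nested lists of pixel values) and a rectangular spatial region; the field names and the "ICON" marker are invented for illustration:

```python
# Hedged sketch: an output processing module derives second image information
# by editing only the region named by the instruction's spatial information.

def process_first_image(first_image, instruction):
    """Apply the control information to the part of the first image that the
    spatial information designates; return the resulting second image."""
    second_image = [row[:] for row in first_image]   # copy, don't mutate input
    x0, y0, x1, y1 = instruction["spatial_info"]      # target rectangle
    if instruction["control_info"] == "overlay_icon":
        for y in range(y0, y1):
            for x in range(x0, x1):
                second_image[y][x] = "ICON"
    return second_image

frame = [[0] * 4 for _ in range(4)]
out = process_first_image(frame, {"control_info": "overlay_icon",
                                  "spatial_info": (2, 0, 4, 2)})
```

Only the designated top-right quarter is touched; the rest of the frame, and the original first image, are left unchanged.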
Further, the fourth communication unit further sends the target control instruction to a next output processing module and the main control module.
The invention also provides a visual interaction method of the splicing processor, which comprises the following steps: s100, acquiring control operation data, and converting the control operation data to obtain a corresponding target control instruction; the target control instruction comprises control information and space information corresponding to the control information; s200, acquiring first image information; s300, processing the first image information according to the target control instruction to obtain corresponding second image information.
Further, the step S200 includes the steps of: s210, acquiring the first image information; s220, processing the corresponding part of the spatial information in the first image information according to the control information in the target control instruction to obtain corresponding processed first image information.
Further, the step S100 includes the steps of: S110, acquiring the control operation data input by the user on the external device, the control operation data comprising operation information and coordinate information corresponding to the operation information; S120, looking up the operation information in the control instruction table to obtain the corresponding control information, and obtaining spatial information corresponding to the large screen from the coordinate information, thereby obtaining the target control instruction.
Further, the step S300 includes the steps of: s310, processing the corresponding part of the spatial information in the first image information according to the control information in the target control instruction to obtain corresponding second image information; s320 transmits the second image information to a display device, so that the display device displays the second image information.
Further, before the step S100, the method includes the step of: S010, setting a unique identification number and a priority for each extended processing module.
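Step S010 can be sketched as follows. The ID format and the "registration order = priority" scheme are assumptions for illustration; the patent specifies only that each extended processing module receives a unique identification number and a priority:

```python
# Minimal sketch of S010: register extended processing modules with unique
# IDs and priorities (1 = highest), assigned here in registration order.

def assign_ids_and_priorities(modules):
    """Give every extended processing module a unique ID and a priority."""
    table = {}
    for priority, name in enumerate(modules, start=1):
        table[name] = {"id": f"EXP-{priority:03d}", "priority": priority}
    return table

registry = assign_ids_and_priorities(["console_A", "tablet_B", "console_C"])
```

With such a table in place, simultaneous operations from several users can be ordered by priority before S100 converts them, which is the multi-user ordering benefit the text claims.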
The splicing processor and the visual interaction method of the splicing processor provided by the invention can bring at least one of the following beneficial effects:
1) according to the invention, the control operation data are respectively converted by the extended processing module, and the images are respectively processed by the output processing module, so that the processing speed can be increased and the image processing efficiency can be improved by the distributed cooperative processing mode. The distributed cooperative processing mode can distribute the processing burden to each output processing module, is favorable for reducing the burden of the splicing processor, and improves the response speed of interactive operation.
2) According to the invention, each module is coordinated and controlled through the main control module, repeated sending of the same target control instruction to the input processing module or the output processing module can be reduced, and the system pressure of the splicing processor is reduced.
3) The invention can indirectly improve the processing efficiency and save the system resources through the primary editing processing of the input processing module.
4) Control operation data acquired from external devices connected to the expansion processing modules can be converted into processing of the content of the first image information acquired by the input processing module; meanwhile, the control operation data can be forwarded through the output processing module and the main control module to the corresponding input processing module, which then controls the connected image acquisition device. This achieves a visual and intuitive effect, improves the interaction between the splicing processor and the user, optimizes the convenience and interactivity of operating the large screen, and improves the user experience.
5) The invention sets, via the main control module, the unique identification number and the priority of each connected expansion processing module, which improves the effectiveness of setting the unique identification number and the priority, and ensures that multiple users control the same large screen in an orderly manner in a multi-user operation scenario.
6) The unique identification number and the priority between the extension processing modules are set according to negotiation between the extension processing modules, a user can flexibly select and control a large screen, heavy workload caused by setting the unique identification number and the priority for the extension processing modules in advance is avoided, and user experience is guaranteed.
Drawings
The above technical features, advantages, and implementations of a splicing processor and a visual interaction method of a splicing processor will be further described below, in a clearly understandable manner, with reference to the accompanying drawings and preferred embodiments.
FIG. 1 is a schematic block diagram of one embodiment of a stitching processor of the present invention;
FIG. 2 is a schematic block diagram of one embodiment of a stitching processor of the present invention;
FIG. 3 is a flow diagram of one embodiment of a visualization interaction method of a stitching processor of the present invention;
FIG. 4 is a flow diagram of another embodiment of a visualization interaction method of a stitching processor of the present invention;
FIG. 5 is a flow diagram of another embodiment of a visualization interaction method of a stitching processor of the present invention;
FIG. 6 is a flow chart of another embodiment of a visualization interaction method of a stitching processor of the present invention.
Detailed Description
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the following description will be made with reference to the accompanying drawings. It is obvious that the drawings in the following description are only some examples of the invention, and that for a person skilled in the art, other drawings and embodiments can be derived from them without inventive effort.
For the sake of simplicity, the drawings only schematically show the parts relevant to the present invention, and they do not represent the actual structure as a product. In addition, in order to make the drawings concise and understandable, components having the same structure or function in some of the drawings are only schematically illustrated or only labeled. In this document, "one" means not only "only one" but also a case of "more than one".
One embodiment of a stitching processor 1000, as shown in FIG. 1, includes: a main control module 1100, an expansion processing module 1400, an input processing module 1200, and an output processing module 1300;
the extended processing module 1400 obtains control operation data, and converts the control operation data to obtain a corresponding target control instruction; the target control instruction comprises control information and space information corresponding to the control information;
the input processing module 1200 obtains first image information;
the output processing module 1300 is configured to process the first image information according to the target control instruction to obtain corresponding second image information;
the main control module 1100 is connected to the expansion processing module 1400, the input processing module 1200, and the output processing module 1300, respectively, and controls the expansion processing module 1400 to obtain the target control instruction, controls the input processing module 1200 to obtain the first image information, and then controls the output processing module 1300 to process the first image information to obtain the second image information.
Specifically, in this embodiment, there may be a plurality of input processing modules 1200, output processing modules 1300, expansion processing modules 1400, and external devices 2000, and the numbers of input processing modules 1200, output processing modules 1300, and expansion processing modules 1400 may be the same or different. One input processing module 1200 may be connected to several output processing modules 1300, and one output processing module 1300 to several input processing modules 1200; one output processing module 1300 may be connected to several expansion processing modules 1400, and one expansion processing module 1400 to several output processing modules 1300; the main control module 1100 may be connected to several input processing modules 1200, output processing modules 1300, and expansion processing modules 1400; and one expansion processing module 1400 may be connected to several external devices 2000. The first image information and second image information include image information such as graphics (still or moving pictures) and video. The first image information serves as the source signal, and the second image information serves as the destination signal.
The splicing processor 1000 of the present invention includes one or more expansion processing modules 1400, input processing modules 1200, and output processing modules 1300. The target control instruction comprises control information and spatial information corresponding to the control information; that is, to perform a real-time visual interactive operation on an image to be displayed on the large screen, the target control instruction is obtained through the expansion processing module 1400. For example, if a preset icon is to be superimposed on the top-right corner of the first image information, the control information is the superimposing of the preset icon, the operation object is the first image information, and the spatial information is the top-right corner of the first image information. The expansion processing module 1400 acquires the control operation data and converts it to obtain the corresponding target control instruction, and the input processing module 1200 acquires the first image information. The ordering is not limited: the expansion processing module 1400 may obtain the target control instruction first and the first image information may be acquired afterwards; the input processing module 1200 may obtain the first image information first and the expansion processing module 1400 the target control instruction afterwards; or the two may be acquired simultaneously. All of these orderings fall within the protection scope of the present invention.
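The worked example above, superimposing a preset icon on the top-right corner of the first image information, can be written out as a small data-structure sketch. The quarter-of-the-frame region math is an assumption; the patent only says "top-right corner":

```python
# Sketch of the text's example instruction: control information plus the
# spatial information for the top-right corner of a frame.

def top_right_region(frame_w, frame_h, frac=4):
    """Spatial information (x0, y0, x1, y1) for the top-right corner,
    taken here as the top-right 1/frac of the width and height."""
    return (frame_w - frame_w // frac, 0, frame_w, frame_h // frac)

instruction = {
    "control_info": "superimpose_preset_icon",
    "spatial_info": top_right_region(1920, 1080),
}
```

Pairing the operation with an explicit region is what lets downstream modules apply it correctly at any splicing scale.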
After the expansion processing module 1400 obtains the target control instruction and the input processing module 1200 obtains the first image information, the output processing module 1300 obtains the target control instruction and the first image information, and then the output processing module 1300 processes the obtained first image information according to the target control instruction to obtain the second image information.
Compared with the prior art, in which the main control module 1100 performs centralized processing, in the present invention each expansion processing module 1400 converts the control operation data it acquires, a process of distributed conversion. If the splicing processor 1000 has one expansion processing module 1400, that module converts all acquired user-input control operation data one by one into the corresponding target control instructions. If the splicing processor 1000 has several expansion processing modules 1400, each converts, one by one, only the control operation data it acquired itself. For example, given expansion processing modules 1400A1 and 1400B1, module 1400A1 acquires its own control operation data A2 and converts it into the corresponding target control instruction A3, while module 1400B1 acquires its own control operation data B2 and converts it into the corresponding target control instruction B3. The output processing module 1300 then acquires the target control instructions and processes the first image information according to them to obtain the second image information.
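The distributed conversion just described can be sketched as follows; the module names (A1/B1) and data labels (A2/B2) mirror the text's example, while the class and the stand-in `convert` function are illustrative assumptions:

```python
# Sketch of distributed conversion: each expansion processing module converts
# only the control operation data it acquired itself, one item at a time.

def convert(data_item):
    # Stand-in conversion; in the patent this is the control-instruction
    # table lookup plus coordinate-to-screen mapping.
    return f"instr({data_item})"

class ExpansionModule:
    def __init__(self, name):
        self.name = name
        self.acquired = []      # control operation data this module captured

    def convert_all(self):
        return [convert(d) for d in self.acquired]

a1, b1 = ExpansionModule("A1"), ExpansionModule("B1")
a1.acquired = ["A2"]
b1.acquired = ["B2"]
results = {m.name: m.convert_all() for m in (a1, b1)}
```

No module ever touches another module's data, so adding expansion modules adds conversion capacity rather than contention, which is the load-spreading benefit the text claims.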
Because the expansion processing modules 1400 convert the control operation data and the output processing modules 1300 process the images independently, this distributed cooperative processing accelerates processing and improves image-processing efficiency. It also distributes the processing load across the output processing modules 1300, which helps reduce the load on the splicing processor 1000 and improves the response speed of interactive operations.
Another embodiment of a splicing processor 1000 according to the present invention is an optimized embodiment of the foregoing embodiment, as shown in fig. 2, the main improvement of this embodiment compared with the foregoing embodiment is that the splicing processor includes:
the main control module 1100 includes:
a first communication unit 1110 that acquires the target control instruction;
the first communication unit 1110 transmits the target control instruction to the input processing module 1200 and/or the output processing module 1300.
Specifically, in this embodiment, the target control instruction obtained by the output processing module 1300 may be sent by a connected expansion processing module 1400 or by the connected main control module 1100. The main control module 1100 plays a role similar to a transfer station: it coordinates and controls the connection state (connected or disconnected) between itself and each expansion processing module 1400, input processing module 1200, and output processing module 1300. The main control module 1100 may send an acquired target control instruction to the input processing modules 1200 connected to it, to the output processing modules 1300 connected to it, or to both.
The main control module 1100 may parse an acquired target control instruction to determine which screen's connected output processing module 1300 the instruction is intended for. For example, suppose the main control module 1100D is connected to output processing modules 1300E1, 1300E2, and 1300E3, and a user inputs control operation data a on the external device 2000, where data a is directed at the first image information of the large screen connected to output processing module 1300E2. The expansion processing module 1400 converts data a into the corresponding target control instruction and sends it to the main control module 1100; the main control module 1100 parses the instruction, determines that it targets the first image information of the screen connected to 1300E2, and forwards the instruction directly to 1300E2. If data a instead targets the first image information of the screens connected to 1300E1, 1300E2, and 1300E3, the main control module 1100 forwards the target control instruction to each of them. By coordinating the modules through the main control module 1100 in this way, repeated sending of the same target control instruction to the input processing modules 1200 or output processing modules 1300 is reduced, lowering the system pressure on the splicing processor 1000.
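A hedged sketch of this transfer-station routing, reusing the E1/E2/E3 names from the example; the `targets` field and dictionary representation are assumptions about how an instruction might name its destination screens:

```python
# Sketch of the main control module's routing: forward an instruction to each
# targeted output processing module exactly once, avoiding repeated sends.

def route_instruction(instruction, output_modules):
    """Deliver the instruction's control info to every targeted module that
    exists, deduplicating repeated target names."""
    delivered = {}
    for name in instruction["targets"]:
        if name in output_modules and name not in delivered:
            delivered[name] = instruction["control_info"]
    return delivered

modules = {"E1": {}, "E2": {}, "E3": {}}
# Instruction aimed at E2 only:
sent = route_instruction({"control_info": "overlay", "targets": ["E2"]}, modules)
# Instruction aimed at all three screens, with a duplicate target:
sent_all = route_instruction(
    {"control_info": "overlay", "targets": ["E1", "E2", "E3", "E2"]}, modules)
```

The deduplication in `route_instruction` is the mechanism behind the claimed reduction in repeated sends of the same target control instruction.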
Another embodiment of a splicing processor 1000 according to the present invention is an optimized embodiment of the foregoing embodiment, as shown in fig. 2, the main improvement of this embodiment compared with the foregoing embodiment is that the splicing processor includes:
the input processing module 1200 is connected to the main control module 1100 and acquires the target control instruction; and/or,
the input processing module 1200 is connected to the output processing module 1300, and acquires the target control instruction;
the input processing module 1200 includes:
an input unit 1210 that acquires the first image information;
a second communication unit 1220 for acquiring the target control instruction;
the first processing unit 1230 processes the corresponding part of the spatial information in the first image information according to the control information in the target control instruction, so as to obtain corresponding processed first image information.
Specifically, in this embodiment, because the main control module 1100 is connected to each expansion processing module 1400, input processing module 1200, and output processing module 1300 (acting like a transfer station), it can obtain all the target control instructions acquired by the expansion processing modules 1400 as well as all the first image information acquired by the input processing modules 1200, and forward the first image information to the output processing modules 1300. The input processing module 1200 may therefore acquire target control instructions through its connection to the main control module 1100 and process the first image information it has acquired according to those instructions, obtaining processed first image information. Alternatively, the input processing module 1200 may be connected to an output processing module 1300, acquire the target control instruction from that output processing module, and process its own first image information accordingly to obtain the processed first image information.
The control information includes editing control, overlay control, rotation control, zooming control, and so on. For example, the input processing module 1200 superimposes graphics, a cursor, or editing characters on the first image information in real time, realizing visual operation on the large screen and on the source signal; this makes the interaction more intuitive and strengthens the visual interactive operation experience. Because the input processing module 1200 can process the first image information, it can perform primary processing: for example, if, after the initial first image information is obtained from the image acquisition device 3000 and/or a network address, a preset icon needs to be superimposed on it, the input processing module 1200 superimposes the preset icon to obtain the processed first image information, which can then be transmitted to the output processing modules 1300 for second or further processing. No matter what control operation an output processing module 1300 performs on the processed first image information, the result of the primary processing by the input processing module 1200 cannot be deleted or replaced unless all output processing modules 1300 that acquire the processed first image information edit the same position; if any one output processing module 1300C1 that acquires the processed first image information does not edit that position, the preset icon superimposed by the input processing module 1200 is still displayed on the screen connected to that output processing module 1300C1.
Illustratively, the video image acquired by a camera F shooting the scene of a hall, i.e. the first image information, needs to be marked as "hall" so that the user can conveniently record and look up the video. The input processing module 1200 corresponding to camera F superimposes text onto the acquired video image, i.e. the text icon "hall" is superimposed on the video image, so that on the large-screen display the video image already carries the "hall" mark. If the superposition were instead performed by the output processing modules 1300, every output processing module 1300 connected to the large screen would have to superimpose the text, wasting the processing resources of all the output processing modules 1300 and reducing processing and editing efficiency.
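As a minimal sketch (not the patent's implementation), the efficiency argument above can be counted directly: labelling once at the input processing module costs one overlay operation, while labelling at the output side repeats the work once per connected output processing module. Frames are modeled as plain dicts and `overlay_label` is a made-up helper.

```python
# Illustrative sketch: one-time overlay at the input stage versus
# per-output-module overlay. All names here are hypothetical.

def overlay_label(frame, label):
    """Return a copy of the frame with the label recorded as an overlay."""
    stamped = dict(frame)
    stamped["overlays"] = list(frame.get("overlays", [])) + [label]
    return stamped

def input_stage_overlay(frame, label, num_outputs):
    # One overlay operation at the input processing module; every
    # output module then receives the already-labelled frame.
    labelled = overlay_label(frame, label)
    return [labelled] * num_outputs, 1

def output_stage_overlay(frame, label, num_outputs):
    # The same overlay work repeated once per connected output module.
    frames = [overlay_label(frame, label) for _ in range(num_outputs)]
    return frames, num_outputs

frame = {"source": "camera F"}
frames_in, ops_in = input_stage_overlay(frame, "hall", 3)
frames_out, ops_out = output_stage_overlay(frame, "hall", 3)
```

With three connected output modules the input-stage path performs one overlay operation against three for the output-stage path, while every displayed frame carries the same "hall" mark either way.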
Preferably, the input unit 1210 includes:
a first connection port 1212 connected to the image capturing device 3000 by a wire; and/or,
a first communication subunit 1211 wirelessly connected to the image capturing device 3000 via a network.
The input processing module 1200 supports access of non-video signals (such as still images or moving images), and can obtain the first image information through a network or a connection line (such as a USB or VGA connection line) in the following modes:
1. the input processing module 1200 only obtains the first image information from the network address, that is, the input processing module 1200 has a wireless communication function, and downloads and obtains the first image information from the network address (such as a video website, a picture website, etc.) through a wireless network;
2. the input processing module 1200 acquires only the first image information from the image capturing apparatus 3000, that is:
2.1, the input processing module 1200 is wirelessly connected with the image acquisition equipment 3000 through a network, and acquires first image information from the connected image acquisition equipment 3000;
2.2, the input processing module 1200 is connected with the image capturing device 3000 by a wire through a connecting wire, and acquires the first image information from the connected image capturing device 3000.
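The three acquisition modes above can be sketched as a single dispatch, assuming hypothetical source descriptors (the `kind`, `url`, `device_id`, and `port` fields are illustrative, not a real device API):

```python
# Hedged sketch of the three first-image acquisition paths of the
# input processing module 1200: network address, wirelessly connected
# capture device, and wired capture device.

def acquire_first_image(source):
    """Dispatch on the kind of source the input processing module sees."""
    kind = source["kind"]
    if kind == "network":          # mode 1: download from a network address
        return f"frame downloaded from {source['url']}"
    if kind == "wireless_device":  # mode 2.1: networked image capture device
        return f"frame streamed from device {source['device_id']}"
    if kind == "wired_device":     # mode 2.2: USB/VGA connection line
        return f"frame read over {source['port']}"
    raise ValueError(f"unknown source kind: {kind}")

frame = acquire_first_image({"kind": "network", "url": "http://example.test/a.mp4"})
```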
Specifically, the input processing module 1200 can acquire the first image information from any one or more of the image capturing device 3000 (e.g., a camera, a video camera, etc.) and a network address. Illustratively, when the input processing module 1200 is connected to several image capturing devices 3000 such as cameras, the input processing module 1200 obtains the first image information from a camera; the expansion processing module 1400 obtains the control operation data input by the user on the external device 2000 and converts it into the target control instruction; the target control instruction is sent to the input processing module 1200 through the main control module 1100 or the output processing module 1300; after obtaining the target control instruction, the input processing module 1200 parses it to recover the control operation data and, according to that data, rotates and moves the correspondingly connected camera to the direction and position required by the user, so that the camera can shoot the surrounding scene more comprehensively and effectively to obtain the first image information the user requires.
For example, in a public-security operation to capture a suspect, a large number of suspect images preliminarily screened by artificial intelligence are called up and displayed on the large screen, and respectively on the seat screens (such as computers) corresponding to the work seats of the various workers. Different workers perform a secondary manual comparison and verification on each large screen or seat screen. If there is no suspicious point, the worker can dismiss the image through a keyboard or mouse connected to the expansion processing module 1400 of the splicing processor 1000 (such as a mouse click or a keyboard key operation); if there is a suspicious point, a text mark can be added through a mouse annotation or a key. Meanwhile, control operation data for controlling a camera can be input through the keyboard or mouse and transmitted to the corresponding expansion processing module 1400, which transmits it to the main control module 1100 and/or the corresponding output processing module 1300. The input processing module 1200 obtains the control operation data from the main control module 1100 and/or the correspondingly connected output processing module 1300 and, according to it, triggers the traffic camera to rotate to a proper position and direction or to focus, so that the public-security personnel can conveniently perform secondary tracking and positioning of the suspect.
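The camera-control path in the two paragraphs above can be sketched as follows. This is a toy model under stated assumptions: the field names `pan_deg` and `tilt_deg`, the mechanical limits, and the `Camera` class are all hypothetical, standing in for whatever pan/tilt protocol the real image acquisition device 3000 speaks.

```python
# Illustrative sketch: the input processing module applies control
# operation data recovered from a target control instruction as a
# pan/tilt movement of the connected camera.

class Camera:
    def __init__(self):
        self.pan = 0.0   # degrees, 0 = straight ahead
        self.tilt = 0.0

    def move(self, pan_deg, tilt_deg):
        # Clamp to an assumed mechanical range.
        self.pan = max(-180.0, min(180.0, self.pan + pan_deg))
        self.tilt = max(-90.0, min(90.0, self.tilt + tilt_deg))

def apply_camera_control(camera, control_operation_data):
    """Turn parsed control operation data into a camera movement."""
    camera.move(control_operation_data.get("pan_deg", 0.0),
                control_operation_data.get("tilt_deg", 0.0))
    return camera.pan, camera.tilt

cam = Camera()
pos = apply_camera_control(cam, {"pan_deg": 30.0, "tilt_deg": -10.0})
```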
In this embodiment, the input processing module 1200 can convert the control operation data acquired by the external device 2000 connected to the expansion processing module 1400 into processing of the content of the first image information it acquires; at the same time, the control operation data can be forwarded to the corresponding input processing module 1200 through the output processing module 1300 and the main control module 1100, and the input processing module 1200 controls the connected image acquisition device 3000. This achieves a visual and intuitive effect, improves the interaction between the splicing processor 1000 and the user, optimizes the convenience and interactivity of operating the large screen, and improves the user experience.
Another embodiment of a splicing processor 1000 according to the present invention is an optimized embodiment of the foregoing embodiment. As shown in fig. 2, the main improvement of this embodiment over the foregoing embodiment is that the splicing processor includes:
the expansion processing module 1400 includes:
a third communication unit 1410, connected to the external device 2000, for acquiring the control operation data input by the user on the external device 2000; the control operation data comprises operation information and coordinate information corresponding to the operation information;
the second processing unit 1420, which searches a control instruction table for the operation information to obtain the corresponding control information, and obtains the spatial information corresponding to the large screen according to the coordinate information, so as to obtain the target control instruction;
the third communication unit 1410 sends the target control instruction to the output processing module 1300 and/or the main control module 1100.
The third communication unit 1410 includes:
a second connection port 1412 wired to the external device 2000 through a connection line; and/or,
the second communication subunit 1411 is wirelessly connected to the external device 2000 via a network.
Specifically, in this embodiment, the external device 2000 includes a touch pad, a keyboard, a mouse, a writing board, a touch screen, and other devices having an interactive operation function. The user operates on the external device 2000, and the external device 2000 obtains the control operation data corresponding to the user operation. The external device 2000 is connected to the expansion processing module 1400 of the splicing processor 1000 by a wire, or wirelessly through a network, and sends the obtained control operation data to the expansion processing module 1400. The expansion processing module 1400 converts the obtained control operation data: the operation information in the control operation data is matched against the control instruction table to obtain the corresponding control information, and the coordinate information corresponding to the operation information is mapped to obtain the corresponding spatial information; the obtained control information and spatial information together form the target control instruction. The expansion processing module 1400 then sends the converted target control instruction to the output processing module 1300 and/or the main control module 1100 connected to it.
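The conversion described above has two halves: a table lookup for the operation information and a coordinate mapping into large-screen space. A minimal sketch, assuming normalized touch-pad coordinates in the range 0.0 to 1.0 and an invented control instruction table (the entries and field names are illustrative only):

```python
# Hedged sketch of the second processing unit's conversion of control
# operation data into a target control instruction.

CONTROL_INSTRUCTION_TABLE = {
    "single_click": "overlay_highlight_frame",
    "double_click": "zoom_selected_region",
    "drag":         "move_window",
}

def to_target_control_instruction(control_operation_data,
                                  screen_width, screen_height):
    # Operation information -> control information, via the table.
    op = control_operation_data["operation"]
    # Coordinate information (normalized) -> spatial information (pixels
    # on the spliced large screen).
    x_norm, y_norm = control_operation_data["coordinates"]
    return {
        "control_information": CONTROL_INSTRUCTION_TABLE[op],
        "spatial_information": (round(x_norm * screen_width),
                                round(y_norm * screen_height)),
    }

# A click at (0.25, 0.5) on the touch pad, mapped onto a 7680x2160 wall.
instr = to_target_control_instruction(
    {"operation": "single_click", "coordinates": (0.25, 0.5)},
    screen_width=7680, screen_height=2160)
```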
Illustratively, suppose a user clicks a certain position using a touch pad, i.e. produces control operation data, intending to superimpose a highlighted selection frame layer on a video displayed on the controlled screen. The expansion processing module 1400 connected to the touch pad converts the control operation data into a target control instruction containing spatial information and control information, and sends it through the network to the main control module 1100 and/or the output processing module 1300 connected to the expansion processing module 1400. After receiving the target control instruction, the main control module 1100 and/or that output processing module 1300 forwards it to each connected input processing module 1200 and output processing module 1300. Each output processing module 1300 or input processing module 1200 parses the target control instruction to obtain the control information and the spatial information corresponding to it, determines whether the spatial information falls within its corresponding display region, and determines whether that display region needs to execute the control information in the target control instruction. If the spatial information is in its display area and the control information is to be executed, the highlighted selection frame layer is displayed, superimposed at the screen position (spatial information) corresponding to the user's click position (coordinate information) on the touch pad. For example, if the coordinate information is located in the display area of the display device 4000 to which a certain output processing module 1300M is connected, the output processing module 1300M generates the highlighted selection frame layer in real time and superimposes it at the corresponding spatial position of the first image information displayed by that display device 4000. If the user double-clicks a certain position with the touch pad, then, on the basis of the single-click operation above, the area selected by the highlighted selection frame layer is magnified after the layer is superimposed, and the event is notified to the main control module 1100 and the other input processing modules 1200 and output processing modules 1300 through the bus. In addition, the user can also generate a cursor displayed on the large screen through the touch pad; while the user moves on the touch pad, the output processing module 1300 connected to the touch pad's expansion processing module 1400 performs graphic or text superposition in real time following the movement, so that the large screen connected to the output processing module 1300 produces interactive effects such as window-frame flickering and real-time coordinate display, giving the user the feeling that operating the touch pad is a visual and intuitive operation on the content displayed on the large screen.
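The region test each module applies can be sketched as below, assuming one output processing module per screen and a simple rectangle containment check; the `OutputModule` class and its fields are hypothetical modeling devices, not the patent's hardware design.

```python
# Illustrative sketch: each output processing module executes the
# control information only when the spatial coordinate lies inside the
# screen region that module drives.

class OutputModule:
    def __init__(self, name, x, y, width, height):
        self.name = name
        self.region = (x, y, width, height)
        self.overlays = []

    def contains(self, point):
        px, py = point
        x, y, w, h = self.region
        return x <= px < x + w and y <= py < y + h

    def handle(self, instruction):
        if self.contains(instruction["spatial_information"]):
            self.overlays.append(instruction["control_information"])
            return True   # this module superimposes the layer
        return False      # instruction is ignored here

# A 2x1 wall of 1920x1080 screens, one output module per screen.
wall = [OutputModule("1300M", 0, 0, 1920, 1080),
        OutputModule("1300N", 1920, 0, 1920, 1080)]
instr = {"control_information": "overlay_highlight_frame",
         "spatial_information": (2500, 400)}
handled = [m.handle(instr) for m in wall]
```

A click at (2500, 400) lands on the second screen, so only module "1300N" superimposes the highlighted frame; "1300M" ignores the same broadcast instruction.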
Another embodiment of a splicing processor 1000 according to the present invention is an optimized embodiment of the foregoing embodiment. As shown in fig. 2, the main improvement of this embodiment over the foregoing embodiment is that the splicing processor includes:
the output processing module 1300 includes:
a fourth communication unit 1310 configured to acquire the target control instruction and the first image information;
a third processing unit 1320, configured to process, according to control information in the target control instruction, a part corresponding to the spatial information in the first image information to obtain corresponding second image information;
an output unit 1330 transmitting the second image information to a display device such that the display device displays the second image information;
the fourth communication unit 1310 also sends the target control instruction to the next output processing module 1300 and the main control module 1100.
Specifically, in this embodiment, the first image information processed by the output processing module 1300 is either first image information that has not been processed by the input processing module 1200, or first image information already processed by the input processing module 1200. The output processing module 1300 processes the acquired first image information according to the target control instruction to obtain the corresponding second image information, and then sends the second image information to a display device connected to it (such as an LED display screen, a numerical-control display screen, or another device having a display screen), which displays it. Because the second image information displayed by the display device is generated in real time, the control operation data input by the user in real time on the external device 2000 becomes a visual and intuitive operation on the first image information.
The fourth communication unit 1310 also sends the target control instruction to the next output processing module 1300 and the main control module 1100. Specifically, as shown in fig. 2, the target control instruction obtained by the main control module 1100 may be sent by the connected expansion processing module 1400, or may be sent by the connected output processing module 1300.
Another embodiment of a splicing processor 1000 according to the present invention is an optimized embodiment of the foregoing embodiment. As shown in fig. 2, the main improvement of this embodiment over the foregoing embodiment is that the splicing processor includes:
the main control module 1100 further includes:
the assigning unit 1120, which sets the unique identification number and priority of each expansion processing module 1400.
Specifically, in this embodiment, the unique identification number and the priority of each expansion processing module 1400 connected to the main control module 1100 may be set. Setting them through the main control module 1100 improves the effectiveness of assigning unique identification numbers and priorities: when another expansion processing module 1400 is newly added, the unique identification numbers and priorities of all expansion processing modules 1400 can be reset, or the newly added expansion processing module 1400 can be numbered and given a priority directly according to the historical numbering rule and the historical priority-setting rule, on the basis of the historically set unique identification numbers and priorities. The invention can thus ensure that, in a multi-user operation control scenario, multiple users control the same large screen in an orderly manner.
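One way the "historical numbering rule" could work is sketched below: existing modules keep their numbers, and a newly added expansion processing module simply receives the next identification number and a default priority. The incrementing rule and the default priority value are assumptions for illustration.

```python
# Hedged sketch of the assigning unit 1120: number a newly added
# expansion processing module without renumbering the existing ones.

def assign_new_module(registry, default_priority=10):
    """registry maps unique_id -> priority for existing expansion modules.

    Continues the historical numbering rule (next integer) and gives
    the new module an assumed default priority.
    """
    next_id = max(registry, default=0) + 1
    registry[next_id] = default_priority
    return next_id

registry = {1: 1, 2: 5}        # historically set IDs and priorities
new_id = assign_new_module(registry)
```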
Another embodiment of a splicing processor 1000 according to the present invention is an optimized embodiment of the foregoing embodiment. As shown in fig. 2, the main improvement of this embodiment over the foregoing embodiment is that the splicing processor includes:
the expansion processing module 1400 further includes:
the negotiation unit 1430, which sets the unique identification number and priority of each expansion processing module 1400 through the network.
Specifically, in this embodiment, real-time network negotiation between the expansion processing modules 1400 can coordinate the unique identification numbers and priorities of the different expansion processing modules 1400 in a multi-user operation control scenario, according to the input time of the users' control operation data and a preset negotiation rule. This ensures that multiple users control the same large screen in an orderly manner, avoids loss of control or chaotic control of the interactive operation of the large screen during multi-user operation, coordinates the control of multiple users over the large screen, and indirectly improves the user experience. Because the unique identification numbers and priorities are set by negotiation between the expansion processing modules 1400 themselves, a user can flexibly select and control the large screen, the heavy workload of setting unique identification numbers and priorities for the expansion processing modules 1400 in advance is avoided, and the user experience is guaranteed.
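The patent leaves the preset negotiation rule open; one plausible rule consistent with "according to input time" is first-come, first-served: the module whose user input arrived earliest wins the highest priority. The sketch below assumes that rule and uses hypothetical module names.

```python
# Illustrative sketch of the negotiation unit's priority rule:
# earlier user input => higher priority (priority 1 is highest).

def negotiate_priorities(modules):
    """modules: list of (unique_id, input_timestamp) tuples.

    Returns a mapping unique_id -> priority rank.
    """
    ordered = sorted(modules, key=lambda m: m[1])  # earliest input first
    return {uid: rank for rank, (uid, _) in enumerate(ordered, start=1)}

priorities = negotiate_priorities([("1400S1", 17.2),
                                   ("1400S2", 12.9),
                                   ("1400S3", 15.0)])
```

Under this assumed rule, concurrent operations on the same screen region are serialized by priority, which is the ordered-control property the paragraph above claims.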
For example, consider an application scenario of user interactive operation in which a splicing processor 1000 manages a large number of operator seats, as is common in command and dispatch centers with many operators. An expansion processing module 1400 may be connected to each seat, and each expansion processing module 1400 may be connected to various external devices 2000 such as a mouse, a keyboard, or a stylus pen. Each operator can operate the one or more screens of the large screen corresponding to his seat, independently of the other operators. Meanwhile, the expansion processing module 1400 of the present invention can be connected not only to the main control module 1100 through a network, but also to an output processing module 1300 having a network connection function. Therefore, when the external device 2000 inputs control operation data, the expansion processing module 1400 connected to it controls the superposition onto the content displayed on the output screen corresponding to the connected output processing module 1300, or onto the video content processed in that output processing module 1300, so that the processing burden falls only on that output processing module 1300. In seat-management applications in the large-screen field, most mouse, keyboard, and touch-screen control corresponds to a few output screens, and full-screen operations are comparatively rare, which further reduces the burden on the splicing processor 1000, improves the response speed of the interactive operation between the user and the large screen, and reduces hardware overhead.
The splicing processor 1000 with the above functions can be used for visual command in a large military command center or an actual-combat training center. The command center is provided with a large display screen, and each officer and high-level commander has a display and a set consisting of a keyboard, a mouse, and a drawing board in front of his seat; the keyboard, mouse, and drawing board of each seat are connected to the expansion processing module 1400 equipped at that seat, and the splicing processor 1000 is connected to all the display screens. Each officer carries out visual command (battlefield images and computer images) of the forces within his authority; operations such as changing the views and sizes of several battlefield images can be performed by drawing on the drawing board, and the operation of any one or more seats can be displayed on the large screen. Each officer has operation authority over his corresponding seat, while the seat corresponding to the high-level commander has management authority to operate any seat screen or the large screen, which can be realized by defining authority and priority for the multiple expansion processing modules 1400.
In addition, if the input processing module 1200 is connected to the server to obtain the first image information, i.e. a battlefield map, a key combination pressed by the high-level commander on a certain battlefield map can be converted into a graphic superposition on that battlefield map in the input processing module 1200; for example, a certain key combination can be bound so that the input processing module 1200 superimposes a red fire-strike arrow graphic on the battlefield map. Meanwhile, the server, which is also connected to the input processing module 1200 through the network, obtains the target control instruction of the high-level commander and issues it through the network to the military command system, so that the high-level commander issues the fire-strike instruction, the strike range being the area of the red fire-strike arrow graphic superimposed on the battlefield map. That is to say, the splicing processor 1000 has the capability of converting a visual target control instruction into other control information or instructions and, combined with the superimposed graphics, realizing virtualized display and visual, real control; the user can bind capabilities to target control instructions operated through the external device 2000 so as to meet the requirements of different industries.
Illustratively, suppose several users use several expansion processing modules 1400: user 1 connects a mouse to an expansion processing module 1400S1, which is connected to an output processing module 1300X1, and user 2 connects a writing tablet to an expansion processing module 1400S2, which is connected to an output processing module 1300X2. The two expansion processing modules 1400 first obtain unique identification numbers through network negotiation, then convert the unique identification number together with the received mouse or writing-tablet operation information into a target control instruction that the input processing modules 1200 and output processing modules 1300 of the splicing processor 1000 can recognize, and send the target control instruction through the network to the connected main control module 1100 and output processing modules 1300. After the main control module 1100 and the output processing modules 1300 receive the target control instruction, it is sent to each input processing module 1200 and output processing module 1300; the MCU or FPGA on each input processing module 1200 and output processing module 1300 judges whether the spatial coordinate in the target control instruction lies in the display area for which it is responsible, and whether that display area should receive and execute the control information in the target control instruction. If both conditions are satisfied, the operation is displayed in that display area.
For the output processing module 1300X1, if the spatial information corresponding to the mouse operated by user 1 is located at the edge of a signal window in the display area for which the output processing module 1300X1 is responsible, the output processing module 1300X1 generates a highlighted selection frame layer in real time, superimposes it on that signal window, and at the same time generates a cursor prompt that the user can drag, so that user 1 can click and drag the selected cursor with the mouse, realizing the interactive operation of scaling the window. For the output processing module 1300X2, if the spatial information corresponding to the writing-tablet operation of user 2 is located in the display area of the output processing module 1300X2, the output processing module 1300X2 generates a graphic-overlay track dotted line in real time, so that user 2 can draw on the large screen through drawing operations on the writing tablet. In this embodiment, the user's operations on the external device 2000 thus receive a corresponding visual display while controlling the image information on the large screen.
The operations of two or even more users can be independent of or related to each other, and the expansion processing modules 1400 can support simultaneous interactive operation by multiple users through the network. Although each input processing module 1200 and output processing module 1300 receives a large number of target control instructions, each only needs to execute the operations related to its own display area and image information, and operation over a larger display area and more image information can be realized by increasing the number of input processing modules 1200 and output processing modules 1300, so the splicing processor 1000 of the present invention can adapt its interactive graphic-operation capability to different scales.
In all the above embodiments, the expansion processing module 1400 is connected to the output processing module 1300 and the main control module 1100 through a network; the input processing module 1200 is connected to the main control module 1100 and the output processing module 1300 through a bus.
The following description will be made based on all the above embodiments by way of example:
A splicing processor 1000 with a visual interaction function is provided with a main control module 1100, expansion processing modules 1400, input processing modules 1200, and output processing modules 1300. The main control module 1100, the input processing modules 1200, and the output processing modules 1300 are connected with each other through a bus, and one or more expansion processing modules 1400 are connected through a network to the main control module 1100 or to an output processing module 1300 having a network connection. The expansion processing module 1400 may be connected to an external device 2000 with an interactive operation function, such as a touch pad, a touch screen, a writing board, a mouse, or a keyboard, and may further output to a display the image and control signals processed inside the splicing processor 1000. The expansion processing module 1400 converts the control operation data generated by the external device 2000 connected to it (such as the click position of a mouse at the current time, a key depression at the current time, or a multi-point touch operation on the touch screen) into a target control instruction, and transmits the target control instruction to the other expansion processing modules 1400, the main control module 1100, and the output processing modules 1300 having the network connection. The input processing modules 1200 and the output processing modules 1300 both have the function of generating video-overlay graphics and text in real time and, under the control information transmitted from the expansion processing modules 1400 and the main control module 1100, control their respective video-overlay graphic operations, cursor and character generation operations, and cursor and character change operations, thereby realizing visual and graphical operation on the large screen and on the signal source content.
Specifically, after the expansion processing module 1400 obtains the control operation data generated by an external device 2000 having an interactive operation function, such as a touch pad, touch screen, writing board, mouse, or keyboard, the control operation data is analyzed and processed to obtain the corresponding target control instruction. If there are multiple expansion processing modules 1400, each expansion processing module 1400 needs to negotiate, or have set, a priority and a unique identification number through the network so that the processor can identify it.
The expansion processing module 1400 forwards the target control instruction through the network to the main control module 1100 and/or an output processing module 1300 with a network connection, so the main control module 1100 and the network-connected output processing modules 1300 obtain the one or more target control instructions transmitted by the expansion processing modules 1400. For example, for a mouse click operation, the coordinate information and operation information of the click are obtained and converted into the corresponding spatial information and control information; for a writing operation on a writing board, the pen-pressure information during writing, which comprises coordinate information and operation information, is obtained and converted into the corresponding spatial information and control information; for a keyboard key operation, the key-value information, which comprises coordinate information and operation information, is obtained and converted into the corresponding spatial information and control information.
If the target control instruction is obtained only by the main control module 1100 (the expansion processing module 1400 is not connected to an output processing module 1300), the main control module 1100 forwards the obtained target control instruction to each input processing module 1200 and output processing module 1300 through the bus, and each input processing module 1200 and output processing module 1300 generates and changes overlay graphics, cursors, and characters for its corresponding video according to the target control instruction.
If the target control instruction is obtained only by a network-connected output processing module 1300, that output processing module 1300 forwards it over the bus to each input processing module 1200, the other output processing modules 1300, and the main control module 1100; each input processing module 1200 and output processing module 1300 then generates and changes overlay graphics, cursors, and text for its corresponding video according to the instruction.
If both the main control module 1100 and a connected output processing module 1300 obtain the target control instruction, both forward it to each input processing module 1200, the other output processing modules 1300, and the main control module 1100, and each input processing module 1200 and output processing module 1300 generates and changes overlay graphics, cursors, and text for its corresponding video according to the instruction.
Each input processing module 1200 of the present invention performs editing operations on the first image information according to the target control instruction, so the user can interactively control the content of the first image information. For example, when the target control instruction superimposes a graphic, cursor, or text on the first image information, the user obtains visual, graphical interactive control of its content; that is, the manipulation of overlay graphics is converted into processing and control of the signal content. Drawing a closed figure such as a circle with a writing board can, for instance, correspond to matting out the content inside the figure; if the overlay is a hand-shaped cursor, pressing a given key while moving the cursor can be defined as translating the picture content.
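A toy interpretation layer along these lines might re-map overlay gestures to content operations as follows. The gesture names, the closure threshold, and the bounding-box matting are all invented for illustration; the patent does not prescribe any particular algorithm.

```python
# Illustrative sketch: a drawn overlay gesture is re-interpreted as an
# operation on the signal content — a closed stroke becomes a matting
# (extraction) region, a hand-cursor drag with a key held becomes a
# translation of the picture content.

def interpret_gesture(stroke, cursor="arrow", key_held=False):
    """stroke: list of (x, y) points. Returns (operation, parameters)."""
    x0, y0 = stroke[0]
    x1, y1 = stroke[-1]
    # Treat the stroke as closed if it ends near where it began.
    closed = abs(x0 - x1) <= 5 and abs(y0 - y1) <= 5 and len(stroke) > 2
    if closed:
        xs = [p[0] for p in stroke]
        ys = [p[1] for p in stroke]
        # Matting: extract the bounding box of the enclosed region.
        return ("matte", (min(xs), min(ys), max(xs), max(ys)))
    if cursor == "hand" and key_held:
        # Translation: move the picture content by the drag delta.
        return ("translate", (x1 - x0, y1 - y0))
    # Otherwise just draw the stroke as an ordinary overlay.
    return ("draw", stroke)
```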
Each output processing module 1300 of the present invention likewise performs editing operations on the first image information according to the target control instruction, so the user can interactively control its content. For example, when the target control instruction superimposes a graphic, cursor, or text on the first image information, the user obtains visually guided interactive control of the content: the module synthesizes second image information carrying a guide graphic, either by superimposing the graphic, cursor, or text or by mixing it with the first image information, and outputs this second image information to the large screen for display. A visual guide graphic is a graphic generated to guide the user's operation. For example, when the cursor moves to the edge of a window in a display area, it changes from a one-way arrow to a two-way arrow, guiding the user to resize the window by clicking and dragging. As another example, when a large-screen area is operated through a touch pad, a highlighted frame is superimposed on the border of the uppermost window in that area after a click, prompting the operator that moving a finger will move the frame and that, on release, the content inside the frame will move along with it.
One embodiment of a visual interaction method of a splicing processor 1000 according to the present invention, as shown in fig. 3, comprises:
S100, acquiring control operation data, and converting the control operation data to obtain a corresponding target control instruction; the target control instruction comprises control information and spatial information corresponding to the control information;
S200, acquiring first image information;
S300, processing the first image information according to the target control instruction to obtain corresponding second image information.
Specifically, in this embodiment steps S100 and S200 may be performed in either order, and the first and second image information include image information such as graphics (still or moving images) and video. The first image information serves as the source signal, and the second image information as the destination signal. The target control instruction comprises control information and the spatial information corresponding to it. That is, to perform real-time visual interactive operation on an image to be displayed on the large screen, the target control instruction is obtained; if, for example, a preset icon is to be superimposed on the upper right corner of the first image information, the control information is "superimpose the preset icon", the operation object is the first image information, and the spatial information is the upper right corner of the first image information. The method acquires the control operation data, converts it to obtain the corresponding target control instruction, acquires the first image information, and processes the first image information according to the target control instruction to obtain the second image information. In the present invention, the control operation data input by each user is converted in a distributed manner: each item of control operation data is converted into its corresponding target control instruction, and the first image information is processed accordingly to obtain the second image information.
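The three steps can be sketched end to end. All names and the one-entry instruction table below are assumptions for illustration: control operation data is converted into a target control instruction (S100), which is then applied to the first image information to yield the second image information (S300).

```python
# Minimal end-to-end sketch of steps S100-S300. The lookup table and the
# dict-based "image" representation are placeholders, not the patent's
# actual data structures.

INSTRUCTION_TABLE = {"left_click": "superimpose_icon"}  # assumed mapping

def s100_convert(op_data):
    """S100: operation info -> control info, coordinates -> spatial info."""
    return {"control": INSTRUCTION_TABLE[op_data["operation"]],
            "spatial": op_data["coords"]}

def s300_process(first_image, instruction):
    """S300: apply the instruction to produce the second image information."""
    overlays = list(first_image.get("overlays", []))
    overlays.append({"what": instruction["control"],
                     "at": instruction["spatial"]})
    return {**first_image, "overlays": overlays}
```

Step S200 (acquiring the first image information) is represented here simply by the `first_image` dict; as the text notes, its order relative to S100 does not matter.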
Another embodiment of the visual interaction method of the splicing processor 1000 of the present invention is an optimization of the foregoing embodiment. As shown in fig. 4, the main improvement of this embodiment over the foregoing one is that the method comprises:
S110, acquiring the control operation data input by the user on the external device 2000; the control operation data comprises operation information and coordinate information corresponding to the operation information;
S120, searching the control instruction table for the operation information to obtain corresponding control information, and obtaining spatial information corresponding to the large screen according to the coordinate information, thereby obtaining the target control instruction;
S200, acquiring the first image information;
S310, processing the part of the first image information corresponding to the spatial information according to the control information in the target control instruction to obtain corresponding second image information;
S320, transmitting the second image information to a display device, so that the display device displays the second image information.
Specifically, in this embodiment steps S110, S120, and S200 may be performed in either order: steps S110 and S120 may precede step S200, or step S200 may precede steps S110 and S120. The external device 2000 includes devices with an interactive operation function such as a touch pad, keyboard, mouse, writing pad, or touch screen. The user operates the external device 2000, which captures the control operation data corresponding to the operation. The external device 2000 is connected to the splicing processor 1000 either by wire through a connection line or wirelessly through a network, and transmits the acquired control operation data to the splicing processor 1000. The splicing processor 1000 converts the data: the operation information in the control operation data is matched against the control instruction table to obtain the corresponding control information, and the coordinate information corresponding to the operation information is mapped to obtain the corresponding spatial information; together, the control information and spatial information constitute the target control instruction.
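The coordinate-mapping half of step S120 can be illustrated for a tiled large screen. The wall geometry below (3x2 tiles of 1920x1080) is an assumption for the example; the function maps a device coordinate normalised to [0, 1) onto the wall, reporting both the absolute wall pixel and which tile was hit.

```python
# Hypothetical illustration of S120's spatial mapping: a normalised device
# coordinate becomes an absolute large-screen pixel plus the (row, col) of
# the display tile containing it. All geometry figures are assumed.

def map_to_wall(u, v, tile_w=1920, tile_h=1080, cols=3, rows=2):
    """u, v in [0, 1): normalised coordinates from the input device."""
    wall_x = int(u * tile_w * cols)
    wall_y = int(v * tile_h * rows)
    tile = (wall_y // tile_h, wall_x // tile_w)  # (row, col) of the tile hit
    return wall_x, wall_y, tile
```

With the assumed 3x2 wall, the normalised point (0.5, 0.5) lands in the middle tile of the bottom row boundary, i.e. tile (1, 1).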
Specifically, the first image information can be acquired from any one or more of the image acquisition devices 3000 (such as cameras) and network addresses. Illustratively, when the splicing processor 1000 is connected to several image acquisition devices 3000 such as cameras, it obtains the first image information from the cameras; it also obtains the control operation data input by the user on the external device 2000 and converts it into a target control instruction. The splicing processor 1000 then parses the target control instruction and, according to it, rotates and moves the corresponding camera to the direction and position the user requires, so that the camera can capture the surrounding scene more comprehensively and effectively and obtain the first image information the user needs.
The control operation data acquired by the external device 2000 can thus be converted into target control instructions both to process the content of the first image information and to control the connected image acquisition devices. This achieves a visual and intuitive effect, improves the interaction between the splicing processor 1000 and the user, makes operating the large screen more convenient and interactive, and improves the user experience.
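The camera-control path just described might look like the following sketch. The `Camera` class and its mechanical limits are invented for illustration; a real deployment would issue the equivalent pan/tilt protocol commands to the connected device.

```python
# Sketch of driving a connected image acquisition device from a target
# control instruction: the instruction's spatial information is taken as a
# pan/tilt delta and clamped to an assumed mechanical range.

class Camera:
    def __init__(self):
        self.pan, self.tilt = 0.0, 0.0  # degrees

    def apply(self, instruction):
        """Apply a pan/tilt delta carried in the instruction's spatial info."""
        dpan, dtilt = instruction["spatial"]
        # Clamp to a plausible (assumed) mechanical range.
        self.pan = max(-170.0, min(170.0, self.pan + dpan))
        self.tilt = max(-30.0, min(90.0, self.tilt + dtilt))
        return self.pan, self.tilt
```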
Another embodiment of the visual interaction method of the splicing processor 1000 of the present invention is an optimization of the foregoing embodiment. As shown in fig. 5, the main improvement of this embodiment over the foregoing one is that the method comprises:
S110, acquiring the control operation data input by the user on the external device 2000; the control operation data comprises operation information and coordinate information corresponding to the operation information;
S120, searching the control instruction table for the operation information to obtain corresponding control information, and obtaining spatial information corresponding to the large screen according to the coordinate information, thereby obtaining the target control instruction;
S210, acquiring the first image information;
S220, processing the part of the first image information corresponding to the spatial information according to the control information in the target control instruction to obtain corresponding processed first image information;
S310, processing the part of the first image information corresponding to the spatial information according to the control information in the target control instruction to obtain corresponding second image information;
S320, transmitting the second image information to a display device, so that the display device displays the second image information.
Specifically, in this embodiment steps S110-S120 and steps S210-S220 may be performed in either order. The control information includes editing control, overlay control, rotation control, scaling control, and the like; for example, graphics, cursors, or text can be superimposed on the first image information in real time, realizing visual operation of the large screen and the source signal and providing a more intuitive, more interactive visual operating experience. Illustratively, the video image acquired by camera F, which shoots the scene of a hall, needs to be labelled "hall" so that the user can conveniently record and look it up. The acquired video image is given a text overlay by the input processing module 1200 serving camera F, i.e. the text label "hall" is superimposed on the video image, so that when it is shown on the large screen the displayed video image carries the "hall" label. If the overlay were instead performed by the output processing modules 1300, every output processing module 1300 connected to the large screen would have to superimpose the text, wasting the processing resources of all the output processing modules 1300 and reducing processing and editing efficiency.
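The efficiency argument above can be stated in miniature: labelling a source once at its input processing module costs one overlay operation, while labelling it at every output processing module driving the wall costs one per output. The counting model below is deliberately simplistic and purely illustrative.

```python
# Toy cost model for the "overlay at input vs. overlay at output" choice
# discussed in the text. Counts are illustrative, not measured.

def overlay_operations(strategy, n_outputs):
    """Number of overlay operations needed to label one source."""
    if strategy == "input":    # superimpose once, at the input module
        return 1
    if strategy == "output":   # superimpose at each connected output module
        return n_outputs
    raise ValueError("unknown strategy: " + strategy)
```

For a wall driven by six output processing modules, the input-side overlay is a sixfold saving under this model.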
In all the above embodiments, the method further includes, before step S100, the step:
S010, setting the unique identification number and priority of each expansion processing module 1400. Specifically, in this embodiment the setting may be done centrally in advance or negotiated in real time over the network. With centralized setting in advance, when a new expansion processing module 1400 is added, the unique identification numbers and priorities of all expansion processing modules 1400 can be reset, or the new module can simply be numbered and prioritized according to the historical numbering and priority rules on top of the previously set values; either way, multiple users are guaranteed to control the same large screen in an orderly manner in a multi-user operation scenario.
With real-time network negotiation, the unique identification numbers and priorities of the different expansion processing modules 1400 can be coordinated according to the times at which users input control operation data and according to preset negotiation rules. This likewise guarantees orderly multi-user control of the same large screen, avoiding loss of control or chaotic interactive control under multi-user operation, coordinating the users' control of the large screen, and indirectly improving the user experience. Because the unique identification numbers and priorities are set by negotiation among the expansion processing modules 1400 themselves, users can flexibly select and control the large screen, and the heavy workload of configuring identification numbers and priorities for the expansion processing modules 1400 in advance is avoided.
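A toy version of this negotiation might rank modules by the time their users' input arrived. The "earlier input wins" rule and the data shapes are assumptions standing in for the patent's unspecified preset negotiation rules.

```python
# Illustrative negotiation: expansion modules announce (name, input_time),
# and the assumed rule — earlier user input wins — assigns each a unique
# identification number and a priority (0 = highest).

def negotiate(announcements):
    """announcements: list of (module_name, input_time) tuples."""
    ranked = sorted(announcements, key=lambda a: a[1])
    return {name: {"id": i, "priority": i}
            for i, (name, _) in enumerate(ranked)}
```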
In all the above embodiments, the image displayed on the large screen connected to the splicing processor 1000 is a multi-layer image: by default, the image information the large screen presents to the user is the second image information produced by the output processing module 1300. If the user performs a preset operation on the preset switching region of the second image information displayed on the large screen, the input processing module 1200 can be triggered to edit the content of the first image information, or the image acquisition device 3000 connected to the input processing module 1200 can be triggered and controlled. That is, when the user performs the preset operation on the preset switching region, the expansion processing module 1400 connected to the large screen converts the control operation data corresponding to that operation into a target control instruction directed at the input processing module 1200. For example, as shown in figs. 1, 2, and 6, the second image information 2 is displayed on the large screen 3; the user has the output processing module 1300 process the first image information through the external device 2000 to obtain the second image information 2. When the user rotates the switch button to the horizontal right at the predefined preset switching region 1, the user instead has the input processing module 1200 operate on the first image information through the external device 2000 to obtain processed first image information; the output processing module 1300 then processes this further, or passes it through unchanged, to obtain the second image information, which it outputs to the large screen for display. In the present invention, interactive operation can thus be switched at will between the input processing module 1200 and the output processing module 1300 according to the user's selection, making the interaction stronger and the user experience better.
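The switching behaviour just described can be sketched as a small routing rule: a control event landing inside the preset switching region toggles whether subsequent instructions target the input processing module (editing the source) or the output processing module (editing the composed output). The region geometry and the "rotate" operation name are assumptions for the example.

```python
# Sketch of the preset switching region: events inside the region toggle
# the interaction target between "input" and "output" processing modules.

SWITCH_REGION = (0, 0, 200, 100)  # assumed x, y, width, height of the area

def route(event, current_target):
    """Return the module ('input' or 'output') later instructions go to."""
    x, y = event["coords"]
    rx, ry, rw, rh = SWITCH_REGION
    inside = rx <= x < rx + rw and ry <= y < ry + rh
    if inside and event["op"] == "rotate":
        # Toggle between the two interaction targets.
        return "input" if current_target == "output" else "output"
    return current_target
```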
It should be noted that the above embodiments can be freely combined as necessary. The foregoing is only a preferred embodiment of the present invention; those skilled in the art can make various modifications and improvements without departing from the principle of the present invention, and such modifications and improvements should also be regarded as falling within the protection scope of the present invention.

Claims (15)

1. A splicing processor, comprising: the system comprises a main control module, a plurality of expansion processing modules, an input processing module and an output processing module;
the expansion processing module is connected with external equipment, acquires control operation data, and converts the control operation data to obtain a corresponding target control instruction; the target control instruction comprises control information and space information corresponding to the control information;
the input processing module acquires first image information;
the output processing module is used for processing the first image information according to the target control instruction to obtain corresponding second image information;
the main control module is respectively connected with the expansion processing module, the input processing module and the output processing module, controls the expansion processing module to acquire the target control instruction, controls the input processing module to acquire the first image information, and controls the output processing module to process the first image information to acquire the second image information;
the master control module comprises:
and the distribution unit is used for setting the unique identification number and the priority of each expansion processing module.
2. The splicing processor of claim 1, wherein the master module comprises:
the first communication unit is used for acquiring the target control instruction;
the first communication unit sends the target control instruction to the input processing module and/or the output processing module.
3. The splicing processor of claim 1, wherein:
the input processing module is connected with the main control module and used for acquiring the target control instruction; and/or,
and the input processing module is connected with the output processing module and used for acquiring the target control instruction.
4. The splicing processor of claim 3, wherein the input processing module comprises:
an input unit that acquires the first image information;
the second communication unit is used for acquiring the target control instruction;
and the first processing unit is used for processing the part corresponding to the spatial information in the first image information according to the control information in the target control instruction to obtain the corresponding processed first image information.
5. The splicing processor of claim 4, wherein the input unit comprises:
the first connecting port is in wired connection with the image acquisition equipment through a connecting wire; and/or,
and the first communication subunit is wirelessly connected with the image acquisition equipment through a network.
6. The splicing processor of claim 1, wherein the expansion processing module comprises:
the third communication unit is connected with the external equipment and used for acquiring the control operation data input by the user on the external equipment; the control operation data comprises operation information and coordinate information corresponding to the operation information;
the second processing unit searches a control instruction table for the operation information to obtain corresponding control information, and obtains space information corresponding to a large screen according to the coordinate information to obtain the target control instruction;
and the third communication unit sends the target control instruction to the output processing module and/or the main control module.
7. The splicing processor of claim 6, wherein the third communication unit comprises:
the second connecting port is in wired connection with the external equipment through a connecting wire; and/or,
and the second communication subunit is wirelessly connected with the external equipment through a network.
8. The splicing processor of claim 1, wherein:
the expansion processing module is mutually connected with the output processing module and the main control module through a network;
the input processing module is connected with the main control module and the output processing module through a bus.
9. The splicing processor of any one of claims 1-8, wherein the expansion processing module further comprises:
and the negotiation unit sets the unique identification number and the priority of each expansion processing module through a network.
10. The splicing processor of claim 1 or 3, wherein the output processing module comprises:
the fourth communication unit is used for acquiring the target control instruction and the first image information;
the third processing unit is used for processing the part corresponding to the spatial information in the first image information according to the control information in the target control instruction to obtain corresponding second image information;
and the output unit transmits the second image information to a display device so that the display device displays the second image information.
11. The splicing processor of claim 10, wherein the fourth communication unit further sends the target control command to a next output processing module and the master control module.
12. A visual interaction method for a splicing processor, applied to the splicing processor of any one of claims 1 to 11, the method comprising:
S010, setting the unique identification number and priority of each expansion processing module;
S100, acquiring control operation data, and converting the control operation data to obtain a corresponding target control instruction; the target control instruction comprises control information and space information corresponding to the control information;
S200, acquiring first image information;
S300, processing the first image information according to the target control instruction to obtain corresponding second image information.
13. The visual interaction method of the splicing processor according to claim 12, wherein the step S200 comprises the steps of:
S210, acquiring the first image information;
S220, processing the corresponding part of the spatial information in the first image information according to the control information in the target control instruction to obtain corresponding processed first image information.
14. The visual interaction method of the splicing processor according to claim 12, wherein the step S100 comprises the steps of:
S110, acquiring the control operation data input by the user on the external equipment; the control operation data comprises operation information and coordinate information corresponding to the operation information;
S120, searching the control instruction table for the operation information to obtain corresponding control information, and obtaining space information corresponding to the large screen according to the coordinate information to obtain the target control instruction.
15. The visual interaction method of the splicing processor according to claim 12 or 13, wherein the step S300 comprises the steps of:
S310, processing the corresponding part of the spatial information in the first image information according to the control information in the target control instruction to obtain corresponding second image information;
S320, transmitting the second image information to a display device, so that the display device displays the second image information.
CN201810100501.5A 2018-02-01 2018-02-01 Splicing processor and visual interaction method of splicing processor Active CN108255454B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810100501.5A CN108255454B (en) 2018-02-01 2018-02-01 Splicing processor and visual interaction method of splicing processor

Publications (2)

Publication Number Publication Date
CN108255454A CN108255454A (en) 2018-07-06
CN108255454B true CN108255454B (en) 2021-01-12

Family

ID=62743328

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810100501.5A Active CN108255454B (en) 2018-02-01 2018-02-01 Splicing processor and visual interaction method of splicing processor

Country Status (1)

Country Link
CN (1) CN108255454B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111208965B (en) * 2020-01-15 2023-09-22 宁波Gqy视讯股份有限公司 Spliced display system and display method thereof
CN113132556B (en) * 2020-01-16 2023-04-11 西安诺瓦星云科技股份有限公司 Video processing method, device and system and video processing equipment
CN111913676B (en) * 2020-07-31 2023-01-31 宁波Gqy视讯股份有限公司 Control system and method of LED spliced screen and splicing processor
CN112256362B (en) * 2020-09-11 2023-07-25 四川南格尔生物科技有限公司 Intelligent blood sampling method and system with visual guiding prompt and intelligent blood sampling instrument
CN114253494A (en) * 2020-09-25 2022-03-29 中车株洲电力机车研究所有限公司 Large screen display device for train
CN113326849B (en) * 2021-07-20 2022-01-11 广东魅视科技股份有限公司 Visual data acquisition method and system
CN115543140A (en) * 2022-09-26 2022-12-30 深圳市国鑫恒运信息安全有限公司 Control method for displaying BIOS setting interface cursor on SOL page

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102611869A (en) * 2012-02-10 2012-07-25 江苏清投视讯科技有限公司 Output-oriented network transmission technique of multi-screen splicing system
CN102905122A (en) * 2012-11-13 2013-01-30 北京航天福道高技术股份有限公司 Auxiliary method for tracking suspicious people by monitoring system
CN104954748A (en) * 2015-06-17 2015-09-30 广东威创视讯科技股份有限公司 Video processing architecture
CN106612401A (en) * 2015-10-27 2017-05-03 宏正自动科技股份有限公司 video matrix control device and video matrix control method
CN107273076A (en) * 2017-05-08 2017-10-20 广州美凯信息技术股份有限公司 A kind of method for visualizing cooperation management of attending a banquet, apparatus and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9047039B2 (en) * 2007-05-14 2015-06-02 Christie Digital Systems Usa, Inc. Configurable imaging system


Also Published As

Publication number Publication date
CN108255454A (en) 2018-07-06

Similar Documents

Publication Publication Date Title
CN108255454B (en) Splicing processor and visual interaction method of splicing processor
US11301200B2 (en) Method of providing annotation track on the content displayed on an interactive whiteboard, computing device and non-transitory readable storage medium
US9733736B2 (en) Image display apparatus and method, image display system, and program
EP2446619B1 (en) Method and device for modifying a composite video signal layout
JP5903936B2 (en) Method, storage medium and apparatus for information selection and switching
CN111601120A (en) Wireless screen transmission display system and display method
CN102662498B (en) A kind of wireless control method of projection demonstration and system
CN110572591B (en) KVM (keyboard, video and mouse) agent system signal source preview system and method
CN103984494A (en) System and method for intuitive user interaction among multiple pieces of equipment
NO332170B1 (en) Camera control device and method
US20190102135A1 (en) Scalable interaction with multi-displays
CN107493498A (en) By can touch-control moving icon realize method that video conference multi-screen video shows
TWM516205U (en) Video matrix control apparatus
US11294495B2 (en) Electronic whiteboard, method for image processing in electronic whiteboard, and recording medium containing computer program of electronic whiteboard
CN106293563A (en) A kind of control method and electronic equipment
JP6244069B1 (en) Remote work support system, remote work support method, and program
CN202535475U (en) Video conference system capable of jointly selecting video content
US9548894B2 (en) Proximity based cross-screen experience App framework for use between an industrial automation console server and smart mobile devices
JP2022171661A (en) System, method, and program for creating moving image
CN110692036B (en) Presentation server, data relay method, and method for generating virtual pointer
CN101267326B (en) Orientation control system for multi-party presentation document conference
CN209980227U (en) Intelligent display system and device
CN105120327A (en) Input method for use between electronic equipment and corresponding electronic equipment
JP4766696B2 (en) Interface device and interface system
Rodrigues et al. Augmented reality and tangible user interfaces integration for enhancing the user experience

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant