CN115941861A - Pane picture playing method and device, electronic equipment and medium - Google Patents

Pane picture playing method and device, electronic equipment and medium

Info

Publication number
CN115941861A
Authority
CN
China
Prior art keywords
area
pane
region
determining
enlarged
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211610688.6A
Other languages
Chinese (zh)
Other versions
CN115941861B (en)
Inventor
董黎晨
景杰
刘碧波
孟亚光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Sany Electronic Technology Co ltd
Original Assignee
Shanghai Sany Electronic Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Sany Electronic Technology Co ltd filed Critical Shanghai Sany Electronic Technology Co ltd
Priority to CN202211610688.6A
Publication of CN115941861A
Application granted
Publication of CN115941861B
Current legal status: Active

Landscapes

  • Closed-Circuit Television Systems (AREA)

Abstract

The embodiments of the disclosure provide a pane picture playing method and apparatus, an electronic device, and a medium. The pane picture playing method comprises: displaying received surveillance video data in a pane; determining a region to be enlarged based on an enlargement rule when it is recognized that the surveillance video data includes a target object; determining a correction area based on a conflict processing rule when the region to be enlarged causes a conflict; and enlarging the target object to the correction area for display.

Description

Pane picture playing method and device, electronic equipment and medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a pane picture playing method and apparatus, an electronic device, and a medium.
Background
In the monitoring field, the pictures of a plurality of cameras often need to be viewed at the same time. The pictures of the plurality of cameras can be displayed simultaneously in a multi-pane manner; however, this reduces the display area of each picture. Therefore, although the resolution of the received video data is sufficient, each picture is scaled down due to the limitation of its pane, so that the image viewed by the user is not sufficiently clear. How to meet the need of viewing the pictures of a plurality of cameras on a limited screen while preserving image clarity is a problem that urgently needs to be solved.
Disclosure of Invention
To solve the problems in the related art, embodiments of the present disclosure provide a pane picture playing method and apparatus, an electronic device, and a medium.
One aspect of the embodiments of the present disclosure provides a pane picture playing method, including: displaying received surveillance video data in a pane; determining a region to be enlarged based on an enlargement rule when it is recognized that the surveillance video data includes a target object; determining a correction area based on a conflict processing rule when the region to be enlarged causes a conflict; and enlarging the target object to the correction area for display.
According to an embodiment of the present disclosure, the target object includes a person and/or an alarm device.
According to the embodiment of the disclosure, the determining of the area to be enlarged based on the enlargement rule includes keeping the position of one vertex of the area where the target object is located unchanged, and determining the area to be enlarged based on a preset enlargement factor.
According to the embodiment of the disclosure, the determining of the area to be enlarged based on the enlargement rule includes keeping the position of the center point of the area where the target object is located unchanged, and determining the area to be enlarged based on a preset enlargement factor.
According to the embodiment of the disclosure, in the case that the quasi-enlargement area causes a conflict, determining a correction area based on a conflict processing rule includes, in the case that the quasi-enlargement area conflicts with an existing enlargement area in a current pane, adjusting display parameters of the quasi-enlargement area and the existing enlargement area so that the quasi-enlargement area does not overlap with the existing enlargement area; and determining a correction area based on the adjusted display parameters.
According to the embodiment of the disclosure, in the case that the quasi-enlargement area causes a conflict, determining the correction area based on the conflict processing rule includes, in the case that the quasi-enlargement area conflicts with the boundary of the current pane, if a preset parameter indicates that display exceeding the pane is allowed, determining the quasi-enlargement area as the correction area.
According to the embodiment of the disclosure, in the case that the quasi-enlargement area causes a conflict, determining a correction area based on a conflict processing rule includes, in the case that the quasi-enlargement area conflicts with the boundary of the current pane, if a preset parameter indicates that display exceeding the pane is not allowed, adjusting the display parameter of the quasi-enlargement area so that the quasi-enlargement area does not exceed the boundary of the current pane; a correction area is determined based on the adjusted display parameters.
According to the embodiment of the disclosure, in the case that the quasi-enlargement area causes a conflict, determining a correction area based on a conflict processing rule includes, in the case that the quasi-enlargement area conflicts with an existing enlargement area in other panes, adjusting display parameters of the quasi-enlargement area and the existing enlargement area so that the quasi-enlargement area and the existing enlargement area are displayed in the respective panes; a correction area is determined based on the adjusted display parameters.
Another aspect of the disclosed embodiments provides a pane picture playing apparatus, including a first display module, a first determination module, a second determination module, and a second display module. The first display module is configured to display the received surveillance video data in a pane; the first determination module is configured to determine a region to be enlarged based on an enlargement rule when it is recognized that the surveillance video data includes a target object; the second determination module is configured to determine a correction area based on a conflict processing rule when the region to be enlarged causes a conflict; and the second display module is configured to enlarge the target object to the correction area for display.
Another aspect of an embodiment of the present disclosure provides an electronic device including at least one processor and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, the instructions being executable by the at least one processor to cause the processor to implement the pane playing method as described above.
Another aspect of the embodiments of the present disclosure provides a computer-readable storage medium having computer-readable instructions stored thereon, which, when executed by a processor, cause the processor to implement the pane playing method as described above.
Another aspect of the embodiments of the present disclosure provides a computer program which, when executed by a processor, causes the processor to implement the pane screen playing method as described above.
According to the technical solutions of the embodiments of the present disclosure, the surveillance video data is displayed and recognized, and when a target object is recognized, the region where the target object is located is enlarged for display according to the enlargement rule and the conflict processing rule. This ensures that multiple video channels can be observed simultaneously while key information can be viewed enlarged, so that details are not easily missed, different requirements are accommodated, and screen utilization is improved.
Drawings
Other features, objects, and advantages of the present disclosure will become more apparent from the following detailed description of non-limiting embodiments when taken in conjunction with the accompanying drawings. In the drawings:
fig. 1A schematically illustrates a system architecture diagram of a pane playing method or apparatus to which an embodiment of the present disclosure is applied;
FIG. 1B schematically illustrates a diagram of a pane to which embodiments of the present disclosure are applied;
fig. 2 schematically illustrates a flowchart of a pane screen playing method according to an embodiment of the present disclosure;
FIGS. 3A-3D schematically illustrate the region in which an enlarged target object is located according to an embodiment of the present disclosure;
FIG. 4 schematically illustrates a flow chart for determining a correction region in an embodiment of the present disclosure;
fig. 5A and 5B schematically illustrate a schematic view of determining a correction region according to an embodiment of the present disclosure;
FIG. 6 schematically illustrates a flow chart for determining a correction region according to another embodiment of the present disclosure;
FIGS. 7A and 7B schematically illustrate a schematic of determining a correction region in accordance with an embodiment of the present disclosure;
FIG. 8 schematically illustrates a flow chart for determining a correction region according to another embodiment of the present disclosure;
fig. 9A and 9B schematically illustrate a schematic view of determining a correction region according to an embodiment of the present disclosure;
fig. 10A and 10B schematically illustrate a flowchart of a pane playing method according to another embodiment of the present disclosure;
fig. 11 schematically shows a block diagram of a pane playing apparatus of an embodiment of the present disclosure; and
fig. 12 is a schematic structural diagram of a computer system suitable for implementing the pane screen playing method and apparatus according to the embodiment of the present disclosure.
Detailed Description
Hereinafter, exemplary embodiments of the present disclosure will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily implement them. Also, for the sake of clarity, parts not relevant to the description of the exemplary embodiments are omitted in the drawings.
In the present disclosure, it is to be understood that terms such as "including" or "having," etc., are intended to indicate the presence of the disclosed features, numbers, steps, behaviors, components, parts, or combinations thereof, and are not intended to preclude the possibility that one or more other features, numbers, steps, behaviors, components, parts, or combinations thereof may be present or added.
It should also be noted that the embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
As described above, simultaneous display of pictures of a plurality of cameras in the form of a multi-pane will result in a smaller display area. Therefore, although the resolution of the received video data is sufficient, the reduction processing is performed due to the limitation of the pane, resulting in an image viewed by the user being insufficiently clear.
In view of the foregoing problems, an aspect of the embodiments of the present disclosure provides a pane picture playing method, including: displaying received surveillance video data in a pane; determining a region to be enlarged based on an enlargement rule when it is recognized that the surveillance video data includes a target object; determining a correction area based on a conflict processing rule when the region to be enlarged causes a conflict; and enlarging the target object to the correction area for display. Because the region where the target object is located is enlarged for display according to the enlargement rule and the conflict processing rule, multiple video channels can be observed simultaneously while key information can be viewed enlarged, so that details are not easily missed, different requirements are accommodated, and screen utilization is improved.
The technical solutions provided by the embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
Fig. 1A schematically illustrates a system architecture diagram of a pane playing method or apparatus to which an embodiment of the present disclosure is applied.
As shown in fig. 1A, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. Network 104 is the medium used to provide communication links between terminal devices 101, 102, 103 and server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The terminal devices 101, 102, 103 interact with a server 105 via a network 104 to receive or send messages or the like. Various client applications may be installed on the terminal devices 101, 102, 103. Such as browser-type applications, search-type applications, instant messaging-type tools, and so forth. The terminal devices 101, 102, 103 may present the surveillance video data from the server 105 in a multi-pane manner.
The terminal apparatuses 101, 102, and 103 may be hardware or software. When the terminal devices 101, 102, 103 are hardware, they may be various special purpose or general purpose electronic devices including, but not limited to, smart phones, tablet computers, laptop portable computers, desktop computers, and the like. When the terminal apparatuses 101, 102, 103 are software, they can be installed in the electronic apparatuses listed above. It may be implemented as multiple pieces of software or software modules (e.g., multiple pieces of software or software modules used to provide distributed services), or as a single piece of software or software module.
The server 105 may be a server that provides various services, such as a backend server that provides services for client applications installed on the terminal devices 101, 102, 103. The server 105 can be communicably connected with a plurality of monitoring cameras, receive monitoring video data from the monitoring cameras, and forward it to the terminal device 101, 102, or 103.
The server 105 may be hardware or software. When the server 105 is hardware, it may be implemented as a distributed server cluster composed of a plurality of servers, or may be implemented as a single server. When the server 105 is software, it may be implemented as multiple pieces of software or software modules (e.g., multiple pieces of software or software modules used to provide distributed services), or as a single piece of software or software module.
The pane screen playing method provided by the embodiment of the present disclosure may be executed by the server 105, or may be executed by the terminal devices 101, 102, and 103, for example. Alternatively, the pane screen playing method of the embodiment of the present disclosure may be partially executed by the terminal devices 101, 102, and 103, and the other part is executed by the server 105.
It should be understood that the number of terminal devices, networks, and servers in fig. 1A are merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for an implementation.
Fig. 1B schematically illustrates a schematic view of a pane to which an embodiment of the present disclosure is applied.
According to an embodiment of the present disclosure, a display area of a terminal device may be divided into a plurality of panes. For example, as shown in FIG. 1B, a display area may be divided into four panes, A, B, C, and D. In other embodiments, the display area may also be divided into other numbers of panes, such as 2, 6, 8, 9, 12, 15, 16, etc., for example. Each pane may be used to display a path of surveillance video data.
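By way of illustration only, the following sketch (an assumption, not something stated in the disclosure) divides a display area into a grid of equally sized panes such as the four panes A, B, C, and D of FIG. 1B; the function name split_into_panes and the labeling scheme are hypothetical.

def split_into_panes(width, height, rows=2, cols=2):
    """Return a dict mapping pane labels to (x, y, w, h) rectangles."""
    pane_w, pane_h = width // cols, height // rows
    labels = iter("ABCDEFGHIJKLMNOP")
    return {
        next(labels): (col * pane_w, row * pane_h, pane_w, pane_h)
        for row in range(rows) for col in range(cols)
    }

For a 1920x1080 display with the default 2x2 layout, pane A would be (0, 0, 960, 540), pane B (960, 0, 960, 540), and so on.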
Fig. 2 schematically illustrates a flowchart of a pane screen playing method according to an embodiment of the present disclosure.
As shown in fig. 2, the pane screen playing method includes operations S210-S240.
In operation S210, the received surveillance video data is presented in a pane.
In operation S220, in case that it is recognized that the surveillance video data includes the target object, a region to be enlarged is determined based on an enlargement rule.
In operation S230, in the case where the quasi-magnified region causes a collision, a modified region is determined based on a collision processing rule.
In operation S240, the target object is enlarged to the modified area for presentation.
According to the embodiment of the disclosure, each pane may be used to display one channel of surveillance video data. While being displayed, the surveillance video data is detected to identify whether it contains a target object. The algorithm used for recognition may be any of various known algorithms; for example, any suitable neural network model may be used.
According to the embodiment of the disclosure, the target object may include a person: a portrait is generally regarded as important picture information, so recognition may be performed for portraits. The target object may also include an alarm device. The administrator can add pictures of the devices to be recognized, such as alarm devices with warning lights that require focused attention, and train the recognition model accordingly. Thus, during playback, the system can recognize the alarm device.
According to the embodiment of the present disclosure, after the target object is identified, the area where the target object is located, hereinafter simply referred to as the target area, may be determined. The target area may be, for example, a circumscribed rectangular area of the target object. The target area may then be enlarged so that the details of the target object are shown more clearly. For example, if a portrait is identified, the target area where the portrait is located can be enlarged for playback, so that in a multi-pane playing mode, multiple camera pictures can be played simultaneously while important information can still be seen clearly.
According to embodiments of the present disclosure, the enlargement rules may be configured in advance. For example, the administrator may configure the magnification to be 1.5 times, 2 times, 3 times, or the like. The position of the enlarged region may also be configured. In some embodiments of the present disclosure, the position of one vertex of the region where the target object is located may be kept unchanged, and the region to be enlarged is determined based on a preset magnification. In other embodiments of the present disclosure, the position of the central point of the region where the target object is located may be kept unchanged, and the region to be enlarged is determined based on a preset magnification.
For example, in pane A, a target object is identified in an area A1, as shown in FIG. 3A. If the position of the lower-left vertex is kept unchanged, enlarging the target area A1 by two times yields the quasi-enlarged region A1' shown in FIG. 3B. If the center position is kept unchanged and the target area is enlarged by two times, the quasi-enlarged region A1' shown in FIG. 3C is obtained. If the center position is kept unchanged and the target area is enlarged by three times, the quasi-enlarged region A1' shown in FIG. 3D is obtained.
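The two enlargement rules above can be sketched as follows. This is a minimal illustration under assumed conventions (rectangles are (x, y, w, h) with (x, y) the top-left corner and y increasing downward); the function names enlarge_keep_vertex and enlarge_keep_center are hypothetical, not taken from the disclosure.

def enlarge_keep_vertex(region, factor, vertex="bottom_left"):
    """Enlarge a region by `factor` while keeping one vertex fixed."""
    x, y, w, h = region
    new_w, new_h = w * factor, h * factor
    if vertex == "bottom_left":
        # the bottom-left corner (x, y + h) stays in place, as in FIG. 3B
        return (x, y + h - new_h, new_w, new_h)
    # otherwise keep the top-left corner fixed
    return (x, y, new_w, new_h)

def enlarge_keep_center(region, factor):
    """Enlarge a region by `factor` while keeping its center fixed, as in FIGS. 3C and 3D."""
    x, y, w, h = region
    cx, cy = x + w / 2, y + h / 2
    new_w, new_h = w * factor, h * factor
    return (cx - new_w / 2, cy - new_h / 2, new_w, new_h)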
According to the embodiment of the disclosure, if the region to be enlarged spatially conflicts with other content to be displayed, a correction area is determined according to the conflict processing rule, and the target object is enlarged to the correction area for display. For example, when two target regions exist in the same pane and the two regions to be enlarged overlap, the two regions may be reduced so that they do not overlap, thereby determining the correction areas. Further embodiments are described in detail below.
According to the embodiment of the present disclosure, the above description takes the terminal device as the execution subject, but the method of the embodiment of the present disclosure may also be partially or completely executed by the server. For example, the server may sort the received surveillance video data according to the panes, recognize the surveillance video data, determine the final enlarged area according to the enlargement rule and the conflict processing rule, and then send the final picture to be displayed to the terminal device for display.
According to the technical solutions of the embodiments of the present disclosure, the surveillance video data is displayed and recognized, and when a target object is recognized, the region where the target object is located is enlarged for display according to the enlargement rule and the conflict processing rule. This ensures that multiple video channels can be observed simultaneously while key information can be viewed enlarged, so that details are not easily missed, different requirements are accommodated, and screen utilization is improved.
Fig. 4 schematically illustrates a flow chart for determining a correction region according to an embodiment of the present disclosure.
As shown in fig. 4, operation S230 may include operations S410 and S420.
In operation S410, in a case where the quasi zoom-in region conflicts with an existing zoom-in region within a current pane, display parameters of the quasi zoom-in region and the existing zoom-in region are adjusted so that the quasi zoom-in region does not overlap with the existing zoom-in region.
In operation S420, a correction area is determined based on the adjusted display parameter.
For example, as shown in FIG. 5A, two target objects are identified in the same surveillance video data, and after the target areas are respectively processed according to the enlargement rule, two regions to be enlarged, A1-1 and A2-1, are obtained. However, the two regions to be enlarged partially overlap, so that a portion of the key content cannot be seen. In this case, the two regions may be adjusted so that they do not overlap. For example, the magnification may be reduced, or the position of a region may be moved while its magnification is reduced. The adjusted regions are shown in FIG. 5B as correction areas A1-2 and A2-2.
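One possible way to realize this adjustment is sketched below, reusing the enlarge_keep_center helper from the earlier sketch; it simply lowers the magnification in small steps until the two regions no longer overlap. This is an assumption about one workable strategy, not the patent's prescribed algorithm, and the function names are hypothetical.

def rects_overlap(a, b):
    """Axis-aligned overlap test for (x, y, w, h) rectangles."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def resolve_same_pane_conflict(target_a, target_b, factor, step=0.1):
    """Return non-overlapping correction areas for two target regions in one pane."""
    f = factor
    while f > 1.0:
        region_a = enlarge_keep_center(target_a, f)
        region_b = enlarge_keep_center(target_b, f)
        if not rects_overlap(region_a, region_b):
            return region_a, region_b
        f -= step  # reduce the magnification and try again
    # fall back to the original target regions if no magnification above 1x fits
    return target_a, target_b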
According to the embodiment of the disclosure, in the case that the region to be enlarged conflicts with the boundary of the current pane, if the preset parameter indicates that display exceeding the pane is allowed, the region to be enlarged is determined as the correction area, such as area A3-1 shown in FIG. 7A. If the preset parameter indicates that display exceeding the pane is not allowed, the region to be enlarged needs to be adjusted.
Fig. 6 schematically illustrates a flow chart for determining a correction region according to another embodiment of the present disclosure.
As shown in fig. 6, operation S230 may include operations S610 and S620.
In operation S610, if the preset parameter indicates that the display exceeding the pane is not allowed in the case that the area to be enlarged conflicts with the boundary of the current pane, the display parameter of the area to be enlarged is adjusted so that the area to be enlarged does not exceed the boundary of the current pane.
In operation S620, a correction region is determined based on the adjusted display parameter.
For example, as shown in FIG. 7B, the region to be enlarged can be adjusted so that it does not exceed the pane. For example, the magnification may be reduced, or the position of the region may be moved while the magnification is reduced. The adjusted region is shown in FIG. 7B as correction area A3-2.
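A minimal sketch of this boundary adjustment is given below, assuming (x, y, w, h) rectangles; shrinking and shifting in this simple form may change the aspect ratio, and the function name clamp_to_pane is hypothetical rather than taken from the disclosure.

def clamp_to_pane(region, pane):
    """Shift and, if necessary, shrink `region` so that it lies inside `pane`."""
    x, y, w, h = region
    px, py, pw, ph = pane
    # shrink first if the region is larger than the pane
    w, h = min(w, pw), min(h, ph)
    # then shift the region so it does not cross the pane boundary
    x = min(max(x, px), px + pw - w)
    y = min(max(y, py), py + ph - h)
    return (x, y, w, h)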
Fig. 8 schematically illustrates a flow chart for determining a correction region according to another embodiment of the present disclosure.
As shown in fig. 8, operation S230 may include operations S810 and S820.
In operation S810, in a case where the quasi zoom-in region conflicts with an existing zoom-in region in another pane, display parameters of the quasi zoom-in region and the existing zoom-in region are adjusted so that the quasi zoom-in region and the existing zoom-in region are displayed in the respective panes.
In operation S820, a correction area is determined based on the adjusted display parameter.
For example, as shown in FIG. 9A, two target objects are identified in two channels of surveillance video data, and after the target areas are respectively processed according to the enlargement rule, two regions to be enlarged, A4-1 and C1, are obtained. However, the two regions to be enlarged partially overlap, so that a portion of the key content cannot be seen. In this case, the two regions may be adjusted so that they do not overlap; for example, the magnification may be reduced, or the position of a region may be moved while its magnification is reduced. In particular, the adjusted regions may each be displayed within their respective panes without crossing pane boundaries. The adjusted regions are shown in FIG. 9B as correction areas A4-2 and C2.
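Under the same assumptions, this cross-pane rule can be sketched by constraining each enlarged region to its own pane, reusing the clamp_to_pane helper from the previous sketch; again this is an illustrative strategy rather than the patent's exact procedure.

def resolve_cross_pane_conflict(region_a, pane_a, region_c, pane_c):
    """Keep each enlarged region inside its own pane so they cannot overlap across panes."""
    return clamp_to_pane(region_a, pane_a), clamp_to_pane(region_c, pane_c)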
Fig. 10A and 10B schematically illustrate a flowchart of a pane screen playing method according to another embodiment of the present disclosure.
As shown in fig. 10A and 10B, the method includes operations S1001-S1011 and S1021-S1029.
As shown in fig. 10A, in operation S1001, setting parameters are read.
According to the embodiment of the disclosure, the administrator can configure multi-pane playback, including the enlargement rule and the conflict processing rule. For example, the administrator can set whether an identified region is enlarged automatically; set the coordinates of the enlargement origin, so that enlargement starts from the starting vertex of the identified region or from its center; set the magnification factor for automatic enlargement; set, when a pane contains multiple enlarged regions, whether they are allowed to overlap or are automatically adjusted to non-overlapping sizes; set whether display beyond the pane is allowed; and set whether, when the pane size is exceeded and other panes also need enlargement, the region is automatically scaled back into its own pane, and so on.
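The setting parameters described above might be grouped as in the following sketch; the field names are assumptions chosen for illustration and do not come from the disclosure.

from dataclasses import dataclass

@dataclass
class PaneSettings:
    auto_enlarge: bool = True          # enlarge automatically once a target region is identified
    enlarge_from_center: bool = True   # False: enlarge from the starting vertex of the region
    magnification: float = 2.0         # magnification factor for automatic enlargement
    allow_overlap: bool = False        # allow overlapping enlarged regions within one pane
    allow_exceed_pane: bool = False    # allow display beyond the pane boundary
    auto_fit_to_pane: bool = True      # scale back into the pane when its size is exceeded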
In operation S1002, the recognition result generated by the system for the surveillance video data is read.
According to the embodiment of the disclosure, after data from a camera is received, it is first decoded to obtain YUV data (YUV is a color coding method expressed in terms of luminance and chrominance). The decoded YUV data is then analyzed (for example, using OpenCV or other tools): the picture is compared with a preset comparison picture to identify whether it contains a specified target object, such as a portrait or an alarm device. If the comparison succeeds, the coordinates (x, y) of the identified region, the image size (w, h), and the pixel data of the region are recorded. A picture may contain a plurality of regions to be enlarged, all of which need to be recorded, for example: enlargement region 1 = { (x, y), w, h, data[] }, enlargement region 2 = { (x, y), w, h, data[] }.
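A hedged sketch of this recording step follows. The disclosure does not name a specific comparison algorithm, so OpenCV template matching is used here purely as a stand-in, and the frame is assumed to have already been converted from YUV to BGR; the function and field names are illustrative.

import cv2
import numpy as np

def identify_regions(frame_bgr, template_bgr, threshold=0.8):
    """Return {coordinate, size, data} records for regions matching a preset comparison picture."""
    result = cv2.matchTemplate(frame_bgr, template_bgr, cv2.TM_CCOEFF_NORMED)
    h, w = template_bgr.shape[:2]
    records = []
    for y, x in zip(*np.where(result >= threshold)):
        records.append({
            "coordinate": (int(x), int(y)),        # identified region coordinates (x, y)
            "size": (w, h),                        # image size (w, h)
            "data": frame_bgr[y:y + h, x:x + w],   # pixel data of the region
        })
    return records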
In operation S1003, if the target object is recognized, operation S1005 is performed, otherwise, operation S1004 is performed, and normal display is performed.
In operation S1005, it is determined whether to automatically zoom in according to the setting parameters, if so, operation S1006 is performed, and if not, operation S1004 is performed, normal display is performed.
In operation S1006, the magnification factor and the coordinates of the magnification source point are read, and the size and position of the magnified image, i.e., the display position of the region to be magnified, are calculated.
In operation S1007, it is determined whether multiple target areas exist in the same channel of surveillance video data; if so, operation S1008 is performed, otherwise the process proceeds to the pane determination flow shown in FIG. 10B.
In operation S1008, it is determined whether the target areas overlap, if so, operation S1009 is performed, otherwise, the process proceeds to the pane determination flow illustrated in fig. 10B.
In operation S1009, it is determined whether overlapping is allowed according to the setting parameters; if so, operation S1011 is performed to retain the originally calculated positions, otherwise operation S1010 is performed to recalculate the image sizes and positions so that they do not overlap. The process then enters the pane determination flow shown in FIG. 10B.
As shown in fig. 10B, in operation S1021, it is determined whether the enlarged image exceeds the pane, and if so, operation S1023 is performed, otherwise, operation S1022 is performed, and display is enlarged in the pane.
In operation S1023, it is determined whether the display of the over pane is allowed according to the setting parameters, and if so, operation S1025 is performed to display the over pane; otherwise, operation S1024 is performed, the size and position of the image are recalculated, and the image is adjusted to the maximum display in the pane.
In operation S1026, it is determined whether a target object is also detected in another pane and requires enlarged display; if so, operation S1027 is performed.
In operation S1027, it is determined according to the setting parameters whether superimposed display is permitted. If superimposed display is allowed, operation S1028 is performed to display the two target regions superimposed; for example, an enlarged image that appears later covers an earlier enlarged image. If superimposed display is not allowed, the sizes and positions of the correction areas in the two panes are recalculated, and each is adjusted to the maximum display within its pane.
According to the technical solutions of the embodiments of the present disclosure, the surveillance video data is displayed and recognized, and when a target object is recognized, the region where the target object is located is enlarged for display according to the enlargement rule and the conflict processing rule. This ensures that multiple video channels can be observed simultaneously while key information can be viewed enlarged, so that details are not easily missed, different requirements are accommodated, and screen utilization is improved.
Based on the same inventive concept, the present disclosure also provides a pane playing device, and the pane playing device according to the embodiment of the present disclosure is described below with reference to fig. 11.
Fig. 11 schematically shows a block diagram of a pane screen playback device 1100 according to an embodiment of the present disclosure.
The apparatus 1100 may be implemented as part or all of an electronic device through software, hardware, or a combination of both.
As shown in fig. 11, the pane playing apparatus 1100 includes a first presentation module 1110, a first determination module 1120, a second determination module 1130, and a second presentation module 1140. The pane playing apparatus 1100 may perform various methods described above.
A first presentation module 1110 configured to present the received surveillance video data in a pane.
A first determining module 1120 configured to determine a region to be enlarged based on an enlargement rule in case that it is recognized that the surveillance video data includes a target object.
A second determining module 1130 configured to determine a revised region based on a conflict handling rule in case the to-zoom-in region results in a conflict.
A second display module 1140 configured to enlarge the target object to the modified area for display.
According to the technical solutions of the embodiments of the present disclosure, the surveillance video data is displayed and recognized, and when a target object is recognized, the region where the target object is located is enlarged for display according to the enlargement rule and the conflict processing rule. This ensures that multiple video channels can be observed simultaneously while key information can be viewed enlarged, so that details are not easily missed, different requirements are accommodated, and screen utilization is improved.
Fig. 12 is a schematic structural diagram of a computer system suitable for implementing the pane screen playing method and apparatus according to the embodiment of the present disclosure.
As shown in fig. 12, the computer system 1200 includes a processing unit 1201 which can execute various processes in the above-described embodiments according to a program stored in a Read Only Memory (ROM) 1202 or a program loaded from a storage section 1208 into a Random Access Memory (RAM) 1203. In the RAM 1203, various programs and data necessary for the operation of the system 1200 are also stored. The processing unit 1201, the ROM 1202, and the RAM 1203 are connected to each other by a bus 1204. An input/output (I/O) interface 1205 is also connected to bus 1204.
The following components are connected to the I/O interface 1205: an input section 1206 including a keyboard, a mouse, and the like; an output portion 1207 including a display device such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 1208 including a hard disk and the like; and a communication section 1209 including a network interface card such as a LAN card, a modem, or the like. The communication section 1209 performs communication processing via a network such as the internet. A driver 1210 is also connected to the I/O interface 1205 as needed. A removable medium 1211, such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like, is mounted on the drive 1210 as necessary, so that a computer program read out therefrom is mounted into the storage section 1208 as necessary. The processing unit 1201 can be implemented as a CPU, a GPU, a TPU, an FPGA, an NPU, or other processing units.
In particular, according to embodiments of the present disclosure, the methods described above may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program tangibly embodied on a computer-readable medium, the computer program comprising program code for performing the above-described method. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 1209, and/or installed from the removable medium 1211.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units or modules described in the embodiments of the present disclosure may be implemented by software or by programmable hardware. The units or modules described may also be provided in a processor, and the names of the units or modules do not in some cases constitute a limitation of the units or modules themselves.
As another aspect, the present disclosure also provides a computer-readable storage medium, which may be a computer-readable storage medium included in the electronic device or the computer system in the above embodiments; or it may be a separate computer readable storage medium not incorporated into the device. The computer readable storage medium stores one or more programs for use by one or more processors in performing the methods described in the present disclosure.
The foregoing description is only exemplary of the preferred embodiments of the disclosure and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention in the present disclosure is not limited to the specific combination of the above-mentioned features, but also encompasses other embodiments in which any combination of the above-mentioned features or their equivalents is made without departing from the inventive concept. For example, technical solutions formed by replacing the above features with (but not limited to) features disclosed in this disclosure having similar functions are also encompassed.

Claims (10)

1. A method for playing a pane is characterized by comprising the following steps:
displaying the received monitoring video data in a pane;
determining a region to be enlarged based on an enlargement rule in a case where it is recognized that the surveillance video data includes a target object;
determining a correction area based on a conflict processing rule under the condition that the quasi-amplification area causes conflict; and
amplifying the target object to the correction area for displaying.
2. The method of claim 1, wherein the target object comprises a person and/or an alarm device.
3. The method of claim 1, wherein determining the area to be magnified based on the magnification rule comprises:
keeping the position of a vertex of the region where the target object is located unchanged, and determining a region to be amplified based on a preset amplification factor; or
keeping the position of the central point of the region where the target object is located unchanged, and determining a region to be amplified based on a preset amplification factor.
4. The method according to any one of claims 1-3, wherein in the case that the quasi-amplification region causes a collision, determining a modification region based on a collision handling rule comprises:
under the condition that the to-be-enlarged area conflicts with the existing enlarged area in the current window pane, adjusting display parameters of the to-be-enlarged area and the existing enlarged area so as to enable the to-be-enlarged area and the existing enlarged area not to overlap;
and determining a correction area based on the adjusted display parameters.
5. The method according to any one of claims 1-3, wherein in the case that the quasi-amplification region causes a collision, determining a modification region based on a collision handling rule comprises:
and under the condition that the area to be enlarged conflicts with the boundary of the current pane, if the preset parameter shows that the display exceeding the pane is allowed, determining the area to be enlarged as a correction area.
6. The method according to any one of claims 1-3, wherein in the case that the quasi-amplification region causes a collision, determining a modification region based on a collision handling rule comprises:
under the condition that the to-be-amplified region conflicts with the boundary of the current pane, if preset parameters indicate that the display exceeding the pane is not allowed, adjusting display parameters of the to-be-amplified region so that the to-be-amplified region does not exceed the boundary of the current pane;
and determining a correction area based on the adjusted display parameters.
7. A method according to any one of claims 1-3, wherein said determining a revised region based on a collision handling rule in case said quasi-magnified region results in a collision comprises:
when the to-be-enlarged area conflicts with the existing enlarged area in other panes, adjusting display parameters of the to-be-enlarged area and the existing enlarged area so that the to-be-enlarged area and the existing enlarged area are displayed in the respective panes;
and determining a correction area based on the adjusted display parameters.
8. A pane playback apparatus, comprising:
the first display module is configured to display the received monitoring video data in a pane;
a first determination module configured to determine a region to be enlarged based on an enlargement rule in a case where it is recognized that the surveillance video data includes a target object;
a second determination module configured to determine a revised region based on a collision processing rule in a case where the quasi-magnified region causes a collision; and
the second display module is configured to enlarge the target object to the correction area for display.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
10. A computer readable storage medium having computer readable instructions stored thereon which, when executed by a processor, cause the processor to perform the method of any one of claims 1-7.
CN202211610688.6A 2022-12-14 2022-12-14 Pane playing method and device, electronic equipment and medium Active CN115941861B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211610688.6A CN115941861B (en) 2022-12-14 2022-12-14 Pane playing method and device, electronic equipment and medium


Publications (2)

Publication Number Publication Date
CN115941861A true CN115941861A (en) 2023-04-07
CN115941861B CN115941861B (en) 2023-09-26

Family

ID=86555266

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211610688.6A Active CN115941861B (en) 2022-12-14 2022-12-14 Pane playing method and device, electronic equipment and medium

Country Status (1)

Country Link
CN (1) CN115941861B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110010193A1 (en) * 2008-02-26 2011-01-13 Koninklijke Philips Electronics N.V. Zoom pane for a central monitoring device
CN102291571A (en) * 2011-08-11 2011-12-21 杭州华三通信技术有限公司 Method and device for realizing frame-pulling scaling in monitoring system
CN110032701A (en) * 2019-04-04 2019-07-19 网易(杭州)网络有限公司 Image shows control method, device, storage medium and electronic equipment
CN110203803A (en) * 2019-06-06 2019-09-06 快意电梯股份有限公司 Escalator safeguard method and device based on AI intelligent monitoring
CN111316637A (en) * 2019-12-19 2020-06-19 威创集团股份有限公司 Spliced wall image content identification windowing display method and related device
CN111857493A (en) * 2020-06-16 2020-10-30 佛山市华全电气照明有限公司 Video management method and system based on smart city management system
CN113891040A (en) * 2021-09-24 2022-01-04 深圳Tcl新技术有限公司 Video processing method, video processing device, computer equipment and storage medium


Also Published As

Publication number Publication date
CN115941861B (en) 2023-09-26


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: Room 209, 2nd Floor, Building 3, No. 588 Caolong Road, Songjiang District, Shanghai, 2016

Patentee after: Shanghai Sany Electronic Technology Co.,Ltd.

Country or region after: China

Address before: 201612 Room 611, Building 1, No. 299, Zhongchen Road, Songjiang District, Shanghai

Patentee before: Shanghai Sany Electronic Technology Co.,Ltd.

Country or region before: China