CN115941861B - Pane playing method and device, electronic equipment and medium - Google Patents


Info

Publication number
CN115941861B
CN115941861B · Application CN202211610688.6A
Authority: CN (China)
Prior art keywords: area, pane, determining, region, enlarged
Legal status: Active
Application number: CN202211610688.6A
Other languages: Chinese (zh)
Other versions: CN115941861A (en)
Inventors: 董黎晨, 景杰, 刘碧波, 孟亚光
Current Assignee: Shanghai Sany Electronic Technology Co ltd
Original Assignee: Shanghai Sany Electronic Technology Co ltd
Application filed by Shanghai Sany Electronic Technology Co ltd
Priority: CN202211610688.6A
Publication of CN115941861A
Application granted
Publication of CN115941861B


Classifications

  • Closed-Circuit Television Systems (AREA)

Abstract

The embodiments of the present disclosure provide a pane playing method, a pane playing device, an electronic device, and a medium. The pane playing method includes: displaying received monitoring video data in a pane; determining a region to be enlarged based on an enlargement rule when the monitoring video data is recognized to include a target object; determining a correction region based on a conflict processing rule when the region to be enlarged causes a conflict; and enlarging the target object to the correction region for display.

Description

Pane playing method and device, electronic equipment and medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method and apparatus for playing a pane, an electronic device, and a medium.
Background
Monitoring scenarios often require viewing the pictures of multiple cameras simultaneously. The pictures of multiple cameras can be displayed simultaneously in a multi-pane layout, but each picture then has a smaller display area. As a result, although the resolution of the received video data is sufficient, the pictures are scaled down to fit the panes, and the image the user views is not sufficiently clear. How to view the pictures of multiple cameras simultaneously on a limited screen while preserving image clarity is a problem to be solved urgently.
Disclosure of Invention
In order to solve the problems in the related art, embodiments of the present disclosure provide a method, an apparatus, an electronic device, and a medium for playing a pane.
One aspect of the embodiments of the present disclosure provides a pane playing method, including: displaying received monitoring video data in a pane; determining a region to be enlarged based on an enlargement rule when the monitoring video data is recognized to include a target object; determining a correction region based on a conflict processing rule when the region to be enlarged causes a conflict; and enlarging the target object to the correction region for display.
According to an embodiment of the present disclosure, the target object comprises a person and/or an alarm device.
According to an embodiment of the present disclosure, determining the region to be enlarged based on the enlargement rule includes keeping the position of a vertex of the region where the target object is located unchanged and determining the region to be enlarged based on a preset magnification factor.
According to an embodiment of the present disclosure, determining the region to be enlarged based on the enlargement rule includes keeping the position of the center point of the region where the target object is located unchanged and determining the region to be enlarged based on a preset magnification factor.
According to an embodiment of the present disclosure, determining a correction region based on a conflict processing rule when the region to be enlarged causes a conflict includes: when the region to be enlarged conflicts with an existing enlarged region within the current pane, adjusting the display parameters of the region to be enlarged and the existing enlarged region so that they do not overlap; and determining the correction region based on the adjusted display parameters.
According to an embodiment of the present disclosure, determining a correction region based on a conflict processing rule when the region to be enlarged causes a conflict includes: when the region to be enlarged conflicts with the boundary of the current pane, if a preset parameter indicates that the display is allowed to exceed the pane, determining that the region to be enlarged is the correction region.
According to an embodiment of the present disclosure, determining a correction region based on a conflict processing rule when the region to be enlarged causes a conflict includes: when the region to be enlarged conflicts with the boundary of the current pane, if a preset parameter indicates that the display is not allowed to exceed the pane, adjusting the display parameters of the region to be enlarged so that it does not exceed the boundary of the current pane; and determining the correction region based on the adjusted display parameters.
According to an embodiment of the present disclosure, determining a correction region based on a conflict processing rule when the region to be enlarged causes a conflict includes: when the region to be enlarged conflicts with an existing enlarged region in another pane, adjusting the display parameters of the region to be enlarged and the existing enlarged region so that each is displayed within its own pane; and determining the correction region based on the adjusted display parameters.
Another aspect of the embodiments of the present disclosure provides a pane playing device, including a first display module, a first determining module, a second determining module, and a second display module. The first display module is configured to display the received monitoring video data in a pane; the first determining module is configured to determine a region to be enlarged based on an enlargement rule when the monitoring video data is recognized to include a target object; the second determining module is configured to determine a correction region based on a conflict processing rule when the region to be enlarged causes a conflict; and the second display module is configured to enlarge the target object to the correction region for display.
Another aspect of an embodiment of the present disclosure provides an electronic device comprising at least one processor and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to cause the processor to implement the pane playing method as described above.
Another aspect of the disclosed embodiments provides a computer-readable storage medium having stored thereon computer-readable instructions that, when executed by a processor, cause the processor to implement a pane playing method as described above.
Another aspect of an embodiment of the present disclosure provides a computer program which, when executed by a processor, causes the processor to implement a pane playing method as described above.
According to the technical solution of the embodiments of the present disclosure, the monitoring video data is displayed and recognized; when a target object is recognized, the region where the target object is located is enlarged for display according to the enlargement rule and the conflict processing rule. This ensures that multiple video channels can be observed simultaneously while key information can be viewed enlarged, so that details are less likely to be missed, different requirements are met, and screen utilization is improved.
Drawings
Other features, objects and advantages of the present disclosure will become more apparent from the following detailed description of non-limiting embodiments, taken in conjunction with the accompanying drawings. In the drawings:
fig. 1A schematically illustrates a system architecture diagram of a pane playing method or apparatus to which embodiments of the present disclosure are applied;
fig. 1B schematically illustrates a schematic view of a pane to which embodiments of the present disclosure are applied;
fig. 2 schematically illustrates a flowchart of a pane playing method of an embodiment of the present disclosure;
FIGS. 3A-3D schematically illustrate a schematic view of an area where an enlarged target object of an embodiment of the present disclosure is located;
FIG. 4 schematically illustrates a flow chart of determining a correction zone in an embodiment of the present disclosure;
FIGS. 5A and 5B schematically illustrate schematic diagrams of determining a correction region according to embodiments of the present disclosure;
FIG. 6 schematically illustrates a flow chart of determining a correction zone in accordance with another embodiment of the present disclosure;
FIGS. 7A and 7B schematically illustrate schematic diagrams of determining a correction region in an embodiment of the present disclosure;
FIG. 8 schematically illustrates a flow chart of determining a correction zone in accordance with another embodiment of the present disclosure;
FIGS. 9A and 9B schematically illustrate schematic diagrams of determining a correction region according to embodiments of the present disclosure;
fig. 10A and 10B schematically illustrate a flowchart of a pane playing method according to another embodiment of the present disclosure;
fig. 11 schematically illustrates a block diagram of a pane playing device of an embodiment of the present disclosure; and
fig. 12 schematically illustrates a structural diagram of a computer system suitable for implementing the pane playing method and apparatus of the embodiments of the present disclosure.
Detailed Description
Hereinafter, exemplary embodiments of the present disclosure will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily implement them. In addition, for the sake of clarity, portions irrelevant to description of the exemplary embodiments are omitted in the drawings.
In this disclosure, it should be understood that terms such as "comprises" or "comprising," etc., are intended to indicate the presence of features, numbers, steps, acts, components, portions, or combinations thereof disclosed in this specification, and are not intended to exclude the possibility that one or more other features, numbers, steps, acts, components, portions, or combinations thereof are present or added.
In addition, it should be noted that, without conflict, the embodiments of the present disclosure and features of the embodiments may be combined with each other. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
As described above, displaying the pictures of multiple cameras simultaneously in multiple panes leaves each picture a smaller display area. As a result, although the resolution of the received video data is sufficient, the pictures are scaled down to fit the panes, and the image the user views is not sufficiently clear.
In view of the foregoing, an aspect of the embodiments of the present disclosure provides a pane playing method, including: displaying received monitoring video data in a pane; determining a region to be enlarged based on an enlargement rule when the monitoring video data is recognized to include a target object; determining a correction region based on a conflict processing rule when the region to be enlarged causes a conflict; and enlarging the target object to the correction region for display. Because the region where the target object is located is enlarged for display according to the enlargement rule and the conflict processing rule, multiple video channels can be observed simultaneously while key information can be viewed enlarged, so that details are less likely to be missed, different requirements are met, and screen utilization is improved.
The following describes in detail the technical solutions provided by the embodiments of the present disclosure with reference to the accompanying drawings.
Fig. 1A schematically illustrates a system architecture diagram of a pane playing method or apparatus to which embodiments of the present disclosure are applied.
As shown in fig. 1A, a system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The terminal devices 101, 102, 103 interact with the server 105 via the network 104 to receive or send messages or the like. Various client applications can be installed on the terminal devices 101, 102, 103. Such as browser-like applications, search-like applications, instant messaging-like tools, etc. The terminal devices 101, 102, 103 can display the monitoring video data from the server 105 in a multi-pane manner.
The terminal devices 101, 102, 103 may be hardware or software. When the terminal devices 101, 102, 103 are hardware, they may be a variety of special purpose or general purpose electronic devices including, but not limited to, smartphones, tablets, laptop and desktop computers, and the like. When the terminal devices 101, 102, 103 are software, they can be installed in the above-listed electronic devices. Which may be implemented as multiple software or software modules (e.g., multiple software or software modules for providing distributed services) or as a single software or software module.
The server 105 may be a server providing various services, such as a back-end server providing services for client applications installed on the terminal devices 101, 102, 103. The server 105 may be communicatively connected to a plurality of monitoring cameras, receive monitoring video data from the monitoring cameras, and forward it to the terminal device 101, 102, or 103.
The server 105 may be hardware or software. When the server 105 is hardware, it may be implemented as a distributed server cluster formed by a plurality of servers, or as a single server. When server 105 is software, it may be implemented as multiple software or software modules (e.g., multiple software or software modules for providing distributed services), or as a single software or software module.
The pane playing method provided by the embodiment of the present disclosure may be executed by the server 105, for example, or may be executed by the terminal devices 101, 102, 103. Alternatively, the pane playing method of the embodiment of the present disclosure may be partially executed by the terminal apparatuses 101, 102, 103, and the other portions are executed by the server 105.
It should be understood that the number of terminal devices, networks and servers in fig. 1A is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Fig. 1B schematically illustrates a schematic view of a pane to which embodiments of the present disclosure are applied.
According to the embodiment of the disclosure, the display area of the terminal device may be divided into a plurality of panes. For example, as shown in fig. 1B, a certain display area may be divided into A, B, C, D four panes. In other embodiments, the display area may be divided into other numbers of panes, such as 2, 6, 8, 9, 12, 15, 16, for example. Each pane may be used to present a path of surveillance video data.
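The division of a display area into equal panes can be sketched as a simple grid computation. The function below is an illustrative sketch only; the name, the `(x, y, w, h)` tuple representation, and the integer-grid assumption are ours, not from the disclosure:

```python
def pane_rects(width, height, cols, rows):
    """Divide a display area into a cols x rows grid of (x, y, w, h) panes."""
    pw, ph = width // cols, height // rows
    return [(c * pw, r * ph, pw, ph) for r in range(rows) for c in range(cols)]

# A 1920x1080 area split into four panes, as in the A/B/C/D layout of fig. 1B:
panes = pane_rects(1920, 1080, 2, 2)
```

Other pane counts (2, 6, 8, 9, 12, 15, 16) follow by changing `cols` and `rows`.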
Fig. 2 schematically illustrates a flowchart of a pane playing method of an embodiment of the present disclosure.
As shown in fig. 2, the pane playing method includes operations S210 to S240.
In operation S210, the received monitoring video data is displayed in a pane.
In operation S220, in case that the monitoring video data is recognized to include a target object, a region to be enlarged is determined based on an enlargement rule.
In operation S230, when the region to be enlarged causes a conflict, a correction region is determined based on a conflict processing rule.
In operation S240, the target object is enlarged to the correction area for display.
According to embodiments of the present disclosure, each pane may be used to present one path of monitoring video data. Meanwhile, the monitoring video data may be analyzed to identify whether it contains a target object. Various existing recognition algorithms may be used, for example various suitable neural network models.
According to embodiments of the present disclosure, the target object may include a person, since people are considered important picture information and can be recognized as such. The target object may also include an alarm device. The administrator may add pictures of the devices to be recognized and train the recognition model, for example for alarm devices with alarm indicator lights, especially those that require focused viewing. In this way, the system can recognize the alarm device during playback.
According to embodiments of the present disclosure, after the target object is recognized, the region where the target object is located, hereinafter simply referred to as the target region, may be determined. The target region may be, for example, the circumscribed rectangle of the target object. The target region may then be enlarged to show the details of the target object more clearly. For example, if a portrait is recognized, the target region where the portrait is located can be enlarged and played, so that in the multi-pane playing mode multiple camera pictures are still played simultaneously while important information can be seen clearly.
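The circumscribed rectangle of a detected object is simply the axis-aligned bounding box of its points; a minimal sketch (the function name and the point representation are assumptions for illustration):

```python
def bounding_box(points):
    """Axis-aligned circumscribed rectangle (x, y, w, h) of (x, y) points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x, y = min(xs), min(ys)
    return (x, y, max(xs) - x, max(ys) - y)
```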
According to embodiments of the present disclosure, the magnification rules may be actively configured. For example, the administrator may configure the magnification to be 1.5 times, 2 times, 3 times, or the like. The location of the enlarged region may also be configured. In some embodiments of the present disclosure, the position of one vertex of the area where the target object is located may be kept unchanged, and the area to be amplified is determined based on a preset magnification. In other embodiments of the present disclosure, the location of the center point of the area where the target object is located may be kept unchanged, and the area to be amplified is determined based on a preset magnification.
For example, in pane A, a target object is identified in area A1, as shown in fig. 3A. If the position of the lower-left vertex is kept unchanged, enlarging the target area A1 by a factor of two yields the region to be enlarged A1' shown in fig. 3B. If the target area is enlarged by a factor of two while keeping the center position unchanged, the region to be enlarged A1' shown in fig. 3C is obtained. If the center position is kept unchanged and the target area is enlarged by a factor of three, the region to be enlarged A1' shown in fig. 3D is obtained.
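The two anchoring choices above — keeping a vertex fixed or keeping the center fixed — amount to simple rectangle scaling. A hedged sketch with rectangles as `(x, y, w, h)` tuples (names are ours; the fixed vertex is taken as the rectangle's `(x, y)` corner):

```python
def enlarge_from_vertex(rect, k):
    """Scale rect by factor k, keeping its (x, y) vertex fixed (cf. fig. 3B)."""
    x, y, w, h = rect
    return (x, y, w * k, h * k)

def enlarge_from_center(rect, k):
    """Scale rect by factor k, keeping its center fixed (cf. figs. 3C and 3D)."""
    x, y, w, h = rect
    cx, cy = x + w / 2, y + h / 2
    return (cx - w * k / 2, cy - h * k / 2, w * k, h * k)
```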
According to embodiments of the present disclosure, if the region to be enlarged spatially conflicts with other content to be displayed, a correction region is determined according to the conflict processing rule, and the target object is enlarged to the correction region for display. For example, when two target regions exist in the same pane and the two regions to be enlarged overlap, the two regions may be reduced so that they do not overlap, thereby determining the correction regions. Further embodiments are described in detail below.
The method above is described with the terminal device as the executing entity; according to embodiments of the present disclosure, it may also be partially or completely executed by the server. For example, the server may lay out the received monitoring video data according to the panes, recognize the monitoring video data at the same time, determine the final enlarged area according to the enlargement rule and the conflict processing rule, and then send the final picture to be displayed to the terminal device for display.
According to the technical solution of the embodiments of the present disclosure, the monitoring video data is displayed and recognized; when a target object is recognized, the region where the target object is located is enlarged for display according to the enlargement rule and the conflict processing rule. This ensures that multiple video channels can be observed simultaneously while key information can be viewed enlarged, so that details are less likely to be missed, different requirements are met, and screen utilization is improved.
Fig. 4 schematically illustrates a flow chart of determining a correction area according to an embodiment of the present disclosure.
As shown in fig. 4, operation S230 may include operations S410 and S420.
In operation S410, when the region to be enlarged conflicts with an existing enlarged region within the current pane, the display parameters of the region to be enlarged and the existing enlarged region are adjusted so that they do not overlap.
In operation S420, a correction area is determined based on the adjusted display parameter.
For example, as shown in fig. 5A, two target objects are identified in the same path of monitoring video data, and their target regions are processed according to the enlargement rule to obtain two regions to be enlarged, A1-1 and A2-1. However, the two regions to be enlarged partially overlap, so that part of the key content is not visible. In this case, the two regions may be adjusted so that they do not overlap. For example, the magnification may be reduced, or the position of each region may be moved while the magnification is reduced. The adjusted regions may be the correction regions A1-2 and A2-2 shown in fig. 5B.
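One way to realize this adjustment is to lower a shared magnification factor until the two candidate regions no longer intersect. The sketch below assumes center-anchored enlargement and a fixed decrement step; it is illustrative only, not the disclosure's algorithm:

```python
def enlarge_from_center(rect, k):
    """Scale an (x, y, w, h) rect by k about its center."""
    x, y, w, h = rect
    cx, cy = x + w / 2, y + h / 2
    return (cx - w * k / 2, cy - h * k / 2, w * k, h * k)

def overlaps(a, b):
    """Axis-aligned rectangle intersection test."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def resolve_overlap(t1, t2, k, step=0.1, k_min=1.0):
    """Reduce the magnification until the two enlarged regions stop overlapping."""
    while k > k_min and overlaps(enlarge_from_center(t1, k), enlarge_from_center(t2, k)):
        k -= step
    return enlarge_from_center(t1, k), enlarge_from_center(t2, k)
```

Moving the regions apart instead of (or in addition to) shrinking them, as the text also allows, would be a straightforward variation.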
According to embodiments of the present disclosure, when the region to be enlarged conflicts with the boundary of the current pane, if the preset parameter indicates that the display is allowed to exceed the pane, the region to be enlarged is determined to be the correction region, such as the region A3-1 shown in fig. 7A. If the preset parameter indicates that the display is not allowed to exceed the pane, the region to be enlarged needs to be adjusted.
Fig. 6 schematically illustrates a flow chart of determining a correction area according to another embodiment of the present disclosure.
As shown in fig. 6, operation S230 may include operations S610 and S620.
In operation S610, if the preset parameter indicates that the pane display is not allowed to be exceeded, the display parameter of the to-be-enlarged area is adjusted so that the to-be-enlarged area does not exceed the boundary of the current pane.
In operation S620, a correction region is determined based on the adjusted display parameter.
For example, as shown in FIG. 7B, the region to be enlarged may be adjusted so that it does not exceed the pane. For example, the magnification may be reduced, or the position of the region may be moved while the magnification is reduced. The adjusted region may be the correction region A3-2 shown in FIG. 7B.
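Keeping the enlarged region inside the pane can be sketched as shrink-then-shift clamping — a simplification of the "reduce magnification or move position" adjustment described above (names and the exact strategy are our assumptions):

```python
def clamp_to_pane(region, pane):
    """Shrink region to fit the pane, then shift it inside the pane bounds."""
    x, y, w, h = region
    px, py, pw, ph = pane
    w, h = min(w, pw), min(h, ph)        # reduce size if it cannot fit at all
    x = min(max(x, px), px + pw - w)     # shift horizontally into the pane
    y = min(max(y, py), py + ph - h)     # shift vertically into the pane
    return (x, y, w, h)
```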
Fig. 8 schematically illustrates a flow chart of determining a correction area according to another embodiment of the present disclosure.
As shown in fig. 8, operation S230 may include operations S810 and S820.
In operation S810, when the region to be enlarged conflicts with an existing enlarged region in another pane, the display parameters of the region to be enlarged and the existing enlarged region are adjusted so that each is displayed within its own pane.
In operation S820, a correction area is determined based on the adjusted display parameter.
For example, as shown in fig. 9A, two target objects are identified in two paths of monitoring video data, and their target regions are processed according to the enlargement rule to obtain two regions to be enlarged, A4-1 and C1. However, the two regions partially overlap, so that part of the key content is not visible. In this case, the two regions may be adjusted so that they do not overlap. For example, the magnification may be reduced, or the position of each region may be moved while the magnification is reduced, so that each adjusted region is displayed within its own pane without crossing pane boundaries. The adjusted regions may be the correction regions A4-2 and C2 shown in fig. 9B.
Fig. 10A and 10B schematically illustrate a flowchart of a pane playing method according to another embodiment of the present disclosure.
As shown in fig. 10A and 10B, the method includes operations S1001 to S1011 and S1021 to S1029.
As shown in fig. 10A, in operation S1001, a setting parameter is read.
According to embodiments of the present disclosure, an administrator may configure multi-pane playback, including the enlargement rules and the conflict processing rules. For example: whether to enlarge automatically after a region to be enlarged is identified; the coordinates of the enlargement origin, starting either from the starting coordinates of the identified region or from its center; the magnification factor for automatic enlargement; whether, when a pane contains multiple enlarged regions, overlap is allowed or the regions are automatically adjusted to non-overlapping sizes; whether the display is allowed to exceed the pane; and, if the pane size is exceeded or other panes also need enlargement, whether the regions are automatically reduced to fit within the panes, and so on.
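The settings enumerated above can be grouped into one configuration object. The sketch below is hypothetical — field names, types, and defaults are ours, not from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class ZoomSettings:
    auto_zoom: bool = True           # enlarge automatically once a region is identified
    factor: float = 2.0              # magnification factor for automatic enlargement
    anchor: str = "center"           # enlargement origin: "center" or "vertex"
    allow_overlap: bool = False      # allow multiple enlarged regions to overlap
    allow_exceed_pane: bool = False  # allow the enlarged display to exceed the pane
    shrink_into_pane: bool = True    # auto-reduce into the pane when it would exceed
```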
In operation S1002, the recognition result generated by the system for the monitoring video data is read.
According to embodiments of the present disclosure, after the camera data is received, it is first decoded to obtain YUV data (YUV is a color encoding method that represents luminance and chrominance separately); the decoded YUV data is then analyzed (for example, using the opencv tool); the picture is compared with a preset reference picture to identify whether it contains a specified target object such as a portrait or an alarm device; if the comparison succeeds, the coordinates (x, y) and size (w, h) of the identified region, together with the pixel data of the region, are recorded. A picture may contain multiple regions that need to be enlarged, and all of them need to be recorded, for example: enlarged area 1 {coordinates (x, y), w, h, data[]}, enlarged area 2 {coordinates (x, y), w, h, data[]}.
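Recording the pixel data of a recognized region amounts to cropping the recognized rectangle out of a decoded plane. A minimal sketch on a flat, row-major luma (Y) plane — the function name and flat-list representation are assumptions for illustration:

```python
def crop_y_plane(y_plane, stride, x, y, w, h):
    """Extract the luma pixels of a recognized region (x, y, w, h)
    from a row-major Y plane with the given row stride."""
    return [y_plane[(y + r) * stride + x : (y + r) * stride + x + w]
            for r in range(h)]

# A toy 4x4 plane; crop the 2x2 region at coordinates (1, 1):
plane = list(range(16))
region_pixels = crop_y_plane(plane, 4, 1, 1, 2, 2)
```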
In operation S1003, if a target object is identified, operation S1005 is performed; otherwise operation S1004 is performed and the data is displayed normally.
In operation S1005, whether to enlarge automatically is determined according to the setting parameters; if so, operation S1006 is performed, otherwise operation S1004 is performed and the data is displayed normally.
In operation S1006, the magnification and the magnification source point coordinates are read, and the size and position of the magnified image, that is, the display position of the region to be magnified, are calculated.
In operation S1007, it is determined whether there are multiple target areas in the same path of monitoring video data; if so, operation S1008 is performed, otherwise the pane judgment flow shown in fig. 10B is entered.
In operation S1008, it is determined whether or not a plurality of target areas overlap, and if so, operation S1009 is performed, otherwise, the flow of pane determination shown in fig. 10B is entered.
In operation S1009, whether overlap is allowed is determined according to the setting parameters; if so, operation S1011 is performed and the originally calculated positions are retained; otherwise operation S1010 is performed and the image sizes and positions are recalculated so that the regions do not overlap. The flow then proceeds to the pane judgment flow shown in fig. 10B.
As shown in fig. 10B, in operation S1021, it is determined whether the enlarged image exceeds the pane, and if so, operation S1023 is performed, otherwise, operation S1022 is performed to enlarge the display in the pane.
In operation S1023, whether the display is allowed to exceed the pane is determined according to the setting parameters; if so, operation S1025 is performed and the image is displayed beyond the pane; otherwise, operation S1024 is performed to recalculate the image size and position and adjust it to the maximum display within the pane.
In operation S1026, if another pane has also detected a target object and requires enlarged display, operation S1027 is performed.
In operation S1027, whether superimposed display is allowed is determined according to the setting parameters. If superimposed display is allowed, operation S1028 is performed, in which the two target areas are displayed superimposed; for example, a later-appearing enlarged image covers an earlier enlarged image. If superimposed display is not allowed, the display is adjusted to the maximum display within the panes; for example, the size and position of the correction areas are recalculated in the two panes respectively and adjusted to the maximum display within each pane.
According to the technical solution of the embodiments of the present disclosure, the monitoring video data is displayed and recognized; when a target object is recognized, the region where the target object is located is enlarged for display according to the enlargement rule and the conflict processing rule. This ensures that multiple video channels can be observed simultaneously while key information can be viewed enlarged, so that details are less likely to be missed, different requirements are met, and screen utilization is improved.
Based on the same inventive concept, the present disclosure further provides a pane playing device, and a pane playing device according to an embodiment of the present disclosure is described below with reference to fig. 11.
Fig. 11 schematically illustrates a block diagram of a pane playing apparatus 1100 of an embodiment of the present disclosure.
The apparatus 1100 may be implemented as part or all of an electronic device by software, hardware, or a combination of both.
As shown in fig. 11, the pane playing device 1100 includes a first display module 1110, a first determination module 1120, a second determination module 1130, and a second display module 1140. The pane playing device 1100 may perform the various methods described above.
The first display module 1110 is configured to display the received monitoring video data in a pane.
The first determination module 1120 is configured to determine a region to be enlarged based on an enlargement rule in the case that the monitoring video data is recognized as including a target object.
The second determination module 1130 is configured to determine a correction area based on a conflict handling rule in the case that the region to be enlarged causes a conflict.
The second display module 1140 is configured to enlarge the target object to the correction area for display.
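The four modules above could be wired together roughly as follows; the `PanePlayer` class and its callable parameters are hypothetical placeholders for this sketch, not part of the disclosed apparatus:

```python
class PanePlayer:
    """Toy skeleton of apparatus 1100: four pluggable processing stages."""

    def __init__(self, recognize, enlarge_rule, conflict_rule, render):
        self.recognize = recognize          # detects a target object in a frame
        self.enlarge_rule = enlarge_rule    # target -> region to be enlarged
        self.conflict_rule = conflict_rule  # (region, pane) -> correction area
        self.render = render                # draws the frame (and region) in the pane

    def show_frame(self, frame, pane):
        self.render(frame, pane)                   # first display module 1110
        target = self.recognize(frame)
        if target is None:                         # no target: plain display only
            return None
        region = self.enlarge_rule(target)         # first determination module 1120
        region = self.conflict_rule(region, pane)  # second determination module 1130
        return self.render(frame, pane, region)    # second display module 1140
```

Any detector, enlargement rule, and conflict rule with these call shapes could be plugged in, which mirrors how the modules are described independently of one another.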
According to the technical solution of the embodiments of the present disclosure, the monitoring video data is displayed and analyzed, and when a target object is recognized, the area where it is located is enlarged for display according to the enlargement rule and the conflict handling rule. Multiple video channels can thus be observed simultaneously while key information is enlarged for viewing, so details are less likely to be missed, different requirements are met, and screen utilization is improved.
Fig. 12 schematically illustrates a structural diagram of a computer system suitable for implementing the pane playing method and apparatus of the embodiments of the present disclosure.
As shown in fig. 12, the computer system 1200 includes a processing unit 1201 which can execute various processes in the above embodiments according to a program stored in a Read Only Memory (ROM) 1202 or a program loaded from a storage section 1208 into a Random Access Memory (RAM) 1203. In the RAM 1203, various programs and data required for the operation of the system 1200 are also stored. The processing unit 1201, the ROM 1202, and the RAM 1203 are connected to each other through a bus 1204. An input/output (I/O) interface 1205 is also connected to the bus 1204.
The following components are connected to the I/O interface 1205: an input section 1206 including a keyboard, a mouse, and the like; an output section 1207 including a display such as a cathode ray tube (CRT) or a liquid crystal display (LCD), a speaker, and the like; a storage section 1208 including a hard disk or the like; and a communication section 1209 including a network interface card such as a LAN card or a modem. The communication section 1209 performs communication processing via a network such as the Internet. A drive 1210 is also connected to the I/O interface 1205 as needed. A removable medium 1211, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 1210 as needed, so that a computer program read therefrom can be installed into the storage section 1208. The processing unit 1201 may be implemented as, for example, a CPU, GPU, TPU, FPGA, or NPU.
In particular, according to embodiments of the present disclosure, the methods described above may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program tangibly embodied on a machine-readable medium, the computer program containing program code for performing the methods described above. In such an embodiment, the computer program can be downloaded and installed from a network via the communication section 1209 and/or installed from the removable medium 1211.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units or modules referred to in the embodiments of the present disclosure may be implemented in software or in programmable hardware. The units or modules described may also be provided in a processor, the names of which in some cases do not constitute a limitation of the unit or module itself.
As another aspect, the present disclosure also provides a computer-readable storage medium, which may be the computer-readable storage medium included in the electronic device or computer system of the above embodiments, or a stand-alone computer-readable storage medium that is not assembled into a device. The computer-readable storage medium stores one or more programs that are used by one or more processors to perform the methods described in the present disclosure.
The foregoing description covers only the preferred embodiments of the present disclosure and the principles of the technology employed. Those skilled in the art will appreciate that the scope of the invention referred to in this disclosure is not limited to the specific combinations of the features described above, and also encompasses other embodiments formed by any combination of those features or their equivalents without departing from the inventive concept, for example embodiments formed by substituting the features described above with technical features of similar function disclosed in the present disclosure (but not limited thereto).

Claims (8)

1. A pane playing method, comprising:
displaying the received monitoring video data in a pane;
in the case that the monitoring video data is identified as including a target object, determining a region to be enlarged based on an enlargement rule, which comprises: keeping the position of one vertex of the area where the target object is located unchanged, and determining the region to be enlarged based on a preset magnification factor; or keeping the position of the center point of the area where the target object is located unchanged, and determining the region to be enlarged based on a preset magnification factor;
in the case that the region to be enlarged causes a conflict, determining a correction area based on a conflict handling rule, wherein the correction area at least comprises an area determined by adjusting display parameters of the region to be enlarged; and
enlarging the target object to the correction area for display;
wherein determining a correction area based on a conflict handling rule in the case that the region to be enlarged causes a conflict comprises: when the region to be enlarged conflicts with an existing enlarged area in the current pane, adjusting display parameters of the region to be enlarged and of the existing enlarged area so that the two do not overlap; and determining the correction area based on the adjusted display parameters.
2. The method of claim 1, wherein the target object comprises a person and/or an alarm device.
3. The method according to any one of claims 1-2, wherein, in the case that the region to be enlarged causes a conflict, determining a correction area based on a conflict handling rule further comprises:
in the case that the region to be enlarged conflicts with the boundary of the current pane, if preset parameters indicate that exceeding the pane for display is allowed, determining the region to be enlarged as the correction area.
4. The method according to any one of claims 1-2, wherein, in the case that the region to be enlarged causes a conflict, determining a correction area based on a conflict handling rule further comprises:
in the case that the region to be enlarged conflicts with the boundary of the current pane, if preset parameters indicate that exceeding the pane for display is not allowed, adjusting display parameters of the region to be enlarged so that it does not exceed the boundary of the current pane; and
determining the correction area based on the adjusted display parameters.
5. The method according to any one of claims 1-2, wherein, in the case that the region to be enlarged causes a conflict, determining a correction area based on a conflict handling rule further comprises:
when the region to be enlarged conflicts with an existing enlarged area in another pane, adjusting display parameters of the region to be enlarged and of the existing enlarged area so that each is displayed within its own pane; and
determining the correction area based on the adjusted display parameters.
6. A pane playing device, comprising:
a first display module configured to display the received monitoring video data in a pane;
a first determining module configured to determine, in a case where the monitoring video data is recognized to include a target object, a region to be enlarged based on an enlargement rule, including: keeping the position of one vertex of the area where the target object is located unchanged, and determining a to-be-amplified area based on a preset amplification factor; or, keeping the position of the central point of the area where the target object is located unchanged, and determining a to-be-amplified area based on a preset amplification factor;
a second determination module configured to determine a correction area based on a conflict processing rule, the correction area including at least an area determined by adjusting a display parameter of the to-be-enlarged area, in a case where the to-be-enlarged area causes a conflict; the determining a correction area based on a conflict processing rule in the case that the to-be-enlarged area causes a conflict comprises: when the quasi-amplifying region collides with the existing amplifying region in the current pane, adjusting display parameters of the quasi-amplifying region and the existing amplifying region so that the quasi-amplifying region and the existing amplifying region are not overlapped; determining a correction area based on the adjusted display parameters; and
and the second display module is configured to enlarge the target object to the correction area for display.
7. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-5.
8. A computer readable storage medium having stored thereon computer readable instructions which, when executed by a processor, cause the processor to perform the method of any of claims 1-5.
CN202211610688.6A 2022-12-14 2022-12-14 Pane playing method and device, electronic equipment and medium Active CN115941861B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211610688.6A CN115941861B (en) 2022-12-14 2022-12-14 Pane playing method and device, electronic equipment and medium


Publications (2)

Publication Number Publication Date
CN115941861A CN115941861A (en) 2023-04-07
CN115941861B (en) 2023-09-26

Family

ID=86555266

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211610688.6A Active CN115941861B (en) 2022-12-14 2022-12-14 Pane playing method and device, electronic equipment and medium

Country Status (1)

Country Link
CN (1) CN115941861B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102291571A (en) * 2011-08-11 2011-12-21 杭州华三通信技术有限公司 Method and device for realizing frame-pulling scaling in monitoring system
CN110032701A (en) * 2019-04-04 2019-07-19 网易(杭州)网络有限公司 Image shows control method, device, storage medium and electronic equipment
CN110203803A (en) * 2019-06-06 2019-09-06 快意电梯股份有限公司 Escalator safeguard method and device based on AI intelligent monitoring
CN111316637A (en) * 2019-12-19 2020-06-19 威创集团股份有限公司 Spliced wall image content identification windowing display method and related device
CN111857493A (en) * 2020-06-16 2020-10-30 佛山市华全电气照明有限公司 Video management method and system based on smart city management system
CN113891040A (en) * 2021-09-24 2022-01-04 深圳Tcl新技术有限公司 Video processing method, video processing device, computer equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009107006A2 (en) * 2008-02-26 2009-09-03 Koninklijke Philips Electronics, N.V. Zoom pane for a central monitoring device


Also Published As

Publication number Publication date
CN115941861A (en) 2023-04-07

Similar Documents

Publication Publication Date Title
US8396316B2 (en) Method and apparatus for processing image
JP2022528294A (en) Video background subtraction method using depth
US10748007B2 (en) Identifying objects in an image
CN110189336B (en) Image generation method, system, server and storage medium
CN111695540B (en) Video frame identification method, video frame clipping method, video frame identification device, electronic equipment and medium
CN112954450B (en) Video processing method and device, electronic equipment and storage medium
US10390076B2 (en) Image receiving/reproducing device, image generating/transmitting device, display system, image receiving/reproducing method, image generating/transmitting method, and computer readable medium
US10863230B1 (en) Content stream overlay positioning
EP3513326B1 (en) Methods, systems, and media for detecting stereoscopic videos by generating fingerprints for multiple portions of a video frame
US20230351604A1 (en) Image cutting method and apparatus, computer device, and storage medium
US10497396B2 (en) Detecting and correcting whiteboard images while enabling the removal of the speaker
KR20180092494A (en) System and method for refining training image
JP2001005582A (en) System and method for plotting picture-based data
JP7429756B2 (en) Image processing method, device, electronic device, storage medium and computer program
CN115941861B (en) Pane playing method and device, electronic equipment and medium
KR101764998B1 (en) Method and system for filtering image
AU2008264173A1 (en) Splitting a single video stream into multiple viewports based on face detection
CN115134677A (en) Video cover selection method and device, electronic equipment and computer storage medium
CN114140805A (en) Image processing method, image processing device, electronic equipment and storage medium
CN113191210A (en) Image processing method, device and equipment
US10373290B2 (en) Zoomable digital images
US20230254447A1 (en) Session description protocol (sdp) signaling of occlude-free regions in 360 video conferencing
CN113117341B (en) Picture processing method and device, computer readable storage medium and electronic equipment
US20230140042A1 (en) Method and apparatus for signaling occlude-free regions in 360 video conferencing
CN112288774B (en) Mobile detection method, mobile detection device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: Room 209, 2nd Floor, Building 3, No. 588 Caolong Road, Songjiang District, Shanghai, 2016

Patentee after: Shanghai Sany Electronic Technology Co.,Ltd.

Country or region after: China

Address before: 201612 Room 611, Building 1, No. 299, Zhongchen Road, Songjiang District, Shanghai

Patentee before: Shanghai Sany Electronic Technology Co.,Ltd.

Country or region before: China
