CN116668608A - Image processing method, device, equipment and storage medium


Info

Publication number
CN116668608A
CN116668608A
Authority
CN
China
Prior art keywords
image frame
processing
candidate image
time length
duration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211582182.9A
Other languages
Chinese (zh)
Inventor
孙一飞
原育光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Uniview Technologies Co Ltd
Original Assignee
Zhejiang Uniview Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Uniview Technologies Co Ltd filed Critical Zhejiang Uniview Technologies Co Ltd
Priority to CN202211582182.9A priority Critical patent/CN116668608A/en
Publication of CN116668608A publication Critical patent/CN116668608A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/268 Signal distribution or switching

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

The embodiment of the invention discloses an image processing method, apparatus, device and storage medium. The method comprises the following steps: if a new candidate image frame transmitted through any image transmission channel is received, or the processing of the previous target image frame is finished, determining the processing duration and the remaining duration of each candidate image frame, wherein the candidate image frames are unprocessed image frames transmitted through at least two image transmission channels; determining a target image frame from the candidate image frames according to their processing durations and remaining durations; and processing the target image frame, and updating the processing durations and remaining durations of the candidate image frames and the target image frame according to the processed duration of the target image frame. With this technical scheme, video images can be processed on a single data link according to the actual frame rate of each video image, so that image processing quality and resource cost are both taken into account and practical application requirements are better met.

Description

Image processing method, device, equipment and storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image processing method, apparatus, device, and storage medium.
Background
With the development of image processing technology, higher demands are placed on image rendering quality. When video images with multiple frame rates are processed simultaneously, the prior art generally adopts one of two approaches. In the first, video images are buffered in a storage medium at their respective frame rates, then read out at a fixed frame rate and processed over a single data link. In the second, video images are buffered, read and processed at their respective frame rates over multiple data links.
In the first scheme, videos with different frame rates are all read at a fixed frame rate. When the input frame rate is higher than the fixed read frame rate, video frames are dropped and the displayed picture becomes incoherent; when the input frame rate is lower than the fixed read frame rate, frames are duplicated, so the same video is transmitted repeatedly and bandwidth is wasted. In the second scheme, multiple data links are used and each video must have an independent data link for processing, which increases resource cost. Neither scheme can process the video images according to the actual frame rate of each video image on a single data link; that is, neither can balance image processing quality against resource cost, and neither satisfies practical application requirements.
Disclosure of Invention
The invention provides an image processing method, apparatus, device and storage medium, which can process video images according to the actual frame rate of each video image on a single data link, balance image processing quality and resource cost, and better meet practical application requirements.
According to an aspect of the present invention, there is provided an image processing method including:
if a new candidate image frame transmitted through any image transmission channel is received or the processing of the last target image frame is finished, determining the processing time length and the residual time length of the candidate image frame;
wherein the candidate image frames are unprocessed image frames transmitted through at least two image transmission channels, the processing duration represents the time required to process the unprocessed pixels in a candidate image frame, and the remaining duration represents the difference between the time from the current moment until the next candidate image frame arrives on the image transmission channel corresponding to that candidate image frame and the processing duration of the candidate image frame;
determining a target image frame from the candidate image frames according to the processing duration and the residual duration of the candidate image frames;
processing the target image frame, and updating the processing durations and remaining durations of the candidate image frames and the target image frame according to the processed duration of the target image frame; wherein the processed duration is the time from the moment the processing of the target image frame starts to the current moment.
According to another aspect of the present invention, there is provided an image processing apparatus including:
the time length determining module is used for determining the processing time length and the residual time length of the candidate image frames if the new candidate image frames transmitted through any image transmission channel are received or the processing of the last target image frame is finished;
wherein the candidate image frames are unprocessed image frames transmitted through at least two image transmission channels, the processing duration represents the time required to process the unprocessed pixels in a candidate image frame, and the remaining duration represents the difference between the time from the current moment until the next candidate image frame arrives on the image transmission channel corresponding to that candidate image frame and the processing duration of the candidate image frame;
a target image frame determining module, configured to determine a target image frame from the candidate image frames according to a processing duration and a remaining duration of the candidate image frames;
a duration updating module, configured to process the target image frame and to update the processing durations and remaining durations of the candidate image frames and the target image frame according to the processed duration of the target image frame; wherein the processed duration is the time from the moment the processing of the target image frame starts to the current moment.
According to another aspect of the present invention, there is provided an image processing electronic device including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the image processing method according to any one of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to execute the image processing method according to any one of the embodiments of the present invention.
According to the technical scheme of the invention, if a new candidate image frame transmitted through any image transmission channel is received, or the processing of the previous target image frame is finished, the processing duration and remaining duration of each candidate image frame are determined; the candidate image frames are unprocessed image frames transmitted through at least two image transmission channels, the processing duration represents the time required to process the unprocessed pixels in a candidate image frame, and the remaining duration represents the difference between the time from the current moment until the next candidate image frame arrives on the corresponding image transmission channel and the processing duration of the candidate image frame. A target image frame is determined from the candidate image frames according to their processing durations and remaining durations; the target image frame is processed, and the processing durations and remaining durations of the candidate image frames and the target image frame are updated according to the processed duration of the target image frame, where the processed duration is the time from the moment the processing of the target image frame starts to the current moment. With this scheme, video images can be processed on a single data link according to the actual frame rate of each video image, image processing quality and resource cost are both taken into account, display continuity of the video images is ensured, video processing bandwidth is saved and resource occupation is reduced, so that practical application requirements are better met.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of an image processing method according to a first embodiment of the present invention;
fig. 2 is a schematic diagram of an image processing method according to a first embodiment of the present invention;
fig. 3 is a flowchart of an image processing method according to a second embodiment of the present invention;
FIG. 4 is a schematic diagram of an image processing system according to a second embodiment of the present invention;
fig. 5 is a schematic structural view of an image processing apparatus according to a third embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device implementing an image processing method according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, a technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," "target," and the like in the description and claims of the present invention and in the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present application, where the method may be performed by an image processing apparatus, and the image processing apparatus may be implemented in hardware and/or software, and the image processing apparatus may be configured in an electronic device with data processing capability. As shown in fig. 1, the method includes:
s110, if a new candidate image frame transmitted through any image transmission channel is received or the processing of the last target image frame is finished, determining the processing duration and the residual duration of the candidate image frame.
The image transmission channel may be an image acquisition channel, that is, a channel through which an image acquisition device transmits image frames acquired in real time. It may also be a channel that transmits candidate image frames from a storage area to the electronic device executing the scheme of the invention. A candidate image frame is an image frame that has been transmitted through one of at least two image transmission channels and has not yet been completely processed. The candidate image frames may include the newly received candidate image frame and the other candidate image frames that existed before the new frame was received. The previous target image frame is the image frame that was being processed before the new candidate image frame was received. The processing duration represents the time required to process the unprocessed pixels in a candidate image frame. The remaining duration represents the difference between the time from the current moment until the next candidate image frame arrives on the image transmission channel corresponding to that candidate image frame and the processing duration of the candidate image frame.
In this embodiment, if a new candidate image frame transmitted through any image transmission channel is received or the processing of the previous target image frame is finished, the processing duration and the remaining duration of the candidate image frame need to be determined. If the processing of the previous target image frame is finished, the processing time length of the candidate image frame can be kept unchanged, and the difference between the remaining time length of the candidate image frame and the processing time length of the previous target image frame is used as the new remaining time length of the candidate image frame.
In this embodiment, optionally, if a new candidate image frame transmitted through any image transmission channel is received, determining the processing duration and remaining duration of the candidate image frame includes: determining the time required to process all pixels of the new candidate image frame according to the ratio of the resolution of the new candidate image frame to the number of pixels the image processing device can process per unit time, and determining the processing duration of the new candidate image frame from that time; and determining the duration occupied by the new candidate image frame according to the frame rate of the video to which it belongs, and taking the difference between the occupied duration and the processing duration as the remaining duration of the new candidate image frame.
The image processing apparatus may refer to an apparatus capable of performing image processing, such as a computer. The unit time may be set according to actual requirements, and the present embodiment is not limited thereto, and for example, the unit time may be set to 1s or 1 μs. The frame rate may refer to the number of images transmitted per second and may be used to characterize the image transmission speed. Wherein, the larger the frame rate, the faster the image transmission speed.
In the present embodiment, if the candidate image frame is a new candidate image frame, the resolution of the new candidate image frame and the number of pixels processed by the image processing device per unit time (i.e., the processing capability of the device) are first determined; the time required to process all pixels of the new candidate image frame is then obtained from the resolution and the processing capability, and the processing duration of the new candidate image frame is determined from that time. For example, assuming the resolution of the new candidate image frame is 1920×1080 pixels and the processing capability of the image processing device is 300 Mpixel/s, the ratio of the resolution to the number of pixels processed per unit time is 1920×1080 / (300×10⁶ pixel/s) = 6912 μs. The processing duration may be obtained, for example, by taking this ratio directly, or by rounding it (to the nearest value, up, or down) to an integer consistent with the minimum timing unit of the timer. The duration occupied by the new candidate image frame is then determined from the frame rate of the video to which it belongs; for example, the reciprocal of the frame rate may be taken as the occupied duration. Assuming the frame rate of the video is 60 fps, the occupied duration is 1/60 s ≈ 16667 μs. Finally, the difference between the occupied duration and the processing duration is taken as the remaining duration of the new candidate image frame; continuing the example, the remaining duration is 16667 - 6912 = 9755 μs.
With this arrangement, when the candidate image frame is a new candidate image frame, its processing duration can be determined quickly and accurately from its resolution and the number of pixels processed per unit time by the image processing device, and its remaining duration can be determined quickly and accurately from the frame rate of the video to which it belongs and the processing duration. A worked sketch of this calculation is given below.
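As a minimal illustration of the calculation above, the following Python sketch computes the processing duration and the remaining duration of a newly received frame from its resolution, the device throughput and the video frame rate. It is not code from the patent; the function name new_frame_durations and the microsecond rounding are assumptions made for the example.

def new_frame_durations(width: int, height: int,
                        pixels_per_second: float, frame_rate: float):
    """Return (processing_duration_us, remaining_duration_us) for a new frame."""
    # Time needed to process every pixel of the frame, in microseconds.
    processing_us = width * height / pixels_per_second * 1_000_000
    # Time the frame occupies on its channel: the frame interval 1 / frame_rate.
    frame_interval_us = 1_000_000 / frame_rate
    # Remaining duration = frame interval minus processing duration.
    remaining_us = frame_interval_us - processing_us
    return round(processing_us), round(remaining_us)

# 1920x1080 at 60 fps on a 300 Mpixel/s device (the numbers used in the text):
print(new_frame_durations(1920, 1080, 300e6, 60))   # -> (6912, 9755)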
In this embodiment, optionally, if the candidate image frame is not a new candidate image frame, determining its processing duration and remaining duration includes: for each candidate image frame, taking the processing duration and remaining duration updated according to the processed duration of the previous target image frame as the processing duration and remaining duration of that candidate image frame.
In this embodiment, if the candidate image frame is not a new candidate image frame, for each candidate image frame, the processing duration and the remaining duration updated according to the processed duration of the previous target image frame may be used as the processing duration and the remaining duration of the candidate image frame. Specifically, if the processing of the previous target image frame is finished, the processing time length of each candidate image frame may be kept unchanged, and the remaining time length of each candidate image frame may be updated according to the difference between the remaining time length of the candidate image frame and the processed time length of the previous target image frame, and meanwhile, the processing time length of the previous target image frame becomes zero.
It should be noted that the set of candidate image frames is not fixed and can be adjusted dynamically according to the image processing situation. If a candidate image frame is selected as the target image frame, it becomes the target image frame, that is, the image frame being processed at the current moment. If the updated processing duration of the previous target image frame is not zero, the previous target image frame is taken as a candidate image frame again. Specifically, a non-zero updated processing duration of the previous target image frame indicates that a new candidate image frame arrived before the processing of the previous target image frame was finished; in that case the previous target image frame must be treated as a candidate image frame again, so that the target image frame is re-determined from the candidate image frames.
By the arrangement, the processing time length and the residual time length of the candidate image frames can be rapidly and accurately determined.
S120, determining a target image frame from the candidate image frames according to the processing duration and remaining duration of the candidate image frames.
In this embodiment, after determining the processing time length and the remaining time length of the candidate image frame, the target image frame may be determined from the candidate image frames according to the processing time length and the remaining time length. Optionally, determining the target image frame from the candidate image frames according to the processing duration and the remaining duration of the candidate image frames includes: and selecting the target image frame from the candidate image frames with the processing time length not being zero according to the residual time length of the candidate image frames.
In this embodiment, the target image frame may be selected, according to the remaining durations of the candidate image frames, from the candidate image frames whose processing duration is not zero. A non-zero processing duration indicates that the candidate image frame has not been processed or has not been completely processed; if the processing duration is zero, the corresponding image frame has been fully processed and is no longer a candidate image frame. For example, from the candidate image frames with a non-zero processing duration, a candidate image frame with a relatively small remaining duration and/or processing duration may be selected as the target image frame, with the remaining duration as the primary factor. For instance, selecting the candidate image frame with the smallest remaining duration as the target image frame ensures image processing quality and avoids incoherent display of the video picture.
With this arrangement, the target image frame can be determined from the candidate image frames with a non-zero processing duration according to their remaining durations, so that image processing quality is ensured and incoherent display of the video picture is avoided.
In this embodiment, optionally, selecting the target image frame from the candidate image frames whose processing duration is not zero, according to their remaining durations, includes: determining, from the candidate image frames with a non-zero processing duration, the candidate image frame with the smallest remaining duration as the target image frame; or, when the remaining durations are equal, determining the candidate image frame with the smallest non-zero processing duration as the target image frame; or, when both the remaining durations and the non-zero processing durations are equal, randomly selecting one of those candidate image frames as the target image frame.
In this embodiment, the candidate image frame with the smallest remaining duration is selected from the candidate image frames with a non-zero processing duration as the target image frame, which guarantees image processing quality. If the remaining durations of the candidate image frames are equal, the candidate image frame with the smallest non-zero processing duration is determined as the target image frame, which improves processing efficiency. If both the processing durations and the remaining durations are equal, the processing order is not restricted and a candidate image frame with a non-zero processing duration may be selected at random as the target image frame.
With this arrangement, the scheme balances image processing quality against processing efficiency and better meets practical application requirements. A selection rule along these lines is sketched below.
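The sketch below is an illustrative Python reading of that selection rule, not the patent's implementation; representing the candidates as a dict of (processing_us, remaining_us) pairs and the name pick_target are assumptions.

def pick_target(candidates):
    """candidates: dict mapping channel id -> (processing_us, remaining_us)."""
    runnable = {ch: d for ch, d in candidates.items() if d[0] != 0}
    if not runnable:
        return None  # nothing left to process
    # Smallest remaining duration first; smallest processing duration as tie-breaker.
    # Fully tied entries fall back to insertion order here, which stands in for the
    # "random choice" case described in the text.
    return min(runnable, key=lambda ch: (runnable[ch][1], runnable[ch][0]))

candidates = {"Ch1": (6912, 9755), "Ch2": (6912, 13088), "Ch3": (6912, 26421)}
print(pick_target(candidates))   # -> Ch1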
S130, processing the target image frame, and updating the processing durations and remaining durations of the candidate image frames and the target image frame according to the processed duration of the target image frame; the processed duration is the time from the moment the processing of the target image frame starts to the current moment.
In this embodiment, after the target image frame is determined, it may be processed by the image processing device, and the processing durations and remaining durations of the candidate image frames and the target image frame are then updated according to the processed duration of the target image frame. It should be noted that if the processing of the target image frame is completed, its processed duration equals the time required to process all of its pixels; if a new candidate image frame is received before the target image frame finishes processing, the processed duration of the target image frame is less than the time required to process all of its pixels.
In this embodiment, optionally, updating the processing durations and remaining durations of the candidate image frames and the target image frame according to the processed duration of the target image frame includes: taking the difference between the remaining duration of a candidate image frame and the processed duration as the updated remaining duration of that candidate image frame, while keeping its processing duration unchanged; and taking the difference between the processing duration of the target image frame and the processed duration as the updated processing duration of the target image frame, while keeping its remaining duration unchanged.
In other words, when the durations of a candidate image frame are updated, the processed duration of the target image frame is subtracted from its remaining duration and its processing duration stays the same; when the durations of the target image frame are updated, the processed duration is subtracted from its processing duration and its remaining duration stays the same.
It should be noted that, in this embodiment, if there is idle time, that is, a period between the end of processing the previous target image frame and the start of processing the current target image frame during which no image frame is processed, the processing duration and the remaining duration are also updated according to the idle time: the idle time is subtracted from the processing duration to obtain the updated processing duration, and subtracted from the remaining duration to obtain the updated remaining duration. A sketch of the update rule described above follows.
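For clarity, the update step can be sketched as follows. This is an illustrative Python reading, not the patent's implementation; the bookkeeping as a dict of (processing_us, remaining_us) pairs and the name update_durations are assumptions, and elapsed_us stands for the processed duration of the target frame.

def update_durations(candidates, target_ch, elapsed_us):
    """Subtract the processed duration elapsed_us from every frame's bookkeeping."""
    updated = {}
    for ch, (processing_us, remaining_us) in candidates.items():
        if ch == target_ch:
            # Target frame: processing duration shrinks, remaining duration unchanged.
            updated[ch] = (processing_us - elapsed_us, remaining_us)
        else:
            # Other candidates: processing duration unchanged, remaining duration shrinks.
            updated[ch] = (processing_us, remaining_us - elapsed_us)
    return updated

state = {"Ch1": (6912, 9755), "Ch2": (6912, 13088), "Ch3": (6912, 26421)}
print(update_durations(state, "Ch1", 6912))
# -> {'Ch1': (0, 9755), 'Ch2': (6912, 6176), 'Ch3': (6912, 19509)}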
According to the technical scheme of this embodiment, if a new candidate image frame transmitted through any image transmission channel is received, or the processing of the previous target image frame is finished, the processing duration and remaining duration of each candidate image frame are determined; the candidate image frames are unprocessed image frames transmitted through at least two image transmission channels, the processing duration represents the time required to process the unprocessed pixels in a candidate image frame, and the remaining duration represents the difference between the time from the current moment until the next candidate image frame arrives on the corresponding image transmission channel and the processing duration of the candidate image frame. A target image frame is determined from the candidate image frames according to their processing durations and remaining durations; the target image frame is processed, and the processing durations and remaining durations of the candidate image frames and the target image frame are updated according to the processed duration of the target image frame, where the processed duration is the time from the moment the processing of the target image frame starts to the current moment. With this scheme, video images can be processed on a single data link according to the actual frame rate of each video image, image processing quality and resource cost are both taken into account, display continuity of the video images is ensured, video processing bandwidth is saved and resource occupation is reduced, so that practical application requirements are better met.
Fig. 2 is a schematic diagram of an image processing method according to an embodiment of the invention, where Ch1, Ch2 and Ch3 denote three image transmission channels. The image frames transmitted through Ch1-Ch3 all have a resolution of 1920×1080, with frame rates of 60 fps, 50 fps and 30 fps respectively; the durations occupied by their candidate image frames are therefore 1/60 s ≈ 16667 μs, 1/50 s = 20000 μs and 1/30 s ≈ 33333 μs. Assuming that candidate image frames of the three channels arrive simultaneously at time 0 and that the image processing device processes 300 Mpixel/s, the processing durations and remaining durations of the candidate image frames at time 0 are: Ch1, processing duration 6912 μs, remaining duration 9755 μs; Ch2, processing duration 6912 μs, remaining duration 13088 μs; Ch3, processing duration 6912 μs, remaining duration 26421 μs.
Since the candidate image frames of Ch1-Ch3 arrive simultaneously at time 0, a prioritization operation is triggered to determine the target image frame. At time 0, the candidate image frame of channel Ch1 has a non-zero processing duration and the smallest remaining duration, so it is processed first as the target image frame. The Ch1 target image frame finishes processing at time 6912, having taken 6912 μs, and prioritization is triggered again. At time 6912, the processing duration of the Ch1 target image frame is updated to 0 and its remaining duration stays at 9755; the processing duration of the Ch2 candidate image frame stays at 6912 and its remaining duration is updated to 6176 (i.e., 13088 - 6912); the processing duration of the Ch3 candidate image frame stays at 6912 and its remaining duration is updated to 19509 (i.e., 26421 - 6912). At time 6912, the Ch2 candidate image frame has a non-zero processing duration and the smallest remaining duration, so it is processed as the target image frame. Ch2 finishes processing at time 13824, having taken 6912 μs, and prioritization is triggered again. At time 13824, the processing duration and remaining duration of the Ch1 candidate image frame remain unchanged (remaining duration 9755); the processing duration of the Ch2 target image frame remains 0 and its remaining duration stays at 6176; the processing duration of the Ch3 candidate image frame stays at 6912 and its remaining duration is updated to 12597 (i.e., 19509 - 6912). At time 13824, only the Ch3 candidate image frame has a non-zero processing duration, so it is processed as the target image frame.
At time 16667, channel Ch1 receives a new candidate image frame; its processing duration is set to 6912 and its remaining duration to 9755, and a prioritization is triggered. At this moment the Ch3 target image frame has not finished processing, so it is taken as a candidate image frame again; its processing duration is updated to 6912 - (16667 - 13824) = 4069 and its remaining duration stays at 12597. At time 16667, the Ch1 candidate image frame has a non-zero processing duration and the smallest remaining duration, so it is processed first as the target image frame.
At time 20000, channel Ch2 receives a new candidate image frame; its processing duration is set to 6912 and its remaining duration to 13088, and a prioritization is triggered. At this moment the Ch1 target image frame has not finished processing, so it is taken as a candidate image frame again; its processing duration is updated to 6912 - (20000 - 16667) = 3579 and its remaining duration stays at 9755. At time 20000, the Ch3 candidate image frame has a non-zero processing duration and the smallest remaining duration, so it is processed first as the target image frame. At time 24069 the Ch3 target image frame finishes processing, having taken 4069 μs, and prioritization is triggered again. The priority of each video stream is evaluated and processed cyclically in this way, which ensures that every video stream is processed at its own frame rate. The following sketch replays this schedule.
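The following is a simplified, illustrative Python simulation of the schedule above, under stated assumptions (each channel delivers fixed-size frames at a fixed interval, a finished frame drops out of the candidate set, and no new frame arrives on a channel before its previous frame is done within the simulated window); it is not the patent's implementation, and all names are illustrative.

def simulate(channels, horizon_us):
    """channels: {name: (frame_interval_us, processing_us)}; prints each scheduling decision."""
    state = {}                                   # name -> (processing_left_us, remaining_us)
    arrivals = {ch: 0 for ch in channels}        # next arrival time per channel
    now = 0
    while now < horizon_us:
        # 1. Admit frames that have arrived by `now`.
        for ch, (interval, proc) in channels.items():
            if arrivals[ch] <= now:
                state[ch] = (proc, interval - proc)
                arrivals[ch] += interval
        # 2. Pick the target: non-zero processing duration, smallest remaining duration.
        runnable = {c: d for c, d in state.items() if d[0] > 0}
        if not runnable:
            now = min(arrivals.values())         # idle until the next arrival
            continue
        target = min(runnable, key=lambda c: (runnable[c][1], runnable[c][0]))
        # 3. Run until the target finishes or the next frame arrives, whichever comes first.
        run = min(state[target][0], min(arrivals.values()) - now)
        print(f"t={now:>6} process {target} for {run} us")
        for c in list(state):
            p, r = state[c]
            state[c] = (p - run, r) if c == target else (p, r - run)
            if state[c][0] <= 0:
                del state[c]                     # finished frames leave the candidate set
        now += run

simulate({"Ch1": (16667, 6912), "Ch2": (20000, 6912), "Ch3": (33333, 6912)}, 24069)
# Printed schedule: Ch1 at t=0, Ch2 at t=6912, Ch3 at t=13824 (interrupted),
# Ch1 at t=16667 (interrupted), Ch3 at t=20000, finishing at t=24069.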
Example two
Fig. 3 is a flowchart of an image processing method according to a second embodiment of the present invention, which is optimized on the basis of the above embodiment. The optimization is as follows: the method further includes: if a new candidate image frame transmitted through any image transmission channel is received and the processing of the previous target image frame is not finished, recording the processed pixel rows of the previous target image frame, taking the previous target image frame as a candidate image frame, and, when it is determined as the target image frame again, resuming processing from the row after the last processed pixel row.
As shown in fig. 3, the method of this embodiment specifically includes the following steps:
s210, if a new candidate image frame transmitted through any image transmission channel is received or the processing of the last target image frame is finished, determining the processing duration and the residual duration of the candidate image frame.
The candidate image frames are unprocessed image frames transmitted through at least two image transmission channels; the processing duration represents the time required to process the unprocessed pixels in a candidate image frame; and the remaining duration represents the difference between the time from the current moment until the next candidate image frame arrives on the image transmission channel corresponding to that candidate image frame and the processing duration of the candidate image frame.
S220, determining a target image frame from the candidate image frames according to the processing duration and remaining duration of the candidate image frames.
S230, processing the target image frame, and updating the processing durations and remaining durations of the candidate image frames and the target image frame according to the processed duration of the target image frame; the processed duration is the time from the moment the processing of the target image frame starts to the current moment.
The specific implementation of S210-S230 may be referred to in the detailed description of S110-S130, and will not be described herein.
S240, if a new candidate image frame transmitted through any image transmission channel is received and the processing of the previous target image frame is not finished, recording the processed pixel rows of the previous target image frame, taking the previous target image frame as a candidate image frame, and, when it is determined as the target image frame again, resuming processing from the row after the last processed pixel row.
The processed pixel rows are the rows of the previous target image frame that have already been processed. In this embodiment, if a new candidate image frame transmitted through any image transmission channel is received and the processing of the previous target image frame is not finished, the target image frame must be re-determined and the processing of the previous target image frame is interrupted. To improve processing efficiency, the processed pixel rows of the previous target image frame are recorded, so that when it is selected as the target image frame again, processing can continue directly from the row after the last processed row, avoiding re-processing of rows that are already done. While the target image frame is being re-determined, the previous target image frame is treated as a candidate image frame. A minimal bookkeeping sketch follows.
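The sketch below shows, in illustrative Python, the row bookkeeping just described. The class name FrameProgress and its fields are assumptions; a hardware implementation would keep an equivalent line count rather than a Python object.

from dataclasses import dataclass

@dataclass
class FrameProgress:
    total_rows: int
    processed_rows: int = 0          # rows of this frame already processed

    def process_rows(self, n: int) -> None:
        """Record that n more pixel rows have been processed (capped at the frame height)."""
        self.processed_rows = min(self.processed_rows + n, self.total_rows)

    def resume_row(self) -> int:
        """Row index to continue from when this frame is selected as the target again."""
        return self.processed_rows   # i.e. the row after the last processed row

    @property
    def done(self) -> bool:
        return self.processed_rows >= self.total_rows

frame = FrameProgress(total_rows=1080)
frame.process_rows(640)              # interrupted after processing rows 0..639
print(frame.resume_row())            # -> 640: processing resumes from row 640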
According to the technical scheme of this embodiment, if a new candidate image frame transmitted through any image transmission channel is received and the processing of the previous target image frame is not finished, the processed pixel rows of the previous target image frame are recorded, the previous target image frame is taken as a candidate image frame, and when it is determined as the target image frame again, processing resumes from the row after the last processed row. With this scheme, video images can be processed on a single data link according to the actual frame rate of each video image, image processing quality and resource cost are both taken into account, display continuity of the video images is ensured, video processing bandwidth is saved, and resource occupation is reduced.
Fig. 4 is a schematic diagram of an image processing system according to the second embodiment of the present invention. The image processing system is a video splicing control system that can process video streams with multiple different frame rates through a single data link. As shown in Fig. 4, the system may include a video acquisition unit, a pulse sequencing unit, a video stream control unit, a video storage unit, a storage medium and a video processing unit. In the video splicing system there are multiple video acquisition units (three in Fig. 4 as an example); each video acquisition unit is an image transmission channel and may be located in a different external chip, such as an FPGA (Field Programmable Gate Array), a DSP (Digital Signal Processor) or an SoC (System on Chip). Specifically, the video acquisition unit acquires video images from its input source and transmits the acquired image data to the video storage unit. Meanwhile, the video acquisition unit counts the frame rate information and resolution information of each video input source and transmits the frame rate information, the resolution information and the frame pulse signals to the pulse sequencing unit.
The pulse sequencing unit processes the frame pulses and frame rate information transmitted by the video acquisition units, sorts the image transmission channels, selects the channel with the highest priority, and determines the image frame of that channel as the target image frame. Specifically, for each input source video, the pulse sequencing unit provides two timers: a frame processing timer and a frame idle timer. The frame processing timer records the processing duration of the image frame, and the frame idle timer records its remaining duration. Each time a new video stream frame pulse arrives (i.e., a new candidate image frame is received), or the processing of the target image frame is completed, a prioritization is triggered to re-determine the target image frame.
The video stream control unit generates read commands for the video and sends them to the video storage unit. Specifically, it works in units of video lines and internally records a line count value for each video stream being processed. It also supports suspending the current video stream, saving the line count value it is processing (i.e., the processed pixel rows), switching to the higher-priority video stream (i.e., the re-determined target image frame), reading the line count recorded for that stream and generating the corresponding read command.
The video storage unit stores the video image data from the video acquisition units in the storage medium, and reads and transmits video image data from the storage medium according to the commands of the video stream control unit. The storage medium stores the video image data of each input source. The video processing unit performs data processing on the video image data read from the storage medium and may comprise one or more units. The video processing units are the performance-limiting path in the system; each video processing unit may be located in a different hardware chip, connected by high-speed interfaces.
Example III
Fig. 5 is a schematic structural diagram of an image processing apparatus according to a third embodiment of the present invention, where the image processing apparatus may execute the image processing method according to any embodiment of the present invention, and the image processing apparatus has functional modules and beneficial effects corresponding to the execution method. As shown in fig. 5, the apparatus includes:
a duration determining module 310, configured to determine a processing duration and a remaining duration of the candidate image frame if a new candidate image frame transmitted through any one of the image transmission channels is received or processing of a previous target image frame is completed;
wherein the candidate image frames are unprocessed image frames transmitted through at least two image transmission channels, the processing duration represents the time required to process the unprocessed pixels in a candidate image frame, and the remaining duration represents the difference between the time from the current moment until the next candidate image frame arrives on the image transmission channel corresponding to that candidate image frame and the processing duration of the candidate image frame;
a target image frame determining module 320, configured to determine a target image frame from the candidate image frames according to the processing duration and remaining duration of the candidate image frames;
a duration updating module 330, configured to process the target image frame, and update the processing durations of the candidate image frame and the target image frame and the remaining duration according to the processed duration of the target image frame; the processed duration is a duration from the moment when the processing of the target image frame starts to the current moment.
Optionally, the duration updating module 330 includes:
a candidate image frame duration updating unit, configured to take the difference between the remaining duration of the candidate image frame and the processed duration as the updated remaining duration of the candidate image frame, and to keep the processing duration of the candidate image frame unchanged;
a target image frame duration updating unit, configured to take the difference between the processing duration of the target image frame and the processed duration as the updated processing duration of the target image frame, and to keep the remaining duration of the target image frame unchanged.
Optionally, the target image frame determining module 320 includes:
a target image frame determining unit, configured to select the target image frame from the candidate image frames whose processing duration is not zero, according to the remaining durations of the candidate image frames.
Optionally, the target image frame determining unit is configured to:
determining, from the candidate image frames with a non-zero processing duration, the candidate image frame with the smallest remaining duration as the target image frame; or,
when the remaining durations are equal, determining the candidate image frame with the smallest non-zero processing duration as the target image frame; or,
when both the remaining durations and the non-zero processing durations are equal, randomly selecting one of those candidate image frames as the target image frame.
Optionally, if the candidate image frame is a new candidate image frame, the duration determining module 310 is configured to:
determining the time required to process all pixels of the new candidate image frame according to the ratio of the resolution of the new candidate image frame to the number of pixels the image processing device can process per unit time, and determining the processing duration of the new candidate image frame from that time;
and determining the duration occupied by the new candidate image frame according to the frame rate of the video to which it belongs, and taking the difference between the occupied duration and the processing duration as the remaining duration of the new candidate image frame.
Optionally, if the candidate image frame is not a new candidate image frame, the duration determining module 310 is configured to:
for each candidate image frame, taking the processing duration and remaining duration updated according to the processed duration of the previous target image frame as the processing duration and remaining duration of that candidate image frame.
Optionally, the apparatus further includes:
a processed pixel row recording module, configured to, if a new candidate image frame transmitted through any image transmission channel is received and the processing of the previous target image frame is not finished, record the processed pixel rows of the previous target image frame, take the previous target image frame as a candidate image frame, and, when it is determined as the target image frame again, resume processing from the row after the last processed pixel row.
The image processing device provided by the embodiment of the invention can execute the image processing method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example IV
Fig. 6 shows a schematic diagram of the structure of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic equipment may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 6, the electronic device 10 includes at least one processor 11, and a memory, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, etc., communicatively connected to the at least one processor 11, in which the memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from the storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data required for the operation of the electronic device 10 may also be stored. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 11 performs the respective methods and processes described above, for example, an image processing method.
In some embodiments, the image processing method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into the RAM 13 and executed by the processor 11, one or more steps of the image processing method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the image processing method in any other suitable way (e.g. by means of firmware).
Various implementations of the systems and techniques described above may be realized in digital electronic circuitry, integrated circuit systems, field programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special purpose or general purpose programmable processor, and which may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer-readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer-readable storage medium may be a machine-readable signal medium. More specific examples of a machine-readable storage medium include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include local area networks (LANs), wide area networks (WANs), blockchain networks, and the Internet.
The computing system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or a cloud host, which is a host product in a cloud computing service system and overcomes the drawbacks of difficult management and weak service scalability found in traditional physical hosts and virtual private server (VPS) services.
It should be appreciated that the various forms of flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, as long as the desired results of the technical solution of the present invention can be achieved; no limitation is imposed herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. An image processing method, the method comprising:
if a new candidate image frame transmitted through any image transmission channel is received or the processing of the last target image frame is finished, determining the processing time length and the residual time length of the candidate image frame;
wherein the candidate image frames are unprocessed image frames transmitted through at least two image transmission channels, the processing time length represents the time length required for processing the unprocessed pixel points in a candidate image frame, and the residual time length represents the difference between the time length from the current time until the next candidate image frame arrives on the image transmission channel over which the candidate image frame was received and the processing time length of the candidate image frame;
determining a target image frame from the candidate image frames according to the processing time length and the residual time length of the candidate image frames; and
processing the target image frame, and updating the processing time length and the residual time length of the candidate image frames and the target image frame according to the processed time length of the target image frame, wherein the processed time length is the time length from the moment at which the processing of the target image frame starts to the current moment.
2. The method of claim 1, wherein updating the processing time length and the residual time length of the candidate image frames and the target image frame according to the processed time length of the target image frame comprises:
taking the difference between the residual time length of the candidate image frame and the processed time length as the updated residual time length of the candidate image frame, and keeping the processing time length of the candidate image frame unchanged; and
taking the difference between the processing time length of the target image frame and the processed time length as the updated processing time length of the target image frame, and keeping the residual time length of the target image frame unchanged.
3. The method according to claim 1 or 2, wherein determining a target image frame from the candidate image frames according to the processing time length and the residual time length of the candidate image frames comprises:
selecting the target image frame, according to the residual time lengths of the candidate image frames, from the candidate image frames whose processing time length is not zero.
4. The method according to claim 3, wherein selecting the target image frame, according to the residual time lengths of the candidate image frames, from the candidate image frames whose processing time length is not zero comprises:
determining, from the candidate image frames whose processing time length is not zero, the candidate image frame with the minimum residual time length as the target image frame; or
determining, from candidate image frames whose residual time lengths are equal, the candidate image frame whose processing time length is not zero and is minimum as the target image frame; or
randomly determining, from candidate image frames whose residual time lengths are equal and whose processing time lengths are not zero and are equal, one candidate image frame as the target image frame.
5. The method of claim 1, wherein determining the processing time length and the residual time length of the candidate image frame if a new candidate image frame transmitted through any image transmission channel is received comprises:
determining the time length required for processing all pixel points of the new candidate image frame according to the ratio of the resolution of the new candidate image frame to the number of pixel points processed by the image processing device per unit time, and determining the processing time length of the new candidate image frame according to the time length required for processing all pixel points of the new candidate image frame; and
determining the time length occupied by the new candidate image frame according to the frame rate of the video to which the new candidate image frame belongs, and taking the difference between the time length occupied by the new candidate image frame and the processing time length as the residual time length of the new candidate image frame.
6. The method of claim 1, wherein determining the processing time length and the residual time length of the candidate image frame, if the candidate image frame is not a new candidate image frame, comprises:
taking, for each candidate image frame, the processing time length and the residual time length that were updated according to the processed time length of the previous target image frame as the processing time length and the residual time length of the candidate image frame.
7. The method according to claim 1, wherein the method further comprises:
recording the processed pixel rows of the previous target image frame if a new candidate image frame transmitted through any image transmission channel is received and the processing of the previous target image frame is not finished, and taking the previous target image frame as a candidate image frame, so that when this candidate image frame is determined as the target image frame again, processing resumes from the row following the processed pixel rows.
8. An image processing apparatus, characterized in that the apparatus comprises:
a time length determining module, configured to determine the processing time length and the residual time length of a candidate image frame if a new candidate image frame transmitted through any image transmission channel is received or the processing of a previous target image frame is finished;
wherein the candidate image frames are unprocessed image frames transmitted through at least two image transmission channels, the processing time length represents the time length required for processing the unprocessed pixel points in a candidate image frame, and the residual time length represents the difference between the time length from the current time until the next candidate image frame arrives on the image transmission channel over which the candidate image frame was received and the processing time length of the candidate image frame;
a target image frame determining module, configured to determine a target image frame from the candidate image frames according to the processing time length and the residual time length of the candidate image frames; and
a time length updating module, configured to process the target image frame, and update the processing time length and the residual time length of the candidate image frames and the target image frame according to the processed time length of the target image frame, wherein the processed time length is the time length from the moment at which the processing of the target image frame starts to the current moment.
9. An image processing electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the image processing method of any one of claims 1-7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores computer instructions which, when executed, cause a processor to implement the image processing method of any one of claims 1-7.
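By way of illustration only, and not as part of the claims or the disclosed embodiments, the following is a minimal sketch of how the scheduling recited in claims 1-7 might be realised in software. The names FrameState, pixels_per_ms, pick_target and update_after_processing, the use of milliseconds as the time unit, and the row-level bookkeeping are assumptions introduced here for readability.

```python
# Illustrative sketch only; helper names and units are assumptions, not the
# patented implementation.
from __future__ import annotations

import random
from dataclasses import dataclass


@dataclass
class FrameState:
    """Per-frame bookkeeping for one candidate image frame."""
    channel: int
    width: int
    height: int
    frame_rate: float            # frame rate of the video the frame belongs to (fps)
    processing_ms: float = 0.0   # time still needed to process the unprocessed pixels
    remaining_ms: float = 0.0    # frame period minus processing time, decremented over time
    resume_row: int = 0          # next unprocessed pixel row (cf. claim 7)


def on_new_frame(frame: FrameState, pixels_per_ms: float) -> None:
    """Initialise the two durations when a new frame arrives on a channel (cf. claim 5)."""
    total_pixels = frame.width * frame.height
    frame.processing_ms = total_pixels / pixels_per_ms   # time to process all pixels
    frame_period_ms = 1000.0 / frame.frame_rate          # time the frame 'occupies'
    frame.remaining_ms = frame_period_ms - frame.processing_ms


def pick_target(candidates: list[FrameState]) -> FrameState | None:
    """Select the next frame to process (cf. claims 3 and 4)."""
    runnable = [f for f in candidates if f.processing_ms > 0]
    if not runnable:
        return None
    min_remaining = min(f.remaining_ms for f in runnable)
    tied = [f for f in runnable if f.remaining_ms == min_remaining]
    if len(tied) > 1:
        min_processing = min(f.processing_ms for f in tied)
        tied = [f for f in tied if f.processing_ms == min_processing]
    return random.choice(tied)   # random pick only when still tied


def update_after_processing(candidates: list[FrameState], target: FrameState,
                            processed_ms: float, rows_done: int) -> None:
    """Account for the time already spent on the target (cf. claims 2 and 7)."""
    for frame in candidates:
        if frame is target:
            frame.processing_ms = max(0.0, frame.processing_ms - processed_ms)
            frame.resume_row += rows_done       # resume from the next row later on
        else:
            frame.remaining_ms -= processed_ms  # its deadline has drawn nearer
```

Read this way, the candidate frame with the smallest residual time length is the one closest to being pre-empted by the next frame arriving on its own channel, so serving it first lets a single processing link keep pace with each channel's actual frame rate.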
CN202211582182.9A 2022-12-08 2022-12-08 Image processing method, device, equipment and storage medium Pending CN116668608A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211582182.9A CN116668608A (en) 2022-12-08 2022-12-08 Image processing method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211582182.9A CN116668608A (en) 2022-12-08 2022-12-08 Image processing method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116668608A true CN116668608A (en) 2023-08-29

Family

ID=87717718

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211582182.9A Pending CN116668608A (en) 2022-12-08 2022-12-08 Image processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116668608A (en)

Similar Documents

Publication Publication Date Title
RU2771008C1 (en) Method and apparatus for processing tasks based on a neural network
CN108509272B (en) Method and device for copying GPU (graphics processing Unit) video memory texture to system memory and electronic equipment
US9940732B2 (en) Implementing reduced video stream bandwidth requirements when remotely rendering complex computer graphics scene
CN110860086A (en) Data processing method, readable storage medium and electronic device
CN113656176B (en) Cloud equipment distribution method, device and system, electronic equipment, medium and product
CN114422799B (en) Decoding method and device for video file, electronic equipment and program product
CN113839998B (en) Image data transmission method, apparatus, device, storage medium, and program product
US11195248B2 (en) Method and apparatus for processing pixel data of a video frame
CN116668608A (en) Image processing method, device, equipment and storage medium
US20080211820A1 (en) Information Processing Device, Graphic Processor, Control Processor, and Information Processing Method
CN116521088A (en) Data processing method, device, equipment and storage medium
CN114327918B (en) Method and device for adjusting resource amount, electronic equipment and storage medium
US12034956B2 (en) Video processing method and video processing apparatus
US20230262250A1 (en) Video processing method and apparatus, electronic device, and storage medium
CN116916095B (en) Smooth display method, device and equipment of cloud video and storage medium
CN113627363B (en) Video file processing method, device, equipment and storage medium
CN115600687B (en) Model training method, device, equipment and storage medium
CN113824955B (en) Multi-channel video time-sharing multiplexing coding method and system
CN117667000A (en) Image display method, device, electronic equipment and medium
CN117061804A (en) Video display method, device, electronic equipment and storage medium
CN115719580A (en) LED screen display control method, device, equipment and medium
CN116582707A (en) Video synchronous display method, device, equipment and medium
CN117495653A (en) Histogram statistics method and device, electronic equipment and storage medium
CN118042061A (en) Video transmission method and device, electronic equipment and storage medium
CN114140312A (en) Double-thread rendering method and device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination