WO2020097888A1 - Video processing method and apparatus, electronic device, and computer-readable storage medium - Google Patents
Video processing method and apparatus, electronic device, and computer-readable storage medium
- Publication number
- WO2020097888A1 (PCT/CN2018/115753)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- frame
- video
- image
- processed
- encoding
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
Definitions
- the present application relates to the field of video image encoding and decoding, and more specifically, to a video processing method, device, electronic device, and computer-readable storage medium.
- multimedia, especially video, has become one of the main content-bearing media.
- video is developing toward high definition and even ultra-high definition, so video transmission now occupies most of the network transmission bandwidth. While this enriches the user experience, it puts great pressure on storage and transmission, so compressing video is important.
- low-bitrate video is usually recorded, compressed, encoded and uploaded.
- Advanced Video Coding (AVC), also known as H.264, is commonly used to compress such video.
- the present application proposes a video processing method, device, electronic device, and computer-readable storage medium to improve the above-mentioned defects.
- embodiments of the present application provide a video processing method, which is applied to an electronic device.
- the method includes: collecting video and extracting the frame image of the video to be processed; performing blur processing on the frame image to be processed to obtain a blurred image; and determining encoding parameters and encoding the blurred image.
- an embodiment of the present application further provides a video processing device, which is applied to an electronic device.
- the video processing device includes: a video collection module for collecting video and extracting a frame image of the video to be processed; a preprocessing module for blurring the frame image to be processed to obtain a blurred image; and an encoding module for determining encoding parameters and encoding the blurred image.
- an embodiment of the present application further provides an electronic device, including: one or more processors; a memory; and one or more application programs, where the one or more application programs are stored in the memory and configured to be executed by the one or more processors to perform the above method.
- an embodiment of the present application further provides a computer-readable storage medium, where program code is stored in the computer-readable storage medium and may be called by a processor to execute the foregoing method.
- by blurring the frame image to be processed, the preprocessing of the frame image is completed, and the preprocessed image is encoded according to the determined encoding parameters. This compresses the amount of video data and improves coding efficiency while removing coding block artifacts and mosaic from the video, thereby ensuring higher video quality and improving video clarity.
- FIG. 1 is a schematic diagram of a video processing scenario provided by an embodiment of the present application.
- FIG. 2 is a schematic flowchart of a video processing method provided by an embodiment of the present application.
- FIG. 3 is a schematic flowchart of another video processing method provided by an embodiment of the present application.
- FIG. 4 is a schematic flowchart of the blur processing steps of the video processing method shown in FIG. 3.
- FIG. 5 is a schematic flowchart of an encoding step of the video processing method shown in FIG. 3.
- FIG. 6 is a schematic diagram of functional modules of a video processing device provided by an embodiment of the present application.
- FIG. 7 is a schematic structural diagram of an electronic device provided by an embodiment of the present application for performing the video processing method of the embodiment of the present application.
- FIG. 8 is a block diagram of an electronic device for performing a video processing method according to an embodiment of the present application.
- FIG. 9 is a storage unit provided by an embodiment of the present application for storing or carrying program code that implements the video processing method of the embodiment of the present application.
- "electronic devices" and "communication terminals" used in the embodiments of the present application include, but are not limited to, devices configured to receive/transmit communication signals via a wired line (such as a public switched telephone network (PSTN), a digital subscriber line (DSL), digital cable, a direct cable connection, and/or another data connection/network) and/or via a wireless interface (e.g., a cellular network, a wireless local area network (WLAN), a digital TV network such as a DVB-H network, a satellite network, an AM-FM broadcast transmitter, and/or another communication terminal).
- a communication terminal configured to communicate through a wireless interface may be referred to as a "wireless communication terminal", a "wireless terminal", a "mobile terminal", or an "electronic device".
- mobile terminals and electronic devices include, but are not limited to: satellite or cellular phones; personal communications system (PCS) terminals that combine a cellular radiotelephone with data processing, facsimile, and data communication capabilities; PDAs that can include a radiotelephone, pager, Internet/intranet access, Web browser, notepad, calendar, and/or Global Positioning System (GPS) receiver; and conventional laptop and/or palmtop receivers or other electronic devices that include a radiotelephone transceiver.
- low bit rate (Low Rate) videos are usually recorded, compressed, encoded and uploaded.
- some instant messaging applications support instant shooting of small videos and instant sharing.
- these videos are usually recorded at a relatively high resolution (such as 960×544).
- the video bit rate is usually low (for example, less than or equal to 1.2 Mbps), and the video duration is usually short, for example 10 s, 8 s, or 5 s; such videos are usually called low-bitrate small videos.
- the inventor of the present application has focused on how to improve video quality while ensuring the processing speed and transmission rate of the video during low-bitrate video recording in scenarios similar to the above.
- the inventor found that in the low-bitrate video encoding process of the above scenario, the frame types of the captured video data are determined and encoded directly, and the I-frame interval is short. Moreover, to improve coding efficiency, the above small videos contain only I frames and P frames, resulting in a large amount of coded data. Therefore, when balancing video data size against transmission rate, it is difficult to obtain high-quality low-bitrate video images through traditional encoding methods.
- the inventor proposes the video processing method of the present application, which can take into account the video processing speed, image quality, and transmission rate during video recording.
- the video processing method can be applied to the low-bitrate small-video recording process of the above scenarios, so that a low-bitrate small video recorded at a higher resolution has higher clarity.
- FIG. 1 shows a schematic diagram of a video processing and encoding scenario of the present application.
- the video content 1011 is acquired by the shooting module 108, and the video content 1011 is processed by the processor 102.
- the processor 102 may include a preprocessor 1021 and an encoder 1023.
- the preprocessor 1021 is used to preprocess the video content 1011.
- the video content 1011 is de-noised and blurred, and the encoder 1023 is used to encode the preprocessed video content 1011.
- the video processing method provided by the embodiments of the present application removes, through preprocessing, the high-frequency noise in the video content before encoding. This achieves noise reduction while retaining the key information in the video content, and can balance video processing speed, picture quality, and transmission rate.
- an embodiment of the present application proposes a video processing method, which is applied to an electronic device with a camera in practical applications.
- the electronic device may be a mobile phone, a tablet computer, or another portable mobile terminal (such as a smartwatch, a camera, etc.).
- the above video processing method may include: S101 to S105.
- S101 Collect video and extract the frame image of the video to be processed.
- the video is collected through the camera of the electronic device, and the to-be-processed frame image of the video is extracted in real time.
- the blur processing in the embodiment of the present application should be understood as blur processing of the YUV data of the frame image to be processed, such as reducing the sharpness of the image, removing image noise and unnecessary details.
- the YUV data of the frame image to be processed is first extracted; after performing time-domain noise reduction on the YUV data, the frame image is reduced and then enlarged back to its original size to achieve the blur processing and obtain a blurred image.
- the frame image to be processed can lose some details to which human eyes are insensitive (such as high-frequency noise and over-sharpened details), which facilitates the subsequent encoding of the frame image, reduces the amount of encoded data, and increases the encoding rate, thereby improving post-processing image quality.
- when determining the encoding parameters, it is necessary to determine the type of each video frame, and then encode the blurred image according to the frame type.
- the types of video frames include I frames, P frames, and B frames.
- the I frame is an intra-coded reference frame, also called a key frame; it is the first frame of a GOP (Group of Pictures) and its coding does not depend on preceding or following frames.
- a P frame is a coded picture that compresses the amount of transmitted data by reducing the temporal redundancy with previously coded frames in the image sequence; it is also called a predicted frame.
- the B frame is a bidirectionally predicted frame; its reference frames are the adjacent preceding frames and the following frames.
- the P frame or B frame should be set between two adjacent I frames.
- the first frame of the video is an I frame
- the video frames after the first frame are B frames and/or P frames
- the I frame is then intra-coded
- the B frames and/or P frames are inter-coded.
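The frame-type arrangement above can be sketched as a small helper that assigns a type to each frame. The GOP size used here is an illustrative assumption, not a value fixed by the text (which, for short small videos, may use a single I frame):

```python
def assign_frame_types(num_frames, gop_size=10):
    """Assign frame types: the first frame of each GOP is an intra-coded
    I frame, and the frames that follow it are inter-coded P frames.
    (B frames could also be interleaved; see the adaptive scheme
    described later in the text.)"""
    return ['I' if i % gop_size == 0 else 'P' for i in range(num_frames)]
```

For a short small video with a long I-frame interval, `gop_size` would simply exceed the total frame count, leaving a single I frame at the start.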
- the amount of encoded data can be reduced in advance. Then, according to the determined encoding parameters, the preprocessed image is encoded; while compressing the video data volume and improving encoding efficiency, this removes coding block artifacts and mosaic from the video, thereby ensuring higher video quality and improving video clarity.
- the compression ratio of H.264 is more than twice that of MPEG-2 and 1.5 to 2 times that of MPEG-4.
- if the size of the original file is 88 GB, it will be 3.5 GB after compression with the MPEG-2 standard (a compression ratio of 25:1) and 879 MB after compression with the H.264 standard; from 88 GB to 879 MB, the compression ratio of H.264 reaches 102:1.
- the low bit rate plays an important role in the high compression ratio of H.264.
- H.264 compression technology will greatly save users' upload time and data traffic charges.
- H.264 achieves a high compression ratio while retaining high-quality, smooth images. Therefore, video data compressed with H.264 by the video processing method of the embodiment of the present application requires less bandwidth during network transmission and is more economical.
- this application also provides another video processing method.
- when the video processing method encodes a video frame, the frame type is set according to the motion scene of the video frame, and the video is then encoded according to the frame type, which ensures high picture quality when recording dynamic video scenes.
- the video processing method provided in this embodiment may include: S201 to S205.
- S201 Collect video and extract the frame image of the video to be processed.
- the video is collected through the camera of the electronic device, and the to-be-processed frame image of the video is extracted in real time.
- the maximum video collection duration is usually set, that is, the maximum duration of the video collected by the camera of the electronic device is limited, so as to facilitate subsequent setting of encoding parameters.
- the total duration allowed for the video may be 5-30 seconds, such as 5 seconds, 10 seconds, or 15 seconds.
- when the duration of the recorded video reaches the total allowed duration, recording stops automatically.
- the video processing method may be applied to video recording in network-based applications (e.g., instant messaging applications, social networking applications), and may further include the step of automatically stopping recording when the duration of the recorded video exceeds a preset value, where the preset value is the total allowed video duration.
- S203 Blur the frame image to be processed to obtain a blurred image.
- S203 may include: S2031 to S2035.
- if the frame image to be processed is in YUV format, its YUV data is directly extracted.
- if the frame image to be processed is in another format, for example RGB, it must first be converted to YUV format before the YUV data is extracted.
- S2031 may include the steps of: determining the format of the frame image to be processed; extracting the YUV data if the frame image is in YUV format; and converting the frame image to YUV format and then extracting the YUV data if it is in RGB format.
- YUV is a color encoding method.
- YUV is a color space used to encode true-color images.
- proper nouns such as Y'UV, YUV, YCbCr, and YPbPr are all commonly called YUV, and their meanings overlap.
- "Y" denotes brightness (luminance or luma)
- "U" and "V" denote chrominance (chroma), describing the color and saturation.
- RGB is the three-primary-color light model (RGB color model), also known as the red-green-blue color model. It is an additive color model in which the red (Red), green (Green), and blue (Blue) color channels are varied and superimposed in different proportions to produce a variety of colors.
- the RGB color model is commonly used to detect, represent, and display images in electronic systems. The mutual conversion between YUV data and RGB data can be achieved through a preset conversion matrix.
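As one example of such a "preset conversion matrix", the full-range BT.601 coefficients can be used. Other standards (such as BT.709) use different coefficients, so the exact matrix below is an assumed choice, not one fixed by the text:

```python
import numpy as np

# Full-range BT.601 RGB -> YUV coefficients, one common choice of
# "preset conversion matrix" (BT.709 and others differ).
RGB_TO_YUV = np.array([
    [ 0.299,  0.587,  0.114],   # Y  (luma)
    [-0.147, -0.289,  0.436],   # U  (blue-difference chroma)
    [ 0.615, -0.515, -0.100],   # V  (red-difference chroma)
])

def rgb_to_yuv(rgb):
    """Convert an (H, W, 3) RGB array with values in [0, 1] to YUV."""
    return rgb @ RGB_TO_YUV.T
```

For a white pixel (1, 1, 1), the luma Y is 1.0 and both chroma components are 0, which is a quick sanity check of the matrix rows.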
- S2033 Perform time-domain noise reduction processing on the YUV data to obtain a noise-reduced image.
- ambient light and shooting parameters (such as exposure parameters) cause noise in the captured picture. From the perspective of its probability distribution, noise can be divided into Gaussian noise, Rayleigh noise, gamma noise, exponential noise, and uniform noise. In the embodiment of the present application, in order to suppress noise and improve the quality of the frame image to be processed, and thereby facilitate post-processing of the video, noise reduction preprocessing is performed on the frame image to be processed.
- the high-frequency and low-frequency color signals in the YUV data are separated by a filter, the high-frequency color signal is filtered out, and the noise-reduced image is obtained. Since the bandwidth of the chroma components is usually narrow and the human visual system is not sensitive to high-frequency color signals, the high-frequency colors can be removed by low-pass filtering in the time domain, removing high-frequency noise from the frame image to be processed.
- a simple low-pass filter, such as a Gaussian filter or a mean filter, can be used to suppress image noise; this helps distinguish the effective image content from noise interference and also avoids smearing of moving objects or moving scenes during video processing.
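Time-domain low-pass filtering of this kind can be sketched as an exponential moving average over successive frames. The smoothing weight `alpha` is an illustrative parameter, not a value from the text:

```python
import numpy as np

def temporal_lowpass(frames, alpha=0.25):
    """Exponential-moving-average sketch of time-domain noise reduction.
    Smaller `alpha` filters more strongly; too small a value would
    smear moving content, which is why the text prefers mild filters."""
    out, state = [], None
    for f in frames:
        state = f.astype(float) if state is None else alpha * f + (1 - alpha) * state
        out.append(state)
    return out
```

A static sequence passes through unchanged, while frame-to-frame fluctuations (noise) are attenuated toward the running average.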
- the Gaussian filter is used to reduce the noise of the frame image to be processed.
- the Gaussian filter is a linear filter that can effectively suppress noise and smooth the frame image to be processed.
- the working principle of the Gaussian filter is similar to that of the mean filter: a weighted average of the pixels in the filter window is taken as the output.
- the window template coefficients, however, differ from those of the mean filter.
- the template coefficients of the mean filter are all equal to 1.
- the template coefficients of the Gaussian filter decrease as the distance from the template center increases; therefore, compared with the mean filter, the Gaussian filter blurs the image less.
- a 5×5 Gaussian filter window is generated, and the template is sampled with its center position as the coordinate origin: the coordinates of each template position are substituted into the Gaussian function, and the resulting values are the template coefficients.
- by convolving the Gaussian filter window with the frame image to be processed, the frame image can be denoised.
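The template generation and convolution described above can be sketched in plain NumPy. The sigma value is an assumed parameter, and borders are edge-replicated for simplicity (the text does not specify a border policy):

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    """Build a size x size Gaussian template, sampling the Gaussian
    function with the template centre as the coordinate origin."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    kernel = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
    return kernel / kernel.sum()        # normalise: coefficients sum to 1

def filter_plane(plane, kernel):
    """Filter one image plane with the (symmetric) kernel; for a
    symmetric Gaussian this equals convolution."""
    half = kernel.shape[0] // 2
    padded = np.pad(plane, half, mode='edge')   # edge-replicated border
    out = np.zeros_like(plane, dtype=float)
    h, w = plane.shape
    for dy in range(-half, half + 1):
        for dx in range(-half, half + 1):
            out += kernel[dy + half, dx + half] * \
                   padded[half + dy:half + dy + h, half + dx:half + dx + w]
    return out
```

Because the coefficients sum to 1, filtering a flat region leaves it unchanged, while isolated noise pixels are averaged away.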
- the frame image to be processed can lose some noisy details to which human eyes are insensitive (such as high-frequency noise and over-sharpened parts), which helps reduce the amount of encoded data.
- the encoding rate and post-processing image quality can be improved.
- the image is blurred by zooming: the noise-reduced image is first reduced and then enlarged. The reduction effectively removes unnecessary details from the image while preserving the characteristic details to which the human eye is more sensitive.
- S2035 may include: determining the original size of the noise-reduced image; reducing the noise-reduced image to obtain a reduced image; and enlarging the reduced image back to the original size to obtain the blurred image.
- the reduction factor is not limited.
- the ratio of the size of the reduced image to the original size may be 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, and so on.
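The reduce-then-enlarge step can be sketched with nearest-neighbour resampling. A production implementation would typically use area or bilinear filtering, so this is only an illustration of the idea:

```python
import numpy as np

def zoom_blur(image, ratio=0.5):
    """Blur by reducing the image to `ratio` of its original size and
    enlarging it back; detail finer than the reduced grid is lost."""
    h, w = image.shape[:2]
    sh, sw = max(1, int(h * ratio)), max(1, int(w * ratio))
    # Reduce: pick one source row/column per destination position.
    r = np.linspace(0, h - 1, sh).astype(int)
    c = np.linspace(0, w - 1, sw).astype(int)
    small = image[np.ix_(r, c)]
    # Enlarge back to the original size.
    r2 = np.linspace(0, sh - 1, h).astype(int)
    c2 = np.linspace(0, sw - 1, w).astype(int)
    return small[np.ix_(r2, c2)]
```

The output has the original size, but its effective resolution is that of the reduced image, which is exactly the blurring effect the step aims for.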
- encoding is performed based on the H.264 encoding standard.
- the encoding parameters include, but are not limited to: quantization parameter values (QP values), video frame types, and frame rates.
- the QP value determines the compression ratio and image accuracy of the quantizer. If the QP value is large, the dynamic range of the quantized values is small and the corresponding code length is short, but image detail is lost during inverse quantization; if the QP value is small, the dynamic range is large and the code length is longer, but less image detail is lost.
- the quantization parameter QP has 52 possible values; the minimum value 0 represents the finest quantization, and the maximum value 51 represents the coarsest quantization.
- the range of the QP value of the frame to be processed is determined to be 20-44, so as to take into account both image details and encoding length.
- the QP value can be any value or range of values from 20 to 44, for example, the QP value can be: 20, 22, 24, 26, 28, 30, 32, 34, 36, 38, 40, 42, 44 and so on.
- the encoder can automatically change the QP value according to the actual dynamic range of the image, and trade off between the encoding length and the image accuracy to achieve the overall best effect of video processing.
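How an encoder might trade QP off against the image's actual dynamic range can be illustrated with a hypothetical linear heuristic. Real encoders use rate-control algorithms, and nothing in the text prescribes this particular mapping; only the 20-44 range comes from the text:

```python
def choose_qp(dynamic_range, qp_min=20, qp_max=44):
    """Map an image's dynamic range (0..255) to a QP in [qp_min, qp_max].
    Hypothetical heuristic: a flat image (small dynamic range) tolerates
    coarser quantization (higher QP); a detailed image gets a lower QP."""
    dynamic_range = max(0, min(255, dynamic_range))
    return round(qp_max - (qp_max - qp_min) * dynamic_range / 255)
```

With this mapping, a fully detailed image (dynamic range 255) receives the finest allowed quantization, QP 20, and a flat image receives QP 44.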
- the types of video frames include I frames, P frames, and B frames.
- when executing S205, it is determined that the first frame of the video is an I frame and the subsequent frames are B frames and/or P frames and/or I frames; the I frames are then intra-coded, and the B frames and/or P frames are inter-coded.
- the video frame may be encoded according to the type of video frame.
- S205 may include: S2051 to S2053.
- S2051 Determine that the first frame of the video is an I frame, and perform intra coding on the I frame.
- the number of I frames in the video is controlled by setting the I-frame interval duration, which helps reduce the amount of encoded data.
- the recorded video has a maximum duration limit. For example, in currently popular instant messaging applications, a small video of at most 10 seconds can be recorded for sharing.
- the total duration limits the duration of the I-frame interval.
- the I-frame interval duration may be greater than or equal to 1/4, 1/3, 1/2, 2/3, 3/4, etc. of the total allowed video duration; the I-frame interval can even be greater than the total allowed recording duration.
- the I-frame interval can also be set to a specified duration; for example, the I-frame interval is set to 11 seconds.
- S2053 Determine that the video frames after the I frame are B frames and/or P frames, and perform inter-frame coding on them.
- when the video frames after the I frame include both B frames and P frames, the B frames and P frames are set to alternate in sequence.
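The alternating arrangement can be sketched as follows; the starting phase of the alternation is an assumption, since the text only says the two types alternate:

```python
def gop_pattern(num_frames):
    """Frame-type sequence for one GOP: an I frame first, then B and P
    frames alternating in sequence, as described above."""
    return ['I'] + ['B' if i % 2 == 1 else 'P' for i in range(1, num_frames)]
```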
- adaptive B-frame placement can be enabled to allow the encoder to override the number of B-frame images to be encoded in order to improve quality; for example, when the encoder detects a scene change, or when the frame following the current frame is an I frame, the designated video frame is set as a B frame through the adaptive B-frame setting.
- the alternation frequency of the B frames and P frames can be determined according to the shooting scene of the video frames, or a designated video frame can be set as a B frame according to the shooting scene, to improve coding efficiency.
- S2053 may include: performing motion-scene judgment on the video frames after the I frame; adaptively adjusting the types of the video frames after the I frame according to the judgment result; and encoding the video frames after the I frame according to their types. Specifically, if any frame after the I frame is in a moving scene, that frame is determined to be a B frame; otherwise, it is determined to be a P frame.
- the motion-scene judgment of a video frame after the I frame includes: obtaining the first coordinate of a specified feature A in the current frame image, obtaining the second coordinate of feature A in the previous frame image, and computing the difference between the first and second coordinates; if the difference is greater than a specified value, the current frame is considered to be in a moving scene.
- suppose the coordinates of feature A in the previous frame have been determined as (X, Y, Z).
- if the change increment of feature A is (x1, y1, z1) and this increment is greater than the specified value, the video frame is considered to be in a moving scene.
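The coordinate-difference judgment can be sketched as below. The threshold is an illustrative value, and Euclidean distance is one reasonable way to measure the increment; the text does not fix a particular metric:

```python
def is_motion_scene(prev_coord, cur_coord, threshold=10.0):
    """Judge a moving scene from the displacement of a tracked feature A
    between consecutive frames; coordinates are (X, Y, Z) tuples."""
    dx, dy, dz = (c - p for p, c in zip(prev_coord, cur_coord))
    return (dx * dx + dy * dy + dz * dz) ** 0.5 > threshold
```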
- the above-mentioned motion scenes can be understood as scenes with moving objects in the shot, where picture elements change rapidly: for example, large shakes of the shooting device (such as the electronic device), shooting scene changes, or cars or people running.
- whether a video frame is in a moving scene can also be determined by other methods, for example, by determining the correlation between adjacent frame images. Specifically, the image information (such as the color distribution) of two adjacent frames can be obtained, and the correlation between the adjacent frames is obtained by comparing the image information; if the correlation is less than a preset value, the video frame is considered to be in a moving scene.
- judging the motion scene of a video frame after the I frame includes: acquiring the first image information of the current frame, acquiring the second image information of the previous frame, and obtaining the difference between the first and second image information; if the difference is greater than a specified value, the current frame is considered to be in a moving scene.
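The image-information comparison can be sketched with grey-level histograms standing in for the "color distribution". The bin count and the preset correlation threshold are illustrative assumptions:

```python
import numpy as np

def frames_correlated(frame_a, frame_b, bins=16, threshold=0.9):
    """Compare the intensity distributions of two adjacent frames via
    normalised histogram correlation. Below `threshold`, the frames
    differ enough to be treated as a moving scene."""
    ha, _ = np.histogram(frame_a, bins=bins, range=(0, 256))
    hb, _ = np.histogram(frame_b, bins=bins, range=(0, 256))
    ha, hb = ha.astype(float), hb.astype(float)
    denom = np.linalg.norm(ha) * np.linalg.norm(hb)
    corr = float(ha @ hb / denom) if denom else 1.0
    return corr >= threshold   # True: static scene; False: moving scene
```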
- the preprocessing of the frame image to be processed is completed, and the preprocessed image is encoded according to the determined encoding parameters. While improving coding efficiency, this ensures higher picture quality and removes coding block artifacts and mosaic from the video. Further, when encoding a video frame, the video processing method sets the frame type according to the motion scene of the frame and then encodes the video according to the frame type, which ensures high picture quality when recording dynamic video scenes.
- FIG. 6 shows a structural block diagram of the video processing device 300.
- the video processing device 300 runs on the electronic device 100 shown in FIG. 7 and is used to perform the above-mentioned video processing method.
- the video processing device 300 is stored in the memory of the electronic device 100 and is configured to be executed by one or more processors of the electronic device 100.
- the video processing device 300 includes a video acquisition module 310, a preprocessing module 330 and an encoding module 350.
- the above-mentioned modules may be program modules running in a computer-readable storage medium. The purposes and functions of the above-mentioned modules are as follows:
- the video capture module 310 is used to capture video and extract the frame image of the video to be processed. Specifically, the video collection module 310 collects video through the camera of the electronic device, and extracts to-be-processed frame images of the video in real time.
- the preprocessing module 330 is used to preprocess the video collected by the video collection module 310. Specifically, the pre-processing module 330 is used to perform blur processing on the frame image to be processed to obtain a blurred image.
- the blur processing in the embodiment of the present application should be understood as performing blur processing on the YUV data of the frame image to be processed, for example, reducing the sharpness of the image, removing image noise, and unnecessary details.
- the pre-processing module 330 may include a YUV data extraction unit 331, a noise reduction unit 333, and a blur processing unit 335.
- the YUV data extraction unit 331 is used to extract YUV data of the frame image to be processed.
- the YUV data extraction unit 331 is used to determine the format of the frame image to be processed. If the frame image is in YUV format, the unit directly extracts the YUV data; if it is in RGB format, the unit first converts the frame image to YUV format and then extracts the YUV data.
- the noise reduction unit 333 is used to perform temporal noise reduction on the YUV data to obtain a noise-reduced image. After the YUV data extraction unit 331 extracts the YUV data, the noise reduction unit 333 distinguishes the high-frequency color signals from the low-frequency color signals in the YUV data through a filter, and removes the high-frequency color signals by temporal low-pass filtering, thereby eliminating high-frequency noise from the to-be-processed frame image.
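One common temporal low-pass filter is a running (exponential) average across frames; the sketch below uses it as an illustrative stand-in for unit 333's filter, which the patent does not specify. A smaller `alpha` smooths frame-to-frame (high-frequency) fluctuations more strongly.

```python
import numpy as np

def temporal_denoise(frames, alpha=0.3):
    """Exponential moving average over a frame sequence (temporal low-pass)."""
    out = []
    acc = None
    for f in frames:
        f = np.asarray(f, dtype=np.float64)
        # Blend the new frame with the running average; rapid pixel
        # fluctuations (noise) are attenuated, slow changes pass through.
        acc = f if acc is None else alpha * f + (1 - alpha) * acc
        out.append(acc)
    return out
```

On a static scene with additive noise, the filtered frames sit markedly closer to the true pixel values than the raw input.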
- the blur processing unit 335 is used to perform blur processing on the noise-reduced image and obtain a blur image.
- the blur processing unit 335 blurs the to-be-processed frame image to complete its preprocessing. This causes the frame image to lose some detail, which facilitates encoding the frame image and can improve both the encoding rate and the post-processing image quality.
- the blur processing unit 335 is further used to determine the size of the noise-reduced image as the original size, reduce the noise-reduced image to obtain a reduced image, and enlarge the reduced image back to the original size to obtain the blurred image.
- the reduction factor of the image is not limited; for example, the ratio of the size of the reduced image to the original size may be 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, and so on.
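A minimal sketch of unit 335's down/up-scale blur, using nearest-neighbour resampling via plain array slicing and `np.repeat`. A real implementation would use a proper image resampler with interpolation; the effect illustrated here, discarding fine detail by round-tripping through a smaller image, is the same. The function name and the 0.5 default ratio are assumptions.

```python
import numpy as np

def downscale_blur(img, ratio=0.5):
    """Blur by shrinking to ~ratio of the size, then enlarging back."""
    step = max(1, int(round(1 / ratio)))
    small = img[::step, ::step]                 # reduced image
    # Nearest-neighbour enlargement back to (at least) the original size.
    big = np.repeat(np.repeat(small, step, axis=0), step, axis=1)
    return big[:img.shape[0], :img.shape[1]]    # crop to the original size
```

After the round trip each `step x step` block carries a single sample, so fine detail within a block is gone while the overall image size is unchanged.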
- the encoding module 350 is used to determine encoding parameters and encode the blurred image. Specifically, the encoding module 350 encodes the blurred image based on the H.264 encoding standard.
- the encoding module 350 includes a QP value setting unit 351, a frame type setting unit 353, and an encoding unit 355.
- the QP value setting unit 351 is used to set the range of the quantization parameter (QP) value of the frame to be processed to 20-44, so as to balance image detail against encoded length. It can be understood that the QP value may be any value or sub-range from 20 to 44; for example, the QP value may be 20, 22, 24, 26, 28, 30, 32, 34, 36, 38, 40, 42, 44, and so on. In other embodiments, the QP value setting unit 351 automatically adjusts the QP value according to the actual dynamic range of the image.
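The adaptive variant of unit 351 can be sketched as a mapping from the frame's dynamic range onto the 20-44 window. The linear mapping below (wide dynamic range, lower QP, more detail kept) is a hypothetical helper for illustration; the patent only states that the QP is adjusted within that range.

```python
QP_MIN, QP_MAX = 20, 44  # window stated in the description

def qp_from_range(dynamic_range):
    """Map a normalized dynamic range in [0, 1] to a QP clamped to [20, 44]."""
    qp = QP_MAX - dynamic_range * (QP_MAX - QP_MIN)
    return int(min(QP_MAX, max(QP_MIN, round(qp))))
```

A flat frame (range 0) gets the coarsest quantization allowed, 44; a full-range frame gets the finest, 20.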
- the frame type setting unit 353 is used to determine that the first video frame is an I frame and that the video frames after the first frame are B frames and/or P frames. In some embodiments, the frame type setting unit 353 determines that the video frames after the I frame are B frames and P frames, and sets the B frames and P frames to alternate in sequence. Alternating B and P frames balances frame compression efficiency against image quality. In some other embodiments, the frame type setting unit 353 can enable adaptive B-frame placement; for example, when the encoder detects a scene change, or when the frame following the current frame is an I frame, the specified video frame is adaptively set as a B frame.
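The sequencing rule above can be sketched as: frame 0 is an I frame, and later frames alternate B and P. Which of B or P comes first is an assumption; the text only requires that they alternate.

```python
def frame_types(n_frames):
    """Assign I/B/P types: I first, then B and P alternating."""
    types = []
    for i in range(n_frames):
        if i == 0:
            types.append("I")      # first frame is intra-coded
        elif i % 2 == 1:
            types.append("B")      # odd positions: bidirectional
        else:
            types.append("P")      # even positions: predictive
    return types
```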
- the frame type setting unit 353 is used to appropriately reduce the number of I frames to reduce the amount of video data, thereby saving the amount of encoded data.
- the frame type setting unit 353 may encode the video frame according to the type of the video frame.
- the frame type setting unit 353 may include an I frame determination subunit 3531, a frame scene determination subunit 3533, a B frame determination subunit 3535, and a P frame determination subunit 3537.
- the I frame determination subunit 3531 is used to determine that the first frame image of the video is an I frame. Further, the I frame determination subunit 3531 is also used to determine the frame interval duration of I frames, so as to control the number of I frames in the video, which helps reduce the amount of encoded data. Specifically, the I frame determination subunit 3531 can limit the I-frame interval duration according to the total duration allowed for the video. For example, the I frame determination subunit 3531 may set the I-frame interval duration to 1/4, 1/3, 1/2, 2/3, 3/4, etc. of the total duration allowed for the video; the I-frame interval duration may even be greater than the total duration allowed for video recording. For scenes where the total duration allowed for video recording is fixed, the I frame determination subunit 3531 may set the I-frame interval duration to a fixed value, for example, 11 seconds.
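A fixed I-frame interval translates directly into a group-of-pictures (GOP) length in frames: at the 11-second example above and an assumed 30 fps (the patent does not state a frame rate), a new I frame is due every 330 frames. A minimal sketch:

```python
def gop_length(interval_seconds, fps):
    """Number of frames between consecutive I frames."""
    return int(interval_seconds * fps)

def is_i_frame(index, interval_seconds, fps):
    """True when frame `index` falls on an I-frame boundary."""
    return index % gop_length(interval_seconds, fps) == 0
```

Making the interval longer than the clip itself, as the description allows, simply means only frame 0 is an I frame.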
- the frame scene determination subunit 3533 is used to determine the shooting scene of a video frame, allowing the B frame determination subunit 3535 and the P frame determination subunit 3537 to determine the interval frequency of B frames and P frames. Specifically, the frame scene determination subunit 3533 determines the motion scene of the video frames after the I frame. If any video frame after the I frame is in a motion scene, the B frame determination subunit 3535 determines that the video frame is a B frame; otherwise, the P frame determination subunit 3537 determines that the video frame is a P frame.
- the frame scene determination subunit 3533 is used to obtain the first coordinate of a specified feature A of the current frame in the current frame image, obtain the second coordinate of the specified feature A in the image of the frame preceding the current frame, and obtain the difference between the first coordinate and the second coordinate; if the difference is greater than a specified value, the current frame is considered to be in a motion scene. In other embodiments, the frame scene determination subunit 3533 obtains first image information of the current frame and second image information of the preceding frame, and obtains the difference between the first image information and the second image information; if the difference is greater than a specified value, the current frame is considered to be in a motion scene.
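The second motion test described above (comparing image information between consecutive frames) can be sketched as a mean absolute pixel difference against a threshold. The threshold value and function names are assumptions; the text only says the difference must exceed "a specified value".

```python
import numpy as np

def in_motion_scene(curr, prev, threshold=10.0):
    """Flag motion when the mean absolute frame difference exceeds threshold."""
    diff = np.abs(np.asarray(curr, float) - np.asarray(prev, float))
    return diff.mean() > threshold

def pick_type(curr, prev, threshold=10.0):
    # Motion scene -> B frame, otherwise P frame (subunits 3535 / 3537).
    return "B" if in_motion_scene(curr, prev, threshold) else "P"
```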
- the encoding unit 355 is used to encode each video frame according to its type. Specifically, the encoding unit 355 performs intra-frame encoding on I frames and inter-frame encoding on B frames and/or P frames.
- once preprocessing of the to-be-processed frame image is complete, the preprocessed image is encoded according to the determined encoding parameters. This improves encoding efficiency while preserving higher picture quality, thereby removing blocking artifacts and mosaic effects from the video. Further, when encoding a video frame, the video processing method sets the type of the video frame according to its motion scene and then encodes the video according to that type, which ensures higher picture quality when recording dynamic video scenes.
- the modules shown or discussed may be coupled, directly coupled, or in communication with each other through interfaces; the indirect coupling or communication connection between devices or modules may be electrical, mechanical, or in other forms.
- each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
- the above integrated modules may be implemented in the form of hardware or software function modules.
- FIG. 8 shows a structural block diagram of the electronic device 100.
- the electronic device 100 may be a device capable of running application programs, such as a smartphone, a tablet computer, or an e-book reader.
- the electronic device 100 includes an electronic body portion 10 that includes a casing 12 and a main display screen 14 disposed on the casing 12.
- the main display screen 14 generally includes a display panel 111, and may also include a circuit for responding to a touch operation on the display panel 111 and the like.
- the display panel 111 may be a liquid crystal display (Liquid Crystal Display, LCD). In some embodiments, the display panel 111 is a touch screen 109 at the same time.
- the electronic device 100 can be used as a smartphone terminal.
- the electronic body 10 usually further includes one or more of the following (only one of each is shown in FIG. 8): a processor 102, a memory 104, a shooting module 108, an audio circuit 110, an input module 118, and a power module 122, as well as one or more application programs, which may be stored in the memory 104 and configured to be executed by the one or more processors 102 to perform the methods described in the foregoing method embodiments.
- FIG. 8 is merely an illustration and does not limit the structure of the electronic body portion 10.
- the electronic body portion 10 may further include more or fewer components than those shown in FIG. 8 or have a configuration different from that shown in FIG. 8.
- the processor 102 may include one or more processing cores.
- the processor 102 connects the various parts of the entire electronic device 100 using various interfaces and lines, and performs the various functions of the electronic device 100 and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory 104 and calling the data stored in the memory 104.
- the processor 102 may be implemented in at least one of the following hardware forms: digital signal processing (DSP), field-programmable gate array (FPGA), or programmable logic array (PLA).
- the processor 102 may integrate one or a combination of a central processing unit (CPU), a graphics processing unit (GPU), a modem, and the like.
- the CPU mainly handles the operating system, user interface, application programs, and the like;
- the GPU is responsible for rendering and drawing display content; and
- the modem handles wireless communication. It can be understood that the modem may not be integrated into the processor 102 and may instead be implemented by a separate communication chip.
- the memory 104 may include random access memory (RAM) or read-only memory (ROM).
- the memory 104 may be used to store instructions, programs, codes, code sets, or instruction sets.
- the memory 104 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playback function, or an image playback function), and instructions for implementing the following method embodiments.
- the data storage area may also store data created by the electronic device 100 during use (such as a phone book, audio and video data, and chat history), and the like.
- the shooting module 108 may be a camera, which is disposed on the electronic body 10 and is used to perform shooting tasks, for example, to take photos, videos, or make videophone calls.
- the audio circuit 110, the speaker 101, the sound jack 103, and the microphone 105 jointly provide an audio interface between the user and the electronic body portion 10 or the main display screen 14. Specifically, the audio circuit 110 receives sound data from the processor 102, converts the sound data into electrical signals, and transmits the electrical signals to the speaker 101. The speaker 101 converts electrical signals into sound waves that can be heard by the human ear. The audio circuit 110 also receives electrical signals from the microphone 105, converts the electrical signals into sound data, and transmits the sound data to the processor 102 for further processing.
- the input module 118 may include a touch screen 109 provided on the main display screen 14. The touch screen 109 may collect touch operations performed by the user on or near it (for example, operations performed by the user on or near the touch screen 109 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program.
- the input module 118 may further include other input devices, such as a key 107 or a microphone 105.
- the keys 107 may include, for example, character keys for inputting characters, and control keys for triggering control functions. Examples of control buttons include a "return to home screen" button, a power on / off button, and so on.
- the microphone 105 may be used to receive user's voice commands.
- the main display screen 14 is used to display information input by the user, information provided to the user, and various graphical user interfaces of the electronic body 10. These graphical user interfaces may be composed of graphics, text, icons, numbers, video, and any combination thereof. In one example, the touch screen 109 may be provided on the display panel 111 so as to form a whole with the display panel 111.
- the power supply module 122 is used to provide power supply to the processor 102 and other components.
- the power module 122 may include a power management device, one or more power sources (such as a battery or alternating current), a charging circuit, a power failure detection circuit, an inverter, a power status indicator, and any other components related to the generation, management, and distribution of power within the electronic body 10 or the main display screen 14.
- the above-mentioned electronic device 100 is not limited to a smartphone terminal; it refers broadly to a computer device that can be used on the move. Specifically, the electronic device 100 refers to a mobile computer device equipped with an intelligent operating system.
- the electronic device 100 includes, but is not limited to, a smart phone, a smart watch, a notebook, a tablet computer, a POS machine, and even a car computer, etc.
- FIG. 9 shows a structural block diagram of a computer-readable storage medium provided by an embodiment of the present application.
- the computer readable storage medium 800 stores program code, and the program code can be called by a processor to execute the method described in the above method embodiments.
- the computer-readable storage medium 800 may be an electronic memory such as flash memory, EEPROM (Electrically Erasable Programmable Read Only Memory), EPROM, hard disk, or ROM.
- the computer-readable storage medium 800 includes a non-transitory computer-readable storage medium.
- the computer-readable storage medium 800 has storage space for the program code 810 that performs any of the method steps described above. The program code can be read from or written into one or more computer program products.
- the program code 810 may, for example, be compressed in an appropriate form.
- a "computer-readable storage medium" may be any device that can contain, store, communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
- computer-readable storage media include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), a fiber-optic device, and a portable compact disc read-only memory (CD-ROM).
- the computer-readable storage medium may even be paper or another suitable medium on which the program can be printed, because the program can be obtained electronically, for example, by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it as necessary, and then stored in computer memory.
- each part of the present application may be implemented by hardware, software, firmware, or a combination thereof.
- multiple steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution device.
- if implemented in hardware, as in another embodiment, any one or a combination of the following techniques known in the art may be used: a discrete logic circuit having logic gate circuits for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gate circuits, a programmable gate array (PGA), a field-programmable gate array (FPGA), and so on.
- each functional unit in each embodiment of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module.
- the above integrated modules may be implemented in the form of hardware or software function modules. If the integrated module is implemented in the form of a software function module and sold or used as an independent product, it can also be stored in a computer-readable storage medium.
- the storage medium mentioned above may be a read-only memory, a magnetic disk or an optical disk.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
Claims (20)
- A video processing method, applied to an electronic device, the method comprising: capturing a video and extracting a to-be-processed frame image of the video; performing blur processing on the to-be-processed frame image to obtain a blurred image; and determining encoding parameters and encoding the blurred image.
- The method according to claim 1, wherein performing blur processing on the to-be-processed frame image to obtain a blurred image comprises: extracting YUV data of the to-be-processed frame image; performing temporal noise reduction on the YUV data to obtain a noise-reduced image; and performing blur processing on the noise-reduced image to obtain the blurred image.
- The method according to claim 2, wherein performing blur processing on the noise-reduced image to obtain the blurred image comprises: determining the size of the noise-reduced image as the original size; reducing the size of the noise-reduced image to obtain a reduced image; and enlarging the reduced image to the original size to obtain the blurred image.
- The method according to claim 2, wherein extracting the YUV data of the to-be-processed frame image comprises: determining the format of the to-be-processed frame image; if the to-be-processed frame image is in YUV format, extracting the YUV data; and if the to-be-processed frame image is in RGB format, converting the to-be-processed frame image into YUV format and extracting the YUV data.
- The method according to claim 2, wherein performing temporal noise reduction on the YUV data to obtain a noise-reduced image comprises: distinguishing high-frequency color signals and low-frequency color signals in the YUV data; and filtering out the high-frequency color signals to obtain the noise-reduced image.
- The method according to claim 1, wherein determining encoding parameters and encoding the blurred image comprises: determining that the first frame image of the video is an I frame, and performing intra-frame encoding on the I frame.
- The method according to claim 6, wherein, when determining that the first frame image of the video is an I frame, the I-frame interval duration of the video is set to a specified duration.
- The method according to claim 7, wherein the video processing method is applied to video recording in a network-based application program, the video recording having a total duration limit; and the specified duration is greater than the total duration allowed for the video recording.
- The method according to claim 6, wherein determining encoding parameters and encoding the blurred image further comprises: determining that the video frames after the I frame are B frames and P frames, and performing inter-frame encoding on the video frames after the I frame.
- The method according to claim 9, wherein determining that the video frames after the I frame are B frames and P frames comprises: setting the video frames after the I frame to alternate between B frames and P frames in sequence.
- The method according to claim 10, wherein, in determining that the video frames after the I frame are B frames and P frames, the interval frequency of the B frames and the P frames is determined according to the shooting scene of the video.
- The method according to claim 10, wherein encoding the video frames after the I frame comprises: performing motion-scene determination on the video frames after the I frame; adaptively adjusting the types of the video frames after the I frame according to the result of the motion-scene determination; and performing inter-frame encoding on the video frames after the I frame according to their types.
- The method according to claim 12, wherein adaptively adjusting the types of the video frames after the I frame according to the result of the motion-scene determination comprises: if any video frame after the I frame is in a motion scene, determining that the video frame is a B frame; otherwise, determining that the video frame is a P frame.
- The method according to claim 13, wherein performing motion-scene determination on the video frames after the I frame comprises: obtaining a first coordinate of a specified feature of the current video frame in the image of the current video frame; obtaining a second coordinate of the specified feature in the image of the frame preceding the current video frame; and obtaining the difference between the first coordinate and the second coordinate, wherein if the difference is greater than a specified value, the current video frame is considered to be in a motion scene.
- The method according to claim 13, wherein performing motion-scene determination on the video frames after the I frame comprises: obtaining first image information of the current video frame; obtaining second image information of the frame preceding the current video frame; and obtaining the difference between the first image information and the second image information, wherein if the difference is greater than a specified value, the current video frame is considered to be in a motion scene.
- The method according to claim 9, wherein determining encoding parameters and encoding the blurred image further comprises: determining that the quantization parameter value of the to-be-processed frame ranges from 20 to 44, and encoding the blurred image.
- The method according to claim 1, wherein the video processing method is applied to video recording in a network-based application program, and the video processing method further comprises: automatically stopping video recording when the duration of the recorded video is greater than a preset value.
- A video processing device, applied to an electronic device, the video processing device comprising: a video acquisition module for capturing a video and extracting a to-be-processed frame image of the video; a preprocessing module for performing blur processing on the to-be-processed frame image to obtain a blurred image; and an encoding module for determining encoding parameters and encoding the blurred image.
- An electronic device, comprising: one or more processors; a memory; and one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs being configured to perform the method according to any one of claims 1-17.
- A computer-readable storage medium, wherein program code is stored in the computer-readable storage medium, and the program code can be called by a processor to perform the video processing method according to any one of claims 1-17.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201880098282.XA CN112805990A (zh) | 2018-11-15 | 2018-11-15 | 视频处理方法、装置、电子设备及计算机可读存储介质 |
PCT/CN2018/115753 WO2020097888A1 (zh) | 2018-11-15 | 2018-11-15 | 视频处理方法、装置、电子设备及计算机可读存储介质 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2018/115753 WO2020097888A1 (zh) | 2018-11-15 | 2018-11-15 | 视频处理方法、装置、电子设备及计算机可读存储介质 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020097888A1 true WO2020097888A1 (zh) | 2020-05-22 |
Family
ID=70730739
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/115753 WO2020097888A1 (zh) | 2018-11-15 | 2018-11-15 | 视频处理方法、装置、电子设备及计算机可读存储介质 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112805990A (zh) |
WO (1) | WO2020097888A1 (zh) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113613024B (zh) * | 2021-08-09 | 2023-04-25 | 北京金山云网络技术有限公司 | 视频预处理方法及设备 |
CN115396672B (zh) * | 2022-08-25 | 2024-04-26 | 广东中星电子有限公司 | 比特流存储方法、装置、电子设备和计算机可读介质 |
CN118646930A (zh) * | 2024-08-16 | 2024-09-13 | 浙江嗨皮网络科技有限公司 | 基于网络信号强度的视频背景处理方法、系统及存储介质 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100394883C (zh) * | 2005-12-02 | 2008-06-18 | 清华大学 | 无线内窥镜系统的准无损图像压缩和解压缩方法 |
JP4613990B2 (ja) * | 2008-07-31 | 2011-01-19 | ソニー株式会社 | 画像処理装置、画像処理方法、プログラム |
CN102546917B (zh) * | 2010-12-31 | 2014-10-22 | 联想移动通信科技有限公司 | 带摄像头的移动终端及其视频处理方法 |
CN105103554A (zh) * | 2013-03-28 | 2015-11-25 | 华为技术有限公司 | 用于保护视频帧序列防止包丢失的方法 |
CN103702016B (zh) * | 2013-12-20 | 2017-06-09 | 广东威创视讯科技股份有限公司 | 视频降噪方法及装置 |
CN104661023B (zh) * | 2015-02-04 | 2018-03-09 | 天津大学 | 基于预失真和训练滤波器的图像或视频编码方法 |
-
2018
- 2018-11-15 CN CN201880098282.XA patent/CN112805990A/zh active Pending
- 2018-11-15 WO PCT/CN2018/115753 patent/WO2020097888A1/zh active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130107066A1 (en) * | 2011-10-27 | 2013-05-02 | Qualcomm Incorporated | Sensor aided video stabilization |
US20140270568A1 (en) * | 2013-03-14 | 2014-09-18 | Drs Rsta, Inc. | Method and system for noise reduction in video systems |
CN104966266A (zh) * | 2015-06-04 | 2015-10-07 | 福建天晴数码有限公司 | 自动模糊身体部位的方法及系统 |
CN105825490A (zh) * | 2016-03-16 | 2016-08-03 | 北京小米移动软件有限公司 | 图像的高斯模糊方法及装置 |
CN107797783A (zh) * | 2017-10-25 | 2018-03-13 | 广东欧珀移动通信有限公司 | 控制方法、控制装置和计算机可读存储介质 |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111698512A (zh) * | 2020-06-24 | 2020-09-22 | 北京达佳互联信息技术有限公司 | 视频处理方法、装置、设备及存储介质 |
CN113298723A (zh) * | 2020-07-08 | 2021-08-24 | 阿里巴巴集团控股有限公司 | 视频处理方法、装置、电子设备及计算机存储介质 |
CN114501001A (zh) * | 2020-10-26 | 2022-05-13 | 国家广播电视总局广播电视科学研究院 | 视频编码方法、装置及电子设备 |
CN112351285A (zh) * | 2020-11-04 | 2021-02-09 | 北京金山云网络技术有限公司 | 视频编码、解码方法和装置、电子设备和存储介质 |
CN112351285B (zh) * | 2020-11-04 | 2024-04-05 | 北京金山云网络技术有限公司 | 视频编码、解码方法和装置、电子设备和存储介质 |
CN113766322A (zh) * | 2021-01-18 | 2021-12-07 | 北京京东拓先科技有限公司 | 一种图像获取方法、装置、电子设备和存储介质 |
CN113066139A (zh) * | 2021-03-26 | 2021-07-02 | 西安万像电子科技有限公司 | 图片处理方法和装置、存储介质及电子设备 |
CN114302139A (zh) * | 2021-12-10 | 2022-04-08 | 阿里巴巴(中国)有限公司 | 视频编码方法、视频解码方法及装置 |
CN114390236A (zh) * | 2021-12-17 | 2022-04-22 | 云南腾云信息产业有限公司 | 视频处理方法、装置、计算机设备和存储介质 |
CN115550660A (zh) * | 2021-12-30 | 2022-12-30 | 北京智美互联科技有限公司 | 网络视频局部可变压缩方法和系统 |
CN115550660B (zh) * | 2021-12-30 | 2023-08-22 | 北京国瑞数智技术有限公司 | 网络视频局部可变压缩方法和系统 |
CN114401405A (zh) * | 2022-01-14 | 2022-04-26 | 安谋科技(中国)有限公司 | 一种视频编码方法、介质及电子设备 |
CN114630124A (zh) * | 2022-03-11 | 2022-06-14 | 商丘市第一人民医院 | 一种神经内窥镜备份方法及系统 |
CN114630057B (zh) * | 2022-03-11 | 2024-01-30 | 北京字跳网络技术有限公司 | 确定特效视频的方法、装置、电子设备及存储介质 |
CN114630124B (zh) * | 2022-03-11 | 2024-03-22 | 商丘市第一人民医院 | 一种神经内窥镜备份方法及系统 |
CN114630057A (zh) * | 2022-03-11 | 2022-06-14 | 北京字跳网络技术有限公司 | 确定特效视频的方法、装置、电子设备及存储介质 |
CN114640852A (zh) * | 2022-03-21 | 2022-06-17 | 湖南快乐阳光互动娱乐传媒有限公司 | 视频帧对齐方法及装置 |
CN114900736A (zh) * | 2022-03-28 | 2022-08-12 | 网易(杭州)网络有限公司 | 视频生成方法、装置和电子设备 |
CN117395381A (zh) * | 2023-12-12 | 2024-01-12 | 上海卫星互联网研究院有限公司 | 一种遥测数据的压缩方法、装置及设备 |
CN117395381B (zh) * | 2023-12-12 | 2024-03-12 | 上海卫星互联网研究院有限公司 | 一种遥测数据的压缩方法、装置及设备 |
Also Published As
Publication number | Publication date |
---|---|
CN112805990A (zh) | 2021-05-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020097888A1 (zh) | 视频处理方法、装置、电子设备及计算机可读存储介质 | |
CN105472205B (zh) | 编码过程中的实时视频降噪方法和装置 | |
US20110026591A1 (en) | System and method of compressing video content | |
CN102484710B (zh) | 用于像素内插的系统及方法 | |
US11627369B2 (en) | Video enhancement control method, device, electronic device, and storage medium | |
CN108337465B (zh) | 视频处理方法和装置 | |
KR102558385B1 (ko) | 비디오 증강 제어 방법, 장치, 전자 기기 및 저장 매체 | |
CN113099233B (zh) | 视频编码方法、装置、视频编码设备及存储介质 | |
US9619887B2 (en) | Method and device for video-signal processing, transmitter, corresponding computer program product | |
WO2018196864A1 (zh) | 图像预测方法和相关产品 | |
US20200021822A1 (en) | Image Filtering Method and Apparatus | |
WO2021073449A1 (zh) | 基于机器学习的去伪影方法、去伪影模型训练方法及装置 | |
CN103517072A (zh) | 视频通信方法和设备 | |
US10674163B2 (en) | Color space compression | |
CN114554212A (zh) | 视频处理装置及方法、计算机存储介质 | |
CN113709504B (zh) | 图像处理方法、智能终端及可读存储介质 | |
WO2024156269A1 (zh) | 处理方法、处理设备及存储介质 | |
WO2024187645A1 (zh) | 处理方法、处理设备及存储介质 | |
CN115623215B (zh) | 一种播放视频的方法、电子设备和计算机可读存储介质 | |
JPH1051770A (ja) | 画像符号化システム及び方法、及び画像分割システム | |
US20200106821A1 (en) | Video processing apparatus, video conference system, and video processing method | |
WO2022179600A1 (zh) | 视频编码方法、视频解码方法、装置及电子设备 | |
US12058312B2 (en) | Generative adversarial network for video compression | |
WO2020181540A1 (zh) | 一种视频处理方法、装置、编码设备及解码设备 | |
CN113989136A (zh) | 清晰度增强方法、终端及存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18939873 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18939873 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 01/10/2021) |