CN114449354A - Video editing method and system - Google Patents


Info

Publication number
CN114449354A
Authority
CN
China
Legal status: Granted
Application number: CN202210115612.XA
Other languages: Chinese (zh)
Other versions: CN114449354B (en)
Inventors: 钮圣虓, 卞琛毓
Current Assignee: Shanghai Hode Information Technology Co Ltd
Original Assignee: Shanghai Hode Information Technology Co Ltd
Application filed by Shanghai Hode Information Technology Co Ltd
Priority to CN202210115612.XA
Publication of CN114449354A
Application granted; publication of CN114449354B
Legal status: Active

Classifications

    • H04N 21/47205 — End-user interface for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • H04N 21/44 — Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream
    • H04N 21/816 — Monomedia components involving special video data, e.g. 3D video
    • H04N 5/262 — Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

The application provides a video editing method comprising: determining a video to be edited, wherein the video to be edited comprises a plurality of video frames; receiving a color adjustment instruction for the video to be edited; editing a primary color LUT (look-up table) map associated with the video to be edited according to the color adjustment instruction to obtain a target LUT map; and, in response to an export instruction, obtaining a target video according to the target LUT map and the plurality of video frames and exporting it. The application also provides a video editing system, a computer device, and a computer-readable storage medium. The technical scheme provided by the application reduces computing-resource consumption during video export, shortens export time, and improves the user experience.

Description

Video editing method and system
Technical Field
The present application relates to media technologies, and in particular, to a video editing method, system, computer device, and computer-readable storage medium.
Background
With the development of computer technology, services such as video playback have become popular network services. To further improve playback quality, video editing has become an indispensable part of video production. To meet the public's video editing needs, terminals (such as smartphones and tablet computers) are equipped with video editing programs.
Video editing typically involves a number of adjustment items, such as adjusting the brightness, contrast, and saturation of video frames. When the video editing program finally exports the edited video, the processor applies the adjustment effects added by the user during editing to every frame of the video. Because the computation for superimposed adjustment items is incurred again for each frame exported, the more adjustment items there are, the longer the video export takes, which degrades the user experience.
Disclosure of Invention
The present application is directed to a video editing method, system, computer device and computer-readable storage medium for solving the above problems.
One aspect of an embodiment of the present application provides a video editing method, including:
determining a video to be edited, wherein the video to be edited comprises a plurality of video frames;
receiving a color adjustment instruction for a video to be edited;
editing a primary color LUT (look-up table) map associated with the video to be edited according to the color adjustment instruction to obtain a target LUT map; and
in response to an export instruction, obtaining a target video according to the target LUT map and the plurality of video frames and exporting the target video.
Optionally, the editing, according to the color adjustment instruction, the primary color LUT map associated with the video to be edited to obtain a target LUT map includes:
generating a LUT map based on a color adjustment algorithm and the primary color LUT map in response to the color adjustment instruction; and
saving the LUT map, wherein the LUT map is configured as the target LUT map.
Optionally, the editing, according to the color adjustment instruction, the primary color LUT map associated with the video to be edited to obtain a target LUT map includes:
monitoring for parameter updates of the color adjustment algorithm;
executing one LUT map generation operation each time a parameter update is detected, wherein the LUT map generation operation comprises: generating a new LUT map based on the updated color adjustment algorithm and the primary color LUT map, and saving the new LUT map; and
updating the target LUT map to the newest LUT map, i.e. the one produced by the most recent LUT map generation operation.
Optionally, the method further comprises:
executing a preview operation each time a new LUT map is obtained, wherein the preview operation comprises: generating a video preview picture from the newly obtained LUT map and the currently displayed video frame; and displaying the video preview picture on a display interface.
Optionally, the method further comprises:
generating a corresponding parameter file each time a parameter update is detected; and
establishing and saving a mapping relation between each LUT map and its corresponding parameter file for subsequent effect restoration.
Optionally, the method further comprises:
and providing the primary color LUT map based on a first preset strategy according to the adjustment items and/or equipment parameters of the color adjustment instruction.
Optionally, the obtaining and exporting of a target video according to the target LUT map and the plurality of video frames in response to an export instruction comprises:
obtaining the color adjustment effect of the target LUT map through a color lookup table; and
loading the color adjustment effect onto each of the plurality of video frames to obtain and export the target video.
Optionally, the video to be edited comprises a plurality of video segments, and the method further comprises:
configuring a different primary color LUT map for each video segment according to a second preset strategy;
when editing a video segment to be edited among the plurality of video segments, adjusting the primary color LUT map corresponding to that segment to obtain a target video-segment LUT map, wherein the video segment to be edited is one of the plurality of video segments; and
when exporting the video segment to be edited, loading the color adjustment effect of the target video-segment LUT map onto each video frame of that segment, so as to obtain an exportable video segment loaded with the color adjustment effect.
An aspect of an embodiment of the present application further provides a video editing system, including:
a determining module, configured to determine a video to be edited, wherein the video to be edited comprises a plurality of video frames;
a receiving module, configured to receive a color adjustment instruction for the video to be edited;
an editing module, configured to edit the primary color LUT map associated with the video to be edited according to the color adjustment instruction to obtain a target LUT map; and
an export module, configured to, in response to an export instruction, obtain a target video according to the target LUT map and the video frames and export the target video.
An aspect of an embodiment of the present application further provides a video editing method, where the method includes:
providing a video editor, the video editor comprising a picture editing control and an export control;
importing a video to be edited into a video editor, wherein the video to be edited comprises a plurality of video frames;
displaying the video to be edited in the video editor;
responding to the triggering of the picture editing control, and adjusting a primary color LUT (look-up table) graph associated with the video to be edited to obtain a target LUT graph;
in response to triggering the export control, a target video is obtained and exported according to the target LUT map and the plurality of video frames.
Optionally, the method further comprises:
generating a video preview picture from the target LUT map and the video frame currently displayed by the video editor; and
displaying the video preview picture in a video picture display area or a preview picture display area of the video editor.
Optionally, the method further comprises:
in a video track of the video editor, segmenting the video to be edited into a plurality of video segments;
configuring a primary color LUT map for each video segment according to a preset strategy;
editing each primary color LUT map to obtain a plurality of video-segment LUT maps corresponding to the plurality of video segments; and
loading the color adjustment effect of each video-segment LUT map into each video frame of the corresponding video segment in response to triggering the export control, wherein each video frame within a given video segment is associated with that segment's LUT map.
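The per-segment flow above can be sketched as follows. This is a toy model with hypothetical segment names and adjustments; 1-D 256-entry tables stand in for LUT images, and lists of 8-bit values stand in for frames.

```python
# Toy model of per-segment color adjustment: each video segment is
# associated with its own LUT, and at export every frame of a segment
# is transformed by that segment's LUT only.

def make_lut(adjust):
    """Bake one adjustment function into a 256-entry LUT."""
    return [max(0, min(255, int(adjust(v)))) for v in range(256)]

segment_luts = {
    "scenery": make_lut(lambda v: v + 10),  # slight brighten
    "night": make_lut(lambda v: v * 1.3),   # lift exposure
}

# Frames are lists of 8-bit values here; real frames would be images.
segments = {"scenery": [[100, 200]], "night": [[50, 150]]}

exported = {
    name: [[segment_luts[name][p] for p in frame] for frame in frames]
    for name, frames in segments.items()
}
print(exported)  # {'scenery': [[110, 210]], 'night': [[65, 195]]}
```

Note that each segment's LUT is built once, so the export loop itself performs only table lookups.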
An aspect of the embodiments of the present application further provides a computer device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the video editing method as described above when executing the computer program.
An aspect of the embodiments of the present application further provides a computer-readable storage medium, in which a computer program is stored, the computer program being executable by at least one processor to cause the at least one processor to execute steps implementing the video editing method as described above.
In the video editing method, system, computer device, and computer-readable storage medium provided in the embodiments of the present application, color adjustment during editing does not act directly on the video frames of the video to be edited. Instead, it acts first on a primary color LUT map, and the resulting LUT map with the color adjustment effect (the target LUT map) is saved; this LUT map stores all of the color adjustment effects.
When the target video is exported, the color adjustment effect of the target LUT map is restored and loaded onto each video frame of the video to be edited, finally producing and exporting a target video carrying the color adjustment effect.
In other words, during export only the target LUT map's color adjustment effect is loaded onto each video frame; the color adjustment algorithms themselves take no part in the export process, so adding multiple adjustment effects does not increase export time and hardly affects the adjustment result. Consumption of computing resources during export is therefore reduced, export time is shortened, and the user experience is improved.
Drawings
Fig. 1 schematically shows an application environment diagram of a video editing method according to an embodiment of the present application;
fig. 2 schematically shows a flow chart of a video editing method according to a first embodiment of the present application;
fig. 3 schematically shows a comparison diagram between a primary color LUT diagram and a target LUT diagram;
fig. 4 schematically shows another flowchart of the video editing method according to the first embodiment of the present application;
FIG. 5 is a flowchart illustrating sub-steps of step S206 in FIG. 2;
FIG. 6 is a flowchart illustrating sub-steps of step S204 in FIG. 2;
FIG. 7 is a flowchart illustrating another sub-step of step S204 in FIG. 2;
FIG. 8 is a flowchart illustrating another sub-step of step S204 in FIG. 2;
fig. 9 schematically shows another flowchart of the video editing method according to the first embodiment of the present application;
fig. 10 schematically shows another flowchart of the video editing method according to the first embodiment of the present application;
fig. 11 schematically shows a flow chart of a video editing method according to a second embodiment of the present application;
fig. 12 schematically shows a block diagram of a video editing system according to a third embodiment of the present application;
fig. 13 schematically shows a block diagram of a video editing system according to a fourth embodiment of the present application;
fig. 14 schematically shows a hardware architecture diagram of a computer device suitable for implementing the video editing method according to the fifth embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that descriptions involving "first", "second", etc. in the embodiments of the present application are for description only and are not to be construed as indicating or implying relative importance or the number of technical features. A feature qualified by "first" or "second" may thus explicitly or implicitly include at least one such feature. In addition, the technical solutions of the various embodiments may be combined with one another, provided the combination can be realized by a person skilled in the art; where combined solutions are contradictory or cannot be realized, the combination should be considered not to exist and falls outside the protection scope of the present application.
The inventor found that in mobile-end video clipping there is demand for color adjustments to video frames such as brightness, contrast, and saturation, and that these adjustment items correspond to different image algorithms. Because many adjustment items are selectable, a user usually enables several at once in order to reach a satisfactory effect. When multiple adjustment items are enabled, the related adjustment algorithms run in superposition and their occupation of computing resources accumulates accordingly: every color adjustment item the mobile-end processor applies to a video frame adds computation time. The last step in video clipping software is usually to export the edited video, during which the processor processes every frame according to the adjustment effects added by the user. Since the accumulated computation for the superimposed color adjustment effects is incurred again for every frame exported, the more color adjustment items are used, the more the video export time grows, which harms the user experience.
The inventors appreciated that the growth in total time consumption caused by stacking effects can be reduced by optimizing a single adjustment algorithm, but because adjustment quality must be traded off, the room for optimization at the image-algorithm level alone is limited. For example, one might optimize only a single adjustment algorithm, such as the brightness algorithm if brightness adjustment is found to be slow; if the goal is still not reached, the algorithm's time consumption can be further reduced by sacrificing adjustment quality, or the growth in time consumption can be constrained by directly limiting the number of adjustable items. However, this kind of approach quickly hits a bottleneck: the optimization space is very small, adding adjustment items still increases time consumption, and the problem is not fundamentally solved.
In view of this, embodiments of the present application provide a video clipping scheme for fast, global color adjustment of a video in a mobile-end video clipping scenario. It thoroughly solves the problem that adding multiple adjustment effects keeps increasing export time, while leaving the adjustment effects essentially unaffected: compared with adding a single adjustment effect, adding multiple adjustment effects causes virtually no additional increase in export time.
The following provides technical term explanations of the embodiments of the present application:
color adjustment algorithm: and a class of algorithms for uniformly transforming colors or gray values of the image according to a certain functional relation curve, such as brightness adjustment, contrast adjustment, saturation adjustment, color temperature adjustment, hue adjustment, highlight adjustment, shadow adjustment and the like.
LUT (Look Up Table, color lookup Table): the saved color adjustment effect is restored by loading the LUT map with color effects and applied to the target image. The LUT (Look-Up Table) can be applied to a mapping Table of pixel gray-level values, which transforms the actually sampled pixel gray-level values into another corresponding gray-level value through certain transformation, such as threshold, inversion, binarization, contrast adjustment, linear transformation, etc., so as to highlight the useful information of the image and enhance the optical contrast of the image.
LUT graph: an image capable of recording and saving a color adjustment effect.
Primary color LUT map: a special LUT map that contains no color adjustment effect; loading it onto a target image via the color lookup table technique leaves the target image unchanged. Its other key property is that applying a color adjustment algorithm directly to the primary color LUT map yields a LUT map with effects that preserves all the effects of that algorithm.
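The two properties of the primary color LUT map can be illustrated with a toy pure-Python sketch, where a 1-D 256-entry table stands in for a LUT image and a short list of 8-bit values stands in for a frame; the brightness function and its values are illustrative, not taken from the patent.

```python
# A "primary color" LUT is the identity mapping: loading it changes nothing.
identity = list(range(256))

def apply_lut(lut, pixels):
    """Restore a saved effect by table lookup, one index per pixel."""
    return [lut[p] for p in pixels]

frame = [10, 100, 200]
assert apply_lut(identity, frame) == frame       # property 1: no change

# Property 2: applying an adjustment algorithm to the LUT itself yields a
# LUT that stores the full effect of that algorithm.
def brighten(v, delta=30):                       # toy brightness algorithm
    return min(255, v + delta)

target_lut = [brighten(v) for v in identity]     # effect baked into the table
print(apply_lut(target_lut, frame))              # [40, 130, 230]
```

The same lookup mechanism that leaves the frame untouched under the identity table reproduces the baked-in effect under the adjusted table.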
An exemplary application environment for embodiments of the present application is provided below.
Fig. 1 schematically shows an application environment diagram of a video editing method according to an embodiment of the present application.
The computer device 10000 may be a terminal device such as a smartphone, a tablet device, a PC (personal computer), or the like. The computer device 10000 can be equipped with a video editor for providing a video editing service. The video editor may provide a graphical user interface for video editing. The video editor may be a client program, a browser, etc.
The video editor may output (e.g., display, render, present) the video to be edited to a user.
The video editor may also provide import and export controls and picture editing controls (such as brightness, contrast, highlight, shadow, and saturation adjustment controls). Using these controls, the user can adjust parameters through gestures or text input, e.g. changing the contrast to 60 with a drag gesture. The interface shown in fig. 1 is merely schematic; an actual video editor interface may be laid out differently.
The video editing scheme will be described below by way of various embodiments. The scheme may be implemented by a computer device 10000.
In the description of the present application, it should be understood that the numerical references before the steps do not identify the order of performing the steps, but merely serve to facilitate the description of the present application and to distinguish each step, and therefore should not be construed as limiting the present application.
Example one
Fig. 2 schematically shows a flowchart of a video editing method according to a first embodiment of the present application.
As shown in fig. 2, the video editing method may include steps S200 to S206, in which:
step S200, determining a video to be edited, wherein the video to be edited comprises a plurality of video frames.
The video to be edited may be a video manuscript in any of various formats, such as AVI (Audio Video Interleave), H.264/AVC (Advanced Video Coding), or H.265/HEVC (High Efficiency Video Coding). The video to be edited may contain footage of various scenes, such as people, objects, and natural scenery.
Owing to the influence of the shooting scene, shooting technique, and the like, the video to be edited generally falls short of requirements and needs brightness, contrast, saturation, color temperature, hue, highlight, shadow, and similar adjustments.
In this embodiment, after determining the video to be edited, the computer device 10000 decodes it with a video decoder to obtain a plurality of video frames, onto each of which a color adjustment effect can be loaded.
Step S202, receiving a color adjustment instruction for a video to be edited.
The video to be edited can be edited in an editing page of a special video editor or browser.
Taking a video editor as an example, it may include import controls, export controls, color adjustment controls, and the like.
And the import control is used for importing the video to be edited so as to implement editing operation in the video editor.
And the export control is used for exporting the edited video.
The color adjustment control may include a plurality of adjustment items that may be used to trigger brightness adjustment, contrast adjustment, saturation adjustment, color temperature adjustment, hue adjustment, highlight adjustment, shadow adjustment, and the like.
The user may adjust one or more of the plurality of adjustment items via the color adjustment control, thereby generating the color adjustment instruction. As an example, different color adjustments, i.e. different color adjustment instructions, may be applied to different video frames or video segments in the video to be edited. For example, when the video shows natural scenery, the brightness and saturation can be raised; when the video shows a night scene, the exposure can be increased.
And step S204, editing the primary color LUT associated with the video to be edited according to the color adjusting instruction to obtain a target LUT.
Rather than adjusting the video to be edited directly, this embodiment applies the color adjustment to the primary color LUT map, and the adjusted LUT map (i.e., the target LUT map) stores the color adjustment effect. Moreover, not only can a single color adjustment effect be saved; the superposition of multiple color adjustment effects can be saved as well. Fig. 3 provides a comparison of the primary color LUT map and the target LUT map; the target LUT map carries the color adjustment effect.
The primary color LUT map may be one with a preset specification or a specific one adapted to the video to be edited.
As an alternative embodiment, as shown in fig. 4, the method may further include: step S400, providing the primary color LUT map based on a first preset strategy according to the adjustment item and/or device parameter of the color adjustment instruction.
Illustratively, the first preset strategy is to provide primary color LUT maps of different resolutions based on the number of adjustment items and the device parameters. For example: (1) if only a very simple adjustment is involved, e.g. only brightness is adjusted, a 64x64 primary color LUT map is provided; (2) if more than a preset number of adjustment items are involved, a 512x512 primary color LUT map is provided; (3) if the device parameters (e.g., memory, CPU) exceed preset values, a 1024x1024 primary color LUT map is provided. These examples illustrate primary color LUT maps provided in several cases and are not intended to limit the scope of the present application; other arrangements for providing different primary color LUT maps are also possible. A low-resolution primary color LUT map improves processing speed and occupies fewer computing resources, which is especially suitable for mobile terminals with weak hardware, while a high-resolution primary color LUT map improves the accuracy of effect restoration. In this embodiment, a primary color LUT map of the appropriate resolution is provided adaptively according to the adjustment items and/or device parameters of the color adjustment instruction, balancing processing (export) speed against color quality.
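The strategy above can be sketched as a small selection function. This is a minimal sketch: the thresholds, parameter names, and the use of memory size as the device parameter are assumptions for illustration, not values from the patent.

```python
# Hypothetical resolution-selection strategy: thresholds are made up.
def pick_lut_resolution(n_adjust_items, memory_gb,
                        many_items=3, strong_memory_gb=6):
    if memory_gb >= strong_memory_gb:
        return (1024, 1024)   # capable device: favor restoration accuracy
    if n_adjust_items > many_items:
        return (512, 512)     # many stacked adjustment items
    return (64, 64)           # simple edit / weak device: favor speed

print(pick_lut_resolution(1, 4))   # (64, 64): brightness-only on a weak phone
print(pick_lut_resolution(5, 4))   # (512, 512): many items
print(pick_lut_resolution(1, 8))   # (1024, 1024): strong device
```

The ordering of the checks encodes the trade-off: device capability is allowed to override item count, so a strong device always gets the high-fidelity table.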
Step S206, in response to an export instruction, obtaining and exporting the target video according to the target LUT map and the plurality of video frames.
The color adjustment does not act directly on each video frame of the video to be edited; it acts on the primary color LUT map, yielding the target LUT map (a LUT map with the color adjustment effect). When the target video is exported, the color adjustment effect of the target LUT map is loaded onto each video frame of the video to be edited, finally producing the target video.
As an example, to ensure that the color adjustment effect is restored effectively so that the target video desired by the user can be exported, as shown in fig. 5, step S206 may include steps S500 to S502, wherein: step S500, obtaining the color adjustment effect of the target LUT map through a color lookup table; step S502, loading the color adjustment effect onto each of the plurality of video frames to obtain and export the target video. For example, the LUT map with the effect (the target LUT map) is applied to each video frame of the video to be edited via the color lookup table, so that the target LUT map's color adjustment effect is loaded onto every frame.
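Steps S500–S502 can be sketched as follows. This is a toy stand-in: frames are lists of 8-bit values rather than decoded images, and the baked effect in the target LUT is hypothetical. The point is that the per-frame work at export time is only a table lookup per pixel; no adjustment algorithm runs.

```python
# Export step: load the target LUT's color adjustment effect onto every frame.
def export_video(frames, target_lut):
    """One lookup per pixel, regardless of how many effects were baked in."""
    return [[target_lut[p] for p in frame] for frame in frames]

target_lut = [min(255, v + 20) for v in range(256)]  # hypothetical baked effect
frames = [[0, 50], [200, 255]]
print(export_video(frames, target_lut))  # [[20, 70], [220, 255]]
```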
The video editing method provided by the embodiment of the application has the following advantages:
in the video editing process, the color adjustment is not directly applied to the video frame of the video to be edited, but is firstly applied to one primary color LUT (look up table) diagram, and the LUT diagram (target LUT diagram) with the color adjustment effect generated after the application is stored, and at the moment, the LUT diagram with the color adjustment effect stores all the effects of the color adjustment.
And when the target video is exported, restoring the color adjustment effect of the target LUT diagram, and loading the color adjustment effect to each video frame of the video to be edited, thereby finally obtaining the target video loaded with the color adjustment effect and exporting the target video.
That is, when the video is exported, only the color adjustment effect of the target LUT map is loaded onto each video frame of the video to be edited; the color adjustment algorithms themselves do not participate in the export process. Superimposing multiple adjustment effects therefore adds virtually no export time and does not degrade the adjustment result: the export time is essentially the same whether one color adjustment effect or many are applied.
In an alternative embodiment, as shown in fig. 6, step S204 may be implemented by the following steps: step S600, in response to the color adjustment instruction, generating an LUT map based on a color adjustment algorithm and the primary color LUT map; and step S602, saving the LUT map, where the LUT map is configured as the target LUT map. That is, the colors or gray values of the primary color LUT map are transformed uniformly by a color adjustment algorithm, such as brightness, contrast, saturation, color temperature, hue, highlight, or shadow adjustment. Different adjustment items correspond to different color adjustment algorithms: brightness adjustment corresponds to one algorithm, and contrast adjustment to another. When the color adjustment instruction targets a single adjustment item, the color adjustment is executed by the algorithm corresponding to that item. When the instruction targets multiple adjustment items, the corresponding color adjustment algorithms are superimposed. In practice, multiple adjustment items are usually enabled simultaneously to reach a satisfactory result.
Taking several adjustment items as an example, the corresponding color adjustment algorithms may all be applied to the primary color LUT map, yielding an LUT map with the color adjustment effect (the target LUT map). During export, the color adjustment effect of the target LUT map is loaded onto each video frame of the video to be edited. That is, the color adjustment algorithms are never applied directly to the video frames, which avoids the following problem: if multiple color adjustment algorithms acted one by one on every video frame during export, computing resources would be heavily consumed and the export time greatly increased. User experience is thereby effectively improved.
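The superposition of several adjustment algorithms on the primary color LUT map can be sketched as follows. The adjustment formulas and the 17-point LUT size are minimal assumptions for illustration; real editors use more elaborate color science, but the composition pattern is the same.

```python
import numpy as np

# Hypothetical adjustment algorithms, each operating on a whole LUT at once.
def brightness(lut, amount):
    return np.clip(lut + amount, 0.0, 1.0)

def contrast(lut, amount):
    return np.clip((lut - 0.5) * amount + 0.5, 0.0, 1.0)

def make_primary_lut(size=17):
    """Identity 3D LUT serving as the primary color LUT map."""
    grid = np.linspace(0.0, 1.0, size)
    r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
    return np.stack([r, g, b], axis=-1)

primary = make_primary_lut()
# Several enabled adjustment items are superimposed on the LUT, not on frames.
target = contrast(brightness(primary, 0.1), 1.2)

# Cost comparison: the adjustment math touches size**3 LUT entries once,
# instead of width*height pixels for every frame of the video.
lut_entries = 17 ** 3            # 4,913 entries, adjusted once
per_frame_pixels = 1920 * 1080   # ~2 million pixels, per frame, per algorithm
```

However many adjustments are chained, the result is still one target LUT map, so the export-time cost stays constant.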
It should be noted that, besides color adjustment algorithms, other algorithms, such as a correction algorithm, may also be applied to the primary color LUT map.
In an alternative embodiment, as shown in fig. 7, the step S204 may further include:
step S700, monitoring the parameter update of the color adjusting algorithm;
step S702, executing an LUT map generation operation each time a parameter update is monitored, where the LUT map generation operation includes: generating a new LUT map based on the parameter-updated color adjustment algorithm and the primary color LUT map, and saving the new LUT map; and
step S704, updating the target LUT map to the last new LUT map, i.e., the one corresponding to the last LUT map generation operation.
In an exemplary application, a user typically makes multiple rounds of adjustment until the desired result is reached. In this alternative embodiment, when the user updates parameters of the color adjustment or adds a new effect, these update operations are applied to the primary color LUT map to regenerate a new LUT map, and the last new LUT map serves as the target LUT map. During export, the color adjustment effect of the last new LUT map is loaded onto each video frame of the video to be edited. Because every new LUT map is generated from the primary color LUT map, effect backtracking is straightforward.
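The regenerate-from-primary update loop described above can be sketched as follows. The class, method names, and the `brighten` adjustment are hypothetical, not from the application.

```python
import numpy as np

class LutSession:
    """On every parameter update, regenerate the effect LUT from the
    untouched primary LUT map, keeping each generated LUT for backtracking."""

    def __init__(self, primary_lut):
        self.primary = primary_lut
        self.history = []  # every generated LUT map, oldest first

    def on_parameter_update(self, adjustments):
        """adjustments: list of (func, params) applied to a fresh copy of
        the primary LUT map, never to the previous effect LUT."""
        lut = self.primary.copy()
        for func, params in adjustments:
            lut = func(lut, **params)
        self.history.append(lut)
        return lut

    @property
    def target_lut(self):
        """The target LUT map is always the last newly generated LUT map."""
        return self.history[-1] if self.history else self.primary

def brighten(lut, amount=0.0):  # hypothetical color adjustment algorithm
    return np.clip(lut + amount, 0.0, 1.0)

session = LutSession(np.zeros((17, 17, 17, 3)))
session.on_parameter_update([(brighten, {"amount": 0.1})])
session.on_parameter_update([(brighten, {"amount": 0.2})])
```

Because each update starts again from the primary LUT map, an earlier effect can be restored simply by picking an older entry from `history`.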
In an alternative embodiment, as shown in fig. 8, step S204 may further include: step S800, each time a new LUT map is obtained, executing a preview operation, where the preview operation includes: generating a video preview picture according to the newly obtained LUT map and the currently displayed video frame; and displaying the video preview picture on a display interface.
During repeated editing, the user sees only the adjustment items in the video editor, such as the brightness, contrast, and saturation values shown in fig. 1, and never the continuously updated LUT map data behind them. Each adjustment is therefore previewed by loading the effect through the color lookup table. Visually, the whole adjustment process looks as though the color adjustment algorithms act directly on the video frame; internally, however, each algorithm first acts on the primary color LUT map to generate an LUT map with the effect, and the color lookup table then loads that LUT map to restore the color adjustment effect. Previewing the color adjustment effect of the target LUT map immediately in this way facilitates user operation.
In an alternative embodiment, as shown in fig. 9, the video editing method may further include:
step S900, generating a corresponding parameter file each time a parameter update is monitored;
and step S902, establishing a mapping relationship between each LUT map and its corresponding parameter file, and saving the mapping relationship for subsequent effect restoration.
Both the primary color LUT map and the LUT maps with effects are stored in memory. In an exemplary application, the parameter files generated as the user operates the adjustment items in the video editor are saved together with the LUT maps generated during editing, and a mapping relationship is established between each LUT map and each parameter file. When the user wants to reuse historical adjustment data, the parameter file and the LUT map can be conveniently retrieved: the parameters of each adjustment item are restored on the video editor interface from the retrieved parameter file, and the retrieved LUT map is applied for preview. The historical color adjustment effect is thus restored efficiently, effectively improving the user's operating experience.
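The LUT-to-parameter-file mapping can be sketched as follows. The file names, the JSON format, and the key names are illustrative assumptions; the application does not specify a storage format.

```python
import json
import os
import tempfile

workdir = tempfile.mkdtemp()
mapping = {}  # LUT map path -> parameter file path

def save_update(version, params, lut_bytes):
    """Persist one parameter update: the parameter file and the LUT map
    generated for it, plus the mapping between the two."""
    param_path = os.path.join(workdir, f"params_v{version}.json")
    lut_path = os.path.join(workdir, f"lut_v{version}.bin")
    with open(param_path, "w") as f:
        json.dump(params, f)
    with open(lut_path, "wb") as f:
        f.write(lut_bytes)
    mapping[lut_path] = param_path  # one LUT map <-> one parameter file

def restore(lut_path):
    """Return the adjustment parameters recorded with a historical LUT map,
    so the editor UI can redisplay each adjustment item's value."""
    with open(mapping[lut_path]) as f:
        return json.load(f)

save_update(1, {"brightness": 0.1, "contrast": 1.2}, b"\x00" * 16)
params = restore(os.path.join(workdir, "lut_v1.bin"))
```

With this pairing, restoring a historical effect needs no recomputation: the parameter file repopulates the UI and the saved LUT map is applied directly.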
In an alternative embodiment, the video to be edited includes a plurality of video segments.
As shown in fig. 10, the video editing method further includes the steps of: step S1000, configuring different primary color LUT graphs for each video segment according to a second preset strategy; step S1002, when a video segment to be edited in the plurality of video segments is edited, adjusting a primary color LUT corresponding to the video segment to be edited to obtain a target video segment LUT, where the video segment to be edited is one of the plurality of video segments; step S1004, when the video segment to be edited is exported, loading the color adjustment effect of the LUT map of the target video segment to each video frame of the video segment to be edited, so as to obtain the video segment to be exported loaded with the color adjustment effect.
In an exemplary application, the video to be edited may be captured with a variety of techniques, such as multiple locations, multiple shots, multiple scene categories, and multiple camera movements. The video content may include character perspectives, close-ups, natural scenery, buildings, vegetation, food, and the like. Different shooting techniques and different video content may call for different color adjustments.
Therefore, the video to be edited may be divided into multiple video segments either manually, as required, or automatically according to the shooting technique and video content; the automatic division may be implemented through artificial intelligence.
Different video segments may have different color adjustment requirements: some segments need only a brightness adjustment, while others need brightness, hue, and highlight adjustments at the same time. Different video segments can therefore use different primary color LUT maps.
Accordingly, one primary color LUT map can be configured for each video segment through the second preset policy. It should be noted that the second preset policy may reference the configuration of the first preset policy or be identical to it.
When video segment A among the plurality of video segments is being edited, the color adjustment is applied to the primary color LUT map associated with video segment A, generating a target video segment LUT map A' associated with video segment A.
Likewise, when video segment B is being edited, the color adjustment is applied to the primary color LUT map associated with video segment B, generating a target video segment LUT map B' associated with video segment B.
By analogy, a target video segment LUT map is obtained for each video segment.
When exporting:
loading the color adjustment effect of the target video segment LUT map a' onto each video frame of the video segment a;
loading the color adjustment effect of the target video segment LUT map B' onto each video frame of the video segment B;
and by analogy, each video segment to be exported, with the color adjustment effect loaded onto its video frames, is generated and finally exported.
In an exemplary application, when the color of video segment A is adjusted, a specific identifier may be attached to each video frame of segment A, to the primary color LUT map associated with segment A, and to the generated video segment LUT map, thereby establishing an association among the video frames of segment A, the primary color LUT map, and the video segment LUT map; this facilitates management and improves management efficiency. On a second edit (parameter update), another specific identifier is attached to each video frame of segment A and to the new video segment LUT map, so that segment A always remains associated with the last new video segment LUT map for effect loading at export. That is, at export time, based on the mapping between the video frames of segment A and the video segment LUT map, the specific video segment LUT map is applied to each frame of segment A through the color lookup table algorithm. Because the video segment LUT map fully contains all the color adjustment effects desired by the user, the only color-related computation in the entire export is the color lookup table itself, whose cost is very low and nearly negligible; both computing resources and export time are therefore saved.
Because the video to be edited is segmented, different video segments receive different color adjustments, refining the color adjustment effect.
In addition, as described above, video segments of different specifications (e.g., resolutions) can be adaptively mapped to primary color LUT maps based on the second preset policy, achieving more flexible system resource scheduling and thereby saving computing resources.
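The per-segment export described above can be sketched as follows. The segment names, frame data, and 9-point LUT are illustrative assumptions; each segment's frames are looked up only through that segment's own target LUT map.

```python
import numpy as np

def identity_lut(size=9):
    """Identity 3D LUT serving as a segment's primary color LUT map."""
    g = np.linspace(0.0, 1.0, size)
    r, gg, b = np.meshgrid(g, g, g, indexing="ij")
    return np.stack([r, gg, b], axis=-1)

def apply_lut(frame, lut):
    """Nearest-neighbor 3D LUT lookup on a float frame in [0, 1]."""
    size = lut.shape[0]
    idx = np.clip(np.rint(frame * (size - 1)).astype(int), 0, size - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

# Two segments with identical frames but independent target LUT maps.
segments = {
    "A": [np.full((2, 2, 3), 0.25)],   # video frames of segment A
    "B": [np.full((2, 2, 3), 0.25)],   # video frames of segment B
}
segment_luts = {
    "A": np.clip(identity_lut() + 0.5, 0.0, 1.0),  # A': brightened LUT
    "B": identity_lut(),                           # B': untouched primary
}

# At export, each frame is associated only with its segment's LUT map.
exported = {
    seg: [apply_lut(f, segment_luts[seg]) for f in frames]
    for seg, frames in segments.items()
}
```

The identifier-based association in the text corresponds here to the shared keys `"A"` and `"B"` linking frames to their segment LUT maps.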
Example two
This embodiment provides another video editing method; for technical details not repeated here, refer to the description above.
Fig. 11 schematically shows a flowchart of a video editing method according to the second embodiment of the present application.
As shown in fig. 11, the video editing method may include steps S1100 to S1108, in which:
step S1100, providing a video editor, wherein the video editor comprises a picture editing control and an export control;
step S1102, importing a video to be edited into a video editor, wherein the video to be edited comprises a plurality of video frames;
step S1104, displaying the video to be edited in the video editor;
step S1106, in response to triggering the picture editing control, adjusting a primary color LUT image associated with the video to be edited to obtain a target LUT image;
step S1108, in response to triggering the export control, acquiring and exporting a target video according to the target LUT and the plurality of video frames.
As an optional embodiment, the video editing method further includes:
generating a video preview picture according to the target LUT picture and the video frame currently displayed by the video editor; and
and displaying the video preview picture in a video picture display area or a preview picture display area of the video editor.
As an optional embodiment, the video editing method further includes:
in a video track of the video editor, segmenting the video to be edited into a plurality of video segments;
respectively configuring a primary color LUT (look-up table) graph for each video segment according to a preset strategy;
editing each primary color LUT graph to obtain a plurality of video segment LUT graphs corresponding to the plurality of video segments;
loading the color adjustment effect of each video segment LUT map onto each video frame of the corresponding video segment in response to triggering the export control, where each video frame under a video segment is associated with the video segment LUT map corresponding to that video segment.
EXAMPLE III
Fig. 12 schematically illustrates a block diagram of a video editing system according to a third embodiment of the present application, which may be partitioned into one or more program modules, stored in a storage medium, and executed by one or more processors to implement the third embodiment of the present application. The program modules referred to in the embodiments of the present application are a series of computer program instruction segments capable of performing specific functions; the following description specifically sets forth the functions of each program module. As shown in fig. 12, the video editing system 1200 may include a determining module 1210, a receiving module 1220, an editing module 1230, and a deriving module 1240, wherein:
a determining module 1210, configured to determine a video to be edited, where the video to be edited includes a plurality of video frames;
the receiving module 1220 is configured to receive a color adjustment instruction for a video to be edited;
the editing module 1230 is configured to edit the primary color LUT image associated with the video to be edited according to the color adjustment instruction to obtain a target LUT image; and
and a deriving module 1240, configured to, in response to a deriving instruction, acquire and derive a target video according to the target LUT map and the plurality of video frames.
In an alternative embodiment, the editing module 1230 is further configured to:
generating a LUT map based on a color adjustment algorithm and the primary LUT map in response to the color adjustment instruction; and
saving the LUT map, wherein the LUT map is configured as the target LUT map.
In an alternative embodiment, the editing module 1230 is further configured to:
monitoring parameter updates of the color adjustment algorithm;
executing an LUT map generation operation each time a parameter update is monitored, where the LUT map generation operation includes: generating a new LUT map based on the parameter-updated color adjustment algorithm and the primary color LUT map, and saving the new LUT map; and
updating the target LUT map to the last new LUT map corresponding to the last LUT map generation operation.
In an alternative embodiment, the system further comprises a preview module (not shown) for:
and executing a preview operation each time a new LUT image is obtained, wherein the preview operation comprises the following steps: generating a video preview picture according to the obtained new LUT picture and the currently displayed video frame; and displaying the video preview picture on a display interface.
In an alternative embodiment, the system further comprises a saving module (not shown) for:
generating a corresponding parameter file each time a parameter update is monitored;
and establishing a mapping relation between each LUT graph and the corresponding parameter file, and saving the mapping relation for subsequent effect restoration.
In an alternative embodiment, the system further comprises a providing module (not shown) for:
and providing the primary color LUT map based on a first preset strategy according to the adjustment items and/or equipment parameters of the color adjustment instruction.
In an alternative embodiment, the derivation module 1240 is further configured to:
acquiring the color adjustment effect of the target LUT image through a color lookup table; and
and loading the color adjustment effect on each video frame in the plurality of video frames to obtain and export the target video.
In an alternative embodiment, the video to be edited includes a plurality of video segments; the system further comprises a segmentation module (not shown) for:
according to a second preset strategy, different primary color LUT graphs are respectively configured for each video segment;
when a video segment to be edited in the plurality of video segments is edited, adjusting a primary color LUT (look up table) diagram corresponding to the video segment to be edited to obtain a target video segment LUT diagram, wherein the video segment to be edited is one of the plurality of video segments;
and when the video segment to be edited is exported, loading the color adjustment effect of the LUT (look up table) of the target video segment onto each video frame of the video segment to be edited so as to obtain the video segment to be exported loaded with the color adjustment effect.
Example four
Fig. 13 schematically illustrates a block diagram of a video editing system according to a fourth embodiment of the present application, which may be partitioned into one or more program modules, stored in a storage medium, and executed by one or more processors to implement the embodiments of the present application. The program modules referred to in the embodiments of the present application refer to a series of computer program instruction segments that can perform specific functions, and the following description will specifically describe the functions of the program modules in the embodiments of the present application. As shown in fig. 13, the video editing system 1300 may include a providing module 1310, an importing module 1320, a presenting module 1330, an adjusting module 1340, and an exporting module 1350, wherein:
a providing module 1310 for providing a video editor, the video editor comprising a picture editing control and an export control;
an importing module 1320, configured to import a video to be edited into a video editor, where the video to be edited includes a plurality of video frames;
a display module 1330 configured to display the video to be edited in the video editor;
the adjusting module 1340 is configured to, in response to triggering the picture editing control, adjust a primary color LUT image associated with the video to be edited to obtain a target LUT image;
a derivation module 1350, configured to, in response to triggering the derivation control, obtain and derive a target video according to the target LUT map and the plurality of video frames.
In an alternative embodiment, the system further comprises a preview module (not shown) for:
generating a video preview picture according to the target LUT picture and the video frame currently displayed by the video editor; and
and displaying the video preview picture in a video picture display area or a preview picture display area of the video editor.
In an alternative embodiment, the system further comprises a segmentation module (not shown) for:
in a video track of the video editor, segmenting the video to be edited into a plurality of video segments;
respectively configuring a primary color LUT (look-up table) graph for each video segment according to a preset strategy;
editing each primary color LUT graph to obtain a plurality of video segment LUT graphs corresponding to the plurality of video segments;
in response to triggering the export control, loading the color adjustment effect of each video segment LUT map onto each video frame of the corresponding video segment, where each video frame under a video segment is associated with the video segment LUT map corresponding to that video segment.
EXAMPLE five
Fig. 14 schematically shows a hardware architecture diagram of a computer device 10000 suitable for implementing a video editing method according to an embodiment of the present application. In this embodiment, the computer device 10000 is a device capable of automatically performing numerical calculation and/or information processing according to preset or stored instructions. For example, it may be a smartphone, a tablet computer, a notebook computer, a desktop computer, or the like. As shown in fig. 14, the computer device 10000 includes at least, but is not limited to: a memory 10010, a processor 10020, and a network interface 10030, which may be communicatively linked to one another via a system bus. Wherein:
the memory 10010 includes at least one type of computer-readable storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. In some embodiments, the storage 10010 may be an internal storage module of the computer device 10000, such as a hard disk or a memory of the computer device 10000. In other embodiments, the memory 10010 may also be an external storage device of the computer device 10000, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), or the like, provided on the computer device 10000. Of course, the memory 10010 may also include both internal and external memory modules of the computer device 10000. In this embodiment, the memory 10010 is generally used for storing an operating system installed in the computer device 10000 and various application software, such as program codes of a video editing method. In addition, the memory 10010 may also be used to temporarily store various types of data that have been output or are to be output.
Processor 10020, in some embodiments, can be a Central Processing Unit (CPU), controller, microcontroller, microprocessor, or other data Processing chip. The processor 10020 is generally configured to control overall operations of the computer device 10000, such as performing control and processing related to data interaction or communication with the computer device 10000. In this embodiment, the processor 10020 is configured to execute program codes stored in the memory 10010 or process data.
Network interface 10030 may comprise a wireless network interface or a wired network interface, and network interface 10030 is generally used to establish a communication link between computer device 10000 and other computer devices. For example, the network interface 10030 is used to connect the computer device 10000 to an external terminal via a network, establish a data transmission channel and a communication link between the computer device 10000 and the external terminal, and the like. The network may be a wireless or wired network such as an Intranet (Intranet), the Internet (Internet), a Global System of Mobile communication (GSM), Wideband Code Division Multiple Access (WCDMA), a 4G network, a 5G network, Bluetooth (Bluetooth), or Wi-Fi.
It should be noted that fig. 14 only illustrates a computer device having components 10010 to 10030, but it is to be understood that not all of the illustrated components are required and that more or fewer components may be implemented instead.
In this embodiment, the video editing method stored in the memory 10010 can be further divided into one or more program modules and executed by one or more processors (in this embodiment, the processor 10020) to complete the embodiment of the present application.
Example six
Embodiments of the present application also provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the video editing method in the embodiments.
In this embodiment, the computer-readable storage medium includes a flash memory, a hard disk, a multimedia card, a card type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Programmable Read Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. In some embodiments, the computer readable storage medium may be an internal storage unit of the computer device, such as a hard disk or a memory of the computer device. In other embodiments, the computer readable storage medium may be an external storage device of the computer device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the computer device. Of course, the computer-readable storage medium may also include both internal and external storage devices of the computer device. In this embodiment, the computer-readable storage medium is generally used for storing an operating system and various types of application software installed in the computer device, for example, the program codes of the video editing method in the embodiment, and the like. Further, the computer-readable storage medium may also be used to temporarily store various types of data that have been output or are to be output.
It will be apparent to those skilled in the art that the modules or steps of the embodiments described above may be implemented by a general-purpose computing device; they may be centralized on a single computing device or distributed over a network formed by multiple computing devices. Optionally, they may be implemented by program code executable by a computing device, so that they may be stored in a storage device and executed by the computing device; in some cases, the steps shown or described may be executed out of the order shown, or the modules or steps may be made into individual integrated circuit modules, or multiple of them may be made into a single integrated circuit module. Thus, embodiments of the present application are not limited to any specific combination of hardware and software.
The above are only exemplary embodiments of the present application and are not intended to limit the scope of the claims. Any change or substitution that can readily occur to a person skilled in the art within the technical scope disclosed herein falls within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

1. A method of video editing, the method comprising:
determining a video to be edited, wherein the video to be edited comprises a plurality of video frames;
receiving a color adjustment instruction for a video to be edited;
editing a primary color LUT (look-up Table) graph associated with the video to be edited according to the color adjusting instruction to obtain a target LUT graph; and
in response to a derivation instruction, a target video is acquired and derived from the target LUT map and the plurality of video frames.
2. The video editing method according to claim 1, wherein editing the primary color LUT map associated with the video to be edited to obtain a target LUT map according to the color adjustment instruction comprises:
generating a LUT map based on a color adjustment algorithm and the primary color LUT map in response to the color adjustment instruction; and
saving the LUT map, wherein the LUT map is configured as the target LUT map.
3. The video editing method according to claim 2, wherein editing the primary color LUT map associated with the video to be edited to obtain a target LUT map according to the color adjustment instruction comprises:
monitoring the parameter update of the color adjustment algorithm;
executing LUT graph generation operation once when monitoring parameter updating once; the LUT map generating operation includes: generating a new LUT (look up table) diagram based on the color adjustment algorithm after parameter updating and the primary color LUT diagram, and storing the new LUT diagram; and
updating the target LUT map to a last new LUT map corresponding to a last LUT map generation operation.
4. The video editing method of claim 3, wherein the method further comprises:
and executing a preview operation each time a new LUT image is obtained, wherein the preview operation comprises the following steps: generating a video preview picture according to the obtained new LUT picture and the currently displayed video frame; and displaying the video preview picture on a display interface.
5. The video editing method of claim 3, wherein the method further comprises:
generating a corresponding parameter file every time the parameter is monitored to be updated;
and establishing a mapping relation between each LUT graph and the corresponding parameter file, and saving the mapping relation for subsequent effect restoration.
6. The video editing method according to any one of claims 1 to 5, wherein the method further comprises:
and providing the primary color LUT map based on a first preset strategy according to the adjustment items and/or equipment parameters of the color adjustment instruction.
7. The video editing method according to any one of claims 1 to 5, wherein the acquiring and deriving a target video from the target LUT map and the plurality of video frames in response to an deriving instruction comprises:
acquiring the color adjustment effect of the target LUT image through a color lookup table; and
and loading the color adjustment effect on each video frame in the plurality of video frames to obtain and export the target video.
8. The video editing method according to claim 1, wherein the video to be edited comprises a plurality of video segments, and the method further comprises:
configuring a different primary color LUT image for each video segment according to a second preset strategy;
when a video segment to be edited is edited, adjusting the primary color LUT image corresponding to the video segment to be edited to obtain a target video segment LUT image, wherein the video segment to be edited is one of the plurality of video segments; and
when the video segment to be edited is exported, loading the color adjustment effect of the target video segment LUT image onto each video frame of the video segment to be edited, so as to obtain an exported video segment loaded with the color adjustment effect.
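Again purely as illustration (the types and names below are assumptions, not from the patent), the segment-to-LUT association of claim 8 amounts to pairing each frame with the LUT of the segment that contains it:

```python
# Hypothetical sketch of claim 8: each segment of the clip carries its
# own (edited) LUT; on export, every frame is graded with the LUT of the
# segment it belongs to. Segment/lut_id are illustrative names.
from dataclasses import dataclass


@dataclass
class Segment:
    start: int    # first frame index (inclusive)
    end: int      # last frame index (exclusive)
    lut_id: str   # id of the LUT image configured for this segment


def lut_for_frame(segments, frame_idx):
    """Return the LUT id of the segment containing frame_idx."""
    for seg in segments:
        if seg.start <= frame_idx < seg.end:
            return seg.lut_id
    raise ValueError(f"frame {frame_idx} belongs to no segment")


def export_plan(segments, n_frames):
    """For export: pair every frame index with the LUT to load onto it."""
    return [(i, lut_for_frame(segments, i)) for i in range(n_frames)]
```

The actual grading step (loading a LUT's color adjustment onto a frame) is then applied per pair in the plan.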
9. A video editing system, comprising:
a determining module, configured to determine a video to be edited, wherein the video to be edited comprises a plurality of video frames;
a receiving module, configured to receive a color adjustment instruction for the video to be edited;
an editing module, configured to edit a primary color LUT image associated with the video to be edited according to the color adjustment instruction to obtain a target LUT image; and
an export module, configured to obtain and export a target video according to the target LUT image and the plurality of video frames in response to an export instruction.
10. A method of video editing, the method comprising:
providing a video editor, the video editor comprising a picture editing control and an export control;
importing a video to be edited into the video editor, wherein the video to be edited comprises a plurality of video frames;
displaying the video to be edited in the video editor;
in response to the picture editing control being triggered, adjusting a primary color LUT image associated with the video to be edited to obtain a target LUT image; and
in response to the export control being triggered, obtaining and exporting a target video according to the target LUT image and the plurality of video frames.
11. The video editing method according to claim 10, further comprising:
generating a video preview picture according to the target LUT image and the video frame currently displayed by the video editor; and
displaying the video preview picture in a video picture display area or a preview picture display area of the video editor.
12. The video editing method according to claim 10, further comprising:
segmenting the video to be edited into a plurality of video segments in a video track of the video editor;
configuring a primary color LUT image for each video segment according to a preset strategy;
editing each primary color LUT image to obtain a plurality of video segment LUT images corresponding to the plurality of video segments; and
in response to the export control being triggered, loading the color adjustment effect of each video segment LUT image onto each video frame of the corresponding video segment, wherein each video frame in a video segment is associated with the video segment LUT image corresponding to that video segment.
13. A computer device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the video editing method of any one of claims 1 to 8 or 10 to 12.
14. A computer-readable storage medium, having stored therein a computer program executable by at least one processor to cause the at least one processor to perform the steps of the video editing method of any one of claims 1 to 8 or 10 to 12.
CN202210115612.XA 2022-02-07 2022-02-07 Video editing method and system Active CN114449354B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210115612.XA CN114449354B (en) 2022-02-07 2022-02-07 Video editing method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210115612.XA CN114449354B (en) 2022-02-07 2022-02-07 Video editing method and system

Publications (2)

Publication Number Publication Date
CN114449354A true CN114449354A (en) 2022-05-06
CN114449354B CN114449354B (en) 2023-12-08

Family

ID=81372486

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210115612.XA Active CN114449354B (en) 2022-02-07 2022-02-07 Video editing method and system

Country Status (1)

Country Link
CN (1) CN114449354B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6724935B1 (en) * 1999-08-20 2004-04-20 Kabushiki Kaisha Toshiba Color image processing apparatus for performing color adjustment and color conversion processing
US20070188814A1 (en) * 2006-02-15 2007-08-16 Sigma Tel, Inc. Color conversion system and method
US20140036105A1 (en) * 2011-04-11 2014-02-06 Fujifilm Corporation Video conversion device, photography system of video system employing same, video conversion method, and recording medium of video conversion program
JP2014233064A (en) * 2013-05-02 2014-12-11 富士フイルム株式会社 Video conversion system, photographing system and look-up table generation server
WO2015030003A1 (en) * 2013-08-27 2015-03-05 富士フイルム株式会社 Video production system and video production method
CN105847995A (en) * 2016-05-16 2016-08-10 上海幻电信息科技有限公司 Method for video position jumping via bullet screen anchor points
US20190260908A1 (en) * 2018-02-20 2019-08-22 Filmic Inc. Cubiform method
US20200195959A1 (en) * 2018-06-29 2020-06-18 Beijing Bytedance Network Technology Co., Ltd. Concept of using one or multiple look up tables to store motion information of previously coded in order and use them to code following blocks
KR20210094057A (en) * 2019-03-24 2021-07-28 후아웨이 테크놀러지 컴퍼니 리미티드 Method and apparatus for chroma intra prediction in video coding
CN113810642A (en) * 2021-08-12 2021-12-17 荣耀终端有限公司 Video processing method and device, electronic equipment and storage medium
CN113810764A (en) * 2021-08-12 2021-12-17 荣耀终端有限公司 Video editing method and video editing device
WO2022017006A1 (en) * 2020-07-22 2022-01-27 Oppo广东移动通信有限公司 Video processing method and apparatus, and terminal device and computer-readable storage medium
EP3945723A1 (en) * 2020-07-30 2022-02-02 Arçelik Anonim Sirketi A television and a method of controlling video settings thereof

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
刘涵 (LIU, Han): "The Principle of CDL and Its Application in Film and Television Production", Modern Film Technology, pages 36-41 *
柯健; 戴敏利; 刘畅 (KE, Jian; DAI, Minli; LIU, Chang): "Design and Implementation of a CinemaDNG 4K Video Post-Production Solution on the Mac Platform", Journal of Suzhou Vocational University, no. 03 *

Also Published As

Publication number Publication date
CN114449354B (en) 2023-12-08

Similar Documents

Publication Publication Date Title
CN109219844B (en) Transitioning between video priority and graphics priority
CN108496356B (en) Method, apparatus and computer readable medium for tone mapping
JP3758452B2 (en) RECORDING MEDIUM, IMAGE PROCESSING DEVICE, AND IMAGE PROCESSING METHOD
CN107888943B (en) Image processing
US9042682B2 (en) Content creation using interpolation between content versions
CN107948733B (en) Video image processing method and device and electronic equipment
CN111127342B (en) Image processing method, device, storage medium and terminal equipment
CN107179889A (en) Interface color conditioning method, webpage color conditioning method and device
US11587526B2 (en) Luminance adaption to minimize discomfort and improve visibility
CN111090384B (en) Soft keyboard display method and device
CN113055709A (en) Video distribution method, device, equipment, storage medium and program product
CN113112422B (en) Image processing method, device, electronic equipment and computer readable medium
CN113411553A (en) Image processing method, image processing device, electronic equipment and storage medium
CN114449354B (en) Video editing method and system
WO2023005853A1 (en) Image processing method and apparatus, electronic device, storage medium, and computer program product
CN107578753B (en) Mobile terminal, display screen brightness adjusting method and storage medium
CN114363697B (en) Video file generation and playing method and device
EP2675171B1 (en) Transparency information in image or video format not natively supporting transparency
CN108335659A (en) Method for displaying image and equipment
CN110378973B (en) Image information processing method and device and electronic equipment
EP4042405A1 (en) Perceptually improved color display in image sequences on physical displays
CN113822784A (en) Image processing method and device
CN116932118B (en) Color adjustment method and device for graphic primitive, computer equipment and storage medium
US12046179B2 (en) Color display in image sequences on physical displays
CN118538184A (en) Brightness adjusting method and device and display equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant