CN113034357A - Method and system for converting RAW format file, electronic device and storage medium

Method and system for converting RAW format file, electronic device and storage medium

Info

Publication number
CN113034357A
CN113034357A (application number CN202110562914.7A)
Authority
CN
China
Prior art keywords
color correction
image
raw
color
standard
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110562914.7A
Other languages
Chinese (zh)
Other versions
CN113034357B (en)
Inventor
梁栋荣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Huoshaoyun Technology Co ltd
Original Assignee
Hangzhou Huoshaoyun Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Huoshaoyun Technology Co ltd
Priority to CN202110562914.7A
Publication of CN113034357A
Application granted
Publication of CN113034357B
Legal status: Active
Anticipated expiration

Links

Images

Classifications

    • G06T 3/04 Context-preserving transformations, e.g. by using an importance map
    • G06F 18/253 Fusion techniques of extracted features
    • G06N 3/045 Combinations of networks
    • G06N 3/08 Learning methods
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 5/90 Dynamic range modification of images or parts thereof
    • G06T 3/4015 Image demosaicing, e.g. colour filter arrays [CFA] or Bayer patterns
    • G06T 3/4053 Scaling of whole images or parts thereof based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
    • G06T 2207/10004 Still image; Photographic image
    • G06T 2207/10024 Color image
    • G06T 2207/20221 Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The application relates to a method, a system, an electronic device and a storage medium for converting files in a RAW format. The method comprises the following steps: acquiring the camera parameter configuration and the Bayer array from the RAW format file, and processing the RAW format file through an ISP algorithm to generate a standard ISP original data set; carrying out color correction on a standard image in the standard ISP original data set through a color correction model to obtain a color correction image, wherein the color correction model is an image enhancement neural network embedded with a control command module; respectively extracting the color correction characteristic matrix of the color correction image and the RAW characteristic matrix of the Bayer array through a neural network, and fusing the two matrices through a fusion operator to obtain a high dynamic color range characteristic matrix; and converting the high dynamic color range characteristic matrix through a high-definition neural network to generate a converted picture. The application provides an automatic conversion method for RAW format files and can effectively improve the quality of the converted pictures.

Description

Method and system for converting RAW format file, electronic device and storage medium
Technical Field
The present application relates to the field of computers, and in particular, to a method, system, electronic device, and storage medium for transferring a RAW format file.
Background
A RAW format picture contains unprocessed raw camera data. It retains the continuous detail, color and brightness information of the picture to the greatest extent and provides the largest image processing space, so it is the first choice of professionals for image processing. In the coming 5G era of higher network bandwidth, the highly flexible RAW format, or higher color depth picture formats derived from it, will become more popular.
At present, most mainstream mobile phones support the RAW format. For example, the recently released iPhone 12 supports the ProRAW format, so that a photographer can fully retain control over color, detail and dynamic range during shooting while still obtaining the depth and flexibility of RAW; in addition, its display adopts the latest XDR technology, reaching HDR10-level color depth and supporting 10-bit display of about one billion colors, and the iPhone 12 Pro further supports Dolby Vision recording at an even higher color depth. The Huawei P40, released in April 2020, supports storage and processing of the RAW format, and its AI AWB accurately handles the color information and complex lighting changes of RAW pictures, making night photography a highlight of the P40 Pro. The Galaxy Note20 Ultra, released by Samsung in August 2020, can output photographs in the RAW10 and RAW8 formats. In June 2020 the image processing software Adobe Photoshop updated Adobe Camera Raw, its RAW-format processing module, optimizing the user interface for local image processing and extending simple global enhancement to user-controllable local image transformation.
With the rapid development of graphics hardware, traditional image algorithms are moving wholesale to deep learning, and image algorithm research has in recent years gradually shifted from low-resolution to high-resolution image processing, with super-resolution, denoising, image enhancement and other algorithms being applied to the RAW format. In addition, since 2019, papers on RAW-format images have appeared among the award-winning papers of international image processing conferences such as ICCV and CVPR. International image processing competitions have also started to turn to the RAW format, and traditional image enhancement competitions such as denoising and super-resolution have gradually added RAW-format tracks; for example, the world championship of the "RAW-RGB" track of the NTIRE 2019 real image denoising challenge was won by a computer vision research institute, and a track championship was won by Baidu Research.
Furthermore, in the post-production retouching stage of wedding photography, "file conversion" is the first step of the retoucher's daily work. Its purpose is not only to obtain everyday pictures in JPG format, but also to unify the color temperature, exposure and so on of the pictures as far as possible so that all details in the pictures can be presented; the quality of the conversion result directly affects the difficulty and the final effect of later stylistic retouching. The main reason for converting from the RAW format file is that the numerical range of the data space of common image formats is much smaller than that of the RAW format: the former is 8-bit (0-255), while the latter is 14-bit (0-16383) or even 16-bit (0-65535). The 8-bit color depth of an RGB picture can cause many image problems such as color distortion, color overflow, color banding, and loss of detail due to overexposure and underexposure. The familiar edge and color banding problems are basically color value jumps caused by calculation at 8-bit precision. For example, the 8-bit RGB color value of red is [255, 0, 0], whereas at a 14-bit color depth red corresponds to the interval [16319:16384, 0, 0]; compared with 14 bits, the 8-bit representation of red therefore loses 64 color values, and this is only the loss on a single channel. If all 3 channels are considered, for example the 8-bit color value of pink is [255, 192, 203] versus the 14-bit interval [16319:16384, 12272:12335, 12978:13042], then 64 × 64 × 64 = 262144 color values are lost. Moreover, this is only the distortion in the representation of color values; there is also the calculation of color values. Neural networks currently compute at 32 bits, and computing an 8-bit color space at 32 bits is theoretically more prone to precision loss and color value jumps than computing a 14-bit color space at 32 bits, leading to color banding and tonal banding. It follows that converting the RAW format file is of great significance for later image processing.
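To make the quantization argument above concrete, the sketch below counts how many 14-bit values collapse onto a single 8-bit value and reproduces the pink example. It is an illustration added here, not part of the patent; the linear scaling convention (multiplying by 16383/255) and all names are assumptions.

```python
# Quantization loss when an image is stored at 8-bit instead of 14-bit depth.
# Assumes a simple linear scaling convention: v8 -> v8 * 16383 / 255.

BITS_8_MAX = 255        # 8-bit channel maximum
BITS_14_MAX = 16383     # 14-bit channel maximum (2**14 - 1)

values_per_step = (BITS_14_MAX + 1) // (BITS_8_MAX + 1)   # 16384 / 256 = 64
print(values_per_step)  # 64 distinct 14-bit values map to each 8-bit value

def channel_interval(v8: int) -> tuple[int, int]:
    """14-bit interval that collapses to the single 8-bit value v8."""
    hi = round(v8 * BITS_14_MAX / BITS_8_MAX)
    lo = max(0, hi - values_per_step + 1)
    return (lo, hi)

pink_8bit = (255, 192, 203)
intervals = [channel_interval(v) for v in pink_8bit]
print(intervals)        # roughly [(16320, 16383), (12272, 12335), (12979, 13042)]

# Number of distinct 14-bit triples represented by one 8-bit triple:
print(values_per_step ** 3)  # 64 * 64 * 64 = 262144
```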
At present, in the related art, the deep learning technology has not been used for intelligent file transfer of the RAW format file.
No effective solution has yet been proposed for the problem that, in the related art, RAW format files are not intelligently converted through a deep learning network.
Disclosure of Invention
The embodiment of the application provides a method, a system, an electronic device and a storage medium for converting files in a RAW format, so as to at least solve the problem in the related art that RAW format files lack intelligent conversion through a deep learning network.
In a first aspect, an embodiment of the present application provides a method for transferring a RAW format file, where the method includes:
acquiring camera parameter configuration and a Bayer array from a RAW format file, and processing the RAW format file through an ISP algorithm to generate a standard ISP original data set;
carrying out color correction on the standard image in the standard ISP original data set through a color correction model to obtain a color correction image, wherein the color correction model is an image enhancement neural network embedded with a control command module;
respectively extracting a color correction characteristic matrix of the color correction chart and a RAW characteristic matrix of the Bayer array through a neural network, and fusing the color correction characteristic matrix and the RAW characteristic matrix through a fusion operator to obtain a high dynamic color range characteristic matrix;
and converting the high dynamic color range characteristic matrix through a high-definition neural network to generate a file transfer picture under the original resolution.
In some of these embodiments, prior to color correcting a standard image in the standard ISP raw data set by a color correction model, the method comprises:
and performing image compression on the standard image in the standard ISP original data set to obtain a low-pixel standard image, and performing color correction on the low-pixel standard image through the color correction model.
In some of these embodiments, the generating of the control command module comprises:
and respectively setting a control command instruction set of the color temperature grade and the exposure grade, generating the control command module, and adjusting the image enhancement neural network through the control command module.
In some embodiments, the color correcting the standard image in the standard ISP raw data set by the color correction model comprises:
the relationship of the control command variable and the color correction map variation is semantically learned by a natural language processing model.
In some of these embodiments, said semantically learning the relationship of the control command variable and the color correction map variation by a natural language processing model comprises:
extracting features of the standard image through a global information image neural network to obtain a global information feature vector;
processing the global information characteristic vector and the control command variable through a natural language processing neural network to obtain color correction information;
and carrying out color correction on the standard image through a color correction image neural network integrated with the color correction information to obtain a color correction image.
In some of these embodiments, before color correcting the standard image in the standard ISP raw data set by a color correction model, the method further comprises:
performing PhotoShop camera raw file transfer and manual file transfer on the standard ISP original data set to obtain a training sample set of the color correction model, wherein the training sample set comprises pictures in the standard ISP original data set, file transfer pictures obtained after the PhotoShop camera raw file transfer and the manual file transfer, and color temperature grades and exposure grades of manual marks corresponding to the PhotoShop camera raw file transfer and the manual file transfer respectively.
In some of these embodiments, after obtaining the training sample set of the color correction model, the method includes:
and compressing the sample images in the training sample set to obtain a small-size training sample set, and training the color correction model through the small-size training sample set.
In a second aspect, an embodiment of the present application provides a system for transferring a RAW format file, where the system includes:
the acquisition module is used for acquiring camera parameter configuration and a Bayer array from the RAW format file, and processing the RAW format file through an ISP algorithm to generate a standard ISP original data set;
the color correction module is used for carrying out color correction on the standard image in the standard ISP original data set through a color correction model to obtain a color correction image, wherein the color correction model is an image enhancement neural network embedded with a control command module;
the conversion module is used for respectively extracting the color correction characteristic matrix of the color correction chart and the RAW characteristic matrix of the Bayer array through a neural network, and fusing the color correction characteristic matrix and the RAW characteristic matrix through a fusion operator to obtain a high dynamic color range characteristic matrix,
and converting the high dynamic color range characteristic matrix through a high-definition neural network to generate a file transfer picture under the original resolution.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, and the processor, when executing the computer program, implements the method for forwarding the RAW-format file according to the first aspect.
In a fourth aspect, embodiments of the present application provide a storage medium, on which a computer program is stored, where the program, when executed by a processor, implements the method for file transferring in RAW format as described in the first aspect above.
Compared with the related art, the method for transferring the RAW format file, provided by the embodiment of the application, acquires the camera parameter configuration and the Bayer array from the RAW format file, and processes the RAW format file through the ISP algorithm to generate the standard ISP original data set; then, carrying out color correction on the standard image in the standard ISP original data set through a color correction model to obtain a color correction image, wherein the color correction model is an image enhancement neural network embedded with a control command module; then extracting a color correction characteristic matrix of the color correction chart and a RAW characteristic matrix of a Bayer array through a neural network respectively, and fusing the color correction characteristic matrix and the RAW characteristic matrix through a fusion operator to obtain a high dynamic color range characteristic matrix; and finally, converting the high dynamic color range characteristic matrix through a high-definition neural network to generate a transfer picture under the original resolution.
Compared with the related technology, the method and the device replace a manual file transfer process, realize the automatic file transfer through a neural network algorithm, unify the original film standard, and transfer the RAW format picture to generate a common format picture with normal exposure and neutral color temperature. In addition, by carrying out various color correction and high-definition processing on the RAW format file, original camera shooting details and color information can be retained to the greatest extent, and a file transfer picture with normal exposure, neutral color temperature and rich details is generated.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a flowchart of a method for transferring a RAW format file according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of color correction map generation according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of another color correction map generation according to an embodiment of the present application;
FIG. 4 is a schematic flow chart illustrating the generation of a shift print according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an automatic document shifting process for a RAW format document according to an embodiment of the present application;
fig. 6 is a block diagram of a system for transferring a RAW format file according to an embodiment of the present application;
fig. 7 is an internal structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms referred to herein shall have the ordinary meaning as understood by those of ordinary skill in the art to which this application belongs. Reference to "a," "an," "the," and similar words throughout this application are not to be construed as limiting in number, and may refer to the singular or the plural. The present application is directed to the use of the terms "including," "comprising," "having," and any variations thereof, which are intended to cover non-exclusive inclusions; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to the listed steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. Reference to "connected," "coupled," and the like in this application is not intended to be limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. Reference herein to "a plurality" means greater than or equal to two. "and/or" describes an association relationship of associated objects, meaning that three relationships may exist, for example, "A and/or B" may mean: a exists alone, A and B exist simultaneously, and B exists alone. Reference herein to the terms "first," "second," "third," and the like, are merely to distinguish similar objects and do not denote a particular ordering for the objects.
Fig. 1 is a flowchart of a method for transferring a RAW format file according to an embodiment of the present application, and as shown in fig. 1, the flowchart includes the following steps:
step S101, acquiring camera parameter configuration and a Bayer array from the RAW format file, and processing the RAW format file through an ISP algorithm to generate a standard ISP original data set;
It should be noted that different cameras and image processing software have different ISP (image signal processor) algorithms. For example, the picture seen in the Photoshop Camera Raw interface is actually generated by Photoshop's own ISP algorithm, and the RAW format thumbnail seen in a file manager is generated by the camera's own ISP algorithm carried with the RAW format file. In this embodiment, the RAW format file is processed by a standardized ISP algorithm to generate the standard ISP original data set, wherein the standard ISP original data set comprises standardized original images without additional color correction; a standardized original image carries the most basic image texture information, with an exposure level and a color temperature level determined by the camera parameters.
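As an illustration of step S101, the sketch below uses the open-source rawpy library (a wrapper around LibRaw) to read a RAW file, pull out the Bayer array and basic camera parameters, and run a neutral demosaic as a stand-in for the standardized ISP algorithm. The patent does not specify which ISP is used, so this is only an assumed, minimal pipeline.

```python
import rawpy
import numpy as np

def load_raw(path: str):
    """Read a RAW file and return (Bayer array, camera parameters, standard ISP image)."""
    with rawpy.imread(path) as raw:
        bayer = raw.raw_image_visible.copy()          # Bayer mosaic, 12-16 bit values
        params = {
            "black_level": raw.black_level_per_channel,
            "white_level": raw.white_level,
            "camera_wb": raw.camera_whitebalance,     # as-shot white balance gains
            "color_desc": raw.color_desc,             # e.g. b"RGBG"
        }
        # A fixed, "standardized" ISP rendering: camera white balance,
        # no extra brightness adjustment, 16-bit output.
        standard_rgb = raw.postprocess(
            use_camera_wb=True,
            no_auto_bright=True,
            output_bps=16,
        )
    return bayer.astype(np.uint16), params, standard_rgb
```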
Step S102, carrying out color correction on a standard image in a standard ISP original data set through a color correction model to obtain a color correction image, wherein the color correction model is an image enhancement neural network embedded with a control command module;
it should be noted that the unified standard of the converted pictures is limited by the sample quality and number of the manual film correction, and the sample number can be obtained by collecting a large number of samples, but the sample quality is limited by human subjectivity difference, so that the respective exposure color temperature standards are inconsistent, in a better case, the exposure and color temperature fluctuate within a certain range, in a worst case, even the opposite standard is caused by the style difference preferred by the film correction engineer. For the problem of the contradiction or inconsistency which is difficult to avoid due to the manual difference, the purpose of unifying the original standard cannot be achieved only by increasing the sample size to train the gear shifting neural network. In addition, the color correction problem in the shift belongs to an image enhancement problem in the technical field, a common image enhancement model is trained by a paired sample set of a graph-to-graph, however, such a training mode and a model structure depend on the distribution of the sample set, that is, the effect of each graph is enhanced manually, so that overfitting of the effect is easily caused, and when the picture shift is performed by using a test set, the shift effect cannot reach the ideal effect of the training set.
To address the above problems, enhance the generality of the model, and avoid the conflicts introduced by manual conversion, this embodiment preferably constructs an image enhancement neural network into which a control command module can be embedded. Control command instruction sets are defined for the color temperature level and the exposure level respectively, and the control command module is generated from them. Specifically, the color temperature level control instruction set is {cooler: 0, neutral: 1, warmer: 2}, and the exposure level control instruction set is {underexposed: 0, dark: 1, normal: 2, slightly bright: 3, overexposed: 4}. The control command module receives the commands that control the effect, namely an exposure command and a color temperature command, and adjusts the image enhancement network according to them, so that different effect variants of the same picture can be produced. In this way the color correction model learns the variation along the effect-command dimension instead of simply fitting a single pair of samples. Because the prior rule of the command control module is embedded, the training set is effectively enlarged: the same picture has several corresponding enhanced images along the effect control dimensions, such as underexposed, normally exposed and overexposed, or warmer, neutral and cooler color temperature. A general enhancement capability of the whole color correction model can be achieved by combining three elements: command rules that match the desired effect, data augmentation along the specifiable effect dimensions, and a model that changes synchronously with the command. Fig. 2 is a schematic flow chart of color correction map generation according to an embodiment of the present application. Preferably, before the color correction is performed on a standard image in the standard ISP original data set through the color correction model, the standard image is compressed to obtain a low-pixel standard image, and the color correction is then performed on the low-pixel standard image through the color correction model. One reason for performing color correction on the low-pixel standard image is that the color correction map only needs color and exposure information rather than texture detail, so the low-pixel image saves storage space and improves efficiency; another is that the enhancement network fits the low-pixel standard image more easily.
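A minimal sketch of such a control command module is given below; it embeds the discrete color temperature and exposure levels defined above and modulates the feature maps of an enhancement network with the resulting vector. The layer sizes, the feature-wise scale-and-shift modulation, and all names are assumptions made for illustration, not details taken from the patent.

```python
import torch
import torch.nn as nn

COLOR_TEMP_LEVELS = {"cooler": 0, "neutral": 1, "warmer": 2}
EXPOSURE_LEVELS = {"underexposed": 0, "dark": 1, "normal": 2,
                   "slightly_bright": 3, "overexposed": 4}

class ControlCommandModule(nn.Module):
    """Turns (color temperature level, exposure level) into per-channel scale and shift."""
    def __init__(self, channels: int, embed_dim: int = 16):
        super().__init__()
        self.temp_embed = nn.Embedding(len(COLOR_TEMP_LEVELS), embed_dim)
        self.expo_embed = nn.Embedding(len(EXPOSURE_LEVELS), embed_dim)
        self.to_scale_shift = nn.Linear(2 * embed_dim, 2 * channels)

    def forward(self, feat: torch.Tensor, temp: torch.Tensor, expo: torch.Tensor):
        cmd = torch.cat([self.temp_embed(temp), self.expo_embed(expo)], dim=-1)
        scale, shift = self.to_scale_shift(cmd).chunk(2, dim=-1)
        # Broadcast the (B, C) modulation over the (B, C, H, W) feature map.
        return feat * (1 + scale[..., None, None]) + shift[..., None, None]

# Usage: modulate a 64-channel feature map toward "neutral" temperature, "normal" exposure.
module = ControlCommandModule(channels=64)
feat = torch.randn(1, 64, 32, 32)
temp = torch.tensor([COLOR_TEMP_LEVELS["neutral"]])
expo = torch.tensor([EXPOSURE_LEVELS["normal"]])
out = module(feat, temp, expo)   # same shape as feat
```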
Furthermore, the control command module in the color correction model has two dimensions, exposure and color temperature. From the model's point of view these dimensions are independently distributed and do not interfere with each other, but in practice they usually affect each other: to achieve a neutral color temperature, the corresponding exposure level usually also has to be adjusted, and the relationship is not a literal one-to-one mapping; the effect of a control command also depends on the exposure, color temperature and picture type of the original film. Therefore, in order for the model to understand the relationship between commands and pictures, the relationship between the control command variables and the change of the color correction map is preferably learned semantically through a natural language processing model. Fig. 3 is a schematic flow chart of another color correction map generation according to an embodiment of the present application; as shown in Fig. 3, the specific steps of the color correction are as follows, with an illustrative sketch after step S3:
S1, extracting features of the standard RGB image through a global information image neural network to obtain a global information feature vector;
S2, processing the global information feature vector and the control command variables through a natural language processing neural network to obtain color correction information, wherein the control command variables include an exposure control command and a color temperature control command, such as normal exposure and neutral color temperature;
S3, performing color correction on the standard RGB image through a color correction image neural network into which the color correction information is integrated, to obtain a color correction picture whose effect is normal exposure and neutral color temperature.
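The sketch below strings steps S1 to S3 together: a small CNN produces the global information feature vector, a GRU-based module stands in for the natural language processing network, its output is combined with the global feature vector to give the color correction information, and that information conditions a per-pixel correction network. All architectures and dimensions are illustrative assumptions; the patent does not specify them.

```python
import torch
import torch.nn as nn

class ColorCorrectionModel(nn.Module):
    def __init__(self, cmd_vocab: int = 8, embed_dim: int = 32):
        super().__init__()
        # S1: global information image neural network -> global feature vector.
        self.global_net = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),     # -> (B, 64)
        )
        # S2: a GRU over the command tokens stands in for the NLP network;
        # its state is fused with the global feature into color correction information.
        self.cmd_embed = nn.Embedding(cmd_vocab, embed_dim)
        self.nlp = nn.GRU(embed_dim, 64, batch_first=True)
        self.to_correction = nn.Linear(64 + 64, 96)
        # S3: color correction image neural network conditioned on that information.
        self.correct = nn.Sequential(
            nn.Conv2d(3 + 96, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, img: torch.Tensor, cmd_tokens: torch.Tensor):
        g = self.global_net(img)                               # S1
        _, h = self.nlp(self.cmd_embed(cmd_tokens))            # S2
        info = self.to_correction(torch.cat([g, h[-1]], dim=-1))
        cond = info[:, :, None, None].expand(-1, -1, *img.shape[2:])
        return self.correct(torch.cat([img, cond], dim=1))     # S3: corrected image

model = ColorCorrectionModel()
low_res = torch.rand(1, 3, 128, 128)              # low-pixel standard image
cmds = torch.tensor([[1, 2]])                     # e.g. [neutral temperature, normal exposure]
corrected = model(low_res, cmds)                  # (1, 3, 128, 128)
```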
Through the above, in this embodiment, a control command module is introduced into the color correction model, a control command instruction set is set, and the image is automatically corrected by inputting different color temperature and exposure level control commands, so as to obtain a color correction image with unified standard, thereby improving the quality of the image.
Step S103, extracting a color correction characteristic matrix of the color correction image and a RAW characteristic matrix of the Bayer array through a neural network respectively, fusing the color correction characteristic matrix and the RAW characteristic matrix through a fusion operator to obtain a high dynamic color range characteristic matrix, and converting the high dynamic color range characteristic matrix through a high-definition neural network to generate a converted image under the original resolution.
The Bayer array of the RAW format file is the rawest image data of the camera; it carries high-color-depth color information and texture information, and because different cameras use different photosensitive elements and demosaicing algorithms, there is no unified image generation method.
Fig. 4 is a schematic flow chart of converted-picture generation according to an embodiment of the present application. As shown in Fig. 4, in this embodiment a convolutional neural network is used to extract features from the Bayer array, yielding its texture gradient information and high-color-depth color information; another convolutional neural network extracts features from the color correction map, yielding the color correction feature matrix; a fusion operator then fuses the color correction feature matrix with the RAW feature matrix of the Bayer array to obtain a feature matrix with a high dynamic color range and rich detail texture; finally, this feature matrix is transformed by a high-definition neural network to generate the final converted picture with normal exposure, neutral color temperature and rich detail.
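The following sketch illustrates one way such a fusion could look: two small CNN encoders for the packed Bayer array and the color correction map, concatenation as the fusion operator, and a sub-pixel upsampling head as the "high-definition" network that restores the original resolution. Packing the Bayer mosaic into four half-resolution channels and every layer size shown here are assumptions, not details from the patent.

```python
import torch
import torch.nn as nn

def pack_bayer(bayer: torch.Tensor) -> torch.Tensor:
    """Pack a (B, 1, H, W) Bayer mosaic into 4 half-resolution channels (RGGB assumed, H and W even)."""
    return torch.cat([bayer[:, :, 0::2, 0::2], bayer[:, :, 0::2, 1::2],
                      bayer[:, :, 1::2, 0::2], bayer[:, :, 1::2, 1::2]], dim=1)

class ConversionNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.raw_enc = nn.Sequential(nn.Conv2d(4, 64, 3, padding=1), nn.ReLU(),
                                     nn.Conv2d(64, 64, 3, padding=1), nn.ReLU())
        self.cc_enc = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
                                    nn.Conv2d(64, 64, 3, padding=1), nn.ReLU())
        # "High-definition" head: fuse, then upsample 2x back to the Bayer resolution.
        self.head = nn.Sequential(nn.Conv2d(128, 128, 3, padding=1), nn.ReLU(),
                                  nn.Conv2d(128, 3 * 4, 3, padding=1),
                                  nn.PixelShuffle(2))

    def forward(self, bayer: torch.Tensor, cc_map: torch.Tensor) -> torch.Tensor:
        raw_feat = self.raw_enc(pack_bayer(bayer))        # RAW feature matrix
        # Resize the color correction map to the packed-Bayer resolution first.
        cc_small = nn.functional.interpolate(cc_map, size=raw_feat.shape[2:],
                                             mode="bilinear", align_corners=False)
        cc_feat = self.cc_enc(cc_small)                   # color correction feature matrix
        fused = torch.cat([raw_feat, cc_feat], dim=1)     # fusion operator (concatenation)
        return self.head(fused)                           # converted picture at full resolution

net = ConversionNet()
bayer = torch.rand(1, 1, 256, 256)       # normalized Bayer mosaic
cc_map = torch.rand(1, 3, 128, 128)      # low-pixel color correction map
out = net(bayer, cc_map)                 # (1, 3, 256, 256)
```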
Fig. 5 is a schematic diagram of an automatic document transferring process of a RAW format file according to an embodiment of the present application, and as shown in fig. 5, through the above steps S101 to S103, this embodiment replaces the manual document transferring process, realizes document transferring automation through a neural network algorithm, unifies an original standard, and transfers a RAW format picture to generate a commonly-used format picture with normal exposure and neutral color temperature. In addition, by carrying out various color correction and high-definition processing on the RAW format file, original camera shooting details and color information can be retained to the greatest extent, and a file transfer picture with normal exposure, neutral color temperature and rich details is generated.
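Putting steps S101 to S103 together, the end-to-end flow of Fig. 5 could be driven by a small orchestration function like the one below; `load_raw`, `ColorCorrectionModel` and `ConversionNet` refer to the illustrative sketches above and are assumptions, not components named by the patent.

```python
import torch
import numpy as np

def convert_raw_file(path: str, temp_cmd: int = 1, expo_cmd: int = 2) -> np.ndarray:
    """RAW file -> converted picture with the requested color temperature / exposure."""
    bayer, params, standard_rgb = load_raw(path)                      # step S101
    with torch.no_grad():
        # Step S102: color-correct a compressed (low-pixel) copy of the standard image.
        low = torch.from_numpy(standard_rgb.astype(np.float32) / 65535.0)
        low = low.permute(2, 0, 1)[None]
        low = torch.nn.functional.interpolate(low, size=(512, 512), mode="bilinear",
                                              align_corners=False)
        cc_map = color_model(low, torch.tensor([[temp_cmd, expo_cmd]]))
        # Step S103: fuse the color correction map with the Bayer array and upscale.
        bayer_t = torch.from_numpy(bayer.astype(np.float32) / params["white_level"])[None, None]
        out = conversion_net(bayer_t, cc_map)
    return (out.clamp(0, 1)[0].permute(1, 2, 0).numpy() * 255).astype(np.uint8)

color_model = ColorCorrectionModel()
conversion_net = ConversionNet()
# picture = convert_raw_file("wedding_shot.CR2")   # hypothetical input file
```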
In some embodiments, before the color correction is performed on a standard image in the standard ISP original data set through the color correction model, Photoshop Camera Raw conversion and manual conversion are performed on the standard ISP original data set to obtain the training sample set of the color correction model, wherein the training sample set comprises the pictures in the standard ISP original data set, the converted pictures obtained from the Photoshop Camera Raw conversion and from the manual conversion, and the manually labeled color temperature levels and exposure levels corresponding to the Photoshop Camera Raw conversion and the manual conversion respectively. Specifically, two converted data sets are obtained: a Photoshop Camera Raw conversion sample data set, each of whose converted pictures carries a manually labeled color temperature level and exposure level, and a manual conversion sample data set, each of whose manually converted pictures likewise carries a manually labeled color temperature level and exposure level. Together with the standard images before color correction, these form the complete training sample set of the color correction model, in which the sample images of the color correction map have a given original size.
In some embodiments, after the training sample set of the color correction model is obtained, the sample images in the training sample set are compressed to obtain a small-size training sample set, and the color correction model is trained with this small-size training sample set. Specifically, this embodiment compresses the training sample images obtained above to a smaller size, yielding a small-size, small-image training sample set that fits the video memory capacity of the graphics card and the receptive field coverage of the designed neural network.
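A sketch of how such a training sample set could be organized is given below: each record ties a standard image to a converted target picture, its source (Photoshop Camera Raw or manual), and the labeled color temperature and exposure levels, with a resize step producing the small-size set. The record layout, file names and the resize target are assumptions made for illustration.

```python
from dataclasses import dataclass
from PIL import Image

@dataclass
class ColorCorrectionSample:
    standard_path: str      # standard ISP image before color correction
    target_path: str        # Photoshop Camera Raw or manually converted picture
    source: str             # "ps_camera_raw" or "manual"
    color_temp_level: int   # 0 = cooler, 1 = neutral, 2 = warmer
    exposure_level: int     # 0 = underexposed ... 4 = overexposed

def load_small_size(sample: ColorCorrectionSample, size=(512, 512)):
    """Return the compressed (small-size) standard/target pair for training."""
    std = Image.open(sample.standard_path).convert("RGB").resize(size, Image.BILINEAR)
    tgt = Image.open(sample.target_path).convert("RGB").resize(size, Image.BILINEAR)
    return std, tgt, sample.color_temp_level, sample.exposure_level

train_set = [
    ColorCorrectionSample("std_001.png", "ps_001.png", "ps_camera_raw", 1, 2),
    ColorCorrectionSample("std_001.png", "manual_001.png", "manual", 2, 3),
]
```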
It should be noted that the steps illustrated in the above-described flow diagrams or in the flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flow diagrams, in some cases, the steps illustrated or described may be performed in an order different than here.
The present embodiment further provides a system for transferring a RAW format file, where the system is used to implement the foregoing embodiments and preferred embodiments, and the description already made is omitted here for brevity. As used hereinafter, the terms "module," "unit," "subunit," and the like may implement a combination of software and/or hardware for a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
Fig. 6 is a block diagram of a system for converting a RAW format file according to an embodiment of the present application. As shown in Fig. 6, the system includes an acquisition module 61, a color correction module 62 and a conversion module 63:
The acquisition module 61 is configured to acquire the camera parameter configuration and the Bayer array from the RAW format file, and process the RAW format file through an ISP algorithm to generate a standard ISP original data set; the color correction module 62 is configured to perform color correction on the standard image in the standard ISP original data set through a color correction model to obtain a color correction map, where the color correction model is an image enhancement neural network embedded with the control command module; and the conversion module 63 is configured to extract the color correction feature matrix of the color correction map and the RAW feature matrix of the Bayer array through a neural network, fuse the color correction feature matrix and the RAW feature matrix through a fusion operator to obtain a high dynamic color range feature matrix, and convert the high dynamic color range feature matrix through a high-definition neural network to generate a converted picture at the original resolution.
Through the above system, in this embodiment the acquisition module 61 acquires the camera parameter configuration and the Bayer array of the RAW format file and generates the standard ISP original data set through the ISP algorithm, the color correction module 62 performs color correction on the standard images in the standard ISP original data set, and finally the conversion module 63 converts the color correction map to obtain the converted picture with normal exposure, neutral color temperature and rich detail. A control command instruction set is introduced into the color correction module 62, and the corresponding correction result can be obtained by inputting different color temperature and exposure levels, so that the original picture is corrected into a visually more natural picture. This solves the problem in the related art that RAW format files are not intelligently converted through a deep learning network, provides an automatic conversion method for RAW format files, and can effectively improve the quality of the converted picture.
It should be noted that, for specific examples in this embodiment, reference may be made to examples described in the foregoing embodiments and optional implementations, and details of this embodiment are not described herein again.
Note that each of the modules may be a functional module or a program module, and may be implemented by software or hardware. For a module implemented by hardware, the modules may be located in the same processor; or the modules can be respectively positioned in different processors in any combination.
The present embodiment also provides an electronic device comprising a memory having a computer program stored therein and a processor configured to execute the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
In addition, in combination with the method for transferring the RAW-format file in the foregoing embodiment, the embodiment of the present application may provide a storage medium to implement. The storage medium having stored thereon a computer program; the computer program, when executed by a processor, implements the method for transferring a RAW format file in any of the above embodiments.
In one embodiment, a computer device is provided, which may be a terminal. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a method of RAW format file transfer. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
In one embodiment, fig. 7 is a schematic diagram of an internal structure of an electronic device according to an embodiment of the present application, and as shown in fig. 7, there is provided an electronic device, which may be a server, and an internal structure diagram of which may be as shown in fig. 7. The electronic device comprises a processor, a network interface, an internal memory and a non-volatile memory connected by an internal bus, wherein the non-volatile memory stores an operating system, a computer program and a database. The processor is used for providing calculation and control capability, the network interface is used for communicating with an external terminal through network connection, the internal memory is used for providing an environment for an operating system and the running of a computer program, the computer program is executed by the processor to realize a method for file transferring in a RAW format, and the database is used for storing data.
Those skilled in the art will appreciate that the architecture shown in fig. 7 is a block diagram of only a portion of the architecture associated with the subject application, and does not constitute a limitation on the electronic devices to which the subject application may be applied, and that a particular electronic device may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware instructions of a computer program, which can be stored in a non-volatile computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory, among others. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDRSDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), direct bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM).
It should be understood by those skilled in the art that various features of the above-described embodiments can be combined in any combination, and for the sake of brevity, all possible combinations of features in the above-described embodiments are not described in detail, but rather, all combinations of features which are not inconsistent with each other should be construed as being within the scope of the present disclosure.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. A method for converting files in a RAW format, the method comprising:
acquiring camera parameter configuration and a Bayer array from a RAW format file, and processing the RAW format file through an ISP algorithm to generate a standard ISP original data set;
carrying out color correction on the standard image in the standard ISP original data set through a color correction model to obtain a color correction image, wherein the color correction model is an image enhancement neural network embedded with a control command module;
respectively extracting a color correction characteristic matrix of the color correction chart and a RAW characteristic matrix of the Bayer array through a neural network, and fusing the color correction characteristic matrix and the RAW characteristic matrix through a fusion operator to obtain a high dynamic color range characteristic matrix;
and converting the high dynamic color range characteristic matrix through a high-definition neural network to generate a file transfer picture under the original resolution.
2. The method of claim 1, wherein prior to color correcting a standard image in the standard ISP raw data set by a color correction model, the method comprises:
and performing image compression on the standard image in the standard ISP original data set to obtain a low-pixel standard image, and performing color correction on the low-pixel standard image through the color correction model.
3. The method of claim 1, wherein the generating of the control command module comprises:
and respectively setting a control command instruction set of the color temperature grade and the exposure grade, generating the control command module, and adjusting the image enhancement neural network through the control command module.
4. The method of claim 3, wherein the color correcting the standard image in the standard ISP raw data set by the color correction model comprises:
the relationship of the control command variable and the color correction map variation is semantically learned by a natural language processing model.
5. The method of claim 4, wherein semantically learning the relationship of the control command variable and the color correction map variation by a natural language processing model comprises:
extracting features of the standard image through a global information image neural network to obtain a global information feature vector;
processing the global information characteristic vector and the control command variable through a natural language processing neural network to obtain color correction information;
and carrying out color correction on the standard image through a color correction image neural network integrated with the color correction information to obtain a color correction image.
6. The method of claim 1, wherein prior to color correcting a standard image in the standard ISP raw data set by a color correction model, the method further comprises:
performing PhotoShop camera raw file transfer and manual file transfer on the standard ISP original data set to obtain a training sample set of the color correction model, wherein the training sample set comprises pictures in the standard ISP original data set, file transfer pictures obtained after the PhotoShop camera raw file transfer and the manual file transfer, and color temperature grades and exposure grades of manual marks corresponding to the PhotoShop camera raw file transfer and the manual file transfer respectively.
7. The method of claim 6, wherein after obtaining the training sample set of the color correction model, the method comprises:
and compressing the sample images in the training sample set to obtain a small-size training sample set, and training the color correction model through the small-size training sample set.
8. A system for converting files in RAW format, the system comprising:
the acquisition module is used for acquiring camera parameter configuration and a Bayer array from the RAW format file, and processing the RAW format file through an ISP algorithm to generate a standard ISP original data set;
the color correction module is used for carrying out color correction on the standard image in the standard ISP original data set through a color correction model to obtain a color correction image, wherein the color correction model is an image enhancement neural network embedded with a control command module;
the conversion module is used for respectively extracting the color correction characteristic matrix of the color correction chart and the RAW characteristic matrix of the Bayer array through a neural network, and fusing the color correction characteristic matrix and the RAW characteristic matrix through a fusion operator to obtain a high dynamic color range characteristic matrix,
and converting the high dynamic color range characteristic matrix through a high-definition neural network to generate a file transfer picture under the original resolution.
9. An electronic device comprising a memory and a processor, wherein the memory stores a computer program, and the processor is configured to execute the computer program to perform the method of RAW format file conversion according to any one of claims 1 to 7.
10. A storage medium having stored thereon a computer program, wherein the computer program is configured to execute the method of RAW format file migration according to any one of claims 1 to 7 when running.
CN202110562914.7A 2021-05-24 2021-05-24 Method and system for converting RAW format file, electronic device and storage medium Active CN113034357B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110562914.7A CN113034357B (en) 2021-05-24 2021-05-24 Method and system for converting RAW format file, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110562914.7A CN113034357B (en) 2021-05-24 2021-05-24 Method and system for converting RAW format file, electronic device and storage medium

Publications (2)

Publication Number Publication Date
CN113034357A true CN113034357A (en) 2021-06-25
CN113034357B CN113034357B (en) 2021-08-17

Family

ID=76455531

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110562914.7A Active CN113034357B (en) 2021-05-24 2021-05-24 Method and system for converting RAW format file, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN113034357B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7751642B1 (en) * 2005-05-18 2010-07-06 Arm Limited Methods and devices for image processing, image capturing and image downscaling
CN111064860A (en) * 2018-10-17 2020-04-24 北京地平线机器人技术研发有限公司 Image correction method, image correction device and electronic equipment
CN110602467A (en) * 2019-09-09 2019-12-20 Oppo广东移动通信有限公司 Image noise reduction method and device, storage medium and electronic equipment
CN110766621A (en) * 2019-10-09 2020-02-07 Oppo广东移动通信有限公司 Image processing method, image processing device, storage medium and electronic equipment
CN112261391A (en) * 2020-10-26 2021-01-22 Oppo广东移动通信有限公司 Image processing method, camera assembly and mobile terminal

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LAN LIU等: "Thematic information detection for remote sensing image using SVM kernel functions", 《ICSPCC2015》 *

Also Published As

Publication number Publication date
CN113034357B (en) 2021-08-17


Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Method, system, electronic device, and storage medium for converting RAW format files

Effective date of registration: 20231226

Granted publication date: 20210817

Pledgee: High-tech Branch of Hangzhou United Rural Commercial Bank Co.,Ltd.

Pledgor: HANGZHOU HUOSHAOYUN TECHNOLOGY CO.,LTD.

Registration number: Y2023980074577

PE01 Entry into force of the registration of the contract for pledge of patent right