CN111340683B - Image data processing method, image data processing device, image processing system and server

Image data processing method, image data processing device, image processing system and server

Info

Publication number
CN111340683B
Authority
CN
China
Prior art keywords
test
image
compatible
compatibility
coding
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010085500.5A
Other languages
Chinese (zh)
Other versions
CN111340683A
Inventor
朱明华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Kenid Jumao Medical Equipment Co., Ltd.
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN202011009167.6A (CN112215743A)
Priority to CN202011011436.2A (CN112215744A)
Priority to CN202010085500.5A (CN111340683B)
Publication of CN111340683A
Application granted
Publication of CN111340683B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00: Geometric image transformations in the plane of the image
    • G06T 3/02: Affine transformations
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 9/00: Image coding
    • G06T 9/002: Image coding using neural networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Multimedia (AREA)
  • Facsimiles In General (AREA)
  • Image Processing (AREA)

Abstract

The embodiments of the present application provide an image data processing method, an image data processing device, an image processing system and a server. An image compatibility test is performed on image data to be tested from different image sources to obtain compatibility test parameters for each image source. A compatibility test result for each image source is then determined from the compatibility test parameters and the image format type of the image data to be tested, image coding parameters are generated for the corresponding image source according to each compatibility test result, and the image data input by that image source is encoded according to those parameters. In this way, an image compatibility test can be performed on the image data of different image sources, so that the image data input by each source is encoded and corrected in a targeted manner; the user's intelligent terminal can therefore be well compatible with image data from various image sources, and the image display effect is improved.

Description

Image data processing method, image data processing device, image processing system and server
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image data processing method, an image data processing apparatus, an image processing system, and a server.
Background
With the rapid development of internet technology, users can upload image data to a server through an intelligent terminal at any time, or download image data of interest from the server. However, the user's intelligent terminal may not be well compatible with image data from different image sources, so the downloaded image data may be unusable, or its display quality may be lower than the original image quality, which seriously affects the user experience.
Disclosure of Invention
To overcome at least the above disadvantages in the prior art, an object of the present application is to provide an image data processing method, an image data processing apparatus, an image processing system and a server that can perform an image compatibility test on image data from different image sources, so that the image data input by each source can be encoded and corrected in a targeted manner. The user's intelligent terminal can then be better compatible with image data from various image sources, and the image display effect is improved.
In a first aspect, the present application provides an image data processing method, applied to a server, where the server is communicatively connected to an image source, and the method includes:
performing an image compatibility test on image data to be tested from different image sources to obtain compatibility test parameters for each image source, where the compatibility test parameters include compatibility test parameters of a plurality of compatibility test nodes, and each compatibility test node corresponds to one compatibility test type;
determining a compatibility test result for each image source according to the compatibility test parameters and the image format type of the image data to be tested; and
respectively generating image coding parameters for the corresponding image source according to the compatibility test results, and performing image coding processing on the image data input by the image source according to the image coding parameters.
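For orientation, the three steps just listed can be pictured with a minimal Python sketch. Every name and placeholder body below (ImageData, run_compatibility_test, the threshold values) is an illustrative assumption and not part of the claimed method.

```python
from dataclasses import dataclass
from typing import Dict


@dataclass
class ImageData:
    source_id: str
    format_type: str          # e.g. "JPG" or "PNG"
    pixels: bytes = b""


def run_compatibility_test(data: ImageData) -> Dict[str, float]:
    # Placeholder: one compatibility test parameter per compatibility test node.
    return {"software_os": 0.9, "hardware_model": 0.8, "resolution": 0.7}


def determine_result(params: Dict[str, float], fmt: str) -> Dict[str, bool]:
    # Placeholder: mark a test node compatible if its parameter clears a
    # threshold chosen per image format type.
    threshold = 0.75 if fmt == "JPG" else 0.8
    return {node: value >= threshold for node, value in params.items()}


def generate_coding_params(result: Dict[str, bool]) -> Dict[str, int]:
    # Placeholder: request stronger re-encoding for nodes that failed the test.
    return {node: (1 if ok else 3) for node, ok in result.items()}


def encode(data: ImageData, coding_params: Dict[str, int]) -> bytes:
    # Placeholder for the actual per-source image coding processing.
    return data.pixels


def process_source(data: ImageData) -> bytes:
    params = run_compatibility_test(data)                  # compatibility test
    result = determine_result(params, data.format_type)    # compatibility result
    return encode(data, generate_coding_params(result))    # per-source encoding
```

In practice each step would be driven by the per-source compatibility test models and coding sample databases described in the designs below.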
In a possible design of the first aspect, the step of performing an image compatibility test on image data to be tested based on different image sources to obtain compatibility test parameters for each different image source includes:
respectively constructing a compatibility test model of an image source type corresponding to each image source according to a plurality of compatibility test nodes aiming at the image data to be tested of each image source to obtain a plurality of compatibility test models;
testing image data to be tested of the image source according to each compatible test model, taking the image source type of the image data to be tested as a point convolution network layer of a compatible test network structure of a test input part as a test network layer, and testing the image data to be tested by a preset network depth to obtain a test result of a first compatible test network structure corresponding to each compatible test model;
the test result of the first compatible test network structure is pushed backwards by the position of the preset network layer number according to the point convolution network layer to obtain the test result of the second compatible test network structure corresponding to each compatible test model, and the test result of the second compatible test network structure is pushed backwards by the position of the preset network layer number according to the depth convolution layer direction to obtain the test result of the third compatible test network structure corresponding to each compatible test model;
taking the depth convolution layer of the third compatibility test network structure as a test network layer, testing preset convolution kernel nodes on the test result to obtain the test result of a fourth compatibility test network structure corresponding to each compatibility test model, and setting the test result of the fourth compatibility test network structure as the test result of a target compatibility test network structure;
acquiring a position identification position of the position of each target compatibility test network structure, and setting the acquired position identification position as a space identification position point of the position of each compatibility test node forming a test process;
acquiring a network input depth and a network input direction of a network input direction based on a test input part, outwardly setting a vector direction based on the test input part according to the network input depth and the network input direction, and acquiring a unit vector of the vector direction;
mapping the spatial identification position point of the position of each compatible test node to a test network layer corresponding to the unit vector, acquiring the compatible test node with the longest position in the position between each compatible test node and the test input part in the test network layer, and then taking the network structure between the compatible test node with the longest position and the test input part as a first network structure, taking the part connected with the first network structure as a second network structure, and establishing a corresponding target network structure according to the first network structure and the second network structure;
determining the position of each compatible test node according to the target network structure to obtain a compatible test network layer of each compatible test node, executing a test process of the image data to be tested according to the compatible test network layer of each compatible test node, and labeling each compatible test node in the test process according to the compatible test network layer of each compatible test node;
and summarizing the output parameters of the test network layer where the marked test process is located to obtain the compatible test parameters of each compatible test node.
In a possible design of the first aspect, the step of determining a compatibility test result for each image source according to the compatibility test parameter and an image format type of the image data to be tested includes:
inputting the compatibility test parameters into an image parameter matrix corresponding to the image format type of the image data to be tested aiming at the compatibility test parameters of each compatibility test node, extracting the output of a plurality of different matrix nodes of the image parameter matrix as the characteristics of a plurality of parameter types, and combining the characteristics of the plurality of parameter types to generate the characteristics of the compatibility test parameters;
based on the extracted features of the compatible test parameters, predicting confidence coefficient that each feature of the compatible test parameters is an abnormal feature and identification position of the feature relative to the compatible test process information of the compatible test item when each feature is an abnormal feature by adopting an abnormal prediction network corresponding to the image format type of the image data to be tested;
determining the characteristic with the confidence coefficient higher than a preset confidence coefficient threshold value as an abnormal characteristic, and determining the identification position of the compatible test process information of the compatible test item positioned by the abnormal characteristic based on the position difference of the identification position of each abnormal characteristic relative to the identification position of the process node of the compatible test process information of the compatible test item where the abnormal characteristic is positioned;
filtering the identification positions of the compatible test process information of the compatible test items which are repeatedly positioned in the identification positions of the compatible test process information of the compatible test items which are positioned by the abnormal features to obtain the identification positions of the compatible test process information of the candidate compatible test items;
inputting the extracted features of the compatibility test parameters into a compatibility test item index model corresponding to the image format type of the image data to be tested, and processing index feature information output by the compatibility test item index model based on the identification position of the compatibility test process information of each candidate compatibility test item to generate a compatibility test item feature quantity set for representing the features of the candidate compatibility test items;
determining related candidate compatible test items of each candidate compatible test item in the compatibility test parameters based on the identification positions of the compatible test process information of the candidate compatible test items, and calculating the characteristic quantity set difference between the characteristic quantity set of the compatible test items of the candidate compatible test items and the characteristic quantity set of the compatible test items of the corresponding related candidate compatible test items as the difference degree between the candidate compatible test items and the corresponding related candidate compatible test items;
taking the associated candidate compatible test items with the difference degree smaller than a preset difference degree threshold value with the candidate compatible test items as associated candidate compatible test items related to the candidate compatible test items, and combining each candidate compatible test item and the related associated candidate compatible test items into a compatible test item group;
and determining the test results of all compatible test items in the compatible test item group as the compatible test results of the compatible test node based on the identification positions of the compatible test process information of each candidate compatible test item in the compatible test item group.
In a possible design of the first aspect, the step of processing the index feature information output by the compatibility test item index model based on the identification position of the compatibility test process information of each candidate compatibility test item to generate a compatible test item feature quantity set used for characterizing features of the candidate compatibility test item includes:
extracting corresponding affine transformation data from the index feature information output by the compatible test item index model based on the identification position of the compatible test process information of each candidate compatible test item to obtain an affine transformation data set;
performing fusion operation on affine transformation data in the affine transformation data set based on a preset fusion comparison value to obtain at least one affine transformation data set;
determining at least one continuous number range based on the continuous number of affine transformation data included in the affine transformation data sets, wherein the range covered by the at least one continuous number range contains the continuous number of affine transformation data included in each affine transformation data set in the at least one affine transformation data set;
for each of the at least one continuous number range, combining affine transformation data sets including successive numbers of affine transformation data in the continuous number range into an affine transformation data set group;
extracting, for each of the at least one affine transformation data set group, a set number of affine transformation data sets from the affine transformation data set group;
for each extracted affine transformation data set in the affine transformation data sets, determining average affine transformation data of the affine transformation data included in the affine transformation data set, wherein the average affine transformation data is a vector formed by the average values of vectors at the same position in the affine transformation vectors;
determining the similarity between every two determined average affine transformation data, and determining the similarity between the determined average affine transformation data as a comparison parameter between corresponding affine transformation data sets, wherein the comparison parameter is used for representing the similarity after comparison between the affine transformation data sets;
combining affine transformation data sets corresponding to the determined contrast parameters with the contrast parameters larger than a preset set value into affine transformation data sets for comparison;
determining an affine transformation data set corresponding to the contrast parameters with the determined contrast parameters less than or equal to the set value as an affine transformation data set for comparison;
for each affine transformation data set in the at least one affine transformation data set for comparison, determining affine transformation data with which a contrast parameter for representing a degree of similarity after comparison between affine transformation data and affine transformation data sets is greater than a set value and less than a fusion contrast value as affine transformation data to be output, from affine transformation data in the affine transformation data sets that do not belong to the at least one affine transformation data set for comparison;
and selecting a set number of affine transformation data from the affine transformation data to be output and outputting a compatible test item feature quantity set for characterizing the features of the candidate compatible test items.
In a possible design of the first aspect, the step of generating image encoding parameters of corresponding image sources according to the compatibility test results includes:
constructing an encoding sample database of a corresponding image source according to a compatibility test result of each compatibility test node, wherein the encoding sample database contains image encoding parameters of image encoding nodes corresponding to each compatibility test node extracted from the compatibility test result;
and generating image coding parameters of the image source according to the image coding parameters of the image coding nodes corresponding to the compatible test nodes and the compatibility test incidence relation among the compatible test nodes, wherein the compatibility test incidence relation is used for representing the time sequence relation among the compatible test nodes, and the image coding parameters are arranged according to the compatibility test incidence relation to obtain the image coding parameters of the image source.
In one possible design of the first aspect, the step of performing image encoding processing on the image data input by the image source according to the image encoding parameter includes:
extracting the characteristics of image coding information in image data input by the image source to obtain coding node characteristics of the image coding information;
and acquiring an image coding information state corresponding to the coding node characteristics, and processing the coding node characteristics of the image coding information according to the image coding parameters and the image coding information state.
In one possible design of the first aspect, the processing the coding node characteristic of the image coding information according to the image coding parameter and the image coding information state includes:
acquiring each coding node characteristic of image coding information of image data input by the image source and a state bitmap corresponding to each coding node characteristic according to the image coding information state, and acquiring each coding node characteristic in the image coding parameters and a state bitmap corresponding to each coding node characteristic;
generating a first coding bitmap of a state bitmap about coding node characteristics according to the coding node characteristics of the image coding information and the state bitmap corresponding to the coding node characteristics, and generating a second coding bitmap of the state bitmap about the coding node characteristics according to the coding node characteristics in the image coding parameters and the state bitmap corresponding to the coding node characteristics, wherein the coding node characteristics of the image coding information correspond to the coding node characteristics in the image coding parameters;
sequentially comparing the state bitmap corresponding to each coding node feature of the second coding bitmap with the state bitmap corresponding to each coding node feature of the first coding bitmap, and judging whether the state bitmap corresponding to the coding node feature of the second coding bitmap matches the state bitmap corresponding to the coding node feature of the first coding bitmap;
for each coding node feature, when the state bitmap of the second coding bitmap does not match with the corresponding first coding bitmap state bitmap, determining the difference value of each bitmap point between the state bitmap of the first coding bitmap and the state bitmap of the second coding bitmap as a coding comparison value corresponding to the coding node feature, and generating a corresponding coding comparison bitmap according to each coding node feature and the coding comparison value corresponding to each coding node feature;
and processing the coding node characteristics of the image coding information according to the value of each bitmap point in the coding comparison bitmap.
In a second aspect, an embodiment of the present application further provides an image data processing apparatus, which is applied to a server, where the server is communicatively connected to an image source, and the apparatus includes:
the device comprises a test module, a processing module and a processing module, wherein the test module is used for carrying out image compatibility test on image data to be tested based on different image sources so as to obtain compatibility test parameters aiming at each different image source, the compatibility test parameters comprise compatibility test parameters of a plurality of compatibility test nodes, and each compatibility test node corresponds to one compatibility test type;
the determining module is used for determining a compatibility test result aiming at each image source according to the compatibility test parameters and the image format type of the image data to be tested;
and the coding processing module is used for respectively generating image coding parameters of corresponding image sources according to the compatibility test results and carrying out image coding processing on image data input by the image sources according to the image coding parameters.
In a third aspect, an embodiment of the present application further provides an image processing system, where the image processing system includes a server and an image source communicatively connected to the server;
the image source is used for sending image data to be detected to the server;
the server is used for carrying out image compatibility test on image data to be tested based on different image sources so as to obtain compatibility test parameters aiming at each different image source, the compatibility test parameters comprise compatibility test parameters of a plurality of compatibility test nodes, and each compatibility test node corresponds to one compatibility test type;
the server is used for determining a compatibility test result aiming at each image source according to the compatibility test parameters and the image format type of the image data to be tested;
and the server is used for respectively generating image coding parameters of corresponding image sources according to the compatibility test results and carrying out image coding processing on image data input by the image sources according to the image coding parameters.
In a fourth aspect, the embodiments of the present application further provide a server, where the server includes a processor, a machine-readable storage medium, and a network interface, where the machine-readable storage medium, the network interface, and the processor are connected through a bus system, the network interface is configured to be communicatively connected to at least one image source, the machine-readable storage medium is configured to store a program, instructions, or codes, and the processor is configured to execute the program, instructions, or codes in the machine-readable storage medium to perform the image data processing method in the first aspect or any possible design of the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium storing instructions which, when run on a computer, cause the computer to perform the image data processing method in the first aspect or any one of the possible designs of the first aspect.
Based on any one of the above aspects, the present application performs an image compatibility test on image data to be tested from different image sources to obtain compatibility test parameters for each image source, then determines a compatibility test result for each image source according to the compatibility test parameters and the image format type of the image data to be tested, generates image coding parameters for the corresponding image source according to each compatibility test result, and performs image coding processing on the image data input by that image source according to the image coding parameters. In this way, an image compatibility test can be performed on the image data of different image sources, so that the image data input by each source is encoded and corrected in a targeted manner; the user's intelligent terminal can thus be well compatible with image data from various image sources, and the image display effect is improved.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting its scope; for those skilled in the art, other related drawings can be derived from these drawings without inventive effort.
Fig. 1 is a schematic view of an application scenario of an image processing system according to an embodiment of the present application;
fig. 2 is a schematic flowchart of an image data processing method according to an embodiment of the present application;
fig. 3 is a schematic functional block diagram of an image data processing apparatus according to an embodiment of the present application;
fig. 4 is a block diagram schematically illustrating a structure of a server for implementing the image data processing method according to an embodiment of the present application.
Detailed Description
The present application will now be described in detail with reference to the drawings; the specific operations in the method embodiments may also be applied to the apparatus or system embodiments. In the description of the present application, "at least one" includes one or more unless otherwise specified, and "plurality" means two or more. For example, at least one of A, B and C includes: A alone, B alone, A and B in combination, A and C in combination, B and C in combination, and A, B and C in combination. In this application, "/" means "or"; for example, A/B may mean A or B. "And/or" merely describes an association between objects and covers three cases; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone.
Fig. 1 is an interactive schematic diagram of an image processing system 10 according to an embodiment of the present application. The image processing system 10 may include a server 100 and an image source 200 communicatively coupled to the server 100, and the server 100 may include a processor therein to perform operations of instructions. The image processing system 10 shown in fig. 1 is only one possible example, and in other possible embodiments, the image processing system 10 may include only a portion of the components shown in fig. 1 or may include other components.
In some embodiments, the server 100 may be a single server or a group of servers. The set of servers may be centralized or distributed (e.g., server 100 may be a distributed system). In some embodiments, the server 100 may be local or remote with respect to the image source 200. For example, the server 100 may access information stored in the image source 200 and a database, or any combination thereof, via a network. As another example, the server 100 may be directly connected to at least one of the image source 200 and a database to access information and/or data stored therein. In some embodiments, the server 100 may be implemented on a cloud platform; by way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud (community cloud), a distributed cloud, an inter-cloud, a multi-cloud, and the like, or any combination thereof.
In some embodiments, the server 100 may include a processor. The processor may process information and/or data related to the service request to perform one or more of the functions described herein. A processor may include one or more processing cores (e.g., a single-core processor or a multi-core processor). Merely by way of example, a processor may include a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), an Application Specific Instruction-set Processor (ASIP), a Graphics Processing Unit (GPU), a Physics Processing Unit (PPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), a Programmable Logic Device (PLD), a controller, a microcontroller unit, a Reduced Instruction Set Computer (RISC), a microprocessor, or the like, or any combination thereof.
The network may be used for the exchange of information and/or data. In some embodiments, one or more components in the image processing system 10 (e.g., the server 100, the image source 200 and the database) may send information and/or data to other components. In some embodiments, the network may be any type of wired or wireless network, or a combination thereof. Merely by way of example, the network may include a wired network, a wireless network, a fiber optic network, a telecommunications network, an intranet, the internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a Bluetooth network, a ZigBee network, a Near Field Communication (NFC) network, or the like, or any combination thereof. In some embodiments, the network may include one or more network access points. For example, the network may include wired or wireless network access points, such as base stations and/or network switching nodes, through which one or more components of the image processing system 10 may connect to the network to exchange data and/or information.
The aforementioned database may store data and/or instructions. In some embodiments, the database may store data assigned to the image sources 200. In some embodiments, the database may store data and/or instructions for the exemplary methods described herein. In some embodiments, the database may include mass storage, removable storage, volatile read-write memory, or Read-Only Memory (ROM), or the like, or any combination thereof. By way of example, mass storage may include magnetic disks, optical disks, solid state drives, and the like; removable memory may include flash drives, floppy disks, optical disks, memory cards, zip disks, tapes, and the like; volatile read-write memory may include Random Access Memory (RAM); the RAM may include Dynamic RAM (DRAM), Double Data Rate Synchronous Dynamic RAM (DDR SDRAM), Static RAM (SRAM), Thyristor-based RAM (T-RAM), Zero-capacitor RAM (Z-RAM), and the like. By way of example, ROM may include Mask ROM (MROM), Programmable ROM (PROM), Erasable Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), Compact Disc ROM (CD-ROM), Digital Versatile Disc ROM (DVD-ROM), and the like. In some embodiments, the database may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
In some embodiments, the database may be connected to a network to communicate with one or more components in the image processing system 10 (e.g., the server 100, the image source 200, etc.). One or more components in the image processing system 10 may access data or instructions stored in the database via the network. In some embodiments, the database may be directly connected to one or more components in the image processing system 10 (e.g., the server 100, the image source 200, etc.). Alternatively, in some embodiments, the database may be part of the server 100.
To solve the technical problem in the foregoing background art, fig. 2 is a flowchart illustrating an image data processing method provided in an embodiment of the present application, which can be executed by the server 100 shown in fig. 1, and the image data processing method is described in detail below.
Step S110, performing an image compatibility test on the image data to be tested based on different image sources 200 to obtain a compatibility test parameter for each different image source 200.
Step S120, determining a compatibility test result for each image source 200 according to the compatibility test parameters and the image format type of the image data to be tested.
Step S130, respectively generating image coding parameters of the corresponding image source 200 according to the compatibility test result, and performing image coding processing on the image data input by the image source 200 according to the image coding parameters.
In this embodiment, the image source 200 may refer to various image capturing devices for generating image data, such as a smart phone, a tablet computer, a single lens reflex camera, and the like.
In this embodiment, the compatibility test parameters may include compatibility test parameters of a plurality of compatibility test nodes, and each of the compatibility test nodes corresponds to one of the compatibility test types. For example, the compatibility test type may include a compatibility test type of a software operating system, a compatibility test type of a hardware model, a compatibility test type of a resolution, and the like, and is not particularly limited herein.
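For illustration only, the relationship between compatibility test nodes and compatibility test types can be modelled with a small data structure; the class and field names below are assumptions, and the set of test types is not limited to the examples listed in the text.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Dict


class CompatTestType(Enum):
    # Example test types mentioned in the text; the concrete set is open.
    SOFTWARE_OS = "software_operating_system"
    HARDWARE_MODEL = "hardware_model"
    RESOLUTION = "resolution"


@dataclass
class CompatTestNode:
    test_type: CompatTestType                                # one type per node
    params: Dict[str, float] = field(default_factory=dict)   # node test parameters
```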
In this embodiment, the image format type may be an original image format of the test image data, such as a JPG format, a PNG format, and the like.
Based on the above steps, this embodiment performs an image compatibility test on the image data to be tested from different image sources 200 to obtain a compatibility test parameter for each image source 200, then determines a compatibility test result for each image source 200 according to the compatibility test parameters and the image format type of the image data to be tested, generates image coding parameters for the corresponding image source 200 according to each compatibility test result, and performs image coding processing on the image data input by the image source 200 according to those image coding parameters. In this way, an image compatibility test can be performed on the image data of different image sources 200, so that the image data input by each image source 200 is encoded and corrected in a targeted manner; the user's intelligent terminal can thus be better compatible with image data from the various image sources 200, and the image display effect is improved.
In a possible design, for step S110, in order to accurately adapt to the image source type of each image source 200 during the compatibility test, this embodiment constructs, for the image data to be tested of each image source 200, a compatibility test model of the image source type corresponding to that image source 200 according to a plurality of compatibility test nodes, thereby obtaining a plurality of compatibility test models. The image data to be tested of the image source 200 is then tested according to each compatibility test model, taking the image source type of the image data to be tested as the point convolution network layer of the compatibility test network structure of the test input portion as the test network layer, and testing the image data to be tested to the preset network depth, so as to obtain the test result of the first compatibility test network structure corresponding to each compatibility test model.
It should be noted that the preset network depth may be set according to an actual strength requirement for the compatibility test, and the embodiment is not particularly limited. For example, the preset network depth may be set to 10, that is, 10 tests of the network layer are performed.
On the basis, the test result of the first compatible test network structure can be tested according to the position of the point convolution network layer backward pushing the preset network layer number to obtain the test result of the second compatible test network structure corresponding to each compatible test model, and the test result of the second compatible test network structure is pushed backward according to the depth convolution layer direction to obtain the test result of the third compatible test network structure corresponding to each compatible test model.
It should be noted that the preset network layer number may also be set according to actual test requirements, and when the preset network layer number is larger, the test requirement on the compatibility is higher.
Therefore, the deep convolution layer of the third compatible test network structure can be used as a test network layer, the test result is tested by the preset convolution kernel node, the test result of the fourth compatible test network structure corresponding to each compatible test model is obtained, and the test result of the fourth compatible test network structure is set as the test result of the target compatible test network structure.
Then, a position identification position of a position where each target compatibility test network structure is located can be obtained, each obtained position identification position is set as a space identification position point of the position where each compatibility test node forming the test process is located, then a network input depth and a network input direction of a network input direction with the test input part as a reference are obtained, a vector direction with the test input part as the reference is outwards arranged according to the network input depth and the network input direction, and a unit vector of the vector direction is obtained.
It should be noted that the network input depth may refer to depth information from the beginning of the test input portion to the end of the test, and the network input direction may refer to a vector direction from the beginning of the test input portion to the end of the test.
Then, the spatial identification position point of the position of each compatible test node can be mapped to the test network layer corresponding to the unit vector, and after the compatible test node with the longest position in the positions between each compatible test node and the test input part is obtained in the test network layer, the network structure between the compatible test node with the longest position and the test input part is used as a first network structure, the part connected with the first network structure is used as a second network structure, and a corresponding target network structure is established according to the first network structure and the second network structure.
Therefore, the position of each compatible test node can be determined according to the target network structure, the compatible test network layer of each compatible test node is obtained, the test process of the image data to be tested is executed according to the compatible test network layer of each compatible test node, each compatible test node in the test process is labeled according to the compatible test network layer of each compatible test node, and therefore the output parameters of the test network layer where the labeled test process is located can be summarized, and the compatible test parameters of each compatible test node are obtained.
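A heavily simplified sketch of this procedure is given below: one compatibility test model per image source type, tested layer by layer to a preset network depth, with the per-node outputs summarized as the compatibility test parameters. The point/depth convolution details and the position mapping described above are not reproduced, and all names, placeholder values and the depth constant are assumptions.

```python
from typing import Dict, List, Tuple

PRESET_NETWORK_DEPTH = 10   # the text mentions 10 as an example depth


def build_test_models(source_types: List[str],
                      test_nodes: List[str]) -> Dict[str, List[str]]:
    # One compatibility test model per image source type, built from the nodes.
    return {src: list(test_nodes) for src in source_types}


def collect_node_params(model_nodes: List[str],
                        image_data: bytes) -> Dict[str, List[Tuple[int, int]]]:
    """Test the data layer by layer and summarize per-node output parameters."""
    params: Dict[str, List[Tuple[int, int]]] = {}
    for node in model_nodes:
        outputs = []
        for layer in range(PRESET_NETWORK_DEPTH):
            # Placeholder test at this network layer; the node is labelled with
            # the layer it was tested in, and the layer output is recorded.
            outputs.append((layer, len(image_data) % (layer + 1)))
        params[node] = outputs       # summarized compatibility test parameters
    return params
```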
In a possible design, for step S120, after the compatibility test parameters for each image source 200 have been determined, in order to further adapt to the image format type of the image data to be tested and so improve the accuracy of the compatibility test, this embodiment inputs, for the compatibility test parameters of each compatibility test node, the compatibility test parameters into the image parameter matrix corresponding to the image format type of the image data to be tested, extracts the outputs of a plurality of different matrix nodes of the image parameter matrix as the features of a plurality of parameter types, and combines the features of the plurality of parameter types to generate the features of the compatibility test parameters.
In this embodiment, corresponding image parameter matrices may be configured in advance for different image format types. Each image parameter matrix is generated by averaging the parameters of a large amount of image data of the corresponding image format type, and each parameter may correspond to one matrix node of the image parameter matrix. The specific configuration can be set flexibly by the user, and this embodiment is not particularly limited in this respect.
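As a rough sketch of how such a matrix could be built and used, assuming each image contributes one fixed-length parameter vector and each matrix node is simply the per-parameter mean (neither choice is prescribed by the text):

```python
import numpy as np


def build_image_parameter_matrix(param_vectors: np.ndarray) -> np.ndarray:
    """Average the parameter vectors of many images of one format type.

    param_vectors has shape (num_images, num_params); each column becomes
    one matrix node of the resulting image parameter matrix.
    """
    return param_vectors.mean(axis=0)


def extract_test_param_features(matrix_nodes: np.ndarray,
                                test_params: np.ndarray) -> np.ndarray:
    # Take the output of each matrix node for the given compatibility test
    # parameters and combine them into a single feature vector.
    node_outputs = matrix_nodes * test_params
    return node_outputs.reshape(-1)
```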
On the basis, based on the extracted features of the compatible test parameters, the confidence coefficient that each feature of the compatible test parameters is an abnormal feature and the identification position of the feature relative to the compatible test process information of the compatible test item where the feature is located when each feature is the abnormal feature can be predicted by adopting the abnormal prediction network corresponding to the image format type of the image data to be tested.
For example, features with a confidence level higher than a preset confidence level threshold may be determined as abnormal features, and based on the position difference between the determined identification position of each abnormal feature and the identification position of the process node of the compatible test process information of the compatible test item where the abnormal feature is located, the identification position of the compatible test process information of the compatible test item located by the abnormal feature may be determined.
For example, the identification position of the compatible test process information of the compatible test item repeatedly positioned in the identification positions of the compatible test process information of the compatible test items positioned by the different abnormal features may be filtered out, so as to obtain the identification position of the compatible test process information of each candidate compatible test item.
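A compact sketch of this filtering step follows; the confidence threshold value and the way positions are derived from position differences are assumptions made only for illustration.

```python
from typing import Dict, List


def locate_candidate_items(feature_confidence: Dict[str, float],
                           feature_position: Dict[str, int],
                           node_position: Dict[str, int],
                           confidence_threshold: float = 0.8) -> List[int]:
    positions = []
    for feat, conf in feature_confidence.items():
        if conf <= confidence_threshold:
            continue                  # only features above the threshold are abnormal
        # Identification position of the test-process information located by the
        # abnormal feature, from its position difference to the process node.
        positions.append(feature_position[feat] - node_position[feat])
    # Filtering repeatedly located positions leaves the candidate test items.
    return sorted(set(positions))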
Then, the extracted features of the compatible test parameters can be input into a compatible test item index model corresponding to the image format type of the image data to be tested, and index feature information output by the compatible test item index model is processed based on the identification position of the compatible test process information of each candidate compatible test item, so that a compatible test item feature quantity set used for representing the features of the candidate compatible test items is generated.
Then, based on the identification position of the compatible testing process information of the candidate compatible testing item, the associated candidate compatible testing item of each candidate compatible testing item is determined in the compatible testing parameters, and the feature quantity set difference between the feature quantity set of the compatible testing item of the candidate compatible testing item and the feature quantity set of the compatible testing item of each associated candidate compatible testing item is calculated as the difference degree between the candidate compatible testing item and each associated candidate compatible testing item.
Therefore, the associated candidate compatible test items with the difference degree smaller than the preset difference degree threshold value with the candidate compatible test items can be used as the associated candidate compatible test items related to the candidate compatible test items, and each candidate compatible test item and the related associated candidate compatible test items are combined into the compatible test item group, so that the test results of all compatible test items in the compatible test item group can be determined as the compatible test results of the compatible test node based on the identification positions of the compatible test process information of each candidate compatible test item in the compatible test item group.
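The grouping by difference degree can be sketched as follows; the mean absolute difference used here as the "feature quantity set difference" and the threshold value are assumptions for illustration.

```python
from typing import Dict, List


def group_test_items(feature_sets: Dict[str, List[float]],
                     related: Dict[str, List[str]],
                     diff_threshold: float = 0.2) -> Dict[str, List[str]]:
    groups: Dict[str, List[str]] = {}
    for item, fset in feature_sets.items():
        group = [item]
        for other in related.get(item, []):
            other_set = feature_sets[other]
            # Difference degree between the two feature quantity sets.
            diff = sum(abs(a - b) for a, b in zip(fset, other_set)) / max(len(fset), 1)
            if diff < diff_threshold:
                group.append(other)   # related associated candidate test item
        groups[item] = group          # one compatible test item group
    return groups
```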
In the above process, as a possible implementation manner, in the process of processing the index feature information output by the compatible test item index model based on the identification position of the compatible test process information of each candidate compatible test item to generate the compatible test item feature quantity set used for characterizing the features of the candidate compatible test item, this embodiment may further extract corresponding affine transformation data from the index feature information output by the compatible test item index model based on the identification position of the compatible test process information of each candidate compatible test item to obtain an affine transformation data set, and perform a fusion operation on the affine transformation data in the affine transformation data set based on a preset fusion contrast value to obtain at least one affine transformation data set.
On this basis, at least one continuous number range may be determined based on the continuous number of affine transformation data included in the affine transformation data sets, wherein the range covered by the at least one continuous number range contains the continuous number of affine transformation data included in each of the at least one affine transformation data sets.
For example, for each of at least one consecutive number range (e.g., 5), affine transformation data sets including consecutive numbers of affine transformation data in the consecutive number range are combined into an affine transformation data set group.
For another example, for each of at least one affine transformation data set group, a set number of affine transformation data sets are extracted from the affine transformation data set group.
For another example, for each extracted affine transformation data set in the respective affine transformation data sets, average affine transformation data of the affine transformation data included in the affine transformation data set is determined, where the average affine transformation data is a vector formed by the mean values of vectors at the same position in the respective affine transformation vectors.
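The averaging just described is a plain element-wise mean over the vectors of one set, for example:

```python
from typing import List


def average_affine_data(affine_set: List[List[float]]) -> List[float]:
    """Element-wise mean of the affine transformation vectors in one set."""
    length = len(affine_set[0])
    return [sum(vec[i] for vec in affine_set) / len(affine_set)
            for i in range(length)]


# average_affine_data([[1.0, 0.0, 2.0], [3.0, 0.0, 4.0]]) -> [2.0, 0.0, 3.0]
```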
Then, the similarity between every two determined average affine transformation data can be determined, and the similarity between the determined average affine transformation data is determined as a comparison parameter between corresponding affine transformation data sets, wherein the comparison parameter is used for representing the similarity after comparison between the affine transformation data sets.
Then, affine transformation data sets corresponding to the determined contrast parameters with the contrast parameters larger than the preset set value may be merged into affine transformation data sets for comparison, and affine transformation data sets corresponding to the determined contrast parameters with the contrast parameters smaller than or equal to the set value may be determined as affine transformation data sets for comparison.
As one possible example, for each affine transformation data set of the at least one affine transformation data set for comparison, affine transformation data having a comparison parameter with the affine transformation data set larger than a set value and smaller than a fusion comparison value is determined as affine transformation data to be output, from affine transformation data of the affine transformation data sets that do not belong to the at least one affine transformation data set for comparison, wherein the comparison parameter is used to represent a degree of similarity after comparison between the affine transformation data and the affine transformation data sets.
Thus, it is possible to select a set number of affine transformation data from the respective affine transformation data to be output and output a compatible test item feature quantity set for characterizing features of candidate compatible test items.
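A sketch of this comparison-and-selection step is shown below, reusing average_affine_data from the earlier sketch. The text does not fix a similarity measure, so cosine similarity is used here purely as an assumption, as are the threshold values.

```python
import math
from typing import List


def cosine_similarity(u: List[float], v: List[float]) -> float:
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u)) or 1.0
    nv = math.sqrt(sum(b * b for b in v)) or 1.0
    return dot / (nu * nv)


def select_feature_quantities(comparison_sets: List[List[List[float]]],
                              remaining_data: List[List[float]],
                              set_value: float = 0.5,
                              fusion_contrast: float = 0.9,
                              out_count: int = 3) -> List[List[float]]:
    """Keep affine data whose contrast parameter to a comparison set lies
    strictly between the set value and the fusion contrast value, then output
    a set number of them as the compatible test item feature quantity set."""
    to_output = []
    for cmp_set in comparison_sets:
        centre = average_affine_data(cmp_set)   # from the earlier sketch
        for vec in remaining_data:
            if set_value < cosine_similarity(centre, vec) < fusion_contrast:
                to_output.append(vec)
    return to_output[:out_count]
```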
In a possible design, further referring to step S130, the present embodiment may construct an encoding sample database of the corresponding image source 200 according to the compatibility test result of each compatibility test node, where the encoding sample database includes the image encoding parameters of the image encoding node corresponding to each compatibility test node extracted from the compatibility test result.
Therefore, the image coding parameters of the image source 200 can be generated according to the image coding parameters of the image coding nodes corresponding to each compatible test node and the compatibility test incidence relation among the compatible test nodes, wherein the compatibility test incidence relation is used for representing the time sequence relation among the compatible test nodes, and the image coding parameters are arranged according to the compatibility test incidence relation to obtain the image coding parameters of the image source 200.
For example, suppose the compatibility test nodes A, B, C and D correspond to the image coding nodes a, b, c and d, respectively. If the compatibility test association relation orders the test nodes as A, C, B, D, then the image coding parameters of the image source 200 are obtained by arranging the image coding parameters of the image coding nodes a, c, b and d in that order, as illustrated in the sketch below.
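The arrangement step reduces to ordering the per-node coding parameters by the association relation; the names and parameter values in this sketch are illustrative only.

```python
from typing import Dict, List


def order_coding_params(association_order: List[str],
                        node_coding_params: Dict[str, str]) -> List[str]:
    """Arrange per-node image coding parameters in the order given by the
    compatibility test association relation (the time sequence of test nodes)."""
    return [node_coding_params[node] for node in association_order]


params = {"A": "params_a", "B": "params_b", "C": "params_c", "D": "params_d"}
print(order_coding_params(["A", "C", "B", "D"], params))
# -> ['params_a', 'params_c', 'params_b', 'params_d']
```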
On the basis, as a possible implementation manner, in this embodiment, feature extraction may be performed on image coding information in image data input by the image source 200 to obtain coding node features of the image coding information, and an image coding information state corresponding to the coding node features is obtained, and the coding node features of the image coding information are processed according to the image coding parameters and the image coding information state.
For example, as a possible implementation manner, each coding node feature of the image coding information of the image data input by the image source 200 and a state bitmap corresponding to each coding node feature in the image coding parameters may be acquired according to the state of the image coding information.
Then, a first coded bitmap of the state bitmap about the coded node features can be generated according to the coded node features of the image coded information and the state bitmap corresponding to the coded node features, and a second coded bitmap of the state bitmap about the coded node features can be generated according to the coded node features of the image coded parameters and the state bitmap corresponding to the coded node features, wherein the coded node features of the image coded information correspond to the coded node features of the image coded parameters.
On this basis, the status bitmap corresponding to each coding node feature of the second coding bitmap may be sequentially compared with the status bitmap corresponding to each coding node feature of the first coding bitmap, and whether the status bitmap corresponding to the coding node feature of the second coding bitmap matches the status bitmap corresponding to the coding node feature of the first coding bitmap may be determined.
Therefore, for each coding node feature, when the state bitmap of the second coding bitmap does not match with the corresponding state bitmap of the first coding bitmap, the difference value of each bitmap point between the state bitmap of the first coding bitmap and the state bitmap of the second coding bitmap is determined as a coding comparison value corresponding to the coding node feature, and the corresponding coding comparison bitmap is generated according to each coding node feature and the coding comparison value corresponding to each coding node feature, so that the coding node feature of the image coding information can be processed according to the value of each bitmap point in the coding comparison bitmap.
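The bitmap comparison can be sketched as a node-by-node difference of state bitmaps; representing a state bitmap as a flat list of integer bitmap points is an assumption made only for this illustration.

```python
from typing import Dict, List


def coding_comparison_bitmap(first_bitmaps: Dict[str, List[int]],
                             second_bitmaps: Dict[str, List[int]]) -> Dict[str, List[int]]:
    """Compare state bitmaps node by node; keep per-point difference values
    (the coding comparison values) only for nodes whose bitmaps do not match."""
    comparison: Dict[str, List[int]] = {}
    for node, second in second_bitmaps.items():
        first = first_bitmaps[node]
        if second == first:
            continue                  # bitmaps match, no correction needed
        comparison[node] = [a - b for a, b in zip(first, second)]
    return comparison


print(coding_comparison_bitmap(
    {"node1": [4, 4, 4], "node2": [2, 2, 2]},
    {"node1": [4, 4, 4], "node2": [1, 3, 2]},
))
# -> {'node2': [1, -1, 0]}
```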
For example, as a possible example, in the process of processing the coding node feature of the image coding information according to the value of each bitmap point in the coding comparison bitmap, for each image coding node in at least one image coding node of the image coding information, the present embodiment may determine, according to the value of the bitmap point corresponding to the image coding node, a first coding correction parameter corresponding to the image coding node, and determine, according to the values of two bitmap points adjacent to the bitmap point, a second coding correction parameter corresponding to the image coding node.
On the basis, the initial coding parameters of the coding node characteristics can be used as correction objects, the initial coding parameters are corrected according to the first coding correction parameters, and then the coding node characteristics are continuously corrected according to the second coding correction parameters.
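Continuing the sketch, one way to apply the two correction parameters to an initial coding parameter might look as follows; how the neighbouring bitmap points are combined is an assumption, since the text only states that both corrections are applied in turn.

```python
from typing import List


def correct_coding_feature(initial_param: float,
                           comparison_values: List[int],
                           index: int) -> float:
    """Correct one coding node's parameter from its own bitmap point (first
    correction) and the two adjacent bitmap points (second correction)."""
    first_correction = comparison_values[index]
    left = comparison_values[index - 1] if index > 0 else 0
    right = comparison_values[index + 1] if index + 1 < len(comparison_values) else 0
    second_correction = (left + right) / 2
    corrected = initial_param + first_correction     # correct with the node's point
    return corrected + second_correction             # then with its neighbours


print(correct_coding_feature(10.0, [1, -1, 0], 1))   # -> 9.5
```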
Fig. 3 is a schematic diagram of the functional modules of an image data processing apparatus 300 according to an embodiment of the present application. The functional modules of the image data processing apparatus 300 may be divided according to the foregoing method embodiment; for example, each functional module may be divided according to its corresponding function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in hardware or as a software functional module. It should be noted that the division of the modules in the present application is schematic and is only a logical functional division; other divisions are possible in actual implementation. In the case where each functional module is divided by its corresponding function, the image data processing apparatus 300 shown in fig. 3 may include a testing module 310, a determining module 320 and an encoding processing module 330. The functions of these functional modules are described in detail below.
A testing module 310, configured to perform an image compatibility test on image data to be tested based on different image sources 200 to obtain a compatibility test parameter for each different image source 200, where the compatibility test parameter includes a compatibility test parameter of a plurality of compatibility test nodes, and each compatibility test node corresponds to one compatibility test type;
a determining module 320, configured to determine a compatibility test result for each image source 200 according to the compatibility test parameter and the image format type of the image data to be tested;
the encoding processing module 330 is configured to generate image encoding parameters of the corresponding image source 200 according to the compatibility test result, and perform image encoding processing on image data input by the image source 200 according to the image encoding parameters.
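As a rough illustration of this logical division, the sketch below (in Python, with invented method names and placeholder return values) groups a testing module, a determining module, and an encoding processing module into one apparatus object and chains them in the order described above; it is a structural sketch only, not the apparatus itself.

```python
class TestingModule:
    def run_compatibility_test(self, image_data, image_source):
        """Return per-node compatibility test parameters for one image source."""
        return {"node_1": {"type": "format", "value": 0.9}}  # placeholder result


class DeterminingModule:
    def determine_result(self, test_params, image_format_type):
        """Combine the test parameters with the image format type into a result."""
        return {"format": image_format_type, "params": test_params}


class EncodingProcessingModule:
    def encode(self, image_data, compatibility_result):
        """Generate coding parameters from the result and encode the input data."""
        coding_params = {"quality": 0.8}  # derived from compatibility_result in practice
        return image_data, coding_params


class ImageDataProcessingApparatus:
    """Logical grouping of the three modules (testing 310, determining 320, encoding 330)."""

    def __init__(self):
        self.testing = TestingModule()
        self.determining = DeterminingModule()
        self.encoding = EncodingProcessingModule()

    def process(self, image_data, image_source, image_format_type):
        params = self.testing.run_compatibility_test(image_data, image_source)
        result = self.determining.determine_result(params, image_format_type)
        return self.encoding.encode(image_data, result)


apparatus = ImageDataProcessingApparatus()
print(apparatus.process([0.1, 0.2], "camera_source", "jpeg"))
```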
Further, fig. 4 is a schematic structural diagram of a server 100 for executing the image data processing method according to an embodiment of the present application. As shown in fig. 4, the server 100 may include a network interface 110, a machine-readable storage medium 120, a processor 130, and a bus 140. There may be one or more processors 130; one processor 130 is illustrated in fig. 4 as an example. The network interface 110, the machine-readable storage medium 120, and the processor 130 may be connected by the bus 140 or in another manner; connection by the bus 140 is taken as the example in fig. 4.
The machine-readable storage medium 120, as a computer-readable storage medium, can be used to store software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the image data processing method in the embodiment of the present application (for example, the testing module 310, the determining module 320, and the encoding processing module 330 of the image data processing apparatus 300 shown in fig. 3). The processor 130 runs the software programs, instructions, and modules stored in the machine-readable storage medium 120, so as to execute the various functional applications and data processing of the terminal device, that is, to implement the image data processing method described above, which will not be repeated here.
The machine-readable storage medium 120 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required by at least one function, and the data storage area may store data created according to the use of the terminal, and the like. Further, the machine-readable storage medium 120 may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be Random Access Memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced Synchronous DRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), and Direct Rambus RAM (DR RAM). It should be noted that the memories of the systems and methods described herein are intended to include, but not be limited to, these and any other suitable types of memory. In some examples, the machine-readable storage medium 120 may further include memory located remotely from the processor 130, which may be connected to the server 100 over a network. Examples of such networks include, but are not limited to, the internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
The processor 130 may be an integrated circuit chip having signal processing capability. In implementation, the steps of the above method embodiments may be performed by integrated logic circuits of hardware in the processor 130 or by instructions in the form of software. The processor 130 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor, and so on. The steps of the methods disclosed in connection with the embodiments of the present application may be directly performed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor.
The server 100 may exchange information with other devices, such as the image source 200, through the network interface 110. Network interface 110 may be a circuit, bus, transceiver, or any other device that may be used to exchange information. Processor 130 may send and receive information using network interface 110.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via a wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) connection. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made to the embodiments of the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the embodiments of the application fall within the scope of the claims and their equivalents, the application is intended to include them.

Claims (9)

1. An image data processing method, applied to a server, wherein the server is in communication connection with an image source, the method comprising:
performing image compatibility test on image data to be tested based on different image sources to obtain compatibility test parameters aiming at each different image source, wherein the compatibility test parameters comprise compatibility test parameters of a plurality of compatibility test nodes, and each compatibility test node corresponds to one compatibility test type;
determining a compatibility test result aiming at each image source according to the compatibility test parameters and the image format type of the image data to be tested;
respectively generating image coding parameters of the corresponding image sources according to the compatibility test results, and performing image coding processing on image data input by the image sources according to the image coding parameters;
the step of performing image compatibility test on image data to be tested based on different image sources to obtain compatibility test parameters for each different image source comprises:
respectively constructing a compatibility test model of an image source type corresponding to each image source according to a plurality of compatibility test nodes aiming at the image data to be tested of each image source to obtain a plurality of compatibility test models;
testing the image data to be tested of the image source according to each compatibility test model, taking, as a test network layer, the point convolution network layer of the compatibility test network structure in which the image source type of the image data to be tested serves as the test input part, and testing the image data to be tested at a preset network depth to obtain a test result of a first compatibility test network structure corresponding to each compatibility test model;
pushing the test result of the first compatibility test network structure backwards by a preset number of network layers along the point convolution network layer to obtain a test result of a second compatibility test network structure corresponding to each compatibility test model, and pushing the test result of the second compatibility test network structure backwards by the preset number of network layers along the depth convolution layer direction to obtain a test result of a third compatibility test network structure corresponding to each compatibility test model;
taking the depth convolution layer of the third compatibility test network structure as a test network layer, testing preset convolution kernel nodes on the test result to obtain the test result of a fourth compatibility test network structure corresponding to each compatibility test model, and setting the test result of the fourth compatibility test network structure as the test result of a target compatibility test network structure;
acquiring a position identification of each target compatibility test network structure, and setting the acquired position identification as a spatial identification position point of each compatibility test node forming the test process;
acquiring a network input depth and a network input direction based on the test input part, setting a vector direction outward from the test input part according to the network input depth and the network input direction, and acquiring a unit vector of the vector direction;
mapping the spatial identification position point of each compatible test node to the test network layer corresponding to the unit vector, acquiring, in the test network layer, the compatible test node located farthest from the test input part, then taking the network structure between the farthest compatible test node and the test input part as a first network structure, taking the part connected with the first network structure as a second network structure, and establishing a corresponding target network structure according to the first network structure and the second network structure;
determining the position of each compatible test node according to the target network structure to obtain a compatible test network layer of each compatible test node, executing a test process of the image data to be tested according to the compatible test network layer of each compatible test node, and labeling each compatible test node in the test process according to the compatible test network layer of each compatible test node;
and summarizing the output parameters of the test network layer where the marked test process is located to obtain the compatible test parameters of each compatible test node.
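The testing flow recited in claim 1 can be pictured with the rough Python sketch below. Every numeric rule, the rotation used to "push back" results, and the per-node summary are invented placeholders standing in for the unspecified convolution tests; the sketch only shows the order of the four successive compatibility test network structures and the final collection of per-node test parameters.

```python
from typing import Dict, List


def run_compatibility_tests(image_data_by_source: Dict[str, List[float]],
                            test_nodes: List[str],
                            layer_offset: int = 2) -> Dict[str, Dict[str, float]]:
    """Hypothetical walk through the four successive test network structures,
    returning per-node compatibility test parameters for each image source."""
    parameters = {}
    for source, pixels in image_data_by_source.items():
        # First structure: point-convolution test at a preset network depth (placeholder rule).
        result = [p * 0.5 for p in pixels]
        # Second structure: push back by the preset number of layers along the point convolution layer.
        result = result[layer_offset:] + result[:layer_offset]
        # Third structure: push back again along the depth-convolution direction.
        result = result[layer_offset:] + result[:layer_offset]
        # Fourth (target) structure: test preset convolution kernel nodes (placeholder rounding).
        result = [round(p, 3) for p in result]
        # Summarize the outputs of the labelled test network layer per compatibility test node.
        parameters[source] = {node: result[i % len(result)]
                              for i, node in enumerate(test_nodes)}
    return parameters


print(run_compatibility_tests({"camera": [1.0, 2.0, 3.0, 4.0]}, ["format", "color"]))
```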
2. The method according to claim 1, wherein the step of determining a compatibility test result for each image source according to the compatibility test parameters and the image format type of the image data to be tested comprises:
inputting the compatibility test parameters into an image parameter matrix corresponding to the image format type of the image data to be tested aiming at the compatibility test parameters of each compatibility test node, extracting the output of a plurality of different matrix nodes of the image parameter matrix as the characteristics of a plurality of parameter types, and combining the characteristics of the plurality of parameter types to generate the characteristics of the compatibility test parameters;
based on the extracted features of the compatible test parameters, predicting confidence coefficient that each feature of the compatible test parameters is an abnormal feature and identification position of the feature relative to the compatible test process information of the compatible test item when each feature is an abnormal feature by adopting an abnormal prediction network corresponding to the image format type of the image data to be tested;
determining the characteristic with the confidence coefficient higher than a preset confidence coefficient threshold value as an abnormal characteristic, and determining the identification position of the compatible test process information of the compatible test item positioned by the abnormal characteristic based on the position difference of the identification position of each abnormal characteristic relative to the identification position of the process node of the compatible test process information of the compatible test item where the abnormal characteristic is positioned;
filtering the identification positions of the compatible test process information of the compatible test items which are repeatedly positioned in the identification positions of the compatible test process information of the compatible test items which are positioned by the abnormal features to obtain the identification positions of the compatible test process information of the candidate compatible test items;
inputting the extracted features of the compatibility test parameters into a compatibility test item index model corresponding to the image format type of the image data to be tested, and processing index feature information output by the compatibility test item index model based on the identification position of the compatibility test process information of each candidate compatibility test item to generate a compatibility test item feature quantity set for representing the features of the candidate compatibility test items;
determining related candidate compatible test items of each candidate compatible test item in the compatibility test parameters based on the identification positions of the compatible test process information of the candidate compatible test items, and calculating the characteristic quantity set difference between the characteristic quantity set of the compatible test items of the candidate compatible test items and the characteristic quantity set of the compatible test items of the corresponding related candidate compatible test items as the difference degree between the candidate compatible test items and the corresponding related candidate compatible test items;
taking the associated candidate compatible test items with the difference degree smaller than a preset difference degree threshold value with the candidate compatible test items as associated candidate compatible test items related to the candidate compatible test items, and combining each candidate compatible test item and the related associated candidate compatible test items into a compatible test item group;
and determining the test results of all compatible test items in the compatible test item group as the compatible test results of the compatible test node based on the identification positions of the compatible test process information of each candidate compatible test item in the compatible test item group.
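The thresholding and grouping steps of claim 2 can be illustrated as follows. The confidence scores, the feature sets, and the use of Jaccard distance as the difference degree between compatible test item feature quantity sets are all assumptions made for illustration.

```python
from typing import Dict, List, Set


def group_compatible_test_items(confidences: Dict[str, float],
                                feature_sets: Dict[str, Set[str]],
                                conf_threshold: float = 0.8,
                                diff_threshold: float = 0.5) -> List[Set[str]]:
    """Keep features whose confidence exceeds the threshold, treat them as
    candidate compatible test items, then group items whose feature-set
    difference (here: Jaccard distance, an assumption) stays below the
    difference threshold."""
    candidates = [name for name, c in confidences.items() if c > conf_threshold]
    groups = []
    for item in candidates:
        group = {item}
        for other in candidates:
            if other == item:
                continue
            a, b = feature_sets[item], feature_sets[other]
            difference = 1 - len(a & b) / len(a | b) if (a | b) else 0.0
            if difference < diff_threshold:
                group.add(other)
        groups.append(group)
    return groups


print(group_compatible_test_items(
    {"itemA": 0.9, "itemB": 0.85, "itemC": 0.4},
    {"itemA": {"f1", "f2"}, "itemB": {"f1", "f3"}, "itemC": {"f9"}}))
```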
3. The image data processing method according to claim 2, wherein the step of processing the index feature information output by the compatibility test item index model based on the identification position of the compatibility test procedure information of each candidate compatibility test item to generate a compatible test item feature quantity set for characterizing the feature of the candidate compatibility test item includes:
extracting corresponding affine transformation data from the index feature information output by the compatible test item index model based on the identification position of the compatible test process information of each candidate compatible test item to obtain an affine transformation data set;
performing fusion operation on affine transformation data in the affine transformation data set based on a preset fusion comparison value to obtain at least one affine transformation data set;
determining at least one continuous number range based on the continuous number of affine transformation data included in the affine transformation data sets, wherein the range covered by the at least one continuous number range contains the continuous number of affine transformation data included in each affine transformation data set in the at least one affine transformation data set;
for each of the at least one continuous number range, combining affine transformation data sets including successive numbers of affine transformation data in the continuous number range into an affine transformation data set group;
extracting, for each of the at least one affine transformation data set group, a set number of affine transformation data sets from the affine transformation data set group;
for each extracted affine transformation data set in the affine transformation data sets, determining average affine transformation data of the affine transformation data included in the affine transformation data set, wherein the average affine transformation data is a vector formed by the average values of vectors at the same position in the affine transformation vectors;
determining the similarity between every two determined average affine transformation data, and determining the similarity between the determined average affine transformation data as a comparison parameter between corresponding affine transformation data sets, wherein the comparison parameter is used for representing the similarity after comparison between the affine transformation data sets;
combining affine transformation data sets corresponding to the determined contrast parameters with the contrast parameters larger than a preset set value into affine transformation data sets for comparison;
determining an affine transformation data set corresponding to the contrast parameters with the determined contrast parameters less than or equal to the set value as an affine transformation data set for comparison;
for each affine transformation data set in the at least one affine transformation data set for comparison, determining affine transformation data with which a contrast parameter for representing a degree of similarity after comparison between affine transformation data and affine transformation data sets is greater than a set value and less than a fusion contrast value as affine transformation data to be output, from affine transformation data in the affine transformation data sets that do not belong to the at least one affine transformation data set for comparison;
and selecting a set number of affine transformation data from the affine transformation data to be output and outputting a compatible test item feature quantity set for characterizing the features of the candidate compatible test items.
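Claim 3's handling of affine transformation data can be approximated with the short sketch below; the position-wise averaging follows the wording of the claim, while using cosine similarity as the comparison parameter is an assumption made purely for illustration.

```python
import math
from typing import List

Vector = List[float]


def average_affine_data(data_set: List[Vector]) -> Vector:
    """Average the affine transformation vectors position by position."""
    return [sum(values) / len(values) for values in zip(*data_set)]


def comparison_parameter(a: Vector, b: Vector) -> float:
    """Assumed similarity measure between two average affine vectors (cosine)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


set_1 = [[1.0, 0.0, 2.0], [3.0, 0.0, 4.0]]
set_2 = [[1.0, 1.0, 2.0], [1.0, 3.0, 2.0]]
avg_1, avg_2 = average_affine_data(set_1), average_affine_data(set_2)
print(avg_1, avg_2, comparison_parameter(avg_1, avg_2))
```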
4. The method according to any one of claims 1 to 3, wherein the step of generating the image encoding parameters of the corresponding image sources according to the compatibility test results respectively comprises:
constructing an encoding sample database of a corresponding image source according to a compatibility test result of each compatibility test node, wherein the encoding sample database contains image encoding parameters of image encoding nodes corresponding to each compatibility test node extracted from the compatibility test result;
and generating the image coding parameters of the image source according to the image coding parameters of the image coding nodes corresponding to the compatible test nodes and the compatibility test association relation among the compatible test nodes, wherein the compatibility test association relation is used for representing the time sequence relation among the compatible test nodes, and the image coding parameters are arranged according to the compatibility test association relation to obtain the image coding parameters of the image source.
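Claim 4 arranges the per-node image coding parameters according to the time-sequence relation between the compatible test nodes; a minimal sketch of that arrangement, with an invented ordering map, is given below.

```python
from typing import Dict, List, Tuple


def arrange_coding_parameters(node_params: Dict[str, dict],
                              test_order: Dict[str, int]) -> List[Tuple[str, dict]]:
    """Order the image coding parameters of each coding node by the time-sequence
    (association) relation of the corresponding compatibility test nodes."""
    return sorted(node_params.items(), key=lambda item: test_order[item[0]])


params = {"node_b": {"bitrate": 2}, "node_a": {"bitrate": 1}}
order = {"node_a": 0, "node_b": 1}
print(arrange_coding_parameters(params, order))
```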
5. The method according to claim 1, wherein the step of performing image encoding processing on the image data input to the image source according to the image encoding parameter comprises:
extracting the characteristics of image coding information in the image data input into the image source to obtain the coding node characteristics of the image coding information;
and acquiring an image coding information state corresponding to the coding node characteristics, and processing the coding node characteristics of the image coding information according to the image coding parameters and the image coding information state.
6. The method according to claim 5, wherein the step of processing the coding node characteristic of the image coding information according to the image coding parameter and the image coding information state comprises:
acquiring each coding node characteristic of image coding information of image data input by the image source and a state bitmap corresponding to each coding node characteristic according to the image coding information state, and acquiring each coding node characteristic in the image coding parameters and a state bitmap corresponding to each coding node characteristic;
generating a first coding bitmap of a state bitmap about coding node characteristics according to the coding node characteristics of the image coding information and the state bitmap corresponding to the coding node characteristics, and generating a second coding bitmap of the state bitmap about the coding node characteristics according to the coding node characteristics in the image coding parameters and the state bitmap corresponding to the coding node characteristics, wherein the coding node characteristics of the image coding information correspond to the coding node characteristics in the image coding parameters;
sequentially comparing the state bitmap corresponding to each coding node feature of the second coding bitmap with the state bitmap corresponding to each coding node feature of the first coding bitmap, and judging whether the state bitmap corresponding to the coding node feature of the second coding bitmap matches the state bitmap corresponding to the coding node feature of the first coding bitmap;
for each coding node feature, when the state bitmap of the second coding bitmap does not match with the corresponding first coding bitmap state bitmap, determining the difference value of each bitmap point between the state bitmap of the first coding bitmap and the state bitmap of the second coding bitmap as a coding comparison value corresponding to the coding node feature, and generating a corresponding coding comparison bitmap according to each coding node feature and the coding comparison value corresponding to each coding node feature;
and processing the coding node characteristics of the image coding information according to the value of each bitmap point in the coding comparison bitmap.
7. An image data processing apparatus, applied to a server, the server being communicatively connected to an image source, the apparatus comprising:
the device comprises a test module, a processing module and a processing module, wherein the test module is used for carrying out image compatibility test on image data to be tested based on different image sources so as to obtain compatibility test parameters aiming at each different image source, the compatibility test parameters comprise compatibility test parameters of a plurality of compatibility test nodes, and each compatibility test node corresponds to one compatibility test type;
the determining module is used for determining a compatibility test result aiming at each image source according to the compatibility test parameters and the image format type of the image data to be tested;
the coding processing module is used for respectively generating image coding parameters of the corresponding image sources according to the compatibility test results and performing image coding processing on image data input by the image sources according to the image coding parameters;
the testing module is used for carrying out image compatibility testing on image data to be tested based on different image sources in the following mode to obtain compatibility testing parameters aiming at each different image source:
respectively constructing a compatibility test model of an image source type corresponding to each image source according to a plurality of compatibility test nodes aiming at the image data to be tested of each image source to obtain a plurality of compatibility test models;
testing the image data to be tested of the image source according to each compatibility test model, taking, as a test network layer, the point convolution network layer of the compatibility test network structure in which the image source type of the image data to be tested serves as the test input part, and testing the image data to be tested at a preset network depth to obtain a test result of a first compatibility test network structure corresponding to each compatibility test model;
pushing the test result of the first compatibility test network structure backwards by a preset number of network layers along the point convolution network layer to obtain a test result of a second compatibility test network structure corresponding to each compatibility test model, and pushing the test result of the second compatibility test network structure backwards by the preset number of network layers along the depth convolution layer direction to obtain a test result of a third compatibility test network structure corresponding to each compatibility test model;
taking the depth convolution layer of the third compatibility test network structure as a test network layer, testing preset convolution kernel nodes on the test result to obtain the test result of a fourth compatibility test network structure corresponding to each compatibility test model, and setting the test result of the fourth compatibility test network structure as the test result of a target compatibility test network structure;
acquiring a position identification of each target compatibility test network structure, and setting the acquired position identification as a spatial identification position point of each compatibility test node forming the test process;
acquiring a network input depth and a network input direction based on the test input part, setting a vector direction outward from the test input part according to the network input depth and the network input direction, and acquiring a unit vector of the vector direction;
mapping the spatial identification position point of each compatible test node to the test network layer corresponding to the unit vector, acquiring, in the test network layer, the compatible test node located farthest from the test input part, then taking the network structure between the farthest compatible test node and the test input part as a first network structure, taking the part connected with the first network structure as a second network structure, and establishing a corresponding target network structure according to the first network structure and the second network structure;
determining the position of each compatible test node according to the target network structure to obtain a compatible test network layer of each compatible test node, executing a test process of the image data to be tested according to the compatible test network layer of each compatible test node, and labeling each compatible test node in the test process according to the compatible test network layer of each compatible test node;
and summarizing the output parameters of the test network layer where the marked test process is located to obtain the compatible test parameters of each compatible test node.
8. An image processing system comprising a server and an image source communicatively coupled to the server;
the image source is used for sending image data to be detected to the server;
the server is used for carrying out image compatibility test on image data to be tested based on different image sources so as to obtain compatibility test parameters aiming at each different image source, the compatibility test parameters comprise compatibility test parameters of a plurality of compatibility test nodes, and each compatibility test node corresponds to one compatibility test type;
the server is used for determining a compatibility test result aiming at each image source according to the compatibility test parameters and the image format type of the image data to be tested;
the server is used for respectively generating image coding parameters of the corresponding image sources according to the compatibility test results and performing image coding processing on image data input by the image sources according to the image coding parameters;
the server is used for carrying out image compatibility test on image data to be tested based on different image sources in the following mode to obtain compatibility test parameters aiming at each different image source:
respectively constructing a compatibility test model of an image source type corresponding to each image source according to a plurality of compatibility test nodes aiming at the image data to be tested of each image source to obtain a plurality of compatibility test models;
testing the image data to be tested of the image source according to each compatibility test model, taking, as a test network layer, the point convolution network layer of the compatibility test network structure in which the image source type of the image data to be tested serves as the test input part, and testing the image data to be tested at a preset network depth to obtain a test result of a first compatibility test network structure corresponding to each compatibility test model;
pushing the test result of the first compatibility test network structure backwards by a preset number of network layers along the point convolution network layer to obtain a test result of a second compatibility test network structure corresponding to each compatibility test model, and pushing the test result of the second compatibility test network structure backwards by the preset number of network layers along the depth convolution layer direction to obtain a test result of a third compatibility test network structure corresponding to each compatibility test model;
taking the depth convolution layer of the third compatibility test network structure as a test network layer, testing preset convolution kernel nodes on the test result to obtain the test result of a fourth compatibility test network structure corresponding to each compatibility test model, and setting the test result of the fourth compatibility test network structure as the test result of a target compatibility test network structure;
acquiring a position identification of each target compatibility test network structure, and setting the acquired position identification as a spatial identification position point of each compatibility test node forming the test process;
acquiring a network input depth and a network input direction based on the test input part, setting a vector direction outward from the test input part according to the network input depth and the network input direction, and acquiring a unit vector of the vector direction;
mapping the spatial identification position point of each compatible test node to the test network layer corresponding to the unit vector, acquiring, in the test network layer, the compatible test node located farthest from the test input part, then taking the network structure between the farthest compatible test node and the test input part as a first network structure, taking the part connected with the first network structure as a second network structure, and establishing a corresponding target network structure according to the first network structure and the second network structure;
determining the position of each compatible test node according to the target network structure to obtain a compatible test network layer of each compatible test node, executing a test process of the image data to be tested according to the compatible test network layer of each compatible test node, and labeling each compatible test node in the test process according to the compatible test network layer of each compatible test node;
and summarizing the output parameters of the test network layer where the marked test process is located to obtain the compatible test parameters of each compatible test node.
9. A server, comprising a processor, a machine-readable storage medium, and a network interface, the machine-readable storage medium, the network interface, and the processor being connected via a bus system, the network interface being configured to communicatively connect with at least one image source, the machine-readable storage medium being configured to store a program, instructions, or code, and the processor being configured to execute the program, instructions, or code in the machine-readable storage medium to perform the image data processing method of any one of claims 1-6.
CN202010085500.5A 2020-02-08 2020-02-08 Image data processing method, image data processing device, image processing system and server Active CN111340683B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202011009167.6A CN112215743A (en) 2020-02-08 2020-02-08 Image data processing method, device and image processing system
CN202011011436.2A CN112215744A (en) 2020-02-08 2020-02-08 Image data processing method, device and system based on cloud platform
CN202010085500.5A CN111340683B (en) 2020-02-08 2020-02-08 Image data processing method, image data processing device, image processing system and server

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010085500.5A CN111340683B (en) 2020-02-08 2020-02-08 Image data processing method, image data processing device, image processing system and server

Related Child Applications (2)

Application Number Title Priority Date Filing Date
CN202011009167.6A Division CN112215743A (en) 2020-02-08 2020-02-08 Image data processing method, device and image processing system
CN202011011436.2A Division CN112215744A (en) 2020-02-08 2020-02-08 Image data processing method, device and system based on cloud platform

Publications (2)

Publication Number Publication Date
CN111340683A CN111340683A (en) 2020-06-26
CN111340683B true CN111340683B (en) 2020-11-13

Family

ID=71186807

Family Applications (3)

Application Number Title Priority Date Filing Date
CN202011011436.2A Withdrawn CN112215744A (en) 2020-02-08 2020-02-08 Image data processing method, device and system based on cloud platform
CN202010085500.5A Active CN111340683B (en) 2020-02-08 2020-02-08 Image data processing method, image data processing device, image processing system and server
CN202011009167.6A Withdrawn CN112215743A (en) 2020-02-08 2020-02-08 Image data processing method, device and image processing system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202011011436.2A Withdrawn CN112215744A (en) 2020-02-08 2020-02-08 Image data processing method, device and system based on cloud platform

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202011009167.6A Withdrawn CN112215743A (en) 2020-02-08 2020-02-08 Image data processing method, device and image processing system

Country Status (1)

Country Link
CN (3) CN112215744A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101668255A (en) * 2008-09-02 2010-03-10 杨明极 Self-adaptive method for mobile terminal to display or download pictures through network
CN102340698A (en) * 2011-10-12 2012-02-01 福建新大陆通信科技股份有限公司 Scalable vector graphics (SVG)-based set-top box interface representation method
CN103366341A (en) * 2013-04-16 2013-10-23 广东欧珀移动通信有限公司 Method and device for adaptive-resolution display of image
CN103516748A (en) * 2012-06-25 2014-01-15 北京智慧风云科技有限公司 Method and system for self-adaption file downloading processing of cloud server and cloud server
CN103761315A (en) * 2014-01-27 2014-04-30 乐视网信息技术(北京)股份有限公司 Method and system for displaying webpage content
CN104182220A (en) * 2014-08-12 2014-12-03 浪潮软件股份有限公司 Image acquisition method based on domestic CPU and domestic OS
CN105122795A (en) * 2013-02-19 2015-12-02 特克特朗尼克公司 Image appearance framework and applications for digital image creation and display
CN105519130A (en) * 2013-07-19 2016-04-20 索尼公司 Information processing device and method
CN105956062A (en) * 2016-04-27 2016-09-21 广东欧珀移动通信有限公司 Picture acquisition method, system and equipment thereof

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10334254B2 (en) * 2016-09-23 2019-06-25 Apple Inc. Feed-forward and feed-back metadata exchange in image processing pipelines to improve image quality
CN107066609A (en) * 2017-04-27 2017-08-18 北京京东尚科信息技术有限公司 Image display method and device

Also Published As

Publication number Publication date
CN112215743A (en) 2021-01-12
CN111340683A (en) 2020-06-26
CN112215744A (en) 2021-01-12

Similar Documents

Publication Publication Date Title
CN111476498B (en) New energy automobile charging management method and device and new energy charging management system
CN111245912B (en) Intelligent building information monitoring method and device, server and intelligent building system
CN111339129B (en) Remote meter reading abnormity monitoring method and device, gas meter system and cloud server
CN111352670B (en) Virtual reality scene loading method and device, virtual reality system and equipment
CN111277469B (en) Network diagnosis processing method and device, network system and server
CN110826606B (en) Element matching method, device, server and readable storage medium
CN111209336B (en) Data distribution method and device based on block chain and server
CN115205736A (en) Video data identification method and device, electronic equipment and storage medium
CN111258968B (en) Enterprise redundant data cleaning method and device and big data platform
CN111367528B (en) Compiling method and device of software development kit, software development system and server
CN111338955B (en) Software graphical interface testing method and device, software development system and server
CN110968273B (en) Laser color printing method and system based on Internet of things
CN112035490B (en) Electric vehicle information monitoring method, device and system based on cloud platform
CN111340683B (en) Image data processing method, image data processing device, image processing system and server
CN112491985A (en) Remote meter reading data processing method, gas meter system and gas cloud platform
CN112181563A (en) Browser view loading method, device and system based on cloud platform and server
CN111107162B (en) Indoor positioning data processing method, device and system based on Internet of things
CN111209943B (en) Data fusion method and device and server
CN111539029B (en) Industrial internet-based big data storage rate optimization method and cloud computing center
CN111209509B (en) Information display method and device based on big data platform and big data platform
CN112055076A (en) Multifunctional intelligent monitoring method and device based on Internet and server
CN111526333A (en) Security monitoring method and device, server and readable storage medium
CN111783012A (en) Internet product monitoring method, device, server and readable storage medium
CN112055075A (en) Internet product monitoring method, device, server and readable storage medium
CN111988187A (en) Internet connection method and device of central management server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No.1389 Changyuan North Road, high tech Zone, Wuhua District, Kunming City, Yunnan Province

Applicant after: Zhu Minghua

Address before: Room 306, building A1, Qilu Software Park building, 1768 Xinluo street, high tech Zone, Jinan City, Shandong Province 250101

Applicant before: Zhu Minghua

CB02 Change of applicant information
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201112

Address after: 361006 No. 0137, unit 109, No. 62, Chengyi North Street, phase III, Xiamen City, Fujian Province

Patentee after: Xiamen Quantong Network Technology Co., Ltd

Address before: No.1389 Changyuan North Road, high tech Zone, Wuhua District, Kunming City, Yunnan Province

Patentee before: Zhu Minghua

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20201130

Address after: No.1389 Changyuan North Road, high tech Zone, Wuhua District, Kunming City, Yunnan Province

Patentee after: Zhu Minghua

Address before: 361006 No. 0137, unit 109, No. 62, Chengyi North Street, phase III, Xiamen City, Fujian Province

Patentee before: Xiamen Quantong Network Technology Co., Ltd

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20210129

Address after: 518000 Room 201, building h, 33 Ping'an Avenue, Xinnan community, Pinghu street, Longgang District, Shenzhen City, Guangdong Province

Patentee after: SHENZHEN KENID JUMAO MEDICAL EQUIPMENT Co.,Ltd.

Address before: No. 1389, Changyuan North Road, high tech Zone, Wuhua District, Kunming, Yunnan 650106

Patentee before: Zhu Minghua

TR01 Transfer of patent right