CN116758165B - Image calibration method and device based on array camera - Google Patents

Image calibration method and device based on array camera

Info

Publication number
CN116758165B
CN116758165B (application CN202310731532.1A)
Authority
CN
China
Prior art keywords
image data
calibration
calibrated
parameters
characteristic value
Prior art date
Legal status
Active
Application number
CN202310731532.1A
Other languages
Chinese (zh)
Other versions
CN116758165A (en)
Inventor
袁潮
邓迪旻
温建伟
武海兵
肖占中
Current Assignee
Beijing Zhuohe Technology Co Ltd
Original Assignee
Beijing Zhuohe Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zhuohe Technology Co Ltd filed Critical Beijing Zhuohe Technology Co Ltd
Priority to CN202310731532.1A
Publication of CN116758165A
Application granted
Publication of CN116758165B
Legal status: Active
Anticipated expiration


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V 10/757 - Matching configurations of points or features
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 - Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/80 - Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses an image calibration method and device based on an array camera. The method comprises the following steps: acquiring original image data from an array camera device; classifying the original image data according to scene parameters to obtain image data to be calibrated; extracting characteristic values from the image data to be calibrated and inputting them into a characteristic matching matrix to obtain a matching result; and calibrating the image data to be calibrated according to the matching result to obtain calibrated image data. The invention addresses the technical problem in the prior art that, because collected image data is calibrated and output directly from environmental parameters alone, the calibration parameters cannot be analyzed from existing parameters, so the camera calibration effect is poor and strong technical support cannot be provided.

Description

Image calibration method and device based on array camera
Technical Field
The invention relates to the field of image processing, in particular to an image calibration method and device based on an array camera.
Background
With the continuous development of intelligent science and technology, intelligent devices are increasingly used in people's daily life, work, and study; intelligent technological means improve people's quality of life and increase their learning and working efficiency.
Currently, image tracking with array cameras is commonly used in airport monitoring and coastal defense projects, which give rise to three types of application requirements. 1. When a ground or sea-surface target is detected in the panoramic picture of the array camera, its actual geographic position must be known; that is, the GPS coordinates corresponding to a given pixel on the panoramic picture must be obtained. 2. Conversely, after a third-party system (ADS-B for aircraft, AIS for ships) provides the geographic position of a target, the target must be marked on the panoramic picture; that is, the GPS coordinates must be converted into pixel coordinates on the panoramic picture. 3. Combining the two applications above, if both the panoramic picture and the third-party system detect the same target, the two detections can be associated through the GPS position information, so that additional information about the target, such as an aircraft's flight number (from ADS-B) or a ship's number (from AIS), can be displayed on the panoramic picture, realizing a fused display of the image and real-world information. These functions have been implemented, but achieving a sufficiently accurate position remains difficult, and an array camera differs from a single camera in ways that require special handling. In the prior art, however, the array camera is calibrated only by directly calibrating and outputting the collected image data from environmental parameters; the calibration parameters cannot be analyzed from existing parameters, so the calibration effect is poor and strong technical support cannot be provided.
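The patent does not specify how the pixel-to-GPS and GPS-to-pixel conversions described above are computed. As a minimal illustrative sketch only (not the patented method), the bidirectional mapping for a roughly planar scene can be modelled with a homography fitted from a few surveyed pixel/GPS correspondences; all point values below are hypothetical:

```python
import numpy as np

def fit_homography(px_pts, gps_pts):
    """Fit the 3x3 homography H mapping pixel coords to (lon, lat) by the
    direct linear transform (DLT) over >= 4 point correspondences."""
    rows = []
    for (x, y), (u, v) in zip(px_pts, gps_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)          # null-space vector = least-squares solution
    return H / H[2, 2]

def apply_h(H, pt):
    """Apply a homography to a 2-D point, normalising the homogeneous scale."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Hypothetical survey: four panorama corner pixels with known GPS fixes.
px_pts  = [(0, 0), (4000, 0), (4000, 2000), (0, 2000)]
gps_pts = [(116.20, 39.60), (116.30, 39.60), (116.30, 39.55), (116.20, 39.55)]

H = fit_homography(px_pts, gps_pts)
lon, lat = apply_h(H, (2000, 1000))                 # requirement 1: pixel -> GPS
px_back  = apply_h(np.linalg.inv(H), (lon, lat))    # requirement 2: GPS -> pixel
```

Requirement 3 (association with a third-party system) then reduces to comparing the GPS fix derived from a pixel detection with the fix reported by ADS-B or AIS.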
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The embodiment of the invention provides an image calibration method and device based on an array camera, which at least solve the technical problem in the prior art that, because collected image data is calibrated and output directly from environmental parameters alone, the calibration parameters cannot be analyzed from existing parameters, so the camera calibration effect is poor and strong technical support cannot be provided.
According to an aspect of the embodiment of the present invention, there is provided an image calibration method based on an array camera, including: acquiring original image data based on an array camera device; classifying the original image data according to scene parameters to obtain image data to be calibrated; extracting the characteristic value of the image data to be calibrated, and inputting the characteristic value into a characteristic matching matrix to obtain a matching result; and calibrating the image data to be calibrated according to the matching result to obtain calibration image data.
Optionally, before the classifying the original image data according to the scene parameters to obtain the image data to be calibrated, the method further includes: acquiring environmental parameters and application demand parameters according to the array camera equipment, wherein the application demand parameters are used for representing functional purposes and classifying expectations of image calibration; and generating the scene parameters according to the environment parameters and the application demand parameters.
Optionally, extracting the characteristic value of the image data to be calibrated and inputting the characteristic value into a characteristic matching matrix to obtain a matching result includes: acquiring a characteristic value extraction model, wherein the characteristic value extraction model is trained on a plurality of pieces of image calibration historical data; extracting features from the image data to be calibrated according to the characteristic value extraction model to obtain the characteristic value; and inputting the characteristic value into the characteristic matching matrix to generate the matching result, wherein the characteristic matching matrix comprises:
wherein B1-B3 are target calibration data, T1-T3 are characteristic values, X1-X3 are characteristic coordinate values, and R is a matching constant, wherein R is a positive integer greater than 1.
Optionally, the calibrating the image data to be calibrated according to the matching result includes: fusing the matching result with the image data set to be calibrated to obtain a calibration result; and performing splicing processing on the calibration result and the original image data to obtain the calibration image data.
According to another aspect of the embodiment of the present invention, there is also provided an image calibration device based on an array camera, including: an acquisition module for acquiring original image data based on the array image pickup device; the classifying module is used for classifying the original image data according to scene parameters to obtain image data to be calibrated; the extraction module is used for extracting the characteristic value of the image data to be calibrated, and inputting the characteristic value into the characteristic matching matrix to obtain a matching result; and the calibration module is used for calibrating the image data to be calibrated according to the matching result to obtain calibration image data.
Optionally, the apparatus further includes: the acquisition module is also used for acquiring environmental parameters and application demand parameters according to the array camera equipment, wherein the application demand parameters are used for representing the functional purpose and classifying expectations of image calibration; and the generating module is used for generating the scene parameters according to the environment parameters and the application demand parameters.
Optionally, the extracting module includes: the acquisition unit is used for acquiring a characteristic value extraction model, wherein the characteristic value extraction model is trained by a plurality of image calibration historical data; the extraction unit is used for extracting the image data to be calibrated according to the characteristic value extraction model to obtain the characteristic value; the input unit is configured to input the feature value to the feature matching matrix, and generate the matching result, where the feature matching matrix includes:
wherein B1-B3 are target calibration data, T1-T3 are characteristic values, X1-X3 are characteristic coordinate values, and R is a matching constant, wherein R is a positive integer greater than 1.
Optionally, the calibration module includes: a fusion unit, used for fusing the matching result with the image data to be calibrated to obtain a calibration result; and a splicing unit, used for splicing the calibration result with the original image data to obtain the calibration image data.
According to another aspect of the embodiment of the present invention, there is further provided a nonvolatile storage medium, where the nonvolatile storage medium includes a stored program, and when the program runs, the device where the nonvolatile storage medium is controlled to execute an image calibration method based on an array camera.
According to another aspect of the embodiment of the present invention, there is also provided an electronic device including a processor and a memory; the memory stores computer readable instructions, and the processor is configured to execute the computer readable instructions, where the computer readable instructions execute an image calibration method based on an array camera when executed.
In the embodiment of the invention, the original image data based on the array camera equipment is acquired; classifying the original image data according to scene parameters to obtain image data to be calibrated; extracting the characteristic value of the image data to be calibrated, and inputting the characteristic value into a characteristic matching matrix to obtain a matching result; the method for calibrating the image data to be calibrated according to the matching result to obtain the calibrated image data solves the technical problems that in the prior art, the calibration effect of the camera of the calibration parameters is poor and powerful technical support cannot be provided due to the fact that the calibration parameters cannot be analyzed according to the existing parameters only by directly calibrating and outputting the collected image data through the environmental parameters.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
FIG. 1 is a flow chart of an image calibration method based on an array camera according to an embodiment of the invention;
FIG. 2 is a block diagram of an image calibration device based on an array camera according to an embodiment of the present invention;
FIG. 3 is a block diagram of a terminal device for performing the method according to an embodiment of the invention;
FIG. 4 shows a memory unit for holding or carrying program code that implements a method according to an embodiment of the invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In accordance with an embodiment of the present invention, a method embodiment of an image calibration method based on an array camera is provided. It should be noted that the steps illustrated in the flowchart of the figures may be performed in a computer system, such as by a set of computer-executable instructions, and that, although a logical order is shown in the flowchart, in some cases the steps may be performed in an order different from that illustrated or described herein.
Example 1
Fig. 1 is a flowchart of an image calibration method based on an array camera according to an embodiment of the present invention, as shown in fig. 1, the method includes the following steps:
step S102, raw image data based on the array image capturing apparatus is acquired.
Specifically, in order to solve the technical problem in the prior art that collected image data is calibrated and output directly from environmental parameters alone, so that calibration parameters cannot be analyzed from existing parameters, the camera calibration effect is poor, and strong technical support cannot be provided, original image data must be collected from the array camera deployed in the application scene. This original image data contains no marking traces and is used for subsequent image analysis and calibration.
And step S104, classifying the original image data according to scene parameters to obtain the image data to be calibrated.
Optionally, before the classifying the original image data according to the scene parameters to obtain the image data to be calibrated, the method further includes: acquiring environmental parameters and application demand parameters according to the array camera equipment, wherein the application demand parameters are used for representing functional purposes and classifying expectations of image calibration; and generating the scene parameters according to the environment parameters and the application demand parameters.
Specifically, after the original image data is obtained, the method classifies it according to its collection conditions to obtain the image data to be calibrated. The classification may follow preset application rules configured by a user, and the classification results are gathered into classification data sets, which reduces the probability of erroneous calibration during subsequent image calibration and increases calibration efficiency. In addition, before the original image data is classified according to the scene parameters to obtain the image data to be calibrated, the method further includes: acquiring environmental parameters and application demand parameters from the array camera equipment, wherein the application demand parameters represent the functional purpose and classification expectations of the image calibration; and generating the scene parameters from the environmental parameters and the application demand parameters.
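The patent does not define concrete scene parameters, so the following is an illustrative sketch only: generating scene parameters from environmental and application demand parameters, then classifying raw frames against them. The `lighting` and `purpose` fields are assumptions introduced for the example:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SceneParams:
    """Hypothetical scene descriptor combining environment and demand."""
    lighting: str   # environmental parameter, e.g. "day" / "night"
    purpose: str    # application demand parameter, e.g. "airport" / "coastal"

def make_scene_params(env, demand):
    """Generate scene parameters from environmental and demand parameters."""
    return SceneParams(lighting=env["lighting"], purpose=demand["purpose"])

def classify(frames, scene):
    """Keep only raw frames whose capture conditions match the scene
    parameters; these become the image data to be calibrated."""
    return [f for f in frames if f["lighting"] == scene.lighting]

frames = [
    {"id": 1, "lighting": "day"},
    {"id": 2, "lighting": "night"},
    {"id": 3, "lighting": "day"},
]
scene = make_scene_params({"lighting": "day"}, {"purpose": "airport"})
to_calibrate = classify(frames, scene)   # frames 1 and 3 match the scene
```

Restricting calibration to frames that match the scene parameters is what reduces the chance of erroneous calibration downstream.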
And S106, extracting the characteristic value of the image data to be calibrated, and inputting the characteristic value into a characteristic matching matrix to obtain a matching result.
Optionally, extracting the characteristic value of the image data to be calibrated and inputting the characteristic value into a characteristic matching matrix to obtain a matching result includes: acquiring a characteristic value extraction model, wherein the characteristic value extraction model is trained on a plurality of pieces of image calibration historical data; extracting features from the image data to be calibrated according to the characteristic value extraction model to obtain the characteristic value; and inputting the characteristic value into the characteristic matching matrix to generate the matching result, wherein the characteristic matching matrix comprises:
wherein B1-B3 are target calibration data, T1-T3 are characteristic values, X1-X3 are characteristic coordinate values, and R is a matching constant, wherein R is a positive integer greater than 1.
Specifically, the embodiment of the invention can use the matching matrix to match the characteristic values to be calibrated, obtain the related parameter data corresponding to the image data to be calibrated, and calibrate the parameters to be calibrated in subsequent processing. For example, GPS information can be obtained by matrix matching, and the related GPS information can then be calibrated into the classified original image data, achieving the technical effect of calibrating the final image result.
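The characteristic matching matrix itself is not reproduced in this text, so the following is only a generic illustration of the step described above: matching an extracted characteristic value against stored characteristic values T1..T3 to retrieve the corresponding target calibration data B1..B3 (here, hypothetical GPS fixes), assuming a simple nearest-neighbour rule:

```python
import numpy as np

# Hypothetical matching table: stored characteristic values T1..T3 paired with
# target calibration data B1..B3 (lon, lat). The patent's actual matrix and
# matching constant R are not reproduced here.
T = np.array([0.12, 0.55, 0.91])
B = np.array([[116.21, 39.61],
              [116.25, 39.58],
              [116.29, 39.55]])

def match(feature_value):
    """Return the target calibration data whose stored characteristic value
    is closest to the extracted one (nearest-neighbour assumption)."""
    i = int(np.argmin(np.abs(T - feature_value)))
    return B[i]

gps = match(0.50)   # closest stored value is T2 = 0.55, so B2 is returned
```

In the patent's pipeline, the returned calibration data would then be written into the classified image data during the calibration step.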
And step S108, calibrating the image data to be calibrated according to the matching result to obtain calibration image data.
Optionally, the calibrating the image data to be calibrated according to the matching result includes: fusing the matching result with the image data set to be calibrated to obtain a calibration result; and performing splicing processing on the calibration result and the original image data to obtain the calibration image data.
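The fusion and splicing operations are not detailed in the patent. Below is a minimal sketch under the assumption that fusion attaches the matching result (e.g. a GPS fix) to each image record to be calibrated, and splicing merges the calibrated records back into the original raw-data sequence by frame id:

```python
def fuse(to_calibrate, matches):
    """Fuse matching results (e.g. GPS fixes) with the images to be
    calibrated, producing the calibration result."""
    return [{**img, "gps": gps} for img, gps in zip(to_calibrate, matches)]

def stitch(raw, calibrated):
    """Splice the calibration result back over the original image data,
    keyed by frame id; unmatched frames pass through unchanged."""
    by_id = {c["id"]: c for c in calibrated}
    return [by_id.get(r["id"], r) for r in raw]

raw = [{"id": 1}, {"id": 2}, {"id": 3}]
cal = fuse([{"id": 1}, {"id": 3}], [(116.25, 39.58), (116.29, 39.55)])
out = stitch(raw, cal)   # frames 1 and 3 now carry GPS, frame 2 is unchanged
```

The output preserves the original frame order, which matters if the calibrated sequence is fed back into the panoramic display.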
Through the embodiment, the technical problems that in the prior art, the calibration effect of the camera for the calibration parameters is poor and powerful technical support cannot be provided because the calibration parameters cannot be analyzed according to the existing parameters only by directly calibrating and outputting the collected image data by the environmental parameters are solved.
Example two
Fig. 2 is a block diagram of an image calibration apparatus based on an array camera according to an embodiment of the present invention, as shown in fig. 2, the apparatus includes:
an acquisition module 20 for acquiring raw image data based on the array image capturing apparatus.
Specifically, in order to solve the technical problem in the prior art that collected image data is calibrated and output directly from environmental parameters alone, so that calibration parameters cannot be analyzed from existing parameters, the camera calibration effect is poor, and strong technical support cannot be provided, original image data must be collected from the array camera deployed in the application scene. This original image data contains no marking traces and is used for subsequent image analysis and calibration.
The classifying module 22 is configured to classify the raw image data according to the scene parameters, so as to obtain image data to be calibrated.
Optionally, the apparatus further includes: the acquisition module is also used for acquiring environmental parameters and application demand parameters according to the array camera equipment, wherein the application demand parameters are used for representing the functional purpose and classifying expectations of image calibration; and the generating module is used for generating the scene parameters according to the environment parameters and the application demand parameters.
Specifically, after the original image data is obtained, the device classifies it according to its collection conditions to obtain the image data to be calibrated. The classification may follow preset application rules configured by a user, and the classification results are gathered into classification data sets, which reduces the probability of erroneous calibration during subsequent image calibration and increases calibration efficiency. In addition, before the original image data is classified according to the scene parameters to obtain the image data to be calibrated, the device further performs: acquiring environmental parameters and application demand parameters from the array camera equipment, wherein the application demand parameters represent the functional purpose and classification expectations of the image calibration; and generating the scene parameters from the environmental parameters and the application demand parameters.
And the extracting module 24 is used for extracting the characteristic value of the image data to be calibrated, and inputting the characteristic value into the characteristic matching matrix to obtain a matching result.
Optionally, the extracting module includes: the acquisition unit is used for acquiring a characteristic value extraction model, wherein the characteristic value extraction model is trained by a plurality of image calibration historical data; the extraction unit is used for extracting the image data to be calibrated according to the characteristic value extraction model to obtain the characteristic value; the input unit is configured to input the feature value to the feature matching matrix, and generate the matching result, where the feature matching matrix includes:
wherein B1-B3 are target calibration data, T1-T3 are characteristic values, X1-X3 are characteristic coordinate values, and R is a matching constant, wherein R is a positive integer greater than 1.
Specifically, the embodiment of the invention can use the matching matrix to match the characteristic values to be calibrated, obtain the related parameter data corresponding to the image data to be calibrated, and calibrate the parameters to be calibrated in subsequent processing. For example, GPS information can be obtained by matrix matching, and the related GPS information can then be calibrated into the classified original image data, achieving the technical effect of calibrating the final image result.
And the calibration module 26 is used for calibrating the image data to be calibrated according to the matching result to obtain calibration image data.
Optionally, the calibration module includes: a fusion unit, used for fusing the matching result with the image data to be calibrated to obtain a calibration result; and a splicing unit, used for splicing the calibration result with the original image data to obtain the calibration image data.
According to another aspect of the embodiment of the present invention, there is further provided a nonvolatile storage medium, where the nonvolatile storage medium includes a stored program, and when the program runs, the device where the nonvolatile storage medium is controlled to execute an image calibration method based on an array camera.
Specifically, the method comprises the following steps: acquiring original image data based on an array camera device; classifying the original image data according to scene parameters to obtain image data to be calibrated; extracting the characteristic value of the image data to be calibrated, and inputting the characteristic value into a characteristic matching matrix to obtain a matching result; and calibrating the image data to be calibrated according to the matching result to obtain calibration image data. Optionally, before the classifying the original image data according to the scene parameters to obtain the image data to be calibrated, the method further includes: acquiring environmental parameters and application demand parameters according to the array camera equipment, wherein the application demand parameters are used for representing functional purposes and classifying expectations of image calibration; and generating the scene parameters according to the environment parameters and the application demand parameters. Optionally, extracting the feature value of the image data to be calibrated, and inputting the feature value into a feature matching matrix, where obtaining a matching result includes: acquiring a characteristic value extraction model, wherein the characteristic value extraction model is trained by a plurality of image calibration historical data; extracting the image data to be calibrated according to the characteristic value extraction model to obtain the characteristic value; inputting the characteristic value to the characteristic matching matrix to generate the matching result, wherein the characteristic matching matrix comprises:
wherein B1-B3 are target calibration data, T1-T3 are characteristic values, X1-X3 are characteristic coordinate values, and R is a matching constant, wherein R is a positive integer greater than 1. Optionally, the calibrating the image data to be calibrated according to the matching result includes: fusing the matching result with the image data set to be calibrated to obtain a calibration result; and performing splicing processing on the calibration result and the original image data to obtain the calibration image data.
According to another aspect of the embodiment of the present invention, there is also provided an electronic device including a processor and a memory; the memory stores computer readable instructions, and the processor is configured to execute the computer readable instructions, where the computer readable instructions execute an image calibration method based on an array camera when executed.
Specifically, the method comprises the following steps: acquiring original image data based on an array camera device; classifying the original image data according to scene parameters to obtain image data to be calibrated; extracting a characteristic value of the image data to be calibrated and inputting the characteristic value into a feature matching matrix to obtain a matching result; and calibrating the image data to be calibrated according to the matching result to obtain calibration image data.

Optionally, before classifying the original image data according to the scene parameters to obtain the image data to be calibrated, the method further includes: acquiring environmental parameters and application demand parameters from the array camera device, where the application demand parameters characterize the functional purpose and classification expectations of the image calibration; and generating the scene parameters from the environmental parameters and the application demand parameters.

Optionally, extracting the characteristic value of the image data to be calibrated and inputting it into the feature matching matrix to obtain the matching result includes: acquiring a characteristic value extraction model, where the characteristic value extraction model is trained on a plurality of image calibration historical data; extracting the characteristic value from the image data to be calibrated with the extraction model; and inputting the characteristic value into the feature matching matrix to generate the matching result, where the feature matching matrix comprises:

where B1-B3 are target calibration data, T1-T3 are characteristic values, X1-X3 are characteristic coordinate values, and R is a matching constant, R being a positive integer greater than 1.

Optionally, calibrating the image data to be calibrated according to the matching result includes: fusing the matching result with the image data set to be calibrated to obtain a calibration result; and splicing the calibration result with the original image data to obtain the calibration image data.
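The four-step pipeline described above can be sketched in code. This is an illustrative sketch only: the function names, the stand-in feature extractor (per-channel mean intensity), and the placeholder matching matrix are all invented for the example, since the patent does not disclose the concrete extraction model or matrix values.

```python
import numpy as np

def classify_by_scene(raw_images, scene_params):
    """Group raw frames by scene parameter so each group is calibrated consistently."""
    groups = {}
    for img, scene in zip(raw_images, scene_params):
        groups.setdefault(scene, []).append(img)
    return groups

def extract_features(image):
    """Stand-in feature-value extractor: mean intensity per channel."""
    return image.reshape(-1, image.shape[-1]).mean(axis=0)

def match_features(features, matching_matrix):
    """Project feature values through the matching matrix to obtain target calibration data."""
    return matching_matrix @ features

# Toy data: three 4x4 RGB frames from the array camera, tagged with scene ids.
raw = [np.full((4, 4, 3), v, dtype=float) for v in (10.0, 20.0, 30.0)]
scenes = ["day", "day", "night"]

groups = classify_by_scene(raw, scenes)              # step 2: classify by scene parameters
feats = [extract_features(img) for img in groups["day"]]  # step 3a: extract feature values
match = match_features(feats[0], np.eye(3) * 2.0)    # step 3b: placeholder matching matrix
```

A real implementation would replace `extract_features` with the trained characteristic value extraction model and use the patent's B/T/X/R matrix rather than the identity placeholder.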
The foregoing embodiment numbers of the present invention are merely for description and do not imply any ranking of the embodiments.
In the foregoing embodiments of the present invention, each embodiment emphasizes different aspects; for details not described in a given embodiment, refer to the related descriptions of the other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be implemented in other manners. The apparatus embodiments described above are merely exemplary. For example, the division of units may be a division of logical functions, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings, direct couplings, or communication connections shown or discussed may be implemented through certain interfaces, units, or modules, and may be electrical or take other forms.
Units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed across multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, Fig. 3 is a schematic diagram of the hardware structure of a terminal device according to an embodiment of the present application. As shown in Fig. 3, the terminal device may include an input device 30, a processor 31, an output device 32, a memory 33, and at least one communication bus 34. The communication bus 34 enables communication connections between these elements. The memory 33 may include high-speed RAM, and may further include non-volatile memory (NVM), such as at least one magnetic disk, in which various programs may be stored to perform the processing functions and method steps of the present embodiment.
Optionally, the processor 31 may be implemented as, for example, a central processing unit (CPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a controller, a microcontroller, a microprocessor, or another electronic component. The processor 31 is coupled to the input device 30 and the output device 32 through wired or wireless connections.
Optionally, the input device 30 may include at least one of a user-oriented user interface, a device-oriented device interface, a programmable software interface, a camera, and a sensor. The device-oriented device interface may be a wired interface for data transmission between devices, or a hardware plug-in interface (such as a USB interface or a serial port) for data transmission between devices. The user-oriented user interface may be, for example, control keys, a voice input device for receiving voice input, or a touch-sensing device for receiving touch input (such as a touch screen or touch pad with touch-sensing functionality). The programmable software interface may be, for example, an entry for a user to edit or modify a program, such as an input pin interface or input interface of a chip. If a transceiver is included, it may be a radio-frequency transceiver chip, a baseband processing chip, or a transceiver antenna with a communication function; an audio input device such as a microphone may receive voice data. The output device 32 may include a display, a speaker, and the like.
In this embodiment, the processor of the terminal device may include functions for executing the modules of the data processing apparatus in each device; for specific functions and technical effects, refer to the above embodiments, which are not repeated here.
Fig. 4 is a schematic diagram of the hardware structure of a terminal device according to another embodiment of the present application; Fig. 4 is a specific implementation of the embodiment of Fig. 3. As shown in Fig. 4, the terminal device of this embodiment includes a processor 41 and a memory 42.
The processor 41 executes the computer program code stored in the memory 42 to implement the methods of the above-described embodiments.
The memory 42 is configured to store various types of data to support operation of the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, as well as messages, pictures, and video. The memory 42 may include random access memory (RAM), and may also include non-volatile memory, such as at least one magnetic disk.
Optionally, the processor 41 is provided in a processing component 40. The terminal device may further include a communication component 43, a power supply component 44, a multimedia component 45, an audio component 46, an input/output interface 47, and/or a sensor component 48. The components actually included in the terminal device are set according to requirements, which this embodiment does not limit.
The processing component 40 generally controls the overall operation of the terminal device. The processing component 40 may include one or more processors 41 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 40 may include one or more modules that facilitate interactions between the processing component 40 and other components. For example, processing component 40 may include a multimedia module to facilitate interaction between multimedia component 45 and processing component 40.
The power supply component 44 supplies power to the various components of the terminal device. The power supply component 44 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the terminal device.
The multimedia component 45 includes a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a liquid crystal display (LCD) and a touch panel (TP). If the display screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the panel. The touch sensors may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with it.
The audio component 46 is configured to output and/or input audio signals. For example, the audio component 46 includes a microphone (MIC) configured to receive external audio signals when the terminal device is in an operational mode, such as a speech recognition mode. The received audio signals may be further stored in the memory 42 or transmitted via the communication component 43. In some embodiments, the audio component 46 further includes a speaker for outputting audio signals.
The input/output interface 47 provides an interface between the processing component 40 and peripheral interface modules, which may be click wheels, buttons, and the like. These buttons may include, but are not limited to, a volume button, a start button, and a lock button.
The sensor component 48 includes one or more sensors for providing status assessments of various aspects of the terminal device. For example, the sensor component 48 may detect the open/closed state of the terminal device, the relative positioning of components, and the presence or absence of user contact with the terminal device. The sensor component 48 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor component 48 may also include a camera or the like.
The communication component 43 is configured to facilitate wired or wireless communication between the terminal device and other devices. The terminal device may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot for inserting a SIM card, so that the terminal device may log into a GPRS network and establish communication with a server through the Internet.
It can be seen from the above that the communication component 43, the audio component 46, the input/output interface 47, and the sensor component 48 in the embodiment of Fig. 4 may be implemented as the input device in the embodiment of Fig. 3.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, or in whole or in part, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is merely a preferred embodiment of the present invention. It should be noted that those skilled in the art may make several improvements and modifications without departing from the principles of the present invention, and such improvements and modifications shall also fall within the protection scope of the present invention.

Claims (4)

1. An image calibration method based on an array camera is characterized by comprising the following steps:
acquiring original image data based on an array camera device;
classifying the original image data according to scene parameters to obtain image data to be calibrated;
extracting the characteristic value of the image data to be calibrated, and inputting the characteristic value into a characteristic matching matrix to obtain a matching result;
calibrating the image data to be calibrated according to the matching result to obtain calibration image data;
before classifying the original image data according to the scene parameters to obtain the image data to be calibrated, the method further comprises the following steps:
acquiring environmental parameters and application demand parameters according to the array camera equipment, wherein the application demand parameters are used for representing functional purposes and classifying expectations of image calibration;
generating the scene parameters according to the environment parameters and the application demand parameters;
extracting the characteristic value of the image data to be calibrated, inputting the characteristic value into a characteristic matching matrix, and obtaining a matching result comprises:
acquiring a characteristic value extraction model, wherein the characteristic value extraction model is trained by a plurality of image calibration historical data;
extracting the image data to be calibrated according to the characteristic value extraction model to obtain the characteristic value;
inputting the characteristic value to the characteristic matching matrix to generate the matching result, wherein the characteristic matching matrix comprises:
wherein B1-B3 are target calibration data, T1-T3 are characteristic values, X1-X3 are characteristic coordinate values, R is a matching constant, wherein R is a positive integer greater than 1;
calibrating the image data to be calibrated according to the matching result, wherein obtaining calibration image data comprises the following steps:
fusing the matching result with the image data set to be calibrated to obtain a calibration result;
and performing splicing processing on the calibration result and the original image data to obtain the calibration image data.
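The fusing and splicing steps of claim 1 can be illustrated with a small sketch. The additive fusion and the grid-based stitching below are assumptions made for illustration (the patent does not specify the fusion operator or the tile layout); only the overall shape of the step — apply the matched correction to each tile, then splice tiles into one array-camera mosaic — is taken from the claim.

```python
import numpy as np

def fuse(tile, correction):
    """Fuse the matching result (modeled here as an additive correction) with an image tile."""
    return np.clip(tile + correction, 0.0, 255.0)

def stitch(tiles, grid_shape):
    """Splice calibrated per-camera tiles into a single array-camera mosaic."""
    rows, cols = grid_shape
    return np.vstack([np.hstack(tiles[r * cols:(r + 1) * cols]) for r in range(rows)])

# Toy 2x2 array camera: four 2x2 tiles with constant values 0..3.
tiles = [np.full((2, 2), float(i)) for i in range(4)]
calibrated = [fuse(t, 1.0) for t in tiles]   # fuse each tile with its "matching result"
mosaic = stitch(calibrated, (2, 2))          # splice into the final 4x4 calibration image
```

In practice the correction would come from the feature matching matrix of the preceding step, and the stitching would account for overlap and registration between adjacent cameras rather than simple abutment.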
2. An image calibration device based on an array camera, comprising:
an acquisition module for acquiring original image data based on the array image pickup device;
the classifying module is used for classifying the original image data according to scene parameters to obtain image data to be calibrated;
the extraction module is used for extracting the characteristic value of the image data to be calibrated, and inputting the characteristic value into the characteristic matching matrix to obtain a matching result;
the calibration module is used for calibrating the image data to be calibrated according to the matching result to obtain calibration image data;
the apparatus further comprises:
the acquisition module is also used for acquiring environmental parameters and application demand parameters according to the array camera equipment, wherein the application demand parameters are used for representing the functional purpose and classifying expectations of image calibration;
the generating module is used for generating the scene parameters according to the environment parameters and the application demand parameters;
the extraction module comprises:
the acquisition unit is used for acquiring a characteristic value extraction model, wherein the characteristic value extraction model is trained by a plurality of image calibration historical data;
the extraction unit is used for extracting the image data to be calibrated according to the characteristic value extraction model to obtain the characteristic value;
the input unit is configured to input the feature value to the feature matching matrix, and generate the matching result, where the feature matching matrix includes:
wherein B1-B3 are target calibration data, T1-T3 are characteristic values, X1-X3 are characteristic coordinate values, R is a matching constant, wherein R is a positive integer greater than 1;
the calibration module comprises:
the fusion unit is used for fusing the matching result with the image data set to be calibrated to obtain a calibration result;
and the splicing unit is used for splicing the calibration result with the original image data to obtain the calibration image data.
3. A non-volatile storage medium comprising a stored program, wherein the program when run controls a device in which the non-volatile storage medium resides to perform the method of claim 1.
4. An electronic device comprising a processor and a memory; the memory has stored therein computer readable instructions for execution by the processor, wherein the computer readable instructions when executed perform the method of claim 1.
CN202310731532.1A 2023-06-20 2023-06-20 Image calibration method and device based on array camera Active CN116758165B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310731532.1A CN116758165B (en) 2023-06-20 2023-06-20 Image calibration method and device based on array camera


Publications (2)

Publication Number Publication Date
CN116758165A CN116758165A (en) 2023-09-15
CN116758165B true CN116758165B (en) 2024-01-30

Family

ID=87956744


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130004732A (en) * 2011-07-04 2013-01-14 전자부품연구원 Method for measuring similarity of corresponding image and recording medium thereof
CN105488810A (en) * 2016-01-20 2016-04-13 东南大学 Focused light field camera internal and external parameter calibration method
CN114549652A (en) * 2022-01-13 2022-05-27 湖南视比特机器人有限公司 Camera calibration method, device, equipment and computer readable medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant