Multi-module coordination method and device for light field camera (CN116485912B)

Info

Publication number
CN116485912B
CN116485912B (application CN202310456648.9A)
Authority
CN (China)
Prior art keywords
joint debugging, module, variable, modules, debugging result
Prior art date
2023-04-25
Legal status
Active (assumed; Google has not performed a legal analysis)
Application number
CN202310456648.9A
Other languages
Chinese (zh)
Other versions
CN116485912A
Inventor
袁潮
邓迪旻
温建伟
Current Assignee
Beijing Zhuohe Technology Co Ltd
Original Assignee
Beijing Zhuohe Technology Co Ltd
Priority date
2023-04-25
Filing date
2023-04-25
Publication dates
CN116485912A: 2023-07-25; CN116485912B (grant): 2023-12-05
Application filed by Beijing Zhuohe Technology Co Ltd
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 17/00: Diagnosis, testing or measuring for television systems or their details
    • H04N 17/002: Diagnosis, testing or measuring for television cameras
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; control thereof
    • H04N 23/95: Computational photography systems, e.g. light-field imaging systems
    • H04N 23/957: Light-field or plenoptic cameras or camera modules
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B 20/00: Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B 20/40: Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses a multi-module coordination method and device for a light field camera. The method comprises: acquiring light field camera module information; calibrating the relative positions and parameters of the modules according to the module information to obtain calibration data; performing joint debugging on all modules with a coordination control algorithm according to the module information and the calibration data to obtain a joint debugging result and a variable factor, the variable factor representing the coefficient by which an environment variable influences the joint debugging result; and adjusting the joint debugging result in real time according to the environment variable and the variable factor. The application addresses two technical problems of prior-art multi-module light field camera systems: optical calibration between the modules can be inaccurate owing to hardware limitations, weather, and other special conditions, making the synthesized image discontinuous or even distorted; and the modules require coordinated control during operation, without which system stability and performance cannot be ensured.

Description

Multi-module coordination method and device for light field camera
Technical Field
The application relates to the field of optical camera configuration optimization, and in particular to a multi-module coordination method and device for a light field camera.
Background
With the continuous development of intelligent technology, intelligent devices are used ever more widely in people's daily life, work, and study; such technology improves quality of life and raises learning and working efficiency.
A light field camera can simultaneously capture scene depth information and multiple viewing angles, and is widely applied in fields such as virtual reality, augmented reality, and three-dimensional modeling. In prior-art multi-module light field camera systems, however, factors such as hardware limitations, weather, and other special conditions can make the optical calibration between the modules insufficiently accurate, so that image synthesis becomes discontinuous or even distorted; in addition, the multiple modules require coordinated control during operation, without which the stability and performance of the system cannot be ensured. A coordinated control method that improves the operating efficiency and image quality of a multi-module light field camera system is therefore needed.
In view of the above problems, no effective solution has yet been proposed.
Disclosure of Invention
The embodiments of the application provide a multi-module coordination method and device for a light field camera, which at least solve the prior-art technical problems that, in a multi-module light field camera system, optical calibration between the modules may be inaccurate owing to hardware limitations, weather, and other special conditions, making image synthesis discontinuous or even distorted, and that the modules require coordinated control during operation, without which system stability and performance cannot be ensured.
According to one aspect of the embodiments of the application, a light field camera multi-module coordination method is provided, comprising: acquiring light field camera module information; calibrating the relative positions and parameters of the modules according to the module information to obtain calibration data; performing joint debugging on all modules with a coordination control algorithm according to the module information and the calibration data to obtain a joint debugging result and a variable factor, wherein the variable factor represents the coefficient by which an environment variable influences the joint debugging result; and adjusting the joint debugging result in real time according to the environment variable and the variable factor.
Optionally, performing joint debugging on all modules with the coordination control algorithm according to the module information and the calibration data to obtain the joint debugging result and the variable factor comprises: constructing a module-calibration two-dimensional matrix from the module information and the corresponding calibration data, and computing the joint debugging result by a formula, where L1-L3 are the joint debugging results, f1-f3 the module information, b1-b3 the calibration data, and ζ the variable factor.
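The formula itself is embedded as an image in the source publication and is not reproduced in the text. As a hedged sketch only, consistent with the stated construction of a module-calibration two-dimensional matrix scaled by ζ (the actual patented formula may differ), one plausible reading is:

```latex
% Sketch only: an assumed form, not the patent's actual (unreproduced) formula.
% Each joint debugging result L_i couples one module's information f_i with its
% calibration data b_i, scaled by the variable factor \zeta.
\begin{bmatrix} L_1 \\ L_2 \\ L_3 \end{bmatrix}
= \zeta
\begin{bmatrix} f_1 & b_1 \\ f_2 & b_2 \\ f_3 & b_3 \end{bmatrix}
\begin{bmatrix} 1 \\ 1 \end{bmatrix},
\qquad \text{i.e.}\quad L_i = \zeta\,(f_i + b_i),\ i = 1, 2, 3.
```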
Optionally, before adjusting the joint debugging result in real time according to the environment variable and the variable factor, the method further comprises acquiring the environment variable parameters in real time, the environment variable parameters comprising temperature, humidity, light intensity, and pressure.
Optionally, after adjusting the joint debugging result in real time according to the environment variable and the variable factor, the method further comprises performing optimization processing on the stitched image generated from the joint debugging result, the optimization processing comprising denoising and light enhancement.
According to another aspect of the embodiments of the application, a light field camera multi-module coordination device is also provided, comprising: an acquisition module for acquiring light field camera module information; a calibration module for calibrating the relative positions and parameters of the modules according to the module information to obtain calibration data; a joint debugging module for performing joint debugging on all modules with a coordination control algorithm according to the module information and the calibration data to obtain a joint debugging result and a variable factor, wherein the variable factor represents the coefficient by which an environment variable influences the joint debugging result; and an adjustment module for adjusting the joint debugging result in real time according to the environment variable and the variable factor.
Optionally, the calibration module comprises: a construction unit for constructing a module-calibration two-dimensional matrix from the module information and the corresponding calibration data; and a calculation unit for computing the joint debugging result by the formula above, where L1-L3 are the joint debugging results, f1-f3 the module information, b1-b3 the calibration data, and ζ the variable factor.
Optionally, the acquisition module is further configured to acquire the environment variable parameters in real time, the environment variable parameters comprising temperature, humidity, light intensity, and pressure.
Optionally, the device further comprises an optimization module for performing optimization processing on the stitched image generated from the joint debugging result, the optimization processing comprising denoising and light enhancement.
According to another aspect of the embodiments of the application, a nonvolatile storage medium is also provided. The nonvolatile storage medium comprises a stored program which, when run, controls a device in which the nonvolatile storage medium resides to execute a light field camera multi-module coordination method.
According to another aspect of the embodiments of the application, an electronic device is also provided, comprising a processor and a memory. The memory stores computer readable instructions for the processor to execute, and the computer readable instructions, when executed, perform a light field camera multi-module coordination method.
In the embodiments of the application, light field camera module information is acquired; the relative positions and parameters of the modules are calibrated according to the module information to obtain calibration data; joint debugging is performed on all modules with a coordination control algorithm according to the module information and the calibration data to obtain a joint debugging result and a variable factor, the variable factor representing the coefficient by which an environment variable influences the joint debugging result; and the joint debugging result is adjusted in real time according to the environment variable and the variable factor. This solves the prior-art technical problems that, in a multi-module light field camera system, optical calibration between the modules may be insufficiently accurate owing to hardware limitations, weather, and other special conditions, making image synthesis discontinuous or even distorted, and that the modules require coordinated control during operation, without which system stability and performance cannot be ensured.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a flow chart of a light field camera multi-module coordination method according to an embodiment of the application;
FIG. 2 is a block diagram of a light field camera multi-module coordination device according to an embodiment of the application;
FIG. 3 is a block diagram of a terminal device, according to an embodiment of the application, for performing the method of the application;
FIG. 4 is a block diagram of a memory unit, according to an embodiment of the application, for holding or carrying program code that implements the method of the application.
Detailed Description
In order that those skilled in the art may better understand the present application, the technical solutions in the embodiments of the application are described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the application. All other embodiments obtained by those of ordinary skill in the art on the basis of these embodiments without inventive effort shall fall within the scope of the application.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present application, a method embodiment of a light field camera multi-module coordination method is provided. It should be noted that the steps shown in the flowchart of the figures may be performed in a computer system, such as one executing a set of computer-executable instructions, and that, although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from that shown or described herein.
Example 1
FIG. 1 is a flowchart of a light field camera multi-module coordination method according to an embodiment of the application. As shown in FIG. 1, the method includes the following steps:
step S102, obtaining light field camera module information.
Specifically, in order to solve the prior-art problems that optical calibration between the modules of a multi-module light field camera system may be insufficiently accurate owing to hardware limitations, weather, and other special conditions, making image synthesis discontinuous or even distorted, and that the modules require coordinated control during operation to ensure system stability and performance, the light field camera module information must first be acquired, so that the parameters in the module information can drive mutual coordination between the different modules and a mutually optimized, real-time dynamic adjustment process.
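The patent does not enumerate what the module information contains. As an illustrative sketch only, with all field names hypothetical, a per-module record might look like this:

```python
from dataclasses import dataclass, field

import numpy as np


@dataclass
class ModuleInfo:
    """Hypothetical per-module record; the patent does not specify these fields."""
    module_id: str
    focal_length_mm: float   # intrinsic parameter: lens focal length
    pixel_pitch_um: float    # intrinsic parameter: sensor pixel pitch
    exposure_us: int         # current exposure time in microseconds
    # Extrinsics relative to a common world frame (assumed representation):
    position_m: np.ndarray = field(default_factory=lambda: np.zeros(3))
    rotation: np.ndarray = field(default_factory=lambda: np.eye(3))


# Example: two modules of a multi-module light field camera
modules = [ModuleInfo("cam0", 8.0, 3.45, 5000),
           ModuleInfo("cam1", 8.0, 3.45, 5000)]
```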
Step S104, calibrating the relative positions and parameters of the modules according to the module information to obtain calibration data.
Specifically, after the module information of the different modules is obtained, the embodiment of the application calibrates the relative positions and relative parameters of the different modules, so as to increase precision and reduce mismatches in the subsequent coordination processing between the modules.
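The patent does not describe the calibration procedure itself. A minimal sketch of one common approach (an assumption, not the patent's method) derives each module's pose relative to a reference module from per-module extrinsics:

```python
import numpy as np


def relative_pose(R_ref, t_ref, R_i, t_i):
    """Pose of module i in the reference module's camera frame.

    Inputs are world-to-camera rotations (3x3) and translations (3,).
    Returns (R_rel, t_rel) with x_i = R_rel @ x_ref + t_rel.
    """
    R_rel = R_i @ R_ref.T
    t_rel = t_i - R_rel @ t_ref
    return R_rel, t_rel


# Example: module 1 sits 5 cm to the right of module 0
R0, t0 = np.eye(3), np.zeros(3)
R1, t1 = np.eye(3), np.array([0.05, 0.0, 0.0])
R_rel, t_rel = relative_pose(R0, t0, R1, t1)  # identity rotation, 5 cm offset
```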
Step S106, performing joint debugging on all modules with a coordination control algorithm according to the module information and the calibration data to obtain a joint debugging result and a variable factor, wherein the variable factor represents the coefficient by which an environment variable influences the joint debugging result.
Optionally, performing joint debugging on all modules with the coordination control algorithm according to the module information and the calibration data to obtain the joint debugging result and the variable factor comprises: constructing a module-calibration two-dimensional matrix from the module information and the corresponding calibration data, and computing the joint debugging result by the formula given above, where L1-L3 are the joint debugging results, f1-f3 the module information, b1-b3 the calibration data, and ζ the variable factor.
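Continuing the assumed reading of the formula sketched earlier, the joint debugging computation could be implemented by stacking the module-calibration matrix and applying ζ. This is a sketch under that assumption, not the patent's actual algorithm:

```python
import numpy as np


def joint_debug(f, b, zeta):
    """Assumed joint debugging: L_i = zeta * (f_i + b_i) for each module i.

    f: module information values (f1..f3), b: calibration data (b1..b3),
    zeta: variable factor, the environmental-influence coefficient.
    """
    M = np.column_stack([f, b])       # the module-calibration 2-D matrix
    return zeta * (M @ np.ones(2))    # row sums scaled by zeta


L = joint_debug(np.array([1.2, 0.9, 1.1]),
                np.array([0.05, -0.02, 0.01]),
                zeta=0.98)            # -> array of L1..L3
```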
Step S108, adjusting the joint debugging result in real time according to the environment variable and the variable factor.
Optionally, before adjusting the joint debugging result in real time according to the environment variable and the variable factor, the method further comprises acquiring the environment variable parameters in real time, the environment variable parameters comprising temperature, humidity, light intensity, and pressure.
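A minimal sketch of what real-time acquisition of these four parameters might look like; the sensor interface is hypothetical, so simulated values stand in for hardware reads:

```python
import random
from dataclasses import dataclass


@dataclass
class EnvSample:
    """One real-time reading of the four environment variable parameters."""
    temperature_c: float
    humidity_pct: float
    light_lux: float
    pressure_hpa: float


def read_environment() -> EnvSample:
    """Stand-in for real sensor reads; the values here are simulated."""
    return EnvSample(temperature_c=25.0 + random.uniform(-1, 1),
                     humidity_pct=40.0 + random.uniform(-5, 5),
                     light_lux=800.0 + random.uniform(-50, 50),
                     pressure_hpa=1013.0 + random.uniform(-2, 2))
```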
Optionally, after adjusting the joint debugging result in real time according to the environment variable and the variable factor, the method further comprises performing optimization processing on the stitched image generated from the joint debugging result, the optimization processing comprising denoising and light enhancement.
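As one concrete realization of this post-processing (an assumed choice of algorithms, not mandated by the patent), OpenCV's non-local-means denoiser followed by a gamma lookup table for light enhancement could be used:

```python
import cv2
import numpy as np


def optimize_stitched(img_bgr, h=10, gamma=0.8):
    """Denoise then brighten a stitched 8-bit BGR image.

    h and gamma are assumed defaults; gamma < 1 lightens shadows.
    """
    denoised = cv2.fastNlMeansDenoisingColored(img_bgr, None, h, h, 7, 21)
    lut = np.array([((i / 255.0) ** gamma) * 255 for i in range(256)],
                   dtype=np.uint8)
    return cv2.LUT(denoised, lut)
```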
Specifically, the joint debugging result can be adjusted according to the environment variables and the variable factor obtained in the embodiment of the application, achieving flexible, real-time dynamic adjustment. Before the joint debugging result is adjusted in real time according to the environment variables and the variable factor, the embodiment of the application may further acquire the environment variable parameters in real time, the environment variable parameters comprising temperature, humidity, light intensity, and pressure; and after the joint debugging result is adjusted in real time according to the environment variables and the variable factor, the stitched image generated from the joint debugging result is optimized, the optimization comprising denoising and light enhancement.
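The patent does not give the adjustment law itself. As a hedged sketch, one might scale each joint debugging result by ζ times the mean relative deviation of the environment parameters from a reference state; the proportional form and the reference values are assumptions:

```python
def adjust(L, zeta, env, ref):
    """Assumed real-time adjustment of joint debugging results L1..L3.

    env/ref: dicts of the four environment parameters; zeta: the variable factor.
    """
    keys = ("temperature_c", "humidity_pct", "light_lux", "pressure_hpa")
    deviation = sum((env[k] - ref[k]) / abs(ref[k]) for k in keys) / len(keys)
    return [value * (1.0 + zeta * deviation) for value in L]


ref = {"temperature_c": 25.0, "humidity_pct": 40.0,
       "light_lux": 800.0, "pressure_hpa": 1013.0}
env = {"temperature_c": 31.0, "humidity_pct": 55.0,
       "light_lux": 640.0, "pressure_hpa": 1008.0}
L_adjusted = adjust([1.17, 0.85, 1.09], zeta=0.98, env=env, ref=ref)
```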
This embodiment thus solves the prior-art technical problems that, in a multi-module light field camera system, optical calibration between the modules may be inaccurate owing to hardware limitations, weather, and other special conditions, making image synthesis discontinuous or even distorted, and that the modules require coordinated control during operation, without which system stability and performance cannot be ensured.
Example 2
FIG. 2 is a block diagram of a light field camera multi-module coordination device according to an embodiment of the application. As shown in FIG. 2, the device includes:
the acquiring module 20 is configured to acquire light field camera module information.
Specifically, in order to solve the prior-art problems that optical calibration between the modules of a multi-module light field camera system may be insufficiently accurate owing to hardware limitations, weather, and other special conditions, making image synthesis discontinuous or even distorted, and that the modules require coordinated control during operation to ensure system stability and performance, the light field camera module information must first be acquired, so that the parameters in the module information can drive mutual coordination between the different modules and a mutually optimized, real-time dynamic adjustment process.
The calibration module 22 is used for calibrating the relative positions and parameters of the modules according to the module information to obtain calibration data.
Specifically, after the module information of the different modules is obtained, the embodiment of the application calibrates the relative positions and relative parameters of the different modules, so as to increase precision and reduce mismatches in the subsequent coordination processing between the modules.
The joint debugging module 24 is used for performing joint debugging on all modules with a coordination control algorithm according to the module information and the calibration data to obtain a joint debugging result and a variable factor, wherein the variable factor represents the coefficient by which an environment variable influences the joint debugging result.
Optionally, the calibration module comprises: a construction unit for constructing a module-calibration two-dimensional matrix from the module information and the corresponding calibration data; and a calculation unit for computing the joint debugging result by the formula given above, where L1-L3 are the joint debugging results, f1-f3 the module information, b1-b3 the calibration data, and ζ the variable factor.
The adjustment module 26 is configured to adjust the joint debugging result in real time according to the environment variable and the variable factor.
Optionally, the acquisition module is further configured to acquire the environment variable parameters in real time, the environment variable parameters comprising temperature, humidity, light intensity, and pressure.
Optionally, the device further comprises an optimization module for performing optimization processing on the stitched image generated from the joint debugging result, the optimization processing comprising denoising and light enhancement.
This embodiment thus solves the prior-art technical problems that, in a multi-module light field camera system, optical calibration between the modules may be inaccurate owing to hardware limitations, weather, and other special conditions, making image synthesis discontinuous or even distorted, and that the modules require coordinated control during operation, without which system stability and performance cannot be ensured.
According to another aspect of the embodiments of the application, a nonvolatile storage medium is also provided. The nonvolatile storage medium comprises a stored program which, when run, controls a device in which the nonvolatile storage medium resides to execute a light field camera multi-module coordination method.
Specifically, the method comprises the following steps: acquiring light field camera module information; calibrating the relative positions and parameters of the modules according to the module information to obtain calibration data; performing joint debugging on all modules with a coordination control algorithm according to the module information and the calibration data to obtain a joint debugging result and a variable factor, wherein the variable factor represents the coefficient by which an environment variable influences the joint debugging result; and adjusting the joint debugging result in real time according to the environment variable and the variable factor. Optionally, performing joint debugging on all modules with the coordination control algorithm according to the module information and the calibration data to obtain the joint debugging result and the variable factor comprises: constructing a module-calibration two-dimensional matrix from the module information and the corresponding calibration data, and computing the joint debugging result by the formula given above, where L1-L3 are the joint debugging results, f1-f3 the module information, b1-b3 the calibration data, and ζ the variable factor. Optionally, before the joint debugging result is adjusted in real time according to the environment variable and the variable factor, the method further comprises acquiring the environment variable parameters in real time, the environment variable parameters comprising temperature, humidity, light intensity, and pressure. Optionally, after the joint debugging result is adjusted in real time according to the environment variable and the variable factor, the method further comprises performing optimization processing on the stitched image generated from the joint debugging result, the optimization processing comprising denoising and light enhancement.
According to another aspect of the embodiments of the application, an electronic device is also provided, comprising a processor and a memory. The memory stores computer readable instructions for the processor to execute, and the computer readable instructions, when executed, perform a light field camera multi-module coordination method.
Specifically, the method comprises the following steps: acquiring light field camera module information; calibrating the relative positions and parameters of the modules according to the module information to obtain calibration data; performing joint debugging on all modules with a coordination control algorithm according to the module information and the calibration data to obtain a joint debugging result and a variable factor, wherein the variable factor represents the coefficient by which an environment variable influences the joint debugging result; and adjusting the joint debugging result in real time according to the environment variable and the variable factor. Optionally, performing joint debugging on all modules with the coordination control algorithm according to the module information and the calibration data to obtain the joint debugging result and the variable factor comprises: constructing a module-calibration two-dimensional matrix from the module information and the corresponding calibration data, and computing the joint debugging result by the formula given above, where L1-L3 are the joint debugging results, f1-f3 the module information, b1-b3 the calibration data, and ζ the variable factor. Optionally, before the joint debugging result is adjusted in real time according to the environment variable and the variable factor, the method further comprises acquiring the environment variable parameters in real time, the environment variable parameters comprising temperature, humidity, light intensity, and pressure. Optionally, after the joint debugging result is adjusted in real time according to the environment variable and the variable factor, the method further comprises performing optimization processing on the stitched image generated from the joint debugging result, the optimization processing comprising denoising and light enhancement.
The embodiment numbers above are for description only and do not indicate the relative merits of the embodiments.
In the foregoing embodiments of the present application, each embodiment is described with its own emphasis; for any part not detailed in one embodiment, reference may be made to the related descriptions of the other embodiments.
In addition, FIG. 3 is a schematic diagram of the hardware structure of a terminal device according to an embodiment of the present application. As shown in FIG. 3, the terminal device may include an input device 30, a processor 31, an output device 32, a memory 33, and at least one communication bus 34. The communication bus 34 is used to enable communication connections between the elements. The memory 33 may comprise a high-speed RAM memory and may further comprise a non-volatile memory (NVM), such as at least one magnetic disk memory, in which various programs may be stored for performing various processing functions and implementing the method steps of the present embodiment.
Alternatively, the processor 31 may be implemented as, for example, a central processing unit (Central Processing Unit, abbreviated as CPU), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic components, and the processor 31 is coupled to the input device 30 and the output device 32 through wired or wireless connections.
Optionally, the input device 30 may include a variety of input devices, for example at least one of a user-oriented user interface, a device-oriented device interface, a programmable software interface, a camera, and a sensor. Optionally, the device-oriented device interface may be a wired interface for data transmission between devices, or a hardware insertion interface (such as a USB interface or a serial port) for data transmission between devices. Optionally, the user-oriented user interface may be, for example, control keys, a voice input device for receiving voice input, and a touch sensing device (such as a touch screen or touch pad with touch sensing functionality) for receiving user touch input. Optionally, the programmable software interface may be, for example, an entry for a user to edit or modify a program, such as an input pin interface or an input interface of a chip. Optionally, a transceiver with communication functions, such as a radio frequency transceiver chip, a baseband processing chip, or a transceiver antenna, may also be included. An audio input device such as a microphone may receive voice data. The output device 32 may include a display, audio output, and the like.
In this embodiment, the processor of the terminal device may include functions for executing each module of the data processing apparatus in each device described above; specific functions and technical effects may be found in the above embodiments and are not repeated here.
FIG. 4 is a schematic diagram of the hardware structure of a terminal device according to another embodiment of the present application. FIG. 4 shows a specific embodiment of the implementation of FIG. 3. As shown in FIG. 4, the terminal device of this embodiment includes a processor 41 and a memory 42.
The processor 41 executes the computer program code stored in the memory 42 to implement the methods of the above-described embodiments.
The memory 42 is configured to store various types of data to support operation at the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, such as messages, pictures, video, etc. The memory 42 may include a random access memory (random access memory, simply referred to as RAM) and may also include a non-volatile memory (non-volatile memory), such as at least one disk memory.
Optionally, a processor 41 is provided in the processing assembly 40. The terminal device may further include: a communication component 43, a power supply component 44, a multimedia component 45, an audio component 46, an input/output interface 47 and/or a sensor component 48. The components and the like specifically included in the terminal device are set according to actual requirements, which are not limited in this embodiment.
The processing component 40 generally controls the overall operation of the terminal device. The processing component 40 may include one or more processors 41 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 40 may include one or more modules that facilitate interactions between the processing component 40 and other components. For example, processing component 40 may include a multimedia module to facilitate interaction between multimedia component 45 and processing component 40.
The power supply assembly 44 provides power to the various components of the terminal device. Power supply components 44 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for terminal devices.
The multimedia component 45 comprises a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the display screen includes a touch panel, the display screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with the touch or slide operation.
The audio component 46 is configured to output and/or input audio signals. For example, the audio component 46 includes a Microphone (MIC) configured to receive external audio signals when the terminal device is in an operational mode, such as a speech recognition mode. The received audio signals may be further stored in the memory 42 or transmitted via the communication component 43. In some embodiments, audio assembly 46 further includes a speaker for outputting audio signals.
The input/output interface 47 provides an interface between the processing assembly 40 and peripheral interface modules, which may be click wheels, buttons, etc. These buttons may include, but are not limited to: volume button, start button and lock button.
The sensor assembly 48 includes one or more sensors for providing status assessment of various aspects for the terminal device. For example, the sensor assembly 48 may detect the open/closed state of the terminal device, the relative positioning of the assembly, the presence or absence of user contact with the terminal device. The sensor assembly 48 may include a proximity sensor configured to detect the presence of nearby objects in the absence of any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor assembly 48 may also include a camera or the like.
The communication component 43 is configured to facilitate communication between the terminal device and other devices in a wired or wireless manner. The terminal device may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot for inserting a SIM card, so that the terminal device may log into a GPRS network and establish communication with a server through the internet.
From the above, it will be appreciated that the communication component 43, the audio component 46, the input/output interface 47, and the sensor component 48 referred to in the embodiment of FIG. 4 may be implemented as the input device in the embodiment of FIG. 3.
In the several embodiments provided in the present application, it should be understood that the disclosed technology may be implemented in other manners. The above-described embodiments of the apparatus are merely exemplary, and the division of the units, for example, may be a logic function division, and may be implemented in another manner, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interfaces, units or modules, or may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a U-disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is merely a preferred embodiment of the present application. It should be noted that those of ordinary skill in the art may make modifications and adaptations without departing from the principles of the application, and such modifications and adaptations shall also fall within the scope of the application.

Claims (4)

1. A light field camera multi-module coordination method, comprising:
acquiring light field camera module information;
calibrating the relative positions and parameters of the modules according to the module information to obtain calibration data;
according to the module information and the calibration data, performing joint debugging on all modules by using a coordination control algorithm to obtain a joint debugging result and a variable factor, wherein the variable factor represents a coefficient of influence of an environment variable on the joint debugging result;
adjusting the joint debugging result in real time according to the environment variable and the variable factor;
wherein said performing joint debugging on all modules by using a coordination control algorithm according to the module information and the calibration data to obtain a joint debugging result and a variable factor comprises:
constructing a module-calibration two-dimensional matrix through the module information and the corresponding calibration data;
computing the joint debugging result by the formula, wherein L1-L3 are the joint debugging results, f1-f3 the module information, b1-b3 the calibration data, and ζ the variable factor;
before said adjusting said joint debugging result in real time according to said environmental variable and said variable factor, said method further comprises:
acquiring the environment variable parameters in real time, wherein the environment variable parameters comprise: temperature, humidity, light intensity, and pressure;
after said adjusting said joint debugging results in real time according to said environmental variables and said variable factors, said method further comprises:
performing optimization processing on the stitched image generated from the joint debugging result, wherein the optimization processing comprises denoising and light enhancement.
2. A light field camera multi-module coordination device, comprising:
the acquisition module is used for acquiring the information of the light field camera module;
the calibration module is used for calibrating the relative positions and parameters of the modules according to the module information to obtain calibration data;
the joint debugging module is used for joint debugging all modules by utilizing a coordination control algorithm according to the module information and the calibration data to obtain a joint debugging result and a variable factor, wherein the variable factor represents a coefficient of influence of an environment variable on the joint debugging result;
the adjustment module is used for adjusting the joint debugging result in real time according to the environment variable and the variable factor;
the calibration module comprises:
the construction unit is used for constructing a module-calibration two-dimensional matrix through the module information and the corresponding calibration data;
a calculation unit for computing the joint debugging result by the formula, wherein L1-L3 are the joint debugging results, f1-f3 the module information, b1-b3 the calibration data, and ζ the variable factor;
the apparatus further comprises:
the acquisition module is further configured to acquire the environment variable parameter in real time, where the environment variable parameter includes: temperature, humidity, light intensity, pressure;
the apparatus further comprises:
the optimization module is used for performing optimization processing on the stitched image generated from the joint debugging result, wherein the optimization processing comprises denoising and light enhancement.
3. A non-volatile storage medium comprising a stored program, wherein the program when run controls a device in which the non-volatile storage medium resides to perform the method of claim 1.
4. An electronic device comprising a processor and a memory; the memory has stored therein computer readable instructions for execution by the processor, wherein the computer readable instructions when executed perform the method of claim 1.
Application CN202310456648.9A, filed 2023-04-25 (priority 2023-04-25): Multi-module coordination method and device for light field camera. Granted as CN116485912B; status: Active.

Priority Applications (1)

CN202310456648.9A (priority date 2023-04-25, filing date 2023-04-25): Multi-module coordination method and device for light field camera, granted as CN116485912B

Applications Claiming Priority (1)

CN202310456648.9A (priority date 2023-04-25, filing date 2023-04-25): Multi-module coordination method and device for light field camera, granted as CN116485912B

Publications (2)

CN116485912A: published 2023-07-25
CN116485912B (grant): published 2023-12-05

Family

Family ID: 87211492

Family Applications (1)

CN202310456648.9A (priority 2023-04-25, filed 2023-04-25): Multi-module coordination method and device for light field camera (Active; granted as CN116485912B)

Country Status (1)

CN: CN116485912B


Family Cites Families (2)

* Cited by examiner, † Cited by third party
US9357132B2 * (Apple Inc.; priority 2014-05-30, published 2016-05-31): Video rolling shutter correction for lens movement in optical image stabilization cameras
US20190340317A1 * (Microsoft Technology Licensing, Llc; priority 2018-05-07, published 2019-11-07): Computer vision through simulated hardware optimization

Patent Citations (7)

* Cited by examiner, † Cited by third party
EP3099078A1 * (Thomson Licensing; priority 2015-05-29, published 2016-11-30): Method for collecting information on users of 4d light field data, corresponding apparatuses and computer programs
CN107492127A * (丁志宇; priority 2017-09-18, published 2017-12-19): Light-field camera parameter calibration method, device, storage medium and computer equipment
WO2021227504A1 * (奥比中光科技集团股份有限公司; priority 2020-05-13, published 2021-11-18): Depth calculation system and method, and computer-readable storage medium
CN114926547A * (海信电子科技(深圳)有限公司; priority 2022-06-01, published 2022-08-19): Calibration method of camera and IMU, electronic device and system
CN115809188A * (宁德时代新能源科技股份有限公司; priority 2022-08-03, published 2023-03-17): Debugging method, device, equipment, medium and program product of image detection algorithm
CN115546313A * (重庆长安汽车股份有限公司; priority 2022-09-30, published 2022-12-30): Vehicle-mounted camera self-calibration method and device, electronic equipment and storage medium
CN115953478A * (上海华测导航技术股份有限公司; priority 2022-12-23, published 2023-04-11): Camera parameter calibration method and device, electronic equipment and readable storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"A calibration method for focused light field cameras based on light field images" (一种基于光场图像的聚焦光场相机标定方法); 孙俊阳, 孙俊, 许传龙, 张彪, 王式民; Acta Optica Sinica (光学学报), No. 05, pp. 1-11. *
"Review of imaging models and parameter calibration methods for light field cameras" (光场相机成像模型及参数标定方法综述); 张春萍, 王庆; Chinese Journal of Lasers (中国激光), No. 06, pp. 1-12. *

Also Published As

CN116485912A: published 2023-07-25

Similar Documents

Publication number and title
CN111367407B (en) Intelligent glasses interaction method, intelligent glasses interaction device and intelligent glasses
CN115984126A (en) Optical image correction method and device based on input instruction
CN115375582A (en) Moire digestion method and device based on low-order Taylor decomposition
CN116614453B (en) Image transmission bandwidth selection method and device based on cloud interconnection
CN116485912B (en) Multi-module coordination method and device for light field camera
CN105630486B (en) Typesetting method and device for desktop of intelligent terminal equipment
CN116389915B (en) Method and device for reducing flicker of light field camera
CN116302041B (en) Optimization method and device for light field camera interface module
CN116088580B (en) Flying object tracking method and device
CN116664413B (en) Image volume fog eliminating method and device based on Abbe convergence operator
CN116468883B (en) High-precision image data volume fog recognition method and device
CN116579964B (en) Dynamic frame gradual-in gradual-out dynamic fusion method and device
CN116723419B (en) Acquisition speed optimization method and device for billion-level high-precision camera
CN116758165B (en) Image calibration method and device based on array camera
CN116797479B (en) Image vertical distortion conversion method
CN115511735B (en) Snow field gray scale picture optimization method and device
CN116402935B (en) Image synthesis method and device based on ray tracing algorithm
CN116757983B (en) Main and auxiliary image fusion method and device
CN116389887A (en) Dynamic optimization-based light field camera configuration method and device
CN115546053B (en) Method and device for eliminating diffuse reflection of graphics on snow in complex terrain
CN116228593B (en) Image perfecting method and device based on hierarchical antialiasing
CN116579965B (en) Multi-image fusion method and device
CN115460389B (en) Image white balance area optimization method and device
CN116452481A (en) Multi-angle combined shooting method and device
CN117367455A (en) Deep learning algorithm unmanned aerial vehicle route design method and device for photovoltaic power station

Legal Events

Code: Description
PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant