CN116017128A - Edge camera auxiliary image construction method and device - Google Patents

Info

Publication number
CN116017128A
CN116017128A (Application No. CN202211539684.3A)
Authority
CN
China
Prior art keywords
image data
edge
parameters
main
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211539684.3A
Other languages
Chinese (zh)
Inventor
袁潮
邓迪旻
温建伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhuohe Technology Co Ltd
Original Assignee
Beijing Zhuohe Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhuohe Technology Co Ltd filed Critical Beijing Zhuohe Technology Co Ltd
Priority to CN202211539684.3A priority Critical patent/CN116017128A/en
Publication of CN116017128A publication Critical patent/CN116017128A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses an edge camera auxiliary image construction method and device. The method comprises the following steps: acquiring edge-camera parameters and linkage parameters; extracting the edge image data corresponding to the edge-camera parameters according to the linkage parameters; marking the edge image data with the main-image function to obtain marked edge image data; and fitting the marked edge image data with the main image data to obtain an image construction result. The invention solves the prior-art problem that, when an edge camera in a high-precision imaging matrix works in an auxiliary mode, its images are merely mixed into the main imaging system's image data for multi-angle, multi-mode auxiliary display, so that voluminous, functionally complex edge data cannot be intelligently combined or fitted with the main image data to yield well-fused, full-function image data.

Description

Edge camera auxiliary image construction method and device
Technical Field
The invention relates to the field of image construction and processing, in particular to an edge camera auxiliary image construction method and device.
Background
With the continuous development of intelligent technology, smart devices are used ever more widely in people's daily life, work and study, improving quality of life and raising learning and working efficiency.
At present, high-precision image monitoring and high-precision image recognition rely on imaging systems with hundred-megapixel or higher resolution, and imaging systems of different precisions are even assembled into an imaging matrix so as to obtain accurate recognition and judgment results. In the prior art, however, when an edge camera in such a high-precision imaging matrix works in an auxiliary mode, its images are merely mixed into the main imaging system's image data for multi-angle, multi-mode auxiliary display; when the edge data are voluminous and functionally complex, they cannot be intelligently combined or fitted with the main image data to yield well-fused, full-function image data.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The embodiment of the invention provides an edge camera auxiliary image construction method and device, which at least solve the prior-art problem that, when an edge camera in a high-precision imaging matrix works in an auxiliary mode, the image is constructed by merely mixing the edge image into the main imaging system's image data for multi-angle, diversified auxiliary display, so that voluminous, functionally complex edge data cannot be intelligently combined or fitted with the main image data to yield well-fused, full-function image data.
According to an aspect of an embodiment of the present invention, there is provided an edge camera auxiliary image construction method including: acquiring edge camera parameters and linkage parameters; extracting edge image data corresponding to the edge camera parameters according to the linkage parameters; carrying out main image function marking on the edge image data to obtain marked edge image data; and fitting the marked edge image data with the main image data to obtain an image construction result.
Optionally, the linkage parameters include: main camera parameters, edge camera function parameters.
Optionally, after the edge image data corresponding to the edge-camera parameters are extracted according to the linkage parameters, the method further includes: performing an anti-verification operation on the edge image data against the main image data in the linkage parameters to obtain a verification result, which is either main-image matched or main-image unmatched.
Optionally, fitting the marked edge image data with the main image data to obtain an image construction result includes: assigning the mark data in the marked edge image data to the main image data; and fitting the marked main image data with the marked edge image data to obtain the image construction result, the fitting formula being
P=bp*tan(mp)
where P is the image construction result, bp is the marked edge image data, and mp is the main image data.
According to another aspect of the embodiment of the present invention, there is also provided an edge camera auxiliary image construction apparatus including: the acquisition module is used for acquiring the edge camera parameters and the linkage parameters; the extraction module is used for extracting edge image data corresponding to the edge camera parameters according to the linkage parameters; the marking module is used for marking the edge image data with a main image function to obtain marked edge image data; and the fitting module is used for fitting the marked edge image data with the main image data to obtain an image construction result.
Optionally, the linkage parameters include: main camera parameters, edge camera function parameters.
Optionally, the apparatus further includes: an anti-verification module, configured to perform an anti-verification operation on the edge image data against the main image data in the linkage parameters to obtain a verification result, which is either main-image matched or main-image unmatched.
Optionally, the fitting module includes: a giving unit for giving the mark data in the mark edge image data to the main image data; a fitting unit, configured to perform fitting according to the marked main image data and marked edge image data to obtain the image construction result, where the fitting formula is
P=bp*tan(mp)
Where P is the image construction result, bp is the marker edge image data, and mp is the main image data.
According to another aspect of the embodiment of the present invention, there is further provided a nonvolatile storage medium, where the nonvolatile storage medium includes a stored program, and when the program runs, the program controls a device in which the nonvolatile storage medium is located to execute an edge camera auxiliary image construction method.
According to another aspect of the embodiment of the present invention, there is also provided an electronic device including a processor and a memory; the memory stores computer readable instructions, and the processor is configured to execute the computer readable instructions, where the computer readable instructions execute an edge camera assisted image construction method when executed.
In the embodiment of the invention, edge-camera parameters and linkage parameters are acquired; edge image data corresponding to the edge-camera parameters are extracted according to the linkage parameters; the edge image data are marked with the main-image function to obtain marked edge image data; and the marked edge image data are fitted with the main image data to obtain an image construction result. This solves the prior-art problem that, when an edge camera in a high-precision imaging matrix works in an auxiliary mode, its images are merely mixed into the main imaging system's image data for multi-angle, diversified auxiliary display, so that voluminous, functionally complex edge data cannot be intelligently combined or fitted with the main image data to yield well-fused, full-function image data.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
FIG. 1 is a flow chart of an edge camera assisted image construction method according to an embodiment of the invention;
FIG. 2 is a block diagram of an edge camera assisted image construction apparatus according to an embodiment of the present invention;
fig. 3 is a block diagram of a terminal device for performing the method according to the invention according to an embodiment of the invention;
fig. 4 is a memory unit for holding or carrying program code for implementing a method according to the invention, according to an embodiment of the invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, a technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present invention, there is provided a method embodiment of an edge camera assisted image construction method, it being noted that the steps illustrated in the flowchart of the figures may be performed in a computer system such as a set of computer executable instructions, and that although a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order other than that illustrated herein.
Example 1
Fig. 1 is a flowchart of an edge camera auxiliary image construction method according to an embodiment of the present invention, as shown in fig. 1, the method includes the steps of:
step S102, acquiring edge camera parameters and linkage parameters.
Specifically, to solve the prior-art problem described above, the working parameters of the edge camera and the linkage parameters between the edge camera and the main camera system must first be extracted. The working parameters of the edge camera may include its type, state, functions, signals and the like, while the linkage parameters comprise the main camera parameters and the edge-camera function parameters; the edge-camera function parameters within the linkage parameters characterise the capability range of the edge camera within the overall imaging-matrix system.
The edge camera may be a single camera device configured for a particular function, or a camera cluster covering all edge-position auxiliary functions of the imaging-system matrix.
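The parameter acquisition of step S102 can be sketched as plain data records. The patent does not specify a concrete schema, so every field, key and function name below is an illustrative assumption:

```python
from dataclasses import dataclass, field

@dataclass
class EdgeCameraParams:
    """Working parameters of one edge camera (type, state, functions, signal)."""
    camera_id: str
    camera_type: str            # e.g. "wide_angle", "infrared"
    state: str                  # e.g. "active", "standby"
    functions: list = field(default_factory=list)
    signal_strength: float = 1.0

@dataclass
class LinkageParams:
    """Linkage between the main camera system and its edge cameras."""
    main_camera_params: dict
    edge_function_params: dict  # capability range of each edge camera in the matrix

def acquire_parameters(matrix_config: dict):
    """Step S102: read edge-camera and linkage parameters from the matrix config."""
    edge = [EdgeCameraParams(**c) for c in matrix_config["edge_cameras"]]
    linkage = LinkageParams(
        main_camera_params=matrix_config["main_camera"],
        edge_function_params={e.camera_id: e.functions for e in edge},
    )
    return edge, linkage
```

The linkage record deliberately mirrors the patent's description: it carries the main camera parameters plus, per edge camera, the functions that bound its role in the matrix.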
Step S104, extracting edge image data corresponding to the edge camera parameters according to the linkage parameters.
Specifically, to obtain edge image data according to the linkage parameters and the edge camera's data-acquisition conditions, the edge camera's imaging or photographing data-transmission channel must be activated according to the configuration of the imaging-matrix system, so that the edge image data can be extracted in real time using CPU and GPU computation for subsequent image construction.
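Step S104 can be sketched as follows, with the per-camera transmission channel abstracted behind a stub `read_frame` function; in practice that stub would be the CPU/GPU capture path the patent alludes to, and all dictionary keys here are assumptions:

```python
def read_frame(cam: dict) -> dict:
    """Stub for the real capture-channel read (CPU/GPU pipeline in practice)."""
    return {"camera_id": cam["camera_id"], "pixels": [[0.0]]}

def extract_edge_image_data(edge_cameras: list, linkage_functions: dict) -> dict:
    """Step S104: pull a frame from every edge camera the linkage parameters enable."""
    frames = {}
    for cam in edge_cameras:
        # Only cameras that are active AND registered in the linkage parameters
        # have their transmission channel activated.
        if cam["state"] == "active" and cam["camera_id"] in linkage_functions:
            frames[cam["camera_id"]] = read_frame(cam)
    return frames
```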
Optionally, after the edge image data corresponding to the edge-camera parameters are extracted according to the linkage parameters, the method further includes: performing an anti-verification operation on the edge image data against the main image data in the linkage parameters to obtain a verification result, which is either main-image matched or main-image unmatched.
Specifically, after the edge image data generated by the edge camera are obtained, an anti-verification operation must be performed on the edge image data against the main image data in the linkage parameters, yielding a verification result of either main-image matched or main-image unmatched. The anti-verification operation checks whether the matching relationship between the edge image data and the main image data is correct: because each edge camera operates independently during matrix-array shooting, the image data it captures may fail to match the content and requirements displayed in the main image data, causing image-mixing and output errors. By reversely using the edge image data to verify matching against the linkage parameters, the function-requirement data, and the main camera system's main image data, this reverse-verification step further increases the accuracy of image construction.
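The anti-verification operation could be approximated as below. The patent leaves the concrete matching criterion open, so this sketch assumes each frame carries a field-of-view rectangle (`fov`, a hypothetical key) and uses overlap with the main image as the test, with an assumed threshold:

```python
def anti_verify(edge_frame: dict, main_frame: dict, min_overlap: float = 0.5) -> str:
    """Reverse-check that an edge frame actually matches the main image.

    "Matching" is approximated here by field-of-view overlap; both the
    fov representation (x0, y0, x1, y1) and the 0.5 threshold are assumptions.
    """
    ex0, ey0, ex1, ey1 = edge_frame["fov"]
    mx0, my0, mx1, my1 = main_frame["fov"]
    # Intersection over the edge frame's own area as a crude overlap measure
    ix = max(0.0, min(ex1, mx1) - max(ex0, mx0))
    iy = max(0.0, min(ey1, my1) - max(ey0, my0))
    edge_area = (ex1 - ex0) * (ey1 - ey0)
    overlap = (ix * iy) / edge_area if edge_area > 0 else 0.0
    return "matched" if overlap >= min_overlap else "unmatched"
```

An unmatched result would exclude that edge frame from the later fitting step, preventing the mixing errors described above.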
And S106, performing main image function marking on the edge image data to obtain marked edge image data.
Specifically, the edge image data collected by the edge camera are an aggregate of multiple image data or image-data types, such as wide-angle image data and infrared image data, whereas the auxiliary data required by the main image data are usually only one kind of the data the edge camera collects. The functional requirement of the main image data must therefore be marked onto the edge data, yielding marked edge image data that fully meet the main image data's requirement and can be used directly in the fitting calculation with the main image data.
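Main-image function marking can be sketched as filtering the aggregated edge data down to the one data type the main image requires, tagging each kept frame. The `functions` and `mark` keys are illustrative assumptions:

```python
def mark_edge_data(edge_frames: list, required_function: str) -> list:
    """Step S106: keep only the edge data serving the main image's functional
    requirement, tagging each kept frame with that function as its mark."""
    marked = []
    for frame in edge_frames:
        if required_function in frame["functions"]:
            marked.append({**frame, "mark": required_function})
    return marked
```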
And step S108, fitting the marked edge image data with the main image data to obtain an image construction result.
Optionally, fitting the marked edge image data with the main image data to obtain an image construction result includes: assigning the mark data in the marked edge image data to the main image data; and fitting the marked main image data with the marked edge image data to obtain the image construction result, the fitting formula being
P=bp*tan(mp)
where P is the image construction result, bp is the marked edge image data, and mp is the main image data.
Specifically, to fuse the marked edge image data into the main image data, a presentable image construction result may be generated by assigning the mark data in the marked edge image data to the main image data, then fitting the marked main image data with the marked edge image data. The fitting formula is P = bp × tan(mp), where P is the image construction result, bp is the marked edge image data, and mp is the main image data. The fitting treats the tan-function value of the main image data as the inflection-point result over all pixels, and the superposition is computed by simulated fitting (e.g. in MATLAB) over the marked edge image data bp, giving the image construction data for display and output.
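The fitting formula P = bp × tan(mp) can be applied element-wise. This sketch assumes both images are equally sized nested lists of normalised pixel values so that tan() stays well-behaved; the patent does not define the pixel representation:

```python
import math

def fit_images(bp, mp):
    """Apply the patent's fitting formula P = bp * tan(mp) per pixel.

    bp: marked edge image data, mp: main image data, both nested lists of
    floats (assumed normalised, e.g. in [0, 1), away from tan's poles).
    """
    return [[b * math.tan(m) for b, m in zip(brow, mrow)]
            for brow, mrow in zip(bp, mp)]
```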
This embodiment thus addresses the prior-art limitation whereby an edge camera assisting a high-precision imaging matrix merely has its images mixed into the main imaging system's data for multi-angle, diversified auxiliary display, so that voluminous, functionally complex edge data cannot be intelligently combined or fitted with the main image data to yield well-fused, full-function image data.
Example two
Fig. 2 is a block diagram of an edge camera auxiliary image constructing apparatus according to an embodiment of the present invention, as shown in fig. 2, the apparatus including:
the acquiring module 20 is configured to acquire the edge camera parameter and the linkage parameter.
Specifically, to solve the prior-art problem described above, the working parameters of the edge camera and the linkage parameters between the edge camera and the main camera system must first be extracted. The working parameters of the edge camera may include its type, state, functions, signals and the like, while the linkage parameters comprise the main camera parameters and the edge-camera function parameters; the edge-camera function parameters within the linkage parameters characterise the capability range of the edge camera within the overall imaging-matrix system.
The edge camera may be a single camera device configured for a particular function, or a camera cluster covering all edge-position auxiliary functions of the imaging-system matrix.
And the extracting module 22 is configured to extract edge image data corresponding to the edge camera parameters according to the linkage parameters.
Specifically, to obtain edge image data according to the linkage parameters and the edge camera's data-acquisition conditions, the edge camera's imaging or photographing data-transmission channel must be activated according to the configuration of the imaging-matrix system, so that the edge image data can be extracted in real time using CPU and GPU computation for subsequent image construction.
Optionally, the apparatus further includes: an anti-verification module, configured to perform an anti-verification operation on the edge image data against the main image data in the linkage parameters to obtain a verification result, which is either main-image matched or main-image unmatched.
Specifically, after the edge image data generated by the edge camera are obtained, an anti-verification operation must be performed on the edge image data against the main image data in the linkage parameters, yielding a verification result of either main-image matched or main-image unmatched. The anti-verification operation checks whether the matching relationship between the edge image data and the main image data is correct: because each edge camera operates independently during matrix-array shooting, the image data it captures may fail to match the content and requirements displayed in the main image data, causing image-mixing and output errors. By reversely using the edge image data to verify matching against the linkage parameters, the function-requirement data, and the main camera system's main image data, this reverse-verification step further increases the accuracy of image construction.
And the marking module 24 is used for marking the edge image data with a main image function to obtain marked edge image data.
Specifically, the edge image data collected by the edge camera are an aggregate of multiple image data or image-data types, such as wide-angle image data and infrared image data, whereas the auxiliary data required by the main image data are usually only one kind of the data the edge camera collects. The functional requirement of the main image data must therefore be marked onto the edge data, yielding marked edge image data that fully meet the main image data's requirement and can be used directly in the fitting calculation with the main image data.
And a fitting module 26, configured to fit the marked edge image data to the main image data, so as to obtain an image construction result.
Optionally, the fitting module includes: an assigning unit, configured to assign the mark data in the marked edge image data to the main image data; and a fitting unit, configured to fit the marked main image data with the marked edge image data to obtain the image construction result, the fitting formula being
P=bp*tan(mp)
where P is the image construction result, bp is the marked edge image data, and mp is the main image data.
Specifically, to fuse the marked edge image data into the main image data, a presentable image construction result may be generated by assigning the mark data in the marked edge image data to the main image data, then fitting the marked main image data with the marked edge image data. The fitting formula is P = bp × tan(mp), where P is the image construction result, bp is the marked edge image data, and mp is the main image data. The fitting treats the tan-function value of the main image data as the inflection-point result over all pixels, and the superposition is computed by simulated fitting (e.g. in MATLAB) over the marked edge image data bp, giving the image construction data for display and output.
This embodiment thus addresses the prior-art limitation whereby an edge camera assisting a high-precision imaging matrix merely has its images mixed into the main imaging system's data for multi-angle, diversified auxiliary display, so that voluminous, functionally complex edge data cannot be intelligently combined or fitted with the main image data to yield well-fused, full-function image data.
According to another aspect of the embodiment of the present invention, there is further provided a nonvolatile storage medium, where the nonvolatile storage medium includes a stored program, and when the program runs, the program controls a device in which the nonvolatile storage medium is located to execute an edge camera auxiliary image construction method.
Specifically, the method comprises: acquiring edge-camera parameters and linkage parameters; extracting the edge image data corresponding to the edge-camera parameters according to the linkage parameters; marking the edge image data with the main-image function to obtain marked edge image data; and fitting the marked edge image data with the main image data to obtain an image construction result. Optionally, the linkage parameters include the main camera parameters and the edge-camera function parameters. Optionally, after the edge image data are extracted, the method further includes performing an anti-verification operation on the edge image data against the main image data in the linkage parameters, yielding a verification result of either main-image matched or main-image unmatched. Optionally, the fitting includes: assigning the mark data in the marked edge image data to the main image data; and fitting the marked main image data with the marked edge image data according to the formula P = bp × tan(mp), where P is the image construction result, bp is the marked edge image data, and mp is the main image data.
According to another aspect of the embodiment of the present invention, there is also provided an electronic device including a processor and a memory; the memory stores computer readable instructions, and the processor is configured to execute the computer readable instructions, where the computer readable instructions execute an edge camera assisted image construction method when executed.
Specifically, the method comprises: acquiring edge-camera parameters and linkage parameters; extracting the edge image data corresponding to the edge-camera parameters according to the linkage parameters; marking the edge image data with the main-image function to obtain marked edge image data; and fitting the marked edge image data with the main image data to obtain an image construction result. Optionally, the linkage parameters include the main camera parameters and the edge-camera function parameters. Optionally, after the edge image data are extracted, the method further includes performing an anti-verification operation on the edge image data against the main image data in the linkage parameters, yielding a verification result of either main-image matched or main-image unmatched. Optionally, the fitting includes: assigning the mark data in the marked edge image data to the main image data; and fitting the marked main image data with the marked edge image data according to the formula P = bp × tan(mp), where P is the image construction result, bp is the marked edge image data, and mp is the main image data.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present invention, each embodiment is described with its own emphasis; for any part not detailed in one embodiment, reference may be made to the related descriptions of the other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technology content may be implemented in other manners. The above-described embodiments of the apparatus are merely exemplary, and the division of the units, for example, may be a logic function division, and may be implemented in another manner, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interfaces, units or modules, or may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, fig. 3 is a schematic hardware structure of a terminal device according to an embodiment of the present application. As shown in fig. 3, the terminal device may include an input device 30, a processor 31, an output device 32, a memory 33, and at least one communication bus 34. The communication bus 34 is used to enable communication connections between the elements. The memory 33 may comprise a high-speed RAM memory or may further comprise a non-volatile memory NVM, such as at least one magnetic disk memory, in which various programs may be stored for performing various processing functions and implementing the method steps of the present embodiment.
Alternatively, the processor 31 may be implemented as, for example, a central processing unit (Central Processing Unit, abbreviated as CPU), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic components, and the processor 31 is coupled to the input device 30 and the output device 32 through wired or wireless connections.
Alternatively, the input device 30 may include a variety of input devices, for example, may include at least one of a user-oriented user interface, a device-oriented device interface, a programmable interface of software, a camera, and a sensor. Optionally, the device interface facing the device may be a wired interface for data transmission between devices, or may be a hardware insertion interface (such as a USB interface, a serial port, etc.) for data transmission between devices; alternatively, the user-oriented user interface may be, for example, a user-oriented control key, a voice input device for receiving voice input, and a touch-sensitive device (e.g., a touch screen, a touch pad, etc. having touch-sensitive functionality) for receiving user touch input by a user; optionally, the programmable interface of the software may be, for example, an entry for a user to edit or modify a program, for example, an input pin interface or an input interface of a chip, etc.; optionally, the transceiver may be a radio frequency transceiver chip, a baseband processing chip, a transceiver antenna, etc. with a communication function. An audio input device such as a microphone may receive voice data. The output device 32 may include a display, audio, or the like.
In this embodiment, the processor of the terminal device may include functions for executing each module of the data processing apparatus in each of the above devices; for the specific functions and technical effects, reference may be made to the above embodiments, which are not repeated here.
Fig. 4 is a schematic diagram of the hardware structure of a terminal device according to another embodiment of the present application. Fig. 4 shows a specific implementation of the embodiment of fig. 3. As shown in fig. 4, the terminal device of the present embodiment includes a processor 41 and a memory 42.
The processor 41 executes the computer program code stored in the memory 42 to implement the methods of the above-described embodiments.
The memory 42 is configured to store various types of data to support operation at the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, such as messages, pictures, and videos. The memory 42 may include a random access memory (RAM) and may also include a non-volatile memory, such as at least one magnetic disk memory.
Optionally, the processor 41 is provided in a processing component 40. The terminal device may further include: a communication component 43, a power supply component 44, a multimedia component 45, an audio component 46, an input/output interface 47, and/or a sensor component 48. The components actually included in the terminal device are set according to actual requirements, which is not limited in this embodiment.
The processing component 40 generally controls the overall operation of the terminal device. The processing component 40 may include one or more processors 41 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 40 may include one or more modules that facilitate interactions between the processing component 40 and other components. For example, processing component 40 may include a multimedia module to facilitate interaction between multimedia component 45 and processing component 40.
The power supply component 44 provides power to the various components of the terminal device. The power supply component 44 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the terminal device.
The multimedia component 45 includes a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a liquid crystal display (LCD) and a touch panel (TP). If the display screen includes a touch panel, the display screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation.
The audio component 46 is configured to output and/or input audio signals. For example, the audio component 46 includes a microphone (MIC) configured to receive external audio signals when the terminal device is in an operating mode, such as a speech recognition mode. The received audio signals may be further stored in the memory 42 or transmitted via the communication component 43. In some embodiments, the audio component 46 further includes a speaker for outputting audio signals.
The input/output interface 47 provides an interface between the processing component 40 and peripheral interface modules, which may be click wheels, buttons, and the like. These buttons may include, but are not limited to: a volume button, a start button, and a lock button.
The sensor component 48 includes one or more sensors for providing status assessments of various aspects of the terminal device. For example, the sensor component 48 may detect the open/closed state of the terminal device, the relative positioning of its components, and the presence or absence of user contact with the terminal device. The sensor component 48 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor component 48 may also include a camera and the like.
The communication component 43 is configured to facilitate wired or wireless communication between the terminal device and other devices. The terminal device may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot for inserting a SIM card, so that the terminal device can log into a GPRS network and establish communication with a server through the Internet.
From the above, it can be seen that the communication component 43, the audio component 46, the input/output interface 47, and the sensor component 48 in the embodiment of fig. 4 may be implemented as the input device in the embodiment of fig. 3.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be implemented in other manners. The apparatus embodiments described above are merely exemplary; for example, the division of the units may be a division by logical function, and other divisions are possible in actual implementation: a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is merely a preferred embodiment of the present invention. It should be noted that those skilled in the art may make various modifications and improvements without departing from the principles of the present invention, and such modifications and improvements shall also fall within the scope of the present invention.

Claims (10)

1. An edge camera assisted image construction method, comprising:
acquiring edge camera parameters and linkage parameters;
extracting edge image data corresponding to the edge camera parameters according to the linkage parameters;
carrying out main image function marking on the edge image data to obtain marked edge image data;
and fitting the marked edge image data with the main image data to obtain an image construction result.
2. The method of claim 1, wherein the linkage parameters include: main camera parameters, edge camera function parameters.
3. The method according to claim 1, wherein after the extracting the edge image data corresponding to the edge camera parameters according to the linkage parameters, the method further comprises:
performing an anti-verification operation on the edge image data and the main image data in the linkage parameters to obtain a verification result, wherein the verification result comprises: main image matched and main image not matched.
4. The method of claim 1, wherein fitting the marker edge image data to the main image data to obtain an image construction result comprises:
assigning the mark data in the marked edge image data to the main image data;
performing fitting according to the marked main image data and the marked edge image data to obtain the image construction result, wherein the fitting formula is
P=bp*tan(mp)
where P is the image construction result, bp is the marked edge image data, and mp is the main image data.
5. An edge camera assisted image construction apparatus, comprising:
the acquisition module is used for acquiring the edge camera parameters and the linkage parameters;
the extraction module is used for extracting edge image data corresponding to the edge camera parameters according to the linkage parameters;
the marking module is used for marking the edge image data with a main image function to obtain marked edge image data;
and the fitting module is used for fitting the marked edge image data with the main image data to obtain an image construction result.
6. The apparatus of claim 5, wherein the linkage parameters comprise: main camera parameters, edge camera function parameters.
7. The apparatus of claim 5, wherein the apparatus further comprises:
the anti-verification module is used for performing an anti-verification operation on the edge image data and the main image data in the linkage parameters to obtain a verification result, wherein the verification result comprises: main image matched and main image not matched.
8. The apparatus of claim 5, wherein the fitting module comprises:
an assigning unit, used for assigning the mark data in the marked edge image data to the main image data;
a fitting unit, used for performing fitting according to the marked main image data and the marked edge image data to obtain the image construction result, wherein the fitting formula is
P=bp*tan(mp)
where P is the image construction result, bp is the marked edge image data, and mp is the main image data.
9. A non-volatile storage medium, characterized in that the non-volatile storage medium comprises a stored program, wherein the program, when run, controls a device in which the non-volatile storage medium is located to perform the method of any one of claims 1 to 4.
10. An electronic device, comprising a processor and a memory, wherein the memory stores computer-readable instructions for execution by the processor, and the computer-readable instructions, when executed, perform the method of any one of claims 1 to 4.
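The pipeline of claims 1 and 4 can be illustrated with a short sketch. The patent does not specify the data structures, the format of the main-image function mark, or the value range of the image data, so everything below is an assumption made for illustration: images are modeled as NumPy arrays, the mark is a single scalar factor, and the main image data is clipped to [0, 1] to stay in a monotonic region of tan. Only the fitting formula P = bp * tan(mp) itself is taken from claim 4.

```python
import numpy as np

def construct_image(edge_image, main_image, mark_value=1.0):
    """Illustrative sketch of the claimed pipeline (data layout assumed).

    Marks the edge image data with a main-image function mark, then fits
    it to the main image data using the formula of claim 4: P = bp * tan(mp).
    """
    # Main-image function marking: modeled here as scaling by a scalar mark,
    # since the patent does not define the mark's representation.
    bp = edge_image * mark_value          # bp: marked edge image data
    # Clip the main image data so tan() stays monotonic and finite
    # (an assumption; the patent gives no value range for mp).
    mp = np.clip(main_image, 0.0, 1.0)    # mp: main image data
    # Fitting formula from claim 4.
    return bp * np.tan(mp)                # P: image construction result

edge = np.array([[0.2, 0.4], [0.6, 0.8]])
main = np.array([[0.1, 0.3], [0.5, 0.7]])
result = construct_image(edge, main)
```

With `mark_value=1.0` the result reduces to elementwise `edge * tan(main)`; the per-pixel weighting by `tan(mp)` is one plausible reading of how the main image data modulates the marked edge contribution.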
CN202211539684.3A 2022-12-02 2022-12-02 Edge camera auxiliary image construction method and device Pending CN116017128A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211539684.3A CN116017128A (en) 2022-12-02 2022-12-02 Edge camera auxiliary image construction method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211539684.3A CN116017128A (en) 2022-12-02 2022-12-02 Edge camera auxiliary image construction method and device

Publications (1)

Publication Number Publication Date
CN116017128A true CN116017128A (en) 2023-04-25

Family

ID=86021987

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211539684.3A Pending CN116017128A (en) 2022-12-02 2022-12-02 Edge camera auxiliary image construction method and device

Country Status (1)

Country Link
CN (1) CN116017128A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107516325A (en) * 2017-08-22 2017-12-26 上海理工大学 Center of circle detection method based on sub-pixel edge
US20180191940A1 (en) * 2016-12-30 2018-07-05 Altek Semiconductor Corp. Image capturing device and control method thereof
CN110929615A (en) * 2019-11-14 2020-03-27 RealMe重庆移动通信有限公司 Image processing method, image processing apparatus, storage medium, and terminal device
CN114520871A (en) * 2020-11-20 2022-05-20 华为技术有限公司 Image processing method and apparatus thereof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180191940A1 (en) * 2016-12-30 2018-07-05 Altek Semiconductor Corp. Image capturing device and control method thereof
CN107516325A (en) * 2017-08-22 2017-12-26 上海理工大学 Center of circle detection method based on sub-pixel edge
CN110929615A (en) * 2019-11-14 2020-03-27 RealMe重庆移动通信有限公司 Image processing method, image processing apparatus, storage medium, and terminal device
CN114520871A (en) * 2020-11-20 2022-05-20 华为技术有限公司 Image processing method and apparatus thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WU JUN; WU DONGMEI; LI YAOHUI: "Implementation of a DSP-based Palmprint Image Localization Algorithm", Communications Technology, no. 11, pages 188 - 190 *

Similar Documents

Publication Publication Date Title
CN116614453B (en) Image transmission bandwidth selection method and device based on cloud interconnection
CN115409869B (en) Snow field track analysis method and device based on MAC tracking
CN115527045A (en) Image identification method and device for snow field danger identification
CN116017128A (en) Edge camera auxiliary image construction method and device
CN116579965B (en) Multi-image fusion method and device
CN116228593B (en) Image perfecting method and device based on hierarchical antialiasing
CN116468883B (en) High-precision image data volume fog recognition method and device
CN115345808B (en) Picture generation method and device based on multi-element information acquisition
CN115858240B (en) Optical camera data backup method and device
CN116302041B (en) Optimization method and device for light field camera interface module
CN116452481A (en) Multi-angle combined shooting method and device
CN116389915B (en) Method and device for reducing flicker of light field camera
CN116579964B (en) Dynamic frame gradual-in gradual-out dynamic fusion method and device
CN116723419B (en) Acquisition speed optimization method and device for billion-level high-precision camera
CN116757981A (en) Multi-terminal image fusion method and device
CN116485912B (en) Multi-module coordination method and device for light field camera
CN116030501B (en) Method and device for extracting bird detection data
CN115984333B (en) Smooth tracking method and device for airplane target
CN116088580B (en) Flying object tracking method and device
CN117896625A (en) Picture imaging method and device based on low-altitude high-resolution analysis
CN116506423A (en) Information security data reporting method and device
CN118096648A (en) Scene depth measurement method and device
CN116485841A (en) Motion rule identification method and device based on multiple wide angles
CN116431392A (en) Important data separation method and device
CN116309523A (en) Dynamic frame image dynamic fuzzy recognition method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination