CN115695679A - Triple depth module matching method and device, mobile terminal, medium and chip


Info

Publication number
CN115695679A
Authority
CN
China
Prior art keywords: depth, module, determining, modules, image
Prior art date
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number
CN202211304233.1A
Other languages
Chinese (zh)
Inventor
张鹏 (Zhang Peng)
Current Assignee: Beijing Youzhuju Network Technology Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original Assignee
Beijing Youzhuju Network Technology Co Ltd
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by Beijing Youzhuju Network Technology Co Ltd filed Critical Beijing Youzhuju Network Technology Co Ltd
Priority application: CN202211304233.1A
Publication: CN115695679A
Legal status: Pending

Landscapes

  • Image Analysis (AREA)

Abstract

The application discloses a triple depth module matching method and device, a mobile terminal, a medium and a chip, belonging to the technical field of module matching. The triple depth module matching method includes: determining, for each depth module among a plurality of depth modules, depth data captured at a plurality of shooting distances; determining an error characteristic of each depth module according to the plurality of depth data; and determining three mutually matched depth modules according to the error characteristics, along with the relative position of each depth module.

Description

Triple depth module matching method and device, mobile terminal, medium and chip
Technical Field
The application belongs to the field of module matching, and particularly relates to a triple depth module matching method, a triple depth module matching device, a mobile terminal, a readable storage medium and a chip.
Background
With the development of technology, shooting panoramic photos with a VR camera has gradually matured. At present, however, because the field of view of a single depth module is limited, a VR camera is usually produced by splicing three depth modules together to form its lens. Among depth modules, the individual variation in the absolute error of the measured depth is large, so when triples are combined in the current manner, the depth point cloud of the whole camera is irregular and the shooting quality is reduced.
Disclosure of Invention
The embodiments of the present application aim to provide a triple depth module matching method and device, a mobile terminal, a readable storage medium, and a chip, which can solve the problem of poor shooting quality caused by the differing error characteristics of the three depth modules when shooting a panoramic photo.
In a first aspect, an embodiment of the present application provides a triple depth module matching method, including: determining, for each depth module among a plurality of depth modules, depth data captured at a plurality of shooting distances; determining an error characteristic of each depth module according to the plurality of depth data; and determining three mutually matched depth modules according to the error characteristics and determining the relative position of each depth module.
In a second aspect, an embodiment of the present application provides a triple depth module matching device, including: a data detection module, configured to determine, for each depth module among a plurality of depth modules, depth data captured at a plurality of shooting distances; an error determining module, configured to determine an error characteristic of each depth module according to the plurality of depth data; and a matching module, configured to determine three mutually matched depth modules according to the error characteristics and to determine the relative position of each depth module.
In a third aspect, embodiments of the present application provide a mobile terminal, which includes a processor, a memory, and a program or instructions stored in the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium on which a program or instructions are stored, which when executed by a processor, implement the steps of the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the steps of the method according to the first aspect.
In the embodiments of this application, for a plurality of depth modules, the basic data of each depth module must first be acquired so that the modules can subsequently be screened, classified, and matched. Specifically, each depth module captures images at a plurality of shooting distances, yielding a plurality of captured depth data sets. Analyzing these depth data yields the error characteristic of each depth module, so that the modules can later be matched and selected according to their error characteristics: three matched depth modules are determined, together with the relative positions in which they are combined, thereby completing the matching.
It should be noted that, for each depth module, different depth data are generated at different shooting distances, with a plurality of depth data corresponding to each shooting distance. By analyzing these depth data, the error characteristic of the depth module at a particular shooting distance, or at an arbitrary shooting distance, can be determined, which facilitates classifying the depth modules according to their error characteristics.
It can be understood that, for a triple depth module, the relative positions are divided into upper, middle, and lower. Because the middle depth module covers the widest field of view, the module with the best shooting quality is used in the middle position, while the quality requirements for the upper and lower positions are slightly lower than for the middle one.
Drawings
FIG. 1 shows a flow diagram of a triple depth module matching method according to an embodiment of the present application;
FIG. 2 shows a flow diagram of a triple depth module matching method according to one embodiment of the present application;
FIG. 3 shows a flow diagram of a triple depth module matching method according to one embodiment of the present application;
FIG. 4 shows a flow diagram of a triple depth module matching method according to one embodiment of the present application;
FIG. 5 illustrates a schematic structural diagram of a triple depth module matching apparatus according to one embodiment of the present application;
FIG. 6 shows a schematic structural diagram of a mobile terminal according to an embodiment of the present application;
FIG. 7 shows a schematic structural diagram of a mobile terminal according to an embodiment of the present application.
Wherein, the correspondence between the reference numbers and the component names in fig. 5 to 7 is:
100: a mobile terminal; 101: a radio frequency unit; 102: a network module; 103: an audio output unit; 104: an input unit; 1041: a graphics processor; 1042: a microphone; 105: a sensor; 106: a display unit; 1061: a display panel; 107: a user input unit; 1071: a touch panel; 1072: other input devices; 108: an interface unit; 1109: a memory; 1110: a processor; 900: a triple depth module matching device; 901: a data detection module; 902: an error determination module; 903: and a matching module.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. It should be appreciated that data so used may be interchanged under appropriate circumstances, so that embodiments of the application can be practiced in sequences other than those illustrated or described herein. The terms "first", "second", and the like are generally used in a generic sense and do not limit the number of objects; for example, the first object can be one or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and following objects.
The triple depth module matching method and apparatus, the mobile terminal, the readable storage medium, and the chip provided in the embodiments of the present application are described in detail below with reference to fig. 1 to 7 through specific embodiments and application scenarios thereof.
The embodiment provides a triple depth module matching method, as shown in fig. 1, including:
step S102: determining, for each depth module among a plurality of depth modules, depth data captured at a plurality of shooting distances;
step S104: determining an error characteristic of each depth module according to the plurality of depth data;
step S106: determining three mutually matched depth modules according to the error characteristics and determining the relative position of each depth module.
With the triple depth module matching method provided by this embodiment, for a plurality of depth modules, the basic data of each depth module is first acquired to facilitate subsequent screening, classification, and matching. Specifically, each depth module captures images at a plurality of shooting distances, so that a plurality of captured depth data sets are obtained. These depth data are then analyzed to obtain the error characteristic of each depth module, so that the modules can subsequently be matched and selected according to their error characteristics: three matched depth modules are determined, together with the relative positions in which they are combined, thereby completing the matching.
It should be noted that, for each depth module, different depth data are generated at different shooting distances, with a plurality of depth data corresponding to each shooting distance. By analyzing these depth data, the error characteristic of the depth module at a particular shooting distance, or at an arbitrary shooting distance, can be determined, which facilitates classifying the depth modules according to their error characteristics.
It can be understood that, for a triple depth module, the relative positions are divided into upper, middle, and lower. Because the middle depth module covers the widest field of view, the module with the best shooting quality is used in the middle position, while the quality requirements for the upper and lower positions are slightly lower than for the middle one.
In one embodiment, the depth module needs to be preheated before shooting, reducing the possibility of depth deviation caused by subsequent temperature changes.
Optionally, as shown in fig. 2, determining, for each depth module among the plurality of depth modules, the depth data captured at a plurality of shooting distances specifically includes:
step S1022: placing the depth module in a calibration space and shooting toward a wall at each shooting distance to obtain a corresponding captured image;
step S1024: determining a plurality of sampling regions in the captured image;
step S1026: acquiring the depth data measured by the pixel points in each sampling region against the wall at that shooting distance.
When detecting the basic data of a depth module, the depth module is first placed in a pre-arranged calibration space at a predetermined shooting distance from a wall. With its position fixed, the module shoots the wall to obtain a captured image. By analyzing the captured image, a plurality of sampling regions are first determined, and depth measurement is performed on the pixel points in each sampling region, so that the absolute error of each depth module in that sampling region can be obtained; on this basis, the subsequent matching of the plurality of depth modules is facilitated.
It is understood that the location of the sampling regions may be fixed or random. It should be noted that by using fixed sampling regions, the probability that the data of different depth modules are affected by random factors is reduced, improving the accuracy of the error evaluation of each depth module.
Optionally, as shown in fig. 3, determining a plurality of sampling regions in the captured image specifically includes: step S1025: determining an up-sampled image located above the image center of the captured image, a down-sampled image located below the image center, a left sampled image located to the left of the image center, and a right sampled image located to the right of the image center.
For the selection of the sampling regions, the up-sampled, down-sampled, left sampled, and right sampled images in the captured image can be determined respectively, and the depth values measured by the pixel points in each sampled image are averaged. This provides a data evaluation of each sampled image and facilitates the subsequent judgment of the absolute error of the depth module as a whole.
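A minimal sketch of this averaging step is shown below. The function name, the lower-left row ordering of the depth map, and the 50 × 50 region size are assumptions drawn from the specific embodiment that follows:

```python
from statistics import mean

def region_mean_depth(depth_map, center, size=50):
    """Average the depth values of the pixels inside one square
    sampling region.  depth_map is a list of rows, with row 0 at the
    bottom so coordinates match a lower-left origin (an assumption)."""
    cx, cy = center
    half = size // 2
    values = [depth_map[y][x]
              for y in range(cy - half, cy + half)
              for x in range(cx - half, cx + half)]
    return mean(values)
```

The per-region means can then be compared against the known wall distance to estimate the absolute error of each region.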
In a specific embodiment, the resolution of the captured image is 640 × 480. During processing, the image data within 50 pixels of each edge may be discarded, leaving a 540 × 380 region, and a 50 × 50 range directly above the center point of that region is selected as the up-sampled image. Taking the lower left corner of the retained region as the origin, the center point of the up-sampled image is (270, 285), the center point of the down-sampled image is (270, 95), the center point of the left sampled image is (135, 190), and the center point of the right sampled image is (405, 190).
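The geometry above can be reproduced in a few lines. The function name and the quarter-width/quarter-height offsets are assumptions inferred from the stated center points, with coordinates taken in the cropped 540 × 380 image:

```python
def sample_region_centers(width=640, height=480, margin=50):
    """Compute the four sampling-region centers of the embodiment.

    Coordinates are in the margin-stripped image, origin at its
    lower-left corner; the regions sit a quarter of the cropped
    width/height away from the center (an inferred assumption)."""
    w, h = width - 2 * margin, height - 2 * margin  # 540 x 380
    cx, cy = w // 2, h // 2                          # (270, 190)
    return {
        "up":    (cx, cy + h // 4),   # directly above the center
        "down":  (cx, cy - h // 4),   # directly below the center
        "left":  (cx - w // 4, cy),   # left of the center
        "right": (cx + w // 4, cy),   # right of the center
    }
```

With the defaults this reproduces the four center points listed in the text.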
Optionally, as shown in fig. 4, the shooting distances include a first distance and a second distance, and determining the error characteristic of each depth module according to the plurality of depth data specifically includes: step S1042: determining the theoretical ratio of the first distance to the second distance and the actual ratio of the two measured depth data; step S1044: determining the deviation ratio of the actual ratio to the theoretical ratio.
In a specific embodiment, basic measurements are taken at only two distances, namely the first distance and the second distance. When detecting the error characteristic, the theoretical ratio is obtained first, i.e. the ratio of the first distance to the second distance. The actual depth data measured by the depth module at the first distance and at the second distance are likewise divided to generate the actual ratio. Comparing the actual ratio with the theoretical ratio yields a parameter corresponding to the absolute error of the depth module, namely the deviation ratio.
In a specific embodiment, the first distance is 3 m, the second distance is 1 m, and the theoretical ratio is 3:1. If the actual ratio is 2.7:1, the deviation ratio is 90%; if the actual ratio is 3.3:1, the deviation ratio is 110%.
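This computation can be sketched as follows. The function and argument names are illustrative assumptions; the deviation ratio is expressed as the measured depth ratio over the theoretical distance ratio, in percent, matching the worked example above:

```python
def deviation_ratio(depth_far, depth_near, dist_far=3.0, dist_near=1.0):
    """Deviation ratio (percent) of the measured depth ratio against
    the theoretical distance ratio for one depth module."""
    theoretical = dist_far / dist_near   # e.g. 3 m / 1 m = 3.0
    actual = depth_far / depth_near      # ratio of the two measurements
    return 100.0 * actual / theoretical
```

A module measuring 2.7 m and 1.0 m at the two distances thus scores 90%, and one measuring 3.3 m and 1.0 m scores 110%.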
Further, determining three mutually matched depth modules according to the error characteristics and determining the relative position of each depth module specifically includes: screening, from the plurality of depth modules, those whose deviation ratio falls within a first ratio range; determining the three depth modules whose deviation ratios are closest to each other; and placing the module with the best deviation ratio in the middle, the worst at the bottom, and the remaining one at the top.
When the three matched depth modules and their specific placement positions are finally screened, a preliminary screening is performed first: only the depth modules whose deviation ratio falls within the first ratio range are retained, so that depth modules whose absolute error is too large to meet the selection requirements are removed.
In one embodiment, the first ratio range is 90% to 110%, and depth modules below 90% or above 110% are rejected.
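The screening and placement rules above can be sketched as follows. The function name, the input format, and the interpretation of "best" as the deviation ratio closest to the ideal 100% are assumptions:

```python
from itertools import combinations

def match_triple(deviation_ratios, low=90.0, high=110.0):
    """Keep modules whose deviation ratio lies in [low, high], pick
    the three whose ratios are closest together, and assign positions:
    best (nearest 100%) in the middle, worst at the bottom, the
    remaining one at the top.  Returns module indices per position."""
    kept = [(i, r) for i, r in enumerate(deviation_ratios) if low <= r <= high]
    # the trio with the smallest spread of deviation ratios
    trio = min(combinations(kept, 3),
               key=lambda c: max(r for _, r in c) - min(r for _, r in c))
    by_quality = sorted(trio, key=lambda m: abs(m[1] - 100.0))
    best, middle_quality, worst = by_quality
    return {"middle": best[0], "top": middle_quality[0], "bottom": worst[0]}
```

For instance, given ratios [95, 96, 97, 85, 120, 104], the 85% and 120% modules are rejected, the 95/96/97 trio is selected, and the 97% module is placed in the middle.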
As shown in fig. 5, the present embodiment provides a triple depth module matching device 900. The triple depth module matching device comprises a data detection module 901, an error determination module 902 and a matching module 903.
The data detection module 901 is configured to determine, for each depth module among a plurality of depth modules, the depth data captured at a plurality of shooting distances; the error determining module 902 is configured to determine an error characteristic of each depth module according to the plurality of depth data; and the matching module 903 is configured to determine three mutually matched depth modules according to the error characteristics and to determine the relative position of each depth module.
Optionally, the data detection module is further configured to place the depth module in the calibration space and shoot toward the wall at each shooting distance to obtain a corresponding captured image; determine a plurality of sampling regions in the captured image; and acquire the depth data measured by the pixel points in each sampling region against the wall at that shooting distance.
Optionally, the data detection module is further configured to determine an up-sampled image located above the image center of the captured image, a down-sampled image located below the image center, a left sampled image located to the left of the image center, and a right sampled image located to the right of the image center.
Optionally, the error determining module is configured to determine the theoretical ratio of the first distance to the second distance and the actual ratio of the two depth data, and to determine the deviation ratio of the actual ratio to the theoretical ratio.
Optionally, the matching module is further configured to screen, from the plurality of depth modules, those whose deviation ratio falls within a first ratio range; determine the three depth modules whose deviation ratios are closest to each other; and place the module with the best deviation ratio in the middle, the worst at the bottom, and the remaining one at the top.
The triple depth module matching device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be a mobile terminal or a non-mobile terminal. By way of example, the mobile terminal may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted mobile terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile terminal may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiment of the present application is not particularly limited.
The triple depth module matching device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present application are not limited specifically.
The triple depth module matching device provided by the embodiment of the application can realize each process realized by the method embodiments of fig. 1 to fig. 4, and is not repeated here for avoiding repetition.
Optionally, as shown in fig. 6, an embodiment of the present application further provides a mobile terminal 100, which includes a processor 1110, a memory 1109, and a program or an instruction that is stored in the memory 1109 and can be executed on the processor 1110, where the program or the instruction is executed by the processor 1110 to implement each process of the embodiment of the triple depth module matching method, and can achieve the same technical effect, and is not described herein again to avoid repetition.
It should be noted that the mobile terminal in the embodiment of the present application includes the mobile terminal and the non-mobile terminal described above.
Fig. 7 is a schematic diagram of a hardware structure of a mobile terminal implementing an embodiment of the present application.
The mobile terminal 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 1109, a processor 1110, and the like.
Those skilled in the art will appreciate that the mobile terminal 100 may further include a power supply (e.g., a battery) for supplying power to the various components; the power supply may be logically coupled to the processor 1110 via a power management system, so as to manage charging, discharging, and power consumption through the power management system. The mobile terminal structure shown in fig. 7 does not constitute a limitation of the mobile terminal, and the mobile terminal may include more or fewer components than those shown, combine some components, or arrange components differently; this will not be described again.
The processor 1110 is configured to determine, for each depth module among the plurality of depth modules, depth data captured at a plurality of shooting distances; determine an error characteristic of each depth module according to the plurality of depth data; and determine three mutually matched depth modules according to the error characteristics along with the relative position of each depth module.
Through the above scheme, for a plurality of depth modules, the basic data of each depth module is first acquired so that the modules can subsequently be screened, classified, and matched. Specifically, each depth module captures images at a plurality of shooting distances, yielding a plurality of captured depth data sets. Analyzing these depth data yields the error characteristic of each depth module, so that the modules can later be matched and selected according to their error characteristics: three matched depth modules are determined, together with the relative positions in which they are combined, thereby completing the matching.
Optionally, the processor 1110 is further configured to implement the following steps: placing the depth module in the calibration space and shooting toward the wall at each shooting distance to obtain a corresponding captured image; determining a plurality of sampling regions in the captured image; and acquiring the depth data measured by the pixel points in each sampling region against the wall at that shooting distance.
Optionally, the processor 1110 is further configured to determine an up-sampled image located above the center of the image of the captured image, a down-sampled image located below the center of the image, a left-sampled image located to the left of the center of the image, and a right-sampled image located to the right of the center of the image.
Optionally, the processor 1110 is further configured to determine a theoretical ratio of the first distance and the second distance and an actual ratio of the two depth data; and determining the deviation ratio of the theoretical ratio and the actual ratio.
Optionally, the processor 1110 is further configured to screen, from the plurality of depth modules, those whose deviation ratio satisfies the first ratio range; determine the three depth modules whose deviation ratios are closest to each other; and place the module with the best deviation ratio in the middle, the worst at the bottom, and the remaining one at the top.
It should be understood that, in the embodiment of the present application, the input Unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, and the Graphics Processing Unit 1041 processes image data of a still picture or a video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071 is also referred to as a touch screen. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 1109 may be used for storing software programs and various data including, but not limited to, application programs and an operating system. Processor 1110 may integrate an application processor that handles primarily operating systems, user interfaces, applications, etc. and a modem processor that handles primarily wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1110.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the triple depth module matching method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the mobile terminal of the above embodiment. Readable storage media include computer-readable storage media, such as a computer Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction, so as to implement each process of the above triple deep module matching method embodiment, and achieve the same technical effect, and in order to avoid repetition, the details are not repeated here.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of additional like elements in a process, method, article, or apparatus comprising that element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved; for example, the methods described may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (13)

1. A triple depth module matching method, comprising:
determining, for each depth module among a plurality of depth modules, depth data captured at a plurality of shooting distances;
determining an error characteristic of each depth module according to the plurality of depth data;
determining three mutually matched depth modules according to the error characteristics and determining the relative position of each depth module.
2. The triple depth module matching method according to claim 1, wherein determining, for each depth module among a plurality of depth modules, depth data obtained by shooting at a plurality of shooting distances comprises:
placing the depth module in a calibration space and shooting toward a wall at each shooting distance to obtain a corresponding captured image;
determining a plurality of sampling regions in the captured image;
and acquiring depth data measured by pixel points in each sampling region against the wall at the shooting distance.
3. The triple depth module matching method according to claim 2,
wherein the determining a plurality of sampling regions in the captured image specifically comprises:
determining an upper sampling region located above the image center of the captured image, a lower sampling region located below the image center, a left sampling region located to the left of the image center, and a right sampling region located to the right of the image center.
4. The triple depth module matching method according to any one of claims 1 to 3, wherein the shooting distances comprise a first distance and a second distance, and determining the error characteristic of each depth module according to the plurality of depth data comprises:
determining a theoretical ratio of the first distance to the second distance and an actual ratio of the two corresponding depth data;
and determining a deviation ratio between the theoretical ratio and the actual ratio.
5. The triple depth module matching method according to claim 4, wherein determining three mutually matched depth modules according to the error characteristics and determining the relative position of each depth module comprises:
screening, from the plurality of depth modules, depth modules whose deviation ratio falls within a first ratio range;
determining, according to the deviation ratios, the three depth modules whose deviation ratios are closest to one another;
and placing, among the three depth modules, the one with the best deviation ratio in the middle, the one with the worst deviation ratio at the bottom, and the remaining one at the top.
6. A triple depth module matching apparatus, comprising:
a data detection module, configured to determine, for each depth module among a plurality of depth modules, depth data obtained by shooting at a plurality of shooting distances;
an error determination module, configured to determine an error characteristic of each depth module according to the plurality of depth data;
and a matching module, configured to determine three mutually matched depth modules according to the error characteristics and determine a relative position of each depth module.
7. The triple depth module matching apparatus according to claim 6, wherein the data detection module is further configured to place the depth module in a calibration space and shoot toward a wall at each shooting distance to obtain a corresponding captured image; determine a plurality of sampling regions in the captured image; and acquire depth data measured by pixel points in each sampling region against the wall at the shooting distance.
8. The triple depth module matching apparatus according to claim 7, wherein the data detection module is further configured to determine an upper sampling region located above the image center of the captured image, a lower sampling region located below the image center, a left sampling region located to the left of the image center, and a right sampling region located to the right of the image center.
9. The triple depth module matching apparatus according to any one of claims 6 to 8, wherein the shooting distances comprise a first distance and a second distance, and the error determination module is configured to determine a theoretical ratio of the first distance to the second distance and an actual ratio of the two corresponding depth data, and to determine a deviation ratio between the theoretical ratio and the actual ratio.
10. The triple depth module matching apparatus according to claim 9, wherein the matching module is further configured to screen, from the plurality of depth modules, depth modules whose deviation ratio falls within a first ratio range; determine, according to the deviation ratios, the three depth modules whose deviation ratios are closest to one another; and place, among the three depth modules, the one with the best deviation ratio in the middle, the one with the worst deviation ratio at the bottom, and the remaining one at the top.
11. A mobile terminal comprising a processor, a memory and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the triple depth module matching method of any one of claims 1 to 5.
12. A readable storage medium having stored thereon a program or instructions which, when executed by a processor, implement the steps of the triple depth module matching method of any one of claims 1 to 5.
13. A chip comprising a processor and a communication interface, the communication interface being coupled to the processor, the processor being configured to execute a program or instructions to implement the steps of the triple depth module matching method of any of claims 1 to 5.
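The matching flow of claims 4 and 5 (deviation ratio between the theoretical and measured distance ratios, screening by a ratio range, picking the three closest modules, and placing best/worst/remaining in the middle/bottom/top positions) can be sketched as follows. This is an illustrative reading only: the function names, the exact deviation-ratio formula, and the example ratio range are assumptions, not definitions taken from the patent.

```python
def deviation_ratio(first_distance, second_distance, depth_first, depth_second):
    """Deviation between the theoretical ratio of the two shooting distances
    and the ratio actually measured by one depth module (cf. claims 4 and 9).
    The normalization by the theoretical ratio is an assumed convention."""
    theoretical = first_distance / second_distance
    actual = depth_first / depth_second
    return abs(theoretical - actual) / theoretical

def match_three_modules(modules, ratio_range=(0.0, 0.05)):
    """modules: list of (module_id, deviation_ratio) pairs.
    Returns (top, middle, bottom) module ids per claims 5 and 10.
    ratio_range stands in for the patent's 'first ratio range'."""
    lo, hi = ratio_range
    # Screen out modules whose deviation ratio falls outside the ratio range.
    candidates = [(mod, r) for mod, r in modules if lo <= r <= hi]
    if len(candidates) < 3:
        raise ValueError("fewer than three qualifying depth modules")
    # Sort by deviation ratio, then pick the window of three consecutive
    # modules with the smallest spread, i.e. the three closest to one another.
    candidates.sort(key=lambda m: m[1])
    best_start = min(range(len(candidates) - 2),
                     key=lambda i: candidates[i + 2][1] - candidates[i][1])
    trio = candidates[best_start:best_start + 3]  # ascending deviation ratio
    best_m, mid_m, worst_m = trio[0][0], trio[1][0], trio[2][0]
    # Best deviation ratio in the middle, worst at the bottom, remaining on top.
    return mid_m, best_m, worst_m
```

For example, with modules A(0.010), B(0.012), C(0.011), D(0.040), E(0.200), module E is screened out by the range, the closest trio is A/C/B, and the placement would be C on top, A in the middle, B at the bottom.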
CN202211304233.1A 2022-10-24 2022-10-24 Triple depth module matching method and device, mobile terminal, medium and chip Pending CN115695679A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211304233.1A CN115695679A (en) 2022-10-24 2022-10-24 Triple depth module matching method and device, mobile terminal, medium and chip

Publications (1)

Publication Number Publication Date
CN115695679A (en) 2023-02-03

Family

ID=85099204

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211304233.1A Pending CN115695679A (en) 2022-10-24 2022-10-24 Triple depth module matching method and device, mobile terminal, medium and chip

Country Status (1)

Country Link
CN (1) CN115695679A (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106873112A (en) * 2017-01-03 2017-06-20 深圳市比亚迪电子部品件有限公司 A kind of dual camera assemble method
CN108931202A (en) * 2018-07-13 2018-12-04 Oppo广东移动通信有限公司 Detection method and device, electronic device, computer equipment and readable storage medium storing program for executing
CN108965525A (en) * 2018-07-13 2018-12-07 Oppo广东移动通信有限公司 Detection method and device, terminal, computer equipment and readable storage medium storing program for executing
CN109598763A (en) * 2018-11-30 2019-04-09 Oppo广东移动通信有限公司 Camera calibration method, device, electronic equipment and computer readable storage medium
CN109990734A (en) * 2018-01-03 2019-07-09 浙江舜宇智能光学技术有限公司 Depth information camera module precision automatic checkout system and its accuracy checking method
CN111212280A (en) * 2019-12-27 2020-05-29 杭州艾芯智能科技有限公司 Method and system for testing depth camera module, computer equipment and storage medium
CN111385558A (en) * 2018-12-28 2020-07-07 浙江舜宇智能光学技术有限公司 TOF camera module precision measurement method and system thereof
CN111402188A (en) * 2018-12-28 2020-07-10 浙江舜宇智能光学技术有限公司 TOF camera module depth measurement evaluation method and TOF camera module depth measurement evaluation device
CN112824929A (en) * 2019-11-20 2021-05-21 南昌欧菲生物识别技术有限公司 Precision measurement method, device and equipment of TOF module
CN114187366A (en) * 2021-12-10 2022-03-15 北京有竹居网络技术有限公司 Camera installation correction method and device, electronic equipment and storage medium
CN114283141A (en) * 2021-12-29 2022-04-05 上海肇观电子科技有限公司 Method, apparatus, electronic device, and medium for evaluating depth image quality
CN114795079A (en) * 2022-05-06 2022-07-29 广州为实光电医疗科技有限公司 Matching calibration method and device for medical endoscope double-camera module
WO2022178666A1 (en) * 2021-02-23 2022-09-01 深圳市艾比森光电股份有限公司 Led display screen and display control method therefor

Similar Documents

Publication Publication Date Title
CN110059685B (en) Character area detection method, device and storage medium
CN107613202B (en) Shooting method and mobile terminal
EP2942753A1 (en) Method and device for image segmentation
CN108668086B (en) Automatic focusing method and device, storage medium and terminal
CN103019537B (en) A kind of image preview method and device
CN111476780A (en) Image detection method and device, electronic equipment and storage medium
WO2013109478A1 (en) Systems and methods for mobile image capture and processing
CN109639896A (en) Block object detecting method, device, storage medium and mobile terminal
EP4072131A1 (en) Image processing method and apparatus, terminal and storage medium
CN108961183B (en) Image processing method, terminal device and computer-readable storage medium
CN110619807B (en) Method and device for generating global thermodynamic diagram
CN110442521B (en) Control unit detection method and device
CN108009563B (en) Image processing method and device and terminal
CN112637587B (en) Dead pixel detection method and device
US20150317800A1 (en) Method and device for image segmentation
CN113132695A (en) Lens shadow correction method and device and electronic equipment
US20130137482A1 (en) Perspective correction using a reflection
CN108984097B (en) Touch operation method and device, storage medium and electronic equipment
CN112053360B (en) Image segmentation method, device, computer equipment and storage medium
CN113012211A (en) Image acquisition method, device, system, computer equipment and storage medium
CN115695679A (en) Triple depth module matching method and device, mobile terminal, medium and chip
CN111507144A (en) Touch area acquisition method and device, intelligent device and storage medium
CN112907462B (en) Distortion correction method and system for ultra-wide-angle camera device and shooting device comprising distortion correction method and system
CN112241697B (en) Corner color determination method and device, terminal device and readable storage medium
CN116363174A (en) Parameter calibration method, storage medium, co-processing chip and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination