CN118052752A - Image processing method, device and storage medium - Google Patents


Info

Publication number
CN118052752A
CN118052752A
Authority
CN
China
Prior art keywords
photographing
depth
devices
preview image
photographing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211406248.9A
Other languages
Chinese (zh)
Inventor
李国盛
张明华
李浩瀚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202211406248.9A priority Critical patent/CN118052752A/en
Priority to EP23165996.2A priority patent/EP4369728A1/en
Priority to US18/194,063 priority patent/US20240163416A1/en
Publication of CN118052752A publication Critical patent/CN118052752A/en

Landscapes

  • Studio Devices (AREA)

Abstract

The disclosure relates to an image processing method, an image processing device, and a storage medium. The image processing method is applied to a first photographing device and includes the following steps: determining that the first photographing device is triggered to perform depth information detection based on a plurality of photographing devices, and sending a depth information detection instruction to one or more second photographing devices, where the depth information detection instruction is used to control the first photographing device and the one or more second photographing devices to synchronously photograph multi-frame depth images; photographing depth images synchronously with the one or more second photographing devices based on the depth information detection instruction; and performing image processing based on the depth images synchronously photographed by the first photographing device and the one or more second photographing devices. With this method and device, high-precision, high-resolution depth image information can be obtained.

Description

Image processing method, device and storage medium
Technical Field
The present disclosure relates to the field of image processing, and in particular, to an image processing method, apparatus, and storage medium.
Background
Image depth detection plays an important role in electronic devices such as mobile phones: it can acquire the depth information, three-dimensional size, and spatial information of objects in the environment in real time, providing technical support for scenarios such as motion capture, three-dimensional modeling, and indoor navigation and positioning, with broad consumer-grade and industrial-grade applications.
Cameras in electronic devices such as mobile phones require the support of image depth detection technology in fields such as out-of-focus imaging (bokeh), ranging, and three-dimensional reconstruction. Currently, commonly used image depth detection schemes include time of flight (TOF), binocular stereo vision, structured light, and lidar. All of these are single-device schemes, and each suffers from limited depth accuracy and measurement resolution, or is relatively costly.
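As an illustrative aside (not taken from this disclosure), the binocular stereo scheme mentioned above recovers depth from disparity by triangulation. A minimal sketch, assuming a rectified stereo pair with the focal length given in pixels and the baseline in metres, is:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Triangulated depth Z = f * B / d for a rectified stereo pair.

    focal_px:     focal length in pixels
    baseline_m:   distance between the two cameras in metres
    disparity_px: horizontal pixel offset of the same scene point in the two views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_px * baseline_m / disparity_px

# e.g. a 1000 px focal length, 10 cm baseline, and 50 px disparity give 2 m depth
print(depth_from_disparity(1000.0, 0.10, 50.0))
```

The inverse relationship between disparity and depth is also why stereo accuracy degrades for distant objects: a fixed matching error of one pixel corresponds to a larger depth error as disparity shrinks.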
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an image processing method, apparatus, and storage medium.
According to a first aspect of embodiments of the present disclosure, there is provided an image processing method applied to a first photographing apparatus, the method including:
Determining that the first photographing device is triggered to perform depth information detection based on a plurality of photographing devices, and sending a depth information detection instruction to one or more second photographing devices, where the depth information detection instruction is used to control the first photographing device and the one or more second photographing devices to synchronously photograph multi-frame depth images; photographing depth images synchronously with the one or more second photographing devices based on the depth information detection instruction; and performing image processing based on the depth images synchronously photographed by the first photographing device and the one or more second photographing devices.
In one embodiment, photographing depth images synchronously with the one or more second photographing devices based on the depth information detection instruction includes:
receiving synchronization information fed back by the one or more second photographing devices based on the depth information detection instruction; and, based on the synchronization information, controlling, through the photographing application of the first photographing device in linkage, the photographing applications of the one or more second photographing devices to synchronously photograph the depth images.
In one embodiment, receiving the synchronization information fed back by the one or more second photographing devices based on the depth information detection instruction includes:
Receiving internal parameters of the second photographing devices respectively sent by the one or more second photographing devices based on the depth information detection instruction; determining external parameters between the first photographing device and the one or more second photographing devices based on the internal parameters of the first photographing device and of the one or more second photographing devices, and calibrating the camera of the first photographing device; sending the external parameters between the first photographing device and the one or more second photographing devices to the one or more second photographing devices, respectively; and receiving synchronization information fed back by the one or more second photographing devices after they calibrate based on the external parameters between the first photographing device and the one or more second photographing devices.
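The disclosure does not spell out how the external parameters are derived from the exchanged internal parameters. One common composition, sketched below under the assumption that each camera's pose (R, t) in a shared world frame has already been estimated during calibration (with the convention x_cam = R · x_world + t), is the following; the function names and the world-frame assumption are ours, not the patent's:

```python
def _matmul(A, B):
    """3x3 matrix product (plain-Python sketch, no external dependencies)."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def _transpose(A):
    return [list(row) for row in zip(*A)]

def _matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

def relative_extrinsics(R1, t1, R2, t2):
    """Pose of camera 2 expressed in camera 1's frame.

    Given world-to-camera poses (R1, t1) and (R2, t2), a point transforms as
    x_cam2 = R_rel @ x_cam1 + t_rel, with R_rel = R2 R1^T and
    t_rel = t2 - R_rel t1.
    """
    R_rel = _matmul(R2, _transpose(R1))
    Rt1 = _matvec(R_rel, t1)
    t_rel = [t2[i] - Rt1[i] for i in range(3)]
    return R_rel, t_rel
```

With identical orientations and a pure translation between the devices, the relative rotation is the identity and the relative translation is simply the offset between the two camera centres.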
In one embodiment, the synchronization information includes a field of view of a second preview image captured by a photographing application of the one or more second photographing devices;
the step of controlling, through the photographing application of the first photographing device in linkage, the photographing applications of the one or more second photographing devices to synchronously photograph the depth images based on the synchronization information includes the following steps:
Acquiring a first preview image photographed by the photographing application of the first photographing device, and determining the field of view of the first preview image; acquiring the field of view of a second preview image photographed by the photographing applications of the one or more second photographing devices; and controlling, through the first photographing device in linkage, the one or more second photographing devices to photograph synchronously based on the field of view of the first preview image and the field of view of the second preview image.
In one embodiment, the controlling, by the first photographing apparatus, the one or more second photographing apparatuses to perform synchronous photographing in a coordinated manner based on the field of view of the first preview image and the field of view of the second preview image includes:
Determining a coincident field-of-view region between the field of view of the first preview image and the field of view of the second preview image; adjusting, based on the coincident field-of-view region, the position of the first photographing device and/or the position of the second photographing device so that the field of view of the first preview image is the same as the field of view of the second preview image; and, when it is determined that the field of view of the first preview image is the same as that of the second preview image, controlling, through the first photographing device, the one or more second photographing devices to photograph synchronously.
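The coincident field-of-view determination above can be sketched as follows, under the illustrative assumption (not fixed by the disclosure) that each preview's field of view is approximated by an axis-aligned rectangle (left, top, right, bottom) in a shared scene coordinate system:

```python
def overlap_region(fov1, fov2):
    """Return the coincident field-of-view rectangle, or None when the two
    fields of view do not overlap at all."""
    left = max(fov1[0], fov2[0])
    top = max(fov1[1], fov2[1])
    right = min(fov1[2], fov2[2])
    bottom = min(fov1[3], fov2[3])
    if left >= right or top >= bottom:
        return None
    return (left, top, right, bottom)

def views_match(fov1, fov2, tol=1e-6):
    """Device positions stop being adjusted once the two fields of view coincide."""
    return all(abs(a - b) <= tol for a, b in zip(fov1, fov2))
```

In the described flow, `overlap_region` would guide the position adjustment (grow the overlap), and `views_match` is the condition under which synchronous photographing is triggered.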
According to a second aspect of embodiments of the present disclosure, there is provided an image processing method applied to a second photographing apparatus, the method including:
Receiving a depth information detection instruction sent by a first photographing device, where the depth information detection instruction is used to control the first photographing device and the second photographing device to synchronously photograph multi-frame depth images; and photographing a depth image synchronously with the first photographing device based on the depth information detection instruction.
In one embodiment, based on the depth information detection instruction, capturing a depth image in synchronization with the first photographing apparatus includes:
Based on the depth information detection instruction, feeding back synchronization information to the first photographing device, where the synchronization information is used to instruct the photographing application of the first photographing device to control the photographing applications of the one or more second photographing devices to synchronously photograph depth images.
In one embodiment, based on the depth information detection instruction, feeding back synchronization information to the first photographing apparatus includes:
Based on the depth information detection instruction, sending internal parameters of the second photographing device to the first photographing device; if external parameters between the first photographing device and the one or more second photographing devices, sent by the first photographing device based on the internal parameters of the second photographing device, are received, calibrating the camera of the second photographing device accordingly; and, after determining that the camera calibration of the second photographing device is finished, sending synchronization information to the first photographing device.
In one embodiment, the synchronization information includes a field of view of a second preview image captured by a photographing application of the one or more second photographing devices;
When the field of view of the first preview image of the first photographing device and the field of view of the second preview image are the same, the depth image is photographed synchronously with the first photographing device; and when the two fields of view differ, the position of the first photographing device and/or the position of the second photographing device is adjusted based on the coincident field-of-view region between them, so that the field of view of the first preview image becomes the same as the field of view of the second preview image.
According to a third aspect of embodiments of the present disclosure, there is provided an image processing apparatus including:
A sending unit, configured to determine that the first photographing device is triggered to perform depth information detection based on a plurality of photographing devices, and send a depth information detection instruction to one or more second photographing devices, where the depth information detection instruction is used to control the first photographing device and the one or more second photographing devices to synchronously photograph multi-frame depth images;
A photographing unit for photographing a depth image in synchronization with the one or more second photographing apparatuses based on the depth information detection instruction;
and the processing unit is used for processing the image based on the depth image synchronously shot by the first shooting equipment and the one or more second shooting equipment.
In one embodiment, the photographing unit photographs depth images in synchronization with the one or more second photographing apparatuses based on the depth information detection instruction in such a manner that:
receiving synchronization information fed back by the one or more second photographing devices based on the depth information detection instruction; and, based on the synchronization information, controlling, through the photographing application of the first photographing device in linkage, the photographing applications of the one or more second photographing devices to synchronously photograph the depth images.
In one embodiment, the photographing unit receives the synchronization information fed back by the one or more second photographing devices based on the depth information detection instruction in the following manner:
Receiving internal parameters of the second photographing devices respectively sent by the one or more second photographing devices based on the depth information detection instruction; determining external parameters between the first photographing device and the one or more second photographing devices based on the internal parameters of the first photographing device and of the one or more second photographing devices, and calibrating the camera of the first photographing device; sending the external parameters between the first photographing device and the one or more second photographing devices to the one or more second photographing devices, respectively; and receiving synchronization information fed back by the one or more second photographing devices after they calibrate based on the external parameters between the first photographing device and the one or more second photographing devices.
In one embodiment, the synchronization information includes a field of view of a second preview image captured by a photographing application of the one or more second photographing devices;
the photographing unit controls, based on the synchronization information and through the photographing application of the first photographing device in linkage, the photographing applications of the one or more second photographing devices to synchronously photograph depth images in the following manner:
Acquiring a first preview image photographed by the photographing application of the first photographing device, and determining the field of view of the first preview image; acquiring the field of view of a second preview image photographed by the photographing applications of the one or more second photographing devices; and controlling, through the first photographing device in linkage, the one or more second photographing devices to photograph synchronously based on the field of view of the first preview image and the field of view of the second preview image.
In one embodiment, the photographing unit controls, through the first photographing device in linkage, the one or more second photographing devices to photograph synchronously based on the field of view of the first preview image and the field of view of the second preview image in the following manner:
Determining a coincident field-of-view region between the field of view of the first preview image and the field of view of the second preview image; adjusting, based on the coincident field-of-view region, the position of the first photographing device and/or the position of the second photographing device so that the field of view of the first preview image is the same as the field of view of the second preview image; and, when it is determined that the field of view of the first preview image is the same as that of the second preview image, controlling, through the first photographing device, the one or more second photographing devices to photograph synchronously.
According to a fourth aspect of embodiments of the present disclosure, there is provided an image processing apparatus including:
a receiving unit, configured to receive a depth information detection instruction sent by the first photographing device, where the depth information detection instruction is used to control the first photographing device and the second photographing device to synchronously photograph multi-frame depth images;
and a photographing unit, configured to photograph the depth image synchronously with the first photographing device based on the depth information detection instruction.
In one embodiment, the photographing unit photographs the depth image in synchronization with the first photographing apparatus based on the depth information detection instruction in the following manner:
Based on the depth information detection instruction, feeding back synchronization information to the first photographing device, where the synchronization information is used to instruct the photographing application of the first photographing device to control the photographing applications of the one or more second photographing devices to synchronously photograph depth images.
In one embodiment, the photographing unit feeds back synchronization information to the first photographing device based on the depth information detection instruction in the following manner:
Based on the depth information detection instruction, sending internal parameters of the second photographing device to the first photographing device; if external parameters between the first photographing device and the one or more second photographing devices, sent by the first photographing device based on the internal parameters of the second photographing device, are received, calibrating the camera of the second photographing device accordingly; and, after determining that the camera calibration of the second photographing device is finished, sending synchronization information to the first photographing device.
In one embodiment, the synchronization information includes a field of view of a second preview image captured by a photographing application of the one or more second photographing devices;
When the field of view of the first preview image of the first photographing device and the field of view of the second preview image are the same, the depth image is photographed synchronously with the first photographing device; and when the two fields of view differ, the position of the first photographing device and/or the position of the second photographing device is adjusted based on the coincident field-of-view region between them, so that the field of view of the first preview image becomes the same as the field of view of the second preview image.
According to a fifth aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
A processor; a memory for storing processor-executable instructions;
Wherein the processor is configured to: the method of the first aspect or any implementation of the first aspect is performed.
According to a sixth aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
A processor; a memory for storing processor-executable instructions;
wherein the processor is configured to: the method of the second aspect or any one of the embodiments of the second aspect is performed.
According to a seventh aspect of the disclosed embodiments, there is provided a storage medium having stored therein instructions which, when executed by a processor of a first photographing apparatus, enable the first photographing apparatus to perform the method of the first aspect or any one of the embodiments of the first aspect.
According to an eighth aspect of the disclosed embodiments, there is provided a storage medium having stored therein instructions which, when executed by a processor of a second photographing apparatus, enable the second photographing apparatus to perform the method of the second aspect or any one of the embodiments of the second aspect.
The technical solution provided by the embodiments of the present disclosure can have the following beneficial effects: the first photographing device sends a depth information detection instruction to one or more second photographing devices and controls them to synchronously photograph depth images; image processing is then performed based on the multiple synchronously photographed depth images. Because the first photographing device and the second photographing devices photograph the depth images synchronously, the time-delay problem is avoided, and high-precision, high-resolution depth image information can thus be achieved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a schematic diagram of an image processing method provided in an embodiment of the present disclosure.
Fig. 2 is a flowchart illustrating an image processing method according to an exemplary embodiment.
Fig. 3 is a flowchart illustrating an image processing method according to an exemplary embodiment.
Fig. 4 is a flowchart illustrating a method for calibrating and determining internal and external parameters of a photographing apparatus according to an exemplary embodiment.
FIG. 5 is a flowchart illustrating a method of determining the same preview image view in accordance with an exemplary embodiment.
Fig. 6 is a flowchart illustrating a method of adjusting a photographing field of view according to an exemplary embodiment.
Fig. 7 is a schematic diagram of adjusting fields of view of a first photographing apparatus and a second photographing apparatus according to an embodiment of the present disclosure.
Fig. 8 is a flowchart illustrating an image processing method according to an exemplary embodiment.
Fig. 9 is a flowchart illustrating an image processing method according to an exemplary embodiment.
Fig. 10 is a flowchart illustrating an image processing method according to an exemplary embodiment.
Fig. 11 is a flow chart illustrating an image processing method according to an exemplary embodiment.
Fig. 12 is a block diagram of an image processing apparatus according to an exemplary embodiment.
Fig. 13 is a block diagram of an image processing apparatus according to an exemplary embodiment.
Fig. 14 is a block diagram of an apparatus according to an example embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure.
The image processing method provided by the embodiments of the present disclosure is applied to image processing scenarios, such as ranging, three-dimensional reconstruction, and image bokeh.
In the related art, image depth detection may be applied to an intelligent terminal, such as a mobile terminal equipped with an image acquisition device (e.g., a camera). Image depth detection plays an important role in the intelligent terminal: it can acquire the depth information, three-dimensional size, and spatial information of objects in the environment in real time, providing technical support for scenarios such as motion capture, three-dimensional modeling, and indoor navigation and positioning, with broad consumer-grade and industrial-grade applications. The image acquisition device of the intelligent terminal requires image depth detection support in fields such as image bokeh, ranging, and three-dimensional reconstruction. At present, common image depth detection schemes on intelligent terminals include TOF, binocular visible light, structured light, and lidar.
TOF requires multiple rounds of sampling and integration when detecting the phase offset, so its resource consumption is high and its edge precision is low. The binocular visible-light scheme is a purely visual method with a large computational load; because binocular vision matches images by visual features, featureless regions cause matching to fail, so it is unsuitable for monotonous, texture-poor scenes. The accuracy of the structured-light scheme degrades as the detection distance increases, and it is easily interfered with by ambient light. The lidar scheme is relatively costly. In addition, current depth cameras have low resolution: the resolution of an ordinary RGB camera can be tens or even hundreds of times that of a depth camera. The low-resolution depth map therefore needs to be upsampled to a high resolution consistent with the RGB camera, which requires the texture, boundary, and other content information of objects in the color image, and preserving detail during this process is difficult.
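A toy sketch of the guidance-based upsampling just described is given below: each high-resolution output sample averages the low-resolution depth samples, weighted by how similar the high-resolution color (guide) values are, so depth edges follow color edges. This is a simplified 1-D joint-bilateral scheme; the function, its parameters, and the omission of a spatial weight term are our assumptions for illustration, not the disclosure's algorithm.

```python
import math

def joint_bilateral_upsample_1d(depth_lo, guide_hi, scale, sigma_g=10.0):
    """Upsample a 1-D depth signal to the resolution of a 1-D guide signal.

    depth_lo: low-resolution depth samples
    guide_hi: high-resolution guide (e.g. grayscale intensity) samples
    scale:    integer resolution ratio between guide_hi and depth_lo
    sigma_g:  range-similarity bandwidth for the guide values
    """
    out = []
    for g in guide_hi:
        num = den = 0.0
        for j, d in enumerate(depth_lo):
            # low-res sample j sits at high-res position j * scale
            g_lo = guide_hi[min(j * scale, len(guide_hi) - 1)]
            w = math.exp(-((g - g_lo) ** 2) / (2 * sigma_g ** 2))
            num += w * d
            den += w
        out.append(num / den)
    return out
```

When the guide is constant (no edges), every depth sample is weighted equally and the output is simply their mean, which illustrates why detail preservation depends entirely on contrast in the guide image.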
The present disclosure provides an image processing method that, based on multi-device interconnection, controls a plurality of photographing devices to obtain image depth information, thereby implementing operations such as background blurring, object segmentation, and three-dimensional reconstruction. When image depth detection is implemented this way, depth precision and resolution can be improved relative to a single photographing device without increasing hardware cost.
For convenience of description, a master device among a plurality of photographing devices for depth image processing is referred to as a first photographing device, and a slave device is referred to as a second photographing device.
Fig. 1 is a schematic diagram of an image processing method provided in an embodiment of the present disclosure. As shown in Fig. 1, the first photographing device and the second photographing device adjust their photographed fields of view so that the multiple photographing devices capture images with the same field of view. The first photographing device then controls the multiple second photographing devices to photograph synchronously according to their respective internal parameters (focal length, image center, distortion parameters, and the like) and external parameters (rotation and translation information between the photographing devices), obtaining multiple images. The obtained depth maps are combined for optical simulation, including depth of field, circle-of-confusion radius, light-spot reconstruction, noise matching, and the like.
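As an illustrative aside on the circle-of-confusion term in such optical simulation, the thin-lens model gives the blur-spot diameter for a point away from the focal plane. The formula and parameter names below are standard optics, not taken from the patent itself:

```python
def circle_of_confusion(aperture_mm, focal_mm, focus_dist_mm, subject_dist_mm):
    """Thin-lens circle-of-confusion diameter (mm) for a point at
    subject_dist_mm when the lens is focused at focus_dist_mm:

        c = A * |S2 - S1| / S2 * f / (S1 - f)

    where A is the aperture diameter, f the focal length, S1 the focus
    distance, and S2 the subject distance (all in mm).
    """
    S1, S2 = focus_dist_mm, subject_dist_mm
    return aperture_mm * abs(S2 - S1) / S2 * focal_mm / (S1 - focal_mm)
```

A point exactly on the focal plane has zero circle-of-confusion diameter; the diameter grows as the subject moves away from the focal plane, which is what a simulated-bokeh pipeline reproduces per pixel from the depth map.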
The image processing method provided by the embodiments of the present disclosure can be applied to photographing devices equipped with a camera, such as mobile phones, tablet computers, all-in-one computers, notebook computers, desktop computers, digital cameras, and video cameras.
In the embodiments of the present disclosure, a plurality of photographing devices are involved. The photographing devices are connected via Bluetooth, wireless LAN, NFC, or similar technologies to achieve multi-device interconnection. The main photographing device that plays the controlling role is called the first photographing device, and the one or more auxiliary photographing devices that connect to the first photographing device, receive photographing instructions, and synchronously capture images are called second photographing devices.
Fig. 2 is a flowchart illustrating an image processing method according to an exemplary embodiment, as shown in fig. 2, for a first photographing apparatus, comprising the steps of:
in step S11, it is determined that the first photographing apparatus is triggered to perform depth information detection based on the plurality of photographing apparatuses, and a depth information detection instruction is sent to one or more second photographing apparatuses, where the depth information detection instruction is used to control the first photographing apparatus and the one or more second photographing apparatuses to synchronously photograph multi-frame depth images.
In step S12, a depth image is photographed in synchronization with one or more second photographing apparatuses based on the depth information detection instruction.
In step S13, image processing is performed based on depth images photographed by the first photographing apparatus in synchronization with one or more second photographing apparatuses.
In the embodiments of the present disclosure, the first photographing device sends a depth information detection instruction to one or more second photographing devices and controls them to synchronously photograph depth images; depth image processing is then performed based on the multiple synchronously photographed depth images. Photographing depth images synchronously across multiple photographing devices avoids the time-delay problem, so high-precision, high-resolution depth image information can be achieved.
In the embodiments of the present disclosure, the first photographing device and the one or more second photographing devices are interconnected by manual or automatic connection. In general, 2 to 8 photographing devices are selected according to a photographing instruction set on the first photographing device, and the devices are controlled to synchronously capture 2 to 8 images.
Taking out-of-focus imaging (bokeh) as an example: bokeh can be understood as the rendering of the out-of-focus portion of the image, also called defocus blur. In a photographing device's portrait mode, the limited space of the device means two lenses must be used: a single device's bokeh requires an ultra-wide-angle lens and a telephoto lens to capture the pictures, whereas multiple photographing devices can obtain depth map information by processing, with a bokeh image algorithm, the pictures captured by each device's main photographing lens. That is, pictures are captured from the multiple viewing angles of multiple photographing devices, the distance between an object and the lens is judged from parallax, the depth information of the image is extracted from that distance, and a blurring operation is applied to the image to achieve the bokeh effect.
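The blurring operation just described can be illustrated by mapping each pixel's estimated depth to a blur radius that grows with distance from the chosen focal plane. The linear mapping and the `strength` and `max_radius` parameters below are illustrative assumptions, not the disclosure's specific algorithm:

```python
def blur_radius(depth, focus_depth, strength=2.0, max_radius=15.0):
    """Per-pixel blur radius (in pixels) for simulated bokeh.

    Pixels at the focal plane stay sharp (radius 0); pixels far from it are
    blurred more, up to a clamp that keeps kernels a practical size.
    """
    return min(max_radius, strength * abs(depth - focus_depth) / focus_depth * 10)
```

In a full pipeline each pixel would then be filtered with a disc kernel of this radius, which is where the light-spot (bokeh-ball) reconstruction mentioned earlier comes in.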
In one embodiment, multi-view depth reconstruction selects the middle frame from the N reference frames captured by the multiple photographing devices as the reference frame of the main viewing angle, and extracts motion information from the neighboring left and right frames to recover the scene expressed by the main viewing angle. Background blurring is then performed on the scene content expressed by the main viewing angle. The main-view reference image captured by the first photographing device serves as the reference image, the images captured by the second photographing devices serve as auxiliary images, and image depth processing yields a depth image with the blurring effect.
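The reference-frame choice described above can be sketched minimally as follows (the function name is ours; for an even N this picks the upper of the two middle frames, an arbitrary tie-break):

```python
def split_reference(frames):
    """Pick the middle frame as the main-view reference; the remaining
    frames become auxiliary views for motion/disparity extraction."""
    mid = len(frames) // 2
    reference = frames[mid]
    auxiliary = frames[:mid] + frames[mid + 1:]
    return reference, auxiliary
```

Choosing the middle view keeps the auxiliary baselines roughly balanced on both sides of the reference, which helps disparity estimation cover occlusions from either direction.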
In the embodiment of the disclosure, when the image processing capability and computing capability of the first photographing device's hardware are limited, the first photographing device can upload its own image and the received images shot by the one or more second photographing devices to an image processing server for image processing. Alternatively, if the first photographing device has sufficient image processing and computing capability, the image depth processing of the plurality of images is carried out by the picture processing software of the first photographing device itself.
According to the embodiment of the disclosure, after the first photographing device sends an instruction to the second photographing device, the first photographing device receives synchronization information fed back by the second photographing device and controls the second photographing device to synchronously photograph the depth image based on the synchronization information.
Fig. 3 is a flowchart illustrating an image processing method according to an exemplary embodiment, as shown in fig. 3, for a first photographing apparatus, comprising the steps of:
in step S21, synchronization information fed back by the one or more second devices based on the depth information detection instruction is received.
In the embodiment of the disclosure, the first photographing device and the second photographing devices are connected via Bluetooth, wireless local area network or NFC to realize information communication. The first photographing device sends a depth information detection instruction to one or more second photographing devices; the instruction includes internal and external parameter information and synchronization information. After receiving the instruction, the one or more second photographing devices feed synchronization information back to the first photographing device, and the first photographing device receives the fed-back synchronization information. The synchronization information is used for informing the first photographing device that the establishment of the multi-device interconnection channel is completed and the second photographing devices are ready.
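A hypothetical serialization of the detection instruction and its synchronization acknowledgement might look like the following. All field names are illustrative assumptions; the disclosure does not specify a wire format:

```python
import json

def build_depth_detection_instruction(intrinsics, extrinsics, session_id):
    """Serialize the depth-information detection instruction the first
    device sends to each second device over the interconnect channel."""
    return json.dumps({
        "type": "depth_info_detection",
        "session": session_id,
        "intrinsics": intrinsics,   # focal length, image center, distortion
        "extrinsics": extrinsics,   # rotation / translation between devices
        "sync": {"ready_ack_required": True},
    })

def build_sync_ack(session_id, device_id):
    """Synchronization info a second device feeds back once it is ready."""
    return json.dumps({"type": "sync_ready", "session": session_id,
                       "device": device_id})
```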
In step S22, based on the synchronization information, the photographing applications of the one or more second photographing apparatuses are controlled in linkage by the photographing application of the first photographing apparatus to synchronously photograph the depth image.
In the embodiment of the disclosure, the synchronization information includes information about each second device that performs synchronous image capture with the first photographing device, as well as the number of such devices.
In one embodiment, the second photographing device may be of the same model as the first photographing device, or may be a device of a different model, such as a different mobile phone, and the photographing pixel resolution of each device may be the same or different. Because all photographing devices shoot synchronously, the pictures captured by every device during each shot are guaranteed to show the scene at the same moment, which facilitates subsequent image processing.
According to the embodiment of the disclosure, before shooting a depth image, calibration is required for shooting equipment, and internal parameters and external parameters of the first shooting equipment and the second shooting equipment are respectively determined.
Fig. 4 is a flowchart illustrating a method for calibrating and determining internal and external parameters of a photographing apparatus according to an exemplary embodiment, and the method is used for a first photographing apparatus, as shown in fig. 4, and includes the following steps:
in step S31, the internal parameters of the second photographing apparatuses, respectively transmitted by the one or more second photographing apparatuses based on the depth information detection instruction, are received.
In the embodiment of the disclosure, the internal parameters of the photographing apparatus need to be determined, where the internal parameters include focal length, image center, distortion parameters, and the like of the camera, and the following are internal parameter matrices:
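The matrix itself does not survive in this text extraction; under the standard pinhole camera model, consistent with the parameter definitions that follow, the intrinsic matrix is:

```latex
K = \begin{bmatrix}
f_x & \gamma & u_0 \\
0   & f_y    & v_0 \\
0   & 0      & 1
\end{bmatrix}
```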
Wherein the internal parameters of the photographing equipment are respectively: f is the focal length, in millimeters; fx is the focal length along the x-axis direction, in pixels; fy is the focal length along the y-axis direction, in pixels; (u0, v0) are the principal point coordinates (relative to the imaging plane), also in pixels; γ is a coordinate-axis tilt parameter, ideally 0. The intrinsic matrix is a property of the camera itself, and its parameters can be obtained through calibration.
In step S32, external parameters between the first photographing apparatus and the one or more second photographing apparatuses are determined based on the internal parameters of the first photographing apparatus and the one or more second photographing apparatuses, and camera calibration is performed on the first photographing apparatus.
In the embodiment of the disclosure, an external parameter of a photographing apparatus needs to be determined, where the external parameter includes rotation and translation information of a plurality of photographing apparatuses, and the following external parameter matrix is:
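The matrix referenced here does not survive in this text extraction; in the standard formulation it combines the rotation matrix R and translation vector T as:

```latex
\begin{bmatrix} R & T \\ 0_{1\times 3} & 1 \end{bmatrix},
\qquad X_{\text{cam}} = R\,X_{\text{world}} + T
```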
wherein the external parameters of the photographing device describe the world coordinate system in the camera coordinate system. R is the rotation parameter, the product of the rotation matrices about each axis, with rotation angles (φ, ω, θ) about the respective axes. T is the translation parameter (Tx, Ty, Tz).
In addition, the calibration of the photographing equipment is preparatory work for multi-view depth image photographing; its purpose is to determine the internal parameters, external parameters and distortion parameters of the camera.
In step S33, external parameters between the first photographing apparatus and the one or more second photographing apparatuses are transmitted to the one or more second photographing apparatuses, respectively.
In the embodiment of the disclosure, image processing algorithms including ranging, object separation, three-dimensional reconstruction and bokeh photographing require that the first photographing device and the one or more second photographing devices set corresponding internal parameters and external parameters.
In step S34, synchronization information fed back after calibration by the one or more second devices based on external parameters between the first photographing device and the one or more second photographing devices is received.
In the embodiment of the disclosure, a first photographing device receives internal parameters of a second photographing device, which are respectively sent by one or more second photographing devices based on a depth information detection instruction, wherein the internal parameters comprise a focal length, an image center, distortion parameters and the like of a camera; and determining external parameters between the first photographing device and one or more second photographing devices based on the internal parameters, wherein the external parameters comprise rotation and translation information of the photographing devices, and calibrating the camera of the first photographing device. The first photographing device sends external parameters between the first photographing device and one or more second photographing devices to the one or more second photographing devices respectively; the first photographing device receives synchronization information fed back by the one or more second devices after calibration based on external parameters between the first photographing device and the one or more second photographing devices. After the synchronous information is obtained, the first photographing device controls two or more photographing devices to simultaneously photograph scenes.
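The relative extrinsics the first device distributes to a second device can be derived from each camera's world-to-camera pose. A sketch under the standard convention X_cam = R·X_world + T (the function name is illustrative):

```python
import numpy as np

def relative_extrinsics(R1, T1, R2, T2):
    """Pose of camera 2 relative to camera 1, given each camera's
    world-to-camera extrinsics. If X_c1 = R1 Xw + T1 and
    X_c2 = R2 Xw + T2, then X_c2 = R_rel X_c1 + T_rel with:"""
    R_rel = R2 @ R1.T
    T_rel = T2 - R_rel @ T1
    return R_rel, T_rel
```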
In one embodiment, multiple photographing devices may be used to photograph the same scene from multiple angles, and the depth of the scene may be recovered by calculating the distance from a point in the scene to the camera by triangulation or the like.
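A minimal linear (DLT) triangulation sketch for the technique mentioned above, assuming two calibrated cameras with known 3x4 projection matrices; this is a standard method, not necessarily the one used by the disclosure:

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one scene point observed by two
    calibrated cameras. P1, P2 are 3x4 projection matrices; x1, x2 are
    the pixel observations (u, v) in each view."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]               # nullspace of A (smallest singular vector)
    return X[:3] / X[3]      # homogeneous -> Euclidean coordinates
```

With identity intrinsics, a camera at the origin and a second camera offset by one baseline unit, a point at depth 5 projects to (0, 0) and (-0.2, 0), and the triangulation recovers (0, 0, 5).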
According to an embodiment of the present disclosure, before performing depth image photographing, it is necessary to determine that the first photographing apparatus and the second photographing apparatus are in the same photographing field of view.
Fig. 5 is a flowchart illustrating a method of determining the same preview image view, as shown in fig. 5, for a first photographing apparatus, according to an exemplary embodiment, comprising the steps of:
In step S41, a first preview image captured by a photographing application of the first photographing apparatus is acquired, and a field of view of the first preview image is determined.
In step S42, a field of view of a second preview image captured by a photographing application of one or more second photographing apparatuses is acquired.
In step S43, based on the field of view of the first preview image and the field of view of the second preview image, one or more second photographing devices are controlled by the first photographing device in linkage to perform synchronous photographing.
In the embodiment of the present disclosure, the photographing application of the first photographing apparatus may be the same application program as the photographing application of the second photographing apparatus, or a different one. The first preview image captured by the photographing application of the first photographing device is displayed on the display screen of the first photographing device; as the main image for subsequent photographing, its field of view is taken as the reference field of view. The second preview images captured by the photographing applications of the one or more second photographing devices are displayed on the display screens of the second photographing devices, and their fields of view are determined. A second device whose second preview image has the same field of view as the first preview image, identified by human-eye observation or computer image-matching techniques, is selected as a second device for synchronously shooting images with the first photographing device. The field of view of the first preview image of the first photographing device and the field of view of the second preview image of the second photographing device are adjusted and kept the same, so that images with the same field of view are photographed, facilitating depth processing of the plurality of images.
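As a crude stand-in for the computer image-matching step mentioned above, two grayscale previews can be compared by normalized cross-correlation; the threshold and function name are assumptions, and a real implementation would use feature-point matching instead:

```python
import numpy as np

def views_match(preview_a, preview_b, threshold=0.9):
    """Decide whether two grayscale preview frames show the same field of
    view, via normalized cross-correlation of the whole frames."""
    a = preview_a - preview_a.mean()
    b = preview_b - preview_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0:
        return False  # flat frames carry no matchable structure
    return float((a * b).sum() / denom) >= threshold
```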
According to an embodiment of the present disclosure, before determining the second device corresponding to a second preview image identical to the field of view of the first preview image, the first photographing device and the second photographing device are adjusted to photograph the same field of view.
Fig. 6 is a flowchart illustrating a method of adjusting a photographing view according to an exemplary embodiment, as shown in fig. 6, for a first photographing apparatus, including the steps of:
In step S51, a superimposed field of view region between the field of view of the first preview image and the field of view of the second preview image is determined.
In step S52, the position of the first photographing apparatus and/or the position of the second photographing apparatus is adjusted so that the field of view of the first preview image is the same as the field of view of the second preview image based on the overlapping field of view regions.
In step S53, when it is determined that the field of view of the first preview image is the same as the field of view of the second preview image, the one or more second photographing devices are controlled by the first photographing device to perform synchronous photographing.
In the embodiment of the disclosure, before determining the second device corresponding to the second preview image that is identical to the field of view of the first preview image, a coinciding field of view area between the field of view of the first preview image and the field of view of the second preview image needs to be determined, and based on the coinciding field of view area, the position of the first photographing device and/or the position of the second photographing device is adjusted so that the field of view of the first preview image is identical to the field of view of the second preview image.
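Determining the coinciding field-of-view region can be sketched as an axis-aligned rectangle intersection, assuming the two framed view regions have been mapped into a shared coordinate frame (the representation is illustrative):

```python
def overlap_box(box_a, box_b):
    """Axis-aligned overlap between two framed view regions, each given as
    (left, top, right, bottom); returns None when they do not overlap."""
    left, top = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    right, bottom = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    if left >= right or top >= bottom:
        return None
    return (left, top, right, bottom)
```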
In the embodiment of the disclosure, after receiving the focusing point coordinates returned by the second photographing device, the first photographing device performs optical simulation in combination with the depth map it has obtained, wherein the optical simulation includes depth-of-field rendering, circle-of-confusion radius computation, light spot reconstruction, noise matching and the like, so as to obtain the depth map after depth image processing.
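The circle-of-confusion radius computation mentioned above can be illustrated with the standard thin-lens formula; this is textbook optics offered as context, not the disclosure's exact simulation:

```python
def circle_of_confusion(focal_m, f_number, focus_dist_m, subject_dist_m):
    """Thin-lens circle-of-confusion diameter (metres) for a subject at
    subject_dist_m when the lens is focused at focus_dist_m:
    c = A * f * |s - d| / (s * (d - f)), with aperture A = f / N."""
    aperture = focal_m / f_number
    return abs(aperture * focal_m * (subject_dist_m - focus_dist_m) /
               (subject_dist_m * (focus_dist_m - focal_m)))
```

A subject exactly at the focus distance yields zero blur; defocused subjects yield a positive diameter that drives the blur radius per pixel.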
Fig. 7 is a schematic diagram of adjusting the fields of view of a first photographing apparatus and a second photographing apparatus according to an embodiment of the present disclosure. As shown in fig. 7, taking two mobile phones as the photographing devices, the field-of-view adjustment is performed after establishing multi-phone interconnection. Referring to fig. 7, after the cameras of the two mobile phones are turned on, feature-point matching is performed between the first preview image and the second preview image, the overlapping image fields of view are framed in the preview images of the two phones respectively, and arrows displayed on the camera preview interface guide the user to adjust the angle of the phone so that the image fields of view of the two phones coincide. When the fields of view substantially coincide, the phone prompts the user to hold still and prompts the host to press the photographing button to photograph, while the other interconnected phones remain stationary.
In one embodiment, the photographing device is fixed on a rotatable support, and the support can be manually or electrically adjusted to enable the visual field of images of two mobile phones to be consistent.
The following describes the process in which the second photographing apparatus feeds back synchronization information based on the depth information detection instruction sent by the first photographing apparatus, adjusts its photographing field of view with the first photographing apparatus, and photographs depth images in synchronization with the first photographing apparatus.
Fig. 8 is a flowchart showing an image processing method for a second photographing apparatus, as shown in fig. 8, according to an exemplary embodiment, comprising the steps of:
in step S61, a depth information detection instruction sent by the first photographing apparatus is received, where the depth information detection instruction is used to control the first photographing apparatus and the second photographing apparatus to synchronously photograph multiple frames of depth images.
In step S62, a depth image is photographed in synchronization with the first photographing apparatus based on the depth information detection instruction.
In the embodiment of the disclosure, the second photographing device is connected with the first photographing device through bluetooth, wireless local area network or NFC to realize information communication, and the second photographing device receives a depth information detection instruction sent by the first photographing device, where the instruction includes internal parameter information, external parameter information and synchronization information. The depth information detection instruction is used for controlling the first photographing device and the second photographing device to synchronously photograph multi-frame depth images. The second photographing device is used for synchronously photographing a plurality of images with different visual angles with the first photographing device based on the depth information detection instruction, and transmitting the images with different visual angles synchronously photographed with the first photographing device to the first photographing device after photographing is completed. According to the embodiment of the disclosure, a plurality of depth images are synchronously shot, so that corresponding image processing is conveniently carried out on the basis of the plurality of depth images.
In the embodiment of the disclosure, based on a depth information detection instruction, the second photographing device feeds back synchronization information to the first photographing device, where the synchronization information is used to instruct a photographing application of the first photographing device to control photographing applications of one or more second photographing devices to synchronously photograph depth images. It can be understood that the second photographing apparatus informs the first photographing apparatus that the first photographing apparatus is ready to work before photographing, and the second photographing apparatus is controlled to photograph synchronously by the first photographing apparatus.
After information synchronization and information sharing between the first photographing device and the second photographing device are completed, internal and external parameters photographed by the first photographing device and the second photographing device need to be determined.
Fig. 9 is a flowchart showing an image processing method for a second photographing apparatus, as shown in fig. 9, according to an exemplary embodiment, comprising the steps of:
In step S71, the internal parameters of the second photographing apparatus are respectively transmitted to the first photographing apparatus based on the depth information detection instruction.
In step S72, if external parameters between the first photographing apparatus and one or more second photographing apparatuses transmitted by the first photographing apparatus based on internal parameters of the second photographing apparatus are received, camera calibration is performed on the second photographing apparatuses, respectively.
In the embodiment of the disclosure, the photographing device is calibrated, and the process is mainly used for acquiring the internal parameters of the camera.
In step S73, after determining that the camera calibration of the second device is completed, synchronization information is sent to the first photographing device.
In the embodiment of the disclosure, based on the depth information detection instruction, internal references of the second photographing device are respectively sent to the first photographing device. And if the external parameters between the first photographing equipment and one or more second photographing equipment, which are sent by the first photographing equipment based on the internal parameters of the second photographing equipment, are received, respectively calibrating the cameras of the second photographing equipment. And after determining that the camera calibration of the second equipment is finished, sending synchronization information to the first photographing equipment. The synchronization information is used for informing the first photographing device that the next action can be performed.
Fig. 10 is a flowchart showing an image processing method for a second photographing apparatus, as shown in fig. 10, according to an exemplary embodiment, comprising the steps of:
in step S81, a field of view of a first preview image of a first photographing apparatus is determined.
In step S82, it is determined whether the field of view of the first preview image of the first photographing apparatus is the same as the field of view of the second preview image.
In step S83a, when the field of view of the first preview image of the first photographing apparatus is the same as the field of view of the second preview image, the depth image is photographed in synchronization with the first photographing apparatus.
In step S83b, when the field of view of the first preview image of the first photographing apparatus differs from the field of view of the second preview image, the position of the first photographing apparatus and/or the position of the second photographing apparatus is adjusted, based on the overlapping field-of-view region between the two fields of view, so that the field of view of the first preview image becomes identical to that of the second preview image.
In the embodiment of the disclosure, the field of view of the first preview image of the first photographing device is determined. When this field of view is determined to be the same as the field of view of the second preview image, images of different viewing angles are photographed in synchronization with the first photographing device. Adjusting and keeping the field of view of the first preview image of the first photographing device the same as the field of view of the second preview image of the second photographing device ensures that images with the same field of view are photographed, which facilitates depth image processing of the plurality of images.
In an embodiment of the disclosure, if the field of view of the first preview image of the first photographing apparatus differs from the field of view of the second preview image, the position of the first photographing apparatus and/or the position of the second photographing apparatus is adjusted based on the overlapping field-of-view region between the two fields of view, so that the field of view of the first preview image becomes the same as the field of view of the second preview image.
Fig. 11 is a flow chart illustrating an image processing method according to an exemplary embodiment. As shown in fig. 11, after the photographing apparatus 1 selects a bokeh scene, it may further choose between single-photographing-device bokeh and multi-photographing-device bokeh. If the multi-photographing-device bokeh mode is selected, the multi-photographing-device bokeh flow is executed; otherwise, the normal single-photographing-device bokeh photographing flow is executed. The photographing apparatus 1 sends a bokeh photographing control instruction to the photographing apparatus 2 through the multi-device interconnection channel; the instruction includes the internal parameters (focal length, image center, distortion parameters, etc. of the cameras) and external parameters (rotation and translation information of the two mobile phone cameras) of the photographing apparatus 1 and the photographing apparatus 2. After receiving the instruction, the photographing apparatus 2 returns synchronization information to synchronize with the photographing apparatus 1. After synchronization, the two photographing devices photograph the scene simultaneously. The photographing apparatus 2 transmits the photographed image to the photographing apparatus 1 through the multi-device interconnection channel. After receiving the focusing point coordinates returned by the photographing apparatus 2, the photographing apparatus 1 performs optical simulation in combination with the depth map it has obtained, wherein the optical simulation includes depth-of-field rendering, circle-of-confusion radius computation, light spot reconstruction, noise matching and the like. The photographing apparatus 1 then continues with other post-processing to complete the bokeh photographing process.
In the method, the first photographing device sends a depth information detection instruction to one or more second photographing devices, the instruction including the internal parameters and external parameters calibrated by the photographing devices; based on the synchronization information returned by the second photographing devices, the first photographing device controls the one or more second photographing devices to synchronously photograph depth images, and depth image processing is performed based on the plurality of synchronously photographed depth images. The method and the device can thereby obtain high-precision, high-resolution depth image information in scenarios such as ranging, three-dimensional reconstruction and image bokeh.
Based on the same conception, the embodiment of the disclosure also provides an image processing device.
It will be appreciated that, in order to implement the above-described functions, the image processing apparatus provided in the embodiments of the present disclosure includes corresponding hardware structures and/or software modules that perform the respective functions. The disclosed embodiments may be implemented in hardware or a combination of hardware and computer software, in combination with the various example elements and algorithm steps disclosed in the embodiments of the disclosure. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application, but such implementation is not to be considered as beyond the scope of the embodiments of the present disclosure.
Fig. 12 is a block diagram 100 of an image processing apparatus according to an exemplary embodiment. Referring to fig. 12, the apparatus includes a transmitting unit 101, a photographing unit 102, and a processing unit 103.
A sending unit 101, configured to determine that the first photographing device is triggered to perform depth information detection based on the plurality of photographing devices, and send a depth information detection instruction to one or more second photographing devices, where the depth information detection instruction is used to control the first photographing device and the one or more second photographing devices to synchronously photograph multiple frames of depth images;
A photographing unit 102 for photographing a depth image in synchronization with one or more second photographing apparatuses based on the depth information detection instruction;
And a processing unit 103, configured to perform image processing based on depth images synchronously captured by the first photographing apparatus and the one or more second photographing apparatuses.
In one embodiment, the photographing unit 102 photographs depth images in synchronization with one or more second photographing apparatuses based on the depth information detection instruction in the following manner:
Receiving synchronization information fed back by one or more second devices based on the depth information detection instruction; based on the synchronization information, the photographing applications of the one or more second photographing devices are controlled in a linkage manner through the photographing applications of the first photographing device to synchronously photograph the depth images.
In one embodiment, the capturing unit 102 receives synchronization information fed back by one or more second devices based on the depth information detection instruction in the following manner:
Receiving internal parameters of the second photographing devices respectively sent by one or more second photographing devices based on the depth information detection instruction; determining external parameters between the first photographing device and one or more second photographing devices based on the internal parameters of the first photographing device and the one or more second photographing devices, and calibrating a camera of the first photographing device; transmitting external parameters between the first photographing device and one or more second photographing devices to the one or more second photographing devices respectively; and receiving synchronization information fed back by the one or more second devices after calibration based on external parameters between the first photographing device and the one or more second photographing devices.
In one embodiment, the synchronization information includes a field of view of a second preview image captured by a capture application of the one or more second capture devices;
the photographing unit 102 controls the photographing applications of the one or more second devices to synchronously photograph the depth image by the photographing application of the first photographing device based on the synchronization information in a manner including:
Acquiring a first preview image shot by a shooting application of first shooting equipment, and determining the field of view of the first preview image; acquiring the field of view of a second preview image shot by a shooting application of one or more second shooting devices; and controlling one or more second photographing devices to synchronously photograph by the first photographing device in a linkage manner based on the field of view of the first preview image and the field of view of the second preview image.
In one embodiment, the photographing unit 102 performs, based on the field of view of the first preview image and the field of view of the second preview image, the controlling, by the first photographing apparatus, the one or more second photographing apparatuses to perform the synchronous photographing in a coordinated manner, including:
Determining a coincident field of view region between the field of view of the first preview image and the field of view of the second preview image; based on the overlapped visual field area, adjusting the position of the first photographing device and/or the position of the second photographing device to enable the visual field of the first preview image to be the same as the visual field of the second preview image; and when the visual field of the first preview image is determined to be the same as that of the second preview image, the first photographing device is used for controlling the one or more second photographing devices to synchronously photograph.
Fig. 13 is a block diagram 200 of an image processing apparatus according to an exemplary embodiment. Referring to fig. 13, the apparatus includes a receiving unit 201 and a photographing unit 202.
A receiving unit 201, configured to receive a depth information detection instruction sent by a first photographing device, where the depth information detection instruction is used to control the first photographing device and a second photographing device to synchronously photograph multiple frames of depth images;
And a photographing unit 202 for photographing the depth image in synchronization with the first photographing apparatus based on the depth information detection instruction.
In one embodiment, the photographing unit 202 photographs a depth image in synchronization with the first photographing apparatus based on the depth information detection instruction in a manner including:
Based on the depth information detection instruction, synchronous information is fed back to the first photographing equipment, and the synchronous information is used for indicating the photographing application of the first photographing equipment to control the photographing application of one or more second photographing equipment to synchronously photograph the depth images.
In one embodiment, the photographing unit 202 feeds back the synchronization information to the first photographing device, based on the depth information detection instruction, in the following manner:
Based on the depth information detection instruction, internal parameters of the second photographing devices are respectively sent to the first photographing device;
if external parameters between the first photographing device and the one or more second photographing devices, sent by the first photographing device based on the internal parameters of the second photographing devices, are received, the cameras of the second photographing devices are respectively calibrated;
and after it is determined that the camera calibration of the second photographing devices is completed, the synchronization information is sent to the first photographing device.
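The external parameters exchanged in this calibration step relate the two cameras' coordinate frames. One common formulation (an assumption here; the patent does not specify the math) is the relative rigid transform T_12 = T_2 * inv(T_1), where T_i maps world coordinates into camera i's frame:

```python
# Illustrative pure-Python sketch of relative extrinsics between two cameras.
# Each pose is a 4x4 rigid transform [R | t; 0 0 0 1].

def mat_mul(a, b):
    """4x4 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_rigid(t):
    """Invert a rigid transform: R' = R^T, t' = -R^T t."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]  # transpose of R
    tr = [-sum(r[i][k] * t[k][3] for k in range(3)) for i in range(3)]
    return [r[0] + [tr[0]], r[1] + [tr[1]], r[2] + [tr[2]], [0, 0, 0, 1]]

def relative_extrinsics(t1, t2):
    """Transform from camera 1's frame to camera 2's frame: T_2 * inv(T_1)."""
    return mat_mul(t2, invert_rigid(t1))
```

For example, two cameras offset only along x (translations 1 and 3 in the world frame) yield a relative transform whose translation along x is 2.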
In one embodiment, the synchronization information includes a field of view of a second preview image photographed by the photographing applications of the one or more second photographing devices. When the field of view of the first preview image of the first photographing device is the same as the field of view of the second preview image, the depth image is photographed in synchronization with the first photographing device.
When the field of view of the first preview image of the first photographing device is different from the field of view of the second preview image, the position of the first photographing device and/or the position of the second photographing device is adjusted based on the coincident field-of-view region between the field of view of the first preview image and the field of view of the second preview image, so that the field of view of the first preview image is the same as the field of view of the second preview image.
The specific manner in which the various modules of the apparatus in the above embodiments perform operations has been described in detail in the embodiments of the method, and will not be repeated here.
Fig. 14 is a block diagram of an apparatus 300 according to an exemplary embodiment. Referring to Fig. 14, the apparatus 300 may include one or more of the following components: a processing component 302, a memory 304, a power component 306, a multimedia component 308, an audio component 310, an input/output (I/O) interface 312, a sensor component 314, and a communication component 316.
The processing component 302 generally controls overall operation of the apparatus 300, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 302 may include one or more processors 320 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 302 can include one or more modules that facilitate interactions between the processing component 302 and other components. For example, the processing component 302 may include a multimedia module to facilitate interaction between the multimedia component 308 and the processing component 302.
The memory 304 is configured to store various types of data to support operation at the apparatus 300. Examples of such data include instructions for any application or method operating on the device 300, contact data, phonebook data, messages, pictures, videos, and the like. The memory 304 may be implemented by any type of volatile or non-volatile memory device, or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk.
The power component 306 provides power to the various components of the device 300. The power components 306 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 300.
The multimedia component 308 includes a screen that provides an output interface between the device 300 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 308 includes a front-facing camera and/or a rear-facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the apparatus 300 is in an operating mode, such as a photographing mode or a video mode. Each of the front-facing camera and the rear-facing camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 310 is configured to output and/or input audio signals. For example, the audio component 310 includes a Microphone (MIC) configured to receive external audio signals when the device 300 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 304 or transmitted via the communication component 316. In some embodiments, audio component 310 further comprises a speaker for outputting audio signals.
The I/O interface 312 provides an interface between the processing component 302 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 314 includes one or more sensors for providing status assessments of various aspects of the apparatus 300. For example, the sensor assembly 314 may detect the on/off state of the device 300 and the relative positioning of components, such as the display and keypad of the device 300. The sensor assembly 314 may also detect a change in position of the device 300 or a component of the device 300, the presence or absence of user contact with the device 300, the orientation or acceleration/deceleration of the device 300, and a change in temperature of the device 300. The sensor assembly 314 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 314 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 314 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 316 is configured to facilitate wired or wireless communication between the apparatus 300 and other devices. The device 300 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 316 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 316 further includes a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra-Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 300 may be implemented by one or more Application-Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field-Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements, for executing the methods described above.
In an exemplary embodiment, a non-transitory computer-readable storage medium is also provided, such as the memory 304 including instructions executable by the processor 320 of the apparatus 300 to perform the above-described method. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
It is understood that the term "plurality" in this disclosure means two or more, and other quantifiers are similar. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate that A exists alone, that A and B exist together, or that B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. The singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It is further understood that the terms "first," "second," and the like are used to describe various information, but such information should not be limited to these terms. These terms are only used to distinguish one type of information from another and do not denote a particular order or importance. Indeed, the expressions "first", "second", etc. may be used entirely interchangeably. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure.
It will be further understood that the terms "center," "longitudinal," "transverse," "front," "rear," "upper," "lower," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like, as used herein, refer to an orientation or positional relationship based on that shown in the drawings, merely for convenience in describing the present embodiments and simplifying the description, and do not indicate or imply that the devices or elements referred to must have a particular orientation or be constructed and operated in a particular orientation.
It will be further understood that, unless otherwise specified, "connected" includes both a direct connection with no intervening element present and an indirect connection with an intervening element present.
It will be further understood that although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This application is intended to cover any variations, uses, or adaptations of the disclosure that follow the general principles thereof and include such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the scope of the appended claims.

Claims (15)

1. An image processing method, applied to a first photographing apparatus, comprising:
Determining that the first photographing device is triggered to perform depth information detection based on a plurality of photographing devices, and sending a depth information detection instruction to one or more second photographing devices, wherein the depth information detection instruction is used for controlling the first photographing device and the one or more second photographing devices to synchronously photograph multi-frame depth images;
based on the depth information detection instruction, photographing depth images in synchronization with the one or more second photographing devices;
and performing image processing based on the depth images synchronously photographed by the first photographing device and the one or more second photographing devices.
2. The method of claim 1, wherein the photographing depth images in synchronization with the one or more second photographing devices based on the depth information detection instruction comprises:
receiving synchronization information fed back by the one or more second photographing devices based on the depth information detection instruction;
and controlling, through the photographing application of the first photographing device based on the synchronization information, the photographing applications of the one or more second photographing devices to synchronously photograph the depth image.
3. The method of claim 2, wherein the receiving synchronization information fed back by the one or more second photographing devices based on the depth information detection instruction comprises:
receiving internal parameters of the second photographing devices respectively sent by the one or more second photographing devices based on the depth information detection instruction;
determining external parameters between the first photographing device and the one or more second photographing devices based on the internal parameters of the first photographing device and the one or more second photographing devices, and calibrating the camera of the first photographing device;
sending the external parameters between the first photographing device and the one or more second photographing devices to the one or more second photographing devices, respectively;
and receiving the synchronization information fed back by the one or more second photographing devices after calibration based on the external parameters between the first photographing device and the one or more second photographing devices.
4. The method according to claim 2 or 3, wherein the synchronization information comprises a field of view of a second preview image photographed by the photographing applications of the one or more second photographing devices;
and the controlling, through the photographing application of the first photographing device based on the synchronization information, the photographing applications of the one or more second photographing devices to synchronously photograph the depth image comprises:
Acquiring a first preview image shot by a shooting application of the first shooting equipment, and determining the field of view of the first preview image;
Acquiring the field of view of a second preview image shot by a shooting application of the one or more second shooting devices;
And controlling the one or more second photographing devices to synchronously photograph by the first photographing device in a linkage mode based on the visual field of the first preview image and the visual field of the second preview image.
5. The method of claim 4, wherein the controlling, by the first photographing device, the one or more second photographing devices to photograph synchronously based on the field of view of the first preview image and the field of view of the second preview image comprises:
determining a coincident field-of-view region between the field of view of the first preview image and the field of view of the second preview image;
adjusting, based on the coincident field-of-view region, the position of the first photographing device and/or the position of the second photographing device so that the field of view of the first preview image is the same as the field of view of the second preview image;
and when it is determined that the field of view of the first preview image is the same as that of the second preview image, controlling, by the first photographing device, the one or more second photographing devices to photograph synchronously.
6. An image processing method, characterized by being applied to a second photographing device, comprising:
receiving a depth information detection instruction sent by a first photographing device, where the depth information detection instruction is used to control the first photographing device and the second photographing device to synchronously photograph multi-frame depth images;
and photographing a depth image in synchronization with the first photographing device based on the depth information detection instruction.
7. The method of claim 6, wherein the photographing a depth image in synchronization with the first photographing device based on the depth information detection instruction comprises:
feeding back synchronization information to the first photographing device based on the depth information detection instruction, where the synchronization information is used to instruct the photographing application of the first photographing device to control the photographing applications of the one or more second photographing devices to synchronously photograph the depth images.
8. The method of claim 7, wherein the feeding back synchronization information to the first photographing device based on the depth information detection instruction comprises:
based on the depth information detection instruction, respectively sending internal parameters of the second photographing devices to the first photographing device;
if external parameters between the first photographing device and the one or more second photographing devices, sent by the first photographing device based on the internal parameters of the second photographing devices, are received, respectively calibrating the cameras of the second photographing devices;
and after determining that the camera calibration of the second photographing devices is completed, sending the synchronization information to the first photographing device.
9. The method of claim 7 or 8, wherein the synchronization information includes a field of view of a second preview image photographed by the photographing applications of the one or more second photographing devices; when the field of view of the first preview image of the first photographing device is the same as the field of view of the second preview image, the depth image is photographed in synchronization with the first photographing device;
and when the field of view of the first preview image of the first photographing device is different from the field of view of the second preview image, the position of the first photographing device and/or the position of the second photographing device is adjusted based on the coincident field-of-view region between the field of view of the first preview image and the field of view of the second preview image, so that the field of view of the first preview image is the same as the field of view of the second preview image.
10. An image processing apparatus, characterized by being applied to a first photographing device, comprising:
A sending unit, configured to determine that the first photographing device is triggered to perform depth information detection based on a plurality of photographing devices, and send a depth information detection instruction to one or more second photographing devices, where the depth information detection instruction is used to control the first photographing device and the one or more second photographing devices to synchronously photograph multi-frame depth images;
a photographing unit, configured to photograph depth images in synchronization with the one or more second photographing devices based on the depth information detection instruction;
and a processing unit, configured to perform image processing based on the depth images synchronously photographed by the first photographing device and the one or more second photographing devices.
11. An image processing apparatus, characterized by being applied to a second photographing device, comprising:
a receiving unit, configured to receive a depth information detection instruction sent by the first photographing device, where the depth information detection instruction is used to control the first photographing device and the second photographing device to synchronously photograph multi-frame depth images;
and a photographing unit, configured to photograph a depth image in synchronization with the first photographing device based on the depth information detection instruction.
12. An image processing apparatus, comprising:
A processor;
a memory for storing processor-executable instructions;
Wherein the processor is configured to: performing the method of any one of claims 1 to 5.
13. An image processing apparatus, comprising:
A processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to: performing the method of any one of claims 6 to 9.
14. A storage medium having instructions stored therein that, when executed by a processor of a first photographing device, enable the first photographing device to perform the method of any of claims 1 to 5.
15. A storage medium having instructions stored therein which, when executed by a processor of a second photographing device, enable the second photographing device to perform the method of any of claims 6 to 9.
CN202211406248.9A 2022-11-10 2022-11-10 Image processing method, device and storage medium Pending CN118052752A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202211406248.9A CN118052752A (en) 2022-11-10 2022-11-10 Image processing method, device and storage medium
EP23165996.2A EP4369728A1 (en) 2022-11-10 2023-03-31 Photographing method and device
US18/194,063 US20240163416A1 (en) 2022-11-10 2023-03-31 Photographing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211406248.9A CN118052752A (en) 2022-11-10 2022-11-10 Image processing method, device and storage medium

Publications (1)

Publication Number Publication Date
CN118052752A 2024-05-17

Family

ID=91050754

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211406248.9A Pending CN118052752A (en) 2022-11-10 2022-11-10 Image processing method, device and storage medium

Country Status (1)

Country Link
CN (1) CN118052752A (en)

Similar Documents

Publication Publication Date Title
EP3067746B1 (en) Photographing method for dual-camera device and dual-camera device
JP6348611B2 (en) Automatic focusing method, apparatus, program and recording medium
EP3544286B1 (en) Focusing method, device and storage medium
WO2015192547A1 (en) Method for taking three-dimensional picture based on mobile terminal, and mobile terminal
CN106791483B (en) Image transmission method and device and electronic equipment
CN111083371A (en) Shooting method and electronic equipment
CN113364965A (en) Shooting method and device based on multiple cameras and electronic equipment
CN115134505B (en) Preview picture generation method and device, electronic equipment and storage medium
US11265529B2 (en) Method and apparatus for controlling image display
US9325975B2 (en) Image display apparatus, parallax adjustment display method thereof, and image capturing apparatus
CN112738399B (en) Image processing method and device and electronic equipment
CN114422687B (en) Preview image switching method and device, electronic equipment and storage medium
CN113099113B (en) Electronic terminal, photographing method and device and storage medium
CN118052752A (en) Image processing method, device and storage medium
US11252341B2 (en) Method and device for shooting image, and storage medium
CN112866555B (en) Shooting method, shooting device, shooting equipment and storage medium
EP4369728A1 (en) Photographing method and device
CN114390189A (en) Image processing method, device, storage medium and mobile terminal
WO2023225910A1 (en) Video display method and apparatus, terminal device, and computer storage medium
CN114125417B (en) Image sensor, image pickup apparatus, image pickup method, image pickup apparatus, and storage medium
CN114268731B (en) Camera switching method, camera switching device and storage medium
CN118055334A (en) Photographing method, photographing device and storage medium
CN115144870A (en) Image shooting method, device, terminal and storage medium
CN118018854A (en) Method, device and storage medium for generating high dynamic range image
CN118052958A (en) Panoramic map construction method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination