WO2023273923A1 - 一种3d背景替换方法、装置、存储介质和终端设备 - Google Patents


Info

Publication number
WO2023273923A1
Authority
WO
WIPO (PCT)
Application number
PCT/CN2022/099532
Other languages
English (en)
French (fr)
Inventor
常玉军
Original Assignee
展讯通信(天津)有限公司
Application filed by 展讯通信(天津)有限公司 filed Critical 展讯通信(天津)有限公司
Publication of WO2023273923A1 publication Critical patent/WO2023273923A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/04 Indexing scheme for image data processing or generation, in general involving 3D image data

Definitions

  • the present application relates to the field of image technology, and in particular to a 3D background replacement method, device, storage medium and terminal equipment.
  • the 3D background replacement is divided into two parts.
  • the first part is to construct a virtual 3D world and control the camera in the virtual 3D world in real time according to the camera pose in the real world, so as to render, in real time, images similar to those captured by the real camera.
  • the second part is to divide the current frame into background and foreground regions through a scene segmentation algorithm, and then composite the image obtained in the first part, as the background, with the foreground of the current image to form a new scene image.
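The compositing in the second part can be sketched as a simple mask blend. The sketch below is illustrative only and assumes a binary segmentation mask (1 = foreground), which the application does not specify.

```python
import numpy as np

def composite(rendered_bg, current_frame, fg_mask):
    """Blend the foreground of the current frame over the rendered
    3D background using a segmentation mask (1 = foreground)."""
    # Expand the single-channel mask so it broadcasts over the RGB channels.
    mask = fg_mask[..., None].astype(np.float32)
    out = mask * current_frame + (1.0 - mask) * rendered_bg
    return out.astype(current_frame.dtype)

# Toy 2x2 RGB frames: the foreground pixel is kept wherever mask == 1.
bg = np.zeros((2, 2, 3), dtype=np.uint8)           # rendered background (black)
frame = np.full((2, 2, 3), 255, dtype=np.uint8)    # current frame (white)
mask = np.array([[1, 0], [0, 1]], dtype=np.uint8)  # diagonal foreground
result = composite(bg, frame, mask)
print(result[0, 0, 0], result[0, 1, 0])  # 255 0
```

In practice the mask would come from the scene segmentation algorithm mentioned above, and a soft (fractional) mask would blend edges more smoothly with the same formula.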
  • IMU is short for Inertial Measurement Unit.
  • embodiments of the present application provide a 3D background replacement method, device, storage medium, and terminal device, so as to improve the stability of 3D background replacement.
  • the embodiment of the present application provides a 3D background replacement method, including:
  • performing offline calibration according to a set number of the feature images and the IMU data corresponding to each of the feature images to generate a first clock offset includes:
  • performing online calibration according to multiple captured images and IMU data corresponding to each captured image to generate a second clock offset includes:
  • the generating a first set of rotation angles according to the first extracted image and the IMU data corresponding to the first extracted image includes:
  • the first extracted image and the IMU data corresponding to the first extracted image are calculated by an image processing technology function to generate a second set of feature points;
  • the second set of feature points is calculated by using an optical flow pyramid function to generate the first set of rotation angles.
  • the generating the first clock offset according to the first set of rotation angles and the acquired second set of rotation angles includes:
  • the generating a third set of rotation angles according to a plurality of the second extracted images and the IMU data corresponding to the second extracted images includes:
  • the third set of feature points is calculated by using an optical flow pyramid function to generate the third set of rotation angles.
  • the generating the second clock offset according to the third set of rotation angles and the acquired fourth set of rotation angles includes:
  • the embodiment of the present application provides a 3D background replacement device, including:
  • the first acquisition module is used to acquire a set number of feature images and inertial measurement unit IMU data corresponding to each of the feature images;
  • the first generation module is used to perform offline calibration according to the set number of the characteristic images and the IMU data corresponding to each of the characteristic images, to generate the first clock offset;
  • the second acquisition module is used to acquire a plurality of captured images and IMU data corresponding to each of the captured images;
  • the second generation module is used to perform online calibration according to multiple captured images and IMU data corresponding to each of the captured images, to generate a second clock offset;
  • a third generating module configured to perform 3D background replacement on each of the captured images according to the first clock offset and the second clock offset, to generate a plurality of background replacement images.
  • an embodiment of the present application provides a storage medium, including a stored program, wherein, when the program is running, the device where the storage medium is located is controlled to execute the above-mentioned 3D background replacement method.
  • an embodiment of the present application provides a terminal device, including a memory and a processor, the memory is used to store information including program instructions, and the processor is used to control the execution of the program instructions, wherein, when the program instructions are loaded and executed by the processor, the steps of the above-mentioned 3D background replacement method are implemented.
  • a set number of feature images and the IMU data corresponding to each feature image are acquired; offline calibration is performed according to the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset; multiple captured images and the IMU data corresponding to each captured image are acquired; online calibration is performed according to the multiple captured images and the IMU data corresponding to each captured image to generate a second clock offset; and 3D background replacement is performed on each captured image according to the first clock offset and the second clock offset to generate multiple background replacement images.
  • the stability of 3D background replacement can be fully ensured according to the first clock offset generated by offline calibration and the second clock offset generated by online calibration.
  • FIG. 1 is a flow chart of a 3D background replacement method provided in an embodiment of the present application
  • Fig. 2 is a schematic diagram of a feature image;
  • FIG. 3 is a flow chart of generating a first clock offset by performing offline calibration according to a set number of feature images and the IMU data corresponding to each feature image in FIG. 1;
  • Fig. 4 is a flow chart of generating a second clock offset by performing online calibration according to multiple captured images and IMU data corresponding to each captured image in Fig. 1;
  • FIG. 5 is a schematic structural diagram of a 3D background replacement device provided in an embodiment of the present application.
  • Fig. 6 is a schematic structural diagram of the first generation module in Fig. 5;
  • Fig. 7 is a schematic structural diagram of the second generation module in Fig. 5;
  • FIG. 8 is a schematic diagram of a terminal device provided in an embodiment of the present application.
  • FIG. 1 is a flowchart of a 3D background replacement method provided in the embodiment of the present application. As shown in FIG. 1 , the method includes:
  • Step 102 acquiring a set number of feature images and IMU data corresponding to each feature image.
  • each step is performed by a terminal device equipped with an image sensor and an IMU sensor, for example, a mobile phone or a tablet computer.
  • the IMU sensor is a device for measuring the three-axis attitude angle (or angular velocity) and acceleration of an object.
  • the IMU sensors of the terminal device include: gyroscope, accelerometer, gravity sensor and geomagnetic sensor.
  • the IMU data includes one of gyroscope data, accelerometer data, gravity sensor data, geomagnetic sensor data or any combination thereof.
  • the user shoots a set number of characteristic images at different camera angles through the terminal device, and obtains IMU data corresponding to each characteristic image.
  • the set quantity can be set according to the actual situation. For example, set the quantity to 100.
  • the feature image includes a time stamp, an exposure time and a rolling shutter time.
  • FIG. 2 is a schematic diagram of a feature image.
  • the feature image is a scene image with obvious feature points. For example, a checkerboard image.
  • Step 104 Perform offline calibration according to a set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset.
  • FIG. 3 is a flow chart of performing offline calibration to generate the first clock offset according to the set number of feature images and the IMU data corresponding to each feature image in FIG. 1. As shown in FIG. 3, step 104 includes:
  • Step 1042 extracting two adjacent feature images among the set number of feature images to generate a first extracted image.
  • Step 1044 generate a first set of rotation angles according to the first extracted image and the IMU data corresponding to the first extracted image.
  • step 1044 includes:
  • Step A1 Calculate the first extracted image and the IMU data corresponding to the first extracted image through an image processing technology function to generate a second set of feature points.
  • the image processing technology function includes the goodFeaturesToTrack function of the OpenCV library.
  • Step A2. Calculate the second set of feature points by using the optical flow pyramid function to generate the first set of rotation angles.
  • the optical flow pyramid function includes the calcOpticalFlowPyrLK function.
  • Step 1046 Generate a first clock offset according to the first set of rotation angles and the acquired second set of rotation angles, where the second set of rotation angles includes the terminal device's own set of rotation angles.
  • the second set of rotation angles includes a set of rotation angles of the terminal device.
  • when the terminal device shoots a set number of feature images at different camera angles, it acquires a set of rotation angles when the feature images are shot at different camera angles.
  • step 1046 includes:
  • Step B1 Generate a first image rotation angle curve according to the first rotation angle set and the acquired time stamp corresponding to the first rotation angle set.
  • the corresponding relationship between the first rotation angle set and the timestamp is stored in the terminal device, and the timestamp corresponding to the first rotation angle set is obtained from the terminal device according to the corresponding relationship between the first rotation angle set and the timestamp.
  • the first image rotation angle curve is generated by taking the first rotation angle set as the ordinate and taking the time stamp corresponding to the first rotation angle set as the abscissa.
  • Step B2 Generate the first IMU angle curve according to the second set of rotation angles and the acquired time stamp corresponding to the second set of rotation angles.
  • the corresponding relationship between the second rotation angle set and the timestamp is stored in the terminal device, and the timestamp corresponding to the second rotation angle set is obtained from the terminal device according to the corresponding relationship between the second rotation angle set and the timestamp.
  • the first IMU angle curve is generated by taking the second set of rotation angles as the ordinate, and taking the time stamp corresponding to the second set of rotation angles as the abscissa.
  • Step B3 generating a plurality of first correlation distances according to the first image rotation angle curve and the first IMU angle curve.
  • the distance between the first image rotation angle curve and the first IMU angle curve is the first correlation distance.
  • the first correlation distance may indicate the degree of matching between the first set of rotation angles and the second set of rotation angles. The smaller the value of the first correlation distance, the higher the degree of matching between the first set of rotation angles and the second set of rotation angles.
  • the time stamp is offset within a set range with a step size of 0.5 ms; the first correlation distance is calculated once for each shift, so as to obtain multiple first correlation distances.
  • Step B4 querying the smallest first correlation distance among the plurality of first correlation distances and the first clock offset corresponding to the smallest first correlation distance.
  • the terminal device stores the corresponding relationship between the first correlation distance and the first clock offset, and the first clock offset corresponding to the smallest first correlation distance can be queried according to this corresponding relationship.
  • the first clock offset is the clock offset between the image sensor and the IMU sensor.
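Steps B1 to B4 can be sketched as a brute-force search over candidate offsets at the stated 0.5 ms step. The +/-50 ms search range and the sum-of-squared-differences distance are assumed stand-ins for the unspecified range and correlation distance.

```python
import numpy as np

def find_clock_offset(img_t, img_angle, imu_t, imu_angle,
                      search_ms=50.0, step_ms=0.5):
    """Slide the image-angle curve over the IMU-angle curve and return the
    time offset (ms) that minimises their distance. A sketch of steps B1-B4:
    both curves use timestamps as the abscissa and rotation angles as the
    ordinate; the distance here is a sum of squared differences after
    resampling the IMU curve at the shifted image timestamps."""
    best_offset, best_dist = 0.0, np.inf
    for shift in np.arange(-search_ms, search_ms + step_ms, step_ms):
        resampled = np.interp(img_t + shift, imu_t, imu_angle)
        dist = float(np.sum((img_angle - resampled) ** 2))
        if dist < best_dist:
            best_offset, best_dist = shift, dist
    return best_offset

# Synthetic check: the IMU curve is the image curve delayed by 8 ms.
img_t = np.arange(0.0, 1000.0, 33.0)        # ~30 fps image timestamps (ms)
img_angle = np.sin(img_t / 120.0)           # image-derived rotation angles
imu_t = np.arange(0.0, 1100.0, 5.0)         # 200 Hz IMU timestamps (ms)
imu_angle = np.sin((imu_t - 8.0) / 120.0)   # same motion, 8 ms later
offset = find_clock_offset(img_t, img_angle, imu_t, imu_angle)
print(offset)  # 8.0
```

The returned offset plays the role of the first clock offset: the shift at which the image-derived and IMU-derived angle curves match best.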
  • Step 106 acquiring multiple captured images and IMU data corresponding to each captured image.
  • a user uses a terminal device to capture multiple captured images at different camera angles, and obtains IMU data corresponding to each captured image.
  • the captured image includes a time stamp, an exposure time, and a rolling shutter time.
  • Step 108 Perform online calibration according to the plurality of captured images and the IMU data corresponding to each captured image to generate a second clock offset.
  • the online calibration runs in parallel with the 3D background replacement.
  • the online calibration runs in a separate thread.
  • the parallel operation of online calibration and 3D background replacement not only does not affect the efficiency of 3D background replacement, but also ensures the accuracy of 3D background replacement in real time, which makes up for the shortcomings of insufficient timeliness of offline calibration.
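A minimal sketch of running the online calibration in a separate thread, as described above; the queue-based hand-off and the placeholder calibration result are assumptions for illustration, not details from this application.

```python
import threading
import queue

frame_queue = queue.Queue()      # frames handed from the capture loop
latest_offset = {"ms": 0.0}      # shared result, refreshed by the worker
lock = threading.Lock()

def online_calibration_worker():
    """Consume batches of frames and refresh the clock offset estimate
    without blocking the 3D background replacement loop."""
    while True:
        frames = frame_queue.get()
        if frames is None:       # sentinel: stop the worker
            break
        # ... extract features, build angle curves, search for the offset ...
        new_offset = 8.0         # placeholder result for this sketch
        with lock:
            latest_offset["ms"] = new_offset

worker = threading.Thread(target=online_calibration_worker, daemon=True)
worker.start()

# Main loop: hand frames to the calibrator and carry on immediately.
frame_queue.put(["frame0", "frame1"])
frame_queue.put(None)
worker.join()
with lock:
    print(latest_offset["ms"])  # 8.0
```

Because the worker only ever updates a shared value under a lock, the replacement loop can read the freshest offset each frame without waiting for a calibration pass to finish.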
  • FIG. 4 is a flow chart of performing online calibration according to multiple captured images and IMU data corresponding to each captured image in FIG. 1 to generate a second clock offset.
  • step 108 includes:
  • Step 1082 extracting a first feature point set of multiple captured images.
  • step 1082 is executed after a new frame of captured image of the image sensor is acquired.
  • Step 1084 judging whether the number of elements in the first feature point set is greater than the set threshold, if yes, execute step 1086 ; if not, the process ends.
  • online calibration is performed only if enough feature points can be extracted from multiple consecutive frames of images.
  • the setting threshold can be set according to the actual situation. For example, if the range of frame rates of multiple captured images is [3,10], the setting threshold is set to 30.
  • if it is judged that the number of elements in the first feature point set is greater than the set threshold, it indicates that enough feature points can be extracted from the consecutive frames of images; if it is judged that the number of elements in the first feature point set is less than or equal to the set threshold, it indicates that enough feature points cannot be extracted from the consecutive frames of images.
  • Step 1086 Extract two adjacent captured images among the plurality of captured images to generate a second extracted image.
  • Step 1088 Generate a third set of rotation angles according to the plurality of second extracted images and the IMU data corresponding to the second extracted images.
  • step 1088 includes:
  • Step C1 Calculate the second extracted image and the IMU data corresponding to the second extracted image through an image processing technology function to generate a third set of feature points.
  • Step C2. Calculate the third set of feature points by using the optical flow pyramid function to generate a third set of rotation angles.
  • Step 1090 Generate a second clock offset according to the third set of rotation angles and the acquired fourth set of rotation angles, where the fourth set of rotation angles includes its own set of rotation angles.
  • the fourth set of rotation angles includes a set of rotation angles of the terminal device.
  • when the terminal device captures multiple captured images at different camera angles, it acquires a set of rotation angles when the captured images are captured at different camera angles.
  • step 1090 includes:
  • Step D1 Generate a second image rotation angle curve according to the third rotation angle set and the acquired time stamp corresponding to the third rotation angle set.
  • the corresponding relationship between the third rotation angle set and the timestamp is stored in the terminal device, and the timestamp corresponding to the third rotation angle set is obtained from the terminal device according to the corresponding relationship between the third rotation angle set and the timestamp.
  • the second image rotation angle curve is generated by taking the third rotation angle set as the ordinate and taking the time stamp corresponding to the third rotation angle set as the abscissa.
  • Step D2 Generate a second IMU angle curve according to the fourth set of rotation angles and the acquired time stamp corresponding to the fourth set of rotation angles.
  • the corresponding relationship between the fourth rotation angle set and the timestamp is stored in the terminal device, and the timestamp corresponding to the fourth rotation angle set is obtained from the terminal device according to the corresponding relationship between the fourth rotation angle set and the timestamp.
  • the second IMU angle curve is generated by taking the fourth rotation angle set as the ordinate and taking the time stamp corresponding to the fourth rotation angle set as the abscissa.
  • Step D3 generating a plurality of second correlation distances according to the second image rotation angle curve and the second IMU angle curve.
  • the distance between the second image rotation angle curve and the second IMU angle curve is the second correlation distance.
  • the second correlation distance may indicate the matching degree between the third rotation angle set and the fourth rotation angle set, and the smaller the value of the second correlation distance is, the higher the matching degree between the third rotation angle set and the fourth rotation angle set is.
  • the time stamp is offset within a set range with a step size of 0.5 ms; the second correlation distance is calculated once for each shift, so as to obtain multiple second correlation distances.
  • Step D4 querying the smallest second correlation distance among the plurality of second correlation distances and the second clock offset corresponding to the smallest second correlation distance.
  • the terminal device stores the corresponding relationship between the second correlation distance and the second clock offset, and the second clock offset corresponding to the smallest second correlation distance can be queried according to this corresponding relationship.
  • the second clock offset is the clock offset between the image sensor and the IMU sensor.
  • Step 110 Perform 3D background replacement on each captured image according to the first clock offset and the second clock offset to generate multiple background replacement images.
  • a set number of feature images and the IMU data corresponding to each feature image are acquired; offline calibration is performed according to the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset; multiple captured images and the IMU data corresponding to each captured image are acquired; online calibration is performed according to the multiple captured images and the IMU data corresponding to each captured image to generate a second clock offset; and 3D background replacement is performed on each captured image according to the first clock offset and the second clock offset to generate multiple background replacement images.
  • the stability of 3D background replacement can be fully ensured according to the first clock offset generated by offline calibration and the second clock offset generated by online calibration.
  • FIG. 5 is a schematic structural diagram of a 3D background replacement device provided by an embodiment of the present application. As shown in Fig. 5, the device includes: a first acquisition module 11, a first generation module 12, a second acquisition module 13, a second generation module 14 and a third generation module 15 .
  • the first acquisition module 11 is used to acquire a set number of feature images and IMU data corresponding to each feature image.
  • the first generation module 12 is configured to perform offline calibration according to a set number of feature images and IMU data corresponding to each feature image, to generate a first clock offset.
  • the second acquiring module 13 is configured to acquire multiple captured images and IMU data corresponding to each captured image.
  • the second generation module 14 is configured to perform online calibration according to multiple captured images and IMU data corresponding to each captured image, to generate a second clock offset.
  • the third generating module 15 is configured to perform 3D background replacement on each captured image according to the first clock offset and the second clock offset, and generate multiple background replacement images.
  • FIG. 6 is a schematic structural diagram of the first generation module 12 in FIG. 5. As shown in FIG. 6, the first generation module 12 includes: a first generation sub-module 121, a second generation sub-module 122 and a third generation sub-module 123.
  • the first generation sub-module 121 is used to extract two adjacent feature images among the set number of feature images to generate a first extracted image.
  • the second generating sub-module 122 is configured to generate a first set of rotation angles according to the first extracted image and the IMU data corresponding to the first extracted image.
  • the third generation sub-module 123 is configured to generate the first clock offset according to the first set of rotation angles and the obtained second set of rotation angles, where the second set of rotation angles includes the terminal device's own set of rotation angles.
  • FIG. 7 is a schematic structural diagram of the second generation module 14 in FIG. 5. As shown in FIG. 7, the second generation module 14 includes: an extraction sub-module 141, a judging sub-module 142, a fourth generation sub-module 143, a fifth generation sub-module 144 and a sixth generation sub-module 145.
  • the extraction sub-module 141 is used for extracting a first feature point set of multiple captured images.
  • the judging submodule 142 is used to judge whether the number of elements in the first feature point set is greater than a set threshold, and if it is judged that the number of elements in the first feature point set is greater than the set threshold, trigger the fourth generation submodule 143 to extract multiple shots Two adjacent captured images in the image are used to generate a second extracted image.
  • the fifth generation sub-module 144 is configured to generate a third set of rotation angles according to a plurality of second extracted images and IMU data corresponding to the second extracted images.
  • the sixth generating submodule 145 is configured to generate the second clock offset according to the third set of rotation angles and the obtained fourth set of rotation angles, where the fourth set of rotation angles includes its own set of rotation angles.
  • the second generation sub-module 122 is specifically used to calculate the first extracted image and the IMU data corresponding to the first extracted image through an image processing technology function to generate a second set of feature points, and to calculate the second set of feature points by using the optical flow pyramid function to generate the first set of rotation angles.
  • the third generation sub-module 123 is specifically configured to generate the first image rotation angle curve according to the first rotation angle set and the acquired time stamp corresponding to the first rotation angle set; generate the first IMU angle curve according to the second rotation angle set and the acquired time stamp corresponding to the second rotation angle set; generate a plurality of first correlation distances according to the first image rotation angle curve and the first IMU angle curve; and query the smallest first correlation distance among the plurality of first correlation distances and the first clock offset corresponding to the smallest first correlation distance.
  • the fifth generation sub-module 144 is specifically used to calculate the second extracted image and the IMU data corresponding to the second extracted image through an image processing technology function to generate a third set of feature points, and to calculate the third set of feature points by using the optical flow pyramid function to generate a third set of rotation angles.
  • the sixth generation sub-module 145 is specifically configured to generate the second image rotation angle curve according to the third rotation angle set and the acquired time stamp corresponding to the third rotation angle set; generate the second IMU angle curve according to the fourth rotation angle set and the acquired time stamp corresponding to the fourth rotation angle set; generate a plurality of second correlation distances according to the second image rotation angle curve and the second IMU angle curve; and query the smallest second correlation distance among the plurality of second correlation distances and the second clock offset corresponding to the smallest second correlation distance.
  • a set number of feature images and the IMU data corresponding to each feature image are acquired; offline calibration is performed according to the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset; multiple captured images and the IMU data corresponding to each captured image are acquired; online calibration is performed according to the multiple captured images and the IMU data corresponding to each captured image to generate a second clock offset; and 3D background replacement is performed on each captured image according to the first clock offset and the second clock offset to generate multiple background replacement images.
  • the stability of 3D background replacement can be fully ensured according to the first clock offset generated by offline calibration and the second clock offset generated by online calibration.
  • the 3D background replacement device provided in this embodiment can be used to implement the above-mentioned 3D background replacement method in FIGS. 1 to 4 .
  • the description will not be repeated here.
  • An embodiment of the present application provides a storage medium, the storage medium includes a stored program, wherein, when the program is running, the device where the storage medium is located is controlled to execute the steps of the above-mentioned 3D background replacement method.
  • for a specific description, please refer to the embodiment of the above-mentioned 3D background replacement method.
  • An embodiment of the present application provides a terminal device, including a memory and a processor.
  • the memory is used to store information including program instructions
  • the processor is used to control the execution of the program instructions.
  • when the program instructions are loaded and executed by the processor, the steps of the above-mentioned 3D background replacement method are implemented. For a specific description, please refer to the embodiment of the above-mentioned 3D background replacement method.
  • FIG. 8 is a schematic diagram of a terminal device provided in an embodiment of the present application.
  • the terminal device 20 of this embodiment includes: a processor 21, a memory 22, and a computer program 23 stored in the memory 22 and operable on the processor 21.
  • when the computer program 23 is executed by the processor 21, the functions of each module/unit in the 3D background replacement device in the embodiment are implemented. To avoid repetition, details are not repeated here.
  • the terminal device 20 includes, but is not limited to, a processor 21 and a memory 22 .
  • FIG. 8 is only an example of the terminal device 20 and does not constitute a limitation on the terminal device 20; it may include more or fewer components than those shown in the figure, or combine certain components, or have different components. For example, the terminal device may also include an input and output device, a network access device, a bus, and the like.
  • the processor 21 can be a central processing unit (Central Processing Unit, CPU), or another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
  • a general-purpose processor can be a microprocessor, or the processor can be any conventional processor, and the like.
  • the storage 22 may be an internal storage unit of the terminal device 20 , such as a hard disk or memory of the terminal device 20 .
  • the memory 22 may also be an external storage device of the terminal device 20, for example, a plug-in hard disk equipped on the terminal device 20, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, a flash memory card (Flash Card) and so on.
  • the memory 22 may also include both an internal storage unit of the terminal device 20 and an external storage device.
  • the memory 22 is used to store computer programs and other programs and data required by the terminal device.
  • the memory 22 can also be used to temporarily store data that has been output or will be output.
  • the disclosed system, device and method can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components can be combined or integrated into another system, or some features can be ignored or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware, or in the form of hardware plus software functional units.
  • the above-mentioned integrated units implemented in the form of software functional units may be stored in a computer-readable storage medium.
  • the above-mentioned software functional units are stored in a storage medium, and include several instructions to enable a computer device (which may be a personal computer, server, or network device, etc.) or a processor (Processor) to execute the methods described in various embodiments of the present application. partial steps.
  • the aforementioned storage media include: U disk, mobile hard disk, read-only memory (Read-Only Memory, ROM), random access memory (Random Access Memory, RAM), magnetic disk or optical disc and other media that can store program codes. .

Landscapes

  • Engineering & Computer Science (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

Embodiments of the present application provide a 3D background replacement method and apparatus, a storage medium, and a terminal device. The method includes: acquiring a set number of feature images and the inertial measurement unit (IMU) data corresponding to each feature image; performing offline calibration based on the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset; acquiring multiple captured images and the IMU data corresponding to each captured image; performing online calibration based on the multiple captured images and the IMU data corresponding to each captured image to generate a second clock offset; and performing 3D background replacement on each captured image based on the first clock offset and the second clock offset to generate multiple background-replaced images. In the technical solution provided by the embodiments of the present application, the first clock offset generated by offline calibration and the second clock offset generated by online calibration together fully ensure the stability of the 3D background replacement.

Description

3D background replacement method and apparatus, storage medium, and terminal device — Technical Field
The present application relates to the field of image technology, and in particular to a 3D background replacement method, apparatus, storage medium, and terminal device.
Background
3D background replacement consists of two parts. The first part constructs a virtual 3D world and controls the camera in that virtual world in real time according to the pose of the real-world camera, so as to render images similar to those captured by the real camera in real time. The second part segments the current frame into a background region and a foreground region using a scene segmentation algorithm, and then fuses the image obtained in the first part, used as the background, with the foreground of the current image to form a new scene image.
In the related art, the rotation angle of the terminal device is computed from adjacent pairs among multiple captured images. The computation is very expensive, making it time-consuming, and the result depends heavily on the scene being captured: if the scene lacks feature points, the computed pose deviates greatly from the actual pose, reducing the stability of the 3D background replacement.
In another related art, inertial measurement unit (IMU) data are used to control the pose of the terminal device, but a clock mismatch between the terminal device's IMU system and its camera system prevents the pose control from being synchronized with the images, again reducing the stability of the 3D background replacement.
Summary of the Application
In view of this, embodiments of the present application provide a 3D background replacement method, apparatus, storage medium, and terminal device to improve the stability of 3D background replacement.
In one aspect, an embodiment of the present application provides a 3D background replacement method, including:
acquiring a set number of feature images and the inertial measurement unit (IMU) data corresponding to each of the feature images;
performing offline calibration based on the set number of feature images and the IMU data corresponding to each of the feature images to generate a first clock offset;
acquiring multiple captured images and the IMU data corresponding to each of the captured images;
performing online calibration based on the multiple captured images and the IMU data corresponding to each of the captured images to generate a second clock offset;
performing 3D background replacement on each of the captured images based on the first clock offset and the second clock offset to generate multiple background-replaced images.
Optionally, performing offline calibration based on the set number of feature images and the IMU data corresponding to each feature image to generate the first clock offset includes:
extracting two adjacent feature images from the set number of feature images to generate first extracted images;
generating a first rotation angle set based on the first extracted images and the IMU data corresponding to the first extracted images;
generating the first clock offset based on the first rotation angle set and an acquired second rotation angle set, the second rotation angle set including the device's own rotation angles.
Optionally, performing online calibration based on the multiple captured images and the IMU data corresponding to each captured image to generate the second clock offset includes:
extracting a first feature point set from the multiple captured images;
determining whether the number of elements in the first feature point set is greater than a set threshold;
if the number of elements in the first feature point set is greater than the set threshold, extracting two adjacent captured images from the multiple captured images to generate second extracted images;
generating a third rotation angle set based on the multiple second extracted images and the IMU data corresponding to the second extracted images;
generating the second clock offset based on the third rotation angle set and an acquired fourth rotation angle set, the fourth rotation angle set including the device's own rotation angles.
Optionally, generating the first rotation angle set based on the first extracted images and their corresponding IMU data includes:
computing a second feature point set from the first extracted images and their corresponding IMU data using an image processing function;
computing the first rotation angle set from the second feature point set using an optical flow pyramid function.
Optionally, generating the first clock offset based on the first rotation angle set and the acquired second rotation angle set includes:
generating a first image rotation angle curve based on the first rotation angle set and the acquired timestamps corresponding to the first rotation angle set;
generating a first IMU angle curve based on the second rotation angle set and the acquired timestamps corresponding to the second rotation angle set;
generating multiple first correlation distances based on the first image rotation angle curve and the first IMU angle curve;
finding the smallest first correlation distance among the multiple first correlation distances and the first clock offset corresponding to that smallest first correlation distance.
Optionally, generating the third rotation angle set based on the multiple second extracted images and their corresponding IMU data includes:
computing a third feature point set from the second extracted images and their corresponding IMU data using the image processing function;
computing the third rotation angle set from the third feature point set using the optical flow pyramid function.
Optionally, generating the second clock offset based on the third rotation angle set and the acquired fourth rotation angle set includes:
generating a second image rotation angle curve based on the third rotation angle set and the acquired timestamps corresponding to the third rotation angle set;
generating a second IMU angle curve based on the fourth rotation angle set and the acquired timestamps corresponding to the fourth rotation angle set;
generating multiple second correlation distances based on the second image rotation angle curve and the second IMU angle curve;
finding the smallest second correlation distance among the multiple second correlation distances and the second clock offset corresponding to that smallest second correlation distance.
In another aspect, an embodiment of the present application provides a 3D background replacement apparatus, including:
a first acquisition module configured to acquire a set number of feature images and the inertial measurement unit (IMU) data corresponding to each of the feature images;
a first generation module configured to perform offline calibration based on the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset;
a second acquisition module configured to acquire multiple captured images and the IMU data corresponding to each of the captured images;
a second generation module configured to perform online calibration based on the multiple captured images and the IMU data corresponding to each captured image to generate a second clock offset;
a third generation module configured to perform 3D background replacement on each captured image based on the first clock offset and the second clock offset to generate multiple background-replaced images.
In another aspect, an embodiment of the present application provides a storage medium storing a program, where, when the program runs, the device on which the storage medium resides is controlled to execute the above 3D background replacement method.
In another aspect, an embodiment of the present application provides a terminal device, including a memory and a processor, the memory being configured to store information including program instructions and the processor being configured to control execution of the program instructions, where the program instructions, when loaded and executed by the processor, implement the steps of the above 3D background replacement method.
In the technical solution of the 3D background replacement method provided by the embodiments of the present application, a set number of feature images and the inertial measurement unit (IMU) data corresponding to each feature image are acquired; offline calibration is performed based on the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset; multiple captured images and the IMU data corresponding to each captured image are acquired; online calibration is performed based on the multiple captured images and the IMU data corresponding to each captured image to generate a second clock offset; and 3D background replacement is performed on each captured image based on the first clock offset and the second clock offset to generate multiple background-replaced images. In this technical solution, the first clock offset generated by offline calibration and the second clock offset generated by online calibration together fully ensure the stability of the 3D background replacement.
Brief Description of the Drawings
To describe the technical solutions of the embodiments of the present application more clearly, the drawings required by the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; for a person of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a flowchart of a 3D background replacement method provided by an embodiment of the present application;
Fig. 2 is a schematic diagram of a feature image;
Fig. 3 is a flowchart of performing offline calibration based on the set number of feature images and the IMU data corresponding to each feature image to generate the first clock offset in Fig. 1;
Fig. 4 is a flowchart of performing online calibration based on the multiple captured images and the IMU data corresponding to each captured image to generate the second clock offset in Fig. 1;
Fig. 5 is a schematic structural diagram of a 3D background replacement apparatus provided by an embodiment of the present application;
Fig. 6 is a schematic structural diagram of the first generation module in Fig. 5;
Fig. 7 is a schematic structural diagram of the second generation module in Fig. 5;
Fig. 8 is a schematic diagram of a terminal device provided by an embodiment of the present application.
Detailed Description of the Embodiments
For a better understanding of the technical solutions of the present application, the embodiments of the present application are described in detail below with reference to the drawings.
It should be clear that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the scope of protection of the present application.
The terms used in the embodiments of the present application are for the purpose of describing particular embodiments only and are not intended to limit the present application. The singular forms "a", "said", and "the" used in the embodiments of the present application and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise.
It should be understood that the term "and/or" used herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may indicate three cases: A alone, both A and B, and B alone. In addition, the character "/" herein generally indicates an "or" relationship between the preceding and following associated objects.
An embodiment of the present application provides a 3D background replacement method. Fig. 1 is a flowchart of the method. As shown in Fig. 1, the method includes:
Step 102: acquire a set number of feature images and the inertial measurement unit (IMU) data corresponding to each feature image.
In the embodiments of the present application, each step is executed by a terminal device equipped with an image sensor and an IMU sensor, for example a mobile phone or a tablet computer.
In the embodiments of the present application, the IMU sensor is a device that measures an object's three-axis attitude angles (or angular velocities) and acceleration. The IMU sensor of the terminal device includes a gyroscope, an accelerometer, a gravity sensor, and a geomagnetic sensor.
In the embodiments of the present application, the IMU data include one of, or any combination of, gyroscope data, accelerometer data, gravity sensor data, and geomagnetic sensor data.
In the embodiments of the present application, the user captures the set number of feature images at different camera angles with the terminal device and acquires the IMU data corresponding to each feature image.
In the embodiments of the present application, the set number can be configured according to the actual situation; for example, the set number is 100.
In the embodiments of the present application, each feature image carries a timestamp, an exposure time, and a rolling shutter time.
In the embodiments of the present application, Fig. 2 is a schematic diagram of a feature image. As shown in Fig. 2, a feature image is a scene image with distinct feature points, for example a checkerboard image.
Step 104: perform offline calibration based on the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset.
In the embodiments of the present application, Fig. 3 is a flowchart of performing offline calibration based on the set number of feature images and the IMU data corresponding to each feature image to generate the first clock offset in Fig. 1. As shown in Fig. 3, step 104 includes:
Step 1042: extract two adjacent feature images from the set number of feature images to generate first extracted images.
In this step, two adjacent frames are extracted from the set number of feature image frames to generate the first extracted images.
Step 1044: generate a first rotation angle set based on the first extracted images and the IMU data corresponding to the first extracted images.
In the embodiments of the present application, step 1044 includes:
Step A1: compute a second feature point set from the first extracted images and their corresponding IMU data using an image processing function.
In the embodiments of the present application, the image processing function includes the goodFeaturesToTrack function of the OpenCV library.
Step A2: compute the first rotation angle set from the second feature point set using an optical flow pyramid function.
In the embodiments of the present application, the optical flow pyramid function includes the calcOpticalFlowPyrLK function.
Step 1046: generate the first clock offset based on the first rotation angle set and an acquired second rotation angle set, the second rotation angle set including the device's own rotation angles.
In the embodiments of the present application, the second rotation angle set includes the rotation angles of the terminal device itself. When the terminal device captures the set number of feature images at different camera angles, the rotation angles at which those feature images were captured are acquired.
In the embodiments of the present application, step 1046 includes:
Step B1: generate a first image rotation angle curve based on the first rotation angle set and the acquired timestamps corresponding to the first rotation angle set.
In the embodiments of the present application, the terminal device stores the correspondence between the first rotation angle set and the timestamps; the timestamps corresponding to the first rotation angle set are obtained from the terminal device based on this correspondence.
In this step, the first image rotation angle curve is generated by plotting the first rotation angle set on the vertical axis against its corresponding timestamps on the horizontal axis.
Step B2: generate a first IMU angle curve based on the second rotation angle set and the acquired timestamps corresponding to the second rotation angle set.
In the embodiments of the present application, the terminal device stores the correspondence between the second rotation angle set and the timestamps; the timestamps corresponding to the second rotation angle set are obtained from the terminal device based on this correspondence.
In this step, the first IMU angle curve is generated by plotting the second rotation angle set on the vertical axis against its corresponding timestamps on the horizontal axis.
Step B3: generate multiple first correlation distances based on the first image rotation angle curve and the first IMU angle curve.
In this step, the distance between the first image rotation angle curve and the first IMU angle curve is the first correlation distance. The first correlation distance indicates how well the first rotation angle set matches the second rotation angle set: the smaller the value, the better the match.
In the embodiments of the present application, the clock deviation between the first rotation angle set and the second rotation angle set is assumed to lie within [-50 ms, +50 ms]; the timestamps are then shifted within this range in steps of 0.5 ms, and a first correlation distance is computed for each shift, yielding multiple first correlation distances.
Step B4: find the smallest first correlation distance among the multiple first correlation distances and the first clock offset corresponding to that smallest first correlation distance.
In the embodiments of the present application, the terminal device stores the correspondence between first correlation distances and first clock offsets, so the first clock offset corresponding to the smallest first correlation distance can be looked up based on this correspondence.
In the embodiments of the present application, the first clock offset is the clock offset between the image sensor and the IMU sensor.
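Steps B1 to B4 amount to a one-dimensional search: shift one angle-versus-time curve against the other over [-50 ms, +50 ms] in 0.5 ms steps, score each candidate shift with a correlation distance, and keep the shift with the smallest distance. The sketch below assumes the correlation distance is the mean squared difference after resampling the IMU curve at the image timestamps; the patent does not fix a particular distance function.

```python
import numpy as np

def find_clock_offset(img_t, img_angles, imu_t, imu_angles,
                      search_ms=50.0, step_ms=0.5):
    """Return the clock offset (ms) that minimizes the correlation
    distance between the image and IMU rotation-angle curves."""
    offsets = np.arange(-search_ms, search_ms + step_ms, step_ms)
    best_offset, best_dist = 0.0, np.inf
    for off in offsets:
        # Shift the IMU timestamps by the candidate offset and
        # resample the IMU angle curve at the image timestamps.
        imu_resampled = np.interp(img_t, imu_t + off, imu_angles)
        # Assumed correlation distance: mean squared difference.
        dist = np.mean((img_angles - imu_resampled) ** 2)
        if dist < best_dist:
            best_offset, best_dist = off, dist
    return best_offset
```

With, say, a 30 Hz image curve and a 200 Hz IMU curve describing the same motion delayed by 20 ms, the sweep lands on the 20 ms candidate because every other shift leaves a residual mismatch between the two curves.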
Step 106: acquire multiple captured images and the IMU data corresponding to each captured image.
In the embodiments of the present application, the user captures multiple images at different camera angles with the terminal device and acquires the IMU data corresponding to each captured image.
In the embodiments of the present application, each captured image carries a timestamp, an exposure time, and a rolling shutter time.
Step 108: perform online calibration based on the multiple captured images and the IMU data corresponding to each captured image to generate a second clock offset.
In the embodiments of the present application, online calibration runs in parallel with the 3D background replacement; to avoid blocking the replacement computation, it runs in a separate thread. Running the two in parallel not only leaves the efficiency of the 3D background replacement unaffected but also guarantees its accuracy in real time, making up for the limited timeliness of offline calibration.
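The separate-thread arrangement can be sketched as follows. The class and method names are illustrative, not from the patent; `_estimate` stands in for the online-calibration steps 1082 to 1090, and a lock keeps reads of the shared offset safe while the worker updates it.

```python
import threading

class OnlineCalibrator:
    """Runs online calibration in a worker thread so the 3D
    background replacement loop is never blocked by it."""

    def __init__(self, initial_offset_ms):
        # Start from the offline (first) clock offset.
        self._offset_ms = initial_offset_ms
        self._lock = threading.Lock()

    def offset_ms(self):
        # Called from the rendering loop; never blocks on calibration.
        with self._lock:
            return self._offset_ms

    def submit(self, frames, imu_data):
        # Calibrate on a snapshot of recent frames in the background.
        t = threading.Thread(target=self._calibrate,
                             args=(frames, imu_data), daemon=True)
        t.start()
        return t

    def _calibrate(self, frames, imu_data):
        new_offset = self._estimate(frames, imu_data)
        with self._lock:
            self._offset_ms = new_offset

    def _estimate(self, frames, imu_data):
        # Placeholder for steps 1082-1090 (feature-point check,
        # optical flow, curve correlation); real code replaces it.
        return self.offset_ms()
```

The rendering loop keeps calling `offset_ms()` every frame, while `submit()` refreshes the value whenever enough feature-rich frames have accumulated.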
In the embodiments of the present application, Fig. 4 is a flowchart of performing online calibration based on the multiple captured images and the IMU data corresponding to each captured image to generate the second clock offset in Fig. 1. As shown in Fig. 4, step 108 includes:
Step 1082: extract a first feature point set from the multiple captured images.
In the embodiments of the present application, step 1082 is executed after a new frame is acquired from the image sensor.
Step 1084: determine whether the number of elements in the first feature point set is greater than a set threshold; if so, execute step 1086; otherwise, the procedure ends.
In the embodiments of the present application, for accuracy of the computation, online calibration is performed only when enough feature points can be extracted from multiple consecutive frames.
In the embodiments of the present application, the set threshold can be configured according to the actual situation; for example, if the frame rate of the multiple captured images lies in the range [3, 10], the set threshold is set to 30.
In the embodiments of the present application, if the number of elements in the first feature point set is greater than the set threshold, enough feature points can be extracted from multiple consecutive frames; if the number of elements is less than or equal to the set threshold, not all consecutive frames yield enough feature points.
Step 1086: extract two adjacent captured images from the multiple captured images to generate second extracted images.
In this step, two adjacent frames are extracted from the multiple captured frames to generate the second extracted images.
Step 1088: generate a third rotation angle set based on the multiple second extracted images and the IMU data corresponding to the second extracted images.
In the embodiments of the present application, step 1088 includes:
Step C1: compute a third feature point set from the second extracted images and their corresponding IMU data using the image processing function.
Step C2: compute the third rotation angle set from the third feature point set using the optical flow pyramid function.
Step 1090: generate the second clock offset based on the third rotation angle set and an acquired fourth rotation angle set, the fourth rotation angle set including the device's own rotation angles.
In the embodiments of the present application, the fourth rotation angle set includes the rotation angles of the terminal device itself. When the terminal device captures multiple images at different camera angles, the rotation angles at which those images were captured are acquired.
In the embodiments of the present application, step 1090 includes:
Step D1: generate a second image rotation angle curve based on the third rotation angle set and the acquired timestamps corresponding to the third rotation angle set.
In the embodiments of the present application, the terminal device stores the correspondence between the third rotation angle set and the timestamps; the timestamps corresponding to the third rotation angle set are obtained from the terminal device based on this correspondence.
In this step, the second image rotation angle curve is generated by plotting the third rotation angle set on the vertical axis against its corresponding timestamps on the horizontal axis.
Step D2: generate a second IMU angle curve based on the fourth rotation angle set and the acquired timestamps corresponding to the fourth rotation angle set.
In the embodiments of the present application, the terminal device stores the correspondence between the fourth rotation angle set and the timestamps; the timestamps corresponding to the fourth rotation angle set are obtained from the terminal device based on this correspondence.
In this step, the second IMU angle curve is generated by plotting the fourth rotation angle set on the vertical axis against its corresponding timestamps on the horizontal axis.
Step D3: generate multiple second correlation distances based on the second image rotation angle curve and the second IMU angle curve.
In this step, the distance between the second image rotation angle curve and the second IMU angle curve is the second correlation distance. The second correlation distance indicates how well the third rotation angle set matches the fourth rotation angle set: the smaller the value, the better the match.
In the embodiments of the present application, the clock deviation between the third rotation angle set and the fourth rotation angle set is assumed to lie within [-50 ms, +50 ms]; the timestamps are then shifted within this range in steps of 0.5 ms, and a second correlation distance is computed for each shift, yielding multiple second correlation distances.
Step D4: find the smallest second correlation distance among the multiple second correlation distances and the second clock offset corresponding to that smallest second correlation distance.
In the embodiments of the present application, the terminal device stores the correspondence between second correlation distances and second clock offsets, so the second clock offset corresponding to the smallest second correlation distance can be looked up based on this correspondence.
In the embodiments of the present application, the second clock offset is the clock offset between the image sensor and the IMU sensor.
Step 110: perform 3D background replacement on each captured image based on the first clock offset and the second clock offset to generate multiple background-replaced images.
In the embodiments of the present application, when multiple consecutive captured frames do not all have enough feature points, 3D background replacement is performed on each captured image based on the first clock offset to generate multiple background-replaced images; when multiple consecutive captured frames all have enough feature points, 3D background replacement is performed on each captured image based on the second clock offset to generate multiple background-replaced images. The offline calibration part fully compensates for quality problems caused by the uncertainty of online calibration.
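The fallback rule just described can be written as a small helper. The parameter names and the convention that the online offset is `None` before the first successful online calibration are assumptions for illustration.

```python
def choose_clock_offset(first_offset_ms, second_offset_ms,
                        feature_counts, threshold):
    """Pick the clock offset used for 3D background replacement.

    Use the online (second) offset only when every one of the recent
    consecutive frames yielded more feature points than the threshold;
    otherwise fall back to the offline (first) offset.
    """
    enough = all(n > threshold for n in feature_counts)
    if enough and second_offset_ms is not None:
        return second_offset_ms
    return first_offset_ms
```

For example, with a threshold of 30, three feature-rich frames select the online offset, while a single feature-poor frame in the window forces the offline fallback.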
In the technical solution provided by the embodiments of the present application, a set number of feature images and the inertial measurement unit (IMU) data corresponding to each feature image are acquired; offline calibration is performed based on the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset; multiple captured images and the IMU data corresponding to each captured image are acquired; online calibration is performed based on the multiple captured images and the IMU data corresponding to each captured image to generate a second clock offset; and 3D background replacement is performed on each captured image based on the first clock offset and the second clock offset to generate multiple background-replaced images. In this technical solution, the first clock offset generated by offline calibration and the second clock offset generated by online calibration together fully ensure the stability of the 3D background replacement.
The technical solution provided by the embodiments of the present application solves the problems, in 3D background fusion, of unsatisfactory fusion caused by inaccurate pose computation of the terminal device and of frame rate drops caused by the excessive time that the pose computation takes.
An embodiment of the present application provides a 3D background replacement apparatus. Fig. 5 is a schematic structural diagram of the apparatus. As shown in Fig. 5, the apparatus includes: a first acquisition module 11, a first generation module 12, a second acquisition module 13, a second generation module 14, and a third generation module 15.
The first acquisition module 11 is configured to acquire a set number of feature images and the inertial measurement unit (IMU) data corresponding to each feature image.
The first generation module 12 is configured to perform offline calibration based on the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset.
The second acquisition module 13 is configured to acquire multiple captured images and the IMU data corresponding to each captured image.
The second generation module 14 is configured to perform online calibration based on the multiple captured images and the IMU data corresponding to each captured image to generate a second clock offset.
The third generation module 15 is configured to perform 3D background replacement on each captured image based on the first clock offset and the second clock offset to generate multiple background-replaced images.
In the embodiments of the present application, Fig. 6 is a schematic structural diagram of the first generation module 12 in Fig. 5. As shown in Fig. 6, the first generation module 12 includes: a first generation submodule 121, a second generation submodule 122, and a third generation submodule 123.
The first generation submodule 121 is configured to extract two adjacent feature images from the set number of feature images to generate first extracted images.
The second generation submodule 122 is configured to generate a first rotation angle set based on the first extracted images and the IMU data corresponding to the first extracted images.
The third generation submodule 123 is configured to generate the first clock offset based on the first rotation angle set and an acquired second rotation angle set, the second rotation angle set including the device's own rotation angles.
In the embodiments of the present application, Fig. 7 is a schematic structural diagram of the second generation module 14 in Fig. 5. As shown in Fig. 7, the second generation module 14 includes: an extraction submodule 141, a judgment submodule 142, a fourth generation submodule 143, a fifth generation submodule 144, and a sixth generation submodule 145.
The extraction submodule 141 is configured to extract a first feature point set from the multiple captured images.
The judgment submodule 142 is configured to determine whether the number of elements in the first feature point set is greater than a set threshold; if the number of elements in the first feature point set is greater than the set threshold, it triggers the fourth generation submodule 143 to extract two adjacent captured images from the multiple captured images to generate second extracted images.
The fifth generation submodule 144 is configured to generate a third rotation angle set based on the multiple second extracted images and the IMU data corresponding to the second extracted images.
The sixth generation submodule 145 is configured to generate the second clock offset based on the third rotation angle set and an acquired fourth rotation angle set, the fourth rotation angle set including the device's own rotation angles.
In the embodiments of the present application, the second generation submodule 122 is specifically configured to compute a second feature point set from the first extracted images and their corresponding IMU data using an image processing function, and to compute the first rotation angle set from the second feature point set using an optical flow pyramid function.
In the embodiments of the present application, the third generation submodule 123 is specifically configured to generate a first image rotation angle curve based on the first rotation angle set and the acquired timestamps corresponding to the first rotation angle set; generate a first IMU angle curve based on the second rotation angle set and the acquired timestamps corresponding to the second rotation angle set; generate multiple first correlation distances based on the first image rotation angle curve and the first IMU angle curve; and find the smallest first correlation distance among the multiple first correlation distances and the first clock offset corresponding to that smallest first correlation distance.
In the embodiments of the present application, the fifth generation submodule 144 is specifically configured to compute a third feature point set from the second extracted images and their corresponding IMU data using the image processing function, and to compute the third rotation angle set from the third feature point set using the optical flow pyramid function.
In the embodiments of the present application, the sixth generation submodule 145 is specifically configured to generate a second image rotation angle curve based on the third rotation angle set and the acquired timestamps corresponding to the third rotation angle set; generate a second IMU angle curve based on the fourth rotation angle set and the acquired timestamps corresponding to the fourth rotation angle set; generate multiple second correlation distances based on the second image rotation angle curve and the second IMU angle curve; and find the smallest second correlation distance among the multiple second correlation distances and the second clock offset corresponding to that smallest second correlation distance.
In the technical solution provided by the embodiments of the present application, a set number of feature images and the inertial measurement unit (IMU) data corresponding to each feature image are acquired; offline calibration is performed based on the set number of feature images and the IMU data corresponding to each feature image to generate a first clock offset; multiple captured images and the IMU data corresponding to each captured image are acquired; online calibration is performed based on the multiple captured images and the IMU data corresponding to each captured image to generate a second clock offset; and 3D background replacement is performed on each captured image based on the first clock offset and the second clock offset to generate multiple background-replaced images. In this technical solution, the first clock offset generated by offline calibration and the second clock offset generated by online calibration together fully ensure the stability of the 3D background replacement.
The 3D background replacement apparatus provided by this embodiment can be used to implement the 3D background replacement method of Figs. 1 to 4 above; for a detailed description, refer to the embodiments of the 3D background replacement method above, which are not repeated here.
An embodiment of the present application provides a storage medium storing a program, where, when the program runs, the device on which the storage medium resides is controlled to execute the steps of the above embodiments of the 3D background replacement method; for a detailed description, refer to those embodiments.
An embodiment of the present application provides a terminal device, including a memory and a processor, the memory being configured to store information including program instructions and the processor being configured to control execution of the program instructions, where the program instructions, when loaded and executed by the processor, implement the steps of the above embodiments of the 3D background replacement method; for a detailed description, refer to those embodiments.
Fig. 8 is a schematic diagram of a terminal device provided by an embodiment of the present application. As shown in Fig. 8, the terminal device 20 of this embodiment includes: a processor 21, a memory 22, and a computer program 23 stored in the memory 22 and executable on the processor 21. When executed by the processor 21, the computer program 23 implements the 3D background replacement method of the embodiments; to avoid repetition, the details are not enumerated here. Alternatively, when executed by the processor 21, the computer program implements the functions of the models/units of the 3D background replacement apparatus of the embodiments; to avoid repetition, the details are not enumerated here.
The terminal device 20 includes, but is not limited to, the processor 21 and the memory 22. A person skilled in the art will understand that Fig. 8 is merely an example of the terminal device 20 and does not constitute a limitation on it; the device may include more or fewer components than shown, or combine certain components, or use different components. For example, the terminal device may also include input/output devices, network access devices, a bus, and the like.
The so-called processor 21 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, and so on. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor, and the like.
The memory 22 may be an internal storage unit of the terminal device 20, such as a hard disk or memory of the terminal device 20. The memory 22 may also be an external storage device of the terminal device 20, for example a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card equipped on the terminal device 20. Further, the memory 22 may include both an internal storage unit of the terminal device 20 and an external storage device. The memory 22 is used to store the computer program as well as other programs and data required by the terminal device. The memory 22 may also be used to temporarily store data that has been output or will be output.
A person skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the system, apparatus, and units described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
In the several embodiments provided by the present application, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; for example, the division into units is only a division by logical function, and in actual implementation there may be other ways of dividing them; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be via interfaces; the indirect coupling or communication connection between apparatuses or units may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist physically on its own, or two or more units may be integrated into one unit. The above integrated units may be implemented in the form of hardware, or in the form of hardware plus software functional units.
The above integrated units implemented in the form of software functional units may be stored in a computer-readable storage medium. The above software functional units are stored in a storage medium and include several instructions to enable a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to execute some of the steps of the methods described in the various embodiments of the present application. The aforementioned storage media include media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above are only preferred embodiments of the present application and are not intended to limit the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application shall fall within the scope of protection of the present application.

Claims (10)

  1. A 3D background replacement method, characterized by comprising:
    acquiring a set number of feature images and the inertial measurement unit (IMU) data corresponding to each of the feature images;
    performing offline calibration based on the set number of feature images and the IMU data corresponding to each of the feature images to generate a first clock offset;
    acquiring multiple captured images and the IMU data corresponding to each of the captured images;
    performing online calibration based on the multiple captured images and the IMU data corresponding to each of the captured images to generate a second clock offset;
    performing 3D background replacement on each of the captured images based on the first clock offset and the second clock offset to generate multiple background-replaced images.
  2. The method according to claim 1, wherein performing offline calibration based on the set number of feature images and the IMU data corresponding to each of the feature images to generate the first clock offset comprises:
    extracting two adjacent feature images from the set number of feature images to generate first extracted images;
    generating a first rotation angle set based on the first extracted images and the IMU data corresponding to the first extracted images;
    generating the first clock offset based on the first rotation angle set and an acquired second rotation angle set, the second rotation angle set comprising the device's own rotation angles.
  3. The method according to claim 1, wherein performing online calibration based on the multiple captured images and the IMU data corresponding to each of the captured images to generate the second clock offset comprises:
    extracting a first feature point set from the multiple captured images;
    determining whether the number of elements in the first feature point set is greater than a set threshold;
    if the number of elements in the first feature point set is greater than the set threshold, extracting two adjacent captured images from the multiple captured images to generate second extracted images;
    generating a third rotation angle set based on the multiple second extracted images and the IMU data corresponding to the second extracted images;
    generating the second clock offset based on the third rotation angle set and an acquired fourth rotation angle set, the fourth rotation angle set comprising the device's own rotation angles.
  4. The method according to claim 2, wherein generating the first rotation angle set based on the first extracted images and the IMU data corresponding to the first extracted images comprises:
    computing a second feature point set from the first extracted images and their corresponding IMU data using an image processing function;
    computing the first rotation angle set from the second feature point set using an optical flow pyramid function.
  5. The method according to claim 2, wherein generating the first clock offset based on the first rotation angle set and the acquired second rotation angle set comprises:
    generating a first image rotation angle curve based on the first rotation angle set and the acquired timestamps corresponding to the first rotation angle set;
    generating a first IMU angle curve based on the second rotation angle set and the acquired timestamps corresponding to the second rotation angle set;
    generating multiple first correlation distances based on the first image rotation angle curve and the first IMU angle curve;
    finding the smallest first correlation distance among the multiple first correlation distances and the first clock offset corresponding to that smallest first correlation distance.
  6. The method according to claim 3, wherein generating the third rotation angle set based on the multiple second extracted images and the IMU data corresponding to the second extracted images comprises:
    computing a third feature point set from the second extracted images and their corresponding IMU data using the image processing function;
    computing the third rotation angle set from the third feature point set using the optical flow pyramid function.
  7. The method according to claim 3, wherein generating the second clock offset based on the third rotation angle set and the acquired fourth rotation angle set comprises:
    generating a second image rotation angle curve based on the third rotation angle set and the acquired timestamps corresponding to the third rotation angle set;
    generating a second IMU angle curve based on the fourth rotation angle set and the acquired timestamps corresponding to the fourth rotation angle set;
    generating multiple second correlation distances based on the second image rotation angle curve and the second IMU angle curve;
    finding the smallest second correlation distance among the multiple second correlation distances and the second clock offset corresponding to that smallest second correlation distance.
  8. A 3D background replacement apparatus, characterized by comprising:
    a first acquisition module configured to acquire a set number of feature images and the inertial measurement unit (IMU) data corresponding to each of the feature images;
    a first generation module configured to perform offline calibration based on the set number of feature images and the IMU data corresponding to each of the feature images to generate a first clock offset;
    a second acquisition module configured to acquire multiple captured images and the IMU data corresponding to each of the captured images;
    a second generation module configured to perform online calibration based on the multiple captured images and the IMU data corresponding to each of the captured images to generate a second clock offset;
    a third generation module configured to perform 3D background replacement on each of the captured images based on the first clock offset and the second clock offset to generate multiple background-replaced images.
  9. A storage medium, characterized by comprising: the storage medium comprises a stored program, wherein, when the program runs, the device on which the storage medium resides is controlled to execute the 3D background replacement method according to any one of claims 1 to 7.
  10. A terminal device, comprising a memory and a processor, the memory being configured to store information comprising program instructions and the processor being configured to control execution of the program instructions, characterized in that the program instructions, when loaded and executed by the processor, implement the steps of the 3D background replacement method according to any one of claims 1 to 7.
PCT/CN2022/099532 2021-06-28 2022-06-17 3D background replacement method and apparatus, storage medium and terminal device WO2023273923A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110721434.0A CN113436349B (zh) 2021-06-28 2021-06-28 3D background replacement method and apparatus, storage medium and terminal device
CN202110721434.0 2021-06-28

Publications (1)

Publication Number Publication Date
WO2023273923A1 true WO2023273923A1 (zh) 2023-01-05

Family

ID=77755042

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/099532 WO2023273923A1 (zh) 2021-06-28 2022-06-17 3D background replacement method and apparatus, storage medium and terminal device

Country Status (2)

Country Link
CN (1) CN113436349B (zh)
WO (1) WO2023273923A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113436349B (zh) * 2021-06-28 2023-05-16 展讯通信(天津)有限公司 3D background replacement method and apparatus, storage medium and terminal device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090315915A1 (en) * 2008-06-19 2009-12-24 Motorola, Inc. Modulation of background substitution based on camera attitude and motion
CN110555882A (zh) * 2018-04-27 2019-12-10 腾讯科技(深圳)有限公司 Interface display method and apparatus, and storage medium
US20200234451A1 (en) * 2019-01-22 2020-07-23 Fyusion, Inc. Automatic background replacement for single-image and multi-view captures
CN112752038A (zh) * 2020-12-28 2021-05-04 广州虎牙科技有限公司 Background replacement method and apparatus, electronic device, and computer-readable storage medium
CN113436349A (zh) * 2021-06-28 2021-09-24 展讯通信(天津)有限公司 3D background replacement method and apparatus, storage medium and terminal device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107241544B (zh) * 2016-03-28 2019-11-26 展讯通信(天津)有限公司 Video stabilization method and apparatus, and camera terminal
WO2019080052A1 (zh) * 2017-10-26 2019-05-02 深圳市大疆创新科技有限公司 Attitude calibration method and device, and unmanned aerial vehicle
CN108988974B (zh) * 2018-06-19 2020-04-07 远形时空科技(北京)有限公司 Time delay measurement method and apparatus, and system for time synchronization of electronic devices
CN109186592B (zh) * 2018-08-31 2022-05-20 腾讯科技(深圳)有限公司 Method and apparatus for visual-inertial navigation information fusion, and storage medium
CN112396639A (zh) * 2019-08-19 2021-02-23 虹软科技股份有限公司 Image alignment method
CN111798489B (zh) * 2020-06-29 2024-03-08 北京三快在线科技有限公司 Feature point tracking method, device, medium, and unmanned device
CN112907629A (zh) * 2021-02-08 2021-06-04 浙江商汤科技开发有限公司 Image feature tracking method and apparatus, computer device, and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090315915A1 (en) * 2008-06-19 2009-12-24 Motorola, Inc. Modulation of background substitution based on camera attitude and motion
CN110555882A (zh) * 2018-04-27 2019-12-10 腾讯科技(深圳)有限公司 Interface display method and apparatus, and storage medium
US20200234451A1 (en) * 2019-01-22 2020-07-23 Fyusion, Inc. Automatic background replacement for single-image and multi-view captures
CN112752038A (zh) * 2020-12-28 2021-05-04 广州虎牙科技有限公司 Background replacement method and apparatus, electronic device, and computer-readable storage medium
CN113436349A (zh) * 2021-06-28 2021-09-24 展讯通信(天津)有限公司 3D background replacement method and apparatus, storage medium and terminal device

Also Published As

Publication number Publication date
CN113436349A (zh) 2021-09-24
CN113436349B (zh) 2023-05-16

Similar Documents

Publication Publication Date Title
WO2020207191A1 (zh) Method and apparatus for determining the occluded region of a virtual object, and terminal device
WO2022022063A1 (zh) Three-dimensional human pose estimation method and related apparatus
WO2018119889A1 (zh) Three-dimensional scene positioning method and apparatus
Tanskanen et al. Live metric 3D reconstruction on mobile phones
TWI678099B (zh) Video processing method and apparatus, and storage medium
JPWO2018047687A1 (ja) Three-dimensional model generation device and three-dimensional model generation method
CN110850961B (zh) Calibration method for a head-mounted display device, and head-mounted display device
WO2019237745A1 (zh) Face image processing method and apparatus, electronic device, and computer-readable storage medium
WO2021031790A1 (zh) Information processing method and apparatus, electronic device, storage medium, and program
WO2023273923A1 (zh) 3D background replacement method and apparatus, storage medium and terminal device
CN109089038A (zh) Augmented reality shooting method and apparatus, electronic device, and storage medium
CN114494388B (zh) Method, apparatus, device, and medium for three-dimensional image reconstruction in a large field-of-view environment
CN113393563A (zh) Method, system, electronic device, and storage medium for automatic keypoint annotation
WO2023160445A1 (zh) Simultaneous localization and mapping method and apparatus, electronic device, and readable storage medium
WO2022174603A1 (zh) Pose prediction method, pose prediction apparatus, and robot
CN115311624A (zh) Slope displacement monitoring method and apparatus, electronic device, and storage medium
WO2021170127A1 (zh) Three-dimensional reconstruction method and apparatus for a half-length portrait
CN108431867B (zh) Data processing method and terminal
CN112396117A (zh) Image detection method and apparatus, and electronic device
TWI740275B (zh) Augmented reality object display device and augmented reality object display method
CN113099266B (zh) Video fusion method, system, medium, and apparatus based on unmanned aerial vehicle POS data
CN114422736B (zh) Video processing method, electronic device, and computer storage medium
CN110849317B (zh) Method for determining the angle between display screens, electronic device, and storage medium
TWI823491B (zh) Depth estimation model optimization method and apparatus, electronic device, and storage medium
TWI779332B (zh) Augmented reality system and its method for anchoring and displaying virtual objects

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22831741

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE