WO2021238317A1 - Panoramic image capture method and device - Google Patents

Panoramic image capture method and device

Info

Publication number
WO2021238317A1
Authority
WO
WIPO (PCT)
Prior art keywords
path
sub
electronic device
image
deviation
Prior art date
Application number
PCT/CN2021/078666
Other languages
English (en)
Chinese (zh)
Inventor
Qi Siyuan (漆思远)
Li Wei (李伟)
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2021238317A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • G06T3/40 Scaling the whole image or part thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • G06T3/40 Scaling the whole image or part thereof
    • G06T3/4038 Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio

Definitions

  • the embodiments of the present application relate to the field of electronic technology, and in particular to a panoramic shooting method and device.
  • the electronic device can support multiple shooting modes and functions.
  • the electronic device can support night scene shooting, skin beautification shooting, time-lapse photography shooting, or panoramic shooting.
  • in panoramic shooting, the mobile phone can guide the user to turn the phone along the guide line 01 shown in Figure 1, so that the phone collects multiple frames of images at different angles and stitches the images in the extension direction of the guide line 01, thereby forming a panoramic image with a wide field of view.
  • this panoramic shooting method lacks novelty and cannot meet the growing diverse shooting needs of users.
  • the embodiments of the present application provide a panoramic shooting method and device, which can stitch images collected at different angles in two mutually perpendicular directions to generate a panoramic image, so that the angle of view of the panoramic image can be expanded in both directions, improving the user's panoramic shooting experience.
  • an embodiment of the present application provides a panoramic shooting method, including: an electronic device enters a panoramic shooting mode of a camera application.
  • the electronic device displays the first guide information on the preview interface, and the first guide information includes the first guide path.
  • the first guide path includes at least two sub-paths that are arranged along a first direction and are parallel to each other; the first direction is parallel to a side of the electronic device, and the first guide path is used to guide the user to turn the electronic device along it during shooting.
  • a guide path is displayed on the preview interface, and the guide path includes at least two sub-paths arranged in a first direction and parallel to each other, so as to guide the user to rotate the electronic device along the guide path during the shooting process.
  • the angle of view of the panoramic image in the first direction can be expanded;
  • the angle of view of the panoramic image can be expanded in a second direction perpendicular to the first direction while the angle of view in the first direction is expanded.
  • the coordinate ranges corresponding to the different sub-paths set along the first direction overlap in the first direction.
  • different sub-paths set along the first direction correspond to the same coordinate range in the first direction.
  • the two ends of the different sub-paths arranged along the first direction are respectively aligned, and the different sub-paths have the same length.
  • the preview interface also includes a stitching preview window for displaying thumbnails of the images collected by the electronic device. The stitching preview window is located at the beginning of the first guide path, so in some cases it may cover and occupy part of the guide path.
  • the entire guide path is a continuous complete path.
  • the sub-paths set along the second direction can guide the user on the order in which to shoot the different sub-paths set along the first direction.
  • the method further includes: after the electronic device detects the shooting operation, displaying second guide information on the shooting interface.
  • the second guide information includes a splicing preview window, a second guide path, and a deviation indicator.
  • the deviation indicator is used to indicate the position of the center line of the image collected by the electronic device, and the deviation indicator moves along the second guide path during the shooting.
  • the second guide path includes the portion of the first guide path that the deviation indicator has not passed.
  • the electronic device displays, in the stitching preview window, the target image obtained by splicing the images collected by the electronic device while the deviation indicator moves along the sub-paths in the first direction; the electronic device stops shooting after the deviation indicator reaches the end of the second guide path, and the target image obtained by the electronic device's splicing is the panoramic image.
  • the electronic device can rotate along the guide path and stitch the images collected while rotating along the sub-paths in the first direction to generate a panoramic image.
  • the shooting interface only displays the unfinished part of the guide path that the deviation indicator has not yet passed.
  • the deviation range indicator line is used to indicate the maximum range within which the center line of the image collected by the electronic device can deviate from the guide path. If the deviation indicator exceeds the range indicated by the deviation range indicator lines, the center of the collected image exceeds the maximum allowed deviation; at this time the image may not contain the content within the cropping range required for stitching, image splicing cannot be performed, and the electronic device can stop the panoramic shooting process.
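The stop condition above can be sketched as a simple band check; the function name and the 3-degree band are illustrative assumptions, not values taken from the application:

```python
def check_deviation(center_offset_deg, max_deviation_deg=3.0):
    """Return 'ok' while the centre line of the captured image stays
    within the deviation range indicator lines, and 'stop' once it
    leaves the band, at which point the cropping range required for
    stitching can no longer be filled."""
    if abs(center_offset_deg) > max_deviation_deg:
        return "stop"  # centre left the croppable band; abort the panorama
    return "ok"
```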
  • the first guide path includes a first sub-path and a third sub-path that are arranged along the first direction and are parallel to each other, and the first sub-path is the initial sub-path.
  • the first guide path further includes a second sub-path arranged along the second direction, and the second sub-path is used to connect the first sub-path and the third sub-path.
  • the deviation indicator mark moves along the first sub-path, the second sub-path, and the third sub-path in sequence.
  • the guide path includes three sub-paths arranged along the first direction and two sub-paths arranged along the second direction.
  • when the deviation indicator moves along the first sub-path, the second guide path includes the portion of the first sub-path that the deviation indicator has not passed and the second to fifth sub-paths, and deviation range indicator lines are displayed on both sides of the first sub-path; the stitching preview window displays the target image obtained by splicing the images collected by the electronic device while the deviation indicator moves along the first sub-path.
  • the second guide path includes the portion of the second sub-path that the deviation indicator has not passed and the third to fifth sub-paths, and deviation range indicator lines are displayed on both sides of the second sub-path;
  • the target image displayed in the stitching preview window is the splicing result corresponding to the first sub-path, obtained by splicing the images collected by the electronic device after the deviation indicator moved to the end of the first sub-path.
  • the second guide path includes the part of the third sub-path that the deviation indicator has not passed and the fourth and fifth sub-paths, and deviation range indicator lines are displayed on both sides of the third sub-path;
  • the target image displayed in the splicing preview window is an image obtained by splicing the image collected by the electronic device to the splicing result corresponding to the first sub-path when the deviation indicator mark moves along the second sub-path.
  • the electronic device displays the guide path of the unfinished shooting and the deviation range indication line of the sub-path along which the electronic device is currently moving.
  • the target image in the stitching preview window is an image generated by splicing the frames collected while the electronic device moves along the sub-paths in the first direction, rather than an image generated by splicing frames from all the sub-paths.
  • the electronic device displaying, in the stitching preview window, the target image obtained by splicing the images collected while the deviation indicator moves along the first sub-path includes: the electronic device maps the i-th frame image I_i of the first sub-path to the cylindrical surface to obtain an image I'_i, where i is an integer greater than 1.
  • the electronic device extracts the feature points F_I,i of I'_i and F_I,i-1 of I'_i-1, where I'_i-1 is the image obtained after the (i-1)-th frame image I_i-1 of the first sub-path is mapped to the cylindrical surface.
  • the electronic device calculates the matching result of F_I,i and F_I,i-1. According to the matching result, the electronic device maps I'_i toward I'_i-1.
  • the electronic device splices the part of the mapped I'_i that falls within the preset first cropping range with the spliced image R_I(i-1) of the first sub-path, so as to obtain the spliced image R_Ii of the first sub-path.
  • the first cropping range is the range defined by cropping lines corresponding to the deviation range indicator lines of the first sub-path and by a left boundary line and a right boundary line preset by the electronic device.
  • the electronic device rotates along the first sub-path; it performs cylindrical mapping on two adjacent frames of images collected during the rotation, extracts their feature points, calculates the matching result from the feature points, and calculates the homography matrix from the matching result, so that the next frame image is mapped onto the previous frame image according to the homography matrix; the mapped next frame image is then cropped and spliced with the previous splicing result, obtaining a new splicing result.
  • cylindrical mapping of the images taken by the electronic device at different angles and in different postures makes the mapped size and imaging characteristics of the same object consistent across images taken at different angles; the images can then be registered and stitched to generate a panoramic image, achieving the visual effect that each part of the panorama has a consistent image scale.
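The pipeline above begins with the cylindrical warp. Below is a minimal sketch of the standard forward cylindrical projection for a pinhole camera, assuming a focal length f in pixels and principal point (cx, cy); the function name is illustrative and not from the application:

```python
import math

def cylindrical_project(x, y, f, cx, cy):
    """Map a pixel (x, y) of a pinhole image onto a cylinder of radius f
    centred on the optical axis at (cx, cy). After this warp, the same
    object keeps roughly the same size across frames taken at different
    yaw angles, which is what lets the later feature matching and
    splicing work on a common surface."""
    theta = math.atan2(x - cx, f)          # horizontal angle of the pixel ray
    h = (y - cy) / math.hypot(x - cx, f)   # normalised height on the cylinder
    return f * theta + cx, f * h + cy      # back to pixel units
```

Points far from the principal point are compressed toward it, which is why straight horizontal lines bow on the cylinder; resampling the warped image uses the inverse of these two formulas.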
  • the method further includes: the electronic device obtains multiple key frames from the image frames collected while rotating along the sub-path in the first direction.
  • the electronic device can display, in the stitching preview window, the target image obtained by splicing the images collected by the electronic device while the deviation indicator moves along the third sub-path, according to the key frames.
  • the electronic device may display, in the stitching preview window, the target image obtained by splicing the images collected by the electronic device while the deviation indicator moves along the fifth sub-path, according to the key frames.
  • the electronic device displaying, in the stitching preview window, the target image obtained by splicing the images collected while the deviation indicator moves along the sub-paths in the first direction includes: the electronic device obtains, from the multiple key frames, a target key frame G_f1 that matches the first frame image A_1 of the third sub-path.
  • the electronic device maps G_f1 to the cylindrical surface to obtain the image G'_f1.
  • the electronic device maps A_1 to the cylindrical surface to obtain the image A'_1, and extracts the feature points F_A,1 of A'_1 and F_A,f1 of G'_f1.
  • the electronic device calculates the matching result of F_A,1 and F_A,f1.
  • according to the matching result, the electronic device maps A'_1 toward G'_f1.
  • the electronic device splices the part of the mapped A'_1 that falls within the preset second cropping range with the splicing result corresponding to the first sub-path, to obtain the spliced image R_A1 of the third sub-path.
  • the second cropping range is the range defined by cropping lines corresponding to the deviation range indicator lines of the third sub-path and by a left boundary line and a right boundary line preset by the electronic device.
  • when the deviation indicator moves along the third sub-path, the electronic device rotates along the third sub-path; the electronic device determines the target key frame corresponding to the first frame image, maps the first frame image and the target key frame to the cylindrical surface, extracts the feature points, calculates the matching result from the feature points, and calculates the homography matrix from the matching result, so that the first frame image is mapped onto the target key frame according to the homography matrix; the mapped first frame image is then cropped and spliced with the splicing result corresponding to the first sub-path, obtaining a new splicing result.
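Once the homography matrix has been computed from the matched feature points, mapping one image onto the other is a per-point projective transform. A minimal sketch of applying a 3x3 homography to a single point (the function name is illustrative):

```python
def apply_homography(H, x, y):
    """Apply a 3x3 homography H (nested lists, row-major) to the pixel
    (x, y): multiply the homogeneous coordinate [x, y, 1] by H and
    de-homogenise by the third component."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w
```

Warping a whole frame applies this transform to every pixel, typically via the inverse mapping so that each destination pixel samples the source image.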
  • the electronic device registers the images of the third sub-path with key frames from the images of the first sub-path, and can promptly correct the misalignment error between the images of the third sub-path and those of the first sub-path while splicing the images of the third sub-path, so that the images of the third sub-path are accurately registered with the splicing result of the first sub-path, thereby achieving global registration.
  • the splicing result of the third sub-path and the splicing result of the first sub-path can thus form a smooth, natural overall image.
  • the lens of the electronic device's camera is located on the xy plane of an xyz three-dimensional coordinate system; the interval between the rotation angles around the y axis corresponding to adjacent key frames is greater than or equal to a preset value; among the multiple key frames, the target key frame G_f1 is the one whose rotation angle around the y axis differs least from the rotation angle around the y axis corresponding to A_1.
  • in this way, the image misalignment error between the target key frame and the images of the third sub-path is smallest during stitching, and the two images are easier to register after being mapped to the cylindrical surface.
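The key-frame rule above (key frames spaced at least a preset yaw interval apart, and the target key frame chosen as the one with the smallest yaw difference) might be sketched like this; the function names, and treating yaw as an increasing sequence along the sub-path, are assumptions for illustration:

```python
def select_key_frames(frame_yaws, delta):
    """Keep the first frame, then every frame whose yaw (rotation about
    the y axis) is at least `delta` beyond the last kept key frame.
    Assumes yaws are recorded in increasing order along the sub-path."""
    keys = [0]
    for i, yaw in enumerate(frame_yaws):
        if yaw - frame_yaws[keys[-1]] >= delta:
            keys.append(i)
    return keys

def target_key_frame(key_yaws, frame_yaw):
    """Index of the key frame whose yaw is closest to the current frame's
    yaw, i.e. the one with the smallest misalignment when stitching."""
    return min(range(len(key_yaws)), key=lambda k: abs(key_yaws[k] - frame_yaw))
```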
  • the electronic device displaying, in the stitching preview window, the target image obtained by splicing the images collected while the deviation indicator moves along the sub-paths in the first direction further includes: the electronic device obtains, from the multiple key frames, the target key frame G_fi that matches the i-th frame image A_i of the third sub-path.
  • the electronic device maps G_fi to the cylindrical surface to obtain the image G'_fi.
  • the electronic device maps A_i to the cylindrical surface to obtain the image A'_i, and extracts the feature points F_A,i of A'_i, F_A,i-1 of A'_i-1 and F_A,fi of G'_fi, where A'_i-1 is the image obtained after the (i-1)-th frame image A_i-1 of the third sub-path is mapped to the cylindrical surface.
  • the electronic device calculates the matching results of F_A,i with F_A,i-1 and F_A,fi. According to these matching results, the electronic device maps A'_i toward A'_i-1 and G'_fi. The electronic device splices the part of the mapped A'_i that falls within the preset second cropping range with the splicing result corresponding to the first sub-path and the spliced image R_A(i-1) of the third sub-path, to obtain the spliced image R_Ai of the third sub-path.
  • when the electronic device rotates along the third sub-path, it can perform the registration, mapping and splicing of the images other than the first frame according to the previous frame image and the determined target key frame.
  • the electronic device displaying, in the stitching preview window, the target image obtained by splicing the images collected while the deviation indicator moves along the sub-paths in the first direction includes: the electronic device splices the images collected while rotating along the fifth sub-path with the splicing result corresponding to the first sub-path according to the multiple key frames.
  • when the electronic device splices the images collected while rotating along the fifth sub-path with the splicing result corresponding to the first sub-path according to the multiple key frames, that splicing result may already be the result of splicing the splicing result of the first sub-path with the splicing result of the third sub-path.
  • that is, the electronic device splices the images collected while rotating along the fifth sub-path with the splicing results of the first and third sub-paths, thereby forming a spliced image with a larger field of view.
  • the lens of the electronic device camera is located on the xy plane of the xyz three-dimensional coordinate system.
  • the electronic device is preset with a left boundary line, a left baseline, a left clipping line, a right clipping line, a right baseline and a right boundary line; the left baseline corresponds to the fourth sub-path and its rotation angle around the y axis is 0; the right baseline corresponds to the second sub-path and its rotation angle around the y axis is α_r; the rotation angle around the y axis corresponding to the left clipping line is α_2, that corresponding to the right clipping line is α_3, that corresponding to the left boundary line is α_1, and that corresponding to the right boundary line is α_4.
  • the electronic device is also preset with upper, middle and lower baselines, as well as a first cutting line, a second cutting line, a third cutting line and a fourth cutting line; the upper, middle and lower baselines correspond to the third sub-path, the first sub-path and the fifth sub-path respectively; the first and second cutting lines correspond to the deviation range indicator lines of the third sub-path, the second and third cutting lines correspond to those of the first sub-path, and the third and fourth cutting lines correspond to those of the fifth sub-path; the rotation angles around the x axis corresponding to the first cutting line, the upper baseline, the second cutting line, the middle baseline, the third cutting line, the lower baseline and the fourth cutting line are β_1, β_t, β_2, 0, β_3, β_b and β_4 respectively.
  • the method further includes: when the deviation indicator moves along the first sub-path, if the rotation angle β of the electronic device around the x axis satisfies β > β_2 or β < β_3, the electronic device stops shooting.
  • when the rotation angle α of the electronic device around the y axis satisfies α ≤ 0, the deviation indicator switches from moving along the third sub-path to moving along the fourth sub-path; when α > α_2 or α < α_1, the electronic device stops shooting.
  • when the rotation angle β of the electronic device around the x axis satisfies β ≤ β_b, the deviation indicator switches from moving along the fourth sub-path to moving along the fifth sub-path; when β > β_3 or β < β_4, the electronic device stops shooting.
  • when the rotation angle α of the electronic device around the y axis satisfies α > α_r, the electronic device stops shooting.
  • the electronic device can determine whether to switch the sub-path along which it rotates, and determine whether the center line of the image collected by the electronic device exceeds the cropping range, etc., according to the size of the rotation angle around the x-axis or the y-axis.
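The per-sub-path decision rule above (switch once the rotation angle passes the next baseline, stop once it leaves the band between the two cutting lines) might be sketched generically as follows; the function name and sample angles are illustrative assumptions:

```python
def decide(angle, switch_at, stop_below, stop_above):
    """Decision for one sub-path whose rotation angle increases toward
    the next baseline: 'stop' when the angle leaves the cutting-line
    band, 'switch' once the next baseline is reached, otherwise
    'continue'. For sub-paths traversed with a decreasing angle (e.g.
    the fourth sub-path), the comparisons are mirrored."""
    if angle < stop_below or angle > stop_above:
        return "stop"    # centre line crossed a cutting line
    if angle >= switch_at:
        return "switch"  # reached the next baseline; turn onto the next sub-path
    return "continue"
```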
  • an embodiment of the present application provides a photographing device, which is included in an electronic device.
  • the device has the function of realizing the behavior of the electronic device in any method in the foregoing aspects and possible designs, so that the electronic device executes the panoramic shooting method executed by the electronic device in any of the foregoing aspects in a possible design.
  • This function can be realized by hardware, or by hardware executing corresponding software.
  • the hardware or software includes at least one module or unit corresponding to the above-mentioned functions.
  • the device may include a processing unit, a display unit, a detection unit, and so on.
  • an embodiment of the present application provides an electronic device, including: a camera for capturing images; a screen for displaying an interface; one or more processors; and a memory in which code is stored.
  • when the code is executed by the electronic device, the electronic device is caused to execute the panoramic photography method executed by the electronic device in any one of the possible designs in the foregoing aspects.
  • an embodiment of the present application provides an electronic device, including: one or more processors; and a memory, in which code is stored.
  • when the code is executed by the one or more processors, the electronic device is caused to execute the panoramic photography method executed by the electronic device in any one of the possible designs in the foregoing aspects.
  • an embodiment of the present application provides a computer-readable storage medium, including computer instructions, which, when the computer instructions run on an electronic device, cause the electronic device to execute the panoramic photography method in any one of the possible designs in the foregoing aspects.
  • an embodiment of the present application provides a computer program product, which when the computer program product runs on a computer, causes the computer to execute the panoramic shooting method executed by the electronic device in any one of the above-mentioned possible designs.
  • an embodiment of the present application provides a chip system, which is applied to an electronic device.
  • the chip system includes one or more interface circuits and one or more processors; the interface circuit and the processor are interconnected by wires; the interface circuit is used to receive signals from the memory of the electronic device and send signals to the processor.
  • the signals include computer instructions stored in the memory; when the processor executes the computer instructions, the electronic device is caused to execute the panoramic photography method in any one of the possible designs of the above-mentioned aspects.
  • FIG. 1 is a schematic diagram of a preview interface for panoramic shooting in the prior art
  • FIG. 2 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment of the application.
  • FIG. 3 is a flowchart of a panoramic shooting provided by an embodiment of the application.
  • FIG. 4A is a schematic diagram of a set of interfaces provided by an embodiment of the application.
  • FIG. 4B is a schematic diagram of a set of guide paths provided by an embodiment of the application.
  • Figure 5 is a schematic diagram of a set of guidance information provided by an embodiment of the application.
  • FIG. 6 is a schematic diagram of another set of interfaces provided by an embodiment of the application.
  • FIG. 7 is a schematic diagram of a set of rule lines provided by an embodiment of the application.
  • FIG. 8 is a schematic diagram of a coordinate system provided by an embodiment of the application, and a schematic diagram of the quantitative relationship between image frames and gyroscope data;
  • FIG. 9 is a schematic diagram of a set of cylindrical surface mapping effects provided by an embodiment of the application.
  • FIG. 10 is a schematic diagram of a splicing effect provided by an embodiment of this application.
  • FIG. 11 is a schematic diagram of a comparison of feature points provided by an embodiment of this application.
  • FIG. 12 is a schematic diagram of a set of splicing effects provided by an embodiment of the application.
  • FIG. 13 is a schematic diagram of a set of key frames provided by an embodiment of the application.
  • FIG. 14A is a schematic diagram of another set of interfaces provided by an embodiment of the application.
  • FIG. 14B is a schematic diagram of a set of guide paths on a shooting interface provided by an embodiment of the application.
  • FIG. 15 is a schematic diagram of another interface provided by an embodiment of the application.
  • FIG. 16A is a schematic diagram of another interface provided by an embodiment of the application.
  • FIG. 16B is a schematic diagram of another splicing effect provided by an embodiment of the application.
  • FIG. 16C is a schematic diagram of comparison between a set of upper baseline images and target key frames provided by an embodiment of the application.
  • FIG. 16D is a schematic diagram of another splicing effect provided by an embodiment of the application.
  • FIG. 16E is a schematic diagram of another splicing effect provided by an embodiment of the application.
  • FIG. 17 is a schematic diagram of another splicing effect provided by an embodiment of the application.
  • FIG. 18 is a schematic diagram of another set of splicing effects provided by an embodiment of the application.
  • FIG. 19 is a schematic diagram of another set of interfaces provided by an embodiment of the application.
  • FIG. 20 is a schematic diagram of another set of guide paths provided by an embodiment of this application.
  • FIG. 21 is a schematic diagram of another interface provided by an embodiment of this application.
  • FIG. 22A is a schematic diagram of another interface provided by an embodiment of the application.
  • FIG. 22B is a schematic diagram of another set of guide paths provided by an embodiment of the application.
  • FIG. 22C is a schematic diagram of another splicing effect provided by an embodiment of the application.
  • FIG. 23 is a schematic diagram of a set of guide paths and panoramic images obtained by stitching provided by an embodiment of the application.
  • FIG. 24 is a schematic diagram of another set of guide paths and panoramic images obtained by stitching provided by an embodiment of the application.
  • FIG. 25 is a schematic diagram of another set of guide paths and panoramic images obtained by stitching provided by an embodiment of the application.
  • FIG. 26 is a schematic structural diagram of an electronic device provided by an embodiment of this application.
  • "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of indicated technical features. Therefore, features defined with "first" and "second" may explicitly or implicitly include one or more of these features. In the description of the present embodiments, unless otherwise specified, "plurality" means two or more.
  • the embodiment of the present application provides a panoramic shooting method, which can stitch images collected at different angles in two mutually perpendicular directions to generate a panoramic image, so that the field of view of the stitched image can be expanded in both directions, obtaining panoramic images that cover a larger viewing angle in both directions and improving the user's shooting experience.
  • the panoramic shooting method provided in the embodiments of the present application can be applied to electronic devices.
  • the electronic device may be a mobile phone, a tablet computer, a wearable device, a vehicle-mounted device, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA) or another such device; it may also be a professional camera or other device.
  • the embodiments of the present application do not impose any restrictions on the specific types of electronic devices.
  • FIG. 2 shows a schematic structural diagram of the electronic device 100.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and ambient light Sensor 180L, bone conduction sensor 180M, etc.
  • the processor 110 may include one or more processing units.
• the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100.
• the controller can generate operation control signals according to the instruction operation code and timing signals, to control instruction fetching and instruction execution.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
• the memory can store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, they can be called directly from the memory. This avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, connected to the display 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations and is used for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, and the like.
  • the display screen 194 may display a preview interface and a shooting interface in the panoramic shooting mode.
  • the display screen 194 includes a display panel.
• the display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
• the electronic device 100 may include one or N display screens 194, where N is a positive integer greater than one.
  • the electronic device 100 can realize a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • the ISP is used to process the data fed back from the camera 193. For example, when taking a picture, the shutter is opened, and the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, which is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and is projected to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transfers the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
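The YUV-to-RGB conversion mentioned above can be illustrated with a minimal sketch. This uses the common full-range BT.601 coefficients; the coefficients and function name are our assumptions for illustration, not values stated in this application:

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV sample (0-255) to RGB.
    Illustrative only -- a DSP would apply this per pixel in hardware."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    # Clamp each channel to the valid 8-bit range.
    clamp = lambda c: max(0, min(255, int(round(c))))
    return clamp(r), clamp(g), clamp(b)

# A neutral gray sample (U = V = 128) maps to equal RGB channels.
print(yuv_to_rgb(128, 128, 128))  # → (128, 128, 128)
```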
  • the electronic device 100 may include one or N cameras 193, and N is a positive integer greater than one.
  • the camera 193 may include a front camera and/or a rear camera.
  • the camera 193 may also include multiple types.
  • the camera 193 may include a telephoto camera, a wide-angle camera, an ultra-wide-angle camera, etc. whose field of view angle varies from small to large.
• in the panoramic shooting mode, the electronic device 100 may use a camera with a larger field of view (for example, an ultra-wide-angle camera or a wide-angle camera) to collect multiple frames of images at different angles, so that the captured images have a larger viewing angle range; after being cropped, the multiple frames of images are stitched into a panoramic image with a larger viewing angle range.
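The crop-and-stitch step can be sketched as follows. This is a deliberately simplified NumPy illustration that assumes the frames are already registered; real panorama stitching also needs feature alignment and seam blending, which are omitted here:

```python
import numpy as np

def stitch_band(frames, band_height):
    """Crop each (already aligned) frame to a central horizontal band
    and concatenate the bands left to right -- a toy stand-in for the
    cropping-and-stitching step described above."""
    bands = []
    for f in frames:
        h = f.shape[0]
        top = (h - band_height) // 2
        bands.append(f[top:top + band_height])
    return np.hstack(bands)

# Three dummy 100x40 frames stitched into one 60-row panorama strip.
frames = [np.full((100, 40, 3), i, dtype=np.uint8) for i in range(3)]
pano = stitch_band(frames, band_height=60)
print(pano.shape)  # → (60, 120, 3)
```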
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • NPU is a neural-network (NN) computing processor.
  • the NPU can realize applications such as intelligent cognition of the electronic device 100, such as image recognition, face recognition, voice recognition, text understanding, and so on.
  • the internal memory 121 may be used to store computer executable program code, and the executable program code includes instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by running instructions stored in the internal memory 121.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, at least one application program (such as a sound playback function, an image playback function, etc.) required by at least one function.
  • the data storage area can store data (such as audio data, phone book, etc.) created during the use of the electronic device 100.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
• the processor 110 runs the instructions stored in the internal memory 121 to stitch the images collected by the camera 193 from different angles in two mutually perpendicular directions to generate a panoramic image, thereby expanding the angle of view of the panoramic image in both directions.
  • the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the gyro sensor 180B may be used to determine the movement posture of the electronic device 100.
• in some embodiments, the angular velocities of the electronic device 100 around three axes (i.e., the x, y, and z axes) may be determined through the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shake of the electronic device 100 through reverse movement to achieve anti-shake.
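The lens-compensation idea above can be sketched with the standard small-angle model, where the required displacement is roughly the focal length times the tangent of the shake angle. The formula, scale, and function name are our illustrative assumptions; a real OIS controller also accounts for lens geometry and actuator limits:

```python
import math

def ois_compensation_mm(focal_length_mm, shake_angle_deg):
    """Distance the lens module must shift (opposite to the shake) so
    the image stays put: d ≈ f · tan(θ).  Illustrative model only."""
    theta = math.radians(shake_angle_deg)
    return -focal_length_mm * math.tan(theta)  # negative: move against the shake

# A 0.5° shake with a 4 mm lens needs roughly 0.035 mm of counter-movement.
print(round(abs(ois_compensation_mm(4.0, 0.5)), 3))  # → 0.035
```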
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the gyroscope sensor 180B may be used to calculate the rotation angle ⁇ of the mobile phone around the x-axis and the rotation angle ⁇ around the y-axis in the panoramic shooting process.
  • the rotation angle ⁇ and the rotation angle ⁇ can be used to determine the shooting stage of the panoramic shooting process, determine the position of the deviation indicator corresponding to the center of the image frame, and determine whether the deviation range of the current image frame exceeds the maximum deviation range, and so on.
• the touch sensor 180K is also called a "touch panel". The touch sensor 180K may be provided on the display screen 194, and the touch sensor 180K and the display screen 194 together form a touchscreen, also called a "touch screen".
  • the touch sensor 180K is used to detect a touch operation acting on or near it, for example, a touch operation used to instruct to take a panoramic image.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100, which is different from the position of the display screen 194.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the camera 193 in the panoramic shooting mode, can collect multiple frames of images at different angles.
  • the display screen 194 can display the preview interface and the shooting interface in the panoramic shooting mode.
• the processor 110, by running the instructions stored in the internal memory 121, stitches the images collected by the camera 193 from different angles in two mutually perpendicular directions to generate a panoramic image, so that the angle of view of the panoramic image can be expanded in two directions.
  • the gyro sensor 180B can be used to calculate the rotation angle ⁇ of the mobile phone around the x-axis and the rotation angle ⁇ around the y-axis in the panoramic shooting process.
  • the rotation angle ⁇ and the rotation angle ⁇ can be used to determine the shooting stage of the panoramic shooting process, determine the position of the deviation indicator corresponding to the center of the image frame, and determine whether the deviation range of the current image frame exceeds the maximum deviation range, and so on.
  • the method may include:
  • a guide path is displayed on a preview interface, and the guide path includes at least two parallel sub-paths set along a first direction.
• after the mobile phone detects that the user has clicked the camera icon 401 shown in (a) in FIG. 4A, it starts the camera application and enters the photographing mode shown in (b) in FIG. 4A.
• after the mobile phone detects the user's operation of clicking the panoramic shooting control 402 shown in (b) in FIG. 4A, it enters the panoramic shooting mode and displays the preview interface shown in (c) in FIG. 4A.
• after the mobile phone detects that the user clicks the control 403 shown in (b) in FIG. 4A, it displays the interface shown in (d) in FIG. 4A; after the mobile phone detects that the user clicks the control 404, it enters the panoramic shooting mode and displays the preview interface shown in (c) in FIG. 4A.
  • the mobile phone can also enter the panoramic shooting mode in response to other touch operations, voice commands, or shortcut gestures of the user.
  • the embodiment of the present application does not limit the operation of triggering the mobile phone to enter the panoramic shooting mode.
  • the preview interface in the panoramic shooting mode includes guide information for panoramic shooting.
  • the guidance information includes the guidance path.
  • the guide path is used to guide the user to rotate (and/or move) the mobile phone along the guide path during the shooting process, so that the center line of the image collected by the mobile phone moves along the guide path to complete panoramic shooting.
  • the guide path includes at least two sub-paths arranged along the first direction and parallel to each other. The mobile phone can collect images from different angles along the sub-paths in the first direction, and stitch them along each sub-path to generate a panoramic image.
• in this way, the angle of view of the panoramic image in the first direction can be expanded; when multiple sub-paths are set along the first direction, the mobile phone can also expand the angle of view of the panoramic image in the second direction perpendicular to the first direction.
  • the different sub-paths set along the first direction are used to guide the user to take multiple shots back and forth along the first direction to expand the angle of view of the panoramic image in the second direction perpendicular to the first direction.
  • This kind of guidance can be precise guidance or general trend guidance.
• in some embodiments, the guide path includes a middle sub-path 41, an upper sub-path 42, and a lower sub-path 43 arranged along the first direction, and the coordinate ranges of the middle sub-path 41, the upper sub-path 42, and the lower sub-path 43 in the first direction are the same.
• in other embodiments, the guide path includes a middle sub-path 41, an upper sub-path 42, and a lower sub-path 43 arranged along the first direction, and the coordinate ranges of the middle sub-path 41, the upper sub-path 42, and the lower sub-path 43 in the first direction partially overlap.
  • the sub-path set along the first direction may be provided with a direction indicator (for example, an arrow on the sub-path) to guide the direction of rotation when the mobile phone rotates along the sub-path.
  • the sub-path set along the first direction may not be provided with a direction indication mark, and the user only needs to complete the rotation along the sub-path, and the specific rotation direction of the user is not limited.
  • the mobile phone can remind the user of the sequence of shooting along different sub-paths in the first direction by means of display information or voice broadcast.
  • the mobile phone can prompt the user through text messages to first rotate along the middle sub-path during the shooting, and then rotate along the upper sub-path and the lower sub-path to shoot.
  • the mobile phone can remind the user of the sequence of shooting along different sub-paths by indicating arrows or other indicating methods.
• the arrow 44 is used to instruct the user to first rotate and shoot along the middle sub-path, and then rotate and shoot along the upper sub-path; the arrow 45 is used to instruct the user to first rotate and shoot along the upper sub-path, and then rotate and shoot along the lower sub-path.
  • the guide path may further include at least one sub-path arranged along the second direction.
  • the sub-paths set along the second direction are used to remind the user of the sequence of shooting of different sub-paths along the first direction.
  • the guide path further includes a right sub-path 46 and a left sub-path 47.
• the right sub-path 46 is used to instruct the user to first rotate and shoot along the middle sub-path, and then rotate and shoot along the upper sub-path; the left sub-path 47 is used to instruct the user to first rotate and shoot along the upper sub-path, and then rotate and shoot along the lower sub-path.
  • the sub-path in the second direction is used to connect the sub-path in the first direction, and the entire guide path is a continuous path.
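The continuous guide path described above (horizontal sub-paths joined by vertical connectors) can be sketched as a polyline of waypoints. The coordinates and the builder function are illustrative assumptions; the traversal order matches the one described later (middle → right → upper → left → lower):

```python
def build_guide_path(x_left, x_right, y_mid, y_up, y_down):
    """Waypoints of a continuous guide path: middle sub-path left→right,
    right connector up, upper sub-path right→left, left connector down,
    lower sub-path left→right.  Coordinates are illustrative screen
    positions (y grows downward), not values from this application."""
    return [
        (x_left, y_mid), (x_right, y_mid),   # middle sub-path
        (x_right, y_up),                     # right (vertical) sub-path
        (x_left, y_up),                      # upper sub-path, right→left
        (x_left, y_down),                    # left (vertical) sub-path
        (x_right, y_down),                   # lower sub-path, left→right
    ]

path = build_guide_path(x_left=40, x_right=360, y_mid=200, y_up=150, y_down=250)
print(len(path))  # → 6
```

Each consecutive pair of waypoints shares either an x or a y coordinate, so every segment is axis-aligned, which keeps the whole guide path continuous.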
  • the multiple sub-paths are parallel to each other.
  • the guide path may refer to the path 405 shown in (c) in FIG. 4A.
  • the mobile phone can also remind the user of the sequence of shooting along different sub-paths in the first direction in various other ways, which is not limited in the embodiment of the present application.
• the first direction and the second direction may be two directions respectively parallel to two adjacent sides of the mobile phone, and the second direction is perpendicular to the first direction.
• one side of the mobile phone is usually parallel to the horizontal direction (or the angle between that side and the horizontal direction is less than or equal to a preset value, that is, that side of the mobile phone is basically parallel to the horizontal direction), and the other side adjacent to it is usually parallel to the vertical direction (or the angle between the other side and the vertical direction is less than or equal to the preset value, that is, the other side of the mobile phone is basically parallel to the vertical direction). That is, the first direction and the second direction may be the horizontal direction and the vertical direction.
  • the first direction is the horizontal direction (or horizontal direction), and the second direction is the vertical direction (or vertical direction).
• the guide path includes horizontal sub-paths and vertical sub-paths; there may be at least two horizontal sub-paths, and the vertical sub-paths are used to connect the horizontal sub-paths.
• the guide path includes three sub-paths in the horizontal direction, namely the middle sub-path 501, the upper sub-path 502, and the lower sub-path 503, and two sub-paths in the vertical direction, namely the left sub-path 504 and the right sub-path 505.
  • the guide path on the preview interface is used to remind the user of the complete path that needs to be taken during the entire shooting process.
• the guide information on the preview interface may also include a deviation indicator mark, which is used to indicate, in real time, the position in the first direction of the center line (i.e., the horizontal center line) of the image currently collected by the camera.
• the deviation indicator arrow may also point in the direction in which it is to be moved, that is, toward the end of the starting sub-path, that is, in the direction in which the mobile phone is to be rotated.
• the deviation indicator is located at the beginning of the starting sub-path. Exemplarily, in the case shown in (b) in FIG. 5, the deviation indicator mark may be a deviation indicator arrow 506, the starting sub-path is the middle sub-path 501, and the deviation indicator arrow 506 is located at the left end of the middle sub-path 501 on the preview interface and points to the right end of the middle sub-path 501. It is understandable that the deviation indicator mark may also be a deviation indicator line or take other forms, which is not limited in the embodiment of the present application. The following description takes the deviation indicator arrow as an example.
  • the guidance information on the preview interface may also include a deviation range indication line of the sub-path. Since the user is prone to shaking when holding the mobile phone, the position of the horizontal center line of the image collected by the mobile phone will usually change, and the position of the deviation indicator arrow will also change.
  • the deviation range indicator line is located on both sides of the sub-path, and is parallel to the sub-path, and is used to indicate the maximum allowable range of the deviation indicator arrow from the sub-path.
• the mobile phone may display only the deviation range indicator lines of the starting sub-path. Exemplarily, in the case shown in (b) in FIG. 5, the starting sub-path is the middle sub-path, and the deviation range indication lines 501a-501b may be dotted lines located on both sides of the middle sub-path 501 and parallel to it.
• the mobile phone may prompt the user, by displaying information or by voice broadcast, to keep the deviation indicator arrow coinciding with the guide path as much as possible and not exceeding the deviation range indicator lines.
  • the mobile phone can prompt the user through text messages on the preview interface: After starting to shoot, slowly turn the mobile phone so that the arrow moves along the guide path and does not exceed the range of the dotted line.
  • the preview interface may further include a splicing preview window 507.
  • the splicing preview window is located at the beginning of the guide path, and the deviation indicator arrow can be located beside the splicing preview window and on the side where the end of the starting sub-path is located.
  • the splicing preview window is located at the beginning of the starting sub-path 501, and the deviation indicating arrow is located on the right side of the splicing preview window.
  • the stitching preview window is used to display the thumbnail of the preview image currently displayed on the preview interface (or called the preview image of the preview image).
• the stitching preview window will occupy or block a part of the guide path. It can be seen from (b) in FIG. 5 that the lengths of the sub-path 501, the sub-path 502, and the sub-path 503 are equal, but because the splicing preview window blocks a part of the guide path, as shown in (b) in FIG. 5, the sub-path 502 and the sub-path 503 appear indented compared with the left end of the sub-path 501, and the right ends of the three sub-paths are aligned.
• alternatively, the length of the sub-path 501 itself may be smaller than that of the sub-path 502 and the sub-path 503, with the left end of the sub-path 501 indented relative to them; the left ends of the sub-path 502 and the sub-path 503 are then aligned, and the right ends of the three sub-paths are also aligned.
• after the mobile phone detects the user's shooting operation, it displays, on the shooting interface, an image obtained by stitching the images collected along the guide path.
• after the mobile phone detects the user's operation of clicking the shooting control 406 shown in (c) in FIG. 4A, it starts to shoot a panoramic image and displays the shooting interface.
  • the mobile phone displays the images collected by the camera on the shooting interface in real time. It is understandable that the mobile phone can also start panoramic image shooting in response to operations such as voice instructions or shortcut gestures of the user.
  • the embodiment of the present application does not limit the operation of triggering the mobile phone to start panoramic image shooting.
  • the mobile phone displays guide information on the shooting interface, and the guide information includes a guide path, a deviation indicator arrow, and a deviation range indicator line.
  • the displayed shooting interface may refer to (a) in FIG. 6.
  • the deviation indicator arrow is used to remind the user of the position of the center line of the image currently collected by the camera and the degree of deviation between the position of the center line and the target sub-path.
• the deviation indicator arrow may also point in the direction in which it is to be moved, that is, toward the end of the target sub-path, that is, in the direction in which the mobile phone is to be rotated.
• the target sub-path is the sub-path along which the deviation indicator arrow is currently moving.
  • the target sub-path is the starting sub-path (for example, the intermediate sub-path described above), and as the shooting process proceeds, the target sub-path will switch.
• when the target sub-path is a horizontal sub-path (that is, a sub-path set in the horizontal direction), the deviation indicating arrow moves along the horizontal sub-path and indicates the position of the horizontal center line of the image collected by the mobile phone; the horizontal center lines of the different images collected by the mobile phone move along the horizontal sub-path.
• when the target sub-path is a vertical sub-path, the deviation indicating arrow moves along the vertical sub-path and indicates the position of the vertical center line of the image collected by the mobile phone; the vertical center lines of the different images collected by the mobile phone move along the vertical sub-path.
• the mobile phone can prompt the user, by displaying information or by voice broadcast, to turn the mobile phone slowly along the guide path so that the deviation indicator arrow coincides with the guide path as much as possible and does not exceed the deviation range indicator lines.
  • the mobile phone can prompt the user through a text message: Please turn the mobile phone slowly so that the arrow moves along the guide path and does not exceed the range of the dotted line.
  • the mobile phone can prompt the user to move the mobile phone to coincide the deviation indicator arrow with the target sub-path.
  • the mobile phone may prompt the user: Please move up so that the arrow coincides with the guide path.
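A corrective hint like the one above can be derived from the signed offset between the arrow and the target sub-path. This is a minimal sketch; the exact wording, the tolerance, and the function name are our assumptions, not the application's actual prompts:

```python
def guidance_prompt(arrow_y, baseline_y, tolerance_px=5):
    """Pick a corrective hint from the arrow's signed vertical offset to
    the target sub-path (screen y grows downward).  An arrow below the
    sub-path means the camera points too low, so the user should move up."""
    offset = arrow_y - baseline_y
    if abs(offset) <= tolerance_px:
        return None  # close enough -- no hint needed
    return ("Please move up so that the arrow coincides with the guide path"
            if offset > 0 else
            "Please move down so that the arrow coincides with the guide path")

print(guidance_prompt(420, 400))  # arrow below the sub-path → "move up" hint
```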
  • the guide path displayed on the shooting interface is a complete guide path.
  • the mobile phone rotates to the right so that the deviation indicator arrow moves to the right along the horizontal sub-path.
• the phone keeps rotating to collect images from different angles, so that the deviation indicator arrow moves to the right along the middle sub-path, upward along the right sub-path, to the left along the upper sub-path, downward along the left sub-path, and to the right along the lower sub-path, until shooting along the entire guide path is completed.
• the sub-path along which the deviation indicating arrow is currently moving is the target sub-path. For example, when the deviation indicating arrow moves along the middle sub-path, the target sub-path is the middle sub-path; when the deviation indicating arrow moves along the right sub-path, the target sub-path is the right sub-path.
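Switching the target sub-path as the arrow reaches the end of the current one can be sketched as a small state machine over the guide-path waypoints. The snap threshold and names are illustrative assumptions:

```python
def advance_target(path_points, target_idx, arrow_xy, snap_px=10):
    """Switch the target sub-path once the deviation arrow reaches the
    end of the current one.  `path_points` are the guide-path waypoints;
    segment i runs from point i to point i+1.  The snap distance is an
    assumed threshold, not a value from this application."""
    end_x, end_y = path_points[target_idx + 1]
    x, y = arrow_xy
    at_end = abs(x - end_x) <= snap_px and abs(y - end_y) <= snap_px
    last = target_idx == len(path_points) - 2
    if at_end and not last:
        return target_idx + 1  # the next sub-path becomes the target
    return target_idx

# Middle sub-path → right connector → upper sub-path (toy coordinates).
points = [(40, 200), (360, 200), (360, 150), (40, 150)]
print(advance_target(points, 0, (355, 202)))  # arrow near the end → 1
```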
• in other embodiments, the guide path displayed on the shooting interface changes as the user's shooting proceeds, and the shooting interface may display only the part of the guide path whose shooting is unfinished, that is, the part of the guide path displayed on the preview interface that the deviation indicating arrow has not yet passed.
• the deviation indicator arrow may fluctuate within the deviation range indicator lines as it moves along the guide path. Therefore, the part of the guide path that the deviation indicating arrow has passed includes, but is not limited to, the part the arrow traverses while coinciding with the guide path, and also includes the part it traverses while not coinciding with the guide path but remaining within the deviation range indicating lines.
  • the mobile phone only displays the current target sub-path.
  • the mobile phone displays the complete guide path on the shooting interface, and the display mode of the guide path for the completed shooting and the guide path for the incomplete shooting is different.
  • the guide path for unfinished shooting is a solid line
  • the guide path for completed shooting is a dotted line.
  • the mobile phone displays the complete guide path on the shooting interface until the shooting is completed.
  • the shooting interface also includes a stitching preview window, which is used to display thumbnails (or called previews of stitched images) of images obtained by stitching by mobile phones during shooting.
  • the deviation indicator arrow can be located beside the stitching preview window and on the side where the end of the target subpath is located.
  • the size, position, and shape of the stitching preview window correspond to the guide path of the completed shooting.
  • the stitching preview window on the shooting interface can cover the guide path of the completed shooting.
• as the mobile phone rotates, the shooting angle and shooting range of the camera also change, so that images from different angles can be collected.
  • the phone can stitch images from different angles and display them in the stitching preview window.
• the image displayed by the mobile phone in the splicing preview window may specifically be the target image obtained by splicing the images collected while the mobile phone rotates along a horizontal sub-path in the first direction, that is, while the deviation indicator moves along the horizontal sub-path in the first direction.
  • the rule line includes an upper baseline, a middle baseline, and a lower baseline, which correspond to the upper sub-path, the middle sub-path, and the lower sub-path in the horizontal direction, respectively.
  • the rule line also includes the first cutting line, the second cutting line, the third cutting line, and the fourth cutting line.
  • the first cutting line and the second cutting line are located on both sides of the upper baseline, and respectively correspond to the deviation range indication lines on both sides of the upper sub-path in the horizontal direction.
• the first cropping line and the second cropping line constitute the upper and lower cropping range of the upper-baseline image (that is, the image collected when the mobile phone rotates along the upper sub-path), and limit the maximum deviation range of the horizontal center line of the images collected while the mobile phone rotates along the upper sub-path.
• the second cutting line and the third cutting line are located on both sides of the middle baseline, and respectively correspond to the deviation range indication lines on both sides of the middle sub-path in the horizontal direction.
• the second cropping line and the third cropping line constitute the upper and lower cropping range of the middle-baseline image (that is, the image collected when the mobile phone rotates along the middle sub-path), and limit the maximum deviation range of the horizontal center line of the images collected while the mobile phone rotates along the middle sub-path.
• the third cutting line and the fourth cutting line are located on both sides of the lower baseline, and respectively correspond to the deviation range indication lines on both sides of the lower sub-path in the horizontal direction.
• the third cropping line and the fourth cropping line constitute the upper and lower cropping range of the lower-baseline image (that is, the image collected when the mobile phone rotates along the lower sub-path), and limit the maximum deviation range of the horizontal center line of the images collected while the mobile phone rotates along the lower sub-path.
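The baseline-and-cutting-line layout can be sketched numerically. Here we assume (our assumption, not stated in the application) that the baselines are spaced two maximum deviations apart, so each interior cutting line flanks two adjacent baselines as described above; cropping a frame to a band is then plain row slicing:

```python
import numpy as np

def baseline_cut_lines(y_mid, max_dev):
    """Place the three baselines 2*max_dev apart so that each interior
    cutting line serves two adjacent baselines (spacing assumption ours).
    Returns the baseline y-coordinates and the four cutting lines."""
    y_up, y_down = y_mid - 2 * max_dev, y_mid + 2 * max_dev
    cuts = [y_up - max_dev,    # 1st cutting line
            y_up + max_dev,    # 2nd (also y_mid - max_dev)
            y_mid + max_dev,   # 3rd (also y_down - max_dev)
            y_down + max_dev]  # 4th
    return (y_up, y_mid, y_down), cuts

def crop_band(image, top, bottom):
    """Crop a collected frame to the band between two cutting lines."""
    return image[top:bottom]

baselines, cuts = baseline_cut_lines(y_mid=300, max_dev=50)
frame = np.zeros((600, 800, 3), dtype=np.uint8)
upper_band = crop_band(frame, cuts[0], cuts[1])
print(baselines, cuts, upper_band.shape)
```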
  • the ruled line also includes a left baseline and a right baseline, which respectively correspond to the left sub-path and the right sub-path in the vertical direction.
  • the ruled lines also include a left boundary, a left cropping line, a right cropping line, and a right boundary.
  • the left boundary and the left cropping line are located on both sides of the left baseline, and respectively correspond to the deviation range indication lines on both sides of the left sub-path in the vertical direction.
  • the left boundary and the left cropping line define the maximum left-right deviation range of the vertical centerline of the left baseline image (that is, the image collected when the mobile phone rotates along the left sub-path).
  • the left boundary also defines the left border of the panoramic image when stitching, and the part of an image beyond the left boundary can be directly cut off when stitching.
  • the right boundary and the right cropping line are located on both sides of the right baseline, and respectively correspond to the deviation range indication lines on both sides of the right sub-path in the vertical direction.
  • the right boundary and the right cropping line define the maximum left-right deviation range of the vertical centerline of the right baseline image (that is, the image collected when the mobile phone rotates along the right sub-path).
  • the right boundary also defines the right border of the panoramic image when stitching, and the part of an image beyond the right boundary can be directly cut off when stitching.
  • each of the ruled lines shown in FIG. 7 is drawn as a straight line on a plane; in fact, each ruled line is a curve on the cylindrical surface.
  • the posture of the mobile phone when an image is collected and the image collection time can be used to mark the position of the deviation indicator arrow, to switch the target sub-path, and to determine whether the centerline of the image exceeds the cropping range, etc.
  • the three-dimensional coordinate axes include x-axis, y-axis, and z-axis.
  • the plane where the mobile phone lens is located is on the xy plane or parallel to the xy plane, and the z axis is perpendicular to the plane where the mobile phone lens is located.
  • the plane where the lens of the mobile phone is located is also parallel to the screen where the screen of the mobile phone is located.
  • the posture of the mobile phone when collecting images can be represented by the rotation angle of the mobile phone around the x-axis, y-axis, and z-axis.
  • when the mobile phone rotates along a horizontal sub-path, the mobile phone rotates around the y-axis.
  • when the mobile phone rotates along a vertical sub-path, the mobile phone rotates around the x-axis.
  • Equation 1 gives the rotation angle α′ of the mobile phone around the x-axis between the collection moments of two adjacent frames of images, obtained by integrating the gyroscope angular velocity samples falling in that interval: α′ = ω_0·(t_0 − T_0) + Σ_{k=1}^{N} ω_k·(t_k − t_{k−1}) + ω_N·(T_1 − t_N)
  • T_0 represents the time corresponding to the previous frame, and T_1 represents the time corresponding to the next frame.
  • t_k represents the time corresponding to the k-th gyroscope data, t_{k−1} represents the time corresponding to the (k−1)-th gyroscope data, and t_0 represents the time corresponding to the 0th gyroscope data.
  • ω_N represents the rotational angular velocity of the N-th gyroscope sample between T_0 and T_1, and t_N represents the time corresponding to the N-th gyroscope data.
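The summation above can be sketched in code. A minimal Python sketch, assuming piecewise-constant angular velocity between gyroscope samples; the function name and the `(timestamp, angular_velocity)` data layout are illustrative choices, not from the patent:

```python
def rotation_between_frames(t0, t1, gyro):
    """Integrate gyroscope samples to estimate the rotation angle (rad)
    accumulated between two frame timestamps t0 and t1 (Equation 1).

    gyro: list of (timestamp, angular_velocity) pairs sorted by time.
    Angular velocity is assumed piecewise constant, matching the
    summation form of Equation 1.
    """
    # Keep only the samples that fall inside the frame interval.
    samples = [(t, w) for (t, w) in gyro if t0 <= t <= t1]
    if not samples:
        return 0.0
    angle = 0.0
    prev_t = t0
    for t, w in samples:
        # Each sample's velocity applies over the span since the
        # previous sample (or since t0 for the first sample).
        angle += w * (t - prev_t)
        prev_t = t
    # The last sample's velocity covers the tail up to t1.
    angle += samples[-1][1] * (t1 - prev_t)
    return angle
```

The same routine applied to the y-axis angular velocity gives the per-frame increment of β.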
  • the mobile phone can also obtain the rotation angle of the mobile phone around the y-axis between the collection moments of two adjacent frames of images.
  • the rotation angle α of the phone around the x-axis and the rotation angle β around the y-axis at the collection moment of any frame of image can thus be obtained (also called the rotation angle α and the rotation angle β corresponding to that frame of image).
  • the horizontal centerline of the first frame of image mapped to the cylindrical surface coincides with the center baseline on the cylindrical surface.
  • the upper and lower baselines are rotated by a certain angle around the x-axis relative to the middle baseline; the rotation angles are denoted θ_t and θ_b, respectively.
  • the rotation angles of the horizontal cropping lines 1–4 around the x-axis can likewise be denoted θ_1, θ_2, θ_3, and θ_4, respectively.
  • the values of these rotation angles can be used to adjust the extension range of the vertical field of view when stitching panoramic images, and the maximum range within which the horizontal centerline of an image may deviate from its baseline.
  • the right baseline is rotated by a certain angle around the y-axis relative to the left baseline, which is recorded as β_r.
  • the rotation angles of the left boundary, the left cropping line, the right cropping line, and the right boundary around the y-axis can be denoted β_1, β_2, β_3, and β_4, respectively.
  • the corresponding relationship between each ruled line and the rotation angle can be seen in FIG. 7.
  • the mobile phone can determine the horizontal centerline of the current image from the rotation angle α of the mobile phone when each frame of image is collected, thereby drawing a deviation indicator arrow corresponding to the rotation angle α on the shooting interface.
  • the mobile phone can determine whether the deviation indicator arrow exceeds the deviation range indicator lines of a horizontal sub-path according to the rotation angle α, and whether it exceeds the deviation range indicator lines of a vertical sub-path according to the rotation angle β.
  • the mobile phone can also determine whether to switch the target sub-path according to the rotation angle α and the rotation angle β.
  • when the target sub-path is different, the image stitching method is also different.
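As an illustration of how the angle pair (α, β) can drive both the range check and the sub-path switch, here is a hedged Python sketch. The numeric thresholds and the switching conditions are simplified stand-ins for the θ/β thresholds described in this document, not the patent's exact logic:

```python
# Illustrative thresholds (radians, made up for the sketch): theta_* bound
# horizontal sub-paths (checked against alpha, rotation around the x-axis),
# beta_* bound vertical sub-paths (checked against beta, around the y-axis).
THETA = {"upper": (0.20, 0.40), "middle": (-0.10, 0.10), "lower": (-0.40, -0.20)}
BETA = {"left": (-0.10, 0.10), "right": (1.90, 2.10)}

def within_deviation_range(sub_path, alpha, beta):
    """True if the deviation indicator arrow stays inside the deviation
    range indicator lines of the current target sub-path."""
    if sub_path in THETA:            # horizontal sub-path: check alpha
        lo, hi = THETA[sub_path]
        return lo <= alpha <= hi
    lo, hi = BETA[sub_path]          # vertical sub-path: check beta
    return lo <= beta <= hi

def next_target(sub_path, alpha, beta, beta_r=2.0):
    """Switch the target sub-path once the arrow reaches the end of the
    current one (a simplified version of the switching conditions)."""
    if sub_path == "middle" and beta > beta_r:
        return "right"               # end of middle sub-path reached
    if sub_path == "right" and alpha > THETA["upper"][0]:
        return "upper"               # arrow entered the upper band
    if sub_path == "upper" and beta <= 0.0:
        return "left"                # arrow returned to the left edge
    if sub_path == "left" and alpha < THETA["lower"][1]:
        return "lower"               # arrow entered the lower band
    return sub_path
```

A stop-shooting decision is then simply `not within_deviation_range(...)` for the current frame.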
  • the guide path shown in (a) of FIG. 4A will be used as an example for description.
  • the target sub-path is the middle sub-path, and the middle baseline image is stitched
  • after shooting starts, the phone first rotates to the right along the middle sub-path.
  • the image collected by the phone in this process can be called the middle baseline image or the image of the middle sub-path.
  • while the target sub-path is the middle sub-path, the rotation angle β changes continuously from 0.
  • after the mobile phone collects the first-frame middle baseline image I_1 (that is, the first frame image of the middle sub-path), it performs cylindrical mapping on I_1 according to Equations 2–4 to obtain the first-frame middle baseline image I_1′ on the cylindrical surface.
  • if the phone has rotated upward by a certain angle, the scale change of the upper boundary of the mapped image is greater than that of the lower boundary.
  • if the phone has rotated downward by a certain angle, the scale change of the upper boundary of the mapped image is smaller than that of the lower boundary.
  • the mapping result of the first-frame middle baseline image is symmetric about the middle baseline.
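Equations 2–4 themselves are not reproduced in this excerpt; the behavior described above (image boundaries scaling differently as the phone tilts) is characteristic of the standard forward cylindrical projection, which can be sketched as follows. The formula below is the textbook one and only an assumption about what Equations 2–4 contain:

```python
import math

def cylindrical_map_point(x, y, f, cx, cy):
    """Map an image point (x, y) onto a cylinder of radius f (the focal
    length in pixels), with (cx, cy) the principal point.

    Standard forward cylindrical warp:
        x' = f * atan((x - cx) / f)
        y' = f * (y - cy) / sqrt((x - cx)^2 + f^2)
    """
    dx, dy = x - cx, y - cy
    xp = f * math.atan2(dx, f)           # arc length along the cylinder
    yp = f * dy / math.hypot(dx, f)      # vertical foreshortening
    return xp + cx, yp + cy
```

Points far from the vertical centerline are pulled inward horizontally and compressed vertically, which is why the same object photographed from different angles acquires a matching size after mapping.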
  • the left edge of the baseline image in the first frame is the left edge of the entire panoramic image. See Figure 10.
  • the part of the first-frame middle baseline image that lies within the cropping range is the initial middle baseline image stitching result RI1, that is, the initial target image RI1 obtained by stitching as the deviation indicator arrow moves along the middle sub-path.
  • the mobile phone displays the middle baseline image stitching result RI1 in the stitching preview window.
  • for the i-th frame middle baseline image I_i (i > 1), the mobile phone extracts the feature points of the mapped image I_i′ and matches them with those of I_{i-1}′; the mobile phone then further calculates the homography matrix H according to the matching result. See (a) in Figure 12: the mobile phone maps I_i′ toward I_{i-1}′ according to H, crops the rectangular part of the mapped image that lies within the cropping range (that is, the rectangular part filled with horizontal lines in (a) in Figure 12), and splices it to the right side of the middle baseline image stitching result RI(i-1) (that is, the rectangular part filled with stripes in (a) in Figure 12), thereby forming the middle baseline image stitching result RIi, that is, the target image RIi obtained by stitching when the deviation indicator arrow moves along the middle sub-path.
  • the cropping range corresponding to the middle baseline image is the range defined by the second cropping line, the third cropping line, the left boundary line, and the right boundary line.
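The crop-and-splice step reduces, for each new frame, to taking the rows of the mapped frame between the two cropping lines and appending the not-yet-covered columns to the running result. A minimal numpy sketch; the fixed pixel-row crop and the name `splice_band` are illustrative simplifications (the real cropping range follows the curved cropping lines on the cylinder):

```python
import numpy as np

def splice_band(result, mapped_frame, row_top, row_bottom, overlap_cols):
    """Crop the rectangular part of the mapped frame lying inside the
    cropping range [row_top, row_bottom) and splice its new columns to
    the right edge of the running stitching result RI(i-1)."""
    band = mapped_frame[row_top:row_bottom, overlap_cols:]
    return np.hstack([result, band])
```

In practice the overlap region would be blended rather than dropped, but the growth of the panorama strip is the same.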
  • the mid-baseline image stitching result RIi displayed in the stitching preview window of the mobile phone can be seen in (b) in FIG. 12.
  • the user continues to turn the phone along the middle sub-path.
  • the mobile phone repeats the above collection, mapping, and splicing process until β > β_r.
  • all the mid-baseline image frames collected by the mobile phone along the middle sub-path are stitched, thereby generating a middle-baseline image stitching result RI, that is, the stitching result RI corresponding to the middle sub-path.
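The homography H used throughout this stitching loop is computed from the feature-match result between consecutive mapped frames. The patent does not specify the solver; a common choice is the direct linear transform (DLT), sketched below with numpy. Real pipelines would wrap this in RANSAC to reject bad matches:

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Estimate the 3x3 homography H with dst ~ H @ src from >= 4
    matched point pairs, via the direct linear transform (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        # Each correspondence contributes two rows of the linear system.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # H (flattened) is the right singular vector of A with the smallest
    # singular value, i.e. the (approximate) null vector of A.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]          # normalize so H[2, 2] == 1
```

Warping the new frame by H before cropping is what aligns it with the already-stitched result.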
  • the right boundary line may be used as the right boundary of the stitching result, and the part beyond the right boundary may be cut off.
  • the mobile phone can select key frames according to a preset algorithm, combined with the rotation angle β corresponding to each mid-baseline image, so that these key frames partially overlap and are basically evenly distributed.
  • These key frames can make the upper and lower baseline image stitching results and the middle baseline image stitching results better match and merge.
  • the interval Δβ of the rotation angle around the y-axis between adjacent key frames can be flexibly set according to factors such as the size of the buffer or actual requirements.
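Selecting key frames at a roughly constant Δβ spacing can be sketched as a greedy scan over the frame sequence. The `(frame_id, beta)` layout and the greedy rule are illustrative assumptions:

```python
def select_key_frames(frames, delta_beta):
    """Pick key frames from the mid-baseline sequence so that adjacent
    key frames are at least delta_beta apart in rotation angle beta.

    frames: list of (frame_id, beta) sorted by beta. Keeping the spacing
    near delta_beta makes the key frames overlap partially and spread
    roughly evenly along the middle sub-path.
    """
    if not frames:
        return []
    keys = [frames[0]]
    for fid, beta in frames[1:]:
        if beta - keys[-1][1] >= delta_beta:
            keys.append((fid, beta))
    return keys
```

A smaller Δβ gives denser key frames (better registration anchors for the upper/lower rows) at the cost of buffer memory.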
  • FIG. 13 for a schematic diagram of the key frames acquired in the mid-baseline image.
  • the rotation angle β changes continuously with the rotation of the mobile phone.
  • the mobile phone displays the middle sub-path, the right sub-path, the upper sub-path, the left sub-path, and the lower sub-path that have not been photographed.
  • in some technical solutions, see (a) in FIG. 14A, the mobile phone only displays the currently targeted middle sub-path; see (b) in FIG. 14A, it then displays the right sub-path when it is about to switch to the right sub-path.
  • the mobile phone displays the complete shooting path on the shooting interface, and the guide path for the completed shooting is a dotted line, and the guide path for the unfinished shooting is a solid line. In other embodiments, the mobile phone still displays the complete guidance path.
  • the mobile phone can prompt the user about the shooting order of the different sub-paths in the first direction by means of the sub-paths in the second direction.
  • the mobile phone can prompt the user to take pictures along the different sub-paths by means of display information or voice broadcast. A description will be given by taking the guide path as the path shown in (a) in FIG. 4B as an example.
  • the mobile phone displays the middle sub-path, upper sub-path and lower sub-path that have not been photographed.
  • when the deviation indicator arrow reaches the end of the middle sub-path, the mobile phone highlights the upper sub-path to prompt the user to next turn the phone along the upper sub-path to shoot.
  • deviation range indication lines are displayed on both sides of the middle sub-path.
  • during shooting along the middle sub-path, that is, when the target sub-path is the middle sub-path, if θ_3 ≤ α ≤ θ_2, the deviation indicator arrow is located within the maximum deviation range of the middle sub-path; if α > θ_2 or α < θ_3, the deviation indicator arrow exceeds the maximum deviation range of the middle sub-path, and the mobile phone stops shooting.
  • the image collected by the mobile phone may not include the complete cropping range.
  • the size of the image retained after cropping according to the preset cropping range is then smaller than the cropping range, that is, smaller than the size at which the other captured images are cropped and retained. After such a retained image is spliced into the panoramic image, the panoramic image will have blank parts, resulting in a poor splicing effect. Therefore, the mobile phone can stop the shooting process.
  • the mobile phone may also prompt the user on the shooting interface or by voice or other means that the camera has automatically stopped shooting if the maximum deviation range is exceeded.
  • the target sub-path is the right sub-path
  • when the deviation indicator arrow reaches the end of the middle sub-path, and the rotation angle of the phone satisfies β > β_r or β_3 ≤ β ≤ β_4, the image on the middle sub-path has been taken, and the user is guided to turn the phone upward along the right sub-path, making the deviation indicator arrow move upward along the right sub-path. While the deviation indicator arrow moves upward along the right sub-path, the rotation angle α changes continuously with the rotation of the mobile phone. In some technical solutions, referring to FIG. 15, the mobile phone displays the unfinished right sub-path 1501, the upper sub-path, the left sub-path, and the lower sub-path.
  • the dotted lines on both sides of the right sub-path 1501 represent the deviation range indicator line
  • the arrow 1502 represents the deviation indicator arrow.
  • the mobile phone only displays the currently targeted right sub-path.
  • the mobile phone displays the complete shooting path on the shooting interface, and the guide path for the completed shooting is a dotted line, and the guide path for the unfinished shooting is a solid line. In other embodiments, the mobile phone still displays the complete guidance path.
  • the mobile phone may not perform image stitching.
  • the target sub-path is the upper sub-path, and the baseline image is spliced
  • when the deviation indicator arrow reaches the end of the right sub-path, and the rotation angle of the mobile phone satisfies α > θ_t or θ_2 ≤ α ≤ θ_1, the user is guided to turn the mobile phone to the left along the upper sub-path, so that the deviation indicator arrow moves to the left along the upper sub-path.
  • the image collected by the mobile phone can be called the upper baseline image or the image of the upper sub-path, and the target sub-path is the upper sub-path. It is understandable that the first frame of the upper baseline image taken by the mobile phone along the upper sub-path is also the last frame of the image taken along the right sub-path.
  • the rotation angle β changes continuously with the rotation of the mobile phone.
  • the mobile phone displays the upper sub-path 1601, the left sub-path, and the lower sub-path that have not been photographed.
  • the dotted lines on both sides of the upper sub-path 1601 represent the deviation range indicator line
  • the arrow 1602 represents the deviation indicator arrow.
  • the mobile phone only displays the currently targeted upper sub-path.
  • the mobile phone displays the complete shooting path on the shooting interface, and the guide path for the completed shooting is a dotted line, and the guide path for the unfinished shooting is a solid line. In other embodiments, the mobile phone still displays the complete guidance path.
  • the deviation range indicator lines are displayed on both sides of the upper sub-path.
  • if θ_2 ≤ α ≤ θ_1, the deviation indicator arrow is located within the maximum deviation range of the upper sub-path; if α > θ_1 or α < θ_2, the deviation indicator arrow exceeds the maximum deviation range of the upper sub-path, and the mobile phone stops shooting.
  • the mobile phone may also prompt the user on the shooting interface or by voice or other means that the camera has automatically stopped shooting if the maximum deviation range is exceeded.
  • the mobile phone stitches the upper baseline image, which specifically includes processes such as cylindrical surface mapping, feature extraction, feature matching, and image stitching.
  • the mobile phone determines, from the above key frames according to a preset algorithm, a target key frame whose rotation angle β is closest to the rotation angle β_1 corresponding to the first-frame upper baseline image A_1, as the reference frame G_f1; that is, G_f1 is a key frame matching A_1.
  • among the key frames, the difference between the rotation angles around the y-axis corresponding to G_f1 and A_1 is the smallest.
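The reference-frame choice described above (the key frame whose rotation angle around the y-axis is nearest to that of the current frame) amounts to a nearest-neighbor lookup over β. A one-function Python sketch, with an illustrative `(frame_id, beta)` layout:

```python
def closest_key_frame(key_frames, beta_target):
    """Return the key frame whose rotation angle beta is nearest to
    beta_target; this plays the role of the reference frame G_f."""
    return min(key_frames, key=lambda kf: abs(kf[1] - beta_target))
```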
  • the mobile phone extracts the feature points F_{A,f1} of the image G_f1′ obtained after the cylindrical surface mapping of G_f1.
  • the mobile phone maps the first-frame upper baseline image A_1 to the cylindrical surface according to the above Equations 2–4 to obtain the first-frame upper baseline image A_1′, and extracts the feature points F_{A,1} of A_1′.
  • the mobile phone calculates the matching result of F A,1 and F A,f1 , and calculates the homography matrix H according to the matching result.
  • the H matrix obtained according to the target key frame is more accurate, so the upper baseline image can be better registered with the key frame in the mid-baseline image and thus matched with the mid-baseline splicing result.
  • according to the matrix H, the mobile phone maps A_1′ toward G_f1′.
  • the mobile phone crops the rectangular part of the mapped A_1′ that lies within the cropping range and stitches it to the upper right of the middle baseline image stitching result RI, thereby forming the upper baseline image stitching result RA1 (that is, the horizontal-line-filled part), that is, the target image RA1 obtained by stitching when the deviation indicator arrow moves along the upper sub-path.
  • the cropping range corresponding to the upper baseline image is the range defined by the first cropping line, the second cropping line, the left boundary line, and the right boundary line.
  • for the i-th frame (i is an integer greater than 1) upper baseline image A_i, the mobile phone determines, from the above key frames according to a preset algorithm, a target key frame whose rotation angle β is closest to the rotation angle β_i of A_i, as the reference frame G_fi; that is, G_fi is a key frame that matches A_i.
  • the mobile phone extracts the feature points F_{A,fi} of the image G_fi′ obtained after the cylindrical surface mapping of G_fi.
  • the mobile phone maps A_i to the cylindrical surface according to the above Equations 2–4 to obtain the i-th-frame upper baseline image A_i′, and extracts its feature points F_{A,i}.
  • the mobile phone can also extract the feature points F_{A,i-1} of the image A_{i-1}′ obtained after the (i−1)-th-frame upper baseline image A_{i-1} is mapped to the cylindrical surface.
  • the mobile phone calculates the matching results of F_{A,i}, F_{A,i-1}, and F_{A,fi} to calculate the homography matrix H.
  • the H matrix obtained in this way is more accurate, and the current frame can be better registered with the key frame while being registered with the previous-frame upper baseline image. According to the matrix H, the mobile phone maps A_i′ toward A_{i-1}′ and G_fi′.
  • the mobile phone crops the rectangular part of the mapped A_i′ that lies within the cropping range and stitches it to the left of the upper baseline image stitching result RA(i-1) and above the middle baseline image stitching result RI, thereby forming the upper baseline image stitching result RAi (that is, the horizontal-line-filled part), that is, the target image RAi obtained by stitching when the deviation indicator arrow moves along the upper sub-path.
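One way to realize "one H registered against both the previous frame and the key frame" is to stack both sets of correspondences into a single DLT system, so the least-squares solution balances the two references. The patent does not spell out its solver, so this numpy sketch is an assumption:

```python
import numpy as np

def homography_from_two_refs(prev_matches, key_matches):
    """Solve one homography from correspondences against two references.

    prev_matches / key_matches: lists of ((x, y), (u, v)) pairs mapping
    points of the current frame into panorama coordinates, obtained from
    the previous frame and from the mid-baseline key frame respectively.
    Stacking both sets couples the new frame to the previous frame AND
    to the key frame at once, which limits drift (global registration).
    """
    A = []
    for (x, y), (u, v) in list(prev_matches) + list(key_matches):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # H (flattened) is the null vector of A: the right singular vector
    # associated with the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]
```

Weighting the key-frame rows more heavily would pull the solution further toward the mid-baseline result; the even split here is just the simplest choice.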
  • the mobile phone displays the upper baseline image stitching result RAi that is stitched above the middle baseline image stitching result RI in the stitching preview window.
  • the user continues to turn the phone along the upper sub-path.
  • the mobile phone repeats the above collection, mapping, and splicing process until β ≤ 0.
  • all the upper baseline image frames collected by the mobile phone along the upper sub-path are spliced, thereby generating the upper baseline image splicing result RA, that is, the splicing result RA corresponding to the upper sub-path.
  • the left boundary line can be used as the left cropping boundary of the stitching result, and the part beyond the left boundary can be cut off.
  • the mobile phone registers the upper baseline images with the key frames in the middle baseline image, which can correct the misalignment error between the upper baseline image and the middle baseline splicing result in time, so that the upper baseline image and middle baseline image stitching results are accurately registered, achieving global registration.
  • the upper baseline image stitching result and the middle baseline image stitching result can form a smooth, natural transition overall image.
  • the target sub-path is the left sub-path
  • when the deviation indicator arrow reaches the end of the upper sub-path, and the rotation angle of the phone satisfies β < 0 or β_1 ≤ β ≤ β_2, the image on the upper sub-path has been taken, and the user is guided to turn the phone down along the left sub-path, making the deviation indicator arrow move down along the left sub-path. While the deviation indicator arrow moves down along the left sub-path, the rotation angle α changes continuously with the rotation of the mobile phone.
  • the mobile phone displays the left sub-path 1701 and the lower sub-path that have not been photographed. Among them, the dashed lines on both sides of the left sub-path 1701 represent the deviation range indicator line, and the arrow 1702 represents the deviation indicator arrow.
  • the mobile phone only displays the currently targeted left sub-path. In some other technical solutions, the mobile phone displays the complete shooting path on the shooting interface, and the guide path for the completed shooting is a dotted line, and the guide path for the unfinished shooting is a solid line. In other embodiments, the mobile phone still displays the complete guidance path.
  • the deviation range indicator line is displayed on both sides of the left sub-path.
  • if β_1 ≤ β ≤ β_2, the deviation indicator arrow is within the maximum deviation range of the left sub-path; if β < β_1 or β > β_2, the deviation indicator arrow exceeds the maximum deviation range of the left sub-path, and the phone stops shooting.
  • the target sub-path is the lower sub-path, and the lower baseline image is stitched
  • when the deviation indicator arrow reaches the end of the left sub-path, and the rotation angle of the phone satisfies α < θ_b or θ_4 ≤ α ≤ θ_3, the user is guided to turn the phone to the right along the lower sub-path, so that the deviation indicator arrow moves to the right along the lower sub-path.
  • the image collected by the mobile phone can be called the lower baseline image or the image of the lower sub-path, and the target sub-path is the lower sub-path. It is understandable that the first frame of the lower baseline image taken by the mobile phone along the lower sub-path is also the last frame of the image taken along the left sub-path.
  • the rotation angle β changes continuously with the rotation of the mobile phone.
  • the mobile phone displays the lower sub-path 1801 that has not been photographed.
  • the dashed lines on both sides of the lower sub-path 1801 represent the deviation range indicator line
  • the arrow 1802 represents the deviation indicator arrow.
  • the mobile phone only displays the currently targeted lower sub-path.
  • the mobile phone displays the complete shooting path on the shooting interface, and the guide path for the completed shooting is a dotted line, and the guide path for the unfinished shooting is a solid line. In other embodiments, the mobile phone still displays the complete guidance path.
  • the method of stitching the lower baseline image by the mobile phone is the same as that of stitching the upper baseline image.
  • the mobile phone determines, from the above key frames according to a preset algorithm, a target key frame whose rotation angle β is closest to the rotation angle β_1 corresponding to the first-frame lower baseline image, as the reference frame C_f1, and extracts the feature points F_{C,f1} of the image C_f1′ obtained after the cylindrical surface mapping of C_f1.
  • the mobile phone maps the first-frame lower baseline image B_1 to the cylindrical surface according to the above Equations 2–4 to obtain the first-frame lower baseline image B_1′, and extracts its feature points F_{B,1}.
  • the mobile phone calculates the matching result of F_{B,1} and F_{C,f1}, thereby calculating the homography matrix H according to the matching result. Then, according to the matrix H, the mobile phone maps B_1′ toward C_f1′.
  • the H matrix obtained by combining the target key frame is more accurate, which enables the lower baseline image to be better registered with the key frame, so that it can be registered with the middle baseline splicing result.
  • the mobile phone crops the rectangular part of the mapped B_1′ that lies within the cropping range and stitches it to the lower left of the middle baseline image stitching result RI, thereby forming the lower baseline image stitching result RB1, that is, the target image RB1 obtained by stitching when the deviation indicator arrow moves along the lower sub-path.
  • the cropping range corresponding to the lower baseline image is the range defined by the third cropping line, the fourth cropping line, the left boundary line, and the right boundary line.
  • the mobile phone displays the lower baseline image stitching result RB1 stitched below the middle baseline stitching result RI on the shooting interface, see (b) in FIG. 18.
  • the upper baseline splicing result RA is also spliced above the middle baseline splicing result RI.
  • for the i-th frame lower baseline image B_i, the mobile phone determines, from the above key frames according to a preset algorithm, a target key frame whose rotation angle β is closest to the rotation angle β_i of B_i, as the reference frame C_fi, and extracts the feature points F_{C,fi} of the image C_fi′ obtained after the cylindrical surface mapping of C_fi.
  • the mobile phone maps B_i to the cylindrical surface according to the above Equations 2–4 to obtain the i-th-frame lower baseline image B_i′ and its feature points F_{B,i}.
  • the mobile phone calculates the matching results of F_{B,i}, the feature points F_{B,i-1} of the image B_{i-1}′ obtained after the (i−1)-th-frame lower baseline image B_{i-1} is mapped to the cylindrical surface, and F_{C,fi}, thereby calculating the homography matrix H.
  • the H matrix obtained in this way is more accurate, and can be better registered with the key frame while registering with the baseline image of the previous frame.
  • according to the matrix H, the mobile phone maps B_i′ toward B_{i-1}′ and C_fi′. The mobile phone crops the rectangular part of the mapped B_i′ that lies within the cropping range and stitches it to the right of the lower baseline image stitching result RB(i-1), thereby forming the lower baseline image stitching result RBi, that is, the target image RBi obtained by stitching when the deviation indicator arrow moves along the lower sub-path.
  • the user continues to turn the phone along the sub-path below.
  • the mobile phone repeats the above collection, mapping, and splicing process until β > β_r.
  • all the lower baseline image frames collected by the mobile phone along the lower sub-path are stitched together, generating the lower baseline image stitching result RB, that is, the stitching result RB corresponding to the lower sub-path; the entire panoramic image stitching is then complete, and the whole shooting process ends.
  • the right boundary line may be used as the right boundary of the stitching result, and the part beyond the right boundary may be cut off.
  • a schematic diagram of the stitching result of the baseline image in the last frame see (c) in FIG. 18.
  • the mobile phone registers the key frames in the lower baseline image and the middle baseline image, which can correct the misalignment error between the lower baseline image and the middle baseline splicing result in time, so that the lower baseline image and the middle baseline image splicing result can be accurately matched.
  • the stitching result of the lower baseline image and the middle baseline image can form a smooth, natural transition overall image.
  • the panoramic image generated by the mobile phone may refer to the thumbnail 1901 shown in (a) in FIG. 19 and the thumbnail 1902 in the gallery shown in (b) in FIG. 19.
  • when the mobile phone detects the user's operation to stop shooting (for example, the user clicks the stop-shooting control 1803 shown in (b) in FIG. 18), the shooting of the panoramic image is stopped.
  • the mobile phone stops shooting.
  • during the shooting process, if the mobile phone detects the user's operation to stop shooting, it stops shooting.
  • if the target sub-path is the middle sub-path when the shooting is stopped, the obtained panoramic image is the middle baseline image stitching result at the moment the shooting stopped.
  • if the target sub-path is the right sub-path or the upper sub-path when the shooting is stopped, the panoramic image obtained by the mobile phone is the complete mid-baseline image stitching result. If the target sub-path is the left sub-path or the lower sub-path when the shooting is stopped, the panoramic image obtained by the mobile phone is the complete stitching result of the middle baseline image and the upper baseline image. In this way, it can be avoided that, in the panoramic image captured by the mobile phone, the field angle of the middle baseline image stitching result is larger while the field angle of the upper baseline image or lower baseline image stitching result is smaller, which would make the resulting panoramic image irregular and not neat.
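The early-stop rule can be summarized as a small dispatch on the target sub-path at the moment shooting stops. A Python sketch; the names and the grouping of the cases are a reading of the surrounding text, not verbatim from the patent:

```python
def result_on_stop(target_sub_path, partial_mid, RI, RA):
    """Which stitching result(s) to keep when the user stops early.

    partial_mid, RI, RA stand in for the actual image buffers: the
    partial mid-baseline result, the complete mid-baseline result RI,
    and the upper-baseline result RA. Only complete rows are kept so
    the panorama stays rectangular and regular.
    """
    if target_sub_path == "middle":
        return [partial_mid]
    if target_sub_path in ("right", "upper"):
        return [RI]
    return [RI, RA]             # stopped on the left or lower sub-path
```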
  • the mobile phone can also generate a video after stopping shooting, and the video image is the spliced image displayed in the splicing preview window during the shooting process.
  • the mobile phone can dynamically present the image splicing process during the shooting process to the user.
  • the mobile phone may also save the image sequence collected according to the guide path during the shooting process, and in response to the user's editing operation of the image sequence, the mobile phone may generate a panoramic image.
  • the mobile phone can expand the field of view of the panoramic image in the horizontal direction by splicing the upper baseline images, middle baseline images, or lower baseline images along the horizontal direction; by registering and stitching the upper baseline image with the middle baseline image, and registering and stitching the lower baseline image with the middle baseline image, it can also expand the field of view of the panoramic image in the vertical direction, thereby making the entire field of view of the panoramic image larger and improving the user's shooting experience.
  • cylindrical mapping can match the size and imaging characteristics of the same object after mapping on the image taken by the mobile phone from different angles, so that the images taken from different angles can be registered and stitched.
  • a panoramic image is generated, which can conform to the visual effect that the image size of each part of the panoramic image is basically the same.
  • the panoramic image is not a simple combination of the splicing results of the upper, middle, and lower baseline images.
  • each kind of baseline image stitching result is obtained by registration and stitching based on multiple homography matrices H, and each small part corresponds to a different homography matrix H; therefore, a single homography matrix H cannot be calculated between different kinds of baseline image stitching results that would make the parts of the two stitching results correspond and match well, so the upper, middle, and lower baseline image stitching results cannot simply be stitched together directly.
  • the upper baseline image and the lower baseline image are also registered and spliced according to the key frame of the middle baseline.
  • the misalignment error between the upper and lower baseline images and the middle baseline splicing result can be corrected in time, so that the upper baseline image, the lower baseline image, and the middle baseline image stitching results are accurately registered, achieving global registration.
  • the upper baseline image stitching results, the middle baseline image stitching results, and the lower baseline image stitching results can be better integrated to form an overall image with smooth, natural transitions.
  • the beginning of the guide path is the left end of the middle sub-path
  • the end of the guide path is the end of the lower sub-path.
  • guide paths with different starts and ends can be seen in (a)-(f) in FIG. 20. It is understandable that in case 1, guide paths other than the examples shown in FIG. 20 may also be included, which is not limited in the embodiments of the present application.
  • when the mobile phone is in a different state, such as vertical screen (portrait) or horizontal screen (landscape), the guide path changes accordingly with the state of the mobile phone.
  • the preview interface displayed by the mobile phone can be seen in FIG. 21.
  • the first direction is the vertical direction
  • the second direction is the horizontal direction
  • the guide path includes at least two sub-paths in the vertical direction.
  • the guide path may further include at least one horizontal sub-path for connecting the vertical sub-paths.
  • a schematic diagram of a preview interface displayed by the mobile phone can be seen in FIG. 22A.
  • the guide path can also take many different forms according to its start, end, or running direction. Exemplarily, guide paths with different starts and ends can be seen in (a)-(c) in FIG. 22B. It is understandable that in case 2, guide paths other than the examples illustrated in FIG. 22B may also be included, which is not limited in the embodiments of the present application.
  • the panoramic shooting method corresponding to case 2 is similar to that of case 1, and will not be repeated here. The difference is: in case 1, the mobile phone stitches the images according to three baselines in the horizontal direction, while in case 2, the mobile phone stitches the images according to three baselines in the vertical direction.
  • case 1 is suitable for shooting scenes with a large field of view in the horizontal direction
  • case 2 is suitable for shooting scenes with a large field of view in the vertical direction (for example, shooting high-rise buildings).
  • the schematic diagram of the image splicing result in the splicing preview window can be seen in FIG. 22C.
  • the mobile phone can expand the field of view of the panoramic image in the vertical direction by stitching the images collected along a certain vertical sub-path; by stitching together the images collected along different sub-paths, it can also expand the field of view of the panoramic image in the horizontal direction, thereby enlarging the overall field of view of the panoramic image and improving the user's shooting experience.
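As a rough back-of-the-envelope model of how stitching along multiple parallel sub-paths widens the view, the combined field of view in the direction perpendicular to the sub-paths can be estimated from the per-strip field of view and the overlap between adjacent strips. All numbers and names here are hypothetical illustrations, not values from the patent:

```python
def total_fov(strip_fov_deg, num_strips, overlap_deg):
    """Estimated panorama field of view perpendicular to the sub-paths.

    Each of num_strips strips contributes strip_fov_deg, minus the
    overlap it shares with each neighbour (illustrative model only).
    """
    return strip_fov_deg * num_strips - overlap_deg * (num_strips - 1)

# Three sub-paths of 40 degrees each with 10 degrees of overlap
# between neighbours yield roughly a 100-degree combined view.
fov = total_fov(40.0, 3, 10.0)
```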
  • the user can select or switch the guide path mode corresponding to case 1 or case 2 to capture the panoramic image according to the actual needs of the shooting scene.
  • the guide path may refer to (a) in FIG. 24, and the panoramic image obtained by shooting may refer to (b) in FIG. 24. It is understandable that when the start, end, or running direction is different, the guide path can also take many different forms.
  • when the guide path includes two sub-paths, although the guide path has one fewer sub-path in the first direction, it can still expand the field of view in the second direction and simplify the shooting process of the panoramic image.
  • the guide path may also include three or more sub-paths in the first direction
  • the guide path may also include multiple sub-paths in the second direction for connecting the sub-paths in the first direction.
  • the guide path including 4 sub-paths in the horizontal direction can be seen in (a)-(b) in FIG. 25, and the panoramic image obtained by shooting can be seen in (c) in FIG. 25.
  • the mobile phone can expand the range of the field of view in the second direction perpendicular to the first direction.
  • the electronic device is a mobile phone.
  • the method described in the above embodiment can also be used to perform panoramic shooting, which will not be repeated here.
  • the electronic device includes hardware and/or software modules corresponding to each function.
  • the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a certain function is executed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functions for each specific application in combination with the embodiments, but such implementation should not be considered as going beyond the scope of the present application.
  • the electronic device can be divided into functional modules according to the foregoing method examples.
  • each functional module can be divided corresponding to each function, or two or more functions can be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware. It should be noted that the division of modules in this embodiment is illustrative, and is only a logical function division, and there may be other division methods in actual implementation.
  • the electronic device 2600 may include a camera 2601, an ISP 2602, an input buffer unit 2603, a calculation processing unit 2604, an inertial measurement unit (IMU) 2605, an output buffer unit 2606, an encoder 2607, a display unit 2608, and other units/modules.
  • the ISP processes the image frames collected by the camera and outputs them to the input buffer unit.
  • the calculation processing unit performs corresponding cropping and splicing processing on the image data in the input buffer unit according to the data of the IMU, and outputs the processing result to the output buffer unit.
  • the display unit displays the interface and guide information according to the processing result in the output buffer unit.
  • the encoder encodes the image data in the processing result and outputs it to a gallery or other applications.
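The data flow described above (camera → ISP → input buffer → calculation processing unit → output buffer → display/encoder) can be sketched as a simple producer-consumer chain. The buffer and function names below are illustrative assumptions, not identifiers from the patent:

```python
from collections import deque

input_buffer: deque = deque()   # filled by the ISP with processed frames
output_buffer: deque = deque()  # read by the display unit and the encoder

def isp_deliver(frame):
    """ISP pushes each processed camera frame into the input buffer."""
    input_buffer.append(frame)

def processing_step(stitch):
    """Calculation processing unit consumes one frame, applies the
    cropping/stitching operation (a placeholder callable here), and
    emits the result to the output buffer."""
    if input_buffer:
        output_buffer.append(stitch(input_buffer.popleft()))

isp_deliver("frame0")
processing_step(lambda f: f + "|stitched")
```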
  • the calculation processing unit initializes various parameters in the scene, including: the rotation angles θt and θb around the x-axis corresponding to the upper and lower baselines (the middle baseline rotation angle θ is 0); the rotation angles θ1, θ2, θ3, θ4 corresponding to the first, second, third, and fourth cropping boundaries; the rotation angle αr around the y-axis corresponding to the right baseline (the left baseline rotation angle α is 0); the rotation angles α1, α2, α3, α4 corresponding to the left border, the left cropping line, the right cropping line, and the right border; and the key frame interval Δ, etc.
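The initialization parameters listed above could be gathered in a small container such as the following sketch. The field names (theta for x-axis angles, alpha for y-axis angles) and every default value are hypothetical placeholders chosen for illustration, not values or identifiers from the patent:

```python
from dataclasses import dataclass

@dataclass
class PanoInitParams:
    """Illustrative parameter set for the scene initialization step."""
    theta_t: float = 20.0             # x-axis rotation, upper baseline (deg, hypothetical)
    theta_b: float = -20.0            # x-axis rotation, lower baseline (deg, hypothetical)
    crop_thetas: tuple = (15.0, 10.0, -10.0, -15.0)  # 1st-4th cropping boundaries (hypothetical)
    alpha_r: float = 120.0            # y-axis rotation, right baseline (deg, hypothetical)
    crop_alphas: tuple = (0.0, 5.0, 115.0, 120.0)    # left border/crop line, right crop line/border
    key_frame_interval: float = 10.0  # key-frame spacing (deg, hypothetical)
    # The middle baseline angle theta and the left baseline angle alpha are 0 by definition.

params = PanoInitParams()
```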
  • the calculation processing unit determines the position of the guidance information according to the rotation angle.
  • the display screen can display a preview interface as shown in (b) of FIG. 4A, and the preview interface includes guidance information.
  • the ISP processes the image frame received by the camera and sends it to the calculation processing unit.
  • the calculation processing unit refreshes the interface in real time according to the shooting process, and the display screen displays the shooting interface as shown in (a) in FIG. 6.
  • the calculation processing unit stitches the middle baseline images collected by the camera.
  • the output buffer unit buffers the key frames and the splicing result, and the display screen displays the splicing result.
  • the display screen displays the shooting interface shown in FIG. 15 to guide the user to rotate the electronic device upwards.
  • the display screen displays the shooting interface shown in FIG. 16A to guide the user to turn the electronic device to the left, and the calculation processing unit stitches the upper baseline images collected by the camera according to the key frames.
  • the output buffer unit buffers the splicing result, and the display screen displays the splicing result.
  • the display screen displays the shooting interface as shown in FIG. 17 to guide the user to turn the electronic device downward.
  • when the rotation angle θ of the electronic device reaches θb, the display screen displays the corresponding shooting interface.
  • An embodiment of the present application also provides an electronic device, including: a camera for collecting images; a display screen for displaying an interface; one or more processors and one or more memories.
  • the one or more memories are coupled with one or more processors, and the one or more memories are used to store computer program codes.
  • the computer program codes include computer instructions.
  • when the one or more processors execute the computer instructions, the electronic device executes the above-mentioned related method steps to implement the panoramic shooting method in the above-mentioned embodiments.
  • An embodiment of the present application also provides an electronic device including one or more processors and one or more memories.
  • the one or more memories are coupled with one or more processors, and the one or more memories are used to store computer program codes.
  • the computer program codes include computer instructions.
  • when the one or more processors execute the computer instructions, the electronic device executes the above-mentioned related method steps to implement the panoramic shooting method in the above-mentioned embodiments.
  • the embodiments of the present application also provide a computer-readable storage medium that stores computer instructions.
  • when the computer instructions run on an electronic device, the electronic device executes the above-mentioned related method steps to implement the panoramic shooting method in the above-mentioned embodiments.
  • the embodiments of the present application also provide a computer program product; when the computer program product runs on a computer, it causes the computer to execute the above-mentioned related steps, so as to realize the panoramic shooting method executed by the electronic device in the above-mentioned embodiments.
  • the embodiments of the present application also provide a device.
  • the device may specifically be a chip, component, or module.
  • the device may include a processor and a memory connected to each other.
  • the memory is used to store computer execution instructions.
  • the processor can execute the computer-executable instructions stored in the memory, so that the chip executes the panoramic shooting method executed by the electronic device in the foregoing method embodiments.
  • the electronic device, computer-readable storage medium, computer program product, and chip provided in this embodiment are all used to execute the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, which will not be repeated here.
  • the disclosed device and method can be implemented in other ways.
  • the device embodiments described above are merely illustrative.
  • the division of the modules or units is only a logical function division.
  • there may be other division methods in actual implementation; for example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate parts may or may not be physically separate.
  • the parts displayed as units may be one physical unit or multiple physical units; that is, they may be located in one place or distributed to multiple different places. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a readable storage medium.
  • the technical solutions of the embodiments of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solutions, can be embodied in the form of a software product stored in a storage medium, which includes several instructions to enable a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application.
  • the aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media that can store program code.

Abstract

The embodiments of the present application relate to the field of electronic technology and provide a panoramic image capture method and device, which can stitch images acquired from different angles in two mutually perpendicular directions to generate a panoramic image, so that the field of view of the panoramic image can be expanded in both directions, improving the user's panoramic shooting experience. The specific solution is: an electronic device enters a panoramic capture mode of a camera application; the electronic device displays first guidance information on a preview interface, the first guidance information includes a first guide path, the first guide path includes at least two sub-paths that are arranged in a first direction and parallel to each other, the first direction is parallel to a side edge of the electronic device, and the first guide path is used to guide a user to rotate the electronic device along the first guide path during the capture process. The embodiments of the present application are used for capturing panoramic images.
PCT/CN2021/078666 2020-05-29 2021-03-02 Procédé et dispositif de capture d'image panoramique WO2021238317A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010478652.1A CN113747044B (zh) 2020-05-29 2020-05-29 一种全景拍摄方法及设备
CN202010478652.1 2020-05-29

Publications (1)

Publication Number Publication Date
WO2021238317A1 true WO2021238317A1 (fr) 2021-12-02

Family

ID=78724968

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/078666 WO2021238317A1 (fr) 2020-05-29 2021-03-02 Procédé et dispositif de capture d'image panoramique

Country Status (2)

Country Link
CN (1) CN113747044B (fr)
WO (1) WO2021238317A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117135259A (zh) * 2023-04-11 2023-11-28 荣耀终端有限公司 摄像头的切换方法及电子设备

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114827472B (zh) * 2022-04-29 2023-05-30 北京城市网邻信息技术有限公司 全景拍摄方法、装置、电子设备及存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101964869A (zh) * 2009-07-23 2011-02-02 华晶科技股份有限公司 全景图像的引导拍摄方法
CN102201115A (zh) * 2011-04-07 2011-09-28 湖南天幕智能科技有限公司 无人机航拍视频实时全景图拼接方法
CN103430530A (zh) * 2011-03-30 2013-12-04 Nec卡西欧移动通信株式会社 成像设备、用于成像设备的拍摄向导显示方法和非暂时性计算机可读介质
CN105957008A (zh) * 2016-05-10 2016-09-21 厦门美图之家科技有限公司 基于移动终端的全景图像实时拼接方法及系统
US9716828B2 (en) * 2013-08-28 2017-07-25 Samsung Electronics Co., Ltd. Method for shooting image and electronic device thereof
CN107545538A (zh) * 2016-06-24 2018-01-05 清华大学深圳研究生院 一种基于无人机的全景图像拼接方法及装置
US20190182422A1 (en) * 2017-12-11 2019-06-13 Canon Kabushiki Kaisha Image capturing apparatus and control method for image capturing apparatus

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103176347B (zh) * 2011-12-22 2016-07-27 百度在线网络技术(北京)有限公司 全景图拍摄方法及拍摄装置和电子设备
KR102021857B1 (ko) * 2013-07-23 2019-09-17 엘지전자 주식회사 이동 단말기 및 그의 파노라마 촬영방법
CN104394321B (zh) * 2014-11-28 2017-05-24 广东欧珀移动通信有限公司 移动终端及移动终端的成像方法
CN105657257B (zh) * 2015-12-29 2018-07-17 广东欧珀移动通信有限公司 全景照片的拍摄方法、装置、系统、移动终端及自拍杆
CN110012209B (zh) * 2018-01-05 2020-08-14 Oppo广东移动通信有限公司 全景图像生成方法、装置、存储介质及电子设备
CN108259762A (zh) * 2018-03-23 2018-07-06 南京嘉码信息科技有限公司 一种漫游式全景图自动拍摄系统及方法
CN109087244B (zh) * 2018-07-26 2023-04-18 深圳禾苗通信科技有限公司 一种全景图像拼接方法、智能终端及存储介质
CN110505401A (zh) * 2019-08-16 2019-11-26 维沃移动通信有限公司 一种摄像头控制方法及电子设备
CN110675319B (zh) * 2019-09-12 2020-11-03 创新奇智(成都)科技有限公司 一种基于最小生成树的手机拍照全景图像拼接方法

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101964869A (zh) * 2009-07-23 2011-02-02 华晶科技股份有限公司 全景图像的引导拍摄方法
CN103430530A (zh) * 2011-03-30 2013-12-04 Nec卡西欧移动通信株式会社 成像设备、用于成像设备的拍摄向导显示方法和非暂时性计算机可读介质
CN102201115A (zh) * 2011-04-07 2011-09-28 湖南天幕智能科技有限公司 无人机航拍视频实时全景图拼接方法
US9716828B2 (en) * 2013-08-28 2017-07-25 Samsung Electronics Co., Ltd. Method for shooting image and electronic device thereof
CN105957008A (zh) * 2016-05-10 2016-09-21 厦门美图之家科技有限公司 基于移动终端的全景图像实时拼接方法及系统
CN107545538A (zh) * 2016-06-24 2018-01-05 清华大学深圳研究生院 一种基于无人机的全景图像拼接方法及装置
US20190182422A1 (en) * 2017-12-11 2019-06-13 Canon Kabushiki Kaisha Image capturing apparatus and control method for image capturing apparatus

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117135259A (zh) * 2023-04-11 2023-11-28 荣耀终端有限公司 摄像头的切换方法及电子设备

Also Published As

Publication number Publication date
CN113747044A (zh) 2021-12-03
CN113747044B (zh) 2023-05-02

Similar Documents

Publication Publication Date Title
CN110555883B (zh) 相机姿态追踪过程的重定位方法、装置及存储介质
WO2021238325A1 (fr) Procédé et appareil de traitement d'images
WO2022062318A1 (fr) Procédé et dispositif de photographie
KR102222073B1 (ko) 촬영 방법 및 전자 장치
WO2019134516A1 (fr) Procédé et dispositif de génération d'image panoramique, support d'informations et appareil électronique
WO2017088678A1 (fr) Appareil et procédé de prise d'image panoramique à exposition prolongée
WO2022022715A1 (fr) Procédé et dispositif photographique
WO2021223500A1 (fr) Procédé et dispositif photographique
WO2013015147A1 (fr) Système de traitement d'image, dispositif de traitement d'informations, programme et procédé de traitement d'image
WO2021238317A1 (fr) Procédé et dispositif de capture d'image panoramique
CN110636276B (zh) 视频拍摄方法、装置、存储介质及电子设备
US8400532B2 (en) Digital image capturing device providing photographing composition and method thereof
CN114390213B (zh) 一种拍摄方法及设备
WO2021244104A1 (fr) Procédé de photographie à laps de temps et dispositif
WO2022022726A1 (fr) Procédé et dispositif de capture d'image
CN114339102B (zh) 一种录像方法及设备
WO2017054185A1 (fr) Procédé, dispositif et terminal pour afficher un contenu visuel panoramique
WO2021185374A1 (fr) Procédé de capture d'image et dispositif électronique
US11657477B2 (en) Image processing device, image processing system, imaging device, image processing method, and recording medium storing program code
CN114390186B (zh) 视频拍摄方法及电子设备
WO2023165535A1 (fr) Procédé et appareil de traitement d'image et dispositif
WO2023231697A1 (fr) Procédé pour photographier et dispositif associé
US8665317B2 (en) Imaging apparatus, imaging method and recording medium
CN116051361B (zh) 图像维测数据的处理方法及装置
CN114898084B (zh) 视觉定位方法、设备和存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21813319

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21813319

Country of ref document: EP

Kind code of ref document: A1