CN113947522A - Panoramic image synthesis method and device and electronic equipment - Google Patents

Info

Publication number
CN113947522A
Authority
CN
China
Prior art keywords
target
information
point
specific point
azimuth information
Prior art date
Legal status
Pending
Application number
CN202111203488.4A
Other languages
Chinese (zh)
Inventor
陈映宜
Current Assignee
Beijing Youzhuju Network Technology Co Ltd
Original Assignee
Beijing Youzhuju Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Youzhuju Network Technology Co Ltd
Priority to CN202111203488.4A
Publication of CN113947522A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00: Geometric image transformations in the plane of the image
    • G06T3/04: Context-preserving transformations, e.g. by using an importance map
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00: Geometric image transformations in the plane of the image
    • G06T3/40: Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038: Image mosaicing, e.g. composing plane images from plane sub-images
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00: Indexing scheme for image data processing or generation, in general
    • G06T2200/04: Indexing scheme for image data processing or generation, in general involving 3D image data

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

Embodiments of the present disclosure disclose a panoramic image synthesis method and apparatus, and an electronic device. One embodiment of the method comprises: receiving transmission data of a three-dimensional scanner, wherein the transmission data comprises a target number of base images captured by the three-dimensional scanner, and orientation information, collected by the three-dimensional scanner, of the actual position indicated by a specific point in each base image; and synthesizing, based on the target number of base images, a panoramic image that contains the target number of specific points and in which the actual position indicated by each specific point has orientation information. This embodiment can enrich the information contained in the synthesized panoramic image.

Description

Panoramic image synthesis method and device and electronic equipment
Technical Field
Embodiments of the present disclosure relate to the field of computer technology, and in particular to a panoramic image synthesis method and apparatus, and an electronic device.
Background
The panoramic image can present content to the user in a three-dimensional visual effect. In some situations, it is important to present content to the user through the panoramic image. Accordingly, various methods of synthesizing a panoramic image have emerged.
In the related art, photographed images are directly synthesized into a panoramic image.
Disclosure of Invention
This disclosure is provided to introduce concepts in a simplified form that are further described below in the detailed description. This disclosure is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
The embodiment of the disclosure provides a panoramic image synthesis method, a panoramic image synthesis device and electronic equipment, which can enrich information contained in a synthesized panoramic image.
In a first aspect, an embodiment of the present disclosure provides a panoramic image synthesis method, including: receiving transmission data of a three-dimensional scanner, wherein the transmission data includes a target number of base images captured by the three-dimensional scanner, and orientation information, collected by the three-dimensional scanner, of the actual position indicated by a specific point in each base image; and synthesizing, based on the target number of base images, a panoramic image that contains the target number of specific points and in which the actual position indicated by each specific point has orientation information.
In a second aspect, an embodiment of the present disclosure provides a panoramic image synthesis apparatus, including: a receiving unit configured to receive transmission data of a three-dimensional scanner, where the transmission data includes a target number of base images captured by the three-dimensional scanner, and orientation information, collected by the three-dimensional scanner, of the actual position indicated by a specific point in each base image; and a synthesizing unit configured to synthesize, based on the target number of base images, a panoramic image that contains the target number of specific points and in which the actual position indicated by each specific point has orientation information.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: one or more processors; storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the panoramic image synthesis method according to the first aspect.
In a fourth aspect, an embodiment of the present disclosure provides a computer-readable medium on which a computer program is stored, which when executed by a processor, implements the steps of the panoramic image synthesis method according to the first aspect.
The panoramic image synthesis method and apparatus and the electronic device provided by the embodiments of the present disclosure can receive a target number of base images captured by a three-dimensional scanner, together with the collected orientation information of the actual position indicated by the specific point in each base image. Further, a panoramic image containing the target number of specific points, in which the actual position indicated by each specific point has orientation information, can be synthesized based on the target number of base images. This makes it possible to enrich the information contained in the synthesized panoramic image.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
FIG. 1 is a flow diagram of some embodiments of a panoramic image synthesis method of the present disclosure;
FIG. 2 is a flow diagram of a three-dimensional scanner capturing base images and collecting orientation information in some embodiments of the panoramic image synthesis method of the present disclosure;
FIG. 3 is a flow diagram of presenting target orientation information in some embodiments of the panoramic image synthesis method of the present disclosure;
FIG. 4 is a flow diagram of presenting a target local image in some embodiments of the panoramic image synthesis method of the present disclosure;
FIG. 5 is a schematic structural diagram of some embodiments of a panoramic image synthesis apparatus of the present disclosure;
FIG. 6 is an exemplary system architecture to which the panoramic image synthesis method of the present disclosure may be applied in some embodiments;
FIG. 7 is a schematic diagram of the basic structure of an electronic device provided in accordance with some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It should be noted that the modifiers "a," "an," and "the" in the present disclosure are intended to be illustrative rather than restrictive; those skilled in the art should understand that, unless the context clearly indicates otherwise, they should be read as "one or more."
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
Referring to fig. 1, a flow of some embodiments of a panoramic image synthesis method according to the present disclosure is shown. As shown in fig. 1, the panoramic image synthesis method includes the following steps:
step 101, receiving transmission data of a three-dimensional scanner.
The transmission data is data transmitted by the three-dimensional scanner to the terminal device. The transmission data includes: a target number of base images captured by the three-dimensional scanner; and orientation information, collected by the three-dimensional scanner, of the actual position indicated by the specific point in each base image.
Any point in the base image may indicate the corresponding actual position. The specific point is a point designated in advance from the base image. Alternatively, the specific point may be a central point in the base image.
The orientation information may be, for example, "due south," "due north," "due west," "due east," "30 degrees east of north," and the like. Orientation information thus characterizes the specific bearing of a position. As an example, if the orientation information of the actual position A indicated by the specific point in an image is "20 degrees east of north," then the actual position A lies in the direction 20 degrees east of north.
The three-dimensional scanner may stop at a target number of positions while rotating through 360 degrees, capture one base image at each stop, and collect the orientation information of the actual position indicated by the specific point in that base image.
The three-dimensional scanner may capture one base image at a time, collect the orientation information of the actual position indicated by the specific point in that base image, and immediately transmit the base image and orientation information to the terminal device. Alternatively, it may first capture the target number of base images and collect the orientation information of the actual position indicated by the specific point in each base image, and then transmit all the base images and orientation information to the terminal device together.
After the three-dimensional scanner transmits the data to the terminal device, the execution subject of the panoramic image synthesis method may receive the transmitted data.
Step 102, synthesizing, based on the target number of base images, a panoramic image that contains the target number of specific points and in which the actual position indicated by each specific point has orientation information.
The execution subject may synthesize, based on the target number of base images, a panoramic image containing the target number of specific points, where the actual position indicated by each specific point has orientation information.
As described above, the actual position indicated by the specific point in each base image has orientation information. Therefore, based on the target number of base images, a panoramic image can be synthesized that contains the target number of specific points, each of whose indicated actual positions has orientation information.
In this embodiment, the three-dimensional scanner may collect the orientation information of the actual position indicated by the specific point in a base image when capturing that base image. Further, based on the target number of base images captured by the three-dimensional scanner, a panoramic image containing the target number of specific points, each of whose indicated actual positions has orientation information, may be synthesized. This makes it possible to enrich the information contained in the synthesized panoramic image.
In some embodiments, the three-dimensional scanner has a camera for capturing base images and an electronic compass for collecting the orientation information of the actual position indicated by the specific point in a base image.
The three-dimensional scanner may capture the target number of base images and collect the orientation information of the actual position indicated by the specific point in each base image according to the process shown in fig. 2, which includes the following steps.
Step 201, in response to receiving a shooting instruction, sent by the terminal device, to capture a target number of base images, executing the shooting step. The shooting step comprises steps 2011-2012.
Step 2011, the ratio of 360 degrees to the target number is taken as the single rotation angle.
As an example, if the target number is 10, the single rotation angle is the ratio of 360 degrees to 10, i.e., 36 degrees.
Step 2012, while rotating through 360 degrees, after each single rotation angle, the camera is used to capture a base image, and the electronic compass is used to collect the orientation information of the actual position indicated by the specific point in that base image.
Thus, by issuing a shooting instruction to capture a target number of base images, the terminal device causes the three-dimensional scanner, while rotating through 360 degrees, to capture one base image at each fixed interval (i.e., each single rotation angle) and to collect the orientation information of the actual position indicated by the specific point in that base image. The terminal device can therefore control the three-dimensional scanner to capture base images and collect orientation information through a simple and convenient procedure.
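The shooting step above can be sketched in a few lines. This is a minimal illustration under stated assumptions: the scanner interface (`capture`, `read_compass`, `rotate_by`) is a hypothetical stand-in, not an API described in this disclosure, and headings are degrees clockwise from due north.

```python
def single_rotation_angle(target_count):
    """Step 2011: the single rotation angle is the ratio of 360 degrees
    to the target number of base images."""
    return 360.0 / target_count

def capture_sweep(scanner, target_count):
    """Step 2012 (sketch): during one 360-degree sweep, capture one base
    image and one compass reading (the orientation of the image's specific
    point) at each stop, then rotate by the single rotation angle.
    The scanner object is a hypothetical stand-in."""
    step = single_rotation_angle(target_count)
    frames = []
    for _ in range(target_count):
        image = scanner.capture()             # base image at this stop
        orientation = scanner.read_compass()  # orientation at the image's specific point
        frames.append((image, orientation))
        scanner.rotate_by(step)               # advance to the next stop
    return frames
```

With a target number of 10 this yields the 36-degree single rotation angle of the example in step 2011.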
In some embodiments, the specific point is a center point in the base image, and the electronic compass is disposed at a position corresponding to the center of the camera.
The three-dimensional scanner may use the electronic compass to collect the orientation information of the actual position indicated by the specific point in the base image.
Specifically, the orientation information collected by the electronic compass is taken as the orientation information of the actual position indicated by the specific point in the base image.
Because the electronic compass is disposed at the position corresponding to the center of the camera, the three-dimensional scanner can use the electronic compass to collect the orientation information of the actual position indicated by the center point in the base image.
In some embodiments, the executing subject may execute the panoramic image synthesis method according to a flow shown in fig. 3, which includes the following steps.
Step 301, receiving transmission data of a three-dimensional scanner.
Step 302, synthesizing, based on the target number of base images, a panoramic image that contains the target number of specific points and in which the actual position indicated by each specific point has orientation information.
In some scenarios, the execution manners of steps 301 to 302 are similar to the execution manners of steps 101 to 102 in the embodiment shown in fig. 1, and are not repeated here.
Step 303, in response to an orientation information viewing operation for an arbitrary point in the panoramic image, determining a first specific point and a second specific point located on either side of that point.
The orientation information viewing operation is used to view the target orientation information of the actual position indicated by the arbitrary point.
The panoramic image is synthesized from a target number of base images; therefore, a first specific point and a second specific point are distributed on either side of any point in the panoramic image.
The orientation information of the actual position indicated by the first specific point is the first orientation information, and the orientation information of the actual position indicated by the second specific point is the second orientation information.
Step 304, determining a first distance from the arbitrary point to the first specific point, and a second distance from the arbitrary point to the second specific point.
Step 305, determining, based on the first distance and the second distance, the target orientation information of the actual position indicated by the arbitrary point from the range of orientations between the first orientation information and the second orientation information.
Since the arbitrary point is located between the first specific point and the second specific point, the target orientation information lies between the first orientation information and the second orientation information.
Step 306, presenting the target orientation information.
In this way, the user can flexibly view the orientation information of the actual position indicated by any point in the panoramic image.
In some embodiments, the execution subject may determine the target orientation information of the actual position indicated by any point in the panoramic image in the following manner.
In a first step, the angle between the first orientation information and the second orientation information is divided, in proportion to the first distance and the second distance, into a first angle between the target orientation information and the first orientation information and a second angle between the target orientation information and the second orientation information.
It is understood that the proportion occupied by the first distance is the ratio of the first distance to the distance sum (i.e., the sum of the first distance and the second distance), and the proportion occupied by the second distance is the ratio of the second distance to the distance sum.
As an example, let the first distance between the arbitrary point and the first specific point be a, the second distance between the arbitrary point and the second specific point be b, and the angle between the first orientation information and the second orientation information be α. The execution subject may then divide the angle α into a first angle of α·a/(a+b) between the target orientation information and the first orientation information, and a second angle of α·b/(a+b) between the target orientation information and the second orientation information.
In a second step, the target orientation information is determined, based on the first angle and the second angle, from the range of orientations between the first orientation information and the second orientation information.
It can be understood that once the first angle (between the target orientation information and the first orientation information) and the second angle (between the target orientation information and the second orientation information) are determined, the target orientation information of the arbitrary point can be determined from the range of orientations between the first orientation information and the second orientation information.
As an example, suppose the first orientation information is due north, the second orientation information is due east, the first angle is 30 degrees, and the second angle is 60 degrees. The execution subject may then determine that the target orientation information is "30 degrees east of north."
In this way, the target orientation information of the actual position indicated by the arbitrary point is determined reasonably from the proportion between its distances to the two specific points distributed on either side, so the orientation information of the actual position indicated by any point in the panoramic image can be determined more accurately.
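The two-step interpolation above can be sketched as follows. This is one reading of steps 303-305 under assumptions not stated in the disclosure: orientations are encoded as degrees clockwise from due north, and the two orientations are taken to subtend the shorter arc between them.

```python
def interpolate_orientation(theta1, theta2, a, b):
    """Split the angle between the first orientation (theta1) and the
    second orientation (theta2) in proportion to the distances a and b
    from the arbitrary point to the first and second specific points,
    then offset theta1 by the resulting first angle.
    Angles in degrees, clockwise from due north."""
    alpha = (theta2 - theta1) % 360.0
    if alpha > 180.0:  # assumption: the two orientations subtend the shorter arc
        alpha -= 360.0
    first_angle = alpha * a / (a + b)  # angle between target and first orientation
    return (theta1 + first_angle) % 360.0
```

For instance, `interpolate_orientation(0, 90, 1, 2)` returns 30.0, matching the worked example of a point between due north and due east whose first angle is 30 degrees.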
In some embodiments, the executing subject may execute the panoramic image synthesis method according to a flow shown in fig. 4, which includes the following steps.
Step 401, receiving transmission data of a three-dimensional scanner.
Step 402, synthesizing, based on the target number of base images, a panoramic image that contains the target number of specific points and in which the actual position indicated by each specific point has orientation information.
In some scenarios, the execution manners of steps 401 to 402 are similar to the execution manners of steps 101 to 102 in the embodiment shown in fig. 1, and are not repeated here.
Step 403, in response to receiving third orientation information provided by the user, determining, from the target number of specific points, a third specific point and a fourth specific point whose indicated actual positions have orientation information lying on either side of the third orientation information.
Optionally, the third orientation information is orientation information selected by the user from preset orientation information, or orientation information entered by the user in an input control.
The orientation information of the actual position indicated by the third specific point is the fourth orientation information, and the orientation information of the actual position indicated by the fourth specific point is the fifth orientation information. It is understood that the third orientation information lies between the fourth orientation information and the fifth orientation information.
Step 404, determining a third angle formed between the third orientation information and the fourth orientation information, and a fourth angle formed between the third orientation information and the fifth orientation information.
Step 405, determining, based on the third angle and the fourth angle, a target point on the target line segment between the third specific point and the fourth specific point whose indicated actual position has the third orientation information as its orientation information.
Step 406, presenting a target local image in the panoramic image according to the target point.
In this way, a target local image in the panoramic image is presented to the user according to the third orientation information the user provides, so the user can use the panoramic image to view the approximate surroundings of a given orientation.
In some embodiments, the execution subject may determine the target point in the following manner.
In a first step, the distance between the third specific point and the fourth specific point is divided, in proportion to the third angle and the fourth angle, into a third distance between the target point and the third specific point and a fourth distance between the target point and the fourth specific point.
It is understood that the proportion occupied by the third angle is the ratio of the third angle to the angle sum (i.e., the sum of the third angle and the fourth angle), and the proportion occupied by the fourth angle is the ratio of the fourth angle to the angle sum.
As an example, let the third angle formed between the third orientation information and the fourth orientation information be α1, the fourth angle formed between the third orientation information and the fifth orientation information be α2, and the distance between the third specific point and the fourth specific point be c. The execution subject may then divide the distance c into a third distance of c·α1/(α1+α2) between the target point and the third specific point, and a fourth distance of c·α2/(α1+α2) between the target point and the fourth specific point.
In a second step, the target point is determined, based on the third distance and the fourth distance, on the target line segment between the third specific point and the fourth specific point.
It will be appreciated that once the third distance (between the target point and the third specific point on the target line segment) and the fourth distance (between the target point and the fourth specific point on the target line segment) are determined, the target point can be located on the target line segment. The target point is thus determined reasonably from the proportion between the angles that the third orientation information forms with the two orientations distributed on either side. This allows the target point, whose indicated actual position has the third orientation information provided by the user, to be located accurately on the target line segment, which in turn improves the accuracy of the target local image presented to the user.
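The mapping in steps 404-405 admits a similarly short sketch. Treating the two specific points as 2-D image coordinates and splitting the segment linearly in proportion to the two angles is an assumption made here for illustration, not code from the disclosure.

```python
def locate_target_point(p3, p4, alpha3, alpha4):
    """Locate the target point on the segment from the third specific
    point p3 to the fourth specific point p4: its distance to p3 is
    c * alpha3 / (alpha3 + alpha4), where c is the segment length,
    alpha3 is the third angle and alpha4 is the fourth angle.
    Points are (x, y) image coordinates."""
    t = alpha3 / (alpha3 + alpha4)  # fraction of the way from p3 toward p4
    return (p3[0] + t * (p4[0] - p3[0]),
            p3[1] + t * (p4[1] - p3[1]))
```

A point whose orientation nearly matches that of the third specific point (small third angle) lands close to p3, which is what the proportional division in the first step requires.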
In some embodiments, the specific point is the center point of the base image. In this case, the execution subject may present the target local image in the panoramic image in the following manner.
Specifically, the target local image in the panoramic image is presented with the target point as the center point of the local image.
With further reference to fig. 5, as an implementation of the methods shown in the above figures, the present disclosure provides some embodiments of a panoramic image synthesis apparatus, which correspond to the method embodiment shown in fig. 1, and which may be applied in various electronic devices in particular.
As shown in fig. 5, the panoramic image synthesis apparatus of this embodiment includes a receiving unit 501 and a synthesizing unit 502. The receiving unit 501 is configured to receive transmission data of a three-dimensional scanner, where the transmission data includes a target number of base images captured by the three-dimensional scanner, and orientation information, collected by the three-dimensional scanner, of the actual position indicated by the specific point in each base image. The synthesizing unit 502 is configured to synthesize, based on the target number of base images, a panoramic image that contains the target number of specific points and in which the actual position indicated by each specific point has orientation information.
In this embodiment, specific processing of the receiving unit 501 and the synthesizing unit 502 of the panoramic image synthesizing apparatus and the technical effects thereof can refer to the related descriptions of step 101 and step 102 in the corresponding embodiment of fig. 1, which are not repeated herein.
In some embodiments, the three-dimensional scanner has a camera for capturing base images and an electronic compass for collecting the orientation information of the actual position indicated by the specific point in each base image. The three-dimensional scanner captures the target number of base images and collects the orientation information of the actual position indicated by the specific point in each base image as follows: in response to receiving a shooting instruction, sent by the terminal device, to capture a target number of base images, the shooting step is executed: the ratio of 360 degrees to the target number is taken as the single rotation angle; while rotating through 360 degrees, after each single rotation angle, the camera is used to capture a base image and the electronic compass is used to collect the orientation information of the actual position indicated by the specific point in that base image.
In some embodiments, the specific point is the center point of the base image, and the electronic compass is disposed at a position corresponding to the center of the camera. Using the electronic compass to collect the orientation information of the actual position indicated by the specific point in the base image includes: taking the orientation information collected by the electronic compass as the orientation information of the actual position indicated by the specific point in the base image.
In some embodiments, the specific point is a central point in the base image.
In some embodiments, the panoramic image synthesis apparatus may further include a first presentation unit (not shown in the drawings). The first presentation unit is configured to: in response to an orientation information viewing operation for viewing the target orientation information of the actual position indicated by an arbitrary point in the panoramic image, determine a first specific point and a second specific point located on either side of the arbitrary point, where the orientation information of the actual position indicated by the first specific point is the first orientation information and the orientation information of the actual position indicated by the second specific point is the second orientation information; determine a first distance from the arbitrary point to the first specific point and a second distance from the arbitrary point to the second specific point; determine, based on the first distance and the second distance, the target orientation information of the actual position indicated by the arbitrary point from the range of orientations between the first orientation information and the second orientation information; and present the target orientation information.
In some embodiments, the first presentation unit is further configured to: divide the included angle between the first azimuth information and the second azimuth information, in proportion to the first distance and the second distance, into a first included angle between the target azimuth information and the first azimuth information and a second included angle between the target azimuth information and the second azimuth information; and determine the target azimuth information from the azimuth information range between the first azimuth information and the second azimuth information based on the first included angle and the second included angle.
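The distance-ratio division described above amounts to linear interpolation of the heading between the two neighboring specific points. A minimal sketch (illustrative only; the function name and the degrees convention are assumptions, and the wrap-around at 0/360 degrees is handled explicitly):

```python
def interpolate_heading(first_heading: float, second_heading: float,
                        d1: float, d2: float) -> float:
    """Split the included angle between the two headings in proportion to
    the pixel distances d1 (to the first specific point) and d2 (to the
    second), yielding the heading of the arbitrary point in [0, 360)."""
    # Signed angular difference wrapped to (-180, 180], so interpolation
    # crosses 0/360 correctly (e.g. from 350 to 10 degrees).
    diff = (second_heading - first_heading + 180.0) % 360.0 - 180.0
    t = d1 / (d1 + d2)  # fraction of the way from the first specific point
    return (first_heading + t * diff) % 360.0
```

A point midway between specific points at 350 and 10 degrees would thus be assigned 0 degrees, not the naive arithmetic mean of 180.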
In some embodiments, the panoramic image synthesis apparatus may further include a second presentation unit (not shown in the drawings). The second presentation unit is configured to: in response to receiving third azimuth information provided by a user, determine, from the target number of specific points, a third specific point and a fourth specific point whose indicated actual positions have azimuth information lying on either side of the third azimuth information, where the azimuth information of the actual position indicated by the third specific point is fourth azimuth information and the azimuth information of the actual position indicated by the fourth specific point is fifth azimuth information; determine a third included angle formed by the third azimuth information and the fourth azimuth information, and a fourth included angle formed by the third azimuth information and the fifth azimuth information; determine, based on the third included angle and the fourth included angle, a target point on the target line segment between the third specific point and the fourth specific point whose indicated actual position has the third azimuth information; and present a target local image in the panoramic image according to the target point.
In some embodiments, the second presentation unit is further configured to: divide the distance between the third specific point and the fourth specific point, in proportion to the third included angle and the fourth included angle, into a third distance between the target point and the third specific point and a fourth distance between the target point and the fourth specific point; and determine the target point on the target line segment between the third specific point and the fourth specific point based on the third distance and the fourth distance.
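The angle-ratio division of the target line segment described above is the inverse of the distance-ratio interpolation and can be sketched the same way. Here `x3` and `x4` are assumed to be one-dimensional pixel coordinates of the third and fourth specific points along the segment, and `angle3`/`angle4` mirror the third and fourth included angles (all names are illustrative):

```python
def locate_target_point(x3: float, x4: float,
                        angle3: float, angle4: float) -> float:
    """Split the segment between the third and fourth specific points in
    proportion to the included angles, returning the coordinate of the
    target point whose indicated actual position has the queried azimuth."""
    t = angle3 / (angle3 + angle4)  # fraction from the third specific point
    return x3 + t * (x4 - x3)
```

If the queried azimuth sits 30 degrees from the third specific point's azimuth and 70 degrees from the fourth's, the target point lands 30% of the way along the segment.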
In some embodiments, the second presentation unit is further configured to present the target local image in the panoramic image with the target point as its center point.
In some embodiments, the third azimuth information is azimuth information selected by the user from preset azimuth information, or azimuth information entered by the user in an input control.
With further reference to fig. 6, fig. 6 illustrates an exemplary system architecture to which the panoramic image synthesis method of some embodiments of the present disclosure may be applied.
As shown in fig. 6, the system architecture may include terminal devices 601, 602 and a three-dimensional scanner 603. The terminal devices 601 and 602 are communicatively connected to the three-dimensional scanner 603 through a network. The network may include various connection types, such as wired or wireless communication links, or fiber-optic cables.
Various applications (e.g., an image composition-type application) may be installed on the terminal apparatuses 601, 602. A camera and an electronic compass may be mounted on the three-dimensional scanner 603.
In some scenarios, the three-dimensional scanner 603 may capture a target number of base images and collect the azimuth information of the actual position indicated by a specific point in each base image. The three-dimensional scanner 603 may then transmit the captured base images and the collected azimuth information to the terminal devices 601, 602. Thereby, the terminal devices 601, 602 may synthesize, based on the target number of base images, a panoramic image containing the target number of specific points, where the actual position indicated by each specific point has azimuth information.
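Under the simplest synthesis assumption, equally sized base images concatenated side by side (which the disclosure does not mandate; stitching with overlap is equally possible), each base image's specific point, its center column, maps to a predictable column of the panorama. A hypothetical sketch:

```python
def specific_point_columns(target_number: int, image_width: int) -> list:
    """Panorama column of each base image's specific (center) point when
    the target number of equally wide base images are laid side by side."""
    return [i * image_width + image_width // 2 for i in range(target_number)]
```

These columns are the anchors that carry the collected azimuth information into the synthesized panoramic image.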
The terminal devices 601 and 602 may be hardware or software. When the terminal devices 601, 602 are hardware, they may be various electronic devices supporting image synthesis, including but not limited to smartphones, tablet computers, laptop computers, desktop computers, and the like. When the terminal devices 601, 602 are software, they may be installed in the electronic devices listed above, and may be implemented as multiple pieces of software or software modules, or as a single piece of software or software module, which is not limited herein.
It should be noted that the panoramic image synthesis method provided by the embodiments of the present disclosure may be executed by the terminal devices 601 and 602, and accordingly, the panoramic image synthesis apparatus may be provided in the terminal devices 601 and 602.
It should be understood that the number of terminal devices and three-dimensional scanners in fig. 6 is merely illustrative. There may be any number of terminal devices and three-dimensional scanners, as desired for implementation.
Referring now to fig. 7, shown is a schematic diagram of an electronic device (e.g., the terminal device of fig. 6) suitable for implementing some embodiments of the present disclosure. The terminal device in some embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), or a vehicle-mounted terminal (e.g., a car navigation terminal), and a fixed terminal such as a digital TV or a desktop computer. The electronic device shown in fig. 7 is only an example, and should not impose any limitation on the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 7, the electronic device may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 701, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 702 or a program loaded from a storage device 708 into a Random Access Memory (RAM) 703. The RAM 703 also stores various programs and data necessary for the operation of the electronic device. The processing device 701, the ROM 702, and the RAM 703 are connected to each other by a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
Generally, the following devices may be connected to the I/O interface 705: input devices 706 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 707 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 708 including, for example, magnetic tape, hard disk, etc.; and a communication device 709. The communication device 709 may allow the electronic device to communicate wirelessly or by wire with other devices to exchange data. While fig. 7 illustrates an electronic device having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 7 may represent one device or may represent multiple devices as desired.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such embodiments, the computer program may be downloaded and installed from a network via the communication means 709, or may be installed from the storage means 708, or may be installed from the ROM 702. The computer program, when executed by the processing device 701, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium described in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed network.
The computer readable medium may be included in the electronic device, or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: receive transmission data of a three-dimensional scanner, wherein the transmission data includes a target number of base images captured by the three-dimensional scanner, and azimuth information, collected by the three-dimensional scanner, of the actual position indicated by the specific point in each base image; and synthesize, based on the target number of base images, a panoramic image that contains the target number of specific points, where the actual position indicated by each specific point has azimuth information.
Computer program code for carrying out operations of embodiments of the present disclosure may be written in any combination of one or more programming languages, including but not limited to object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present disclosure may be implemented by software, and may also be implemented by hardware. Here, the names of the units do not constitute a limitation of the unit itself in some cases, and for example, the receiving unit may also be described as a unit that "receives transmission data of the three-dimensional scanner".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is only of preferred embodiments of the present disclosure and an illustration of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the particular combination of the features described above, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the scope of the present disclosure, for example, technical solutions formed by interchanging the above features with (but not limited to) features with similar functions disclosed in the present disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (13)

1. A panoramic image synthesis method is applied to a terminal device, and comprises the following steps:
receiving transmission data of a three-dimensional scanner, wherein the transmission data comprises: a target number of base images captured by the three-dimensional scanner, and azimuth information, collected by the three-dimensional scanner, of an actual position indicated by a specific point in each base image; and
synthesizing, based on the target number of base images, a panoramic image which comprises a target number of specific points and in which the actual position indicated by each specific point has azimuth information.
2. The method of claim 1, wherein the three-dimensional scanner has a camera for acquiring a base image and an electronic compass for acquiring bearing information of an actual position indicated by a specific point in the base image; and
the three-dimensional scanner captures the target number of base images and collects the azimuth information of the actual position indicated by the specific point in each base image in the following manner:
in response to a shooting instruction, sent by the terminal device, to capture a target number of base images, executing a shooting step: taking the ratio of 360 degrees to the target number as a single rotation angle; and, in the process of rotating through 360 degrees, capturing a base image with the camera after each single rotation angle, and collecting, with the electronic compass, the azimuth information of the actual position indicated by a specific point in the base image.
3. The method of claim 2, wherein the specific point is a center point in the base image, and the electronic compass is disposed at a position corresponding to the center of the camera; and
the collecting, by using the electronic compass, the azimuth information of the actual position indicated by the specific point in the base image comprises:
taking the azimuth information collected by the electronic compass as the azimuth information of the actual position indicated by the specific point in the base image.
4. The method of claim 1, wherein the particular point is a center point in the base image.
5. The method according to any one of claims 1-4, further comprising:
in response to an azimuth information viewing operation for viewing target azimuth information of an actual position indicated by an arbitrary point in the panoramic image, determining a first specific point and a second specific point on either side of the arbitrary point, wherein the azimuth information of the actual position indicated by the first specific point is first azimuth information, and the azimuth information of the actual position indicated by the second specific point is second azimuth information;
determining a first distance from the arbitrary point to the first specific point, and determining a second distance from the arbitrary point to the second specific point;
determining target azimuth information of an actual position indicated by the arbitrary point from an azimuth information range between the first azimuth information and the second azimuth information based on the first distance and the second distance;
and displaying the target azimuth information.
6. The method according to claim 5, wherein the determining target azimuth information of the actual position indicated by the arbitrary point from the azimuth information range between the first azimuth information and the second azimuth information based on the first distance and the second distance comprises:
dividing an included angle between the first azimuth information and the second azimuth information into a first included angle between the target azimuth information and the first azimuth information and a second included angle between the target azimuth information and the second azimuth information according to the distance proportion occupied by the first distance and the second distance respectively;
and determining the target azimuth information from the azimuth information range between the first azimuth information and the second azimuth information based on the first included angle and the second included angle.
7. The method according to any one of claims 1-4, further comprising:
in response to receiving third azimuth information provided by a user, determining, from the target number of specific points, a third specific point and a fourth specific point whose indicated actual positions have azimuth information lying on either side of the third azimuth information, wherein the azimuth information of the actual position indicated by the third specific point is fourth azimuth information, and the azimuth information of the actual position indicated by the fourth specific point is fifth azimuth information;
determining a third included angle formed by the third azimuth information and the fourth azimuth information, and determining a fourth included angle formed by the third azimuth information and the fifth azimuth information;
determining, based on the third included angle and the fourth included angle, a target point on a target line segment between the third specific point and the fourth specific point, wherein the azimuth information of the actual position indicated by the target point is the third azimuth information;
and displaying a target local image in the panoramic image according to the target point.
8. The method according to claim 7, wherein the determining, based on the third included angle and the fourth included angle, the target point whose indicated actual position has the third azimuth information from the target line segment between the third specific point and the fourth specific point comprises:
dividing the distance between the third specific point and the fourth specific point into a third distance between the target point and the third specific point and a fourth distance between the target point and the fourth specific point according to the proportion of the included angles respectively occupied by the third included angle and the fourth included angle;
determining the target point from a target line segment between the third specific point and the fourth specific point based on the third distance and the fourth distance.
9. The method of claim 7, wherein the specific point is a center point in the base image; and
the displaying the target local image in the panoramic image according to the target point comprises:
displaying the target local image in the panoramic image with the target point as the center point of the local image.
10. The method according to claim 7, wherein the third azimuth information is azimuth information selected by the user from preset azimuth information, or azimuth information input by the user in an input control.
11. A panoramic image synthesis apparatus, applied to a terminal device, includes:
a receiving unit, configured to receive transmission data of a three-dimensional scanner, wherein the transmission data comprises: a target number of base images captured by the three-dimensional scanner, and azimuth information, collected by the three-dimensional scanner, of an actual position indicated by a specific point in each base image; and
a synthesizing unit, configured to synthesize, based on the target number of base images, a panoramic image which comprises a target number of specific points and in which the actual position indicated by each specific point has azimuth information.
12. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-10.
13. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-10.
CN202111203488.4A 2021-10-15 2021-10-15 Panoramic image synthesis method and device and electronic equipment Pending CN113947522A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111203488.4A CN113947522A (en) 2021-10-15 2021-10-15 Panoramic image synthesis method and device and electronic equipment


Publications (1)

Publication Number Publication Date
CN113947522A true CN113947522A (en) 2022-01-18

Family

ID=79330627



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination