CN115767291A - Virtual photographing method, device, equipment and storage medium - Google Patents

Virtual photographing method, device, equipment and storage medium

Info

Publication number
CN115767291A
Authority
CN
China
Prior art keywords
image
target content
server
terminal device
terminal equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211277030.8A
Other languages
Chinese (zh)
Inventor
余雁
潘建忠
李如旺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Hantele Communication Co ltd
Original Assignee
Guangzhou Hantele Communication Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Hantele Communication Co ltd
Priority to CN202211277030.8A
Publication of CN115767291A
Pending legal-status Current

Abstract

The embodiment of the application provides a virtual photographing method, device, equipment and storage medium, and relates to the technical field of photography. By applying the corresponding virtual photographing method in a terminal device and a server, the application provides a convenient photographing service for the user: the server sends corresponding target content to the terminal device, and the terminal device displays the target content, so that the user can observe the display effect of the photo before it is produced. In response to the user's shooting operation, the terminal device sends the captured image to the server to obtain a composite image corresponding to the captured image and the target content, which improves the interest of photographing while remaining convenient and better meets the user's photographing needs.

Description

Virtual photographing method, device, equipment and storage medium
Technical Field
The embodiment of the application relates to the technical field of photography, in particular to a virtual photographing method, device, equipment and storage medium.
Background
Traveling has become a common form of leisure and recreation in people's daily lives, and people usually commemorate a tourist spot by taking photographs while traveling. With the popularization of smartphones, taking photographs with a mobile phone has become the norm.
However, when visiting such places people generally only shoot landscape pictures, group pictures of people in front of background objects, and the like, which does not satisfy users' pursuit of personalized and highly interesting photographs. In general, people can have photos post-processed, either by themselves or by professionals, to change the background, add patterns to the background, composite characters into the photo, and so on, so as to produce personalized results. However, post-processing photos requires considerable expertise and is difficult for most people to carry out themselves; and before a professional finishes the work, it is hard for people to see the final display effect of the photo.
Therefore, existing photographing schemes struggle to meet people's photographing requirements.
Disclosure of Invention
The embodiment of the application provides a virtual photographing method, device, equipment and storage medium, which can conveniently provide a personalized and highly interesting photographing service, place low operational demands on the user, and effectively meet people's photographing requirements.
In a first aspect, an embodiment of the present application provides a virtual photographing method, which is applied to a server, and the virtual photographing method includes:
receiving a background image and position information sent by terminal equipment;
determining a camera pose corresponding to the terminal equipment according to the background image and the position information;
sending the target content associated with the camera pose to the terminal device to display the target content in the terminal device;
if a shot image shot by the terminal equipment at present is received, determining whether the similarity between the shot image and a background image meets a preset threshold value;
and generating a composite image corresponding to the target content and the shot image and transmitting the composite image to the terminal equipment under the condition that the similarity between the shot image and the background image meets a preset threshold value.
In a second aspect, an embodiment of the present application provides a virtual photographing method, which is applied to a terminal device, where the terminal device includes a camera, and the virtual photographing method includes:
if the currently running application program is determined to be the target application program, starting a camera;
capturing a shooting picture corresponding to the camera to generate a background image, and sending the background image and position information corresponding to the background image to a server;
receiving target content sent by the server, and displaying the target content in the shooting picture;
responding to the shooting operation, and sending the obtained shot image to a server;
and receiving the composite image which is sent by the server and corresponds to the shooting image and the target content.
In a third aspect, an embodiment of the present application provides a virtual photographing apparatus, which is applied to a server, and includes:
the first data receiving module is configured to receive a background image and position information sent by the terminal equipment;
the pose determining module is configured to determine a camera pose corresponding to the terminal equipment according to the background image and the position information;
the first data sending module is configured to send the target content associated with the camera pose to the terminal equipment so as to show the target content in the terminal equipment;
the similarity judging module is configured to determine whether the similarity between the shot image and the background image meets a preset threshold value or not if the shot image shot at present by the terminal equipment is received;
and the second data transmission module is configured to generate a composite image corresponding to the target content and the shot image and transmit the composite image to the terminal equipment under the condition that the similarity between the shot image and the background image meets a preset threshold value.
In a fourth aspect, an embodiment of the present application provides a virtual photographing apparatus, which is applied to a terminal device, where the terminal device includes a camera, and the virtual photographing apparatus includes:
the device starting module is configured to start the camera if the currently running application program is determined to be a target application program;
the third data sending module is configured to intercept a shooting picture corresponding to the camera, generate a background image and send the background image and position information corresponding to the shooting background image to the server;
the second data receiving module is configured to receive the target content sent by the server and display the target content on the shooting picture;
the fourth data sending module is configured to respond to shooting operation and send the obtained shot image to the server;
and a third data receiving module configured to receive a composite image corresponding to the photographed image and the target content transmitted by the server.
In a fifth aspect, an embodiment of the present application provides a virtual camera device, including:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the virtual photographing method provided in the above aspects.
In a sixth aspect, embodiments of the present application further provide a storage medium storing computer-executable instructions, which when executed by a processor, are configured to perform the virtual photography method provided in the above aspect.
According to the above schemes, by applying the corresponding virtual photographing method in the terminal device and the server, a convenient photographing service is provided for the user: the server sends corresponding target content to the terminal device, and the terminal device displays the target content, so that the user can observe the display effect of the photo before it is produced. In response to the user's shooting operation, the terminal device sends the captured image to the server to obtain a composite image corresponding to the captured image and the target content, which improves the interest of photographing while remaining convenient and better meets the user's photographing needs.
Drawings
Fig. 1 is a flowchart of a virtual photographing method according to an embodiment of the present application;
Fig. 2 is a flowchart of a virtual photographing method according to an embodiment of the present application;
Fig. 3 is a flowchart of a virtual photographing method according to another embodiment of the present application;
Fig. 4 is a flowchart of another virtual photographing method according to an embodiment of the present application;
Fig. 5 is a schematic diagram of a virtual photographing apparatus according to an embodiment of the present application;
Fig. 6 is a schematic diagram of another virtual photographing apparatus according to an embodiment of the present application;
Fig. 7 is a schematic structural diagram of a virtual photographing device according to an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the embodiments described herein are illustrative of the present application and are not limiting of the present application. It should be further noted that, for the convenience of description, only some of the structures associated with the present application are shown in the drawings, not all of them.
It should be noted that, for the sake of brevity, this description does not exhaust all alternative embodiments, and it should be understood by those skilled in the art after reading this description that any combination of features may constitute an alternative embodiment as long as the features are not mutually inconsistent.
It is noted that, in this document, relational terms such as "first" and "second" are used solely to distinguish one entity, action or object from another, without necessarily requiring or implying any actual such relationship or order between them. The number of objects distinguished by "first" and "second" is not limited to one and may be one or more, and in the description of the present application "a plurality" means two or more.
The virtual photographing method provided by the application can be applied to a terminal device, such as a mobile phone or a smart tablet, in which a camera for capturing images is configured, and can also be applied to a server. The user shoots through the terminal device, and the server processes and edits the captured image.
Fig. 1 is a flowchart of a virtual photographing method provided in an embodiment of the present application. The virtual photographing method may be applied to a server, which processes image data sent by a terminal device. As shown in the figure, the method specifically includes the following steps:
and step S110, receiving the background image and the position information sent by the terminal equipment.
And step S120, determining the camera pose corresponding to the terminal equipment according to the background image and the position information.
The server receives the background image and the position information sent by the terminal device and processes the received data, so as to determine the camera pose of the terminal device at the moment it captured the background image. It is understood that the camera pose represents the position and orientation of the camera when the corresponding image was taken.
In an embodiment, the position information comprises a positioning parameter and an orientation parameter of the terminal device. The server stores a point cloud map corresponding to a target shooting place, where the target shooting place can be an architectural or sculptural scenic spot or a similar location for which a point cloud map has been constructed in advance; for example, point cloud data is collected for the place, and a three-dimensional point cloud map corresponding to a grid map of the place, i.e. the point cloud map, is generated.
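By way of illustration, a pre-built point cloud map of this kind could be organized as in the following sketch, with a coarse grid index so that a target area around an initial position can be looked up in a later step; the data layout and helper names here are assumptions made for the sketch, not features defined by this application.

```python
# Illustrative sketch (not part of the original disclosure): a point cloud map for a
# target shooting place, stored as 3D points with per-point visual descriptors plus a
# coarse grid index so that the region around an initial position can be selected.
from dataclasses import dataclass
import numpy as np

@dataclass
class PointCloudMap:
    points: np.ndarray        # (N, 3) point coordinates in the site's local frame
    descriptors: np.ndarray   # (N, D) visual descriptors attached to each point
    cell_size: float = 10.0   # grid cell edge length in metres (assumed value)

    def cell_of(self, x: float, y: float) -> tuple:
        # Map a ground-plane position onto a coarse grid cell.
        return int(x // self.cell_size), int(y // self.cell_size)

    def points_near(self, x: float, y: float, radius_cells: int = 1) -> np.ndarray:
        # Indices of points whose grid cell lies within `radius_cells` of the cell
        # containing (x, y) -- the "target area" used for later matching.
        cx, cy = self.cell_of(x, y)
        cells = self.points[:, :2] // self.cell_size
        near = (np.abs(cells[:, 0] - cx) <= radius_cells) & (np.abs(cells[:, 1] - cy) <= radius_cells)
        return np.flatnonzero(near)
```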
Fig. 2 is a flowchart of a virtual photographing method according to an embodiment of the present application, and as shown in the figure, the virtual photographing method includes the following steps:
step S210, according to the positioning parameters, determining the initial position of the terminal equipment, and determining a target area corresponding to the initial position in the point cloud map which is constructed in advance.
And S220, selecting corresponding point cloud data in the target area according to the orientation parameters.
And step S230, extracting visual characteristic information in the background image.
And S240, selecting target point cloud data matched with the visual characteristic information from the point cloud data of the target area according to the visual characteristic information to determine the pose of the camera.
When a user shoots in the target shooting place, the terminal device sends the obtained background image and the position information to the server. The positioning parameters of the terminal device may be determined by a GNSS (Global Navigation Satellite System) sensor integrated in the terminal device, and the orientation parameters may be determined by a magnetometer integrated in the terminal device.
Therefore, after the server receives the position information and the background image, it can determine the initial position of the terminal device according to the current positioning parameters, i.e. the position of the terminal device within the target shooting place, thereby locating the terminal device, and then select the target area corresponding to the initial position from the point cloud map. In addition, the server can select the point cloud data in the target area that lies in the direction indicated by the orientation parameter.
The server further performs feature extraction on the received background image, for example extracting visual feature information such as color, texture, and shape. The server then matches the point cloud data against the background image according to the visual feature information, so that the point cloud data matching the visual feature information is selected as the target point cloud data and the camera pose of the terminal device is finally determined and located. Moreover, combining the positioning parameters and the orientation parameters further improves the accuracy with which the camera pose is located.
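For illustration only, the pose determination of steps S210-S240 could be sketched as below, assuming that ORB descriptors were attached to the map points when the point cloud map was built and that the camera intrinsic matrix K of the terminal device is known; the feature type, the matcher and the PnP solver are illustrative choices rather than requirements of this application.

```python
# Hedged sketch of pose estimation: match 2D features of the background image against
# descriptors of the selected point cloud data, then solve a PnP problem for the pose.
import cv2
import numpy as np

def estimate_camera_pose(background_bgr, map_points_3d, map_descriptors, K):
    """Return (rvec, tvec) of the camera in the map frame, or None on failure."""
    orb = cv2.ORB_create(nfeatures=2000)
    gray = cv2.cvtColor(background_bgr, cv2.COLOR_BGR2GRAY)
    keypoints, descriptors = orb.detectAndCompute(gray, None)
    if descriptors is None:
        return None

    # Match image descriptors against descriptors of the selected point cloud data.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(descriptors, map_descriptors)
    if len(matches) < 6:
        return None  # too few 2D-3D correspondences to solve a pose reliably

    img_pts = np.float32([keypoints[m.queryIdx].pt for m in matches])
    obj_pts = np.float32([map_points_3d[m.trainIdx] for m in matches])

    # PnP with RANSAC yields the rotation and translation of the camera.
    ok, rvec, tvec, _ = cv2.solvePnPRansac(obj_pts, img_pts, K, None)
    return (rvec, tvec) if ok else None
```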
And step S130, sending the target content associated with the camera pose to the terminal equipment so as to display the target content in the terminal equipment.
After the camera pose is determined, the server sends the target content corresponding to the camera pose to the terminal device. It is conceivable that these steps may be repeated until the terminal device sends a captured image; that is, by adjusting the shooting angle of the terminal device the user can obtain images with different camera poses and upload them to the server, and the server sends the corresponding target content back so that the terminal device can display it.
Fig. 3 is a flowchart of a virtual photographing method according to another embodiment of the present application, where the virtual photographing method further includes the following steps:
and step S310, determining corresponding target content according to the position information.
And S320, determining a display part of the target content on the terminal equipment based on the camera pose, and sending the target content corresponding to the display part to the terminal equipment.
In one embodiment, different target contents are allocated in the server corresponding to different positions in the target shooting place, so that when the position of the terminal device in the target shooting place is determined, the server can determine the target content corresponding to the position.
Accordingly, the server determines which part of the target content is displayed on the terminal device for different camera poses. For example, when the user shoots a building at shooting angle I, the target content is displayed on the terminal device in addition to the building and its surroundings; when the user shoots the building at shooting angle II, only part of the target content is displayed, or the target content is presented as seen from that other viewing angle. From the user's perspective, the target content displayed on the terminal device changes as the user adjusts the shooting angle, which further improves the user's shooting experience.
It should be noted that the server may also send a plurality of target contents to the terminal device to provide a selection for the user, so as to meet the requirement of the user for personalized shooting of the image.
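As an illustrative sketch only, selecting the target content for a position and deciding which part of it to send for display (step S320) could look like the following; the content registry, the location keys and the anchor coordinates are hypothetical placeholders rather than values defined by this application.

```python
# Hedged sketch: look up the content registered for the user's location, then project its
# 3D anchor points with the estimated pose to decide which parts lie inside the view.
import cv2
import numpy as np

CONTENT_BY_LOCATION = {  # hypothetical registry: location key -> AR content entry
    "plaza_north": {
        "content_id": "ar_mascot_v1",
        "anchors_3d": np.float32([[2.0, 0.0, 5.0], [2.5, 1.0, 5.0]]),
    },
}

def visible_content(location_key, rvec, tvec, K, image_size):
    entry = CONTENT_BY_LOCATION.get(location_key)
    if entry is None:
        return None
    pts_2d, _ = cv2.projectPoints(entry["anchors_3d"], rvec, tvec, K, None)
    w, h = image_size
    in_view = [0 <= x < w and 0 <= y < h for x, y in pts_2d.reshape(-1, 2)]
    # Return the content ID plus the subset of anchors the terminal should render.
    return {"content_id": entry["content_id"],
            "visible_anchors": [i for i, ok in enumerate(in_view) if ok]}
```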
Step S140, if a shot image currently shot by the terminal device is received, determining whether the similarity between the shot image and the background image meets a preset threshold.
After the server receives the captured image shot by the terminal device, it also needs to evaluate the similarity between the captured image and the background image and judge whether that similarity meets the preset threshold. When the similarity between the captured image and the background image meets the preset threshold, the difference between the current captured image and the background image is small, and the corresponding target content does not need to be modified. When the similarity does not meet the preset threshold, the difference between the captured image and the background image is large, so the previously determined target content no longer applies and needs to be updated; for example, the captured image can be used as a new background image to obtain the corresponding target content before the image synthesis processing is performed.
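A minimal sketch of this similarity gate is given below, assuming a colour-histogram correlation as the similarity measure; the application only requires some similarity score to be compared against a preset threshold, so the metric and threshold value here are illustrative assumptions.

```python
# Hedged sketch: compare the captured image with the background image via histogram
# correlation and report whether the preset threshold is met.
import cv2

def similar_enough(captured_bgr, background_bgr, threshold=0.8):
    def hist(img):
        h = cv2.calcHist([img], [0, 1, 2], None, [8, 8, 8], [0, 256, 0, 256, 0, 256])
        return cv2.normalize(h, h).flatten()
    score = cv2.compareHist(hist(captured_bgr), hist(background_bgr), cv2.HISTCMP_CORREL)
    return score >= threshold  # True: keep the previously determined target content
```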
And S150, generating a composite image corresponding to the target content and the shot image under the condition that the similarity between the shot image and the background image meets a preset threshold value, and sending the composite image to the terminal equipment.
The server performs synthesis processing on the captured image, compositing the recorded target content into the current captured image to form a composite image, and sends the composite image to the terminal device. For example, where the terminal device has an application corresponding to the virtual photographing function, the composite image sent by the server may be delivered as an application message, with the display interface of the terminal device notifying the user of the new application message through a pop-up window; alternatively, the server sends the composite image to the terminal device in the form of a 5G message so that the terminal device can obtain it.
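By way of example, the compositing step could be sketched as a simple alpha blend, assuming the rendered target content arrives as a BGRA layer already aligned with the captured frame; the rendering and alignment themselves are outside this sketch.

```python
# Hedged sketch: alpha-blend an aligned BGRA content layer into the captured BGR frame.
import numpy as np

def composite(captured_bgr: np.ndarray, content_bgra: np.ndarray) -> np.ndarray:
    alpha = content_bgra[:, :, 3:4].astype(np.float32) / 255.0
    blended = (captured_bgr.astype(np.float32) * (1.0 - alpha)
               + content_bgra[:, :, :3].astype(np.float32) * alpha)
    return blended.astype(np.uint8)  # composite image to be sent back to the terminal
```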
According to the above scheme, a server applying the virtual photographing method can process captured images on behalf of the terminal device, which lowers the hardware requirements on the terminal device; and by sending the target content to the terminal device and placing it in the terminal device's shooting picture, a personalized and highly interesting shooting service is provided for the user. In addition, the user can observe the display effect of the corresponding photo before shooting, so the user's shooting requirements can be met.
In an embodiment, the visual feature information includes a first texture feature extracted from the background image, and feature extraction is also performed on the selected point cloud data to obtain a second texture feature for each candidate set of point cloud data. The first texture feature is matched against the second texture features, the second texture feature with the highest matching degree is selected from those obtained for the different sets of point cloud data, and the point cloud data corresponding to that second texture feature serves as the target point cloud data, from which the camera pose is determined.
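As one possible illustration of this texture-matching step, a gradient-orientation histogram could serve as the texture feature and cosine similarity as the matching degree; both choices are assumptions made for the sketch rather than requirements of the application.

```python
# Hedged sketch: compute a first texture feature from the background image and pick the
# candidate point cloud whose pre-computed second texture feature matches it best.
import cv2
import numpy as np

def texture_feature(gray, bins=16):
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)
    hist, _ = np.histogram(np.arctan2(gy, gx).ravel(), bins=bins, range=(-np.pi, np.pi))
    return hist / max(hist.sum(), 1)  # normalized orientation histogram

def best_matching_cloud(first_feature, second_features):
    # second_features: {cloud_id: feature_vector} pre-computed per candidate point cloud.
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))
    return max(second_features, key=lambda cid: cosine(first_feature, second_features[cid]))
```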
Fig. 4 is a flowchart of another virtual photographing method provided in an embodiment of the present application, where the virtual photographing method may be applied to a terminal device, and as shown in the drawing, the method specifically includes the following steps:
and step S410, if the currently running application program is determined to be the target application program, starting the camera.
During operation of the terminal device, a target application program needs to be run; the target application program is used to control the terminal device to realize the virtual photographing method provided by the embodiment of the application. Therefore, when the currently running application program is the target application program, the terminal device starts the camera, so that the shooting picture is obtained and displayed in the display interface of the terminal device.
The user can open the application corresponding to the target application program on the display desktop of the terminal device, so that the terminal device runs the target application program; alternatively, the user can cause the terminal device to run the target application program by scanning a two-dimensional code with the terminal device. It is noted that the shooting picture captured by the camera can be displayed in the main interface of the target application program.
And step S420, capturing a shooting picture corresponding to the camera, generating a background image, and sending the background image and position information corresponding to the shooting background image to a server.
After the camera is started, the terminal device captures the shooting picture to generate a background image and obtains the position information corresponding to that background image. The picture captured by the terminal device is its current shooting picture, and the recorded position information corresponding to it is sent to the server together with the background image.
Further, it is conceivable that the terminal device may also capture the shooting picture in real time and transmit it to the server as the background image, so that the server determines the user's shooting position in real time and provides the terminal device with the corresponding target content.
It is conceivable that sensors for positioning, such as a GNSS sensor, are integrated in the terminal device, and that sensors indicating the tilt angle and tilt direction of the terminal device, such as a magnetometer, are integrated as well; the camera is likewise a device integrated on the terminal device. Therefore, the tilt angle and tilt direction of the camera are the same as those of the terminal device, i.e. the orientation of the camera follows the tilt angle and tilt direction of the terminal device. The terminal device transmits the recorded positioning parameters and orientation parameters to the server as the position information associated with the background image.
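For illustration, the upload performed by the terminal device at this point could resemble the following sketch; the endpoint URL, field names and JSON layout are assumptions, since the application does not define the transfer format.

```python
# Hedged sketch: send the background image together with GNSS positioning and
# magnetometer orientation parameters to the server, and return its response.
import base64
import json
import urllib.request

def send_background(frame_jpeg: bytes, lat: float, lon: float, heading_deg: float,
                    server_url: str = "https://example.com/api/background"):
    payload = {
        "background_image": base64.b64encode(frame_jpeg).decode("ascii"),
        "position": {"lat": lat, "lon": lon},          # GNSS positioning parameters
        "orientation": {"heading_deg": heading_deg},   # magnetometer-derived heading
    }
    req = urllib.request.Request(server_url,
                                 data=json.dumps(payload).encode("utf-8"),
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())  # expected to carry the target content to display
```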
And step S430, receiving the target content sent by the server, and displaying the target content on the shooting picture.
After the server receives the background image and the position information, it determines the shooting place and sends the corresponding target content to the terminal device; for example, the server sends virtual characters, pictures and the like generated based on Augmented Reality (AR) technology. After the terminal device receives the target content, it displays the target content in the shooting picture, i.e. the target content is shown in the current shooting picture. From the user's perspective, the target content therefore appears in the image to be captured, and the user can frame and capture a photograph containing the target content as desired.
Step S440, in response to the photographing operation, transmits the acquired photographed image to the server.
Step S450, receiving the composite image which is sent by the server and corresponds to the shooting image and the target content.
Because the target content is displayed in the shooting picture, the user can choose a suitable angle and shoot the corresponding image. For example, the user can trigger the shooting operation by touching a shooting button on the display interface of the terminal device, so that the terminal device takes the current shooting picture as the captured image; the user may also trigger the shooting operation by voice input, such as issuing a voice instruction that controls the terminal device to shoot. In response to the shooting operation, the terminal device transmits the acquired captured image to the server, and the server performs the composition.
The terminal device then receives the corresponding composite image from the server; it is understood that the server records the target content it transmitted to the terminal device and composites the recorded target content with the captured image, thereby generating the corresponding composite image.
According to the above technical scheme, a terminal device applying the virtual photographing method can provide a highly interesting photographing service for the user: the user's photographing operation is the same as the traditional way of photographing with a terminal device, so the user obtains a highly interesting photographing experience through simple operations, and the user's photographing requirements are met.
In one embodiment, the composite image is sent to the terminal device in the form of a 5G message, and the 5G message may carry at least one of a picture, text, and a hyperlink. Therefore, the user can send the number information associated with the terminal device to the server through the terminal device, so that the server can send the 5G message to the terminal device. The number information is a network access number of the terminal device, such as a mobile phone number, and the like, so that when the user uses the mobile phone to take a picture, the server can transmit the composite image to the mobile phone through a 5G message.
It should be noted that the number information sent by the user through the terminal device may also be number information associated with another terminal device; that is, the user may report a different number to the server to designate the terminal device that receives the 5G message. In addition, the server may carry the composite image in the transmitted 5G message.
It can be understood that the server sends the 5G message to the terminal device specified by the user, i.e. the terminal device associated with the submitted number information, and the 5G message may carry corresponding pictures, text and hyperlinks. For example, the 5G message may display the composite image as a thumbnail, attach a hyperlink providing the user with a download path for the composite image, and attach a textual description of the shooting location, shooting date and other information related to the composite image.
Therefore, by specifying which terminal device receives the 5G message, the composite image can also be delivered to other terminal devices, and a user who owns several terminal devices can receive it flexibly; and because the content carried by the 5G message is richer, the user learns the information related to the composite image in addition to receiving the composite image itself, which improves the user's shooting experience.
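Purely as an illustration, the rich message assembled by the server could look like the sketch below; the schema, field names and delivery mechanism are assumptions, since the application only states that the 5G message may carry a picture, text and a hyperlink and is addressed using the number information reported by the user.

```python
# Hedged sketch: assemble a rich-message payload carrying a thumbnail of the composite
# image, a short text description, and a hyperlink for downloading the full image.
import base64

def build_5g_message(msisdn: str, thumbnail_jpeg: bytes, download_url: str,
                     location: str, shot_date: str) -> dict:
    return {
        "to": msisdn,                                            # number info reported by the user
        "thumbnail": base64.b64encode(thumbnail_jpeg).decode(),  # picture preview of the composite
        "text": f"Your photo taken at {location} on {shot_date} is ready.",
        "hyperlink": download_url,                               # download path for the composite image
    }
```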
For example, in a specific application scenario of the present application such as an architectural scenic spot, a point cloud map constructed for the scenic spot is stored in the server. When the user wants to take a commemorative photo in the scenic spot, the user can open the application program on the terminal device so that the terminal device executes the virtual photographing method provided by the application, for example by opening the application on the display desktop of the terminal device or by scanning a two-dimensional code provided in the scenic spot.
Therefore, after the camera of the terminal device is turned on, the target content sent by the server can be displayed on the display interface of the terminal device: the terminal device is in the shooting state, so the target content is displayed on its shooting interface, i.e. the target content appears in the picture being framed by the user, which makes the shooting more interesting. It should be noted that the target content may be a virtual character, a picture, or the like generated based on AR technology.
When the user presses the shooting control on the terminal device, the terminal device generates a captured image and uploads it to the server; the server composites the target content with the captured image to obtain a composite image and sends it to the terminal device in the form of a 5G message. The user receives the 5G message through the terminal device and thereby acquires the composite image.
It is contemplated that the 5G message may carry a hyperlink that the user can click to download the composite image. A thumbnail of the composite image may also be carried in the 5G message to show the user the display effect of the composite image.
Therefore, the way the user shoots with the terminal device is unchanged, yet the scheme provides a personalized and highly interesting shooting experience: target content is displayed on the terminal device, the user can shoot in a personalized way according to the target content to obtain the desired effect, and the resulting composite image gives the user the highly interesting experience of appearing to interact with virtual characters, pictures and the like.
Fig. 5 is a schematic diagram of a virtual photographing apparatus according to an embodiment of the present application. The apparatus is configured to execute the virtual photographing method of the above embodiments and has the corresponding functional modules and beneficial effects. The virtual photographing apparatus comprises:
a first data receiving module 501 configured to receive a background image and location information sent by a terminal device;
a pose determining module 502 configured to determine a camera pose corresponding to the terminal device according to the background image and the position information;
a first data sending module 503 configured to send the target content associated with the camera pose to the terminal device to present the target content in the terminal device;
the similarity determination module 504 is configured to determine whether the similarity between the captured image and the background image meets a preset threshold value if the captured image currently captured by the terminal device is received;
and a second data transmission module 505 configured to generate a composite image corresponding to the target content and the photographed image and transmit the composite image to the terminal device, in case that the similarity of the photographed image and the background image satisfies a preset threshold.
On the basis of the foregoing embodiment, the position information includes a positioning parameter and an orientation parameter of the terminal device, and the pose determination module 502 is further configured to:
determining an initial position of the terminal equipment according to the positioning parameters, and determining a target area corresponding to the initial position in a pre-constructed point cloud map;
selecting corresponding point cloud data in the target area according to the orientation parameters;
extracting visual characteristic information in the background image;
and selecting target point cloud data matched with the visual characteristic information from the point cloud data of the target area according to the visual characteristic information so as to determine the pose of the camera.
On the basis of the above embodiment, the first data sending module 503 is further configured to:
determining corresponding target content according to the position information;
and determining a display part of the target content on the terminal equipment based on the camera pose, and sending the target content corresponding to the display part to the terminal equipment.
On the basis of the above embodiment, the visual feature information includes a first texture feature, and the pose determination module 502 is further configured such that selecting, according to the visual feature information, target point cloud data matching the visual feature information from the point cloud data of the target area to determine the camera pose comprises:
acquiring a second texture feature corresponding to the point cloud data of the target area;
and determining point cloud data corresponding to the second texture features with the highest matching degree with the first texture features as target point cloud data.
Fig. 6 is a schematic diagram of another virtual photographing apparatus according to an embodiment of the present application. The apparatus is configured to execute the virtual photographing method applied to the terminal device in the above embodiments and has the corresponding functional modules and beneficial effects. The virtual photographing apparatus comprises:
the device starting module 601 is configured to start the camera if the currently running application program is determined to be the target application program;
a third data sending module 602, configured to intercept a shooting picture corresponding to the camera, generate a background image, and send the background image and position information corresponding to the shooting background image to the server;
a second data receiving module 603 configured to receive the target content sent by the server and display the target content on the captured picture;
a fourth data transmission module 604 configured to transmit the acquired photographed image to the server in response to the photographing operation;
a third data receiving module 605 configured to receive the composite image corresponding to the captured image and the target content transmitted by the server.
On the basis of the above-described embodiment, the composite image is transmitted to the terminal device in the form of a 5G message. The virtual photographing device further comprises an information sending module, and the information sending module is further configured to:
and sending the number information associated with the terminal equipment to a server to receive the 5G message.
It should be noted that, in the embodiment of the virtual camera device, the functional modules included in the virtual camera device are only divided according to functional logic, but are not limited to the above division as long as the corresponding functions can be realized; in addition, specific names of the functional modules are only used for distinguishing one functional module from another, and are not used for limiting the protection scope of the application.
Fig. 7 is a schematic structural diagram of a virtual photographing device according to an embodiment of the present application. The virtual photographing device can be used to execute the virtual photographing method provided in the foregoing embodiments and has the corresponding functional modules and beneficial effects. As shown, the device includes a processor 701, a memory 702, an input device 703 and an output device 704; the number of processors 701 in the device may be one or more, and one processor 701 is taken as an example in the figure. The processor 701, the memory 702, the input device 703 and the output device 704 of the device may be connected by a bus or other means; connection by a bus is illustrated. The memory 702 is a computer-readable storage medium and can be used for storing software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the virtual photographing method in the embodiments of the present application. The processor 701 executes the various functional applications and data processing of the device by running the software programs, instructions and modules stored in the memory 702, that is, implements the virtual photographing method described above.
The memory 702 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and the application programs required for at least one function, and the data storage area may store data created according to the use of the terminal device, and the like. Further, the memory 702 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, the memory 702 may further include memory located remotely from the processor 701, which may be connected to the terminal device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 703 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function controls of the apparatus. The output device 704 may be used to send or display key signal outputs relating to user settings and function control of the apparatus.
The embodiment of the present application further provides a storage medium storing computer-executable instructions, and the computer-executable instructions are used for executing relevant operations in the virtual photographing method provided by the embodiment of the present application when executed by a processor.
Computer-readable storage media, including both permanent and non-permanent, removable and non-removable media, may implement the information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer-readable storage media include, but are not limited to, phase change memory (PRAM), static Random Access Memory (SRAM), dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), read Only Memory (ROM), electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of another identical element in a process, method, article, or apparatus that comprises the element.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present application and the technical principles employed. It will be understood by those skilled in the art that the present application is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the application. Therefore, although the present application has been described in more detail with reference to the above embodiments, the present application is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present application, and the scope of the present application is determined by the scope of the appended claims.

Claims (10)

1. A virtual photographing method is applied to a server, and comprises the following steps:
receiving a background image and position information sent by terminal equipment;
determining a camera pose corresponding to the terminal equipment according to the background image and the position information;
sending the target content associated with the camera pose to the terminal device to show the target content in the terminal device;
if a shot image currently shot by the terminal equipment is received, determining whether the similarity between the shot image and the background image meets a preset threshold;
and under the condition that the similarity between the shot image and the background image meets a preset threshold, generating a composite image corresponding to the target content and the shot image, and sending the composite image to the terminal equipment.
2. The virtual photographing method according to claim 1, wherein the location information includes a positioning parameter and an orientation parameter of the terminal device;
the determining, according to the background image and the position information, a camera pose corresponding to the terminal device includes:
determining an initial position of the terminal equipment according to the positioning parameters, and determining a target area corresponding to the initial position in a pre-constructed point cloud map;
selecting corresponding point cloud data in the target area according to the orientation parameters;
extracting visual characteristic information in the background image;
and selecting target point cloud data matched with the visual characteristic information from the point cloud data of the target area according to the visual characteristic information so as to determine the pose of the camera.
3. The virtual photography method according to claim 1 or 2, wherein the sending of the target content associated with the camera pose to the terminal device to present the target content in the terminal device comprises:
determining the corresponding target content according to the position information;
and determining a display part of the target content on the terminal equipment based on the camera pose, and sending the target content corresponding to the display part to the terminal equipment.
4. The virtual photographing method according to claim 2, wherein the visual feature information includes a first texture feature;
selecting target point cloud data matched with the visual feature information from the point cloud data of the target area according to the visual feature information to determine the camera pose comprises:
acquiring a second texture feature corresponding to the point cloud data of the target area;
and determining the point cloud data corresponding to the second texture features with the highest matching degree with the first texture features as the target point cloud data.
5. A virtual photographing method is applied to terminal equipment, the terminal equipment comprises a camera, and the method comprises the following steps:
if the currently running application program is determined to be the target application program, the camera is started;
capturing a shooting picture corresponding to the camera, generating a background image, and sending the background image and position information corresponding to the background image to a server;
receiving target content sent by the server, and displaying the target content in the shooting picture;
responding to shooting operation, and sending the obtained shot image to the server;
and receiving a composite image which is sent by the server and corresponds to the shooting image and the target content.
6. The virtual photographing method according to claim 5, wherein the composite image is transmitted to the terminal device in the form of a 5G message;
before the receiving the composite image corresponding to the shot image and the target content sent by the server, the method further includes:
and sending the number information associated with the terminal equipment to the server so as to receive the 5G message.
7. A virtual photographing device applied to a server, the device comprising:
the first data receiving module is configured to receive a background image and position information sent by the terminal equipment;
the pose determining module is configured to determine a camera pose corresponding to the terminal device according to the background image and the position information;
a first data sending module configured to send target content associated with the camera pose to the terminal device to present the target content in the terminal device;
the similarity judging module is configured to determine whether the similarity between the shot image and the background image meets a preset threshold value or not if the shot image shot at present by the terminal equipment is received;
and the second data sending module is configured to generate a composite image corresponding to the target content and the shot image and send the composite image to the terminal equipment under the condition that the similarity between the shot image and the background image meets a preset threshold value.
8. A virtual photographing device applied to a terminal device, wherein the terminal device comprises a camera, and the device comprises:
the equipment starting module is configured to start the camera if the currently running application program is determined to be a target application program;
the third data sending module is configured to intercept a shooting picture corresponding to the camera, generate a background image and send the background image and position information corresponding to the shooting of the background image to a server;
the second data receiving module is configured to receive the target content sent by the server and display the target content in the shooting picture;
a fourth data transmission module configured to transmit the acquired photographed image to the server in response to a photographing operation;
a third data receiving module configured to receive a composite image corresponding to the photographed image and the target content transmitted by the server.
9. A virtual camera device, comprising:
one or more processors;
a storage device for storing one or more programs which, when executed by one or more of the processors, cause the one or more processors to implement the virtual photo-taking method according to any one of claims 1 to 6.
10. A storage medium storing computer-executable instructions, which when executed by a processor, are configured to perform the virtual photography method of any of claims 1 to 6.
CN202211277030.8A 2022-10-18 2022-10-18 Virtual photographing method, device, equipment and storage medium Pending CN115767291A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211277030.8A CN115767291A (en) 2022-10-18 2022-10-18 Virtual photographing method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211277030.8A CN115767291A (en) 2022-10-18 2022-10-18 Virtual photographing method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115767291A true CN115767291A (en) 2023-03-07

Family

ID=85353781

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211277030.8A Pending CN115767291A (en) 2022-10-18 2022-10-18 Virtual photographing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115767291A (en)

Similar Documents

Publication Publication Date Title
US10057485B2 (en) Imaging apparatus and methods for generating a guide display using imaging height posture information
EP2225607B1 (en) Guided photography based on image capturing device rendered user recommendations
CN105320695B (en) Picture processing method and device
US11870951B2 (en) Photographing method and terminal
JP2008513852A (en) Method and system for identifying object in photograph, and program, recording medium, terminal and server for realizing the system
CN111680238B (en) Information sharing method, device and storage medium
TW201814552A (en) Method and system for sorting a search result with space objects, and a computer-readable storage device
JP4423929B2 (en) Image output device, image output method, image output processing program, image distribution server, and image distribution processing program
JP2022140458A (en) Information processing device, information processing method, and program
JP5596844B2 (en) Image processing apparatus, image processing system, and image processing method
JP2016194784A (en) Image management system, communication terminal, communication system, image management method, and program
JP2016194783A (en) Image management system, communication terminal, communication system, image management method, and program
JP6617547B2 (en) Image management system, image management method, and program
CN107343142A (en) The image pickup method and filming apparatus of a kind of photo
JP2008078836A (en) Camera, blog search system, and program
CN115767291A (en) Virtual photographing method, device, equipment and storage medium
JP2013214158A (en) Display image retrieval device, display control system, display control method, and program
CN114584704A (en) Shooting method and device and electronic equipment
JP5458411B2 (en) Photographed image display device, photographed image display method, and electronic map output by photographed image display method
CN111242107A (en) Method and electronic device for setting virtual object in space
JP2005107988A (en) Image output device, image output method, image output processing program, image delivery server and image delivery processing program
WO2022019171A1 (en) Information processing device, information processing method, and program
JP5409852B2 (en) Virtual landscape display device and virtual landscape display method
WO2022226930A1 (en) Data processing method, terminal device, unmanned aerial vehicle, system, and storage medium
JP5058842B2 (en) Virtual landscape display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination