CN115212565B - Method, apparatus and medium for setting virtual environment in virtual scene - Google Patents


Info

Publication number: CN115212565B (application CN202210924526.3A)
Authority: CN (China)
Prior art keywords: actual, virtual, environment, scene, information
Legal status: Active (an assumption, not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN115212565A
Inventor: 陈剑峰
Current assignee: Lingyue Digital Information Technology Co., Ltd.
Original assignee: Lingyue Digital Information Technology Co., Ltd.
Events: application filed by Lingyue Digital Information Technology Co., Ltd.; priority to CN202210924526.3A; publication of CN115212565A; application granted; publication of CN115212565B; anticipated expiration

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/70: Game security or game management aspects
    • A63F 13/77: Game security or game management aspects involving data related to game devices or game servers, e.g. configuration data, software version or amount of memory
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality


Abstract

The present disclosure relates to a method, apparatus, and medium for setting a virtual environment in a virtual scene. A method for setting up a virtual environment in a virtual scene, comprising: acquiring actual environment information of an actual scene corresponding to the virtual scene, acquired by a camera; and virtualizing the actual environment information to be displayed in the virtual scene as a virtual environment.

Description

Method, apparatus and medium for setting virtual environment in virtual scene
Technical Field
The present disclosure relates to the field of computer technology, and in particular, to setting a virtual environment in a virtual scene.
Background
With the development of computer technology, virtual reality technologies that combine virtual scenes with real scenes are attracting increasing attention. Virtual reality comprises a number of technical branches, such as AR (Augmented Reality), VR (Virtual Reality), and MR (Mixed Reality).
AR superimposes virtual information on a real scene through device recognition and judgment, so that the virtual information can be interacted with in real time within the real scene. VR is often referred to as immersive virtual reality: it presents a fully virtual environment that gives the user the sensation of being in a real world. MR refers to a new visual environment created by merging the real and virtual worlds, in which physical and digital objects coexist and interact in real time.
Based on the above virtual reality technologies, the concept of the metaverse has developed. The metaverse is a virtual world, built on and linked by virtual reality technology, that maps onto and interacts with the real world: a digital living space with a novel social system.
Disclosure of Invention
In setting up a virtual environment in a virtual scene (e.g., the weather conditions of the virtual scene, the surroundings of the virtual scene, or the people or things present in it), one approach known to the inventors of the present disclosure is to set a fixed virtual environment in the virtual scene. For example, when setting the weather conditions in the virtual scene, the weather of a particular season is adopted as the fixed weather of the virtual scene. However, such a setting may be disconnected from the real environment and may not feel realistic to the user.
The present disclosure is made in view of one or more of the above problems and proposes a mechanism for setting a virtual environment in a virtual scene that can make the virtual environment more realistic for the user, thereby improving the user experience.
According to one aspect of the present disclosure, there is provided a method for setting up a virtual environment in a virtual scene, including: acquiring actual environment information of an actual scene corresponding to the virtual scene, acquired by a camera; and virtualizing the actual environment information to be displayed in the virtual scene as a virtual environment.
According to another aspect of the present disclosure, there is provided an apparatus for setting a virtual environment in a virtual scene, including: a memory having instructions stored thereon; and a processor configured to execute instructions stored on the memory to perform the following: acquiring actual environment information of an actual scene corresponding to the virtual scene, acquired by a camera; and virtualizing the actual environment information to be displayed in the virtual scene as a virtual environment.
According to yet another aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, causes the processor to perform a method according to the present disclosure.
Drawings
The present disclosure will now be described in the following detailed description with reference to the figures, wherein like reference numerals refer to the same or similar components throughout the several views. It is to be understood that the drawings are not necessarily to scale and that they are merely intended to illustrate exemplary embodiments of the disclosure and should not be taken as limiting the scope of the disclosure. Wherein:
FIG. 1 illustrates an exemplary configuration block diagram of an apparatus for setting up a virtual environment in a virtual scene according to an embodiment of the present disclosure;
FIG. 2 illustrates an exemplary flow chart of a method for setting up a virtual environment in a virtual scene according to an embodiment of the disclosure;
FIG. 3 illustrates an exemplary configuration block diagram of an apparatus for setting up a virtual environment in a virtual scene according to another embodiment of the present disclosure;
FIG. 4 illustrates an exemplary flow chart of a method for setting up a virtual environment in a virtual scene according to another embodiment of the disclosure; and
FIG. 5 illustrates an exemplary configuration of a computing device in which embodiments according to the invention may be implemented.
Detailed Description
Various exemplary embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings. It is to be understood that the description of the various exemplary embodiments is merely illustrative and not intended to be any limitation on the techniques of the present disclosure. The relative arrangement of components and steps, expressions and numerical values in the exemplary embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.
Fig. 1 illustrates an exemplary configuration block diagram of an apparatus 1000 for setting a virtual environment in a virtual scene according to an embodiment of the present disclosure.
As shown in fig. 1, in some embodiments, apparatus 1000 may comprise a processor 1010. The processor 1010 of the device 1000 provides various functions of the device 1000. In some embodiments, the processor 1010 of the apparatus 1000 may be configured to perform a method 2000 for setting up a virtual environment in a virtual scene (described below with reference to fig. 2). Specifically, as shown in fig. 1, the processor 1010 of the apparatus 1000 may include an actual environment acquisition unit 1020 and a virtualization unit 1030 configured to perform steps S2010 and S2020, respectively, in a method 2000 for setting a virtual environment in a virtual scene shown in fig. 2 described later. It should be understood that the various units of the apparatus 1000 shown in fig. 1 are merely logic modules divided according to the specific functions they implement, and are not intended to limit the specific implementation. In actual implementation, each module described above may be implemented as an independent physical entity, or may be implemented by a single entity (e.g., a processor (CPU or DSP, etc.), an integrated circuit, etc.).
The processor 1010 of the apparatus 1000 may refer to various implementations of digital circuitry, analog circuitry, or mixed-signal (a combination of analog and digital) circuitry that perform functions in a computing system. The processing circuitry may include, for example, circuitry such as an Integrated Circuit (IC), an Application Specific Integrated Circuit (ASIC), a portion or circuit of an individual processor core, an entire processor core, an individual processor, a programmable hardware device such as a Field Programmable Gate Array (FPGA), and/or a system including multiple processors.
In some embodiments, the apparatus 1000 may also include a memory (not shown). The memory of apparatus 1000 may store information generated by processor 1010 as well as programs and data for operation of processor 1010. The memory may be volatile memory and/or nonvolatile memory. For example, the memory may include, but is not limited to, Random Access Memory (RAM), Dynamic Random Access Memory (DRAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), and flash memory. In addition, the apparatus 1000 may be implemented at a chip level, or may be implemented at a device level by including other external components.
Fig. 2 illustrates an exemplary flowchart of a method 2000 for setting up a virtual environment in a virtual scene, according to an embodiment of the present disclosure. This method may be used, for example, in an apparatus 1000 as shown in fig. 1.
The virtual scenes described in this disclosure are virtual scenes presented to a user using virtual reality techniques, including but not limited to virtual scenes presented using AR, VR, MR, metaverse, and similar techniques, such as virtual exhibition halls, virtual stadiums, and the like. In addition, a virtual scene described in the present disclosure is not limited to a completely virtual scene and may combine a virtual scene with a real scene. The virtual environment described in the present disclosure is an environment provided in such a virtual scene, and includes, but is not limited to, the weather conditions of the virtual scene (sunny, cloudy, rain, snow, fog, etc.), the surroundings of the virtual scene (roads, traffic information, vegetation conditions, etc.), persons or objects present in the virtual scene, and any other environment related to the virtual scene.
In addition, the user described in this disclosure is a user who experiences a virtual scene using virtual reality technology. The user enters the virtual scene through a worn virtual reality device, controls the avatar that represents the user in the virtual scene, experiences the virtual environment of the virtual scene, interacts with other users in the virtual scene, interacts with virtual objects in the virtual scene, and so on.
As shown in fig. 2, in S2010, the real environment acquisition unit 1020 acquires real environment information of a real scene corresponding to a virtual scene acquired by a camera.
Next, the case where the virtual scene is a virtual stadium will be described as an example. In this case, the corresponding actual scene is an actual stadium. By interacting with the virtual reality device, the user can experience the virtual stadium (for example, watch a virtual sports event in it) and gain an immersive sense of being in the actual stadium without traveling there.
In order to realize such a virtual scene, a virtual environment of the virtual scene needs to be set. One approach known to the inventors of the present disclosure is to set a fixed virtual environment in the virtual scene, for example, fixedly adopting the weather of a particular season as the weather of the virtual scene. However, such a setting may be disconnected from the real environment and may not feel realistic to the user.
In the present disclosure, by contrast, actual environment information of the actual scene corresponding to the virtual scene, as collected by a camera, is acquired so that the virtual environment in the virtual scene can subsequently be set from it. This brings the setting of the virtual environment closer to the real world and improves the user's sense of reality when experiencing the virtual scene.
In some embodiments, the camera may be disposed in an actual stadium to collect actual environmental information of the actual stadium. The actual environmental information includes, but is not limited to, at least one of actual weather information (e.g., weather conditions of a stadium), ambient environmental information of an actual scene (e.g., roads around the stadium, traffic information, vegetation status, etc.), people or things in the actual scene (e.g., athletes, spectators, appliances, etc. in the stadium). For example, a plurality of cameras may be arranged in an actual stadium so as to acquire images, videos, etc. of the actual environment of the stadium from different angles.
In some embodiments, the actual environment information may be real-time environment information of the actual scene collected in real time by the camera. The apparatus 1000 may be configured to communicate with the camera via a communication module (not shown), for example over a wireless link, so as to acquire the real-time environment information as the camera collects it. Alternatively, the environment information collected in real time by the camera may be uploaded to a server, and the apparatus 1000 obtains it by communicating with the server. In other embodiments, the actual environment information collected by the camera may be stored in advance, either locally or on a server, and the apparatus 1000 obtains the pre-stored information from local storage or from the server.
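As a rough sketch of the acquisition paths just described (real-time from the camera, via a server, or from pre-stored information), the following code illustrates one possible structure; the class and method names are illustrative assumptions, not part of the patent:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class EnvironmentInfo:
    """One sample of actual-environment information (hypothetical schema)."""
    weather: str
    source: str  # "camera", "server", or "prestored"


class ActualEnvironmentAcquirer:
    """Sketch of acquisition unit 1020: prefer real-time camera data,
    then a server, then locally pre-stored information."""

    def __init__(self, camera=None, server=None, prestored=None):
        self.camera = camera        # live camera link (e.g. wireless)
        self.server = server        # server the camera uploads to
        self.prestored = prestored  # pre-collected fallback information

    def acquire(self) -> Optional[EnvironmentInfo]:
        # Real-time acquisition (step S2010): try live sources first.
        for source in (self.camera, self.server):
            if source is not None:
                info = source.fetch()
                if info is not None:
                    return info
        # Otherwise fall back to pre-stored actual environment information.
        return self.prestored
```

A caller would wire in whichever sources exist and simply call `acquire()` before virtualization.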
Next, in S2020, the virtualization unit 1030 virtualizes the actual environment information acquired by the actual environment acquisition unit 1020 in S2010 to be displayed as a virtual environment in the virtual scene.
In this way, a virtual environment matching the corresponding actual scene (e.g., the actual stadium) can be displayed in the virtual scene (e.g., the virtual stadium), making the virtual environment more realistic for the user and improving the user experience.
For example, when the current weather at the actual stadium is cloudy, the cloudy-weather environment information collected in real time by the camera can be virtualized so that the virtual weather in the virtual scene is also cloudy. Further, when the weather at the actual stadium changes (for example, it starts to rain), the rainy-weather information collected in real time can be virtualized in turn, so that the virtual weather in the virtual scene also changes to rain. The virtual environment thus changes in synchronization with the real scene, making the user feel as if present in the real scene and improving the user experience.
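The weather synchronization above can be sketched minimally as follows; `VirtualScene` and `virtualize_weather` are illustrative names of my own, not from the patent:

```python
class VirtualScene:
    """Sketch of the virtualization step S2020 for weather: the virtual
    weather simply tracks each actual-weather observation."""

    def __init__(self):
        self.weather = None   # current virtual weather
        self.history = []     # sequence of weather changes applied

    def virtualize_weather(self, actual_weather: str) -> None:
        if actual_weather != self.weather:  # weather changed in the actual scene
            self.weather = actual_weather   # ...so change the virtual scene too
            self.history.append(actual_weather)
```

Feeding it the stream "cloudy", "cloudy", "rainy" leaves the scene rainy and records exactly the two distinct changes.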
In some embodiments, environment-related key targets (such as the sun or clouds) can be extracted from the images collected by the camera through image recognition, and the extracted key targets can be used to update the corresponding virtual key targets in the virtual scene, so that changes of the key targets in the actual environment are mirrored in the virtual scene.
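A real system would run an object detector on the camera frames; the sketch below assumes the detector's output is already available as labeled detections and shows only the mirroring logic (all names are illustrative):

```python
# Environment-related key targets the virtual scene is allowed to mirror.
KEY_TARGETS = {"sun", "cloud", "rain"}


def extract_key_targets(detections):
    """`detections` stands in for the output of an image-recognition model:
    a list of {"label": ...} dicts. Keep only environment-related targets."""
    return {d["label"] for d in detections if d["label"] in KEY_TARGETS}


def sync_virtual_targets(virtual_scene: dict, targets: set) -> dict:
    """Show exactly the detected key targets in the virtual scene."""
    for t in KEY_TARGETS:
        virtual_scene[t] = t in targets
    return virtual_scene
```

Non-environment detections (e.g. a passer-by) are filtered out before the virtual scene is updated.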
In some embodiments, an RGB camera, an infrared camera, a wide-angle camera, or the like may be selected for collecting the actual environment according to the weather conditions, so as to obtain the best collection results.
It should be understood that the above description takes actual weather conditions as an example; other actual environment information of the actual scene collected by the camera may likewise be acquired and virtualized into the virtual scene. For example, surrounding-environment information (e.g., vegetation conditions) of the actual scene may be acquired periodically and virtualized into the virtual scene. In this way, the virtual vegetation can be displayed in synchronization with the actual vegetation, reflecting its changes throughout the year in the virtual scene.
An embodiment of setting a virtual environment that is more realistic for a user in a virtual scene is described above with reference to fig. 1 and 2. In some cases, a user may wish to be able to autonomously set a virtual environment in a virtual scene. In some embodiments, the apparatus 1000 may be configured to provide the user with setup options for different virtual environments.
As an example, in some embodiments, where the actual environment information includes actual weather information, the actual environment acquisition unit 1020 may acquire multiple pieces of actual weather information from multiple actual scenes, so that a user in the virtual scene can select the actual weather information to be virtualized. For example, still taking a virtual stadium as the virtual scene, the actual environment acquisition unit 1020 may acquire actual weather information from actual stadiums located in different regions, for example sunny weather at a stadium in Sanya, light snow at a stadium in Harbin, and cloudy weather at a stadium in Beijing. In some embodiments, the apparatus 1000 may be configured to display these three pieces of weather information to the user via a user interface (e.g., sent to the user's virtual reality device) so that the user can select the actual weather information to virtualize. When the user selects, for example via the virtual reality device, the sunny weather of the Sanya stadium, the virtualization unit 1030 virtualizes that weather information so that virtual weather matching a sunny day at the Sanya stadium is displayed in the virtual scene.
In addition, in some embodiments, multiple pieces of actual weather information from different time periods (e.g., sunny, rainy, foggy) may be acquired for a single actual scene, so that a user in the virtual scene can select the actual weather information to be virtualized.
Moreover, the virtual environment settings offered to the user are not limited to weather information; other environment information may also be offered for the user's personalized selection and setting.
By providing the user with setting options for different virtual environments in this way, a personalized virtual environment setting scheme can be offered, improving the user experience.
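The selection flow described above can be sketched as two small steps, offering the options and virtualizing the user's choice; the option schema and function names are assumptions for illustration:

```python
def offer_weather_options(scenes):
    """Build the option list shown on the user interface (acquisition side).
    `scenes` maps an actual-scene name to its current actual weather."""
    return sorted(scenes.items())


def virtualize_selected(options, choice_index):
    """Virtualize the weather the user picked (virtualization side)."""
    scene_name, weather = options[choice_index]
    return {"from_scene": scene_name, "virtual_weather": weather}
```

With the three stadiums from the example, a user picking the sunny option would get virtual weather sourced from that stadium.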
In some embodiments, where the actual environment information includes a person or thing in the actual scene, that person or thing is also virtualized as part of the virtual environment. For example, the actual environment information collected by the camera may include a real person or object in the actual scene; this information is likewise virtualized into the virtual environment, so that an avatar corresponding to the real person or object is displayed in the virtual scene as part of the virtual environment. In some embodiments, persons in the virtual environment displayed in the virtual scene may additionally be blurred, thereby protecting their privacy.
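In practice the blurring would be done with an image library over detected person regions; the following stdlib-only sketch shows the idea on a grayscale frame represented as a list of rows, pixelating a given bounding box (the function name and box convention are my own):

```python
def pixelate_region(img, box, block=2):
    """Obscure a detected person by averaging block x block tiles inside
    box = (x0, y0, x1, y1); img is a grayscale frame as a list of rows."""
    x0, y0, x1, y1 = box
    for ty in range(y0, y1, block):
        for tx in range(x0, x1, block):
            ys = range(ty, min(ty + block, y1))
            xs = range(tx, min(tx + block, x1))
            tile = [img[y][x] for y in ys for x in xs]
            avg = sum(tile) // len(tile)          # tile's mean intensity
            for y in ys:
                for x in xs:
                    img[y][x] = avg               # flatten detail away
    return img
```

Only the region inside the box is altered, so the rest of the virtualized environment keeps its detail.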
In some embodiments, when a virtual person is placed in the virtual scene, the display proportions of the virtual person and of the avatars of actual persons may be adjusted (for example, made consistent with each other), so that they are displayed in a coordinated way and neither the virtual person nor an actual person's avatar appears too large or too small.
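One simple way to keep proportions consistent is to scale every figure to a common reference height; the reference value and names below are assumptions for illustration:

```python
REFERENCE_HEIGHT = 1.75  # assumed display reference height, in metres


def scale_factors(heights):
    """Per-figure scale factor so every virtual person and every actual
    person's avatar is displayed at the same reference height."""
    return {name: REFERENCE_HEIGHT / h for name, h in heights.items()}
```

A virtual person modeled twice as tall as the reference gets a factor of 0.5, while an avatar already at reference height is left unscaled.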
Next, an exemplary configuration block diagram of an apparatus 3000 for setting a virtual environment in a virtual scene according to another embodiment of the present disclosure is described with reference to fig. 3.
As shown in fig. 3, in some embodiments, apparatus 3000 may comprise a processor 3010. The processor 3010 of the device 3000 provides various functions of the device 3000. In some embodiments, the processor 3010 of the apparatus 3000 may be configured to perform a method 4000 for setting up a virtual environment in a virtual scene (described below with reference to fig. 4). Specifically, as shown in fig. 3, the processor 3010 of the apparatus 3000 may include an image specification determination unit 3020 and a virtualization unit 3030, configured to perform steps S4010 and S4020, respectively, in a method 4000 for setting a virtual environment in a virtual scene shown in fig. 4 described later. It should be understood that the various units of the apparatus 3000 shown in fig. 3 are merely logic modules divided according to the specific functions they implement, and are not intended to limit the specific implementation. In actual implementation, each module described above may be implemented as an independent physical entity, or may be implemented by a single entity (e.g., a processor (CPU or DSP, etc.), an integrated circuit, etc.).
The processor 3010 of the apparatus 3000 may refer to various implementations of digital circuitry, analog circuitry, or mixed-signal (a combination of analog and digital) circuitry that performs functions in a computing system. The processing circuitry may include, for example, circuitry such as an Integrated Circuit (IC), an Application Specific Integrated Circuit (ASIC), a portion or circuit of an individual processor core, an entire processor core, an individual processor, a programmable hardware device such as a Field Programmable Gate Array (FPGA), and/or a system including multiple processors.
In some embodiments, the apparatus 3000 may also include a memory (not shown). The memory of apparatus 3000 may store information generated by processor 3010 as well as programs and data for operation of processor 3010. The memory may be volatile memory and/or nonvolatile memory. For example, the memory may include, but is not limited to, Random Access Memory (RAM), Dynamic Random Access Memory (DRAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), and flash memory. In addition, the apparatus 3000 may be implemented at a chip level, or may be implemented at a device level by including other external components.
Fig. 4 illustrates an exemplary flowchart of a method 4000 for setting up a virtual environment in a virtual scene, according to an embodiment of the present disclosure. This method may be used, for example, with the apparatus 3000 shown in fig. 3.
As shown in fig. 4, in S4010, the image specification determining unit 3020 may determine whether the image specification of the real environment information acquired in real time is lower than a predetermined threshold. The real environment information acquired in real time may be acquired in S2010 by the real environment acquisition unit 1020, for example.
In this application, image specification refers to parameters related to the quality of an image, including but not limited to image sharpness, image contrast, and image resolution. The images described herein may include still images, dynamically captured video, and the like. In some embodiments, the predetermined threshold may be preset according to actual needs. An image specification of the real-time collected actual environment information below the predetermined threshold indicates that the image quality of the collected image may not meet the requirements of the subsequent virtualization, and the virtualization result may therefore be degraded.
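As a minimal stand-in for the image-specification check of S4010, the sketch below uses grayscale contrast as the single quality metric (a real implementation would combine sharpness, contrast, and resolution; the metric choice and threshold are assumptions):

```python
def contrast(img):
    """One simple image-specification metric: the grayscale contrast
    (max minus min pixel value) of a frame given as a list of rows."""
    flat = [p for row in img for p in row]
    return max(flat) - min(flat)


def needs_optimization(img, threshold=50):
    """S4010: True when the frame's specification falls below the
    predetermined threshold, triggering the optimization of S4020."""
    return contrast(img) < threshold
```

A nearly uniform (e.g. fogged or blocked) frame fails the check, while a well-exposed frame passes.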
Therefore, in S4020, when the image specification determining unit 3020 has determined in S4010 that the image specification of the real-time collected actual environment information is below the predetermined threshold, the virtualization unit 3030 performs an optimized virtualization process. The virtualization unit 3030 may be implemented by the virtualization unit 1030 shown in fig. 1, or may be a unit distinct from the virtualization unit 1030.
Next, the optimization processing of the virtualization by the virtualization unit 3030 will be specifically described.
In some embodiments, when the image specification of the real-time collected actual environment information is below the predetermined threshold, the virtualization unit 3030 may replace the real-time collected information with actual environment information collected in advance. For example, because of real-time weather conditions (such as rain, snow, or fog), occlusion of the camera, or poor light (such as at night), the image quality of the environment information collected in real time may fall short of the expected quality, so that its image specification drops below the predetermined threshold.
In this case, actual environment information collected in advance may be used instead of the information collected in real time. The pre-collected information may, for example, have been captured in good weather (sunny days, etc.), with an unobstructed camera, or in sufficient daylight, so that its images have a higher specification (e.g., higher sharpness or contrast) than those collected in real time. Substituting the pre-collected information therefore raises the quality of the information used for virtualization and ensures that the virtual environment displayed in the virtual scene is of higher quality.
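This substitution policy reduces to a single decision; the frame representation (a dict carrying a precomputed `spec` score) is an assumption for illustration:

```python
def select_frame(realtime, pre_acquired, threshold=50):
    """S4020, substitution variant: fall back to the pre-collected frame
    when the real-time frame's specification score is below the threshold."""
    if realtime["spec"] < threshold:
        return pre_acquired   # replacement: higher-quality archival frame
    return realtime           # real-time frame is good enough to virtualize
```

The pre-collected frame is used only when the live one fails the quality check, so real-time data is still preferred whenever possible.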
In some embodiments, when the image specification of the real-time collected actual environment information is below the predetermined threshold, the virtualization unit 3030 may instead fuse pre-collected actual environment information with the real-time collected information and virtualize the fused result. As above, the pre-collected information may have been captured in good weather, with an unobstructed camera, or in sufficient daylight, so that its images have a higher specification (e.g., higher sharpness or contrast) than those collected in real time.
In this embodiment, the pre-collected and real-time collected actual environment information are used together, so that the pre-collected information can appropriately compensate for the real-time information. For example, a part of the actual environment that was not captured because the camera was occluded can be supplemented from the pre-collected information. Real-time collection thus preserves the timeliness of the environment information, while the pre-collected information compensates for its image specification, so that the virtual environment displayed in the virtual scene has both good timeliness and high quality.
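A toy version of this fusion marks pixels the camera missed (here `None`, e.g. due to occlusion) and fills them from the pre-collected frame; the representation is an illustrative assumption:

```python
def fuse_frames(realtime, pre_acquired):
    """S4020, fusion variant: keep real-time pixels where available and
    fill pixels lost to occlusion or low light (None) from the
    pre-collected frame of the same scene."""
    return [
        [rt if rt is not None else pa for rt, pa in zip(rt_row, pa_row)]
        for rt_row, pa_row in zip(realtime, pre_acquired)
    ]
```

Valid live pixels always win, so the fused frame stays as current as the camera allows.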
In some embodiments, when the image specification of the real-time collected actual environment information is below the predetermined threshold, real-time environment information collected by other cameras may additionally be acquired, for example using the actual environment acquisition unit 1020. The virtualization unit 3030 may then fuse the information collected in real time by the original camera with that collected in real time by the other cameras, and virtualize the fused result.
In this way, a plurality of cameras arranged at different positions in the actual scene can collect actual environment information from different angles in real time, and fusing that information gives the virtual environment displayed in the virtual scene higher quality while still guaranteeing real-time performance.
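A minimal sketch of the multi-camera variant. The patent does not specify the fusion operation, so a contrast-weighted per-pixel mean is used here as one plausible choice, under the simplifying assumption that all frames have already been registered to a common viewpoint:

```python
import numpy as np


def fuse_cameras(frames: list) -> np.ndarray:
    """Fuse simultaneously captured frames from several cameras by a
    contrast-weighted mean, so sharper frames contribute more.
    Assumes the frames are registered to one common view."""
    stack = np.stack([f.astype(np.float64) for f in frames])
    weights = np.array([f.std() + 1e-6 for f in stack])  # favor higher contrast
    weights /= weights.sum()
    fused = np.tensordot(weights, stack, axes=1)  # weighted sum over frames
    return fused.astype(np.uint8)
```

For two equally flat frames the weights degenerate to a plain average; in practice one would also register the views and could weight per-region rather than per-frame.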
Fig. 5 illustrates an exemplary configuration of a computing device 500 capable of implementing embodiments of the present invention.
Computing device 500 is an example of a hardware device to which the above aspects of the invention can be applied. Computing device 500 may be any machine configured to perform processing and/or calculations. Computing device 500 may be, but is not limited to, a workstation, a server, a desktop computer, a laptop computer, a tablet computer, a Personal Data Assistant (PDA), a smart phone, an in-vehicle computer, or a combination thereof.
As shown in fig. 5, computing device 500 may include one or more elements that may be connected to or in communication with bus 502 via one or more interfaces. Bus 502 can include, but is not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus. Computing device 500 may include, for example, one or more processors 504, one or more input devices 506, and one or more output devices 508. The one or more processors 504 may be any kind of processor and may include, but are not limited to, one or more general-purpose processors or special-purpose processors (such as special-purpose processing chips). Processor 504 may correspond, for example, to processor 1010 in fig. 1 or processor 3010 in fig. 3, and is configured to implement the functionality of the units of the apparatus for setting a virtual environment in a virtual scene of the present invention. Input device 506 may be any type of device capable of inputting information to the computing device and may include, but is not limited to, a mouse, a keyboard, a touch screen, a microphone, and/or a remote controller. Output device 508 may be any type of device capable of presenting information and may include, but is not limited to, a display, speakers, video/audio output terminals, a vibrator, and/or a printer.
Computing device 500 may also include or be connected to a non-transitory storage device 514. Non-transitory storage device 514 may be any storage device that is non-transitory and capable of storing data, and may include, but is not limited to, a disk drive, an optical storage device, solid-state memory, a floppy disk, a flexible disk, a hard disk, magnetic tape or any other magnetic medium, a compact disc or any other optical medium, cache memory and/or any other memory chip or module, and/or any other medium from which a computer may read data, instructions, and/or code. Computing device 500 may also include random access memory (RAM) 510 and read-only memory (ROM) 512. ROM 512 may store programs, utilities, or processes to be executed in a non-volatile manner. RAM 510 may provide volatile data storage and store instructions related to the operation of computing device 500. Computing device 500 may also include a network/bus interface 516 coupled to a data link 518. Network/bus interface 516 can be any kind of device or system capable of enabling communication with external equipment and/or a network, and can include, but is not limited to, a modem, a network card, an infrared communication device, a wireless communication device, and/or a chipset (such as Bluetooth™ devices, IEEE 802.11 devices, Wi-Fi devices, WiMax devices, mobile cellular communication facilities, etc.).
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, and/or components, and/or groups thereof. Furthermore, in the description of the present disclosure, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or order. Furthermore, in the description of the present disclosure, unless otherwise indicated, the meaning of "a plurality" is two or more.
Reference in the specification to "an embodiment" or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, the appearances of the phrase "in an embodiment of the present disclosure" or similar expressions in this specification are not necessarily referring to the same embodiment.
Those skilled in the art will appreciate that the present disclosure may be implemented in various forms, such as an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, microcode, etc.), or an embodiment combining software and hardware, which may all generally be referred to herein as a "circuit," "module," "unit," or "system." Furthermore, the present disclosure may also be embodied as a computer program product in any tangible medium having computer-usable program code stored thereon.
Aspects of the present disclosure are described with reference to flowchart illustrations and/or block diagrams of systems, apparatuses, methods, and computer program products according to specific embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and any combination of blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, when executed via the computer or other programmable data processing apparatus, create means for implementing the functions or acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, apparatuses, methods, and computer program products according to various embodiments of the present disclosure. Each block in the flowchart or block diagrams may represent a module, segment, or portion of program code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative embodiments, the functions noted in the blocks may occur out of the order shown in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks therein, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.
The foregoing description of the embodiments of the present disclosure has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the technical improvement of market technology, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (7)

1. A method for setting up a virtual environment in a virtual scene, comprising:
acquiring actual environment information of an actual scene corresponding to the virtual scene, the actual environment information being acquired in real time by a camera;
virtualizing the actual environment information to be displayed in the virtual scene as a virtual environment;
determining whether an image specification of the actual environment information acquired in real time is lower than a predetermined threshold, the image specification being a parameter related to image quality of the actual environment information acquired in real time; and
in a case where the image specification of the actual environment information acquired in real time is lower than the predetermined threshold, fusing pre-acquired actual environment information with the actual environment information acquired in real time, and virtualizing the fused actual environment information,
wherein the actual environmental information includes at least one of actual weather information, surrounding environmental information of the actual scene, and information of a person or object in the actual scene.
2. The method of claim 1, wherein the actual environmental information comprises actual weather information, the method further comprising:
acquiring a plurality of pieces of actual weather information of a plurality of actual scenes, so that a user in the virtual scene can select the actual weather information to be virtualized.
3. The method of claim 1, wherein the actual environment information comprises a person in the actual scene virtualized as part of the virtual environment, the method further comprising:
blurring the person in the virtual environment displayed in the virtual scene.
4. An apparatus for setting up a virtual environment in a virtual scene, comprising:
a memory having instructions stored thereon; and
a processor configured to execute instructions stored on the memory to perform the following:
acquiring actual environment information of an actual scene corresponding to the virtual scene, the actual environment information being acquired in real time by a camera;
virtualizing the actual environment information to be displayed in the virtual scene as a virtual environment;
determining whether an image specification of the actual environment information acquired in real time is lower than a predetermined threshold; and
in a case where the image specification of the actual environment information acquired in real time is lower than the predetermined threshold, fusing pre-acquired actual environment information with the actual environment information acquired in real time, and virtualizing the fused actual environment information,
wherein the actual environmental information includes at least one of actual weather information, surrounding environmental information of the actual scene, and information of a person or object in the actual scene.
5. The apparatus of claim 4, wherein the actual environmental information comprises actual weather information, the processor further configured to execute instructions stored on the memory to:
acquiring a plurality of pieces of actual weather information of a plurality of actual scenes, so that a user in the virtual scene can select the actual weather information to be virtualized.
6. The apparatus of claim 4, wherein the actual environment information comprises a person in the actual scene virtualized as part of the virtual environment, the processor further configured to execute instructions stored on the memory to:
blurring the person in the virtual environment displayed in the virtual scene.
7. A computer readable storage medium having stored thereon a computer program which, when executed by a processor, causes the processor to perform the method of any of claims 1 to 3.
CN202210924526.3A 2022-08-02 2022-08-02 Method, apparatus and medium for setting virtual environment in virtual scene Active CN115212565B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210924526.3A CN115212565B (en) 2022-08-02 2022-08-02 Method, apparatus and medium for setting virtual environment in virtual scene

Publications (2)

Publication Number Publication Date
CN115212565A CN115212565A (en) 2022-10-21
CN115212565B (en) 2024-03-26

Family

ID=83615890


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105955456A (en) * 2016-04-15 2016-09-21 深圳超多维光电子有限公司 Virtual reality and augmented reality fusion method, device and intelligent wearable equipment
CN106971426A (en) * 2017-04-14 2017-07-21 陈柳华 A kind of method that virtual reality is merged with real scene
CN107229335A (en) * 2017-06-01 2017-10-03 北京小鸟看看科技有限公司 A kind of replacing options of virtual reality scenario, device and wear display device
WO2018058601A1 (en) * 2016-09-30 2018-04-05 深圳达闼科技控股有限公司 Method and system for fusing virtuality and reality, and virtual reality device
CN108614635A (en) * 2016-12-12 2018-10-02 北京康得新创科技股份有限公司 The control method and device of virtual reality device, virtual reality device
CN110674422A (en) * 2019-09-17 2020-01-10 西安时代科技有限公司 Method and system for realizing virtual scene display according to real scene information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant