CN106598217B - Display method, display device and electronic equipment - Google Patents
- Publication number
- CN106598217B CN106598217B CN201610982557.9A CN201610982557A CN106598217B CN 106598217 B CN106598217 B CN 106598217B CN 201610982557 A CN201610982557 A CN 201610982557A CN 106598217 B CN106598217 B CN 106598217B
- Authority
- CN
- China
- Legal status: Active (an assumption, not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/08—Indexing scheme for image data processing or generation, in general involving all processing steps from image acquisition to 3D model generation
Abstract
The present disclosure relates to a display method and a display device. The display method includes: acquiring a real environment image; processing the real environment image to generate a virtual environment image; and displaying a virtual object in the virtual environment image. With this technical solution, on the one hand, the virtual environment image and the virtual object are displayed in the same or closely similar styles; compared with displaying the virtual object directly in the real environment image, this gives the virtual object a stronger sense of immersion and thus improves the viewing experience. On the other hand, because the virtual environment image is derived from the real environment image, the user can learn about the real environment by viewing the virtual environment image, and can therefore interact with the real environment while viewing the virtual object in the virtual environment image.
Description
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to a display method, a display device, and an electronic apparatus.
Background
Virtual Reality (VR) technology and Augmented Reality (AR) technology are receiving increasing attention, and products applying these technologies are becoming increasingly common.
Virtual reality technology mainly displays a virtual environment and virtual objects on a device, creating an immersive experience for the user. However, while wearing the device the user typically views the displayed content with both eyes and therefore cannot perceive the real environment.
Augmented reality technology mainly displays the real environment together with virtual objects on the device, which improves the user's interaction with the real environment. However, the real environment and the virtual objects are rendered in markedly different display styles, resulting in a poor viewing experience.
Disclosure of Invention
The present disclosure provides a display method, a display apparatus, and an electronic device to solve the disadvantages of the related art.
According to a first aspect of the embodiments of the present disclosure, there is provided a display method including:
acquiring a real environment image;
processing the real environment image to generate a virtual environment image;
and displaying a virtual object in the virtual environment image.
Optionally, the processing of the real environment image to generate a virtual environment image includes:
determining attribute information of the virtual object;
and processing the real environment image according to the attribute information to generate a virtual environment image matched with the attribute information.
Optionally, the attribute information includes at least one of:
size, color, and display style.
Optionally, the display method further includes:
determining whether a first preset instruction is received;
if the first preset instruction is received, acquiring a real environment image;
and if the first preset instruction is not received, displaying the virtual object in a preset virtual environment image.
Optionally, the display method further includes:
determining whether a second preset instruction is received;
if the second preset instruction is received, processing the real environment image to generate a virtual environment image;
and if the second preset instruction is not received, displaying the virtual object in the image of the real environment.
According to a second aspect of the embodiments of the present disclosure, there is provided a display device including:
an acquisition unit configured to acquire a real environment image;
a processing unit configured to process the real environment image to generate a virtual environment image;
a display unit configured to display a virtual object in the virtual environment image.
Optionally, the processing unit comprises:
a determination subunit configured to determine attribute information of the virtual object;
a generating subunit configured to process the real environment image according to the attribute information to generate a virtual environment image matching the attribute information.
Optionally, the attribute information includes at least one of:
size, color, and display style.
Optionally, the display device further includes:
a first determination unit configured to determine whether a first preset instruction is received;
the acquisition unit acquires a real environment image when the first determination unit determines that the first preset instruction is received, and the display unit displays the virtual object in a preset virtual environment image when the first determination unit determines that the first preset instruction is not received.
Optionally, the display device further includes:
a second determination unit configured to determine whether a second preset instruction is received;
the processing unit processes the real environment image to generate a virtual environment image when the second determining unit determines that the second preset instruction is received, and the display unit displays the virtual object in the real environment image when the second determining unit determines that the second preset instruction is not received.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring a real environment image;
processing the real environment image to generate a virtual environment image;
and displaying a virtual object in the virtual environment image.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects:
According to the above embodiments, on the one hand, the virtual environment image and the virtual object are displayed in the same or closely similar styles; compared with displaying the virtual object directly in the real environment image, this gives the virtual object a stronger sense of immersion and improves the viewing experience. On the other hand, because the virtual environment image is derived from the real environment image, the user can learn about the real environment by viewing the virtual environment image, and can therefore interact with the real environment while viewing the virtual object in the virtual environment image.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a schematic flow chart diagram illustrating a display method in accordance with an exemplary embodiment.
FIG. 2 is a schematic flow chart diagram illustrating another display method in accordance with an exemplary embodiment.
FIG. 3 is a schematic flow chart diagram illustrating yet another display method in accordance with an exemplary embodiment.
FIG. 4 is a schematic flow chart diagram illustrating yet another display method in accordance with an exemplary embodiment.
Fig. 5 is a schematic block diagram illustrating a display device according to an exemplary embodiment.
Fig. 6 is a schematic block diagram illustrating another display device according to an exemplary embodiment.
Fig. 7 is a schematic block diagram illustrating another display apparatus according to an exemplary embodiment.
Fig. 8 is a schematic block diagram illustrating another display device according to an exemplary embodiment.
FIG. 9 is a block diagram illustrating an apparatus for display in accordance with an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Fig. 1 is a schematic flow chart illustrating a display method according to an exemplary embodiment; the method may be used in a helmet device or a terminal device. As shown in fig. 1, the method includes the following steps:
in step S11, a real environment image is acquired.
In an embodiment, the real environment image may be acquired by an acquisition unit of the helmet device or terminal device, such as a camera. The acquired real environment image may be a static image or a dynamic image; correspondingly, the subsequently generated virtual environment image may likewise be static and/or dynamic.
In step S12, the real environment image is processed to generate a virtual environment image.
In one embodiment, the processing of the real-environment image may include parsing the real-environment image, for example, identifying depth information of objects therein, and then performing 3D rendering on the parsed content to obtain the virtual environment image.
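The parse-then-render step can be sketched as follows. This is a minimal illustration assuming NumPy, using simple color quantization as a stand-in for the depth parsing and 3D re-rendering described above; the disclosure does not specify a concrete pipeline.

```python
import numpy as np

def generate_virtual_environment(real_image: np.ndarray) -> np.ndarray:
    """Sketch of step S12: parse the real environment image, then re-render it.

    Both sub-steps here are illustrative assumptions, not the patented method:
    normalization stands in for parsing (e.g. depth estimation), and color
    quantization stands in for 3D re-rendering in a virtual style.
    """
    # "Parse" the image: normalize pixel values to [0, 1].
    parsed = real_image.astype(np.float32) / 255.0

    # Re-render in a virtual style: quantize colors to a few levels,
    # giving a stylized, cartoon-like look.
    levels = 4
    virtual = np.floor(parsed * levels) / levels
    return (virtual * 255).astype(np.uint8)
```

Applied to a camera frame, this yields an image of the same shape and dtype but with a flattened, poster-like palette.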
In step S13, a virtual object is displayed in the virtual environment image.
In one embodiment, the real environment image is processed into a virtual environment image, and the virtual object is displayed within it. On the one hand, the virtual environment image and the virtual object are then displayed in the same or closely similar styles; compared with displaying the virtual object directly in the real environment image, this gives the virtual object a stronger sense of immersion and improves the viewing experience. On the other hand, because the virtual environment image is derived from the real environment image, the user can learn about the real environment by viewing the virtual environment image, and can therefore interact with the real environment while viewing the virtual object in the virtual environment image.
FIG. 2 is a schematic flow chart diagram illustrating another display method in accordance with an exemplary embodiment. As shown in fig. 2, on the basis of the embodiment shown in fig. 1, processing the real environment image to generate a virtual environment image includes:
in step S121, attribute information of the virtual object is determined.
In one embodiment, the virtual object to be displayed may be predetermined, and thus its attribute information may be determined before being displayed in the virtual environment image.
In step S122, the real-environment image is processed according to the attribute information to generate a virtual-environment image matching the attribute information.
In an embodiment, matching between the generated virtual environment image and the attribute information of the virtual object may mean that one attribute of the virtual environment image matches one attribute of the virtual object, or that multiple attributes of the virtual environment image match multiple attributes of the virtual object. Here, "matching" may mean that the attributes are identical, or that they are related according to a preset rule.
In an embodiment, taking the display style (also simply called the style) as an example: if the virtual object has a cartoon-like display style, the generated virtual environment image is given a cartoon-like display style as well; if the virtual object has a realistic display style, the generated virtual environment image is rendered realistically.
In one embodiment, taking color as an example: if the virtual object uses warm tones, the color of the virtual environment image may be set to cool tones according to a preset rule; conversely, if the virtual object uses cool tones, the color of the virtual environment image may be set to warm tones.
According to this embodiment, generating a virtual environment image matched with the attribute information of the virtual object increases the degree of matching between the virtual object viewed by the user and the virtual environment image in which it is displayed, ensuring that the virtual object has a stronger sense of immersion when displayed and that the user has a better viewing experience.
In one embodiment, in step S12 the attribute information of the virtual environment image may be determined first, and the virtual object then adjusted so that the adjusted attribute information of the virtual object matches the attribute information of the virtual environment image. For example, when the virtual environment image is complex and difficult to adjust extensively, adjusting the virtual object instead can improve the degree of matching between the virtual object and the virtual environment image, thereby preserving the viewing experience.
Besides adjusting only the attribute information of the virtual environment image or only that of the virtual object, both may also be adjusted simultaneously, so that the adjusted attribute information of the virtual environment image matches the adjusted attribute information of the virtual object.
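The variant in which the virtual object is adjusted to a fixed, complex environment can be sketched as below. The attribute keys (`display_style`, `size`, `max_object_size`) are illustrative assumptions, not attributes named by the disclosure.

```python
def adjust_object_to_environment(env_attrs: dict, obj_attrs: dict) -> dict:
    """When the virtual environment image is complex, leave it unchanged and
    adjust the virtual object's attributes to match it instead. Attribute
    keys here are hypothetical examples of the 'attribute information'."""
    adjusted = dict(obj_attrs)
    # Take over the environment's display style so both are rendered alike.
    adjusted["display_style"] = env_attrs["display_style"]
    # Clamp the object's size to what the environment can accommodate.
    if "max_object_size" in env_attrs:
        adjusted["size"] = min(obj_attrs["size"], env_attrs["max_object_size"])
    return adjusted
```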
Optionally, the attribute information includes at least one of:
size, color, and display style.
In one embodiment, the attribute information may be set as desired, including but not limited to the above-mentioned size, color, and display style.
FIG. 3 is a schematic flow chart diagram illustrating yet another display method in accordance with an exemplary embodiment. As shown in fig. 3, on the basis of the embodiment shown in fig. 1, the method further includes:
In step S14, it is determined whether a first preset instruction is received; if the first preset instruction is received, step S11 is executed to acquire a real environment image, and if not, step S15 is executed.
in step S15, the virtual object is displayed in a preset virtual environment image.
In one embodiment, when the virtual object needs to be displayed, the user can control whether the device captures a real environment image by inputting a first preset instruction. If the user inputs the first preset instruction, the capture unit is triggered to start, and steps S11, S12, and S13 are then performed in sequence, so that the virtual object is displayed in the virtual environment image generated from the real environment image. If the user does not input the first preset instruction, the virtual object is displayed directly in a preset virtual environment image.
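The branching in fig. 3 can be sketched as a small dispatcher. The callables passed in are hypothetical hooks standing in for the device's capture, processing, and display units; they are not part of the disclosure.

```python
def display_with_first_instruction(first_instruction_received, capture,
                                   process, display, preset_environment):
    """Control flow of fig. 3 (steps S14, S11-S13, S15); hooks are assumed."""
    # Step S14: branch on whether the first preset instruction was received.
    if first_instruction_received:
        real_image = capture()             # step S11: acquire real environment image
        virtual_env = process(real_image)  # step S12: generate virtual environment image
        return display(virtual_env)        # step S13: display virtual object in it
    return display(preset_environment)     # step S15: use preset virtual environment
```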
FIG. 4 is a schematic flow chart diagram illustrating yet another display method in accordance with an exemplary embodiment. As shown in fig. 4, on the basis of the embodiment shown in fig. 1, the method further includes:
In step S16, it is determined whether a second preset instruction is received; if the second preset instruction is received, step S12 is executed to process the real environment image and generate a virtual environment image, and if not, step S17 is executed.
in step S17, the virtual object is displayed in the real environment image.
In one embodiment, when the virtual object needs to be displayed, the user can control whether the device processes the captured environment image by inputting a second preset instruction. If the user inputs the second preset instruction, steps S12 and S13 are performed in sequence, so that the virtual object is displayed in the virtual environment image generated from the real environment image. If the user does not input the second preset instruction, the virtual object is displayed directly in the captured real environment image.
Note that, in addition to being performed between steps S11 and S12 as shown in fig. 4, step S16 may also be performed before step S11.
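The second branch, from fig. 4, differs from the first only in its fallback: without the instruction, the virtual object is shown in the real environment image itself. As before, the callables are hypothetical stand-ins for the device's units.

```python
def display_with_second_instruction(second_instruction_received, real_image,
                                    process, display):
    """Control flow of fig. 4 (steps S16, S12-S13, S17); hooks are assumed."""
    # Step S16: branch on whether the second preset instruction was received.
    if second_instruction_received:
        return display(process(real_image))  # steps S12-S13: virtual environment
    return display(real_image)               # step S17: real environment image
```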
Corresponding to the embodiment of the display method, the disclosure also provides an embodiment of a display device.
Fig. 5 is a schematic block diagram illustrating a display device according to an exemplary embodiment. Referring to fig. 5, the apparatus includes:
an acquisition unit 51 configured to acquire a real environment image;
a processing unit 52 configured to process the real environment image to generate a virtual environment image;
a display unit 53 configured to display a virtual object in the virtual environment image.
Fig. 6 is a schematic block diagram of another display device according to an exemplary embodiment, as shown in fig. 6, and on the basis of the embodiment shown in fig. 5, the processing unit 52 includes:
a determination subunit 521 configured to determine attribute information of the virtual object;
a generating subunit 522, configured to process the real environment image according to the attribute information to generate a virtual environment image matching the attribute information.
Optionally, the attribute information includes at least one of:
size, color, and display style.
Fig. 7 is a schematic block diagram illustrating another display apparatus according to an exemplary embodiment, as shown in fig. 7, and based on the embodiment shown in fig. 5, the display apparatus further includes:
a first determination unit 54 configured to determine whether a first preset instruction is received;
the acquiring unit 51 acquires a real environment image when the first determining unit 54 determines that the first preset instruction is received, and the displaying unit 53 displays the virtual object in a preset virtual environment image when the first determining unit 54 determines that the first preset instruction is not received.
Fig. 8 is a schematic block diagram illustrating another display apparatus according to an exemplary embodiment, as shown in fig. 8, the display apparatus further includes, on the basis of the embodiment shown in fig. 5:
a second determination unit 55 configured to determine whether a second preset instruction is received;
wherein the processing unit 52 processes the real environment image to generate a virtual environment image when the second determining unit 55 determines that the second preset instruction is received, and the display unit 53 displays the virtual object in the real environment image when the second determining unit 55 determines that the second preset instruction is not received.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the disclosed solution. One of ordinary skill in the art can understand and implement it without inventive effort.
Correspondingly, the present disclosure also provides a display device, comprising: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to: acquiring a real environment image; processing the real environment image to generate a virtual environment image; and displaying a virtual object in the virtual environment image.
Accordingly, the present disclosure also provides a terminal comprising a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured for execution by the one or more processors to include instructions for: acquiring a real environment image; processing the real environment image to generate a virtual environment image; and displaying a virtual object in the virtual environment image.
Fig. 9 is a block diagram illustrating an apparatus 900 for displaying according to an example embodiment. For example, the apparatus 900 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 9, apparatus 900 may include one or more of the following components: processing component 902, memory 904, power component 906, multimedia component 908, audio component 910, input/output (I/O) interface 912, sensor component 914, and communication component 916.
The processing component 902 generally controls overall operation of the device 900, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. Processing component 902 may include one or more processors 920 to execute instructions to perform all or a portion of the steps of the methods described above. Further, processing component 902 can include one or more modules that facilitate interaction between processing component 902 and other components. For example, the processing component 902 can include a multimedia module to facilitate interaction between the multimedia component 908 and the processing component 902.
The memory 904 is configured to store various types of data to support operation at the apparatus 900. Examples of such data include instructions for any application or method operating on device 900, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 904 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 906 provides power to the various components of the device 900. The power components 906 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 900.
The multimedia component 908 comprises a screen providing an output interface between the device 900 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 908 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 900 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 910 is configured to output and/or input audio signals. For example, audio component 910 includes a Microphone (MIC) configured to receive external audio signals when apparatus 900 is in an operating mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 904 or transmitted via the communication component 916. In some embodiments, audio component 910 also includes a speaker for outputting audio signals.
I/O interface 912 provides an interface between processing component 902 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 914 includes one or more sensors for providing status assessment of various aspects of the apparatus 900. For example, sensor assembly 914 may detect an open/closed state of device 900, the relative positioning of components, such as a display and keypad of device 900, the change in position of device 900 or a component of device 900, the presence or absence of user contact with device 900, the orientation or acceleration/deceleration of device 900, and the change in temperature of device 900. The sensor assembly 914 may include a proximity sensor configured to detect the presence of a nearby object in the absence of any physical contact. The sensor assembly 914 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 914 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 916 is configured to facilitate communications between the apparatus 900 and other devices in a wired or wireless manner. The apparatus 900 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 916 receives a broadcast signal or broadcast associated information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 916 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 900 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 904 comprising instructions, executable by the processor 920 of the apparatus 900 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
Claims (9)
1. A display method, comprising:
acquiring a real environment image;
processing the real environment image to generate a virtual environment image;
displaying a virtual object in the virtual environment image;
wherein the processing of the real environment image to generate a virtual environment image comprises:
determining attribute information of the virtual object;
processing the real environment image according to the attribute information to generate a virtual environment image matching the attribute information;
wherein, when the virtual environment image is complex and cannot easily be adjusted to a large extent, the attribute information of the virtual environment image may instead be determined first, and the virtual object then adjusted, so that the adjusted attribute information of the virtual object matches the attribute information of the virtual environment image.
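The method of claim 1 — acquire a real environment image, take the virtual object's attribute information, and process the environment so the two match, falling back to adjusting the object when the environment is too complex to alter — can be sketched roughly as follows. This is only an illustrative sketch: the attribute structure, the brightness/tint matching, and all function names are assumptions, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class Attributes:
    """Attribute information as listed in claim 2 (illustrative fields)."""
    size: float            # scale factor of the virtual object
    color: tuple           # dominant RGB color, 0-255 per channel
    display_type: str      # e.g. "2d" or "3d" (assumed values)

def average_color(image):
    """Mean RGB over a nested list of (r, g, b) pixel tuples."""
    pixels = [p for row in image for p in row]
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) // n for c in range(3))

def blend(a, b, t):
    """Linear interpolation between two RGB colors by factor t."""
    return tuple(int(a[c] + (b[c] - a[c]) * t) for c in range(3))

def generate_virtual_environment(real_image, obj_attrs, environment_is_complex=False):
    """Claim 1: process the real environment image according to the
    virtual object's attribute information so that the two match."""
    if not environment_is_complex:
        # Default branch: tint every environment pixel toward the
        # object's dominant color (an assumed, simplistic matching rule).
        return [[blend(p, obj_attrs.color, 0.3) for p in row] for row in real_image]
    # Fallback branch of claim 1: the environment is too complex to adjust,
    # so adapt the virtual object's attributes to the environment instead.
    obj_attrs.color = average_color(real_image)
    return real_image
```

For example, with a uniformly gray real image and a red virtual object, the default branch tints the environment toward red, while the complex-environment branch leaves the image untouched and recolors the object to gray instead.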
2. The display method according to claim 1, wherein the attribute information includes at least one of:
a size, a color, or a display type.
3. The display method according to claim 1 or 2, further comprising:
determining whether a first preset instruction is received;
if the first preset instruction is received, acquiring a real environment image;
and if the first preset instruction is not received, displaying the virtual object in a preset virtual environment image.
4. The display method according to claim 1 or 2, further comprising:
determining whether a second preset instruction is received;
if the second preset instruction is received, processing the real environment image to generate a virtual environment image;
and if the second preset instruction is not received, displaying the virtual object in the image of the real environment.
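Claims 3 and 4 gate the pipeline on two preset instructions: the first decides whether a real environment image is acquired at all, and the second decides whether that image is converted into a virtual environment image. A minimal dispatch sketch, with the boolean parameters and string placeholders standing in for real instruction handling and rendering (all assumptions):

```python
def display(first_instruction_received, second_instruction_received):
    """Claims 3-4: choose the backdrop in which the virtual object is shown.

    Returns a label describing the chosen display mode; actual image
    acquisition and rendering are out of scope for this sketch.
    """
    if not first_instruction_received:
        # Claim 3: no first preset instruction -> show the object in a
        # preset virtual environment image instead of acquiring one.
        return "virtual object in preset virtual environment image"
    # First instruction received: a real environment image is acquired.
    if second_instruction_received:
        # Claim 4: second instruction -> process the real image into a
        # generated virtual environment image.
        return "virtual object in generated virtual environment image"
    # No second instruction: display the object over the real image
    # directly (an augmented-reality-style presentation).
    return "virtual object in real environment image"
```

The two flags thus select among three display modes: fully preset, generated from reality, or overlaid on reality.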
5. A display device, comprising:
an acquisition unit configured to acquire a real environment image;
a processing unit configured to process the real environment image to generate a virtual environment image;
a display unit configured to display a virtual object in the virtual environment image;
wherein the processing unit comprises:
a determination subunit configured to determine attribute information of the virtual object;
a generation subunit configured to process the real environment image according to the attribute information to generate a virtual environment image matching the attribute information;
wherein, when the virtual environment image is complex and cannot easily be adjusted to a large extent, the attribute information of the virtual environment image may instead be determined first, and the virtual object then adjusted, so that the adjusted attribute information of the virtual object matches the attribute information of the virtual environment image.
6. The display device according to claim 5, wherein the attribute information includes at least one of:
a size, a color, or a display type.
7. The display device according to claim 5 or 6, further comprising:
a first determination unit configured to determine whether a first preset instruction is received;
the acquisition unit acquires a real environment image when the first determination unit determines that the first preset instruction is received, and the display unit displays the virtual object in a preset virtual environment image when the first determination unit determines that the first preset instruction is not received.
8. The display device according to claim 5 or 6, further comprising:
a second determination unit configured to determine whether a second preset instruction is received;
the processing unit processes the real environment image to generate a virtual environment image when the second determining unit determines that the second preset instruction is received, and the display unit displays the virtual object in the real environment image when the second determining unit determines that the second preset instruction is not received.
9. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring a real environment image;
processing the real environment image to generate a virtual environment image;
displaying a virtual object in the virtual environment image;
wherein processing the real environment image to generate the virtual environment image comprises:
determining attribute information of the virtual object;
processing the real environment image according to the attribute information to generate a virtual environment image matching the attribute information;
wherein, when the virtual environment image is complex and cannot easily be adjusted to a large extent, the attribute information of the virtual environment image may instead be determined first, and the virtual object then adjusted, so that the adjusted attribute information of the virtual object matches the attribute information of the virtual environment image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610982557.9A CN106598217B (en) | 2016-11-08 | 2016-11-08 | Display method, display device and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106598217A CN106598217A (en) | 2017-04-26 |
CN106598217B true CN106598217B (en) | 2020-06-19 |
Family
ID=58590766
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610982557.9A Active CN106598217B (en) | 2016-11-08 | 2016-11-08 | Display method, display device and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106598217B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110060354B (en) * | 2019-04-19 | 2023-08-04 | 苏州梦想人软件科技有限公司 | Positioning and interaction method of real image in virtual space |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101872241A (en) * | 2009-04-26 | 2010-10-27 | 艾利维公司 | Set up the method and system of the network game communal space |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130063560A1 (en) * | 2011-09-12 | 2013-03-14 | Palo Alto Research Center Incorporated | Combined stereo camera and stereo display interaction |
2016-11-08: application CN201610982557.9A filed in CN; granted as patent CN106598217B (status: Active)
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |