CN110673738A - Interaction method and electronic equipment


Info

Publication number
CN110673738A
Authority
CN
China
Prior art keywords: virtual, mode, electronic device, display, screen
Prior art date
Legal status
Granted
Application number
CN201910947571.9A
Other languages
Chinese (zh)
Other versions
CN110673738B (en)
Inventor
李洪伟
苏立军
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201910947571.9A
Publication of CN110673738A
Application granted
Publication of CN110673738B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/01: Indexing scheme relating to G06F3/01
    • G06F 2203/012: Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides an interaction method applied to an electronic device. The method includes receiving a first interaction operation of a user while the electronic device is operating in a first virtual mode, and switching the electronic device from the first virtual mode to a second virtual mode in response to the first interaction operation. If the first virtual mode is a screen display mode, the second virtual mode is a virtual space mode; or, if the first virtual mode is the virtual space mode, the second virtual mode is the screen display mode. The present disclosure also provides an electronic device.

Description

Interaction method and electronic equipment
Technical Field
The present disclosure relates to an interaction method and an electronic device.
Background
Augmented reality (AR) and virtual reality (VR) devices create a new device experience by synthesizing images. The basic working principle of an augmented reality or virtual reality device is that sensors in the head-mounted display sense the user's pose, the spatial state of the output content is combined with that pose, and the user then perceives, through the optical system of the head-mounted display, virtual objects located in the space within the field of view. In this way a new interactive experience can be achieved, or the user can view a virtual picture at a relatively fixed position in space.
Disclosure of Invention
The present disclosure provides an interaction method for switching between a virtual space mode and a screen display mode, which can be applied to augmented reality or virtual reality devices.
One aspect of the present disclosure provides an interaction method applied to an electronic device. The method includes receiving a first interaction operation of a user while the electronic device is operating in a first virtual mode, and switching the electronic device from the first virtual mode to a second virtual mode in response to the first interaction operation. If the first virtual mode is a screen display mode, the second virtual mode is a virtual space mode; or if the first virtual mode is the virtual space mode, the second virtual mode is the screen display mode.
Optionally, if the first virtual mode is the virtual space mode, switching the electronic device from the first virtual mode to the second virtual mode in response to the first interactive operation includes switching the electronic device from the virtual space mode to the screen display mode in response to the first interactive operation when the electronic device is communicatively connected with a second electronic device.
Optionally, in the screen display mode, at least one virtual display screen is presented to the user, and the display content of the at least one virtual display screen is controlled based on the data transmitted by the second electronic device.
Optionally, the method further comprises: in response to the electronic device being switched to the screen display mode, acquiring attribute information of the at least one virtual display screen, and presenting the at least one virtual display screen to the user based on the attribute information.
Optionally, the attribute information includes at least one of: resolution, tilt angle, display arc, spacing between adjacent virtual display screens, or relative position information of the at least one virtual display screen in space with respect to the user.
Optionally, the method further comprises: in response to the electronic device being switched to the screen display mode, acquiring driving information of the at least one virtual display screen; and transmitting the driving information to the second electronic device so as to load, in the second electronic device, a virtual display driver of the at least one virtual display screen, wherein the virtual display driver enables the second electronic device to control the virtual display screen.
Optionally, the method further includes controlling display content of the at least one virtual display screen based on a second interactive operation of the user on the display interface of the at least one virtual display screen.
In another aspect of the present disclosure, an electronic device is provided. The electronic device comprises an interaction module and a switching module. The interaction module is configured to receive a first interaction operation of a user while the electronic device is operating in a first virtual mode. The switching module is configured to switch the electronic device from the first virtual mode to a second virtual mode in response to the first interaction operation. If the first virtual mode is a screen display mode, the second virtual mode is a virtual space mode; or, if the first virtual mode is the virtual space mode, the second virtual mode is the screen display mode.
Optionally, the switching module is specifically configured to, if the first virtual mode is the virtual space mode, switch the electronic device from the virtual space mode to the screen display mode in response to the first interactive operation when the electronic device is in communication connection with a second electronic device.
Optionally, in the screen display mode, at least one virtual display screen is presented to the user, and the display content of the at least one virtual display screen is controlled based on the data transmitted by the second electronic device.
Optionally, the electronic device further comprises a virtual screen presentation module. The virtual screen presentation module is configured to, in response to the electronic device being switched to the screen display mode, acquire the attribute information of the at least one virtual display screen and present the at least one virtual display screen to the user based on the attribute information.
Optionally, the attribute information includes at least one of: resolution, tilt angle, display arc, spacing between adjacent virtual display screens, or relative position information of the at least one virtual display screen in space with respect to the user.
Optionally, the electronic device further includes a virtual display driver loading module. The virtual display driver loading module is configured to, in response to the electronic device being switched to the screen display mode, acquire the driving information of the at least one virtual display screen, and transmit the driving information to the second electronic device so as to load, in the second electronic device, a virtual display driver of the at least one virtual display screen, wherein the virtual display driver enables the second electronic device to control the virtual display screen.
Optionally, the interaction module is further configured to control display content of the at least one virtual display screen based on a second interaction operation of the user on the display interface of the at least one virtual display screen.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions for implementing the method as described above when executed.
Another aspect of the disclosure provides a computer program comprising computer executable instructions for implementing the method as described above when executed.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
fig. 1 schematically shows an application scenario of an interaction method and an electronic device according to an embodiment of the present disclosure;
FIG. 2 schematically shows a flow chart of an interaction method according to an embodiment of the present disclosure;
FIG. 3 schematically shows a flow chart of an interaction method according to another embodiment of the present disclosure;
FIG. 4 schematically shows a flow chart of an interaction method according to yet another embodiment of the present disclosure;
FIG. 5 schematically shows a flow chart of an interaction method according to a further embodiment of the present disclosure;
FIG. 6 schematically shows a flow chart of an interaction method according to a further embodiment of the present disclosure;
FIG. 7 schematically shows a block diagram of an electronic device according to an embodiment of the disclosure; and
fig. 8 schematically shows a block diagram of a computer system suitable for implementing an interaction method according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). Where a convention analogous to "at least one of A, B or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.).
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks. The techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable storage medium having instructions stored thereon for use by or in connection with an instruction execution system.
At present, some virtual reality or augmented reality interactive applications can output a virtual display screen in space through an augmented reality or virtual reality device, thereby extending an ordinary display. However, a user's needs when using a virtual reality or augmented reality device may be diverse. For example, in some situations the user may want the augmented reality or virtual reality device to present a virtual display screen that enlarges or enriches the content of an ordinary physical device; in other situations the user may simply wish the device to present a virtual space.
In view of this, the embodiments of the present disclosure provide an interaction method for switching between a virtual space mode and a screen display mode, and a corresponding electronic device, which can meet the diversified requirements of a user.
The embodiment of the disclosure provides an interaction method and electronic equipment. The method includes receiving a first interaction operation of a user while the electronic device is operating in a first virtual mode, and switching the electronic device from the first virtual mode to a second virtual mode in response to the first interaction operation. If the first virtual mode is a screen display mode, the second virtual mode is a virtual space mode; or if the first virtual mode is a virtual space mode, the second virtual mode is a screen display mode.
Fig. 1 schematically shows an application scenario of an interaction method and an electronic device according to an embodiment of the present disclosure.
As shown in fig. 1, the application scenario may include an electronic device 101. The electronic device 101, illustrated as AR glasses in fig. 1, is only one example. According to an embodiment of the present disclosure, the electronic device 101 may also be a VR headset or the like.
The electronic device 101 may operate in a screen display mode or a virtual space mode.
Fig. 1 illustrates the effect of the electronic device 101 operating in the screen display mode. In this mode, the electronic device 101 may present a plurality of virtual display screens 11 to the user and display data through them, so that the user can obtain information from the display content of the plurality of virtual display screens 11. For example, through the optical system of the AR glasses the user visually perceives that the plurality of virtual display screens 11 exist in space in a specific arrangement, for example three virtual display screens 11 joined side by side in an approximately arc-shaped layout.
In the virtual space mode, the electronic device 101 may present a virtual space to the user and render virtual objects in the virtual space. For example, virtual objects are rendered throughout the space around the user, and the user may interact with them.
According to an embodiment of the present disclosure, the electronic device 101 may switch between the screen display mode and the virtual space mode based on an interactive operation by a user.
According to an embodiment of the present disclosure, the application scenario may further include a second electronic device 102. When the electronic device 101 is communicatively connected with the second electronic device 102 in the screen display mode, the display data in the plurality of virtual display screens 11 can be controlled based on the output information of the second electronic device 102. For example, the plurality of virtual display screens 11 may serve as external display screens of the second electronic device 102, thereby extending its display. The screen display mode can be convenient for users in many situations. For example, financial practitioners commonly need multiple display screens to view different reports; a user may then carry only one electronic device 101 instead of multiple display screens, and when multi-screen display is required, a plurality of virtual display screens 11 can be presented by the electronic device 101 running in the screen display mode. Alternatively, when the display of the second electronic device 102 needs to be enlarged, the user can enlarge the display content of the second electronic device 102 onto a virtual display screen 11 presented by the electronic device 101. Moreover, the style (e.g., flat or curved), size, and spatial arrangement of the at least one virtual display screen 11 can be set by the user, which provides great convenience.
The second electronic device 102 illustrated in the schematic of fig. 1 as a laptop computer is merely one example. In other embodiments, the second electronic device 102 may also be a cell phone, tablet (e.g., iPad), desktop computer, wearable device, medical device, or the like.
It should be noted that fig. 1 is only an example of a scenario in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but does not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments or scenarios.
Fig. 2 schematically shows a flow chart of an interaction method according to an embodiment of the present disclosure.
As shown in fig. 2, the interaction method may include operations S201 and S202 according to an embodiment of the present disclosure.
In operation S201, a first interactive operation of a user is received while the electronic device 101 is operating in a first virtual mode. The first interactive operation may be, for example, a predefined interactive gesture, and the user's gesture may be captured by, for example, a gesture capture sensing system in the electronic device 101. Alternatively, the first interactive operation may be, for example, pressing a specific button on the electronic device 101.
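By way of illustration only, the following Python sketch shows one way the received event could be classified as the first interactive operation, whether it originates from a gesture capture sensing system or from a specific button; the event structure and the labels "switch_mode" and "mode_switch_button" are assumptions for the example, not part of the disclosure.

```python
# Hypothetical event records: a gesture recognized by a gesture capture sensing
# system, or a press of a specific button on the electronic device 101.
GESTURE_EVENT = {"type": "gesture", "label": "switch_mode"}
BUTTON_EVENT = {"type": "button", "id": "mode_switch_button"}


def is_first_interactive_operation(event: dict,
                                   gesture_label: str = "switch_mode",
                                   button_id: str = "mode_switch_button") -> bool:
    """Return True if the captured event counts as the first interactive
    operation that should trigger a mode switch (operation S201)."""
    if event.get("type") == "gesture":
        return event.get("label") == gesture_label
    if event.get("type") == "button":
        return event.get("id") == button_id
    return False


if __name__ == "__main__":
    print(is_first_interactive_operation(GESTURE_EVENT))                          # True
    print(is_first_interactive_operation(BUTTON_EVENT))                           # True
    print(is_first_interactive_operation({"type": "gesture", "label": "pinch"}))  # False
```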
In operation S202, the electronic device 101 is switched from the first virtual mode to the second virtual mode in response to the first interactive operation.
If the first virtual mode is a screen display mode, the second virtual mode is a virtual space mode; or if the first virtual mode is a virtual space mode, the second virtual mode is a screen display mode.
In the screen display mode, a plurality of virtual display screens 11 are presented to the user to display data; in the virtual space mode, a virtual space is presented to the user and virtual objects are rendered in it.
According to the embodiment of the disclosure, the electronic device 101 can be switched between the virtual space mode and the screen display mode based on user operation, so that various interactive experiences and use choices are provided for a user, and diversified requirements of the user can be met.
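Operations S201 and S202 can be summarized by the following minimal Python sketch (an illustrative model only, not an implementation prescribed by the disclosure): the two virtual modes are represented as an enumeration, and each first interactive operation toggles the device to the other mode.

```python
from enum import Enum, auto


class VirtualMode(Enum):
    """The two virtual modes between which operation S202 switches."""
    SCREEN_DISPLAY = auto()  # at least one virtual display screen is presented
    VIRTUAL_SPACE = auto()   # a surrounding virtual space with virtual objects


class InteractionController:
    """Toggles the electronic device between the two virtual modes whenever a
    first interactive operation (gesture or button press) is received."""

    def __init__(self, initial_mode: VirtualMode = VirtualMode.VIRTUAL_SPACE):
        self.mode = initial_mode

    def on_first_interaction(self) -> VirtualMode:
        # S201: the first interactive operation has been received while running
        # in the first virtual mode; S202: switch to the other virtual mode.
        if self.mode is VirtualMode.SCREEN_DISPLAY:
            self.mode = VirtualMode.VIRTUAL_SPACE
        else:
            self.mode = VirtualMode.SCREEN_DISPLAY
        return self.mode


if __name__ == "__main__":
    controller = InteractionController(VirtualMode.SCREEN_DISPLAY)
    print(controller.on_first_interaction())  # VirtualMode.VIRTUAL_SPACE
    print(controller.on_first_interaction())  # VirtualMode.SCREEN_DISPLAY
```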
Fig. 3 schematically shows a flow chart of an interaction method according to another embodiment of the present disclosure.
As shown in fig. 3, according to an embodiment of the present disclosure, if the first virtual mode is the virtual space mode, the interaction method may include operation S301 and operation S302.
In operation S301, a first interactive operation of a user is received while the electronic device 101 is operating in the virtual space mode.
In operation S302, in a case where there is a communication connection between the electronic device 101 and the second electronic device 102, the electronic device 101 is switched from the virtual space mode to the screen display mode in response to the first interactive operation.
For example, after receiving the first interactive operation of the user, the electronic device 101 may check or detect whether a second electronic device 102 is communicatively connected to it. If such a connection exists, the electronic device 101 is switched from the virtual space mode to the screen display mode. If not, the first interactive operation may be an accidental operation by the user, and the mode switching is not performed.
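A minimal sketch of this check is given below (illustrative Python only; the names SecondDevice and handle_first_interaction are assumptions): the switch from the virtual space mode to the screen display mode is performed only when a second electronic device is communicatively connected, otherwise the current mode is kept.

```python
from enum import Enum, auto
from typing import Optional


class VirtualMode(Enum):
    SCREEN_DISPLAY = auto()
    VIRTUAL_SPACE = auto()


class SecondDevice:
    """Stand-in for a second electronic device 102 that may be connected."""
    def __init__(self, connected: bool):
        self.connected = connected


def handle_first_interaction(current_mode: VirtualMode,
                             second_device: Optional[SecondDevice]) -> VirtualMode:
    """S301/S302: switch from the virtual space mode to the screen display mode
    only if a second electronic device is communicatively connected; otherwise
    treat the operation as accidental and keep the current mode."""
    if current_mode is VirtualMode.VIRTUAL_SPACE:
        if second_device is not None and second_device.connected:
            return VirtualMode.SCREEN_DISPLAY
        return current_mode               # no connected device: no mode switch
    return VirtualMode.VIRTUAL_SPACE      # from screen display mode, switch back


if __name__ == "__main__":
    print(handle_first_interaction(VirtualMode.VIRTUAL_SPACE, None))                # stays VIRTUAL_SPACE
    print(handle_first_interaction(VirtualMode.VIRTUAL_SPACE, SecondDevice(True)))  # SCREEN_DISPLAY
```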
According to the embodiment of the present disclosure, when the electronic device 101 is in communication connection with the second electronic device 102, in the screen display mode, the at least one virtual display screen 11 is presented to the user, and the display content of the at least one virtual display screen 11 is controlled based on the data transmitted to the electronic device 101 by the second electronic device 102. For example, the output content of the second electronic device 102 is displayed through the at least one virtual display screen 11, so that the at least one virtual display screen 11 can be used as an external display screen of the second electronic device 102.
Fig. 4 schematically shows a flow chart of an interaction method according to yet another embodiment of the present disclosure.
As shown in fig. 4, the interaction method may include operation S301, operation S302, and operation S403 and operation S404, according to an embodiment of the present disclosure. Operations S301 and S302 may refer to the related description in fig. 3.
In operation S403, in response to the electronic device 101 switching to the screen display mode, attribute information of at least one virtual display screen 11 is acquired.
In operation S404, at least one virtual display screen 11 is presented to the user based on the above-described attribute information.
According to an embodiment of the present disclosure, the attribute information includes at least one of: resolution, tilt angle, display arc, spacing between adjacent virtual display screens 11, or relative position information of at least one virtual display screen 11 in space with respect to a user.
For example, in the screen display mode, the electronic device 101 may provide a setting interface for the user to select the number of virtual display screens 11 and to set their resolution, tilt angle, display arc, spacing between adjacent display screens, and/or the range of relative distance between their positions in space and the user. In this way, based on the user's pose information and the attribute information, the electronic device 101 can present the at least one virtual display screen 11 to the user at a relatively fixed position in space, in the arrangement set by the user.
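The following sketch illustrates, under assumed geometry and example default values, how such attribute information together with the user's facing direction could determine the placement of the virtual display screens 11 on an approximate arc; it is a simplified model for illustration, not a layout prescribed by the disclosure.

```python
import math
from dataclasses import dataclass


@dataclass
class ScreenAttributes:
    """Attribute information set through the settings interface. All default
    values are example assumptions, not values prescribed by the disclosure."""
    count: int = 3                        # number of virtual display screens
    width_m: float = 0.6                  # width of each screen, in metres
    resolution: tuple = (1920, 1080)      # pixel resolution of each screen
    tilt_deg: float = 10.0                # tilt angle of each screen
    spacing_m: float = 0.05               # spacing between adjacent screens
    distance_m: float = 1.2               # relative distance from the user


def place_screens(attrs: ScreenAttributes, user_yaw_deg: float = 0.0):
    """Lay the screens out on an approximate arc in front of the user, centred
    on the user's current facing direction (pose information)."""
    # Angle subtended by one screen plus one gap, as seen from the user.
    step_rad = (attrs.width_m + attrs.spacing_m) / attrs.distance_m
    placements = []
    for i in range(attrs.count):
        offset = (i - (attrs.count - 1) / 2) * step_rad   # offset from the centre screen
        yaw = math.radians(user_yaw_deg) + offset
        placements.append({
            "index": i,
            "position_m": (attrs.distance_m * math.sin(yaw), 0.0,
                           attrs.distance_m * math.cos(yaw)),
            "yaw_deg": math.degrees(yaw),
            "tilt_deg": attrs.tilt_deg,
            "resolution": attrs.resolution,
        })
    return placements


if __name__ == "__main__":
    for placement in place_screens(ScreenAttributes()):
        print(placement)
```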
Fig. 5 schematically shows a flow chart of an interaction method according to a further embodiment of the present disclosure.
As shown in fig. 5, according to an embodiment of the present disclosure, the interaction method may include operation S301, operation S302, operation S503, and operation S504. Operations S301 and S302 may refer to the related description in fig. 3.
In operation S503, in response to the electronic device 101 switching to the screen display mode, driving information of at least one virtual display screen 11 is acquired.
In operation S504, driving information is transmitted to the second electronic device 102 to load a virtual display driver of at least one virtual display screen 11 in the second electronic device 102, wherein the virtual display driver is used to enable the second electronic device 102 to control the virtual display screen 11.
According to an embodiment of the present disclosure, while rendering and outputting the at least one virtual display screen 11, the electronic device 101 may transmit driving information of the at least one virtual display screen 11 to the second electronic device 102 to cause the second electronic device 102 to load a virtual display driver of the at least one virtual display screen 11. In this way, the at least one virtual display screen 11 can be used by the second electronic device 102 just like a physical display screen, and the second electronic device 102 can output display content to the at least one virtual display screen 11 through the virtual display driver.
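A simplified sketch of this exchange is shown below: the electronic device 101 serializes driving information for each virtual display screen, and a stand-in for the second electronic device 102 "loads the virtual display driver" by registering one virtual output per descriptor. The message fields and class names are assumptions for illustration; a real implementation would rely on the display-driver mechanisms of the second device's operating system.

```python
import json


def build_driving_info(screens):
    """Serialize the driving information that the second electronic device needs
    in order to load a virtual display driver (field names are illustrative)."""
    return json.dumps({
        "virtual_screens": [
            {"id": s["index"], "resolution": list(s["resolution"]), "refresh_hz": 60}
            for s in screens
        ]
    })


class SecondDeviceStub:
    """Stand-in for the second electronic device 102: 'loading the virtual
    display driver' is modelled as registering one output per descriptor."""
    def __init__(self):
        self.virtual_outputs = {}

    def load_virtual_display_driver(self, driving_info_json: str):
        info = json.loads(driving_info_json)
        for screen in info["virtual_screens"]:
            self.virtual_outputs[screen["id"]] = screen
        return sorted(self.virtual_outputs)

    def render_frame(self, screen_id: int, content: str) -> str:
        # With the driver loaded, the second device addresses the virtual screen
        # just as it would a physical external display.
        assert screen_id in self.virtual_outputs, "virtual display driver not loaded"
        return f"screen {screen_id}: {content}"


if __name__ == "__main__":
    screens = [{"index": 0, "resolution": (1920, 1080)},
               {"index": 1, "resolution": (1920, 1080)}]
    device = SecondDeviceStub()
    print(device.load_virtual_display_driver(build_driving_info(screens)))  # [0, 1]
    print(device.render_frame(0, "quarterly report"))
```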
Fig. 6 schematically shows a flow chart of an interaction method according to a further embodiment of the present disclosure.
As shown in fig. 6, the interaction method may include operation S301, operation S302, and operation S603 according to an embodiment of the present disclosure. Operations S301 and S302 may refer to the related description in fig. 3.
In operation S603, display content of the at least one virtual display screen 11 is controlled based on a second interactive operation of the user on the display interface of the at least one virtual display screen 11.
In the screen display mode, the user can perform interactive operations using the at least one virtual display screen 11 as an interactive interface. The interactive operation may be touching the at least one virtual display screen 11 with a gesture, or operating the display interface of the virtual display screen 11 with a virtual cursor, in a manner similar to clicking a screen with a mouse.
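The sketch below shows one assumed way a gesture touch point (or virtual cursor position) on a virtual display screen 11 could be mapped to pixel coordinates of its display interface, so that the event can be forwarded to the second electronic device 102 like a mouse click; the geometry and names are illustrative only.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class VirtualScreen:
    """Geometry of one virtual display screen 11: a rectangle in a plane in
    front of the user plus its pixel resolution (values are illustrative)."""
    origin: Tuple[float, float]      # top-left corner, in metres
    size_m: Tuple[float, float]      # (width, height), in metres
    resolution: Tuple[int, int]      # (width, height), in pixels


def map_touch_to_pixel(screen: VirtualScreen,
                       touch_point: Tuple[float, float]) -> Optional[tuple]:
    """Map the point touched by the user's gesture (or by the virtual cursor)
    onto pixel coordinates of the screen's display interface, so the event can
    be forwarded to the second electronic device like a mouse click."""
    dx = touch_point[0] - screen.origin[0]
    dy = touch_point[1] - screen.origin[1]
    if not (0 <= dx <= screen.size_m[0] and 0 <= dy <= screen.size_m[1]):
        return None  # the gesture did not land on this screen's display interface
    px = int(dx / screen.size_m[0] * screen.resolution[0])
    py = int(dy / screen.size_m[1] * screen.resolution[1])
    return ("click", px, py)


if __name__ == "__main__":
    screen = VirtualScreen(origin=(0.0, 0.0), size_m=(0.6, 0.34), resolution=(1920, 1080))
    print(map_touch_to_pixel(screen, (0.3, 0.17)))  # roughly the screen centre
    print(map_touch_to_pixel(screen, (1.0, 0.17)))  # misses the screen -> None
```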
Fig. 7 schematically shows a block diagram of an electronic device 700 according to an embodiment of the disclosure.
As shown in fig. 7, an electronic device 700 may include an interaction module 710, and a switching module 720, according to embodiments of the present disclosure. The electronic device 700 may be used to perform the methods described with reference to fig. 2-6. Electronic device 700 may be a particular embodiment of electronic device 101.
The interaction module 710 may perform, for example, operation S201 or operation S301, and is used to receive a first interaction operation of a user while the electronic device is operating in the first virtual mode. The switching module 720 may perform, for example, operation S202 or operation S302, and is used to switch the electronic device from the first virtual mode to the second virtual mode in response to the first interactive operation. If the first virtual mode is a screen display mode, the second virtual mode is a virtual space mode; or if the first virtual mode is a virtual space mode, the second virtual mode is a screen display mode.
According to an embodiment of the present disclosure, the interaction module 710 may further perform operation S603, for example, to control display contents of the at least one virtual display screen based on a second interaction operation of the user on the display interface of the at least one virtual display screen.
According to an embodiment of the present disclosure, the switching module 720 may perform operation S302: if the first virtual mode is the virtual space mode, switching the electronic device from the virtual space mode to the screen display mode in response to the first interaction operation when the electronic device is communicatively connected with a second electronic device. According to one embodiment of the present disclosure, in the screen display mode, at least one virtual display screen is presented to the user, and the display content of the at least one virtual display screen is controlled based on data transmitted by the second electronic device.
According to an embodiment of the present disclosure, the electronic device 700 may further include a virtual screen presentation module 730. The virtual screen presentation module 730 may perform, for example, operations S403 and S404, and is used to acquire attribute information of at least one virtual display screen in response to the electronic device being switched to the screen display mode, and to present the at least one virtual display screen to the user based on the attribute information. According to one embodiment of the present disclosure, the attribute information includes at least one of: resolution, tilt angle, display arc, spacing between adjacent virtual display screens, or relative position information of at least one virtual display screen in space with respect to the user.
According to one embodiment of the present disclosure, the electronic device 700 may further include a virtual display driver loading module 740. The virtual display driver loading module 740 may, for example, perform operations S503 and S504, configured to, in response to the electronic device being switched to the screen display mode, obtain driving information of at least one virtual display screen; and transmitting the driving information to the second electronic device to load a virtual display driver of at least one virtual display screen in the second electronic device, wherein the virtual display driver is used for enabling the second electronic device to control the virtual display screen.
Any number of modules, sub-modules, units, sub-units, or at least part of the functionality of any number thereof according to embodiments of the present disclosure may be implemented in one module. Any one or more of the modules, sub-modules, units, and sub-units according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules, sub-modules, units, sub-units according to embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging a circuit, or in any one of or a suitable combination of software, hardware, and firmware implementations. Alternatively, one or more of the modules, sub-modules, units, sub-units according to embodiments of the disclosure may be at least partially implemented as a computer program module, which when executed may perform the corresponding functions.
For example, any of the interaction module 710, the switching module 720, the virtual screen presentation module 730, and the virtual display driver loading module 740 may be combined into one module to be implemented, or any one of them may be split into a plurality of modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of the other modules and implemented in one module. According to an embodiment of the present disclosure, at least one of the interaction module 710, the switching module 720, the virtual screen presentation module 730, and the virtual display driver loading module 740 may be implemented at least partially as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or in any one of three implementations of software, hardware, and firmware, or in any suitable combination of any of them. Alternatively, at least one of the interaction module 710, the switching module 720, the virtual screen presentation module 730, and the virtual display driver loading module 740 may be at least partially implemented as a computer program module that, when executed, may perform a corresponding function.
Fig. 8 schematically shows a block diagram of a computer system 800 suitable for implementing an interaction method according to an embodiment of the present disclosure. The computer system illustrated in FIG. 8 is only one example and should not impose any limitations on the scope of use or functionality of embodiments of the disclosure.
As shown in fig. 8, computer system 800 includes a processor 810, and a computer-readable storage medium 820. The computer system 800 may perform a method according to an embodiment of the disclosure. The computer system 800 may be disposed in the electronic device 101.
In particular, processor 810 may include, for example, a general purpose microprocessor, an instruction set processor and/or related chip set and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 810 may also include on-board memory for caching purposes. Processor 810 may be a single processing unit or a plurality of processing units for performing different actions of a method flow according to embodiments of the disclosure.
Computer-readable storage medium 820, for example, may be a non-volatile computer-readable storage medium, specific examples including, but not limited to: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and so on.
The computer-readable storage medium 820 may include a computer program 821, which computer program 821 may include code/computer-executable instructions that, when executed by the processor 810, cause the processor 810 to perform a method according to an embodiment of the present disclosure, or any variation thereof.
The computer program 821 may be configured with, for example, computer program code comprising computer program modules. For example, in an example embodiment, the code in computer program 821 may include one or more program modules, including, for example, module 821A, module 821B, and so on. It should be noted that the division and number of modules are not fixed; those skilled in the art may use suitable program modules or program module combinations according to the actual situation, and when these program modules are executed by the processor 810, the processor 810 may perform the method according to the embodiment of the present disclosure or any variation thereof.
According to an embodiment of the present disclosure, at least one of the interaction module 710, the switching module 720, the virtual screen presentation module 730, and the virtual display driver loading module 740 may be implemented as a computer program module described with reference to fig. 8, which, when executed by the processor 810, may implement the corresponding operations described above.
The present disclosure also provides a computer-readable storage medium, which may be contained in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the disclosure.
According to embodiments of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, for example but is not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that the features recited in the various embodiments and/or claims of the present disclosure can be combined in various ways without conflict, even if such combinations are not expressly recited in the present disclosure. In particular, the features recited in the various embodiments and/or claims of the present disclosure may be combined without departing from the spirit or teaching of the present disclosure. All such combinations are within the scope of the present disclosure.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined not only by the appended claims, but also by equivalents thereof.

Claims (10)

1. An interaction method applied to an electronic device, the method comprising:
receiving a first interactive operation of a user while the electronic device is operating in a first virtual mode; and
switching the electronic device from the first virtual mode to a second virtual mode in response to the first interactive operation;
wherein:
if the first virtual mode is a screen display mode, the second virtual mode is a virtual space mode; or
if the first virtual mode is the virtual space mode, the second virtual mode is the screen display mode.
2. The method of claim 1, wherein, if the first virtual mode is the virtual space mode, the switching the electronic device from the first virtual mode to a second virtual mode in response to the first interactive operation comprises:
switching the electronic device from the virtual space mode to the screen display mode in response to the first interactive operation when the electronic device is communicatively connected with a second electronic device.
3. The method of claim 2, wherein:
in the screen display mode, at least one virtual display screen is presented to the user, and the display content of the at least one virtual display screen is controlled based on the data transmitted by the second electronic device.
4. The method of claim 3, wherein the method further comprises:
in response to the electronic device being switched to the screen display mode, acquiring attribute information of the at least one virtual display screen; and
presenting the at least one virtual display screen to the user based on the attribute information.
5. The method of claim 3, wherein the attribute information comprises at least one of:
resolution, tilt angle, display arc, spacing between adjacent virtual display screens, or relative position information of the at least one virtual display screen in space with respect to the user.
6. The method of claim 3, wherein the method further comprises:
in response to the electronic device being switched to the screen display mode, acquiring driving information of the at least one virtual display screen; and
transmitting the driving information to the second electronic device to load a virtual display driver of the at least one virtual display screen in the second electronic device, wherein the virtual display driver is used for enabling the second electronic device to control the virtual display screen.
7. The method of claim 3, wherein the method further comprises:
controlling the display content of the at least one virtual display screen based on a second interactive operation of the user on the display interface of the at least one virtual display screen.
8. An electronic device, comprising:
an interaction module, configured to receive a first interaction operation of a user while the electronic device is operating in a first virtual mode; and
a switching module, configured to switch the electronic device from the first virtual mode to a second virtual mode in response to the first interaction operation;
wherein:
if the first virtual mode is a screen display mode, the second virtual mode is a virtual space mode; or
if the first virtual mode is the virtual space mode, the second virtual mode is the screen display mode.
9. The electronic device of claim 8, wherein the switching module is specifically configured to:
if the first virtual mode is the virtual space mode, switch the electronic device from the virtual space mode to the screen display mode in response to the first interaction operation when the electronic device is communicatively connected with a second electronic device.
10. The electronic device of claim 9, wherein,
in the screen display mode, at least one virtual display screen is presented to the user, and the display content of the at least one virtual display screen is controlled based on the data transmitted by the second electronic device.
CN201910947571.9A 2019-09-29 2019-09-29 Interaction method and electronic equipment Active CN110673738B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910947571.9A CN110673738B (en) 2019-09-29 2019-09-29 Interaction method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910947571.9A CN110673738B (en) 2019-09-29 2019-09-29 Interaction method and electronic equipment

Publications (2)

Publication Number Publication Date
CN110673738A (en) 2020-01-10
CN110673738B CN110673738B (en) 2022-02-18

Family

ID=69080855

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910947571.9A Active CN110673738B (en) 2019-09-29 2019-09-29 Interaction method and electronic equipment

Country Status (1)

Country Link
CN (1) CN110673738B (en)


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110285704A1 (en) * 2010-02-03 2011-11-24 Genyo Takeda Spatially-correlated multi-display human-machine interface
US20160314624A1 (en) * 2015-04-24 2016-10-27 Eon Reality, Inc. Systems and methods for transition between augmented reality and virtual reality
CN105915990A (en) * 2016-05-25 2016-08-31 青岛歌尔声学科技有限公司 Virtual reality (VR) helmet and using method thereof
CN106293067A (en) * 2016-07-27 2017-01-04 上海与德通讯技术有限公司 A kind of display changeover method and wearable display device
CN106774929A (en) * 2016-12-30 2017-05-31 维沃移动通信有限公司 The display processing method and virtual reality terminal of a kind of virtual reality terminal
CN106951316A (en) * 2017-03-20 2017-07-14 北京奇虎科技有限公司 Changing method, device and the virtual reality device of Virtualization Mode and Realistic model
CN107168513A (en) * 2017-03-22 2017-09-15 联想(北京)有限公司 Information processing method and electronic equipment
CN108401463A (en) * 2017-08-11 2018-08-14 深圳前海达闼云端智能科技有限公司 Virtual display device, intelligent interaction method and cloud server
CN107678546A (en) * 2017-09-26 2018-02-09 歌尔科技有限公司 Virtual scene switching method and wear display device
CN109522070A (en) * 2018-10-29 2019-03-26 联想(北京)有限公司 Display processing method and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ADRIAN H. HOPPE et al.: "VirtualTablet: Extending Movable Surfaces with Touch Interaction", 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) *
马莉 et al.: "Research on Augmented Reality Foreign Language Teaching Environment and Its Multimodal Discourse" (增强现实外语教学环境及其多模态话语研究), Modern Educational Technology (现代教育技术) *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111522141A (en) * 2020-06-08 2020-08-11 歌尔光学科技有限公司 Head-mounted device

Also Published As

Publication number Publication date
CN110673738B (en) 2022-02-18

Similar Documents

Publication Publication Date Title
US9710217B2 (en) Identifying the positioning in a multiple display grid
US9645720B2 (en) Data sharing
CN106796495B (en) Combined handover and window placement
US9417753B2 (en) Method and apparatus for providing contextual information between operating system environments
US10915284B2 (en) Multi-monitor full screen mode in a windowing environment
US20090235177A1 (en) Multi-monitor remote desktop environment user interface
US9389884B2 (en) Method and apparatus for providing adaptive wallpaper display for a device having multiple operating system environments
US20110063191A1 (en) Method of managing applications in a multi-monitor computer system and multi-monitor computer system employing the method
US10901612B2 (en) Alternate video summarization
CN109460180B (en) Control method and electronic device
JP2010178331A (en) Dynamic geometry management of virtual frame buffer for appendable logical display
US20140043366A1 (en) Image processing apparatus, image processing system, and image processing method
CN108228074B (en) Display control method, display system, electronic device, and computer-readable medium
US20180039470A1 (en) Image output control method and display device
US11240432B2 (en) Control method for displaying images upright
TW202008066A (en) Method and system of partially projecting a computer screen
CN110673738B (en) Interaction method and electronic equipment
US10754524B2 (en) Resizing of images with respect to a single point of convergence or divergence during zooming operations in a user interface
CN111143298A (en) Information processing method and information processing apparatus
WO2016114135A1 (en) Image processing apparatus and image processing method
US20160313965A1 (en) Interactive control system, touch sensitive display apparatus and control method thereof
CN108416847B (en) Method and device for displaying operation object
KR20120117107A (en) Mobile terminal comprising dual display and method for operating that mobile terminal
CN110032295B (en) Control method and control device for electronic device, and medium
CN116954419A (en) Content transmission method and content transmission device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant