CN106933368B - Information processing method and device - Google Patents

Information processing method and device

Info

Publication number
CN106933368B
Authority
CN
China
Prior art keywords
display
electronic device
virtual scene
view
type
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710197126.6A
Other languages
Chinese (zh)
Other versions
CN106933368A (en)
Inventor
许奔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201710197126.6A priority Critical patent/CN106933368B/en
Publication of CN106933368A publication Critical patent/CN106933368A/en
Application granted granted Critical
Publication of CN106933368B publication Critical patent/CN106933368B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides an information processing method and device. The method is applied to a first electronic device and acquires the type of a display object in a virtual scene played by a second electronic device. When the type is a first type, the first electronic device displays the virtual scene in a first display view in which the display object is located, while the second electronic device displays the virtual scene in a second display view in which the display object is located, the first display view being different from the second display view. When the type is a second type, the first electronic device and the second electronic device both display the virtual scene in a third display view in which the display object is located. The display view of the virtual scene on the first electronic device is thus controlled according to the type of the display object in the virtual scene currently played by the second electronic device, which solves the problem of how multiple electronic devices display the same virtual scene.

Description

Information processing method and device
Technical Field
The present invention relates to the field of information processing technologies, and in particular, to an information processing method and apparatus.
Background
With the continuous development of science and technology, users place ever higher demands on the functions of electronic devices. Technologies such as Virtual Reality (VR) and Augmented Reality (AR) can virtually display or augment real-world scenes, enhancing the user's visual effect and experience.
In the course of implementing the invention, the inventor found that most existing virtual reality devices are head-mounted display devices whose virtual scenes are limited to a single user, that is, the scene is rendered for a single electronic device. As user demands keep growing, when multiple electronic devices join the same virtual scene at the same time, for example to watch a movie together or to play an interactive game across devices, how each virtual reality device should display the virtual scene becomes a problem that urgently needs to be solved.
Disclosure of Invention
In view of this, the present invention provides an information processing method and apparatus to overcome the prior-art problem of how multiple electronic devices display the same virtual scene.
To achieve this purpose, the invention provides the following technical solutions:
an information processing method applied to a first electronic device comprises the following steps:
acquiring the type of a display object in a virtual scene played by a second electronic device;
if the type is a first type, the first electronic device displays the virtual scene in a first display view in which the display object is located, and the second electronic device displays the virtual scene in a second display view in which the display object is located, wherein the first display view and the second display view are different;
if the type is a second type, the first electronic device and the second electronic device both display the virtual scene in a third display view in which the display object is located.
Preferably, determining the type of the display object includes:
judging whether the display object is a planar object or a stereoscopic object, wherein if the display object is a stereoscopic object, the display object belongs to the first type, and if the display object is a planar object, the display object belongs to the second type.
Preferably, the first electronic device displaying the virtual scene in the first display view includes:
acquiring the first display view and the virtual scene;
displaying the virtual scene according to the first display view.
Preferably, acquiring the first display view includes:
acquiring the relative position of the first electronic device and the second electronic device;
determining the first display view according to the relative position.
Preferably, acquiring the first display view includes:
receiving the first display view and the virtual scene sent by the second electronic device;
displaying the virtual scene according to the first display view.
Preferably, the method further includes:
sending a target display view switching request;
acquiring a target virtual scene corresponding to the target display view switching request;
displaying the target virtual scene in the target display view.
An information processing apparatus comprising:
a display for displaying a virtual scene in any of a plurality of display views, wherein a display object is located within the display view;
a processor configured to:
acquire the type of a display object in a virtual scene played by a second electronic device;
if the type is a first type, the first electronic device displays the virtual scene in a first display view in which the display object is located, and the second electronic device displays the virtual scene in a second display view in which the display object is located, wherein the first display view and the second display view are different;
if the type is a second type, the first electronic device and the second electronic device both display the virtual scene in a third display view in which the display object is located.
Preferably, when the first electronic device displays the virtual scene in the first display view, the processor is specifically configured to:
acquire the first display view and the virtual scene;
display the virtual scene according to the first display view.
Preferably, when acquiring the first display view, the processor is specifically configured to:
acquire the relative position of the first electronic device and the second electronic device;
determine the first display view according to the relative position.
Preferably, when acquiring the first display view, the processor is specifically configured to:
receive the first display view and the virtual scene sent by the second electronic device;
display the virtual scene according to the first display view.
Preferably, the processor is further configured to:
send a target display view switching request;
acquire a target virtual scene corresponding to the target display view switching request;
display the target virtual scene in the target display view.
As can be seen from the foregoing technical solutions, compared with the prior art, the information processing method and apparatus provided by the present invention are applied to a first electronic device and acquire the type of a display object in a virtual scene played by a second electronic device. When the type is a first type, the first electronic device displays the virtual scene in a first display view in which the display object is located, while the second electronic device displays the virtual scene in a second display view in which the display object is located, the first display view being different from the second display view. When the type is a second type, both electronic devices display the virtual scene in a third display view in which the display object is located. The display view of the virtual scene on the first electronic device is therefore controlled according to the type of the display object in the virtual scene currently played by the second electronic device, which solves the problem of how multiple electronic devices display the same virtual scene.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and other drawings can be derived from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flowchart of an information processing method according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a process in which a first electronic device acquires the type of a display object in a virtual scene played by a second electronic device according to an embodiment of the present invention;
fig. 3 is a schematic diagram illustrating a display object classification according to an embodiment of the present invention;
fig. 4 is a schematic flowchart of a process in which a first electronic device acquires the type of a display object in a virtual scene played by a second electronic device according to another embodiment of the present invention;
fig. 5a is an application scenario diagram of a first electronic device according to an embodiment of the present invention;
fig. 5b is an application scenario diagram of yet another first electronic device according to an embodiment of the present invention;
fig. 6 is a schematic flowchart of a process in which a first electronic device determines the display view for playing a virtual scene according to an embodiment of the present invention;
fig. 7 is an application scenario diagram of yet another first electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments derived by a person skilled in the art from these embodiments without creative effort shall fall within the protection scope of the present invention.
The invention provides an information processing method and device. The method is applied to a first electronic device and acquires the type of a display object in a virtual scene played by a second electronic device. When the type is a first type, the first electronic device displays the virtual scene in a first display view in which the display object is located, while the second electronic device displays the virtual scene in a second display view in which the display object is located, the first display view being different from the second display view. When the type is a second type, both electronic devices display the virtual scene in a third display view in which the display object is located. The display view of the virtual scene on the first electronic device is thus controlled according to the type of the display object in the virtual scene currently played by the second electronic device, which solves the problem of how multiple electronic devices display the same virtual scene.
Referring to fig. 1, a schematic flow chart of an information processing method according to the present invention is shown, where the information processing method is applied to a first electronic device, and includes the following steps:
and S1, acquiring the type of the display object in the virtual scene played by the second electronic equipment.
It should be noted that the application scenario of this embodiment involves at least two electronic devices, for example a first electronic device and a second electronic device, which exchange data with each other. This embodiment does not limit the number of first and second electronic devices: one first electronic device may be combined with a plurality of second electronic devices, one second electronic device with a plurality of first electronic devices, or a plurality of first electronic devices with a plurality of second electronic devices.
Specifically, the information processing method provided in this embodiment describes the data interaction between the first electronic device and the second electronic device from the perspective of the first electronic device. The electronic device may be a conventional fixed terminal such as a desktop or an all-in-one machine, a mobile terminal such as a mobile phone or a tablet, or a virtual terminal such as VR glasses or AR glasses.
Whatever kind of electronic device is used, in this embodiment it must be able to play a virtual scene, for example VR glasses playing a football match, a movie, or an interactive game scene. At least one display object exists in the virtual scene, where the display object is part of the content currently displayed in the virtual scene played by the electronic device. In a football match, for example, the display object may be the ball or a particular player; in an interactive game scene, the display object may be a particular piece of content seen from a particular viewing angle, such as a character in a CS gun-battle scene or a particular card in a chess-and-card scene, and so on.
In this embodiment, display objects may be classified according to their display dimensions, for example into two-dimensional display objects and multi-dimensional display objects, where two-dimensional display objects are planar objects such as text and pictures, and multi-dimensional display objects are stereoscopic objects such as a football or a person.
Specifically, the first electronic device can acquire the type of the display object in the virtual scene played by the second electronic device in several ways, as follows:
In a first mode, as shown in fig. 2, the first electronic device first obtains the content of the virtual scene currently played by the second electronic device and then determines the type of the display object in that scene. This can be done by building a classification library of display objects into the first electronic device in advance: after receiving the virtual scene played by the second electronic device, the first electronic device extracts the display object from the scene, looks it up in the built-in classification library, and takes the class that contains the display object as its type.
For example, referring to fig. 3, which illustrates the first mode, the classification library is divided into two major classes, a two-dimensional class and a multi-dimensional class, and each class records which display objects belong to it. The two-dimensional class contains planar content such as flat pictures, text and video, while the three-dimensional class contains stereoscopic content such as a football, an animated puppy or a stereoscopic dinosaur.
Assume that the virtual scene currently played by the second electronic device is a jungle exploration scene in which a dinosaur is walking. The first electronic device acquires this scene, identifies the display object, the dinosaur, looks it up in the affiliation table of the classification library, finds that the dinosaur belongs to the three-dimensional display objects, and therefore determines that the type of the display object on the second electronic device is three-dimensional.
Similarly, assume that the virtual scene currently played by the second electronic device is the content of a letter, with text scrolling across a plane. The first electronic device acquires this content, identifies the display object, the text information, looks it up in the affiliation table of the classification library, finds that "text information" belongs to the two-dimensional display objects, and therefore determines that the type of the display object on the second electronic device is two-dimensional.
In a second mode, as shown in fig. 4, the second electronic device first determines the content of the virtual scene it is currently playing and the display object in that scene, then determines the type of the display object, and finally sends the type of the display object to the first electronic device.
The second electronic device determines the type of the display object in essentially the same way: a classification library of display objects is set up in the second electronic device in advance, the display object in the currently played virtual scene is extracted and looked up in the classification library, and the class that contains the display object is taken as its type.
It should be noted that, instead of being built into the first or second electronic device, the classification library may also be kept on a server. When the first or second electronic device needs to classify a display object, it sends the display object to a preset server, which performs the judgment and returns the result to the requesting device. In either mode, the first electronic device ultimately obtains the type of the display object in the virtual scene played by the second electronic device.
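As a rough illustration of the classification-library lookup described above, the following Python sketch maps a handful of example display objects to a two-dimensional or multi-dimensional class. The library contents, the function name and the default for unknown objects are assumptions made here for illustration only and are not taken from the patent.

```python
# Minimal sketch of a classification-library lookup for display objects.
# Library contents and the unknown-object default are illustrative assumptions.

PLANAR = "two-dimensional"       # second type: planar objects
STEREO = "multi-dimensional"     # first type: stereoscopic objects

# Hypothetical classification library mapping display objects to classes.
CLASSIFICATION_LIBRARY = {
    "picture": PLANAR,
    "text": PLANAR,
    "video": PLANAR,
    "football": STEREO,
    "puppy": STEREO,
    "dinosaur": STEREO,
}

def classify_display_object(display_object: str) -> str:
    """Look up the display object in the classification library and return
    its class; unknown objects default to the planar class in this sketch."""
    return CLASSIFICATION_LIBRARY.get(display_object, PLANAR)

if __name__ == "__main__":
    # The jungle scene example: the dinosaur resolves to the stereoscopic class.
    print(classify_display_object("dinosaur"))   # multi-dimensional
    print(classify_display_object("text"))       # two-dimensional
```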
S2, if the type is a first type, the first electronic device displays the virtual scene in a first display view in which the display object is located, and the second electronic device displays the virtual scene in a second display view in which the display object is located, where the first display view and the second display view are different.
S3, if the type is a second type, the first electronic device and the second electronic device both display the virtual scene in a third display view in which the display object is located.
Following step S1, the virtual scene played on the second electronic device has a display view. For example, if the virtual scene is a football match, the ball is seen at a different angle from the perspective of each player, that is, each player has his own display view. The display view is of course not limited to this and can be set as required; in the same football match, different fans in the stands see the ball and the players at different angles, so each fan also has his own display view.
As another example, suppose the virtual scene played on the second electronic device is one of the sub-venues of a zoo; the display view then depends on where the visitor is. Visitor A may be watching an adorably clumsy panda in the panda hall while visitor B watches an elephant in the elephant hall. Even within the same hall the display views differ because the visitors stand in different places: visitor C is also in the panda hall, but because C and A stand at different positions, they see the panda, the display object, from different angles; A may see the panda's head while C sees its tail.
Steps S2 and S3 implement the display of the virtual scene in different display views according to the type of the display object in the virtual scene played by the second electronic device.
Specifically, when the type of the display object in the virtual scene played by the second electronic device is the first type, the first electronic device displays the virtual scene in a first display view in which the display object is located, while the second electronic device displays the virtual scene in a second display view in which the display object is located, and the first display view differs from the second display view.
When the type of the display object is the second type, the first electronic device and the second electronic device both display the virtual scene in a third display view in which the display object is located. The third display view may be the same as the first or second display view, or different from both. In this embodiment, the third display view is preferably the same as the second display view and different from the first display view.
It should be noted that, in this embodiment, display objects are divided into two types: a first type, in which the display object is a stereoscopic object, and a second type, in which the display object is a planar object.
When the display object in the virtual scene on the second electronic device, as acquired by the first electronic device, is of the first type, the second electronic device keeps displaying the virtual scene in the display view it is currently using; this current display view of the second electronic device is defined as the second display view, and the display object lies within it.
Meanwhile, the first electronic device displays the virtual scene played by the second electronic device in the first display view, and the display object must likewise lie within the first display view.
For example, as shown in fig. 5a, the virtual scene played on the second electronic device is a CS battle, which the second electronic device displays in the view of a competitor on side A (the second display view above); the display object, an enemy, is visible in that second display view. When the first electronic device wants to join the game and interact with the second electronic device, it acquires the type of the display object, the enemy, in the CS battle on the second electronic device and finds it to be a stereoscopic object. The first electronic device therefore displays the CS battle in a display view (the first display view) different from that of the second electronic device, for example the view of another side-A competitor, and the virtual scene played on the first electronic device must contain the same display object, the enemy, as the virtual scene displayed on the second electronic device.
By contrast, as shown in fig. 5b, when the virtual scene played on the second electronic device is a carousel electronic album, the display object acquired by the first electronic device is a photo in the album, which is judged to be a planar object, so the type of the display object in the current virtual scene is the second type. In this case the first and second electronic devices may play the current virtual scene in the same display view (the third display view) without switching views; the third display view may be the second display view currently used by the second electronic device, or another display view different from both the first and the second display views.
In summary, in this embodiment the display view of the virtual scene played on the first electronic device is determined by the type of the display object in the virtual scene played on the second electronic device: when the display object is a stereoscopic object, the two devices use different display views, and when the display object is a planar object, they use the same display view.
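This view-selection rule can be sketched in Python as follows. The function and class names are hypothetical, and the choice of reusing the second display view as the shared third view simply follows the stated preference of this embodiment.

```python
# Sketch of the view-selection rule: a stereoscopic display object yields
# different display views on the two devices, a planar display object yields
# the same (third) display view. Names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class DisplayView:
    label: str

def select_display_views(object_type: str,
                         second_device_view: DisplayView,
                         alternative_view: DisplayView) -> tuple:
    """Return (first_device_view, second_device_view) for the given object type."""
    if object_type == "multi-dimensional":
        # First type (stereoscopic object): the first device takes a different
        # view, e.g. one derived from the relative position of the devices.
        return alternative_view, second_device_view
    # Second type (planar object): both devices share one (third) display view;
    # here the second device's current view is reused, as preferred above.
    shared_view = second_device_view
    return shared_view, shared_view

# Example: a CS battle (stereoscopic enemy) vs. a photo album (planar photos).
player_a_view = DisplayView("side-A competitor")
player_b_view = DisplayView("another side-A competitor")
print(select_display_views("multi-dimensional", player_a_view, player_b_view))
print(select_display_views("two-dimensional", player_a_view, player_b_view))
```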
On the basis of the foregoing embodiment, this embodiment further provides a specific implementation of how the first electronic device determines the display view for playing the virtual scene when the type of the display object is the first type. As shown in fig. 6, it includes the following steps:
S21, acquiring the first display view and the virtual scene;
S22, displaying the virtual scene according to the first display view.
The first electronic device may acquire the first display view by acquiring the relative position between the first electronic device and the second electronic device and then determining the first display view from that relative position.
As shown in fig. 7, the second electronic device is located at position A and the first electronic device at position B. The second display view corresponds to the vector AO from the second electronic device to the target object, and the positional relationship between the two devices corresponds to the vector AB. The display view from the first electronic device to the target object, that is, the vector BO, can be determined from AO and AB, so the first display view is obtained from the relative position of the two devices, and the first electronic device then displays the virtual scene containing the target object in this first display view.
In other words, the change in the display view of the virtual scene between the second electronic device and the first electronic device corresponds to the change of viewpoint from position A to position B.
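Under the usual reading of fig. 7, with the second device at A, the first device at B and the target object at O, the first viewing vector follows as BO = AO - AB. The small sketch below works this out with made-up coordinates; it is only an illustration of the vector relationship, not of the patent's implementation.

```python
# Sketch of the vector relationship in fig. 7: since A + AO = O and A + AB = B,
# the first device's viewing vector is BO = AO - AB. Coordinates are made up.

def vector_sub(u, v):
    """Component-wise subtraction of two 3D vectors."""
    return tuple(a - b for a, b in zip(u, v))

# AO: second device's viewing vector toward the target object (second display view).
AO = (4.0, 0.0, 2.0)
# AB: relative position of the first device with respect to the second device.
AB = (1.5, 0.0, -0.5)

# BO: first device's viewing vector toward the same target (first display view).
BO = vector_sub(AO, AB)
print(BO)  # (2.5, 0.0, 2.5)
```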
It should be noted that this embodiment does not limit which entity determines the first display view. The first electronic device may determine it from its position relative to the second electronic device, a background server may determine it and send it to the first electronic device, or the second electronic device may determine the first display view and send it to the first electronic device; in the last case the first electronic device only needs to receive the first display view and the virtual scene sent by the second electronic device and display the virtual scene according to the first display view.
On the basis of the foregoing embodiment, the first electronic device may also send a scene join request before acquiring the type of the display object in the virtual scene played on the second electronic device. After the second electronic device or a background server verifies the request, the first electronic device is authorized to acquire the type of the display object of the virtual scene currently played on the second electronic device.
For example, suppose the first electronic device is virtual glasses A and the second electronic device is virtual glasses B, and glasses B are playing a 3D movie whose display object is a stereoscopic dinosaur. Glasses A send a scene join request to glasses B; after glasses B return feedback information triggered by their user in response to the request, glasses A are authorized. Glasses A can then obtain the virtual scene being played by glasses B together with a first display view determined from the spatial orientation of glasses A relative to glasses B, and display the virtual scene in that first display view.
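A minimal sketch of this join-and-authorize exchange is given below, assuming the reading that glasses A request to join and glasses B authorize after their user confirms; the message classes, field names and scene strings are purely illustrative.

```python
# Sketch of the scene-join exchange between two devices. Message format and
# class names are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class JoinRequest:
    requester_id: str

@dataclass
class JoinResponse:
    authorized: bool
    virtual_scene: str = ""
    first_display_view: str = ""

class VirtualGlassesB:
    """Plays a 3D movie and grants access when its user confirms the request."""
    def handle_join_request(self, request: JoinRequest, user_confirms: bool) -> JoinResponse:
        if not user_confirms:
            return JoinResponse(authorized=False)
        return JoinResponse(
            authorized=True,
            virtual_scene="3D movie with a stereoscopic dinosaur",
            first_display_view="view derived from A's position relative to B",
        )

# Glasses A ask to join the scene played by glasses B.
response = VirtualGlassesB().handle_join_request(JoinRequest("glasses_A"), user_confirms=True)
if response.authorized:
    print(response.virtual_scene, "shown in:", response.first_display_view)
```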
In addition, the electronic device provided in this embodiment may send a target display view switching request, acquire a target virtual scene corresponding to the target display view switching request, and display the target virtual scene in the target display view.
Specifically, the first electronic device may switch display views based on a user operation. In the embodiment above, the first electronic device displays the virtual scene in the first display view and the second electronic device displays it in the second display view. If the user of the first electronic device then wants to watch the virtual scene from a different display view, the user may trigger the first electronic device to send a display view switching request, switching the first display view to the second display view or to the third display view, and the user may switch back and forth between different display views as needed.
It should be noted that the display view switching request may be triggered in various ways, for example by a switching key on the first electronic device that the user presses, or by a virtual switching button placed somewhere in the virtual scene played by the first electronic device; this embodiment is not limited to these forms.
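The switching flow can be sketched roughly as follows. The class, method and view names are assumptions, and in a real system the request would be handled by the peer device or a server rather than simulated locally.

```python
# Sketch of the display-view switching flow: the user triggers a switch request,
# the device fetches the target virtual scene for the requested view and shows it.

class FirstElectronicDevice:
    def __init__(self, available_views):
        self.available_views = available_views
        self.current_view = available_views[0]

    def request_view_switch(self, target_view: str) -> str:
        """Send a target display view switching request and return the
        target virtual scene rendered for that view."""
        if target_view not in self.available_views:
            raise ValueError("unknown display view: " + target_view)
        # In a real system this request would go to the peer device or a server;
        # here the scene for the target view is simply simulated as a string.
        target_scene = f"virtual scene rendered for the {target_view}"
        self.current_view = target_view
        return target_scene

device = FirstElectronicDevice(["first display view", "second display view", "third display view"])
print(device.request_view_switch("second display view"))
print(device.request_view_switch("third display view"))
```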
Moreover, the example above only describes how the display views of the first and second electronic devices are determined when the type of the display object is the first type or the second type. Beyond that, after the type of the display object has been determined, this embodiment may re-determine the display views of the first and second electronic devices when the display object in the virtual scene changes.
For example, suppose the virtual scene currently played on the second electronic device is a football match, the display object is the ball, and the second electronic device's display view is that of player A; the first electronic device then displays the match in the view of player B. When a player is penalized during the match, the virtual scene may display the message "XXX is shown a yellow card", and the display object switches from the ball to the warning message, that is, from a stereoscopic object to a planar object. At that moment the first and second electronic devices may both view the penalty message in the third display view, so the display views of the two devices no longer need to differ.
The display object may be switched by the electronic device itself, switched based on a user operation, or switched automatically by the system when a preset time arrives; this embodiment places no specific limitation on this.
The method has been described in detail in the embodiments provided by the invention, and it can be implemented by apparatuses in various forms; the invention therefore also provides an apparatus, which is described in detail in the following specific embodiment.
This embodiment provides an information processing apparatus, comprising:
a display for displaying a virtual scene in any of a plurality of display views, wherein a display object is located within the display view;
a processor configured to:
acquire the type of a display object in a virtual scene played by a second electronic device;
if the type is a first type, the first electronic device displays the virtual scene in a first display view in which the display object is located, and the second electronic device displays the virtual scene in a second display view in which the display object is located, wherein the first display view and the second display view are different;
if the type is a second type, the first electronic device and the second electronic device both display the virtual scene in a third display view in which the display object is located.
When the first electronic device displays the virtual scene in the first display view, the processor is specifically configured to:
acquire the first display view and the virtual scene;
display the virtual scene according to the first display view.
When acquiring the first display view, the processor is specifically configured to:
acquire the relative position of the first electronic device and the second electronic device;
determine the first display view according to the relative position.
Alternatively, when acquiring the first display view, the processor is specifically configured to:
receive the first display view and the virtual scene sent by the second electronic device;
display the virtual scene according to the first display view.
On the basis of the above embodiment, the processor is further configured to:
send a target display view switching request;
acquire a target virtual scene corresponding to the target display view switching request;
display the target virtual scene in the target display view.
The working principle of the apparatus provided in this embodiment is the same as that of the method embodiment and is not repeated here.
In summary, the invention provides an information processing method and device. The method is applied to a first electronic device and acquires the type of a display object in a virtual scene played by a second electronic device. When the type is a first type, the first electronic device displays the virtual scene in a first display view in which the display object is located, while the second electronic device displays the virtual scene in a second display view in which the display object is located, the first display view being different from the second display view. When the type is a second type, both devices display the virtual scene in a third display view in which the display object is located. The display view of the virtual scene on the first electronic device is thus controlled according to the type of the display object in the virtual scene currently played by the second electronic device, which solves the problem of how multiple electronic devices display the same virtual scene.
The embodiments in this description are described in a progressive manner; each embodiment focuses on its differences from the others, and the same or similar parts of the embodiments can be referred to one another. The apparatus provided by an embodiment is described relatively briefly because it corresponds to the method provided by the embodiment; for relevant details, refer to the description of the method.
The previous description of the provided embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features provided herein.

Claims (10)

1. An information processing method applied to a first electronic device, characterized in that the method comprises:
acquiring the type of a display object in a virtual scene played by a second electronic device;
if the type is a first type, the first electronic device displays the virtual scene in a first display view in which the display object is located, and the second electronic device displays the virtual scene in a second display view in which the display object is located, wherein the first display view and the second display view are different;
if the type is a second type, the first electronic device and the second electronic device both display the virtual scene in a third display view in which the display object is located.
2. The information processing method according to claim 1, wherein determining the type of the display object comprises:
judging whether the display object is a planar object or a stereoscopic object, wherein if the display object is a stereoscopic object, the display object belongs to the first type, and if the display object is a planar object, the display object belongs to the second type.
3. The information processing method according to claim 1, wherein the first electronic device displaying the virtual scene in the first display view comprises:
acquiring the first display view and the virtual scene;
displaying the virtual scene according to the first display view.
4. The information processing method according to claim 3, wherein acquiring the first display view comprises:
acquiring the relative position of the first electronic device and the second electronic device;
determining the first display view according to the relative position;
or,
receiving the first display view and the virtual scene sent by the second electronic device;
displaying the virtual scene according to the first display view.
5. The information processing method according to claim 1, further comprising:
sending a target display view switching request;
acquiring a target virtual scene corresponding to the target display view switching request;
displaying the target virtual scene in the target display view.
6. An information processing apparatus, characterized by comprising:
a display for displaying a virtual scene in any of a plurality of display views, wherein a display object is located within the display view;
a processor configured to:
acquire the type of a display object in a virtual scene played by a second electronic device;
if the type is a first type, the first electronic device displays the virtual scene in a first display view in which the display object is located, and the second electronic device displays the virtual scene in a second display view in which the display object is located, wherein the first display view is different from the second display view;
if the type is a second type, the first electronic device and the second electronic device both display the virtual scene in a third display view in which the display object is located.
7. The information processing apparatus of claim 6, wherein the processor, when the first electronic device displays the virtual scene in the first display view, is specifically configured to:
acquire the first display view and the virtual scene;
display the virtual scene according to the first display view.
8. The information processing apparatus of claim 7, wherein the processor, when acquiring the first display view, is specifically configured to:
acquire the relative position of the first electronic device and the second electronic device;
determine the first display view according to the relative position.
9. The information processing apparatus of claim 7, wherein the processor, when acquiring the first display view, is specifically configured to:
receive the first display view and the virtual scene sent by the second electronic device;
display the virtual scene according to the first display view.
10. The information processing apparatus of claim 6, wherein the processor is further configured to:
send a target display view switching request;
acquire a target virtual scene corresponding to the target display view switching request;
display the target virtual scene in the target display view.
CN201710197126.6A 2017-03-29 2017-03-29 Information processing method and device Active CN106933368B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710197126.6A CN106933368B (en) 2017-03-29 2017-03-29 Information processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710197126.6A CN106933368B (en) 2017-03-29 2017-03-29 Information processing method and device

Publications (2)

Publication Number Publication Date
CN106933368A CN106933368A (en) 2017-07-07
CN106933368B (en) 2019-12-24

Family

ID=59425250

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710197126.6A Active CN106933368B (en) 2017-03-29 2017-03-29 Information processing method and device

Country Status (1)

Country Link
CN (1) CN106933368B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102023706A (en) * 2009-09-15 2011-04-20 帕洛阿尔托研究中心公司 System for interacting with objects in a virtual environment
CN104571532A (en) * 2015-02-04 2015-04-29 网易有道信息技术(北京)有限公司 Method and device for realizing augmented reality or virtual reality

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060284791A1 (en) * 2005-06-21 2006-12-21 National Applied Research Laboratories National Center For High-Performance Computing Augmented reality system and method with mobile and interactive function for multiple users

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102023706A (en) * 2009-09-15 2011-04-20 帕洛阿尔托研究中心公司 System for interacting with objects in a virtual environment
CN104571532A (en) * 2015-02-04 2015-04-29 网易有道信息技术(北京)有限公司 Method and device for realizing augmented reality or virtual reality

Also Published As

Publication number Publication date
CN106933368A (en) 2017-07-07

Similar Documents

Publication Publication Date Title
CN110519611B (en) Live broadcast interaction method and device, electronic equipment and storage medium
CN106803966B (en) Multi-user network live broadcast method and device and electronic equipment thereof
US9728011B2 (en) System and method for implementing augmented reality via three-dimensional painting
US8547401B2 (en) Portable augmented reality device and method
US20170169598A1 (en) System and method for delivering augmented reality using scalable frames to pre-existing media
US10873768B2 (en) Three-dimensional advertising space determination system, user terminal, and three-dimensional advertising space determination computer
CN106791906B (en) Multi-user network live broadcast method and device and electronic equipment thereof
WO2017113577A1 (en) Method for playing game scene in real-time and relevant apparatus and system
US20150172634A1 (en) Dynamic POV Composite 3D Video System
CN113115061B (en) Live broadcast interaction method and device, electronic equipment and storage medium
US20130038702A1 (en) System, method, and computer program product for performing actions based on received input in a theater environment
CN112312111A (en) Virtual image display method and device, electronic equipment and storage medium
US20170235462A1 (en) Interaction control method and electronic device for virtual reality
US9621703B2 (en) Motion to connect to kiosk
CN107665231A (en) Localization method and system
US20220161144A1 (en) Image display method and apparatus, storage medium, and electronic device
CN108076379B (en) Multi-screen interaction realization method and device
CN114191823A (en) Multi-view game live broadcast method and device and electronic equipment
US20150161799A1 (en) 3d immersion technology
JP2018169735A (en) Video retrieval program and video retrieval method and video information processing equipment
CN106933368B (en) Information processing method and device
CN112288877A (en) Video playing method and device, electronic equipment and storage medium
CN107578306A (en) Commodity in track identification video image and the method and apparatus for showing merchandise news
GB2555838A (en) An apparatus, computer program and method
GB2555841A (en) An apparatus, computer program and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant