CN113296721A - Display method, display device and multi-screen linkage system

Display method, display device and multi-screen linkage system

Info

Publication number: CN113296721A
Application number: CN202011491190.3A
Authority: CN (China)
Prior art keywords: image, screen, displayed, operation instruction, instruction
Legal status: Pending (the status listed is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 陈晟沁, 陈霖, 戴湛祥, 郭嘉, 刘洋, 张凌柱, 余柳, 曾骁
Current Assignee: Alibaba China Co Ltd
Original Assignee: Alibaba China Co Ltd
Application filed by Alibaba China Co Ltd on 2020-12-16
Priority date: 2020-12-16 (priority to CN202011491190.3A)
Publication date: 2021-08-24 (publication of CN113296721A)


Classifications

    • G06F3/1423 Digital output to display device; cooperation and interconnection of the display device with other functional units; controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1438 Controlling a plurality of local displays using more than one graphics controller
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F2203/011 Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns (indexing scheme relating to G06F3/01)
    • G06F2203/04802 3D-info-object: information is displayed on the internal or external surface of a three-dimensional manipulable object, e.g. on the faces of a cube that can be rotated by the user (indexing scheme relating to G06F3/048)
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation (indexing scheme relating to G06F3/048)

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A display method, a display device and a multi-screen linkage system are disclosed. A first image is displayed on a first screen of a first device. A second device is associated with the first device so that a second image is displayed on a second screen of the second device, the second image corresponding to the same video content or video scene as the first image. An operation instruction issued at the first device based on the first image is received, and processing corresponding to that instruction is performed on the second image displayed on the second screen. More interaction schemes can thus be provided between the large screen and the small screen, making better use of their respective advantages and further improving the user experience.

Description

Display method, display device and multi-screen linkage system
Technical Field
The present disclosure relates to Internet of Things systems, and in particular to a display method and a display device in a multi-screen linkage system.
Background
The Internet of Things (IoT), the "Internet of everything connected", extends and expands the Internet by combining all kinds of information-sensing devices into one vast network, enabling people, machines and things to communicate at any time and in any place.
With the development of IoT technology, devices have gained a degree of linkage capability. For example, content displayed on a mobile phone's screen can be cast onto a television with a larger display.
In existing screen-casting schemes, however, the mobile phone serves only as a remote control: the content displayed on the television's large screen is controlled through, and identical to, the content displayed on the phone's screen.
As a result, the large-screen experience lags behind that of the phone, and the large screen cannot take over the phone's functions.
There is therefore still a need for a multi-screen linkage scheme that presents more content display modes on a large screen and provides more interaction schemes between the large screen and the small screen, so as to better exploit the respective advantages of each and further improve the user experience.
Disclosure of Invention
The technical problem addressed by the present disclosure is to provide a display method, a display device and a multi-screen linkage system that offer more interaction schemes between a large screen and a small screen, so as to better exploit the respective advantages of each and further improve the user experience.
According to a first aspect of the present disclosure, there is provided a display method including: displaying a first image on a first screen of a first device; associating a second device with the first device such that a second image is displayed on a second screen of the second device, the second image corresponding to the same video content or video scene as the first image; receiving an operation instruction issued at the first device based on the first image; and performing, on the second image displayed on the second screen, processing corresponding to the operation instruction.
Optionally, the second image is derived based on the first image; or the first image and the second image are derived based on the same image data source.
Optionally, the operation instruction includes a local enlargement instruction for a local area range on the first image, and the method further includes: in response to the local enlargement instruction, marking the local area range on the first image displayed on the first screen, wherein the processing corresponding to the operation instruction on the second image displayed on the second screen includes: enlarging and displaying, on the second screen, the local image within the local area range of the first image.
Optionally, the operation instruction further includes a local area range adjustment instruction, and the method further includes: moving and/or zooming the local area range marked on the first image in response to the local area range adjustment instruction, wherein the processing corresponding to the operation instruction on the second image displayed on the second screen includes: as the local area range is moved and/or zoomed on the first image, adjusting the local image enlarged on the second screen accordingly so that it continues to correspond to the local area range.
Optionally, the operation instruction further includes a partial enlargement instruction issued for a partial range within the local area range, and the method further includes: in response to the partial enlargement instruction, marking the partial range on the first image displayed on the first screen, wherein the processing corresponding to the operation instruction on the second image displayed on the second screen includes: superimposing a partial image display frame on the local image enlarged on the second screen, and enlarging and displaying, within that frame, the partial image in the partial range of the first image.
Optionally, the operation instruction further includes a local zoom instruction for the first image, and the method further includes: in response to the local zoom instruction, performing zoom processing on the first image displayed on the first screen without performing corresponding zoom processing on the second image displayed on the second screen.
Optionally, the operation instruction further includes an identification instruction issued for an object image or an object region on the first image, and the method further includes: superimposing, on the second image displayed on the second screen, object information obtained by identifying the object image or the object image within the object region.
Optionally, the method may further include: displaying shopping guide information associated with the object information on a first screen; and/or displaying object associated information obtained based on the object information on a second screen.
Optionally, the first image and the second image are video images, and the method further includes: displaying object information of a predetermined object on the first screen in response to the video image being played to a video frame containing the predetermined object.
Optionally, the displaying of the object information of the predetermined object on the first screen includes: displaying an object presentation image of the predetermined object on the first screen, the object presentation image including a configuration object image of the predetermined object, and highlighting the configured predetermined object image and its object information on the configuration object image; or displaying a predetermined object image and its object information on the first screen.
Optionally, the operation instruction further includes a viewpoint and/or view angle conversion instruction for converting the viewpoint and/or view angle corresponding to the second image in the three-dimensional virtual space, wherein the step of performing processing corresponding to the operation instruction on the second image displayed on the second screen includes: adjusting the display content of the second image based on the conversion of the viewpoint and/or view angle.
Optionally, the first image and the second image are derived based on the same image data source, which is formed based on a virtual viewpoint reconstruction technique.
Optionally, the viewpoint and/or view angle conversion instruction is issued by a touch operation on the first screen; or the viewpoint and/or view angle conversion instruction is issued by sensing movement and/or rotation of the first device; or the viewpoint and/or view angle conversion instruction is issued by sensing, with the camera of the first device, a change in position and/or posture of the user's head or body.
Optionally, the operation instruction is a voice operation instruction received by the first device, and the method further includes: recognizing the voice operation instruction.
Optionally, the first image and the second image are video images, and the method further includes: in response to the video image playing to a predetermined image frame or a predetermined point in time, a game associated with the video image is initiated on the first device while the video image continues to play on the second screen.
Optionally, the method may further include: displaying information associated with the game superimposed on the second screen.
Optionally, the method may further include: starting the game on the first device while continuing to display the second image on the second screen; and displaying information associated with the game superimposed on the second screen.
Optionally, the method may further include: displaying a map on a first screen, wherein a plurality of areas on the map respectively correspond to different second images; in response to a selection of an area on the map displayed on the first screen, switching to a second image corresponding to the selected area on the second screen.
Optionally, the second image is a video image, and the method further comprises: changing the map displayed on the first screen correspondingly as the video image is played.
Optionally, for the same content, a plurality of images corresponding to different viewing angles and/or different camera positions are provided, and the method further comprises: in response to a switching instruction received by the first device, switching the viewing angle and/or camera position corresponding to the first image and/or the second image, so that images corresponding to different viewing angles and/or different camera positions are displayed on the first screen and the second screen.
Optionally, the method may further include: receiving input bullet-screen comment information at the first device; and displaying the bullet-screen information from the first device and/or bullet-screen information from other users superimposed on the second image.
Optionally, the method may further include: receiving, by the second device, a second operation instruction issued based on the second image; and performing, on the first image displayed on the first screen, processing corresponding to the second operation instruction.
Optionally, the second operation instruction includes at least one of the following: an operation instruction issued using a controller of the second device; an operation instruction issued through a touch operation on the second screen; a voice operation instruction received by the second device; an operation instruction issued by sensing, with the camera of the second device, a change in position and/or posture of the user's head or body.
According to a second aspect of the present disclosure, there is provided a display device including: first display control means for displaying a first image on a first screen of a first device; second display control means for associating a second device with the first device so that a second image is displayed on a second screen of the second device, the second image corresponding to the same video content or video scene as the first image; and instruction receiving means for receiving an operation instruction issued at the first device based on the first image, wherein the second display control means performs, on the second image displayed on the second screen, processing corresponding to the operation instruction.
According to a third aspect of the present disclosure, there is provided a multi-screen linkage system including: a first device including a first screen; and a second device including a second screen, wherein a first image is displayed on the first screen; the second device is associated with the first device so that a second image is displayed on the second screen, the second image corresponding to the same video content or video scene as the first image; the first device receives an operation instruction issued based on the first image; and the second device performs, on the second image displayed on the second screen, processing corresponding to the operation instruction.
According to a fourth aspect of the present disclosure, there is provided a computing device comprising: a processor; and a memory having executable code stored thereon, which when executed by the processor, causes the processor to perform the method as described in the first aspect above.
According to a fifth aspect of the present disclosure, there is provided a non-transitory machine-readable storage medium having stored thereon executable code which, when executed by a processor of an electronic device, causes the processor to perform the method as described in the first aspect above.
In this way, more interaction schemes can be provided between the large screen and the small screen, making better use of their respective advantages and further improving the user experience.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing in greater detail exemplary embodiments thereof with reference to the attached drawings, in which like reference numerals generally represent like parts throughout.
FIG. 1 is a schematic view of a multi-screen linkage system.
Fig. 2 is a schematic block diagram of a display device according to the present disclosure.
Fig. 3 is a schematic flow diagram of a display method according to the present disclosure.
Fig. 4 is a schematic diagram showing the same contents displayed on the first screen and the second screen.
Fig. 5 is a schematic diagram showing an image in a partial area range on a first screen on a second screen.
Fig. 6 is a schematic diagram of two-level superimposition and enlargement of an image in a partial range on a first screen on a second screen.
Fig. 7 is a schematic diagram in which an image on a first screen is scaled while an image displayed on a second screen is unchanged.
Fig. 8 is a schematic diagram showing object information of an object selected on the first screen on the second screen.
Fig. 9 is a schematic diagram showing shopping guide information for a selected object on a first screen.
Fig. 10 is a schematic diagram of a game being played on a first screen while images continue to be displayed on a second screen.
Fig. 11 is a schematic diagram of inputting a bullet-screen comment on a first screen and displaying bullet-screen comments on a second screen.
Fig. 12 is a schematic diagram of switching the second image displayed on the second screen based on the map displayed on the first screen.
Fig. 13 is a schematic structural diagram of a computing device that can be used to implement the display method according to an embodiment of the present invention.
Detailed Description
Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
First, a multi-screen linkage system to which the display control scheme of the present disclosure can be applied will be briefly described with reference to fig. 1.
FIG. 1 is a schematic view of a multi-screen linkage system.
As shown in fig. 1, a multi-screen linkage system may include a first device 100 and a second device 200. For example, the first device 100 may be a mobile phone, and the second device may be a television or a projector, etc.
The first device 100 may include a first screen 110.
The second device 200 may include a second screen 210. The first screen 110 may be relatively small for handheld operation, while the second screen 210 may be relatively large for more clear viewing of the displayed content.
In the multi-screen linkage scheme according to the present disclosure, a first image is displayed on the first screen 110, and the second device 200 is associated with the first device 100 such that a second image displayed on the second screen 210 corresponds to the same video content or video scene as the first image.
The first device 100 may receive an operation instruction issued based on the first image.
The second device 200 performs a process corresponding to the operation instruction on the second image displayed on the second screen 210.
A display device and a display method according to the present disclosure are described below with reference to fig. 2 and 3.
Fig. 2 is a schematic block diagram of a display device according to the present disclosure.
Fig. 3 is a schematic flow diagram of a display method according to the present disclosure.
As shown in fig. 2, the display device 10 according to the present disclosure may include a first display control device 11, a second display control device 12, and an instruction receiving device 13.
As shown in fig. 3, in step S21, a first image may be displayed on the first screen 110 of the first device 100, for example, by the first display control apparatus 11.
In step S22, the second device 200 may be associated with the first device 100, for example, by the second display control apparatus 12, so that the second image is displayed on the second screen 210 of the second device 200.
The second image corresponds to the same video content or video scene as the first image.
Here, the second image may be obtained based on the first image. For example, the second image may be the same as the first image, or a partially enlarged image of the first image, for example.
Alternatively, the first image and the second image may be derived from the same image data source. For example, they may be projections, from different viewing angles and/or viewpoints, of three-dimensional image data of the same scene; or, for the same scene such as a basketball game, they may be images taken from different perspectives such as the backboard view, an overhead view, the spectator view, or a substitute player's view.
References to "images" in this disclosure include static images such as pictures, as well as dynamic images such as video and animation.
In step S23, an operation instruction issued at the first device 100 based on the first image may be received, for example, by the instruction receiving means 13.
In step S24, the second image displayed on the second screen may be subjected to processing corresponding to the operation instruction by the second display control device 12, for example.
Thus, images corresponding to the same video content or video scene may be displayed on the first screen and the second screen, yet the images displayed on the two screens may differ. In response to a user operation on the first device with respect to the first image, such as a local enlargement or an object selection, the second image displayed on the second screen 210 may be processed accordingly and change, while the first image displayed on the first screen 110 may remain unchanged.
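To make the shape of steps S21 to S24 concrete, the following is a minimal Python sketch; all class and method names are illustrative assumptions rather than the disclosed implementation. It captures the key property described above: an operation instruction issued on the first device changes only how the second screen presents the shared content, while the first image itself may remain unchanged.

```python
class SecondDevice:
    """Large-screen side of the linkage (e.g., a television)."""

    def __init__(self):
        self.presentation = {"mode": "full_frame"}  # how the second image is shown

    def process(self, instruction):
        # S24: perform the processing corresponding to the operation instruction
        # on the second image; here we simply record the requested presentation.
        self.presentation = dict(instruction)


class FirstDevice:
    """Small-screen side (e.g., a mobile phone)."""

    def __init__(self):
        self.linked = None
        self.marked_region = None  # marking drawn over the first image

    def associate(self, second):
        # S22: associate the devices so both show the same content or scene.
        self.linked = second

    def issue(self, instruction):
        # S23: an operation instruction issued based on the first image.
        # The first image itself need not change; only its marking does.
        if instruction.get("type") == "local_enlarge":
            self.marked_region = instruction["region"]
        self.linked.process(instruction)


phone, tv = FirstDevice(), SecondDevice()
phone.associate(tv)  # S21/S22: first image on the phone, second image on the TV
phone.issue({"type": "local_enlarge", "region": (0.25, 0.25, 0.2, 0.2)})
print(tv.presentation)  # S24 has taken effect on the second screen only
```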
It should be understood that the image changes discussed in this disclosure are changes in how the video content or video scene is displayed, not the changes that the content or scene corresponding to the first and second images itself undergoes as it advances over time.
A variety of interaction modes that may be performed between the first screen 110 and the second screen 210 according to the present disclosure are described in detail below with reference to figs. 4 to 12.
In the drawings, the displayed image is represented by a fill pattern.
It should be understood that the first device 100 may display in either landscape or portrait orientation. Figs. 4 to 8 are described using the landscape mode as an example, and figs. 9 to 11 using the portrait mode.
It should also be understood that on both the first screen 110 and the second screen 210, only a portion of the screen area may be used to display the first image and the second image.
Fig. 4 is a schematic diagram showing the same content on the first screen 110 and the second screen 210.
At this time, a common screen-projection display interaction scheme may be adopted between the first screen 110 and the second screen 210.
Fig. 5 is a schematic diagram showing an image in a partial area range on the first screen 110 on the second screen 210.
Here, the operation instruction may include a local enlargement instruction for the local area range 120 on the first image.
Then, in response to the local enlargement instruction, the local area range 120 may be marked on the first image displayed on the first screen 110. For example, a selection box may be displayed, or the local area range 120 may be highlighted, to show that the image within this range will be displayed enlarged on the second screen 210.
Meanwhile, in response to the local enlargement instruction, the local image within the local area range 120 of the first image is enlarged and displayed on the second screen 210.
In addition, the first device 100 may further receive a local area range adjustment instruction as the above operation instruction.
In response to the local area range adjustment instruction, the local area range 120 marked on the first image may be moved and/or scaled on the first screen 110.
Accordingly, as the local area range 120 is moved and/or zoomed on the first image, the local image displayed enlarged on the second screen 210 may be adjusted so that it continues to correspond to the local area range 120.
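As a concrete illustration of this correspondence, here is a minimal Python sketch, assuming the local area range is tracked in coordinates normalized to the first image (the function names and the clamping policy are illustrative assumptions, not the disclosed implementation): the marked range maps to a pixel crop of the shared source frame, which the second screen scales up, and adjustment instructions move or zoom the range while keeping it inside the frame.

```python
def region_to_crop(region, frame_w, frame_h):
    """Map a local area range (x, y, w, h, normalized to 0..1 on the first
    image) to a pixel rectangle of the shared source frame; the second
    screen then scales this crop up to fill its display area."""
    x, y, w, h = region
    return (round(x * frame_w), round(y * frame_h),
            round(w * frame_w), round(h * frame_h))


def adjust_region(region, dx=0.0, dy=0.0, scale=1.0):
    """Move and/or zoom the marked range in response to a local area range
    adjustment instruction, keeping the range inside the frame."""
    x, y, w, h = region
    cx, cy = x + w / 2, y + h / 2          # zoom about the range's center
    w = min(1.0, max(0.01, w * scale))
    h = min(1.0, max(0.01, h * scale))
    x = min(max(cx - w / 2 + dx, 0.0), 1.0 - w)   # translate, then clamp
    y = min(max(cy - h / 2 + dy, 0.0), 1.0 - h)
    return (x, y, w, h)


# Example: mark a quarter-size range, zoom in further, nudge it right,
# and compute the crop the second screen should enlarge from a 4K frame.
region = (0.25, 0.25, 0.25, 0.25)
region = adjust_region(region, dx=0.1, scale=0.5)
print(region_to_crop(region, frame_w=3840, frame_h=2160))
```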
In addition, a smaller range within the local area range 120 can be enlarged a second time, in a two-level enlargement.
Fig. 6 is a schematic diagram of a two-level superimposed enlargement, on the second screen 210, of an image in a partial range on the first screen 110.
Here, the first device 100 may further receive, as the above-described operation instruction, a partial enlargement instruction issued for the partial range 130 within the local area range 120 on the basis of the case shown in fig. 5.
Then, in response to the partial enlargement instruction, the partial range 130 is marked on the first image displayed on the first screen 110.
Accordingly, the partial image display frame 230 is superimposed on the partial image enlarged and displayed on the second screen 210, and the partial image within the partial range 130 in the first image is enlarged and displayed in the partial image display frame 230.
For example, the user is interested in the object 140 on the first image. After the local enlargement instruction is issued for the first time, the local image within the local area range 120 including the object 140 is enlarged and displayed on the second screen 210. It is thus seen that object 140 is an earring.
The user then wishes to see further details of this earring, selects the partial range 130 on the first screen, and issues a partial enlargement instruction. A partial image display box 230, in which the object 140 is displayed enlarged as object image 240, can then be superimposed on the second screen.
Conversely, the first image on the first screen 110 may be zoomed without any change to the second image displayed on the second screen 210.
Fig. 7 is a schematic diagram in which an image on the first screen 110 is scaled while an image displayed on the second screen 210 is unchanged.
Here, the operation instruction may be a local zoom instruction for the first image.
In response to the local zoom instruction, the first image displayed on the first screen 110 is zoomed, while the second image displayed on the second screen 210 is not correspondingly zoomed.
As shown in fig. 7, on the second image on the second screen 210, the image in the area 250 corresponds to the partial image displayed enlarged on the first screen 110.
The zoom processing described above with reference to figs. 5 to 7 may be performed while a video, for example a television series, is playing. The same scheme may be used in scenes such as live video or City Brain dashboards to zoom a local area, for example to enlarge the image of that area.
Fig. 8 is a schematic diagram showing object information of a selected object on the first screen 110 on the second screen 210.
The interactive scheme shown in fig. 8 may be performed in a case where the same image is displayed on the first screen 110 and the second screen 210 (fig. 4), or in a case where the image displayed on the second screen 210 is zoomed (fig. 5 and 6) or in a case where the image displayed on the first screen 110 is zoomed (fig. 7).
Here, the operation instruction may include an identification instruction issued for the object image 160 or an object area on the first image.
Object information 265, obtained by identifying the object image or the object image within the object area, is then displayed superimposed on the second image on the second screen 210 together with the object image 260. For example, the object may be identified and displayed as an earring.
In addition, object association information derived from the object information may also be displayed on the second screen 210, for example, which film and television works or which characters have featured similar earrings.
It should be understood that object information or object association information may also be displayed on the first screen 110.
In addition, as shown in fig. 9, shopping guide information 165 for the selected object may also be displayed on the first screen 110.
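A minimal sketch of this identification flow, assuming a generic recognizer callback and toy screen objects (all names are illustrative): the object region selected on the first image is cropped from the shared frame, recognized, and the resulting object information is superimposed on the second image while shopping-guide information is surfaced on the first screen.

```python
class Screen:
    """Toy stand-in for a device screen."""

    def __init__(self, name):
        self.name = name

    def overlay(self, **kwargs):
        print(self.name, "overlays:", kwargs)

    def show_shopping_guide(self, info):
        print(self.name, "shows shopping guide for:", info)


def handle_identify(object_region, frame, recognize, phone, tv):
    """Handle an identification instruction for an object region on the first image."""
    x, y, w, h = object_region                       # pixel rect in the shared frame
    crop = [row[x:x + w] for row in frame[y:y + h]]  # the object image
    info = recognize(crop)                           # e.g. {"name": "earring"}
    tv.overlay(object_info=info)                     # superimpose on the second image
    phone.show_shopping_guide(info)                  # guide info on the first screen


frame = [[0] * 8 for _ in range(8)]                  # toy 8x8 "frame"
handle_identify((2, 2, 3, 3), frame,
                lambda crop: {"name": "earring"},
                Screen("phone"), Screen("tv"))
```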
In addition, as described above, the first image and the second image may be video images. Object information of a predetermined object may be displayed on the first screen 110 in response to a video image being played to a video frame containing the predetermined object.
Here, an object presentation image of the predetermined object may be displayed on the first screen 110, for example, an image of a model wearing the corresponding apparel, made up with the corresponding makeup effect, or wearing the corresponding jewelry or props.
The object presentation image includes a configuration object image (e.g., the model image) of the predetermined object, and the configured predetermined object image and its object information are highlighted on that configuration object image.
Alternatively, an image of the predetermined object and its object information may be displayed directly on the first screen 110, such as an earring image and its corresponding information.
In addition, while the second image is displayed on the second screen 210, game interaction may also be performed on the first screen 110.
Fig. 10 is a schematic diagram of a game being played on the first screen 110 while images continue to be displayed on the second screen 210.
The first image and the second image are video images. In response to the video image being played to a predetermined image frame or a predetermined point in time, a game associated with the video image may be initiated on the first device 100 while the video image continues to be played on the second screen 210.
At this time, information 270 associated with the game may be displayed superimposed on the second screen 210.
For example, the information 270 may be game invitation information, such as a system prompt that a game may be played on the first device 100 at this time, or that other users have invited to join the game, etc.
Alternatively, the information 270 may be game progress information, such as a prompt to enter a game, to be in progress, to end, etc.
Alternatively, the game played on the first device 100 may be unrelated to the second image displayed on the second screen. In that case, the game is started on the first device 100 while the second image continues to be displayed on the second screen 210, and information associated with the game may then be displayed superimposed on the second screen 210. In this way, game prompt information can be shown on the second device while the user plays on the first device 100.
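One way such a trigger might be wired, sketched here with an assumed mapping from playback time points to game identifiers (the names and thresholds are illustrative): when the video reaches a predetermined point, the game is launched on the first device while playback continues on the second screen, which only overlays a game prompt.

```python
class Phone:
    def start_game(self, game_id):
        print("phone: launching game", game_id)


class Tv:
    def overlay_banner(self, text):
        print("tv banner:", text)


GAME_TRIGGERS = {125.0: "trivia_round_1"}  # playback second -> game id (assumed)
started = set()


def on_playback_tick(position_s, phone, tv):
    """Called periodically with the current playback position, in seconds."""
    for t, game_id in GAME_TRIGGERS.items():
        if position_s >= t and game_id not in started:
            started.add(game_id)
            phone.start_game(game_id)  # the game starts on the first device
            tv.overlay_banner(         # while the video keeps playing on the
                f"Game '{game_id}' started on your phone")  # second screen


phone, tv = Phone(), Tv()
on_playback_tick(124.9, phone, tv)  # before the trigger: nothing happens
on_playback_tick(125.1, phone, tv)  # trigger fires exactly once
```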
In addition, bullet-screen interaction may also be performed between the first screen 110 and the second screen 210.
Fig. 11 is a schematic diagram of inputting a bullet-screen comment on the first screen 110 and displaying bullet-screen comments on the second screen 210.
The entered bullet-screen comment, such as "bullet screen AAA", may be received at the first device 100, for example in the bullet-screen input field 180 on the first screen 110.
The bullet-screen comments 280, whether "bullet screen AAA" from the first device 100 or "bullet screen BBB" and "bullet screen CCC" from other users, may then be displayed superimposed on the second image.
The bullet-screen comments need not be superimposed on the first image displayed on the first screen 110 of the first device 100.
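A minimal sketch of this relay, with the merging policy as an assumption: comments entered on the first device are combined with comments from other viewers, and the merged stream is drawn over the second image only, leaving the first image clean.

```python
import itertools


def merge_danmu(local_comments, remote_comments, limit=50):
    """Interleave the user's own bullet-screen comments with those from
    other viewers; the merged list is drawn over the second image only."""
    merged = list(itertools.chain(local_comments, remote_comments))
    return merged[-limit:]  # keep only the most recent comments on screen


# Comment entered in field 180 on the phone, plus comments from other users:
tv_overlay = merge_danmu(["bullet screen AAA"],
                         ["bullet screen BBB", "bullet screen CCC"])
print(tv_overlay)  # rendered over the second image; the first image stays clean
```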
In addition, where the video content is divided into a plurality of scenes, or separate video content exists for each of a plurality of places, the image content to be displayed may also be selected through a map.
Fig. 12 is a schematic diagram for switching the second image displayed on the second screen 210 based on the map displayed on the first screen 110.
A map may be displayed on the first screen 110, and a plurality of areas (e.g., area a to area I) on the map correspond to different second images, respectively.
For example, in a television series, video content may be shot separately for a plurality of areas, for example a number of rooms, with different characters simultaneously active in different areas. As another example, in a sporting event, different regions host different competition scenes. As another example, in a variety show, different participants are active in different areas. As another example, in a joint performance held across multiple venues, different performance areas host different programs. Through the map displayed on the first screen 110, the user can select the video content of the area he or she currently wishes to view.
Accordingly, in response to selection of a region, for example region D, on the map displayed on the first screen 110, the second screen 210 may switch to the second image corresponding to the selected region D.
Furthermore, as the video image plays, a character in region D may move to another region, for example to region I at the bottom edge of the map currently displayed on the first screen 110. The neighborhood of region I then changes, and some regions adjacent to it fall outside the displayed map range. Accordingly, the map displayed on the first screen 110 may be updated so that more of the regions adjacent to region I are shown.
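A sketch of the map-driven switching described above, with the region-to-stream bindings and adjacency table as illustrative assumptions: each map region is bound to its own second-image stream, selecting a region on the first screen switches the second screen to that stream, and the map re-centers as a character moves between regions.

```python
class Tv:
    def play(self, url):
        print("second screen now playing:", url)


REGION_STREAMS = {"A": "stream://room-a", "D": "stream://room-d",
                  "I": "stream://room-i"}                          # assumed bindings
MAP_NEIGHBORS = {"D": ["A", "E", "G", "I"], "I": ["D", "F", "H"]}  # toy adjacency


class MapController:
    """Runs on the first device; the map it shows drives the second screen."""

    def __init__(self, tv):
        self.tv = tv
        self.visible = ["A", "B", "C", "D"]  # regions currently on the map

    def select(self, region_id):
        # Selecting a region on the first screen switches the second image.
        self.tv.play(REGION_STREAMS[region_id])

    def follow(self, region_id):
        # As the video plays and a character moves to a new region,
        # re-center the map so more of that region's neighbours are shown.
        self.visible = [region_id] + MAP_NEIGHBORS.get(region_id, [])


ctrl = MapController(Tv())
ctrl.select("D")  # the user taps region D on the phone's map
ctrl.follow("I")  # a character moves into region I; the map re-centers
```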
In addition, the operation instruction received by the first device 100 may further include a viewpoint and/or view angle conversion instruction for converting the viewpoint and/or view angle corresponding to the second image in three-dimensional virtual space.
Accordingly, based on the conversion of the viewpoint and/or the angle of view, the display content of the second image may be adjusted.
Here, the first image and the second image may be obtained from the same image data source. For example, the image data source may be formed based on a virtual viewpoint reconstruction technique such as depth-image-based rendering (DIBR, also called 3D image warping).
The viewpoint and/or view angle conversion instruction may be issued by a touch operation on the first screen 110.
Alternatively, it may be issued by sensing movement and/or rotation of the first device 100.
Or it may be issued by sensing, with the camera of the first device 100, a change in the position and/or posture of the user's head or body.
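How such a conversion instruction might drive the second image can be sketched with a simple orbit-style viewpoint; the parameterization below is an illustrative assumption, since the disclosure does not fix one. A touch drag, a sensed rotation of the first device, or a head movement seen by its camera all reduce to yaw/pitch/zoom deltas applied to the viewpoint from which the second image is rendered.

```python
import math


class OrbitViewpoint:
    """Viewpoint orbiting the scene center, used to render the second image."""

    def __init__(self, distance=5.0):
        self.yaw, self.pitch, self.distance = 0.0, 0.0, distance

    def apply_delta(self, d_yaw, d_pitch, d_zoom=0.0):
        # A touch drag on the first screen, a sensed rotation of the first
        # device, or a sensed head movement are all mapped to these deltas.
        self.yaw = (self.yaw + d_yaw) % (2 * math.pi)
        self.pitch = max(-math.pi / 2, min(math.pi / 2, self.pitch + d_pitch))
        self.distance = max(1.0, self.distance + d_zoom)

    def position(self):
        # Camera position handed to the renderer (virtual viewpoint synthesis).
        x = self.distance * math.cos(self.pitch) * math.sin(self.yaw)
        y = self.distance * math.sin(self.pitch)
        z = self.distance * math.cos(self.pitch) * math.cos(self.yaw)
        return (x, y, z)


vp = OrbitViewpoint()
vp.apply_delta(d_yaw=0.2, d_pitch=-0.1)  # e.g., a horizontal drag on screen one
print(vp.position())                      # the second image is re-rendered here
```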
In addition, for the same content, such as a basketball game, multiple images corresponding to different viewing angles and/or different camera positions may be provided, such as a backboard view, an overhead view, a spectator view, a substitute player's view, and the like.
In response to a switching instruction received by the first device 100, the viewing angle and/or camera position corresponding to the first image and/or the second image may be switched, so that images from different viewing angles and/or camera positions are displayed on the first screen 110 and the second screen 210.
For example, a first image from the backboard view may be displayed on the first screen 110 while a second image from the spectator view is displayed on the second screen 210.
Various interaction modes that may be implemented between the first screen 110 and the second screen 210 according to the present disclosure are described in detail above with reference to fig. 4 to 12.
The first device 100 may receive the operation instruction in various ways.
For example, the operation instruction may be received through a controller of the first device 100, or by sensing a touch operation on the first screen 110. Multiple touch operations corresponding to different operation instructions may be defined; they are not described in detail here.
In addition, the operation instruction may also be a voice operation instruction received by the first device 100. In this case, the voice operation instruction can be further recognized.
In addition, the operation instruction may also be an operation instruction obtained by sensing a change in position and/or posture of the head or body of the user by the camera of the first device 100. In this case, image recognition or gesture recognition may be further performed on the image sensed by the camera to acquire a corresponding operation instruction.
Additionally, in some scenarios, the second device 200 may receive operation instructions.
In this case, a second operation instruction issued based on the second image is received by the second device 200, and processing corresponding to the second operation instruction is then performed on the first image displayed on the first screen 110.
Reverse interaction, from the large screen to the small screen, can thus be realized.
Likewise, the second operation instruction may include at least one of the following instructions:
an operation instruction issued using the controller of the second device 200;
an operation instruction issued through a touch operation on the second screen 210;
a voice operation instruction received by the second device 200;
an operation instruction issued by sensing, with the camera of the second device 200, a change in the position and/or posture of the user's head or body.
In the embodiments of the disclosure, an innovative interaction mode is realized through linkage among multiple devices. For example, gesture touch operations such as zoom-in, zoom-out and pan can be performed on the small-screen device to control the content and size of what the large-screen device displays, enabling new forms of video consumption. The large screen can thereby offer a product experience close to that of the mobile phone, and for content consumption may even surpass it.
For example, pinching outward with two fingers on the mobile phone enlarges the corresponding content on the television; the phone displays the enlarged range of the picture, and pressing and dragging that range with two fingers moves the enlarged position, enabling close-up viewing and consumption of the content.
Fig. 13 is a schematic structural diagram of a computing device that can be used to implement the display method according to an embodiment of the present invention.
Referring to fig. 13, computing device 1300 includes a memory 1310 and a processor 1320.
Processor 1320 may be a multi-core processor or may include multiple processors. In some embodiments, processor 1320 may include a general-purpose host processor and one or more special-purpose coprocessors such as a graphics processing unit (GPU) or a digital signal processor (DSP). In some embodiments, processor 1320 may be implemented using custom circuits, such as an application-specific integrated circuit (ASIC) or a field-programmable gate array (FPGA).
The memory 1310 may include various types of storage units, such as system memory, read-only memory (ROM), and permanent storage. The ROM may store static data or instructions needed by the processor 1320 or other modules of the computer. The permanent storage may be a read-write storage device, and may be non-volatile so that stored instructions and data are not lost even after the computer is powered off. In some embodiments, the permanent storage is a mass storage device (e.g., a magnetic or optical disk, or flash memory). In other embodiments, the permanent storage may be a removable storage device (e.g., a floppy disk or optical drive). The system memory may be a read-write memory device or a volatile read-write memory device, such as dynamic random-access memory, and may store the instructions and data that some or all of the processors require at runtime. In addition, the memory 1310 may include any combination of computer-readable storage media, including various types of semiconductor memory chips (DRAM, SRAM, SDRAM, flash memory, programmable read-only memory) as well as magnetic and/or optical disks. In some embodiments, memory 1310 may include a readable and/or writable removable storage device, such as a compact disc (CD), a read-only digital versatile disc (e.g., DVD-ROM, dual-layer DVD-ROM), a read-only Blu-ray disc, an ultra-density optical disc, a flash memory card (e.g., SD card, miniSD card, Micro-SD card, etc.), or a magnetic floppy disk. Computer-readable storage media do not contain carrier waves or transitory electronic signals transmitted wirelessly or by wire.
The memory 1310 has stored thereon executable code that, when processed by the processor 1320, causes the processor 1320 to perform the display methods described above.
The display method and the display apparatus and the multi-screen linkage system according to the present invention have been described in detail above with reference to the accompanying drawings.
Furthermore, the method according to the invention may also be implemented as a computer program or computer program product comprising computer program code instructions for carrying out the steps defined in the above-described method of the invention.
Alternatively, the invention may also be embodied as a non-transitory machine-readable storage medium (or computer-readable storage medium, or machine-readable storage medium) having stored thereon executable code (or a computer program, or computer instruction code) which, when executed by a processor of an electronic device (or computing device, server, etc.), causes the processor to perform the steps of the above-described method according to the invention.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or combinations of both.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present invention, the foregoing description is exemplary, not exhaustive, and is not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, their practical application, or technical improvements over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (27)

1. A display method, comprising:
displaying a first image on a first screen of a first device;
associating a second device with the first device such that a second image is displayed on a second screen of the second device, the second image corresponding to the same video content or video scene as the first image;
receiving an operation instruction issued at the first device based on the first image;
and performing, on the second image displayed on the second screen, processing corresponding to the operation instruction.
2. The method of claim 1, wherein,
the second image is obtained based on the first image; or
the first image and the second image are derived based on the same image data source.
3. The method of claim 1, wherein,
the operation instruction includes a local enlargement instruction for a local area range on the first image,
the method further comprises the following steps: labeling the local region range on a first image displayed on a first screen in response to the local enlargement instruction,
wherein the step of performing processing corresponding to the operation instruction on the second image displayed on the second screen includes: and magnifying and displaying the local image in the local area range in the first image on a second screen.
4. The method of claim 3, wherein,
the operation instruction further includes a local area range adjustment instruction,
the method further comprising: moving and/or zooming the local area range marked on the first image in response to the local area range adjustment instruction,
wherein the step of performing processing corresponding to the operation instruction on the second image displayed on the second screen includes: as the local area range is moved and/or zoomed on the first image, adjusting the local image enlarged on the second screen accordingly so that it corresponds to the local area range.
5. The method of claim 3, wherein,
the operation instruction further includes a partial enlargement instruction issued for a partial range within the local area range,
the method further comprising: marking the partial range on the first image displayed on the first screen in response to the partial enlargement instruction,
wherein the step of performing processing corresponding to the operation instruction on the second image displayed on the second screen includes: superimposing a partial image display frame on the local image enlarged on the second screen, and enlarging and displaying, within the partial image display frame, the partial image in the partial range of the first image.
6. The method of any one of claims 1 to 5,
the operation instruction further includes a local zoom instruction for the first image,
the method further comprising: in response to the local zoom instruction, performing zoom processing on the first image displayed on the first screen without performing corresponding zoom processing on the second image displayed on the second screen.
7. The method of any one of claims 1 to 5,
the operation instruction further includes an identification instruction issued for an object image or an object region on the first image,
the method further comprising: superimposing, on the second image displayed on the second screen, object information obtained by identifying the object image or the object image within the object region.
8. The method of claim 7, further comprising:
displaying shopping guide information associated with the object information on a first screen; and/or
displaying, on the second screen, object associated information obtained based on the object information.
9. The method of any of claims 1 to 5, wherein the first and second images are video images, the method further comprising:
and displaying object information of a predetermined object on a first screen in response to the video image being played to a video frame containing the predetermined object.
10. The method of claim 9, wherein the displaying of the object information of the predetermined object on the first screen comprises:
displaying an object presentation image of the predetermined object on a first screen, the object presentation image including a configuration object image of the predetermined object, and highlighting the configured predetermined object image and object information thereof on the configuration object image; or
displaying a predetermined object image and its object information on the first screen.
11. The method of any one of claims 1 to 5,
the operation instruction further comprises a viewpoint and/or view angle conversion instruction for converting the viewpoint and/or view angle corresponding to the second image in the three-dimensional virtual space,
wherein the step of performing processing corresponding to the operation instruction on the second image displayed on the second screen includes: adjusting the display content of the second image based on the transformation of the viewpoint and/or the viewing angle.
12. The method of claim 11, wherein,
the first image and the second image are derived based on the same image data source, which is formed based on a virtual viewpoint reconstruction technique.
13. The method of claim 11, wherein,
the observation point and/or visual angle conversion instruction is sent out through touch operation on a first screen; or
The viewpoint and/or perspective transformation instructions are issued by sensing movement and/or rotation of the first device; or
The viewpoint and/or perspective conversion instruction is issued by sensing a change in position and/or posture of the head or body of the user using a camera of the first device.
14. The method of any one of claims 1 to 5,
the operation instruction is a voice operation instruction received by the first device,
the method further comprises the following steps: and recognizing the voice operation instruction.
15. The method of claim 1, wherein the first and second images are video images, the method further comprising:
in response to the video image playing to a predetermined image frame or a predetermined point in time, a game associated with the video image is initiated on the first device while the video image continues to play on the second screen.
16. The method of claim 15, further comprising:
and displaying information related to the game on a second screen in an overlapping manner.
17. The method of claim 1, further comprising:
starting the game on the first device while continuing to display the second image on the second screen; and
and displaying information related to the game on a second screen in an overlapping manner.
18. The method of claim 1, further comprising:
displaying a map on a first screen, wherein a plurality of areas on the map respectively correspond to different second images;
in response to a selection of an area on the map displayed on the first screen, switching to a second image corresponding to the selected area on the second screen.
19. The method of claim 18, wherein the second image is a video image, the method further comprising:
and correspondingly changing the map displayed on the first screen along with the playing of the video image.
20. The method of claim 1, wherein a plurality of images corresponding to different viewing angles and/or different camera positions are provided for the same content, the method further comprising:
in response to a switching instruction received by the first device, switching the viewing angle and/or camera position corresponding to the first image and/or the second image, so that images corresponding to different viewing angles and/or different camera positions are displayed on the first screen and the second screen.
21. The method of claim 1, further comprising:
receiving input bullet screen (danmaku) information on the first device; and
displaying the bullet screen information from the first device and/or bullet screen information from other users overlaid on the second image.
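For illustration only: a minimal Kotlin sketch of claim 21, merging bullet screen messages typed on the first device with messages from other users before they are overlaid on the second image; Danmaku and overlayOnSecondImage are hypothetical names.

```kotlin
// One bullet screen (danmaku) message.
data class Danmaku(val timestampMs: Long, val user: String, val text: String)

// Merge local and remote messages in timestamp order for the overlay renderer.
fun overlayOnSecondImage(local: List<Danmaku>, remote: List<Danmaku>): List<Danmaku> =
    (local + remote).sortedBy { it.timestampMs }

fun main() {
    val local = listOf(Danmaku(1000, "me", "great goal!"))     // typed on the first device
    val remote = listOf(Danmaku(800, "user42", "what a save")) // from other users
    overlayOnSecondImage(local, remote).forEach { println("${it.user}: ${it.text}") }
}
```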
22. The method of claim 1, further comprising:
receiving, by the second device, a second operation instruction issued based on the second image; and
performing processing corresponding to the second operation instruction on the first image displayed on the first screen.
23. The method of claim 22, wherein the second operation instruction comprises at least one of:
an operation instruction issued using a controller of the second device;
an operation instruction issued through a touch operation on the second screen;
a voice operation instruction received by the second device; and
an operation instruction issued by sensing a change in the position and/or posture of the user's head or body using a camera of the second device.
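For illustration only: a minimal Kotlin sketch of the bidirectional control in claims 22 and 23, in which an instruction issued on either device is applied to the image shown on the other; the message shape is an assumption, not the patent's protocol.

```kotlin
enum class Source { FIRST_DEVICE, SECOND_DEVICE }

// A device-agnostic instruction: where it came from and what to do.
data class Instruction(val source: Source, val action: String)

// Route each instruction to the peer screen's image.
fun route(instr: Instruction) = when (instr.source) {
    Source.FIRST_DEVICE -> println("apply '${instr.action}' to the second image on the second screen")
    Source.SECOND_DEVICE -> println("apply '${instr.action}' to the first image on the first screen")
}

fun main() {
    route(Instruction(Source.SECOND_DEVICE, "highlight player 10")) // the claim 22 direction
}
```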
24. A display device, comprising:
first display control means for displaying a first image on a first screen of a first device;
second display control means for associating a second device with the first device so that a second image is displayed on a second screen of the second device, the second image corresponding to the same video content or video scene as the first image; and
instruction receiving means for receiving an operation instruction issued at the first device based on the first image,
wherein the second display control means performs processing corresponding to the operation instruction on the second image displayed on the second screen.
25. A multi-screen linkage system, comprising:
a first device including a first screen; and
a second device including a second screen,
wherein a first image is displayed on the first screen,
the second device is associated with the first device such that a second image is displayed on the second screen, the second image corresponding to the same video content or video scene as the first image,
the first device receives an operation instruction issued based on the first image,
and the second device performs processing corresponding to the operation instruction on the second image displayed on the second screen.
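For illustration only: a minimal end-to-end Kotlin sketch of the linkage in claims 24 and 25, in which the first device displays a first image, is associated with a second device displaying a second image of the same content, and forwards operation instructions to it; all class names are hypothetical.

```kotlin
class SecondDevice {
    fun display(image: String) = println("second screen: $image")
    fun process(instruction: String) = println("second screen processes: $instruction")
}

class FirstDevice(private val peer: SecondDevice) { // association with the second device
    fun display(image: String) = println("first screen: $image")
    // Instruction receiving: an operation issued against the first image is
    // forwarded so the peer processes it on the second image.
    fun onUserInstruction(instruction: String) = peer.process(instruction)
}

fun main() {
    val second = SecondDevice()
    val first = FirstDevice(second)              // associate the two devices
    first.display("panoramic view of the scene") // first image
    second.display("close-up of the same scene") // second image, same content
    first.onUserInstruction("rotate viewpoint 15 degrees to the left")
}
```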
26. A computing device, comprising:
a processor; and
a memory having executable code stored thereon, which, when executed by the processor, causes the processor to perform the method of any one of claims 1 to 23.
27. A non-transitory machine-readable storage medium having executable code stored thereon, which, when executed by a processor of an electronic device, causes the processor to perform the method of any one of claims 1 to 23.
CN202011491190.3A 2020-12-16 2020-12-16 Display method, display device and multi-screen linkage system Pending CN113296721A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011491190.3A CN113296721A (en) 2020-12-16 2020-12-16 Display method, display device and multi-screen linkage system

Publications (1)

Publication Number Publication Date
CN113296721A (en) 2021-08-24

Family

ID=77318705

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011491190.3A Pending CN113296721A (en) 2020-12-16 2020-12-16 Display method, display device and multi-screen linkage system

Country Status (1)

Country Link
CN (1) CN113296721A (en)

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060179127A1 (en) * 2005-02-07 2006-08-10 Stephen Randall System and Method for Location-based Interactive Content
EP2071578A1 (en) * 2007-12-13 2009-06-17 Sony Computer Entertainment Europe Ltd. Video interaction apparatus and method
US20110239142A1 (en) * 2010-03-25 2011-09-29 Nokia Corporation Method and apparatus for providing content over multiple displays
CN102822787A (en) * 2010-03-25 2012-12-12 诺基亚公司 Method and apparatus for providing content over multiple displays
CN102938834A (en) * 2012-11-21 2013-02-20 北京佳讯飞鸿电气股份有限公司 Multiple-screen multiple-video micro-view scheduling system and scheduling operation method
US20140282677A1 (en) * 2013-03-12 2014-09-18 Cbs Interactive Inc. Second screen application linked to media content delivery
US20140333671A1 (en) * 2013-05-10 2014-11-13 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
CN103561315A (en) * 2013-10-21 2014-02-05 华为技术有限公司 Multi-screen interactive method, device and system
US20150181294A1 (en) * 2013-12-19 2015-06-25 Electronics And Telecommunications Research Institute Method and system for providing and receiving multi-screen based content
CN103916697A (en) * 2014-04-04 2014-07-09 深圳市同洲电子股份有限公司 Multi-application display method and relevant intelligent terminals
CN104333650A (en) * 2014-09-12 2015-02-04 深圳市中兴移动通信有限公司 Screen sharing method, system and mobile terminal
CN105573688A (en) * 2014-10-10 2016-05-11 广州杰赛科技股份有限公司 Multi-screen interoperation method based on image capture
CN105808091A (en) * 2014-12-31 2016-07-27 阿里巴巴集团控股有限公司 Apparatus and method for adjusting distribution range of interface operation icons and touch screen device
CN106034178A (en) * 2015-03-18 2016-10-19 阿里巴巴集团控股有限公司 Application switching method on intelligent terminal and apparatus thereof
JP2018060083A (en) * 2016-10-06 2018-04-12 キヤノン株式会社 Display device, display method, and display system
US20180301121A1 (en) * 2017-04-13 2018-10-18 Samsung Electronics Co., Ltd. Electronic apparatus and method for displaying contents thereof
CN107589848A * 2017-09-25 2018-01-16 京东方科技集团股份有限公司 Interactive display method, terminal and interactive display system
CN208638380U * 2018-07-04 2019-03-22 厦门声连网信息科技有限公司 Multi-screen interaction system and interactive screen device
CN110363733A * 2019-06-05 2019-10-22 阿里巴巴集团控股有限公司 Mixed image generation method and device
CN113010136A (en) * 2021-05-24 2021-06-22 全时云商务服务股份有限公司 Method and system for intelligently amplifying shared desktop and readable storage medium
CN114143586A (en) * 2021-11-30 2022-03-04 深圳康佳电子科技有限公司 Split screen display sharing method, system, storage medium and intelligent screen

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113986171A (en) * 2021-10-26 2022-01-28 交控科技股份有限公司 Multi-screen cluster control method and system in rail transit station
CN114189696A (en) * 2021-11-24 2022-03-15 阿里巴巴(中国)有限公司 Video playing method and device
CN114189696B (en) * 2021-11-24 2024-03-08 阿里巴巴(中国)有限公司 Video playing method and device
CN114442893A (en) * 2022-01-17 2022-05-06 北京翠鸟视觉科技有限公司 Image display method of near-eye display system and near-eye display system
CN114827688A (en) * 2022-02-16 2022-07-29 北京优酷科技有限公司 Content display method and device and electronic equipment
CN114827688B (en) * 2022-02-16 2024-01-09 北京优酷科技有限公司 Content display method and device and electronic equipment

Similar Documents

Publication Publication Date Title
CN113296721A (en) Display method, display device and multi-screen linkage system
TWI530157B (en) Method and system for displaying multi-view images and non-transitory computer readable storage medium thereof
US20200288098A1 (en) Method, apparatus, medium, terminal, and device for multi-angle free-perspective interaction
CN106331732B (en) Method and device for generating and displaying panoramic content
JP5406813B2 (en) Panorama image display device and panorama image display method
CN106792228B (en) Live broadcast interaction method and system
US11533438B2 (en) Method to configure a virtual camera path
US20100153847A1 (en) User deformation of movie character images
US20080246759A1 (en) Automatic Scene Modeling for the 3D Camera and 3D Video
WO2022002181A1 (en) Free viewpoint video reconstruction method and playing processing method, and device and storage medium
US9129657B2 (en) Video image display apparatus, video image display method, non-transitory computer readable medium, and video image processing/display system for video images of an object shot from multiple angles
WO2017032336A1 (en) System and method for capturing and displaying images
JP2021002288A (en) Image processor, content processing system, and image processing method
CN111970532A (en) Video playing method, device and equipment
EP3503101A1 (en) Object based user interface
JP6628343B2 (en) Apparatus and related methods
CN114327700A (en) Virtual reality equipment and screenshot picture playing method
JP2021034885A (en) Image generation device, image display device, and image processing method
US20230353717A1 (en) Image processing system, image processing method, and storage medium
CN110730340B (en) Virtual audience display method, system and storage medium based on lens transformation
WO2020206647A1 (en) Method and apparatus for controlling, by means of following motion of user, playing of video content
KR102073230B1 (en) Apparatus for playing VR video to improve quality of specific area
CN112288877A (en) Video playing method and device, electronic equipment and storage medium
KR20120035322A (en) System and method for playing contents of augmented reality
US20200225467A1 (en) Method for projecting immersive audiovisual content

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210824)