CN114500846B - Live action viewing angle switching method, device, equipment and readable storage medium - Google Patents

Live action viewing angle switching method, device, equipment and readable storage medium

Info

Publication number
CN114500846B
CN114500846B (application CN202210130893.6A)
Authority
CN
China
Prior art keywords
visual angle
shooting
angle
view
view angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210130893.6A
Other languages
Chinese (zh)
Other versions
CN114500846A (en)
Inventor
刘威
夏勇峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Beehive Century Technology Co ltd
Original Assignee
Beijing Beehive Century Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Beehive Century Technology Co ltd
Priority to CN202210130893.6A
Publication of CN114500846A
Application granted
Publication of CN114500846B
Status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44016Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving splicing one content stream with another content stream, e.g. for substituting a video clip
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/4508Management of client data or end-user data
    • H04N21/4524Management of client data or end-user data involving the geographical location of the client
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/635Region indicators; Field of view indicators
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/69Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/951Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Abstract

The application relates to a method, an apparatus, a device, and a readable storage medium for switching viewing angles of a live event, in the technical field of video information processing. The method comprises: acquiring position information of a first view angle shooting device and position information of second view angle shooting devices; dividing the activity venue into at least one view angle area based on the position information of the first view angle shooting device and of the second view angle shooting devices, wherein at least one second view angle shooting device is arranged in each view angle area; generating view angle switching prompt information corresponding to each view angle area; displaying the view angle switching prompt information so that the user can select a view angle area; and, in response to the user's view angle area selection action, acquiring and displaying the video stream captured by the second view angle shooting device corresponding to the selected view angle area. The method allows a user watching a live event to switch viewing angles and thereby obtain the best viewing angle.

Description

Live action viewing angle switching method, device, equipment and readable storage medium
Technical Field
The present invention relates to the field of video information processing technologies, and in particular, to a method, an apparatus, a device, and a readable storage medium for switching viewing angles of live events.
Background
With continued social and economic development, the entertainment industry has flourished, and live events such as football matches, basketball games, concerts, and theatre performances have become increasingly popular. Attending live events is something people actively seek out: they enjoy watching and discussing the events and appreciating the artistry, emotion, and spirit the events convey.
The viewing experience at a live event largely depends on the viewing angle. Because a live event draws a large audience, each area with a particular viewing angle can only accommodate a limited number of spectators, and spectators cannot move about freely, both to maintain order and to keep on-site personnel safe. As a result, a user's current viewing angle is often not the best available viewing angle, the user cannot change it, and the viewing experience suffers. A technique for switching viewing angles at live events is therefore needed.
Disclosure of Invention
To allow a live audience member's viewing angle to be switched so that it can become the optimal viewing angle, the present application provides a live event viewing angle switching method, apparatus, device, and readable storage medium.
In a first aspect, the present application provides a method for switching viewing angles of live action, which adopts the following technical scheme:
the live event viewing angle switching method comprises: acquiring position information of the first view angle shooting device and position information of the second view angle shooting device;
dividing an activity venue into at least one view angle area based on the position information of the first view angle shooting device and the position information of the second view angle shooting device, wherein at least one second view angle shooting device is arranged in each view angle area;
generating view angle switching prompt information corresponding to each view angle area respectively;
displaying the view angle switching prompt information for the user to select a view angle area;
and, in response to the user's view angle area selection action, acquiring and displaying the video stream captured by the second view angle shooting device corresponding to the view angle area selected by the user.
By adopting the above technical scheme, the activity venue visible to the first view angle shooting device is divided into at least one view angle area using the position information of the first view angle shooting device and of the second view angle shooting devices, so that the position of each second view angle shooting device relative to the first view angle shooting device is known. Prompt information is generated for each view angle area to remind the user that the viewing angle can be switched to a second view angle shooting device in that area, and the video stream captured by the second view angle shooting device corresponding to the view angle area selected by the user is acquired and displayed. The user can therefore watch from, and choose among, different viewing angles, which solves the problem that a user cannot switch viewing angles when watching a live event and improves the viewing experience.
Optionally, before the acquiring the position information of the first view photographing device and the second view photographing device, the method further includes:
determining position information of the activity scene based on the positioning information of the first view photographing apparatus;
audience area information for the event venue is obtained based on the location information for the event venue.
Optionally, the acquiring the position information of the first view photographing device includes: responding to a photographing triggering action of a user, and acquiring image information comprising a current seat number;
performing feature recognition on the image information to obtain feature information;
and mapping the characteristic information and audience area information to obtain the position information of the first visual angle shooting equipment.
By adopting the above technical scheme, feature recognition is performed on the image photographed by the first view angle shooting device using image recognition, so that the determined position information of the first view angle shooting device is more accurate; this makes it convenient for the first view angle shooting device to divide the activity venue and to determine the orientation of each second view angle shooting device relative to the first view angle shooting device.
Optionally, the acquiring and displaying the video stream acquired by the second view shooting device corresponding to the view area selected by the user includes:
if a second visual angle shooting device is arranged in the visual angle area selected by the user, acquiring and displaying a video stream acquired by the second visual angle shooting device;
if a plurality of second view angle shooting devices are arranged in the view angle area selected by the user, calculating the overlapping area of the first shooting view angle of the first view angle shooting device and the second shooting view angle of each second view angle shooting device in the view angle area selected by the user, based on the position information of the first view angle shooting device and the position information of the plurality of second view angle shooting devices;
if the first shooting view angle and at least one second shooting view angle have overlapping areas, acquiring and displaying the video stream acquired by the second view angle shooting device with the smallest overlapping area;
and if the first shooting view angle and each second shooting view angle have no overlapping area, acquiring and displaying the video stream acquired by the second view angle shooting device whose second shooting view angle is adjacent to the first shooting view angle.
By adopting the above technical scheme, the video stream captured by a second view angle shooting device in the view angle area is acquired and displayed, so that the user can switch the viewing angle according to his or her own needs, which improves the viewing experience and is convenient to operate. Only the best viewing angle in the view angle area is acquired and displayed, which simplifies processing and reduces the memory consumed by displaying the video streams of all second view angle shooting devices in the area.
Optionally, after the capturing and displaying the video stream acquired by the second view capturing device with the smallest overlapping area, the method further includes:
splicing the video stream acquired by the second visual angle shooting equipment with the smallest overlapping area with the video stream shot by the first visual angle shooting equipment to obtain a panoramic video stream;
displaying panoramic viewing prompt information for a user to select a panoramic video stream to display;
and responding to the panoramic viewing trigger action of the user, and displaying the panoramic video stream.
By adopting the above technical scheme, the video stream captured by the second view angle shooting device with the smallest overlapping area is spliced with the video stream shot by the first view angle shooting device, and the user can select the panoramic viewing prompt information, so that the range of viewing angles of the live event available to the user is larger.
Optionally, the splicing the video stream acquired by the second view angle shooting device with the smallest overlapping area with the video stream shot by the first view angle shooting device to obtain a panoramic video stream includes:
preprocessing a video stream acquired by the second visual angle shooting equipment with the smallest overlapping area and a video stream shot by the first visual angle shooting equipment, wherein the preprocessing comprises image consistency adjustment, light consistency adjustment and video stream angle consistency adjustment;
extracting characteristics of frames of the video stream acquired by the second visual angle shooting equipment with the smallest overlapping area at the same time as the video stream shot by the first visual angle shooting equipment, and obtaining characteristic points of the video stream acquired by the second visual angle shooting equipment with the smallest overlapping area and the video stream shot by the first visual angle shooting equipment;
performing feature point matching based on the feature points to obtain a feature point matching set;
and merging the video stream acquired by the second visual angle shooting equipment with the smallest overlapping area with the video stream shot by the first visual angle shooting equipment based on the characteristic point matching set to obtain a panoramic video stream.
By adopting the above technical scheme, the video stream captured by the first view angle shooting device and the video stream captured by the second view angle shooting device are preprocessed, so that the extracted feature points are more accurate and the spliced panoramic video stream looks better.
In a second aspect, the present application provides a viewing angle switching device for live action, which adopts the following technical scheme:
a live action viewing angle switching apparatus applied to a first viewing angle photographing device, the apparatus comprising:
the first acquisition module is used for acquiring the position information of the first visual angle shooting equipment and the position information of the second visual angle shooting equipment;
the division module is used for dividing the activity venue into at least one view angle area based on the position information of the first view angle shooting device and the position information of the second view angle shooting device, wherein at least one second view angle shooting device is arranged in each view angle area;
the generation module is used for respectively generating visual angle switching prompt information corresponding to each visual angle area;
the display module is used for displaying the visual angle switching prompt information and is used for a user to select a visual angle area;
the response module is used for responding to the selected action of the view angle area of the user and acquiring and displaying the video stream acquired by the second view angle shooting equipment corresponding to the view angle area selected by the user.
By adopting the above technical scheme, the activity venue visible to the first view angle shooting device is divided into at least one view angle area using the position information of the first view angle shooting device and of the second view angle shooting devices, so that the position of each second view angle shooting device relative to the first view angle shooting device is known. Prompt information is generated for each view angle area to remind the user that the viewing angle can be switched to a second view angle shooting device in that area, and the video stream captured by the second view angle shooting device corresponding to the view angle area selected by the user is acquired and displayed. The user can therefore watch from, and choose among, different viewing angles, which solves the problem that a user cannot switch viewing angles when watching a live event and improves the viewing experience.
In a third aspect, the present application provides a viewing angle switching apparatus, which adopts the following technical scheme:
a view angle photographing apparatus comprising a memory and a processor, the memory having stored thereon a computer program capable of being loaded by the processor and executing the live action view angle switching method of any one of the first aspects.
In a fourth aspect, the present application provides a computer readable storage medium, which adopts the following technical scheme:
a computer-readable storage medium storing a computer program capable of being loaded by a processor and executing the live-action viewing angle switching method of any one of the first aspects.
In summary, the present application includes at least one of the following beneficial technical effects:
1. The activity venue visible to the first view angle shooting device is divided into at least one view angle area using the position information of the first view angle shooting device and of the second view angle shooting devices, so that the position of each second view angle shooting device relative to the first view angle shooting device is obtained. Prompt information is generated for each view angle area to remind the user that the viewing angle can be switched to a second view angle shooting device in that area, and the video stream captured by the second view angle shooting device corresponding to the selected view angle area is acquired and displayed. The user can thus watch from and choose among different viewing angles, which solves the problem that a user cannot switch viewing angles when watching a live event and improves the viewing experience.
2. By acquiring and displaying the video stream captured by a second view angle shooting device in the view angle area, the user can switch the viewing angle according to his or her own needs, which improves the viewing experience and is convenient to operate. Only the best viewing angle in the view angle area is acquired and displayed, which simplifies processing and reduces the memory consumed by displaying the video streams of all second view angle shooting devices in the area;
3. The video stream captured by the second view angle shooting device with the smallest overlapping area is spliced with the video stream shot by the first view angle shooting device, and the user can select the panoramic viewing prompt information, so that the range of viewing angles of the live event available to the user is larger.
Drawings
Fig. 1 is a flow chart of a live action view angle switching method according to an embodiment of the present application.
Fig. 2 is a schematic diagram of an embodiment of the present application embodying division of an activity scene.
Fig. 3 is a schematic flow chart of step S105 in the embodiment of the present application.
Fig. 4 is a block diagram of a live view angle switching device according to an embodiment of the present application.
Fig. 5 is a block diagram of the structure of the view photographing apparatus according to the embodiment of the application.
Detailed Description
The following embodiments are merely illustrative of the present application and are not intended to be limiting. After reading this specification, those skilled in the art may make modifications to the embodiments as required without creative contribution; such modifications remain protected by patent law within the scope of the claims of the present application.
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
In addition, the term "and/or" herein is merely an association relationship describing an association object, and means that three relationships may exist, for example, a and/or B may mean: a exists alone, A and B exist together, and B exists alone. In this context, unless otherwise specified, the term "/" generally indicates that the associated object is an "or" relationship.
The live event viewing angle switching method provided by the embodiments of the present application is implemented by a view angle shooting device, with a first view angle shooting device as the executing entity. The first view angle shooting device may be AR glasses, a head-mounted display device, a mobile phone, or a tablet computer, which is not specifically limited in this embodiment.
Embodiments of the present application are described in further detail below with reference to the drawings attached hereto. As shown in fig. 1, the main flow of the method is described as follows (steps S101 to S105):
step S101, acquiring position information of a first visual angle shooting device and position information of a second visual angle shooting device;
in this embodiment, a user may use a first view photographing apparatus in various field activities, such as a concert, a basketball game, a court, etc., and if other users use a second view photographing apparatus in the field activities, the first view photographing apparatus needs to acquire position information of the first view photographing apparatus and position information of the second view photographing apparatus.
Since the sites used for different live activities are different, it is necessary to determine the position information of the first view photographing apparatus and the second view photographing apparatus from the position information of the live activities, and thus it is necessary to acquire the position information of the live site before acquiring the position information of the first view photographing apparatus and the second view photographing apparatus.
Firstly, determining the position information of an activity site through the positioning information of first view shooting equipment;
then, audience area information of the event venue is acquired based on the location information of the event venue.
In this embodiment, the positioning information of the first view angle capturing device may be obtained by the GPS positioning module, or the user may input the position information of the activity site to the first view angle capturing device. The first view photographing apparatus may acquire the audience area information through the internet by using the location information of the live event, which is not particularly limited in this embodiment.
After the first view shooting device acquires the audience area information, acquiring the position information of the first view shooting device and the position information of the second view shooting device, wherein the position information acquisition method of the first view shooting device is as follows:
the method comprises the steps that a user shoots a seat number by using a first visual angle shooting device, the user can trigger a shooting key in a mode of a key, a touch screen and the like of the first visual angle shooting device to generate shooting operation, and the first visual angle shooting device responds to the shooting operation to acquire image information comprising the current seat number;
carrying out feature recognition on the image information to obtain feature information;
and mapping the characteristic information and the audience area information to obtain the position information of the first visual angle shooting equipment.
In this embodiment, the first view angle shooting device acquires the image information and performs feature recognition on it through an image recognition model to obtain feature information; by mapping the feature information onto the audience area information, the position information of the first view angle shooting device can be obtained, and the obtained position information is more accurate. The image recognition model is a deep neural network used for image recognition, and the deep neural network may be a convolutional neural network.
It should be noted that, the user may first take a picture of the seat number, store the taken picture in the memory, and perform feature recognition after the audience area information is obtained by the first view angle photographing device, or may first obtain the audience area information by the first view angle photographing device, and then take a picture of the seat number, which is not specifically limited in this embodiment.
In addition, after the second view angle shooting device shoots the image, the seat number can be obtained through image information feature recognition, the seat number can be directly sent to the first view angle shooting device, the shot image can also be sent to the first view angle shooting device, and the first view angle shooting device carries out image information feature recognition to obtain the seat number. The first view capturing device and the second view capturing device may communicate through internet of things, wireless or bluetooth, which is not specifically limited in this embodiment.
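As a purely illustrative sketch of how the seat-number recognition and audience-area mapping described above could be realized (not the embodiment's own implementation): off-the-shelf OCR stands in here for the deep-neural-network recognizer, and the audience-area table, file path, and function name are assumptions made up for the example.

import cv2
import pytesseract

# Hypothetical audience area information: seat number -> (stand, row, seat) position.
AUDIENCE_AREA = {"A-12": ("east stand", 1, 12), "B-07": ("east stand", 2, 7)}

def locate_first_device(seat_photo_path):
    """Recognize the seat number in a photo and map it to audience area information."""
    image = cv2.imread(seat_photo_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    # Feature recognition step: plain OCR here; the embodiment describes a CNN-based model.
    seat_number = pytesseract.image_to_string(gray).strip()
    # Mapping step: look the recognized seat number up in the audience area information.
    return AUDIENCE_AREA.get(seat_number)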
Step S102, dividing an activity site into at least one view angle area based on the position information of the first view angle shooting device and the position information of the second view angle shooting device, wherein at least one second view angle device is arranged in each view angle area;
the division of the live view into at least one view area will be specifically described below taking the first view photographing apparatus a and the second view photographing apparatuses B to F in fig. 2 as an example.
As shown in fig. 2, the first view photographing apparatus a divides the live scene into four view areas, namely, view area 1, view area 2, view area 3 and view area 4, wherein each view area has at least one second view photographing apparatus therein.
In this embodiment, the activity venue may be divided into view angle areas with regular shapes, with irregular shapes, or with a combination of regular and irregular shapes, which is not specifically limited in this embodiment. It is noted that if there is no second view angle shooting device on a given side of the first view angle shooting device A, that side is not treated as a view angle area.
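One possible way to perform a division like the one shown in Fig. 2 is to group the second view angle shooting devices by their bearing from the first view angle shooting device; the four-sector scheme and all names below are illustrative assumptions rather than a method prescribed by the embodiment.

import math
from collections import defaultdict

def divide_into_view_areas(first_pos, second_positions, sectors=4):
    """Group second view angle shooting devices into angular sectors around the first device.

    Sides of the first device with no second device simply produce no sector,
    mirroring the note that such sides are not treated as view angle areas.
    """
    areas = defaultdict(list)
    sector_width = 360.0 / sectors
    for dev_id, (x, y) in second_positions.items():
        bearing = math.degrees(math.atan2(y - first_pos[1], x - first_pos[0])) % 360
        areas[int(bearing // sector_width)].append(dev_id)
    return dict(areas)

# Example with first device A at the origin and second devices B-F around it.
print(divide_into_view_areas((0, 0), {"B": (5, 1), "C": (-3, 4), "D": (-2, -6), "E": (1, -7), "F": (6, 2)}))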
Step S103, respectively generating visual angle switching prompt information corresponding to each visual angle area;
step S104, displaying visual angle switching prompt information for selecting a visual angle area by a user;
in this embodiment, the visual angle switching prompt information may be in the form of text or pattern, or may be in the form of text and pattern combination, so as to prompt the user to switch the visual angle, thereby facilitating the user to select the visual angle area.
Step S105, in response to the user' S view angle region selection action, acquiring and displaying the video stream acquired by the second view angle capturing device corresponding to the view angle region selected by the user.
In this embodiment, the view angle region selection action may be a rotation direction and/or a movement track of the first view angle capturing device. For example, if the user needs to select the right view angle area, the user rotates or moves the first view angle capturing device to the right to generate a right view angle area selecting action, and the first view angle capturing device responds to the view angle area selecting operation to acquire and display a video stream acquired by the second view angle capturing device corresponding to the view angle area selected by the user.
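A minimal sketch of interpreting such a selection action, assuming the first view angle shooting device reports a horizontal rotation (yaw) angle; the threshold and area labels are illustrative only, and the different display cases are described next.

def select_view_area(yaw_degrees, threshold=15.0):
    """Interpret a rotation of the first view angle shooting device as an area selection."""
    if yaw_degrees >= threshold:
        return "right"   # user turned right: choose the view angle area to the right
    if yaw_degrees <= -threshold:
        return "left"    # user turned left: choose the view angle area to the left
    return None          # rotation too small: no selection action recognized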
Specifically, the following situations may be obtained and displayed for the video stream acquired by the second view capturing device corresponding to the view area selected by the user:
(1) If a second visual angle shooting device is arranged in the visual angle area selected by the user, acquiring and displaying a video stream acquired by the second visual angle shooting device;
(2) If a plurality of second view angle shooting devices are arranged in the view angle area selected by the user, respectively calculating the overlapping area of the first shooting view angle of the first view angle shooting device and the second shooting view angle of each second view angle shooting device in the view angle area selected by the user, based on the position information of the first view angle shooting device and the position information of the plurality of second view angle shooting devices;
if the first shooting visual angle and at least one second shooting visual angle have overlapping areas, acquiring and displaying video streams acquired by second visual angle shooting equipment with the smallest overlapping areas;
and if the first shooting visual angle and each second shooting visual angle do not have the overlapping area, acquiring and displaying the video stream acquired by the second visual angle shooting equipment of the second shooting visual angle adjacent to the first shooting visual angle.
In this embodiment, the overlapping area of the first shooting view angle of the first view angle shooting device and the second shooting view angle of each second view angle shooting device in the view angle area selected by the user may be calculated using a polygon intersection area algorithm.
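The embodiment does not fix a particular intersection routine; the sketch below uses the shapely library as one readily available option, assuming each shooting view angle has already been modelled as a 2-D field-of-view polygon (the polygons themselves are assumed inputs).

from shapely.geometry import Polygon

def smallest_overlap_device(first_fov, second_fovs):
    """Return the second view angle shooting device whose field of view overlaps the
    first shooting view angle by the smallest non-zero area, or None if none overlaps."""
    first_poly = Polygon(first_fov)
    best_id, best_area = None, float("inf")
    for dev_id, fov in second_fovs.items():
        overlap = first_poly.intersection(Polygon(fov)).area
        if 0 < overlap < best_area:
            best_id, best_area = dev_id, overlap
    return best_id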
By acquiring and displaying the video stream captured by a second view angle shooting device in the view angle area, the user can switch the viewing angle according to his or her own needs, which improves the viewing experience and is convenient to operate. Only the best viewing angle in the view angle area is acquired and displayed, which simplifies processing and reduces the memory consumed by displaying the video streams of all second view angle shooting devices in the area.
In summary, the live event viewing angle switching method divides the activity venue visible to the first view angle shooting device into at least one view angle area using the position information of the first view angle shooting device and of the second view angle shooting devices, so as to obtain the position of each second view angle shooting device relative to the first view angle shooting device. It generates prompt information for each view angle area to remind the user that the viewing angle can be switched to a second view angle shooting device in that area, and acquires and displays the video stream captured by the second view angle shooting device corresponding to the view angle area selected by the user. The user can therefore watch from and choose among different viewing angles, which solves the problem that a user cannot switch viewing angles when watching a live event and improves the viewing experience.
In order to enable the user to see a more comprehensive live video stream, the method for switching the live viewing angle is further optimized.
As a further embodiment of live-view angle switching, after acquiring and displaying the video stream acquired by the second view angle capturing device with the smallest overlapping area, as shown in fig. 3, the method further includes the following flow (steps S1051 to S1052):
step S1051, splicing the video stream acquired by the second visual angle shooting equipment with the smallest overlapping area with the video stream shot by the first visual angle shooting equipment to obtain a panoramic video stream;
specifically, preprocessing a video stream acquired by a second visual angle shooting device with the smallest overlapping area and a video stream shot by a first visual angle shooting device, wherein the preprocessing comprises image consistency adjustment, light consistency adjustment and video stream angle consistency adjustment;
the method comprises the steps of performing feature extraction on frames of a video stream acquired by second visual angle shooting equipment with the smallest overlapping area at the same moment as a video stream shot by first visual angle shooting equipment, and obtaining feature points of the video stream acquired by the second visual angle shooting equipment with the smallest overlapping area and the video stream shot by the first visual angle shooting equipment;
performing feature point matching based on the feature points to obtain a feature point matching set;
and merging the video stream acquired by the second visual angle shooting equipment with the smallest overlapping area with the video stream shot by the first visual angle shooting equipment based on the feature point matching set to obtain a panoramic video stream.
In this embodiment, the video stream acquired by the second view angle shooting device with the smallest overlapping area and the video stream shot by the first view angle shooting device may be processed using image recognition and image stitching techniques known in the art, which are not described in detail in this embodiment.
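As an illustration of such a known pipeline, the sketch below stitches two same-instant frames with standard OpenCV calls (ORB feature points, brute-force matching, homography estimation, warping and merging); the embodiment does not prescribe ORB or this exact flow, so treat it as one plausible realization under those assumptions.

import cv2
import numpy as np

def stitch_frames(frame_first, frame_second):
    """Stitch two same-instant frames into a wider panoramic frame (illustrative)."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(frame_first, None)   # feature points, first view
    kp2, des2 = orb.detectAndCompute(frame_second, None)  # feature points, second view
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]  # feature point matching set
    src = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)  # map second view into first view's plane
    h, w = frame_first.shape[:2]
    panorama = cv2.warpPerspective(frame_second, H, (w * 2, h))
    panorama[0:h, 0:w] = frame_first  # merge: overlay the first-view frame
    return panorama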
Step S1052, displaying panoramic viewing prompt information for a user to select a panoramic video stream to display;
step S1053, in response to the panoramic viewing trigger action of the user, displays the panoramic video stream.
In this embodiment, the panoramic viewing prompt information may be pattern information or text information. The panoramic viewing trigger action may be an action distinct from the view angle area selection action, or it may be the same gesture as the view angle area selection action performed a second time or repeatedly so as to distinguish it; performing it switches the display from the video stream captured by the second view angle shooting device corresponding to the selected view angle area to the panoramic video stream for that area.
The video stream captured by the second view angle shooting device with the smallest overlapping area is spliced with the video stream shot by the first view angle shooting device, and the user can select the panoramic viewing prompt information, so that the range of viewing angles of the live event available to the user is larger.
It should be noted that, if a second viewing angle capturing device is disposed in the viewing angle area and the second capturing viewing angle and the first capturing viewing angle have overlapping areas, the video stream captured by the first viewing angle capturing device and the video stream captured by the second viewing angle capturing device may be spliced by using the above method to obtain a panoramic video stream, which is not described herein.
Fig. 4 is a block diagram illustrating a structure of a live view angle switching device 200 according to an embodiment of the present application.
As shown in fig. 4, the live-view angle switching device 200 mainly includes:
a first obtaining module 201, configured to obtain location information of a first view angle capturing device and location information of a second view angle capturing device;
a dividing module 202, configured to divide an activity scene into at least one view area based on the position information of the first view photographing device and the position information of the second view photographing device, where at least one second view device is disposed in each view area;
a generating module 203, configured to generate viewing angle switching prompt information corresponding to each viewing angle region;
the display module 204 is configured to display viewing angle switching prompt information, for selecting a viewing angle region by a user;
the response module 205 is configured to obtain and display, in response to a user's view angle region selection action, a video stream acquired by a second view angle capturing device corresponding to the view angle region selected by the user.
As an optional implementation of this embodiment, the live event viewing angle switching apparatus 200 further includes a second acquisition module, configured to determine, before acquiring the position information of the first view angle capturing device and the second view angle capturing device, the position information of the activity venue based on the positioning information of the first view angle capturing device, and to obtain audience area information of the event venue based on the position information of the event venue.
As an optional implementation manner of this embodiment, the first obtaining module 201 is specifically configured to obtain, in response to a photographing trigger action of a user, image information including a current seat number; carrying out feature recognition on the image information to obtain feature information; and mapping the characteristic information and the audience area information to obtain the position information of the first visual angle shooting equipment.
As an alternative implementation of this embodiment, the response module 205 includes:
the first acquisition sub-module is used for acquiring and displaying the video stream acquired by the second visual angle shooting equipment if the second visual angle shooting equipment is arranged in the visual angle area selected by the user;
the second obtaining submodule is used for, if a plurality of second view angle shooting devices are arranged in the view angle area selected by the user, respectively calculating the overlapping area of the first shooting view angle of the first view angle shooting device and the second shooting view angle of each second view angle shooting device in the selected view angle area, based on the position information of the first view angle shooting device and the position information of the plurality of second view angle shooting devices;
if the first shooting visual angle and at least one second shooting visual angle have overlapping areas, acquiring and displaying video streams acquired by second visual angle shooting equipment with the smallest overlapping areas;
and if the first shooting visual angle and each second shooting visual angle do not have the overlapping area, acquiring and displaying the video stream acquired by the second visual angle shooting equipment of the second shooting visual angle adjacent to the first shooting visual angle.
In this alternative embodiment, the second acquisition submodule includes:
the splicing sub-module is used for splicing the video stream acquired by the second visual angle shooting equipment with the smallest overlapping area with the video stream acquired by the first visual angle shooting equipment after acquiring and displaying the video stream acquired by the second visual angle shooting equipment with the smallest overlapping area, so as to obtain a panoramic video stream;
the display sub-module is used for displaying panoramic viewing prompt information and is used for a user to select a panoramic video stream to display;
and the response sub-module is used for responding to the panoramic viewing trigger action of the user and displaying the panoramic video stream.
Optionally, the splicing submodule is specifically configured to perform preprocessing on a video stream acquired by the second view angle shooting device and a video stream shot by the first view angle shooting device, where the overlapping area is the smallest, where the preprocessing includes image consistency adjustment, light consistency adjustment and video stream angle consistency adjustment;
the method comprises the steps of performing feature extraction on frames of a video stream acquired by second visual angle shooting equipment with the smallest overlapping area at the same moment as a video stream shot by first visual angle shooting equipment, and obtaining feature points of the video stream acquired by the second visual angle shooting equipment with the smallest overlapping area and the video stream shot by the first visual angle shooting equipment;
performing feature point matching based on the feature points to obtain a feature point matching set;
and merging the video stream acquired by the second visual angle shooting equipment with the smallest overlapping area with the video stream shot by the first visual angle shooting equipment based on the feature point matching set to obtain a panoramic video stream.
In one example, a module in any of the above apparatuses may be one or more integrated circuits configured to implement the above methods, for example: one or more application specific integrated circuits (application specific integratedcircuit, ASIC), or one or more digital signal processors (digital signal processor, DSP), or one or more field programmable gate arrays (field programmable gate array, FPGA), or a combination of at least two of these integrated circuit forms.
For another example, when a module in an apparatus may be implemented in the form of a scheduler of processing elements, the processing elements may be general-purpose processors, such as a central processing unit (central processing unit, CPU) or other processor that may invoke a program. For another example, the modules may be integrated together and implemented in the form of a system-on-a-chip (SOC).
Various objects such as various messages/information/devices/network elements/systems/devices/actions/operations/processes/concepts may be named in the present application, and it should be understood that these specific names do not constitute limitations on related objects, and that the named names may be changed according to the scenario, context, or usage habit, etc., and understanding of technical meaning of technical terms in the present application should be mainly determined from functions and technical effects that are embodied/performed in the technical solution.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system, apparatus and module may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
Fig. 5 is a block diagram illustrating a structure of a view photographing apparatus 300 according to an embodiment of the present application.
As shown in fig. 5, the view photographing apparatus 300 includes a processor 301 and a memory 302, and may further include one or more of an information input/information output (I/O) interface 303 and a communication component 304.
Wherein the processor 301 is configured to control the overall operation of the view photographing apparatus 300 to complete all or part of the steps in the live action viewing angle switching method described above; the memory 302 is used to store various types of data to support operation of the view photographing apparatus 300, which may include, for example, instructions for any application or method operating on the view photographing apparatus 300, as well as application-related data. The memory 302 may be implemented by any type or combination of volatile or non-volatile memory devices, such as one or more of static random access memory (Static Random Access Memory, SRAM), electrically erasable programmable read-only memory (Electrically Erasable Programmable Read-Only Memory, EEPROM), erasable programmable read-only memory (Erasable Programmable Read-Only Memory, EPROM), programmable read-only memory (Programmable Read-Only Memory, PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
The I/O interface 303 provides an interface between the processor 301 and other interface modules, which may be a keyboard, a mouse, buttons, and the like; these buttons may be virtual buttons or physical buttons. The communication component 304 is used for wired or wireless communication between the view photographing apparatus 300 and other apparatuses. The wireless communication may be, for example, Wi-Fi, Bluetooth, near field communication (Near Field Communication, NFC for short), 2G, 3G or 4G, or a combination of one or more thereof; the corresponding communication component 304 may thus comprise a Wi-Fi part, a Bluetooth part, and an NFC part.
Communication bus 305 may include a pathway to transfer information between the aforementioned components. The communication bus 305 may be a PCI (Peripheral Component Interconnect, peripheral component interconnect standard) bus or an EISA (Extended Industry Standard Architecture ) bus, or the like. The communication bus 305 may be divided into an address bus, a data bus, a control bus, and the like.
The viewing angle capturing device 300 may be implemented by one or more application specific integrated circuits (Application SpecificIntegrated Circuit, abbreviated as ASIC), digital signal processors (Digital Signal Processor, abbreviated as DSP), digital signal processing devices (Digital Signal Processing Device, abbreviated as DSPD), programmable logic devices (Programmable Logic Device, abbreviated as PLD), field programmable gate arrays (Field Programmable Gate Array, abbreviated as FPGA), controllers, microcontrollers, microprocessors, or other electronic components for performing the field active viewing angle switching method as set forth in the above embodiments.
The view photographing apparatus 300 may include, but is not limited to, a mobile terminal such as a digital broadcast receiver, a PDA (personal digital assistant), a PMP (portable multimedia player), etc., and a fixed terminal such as a digital TV, a desktop computer, etc., and may also be a server, etc.
The following describes a computer-readable storage medium provided in an embodiment of the present application, and the computer-readable storage medium described below and the live-action viewing angle switching method described above may be referred to correspondingly with each other.
The present application also provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor implements the steps of the live action viewing angle switching method described above.
The computer readable storage medium may include: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
The foregoing description covers only the preferred embodiments of the present application and illustrates the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the application is not limited to the specific combinations of features described above; it is also intended to cover other embodiments formed by any combination of the above features or their equivalents without departing from the spirit of the application, for example embodiments in which the above features are replaced with technical features of similar function disclosed in (but not limited to) this application.

Claims (8)

1. A live view angle switching method, applied to a first view angle photographing apparatus, comprising:
acquiring the position information of the first visual angle shooting equipment and the position information of the second visual angle shooting equipment;
dividing an activity scene into at least one view angle area based on the position information of the first view angle shooting equipment and the position information of the second view angle shooting equipment, wherein at least one second view angle equipment is arranged in each view angle area, and if one side of the first view angle equipment does not have the second view angle equipment, one side of the first view angle equipment, which does not have the second view angle equipment, is not used as the view angle area;
generating visual angle switching prompt information corresponding to each visual angle area respectively;
displaying the visual angle switching prompt information for selecting a visual angle area by a user;
responding to a view angle region selection action of a user, and acquiring and displaying a video stream acquired by second view angle shooting equipment corresponding to the view angle region selected by the user;
the obtaining and displaying the video stream collected by the second view shooting device corresponding to the view area selected by the user includes:
if a second visual angle shooting device is arranged in the visual angle area selected by the user, acquiring and displaying a video stream acquired by the second visual angle shooting device;
if a plurality of second view angle shooting devices are arranged in the view angle area selected by the user, calculating the superposition area of the first shooting view angle of the first view angle shooting device and the second shooting view angle of each second view angle shooting device in the view angle area selected by the user based on the position information of the first view angle shooting device and the position information of the plurality of second view angle shooting devices;
if the first shooting visual angle and at least one second shooting visual angle have overlapping areas, acquiring and displaying video streams acquired by second visual angle shooting equipment with the smallest overlapping areas;
and if the first shooting visual angle and each second shooting visual angle do not have a superposition area, acquiring and displaying the video stream acquired by the second visual angle shooting equipment of the second shooting visual angle adjacent to the first shooting visual angle.
2. The method of claim 1, wherein prior to the acquiring the position information of the first view photographing device and the second view photographing device, the method further comprises:
determining position information of the activity scene based on the positioning information of the first view photographing apparatus;
audience area information for the event venue is obtained based on the location information for the event venue.
3. The method of claim 2, wherein the acquiring of the position information of the first viewing angle shooting device comprises:
in response to a photographing trigger action of the user, acquiring image information including a current seat number;
performing feature recognition on the image information to obtain feature information; and
mapping the feature information to the audience area information to obtain the position information of the first viewing angle shooting device.
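
The mapping step of claim 3 can be sketched in Python as follows. The feature-recognition step is stubbed out, and the seat-label format, audience-area table and coordinates are invented solely for illustration; the claim itself does not prescribe any of them.

import re
from typing import Optional, Tuple


def recognize_seat_label(image_bytes: bytes) -> str:
    # Placeholder for the feature-recognition step (e.g. reading the seat number
    # from the photographed image); the backend is out of scope for this sketch.
    raise NotImplementedError


def parse_seat(label: str) -> Optional[Tuple[str, int, int]]:
    # Extract (area, row, seat) from a hypothetical label such as "B-12-07".
    m = re.fullmatch(r"([A-Z])-(\d+)-(\d+)", label.strip())
    return (m.group(1), int(m.group(2)), int(m.group(3))) if m else None


# Hypothetical audience-area information: area -> origin (x, y) in metres plus
# per-row / per-seat spacing, from which a seat position can be interpolated.
AUDIENCE_AREAS = {
    "A": {"origin": (0.0, 0.0), "row_pitch": 0.9, "seat_pitch": 0.5},
    "B": {"origin": (30.0, 0.0), "row_pitch": 0.9, "seat_pitch": 0.5},
}


def seat_to_position(label: str) -> Optional[Tuple[float, float]]:
    parsed = parse_seat(label)
    if parsed is None:
        return None
    area, row, seat = parsed
    info = AUDIENCE_AREAS.get(area)
    if info is None:
        return None
    ox, oy = info["origin"]
    return (ox + seat * info["seat_pitch"], oy + row * info["row_pitch"])


print(seat_to_position("B-12-07"))  # approximately (33.5, 10.8)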
4. The method of claim 1, wherein after the acquiring and displaying of the video stream captured by the second viewing angle shooting device having the smallest overlapping area, the method further comprises:
splicing the video stream captured by the second viewing angle shooting device having the smallest overlapping area with the video stream captured by the first viewing angle shooting device to obtain a panoramic video stream;
displaying panoramic viewing prompt information for the user to choose to display the panoramic video stream; and
in response to a panoramic viewing trigger action of the user, displaying the panoramic video stream.
5. The method of claim 4, wherein the splicing of the video stream captured by the second viewing angle shooting device having the smallest overlapping area with the video stream captured by the first viewing angle shooting device to obtain a panoramic video stream comprises:
preprocessing the video stream captured by the second viewing angle shooting device having the smallest overlapping area and the video stream captured by the first viewing angle shooting device, the preprocessing comprising image consistency adjustment, lighting consistency adjustment and video stream angle consistency adjustment;
performing feature extraction on frames, captured at the same moment, of the video stream captured by the second viewing angle shooting device having the smallest overlapping area and of the video stream captured by the first viewing angle shooting device, to obtain feature points of the two video streams;
performing feature point matching based on the feature points to obtain a feature point matching set; and
merging the video stream captured by the second viewing angle shooting device having the smallest overlapping area with the video stream captured by the first viewing angle shooting device based on the feature point matching set to obtain the panoramic video stream.
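
For the pipeline of claim 5, a per-frame stitching sketch using OpenCV is given below. The claim does not name a particular feature detector or merging method; ORB features, brute-force Hamming matching, a RANSAC homography and simple side-by-side compositing are used here only as a plausible stand-in, and the resizing step is a crude proxy for the recited consistency adjustments. In practice this would be applied to time-synchronised frames of the two streams.

import cv2
import numpy as np


def stitch_pair(first_frame: np.ndarray, second_frame: np.ndarray) -> np.ndarray:
    # Preprocessing: bring both frames to a common height (a crude proxy for
    # the image/lighting/angle consistency adjustments named in the claim).
    h = min(first_frame.shape[0], second_frame.shape[0])
    first = cv2.resize(first_frame, (int(first_frame.shape[1] * h / first_frame.shape[0]), h))
    second = cv2.resize(second_frame, (int(second_frame.shape[1] * h / second_frame.shape[0]), h))

    # Feature extraction on frames captured at the same moment.
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(first, None)
    kp2, des2 = orb.detectAndCompute(second, None)

    # Feature point matching to build the matching set.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]

    # Merging: estimate a homography from the matching set and warp the second
    # frame into the first frame's plane to form the panoramic frame.
    src = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    pano = cv2.warpPerspective(second, H, (first.shape[1] + second.shape[1], h))
    pano[:, : first.shape[1]] = first
    return pano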
6. A live-event viewing angle switching apparatus, applied to a first viewing angle shooting device, comprising:
a first acquisition module configured to acquire position information of the first viewing angle shooting device and position information of second viewing angle shooting devices;
a division module configured to divide an event venue into at least one viewing angle region based on the position information of the first viewing angle shooting device and the position information of the second viewing angle shooting devices, wherein at least one second viewing angle shooting device is arranged in each viewing angle region, and any side of the first viewing angle shooting device on which no second viewing angle shooting device is arranged is not treated as a viewing angle region;
a generation module configured to generate viewing angle switching prompt information corresponding to each viewing angle region;
a display module configured to display the viewing angle switching prompt information for a user to select a viewing angle region; and
a response module configured to, in response to a viewing angle region selection action of the user, acquire and display a video stream captured by the second viewing angle shooting device corresponding to the viewing angle region selected by the user;
wherein the response module comprises:
a first acquisition sub-module configured to, if a single second viewing angle shooting device is arranged in the viewing angle region selected by the user, acquire and display the video stream captured by that second viewing angle shooting device; and
a second acquisition sub-module configured to, if a plurality of second viewing angle shooting devices are arranged in the viewing angle region selected by the user, calculate, based on the position information of the first viewing angle shooting device and the position information of the plurality of second viewing angle shooting devices, an overlapping area between a first shooting angle of view of the first viewing angle shooting device and a second shooting angle of view of each second viewing angle shooting device in the viewing angle region selected by the user;
if the first shooting angle of view overlaps at least one second shooting angle of view, acquire and display the video stream captured by the second viewing angle shooting device having the smallest overlapping area; and
if the first shooting angle of view overlaps none of the second shooting angles of view, acquire and display the video stream captured by the second viewing angle shooting device whose second shooting angle of view is adjacent to the first shooting angle of view.
7. A viewing angle shooting device, comprising a memory and a processor, the memory storing a computer program that can be loaded by the processor to perform the method according to any one of claims 1 to 5.
8. A computer-readable storage medium, characterized in that it stores a computer program that can be loaded by a processor to perform the method according to any one of claims 1 to 5.
CN202210130893.6A 2022-02-12 2022-02-12 Live action viewing angle switching method, device, equipment and readable storage medium Active CN114500846B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210130893.6A CN114500846B (en) 2022-02-12 2022-02-12 Live action viewing angle switching method, device, equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN114500846A CN114500846A (en) 2022-05-13
CN114500846B true CN114500846B (en) 2024-04-02

Family

ID=81481002

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210130893.6A Active CN114500846B (en) 2022-02-12 2022-02-12 Live action viewing angle switching method, device, equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN114500846B (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105120251A (en) * 2015-08-19 2015-12-02 京东方科技集团股份有限公司 3D scene display method and device
US10750243B2 (en) * 2016-08-30 2020-08-18 Sony Corporation Distribution device, distribution method, reception device, reception method, program, and content distribution system
EP3720136A4 (en) * 2017-11-30 2020-10-07 Sony Corporation Transmission device, transmission method, reception device, and reception method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107623812A (en) * 2016-07-14 2018-01-23 幸福在线(北京)网络技术有限公司 A kind of method, relevant apparatus and system for realizing outdoor scene displaying
CN108174240A (en) * 2017-12-29 2018-06-15 哈尔滨市舍科技有限公司 Panoramic video playback method and system based on user location
WO2019138682A1 (en) * 2018-01-09 2019-07-18 ソニー株式会社 Information processing device, information processing method, and program
CN111629242A (en) * 2020-05-27 2020-09-04 腾讯科技(深圳)有限公司 Image rendering method, device, system, equipment and storage medium
CN111800644A (en) * 2020-07-14 2020-10-20 深圳市人工智能与机器人研究院 Video sharing and acquiring method, server, terminal equipment and medium
CN113596544A (en) * 2021-07-26 2021-11-02 王博 Video generation method and device, electronic equipment and storage medium
CN113784217A (en) * 2021-09-13 2021-12-10 天津智融创新科技发展有限公司 Video playing method, device, equipment and storage medium
CN113938647A (en) * 2021-09-13 2022-01-14 杭州大杰智能传动科技有限公司 Intelligent tower crane operation panoramic monitoring and restoring method and system for intelligent construction site

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Implementation of VR panoramic video services for the smart home; 罗传飞; 孔德辉; 刘翔凯; 徐科; 杨浩; 电信科学 (Telecommunications Science) (10); full text *

Also Published As

Publication number Publication date
CN114500846A (en) 2022-05-13

Similar Documents

Publication Publication Date Title
JP6725038B2 (en) Information processing apparatus and method, display control apparatus and method, program, and information processing system
US10270825B2 (en) Prediction-based methods and systems for efficient distribution of virtual reality media content
CN107079141B (en) Image mosaic for 3 D video
CN104571532B (en) A kind of method and device for realizing augmented reality or virtual reality
CN109729372B (en) Live broadcast room switching method, device, terminal, server and storage medium
US9392248B2 (en) Dynamic POV composite 3D video system
CN111741334B (en) Live broadcast data generation method, live broadcast data display method, device and equipment
CN111787407B (en) Interactive video playing method and device, computer equipment and storage medium
CN105989573A (en) Method and system for providing exhibition hall guide information based on 360-degree digital panoramic technology
CN113490010B (en) Interaction method, device and equipment based on live video and storage medium
JP2018117312A (en) Video distribution system, user terminal and video distribution method
CN110837300B (en) Virtual interaction method and device, electronic equipment and storage medium
CN112007362A (en) Display control method, device, storage medium and equipment in virtual world
CN112774185B (en) Virtual card control method, device and equipment in card virtual scene
CN114302160A (en) Information display method, information display device, computer equipment and medium
JP2020042407A (en) Information processor and information processing method and program
CN114500846B (en) Live action viewing angle switching method, device, equipment and readable storage medium
CN112995687A (en) Interaction method, device, equipment and medium based on Internet
JP6609078B1 (en) Content distribution system, content distribution method, and content distribution program
CN112511743A (en) Video shooting method and device
JP7301521B2 (en) Image processing device
CN112991157B (en) Image processing method, image processing device, electronic equipment and storage medium
CN112702533B (en) Sight line correction method and sight line correction device
CN113413587B (en) Information determination method, device, equipment and medium for card sports
CN114415907A (en) Media resource display method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant