CN114518903A - AR system control method and device, electronic device and storage medium - Google Patents

Info

Publication number
CN114518903A
Authority
CN
China
Prior art keywords
equipment
control instruction
multimedia resource
time point
execution time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210097349.6A
Other languages
Chinese (zh)
Inventor
王海强 (Wang Haiqiang)
李志雄 (Li Zhixiong)
李荣芳 (Li Rongfang)
柯建伟 (Ke Jianwei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Huilian Network Technology Co ltd
Original Assignee
Guangzhou Huilian Network Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Huilian Network Technology Co., Ltd.
Priority to CN202210097349.6A
Publication of CN114518903A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/30: Arrangements for executing machine instructions, e.g. instruction decode
    • G06F 9/30003: Arrangements for executing specific machine instructions
    • G06F 9/30076: Arrangements for executing specific machine instructions to perform miscellaneous control operations, e.g. NOP
    • G06F 9/30087: Synchronisation or serialisation instructions
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Telephonic Communication Services (AREA)

Abstract

Embodiments of this application relate to the technical field of AR (augmented reality) devices and disclose a control method and apparatus for an AR system, an electronic device, and a storage medium. The method is applied to a server; the AR system includes at least an AR device and a venue device, the venue device being arranged in a real venue. The method includes: sending corresponding data packets to the AR device and the venue device respectively, where each data packet includes one or more control instructions, and each control instruction includes a multimedia resource, an execution mode for the multimedia resource, and an execution time point. The control instructions cause the AR device and/or the venue device to process the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the instruction. By implementing the embodiments of this application, the presentation effect of the AR image output by the AR device can be improved.

Description

AR system control method and apparatus, electronic device, and storage medium
Technical Field
The present application relates to the technical field of AR devices, and in particular, to a method and an apparatus for controlling an AR system, an electronic device, and a storage medium.
Background
With the rapid development of AR (Augmented Reality) technology, today's AR devices can work in coordination with multimedia resources output by venue devices installed in a real venue, thereby achieving a better presentation effect.
In practice, however, it has been found that the multimedia resources output by the AR device and the venue device in the related art are difficult to synchronize, which degrades the presentation effect of the AR image output by the AR device.
Disclosure of Invention
The embodiments of this application disclose a control method and apparatus for an AR system, an electronic device, and a storage medium, which can improve the presentation effect of the AR image output by the AR device.
A first aspect of the embodiments of this application discloses a control method for an AR system, applied to a server. The AR system includes at least an AR device and a venue device, the venue device being arranged in a real venue. The method includes:
sending corresponding data packets to the AR device and the venue device respectively, where each data packet includes one or more control instructions, each control instruction includes a multimedia resource, an execution mode for the multimedia resource, and an execution time point, and the control instructions cause the AR device and/or the venue device to process the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the instruction.
A second aspect of the embodiments of this application discloses a control apparatus for an AR system, applied to a server. The AR system includes at least an AR device and a venue device, the venue device being arranged in a real venue. The apparatus includes:
a sending unit configured to send corresponding data packets to the AR device and the venue device respectively, where each data packet includes one or more control instructions, each control instruction includes a multimedia resource, an execution mode for the multimedia resource, and an execution time point, and the control instructions cause the AR device and/or the venue device to process the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the instruction.
A third aspect of the embodiments of the present application discloses an electronic device, including:
a memory storing executable program code;
a processor coupled with the memory;
the processor calls the executable program code stored in the memory to execute the control method of the AR system disclosed in the first aspect of the embodiment of the present application.
A fourth aspect of the embodiments of the present application discloses a computer-readable storage medium storing a computer program, where the computer program causes a computer to execute the method for controlling an AR system disclosed in the first aspect of the embodiments of the present application.
A fifth aspect of the embodiments of this application discloses a control system including an AR system and a server, where the AR system includes at least an AR device and a venue device, and where:
the server is configured to send corresponding data packets to the AR device and the venue device respectively, where each data packet includes one or more control instructions, and each control instruction includes a multimedia resource, an execution mode for the multimedia resource, and an execution time point;
the AR device is configured to process the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the control instruction;
and the venue device is configured to process the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the control instruction.
A sixth aspect of embodiments of the present application discloses a computer program product, which, when run on a computer, causes the computer to perform some or all of the steps of any one of the methods of the first aspect of the embodiments of the present application.
A seventh aspect of embodiments of the present application discloses an application publishing platform, where the application publishing platform is configured to publish a computer program product, where, when the computer program product runs on a computer, the computer is caused to perform part or all of the steps of any one of the methods in the first aspect of the embodiments of the present application.
Compared with the related art, the embodiment of the application has the following beneficial effects:
In these embodiments, the server may send corresponding data packets to the AR device and the venue device respectively, where each data packet may include one or more control instructions, and each control instruction includes a multimedia resource, an execution mode for the multimedia resource, and an execution time point; the AR device and/or the venue device can then process the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the control instruction. In the related art, the server usually sends control instructions to the AR device and the venue device in real time; because there is a time difference between when an instruction reaches the AR device and when it reaches the venue device, it is difficult for the two devices to output multimedia resources synchronously. In the embodiments of this application, the server instead sends control instructions carrying execution time points to the AR device and the venue device in advance, so that both devices can execute the instructions synchronously at the scheduled execution time points. This achieves synchronous output of the multimedia resources, solves the problem that the resources output by the AR device and the venue device are difficult to synchronize, and improves the presentation effect of the AR image output by the AR device.
Drawings
To illustrate the technical solutions in the embodiments of this application more clearly, the drawings needed in the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of this application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an application scenario disclosed in an embodiment of the present application;
fig. 2 is a schematic flowchart of a control method of an AR system according to an embodiment of the present disclosure;
fig. 3A is a schematic flowchart of another control method of an AR system disclosed in the embodiments of the present application;
fig. 3B is a schematic diagram of a second distance obtaining method disclosed in the embodiment of the present application;
fig. 4 is a schematic flowchart of yet another control method of an AR system disclosed in an embodiment of the present application;
fig. 5 is a schematic structural diagram of a control device of an AR system disclosed in an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device disclosed in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of this application are described below clearly and completely with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of this application. All other embodiments obtained by those skilled in the art based on the embodiments of this application without creative effort shall fall within the protection scope of this application.
It should be noted that the terms "first", "second", "third", "fourth", and the like in the description and claims of this application are used to distinguish different objects, not to describe a specific order. The terms "comprise", "include", and "have", and any variations thereof, are intended to cover a non-exclusive inclusion: a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The embodiments of this application disclose a control method and apparatus for an AR system, an electronic device, and a storage medium, which can improve the presentation effect of the AR image output by the AR device.
The technical solution of the present application will be described in detail with reference to specific examples.
To describe more clearly the control method and apparatus, the electronic device, and the storage medium of the AR system disclosed in the embodiments of this application, an application scenario to which the control method is applicable is introduced first. Optionally, the method may be applied to various servers, including but not limited to a cloud server, a local server, or a server cluster.
Referring to fig. 1, fig. 1 is a schematic diagram of an application scenario disclosed in an embodiment of this application. The AR system may include an AR device 1101 and a venue device 1102, and the server 120 (fig. 1 shows a cloud server as an example, which should not limit the embodiments of this application) may establish a transmission link with each of the AR device 1101 and the venue device 1102; the transmission links are used to transmit data packets or instructions and are not limited here. Optionally, the AR device 1101 may include AR glasses, an AR-capable mobile phone, an AR watch, and the like; fig. 1 shows AR glasses as an example, which should not limit the embodiments of this application. Optionally, the venue device 1102 may be a device arranged in a real venue and configured to output multimedia resources; it may include, but is not limited to, a display screen, a lamp, or a loudspeaker, and fig. 1 shows a lamp as an example, which should not limit the embodiments of this application.
Based on this, before or while the AR system runs, the server 120 may send corresponding data packets to the AR device 1101 and the venue device 1102 through the transmission links, where each data packet may include one or more control instructions, and each control instruction includes a multimedia resource (including but not limited to a video, music, or a picture), an execution mode for the multimedia resource (including but not limited to playing, pausing playback, adjusting volume, or adjusting brightness), and an execution time point. After receiving its data packet, the AR device 1101 may process the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the control instruction; likewise, after receiving its data packet, the venue device 1102 may process the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the control instruction. The AR device 1101 and the venue device 1102 thus output multimedia resources in a coordinated manner, improving the presentation effect of the AR image output by the AR device 1101.
Based on this, the following describes a control method of the AR system disclosed in the embodiment of the present application.
Referring to fig. 2, fig. 2 is a schematic flowchart of a control method of an AR system disclosed in an embodiment of this application. The method may be applied to the above server or to another execution entity, which is not limited here. The AR system includes at least an AR device and a venue device, the venue device being arranged in a real venue, and the method may include the following steps:
202. Send corresponding data packets to the AR device and the venue device respectively, where each data packet includes one or more control instructions, each control instruction includes a multimedia resource, an execution mode for the multimedia resource, and an execution time point, and the control instructions cause the AR device and/or the venue device to process the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the instruction.
In the embodiments of this application, the server may include, but is not limited to, a cloud server, a local server, or a server cluster. The server has a certain logical computing capability; accordingly, it can generate the data packets corresponding to the AR device and the venue device according to the AR image effect to be presented on the AR device. When a sending condition is satisfied, the server sends the corresponding data packets to the AR device and the venue device respectively.
Optionally, the server may determine that the sending condition is satisfied when it determines that both the AR device and the venue device are in the started state; in another embodiment, the server may determine that the sending condition is satisfied when it receives a data-packet acquisition request from the AR device and/or the venue device. It should also be noted that the data packets sent to the AR device and to the venue device may be the same or different, which is not limited here.
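The two optional trigger conditions can be sketched as a single predicate (a minimal illustration; the function and parameter names are assumptions, not part of the disclosure):

```python
def sending_condition_met(ar_started, venue_started, request_received):
    """True when the server should send the data packets: either both the
    AR device and the venue device are in the started state, or a
    data-packet acquisition request has been received from either device."""
    return (ar_started and venue_started) or request_received
```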
Further, after receiving its data packet, the AR device and/or the venue device may execute the control instructions in the packet in order of their execution time points, from earliest to latest; optionally, each instruction is executed by processing the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the instruction.
For example, suppose the data packet sent to the AR device includes a control instruction a and a control instruction b, where control instruction a includes a video clip A to be played at 14:20, and control instruction b includes a picture A to be displayed at 14:30. After receiving the data packet, the AR device plays video clip A at 14:20 according to control instruction a and displays picture A at 14:30 according to control instruction b.
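The packet structure and time-ordered execution described above can be sketched as follows (an illustrative sketch only; the `ControlInstruction` fields and the `execution_order` helper are hypothetical names, not part of the disclosed system):

```python
from dataclasses import dataclass

@dataclass
class ControlInstruction:
    resource: str    # multimedia resource identifier, e.g. "video_clip_a"
    mode: str        # execution mode, e.g. "play" or "display"
    exec_time: str   # execution time point, zero-padded "HH:MM" (24-hour)

def execution_order(packet):
    """Order the instructions in a data packet from earliest to latest
    execution time point, as the receiving device would execute them.
    Zero-padded "HH:MM" strings sort lexicographically in time order."""
    return sorted(packet, key=lambda ins: ins.exec_time)

# The data packet from the example: play video clip A at 14:20,
# then display picture A at 14:30.
packet = [
    ControlInstruction("picture_a", "display", "14:30"),
    ControlInstruction("video_clip_a", "play", "14:20"),
]
ordered = execution_order(packet)
```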
In the related art, the server generally sends control instructions to the AR device and the venue device in real time, so as to control both devices in real time. However, there is a time difference between when a control instruction reaches the AR device and when it reaches the venue device (for example, if the server sends a control instruction to both devices at the same moment, but the instruction takes 2 seconds to reach the AR device and 1 second to reach the venue device, the venue device receives it first and performs the corresponding operation first), so it is difficult for the two devices to output multimedia resources synchronously according to such instructions. In the embodiments of this application, based on the interaction between the server and the AR system described above, the server sends control instructions carrying execution time points to the AR device and the venue device in advance, so that both devices can later execute the instructions synchronously at the scheduled execution time points. This achieves synchronous output of the multimedia resources, solves the synchronization problem, and improves the presentation effect of the AR image output by the AR device.
For example, the control instruction a sent to the AR device includes video clip A to be played at 14:20, and the data packet sent by the server to the venue device may include a control instruction c that includes picture B to be displayed at 14:20; at 14:20, the AR device and the venue device then play video clip A and display picture B respectively, thereby outputting the multimedia resources synchronously.
In the methods disclosed in the above embodiments, the server may send corresponding data packets to the AR device and the venue device, where each data packet may include one or more control instructions, and each control instruction includes a multimedia resource, an execution mode for the multimedia resource, and an execution time point; the AR device and/or the venue device can then process the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the control instruction. Because the server sends the control instructions, together with their execution time points, in advance, the AR device and the venue device can execute them synchronously at the scheduled time points, which solves the synchronization problem of the related art and improves the presentation effect of the AR image output by the AR device.
Referring to fig. 3A, fig. 3A is a schematic flowchart of another control method of an AR system disclosed in an embodiment of this application. The method may be applied to the above server or to another execution entity, which is not limited here. The AR system includes at least an AR device and a venue device, the venue device being arranged in a real venue, and the method may include the following steps:
302. Send corresponding data packets to the AR device and the venue device respectively, where each data packet includes one or more control instructions, each control instruction includes a multimedia resource, an execution mode for the multimedia resource, and an execution time point, and the control instructions cause the AR device and/or the venue device to process the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the instruction.
The venue device is added to the AR system so that the multimedia resources it outputs are displayed in coordination with those output by the AR device, thereby improving the presentation effect of the AR image. To this end, as an optional implementation, if a first multimedia resource included in a first control instruction matches a second multimedia resource included in a second control instruction, the execution time point of the first control instruction may be the same as that of the second control instruction; here the first control instruction is any control instruction in a first data packet (the data packet sent to the AR device), and the second control instruction is any control instruction in a second data packet (the data packet sent to the venue device).
For example, the multimedia resource in the first control instruction sent to the AR device is a character video, the multimedia resource in the second control instruction sent to the venue device is a background video, and the two are to be output in coordination; the execution time points of the two instructions may therefore be the same, so that the AR device and the venue device output the character video and the background video synchronously to achieve a coordinated display.
With this method, matched multimedia resources that must be output together are given the same execution time point, so that the AR device and the venue device output them synchronously, achieving the coordination effect and improving the presentation effect of the AR image of the AR device.
In practice, the AR device and the venue device are usually at different distances from the user's eyes. For example, AR glasses are worn on the bridge of the user's nose, while the venue device stands on the ground around the user, so the venue device is farther from the user's eyes than the AR device. The time it takes for an output multimedia resource to travel from a device to the user's eyes is positively correlated with the distance between the device and the user's eyes: the greater the distance, the longer the travel time. Consequently, if the AR device and the venue device output their matched multimedia resources at the same moment, the two resources reach the user's eyes one after the other because of the different distances, and the user cannot perceive them simultaneously.
Therefore, as another optional implementation, if a first multimedia resource included in a first control instruction matches a second multimedia resource included in a second control instruction, the execution time points of the two instructions may differ by a target duration, where the target duration is the difference between a first duration and a second duration: the first duration is the time needed for the first multimedia resource to travel from the AR device to the user's eyes, and the second duration is the time needed for the second multimedia resource to travel from the venue device to the user's eyes.
Optionally, if the first duration is less than the second duration, the execution time point of the first control instruction may be later than that of the second control instruction by the target duration; if the first duration is greater than the second duration, the execution time point of the first control instruction may be earlier than that of the second control instruction by the target duration; and if the two durations are equal, the two execution time points may be the same.
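The scheduling rule for the three cases above can be sketched as follows (a sketch only; times are expressed as plain seconds, and the function name is an assumption, not from the disclosure):

```python
def offset_execution_times(base_time, first_duration, second_duration):
    """Return (first_exec_time, second_exec_time) for a matched pair of
    control instructions so that both multimedia resources reach the
    user's eyes at the same moment.  first_duration is the travel time
    from the AR device to the eyes, second_duration the travel time from
    the venue device to the eyes; all values are in seconds."""
    target = abs(first_duration - second_duration)  # the target duration
    if first_duration < second_duration:
        # AR output arrives faster: execute the first instruction later
        return base_time + target, base_time
    if first_duration > second_duration:
        # AR output arrives slower: execute the first instruction earlier
        return base_time, base_time + target
    return base_time, base_time  # equal durations: same execution time point
```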
Optionally, the first duration may be determined from a first distance between the AR device and the user's eyes, and the second duration from a second distance between the venue device and the user's eyes.
Optionally, the server may determine the second distance between the venue device and the user's eyes from the user's height and the distance between the user and the venue device.
Optionally, the server may square the user's height to obtain a first result, square the distance between the user and the venue device to obtain a second result, sum the first and second results to obtain a third result, and take the square root of the third result as the second distance.
Referring to fig. 3B, fig. 3B is a schematic diagram of a second-distance acquisition method disclosed in an embodiment of this application. Assume the height of the user 130 is x and the distance between the user 130 and the venue device 1102 is y. Since the user 130 normally stands approximately perpendicular (close to 90 degrees) to the ground 140, the second distance z can be determined by the Pythagorean theorem as:
z = √(x² + y²)
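The same calculation in code (a sketch; the metre values in the usage line are illustrative, not from the disclosure):

```python
import math

def second_distance(height, ground_distance):
    """Second distance z between the venue device and the user's eyes,
    per the Pythagorean theorem above: square the height (first result),
    square the ground distance (second result), sum them (third result),
    and take the square root of the sum."""
    return math.sqrt(height ** 2 + ground_distance ** 2)

# e.g. a 1.8 m tall user standing 2.4 m from the venue device
z = second_distance(1.8, 2.4)
```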
With this method, the execution time points of the matched first and second multimedia resources are set according to the difference between the times their outputs take to reach the user's eyes from the AR device and the venue device respectively. This cancels the effect of the travel times, ensures that the two resources reach the user's eyes simultaneously, achieves the intended coordination, and improves the presentation effect of the AR image output by the AR device.
In yet another embodiment, after the server sends the corresponding data packets to the AR device and the field device, if it is detected that the AR device moves, a third time period required for the multimedia resource to be transmitted from the AR device to the eyes of the user and a fourth time period required for the multimedia resource to be transmitted from the field device to the eyes of the user may be re-determined, and then the interval time period may be determined according to the third time period and the fourth time period.
The server can then re-determine the execution time point of each control instruction in the first data packet sent to the AR device according to the interval duration to obtain a third data packet; and re-determining the execution time points of the control instructions in the second data packet sent to the field device according to the interval duration to obtain a fourth data packet, so that the execution time points of the third control instruction and the fourth control instruction are different from each other by the interval duration, the third control instruction is any one control instruction in the third data packet, the fourth control instruction is any one control instruction in the fourth data packet, and the third control instruction is matched with the fourth control instruction.
The server may then send the third data packet to the AR device to cause the AR device to replace the first data packet with the third data packet and send the fourth data packet to the venue device to cause the venue device to replace the second data packet with the fourth data packet.
Optionally, the server may obtain a first real-time location of the AR device, a second real-time location of the venue device, and a third real-time location of the user. When the first real-time location shows that the AR device has moved, the server can determine the third duration required for a multimedia resource to travel from the AR device to the user's eyes from the first and third real-time locations, and the fourth duration required for a multimedia resource to travel from the venue device to the user's eyes from the second and third real-time locations.
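The patent does not say how the durations are derived from the locations; as one hedged illustration, both could be treated as simple distance-over-speed propagation delays (using the speed of sound as an illustrative speed for venue audio, and 2-D coordinates in metres — all assumptions):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, an illustrative propagation speed (assumption)

def distance(p, q):
    """Euclidean distance between two (x, y) positions in metres."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def propagation_delay(source_pos, user_pos, speed=SPEED_OF_SOUND):
    """Duration for output emitted at source_pos to reach the user."""
    return distance(source_pos, user_pos) / speed

def interval_duration(ar_pos, venue_pos, user_pos):
    """Interval duration = third duration (AR device -> user's eyes)
    minus fourth duration (venue device -> user's eyes)."""
    third = propagation_delay(ar_pos, user_pos)
    fourth = propagation_delay(venue_pos, user_pos)
    return third - fourth
```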
By implementing the method, the server can also re-determine the third time length required by the multimedia resource to be transmitted from the AR equipment to the eyes of the user and the fourth time length required by the multimedia resource to be transmitted from the field equipment to the eyes of the user when the AR equipment is detected to move, and further adjust the execution time point of the control instruction needing to be cooperatively output according to the third time length and the fourth time length, so as to ensure that the multimedia resource needing to be synchronously output can be synchronously output to the eyes of the user, achieve the expected cooperation effect, and further improve the expression effect of the AR image output by the AR equipment.
304. Synchronizing system times of the server, the AR device, and the venue device.
In the embodiment of the application, the server can take its current system time as the standard system time and send it to the AR device and the venue device respectively; after receiving the standard system time, the AR device and the venue device can each adjust their own system time to the standard system time, thereby synchronizing the system times of the server, the AR device, and the venue device.
In another embodiment, the server may synchronize the system times of the server, the AR device, and the venue device through a time synchronization protocol. Optionally, the time synchronization protocol may include the Network Time Protocol (NTP), the Precision Time Protocol (PTP), and the like, which is not limited herein.
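For illustration, the clock-offset and round-trip-delay calculations at the heart of NTP-style synchronization (RFC 5905) can be sketched as follows; the variable names are ours, not the patent's:

```python
def ntp_offset(t0, t1, t2, t3):
    """NTP-style clock offset. t0/t3 are the client's send/receive
    timestamps (client clock); t1/t2 are the server's receive/send
    timestamps (server clock). Positive => client clock is behind."""
    return ((t1 - t0) + (t2 - t3)) / 2.0

def round_trip_delay(t0, t1, t2, t3):
    """Network round-trip time, excluding server processing time."""
    return (t3 - t0) - (t2 - t1)

# Example: client clock 5 s behind the server, 0.1 s delay each way
t0, t1, t2, t3 = 10.0, 15.1, 15.2, 10.3
offset = ntp_offset(t0, t1, t2, t3)  # ~5.0 s: client adds this to its clock
```

A device (AR or venue) acting as NTP client would add the computed offset to its own clock, achieving the synchronization step 304 describes.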
Further, after receiving the data packet, the AR device may process the corresponding multimedia resource according to the execution mode corresponding to the control instruction when the system time of the AR device matches the execution time point indicated by the control instruction included in the data packet.
Optionally, after receiving the data packet, the site device may process the corresponding multimedia resource according to the execution mode corresponding to the control instruction when the system time of the site device matches the execution time point indicated by the control instruction included in the data packet.
By implementing the method, the system time of the server, the AR equipment and the site equipment can be synchronized, and further, the AR equipment and the site equipment can execute the control command by taking the system time of the AR equipment and the site equipment as a standard, so that the control commands with the same execution time point can be synchronously executed, and the AR equipment and the site equipment can achieve the effect of synchronously outputting the multimedia resources.
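The device-side behaviour described above — executing each control instruction once the synchronized system time reaches its execution time point — can be sketched minimally as below. The `InstructionScheduler` class and its interface are assumptions, not part of the patent:

```python
import heapq
import time

class InstructionScheduler:
    """Executes queued control instructions when the device's
    (synchronized) system time reaches each execution time point."""

    def __init__(self, clock=time.time):
        self._queue = []     # min-heap ordered by execution time point
        self._clock = clock  # injectable clock, useful for testing

    def schedule(self, execution_time, action):
        # id(action) breaks ties so actions never get compared directly
        heapq.heappush(self._queue, (execution_time, id(action), action))

    def run_due(self):
        """Run every instruction whose time point has arrived."""
        executed = []
        while self._queue and self._queue[0][0] <= self._clock():
            _, _, action = heapq.heappop(self._queue)
            executed.append(action())
        return executed
```

Because both devices compare against their own (already synchronized) clocks, instructions sharing an execution time point fire together without further server round-trips.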
By implementing the methods disclosed in the above embodiments, the server can send control instructions carrying execution time points to the AR device and the venue device in advance, so that both devices execute the instructions synchronously at the scheduled time points and output the multimedia resources synchronously, which solves the problem that the multimedia resources output by the AR device and the venue device are difficult to synchronize and improves the presentation effect of the AR image output by the AR device. The execution time points of multimedia resources to be output in coordination can be set to the same value, so that the two devices output them synchronously and the intended coordination effect is achieved. The execution time points of the first and second multimedia resources can also be set according to the difference between the times the resources take to reach the user's eyes, eliminating the influence of transmission duration and ensuring that they arrive synchronously. Finally, the system times of the server, the AR device, and the venue device can be synchronized, so that each device executes control instructions against its own system time and instructions with the same execution time point are executed synchronously, allowing the AR device and the venue device to output the multimedia resources in step.
Referring to fig. 4, fig. 4 is a schematic flowchart illustrating a control method of an AR system according to another embodiment of the present disclosure, where the method may be applied to the server or other execution entities, and is not limited herein. The AR system at least comprises AR equipment and site equipment, wherein the site equipment is arranged in a real site, and the method can comprise the following steps:
402. when the real-time position of the AR equipment is determined to be within the first range of the site equipment, corresponding data packets are respectively sent to the AR equipment and the site equipment, each data packet comprises one or more control instructions, each control instruction comprises a multimedia resource, an execution mode of the multimedia resource and an execution time point, and the control instructions are used for enabling the AR equipment and/or the site equipment to process the corresponding multimedia resource at the execution time point indicated by the control instruction according to the corresponding execution mode.
In this embodiment of the application, the AR device may be provided with a positioning module, including but not limited to a positioning module using the BeiDou Navigation Satellite System and a positioning module using the Global Positioning System (GPS), which is not limited herein. The server can determine the real-time location of the AR device from the location information fed back by the positioning module.
It is understood that, to improve the presentation effect of the AR image output by the AR device, the AR device and the venue device are usually used together; if they are too far apart, they cannot cooperate, so there is no need to send data packets to them. Therefore, the server can send the corresponding data packets to the AR device and the venue device only when the real-time location of the AR device is determined to be within the first range of the venue device, avoiding the situation where data packets are delivered but cannot be used in time, which saves both device power consumption and device storage space.
Alternatively, the first range may be set by a developer according to a large amount of development data, or may be set by a user, which is not limited herein. The first range may be a circular range centered on the location of the field device or a rectangular range, which is not limited herein.
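The two range shapes mentioned above can be checked with elementary geometry; this sketch assumes 2-D positions in metres (the patent does not specify a coordinate system):

```python
import math

def in_circular_range(device_pos, venue_pos, radius):
    """True if the AR device lies within a circular first range
    centred on the venue device's location."""
    return math.hypot(device_pos[0] - venue_pos[0],
                      device_pos[1] - venue_pos[1]) <= radius

def in_rectangular_range(device_pos, corner_min, corner_max):
    """True if the AR device lies inside an axis-aligned rectangular
    first range given by its min and max corners."""
    x, y = device_pos
    return (corner_min[0] <= x <= corner_max[0]
            and corner_min[1] <= y <= corner_max[1])
```

The server would call one of these on each location update before deciding whether to send the data packets.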
As another optional implementation, the server may send corresponding data packets to the AR device and the venue device respectively when determining that the real-time location of the AR device matches a first location, where the first location matches a multimedia resource to be output by the venue device, and the first location is within a first range of the venue device.
It will be appreciated that the optimal viewing or listening position relative to a venue device may differ depending on the multimedia resource being output. For example, if the venue device plays music at a high volume, the coordination effect is better when the AR device is farther from the venue device; conversely, if the volume is low, the coordination effect is better when the AR device is closer to the venue device.
In this regard, the server may determine one or more first locations within a first range of the venue device, each first location matching one or more multimedia assets; the server can determine a corresponding first position according to the multimedia resource to be output by the field device, and then the server can respectively send corresponding data packets to the AR device and the field device when determining that the real-time position of the AR device is matched with the first position.
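A hypothetical sketch of the first-position lookup follows; the mapping, resource names, and distance tolerance are invented for illustration only and are not taken from the patent:

```python
import math

# Assumed mapping: each first position lists the resources it suits,
# e.g. a loud piece maps to a position farther from the loudspeaker.
FIRST_POSITIONS = {
    (30.0, 0.0): {"loud_overture"},
    (5.0, 0.0): {"quiet_narration"},
}

def matching_first_position(resource_id, positions=FIRST_POSITIONS):
    """Return the first position matched to the resource the venue
    device is about to output, or None if nothing matches."""
    for pos, resources in positions.items():
        if resource_id in resources:
            return pos
    return None

def at_first_position(real_time_pos, first_pos, tolerance=2.0):
    """True once the AR device is within `tolerance` metres of the
    first position (tolerance value is an assumption)."""
    return math.hypot(real_time_pos[0] - first_pos[0],
                      real_time_pos[1] - first_pos[1]) <= tolerance
```

When `at_first_position` becomes true for the position returned by `matching_first_position`, the server would send the data packets.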
In an embodiment, the server may send a request for obtaining a multimedia resource to be output to the site device, receive the multimedia resource to be output sent by the site device, and further determine the first position according to the multimedia resource to be output.
Optionally, the field device may receive the multimedia resource reservation information sent by the AR device, and further determine the multimedia resource to be output according to the multimedia resource reservation information. For example, if the reservation information sent by the AR device is a first stage play, the venue device may use background music of the first stage play as the multimedia resource to be output.
By implementing the method, when the AR device is at the first position, the corresponding data packets can be respectively sent to the AR device and the site device, so that the AR device and the site device execute subsequent operations; the first position is the best position for watching or listening to the multimedia resources output by the field equipment, so that the matching effect of the multimedia resources output by the field equipment and the AR equipment can be improved, and the expression effect of the AR images output by the AR equipment can be improved.
In an embodiment, after sending the corresponding data packets to the AR device and the venue device, respectively, if it is determined that the real-time location of the AR device is not within the first range of the venue device, the server may send a deletion instruction to the AR device and the venue device, where the deletion instruction is used to cause the AR device and the venue device to delete the received data packets.
By implementing this method, when the AR device is determined to have left the venue device, it can be assumed by default that the AR device no longer needs to cooperate with the venue device, and the data packets in the AR device and the venue device can be deleted, saving the storage space of both devices.
In another embodiment, after sending the corresponding data packets to the AR device and the venue device, if it is determined that the real-time location of the AR device is not within the first range of the venue device for a fifth duration (a specific value may be set by a developer according to a great deal of development experience, and a typical value may be 10 minutes, 15 minutes, and the like, which is not limited herein), the server may send a deletion instruction to the AR device and the venue device, where the deletion instruction is used to cause the AR device and the venue device to delete the received data packets.
By implementing this method, only when the AR device has been outside the first range of the venue device for long enough is it assumed that the AR device will no longer cooperate with the venue device. This avoids the situation where the AR device re-enters the first range shortly afterwards and the server has to resend the data packets, which saves device power consumption, spares the AR device from having to receive the data packets again before it can continue its task, and improves the user experience.
In another embodiment, the delete instruction is further configured to cause the venue device to determine whether second multimedia resource reservation information sent by another AR device has been received. If such reservation information has been received, and the multimedia resource it corresponds to is the same as a multimedia resource included in the data packet the venue device received, the venue device stores the received data packet for a sixth duration (a specific value may be set by a developer according to extensive development experience; typical values are 2 minutes, 5 minutes, and the like, which is not limited herein). If the other AR device is not detected within the first range of the venue device before the sixth duration elapses, the venue device deletes the received data packet; if the other AR device is detected within the first range within the sixth duration, the venue device outputs the multimedia resource according to the received data packet.
For example, suppose the venue device previously received a data packet whose multimedia resource is video clip C and then receives a delete instruction. If the venue device has received reservation information from another AR device reserving video clip C, it may refrain from deleting the data packet; if that AR device subsequently moves within the first range of the venue device, the venue device outputs video clip C, and if it does not, the venue device deletes the data packet including video clip C.
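The venue-device decision described above can be condensed into a single predicate; the function name, parameters, and grace-period value in the test are assumptions made for illustration:

```python
def should_delete_packet(reserved_resource, packet_resource,
                         elapsed, grace_period, other_ar_in_range):
    """Venue-device decision after a delete instruction arrives:
    keep the packet for a grace period (the 'sixth duration') when
    another AR device has reserved the same resource, then delete it
    unless that device actually entered the first range."""
    if reserved_resource != packet_resource:
        return True                 # no matching reservation: delete now
    if other_ar_in_range:
        return False                # reserved and device arrived: keep it
    return elapsed >= grace_period  # delete once the grace period lapses
```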
By implementing this method, the venue device can avoid repeatedly downloading the same data packet, thereby saving device power consumption; it can also quickly output the reserved multimedia resource in coordination with the other AR device, which speeds up the response and improves the user experience.
By implementing the methods disclosed in the above embodiments, the server can send control instructions carrying execution time points to the AR device and the venue device in advance, so that both devices execute the instructions synchronously at the scheduled time points and output the multimedia resources synchronously, which solves the problem that the multimedia resources output by the AR device and the venue device are difficult to synchronize and improves the presentation effect of the AR image output by the AR device. When the AR device is at the first position, the corresponding data packets can be sent to the AR device and the venue device respectively, so that they execute the subsequent operations; since the first position is the optimal position for watching or listening to the multimedia resource output by the venue device, the coordination effect of the resources output by the venue device and the AR device is improved, and with it the presentation effect of the AR image. Moreover, the server sends the corresponding data packets only when the real-time location of the AR device is determined to be within the first range of the venue device, avoiding the situation where data packets are delivered but cannot be used in time, which saves both device power consumption and device storage space.
Referring to fig. 5, fig. 5 is a schematic structural diagram of a control device of an AR system according to an embodiment of the present disclosure, where the control device may be applied to the server or other execution bodies, and is not limited herein. This AR system includes AR equipment and place equipment at least, and this place equipment sets up in the reality place, and the device can include sending element 501, wherein:
a sending unit 501, configured to send corresponding data packets to the AR device and the site device, where each data packet includes one or more control instructions, each control instruction includes a multimedia resource, an execution mode of the multimedia resource, and an execution time point, and the control instruction is used to enable the AR device and/or the site device to process the corresponding multimedia resource at the execution time point indicated by the control instruction according to the corresponding execution mode.
By implementing the device, the server can send corresponding data packets to the AR device and the venue device respectively, where each data packet can include one or more control instructions, and each control instruction includes a multimedia resource, an execution mode of the multimedia resource, and an execution time point; the AR device and/or the venue device can then process the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the control instruction. It can be understood that in the related art the server usually sends control instructions to the AR device and the venue device in real time, but because the instructions reach the two devices at different times, it is difficult for the devices to output the multimedia resources synchronously. In the embodiment of the application, the server sends control instructions carrying execution time points to the devices in advance, so that both devices execute the instructions synchronously at the scheduled time points and output the multimedia resources synchronously, which solves the synchronization problem and improves the presentation effect of the AR image output by the AR device.
As an alternative embodiment, the apparatus shown in fig. 5 may further include a synchronization unit, not shown, wherein:
the system comprises a synchronization unit, a data packet processing unit and a data packet processing unit, wherein the synchronization unit is used for synchronizing the system time of the server, the AR equipment and the site equipment after sending corresponding data packets to the AR equipment and the site equipment respectively;
the control instruction is used for enabling the AR equipment to process the corresponding multimedia resource according to the execution mode corresponding to the control instruction when the system time of the AR equipment is matched with the execution time point indicated by the control instruction;
and the control instruction is used for enabling the field equipment to process the corresponding multimedia resource according to the execution mode corresponding to the control instruction when the system time of the field equipment is matched with the execution time point indicated by the control instruction.
By implementing the device, the system time of the server, the AR equipment and the field equipment can be synchronized, and further, the AR equipment and the field equipment can execute the control command by taking the system time of the AR equipment and the field equipment as a standard, so that the control commands with the same execution time point can be synchronously executed, and the AR equipment and the field equipment can achieve the effect of synchronously outputting the multimedia resources.
As an optional implementation manner, the sending unit 501 is further configured to send, when it is determined that the real-time location of the AR device is within the first range of the venue device, corresponding data packets to the AR device and the venue device, where each data packet includes one or more control instructions, each control instruction includes a multimedia resource, an execution mode of the multimedia resource, and an execution time point, and the control instruction is used to enable the AR device and/or the venue device to process the corresponding multimedia resource according to the corresponding execution mode at the execution time point indicated by the control instruction.
By implementing the device, the server can respectively send the corresponding data packets to the AR equipment and the field equipment when the real-time position of the AR equipment is determined to be within the first range of the field equipment, so that the situation that the data packets are sent to the AR equipment and the field equipment but cannot be used in time is avoided, the power consumption of the equipment is saved, and the storage space of the equipment can be saved.
As an optional implementation manner, the sending unit 501 is further configured to send corresponding data packets to the AR device and the venue device respectively when it is determined that the real-time location of the AR device matches the first location, where each data packet includes one or more control instructions, each control instruction includes a multimedia resource, an execution manner of the multimedia resource, and an execution time point, and the control instruction is used to enable the AR device and/or the venue device to process the corresponding multimedia resource according to the corresponding execution manner at the execution time point indicated by the control instruction, where the first location matches the multimedia resource to be output by the venue device, and the first location is within a first range of the venue device.
By implementing the device, when the AR equipment is at the first position, the corresponding data packets can be respectively sent to the AR equipment and the field equipment, so that the AR equipment and the field equipment execute subsequent operations; the first position is the best position for watching or listening to the multimedia resources output by the field equipment, so that the matching effect of the multimedia resources output by the field equipment and the AR equipment can be improved, and the expression effect of the AR images output by the AR equipment can be improved.
As an optional implementation manner, if a first multimedia resource included in the first control instruction matches a second multimedia resource included in the second control instruction, an execution time point corresponding to the first control instruction is the same as an execution time point corresponding to the second control instruction, the first control instruction is any one control instruction in a first data packet, the first data packet is a data packet sent to the AR device, the second control instruction is any one control instruction in a second data packet, and the second data packet is a data packet sent to the site device.
By implementing the device, the execution time points of the multimedia resources which are output in a matching way can be set to be the same, so that the AR equipment and the field equipment can synchronously output the multimedia resources which need to be output in a matching way, the matching effect is achieved, and the representation effect of the AR images of the AR equipment is improved.
As an optional implementation manner, if a first multimedia resource included in the first control instruction is matched with a second multimedia resource included in the second control instruction, a difference between an execution time point corresponding to the first control instruction and an execution time point corresponding to the second control instruction is a target time length, where the target time length is a time difference between the first time length and the second time length, the first time length is a time length required for the first multimedia resource to be transmitted from the AR device to the eyes of the user, and the second time length is a time length required for the second multimedia resource to be transmitted from the field device to the eyes of the user.
By implementing the device, the execution time points of the first multimedia resource and the second multimedia resource that need to be output in coordination can be set according to the difference between the times the resources output by the AR device and the venue device take to reach the user's eyes, so as to eliminate the influence of transmission duration, ensure that the first and second multimedia resources reach the user's eyes synchronously, achieve the expected coordination effect, and further improve the presentation effect of the AR image output by the AR device.
The embodiment of the present application further discloses a control system, which may include an AR system and a server, where the AR system at least includes an AR device and a site device, where:
the system comprises a server and a site device, wherein the server is used for respectively sending corresponding data packets to the AR device and the site device, each data packet comprises one or more control instructions, and each control instruction comprises a multimedia resource, an execution mode of the multimedia resource and an execution time point;
the AR equipment is used for processing the corresponding multimedia resources according to the corresponding execution mode at the execution time point indicated by the control instruction;
and the field equipment is used for processing the corresponding multimedia resources according to the corresponding execution mode at the execution time point indicated by the control instruction.
By implementing the control system, the server can send corresponding data packets to the AR device and the venue device respectively, where each data packet can include one or more control instructions, and each control instruction includes a multimedia resource, an execution mode of the multimedia resource, and an execution time point; the AR device and/or the venue device can then process the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the control instruction. It can be understood that in the related art the server usually sends control instructions to the AR device and the venue device in real time, but because the instructions reach the two devices at different times, it is difficult for the devices to output the multimedia resources synchronously. In the embodiment of the application, the server sends control instructions carrying execution time points to the devices in advance, so that both devices execute the instructions synchronously at the scheduled time points and output the multimedia resources synchronously, which solves the synchronization problem and improves the presentation effect of the AR image output by the AR device.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
As shown in fig. 6, the electronic device may include:
a memory 601 in which executable program code is stored;
a processor 602 coupled to a memory 601;
the processor 602 calls the executable program code stored in the memory 601 to execute the control method of the AR system disclosed in the above embodiments.
An embodiment of the present application discloses a computer-readable storage medium storing a computer program, wherein the computer program causes a computer to execute the control method of the AR system disclosed in each of the above embodiments.
The embodiment of the present application also discloses an application publishing platform, wherein the application publishing platform is used for publishing a computer program product, and when the computer program product runs on a computer, the computer is caused to execute part or all of the steps of the method in the above method embodiments.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Those skilled in the art should also appreciate that the embodiments described in this specification are all alternative embodiments and that the acts and modules involved are not necessarily required for this application.
In various embodiments of the present application, it should be understood that the sequence numbers of the above-mentioned processes do not imply a necessary order of execution, and the order of execution of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated units, if implemented as software functional units and sold or used as a stand-alone product, may be stored in a computer accessible memory. Based on such understanding, the technical solution of the present application, which is a part of or contributes to the prior art in essence, or all or part of the technical solution, may be embodied in the form of a software product, stored in a memory, including several requests for causing a computer device (which may be a personal computer, a server, a network device, or the like, and may specifically be a processor in the computer device) to execute part or all of the steps of the above-described method of the embodiments of the present application.
It will be understood by those skilled in the art that all or part of the steps in the methods of the embodiments described above may be implemented by a program instructing relevant hardware, and the program may be stored in a computer-readable storage medium. The storage medium includes a Read-Only Memory (ROM), a Random Access Memory (RAM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), a One-Time Programmable Read-Only Memory (OTPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Compact Disc Read-Only Memory (CD-ROM) or other optical disc memory, a magnetic disk memory, a tape memory, or any other computer-readable medium that can be used to carry or store data.
The AR system control method and apparatus, electronic device, and storage medium disclosed in the embodiments of the present application are described in detail above. Specific examples are used herein to explain the principle and implementation of the present application, and the description of the above embodiments is only intended to help understand the method and its core idea. Meanwhile, for a person skilled in the art, there may be variations in the specific embodiments and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. A control method for an AR system, applied to a server, the AR system comprising at least an AR device and a venue device, the venue device being disposed in a real-world venue, the method comprising:
sending corresponding data packets to the AR device and the venue device respectively, wherein each data packet comprises one or more control instructions, each control instruction comprises a multimedia resource, an execution mode of the multimedia resource, and an execution time point, and the control instruction is used to cause the AR device and/or the venue device to process the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the control instruction.
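For illustration only (not part of the claims), the data-packet structure described in claim 1 can be sketched as follows. This is a minimal Python sketch; the field names, types, and the `build_packet` helper are assumptions, not drawn from the patent itself:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ControlInstruction:
    multimedia_resource: str     # resource identifier, e.g. a URI (assumed)
    execution_mode: str          # e.g. "play", "display", "loop" (assumed)
    execution_time_point: float  # absolute timestamp in seconds (assumed)

@dataclass
class DataPacket:
    control_instructions: List[ControlInstruction]

def build_packet(entries):
    """Build a data packet from (resource, mode, time_point) tuples."""
    return DataPacket([ControlInstruction(r, m, t) for r, m, t in entries])
```

A server following the claim would build one such packet per device and send it over whatever transport the deployment uses.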
2. The method of claim 1, wherein after said sending corresponding data packets to the AR device and the venue device respectively, the method further comprises:
synchronizing the system times of the server, the AR device, and the venue device;
wherein the control instruction is used to cause the AR device to process the corresponding multimedia resource in the execution mode corresponding to the control instruction when the system time of the AR device matches the execution time point indicated by the control instruction; and
the control instruction is used to cause the venue device to process the corresponding multimedia resource in the execution mode corresponding to the control instruction when the system time of the venue device matches the execution time point indicated by the control instruction.
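For illustration only (not part of the claims), the time-matching behavior in claim 2, where a device executes an instruction when its synchronized system time matches the execution time point, might look like the following sketch. The dict layout and the tolerance value are assumptions:

```python
def due_instructions(instructions, system_time, tolerance=0.05):
    """Return the instructions whose execution time point matches the
    device's current (synchronized) system time, within a tolerance."""
    return [
        ins for ins in instructions
        if abs(system_time - ins["execution_time_point"]) <= tolerance
    ]

packet = [
    {"resource": "fireworks.mp4", "mode": "play", "execution_time_point": 100.0},
    {"resource": "lights.cfg", "mode": "apply", "execution_time_point": 120.0},
]
# At synchronized time 100.02 s, only the first instruction is due.
due = due_instructions(packet, system_time=100.02)
```

Because all devices share a synchronized clock, running this check on each device yields simultaneous execution without further coordination messages.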
3. The method of claim 1, wherein said sending corresponding data packets to the AR device and the venue device respectively comprises:
sending corresponding data packets to the AR device and the venue device respectively upon determining that the real-time position of the AR device is within a first range of the venue device.
4. The method of claim 3, wherein said sending corresponding data packets to the AR device and the venue device respectively upon determining that the real-time position of the AR device is within the first range of the venue device comprises:
sending corresponding data packets to the AR device and the venue device respectively upon determining that the real-time position of the AR device matches a first position, wherein the first position matches a multimedia resource to be output by the venue device, and the first position is within the first range of the venue device.
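For illustration only (not part of the claims), the range check underlying claims 3 and 4 can be sketched as a planar distance test. The 2-D coordinates and Euclidean distance are assumptions; the patent does not specify how the first range is measured:

```python
import math

def within_first_range(ar_position, venue_position, first_range):
    """True when the AR device's real-time position lies within the
    venue device's first range (planar Euclidean distance, assumed)."""
    dx = ar_position[0] - venue_position[0]
    dy = ar_position[1] - venue_position[1]
    return math.hypot(dx, dy) <= first_range
```

The server would evaluate this predicate against the AR device's reported position and only then dispatch the data packets.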
5. The method according to any one of claims 1 to 4, wherein if a first multimedia resource included in a first control instruction matches a second multimedia resource included in a second control instruction, the execution time point corresponding to the first control instruction is the same as the execution time point corresponding to the second control instruction, wherein the first control instruction is any control instruction in a first data packet, the first data packet being a data packet sent to the AR device, the second control instruction is any control instruction in a second data packet, and the second data packet being a data packet sent to the venue device.
6. The method of claim 5, wherein if a first multimedia resource included in a first control instruction matches a second multimedia resource included in a second control instruction, the execution time point corresponding to the first control instruction differs from the execution time point corresponding to the second control instruction by a target duration, wherein the target duration is the time difference between a first duration and a second duration, the first duration being the time required for the first multimedia resource to travel from the AR device to the user's eyes, and the second duration being the time required for the second multimedia resource to travel from the venue device to the user's eyes.
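For illustration only (not part of the claims), the latency compensation of claim 6 can be sketched as staggering the two execution time points by the target duration so that both resources reach the user's eyes at the same moment. The function name and the specific latency values are assumptions:

```python
def staggered_time_points(base_time, ar_latency, venue_latency):
    """Offset the venue device's execution time point by the target
    duration (difference of the two transmission durations) so that
    AR content and venue content reach the user's eyes together.

    ar_latency: time for AR output to reach the user's eyes (assumed)
    venue_latency: time for venue output to reach the user's eyes (assumed)
    """
    target_duration = ar_latency - venue_latency
    ar_time = base_time
    # Equal arrival: ar_time + ar_latency == venue_time + venue_latency.
    venue_time = base_time + target_duration
    return ar_time, venue_time
```

With `ar_latency=0.05` and `venue_latency=0.02`, the venue device starts 30 ms later than the AR device, and both outputs arrive at the user's eyes at `base_time + 0.05`.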
7. A control apparatus for an AR system, applied to a server, the AR system comprising at least an AR device and a venue device, the venue device being disposed in a real-world venue, the apparatus comprising:
a sending unit configured to send corresponding data packets to the AR device and the venue device respectively, wherein each data packet comprises one or more control instructions, each control instruction comprises a multimedia resource, an execution mode of the multimedia resource, and an execution time point, and the control instruction is used to cause the AR device and/or the venue device to process the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the control instruction.
8. An electronic device, comprising a memory storing executable program code and a processor coupled to the memory, wherein the processor calls the executable program code stored in the memory to execute the method according to any one of claims 1 to 6.
9. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 6.
10. A control system, comprising an AR system and a server, the AR system comprising at least an AR device and a venue device, wherein:
the server is configured to send corresponding data packets to the AR device and the venue device respectively, wherein each data packet comprises one or more control instructions, and each control instruction comprises a multimedia resource, an execution mode of the multimedia resource, and an execution time point;
the AR device is configured to process the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the control instruction; and
the venue device is configured to process the corresponding multimedia resource in the corresponding execution mode at the execution time point indicated by the control instruction.
CN202210097349.6A 2022-01-26 2022-01-26 AR system control method and device, electronic device and storage medium Pending CN114518903A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210097349.6A CN114518903A (en) 2022-01-26 2022-01-26 AR system control method and device, electronic device and storage medium


Publications (1)

Publication Number Publication Date
CN114518903A 2022-05-20

Family

ID=81597060

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210097349.6A Pending CN114518903A (en) 2022-01-26 2022-01-26 AR system control method and device, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN114518903A (en)

Similar Documents

Publication Publication Date Title
CN109032793B (en) Resource allocation method, device, terminal and storage medium
CN112258086A (en) Cross-device task relay method and device, cloud platform and storage medium
JP7443621B2 (en) Video interaction methods, devices, electronic devices and storage media
US20220103873A1 (en) Computer program, method, and server apparatus
CN104539977A (en) Live broadcast previewing method and device
CN112995759A (en) Interactive service processing method, system, device, equipment and storage medium
US8797357B2 (en) Terminal, system and method for providing augmented broadcasting service using augmented scene description data
CN108880983B (en) Real-time voice processing method and device for virtual three-dimensional space
CN112752114B (en) Method and device for generating live broadcast playback interactive message, server and storage medium
CN113518247A (en) Video playing method, related equipment and computer readable storage medium
CN108174227B (en) Virtual article display method and device and storage medium
CN111401964A (en) Advertisement putting control method, device and system
JP2020017954A (en) Method, system and non-transitory computer-readable record medium for synchronization of real-time live video and event
CN112969093A (en) Interactive service processing method, device, equipment and storage medium
CN114157893B (en) Method and device for synchronously playing videos among multiple devices
CN111901619A (en) Message pushing method and device
CN113645472B (en) Interaction method and device based on play object, electronic equipment and storage medium
CN114666671A (en) Live broadcast praise interaction method, system, device, equipment and storage medium
CN114339444A (en) Method, device and equipment for adjusting playing time of video frame and storage medium
CN114518903A (en) AR system control method and device, electronic device and storage medium
CN113727125B (en) Live broadcast room screenshot method, device, system, medium and computer equipment
CN111970268B (en) Method and device for showing spectator and fighting data and computer readable storage medium
CN112004116B (en) Method, device, electronic equipment and medium for determining object adding mode
CN114025184A (en) Video live broadcast method and electronic equipment
CN114500572B (en) Multi-device synchronization method and device, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination