CN215378902U - AR guide system - Google Patents

AR guide system

Info

Publication number: CN215378902U
Application number: CN202121499602.8U
Authority: CN (China)
Prior art keywords: audio signal, audio, signal instruction, send, transmitting device
Legal status: Active (granted)
Original language: Chinese (zh)
Inventor: 赵维奇 (Zhao Weiqi)
Current assignee: Sichuan Smart Boy Technology Co ltd
Original assignee: Sichuan Smart Boy Technology Co ltd
Application filed: 2021-06-30 by Sichuan Smart Boy Technology Co ltd
Priority date: 2021-06-30
Publication date: 2021-12-31

Classifications

  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

Embodiments of the present disclosure disclose an AR navigation system. One embodiment of the system comprises a transmitting device, a receiving device, and AR glasses. The transmitting device comprises an input device and a first voice acquisition device and is communicatively connected with the receiving device; it is configured to, in response to detecting an input operation acting on the input device, generate an audio signal instruction and send the audio signal instruction to the receiving device, and further configured to, in response to detecting that the first voice acquisition device has acquired real-time audio, send that real-time audio to the receiving device. The receiving device is communicatively connected with the AR glasses and is configured to, in response to receiving real-time audio and/or an audio signal instruction sent by the transmitting device, forward it to the AR glasses. This embodiment can provide visitors with diversified audio and video and clear real-time narration, improving the user experience and the efficiency of guided tours.

Description

AR guide system
Technical Field
Embodiments of the present disclosure relate to the technical field of navigation (tour-guide) equipment, and in particular to an AR navigation system.
Background
Navigation here refers to explaining scene-related information to visitors in a venue (e.g., a museum). Currently, the commonly used method is to pick up the guide's voice and play it back through a loudspeaker at amplified volume.
With the development of AR (Augmented Reality) technology, navigation solutions that use AR glasses as the carrier have appeared. Existing AR navigation methods usually rely on CV (Computer Vision) algorithms to identify a target object and then play corresponding AR content to the user.
However, these navigation methods generally have the following technical problems: a loudspeaker cannot provide diversified audio and video to visitors, and when there is a large crowd, visitors can hardly hear the sound it plays. In some venues, environmental restrictions may prevent the AR device from identifying the target object, so AR content cannot be presented in time, resulting in a poor user experience. In addition, in group AR navigation, the guide cannot control what content the users play, which reduces the efficiency of the tour.
SUMMARY OF THE UTILITY MODEL
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose AR navigation systems to solve the technical problems mentioned in the background section above.
Some embodiments of the present disclosure provide an AR navigation system comprising a transmitting device, a receiving device, and AR glasses, wherein the transmitting device comprises an input device and a first voice acquisition device and is communicatively connected with the receiving device. The transmitting device is configured to, in response to detecting an input operation acting on the input device, generate an audio signal instruction according to the input operation and send the audio signal instruction to the receiving device; the transmitting device is further configured to, in response to detecting that the first voice acquisition device has acquired real-time audio, send the real-time audio acquired by the first voice acquisition device to the receiving device. The receiving device is communicatively connected with the AR glasses and is configured to, in response to receiving real-time audio and/or an audio signal instruction sent by the transmitting device, send the received real-time audio and/or audio signal instruction to the AR glasses.
The above embodiments of the present disclosure have the following advantages. The AR navigation system of some embodiments of the present disclosure can provide visitors with diversified audio and video, lets visitors clearly hear the guide's real-time narration, and allows the guide to trigger functions of the navigation content (e.g., the play function) at any time. Specifically, the reasons why diversified audio and video could not otherwise be provided, why visitors could not clearly hear the guide, and why the user experience was poor are as follows: a loudspeaker cannot provide diversified audio and video to visitors, and when there is a large crowd, visitors can hardly hear the sound it plays. In some venues, environmental restrictions may prevent the AR device from identifying the target object, so AR content cannot be presented in time, resulting in a poor user experience. In addition, in group AR navigation, the guide cannot control what content the users play, which reduces the efficiency of the tour. Based on this, in the AR navigation system of some embodiments of the present disclosure, the transmitting device generates audio signal instructions and sends them to the receiving device; the transmitting device also sends the real-time audio acquired by the first voice acquisition device to the receiving device; and the receiving device sends the received real-time audio and/or audio signal instructions to the AR glasses. Visitors can then listen to the real-time audio and watch AR audio and video through the AR glasses. In this way, diversified audio and video can be provided to visitors, visitors can clearly hear the guide's real-time narration, and the guide can trigger functions of the AR glasses worn by the visitors at any time, improving the user experience and the efficiency of the guided tour.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements are not necessarily drawn to scale.
Fig. 1 is a schematic diagram of one application scenario of an AR navigation system according to some embodiments of the present disclosure;
Fig. 2 is a schematic structural diagram of some embodiments of an AR navigation system according to the present disclosure;
Fig. 3 is a schematic structural diagram of further embodiments of an AR navigation system according to the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be noted that, for convenience of description, only the relevant portions of the related inventions are shown in the drawings. The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that the modifiers "a", "an", and "the" in the present disclosure are intended to be illustrative rather than limiting, and those skilled in the art will understand that they should be read as "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 is a schematic diagram of one application scenario of an AR navigation system according to some embodiments of the present disclosure.
In the application scenario of Fig. 1, the guide can hold the transmitting device, and each visitor can wear AR glasses, where the receiving device (not shown in the figure) can be built into the AR glasses or provided separately. The guide can perform an input operation through the input device (e.g., a keypad) of the transmitting device, for example entering the number "001"; the transmitting device then sends the audio signal instruction represented by "001" to the receiving device, and the receiving device forwards it to the AR glasses so that the visitor can watch and listen to the AR audio and video identified by "001" through the AR glasses. The guide can also narrate through the first voice acquisition device (e.g., a microphone) of the transmitting device; the transmitting device then sends the guide's real-time audio to the receiving device, and the receiving device forwards it to the AR glasses, so that the visitor can listen to the guide's real-time narration through the AR glasses.
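To make the message flow of this application scenario concrete, the following minimal Python sketch models the transmitter-to-receiver-to-glasses relay described above. All names (Transmitter, Receiver, ARGlasses, AudioSignalInstruction) are hypothetical illustrations introduced here; the utility model does not prescribe any particular software structure.

```python
# Minimal sketch of the Fig. 1 message flow. All class and method names
# (Transmitter, Receiver, ARGlasses, AudioSignalInstruction) are hypothetical
# illustrations, not part of the patent text.

from dataclasses import dataclass


@dataclass
class AudioSignalInstruction:
    """An instruction carried as an audio signal, identified by a content code."""
    content_id: str  # e.g. "001" -> first AR audio/video


class ARGlasses:
    """Plays AR audio/video for a content code and relays live narration."""

    def handle_instruction(self, instruction: AudioSignalInstruction) -> None:
        print(f"[AR glasses] playing AR audio/video '{instruction.content_id}'")

    def handle_live_audio(self, audio_chunk: bytes) -> None:
        print(f"[AR glasses] playing {len(audio_chunk)} bytes of live narration")


class Receiver:
    """Forwards whatever it receives from the transmitter to the paired glasses."""

    def __init__(self, glasses: ARGlasses) -> None:
        self.glasses = glasses

    def on_instruction(self, instruction: AudioSignalInstruction) -> None:
        self.glasses.handle_instruction(instruction)

    def on_live_audio(self, audio_chunk: bytes) -> None:
        self.glasses.handle_live_audio(audio_chunk)


class Transmitter:
    """Held by the guide; turns key presses into instructions and relays the mic feed."""

    def __init__(self, receivers: list[Receiver]) -> None:
        # One transmitter may serve many receivers (one per visitor).
        self.receivers = receivers

    def on_key_input(self, code: str) -> None:
        instruction = AudioSignalInstruction(content_id=code)
        for receiver in self.receivers:
            receiver.on_instruction(instruction)

    def on_microphone_audio(self, audio_chunk: bytes) -> None:
        for receiver in self.receivers:
            receiver.on_live_audio(audio_chunk)


if __name__ == "__main__":
    glasses = [ARGlasses(), ARGlasses()]
    receivers = [Receiver(g) for g in glasses]
    transmitter = Transmitter(receivers)

    transmitter.on_key_input("001")                 # guide selects content "001"
    transmitter.on_microphone_audio(b"\x00" * 320)  # guide speaks into the microphone
```

Running the sketch simply prints the two delivery paths: a content selection ("001") and a chunk of live narration reaching every connected pair of glasses.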
It should be understood that the number of transmitting devices and AR glasses in fig. 1 is merely illustrative. There may be any number of transmitting devices and AR glasses, as desired for implementation.
With continued reference to Fig. 2, a structural schematic diagram of some embodiments of an AR navigation system according to the present disclosure is shown. As shown in Fig. 2, the AR navigation system of the present disclosure may include a transmitting device 1, a receiving device 2, and AR glasses 3.
In some embodiments, the transmitting device 1 comprises an input device 101 and a first voice acquisition device 102 and is communicatively connected with the receiving device 2. The transmitting device 1 may be configured to, in response to detecting an input operation acting on the input device 101, generate an audio signal instruction according to the input operation and transmit the audio signal instruction to the receiving device 2. Here, the transmitting device 1 may be any device capable of transmitting audio signal instructions and real-time audio, for example the wireless transmitter of a voice tour-guide unit. The input device 101 may be any device with an input function: it may be a physical key, in which case the input operation is a key press, or a touch key, in which case the input operation is a touch operation. An audio signal instruction is an instruction carried in the form of an audio signal, which may be, but is not limited to, a voice signal instruction; a voice signal instruction may be an audio signal used to trigger playback of AR audio and video. For example, the voice signal instruction may be the audio signal corresponding to the key "001", used to play the first AR audio/video. The first voice acquisition device 102 may be any device with a voice capture function, for example a microphone. The real-time audio may be the speech of the guide's live narration.
The receiving device 2 may be a device capable of receiving and transmitting audio signal instructions and real-time audio, for example a wireless receiver. The transmitting device 1 and the receiving device 2 may communicate by radio; the carrier frequency may be 220 MHz to 270 MHz, or another available band, which is not limited here. A plurality of channels may be provided between the transmitting device 1 and the receiving device 2, and the two devices may also communicate through Bluetooth, Wi-Fi, or other methods. The transmitting device 1 may further be configured to, in response to detecting that the first voice acquisition device 102 has acquired real-time audio, transmit that real-time audio to the receiving device 2. In practice, one transmitting device 1 may be communicatively connected to a plurality of receiving devices 2; the number of receiving devices 2 that can be connected is not limited here.
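The passage above defines an audio signal instruction as an instruction carried in the form of an audio signal, for example the signal corresponding to the key "001". Purely as an assumption for illustration, the sketch below encodes a keypad code as a DTMF-style dual-tone sequence; the utility model does not specify any concrete encoding, carrier, or modulation.

```python
# Hypothetical illustration of "an instruction in the form of an audio signal":
# the content code entered on the keypad ("001") is encoded as a DTMF-style
# tone sequence. The patent does not specify any particular encoding; DTMF is
# used here only as a familiar example of carrying key presses as audio.

import math

SAMPLE_RATE = 8000  # samples per second

# Standard DTMF (low, high) frequency pairs for the digits 0-9.
DTMF = {
    "1": (697, 1209), "2": (697, 1336), "3": (697, 1477),
    "4": (770, 1209), "5": (770, 1336), "6": (770, 1477),
    "7": (852, 1209), "8": (852, 1336), "9": (852, 1477),
    "0": (941, 1336),
}


def encode_code_as_audio(code: str, tone_ms: int = 80, gap_ms: int = 40) -> list[float]:
    """Return PCM samples (floats in [-1, 1]) encoding each digit as a dual tone."""
    samples: list[float] = []
    for digit in code:
        low, high = DTMF[digit]
        for n in range(int(SAMPLE_RATE * tone_ms / 1000)):
            t = n / SAMPLE_RATE
            samples.append(0.5 * math.sin(2 * math.pi * low * t)
                           + 0.5 * math.sin(2 * math.pi * high * t))
        samples.extend([0.0] * int(SAMPLE_RATE * gap_ms / 1000))  # silence between digits
    return samples


if __name__ == "__main__":
    instruction_audio = encode_code_as_audio("001")
    print(f"encoded '001' as {len(instruction_audio)} audio samples")
```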
In some embodiments, the receiving device 2 is communicatively connected with the AR glasses 3. The receiving device 2 may be configured to, in response to receiving real-time audio and/or an audio signal instruction sent by the transmitting device 1, send the received real-time audio and/or audio signal instruction to the AR glasses 3. Here, the AR glasses 3 may be a head-mounted display device with audio playback and virtual image display functions, for example head-mounted AR glasses.
Optionally, the transmitting device 1 further comprises a first display screen for displaying identification information of the AR audio and video being played in the AR glasses. The identification information may be any information that uniquely identifies the AR audio/video being played; for example, it may be "001", corresponding to the key "001". In this way, the identification information of the AR audio and video currently being played can be shown on the first display screen of the transmitting device.
Optionally, the receiving device 2 further comprises an audio output interface or an audio playing device, the audio output interface being used to connect an audio output device. For example, the audio output interface may be a 3.5 mm jack for connecting headphones, and the audio output device may be a wired headset. It will be understood that a receiving device 2 that includes an audio output interface or an audio playing device is a separate device communicatively connected to the AR glasses 3. Therefore, when the receiving device is not connected to AR glasses, the connected audio output device or the built-in audio playing device can directly play the guide's real-time audio and/or the audio corresponding to an audio signal instruction.
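A minimal sketch of this optional fallback, with hypothetical names: it only illustrates the routing decision (forward to paired AR glasses if connected, otherwise play through the wired headset on the audio output interface or through the built-in audio playing device).

```python
# Hypothetical sketch of the optional behaviour described above: when no AR
# glasses are connected, a standalone receiver plays the audio itself through
# its audio output interface (e.g. a 3.5 mm headphone jack) or a built-in
# speaker. Class and method names are illustrative, not from the patent.

from typing import Optional


class StandaloneReceiver:
    def __init__(self, glasses_link: Optional[object] = None,
                 has_headphones: bool = False) -> None:
        self.glasses_link = glasses_link      # None when no AR glasses are paired
        self.has_headphones = has_headphones  # wired headset on the 3.5 mm jack

    def on_audio(self, audio_chunk: bytes) -> None:
        if self.glasses_link is not None:
            print("forwarding audio to the paired AR glasses")
        elif self.has_headphones:
            print("playing audio through the wired headset on the audio output interface")
        else:
            print("playing audio through the built-in audio playing device")


if __name__ == "__main__":
    StandaloneReceiver(glasses_link=None, has_headphones=True).on_audio(b"\x00" * 160)
```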
Optionally, the input device 101 may include a pause play key and a continue play key. The transmitting device 1 may be further configured to, in response to detecting a pause play operation acting on the pause play key, generate an audio signal instruction representing pause of playback and send it to the receiving device 2; and, in response to detecting a continue play operation acting on the continue play key, generate an audio signal instruction representing continuation of playback and send it to the receiving device 2. The pause play operation pauses the AR audio and video that is being played, and the continue play operation resumes the AR audio and video whose playback was paused; each instruction is generated by pressing the corresponding key included in the input device 101.
Optionally, the input device 101 may also include a replay key. The transmitting device 1 may be further configured to, in response to detecting a replay operation acting on the replay key, generate an audio signal instruction representing replay and send it to the receiving device 2. The replay operation replays the AR audio and video that was played most recently, and the instruction is generated by pressing the replay key included in the input device 101.
Optionally, the input device 101 may further include a volume adjustment key and a progress adjustment key. The transmitting device 1 may be further configured to, in response to detecting a volume adjustment operation acting on the volume adjustment key, generate an audio signal instruction representing a volume adjustment and send it to the receiving device 2; and, in response to detecting a progress adjustment operation acting on the progress adjustment key, generate an audio signal instruction representing a progress adjustment and send it to the receiving device 2. The volume adjustment operation turns the volume of the AR audio and video being played up or down (the volume adjustment key may be a volume-up key or a volume-down key), and the progress adjustment operation adjusts the playing progress of the AR audio and video being played, for example by fast-forwarding or rewinding (the progress adjustment key may be a fast-forward key or a rewind key). Each instruction is generated by pressing the corresponding key included in the input device 101.
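The playback-control keys described above each yield their own instruction. The sketch below is one hedged way to model that mapping; the command names, the enum, and the optional step parameter are assumptions for illustration only, since the utility model only requires that each key press produces an instruction the receiving device can forward to the AR glasses.

```python
# Hedged sketch of how the playback-control keys could map to distinct audio
# signal instructions. The command names and the dataclass are hypothetical.

from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class Command(Enum):
    PAUSE = auto()          # pause play key
    RESUME = auto()         # continue play key
    REPLAY = auto()         # replay key (replay the last AR audio/video)
    VOLUME_UP = auto()      # volume adjustment keys
    VOLUME_DOWN = auto()
    SEEK_FORWARD = auto()   # progress adjustment keys (fast forward / rewind)
    SEEK_BACKWARD = auto()


@dataclass
class ControlInstruction:
    command: Command
    step: Optional[int] = None  # e.g. volume step or seek offset in seconds


KEY_TO_COMMAND = {
    "pause": Command.PAUSE,
    "resume": Command.RESUME,
    "replay": Command.REPLAY,
    "vol+": Command.VOLUME_UP,
    "vol-": Command.VOLUME_DOWN,
    "ff": Command.SEEK_FORWARD,
    "rew": Command.SEEK_BACKWARD,
}


def instruction_for_key(key: str) -> ControlInstruction:
    """Translate a key press on the transmitter into a control instruction."""
    return ControlInstruction(command=KEY_TO_COMMAND[key])


if __name__ == "__main__":
    print(instruction_for_key("pause"))   # sent to every receiver, then to the glasses
    print(instruction_for_key("ff"))
```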
The above embodiments of the present disclosure have the following advantages. The AR navigation system of some embodiments of the present disclosure can provide visitors with diversified audio and video, lets visitors clearly hear the guide's real-time narration, and allows the guide to trigger functions of the navigation content (e.g., the play function) at any time. Specifically, the reasons why diversified audio and video could not otherwise be provided, why visitors could not clearly hear the guide, and why the user experience was poor are as follows: a loudspeaker cannot provide diversified audio and video to visitors, and when there is a large crowd, visitors can hardly hear the sound it plays. In some venues, environmental restrictions may prevent the AR device from identifying the target object, so AR content cannot be presented in time, resulting in a poor user experience. In addition, in group AR navigation, the guide cannot control what content the users play, which reduces the efficiency of the tour. Based on this, in the AR navigation system of some embodiments of the present disclosure, the transmitting device generates audio signal instructions and sends them to the receiving device; the transmitting device also sends the real-time audio acquired by the first voice acquisition device to the receiving device; and the receiving device sends the received real-time audio and/or audio signal instructions to the AR glasses. Visitors can then listen to the real-time audio and watch AR audio and video through the AR glasses. In this way, diversified audio and video can be provided to visitors, visitors can clearly hear the guide's real-time narration, and the guide can trigger functions of the AR glasses worn by the visitors at any time, improving the user experience and the efficiency of the guided tour.
With further reference to Fig. 3, a structural schematic diagram of further embodiments of the AR navigation system according to the present disclosure is shown. As with the AR navigation system in the embodiment of Fig. 2, the AR navigation system in this embodiment may also include a transmitting device 1, a receiving device 2, and AR glasses 3; for the specific structural relationship, reference may be made to the related description of the embodiment of Fig. 2, which is not repeated here.
Unlike the AR navigation system in the embodiment of Fig. 2, the AR navigation system in this embodiment may further include a mobile device 4. The mobile device 4 comprises a second display screen 401 and a second voice acquisition device 402, and is communicatively connected with the transmitting device 1.
In some embodiments, the mobile device 4 may be configured to, in response to detecting a touch screen operation acting on the second display screen 401, generate an audio signal instruction according to the touch screen operation and send the generated audio signal instruction to the transmitting device 1. Here, the mobile device 4 may be a mobile phone or a tablet computer, and the second display screen 401 may be a display screen with both touch and display functions. The touch screen operation is any operation acting on the second display screen 401, for example a selection operation on an AR audio/video item. The mobile device 4 may also be configured to, in response to detecting that the second voice acquisition device 402 has acquired real-time audio, send that real-time audio to the transmitting device 1. The second voice acquisition device 402 may be any device with a voice capture function, for example the microphone of a mobile phone.
In some embodiments, the transmitting device 1 may also be configured to, in response to receiving real-time audio and/or audio signal instructions from the mobile device 4, send them on to the receiving device 2. The guide can therefore narrate through the mobile device and issue audio signal instructions from the mobile device.
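A short sketch of this relay path, assuming simple callable hooks standing in for the link to the transmitting device; the class and parameter names are hypothetical. Touch-screen selections and the phone-microphone feed both reach the transmitting device, which forwards them onward exactly as it forwards its own keypad input and microphone audio.

```python
# Sketch of the mobile-device relay in Fig. 3, under the assumption of simple
# callable hooks; names are hypothetical. Touch-screen selections and the
# phone microphone feed both go to the transmitting device, which forwards
# them to the receiving device(s).

from typing import Callable


class MobileDevice:
    """Phone or tablet paired with the guide's transmitting device."""

    def __init__(self,
                 send_instruction: Callable[[str], None],
                 send_audio: Callable[[bytes], None]) -> None:
        # Both callbacks deliver data to the transmitting device.
        self.send_instruction = send_instruction
        self.send_audio = send_audio

    def on_touch_select(self, content_id: str) -> None:
        # e.g. the guide taps the entry for AR audio/video "001" on the screen
        self.send_instruction(content_id)

    def on_phone_microphone(self, audio_chunk: bytes) -> None:
        self.send_audio(audio_chunk)


if __name__ == "__main__":
    # Stand-ins for the transmitting device's forwarding functions.
    phone = MobileDevice(
        send_instruction=lambda cid: print(f"transmitter relays instruction '{cid}'"),
        send_audio=lambda chunk: print(f"transmitter relays {len(chunk)} bytes of narration"),
    )
    phone.on_touch_select("001")
    phone.on_phone_microphone(b"\x00" * 320)
```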
Optionally, the second display screen 401 may display the playing progress of the AR audio and video in each pair of AR glasses of the AR navigation system that is in a connected state. The mobile device 4 may also be configured to, in response to the touch screen operation being a progress adjustment operation, generate an audio signal instruction representing a progress adjustment and send it to the transmitting device 1. The progress adjustment operation adjusts the playing progress of the AR audio and video being played, for example a fast-forward or rewind operation, and the instruction is generated by tapping a progress adjustment control (e.g., a fast-forward or rewind control) on the second display screen 401.
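As an illustration of the second display screen acting as a group dashboard, the sketch below assumes (this is an assumption, not stated in the text) that each connected pair of AR glasses reports its playback position back, so the guide can view per-glasses progress and issue a progress-adjustment instruction; all names are hypothetical.

```python
# Illustrative sketch (hypothetical names) of the second display screen's role
# as a group dashboard: it shows the playback progress of each connected pair
# of AR glasses and lets the guide issue a seek (progress adjustment) command.

class TourDashboard:
    def __init__(self) -> None:
        # glasses id -> playback position of the current AR audio/video, in seconds
        self.progress: dict[str, float] = {}

    def report_progress(self, glasses_id: str, position_s: float) -> None:
        """Called when a connected pair of glasses reports its playing progress."""
        self.progress[glasses_id] = position_s

    def render(self) -> None:
        for glasses_id, position_s in sorted(self.progress.items()):
            print(f"{glasses_id}: {position_s:6.1f} s")

    def seek_all(self, position_s: float) -> str:
        """Return a progress-adjustment instruction the transmitter would send out."""
        return f"SEEK {position_s}"


if __name__ == "__main__":
    dashboard = TourDashboard()
    dashboard.report_progress("glasses-01", 42.0)
    dashboard.report_progress("glasses-02", 40.5)
    dashboard.render()
    print(dashboard.seek_all(60.0))
```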
Optionally, the mobile device 4 may be further configured to, in response to the touch screen operation being a pause play operation, generate an audio signal instruction representing pause of playback and send it to the transmitting device 1; and, in response to the touch screen operation being a continue play operation, generate an audio signal instruction representing continuation of playback and send it to the transmitting device 1. The pause play operation pauses the AR audio and video being played and the continue play operation resumes it; the instructions are generated by tapping the pause play control or the continue play control on the second display screen 401.
Optionally, the mobile device 4 may be further configured to, in response to the touch screen operation being a replay operation, generate an audio signal instruction representing replay and send it to the transmitting device 1. The replay operation replays the AR audio and video that was played most recently, and the instruction is generated by tapping a replay control on the second display screen 401.
Optionally, the mobile device 4 may be further configured to, in response to the touch screen operation being a volume adjustment operation, generate an audio signal instruction representing a volume adjustment and send it to the transmitting device 1. The volume adjustment operation turns the volume of the AR audio and video being played up or down, and the instruction is generated by tapping a volume adjustment control (e.g., a volume-up or volume-down control) on the second display screen 401.
As can be seen from Fig. 3, compared with the embodiments corresponding to Fig. 2, in the embodiments corresponding to Fig. 3 real-time audio and/or audio signal instructions may first be sent from the mobile device to the transmitting device, then forwarded by the transmitting device to the receiving device, and finally sent by the receiving device to the AR glasses. The guide can thus narrate through the mobile device and issue audio signal instructions from it.
The foregoing description presents only preferred embodiments of the present disclosure and illustrates the technical principles employed. Those skilled in the art will understand that the scope of the invention in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combinations of the above technical features, and also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the concept of the disclosure, for example technical solutions formed by replacing the above features with (but not limited to) technical features with similar functions disclosed in the embodiments of the present disclosure.

Claims (11)

1. An AR navigation system, comprising a transmitting device, a receiving device and AR glasses, wherein the transmitting device comprises an input device and a first voice acquisition device;
the transmitting device is communicatively connected with the receiving device, and is configured to, in response to detecting an input operation acting on the input device, generate an audio signal instruction according to the input operation and send the audio signal instruction to the receiving device;
the transmitting device is further configured to, in response to detecting that the first voice acquisition device has acquired real-time audio, send the real-time audio acquired by the first voice acquisition device to the receiving device;
the receiving device is communicatively connected with the AR glasses, and is configured to, in response to receiving real-time audio and/or audio signal instructions sent by the transmitting device, send the received real-time audio and/or audio signal instructions to the AR glasses.
2. The AR navigation system of claim 1, wherein the transmitting device further comprises a first display screen, the first display screen being a display screen displaying identification information of AR audio and video played in the AR glasses.
3. The AR navigation system of claim 1, wherein the receiving device further comprises an audio output interface or an audio playing device, the audio output interface being used to connect an audio output device.
4. The AR navigation system of claim 1, wherein the input device comprises a pause play key and a continue play key; and
the transmitting device is further configured to, in response to detecting a pause play operation acting on the pause play key, generate an audio signal instruction representing pause of playback and send the audio signal instruction representing pause of playback to the receiving device;
the transmitting device is further configured to, in response to detecting a continue play operation acting on the continue play key, generate an audio signal instruction representing continuation of playback and send the audio signal instruction representing continuation of playback to the receiving device.
5. The AR navigation system of claim 1, wherein the input device comprises a replay key; and
the transmitting device is further configured to, in response to detecting a replay operation acting on the replay key, generate an audio signal instruction representing replay and send the audio signal instruction representing replay to the receiving device.
6. The AR navigation system of claim 1, wherein the input device comprises a volume adjustment key and a progress adjustment key; and
the transmitting device is further configured to, in response to detecting a volume adjustment operation acting on the volume adjustment key, generate an audio signal instruction representing a volume adjustment and send the audio signal instruction representing the volume adjustment to the receiving device;
the transmitting device is further configured to, in response to detecting a progress adjustment operation acting on the progress adjustment key, generate an audio signal instruction representing a progress adjustment and send the audio signal instruction representing the progress adjustment to the receiving device.
7. The AR navigation system of claim 1, further comprising a mobile device, the mobile device comprising a second display screen and a second voice acquisition device and being communicatively connected with the transmitting device; and
the mobile device is configured to, in response to detecting a touch screen operation acting on the second display screen, generate an audio signal instruction according to the touch screen operation and send the generated audio signal instruction to the transmitting device;
the mobile device is further configured to, in response to detecting that the second voice acquisition device has acquired real-time audio, send the real-time audio acquired by the second voice acquisition device to the transmitting device;
the transmitting device is further configured to, in response to receiving the real-time audio and/or audio signal instructions, send the received real-time audio and/or audio signal instructions to the receiving device.
8. The AR navigation system of claim 7, wherein the second display screen displays the playing progress of each pair of AR glasses of the AR navigation system that is in a connected state; and
the mobile device is further configured to, in response to the touch screen operation being a progress adjustment operation, generate an audio signal instruction representing a progress adjustment and send the audio signal instruction representing the progress adjustment to the transmitting device.
9. The AR navigation system of claim 7, wherein the mobile device is further configured to, in response to the touch screen operation being a pause play operation, generate an audio signal instruction representing pause of playback and send the audio signal instruction representing pause of playback to the transmitting device;
the mobile device is further configured to, in response to the touch screen operation being a continue play operation, generate an audio signal instruction representing continuation of playback and send the audio signal instruction representing continuation of playback to the transmitting device.
10. The AR navigation system of claim 7, wherein the mobile device is further configured to, in response to the touch screen operation being a replay operation, generate an audio signal instruction representing replay and send the audio signal instruction representing replay to the transmitting device.
11. The AR navigation system of claim 7, wherein the mobile device is further configured to, in response to the touch screen operation being a volume adjustment operation, generate an audio signal instruction representing a volume adjustment and send the audio signal instruction representing the volume adjustment to the transmitting device.
CN202121499602.8U (filed 2021-06-30), AR guide system, Active, CN215378902U (en)

Priority Applications (1)

Application number: CN202121499602.8U
Priority date: 2021-06-30
Filing date: 2021-06-30
Title: AR guide system

Publications (1)

Publication number: CN215378902U
Publication date: 2021-12-31

Family

ID=79606885

Country Status (1)

CN (1): CN215378902U (en)


Legal Events

Date Code Title Description
GR01 Patent grant