CN210090827U - Portable AR glasses implementation system - Google Patents

Portable AR glasses implementation system

Info

Publication number
CN210090827U
Authority
CN
China
Prior art keywords
communication device
processing terminal
glasses
glasses body
processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201921082070.0U
Other languages
Chinese (zh)
Inventor
徐龙
李骊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing HJIMI Technology Co Ltd
Original Assignee
Beijing HJIMI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing HJIMI Technology Co Ltd filed Critical Beijing HJIMI Technology Co Ltd
Priority to CN201921082070.0U priority Critical patent/CN210090827U/en
Application granted granted Critical
Publication of CN210090827U publication Critical patent/CN210090827U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The utility model discloses a portable AR glasses implementation system, which comprises an AR glasses body, a communication device and a processing terminal. A display device for displaying pictures is embedded in the AR glasses body, and a depth camera module for acquiring a depth image signal is further arranged on the AR glasses body. The processing terminal is used for analyzing and processing the received depth image signal, recognizing the interaction, generating a display picture and sending a display picture instruction to the communication device. By moving the signal processing, data computation and power supply functions of the AR glasses out of the glasses body and into the mobile phone and the communication device, the utility model makes the glasses body simple and lightweight and improves the user experience. At the same time, the communication device solves the battery life and communication problems of the product. Combining the AR glasses with the mobile phone enriches the applications of the product, increases the processing capability of the device and enables virtual interaction.

Description

Portable AR glasses implementation system
Technical Field
The utility model relates to AR glasses, and in particular to a portable AR glasses implementation system.
Background
AR glasses are a wearable device developed on the basis of augmented reality (AR) technology. Computer imaging technology is used to bridge the virtual world and the real world, enabling interaction and communication between the two. AR technology has a wide range of applications, mainly in the fields of medical treatment, education, the military, industry, and entertainment and games.
Because prior-art AR wearable devices integrate the whole system into the headset, the product structure is often overly complex, and the mass and volume are too large. The wearing burden is heavy, the wearer's movement is restricted, the user experience is sacrificed, and the device can only be worn for a short time.
SUMMARY OF THE UTILITY MODEL
Purpose of the utility model: in view of the defects of the above prior art, the utility model aims to provide a portable AR glasses implementation system that simplifies the structure of the glasses body as much as possible while improving the user experience.
The technical scheme is as follows: a portable AR glasses implementation system comprises an AR glasses body, a communication device and a processing terminal;
a display device for displaying pictures is embedded in the AR glasses body, and a depth camera module for acquiring a depth image signal is further arranged on the AR glasses body;
the communication device is used for transmitting the depth image signals collected by the depth camera module to the processing terminal and transmitting the display picture instructions sent by the processing terminal to the display device;
the processing terminal is used for analyzing and processing the received depth image signal, identifying the interactive action, generating a display picture and sending a display picture instruction to the communication device.
Further, in order to optimize the interaction experience and improve the response speed, the depth camera module comprises a color camera for capturing a color image, a dot matrix laser emitter for projecting an infrared dot matrix, and an infrared receiver for capturing the infrared dot matrix pattern reflected by an object.
In order to facilitate processing, the color camera, the dot matrix laser emitter and the infrared receiver are arranged on the same horizontal line on the glasses body.
Further, the interactive action is a gesture interactive action or a human body interactive action.
In order to improve the accuracy of action recognition, a 3D sensing chip is further arranged in the processing terminal to analyze and process the depth image signal, recognize the interaction and generate the display picture.
In order to make the system more portable and realizable, the processing terminal is a mobile phone with a built-in 3D sensing chip.
In order to realize convenient real-time control, the AR glasses body is further provided with an audio receiving module for receiving audio; the communication device is also used for transmitting the audio to the processing terminal; the processing terminal is also used for processing audio, recognizing voice content and converting the voice content into a control instruction.
In order to enrich the user experience, the AR glasses body is further provided with earphone jacks into which earphones can be plugged; the processing terminal is also used for transmitting the audio signal to be played to the earphones through the communication device for playing.
Further, the communication device establishes communication connection with the processing terminal in a wireless communication mode.
Further, the communication device is electrically connected to the AR glasses body, and a power supply module for supplying power to the AR glasses body is further arranged in the communication device.
Advantageous effects: by moving the signal processing, data computation and power supply functions of the AR glasses out of the glasses body and into the mobile phone and the communication device, the utility model makes the glasses body simple and lightweight and improves the user experience. At the same time, the communication device solves the battery life and communication problems of the product. Combining the AR glasses with the mobile phone enriches the applications of the product, increases the processing capability of the device and enables virtual interaction.
Drawings
FIG. 1 is a schematic diagram of the structural connections of embodiment 1;
FIG. 2 is a schematic system diagram of embodiment 1;
FIG. 3 is a wearing diagram of embodiment 1.
Detailed Description
The technical solution is described in detail below with reference to a preferred embodiment and the accompanying drawings.
In the present embodiment, the computing and power supply systems of the AR glasses are moved out of the glasses body and into the mobile phone 3 and the communication device 2, redefining the system configuration of the AR glasses. In this system, the mobile phone 3 serves as a powerful computing platform and is mainly responsible for the data processing work of the AR glasses. In particular, future mobile phones carrying AR and MR chips and integrating cloud computing resources will further enrich the applications and experience of the product, providing communication, entertainment, application management and virtual interaction capabilities. At the same time, this structure also helps reduce the heat generated by the AR glasses body 1 during operation, improving the wearing comfort. The communication device 2 is mainly responsible for signal transmission between the AR glasses and the mobile phone 3 and can also supply power to the AR glasses. In this way, the glasses body 1 retains only the basic hardware for display, audio, gesture recognition and speech recognition, which greatly simplifies the product structure and achieves the goal of a lightweight design.
As shown in fig. 1 and 2, a portable AR glasses implementation system includes an AR glasses body 1, a communication device 2, and a processing terminal, i.e., a mobile phone 3;
The AR glasses body 1 includes a spectacle frame consisting of a rim and temples. A display device 7 for displaying pictures is embedded inside the rim and serves as the lenses; the display device is prior art and may be any display device used in existing AR glasses. A depth camera module for acquiring depth image signals is also arranged on the spectacle frame.
The spectacle frame is also provided with an audio receiving module 10 for receiving audio; the audio receiving module 10 is a microphone (MIC) arranged on the spectacle frame. The communication device 2 transmits the audio to the processing terminal, which processes the audio, recognizes the voice content and converts it into a control instruction.
The temples are provided with earphone jacks into which the earphones 8 and 9 (the left and right earphones, respectively) can be plugged; the processing terminal is further configured to transmit the audio signal to be played to the earphones 8 and 9 via the communication device 2.
The depth camera module is a structured light depth camera module hidden behind the display device 7. It is used to detect user actions such as gestures, providing application operation and virtual interaction capabilities, and comprises a color camera 5 for capturing a color image, a dot matrix laser emitter 6 for projecting an infrared dot matrix, and an infrared receiver 4 for capturing the infrared dot matrix pattern reflected by an object.
The dot matrix laser emitter 6 projects an infrared light dot matrix, that is, tens of thousands of infrared light spots, and each infrared light spot is associated with XYZ three-dimensional coordinates. The infrared receiver 4 captures the infrared dot matrix pattern reflected by the object, that is, it captures each light spot and acquires its coordinate information, thereby obtaining position and depth information, i.e. a depth image signal (RGB-D signal).
This structure is prior art; for example, the face recognition device of Apple and the IMI optical module of HJIMI have the above structure.
In order to process the image data conveniently, the color camera 5, the dot matrix laser emitter 6 and the infrared receiver 4 of the structured light module are all arranged on the same horizontal line.
The communication device 2 is used for transmitting the depth image signals collected by the depth camera module to the processing terminal and transmitting the display picture instructions sent by the processing terminal to the display device 7. The communication device 2 of the present embodiment is a demodulator connected to the spectacle frame through a cable, which transmits data and supplies power.
As shown in the figure, the communication device 2 comprises a control button 11, a charging port 12, a display 13 and a detachable cable 14 that integrates the power supply and communication functions. Its main function is to provide communication between the mobile phone 3 and the AR glasses. In addition, it can also supply power to the AR glasses, so that the equipment can run stably for a long time.
The communication device 2 establishes a communication connection with the processing terminal in a wireless communication mode; the wireless communication mode includes Bluetooth, Wi-Fi, mobile communication and the like, and can be selected according to the scene. The communication device 2 is electrically connected to the spectacle frame, and a power supply module, namely a battery, for supplying power to the spectacle frame is also arranged in the communication device 2. A CPU may be provided to control the transmission and reception of signals, as in a demodulator of the related art.
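The embodiment leaves the choice among Bluetooth, Wi-Fi and mobile communication to the usage scene. Purely as an illustration of such scene-dependent selection, the sketch below encodes one possible rule in Python; the function and its ordering of preferences are hypothetical and not prescribed by the utility model.

```python
def select_link(streaming_video: bool, wifi_available: bool,
                cellular_available: bool) -> str:
    """Pick a wireless link for the communication device (illustrative rule only)."""
    if streaming_video and wifi_available:
        return "wifi"        # depth frames and rendered video need high throughput
    if cellular_available:
        return "cellular"    # usable outdoors, away from a paired Wi-Fi network
    return "bluetooth"       # low-rate fallback for audio and control commands

# Example: indoors with Wi-Fi available while streaming video -> "wifi"
print(select_link(streaming_video=True, wifi_available=True, cellular_available=True))
```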
The processing terminal is configured to analyze and process the received depth image signal, recognize an interaction, generate a display picture, and send a display picture instruction to the communication device 2. It is mainly responsible for the computational processing of the AR data, and also provides AR application management, network communication and application extension capabilities.
A 3D sensing chip is arranged in the processing terminal to analyze and process the depth image signal, recognize the interaction and generate the display picture. The interaction is a gesture interaction or a human body interaction.
As shown in the figure, the processing terminal of this embodiment is a mobile phone 3 with a built-in 3D sensing chip; it may also be another device with computing capability, such as a computer or a tablet.
The processing terminal analyzes the received image signal to obtain position and depth information, converts this information into human skeleton action information or gesture action information, recognizes the interaction and generates a display picture. These methods are all prior art; for example, the analysis method used by Kinect on acquired image signals, or the image analysis method used by the HJIMI IMI optical module or a 3D somatosensory camera, can be used.
The above method can be implemented by an HJIMI 3D sensing chip, such as the IMI3D 1180 or IMI3D 1280 chip.
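Since the analysis itself is delegated to an off-the-shelf 3D sensing chip, the utility model does not describe the algorithm. The sketch below is only a schematic Python illustration of the processing chain named above (depth signal, keypoints, recognized gesture, display picture instruction); all function names and the toy swipe rule are hypothetical stand-ins, not the chip's actual method.

```python
from typing import Dict, List, Tuple

Point3D = Tuple[float, float, float]   # XYZ coordinates of one infrared dot

def to_keypoints(rgbd_frame: List[Point3D]) -> Dict[str, Point3D]:
    """Reduce a depth frame to named keypoints (stub: hand centroid only)."""
    n = max(len(rgbd_frame), 1)
    return {"hand": (sum(p[0] for p in rgbd_frame) / n,
                     sum(p[1] for p in rgbd_frame) / n,
                     sum(p[2] for p in rgbd_frame) / n)}

def classify_gesture(prev: Dict[str, Point3D], curr: Dict[str, Point3D]) -> str:
    """Toy rule: a large horizontal shift of the hand keypoint is a swipe."""
    dx = curr["hand"][0] - prev["hand"][0]
    if dx > 0.10:
        return "swipe_right"
    if dx < -0.10:
        return "swipe_left"
    return "none"

def display_instruction(gesture: str) -> dict:
    """Turn the recognised interaction into a display picture instruction."""
    return {"action": "update_frame", "gesture": gesture}

# Example: the hand keypoint moved 0.2 m to the left between two frames.
prev = to_keypoints([(0.2, 0.0, 0.8)])
curr = to_keypoints([(0.0, 0.0, 0.8)])
print(display_instruction(classify_gesture(prev, curr)))   # -> swipe_left
```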
The way of carrying the communication device 2 of the present embodiment is shown in FIG. 3: it can be worn on the upper arm and fixed by the fixing band 15, or placed in the pocket 16.
In this embodiment, the AR glasses are designed to be lightweight: the glasses body 1 retains only the hardware for display, the earphones 8 and 9, the camera module used for gesture recognition, and voice signal reception. During operation, the communication device 2 acquires the gesture and voice commands from the glasses body 1 through the signal transmission line, and then transmits them to the mobile phone 3 over the Wi-Fi/Bluetooth channel. After the mobile phone 3 receives the signals transmitted by the communication device 2, it performs the AR computation and processing, and the video and audio information required for the virtual interaction is passed back through the communication device 2 to the earphones 8 and 9 and the display device 7 of the AR glasses body 1, where it is finally perceived by the wearer, realizing the interaction between the virtual and the real. The communication device 2 not only provides signal relay and processing capabilities, but also supplies power to the AR glasses body 1 through the power supply and communication cable 14, so that the AR glasses can work stably for a long time. The device can also be charged through its external Micro USB interface. The mobile phone 3 is defined as a powerful computing platform in the system and is mainly responsible for AR application management and extension, data computation and instant communication.
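To summarise the working process above as a data flow, the following self-contained Python sketch runs one interaction cycle: glasses body to communication device to mobile phone and back. Every class, queue and payload in it is a hypothetical placeholder; in the real system the glasses and the communication device are joined by the cable 14 and the communication device talks to the phone over Wi-Fi or Bluetooth.

```python
from queue import Queue

uplink, downlink = Queue(), Queue()          # stand-ins for the Wi-Fi/BT channel

class GlassesBody:
    """Glasses body 1: sensing and presentation hardware only."""
    def capture(self):                       # depth frame + microphone audio (stub data)
        return {"depth": [(0.0, 0.0, 0.8)], "audio": b"\x00" * 160}
    def present(self, video, audio):         # display device 7 and earphones 8, 9
        print("display:", video, "| audio bytes:", len(audio))

class CommunicationDevice:
    """Communication device 2: relays signals between glasses and phone."""
    def relay_up(self, payload):             # wired in from the glasses, wireless out
        uplink.put(payload)
    def relay_down(self):                    # wireless in from the phone, wired out
        return downlink.get()

class MobilePhone:
    """Processing terminal (mobile phone 3): all AR computation."""
    def serve_once(self):
        signals = uplink.get()
        gesture = "swipe" if signals["depth"] else "none"     # toy recognition
        downlink.put({"video": f"frame_for_{gesture}", "audio": b"\x01" * 160})

glasses, relay, phone = GlassesBody(), CommunicationDevice(), MobilePhone()
relay.relay_up(glasses.capture())            # glasses -> communication device -> phone
phone.serve_once()                           # phone computes the AR response
response = relay.relay_down()                # phone -> communication device -> glasses
glasses.present(response["video"], response["audio"])
```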
The above is only a preferred embodiment of the utility model. It will be apparent to those skilled in the art that various improvements and modifications can be made without departing from the principle of the utility model, and such improvements and modifications should also be regarded as falling within the protection scope of the utility model.

Claims (10)

1. A portable AR glasses implementation system is characterized by comprising an AR glasses body, a communication device and a processing terminal;
a display device for displaying pictures is embedded in the AR glasses body, and a depth camera module for acquiring a depth image signal is further arranged on the AR glasses body;
the communication device is used for transmitting the depth image signals collected by the depth camera module to the processing terminal and transmitting the display picture instructions sent by the processing terminal to the display device;
the processing terminal is used for analyzing and processing the received depth image signal, identifying the interactive action, generating a display picture and sending a display picture instruction to the communication device.
2. The system of claim 1, wherein the depth camera module comprises a color camera for capturing color images, a dot matrix laser emitter for projecting an infrared dot matrix, and an infrared receiver for capturing the infrared dot matrix pattern reflected by an object.
3. The system of claim 2, wherein the color camera, the dot matrix laser emitter and the infrared receiver are disposed on the same horizontal line on the glasses body.
4. The system of claim 1, wherein the interaction is a gesture interaction or a human interaction.
5. The system of claim 1, wherein a 3D sensing chip is disposed in the processing terminal for analyzing and processing the depth image signal, recognizing the interaction, and generating the display picture.
6. The system of claim 5, wherein the processing terminal is a mobile phone with a built-in 3D sensing chip.
7. The system of claim 1, wherein the AR glasses body further comprises an audio receiving module for receiving audio; the communication device is also used for transmitting the audio to the processing terminal; the processing terminal is also used for processing audio, recognizing voice content and converting the voice content into a control instruction.
8. The system of claim 1, wherein the AR glasses body is provided with an earphone jack into which an earphone can be plugged; the processing terminal is also used for transmitting the audio signal to be played to the earphone through the communication device for playing.
9. The system of claim 1, wherein the communication device establishes a communication connection with the processing terminal via a wireless communication manner.
10. The system of claim 1, wherein the communication device is electrically connected to the AR glasses body, and a power supply module for supplying power to the AR glasses body is further disposed in the communication device.
CN201921082070.0U 2019-07-11 2019-07-11 Portable AR glasses implementation system Active CN210090827U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201921082070.0U CN210090827U (en) 2019-07-11 2019-07-11 Portable AR glasses implementation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201921082070.0U CN210090827U (en) 2019-07-11 2019-07-11 Portable AR glasses implementation system

Publications (1)

Publication Number Publication Date
CN210090827U true CN210090827U (en) 2020-02-18

Family

ID=69484955

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201921082070.0U Active CN210090827U (en) 2019-07-11 2019-07-11 Portable AR glasses implementation system

Country Status (1)

Country Link
CN (1) CN210090827U (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111479148A (en) * 2020-04-17 2020-07-31 Oppo广东移动通信有限公司 Wearable device, glasses terminal, processing terminal, data interaction method and medium
CN111479148B (en) * 2020-04-17 2022-02-08 Oppo广东移动通信有限公司 Wearable device, glasses terminal, processing terminal, data interaction method and medium
CN112379526A (en) * 2020-12-04 2021-02-19 上海影创信息科技有限公司 VR glasses structure with physical energy emitter
CN112379526B (en) * 2020-12-04 2022-04-19 上海影创信息科技有限公司 VR glasses structure with physical energy emitter
CN112684892A (en) * 2020-12-30 2021-04-20 中国人民解放军32181部队 Augmented reality ammunition recognition glasses-handle continuous carrying system
CN112684892B (en) * 2020-12-30 2024-01-12 中国人民解放军32181部队 Augmented reality ammunition recognition glasses-handle carrying system
CN113301322A (en) * 2021-06-25 2021-08-24 上海锐像信息科技有限公司 VR glasses and VR equipment

Similar Documents

Publication Publication Date Title
CN210090827U (en) Portable AR glasses implementation system
CN111445583B (en) Augmented reality processing method and device, storage medium and electronic equipment
CN104520787B (en) Wearing-on-head type computer is as the secondary monitor inputted with automatic speech recognition and head-tracking
US11790612B2 (en) Information display method and device, terminal, and storage medium
WO2017092396A1 (en) Virtual reality interaction system and method
CN206440890U (en) Wearable Split intelligent glasses
EP3964985A1 (en) Simulation object identity recognition method, and related apparatus and system
CN112270754A (en) Local grid map construction method and device, readable medium and electronic equipment
CN108431872A (en) A kind of method and apparatus of shared virtual reality data
CN111045945B (en) Method, device, terminal, storage medium and program product for simulating live broadcast
EP3974950A1 (en) Interactive method and apparatus in virtual reality scene
CN111479148B (en) Wearable device, glasses terminal, processing terminal, data interaction method and medium
CN106377401A (en) Blind guiding front-end equipment, blind guiding rear-end equipment and blind guiding system
CN106412810B (en) Data transmission method and device
KR101784095B1 (en) Head-mounted display apparatus using a plurality of data and system for transmitting and receiving the plurality of data
CN111985252B (en) Dialogue translation method and device, storage medium and electronic equipment
CN114422935A (en) Audio processing method, terminal and computer readable storage medium
CN112449098B (en) Shooting method, device, terminal and storage medium
CN109542218B (en) Mobile terminal, human-computer interaction system and method
CN113342158A (en) Glasses equipment, data processing method and device and electronic equipment
CN114416237B (en) Display state switching method, device and system, electronic equipment and storage medium
JP2020201575A (en) Display controller, display control method, and display control program
WO2022135272A1 (en) Three-dimensional model reconstruction method, device, and storage medium
CN111462335B (en) Equipment control method and device based on virtual object interaction, medium and equipment
CN210750149U (en) Blind guiding glasses and blind guiding system thereof

Legal Events

Date Code Title Description
GR01 Patent grant