CN106155311A - AR head-mounted device, AR interaction system, and AR scene interaction method - Google Patents

AR head-mounted device, AR interaction system, and AR scene interaction method Download PDF

Info

Publication number
CN106155311A
CN106155311A (application CN201610489453.4A / CN201610489453A)
Authority
CN
China
Prior art keywords
head-mounted device
interactive
user
operating gesture
object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610489453.4A
Other languages
Chinese (zh)
Inventor
张圣杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN201610489453.4A priority Critical patent/CN106155311A/en
Publication of CN106155311A publication Critical patent/CN106155311A/en
Pending legal-status Critical Current

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an AR head-mounted device, an AR interaction system, and an interaction method for an AR scene. The AR head-mounted device includes: an acquisition module, configured to capture an object in the AR scene through a camera; a matching module, configured to match the object against feature values in a preset feature database to determine whether the object is an interactive object; and an execution module, configured to perform a predetermined interactive action on the object, according to the user's operating gesture, when the object is an interactive object. The invention requires no external device such as a keyboard or mouse: the user operates directly by hand, the camera captures the user's operating gesture, and the corresponding operation is performed on the object according to that gesture. This solves the prior-art problem that, when wearing an AR head-mounted device, a user must provide input through external devices such as a keyboard or mouse, making operation complicated.

Description

AR head-mounted device, AR interaction system, and AR scene interaction method
Technical field
The present invention relates to the field of communications, and in particular to an AR (Augmented Reality) head-mounted device, an AR interaction system, and an interaction method for an AR scene.
Background art
AR is a technology that computes the position and angle of a camera image in real time and overlays corresponding images, video, or 3D models on it; the goal of this technology is to wrap the virtual world around the real world on a screen and allow the two to interact. With the improvement in the CPU (Central Processing Unit) computing power of electronic products, the applications of augmented reality are expected to become ever broader.
In the prior art, when a user wears an AR head-mounted device, input must be provided through external devices such as a keyboard or mouse, which makes operation complicated. Moreover, after the input data are collected, they are projected only onto that user's AR head-mounted display, so the operation serves a single individual and its scope of service is narrow.
Summary of the invention
The main object of the present invention is to propose an AR head-mounted device, an AR interaction system, and an interaction method for an AR scene, aiming to solve the prior-art problem that, when a user wears an AR head-mounted device, input must be provided through external devices such as a keyboard or mouse, making operation complicated.
To achieve the above object, the present invention provides an AR head-mounted device, including: an acquisition module, configured to capture an object in an AR scene through a camera; a matching module, configured to match the object against feature values in a preset feature database to determine whether the object is an interactive object; and an execution module, configured to perform a predetermined interactive action on the object, according to the user's operating gesture, when the object is an interactive object.
Optionally, the matching module includes: a feature matching unit, configured to judge whether the preset feature database contains a feature value that reaches a preset match degree with the object; and a determining unit, configured to determine that the object is an interactive object when the preset feature database contains a feature value that reaches the preset match degree with the object.
Optionally, the execution module includes: a judging unit, configured to judge whether the operating gesture occludes the object; an operation matching unit, configured to, when the operating gesture occludes the object, match the operating gesture against operating gestures in a preset operation database to determine the operation content corresponding to the operating gesture, where the operation content includes at least one of the following: create, edit, send, delete, drag, zoom in, zoom out, select; and an execution unit, configured to operate on the object according to the operation content.
Optionally, the acquisition module is further configured to obtain, from a cloud server, the types of interactive operation available for the object before the predetermined interactive action is performed on the object according to the user's operating gesture, where the types include at least one of the following: purchase interaction, digital label interaction.
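The matching flow described by these modules can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the patent does not specify a feature representation, a similarity metric, or a concrete value for the preset match degree, so all of those are invented here for the example.

```python
from dataclasses import dataclass

# Assumed value for the "preset match degree"; the patent names the
# threshold but does not fix a number or a metric.
MATCH_THRESHOLD = 0.8

@dataclass
class DatabaseEntry:
    name: str
    feature: list  # feature value stored in the preset feature database

def match_degree(a, b):
    """Toy similarity measure: fraction of equal feature components."""
    hits = sum(1 for x, y in zip(a, b) if x == y)
    return hits / max(len(a), len(b))

def find_interactive(object_feature, feature_db):
    """Matching module: return the database entry whose feature value
    reaches the preset match degree with the object, or None if the
    object is not an interactive object."""
    for entry in feature_db:
        if match_degree(object_feature, entry.feature) >= MATCH_THRESHOLD:
            return entry
    return None
```

With a database containing a single entry whose feature value is [1, 2, 3, 4, 5], an observed feature of [1, 2, 3, 4, 9] reaches a match degree of 0.8 and the object is treated as interactive, while a wholly different feature is rejected.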
In addition, to achieve the above object, the present invention also proposes an AR interaction system, including a cloud server and the AR head-mounted device described in any of the above. The cloud server interacts with all AR head-mounted devices connected to it and, upon receiving a digital label that a certain AR head-mounted device has finished operating on, pushes the digital label to all connected AR head-mounted devices when the next label-update time arrives.
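The cloud server's label-push behavior might be modeled as below. The class and method names, and the explicit update tick, are illustrative assumptions; the patent only states that queued labels are pushed to all connected devices when the next label-update time arrives.

```python
class ARHeadset:
    """Stand-in for an AR head-mounted device on the receiving side."""
    def __init__(self):
        self.labels = []

    def receive(self, labels):
        self.labels.extend(labels)

class CloudServer:
    """Queues digital labels reported by any connected device and pushes
    them to every connected device when the next label-update time arrives."""
    def __init__(self):
        self.devices = []
        self.pending_labels = []

    def connect(self, device):
        self.devices.append(device)

    def report_label(self, label):
        # Called when a device has finished operating on a digital label.
        self.pending_labels.append(label)

    def on_update_time(self):
        # The next label-update time arrives: push to all connected devices.
        for device in self.devices:
            device.receive(list(self.pending_labels))
        self.pending_labels.clear()
```

Batching edits until the update tick, rather than pushing each label immediately, is one plausible reading of the "next label-update time" wording.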
In addition, to achieve the above object, the present invention also proposes an interaction method for an AR scene, the method including the steps of: an AR head-mounted device capturing an object in the AR scene through a camera; the AR head-mounted device matching the object against feature values in a preset feature database to determine whether the object is an interactive object; and, when the object is an interactive object, the AR head-mounted device performing a predetermined interactive action on the object according to the user's operating gesture.
Optionally, the step of the AR head-mounted device matching the object against feature values in the preset feature database to determine whether the object is an interactive object includes: the AR head-mounted device judging whether the preset feature database contains a feature value that reaches a preset match degree with the object; and, when such a feature value exists, the AR head-mounted device determining that the object is an interactive object.
Optionally, the step of the AR head-mounted device performing a predetermined interactive action on the object according to the user's operating gesture includes: the AR head-mounted device judging whether the operating gesture occludes the object; when the operating gesture occludes the object, the AR head-mounted device matching the operating gesture against operating gestures in a preset operation database to determine the operation content corresponding to the operating gesture, where the operation content includes at least one of the following: create, edit, send, delete, drag, zoom in, zoom out, select; and the AR head-mounted device operating on the object according to the operation content.
Optionally, before the step of the AR head-mounted device performing the predetermined interactive action on the object according to the user's operating gesture, the method further includes: the AR head-mounted device obtaining, from a cloud server, the types of interactive operation available for the object, where the types include at least one of the following: purchase interaction, digital label interaction.
Optionally, when the type of interactive operation is digital label interaction, after the step of operating on the object according to the operation content, the method further includes: the cloud server receiving the digital label that the AR head-mounted device has operated on; and, when the next label-update time arrives, the cloud server pushing the received digital label to all AR head-mounted devices connected to the cloud server.
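The occlusion check and gesture-to-operation matching in the steps above can be sketched as follows. Axis-aligned bounding boxes and string gesture keys are assumed encodings; the patent names the operations but specifies neither a gesture representation nor how occlusion is determined.

```python
# Assumed operation database mapping recognized gestures to the operation
# contents named in the patent (select, drag, zoom in, zoom out, ...).
OPERATION_DB = {
    "pinch_out": "zoom in",
    "pinch_in": "zoom out",
    "tap": "select",
    "swipe": "drag",
}

def occludes(gesture_box, object_box):
    """True if the hand's bounding box overlaps the object's bounding box.
    Boxes are (x, y, w, h) tuples."""
    gx, gy, gw, gh = gesture_box
    ox, oy, ow, oh = object_box
    return gx < ox + ow and ox < gx + gw and gy < oy + oh and oy < gy + gh

def interact(gesture, gesture_box, object_box):
    """Only a gesture that occludes the object is matched against the
    preset operation database; otherwise no operation is performed."""
    if not occludes(gesture_box, object_box):
        return None
    return OPERATION_DB.get(gesture)
```

A tap whose bounding box overlaps the object resolves to "select"; the same tap away from the object, or an unrecognized gesture, yields no operation.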
The present invention captures the object in the AR scene through the camera, analyzes the interactivity of the object, and, when the object is an interactive object, performs interactive actions on it according to the user's operating gesture. No external device such as a keyboard or mouse is required in this process: the user operates directly by hand, the camera captures the user's operating gesture, and the corresponding operation can be performed on the object according to that gesture. This solves the prior-art problem that, when a user wears an AR head-mounted device, input must be provided through external devices such as a keyboard or mouse, making operation complicated.
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware structure of an optional mobile terminal for implementing embodiments of the present invention;
Fig. 2 is a schematic diagram of a wireless communication system for the mobile terminal shown in Fig. 1;
Fig. 3 is a schematic structural diagram of the AR head-mounted device according to the first embodiment of the present invention;
Fig. 4 is a schematic structural diagram of the AR head-mounted device according to the second embodiment of the present invention;
Fig. 5 is a system interaction diagram of the AR interaction system according to the third embodiment of the present invention;
Fig. 6 is a flowchart of the interaction method for an AR scene according to the fourth embodiment of the present invention.
The realization of the object, functional characteristics, and advantages of the present invention will be further described below with reference to the embodiments and the accompanying drawings.
Detailed description of the embodiments
It should be understood that the specific embodiments described herein are intended only to explain the present invention and do not limit it.
The mobile terminal implementing each embodiment of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "part", or "unit" used to denote elements are adopted merely to facilitate the description of the invention and have no specific meaning in themselves; therefore, "module" and "part" may be used interchangeably.
Mobile terminals may be implemented in a variety of forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smartphones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following, the terminal is assumed to be a mobile terminal; however, those skilled in the art will understand that, apart from elements used specifically for mobile purposes, the structures according to the embodiments of the present invention can also be applied to terminals of the fixed type.
Fig. 1 is a schematic diagram of the hardware structure of an optional mobile terminal for implementing embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on. Fig. 1 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may alternatively be implemented. The elements of the mobile terminal are described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and sends broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and sends them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like, and may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast-related information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module 112. The broadcast signal may exist in various forms; for example, it may exist in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB), an electronic service guide (ESG) of digital video broadcasting-handheld (DVB-H), and so on. The broadcast receiving module 111 may receive signals using various types of broadcast systems. In particular, it may receive digital broadcasts using digital broadcasting systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), the MediaFLO® forward-link-only data broadcast system, and integrated services digital broadcasting-terrestrial (ISDB-T). The broadcast receiving module 111 may be constructed to suit the above digital broadcasting systems as well as other broadcast systems that provide broadcast signals. The broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 sends radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data sent and/or received according to text and/or multimedia messages.
The wireless Internet module 113 supports wireless Internet access for the mobile terminal and may be internally or externally coupled to the terminal. The wireless Internet access technologies involved may include WLAN (wireless LAN, Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access), and so on.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technology include Bluetooth™, radio frequency identification (RFID), Infrared Data Association (IrDA), ultra-wideband (UWB), ZigBee™, and so on.
The location information module 115 is a module for checking or obtaining the location information of the mobile terminal; a typical example is a GPS (Global Positioning System) module. According to current technology, the GPS module 115 calculates range information and precise time information from three or more satellites and applies triangulation to the calculated information, thereby accurately computing three-dimensional current location information according to longitude, latitude, and altitude. Currently, the common method for calculating position and time information uses three satellites and corrects the error of the calculated position and time information using a further satellite. In addition, the GPS module 115 can calculate velocity information by continuously computing the current location information in real time.
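As an aside on the velocity computation mentioned above, deriving speed from two successively computed position fixes might look like the sketch below. The haversine formula and the fix format are illustrative assumptions, not part of the patent or of any particular GPS module's API.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two fixes, using a mean
    Earth radius of 6371 km."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def speed_mps(fix_a, fix_b):
    """Speed from two timestamped fixes (lat, lon, t_seconds), in the
    spirit of deriving velocity from continuously computed positions."""
    (lat1, lon1, t1), (lat2, lon2, t2) = fix_a, fix_b
    return haversine_m(lat1, lon1, lat2, lon2) / (t2 - t1)
```

Moving 0.001 degrees of longitude along the equator (about 111 m) in 10 seconds gives roughly 11 m/s.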
The A/V input unit 120 is used to receive audio or video signals and may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode, and the processed image frames may be displayed on the display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or sent via the wireless communication unit 110; two or more cameras 121 may be provided depending on the structure of the mobile terminal. The microphone 122 may receive sound (audio data) via a microphone in operating modes such as a phone call mode, a recording mode, or a voice recognition mode, and can process such sound into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format that can be sent to a mobile communication base station via the mobile communication module 112. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) the noise or interference generated while receiving and sending audio signals.
The user input unit 130 may generate key input data according to commands entered by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information and may include a keyboard, a dome switch, a touch pad (for example, a touch-sensitive component that detects changes in resistance, pressure, capacitance, and the like caused by being touched), a jog wheel, a jog switch, and so on. In particular, when the touch pad is superimposed on the display unit 151 as a layer, a touch screen may be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (for example, whether the mobile terminal 100 is open or closed), the position of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (that is, touch input), the orientation of the mobile terminal 100, acceleration or deceleration movement and direction of the mobile terminal 100, and so on, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 may sense whether the slide-type phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power and whether the interface unit 170 is coupled with an external device.
The interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The identification module may store various information for verifying the user of the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and so on. In addition, the device having the identification module (hereinafter referred to as an "identification device") may take the form of a smart card; therefore, the identification device may be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transfer the received input to one or more elements within the mobile terminal 100, or may be used to transmit data between the mobile terminal and an external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transmitted to the mobile terminal. Various command signals or power input from the cradle may serve as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals in a visual, audio, and/or tactile manner (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) and may include a display unit 151, an audio output module 152, an alarm unit 153, and so on.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in the phone call mode, the display unit 151 may display a user interface (UI) or graphical user interface (GUI) related to a call or other communication (such as text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in the video call mode or the image capture mode, the display unit 151 may display captured and/or received images, a UI or GUI showing video or images and related functions, and so on.
Meanwhile, when the display unit 151 and the touch pad are superimposed on each other to form a touch screen, the display unit 151 may serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin-film transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and so on. Some of these displays may be constructed to be transparent to allow the user to view from the outside; these may be called transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light-emitting diode) display. According to specific desired embodiments, the mobile terminal 100 may include two or more display units (or other display devices); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect touch input pressure as well as touch input position and touch input area.
The audio output module 152 may, when the mobile terminal is in modes such as a call signal reception mode, a call mode, a recording mode, a voice recognition mode, or a broadcast reception mode, transduce the audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound. Moreover, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and so on.
The alarm unit 153 may provide output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and so on. In addition to audio or video output, the alarm unit 153 may provide output in a different manner to notify the occurrence of an event. For example, the alarm unit 153 may provide output in the form of vibration; when a call, a message, or some other incoming communication is received, the alarm unit 153 may provide tactile output (i.e., vibration) to notify the user. By providing such tactile output, the user is able to recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 may also provide output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 may store software programs for the processing and control operations performed by the controller 180, and may temporarily store data that has been output or is to be output (e.g., a phone book, messages, still images, video, etc.). Moreover, the memory 160 may store data on the various modes of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including flash memory, a hard disk, a multimedia card, card-type memory (e.g., SD or DX memory, etc.), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, an optical disc, and so on. Moreover, the mobile terminal 100 may cooperate with a network storage device that performs the storage function of the memory 160 over a network connection.
The controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing related to voice calls, data communication, video calls, and so on. In addition, the controller 180 may include a multimedia module 1810 for reproducing (or playing back) multimedia data; the multimedia module 1810 may be constructed within the controller 180 or constructed separately from the controller 180. The controller 180 may perform pattern recognition processing to recognize handwriting input or picture drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power required to operate each element and component.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and electronic units designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180. For a software implementation, embodiments such as processes or functions may be implemented with separate software modules that allow at least one function or operation to be performed. Software code may be implemented by a software application (or program) written in any suitable programming language, and the software code may be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal has been described in terms of its functions. Below, for the sake of brevity, a slide-type mobile terminal among the various types of mobile terminals, such as folder-type, bar-type, swing-type, and slide-type mobile terminals, will be described as an example. Accordingly, the present invention can be applied to any type of mobile terminal and is not limited to the slide-type mobile terminal.
The mobile terminal 100 shown in Fig. 1 may be constructed to operate with wired and wireless communication systems, as well as satellite-based communication systems, that transmit data via frames or packets.
The communication system in which the mobile terminal according to the present invention is operable will now be described with reference to Fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interfaces used by communication systems include, for example, frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA), universal mobile telecommunications system (UMTS) (in particular, long term evolution (LTE)), global system for mobile communications (GSM), and so on. As a non-limiting example, the following description relates to a CDMA communication system, but such teachings apply equally to other types of systems.
Referring to Fig. 2, a CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275, and a mobile switching center (MSC) 280. The MSC 280 is configured to form an interface with a public switched telephone network (PSTN) 290. The MSC 280 is also configured to form an interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul links. The backhaul links may be constructed according to any of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, frame relay, HDSL, ADSL, or xDSL. It will be appreciated that the system shown in Fig. 2 may include a plurality of BSCs 275.
Each BS 270 may serve one or more sectors (or regions), with each sector covered by a multidirectional antenna or an antenna pointing in a specific direction located radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be constructed to support a plurality of frequency allocations, with each frequency allocation having a specific spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency allocation may be referred to as a CDMA channel. The BS 270 may also be referred to as a base transceiver subsystem (BTS) or by other equivalent terms. In this case, the term "base station" may be used to broadly denote a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell site". Alternatively, each sector of a specific BS 270 may be referred to as a plurality of cell sites.
As shown in Fig. 2, a broadcast transmitter (BT) 295 sends a broadcast signal to the mobile terminals 100 operating within the system. The broadcast receiving module 111 shown in Fig. 1 is provided at the mobile terminal 100 to receive the broadcast signal sent by the BT 295. Fig. 2 also shows several global positioning system (GPS) satellites 300; the satellites 300 help locate at least one of the plurality of mobile terminals 100.
In Fig. 2, a plurality of satellites 300 are depicted, but it should be understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 shown in Fig. 1 is typically configured to cooperate with the satellites 300 to obtain the desired positioning information. Instead of, or in addition to, GPS tracking technology, other technologies capable of tracking the location of the mobile terminal may be used. In addition, at least one GPS satellite 300 may optionally or additionally handle satellite DMB transmission.
As a typical operation of the wireless communication system, the BS 270 receives reverse-link signals from various mobile terminals 100. The mobile terminals 100 typically participate in calls, messaging, and other types of communication. Each reverse-link signal received by a given base station 270 is processed within that BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions, including the coordination of soft handoff procedures between BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for forming an interface with the PSTN 290. Similarly, the PSTN 290 forms an interface with the MSC 280, the MSC forms an interface with the BSCs 275, and the BSCs 275 correspondingly control the BSs 270 to send forward-link signals to the mobile terminals 100.
An AR head-mounted device is itself a kind of mobile terminal and possesses the various characteristics of a mobile terminal; the modules of the mobile terminal described above may be provided in the AR head-mounted device so that it has full communication and interaction capability.
Based on the mobile terminal hardware configuration and communication system described above, the embodiments of the present invention are now proposed.
A first embodiment of the invention proposes an AR head-mounted device whose structure is shown schematically in Fig. 3, including:
an acquisition module 10, configured to acquire an object in an AR scene through a camera;
a matching module 11, configured to match the object against feature values in a preset feature database, to determine whether the object is an interactable object; and
an execution module 12, configured to, when the object is an interactable object, perform a predetermined interactive action on the object according to an operating gesture of the user.
When implementing the above AR head-mounted device, if the AR head-mounted device is configured as a mobile terminal with full communication capability, the functions corresponding to the acquisition module 10, the matching module 11, and the execution module 12 may be implemented in the processor. The AR head-mounted device may take the form of a helmet, glasses, or, of course, a goggle-style mask, among others.
An AR head-mounted device generally has both an optical camera and an infrared camera. The optical camera photographs everyday scenery, while the infrared camera, by exploiting the properties of infrared light, can penetrate tissue; in this way, the image capture capability, especially for nighttime images, can be enhanced.
The acquisition module 10 of this embodiment uses the optical camera and the infrared camera of the AR head-mounted device to acquire an object in the AR scene. The acquired object may take many forms, for example, a person, a commodity, or an exhibit.
After the acquisition module 10 has acquired the object in the AR scene, the matching module 11 matches the object against the feature values in the preset feature database to determine whether the object is an interactable object. A feature value in the preset feature database is essentially a kind of associated mapping of an object. For example, if a bottle of medicine is captured by the camera, the bottle can be matched against similar pictures in the preset feature database to find a match. When matching against similar pictures, candidates may first be selected according to a similarity threshold, for example, pictures with a similarity greater than 80%. Of course, object information may also be obtained by conventional code scanning: for example, the camera scans the QR code on the medicine bottle to obtain the object information, which is then matched against the object information in the preset feature database to determine whether the medicine can be purchased or commented on.
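The matching step described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the feature-vector representation, the cosine-similarity measure, and all names (`match_object`, `feature_db`) are assumptions; only the idea of accepting a match above a preset similarity (e.g. 80%) comes from the text.

```python
import math

def cosine_similarity(a, b):
    # similarity in [0, 1] for non-negative feature vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def match_object(features, feature_db, threshold=0.8):
    """Return (object_id, similarity) of the best match at or above
    `threshold`, or None if the object is not an interactable object."""
    best_id, best_sim = None, 0.0
    for obj_id, ref in feature_db.items():
        sim = cosine_similarity(features, ref)
        if sim > best_sim:
            best_id, best_sim = obj_id, sim
    return (best_id, best_sim) if best_sim >= threshold else None
```

A real device would extract the feature vectors with an image-recognition algorithm; here they are simply given.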
After the matching module 11 determines that the object is an interactable object, an interactive action can be performed on the object through the execution module 12; that is, the execution module 12 performs a predetermined interactive action on the object according to the user's operating gesture.
If the execution module 12 recognized only a single kind of operating gesture, such as a click, the experience would be of limited interest to the user. Therefore, to make interaction through the AR head-mounted device more engaging, the user's operating gestures need to be recognized and distinguished.
During recognition, it must first be determined whether the gesture has permission to operate on the identified object. It is therefore necessary to judge whether the user's hand is in front of the object: if it is in front of the object, various operations can be carried out; if it is behind the object, i.e., the object blocks the hand, the gesture's action cannot be distinguished.
Once it is determined that the user's hand is in front of the object, the user's operating gesture can be analyzed, with different operating gestures corresponding to different actions. For example, if two of the user's fingers move in opposite directions, the user probably wants to zoom in on the object; if the user's finger double-taps the object, the user may want to zoom in on it, or may want to add a comment to it, in which case the object is operated on according to the function assigned to the double-tap. Of course, different actions can be performed according to different operating gestures, such as create, edit, send, delete, drag, zoom in, zoom out, and select. These operations are modeled on the user's habits when operating with a mouse and keyboard, so as to help the user adapt quickly to the AR head-mounted device; details are omitted here.
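The gesture-to-action mapping described above can be sketched as a simple lookup table. The gesture names, the `GESTURE_ACTIONS` table, and the dictionary-based object state are all illustrative assumptions; the text only specifies that different gestures map to operations such as create, edit, delete, drag, zoom, and select.

```python
GESTURE_ACTIONS = {
    "pinch_out":  "zoom_in",    # two fingers moving apart
    "pinch_in":   "zoom_out",   # two fingers moving together
    "double_tap": "edit",       # e.g. add a comment / digital label
    "drag":       "drag",
    "tap_hold":   "select",
    "swipe_left": "delete",
}

def dispatch(gesture, obj_state):
    """Look up the operation for `gesture` and apply it to a simple
    object state; returns the operation name, or None if unrecognized."""
    op = GESTURE_ACTIONS.get(gesture)
    if op == "zoom_in":
        obj_state["scale"] *= 1.25
    elif op == "zoom_out":
        obj_state["scale"] /= 1.25
    elif op == "select":
        obj_state["selected"] = True
    return op
```

Keeping the table as data rather than hard-coded branches would let the preset operation database be updated without changing the recognition code.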
After the user has operated on the object through the AR head-mounted device, the operation information needs to be sent to the cloud server that interacts with the AR head-mounted device. If the operation information is a purchase of a commodity, the cloud server forwards it to the server of the shopping site where the commodity is listed. If the operation information is an edit to a digital label for a person or work (a comment, discussion, etc.), the cloud server, after receiving the operation information, sends it to all other AR head-mounted devices connected to it, so that those devices can present the digital label, achieving a barrage (bullet-screen) effect. When the digital label is sent to the other AR head-mounted devices, various presentation modes can be set, for example, the barrage wrapping around the object in 3D, or the barrage fading in and out.
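The fan-out described above — the cloud server storing a digital label and pushing it to every connected head-mounted device at the next update time — might look like the following sketch. The `CloudServer` and `Headset` classes and all method names are invented for illustration.

```python
class CloudServer:
    """Sketch: stores digital labels received from one headset and pushes
    them to every connected headset when the next update time arrives."""
    def __init__(self):
        self.devices = []   # all connected AR head-mounted devices
        self.pending = []   # labels waiting for the next label update time

    def connect(self, device):
        self.devices.append(device)

    def receive_operation(self, op):
        if op["type"] == "digital_label":
            self.pending.append(op["label"])
        # a "purchase" operation would instead be forwarded to the
        # shopping site's own server (not modeled here)

    def on_update_tick(self):
        # next label update time has arrived: push to all headsets
        for label in self.pending:
            for dev in self.devices:
                dev.show_label(label)
        self.pending.clear()


class Headset:
    """Minimal stand-in for the display side of a head-mounted device."""
    def __init__(self):
        self.labels = []

    def show_label(self, label):
        self.labels.append(label)   # would be rendered as a barrage comment
```

Batching labels until the update tick, as the text describes, keeps every connected device's barrage view consistent at each update.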
In the embodiment of the present invention, the object in the AR scene is acquired by the camera and its interactability is analyzed; when the object is an interactable object, interactive actions are performed on it according to the user's operating gestures. This process requires no external devices such as a keyboard or mouse: the user simply operates with a bare hand, the camera captures the operating gesture, and the corresponding operation is performed on the object according to that gesture. This solves the problem in the prior art that, when a user uses an AR head-mounted device, input must be entered through external devices such as a keyboard and mouse, making operation complicated.
A second embodiment of the invention proposes an AR head-mounted device whose structure is shown schematically in Fig. 4, including:
an acquisition module 20, configured to acquire an object in an AR scene through a camera;
a matching module 21, configured to match the object against feature values in a preset feature database, to determine whether the object is an interactable object; and
an execution module 22, configured to, when the object is an interactable object, perform a predetermined interactive action on the object according to an operating gesture of the user.
The matching module 21 includes: a feature matching unit 210, configured to judge whether the preset feature database contains a feature value that reaches a preset matching degree with the object; and a determination unit 211, configured to determine that the object is an interactable object when the preset feature database contains a feature value that reaches the preset matching degree with the object.
The execution module 22 includes: a judging unit 220, configured to judge whether the operating gesture occludes the object; an operation matching unit 221, configured to, when the operating gesture occludes the object, match the operating gesture against the operating gestures in a preset operation database to determine the operation content corresponding to the operating gesture, where the operation content includes at least one of the following: create, edit, send, delete, drag, zoom in, zoom out, and select; and an execution unit 222, configured to operate on the object according to the operation content.
When implementing the above AR head-mounted device, if the AR head-mounted device is configured as a mobile terminal with full communication capability, the functions corresponding to the acquisition module 20, the matching module 21, and the execution module 22 may be implemented in the processor. The AR head-mounted device may take the form of a helmet, glasses, or, of course, a goggle-style mask, among others.
An AR head-mounted device generally has both an optical camera and an infrared camera. The optical camera photographs everyday scenery, while the infrared camera, by exploiting the properties of infrared light, can penetrate tissue; in this way, the image capture capability, especially for nighttime images, can be enhanced.
The acquisition module 20 of this embodiment uses the optical camera and the infrared camera of the AR head-mounted device to acquire an object in the AR scene. The acquired object may take many forms, for example, a person, a commodity, or an exhibit.
After the acquisition module 20 has acquired the object in the AR scene, the feature matching unit 210 of the matching module 21 matches the object against the feature values in the preset feature database, and the determination unit 211 determines whether the object is an interactable object. A feature value in the preset feature database is essentially a kind of associated mapping of an object. For example, if a bottle of medicine is captured by the camera, the bottle can be matched against similar pictures in the preset feature database to find a match. When matching against similar pictures, candidates may first be selected according to a similarity threshold, for example, pictures with a similarity greater than 80%. Of course, object information may also be obtained by conventional code scanning: for example, the camera scans the QR code on the medicine bottle to obtain the object information, which is then matched against the object information in the preset feature database to determine whether the medicine can be purchased or commented on.
After the matching module 21 determines that the object is an interactable object, the acquisition module 20 also obtains from the cloud server the types of interactive operation available for the object, where the types include at least one of the following: purchase interaction and digital-label interaction. Once the types of interactive operation are determined, they can be presented to the user, so that the user knows which interactive operations can be carried out, and the execution module 22 performs the predetermined interactive action on the object according to the user's operating gesture.
If the execution module 22 recognized only a single kind of operating gesture, such as a click, the experience would be of limited interest to the user. Therefore, to make interaction through the AR head-mounted device more engaging, the user's operating gestures need to be recognized and distinguished.
During recognition, it must first be determined whether the gesture has permission to operate on the identified object. The judging unit 220 therefore judges whether the user's hand is in front of the object: if it is in front of the object, various operations can be carried out; if it is behind the object, i.e., the object blocks the hand, the gesture's action cannot be distinguished.
Once it is determined that the user's hand is in front of the object, the operation matching unit 221 can analyze what the user's operating gesture is, and the execution unit 222 can perform different actions according to different operating gestures. For example, if two of the user's fingers move in opposite directions, the user probably wants to zoom in on the object; if the user's finger double-taps the object, the user may want to zoom in on it, or may want to add a comment to it, in which case the object is operated on according to the function assigned to the double-tap. Of course, different actions can be performed according to different operating gestures, such as create, edit, send, delete, drag, zoom in, zoom out, and select. These operations are modeled on the user's habits when operating with a mouse and keyboard, so as to help the user adapt quickly to the AR head-mounted device.
A third embodiment of the invention proposes an AR interactive system whose system interaction is illustrated in Fig. 5, including:
a cloud server 1 and a plurality of AR head-mounted devices 2;
the cloud server 1 interacts with all AR head-mounted devices 2 connected to it and receives the operation data sent by any one AR head-mounted device 2; when the operation data is a digital label that has been operated on, the cloud server pushes the digital label to all AR head-mounted devices 2 when the next label update time arrives.
Each of the AR head-mounted devices 2 includes the following modules:
an acquisition module 30, configured to acquire an object in an AR scene through a camera;
a matching module 31, configured to match the object against feature values in a preset feature database, to determine whether the object is an interactable object; and
an execution module 32, configured to, when the object is an interactable object, perform a predetermined interactive action on the object according to an operating gesture of the user.
When implementing the above AR head-mounted device, if the AR head-mounted device is configured as a mobile terminal with full communication capability, the functions corresponding to the acquisition module 30, the matching module 31, and the execution module 32 may be implemented in the processor. The AR head-mounted device may take the form of a helmet, glasses, or, of course, a goggle-style mask, among others.
An AR head-mounted device generally has both an optical camera and an infrared camera. The optical camera photographs everyday scenery, while the infrared camera, by exploiting the properties of infrared light, can penetrate tissue; in this way, the image capture capability, especially for nighttime images, can be enhanced.
The acquisition module 30 of this embodiment uses the optical camera and the infrared camera of the AR head-mounted device to acquire an object in the AR scene. The acquired object may take many forms, for example, a person, a commodity, or an exhibit.
After the acquisition module 30 has acquired the object in the AR scene, the matching module 31 matches the object against the feature values in the preset feature database to determine whether the object is an interactable object. A feature value in the preset feature database is essentially a kind of associated mapping of an object. For example, if a bottle of medicine is captured by the camera, the bottle can be matched against similar pictures in the preset feature database to find a match. When matching against similar pictures, candidates may first be selected according to a similarity threshold, for example, pictures with a similarity greater than 80%. Of course, object information may also be obtained by conventional code scanning: for example, the camera scans the QR code on the medicine bottle to obtain the object information, which is then matched against the object information in the preset feature database to determine whether the medicine can be purchased or commented on.
After the matching module 31 determines that the object is an interactable object, an interactive action can be performed on the object through the execution module 32; that is, the execution module 32 performs a predetermined interactive action on the object according to the user's operating gesture.
If the execution module 32 recognized only a single kind of operating gesture, such as a click, the experience would be of limited interest to the user. Therefore, to make interaction through the AR head-mounted device more engaging, the user's operating gestures need to be recognized and distinguished.
During recognition, it must first be determined whether the gesture has permission to operate on the identified object. It is therefore necessary to judge whether the user's hand is in front of the object: if it is in front of the object, various operations can be carried out; if it is behind the object, i.e., the object blocks the hand, the gesture's action cannot be distinguished.
Once it is determined that the user's hand is in front of the object, the user's operating gesture can be analyzed, with different operating gestures corresponding to different actions. For example, if two of the user's fingers move in opposite directions, the user probably wants to zoom in on the object; if the user's finger double-taps the object, the user may want to zoom in on it, or may want to add a comment to it, in which case the object is operated on according to the function assigned to the double-tap. Of course, different actions can be performed according to different operating gestures, such as create, edit, send, delete, drag, zoom in, zoom out, and select. These operations are modeled on the user's habits when operating with a mouse and keyboard, so as to help the user adapt quickly to the AR head-mounted device; details are omitted here.
After the user has operated on the object through the AR head-mounted device, the operation information needs to be sent to the cloud server that interacts with the AR head-mounted device. If the operation information is a purchase of a commodity, the cloud server forwards it to the server of the shopping site where the commodity is listed. If the operation information is an edit to a digital label for a person or work (a comment, discussion, etc.), the cloud server, after receiving the operation information, sends it to all other AR head-mounted devices connected to it, so that those devices can present the digital label, achieving the barrage (bullet-screen) effect. When the digital label is sent to the other AR head-mounted devices, various presentation modes can be set, for example, the barrage wrapping around the object in 3D, or the barrage fading in and out.
In the embodiment of the present invention, the object in the AR scene is acquired by the camera and its interactability is analyzed; when the object is an interactable object, interactive actions are performed on it according to the user's operating gestures. This process requires no external devices such as a keyboard or mouse: the user simply operates with a bare hand, the camera captures the operating gesture, and the corresponding operation is performed on the object according to that gesture. Moreover, after receiving a digital label operated on by a user, the cloud server can send the digital label to the users of the other AR head-mounted devices, giving the service wide reach. This solves the problem in the prior art that, when a user uses an AR head-mounted device, input must be entered through external devices such as a keyboard and mouse, making operation complicated; it also solves the problem that entered data, after being collected, is projected only onto that user's own AR head-mounted device screen, so that the effect of the operation serves only an individual and the reach of the service is narrow.
A fourth embodiment of the invention proposes an interaction method for an AR scene. The interaction involves an AR head-mounted device, which may take the form of a helmet, glasses, or, of course, a goggle-style mask, among others.
The flow of the above interaction method for an AR scene is shown in Fig. 6 and includes steps S602 to S606:
S602: the AR head-mounted device acquires an object in the AR scene through a camera;
S604: the AR head-mounted device matches the object against feature values in a preset feature database, to determine whether the object is an interactable object;
S606: when the object is an interactable object, the AR head-mounted device performs a predetermined interactive action on the object according to the user's operating gesture.
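The three steps above can be sketched as one pipeline, under the assumption that matching reduces to a similarity lookup and the resulting interaction is represented as a simple record; none of the names below come from the patent.

```python
def interact(frame, feature_db, gesture, threshold=0.8):
    """Toy pipeline for steps S602-S606.

    frame      -- stand-in for what the camera captured (S602)
    feature_db -- object name -> match similarity in the preset database
    gesture    -- the user's recognized operating gesture
    """
    # S602: the object has already been captured from the AR scene
    obj = frame.get("object")
    # S604: match against the preset feature database
    sim = feature_db.get(obj, 0.0)
    if sim < threshold:
        return None            # not an interactable object: no interaction
    # S606: execute the predetermined interaction for the user's gesture
    return {"object": obj, "action": gesture}
```

In a real device, S602 would come from the camera modules and S604 from a feature-matching algorithm; here both are collapsed into dictionary lookups to show only the control flow.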
Since an AR head-mounted device generally has both an optical camera and an infrared camera, the optical camera photographing everyday scenery and the infrared camera, by exploiting the properties of infrared light, able to penetrate tissue and so enhance image capture, especially for nighttime images, the AR head-mounted device can capture the AR scene and the user's operating gestures, and transmit the acquired images to the processor (CPU, MCU, GPU, etc.) in the AR head-mounted device for subsequent processing.
Since the object captured by the camera may be a person, a commodity, an exhibit, and so on, the processor needs to run an image recognition algorithm that compares the feature values specific to the person or thing, such as a performer, a commodity, or an exhibit, against the feature values in the preset feature database. Because the preset feature database may reside on the cloud server or locally on the AR head-mounted device, the acquired object can be compared against image feature values stored either on the cloud server or locally on the AR head-mounted device. An object for which the preset feature database contains a feature value reaching the preset matching degree is determined to be an interactable object.
A feature value in the preset feature database is essentially a kind of associated mapping of an object. For example, if a bottle of medicine is captured by the camera, the bottle can be matched against similar pictures in the preset feature database to find a match. When matching against similar pictures, candidates may first be selected according to a similarity threshold, for example, pictures with a similarity greater than 80%. Of course, object information may also be obtained by conventional code scanning: for example, the camera scans the QR code on the medicine bottle to obtain the object information, which is then matched against the object information in the preset feature database to determine whether the medicine can be purchased or commented on.
When the object is an interactable object, the user's gesture operation needs to be identified further so that the corresponding action can be performed according to it; therefore, before actually interacting, the user's operating gesture must also be recognized.
Before the AR head-mounted device performs the predetermined interactive action on the object according to the user's operating gesture, it may also obtain from the cloud server the types of interactive operation available for the object, where the types include at least one of the following: purchase interaction and digital-label interaction. Once the types of interactive operation are determined, they can be presented to the user, so that the user knows which interactive operations can be carried out.
When performing the predetermined interactive action on the object according to the user's operating gesture, the AR head-mounted device judges whether the operating gesture occludes the object. When the operating gesture occludes the object, the hand is in front of the object and all operations are possible; the AR head-mounted device then matches the operating gesture against the operating gestures in the preset operation database to determine the operation content corresponding to the operating gesture, where the operation content includes at least one of the following: create, edit, send, delete, drag, zoom in, zoom out, and select. The AR head-mounted device then operates on the object according to the operation content. If the operating gesture does not occlude the object at all, it cannot be determined which object the operation is directed at, and no operation can be carried out.
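The occlusion test and operation lookup described above might be sketched as follows. Representing the hand and the object by scalar depths from the camera is an assumption for illustration; the text only requires that the hand be in front of (not blocked by) the object for the gesture to carry an operation.

```python
def gesture_has_permission(hand_depth, object_depth):
    """True when the hand is nearer the camera than the object,
    i.e. the hand occludes the object rather than the reverse."""
    return hand_depth < object_depth

def resolve_operation(hand_depth, object_depth, gesture, op_db):
    """Return the operation content for `gesture` from the preset
    operation database `op_db`, or None if the object blocks the hand."""
    if not gesture_has_permission(hand_depth, object_depth):
        return None            # object in front of hand: no operation
    return op_db.get(gesture)  # e.g. "edit", "delete", "zoom_in"
```

A depth map from stereo or infrared sensing could supply the two depths; any per-pixel occlusion mask would serve the same role as this scalar comparison.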
After the object has been operated on according to the operation content, the operation information needs to be sent to the cloud server that interacts with the AR head-mounted device. After the user taps send, the operation information is transmitted directly over the wireless network to the cloud server for storage. For example, if a digital label (barrage comment) has been added, the cloud server stores it and then updates the content displayed at all users' AR head-mounted devices, so that the users of all AR head-mounted device terminals connected to the server can watch the barrage in real time.
If the user's operation information is a purchase of a commodity, the cloud server forwards it to the server of the shopping site where the commodity is listed. If the operation information is an edit to a digital label for a person or work (a comment, discussion, etc.), the cloud server, after receiving the operation information, sends it to all other AR head-mounted devices connected to it, so that those devices can present the digital label, achieving the barrage effect. When the digital label is sent to the other AR head-mounted devices, various presentation modes can be set, for example, the barrage wrapping around the object in 3D, or the barrage fading in and out.
Both the 3D wrap-around barrage and the fade-in/fade-out barrage are presentation modes already realized on computer screens in the prior art; the two processes are briefly described below.
For the 3D wrap-around barrage: after the interactable object in the AR scene has been successfully recognized, the interactable object is segmented from the background, separating foreground from background. After a successful separation, a digital label (i.e., a label in barrage form, the barrage) is added around the foreground object, and an animation of the digital label orbiting the interactable object is played. When the digital label passes directly in front of the object, the overlapping region is rendered with the digital label's content covering the interactable object's pixels; when the digital label passes directly behind the object, the overlapping region is rendered with the interactable object's pixels covering the digital label, thereby achieving the 3D wrap-around display effect.
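The draw-order rule described above, label over object in the front half of the orbit and object over label in the back half, can be sketched with a single angle test. The angle convention (0 rad means the label is directly in front of the object, toward the camera) is an assumption.

```python
import math

def label_draw_order(orbit_angle):
    """Compositing order for an orbiting digital label.

    Returns 'label_over_object' while the label is in the front half of
    its orbit (within 90 degrees of the camera-facing direction) and
    'object_over_label' while it is in the back half, so the object's
    pixels occlude the label behind it."""
    facing = math.cos(orbit_angle) > 0
    return "label_over_object" if facing else "object_over_label"
```

A renderer would apply this per frame as the orbit angle advances, after the foreground/background segmentation has isolated the object's pixels.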
The fade-in/fade-out barrage means that while the digital label runs from above the interactable object to below it, from left to right, or from front to back, its trajectory must obey the perspective principles of near-large/far-small and horizontal parallax, projecting onto different pixel coordinates on the left and right screens of the AR head-mounted device so as to achieve a stereoscopic viewing effect for both eyes. The fade-in and fade-out refer to the initial and final stages of the run: the transparency of the barrage changes, linearly or nonlinearly, from 100 to 0 and then back from 0 to 100, rather than the barrage appearing abruptly, so that the transition is smoother and does not startle the user.
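The fade-in/fade-out curve described above might be sketched as an opacity ramp at each end of the label's run; the linear ramp and the `fade` fraction parameter are assumptions (the text also allows a nonlinear curve).

```python
def label_opacity(t, fade=0.2):
    """Opacity in [0, 100] for a normalized run time t in [0, 1].

    The label fades in over the first `fade` fraction of the run,
    stays fully opaque in the middle, and fades out over the last
    `fade` fraction, so it never appears or vanishes abruptly."""
    if t < fade:                 # fade in: 0 -> 100
        return 100.0 * t / fade
    if t > 1.0 - fade:           # fade out: 100 -> 0
        return 100.0 * (1.0 - t) / fade
    return 100.0
```

Substituting a smoothstep or ease-in-out function for the two linear ramps would give the nonlinear variant the text mentions.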
With the embodiment of the present invention, members of the public can conveniently create, edit, and send barrage content in the AR scene in real time by means such as gesture interaction; more users can interact with the objects in the AR scene in an easier way, and can more conveniently communicate with other users who share their interests. An ordinary user can perform gesture recognition and interaction directly through the optical camera and infrared camera on the AR head-mounted device, and can thus evaluate the objects displayed in the AR scene directly, without any additional input devices (keyboard, mouse, touch screen, etc.), which is both more personal and more convenient.
It should be noted that, in this document, the terms "include" and "comprise" and any variants thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. In the absence of further limitation, an element defined by the statement "including a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that includes that element.
The above numbering of the embodiments of the invention is for description only and does not indicate the relative merits of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, or the part of it that contributes over the prior art, can be embodied in the form of a software product, stored in a storage medium (such as ROM/RAM, magnetic disk, or optical disc) and including a number of instructions that cause a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to execute the methods described in the embodiments of the present invention.
The above are only preferred embodiments of the present invention and do not thereby limit the scope of the claims of the present invention. Any equivalent structural or process transformation made using the contents of the description and drawings of the present invention, whether applied directly or indirectly in other related technical fields, is likewise included within the scope of patent protection of the present invention.

Claims (10)

1. An augmented reality (AR) head-mounted device, characterized by including:
an acquisition module, configured to acquire an object in an augmented reality (AR) scene through a camera;
a matching module, configured to match the object against feature values in a preset feature database, to determine whether the object is an interactable object; and
an execution module, configured to, when the object is an interactable object, perform a predetermined interactive action on the object according to an operating gesture of a user.
2. The AR head-mounted device according to claim 1, characterized in that the matching module includes:
a feature matching unit, configured to judge whether the preset feature database contains a feature value that reaches a preset matching degree with the object; and
a determination unit, configured to determine that the object is an interactable object when the preset feature database contains a feature value that reaches the preset matching degree with the object.
3. The AR helmet according to claim 1, wherein the execution module comprises:
a judging unit, configured to judge whether the operating gesture blocks the object;
an operation matching unit, configured to match, when the operating gesture blocks the object, the operating gesture against operating gestures in a preset operation database, so as to determine an operation content corresponding to the operating gesture, wherein the operation content comprises at least one of the following: create, edit, send, delete, drag, zoom in, zoom out, and select; and
an execution unit, configured to operate on the object according to the operation content.
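The judging and operation-matching units of claim 3 can be illustrated with a bounding-box occlusion test and a gesture-to-operation lookup. Everything here is a hypothetical sketch under stated assumptions: the box format, the gesture names, and the dictionary standing in for the preset operation database are not from the patent.

```python
# Illustrative preset operation database: gesture name -> operation content,
# drawing from the claim's list (create, edit, send, delete, drag,
# zoom in, zoom out, select).
GESTURE_DB = {
    "pinch": "zoom out",
    "spread": "zoom in",
    "swipe_left": "delete",
    "tap": "select",
}

def occludes(gesture_box, object_box):
    """Judging unit: the gesture 'blocks' the object when the hand's bounding
    box overlaps the object's bounding box in the camera image.
    Boxes are (x1, y1, x2, y2)."""
    gx1, gy1, gx2, gy2 = gesture_box
    ox1, oy1, ox2, oy2 = object_box
    return gx1 < ox2 and ox1 < gx2 and gy1 < oy2 and oy1 < gy2

def operation_for(gesture_name, gesture_box, object_box):
    """Operation matching unit: only a gesture that occludes the object is
    matched against the database; otherwise no operation is performed."""
    if not occludes(gesture_box, object_box):
        return None
    return GESTURE_DB.get(gesture_name)

print(operation_for("tap", (10, 10, 30, 30), (20, 20, 60, 60)))  # boxes overlap -> select
print(operation_for("tap", (0, 0, 5, 5), (20, 20, 60, 60)))      # no overlap -> None
```

A real system would recognize gesture names from hand tracking; the occlusion test is the simplest overlap check and serves only to show where the decision sits in the pipeline.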
4. The AR helmet according to any one of claims 1 to 3, wherein:
the acquisition module is further configured to acquire, from a cloud server and before the predetermined interactive action is performed on the object according to the operating gesture of the user, the types of interactive operation supported by the object, wherein the types comprise at least one of the following: purchase interaction and digital label interaction.
5. An AR interactive system, comprising:
a cloud server and the AR helmet according to any one of claims 1 to 4;
wherein the cloud server is configured to interact with all AR helmets connected to the cloud server, and, upon receiving a digital label that has been operated on by one of the AR helmets, to push the digital label to all of the AR helmets when the next label update time arrives.
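The push behaviour described in claim 5 — queue digital labels received from helmets, then broadcast them to every connected helmet at the next label update time — can be sketched as follows. Class and method names are illustrative assumptions, not the patent's implementation.

```python
class CloudServer:
    """Minimal sketch of the claim-5 cloud server: labels received between
    update ticks are queued, then broadcast to all connected helmets."""

    def __init__(self):
        self.helmets = []         # all AR helmets connected to this server
        self.pending_labels = []  # digital labels received since the last update

    def connect(self, helmet):
        self.helmets.append(helmet)

    def receive_label(self, label):
        # A helmet has operated on a digital label; hold it until the update time.
        self.pending_labels.append(label)

    def on_update_time(self):
        # The next label update time has arrived: push every pending label
        # to all connected helmets, then clear the queue.
        for label in self.pending_labels:
            for helmet in self.helmets:
                helmet.inbox.append(label)
        self.pending_labels.clear()

class Helmet:
    def __init__(self):
        self.inbox = []  # labels pushed down by the server

server = CloudServer()
a, b = Helmet(), Helmet()
server.connect(a)
server.connect(b)
server.receive_label({"object": "poster", "text": "great movie"})
server.on_update_time()
print(a.inbox == b.inbox == [{"object": "poster", "text": "great movie"}])  # True
```

Batching pushes at a fixed update time (rather than forwarding each label immediately) matches the claim's "next tag update time" wording and keeps all helmets' label views synchronized.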
6. An interaction method for an AR scene, comprising the steps of:
acquiring, by an AR helmet, an object in an augmented reality (AR) scene through a camera;
matching, by the AR helmet, the object against feature values in a preset feature database, so as to determine whether the object is an interactive object; and
performing, by the AR helmet when the object is an interactive object, a predetermined interactive action on the object according to an operating gesture of a user.
7. The interaction method for an AR scene according to claim 6, wherein the step of matching, by the AR helmet, the object against the feature values in the preset feature database, so as to determine whether the object is an interactive object, comprises:
judging, by the AR helmet, whether the preset feature database contains a feature value reaching a preset matching degree with the object; and
determining, by the AR helmet, that the object is an interactive object when the preset feature database contains a feature value reaching the preset matching degree with the object.
8. The interaction method for an AR scene according to claim 6, wherein the step of performing, by the AR helmet, the predetermined interactive action on the object according to the operating gesture of the user comprises:
judging, by the AR helmet, whether the operating gesture blocks the object;
matching, by the AR helmet when the operating gesture blocks the object, the operating gesture against operating gestures in a preset operation database, so as to determine an operation content corresponding to the operating gesture, wherein the operation content comprises at least one of the following: create, edit, send, delete, drag, zoom in, zoom out, and select; and
operating on the object, by the AR helmet, according to the operation content.
9. The interaction method for an AR scene according to any one of claims 6 to 8, further comprising, before the step of performing, by the AR helmet, the predetermined interactive action on the object according to the operating gesture of the user:
acquiring, by the AR helmet from a cloud server, the types of interactive operation supported by the object, wherein the types comprise at least one of the following: purchase interaction and digital label interaction.
10. The interaction method for an AR scene according to claim 9, wherein, when the type of interactive operation is digital label interaction, the method further comprises, after the step of operating on the object according to the operation content:
receiving, by the cloud server, the digital label that has been operated on by the AR helmet; and
pushing, by the cloud server when the next label update time arrives, the received digital label to all AR helmets connected to the cloud server.
CN201610489453.4A 2016-06-28 2016-06-28 AR helmet, AR interactive system and the exchange method of AR scene Pending CN106155311A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610489453.4A CN106155311A (en) 2016-06-28 2016-06-28 AR helmet, AR interactive system and the exchange method of AR scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610489453.4A CN106155311A (en) 2016-06-28 2016-06-28 AR helmet, AR interactive system and the exchange method of AR scene

Publications (1)

Publication Number Publication Date
CN106155311A true CN106155311A (en) 2016-11-23

Family

ID=57350129

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610489453.4A Pending CN106155311A (en) 2016-06-28 2016-06-28 AR helmet, AR interactive system and the exchange method of AR scene

Country Status (1)

Country Link
CN (1) CN106155311A (en)

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106683201A (en) * 2016-12-23 2017-05-17 深圳市豆娱科技有限公司 Scene editing method and device based on three-dimensional virtual reality
CN106740581A (en) * 2017-01-03 2017-05-31 青岛海信移动通信技术股份有限公司 Control method of a mobile unit, AR device and AR system
CN106951073A (en) * 2017-03-08 2017-07-14 深圳市瑞尔时代科技有限公司 3D stereoscopic scene preview and layout system based on a VR technology platform
CN107197339A (en) * 2017-04-10 2017-09-22 北京小鸟看看科技有限公司 Display control method and device of film bullet screen, and head-mounted display apparatus
CN107300972A (en) * 2017-06-15 2017-10-27 北京小鸟看看科技有限公司 Method and device for displaying comments in a head-mounted display device, and head-mounted display device
CN107635057A (en) * 2017-07-31 2018-01-26 努比亚技术有限公司 Virtual reality terminal control method, terminal and computer-readable storage medium
WO2018152685A1 (en) * 2017-02-22 2018-08-30 Tencent Technology (Shenzhen) Company Limited Image processing in a vr system
CN109068125A (en) * 2018-07-30 2018-12-21 上海闻泰电子科技有限公司 AR system
CN109564706A (en) * 2016-12-01 2019-04-02 英特吉姆股份有限公司 User interaction platform based on intelligent interactive augmented reality
WO2019114092A1 (en) * 2017-12-15 2019-06-20 深圳梦境视觉智能科技有限公司 Image augmented reality method and apparatus, and augmented reality display device and terminal
CN110100199A (en) * 2017-01-13 2019-08-06 努诺·安图内斯 System and method for acquisition, registration and multimedia management
CN110249291A (en) * 2017-02-01 2019-09-17 Pcms控股公司 System and method for the augmented reality content delivery in pre-capture environment
CN111142675A (en) * 2019-12-31 2020-05-12 维沃移动通信有限公司 Input method and head-mounted electronic equipment
WO2020143443A1 (en) * 2019-01-09 2020-07-16 北京京东尚科信息技术有限公司 Review interface entering method and device
CN111476911A (en) * 2020-04-08 2020-07-31 Oppo广东移动通信有限公司 Virtual image implementation method and device, storage medium and terminal equipment
CN111580655A (en) * 2020-05-08 2020-08-25 维沃移动通信有限公司 Information processing method and device and electronic equipment
CN112987929A (en) * 2021-03-16 2021-06-18 南京新知艺测科技有限公司 AR visual scene learning method and system based on labor skill teaching materials
US20220319520A1 (en) * 2019-06-03 2022-10-06 Tsinghua University Voice interaction wakeup electronic device, method and medium based on mouth-covering action recognition
TWI782211B (en) * 2018-08-02 2022-11-01 開曼群島商創新先進技術有限公司 Human-computer interaction method and device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103226787A (en) * 2013-05-17 2013-07-31 王琼 Mobile electronic device with augmented reality technology and sales field application method thereof
CN104063039A (en) * 2013-03-18 2014-09-24 朱慧灵 Human-computer interaction method of wearable computer intelligent terminal
CN104615237A (en) * 2013-11-05 2015-05-13 精工爱普生株式会社 Image display system, method of controlling image display system, and head-mount type display device
CN105045375A (en) * 2014-05-01 2015-11-11 精工爱普生株式会社 Head-mount type display device, method of controlling head-mount type display device, control system, and computer program
CN105190477A (en) * 2013-03-21 2015-12-23 索尼公司 Head-mounted device for user interactions in an amplified reality environment
CN105487673A (en) * 2016-01-04 2016-04-13 京东方科技集团股份有限公司 Man-machine interactive system, method and device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104063039A (en) * 2013-03-18 2014-09-24 朱慧灵 Human-computer interaction method of wearable computer intelligent terminal
CN105190477A (en) * 2013-03-21 2015-12-23 索尼公司 Head-mounted device for user interactions in an amplified reality environment
CN103226787A (en) * 2013-05-17 2013-07-31 王琼 Mobile electronic device with augmented reality technology and sales field application method thereof
CN104615237A (en) * 2013-11-05 2015-05-13 精工爱普生株式会社 Image display system, method of controlling image display system, and head-mount type display device
CN105045375A (en) * 2014-05-01 2015-11-11 精工爱普生株式会社 Head-mount type display device, method of controlling head-mount type display device, control system, and computer program
CN105487673A (en) * 2016-01-04 2016-04-13 京东方科技集团股份有限公司 Man-machine interactive system, method and device

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109564706A (en) * 2016-12-01 2019-04-02 英特吉姆股份有限公司 User interaction platform based on intelligent interactive augmented reality
CN109564706B (en) * 2016-12-01 2023-03-10 英特吉姆股份有限公司 User interaction platform based on intelligent interactive augmented reality
CN106683201A (en) * 2016-12-23 2017-05-17 深圳市豆娱科技有限公司 Scene editing method and device based on three-dimensional virtual reality
CN106740581A (en) * 2017-01-03 2017-05-31 青岛海信移动通信技术股份有限公司 Control method of a mobile unit, AR device and AR system
CN110100199B (en) * 2017-01-13 2022-06-14 努诺·安图内斯 System and method for acquisition, registration and multimedia management
CN110100199A (en) * 2017-01-13 2019-08-06 努诺·安图内斯 System and method for acquisition, registration and multimedia management
CN110249291A (en) * 2017-02-01 2019-09-17 Pcms控股公司 System and method for the augmented reality content delivery in pre-capture environment
WO2018152685A1 (en) * 2017-02-22 2018-08-30 Tencent Technology (Shenzhen) Company Limited Image processing in a vr system
US11003707B2 (en) 2017-02-22 2021-05-11 Tencent Technology (Shenzhen) Company Limited Image processing in a virtual reality (VR) system
CN106951073A (en) * 2017-03-08 2017-07-14 深圳市瑞尔时代科技有限公司 3D stereoscopic scene preview and layout system based on a VR technology platform
CN107197339B (en) * 2017-04-10 2019-12-31 北京小鸟看看科技有限公司 Display control method and device of film bullet screen and head-mounted display equipment
CN107197339A (en) * 2017-04-10 2017-09-22 北京小鸟看看科技有限公司 Display control method and device of film bullet screen, and head-mounted display apparatus
CN107300972A (en) * 2017-06-15 2017-10-27 北京小鸟看看科技有限公司 Method and device for displaying comments in a head-mounted display device, and head-mounted display device
CN107635057A (en) * 2017-07-31 2018-01-26 努比亚技术有限公司 Virtual reality terminal control method, terminal and computer-readable storage medium
WO2019114092A1 (en) * 2017-12-15 2019-06-20 深圳梦境视觉智能科技有限公司 Image augmented reality method and apparatus, and augmented reality display device and terminal
CN109934929A (en) * 2017-12-15 2019-06-25 深圳梦境视觉智能科技有限公司 Image augmented reality method and apparatus, augmented reality display device and terminal
CN109068125A (en) * 2018-07-30 2018-12-21 上海闻泰电子科技有限公司 AR system
TWI782211B (en) * 2018-08-02 2022-11-01 開曼群島商創新先進技術有限公司 Human-computer interaction method and device
CN111429148A (en) * 2019-01-09 2020-07-17 北京京东尚科信息技术有限公司 Evaluation interface entering method and device
WO2020143443A1 (en) * 2019-01-09 2020-07-16 北京京东尚科信息技术有限公司 Review interface entering method and device
CN111429148B (en) * 2019-01-09 2024-05-21 北京京东尚科信息技术有限公司 Evaluation interface entering method and device
US20220319520A1 (en) * 2019-06-03 2022-10-06 Tsinghua University Voice interaction wakeup electronic device, method and medium based on mouth-covering action recognition
CN111142675A (en) * 2019-12-31 2020-05-12 维沃移动通信有限公司 Input method and head-mounted electronic equipment
CN111476911A (en) * 2020-04-08 2020-07-31 Oppo广东移动通信有限公司 Virtual image implementation method and device, storage medium and terminal equipment
CN111476911B (en) * 2020-04-08 2023-07-25 Oppo广东移动通信有限公司 Virtual image realization method, device, storage medium and terminal equipment
CN111580655A (en) * 2020-05-08 2020-08-25 维沃移动通信有限公司 Information processing method and device and electronic equipment
CN112987929A (en) * 2021-03-16 2021-06-18 南京新知艺测科技有限公司 AR visual scene learning method and system based on labor skill teaching materials

Similar Documents

Publication Publication Date Title
CN106155311A (en) AR helmet, AR interactive system and the exchange method of AR scene
CN106502693B (en) A kind of image display method and device
CN104750420B (en) Screenshotss method and device
CN106453538A (en) Screen sharing apparatus and method
CN106909274A (en) Method and device for displaying an image
KR20140136595A (en) Mobile terminal and method for controlling the same
CN106293069A (en) Automatic content sharing system and method
CN106657650A (en) System expression recommendation method and device, and terminal
CN104301709A (en) Glass type portable device and information projecting side searching method thereof
CN106156144A (en) Information-pushing method and device
CN106097284A (en) Night scene image processing method and mobile terminal
CN105472241A (en) Image stitching method and mobile terminal
CN106131327A (en) Terminal and image-pickup method
CN106534500A (en) Customization service system and method based on figure attributes
CN107018334A (en) Application program processing method and device based on dual cameras
CN105278860A (en) Mobile terminal image uploading device and method
CN106101376A (en) Message pushing device and method, and mobile terminal
CN106130981A (en) Device and method for customizing digital labels of augmented reality equipment
CN106383707A (en) Picture display method and system
CN106534652A (en) Lens module, lens and terminal
CN106250130A (en) Mobile terminal and method for responding to a button operation
CN106231095A (en) Picture synthesis device and method
CN105681654A (en) Photographing method and mobile terminal
CN106851114A (en) Photo display and photo generation device and method, and terminal
CN105183830B (en) Picture browsing method and device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20161123

RJ01 Rejection of invention patent application after publication