CN107657235A - Recognition method and device based on augmented reality - Google Patents
Recognition method and device based on augmented reality
- Publication number
- CN107657235A CN107657235A CN201710901518.6A CN201710901518A CN107657235A CN 107657235 A CN107657235 A CN 107657235A CN 201710901518 A CN201710901518 A CN 201710901518A CN 107657235 A CN107657235 A CN 107657235A
- Authority
- CN
- China
- Prior art keywords
- augmented reality
- user
- reality scene
- target object
- recognition result
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/243—Classification techniques relating to the number of classes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/35—Categorising the entire scene, e.g. birthday party or wedding scene
Abstract
The present disclosure relates to a recognition method and device based on augmented reality. The method includes: obtaining eye movement information of a user; determining a gaze point of the user in an augmented reality scene according to the eye movement information; determining, according to the position of the gaze point, a target object that the user pays attention to in the augmented reality scene; and obtaining a recognition result corresponding to the target object. By obtaining the user's eye movement information, determining the user's gaze point in the augmented reality scene from that information, determining from the position of the gaze point the target object the user pays attention to, and obtaining the recognition result corresponding to the target object, the disclosure makes it possible to quickly obtain the recognition result for the object the user is paying attention to, improving the efficiency of real-time recognition.
Description
Technical field
The present disclosure relates to the field of augmented reality, and in particular to a recognition method and device based on augmented reality.
Background technology
Augmented reality (AR) technology computes, in real time, the position and angle of images collected by a camera and overlays corresponding images, video, and 3D (three-dimensional) models onto them. Augmented reality integrates real-world information with virtual-world information: entity information (visual information, sound, taste, touch, and so on) that would otherwise be difficult to experience within a certain time and spatial range of the real world is simulated by computers and other technology and then superimposed onto the real world, so that virtual information is applied to the real world and perceived by the human senses, achieving a sensory experience beyond reality. Through augmented reality, the real environment and virtual objects coexist in the same picture or space in real time. Augmented reality involves technologies such as multimedia, three-dimensional modeling, real-time video display and control, multi-sensor fusion, real-time tracking and registration, and scene fusion.
In the related art, real-time recognition is performed on the image collected by the camera of an augmented reality device. Because the camera of an augmented reality device usually covers a large field of view, there are typically many objects within the camera's acquisition range, which makes real-time recognition slow and inefficient.
Summary of the invention
To overcome the problems in the related art, the present disclosure provides a recognition method and device based on augmented reality.
According to a first aspect of the embodiments of the present disclosure, there is provided a recognition method based on augmented reality, including:
obtaining eye movement information of a user;
determining a gaze point of the user in an augmented reality scene according to the eye movement information;
determining, according to the position of the gaze point, a target object that the user pays attention to in the augmented reality scene; and
obtaining a recognition result corresponding to the target object.
In a possible implementation, determining the gaze point of the user in the augmented reality scene according to the eye movement information includes: in a case where it is determined, according to the eye movement information, that the time for which the user gazes at a certain position in the augmented reality scene exceeds a time threshold, determining the position at which the user gazes as the gaze point of the user in the augmented reality scene.
In a possible implementation, after the recognition result corresponding to the target object is obtained, the method further includes: determining a feature object of the augmented reality scene, where the feature object belongs to a specified category; and obtaining a recognition result corresponding to the feature object.
In a possible implementation, after the recognition result corresponding to the target object is obtained, the method further includes: obtaining recognition results of objects in the augmented reality scene other than the target object.
According to a second aspect of the embodiments of the present disclosure, there is provided a recognition device based on augmented reality, including:
a first acquisition module, configured to obtain eye movement information of a user;
a first determining module, configured to determine a gaze point of the user in an augmented reality scene according to the eye movement information;
a second determining module, configured to determine, according to the position of the gaze point, a target object that the user pays attention to in the augmented reality scene; and
a second acquisition module, configured to obtain a recognition result corresponding to the target object.
In a possible implementation, the first determining module is configured to: in a case where it is determined, according to the eye movement information, that the time for which the user gazes at a certain position in the augmented reality scene exceeds a time threshold, determine the position at which the user gazes as the gaze point of the user in the augmented reality scene.
In a possible implementation, the device further includes: a third determining module, configured to determine a feature object of the augmented reality scene, where the feature object belongs to a specified category; and a third acquisition module, configured to obtain a recognition result corresponding to the feature object.
In a possible implementation, the device further includes: a fourth acquisition module, configured to obtain recognition results of objects in the augmented reality scene other than the target object.
According to a third aspect of the embodiments of the present disclosure, there is provided a recognition device based on augmented reality, including: a processor; and a memory for storing processor-executable instructions; where the processor is configured to perform the above method.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium; when instructions in the storage medium are executed by a processor, the processor is enabled to perform the above method.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects: by obtaining the user's eye movement information, determining the user's gaze point in the augmented reality scene according to that information, determining the target object the user pays attention to in the augmented reality scene according to the position of the gaze point, and obtaining the recognition result corresponding to the target object, the recognition result for the object the user is paying attention to can be obtained quickly, improving the efficiency of real-time recognition.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings herein are incorporated into and constitute a part of this specification, illustrate embodiments consistent with the present disclosure, and together with the specification serve to explain the principles of the disclosure.
Fig. 1 is a flowchart of a recognition method based on augmented reality according to an exemplary embodiment.
Fig. 2 is an exemplary flowchart of a recognition method based on augmented reality according to an exemplary embodiment.
Fig. 3 is another exemplary flowchart of a recognition method based on augmented reality according to an exemplary embodiment.
Fig. 4 is another exemplary flowchart of a recognition method based on augmented reality according to an exemplary embodiment.
Fig. 5 is a block diagram of a recognition device based on augmented reality according to an exemplary embodiment.
Fig. 6 is an exemplary block diagram of a recognition device based on augmented reality according to an exemplary embodiment.
Fig. 7 is a block diagram of a device 800 for recognition based on augmented reality according to an exemplary embodiment.
Embodiment
Exemplary embodiments will be described in detail here, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, unless otherwise indicated, the same numerals in different drawings denote the same or similar elements. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the disclosure as detailed in the appended claims.
Fig. 1 is a flowchart of a recognition method based on augmented reality according to an exemplary embodiment. The method can be applied to an augmented reality device. The augmented reality device may be an integrated augmented reality device such as augmented reality glasses or an augmented reality head-mounted display, or a split-type augmented reality device formed by mounting a terminal device such as a mobile phone in a support such as an augmented reality frame. As shown in Fig. 1, the method includes steps S11 to S14.
In step S11, eye movement information of the user is obtained.
In this embodiment, an eye tracking device is configured in the augmented reality device, and the augmented reality device can obtain the user's eye movement information through this eye tracking device. The eye tracking device may use existing eye tracking technology to obtain the user's eye movement information, which will not be repeated here.
In step S12, the gaze point of the user in the augmented reality scene is determined according to the eye movement information.
In a possible implementation, determining the gaze point of the user in the augmented reality scene according to the eye movement information includes: in a case where it is determined, according to the eye movement information, that the time for which the user gazes at a certain position in the augmented reality scene exceeds a time threshold, determining the position at which the user gazes as the user's gaze point in the augmented reality scene. For example, the time threshold may be 1 second.
In this embodiment, if the time for which the user gazes at a certain position in the augmented reality scene exceeds the time threshold, it is very likely that the user is interested in the object at that position.
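The dwell-time rule can be sketched as follows. This is only an illustrative sketch: the gaze-sample format, the dwell radius, and all names are assumptions for exposition, not part of the disclosure's implementation.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float          # gaze coordinate in scene space
    y: float
    timestamp: float  # seconds

def detect_fixation(samples, time_threshold=1.0, radius=0.05):
    """Return the (x, y) gaze point if the user dwells within `radius`
    of one position for at least `time_threshold` seconds, else None."""
    if not samples:
        return None
    anchor = samples[0]
    for s in samples[1:]:
        # If the gaze drifts outside the dwell radius, restart the timer
        # from the current sample.
        if (s.x - anchor.x) ** 2 + (s.y - anchor.y) ** 2 > radius ** 2:
            anchor = s
        elif s.timestamp - anchor.timestamp >= time_threshold:
            return (anchor.x, anchor.y)
    return None
```

With a 1-second threshold, a steady stream of samples at one position yields a gaze point, while a drifting gaze yields none.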
In step S13, the target object that the user pays attention to in the augmented reality scene is determined according to the position of the gaze point.
As an example of this embodiment, if the coordinates of the gaze point fall within the coordinate range of an object in the augmented reality scene, the augmented reality device can determine that this object is the target object that the user pays attention to in the augmented reality scene.
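A minimal sketch of this coordinate-range check, assuming each scene object carries an axis-aligned bounding box (the data layout and names are illustrative assumptions):

```python
def find_target_object(gaze_point, objects):
    """Return the first object whose bounding box contains the gaze point.

    `objects` is a list of dicts with an axis-aligned bounding box,
    e.g. {"name": "cup", "bbox": (x_min, y_min, x_max, y_max)}.
    """
    gx, gy = gaze_point
    for obj in objects:
        x_min, y_min, x_max, y_max = obj["bbox"]
        if x_min <= gx <= x_max and y_min <= gy <= y_max:
            return obj
    return None
```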
In step S14, the recognition result corresponding to the target object is obtained.
As an example of this embodiment, the augmented reality device can crop the image of the target object from the augmented reality scene and send it to a server, so that the server recognizes the target object from the image, obtains the recognition result corresponding to the target object, and returns that recognition result to the augmented reality device. In this way, the augmented reality device can obtain the recognition result corresponding to the target object from the server.
For example, if the target object is a certain face, the recognition result corresponding to the target object may be the social account information of the person corresponding to that face.
In this embodiment, by obtaining the user's eye movement information, determining the user's gaze point in the augmented reality scene according to that information, determining the target object the user pays attention to in the augmented reality scene according to the position of the gaze point, and obtaining the recognition result corresponding to the target object, the recognition result for the object the user is paying attention to can be obtained quickly, improving the efficiency of real-time recognition.
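Steps S11 to S14 can be outlined as a single pipeline. Every stage here is a placeholder callable, since the disclosure leaves the concrete eye tracker, gaze-point logic, hit test, and recognizer open; the function names are assumptions for illustration.

```python
def recognize_gazed_object(get_eye_movement, find_gaze_point,
                           find_target, recognize):
    """Sketch of steps S11-S14: each argument stands in for one stage."""
    eye_info = get_eye_movement()            # S11: obtain eye movement info
    gaze_point = find_gaze_point(eye_info)   # S12: determine gaze point
    if gaze_point is None:
        return None                          # no stable fixation yet
    target = find_target(gaze_point)         # S13: determine target object
    if target is None:
        return None                          # gaze not on any object
    return recognize(target)                 # S14: obtain recognition result
```

For example, wiring in trivial stand-ins returns the recognizer's result for the gazed-at object.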
Fig. 2 is an exemplary flowchart of a recognition method based on augmented reality according to an exemplary embodiment. As shown in Fig. 2, the method can include steps S11 to S16.
In step S11, eye movement information of the user is obtained.
In step S12, the gaze point of the user in the augmented reality scene is determined according to the eye movement information.
In step S13, the target object that the user pays attention to in the augmented reality scene is determined according to the position of the gaze point.
In step S14, the recognition result corresponding to the target object is obtained.
In step S15, a feature object of the augmented reality scene is determined, where the feature object belongs to a specified category.
The specified category may be a default category of the augmented reality device, or a category set by the user; this is not limited here.
For example, the specified category may be faces.
In a possible implementation, a two-class classifier can be used to identify whether each object in the augmented reality scene belongs to the specified category, so as to determine the feature objects among the objects in the augmented reality scene.
In another possible implementation, a multi-class classifier can be used to identify the category of each object in the augmented reality scene, so as to determine the feature objects among them.
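Either classifier variant reduces to filtering the scene's objects by predicted category. In this sketch the classifier is an arbitrary callable returning a category label, standing in for a real two-class or multi-class model; the data layout is an assumption.

```python
def find_feature_objects(objects, classify, specified_categories):
    """Select the feature objects of a scene: run a classifier over every
    object and keep those whose predicted category is in the specified set.

    `classify` stands in for a two-class or multi-class classifier; here
    it is any callable mapping an object to a category label.
    """
    return [obj for obj in objects if classify(obj) in specified_categories]
```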
In step S16, the recognition result corresponding to the feature object is obtained.
In this example, after the recognition result corresponding to the target object is obtained, the recognition result corresponding to the feature object is obtained. In this way, after the target object that the user pays attention to is quickly recognized, the feature objects that may interest the user are also quickly recognized, so that the user can quickly obtain the recognition results corresponding to the feature objects that may interest them.
Fig. 3 is another exemplary flowchart of a recognition method based on augmented reality according to an exemplary embodiment. As shown in Fig. 3, the method can include steps S11 to S14 and step S17.
In step S11, eye movement information of the user is obtained.
In step S12, the gaze point of the user in the augmented reality scene is determined according to the eye movement information.
In step S13, the target object that the user pays attention to in the augmented reality scene is determined according to the position of the gaze point.
In step S14, the recognition result corresponding to the target object is obtained.
In step S17, the recognition results of objects in the augmented reality scene other than the target object are obtained.
In this example, after the recognition result corresponding to the target object is obtained, the recognition results of the other objects in the augmented reality scene are obtained. The target object that the user pays attention to is thus recognized first, and the other objects afterwards, so that the recognition result for the target object can be obtained quickly and the order of object recognition matches the degree of the user's attention.
Fig. 4 is another exemplary flowchart of a recognition method based on augmented reality according to an exemplary embodiment. As shown in Fig. 4, the method can include steps S11 to S16 and step S18.
In step S11, eye movement information of the user is obtained.
In step S12, the gaze point of the user in the augmented reality scene is determined according to the eye movement information.
In step S13, the target object that the user pays attention to in the augmented reality scene is determined according to the position of the gaze point.
In step S14, the recognition result corresponding to the target object is obtained.
In step S15, a feature object of the augmented reality scene is determined, where the feature object belongs to a specified category.
In step S16, the recognition result corresponding to the feature object is obtained.
In step S18, the recognition results of objects in the augmented reality scene other than the target object and the feature object are obtained.
In this example, the target object that the user pays attention to in the augmented reality scene is recognized first, then the feature objects in the augmented reality scene, and then the remaining objects in the augmented reality scene. In this way, the order of object recognition matches the degree of the user's attention, and the recognition results of the objects that interest the user are obtained quickly.
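The attention-matched recognition order described here can be sketched as a simple ordering function; the list-based representation of scene objects is an assumption for illustration, not the disclosure's implementation.

```python
def recognition_order(objects, target, feature_objects):
    """Order scene objects for recognition so the order matches the
    degree of the user's attention: the gazed-at target first, then
    the feature objects of the specified category, then everything else."""
    features = [o for o in feature_objects if o != target]
    rest = [o for o in objects if o != target and o not in features]
    return [target] + features + rest
```

For instance, with a gazed-at cup and a face as the only feature object, the cup is recognized first, the face second, and the rest afterwards.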
Fig. 5 is a block diagram of a recognition device based on augmented reality according to an exemplary embodiment. Referring to Fig. 5, the device includes a first acquisition module 51, a first determining module 52, a second determining module 53, and a second acquisition module 54.
The first acquisition module 51 is configured to obtain eye movement information of a user.
The first determining module 52 is configured to determine the gaze point of the user in an augmented reality scene according to the eye movement information.
The second determining module 53 is configured to determine, according to the position of the gaze point, the target object that the user pays attention to in the augmented reality scene.
The second acquisition module 54 is configured to obtain the recognition result corresponding to the target object.
In a possible implementation, the first determining module 52 is configured to, in a case where it is determined according to the eye movement information that the time for which the user gazes at a certain position in the augmented reality scene exceeds a time threshold, determine the position at which the user gazes as the user's gaze point in the augmented reality scene.
Fig. 6 is an exemplary block diagram of a recognition device based on augmented reality according to an exemplary embodiment. As shown in Fig. 6:
In a possible implementation, the device further includes a third determining module 55 and a third acquisition module 56.
The third determining module 55 is configured to determine a feature object of the augmented reality scene, where the feature object belongs to a specified category.
The third acquisition module 56 is configured to obtain the recognition result corresponding to the feature object.
In a possible implementation, the device further includes a fourth acquisition module 57.
The fourth acquisition module 57 is configured to obtain the recognition results of objects in the augmented reality scene other than the target object.
Regarding the device in the above embodiment, the specific manner in which each module performs its operations has been described in detail in the embodiments of the related method and will not be elaborated here.
In this embodiment, by obtaining the user's eye movement information, determining the user's gaze point in the augmented reality scene according to that information, determining the target object the user pays attention to in the augmented reality scene according to the position of the gaze point, and obtaining the recognition result corresponding to the target object, the recognition result for the object the user is paying attention to can be obtained quickly, improving the efficiency of real-time recognition.
Fig. 7 is a block diagram of a device 800 for recognition based on augmented reality according to an exemplary embodiment. For example, the device 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant, and so on.
Referring to Fig. 7, the device 800 can include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls the overall operation of the device 800, such as operations associated with display, telephone calls, data communication, camera operation, and recording. The processing component 802 can include one or more processors 820 to execute instructions to complete all or some of the steps of the above method. In addition, the processing component 802 can include one or more modules to facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operation on the device 800. Examples of such data include instructions for any application or method operated on the device 800, contact data, phone book data, messages, pictures, video, and so on. The memory 804 can be implemented by any type of volatile or non-volatile storage device, or a combination of them, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
The power component 806 provides power to the various components of the device 800. The power component 806 can include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and the user. In some embodiments, the screen can include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touch, sliding, and gestures on the touch panel. The touch sensors can not only sense the boundary of a touch or slide action, but also detect the duration and pressure related to the touch or slide operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. When the device 800 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or rear camera can receive external multimedia data. Each front camera and rear camera can be a fixed optical lens system or have focusing and optical zoom capabilities.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a microphone (MIC) configured to receive external audio signals when the device 800 is in an operation mode, such as a call mode, a recording mode, or a speech recognition mode. The received audio signal may be further stored in the memory 804 or sent via the communication component 816. In some embodiments, the audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and so on. These buttons may include, but are not limited to: a home button, volume buttons, a start button, and a lock button.
The sensor component 814 includes one or more sensors for providing state assessments of various aspects of the device 800. For example, the sensor component 814 can detect the open/closed state of the device 800 and the relative positioning of components (for example, the display and keypad of the device 800); the sensor component 814 can also detect a change in position of the device 800 or a component of the device 800, the presence or absence of contact between the user and the device 800, the orientation or acceleration/deceleration of the device 800, and a change in the temperature of the device 800. The sensor component 814 can include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 814 can also include an optical sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 814 can also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the device 800 and other devices. The device 800 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination of them. In an exemplary embodiment, the communication component 816 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 also includes a near-field communication (NFC) module to facilitate short-range communication. For example, the NFC module can be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 800 can be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above method.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 804 including instructions, which can be executed by the processor 820 of the device 800 to complete the above method. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and so on.
Those skilled in the art will readily think of other embodiments of the present disclosure after considering the specification and practicing the invention disclosed here. The present application is intended to cover any variations, uses, or adaptations of the present disclosure that follow the general principles of the disclosure and include common knowledge or conventional techniques in the art not disclosed by the disclosure. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the disclosure indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structure described above and shown in the drawings, and that various modifications and changes can be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.
Claims (10)
- A kind of 1. recognition methods based on augmented reality, it is characterised in that including:Obtain the ocular movemeut information of user;Blinkpunkt of the user in augmented reality scene is determined according to the ocular movemeut information;The target object that the user pays close attention in the augmented reality scene is determined according to the position of the blinkpunkt;Obtain recognition result corresponding to the target object.
- 2. the recognition methods according to claim 1 based on augmented reality, it is characterised in that believed according to the ocular movemeut Breath determines blinkpunkt of the user in augmented reality scene, including:The time of a certain position that the user watched attentively in the augmented reality scene is being determined according to the ocular movemeut information In the case of more than time threshold, the position that the user watches attentively is defined as the user watching attentively in augmented reality scene Point.
- 3. the recognition methods according to claim 1 based on augmented reality, it is characterised in that obtaining the target object After corresponding recognition result, methods described also includes:The feature object of the augmented reality scene is determined, wherein, the feature object belongs to specified classification;Obtain recognition result corresponding to the feature object.
- 4. the recognition methods according to claim 1 based on augmented reality, it is characterised in that obtain the target object pair After the recognition result answered, methods described also includes:Obtain the recognition result of other objects in the augmented reality scene in addition to the target object.
- A kind of 5. identification device based on augmented reality, it is characterised in that including:First acquisition module, for obtaining the ocular movemeut information of user;First determining module, for determining the user watching attentively in augmented reality scene according to the ocular movemeut information Point;Second determining module, for determining that the user pays close attention in the augmented reality scene according to the position of the blinkpunkt Target object;Second acquisition module, for obtaining recognition result corresponding to the target object.
- 6. The recognition device based on augmented reality according to claim 5, wherein the first determining module is configured to: in a case where it is determined, according to the eye movement information, that the time for which the user gazes at a certain position in the augmented reality scene exceeds a time threshold, determine the position gazed at by the user as the gaze point of the user in the augmented reality scene.
- 7. The recognition device based on augmented reality according to claim 5, wherein the device further comprises: a third determining module, configured to determine a characteristic object of the augmented reality scene, wherein the characteristic object belongs to a specified category; and a third obtaining module, configured to obtain a recognition result corresponding to the characteristic object.
- 8. The recognition device based on augmented reality according to claim 5, wherein the device further comprises: a fourth obtaining module, configured to obtain recognition results of objects in the augmented reality scene other than the target object.
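The module structure of device claims 5 to 8 can be sketched as a minimal pipeline. The eye tracker, the classifier, and the `SceneObject` bounding-box test below are placeholders supplied for illustration; the claims leave all of these internals unspecified.

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    """Placeholder scene object: an axis-aligned box with a label."""
    label: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, point):
        x, y = point
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

class RecognitionDevice:
    """Minimal sketch of the apparatus in claims 5-8. The eye tracker and
    the classifier are injected as callables; their concrete
    implementations are outside the claims and assumed here."""

    def __init__(self, eye_tracker, classifier, scene_objects):
        self.eye_tracker = eye_tracker      # first obtaining module
        self.classifier = classifier        # second obtaining module
        self.scene_objects = scene_objects  # objects exposing .contains(point)

    def gaze_point(self):
        # first determining module: eye movement information -> gaze point
        return self.eye_tracker()

    def target_object(self, point):
        # second determining module: gaze point -> attended object
        return next((o for o in self.scene_objects if o.contains(point)), None)

    def recognize(self):
        obj = self.target_object(self.gaze_point())
        return None if obj is None else self.classifier(obj)

    def recognize_others(self, target):
        # fourth obtaining module (claim 8): recognize every other object
        return [self.classifier(o) for o in self.scene_objects if o is not target]
```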
- 9. A recognition device based on augmented reality, characterized by comprising: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to perform the method of any one of claims 1 to 4.
- 10. A non-transitory computer-readable storage medium, wherein instructions in the storage medium, when executed by a processor, enable the processor to perform the method of any one of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710901518.6A CN107657235A (en) | 2017-09-28 | 2017-09-28 | Recognition methods and device based on augmented reality |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107657235A true CN107657235A (en) | 2018-02-02 |
Family
ID=61117139
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710901518.6A Pending CN107657235A (en) | 2017-09-28 | 2017-09-28 | Recognition methods and device based on augmented reality |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107657235A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101067716A (en) * | 2007-05-29 | 2007-11-07 | 南京航空航天大学 | Enhanced real natural interactive helmet with sight line follow-up function |
CN103752010A (en) * | 2013-12-18 | 2014-04-30 | 微软公司 | Reality coverage enhancing method used for control equipment |
JP2015060071A (en) * | 2013-09-18 | 2015-03-30 | コニカミノルタ株式会社 | Image display device, image display method, and image display program |
CN104898276A (en) * | 2014-12-26 | 2015-09-09 | 成都理想境界科技有限公司 | Head-mounted display device |
CN106095089A (en) * | 2016-06-06 | 2016-11-09 | 郑黎光 | A kind of method obtaining interesting target information |
CN107122039A (en) * | 2017-03-15 | 2017-09-01 | 苏州大学 | A kind of intelligent vision auxiliary strengthening system and its application method |
Worldwide Applications (1)

2017-09-28 | CN | CN201710901518.6A | Pending
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108592865A (en) * | 2018-04-28 | 2018-09-28 | 京东方科技集团股份有限公司 | Geometric measurement method and its device, AR equipment based on AR equipment |
WO2019206187A1 (en) * | 2018-04-28 | 2019-10-31 | 京东方科技集团股份有限公司 | Geometric measurement method and apparatus, augmented reality device, and storage medium |
US11385710B2 (en) | 2018-04-28 | 2022-07-12 | Boe Technology Group Co., Ltd. | Geometric parameter measurement method and device thereof, augmented reality device, and storage medium |
CN109727317A (en) * | 2019-01-07 | 2019-05-07 | 京东方科技集团股份有限公司 | Augmented reality system and control method |
US11402900B2 (en) | 2019-01-07 | 2022-08-02 | Beijing Boe Optoelectronics Technology Co., Ltd. | Augmented reality system comprising an aircraft and control method therefor |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104918107B (en) | The identification processing method and device of video file | |
CN106791893A (en) | Net cast method and device | |
CN108510597 | Editing method and device for virtual scene, and non-transitory computer-readable storage medium | |
CN107040646A (en) | Mobile terminal and its control method | |
CN106804000A (en) | Direct playing and playback method and device | |
CN107832036A (en) | Sound control method, device and computer-readable recording medium | |
CN107832741A (en) | The method, apparatus and computer-readable recording medium of facial modeling | |
CN104301610 | Image capture control method and device | |
CN106993229A (en) | Interactive attribute methods of exhibiting and device | |
CN106682736A (en) | Image identification method and apparatus | |
CN106778773A (en) | The localization method and device of object in picture | |
CN104809744B (en) | Image processing method and device | |
CN106778531A (en) | Face detection method and device | |
CN106980840A (en) | Shape of face matching process, device and storage medium | |
CN107563994A (en) | The conspicuousness detection method and device of image | |
CN107515669A (en) | Display methods and device | |
CN107832746A (en) | Expression recognition method and device | |
CN108108671A (en) | Description of product information acquisition method and device | |
CN108280434A (en) | The method and apparatus for showing information | |
CN107330391A (en) | Product information reminding method and device | |
CN107544802A (en) | device identification method and device | |
CN108319363A (en) | Product introduction method, apparatus based on VR and electronic equipment | |
CN107704190A (en) | Gesture identification method, device, terminal and storage medium | |
CN104853223B (en) | The inserting method and terminal device of video flowing | |
CN107132769A (en) | Smart machine control method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | PB01 | Publication | |
 | SE01 | Entry into force of request for substantive examination | |
 | RJ01 | Rejection of invention patent application after publication | Application publication date: 20180202 |