CN105651300A - Information obtaining method and electronic facility - Google Patents
- Publication number: CN105651300A
- Application number: CN201510870602.7A
- Authority
- CN
- China
- Prior art keywords
- user
- target object
- electronics
- information
- coordinate information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3635—Guidance using 3D or perspective road maps
Abstract
The invention provides an information obtaining method and an electronic device. After a user puts on the electronic device and a target object to be viewed is determined, the distance between the user and the target object is obtained, and the coordinate information of the target object is calculated from that distance and the coordinate information of the user's current position. Live map information corresponding to the coordinate information of the target object is then obtained over a wireless network and presented, so that the user can accurately learn the live map information of the target object. A guiding route is thereby provided to the user, the user is given an immersive feeling, and the user experience is greatly improved, which enhances the market competitiveness of the electronic device.
Description
Technical field
The present invention relates to the field of navigation, and more specifically to an information obtaining method and an electronic device.
Background art
With the rapid development of Internet technology and the gradual enhancement of electronic-device functionality, the electronic devices of most users, such as mobile phones, are equipped with navigation applications such as Baidu Maps, allowing users to obtain traffic information promptly, quickly, and accurately, which is very practical.
In the prior art, however, especially while driving, the navigation application on the electronic device usually only outputs a route from the current position to the destination and plays voice prompts; it cannot let the user learn the concrete live-scene information along that route, which degrades the user experience.
Summary of the invention
In view of this, the present invention provides an information obtaining method and an electronic device. After a user puts on the electronic device, live map information of a selected target object can be obtained in real time and presented, giving the user an immersive feeling and greatly improving the user experience.
To achieve the above purpose, the present application provides the following technical solutions:
An information obtaining method, applied to an electronic device, the method comprising:
obtaining a target distance between a user wearing the electronic device and a target object the user is viewing;
calculating coordinate information of the target object by using the coordinate information of the user's current position and the target distance;
obtaining, over a wireless network, live map information corresponding to the coordinate information of the target object, and presenting it.
Preferably, the method further comprises:
determining, according to a preset rule, the target object the user is viewing while wearing the electronic device.
Preferably, determining, according to the preset rule, the target object the user is viewing while wearing the electronic device is specifically:
monitoring eyeball motion information of the user wearing the electronic device, and determining, based on the eyeball motion information, that the object the user is currently gazing at is the target object;
when it is determined, based on the eyeball motion information, that the object the user is currently gazing at has changed, taking the gazed-at object after the change as the new target object.
Preferably, determining, according to the preset rule, the target object the user is viewing while wearing the electronic device is specifically:
obtaining a current focus object of the electronic device, and taking the current focus object as the target object;
when it is detected that the current focus object of the electronic device has changed, taking the focus object after the change as the new target object.
Preferably, calculating the coordinate information of the target object by using the coordinate information of the user's current position and the target distance comprises:
obtaining a current direction of the target object relative to the user;
performing a spatial computation on the obtained coordinate information of the user's current position according to the current direction and the target distance, to obtain the coordinate information of the target object.
Preferably, obtaining the current direction of the target object relative to the user is specifically:
detecting preset indication information, and determining the current direction of the target object relative to the user according to the indication information; or
detecting shadow azimuth information of a reference object within a preset range of the target object, and determining the current direction of the target object relative to the user by using the shadow azimuth information of the reference object.
An electronic device, the electronic device comprising:
a fixing device, configured to maintain the relative position between the electronic device and a body part of the user wearing the electronic device;
a positioning device, configured to obtain coordinate information of the user's current position;
a processing device, configured to obtain a target distance between the user and a target object, and to calculate coordinate information of the target object by using the coordinate information of the user's current position and the target distance;
a communication and presentation device, configured to obtain, over a wireless network, live map information corresponding to the coordinate information of the target object, and to present it.
Preferably, the processing device is further configured to determine, according to a preset rule, the target object the user is viewing while wearing the electronic device.
Preferably, the processing device further comprises:
a monitoring module, configured to monitor eyeball motion information of the user wearing the electronic device;
a first processor, configured to determine, based on the eyeball motion information, that the object the user is currently gazing at is the target object, and, when it is determined based on the eyeball motion information that the object the user is currently gazing at has changed, to take the gazed-at object after the change as the new target object.
Preferably, the processing device is specifically configured to obtain a current direction of the target object relative to the user, and to perform a spatial computation on the coordinate information of the user's current position according to the current direction and the target distance, to obtain the coordinate information of the target object.
It can thus be seen that, compared with the prior art, the present application provides an information obtaining method and an electronic device. After the user puts on the electronic device and a target object to be viewed is determined, the target distance between the user and the target object is obtained, the coordinate information of the target object is calculated by using the coordinate information of the user's current position and the target distance, and live map information corresponding to the coordinate information of the target object is then obtained over a wireless network and presented. The user can thereby learn the live map information of the target object promptly and accurately, the user's travel route is guided, and the user is given an immersive feeling, which greatly improves the user experience.
Brief description of the drawings
To explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only embodiments of the present invention, and those of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of an embodiment of an information obtaining method provided by the present application;
Fig. 2 is a schematic flowchart of a preferred embodiment of an information obtaining method provided by the present application;
Fig. 3 is a schematic structural diagram of an embodiment of an electronic device provided by the present application;
Fig. 4 is a schematic structural diagram of a preferred embodiment of an electronic device provided by the present application.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
The present application provides an information obtaining method and an electronic device. After the user puts on the electronic device and determines a target object to be viewed, the target distance between the user and the target object is obtained, the coordinate information of the target object is calculated by using the coordinate information of the user's current position and the target distance, and live map information corresponding to the coordinate information of the target object is then obtained over a wireless network and presented, so that the user can learn the live map information of the target object promptly and accurately, the user's travel route is guided, and the user is given an immersive feeling, greatly improving the user experience.
To make the above purpose, features, and advantages of the present invention more apparent, the present invention is described in further detail below in conjunction with the drawings and specific embodiments.
As shown in Fig. 1, which is a schematic flowchart of an embodiment of an information obtaining method provided by the present invention, the method may be applied to an electronic device such as smart glasses, which is not limited by the present application. In practice, the method provided by this embodiment may comprise the following steps:
Step S110: obtain a target distance between a user wearing the electronic device and a target object the user is viewing.
In this embodiment, the electronic device may be smart glasses, so that the user can wear the smart glasses directly while driving and navigate without holding a device, which is very convenient; however, the electronic device is not limited to smart glasses.
The smart glasses, also called smart eyewear, are a class of glasses that, like a smartphone, have an independent operating system, can run various application programs installed by the user from software vendors, can be controlled by voice or gestures to add schedule items, navigate with maps, interact with friends, take photos and videos, and start video calls with friends, and can access the wireless network through a mobile communication network.
In practice, map information may be pre-stored in the electronic device and updated over the wireless network; of course, map applications such as Baidu Maps may also be installed directly, which is not limited by the present application. In this way, after the user puts on the electronic device, the map information can be presented to the user so that the user can search it for a destination as the target object.
Of course, the user may also wear the electronic device while driving and view the surroundings on the way, so as to learn the live-scene information of the gazed-at target object in time. In this case, the target object may be the object the user is paying attention to, i.e. currently gazing at, while driving; it may also be the object that the electronic device automatically focuses on and locks while the user wears it, which is not limited by the present application.
Based on the above analysis, in a practical application in which the user wears the electronic device, before step S110 is performed, the target object the user is viewing may be determined according to a preset rule, specifically in the following ways, though not limited to them:
Way one: monitor eyeball motion information of the user wearing the electronic device, and determine, based on the eyeball motion information, that the object the user is currently gazing at is the target object.
This embodiment may apply eye-tracking technology in the electronic device. In this way, after the user puts on the electronic device, the determined target object is adjusted as the object the user's eyes attend to or gaze at changes, so as to obtain the live map information of the physical location of the new target object. That is, as the gazed-at object of the user wearing the electronic device changes, the live map information presented to the user changes accordingly, enabling the user to grasp the needed live map information promptly and accurately and to decide whether to change the travel route, which greatly improves the user experience.
Therefore, on the basis of the above embodiment, the method may further comprise: when it is determined, based on the obtained eyeball motion information, that the object the user wearing the electronic device is currently gazing at has changed, taking the gazed-at object after the change as the new target object, and performing the above step S110 to obtain the live map information of the new target object.
The eyeball motion information may include changes of the user's pupils, rotation of the eyeballs, and so on. It should be noted that, for how to determine the object the user is currently gazing at from the eyeball motion information, reference may be made to existing eye-tracking technology, which this embodiment does not describe in detail.
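The gaze-change rule described above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: the function name, the idea of sampling which object the gaze ray hits, and the dwell-based debouncing rule are all assumptions.

```python
def update_target(current_target, gaze_samples, dwell_threshold=3):
    """Return the new target object id, or the current one if the gaze
    has not dwelt long enough on a different object.

    gaze_samples: recent object ids hit by the gaze ray, newest last
    (an assumed representation of the eyeball motion information).
    dwell_threshold: consecutive samples required to confirm a switch
    (an assumed debouncing rule, not specified by the patent).
    """
    if len(gaze_samples) < dwell_threshold:
        return current_target
    recent = gaze_samples[-dwell_threshold:]
    candidate = recent[0]
    if candidate != current_target and all(s == candidate for s in recent):
        # Gaze target changed: the caller should rerun step S110
        # to obtain the target distance of the new object.
        return candidate
    return current_target
```

When this function returns a different object id, the method would re-acquire the target distance and refresh the presented live map information.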
Way two: obtain the current focus object of the electronic device worn by the user, and take the current focus object as the target object.
In this embodiment, the electronic device may have the shooting function of an existing dual-camera setup. Taking smart glasses as the electronic device, after the user puts on and starts the smart glasses, the shooting device corresponding to each lens of the smart glasses focuses automatically, thereby locking the current focus object as the target object. For the concrete focusing process, reference may be made to the shooting principle of existing dual-camera devices, which this embodiment does not describe in detail.
In addition, when the user wearing the smart glasses turns, the current focus object changes accordingly; therefore, when it is detected that the current focus object of the electronic device has changed, the focus object after the change can be taken as the new target object and the above step S110 performed, so that the live map information of the new target object is presented to the user.
On the basis of the above embodiments, still taking smart glasses as the electronic device, the distances from the two lenses of the smart glasses to the target object and the distance between the two lenses can be used to calculate the actual distance between the user and the target object, i.e. the target distance. With the three side lengths of a triangle known, the corresponding height of the triangle can be computed, and the target distance can be obtained by using existing triangle formulas, which is not limited by the present application.
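The triangle computation above — three known side lengths, target distance as the height over the inter-lens baseline — can be sketched with Heron's formula. This is an illustrative reconstruction under the patent's stated geometry; the function name and metric units are assumptions.

```python
import math

def target_distance(d_left, d_right, baseline):
    """Distance from the baseline (the line through the two lenses)
    to the target, computed as the triangle height over that baseline.

    d_left, d_right: distances from each lens to the target (metres).
    baseline: distance between the two lenses (metres).
    """
    # Heron's formula gives the triangle area from its three sides.
    s = (d_left + d_right + baseline) / 2.0
    area = math.sqrt(s * (s - d_left) * (s - d_right) * (s - baseline))
    # The height over the baseline follows from area = baseline * height / 2.
    return 2.0 * area / baseline
```

For example, a 3-4-5 right triangle with the 5-unit side as the baseline yields a height of 2.4 units.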
It should also be noted that the focusing process of the above electronic device may instead adopt the focusing principle of an existing single-camera device, with the lenses of the smart glasses serving as a screen to display the focused preview image. The user can then verify from the presented image whether the current focus object is the object the user wishes to view; if not, the user can turn the head to refocus until the desired object is in focus. Of course, voice information of the user containing features of the object to be viewed may also be collected, extracted, and analyzed to generate a corresponding instruction for automatic focusing, so as to meet the user's needs, which is not limited by the present application.
Step S120: calculate the coordinate information of the target object by using the coordinate information of the user's current position and the target distance.
In this embodiment, the electronic device may comprise a positioning device, such as a GPS (Global Positioning System) receiver. After the user puts on and starts the electronic device, the positioning device can be used to obtain the coordinate information of the user's current position in real time. The coordinate information of the target object can then be obtained by adding the target distance to, or subtracting it from, the coordinate values of that coordinate information.
Optionally, in practice, the current direction of the target object relative to the user may be obtained first, so as to determine, for each coordinate value of the target object's coordinate information, whether it is obtained by adding the target distance to, or subtracting it from, the corresponding coordinate value of the user's current position. That is, a spatial computation is performed on the coordinate information of the user's current position according to the current direction and the target distance, yielding the coordinate information of the target object.
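As a minimal sketch of the spatial computation in step S120, assuming planar local coordinates and a north-referenced clockwise bearing (both assumptions; the patent does not fix a coordinate convention):

```python
import math

def target_coordinates(user_xy, bearing_deg, distance):
    """Offset the user's planar coordinates by `distance` along
    `bearing_deg` (0 deg = +y/north, 90 deg = +x/east; an assumed
    convention, not one stated in the patent)."""
    x, y = user_xy
    theta = math.radians(bearing_deg)
    return (x + distance * math.sin(theta),
            y + distance * math.cos(theta))
```

With geodetic (latitude/longitude) coordinates, a geodesic direct computation would replace this planar offset, but the sign-per-axis idea is the same.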
Regarding the acquisition of the above current direction, preset indication information may be detected, and the current direction of the target object relative to the user determined according to it. The indication information may be obtained by analyzing collected voice information of the user, or by detecting an operation on a preset button, which is not limited by the present application.
In addition, the present application may also detect the shadow azimuth information of a reference object within a preset range of the target object, and use the shadow azimuth information of the reference object to determine the current direction of the target object relative to the user.
In a practical application of this embodiment, when the sun is in different orientations relative to the reference object, the direction of the sunlight incident on the reference object differs, and so does the shadow azimuth of the reference object. The electronic device can therefore capture an image of the reference object and its shadow, derive the shadow azimuth information of the reference object with an image-analysis algorithm, obtain the incident direction of the sunlight from it, and thereby determine the current direction of the target object relative to the user. Afterwards, the spatial computation described above can be performed on the coordinate information of the user's current position to obtain the coordinate information of the target object.
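The geometric core of the shadow-based cue is that a shadow points directly away from the sun. The patent does not detail how the sun direction is then mapped to the user-to-target direction, so the sketch below only shows that first step, under the assumption that azimuths are measured clockwise from north:

```python
def sun_azimuth_from_shadow(shadow_azimuth_deg):
    """A shadow points directly away from the sun, so the sun's azimuth
    is the shadow azimuth rotated by 180 degrees (azimuths assumed
    clockwise from north, in degrees)."""
    return (shadow_azimuth_deg + 180.0) % 360.0
```

Combined with the time of day and the device's position, the recovered sun azimuth could serve as an absolute direction reference for the image-analysis step described above.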
It should be noted that the manner of calculating the coordinate information of the target object is not limited to the above; any manner that those skilled in the art can determine without creative effort falls within the protection scope of the present application.
Step S130: obtain, over the wireless network, live map information corresponding to the coordinate information of the target object, and present it.
After the coordinate information of the target object, i.e. its actual position information, is determined, the coordinate information of the target object can be sent over the wireless network to another electronic device, such as a mobile phone or a cloud server. That device can then locate the point according to the coordinate information of the target object, obtain the live map information of the located point, and feed it back over the wireless network to the electronic device, which presents the received live map information to the user to give the user an immersive feeling. It should be noted that the manner of obtaining the live map information of the target object is not limited to the above.
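The coordinate exchange in step S130 could be serialized as below. The patent only says the exchange happens over a wireless network, so the JSON field names and response shape here are entirely hypothetical:

```python
import json

def build_map_request(coords):
    """Serialize the target's coordinates into a request body for the
    map server. Field names are hypothetical; the patent defines no
    wire format."""
    return json.dumps({"lat": coords[0], "lon": coords[1]})

def parse_map_response(raw):
    """Extract the live map payload from a (hypothetical) JSON reply
    sent back by the mobile phone or cloud server."""
    reply = json.loads(raw)
    return reply.get("live_map")
```

The transport itself (mobile network, WIFI, Bluetooth) is orthogonal to this encoding, which matches the patent's point that the obtaining manner is not limited.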
In summary, in this embodiment, after the user puts on the electronic device and determines the target object to be viewed, the target distance between the user and the target object is obtained, the coordinate information of the target object is calculated by using the coordinate information of the user's current position and the target distance, and live map information corresponding to the coordinate information of the target object is then obtained over the wireless network and presented, so that the user can learn the live map information of the target object promptly and accurately, the user's travel route is guided, and the user is given an immersive feeling, which greatly improves the user experience.
As shown in Fig. 2, which is a schematic flowchart of a preferred embodiment of an information obtaining method provided by the present application, the method may be applied to an electronic device such as smart glasses; the method provided by this embodiment may comprise the following steps:
Step S201: monitor eyeball motion information of the user wearing the electronic device, and determine, based on the eyeball motion information, that the object the user is currently gazing at is the target object.
In a practical application of this embodiment, eye-tracking technology may be used to detect the target object in real time; for the detailed process, reference may be made to the description of the corresponding part of the above embodiment, which is not repeated here.
Step S202: calculate the target distance between the user and the target object, and determine the current direction of the target object relative to the user.
In a practical application of this embodiment, if the electronic device is smart glasses, the working principle of a dual-camera device can be used to calculate the target distance between the user and the target object, though it is not limited thereto. In addition, the current direction of the target object relative to the user can be determined in the manner described above, which is not repeated here.
Step S203: obtain the coordinate information of the user's current position.
The coordinate information of the user's position can be monitored in real time by the positioning device installed in the electronic device.
Step S204: perform a spatial computation on the coordinate information of the user's current position according to the current direction and the target distance, to obtain the coordinate information of the target object.
Step S205: obtain, over the wireless network, live map information corresponding to the coordinate information of the target object, and present it.
It can be seen that, by obtaining in real time the coordinate information of the currently determined target object, this embodiment obtains the corresponding live map information and presents it to the user, which not only guides the user's travel but also gives the user an immersive feeling, greatly improving the user experience.
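The S201–S205 flow can be sketched end to end as a loop that triggers a new map lookup whenever the gazed-at object changes. Everything here is an illustrative stand-in for the techniques described above: the sample representation, the planar-coordinate and bearing conventions, and the `map_service` callable are all assumptions.

```python
import math

def live_map_loop(samples, map_service):
    """Replay a sequence of (gaze_object, user_xy, bearing_deg, distance)
    samples and return the live-map lookups triggered. A lookup happens
    when the gazed-at object changes (S201); the current position,
    direction, and distance (S202-S203) locate the target (S204), and
    `map_service` stands in for the wireless-network query (S205)."""
    lookups = []
    current = None
    for obj, (ux, uy), bearing_deg, dist in samples:
        if obj == current:
            continue  # gaze unchanged: keep presenting the same map
        current = obj
        theta = math.radians(bearing_deg)  # 0 deg = +y, 90 deg = +x (assumed)
        target = (ux + dist * math.sin(theta), uy + dist * math.cos(theta))
        lookups.append(map_service(target))
    return lookups
```

A real device would of course run this continuously on live sensor input rather than on a recorded sample list.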
As shown in Fig. 3, which is a schematic structural diagram of an embodiment of an electronic device provided by the present application, the electronic device may comprise:
A fixing device 310, configured to maintain the relative position between the electronic device and a body part of the user wearing the electronic device.
Optionally, in a practical application of this embodiment, the electronic device may be smart glasses. In this case, the fixing device 310 may be fixing brackets that keep the electronic device fixed against the outer auricles of the user's two ears, for example the temples of ordinary glasses; the present application does not limit the concrete structure of the brackets as long as this purpose is achieved. Of course, as shown in Fig. 4, the fixing device 310 may also maintain the relative position between the electronic device and the user's head, and the concrete structure of the fixing device 310 is not limited by the present application.
A positioning device 320, configured to obtain the coordinate information of the user's current position.
The positioning device 320 may be a GPS receiver, but is not limited thereto.
A processing device 330, configured to obtain the target distance between the user and the target object, and to calculate the coordinate information of the target object by using the coordinate information of the user's current position and the target distance.
Optionally, in practice, the processing device 330 may also be configured to determine, according to a preset rule, the target object the user is viewing while wearing the electronic device, and, when it detects that the viewed target object has changed, to take the target object after the change as the new target object and recalculate the target distance between the user and the new target object.
On this basis, the processing device 330 may comprise:
A monitoring module, configured to monitor eyeball motion information of the user wearing the electronic device.
A first processor, configured to determine, based on the eyeball motion information, that the object the user is currently gazing at is the target object, and, when it is determined based on the eyeball motion information that the gazed-at object has changed, to take the gazed-at object after the change as the new target object.
In addition, in conjunction with the description of the corresponding part of the above method embodiment, the monitoring module may also be used to monitor changes of the current focus object of the electronic device, thereby monitoring target-object changes. Concretely, the monitoring module monitors the current focus object of the electronic device and sends it, as the target object, to a second processor of the processing device 330, so that the second processor redetermines the target object based on changes of the current focus object, i.e. takes the focus object after a change as the new target object, and calculates the target distance between the new target object and the user.
Afterwards, the processing device can obtain the current direction of the target object relative to the user, and perform a spatial computation on the coordinate information of the user's current position according to the current direction and the target distance, to obtain the coordinate information of the target object.
A communication and presentation device 340, configured to obtain, over the wireless network, live map information corresponding to the coordinate information of the target object, and to present it.
In this embodiment, the communication and presentation device 340 may comprise a communication module and a presentation module. The communication module can communicate with other electronic devices to obtain the live map information corresponding to the coordinate information of the target object, and the presentation module presents the live map information before the user's eyes in real time, to give the user an immersive feeling.
Optionally, the communication module may be a mobile communication module, a WIFI module, a Bluetooth module, etc., and the other electronic devices may be a mobile phone, a cloud server, a computer, etc. that can establish a wireless connection with the electronic device, which is not concretely limited by the present application.
In summary, in this embodiment, after the user puts on the electronic device and determines the target object to be viewed, the target distance between the user and the target object is obtained, the coordinate information of the target object is calculated by using the coordinate information of the user's current position and the target distance, and live map information corresponding to the coordinate information of the target object is then obtained over the wireless network and presented, so that the user can learn the live map information of the target object promptly and accurately, the user's travel route is guided, and the user is given an immersive feeling, which greatly improves the user experience and thereby enhances the market competitiveness of the electronic device.
Finally, it should be noted that in the above embodiments, relational terms such as first and second are only used to distinguish one operation or unit from another, and do not necessarily require or imply any such actual relation or order between these units or operations.
The embodiments in this specification are described in a progressive manner, each embodiment focusing on its differences from the others; for identical or similar parts, the embodiments may be referred to one another. Since the electronic device disclosed in the embodiments corresponds to the method disclosed in the embodiments, its description is relatively simple, and for relevant details reference may be made to the description of the method.
The above description of the disclosed embodiments enables those skilled in the art to implement or use the present invention. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (10)
1. An information obtaining method, characterized in that the method is applied to an electronic device and comprises:
obtaining a target distance between a user wearing the electronic device and a target object viewed by the user;
calculating coordinate information of the target object by using coordinate information of the user's current position and the target distance; and
obtaining, through a wireless network, live map information corresponding to the coordinate information of the target object and presenting the live map information.
2. The method according to claim 1, characterized in that the method further comprises:
determining, according to a preset rule, the target object viewed by the user while wearing the electronic device.
3. The method according to claim 2, characterized in that determining, according to the preset rule, the target object viewed by the user while wearing the electronic device specifically comprises:
monitoring eyeball movement information of the user wearing the electronic device, and determining, based on the eyeball movement information, that the object currently gazed at by the user is the target object; and
when it is determined, based on the eyeball movement information, that the object currently gazed at by the user has changed, taking the gazed object after the change as a new target object.
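The gaze rule of claim 3 — keep the currently gazed object as the target and switch when the gaze moves to another object — can be sketched as a small state holder. The class name and the dwell-sample debounce below are illustrative assumptions (the patent does not specify how a gaze change is confirmed):

```python
class GazeTargetTracker:
    """Tracks the target object from a stream of gazed-object samples.
    The target switches only after the gaze rests on a new object for
    `dwell_samples` consecutive samples (an illustrative debounce, not
    specified by the patent)."""

    def __init__(self, dwell_samples=3):
        self.dwell_samples = dwell_samples
        self.target = None        # current target object
        self._candidate = None    # object the gaze may be moving to
        self._count = 0           # consecutive samples on the candidate

    def update(self, gazed_object):
        if gazed_object == self.target:
            # Gaze is back on the current target: discard any candidate.
            self._candidate, self._count = None, 0
        elif gazed_object == self._candidate:
            self._count += 1
            if self._count >= self.dwell_samples:
                # Gaze change confirmed: the new object becomes the target.
                self.target = gazed_object
                self._candidate, self._count = None, 0
        else:
            # Gaze moved to a new object: start counting dwell samples.
            self._candidate, self._count = gazed_object, 1
        return self.target

# Illustrative use with a stream of gazed-object samples.
tracker = GazeTargetTracker(dwell_samples=2)
for obj in ["tower", "tower", "bridge", "bridge", "bridge"]:
    current = tracker.update(obj)
```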
4. The method according to claim 2, characterized in that determining, according to the preset rule, the target object viewed by the user while wearing the electronic device specifically comprises:
obtaining a current focal object of the electronic device and taking the current focal object as the target object; and
when it is detected that the current focal object of the electronic device has changed, taking the focal object after the change as a new target object.
5. The method according to any one of claims 1 to 4, characterized in that calculating the coordinate information of the target object by using the coordinate information of the user's current position and the target distance comprises:
obtaining a current direction of the target object relative to the user; and
performing a spatial computation on the obtained coordinate information of the user's current position according to the current direction and the target distance, to obtain the coordinate information of the target object.
6. The method according to claim 5, characterized in that obtaining the current direction of the target object relative to the user specifically comprises:
detecting preset pointing information, and determining the current direction of the target object relative to the user according to the pointing information; or
detecting shadow azimuth information of a reference object within a preset range of the target object, and determining the current direction of the target object relative to the user by using the shadow azimuth information of the reference object.
7. An electronic device, characterized in that the electronic device comprises:
a fixing device, configured to maintain a relative position between the electronic device and a body part of the user wearing the electronic device;
a positioning device, configured to obtain coordinate information of the user's current position;
a processing device, configured to obtain a target distance between the user and a target object, and to calculate coordinate information of the target object by using the coordinate information of the user's current position and the target distance; and
a communication and presentation device, configured to obtain, through a wireless network, live map information corresponding to the coordinate information of the target object and to present the live map information.
8. The electronic device according to claim 7, characterized in that the processing device is further configured to determine, according to a preset rule, the target object viewed by the user while wearing the electronic device.
9. The electronic device according to claim 8, characterized in that the processing device comprises:
a monitoring module, configured to monitor eyeball movement information of the user wearing the electronic device; and
a first processor, configured to determine, based on the eyeball movement information, that the object currently gazed at by the user is the target object, and, when it is determined based on the eyeball movement information that the object currently gazed at by the user has changed, to take the gazed object after the change as a new target object.
10. The electronic device according to claim 7, characterized in that the processing device is specifically configured to obtain a current direction of the target object relative to the user, and to perform a spatial computation on the coordinate information of the user's current position according to the current direction and the target distance, to obtain the coordinate information of the target object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510870602.7A CN105651300A (en) | 2015-11-30 | 2015-11-30 | Information obtaining method and electronic facility |
Publications (1)
Publication Number | Publication Date |
---|---|
CN105651300A true CN105651300A (en) | 2016-06-08 |
Family
ID=56481922
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510870602.7A Pending CN105651300A (en) | 2015-11-30 | 2015-11-30 | Information obtaining method and electronic facility |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105651300A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109725726A (en) * | 2018-12-29 | 2019-05-07 | 上海掌门科技有限公司 | A kind of querying method and device |
CN110276251A (en) * | 2019-05-13 | 2019-09-24 | 联想(上海)信息技术有限公司 | A kind of image-recognizing method, device, equipment and storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101807349A (en) * | 2010-01-08 | 2010-08-18 | 北京世纪高通科技有限公司 | Road condition distribution system and method based on Web |
CN102141869A (en) * | 2010-01-29 | 2011-08-03 | 联想(北京)有限公司 | Information identification and prompting method and mobile terminal |
CN103105993A (en) * | 2013-01-25 | 2013-05-15 | 腾讯科技(深圳)有限公司 | Method and system for realizing interaction based on augmented reality technology |
CN103335657A (en) * | 2013-05-30 | 2013-10-02 | 佛山电视台南海分台 | Method and system for strengthening navigation performance based on image capture and recognition technology |
CN103604412A (en) * | 2013-10-30 | 2014-02-26 | 北京智谷睿拓技术服务有限公司 | Positioning method and positioning device |
CN104596523A (en) * | 2014-06-05 | 2015-05-06 | 腾讯科技(深圳)有限公司 | Streetscape destination guide method and streetscape destination guide equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10670421B2 (en) | Apparatus, system, and method of information sharing, and recording medium | |
KR101515484B1 (en) | Augmented Reality Information Providing Apparatus and the Method | |
US8963956B2 (en) | Location based skins for mixed reality displays | |
JP5675470B2 (en) | Image generation system, program, and information storage medium | |
CN107167138B (en) | A kind of library's intelligence Way guidance system and method | |
US20150212576A1 (en) | Radial selection by vestibulo-ocular reflex fixation | |
US10948994B2 (en) | Gesture control method for wearable system and wearable system | |
US10712167B2 (en) | Methods, systems, and devices for displaying maps | |
WO2016079557A1 (en) | Display system for remote control of working machine | |
JP2010123121A (en) | Method and apparatus for marking position of real world object in see-through display | |
WO2015051606A1 (en) | Locating method and locating system | |
CN105787884A (en) | Image processing method and electronic device | |
CN104702871A (en) | Unmanned plane projection displaying method, unmanned plane projection displaying system and unmanned plane projection displaying device | |
CN109975757A (en) | Indoor positioning air navigation aid, terminal and computer storage medium | |
CN103974047A (en) | Wearable projector and focusing method and projection method thereof | |
JP2017191490A (en) | Skill transmission system and method | |
JP2017120556A (en) | Head-mounted display for operation, control method of head-mounted display for operation, and program for head-mounted display for operation | |
CN104998376A (en) | Virtual reality technology-based scenery viewing system | |
EP3368962B1 (en) | Method and system for interaction using holographic display system | |
CN106814518A (en) | Auto-focusing camera system and electronic installation | |
KR101739768B1 (en) | Gaze tracking system at a distance using stereo camera and narrow angle camera | |
CN105651300A (en) | Information obtaining method and electronic facility | |
CN104501797B (en) | A kind of air navigation aid based on augmented reality IP maps | |
CN104484051B (en) | The real-time marketing command methods in interior and system based on wearable glasses sight | |
US10559132B2 (en) | Display apparatus, display system, and control method for display apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20160608 |