CN104750448A - Method for information processing, electronic equipment and wearable equipment - Google Patents

Method for information processing, electronic equipment and wearable equipment

Info

Publication number
CN104750448A
Authority
CN
China
Prior art keywords
image
user
electronic equipment
road information
way guidance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510129085.8A
Other languages
Chinese (zh)
Inventor
张守鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201510129085.8A priority Critical patent/CN104750448A/en
Publication of CN104750448A publication Critical patent/CN104750448A/en
Pending legal-status Critical Current


Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the invention discloses an information processing method applied to an electronic device provided with a projection unit and a first image-collecting unit. The method includes: obtaining road information within a preset range based on a scene image, within the preset range, collected by the first image-collecting unit, wherein the preset range is the region formed by extending a preset distance from the current position of the electronic device in the moving direction of the electronic device; generating a path-guidance image corresponding to the road information; determining a corresponding projection region on the projection plane based on the user's line of sight at the current moment; and projecting the path-guidance image onto the projection region through the projection unit. Embodiments of the invention further disclose the electronic device and a wearable device.

Description

Information processing method, electronic device, and wearable device
Technical field
The present invention relates to the field of information processing, and in particular to an information processing method, an electronic device, and a wearable device.
Background art
With the development of science and technology, electronic technology has advanced rapidly, the variety of electronic products keeps growing, and people enjoy the many conveniences this development brings. People can now lead a comfortable life through various types of electronic devices. For example, electronic devices such as smartphones and tablet computers have become an important part of daily life: users can use them to listen to music, play games, and so on, to relieve the pressure of a fast-paced modern lifestyle.
At present, when driving, cycling, or walking, a user has to look at a navigation interface frequently, or listen to navigation prompts, in order to decide which way to go next. As a result, the user may momentarily fail to observe the road conditions ahead of the vehicle and cause a traffic accident, or, because of a delay or because the vehicle approaches two closely spaced intersections, turn into the wrong intersection or bend.
In short, the prior art does not provide a safer navigation scheme.
Summary of the invention
In view of this, embodiments of the present invention are expected to provide an information processing method and an electronic device that offer a safer navigation approach, enabling the user to receive navigation information in real time while watching the road, reducing the probability of traffic accidents or wrong turns, and providing a good user experience.
To achieve the above object, the technical scheme of the present invention is realized as follows:
In a first aspect, an embodiment of the present invention provides an information processing method applied to an electronic device having a projection unit and a first image-collecting unit. The method comprises: obtaining road information within a preset range based on a real-scene image, within the preset range, collected by the first image-collecting unit, wherein the preset range is the region formed by extending a preset distance from the current position of the electronic device in the moving direction of the electronic device; generating a path-guidance image corresponding to the road information; determining, based on the user's line of sight at the current moment, a corresponding projection region on a projection plane; and projecting the path-guidance image onto the projection region through the projection unit.
In a second aspect, an embodiment of the present invention provides an electronic device, comprising: a first image-collecting unit, for collecting a real-scene image of a preset range, wherein the preset range is the region formed by extending a preset distance from the current position of the electronic device in the moving direction of the electronic device; a control module, for obtaining, based on the real-scene image, the road information within the preset range, generating a path-guidance image corresponding to the road information, and determining, based on the user's line of sight at the current moment, a corresponding projection region on a projection plane; and a projection unit, for projecting the path-guidance image onto the projection region.
In a third aspect, an embodiment of the present invention provides a wearable device, comprising: a projection supporting part; a support, worn on the user's head, for fixing the projection supporting part in front of the user's eyes and keeping the relative position of the projection supporting part and the user's head constant; a first image-collecting unit, arranged on the support, for collecting a real-scene image within a preset range, wherein the preset range is the region formed by extending a preset distance from the current position of the wearable device in the moving direction of the wearable device; a control module, for obtaining, based on the real-scene image, the road information within the preset range, generating a path-guidance image corresponding to the road information, and determining, based on the user's line of sight at the current moment, a corresponding projection region on the projection supporting part; and a projection unit, for projecting the path-guidance image onto the projection region.
In the information processing method provided by the embodiments of the present invention, the electronic device has a projection unit and a first image-collecting unit. Based on the real-scene image of the preset range collected by the first image-collecting unit, the electronic device obtains the road information within the preset range, where the preset range is the region formed by extending a preset distance from the current position of the electronic device in its direction of movement, i.e., a region within a certain range that the first image-collecting unit faces. The electronic device then generates a path-guidance image corresponding to the road information, for example a route indicator, a direction arrow, or a speed alarm. Next, based on the user's line of sight at the current moment, it determines a corresponding projection region on the projection plane, and finally projects the path-guidance image onto that region through the projection unit. In this way, the user can receive navigation information while driving and watching the road, without shifting the line of sight, which reduces the probability of traffic accidents or wrong turns and thus provides a safer navigation scheme and a good user experience.
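As a sketch of the overall flow, the four steps above can be expressed as one loop iteration. This is an illustrative outline only: the five callables stand in for the device's units and are hypothetical names, not part of the disclosure; only the order of operations (collect, analyze, generate, locate the gaze region, project) comes from the text.

```python
def navigation_step(capture_frame, get_road_info, make_guidance_image,
                    find_projection_region, project):
    """One iteration of the disclosed method: collect and analyze the
    scene (S301), generate the path-guidance image (S302), determine the
    projection region from the user's current line of sight (S303), and
    project the image onto that region (S304)."""
    frame = capture_frame()                    # first image-collecting unit
    road_info = get_road_info(frame)           # road info in the preset range
    guidance = make_guidance_image(road_info)  # path-guidance image
    region = find_projection_region()          # from the current gaze
    project(guidance, region)                  # projection unit
    return guidance, region
```

A caller would supply the real units here; the function itself only fixes the sequencing that the method mandates.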
Brief description of the drawings
Fig. 1 is a structural schematic diagram of the electronic device in Embodiments 1 and 3 of the present invention;
Fig. 2 is a schematic diagram of the preset range in Embodiments 1 and 2 of the present invention;
Fig. 3 is a schematic flowchart of the information processing method in Embodiment 1 of the present invention;
Fig. 4 is a schematic diagram of the path-guidance image in Embodiments 1 and 2 of the present invention;
Fig. 5 is a structural schematic diagram of the wearable device in Embodiments 2 and 4 of the present invention;
Fig. 6 is a schematic flowchart of the information processing method in Embodiment 2 of the present invention.
Detailed description of the embodiments
The technical schemes in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings.
Embodiment 1:
This embodiment provides an information processing method applied to an electronic device. The electronic device may be a navigator mounted at the front windshield of a vehicle, or a smartphone, tablet computer, micro projector, or the like placed above the vehicle's instrument panel.
Referring to Fig. 1, the electronic device comprises a first image-collecting unit 11, a control module 12, and a projection unit 13. The first image-collecting unit 11 collects the real-scene image of the preset range; the control module 12 obtains, based on the real-scene image, the road information within the preset range, generates the path-guidance image corresponding to the road information, and determines, based on the user's line of sight at the current moment, a corresponding projection region on the projection plane; the projection unit 13 projects the path-guidance image onto the projection region.
The preset range here refers to the region formed by extending a preset distance from the current position of the electronic device in its direction of movement; referring to Fig. 2, it is the space enclosed by the dotted lines. The preset region is determined by the size of the vehicle's front windshield and/or the image-capture field of the first image-collecting unit 11, so in practice its size depends on the application and is not specifically limited by the present invention.
In a specific implementation, the first image-collecting unit 11 may be an ordinary camera, a high-definition camera, a high-speed camera, or the like; the present invention is not specifically limited in this regard.
The information processing method provided by the embodiment of the present invention is described below in conjunction with the above electronic device.
Referring to Fig. 3, the method comprises:
S301: obtaining, based on the real-scene image of the preset range collected by the first image-collecting unit, the road information within the preset range;
Specifically, the user places the electronic device above the vehicle's instrument panel and turns it on. The first image-collecting unit then collects, in real time, the real-scene image of the preset range, that is, the road conditions the user can see through the front windshield. The first image-collecting unit sends the real-scene image to the control module, which analyzes the image to obtain the current road information, for example whether the road within the preset range is a through lane, whether there is a curve, whether there is a crossroad, and so on.
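The disclosure leaves the image-analysis algorithm itself open. Purely as an illustration of what the road information obtained in this step might look like, the following stub classifies a frame's detected features into the road types mentioned above; the feature names, thresholds, and the `extract_road_info` helper are all invented for the example and stand in for a real lane-detection step.

```python
from dataclasses import dataclass
from enum import Enum, auto

class RoadType(Enum):
    THROUGH_LANE = auto()
    CURVE = auto()
    CROSSROAD = auto()

@dataclass
class RoadInfo:
    road_type: RoadType
    lane_count: int

def extract_road_info(features):
    """Classify the road ahead from (hypothetical) features of the
    collected scene image, yielding the kind of road information S301
    is described as producing."""
    if features.get("branch_count", 1) >= 3:
        road_type = RoadType.CROSSROAD          # several branches: crossroad
    elif abs(features.get("curvature", 0.0)) > 0.1:
        road_type = RoadType.CURVE              # noticeable bend ahead
    else:
        road_type = RoadType.THROUGH_LANE       # straight through lane
    return RoadInfo(road_type, features.get("lane_count", 1))
```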
S302: generating the path-guidance image corresponding to the road information;
Specifically, after obtaining the road information, the control module can generate the path-guidance image according to it.
In a specific implementation, S302 may include, without limitation, the following two cases.
In the first case, map data information is obtained based on the current positioning information of the electronic device, and the path-guidance image is generated based on the map data information and the road information.
Specifically, the electronic device can locate itself through a built-in Global Positioning System (GPS) module or with the assistance of base stations, obtain the map data information relevant to the current position from a local or cloud map database, and then generate the path-guidance image by combining the map data information with the road information obtained in S301.
For example, the electronic device locates itself by GPS on Street A of City X, retrieves the map data information of Street A, City X from the map database, such as street length, surrounding buildings, traffic conditions, and traffic information, and then, according to the road information collected by the first image-collecting unit, namely that the preset region is a through lane, combined with the map information, generates a path-guidance image indicating that the user should keep straight for the next 200 meters, as shown in Fig. 4A. This image may include the guided path (the arrow in Fig. 4A) and the remaining straight-ahead distance, namely 200 meters.
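How map data and detected road information might be combined into the content of a Fig. 4A-style image can be sketched as follows. `build_guidance_label`, `map_db`, the record fields, and the label wording are all hypothetical, chosen only to mirror the "Street A, keep straight for 200 meters" example; a real device would render this content graphically rather than return text.

```python
def build_guidance_label(position, map_db, road_info):
    """Combine a positioning fix with map data and the detected road
    information into the textual content of a guidance image: an arrow
    choice plus a label with street name and remaining distance."""
    street = map_db[position]                  # map record for the current fix
    if road_info["type"] == "through_lane":
        return {"arrow": "straight",
                "text": f"Keep straight for {road_info['ahead_m']} m "
                        f"on {street['name']}"}
    # other detected road types would select a turn arrow instead
    return {"arrow": road_info["type"], "text": street["name"]}
```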
In practical applications, besides the road-surface information, the path-guidance image may also include information such as the names and images of nearby buildings, to remind the user of the surroundings. For example, suppose the user is going to the zoo. When the electronic device determines that it is still 200 meters from the zoo, it learns from the map data information that the zoo is on the left side of the road, and therefore displays the name of the zoo ("ZOO") on the left of the guided path in the path-guidance image, together with the distance "200 meters to the zoo", as shown in Fig. 4B.
Of course, the path-guidance image may take other forms; the present invention imposes no specific limitation.
In the second case, a guided path is generated according to the road information, and the guided path is superimposed on the real-scene image to generate the path-guidance image, so as to guide the user to move along the guided path.
Specifically, the electronic device can generate a guided path according to the road information obtained in S301, superimpose the path on the real-scene image, and generate the path-guidance image shown in Fig. 4C; after seeing the image, the user moves along the guided path.
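The superimposition in this second case can be illustrated with a toy model in which the real-scene image is a list of equal-length strings (a stand-in for pixel rows) and the guided path is a single straight-ahead arrow occupying one column; real systems would composite pixel buffers instead, and the function name is invented for the example.

```python
def overlay_guided_path(scene_rows, column):
    """Superimpose a straight-ahead guided path on a real-scene 'image',
    producing a Fig. 4C-style composite. The arrowhead sits on the top
    row and the shaft fills the chosen column; the input frame is left
    unmodified."""
    out = []
    for i, row in enumerate(scene_rows):
        cells = list(row)
        if 0 <= column < len(cells):
            cells[column] = "^" if i == 0 else "|"
        out.append("".join(cells))
    return out
```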
Of course, the path-guidance image may also take other forms; the present invention is not specifically limited in this regard.
S303: determining, based on the user's line of sight at the current moment, a corresponding projection region on the projection plane;
Specifically, after generating the path-guidance image, the electronic device determines a corresponding projection region on the projection plane, such as the vehicle's front windshield, according to the user's preset visual range. The projection region may be the whole projection plane or only part of it, as long as it lies within the user's visual range.
Further, the electronic device may also be provided with a second image-collecting unit whose collection region faces opposite to that of the first image-collecting unit, that is, whose collection direction faces the user. While the user is driving, the second image-collecting unit collects images of the user's eyes in real time and sends them to the control module. The control module analyzes the eye images to obtain the user's eye-movement data, such as the position of the user's eyes, and then, based on the eye-movement data, determines on the projection plane the projection region intersected by the user's line of sight.
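Given eye-movement data already mapped to a gaze point on the projection plane, choosing the projection region intersected by the line of sight might look like the following sketch. The centering-and-clamping policy and the function name are assumptions for illustration; the disclosure only requires that the region lie where the user's sight crosses the plane.

```python
def gaze_projection_region(gaze_xy, plane_wh, region_wh):
    """Center a projection region of size region_wh on the point where
    the user's line of sight meets the projection plane, clamping it so
    the whole region stays on the plane. Returns (x0, y0, x1, y1)."""
    gx, gy = gaze_xy
    pw, ph = plane_wh
    rw, rh = region_wh
    x0 = min(max(gx - rw // 2, 0), pw - rw)   # clamp horizontally
    y0 = min(max(gy - rh // 2, 0), ph - rh)   # clamp vertically
    return (x0, y0, x0 + rw, y0 + rh)
```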
S304: projecting the path-guidance image onto the projection region through the projection unit.
More preferably, in order not to hinder the user's view of the road through the front windshield, the path-guidance image is projected onto the projection region semi-transparently, with the transparency adjusted to a parameter value preset by the user.
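The semi-transparent projection can be modeled, per pixel, as a blend between the through-windshield view and the guidance image at the user-preset transparency. In the actual device this mixing happens optically; the sketch below only illustrates the effect of the preset parameter, and the 0-to-1 convention (0.0 = guidance invisible, 1.0 = fully opaque) is an assumption.

```python
def blend_pixel(windshield_rgb, guidance_rgb, transparency):
    """Blend one guidance-image pixel over the through-windshield view
    at the user-preset transparency, returning the perceived RGB."""
    t = max(0.0, min(1.0, transparency))      # clamp the preset parameter
    return tuple(round(t * g + (1.0 - t) * w)
                 for w, g in zip(windshield_rgb, guidance_rgb))
```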
From the above, the electronic device has a projection unit and a first image-collecting unit. Based on the real-scene image of the preset range collected by the first image-collecting unit, the electronic device obtains the road information within the preset range, where the preset range is the region formed by extending a preset distance from the current position of the electronic device in its direction of movement, i.e., a region within a certain range that the first image-collecting unit faces. The electronic device then generates a path-guidance image corresponding to the road information, for example a route indicator, a direction arrow, or a speed alarm. Next, based on the user's line of sight at the current moment, it determines a corresponding projection region on the projection plane, and finally projects the path-guidance image onto that region through the projection unit. In this way, the user can receive navigation information while driving and watching the road, without shifting the line of sight, which reduces the probability of traffic accidents or wrong turns and thus provides a safer navigation scheme and a good user experience.
Embodiment 2:
This embodiment provides an information processing method applied to a wearable device, such as smart glasses.
Referring to Fig. 5, the wearable device comprises: a projection supporting part 51; a support 52, worn on the user's head, for fixing the projection supporting part 51 in front of the user's eyes and keeping the relative position of the projection supporting part 51 and the user's head constant; a first image-collecting unit 53, arranged on the support 52, for collecting the real-scene image within the preset range, wherein the preset range is the region formed by extending a preset distance from the current position of the wearable device in its direction of movement; a control module 54, for obtaining, based on the real-scene image, the road information within the preset range, generating the path-guidance image corresponding to the road information, and determining, based on the user's line of sight at the current moment, a corresponding projection region on the projection supporting part 51; and a projection unit 55, for projecting the path-guidance image onto the projection region on the projection supporting part 51.
The preset range here refers to the region formed by extending a preset distance from the current position of the wearable device in its direction of movement, consistent with the preset range described in the foregoing embodiment; referring to Fig. 2, it is the space enclosed by the dotted lines. The preset region is determined by the size of the vehicle's front windshield and/or the image-capture field of the first image-collecting unit 53, so in practice its size depends on the application and is not specifically limited by the present invention.
In a specific implementation, the first image-collecting unit 53 may be an ordinary camera, a high-definition camera, a high-speed camera, or the like; the present invention is not specifically limited in this regard.
The information processing method provided by the embodiment of the present invention is described below in conjunction with the above wearable device.
Referring to Fig. 6, the method comprises:
S601: obtaining, based on the real-scene image of the preset range collected by the first image-collecting unit, the road information within the preset range;
Specifically, the user wears the support of the wearable device on the head; through the support, the projection supporting part is fixed roughly parallel to the user's face, within the field of view in front of the user's eyes. When the wearable device works, the first image-collecting unit collects, in real time, the real-scene image of the preset range, that is, the road conditions the user can see through the front windshield. The first image-collecting unit sends the real-scene image to the control module, which analyzes the image to obtain the current road information, for example whether the road within the preset range is a through lane, whether there is a curve, whether there is a crossroad, and so on.
S602: generating the path-guidance image corresponding to the road information;
Specifically, after obtaining the road information, the control module can generate the path-guidance image according to it.
In a specific implementation, S602 may include, without limitation, the following two cases.
In the first case, map data information is obtained based on the current positioning information of the wearable device, and the path-guidance image is generated based on the map data information and the road information.
Specifically, the wearable device can locate itself through a built-in Global Positioning System (GPS) module or with the assistance of base stations, obtain the map data information relevant to the current position from a local or cloud map database, and then generate the path-guidance image by combining the map data information with the road information obtained in S601.
For example, the wearable device locates itself by GPS on Street A of City X, retrieves the map data information of Street A, City X from the map database, such as street length, surrounding buildings, traffic conditions, and traffic information, and then, according to the road information collected by the first image-collecting unit, namely that the preset region is a through lane, combined with the map information, generates a path-guidance image indicating that the user should keep straight for the next 200 meters, as shown in Fig. 4A. This image may include the guided path (the arrow in Fig. 4A) and the remaining straight-ahead distance, namely 200 meters.
In practical applications, besides the road-surface information, the path-guidance image may also include information such as the names and images of nearby buildings, to remind the user of the surroundings. For example, suppose the user is going to the zoo. When the wearable device determines that it is still 200 meters from the zoo, it learns from the map data information that the zoo is on the left side of the road, and therefore displays the name of the zoo and the distance, namely 200 meters, on the left of the guided path in the path-guidance image, as shown in Fig. 4B.
Of course, the path-guidance image may take other forms; the present invention imposes no specific limitation.
In the second case, a guided path is generated according to the road information, and the guided path is superimposed on the real-scene image to generate the path-guidance image, so as to guide the user to move along the guided path.
Specifically, the wearable device can generate a guided path according to the road information obtained in S601, superimpose the path on the real-scene image, and generate the path-guidance image shown in Fig. 4C; after seeing the image, the user moves along the guided path.
Of course, the path-guidance image may also take other forms; the present invention is not specifically limited in this regard.
S603: determining, based on the user's line of sight at the current moment, a corresponding projection region on the projection supporting part;
Specifically, after generating the path-guidance image, the wearable device determines a corresponding projection region on the projection supporting part, such as the lenses of smart glasses, according to the user's preset visual range. The projection region may be the whole projection supporting part or only the sub-region corresponding to the user's line of sight, as long as it lies within the user's visual range.
Further, the wearable device may also be provided with a second image-collecting unit whose collection region faces opposite to that of the first image-collecting unit, that is, whose collection direction faces the user. While the user is driving, the second image-collecting unit collects images of the user's eyes in real time and sends them to the control module. The control module analyzes the eye images to obtain the user's eye-movement data, such as the position of the user's eyes, and then, based on the eye-movement data, determines on the projection supporting part the projection region intersected by the user's line of sight.
S604: projecting the path-guidance image onto the projection region through the projection unit.
More preferably, in order not to hinder the user's view of the road through the projection supporting part, the path-guidance image is projected onto the projection region semi-transparently, with the transparency adjusted to a parameter value preset by the user.
From the above, the wearable device has a projection unit and a first image-collecting unit. Based on the real-scene image of the preset range collected by the first image-collecting unit, the wearable device obtains the road information within the preset range, where the preset range is the region formed by extending a preset distance from the current position of the wearable device in its direction of movement, i.e., a region within a certain range that the first image-collecting unit faces. The wearable device then generates a path-guidance image corresponding to the road information, for example a route indicator, a direction arrow, or a speed alarm. Next, based on the user's line of sight at the current moment, it determines a corresponding projection region on the projection supporting part, and finally projects the path-guidance image onto that region through the projection unit. In this way, the user can receive navigation information while driving and watching the road, without shifting the line of sight, which reduces the probability of traffic accidents or wrong turns and thus provides a safer navigation scheme and a good user experience.
Embodiment 3:
This embodiment provides an electronic device consistent with the electronic device described in Embodiment 1 above.
Referring to Fig. 1, the electronic device comprises: a first image-collecting unit 11, for collecting the real-scene image of the preset range, wherein the preset range is the region formed by extending a preset distance from the current position of the electronic device in its direction of movement; a control module 12, for obtaining, based on the real-scene image, the road information within the preset range, generating the path-guidance image corresponding to the road information, and determining, based on the user's line of sight at the current moment, a corresponding projection region on the projection plane; and a projection unit 13, for projecting the path-guidance image onto the projection region.
Further, the electronic device also comprises a second image-collecting unit, for collecting images of the user's eyes before the projection unit 13 projects the path-guidance image onto the projection region. The control module 12 is also configured to obtain the eye images, analyze them to obtain the user's eye-movement data, and determine, based on the eye-movement data, the projection region on the projection plane intersected by the user's line of sight.
Further, the control module 12 is specifically configured to obtain map data information based on the current positioning information of the electronic device, and to generate the path-guidance image based on the map data information and the road information.
Further, the control module 12 is specifically configured to generate a guided path according to the road information, and to superimpose the guided path on the real-scene image to generate the path-guidance image, so as to guide the user to move along the guided path.
Embodiment 4:
This embodiment provides a wearable device consistent with the wearable device in Embodiment 2 above.
Referring to Fig. 5, the wearable device comprises: a projection supporting part 51; a support 52, worn on the user's head, for fixing the projection supporting part 51 in front of the user's eyes and keeping the relative position of the projection supporting part 51 and the user's head constant; a first image-collecting unit 53, arranged on the support 52, for collecting the real-scene image within the preset range, wherein the preset range is the region formed by extending a preset distance from the current position of the wearable device in its direction of movement; a control module 54, for obtaining, based on the real-scene image, the road information within the preset range, generating the path-guidance image corresponding to the road information, and determining, based on the user's line of sight at the current moment, a corresponding projection region on the projection supporting part 51; and a projection unit 55, for projecting the path-guidance image onto the projection region.
Further, the wearable device also comprises a second image-collecting unit, arranged on the support 52, for collecting images of the user's eyes. Correspondingly, the control module 54 is also configured to analyze the eye images to obtain the user's eye-movement data, and to determine, based on the eye-movement data, the projection region on the projection supporting part 51 intersected by the user's line of sight.
Further, the control module 54 is specifically configured to obtain map data information based on the current positioning information of the wearable device, and to generate the path-guidance image based on the map data information and the road information.
Further, the control module 54 is specifically configured to generate a guided path according to the road information, and to superimpose the guided path on the real-scene image to generate the path-guidance image, so as to guide the user to move along the guided path.
Those skilled in the art should understand that embodiments of the present invention may be provided as a method, a system, or a computer program product. Accordingly, the present invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk memory and optical memory) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of the method, the device (system), and the computer program product according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations thereof, can be realized by computer program instructions. These computer program instructions can be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce a device for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing device to work in a specific way, so that the instructions stored in the computer-readable memory produce a manufacture including an instruction device which realizes the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operation steps are performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thus provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
The above are only preferred embodiments of the present invention and are not intended to limit the protection scope of the present invention.

Claims (12)

1. A method for information processing, applied to electronic equipment having a projection unit and a first image acquisition unit, the method comprising:
acquiring road information within a preset range based on a real-scene image of the preset range collected by the first image acquisition unit, wherein the preset range is a region formed by extending a preset distance from the current position of the electronic equipment in the moving direction of the electronic equipment;
generating a path guidance image corresponding to the road information;
determining, based on the user's line of sight at the current moment, a corresponding projection region on a projection surface; and
projecting, by the projection unit, the path guidance image onto the projection region.
2. The method according to claim 1, wherein the electronic equipment has a second image acquisition unit;
correspondingly, the determining, based on the user's line of sight at the current moment, a corresponding projection region on the projection surface comprises:
acquiring an eye image of the user by the second image acquisition unit;
parsing the eye image to obtain eye movement data of the user; and
determining, based on the eye movement data, the projection region on the projection surface intersected by the user's line of sight.
3. The method according to claim 1, wherein the generating a path guidance image corresponding to the road information comprises:
obtaining map data information based on current positioning information of the electronic equipment; and
generating the path guidance image based on the map data information and the road information.
4. The method according to claim 1, wherein the generating a path guidance image corresponding to the road information comprises:
generating a guidance path according to the road information; and
superimposing the guidance path on the real-scene image to generate the path guidance image, so as to guide the user to move along the guidance path.
5. Electronic equipment, comprising:
a first image acquisition unit, configured to collect a real-scene image within a preset range, wherein the preset range is a region formed by extending a preset distance from the current position of the electronic equipment in the moving direction of the electronic equipment;
a control unit, configured to obtain road information within the preset range based on the real-scene image, generate a path guidance image corresponding to the road information, and determine, based on the user's line of sight at the current moment, a corresponding projection region on a projection surface; and
a projection unit, configured to project the path guidance image onto the projection region.
6. The electronic equipment according to claim 5, further comprising: a second image acquisition unit, configured to collect an eye image of the user before the projection unit projects the path guidance image onto the projection region;
wherein the control unit is further configured to obtain the eye image, parse the eye image to obtain eye movement data of the user, and determine, based on the eye movement data, the projection region on the projection surface intersected by the user's line of sight.
7. The electronic equipment according to claim 5, wherein the control unit is specifically configured to obtain map data information based on current positioning information of the electronic equipment, and to generate the path guidance image based on the map data information and the road information.
8. The electronic equipment according to claim 5, wherein the control unit is specifically configured to generate a guidance path according to the road information, and to superimpose the guidance path on the real-scene image to generate the path guidance image, so as to guide the user to move along the guidance path.
9. A wearable device, comprising:
a projection supporting part;
a support, worn on the user's head, configured to fix the projection supporting part in front of the user's eyes and to keep the relative position between the projection supporting part and the user's head constant;
a first image acquisition unit, arranged on the support, configured to collect a real-scene image within a preset range, wherein the preset range is a region formed by extending a preset distance from the current position of the wearable device in the moving direction of the wearable device;
a control unit, configured to obtain road information within the preset range based on the real-scene image, generate a path guidance image corresponding to the road information, and determine, based on the user's line of sight at the current moment, a corresponding projection region on the projection supporting part; and
a projection unit, configured to project the path guidance image onto the projection region.
10. The wearable device according to claim 9, further comprising: a second image acquisition unit, arranged on the support, configured to acquire an eye image of the user;
wherein the control unit is further configured to parse the eye image to obtain eye movement data of the user, and to determine, based on the eye movement data, the projection region on the projection supporting part intersected by the user's line of sight.
11. The wearable device according to claim 9, wherein the control unit is specifically configured to obtain map data information based on current positioning information of the wearable device itself, and to generate the path guidance image based on the map data information and the road information.
12. The wearable device according to claim 9, wherein the control unit is specifically configured to generate a guidance path according to the road information, and to superimpose the guidance path on the real-scene image to generate the path guidance image, so as to guide the user to move along the guidance path.
CN201510129085.8A 2015-03-23 2015-03-23 Method for information processing, electronic equipment and wearable equipment Pending CN104750448A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510129085.8A CN104750448A (en) 2015-03-23 2015-03-23 Method for information processing, electronic equipment and wearable equipment


Publications (1)

Publication Number Publication Date
CN104750448A true CN104750448A (en) 2015-07-01

Family

ID=53590208

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510129085.8A Pending CN104750448A (en) 2015-03-23 2015-03-23 Method for information processing, electronic equipment and wearable equipment

Country Status (1)

Country Link
CN (1) CN104750448A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101393029A (en) * 2007-09-17 2009-03-25 王保合 Automobile navigation apparatus and navigation system using the same
CN102829788A (en) * 2012-08-27 2012-12-19 北京百度网讯科技有限公司 Live action navigation method and live action navigation device
WO2014065595A1 (en) * 2012-10-23 2014-05-01 엘지전자 주식회사 Image display device and method for controlling same
CN103888163A (en) * 2012-12-22 2014-06-25 华为技术有限公司 Glasses type communication apparatus, system and method


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105007441A (en) * 2015-08-30 2015-10-28 于卫华 Shoulder hanging portable WiFi play projector
CN110383214A (en) * 2017-03-09 2019-10-25 索尼公司 Information processing unit, information processing method and recording medium
CN108303972A (en) * 2017-10-31 2018-07-20 腾讯科技(深圳)有限公司 The exchange method and device of mobile robot
US11142121B2 (en) 2017-10-31 2021-10-12 Tencent Technology (Shenzhen) Company Limited Interaction method and apparatus of mobile robot, mobile robot, and storage medium
CN109005632A (en) * 2018-09-27 2018-12-14 广东小天才科技有限公司 Auxiliary learning method and intelligent desk lamp
CN110260878A (en) * 2019-06-20 2019-09-20 北京百度网讯科技有限公司 To the guidance method in destination orientation, device, equipment and storage medium
CN114567764A (en) * 2022-03-11 2022-05-31 联想(北京)有限公司 Wearable device and information processing method


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20150701