CN104508601A - Head-worn computer with improved virtual display function - Google Patents

Head-worn computer with improved virtual display function

Info

Publication number
CN104508601A
Authority
CN
China
Prior art keywords
user interface
display
information
first layer
layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201380040303.XA
Other languages
Chinese (zh)
Other versions
CN104508601B (en)
Inventor
Christopher Parkinson
Luke Hopkins
David Niland
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kopin Corp
Original Assignee
Kopin Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US13/799,888 (US9377862B2)
Application filed by Kopin Corp
Publication of CN104508601A
Application granted
Publication of CN104508601B
Active legal status
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

In one embodiment, a method for displaying a user interface on a display of a head-worn computer can include displaying a first layer of information in the user interface on the display of the head-worn computer. The method can further include receiving a directional input from body movement, eye tracking, or hand gestures. The method can additionally include highlighting an area of the user interface on the display with a second layer of information. The area can be located in the user interface based on the received directional input.

Description

Head-Worn Computer With Improved Virtual Display Function
Related Application
This application is a continuation of U.S. Application No. 13/799,888, filed March 13, 2013, which claims the benefit of U.S. Provisional Application No. 61/653,127, filed May 30, 2012. The entire teachings of the above applications are incorporated herein by reference.
Technical Field
Background
Mobile computing devices, such as notebook personal computers (PCs), smart phones, and tablet computing devices, are now common tools for producing, analyzing, communicating, and consuming data in both business and personal life. Consumers continue to embrace a mobile digital lifestyle as ease of access to digital information increases with high-speed wireless communications technologies becoming ubiquitous. Popular uses of mobile computing devices include displaying large amounts of high-resolution computer graphics information and video content, often wirelessly streamed to the device. While these devices typically include a display screen, the preferred visual experience of a high-resolution, large-format display cannot easily be replicated in such mobile devices, because the physical size of such devices is limited to promote mobility. Another drawback of the aforementioned device types is that the user interface is hands-dependent, typically requiring the user to enter data or make selections using a keyboard (physical or virtual) or a touch-screen display. As a result, consumers now seek a hands-free, high-quality, portable color display solution to augment or replace their hands-dependent mobile devices.
Summary of the invention
In one embodiment, a method of displaying a user interface on a display of a head-worn computer can include displaying a first layer of information in the user interface on the display of the head-worn computer. The method can further include receiving a directional input from body movement, eye tracking, or hand gestures. The method can additionally include highlighting an area of the user interface on the display with a second layer of information. The area can be located in the user interface based on the received directional input.
In another embodiment, the method can include aligning the second layer of information with the first layer of information. The method can also include moving the highlighted area of the user interface in the same direction as, and proportionally to, the received directional input.
In another embodiment, the second layer of information can display voice commands to activate areas of the first layer of information. The method can further include receiving a voice command displayed in the second layer of information. The method can also include activating a feature of the first layer of information corresponding to the voice command.
In another embodiment, the method can also include highlighting the area of the user interface on the display with a third layer of information based on a selection user input. The selection user input can be configured to cycle the highlighted area of the user interface on the display through the first, second, and third layers of information. The received directional input can be from head motion, and the area can move in the same direction as the head motion.
The first layer of information can include a representation of a web page. The second layer of information can include a representation of voice commands to activate links within the web page.
The method can also include reorganizing the representation of the voice commands such that the representations are non-overlapping and readable.
In another embodiment, a head-worn computer for displaying a user interface can include a receiving module configured to receive a directional input from at least one of body movement, eye tracking, or hand gestures. The head-worn computer can further include a display module configured to display a first layer of information in the user interface on a display of the head-worn computer, and further configured to highlight an area of the user interface on the display with a second layer of information. The area can be located in the user interface based on the received directional input.
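The receiving-module/display-module split described above can be sketched in code. The following TypeScript is a minimal illustration assuming the layers are ordinary DOM elements; the interface and member names mirror the claim language and are assumptions for illustration, not an actual device API.

```typescript
type DirectionalSource = "body" | "eye" | "gesture";

interface DirectionalInput {
  source: DirectionalSource;
  dx: number; // horizontal displacement, arbitrary units
  dy: number; // vertical displacement
}

// Receives directional input from body movement, eye tracking, or gestures.
interface ReceiverModule {
  onInput(handler: (input: DirectionalInput) => void): void;
}

// Shows the first layer and highlights an area with the second layer.
interface DisplayModule {
  showFirstLayer(layer: HTMLElement): void;
  highlightArea(x: number, y: number, secondLayer: HTMLElement): void;
}

// Wiring: each received directional input relocates the highlighted area.
function bind(receiver: ReceiverModule, display: DisplayModule,
              secondLayer: HTMLElement): void {
  let x = 0, y = 0;
  receiver.onInput((input) => {
    x += input.dx; // same direction as the received input
    y += input.dy;
    display.highlightArea(x, y, secondLayer);
  });
}
```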
Brief Description of the Drawings
The foregoing will be apparent from the following more particular description of example embodiments of the invention, as illustrated in the accompanying drawings, in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating embodiments of the present invention.
FIG. 1 depicts a hands-free wireless video computing headset (or head-mounted display (HMD), or headset computer (HSC)).
FIG. 2 is a user interface diagram illustrating an example embodiment of a web page layer displayed to the user by the head-worn computer.
FIG. 3 is a user interface diagram illustrating an example embodiment of a highlighted web page layer.
FIG. 4A is a user interface diagram illustrating an example embodiment of a spotlight (or flashlight) that reveals the highlighted web page layer within a local area while displaying the web page layer in the rest of the user interface.
FIG. 4B is a user interface diagram illustrating another example embodiment of a spotlight (or flashlight) that reveals the highlighted web page layer within a local area while displaying the web page layer in the rest of the user interface.
FIG. 5 is a user interface diagram illustrating an example embodiment of a spotlight used in a map application.
FIG. 6 illustrates a computer network or similar digital processing environment in which the present invention may be implemented.
FIG. 7 is a diagram of the internal structure of a computer (e.g., client processor/device or server computer) in the computer system of FIG. 6.
Detailed Description
A description of example embodiments of the invention follows.
FIG. 1 depicts a hands-free wireless video computing headset (or head-mounted display (HMD), or headset computer (HSC)) that includes one or more microdisplays, a local processor, and wireless transceivers for the transmission of data, which may include audio/voice and/or graphics/video data, and which may or may not be further integrated with one or more peripheral devices. Examples of such integrated peripheral devices include, but are not limited to, microphones, speakers, 3-axis to 9-axis degree-of-freedom orientation sensors, geo-positional sensors, atmospheric sensors, health condition sensors, GPS, a digital compass (multi-axis magnetometer), a flashlight, an altimeter, pressure sensors, various environmental sensors, personal sensors, energy sensors, optical sensors, and/or cameras.
The wireless hands-free video computing headset can include a microdisplay device to present information to the user, and can use input devices (e.g., head-tracking accelerometers, gyros, or one or more cameras) to detect movements, such as head movements, hand motions, and/or gestures, along with optional voice commands, to provide remote control of applications running on either a local processor or a remote host processor.
The example embodiment depicted in FIG. 1 shows a wireless computing headset device 100 (also referred to herein as video eyewear device 100 or HSC 100) that incorporates a resolution (WQVGA or better) microdisplay element 1010 and the other features described below. Audio input and/or output devices, including one or more microphones, input and output speakers, geo-positional sensing, 3-axis to 9-axis degree-of-freedom orientation sensing, atmospheric sensors, health condition sensors, GPS, a digital compass, pressure sensors, environmental sensors, energy sensors, acceleration, position, altitude, motion, velocity or optical sensors, cameras (visible light, infrared (IR), ultraviolet (UV), etc.), additional wireless radios (Bluetooth®, Wi-Fi®, LTE 4G cellular, FM, etc.), auxiliary lighting, range finders, or the like, and/or an array of sensors may be embedded in the headset and/or attached to the device via one or more peripheral ports (not shown in detail in FIG. 1). (Bluetooth is a registered trademark of Bluetooth Sig, Inc. of Kirkland, Washington; Wi-Fi is a registered trademark of the Wi-Fi Alliance Corporation of Austin, Texas.) Also typically located within the housing are various electronic circuits, including, as will be understood shortly, a microcomputer (single or multi-core), one or more wired or wireless interfaces and/or optical interfaces, associated memory and/or storage devices, various sensors, and one or more peripheral mounts, such as a "hot shoe."
The device 100 can be used in various ways. It can be used as a remote display for a streaming video signal provided by a remote host computing device 120. The host 120 may be, for example, a laptop, cell phone, BlackBerry, iPhone™, or other computing device having less or greater computational complexity than the wireless computing headset remote control device 100. The host 120 may be further connected to other networks, such as through a wired or wireless connection 122 to the Internet. The device 100 and the host 120 are connected via one or more suitable wireless connections, such as those provided by a Bluetooth, Wi-Fi, cellular, LTE, WiMax, or other wireless radio link 150.
The device 100 can also be used as a remote control for the host 120. For example, the device 100 can allow a user to select a field of view 130 within a much larger area defined by a virtual display 140 on the host 120. The user can typically control the position, extent (e.g., X-Y or 3D range), and/or magnification of the field of view 130 using head movements, hand movements, or body gestures, or in other ways, such as with voice commands detected by the HSC 100. The wireless computing headset device 100 thus can have dedicated user input peripherals and processing to, for example, pan, zoom, and control the field of view of the display.
Also located within the device 100 are circuits including, as will be understood shortly, a microcomputer (single or multi-core), one or more wireless interfaces, associated memory or other storage devices, one or more cameras (optical sensors), and/or various sensors. The camera(s), motion sensor(s), and/or positional sensor(s) are used to track the motion and/or position of the user's head, hands, and/or body in at least a first axis 111 (horizontal), but preferably also a second (vertical) axis 112, a third (depth) axis 113, a fourth (pitch) axis 114, a fifth (roll) axis 115, and a sixth (yaw) axis 116. A 3-axis magnetometer (digital compass) can be added to provide the wireless computing headset or peripheral device with full 9-axis degree-of-freedom positional accuracy.
The device 100 also includes at least one microphone, with corresponding electronics and/or a programmable processor for speech recognition. The device 100 detects the user's voice and, using speech recognition, derives commands and/or dictation. The device 100 uses the commands derived from the speech recognition to perform the functions indicated by the commands.
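As an illustration of this command-driven control, the following TypeScript sketch maps recognized phrases to functions, using the browser's (prefixed) Web Speech API recognizer as a stand-in for the headset's on-board recognizer; the command table and the `overlay` handlers are illustrative assumptions, not part of the patent.

```typescript
// Minimal command handlers; in this sketch they only log what they would do.
const overlay = {
  showAllLabels: () => console.log("show entire highlighted link layer"),
  enterSpotlight: () => console.log("enter spotlight mode"),
  cycleLayer: (dir: number) => console.log(`cycle revealed layer by ${dir}`),
};

// Spoken phrase -> function to perform.
const commands = new Map<string, () => void>([
  ["show commands", () => overlay.showAllLabels()],
  ["show spotlight", () => overlay.enterSpotlight()],
  ["next layer", () => overlay.cycleLayer(+1)],
]);

// `webkitSpeechRecognition` is the prefixed Web Speech API recognizer
// available in some browsers; typed loosely here for brevity.
const recognition = new (window as any).webkitSpeechRecognition();
recognition.continuous = true;
recognition.onresult = (event: any) => {
  const result = event.results[event.results.length - 1];
  const phrase = result[0].transcript.trim().toLowerCase();
  commands.get(phrase)?.(); // perform the function indicated by the command
};
recognition.start();
```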
As mentioned, the device 100 can be used as a remote control for a host computing device 120. The host 120 may be, for example, a laptop, cell phone, BlackBerry™, iPhone™, or other computing device having less or greater computational complexity than the remote control device 100. The host 120 may be further connected to other networks, such as through a wireless connection 122 to the Internet. The remote control 100 and the host 120 are connected via a suitable wireless connection, such as that provided by a Bluetooth™, Wi-Fi, or other short-range wireless link 150.
According to aspects that will be explained in more detail below, the remote control device 100 allows a user to select a field of view 130 within a much larger area defined by a virtual display. The user can typically control the position, extent (e.g., X-Y or 3D range), and/or magnification of the field of view 130.
The HSC 100 communicates 150 with one or more host processors. The host processors may be integrated into such devices as a typical personal computer (PC) 200; meteorological, diagnostic, or other test equipment 145, including wireless test probes; and/or any other computing device, including, but not limited to, a printer 160, a thermostat, or an onboard computer system of a vehicle. The hands-free wireless video computing headset 100 transmits commands 150 to a host and receives replies based on those commands.
The replies 150 received by the HSC 100 may include status messages from the host processor indicating the state of that computing device, data resulting from self-diagnosis processes, or other information.
In an alternative embodiment, the HSC commands 150 and the received replies 150 may include messaging to facilitate and execute diagnostic procedures utilizing one or more expert decision trees.
In a further alternative embodiment, the communications 150 between the HSC 100 and the host processor may include the real-time transmission and reception of audio, geographical, and/or video data to and from a live expert.
In yet another alternative embodiment, the communications 150 between the HSC 100 and the host processor may include a video of a three-dimensional representation of the user's body, which may include overlays of physical and non-physical objects in the user's adjacent environment and/or a representation of an expert's body, so as to provide the user with expert guidance.
In one embodiment, the HSC 100 enables "Searchlight Navigation" of hyperlinked pages. User interface screens, applications, and complex hyperlinked pages (e.g., web pages) can be navigated by speech, sound, or gesture. The HSC 100 highlights the hyperlinks on the page and enumerates each with a corresponding voice command. The enumerated voice commands allow the user to speak the link number, name, or code word, upon which the HSC 100 navigates to the selected hyperlink. This method is effective when the web page is relatively simple (e.g., a web page having fewer than 30 links per page to enumerate). However, web pages can be more complex, presenting the user with hundreds of links per page. In addition, web pages can contain small, tightly-packed links that are difficult to highlight and label individually.
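A minimal sketch of this enumeration step, assuming the hyperlinked page is an ordinary HTML document: each anchor receives a numbered, semi-transparent badge, and a table maps the spoken phrase (here assumed to take the form "link N") to the hyperlink to activate. The class names and label format are illustrative assumptions.

```typescript
function enumerateLinks(doc: Document): Map<string, HTMLAnchorElement> {
  const byCommand = new Map<string, HTMLAnchorElement>();
  const anchors = Array.from(doc.querySelectorAll<HTMLAnchorElement>("a[href]"));

  anchors.forEach((anchor, i) => {
    const command = `link ${i + 1}`;
    byCommand.set(command, anchor);

    // Overlay a semi-transparent numbered badge so the user can read
    // which voice command activates which hyperlink.
    const badge = doc.createElement("span");
    badge.textContent = String(i + 1);
    badge.style.cssText =
      "background: rgba(255,255,0,0.5); border-radius: 3px; padding: 0 2px;";
    anchor.prepend(badge);
  });
  return byCommand;
}

// When the recognizer reports, e.g., "link 7", navigate to the selected link.
function activate(byCommand: Map<string, HTMLAnchorElement>, phrase: string) {
  byCommand.get(phrase)?.click();
}
```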
In one embodiment, the system simplifies navigation of complex web pages on the HSC by enabling a searchlight (e.g., a spotlight, flashlight, highlighted area, etc.) in the user interface that focuses attention on only a portion of the screen and/or web page, within which the user interface presents additional relevant information. The searchlight appears over the exemplary web page as a circle, rectangle, or other shape that highlights the area of the screen the user is focused on. The searchlight can be moved around the web page using a head tracker (or an eye tracker, or hand gestures, or any combination of the three). Natural head movements 110, 111, 112 can move the searchlight left, right, up, and down, or along diagonals of any slope. Gestures can further increase or decrease the size of the searchlight.
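Steering the searchlight from head-tracker input might look like the following sketch, which assumes yaw/pitch deltas in degrees are delivered by some sensor callback; the gain constant and minimum radius are illustrative assumptions.

```typescript
interface HeadDelta {
  yaw: number;   // degrees turned since the last sample (+right)
  pitch: number; // degrees tilted since the last sample (+up)
}

class Searchlight {
  constructor(public x: number, public y: number, public radius: number) {}

  // Move in the same direction as, and proportionally to, the head motion.
  move(delta: HeadDelta, pixelsPerDegree = 20): void {
    this.x += delta.yaw * pixelsPerDegree;
    this.y -= delta.pitch * pixelsPerDegree; // looking up moves the light up
  }

  // Gestures can grow or shrink the searchlight.
  resize(scale: number): void {
    this.radius = Math.max(40, this.radius * scale);
  }
}

const light = new Searchlight(640, 360, 150);
// e.g., headTracker.onSample((sample: HeadDelta) => light.move(sample));
```

Because the displacement is proportional to the angular delta, diagonal head motion (simultaneous yaw and pitch) moves the searchlight along a diagonal of the corresponding slope.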
Outside the searchlight, the web page appears normal, with no overlapping links. Inside the searchlight, however, the links are highlighted and enumerated. The searchlight of the HSC 100 can further "explode" a dense list of links, placing the respective labels of the links around the outside of the shape.
In another embodiment, searchlight navigation provides "see-through capabilities" for multi-layered documents. The HSC 100 can then apply the same method to many viewable data types besides voice links, providing "X-ray" or see-through visibility into the data sets beneath the top or primary visible layer or layers. For example, in a multi-layered drawing (e.g., a schematic of a multi-layered electronic board), the top layer can be shown while the searchlight is used to trace a bottom layer. Further, in one embodiment, hand gestures can cycle among multiple bottom layers.
Similarly, the HSC 100 can display a road map as the top layer and show a satellite overlay within the searchlight. The searchlight is steered under head-movement control. Gestures can cycle through multiple maps (e.g., a map showing terrain, a map showing traffic, and a map showing satellite imagery).
FIG. 2 is a user interface diagram 200 illustrating an example embodiment of a web page layer 202 displayed to the user by the HSC 100. The web page layer 202 shows a news web page, displayed to the user as usual. The web page includes multiple hyperlinks that the user can select to view another web page. However, when the user is navigating with voice commands, the user may need prompting to be able to select the correct hyperlink.
FIG. 3 is a user interface diagram 300 illustrating an example embodiment of a highlighted web page layer 302. The highlighted web page layer 302 includes highlighted links 304a-s. Each highlighted link 304a-s shows a respective command or voice tag for opening the link (e.g., "link 1" through "link 19"). In some embodiments, the highlighted links 304a-s can be translucent, to allow the user to see the text of the underlying hyperlink. Further, the highlighted link layer 302 can be configured to prevent advertisements from being paired with a voice tag, to further simplify the options presented to the user.
FIG. 4A is a user interface diagram 400 illustrating an example embodiment of a spotlight 402 (or flashlight) that reveals the highlighted web page layer 302 within a local area while displaying the web page layer 202 in the rest of the user interface. Highlighted links 304c-i, 304k, 304n, and 304q are within the spotlight. The spotlight is a shape, such as a circle, that reveals another layer of information related to the first layer. In this example, the spotlight reveals the highlighted web page layer 302 over the web page layer 202. The user can then read a displayed voice command to activate the corresponding hyperlink. Further, the user can move the spotlight around the user interface by moving his or her head, revealing other highlighted links. The HSC 100 determines the head movement via head-tracking technology and moves the spotlight in a corresponding manner.
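One way to render such a spotlight, assuming the two layers are stacked full-screen DOM elements, is to clip the second layer to a circle that follows the head-tracked position. The CSS `clip-path` technique below is an illustrative implementation choice, not a mechanism specified by the patent.

```typescript
function revealWithinSpotlight(
  highlightedLayer: HTMLElement, // second layer: highlighted links + voice tags
  light: { x: number; y: number; radius: number },
): void {
  // Outside the circle the layer is clipped away, so the plain first
  // layer (the normal web page) remains visible there.
  highlightedLayer.style.clipPath =
    `circle(${light.radius}px at ${light.x}px ${light.y}px)`;
}

// Re-clip whenever the head tracker moves the spotlight, e.g.:
// headTracker.onSample(s => { light.move(s); revealWithinSpotlight(layer, light); });
```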
In another embodiment, the user can speak a voice command (e.g., "show commands") to present the entire highlighted link layer 302 to the user. The user can also speak a voice command (e.g., "show spotlight") to enter the spotlight mode.
FIG. 4B is a user interface diagram 450 illustrating an example embodiment of a spotlight 456 (or flashlight) that reveals a highlighted web page layer 454 within a local area while displaying the web page layer 452 in the rest of the user interface. The spotlight 456 is similar to the spotlight described in relation to FIG. 4A; however, the highlighted web page layer 454 revealed by the spotlight 456 in FIG. 4B shows highlighted links 458a-d that display a minimal amount of text. That is, the highlighted links show only the number associated with each link (e.g., 1-4, respectively). In such an embodiment, the highlighted links 458a-d are more compact, so that more links can be highlighted in a small area.
FIG. 5 is a user interface diagram 500 illustrating an example embodiment of a spotlight 508 used in a map application. The user interface 502 shows a satellite map layer 504 and the spotlight 508. A road map layer 506 is within the spotlight. As described above, the user moves the spotlight 508 across the user interface using head-tracking technology, revealing other areas of the road map layer 506.
The spotlight can also cycle through multiple layers. For example, the user can issue a voice command (e.g., "next layer" or "previous layer") or a gesture command to display another layer, which, in the map application, might be a terrain layer or a traffic layer.
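Cycling the layer revealed inside the spotlight can be sketched as below, again assuming the layers are stacked DOM elements; the layer list and the command wiring are illustrative assumptions based on the map example above.

```typescript
class LayerDeck {
  private current = 0;
  constructor(private layers: HTMLElement[]) {}

  // dir = +1 for "next layer", -1 for "previous layer"
  cycle(dir: number): HTMLElement {
    this.current =
      (this.current + dir + this.layers.length) % this.layers.length;
    this.layers.forEach((layer, i) => {
      layer.style.visibility = i === this.current ? "visible" : "hidden";
    });
    return this.layers[this.current]; // the layer now shown in the spotlight
  }
}

// e.g., const deck = new LayerDeck([roadMap, terrain, traffic]);
//       commands.set("next layer",
//         () => revealWithinSpotlight(deck.cycle(+1), light));
```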
FIG. 6 illustrates a computer network or similar digital processing environment in which the present invention may be implemented.
Client computers/devices 50 and server computers 60 provide processing, storage, and input/output devices for executing application programs and the like. The client computers/devices 50 can also be linked through a communications network 70 to other computing devices, including other client devices/processors 50 and server computers 60. The communications network 70 can be part of a remote access network, a global network (e.g., the Internet), a worldwide collection of computers, a local area or wide area network, and gateways that currently use respective protocols (TCP/IP, Bluetooth, etc.) to communicate with one another. Other electronic device/computer network architectures are also suitable.
FIG. 7 is a diagram of the internal structure of a computer (e.g., client processor/device 50 or server computer 60) in the computer system of FIG. 6. Each computer 50, 60 contains a system bus 79, where a bus is a set of hardware lines used for data transfer among the components of a computer or processing system. The bus 79 is essentially a shared conduit that connects the different elements of a computer system (e.g., processor, disk storage, memory, input/output ports, network ports, etc.) and enables the transfer of information between the elements. Attached to the system bus 79 is an I/O device interface 82 for connecting various input and output devices (e.g., keyboard, mouse, displays, printers, speakers, etc.) to the computer 50, 60. A network interface 86 allows the computer to connect to various other devices attached to a network (e.g., the network 70 of FIG. 6). Memory 90 provides volatile storage for the computer software instructions 92 and data 94 (e.g., the spotlight module code described above) used to implement an embodiment of the present invention. Disk storage 95 provides non-volatile storage for the computer software instructions 92 and data 94 used to implement an embodiment of the present invention. A central processor unit 84 is also attached to the system bus 79 and provides for the execution of computer instructions.
In one embodiment, the processor routines 92 and data 94 are a computer program product (generally referenced 92), including a computer-readable medium (e.g., a removable storage medium, such as one or more DVD-ROMs, CD-ROMs, diskettes, tapes, etc.) that provides at least a portion of the software instructions for the invention system. The computer program product 92 can be installed by any suitable software installation procedure, as is well known in the art. In another embodiment, at least a portion of the software instructions may also be downloaded over a cable, communication, and/or wireless connection. In other embodiments, the invention programs are a computer program propagated signal product 107 embodied on a propagated signal on a propagation medium (e.g., a radio wave, an infrared wave, a laser wave, a sound wave, or an electrical wave propagated over a global network such as the Internet, or other networks). Such carrier media or signals provide at least a portion of the software instructions for the present invention routines/program 92.
In alternative embodiments, the propagated signal is an analog carrier wave or a digital signal carried on the propagated medium. For example, the propagated signal may be a digitized signal propagated over a global network (e.g., the Internet), a telecommunications network, or another network. In one embodiment, the propagated signal is a signal that is transmitted over the propagation medium over a period of time, such as the instructions for a software application sent in packets over a network over a period of milliseconds, seconds, minutes, or longer. In another embodiment, the computer-readable medium of the computer program product 92 is a propagation medium that the computer system 50 may receive and read, such as by receiving the propagation medium and identifying a propagated signal embodied in the propagation medium, as described above for the computer program propagated signal product.
Generally speaking, the term "carrier medium" or transient carrier encompasses the foregoing transient signals, propagated signals, propagated media, storage media, and the like.
The teachings of all patents, published applications, and references cited herein are incorporated by reference in their entirety.
While this invention has been particularly shown and described with references to example embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the invention encompassed by the appended claims.

Claims (18)

1. A method of displaying a user interface on a display of a head-worn computer, the method comprising:
displaying a first layer of information in the user interface on the display of the head-worn computer;
receiving a directional input from at least one of body movement, eye tracking, or hand gestures; and
highlighting an area of the user interface on the display with a second layer of information, the area being located in the user interface based on the received directional input.
2. The method of claim 1, further comprising aligning the second layer of information with the first layer of information.
3. The method of claim 1, further comprising moving the highlighted area of the user interface in the same direction as, and proportionally to, the received directional input.
4. The method of claim 1, wherein the second layer of information displays voice commands to activate areas of the first layer of information.
5. The method of claim 4, further comprising:
receiving a voice command displayed in the second layer of information; and
activating a feature of the first layer of information corresponding to the voice command.
6. The method of claim 1, further comprising:
highlighting the area of the user interface on the display with a third layer of information based on a selection user input, wherein the selection user input is configured to cycle the display of the highlighted area of the user interface through the first layer of information, the second layer of information, and the third layer of information.
7. The method of claim 1, wherein the first layer of information includes a representation of a web page, and the second layer of information includes a representation of voice commands to activate links within the web page.
8. The method of claim 1, further comprising reorganizing the representation of the voice commands such that the representations are non-overlapping and readable.
9. The method of claim 1, wherein the received directional input is from head motion, and the area moves in the same direction as the head motion.
10. A head-worn computer for displaying a user interface, the system comprising:
a receiving module configured to receive a directional input from at least one of body movement, eye tracking, or hand gestures; and
a display module configured to display a first layer of information in the user interface on a display of the head-worn computer, and further configured to highlight an area of the user interface on the display with a second layer of information, the area being located in the user interface based on the received directional input.
11. The system of claim 10, wherein the display module is further configured to align the second layer of information with the first layer of information.
12. The system of claim 10, wherein the display module is further configured to move the highlighted area of the user interface in the same direction as, and proportionally to, the received directional input.
13. The system of claim 10, wherein the second layer of information displays voice commands to activate areas of the first layer of information.
14. The system of claim 13, wherein the receiving module is further configured to receive a voice command displayed in the second layer of information, and wherein the display module is further configured to activate a feature of the first layer of information corresponding to the voice command.
15. The system of claim 10, wherein the display module is further configured to highlight the area of the user interface on the display with a third layer of information based on a selection user input, and wherein the selection user input is configured to cycle the display of the highlighted area of the user interface through the first layer of information, the second layer of information, and the third layer of information.
16. The system of claim 10, wherein the first layer of information includes a representation of a web page, and the second layer of information includes a representation of voice commands to activate links within the web page.
17. The system of claim 10, wherein the display module is further configured to reorganize the representation of the voice commands such that the representations are non-overlapping and readable.
18. The system of claim 10, wherein the received directional input is from head motion, and the area moves in the same direction as the head motion.
CN201380040303.XA 2012-05-30 2013-05-16 Head-worn computer with improved virtual display function Active CN104508601B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201261653127P 2012-05-30 2012-05-30
US61/653,127 2012-05-30
US13/799,888 2013-03-13
US13/799,888 US9377862B2 (en) 2010-09-20 2013-03-13 Searchlight navigation using headtracker to reveal hidden or extra document data
PCT/US2013/041349 WO2013180966A1 (en) 2012-05-30 2013-05-16 Head -worn computer with improved virtual display function

Publications (2)

Publication Number Publication Date
CN104508601A true CN104508601A (en) 2015-04-08
CN104508601B CN104508601B (en) 2017-11-21

Family

ID=48579475

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201380040303.XA 2012-05-30 2013-05-16 Head-worn computer with improved virtual display function Active CN104508601B (en)

Country Status (5)

Country Link
EP (1) EP2856284B1 (en)
JP (1) JP6110938B2 (en)
KR (1) KR20150027163A (en)
CN (1) CN104508601B (en)
WO (1) WO2013180966A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108269460A (en) * 2018-01-04 2018-07-10 Gao Dashan Reading method, system, and terminal device for an electronic screen

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9316827B2 (en) 2010-09-20 2016-04-19 Kopin Corporation LifeBoard—series of home pages for head mounted displays (HMD) that respond to head tracking
US9377862B2 (en) 2010-09-20 2016-06-28 Kopin Corporation Searchlight navigation using headtracker to reveal hidden or extra document data
CN105940371A (en) * 2013-12-26 2016-09-14 寇平公司 User configurable speech commands
WO2015144621A1 (en) * 2014-03-26 2015-10-01 Sony Corporation Electronic device and method for controlling the electronic device
JP6609994B2 (en) * 2015-05-22 2019-11-27 富士通株式会社 Display control method, information processing apparatus, and display control program
US9959677B2 (en) * 2015-05-26 2018-05-01 Google Llc Multidimensional graphical method for entering and exiting applications and activities in immersive media
EP3384370A4 (en) * 2015-12-01 2020-02-19 Quantum Interface, LLC Motion based systems, apparatuses and methods for implementing 3d controls using 2d constructs, using real or virtual controllers, using preview framing, and blob data controllers
KR20170100309A (en) 2016-02-25 2017-09-04 삼성전자주식회사 Electronic apparatus for providing a voice recognition control and method thereof
US11340756B2 (en) 2019-09-27 2022-05-24 Apple Inc. Devices, methods, and graphical user interfaces for interacting with three-dimensional environments
US11567625B2 (en) 2020-09-24 2023-01-31 Apple Inc. Devices, methods, and graphical user interfaces for interacting with three-dimensional environments
US11615596B2 (en) 2020-09-24 2023-03-28 Apple Inc. Devices, methods, and graphical user interfaces for interacting with three-dimensional environments

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101510121A (en) * 2009-03-12 2009-08-19 重庆大学 Interface roaming operation method and apparatus based on gesture identification
US20110221656A1 (en) * 2010-02-28 2011-09-15 Osterhout Group, Inc. Displayed content vision correction with electrically adjustable lens
US20110254865A1 (en) * 2010-04-16 2011-10-20 Yee Jadine N Apparatus and methods for dynamically correlating virtual keyboard dimensions to user finger size

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050047629A1 (en) * 2003-08-25 2005-03-03 International Business Machines Corporation System and method for selectively expanding or contracting a portion of a display using eye-gaze tracking
US7707501B2 (en) * 2005-08-10 2010-04-27 International Business Machines Corporation Visual marker for speech enabled links
JP2007303878A (en) * 2006-05-09 2007-11-22 Denso Corp Navigation apparatus
JPWO2011070705A1 (en) * 2009-12-11 2013-04-22 日本電気株式会社 Information display apparatus, computer program thereof, and data processing method
JP5316453B2 (en) * 2010-03-24 2013-10-16 ブラザー工業株式会社 Head mounted display and program
JP4934228B2 (en) * 2010-06-17 2012-05-16 新日鉄ソリューションズ株式会社 Information processing apparatus, information processing method, and program
US8706170B2 (en) * 2010-09-20 2014-04-22 Kopin Corporation Miniature communications gateway for head mounted display

Also Published As

Publication number Publication date
EP2856284A1 (en) 2015-04-08
EP2856284B1 (en) 2017-10-04
JP6110938B2 (en) 2017-04-05
WO2013180966A1 (en) 2013-12-05
CN104508601B (en) 2017-11-21
JP2015523642A (en) 2015-08-13
KR20150027163A (en) 2015-03-11

Similar Documents

Publication Publication Date Title
CN104508601A (en) Head-worn computer with improved virtual display function
US20130239000A1 (en) Searchlight Navigation Using Headtracker To Reveal Hidden or Extra Document Data
EP2752362B1 (en) Image display system, image display method and program
CN104428627B (en) For the Vehicular navigation system and method for the information from mobile device to be presented
CN102812417B (en) The wireless hands-free with the detachable accessory that can be controlled by motion, body gesture and/or verbal order calculates headset
KR101682880B1 (en) Vehicle and remote vehicle manipulating system comprising the same
CN113660611B (en) Positioning method and device
CN104246864A (en) Head-mounted display and image display device
CN105187484A (en) Mobile Terminal And Method For Controlling The Same
Fröhlich et al. On the move, wirelessly connected to the world
AU2008236660A1 (en) Method and apparatus for acquiring local position and overlaying information
CN109240507A Head-worn computer as a secondary display with automatic speech recognition and head-tracking input
US20200264694A1 (en) Screen control method and device for virtual reality service
CN103475689A (en) Apparatus and method for providing augmented reality service
CN115357311A (en) Travel information sharing method and device, computer equipment and storage medium
KR101600793B1 (en) Method for providing inforamtion related to navigation in mobile terminal and mobile terminal thereof
CN111093585A (en) Wheelchair, control method and computer readable storage medium
Prandi et al. On augmenting the experience of people with mobility impairments while exploring the city: A case study with wearable devices
KR20170009558A (en) Navigation terminal device for sharing intervehicle black box image
KR20170023491A (en) Camera and virtual reality system comorising thereof
US12018947B2 (en) Method for providing navigation service using mobile terminal, and mobile terminal
US20220163334A1 (en) Method for providing navigation service using mobile terminal, and mobile terminal
EP4130663A1 (en) Positioning apparatus and method
GB2610700A (en) Positioning apparatus and method
KR20220123890A (en) Electronic device and method for providing location information thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant