TW201205121A - Maintaining multiple views on a shared stable virtual space - Google Patents

Maintaining multiple views on a shared stable virtual space Download PDF

Info

Publication number
TW201205121A
TW201205121A TW100103494A TW100103494A TW201205121A TW 201205121 A TW201205121 A TW 201205121A TW 100103494 A TW100103494 A TW 100103494A TW 100103494 A TW100103494 A TW 100103494A TW 201205121 A TW201205121 A TW 201205121A
Authority
TW
Taiwan
Prior art keywords
virtual
portable device
space
view
virtual scene
Prior art date
Application number
TW100103494A
Other languages
Chinese (zh)
Other versions
TWI468734B (en
Inventor
George Weising
Thomas Miller
Original Assignee
Sony Comp Entertainment Us
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US12/947,290 external-priority patent/US8730156B2/en
Application filed by Sony Comp Entertainment Us filed Critical Sony Comp Entertainment Us
Publication of TW201205121A publication Critical patent/TW201205121A/en
Application granted granted Critical
Publication of TWI468734B publication Critical patent/TWI468734B/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525Changing parameters of virtual cameras
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/90Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F13/92Video game devices specially adapted to be hand-held while playing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1626Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L12/1827Network arrangements for conference optimisation or adaptation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/04Real-time or near real-time messaging, e.g. instant messaging [IM]
    • H04L51/046Interoperability with other network applications or services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
    • H04L51/10Multimedia information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
    • H04L51/18Commands or executable codes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/215Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/216Input arrangements for video game devices characterised by their sensors, purposes or types using geographical information, e.g. location of the game device or player using GPS
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/803Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/812Ball games, e.g. soccer or baseball
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1006Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals having additional degrees of freedom
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • A63F2300/1093Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera using visible light
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/20Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • A63F2300/204Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform the platform being a handheld device
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/20Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • A63F2300/205Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform for detecting the geographical location of the game platform
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163Indexing scheme relating to constructional details of the computer
    • G06F2200/1637Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Methods, apparatus, and computer programs for controlling a view of a virtual scene with a portable device are presented. In one method, a signal is received and the portable device is synchronized to make the location of the portable device a reference point in a three-dimensional (3D) space. A virtual scene, which includes virtual reality elements, is generated in the 3D space around the reference point. Further, the method determines the current position in the 3D space of the portable device with respect to the reference point and creates a view of the virtual scene. The view represents the virtual scene as seen from the current position of the portable device and with a viewing angle based on the current position of the portable device. Additionally, the created view is displayed in the portable device, and the view of the virtual scene is changed as the portable device is moved by the user within the 3D space. In another method, multiple players shared the virtual reality and interact among each other view the objects in the virtual reality.

Description

201205121 六、發明說明: 【發明所屬之技術領域】 本發明關係於控制於攜帶式裝置與虛擬場景的視面的 方法、裝置及電腦程式’更明確地說,用以促成在虛擬或 擴增實境中之多玩家互動的方法、裝置及電腦程式。 術 技 前 先 虛擬實境(VR )係爲電腦模擬環境,不論該環境是 爲真實世界的模擬或想像世界,其中使用者可以透過標準 輸入裝置或者特殊多方向輸入裝置的使用,與虛擬環境或 虛擬物體(VA )互動。該模擬環境可以類似於真實世界 ,例如,模擬飛行或戰鬥訓練,或者,其可以與真實有顯 著不同,例如在VR遊戲中。虛擬實境經常被用來描述各 種的應用,通常相關於其沒入式高視覺3D環境。電腦輔 助設計(CAD )軟體、圖形硬體加速、頭戴式顯示器、資 料庫手套、及微小化的發展已經協助使這些普及化。 擴增實境(AR )提供實體上真實世界環境的即時視 面,其元件被混合(或擴增)虛擬電腦產生影像,以建立 一混合實境。擴增在傳統上爲即時並爲具有環境元素的語 境,例如在比賽時電視上之運動分數。藉由先進AR技術 的協助(例如加上電腦視面及物體辨識),有關使用者周 遭真實世界的資訊可以變成互動並作數位使用。 用語「擴增實境(AV)」也用於虛擬實境世界並類 似於AR。擴增實境也稱爲真實世界物體合倂入虛擬世界 -5- 201205121 。在虛擬進行的中間,AV稱爲主要虛擬空間,其中實體 元件,例如實體物體或人物被動態地整合入並可以與虛擬 世界作即時互動。用語VR係被用於此應用中,作爲上位 用語,除了特別指出之外,也包含AR及AV。 VR遊戲典型需要大量的電腦資源。在VR遊戲之手 持裝置的實施法係很稀有及現存遊戲係相對地簡單,具有 基本的VR作用。另外,多玩家AR遊戲允許在虛擬世界 中之玩家的互動,但互動係限制於在虛擬世界中爲玩家所 操縱的物體(例如,車輛、火箭、球等等)。虛擬世界爲 電腦產生並無關於玩家與攜帶式裝置的位置。玩家彼此相 對位置及有關其周遭的相對位置,當建立”實境”虛擬實際 經驗時並未列入考fi。 在本文中,討論本發明之實施例。 【發明內容】 本發明之實施例提供用以控制攜帶式裝置與虛擬場景 的視面的方法、設備及電腦程式。應了解的是,本發明可 以以各種方式實施,例如程序、設備、系統、裝置或電腦 可讀取媒體上之方法。幾個本發明之發明實施例係說明如 下。 .在一方法的實施例中,一信號被接收及該攜帶式裝置 係被同步在3維(3 D )空間的參考點,作出該攜帶式裝 置的位置。包含虛擬實境元件的虛擬的虛擬場景係被產生 在參考點旁的3D空間中。再者,該方法決定該攜帶式裝 -6- 201205121 置相對於參考點在3 D空間中之現行地點並建立虛擬場景 的視面。該視面代表由攜帶式裝置的現行地點及跟據該攜 帶式裝置的現行地點的視角看到之虛擬場景。另外,所建 立視面係被顯示於攜帶式裝置中,及虛擬場景的視面係當 攜帶式裝置在3D空間中爲使用者所移動。在另一方法中 ,多玩家共享該虛擬實境並彼此互動觀看在虛擬實境中之 物體。 在另一實施例中,呈現出一種在裝置間共享一虛擬場 景的方法。該方法包含同步化第一裝置至三維(3D)空 間中之參考點及用以計算相對於第一裝置的位置的第二裝 置的位置。再者,該方法的操作包含在第一裝置與第二裝 置間之資訊的交換,以使第二裝置同步於3D空間中之參 考點。該資訊包含該參考點及該第一與第二裝置的位置》 另外,使用一方法操作以在3 D空間中在參考點附近產生 虛擬場景。虛擬場景係爲兩裝置所共用並當裝置與虛擬場 景互動時在兩裝置中同時改變。由第一裝置的現行地點看 出之虛擬場景所建立之視面具有根據該攜帶式裝置之現行 地點的視角,及所建立視面係顯示在第一裝置上。當攜帶 式裝置移動於3D空間中時,該方法藉由改變該虛擬場景 的顯示視面而繼續。 在另一實施例中,一種方法被執行以控制以第一裝置 的虛擬場景的視面。該方法包含一操作,用以將第一裝置 同步化至在第一三維(3D)空間中之第一參考點。在另 一操作中,在第一裝置與第二裝置間建立通訊鏈結。該第 201205121 二裝置係在第二3D空間中,在第一 3D空 化至在第二3D空間中之第二參考點。再者 係被執行,用以產生共同虛擬場景,其包含 ,其中該共同虛擬場景係可以爲該第一及第 到。第一裝置建立該共同虛擬場景於該第一 第二裝置建立共同虛擬場景於第二參考點旁 能與虛擬實境元件互動。再者,該方法包含 定第一裝置相對於該參考點的第一 3D空間 ,用以建立該共同虛擬場景的視面。該視面 置的現行地點並具有根據該第一裝置的現行 所見的共同虛擬場景。所建立之視面係被顯 中,及當該第一裝置移動於該第一 3D空間 虛擬場景的顯示視面改變。 在另一實施例中,該方法操作控制具有 的虛擬場景的視面。在一操作中.,攜帶式裝 三維(3 D )空間中之參考點,其中定位有 。攜帶式裝置包含面向該攜帶式裝置前面的 向該攜帶式裝置的背面之背面攝影機。再者 執行以3 D空間在參考點旁產生虛擬場景。 含虛擬實境元件。在攜帶式裝置之3D空間 係被相對.於參考點決定。在另一方法操作中 場景的視面。該視面捕捉由在玩家的該3 D 眼睛位置所看到之虛擬場景的代表圖,該玩 式裝置,其對應於玩家將透過視窗所見到之 間外,並同步 ,一方法操作 虛擬實境元件 二裝置所觀察 參考點旁,及 。該兩裝置均 操作,用以決 中之現行地點 表示由第一裝 地點的一視角 示於第一裝置 內時,該共同 可攜帶式裝置 置係被同步至 該攜帶式裝置 前攝影機及面 ,一操作係被 該虛擬場景包 中之現行地點 ,建立該虛擬 空間中之現行 家持有該攜帶 虛擬場景。在 201205121 3 D空間中之視窗位置係等於在攜帶式裝置中之顯示器的 3D空間中之位置。該方法同時也包含用以顯示所建立的 視面在該顯示器上,及用以改變虛擬場景的顯示視面爲攜 帶式裝置或在3D空間中玩家移動的操作。 在再一實施例中,攜帶式裝置係被用以與擴增實境相 互動作。攜帶式裝置包含地點模組、虛擬實境產生器、視 面產生器、及顯示器。地點模組係用以決定在攜帶式裝置 所在之3D空間中之攜帶式裝置的位置,其中攜帶式裝置 的位置係被設定爲在攜帶式裝置接收一信號以同步化時的 3D空間中之參考點。虛擬實境產生器在該3D空間中的參 考點旁建立虛擬場景。虛擬場景包含虛擬實境元件。再者 ,該視面產生器建立該虛擬場景的視面,該視面代表由攜 帶式裝置並根據該攜帶式裝置的位置之視角所看到之虛擬 場景。另外,顯示器係被用以顯示虛擬場景的視面。當該 攜帶式裝置移動於3 D空間中,示於顯示器中之視面改變 〇 在另一實施例中,內藏於電腦可讀取儲存媒體中之電 腦程式’當爲一或更多處理器所執行時,電腦程式係被用 以實施本發明之方法。 本發明之其他態樣將由以下之詳細說明配合上附圖而 了解,附圖係例示本發明之原理。 【實施方式】 以下實施例描述用以控制在虛擬或擴增實境中之虛擬 -9- 201205121 場景的視面的方法、裝置及電腦程式。然而,爲熟習於本 技藝者所了解,本發明可以在沒有部份或所有這些特定細 節下實施。在其他例子中,我們知道已知程序未被詳細描 述以防止不必要地阻礙本發明。 圖1描繪依據一實施例之在使用者在將攜帶式裝置同 步化至空間中之一參考點之前。攜帶式裝置1 04係放在桌 上,準備將攜帶式裝置同步化至參考點的情形。使用者 1 02已經將攜帶式裝置放置於作爲參考點之一點或錨定住 以在該點旁建立虛擬實境。如圖1所示,攜帶式裝置係放 置於桌子的大約中心,及一旦攜帶式裝置被同步化,則在 桌子的中心附近建立虛擬世界。攜帶式裝置可以以各種方 式加以同步化,例如,壓下在攜帶式裝置1 04上之按鈕、 觸碰在攜帶式裝置中之觸控螢幕、使得該裝置保持不動一 時間段(例如,5秒),進入語音命令等等。 一旦攜帶式裝置接收予以同步化的輸入時,在攜帶式 裝置中之位置追蹤模組被重設。攜帶式裝置可以包含各種 地點追蹤模組,如下參考圖2 1加以討論者,例如加速度 計、磁力計、全球定位系統(GPS )裝置、攝影機、深度 攝影機、羅盤、陀螺儀等等。 攜帶式裝置可以爲很多類型之一,例如手持攜帶遊戲 裝置、行動電話、平板電腦、筆記型電腦、小筆電、個人 數位助理(P D A )等等。本發明之實施例將參考攜帶式遊 戲裝置加以描述,但其原理可以應用至具有一顯示器的任 一攜帶型電子裝置上。本發明之原理也可以應用至連接至 -10- 201205121 具有顯示器的計算裝置的遊戲控制器或其他輸入裝置。 圖2顯示以攜帶式裝置觀察之虛擬實境場景。在相對 於參考點1 〇 6同步化裝置1 〇 4後’攜帶式裝置將開始顯示 虛擬實境1 〇 8的視面。在顯示器中之視面係藉由模擬於攜 帶式裝置的背面之攝影機在3D空間內移動於參考點1〇6 附近加以建立。圖2描繪包含棋盤的虛擬實境。攜帶式裝 置1 04能檢測動作並決定當裝置移動時,其相對於參考點 1 0 6的相對地點。位置及地點的決定可以以不同方法不同 準確程度加以完成。例如,位置可以藉由分析以攝影機捕 捉之影像、由慣性系統、GPS、超音波三角定位法、WiFi 通訊、位置推算法、等等或其組合所取得之資料加以檢測 〇 在一實施例中,裝置追蹤攜帶式裝置相對於參考點 1 06在空間中之位置及在攜帶式裝置之空間中之地點。地 點係用以決定攝影機的視角,即,攜帶式裝置作爲拍入虛 擬場景的攝影機。如果攜帶式裝置指向右,則視面將轉向 右等等。換句話說,視角係被決定爲在顯示器的中心(或 裝置的其他部份)的原點之向量,及具有垂直進出該顯示 器之一方向。在另一實施例中,只有在空間中之地點被追 蹤,在顯示器中之視面係被計算,如同攝影機針對空間中 攜帶式裝置所在及朝向參考點的位置。 在部份現存實施法中,擴增實境(AR)標籤係被放 置在桌上,並利用作爲受託標示碼,以用以產生擴增實境 。AR標籤可以爲當出現在真實環境中之捕捉影像串流時 -11 - 201205121 被認出的物體或圖。AR標籤作爲受託標示碼,其促成在 真實環境內的位置之決定。本發明之實施例免除了 AR標 籤的必要,因爲同步化至3 D空間及攜帶式裝置位置的追 蹤。另外,位置資訊允許在攜帶式裝置中之遊戲輸送真實 性3D虛擬經驗。另外,一陣列之網路攜帶式裝置可以用 以建立共享虛擬世界,如下之參考圖4所述。 圖3例示依據一實施例之具有虛擬棋盤與混合玩家的 手部之擴增實境棋赛。3 D空間的影像係被用以藉由相對 於校正點組合真實與虛擬元件而建立擴增實境,並提供光 學動作捕捉狀功能。以校正多攝影機技術,有可能決定手 部或手臂的地點,以使得玩家”進入”擴增實境場景並與遊 戲物體(棋子)互動。 在一實施例中,在單一裝置的背面使用兩攝影機,以 決定物體進入3D空間的位置。也可以使用深度攝影機以 取得三維資訊。在其他實施例中,如以下參考圖4所討論 ,來自多裝置的攝影機係被用以決定手部306的位置。於 手持攜帶物302放在一手上的同時,玩家透過螢幕3 04看 到並進行遊戲空間,該空間被產生以使玩家觸及3D遊戲 物體及環境。比賽遊戲係爲完全地觸覺。多個玩家可能同 時進入一遊戲區域並以智慧方式與遊戲物體互動。例如, 玩家的手部3 06可以藉由交界、握持、推動、拉動、抓持 、移動、擊破、擠壓、敲打、擲敲、打鬥、開、合、導通 或關斷、按鈕、點火、吃(棋子)等等與虛擬物體互動。 同步至遊戲空間之各個攜帶式裝置加入另一可能攝影 -12- 201205121 機’相對運動的追蹤及音源資料,使得這可能由多數透視 圖看到玩家的手部及手指,以建立有效3D攝影機爲主之 動作捕捉欄。手部與虛擬空間被混合在一起,而在虛擬空 間中之虛擬兀件出現在顯示視面中,如同爲3 D空間的一 部份。當該攜帶式裝置移動於3D空間中時,虛擬元件之 視面以與真實元件視面改變的方式,由幾何透視圖改變。 圖4描繪依據一實施例之多玩家虛擬實境遊戲。當校 正地點與影像分析資料被組合以高速連接性,則地點及遊 戲資訊可以在選擇參與共享空間遊戲經驗的各個裝置間交 換。這允許各個玩家系統取用攝影機視面及來自所有其他 玩家的地點資訊’以同步化其校正地點並共享一虛擬空間 ,一起也稱爲共享空間。 在玩家402A-402 C已經參考共同3D空間(例如桌上 的一點)中之一點同步化或校正其攜帶式裝置後,建立共 同虛場景404。各個玩家具有虛擬場景404的視面,如同 在此例中之打軍事戰爭遊戲之虛擬場景係真的在玩家的前 面的桌上。攜帶式裝置作爲攝影機,如同玩家移動於裝置 旁’視面改變。結果,在各個顯示器上之實際視面係無關 於在其他顯示器中之視面,及該視面係只根據該攜帶式裝 置相對於虛擬場景之相對位置,該虛擬場景係錨定至3D 空間上之實際實體位置。 藉由利用多數攝影機、加速度計及其他機械裝置以決 定地點,配合在攜帶式裝置間之高速通訊,有可能建立 3D動作捕捉狀經驗,允許玩家以可相信的方式,觀看及 -13- 201205121 可能碰觸虛擬遊戲角色及環境。 共享空間404遊戲利用裝置之高速連接性,以交換參 與共享空間遊戲經驗之裝置之資訊。藉由將該裝置轉至爲 穩定“魔術窗”,以持續於各個裝置間之空間中,共享空間 4 04遊戲空間係透過該裝置觀看。藉由使用動作追蹤、影 像分析、及在各個裝置間之資訊的高持久性的組合,即使 當該等裝置略微移動,顯示區域仍以穩定地點出現。 圖5例示多玩家環境之校正方法的一實施例。如前所 述,由裝置感應器(加速度計、GPS、羅盤、深度攝影機 等等)取得之地點資訊係被傳送至其他鏈結裝置,以加強 在虛擬空間中之合作維持之資料。在建立同步於共同參考 點5 02的共同共享空間的實施例中,第一玩家504A相對 於參考點502同步化其裝置至3D空間。在共享空間中之 其他玩家建立與第一玩家的通訊鏈結,以交換地點及遊戲 資訊。相對位置可以以不同方式取得,例如使用WiFi三 角定位及上網測試以決定相對位置。另外,視覺資訊可以 用以決定其他位置,例如檢測其他玩家的臉,及由他們的 臉得知遊戲裝置的可能位置。 在一實施例中,藉由超音波通訊及方向性麥克風,音 訊三角定位係用以決定相對位置。多數頻率也可以用以執 行音訊三角定位。一旦裝眞已經交換地點資訊,則例如超 音波、WiFi、或藍芽的無線通訊被用以同步化其他的裝置 至參考點502。在所有裝置被校正後,該等裝置已經認可 該參考點502及其相對於參考點502的相對位置。應了解 -14- 201205121 的是,其他方法也可以用以校正多數裝置至共享參考點。 例如’所有裝置可以藉由將裝置依序放在參考點上而校正 至相同參考點》 虛擬場景可以藉由使用在室中之發光源所決定的陰影 及光而更真實化。藉由使用攝影機回饋,遊戲環境及角色 已經爲真實世界所影響的場景光與陰影。當手伸入虛擬世 界以與虛擬物體互動時,這表示玩家手將投影一陰影在虛 擬角色或物體上。遊戲世界陰影及光係藉由真實世界的陰 影與光所調整,以取得可能最佳效果。 圖6例示依據一實施例如何在網路連接上玩一互動遊 戲。很多類型遊戲可以用在共享空間中。例如,該攜帶式 裝置可以使用成爲一球拍,以打桌球遊戲。該裝置略微移 動,如同它像一球拍以擊中該球。玩家們看球浮動於螢幕 與對手螢幕之間。在戰爭遊戲中,玩家透過攜帶式裝置加 以觀看並將石彈投射在對手的城堡。玩家將該裝置向後拉 載入石彈,然後,按下按鈕以將石彈投向敵人的城堡。 共享空間也可以當玩家在不同位置時建立,如圖6所 示。玩家已經建立網路連接以玩該遊戲。各個玩家同步化 其裝置至玩家空間中之參考點,及建立例如桌球桌的虛擬 實境。對手係被顯示爲在球桌的後側,其中對手裝置的移 動係配合對手球拍的動作。該遊戲也可以加入一化身以握 持球拍,以作爲更真實遊戲經驗。在打球時,各個裝置追 蹤裝置在空間中之動作與位置。此資訊係爲其他裝置所分 享,以促使其他裝置將虛擬球拍匹配於裝置的動作。也可 -15- 201205121 以分享其他遊戲資訊,例如,球的位置與移動。 圖7顯示互動遊戲,其係無關於該攜帶式裝置的位置 者。圖7所示之遊戲顯示玩遊戲的限制並未相對於參考點 7 06同步。多玩家空氣曲棍球係被同時遊玩於兩分開裝置 704C及704A上》遊戲包含曲棍球場708、圓碟714、及 球棍710及712。各個玩家藉由移動在顯示器上之手指而 控制球棍。顯示器顯示圓碟及球棍的位置。然而,當攜帶 式裝置略微移動,因爲沒有地域同步化一參考點,所以, 在顯示器上之視面並未改變。例如,當玩家702A移動至 位置702B時,視面係相同,無關於該裝置位於何處。 爲了玩遊戲,攜帶式裝置只交換有關於圓碟的移動與 球棍位置的資訊。沒有虛擬經驗綁在3 D空間中。 圖8顯示依據一實施例之互動遊戲,其中在顯示器中 之視面係取決於攜帶式裝置的位置。裝置802A及802B 已經被校正至共同空間,曲棍球場已經被建立爲虛擬元件 。該裝置作用爲進入該空間之攝影機,及該裝置並不必然 需要顯示整個遊戲面。例如,當該裝置被拉離開參考點時 ’則發生縮小顯示及可取得較大球場。再者,如果裝置斜 向上,則視面顯示球場的頂部,及如果裝置向下斜,則在 裝置中之視面愈接近玩家的本身目標。如圖8所示,在顯 示器中之視面係彼此無關並係懌據由各個攜帶式裝置的遊 戲面的現行視面。 圖9顯示依據一實施例,攜帶式裝置如何移動具有在 將攝影機移動於虛擬空間時的類似作用。圖9顯示在虛擬 -16- 201205121 空間中之車輛902。假想攜帶式裝置係由球體中的一點對 準至車輛902,則當攜帶式裝置移動於球體內時,可取得 車的視面多數。例如,來自“北極”的視面將顯示車輛的車 頂,及來自“南極”的視面將顯示車輛的底部。同時,示於 圖9爲車輛的側、前及後視面。 在一實施例中,玩家可以輸入一命令,以改變或翻動 虛擬世界的視面。例如,爲車輛時,玩家由車子的前面看 到車子的背面,如同場景己經旋轉1 8 0度左右及透過參考 點垂直進行之軸。以此方式,玩家並不必在室內移動以取 得不同的視角。其他輸入也可以產生不同作用,例如90 度旋轉,視圖的縮放(以使得虛擬世界似乎更小或更大) ,相對於x、y或z軸等旋轉。在另一實施例中,攜帶式 裝置的翻轉,即在玩家的手上旋轉180度,將造成虛擬世 界的視面上下顛倒翻轉。 圖10顯示依據一實施例當旋轉攜帶式裝置時,示於 顯示器中之影像的變化的二維化表圖。攜帶式裝置1 5 2係 對準向一壁面,具有視角a,造成在壁面上投影160»因 此,在攜帶式裝置152上之視面將對應投影160。當裝置 152旋轉沒角時,攜帶式裝置結束於地點154。在維持攝 影機視角α時,視面同時也旋轉角沒。結果,在攜帶式裝 置上之視面對應於投影1 62。應注意的是,在螢幕上之視 面係無關於眼睛位置,例如地點1 5 8及1 5 6,及視面係無 關於玩家在何處。另外,在顯示器上之影像係取決於作動 爲虛擬攝影機之攜帶式裝置的地點。以下所述之其他實施 -17- 201205121 例包含在顯示器上之視面,其依據眼睛的位 圖11顯示依據一實施例之遊玩VR遊 置。圖11至12F例示賽車遊戲,其中,植 被使用作爲攝影機或控制車輛的駕駛。攜帶 車的視面’其中賽車道係被顯示在其他對中 及人們坐在車道的側邊上的台子上。 圖12A至12F例示依據一實施例,攜 點如何影響在顯示器中之視面。在此順序中 係被使用作爲攝影機,而不是用以開車。E 家固持住該攜帶式裝置以玩賽車比赛。該裝 玩家的前面,相隔大約手臂的長度。當玩家 示之地點時,遊戲的視面係爲圖1 2B所示者 的視面顯示如由車輛駕駛員所見的賽車。駕 前方車道及車輛內裝的一部份,包含方向盤 圖12C顯示當玩家仍保持攜帶式裝置 45度時。結果,攜帶式裝置與玩家一起在 玩家的移動結果係如圖1 2 D所示,其中賽 轉向約45度。可以看出,攜帶式裝置作動 果攝影機在3 D世界中改變地點,則在顯示 變〇 圖12E顯示玩家再左轉45度。結果, 頭及視角已經相對於原始地點改變約9 0度 之結果係被描繪於圖1 2F中,其中遊戲的駕 —側視圖,其中,遊戲的駕駛員已有一側視 置改變。 戲的攜帶式裝 帶式裝置可以 式裝置顯示賽 車輛的中央, 帶式裝置的地 ,攜帶式裝置 丨12A顯示玩 置係被保持於 在如圖12所 ,其中顯示器 駛員可以看到 〇 在其前方左轉 空間中移動。 車道的視面也 爲攝影機及如 器上之視面改 攜帶式裝匱的 。在顯示器上 駛員已經具有 面,其包含另 -18- 201205121 一賽車與台子。 圖1 3 A-1 3 B例示依據一實施例之在遠端位置之使用 者間玩擴增實境遊戲。圖13A顯示具有攝影機1 302面向 握持住該攜帶式裝置的玩家的攜帶式裝置。面向玩家攝影 機有很多用途,例如通訊、視見平截頭體應用(見圖15-19B)、在遊戲中加入玩家的臉等等。 圖13B顯示產生近似真實效果的擴增實境遊戲的實施 例。玩家1308係在遠端位置並透過網路連接交換遊戲及 環境資訊。在遠端位置中之攝影機拍攝該玩家及其周圍, 例如背景1 3 1 0。該影像被送至對手的裝置,其中影像被 混合以一虛擬棋盤1 3 06。同樣地,攝影機1 3 04拍攝握持 該裝置的玩家的照片並將影像送給遠端玩家。此方式該等 玩家可以共享一空間。 各個玩家把視面視爲擴增實境,當視面經過另一玩家 之場景時,擴增實境淡入爲虛擬實境霧。各個玩家的所有 動作仍相對於兩裝置的同步校正地點作追蹤。該遊戲將虛 擬棋盤插入提供3 D經驗的桌子的頂部。如前所述,攜帶 式裝置可以略微移動以改變視角並由不同透視點看棋盤, 例如由上方、側面、對手方向等看棋盤。 在一實施例中’藉由定期地更新對手的臉及背景,而 不使用現場方式’可以降低所需之通訊與處理頻寬。另外 ’也可能只送遠端影像的一部份,例如玩家的影像,因爲 背景可以爲靜態並且較沒有關係。例如,遠端玩家的臉可 以每五秒更新一次、每次玩家改變姿勢時更新、當玩家交 -19- 201205121 談時更新等。 在另一實施例中,聲音可以在玩家間交換,以使得 3D經驗更真實。在另一實施例中,玩家具有改變視面的 選項,例如,在混合3 D影像與只顯示棋盤間作切換,以 改良棋盤的視圖。在另一實施例中,影像穩定化可以用以 穩定由於玩家手抖動所造成之小影像變動。在一實施例中 ,握持該裝置的玩家的臉可以加至顯示器中,以顯示該使 用者給對手看到的樣子。 圖14A-14H描繪依據一實施例當攜帶式裝置改變地 點時顯示器中之變化。在圖14A-14H的順序中,攜帶式 裝置正使用視見平截頭體效應,以決定擴增實境世界係如 何呈現給使用者。 在現行3 D電腦圖形中,視見平截頭體係爲可以出現 在螢幕上之模型化世界中之空間的區域。視見平截頭體係 爲標準攝影機的視野。此區域的準確形狀取決於模擬何種 攝影鏡頭而改變,但典型爲平截頭體的矩形錐體(如同所 命名)。垂直於視見方向切割截頭體平面被稱爲近面及遠 面。在一實施例中,近面對應於在攜帶式裝置中之顯示面 。較近面更接近攝影機或較遠面更遠離攝影機的物體並未 繪出。 在一實施例中,視見平截頭體係被錨定(錐體的頂部 )於握持攜帶式裝置的玩家的(雙眼間)眼中。顯示器作 動爲進入虛擬實境的視窗。因此,“視窗”愈接近眼部,則 虛擬實境的顯示面積愈大。相反地,“視窗”愈遠離眼部, -20- 201205121 則虛擬實境的視面愈小(更詳細)。該作用係類似於更接 近矩形舊式窺孔,而沒有扭曲光學。眼睛愈接近窺孔,則 可以看到愈多外部。 圖14A顯示在室內玩家握持擴增實境攜帶式裝置的 情形。在裝置被同步於室內時,虛擬實境產生器已經將虛 擬三角形“畫”在面向玩家的壁上,及正方形“畫,,在壁面上 在玩家的左方。在圖14A中,玩家正握持該裝置略低於 眼位準’手臂幾乎完全伸展。示於顯示器中之視面係呈現 在圖14B中,其中顯示三角形的一部份在玩家之前。 在圖1 4C中,玩家係在相同地點並彎曲手肘以使攜帶 式裝置更接近臉。由於如上討論之視見平截頭體作用,玩 家看到更大部份的壁面。圖14D顯示圖14C的裝置中所 顯示的視面。因爲視見平截頭體作用,相較於圖1 4 B的前 一視面可看到更大壁面之截面。完整三角形現顯示在顯示 器中。 圖14E顯示玩家將裝置向下移,以看相對壁面的底部 ,如圖14F所示。三角形的底部被顯示在顯示器中。在圖 1 4G中,玩家向左轉並使用“視窗”進入擴增世界,以看室 內的角落,如圖1 4H所示。 圖1 5顯示使用前及後攝影機,在攜帶式裝置上實施 視見平截頭體的實施例。圖1 5顯示視見平截頭體的2D 投影,並且因爲其爲2D投影,所以看到之視見平截頭錐 體爲三角形。攜帶式裝置1 506分別包含前及後面向攝影 機1514及1512。攝影機1512被用以捕捉玩家所在之空 -21 - 201205121 間的影像。攝影機1 5 1 4係用以捕捉握持裝置1 5 06的玩家 之影像。臉部辨識軟體允許該等裝置軟體決定玩家眼部的 位置,以模擬視見平截頭體作用。 在一實施例中,視見平截頭體具有頂點,該矩形截頭 錐體的邊緣由眼部延伸穿過在手持裝置中之顯示器的角落 。當眼部在地點1 5 02時,玩家“看到”面向該裝置的壁面 的區域1 5 1 0。由眼部開始的直線,並接觸顯示器的角落 與壁面相交叉,以界定區域1 5 1 0。當眼部移動至地點 1 5 04時,結果起源於眼部的直線改變。新直線定義區域 1508 »總結,如果攜帶式裝置1 506被保持固定,則當眼 部位置的改變將造成顯示器所顯示之內容改變。當然,如 果攜帶式裝置移動,則視面將也因爲當錐體的邊緣相交於 顯示器的角落時,視見平截頭體改變,則視面也將改變。 應了解的是,示於圖1 5中之實施例爲視見平截頭體 的例示實施法。其他實施例也可以利用不同形狀於視見平 截頭體上,並可以縮放視見平截頭體作用,或可以增加邊 界於視見平截頭體。示於圖15的實施例因此將不被解釋 爲排除或限制性,而是例示或示範性。 圖16Α-16Β例示依據一實施例之當玩家移動時改變 視見平截頭體的作用。圖16Α包含在攜帶式裝置中之顯 示器1 606,其中顯示器的表面係平行於壁面的表面。當 玩家以視見平截頭體作用透過顯示器觀看時,建立有矩形 截頭錐體,其具有頂部在玩家(例如在雙眼之間)的臉中 ’具有基於壁面及由邊緣延伸至眼部並接觸顯示器16 06 -22- 201205121 的角落。 當玩家在地點1 6 0 2,則視見平截頭體建立一 部1610,其係爲該玩家在顯示器1606上所見。當 動至地點1 604來移動顯示器時,視見平截頭體結 。用於截頭的新基部爲矩形1 608,其係在顯示器 所見。結果爲玩家地點的改變造成在虛擬實境的視 改變。 圖16B例示當使用視見平截頭體作用時,於臉 移近該顯示器時所建立的變焦作用。當玩家於地I 時,玩家看到矩形1 63 8,如前所述。如果玩家遠 器1 636至地點1 632而不移動顯示器,則看到對應 1 640的新顯示。因此,當玩家遠離,則虛擬世界 面積將因爲在顯示器中之觀看面積變小而收縮及在 區域中之物體變大,而造成放大作用。當玩家移動 顯示器1 63 6的相反動作將造成相反之縮小作用。 圖1 7例示依據一實施例如何使用虛擬攝影機 虛擬場景的視面。虛擬或擴增實境並不必要被侷限 所在之室內限制內,如同在賽車遊戲之圖11之前 超出玩家的實體邊界的虛擬世界也可以被模擬。圖 示一玩家觀看虛擬演唱會。實際舞台係於室內的壁 以模擬離開攜帶式裝置幾百呎,在此例中作爲虛擬 。視見平截頭體也可以以相同方式模擬。 如同在底下所看到,不同攝影機地點及視角將 器上造成不同視面。例如,第一地點係對焦在伴唱 矩形基 玩家移 果改變 1 606 中 ,面中之 移開或 Ιέ 16 3 2 離顯示 於矩形 的觀察 此觀看 更靠近 以隔開 在玩家 所見。 17例 外並可 攝影機 在顯示 者,第 -23- 201205121 二位置在主唱,及第三位置在觀眾。虛擬攝影機也可以加 入變焦輸入’以如同真實攝影機般地放大及縮小。 在一實施例中’縮放係被用以透過虛擬實境導引。例 如’如果玩家移前一呎,則攜帶式裝置將如同玩家已向前 1 〇呎的虛擬視面。以此,玩家可以導引虛擬世界,其係 較玩家所在之室內更大》 在另一實施例中’玩家可以輸入命令以使得攝影機移 動於虛擬實境內’而不實際移動攜帶式裝置。因爲攜帶式 裝置係相對於一參考點同步,所以攝影機的此移動不必爲 玩家所移動,具有將參考點改變至新位置的作用。此新參 考點可以被稱爲虛擬參考點,並且,不必位於玩家所在之 實際實體空間內。例如,在示於圖17之場景內,玩家可 以使用“前移”命令’以移動攝影機後台。一旦玩家“在”後 台’則玩家可以開始移動攜帶式裝置以探索視面後台,如 先前所述。 圖1 8A-1 8H顯示依據一實施例以例示視見平截頭體 作用的順序視面。圖1 8 A顯示正握持攜帶式裝置的玩家 。在顯示器上之視面係對應於示於圖18B所示之森林的影 像。在圖1 8 C中,玩家向右移動他的頭,同時,保持攜帶 式裝置大約與圖18A相同的地點。圖18D對應於在圖 1 8C中之玩家的視面,並顯示出由於視見平截頭體作用之 森林的改變透視圖。 在圖1 8 E中,玩家繼續使他的頭向右,同時移動攜帶 式裝置向左’以強調視見平截頭體作用,因爲玩家想要知 -24- 201205121 道是否在樹後有東西。圖18F顯示對應於圖18E 之顯示器。森林的透視圖再次改變。在樹本之一 —精靈,如圖1 8B所藏,但精靈的一部份可以當 森林的視角時由圖18F看到。圖18G顯不玩家 頭向右並移動攜帶式裝置進一步遠離左邊。如圈 看到,該作用係爲玩家可以看到樹後有什麼,及 整個精靈。 圖19A-19B例示將視見平截頭體作用與攝 組合的實施例。組合視見平截頭體與攝影機作用 爲用於建立虛擬視面的行爲爲不同時不可能。然 規則以定義何時使用一作用而何時使用另一作用 合爲可能。在一實施例中,攝影機作用係當玩家 式裝置時被使用,及視見平截頭體係當玩家相對 裝置移動頭時使用。當兩事件同時發生時,一作 ,例如視見平截頭體。 此組合表示給定眼睛與攜帶式裝置的一地點 眼睛及攝影機如何到達該地點,其中有可能有不 例如,眼睛1 902看穿裝置1 906時,不同視面的 係被顯示於圖1 9A及1 9B,如以下所討論。 參考圖19A,眼睛1 902係原來看穿過裝置 用視見平截頭體作用,該裝置“對準”直入該虛擬 造成在視見平截頭體錐體的頂部開始的α角,並 機角/3。使用參考圖10及15所前述之相同2D 玩家在其第一地點,在該壁面上看見區段1908 中之玩家 後面藏有 玩家改變 進一步低 | 1 8 Η 所 可以看到 影機作用 可以被認 而,當有 時,該組 移動攜帶 於攜帶式 用被選擇 ,取決於 同視面。 虛擬實境 1904° 使 實境。這 造成攝影 表示法, 。玩家然 -25- 201205121 後轉動該裝置7角度,以使裝置在地點1 906。因爲玩家 已經移動該裝置,所以,攜帶式裝置回應於有關攝影機作 用的移動,使得虛擬攝影機也轉r角。該結果爲顯示器現 顯示壁面的區域1910。 圖】9B顯示一玩家,於看穿攜帶式裝置1906的啓始 眼睛地點1 9 1 2。視見平截頭體係被使用及結果爲在顯示 區域1 9 1 8上之外表。玩家然後移動至眼睛地點1 902,而 不移動攜帶式裝置。因爲該裝置未移動,所以發生視見平 截頭體作用及玩家然後在顯示器上看到區域1916。應注 意的是,雖然眼睛1 902及顯示器1 906在圖19A及19B 爲相同地點,但實際視面因爲事件的順序使得眼睛及顯示 器在該地點。 圖20顯示一演算法的流程圖2000,用以依據本發明 一實施例以控制具有攜帶式裝置的虛擬場景的視面。在操 作2 002中,信號被接收,以同步化攜帶式裝置,例如, 按鈕按壓或螢幕接觸。在操作2 004中,方法同步化該攜 帶式裝置,以使得攜帶式裝置所在之位置係位於三維( 3D )空間中之參考點上。在一實施例中,3D空間係爲玩 家所在之室內。在另一實施例中,虛擬實境包含室內及延 伸超出該室壁面之虛擬空間。 在操作2006中,一虛擬場景係產生於3D空間內參 考點附近。虛擬場景包含虛擬實境元件,例如,圖2的棋 盤》在操作2008中,攜帶式裝置決定該攜帶式裝置相對 於參考點在該3D空間中之現行地點。虛擬場景的視面係 -26- 201205121 在操作2010中建立。該視面代表由該攜帶式裝置的 地點,及根據該攜帶式裝置的現行地點的視角看到之 場景。再者,在操作201 2期間,所建立之視面係被 在攜帶式裝置的顯示器上。在操作2014中,攜帶式 檢查是否該攜帶式裝置已經爲使用者所移動,即現行 已經改變。如果攜帶式裝置被移動,則該方法流程回 作2008,以再循環現行地點。如果攜帶式裝置未被 ,則攜帶式裝置持續藉由流動至操作20 1 2而顯示先 立視面。 圖2 1例示可以用以實施本發明實施例之裝置的 。攜帶式裝置係爲一計算裝置並包含出現在計算裝置 典型模組,例如處理器、記憶體(RAM、ROM等等 電池或其他電源、及永久儲存器(例如硬碟)。通訊 允許攜帶式裝置與其他攜帶式裝置、其他電腦、伺服 等交換資訊。通訊模組包含通用串列匯流排(USB ) 器、通訊鏈結(例如乙太)、超音波通訊、藍芽 WiFi。 輸入模組包含輸入按鈕及感應器、麥克風、觸控 、攝影機(向前、向後、深度攝影機)、及讀卡機。 輸入/輸出裝置,例如鍵盤或滑鼠也可以經由通訊鏈 例如USB或藍芽被連接至攜帶式裝置。輸出模組包 顯示器(具有觸控螢幕)、發光二極體(LED)、振 覺回授、及揚聲器。其他輸出裝置也可以經由通訊模 接至攜帶式裝置。 現行 虛擬 顯示 裝置 地點 到操 移動 前建 架構 中之 )、 模組 器等 連接 、及 螢幕 其他 結, 含一 動觸 組連 -27- 201205121 來自不同裝置的資訊可以爲地點模組所使用以 攜帶式裝置的地點。這些模組包含磁力計、加速度 螺儀、GPS、及羅盤。另外,地點模組可以分析以 及麥克風所捕捉之聲音或影像資料,以計算該地點 ,地點模組可以執行測試,以決定該攜帶式裝置的 其附近之其他裝置的地點,例如WiFi網路測試或 測試。 虛擬實境產生器建立該虛擬或擴增實境,如同 述,使用爲地點模組所計算的地點。一視面產生器 虛擬實境及該地點,產生顯示在顯示器上之視面。 生器也可以產生爲虛擬實境產生器所啓始的聲音, 用至多喇叭系統的方向作用》 應了解的是,示於圖21中之實施例係爲攜帶 的例示實施例。其他實施例也可以利用不同模組、 組或指定相關工作至不同模組。顯示於圖21的實 此應不被解釋爲排除或限制,而是例示及顯示用。 圖22爲依據本發明之一實施例之透過網際網 伺服器的與遊戲客戶1102互動的場景A至場景E 使用者A至使用者E的例示顯示圖。遊戲客戶係 置’其允許使用者經由網際網路連接至伺服器應用 。遊戲客戶允許使用者接取及播放線上娛樂內容, 並不限於遊戲、電影 '音樂及圖片。另外,遊戲客 以提供對線上通訊應用程式的接取,例如VOIP、 談協定及電子郵件。 計算出 計、陀 攝影機 。再者 地點或 超音波 先前所 根據該 視面產 使用應 式裝置 模組次 施例因 路連至 相關於 爲一裝 及處理 例如但 戶也可 文字交 -28 - 201205121 使用者透過控制器與遊戲客戶互動。在一些實施 ,控制器爲特定遊戲客戶控制器,而在其他實施例中 制器可以爲鍵盤及滑鼠的組合。在一實施例中,遊戲 爲單獨裝置,其可以輸出音訊及視訊信號,以透過監 /電視及相關音訊設備建立多媒體環境。例如,遊戲 可以但並不限於薄客戶、內部PCI-express卡、外部 express卡、ExpressCard裝置、內部、外部、或無線 裝置、Firewire裝置等等。在其他實施例中,遊戲客 整合以電視或其他多媒體裝置,例如DVR、藍光播 、DVD播放器或多頻接收器。 在圖22的場景A中,使用者A使用與遊戲 1102A成對的控制器100與顯示在監視器1〇6上的客 用程式互動。同樣地,在場景B內,使用者B使用 戲客戶1102B成對的控制器100與顯示在監視器106 另一客戶應用程式互動。場景C例示由使用者C後 到顯示一遊戲的監視器及遊戲客戶1 1 02C之夥伴名單 面。雖然圖22顯示單一伺服器處理模組,但在一實 中,在整個世界上有多數伺服器處理模組。各個伺服 理模組包含用以使用者交談控制、共享/通訊邏輯、 者地面位置、及負載平衡處理服務的次模組。再者, 器處理模組包含網路處理及分散式儲存器。 當遊戲客戶11 〇2連接伺服器處理模組時,使用 談控制可以被用以鑑別該使用者。已鑑別使用者可以 相關虛擬化分散式儲存器及虛擬化網路處理。可以儲 例中 ,控 客戶 視器 客戶 PCI- USB 戶被 放器 客戶 戶應 與遊 上的 面看 的視 施例 器處 使用 伺服 者交 具有 存作 •29- 201205121 爲使用者虛擬分散式儲存器的一部份之例示項目包含購買 媒體,例如但並不限於遊戲、影片及音樂等。另外’分散 式儲存器可以用以儲存多數遊戲的遊戲狀態、個別遊戲的 客製設定、及遊戲客戶的一般設定。在一實施例中,伺服 器處理的使用者地面位置模組係被使用以決定使用者的地 理位置及其個別遊戲客戶。使用者的地理位置可以爲共享 /通訊邏輯及負載平衡處理服務所使用,以根據多數伺服 器處理模組的地理位置及處理需求,而最佳化效能。網路 處理及網路儲存之一或兩者之虛擬化將允許來自遊戲客戶 的處理工作被動態移動至欠利用伺服器處理模組。因此, 負載平衡可以用以最小化有關於來自儲存器的呼叫及在伺 服器處理模組與遊戲客戶間之資料傳輸的潛伏期。 伺服器處理模組具有伺服器應用程式A及伺服器應 用程式B的例子。伺服器處理模組係能支援多數伺服器應 用程式,如伺服器應用程式X,及伺服器應用程式X2所指 示。在一實施例中,伺服器處理係根據群集計算架構,其 允許在一群集內之多數處理器處理伺服器應用程式。在另 一實施例中,不同類型之多電腦處理方案係應用以處理該 伺服器應用程式。這允許伺服器處理被縮放,以容許大量 的遊戲客戶執行多數客戶應用程式及對應伺服器應用程式 。或者,伺服器處理可以縮放,以容許爲更多需求圖形處 理或遊戲、影像壓縮或應用程式複雜度所必要的增加之計 算需求》在一實施例中’伺服器處理模組經由伺服器應用 程式執行主要處理。這允許相對昂貴的元件,例如圖形處 -30- 201205121 理器、RAM、及一般處理器被集中定位並降低遊戲客戶的 成本。處理伺服器應用資料經由網際網路被送回至對應遊 戲客戶,以被顯示在監視器上。 場景C例示可以爲遊戲客戶及伺服器處理模組所執行 之例示應用程式。例如,在一實施例中,遊戲客戶1 1 0 2 C 允許使用者C建立及觀看夥伴名單U20,其包含使用者 A、使用者B、使用者D、及使用者E。如所示,在場景C 中,使用者C能到現場影像或個別使用者於監視器1 〇6C 上的化身。伺服器處理執行遊戲客戶1102C的個別應用及 具有個別遊戲客戶1102的使用者A、使用者B、使用者 D及使用者E。因爲伺服器處理得知應用程式正爲遊戲客 戶B所執行,及用於使用者A的夥伴名單可以表示哪一 遊戲使用者B正在玩。再者,在一實施例中,使用者A 可以直接由使用者B觀看遊戲電玩的實際影像。除了遊戲 客戶B外,這是僅藉由送出處理伺服器應用資料給使用者 B給遊戲客戶A。 除了能由夥伴看到影像,通訊應用程式也可以在夥伴 間之即時通訊。如同應用至前一例子,這允許使用者A 於觀看使用者B的即時影像的同時提供鼓勵或暗示。在一 實施例中,雙向即時語音通訊係透過客戶/伺服器應用程 式加以完成。在另一實施例中,客戶/伺服器應用完成文 字交談。在另一實施例中,客戶/伺服器應用程式將語音 轉換爲文字顯示在夥伴螢幕上。 場景D及場景E例示個別使用者D與使用者E分別 -31 - 201205121 與遊戲平台1110D及11I0E互動。各個遊戲平台1110D 及1110E係連接至伺服器處理模組並顯示一網路,其中伺 服器處理模組與遊戲平台與遊戲客戶協調遊戲。 圖2 3顯示資訊服務提供者架構的實施例。資訊服務 提供者(ISP ) 25 0輸送一數量的資訊服務給地理上分散 並透過網路266連接的使用者262。ISP可以只輸送一類 型服務,例如股票價格更新、或各種服務,例如廣播媒體 、新聞、運動、遊戲等等。另外,爲各個ISP所提供之服 務係動態的,即,服務可以在任意時間點被加入或取出。 因此,提供特定類型給特定個人之服務的ISP可以隨時間 改變。例如,使用者可以爲一接近該使用者的ISP所服務 ,在該使用者係在家鄕附近時,及當使用者旅行至不同城 市’則使用者可以爲不同ISP所服務。家鄕ISP將傳送所 需資訊及資料至新I S P,使得使用者資訊”跟隨”使用者至 新城市,使得資料更接近使用者並容易存取。在另一實施 例中’主從關係可以建立於一主ISP及從ISP之間,該主 ISP管理使用者的資訊,及在來自主ISP的控制下,從 ISP直接與使用者作成界面。在其他實施例中,當客戶在 全世界移動,資料係由一ISP傳送至另一ISP,以使得在 較佳地點的ISP爲傳送這些服務以服務使用者。 ISP2 5 0包含應用程式服務提供者(ASP ) 252,其透 過網路提供電腦爲主服務給客戶。使用ASP模型提供的 軟體有時也被稱爲隨選軟體或服務軟體(SaaS )。對特定 應用程式(例如客戶關係管理)提供存取的簡單形式係爲 -32- 201205121 藉由使用例如HTTP的標準協定。應用軟體內佇在販賣者 的系統上並藉由販賣者所提供之特殊目的客戶軟體或其他 例如薄客戶之其他遠端界面,爲使用者透過使用HTML的 網頁瀏覽器所接取。 服務輸送在廣大地理區域上經常使用雲端計算。雲端 計算爲一計算的格式,其中,動態可縮放及經常虛擬化資 源係在網際網路上被提供作爲服務。在支援它們的“雲端” 中,使用者並不必要爲技術基礎結構中之專家。雲端計算 可以被細分爲不同服務,例如基礎結構作爲服務(IaaS ) 、平台作爲服務(PaaS )、及軟體作爲服務(SaaS )。雲 端計算服務經常提供線上的共同商務應用,其係由網頁瀏 覽器接取,而軟體與資料係被存在伺服器上。用語”雲端” 係被使用爲網際網路的比喻,根據網際網路如何被描繪於 電腦網路圖並爲其隱藏的複雜基礎結構的摘要。 再者,ISP 25 0包含遊戲處理伺服器(GPS) 254,其 係爲遊戲客戶以玩單人或多人電玩遊戲加以使用。多數在 網際網路上玩的電玩遊戲經由對遊戲伺服器的連接而操作 。典型地,遊戲使用一專用伺服器應用程式,其收集來自 玩家的資料並將之分配至其他玩家。這是較有效並有效於 對等配置,但需要分開的伺服器以主管伺服器應用。在另 一實施例中,GPS建立於玩家與個別遊戲遊玩交換資訊間 之通訊,而不依賴集中式的GPS。 專用GPS爲伺服器,其獨立於客戶執行。此等伺服 器通常執行於位在資料中心的專用硬體上,提供更多頻寬 -33- 201205121 及專用處理電力。專用 多玩家遊戲的遊戲伺服 戲執行於專用伺服器通 管,允許它們控制及更 廣播處理伺服器( 聽眾。廣播至很窄範圍 最後一腳係信號係如何 電視台時,其可以散佈 透過有線電視或有線電 由網路。網際網路也可 是具有多播,以允許信 受限於地理區域,例如 網路的快速擴散,廣播 可以幾乎到達世上之任 儲存服務提供者( 關管理服務》SSP同時 供儲存器作爲服務,使 。另一主要優點爲SSP 障,使用者將不會遺失 具有全部或部份使用者 式存取資料,而無關於 資料。例如,使用者可 使用者正在移動時,在 通訊提供者260提 伺服器係爲主管用於多數 器之較佳方法。大量多玩 常爲擁有遊戲名稱的軟體 新內容。 BPS ) 256分配音訊或視 的聽眾有時稱爲窄播。廣 到達聽眾或觀眾,當爲無 在空氣中至天線或接收器 台(或無線纜線)經由電 以爲無線電或TV給接收 號及頻寬被共享。歷史上 國界或區域廣播。然而, 並未爲地理區域所界定, 何一國家。 SSP ) 25 8提供電腦儲存 也提供週期性備份及歸檔 用者可以訂購如所需的更 包含備份服務,如果其電 其所有資料。再者,多數 資料的拷貝,允許使用者 使用者位於何處或裝置正 以在家用電腦中存取個人 行動電話中之個人檔案。 供連接性給該等使用者。 PC-爲主 家線上遊 公司所主 訊信號至 播分佈的 線電台或 ,或可以 台或直接 器,特別 ,廣播係 隨著網際 因爲內容 空間及相 。藉由提 多的儲存 腦硬碟故 S S P可以 以有效方 用以存取 檔案,及 一類型之 -34- 201205121 通訊提供者爲網際網路服務提供者(ISP ),其提供對網 際網路的存取。該ISP使用適用以傳輸網際網路協定資料 塊,例如撥接(dial-up ) 、DSL、有線數據機、無線或專 用高速互連的資料傳輸技術,以連接其客戶。通訊提供者 也可以提供發信服務,例如電子郵件、即時發信及SMS 文字。另一類型之通訊提供者爲網路服務提供者(NSP) ,其藉由提供對網際網路的直接主幹接取,而販賣頻寬或 網路存取。網路服務提供者可以由電信公司、資料載送商 、無線通訊提供者、網際網路服務提供者、提供高速網際 網路接取之有線電視操作者等等。 資料交換器2 6 8將在ISP25 0內的幾個模組互連並將 這些模組經由網路266連接至使用者262。資料交換器 268可以涵蓋小面積,其中所有ISP250的模組係相鄰近 ,或者當不同模組在地理位置上分散時,可以涵蓋大地理 面積。例如,資料交換器268可以包含在資料中心的櫃內 的快速十億位元乙太網路(或更快),或洲際之虛擬區域 網路(VLAN)。 使用者262以客戶裝置264接取遠端服務,該客戶裝 置2 64包含至少CPU、顯示器及I/O。該客戶裝置可以爲 PC、行動電話、小筆電、PDA等等。在一實施例中, ISP2 50認出爲客戶所用之裝置的類型並調整所用之通訊 方法。在其他例子中,客戶裝置使用標準通訊方法,例如 html 接取 ISP250。 資訊眼務提供者(IS P ) 25 0輸送多數資訊服務至地 -35- 201205121 理位置上分散並經由網路266連接的多數使用者262。 ISP可以只輸送一類型服務,例如股票價格更新,或各種 服務,例如廣播媒體、新聞、運動、遊戲等等。另外,爲 各個IS P所供給之服務爲動態的,即服務可以在任何時間 點被加入或取出。因此,提供特定類型服務給特定個人的 I s P可以隨時間改變。例如,當使用者在自己居住城市時 ,使用者可以在使用者附近的IS P所服務,當使用者旅行 至不同城市時,則使用者可以爲不同ISP所服務。自己居 住ISP將傳送所需資訊及資料給新ISP,使得使用者資訊 ”跟隨”使用者至新城市,使得資料更接近使用者及更容易 接取。在另一實施例中,可以在主ISP與從ISP間建立主 從關係,該主ISP管理使用者的資訊,及從ISP在主ISP 的控制下,直接與使用者作成界面。在另一實施例中,當 客戶全世界移動時,資料係由一ISP傳送至另一ISP,以 使得ISP於較佳位置來服務使用者,該ISP爲輸送這些服 務者。 ISP250包含應用服務提供者(ASP) 252,其提供電 腦爲主服務給在網路上之客戶。使用 ASP模型提供的軟 體有時也稱爲隨選軟體或服務軟體(SaaS) »提供對特定 應用程式(例如客戶關係管理)的簡單形式係藉由使用例 如HTTP的標準協定。應用軟體內佇於販賣者的系統中, 並爲販賣者所提供之特殊目的客戶軟體或例如薄客戶的其 他遠端界面所使用HTML透過網頁瀏覽器爲使用者所接取 -36- 201205121 在寬廣地理區域上傳送之服務經常使用雲端計算。雲 端計算爲一計算的格式,其中,動態可縮放及經常虛擬化 資源係在網際網路上被提供作爲服務。在支援它們的“雲 端”中,使用者並不必要爲技術基礎結構中之專家。雲端 計算可以被細分爲不同服務,例如基礎結構作爲服務( IaaS )、平台作爲服務(PaaS )、及軟體作爲服務(SaaS )。雲端計算服務經常提供線上的共同商務應用,其係由 網頁瀏覽器接取,而軟體與資料係被存在伺服器上。用語 “雲端”係被使用爲網際網路的比喻,根據網際網路如何被 描繪於電腦網路圖並爲其隱藏的複雜基礎結構的摘要。 再者,ISP 2 50包含遊戲處理伺服器(GPS) 254,其 係爲遊戲客戶以玩單人或多人電玩遊戲加以使用。多數在 網際網路上玩的電玩遊戲經由對遊戲伺服器的連接而操作 。典型地,遊戲使用一專用伺服器應用程式,其收集來自 玩家的資料並將之分配至其他玩家。這是較有效並有效於 對等配置,但需要分開的伺服器以主管伺服器應用。在另 —實施例中’ GPS建立於玩家與個別遊戲遊玩交換資訊間 之通訊,而不依賴集中式的GPS。 專用GPS爲伺服器,其獨立於客戶執行。此等伺服 器通常執行於位在資料中心的專用硬體上,提供更多頻寬 及專用處理電力。專用伺服器係爲主管用於多數PC-爲主 多玩家遊戲的遊戲伺服器之較佳方法。大量多玩家線上遊 戲執行於專用伺服器通常爲擁有遊戲名稱的軟體公司所主 管,允許它們控制及更新內容。 -37- 201205121 廣播處理伺服器(BPS ) 25 6分配音訊或視訊信號至 聽眾。廣播至很窄範圍的聽眾有時稱爲窄播。廣播分佈的 最後一腳係信號係如何到達聽眾或觀眾,當爲無線電台或 電視台時’其可以散佈在空氣中至天線或接收器,或可以 透過有線電視或有線電台(或無線纜線)經由電台或直接 由網路》網際網路也可以爲無線電或TV帶給接收器,特 別是具有多播,以允許信號及頻寬被共享。歷史上,廣播 係受限於地理區域,例如國界或區域廣播。然而,隨著網 際網路的快速擴散,廣播並未爲地理區域所界定,因爲內 容可以幾乎到達世上之任何一國家。 儲存服務提供者(SSP ) 25 8提供電腦儲存空間及相 關管理服務。SSP同時也提供週期性備份及歸檔。藉由提 供儲存器作爲服務,使用者可以訂購如所需的更多的儲存 。另一主要優點爲SSP包含備份服務,如果其電腦硬碟故 障,使用者將不會遺失其所有資料。再者,多數SSP可以 具有全部或部份使用者資料的拷貝,允許使用者以有效方 式存取資料,而無關於使用者位於何處或正用以存取資料 的裝置。例如,使用者可以在家用電腦中存取個人檔案, 及使用者正在移動時,在行動電話中之個人檔案。 通訊提供者260提供連接性給該等使用者。一類型之 通訊提供者爲網際網路服務提供者(ISP ),其提供對網 際網路的存取。該ISP使用適用以傳輸網際網路協定資料 塊,例如撥接、DSL、有線數據機、無線或專用高速互連 的資料傳輸技術,以連接其客戶。通訊提供者也可以提供 -38- 201205121 發信服務,例如電子郵件、即時發信及SMS文字。另一 類型之通訊提供者爲網路服務提供者(NSP ),其藉由提 供對網際網路的直接主幹接取,而販賣頻寬或網路存取。 網路服務提供者可以由電信公司、資料載送商、無線通訊 提供者、網際網路服務提供者、提供高速網際網路接取之 有線電視操作者等等。 資料交換器26 8將在ISP250內的幾個模組互連並將 這些模組經由網路266連接至使用者262。資料交換器 2 6 8可以涵蓋小面積,其中所有I S P 2 5 0的模組係相鄰近 ’或者當不同模組在地理位置上分散時,可以涵蓋大地理 面積。例如’資料交換器2 6 8可以包含在資料中心的櫃內 的快速十億位元乙太網路(或更快),或洲際之虛擬區域 網路(VLAN)。 使用者262以客戶裝置264接取遠端服務,該客戶裝 置264包含至少CPU、顯示器及I/O。該客戶裝置可以爲 PC、行動電話、小筆電' pda等等。在一實施例中, ISP25 0認出爲客戶所用之裝置的類型並調整所用之通訊 方法。在其他例子中,客戶裝置使用標準通訊方法,例如 html 接取 ISP2 5 0。 本發明之施例可以以各種電腦系統架構實施,包含手 持裝置、微處理器系統 '微處理器爲主或可程式消費電子 、迷你電腦、主機電腦及類似物。本發明也可以以分散計 算環境實施’其中工作係爲透過網路的鏈結在一起的遠端 處理裝置所執行。 -39- 201205121 記住上述實施例,應了解的是,本發明可以使 電腦可實施的涉及儲存於電腦系統中之運算。這些 需要實體運算實體量者。於此所述之任何運算形成 部份並有用於機器操作。本發明有關於執行這些運 置及設備。該設備可以特別建構用於所需目的,例 目的電腦。當界定爲特殊目的電腦時,於仍能用於 的運算時’電腦可以執行其他處理,程式執行或不 目的常式。或者’操作可以爲一般目的電腦所處理 腦選擇地爲儲存在電腦記憶體、快取中或透過網路 —或更多電腦程式所作動或組態。當資料透過網路 ’資料可以爲在網路上之其他電腦所處理,例如, 計算資源所處理。 本發明之Η施例可以定義爲機器,其將資料由 轉換爲另一狀態。所轉換資料可以儲存至儲存器然 理器所運算。處理器因此將資料由一事物轉換爲另 。再者,該等方法也可以爲一或更多機器或處理器 ,這些係透過網路加以連接。各個機器可以將資料 態或事物轉換至另一狀態或事物,並也可以處理資 存資料至儲存器、在網路上傳送資料、顯示結果、 結果至另一機器。 本發明之一或更多實施例也可以被製造爲電腦 碼在電腦可讀取媒體上。電腦可讀取媒體可以爲任 儲存裝置,其可以儲存資料,隨後爲電腦系統所讀 腦可讀取媒體的例子包含硬碟機、網路附接儲存器201205121 VI. Description of the Invention: [Technical Field] The present invention relates to a method, apparatus and computer program for controlling a view of a portable device and a virtual scene, more specifically, to facilitate virtual or augmentation The method, device and computer program for multi-player interaction in the environment. The virtual reality (VR) system is a computer simulation environment before the technology, whether the environment is a real-world simulation or imaginary world, where the user can use the standard input device or a special multi-directional input device, and the virtual environment or Virtual object (VA) interaction. The simulated environment can be similar to the real world, for example, simulated flight or combat training, or it can be significantly different from reality, such as in a VR game. Virtual reality is often used to describe a variety of applications, often associated with its immersive high-vision 3D environment. The development of computer-assisted design (CAD) software, graphics hardware acceleration, head-mounted displays, library gloves, and miniaturization has helped to make these popular. Augmented Reality (AR) provides an instant view of the real world environment on the entity, whose components are mixed (or amplified) by virtual computers to produce images to create a hybrid reality. Amplification is traditionally an instant and contextual element, such as the score of a sport on television during a match. With the help of advanced AR technology (such as computer view and object recognition), information about the real world around the user can be interactive and digitally used. The term "Augmented Reality (AV)" is also used in the virtual reality world and is similar to AR. Augmented reality is also known as the combination of real-world objects into the virtual world -5 201205121. In the middle of virtual execution, the AV is called the primary virtual space, where physical elements, such as physical objects or characters, are dynamically integrated and can interact instantly with the virtual world. The term VR is used in this application. As a superordinate term, AR and AV are included unless otherwise specified. VR games typically require a lot of computer resources. The implementation of the VR game is very rare and the existing game system is relatively simple and has a basic VR role. In addition, multi-player AR games allow for player interaction in the virtual world, but interaction is limited to objects that are manipulated by the player in the virtual world (eg, vehicles, rockets, balls, etc.). The virtual world creates a computer with no location about the player and the portable device. Players' relative positions and relative positions around them are not included in the “real” virtual reality experience. In this document, embodiments of the invention are discussed. SUMMARY OF THE INVENTION Embodiments of the present invention provide a method, apparatus, and computer program for controlling a view of a portable device and a virtual scene. It will be appreciated that the present invention can be implemented in various forms, such as a program, device, system, device, or computer readable medium. Several inventive embodiments of the invention are described below. . In an embodiment of the method, a signal is received and the portable device is synchronized to a reference point in a 3-dimensional (3D) space to make the location of the portable device. A virtual virtual scene containing virtual reality elements is generated in the 3D space next to the reference point. Furthermore, the method determines the current location of the portable device -6-201205121 relative to the reference point in the 3D space and establishes the view of the virtual scene. The view represents the virtual scene seen by the current location of the portable device and from the perspective of the current location of the portable device. In addition, the built-in viewing surface is displayed in the portable device, and the viewing surface of the virtual scene is moved by the user in the 3D space when the portable device is viewed. In another method, multiple players share the virtual reality and interact with each other to view objects in the virtual reality. In another embodiment, a method of sharing a virtual scene between devices is presented. The method includes synchronizing a reference point in the first device to the three dimensional (3D) space and a position of the second device to calculate a position relative to the first device. Moreover, the operation of the method includes the exchange of information between the first device and the second device to synchronize the second device to a reference point in the 3D space. The information includes the reference point and the location of the first and second devices. Additionally, a method is used to generate a virtual scene near the reference point in the 3D space. The virtual scene is shared by both devices and changes simultaneously in both devices when the device interacts with the virtual scene. The viewing surface created by the virtual scene as seen by the current location of the first device has a viewing angle based on the current location of the portable device, and the established viewing surface is displayed on the first device. When the portable device moves in the 3D space, the method continues by changing the display view of the virtual scene. In another embodiment, a method is performed to control a view of a virtual scene of the first device. The method includes an operation to synchronize the first device to a first reference point in a first three-dimensional (3D) space. In another operation, a communication link is established between the first device and the second device. The second 201205121 second device is in the second 3D space, and is cavitation in the first 3D to a second reference point in the second 3D space. Furthermore, the system is executed to generate a common virtual scene, which includes, wherein the common virtual scene can be the first and the first. The first device establishes the common virtual scene, and the first second device establishes a common virtual scene to interact with the virtual reality element by the second reference point. Moreover, the method includes determining a first 3D space of the first device relative to the reference point to establish a view of the common virtual scene. The current location of the viewport has a common virtual scene as seen by the current device. The established viewport is visualized and changes when the first device moves over the display view of the first 3D space virtual scene. In another embodiment, the method operates to control the viewport of the virtual scene. In one operation. , portable with reference points in the three-dimensional (3 D) space, where the positioning is . The portable device includes a rear camera facing the back of the portable device facing the back of the portable device. Furthermore, the execution creates a virtual scene next to the reference point in 3D space. Contains virtual reality components. The 3D space in the portable device is relative. Determined at the reference point. The viewport of the scene in another method operation. The view captures a representative image of the virtual scene as seen by the player's 3D eye position. The play device, It corresponds to what the player will see through the window. And sync, A method of operation, virtual reality components, two devices observed, next to the reference point, And . Both devices operate, When the current location for the decision is indicated by the first installation location in the first device, The common portable device is synchronized to the front camera and the face of the portable device, An operating system is the current location in the virtual scene package, The current home in the virtual space is established to hold the portable virtual scene. The position of the window in the 201205121 3D space is equal to the position in the 3D space of the display in the portable device. The method also includes displaying the created view on the display. And an operation for changing the display view of the virtual scene to be a portable device or a player moving in a 3D space.  In still another embodiment, Portable devices are used to interact with the augmented reality. The portable device includes a location module, Virtual reality generator, View generator, And display. The location module is used to determine the location of the portable device in the 3D space in which the portable device is located. The location of the portable device is set to a reference point in the 3D space when the portable device receives a signal to synchronize. The virtual reality generator creates a virtual scene next to the reference point in the 3D space. The virtual scene contains virtual reality components. Again, The view generator creates a view of the virtual scene, The viewing surface represents a virtual scene as seen by the portable device and from the perspective of the position of the portable device. In addition, The display is used to display the view of the virtual scene. When the portable device moves in the 3D space, Viewer change shown in the display 〇 In another embodiment, A computer program built into a computer readable storage medium' when executed by one or more processors, Computer programs are used to implement the methods of the present invention.  Other aspects of the invention will be apparent from the following detailed description in conjunction with the accompanying drawings. The drawings illustrate the principles of the invention.  [Embodiment] The following embodiment describes a method for controlling a view of a virtual -9-201205121 scene in a virtual or augmented reality, Devices and computer programs. however, As is familiar to those skilled in the art, The invention may be practiced without some or all of these specific details. In other examples, We know that known procedures are not described in detail to prevent unnecessarily obstructing the invention.  1 depicts a user prior to synchronizing a portable device to a reference point in space, in accordance with an embodiment. The portable device 104 is placed on the table. Prepare to synchronize the portable device to the reference point. The user 102 has placed the portable device at one point or anchor as a reference point to establish a virtual reality next to the point. As shown in Figure 1, The portable device is placed approximately at the center of the table. And once the portable device is synchronized, Then create a virtual world near the center of the table. Portable devices can be synchronized in a variety of ways. E.g, Pressing the button on the portable device 104,  Touch the touch screen in the portable device, Keeping the device for a period of time (for example, 5 seconds), Enter voice commands and more.  Once the portable device receives the input to be synchronized, The location tracking module in the portable device is reset. Portable devices can include a variety of location tracking modules. As discussed below with reference to Figure 2, Such as an accelerometer, Magnetometer, Global Positioning System (GPS) device, camera, Depth camera, compass, Gyros and so on.  Portable devices can be one of many types. Such as holding a game device, mobile phone, tablet, Notebook computer, Small notebook, Personal Digital Assistant (P D A ) and more. Embodiments of the present invention will be described with reference to a portable game device. However, the principle can be applied to any portable electronic device having a display. The principles of the present invention are also applicable to game controllers or other input devices connected to the computing device having a display of -10- 201205121.  Figure 2 shows a virtual reality scene observed with a portable device. After synchronizing the device 1 〇 4 with respect to the reference point 1 ’ 6 the portable device will begin to display the view of the virtual reality 1 〇 8. The viewing surface in the display is established by moving in the 3D space near the reference point 1〇6 by a camera simulating the back of the portable device. Figure 2 depicts a virtual reality containing a checkerboard. The portable device 104 can detect the motion and decide when the device moves, Its relative position relative to the reference point 106. The location and location decisions can be made in different ways with varying degrees of accuracy. E.g, The location can be analyzed by capturing images captured by the camera, By inertial system, GPS, Ultrasonic triangulation method, WiFi communication, Position push algorithm, Detecting information obtained by or the like or a combination thereof 〇 In an embodiment, The device tracks the location of the portable device relative to the reference point 106 in space and in the space of the portable device. The location is used to determine the camera's perspective. which is, The portable device acts as a camera that captures a virtual scene. If the portable device points to the right, Then the view will turn to the right and so on. in other words, The viewing angle is determined as the vector of the origin at the center of the display (or other part of the device). And have one direction of vertical entry and exit of the display. In another embodiment, Only places in space are being traced, The viewing surface in the display is calculated, Just like the position of the camera in the space where the portable device is located and towards the reference point.  In some existing implementation methods, The Augmented Reality (AR) label is placed on the table. And use as the trustee identification code, Used to generate augmented reality. The AR tag can be an object or figure that is recognized when the captured image stream appears in the real environment. The AR label is used as the trusted identification code. It contributes to the decision of location within the real environment. Embodiments of the present invention eliminate the need for an AR tag, Because it is synchronized to the 3D space and the tracking of the location of the portable device. In addition, Location information allows games in portable devices to deliver authentic 3D virtual experiences. In addition, An array of network portable devices can be used to create a shared virtual world. The following is described with reference to FIG. 4.  Figure 3 illustrates an augmented reality chess game with a virtual board and a mixed player's hand in accordance with an embodiment. The 3D space image is used to establish augmented reality by combining real and virtual components with respect to the correction points. It also provides optical motion capture. To correct multiple camera technology, It is possible to determine the location of the hand or arm, To allow the player to "enter" the augmented reality scene and interact with the game object (chick).  In an embodiment, Use two cameras on the back of a single unit, To determine where the object enters the 3D space. You can also use a depth camera to get 3D information. In other embodiments, As discussed below with reference to Figure 4, A camera system from multiple devices is used to determine the position of the hand 306. While the hand-held carrier 302 is placed on one hand, The player sees and plays the game space through the screen 3 04. This space is created to allow the player to access the 3D game objects and environment. The game is completely tactile. Multiple players may simultaneously enter a game area and interact with the game object in a smart way. E.g,  The player's hand 3 06 can be used by the border, Hold, promote, Pull, Grasp, mobile, Break, extrusion, beat, Throw, Fighting, open, Combined Turn on or off, Button, ignition, Eat (chess) and so on to interact with virtual objects.  Each portable device synchronized to the game space joins another possible camera -12-201205121 machine's relative motion tracking and source data. This makes it possible to see the player's hands and fingers from most perspectives. A motion capture bar based on the creation of an effective 3D camera. The hand and the virtual space are mixed together, And the virtual component in the virtual space appears in the display view. It is like a part of the 3D space. When the portable device moves in the 3D space, The view of the virtual component changes in a manner that changes from the view of the real component. Changed by the geometric perspective.  4 depicts a multi-player virtual reality game in accordance with an embodiment. When the correction location and image analysis data are combined for high-speed connectivity, Location and game information can be exchanged between devices that choose to participate in shared space gaming experience. This allows individual player systems to access the camera view and location information from all other players to synchronize their correction locations and share a virtual space. Also known as shared space.  After the player 402A-402 C has synchronized or corrected its portable device with reference to one of the common 3D spaces (e.g., a point on the table), Establish a common virtual scene 404. Each player has a view of the virtual scene 404. The virtual scene like the military war game in this example is really on the table in front of the player. Portable device as a camera, As the player moves to the side of the device, the view changes. result, The actual viewing surface on each display is independent of the viewing surface in other displays. And the viewing surface is based only on the relative position of the portable device relative to the virtual scene. The virtual scene is anchored to the actual physical location on the 3D space.  By using most cameras, Accelerometers and other mechanical devices to determine the location, Cooperating with high-speed communication between portable devices, It is possible to build a 3D motion capture experience. Allow players to believe in a way that can be trusted Watch and -13- 201205121 May touch virtual game characters and environments.  The shared space 404 game utilizes the high speed connectivity of the device, Information on devices that share experience in sharing space games. By turning the device to stabilize the "magic window", To continue in the space between the various devices, Shared space 4 04 The game space is viewed through the device. By using motion tracking, Image analysis, And a combination of high persistence of information between devices, Even when the devices move slightly, The display area still appears in a stable location.  FIG. 5 illustrates an embodiment of a method of correcting a multi-player environment. As mentioned before, By device sensor (accelerometer, GPS, compass, The location information obtained by the depth camera, etc. is transmitted to other link devices. To strengthen the information on the maintenance of cooperation in the virtual space. In an embodiment in which a common shared space synchronized to a common reference point 502 is established, The first player 504A synchronizes its device to the 3D space relative to the reference point 502. Other players in the shared space establish a communication link with the first player. In exchange for location and game information. Relative position can be obtained in different ways. For example, use WiFi triangulation and Internet testing to determine relative position. In addition, Visual information can be used to determine other locations. For example, detecting the faces of other players, And the possible location of the game device is known by their face.  In an embodiment, With ultrasonic communication and directional microphones, Audio triangle positioning is used to determine the relative position. Most frequencies can also be used to perform audio triangulation. Once the installation has exchanged location information, Such as ultrasound, WiFi, Or Bluetooth wireless communication is used to synchronize other devices to reference point 502. After all devices have been calibrated, The devices have recognized the reference point 502 and its relative position relative to the reference point 502. It should be understood that -14- 201205121 is, Other methods can also be used to correct most devices to a shared reference point.  For example, 'all devices can be corrected to the same reference point by placing the device sequentially on the reference point." The virtual scene can be more realisticated by using the shadows and light determined by the illumination source in the chamber. By using the camera to feed back, Game Environment and Characters Scene lights and shadows that have been affected by the real world. When the hand reaches into the virtual world to interact with the virtual object, This means that the player's hand will project a shadow on the virtual character or object. The shadows and light of the game world are adjusted by the shadows and light of the real world. For the best possible results.  Figure 6 illustrates how to play an interactive game over a network connection in accordance with an embodiment. Many types of games can be used in shared spaces. E.g, The portable device can be used as a racket. To play billiard games. The device is slightly moved, As if it hits the ball like a racket. Players watch the ball float between the screen and the opponent's screen. In the war game, The player views through the portable device and projects the stone into the opponent's castle. The player pulls the device back and loads the stone bomb. then, Press the button to cast the stone to the enemy's castle.  The shared space can also be created when the player is in a different location. As shown in Figure 6. The player has established an internet connection to play the game. Each player synchronizes its device to a reference point in the player's space. And create a virtual reality such as a billiard table. The opponent is shown as being on the back side of the table. The movement of the opponent device is coordinated with the action of the opponent's racquet. The game can also be added to an avatar to hold the racket. Take as a more real game experience. When playing, Each device tracks the action and position of the device in space. This information is shared by other devices. To cause other devices to match the virtual racket to the action of the device. Also -15- 201205121 to share other game information, E.g, The position and movement of the ball.  Figure 7 shows the interactive game, It is not related to the location of the portable device. The game shown in Figure 7 shows that the restrictions on playing the game are not synchronized with respect to reference point 7 06. The multiplayer air hockey system is played simultaneously on two separate devices 704C and 704A. The game includes hockey field 708. Round dish 714, And clubs 710 and 712. Each player controls the club by moving the finger on the display. The display shows the position of the disc and the stick. however, When the portable device moves slightly, Because there is no geographical synchronization, a reference point, and so,  The viewing surface on the display has not changed. E.g, When player 702A moves to position 702B, The viewing surface is the same, No matter where the device is located.  In order to play the game, The portable device only exchanges information about the movement of the disc and the position of the club. No virtual experience is tied to the 3D space.  Figure 8 shows an interactive game in accordance with an embodiment. The viewing surface in the display depends on the location of the portable device. Devices 802A and 802B have been corrected to a common space, The hockey field has been built as a virtual component. The device acts as a camera into the space, And the device does not necessarily need to display the entire game surface. E.g, When the device is pulled away from the reference point, then a zoom out display occurs and a larger pitch can be obtained. Furthermore, If the device is tilted up, The viewport shows the top of the course. And if the device is tilted down, The closer the view in the device is to the player's own target. As shown in Figure 8, The viewing planes in the display are independent of each other and are based on the current viewing surface of the game surface of each portable device.  Figure 9 shows, according to an embodiment, How the portable device moves has a similar effect when moving the camera to the virtual space. Figure 9 shows the vehicle 902 in the virtual -16 - 201205121 space. The imaginary portable device is aligned to the vehicle 902 by a point in the sphere. When the portable device moves into the sphere, A majority of the car's view can be obtained. E.g, The view from the "Arctic" will show the roof of the vehicle. And the view from the "Antarctic" will show the bottom of the vehicle. Simultaneously, Figure 9 shows the side of the vehicle, Front and rear views.  In an embodiment, The player can enter a command, To change or flip the visual side of the virtual world. E.g, When it is a vehicle, The player sees the back of the car from the front of the car. It is like an axis that has been rotated about 180 degrees and vertically through the reference point. In this way, Players don't have to move indoors to get a different perspective. Other inputs can also have different effects. Such as a 90 degree rotation, The zoom of the view (so that the virtual world seems smaller or larger), Relative to x, Rotate the y or z axis. In another embodiment, Flip of the portable device, That is, rotate 180 degrees on the player's hand. Will cause the virtual world to flip upside down.  Figure 10 shows a rotating portable device in accordance with an embodiment. A two-dimensional representation of the changes in the image shown in the display. The portable device 1 5 2 is aligned to a wall surface, With a viewing angle a, Causing projection on the wall 160», therefore, The viewing surface on the portable device 152 will correspond to the projection 160. When the device 152 is rotated without a corner, The portable device ends at location 154. While maintaining the camera angle α, The viewing surface is also rotated at the same time. result, The viewing surface on the portable device corresponds to projection 1 62. It should be noted that The view on the screen is not related to the eye position. For example, location 1 5 8 and 1 5 6, And the view is not about where the player is. In addition, The image on the display depends on the location of the portable device that acts as a virtual camera. Other implementations described below -17- 201205121 Examples are included on the display side of the display, It shows the play VR tour according to an embodiment in accordance with the bitmap of the eye. Figures 11 to 12F illustrate a racing game, among them, The plant is used as a camera or to control the driving of the vehicle. The side view of the carrying vehicle' where the racing track is displayed on the other center and the people sitting on the side of the lane.  12A through 12F illustrate, in accordance with an embodiment, How the carrying point affects the viewing surface in the display. In this order, it is used as a camera. Not to drive. The E family holds the portable device to play the racing game. In front of the player, About the length of the arm. When the player shows the location, The game's viewing surface is the car's face as shown in Figure 1 2B showing the car as seen by the driver of the vehicle. Driving the front lane and part of the vehicle interior, Contains the steering wheel Figure 12C shows when the player still maintains the portable device 45 degrees. result, The portable device is played with the player. The result of the player's movement is shown in Figure 1 2 D. The race turned to about 45 degrees. As can be seen, The portable device acts as a moving camera to change the location in the 3D world. Then the display changes. Figure 12E shows the player turning left again by 45 degrees. result,  The result of the head and viewing angle having changed by about 90 degrees relative to the original location is depicted in Figure 1 2F. Where the game is driving - side view, among them, The driver of the game has changed in one side of the view.  The portable carrying device of the play can display the center of the racing vehicle.  Belt device, Portable device 丨12A shows that the play system is maintained as shown in Figure 12. The display driver can see 〇 moving in the left-turn space in front of it.  The view of the driveway is also changed to the camera and the viewing surface on the camera. On the display, the driver already has a face, It includes another -18- 201205121 a racing car and a platform.  Figure 1 3 A-1 3 B illustrates playing an augmented reality game between users at a remote location in accordance with an embodiment. Figure 13A shows a portable device having a camera 1 302 facing a player holding the portable device. There are many uses for player cameras. Such as communication, See the frustum application (see Figure 15-19B), Add the player's face and so on in the game.  Figure 13B shows an embodiment of an augmented reality game that produces an approximate real effect. Player 1308 is at a remote location and exchanges game and environmental information over a network connection. The camera in the remote location captures the player and its surroundings,  For example, background 1 3 1 0. The image is sent to the opponent's device. The images are mixed with a virtual board 1 3 06. Similarly, Camera 1 3 04 Take a picture of the player holding the device and send the image to the far-end player. This way these players can share a space.  Each player sees the view as augmented reality. When the view passes through the scene of another player, The augmented reality fades into a virtual reality fog. All actions of each player are still tracked relative to the synchronized correction locations of the two devices. The game inserts a virtual board into the top of the table that provides 3D experience. As mentioned earlier, The portable device can be moved slightly to change the viewing angle and view the board from different perspective points.  For example from the top, side, Look at the board in the direction of the opponent.  In an embodiment, by periodically updating the face and background of the opponent, Without the on-site method, the required communication and processing bandwidth can be reduced. In addition, it may also only send a part of the far-end image. Such as the image of the player, Because the background can be static and less relevant. E.g, The far-end player's face can be updated every five seconds. Updated each time the player changes his or her posture, When the player pays -19- 201205121 to talk about updates and so on.  In another embodiment, Sound can be exchanged between players. To make the 3D experience more realistic. In another embodiment, The player has the option to change the view. E.g, Switch between mixing 3D images and displaying only the board. To improve the view of the board. In another embodiment, Image stabilization can be used to stabilize small image changes caused by player hand shake. In an embodiment, The face of the player holding the device can be added to the display. To show what the user sees to the opponent.  14A-14H depict changes in the display as the portable device changes location in accordance with an embodiment. In the sequence of Figures 14A-14H, The portable device is using the see-through frustum effect. To determine how the augmented reality world is presented to the user.  In the current 3D computer graphics, The viewing frustum system is the area of space that can appear in the modeled world on the screen. Viewing the frustum system is the view of a standard camera. The exact shape of this area depends on which photographic lens is simulated, But it is typically a rectangular frustum of the frustum (as named). Cutting the plane of the frustum perpendicular to the viewing direction is referred to as near and far. In an embodiment, The near face corresponds to the display surface in the portable device. Objects that are closer to the camera than nearer or farther away from the camera are not drawn.  In an embodiment, The viewing frustum system is anchored (top of the cone) in the eye of the player (between the eyes) holding the portable device. The display acts as a window into the virtual reality. therefore, The closer the "window" is to the eye, The larger the display area of the virtual reality. Conversely, The farther the "window" is from the eye,  -20- 201205121 The smaller the virtual reality's view (more detailed). This effect is similar to the closer to the rectangular old peephole. There is no distortion of optics. The closer the eye is to the peephole, Then you can see more outside.  Fig. 14A shows a situation in which an indoor player holds an augmented reality portable device. When the device is synchronized indoors, The virtual reality generator has "painted" the virtual triangle on the wall facing the player. And squares, "painting, , On the wall is on the left side of the player. In Figure 14A, The player is holding the device slightly below the eye level. The arm is almost fully extended. The viewing surface shown in the display is presented in Figure 14B. It shows a part of the triangle before the player.  In Figure 1 4C, The player is in the same location and bends the elbow to bring the portable device closer to the face. Due to the frustum function as discussed above, The player sees a larger part of the wall. Figure 14D shows the viewing surface shown in the apparatus of Figure 14C. Because of the view of the frustum, A larger wall section can be seen compared to the front view of Figure 14B. The full triangle is now displayed in the display.  Figure 14E shows the player moving the device down, To see the bottom of the opposite wall, This is shown in Figure 14F. The bottom of the triangle is displayed in the display. In Figure 1 4G, The player turns left and uses "window" to enter the augmented world. To see the corners inside the room, As shown in Figure 1 4H.  Figure 15 shows the camera before and after use. An embodiment of the viewing frustum is implemented on a portable device. Figure 15 shows the 2D projection of the viewing frustum, And because it is a 2D projection, So seeing the frustum cone is a triangle. The portable device 1 506 includes front and rear facing cameras 1514 and 1512, respectively. The camera 1512 is used to capture images between the players - 21 - 201205121. The camera 1 5 1 4 is used to capture the image of the player holding the device 1 5 06. The face recognition software allows the device software to determine the position of the player's eyes. The effect of the frustum is simulated.  In an embodiment, Seeing that the frustum has vertices, The edge of the rectangular truncated cone extends from the eye through the corner of the display in the handheld device. When the eye is at location 1 5 02, The player "sees" the area facing the wall of the device 1 5 1 0. a line starting from the eye, And touching the corner of the display and crossing the wall, To define the area 1 5 1 0. When the eye moves to location 1 5 04, The result is a change in the line that originates in the eye. New line definition area 1508 » Summary, If the portable device 1 506 is kept fixed, Then, when the position of the eye changes, the content displayed on the display changes. of course, If the portable device moves, The view will also be because when the edges of the cone intersect at the corner of the display, Seeing the frustum change, The view will also change.  It should be understood that The embodiment shown in Figure 15 is an exemplary embodiment of the viewing frustum. Other embodiments may also utilize different shapes on the viewing frustum. And can zoom the view of the frustum, Or you can increase the boundary to see the frustum. The embodiment shown in Figure 15 will therefore not be construed as limiting or limiting. Rather, it is exemplary or exemplary.  Figures 16-16 illustrate the effect of changing the viewing frustum as the player moves in accordance with an embodiment. Figure 16A shows a display 1 606 included in a portable device, Wherein the surface of the display is parallel to the surface of the wall. When the player views through the display as a viewing frustum, Create a rectangular frustum, It has a top in the face of the player (e.g., between the eyes) having a corner based on the wall and extending from the edge to the eye and contacting the display 16 06-22-201205121.  When the player is at location 1 6 0 2, Then see the frustum to create a 1610, It is what the player sees on the display 1606. When moving to location 1 604 to move the display, See the frustum body knot. The new base for the truncation is a rectangle 1 608, It is seen on the display. The result is a change in the virtual reality that results from a change in the player's location.  Figure 16B illustrates when using the viewing frustum, The zoom effect that is established when the face moves closer to the display. When the player is at the ground, Player sees rectangle 1 63 8, As mentioned earlier. If the player moves 1 636 to location 1 632 without moving the display, Then see the new display corresponding to 1 640. therefore, When the player is away, The virtual world area will shrink as the viewing area in the display becomes smaller and the objects in the area become larger. And cause amplification. The opposite action when the player moves the display 1 63 6 will cause the opposite reduction.  Figure 17 illustrates how a virtual camera virtual scene can be used in accordance with an embodiment. Virtual or augmented reality is not necessarily limited to indoor limits, As in Figure 11 of the racing game, the virtual world beyond the physical boundary of the player can also be simulated. The picture shows a player watching a virtual concert. The actual stage is tied to the wall of the room to simulate a few hundred feet away from the portable device. In this case it is virtual. The viewing frustum can also be simulated in the same way.  As seen underneath, Different camera locations and viewing angles create different viewing surfaces. E.g, The first location is focused on the chorus rectangle base player change 1 606 , Remove from the face or Ιέ 16 3 2 From the view of the rectangle, this view is closer to the space to be seen by the player.  17 cases and cameras can be displayed on the display, No. -23- 201205121 Second position in the lead singer, And the third position is in the audience. The virtual camera can also be added to the zoom input' to zoom in and out like a real camera.  In one embodiment, the & zoom is used to navigate through the virtual reality. For example, 'If the player moves forward, The portable device will be like a virtual view of the player who has moved forward. With this, Players can guide the virtual world, It is larger than the room in which the player is located. In another embodiment, the player can enter commands to move the camera within the virtual reality without actually moving the portable device. Because the portable device is synchronized with respect to a reference point, So this movement of the camera does not have to be moved by the player. Has the effect of changing the reference point to a new position. This new reference point can be called a virtual reference point. and, It does not have to be in the actual physical space where the player is located. E.g, In the scenario shown in Figure 17, Players can use the “forward” command to move the camera backstage. Once the player is "on", the player can start moving the portable device to explore the backstage. As previously stated.  Figures 1AA-8H show sequential view planes illustrating the effect of viewing frustums in accordance with an embodiment. Figure 1 8 A shows the player holding the portable device. The viewing surface on the display corresponds to the image of the forest shown in Figure 18B. In Figure 18 C, The player moves his head to the right, Simultaneously, The portable device is maintained at approximately the same location as Figure 18A. Figure 18D corresponds to the face of the player in Figure 18C, It also shows a perspective view of the change in the forest due to the view of the frustum.  In Figure 18 E, The player continues to make his head to the right, Simultaneously moving the portable device to the left to emphasize the role of the frustum. Because the player wants to know -24- 201205121 Whether there is something behind the tree. Figure 18F shows the display corresponding to Figure 18E. The perspective of the forest changed again. In one of the tree books - the elf, As shown in Figure 1 8B, But a part of the elf can be seen in Figure 18F as a forest perspective. Figure 18G shows the player head to the right and moves the portable device further away from the left. As seen in the circle, This function is what the player can see after the tree. And the whole elf.  19A-19B illustrate an embodiment in which the viewing frustum is combined and photographed. The combined view of the frustum and the camera is not possible for the behavior of creating a virtual view. The rule is to define when to use one role and when to use another. In an embodiment, The camera function is used when the player device is used. And the viewing frustum system is used when the player moves the head relative to the device. When two events occur simultaneously, One work, For example, see the frustum.  This combination represents a given point of view of the eye and the portable device. How the eye and camera arrive at the location, There may be some, for example, When eye 1 902 sees through device 1 906, The different views are shown in Figures 19A and 19B. As discussed below.  Referring to Figure 19A, Eye 1 902 is originally seen through the device. The device "aligns" straight into the virtual resulting in an alpha angle that begins at the top of the viewing frustum cone, And the machine angle / 3. Using the same 2D player as described above with reference to Figures 10 and 15 at its first location, On the wall, the player in the segment 1908 is seen behind the player and the player changes further. 1 8 Η Can see the effect of the camera can be recognized, When there are, The group of mobiles carried in the portable type is selected, Depends on the same view.  Virtual reality 1904° makes the real world. This causes photographic representation,  . The player turns the device 7 angle after -25- 201205121, So that the device is at location 1 906. Because the player has moved the device, and so, The portable device responds to the movement of the camera, Make the virtual camera also turn the corner. The result is the area 1910 where the display now shows the wall.  Figure 9B shows a player, The starting eye location of the portable device 1906 is seen 1 9 1 2 . The viewing frustum system is used and the result is outside the display area 1 9 1 8 . The player then moves to the eye location 1 902, Do not move the portable device. Because the device has not moved, So the view of the interception head occurs and the player then sees the area 1916 on the display. It should be noted that Although eye 1 902 and display 1 906 are the same locations in Figures 19A and 19B, However, the actual viewing surface causes the eyes and the display to be at the location because of the sequence of events.  Figure 20 shows a flow chart 2000 of an algorithm, A view surface for controlling a virtual scene having a portable device in accordance with an embodiment of the present invention. In operation 2 002, The signal is received, To synchronize the portable device, E.g,  Button press or screen contact. In operation 2 004, Method of synchronizing the portable device, The position at which the portable device is located is located at a reference point in a three-dimensional (3D) space. In an embodiment, The 3D space is the room where the player is located. In another embodiment, The virtual reality consists of a virtual space that extends indoors and beyond the wall of the room.  In operation 2006, A virtual scene is generated near the reference point in the 3D space. The virtual scene contains virtual reality components. E.g, The Chessboard of Figure 2 is in Operation 2008, The portable device determines the current location of the portable device relative to the reference point in the 3D space. The visual aspect of the virtual scene -26- 201205121 was established in operation 2010. The viewing surface represents the location of the portable device, And the scene as seen from the perspective of the current location of the portable device. Furthermore, During operation 201 2, The created viewing surface is on the display of the portable device. In operation 2014, Portable Check if the portable device has been moved by the user. That is, the current situation has changed. If the portable device is moved, Then the method flow returns to 2008, To recycle the current location. If the portable device is not, The portable device then continues to display the pre-view surface by flowing to operation 20 1 2 .  Figure 2 illustrates an apparatus that can be used to implement embodiments of the present invention. The portable device is a computing device and includes a typical module that appears in the computing device. Such as a processor, Memory (RAM, ROM, etc. Battery or other power source, And permanent storage (such as a hard drive). Communication allows portable devices and other portable devices, Other computers, Servo and so on exchange information. The communication module includes a universal serial bus (USB) device, Communication link (such as Ethernet), Ultrasonic communication, Bluetooth WiFi.  The input module includes an input button and a sensor, microphone, Touch, Camera (forward, backward, Depth camera), And card reader.  Input/output device, For example, a keyboard or mouse can also be connected to the portable device via a communication link such as USB or Bluetooth. Output module package display (with touch screen), Light-emitting diode (LED), Vibration feedback, And speakers. Other output devices can also be communicatively coupled to the portable device.  The current virtual display device is located in the pre-mobile architecture.  Modulators, etc. And other screens,  With one touch combination -27- 201205121 Information from different devices can be the location of the portable device used by the location module. These modules contain magnetometers, Acceleration screw, GPS, And the compass. In addition, The location module can analyze the sound or image data captured by the microphone. To calculate the location, The location module can perform the test. To determine the location of other devices in the vicinity of the portable device, For example, WiFi network testing or testing.  The virtual reality generator establishes the virtual or augmented reality, As stated, Use the location calculated for the location module. a visual generator virtual reality and the location, Produces a view that is displayed on the display.  The generator can also generate sounds that are initiated by the virtual reality generator.  Use the direction of the speaker system at most" It should be understood that The embodiment shown in Figure 21 is an exemplary embodiment of carrying. Other embodiments may also utilize different modules,  Group or specify related work to different modules. The actual conditions shown in Figure 21 should not be construed as exclusion or limitation. It is used for illustration and display.  Figure 22 is a diagram showing an exemplary display of scene A to scene E user A to user E interacting with game client 1102 via an internet server in accordance with an embodiment of the present invention. The game client system 'allows the user to connect to the server application via the internet. Game customers allow users to access and play online entertainment content.  Not limited to games, The movie 'Music and Pictures. In addition, Gamers to provide access to online communication applications, Such as VOIP,  Talk about agreements and emails.  Calculate the calculation, Tuo camera. In addition, the location or the ultrasonic wave has previously been used according to the visual aspect of the device. The second instance is connected to the relevant device for processing and processing, for example, but the user can also text -28 - 201205121 Game customer interaction. In some implementations, The controller is a specific game client controller, In other embodiments, the controller can be a combination of a keyboard and a mouse. In an embodiment, The game is a separate device, It can output audio and video signals. Establish a multimedia environment through surveillance/TV and related audio equipment. E.g, Games can be, but are not limited to, thin customers, Internal PCI-express card, External express card, ExpressCard device, internal, external, Or wireless device, Firewire devices and more. In other embodiments, Gamers integrate with TV or other multimedia devices, Such as DVR, Blu-ray broadcast, DVD player or multi-frequency receiver.  In scene A of Fig. 22, User A interacts with the guest program displayed on monitor 1〇6 using controller 100 paired with game 1102A. Similarly, In scene B, User B uses the client 1102B paired controller 100 to interact with another client application displayed on the monitor 106. Scene C illustrates the list of friends from the user C to the monitor displaying the game and the game client 1 1 02C. Although Figure 22 shows a single server processing module, But in reality, There are many server processing modules throughout the world. Each servo module includes user chat control, Sharing/communication logic,  Ground location, And the secondary module of the load balancing processing service. Furthermore,  The processor processing module includes network processing and distributed storage.  When the game client 11 〇 2 is connected to the server processing module, The usage control can be used to authenticate the user. Authorized users can be associated with virtualized decentralized storage and virtualized network processing. Can be stored in the case, Control client video client PCI-USB household device client should use the server to view the application of the server. 29-201205121 is part of the user's virtual distributed storage The illustrated project contains the purchase media, For example but not limited to games, Film and music, etc. In addition, the 'decentralized storage can be used to store the game state of most games, Custom settings for individual games, And general settings for game customers. In an embodiment, The user's ground location module processed by the server is used to determine the geographic location of the user and its individual game customers. The user's geographic location can be used for sharing/communication logic and load balancing processing services. In order to process the module's geographic location and processing requirements based on most servers, And optimize performance. Virtualization of one or both of network processing and network storage will allow processing from game customers to be dynamically moved to underutilized server processing modules. therefore,  Load balancing can be used to minimize the latency associated with calls from the storage and data transfer between the servo processing module and the gaming client.  The server processing module has an example of a server application A and a server application B. The server processing module can support most server applications. Like server application X, And the server application X2 is indicated. In an embodiment, The server processing is based on the cluster computing architecture. It allows most processors in a cluster to process server applications. In another embodiment, Different types of multi-computer processing schemes are applied to process the server application. This allows the server processing to be scaled, To allow a large number of game customers to execute most client applications and corresponding server applications. or, Server processing can be scaled, To allow for more demand graphics processing or games, The increased computational requirements necessary for image compression or application complexity. In one embodiment, the server processing module performs the main processing via the server application. This allows for relatively expensive components, For example, the graphics area -30- 201205121, RAM, And general processors are centrally located and reduce the cost of game customers. The processing server application data is sent back to the corresponding game client via the Internet. To be displayed on the monitor.  Scenario C illustrates an exemplary application that can be executed by a game client and a server processing module. E.g, In an embodiment, Game client 1 1 0 2 C allows user C to create and view the partner list U20, It contains user A, User B, User D, And user E. As shown, In scenario C, User C can go to the live image or the avatar of the individual user on monitor 1 〇 6C. The server processes the individual application executing the game client 1102C and the user A having the individual game client 1102, User B, User D and User E. Because the server handles that the application is being executed for GameB, And the list of buddies for user A can indicate which game user B is playing. Furthermore, In an embodiment, User A can directly view the actual video of the game video game by user B. In addition to the game customer B, This is only given to the game client A by sending the processing server application profile to the user B.  In addition to being able to see images by partners, The communication application can also be instant messaging between partners. As applied to the previous example, This allows User A to provide encouragement or hinting while viewing User B's live image. In an embodiment, Two-way instant voice communication is done through the client/server application. In another embodiment, The client/server application completes the text conversation. In another embodiment, The client/server application converts the voice into text on the partner screen.  Scene D and Scene E illustrate that individual user D and user E interact with game platforms 1110D and 11I0E, respectively -31 - 201205121. Each game platform 1110D and 1110E is connected to the server processing module and displays a network. The servo processing module coordinates the game with the game platform and the game client.  Figure 23 shows an embodiment of an information service provider architecture. The Information Services Provider (ISP) 25 0 delivers a number of information services to users 262 that are geographically dispersed and connected via network 266. ISPs can only deliver one type of service. Such as stock price updates, Or various services, Such as broadcast media, news, motion, Games and more. In addition, The services provided for each ISP are dynamic, which is, Services can be added or removed at any point in time.  therefore, An ISP that provides a particular type of service to a particular individual can change over time. E.g, The user can serve an ISP close to the user. When the user is near the home, And when users travel to different cities, then users can serve different ISPs. The home ISP will transmit the required information and materials to the new I S P. Enable user information to "follow" users to new cities, Make the data closer to the user and easy to access. In another embodiment, the master-slave relationship can be established between a master ISP and the slave ISP. The primary ISP manages the user's information, And under the control of the main ISP, Make an interface directly from the ISP to the user. In other embodiments, When customers move around the world, The data is transmitted from one ISP to another ISP. In order for the ISP at a preferred location to deliver these services to serve the user.  ISP2 505 includes an Application Service Provider (ASP) 252, It provides computers to the customer through the Internet. Software provided using the ASP model is sometimes referred to as on-demand software or service software (SaaS). The simple form of access to a particular application (such as customer relationship management) is -32- 201205121 by using a standard protocol such as HTTP. Apply the software to the vendor's system and use the special purpose client software provided by the vendor or other remote interface such as a thin client. For users to access through the use of HTML web browser.  Service delivery often uses cloud computing over a wide geographic area. The cloud is calculated as a calculated format, among them, Dynamically scalable and often virtualized resources are provided as services on the Internet. In the "cloud" that supports them, Users are not necessarily experts in the technology infrastructure. Cloud computing can be broken down into different services. For example, infrastructure as a service (IaaS), Platform as a service (PaaS), And software as a service (SaaS). Cloud computing services often provide online common business applications. It is accessed by a web browser. The software and data are stored on the server. The term "cloud" is used as a metaphor for the Internet. A summary of the complex infrastructure based on how the Internet is portrayed and hidden from the computer network map.  Furthermore, ISP 25 0 includes Game Processing Server (GPS) 254, It is used by game customers to play single or multiplayer video games. Most video games played on the Internet operate via a connection to the game server. Typically, The game uses a dedicated server application. It collects information from players and distributes them to other players. This is more efficient and effective in a peer-to-peer configuration. However, a separate server is required to host the server application. In another embodiment, GPS is built on the communication between players and individual game games. Instead of relying on centralized GPS.  The dedicated GPS is the server. It is executed independently of the customer. These servers are usually executed on dedicated hardware located in the data center. Provide more bandwidth -33- 201205121 and dedicated processing power. The game console of the dedicated multi-player game is executed by a dedicated server. Allow them to control and broadcast the processing server (listeners. Broadcast to a very narrow range, how is the last foot signal system? It can be distributed over cable or cable networks. The Internet can also have multicast. To allow letters to be restricted to geographical areas, Such as the rapid spread of the Internet, Broadcasting can almost reach the world's storage service provider (Off Management Service) SSP is also available as a storage device. Make . Another major advantage is the SSP barrier. Users will not lose all or part of the user access data. No relevant information. E.g, User can be while the user is moving, The communication provider 260 is the preferred method for the supervisor to use for the majority. A lot of play is often software new content with the name of the game.  BPS) 256 listeners who distribute audio or video are sometimes referred to as narrowcasts. Wide reach the audience or the audience, When no in the air to the antenna or receiver station (or no cable line) via electricity, the radio or TV gives the reception number and bandwidth shared. Historically national or regional broadcasts. however,  Not defined by the geographical area,  He is a country.  SSP) 25 8 provides computer storage and also provides periodic backup and archiving. Users can order more backup services as needed. If it is electricity, it has all the information. Furthermore, a copy of most of the information, Allow the user where the user is located or the device is accessing the personal profile on the personal mobile phone on the home computer.  For connectivity to such users.  PC-main line of the main line of the company's upstream signal to the broadcast line station, Or can be a table or a direct device, Special, The broadcasting department follows the Internet because of the content space and phase. By adding a storage hard disk, S S P can be used to access files efficiently. And one type -34- 201205121 The communication provider is an Internet Service Provider (ISP), It provides access to the Internet. The ISP uses a block of data that is suitable for transporting Internet Protocol. Such as dial-up, DSL, Wired modem, Wireless or dedicated high-speed interconnected data transmission technology, To connect with their customers. The communication provider can also provide a messaging service. Such as email, Instant messaging and SMS text. Another type of communication provider is the Network Service Provider (NSP). By providing direct access to the Internet, Selling bandwidth or network access. Internet service providers can be made up of telecommunications companies, Data carrier, Wireless communication provider, Internet service provider, Cable TV operators that provide high-speed Internet access, and more.  The data switch 286 interconnects several modules within the ISP 25 0 and connects the modules to the user 262 via the network 266. The data exchanger 268 can cover a small area, The modules of all ISP250s are adjacent to each other. Or when different modules are geographically dispersed, Can cover large geographic areas. E.g, The data switch 268 can contain a fast one billion bit Ethernet (or faster) in the cabinet of the data center. Or intercontinental virtual area network (VLAN).  The user 262 receives the remote service from the client device 264. The client device 2 64 contains at least a CPU, Display and I/O. The client device can be a PC, mobile phone, Small notebook, PDA and more. In an embodiment,  ISP2 50 recognizes the type of device used by the customer and adjusts the communication method used. In other examples, Client devices use standard communication methods, For example, html takes ISP250.  Information Eye Provider (IS P) 25 0 delivers most of the information services to the local area -35 - 201205121 Most users 262 that are geographically dispersed and connected via network 266.  ISPs can only deliver one type of service. Such as stock price updates, Or various services, Such as broadcast media, news, motion, Games and more. In addition, The services provided for each IS P are dynamic, That is, the service can be added or removed at any point in time. therefore, The I s P that provides a particular type of service to a particular individual can change over time. E.g, When the user is living in the city, The user can serve the IS P near the user. When users travel to different cities, Users can serve different ISPs. Your own ISP will send the required information and information to the new ISP. Enable user information to "follow" users to new cities, Make the data closer to the user and easier to access. In another embodiment, A master-slave relationship can be established between the primary ISP and the secondary ISP. The primary ISP manages the user's information, And from the ISP under the control of the main ISP, Directly interface with the user. In another embodiment, When customers move around the world, The data is transmitted from one ISP to another ISP. In order to enable the ISP to serve the user in a better location, The ISP is to deliver these services.  ISP 250 includes an Application Service Provider (ASP) 252, It provides computer-based services to customers on the Internet. Software provided using the ASP model, sometimes referred to as on-demand software or service software (SaaS) » provides a simple form of application-specific (such as customer relationship management) by using standard protocols such as HTTP. Applying software to the vendor’s system,  The special purpose client software provided by the vendor or the other remote interface such as the thin client is accessed by the user through the web browser. -36- 201205121 Services transmitted over a wide geographical area often use cloud computing. The cloud is calculated as a calculated format. among them, Dynamically scalable and often virtualized resources are provided as services on the Internet. In the "cloud" that supports them, Users are not necessarily experts in the technology infrastructure. Cloud computing can be broken down into different services. For example, infrastructure as a service ( IaaS ), Platform as a service (PaaS), And software as a service (SaaS). Cloud computing services often provide online common business applications. It is accessed by a web browser. The software and data are stored on the server. The term "cloud" is used as a metaphor for the Internet. A summary of the complex infrastructure based on how the Internet is portrayed and hidden from the computer network map.  Furthermore, ISP 2 50 includes a Game Processing Server (GPS) 254, It is used by game customers to play single or multiplayer video games. Most video games played on the Internet operate via a connection to the game server. Typically, The game uses a dedicated server application. It collects information from players and distributes them to other players. This is more efficient and effective in a peer-to-peer configuration. However, a separate server is required to host the server application. In another embodiment, GPS is established between the player and the individual game play exchange information. Instead of relying on centralized GPS.  The dedicated GPS is the server. It is executed independently of the customer. These servers are usually executed on dedicated hardware located in the data center. Provide more bandwidth and dedicated processing power. The dedicated server is the preferred method for hosting game servers for most PC-based multiplayer games. A large number of multi-player line upstream games are executed on a dedicated server, usually hosted by a software company with a game name. Allow them to control and update content.  -37- 201205121 Broadcast Processing Server (BPS) 25 6 Assign audio or video signals to the listener. Broadcasting to a narrow range of listeners is sometimes referred to as narrowcasting. How the last foot signal of the broadcast distribution reaches the audience or the audience, When it is a radio or television station, it can be scattered in the air to the antenna or receiver. Or you can bring a radio or TV to the receiver via cable or cable radio (or no cable) via radio or directly over the Internet. Especially with multicast, Allow the signal and bandwidth to be shared. in history, Broadcasting is limited to geographical areas. For example, national borders or regional broadcasts. however, With the rapid spread of the Internet, The broadcast is not defined by the geographical area. Because the content can reach almost any country in the world.  The Storage Service Provider (SSP) 25 8 provides computer storage and related management services. SSP also provides periodic backup and archiving. By providing storage as a service, Users can order more storage as needed. Another major advantage is that the SSP includes a backup service. If its computer hard drive is faulty, Users will not lose all of their information. Furthermore, Most SSPs can have copies of all or part of the user's data. Allow users to access data in an efficient manner. It does not matter where the user is located or is using the device to access the data. E.g, Users can access personal files on their home computers.  And when the user is moving, Personal profile in the mobile phone.  Communication provider 260 provides connectivity to such users. One type of communication provider is an Internet Service Provider (ISP). It provides access to the Internet. The ISP uses a block of data that is suitable for transporting Internet Protocol. Such as dialing, DSL, Wired modem, Wireless or dedicated high-speed interconnected data transmission technology, To connect with their customers. The communication provider can also provide -38- 201205121 signaling service. Such as email, Instant messaging and SMS text. Another type of communication provider is the Network Service Provider (NSP). By providing direct access to the Internet, And selling bandwidth or network access.  Internet service providers can be made up of telecommunications companies, Data carrier, Wireless communication provider, Internet service provider, Cable TV operators that provide high-speed Internet access, and more.  The data switch 268 interconnects several modules within the ISP 250 and connects the modules to the user 262 via the network 266. The data exchanger 2 6 8 can cover a small area, All of the I S P 2 50 modules are adjacent to each other' or when different modules are geographically dispersed. Can cover large geographic areas. For example, the 'Data Exchanger 268 can contain a fast billion-bit Ethernet (or faster) in the cabinet of the data center. Or intercontinental virtual area network (VLAN).  The user 262 receives the remote service from the client device 264. The client device 264 includes at least a CPU, Display and I/O. The client device can be a PC, mobile phone, Small notebook 'pda and so on. In an embodiment,  ISP25 0 recognizes the type of device used by the customer and adjusts the communication method used. In other examples, Client devices use standard communication methods, For example, html takes ISP2 5 0.  The embodiments of the present invention can be implemented in a variety of computer system architectures. Contains a handheld device, Microprocessor system 'microprocessor-based or programmable consumer electronics, Mini computer, Host computer and the like. The present invention can also be practiced in a distributed computing environment where the work is done by a remote processing device that is linked together by a network.  -39- 201205121 Remember the above embodiment, It should be understood that The present invention enables computer-implemented operations involving storage in a computer system. These require entities to calculate the entity quantity. Any of the operations described herein form part and are used for machine operation. The present invention is directed to performing such operations and equipment. The device can be specially constructed for the required purpose, Example Computer. When defined as a special purpose computer, When the operation can still be used, the computer can perform other processing. Program execution or no purpose routine. Or the operation can be stored in the computer memory for the brain of the general purpose computer. Move or configure in the cache or through the network - or more computer programs. When data is transmitted through the Internet, the data can be processed by other computers on the network. E.g,  Computed by the computing resources.  Embodiments of the invention may be defined as machines, It converts the data from another state. The converted data can be stored in the memory and calculated by the processor. The processor therefore converts the data from one thing to another. Furthermore, The methods can also be one or more machines or processors, These are connected through the network. Each machine can convert data or things to another state or thing, And can also process the storage data to the storage, Transfer data on the web, Show results,  The result goes to another machine.  One or more embodiments of the invention may also be fabricated as computer code on a computer readable medium. The computer readable medium can be any storage device. It can store data, Examples of brain-readable media that are subsequently read by computer systems include hard disk drives, Network attached storage

用各種 運算爲 本發明 算的裝 如特殊 特殊目 是特殊 ,該電 取得之 取得時 雲端的 一狀態 後爲處 一事物 所處理 由一狀 料、儲 或傳送 可讀取 一資料 取。電 (NAS -40- 201205121 )、唯讀記億體、隨機存取記憶體、CD-ROM、CD-R、 CD-RW、磁帶及其他光學及非光學資料儲存裝置。電腦可 讀取媒體可以包含分散在網路耦接之電腦系統的電腦可讀 取實體媒體’使得電腦可讀取碼係被以分散方式儲存及執 行。 雖然方法運算係以特定順序加以描述,但應了解的是 ,其他管家操作也可以在運算之間執行,或者運算可以被 調整’使得它們發生於略微不同的時間,或者可以分散於 一系統中,其允許處理運算的發生在有關於該處理的不同 時間期間’只要該重疊操作的處理係以想要方式執行即可 〇 雖然本發明已爲了了解清楚的目的而加以詳細描述, 但可以得知,某些改變及變化可以在隨附申請專利範圍的 範圍內加以實現。因此,本實施係被認爲是例示性非限制 性’及本發明並不限於此所給定之細節,並可以在隨附之 申請專利範圍的範圍與等效範圍內加以修改。 【圖式簡單說明】 本發明可以參考附圖及以下說明而了解。 圖1描繪一使用者將攜帶式裝置依據本發明同步化至 空間中之一參考點之前。 圖2例示以攜帶式裝置觀察的虛擬實境場景。 圖3例示依據一實施例之具有虛擬棋盤及混合玩家之 手部的擴增實境棋賽。 -41 - 201205121 圖4描繪依據一實施例之多玩家虛擬實境遊戲。 圖5例示用於多玩家環境之校正方法的實施例。 圖6例示依據一實施例如何在網路連接上玩互動遊戲 〇 圖7顯示無關於攜帶式裝置的位置互動遊戲。 圖8顯示一互動遊戲,其中依據一實施例之相關於攜 帶式裝置的位置之顯示器的視面。 圖9顯示依據一實施例,攜帶式裝置如何移動以具有 如在顯示器上之類似作用,以如同在虛擬空間中移動攝影 機。 圖1〇顯示依據一實施例之當旋轉攜帶式裝置時,示 於顯示器中之影像的二維變化代表圖。 圖11顯示依據一實施例之玩VR遊戲之攜帶式裝置 〇 圖1 2 A-1 2F例示依據一實施例,攜帶式裝置的位置係 如何影II在顯示器中之視面。 圖13A-13B顥示依據一實施例之在遠端位置中之使 用者間之所玩的擴增實境遊戲。 圖14A-14H描繪依據一實施例之當攜帶式裝置改變 位置時之顯示器中之變化。 _ 1 5顯示使用前及背面攝影機之攜帶式裝置的視見 平截頭體實施例。 圖1 6 A -1 6 B例示當玩家依據一實施例移動時,視見 平截頭體的變化作用。 -42- 201205121 圖1 7例示如何使用虛擬攝影機,以依據一實施例分 開虛擬場景的視面。 圖18A-18H顯示依據一實施例之例示視見平截頭體 作用之連續視面。 圖1 9A-1 9B例示組合視見平截頭體作用與攝影作用 之實施例。 圖20顯示依據本發明一實施例之控制虛擬場景與攜 帶式裝置的演算法流程圖。 圖2 1例示可以用以實施本發明實施例之裝置的架構 〇 圖22爲依據本發明一實施例之場景A至場景E之具 有個別使用者A至使用者E與遊戲客戶1102互動之例示 示意圖,該等遊戲用戶係經由網際網路連接至伺服器處理 〇 圖23爲資訊服務提供者架構的實施例。 【主要元件符號說明】 102 :使用者 104 :攜帶式裝置 1 0 6 :參考點 108 :虛擬實境 302 :攜帶式裝置 304 :螢幕 306 :手部 43- 201205121 402A-C :玩家 404 :共同虛擬場景 502:共同參考點 504A-D :玩家 702A-C :玩家 704A-C :裝置 7 0 6 :參考點 7 0 8 :曲棍球場 7 1 0 :圓碟 712 :圓碟 802A-B :裝置 9 0 2 :車輛 1 52 :裝置 1 5 4 :地點 1 5 6 :地點 1 5 8 :地點 1 6 0 :投影 162 :投影 1 3 02 :攝影機 1 304 :攝影機 1 3 06 :虛擬棋盤 1308 :玩家 1 3 1 0 :背景 1 5 0 2 :地點 201205121 1 5 0 4 ··地點 1 506 :攜帶式裝置 1 5 0 8 ·區域 1 5 1 0 :區域 1 5 1 2 :攝影機 1 5 1 4 :攝影機 1 6 0 2 :地點 1 6 04 :地點 1 6 0 6 :顯示器 1 608 :矩形 1 6 1 0 :矩形基部 1 6 3 2 :地點 1 6 3 6 :顯示器 1 6 3 8 :矩形 1 6 4 0 :矩形 1 9 0 2 :眼睛 1 904:裝置 1 906 :地點 1 9 0 8 :區段 1910:區域 1 9 1 2 :地點 1916:區域 19 18 ·區域 1 〇 〇 :控制器 -45 201205121 1 102A-C :遊戲客戶 1 1 2 0 :夥伴名單 1 110D,E :平台 2 5 0 :資訊服務提供者 266 :網路 252 :應用服務提供者 254 :遊戲處理伺服器 2 5 6 :廣播處理伺服器 25 8 :儲存服務提供者 260 :通訊提供者 2 6 8 :資料交換 262 :使用者 264 :客戶裝置It is special to use the various calculations for the present invention as a special special purpose. When the acquisition of the electricity is obtained, a state of the cloud is processed by a thing, and a data can be read from a material, a storage or a transmission. Electric (NAS-40-201205121), reading only billions, random access memory, CD-ROM, CD-R, CD-RW, magnetic tape and other optical and non-optical data storage devices. The computer readable medium can include computer readable physical media dispersed throughout the network coupled computer system' such that the computer readable code system is stored and executed in a decentralized manner. Although method operations are described in a particular order, it should be understood that other housekeeper operations can also be performed between operations, or operations can be adjusted 'so that they occur at slightly different times, or can be spread across a system. It allows processing operations to occur during different times with respect to the processing as long as the processing of the overlapping operations is performed in a desired manner, although the invention has been described in detail for the purposes of clarity, it will be appreciated that Certain changes and modifications can be made within the scope of the appended claims. Therefore, the present invention is to be considered as illustrative and not limiting, and the invention is not limited to the details of the invention, and may be modified within the scope and equivalents of the appended claims. BRIEF DESCRIPTION OF THE DRAWINGS The invention can be understood by reference to the drawings and the following description. Figure 1 depicts a user prior to synchronizing a portable device to a reference point in space in accordance with the present invention. Figure 2 illustrates a virtual reality scene viewed with a portable device. Figure 3 illustrates an augmented reality chess game with a virtual board and a mixed player's hand in accordance with an embodiment. -41 - 201205121 Figure 4 depicts a multiplayer virtual reality game in accordance with an embodiment. FIG. 5 illustrates an embodiment of a correction method for a multi-player environment. Figure 6 illustrates how to play an interactive game over a network connection in accordance with an embodiment. Figure 7 shows a location interactive game that is not related to a portable device. Figure 8 shows an interactive game in which the viewing surface of the display relating to the position of the portable device is in accordance with an embodiment. Figure 9 shows how a portable device can be moved to have a similar effect as on a display, as in a virtual space, in accordance with an embodiment. 1A shows a two-dimensional variation representation of an image displayed in a display when the portable device is rotated in accordance with an embodiment. Figure 11 shows a portable device for playing a VR game in accordance with an embodiment. Figure 1 2 A-1 2F illustrates how the position of the portable device is in the view of the display in accordance with an embodiment. Figures 13A-13B illustrate an augmented reality game played between users in a remote location in accordance with an embodiment. Figures 14A-14H depict changes in the display when the portable device changes position in accordance with an embodiment. _ 1 5 shows the view of the portable device using the front and rear cameras. Figure 1 6 A - 1 6 B illustrates the effect of viewing the frustum as the player moves in accordance with an embodiment. -42- 201205121 Figure 1 7 illustrates how a virtual camera can be used to separate the view of a virtual scene in accordance with an embodiment. Figures 18A-18H illustrate a continuous view of a view of a frustum in accordance with an embodiment. Figures 1A-1-9B illustrate an embodiment of a combined view frustum action and photographic action. Figure 20 is a flow chart showing the algorithm for controlling a virtual scene and a portable device in accordance with an embodiment of the present invention. FIG. 2 is a schematic diagram of an apparatus that can be used to implement the apparatus of the embodiment of the present invention. FIG. 22 is a schematic diagram of interaction between an individual user A and a user E and a game client 1102 according to an embodiment A to a scene E according to an embodiment of the invention. The game users are connected to the server via the Internet. Figure 23 is an embodiment of the information service provider architecture. [Main component symbol description] 102: User 104: Portable device 1 0 6 : Reference point 108: Virtual reality 302: Portable device 304: Screen 306: Hand 43-201205121 402A-C: Player 404: Common virtual Scenario 502: Common Reference Point 504A-D: Player 702A-C: Player 704A-C: Device 7 0 6: Reference Point 7 0 8 : Hockey Field 7 1 0: Round Disc 712: Round Disc 802A-B: Device 9 0 2: Vehicle 1 52: Device 1 5 4: Location 1 5 6 : Location 1 5 8 : Location 1 6 0 : Projection 162 : Projection 1 3 02 : Camera 1 304 : Camera 1 3 06 : Virtual Board 1308 : Player 1 3 1 0 : Background 1 5 0 2 : Location 201205121 1 5 0 4 · Location 1 506 : Portable device 1 5 0 8 · Area 1 5 1 0 : Area 1 5 1 2 : Camera 1 5 1 4 : Camera 1 6 0 2 : Location 1 6 04 : Location 1 6 0 6 : Display 1 608 : Rectangular 1 6 1 0 : Rectangular base 1 6 3 2 : Location 1 6 3 6 : Display 1 6 3 8 : Rectangular 1 6 4 0 : Rectangular 1 9 0 2 : Eye 1 904: Device 1 906: Location 1 9 0 8 : Section 1910: Area 1 9 1 2 : Location 1916: Area 19 18 · Area 1 〇〇: Controller - 45 201205121 1 102A-C : Game Customers 1 1 2 0 : buddy list 1 110D, E: platform 2 5 0 : information service provider 266 : network 252 : application service provider 254 : game processing server 2 5 6 : broadcast processing server 25 8 : storage service provision 260: Communication Provider 2 6 8 : Data Exchange 262: User 264: Client Device

Claims (1)

201205121 七、申請專利範圍: 1 · 一種用以控制虛擬場景與攜帶式裝置的視面的方 法,該方法包含: 接收信號’以同步化該攜帶式裝置; 同步化該攜帶式裝置,以使該攜帶式裝置所在之位置 爲三維(3 D )空間中之參考點; 在該3D空間中在該參考點旁產生虛擬場景,該虛擬 場景包含虛擬實境元件; 相對於該參考點,決定該攜帶式裝置在該3D空間中 之現行地點; 建立該虛擬場景的視面,其中該視面代表由該攜帶式 裝置的該現行地點並根據該攜帶式裝置的該現行地點的視 角看到之該虛擬場景; 在該攜帶式裝置中,顯示該建立視面;及 當該攜帶式裝置係爲在該3D空間中之使用者所移動 時’改變該虛擬場景的該顯示視面。 2.如申請專利範圍第1項所述之方法,其中建立該 虛擬場景的視面更包含: 以在該攜帶式裝置中之攝影機,捕捉該3D空間之影 像’該影像包含在該3D空間中之真實元件;及 混合該真實元件與該虛擬實境元件,其中該虛擬元件 出現在該顯示視面,如同該虛擬元件爲該3 D空間的部份 ’其中該虛擬元件的視面由幾何透視以該攜帶式裝置於 3 D空間內移動時該真實元件的視面改變的相同方式改變 -47- 201205121 該等虛擬元件的視面。 3. 如申請專利範圍第2項所述之方法,其中該真實 元件包含玩家的手部,該方法更包含: 檢測該手部佔用在該3D空間中第一虛擬元件所在之 地點;及 在檢測後,促成該手部與該第一虛擬元件的互動,以 模擬該手部正接觸該虛擬元件,其中該手部能操縱該第一 虛擬元件,以改變該第一虛擬元件的位置或特性,如同該 第一虛擬元件爲真實物體般。 4. 如申請專利範圍第3項所述之方法,其中該手部 的互動係由交界、固持、推動、拉動、握持、移動、擊破 、擠壓、敲打、擲敲、打鬥、開、合、導通或關斷、按鈕 、點火、吃所構成的群組中選出之作動於該第一虛擬元件 的動作。 5 .如申請專利範圍第2項所述之方法,其中建立視 面更包含: 依據在該3D空間中之發光狀態及該虛擬場景,將該 手部的陰影加至該虛擬元件之上。 6·如申請專利範圍第2項所述之方法,其中該真實 元件包含桌子,其中混合更包含: 放置虛擬元件在_桌子之上,如同該放置虛擬元件被 放置在該桌子上。 1 如申請專利範圍第2項所述之方法,其中該真實 元件包含在室內中之壁面,其中混合更包含: -48- 201205121 將一顯示加至該壁面,該顯示係爲該等虛擬元件之一 〇 8 ·如申請專利範圍第1項所述之方法,其中該攜帶 式裝置的該現行地點包含該攜帶式裝置的幾何座標及在該 攜帶式裝置中之顯示器的觀看表面的幾何座標。 9 ·如申請專利範圍第8項所述之方法,其中該攜帶 式裝置的該幾何座標等於在該攜帶式裝置中之攝影機的幾 何座標。 10·如申請專利範圍第9項所述之方法,其中該視角 係參考一向量加以界定,該向量具有該顯示器的該觀看表 面中心之起點及垂直於該顯示器的觀看表面的方向。 11·如申請專利範圍第1項所述之方法,其中建立該 虛擬場景的視面更包含: 當該攜帶式裝置移動更接近該第二虛擬實境元件時, 放大第二虛擬實境元件:及 當該攜帶式裝置移動離開該第二虛擬實境元件時,縮 小該第二虛擬實境元件。 12. 如申請專利範圍第1項所述之方法,其中改變該 顯示視面更包含: 當該攜帶式裝置移動時,將影像穩定化加入至該顯示 視面。 13. 如申請專利範圍第1項所述之方法,更包含: 接收輸入以改變視面;及 改變該虛擬場景的該視面的建立,使得該虛擬場景的 -49 - 201205121 該視面係由與該攜帶式裝置的現行地點不同的另一點計算 〇 1 4 ·如申請專利範圍第1 3項所述之方法,其中該虛 擬場景的視面係相對於與該參考點交叉的垂直線旋轉180 度。 15-如申請專利範圍第1項所述之方法,其中該接收 信號係藉由按壓在該攜帶式裝置上的按鈕或藉由碰觸該攜 帶式裝置的觸控顯示器加以產生。 16·如申請專利範圍第1項所述之方法,其中該虛擬 場景的邊界係爲室內的壁面所定義,其中該攜帶式裝置係 當該號被接收時加以定位。 17·如申請專利範圍第1項所述之方法,其中同步化 該攜帶式裝置更包含: 重置該攜帶式裝置中之地點追蹤模組,該地點追蹤模 組係由加速度計、磁力計、GPS裝置、攝影機、深度攝影 機、羅盤或陀螺儀所構成之群組所選出, 其中該現行地點的決定係由來自該地點追蹤模組的資 訊來執行。 18· —種於裝置間共享虛擬場景的方法,該方法包含 同步化第一裝置至三維.(3 D )空間中之參考點; 計算第二裝置相對於該第一裝置的位置的位置; 交換於該第一裝置與該第二裝置間之資訊,以令該第 二裝置同步化在該3D空間中之該參考點,該資訊包括該 -50- 201205121 參考點及該第一與第二裝置的位置; 在該3D空間中在該參考點附近產生虛擬場景,該虛 擬場景爲該兩裝置所共用,該虛擬場景同時改變於該兩裝 置中’當兩裝置與該虛擬場景互動時; 建立該虛擬場景的視面,其係根據該攜帶式裝置的現 行地點的視角,由該第一裝置的現行地點所看到者; 顯示該建立視面於該第一裝置中;及 當該攜帶式裝置移動於該3D空間內時,改變該虛擬 場景的該顯示視面。 1 9 ·如申請專利範圍第1 8項所述之方法,其中計算 該桌一裝置的位置更包括: 收集在該第一裝置與該第二裝置間之第一相對位置資 訊’該收集包括WiFi地點追蹤、音訊三角定位' 或由該 第一裝置中之攝影機取得之影像分析之一或更多; 根據該第一相對地點資訊,決定該第二裝置相關於該 第一裝置的相對位置;及 將該相對地點與該參考點的座標送至該第二裝置。 20-如申請專利範圍第1 8項所述之方法,更包含: 收集在該第一裝置與該第二裝置間之第一相對位置資 訊,該收集包括WiFi地點追蹤、音訊三角定位、或由第 一裝置或第二裝置中之攝影機所取得之影像分析之一或多 者; 由該第二裝置接收第二相對位置資訊; 根據該第一及第二相對資訊,決定該第二裝置相關於 -51 - 201205121 該第一裝置的相對地點;及 將該相對地點與該參考點的座標送至該第二裝置 2 1 .如申請專利範圍第1 8項所述之方法’其中 擬場景包括虛擬棋盤遊戲,其中玩家分別握持該第一 與第二裝置,玩虛擬棋盤遊戲。 22 .如申請專利範圍第1 8項所述之方法,其中 住該第一攜帶式裝置的第一玩家藉由同步於第一虛擬 移動該第一攜帶式裝置,而控制該第一虛擬元件在該 場景中的移動。 2 3 . —種控制虛擬場景與第一裝置的視面的方法 方法包含: 同步化該第一裝置至在第一三維(3D)空間中 一參考點; 在該第一裝置與第二裝置間建立通訊鏈結,該第 置係在該第一 3 D空間外的第二3 D空間中,該第二 係同步化於在該第二3D空間中之第二參考點; 產生包括虛擬實境元件的共同虛擬場景,該共同 場景係可以爲該第一及該第二裝置所觀察,該第一裝 該第一參考點旁建立該共同虛擬場景,該第二裝置在 二參考點旁建立該共同虛擬場景,該兩裝置係與與該 實境元件互動; 決定該第一裝置相對於該參考點在該第一 3D空 之現行地點; 建立該共同虛擬場景的視面,其中該視面代表由 該虛 裝置 握持 元件 虛擬 ,該 之第 二裝 裝置 虛擬 置在 該第 虛擬 間中 該第 -52- 201205121 一裝置的該現行地點並根據該第一裝置的該現行地點的視 角所見的共同虛擬場景; 顯不該建立視面於該第一裝置中;及 當該第一裝置移動於該第一 3D空間內時,改變該共 同虛擬場景的該顯示視面。 24.如申請專利範圔第23項所述之方法,其中該通 訊鏈結包括於該第一裝置與該第二裝置間之網路連接。 25-如申請專利範圍第2 3項所述之方法,更包含: 指定在該第一 3D空間中之虛擬位置給該第二裝置; 自該第二裝置接收對應於該第二裝置與該共同虛擬場 景間之互動的第二裝置互動資訊;及 依據該接收之第二裝置互動資訊與該虛擬位置,改變 該共同虛擬場景的該視面,其中該第二裝置出現在該第一 3D空間中並與該第一裝置互動,如同該第二裝置係實際 出現在該第一3 D空間。 26. 如申請專利範圍第23項所述之方法,其中該共 同虛擬場景的該視面更包含位於接近該第二裝置的第二玩 家的影像’該第二裝置具有攝影機,用以捕捉該第二玩家 的影像。 27. 如申請專利範圍第26項所述之方法,更包含: 週期地更新該第二玩家的影像。 28. 如申請專利範圍第26項所述之方法,更包含: 當該第一玩豕移動時,更新該第二玩家.的影像。 29. 如申請專利範圍第26項所述之方法,其中該虛 -53- 201205121 擬元件包括西洋棋盤及西洋棋子,其中該第一裝置及該第 二裝置係用以藉由操縱西洋棋子來玩西洋棋遊戲。 30. 如申請專利範圍第26項所述之方法,其中該第 一與第二裝置爲該共同虛擬場景之該視面所分別代表成爲 第一物體及第二物體,其中該第一物體的移動匹配於在該 第一 3D空間中該第一裝置的移動,及該第二物體的移動 匹配於在該第二3D空間中該第二裝置的移動。 31. —種控制虛擬場景與攜帶式裝置的視面的方法, 該方法: 同步化攜帶式裝置至該攜帶式裝置所在的三維(3D )空間中之參考點,該攜帶式裝置包含面向該攜帶式裝置 前面的前面攝影機及面向該攜帶式裝置背面的背面攝影機 » 在該3D空間中,在該參考點旁產生虛擬場景,該虛 擬場景包括虛擬實境元件; 決定該攜帶式裝置相對於該參考點在該3D空間內的 現行地點; 建立該虛擬場景的視面,該視面捕捉由握持該攜帶式 裝置的玩家在該3 D空間中之現行眼睛地點所見的該虛擬 場景的代表視面,該捕捉對應於該玩家看穿窗口進入該虛 擬場景所見,在該3 D空間中之窗口埤點等於在該攜帶式 裝置中之顯示器的該3D空間中之地點; 顯示所建立視面於該顯示器中;及 當該攜帶式裝置或該玩家移動於該3D空間內時,改 -54- 201205121 變該虛擬場景的所顯示視面。 32.如申請專利範圍第3 1項所述之方法,其中來自 BU面攝影機的影像係用以決定現行眼睛地點及來自背面攝 影機的影像係用以取得該3 D空間的視面。 3 3.如申請專利範圍第3 1項所述之方法,其中將該 顯示器自玩家的眼睛拉開,使得該視面放大該虛擬場景及 該顯示器拉向接近該玩家的眼睛,使得該視面縮小該虛擬 場景。 34. —種控制場景與攜帶式裝置的方法,該方法包含 接收信號以同步化該攜帶式裝置; 同步化該攜帶式裝置,以使該攜帶式裝置所在之位置 爲三維(3 D )空間中之參考點; 在該參考點旁,在該3D空間內產生虛擬場景,該虛 擬場景包括虛擬實境元件; 當該攜帶式裝置移動離開該參考點時,建立該虛擬場 景的視面,其中該視面代表由該攜帶式裝置的現行地點所 見之該虛擬場景;及 顯示該建立的視面於該攜帶式裝置中;及 當該攜帶式裝置移動於該3D空間中時,改變該虛擬 場景的該顯示視面。 35. 如申請專利範圔第34項所述之方法,更包含: 決定產生聲音的虛擬元件; 由該攜帶式裝置發射對應於爲該虛擬元件所產生之聲 -55- 201205121 音的聲音’其中當該攜帶式裝置移動靠近產生聲音的該虛 擬元件的地點時,該被發射的聲音變得更大。 3 6.如申請專利範圍第3 5項所述之方法,其中該攜 帶式裝置具有立體音喇叭’其中該發出之聲音係被調整以 依據在該攜帶式裝置與產生聲音的該虛擬元件間之相對位 置,提供立體音效果。 37. —種用以與擴增實境互動的攜帶式裝置,該攜帶 式裝置包含: 地點模組,用以決定在該攜帶式裝置所在之3 D空間 中之該攜帶式裝置的地點,其中當攜帶式裝置接收信號以 同步化時,該攜帶式裝置的地點係被設定爲在該3 D空間 中之參考點; 虛擬實境產生器,其在該3D空間中在該參考點旁建 立虛擬場景,該虛擬場景包括虛擬實境元件; 視面產生器,建立該虛擬場景的視面,其中該視面代 表由該攜帶式裝置的地點及根據該攜帶式裝置的該地點的 一視角所看之虛擬場景:及 顯示器,用以顯示該虛擬場景的視面,當該攜帶式裝 置移動於該3 D空間內時,示於該顯示器的該視面改變。 3 8 .—種內藏於電腦可讀取儲存媒體中的電腦程式, 當電腦程式爲一或更多處理器所執行,用以在裝置間分享 虛擬場景,該電腦程式包含z 用以同步化第一裝置至在三維(3D)空間中之參考 點的程式指令, -56- 201205121 用以相對於該第一位置的位置,計算第 的程式指令: 用以交換在該第一裝置與該第二裝置間 得該第二裝置同步化至在該3D空間中之該 指令,該資訊包含該參考點與該第一與第二 置; 用以在該3D空間中於該參考點旁產生 式指令,該虛擬場景係爲該兩裝置所共用, 該虛擬場景互動時,該虛擬場景在該兩裝置 用以建立該虛擬場景的視面的程式指令 的視面係如同由該第一裝置的現行地點並根 置的該現行地點之一視角所看到的; 用以在該第一裝置顯示該所建立視面的 用以當該攜帶式裝置移動於該3 D空間 虛擬場景的該顯示視面的程式指令。 二裝置的位置 之資訊,以使 參考點的程式 裝置的該等位 虛擬場景的程 當該兩裝置與 中同時改變; ,該虛擬場景 據該攜帶式裝 程式指令;及 內時,改變該 -57-201205121 VII. Patent application scope: 1 . A method for controlling a virtual scene and a viewing surface of a portable device, the method comprising: receiving a signal 'to synchronize the portable device; synchronizing the portable device to enable the The portable device is located at a reference point in a three-dimensional (3D) space; in the 3D space, a virtual scene is generated next to the reference point, the virtual scene includes a virtual reality component; and the reference is determined relative to the reference point The current location of the device in the 3D space; establishing a view of the virtual scene, wherein the view represents the virtual location of the current location of the portable device and the virtual location is viewed from the perspective of the current location of the portable device a scene; displaying the viewing view in the portable device; and changing the display view of the virtual scene when the portable device is moved by a user in the 3D space. 2. The method of claim 1, wherein the creating a view of the virtual scene further comprises: capturing, by the camera in the portable device, an image of the 3D space, the image being included in the 3D space a real component; and mixing the real component with the virtual reality component, wherein the virtual component appears on the display viewing surface as if the virtual component is a portion of the 3D space where the viewing surface of the virtual component is geometrically The viewing surface of the virtual components is changed in the same manner as the viewing surface of the real component changes when the portable device moves within the 3D space. 3. The method of claim 2, wherein the real component includes a player's hand, the method further comprising: detecting a location of the first virtual component occupied by the hand in the 3D space; and detecting Thereafter, facilitating interaction of the hand with the first virtual component to simulate that the hand is contacting the virtual component, wherein the hand can manipulate the first virtual component to change the position or characteristics of the first virtual component, As if the first virtual component is a real object. 4. The method of claim 3, wherein the interaction of the hand is by junction, holding, pushing, pulling, holding, moving, breaking, squeezing, tapping, throwing, fighting, opening, closing The action selected by the group consisting of turning on or off, button, ignition, and eating is performed on the first virtual component. 5. The method of claim 2, wherein the establishing the view further comprises: adding a shadow of the hand to the virtual component based on the lighting state in the 3D space and the virtual scene. 6. The method of claim 2, wherein the real component comprises a table, wherein the blending further comprises: placing the virtual component on the table as if the placing virtual component was placed on the table. 1 The method of claim 2, wherein the real component comprises a wall in the room, wherein the mixing further comprises: -48- 201205121 adding a display to the wall, the display being the virtual component The method of claim 1, wherein the current location of the portable device comprises a geometric coordinate of the portable device and a geometric coordinate of a viewing surface of the display in the portable device. 9. The method of claim 8, wherein the geometric coordinates of the portable device are equal to the coordinates of the camera in the portable device. 10. The method of claim 9, wherein the viewing angle is defined with reference to a vector having a starting point of the center of the viewing surface of the display and a direction perpendicular to a viewing surface of the display. The method of claim 1, wherein establishing the view of the virtual scene further comprises: when the portable device moves closer to the second virtual reality component, amplifying the second virtual reality component: And when the portable device moves away from the second virtual reality element, the second virtual reality element is reduced. 12. The method of claim 1, wherein changing the display viewing surface further comprises: adding image stabilization to the display viewing surface as the portable device moves. 13. The method of claim 1, further comprising: receiving an input to change a view; and changing the creation of the view of the virtual scene such that the view of the virtual scene is -49 - 201205121 Another method of calculating a different point from the current location of the portable device. The method of claim 13, wherein the viewing surface of the virtual scene is rotated 180 with respect to a vertical line intersecting the reference point. degree. The method of claim 1, wherein the receiving signal is generated by pressing a button on the portable device or by touching a touch display of the portable device. The method of claim 1, wherein the boundary of the virtual scene is defined by a wall of the room, wherein the portable device is positioned when the number is received. The method of claim 1, wherein synchronizing the portable device further comprises: resetting a location tracking module in the portable device, the location tracking module being an accelerometer, a magnetometer, A group of GPS devices, cameras, depth cameras, compasses, or gyroscopes is selected, wherein the decision of the current location is performed by information from the location tracking module. 18. A method of sharing a virtual scene between devices, the method comprising synchronizing a first device to a reference point in a three-dimensional (3D) space; calculating a position of the second device relative to a location of the first device; Information between the first device and the second device to synchronize the second device to the reference point in the 3D space, the information including the -50-201205121 reference point and the first and second devices a virtual scene in the vicinity of the reference point in the 3D space, the virtual scene being shared by the two devices, the virtual scene being simultaneously changed in the two devices, 'when the two devices interact with the virtual scene; establishing the a view of the virtual scene, which is viewed by the current location of the first device based on the perspective of the current location of the portable device; displaying the setup view in the first device; and when the portable device When moving in the 3D space, the display view of the virtual scene is changed. The method of claim 18, wherein calculating the position of the device of the table further comprises: collecting first relative position information between the first device and the second device, the collection includes WiFi Position tracking, audio triangulation ' or one or more of image analysis obtained by a camera in the first device; determining a relative position of the second device relative to the first device based on the first relative location information; and The coordinates of the relative location and the reference point are sent to the second device. 20- The method of claim 18, further comprising: collecting first relative location information between the first device and the second device, the collecting comprising WiFi location tracking, audio triangulation, or by One or more image analysis obtained by the camera in the first device or the second device; receiving second relative position information by the second device; determining, according to the first and second relative information, that the second device is related to -51 - 201205121 The relative location of the first device; and the coordinate of the relative location and the reference point to the second device 2 1. The method of claim 18, wherein the simulated scene includes a virtual A board game in which players respectively hold the first and second devices and play a virtual board game. 22. The method of claim 18, wherein the first player living in the first portable device controls the first virtual component by synchronizing with the first virtual movement of the first portable device The movement in this scene. A method for controlling a virtual scene and a view of a first device includes: synchronizing the first device to a reference point in a first three-dimensional (3D) space; between the first device and the second device Establishing a communication link, the first system is in a second 3D space outside the first 3D space, and the second system is synchronized to a second reference point in the second 3D space; generating includes virtual reality a common virtual scene of the component, the common scene can be observed by the first and the second device, the first virtual point is established next to the first reference point, and the second device establishes the next reference point a common virtual scene, the two devices interacting with the real-world component; determining a current location of the first device relative to the reference point at the first 3D space; establishing a view of the common virtual scene, wherein the view surface represents Holding the component virtual by the virtual device, the second loading device is virtually disposed in the current location of the device in the virtual room, and is seen from the perspective of the current location of the first device Common virtual scene; establish should not significantly depends on the surface of the first device; and when the first movement means in the first 3D space, the common virtual scene change the display view plane. 24. The method of claim 23, wherein the communication link comprises a network connection between the first device and the second device. 25 - The method of claim 23, further comprising: designating a virtual location in the first 3D space for the second device; receiving from the second device corresponding to the second device a second device interaction information of the interaction between the virtual scenes; and changing the view surface of the common virtual scene according to the received second device interaction information and the virtual location, wherein the second device appears in the first 3D space And interacting with the first device as if the second device actually appeared in the first 3D space. 26. The method of claim 23, wherein the view of the common virtual scene further comprises an image of a second player located proximate to the second device, the second device having a camera for capturing the The image of the two players. 27. The method of claim 26, further comprising: periodically updating the image of the second player. 28. The method of claim 26, further comprising: updating the image of the second player when the first game moves. 29. The method of claim 26, wherein the virtual-53-201205121 pseudo-component comprises a checkerboard and a western chess piece, wherein the first device and the second device are used to play by playing the chess piece Chess game. 30. The method of claim 26, wherein the first and second devices represent the view of the common virtual scene as a first object and a second object, wherein the movement of the first object Matching the movement of the first device in the first 3D space, and the movement of the second object matches the movement of the second device in the second 3D space. 31. A method of controlling a virtual scene and a view of a portable device, the method: synchronizing a portable device to a reference point in a three-dimensional (3D) space in which the portable device is located, the portable device including a front camera in front of the device and a rear camera facing the back of the portable device. In the 3D space, a virtual scene is generated next to the reference point, the virtual scene including a virtual reality component; determining the portable device relative to the reference Pointing at a current location within the 3D space; establishing a view of the virtual scene that captures a representative view of the virtual scene as seen by a player holding the portable device at a current eye location in the 3D space The capture corresponds to the player seeing through the window to enter the virtual scene, the window point in the 3D space is equal to the location in the 3D space of the display in the portable device; displaying the created view to the display And when the portable device or the player moves in the 3D space, change -54-201205121 to change the displayed view of the virtual scene . 32. The method of claim 31, wherein the image from the BU face camera is used to determine the current eye location and the image from the back camera to obtain the view of the 3D space. 3. The method of claim 31, wherein the display is pulled away from the player's eyes such that the viewing surface enlarges the virtual scene and the display is pulled toward the player's eyes such that the viewing surface Reduce the virtual scene. 34. A method of controlling a scene and a portable device, the method comprising receiving a signal to synchronize the portable device; synchronizing the portable device such that the portable device is in a three-dimensional (3D) space a reference point; a virtual scene is generated in the 3D space, the virtual scene includes a virtual reality element; and when the portable device moves away from the reference point, a view surface of the virtual scene is established, wherein the virtual scene The view surface represents the virtual scene as seen by the current location of the portable device; and displays the established view surface in the portable device; and when the portable device moves in the 3D space, changes the virtual scene This displays the viewport. 35. The method of claim 34, further comprising: determining a virtual component that produces a sound; and transmitting, by the portable device, a sound corresponding to the sound of the sound produced by the virtual component - 55 - 201205121 The emitted sound becomes larger as the portable device moves closer to the location of the virtual component that produces the sound. 3. The method of claim 35, wherein the portable device has a stereophonic speaker, wherein the emitted sound is adjusted to be between the portable device and the virtual component that produces the sound. Relative position, providing stereo sound effect. 37. A portable device for interacting with an augmented reality, the portable device comprising: a location module for determining a location of the portable device in a 3D space in which the portable device is located, wherein When the portable device receives the signal for synchronization, the location of the portable device is set as a reference point in the 3D space; a virtual reality generator that creates a virtual next to the reference point in the 3D space a scene, the virtual scene includes a virtual reality component; a view generator that establishes a view of the virtual scene, wherein the view represents a location of the portable device and a view from the location of the portable device The virtual scene: and the display is configured to display a view surface of the virtual scene, and when the portable device moves in the 3D space, the view surface displayed on the display changes. 3 8 - A computer program built into a computer readable storage medium, which is executed by one or more processors to share a virtual scene between devices, the computer program including z for synchronization a program command from a first device to a reference point in a three-dimensional (3D) space, -56-201205121 for calculating a first program command relative to a position of the first position: for exchanging at the first device and the first The second device synchronizes the instruction to the instruction in the 3D space, the information including the reference point and the first and second settings; for generating a command in the 3D space next to the reference point The virtual scene is shared by the two devices. When the virtual scene interacts, the view of the virtual scene in the program instructions used by the two devices to establish the view of the virtual scene is like the current location of the first device. And seeing at a view of the current location of the current location; displaying the created view on the first device for moving the portable device to the display view of the 3D space virtual scene Program finger . Information about the location of the two devices such that the process of the virtual scene of the program device of the reference point is changed simultaneously when the two devices are in the middle; and the virtual scene is changed according to the portable program command; 57-
TW100103494A 2010-03-05 2011-01-28 Methods, portable device and computer program for maintaining multiple views on a shared stable virtual space TWI468734B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US31125110P 2010-03-05 2010-03-05
US12/947,290 US8730156B2 (en) 2010-03-05 2010-11-16 Maintaining multiple views on a shared stable virtual space

Publications (2)

Publication Number Publication Date
TW201205121A true TW201205121A (en) 2012-02-01
TWI468734B TWI468734B (en) 2015-01-11

Family

ID=43923591

Family Applications (1)

Application Number Title Priority Date Filing Date
TW100103494A TWI468734B (en) 2010-03-05 2011-01-28 Methods, portable device and computer program for maintaining multiple views on a shared stable virtual space

Country Status (4)

Country Link
CN (2) CN105843396B (en)
MX (1) MX2012010238A (en)
TW (1) TWI468734B (en)
WO (1) WO2011109126A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10015370B2 (en) 2015-08-27 2018-07-03 Htc Corporation Method for synchronizing video and audio in virtual reality system
US10866658B2 (en) 2018-12-20 2020-12-15 Industrial Technology Research Institute Indicator device, mixed reality device and operation method thereof
TWI803134B (en) * 2021-09-24 2023-05-21 宏達國際電子股份有限公司 Virtual image display device and setting method for input interface thereof

Families Citing this family (76)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130136566A (en) 2011-03-29 2013-12-12 퀄컴 인코포레이티드 Modular mobile connected pico projectors for a local multi-user collaboration
JP5718197B2 (en) * 2011-09-14 2015-05-13 株式会社バンダイナムコゲームス Program and game device
CN102495959A (en) * 2011-12-05 2012-06-13 无锡智感星际科技有限公司 Augmented reality (AR) platform system based on position mapping and application method
CN102542165B (en) * 2011-12-23 2015-04-08 三星半导体(中国)研究开发有限公司 Operating device and operating method for three-dimensional virtual chessboard
US20130234925A1 (en) * 2012-03-09 2013-09-12 Nokia Corporation Method and apparatus for performing an operation at least partially based upon the relative positions of at least two devices
US8630458B2 (en) * 2012-03-21 2014-01-14 Google Inc. Using camera input to determine axis of rotation and navigation
JP5966510B2 (en) 2012-03-29 2016-08-10 ソニー株式会社 Information processing system
CN103105993B (en) 2013-01-25 2015-05-20 腾讯科技(深圳)有限公司 Method and system for realizing interaction based on augmented reality technology
TWI555390B (en) * 2013-02-20 2016-10-21 仁寶電腦工業股份有限公司 Method for controlling electronic device and electronic apparatus using the same
WO2014188393A1 (en) * 2013-05-24 2014-11-27 Awe Company Limited Systems and methods for a shared mixed reality experience
US20140368537A1 (en) * 2013-06-18 2014-12-18 Tom G. Salter Shared and private holographic objects
US10146299B2 (en) * 2013-11-08 2018-12-04 Qualcomm Technologies, Inc. Face tracking for additional modalities in spatial interaction
CN104657568B (en) * 2013-11-21 2017-10-03 深圳先进技术研究院 Many people's moving game system and methods based on intelligent glasses
EP2886172A1 (en) * 2013-12-18 2015-06-24 Microsoft Technology Licensing, LLC Mixed-reality arena
US9407865B1 (en) * 2015-01-21 2016-08-02 Microsoft Technology Licensing, Llc Shared scene mesh data synchronization
US9787846B2 (en) * 2015-01-21 2017-10-10 Microsoft Technology Licensing, Llc Spatial audio signal processing for objects with associated audio content
KR102610120B1 (en) 2016-01-20 2023-12-06 삼성전자주식회사 Head mounted display and control method thereof
US10115234B2 (en) * 2016-03-21 2018-10-30 Accenture Global Solutions Limited Multiplatform based experience generation
US10665019B2 (en) 2016-03-24 2020-05-26 Qualcomm Incorporated Spatial relationships for integration of visual images of physical environment into virtual reality
CN105938629B (en) * 2016-03-31 2022-01-18 联想(北京)有限公司 Information processing method and electronic equipment
WO2017190293A1 (en) * 2016-05-04 2017-11-09 深圳动三帝虚拟现实互动科技有限公司 Virtual reality display method, device, and terminal
US10245507B2 (en) * 2016-06-13 2019-04-02 Sony Interactive Entertainment Inc. Spectator management at view locations in virtual reality environments
US10169918B2 (en) * 2016-06-24 2019-01-01 Microsoft Technology Licensing, Llc Relational rendering of holographic objects
CN106200956A (en) * 2016-07-07 2016-12-07 北京时代拓灵科技有限公司 A kind of field of virtual reality multimedia presents and mutual method
CN106447786A (en) * 2016-09-14 2017-02-22 同济大学 Parallel space establishing and sharing system based on virtual reality technologies
US10593116B2 (en) 2016-10-24 2020-03-17 Snap Inc. Augmented reality object manipulation
CN106528285A (en) * 2016-11-11 2017-03-22 上海远鉴信息科技有限公司 Method and system for multi-terminal cooperative scheduling in virtual reality
CN106621306A (en) * 2016-12-23 2017-05-10 浙江海洋大学 Double-layer three-dimensional type army flag chessboard
US10242503B2 (en) 2017-01-09 2019-03-26 Snap Inc. Surface aware lens
EP4270325A3 (en) * 2017-01-09 2023-12-20 Snap Inc. Augmented reality object manipulation
JP6526367B2 (en) * 2017-03-01 2019-06-05 三菱電機株式会社 Information processing system
CN107103645B (en) * 2017-04-27 2018-07-20 腾讯科技(深圳)有限公司 virtual reality media file generation method and device
CN107087152B (en) * 2017-05-09 2018-08-14 成都陌云科技有限公司 Three-dimensional imaging information communication system
CN108932051B (en) * 2017-05-24 2022-12-16 腾讯科技(北京)有限公司 Augmented reality image processing method, apparatus and storage medium
CN107320955B (en) * 2017-06-23 2021-01-29 武汉秀宝软件有限公司 AR venue interface interaction method and system based on multiple clients
CN109298776B (en) * 2017-07-25 2021-02-19 阿里巴巴(中国)有限公司 Augmented reality interaction system, method and device
CN107469343B (en) * 2017-07-28 2021-01-26 深圳市瑞立视多媒体科技有限公司 Virtual reality interaction method, device and system
CN107390875B (en) 2017-07-28 2020-01-31 腾讯科技(上海)有限公司 Information processing method, device, terminal equipment and computer readable storage medium
CN107492183A (en) * 2017-07-31 2017-12-19 程昊 One kind has paper instant lottery AR methods of exhibiting and system
CN107632700A (en) * 2017-08-01 2018-01-26 中国农业大学 A kind of farm implements museum experiencing system and method based on virtual reality
CN109426333B (en) * 2017-08-23 2022-11-04 腾讯科技(深圳)有限公司 Information interaction method and device based on virtual space scene
WO2019080902A1 (en) * 2017-10-27 2019-05-02 Zyetric Inventions Limited Interactive intelligent virtual object
CN111263956A (en) * 2017-11-01 2020-06-09 索尼公司 Information processing apparatus, information processing method, and program
CN107861682A (en) * 2017-11-03 2018-03-30 网易(杭州)网络有限公司 The control method for movement and device of virtual objects
CN107967054B (en) * 2017-11-16 2020-11-27 中国人民解放军陆军装甲兵学院 Immersive three-dimensional electronic sand table with virtual reality and augmented reality coupled
CN107657589B (en) * 2017-11-16 2021-05-14 上海麦界信息技术有限公司 Mobile phone AR positioning coordinate axis synchronization method based on three-datum-point calibration
CN107995481B (en) * 2017-11-30 2019-11-15 贵州颐爱科技有限公司 A kind of display methods and device of mixed reality
CN108269307B (en) * 2018-01-15 2023-04-07 歌尔科技有限公司 Augmented reality interaction method and equipment
WO2019141879A1 (en) * 2018-01-22 2019-07-25 The Goosebumps Factory Bvba Calibration to be used in an augmented reality method and system
US11880540B2 (en) * 2018-03-22 2024-01-23 Hewlett-Packard Development Company, L.P. Digital mark-up in a three dimensional environment
CN108519817A (en) * 2018-03-26 2018-09-11 广东欧珀移动通信有限公司 Exchange method, device, storage medium based on augmented reality and electronic equipment
CN108667798A (en) * 2018-03-27 2018-10-16 上海临奇智能科技有限公司 A kind of method and system of virtual viewing
CN108479065B (en) * 2018-03-29 2021-12-28 京东方科技集团股份有限公司 Virtual image interaction method and related device
US11173398B2 (en) * 2018-05-21 2021-11-16 Microsoft Technology Licensing, Llc Virtual camera placement system
CN108919945A (en) * 2018-06-07 2018-11-30 佛山市长郡科技有限公司 A kind of method of virtual reality device work
CN109284000B (en) * 2018-08-10 2022-04-01 西交利物浦大学 Method and system for visualizing three-dimensional geometric object in virtual reality environment
US11030813B2 (en) 2018-08-30 2021-06-08 Snap Inc. Video clip object tracking
US11176737B2 (en) 2018-11-27 2021-11-16 Snap Inc. Textured mesh building
US11501499B2 (en) 2018-12-20 2022-11-15 Snap Inc. Virtual surface modification
EP3914996A1 (en) 2019-04-18 2021-12-01 Apple Inc. Shared data and collaboration for head-mounted devices
US10948978B2 (en) 2019-04-23 2021-03-16 XRSpace CO., LTD. Virtual object operating system and virtual object operating method
CN113508361A (en) 2019-05-06 2021-10-15 苹果公司 Apparatus, method and computer-readable medium for presenting computer-generated reality files
US10499044B1 (en) 2019-05-13 2019-12-03 Athanos, Inc. Movable display for viewing and interacting with computer generated environments
CN110286768B (en) * 2019-06-27 2022-05-17 Oppo广东移动通信有限公司 Virtual object display method, terminal device and computer-readable storage medium
US11189098B2 (en) 2019-06-28 2021-11-30 Snap Inc. 3D object camera customization system
CN110349270B (en) * 2019-07-02 2023-07-28 上海迪沪景观设计有限公司 Virtual sand table presenting method based on real space positioning
US11232646B2 (en) 2019-09-06 2022-01-25 Snap Inc. Context-based virtual object rendering
US20210157394A1 (en) 2019-11-24 2021-05-27 XRSpace CO., LTD. Motion tracking system and method
US11263817B1 (en) 2019-12-19 2022-03-01 Snap Inc. 3D captions with face tracking
US11227442B1 (en) 2019-12-19 2022-01-18 Snap Inc. 3D captions with semantic graphical elements
EP4173257A1 (en) 2020-06-30 2023-05-03 Snap Inc. Skeletal tracking for real-time virtual effects
CN113941138A (en) * 2020-08-06 2022-01-18 黄得锋 AR interaction control system, device and application
CN111915736A (en) * 2020-08-06 2020-11-10 黄得锋 AR interaction control system, device and application
CN116114258A (en) * 2020-08-13 2023-05-12 斯纳普公司 User interface for pose driven virtual effects
CN115705116A (en) * 2021-08-04 2023-02-17 北京字跳网络技术有限公司 Interactive method, electronic device, storage medium, and program product
US20230078578A1 (en) * 2021-09-14 2023-03-16 Meta Platforms Technologies, Llc Creating shared virtual spaces

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5588139A (en) * 1990-06-07 1996-12-24 Vpl Research, Inc. Method and system for generating objects for a multi-person virtual world using data flow networks
US6522312B2 (en) * 1997-09-01 2003-02-18 Canon Kabushiki Kaisha Apparatus for presenting mixed reality shared among operators
US6972734B1 (en) * 1999-06-11 2005-12-06 Canon Kabushiki Kaisha Mixed reality apparatus and mixed reality presentation method
US7149691B2 (en) * 2001-07-27 2006-12-12 Siemens Corporate Research, Inc. System and method for remotely experiencing a virtual environment
JP4054585B2 (en) * 2002-02-18 2008-02-27 キヤノン株式会社 Information processing apparatus and method
US20060257420A1 (en) * 2002-04-26 2006-11-16 Cel-Sci Corporation Methods of preparation and composition of peptide constructs useful for treatment of autoimmune and transplant related host versus graft conditions
US11033821B2 (en) * 2003-09-02 2021-06-15 Jeffrey D. Mullen Systems and methods for location based games and employment of the same on location enabled devices
US8323106B2 (en) * 2008-05-30 2012-12-04 Sony Computer Entertainment America Llc Determination of controller three-dimensional location using image analysis and ultrasonic communication
US8585476B2 (en) * 2004-11-16 2013-11-19 Jeffrey D Mullen Location-based games and augmented reality systems
TWI278772B (en) * 2005-02-23 2007-04-11 Nat Applied Res Lab Nat Ce Augmented reality system and method with mobile and interactive function for multiple users
JP4738870B2 (en) * 2005-04-08 2011-08-03 キヤノン株式会社 Information processing method, information processing apparatus, and remote mixed reality sharing apparatus
JP5230114B2 (en) * 2007-03-13 2013-07-10 キヤノン株式会社 Information processing apparatus and information processing method
CN101174332B (en) * 2007-10-29 2010-11-03 张建中 Method, device and system for interactively combining real-time scene in real world with virtual reality scene
US8386918B2 (en) * 2007-12-06 2013-02-26 International Business Machines Corporation Rendering of real world objects and interactions into a virtual universe
US8786675B2 (en) * 2008-01-23 2014-07-22 Michael F. Deering Systems using eye mounted displays

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10015370B2 (en) 2015-08-27 2018-07-03 Htc Corporation Method for synchronizing video and audio in virtual reality system
US10866658B2 (en) 2018-12-20 2020-12-15 Industrial Technology Research Institute Indicator device, mixed reality device and operation method thereof
TWI803134B (en) * 2021-09-24 2023-05-21 宏達國際電子股份有限公司 Virtual image display device and setting method for input interface thereof

Also Published As

Publication number Publication date
TWI468734B (en) 2015-01-11
CN102884490B (en) 2016-05-04
WO2011109126A1 (en) 2011-09-09
CN105843396A (en) 2016-08-10
MX2012010238A (en) 2013-01-18
CN102884490A (en) 2013-01-16
CN105843396B (en) 2019-01-01

Similar Documents

Publication Publication Date Title
US10424077B2 (en) Maintaining multiple views on a shared stable virtual space
TWI468734B (en) Methods, portable device and computer program for maintaining multiple views on a shared stable virtual space
TWI449953B (en) Methods for generating an interactive space viewable through at least a first and a second device, and portable device for sharing a virtual reality among portable devices
TWI594174B (en) Tracking system, method and device for head mounted display
US9990029B2 (en) Interface object and motion controller for augmented reality
US9947139B2 (en) Method and apparatus for providing hybrid reality environment
CN104010706B (en) The direction input of video-game
TWI786700B (en) Scanning of 3d objects with a second screen device for insertion into a virtual environment
WO2022216465A1 (en) Adjustable robot for providing scale of virtual assets and identifying objects in an interactive scene