WO2017161660A1 - Augmented reality device, system, image processing method and apparatus - Google Patents
Augmented reality device, system, image processing method and apparatus
- Publication number
- WO2017161660A1 (PCT/CN2016/082754)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- augmented reality
- image
- user
- reality device
- environment
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
- G06V40/176—Dynamic expression
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/70—Services for machine-to-machine communication [M2M] or machine type communication [MTC]
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0141—Head-up displays characterised by optical features characterised by the informative content of the display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Definitions
- the augmented reality device can only collect an image of the user's surroundings through a camera disposed outside the device; the images it captures are therefore relatively simple, and the augmented reality server can perform image processing based only on this single type of image.
- the flexibility of image processing is low.
- a device body that can be worn by a first user
- the communication module is connected to the facial image acquisition component, and configured to send the facial image to an augmented reality server,
- An environmental image acquisition component disposed outside the device body and configured to acquire an image of a surrounding environment of the second user
- a detecting submodule configured to detect whether an image of the first augmented reality device exists in an image of the environment surrounding the second user
- An environment image acquisition component disposed outside the device body and configured to collect an image of the environment surrounding the user;
- the environment image acquisition component is further configured to, when acquiring an image of the environment surrounding the user, detect a calibration signal sent by another augmented reality device, determine the identifier of the other augmented reality device according to the detected calibration signal, and obtain location information of the other augmented reality device in the image of the environment surrounding the user;
- the device further includes:
- the calibration device comprises an infrared emitter.
- the device body comprises glasses or a helmet.
- the device body of the first augmented reality device is provided with a mark pattern for indicating the first augmented reality device
- the detecting whether the image of the first augmented reality device exists in the image of the environment surrounding the second user includes:
- a processing module configured to perform an augmented reality process on the image of the environment surrounding the second user according to the facial image of the first user;
- a replacement submodule configured to replace an image of the first augmented reality device with a facial image of the first user when an image of the first augmented reality device exists in an image of the environment surrounding the second user.
- a third receiving module configured to receive the location information sent by the second augmented reality device and the identifier of the first augmented reality device corresponding to the location information, where the location information is used to indicate the location of the first augmented reality device in an image of the environment surrounding the second user;
- the detection submodule is also configured to:
- the device body of the first augmented reality device is provided with a mark pattern for indicating the first augmented reality device
- the first motion state data includes a first deflection angle of the first user's head in a preset reference coordinate system
- the second motion state data includes a second deflection angle of the second user's head in the preset reference coordinate system
- An augmented reality system includes: an augmented reality server and at least two augmented reality devices, the at least two augmented reality devices including a first augmented reality device and a second augmented reality device.
- the first augmented reality device may collect the facial image of the first user in addition to the image of the environment surrounding the first user.
- the augmented reality server is capable of receiving the facial image of the first user, and performing augmented reality processing on the image of the environment surrounding the second user sent by the second augmented reality device according to the facial image of the first user, thereby enriching the images on which the processing is based and improving the flexibility of image processing.
- FIG. 1 is a schematic structural diagram of an augmented reality system according to an embodiment of the present invention.
- FIG. 4F is a schematic diagram of an image of a second user's surrounding environment after augmented reality processing according to an embodiment of the present invention.
- FIG. 5A is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention.
- the device body 20 may be glasses or a helmet to facilitate wearing by a user.
- the facial image acquisition component 201 can be a wide-angle camera.
- the wide-angle camera has a large viewing-angle range, so even when it is close to the user's face, the user's facial image can be effectively collected.
- the communication module 21 can be a Bluetooth module, a Wireless Fidelity (WiFi) module, or a network interface.
- the communication module may be connected to the facial image capturing component through a processor in the augmented reality device, or may be directly connected to the facial image capturing component. This embodiment of the invention is not limited thereto.
- the location information of the other augmented reality device in the image of the user's surroundings may be the coordinates of the calibration signal transmission point in the image of the user's surroundings.
- the environment image acquisition component may separately acquire the coordinates of each calibration signal transmission point in the image of the environment surrounding the user. If that image includes calibration signals sent by a plurality of other augmented reality devices, the environment image acquisition component may establish a correspondence relationship between the location information and the identifier of each augmented reality device, according to the augmented reality device indicated by each calibration signal.
- the augmented reality device may further include: a motion sensor 204, configured to acquire motion state data of the user's head.
- the communication module 21 is connected to the motion sensor 204, and the motion state data can also be transmitted to the augmented reality server.
- the motion sensor 204 can be a six-axis sensor including a three-axis accelerometer and a three-axis gyroscope, where the three-axis accelerometer can detect the acceleration of the augmented reality device in the horizontal direction, and the three-axis gyroscope can detect the angle through which the device rotates.
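The deflection angle used later could, for example, be obtained by integrating the gyroscope's angular-velocity samples. A minimal sketch, assuming a fixed sampling interval and no drift correction or accelerometer fusion (the embodiment specifies none of these details):

```python
def deflection_angle(angular_velocities_deg_s, dt):
    """Integrate yaw angular-velocity samples (deg/s) from a three-axis
    gyroscope into a head deflection angle (deg) in the reference frame.
    Simplified: no drift correction or sensor fusion."""
    angle = 0.0
    for w in angular_velocities_deg_s:
        angle += w * dt  # rectangular integration over one sample period
    return angle

# e.g. 100 samples of 30 deg/s, each lasting 10 ms, yield a 30-degree deflection
```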
- Step 301: Receive a facial image of the first user sent by the first augmented reality device.
- the acquired location information of the first augmented reality device in the image of the second user's surrounding environment, together with the identifier of the first augmented reality device corresponding to that location information, may be sent to the augmented reality server.
- the first user 41 in FIG. 4C is wearing the first augmented reality device, which is provided with two calibration devices; according to the calibration signals sent by these two calibration devices, the position information of the first augmented reality device that the second augmented reality device acquires in the image of the environment surrounding the second user is the coordinates of the two calibration signal transmission points.
- the location information can be: (1.5, 0.8) and (1.65, 0.86).
- the second augmented reality device may further acquire an identifier of the first augmented reality device, such as 001, according to the two calibration signals. Further, since the third augmented reality device worn by the third user 43 can also send a calibration signal, the second augmented reality device can also obtain the location information of the third augmented reality device in the image of the environment surrounding the second user: (1.3, 0.85); (1.5, 0.92), and the identifier of the third augmented reality device corresponding to that location information: 003. Therefore, the second augmented reality device can transmit the correspondence relationship, shown in Table 1, between the location information and the identifiers of the augmented reality devices to the augmented reality server.
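The correspondence relationship of Table 1 can be sketched as a mapping from device identifiers to calibration-point coordinates, plus the lookup the detection submodule needs. A minimal illustration; only the coordinates and identifiers come from the example above, while the data structures and matching tolerance are assumptions:

```python
def build_correspondence(detections):
    """detections: list of (device_id, [(x, y), ...]) pairs, one per
    calibration signal detected in the environment image."""
    return {device_id: points for device_id, points in detections}

def device_at(table, point, tol=0.1):
    """Return the identifier of the device whose calibration-signal
    transmission point lies closest to `point` within `tol`, else None."""
    best_id, best_dist = None, tol
    for device_id, points in table.items():
        for (x, y) in points:
            d = ((x - point[0]) ** 2 + (y - point[1]) ** 2) ** 0.5
            if d < best_dist:
                best_id, best_dist = device_id, d
    return best_id

detections = [
    ("001", [(1.5, 0.8), (1.65, 0.86)]),   # first augmented reality device
    ("003", [(1.3, 0.85), (1.5, 0.92)]),   # third augmented reality device
]
correspondence = build_correspondence(detections)
# The second device would then send this table to the augmented reality server.
```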
- the augmented reality server may determine that the image 410 of the augmented reality device is the image of the first augmented reality device.
- the method then proceeds to step 404: when the image of the first augmented reality device exists in the image of the environment surrounding the second user, the augmented reality server replaces the image of the first augmented reality device with the facial image of the first user.
- in the process of performing augmented reality processing on the image of the environment surrounding the second user, the augmented reality server may replace the image of the first augmented reality device in that image with the facial image of the first user, that is, superimpose the facial image of the first user onto the image of the first augmented reality device according to a preset image superposition algorithm.
- for the specific implementation of superimposing the facial image according to the preset image superposition algorithm, reference may be made to related technologies; no further description is provided herein.
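Since the superposition algorithm itself is deferred to related technologies, the following is purely an illustration: a naive version resizes the facial image to the detected device region and pastes it over. The region format and nearest-neighbour resize are assumptions; a real system would warp and blend rather than hard-paste:

```python
import numpy as np

def superimpose(environment, face, region):
    """Paste `face` over the axis-aligned `region` (x0, y0, x1, y1) of
    `environment`, nearest-neighbour resized to fit. A deliberately naive
    stand-in for the preset image superposition algorithm."""
    x0, y0, x1, y1 = region
    h, w = y1 - y0, x1 - x0
    # nearest-neighbour resize of the face image to the region size
    ys = np.arange(h) * face.shape[0] // h
    xs = np.arange(w) * face.shape[1] // w
    environment[y0:y1, x0:x1] = face[ys][:, xs]
    return environment

env = np.zeros((120, 160, 3), dtype=np.uint8)          # stand-in environment image
face = np.full((40, 30, 3), 200, dtype=np.uint8)       # stand-in facial image
out = superimpose(env, face, (60, 20, 90, 60))         # assumed device region
```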
- the second augmented reality device transmits the second motion state data of the second user's head to the augmented reality server.
- the second augmented reality device may also be provided with a motion sensor, and may send the second motion state data of the second user's head to the augmented reality server in real time; the second motion state data may include a deflection angle of the second user's head in a preset reference coordinate system.
- the motion sensors in the plurality of augmented reality devices may use the same preset reference coordinate system when collecting motion state data, which facilitates the augmented reality server's processing of the motion state data sent by each augmented reality device.
- the processing module 503 can perform augmented reality processing on the image of the environment surrounding the second user according to the facial image of the first user.
- the sending module 504 can send the processed image of the environment surrounding the second user to the second augmented reality device.
- the third receiving module 505 can receive the location information sent by the second augmented reality device and the identifier of the first augmented reality device corresponding to the location information, where the location information is used to indicate the location of the first augmented reality device in the image of the environment surrounding the second user.
- the fourth receiving module 506 can receive the first motion state data of the first user's head sent by the first augmented reality device.
- the fifth receiving module 507 can receive the second motion state data of the second user's head sent by the second augmented reality device.
- the adjustment module 508 can adjust the setting angle of the facial image of the first user in the image of the environment surrounding the second user according to the first motion state data and the second motion state data.
- the first augmented reality device 20 can also issue a calibration signal, the calibration signal being used to indicate the first augmented reality device.
- the device body of the first augmented reality device may be provided with a calibration device (for example, an infrared emitter), and the calibration signal sent by the calibration device may be modulated with the identifier of the first augmented reality device, such that other augmented reality devices can identify the first augmented reality device based on the calibration signal.
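The embodiment does not specify how the identifier is modulated onto the calibration signal. One hedged sketch encodes the identifier as an on/off pulse train of the infrared emitter, which a receiving device demodulates back into the identifier; the framing marker and 8-bit character encoding here are invented purely for illustration:

```python
def modulate(device_id, bits_per_char=8):
    """Encode a device identifier (e.g. "001") as an on/off pulse train,
    framed by a single start pulse. Illustrative only; the embodiment does
    not specify the modulation scheme."""
    pulses = [1]  # assumed start-of-frame marker
    for ch in device_id:
        for i in range(bits_per_char - 1, -1, -1):
            pulses.append((ord(ch) >> i) & 1)  # most significant bit first
    return pulses

def demodulate(pulses, bits_per_char=8):
    """Recover the identifier from a pulse train produced by modulate()."""
    body = pulses[1:]  # skip the start marker
    chars = []
    for k in range(0, len(body), bits_per_char):
        byte = 0
        for bit in body[k:k + bits_per_char]:
            byte = (byte << 1) | bit
        chars.append(chr(byte))
    return "".join(chars)
```

A receiving device would sample the emitter's on/off state over time, reconstruct the pulse train, and call `demodulate` to obtain the identifier (such as 001) used in the correspondence relationship.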
- the first motion state data includes a first deflection angle of the first user's head in a preset reference coordinate system, and the second motion state data includes a second deflection angle of the second user's head in the preset reference coordinate system.
- the augmented reality server 10 can adjust a setting angle of the face image of the first user in an image of the environment surrounding the second user according to the first deflection angle and the second deflection angle.
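A minimal sketch of the adjustment arithmetic, assuming the setting angle is simply the relative deflection of the two heads in the shared reference coordinate system (the embodiment states the inputs but not the exact formula, so this is one plausible reading):

```python
def setting_angle(first_deflection_deg, second_deflection_deg):
    """Setting angle of the first user's facial image in the image of the
    environment surrounding the second user, taken here as the relative
    deflection of the two heads, normalised to [-180, 180)."""
    angle = first_deflection_deg - second_deflection_deg
    return (angle + 180.0) % 360.0 - 180.0

# If the first user's head deflects 30 degrees and the second user's head
# 10 degrees in the shared reference frame, the facial image is set at 20 degrees.
```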
- "first augmented reality device" and "second augmented reality device" do not imply that the first augmented reality device precedes the second augmented reality device in order or priority; these phrases are used only to distinguish different augmented reality devices.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- Computer Hardware Design (AREA)
- Computer Graphics (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Optics & Photonics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Identifier of the augmented reality device | 001 | 003 |
Location information | (1.5, 0.8); (1.65, 0.86) | (1.3, 0.85); (1.5, 0.92) |
Claims (16)
- An augmented reality system, wherein the system comprises an augmented reality server and at least two augmented reality devices, the at least two augmented reality devices comprising a first augmented reality device and a second augmented reality device; the first augmented reality device comprises: a device body wearable by a first user; a facial image acquisition component disposed on the inner side of the device body, where, when the device body is worn by the first user, the facial image acquisition component faces the first user's face at a preset distance from it, the facial image acquisition component being configured to acquire a facial image of the first user; and a first communication module connected to the facial image acquisition component and configured to send the facial image to the augmented reality server; the second augmented reality device comprises: a device body wearable by a second user; an environment image acquisition component disposed on the outer side of the device body and configured to acquire an image of the environment surrounding the second user; and a second communication module connected to the environment image acquisition component and configured to send the image of the environment surrounding the second user to the augmented reality server, the first user and the second user being located in the same real scene; the augmented reality server comprises: a receiving module configured to receive the facial image of the first user and the image of the environment surrounding the second user; a processing module configured to perform augmented reality processing on the image of the environment surrounding the second user according to the facial image of the first user; and a sending module configured to send the processed image of the environment surrounding the second user to the second augmented reality device.
- The system according to claim 1, wherein the augmented reality server further comprises: a detection submodule configured to detect whether an image of the first augmented reality device exists in the image of the environment surrounding the second user; and a replacement submodule configured to replace the image of the first augmented reality device with the facial image of the first user when an image of the first augmented reality device exists in the image of the environment surrounding the second user.
- The system according to claim 2, wherein the first augmented reality device further comprises a calibration device disposed on the outer side of the device body and configured to send a calibration signal, the calibration signal being used to indicate the first augmented reality device; the environment image acquisition component of the second augmented reality device is further configured to detect the calibration signal, determine the identifier of the first augmented reality device according to the calibration signal, and acquire location information of the first augmented reality device in the image of the environment surrounding the second user; the second communication module is configured to send to the augmented reality server the location information and the identifier of the first augmented reality device corresponding to the location information; and the detection submodule is further configured to: detect whether an image of an augmented reality device exists at the location indicated by the location information in the image of the environment surrounding the second user; and, when an image of an augmented reality device exists at the location indicated by the location information, determine, according to the identifier of the first augmented reality device, that the image of the augmented reality device at that location is the image of the first augmented reality device.
- The system according to claim 2, wherein the device body of the first augmented reality device is provided with a mark pattern for indicating the first augmented reality device, and the detection submodule is further configured to: detect whether an image of an augmented reality device exists in the image of the environment surrounding the second user; when an image of an augmented reality device exists in the image of the environment surrounding the second user, detect whether the mark pattern exists in the image of the augmented reality device; and when the mark pattern exists in the image of the augmented reality device, determine that the image of the augmented reality device is the image of the first augmented reality device.
- The system according to any one of claims 2 to 4, wherein the first augmented reality device further comprises a first motion sensor configured to acquire first motion state data of the first user's head, and the first communication module is further configured to send the first motion state data to the augmented reality server; the second augmented reality device further comprises a second motion sensor configured to acquire second motion state data of the second user's head, and the second communication module is further configured to send the second motion state data to the augmented reality server; and the augmented reality server further comprises an adjustment module configured to adjust, according to the first motion state data and the second motion state data, a setting angle of the facial image of the first user in the image of the environment surrounding the second user.
- The system according to claim 5, wherein the first motion state data comprises a first deflection angle of the first user's head in a preset reference coordinate system, and the second motion state data comprises a second deflection angle of the second user's head in the preset reference coordinate system; and the adjustment module is configured to adjust, according to the first deflection angle and the second deflection angle, the setting angle of the facial image of the first user in the image of the environment surrounding the second user.
- An augmented reality device, wherein the device comprises: a device body wearable by a user; a facial image acquisition component disposed on the inner side of the device body, where, when the device body is worn by the user, the facial image acquisition component faces the user's face at a preset distance from it, the facial image acquisition component being configured to acquire a facial image of the user; and a communication module connected to the facial image acquisition component and configured to send the facial image to an augmented reality server.
- The device according to claim 7, wherein the device further comprises: a calibration device disposed on the outer side of the device body, the calibration device being configured to send a calibration signal, the calibration signal being used to indicate the augmented reality device.
- The device according to claim 7, wherein the device further comprises: an environment image acquisition component disposed on the outer side of the device body and configured to acquire an image of the environment surrounding the user; and the communication module is connected to the environment image acquisition component and is further configured to send the image of the environment surrounding the user to the augmented reality server.
- The device according to claim 9, wherein the environment image acquisition component is further configured to, when acquiring the image of the environment surrounding the user, detect calibration signals sent by other augmented reality devices, determine identifiers of the other augmented reality devices according to the detected calibration signals, and acquire location information of the other augmented reality devices in the image of the environment surrounding the user; and the communication module is further configured to send to the augmented reality server the location information and the identifiers of the other augmented reality devices corresponding to the location information.
- The device according to claim 7, wherein the device further comprises: a motion sensor configured to acquire motion state data of the user's head; and the communication module is connected to the motion sensor and is further configured to send the motion state data to the augmented reality server.
- The device according to any one of claims 7 to 11, wherein the device body is provided with a mark pattern for indicating the augmented reality device.
- An image processing method, used in an augmented reality server, the method comprising: receiving a facial image of a first user sent by a first augmented reality device; receiving an image of the environment surrounding a second user sent by a second augmented reality device, the first user and the second user being located in the same real scene; performing augmented reality processing on the image of the environment surrounding the second user according to the facial image of the first user; and sending the processed image of the environment surrounding the second user to the second augmented reality device.
- The method according to claim 13, wherein performing augmented reality processing on the image of the environment surrounding the second user according to the facial image of the first user comprises: detecting whether an image of the first augmented reality device exists in the image of the environment surrounding the second user; and when an image of the first augmented reality device exists in the image of the environment surrounding the second user, replacing the image of the first augmented reality device with the facial image of the first user.
- An image processing apparatus, used in an augmented reality server, the image processing apparatus comprising: a first receiving module configured to receive a facial image of a first user sent by a first augmented reality device; a second receiving module configured to receive an image of the environment surrounding a second user sent by a second augmented reality device, the first user and the second user being located in the same real scene; a processing module configured to perform augmented reality processing on the image of the environment surrounding the second user according to the facial image of the first user; and a sending module configured to send the processed image of the environment surrounding the second user to the second augmented reality device.
- The image processing apparatus according to claim 15, wherein the processing module further comprises: a detection submodule configured to detect whether an image of the first augmented reality device exists in the image of the environment surrounding the second user; and a replacement submodule configured to replace the image of the first augmented reality device with the facial image of the first user when an image of the first augmented reality device exists in the image of the environment surrounding the second user.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/536,259 US10665021B2 (en) | 2016-03-25 | 2016-05-20 | Augmented reality apparatus and system, as well as image processing method and device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610179196.4 | 2016-03-25 | ||
CN201610179196.4A CN105867617B (zh) | 2016-03-25 | 2016-03-25 | Augmented reality device, system, image processing method and apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017161660A1 true WO2017161660A1 (zh) | 2017-09-28 |
Family
ID=56624962
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/082754 WO2017161660A1 (zh) | Augmented reality device, system, image processing method and apparatus | 2016-03-25 | 2016-05-20 |
Country Status (3)
Country | Link |
---|---|
US (1) | US10665021B2 (zh) |
CN (1) | CN105867617B (zh) |
WO (1) | WO2017161660A1 (zh) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10198874B2 (en) * | 2016-05-13 | 2019-02-05 | Google Llc | Methods and apparatus to align components in virtual reality environments |
WO2018026893A1 (en) * | 2016-08-03 | 2018-02-08 | Google Llc | Methods and systems for determining positional data for three-dimensional interactions inside virtual reality environments |
US20180077092A1 (en) * | 2016-09-09 | 2018-03-15 | Tariq JALIL | Method and system for facilitating user collaboration |
US11218431B2 (en) | 2016-09-09 | 2022-01-04 | Tariq JALIL | Method and system for facilitating user collaboration |
CN107168619B (zh) * | 2017-03-29 | 2023-09-19 | Tencent Technology (Shenzhen) Co., Ltd. | User-generated content processing method and apparatus |
CN107203263A (zh) | 2017-04-11 | 2017-09-26 | Beijing Fengyun Vision Technology Co., Ltd. | Virtual reality glasses system and image processing method |
CN108805984B (zh) * | 2017-04-28 | 2021-05-04 | BOE Technology Group Co., Ltd. | Display system and image display method |
US10216333B2 (en) * | 2017-06-30 | 2019-02-26 | Microsoft Technology Licensing, Llc | Phase error compensation in single correlator systems |
US11145124B2 (en) | 2017-08-30 | 2021-10-12 | Ronald H. Winston | System and method for rendering virtual reality interactions |
CN116700489A (zh) * | 2018-03-13 | 2023-09-05 | Ronald Winston | Virtual reality system and method |
CN108551420B (zh) * | 2018-04-08 | 2021-12-14 | Beijing Lingxi Weiguang Technology Co., Ltd. | Augmented reality device and information processing method thereof |
CN109597484A (zh) | 2018-12-03 | 2019-04-09 | Shandong Inspur Business System Co., Ltd. | Self-service tax handling system and method based on VR virtual reality |
US11016630B2 (en) * | 2019-01-31 | 2021-05-25 | International Business Machines Corporation | Virtual view-window |
CN109978945B (zh) * | 2019-02-26 | 2021-08-31 | Zhejiang Sunny Optical Co., Ltd. | Information processing method and apparatus for augmented reality |
CN111698481B (zh) * | 2020-06-23 | 2021-07-23 | Hubei Shiji Yinxiang Technology Co., Ltd. | Intelligent interactive robot monitoring system based on cloud computing |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120256820A1 (en) * | 2011-04-08 | 2012-10-11 | Avinash Uppuluri | Methods and Systems for Ergonomic Feedback Using an Image Analysis Module |
US20120327196A1 (en) * | 2010-05-24 | 2012-12-27 | Sony Computer Entertainment Inc. | Image Processing Apparatus, Image Processing Method, and Image Communication System |
CN103257703A (zh) * | 2012-02-20 | 2013-08-21 | Lenovo (Beijing) Co., Ltd. | Augmented reality apparatus and method |
CN104935866A (zh) * | 2014-03-19 | 2015-09-23 | Huawei Technologies Co., Ltd. | Method, synthesis device, and system for implementing video conferencing |
CN205430495U (zh) * | 2016-03-25 | 2016-08-03 | BOE Technology Group Co., Ltd. | Augmented reality device and system |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100164990A1 (en) * | 2005-08-15 | 2010-07-01 | Koninklijke Philips Electronics, N.V. | System, apparatus, and method for augmented reality glasses for end-user programming |
JP2013186691A (ja) * | 2012-03-08 | 2013-09-19 | Casio Comput Co Ltd | Image processing device, image processing method, and program |
JP5966510B2 (ja) * | 2012-03-29 | 2016-08-10 | Sony Corporation | Information processing system |
CN102866506A (zh) * | 2012-09-21 | 2013-01-09 | Suzhou Yundu Network Technology Co., Ltd. | Augmented reality glasses and implementation method thereof |
US9851787B2 (en) * | 2012-11-29 | 2017-12-26 | Microsoft Technology Licensing, Llc | Display resource management |
KR102019124B1 (ko) * | 2013-01-04 | 2019-09-06 | LG Electronics Inc. | Head mounted display and control method thereof |
EP3042152B1 (en) * | 2013-09-04 | 2022-11-09 | Essilor International | Navigation method based on a see-through head-mounted device |
US9672416B2 (en) * | 2014-04-29 | 2017-06-06 | Microsoft Technology Licensing, Llc | Facial expression tracking |
US9719871B2 (en) * | 2014-08-09 | 2017-08-01 | Google Inc. | Detecting a state of a wearable device |
CN104571532B (zh) * | 2015-02-04 | 2018-01-30 | NetEase Youdao Information Technology (Beijing) Co., Ltd. | Method and apparatus for implementing augmented reality or virtual reality |
US9910275B2 (en) * | 2015-05-18 | 2018-03-06 | Samsung Electronics Co., Ltd. | Image processing for head mounted display devices |
US10165949B2 (en) * | 2015-06-14 | 2019-01-01 | Facense Ltd. | Estimating posture using head-mounted cameras |
CN104966318B (zh) * | 2015-06-18 | 2017-09-22 | Tsinghua University | Augmented reality method with image superposition and image special-effect functions |
CN105183147A (zh) * | 2015-08-03 | 2015-12-23 | Zhongjing Shijie (Beijing) Technology Co., Ltd. | Head-mounted smart device and method for modeling three-dimensional virtual limbs thereof |
CN205430995U (zh) | 2016-03-01 | 2016-08-10 | Chen Dongcai | Simple fixed-distance roller for transplanting tobacco seedlings |
-
2016
- 2016-03-25 CN CN201610179196.4A patent/CN105867617B/zh active Active
- 2016-05-20 US US15/536,259 patent/US10665021B2/en active Active
- 2016-05-20 WO PCT/CN2016/082754 patent/WO2017161660A1/zh active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120327196A1 (en) * | 2010-05-24 | 2012-12-27 | Sony Computer Entertainment Inc. | Image Processing Apparatus, Image Processing Method, and Image Communication System |
US20120256820A1 (en) * | 2011-04-08 | 2012-10-11 | Avinash Uppuluri | Methods and Systems for Ergonomic Feedback Using an Image Analysis Module |
CN103257703A (zh) * | 2012-02-20 | 2013-08-21 | Lenovo (Beijing) Co., Ltd. | Augmented reality apparatus and method |
CN104935866A (zh) * | 2014-03-19 | 2015-09-23 | Huawei Technologies Co., Ltd. | Method, synthesis device, and system for implementing video conferencing |
CN205430495U (zh) * | 2016-03-25 | 2016-08-03 | BOE Technology Group Co., Ltd. | Augmented reality device and system |
Also Published As
Publication number | Publication date |
---|---|
CN105867617B (zh) | 2018-12-25 |
US10665021B2 (en) | 2020-05-26 |
US20180061133A1 (en) | 2018-03-01 |
CN105867617A (zh) | 2016-08-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017161660A1 (zh) | Augmented reality device, system, image processing method and apparatus | |
KR101761751B1 (ko) | HMD calibration with direct geometric modeling | |
CN108369653B (zh) | Eye pose identification using eye features | |
US9460340B2 (en) | Self-initiated change of appearance for subjects in video and images | |
US20210042992A1 (en) | Assisted augmented reality | |
US10802606B2 (en) | Method and device for aligning coordinate of controller or headset with coordinate of binocular system | |
KR101822471B1 (ko) | Virtual reality system using mixed reality and implementation method thereof | |
JP2021530817A (ja) | Method and apparatus for determining and/or evaluating a localization map of an image display device | |
WO2014071254A4 (en) | Wireless wrist computing and control device and method for 3d imaging, mapping, networking and interfacing | |
US20190215505A1 (en) | Information processing device, image generation method, and head-mounted display | |
JP2005500757A (ja) | 3D video conference system | |
TW201937922A (zh) | Scene reconstruction system, method, and non-transitory computer-readable medium | |
US10838515B1 (en) | Tracking using controller cameras | |
CN109155055B (zh) | Region-of-interest image generating device | |
WO2018146922A1 (ja) | Information processing device, information processing method, and program | |
CN111179341B (zh) | Registration method for an augmented reality device and a mobile robot | |
CN105893452B (zh) | Method and apparatus for presenting multimedia information | |
WO2017163648A1 (ja) | Head-mounted device | |
CN205430495U (zh) | Augmented reality device and system | |
WO2022176450A1 (ja) | Information processing device, information processing method, and program | |
CN105894581B (zh) | Method and apparatus for presenting multimedia information | |
US20230028976A1 (en) | Display apparatus, image generation method, and program | |
JP2014155635A (ja) | Gaze measurement device, method for displaying a gaze region, and method for displaying a Gaussian distribution of gaze points | |
TWI460683B (zh) | Method for tracking instantaneous head movement | |
WO2017098999A1 (ja) | Information processing device, information processing system, control method for information processing device, and computer program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 15536259 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16895012 Country of ref document: EP Kind code of ref document: A1 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16895012 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 19/02/2019) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16895012 Country of ref document: EP Kind code of ref document: A1 |