CN108427479A - Wearable device, system and method for processing environment image data, and readable medium - Google Patents

Wearable device, system and method for processing environment image data, and readable medium

Info

Publication number
CN108427479A
Authority
CN
China
Prior art keywords
wearable device
data
image data
ambient image
control terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810149313.1A
Other languages
Chinese (zh)
Other versions
CN108427479B (en)
Inventor
顾照鹏
肖泽东
郑远力
陈宗豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201810149313.1A priority Critical patent/CN108427479B/en
Publication of CN108427479A publication Critical patent/CN108427479A/en
Application granted granted Critical
Publication of CN108427479B publication Critical patent/CN108427479B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F13/10Program control for peripheral devices
    • G06F13/102Program control for peripheral devices where the programme performs an interfacing function, e.g. device driver
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2213/00Indexing scheme relating to interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F2213/0024Peripheral component interconnect [PCI]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present invention disclose a wearable device, a system and method for processing environment image data, and a readable medium. The wearable device includes a first interface, a second interface and a processor. The processor is connected to an image sensor through the first interface and receives the environment image data collected by the image sensor. The processor is connected to an inertial sensor through the second interface and further receives the motion data of the wearable device sensed by the inertial sensor. The processor is connected to control terminals, calculates the attitude data of the wearable device from the environment image data and the motion data, and generates a map-processing message from the environment image data. The wearable device can transmit the related data to the control terminals asynchronously, which largely avoids frame loss and allows scenes such as virtual-reality or augmented-reality scenes to be presented quickly and in a timely manner.

Description

Wearable device, system and method for processing environment image data, and readable medium
Technical field
The present invention relates to the field of electronic technology, and in particular to a wearable device, a system and method for processing environment image data, and a readable medium.
Background
With the development of AR (Augmented Reality) and VR (Virtual Reality) technology, wearable devices (such as AR/VR glasses and helmets) are gradually entering daily life. AR is a technique that calculates the position and angle of camera images in real time and overlays corresponding images, videos or 3D models; its goal is to superimpose the virtual world on the real world on a screen and allow interaction. VR is a computer simulation system that can create and let users experience a virtual world: a computer generates a simulated environment, and through multi-source information fusion, interactive three-dimensional dynamic views and simulation of entity behavior, the user is immersed in that environment. Through such wearable devices, an excellent visual experience can be brought to the user.
Current wearable devices are constrained by volume and weight when processing environment images for AR or VR. A high-performance processor can be used for image data processing and scene presentation, but this makes the wearable device costly and expensive; if a cheap processor is used instead, problems such as missing images arise and product performance is low.
Summary of the invention
Embodiments of the present invention provide a wearable device, a system and method for processing environment image data, and readable mediums, which enable fast and efficient processing of image data on a wearable device.
In one aspect, an embodiment of the present invention provides a wearable device, including: a first interface, a second interface and a processor.
The processor is connected to an image sensor through the first interface, and is configured to receive the environment image data collected by the image sensor.
The processor is connected to an inertial sensor through the second interface, and is further configured to receive the motion data of the wearable device sensed by the inertial sensor.
The processor is further configured to calculate the attitude data of the wearable device from the environment image data and the motion data, and to send the attitude data to a first control terminal.
The processor is further configured to generate a map-processing message from the environment image data and send the map-processing message to a second control terminal. The map-processing message includes the environment image data and instructs the second control terminal to perform environment-map processing on the environment image data to generate map data.
In another aspect, an embodiment of the present invention provides a system for processing environment image data, including: a wearable device, a first control terminal and a second control terminal.
The wearable device is connected to the first control terminal, and is configured to obtain environment image data and the motion data of the wearable device, calculate the attitude data of the wearable device from the environment image data and the motion data, and send the attitude data to the first control terminal.
The wearable device is connected to the second control terminal, and is further configured to generate a map-processing message from the environment image data and send the map-processing message to the second control terminal. The map-processing message includes the environment image data.
The first control terminal is configured to determine virtual information from the attitude data and send the virtual information to the wearable device.
The second control terminal is configured to perform environment-map processing on the environment image data and generate map data.
In yet another aspect, an embodiment of the present invention provides a method for processing environment image data, including:
obtaining environment image data, and generating a map-processing message from the environment image data;
obtaining motion data, and calculating the attitude data of a wearable device from the environment image data and the motion data;
sending the attitude data to a first control terminal, and sending the map-processing message to a second control terminal, where the map-processing message includes the environment image data and instructs the second control terminal to perform environment-map processing on the environment image data to generate map data.
In yet another aspect, an embodiment of the present invention provides a computer storage medium storing one or more instructions, the one or more instructions being suitable to be loaded by a processor to execute the following steps:
obtaining environment image data, and generating a map-processing message from the environment image data;
obtaining motion data, and calculating the attitude data of a wearable device from the environment image data and the motion data;
sending the attitude data to a first control terminal, and sending the map-processing message to a second control terminal, where the map-processing message includes the environment image data and instructs the second control terminal to perform environment-map processing on the environment image data to generate map data.
The wearable device in the embodiments of the present invention includes two different interfaces and a processor. Through the two interfaces, the processor can asynchronously receive the environment image data sent by the image sensor and the motion data of the wearable device sent by the inertial sensor. Moreover, after processing the environment image data and the motion data into the related data, the processor can send the related data to the respective control terminals asynchronously, which largely avoids frame loss and allows scenes such as virtual-reality or augmented-reality scenes to be presented quickly and in a timely manner.
Description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed for the description of the embodiments are briefly introduced below. Evidently, the drawings described below show only some embodiments of the invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a structural schematic diagram of a wearable device provided by an embodiment of the present invention;
Fig. 2 is an application-scenario diagram of a wearable device provided by an embodiment of the present invention;
Fig. 3 is a system architecture diagram based on a wearable device provided by an embodiment of the present invention;
Fig. 4 is a structural schematic diagram of a wearable device provided by an embodiment of the present invention;
Fig. 5 is a structural schematic diagram of a wearable device provided by an embodiment of the present invention;
Fig. 6 is a structural schematic diagram of a system for processing environment image data provided by an embodiment of the present invention;
Fig. 7 is a structural schematic diagram of another system for processing environment image data provided by an embodiment of the present invention;
Fig. 8 is a flow diagram of a method for processing environment image data provided by an embodiment of the present invention.
Detailed description of embodiments
Embodiments of the present invention provide a wearable device and system whose software and hardware are laid out anew: the processor completes the reception, calculation and transmission of data in a heterogeneous-computing manner. In one embodiment, on the one hand, the wearable device is connected to an image sensor through one interface dedicated to receiving the image sensor's data; for example, this interface can be a USB 3.0 interface or a MIPI (Mobile Industry Processor Interface) interface capable of receiving image data quickly. A further dedicated interface is laid out to receive the motion data sensed by the inertial sensor accurately and quickly; for example, this interface can be an SPI (Serial Peripheral Interface) interface. On the other hand, the processor can separately calculate the user's posture and the environment map in a heterogeneous manner, so as to locate the wearable device and confirm its attitude more accurately. In one embodiment, the processor calculates the attitude data of the wearable device and thereby determines the user's posture, so that suitable interaction content can subsequently be determined from it; as for the processing of the environment map, the processor only needs to send the related data, such as images, to a connected control terminal, which completes the map-data processing based on algorithms such as SLAM (simultaneous localization and mapping) and generates a map of the current environment. In one embodiment, as shown in Fig. 1, a structural schematic diagram of a wearable device provided by an embodiment of the present invention, the wearable device can include: a first interface 101, a second interface 102 and a processor 103.
In one embodiment, the processor 103 is connected to the image sensor through the first interface 101 and receives the environment image data collected by the image sensor, which collects images of the environment. In one embodiment, the image sensor can be an external camera connected to the wearable device, such as a high-definition camera purchased by the user; alternatively, the image sensor can be part of the wearable device, installed in it at manufacture. The processor 103 is connected to the inertial sensor through the second interface 102 and receives the motion data of the wearable device sensed by the inertial sensor, which senses the motion of the device. In one embodiment, the inertial sensor can be an external detection device connected to the wearable device; alternatively, it can also be part of the wearable device, installed in it at manufacture.
The wearable device in the embodiments of the present invention includes a first interface, a second interface and a processor. Because the environment image data collected by the image sensor is large in volume but low in frame rate (20-60 fps), while the motion data collected by the inertial sensor is small in volume but high in rate (100-1000 Hz), the embodiments of the present invention use two interfaces of different transmission rates to receive the environment image data and the motion data asynchronously. This avoids the frame loss caused by transmission delay of the environment image data relative to the motion data and by channel contention, so that the related data can be acquired and processed accurately, and scenes such as virtual-reality or augmented-reality scenes can be presented to the user accurately and in a timely manner.
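The dual-rate asynchronous reception described above can be sketched as two independent producer threads feeding separate queues, so the slow, bulky image stream never blocks the fast, lightweight inertial stream. This is an illustrative sketch only; all names, counts and rates below are assumptions, not part of the patent.

```python
import queue
import threading

# Two independent queues model the two interfaces: the image stream is
# low-rate but bulky (MIPI/USB 3.0), the IMU stream high-rate but tiny (SPI).
image_q = queue.Queue()
imu_q = queue.Queue()

def image_sensor(frames):
    for i in range(frames):              # ~20-60 fps on a real device
        image_q.put(b"frame-%d" % i)

def inertial_sensor(samples):
    for i in range(samples):             # ~100-1000 Hz on a real device
        imu_q.put((i, 0.0, 0.0, 9.8))    # (t, ax, ay, az) placeholder sample

t1 = threading.Thread(target=image_sensor, args=(3,))
t2 = threading.Thread(target=inertial_sensor, args=(30,))
t1.start(); t2.start(); t1.join(); t2.join()

# Each stream arrived on its own path; neither blocked the other.
print(image_q.qsize(), imu_q.qsize())  # prints "3 30"
```

The consumer (the processor) would drain each queue at its own pace, which is the frame-loss-avoidance property the patent attributes to the two-interface layout.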
The processor 103 can calculate the attitude data of the wearable device from the environment image data and the motion data and then send the attitude data to a first control terminal; that is, the processor 103 can locally complete the fusion of the environment image data and the motion data to determine the current attitude of the wearable device comprehensively. The processor 103 can also generate a map-processing message from the environment image data and send it to a second control terminal. The map-processing message may include only the environment image data and is mainly used to instruct the second control terminal to perform environment-map processing on the environment image data and generate map data; that is, the processor 103 itself does not run SLAM or similar algorithms for environment-map processing, but only triggers the second control terminal to do so. In one embodiment, the first control terminal and the second control terminal can be realized on one device acting as a host, which then provides the functions of both; in other embodiments, the first and second control terminals can also be realized on different devices.
Thus the processor 103 can, on the one hand, calculate the attitude data of the wearable device from the received data in time, and on the other hand generate a map-processing message that includes the environment image data and send it to the second control terminal, which is dedicated to map processing and performs it on the asynchronously received environment image data. Map processing and attitude calculation are therefore also completed as asynchronous computations, which ensures that the data collected by the image sensor and the inertial sensor is processed promptly and accurately, completes the data processing efficiently, quickly and at low cost, and better presents scenes such as virtual-reality or augmented-reality scenes to the user.
Referring to Fig. 2, an application-scenario diagram of a wearable device provided by an embodiment of the present invention: as shown in Fig. 2, the wearable device can be worn on the user's head. When the user uses the wearable device, the processor in it can receive, through the first interface, the environment image data of the user's surroundings collected by the image sensor, and can receive, through the second interface, the motion data sensed by the inertial sensor, which can be the six-degree-of-freedom motion data of the user. As shown in Fig. 2, the six degrees of freedom can include translational freedom along the three rectangular coordinate axes x, y and z, and rotational freedom about each of those three axes. After calculating the attitude data of the wearable device, i.e. of the user, from the environment image data and the motion data, the processor can send the attitude data to the first control terminal. The first control terminal can use AR/VR technology to determine virtual information from the attitude data and send that virtual information back to the processor in the wearable device. After receiving it, the processor can present the virtual information before the user's eyes, displaying the corresponding AR or VR scene.
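The six-degree-of-freedom attitude data mentioned above (translation along x, y and z plus rotation about each axis) is commonly carried as a single pose record. The structure below is a hypothetical illustration; the type and field names are my own, not the patent's.

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    """Attitude data of a wearable device: 3 translations + 3 rotations."""
    x: float = 0.0      # translation along the x axis
    y: float = 0.0      # translation along the y axis
    z: float = 0.0      # translation along the z axis
    roll: float = 0.0   # rotation about the x axis (radians)
    pitch: float = 0.0  # rotation about the y axis
    yaw: float = 0.0    # rotation about the z axis

    def translated(self, dx, dy, dz):
        """Return a new pose moved by (dx, dy, dz), rotation unchanged."""
        return Pose6DoF(self.x + dx, self.y + dy, self.z + dz,
                        self.roll, self.pitch, self.yaw)

p = Pose6DoF().translated(1.0, 0.0, -0.5)
print(p)  # rotation components stay zero; only the translation changes
```

A real device would update such a record at the IMU rate and ship it to the first control terminal, but the exact payload format is not specified by the patent.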
The wearable device in the embodiments of the present invention is not only a portable device that the user can wear directly or integrate into clothing or an accessory, but also a device that, by interacting with a host device (such as an intelligent terminal), realizes the corresponding scene with AR/VR technology. For example, when playing a certain game, the user can open the game on the intelligent terminal and put on the wearable device. The intelligent terminal can determine the corresponding interface image in real time from the user's attitude data (such as turning left, turning right or walking straight) and transmit the virtual interface image to the wearable device. After receiving the virtual interface image, the wearable device displays the virtual interface to the user, giving the user an immersive feeling of being present in the scene.
Referring again to Fig. 3, a system architecture diagram based on the wearable device proposed by an embodiment of the present invention: the wearable device is connected to at least one control terminal, as shown schematically in Fig. 3, and the connection can be wired or wireless. After obtaining the environment image data and the motion data of the wearable device, the device can fuse them, calculate the user's attitude data, and then send the attitude data to the first control terminal, which avoids transmission delay and frame loss. Sending the attitude data directly also means the first control terminal need not recalculate it and can obtain the virtual information from the attitude data in time. In addition, the wearable device can derive a map-processing message from the environment image data and send the map-processing message to the second control terminal.
In one embodiment, the wearable device is connected to the first control terminal through a third interface and to the second control terminal through a fourth interface, and can send the related data to each control terminal through the different interfaces; the third and fourth interfaces can be configured as wired or wireless transmission interfaces as needed. In other embodiments, the wearable device and each control terminal can also be connected to a server: the wearable device uploads the attitude data and the map-processing message to the server, which then forwards them to the respective control terminals.
In the embodiments of the present invention, the structure of the wearable device can be seen in Fig. 4. As shown in Fig. 4, the wearable device includes: a first interface 201, a second interface 202 and a processor 203. It should be noted that the first interface 201, second interface 202 and processor 203 correspond to the first interface 101, second interface 102 and processor 103 in the foregoing embodiment, and are not described again. In one embodiment, the wearable device further includes an image sensor 204 and an inertial sensor 205, whose connections to the processor 203 are described below.
The image sensor 204 is connected to the processor 203 through the first interface 201, collects environment image data and sends it to the processor 203 through the first interface 201. In one embodiment, the first interface 201 can be a MIPI interface.
The inertial sensor 205 is connected to the processor 203 through the second interface 202, senses the motion data of the wearable device and sends it to the processor 203 through the second interface 202. The inertial sensor 205 is a sensor that can detect and output motion information of the wearable device, such as acceleration and angular velocity, in real time. In one embodiment, the inertial sensor can be an IMU (Inertial Measurement Unit) sensor, a device integrating multiple sensors such as an acceleration sensor and a gyroscope. Using an IMU sensor to sense the motion data of the wearable device reduces the lag of the device in sensing motion, thereby improving its attitude-tracking function. In one embodiment, the second interface 202 can be an SPI interface or an I2C (Inter-Integrated Circuit) interface.
The processor 203 is configured to calculate the attitude data of the wearable device from the environment image data and the motion data, and to send the attitude data to the first control terminal. In one embodiment, when calculating the attitude data, the processor 203 can use the EKF (Extended Kalman Filter) algorithm for data-fusion calculation, thereby obtaining a high-precision attitude estimate of the wearable device. In one embodiment, the processor 203 can be a DSP (Digital Signal Processing) processor, a general-purpose embedded ARM processor, an FPGA (Field-Programmable Gate Array) processor, or another processor capable of the same functions; the embodiments of the present invention place no limitation on this. In one embodiment, the data-fusion calculation also incorporates location information, namely the position of the wearable device in the environment map determined by the SLAM algorithm.
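A full EKF maintains a multi-dimensional state vector and covariance matrix; as a deliberately simplified illustration of the predict/update fusion idea named above, the scalar filter below blends a gyro-integrated yaw prediction with a vision-derived yaw measurement. The noise values and function name are placeholder assumptions, not the patent's implementation.

```python
def fuse_yaw(yaw, var, gyro_rate, dt, vision_yaw, q=1e-4, r=1e-2):
    """One predict/update cycle of a scalar Kalman filter:
    predict with the IMU gyro rate, then correct with the vision estimate."""
    # Predict: integrate the gyro reading, inflate the uncertainty.
    yaw_pred = yaw + gyro_rate * dt
    var_pred = var + q
    # Update: blend in the vision measurement, weighted by relative confidence.
    k = var_pred / (var_pred + r)            # Kalman gain in (0, 1)
    yaw_new = yaw_pred + k * (vision_yaw - yaw_pred)
    var_new = (1.0 - k) * var_pred
    return yaw_new, var_new

yaw, var = 0.0, 1.0                          # start with high uncertainty
yaw, var = fuse_yaw(yaw, var, gyro_rate=0.5, dt=0.1, vision_yaw=0.06)
# The fused yaw lies between the gyro prediction (0.05) and the vision
# measurement (0.06), and the uncertainty has shrunk.
print(round(yaw, 3), round(var, 3))
```

The same predict-with-IMU / correct-with-vision pattern, extended to the full 6-DoF state, is what an EKF-based fusion on the processor would perform.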
It can thus be seen that, after obtaining the environment image data and the motion data, the wearable device in the embodiments of the present invention can calculate the attitude data from them with its own processor, without having to send the environment image data and the motion data together to an intelligent terminal, which avoids transmission delay. Beyond that, the wearable device only needs to send the calculated attitude data to the first control terminal, which avoids channel contention during transmission and hence the frame-loss phenomenon.
The processor 203 is further configured to generate a map-processing message from the environment image data and send it to the second control terminal; the message includes the environment image data and instructs the second control terminal to perform environment-map processing on it and generate map data. In one embodiment, when generating the map-processing message, the processor 203 calls a feature-point matching algorithm to perform feature-point matching on the environment image corresponding to the environment image data, obtains a matching result, and generates the map-processing message from the matching result. In one embodiment, before calling the feature-point matching algorithm, the processor 203 may call a feature-extraction algorithm (such as the BRIEF (Binary Robust Independent Elementary Features) algorithm or the ORB (Oriented FAST and Rotated BRIEF) algorithm) to extract feature points from the environment image data. In one embodiment, the map-processing message may include a loop-closure detection message and/or a map-update message.
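The Hamming-distance comparison at the core of matching binary descriptors such as BRIEF or ORB can be shown with a toy example. The 16-bit descriptors and threshold below are fabricated for illustration; a real system would run a library matcher on 256-bit descriptors.

```python
def hamming(a, b):
    """Hamming distance between two binary descriptors packed as ints."""
    return bin(a ^ b).count("1")

def match(query, train, max_dist=4):
    """Brute-force nearest-neighbour matching of binary descriptors,
    rejecting any best match whose distance exceeds max_dist."""
    matches = []
    for qi, qd in enumerate(query):
        ti, dist = min(((i, hamming(qd, td)) for i, td in enumerate(train)),
                       key=lambda pair: pair[1])
        if dist <= max_dist:
            matches.append((qi, ti, dist))
    return matches

# Toy 16-bit "descriptors" for the previous and current frames.
prev_desc = [0b1010101010101010, 0b1111000011110000]
curr_desc = [0b1010101010101011, 0b0000111100001111]
print(match(prev_desc, curr_desc))  # prints "[(0, 0, 1)]" - one solid match
```

The surviving matches are the correspondences a SLAM back end would use to estimate relative motion between the two frames.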
In one embodiment, if the processor 203 fails to receive the environment image data sent by the image sensor 204 within a preset period of time, it sends a relocation message to the second control terminal, instructing the second control terminal to relocate the wearable device according to the keyframe information included in the map data already generated. A failure to receive the environment image data within the preset period can be taken to mean that the wearable device is in a tracking-failure state, so a relocation message is sent to the second control terminal to request relocation of the wearable device; after receiving the relocation data obtained from the second control terminal's relocation processing, the processor 203 can restore the tracked attitude of the wearable device according to that data. In one embodiment, if the environment image data received by the processor 203 contains no feature regions, the processor 203 can likewise consider the wearable device to be in a tracking-failure state and send a relocation message to the second control terminal. Normally, during map processing based on a SLAM algorithm, feature points in the environment image data of two consecutive frames collected by the image sensor are compared to determine the relative position of the wearable device, and thereby its position in the environment map. If the image sensor has no collected environment image data, for example when the lens is blocked, or when the second control terminal does not successfully receive environment image data, relocation processing can be carried out. Relocation processing means: when environment image data is received again after a period of unsuccessful reception, the image corresponding to the most recently received environment image data is compared with the key images included in the keyframes recorded during environment-map processing; the key image with the greatest similarity to that image is found, and the keyframe containing it is taken as the most similar keyframe. The current position and posture corresponding to the most recently received environment image are then calculated from the feature-matching result between that image and the most similar keyframe in the environment map, and positioning and mapping can continue on the environment image data received thereafter. A keyframe is a record made while the second control terminal performs map processing; it includes an image and that image's position in the processed map. Normally, when the image changes greatly between two consecutive frames of environment image data, the later frame and the position determined by locating on that frame are recorded as a keyframe.
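The relocation procedure described above, find the stored keyframe most similar to the newly received image and resume from its pose, can be sketched as follows. The set-overlap similarity is a stand-in for real descriptor matching, and all names are hypothetical.

```python
def similarity(desc_a, desc_b):
    """Jaccard overlap between two sets of (toy) feature descriptors."""
    return len(desc_a & desc_b) / len(desc_a | desc_b)

def relocalize(current, keyframes):
    """Return (id, similarity) of the keyframe most similar to the newly
    received image; the caller would resume tracking from its pose."""
    best_id, best_sim = None, 0.0
    for kf_id, (desc, pose) in keyframes.items():
        s = similarity(current, desc)
        if s > best_sim:
            best_id, best_sim = kf_id, s
    return best_id, best_sim

# Keyframes recorded during map processing: descriptors + pose in the map.
keyframes = {
    "kf1": ({"a", "b", "c", "d"}, (0.0, 0.0)),
    "kf2": ({"c", "d", "e", "f"}, (2.5, 1.0)),
}
print(relocalize({"c", "d", "e"}, keyframes))  # prints "('kf2', 0.75)"
```

In a real SLAM system the pose would then be refined by feature matching against the winning keyframe rather than taken verbatim.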
In one embodiment, the processor is also operable to receive the relocation data sent by the second control terminal and to calculate the attitude data of the wearable device according to the relocation data and the motion data sensed by the inertial sensor. The relocation data is obtained after the second control terminal performs relocation processing on the wearable device according to the key frame information included in the generated map data. That is, the attitude data may include not only the data describing the six degrees of freedom, but also the position, within the environmental map generated by the SLAM algorithm, of the user wearing the wearable device.
In one embodiment, the processor is also operable to receive the virtual information sent by the first control terminal, the virtual information being determined by the first control terminal according to the attitude data. After receiving the virtual information sent by the first control terminal, the processor 203 can display the virtual information on the display screen of the wearable device so that the user can view it. The virtual information may, for example, be virtual cartoon characters, or explanatory notes about objects, such as the price and product description of a piece of merchandise.
In one embodiment, the processor 203 can be divided into a visual odometry module 2031 and an IMU data fusion module 2032; the corresponding structural schematic diagram is shown in Fig. 5. As shown in Fig. 5, the visual odometry module 2031 is connected to the image sensor 204 through the first interface 201, and the IMU data fusion module 2032 is connected to the inertial sensor 205 through the second interface 202. The functions of the processor 203 described above can be implemented by the visual odometry module 2031 and the IMU data fusion module 2032 respectively. In one embodiment, when the processor is used to generate the map processing message according to the ambient image data and to send the map processing message to the second control terminal, the visual odometry module 2031 can generate the map processing message according to the ambient image data and send it to the second control terminal. In one embodiment, after receiving the ambient image data from the image sensor 204, the visual odometry module 2031 can also match consecutive frames using an image matching algorithm, determine the visual pose data of the wearable device according to the matching result, and send the visual pose data to the IMU data fusion module 2032. Correspondingly, when the processor 203 is used to calculate the attitude data of the wearable device according to the ambient image data and the motion data and to send the attitude data to the first control terminal, the corresponding work can be completed by the IMU data fusion module 2032. After receiving the motion data sent by the inertial sensor 205 and the visual pose data sent by the visual odometry module 2031, the IMU data fusion module 2032 can fuse the motion data and the visual pose data using an EKF (extended Kalman filter) algorithm to obtain high-precision, high-frame-rate attitude data, and send that attitude data to the first control terminal.
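The fusion step performed by the IMU data fusion module can be illustrated with a scalar Kalman filter: the high-rate gyro integration predicts the pose, and the lower-rate visual pose corrects the accumulated drift. This 1-D sketch with assumed noise variances stands in for the full EKF, whose state in practice covers position, orientation, velocity, and sensor biases:

```python
def fuse_pose_1d(x, p, gyro_rate, dt, visual_pose, q=1e-4, r=1e-2):
    """One predict/update cycle of a scalar Kalman filter.

    x, p           : current angle estimate and its variance
    gyro_rate, dt  : IMU angular rate (rad/s) and sample interval (s)
    visual_pose    : angle measured by the visual odometry module
    q, r           : process/measurement noise variances (assumed values)
    Returns the fused (x, p).
    """
    # Predict: integrate the high-rate IMU measurement.
    x = x + gyro_rate * dt
    p = p + q
    # Update: correct the drifting prediction with the visual pose.
    k = p / (p + r)            # Kalman gain
    x = x + k * (visual_pose - x)
    p = (1.0 - k) * p
    return x, p
```

Between visual frames, only the predict step runs, which is what allows the module to output attitude data at a higher rate than the camera delivers images.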
The embodiment of the present invention uses two interfaces with different transmission rates to receive the ambient image data and the motion data asynchronously. This avoids the frame loss caused by transmission delay and channel contention between the ambient image data and the motion data, so that scenes such as virtual reality or augmented reality scenes can be displayed to the user promptly and well.
Based on the wearable device described above, the embodiment of the present invention also proposes a processing system for ambient image data. The processing system is a visual-inertial SLAM system based on heterogeneous computing, and can meet the demand of AR/VR attitude tracking for high real-time, low-delay, and high-precision posture output. The processing system in the embodiment of the present invention mainly comprises a front end and a back end. The front end is responsible for ambient image processing and is composed of an image sensor, an inertial sensor, a processor for sensor fusion, peripheral circuits, and other parts. The back end can be composed of a general-purpose CPU system such as a PC or an ARM (Advanced RISC Machines) system. The front end and back end can communicate through USB (Universal Serial Bus) 3.0.
In one embodiment, the structural schematic diagram of the processing system is shown in Fig. 6. As shown in Fig. 6, the processing system in the embodiment of the present invention may include a wearable device 301, a first control terminal 302, and a second control terminal 303. Correspondingly, the wearable device 301 is the front-end part of the processing system, and the first control terminal 302 and the second control terminal 303 together form the back-end part.
The wearable device 301 is connected to the first control terminal 302 through a third interface. The wearable device 301 is used to obtain ambient image data and the motion data of the wearable device, calculate the attitude data of the wearable device 301 according to the ambient image data and the motion data, and send the attitude data to the first control terminal 302.
The wearable device 301 is connected to the second control terminal 303 through a fourth interface. The wearable device 301 is also used to generate a map processing message according to the ambient image data and send the map processing message to the second control terminal 303; the map processing message includes the ambient image data.
The first control terminal 302 is used to determine virtual information according to the attitude data and send the virtual information to the wearable device 301. In one embodiment, the first control terminal 302 may include a 6DoF posture output module 3021; the specific system structure can be as shown in Fig. 7. The 6DoF posture output module 3021 can determine virtual information from the image database stored in the first control terminal 302 according to the attitude data sent by the wearable device 301, and send the determined virtual information to the wearable device 301, so that the wearable device 301 displays the virtual information in front of the user.
The second control terminal 303 is used to perform environmental-map processing according to the ambient image data and generate map data. In one embodiment, the second control terminal 303 may include a loop-closure detection module 3031, a map maintenance module 3032, and a relocation module 3033. As shown in Fig. 7, the loop-closure detection module 3031, the map maintenance module 3032, and the relocation module 3033 can all be connected to the wearable device 301 through the fourth interface. The loop-closure detection module 3031 is responsible for loop-closure detection, closing loops to ensure the consistency of the map. The map maintenance module 3032 is responsible for updating the three-dimensional positions of map points using point/line-segment matches between multiple frames, and for maintaining the postures of multiple key frames using the sparse bundle adjustment (SBA) algorithm. The relocation module 3033 is responsible for recovery after the attitude tracking of the wearable device 301 fails. Therefore, the second control terminal 303 can call different modules according to different map processing messages to perform environmental-map processing and generate map data.
In one embodiment, the map processing message can be a loop-closure detection message. Loop-closure detection refers to the second control terminal 303 detecting whether a closed loop appears in the motion trajectory of the wearable device 301. Correspondingly, when performing environmental-map processing according to the ambient image data, the second control terminal 303 can call the loop-closure detection module 3031 to match the ambient image corresponding to the ambient image data against the key frame information included in the generated map data, and, when the matching result indicates that the current environmental-map processing state is a loop-closure state, perform loop-closure processing on the environmental map. In one embodiment, the second control terminal 303 can also call the loop-closure detection module 3031 to perform loop-closure detection using a loop-closure detection algorithm (such as the DBoW2 algorithm). If the second control terminal 303 detects that a loop closure has occurred, it can call the global SBA of the map maintenance module 3032 to perform bundle adjustment.
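The winding (loop-closure) check performed by module 3031 can be sketched as a similarity search over past key frames that skips recent temporal neighbours; the bag-of-words vectors and histogram-intersection score below are simplified stand-ins for a DBoW2-style vocabulary, and the gap and threshold values are assumptions:

```python
def bow_similarity(a, b):
    """Histogram intersection between two bag-of-words vectors, in [0, 1]."""
    inter = sum(min(x, y) for x, y in zip(a, b))
    denom = min(sum(a), sum(b))
    return inter / denom if denom else 0.0

def detect_loop(current_id, current_vec, keyframes, min_gap=30, threshold=0.8):
    """Return the id of the best loop-closure candidate key frame, or None.

    `keyframes` maps keyframe id -> bag-of-words vector.  Key frames
    within `min_gap` frames of the current one are skipped, since
    temporal neighbours always look similar to the current frame.
    """
    best_id, best_score = None, threshold
    for kf_id, vec in keyframes.items():
        if current_id - kf_id < min_gap:
            continue
        score = bow_similarity(current_vec, vec)
        if score > best_score:
            best_id, best_score = kf_id, score
    return best_id
```

When a candidate is returned, the back end would verify it geometrically and then trigger the global bundle adjustment that the description mentions.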
In one embodiment, the map processing message can be a map update message. The map update message includes the feature points in the ambient image corresponding to the ambient image data and the image location information of those feature points. Correspondingly, when performing environmental-map processing according to the ambient image data, the second control terminal 303 can call the map maintenance module 3032 to perform environmental-map update processing according to the feature points in the ambient image corresponding to the ambient image data and their image location information.
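A minimal sketch of such an update step, assuming the map maintenance module folds each new triangulated observation of a feature point into a running mean of its 3-D position (the real module also uses line segments and sparse bundle adjustment, which are omitted here, and all names are illustrative):

```python
class MapPoint:
    """A 3-D map point refined incrementally from repeated observations."""
    def __init__(self, xyz):
        self.xyz = list(xyz)
        self.n_obs = 1

    def update(self, xyz):
        """Fold a new triangulated observation into the running mean."""
        self.n_obs += 1
        w = 1.0 / self.n_obs
        self.xyz = [old + w * (new - old) for old, new in zip(self.xyz, xyz)]

def update_map(map_points, observations):
    """Apply a map-update message: {point_id: (x, y, z)} observations.

    New point ids are inserted; known ids refine the stored position.
    """
    for pid, xyz in observations.items():
        if pid in map_points:
            map_points[pid].update(xyz)
        else:
            map_points[pid] = MapPoint(xyz)
    return map_points
```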
In one embodiment, if the wearable device 301 fails to collect ambient image data within a preset period of time, it sends a relocation message to the second control terminal 303. Correspondingly, after receiving the relocation message, the second control terminal 303 can call the relocation module 3033 to perform relocation processing on the wearable device according to the key frame information included in the generated map data, obtain relocation data, and send the relocation data to the wearable device 301. In one embodiment, the relocation module 3033 can call a relocation algorithm (such as the GCBB algorithm) to perform the relocation processing on the wearable device.
In one embodiment, the second control terminal 303 is also operable to perform relocation processing on the wearable device 301 according to the key frame information included in the generated map data, obtain relocation data, and send the relocation data to the wearable device 301. In one embodiment, if the second control terminal 303 fails to receive the ambient image data sent by the wearable device 301 within a preset period of time, the second control terminal 303 can determine that the wearable device 301 is in a tracking-failure state. The second control terminal 303 can therefore perform relocation processing on the wearable device 301 according to the key frame information included in the generated map data, obtain relocation data, and send the relocation data to the wearable device 301, so that after receiving the relocation data, the wearable device 301 can calculate its attitude data according to the relocation data and the motion data.
In one embodiment, the third interface and the fourth interface may each be a USB interface. In actual hardware, the third interface and the fourth interface can be the same USB interface or different USB interfaces.
The processing system in the embodiment of the present invention can complete the image processing of ambient image data, the calculation of attitude data, and the sensor fusion with the processor in the wearable device, ensuring that the processing system outputs high-frame-rate, low-delay attitude data in real time. At the same time, it can communicate with the first control terminal and the second control terminal through the third interface and the fourth interface respectively, and use the second control terminal to asynchronously process map maintenance operations such as map update, relocation, and loop-closure detection. The processing system can therefore avoid problems such as data transmission delay and frame loss, and meet the demand of AR/VR attitude tracking for high real-time, low-delay, and high-precision posture output. It can also ensure the consistency of the SLAM map and the robustness of the processing system, effectively improving the immersion of the wearable device and reducing the user's dizziness.
According to another embodiment of the present invention, the units in the wearable device or processing system shown in Fig. 1 or Figs. 4 to 7 can be merged, individually or entirely, into one or several other units, or some of the units can be further split into multiple functionally smaller units. This achieves the same operation without affecting the technical effect of the embodiment of the present invention. In practical applications, the function of one unit can also be realized by multiple units, or the functions of multiple units can be realized by one unit. In other embodiments of the present invention, the wearable device or processing system can also include other units; in practical applications, these functions can also be realized with the assistance of other units, through the cooperation of multiple units.
Based on the description of the wearable-device embodiments above, please refer to Fig. 8, which is a flow diagram of a processing method of ambient image data provided by an embodiment of the present invention. The processing method of ambient image data provided by the embodiment of the present invention can be executed by the processor in the wearable device described above, and can include the following steps.
S101: obtain ambient image data, and generate a map processing message according to the ambient image data. In one embodiment, the processor in the wearable device can call the image sensor to collect ambient image data. After collecting the ambient image data, the processor can generate the map processing message according to the ambient image data. In one embodiment, a specific implementation of generating the map processing message according to the ambient image data is: call a feature-point matching algorithm to perform feature-point matching on the ambient image corresponding to the ambient image data to obtain a matching result, and generate the map processing message according to the matching result.
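The feature-point matching step in S101 can be sketched as a brute-force nearest-neighbour search over binary (ORB-style) descriptors between two frames; the descriptor width and distance threshold are illustrative assumptions:

```python
def hamming(a, b):
    """Hamming distance between two binary descriptors stored as ints."""
    return bin(a ^ b).count("1")

def match_features(desc_prev, desc_curr, max_dist=16):
    """Brute-force match binary descriptors between two frames.

    Returns a list of (index_prev, index_curr) pairs whose best
    Hamming distance is below `max_dist`.
    """
    matches = []
    for i, d_prev in enumerate(desc_prev):
        best_j, best_d = None, max_dist
        for j, d_curr in enumerate(desc_curr):
            d = hamming(d_prev, d_curr)
            if d < best_d:
                best_j, best_d = j, d
        if best_j is not None:
            matches.append((i, best_j))
    return matches
```

The resulting index pairs, together with the feature locations in each image, are the kind of content the map processing message carries to the second control terminal.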
S102: obtain motion data, and calculate the attitude data of the wearable device according to the ambient image data and the motion data. In one embodiment, the processor in the wearable device can call the inertial sensor to sense the motion data. After obtaining the motion data, the processor can calculate the attitude data of the wearable device according to the ambient image data and the motion data. The specific calculation of the attitude data can refer to the method mentioned in the wearable-device embodiments above, and is not described again here.
S103: send the attitude data to the first control terminal, and send the map processing message to the second control terminal. The map processing message includes the ambient image data, and instructs the second control terminal to perform environmental-map processing according to the ambient image data and generate map data.
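The routing in S103 can be sketched with two queues standing in for the two physical interfaces (for example two USB endpoints); the message layout and names are illustrative only:

```python
import queue

# Queues stand in for the two physical interfaces.
to_first_control = queue.Queue()    # third interface: attitude data
to_second_control = queue.Queue()   # fourth interface: map processing messages

def dispatch(attitude_data, ambient_image_data):
    """Step S103: route each output onto its own interface so the image
    stream and the pose stream never contend for a single channel."""
    to_first_control.put({"type": "attitude", "data": attitude_data})
    to_second_control.put({"type": "map_processing",
                           "image": ambient_image_data})

dispatch((0.0, 0.0, 0.0, 1.0, 0.0, 0.0), b"\x00" * 8)
```

Keeping the two streams on separate channels is what lets the lightweight pose data reach the first control terminal without waiting behind the much larger image payloads.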
In one embodiment, the method can also include: if ambient image data is not successfully obtained within a preset period of time, sending a relocation message to the second control terminal, the relocation message instructing the second control terminal to perform relocation processing on the wearable device according to the key frame information included in the generated map data.
In one embodiment, the method can also include: receiving the relocation data sent by the second control terminal, and calculating the attitude data of the wearable device according to the relocation data and the motion data sensed by the inertial sensor, the relocation data being obtained after the second control terminal performs relocation processing on the wearable device according to the key frame information included in the generated map data.
In one embodiment, the method can also include: receiving the virtual information sent by the first control terminal, the virtual information being determined by the first control terminal according to the attitude data.
According to one embodiment of the present invention, steps S101 to S103 of the processing method of ambient image data shown in Fig. 8 can be executed by the units in the wearable device shown in Fig. 4. For example, steps S101 to S103 shown in Fig. 8 can be executed respectively by the image sensor 204, the inertial sensor 205, and the processor 203 shown in Fig. 4.
The wearable device of the embodiment of the present invention obtains ambient image data and generates a map processing message according to the ambient image data; obtains motion data and calculates the attitude data of the wearable device according to the ambient image data and the motion data; and sends the attitude data to the first control terminal and the map processing message to the second control terminal. The embodiment of the present invention can thereby avoid the data frame loss caused by the ambient image data and the motion data contending for a channel when transmitted to an intelligent terminal. In addition, the wearable device in the embodiment of the present invention transmits the image data and the motion data separately through two interfaces with different transmission rates, which can avoid transmission delay and thus improve the user's immersion.
In one embodiment, the embodiment of the present invention also provides a computer storage medium. The computer storage medium stores program instructions which, when loaded and executed by a processor, implement the processing method of ambient image data described with reference to Fig. 8.
One of ordinary skill in the art will appreciate that all or part of the flows of the methods in the above embodiments can be completed by a computer program instructing the relevant hardware. The program can be stored in a computer-readable storage medium and, when executed, may include the flows of the embodiments of the above methods. The storage medium can be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
What is disclosed above is only a part of the embodiments of the present invention, which certainly cannot be used to limit the scope of rights of the present invention. Equivalent changes made in accordance with the claims of the present invention therefore still fall within the scope of the present invention.

Claims (14)

1. A wearable device, characterized by comprising: a first interface, a second interface, and a processor;
the processor is connected to an image sensor through the first interface, and is configured to receive the ambient image data collected by the image sensor;
the processor is connected to an inertial sensor through the second interface, and is further configured to receive the motion data of the wearable device sensed by the inertial sensor;
the processor is further configured to calculate the attitude data of the wearable device according to the ambient image data and the motion data, and to send the attitude data to a first control terminal;
the processor is further configured to generate a map processing message according to the ambient image data and to send the map processing message to a second control terminal, the map processing message including the ambient image data, and the map processing message instructing the second control terminal to perform environmental-map processing according to the ambient image data and generate map data.
2. The wearable device according to claim 1, characterized in that the wearable device further comprises an image sensor and/or an inertial sensor;
the image sensor is configured to collect ambient image data and send the ambient image data to the processor through the first interface;
the inertial sensor is configured to sense the motion data of the wearable device and send the motion data to the processor through the second interface.
3. The wearable device according to claim 1 or 2, characterized in that,
when generating the map processing message according to the ambient image data, the processor is configured to call a feature-point matching algorithm to perform feature-point matching on the ambient image corresponding to the ambient image data, obtain a matching result, and generate the map processing message according to the matching result.
4. The wearable device according to claim 1 or 2, characterized in that,
the processor is further configured, if the ambient image data sent by the image sensor is not successfully received within a preset period of time, to send a relocation message to the second control terminal, the relocation message instructing the second control terminal to perform relocation processing on the wearable device according to the key frame information included in the generated map data.
5. The wearable device according to claim 1 or 2, characterized in that,
the processor is further configured to receive the relocation data sent by the second control terminal, and to calculate the attitude data of the wearable device according to the relocation data and the motion data sensed by the inertial sensor, the relocation data being obtained after the second control terminal performs relocation processing on the wearable device according to the key frame information included in the generated map data.
6. The wearable device according to claim 1 or 2, characterized in that the first interface is a Mobile Industry Processor Interface (MIPI);
the second interface is a Serial Peripheral Interface (SPI) or an Inter-Integrated Circuit (I2C) interface.
7. The wearable device according to claim 6, characterized in that the processor is further configured to receive the virtual information sent by the first control terminal, the virtual information being determined by the first control terminal according to the attitude data.
8. A processing system for ambient image data, characterized by comprising a wearable device, a first control terminal, and a second control terminal;
the wearable device is connected to the first control terminal, and is configured to obtain ambient image data and the motion data of the wearable device, calculate the attitude data of the wearable device according to the ambient image data and the motion data, and send the attitude data to the first control terminal;
the wearable device is connected to the second control terminal, and is further configured to generate a map processing message according to the ambient image data and send the map processing message to the second control terminal, the map processing message including the ambient image data;
the first control terminal is configured to determine virtual information according to the attitude data and send the virtual information to the wearable device;
the second control terminal is configured to perform environmental-map processing according to the ambient image data and generate map data.
9. The processing system according to claim 8, characterized in that,
the wearable device is further configured, if ambient image data is not successfully collected within a preset period of time, to send a relocation message to the second control terminal;
the second control terminal is further configured, after receiving the relocation message, to perform relocation processing on the wearable device according to the key frame information included in the generated map data, obtain relocation data, and send the relocation data to the wearable device.
10. The processing system according to claim 8, characterized in that,
the second control terminal is further configured to perform relocation processing on the wearable device according to the key frame information included in the generated map data, obtain relocation data, and send the relocation data to the wearable device;
the wearable device is further configured to receive the relocation data sent by the second control terminal, and to calculate the attitude data of the wearable device according to the relocation data and the motion data.
11. The processing system according to claim 9 or 10, characterized in that,
when performing environmental-map processing according to the ambient image data, the second control terminal is configured to match the ambient image corresponding to the ambient image data against the key frame information included in the generated map data and, when the matching result indicates that the current environmental-map processing state is a loop-closure state, perform loop-closure processing on the environmental map.
12. The processing system according to claim 9 or 10, characterized in that the ambient image data includes the feature points in the ambient image corresponding to the ambient image data and the image location information of those feature points;
when performing environmental-map processing according to the ambient image data, the second control terminal is configured to perform environmental-map update processing according to the feature points in the ambient image corresponding to the ambient image data and their image location information.
13. A processing method of ambient image data, characterized by comprising:
obtaining ambient image data, and generating a map processing message according to the ambient image data;
obtaining motion data, and calculating the attitude data of a wearable device according to the ambient image data and the motion data;
sending the attitude data to a first control terminal and the map processing message to a second control terminal, the map processing message including the ambient image data, and the map processing message instructing the second control terminal to perform environmental-map processing according to the ambient image data and generate map data.
14. A computer storage medium, characterized in that the computer storage medium stores program instructions which, when loaded and executed by a processor, implement the processing method of ambient image data according to claim 13.
CN201810149313.1A 2018-02-13 2018-02-13 Wearable device, environment image data processing system, method and readable medium Active CN108427479B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810149313.1A CN108427479B (en) 2018-02-13 2018-02-13 Wearable device, environment image data processing system, method and readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810149313.1A CN108427479B (en) 2018-02-13 2018-02-13 Wearable device, environment image data processing system, method and readable medium

Publications (2)

Publication Number Publication Date
CN108427479A true CN108427479A (en) 2018-08-21
CN108427479B CN108427479B (en) 2021-01-29

Family

ID=63157007

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810149313.1A Active CN108427479B (en) 2018-02-13 2018-02-13 Wearable device, environment image data processing system, method and readable medium

Country Status (1)

Country Link
CN (1) CN108427479B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109147058A (en) * 2018-08-31 2019-01-04 腾讯科技(深圳)有限公司 Initial method and device and storage medium for the fusion of vision inertial navigation information
CN109255817A (en) * 2018-09-14 2019-01-22 北京猎户星空科技有限公司 A kind of the vision method for relocating and device of smart machine
CN109544615A (en) * 2018-11-23 2019-03-29 深圳市腾讯信息技术有限公司 Method for relocating, device, terminal and storage medium based on image
CN109788194A (en) * 2018-12-27 2019-05-21 北京航空航天大学 A kind of adaptivity wearable device subjectivity multi-view image acquisition method
CN114201004A (en) * 2020-09-18 2022-03-18 苹果公司 Inertial data management for augmented reality of mobile platforms
CN115039015A (en) * 2020-02-19 2022-09-09 Oppo广东移动通信有限公司 Pose tracking method, wearable device, mobile device and storage medium

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120299962A1 (en) * 2011-05-27 2012-11-29 Nokia Corporation Method and apparatus for collaborative augmented reality displays
US20140240223A1 (en) * 2013-02-22 2014-08-28 Thalmic Labs Inc. Method and apparatus for analyzing capacitive emg and imu sensor signals for gesture control
US20160203359A1 (en) * 2015-01-12 2016-07-14 BMT Business Meets Technology Holding AG Wink Gesture Based Control System
CN105981369A (en) * 2013-12-31 2016-09-28 谷歌技术控股有限责任公司 Methods and Systems for Providing Sensor Data and Image Data to an Application Processor in a Digital Image Format
CN106507092A (en) * 2016-11-29 2017-03-15 歌尔科技有限公司 Camera head and its image processing method, virtual reality device
CN106774844A (en) * 2016-11-23 2017-05-31 上海创米科技有限公司 A kind of method and apparatus for virtual positioning
CN107160395A (en) * 2017-06-07 2017-09-15 中国人民解放军装甲兵工程学院 Map constructing method and robot control system
CN206541019U (en) * 2017-03-03 2017-10-03 北京国承万通信息科技有限公司 Position optical signal launch equipment, optical positioning system and virtual reality system
CN107270900A (en) * 2017-07-25 2017-10-20 广州阿路比电子科技有限公司 A kind of 6DOF locus and the detecting system and method for posture
CN107610175A (en) * 2017-08-04 2018-01-19 华南理工大学 The monocular vision SLAM algorithms optimized based on semi-direct method and sliding window
CN107657640A (en) * 2017-09-30 2018-02-02 南京大典科技有限公司 Intelligent patrol inspection management method based on ORB SLAM

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Gu Zhaopeng et al., "Monocular Vision Simultaneous Localization and Mapping Method Based on Partial Inertial Sensor Information", Journal of Computer-Aided Design & Computer Graphics *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109147058A (en) * 2018-08-31 2019-01-04 Tencent Technology (Shenzhen) Co., Ltd. Initialization method and device for visual and inertial navigation information fusion, and storage medium
CN109147058B (en) * 2018-08-31 2022-09-20 Tencent Technology (Shenzhen) Co., Ltd. Initialization method and device for visual and inertial navigation information fusion and storage medium
CN109255817A (en) * 2018-09-14 2019-01-22 Beijing Orion Star Technology Co., Ltd. Visual relocation method and device for a smart device
CN109544615A (en) * 2018-11-23 2019-03-29 Shenzhen Tencent Information Technology Co., Ltd. Image-based repositioning method, device, terminal and storage medium
CN109544615B (en) * 2018-11-23 2021-08-24 Shenzhen Tencent Information Technology Co., Ltd. Image-based repositioning method, device, terminal and storage medium
CN109788194A (en) * 2018-12-27 2019-05-21 Beihang University Adaptive wearable device subjective visual angle image acquisition method
CN109788194B (en) * 2018-12-27 2020-08-25 Beihang University Adaptive wearable device subjective visual angle image acquisition method
CN115039015A (en) * 2020-02-19 2022-09-09 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Pose tracking method, wearable device, mobile device and storage medium
CN114201004A (en) * 2020-09-18 2022-03-18 Apple Inc. Inertial data management for augmented reality of mobile platforms

Also Published As

Publication number Publication date
CN108427479B (en) 2021-01-29

Similar Documents

Publication Publication Date Title
CN108427479A (en) Wearable device, processing system and method for ambient image data, and readable medium
CN107820593A (en) Virtual reality interaction method, apparatus and system
KR101509472B1 (en) Motion parameter determination method and device and motion auxiliary equipment
US10825197B2 (en) Three dimensional position estimation mechanism
JP2021034021A (en) Pose prediction with recurrent neural networks
CN206961066U (en) Virtual reality interaction device
CN106980368A (en) Virtual reality interaction device based on visual computing and an inertial measurement unit
CN104508600A (en) Three-dimensional user-interface device, and three-dimensional operation method
CN102667672A (en) Acceleration-based motion identification method and system
CN103140879A (en) Information presentation device, digital camera, head mount display, projector, information presentation method, and information presentation program
CN103955295A (en) Real-time virtual hand grasping method based on a data glove and a physics engine
CN104662587A (en) Three-dimensional user-interface device, and three-dimensional operation method
US10948994B2 (en) Gesture control method for wearable system and wearable system
US20230325011A1 (en) Electronic device for controlling host device by using motion signal and mouse signal
EP4105766A1 (en) Image display method and apparatus, and computer device and storage medium
CN110928404B (en) Tracking system and related tracking method thereof
CN113033369B (en) Motion capture method, motion capture device, electronic equipment and computer readable storage medium
CN108427595A (en) Method and device for determining the display position of user interface controls in virtual reality
US20210142511A1 (en) Method of generating 3-dimensional model data
CN110688002A (en) Virtual content adjusting method and device, terminal equipment and storage medium
CN106970705A (en) Motion capture method, device and electronic equipment
CN110442235A (en) Positioning and tracking method and device, terminal device, and computer-readable storage medium
CN107533244A (en) Head-mounted device position monitoring assembly
Zhang et al. Ubiquitous human body motion capture using micro-sensors
CN205880817U (en) AR and VR data processing equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant