CN109087237B - Virtual helmet - Google Patents

Virtual helmet

Info

Publication number
CN109087237B
CN109087237B (application CN201810714466.6A)
Authority
CN
China
Prior art keywords
image
unit
plane
connecting line
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810714466.6A
Other languages
Chinese (zh)
Other versions
CN109087237A (en)
Inventor
Deng Wenjie
Wu Kunjie
Qian Xuebin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Haikou College Of Economics
Original Assignee
Haikou College Of Economics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Haikou College Of Economics filed Critical Haikou College Of Economics
Priority to CN201810714466.6A
Publication of CN109087237A
Application granted
Publication of CN109087237B

Classifications

    • G06T3/047
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/04Indexing scheme for image data processing or generation, in general involving 3D image data

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a virtual helmet comprising a housing, a first fisheye camera, a second fisheye camera, a 3D correction unit, a first comparison unit, a first processing unit, a second comparison unit, a plane image integration unit, a second processing unit, and a display unit. The virtual helmet combines a virtual reality image well with the actual image, so that a user can wear the helmet for entertainment while moving.

Description

Virtual helmet
Technical Field
The invention relates to the technical field of electronic equipment, in particular to a virtual helmet.
Background
The virtual helmet is a new type of game experience tool that, compared with tools such as computers and mobile phones, better enables a user to enjoy an immersive world and see everything in the virtual world.
Chinese patent document No. CN105975053A discloses a virtual helmet control method and device, and a virtual helmet, the virtual helmet being provided with an action generator. The method includes: monitoring the key frames of a video while the virtual helmet plays it; judging whether a key frame is a preset key frame; when it is, acquiring the operation instruction corresponding to the preset key frame; and controlling the action generator to execute the action matched with the operation instruction. When the virtual helmet of that document plays a video, a 4D playing effect can thus be achieved, which greatly improves the user's viewing experience.
Chinese patent document No. CN205750117U discloses a holographic virtual helmet comprising a face fixing member and a helmet main body. A standby power supply for charging a mobile device is arranged in the helmet main body and is provided with a USB output port. With this structure, the standby power supply can charge the mobile device, preventing it from powering off because of high power consumption partway through an experience session and spoiling the enjoyment. In addition, when no mains power is available, the standby power supply can power the holographic virtual helmet for a certain time, so its use is not affected.
Existing virtual helmets can generally be used only while stationary or indoors, because a user wearing the helmet outdoors usually has difficulty walking, which limits the applications of the virtual helmet.
Disclosure of Invention
The virtual helmet provided by the invention can well combine a virtual reality image with an actual image, and a user can wear the helmet for entertainment in the process of movement.
To achieve the above object, the present invention provides a virtual helmet comprising:
a housing;
a first fisheye camera and a second fisheye camera, horizontally spaced apart on the front of the housing and used for acquiring a first 3D image and a second 3D image of the scene in front of the housing;
a 3D correction unit for 3D correcting the first 3D image and the second 3D image into a first planar image and a second planar image;
a first comparison unit for comparing whether at least two overlappable unit plane images exist in the first plane image and the second plane image, each unit plane image corresponding to a real object with a closed contour, the real object being a person or a thing;
a first processing unit for acquiring, after the first comparison unit determines that at least two overlappable unit plane images exist in the first plane image and the second plane image, at least two unit 3D images corresponding to the at least two overlappable unit plane images in the first 3D image and the second 3D image;
a second comparison unit for comparing the horizontal distances from the real objects a and b, corresponding to the two unit 3D images, to the connecting line L between the center O1 of the first fisheye camera and the center O2 of the second fisheye camera;
a plane image integration unit for integrating the first plane image and the second plane image into an overall image;
a second processing unit for generating a virtual reality image and, according to the horizontal distances from the real objects a and b to the central connecting line L acquired by the second comparison unit, causing the unit plane image corresponding to a distant real object in at least part of the overall image to be covered by the virtual reality image, and/or causing the virtual reality image to be covered by the unit plane image corresponding to a near real object in at least part of the overall image;
and the display unit is used for displaying the virtual reality image and the whole image generated by the second processing unit to the helmet wearer.
Optionally, the virtual helmet further includes a unit plane image acquiring unit, which includes:
a contour dividing unit for dividing out the closed contours of the sub-images in the first plane image and the second plane image;
and a unit plane image acquisition unit for taking the sub-image within each closed contour as a unit plane image.
Optionally, the second comparison unit includes:
a first fixed point acquisition unit for acquiring two fixed points A and B in the first 3D image, where the fixed point A corresponds to a point S1 on the real object a and the fixed point B corresponds to a point S2 on the real object b;
a second fixed point acquisition unit for acquiring two fixed points C and D in the second 3D image, where the fixed point C corresponds to the point S1 on the real object a and the fixed point D corresponds to the point S2 on the real object b;
a distance calculation unit for calculating the horizontal distance L1 from the point S1 to the central connecting line L and the horizontal distance L2 from the point S2 to the central connecting line L according to the positions of the fixed points A, B, C and D and the centers O1 and O2;
and a distance judgment unit for judging the horizontal distances from the real objects a and b to the central connecting line L according to the magnitudes of L1 and L2: if L1 is greater than L2, the real object a is farther from the central connecting line L than the real object b; if L1 is less than L2, the real object a is closer to the central connecting line L than the real object b.
Optionally, the planar image integration unit performs the integration by using a feature-based and transform domain-based image registration algorithm.
Optionally, the virtual helmet further comprises a gyroscope disposed in the housing and used for acquiring the angle formed between the central connecting line L of the first fisheye camera and the second fisheye camera and the horizontal plane;
and the 3D image angle conversion module is used for converting the first 3D image and the second 3D image according to the angle acquired by the gyroscope and enabling the central connecting line of the first 3D image and the second 3D image to be parallel to the horizontal plane.
Optionally, the 3D image angle conversion module includes:
the X-direction angle conversion module is used for converting the first 3D image and the second 3D image in the X direction;
a Y-direction angle conversion module for converting the first 3D image and the second 3D image in a Y direction;
and the Z-direction angle conversion module is used for converting the first 3D image and the second 3D image in the Z direction.
Optionally, the virtual helmet further includes a communication module, configured to be in signal connection with the electronic device.
Optionally, the virtual helmet further includes a battery for supplying power to the first and second fisheye cameras, the 3D correction unit, the first comparison unit, the first processing unit, the second comparison unit, the plane image integration unit, the second processing unit, and the display unit.
The invention has the following advantages:
the virtual helmet can acquire the distance of each real object from the helmet through the fisheye cameras, so that the virtual reality image can partially cover the unit plane images and/or be partially covered by them. The virtual reality image is thereby fused into the real image, and the user can play games and be entertained while still paying attention to the surrounding environment, preventing dangerous accidents.
Drawings
Fig. 1 is a schematic diagram of the imaging principle of a conventional fisheye camera.
FIG. 2 is a schematic diagram illustrating how the distance of an imaged real object is obtained according to the present invention.
Fig. 3 is a schematic view of the installation of a fisheye camera on the virtual helmet provided by the invention.
Detailed Description
The following examples are intended to illustrate the invention but are not intended to limit the scope of the invention.
As shown in fig. 3, the present invention provides a virtual helmet comprising:
a housing 100; the housing can adopt any of various existing styles, the criterion being that it can be worn.
A first fisheye camera 200 and a second fisheye camera 300 are horizontally spaced apart on the front of the housing and are used for acquiring a first 3D image and a second 3D image of the scene in front of the housing. The imaging principle of a fisheye camera is shown in Fig. 1: OZ is the optical axis and the XY plane is the imaging direction plane; P and Q in Fig. 1 are imaged objects, which appear in the fisheye camera as the points P' and Q' on the hemispherical surface, the connecting lines PP' and QQ' passing through the point O.
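The hemispherical imaging of Fig. 1 is commonly modeled by the equidistant fisheye projection r = f·θ. The following is a minimal sketch under that assumption; the function name and the choice of the equidistant model are illustrative, since the patent does not specify the lens mapping.

```python
import math

def equidistant_fisheye_project(x, y, z, f=1.0):
    """Project a 3-D point onto an equidistant fisheye image plane.

    Assumes the optical axis is OZ (as in Fig. 1) and the common
    equidistant mapping r = f * theta; both are illustrative choices.
    """
    theta = math.atan2(math.hypot(x, y), z)  # angle off the optical axis
    phi = math.atan2(y, x)                   # azimuth around the axis
    r = f * theta                            # equidistant fisheye mapping
    return r * math.cos(phi), r * math.sin(phi)
```

A point on the optical axis maps to the image center, and the image radius grows linearly with the angle off the axis, which is why straight lines bow outward in fisheye images — the distortion the 3D correction unit removes.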
A 3D correction unit is used for 3D-correcting the first 3D image and the second 3D image into a first plane image and a second plane image; the 3D correction method is conventional and is not described in detail here.
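Since the description leaves the correction method conventional, here is a minimal sketch of one way such a correction can map an equidistant-fisheye image point back to a rectilinear (planar) point; the equidistant model and the function name are assumptions.

```python
import math

def fisheye_to_planar(xf, yf, f=1.0):
    """Map an equidistant-fisheye image point to a rectilinear (planar)
    image point: undo r = f * theta, then apply the pinhole r' = f * tan(theta).
    A minimal stand-in for the 3D correction unit; valid for theta < pi/2.
    """
    r = math.hypot(xf, yf)
    if r == 0.0:
        return 0.0, 0.0
    theta = r / f                 # angle off the optical axis
    rp = f * math.tan(theta)      # rectilinear radius for the same ray
    return xf * rp / r, yf * rp / r
```

Applying this per pixel (with interpolation) straightens the bowed lines of the fisheye image into an ordinary plane image.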
A first comparison unit is used for comparing whether at least two overlappable unit plane images exist in the first plane image and the second plane image, each unit plane image corresponding to a real object with a closed contour, the real object being a person or a thing. A unit plane image is a plane image belonging to one and the same real object with a closed contour; for example, the plane image of a person may include a hair unit plane image, a face unit plane image, a body unit plane image, and so on. To obtain the unit plane images, the virtual helmet may further include a unit plane image acquiring unit, which may include: a contour dividing unit for dividing out the closed contours of the sub-images in the first plane image and the second plane image, where the division can be performed with existing OpenCV tools; and a unit plane image acquisition unit for taking the sub-image within each closed contour as a unit plane image. A sub-image is an image making up the first plane image or the second plane image.
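OpenCV (e.g. its contour-finding routines) is what the description names for this division. As a self-contained sketch of the underlying idea — separating an image into closed regions, each a candidate unit plane image — here is a pure-Python connected-region labeling on a binary mask; all names are illustrative.

```python
from collections import deque

def closed_regions(binary):
    """Label 4-connected foreground regions in a binary image: a
    pure-Python stand-in for the closed-contour division the patent
    performs with OpenCV. Returns (region_count, label_grid)."""
    h, w = len(binary), len(binary[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if binary[sy][sx] and not labels[sy][sx]:
                next_label += 1                      # new closed region found
                labels[sy][sx] = next_label
                queue = deque([(sy, sx)])
                while queue:                         # breadth-first flood fill
                    y, x = queue.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = next_label
                            queue.append((ny, nx))
    return next_label, labels
```

Each label then selects one sub-image, i.e. one candidate unit plane image, from the plane image.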
A first processing unit is used for acquiring, after the first comparison unit determines that at least two overlappable unit plane images exist in the first plane image and the second plane image, at least two unit 3D images corresponding to those unit plane images in the first 3D image and the second 3D image.
A second comparison unit is used for comparing the horizontal distances from the real objects a and b, corresponding to the two unit 3D images, to the connecting line L between the center O1 of the first fisheye camera and the center O2 of the second fisheye camera. The comparison exploits the special imaging of the fisheye camera. As shown in Figs. 2 and 3, the second comparison unit may specifically include: a first fixed point acquisition unit for acquiring two fixed points A and B in the first 3D image, where the fixed point A corresponds to a point S1 on the real object a and the fixed point B corresponds to a point S2 on the real object b; a second fixed point acquisition unit for acquiring two fixed points C and D in the second 3D image, where the fixed point C corresponds to the point S1 on the real object a and the fixed point D corresponds to the point S2 on the real object b; and a distance calculation unit for calculating the horizontal distance L1 from the point S1 to the central connecting line L and the horizontal distance L2 from the point S2 to the central connecting line L according to the positions of the fixed points A, B, C and D and the centers O1 and O2. The calculation can be carried out by establishing a three-dimensional spatial coordinate system: with the points A, B, C, D, O1 and O2 known, the point S1 can be determined as the intersection of the two straight lines O1A and O2C, and the point S2 as the intersection of the two straight lines O1B and O2D.
A distance judgment unit is used for judging the horizontal distances from the real objects a and b to the central connecting line L according to the magnitudes of L1 and L2: if L1 is greater than L2, the real object a is farther from the central connecting line L than the real object b; if L1 is less than L2, the real object a is closer to the central connecting line L than the real object b.
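The construction of S1 from the lines O1A and O2C (and of S2 from O1B and O2D) can be sketched as follows. With noisy measurements the two rays rarely meet exactly, so this sketch returns the midpoint of their common perpendicular — a standard triangulation choice, not something the patent specifies; all names are illustrative.

```python
def _dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def triangulate(o1, a, o2, c):
    """Recover a scene point S from rays O1->A and O2->C as the midpoint
    of the common perpendicular between the two lines (exact intersection
    for noise-free data). Parallel rays are not handled in this sketch."""
    d1 = [ai - oi for ai, oi in zip(a, o1)]      # direction of line O1A
    d2 = [ci - oi for ci, oi in zip(c, o2)]      # direction of line O2C
    w = [p - q for p, q in zip(o2, o1)]
    # Perpendicularity conditions (p2 - p1)·d1 = 0 and (p2 - p1)·d2 = 0
    # give a 2x2 linear system in the ray parameters t1, t2.
    a11, a12 = _dot(d1, d1), -_dot(d1, d2)
    a21, a22 = _dot(d1, d2), -_dot(d2, d2)
    b1, b2 = _dot(w, d1), _dot(w, d2)
    det = a11 * a22 - a12 * a21
    t1 = (b1 * a22 - a12 * b2) / det
    t2 = (a11 * b2 - b1 * a21) / det
    p1 = [o + t1 * d for o, d in zip(o1, d1)]    # closest point on line 1
    p2 = [o + t2 * d for o, d in zip(o2, d2)]    # closest point on line 2
    return [(x + y) / 2.0 for x, y in zip(p1, p2)]
```

Once S1 and S2 are known, their horizontal distances L1 and L2 to the line O1O2 follow by elementary point-to-line geometry.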
A plane image integration unit is used for integrating the first plane image and the second plane image into an overall image. The plane image integration unit may perform the integration with an image registration algorithm based on features and on the transform domain. The specific steps can be as follows. First, an improved Harris corner detection algorithm is used, which raises the speed and accuracy of feature point extraction. Then, with normalized cross-correlation (NCC) as the similarity measure, initial feature point pairs are extracted by bidirectional maximum correlation coefficient matching, and false pairs are eliminated by the Random Sample Consensus (RANSAC) method, achieving accurate matching of the feature point pairs. Finally, the images are registered using the correctly matched feature point pairs.
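The NCC similarity measure named in these steps can be sketched as follows on flattened patches; the improved Harris detector, the bidirectional matching, and the RANSAC stage are omitted, and the function name is illustrative.

```python
def ncc(patch_a, patch_b):
    """Normalized cross-correlation between two equally sized patches,
    given as flat lists of intensities. Returns a value in [-1, 1];
    1 means identical up to brightness and contrast."""
    n = len(patch_a)
    ma = sum(patch_a) / n                      # mean of patch a
    mb = sum(patch_b) / n                      # mean of patch b
    num = sum((a - ma) * (b - mb) for a, b in zip(patch_a, patch_b))
    da = sum((a - ma) ** 2 for a in patch_a) ** 0.5
    db = sum((b - mb) ** 2 for b in patch_b) ** 0.5
    return num / (da * db) if da and db else 0.0
```

Because NCC subtracts the means and divides by the standard deviations, it is insensitive to the brightness and contrast differences that commonly exist between two cameras — which is why it suits cross-camera feature matching.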
A second processing unit is used for generating a virtual reality image and, according to the horizontal distances from the real objects a and b to the central connecting line L acquired by the second comparison unit, causing the unit plane image corresponding to a distant real object in at least part of the overall image to be covered by the virtual reality image, and/or causing the virtual reality image to be covered by the unit plane image corresponding to a near real object. Such processing units are well known to those skilled in the art and are not described further here.
And the display unit is used for displaying the virtual reality image and the whole image generated by the second processing unit to the helmet wearer.
The virtual helmet can obtain the distance of a real object from the helmet through the first and second fisheye cameras, so that the virtual reality image can partially cover the unit plane images and/or be partially covered by them; the virtual reality image is thus fused into the real plane image, and the result is shown on the display unit.
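A minimal per-pixel sketch of this distance-based covering between the real overall image and the virtual reality image, assuming per-pixel depths are available (the patent itself only compares whole-object distances); flat lists stand in for images and all names are illustrative.

```python
def composite(real, virtual, real_depth, virtual_depth):
    """Per-pixel occlusion merge of a real image and a virtual image:
    a nearer real pixel covers the virtual content, while virtual
    content covers more distant real pixels."""
    return [r if rd < vd else v
            for r, v, rd, vd in zip(real, virtual, real_depth, virtual_depth)]
```

This is the usual mixed-reality depth test: real content survives only where it is closer than the virtual content placed at that pixel.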
During movement the wearer's head may tilt, so that the central connecting line L of the two fisheye cameras is no longer horizontal. To solve this problem, the virtual helmet may further include a gyroscope disposed in the housing 100 and used for acquiring the angle formed between the central connecting line L of the first fisheye camera 200 and the second fisheye camera 300 and the horizontal plane, together with a 3D image angle conversion module for converting the first 3D image and the second 3D image according to the angle acquired by the gyroscope so that the central connecting line of the first 3D image and the second 3D image is parallel to the horizontal plane; the central connecting line of the two 3D images is the line connecting the spherical centers of the hemispherical surfaces on which they are distributed. The 3D image angle conversion module may adjust the coordinate axes of the three-dimensional coordinate system in which the first 3D image and the second 3D image lie so that their central connecting line is parallel to the horizontal plane. For example, the 3D image angle conversion module includes: an X-direction angle conversion module for converting the first 3D image and the second 3D image in the X direction; a Y-direction angle conversion module for converting them in the Y direction; and a Z-direction angle conversion module for converting them in the Z direction.
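The X/Y/Z-direction conversions can be sketched with the standard rotation matrices applied point by point; leveling by a roll about Z, and all function names, are illustrative assumptions.

```python
import math

def rot_x(p, a):
    """Rotate point p = (x, y, z) by angle a (radians) about the X axis."""
    x, y, z = p
    return (x, y * math.cos(a) - z * math.sin(a), y * math.sin(a) + z * math.cos(a))

def rot_y(p, a):
    """Rotate point p by angle a about the Y axis."""
    x, y, z = p
    return (x * math.cos(a) + z * math.sin(a), y, -x * math.sin(a) + z * math.cos(a))

def rot_z(p, a):
    """Rotate point p by angle a about the Z axis."""
    x, y, z = p
    return (x * math.cos(a) - y * math.sin(a), x * math.sin(a) + y * math.cos(a), z)

def level_points(points, roll):
    """Rotate every sample of a 3D image by -roll about Z, returning the
    camera center line to a pose parallel to the horizontal plane."""
    return [rot_z(p, -roll) for p in points]
```

With the gyroscope reporting the tilt angle, applying the inverse rotation to every sample of the two 3D images restores a level central connecting line before the 3D correction step.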
To facilitate connecting the virtual helmet with a mobile phone or a wireless network, the virtual helmet further includes a communication module for signal connection with an electronic device; the communication module can be a GPS module, a WiFi module, a Bluetooth module, or the like.
In the present invention, the virtual helmet may further include a battery for supplying power to the first fisheye camera 200, the second fisheye camera 300, the first processing unit, the plane image integration unit, the second processing unit, and the display unit. The first processing unit, the plane image integration unit, and the second processing unit may be integrated in the same chip, for example an Allwinner V3 chip.
Although the invention has been described in detail above with reference to a general description and specific examples, it will be apparent to one skilled in the art that modifications or improvements may be made thereto based on the invention. Accordingly, such modifications and improvements are intended to be within the scope of the invention as claimed.

Claims (8)

1. A virtual helmet, comprising:
a housing;
a first fisheye camera and a second fisheye camera, horizontally spaced apart on the front of the housing and used for acquiring a first 3D image and a second 3D image of the scene in front of the housing;
a 3D correction unit for 3D correcting the first 3D image and the second 3D image into a first planar image and a second planar image;
a first comparison unit for comparing whether at least two overlappable unit plane images exist in the first plane image and the second plane image, each unit plane image corresponding to a real object with a closed contour, the real object being a person or a thing;
a first processing unit for acquiring, after the first comparison unit determines that at least two overlappable unit plane images exist in the first plane image and the second plane image, at least two unit 3D images corresponding to the at least two overlappable unit plane images in the first 3D image and the second 3D image;
a second comparison unit for comparing the horizontal distances from the real objects a and b, corresponding to the two unit 3D images, to the connecting line L between the center O1 of the first fisheye camera and the center O2 of the second fisheye camera;
a planar image integration unit for integrating the first planar image and the second planar image into an overall image;
a second processing unit for generating a virtual reality image and, according to the horizontal distances from the real objects a and b to the central connecting line L acquired by the second comparison unit, causing the unit plane image corresponding to a distant real object in at least part of the overall image to be covered by the virtual reality image, and/or causing the virtual reality image to be covered by the unit plane image corresponding to a near real object in at least part of the overall image;
and the display unit is used for displaying the virtual reality image and the whole image generated by the second processing unit to the helmet wearer.
2. The virtual helmet of claim 1, further comprising a unit planar image acquisition unit comprising:
the contour dividing unit is used for dividing closed contours of the sub-images in the first plane image and the second plane image;
and the unit plane image acquisition unit is used for taking the sub-image in each closed contour as the unit plane image.
3. The virtual helmet of claim 1, wherein the second comparison unit comprises:
a first fixed point acquisition unit for acquiring two fixed points A and B in the first 3D image, where the fixed point A corresponds to a point S1 on the real object a and the fixed point B corresponds to a point S2 on the real object b;
a second fixed point acquisition unit for acquiring two fixed points C and D in the second 3D image, where the fixed point C corresponds to the point S1 on the real object a and the fixed point D corresponds to the point S2 on the real object b;
a distance calculation unit for calculating the horizontal distance L1 from the point S1 to the central connecting line L and the horizontal distance L2 from the point S2 to the central connecting line L according to the positions of the fixed points A, B, C and D and the centers O1 and O2;
and a distance judgment unit for judging the horizontal distances from the real objects a and b to the central connecting line L according to the magnitudes of L1 and L2: if L1 is greater than L2, the real object a is farther from the central connecting line L than the real object b; if L1 is less than L2, the real object a is closer to the central connecting line L than the real object b.
4. The virtual helmet of claim 1, wherein the planar image integration unit employs feature-based and transform domain-based image registration algorithms for the integration.
5. The virtual helmet of claim 1, further comprising a gyroscope disposed in the shell and configured to obtain an angle formed by a central line L between the first fisheye camera and the second fisheye camera and a horizontal plane;
and the 3D image angle conversion module is used for converting the first 3D image and the second 3D image according to the angle acquired by the gyroscope and enabling the central connecting line of the first 3D image and the second 3D image to be parallel to the horizontal plane.
6. The virtual helmet of claim 5, wherein the 3D image angle conversion module comprises:
the X-direction angle conversion module is used for converting the first 3D image and the second 3D image in the X direction;
a Y-direction angle conversion module for converting the first 3D image and the second 3D image in a Y direction;
and the Z-direction angle conversion module is used for converting the first 3D image and the second 3D image in the Z direction.
7. The virtual helmet of claim 1, further comprising a communication module for signal connection to an electronic device.
8. The virtual helmet of claim 1, further comprising a battery configured to power the first and second fisheye cameras, the 3D correction unit, the first comparison unit, the first processing unit, the second comparison unit, the plane image integration unit, the second processing unit, and the display unit.
CN201810714466.6A 2018-06-29 2018-06-29 Virtual helmet Active CN109087237B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810714466.6A CN109087237B (en) 2018-06-29 2018-06-29 Virtual helmet


Publications (2)

Publication Number Publication Date
CN109087237A CN109087237A (en) 2018-12-25
CN109087237B (en) 2023-02-03

Family

ID=64837108

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810714466.6A Active CN109087237B (en) 2018-06-29 2018-06-29 Virtual helmet

Country Status (1)

Country Link
CN (1) CN109087237B (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012216077A (en) * 2011-03-31 2012-11-08 Fujifilm Corp Augmented reality providing device and superimposition method of virtual object
DE102011115739A1 (en) * 2011-10-11 2013-04-11 Daimler Ag Method for integrating virtual objects in vehicle displays
CN105657370A (en) * 2016-01-08 2016-06-08 李昂 Closed wearable panoramic photographing and processing system and operation method thereof
CN105955456B (en) * 2016-04-15 2018-09-04 深圳超多维科技有限公司 The method, apparatus and intelligent wearable device that virtual reality is merged with augmented reality
KR101696243B1 (en) * 2016-07-22 2017-01-13 김계현 Virtual reality system and method for realizing virtual reality therof



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230106

Address after: 570100 No. 1001, Haitao Avenue, Guilin Yang University District, Meilan District, Haikou, Hainan Province

Applicant after: HAIKOU College OF ECONOMICS

Address before: 571100 Yefeng Shuiyun, No. 39, Yebo Road, Qiongshan District, Haikou City, Hainan Province

Applicant before: Deng Wenjie

GR01 Patent grant