CN115253275A - Intelligent terminal, handheld console, virtual system, and spatial positioning method for an intelligent terminal - Google Patents

Intelligent terminal, handheld console, virtual system, and spatial positioning method for an intelligent terminal

Info

Publication number
CN115253275A
Authority
CN
China
Prior art keywords
intelligent terminal
handle
inertial sensor
IMU data
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210907490.8A
Other languages
Chinese (zh)
Inventor
翁志彬
周克
陈磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pimax Technology Shanghai Co ltd
Original Assignee
Pimax Technology Shanghai Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pimax Technology Shanghai Co ltd filed Critical Pimax Technology Shanghai Co ltd
Priority to CN202210907490.8A
Publication of CN115253275A
Priority to JP2023019232A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/211 Input arrangements for video game devices characterised by their sensors, purposes or types, using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F 13/216 Input arrangements for video game devices characterised by their sensors, purposes or types, using geographical information, e.g. location of the game device or player using GPS
    • A63F 13/24 Constructional details of video game input arrangements, e.g. game controllers with detachable joystick handles
    • G PHYSICS
    • G01C 21/16 Navigation by using measurements of speed or acceleration executed aboard the object being navigated, i.e. inertial navigation
    • G02B 27/017 Head-up displays; head mounted
    • G06F 3/011 Input arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Abstract

The invention provides an intelligent terminal, a handheld console, a virtual system, and a spatial positioning method for the intelligent terminal, addressing two technical problems of the prior art: a single mobile terminal cannot provide the experience of both a virtual product and a handheld console, and accurate positioning cannot be achieved. The intelligent terminal provided by the invention can be detachably connected with a virtual headset device. When the terminal is detached, the main control chip in the intelligent terminal switches the terminal from the virtual headset function to the handheld console function; after the intelligent terminal is mounted on the virtual headset device, the main control chip switches it from the handheld console function to the virtual headset function. A user can therefore experience both products, a virtual system and a handheld console, through one intelligent terminal. The main control chip can also determine the spatial positioning information of the intelligent terminal from the environment information captured by the first camera device and the IMU data of the intelligent terminal detected by the first inertial sensor, achieving accurate positioning of the intelligent terminal.

Description

Intelligent terminal, handheld console, virtual system, and spatial positioning method for an intelligent terminal
Technical Field
The invention relates to the technical field of mobile devices, and in particular to an intelligent terminal, a handheld console, a virtual system, and a spatial positioning method for the intelligent terminal.
Background
At present, a large number of novel virtual reality (VR) head-mounted display devices composed of a VR box plus a mobile terminal are appearing on the market. The performance of such a device is determined mainly by the mobile terminal and falls far short of a traditional PC-tethered VR headset or a VR all-in-one machine. In a PC-tethered VR headset or a VR all-in-one machine, however, the display and the headset are generally inseparable and can only be used together, which limits convenience.
Meanwhile, a handheld gaming device such as a handheld console comprises a host and a handle; the user plays by entering commands through the handle, but such a console provides no immersive virtual experience.
Therefore, a need exists for a novel combined intelligent terminal device that both offers the detachability of a VR box and provides the functions of a handheld console, so that a user can experience two different products after purchasing only one.
In addition, in the prior art, when a mobile terminal changes its working mode, for example switches from the handheld console function to the virtual headset function, the mobile terminal and its control handle cannot be accurately repositioned.
Disclosure of Invention
In view of this, the present invention provides an intelligent terminal, a handheld console, a virtual system, and a spatial positioning method for the intelligent terminal, solving two technical problems of the prior art: a single mobile terminal cannot provide the functions of both a virtual product and a handheld console, and the mobile terminal cannot be accurately positioned in space when it changes its working mode.
According to one aspect of the present invention, there is provided an intelligent terminal, including: an intelligent terminal body, detachably mounted on a virtual headset device or on a handheld device; a display screen, arranged on a first side face of the intelligent terminal body; a function switching module, arranged in the intelligent terminal body and used to switch the intelligent terminal between the virtual headset function and the handheld console function; a main control chip, arranged in the intelligent terminal body and communicatively connected with the function switching module; a first camera device, arranged on a second side face of the intelligent terminal body, the first side face and the second side face being opposite faces; and a first inertial sensor, arranged on the intelligent terminal body and used to detect IMU data of the intelligent terminal body; wherein the first camera device and the first inertial sensor are each communicatively connected with the main control chip.
In an embodiment of the present invention, the intelligent terminal further includes a brightness controller, arranged in the intelligent terminal body and communicatively connected with the main control chip and the display screen. When the intelligent terminal is switched from the handheld console function to the virtual headset function, the brightness controller reduces the display brightness of the display screen; when the intelligent terminal is switched from the virtual headset function to the handheld console function, the brightness controller increases the display brightness of the display screen.
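For illustration only, a minimal sketch of how such a brightness controller might react to a function switch; the class, method names, and brightness levels are assumptions, not taken from the patent:

    # Hypothetical sketch: dim the display in headset mode, restore it in handheld mode.
    HEADSET_BRIGHTNESS = 0.35   # dimmer panel -> shorter pixel lit time per frame, less smearing
    HANDHELD_BRIGHTNESS = 0.80  # brighter panel for direct viewing at arm's length

    class BrightnessController:
        def __init__(self, display):
            self.display = display  # any object exposing set_brightness(level)

        def on_function_switch(self, new_function: str) -> None:
            if new_function == "headset":
                self.display.set_brightness(HEADSET_BRIGHTNESS)
            elif new_function == "handheld":
                self.display.set_brightness(HANDHELD_BRIGHTNESS)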
In an embodiment of the present invention, the intelligent terminal further includes an image processor, arranged in the intelligent terminal body and communicatively connected with the main control chip, wherein the image processor is configured to perform asynchronous spacewarp, asynchronous timewarp, and image rendering on the image information displayed on the display screen.
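As a rough illustration of the timewarp idea, a pure-rotation re-projection applied just before display; the NumPy/OpenCV formulation below is a common textbook form and an assumption for illustration, not a disclosure of the patent's image processor, which would run this on the GPU:

    import cv2
    import numpy as np

    def timewarp(frame, K, R_render, R_latest):
        """Re-project `frame` from the render-time head rotation to the latest one."""
        R_delta = R_latest @ R_render.T        # rotation accumulated since rendering
        H = K @ R_delta @ np.linalg.inv(K)     # homography induced by a pure rotation
        return cv2.warpPerspective(frame, H, (frame.shape[1], frame.shape[0]))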
In an embodiment of the present invention, the intelligent terminal further includes a second camera device, arranged on the first side face of the intelligent terminal body and communicatively connected with the main control chip.
In an embodiment of the present invention, the number of first camera devices is four; the four first camera devices are divided into two camera device groups, and the two first camera devices in each group are symmetric about the center of the intelligent terminal body.
As a second aspect of the present invention, there is also provided a handheld console, including: the intelligent terminal described above; and a handheld console handle connected with the intelligent terminal. The handheld console handle is provided with a first infrared sensor and a second inertial sensor, the second inertial sensor being used to detect IMU data of the handheld console handle; the second inertial sensor is communicatively connected with the main control chip in the intelligent terminal.
In an embodiment of the present invention, the main control chip includes: a first control unit, communicatively connected with the function switching module, the first camera device, and the first inertial sensor, and used to control the first camera device to capture a first image of the surroundings of the intelligent terminal and to control the first inertial sensor to detect IMU data of the intelligent terminal; and a first computing unit, communicatively connected with the first control unit, the first camera device, the first infrared sensor, the first inertial sensor, and the second inertial sensor, and used to acquire the first image and the IMU data of the intelligent terminal body, compute on them, and generate the spatial positioning information of the intelligent terminal.
In an embodiment of the present invention, the main control chip further includes: a second control unit, communicatively connected with the function switching module, the first camera device, and the second inertial sensor, and used, when the function switching module switches the intelligent terminal to the handheld console function, to control the first camera device to photograph the first infrared sensor on the handheld console handle and to control the second inertial sensor on the handle to detect the handle's IMU data; and a second computing unit, communicatively connected with the first computing unit, the second control unit, the first camera device, and the second inertial sensor, and used to acquire the first light-spot image of the first infrared sensor transmitted by the first camera device and the IMU data of the handle transmitted by the second inertial sensor, compute on the first light-spot image, the IMU data of the handle, and the spatial positioning information of the intelligent terminal, and generate the spatial positioning information of the handheld console handle.
In an embodiment of the present invention, the handheld console further includes: a connector, communicatively connected with the handheld console handle and the intelligent terminal; multifunctional keys arranged on the handheld console handle; and a function processor, connected with the multifunctional keys and with the main control chip of the intelligent terminal, the function processor being used to receive an operation instruction entered by the user through the multifunctional keys and send it to the main control chip.
As a third aspect of the present invention, there is also provided a virtual system, including: the intelligent terminal described above; a virtual headset device body, on which the intelligent terminal is detachably mounted; and a virtual operation handle, wherein the virtual operation handle includes: a control handle; a second infrared sensor arranged on the control handle; and a third inertial sensor arranged on the control handle and used to measure IMU data of the control handle; the control handle and the third inertial sensor are each communicatively connected with the main control chip.
In an embodiment of the present invention, the virtual operation handle further includes a handle housing, the handle housing comprising a ring portion and a grip portion, the center of the grip portion being provided with a recess that receives and fixes the control handle.
In an embodiment of the present invention, the main control chip includes: a third control unit, communicatively connected with the function switching module, the first camera device, and the first inertial sensor, and used to control the first camera device to capture a first image of the surroundings of the intelligent terminal and to control the first inertial sensor to detect IMU data of the intelligent terminal; and a third computing unit, communicatively connected with the third control unit, the first camera device, the first infrared sensor, and the first inertial sensor, and used to acquire the first image and the IMU data of the intelligent terminal body, compute on them, and generate the spatial positioning information of the intelligent terminal.
In an embodiment of the present invention, the main control chip further includes: a fourth control unit, communicatively connected with the function switching module, the first camera device, the second infrared sensor, and the third inertial sensor, and configured, when the function switching module switches the intelligent terminal to the virtual headset function, to control the first camera device to photograph the second infrared sensor on the virtual operation handle and to control the third inertial sensor on the virtual operation handle to detect the handle's IMU data; and a fourth computing unit, communicatively connected with the third computing unit, the first camera device, and the third inertial sensor, and configured to acquire the second light-spot image of the second infrared sensor transmitted by the first camera device and the IMU data of the virtual operation handle transmitted by the third inertial sensor, compute on the spatial positioning information of the intelligent terminal, the second light-spot image, and the IMU data of the handle, and generate the spatial positioning information of the virtual operation handle.
As a fourth aspect of the present invention, there is further provided a spatial positioning method for positioning the intelligent terminal, the method including: the main control chip controls the first camera device on the intelligent terminal to capture a first image of the surroundings of the intelligent terminal, and controls the first inertial sensor to detect IMU data of the intelligent terminal; the main control chip acquires the first image captured by the first camera device; the main control chip acquires the IMU data of the intelligent terminal body detected by the first inertial sensor; and the main control chip computes on the IMU data of the intelligent terminal body and the first image to generate the spatial positioning information of the intelligent terminal.
In an embodiment of the present invention, when the intelligent terminal is communicatively connected with the handheld console handle, the spatial positioning method further includes: the function switching module switches the intelligent terminal to the handheld console function; the main control chip controls the first camera device to photograph the first infrared sensor on the handheld console handle, and controls the second inertial sensor on the handle to detect its IMU data; the main control chip acquires the first light-spot image of the first infrared sensor transmitted by the first camera device and the IMU data of the handle transmitted by the second inertial sensor, computes on the spatial positioning information of the intelligent terminal, the first light-spot image, and the IMU data of the handle, and generates the spatial positioning information of the handheld console handle.
In an embodiment of the present invention, when the intelligent terminal is mounted on the virtual headset device and communicatively connected with the virtual operation handle, the spatial positioning method further includes:
the function switching module switches the intelligent terminal to the virtual headset function; the main control chip controls the first camera device to photograph the second infrared sensor on the virtual operation handle, and controls the third inertial sensor on the virtual operation handle to detect its IMU data; the main control chip acquires the second light-spot image of the second infrared sensor transmitted by the first camera device and the IMU data of the virtual operation handle transmitted by the third inertial sensor, computes on the spatial positioning information of the intelligent terminal, the second light-spot image, and the IMU data of the handle, and generates the spatial positioning information of the virtual operation handle.
The intelligent terminal provided by the invention can be detachably connected with the virtual headset device, so the virtual system is a separable one: the intelligent terminal and the virtual headset device can be set apart. When the intelligent terminal is detached from the virtual headset device, it can establish a communication connection with the handheld console handle, and the main control chip in the intelligent terminal switches the terminal from the virtual headset function to the handheld console function, so that the intelligent terminal and the handle together form a handheld console. After the intelligent terminal is mounted on the virtual headset device, the main control chip switches the terminal from the handheld console function to the virtual headset function, so that the intelligent terminal, the virtual operation handle, and the virtual headset device together form a virtual system. The intelligent terminal thus carries two functions, and a user can experience both products, a virtual system and a handheld console, through one intelligent terminal. In addition, the intelligent terminal is provided with the first inertial sensor and the first camera device, and the main control chip can determine the spatial positioning information of the intelligent terminal, i.e. its 6DOF data, from the environment information captured by the first camera device and the IMU data detected by the first inertial sensor, achieving accurate positioning of the intelligent terminal.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent by describing in more detail embodiments of the present invention with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings, like reference numbers generally indicate like parts or steps.
Fig. 1 is a front view of an intelligent terminal according to an embodiment of the present invention;
fig. 2 is a rear view of an intelligent terminal according to an embodiment of the present invention;
fig. 3 is a schematic diagram illustrating an operation of an intelligent terminal according to an embodiment of the present invention;
fig. 4 is a schematic diagram illustrating an operation of an intelligent terminal according to another embodiment of the present invention;
fig. 5 is a schematic diagram illustrating an operation of an intelligent terminal according to another embodiment of the present invention;
fig. 6 is a schematic diagram illustrating an operation of a palm machine according to an embodiment of the present invention;
fig. 7 is a schematic diagram illustrating an operation of a palm machine according to another embodiment of the present invention;
fig. 8 is a schematic flowchart illustrating a spatial positioning method of the smart terminal in the palm machine shown in fig. 7;
fig. 9 is a schematic diagram illustrating an operation of a palm machine according to another embodiment of the present invention;
fig. 10 is a schematic flowchart illustrating a spatial positioning method of the smart terminal in the palm machine shown in fig. 9;
fig. 11 is a schematic diagram illustrating an operation of a virtual system according to an embodiment of the present invention;
FIG. 12 is a schematic diagram illustrating the operation of a virtual system according to another embodiment of the present invention;
fig. 13 is a schematic flowchart illustrating a method for spatially positioning an intelligent terminal in the virtual system shown in fig. 12;
FIG. 14 is a schematic diagram illustrating operation of a virtual machine system according to another embodiment of the present invention;
fig. 15 is a schematic flowchart illustrating a method for spatially positioning an intelligent terminal in the virtual system shown in fig. 14;
fig. 16 is a schematic structural diagram of a virtual operating handle according to another embodiment of the present invention;
fig. 17 is a schematic diagram illustrating an operation of an electronic device according to an embodiment of the present invention.
Reference numerals:
1 - intelligent terminal; 100 - intelligent terminal body; 101 - first side face; 102 - second side face; 200 - display screen; 201 - brightness controller; 300 - third camera device; 400 - first camera device; 500 - second camera device; 600 - main control chip; 601 - function switching module; 602 - first control unit; 603 - first computing unit; 604 - second control unit; 605 - second computing unit; 607 - third control unit; 608 - third computing unit; 609 - fourth control unit; 6091 - fourth computing unit; 700 - first inertial sensor; 800 - image processor;
2 - handheld console handle; 22 - first infrared sensor; 23 - second inertial sensor; 24 - connector; 25 - multifunctional keys; 26 - function processor; 31 - virtual headset device; 32 - virtual operation handle; 33 - control handle; 34 - second infrared sensor; 35 - third inertial sensor; 36 - handle housing; 361 - ring portion; 362 - grip portion; 37 - third infrared sensor; 38 - fourth inertial sensor; 900 - electronic device; 901 - processor; 902 - memory; 903 - input device; 904 - output device.
Detailed Description
In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise. All directional indicators in the embodiments of the present invention (such as upper, lower, left, right, front, rear, top, bottom...) are used only to explain the relative positional relationship, movement, etc. of the components in a specific posture (as shown in the drawings); if that posture changes, the directional indicator changes accordingly. Furthermore, the terms "include" and "have", and any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements, but may also include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Furthermore, reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the invention. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein may be combined with other embodiments.
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a front view of the intelligent terminal 1 provided by the present invention, Fig. 2 is a rear view of the intelligent terminal 1, and Fig. 3 is a schematic diagram of its operation. As shown in Figs. 1, 2 and 3, the intelligent terminal 1 includes:
the intelligent terminal body 100, which is detachably mounted on the virtual headset device 31 or on a handheld device, such as a handheld console;
the display screen 200, arranged on the first side face 101 of the intelligent terminal body 100; the display screen 200 has the normal display function and can display video information, image information, and the like;
the function switching module 601, arranged in the intelligent terminal body 100; the function switching module 601 can switch the intelligent terminal 1 between the virtual headset function and the handheld console function. For example, when the intelligent terminal 1 is mounted on the virtual headset device, the two together form a virtual system, such as a virtual reality system (hereinafter VR system) or an augmented reality system (hereinafter AR system); the function switching module 601 then receives the information that the intelligent terminal 1 has been mounted on the virtual headset device and, according to that information, starts the virtual headset function, so that the intelligent terminal 1 performs that function (a code sketch of this switching logic follows the component list below);
the main control chip 600 is arranged in the intelligent terminal body 100, and the main control chip 600 is in communication connection with the function switching module 601;
the first camera device 400 is arranged on the second side surface 102 of the intelligent terminal body 100, and the first side surface 101 and the second side surface 102 are opposite;
the first camera 400 may be used to photograph the infrared lamp. When the intelligent terminal 1 is installed on the virtual head display device 31, the intelligent terminal 1, the virtual operating handle 32 and the virtual head display device 31 form a virtual system, at this time, the intelligent terminal 1 needs to establish communication connection with the virtual operating handle 32, an infrared lamp is installed on the virtual operating handle 32, at this time, the main control chip 600 of the intelligent terminal 1 can control to start the first camera device 400, so that the first camera device 400 shoots the infrared lamp on the virtual operating handle 32, and transmits a light spot image to the main control chip 600 after shooting the light spot image of the infrared lamp on the virtual operating handle 32;
the first camera 400 can also be used for shooting the surrounding environment where the intelligent terminal 1 is located, and no matter the intelligent terminal 1 performs a virtual head display function or a palm function, the intelligent terminal 1 needs to be positioned, for example, when the intelligent terminal 1 and the palm handle 2 are combined to form a palm, the intelligent terminal 1 and the palm handle 2 need to be positioned; when the intelligent terminal 1 is installed on the virtual head display device 31 and forms a virtual system with the virtual operating handle 32, the intelligent terminal 1 and the virtual operating handle 32 also need to be positioned; the first camera 400 may capture the surrounding environment where the intelligent terminal 1 is located, form image information or video information capable of reflecting the surrounding environment, and send the image information and the video information to the main control chip 600, where the main control chip 600 may obtain the location information of the intelligent terminal 1 according to the image information or the video information;
the first inertial sensor 700 is disposed on the intelligent terminal body 100, the first inertial sensor 700 is configured to detect IMU data of the intelligent terminal body 100, the IMU data of the intelligent terminal 1 detected by the first inertial sensor 700 is transmitted to the main control chip 600 through the IMU data, and the main control chip 600 may determine spatial location information of the intelligent terminal 1 according to the IMU data and image information or video information of an environment where the intelligent terminal 1 is located, which is captured by the first camera 400, that is, 6DOF data of the intelligent terminal 1, that is, degrees of freedom of 6 angles may be obtained based on translational degrees of freedom and rotational degrees of freedom.
As for the IMU data mentioned above: an inertial measurement unit (IMU) is a device that measures the three-axis attitude angles (or angular velocities) and the acceleration of an object. An IMU generally contains three single-axis accelerometers and three single-axis gyroscopes; the accelerometers detect the acceleration signals of the object along three independent axes of the carrier coordinate system, while the gyroscopes detect the angular velocity of the carrier relative to the navigation coordinate system. An inertial sensor can therefore measure the angular velocity and acceleration of an object in three-dimensional space and solve for the object's attitude from those signals, for example its three rotational degrees of freedom. IMU data is the result of such detection, i.e. the angular velocity and acceleration of the object in three-dimensional space. Thus the first inertial sensor 700 can detect the IMU data of the intelligent terminal body 100, from which the attitude of the intelligent terminal body 100, for example its rotational degrees of freedom, can be calculated.
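For concreteness, a small sketch of what an IMU sample looks like and how a gyroscope reading is integrated into an orientation estimate; the data container and the simple first-order integration are illustrative assumptions, not the patent's algorithm:

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class ImuSample:
        gyro: np.ndarray   # angular velocity (rad/s), 3 axes, body frame
        accel: np.ndarray  # linear acceleration (m/s^2), 3 axes, body frame
        t: float           # timestamp (s)

    def integrate_gyro(R: np.ndarray, sample: ImuSample, dt: float) -> np.ndarray:
        """Propagate rotation matrix R by one gyro sample (small-angle approximation)."""
        wx, wy, wz = sample.gyro * dt
        skew = np.array([[0, -wz, wy],
                         [wz, 0, -wx],
                         [-wy, wx, 0]])
        return R @ (np.eye(3) + skew)   # first-order exponential map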
The intelligent terminal 1 provided by the invention can be detachably connected with the virtual headset device 31; that is, the virtual system is separable, and the intelligent terminal 1 and the virtual headset device 31 can be set apart. After the intelligent terminal 1 is detached from the virtual headset device 31, it can establish a communication connection with the handheld console handle 2, and the main control chip 600 switches the terminal from the virtual headset function to the handheld console function, so that the intelligent terminal 1 and the handle 2 form a handheld console. After the intelligent terminal 1 is mounted on the virtual headset device 31, the main control chip 600 switches the terminal from the handheld console function to the virtual headset function, so that the intelligent terminal 1, the virtual operation handle 32, and the virtual headset device 31 form a virtual system. The intelligent terminal 1 thus carries two functions, and a user can experience both products, a virtual system and a handheld console, through a single terminal. In addition, the intelligent terminal 1 is provided with the first inertial sensor 700 and the first camera device 400, and the main control chip 600 can determine the spatial positioning information of the intelligent terminal 1, i.e. its 6DOF data, from the environment information captured by the first camera device 400 and the IMU data detected by the first inertial sensor 700, achieving accurate positioning of the intelligent terminal 1.
Optionally, as shown in Fig. 1, the intelligent terminal 1 further includes a third camera device 300 arranged on the second side face 102 of the intelligent terminal body 100. The third camera device 300 is a depth camera and may be used to photograph the surroundings of the intelligent terminal 1: besides a planar image, it captures the depth of the surroundings, i.e. the three-dimensional position and size of objects, and the distance information yields a richer positional relationship between objects. Therefore, when the main control chip 600 spatially positions the intelligent terminal 1, it may determine the spatial positioning information from the environment information captured by the first camera device 400, the depth information captured by the third camera device 300, and the IMU data of the intelligent terminal 1, which improves the accuracy of the spatial positioning.
Optionally, the number of first camera devices 400 is four; the four are divided into two groups, the two first camera devices 400 in each group being symmetric about the center of the intelligent terminal body 100. For example, when the intelligent terminal body 100 is a cuboid, i.e. the second side face 102 is a quadrilateral, the four first camera devices 400 may be placed at its four corners, maximizing their coverage so that both the light-spot image of the infrared sensor and the surroundings of the intelligent terminal 1 can be captured.
In an embodiment of the present invention, Fig. 4 is a schematic diagram of the operation of an intelligent terminal 1 according to another embodiment. As shown in Fig. 4, the intelligent terminal further includes the brightness controller 201, arranged in the intelligent terminal body and communicatively connected with the main control chip 600 and the display screen 200. When the intelligent terminal 1 is mounted on the virtual headset device, the function switching module 601 switches the terminal to the virtual headset function and the brightness controller 201 reduces the brightness of the display screen 200. With lower brightness, the time the display pixels stay lit while showing one frame is shortened, and the afterglow time shrinks with it; this reduces the loss of sharpness caused by smearing and improves the user experience. When the intelligent terminal 1 is connected to the handheld console handle, the function switching module 601 switches the terminal to the handheld console function and the brightness controller 201 raises the brightness of the display screen 200.
In an embodiment of the present invention, Fig. 5 is a schematic diagram of the operation of an intelligent terminal 1 according to another embodiment. As shown in Fig. 5, the intelligent terminal 1 further includes the image processor 800, arranged in the intelligent terminal body 100 and communicatively connected with the main control chip 600. When the intelligent terminal 1 is mounted on the virtual headset device 31, the function switching module 601 in the main control chip 600 switches the terminal to the virtual headset function, the main control chip 600 informs the image processor 800 of the switch, and the image processor 800 starts working accordingly, performing asynchronous spacewarp, asynchronous timewarp, and rendering on the image information to be displayed on the display screen 200, improving image sharpness.
In an embodiment of the present invention, as shown in Fig. 1, the intelligent terminal 1 further includes a second camera device 500 arranged on the first side face 101 of the intelligent terminal body 100 and communicatively connected with the main control chip 600. The second camera device 500 photographs the user of the intelligent terminal 1, i.e. it enables self-portraits; the main control chip 600 can thus obtain the user's image information and use it for user authentication or to generate the user's virtual avatar.
As a second aspect of the present invention, there is also provided a handheld console. Fig. 6 is a schematic diagram of the operation of a handheld console according to an embodiment of the present invention. As shown in Fig. 6, the handheld console includes: the intelligent terminal 1; and the handheld console handle 2 connected with the intelligent terminal 1. The handle 2 is provided with the first infrared sensor 22 and the second inertial sensor 23, the latter detecting the IMU data of the handle 2. The second inertial sensor 23 is communicatively connected with the main control chip 600 in the intelligent terminal 1, and the first camera device 400 on the intelligent terminal 1 photographs the first infrared sensor 22 and the surroundings of the intelligent terminal 1. After the handle 2 establishes a communication connection with the intelligent terminal 1, the function switching module 601 switches the terminal to the handheld console function, and the intelligent terminal 1 and the handle 2 form a handheld console.
Optionally, as shown in Fig. 6, the handheld console further includes: the connector 24, communicatively connected with the handheld console handle 2 and the intelligent terminal 1, through which the two communicate; the multifunctional keys 25 arranged on the handle 2; and the function processor 26, connected with the multifunctional keys 25 and with the main control chip 600 of the intelligent terminal 1, which receives the operation instructions entered by the user through the multifunctional keys 25 and sends them to the main control chip 600.
In an embodiment of the present invention, after the intelligent terminal 1 and the handheld console handle 2 form a handheld console, both must be spatially positioned while the user operates it. Fig. 7 is a schematic diagram of the operation of a handheld console according to another embodiment of the present invention. As shown in Fig. 7, the main control chip 600 includes:
the first control unit 602, communicatively connected with the function switching module 601, the first camera device 400, and the first inertial sensor 700, which controls the first camera device 400 to capture a first image of the surroundings of the intelligent terminal 1 and controls the first inertial sensor 700 to detect the IMU data of the intelligent terminal 1; and
the first computing unit 603, communicatively connected with the first control unit 602, the first camera device 400, the first infrared sensor 22, the first inertial sensor 700, and the second inertial sensor 23, which acquires the first image and the IMU data of the intelligent terminal body, computes on them, and generates the spatial positioning information of the intelligent terminal.
Specifically, Fig. 8 is a flowchart of the spatial positioning method of the intelligent terminal 1 in the handheld console shown in Fig. 7. As shown in Fig. 8, the spatial positioning method of the intelligent terminal 1 includes the following steps:
step S101: the function switching module 601 receives connection information of communication connection between the intelligent terminal 1 and the palm machine handle 2, and switches the function of the intelligent terminal 1 to the palm machine function according to the connection information;
step S102: when the function switching module 601 successfully switches the intelligent terminal 1 to the palm function, the first control unit 602 controls the first camera 400 located on the intelligent terminal 1 to shoot a first image of the surrounding environment where the intelligent terminal 1 is located; and controls the first inertial sensor 700 to detect the IMU data of the intelligent terminal 1;
the first imaging device 400 images the surrounding environment in which the smart terminal 1 is located under the control of the first control unit 602, forms a first image, and transmits the first image to the first calculation unit 603. The first inertial sensor 700 detects IMU data of the smart terminal 1 under the control of the first control unit 602, and transmits the IMU data of the smart terminal 1 to the first calculation unit 603.
Step S103: after the first computing unit 603 receives the first image transmitted by the first camera 400 and the IMU data of the intelligent terminal 1 transmitted by the first inertial sensor 700, the first computing unit 603 computes the IMU data and the first image of the intelligent terminal body 100, and generates spatial positioning information of the intelligent terminal 1.
The first inertial sensor 700 is configured to detect IMU data of the intelligent terminal body 100, where the IMU data refers to degrees of freedom of 3 rotational angles, the first inertial sensor 700 transmits the IMU data of the intelligent terminal 1 to the first computing unit 603 after detecting the IMU data, and the first computing unit 603 determines spatial positioning information of the intelligent terminal 1 according to the IMU data and image information or video information of an environment where the intelligent terminal 1 is located, which is captured by the first camera 400, that is, 6DOF data (hereinafter, referred to as 6DOF data) of the intelligent terminal 1, that is, degrees of freedom of 6 angles can be obtained based on translational degrees of freedom and rotational degrees of freedom.
Step S101 to step S103 may implement spatial positioning of the intelligent terminal 1, that is, the spatial positioning of the intelligent terminal 1 may be implemented through the first inertial sensor 700 and the first camera 400 disposed on the intelligent terminal 1.
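A deliberately simplified sketch of the fusion in steps S102 to S103; the Pose type, gravity handling, and the blending constant are illustrative assumptions, since the patent does not disclose the fusion algorithm, and a production system would use full visual-inertial odometry:

    import numpy as np

    GRAVITY = np.array([0.0, 0.0, 9.81])  # assumed world-frame gravity (m/s^2)

    class Pose:
        """6DOF state: rotation R, position t, velocity v."""
        def __init__(self):
            self.R = np.eye(3)
            self.t = np.zeros(3)
            self.v = np.zeros(3)

    def propagate_imu(p: Pose, gyro, accel, dt: float) -> Pose:
        # Rotation: first-order gyro integration (cf. the IMU sketch above).
        wx, wy, wz = np.asarray(gyro) * dt
        skew = np.array([[0, -wz, wy], [wz, 0, -wx], [-wy, wx, 0]])
        p.R = p.R @ (np.eye(3) + skew)
        # Position: double-integrate gravity-compensated acceleration (drifts fast on its own).
        a_world = p.R @ np.asarray(accel) - GRAVITY
        p.t = p.t + p.v * dt + 0.5 * a_world * dt * dt
        p.v = p.v + a_world * dt
        return p

    def correct_with_image(p: Pose, camera_t, alpha: float = 0.1) -> Pose:
        # Pull the drifting IMU position toward the camera-derived position;
        # how camera_t is computed from the first image is outside this sketch.
        p.t = (1 - alpha) * p.t + alpha * np.asarray(camera_t)
        return p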
Further, as shown in Fig. 9, the main control chip 600 further includes:
the second control unit 604, communicatively connected with the function switching module 601, the first camera device 400, and the second inertial sensor 23; when the function switching module 601 switches the intelligent terminal to the handheld console function, the second control unit 604 controls the first camera device 400 to photograph the first infrared sensor 22 on the handheld console handle and controls the second inertial sensor 23 on the handle to detect its IMU data; and
the second computing unit 605, communicatively connected with the first computing unit 603, the second control unit 604, the first camera device 400, and the second inertial sensor 23; the second computing unit 605 acquires the spatial positioning information of the intelligent terminal 1 transmitted by the first computing unit 603, the first light-spot image of the first infrared sensor 22 transmitted by the first camera device 400, and the IMU data of the handle transmitted by the second inertial sensor 23, computes on them, and generates the spatial positioning information of the handheld console handle.
Specifically, Fig. 10 is a flowchart of the spatial positioning method of the intelligent terminal 1 in the handheld console shown in Fig. 9. As shown in Fig. 10, the spatial positioning method of the intelligent terminal 1 further includes the following steps:
step S104: the second control unit 604 controls the first camera 400 to take a picture of the first infrared sensor 22 located on the palm handpiece 2 and controls the second inertial sensor 23 located on the palm handpiece 2 to detect IMU data of the palm handpiece 2;
the first camera 400 photographs the first infrared sensor 22 on the palm grip 2 under the control of the second control unit 604 to form a first light spot image of the first infrared sensor 22. The second inertial sensor 23, under the control of the second control unit 604, detects IMU data of the palm grip 2 and transmits IMU data of the palm grip 2 to the first spatial localization unit 606.
Step S105: the second calculation unit 605 acquires the spatial positioning information of the smart terminal 1 transmitted by the first calculation unit 603, the first spot image of the first infrared sensor 22 transmitted by the first camera 400, and the IMU data of the palm grip 2 transmitted by the second inertial sensor 23, and calculates the spatial positioning information of the smart terminal 1, the first spot image, and the IMU data of the palm grip 2 to generate the spatial positioning information of the palm grip 2. Namely 6DOF data (hereinafter referred to as 6DOF data) of the palm-work grip 2, that is, degrees of freedom of 6 angles can be obtained based on the translational degree of freedom and the rotational degree of freedom.
The spatial positioning of the palm rest 2 can be realized through the steps S104 to S105. Namely, the first camera 400 on the intelligent terminal 1 is used for shooting the first light spot image of the first infrared sensor 22 on the palm handpiece 2, and the IMU data of the palm handpiece 2 detected by the second inertial sensor 23 on the palm handpiece 2, so that the spatial positioning information of the palm handpiece 2 can be determined.
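A hedged sketch of one common way such light-spot positioning can be computed; the LED model points, the intrinsics K, and the use of OpenCV's solvePnP are illustrative assumptions, not the patent's disclosed method, and the handle's IMU data would additionally be fused in to smooth the estimate between camera frames:

    import cv2
    import numpy as np

    def locate_handle(spots_2d, led_model_3d, K, terminal_R, terminal_t):
        """Handle pose in the world frame, seen through the terminal's camera.

        spots_2d     Nx2 float array: detected infrared light spots in the image
        led_model_3d Nx3 float array: known LED positions on the handle body
        K            3x3 intrinsics of the first camera device
        terminal_R/t the terminal's own spatial positioning (its 6DOF pose)
        """
        ok, rvec, tvec = cv2.solvePnP(led_model_3d, spots_2d, K, None)
        if not ok:
            return None
        R_cam, _ = cv2.Rodrigues(rvec)          # handle pose in the camera frame
        R_world = terminal_R @ R_cam            # compose with the terminal's pose
        t_world = terminal_R @ tvec.ravel() + terminal_t
        return R_world, t_world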
Exemplary Virtual System
As a third aspect of the present invention, there is also provided a virtual system. Fig. 11 is a schematic diagram of the operation of a virtual system according to an embodiment of the present invention. As shown in Fig. 11, the virtual system includes: the intelligent terminal 1, detachably mounted on the body of the virtual headset device 31; and the virtual operation handle 32, which includes: the control handle 33; the second infrared sensor 34 arranged on the control handle 33; and the third inertial sensor 35 arranged on the control handle 33 and used to measure the IMU data of the control handle 33. The control handle 33 and the third inertial sensor 35 are each communicatively connected with the main control chip 600.
After the intelligent terminal 1 is mounted on the virtual headset device 31, the function switching module 601 switches the terminal to the virtual headset function; the intelligent terminal 1, the virtual headset device 31, and the virtual operation handle 32 then form a virtual system, for example a VR system made of the intelligent terminal 1, a VR helmet, and a VR handle.
After the intelligent terminal 1 is mounted on the virtual headset device 31, both the intelligent terminal 1 and the virtual operation handle 32 must be spatially positioned while the user operates the virtual system. Fig. 12 is a schematic diagram of the operation of a virtual system according to another embodiment of the present invention. As shown in Fig. 12, the main control chip 600 includes:
the third control unit 607, the third control unit 607 is respectively connected to the function switching module 601, the first camera 400 and the first inertial sensor 700 in a communication manner, the third control unit 607 is configured to control the first camera 400 to shoot a first image of a surrounding environment where the intelligent terminal is located, and control the first inertial sensor 700 to detect IMU data of the intelligent terminal;
the third calculating unit 408, the third calculating unit 408 is respectively in communication connection with the third control unit 407, the first camera 400, the first infrared sensor 22 and the first inertial sensor 700, and the third calculating unit 408 is configured to acquire the first image and the IMU data of the intelligent terminal body, calculate the IMU data of the intelligent terminal body and the first image, and generate spatial positioning information of the intelligent terminal.
Specifically, Fig. 13 is a flowchart of the spatial positioning method of the intelligent terminal in the virtual system shown in Fig. 12. As shown in Fig. 13, the spatial positioning method of the intelligent terminal 1 includes the following steps:
step S201: after receiving the installation information of the intelligent terminal 1 installed on the virtual head display device 31, the function switching module 601 switches the function of the intelligent terminal 1 to the virtual head display function according to the installation information;
step S202: when the function switching module 601 successfully switches the intelligent terminal 1 to the virtual head display function, the third control unit 607 controls the first camera 400 located on the intelligent terminal 1 to shoot a first image of the surrounding environment where the intelligent terminal 1 is located; and controls the first inertial sensor 700 to detect the IMU data of the intelligent terminal 1;
the first camera device takes a picture of the surrounding environment of the intelligent terminal 1 under the control of the third control unit 607, forms a first image, and transmits the first image to the third calculation unit 608. The first inertial sensor 700 detects IMU data of the smart terminal 1 under the control of the third control unit 607 and transmits the IMU data of the smart terminal 1 to the third calculation unit 608.
Step S203: after receiving the first image transmitted by the first camera device 400 and the IMU data of the intelligent terminal body 100 transmitted by the first inertial sensor 700, the third calculation unit 608 calculates the IMU data of the intelligent terminal body 100 together with the first image, and generates the spatial positioning information of the intelligent terminal 1.
The first inertial sensor 700 is configured to detect the IMU data of the intelligent terminal body 100, where the IMU data represent 3 rotational degrees of freedom. These IMU data are transmitted to the third calculation unit 608, which determines the spatial positioning information of the intelligent terminal 1 from the IMU data together with the image or video information of the environment shot by the first camera device 400. Combining the 3 translational degrees of freedom obtained from the images with the 3 rotational degrees of freedom yields 6 degrees of freedom, namely the 6DOF data of the intelligent terminal 1 (hereinafter referred to as 6DOF data).
Steps S201 to S203 realize the spatial positioning of the intelligent terminal 1; that is, the spatial positioning of the intelligent terminal 1 is achieved through the first inertial sensor 700 and the first camera device 400 arranged on the intelligent terminal 1.
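As a rough illustration of steps S201 to S203, the following Python sketch fuses the 3 rotational degrees of freedom reported by the first inertial sensor 700 with a translation estimated from the first image. Here `estimate_translation` is a hypothetical stand-in for whatever visual algorithm runs on the main control chip 600, which the disclosure does not specify:

```python
import numpy as np

def locate_terminal(first_image, imu_rpy, prev_position, estimate_translation):
    """Steps S201-S203 in miniature: camera image + IMU data -> 6DOF pose.

    first_image          -- frame shot by the first camera device 400 (step S202)
    imu_rpy              -- (roll, pitch, yaw) from the first inertial sensor 700
    prev_position        -- last known (x, y, z) of the intelligent terminal
    estimate_translation -- hypothetical vision routine returning the displacement
                            of the terminal since the previous frame
    """
    displacement = np.asarray(estimate_translation(first_image))  # 3 translational DOF
    position = np.asarray(prev_position) + displacement           # integrate motion
    rotation = np.asarray(imu_rpy)                                # 3 rotational DOF
    # Step S203: the 6DOF spatial positioning information of the terminal.
    return np.concatenate([position, rotation])  # [x, y, z, roll, pitch, yaw]
```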
Further, fig. 14 is a schematic diagram of a virtual system according to another embodiment of the present invention. As shown in fig. 14, the main control chip 600 further includes:
a fourth control unit 609, communicatively connected to the function switching module 601, the first camera device 400, the second infrared sensor 34 and the third inertial sensor 35, respectively; the fourth control unit 609 is configured to: when the function switching module 601 switches the intelligent terminal to the virtual head display function, control the first camera device 400 to shoot the second infrared sensor 34 on the virtual operating handle 32, and control the third inertial sensor 35 on the virtual operating handle 32 to detect the IMU data of the virtual operating handle 32;
a fourth calculation unit 6091, communicatively connected to the third calculation unit 608, the first camera device 400 and the third inertial sensor 35, respectively; the fourth calculation unit 6091 is configured to: acquire the spatial positioning information of the intelligent terminal 1 calculated by the third calculation unit 608, the second light spot image of the second infrared sensor 34 transmitted by the first camera device 400, and the IMU data of the virtual operating handle 32 transmitted by the third inertial sensor 35, and calculate the spatial positioning information of the intelligent terminal 1, the second light spot image and the IMU data of the virtual operating handle 32 to generate the spatial positioning information of the virtual operating handle 32.
Specifically, fig. 15 is a schematic flow chart of the spatial positioning method of the intelligent terminal in the virtual system shown in fig. 14. As shown in fig. 15, the spatial positioning method of the intelligent terminal 1 further comprises the following steps:
step S204: the fourth control unit 609 controls the first camera device 400 to shoot the second infrared sensor 34 on the virtual operating handle 32, and controls the third inertial sensor 35 on the virtual operating handle 32 to detect the IMU data of the virtual operating handle 32;
step S205: the fourth calculation unit 6091 acquires the spatial positioning information of the intelligent terminal 1 calculated by the third calculation unit 608, the second light spot image of the second infrared sensor 34 transmitted by the first camera device 400, and the IMU data of the virtual operating handle 32 transmitted by the third inertial sensor 35; it then calculates the spatial positioning information of the intelligent terminal 1, the second light spot image and the IMU data of the virtual operating handle 32, and generates the spatial positioning information of the virtual operating handle 32. In this way the 6DOF data of the virtual operating handle 32 are obtained from its translational and rotational degrees of freedom.
Steps S204 to S205 realize the spatial positioning of the virtual operating handle 32: the first camera device 400 on the intelligent terminal 1 shoots the second light spot image of the second infrared sensor 34 on the virtual operating handle 32, the third inertial sensor 35 on the virtual operating handle 32 detects the IMU data of the virtual operating handle 32, and from these, together with the spatial positioning information of the intelligent terminal 1, the spatial positioning information of the virtual operating handle 32 is determined.
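The handle branch (steps S204 to S205) can be pictured in the same spirit. The sketch below uses OpenCV only for generic bright-spot detection in the second light spot image; `spots_to_relative_position` is a hypothetical solver (for instance a PnP-style fit against a known infrared marker layout), not the method claimed here:

```python
import cv2
import numpy as np

def locate_handle(spot_image, handle_imu_rpy, terminal_pose_6dof,
                  spots_to_relative_position):
    """Steps S204-S205 in miniature: spot image + handle IMU -> handle 6DOF pose.

    spot_image                 -- grayscale shot of the second infrared sensor 34
                                  taken by the first camera device 400
    handle_imu_rpy             -- (roll, pitch, yaw) from the third inertial sensor 35
    terminal_pose_6dof         -- [x, y, z, roll, pitch, yaw] of the terminal from
                                  the third calculation unit 608
    spots_to_relative_position -- hypothetical solver mapping spot centroids to the
                                  handle position in the terminal's frame
    """
    # Isolate the bright infrared spots and extract their centroids
    # (OpenCV 4.x return convention for findContours).
    _, binary = cv2.threshold(spot_image, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = [(m["m10"] / m["m00"], m["m01"] / m["m00"])
                 for m in map(cv2.moments, contours) if m["m00"] > 0]

    # Handle position relative to the camera, shifted into the same frame as
    # the terminal's own spatial positioning information.
    relative = np.asarray(spots_to_relative_position(centroids))
    position = np.asarray(terminal_pose_6dof[:3]) + relative
    return np.concatenate([position, np.asarray(handle_imu_rpy)])
```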
Optionally, fig. 16 is a schematic structural diagram of the virtual operating handle 32 according to another embodiment of the present invention. As shown in fig. 16, the virtual operating handle 32 further includes a handle housing 36; the handle housing 36 includes a ring portion 361 and a holding portion 362, wherein a recess is provided at the center of the holding portion 362 to receive and fix the control handle 33.
Next, an electronic apparatus according to an embodiment of the present invention is described with reference to fig. 17. Fig. 17 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
As shown in fig. 17, the electronic device 900 includes one or more processors 901 and memory 902.
The processor 901 may be a central processing unit (CPU) or another form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device 900 to perform desired functions.
The memory 902 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, random access memory (RAM) and/or cache memory (cache). The non-volatile memory may include, for example, read-only memory (ROM), hard disks, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium, and the processor 901 may execute the program instructions to implement the spatial positioning method of the intelligent terminal of the various embodiments of the present invention described above and/or other desired functions.
In one example, the electronic device 900 may further include: an input device 903 and an output device 904, which are interconnected by a bus system and/or other form of connection mechanism (not shown).
The input device 903 may include, for example, a keyboard, a mouse, and the like.
The output device 904 can output various information to the outside, and may include, for example, a display, a communication network and remote output devices connected thereto, and the like.
Of course, for simplicity, only some of the components of the electronic device 900 relevant to the present invention are shown in fig. 17, and components such as buses, input/output interfaces, and the like are omitted. In addition, electronic device 900 may include any other suitable components depending on the particular application.
In addition to the above methods and apparatuses, an embodiment of the present invention may also be a computer program product comprising computer program instructions which, when executed by a processor, cause the processor to perform the steps of the spatial positioning method of the intelligent terminal according to the various embodiments of the present invention described in this specification.
The computer program product may be written with program code for performing the operations of embodiments of the present invention in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, an embodiment of the present invention may also be a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform the steps of the spatial positioning method of the intelligent terminal according to the various embodiments of the present invention.
The computer-readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The basic principles of the present invention have been described above with reference to specific embodiments. It should be noted, however, that the advantages and effects mentioned in the present invention are merely examples and are not limiting; they should not be regarded as necessarily possessed by every embodiment of the present invention. Furthermore, the specific details disclosed above serve only the purposes of illustration and ease of understanding, and the invention is not restricted to those details.
The block diagrams of the devices, apparatuses and systems involved in the present invention are only illustrative examples and are not intended to require or imply that they must be connected, arranged or configured in the manner shown; as those skilled in the art will appreciate, these devices, apparatuses and systems may be connected, arranged or configured in any manner. Words such as "including", "comprising" and "having" are open-ended terms that mean "including, but not limited to" and are used interchangeably therewith. The words "or" and "and" as used herein mean, and are used interchangeably with, the term "and/or", unless the context clearly dictates otherwise. The phrase "such as" as used herein means, and is used interchangeably with, the phrase "such as, but not limited to".
It should also be noted that, in the apparatuses, devices and methods of the present invention, the components or steps may be decomposed and/or recombined. Such decompositions and/or recombinations should be regarded as equivalent solutions of the present invention.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the invention. Thus, the present invention is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The above description covers only the preferred embodiments of the present invention and is not intended to limit the invention; any modification, equivalent replacement and the like made within the spirit and principle of the present invention shall fall within the scope of the present invention.

Claims (16)

1. An intelligent terminal, comprising:
the intelligent terminal body, wherein the intelligent terminal body is detachably mounted on a virtual head display device or a handheld device;
the display screen is arranged on the first side face of the intelligent terminal body;
the function switching module is arranged in the intelligent terminal body and is used for controlling the intelligent terminal to switch between a virtual head display function and a palm machine function;
the main control chip is arranged in the intelligent terminal body and is in communication connection with the function switching module;
the first camera device is arranged on a second side face of the intelligent terminal body, and the first side face and the second side face are opposite faces; and
the first inertial sensor is arranged on the intelligent terminal body and used for detecting IMU data of the intelligent terminal body;
the first camera device and the first inertial sensor are in communication connection with the main control chip respectively.
2. The intelligent terminal according to claim 1, further comprising:
the brightness controller is arranged in the intelligent terminal body and is respectively in communication connection with the main control chip and the display screen;
when the intelligent terminal is switched from the palm machine function to the virtual head display function, the brightness controller reduces the display brightness of the display screen;
and when the intelligent terminal is switched from the virtual head display function to the palm machine function, the brightness controller increases the display brightness of the display screen.
3. The intelligent terminal of claim 1, further comprising:
the image processor is arranged in the intelligent terminal body and is in communication connection with the main control chip;
the image processor is used for performing asynchronous space warping, asynchronous time warping and image rendering on image information displayed on the display screen.
4. The intelligent terminal according to claim 1, further comprising:
the second camera device is arranged on the first side face of the intelligent terminal body;
and the second camera device is in communication connection with the main control chip.
5. The intelligent terminal according to claim 1, wherein the number of the first camera devices is four, and the four first camera devices are divided into two camera device groups, and two first camera devices in each camera device group are symmetrical with respect to the center of the intelligent terminal body.
6. A palm machine, comprising:
the intelligent terminal of claim 1; and
the palm machine handle is connected with the intelligent terminal;
the palm machine handle is provided with a first infrared sensor and a second inertial sensor, and the second inertial sensor is used for detecting IMU data of the palm machine handle;
the second inertial sensor is in communication connection with the main control chip in the intelligent terminal.
7. The palm machine according to claim 6, wherein the master control chip comprises:
the first control unit is respectively in communication connection with the function switching module, the first camera device and the first inertial sensor, and is used for controlling the first camera device to shoot a first image of the surrounding environment where the intelligent terminal is located and controlling the first inertial sensor to detect IMU data of the intelligent terminal;
the first calculation unit is respectively in communication connection with the first control unit, the first camera device, the first infrared sensor, the first inertial sensor and the second inertial sensor, and is used for acquiring the first image and the IMU data of the intelligent terminal body, calculating the IMU data of the intelligent terminal body and the first image, and generating the spatial positioning information of the intelligent terminal.
8. The palm machine according to claim 7, wherein the main control chip further comprises:
the second control unit is respectively in communication connection with the function switching module, the first camera device and the second inertial sensor; the second control unit is used for controlling the first camera device to shoot a first infrared sensor on the palm machine handle and controlling a second inertial sensor on the palm machine handle to detect IMU data of the palm machine handle when the function switching module switches the intelligent terminal to the palm machine function;
and the second calculation unit is respectively in communication connection with the first calculation unit, the second control unit, the first camera device and the second inertial sensor, and is used for acquiring the first light spot image of the first infrared sensor transmitted by the first camera device and the IMU data of the palm machine handle transmitted by the second inertial sensor, calculating the first light spot image, the IMU data of the palm machine handle and the spatial positioning information of the intelligent terminal, and generating the spatial positioning information of the palm machine handle.
9. The palm machine of claim 6, further comprising:
the connector is in communication connection with the palm machine handle and the intelligent terminal respectively;
the palm machine handle is provided with a multifunctional key; and
the multifunctional processor is respectively connected with the multifunctional keys and the main control chip of the intelligent terminal, and is used for receiving operation instructions input by a user through the multifunctional keys and sending the operation instructions to the main control chip.
10. A virtual system, comprising:
the intelligent terminal of claim 1;
a virtual head display device, wherein the intelligent terminal is detachably mounted on the virtual head display device body; and
a virtual operating handle;
wherein the virtual operating handle comprises:
a control handle;
a second infrared sensor disposed on the control handle; and
a third inertial sensor disposed on the control handle, the third inertial sensor for measuring IMU data of the control handle;
the control handle and the third inertial sensor are respectively in communication connection with the main control chip.
11. The virtual system of claim 10, wherein the virtual operating handle further comprises:
the handle housing, which comprises a ring portion and a holding portion, wherein a recess is provided at the center of the holding portion to receive and fix the control handle.
12. The virtual system according to claim 10, wherein the main control chip comprises:
the third control unit is respectively in communication connection with the function switching module, the first camera device and the first inertial sensor, and is used for controlling the first camera device to shoot a first image of the surrounding environment where the intelligent terminal is located and controlling the first inertial sensor to detect IMU data of the intelligent terminal;
and the third calculation unit is in communication connection with the third control unit, the first camera device, the first infrared sensor and the first inertial sensor respectively, and is used for acquiring the first image and the IMU data of the intelligent terminal body, calculating the IMU data of the intelligent terminal body and the first image and generating the spatial positioning information of the intelligent terminal.
13. The virtual system of claim 12, wherein the main control chip further comprises:
the fourth control unit is in communication connection with the function switching module, the first camera device, the second infrared sensor and the third inertial sensor respectively; the fourth control unit is configured to: when the function switching module switches the intelligent terminal to a virtual head display function, the fourth control unit controls the first camera device to shoot a second infrared sensor positioned on the virtual operating handle, and controls a third inertial sensor positioned on the virtual operating handle to detect IMU data of the virtual operating handle;
a fourth calculation unit, which is in communication connection with the third calculation unit, the first camera device and the third inertial sensor, respectively, the fourth calculation unit being configured to: acquire a second light spot image of the second infrared sensor transmitted by the first camera device and the IMU data of the virtual operating handle transmitted by the third inertial sensor, and calculate the spatial positioning information of the intelligent terminal, the second light spot image and the IMU data of the virtual operating handle to generate the spatial positioning information of the virtual operating handle.
14. A spatial positioning method of an intelligent terminal, for positioning the intelligent terminal according to claim 1, wherein the spatial positioning method of the intelligent terminal comprises:
the main control chip controls the first camera device on the intelligent terminal to shoot a first image of the surrounding environment where the intelligent terminal is located, and controls the first inertial sensor to detect the IMU data of the intelligent terminal;
the main control chip acquires a first image of the surrounding environment where the intelligent terminal is located, wherein the first image is shot by the first camera device;
the main control chip acquires IMU data of the intelligent terminal body detected by the first inertial sensor; and
and the main control chip calculates the IMU data of the intelligent terminal body and the first image to generate the spatial positioning information of the intelligent terminal.
15. The spatial positioning method of the intelligent terminal according to claim 14, wherein when the intelligent terminal is in communication connection with a palm machine handle, the spatial positioning method of the intelligent terminal further comprises:
the function switching module switches the intelligent terminal to the palm machine function;
the main control chip controls the first camera device to shoot the first infrared sensor on the palm machine handle, and controls the second inertial sensor on the palm machine handle to detect the IMU data of the palm machine handle;
and the main control chip acquires the first light spot image of the first infrared sensor transmitted by the first camera device and the IMU data of the palm machine handle transmitted by the second inertial sensor, calculates the spatial positioning information of the intelligent terminal, the first light spot image and the IMU data of the palm machine handle, and generates the spatial positioning information of the palm machine handle.
16. The spatial positioning method of the intelligent terminal according to claim 14, wherein when the intelligent terminal is installed on a virtual head display device and the intelligent terminal is in communication connection with a virtual operating handle, the spatial positioning method of the intelligent terminal further comprises:
the function switching module switches the intelligent terminal to a virtual head display function;
the main control chip controls the first camera device to shoot the second infrared sensor on the virtual operating handle, and controls the third inertial sensor on the virtual operating handle to detect the IMU data of the virtual operating handle;
and the main control chip acquires the second light spot image of the second infrared sensor transmitted by the first camera device and the IMU data of the virtual operating handle transmitted by the third inertial sensor, calculates the spatial positioning information of the intelligent terminal, the second light spot image and the IMU data of the virtual operating handle, and generates the spatial positioning information of the virtual operating handle.