CN117008709A - Control method and device based on VR equipment, electronic equipment and storage medium


Info

Publication number
CN117008709A
Authority
CN
China
Prior art keywords
equipment
user
controlled
image data
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210462520.9A
Other languages
Chinese (zh)
Inventor
黄青川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202210462520.9A priority Critical patent/CN117008709A/en
Publication of CN117008709A publication Critical patent/CN117008709A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides a control method and device based on a VR device, an electronic device, and a storage medium. The VR-device-based control method includes: receiving image data of a controlled device and displaying the image data through the VR device; acquiring pose information of a user of the VR device and determining control information according to the pose information; and sending the control information to the controlled device so that the controlled device executes an operation corresponding to the control information. The method can use the VR device to remotely and intelligently control the controlled device, realizing integrated human-machine operation between the user and the controlled device.

Description

Control method and device based on VR equipment, electronic equipment and storage medium
Technical Field
The disclosure relates to the technical field of intelligent terminals, and in particular relates to a control method and device based on VR equipment, electronic equipment and a storage medium.
Background
At present, a user at the remote end of a site uses a remote control device to send control instructions through a transmission channel to a near-end controller at the site; the near-end controller then completes control of the controlled device, achieving purposes such as status monitoring, equipment maintenance, and service provision.
Disclosure of Invention
The disclosure provides a control method and device based on VR equipment, electronic equipment and storage medium.
The present disclosure adopts the following technical solutions.
In some embodiments, the present disclosure provides a VR device-based control method, including:
receiving image data of a controlled device and displaying the image data through VR equipment;
acquiring pose information of a user of the VR equipment, and determining control information according to the pose information;
and sending the control information to the controlled equipment so that the controlled equipment executes the operation corresponding to the control information.
In some embodiments, the present disclosure provides a VR device-based control apparatus, comprising:
the first processing module is used for receiving the image data of the controlled equipment and displaying the image data through the VR equipment;
the second processing module is used for acquiring pose information of a user of the VR equipment and determining control information according to the pose information;
and the third processing module is used for sending the control information to the controlled equipment so as to enable the controlled equipment to execute the operation corresponding to the control information.
In some embodiments, the present disclosure provides an electronic device comprising: at least one memory and at least one processor;
the memory is used for storing program codes, and the processor is used for calling the program codes stored in the memory to execute the method.
In some embodiments, the present disclosure provides a computer readable storage medium for storing program code which, when executed by a processor, causes the processor to perform the above-described method.
The VR-device-based control method of the present disclosure can receive image data of a controlled device and display the image data through the VR device, acquire pose information of a user of the VR device, determine control information according to the pose information, and then send the control information to the controlled device so that the controlled device executes an operation corresponding to the control information. In this way, the user's position and posture information is acquired through the VR device and the processed control information is sent to a remote intelligent device, so that the remote intelligent device can complete a position and posture change identical or similar to that of the VR device's user according to the control information. This achieves the purpose of remotely controlling the intelligent device and realizes integrated human-machine operation.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
Fig. 1 is a flowchart of a VR device based control method of an embodiment of the present disclosure.
Fig. 2 is a schematic diagram of VR device based control of an embodiment of the present disclosure.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure have been shown in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but are provided to provide a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in different orders and/or in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a" and "an" in this disclosure are illustrative rather than limiting, and those of ordinary skill in the art will appreciate that they should be construed as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
The following describes in detail the schemes provided by the embodiments of the present disclosure with reference to the accompanying drawings.
As shown in fig. 1, fig. 1 is a flowchart of a method in an embodiment of the present disclosure, including the steps of:
step S01: receiving image data of a controlled device and displaying the image data through VR equipment;
in some embodiments, the controlled device may be a remote machine, for example, a manipulator, and the VR device may be communicatively connected with the remote controlled device through a cloud server, so that image data collected by the controlled device, or image data collected by other devices and centered on the controlled device, is sent to the VR device through the cloud server, so that the VR device displays a three-dimensional virtual world centered on the controlled device in real time, and a user may participate in and explore the role and change of the controlled device in the environment where the VR device is located, so as to generate immersion. In some embodiments, the controlled device may also be directly connected to the VR device in a communication manner, and it may be understood that, while the three-dimensional virtual world centered on the controlled device is displayed by the VR device, a virtual surround sound corresponding to the environment where the controlled device is located may also be generated by using an acoustic simulation technology, so as to meet the hearing requirement of the user.
Step S02: acquiring pose information of a user of the VR equipment, and determining control information according to the pose information;
in some embodiments, the VR device may include an operating component that includes a pose detection component for obtaining pose information of the user. Specifically, the pose detection component may obtain six-degree-of-freedom information corresponding to the movements of the head and the limbs of the user, and obtain control information after processing the pose information. For example, pose information of the head of the user is acquired by the pose detection part, and a pose control signal of the six-degree-of-freedom controlled device is generated according to the pose information. Wherein the movements of the user's head and extremities include, but are not limited to, forward, backward, left turn, right turn, head turn, and hand movements; the six degrees of freedom include three position coordinates and three angular velocities in three-dimensional space.
Step S03: and sending the control information to the controlled equipment so that the controlled equipment executes the operation corresponding to the control information.
In some embodiments, as shown in fig. 2, the control information may be transmitted over a network to a remote cloud server, and the cloud server transmits the control information to the controlled device. After processing and computing on the control information, the controlled device controls its motion unit to execute the operation corresponding to the control information, thereby completing a posture adjustment identical or similar to the user's.
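The receiving side of step S03 can be sketched as follows. The controlled device here simply integrates the 6DoF deltas into its current pose; the class name and the integration scheme are illustrative assumptions, since a real device would run motion planning on its motion unit.

```python
class ControlledDevice:
    """Toy remote manipulator that applies 6DoF control deltas."""
    def __init__(self):
        self.pose = {"x": 0.0, "y": 0.0, "z": 0.0,
                     "roll": 0.0, "pitch": 0.0, "yaw": 0.0}

    def apply_control(self, delta):
        # A real device would plan and execute motion through its motion
        # unit; here we just accumulate the commanded deltas.
        for axis, value in delta.items():
            self.pose[axis] += value
        return self.pose

# Usage: two successive control messages arriving via the cloud server.
device = ControlledDevice()
device.apply_control({"x": 0.5, "yaw": 30.0})
device.apply_control({"x": 0.25})
```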
According to the VR-device-based control method described above, image data of the controlled device can be received and displayed through the VR device, pose information of the user of the VR device is obtained, control information is determined according to the pose information, and the control information is then sent to the controlled device so that the controlled device executes operations corresponding to the control information. In this way, the VR device can transmit the user's position and pose information to a remote intelligent device so that, for example, the remote intelligent device can complete the same or a similar position and pose change. This achieves the purpose of remotely controlling the remote intelligent device and realizes integrated human-machine operation.
Based on the content of the above embodiments, in some embodiments, receiving image data of a controlled device and displaying by a VR device includes: receiving image data of a controlled device, generating a virtual reality image of the controlled device according to the image data, and displaying the virtual reality image through the VR device; or receiving image data of the controlled equipment and the environment where the controlled equipment is located, generating virtual reality images of the controlled equipment and the environment where the controlled equipment is located according to the image data, and displaying the virtual reality images through the VR equipment.
In some embodiments, on the one hand, the VR device may receive image data of the controlled device, generate a virtual reality image of the controlled device according to the image data, and display the virtual reality image through the VR device; the image data of the controlled device may be current pose image data of the controlled device body. On the other hand, the VR device may also receive image data of the controlled device together with image data, collected by other devices, of the surrounding environment centered on the controlled device, and generate virtual reality images of the controlled device and its environment from these data. The VR device thus displays a three-dimensional virtual world centered on the controlled device in real time, and the user can participate in and explore, through the VR device, the role and changes of the controlled device in its environment, producing a sense of immersion.
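The two display modes just described (device only versus device plus surrounding environment) can be distinguished with a small dispatcher. This is a minimal sketch; the scene-dictionary format and mode names are assumptions for illustration only.

```python
def build_scene(device_frames, env_frames=None):
    """Assemble a virtual-reality scene description from received image data.

    With only device frames, render the controlled device alone; when
    environment frames are also available, render a three-dimensional
    world centered on the controlled device."""
    scene = {"device": device_frames[-1]}  # latest pose image of the device body
    if env_frames:
        scene["environment"] = env_frames[-1]
        scene["mode"] = "device_in_environment"
    else:
        scene["mode"] = "device_only"
    return scene

# Usage: first mode with device frames only, second mode with both streams.
solo = build_scene([{"seq": 1}])
full = build_scene([{"seq": 1}], [{"seq": 7}])
```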
Based on the content of the above embodiments, in some embodiments, further includes: and displaying the virtual reality image associated with the user through the VR equipment according to the pose information of the user.
In some embodiments, after pose information of the user of the VR device is obtained, a virtual reality image associated with the user may also be displayed by the VR device according to the pose information. For example, a virtual reality image from the user's first-person perspective may be displayed by the VR device according to the pose information, a virtual reality image from the user's third-person perspective may be displayed, or a virtual reality image of the environment in which a virtual character associated with the user is located may be displayed. The virtual character associated with the user can be set according to user preference, which is not specifically limited here.
Based on the foregoing embodiments, in some embodiments, the acquiring pose information of the user includes: and acquiring the real-time pose information of the user as the pose information, or acquiring and displaying the real-time pose information of the user, and responding to the editing operation or the confirmation operation of the displayed real-time pose information, and taking the edited or confirmed real-time pose information as the pose information.
In some embodiments, the acquired real-time pose information of the user may be used directly as the target pose information; alternatively, after the real-time pose information is acquired, it may be displayed by the VR device and then viewed and processed by the user before being used as the target pose information. It can be understood that in practical applications a user may produce unintended operations such as accidental touches and collisions. To prevent such operations from misleading the controlled device, in some embodiments of the present disclosure, after the real-time pose information of the user is acquired, it is displayed through the VR device so that the user can view it. If the user judges the real-time pose information to be correct, the confirmed real-time pose information is used as the target pose information through a confirmation operation; if the user judges the real-time pose information to be flawed, it can be deleted, edited, or calibrated, and the edited or calibrated real-time pose information is then used as the target pose information.
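The confirm/edit/discard gate described above can be sketched as a single selection function. The action names and the function signature are illustrative assumptions; the patent describes the operations but does not name an interface.

```python
def select_target_pose(realtime_pose, user_action=None, edited_pose=None):
    """Decide which pose becomes the target pose.

    - None:      use the real-time pose directly (no review step)
    - 'confirm': the user viewed the displayed pose and accepted it
    - 'edit':    the user corrected a flawed pose; use the edited version
    - 'discard': drop a mistaken pose (e.g. an accidental touch)"""
    if user_action is None or user_action == "confirm":
        return realtime_pose
    if user_action == "edit":
        return edited_pose
    if user_action == "discard":
        return None
    raise ValueError(f"unknown action: {user_action}")

# Usage: an accidental collision is discarded instead of being sent on.
target = select_target_pose({"yaw": 90.0}, "discard")
```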
Based on the foregoing embodiments, in some embodiments, the VR device includes a wearable display component and an operating component; the operation part is provided with a pose detection part, or the operation part and the wearable display part are provided with pose detection parts, and the pose information is acquired through the pose detection parts.
In some embodiments, the VR device includes a wearable display component and an operating component; the wearable display part is mainly used for displaying a virtual reality picture, and can be a head-wearing helmet with a display, and the operation part is used for being held by a user or worn by the user; the wearable display part and the operation part comprise pose detection parts, and the pose detection parts are used for recording pose information of the wearable display part and the operation part.
In some embodiments, the wearable display component includes a camera module for capturing the surrounding environment and recording the user's facial expressions and movements, which are sent to the pose detection component. The pose detection component has built-in sensors, gyroscopes, cameras, and the like for tracking and locating the user's movements. When the controlled device is a simulated manipulator, the position movements and action changes of the user's hand during virtual operation are acquired through the operating component worn on the user's hand.
Based on the content of the foregoing embodiments, in some embodiments, the pose information of the user includes: at least one of a facial expression of the user, a position of the user, an action of the user, and an orientation of the user.
In some embodiments, the user pose information acquired by the pose detection component includes at least one of the user's facial expression, position, actions, and orientation. Because both the wearable display component and the operating component of the VR device include pose detection components for acquiring pose information, and the pose detection components are corrected using the calibration positions recorded by the camera module, more accurate information on the user's position and position changes can be provided. In some embodiments, different user expressions correspond to different control information, and the user can control the controlled device through facial expressions.
In some embodiments, sending the control information to the controlled device to cause the controlled device to perform operations corresponding to the control information includes: and sending the control information to the controlled equipment so that the controlled equipment executes pose adjustment corresponding to the control information.
In some embodiments, the control information may be transmitted over a network to a remote cloud server, which transmits the control information to the controlled device. After processing and computing on the control information, the controlled device controls its motion unit to execute the corresponding operation, thereby completing a posture adjustment identical or similar to the user's. Some embodiments of the present disclosure use VR devices to control remote intelligent devices, such as intelligent robots, intelligent automobiles, and intelligent remote control equipment, allowing convenient remote control so that these devices can perform special functions, such as entering uninhabited areas or dangerous rescue sites.
Based on the content of the above embodiments, in some embodiments, further includes: at least one of the following: receiving the image data of the controlled equipment again and displaying the image data through VR equipment; and receiving the position information of the controlled equipment, and carrying out position correction on the controlled equipment according to the position information.
In some embodiments, as shown in fig. 2, after the controlled device performs the pose adjustment corresponding to the control information, it sends the collected image data and its own position information to the VR device through the cloud server. The VR device then displays in real time a virtual reality image of the controlled device after the pose adjustment according to the image data, and performs position correction on the controlled device according to its position information. For example, after the controlled device performs the pose adjustment, the user can check its current position through the wearable display component and decide whether to correct the position and pose of the controlled device, thereby forming a feedback mechanism. The computation of the position and pose correction data can be performed on the cloud server.
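The feedback mechanism can be sketched as a comparison between the expected position and the position the device reports back. The tolerance threshold and the dictionary representation are illustrative assumptions; the patent leaves the correction computation (which may run on the cloud server) unspecified.

```python
def position_correction(expected, reported, tolerance=0.05):
    """Compare the controlled device's reported position with the
    expected one and return a correction vector per axis, or None
    if the device is already within tolerance."""
    error = {axis: expected[axis] - reported[axis] for axis in expected}
    if all(abs(v) <= tolerance for v in error.values()):
        return None  # close enough; no correction needed
    return error

# Usage: the device drifted 0.2 m short along x, so a correction is issued.
corr = position_correction({"x": 1.0, "y": 0.0}, {"x": 0.8, "y": 0.01})
ok = position_correction({"x": 1.0}, {"x": 1.02})
```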
Based on the content of the above embodiments, in some embodiments, further includes: and according to the received image data, adjusting the controlled device displayed by the VR device and/or the environment virtual reality image where the controlled device is located.
In some embodiments, the controlled device displayed by the VR device may be adjusted according to the newly received image data, and the expected pose image and the actual pose image of the controlled device may be presented in the VR device in small windows, so that the user can judge the influence of the controlled device's environment on the operation by comparing the two images, and may even give a correction suggestion directly in text form.
In some embodiments, the display interface of the VR device may be enlarged by operation to meet the control requirement with high accuracy.
In some embodiments, the pose detection component is disposed on a tracking device and a controller of the VR device.
In some embodiments, the pose detection component comprises an inertial measurement unit (IMU).
In some embodiments, the controlled device includes a 3D camera for acquiring image data of the controlled device.
The embodiment of the disclosure also provides a control device based on the VR device, which comprises:
the first processing module is used for receiving the image data of the controlled equipment and displaying the image data through the VR equipment;
the second processing module is used for acquiring pose information of a user of the VR equipment and determining control information according to the pose information;
and the third processing module is used for sending the control information to the controlled equipment so as to enable the controlled equipment to execute the operation corresponding to the control information.
In some embodiments, the VR device is in communication connection with the remote controlled device through the cloud server, so that image data collected by the controlled device or image data collected by other devices and centered on the controlled device is sent to the VR device through the cloud server, so that the VR device displays a three-dimensional virtual world centered on the controlled device in real time, and a user can participate in and explore the role and change of the controlled device in the environment through the VR device to generate immersion. It can be understood that, while the three-dimensional virtual world centered on the controlled device is displayed by the VR device, the virtual surround sound corresponding to the environment where the controlled device is located can be generated by the acoustic simulation technology, so as to meet the hearing requirement of the user.
In some embodiments, the VR device includes an operating component having a pose detection component for obtaining the user's pose information. Specifically, the pose detection component may obtain six-degree-of-freedom information corresponding to the movements of the user's head and limbs, and the control information is obtained after processing this pose information. For example, pose information of the user's head is acquired by the pose detection component, and a six-degree-of-freedom pose control signal for the controlled device is generated according to that pose information. The movements of the user's head and limbs include, but are not limited to, moving forward, moving backward, turning left, turning right, head rotation, and hand movements; the six degrees of freedom comprise three position coordinates and three rotation angles in three-dimensional space.
In some embodiments, the control information is transmitted over a network to a remote cloud server, and the cloud server transmits it to the controlled device. After processing and computing on the control information, the controlled device controls its motion unit to execute the corresponding operation, thereby completing a posture adjustment identical or similar to the user's.
The VR-device-based control apparatus described above can receive image data of the controlled device, display the image data through the VR device, acquire pose information of the user of the VR device, determine control information according to the pose information, and then send the control information to the controlled device so that the controlled device executes operations corresponding to the control information. In this way, the user's position and pose information can be transmitted to the remote intelligent device using the VR device, so that the remote intelligent device can complete the same or a similar position and pose change. This achieves the purpose of remotely controlling the remote intelligent device and realizes integrated human-machine operation.
In some embodiments, the first processing module is specifically configured to:
receiving image data of a controlled device, generating a virtual reality image of the controlled device according to the image data, and displaying the virtual reality image through the VR device; or,
receiving image data of a controlled device and an environment where the controlled device is located, generating virtual reality images of the controlled device and the environment where the controlled device is located according to the image data, and displaying the virtual reality images through the VR device.
In some embodiments, the first processing module is further specifically configured to:
and displaying the virtual reality image associated with the user through the VR equipment according to the pose information of the user.
In some embodiments, the second processing module is specifically configured to:
and acquiring the real-time pose information of the user as the pose information, or acquiring and displaying the real-time pose information of the user, and responding to the editing operation or the confirmation operation of the displayed real-time pose information, and taking the edited or confirmed real-time pose information as the pose information.
In some embodiments, the VR device includes a wearable display component and an operating component;
the operation part is provided with a pose detection part, or the operation part and the wearable display part are provided with pose detection parts, and the pose information is acquired through the pose detection parts.
In some embodiments, the pose information of the user includes: at least one of a facial expression of the user, a position of the user, an action of the user, and an orientation of the user.
In some embodiments, the third processing module is specifically configured to:
and sending the control information to the controlled equipment so that the controlled equipment executes pose adjustment corresponding to the control information.
In some embodiments, further comprising:
the fourth processing module is used for receiving the image data of the controlled device again and displaying the image data through the VR device;
and the fifth processing module is used for receiving the position information of the controlled equipment and carrying out position correction on the controlled equipment according to the position information.
In some embodiments, further comprising:
and the sixth processing module is used for adjusting the controlled device and/or the environment virtual reality image where the controlled device is located, which is displayed by the VR device, according to the received image data.
For the apparatus embodiments, which essentially correspond to the method embodiments, reference is made to the description of the method embodiments for relevant details. The apparatus embodiments described above are merely illustrative, and the modules described as separate modules may or may not be physically separate. Some or all of the modules may be selected according to actual needs to achieve the objectives of the embodiments. Those of ordinary skill in the art can understand and implement them without creative effort.
The method and apparatus of the present disclosure are described above based on the embodiments and applications. In addition, the present disclosure also provides an electronic device and a computer-readable storage medium, which are described below.
Referring now to fig. 3, a schematic diagram of an electronic device (e.g., a terminal device or server) 800 suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in the drawings is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
The electronic device 800 may include a processing device (e.g., a central processing unit, a graphics processing unit, etc.) 801 that may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 802 or a program loaded from a storage device 808 into a random access memory (RAM) 803. The RAM 803 also stores various programs and data required for the operation of the electronic device 800. The processing device 801, the ROM 802, and the RAM 803 are connected to each other through a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
In general, the following devices may be connected to the I/O interface 805: an input device 806 including, for example, a touch screen, a touchpad, a keyboard, a mouse, a camera, a microphone, an accelerometer, a gyroscope, and the like; an output device 807 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, and the like; a storage device 808 including, for example, a magnetic tape, a hard disk, and the like; and a communication device 809. The communication device 809 may allow the electronic device 800 to communicate wirelessly or by wire with other devices to exchange data. While an electronic device 800 having various devices is shown, it should be understood that not all of the illustrated devices are required to be implemented or provided; more or fewer devices may alternatively be implemented or provided.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via communication device 809, or installed from storage device 808, or installed from ROM 802. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 801.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients and servers may communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed network.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform the methods of the present disclosure described above.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, there is provided a VR device-based control method, including:
receiving image data of a controlled device and displaying the image data through a VR device;
acquiring pose information of a user of the VR device, and determining control information according to the pose information; and
sending the control information to the controlled device so that the controlled device performs an operation corresponding to the control information.
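The three steps above form a simple receive-display-control loop. The following is a minimal, self-contained sketch of one iteration of that loop; all class and method names (`VRDevice`, `ControlledDevice`, `control_step`, etc.) are illustrative stand-ins invented for this sketch, since the disclosure does not prescribe a concrete API.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple = (0.0, 0.0, 0.0)     # user's position
    orientation: tuple = (0.0, 0.0, 1.0)  # user's facing direction

class VRDevice:
    """Stand-in for the wearable display plus operating components."""
    def __init__(self, pose):
        self._pose = pose
        self.displayed = None

    def display(self, image_data):
        # Render the received image data as a virtual-reality image.
        self.displayed = image_data

    def read_pose(self):
        # In a real headset this would come from the pose detection components.
        return self._pose

class ControlledDevice:
    """Stand-in for the remote device being controlled."""
    def __init__(self):
        self.last_control = None

    def capture_image(self):
        # Image data of the device and/or its environment.
        return b"frame-bytes"

    def apply(self, control_info):
        # Perform the operation corresponding to the control information,
        # e.g. a pose adjustment.
        self.last_control = control_info

def control_step(vr, dev):
    # Step 1: receive image data of the controlled device and display it.
    vr.display(dev.capture_image())
    # Step 2: acquire the user's pose information and derive control info.
    pose = vr.read_pose()
    control_info = {"move_to": pose.position, "face": pose.orientation}
    # Step 3: send the control information to the controlled device.
    dev.apply(control_info)
    return control_info
```

In practice the loop would run continuously, with steps 1 and 2 fed by the device's camera stream and the headset's pose detection components respectively.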
According to one or more embodiments of the present disclosure, there is provided a method of receiving image data of a controlled device and displaying by a VR device, comprising:
receiving image data of a controlled device, generating a virtual reality image of the controlled device according to the image data, and displaying the virtual reality image through the VR device; or,
receiving image data of a controlled device and an environment where the controlled device is located, generating virtual reality images of the controlled device and the environment where the controlled device is located according to the image data, and displaying the virtual reality images through the VR device.
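The two display variants above differ only in whether the environment is rendered alongside the device. A minimal sketch of that choice, with names invented for illustration (the disclosure does not fix a rendering API):

```python
def build_vr_scene(device_image, environment_image=None):
    """Return a minimal scene description for the VR display.

    device_image: image data of the controlled device (always shown).
    environment_image: optional image data of the environment in which
    the controlled device is located.
    """
    scene = {"device": device_image}              # first variant: device only
    if environment_image is not None:
        scene["environment"] = environment_image  # second variant: device + environment
    return scene
```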
According to one or more embodiments of the present disclosure, there is provided a method further comprising: and displaying the virtual reality image associated with the user through the VR equipment according to the pose information of the user.
According to one or more embodiments of the present disclosure, there is provided a method of acquiring pose information of a user, including:
acquiring real-time pose information of the user as the pose information; or acquiring and displaying the real-time pose information of the user and, in response to an editing operation or a confirmation operation on the displayed real-time pose information, taking the edited or confirmed real-time pose information as the pose information.
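The two acquisition paths above can be sketched as follows: either use the real-time pose directly, or surface it to the user and take the edited or confirmed result. The function and parameter names are hypothetical, invented for this sketch.

```python
def acquire_pose(raw_pose, confirm=None):
    """Return the pose information to use for control.

    raw_pose: real-time pose data from the pose detection components.
    confirm: optional callable modeling the display-and-edit/confirm step;
    it receives the displayed pose and returns the (possibly edited) pose,
    or the same pose unchanged on plain confirmation.
    """
    if confirm is None:
        return raw_pose       # path 1: use the real-time pose as-is
    return confirm(raw_pose)  # path 2: use the edited/confirmed pose
```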
According to one or more embodiments of the present disclosure, there is provided a method, the VR device comprising a wearable display component and an operating component;
the operating component is provided with a pose detection component, or both the operating component and the wearable display component are provided with pose detection components, and the pose information is acquired through the pose detection component(s).
According to one or more embodiments of the present disclosure, there is provided a method, the pose information of the user including: at least one of a facial expression of the user, a position of the user, an action of the user, and an orientation of the user.
According to one or more embodiments of the present disclosure, there is provided a method of transmitting the control information to the controlled device to cause the controlled device to perform operations corresponding to the control information, including:
and sending the control information to the controlled equipment so that the controlled equipment executes pose adjustment corresponding to the control information.
According to one or more embodiments of the present disclosure, there is provided a method further comprising: at least one of the following:
receiving the image data of the controlled device again and displaying the image data through the VR device; and
receiving position information of the controlled device, and performing position correction on the controlled device according to the position information.
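The position-correction step above can be sketched as comparing the position reported by the controlled device with its intended target and issuing a correction when they diverge. The function name, the simple vector difference, and the tolerance threshold are all assumptions of this sketch, not details from the disclosure.

```python
def position_correction(reported, target, tolerance=0.05):
    """Return a correction vector for the controlled device, or None.

    reported: position reported by the controlled device, e.g. (x, y, z).
    target: position the control information intended it to reach.
    tolerance: per-axis error below which no correction is issued.
    """
    delta = tuple(t - r for r, t in zip(reported, target))
    if all(abs(d) <= tolerance for d in delta):
        return None   # within tolerance: no correction needed
    return delta      # move by delta to reach the target position
```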
According to one or more embodiments of the present disclosure, there is provided a method further comprising:
adjusting, according to the received image data, the virtual reality image of the controlled device and/or of the environment in which the controlled device is located, as displayed by the VR device.
According to one or more embodiments of the present disclosure, there is provided a VR device-based control apparatus, including:
a first processing module configured to receive image data of a controlled device and display the image data through a VR device;
a second processing module configured to acquire pose information of a user of the VR device and determine control information according to the pose information; and
a third processing module configured to send the control information to the controlled device so that the controlled device performs an operation corresponding to the control information.
According to one or more embodiments of the present disclosure, there is provided an electronic device including: at least one memory and at least one processor;
wherein the at least one memory is configured to store program code, and the at least one processor is configured to invoke the program code stored by the at least one memory to perform any of the methods described above.
According to one or more embodiments of the present disclosure, a computer-readable storage medium is provided for storing program code which, when executed by a processor, causes the processor to perform the above-described method.
The foregoing description is merely a description of the preferred embodiments of the present disclosure and of the technical principles employed. It will be appreciated by persons skilled in the art that the scope of the disclosure referred to herein is not limited to the specific combinations of the features described above, and also covers other embodiments formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, embodiments formed by interchanging the above features with (but not limited to) technical features having similar functions disclosed in the present disclosure.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (12)

1. A VR device-based control method, comprising:
receiving image data of a controlled device and displaying the image data through a VR device;
acquiring pose information of a user of the VR device, and determining control information according to the pose information; and
sending the control information to the controlled device so that the controlled device performs an operation corresponding to the control information.
2. The method of claim 1, wherein receiving image data of the controlled device and displaying by the VR device comprises:
receiving image data of a controlled device, generating a virtual reality image of the controlled device according to the image data, and displaying the virtual reality image through the VR device; or,
receiving image data of a controlled device and an environment where the controlled device is located, generating virtual reality images of the controlled device and the environment where the controlled device is located according to the image data, and displaying the virtual reality images through the VR device.
3. The method as recited in claim 2, further comprising: and displaying the virtual reality image associated with the user through the VR equipment according to the pose information of the user.
4. The method of claim 1, wherein the obtaining pose information of the user comprises:
acquiring real-time pose information of the user as the pose information; or acquiring and displaying the real-time pose information of the user and, in response to an editing operation or a confirmation operation on the displayed real-time pose information, taking the edited or confirmed real-time pose information as the pose information.
5. The method of claim 1, wherein:
the VR device includes a wearable display component and an operating component;
the operating component is provided with a pose detection component, or both the operating component and the wearable display component are provided with pose detection components, and the pose information is acquired through the pose detection component(s).
6. The method of claim 1, wherein the pose information of the user comprises: at least one of a facial expression of the user, a position of the user, an action of the user, and an orientation of the user.
7. The method according to claim 1, wherein transmitting the control information to the controlled device to cause the controlled device to perform an operation corresponding to the control information, comprises:
and sending the control information to the controlled equipment so that the controlled equipment executes pose adjustment corresponding to the control information.
8. The method of any one of claims 1-7, further comprising: at least one of the following:
receiving the image data of the controlled device again and displaying the image data through the VR device; and
receiving position information of the controlled device, and performing position correction on the controlled device according to the position information.
9. The method as recited in claim 8, further comprising:
adjusting, according to the received image data, the virtual reality image of the controlled device and/or of the environment in which the controlled device is located, as displayed by the VR device.
10. A VR device-based control apparatus, comprising:
a first processing module configured to receive image data of a controlled device and display the image data through a VR device;
a second processing module configured to acquire pose information of a user of the VR device and determine control information according to the pose information; and
a third processing module configured to send the control information to the controlled device so that the controlled device performs an operation corresponding to the control information.
11. An electronic device, comprising:
at least one memory and at least one processor;
wherein the at least one memory is configured to store program code, and the at least one processor is configured to invoke the program code stored by the at least one memory to perform the method of any of claims 1 to 9.
12. A computer readable storage medium for storing program code which, when executed by a processor, causes the processor to perform the method of any one of claims 1 to 9.
CN202210462520.9A 2022-04-27 2022-04-27 Control method and device based on VR equipment, electronic equipment and storage medium Pending CN117008709A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210462520.9A CN117008709A (en) 2022-04-27 2022-04-27 Control method and device based on VR equipment, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN117008709A 2023-11-07

Family

ID=88574961

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210462520.9A Pending CN117008709A (en) 2022-04-27 2022-04-27 Control method and device based on VR equipment, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117008709A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination