CN115494776A - AR (augmented reality) glasses-based equipment control method, device, equipment and medium - Google Patents


Info

Publication number
CN115494776A
CN115494776A
Authority
CN
China
Prior art keywords
target
controlled
equipment
hologram
glasses
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211194089.0A
Other languages
Chinese (zh)
Inventor
陈智超
范荣
庞微
陈羽宇
查文陆
刘猛
韩超众
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Aircraft Manufacturing Co Ltd
Original Assignee
Shanghai Aircraft Manufacturing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Aircraft Manufacturing Co Ltd filed Critical Shanghai Aircraft Manufacturing Co Ltd
Priority to CN202211194089.0A
Publication of CN115494776A
Legal status: Pending


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/04 Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042 Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B19/0423 Input/output
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/20 Pc systems
    • G05B2219/25 Pc structure of the system
    • G05B2219/25257 Microcontroller

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention discloses an equipment control method, apparatus, equipment, and medium based on AR glasses. The method comprises the following steps: acquiring a controlled device in the visual field, performing object recognition on the controlled device, and determining a target device identifier of the controlled device; determining a target hologram of the controlled device in a hologram database according to the target device identifier, and displaying the target hologram in the visual field; and when a target virtual button in the target hologram is detected to be selected, controlling the controlled device according to the control flow corresponding to that button. The method realizes virtual control over real equipment and improves the portability of equipment control.

Description

AR (augmented reality) glasses-based equipment control method, device, equipment and medium
Technical Field
The invention relates to the technical field of augmented reality, and in particular to an equipment control method, apparatus, equipment, and medium based on AR glasses.
Background
In the transportation of materials, equipment such as intelligent industrial agents, intelligent forklifts, and intelligent tools is generally used.
In the prior art, these intelligent industrial devices usually implement low-level signal transmission and real-time communication through a computer. Therefore, a computer must be provided whenever a smart industrial device is to be controlled, which results in poor control portability. Moreover, because the intelligent industrial equipment is controlled through this low-level interface, the control process is not intuitive.
Disclosure of Invention
The invention provides a method, an apparatus, equipment, and a medium for controlling equipment based on AR glasses, which realize virtual control over real equipment and improve the portability of equipment control.
According to an aspect of the present invention, there is provided an AR glasses-based device control method, the method including: acquiring controlled equipment in a visual field, performing object recognition on the controlled equipment, and determining a target equipment identifier of the controlled equipment;
determining a target hologram of the controlled equipment in a hologram database according to the target equipment identifier, and displaying the target hologram in a visual field;
and when the target virtual button in the target hologram is detected to be selected, controlling the controlled device according to a control flow corresponding to the target virtual button.
According to another aspect of the present invention, there is provided an AR glasses-based device control apparatus, the apparatus including:
the device identification determining module is used for acquiring controlled devices in a visual field, identifying objects of the controlled devices and determining target device identifications of the controlled devices;
the target hologram display module is used for determining a target hologram of the controlled equipment in a hologram database according to the target equipment identification and displaying the target hologram in a visual field;
and the controlled device control module is used for controlling the controlled device according to a control flow corresponding to the target virtual button when the target virtual button in the target hologram is detected to be selected.
According to another aspect of the present invention, there is provided AR glasses including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor, the computer program being executable by the at least one processor to enable the at least one processor to perform the AR glasses-based device control method according to any one of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer-readable storage medium storing computer instructions for causing a processor to implement the AR glasses-based device control method according to any one of the embodiments of the present invention when executed.
According to the technical scheme of the embodiment of the invention, the target device identifier of the controlled device is determined by acquiring the controlled device in the visual field and performing object recognition on it; the target hologram of the controlled device is determined in the hologram database according to the target device identifier and displayed in the visual field; and when a target virtual button in the target hologram is detected to be selected, the controlled device is controlled according to the control flow corresponding to that button, which solves the problem of virtual control over real equipment and improves the portability of equipment control.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present invention, nor do they necessarily limit the scope of the invention. Other features of the present invention will become apparent from the following description.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1a is a flowchart of a method for controlling an AR glasses-based device according to an embodiment of the present invention;
FIG. 1b is a schematic diagram of a hologram provided in accordance with an embodiment of the present invention;
FIG. 1c is a schematic diagram of a hologram display according to an embodiment of the present invention;
fig. 2 is a flowchart of a device control method based on AR glasses according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of an apparatus control device based on AR glasses according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device implementing the AR glasses-based device control method according to the embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," "object," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example one
Fig. 1a is a flowchart of a method for controlling equipment based on AR glasses according to an embodiment of the present invention. The embodiment is applicable to the case where equipment is controlled through AR glasses, and the method may be performed by an AR glasses-based equipment control apparatus, which may be implemented in hardware and/or software and may be configured in the AR glasses. As shown in fig. 1a, the method comprises:
and 110, acquiring the controlled equipment in the visual field, performing object recognition on the controlled equipment, and determining the target equipment identification of the controlled equipment.
The controlled device can be an intelligent industrial device, for example a handling-type, scanning-type, or functional industrial agent. Concretely, the controlled device can be a binocular photogrammetric tracking Automatic Guided Vehicle (AGV), a laser scanning AGV, an industrial intelligent forklift, a tool and measuring-tool transport vehicle, and the like.
Specifically, the binocular photogrammetry tracking AGV is a device that can acquire images of a photographed object in real time and compute high-precision 3D coordinate values of measuring points from the 2D photographs obtained by photogrammetry. It mainly comprises a camera, a lens, a light source, and the AGV itself. The laser scanning AGV may carry an end effector for laser data acquisition. Marker points containing high-reflectivity glass beads can be arranged on the actuator; the binocular photogrammetry tracking AGV feeds back the actuator's position information by tracking these markers, the actuator obtains the exterior shape data of the measured object through the triangulation laser principle, each piece of point cloud data of the measured object is transformed using the position information fed back by photogrammetric tracking, and the point clouds are spliced into a whole. The target equipment can thus be scanned by the binocular photogrammetry tracking AGV and the laser scanning AGV working together. The industrial intelligent forklift can be responsible for intelligent carrying of large parts and tooling. The tool and measuring-tool transport vehicle can be responsible for intelligent distribution of tools and measuring tools.
Illustratively, an industrial intelligent forklift can be controlled through the AR glasses to realize intelligent carrying of large parts and tooling; the tool and measuring-tool transport vehicle can be controlled through the AR glasses to realize intelligent distribution of tools and measuring tools; and the binocular photogrammetry tracking AGV and the laser scanning AGV can be controlled through the AR glasses to scan large parts. Controlling equipment through the AR glasses frees the user's hands and improves the user's freedom while operating the equipment.
In the embodiment of the present invention, the object recognition of the controlled device by the AR glasses may be implemented by an object recognition algorithm. The specific process may be: scanning the controlled device through the AR glasses, obtaining feature data of the controlled device, comparing the feature data against stored features, and determining which specific device the controlled device is, that is, determining the target device identifier of the controlled device. The target device identifier may be the device name of the controlled device. The embodiment of the invention does not limit the specific object recognition algorithm.
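As an illustrative sketch (not the patent's actual algorithm), the feature-comparison step could look like the following, where the reference library and the feature vectors are hypothetical placeholders for whatever the recognition model extracts:

```python
import math

# Hypothetical reference library mapping device identifiers to feature
# vectors; in practice these would come from the recognition model.
REFERENCE_FEATURES = {
    "industrial_smart_forklift": [0.9, 0.1, 0.3],
    "tool_measuring_transport_vehicle": [0.2, 0.8, 0.5],
    "binocular_photogrammetry_agv": [0.4, 0.4, 0.9],
}

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def identify_device(scanned_features, threshold=0.8):
    """Compare scanned features against the library; return the
    best-matching device identifier, or None if nothing clears the
    threshold (i.e. the device cannot be recognized)."""
    best_id, best_score = None, threshold
    for device_id, ref in REFERENCE_FEATURES.items():
        score = cosine_similarity(scanned_features, ref)
        if score > best_score:
            best_id, best_score = device_id, score
    return best_id
```

The threshold of 0.8 is an arbitrary illustration; a real system would tune it against false-positive rates.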
And step 120, determining a target hologram of the controlled device in the hologram database according to the target device identification, and displaying the target hologram in the visual field.
The hologram database may be a pre-generated database containing the holograms of the respective devices, each stored under its device identifier. In particular, the hologram database may be stored in the cloud, and the AR glasses can access the cloud data through the network to obtain the target hologram. FIG. 1b is a schematic diagram of a hologram according to an embodiment of the present invention. As shown in fig. 1b, the hologram may be a control panel of the device. For example, the hologram may include the following information as virtual buttons: device name, communication status, industrial agent model, device status/operational status, device parameters, control button zone, and the like.
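A minimal sketch of the identifier-keyed lookup described above, with an in-memory dictionary standing in for the cloud-hosted database (all field names here are illustrative, not the patent's schema):

```python
# Stand-in for the cloud hologram database: device identifiers map to
# hologram specifications listing the virtual buttons named above.
HOLOGRAM_DATABASE = {
    "industrial_smart_forklift": {
        "device_name": "Industrial smart forklift",
        "virtual_buttons": [
            "communication_status", "agent_model", "device_status",
            "device_parameters", "control_buttons",
        ],
    },
}

def fetch_target_hologram(target_device_id):
    """Look up the hologram for a recognized device; in the real system
    this would be a network request to the cloud database."""
    hologram = HOLOGRAM_DATABASE.get(target_device_id)
    if hologram is None:
        raise KeyError(f"no hologram stored for device {target_device_id!r}")
    return hologram
```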
Fig. 1c is a schematic diagram of a hologram display according to an embodiment of the present invention. As shown in fig. 1c, the controlled device may be displayed in the visual field of the AR glasses, with holograms displayed on both sides of it. Specifically, the hologram may consist of two parts: the first part may be displayed on the left side of the controlled device, and the other part on its right. The content of each part can be set according to actual requirements, which is not limited in the embodiment of the present invention.
In an optional implementation of the embodiment of the present invention, displaying the target hologram within the field of view comprises: acquiring real-time position information of the controlled device, and positioning the target hologram on both sides of the controlled device in the field of view according to the real-time position information.
The real-time position information of the controlled device may be the current position information fed back to the AR glasses by the controlled device in real time; alternatively, it may be position information predicted by the AR glasses in real time from the received current position information. Positioning the hologram on both sides of the controlled device can be achieved by a Simultaneous Localization and Mapping (SLAM) algorithm, which anchors the corresponding hologram in real space.
The core process of the SLAM positioning algorithm mainly comprises three steps: preprocessing, matching, and map fusion. In preprocessing, environmental information of the location is acquired through a lidar or other sensors, and the raw lidar data is then cleaned, eliminating problematic points or filtering noise. Matching is a critical step: the point cloud of the current local environment is aligned to a corresponding position on the established map, and the quality of this match directly affects the accuracy of the map SLAM constructs. During SLAM positioning, the point cloud currently acquired by the lidar is matched and spliced into the original map. Map fusion splices the new lidar data into the original map, finally completing the map update.
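The first two of these steps can be illustrated with a toy sketch: a range filter for the preprocessing stage and a crude nearest-neighbor score for the matching stage. Both the thresholds and the metric are assumptions for illustration; real SLAM pipelines use far more sophisticated scan matching (e.g. ICP).

```python
import math

def preprocess_scan(ranges, min_range=0.1, max_range=30.0):
    """Drop lidar returns outside the sensor's valid range window --
    the 'eliminate problematic data or filter' step. Thresholds are
    illustrative."""
    return [r for r in ranges if min_range <= r <= max_range]

def match_score(scan_points, map_points):
    """Crude matching metric: mean nearest-neighbor distance from each
    scan point to the existing map. Lower means better alignment."""
    def nearest(p):
        return min(math.dist(p, m) for m in map_points)
    return sum(nearest(p) for p in scan_points) / len(scan_points)
```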
By means of the SLAM positioning algorithm, the target holograms corresponding to the controlled device can be continuously positioned on both sides of the controlled device in the field of view, with the effect shown in fig. 1c. When the controlled device seen in the AR glasses changes, the corresponding hologram changes synchronously: when the industrial intelligent forklift is visible in the AR glasses, its hologram is displayed; when the tool and measuring-tool transport vehicle is visible, its hologram is displayed. Moreover, when the wearer looks away and then back at the industrial intelligent forklift, it does not need to be recognized again: based on the SLAM positioning, the hologram corresponding to the forklift is displayed directly at its anchored position.
And step 130, when the target virtual button in the target hologram is detected to be selected, controlling the controlled device according to the control flow corresponding to the target virtual button.
Communication can be established between the AR glasses and the controlled device, and the AR glasses may hold the highest control authority over the controlled device. The AR glasses may establish communication with the controlled device using Unity network-related development techniques based on a communication protocol, such as the HTTP protocol. Specifically, the calibration of the GET and POST instruction addresses can be completed in the back-end code through access to, and communication with, the IP address of the controlled device. Communication between the AR glasses and the controlled device is then realized according to header-file content in a uniform format, such as a JSON file.
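As a sketch of the uniform-format JSON exchange described above, the following builds the request a client could POST to the controlled device. The IP address, URL path, and field names are all hypothetical; actually sending the request would use an HTTP client against the device's real endpoint.

```python
import json

DEVICE_IP = "192.168.1.50"  # hypothetical controlled-device address

def build_control_request(command, parameters):
    """Assemble the POST request body for a control instruction; the
    schema mirrors the idea of a uniform-format JSON file but is an
    assumption, not the patent's actual format."""
    payload = {"command": command, "parameters": parameters}
    return {
        "url": f"http://{DEVICE_IP}/control",   # hypothetical endpoint
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(payload),
    }
```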
Based on the communication mechanism between the AR glasses and the controlled device, developers can complete the setting and writing of control instructions for the intelligent industrial equipment in background code scripts, that is, implement the control flow; edit the virtual holographic interface, that is, lay out the virtual buttons; and mount virtual-real interaction scripts on the related controls. Finally, event connections are established between the background communication and control code and the interactive virtual hologram controls of the front end.
In practical application, an operator can intuitively check the related information of the controlled device through the hologram and, based on the communication established between the AR glasses and the controlled device, issue instructions to the controlled device through the virtual buttons in the hologram so as to control it.
Specifically, the interaction of the AR glasses with the hologram can be realized through a ray detection algorithm, so that the controlled device is controlled through information query on, or interactive control of, the hologram. The AR glasses use the ray detection algorithm to track the movement of the operator's head or eyes and detect whether this movement touches a virtual button of the hologram. When it is detected that the operator has touched the target virtual button in the hologram, the AR glasses execute the control flow corresponding to that button.
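A minimal geometric sketch of such ray detection, assuming (for illustration only) that each virtual button is approximated by a bounding sphere and the gaze is a ray from the head pose; engine-level raycasts such as Unity's work on actual colliders:

```python
import math

def ray_hits_button(ray_origin, ray_dir, button_center, button_radius):
    """Gaze-ray vs. button test: does the ray from the head pose pass
    within button_radius of the button's center?"""
    # Vector from the ray origin to the button center.
    oc = [c - o for c, o in zip(button_center, ray_origin)]
    # Normalize the ray direction.
    norm = math.sqrt(sum(d * d for d in ray_dir))
    d = [x / norm for x in ray_dir]
    # Project the center onto the ray.
    t = sum(a * b for a, b in zip(oc, d))
    if t < 0:
        return False  # button lies behind the viewer
    closest = [o + t * x for o, x in zip(ray_origin, d)]
    return math.dist(closest, button_center) <= button_radius
```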
Specifically, in an optional implementation of the embodiment of the present invention, when it is detected that a target virtual button in the target hologram is selected, controlling the controlled device according to the control flow corresponding to that button includes: when it is detected that the virtual button in the target hologram for carrying the tooling to a target detection point is selected, generating a device movement instruction according to the real-time position information and the target position information of the target detection point; and sending the device movement instruction to the controlled device, so that the controlled device moves to the target detection point according to the instruction.
An important function of the controlled device is carrying tooling: the controlled device can carry the tooling to a target detection point so as to realize high-precision detection, in a specific environment, of the large parts held in the tooling. The AR glasses can determine a movement path from the real-time position information of the controlled device and the target position information of the detection point, and generate a corresponding device movement instruction from that path to control the controlled device.
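A sketch of generating such a movement instruction from the two positions, under the simplifying assumption of a straight-line path in the plane (a real AGV would route around obstacles); the instruction fields and the default speed are illustrative:

```python
import math

def build_move_instruction(current_pos, target_pos, speed=0.5):
    """Derive a straight-line move instruction (heading, distance,
    estimated travel time) from the device's real-time position and
    the target detection point."""
    dx = target_pos[0] - current_pos[0]
    dy = target_pos[1] - current_pos[1]
    distance = math.hypot(dx, dy)
    heading = math.degrees(math.atan2(dy, dx))
    return {
        "target": target_pos,
        "heading_deg": round(heading, 2),
        "distance_m": round(distance, 3),
        "eta_s": round(distance / speed, 1),
    }
```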
In an optional implementation of the embodiment of the present invention, controlling the controlled device according to the control flow corresponding to the target virtual button includes: controlling the controlled device according to that control flow over a 5G CPE network.
The AR glasses can communicate with the controlled device over a 5G network environment. Specifically, the AR glasses may use 5G customer premises equipment (CPE) to communicate with the controlled device. The 5G CPE performs secondary relaying (strengthening) of wireless signals (such as Wi-Fi), extending their coverage. The 5G CPE network improves network stability, ensures normal operation of the controlled equipment, reduces latency, and keeps equipment control in a stable state.
According to the technical scheme of this embodiment, the target device identifier of the controlled device is determined by acquiring the controlled device in the visual field and performing object recognition on it; the target hologram of the controlled device is determined in the hologram database according to the target device identifier and displayed in the visual field; and when a target virtual button in the target hologram is detected to be selected, the controlled device is controlled according to the control flow corresponding to that button. This solves the problem of virtual control over real equipment: checking device information through the AR glasses frees the operator's hands and improves the portability of equipment control; controlling equipment through an intuitive hologram instead of abstract code lowers the technical-skill requirement on operators and the threshold of equipment control, avoids lengthy personnel training, and reduces cost; and intervening in the device control process through the holograms of the AR glasses makes control intuitive, improving work efficiency and convenience of operation.
Example two
Fig. 2 is a flowchart of an AR glasses-based device control method according to a second embodiment of the present invention, which is a further refinement of the above technical solutions, and the technical solutions in this embodiment may be combined with various alternatives in one or more of the above embodiments. As shown in fig. 2, the method includes:
step 210, obtaining a controlled device in the visual field, performing object recognition on the controlled device, and determining a target device identifier of the controlled device.
In an optional implementation manner of the embodiment of the present invention, before acquiring a controlled device in a field of view, performing object recognition on the controlled device, and determining a device identifier of the controlled device, the method further includes: carrying out object recognition on at least one device to be recognized, and determining a device identifier of the device to be recognized; acquiring a control application matched with the equipment to be identified, and generating a hologram of the equipment to be identified according to the control application; and storing the hologram according to the equipment identifier of the equipment to be identified to generate a hologram database.
The AR glasses can perform object recognition on the device to be recognized through an object recognition algorithm, extract the object features of the device, and store them; specifically, they may be stored under the device identifier of the device to be recognized. Different devices to be recognized may have different control applications, such as checking the device name, checking the communication state, checking the industrial agent model, checking the device state/operation state, checking device parameters, moving the device, and the like. Corresponding virtual buttons can be generated for the different control applications and shown in the hologram. By mounting a virtual-real interaction script on a virtual button, the corresponding control application is realized.
And step 220, determining a target hologram of the controlled device in the hologram database according to the target device identification.
And step 230, acquiring real-time position information of the controlled device, and positioning the target holograms on both sides of the controlled device in the field of view according to the real-time position information.
In an optional implementation manner of the embodiment of the present invention, the obtaining of the real-time location information of the controlled device includes: acquiring current position information, moving speed and timestamp information corresponding to the current position information of controlled equipment; and predicting the real-time position information of the controlled equipment according to the current position information, the timestamp information and the moving speed.
The JSON file through which the controlled device interacts with the AR glasses may include the controlled device's current position information, moving speed, and the timestamp corresponding to the current position information. The current position information may include the device's coordinates, attitude, and so on; the moving speed may include the angular velocity and linear velocity of the device. The AR glasses can determine the file-transmission time difference from the current time and the timestamp, determine from this time difference and the moving speed how far the device moved during transmission, compensate the current position information accordingly, and thereby predict the real-time position of the controlled device. This improves the accuracy of the controlled device's real-time position information, avoids the inaccuracy caused by communication delay, and improves the reliability of control.
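The compensation step above reduces to simple dead reckoning: advance the reported position by the last known velocity over the transmission delay. A sketch under the assumption of constant velocity during the delay:

```python
def predict_position(reported_pos, velocity, report_timestamp, now):
    """Compensate the reported position for transmission delay: assume
    the device kept its last known (linear) velocity during the
    file-transmission time difference."""
    dt = now - report_timestamp  # transmission delay in seconds
    return tuple(p + v * dt for p, v in zip(reported_pos, velocity))
```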
In an optional implementation of the embodiment of the present invention, positioning the target holograms on both sides of the controlled device in the field of view according to the real-time position information includes: performing coordinate conversion on the real-time position information to generate the controlled device's coordinates in the AR glasses coordinate system; and displaying the controlled device in the field of view according to those coordinates, positioning the target holograms on both sides of it.
The coordinates in the controlled device's position information and the coordinates of the AR glasses belong to different coordinate systems, with different origins and orientations. Therefore, coordinate conversion is used to convert the real-time position information of the controlled device into the AR glasses coordinate system so that the device can be displayed accurately. Specifically, the conversion can be realized mathematically, for example with an RT matrix: a transform containing both rotation and translation. R is the rotation matrix, i.e. the Euler angles expanded into matrix form, obtained from products and sums of the sines and cosines of the angles; T is the translation vector.
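The RT transform can be sketched for the planar case (a single yaw angle plus a 2D translation) to keep the arithmetic visible; the full version simply uses 3x3 rotation matrices built from all three Euler angles:

```python
import math

def device_to_glasses(point, yaw_rad, translation):
    """Apply the RT transform: rotate the device-frame point by the yaw
    angle (R, built from sin/cos of the Euler angle), then translate by
    T into the AR-glasses coordinate system."""
    x, y = point
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    xr = c * x - s * y  # first row of R
    yr = s * x + c * y  # second row of R
    return (xr + translation[0], yr + translation[1])
```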
And step 240, when it is detected that the virtual button corresponding to a target detection point of the carrying tool in the target hologram is selected, generating a device moving instruction according to the real-time position information and the target position information of the target detection point.
And step 250, sending the device moving instruction to the controlled device so that the controlled device moves to the target detection point according to the device moving instruction.
In an optional implementation manner of the embodiment of the present invention, sending the device moving instruction to the controlled device so that the controlled device moves to the target detection point according to the device moving instruction includes: sending the device moving instruction to the controlled device based on the 5G CPE (customer premises equipment) network, so that the controlled device moves to the target detection point according to the device moving instruction.
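The moving instruction itself can be a small JSON payload carried over any reliable transport on the 5G CPE link; the schema below is an illustrative assumption, not from the patent.

```python
import json

def build_move_instruction(real_time_pos, target_pos):
    """Compose a device-moving instruction from the device's real-time
    position and the target detection point (hypothetical schema)."""
    return json.dumps({
        "command": "move_to",
        "from": {"x": real_time_pos[0], "y": real_time_pos[1]},
        "to": {"x": target_pos[0], "y": target_pos[1]},
    })
```

On selection of the virtual button, the glasses would send this payload to the controlled device, e.g. over a TCP socket or HTTP request routed through the 5G CPE.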
According to the technical scheme of this embodiment, the controlled device in the visual field is acquired, object recognition is performed on it, and its target device identifier is determined; the target hologram of the controlled device is determined in the hologram database according to the target device identifier; real-time position information of the controlled device is acquired, and the target hologram is positioned at two sides of the controlled device in the visual field according to the real-time position information; when it is detected that the virtual button corresponding to a target detection point of the carrying tool in the target hologram is selected, a device moving instruction is generated according to the real-time position information and the target position information of the target detection point; and the device moving instruction is sent to the controlled device so that the controlled device moves to the target detection point. This solves the problem of intuitive device control: viewing device information through the AR glasses frees both hands and makes device control more portable; replacing abstract code with an intuitive hologram lowers the skill requirements on operators and the threshold of device control, avoiding lengthy personnel training and reducing cost; and intervening in the control process through the hologram makes control visual, improving working efficiency and convenience of operation.
According to the technical scheme of the embodiment of the invention, the AR glasses perform object recognition and generate the corresponding hologram, and once communication between the AR glasses and the industrial intelligent device is established, an operator can issue instructions to the relevant device by clicking a virtual button in the glasses. With the AR glasses-based device control method provided by the invention, an operator can intuitively obtain various kinds of device information from the hologram, which improves working efficiency; the industrial intelligent device can be controlled directly through the AR glasses without carrying a computer or other mobile device; and the operator only needs to wear the portable AR glasses to view the relevant information of the intelligent industrial equipment, which greatly increases the freedom of both hands during work. Meanwhile, the method avoids the situation in which traditional AR glasses require the user to install a control application (app) on a mobile phone or other mobile terminal and send instructions through the app to control the industrial intelligent device, so that a mobile terminal is needed between the AR glasses and the device and operators must constantly use it to send instructions during work, reducing working efficiency.
EXAMPLE III
Fig. 3 is a schematic structural diagram of an AR glasses-based device control apparatus according to a third embodiment of the present invention. As shown in fig. 3, the apparatus includes: a device identifier determining module 310, a target hologram display module 320, and a controlled device control module 330. Wherein:
the device identifier determining module 310 is configured to acquire a controlled device in a visual field, perform object recognition on the controlled device, and determine a target device identifier of the controlled device;
the target hologram display module 320 is configured to determine a target hologram of the controlled device in the hologram database according to the target device identifier, and display the target hologram in a visual field;
and the controlled device control module 330 is configured to, when it is detected that the target virtual button in the target hologram is selected, control the controlled device according to a control flow corresponding to the target virtual button.
Optionally, the apparatus further includes:
the device identifier determining module is used for performing object recognition on at least one device to be identified and determining the device identifier of the device to be identified, before the controlled device in the visual field is acquired, object recognition is performed on the controlled device, and the target device identifier of the controlled device is determined;
the hologram generating module is used for acquiring a control application matched with the equipment to be identified and generating a hologram of the equipment to be identified according to the control application;
and the hologram database generation module is used for storing the hologram according to the equipment identifier of the equipment to be identified to generate a hologram database.
Optionally, the target hologram display module 320 includes:
and the target hologram display unit is used for acquiring the real-time position information of the controlled equipment and positioning the target holograms at two sides of the visual field of the controlled equipment according to the real-time position information.
Optionally, the target hologram display unit includes:
the information acquisition subunit is used for acquiring the current position information, the moving speed and timestamp information corresponding to the current position information of the controlled equipment;
and the real-time position information predicting subunit is used for predicting the real-time position information of the controlled device according to the current position information, the timestamp information and the moving speed.
Optionally, the target hologram display unit includes:
the VR glasses coordinate generating subunit is used for carrying out coordinate conversion on the real-time position information to generate VR glasses coordinates of the controlled equipment;
and the target hologram display subunit is used for displaying the controlled equipment in the visual field according to the VR glasses coordinates and positioning the target holograms at two sides of the visual field of the controlled equipment.
Optionally, the controlled device control module 330 includes:
the equipment moving instruction generating unit is used for generating an equipment moving instruction according to the real-time position information and the target position information of the target detection point when the virtual button corresponding to the target detection point from the carrying tool in the target hologram is detected to be selected;
and the device moving instruction sending unit is used for sending the device moving instruction to the controlled device so that the controlled device moves to the target detection point according to the device moving instruction.
Optionally, the controlled device control module 330 includes:
and the controlled device control unit is used for controlling the controlled device according to the control flow corresponding to the target virtual button based on the 5G CPE network.
The device control device based on the AR glasses provided by the embodiment of the invention can execute the device control method based on the AR glasses provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
Example four
FIG. 4 shows a schematic block diagram of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 4, the electronic device 10 includes at least one processor 11, and a memory communicatively connected to the at least one processor 11, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, and the like, wherein the memory stores a computer program executable by the at least one processor, and the processor 11 can perform various suitable actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from a storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data necessary for the operation of the electronic apparatus 10 can also be stored. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
A number of components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, or the like; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
Processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, or the like. The processor 11 performs the various methods and processes described above, such as the AR glasses-based device control method.
In some embodiments, the AR glasses-based device control method may be implemented as a computer program that is tangibly embodied on a computer-readable storage medium, such as storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into the RAM 13 and executed by the processor 11, one or more steps of the AR glasses-based device control method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured by any other suitable means (e.g., by means of firmware) to perform the AR glasses-based device control method.
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems On Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: being implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for implementing the methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be performed. A computer program can execute entirely on a machine, partly on a machine, as a stand-alone software package partly on a machine and partly on a remote machine or entirely on a remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, which is a host product in a cloud computing service system and overcomes the defects of high management difficulty and weak service expansibility in traditional physical hosts and VPS (virtual private server) services.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present invention may be executed in parallel, sequentially, or in different orders, and are not limited herein as long as the desired results of the technical solution of the present invention can be achieved.
The above-described embodiments should not be construed as limiting the scope of the invention. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. An AR glasses-based device control method, comprising:
acquiring controlled equipment in a visual field, performing object recognition on the controlled equipment, and determining a target equipment identifier of the controlled equipment;
determining a target hologram of the controlled device in a hologram database according to the target device identification, and displaying the target hologram in a visual field;
and when the target virtual button in the target hologram is detected to be selected, controlling the controlled equipment according to the control flow corresponding to the target virtual button.
2. The method of claim 1, wherein before acquiring the controlled device in the field of view, performing object recognition on the controlled device, and determining the device identifier of the controlled device, the method further comprises:
carrying out object recognition on at least one device to be recognized, and determining a device identifier of the device to be recognized;
acquiring a control application matched with the equipment to be identified, and generating a hologram of the equipment to be identified according to the control application;
and storing the hologram according to the equipment identifier of the equipment to be identified to generate a hologram database.
3. The method of claim 1, wherein displaying the target hologram within a field of view comprises:
and acquiring real-time position information of the controlled equipment, and positioning the target hologram at two sides of the visual field of the controlled equipment according to the real-time position information.
4. The method of claim 3, wherein obtaining real-time location information of the controlled device comprises:
acquiring current position information, moving speed and timestamp information corresponding to the current position information of the controlled equipment;
and predicting real-time position information of the controlled equipment according to the current position information, the timestamp information and the moving speed.
5. The method of claim 3, wherein positioning the target hologram on both sides of a field of view of the controlled device according to the real-time location information comprises:
performing coordinate conversion on the real-time position information to generate VR (virtual reality) glasses coordinates of the controlled equipment;
and displaying the controlled equipment in a visual field according to the VR glasses coordinates, and positioning the target holograms at two sides of the visual field of the controlled equipment.
6. The method according to claim 3, wherein when it is detected that a target virtual button in the target hologram is selected, controlling the controlled device according to a control flow corresponding to the target virtual button comprises:
when it is detected that the virtual button corresponding to a target detection point of the carrying tool in the target hologram is selected, generating a device moving instruction according to the real-time position information and the target position information of the target detection point;
and sending the device moving instruction to the controlled device so that the controlled device moves to a target detection point according to the device moving instruction.
7. The method according to claim 1, wherein controlling the controlled device according to the control flow corresponding to the target virtual button comprises:
and controlling the controlled equipment according to the control flow corresponding to the target virtual button based on a 5G CPE network.
8. An apparatus for controlling an apparatus based on AR glasses, comprising:
the device identification determining module is used for acquiring controlled devices in a visual field, identifying objects of the controlled devices and determining target device identifications of the controlled devices;
the target hologram display module is used for determining a target hologram of the controlled equipment in a hologram database according to the target equipment identification and displaying the target hologram in a visual field;
and the controlled device control module is used for controlling the controlled device according to a control flow corresponding to the target virtual button when the target virtual button in the target hologram is detected to be selected.
9. AR glasses, characterized in that the AR glasses comprise:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor, the computer program being executable by the at least one processor to enable the at least one processor to perform the AR glasses-based device control method of any one of claims 1-7.
10. A computer-readable storage medium storing computer instructions for causing a processor to implement the AR glasses-based device control method of any one of claims 1 to 7 when executed.
CN202211194089.0A 2022-09-28 2022-09-28 AR (augmented reality) -glasses-based equipment control method, device, equipment and medium Pending CN115494776A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211194089.0A CN115494776A (en) 2022-09-28 2022-09-28 AR (augmented reality) -glasses-based equipment control method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN115494776A true CN115494776A (en) 2022-12-20

Family

ID=84471570

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211194089.0A Pending CN115494776A (en) 2022-09-28 2022-09-28 AR (augmented reality) -glasses-based equipment control method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN115494776A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination