CN108536220B - Wearable device and maintenance processing method - Google Patents


Info

Publication number
CN108536220B
Authority
CN
China
Prior art keywords
scene image
maintenance
image
information
chip
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710121049.6A
Other languages
Chinese (zh)
Other versions
CN108536220A (en)
Inventor
陈竞
李劲松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Gede Sike Technology Co ltd
Original Assignee
Beijing Gede Sike Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Gede Sike Technology Co ltd
Priority to CN201710121049.6A
Publication of CN108536220A
Application granted
Publication of CN108536220B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/163 Wearable computers, e.g. on a belt
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Abstract

Embodiments of the invention provide a wearable device and a maintenance processing method. The wearable device includes a frame body and lenses; an image recognition chip, a positioning chip, a CPU and a memory are arranged inside the frame body, and a camera device is arranged on the frame body between the lenses. The camera device acquires a first scene image of the equipment to be maintained; the positioning chip acquires positioning information of the wearable device; if the CPU determines that the first scene image contains a component listed in a maintenance procedure stored in the memory, it matches a target first scene image according to the first scene image and the positioning information; the image recognition chip recognizes the target first scene image and displays, in a second scene image, the result of comparing the recognition result with a standard image sample pre-stored in the memory; and the lenses display the second scene image. The maintenance processing method uses this device. The wearable device and maintenance processing method provided by the embodiments of the invention improve the efficiency of the overall equipment maintenance workflow.

Description

Wearable device and maintenance processing method
Technical Field
Embodiments of the present invention relate to the technical field of intelligent devices, and in particular to a wearable device and a maintenance processing method.
Background
Large-scale equipment must be maintained and repaired regularly so that it operates safely. Technical means are therefore required to ensure that maintenance work is carried out in full, to detect omissions by maintenance personnel, and to allow causes and responsibility to be traced when an accident occurs.
In existing approaches, maintenance personnel carry out the maintenance procedure with the aid of a handheld terminal. In the course of realizing the invention, the inventors found that the prior art has at least the following problems. (1) Maintenance work on large equipment such as aircraft and tanks requires both hands, yet at least one hand is occupied by the handheld terminal; this conflict is especially acute in cramped spaces, and when tools must be picked up there is often nowhere to put the terminal down, which makes the work very troublesome. (2) Maintenance work involves numerous maintenance items, and maintenance personnel must check every item one by one to determine which components of the large equipment need maintenance and which do not.
Therefore, how to provide a device that does not need to be held in the hand and that can also automatically check the equipment to be maintained, thereby improving the efficiency of the overall maintenance workflow, is a problem that urgently needs to be solved.
Disclosure of Invention
To address the problems in the prior art, embodiments of the invention provide a wearable device and a maintenance processing method.
In one aspect, an embodiment of the present invention provides a wearable device, including: a frame body and lenses;
an image recognition chip, a positioning chip, a CPU and a memory are arranged inside the frame body;
a camera device is arranged on the frame body between the lenses;
the camera device is used for acquiring a first scene image of the equipment to be maintained;
the positioning chip is used for acquiring positioning information of the wearable device;
the CPU is used for matching a target first scene image according to the first scene image and the positioning information if it determines that the first scene image contains a component listed in a maintenance procedure stored in the memory;
the image recognition chip is used for recognizing the target first scene image and displaying, in a second scene image, the result of comparing the recognition result with a standard image sample pre-stored in the memory, where the second scene image is superimposed on the target first scene image, the comparison result includes first labeling information for components of the equipment to be maintained that do not require manual inspection, and the first labeling information identifies which components need maintenance and which do not;
the lenses are used for displaying the second scene image.
In another aspect, an embodiment of the present invention provides a maintenance processing method, including:
acquiring a first scene image of the equipment to be maintained;
acquiring positioning information of the wearable device;
if it is determined that the first scene image contains a component listed in the maintenance procedure stored in the memory, matching a target first scene image according to the first scene image and the positioning information;
recognizing the target first scene image, and displaying, in a second scene image, the result of comparing the recognition result with a standard image sample pre-stored in the memory, where the second scene image is superimposed on the target first scene image, the comparison result includes first labeling information for components of the equipment to be maintained that do not require manual inspection, and the first labeling information identifies which components need maintenance and which do not;
and displaying the second scene image. These steps are sketched in code below.
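Read together, these steps form a single processing pipeline. The following Python sketch illustrates that pipeline under simplifying assumptions; every name in it (SceneImage, match_target_scene, the deviation table standing in for the image comparison) is a hypothetical illustration rather than anything defined by the patent, and a real device would run the comparison on its image recognition chip instead of a lookup table.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class SceneImage:
    """A captured scene plus the components detected in it (hypothetical structure)."""
    location: str                                   # where the image was taken, e.g. "cockpit console"
    detected_components: List[str] = field(default_factory=list)


def match_target_scene(first_scene: SceneImage, position: str,
                       procedure: List[str]) -> SceneImage:
    """Keep only the components at the current position that appear in the
    maintenance procedure (the 'target first scene image')."""
    relevant = [c for c in first_scene.detected_components if c in procedure]
    return SceneImage(location=position, detected_components=relevant)


def compare_with_standard(target: SceneImage,
                          deviation_table: Dict[str, bool]) -> Dict[str, bool]:
    """Map each recognised component to True if it deviates from its pre-stored
    standard image sample (mocked here as a lookup table)."""
    return {c: deviation_table.get(c, False) for c in target.detected_components}


def maintenance_pipeline(first_scene: SceneImage, position: str,
                         procedure: List[str],
                         deviation_table: Dict[str, bool]) -> Optional[Dict[str, bool]]:
    # Only proceed if the first scene contains a component from the procedure.
    if not any(c in procedure for c in first_scene.detected_components):
        return None
    target = match_target_scene(first_scene, position, procedure)
    # The comparison result becomes the first labeling information.
    return compare_with_standard(target, deviation_table)


if __name__ == "__main__":
    scene = SceneImage("cockpit", ["limit switch", "screw", "fuel tank"])
    labels = maintenance_pipeline(scene, "cockpit console",
                                  procedure=["limit switch", "screw"],
                                  deviation_table={"screw": True})
    print(labels)   # {'limit switch': False, 'screw': True}
```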
The wearable device and maintenance processing method provided by the embodiments of the invention require no handheld device and can automatically check the equipment to be maintained, thereby improving the efficiency of the overall maintenance workflow.
Drawings
To illustrate the technical solutions of the embodiments of the present invention or of the prior art more clearly, the drawings used in the description of the embodiments or of the prior art are briefly introduced below. The drawings described below clearly show only some embodiments of the present invention; other drawings can be obtained from them by a person skilled in the art without inventive effort.
Fig. 1 is a schematic structural diagram of a wearable device according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a method for performing maintenance processing according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the embodiments are described below clearly and completely with reference to the accompanying drawings. The described embodiments are obviously only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on these embodiments without inventive effort fall within the scope of the present invention.
Fig. 1 is a schematic structural diagram of a wearable device according to an embodiment of the present invention. As shown in Fig. 1, the wearable device includes: a frame body 1 and lenses 2;
an image recognition chip 3, a positioning chip 4, a CPU 6 and a memory 5 are arranged inside the frame body 1;
a camera device 7 is arranged on the frame body 1 between the lenses 2;
the camera device 7 is used for acquiring a first scene image of the equipment to be maintained;
the positioning chip 4 is used for acquiring positioning information of the wearable device;
the CPU 6 is configured to match a target first scene image according to the first scene image and the positioning information if it determines that the first scene image contains a component listed in the maintenance procedure stored in the memory 5;
the image recognition chip 3 is configured to recognize the target first scene image and display, in a second scene image, the result of comparing the recognition result with a standard image sample pre-stored in the memory 5, where the second scene image is superimposed on the target first scene image, the comparison result includes first labeling information for components of the equipment to be maintained that do not require manual inspection, and the first labeling information identifies which components need maintenance and which do not;
the lens 2 is used for displaying the second scene image. The frame body 1 may further be provided with a bluetooth chip, which is used for communication connection with peripheral input devices (such as a keyboard and a mouse), and the frame body 1 above the lens 2 may be provided with a larger accommodating space so as to sufficiently accommodate the built-in CPU6 and the memory 5. The camera device 7 can be arranged between the two lenses 2 on the frame main body 1, can be a binocular camera, is convenient for shooting a more complex scene, can be understood as geographic position information of the wearable equipment, can be longitude and latitude, or geographic coordinates and the like, and can further accurately determine the current position of the maintenance personnel. For a target first scene image, the following is illustrated: when the camera device 7 of the wearable device acquires all scenes (corresponding to the first scene image) in the cockpit of the aircraft, when the current position of the wearable device is acquired by the positioning chip 4 of the wearable device as an operation console in the cockpit, if the operation console in the cockpit has components in the maintenance procedure, the matched target first scene image is: an operating table in the cabin. The second scene image can be understood as a second annotation information which is generated by identifying the target first scene image through the image identification chip 3 and comparing the target first scene image with a pre-stored standard image sample in a memory, and the second annotation information can be displayed in the second scene image, in the embodiment of the invention, the second scene image is displayed on the lens 2 by adopting an augmented reality technology, the image seen by the maintenance personnel is a visual effect of overlapping the second scene image on the first scene image, and the first annotation information can be understood as: labeling information of maintenance parts which do not need manual inspection. For example: the maintenance components that do not require manual inspection may be: whether the position of the limit switch is accurate, whether screws, catches and the like are fastened or not, and the like, the maintenance part is easy to determine through recognition of the target first scene image. The second annotation information can be understood as: labeling information of the maintenance parts needing manual inspection. The maintenance components that need to be checked manually may be: whether the oil tank leaks, whether the cable has broken wires or not, and the like, the maintenance component is not easy to determine through identifying the target first scene image, and further checking and confirmation are required by maintenance personnel to go to the site. The specific labeling mode of the first labeling information can be as follows: the components which do not need maintenance can be marked with green marks, the components which do need maintenance can be marked with red marks, so that maintenance personnel can conveniently identify the components, and maintenance work of the components which need maintenance can be completed quickly.
The wearable device provided by the embodiment of the invention does not need to be held in the hand and can automatically check the equipment to be maintained, thereby improving the efficiency of the overall maintenance workflow.
On the basis of the above embodiment, the second scene image is superimposed on the target first scene image using augmented reality technology, although the invention is not specifically limited to this.
The wearable device provided by the embodiment of the invention uses augmented reality technology to ensure that the second scene image is displayed effectively.
On the basis of the above embodiment, the image recognition chip 3 is a GPU or an FPGA, although the invention is not specifically limited to this.
In the wearable device provided by the embodiment of the invention, choosing a GPU or an FPGA as the image recognition chip ensures accurate image recognition.
On the basis of the above embodiment, the positioning chip 4 is a radio-frequency chip, although the invention is not specifically limited to this.
In the wearable device provided by the embodiment of the invention, choosing a radio-frequency chip as the positioning chip ensures that the wearable device is positioned accurately.
On the basis of the above embodiment, the frame body 1 further contains a chip for image optimization, which includes an accelerometer and/or a gyroscope and/or a magnetometer. The accelerometer, gyroscope and magnetometer can be used independently or in combination and effectively alleviate image stutter and blur.
In the wearable device provided by the embodiment of the invention, the image-optimization chip built into the frame body further improves the display quality of the image.
On the basis of the above embodiment, a function key is further arranged externally at the end of the frame body 1 and is used for selecting second labeling information in the second scene image;
the second labeling information is prompt information for components of the equipment to be maintained that must be inspected manually to determine whether they need maintenance, and the function keys include a direction selection key and a confirmation key for confirming a selection. It should be noted that the second labeling information and the first labeling information can be displayed in the second scene image at the same time; to avoid interference between them, the second labeling information can be displayed as a table in a corner of the second scene image. A microphone and/or a camera device 7 may be arranged at the end of the frame body 1 to confirm completion of work on a maintenance component by detecting the maintenance personnel's voice and/or gestures, respectively. The confirmation key is integrated with the switching power supply: when a long press of the confirmation key is detected, the wearable device is switched on or off.
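The interaction just described (direction keys to move through the second labeling information, a confirm key, voice or gestures to acknowledge an item, and a long press of the confirm key to switch the device on or off) can be pictured as a small event loop. The sketch below only illustrates that control flow; the event names, the long-press threshold and the checklist structure are assumptions, not part of the patent.

```python
LONG_PRESS_SECONDS = 2.0   # assumed threshold for switching the device on/off


class ManualCheckList:
    """The second labeling information: components that must be checked manually."""
    def __init__(self, items):
        self.items = list(items)
        self.cursor = 0
        self.confirmed = set()

    def move(self, step):                     # direction selection key
        self.cursor = (self.cursor + step) % len(self.items)

    def confirm_current(self):                # confirm key, voice, or gesture
        self.confirmed.add(self.items[self.cursor])

    def done(self):
        return self.confirmed == set(self.items)


def handle_event(checklist: ManualCheckList, event: dict, powered: bool) -> bool:
    """Dispatch one input event; returns the (possibly toggled) power state."""
    if event["key"] == "confirm" and event.get("held_seconds", 0.0) >= LONG_PRESS_SECONDS:
        return not powered                    # long press toggles the wearable device
    if not powered:
        return powered
    if event["key"] == "up":
        checklist.move(-1)
    elif event["key"] == "down":
        checklist.move(+1)
    elif event["key"] in ("confirm", "voice_ok", "gesture_ok"):
        checklist.confirm_current()           # maintenance of this item is complete
    return powered


if __name__ == "__main__":
    todo = ManualCheckList(["fuel tank leak check", "cable break check"])
    powered = True
    for ev in [{"key": "confirm"}, {"key": "down"}, {"key": "voice_ok"}]:
        powered = handle_event(todo, ev, powered)
    print(todo.done())     # True
```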
In the wearable device provided by the embodiment of the invention, providing function keys further improves the operability of the device.
On the basis of the above embodiment, the frame body 1 is further externally provided with a switching power supply, a USB interface and a lithium battery. The switching power supply controls switching the wearable device on and off; there may be one or more USB interfaces, which are used to charge the wearable device and store charge in the lithium battery.
In the wearable device provided by the embodiment of the invention, the externally arranged switching power supply, USB interface and lithium battery further extend the functionality of the device.
The maintenance workflow for equipment to be maintained, carried out in combination with the wearable device, is described below:
1. and according to the positioning information returned by the positioning chip 4, combining the first scene image acquired by the image pickup device 7 to identify whether a maintenance part in a maintenance procedure exists.
2. And if the target first scene image is confirmed, matching the target first scene image according to the first scene image and the positioning information.
3. Comparing the target first scene image identification result with a pre-stored standard image sample in a memory, and simultaneously displaying the compared result in a generated second scene image, wherein the maintenance part which does not need to be checked manually is identified by first marking information in the second scene image; and identifying the maintenance part needing manual inspection by using the second marking information.
4. The second labeling information prompts the maintenance personnel to check the maintenance part manually, and the completion condition of maintenance work is confirmed by detecting the voice, gesture, external keyboard, information input in the mouse and the like of the maintenance personnel.
5. And (3) recording the positioning information and the first scene image of the wearable equipment in the whole process, and further evaluating the work of the maintenance personnel according to the positioning information and the first scene image.
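Step 5 calls for recording the positioning information and first scene images throughout the process so that the maintenance work can be evaluated afterwards. Below is a minimal sketch of such a recorder; the JSON-lines log format, the field names and the coverage metric are illustrative assumptions rather than anything specified by the patent.

```python
import json
import time
from pathlib import Path


class MaintenanceRecorder:
    """Append one record per captured frame: timestamp, position, image path,
    and which maintenance components were identified in that frame."""

    def __init__(self, log_path: str):
        self.log_path = Path(log_path)

    def record(self, position, image_file: str, components_seen):
        entry = {
            "timestamp": time.time(),
            "position": position,                 # e.g. latitude/longitude or a named location
            "image": image_file,                  # where the first scene image was stored
            "components_seen": list(components_seen),
        }
        with self.log_path.open("a", encoding="utf-8") as f:
            f.write(json.dumps(entry, ensure_ascii=False) + "\n")

    def coverage(self, procedure):
        """Fraction of the maintenance procedure that appears somewhere in the log,
        one possible basis for evaluating the maintenance personnel's work."""
        seen = set()
        if self.log_path.exists():
            for line in self.log_path.read_text(encoding="utf-8").splitlines():
                seen.update(json.loads(line)["components_seen"])
        covered = [c for c in procedure if c in seen]
        return len(covered) / len(procedure) if procedure else 1.0


if __name__ == "__main__":
    rec = MaintenanceRecorder("maintenance_log.jsonl")
    rec.record((39.9, 116.4), "frame_0001.png", ["limit switch"])
    print(rec.coverage(["limit switch", "screw"]))   # 0.5
```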
The wearable device provided by the embodiment of the invention does not need to be held in the hand and can automatically check the equipment to be maintained, thereby improving the efficiency of the overall maintenance workflow.
Fig. 2 is a flow chart of a maintenance processing method according to an embodiment of the present invention. As shown in Fig. 2, the maintenance processing method includes the following steps:
s1: and acquiring a first scene image of the equipment to be maintained.
Specifically, the wearable device obtains a first scene image of the device to be maintained. The first scene image can be obtained through a camera device built in the wearable equipment.
S2: and acquiring positioning information of the wearable equipment.
Specifically, the wearable device obtains positioning information of the wearable device. The positioning information can be obtained through a positioning chip built in the wearable device.
S3: and if the first scene image is judged to be provided with the components in the maintenance procedure stored in the memory, matching the target first scene image according to the first scene image and the positioning information.
Specifically, if the wearable device determines that the first scene image has a component in the maintenance procedure stored in the memory, the wearable device matches a target first scene image according to the first scene image and the positioning information. The existence of the component in the maintenance procedure stored in the memory of the first scene image can be judged by a CPU (central processing unit) built in the wearable device. For a target first scene image, the following is illustrated: when the camera device of the wearable equipment acquires all scenes (corresponding to the first scene image) in the aircraft cockpit, and when the current position of the wearable equipment is an operation desk in the cockpit, if the operation desk in the cockpit has components in the maintenance procedure, the matched target first scene image is: an operating table in the cabin.
S4: Recognize the target first scene image, and display, in a second scene image, the result of comparing the recognition result with a standard image sample pre-stored in the memory, where the second scene image is superimposed on the target first scene image, the comparison result includes first labeling information for components of the equipment to be maintained that do not require manual inspection, and the first labeling information identifies which components need maintenance and which do not.
Specifically, the wearable device recognizes the target first scene image and displays, in a second scene image, the result of comparing the recognition result with the standard image sample pre-stored in the memory; the second scene image is superimposed on the target first scene image, the comparison result includes first labeling information for components of the equipment to be maintained that do not require manual inspection, and the first labeling information identifies which components need maintenance and which do not. The recognition of the target first scene image is performed by the image recognition chip in the wearable device. The first labeling information can be understood as labeling information for maintenance components that do not require manual inspection, for example: whether a limit switch is in the correct position, or whether screws and catches are fastened; such checks are easy to settle by recognizing the target first scene image. A specific labeling scheme for the first labeling information may be: components that do not need maintenance are marked in green, and components that do need maintenance are marked in red, so that maintenance personnel can identify them at a glance and quickly complete the maintenance of the components that need it.
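The comparison between the recognition result and the pre-stored standard image sample can be pictured, in the simplest case, as measuring how much a component's image patch deviates from its stored reference. The sketch below uses a plain pixel-difference threshold with OpenCV as a stand-in; the patent does not specify the comparison algorithm, so both the method and the threshold are assumptions for illustration only.

```python
import cv2
import numpy as np

DIFF_THRESHOLD = 25.0   # assumed mean grey-level difference above which a component is flagged


def needs_maintenance(component_patch: np.ndarray, standard_sample: np.ndarray) -> bool:
    """Compare a recognised component against its standard image sample.
    Returns True if the deviation is large enough to flag the component."""
    patch = cv2.cvtColor(component_patch, cv2.COLOR_BGR2GRAY)
    sample = cv2.cvtColor(standard_sample, cv2.COLOR_BGR2GRAY)
    sample = cv2.resize(sample, (patch.shape[1], patch.shape[0]))   # align sizes
    diff = cv2.absdiff(patch, sample)
    return float(diff.mean()) > DIFF_THRESHOLD


if __name__ == "__main__":
    reference = np.full((64, 64, 3), 128, dtype=np.uint8)          # stored standard sample
    fastened = reference.copy()                                    # looks like the reference
    loose = reference.copy()
    loose[16:48, 16:48] = 255                                      # visibly deviates
    print(needs_maintenance(fastened, reference))   # False -> green mark
    print(needs_maintenance(loose, reference))      # True  -> red mark
```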
S5: Display the second scene image.
Specifically, the wearable device displays the second scene image. The second scene image can be displayed on the lenses of the wearable device and may be superimposed on the target first scene image using augmented reality technology.
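One simple way to picture the superimposition in S5 is to alpha-blend the annotation layer over the camera frame, as in the sketch below. A real augmented-reality lens would project the overlay optically, so this is only a software approximation for illustration; the blending factor and mask logic are assumptions.

```python
import cv2
import numpy as np


def superimpose(first_scene: np.ndarray, annotations: np.ndarray, alpha: float = 0.6) -> np.ndarray:
    """Blend the annotation layer (the content of the second scene image) over the
    target first scene image, keeping unannotated pixels unchanged."""
    blended = cv2.addWeighted(annotations, alpha, first_scene, 1.0 - alpha, 0.0)
    mask = annotations.any(axis=2)                 # only pixels that actually carry annotations
    out = first_scene.copy()
    out[mask] = blended[mask]
    return out


if __name__ == "__main__":
    frame = np.full((240, 320, 3), 60, dtype=np.uint8)             # stand-in camera frame
    overlay = np.zeros_like(frame)
    cv2.rectangle(overlay, (50, 50), (150, 120), (0, 0, 255), 2)   # a red "needs maintenance" mark
    composed = superimpose(frame, overlay)
    print(composed.shape)
```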
The maintenance processing method provided by the embodiment of the invention requires no handheld device and can automatically check the equipment to be maintained, thereby improving the efficiency of the overall maintenance workflow.
On the basis of the above embodiment, the comparison result further includes second labeling information for components that require manual inspection.
Specifically, the comparison result further includes second labeling information for components that require manual inspection, for example: whether the fuel tank leaks or whether a cable has broken wires; such checks are not easy to settle by recognizing the target first scene image and require the maintenance personnel to go to the site for further inspection and confirmation.
In the maintenance processing method provided by the embodiment of the invention, the second labeling information prompts the components that must be inspected manually to determine whether they need maintenance, which helps to ensure that every maintenance item is completed.
On the basis of the above embodiment, the second labeling information and the first labeling information are displayed in the second scene image at the same time, and the second labeling information is displayed as a table in a corner of the second scene image.
Specifically, the second labeling information and the first labeling information are displayed in the second scene image at the same time, and the second labeling information is displayed as a table in a corner of the second scene image. Selection of items and confirmation of completed maintenance can further be performed by detecting the state of the function keys on the wearable device; the function keys may include a direction selection key and a confirmation key for confirming a selection.
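Displaying the second labeling information as a table in a corner of the second scene image can be sketched as drawing a small text panel in, say, the top-left corner, with the currently selected row highlighted so the direction and confirm keys have something to act on. The panel size, font and highlight colour below are illustrative assumptions, not details from the patent.

```python
import cv2
import numpy as np

PANEL_COLOR = (40, 40, 40)
TEXT_COLOR = (255, 255, 255)
SELECTED_COLOR = (0, 255, 255)       # highlight for the row the direction keys point at
ROW_HEIGHT = 18


def draw_manual_check_table(second_scene: np.ndarray, items, selected: int) -> np.ndarray:
    """Draw the second labeling information (components needing manual inspection)
    as a compact table in the top-left corner of the second scene image."""
    out = second_scene.copy()
    width = 10 + 8 * max(len(s) for s in items)          # rough panel width from text length
    height = 10 + ROW_HEIGHT * len(items)
    cv2.rectangle(out, (5, 5), (5 + width, 5 + height), PANEL_COLOR, -1)
    for i, text in enumerate(items):
        color = SELECTED_COLOR if i == selected else TEXT_COLOR
        cv2.putText(out, text, (10, 5 + ROW_HEIGHT * (i + 1)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.45, color, 1)
    return out


if __name__ == "__main__":
    scene = np.zeros((240, 320, 3), dtype=np.uint8)
    shown = draw_manual_check_table(scene, ["check fuel tank for leaks",
                                            "check cable for broken wires"], selected=0)
    print(shown.shape)
```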
In the maintenance processing method provided by the embodiment of the invention, combining the second labeling information with the first labeling information improves the efficiency of the maintenance work.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiments may be implemented by hardware controlled by program instructions. The program may be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiments. The storage medium includes various media that can store program code, such as ROM, RAM, magnetic disks or optical disks.
The embodiments of the wearable device and the like described above are merely illustrative. Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment. A person of ordinary skill in the art can understand and implement this without inventive effort.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by software plus the necessary general-purpose hardware platform, or of course by hardware. Based on this understanding, the technical solution, or the part of it that contributes over the prior art, may be embodied in the form of a software product stored in a computer-readable storage medium such as ROM/RAM, a magnetic disk or an optical disk, including several instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) to execute the method described in each embodiment or in parts of the embodiments.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the embodiments of the present invention, not to limit them. Although the embodiments of the present invention have been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments can still be modified, or some or all of their technical features can be replaced by equivalents, and such modifications and replacements do not take the essence of the corresponding technical solutions outside the scope of the technical solutions of the embodiments of the present invention.

Claims (8)

1. A maintenance processing method, characterized in that it is used with a wearable device, the wearable device comprising: a frame body and lenses, wherein an image recognition chip, a positioning chip, a CPU and a memory are arranged inside the frame body, and a camera device is arranged on the frame body between the lenses;
the maintenance processing method comprises the following steps:
acquiring a first scene image of the equipment to be maintained through the camera device;
acquiring positioning information of the wearable device through the positioning chip;
matching, by the CPU, a target first scene image according to the first scene image and the positioning information if the CPU determines that the first scene image contains a component listed in the maintenance procedure stored in the memory;
recognizing the target first scene image through the image recognition chip, and displaying, in a second scene image, the result of comparing the recognition result with a standard image sample pre-stored in the memory, wherein the second scene image is superimposed on the target first scene image, the comparison result comprises first labeling information for components of the equipment to be maintained that do not require manual inspection, and the first labeling information identifies which components need maintenance and which do not;
displaying the second scene image through the lenses;
wherein the comparison result further comprises second labeling information for components that require manual inspection; the second labeling information prompts the maintenance personnel to manually inspect the corresponding maintenance components, and completion of the maintenance work is confirmed by detecting the maintenance personnel's voice or gestures, or information entered via an external keyboard or mouse;
and recording the positioning information and first scene images of the wearable device throughout the process, and evaluating the work of the maintenance personnel according to the positioning information and the first scene images.
2. The maintenance processing method according to claim 1, wherein the second scene image is superimposed on the target first scene image using augmented reality technology.
3. The maintenance processing method according to claim 1, wherein the image recognition chip is a GPU or an FPGA.
4. The maintenance processing method according to claim 1, wherein the positioning chip is a radio-frequency chip.
5. The maintenance processing method according to claim 1, wherein the frame body further has a chip for image optimization, and the chip for image optimization comprises: an accelerometer and/or a gyroscope and/or a magnetometer.
6. The maintenance processing method according to claim 1, wherein a function key is further arranged at the end of the frame body and is used for selecting second labeling information in the second scene image;
the second labeling information is prompt information for components of the equipment to be maintained that must be inspected manually to determine whether they need maintenance, and the function keys comprise a direction selection key and a confirmation key for confirming a selection.
7. The maintenance processing method according to any one of claims 1 to 6, wherein the frame body is further externally provided with a switching power supply, a USB interface and a lithium battery.
8. The maintenance processing method according to claim 1, wherein the second labeling information and the first labeling information are displayed in the second scene image at the same time, and the second labeling information is displayed as a table in a corner of the second scene image.
CN201710121049.6A 2017-03-02 2017-03-02 Wearable device and maintenance processing method Active CN108536220B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710121049.6A CN108536220B (en) 2017-03-02 2017-03-02 Wearable device and maintenance processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710121049.6A CN108536220B (en) 2017-03-02 2017-03-02 Wearable device and maintenance processing method

Publications (2)

Publication Number Publication Date
CN108536220A CN108536220A (en) 2018-09-14
CN108536220B (en) 2024-03-08

Family

ID=63489168

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710121049.6A Active CN108536220B (en) 2017-03-02 2017-03-02 Wearable device and maintenance processing method

Country Status (1)

Country Link
CN (1) CN108536220B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110264515B (en) * 2019-05-07 2023-08-18 联想(上海)信息技术有限公司 Labeling method and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104834379A (en) * 2015-05-05 2015-08-12 江苏卡罗卡国际动漫城有限公司 Repair guide system based on AR (augmented reality) technology
CN105158900A (en) * 2015-09-28 2015-12-16 大连楼兰科技股份有限公司 Method for recognizing head posture on automobile maintenance intelligent glasses
CN105223694A (en) * 2015-09-28 2016-01-06 大连楼兰科技股份有限公司 Intelligent glasses is applied to the method for indoor positioning in automobile repair and maintenance process and image recognition
CN206557668U (en) * 2017-03-02 2017-10-13 北京戈德思科科技有限公司 A kind of wearable device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160063327A1 (en) * 2014-08-29 2016-03-03 Datangle, Inc. Wearable Device To Display Augmented Reality Information
WO2016164355A1 (en) * 2015-04-06 2016-10-13 Scope Technologies Us Inc. Method and apparatus for sharing augmented reality applications to multiple clients

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104834379A (en) * 2015-05-05 2015-08-12 江苏卡罗卡国际动漫城有限公司 Repair guide system based on AR (augmented reality) technology
CN105158900A (en) * 2015-09-28 2015-12-16 大连楼兰科技股份有限公司 Method for recognizing head posture on automobile maintenance intelligent glasses
CN105223694A (en) * 2015-09-28 2016-01-06 大连楼兰科技股份有限公司 Intelligent glasses is applied to the method for indoor positioning in automobile repair and maintenance process and image recognition
CN206557668U (en) * 2017-03-02 2017-10-13 北京戈德思科科技有限公司 A kind of wearable device

Also Published As

Publication number Publication date
CN108536220A (en) 2018-09-14

Similar Documents

Publication Publication Date Title
US10049111B2 (en) Maintenance assistance for an aircraft by augmented reality
US10438409B2 (en) Augmented reality asset locator
US11321044B2 (en) Augmented reality quick-start and user guide
US20130010068A1 (en) Augmented reality system
US20190333633A1 (en) Medical device information providing system, medical device information providing method, and program
CN109835260B (en) Vehicle information display method, device, terminal and storage medium
CN107798391A (en) A kind of analysis of equipment fault using augmented reality and maintenance system
GB2567968A (en) Wafer test system
KR20180011794A (en) Object visualization method
CN113537374B (en) Method for generating countermeasure sample
EP3232407A2 (en) Validating flight checklist items for maintenance and inspection
JP6316176B2 (en) Inspection history management system, inspection history management method
US9560272B2 (en) Electronic device and method for image data processing
CN109120730A (en) The system and method for double address failure are solved during the debugging of the system of connection
US11325258B2 (en) Guidance apparatus and method for failure recovery
RU2018115965A (en) AIRCRAFT VERIFICATION SYSTEM WITH VISUALIZATION AND RECORDING
CN108536220B (en) Wearable device and maintenance processing method
EP3321869B1 (en) Tag based location
JP2011186756A (en) Image display system and image display method
JP5891191B2 (en) Operation result acquisition system and operation result acquisition method
JP2018082338A (en) Monitor system
JP6479881B2 (en) Inspection support device, inspection support method, and program
EP3319029B1 (en) Emphasizing equipment based on an equipment tag
JP2023065371A (en) Manufacturing assistance system, method, and program
CN110458052A (en) Recongnition of objects method, apparatus based on augmented reality, equipment, medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant