WO2017033561A1 - Display system, display device, display method, and program - Google Patents
Display system, display device, display method, and program
- Publication number
- WO2017033561A1 (PCT/JP2016/069055)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- display
- display device
- real space
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/37—Details of the operation on graphic patterns
- G09G5/377—Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/32—Operator till task planning
- G05B2219/32014—Augmented reality assists operator in maintenance, repair, programming, assembly, use of head mounted display with 2-D 3-D display and voice feedback, voice and gesture command
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/35—Nc in input of data, input till input file format
- G05B2219/35482—Eyephone, head-mounted 2-D or 3-D display, also voice and other control
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39449—Pendant, pda displaying camera images overlayed with graphics, augmented reality
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Definitions
- the present invention relates to a display system, a display device, a display method, and a program.
- Patent Document 2 discloses a technique for displaying information related to failure recovery work on a wearable AR display device worn by an operator so as to be superimposed on an image in real space.
- the present invention has been made in view of such problems, and an object thereof is to provide support for improving the accuracy of visual inspection performed by an inspector.
- FIG. 1 is a diagram showing an AR display system.
- FIG. 2 is a diagram illustrating an AR display device.
- FIG. 3 is a diagram illustrating the management apparatus.
- FIG. 4 is an explanatory view of a completed vehicle inspection.
- FIG. 5 is a flowchart showing the inspection support process by the AR display system.
- FIG. 6 is a diagram illustrating an example of a data configuration of the reference image DB.
- FIG. 7 is a diagram illustrating a display example of a reference image.
- FIG. 8A is a diagram illustrating an enlarged display example of an image related to a selection instruction.
- FIG. 8B is a diagram illustrating an enlarged display example of an image related to the selection instruction.
- FIG. 9 is a diagram illustrating a display example of a reference image.
- FIG. 1 is a diagram showing an augmented reality display system as a display system.
- the augmented reality display system is referred to as an AR (Augmented Reality) display system.
- the AR display system includes an AR display device 100, a management device 110, and a PC 120.
- the AR display device 100 is a glasses-type device.
- an inspector wears the AR display device 100 and visually inspects an inspection object.
- the AR display device 100 is also a light transmission type display device, and a light transmission type display unit 105 is provided at a position corresponding to the lens portion of the glasses.
- the inspector wearing the AR display device 100 can see an object existing ahead of the line of sight in the real space via the display unit 105 of the AR display device 100.
- the inspector wearing the AR display device 100 thus perceives an arbitrary image displayed on the display unit 105 as superimposed on the real space viewed through the display unit 105.
- the AR display device 100 is a display device capable of displaying an image superimposed on the real space.
- a space in which a real space and an arbitrary image are combined is referred to as an augmented reality space.
- a photographing unit 107 is provided at a position adjacent to the display unit 105.
- the imaging unit 107 is installed so that the line-of-sight direction of the wearer of the AR display device 100 matches the imaging direction of the imaging unit 107. Thereby, the imaging unit 107 can capture an image of the real space viewed by the inspector wearing the AR display device 100.
- as another example, the photographing unit 107 may be installed such that its photographing direction and the wearer's line-of-sight direction have a fixed, known relationship.
- FIG. 2 is a diagram showing the AR display device 100.
- the AR display device 100 includes a CPU 101, a ROM 102, a RAM 103, a communication I / F 104, a display unit 105, a microphone 106, and a photographing unit 107.
- the CPU 101 reads the control program stored in the ROM 102 and executes various processes.
- the RAM 103 is used as a temporary storage area such as a main memory and a work area for the CPU 101. Note that the functions and processing of the AR display device 100 to be described later are realized by the CPU 101 reading out a computer program stored in the ROM 102 and executing this program.
- FIG. 4 is an explanatory diagram of a completed vehicle inspection carried out in an automobile factory.
- the finished vehicle inspection includes inspection by an inspection device and inspection by an inspector; the inspector A wears the AR display device 100 and performs a visual inspection.
- the completed vehicle 200 is placed on the belt conveyor 210 and carried in front of the inspector A.
- the emblem 201 provided on the front of the completed vehicle 200 serves as the inspection object, and inspection of the shape of the emblem 201 is described as an example.
- FIG. 5 is a flowchart showing inspection support processing by the AR display system.
- This process is a process executed when an inspector wearing the AR display device 100 performs an inspection.
- the CPU 101 of the AR display device 100 confirms whether or not an instruction to select an inspection item has been accepted in response to an utterance by the inspector.
- the inspection item is information indicating the inspection object to be inspected by the inspector and the content of the inspection.
- the CPU 101 of the AR display device 100 performs voice recognition on the voice input from the microphone 106 and, according to the recognition result, accepts a selection instruction for the inspection item "shape of emblem".
- the user interface for the inspector to input a selection instruction is not limited to the microphone 106.
- FIG. 6 is a diagram illustrating an example of a data configuration of the reference image DB 118.
- the reference image DB 118 is stored in a storage unit such as the HDD 117 of the management apparatus 110, for example.
- inspection items and reference images are stored in association with each other.
- the reference image includes a pass image and a fail image.
- the pass image is an image of an inspection object that was determined to pass.
- the fail image is an image of an inspection object that was determined to fail.
- the reference image DB 118 stores the reference image so that it can be identified whether the image is a pass image or a fail image. It is assumed that the pass image and the fail image are created in advance by a designer or the like and registered in the reference image DB 118 before actual processing is performed, such as when designing an AR display system.
- the CPU 111 of the management apparatus 110 extracts the reference images associated with the inspection item from the reference image DB 118. Note that for some inspection items both a pass image and a fail image are stored, while for others only one of the two is stored. In the former case both pass and fail images are extracted; in the latter case only the stored kind is extracted.
- the CPU 111 of the management apparatus 110 transmits the extracted reference images to the AR display device 100 via the communication I/F 114, together with information identifying whether each reference image is a pass image or a fail image.
- the process of S103 is an example of a reception process for receiving at least one of a pass image and a fail image.
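The reference image DB lookup described in S101 to S103 can be sketched as follows. This is a minimal illustration only: the dictionary structure, file names, and function name are assumptions for this sketch, not taken from the patent.

```python
# Hypothetical sketch of the reference image DB of FIG. 6 and the extraction in S101.
# The stored file names and the inspection item below are illustrative examples.
REFERENCE_IMAGE_DB = {
    "shape of emblem": {
        "pass": ["pass_01.png", "pass_02.png", "pass_03.png"],
        "fail": ["fail_01.png", "fail_02.png", "fail_03.png", "fail_04.png"],
    },
}

def extract_reference_images(inspection_item):
    """Return (pass_images, fail_images) for an inspection item.

    Either list may be empty: some items have only pass images
    registered, others only fail images (cf. the note on S101).
    """
    entry = REFERENCE_IMAGE_DB.get(inspection_item, {})
    return entry.get("pass", []), entry.get("fail", [])
```

In S103 both lists would then be transmitted to the AR display device 100, each image labeled as pass or fail.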
- the CPU 101 of the AR display device 100 extracts one pass image and one fail image from the received reference images and displays them on the display unit 105.
- FIG. 7 is a diagram illustrating a display example of a pass image and a fail image. One pass image 221 and one fail image 223 are displayed on the left and right sides of the display unit 105, respectively.
- through the display unit 105, the inspector sees the emblem 201 of the completed vehicle 200, the inspection object existing in the real space, and at the same time sees the pass image 221 and the fail image 223 displayed on the display unit 105. That is, the inspector can perceive an augmented reality space in which the pass image 221 and the fail image 223 are superimposed on the inspection object existing in the real space.
- the AR display device 100 displays each of the pass image and the fail image on the display unit 105 at a predetermined display position. Thereby, the inspector can specify whether the image is a pass image or a fail image from the display position without confirming the information 222 and 224 indicating the pass image or the fail image.
- the AR display device 100 displays pass images on the left side of the display unit 105 and fail images on the right side. Further, when only one of the pass image and the fail image is received in S103, the CPU 101 of the AR display device 100 displays only the received image (the pass image or the fail image) in S104.
- the CPU 101 of the AR display device 100 further displays “1/3” adjacent to the pass image 221.
- the denominator "3" is the total number of pass images received in S103, and
- the numerator "1" is the identification number of the pass image 221 being displayed.
- the CPU 101 of the AR display device 100 also displays "1/4" in the same manner adjacent to the fail image 223.
- the CPU 101 of the AR display device 100 can further switch the image displayed on the display unit 105 in accordance with an instruction from the inspector. For example, when the inspector utters "change of pass image", the CPU 101 of the AR display device 100 accepts a display switching instruction for the pass image and displays another received pass image in place of the one being displayed. Similarly, when the inspector utters "change of fail image", the CPU 101 accepts a display switching instruction for the fail image and displays another received fail image in place of the one being displayed. In this manner, by switching the display image each time a display switching instruction is received, the CPU 101 of the AR display device 100 can display the received pass images and fail images in order.
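The display-switching behavior described above (one image shown at a time, advanced on each "change" utterance, labeled in the "1/3" fraction format) might be sketched like this; the class and method names are assumptions for illustration:

```python
class ImageCycler:
    """Cycles through the received reference images of one kind
    (pass or fail) and renders the '1/3'-style fraction label."""

    def __init__(self, images):
        self.images = images  # images received in S103, in order
        self.index = 0        # currently displayed image

    def current(self):
        return self.images[self.index]

    def label(self):
        # numerator: identification number of the displayed image;
        # denominator: total number of received images
        return f"{self.index + 1}/{len(self.images)}"

    def switch(self):
        # a "change of pass image" / "change of fail image" utterance
        # advances to the next received image, wrapping around
        self.index = (self.index + 1) % len(self.images)
        return self.current()
```

One such cycler per image kind (pass and fail) would reproduce the left/right display with independent switching.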
- the process of S104 is an example of a display process for displaying an image received from the management apparatus 110.
- the number of images simultaneously displayed on the display unit 105 is not limited to the embodiment.
- for example, three pass images and three fail images may be displayed on the left and right sides of the display unit 105 respectively, and each time a display switching instruction is received, the next three images may be displayed in place of the three currently displayed images.
- the inspector causes the display unit 105 to display the pass image and the fail image, and compares the displayed image with the inspection object existing in the real space.
- when one of the displayed images is particularly similar to the inspection object, the inspector inputs an instruction to select it. For example, when the inspector wants to compare the inspection object with the displayed pass image, he or she utters "pass image selection".
- the CPU 101 of the AR display device 100 performs voice recognition processing on the voice input to the microphone 106, and determines that an instruction to select a passing image being displayed has been received according to the recognition result. Similarly, the CPU 101 of the AR display device 100 can accept an instruction to select a rejected image being displayed.
- FIG. 8A is a diagram illustrating an enlarged display example of a pass image.
- the pass image 231 is displayed on the left side of the display unit 105.
- FIG. 8B is a diagram illustrating an enlarged display example of the reject image.
- the reject image 232 is displayed on the right side of the display unit 105.
- the CPU 101 of the AR display device 100 can further receive a scaling instruction from the inspector and can enlarge or reduce the pass image 231 or the reject image 232 according to the scaling instruction. Thereby, the display size of the pass image 231 and the reject image 232 can be adjusted to the same size as the size at which the inspector visually recognizes the inspection object 201 in the real space, for example.
- the inspector can bring the pass image 231 or the fail image 232 next to the inspection object 201 by moving the line of sight. Furthermore, the CPU 101 of the AR display device 100 can change the display position of the pass image 231 or the fail image 232 in accordance with an instruction from the inspector, so that the image is displayed next to the inspection object 201 as described above. Since the inspector can then compare the actual inspection object with the pass image 231 or the fail image 232 without moving the line of sight, the pass/fail determination can be made more accurately.
- even with the displayed images as reference, the inspector may still be unsure of the pass/fail judgment, for example when the inspection object does not resemble any of the pass and fail images displayed on the display unit 105. In such a case, the inspector utters, for example, "determination request".
- the CPU 101 of the AR display device 100 performs voice recognition processing on the voice of the inspector input to the microphone 106, and determines that a determination request instruction has been received from the recognition result.
- the CPU 101 of the AR display device 100 confirms whether or not a determination request instruction has been accepted.
- if the determination request instruction has been accepted (Yes in S107), the process proceeds to S108. If the CPU 101 of the AR display device 100 does not accept the determination request instruction (No in S107), the process proceeds to S116.
- the process of S107 is an example of a reception process for receiving a determination request instruction.
- the CPU 101 of the AR display device 100 performs a shooting control process. This process is a process of taking an image and storing it in the storage unit, similar to the shooting control process in S106.
- the process proceeds to S110.
- the CPU 111 of the PC 120 displays the received captured image on the display unit 115 together with the inspection item.
- the supervisor of the inspector who is the user of the PC 120 confirms the captured image displayed on the display unit 115, determines pass or fail, and inputs the determination result via the input unit 116.
- the PC 120 may be an apparatus used by any person designated in advance as being able to make a correct judgment on the inspection.
- the PC 120 may be an apparatus used by a manager of judgment criteria in the inspection process, a manager of the entire final inspection process, or the like.
- the CPU 111 of the PC 120 receives an input of a determination result.
- the CPU 111 of the PC 120 transmits a determination result to the AR display device 100 via the communication I / F 114.
- the process proceeds to S113.
- the CPU 101 of the AR display device 100 displays the determination result on the display unit 105. Thereby, the inspector can quickly obtain a correct inspection result even for an inspection object that cannot be determined by himself / herself.
- the CPU 101 of the AR display device 100 ends the display of the determination result as appropriate in accordance with an instruction from the inspector.
- the CPU 101 of the AR display device 100 transmits an image registration instruction for the photographed image obtained in S108 to the management device 110, and then advances the processing to S116.
- the image registration instruction includes, in addition to the captured image, the inspection item and the determination result received in S112.
- when the CPU 111 of the management apparatus 110 receives the image registration instruction, the process proceeds to S115.
- the CPU 111 of the management apparatus 110 registers the captured image related to the image registration instruction in the reference image DB 118 in association with the examination item related to the image registration instruction.
- as another example, the CPU 101 of the AR display device 100 may execute the processes of S114 and S115 to register the image only when, after receiving the determination result, it also receives an image registration instruction from the inspector.
- the identification information of the captured image transmitted in S114 is transmitted together with the determination result.
- when the management device 110 receives the identification information and the determination result, it locates the captured image identified by the identification information in the reference image DB 118. The management device 110 then records the determination result, that is, information indicating whether the image is a pass image or a fail image, in the reference image DB 118 in association with the located captured image.
- in S119, the CPU 101 of the AR display device 100 confirms whether or not a reference image display end instruction has been accepted in accordance with the user's utterance. If no display end instruction has been received (No in S119), the process proceeds to S104. As another example, if no display end instruction has been received (No in S119), the process may instead proceed to S101, so that a more up-to-date state of the reference image DB 118 is referred to. On the other hand, when the CPU 101 of the AR display device 100 receives a display end instruction (Yes in S119), the process ends.
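The determination-request path (S107 to S115) described above amounts to: photograph the object, forward the photo to the supervisor's PC, receive the pass/fail result, and register the photo as a new reference image. A minimal sketch follows, in which `send_to_pc` and `register` are hypothetical stand-ins for the communication with the PC 120 and the management device 110:

```python
def handle_determination_request(captured_image, inspection_item, send_to_pc, register):
    """Sketch of S108-S115 for one determination request.

    send_to_pc: forwards the captured image to the supervisor's PC 120
                and blocks until the supervisor inputs pass or fail.
    register:   records the image and result in the reference image DB 118
                on the management device 110, growing the criteria over time.
    """
    result = send_to_pc(captured_image, inspection_item)  # S109-S112
    register(inspection_item, captured_image, result)     # S114-S115
    return result  # displayed to the inspector in S113
```

The key point of the flow is the last step: every supervisor judgment automatically becomes a new reference image, so the determination criteria accumulate without manual maintenance.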
- as described above, the AR display device 100 displays a pass image and a fail image superimposed on the inspection object existing in the real space. Therefore, when judging the inspection object, the inspector can determine whether it passes or fails while viewing images of actual passed and failed products.
- the AR display system according to the present embodiment can provide support that leads to an improvement in the inspection accuracy of the inspection visually performed by the inspector.
- the reference images are used as the determination criteria, and new reference images are accumulated automatically, so the determination criteria are kept up to date.
- identification information for distinguishing the two can be displayed with symbols and characters common to many countries, such as a circle and a cross. Therefore, manual updating of the determination criteria by an administrator or the like is unnecessary, and translation work at the time of updating can also be eliminated.
- the AR display device 100 is preferably a wearable device such as the eyeglass type as described in the present embodiment or a head-mounted type such as a head-mounted display.
- the AR display device 100 is not limited to the wearable type.
- in a case where the line-of-sight movement during inspection is small, the AR display device 100 may, for example, be installed at a position in the inspector's line-of-sight direction, between the inspector and the inspection object, so that images are displayed while the inspection object remains visible through the device.
- the AR display device 100 may be portable.
- the AR display device 100 is not limited to an optical transmission type display device.
- a video transmission type display device may be used.
- in this case, the CPU 101 of the AR display device 100 controls the photographing unit 107 to photograph the inspection object, and displays the captured video of the real space on the display unit 105.
- the CPU 101 of the AR display device 100 superimposes and displays a pass image and a reject image on the video of the inspection object in the real space being displayed on the display unit 105.
- that is, in this case, upon a superposition display instruction, the CPU 101 of the AR display device 100 superimposes the instructed image on the image of the inspection object within the real-space video being displayed on the display unit 105, rather than on the inspection object in the real space itself.
- the AR display device 100 displays a 2D image, but may display a 3D image instead.
- the AR display device 100 may have two display units at positions corresponding to the right eye and the left eye, and display an image that realizes a 3D image on each display unit.
- the AR display device 100 may store a reference image corresponding to all inspection items that can be executed in a storage unit such as the ROM 102 in advance.
- in this case, when the AR display device 100 receives an inspection item selection instruction in S100 shown in FIG. 5, it reads the reference images related to the selection instruction from the reference images stored in the storage unit (reading process) in S104 and displays the read reference images. That is, in this case, the processing of S101 to S103 is unnecessary.
- the AR display device 100 may also periodically transmit an update image acquisition request to the management device 110, receive updated reference images (additional reference images) from the management device 110, and store them in the storage unit.
- the inspector inputs an instruction to start the inspection at the start of the inspection.
- the AR display device 100 identifies the first inspection item indicated in the inspection information and transmits an image acquisition request including the first inspection item to the management device 110, whereby the reference images for the first inspection item are received and displayed. Thereafter, the inspector inputs an instruction to end the inspection when the first inspection is completed.
- when the AR display device 100 receives the instruction to end the inspection, it refers to the inspection information and advances to the next inspection item for the inspection object. The AR display device 100 then transmits an image acquisition request including the second inspection item to the management device 110, receives the corresponding reference images, and displays them. Thereafter, the same processing may be repeated.
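The sequential-inspection variant described above can be sketched as a simple loop. All callback names here (`fetch_reference_images`, `display`, `wait_for_end`) are illustrative assumptions standing in for the image acquisition requests to the management device 110 and the inspector's end-of-inspection instruction:

```python
def run_inspection_sequence(inspection_items, fetch_reference_images, display, wait_for_end):
    """Step through the inspection items listed in the inspection
    information, fetching and displaying the reference images for
    each item in turn (hypothetical sketch of the variant above)."""
    for item in inspection_items:
        # image acquisition request to the management device 110
        images = fetch_reference_images(item)
        display(item, images)
        # the inspector's "end of inspection" instruction advances
        # processing to the next inspection item
        wait_for_end()
```

The same loop covers any number of inspection items, which is the point of driving the display from the inspection information rather than from per-item selection instructions.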
- the arrangement of the reference images when the AR display device 100 displays them on the display unit 105 in S104 of FIG. 5 is not limited to the embodiment.
- the AR display device 100 may display a plurality of rejected images on the right side of the display unit 105 as shown in FIG.
- in S104, as shown in FIGS. 8A and 8B, only the pass image 231 or only the fail image 232 may be displayed on the display unit 105.
- the total number and the identification number of the image being displayed may be displayed in a fractional format.
- as another example, in S106 of FIG. 5, the AR display device 100 may automatically adjust the image size and the display position of the image related to the selection instruction without receiving an instruction from the inspector.
- the CPU 101 of the AR display device 100 first acquires the captured image by controlling the imaging unit 107. Then, the CPU 101 performs image recognition on the captured image, extracts an image of the inspection object (image recognition processing), and specifies the position and size of the image of the inspection object in the captured image.
- the CPU 101 of the AR display device 100 further determines the display position and display size of the image related to the selection instruction based on the relationship between the shooting direction and the line-of-sight direction and the position and size of the inspection object in the shot image.
- the display position is the position adjacent to the inspection object. Then, the CPU 101 of the AR display device 100 displays an image according to the selection instruction with the determined display size at the determined display position.
- here, the adjacent position is a position at which the distance between the display position and the position of the inspection object as perceived by the inspector through the display unit 105 is a predetermined length, and more preferably a position at which the inspection object and the image do not overlap.
- the adjacent position may be a position determined in accordance with the size of the inspection object grasped by the inspector in the display unit 105.
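The automatic placement described above (an adjacent, non-overlapping position next to the inspection object recognized in the captured image) could be computed as in this sketch. Coordinates are display pixels and the default gap value is an assumed constant, not a value from the patent:

```python
def adjacent_position(obj_x, obj_y, obj_w, gap=20):
    """Return a display position a fixed gap to the right of the
    inspection object's bounding box, so that the object and the
    reference image never overlap.

    obj_x, obj_y, obj_w describe the bounding box of the inspection
    object found by image recognition in the captured image; the
    20-pixel default gap is an illustrative assumption.
    """
    return obj_x + obj_w + gap, obj_y
```

In the flow of S106, this would be applied after the image recognition step that yields the object's position and size, and the selected pass or fail image would then be drawn at the returned coordinates.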
- the image displayed as the pass image and the fail image by the AR display device 100 is not limited to a still image, and may be a moving image.
- the AR display device 100 may generate an animation from the moving image shot by the shooting unit 107 and display it as a pass image and a fail image.
- as another example, the AR display device 100 may register in the reference image DB 118, in association with a determination result, not only captured images for which the determination result was received from the PC 120 but also any captured image that the inspector wishes to register as a reference image.
- as another example, the AR display device 100 may capture and record an image of the inspection object during the inspection or at the end of the inspection for all inspection items, not only in cases where the inspector is unsure of the determination. Then, if a quality defect or the like arises later, the cause of the defect can be pinpointed.
- the AR display device 100 records a captured image in the management device 110 as an inspected image separately from the reference image. Furthermore, the AR display device 100 may record the inspection result in association with the inspected image.
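The recording of inspected images in association with their inspection results, as described above, can be sketched as a simple store. The storage layout and names (`InspectedImageStore`, `record`, `by_result`) are illustrative assumptions, not the disclosed schema of the management device 110.

```python
# Sketch: record captured images as "inspected images", each associated with
# its inspection item and pass/fail result, separately from reference images.
from dataclasses import dataclass, field

@dataclass
class InspectedImageStore:
    records: list = field(default_factory=list)

    def record(self, item_id: str, image_bytes: bytes, result: str) -> dict:
        # associate the captured image with its item and determination result
        entry = {"item": item_id, "image": image_bytes, "result": result}
        self.records.append(entry)
        return entry

    def by_result(self, result: str) -> list:
        # e.g. retrieve every failed item later to trace a quality problem
        return [r for r in self.records if r["result"] == result]
```

Keeping the result alongside the image is what allows a later query (for instance, all failed items) when tracing the cause of a defect.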
- the inspection object is not limited to an automobile emblem.
- the application is not limited to the completed-vehicle inspection; it may also be applied to a completion inspection in an automobile assembly process or a painting process.
- the present invention may be applied to building inspection, plant / facility / device inspection, and the like.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Health & Medical Sciences (AREA)
- Biochemistry (AREA)
- Quality & Reliability (AREA)
- Manufacturing & Machinery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Automation & Control Theory (AREA)
- General Health & Medical Sciences (AREA)
- Immunology (AREA)
- Pathology (AREA)
- Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
- General Factory Administration (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
- Image Processing (AREA)
Abstract
Description
Note that the timing at which the AR display device 100 receives the changed reference images is not particularly limited. As another example, the AR display device 100 may receive the updated reference images at the timing when the selection instruction is accepted in S100 of FIG. 5.
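The alternative timing described above (fetching updated reference images only when a selection instruction is accepted, rather than pushing them in advance) can be sketched as a lazy-update cache. The class and the `fetch_updates` callable are hypothetical stand-ins for the request to the management device; they are not part of the disclosure.

```python
# Sketch: the display device holds a local cache of reference images and
# pulls any changed images from the management device at selection time.
class ReferenceImageCache:
    def __init__(self, fetch_updates):
        self.images = {}                # item_id -> reference image payload
        self.fetch_updates = fetch_updates  # callable standing in for the management device

    def on_selection(self, item_id):
        # pull any changed reference images for this item before display
        for key, payload in self.fetch_updates(item_id).items():
            self.images[key] = payload
        return self.images.get(item_id)
```

This defers network traffic until the image is actually needed, at the cost of a fetch on the display path.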
Claims (18)
- A display system comprising a display device capable of displaying an image superimposed on real space and a management device that manages the images displayed by the display device, wherein
the display device comprises:
reading means for reading, from storage means, at least one of a pass image and a fail image for an inspection object present in the real space; and
display processing means for displaying the read image on a light-transmissive display means, superimposed on the real space seen through the display means.
- The display system according to claim 1, wherein the display device further comprises:
receiving means for receiving at least one of the pass image and the fail image from the management device; and
storing means for storing the pass image and the fail image received by the receiving means in the storage means.
- The display system according to claim 1 or 2, wherein the display device further comprises
accepting means for accepting a selection instruction for one of the images being displayed by the display processing means and further accepting a scaling instruction for the image related to the selection instruction, and
the display processing means scales and displays the image related to the scaling instruction.
- The display system according to claim 1, wherein the display device further comprises:
accepting means for accepting a selection instruction for one of the images being displayed by the display processing means;
imaging control means for controlling, when the selection instruction is accepted, the imaging of the inspection object present in the real space; and
image recognition means for extracting an image of the inspection object by image recognition processing on the captured image obtained under the control of the imaging control means,
wherein the display processing means displays the image related to the selection instruction at a position adjacent to the inspection object in the real space, based on the position of the inspection object obtained by the image recognition means.
- The display system according to claim 1, wherein the display device further comprises:
accepting means for accepting a determination request instruction for the inspection object in the real space;
imaging control means for controlling the imaging of the inspection object when the determination request instruction is accepted;
first transmission means for transmitting the captured image obtained under the control of the imaging control means to an external device; and
second receiving means for receiving, from the external device, a pass or fail determination result for the captured image,
wherein the display processing means displays the determination result.
- The display system according to claim 5, wherein the display device further comprises
second transmission means for transmitting the captured image, together with the determination result, to the management device, and
the management device comprises:
registration means for registering the captured image in storage means in association with the determination result; and
transmission means for transmitting, in accordance with the determination result, the captured image registered in the storage means to the display device as the pass image or the fail image.
- The display system according to claim 1, wherein the display device further comprises:
accepting means for accepting a selection instruction for an inspection item;
second transmission means for transmitting, to the management device, an acquisition request for the image associated with the inspection item related to the selection instruction; and
receiving means for receiving, from the management device, the image associated with the inspection item related to the selection instruction.
- The display system according to any one of claims 1 to 7, wherein the display device is a wearable device.
- The display system according to any one of claims 1 to 7, wherein the display device is a portable device.
- A display system comprising a display device capable of displaying another image superimposed on a captured image of real space and a management device that manages the images displayed by the display device, wherein
the display device comprises:
imaging control means for controlling the imaging of the real space;
reading means for reading, from storage means, at least one of a pass image and a fail image for an inspection object; and
display processing means for displaying the read image on a video see-through display means, superimposed on the captured image obtained under the control of the imaging control means.
- A display device capable of displaying an image superimposed on real space, comprising:
reading means for reading, from storage means, at least one of a pass image and a fail image for an inspection object present in the real space; and
display processing means for displaying the read image on a light-transmissive display means, superimposed on the real space seen through the display means.
- A display device capable of displaying another image superimposed on a captured image of real space, comprising:
imaging control means for controlling the imaging of the real space;
reading means for reading, from storage means, at least one of a pass image and a fail image for an inspection object; and
display processing means for displaying the read image on a video see-through display means, superimposed on the captured image obtained under the control of the imaging control means.
- A display method executed by a display system comprising a display device capable of displaying an image superimposed on real space and a management device that manages the images displayed by the display device, the method comprising:
a reading step in which the display device reads, from storage means, at least one of a pass image and a fail image for an inspection object present in the real space; and
a display processing step in which the display device displays the read image on a light-transmissive display means, superimposed on the real space seen through the display means.
- A display method executed by a display system comprising a display device capable of displaying another image superimposed on a captured image of real space and a management device that manages the images displayed by the display device, the method comprising:
an imaging control step in which the display device controls the imaging of the real space;
a reading step in which the display device reads, from storage means, at least one of a pass image and a fail image for an inspection object; and
a display processing step in which the display device displays the read image on a video see-through display means, superimposed on the captured image obtained in the imaging control step.
- A display method executed by a display device capable of displaying an image superimposed on real space, the method comprising:
a reading step in which the display device reads, from storage means, at least one of a pass image and a fail image for an inspection object present in the real space; and
a display processing step in which the display device displays the read image on a light-transmissive display means, superimposed on the real space seen through the display means.
- A display method executed by a display device capable of displaying another image superimposed on a captured image of real space, the method comprising:
an imaging control step of controlling the imaging of the real space;
a reading step of reading, from storage means, at least one of a pass image and a fail image for an inspection object; and
a display processing step of displaying the read image on a video see-through display means, superimposed on the captured image obtained in the imaging control step.
- A program for causing a computer to function as:
reading means for reading, from storage means, at least one of a pass image and a fail image for an inspection object present in real space; and
display processing means for displaying the read image on a light-transmissive display means, superimposed on the real space seen through the display means.
- A program for causing a computer to function as:
imaging control means for controlling the imaging of real space;
reading means for reading, from storage means, at least one of a pass image and a fail image for an inspection object; and
display processing means for displaying the read image on a video see-through display means, superimposed on the captured image obtained under the control of the imaging control means.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
MYPI2018000170A MY188596A (en) | 2015-08-21 | 2016-06-27 | Display system, display device, display method, and non-transitory computer readable recording medium |
JP2016567690A JP6165362B1 (ja) | 2015-08-21 | 2016-06-27 | 表示システム、表示装置、表示方法及びプログラム |
US15/753,119 US10539509B2 (en) | 2015-08-21 | 2016-06-27 | Display system, display device, display method, and non-transitory computer readable recording medium |
CN201680046493.XA CN107850550B (zh) | 2015-08-21 | 2016-06-27 | 显示系统、显示装置、显示方法以及存储介质 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015163825 | 2015-08-21 | ||
JP2015-163825 | 2015-08-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017033561A1 true WO2017033561A1 (ja) | 2017-03-02 |
Family
ID=58099780
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/069055 WO2017033561A1 (ja) | 2015-08-21 | 2016-06-27 | 表示システム、表示装置、表示方法及びプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US10539509B2 (ja) |
JP (2) | JP6165362B1 (ja) |
CN (1) | CN107850550B (ja) |
MY (2) | MY201893A (ja) |
WO (1) | WO2017033561A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019095431A (ja) * | 2017-10-27 | 2019-06-20 | ファイファー バキユーム | トレーサガスによる被検査物の密封性検査用漏洩検知モジュール、および漏洩検知方法 |
WO2019139129A1 (ja) * | 2018-01-11 | 2019-07-18 | 株式会社デンソー | 設置位置情報提供装置及び設置位置情報提供方法 |
JP2019200201A (ja) * | 2018-05-02 | 2019-11-21 | ファイファー バキユーム | トレーサガスによる被検査物の密封性を検査する漏洩検知モジュールおよび方法 |
WO2020066711A1 (ja) * | 2018-09-26 | 2020-04-02 | パナソニックIpマネジメント株式会社 | 内装材検査システム、及び、内装材検査方法 |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180150070A1 (en) * | 2016-11-29 | 2018-05-31 | Caterpillar Inc. | Portable inspection and control device |
JP6856590B2 (ja) * | 2018-08-31 | 2021-04-07 | ファナック株式会社 | センシングシステム、作業システム、拡張現実画像の表示方法、およびプログラム |
CN109584226B (zh) * | 2018-11-26 | 2021-02-05 | 浙江瑞度新材料科技有限公司 | 一种质检系统以及方法 |
RU2739901C1 (ru) * | 2019-07-23 | 2020-12-29 | Публичное акционерное общество "Ракетно-космическая корпорация "Энергия" имени С.П. Королёва" | Мобильное устройство визуализации контроля технологического процесса с применением технологии дополненной реальности |
US11624713B2 (en) * | 2019-12-04 | 2023-04-11 | Ford Global Technologies, Llc | Flexible inspection system |
JP2021096185A (ja) * | 2019-12-18 | 2021-06-24 | 日本信号株式会社 | 検査システム |
CN113282778A (zh) * | 2020-02-20 | 2021-08-20 | 青岛海尔工业智能研究院有限公司 | 一种品质异常记录方法、装置、ar设备、系统及介质 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000230806A (ja) * | 1999-02-09 | 2000-08-22 | Sony Corp | 位置認識装置、位置認識方法及び仮想画像立体合成装置 |
JP2003338700A (ja) * | 2002-05-21 | 2003-11-28 | Hitachi Giken Co Ltd | 工業製品のリペア支援装置 |
JP2004184095A (ja) * | 2002-11-29 | 2004-07-02 | Mitsubishi Heavy Ind Ltd | 検査支援装置 |
JP2007309729A (ja) * | 2006-05-17 | 2007-11-29 | Matsushita Electric Ind Co Ltd | プレス加工品の外観検査方法とそれに用いる外観検査装置 |
JP2009150866A (ja) * | 2007-11-29 | 2009-07-09 | Toshiba Corp | 外観検査装置、外観検査システム及び外観検査方法 |
JP2012007985A (ja) * | 2010-06-24 | 2012-01-12 | Nec Corp | 確認業務支援システム、サーバ装置、ヘッドマウントディスプレイ装置、ウェアラブル端末、確認業務支援方法およびプログラム |
JP2015001468A (ja) * | 2013-06-17 | 2015-01-05 | 株式会社松井製作所 | 成形品の検査装置 |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
ES2237256B1 (es) * | 2003-01-13 | 2006-02-01 | Servicios Tecnologicos Para La Peritacion, S.L. | Sistema de peritaje virtual. |
JP4243500B2 (ja) | 2003-03-12 | 2009-03-25 | 日本Cmo株式会社 | ディスプレイパネルの欠陥検査システム |
JP4599980B2 (ja) | 2003-10-15 | 2010-12-15 | パナソニック株式会社 | 多層配線構造の不良解析方法および不良解析装置 |
US7502068B2 (en) * | 2004-06-22 | 2009-03-10 | International Business Machines Corporation | Sensor for imaging inside equipment |
JP2006337235A (ja) * | 2005-06-03 | 2006-12-14 | Toppan Printing Co Ltd | 目視検査支援システムおよび目視検査支援装置 |
US8224020B2 (en) | 2007-11-29 | 2012-07-17 | Kabushiki Kaisha Toshiba | Appearance inspection apparatus, appearance inspection system, and appearance inspection appearance |
JP4888792B2 (ja) | 2008-07-18 | 2012-02-29 | Necアクセステクニカ株式会社 | 検査工程管理システム |
JP4913913B2 (ja) | 2010-04-28 | 2012-04-11 | 新日鉄ソリューションズ株式会社 | 情報処理システム、情報処理方法及びプログラム |
US9977496B2 (en) * | 2010-07-23 | 2018-05-22 | Telepatheye Inc. | Eye-wearable device user interface and augmented reality method |
JP5325267B2 (ja) * | 2011-07-14 | 2013-10-23 | 株式会社エヌ・ティ・ティ・ドコモ | オブジェクト表示装置、オブジェクト表示方法及びオブジェクト表示プログラム |
JP5929238B2 (ja) * | 2012-01-27 | 2016-06-01 | オムロン株式会社 | 画像検査方法および画像検査装置 |
DE102012101310C5 (de) * | 2012-02-17 | 2014-09-04 | Stephan Krebs | Vorrichtung und Verfahren zur Druckbildkontrolle |
US9723251B2 (en) * | 2013-04-23 | 2017-08-01 | Jaacob I. SLOTKY | Technique for image acquisition and management |
JP5824101B2 (ja) * | 2014-04-14 | 2015-11-25 | 鹿島建設株式会社 | 電気機器の検査装置 |
KR102145542B1 (ko) * | 2014-08-14 | 2020-08-18 | 삼성전자주식회사 | 촬영 장치, 복수의 촬영 장치를 이용하여 촬영하는 촬영 시스템 및 그 촬영 방법 |
-
2016
- 2016-06-27 MY MYPI2019000578A patent/MY201893A/en unknown
- 2016-06-27 CN CN201680046493.XA patent/CN107850550B/zh active Active
- 2016-06-27 US US15/753,119 patent/US10539509B2/en active Active
- 2016-06-27 MY MYPI2018000170A patent/MY188596A/en unknown
- 2016-06-27 WO PCT/JP2016/069055 patent/WO2017033561A1/ja active Application Filing
- 2016-06-27 JP JP2016567690A patent/JP6165362B1/ja active Active
-
2017
- 2017-06-19 JP JP2017119626A patent/JP6279131B2/ja active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000230806A (ja) * | 1999-02-09 | 2000-08-22 | Sony Corp | 位置認識装置、位置認識方法及び仮想画像立体合成装置 |
JP2003338700A (ja) * | 2002-05-21 | 2003-11-28 | Hitachi Giken Co Ltd | 工業製品のリペア支援装置 |
JP2004184095A (ja) * | 2002-11-29 | 2004-07-02 | Mitsubishi Heavy Ind Ltd | 検査支援装置 |
JP2007309729A (ja) * | 2006-05-17 | 2007-11-29 | Matsushita Electric Ind Co Ltd | プレス加工品の外観検査方法とそれに用いる外観検査装置 |
JP2009150866A (ja) * | 2007-11-29 | 2009-07-09 | Toshiba Corp | 外観検査装置、外観検査システム及び外観検査方法 |
JP2012007985A (ja) * | 2010-06-24 | 2012-01-12 | Nec Corp | 確認業務支援システム、サーバ装置、ヘッドマウントディスプレイ装置、ウェアラブル端末、確認業務支援方法およびプログラム |
JP2015001468A (ja) * | 2013-06-17 | 2015-01-05 | 株式会社松井製作所 | 成形品の検査装置 |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019095431A (ja) * | 2017-10-27 | 2019-06-20 | ファイファー バキユーム | トレーサガスによる被検査物の密封性検査用漏洩検知モジュール、および漏洩検知方法 |
WO2019139129A1 (ja) * | 2018-01-11 | 2019-07-18 | 株式会社デンソー | 設置位置情報提供装置及び設置位置情報提供方法 |
JP2019121322A (ja) * | 2018-01-11 | 2019-07-22 | 株式会社デンソー | 設置位置情報提供装置及び設置位置情報提供方法 |
JP2019200201A (ja) * | 2018-05-02 | 2019-11-21 | ファイファー バキユーム | トレーサガスによる被検査物の密封性を検査する漏洩検知モジュールおよび方法 |
WO2020066711A1 (ja) * | 2018-09-26 | 2020-04-02 | パナソニックIpマネジメント株式会社 | 内装材検査システム、及び、内装材検査方法 |
JPWO2020066711A1 (ja) * | 2018-09-26 | 2021-08-30 | パナソニックIpマネジメント株式会社 | 内装材検査システム、及び、内装材検査方法 |
JP7054869B2 (ja) | 2018-09-26 | 2022-04-15 | パナソニックIpマネジメント株式会社 | 内装材検査システム、及び、内装材検査方法 |
Also Published As
Publication number | Publication date |
---|---|
JP6279131B2 (ja) | 2018-02-14 |
JPWO2017033561A1 (ja) | 2017-08-24 |
US10539509B2 (en) | 2020-01-21 |
MY188596A (en) | 2021-12-22 |
JP2017182838A (ja) | 2017-10-05 |
US20180238810A1 (en) | 2018-08-23 |
MY201893A (en) | 2024-03-22 |
CN107850550A (zh) | 2018-03-27 |
JP6165362B1 (ja) | 2017-07-19 |
CN107850550B (zh) | 2020-11-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6279131B2 (ja) | 表示システム及び情報処理方法 | |
CN103558909B (zh) | 交互投射显示方法及交互投射显示系统 | |
US10095030B2 (en) | Shape recognition device, shape recognition program, and shape recognition method | |
EP3009833B1 (de) | In-Prozess Fehlerüberprüfung durch erweiterte Realität | |
JP6105180B1 (ja) | 作業支援装置、作業支援方法及びプログラム | |
US10032297B2 (en) | Simulation system, simulation device, and product explanation assistance method | |
WO2013035758A1 (ja) | 情報表示システム、情報表示方法、及び記憶媒体 | |
WO2017120363A1 (en) | Task management using augmented reality devices | |
CN105469379A (zh) | 视频目标区域遮挡方法和装置 | |
CN108986766A (zh) | 信息显示终端以及信息显示方法 | |
CN110678353B (zh) | 在vr眼镜中外部地显示车辆内部空间的拍摄图像 | |
JP7138499B2 (ja) | 作業支援システム、作業支援システム用のサーバ装置およびプログラム | |
JP7011569B2 (ja) | 熟練度判定システム | |
JP6710095B2 (ja) | 技術支援装置、方法、プログラムおよびシステム | |
US10460155B2 (en) | Facial identification techniques | |
US11455750B2 (en) | Operator characteristic-based visual overlays | |
JP6765846B2 (ja) | 情報処理装置、情報処理方法、およびプログラム | |
JP7094759B2 (ja) | システム、情報処理方法及びプログラム | |
JP7036559B2 (ja) | 杭位置検査装置及び杭位置検査方法 | |
JP6601894B1 (ja) | ウェアラブル検索システム、検索方法、及び、検索プログラム | |
US20200372722A1 (en) | Assistance method for assisting performance of a task on a product, comprising displaying a highlighting image highlighting a monitored part of the product | |
JP7059798B2 (ja) | サーバ、システム、方法、及びプログラム | |
JP2024027122A (ja) | 情報処理装置、情報処理方法、及びプログラム | |
JP2024019208A (ja) | 操作支援方法 | |
CN114341945A (zh) | 用于生成关于图像的至少一个被检查区域的信息的显微镜、控制电路、方法和计算机程序 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2016567690 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16838908 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15753119 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16838908 Country of ref document: EP Kind code of ref document: A1 |