JP6137092B2 - Display device, head-mounted display, display control device, display program, and display method - Google Patents

Display device, head-mounted display, display control device, display program, and display method

Info

Publication number
JP6137092B2
Authority
JP
Japan
Prior art keywords
video
magnification
setting data
video signal
display
Prior art date
Legal status
Active
Application number
JP2014187124A
Other languages
Japanese (ja)
Other versions
JP2016061809A (en)
Inventor
智己 片野
Original Assignee
ブラザー工業株式会社
Priority date
Filing date
Publication date
Application filed by ブラザー工業株式会社
Priority to JP2014187124A
Publication of JP2016061809A
Application granted
Publication of JP6137092B2
Application status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/10 Intensity circuits
    • G09G5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/64 Constructional details of receivers, e.g. cabinets or dust covers

Description

  The present invention relates to a display device, a head mounted display, a display control device, a display program, and a display method.

  Conventionally, a display device having an input terminal to which a video signal is input is known. The display device is, for example, a head mounted display (HMD). The HMD disclosed in Patent Document 1 includes a controller, a main cable, and a display unit. The controller includes a plurality of input terminals such as an A/V input terminal, an S-video input terminal, and an RGB input terminal. Different video devices are connected to the plurality of input terminals, respectively. Each of the plurality of input terminals receives a video signal generated by the connected video device. The display unit is connected to the controller via the main cable. The display unit receives a specific video signal, which is one of the video signals input to the plurality of input terminals, and displays the video.

JP 2001-238150 A

  In some video apparatuses, for example, a medical diagnostic apparatus or the like, the video apparatus itself includes a display unit, and a video signal displayed on the display unit can be output to the outside. In some cases, a real-time video acquired by the video device and a predetermined still image are displayed together on the display unit of the video device. The predetermined still image indicates, for example, a menu for operating the video apparatus or setting information of the video apparatus.

  When an HMD is connected to this type of video apparatus, the real-time video and the predetermined still image are displayed together on the display unit of the HMD. Since the predetermined still image is already shown on the display unit of the video apparatus, the user may wish to have only the real-time video cut out and displayed on the display unit of the HMD at a desired magnification. It is therefore considered advantageous for the HMD to display, at a desired magnification, a specific area corresponding to the real-time video specified by the user.

  However, different video devices may place the real-time video and the predetermined still image in different positions on their display units. In other words, with a conventional HMD, every time the input terminal receiving the specific video signal changed, the user had to designate which area of the video displayed on the display unit of the video device should be shown at the desired magnification.

  An object of the present invention is to provide a display device, a head mounted display, a display control device, a display program, and a display method capable of displaying a predetermined area of the video of a video signal input to each of a plurality of input terminals at a display magnification changed in correspondence with that input terminal, without requiring designation by the user.

A display device according to a first aspect of the present invention includes: a first input terminal to which a first video signal indicating a first video is input; a first video processing unit that processes the first video signal input to the first input terminal; a second input terminal to which a second video signal indicating a second video, in a format different from the first video signal, is input; a second video processing unit that processes the second video signal input to the second input terminal; a storage unit that stores first magnification setting data, which is setting data for changing the display magnification of a predetermined area of the first video, and second magnification setting data, which is setting data for changing the display magnification of a predetermined area of the second video; a control unit including selection means for selecting, as a specific video signal, either the first video signal processed by the first video processing unit or the second video signal processed by the second video processing unit, generation means for generating a magnification-change video signal, which is a video signal in which the display magnification of the predetermined area of the specific video (the video corresponding to the specific video signal) has been changed, in accordance with the specific magnification setting data, that is, whichever of the first magnification setting data and the second magnification setting data corresponds to the specific video, and output means for outputting the magnification-change video signal generated by the generation means; and a display unit that displays a video based on the magnification-change video signal output from the control unit. The storage unit stores a plurality of mutually different first magnification setting data and a plurality of mutually different second magnification setting data. The control unit further includes acquisition means for acquiring predetermined magnification setting data from whichever of the plurality of first magnification setting data and the plurality of second magnification setting data corresponds to the specific magnification setting data, and the generation means generates the magnification-change video signal in accordance with the predetermined magnification setting data acquired by the acquisition means.

According to the first aspect, when the selection means selects the video signal processed by the first video processing unit as the specific video signal, the generation means generates the magnification-change video signal according to the first magnification setting data. Similarly, when the selection means selects the video signal processed by the second video processing unit as the specific video signal, the generation means generates the magnification-change video signal according to the second magnification setting data. That is, regardless of whether the video signal selected by the selection means is the first video signal input to the first input terminal or the second video signal input to the second input terminal, the generation means can generate the magnification-change video signal according to either the first magnification setting data or the second magnification setting data. Therefore, the display device can display a predetermined area of the video of the video signal input to each of the plurality of input terminals at a display magnification changed in accordance with that input terminal, without requiring designation by the user. Further, since the storage unit stores a plurality of first magnification setting data, the generation means can generate the magnification-change video signal according to the video device connected to the first input terminal. Similarly, since the storage unit stores a plurality of second magnification setting data, the generation means can generate the magnification-change video signal according to the video device connected to the second input terminal. Therefore, the display device can display the predetermined area at a display magnification changed according to the video device connected to each input terminal.
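
As a concrete illustration of the flow described above, the following Python sketch models the storage unit as a small table of per-terminal magnification setting data and the generation means as a crop-and-scale step. All names (MagnificationSetting, SETTINGS, generate_scaled_frame) and the numeric values are hypothetical and are not taken from the patent.

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class MagnificationSetting:
        """Setting data for one input terminal: which area to cut out and how much to enlarge it."""
        x: int        # left edge of the predetermined area (pixels)
        y: int        # top edge of the predetermined area (pixels)
        width: int    # width of the predetermined area
        height: int   # height of the predetermined area
        scale: float  # display magnification applied to the cut-out area

    # One entry per input terminal, stored in advance (the role of the storage unit).
    SETTINGS = {
        "terminal_1": MagnificationSetting(x=0, y=60, width=640, height=360, scale=2.0),
        "terminal_2": MagnificationSetting(x=160, y=0, width=480, height=480, scale=1.5),
    }

    def generate_scaled_frame(frame: np.ndarray, selected_terminal: str) -> np.ndarray:
        """Cut out the predetermined area for the selected terminal and change its magnification."""
        s = SETTINGS[selected_terminal]
        region = frame[s.y:s.y + s.height, s.x:s.x + s.width]
        # Nearest-neighbour resize via index arrays; a real device would use a hardware scaler.
        out_h, out_w = int(s.height * s.scale), int(s.width * s.scale)
        rows = np.clip((np.arange(out_h) / s.scale).astype(int), 0, s.height - 1)
        cols = np.clip((np.arange(out_w) / s.scale).astype(int), 0, s.width - 1)
        return region[rows][:, cols]

Switching selected_terminal is all that is needed to switch which setting data is applied; the user never designates the area itself.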

A display device according to a second aspect of the present invention includes: a first input terminal to which a first video signal indicating a first video is input; a first video processing unit that processes the first video signal input to the first input terminal; a second input terminal to which a second video signal indicating a second video, in a format different from the first video signal, is input; a second video processing unit that processes the second video signal input to the second input terminal; a storage unit that stores first magnification setting data, which is setting data for changing the display magnification of a predetermined area of the first video, and second magnification setting data, which is setting data for changing the display magnification of a predetermined area of the second video; a control unit including selection means for selecting, as a specific video signal, either the first video signal processed by the first video processing unit or the second video signal processed by the second video processing unit, generation means for generating a magnification-change video signal, which is a video signal in which the display magnification of the predetermined area of the specific video (the video corresponding to the specific video signal) has been changed, in accordance with the specific magnification setting data, that is, whichever of the first magnification setting data and the second magnification setting data corresponds to the specific video, and output means for outputting the magnification-change video signal generated by the generation means; and a display unit that displays a video based on the magnification-change video signal output from the control unit. The storage unit stores the first magnification setting data and the second magnification setting data each in association with data indicating whether the predetermined area is to be mirror-inverted, and when data indicating that the predetermined area is to be mirror-inverted is associated with the specific magnification setting data, the generation means generates the magnification-change video signal with the predetermined area mirror-inverted. According to the second aspect, when the selection means selects the video signal processed by the first video processing unit as the specific video signal, the generation means generates the magnification-change video signal according to the first magnification setting data. Similarly, when the selection means selects the video signal processed by the second video processing unit as the specific video signal, the generation means generates the magnification-change video signal according to the second magnification setting data. That is, regardless of whether the video signal selected by the selection means is the first video signal input to the first input terminal or the second video signal input to the second input terminal, the generation means can generate the magnification-change video signal according to either the first magnification setting data or the second magnification setting data. Therefore, the display device can display a predetermined area of the video of the video signal input to each of the plurality of input terminals at a display magnification changed in accordance with that input terminal, without requiring designation by the user.
The generation means generates either a magnification-change video signal that has been mirror-inverted or one that has not. When the video displayed on the display unit is mirror-inverted, the displayed video is viewed by the user via a reflecting member such as a half mirror. On the other hand, when the video displayed on the display unit is not mirror-inverted, the displayed video is viewed by the user directly. The display device can therefore generate an appropriate magnification-change video signal regardless of whether the video displayed by the display unit is viewed through the reflecting member or viewed directly without it. As a result, the display device can be used in a wider variety of ways.
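
The mirror-inversion choice described here can be pictured as one extra flag consulted by the generation step. The sketch below, with the hypothetical flag name mirror, simply flips the magnification-changed frame horizontally when the associated data calls for it.

    import numpy as np

    def apply_mirror(scaled: np.ndarray, mirror: bool) -> np.ndarray:
        """Mirror-invert the magnification-changed frame horizontally when the setting data says so.

        A mirrored frame is intended for viewing via a reflecting member such as a half mirror;
        a non-mirrored frame is intended for direct viewing.
        """
        return scaled[:, ::-1] if mirror else scaled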

In the first aspect and the second aspect, the display unit may include: a first housing; an image light generation unit that is built into the first housing, generates image light of the video whose display magnification has been changed based on the magnification-change video signal, and emits the image light in a predetermined direction; a reflection member capable of reflecting the image light emitted by the image light generation unit in a direction that intersects the predetermined direction; and a holding portion that is provided in the first housing and detachably holds the reflection member. In this case, since the reflection member is detachable from the holding portion, the device can be selectively used, depending on whether the image light generated by the image light generation unit is mirror-inverted, either for applications in which the video whose display magnification has been changed is viewed directly or for applications in which it is viewed via the reflection member. Thus, depending on the application, the display device can let the user properly view the video whose display magnification has been changed as a video that is not mirror-inverted.

In the first aspect and the second aspect, there may further be provided a second housing containing the first video processing unit, the second video processing unit, the control unit including the output means, and the storage unit, and a cable that connects the first housing and the second housing and transmits the magnification-change video signal output by the output means to the image light generation unit. In this case, the display device can generate the magnification-change video signal with the elements arranged in the second housing. Therefore, the display device can simplify the mechanism of the display unit, which is configured separately from the control unit.

In the first aspect and the second aspect, the storage unit may store the first magnification setting data and the second magnification setting data each in association with a luminance correction amount for the predetermined area, and the generation means may generate the magnification-change video signal in which the luminance of the predetermined area has been corrected according to the luminance correction amount associated with the specific magnification setting data. In this case, the display unit displays a video whose luminance has been corrected according to the luminance correction amount associated with the magnification setting data and whose display magnification has been changed. The display device can thereby make the video displayed by the display unit easier for the user to view, in accordance with the video device connected to either the first input terminal or the second input terminal.
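
As a rough sketch of how a stored luminance correction amount could be applied to the cut-out area (the function name and the 8-bit clamp are assumptions, not taken from the patent):

    import numpy as np

    def apply_luminance_correction(region: np.ndarray, correction: int) -> np.ndarray:
        """Add the stored luminance correction amount to the cut-out area, clamping to the 8-bit range."""
        return np.clip(region.astype(np.int16) + correction, 0, 255).astype(np.uint8)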

In the first aspect, the storage unit may store the plurality of first magnification setting data together with a plurality of feature patterns of the first video, each corresponding to one of the plurality of first magnification setting data, and the plurality of second magnification setting data together with a plurality of feature patterns of the second video, each corresponding to one of the plurality of second magnification setting data, and the acquisition means may acquire, from whichever set corresponds to the specific magnification setting data, the predetermined magnification setting data corresponding to the feature pattern of the specific video. In this case, if the video device connected to the first input terminal differs, the feature pattern of the first video may differ. Similarly, if the video device connected to the second input terminal differs, the feature pattern of the second video may differ. Since a feature pattern of the first video is associated with each of the plurality of first magnification setting data, the acquisition means acquires appropriate predetermined magnification setting data according to the video device connected to the first input terminal. Similarly, the acquisition means acquires appropriate predetermined magnification setting data according to the video device connected to the second input terminal. Accordingly, the display device can display the predetermined area of the video of the video signal at a changed magnification, in accordance with the video device connected to either the first input terminal or the second input terminal, without the user designating the predetermined area.
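
One plausible way to realize the acquisition step, selecting the registered setting data whose stored feature pattern best matches the incoming specific video, is a simple template comparison such as the sketch below; the matching criterion, the threshold, and all names are illustrative assumptions rather than the patent's own method.

    import numpy as np

    def select_setting(frame, candidates, threshold=10.0):
        """Pick the magnification setting whose stored feature pattern best matches the frame.

        candidates: list of (pattern, setting) pairs, where pattern is a reference image
        patch registered for one video device and setting is its magnification setting data.
        Grayscale images are assumed for simplicity.
        """
        best_setting, best_err = None, float("inf")
        for pattern, setting in candidates:
            h, w = pattern.shape[:2]
            patch = frame[:h, :w]  # compare against a same-sized patch of the incoming video
            err = float(np.mean(np.abs(patch.astype(float) - pattern.astype(float))))
            if err < best_err:
                best_setting, best_err = setting, err
        return best_setting if best_err <= threshold else None  # None: no registered device matched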

  In the first aspect, each of the first video and the second video may include a real-time video display area in which the image changes at a predetermined frequency and a specific information display area in which predetermined specific information is displayed and in which the image changes at a frequency lower than the predetermined frequency; the feature patterns of the first video and the second video may each be an image pattern of at least a part of the specific information display area; and the acquisition means may acquire the predetermined magnification setting data corresponding to the feature pattern extracted from the specific information display area of the specific video. Generally, if the video device connected to either the first input terminal or the second input terminal differs, the feature pattern of the specific information display area of the specific video tends to differ. The display device can identify the video device connected to either the first input terminal or the second input terminal by extracting only the feature pattern of the specific information display area from the specific video. Therefore, compared with the case where a feature pattern is extracted from both the real-time video display area and the specific information display area of the specific video, the display device can shorten the time from when the video signal is input to when the video whose display magnification has been changed is displayed.
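
A simple way to separate the two areas before extracting the feature pattern, assumed here purely for illustration, is to measure how often each pixel changes over a few sampled frames: rarely changing pixels approximate the specific information display area, while frequently changing pixels approximate the real-time video display area.

    import numpy as np

    def low_change_mask(frames, change_threshold=8, frequency_threshold=0.1):
        """Return a boolean mask of pixels that change rarely across the sampled grayscale frames.

        The masked (rarely changing) pixels approximate the specific information display area,
        so only they would be used when extracting the feature pattern of the specific video.
        """
        stack = np.stack([f.astype(np.int16) for f in frames])         # (N, H, W)
        changed = np.abs(np.diff(stack, axis=0)) > change_threshold    # (N-1, H, W) change flags
        change_rate = changed.mean(axis=0)                             # fraction of frames each pixel changed
        return change_rate < frequency_threshold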

  In the first aspect, the real-time video display area may be an area including the center position of the first video or the second video, the specific information display area may lie outside the real-time video display area with respect to the center position, and the predetermined area may be the real-time video display area. In this case, the central part of the display area of the display unit of the video apparatus connected to either the first input terminal or the second input terminal is the real-time video display area. Since the area displayed with the changed display magnification is at the center of the specific video, the user can easily view it.

A head mounted display according to a third aspect of the present invention includes the display device according to the first aspect or the second aspect, a mounting tool having a first member extending in a first direction and a pair of second members extending from opposite sides of the first member in a second direction intersecting the first direction, and a connecting tool that connects the first member and the display unit. According to the third aspect, effects similar to those of the first aspect and the second aspect can be obtained.

A display control device according to a fourth aspect of the present invention includes: a first input terminal to which a first video signal indicating a first video is input; a first video processing unit that processes the first video signal input to the first input terminal; a second input terminal to which a second video signal indicating a second video, in a format different from the first video signal, is input; a second video processing unit that processes the second video signal input to the second input terminal; a storage unit that stores first magnification setting data, which is setting data for changing the display magnification of a predetermined area of the first video, and second magnification setting data, which is setting data for changing the display magnification of a predetermined area of the second video; and a control unit including selection means for selecting, as a specific video signal, either the first video signal processed by the first video processing unit or the second video signal processed by the second video processing unit, generation means for generating a magnification-change video signal, which is a video signal in which the display magnification of the predetermined area of the specific video (the video corresponding to the specific video signal) has been changed, in accordance with the specific magnification setting data, that is, whichever of the first magnification setting data and the second magnification setting data corresponds to the specific video, and output means for outputting the magnification-change video signal generated by the generation means to a display unit capable of displaying the video with the changed display magnification. The storage unit stores a plurality of mutually different first magnification setting data and a plurality of mutually different second magnification setting data. The control unit further includes acquisition means for acquiring predetermined magnification setting data from whichever of the plurality of first magnification setting data and the plurality of second magnification setting data corresponds to the specific magnification setting data, and the generation means generates the magnification-change video signal in accordance with the predetermined magnification setting data acquired by the acquisition means. According to the fourth aspect, effects similar to those of the first aspect can be obtained.
A display control device according to a fifth aspect of the present invention includes: a first input terminal to which a first video signal indicating a first video is input; a first video processing unit that processes the first video signal input to the first input terminal; a second input terminal to which a second video signal indicating a second video, in a format different from the first video signal, is input; a second video processing unit that processes the second video signal input to the second input terminal; a storage unit that stores first magnification setting data, which is setting data for changing the display magnification of a predetermined area of the first video, and second magnification setting data, which is setting data for changing the display magnification of a predetermined area of the second video; and a control unit including selection means for selecting, as a specific video signal, either the first video signal processed by the first video processing unit or the second video signal processed by the second video processing unit, generation means for generating a magnification-change video signal, which is a video signal in which the display magnification of the predetermined area of the specific video (the video corresponding to the specific video signal) has been changed, in accordance with the specific magnification setting data, that is, whichever of the first magnification setting data and the second magnification setting data corresponds to the specific video, and output means for outputting the magnification-change video signal generated by the generation means to a display unit capable of displaying the video with the changed display magnification. The storage unit stores the first magnification setting data and the second magnification setting data each in association with data indicating whether the predetermined area is to be mirror-inverted, and when data indicating that the predetermined area is to be mirror-inverted is associated with the specific magnification setting data, the generation means generates the magnification-change video signal with the predetermined area mirror-inverted. According to the fifth aspect, effects similar to those of the second aspect can be obtained.

A display program according to a sixth aspect of the present invention causes a computer of a display control device, the display control device including a first input terminal to which a first video signal indicating a first video is input, a first video processing unit that processes the first video signal input to the first input terminal, a second input terminal to which a second video signal indicating a second video, in a format different from the first video signal, is input, a second video processing unit that processes the second video signal input to the second input terminal, and a storage unit that stores first magnification setting data, which is setting data for changing the display magnification of a predetermined area of the first video, and second magnification setting data, which is setting data for changing the display magnification of a predetermined area of the second video, to execute: a selection step of selecting, as a specific video signal, either the first video signal processed by the first video processing unit or the second video signal processed by the second video processing unit; a generation step of generating a magnification-change video signal, which is a video signal in which the display magnification of the predetermined area of the specific video (the video corresponding to the specific video signal) has been changed, in accordance with the specific magnification setting data, that is, whichever of the first magnification setting data and the second magnification setting data corresponds to the specific video; and an output step of outputting the magnification-change video signal generated in the generation step to a display unit capable of displaying the video with the changed display magnification. The storage unit stores a plurality of mutually different first magnification setting data and a plurality of mutually different second magnification setting data, the computer is further caused to execute an acquisition step of acquiring predetermined magnification setting data from whichever of the plurality of first magnification setting data and the plurality of second magnification setting data corresponds to the specific magnification setting data, and in the generation step the magnification-change video signal is generated in accordance with the predetermined magnification setting data acquired in the acquisition step. According to the sixth aspect, effects similar to those of the first aspect can be obtained.
A display program according to a seventh aspect of the present invention causes a computer of a display control device, the display control device including a first input terminal to which a first video signal indicating a first video is input, a first video processing unit that processes the first video signal input to the first input terminal, a second input terminal to which a second video signal indicating a second video, in a format different from the first video signal, is input, a second video processing unit that processes the second video signal input to the second input terminal, and a storage unit that stores first magnification setting data, which is setting data for changing the display magnification of a predetermined area of the first video, and second magnification setting data, which is setting data for changing the display magnification of a predetermined area of the second video, to execute: a selection step of selecting, as a specific video signal, either the first video signal processed by the first video processing unit or the second video signal processed by the second video processing unit; a generation step of generating a magnification-change video signal, which is a video signal in which the display magnification of the predetermined area of the specific video (the video corresponding to the specific video signal) has been changed, in accordance with the specific magnification setting data, that is, whichever of the first magnification setting data and the second magnification setting data corresponds to the specific video; and an output step of outputting the magnification-change video signal generated in the generation step to a display unit capable of displaying the video with the changed display magnification. The storage unit stores the first magnification setting data and the second magnification setting data each in association with data indicating whether the predetermined area is to be mirror-inverted, and when data indicating that the predetermined area is to be mirror-inverted is associated with the specific magnification setting data, the magnification-change video signal is generated in the generation step with the predetermined area mirror-inverted. According to the seventh aspect, effects similar to those of the second aspect can be obtained.

A display method according to an eighth aspect of the present invention is executed by a display control device including a first input terminal to which a first video signal indicating a first video is input, a first video processing unit that processes the first video signal input to the first input terminal, a second input terminal to which a second video signal indicating a second video, in a format different from the first video signal, is input, a second video processing unit that processes the second video signal input to the second input terminal, and a storage unit that stores first magnification setting data, which is setting data for changing the display magnification of a predetermined area of the first video, and second magnification setting data, which is setting data for changing the display magnification of a predetermined area of the second video. The display method includes: a selection step of selecting, as a specific video signal, either the first video signal processed by the first video processing unit or the second video signal processed by the second video processing unit; a generation step of generating a magnification-change video signal, which is a video signal in which the display magnification of the predetermined area of the specific video (the video corresponding to the specific video signal) has been changed, in accordance with the specific magnification setting data, that is, whichever of the first magnification setting data and the second magnification setting data corresponds to the specific video; and an output step of outputting the magnification-change video signal generated in the generation step to a display unit capable of displaying the video with the changed display magnification. The storage unit stores a plurality of mutually different first magnification setting data and a plurality of mutually different second magnification setting data, the display method further includes an acquisition step of acquiring predetermined magnification setting data from whichever of the plurality of first magnification setting data and the plurality of second magnification setting data corresponds to the specific magnification setting data, and in the generation step the magnification-change video signal is generated in accordance with the predetermined magnification setting data acquired in the acquisition step. According to the eighth aspect, effects similar to those of the first aspect can be obtained.
A display method according to a ninth aspect of the present invention is executed by a display control device including a first input terminal to which a first video signal indicating a first video is input, a first video processing unit that processes the first video signal input to the first input terminal, a second input terminal to which a second video signal indicating a second video, in a format different from the first video signal, is input, a second video processing unit that processes the second video signal input to the second input terminal, and a storage unit that stores first magnification setting data, which is setting data for changing the display magnification of a predetermined area of the first video, and second magnification setting data, which is setting data for changing the display magnification of a predetermined area of the second video. The display method includes: a selection step of selecting, as a specific video signal, either the first video signal processed by the first video processing unit or the second video signal processed by the second video processing unit; a generation step of generating a magnification-change video signal, which is a video signal in which the display magnification of the predetermined area of the specific video (the video corresponding to the specific video signal) has been changed, in accordance with the specific magnification setting data, that is, whichever of the first magnification setting data and the second magnification setting data corresponds to the specific video; and an output step of outputting the magnification-change video signal generated in the generation step to a display unit capable of displaying the video with the changed display magnification. The storage unit stores the first magnification setting data and the second magnification setting data each in association with data indicating whether the predetermined area is to be mirror-inverted, and when data indicating that the predetermined area is to be mirror-inverted is associated with the specific magnification setting data, the magnification-change video signal is generated in the generation step with the predetermined area mirror-inverted. According to the ninth aspect, effects similar to those of the second aspect can be obtained.

FIG. 1 is a perspective view of the HMD 1. FIG. 2 is a perspective view of the HD 10. FIG. 3 is a perspective view of the housing 12. FIG. 4 is a perspective view of the deflection unit 59. FIG. 5 is a perspective view of the holder 5. FIG. 6 is a perspective view of the holder 5. FIG. 7 is a view showing a state in which the holder 5 and the half mirror 56 have been removed from the housing 12. FIG. 8 is a view showing a state in which the holder 5 and the half mirror 56 are mounted on the housing 12. FIG. 9 is a view showing a diagnostic video 76. FIG. 10 is an electrical block diagram of the HMD 1. FIG. 11 is a flowchart of a display magnification change pattern storage process. FIG. 12 is a data configuration diagram of a table 100. FIG. 13 is a flowchart of a display process. FIG. 14 is a flowchart of a display magnification change pattern determination process.

  Hereinafter, an embodiment of the present invention will be described with reference to the drawings. A head mounted display (hereinafter referred to as "HMD") 1 shown in FIG. 1 is an example of a display device including input terminals to which video signals are input. The HMD 1 includes input terminals 91 and 92. Each of the input terminals 91 and 92 can be connected to any video device that conforms to the applicable standard. The HMD 1 acquires a video signal indicating a video via either of the input terminals 91 and 92. Furthermore, the HMD 1 can generate, from the video of the acquired video signal, image light of a video in which the display magnification of a predetermined area has been changed, and allow the user to view that image light.

  The HMD 1 is, for example, an optically transmissive see-through HMD. Light from the scene in front of the user's eyes is guided directly to the user's eyes by passing through the half mirror 56. The projection format of the HMD 1 is a virtual image projection type. The half mirror 56 reflects image light generated by an image light generation unit 6 (see FIG. 10), described later, toward one of the user's eyes. The HMD 1 can thus allow the user to recognize an image superimposed on the scene in front of the user. The HMD 1 includes a mounting tool 8, a connection tool 9, a head display (hereinafter referred to as HD) 10, and a control device (hereinafter referred to as control box or CB) 40. The CB 40 controls the HD 10. Hereinafter, to aid understanding of the description of the figures, the upper side, lower side, left side, right side, front side, and rear side of the HMD 1 are defined. They correspond, for example, to the upper side, lower side, left side, right side, lower left side, and upper right side of FIG. 1, respectively.

<Mounting tool 8, connection tool 9, HD 10>
The mounting tool 8 is made of a flexible material such as resin or metal (for example, stainless steel). The mounting tool 8 includes a first member 8A and a pair of second members 8B. Each of the first member 8A and the pair of second members 8B is a curved, elongated plate-like member. The first member 8A extends in the left-right direction and is curved so as to be convex toward the front. One of the pair of second members 8B is provided at the end on one side (for example, the left side) of the first member 8A. The other of the pair of second members 8B is provided at the end on the other side (for example, the right side) of the first member 8A. Each of the pair of second members 8B extends so that their end portions on the side opposite to the side connected to the first member 8A (for example, the rear side) approach each other. In other words, the pair of second members 8B extend from the left and right sides of the first member 8A in directions intersecting the left-right direction. The mounting tool 8 is worn on the user's head with the first member 8A and the pair of second members 8B in contact with the user's forehead and the right and left sides of the user's head, respectively. In this state, the first member 8A extends in the left-right direction along the user's forehead.

  The connection tool 9 is rod-shaped and made of resin or metal. One end side (for example, the upper side) of the connection tool 9 is connected to the right part of the first member 8A. The connection tool 9 and the mounting tool 8 are connected by a ball joint. The HD 10, described later, is connected to the other end side (for example, the lower side) of the connection tool 9. The connection tool 9 and the HD 10 are connected by a ball joint. The connection tool 9 holds the HD 10 at a position separated from the mounting tool 8. With the mounting tool 8 worn on the user's head, the connection tool 9 can position the half mirror 56 of the HD 10 in front of the user's left eye.

  As shown in FIGS. 1 to 3, the HD 10 includes a housing 12, a deflection unit 59, and an image light generation unit 6 (see FIG. 10). The deflection unit 59 includes a holder 5 and a half mirror 56.

<Housing 12>
As shown in FIGS. 1 to 3, the housing 12 has a hollow box shape. The housing 12 includes a main body portion 12A and a protruding portion 12B. The main body 12A has a substantially rectangular parallelepiped shape with curved corners. The protruding portion 12B protrudes rearward from the right rear side of the main body 12A. The front side, the upper side, the lower side, the rear side, and the right side of the main body 12A are respectively referred to as a first housing 21, a second housing 22, a third housing 23 (see FIG. 1), a fourth housing 24 (see FIG. 3), and a fifth housing 25 (see FIG. 1). The front surface of the first housing 21, the upper surface of the second housing 22, the lower surface of the third housing 23, the rear surface of the fourth housing 24, and the right surface of the fifth housing 25 are referred to as a first surface 21M, a second surface 22M, a third surface 23M, a fourth surface 24M (see FIG. 3), and a fifth surface 25M (see FIG. 1), respectively. The left side of the housing 12 is open, and the left side of the image light generation unit 6 (described later) inside the housing 12 is exposed; the left side of the image light generation unit 6 is not covered by the housing 12.

  As shown in FIGS. 1 and 2, the vicinity of the portion where the first surface 21M and the second surface 22M are connected and the vicinity of the portion where the first surface 21M and the third surface 23M are connected are curved. As shown in FIG. 1, the vicinity of the portion where the first surface 21M and the fifth surface 25M are connected, the vicinity of the portion where the second surface 22M and the fifth surface 25M are connected, and the vicinity of the portion where the third surface 23M and the fifth surface 25M are connected are also curved.

  Hereinafter, as illustrated in FIG. 3, the portions of the first housing 21, the second housing 22, the third housing 23, and the fourth housing 24 that extend to the left of the image light generation unit 6 (see FIG. 10), described later, are referred to as a first extension portion 21A, a second extension portion 22A, a third extension portion 23A, and a fourth extension portion 24A, respectively. The first extension portion 21A corresponds to the portion extending between the left end of the image light generation unit 6 and the left end 21C of the first housing 21. The second extension portion 22A corresponds to the portion extending between the left end of the image light generation unit 6 and the left end 22C of the second housing 22. The third extension portion 23A corresponds to the portion extending between the left end of the image light generation unit 6 and the left end 23C of the third housing 23. The fourth extension portion 24A corresponds to the portion extending between the left end of the image light generation unit 6 and the left end 24C of the fourth housing 24. The rear surface of the first extension portion 21A, the lower surface of the second extension portion 22A, and the upper surface of the third extension portion 23A are referred to as a first back surface 21B, a second back surface 22B, and a third back surface 23B, respectively. The second back surface 22B and the third back surface 23B face each other.

  As shown in FIG. 2, the left end 21C of the first housing 21 is provided between the front end 22F of the second extending portion 22A and the front end 23F of the third extending portion 23A. The left end 21C is recessed rightward from both ends in the vertical direction toward the center in the vertical direction, and forms a substantially circular arc. The rightmost position of the left end 21C is disposed to the right of the leftmost positions of the left end 22C and the left end 23C. An opening 122 is defined in the portion of the housing 12 that is surrounded by the front end 22F of the second extending portion 22A, the front end 23F of the third extending portion 23A, and the left end 21C.

  As shown in FIG. 3, the left end 24C of the fourth housing 24 is provided between the rear end 22R of the second extending portion 22A and the rear end 23R of the third extending portion 23A. The left end 24C extends substantially linearly in the vertical direction. The leftmost position of the left end 24C is arranged to the right of the leftmost positions of the left ends 21C, 22C, and 23C. An opening 121 is defined in the portion of the housing 12 that is surrounded by the rear end 22R of the second extending portion 22A, the rear end 23R of the third extending portion 23A, and the left end 24C.

  As shown in FIG. 3, a front first regulating member 221A is provided on the front side of the second back surface 22B. The front first regulating member 221A is a plate-like member that extends downward from the second back surface 22B. A front first regulating member 231A (see FIG. 7) is provided on the front side of the third back surface 23B. The front first regulating member 231A is a plate-like member extending upward from the third back surface 23B. Both surfaces of the front first regulating members 221A and 231A face in the left-right direction. The positions in the left-right direction of the front first regulating members 221A and 231A are equal.

  A rear first regulating member 221B is provided on the rear side of the second back surface 22B. The rear first regulating member 221B is a plate-like member that extends downward along the rear end 22R of the second extending portion 22A. A rear first regulating member 231B is provided on the rear side of the third back surface 23B. The rear first regulating member 231B is a plate-like member that extends upward along the rear end 23R of the third extending portion 23A. Both surfaces of the rear first regulating members 221B and 231B face in the front-rear direction. The position in the left-right direction of the front first regulating members 221A and 231A is substantially equal to the position in the left-right direction of the right ends of the rear first regulating members 221B and 231B. The right end of the rear first regulating member 221B, the upper end of the left end 24C of the fourth housing 24, and the second back surface 22B form a groove 221C that is recessed upward. The right end of the rear first regulating member 231B, the lower end of the left end 24C of the fourth housing 24, and the third back surface 23B form a groove 231C that is recessed downward.

  A second regulating member 211 is provided on the left side of the first back surface 21B. The second regulating member 211 is a plate-like member extending rearward from a position including the center in the vertical direction of the first back surface 21B. Both surfaces of the second regulating member 211 face in the left-right direction. The second regulating member 211 is disposed to the right of the left-right positions of the front first regulating members 221A and 231A (see FIG. 7). The second regulating member 211 is also disposed to the right of the left-right position of the right end of each of the rear first regulating members 221B and 231B. The length in the left-right direction between the right ends of the front first regulating members 221A and 231A and of the rear first regulating members 221B and 231B, on the one hand, and the left end of the second regulating member 211, on the other, is equal to the length (thickness) in the left-right direction of the first end 51 (see FIG. 5) of a holder 5 described later.

  As shown in FIG. 3, a plurality of holes 24D and a plurality of holes 24E penetrating in the front-rear direction are provided in the fourth housing 24, on the right side of the center of the housing 12 in the left-right direction. The plurality of holes 24D are screw holes for attaching the image light generation unit 6, described later, to the inside of the housing 12. The plurality of holes 24E are screw holes for attaching the connection tool 9 to the housing 12. A hole 25D penetrating in the front-rear direction is provided at the rear end of the protruding portion 12B. The cable 7 (see FIG. 2) is connected to the hole 25D. The HD 10 is connected to the CB 40 via the cable 7.

  A housing engaging portion 241 is provided at the fourth extending portion 24A of the fourth housing 24, more specifically, at the left end 24C of the fourth housing 24. The housing engaging portion 241 is a plate-like member that extends leftward from a position including the center in the vertical direction of the left end 24C of the fourth housing 24.

<Image Light Generation Unit 6>
The image light generation unit 6 (see FIG. 10) is built into the housing 12. The image light generation unit 6 is a unit that generates image light corresponding to a video signal input via the cable 7 (see FIG. 2) and emits the image light. The image light generation unit 6 includes a control board, a liquid crystal display device, and a lens unit (not shown) in order from the right. The control board (not shown) includes a display control circuit on which an IC chip or the like is mounted. The display control circuit is connected to the cable 7 via a connection I/F controller 17 (see FIG. 10) provided in the housing 12 and receives a video signal transmitted via the cable 7.

  A liquid crystal display device (not shown) includes a liquid crystal panel, a glass substrate, and a light source. The liquid crystal panel is a well-known reflective liquid crystal panel having a rectangular shape, and both side surfaces face the left-right direction. The liquid crystal panel is connected to the control board via a flexible printed board (not shown). The liquid crystal panel can display a video corresponding to the video signal by receiving the control signal output from the display control circuit via the flexible printed circuit board. The glass substrate is provided on the left side surface of the liquid crystal panel and protects the display surface of the liquid crystal panel. The light source is provided at a position that emits light to the liquid crystal panel. The light from the light source enters the liquid crystal panel, and the liquid crystal panel reflects the incident light, thereby generating image light. The image light generated by the liquid crystal panel passes through the glass substrate to the left side.

  In the present invention, a two-dimensional display device such as a digital micromirror device (DMD) or an organic EL display may be used instead of the liquid crystal panel. Furthermore, a retinal scanning display, which projects two-dimensionally scanned light onto the user's retina, may be used.

  The lens unit is a unit that guides the image light emitted from the image light generation unit 6 to a half mirror 56 (see FIG. 2) disposed on the left side of the lens unit. The lens unit has a plurality of lenses arranged along the left-right direction. The image light generated by the image light generation unit 6 enters the plurality of lenses from the right side. The plurality of lenses refract image light and emit it to the left side.

<Deflection unit 59>
As shown in FIG. 4, the deflection unit 59 includes a half mirror 56 and the holder 5. The half mirror 56 is disposed on the left side of the image light generator 6. The half mirror 56 can reflect the image light emitted toward the left by the image light generator 6 in the backward direction intersecting the left direction. The user's eyes can visually recognize the virtual image based on the image light reflected to the rear side by the half mirror 56. Moreover, the half mirror 56 can transmit the external light incident from the front side to the rear side.

  The half mirror 56 has a rectangular plate shape. The half mirror 56 is configured, for example, by vapor-depositing a metal such as aluminum or silver on a transparent resin or glass substrate so as to have a predetermined reflectance (for example, 50%). The half mirror 56 is held by a holder 5 described later. One surface 56B of both surfaces of the half mirror 56 held by the holder 5 faces diagonally right rearward, and the other surface 56C faces diagonally left frontward. The half mirror 56 can reflect a part (for example, 50%) of light incident on each of the surfaces 56B and 56C and transmit the other part.

  In the present invention, instead of the half mirror 56 described above, a reflecting member capable of totally reflecting the image light incident on the surface 56B to the rear side may be used. Further, instead of the half mirror 56, an optical path deflecting member such as a prism or a diffraction grating may be used.

  The holder 5 is disposed on the left side of the lens unit and holds the half mirror 56. As shown in FIGS. 5 and 6, the holder 5 includes a foundation member 50. The foundation member 50 has a plate-like first end 51, second end 52, third end 53, and fourth end 54. Of the three pairs of parallel planes of the first end 51, the pair of planes having the largest area face in the left-right direction. The upper and lower corners of the front end of the first end 51 are curved. The second end 52 extends rearward from above the rear end of the first end 51. The third end 53 extends rearward from below the rear end of the first end 51. Of the three pairs of parallel planes of each of the second end 52 and the third end 53, the pair of planes having the largest area face in the vertical direction. The fourth end 54 spans between the rear end portions of the second end 52 and the third end 53. Of the three pairs of parallel planes of the fourth end 54, the pair of planes having the largest area face in the front-rear direction. The length of the fourth end 54 in the left-right direction is shorter than the length in the left-right direction of each of the second end 52 and the third end 53. The positions in the left-right direction of the right end surfaces of the first end 51, the second end 52, the third end 53, and the fourth end 54 coincide with each other and form a single plane. A hole 51B (see FIG. 6) is formed in the portion surrounded by the first end 51 to the fourth end 54. Hereinafter, as shown in FIG. 6, the right end surfaces of the first end 51, the second end 52, the third end 53, and the fourth end 54 are referred to as a first end surface 51A, a second end surface 52A, a third end surface 53A, and a fourth end surface 54A, respectively.

  As shown in FIG. 6, a hole 52C penetrating in the vertical direction is provided in the vicinity of the left front corner of the second end 52. A hole 53C penetrating in the vertical direction is provided in the vicinity of the left front corner of the third end 53. As shown in FIG. 4, an upper protruding portion 56A protruding upward from the upper end portion of the half mirror 56 is fitted into the hole 52C (see FIG. 6) from below. A lower protruding portion (not shown) protruding downward from the lower end portion of the half mirror 56 is fitted into the hole 53C (see FIG. 6) from above. The holder 5 holds the half mirror 56 by supporting the upper protruding portion 56A of the half mirror 56 through the hole 52C and supporting the lower protruding portion (not shown) of the half mirror 56 through the hole 53C. The half mirror 56 is disposed on the left side of the plane including the first end surface 51A to the fourth end surface 54A.

  As shown in FIGS. 5 and 6, a protruding portion 511 is provided at the rear end of the first end surface 51A of the first end 51. The protruding portion 511 protrudes rearward from a position including the center in the vertical direction. A protruding portion 521 that protrudes downward is provided at the front end of the lower surface of the second end 52. A protruding portion 531 that protrudes upward is provided at the front end of the upper surface of the third end 53. The length in the left-right direction between the left end of the protruding portion 511 and the right ends of the protruding portions 521 and 531 is substantially equal to the thickness of a transparent member 58 described later. As shown in FIG. 8, a protruding portion 543 is provided at the left end of the front surface of the fourth end 54, in other words, at the end opposite to the fourth end surface 54A. The protruding portion 543 protrudes forward from a position including the vertical center. As shown in FIG. 5, protruding portions 544A and 544B are provided on the front end surface of the fourth end 54, to the right of its left end. The protruding portion 544A protrudes forward from a position above the vertical center of the fourth end surface 54A. The protruding portion 544B protrudes forward from a position below the vertical center of the fourth end surface 54A. The length in the left-right direction between the right end of the protruding portion 543 and the left ends of the protruding portions 544A and 544B is substantially equal to the thickness of the transparent member 58 described later.

  As shown in FIGS. 5 and 6, a holder protrusion 52B is provided on the upper surface of the portion of the second end portion 52 that is connected to the fourth end portion 54. The holder protrusion 52B includes extending portions 521B and 522B. The extending portion 521B is a plate-like portion that protrudes leftward with respect to the second end surface 52A (see FIG. 6). The length of the extending portion 521B in the left-right direction is equal to the length in the left-right direction between the upper end of the left end 24C (see FIG. 3) of the fourth extending portion 24A and the right end of the rear first regulating member 221B (see FIG. 3) (hereinafter also referred to as "the length of the groove portion 221C (see FIG. 3) in the left-right direction"). The extending portion 522B is a plate-like portion that extends forward from the left end of the extending portion 521B. A holder protrusion 53B is provided on the lower surface of the portion of the third end portion 53 that is connected to the fourth end portion 54. The holder protrusion 53B includes extending portions 531B and 532B. The extending portion 531B is a plate-like portion that protrudes leftward with respect to the third end surface 53A. The length of the extending portion 531B in the left-right direction is equal to the length in the left-right direction between the lower end of the left end 24C of the fourth extending portion 24A and the right end of the rear first regulating member 231B (see FIG. 3) (hereinafter also referred to as "the length of the groove portion 231C (see FIG. 3) in the left-right direction"). The extending portion 532B is a plate-like portion that extends forward from the left end of the extending portion 531B.

  As shown in FIG. 6, a first holder engaging portion 541A and second holder engaging portions 542A and 542B are provided on the fourth end surface 54A of the fourth end portion 54. The first holder engaging portion 541A protrudes to the right from a position at the vertical center of the fourth end surface 54A and forward of the front-rear center. The second holder engaging portion 542A protrudes to the right from a position above the vertical center of the fourth end surface 54A and rearward of the front-rear center. The second holder engaging portion 542B protrudes to the right from a position below the vertical center of the fourth end surface 54A and rearward of the front-rear center.

  The deflection unit 59 configured as described above is detachably supported by the housing 12. FIG. 7 shows a state before the deflection unit 59 is attached to the housing 12. The deflection unit 59 is attached to the housing 12 by moving the holder 5 from the rear side to the front side with respect to the opening portion 121. In the process of mounting the deflection unit 59 on the housing 12, the holder protrusion 52B of the holder 5 enters the groove portion 221C of the second extending portion 22A from the rear side, and the holder protrusion 53B of the holder 5 enters the groove portion 231C (see FIG. 3) of the third extending portion 23A from the rear side. The second end portion 52 of the holder 5 moves parallel to the second back surface 22B (see FIG. 3) of the second extending portion 22A of the housing 12, below the second back surface 22B. The third end portion 53 of the holder 5 moves parallel to the third back surface 23B (see FIG. 3) of the third extending portion 23A of the housing 12, above the third back surface 23B.

  In the process in which the deflection unit 59 is attached to the housing 12, the first holder engaging portion 541A (see FIG. 6) approaches, from the rear side, the housing engaging portion 241 (see FIG. 3) provided at the left end 24C of the fourth housing 24. When further force is applied to the holder 5 toward the front side, the fourth end portion 54 bends to the left side, so that the first holder engaging portion 541A gets over the left side of the housing engaging portion 241 and moves to the front side of the housing engaging portion 241.

  FIG. 8 shows a state in which the deflection unit 59 is mounted on the housing 12. The front end of the first end portion 51 of the holder 5 comes into contact with the vicinity of the left end 21C of the first back surface 21B (see FIG. 3) of the first extending portion 21A of the first housing 21 from the rear side. Thereby, the forward movement of the holder 5 is restricted. The deflection unit 59 is disposed in a region covered in the vertical direction by the second extending portion 22A of the second housing 22 and the third extending portion 23A of the third housing 23.

  The left side surface of the second regulating member 211 (see FIG. 7) is in contact with the first end surface 51A (see FIG. 6) of the first end portion 51 of the holder 5 at a position including the center in the vertical direction. The right side surface of the front first regulating member 221A (see FIG. 3) contacts an upper position of the left end surface of the first end portion 51 of the holder 5. The right side surface of the front first regulating member 231A (see FIG. 7) contacts a lower position of the left end surface of the first end portion 51 of the holder 5.

  The length between the right end of each of the front first regulating members 221A and 231A and the left end of the second regulating member 211 is equal to the length (thickness) in the left-right direction of the first end portion 51 of the holder 5. Therefore, in a state where the deflection unit 59 is mounted on the housing 12, the right end surfaces of the front first regulating members 221A and 231A and the left end surface of the second regulating member 211 restrict the movement of the front side of the holder 5 in the left-right direction.

  The extension 521B of the holder protrusion 52B (see FIG. 5) fits into the groove 221C. The right end of the extending portion 521B, that is, the rear end portion of the second end surface 52A is in contact with the upper end portion of the left end 24C of the fourth extending portion 24A constituting the groove portion 221C. The left end of the extending portion 521B is in contact with the right end of the rear first restriction member 221B. The extending portion 531B of the holder protruding portion 53B (see FIG. 5) fits into the groove portion 231C. The right end of the extending portion 531B, that is, the rear end portion of the third end surface 53A is in contact with the lower end portion of the left end 24C of the fourth extending portion 24A constituting the groove portion 231C. The left end of the extending portion 531B is in contact with the right end of the rear first restriction member 231B.

  The upper surface of the second end portion 52 (see FIG. 5) of the holder 5 faces the second back surface 22B (see FIG. 3) of the second extending portion 22A of the housing 12 from below, separated from it by the amount by which the holder protrusion 52B protrudes upward from the second end portion 52. The lower surface of the third end portion 53 (see FIG. 5) of the holder 5 faces the third back surface 23B (see FIG. 3) of the third extending portion 23A of the housing 12 from above, separated from it by the amount by which the holder protrusion 53B protrudes downward from the third end portion 53.

  Note that the length of the extending portion 521B in the left-right direction is equal to the length of the groove portion 221C in the left-right direction. Further, the length of the extending portion 531B in the left-right direction is equal to the length of the groove portion 231C in the left-right direction. Therefore, in a state where the deflection unit 59 is attached to the housing 12, the right end of the rear first regulating member 221B constituting the groove portion 221C and the upper end of the left end 24C of the fourth extending portion 24A restrict the left-right movement of the rear side of the holder 5. Similarly, in a state where the deflection unit 59 is mounted on the housing 12, the right end of the rear first regulating member 231B constituting the groove portion 231C and the lower end of the left end 24C of the fourth extending portion 24A restrict the left-right movement of the rear side of the holder 5.

  In a state where the deflection unit 59 is mounted on the housing 12, the front side surface of the housing engaging portion 241 (see FIG. 3) is in contact with the rear side surface of the first holder engaging portion 541A (see FIG. 6). The rear side surface of the housing engaging portion 241 contacts the front side surface of each of the second holder engaging portions 542A and 542B. Thus, the first holder engaging portion 541A and the second holder engaging portions 542A and 542B engage with the housing engaging portion 241. Therefore, the movement of the holder 5 in the front-rear direction is restricted by the housing engaging portion 241 and the state where the deflection unit 59 is attached to the housing 12 is maintained.

  On the left side of the left end 24C of the fourth housing 24, the fourth end portion 54 (see FIG. 5) of the holder 5 is disposed. The holder 5 holds the half mirror 56 on the left side of the fourth end surface 54A of the fourth end portion 54. Accordingly, the opening 121 (see FIG. 3) formed on the left side of the left end 24C is disposed behind the half mirror 56. Further, the front end of the first end portion 51 of the holder 5 contacts the vicinity of the left end 21C of the first extending portion 21A of the first housing 21. Accordingly, the opening 122 (see FIG. 3) formed on the left side of the left end 21C is disposed in front of the half mirror 56.

<CB40>
The outline of the CB 40 will be described with reference to FIG. The CB 40 is attached to, for example, the user's waist belt or arm. Each of the input terminals 91 and 92 included in the CB 40 can be connected to any video device that conforms to its standard. The video device is a well-known AV device that can output a video signal, and examples thereof include a diagnostic device, a PC, a video receiver, and a video camera. In the present embodiment, the input terminal 91 is connected to any of diagnostic devices 71A to 71C (see FIG. 10) described later. The input terminal 92 is connected to any of diagnostic devices 72A to 72C (see FIG. 10) described later. The CB 40 is detachably connected to the HD 10 via the cable 7. The CB 40 selects one of the video signals input to the input terminals 91 and 92 (hereinafter referred to as a specific video signal). Further, the CB 40 outputs, to the HD 10 via the cable 7, a video signal in which the display magnification of a predetermined area of the video corresponding to the specific video signal is changed. Hereinafter, the video of the specific video signal is referred to as a specific video, and the predetermined area of the specific video whose display magnification is changed is referred to as a change target area.

  The CB 40 includes a casing 63, an operation unit 96, an LED 98, input terminals 91 and 92, and the like as main components. The shape of the housing 63 is a substantially rectangular parallelepiped with rounded edges. The operation unit 96 is a switch for performing various settings in the HD 10, various operations during use, and power on / off. The LED 98 is a light emitting element for notifying the user of the state of the HMD 1.

  Input terminals 91 and 92 are terminals for receiving video signals, respectively. Cables 71 and 72 for communicating video signals are detachably connected to the input terminals 91 and 92, respectively. The video signal input to the input terminal 91 via the cable 71 is a known analog signal including a synchronization signal and a luminance signal, for example. The analog signal is, for example, a composite video signal or a component video signal. As the input terminal 91, for example, a composite terminal, a component terminal, an S terminal, and a D terminal are used. The video signal input to the input terminal 92 via the cable 72 is, for example, a well-known digital signal indicating the RGB values of a plurality of pixels included in each of a plurality of image frames. The digital signal is, for example, a DVI video signal or an HDMI (registered trademark) video signal. As the input terminal 92, for example, a DVI terminal or an HDMI (registered trademark) terminal is used.

<Diagnostic devices 71A-71C, 72A-72C>
An outline of the diagnostic devices 71A to 71C and 72A to 72C will be described. Each of the diagnostic devices 71A to 71C and 72A to 72C is a known ultrasonic diagnostic apparatus that can detect an echo by irradiating an object with ultrasonic waves and visualize the detected echo. The object is, for example, a fetus of a pregnant woman in obstetrics or a blood vessel of a patient in puncture. The diagnostic devices 71A to 71C and 72A to 72C are diagnostic devices of different models. A monitor is connected to each of the diagnostic devices 71A to 71C and 72A to 72C. On each monitor, a diagnostic video imaged by the corresponding one of the diagnostic devices 71A to 71C and 72A to 72C is displayed. The user makes a diagnosis while viewing the video displayed on the monitor.

  Each of the diagnostic devices 71A to 71C outputs a video signal of its video to the input terminal 91 in an analog format. Each of the diagnostic devices 72A to 72C outputs a video signal of its video to the input terminal 92 in a digital format. In the present embodiment, the CB 40 acquires a video signal input from any of the diagnostic devices 71A to 71C and 72A to 72C as the specific video signal. Note that the diagnostic device connected to the input terminals 91 and 92 is not limited to an ultrasonic diagnostic apparatus. For example, any diagnostic device such as a magnetic resonance imaging (MRI) apparatus, an X-ray computed tomography (CT) apparatus, an optical coherence tomography (OCT) apparatus, or an endoscope apparatus may be used.

  With reference to FIG. 9, the diagnostic video that each of the diagnostic devices 71A to 71C and 72A to 72C displays on its monitor will be described. Since the diagnostic devices 71A to 71C and 72A to 72C are ultrasonic diagnostic apparatuses of different models, the layouts of the diagnostic videos displayed on the monitors are different from each other.

  First, an example of the diagnostic video 76 of the diagnostic device 71A will be described. Hereinafter, the horizontal direction of the diagnostic video 76 is referred to as the X-axis direction, and the vertical direction is referred to as the Y-axis direction. The diagnostic video 76 is a video imaged by the diagnostic device 71A. The diagnostic video 76 is updated at an update cycle corresponding to a predetermined frame rate (for example, 1/60 seconds). The diagnostic video 76 includes a real-time video display area 77 and a specific information display area 78. The real-time video display area 77 is a rectangular area including the approximate center position of the diagnostic video 76. In the real-time video display area 77, a black-and-white video of the diagnostic object, which visualizes the ultrasonic echoes, is displayed. Because the image of the diagnostic video 76 is updated with an update cycle of 1/60 seconds, the image in the real-time video display area 77 changes in real time in accordance with the movement of the diagnostic object. In other words, in the real-time video display area 77, the image changes according to the movement of the diagnostic object at a frequency equal to or lower than the frequency corresponding to the 1/60-second update cycle. Note that the real-time video display area may include an area where the image does not change temporarily even when the image of the diagnostic video 76 is updated. Further, the image change in the real-time video display area 77 may not be periodic.

  The specific information display area 78 is the part of the diagnostic video 76 excluding the real-time video display area 77. The specific information display area 78 is arranged, for example, on the left side, the right side, and the lower side of the real-time video display area 77. In other words, the specific information display area 78 is arranged outside the real-time video display area 77 with respect to the center position of the diagnostic video 76. Predetermined specific information is displayed in the specific information display area 78. In the specific information display area 78, for example, a diagnosis result list, a diagnosis method selection button, a setting condition input button, a diagnosis start button and an end button, a heart rate, a diagnosis date and time, and the like are displayed. In the present embodiment, as the image of the diagnostic video 76 is updated, the images of the regions indicating the heart rate and the diagnosis date and time in the specific information display area 78 change at a frequency lower than the update frequency of the diagnostic video 76. For example, the image of the area indicating the diagnosis date and time changes at intervals of one second. The images in the other areas do not change, or the frequency of their changes is extremely low. In other words, the frequency at which the image in the specific information display area 78 changes is lower than the frequency at which the image in the real-time video display area 77 changes as the diagnostic video 76 is updated. The layout of the real-time video display area 77 and the specific information display area 78 in the diagnostic video 76 is not limited to the example of FIG. 9.

  The diagnostic videos of the diagnostic devices 71B, 71C, and 72A to 72C each include a real-time video display area and a specific information display area that differ from those of the diagnostic video 76 (not shown). In the present embodiment, each real-time video display area is arranged at a position that includes the approximate center position of the corresponding diagnostic video. The aspect ratio of each real-time video display area is different from the aspect ratio of the real-time video display area 77. The aspect ratio is the ratio between the length in the X-axis direction and the length in the Y-axis direction of an image. Each specific information display area is arranged on at least either the right side or the left side of the diagnostic video. The specific information display area of each diagnostic device has a layout different from that of the specific information display area 78. In the present embodiment, in each of the diagnostic devices 71A to 71C and 72A to 72C, an area including at least a part of the real-time video display area is the change target area.

<Electrical configuration of HMD1>
The electrical configuration of the HMD 1 will be described with reference to FIG. 10. First, the electrical configuration of the CB 40 will be described. The CB 40 includes a CPU 31. The CPU 31 is electrically connected to a flash ROM 32, a RAM 33, a peripheral interface (hereinafter referred to as a peripheral I / F) 57, a power supply circuit 79, a first decoder 81, a second decoder 82, a selector 83, an input buffer 84, a first image processing unit 101, a second image processing unit 102, an output buffer 85, an output control unit 86, and a connection interface controller (hereinafter referred to as a connection I / F controller) 87. The flash ROM 32, the RAM 33, the peripheral I / F 57, the power supply circuit 79, the first decoder 81, the second decoder 82, the selector 83, the input buffer 84, the first image processing unit 101, the second image processing unit 102, the output buffer 85, the output control unit 86, and the connection I / F controller 87 are each mounted on a substrate (not shown) built in the housing 63. The first decoder 81 and the second decoder 82 are each electrically connected to the selector 83. The selector 83 is connected to the input buffer 84. The input buffer 84 is electrically connected to the first image processing unit 101 and the second image processing unit 102. The output buffer 85 is electrically connected to the second image processing unit 102 and the output control unit 86. The connection I / F controller 87 is electrically connected to the output control unit 86. The input buffer 84, the first image processing unit 101, the second image processing unit 102, the output buffer 85, and the output control unit 86 are mounted as a single ASIC on the substrate built in the housing 63.

  The flash ROM 32 stores a program for the CPU 31 to execute a later-described display magnification change pattern storage process (see FIG. 11), display process (see FIG. 13), display magnification change pattern determination process (see FIG. 14), and the like. The program is stored in the flash ROM 32 when the HMD 1 is shipped. The flash ROM 32 also stores the table 100 (see FIG. 12) described later. The RAM 33 temporarily stores various data. The first decoder 81 has the input terminal 91. The first decoder 81 is a video processing unit that decodes an analog video signal input via the input terminal 91. The second decoder 82 has the input terminal 92. The second decoder 82 is a video processing unit that decodes a digital video signal input via the input terminal 92. The selector 83 selects and accepts the video signal processed by either the first decoder 81 or the second decoder 82 as the specific video signal. The input buffer 84 stores the data of the specific video signal accepted by the selector 83. In the present embodiment, the first decoder 81 and the second decoder 82 are separate image processing ICs conforming to the standards of the input terminals 91 and 92, respectively.

  The first image processing unit 101 is an ASIC that can determine the change target area in the video (the specific video) corresponding to the specific video signal stored in the input buffer 84. The first image processing unit 101 determines the change target area in accordance with the user's operation of the operation unit 96. The area determined by the first image processing unit 101 is stored in the flash ROM 32. The second image processing unit 102 acquires the specific video signal by referring to the input buffer 84, and acquires the change target area determined by the first image processing unit 101 and the later-described table 100 by referring to the flash ROM 32. Based on these, the second image processing unit 102 generates a magnification change video signal, which is a signal of a video in which the display magnification of the change target area is changed (hereinafter referred to as a magnification change video). Note that the second image processing unit 102 can also output the specific video signal stored in the input buffer 84 to the output buffer 85 as it is. The output buffer 85 stores the video signal output from the second image processing unit 102. The output control unit 86 acquires the video signal stored in the output buffer 85 and transmits it to the connection I / F controller 87. The connection I / F controller 87 serially converts the acquired video signal and transmits it to the HD 10 via the cable 7. The connection I / F controller 87 is an IC circuit that performs encoding (for example, serialization) in accordance with a predetermined video standard such as DVI, HDMI (registered trademark), or V-by-One (registered trademark).

  The electrical configuration of the HD 10 will be described. The HD 10 includes a connection interface controller (hereinafter referred to as a connection I / F controller) 17 and an image light generation unit 6. The connection I / F controller 17 and the image light generation unit 6 are electrically connected. The connection I / F controller 17 receives the video signal transmitted from the connection I / F controller 87 via the cable 7, further performs parallel conversion, and outputs the video signal to the image light generation unit 6. The connection I / F controller 17 is an IC circuit that performs decoding (for example, deserialization) in accordance with a predetermined video standard such as DVI, HDMI (registered trademark), V-by-One (registered trademark), or the like. The image light generation unit 6 generates image light based on the input video signal.

  The HMD 1 may further include a wireless communication unit (not shown). The CPU 31 may download a program from a program download server via the wireless communication unit and store the program in the flash ROM 32. Further, the HMD 1 may further include a CPU (not shown) different from the CPU 31 in the HD 10. In this case, the CPU of the HD 10 executes part of the processing executed by the CPU 31 instead.

<Display magnification change pattern storage processing>
The display magnification change pattern storage process executed by the CPU 31 of the CB 40 will be described with reference to FIG. 11. The display magnification change pattern storage process is a process in which the CPU 31 stores the change target area desired by the user, the display magnification desired by the user, and the like in the flash ROM 32. The user connects one of the cables 71 and 72 to one of the input terminals 91 and 92. Any of the diagnostic devices 71A to 71C is connected to the cable 71. Any of the diagnostic devices 72A to 72C is connected to the cable 72. The user wears the HMD 1 to which the deflection unit 59 is attached and turns on the power of the HMD 1. Further, the user sets, via the operation unit 96, a mode in which the CPU 31 executes the display magnification change pattern storage process. The CPU 31 starts processing based on the program stored in the flash ROM 32. Thereby, the display magnification change pattern storage process is executed. In the following description, a case where the diagnostic device 71A is connected to the input terminal 91 via the cable 71 and the diagnostic device 72A is connected to the input terminal 92 via the cable 72 will be described as an example.

  The CPU 31 determines whether a video signal is input (S11). The CPU 31 determines whether a video signal is input based on whether at least one of the first decoder 81 and the second decoder 82 receives a video signal. The CPU 31 stands by until a video signal is input (S11: NO). When the CPU 31 determines that a video signal is input (S11: YES), it determines whether video signals are input to both of the input terminals 91 and 92 (S13). The CPU 31 stands by until a video signal is input to only one of the input terminals 91 and 92 (S13: YES). For example, if the user removes the cable 72 connected to the diagnostic device 72A from the input terminal 92, a video signal is input only to the input terminal 91. When the CPU 31 determines that a video signal is input to only one of the input terminals 91 and 92 (S13: NO), the CPU 31 specifies the input terminal to which the video signal is input (S15). The CPU 31 causes the selector 83 to select the decoder corresponding to the input terminal specified in S15 out of the first decoder 81 and the second decoder 82. In this example, the selector 83 selects and accepts the video signal decoded by the first decoder 81 as the specific video signal. The specific video signal accepted by the selector 83 is stored in the input buffer 84.

  The CPU 31 causes the image light generation unit 6 to display the specific video of the specific video signal selected by the selector 83 (S17). For example, the CPU 31 transmits a signal to the second image processing unit 102, the output control unit 86, and the connection I / F controller 87. Accordingly, the second image processing unit 102, the output control unit 86, and the connection I / F controller 87 execute the following processing. The second image processing unit 102 acquires the specific video signal stored in the input buffer 84 and performs mirror image inversion processing. The mirror image inversion processing is achieved, for example, by reversing the arrangement order of the pixels in each row (that is, in the X-axis direction) of each frame indicated by the specific video signal. That is, the mirror image inversion is performed in the horizontal direction of the image. The second image processing unit 102 outputs the specific video signal of the mirror-inverted specific video to the output buffer 85. The output buffer 85 stores the mirror-inverted specific video signal output from the second image processing unit 102. The output control unit 86 acquires the specific video signal stored in the output buffer 85 and outputs it to the connection I / F controller 87. The connection I / F controller 87 serially converts the specific video signal and then transmits it to the connection I / F controller 17 via the cable 7. The connection I / F controller 17 converts the serially converted specific video signal back to parallel and outputs it to the image light generation unit 6. The image light generation unit 6 generates image light of the mirror-inverted diagnostic video 76 and emits the image light toward the half mirror 56. The user visually recognizes the diagnostic video 76 of the diagnostic device 71A as the specific video via the half mirror 56.
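
  The mirror image inversion performed in S17 amounts to reversing the pixel order of every row. A minimal sketch is shown below; it assumes the frame is held as a NumPy array, which is only an illustrative representation and not part of the embodiment (the actual processing is performed by the second image processing unit 102).

```python
import numpy as np

def mirror_invert(frame: np.ndarray) -> np.ndarray:
    """Reverse the pixel order of each row (the X-axis direction),
    producing a horizontally mirror-inverted frame."""
    # Works for grayscale (H, W) and color (H, W, C) frames alike:
    # reversing along axis 1 flips the image left-right.
    return frame[:, ::-1]
```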

  The CPU 31 acquires the feature pattern of the specific information display area 78 of the diagnostic video 76 (S19). The feature pattern of the specific information display area 78 is an image pattern of the specific information display area. When the CPU 31 transmits a signal to the first image processing unit 101, the first image processing unit 101 executes a process of acquiring the feature patterns of the specific information display area 78 at the upper right and upper left of the diagnostic video 76. For example, the first image processing unit 101 acquires pixel data of the diagnostic video 76 in order along the X-axis direction from the pixel at the upper left end of the diagnostic video 76 (hereinafter referred to as the first pixel). The first image processing unit 101 acquires the coordinates of points at which the sequentially acquired pixel data is discontinuous. A discontinuous point is a point at which the difference in value between adjacent pixels is larger than a predetermined threshold, for example, where a white pixel changes to a black pixel. After that, the first image processing unit 101 acquires pixel data along the X-axis direction from the pixel one dot below the first pixel in the same manner as described above, and acquires the coordinates of discontinuous points. As a result, the first image processing unit 101 acquires the feature pattern of the specific information display area 78 at the upper left of the diagnostic video 76. That is, the feature pattern is data in a predetermined format indicating a set of coordinate points at which the pixel data is discontinuous. Similarly, the first image processing unit 101 acquires the feature pattern of the specific information display area 78 at the upper right of the diagnostic video 76. Note that the above-described processing performed by the first image processing unit 101 may instead be executed by the CPU 31 by executing a program stored in the flash ROM 32.
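
  The acquisition of a feature pattern in S19 can be sketched as follows. The sketch assumes a grayscale (luminance) representation of the scanned corner region and an illustrative threshold value; the embodiment only states that a predetermined threshold is used and that the processing is performed by the first image processing unit 101.

```python
import numpy as np

def feature_pattern(region: np.ndarray, threshold: int = 128):
    """Scan the region row by row along the X-axis direction and collect
    the (x, y) coordinates of discontinuous points, i.e. points where
    adjacent pixel values differ by more than the threshold."""
    points = []
    height, width = region.shape
    for y in range(height):                      # one row at a time
        row = region[y].astype(int)
        for x in range(1, width):                # left to right
            if abs(row[x] - row[x - 1]) > threshold:
                points.append((x, y))
    return points                                # the feature pattern
```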

  The CPU 31 determines whether a change target area has been selected (S21). For example, the user selects an area including at least a part of the real-time video display area 77 in the diagnostic video 76 as a desired change target area via the operation unit 96. The aspect ratio of the change target area selected by the user via the operation unit 96 is set to be the same as the aspect ratio of the image light generated by the image light generation unit 6.

  The CPU 31 acquires the coordinates of the center position of the change target area and the magnification to be used when the image light generation unit 6 displays the change target area (S23). The aspect ratio of the change target area selected by the user is the same as the aspect ratio of the image light displayed by the image light generation unit 6. Therefore, the magnification is obtained by dividing the diagonal distance of the image light of the image light generation unit 6 by the diagonal distance of the selected change target area. In the present embodiment, the diagonal distance of the change target area is half of the diagonal distance of the image light of the image light generation unit 6. For example, the CPU 31 acquires (500, 500) as the coordinates of the center position of the change target area and acquires a magnification of 2.
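
  The magnification computed in S23 is simply the ratio of the two diagonal distances. The sketch below illustrates this; the concrete resolution used in the example is an assumption, since the embodiment only specifies that the diagonal of the change target area is half that of the image light.

```python
import math

def magnification(display_size, region_size):
    """Display magnification = diagonal of the displayed image light
    divided by the diagonal of the selected change target area.
    Both arguments are (width, height); their aspect ratios are assumed
    to already match, as required in S21."""
    return math.hypot(*display_size) / math.hypot(*region_size)

# Example: a change target area whose diagonal is half that of the
# image light yields a magnification of 2 (resolution values assumed).
print(magnification((1280, 720), (640, 360)))  # -> about 2.0
```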

  The CPU 31 acquires the presence or absence of mirror image inversion and the luminance correction amount to be used when the change target area is changed and displayed as image light generated by the image light generation unit 6 (S25). For example, when the area including at least a part of the real-time video display area 77 is to be magnified and displayed by the image light generation unit 6, the user inputs mirror image inversion on and the luminance correction amount 1a via the operation unit 96. The CPU 31 acquires the mirror image inversion on and the luminance correction amount 1a.

  The CPU 31 stores, in the table 100 (see FIG. 12), the feature pattern of the specific information display area acquired in S19, the magnification and the coordinates of the center position of the change target area acquired in S23, and the presence or absence of mirror image inversion and the luminance correction amount acquired in S25 (S27), and ends the display magnification change pattern storage process. The display magnification change pattern storage process is executed for each of the diagnostic devices 71A to 71C and 72A to 72C. In the display magnification change pattern storage process for each diagnostic device, the user operates the operation unit 96 to select an area including at least a part of the real-time video display area as the change target area, and inputs the presence or absence of mirror image inversion and a desired luminance correction amount. The luminance correction amount is, for example, a coefficient of a correction (for example, addition or multiplication) that the second image processing unit 102 applies to the value of each pixel of the image indicated by the specific video signal. Alternatively, the luminance correction amount may be information indicating a dynamic range that designates pixel values in a range that can be displayed by the image light generation unit 6. As a result, display magnification change patterns for changing and displaying the respective real-time video display areas of the diagnostic devices 71A to 71C and 72A to 72C are stored in the table 100.

<Table 100>
The table 100 will be described with reference to FIG. 12. The table 100 stores input terminals, diagnostic device data, magnification setting data, and image processing data in association with each other. The data stored in the input terminal column is stored when the HMD 1 is shipped, and the data stored in each column of the diagnostic device data, the magnification setting data, and the image processing data is stored each time the display magnification change pattern storage process is executed. The input CH1 and the input CH2 are stored in the input terminal column. The input CH1 is data indicating the input terminal 91, and the input CH2 is data indicating the input terminal 92.

  The diagnostic device data column is divided into a diagnostic device column and a feature pattern column. The data stored in the diagnostic device column is index data corresponding to each of the inputs CH1 and CH2, and is stored in order under either the input CH1 or the input CH2 each time the display magnification change pattern storage process (see FIG. 11) is executed. For example, when the diagnostic devices 71A to 71C are sequentially connected to the input terminal 91 and the display magnification change pattern storage process is executed for each connection, 71A to 71C are stored in order in the diagnostic device column corresponding to the input CH1. Similarly, when the diagnostic devices 72A to 72C are sequentially connected to the input terminal 92 and the display magnification change pattern storage process is executed for each connection, 72A to 72C are stored in order in the diagnostic device column corresponding to the input CH2. In the present embodiment, 71A to 71C stored in the diagnostic device column indicate the diagnostic devices 71A to 71C, respectively, and 72A to 72C indicate the diagnostic devices 72A to 72C, respectively.

  In the feature pattern column, the feature pattern acquired by the CPU 31 in S19 is stored corresponding to each piece of data stored in the diagnostic device column. The feature patterns 71A to 71C stored in the feature pattern column indicate the feature patterns of the specific information display areas in the diagnostic devices 71A to 71C, respectively, and the feature patterns 72A to 72C indicate the feature patterns of the specific information display areas in the diagnostic devices 72A to 72C, respectively. The diagnostic devices 71A to 71C and 72A to 72C are diagnostic devices of different models, and the feature patterns of the specific information display areas corresponding to the diagnostic devices are also different from each other.

  The magnification setting data column is divided into a center coordinates column and a magnification column corresponding to each piece of data stored in the diagnostic device column. The center coordinates column stores the coordinates of the center position of the change target area selected by the user, which are acquired by the CPU 31 in S23. The magnification column stores the magnification calculated by the CPU 31 in S23, corresponding to each piece of data stored in the diagnostic device column. That is, the magnification setting data includes the coordinates of the center position of the change target area and the magnification. Hereinafter, of the magnification setting data, the data corresponding to the input CH1 (input terminal 91) is referred to as first magnification setting data, and the data corresponding to the input CH2 (input terminal 92) is referred to as second magnification setting data. In FIG. 12, as an example, three first magnification setting data are stored corresponding to the diagnostic devices 71A to 71C, respectively, and three second magnification setting data are stored corresponding to the diagnostic devices 72A to 72C, respectively. In other words, the flash ROM 32 stores the three first magnification setting data and the feature patterns 71A to 71C each corresponding to one of the three first magnification setting data. Further, the flash ROM 32 stores the three second magnification setting data and the feature patterns 72A to 72C each corresponding to one of the three second magnification setting data.

  The image processing data column is divided into a mirror image inversion column and a luminance correction amount column corresponding to each piece of data stored in the diagnostic device column. The mirror image inversion column stores data indicating whether mirror image inversion is on or off for the area including at least a part of the real-time video display area that is the change target area. The data stored in the mirror image inversion column is acquired by the CPU 31 in S25. The luminance correction amount column stores the luminance correction amount for correcting the luminance of the change target area including at least a part of the real-time video display area. The data stored in the luminance correction amount column is acquired by the CPU 31 in S25. The luminance correction amount and the data stored in the mirror image inversion column are stored corresponding to the first magnification setting data and the second magnification setting data, respectively.
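
  As a data structure, the table 100 could be represented, for example, by the nested mapping below. The field names and the Python representation are only an illustrative assumption; the concrete values are the ones used as examples in this embodiment (FIG. 12).

```python
# Keys "CH1" and "CH2" correspond to the input terminals 91 and 92.
# Each list entry is one display magnification change pattern stored
# by a run of the display magnification change pattern storage process.
table_100 = {
    "CH1": [  # first magnification setting data (input terminal 91)
        {"device": "71A", "feature_pattern": "pattern_71A",
         "center": (500, 500), "magnification": 2.0,
         "mirror": True,  "brightness": "1a"},
        {"device": "71B", "feature_pattern": "pattern_71B",
         "center": (550, 450), "magnification": 1.5,
         "mirror": False, "brightness": "1b"},
        # entry for the diagnostic device 71C is stored in the same way
    ],
    "CH2": [  # second magnification setting data (input terminal 92)
        {"device": "72A", "feature_pattern": "pattern_72A",
         "center": (400, 500), "magnification": 2.5,
         "mirror": True,  "brightness": "2a"},
        # entries for the diagnostic devices 72B and 72C follow
    ],
}
```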

<Display processing>
The display process executed by the CPU 31 of the CB 40 will be described with reference to FIG. 13. After the display magnification change pattern storage process has been executed, the user connects any of the diagnostic devices 71A to 71C and 72A to 72C to either of the cables 71 and 72. The user wears the HMD 1 and turns on the power of the HMD 1. Further, the user sets, via the operation unit 96, a mode in which the video of the input video signal is displayed on the HD 10. The CPU 31 starts processing based on the program stored in the flash ROM 32. Thereby, the display process is executed. In the following description, an example in which the deflection unit 59 is attached to the HMD 1 will be described.

  The CPU 31 determines whether a video signal is input (S31). When the CPU 31 determines that no video signal is input (S31: NO), it displays a black screen on the image light generator 6 (S33). The CPU 31 transmits signals to the second image processing unit 102, the output control unit 86, and the connection I / F controller 87. The second image processing unit 102 generates a black screen video signal and outputs it to the output buffer 85. The description until the video signal of the black screen stored in the output buffer 85 is transmitted to the image light generation unit 6 overlaps with the description of S17, and will be omitted. Note that the CPU 31 may cause the image light generation unit 6 to display a white screen instead of the black screen.

  The CPU 31 determines whether an end instruction has been detected (S35). The CPU 31 stands by until an end instruction is input via the operation unit 96 (S35: NO). When the end instruction is input (S35: YES), the CPU 31 ends the display process.

  When the CPU 31 detects the input of a video signal (S31: YES), the CPU 31 determines whether video signals are input to both of the input terminals 91 and 92 (S37). When the CPU 31 determines that a video signal is input to only one of the input terminals 91 and 92 (S37: NO), the CPU 31 specifies the input terminal to which the video signal is input (S39) and advances the processing to S44. For example, when a video signal is input to the input terminal 91, the CPU 31 specifies the input terminal 91 as the input terminal to which the video signal is input.

  When the CPU 31 determines that video signals are input to both of the input terminals 91 and 92 (S37: YES), the CPU 31 refers to the priority information stored in the flash ROM 32. The priority information is information indicating the input terminal corresponding to the decoder whose selection the selector 83 prioritizes when video signals are input to both of the input terminals 91 and 92. At the time of shipment of the HMD 1, for example, the input CH2 (input terminal 92) is stored as the priority information. The CPU 31 determines the input terminal 92 as the input terminal for receiving the video signal (S43). Note that the input CH1 (input terminal 91) may instead be stored as the priority information at the time of factory shipment.
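
  The selection in S37 to S43 can be summarized by the following sketch. The string labels "CH1" and "CH2" and the function name are assumptions made for illustration only.

```python
def select_input_terminal(ch1_active, ch2_active, priority):
    """Decide which input terminal supplies the specific video signal.
    'priority' is the priority information read from the flash ROM 32
    ("CH1" or "CH2")."""
    if ch1_active and ch2_active:
        return priority      # S43: signals on both terminals, use the priority info
    if ch1_active:
        return "CH1"         # S39: only the input terminal 91 has a signal
    if ch2_active:
        return "CH2"         # S39: only the input terminal 92 has a signal
    return None              # S31: NO -> a black screen is displayed (S33)
```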

  The CPU 31 causes the selector 83 to select, as the specific video signal, the video signal corresponding to the input terminal specified in S39 or determined in S43, out of the video signal input to the input terminal 91 and the video signal input to the input terminal 92 (S44). The selector 83 accepts the selected video signal and outputs it to the input buffer 84. The input buffer 84 stores the input video signal. In the following description of the display process, a case will be described in which the selector 83 selects and accepts the video signal of the diagnostic video 76 input to the input terminal 91 as the specific video signal. The CPU 31 causes the selector 83 to select the first decoder 81 out of the first decoder 81 and the second decoder 82. The selector 83 selects and accepts the video signal of the diagnostic video 76 processed by the first decoder 81 as the specific video signal. The video signal of the diagnostic video 76 is stored in the input buffer 84.

  The CPU 31 determines whether the display magnification change mode is set (S45). The user can set either the normal display mode or the display magnification change mode via the operation unit 96. Information indicating the set mode is stored in the flash ROM 32 or the RAM 33. The normal display mode is a display mode in which a video corresponding to a video signal input to the HMD 1 is displayed on the image light generation unit 6 as it is. The display magnification change mode is a display mode in which a magnification change video is displayed.

  When the CPU 31 determines that the display magnification change mode is not set (S45: NO), the CPU 31 causes the image light generator 6 to display the specific video (S47). When the user sets the normal display mode, the CPU 31 determines that the display magnification change mode is not set (S45: NO). For example, the CPU 31 causes the image light generation unit 6 to display the diagnostic video 76 as the specific video. The description of how the CPU 31 displays the diagnostic video 76 on the image light generator 6 is omitted because it overlaps with S17.

  The CPU 31 determines whether an end instruction has been input (S49). The CPU 31 stands by until an end instruction is input via the operation unit 96 (S49: NO). If the CPU 31 determines that an end instruction has been input (S49: YES), it advances the process to S63. The CPU 31 updates the priority information stored in the flash ROM 32 to the information of the input terminal to which the specific video signal selected by the selector 83 in S44 has been input (S63), and ends the display process.

  On the other hand, when determining that the display magnification change mode has been set (S45: YES), the CPU 31 executes the display magnification change pattern determination process (S51). The display magnification change pattern determination process executed by the CPU 31 will be described with reference to FIG. 14. The CPU 31 refers to the flash ROM 32 and acquires the table 100 (S71). The CPU 31 acquires the feature pattern of the specific information display area in the specific video of the specific video signal (S73). The manner in which the CPU 31 acquires the feature pattern is the same as in S19, and thus the description thereof is omitted. For example, when the video signal of the diagnostic video 76 is input to the input terminal 91, the CPU 31 transmits a signal to the first image processing unit 101, and the first image processing unit 101 executes the process of acquiring the feature pattern of the specific information display area 78 of the diagnostic video 76. In this case, since the diagnostic device 71A is connected to the input terminal 91, the feature pattern acquired by the first image processing unit 101 is the same as the feature pattern 71A stored in the table 100.

  The CPU 31 determines whether a feature pattern that matches the feature pattern acquired in S73 is stored in the table 100 corresponding to the input terminal to which the specific video signal selected in S44 was input (S75). For example, the CPU 31 determines whether a feature pattern that matches the feature pattern of the specific information display area 78 acquired in S73 is stored corresponding to the input CH1 of the table 100. Specifically, the CPU 31 sequentially compares data in a predetermined format (hereinafter referred to as first data) indicating the set of coordinate points at which the pixel data of the specific information display area 78 is discontinuous with each of the feature patterns 71A to 71C stored corresponding to the input CH1. The CPU 31 then determines whether any one of the feature patterns 71A to 71C completely matches the first data, or matches data corresponding to a partial region thereof (partial match). If the CPU 31 determines that no matching feature pattern is stored (S75: NO), it stores Flag = 1 in the flash ROM 32 (S77) and returns the processing to the display process.
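
  The comparison in S75 can be sketched as follows. Treating the feature patterns as sets of coordinate points and testing a subset relation for the partial match are assumptions; the embodiment only states that a complete match or a match over a partial region is accepted.

```python
def match_feature_pattern(first_data, stored_patterns):
    """Return the index of the first stored feature pattern that matches
    the first data (complete or partial match), or None if none matches."""
    incoming = set(first_data)                  # discontinuous points of the input video
    for index, pattern in enumerate(stored_patterns):
        stored = set(pattern)
        if stored == incoming or (stored and stored <= incoming):
            return index                        # S79: Flag = 0, use this entry
    return None                                 # S77: Flag = 1, no matching pattern
```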

  If the CPU 31 determines that a matching feature pattern is stored (S75: YES), it stores Flag = 0 in the flash ROM 32 (S79) and advances the processing to S81. The CPU 31 acquires, from the specific magnification setting data (described later), the predetermined magnification setting data and the predetermined image processing data corresponding to the feature pattern acquired in S73 (S81), and returns the processing to the display process. The specific magnification setting data is, of the three first magnification setting data and the three second magnification setting data stored in the table 100, the three magnification setting data corresponding to the input terminal to which the specific video signal selected by the CPU 31 in S44 was input. For example, when the CPU 31 causes the selector 83 to select the video signal of the diagnostic video 76 input to the input terminal 91 as the specific video signal (S44), the three first magnification setting data corresponding to the input CH1 (input terminal 91) become the specific magnification setting data. The CPU 31 acquires the center coordinates (500, 500) and the magnification of 2 as the predetermined magnification setting data corresponding to the feature pattern 71A from the three first magnification setting data that are the specific magnification setting data, and further acquires mirror image inversion on and the luminance correction amount 1a as the predetermined image processing data.

  As shown in FIG. 13, the CPU 31 refers to the flash ROM 32 and determines whether Flag = 1 (S53). If the CPU 31 determines that Flag = 1 (S53: YES), it causes the image light generation unit 6 to generate the image light of an error message (S55), and advances the processing to S33. For example, the CPU 31 causes the image light generation unit 6 to display "No display pattern corresponding to the diagnostic device is stored" as the error message (S55), and to display a black screen after a predetermined time (S33).

  When the CPU 31 determines that Flag = 1 is not set (S53: NO), the CPU 31 generates a magnification change video signal (S57). The CPU 31 transmits a signal to the first image processing unit 101 and the second image processing unit 102. Thereby, the first image processing unit 101 and the second image processing unit 102 execute the following processing. The first image processing unit 101 acquires the specific video signal stored in the input buffer 84 and specifies the change target area according to the predetermined magnification setting data acquired by the CPU 31 in S81. The second image processing unit 102 acquires the specific video signal stored in the input buffer 84, executes a process of changing the display magnification of the change target area specified by the first image processing unit 101 according to the predetermined magnification setting data acquired by the CPU 31 in S81, and processes the image according to the predetermined image processing data. Thereby, the second image processing unit 102 generates the video signal of a video whose display magnification is changed. The magnification change video signal generated by the second image processing unit 102 is stored in the output buffer 85. For example, the first image processing unit 101 specifies the area including at least a part of the real-time video display area 77 as the change target area, and the second image processing unit 102 generates the magnification change video signal by enlarging the area specified by the first image processing unit 101 at a magnification of 2, performing mirror image inversion, and applying luminance correction with the luminance correction amount 1a. Note that the processing executed by the first image processing unit 101 and the second image processing unit 102 may instead be executed by the CPU 31 by executing a program stored in the flash ROM 32.
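
  The generation of the magnification change video signal in S57 can be sketched as follows. Nearest-neighbour scaling, a multiplicative luminance gain, and the omission of boundary clamping are assumptions made to keep the sketch short; the embodiment does not specify the scaling or correction method and performs the processing in the second image processing unit 102.

```python
import numpy as np

def make_magnification_change_frame(frame, center, magnification,
                                    mirror, brightness_gain, out_size):
    """Cut out the change target area around 'center', enlarge it to the
    output size, optionally mirror-invert it, and correct its luminance."""
    out_w, out_h = out_size
    src_w, src_h = int(out_w / magnification), int(out_h / magnification)
    cx, cy = center
    x0, y0 = cx - src_w // 2, cy - src_h // 2        # top-left of the area
    region = frame[y0:y0 + src_h, x0:x0 + src_w]
    # Nearest-neighbour enlargement to the display resolution.
    ys = np.arange(out_h) * src_h // out_h
    xs = np.arange(out_w) * src_w // out_w
    enlarged = region[ys][:, xs]
    if mirror:
        enlarged = enlarged[:, ::-1]                 # mirror image inversion
    corrected = np.clip(enlarged * brightness_gain, 0, 255)
    return corrected.astype(np.uint8)
```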

  The CPU 31 displays the image of the magnification change video signal generated in S57 on the image light generation unit 6 (S59). The CPU 31 transmits a signal to the output control unit 86 and the connection I / F controller 87. The output control unit 86 acquires the magnification change video signal stored in the output buffer 85 and outputs it to the connection I / F controller 87. The connection I / F controller 87 outputs the magnification change video signal to the connection I / F controller 17 via the cable 7. The connection I / F controller 17 acquires the magnification change video signal transmitted via the cable 7 and outputs it to the image light generation unit 6. The image light generation unit 6 generates image light of the change target region that is mirror-inverted and enlarged at a magnification of 2 based on the input magnification change video signal.

  The image light generation unit 6 emits the generated image light leftward, toward the half mirror 56. The half mirror 56 reflects the incident image light rearward, in a direction intersecting the leftward direction. The user visually recognizes, via the half mirror 56, the enlarged change target area, which is no longer mirror-inverted, as the magnification change video. That is, the user visually recognizes a video in which at least a part of the real-time video display area 77 is enlarged. The user can make a desired diagnosis while visually recognizing the video in a hands-free state.

  The CPU 31 determines whether an end instruction has been input (S61). The CPU 31 stands by until an end instruction is input via the operation unit 96 (S61: NO). If the CPU 31 determines that an end instruction has been input (S61: YES), it executes S63 and ends the display process.

  Next, a case will be described in which the diagnostic device 72A is connected to the input terminal 92 via the cable 72 instead of the diagnostic device 71A, and the display process is executed. As described above, the processing of S31 to S43 is executed. The CPU 31 causes the selector 83 to select the video signal of the diagnostic device 72A input to the input terminal 92 as the specific video signal (S44). When the user selects the display magnification change mode via the operation unit 96 (S45: YES), S51 and S71 to S79 are executed as described above. The CPU 31 acquires the center coordinates (400, 500) and the magnification of 2.5 as the predetermined magnification setting data corresponding to the feature pattern 72A from the three second magnification setting data that are the specific magnification setting data (see FIG. 12), and further acquires mirror image inversion on and the luminance correction amount 2a as the predetermined image processing data (S81). The second image processing unit 102 that has received the signal from the CPU 31 generates a magnification change video signal in which the real-time video display area of the diagnostic device 72A is enlarged (S57). The user can visually recognize the magnification change video via the half mirror 56. The CPU 31 updates the priority information to the information of the input terminal 92 (S63) and ends the display process.

  Next, a case will be described in which the diagnostic device 71B, instead of the diagnostic device 72A, is connected to the input terminal 91 via the cable 71 and the display process is executed. As described above, the processing of S31 to S79 is executed. The CPU 31 acquires the center coordinates (550, 450) and the magnification of 1.5 as the predetermined magnification setting data corresponding to the feature pattern 71B from the three first magnification setting data that are the specific magnification setting data (see FIG. 12), and further acquires mirror image inversion off and the luminance correction amount 1b as the predetermined image processing data (S81). The second image processing unit 102 that has received the signal from the CPU 31 generates a magnification change video signal in which an area including at least a part of the real-time video display area of the diagnostic device 71B is enlarged without mirror image inversion (S57). The image light generation unit 6 generates, as the magnification change video, image light of the change target area that is enlarged at a magnification of 1.5 and not mirror-inverted. The user removes the deflection unit 59 including the half mirror 56 from the housing 12 (see FIG. 7), and can directly view the image light generated by the image light generation unit 6 through the opening portions 121 and 122.

<Main functions and effects of this embodiment>
As described above, the CPU 31 stores the first magnification setting data and the second magnification setting data corresponding to the input terminals 91 and 92 (inputs CH1 and CH2) by executing the display magnification change pattern storage process (S11 to S27). Thus, the magnification setting data desired by the user is stored in correspondence with each of the diagnostic devices 71A to 71C and 72A to 72C connected to the input terminals 91 and 92. When the image light generation unit 6 is to display the magnification change video and a video signal is input to at least one of the input terminals 91 and 92, the CPU 31 causes the selector 83 to select one of the first decoder 81 and the second decoder 82 (S39, S43). That is, the CPU 31 selects the video signal input to one of the input terminals 91 and 92 as the specific video signal. The CPU 31 causes the second image processing unit 102 to generate the magnification change video signal in accordance with the predetermined magnification setting data included in the specific magnification setting data (S57). Thereby, the image light generation unit 6 can display the magnification change video desired by the user.

  Once the magnification change display pattern desired by the user has been stored in the flash ROM 32 by executing the display magnification change pattern storage process, the HMD 1 automatically displays, for each of the diagnostic devices 71A to 71C and 72A to 72C, the change target area selected by the user, including at least a part of the real-time video display area, with the display magnification changed according to the first magnification setting data or the second magnification setting data. The user does not need to re-select the change target area or re-enter the display magnification settings into the HMD 1 each time he or she alternates between one of the diagnostic devices 71A to 71C connected to the input terminal 91 and one of the diagnostic devices 72A to 72C connected to the input terminal 92. Therefore, the HMD 1 can display the display target area of the video signal input to each of the input terminals 91 and 92 with the display magnification changed in correspondence with the input terminals 91 and 92, without requiring the user to designate it.

  In the display magnification change pattern storage process, the user inputs data indicating whether the display target area is to be mirror-inverted. The CPU 31 thus stores the first magnification setting data and the second magnification setting data in the table 100 in association with data indicating whether the display target area is to be mirror-inverted (S27). The CPU 31 causes the second image processing unit 102 to generate either a magnification-change video signal that is mirror-inverted or one that is not (S57). When the image light generation unit 6 generates mirror-inverted image light, the user views the image light through the half mirror 56. When the image light generation unit 6 generates image light that is not mirror-inverted, the user views the image light directly through the opening portions 121 and 122. That is, the HMD 1 can generate a magnification-change video signal both when the image light of the video displayed by the image light generation unit 6 is viewed through the half mirror 56 and when it is viewed directly without passing through the half mirror 56. This broadens the ways in which the HMD 1 can be used.
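
Whether the generated magnification-change video signal is mirror-inverted is a simple conditional on the stored flag. A sketch of that single step, using NumPy only as an assumed stand-in for the hardware of the second image processing unit 102:

    # Minimal sketch: mirror the change target area only when the setting data says so.
    import numpy as np

    def apply_mirror(frame: np.ndarray, mirror_inverted: bool) -> np.ndarray:
        # A horizontal flip models mirror-image inversion of the frame.
        return frame[:, ::-1] if mirror_inverted else frame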

  The deflection unit 59 including the half mirror 56 is detachable from the housing 12. When the image light generation unit 6 generates mirror-inverted image light, the user can view the image light as a non-inverted image by attaching the deflection unit 59 to the housing 12. When the image light generation unit 6 generates image light that is not mirror-inverted, the user can view that non-inverted image light by removing the deflection unit 59 from the housing 12. Depending on whether the image light generated by the image light generation unit 6 is mirror-inverted, the user can thus switch between viewing the image light through the half mirror 56 and viewing it directly without the half mirror 56. Therefore, the HMD 1 can let the user view the magnification-change video as a non-mirror-inverted video appropriate to each application.

  The magnification-change video signal is generated by the first decoder 81, the second decoder 82, the selector 83, the input buffer 84, the first image processing unit 101, the second image processing unit 102, and the output buffer 85 built into the housing 63. The generated magnification-change video signal is transmitted to the HD 10, which is configured separately from the housing 63, via the connection I/F controller 87 provided in the housing 63. The HD 10 can display the magnification-change video merely by including the connection I/F controller 17 and the image light generation unit 6. Thus, the HMD 1 can simplify the mechanism of the HD 10 configured separately from the CB 40.
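
The signal chain described above can be summarized as a sequence of stages. In the sketch below the stage names follow the reference labels in the text, but the trivial function bodies are placeholders rather than the actual decoder or ASIC behavior:

    # Minimal sketch of the CB 40 signal chain; all bodies are placeholders.
    def decode(raw):                           # first decoder 81 / second decoder 82
        return raw
    def buffer_in(frame):                      # input buffer 84
        return frame
    def extract_pattern(frame):                # first image processing unit 101
        return frame[:8]
    def change_magnification(frame, setting):  # second image processing unit 102
        return frame
    def buffer_out(frame):                     # output buffer 85
        return frame

    def control_box_pipeline(raw_signal, setting):
        frame = buffer_in(decode(raw_signal))
        _pattern = extract_pattern(frame)      # used to acquire the setting data (S73, S81)
        out = buffer_out(change_magnification(frame, setting))
        return out                             # sent to the HD 10 via connection I/F controller 87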

  The CPU 31 displays the magnification-change video whose luminance has been corrected in accordance with the luminance correction amount associated with the first magnification setting data and the second magnification setting data (S57 to S59). The user can view image light whose luminance has been corrected. Therefore, the HMD 1 can make it easier for the user to view the image light generated by the image light generation unit 6, in accordance with the diagnostic devices 71A to 71C and 72A to 72C connected to either of the input terminals 91 and 92.
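
Luminance correction is likewise driven by the label stored alongside the magnification setting data. In this sketch the mapping from labels such as "1b" and "2a" to numeric gains is an assumption made only for illustration:

    # Minimal sketch: correct the luminance of the change target area (assumed gain values).
    import numpy as np

    LUMINANCE_GAIN = {"1b": 0.9, "2a": 1.2}  # hypothetical label-to-gain mapping

    def correct_luminance(frame: np.ndarray, correction_label: str) -> np.ndarray:
        gain = LUMINANCE_GAIN.get(correction_label, 1.0)
        return np.clip(frame.astype(np.float32) * gain, 0, 255).astype(np.uint8)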

  Since the table 100 stores three pieces of first magnification setting data corresponding to the diagnostic devices 71A to 71C, the CPU 31 can cause the second image processing unit 102 to generate a magnification-change video signal corresponding to whichever of the diagnostic devices 71A to 71C is connected to the input terminal 91 (S57). Similarly, the CPU 31 can cause the second image processing unit 102 to generate a magnification-change video signal corresponding to whichever of the diagnostic devices 72A to 72C is connected to the input terminal 92 (S57). Therefore, the HMD 1 can display a magnification-change video according to the diagnostic devices 71A to 71C and 72A to 72C connected to the input terminals 91 and 92, respectively. Even when a plurality of diagnostic devices are connected to the input terminals 91 and 92 one after another, the HMD 1 can display a magnification-change video according to each diagnostic device.

  The diagnostic devices 71A to 71C and 72A to 72C connected to the input terminals 91 and 92 are diagnostic devices of different models, and the feature patterns of their specific information display areas differ from one another. By executing the display magnification change pattern storage process, the table 100 stores one of the feature patterns 71A to 71C in correspondence with each of the three first magnification setting data, and one of the feature patterns 72A to 72C in correspondence with each of the three second magnification setting data. The CPU 31 acquires the predetermined magnification setting data included in the specific magnification setting data, among the three first magnification setting data and the three second magnification setting data, based on the feature pattern identified in S73 (S81). Therefore, even when the diagnostic devices are connected to the input terminals 91 and 92 one after another, the HMD 1 can acquire the magnification setting data appropriate to the connected diagnostic device without receiving an instruction from the user, and can display a magnification-change video corresponding to each diagnostic device by changing the display magnification appropriately.

  The feature pattern column of the table 100 stores feature patterns of the specific information display areas of the diagnostic devices 71A to 71C and 72A to 72C. Since the diagnostic devices 71A to 71C and 72A to 72C are diagnostic devices of different models, the feature patterns of their specific information display areas differ from one another. The CPU 31 can identify the diagnostic device connected to the input terminal 91 or 92 by causing the first image processing unit 101 to acquire the feature pattern of the specific information display area in the specific video (S73). Therefore, compared with the case where feature patterns are extracted from both the real-time video display area and the specific information display area of the specific video, the HMD 1 can shorten the time from when a video signal is input to either of the input terminals 91 and 92 until the magnification-change video signal is generated.
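
Restricting pattern matching to the specific information display area can be sketched as cropping that region and comparing it with the stored patterns. The crop coordinates and similarity threshold below are assumptions used only for illustration:

    # Minimal sketch: identify the connected diagnostic device from the cropped region only.
    from typing import Dict, Optional
    import numpy as np

    SPECIFIC_INFO_REGION = (slice(0, 100), slice(600, 800))  # hypothetical (rows, cols) crop

    def identify_device(frame: np.ndarray,
                        stored_patterns: Dict[str, np.ndarray]) -> Optional[str]:
        crop = frame[SPECIFIC_INFO_REGION]
        for name, pattern in stored_patterns.items():
            same_shape = crop.shape == pattern.shape
            if same_shape and np.mean(np.abs(crop.astype(int) - pattern.astype(int))) < 5:
                return name      # e.g. "71B" or "72A", used to select the setting data (S81)
        return None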

  The user visually recognizes the specific video through the half mirror 56 when the display magnification change pattern storage process is executed. In this case, since the real-time video display areas of the diagnostic apparatuses 71A to 71C and 72A to 72C are arranged at positions including the center position of the diagnostic video, the user can easily visually recognize the real-time video display area as a display target area.

  The HMD 1 is an example of the “display device” in the present invention. The CB 40 is an example of the “display control device” in the present invention. The diagnostic videos of the diagnostic devices 71A to 71C are an example of the “first video” in the present invention. The video signal input to the input terminal 91 is an example of the “first video signal” in the present invention. The input terminal 91 is an example of the “first input terminal” in the present invention. The first decoder 81 is an example of the “first video processing unit” in the present invention. The diagnostic videos of the diagnostic devices 72A to 72C are an example of the “second video” in the present invention. The video signal input to the input terminal 92 is an example of the “second video signal” in the present invention. The input terminal 92 is an example of the “second input terminal” in the present invention. The second decoder 82 is an example of the “second video processing unit” in the present invention. The flash ROM 32 is an example of the “storage unit” in the present invention. The real-time video display areas of the diagnostic devices 71A to 71C and 72A to 72C are examples of the “predetermined area” in the present invention. The CB 40 is an example of the “control unit” in the present invention. The HD 10 is an example of the “display unit” in the present invention. The CPU 31 that executes S44 is an example of the “selecting means” in the present invention. The second image processing unit 102 that generates the magnification-change video signal when the CPU 31 executes S57 is an example of the “generating means” in the present invention. The CPU 31 that executes S59 is an example of the “output means” in the present invention. The CPU 31 that executes S81 is an example of the “acquiring means” in the present invention. The housing 12 is an example of the “first housing” in the present invention. The half mirror 56 is an example of the “reflecting member” in the present invention. The first extending portion 21A and the second extending portion 22A are examples of the “holding portion” in the present invention. The housing 63 is an example of the “second housing” in the present invention.

<Modification>
The present invention is not limited to the above embodiment, and various modifications are possible. The table 100 may not have a mirror image inversion column; that is, the table 100 need not store data indicating whether the real-time video display area is to be mirror-inverted. In this case, when generating the magnification-change video signal, the HMD 1 may automatically generate a magnification-change video signal in which the change target area is mirror-inverted. The half mirror 56 may be fixed to the housing 12 rather than detachable from it.

  The table 100 need not have a luminance correction amount column; that is, the table 100 need not store a luminance correction amount for correcting the luminance of the display target area. In this case, the CB 40 generates a magnification-change video signal without luminance correction.

  The table 100 need not store three first magnification setting data and three second magnification setting data. It is sufficient that the table 100 stores at least one piece of first magnification setting data and at least one piece of second magnification setting data corresponding to the diagnostic devices connected to the input terminals 91 and 92.

  The table 100 may not have a feature pattern column; that is, the table 100 need not store feature patterns corresponding to the three first magnification setting data and the three second magnification setting data. In this case, instead of acquiring the feature pattern of the specific video and determining whether a matching feature pattern is stored in the table 100, the CPU 31 may receive, via the operation unit 96, an instruction to select the predetermined magnification setting data. The user operates the operation unit 96 to select the desired magnification setting data.

  The real-time video display area 77 may be arranged at a position that does not include the center position of the diagnostic video 76. In this case, the specific information display area 78 may be arranged at the center of the diagnostic video 76, and the CPU 31 may acquire the feature pattern of the central portion of the diagnostic video when executing S19 and S73.

  The aspect ratio of the display target area selected by the user in the display magnification change pattern storage process need not be the same as the aspect ratio of the image light generated by the image light generation unit 6. In this case, the magnification is acquired by dividing the X-axis length of the image light generated by the image light generation unit 6 by the X-axis length of the display target area, or by dividing the Y-axis length of that image light by the Y-axis length of the display target area.
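
A worked example of the rule in the preceding paragraph, with illustrative pixel counts that are not taken from the embodiment:

    # Magnification = image-light length / display-target-area length along one axis.
    def magnification_from_lengths(image_light_length: int, target_area_length: int) -> float:
        return image_light_length / target_area_length

    print(magnification_from_lengths(1280, 512))  # 2.5 for a 1280-px image light over a 512-px area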

  The display target area may not be the real-time video display area of the diagnostic apparatuses 71A to 71C and 72A to 72C. The display target area may be, for example, specific information display areas of the diagnostic devices 71A to 71C and 72A to 72C.

  In the CB 40, a single image processing IC may be mounted on the substrate built into the housing 63 instead of the first decoder 81 and the second decoder 82. Alternatively, the CB 40 may omit the first decoder 81 and the second decoder 82 and execute processing corresponding to their functions through operations performed by the CPU 31 in cooperation with the RAM 33.

   The input buffer 84, the first image processing unit 101, the second image processing unit 102, the output buffer 85, and the output control unit 86 may be mounted on the board built into the housing 63 as a single ASIC or as individual ICs. Alternatively, the CB 40 may omit the input buffer 84, the first image processing unit 101, the second image processing unit 102, the output buffer 85, and the output control unit 86; in that case, processing corresponding to their functions may be executed through operations performed by the CPU 31 in cooperation with the RAM 33.

   The magnification need not be greater than 1. The magnification may be 1 or smaller than 1. When the magnification is smaller than 1, the image light generation unit 6 generates image light in which the change target area is reduced.

1 HMD
6 Image light generation unit
7 Cable
8 Mounting tool
8A First member
8B Second member
9 Connector
10 HD
12 Housing
22A Second extending portion
31 CPU
32 Flash ROM
40 CB
56 Half mirror
63 Housing
76 Diagnostic video
77 Real-time video display area
78 Specific information display area
81 First decoder
82 Second decoder
91 Input terminal
92 Input terminal
102 Second image processing unit

Claims (15)

  1. A display device comprising:
    a first video processing unit that has a first input terminal to which a first video signal indicating a first video is input, and that processes the first video signal input to the first input terminal;
    a second video processing unit that has a second input terminal to which a second video signal indicating a second video and having a format different from that of the first video signal is input, and that processes the second video signal input to the second input terminal;
    a storage unit that stores first magnification setting data, which is setting data for changing a display magnification of a predetermined area of the first video, and second magnification setting data, which is setting data for changing a display magnification of a predetermined area of the second video;
    selecting means for selecting, as a specific video signal, one of the first video signal processed by the first video processing unit and the second video signal processed by the second video processing unit;
    generating means for generating a magnification-change video signal, which is a video signal in which the display magnification of the predetermined area of a specific video is changed in accordance with specific magnification setting data, the specific video being the video corresponding to the specific video signal and the specific magnification setting data being the setting data, among the first magnification setting data and the second magnification setting data, that corresponds to the specific video;
    a control unit including output means for outputting the magnification-change video signal generated by the generating means; and
    a display unit that displays a video based on the magnification-change video signal output by the control unit,
    wherein the storage unit stores a plurality of different first magnification setting data and a plurality of different second magnification setting data,
    the control unit further includes acquiring means for acquiring predetermined magnification setting data from whichever of the plurality of first magnification setting data and the plurality of second magnification setting data corresponds to the specific magnification setting data, and
    the generating means generates the magnification-change video signal in accordance with the predetermined magnification setting data acquired by the acquiring means.
  2. A display device comprising:
    a first video processing unit that has a first input terminal to which a first video signal indicating a first video is input, and that processes the first video signal input to the first input terminal;
    a second video processing unit that has a second input terminal to which a second video signal indicating a second video and having a format different from that of the first video signal is input, and that processes the second video signal input to the second input terminal;
    a storage unit that stores first magnification setting data, which is setting data for changing a display magnification of a predetermined area of the first video, and second magnification setting data, which is setting data for changing a display magnification of a predetermined area of the second video;
    selecting means for selecting, as a specific video signal, one of the first video signal processed by the first video processing unit and the second video signal processed by the second video processing unit;
    generating means for generating a magnification-change video signal, which is a video signal in which the display magnification of the predetermined area of a specific video is changed in accordance with specific magnification setting data, the specific video being the video corresponding to the specific video signal and the specific magnification setting data being the setting data, among the first magnification setting data and the second magnification setting data, that corresponds to the specific video;
    a control unit including output means for outputting the magnification-change video signal generated by the generating means; and
    a display unit that displays a video based on the magnification-change video signal output by the control unit,
    wherein the storage unit stores the first magnification setting data and the second magnification setting data each in association with data indicating whether the predetermined area is to be mirror-inverted, and
    the generating means generates the magnification-change video signal in which the predetermined area is mirror-inverted when the specific magnification setting data is associated with data indicating that the predetermined area is to be mirror-inverted.
  3. The display device according to claim 1 or 2, wherein the display unit includes:
    a first housing;
    an image light generation unit that is built into the first housing, generates image light of the video whose display magnification has been changed based on the magnification-change video signal, and emits the image light in a predetermined direction;
    a reflecting member capable of reflecting the image light emitted from the image light generation unit in a direction intersecting the predetermined direction; and
    a holding portion that is provided in the first housing and detachably holds the reflecting member.
  4. The display device according to claim 3, further comprising:
    a second housing that is provided with the output means and incorporates the first video processing unit, the second video processing unit, the control unit, and the storage unit; and
    a cable that connects the first housing and the second housing and transmits the magnification-change video signal output from the output means to the image light generation unit.
  5. The display device according to any one of claims 1 to 4, wherein the storage unit stores the first magnification setting data and the second magnification setting data each in association with a luminance correction amount for the predetermined area, and
    the generating means generates the magnification-change video signal in which the luminance of the predetermined area has been corrected according to the luminance correction amount associated with the specific magnification setting data.
  6. The display device according to claim 1, wherein the storage unit stores the plurality of first magnification setting data and a plurality of feature patterns of the first video, each feature pattern corresponding to one of the plurality of first magnification setting data, and stores the plurality of second magnification setting data and a plurality of feature patterns of the second video, each feature pattern corresponding to one of the plurality of second magnification setting data, and
    the acquiring means acquires, from whichever plurality corresponds to the specific magnification setting data, the predetermined magnification setting data corresponding to the feature pattern of the specific video.
  7. The display device according to claim 6, wherein each of the first video and the second video includes a real-time video display area in which an image changes at a predetermined frequency, and a specific information display area in which predetermined specific information is displayed at a frequency lower than the predetermined frequency,
    each of the feature patterns of the first video and the second video is an image pattern of at least a part of the specific information display area, and
    the acquiring means acquires the predetermined magnification setting data corresponding to the feature pattern extracted from the specific information display area of the specific video.
  8. The display device according to claim 7, wherein the real-time video display area is an area including a center position of the first video or the second video,
    the specific information display area is disposed outside the real-time video display area with respect to the center position, and
    the predetermined area is the real-time video display area.
  9. A head-mounted display comprising:
    the display device according to any one of claims 1 to 8;
    a mounting member having a first member extending in a first direction and a pair of second members extending from both sides of the first member in a second direction intersecting the first direction; and
    a connecting tool that connects the first member and the display unit.
  10. A display control device comprising:
    a first video processing unit that has a first input terminal to which a first video signal indicating a first video is input, and that processes the first video signal input to the first input terminal;
    a second video processing unit that has a second input terminal to which a second video signal indicating a second video and having a format different from that of the first video signal is input, and that processes the second video signal input to the second input terminal;
    a storage unit that stores first magnification setting data, which is setting data for changing a display magnification of a predetermined area of the first video, and second magnification setting data, which is setting data for changing a display magnification of a predetermined area of the second video;
    selecting means for selecting, as a specific video signal, one of the first video signal processed by the first video processing unit and the second video signal processed by the second video processing unit;
    generating means for generating a magnification-change video signal, which is a video signal in which the display magnification of the predetermined area of a specific video is changed in accordance with specific magnification setting data, the specific video being the video corresponding to the specific video signal and the specific magnification setting data being the setting data, among the first magnification setting data and the second magnification setting data, that corresponds to the specific video; and
    a control unit including output means for outputting the magnification-change video signal generated by the generating means to a display unit capable of displaying the video with the display magnification changed,
    wherein the storage unit stores a plurality of different first magnification setting data and a plurality of different second magnification setting data,
    the control unit further includes acquiring means for acquiring predetermined magnification setting data from whichever of the plurality of first magnification setting data and the plurality of second magnification setting data corresponds to the specific magnification setting data, and
    the generating means generates the magnification-change video signal in accordance with the predetermined magnification setting data acquired by the acquiring means.
  11. A display control device comprising:
    a first video processing unit that has a first input terminal to which a first video signal indicating a first video is input, and that processes the first video signal input to the first input terminal;
    a second video processing unit that has a second input terminal to which a second video signal indicating a second video and having a format different from that of the first video signal is input, and that processes the second video signal input to the second input terminal;
    a storage unit that stores first magnification setting data, which is setting data for changing a display magnification of a predetermined area of the first video, and second magnification setting data, which is setting data for changing a display magnification of a predetermined area of the second video;
    selecting means for selecting, as a specific video signal, one of the first video signal processed by the first video processing unit and the second video signal processed by the second video processing unit;
    generating means for generating a magnification-change video signal, which is a video signal in which the display magnification of the predetermined area of a specific video is changed in accordance with specific magnification setting data, the specific video being the video corresponding to the specific video signal and the specific magnification setting data being the setting data, among the first magnification setting data and the second magnification setting data, that corresponds to the specific video; and
    a control unit including output means for outputting the magnification-change video signal generated by the generating means to a display unit capable of displaying the video with the display magnification changed,
    wherein the storage unit stores the first magnification setting data and the second magnification setting data each in association with data indicating whether the predetermined area is to be mirror-inverted, and
    the generating means generates the magnification-change video signal in which the predetermined area is mirror-inverted when the specific magnification setting data is associated with data indicating that the predetermined area is to be mirror-inverted.
  12. A display program for a computer of a display control device, the display control device comprising:
    a first video processing unit that has a first input terminal to which a first video signal indicating a first video is input, and that processes the first video signal input to the first input terminal;
    a second video processing unit that has a second input terminal to which a second video signal indicating a second video and having a format different from that of the first video signal is input, and that processes the second video signal input to the second input terminal; and
    a storage unit that stores first magnification setting data, which is setting data for changing a display magnification of a predetermined area of the first video, and second magnification setting data, which is setting data for changing a display magnification of a predetermined area of the second video,
    the display program causing the computer to execute:
    a selection step of selecting, as a specific video signal, one of the first video signal processed by the first video processing unit and the second video signal processed by the second video processing unit;
    a generation step of generating a magnification-change video signal, which is a video signal in which the display magnification of the predetermined area of a specific video is changed in accordance with specific magnification setting data, the specific video being the video corresponding to the specific video signal and the specific magnification setting data being the setting data, among the first magnification setting data and the second magnification setting data, that corresponds to the specific video; and
    an output step of outputting the magnification-change video signal generated in the generation step to a display unit capable of displaying the video with the display magnification changed,
    wherein the storage unit stores a plurality of different first magnification setting data and a plurality of different second magnification setting data,
    the display program further causes the computer to execute an acquisition step of acquiring predetermined magnification setting data from whichever of the plurality of first magnification setting data and the plurality of second magnification setting data corresponds to the specific magnification setting data, and
    the generation step generates the magnification-change video signal in accordance with the predetermined magnification setting data acquired in the acquisition step.
  13. A display program for a computer of a display control device, the display control device comprising:
    a first video processing unit that has a first input terminal to which a first video signal indicating a first video is input, and that processes the first video signal input to the first input terminal;
    a second video processing unit that has a second input terminal to which a second video signal indicating a second video and having a format different from that of the first video signal is input, and that processes the second video signal input to the second input terminal; and
    a storage unit that stores first magnification setting data, which is setting data for changing a display magnification of a predetermined area of the first video, and second magnification setting data, which is setting data for changing a display magnification of a predetermined area of the second video,
    the display program causing the computer to execute:
    a selection step of selecting, as a specific video signal, one of the first video signal processed by the first video processing unit and the second video signal processed by the second video processing unit;
    a generation step of generating a magnification-change video signal, which is a video signal in which the display magnification of the predetermined area of a specific video is changed in accordance with specific magnification setting data, the specific video being the video corresponding to the specific video signal and the specific magnification setting data being the setting data, among the first magnification setting data and the second magnification setting data, that corresponds to the specific video; and
    an output step of outputting the magnification-change video signal generated in the generation step to a display unit capable of displaying the video with the display magnification changed,
    wherein the storage unit stores the first magnification setting data and the second magnification setting data each in association with data indicating whether the predetermined area is to be mirror-inverted, and
    the generation step generates the magnification-change video signal in which the predetermined area is mirror-inverted when the specific magnification setting data is associated with data indicating that the predetermined area is to be mirror-inverted.
  14. A display method executed by a display control device, the display control device comprising:
    a first video processing unit that has a first input terminal to which a first video signal indicating a first video is input, and that processes the first video signal input to the first input terminal;
    a second video processing unit that has a second input terminal to which a second video signal indicating a second video and having a format different from that of the first video signal is input, and that processes the second video signal input to the second input terminal; and
    a storage unit that stores first magnification setting data, which is setting data for changing a display magnification of a predetermined area of the first video, and second magnification setting data, which is setting data for changing a display magnification of a predetermined area of the second video,
    the display method comprising:
    a selection step of selecting, as a specific video signal, one of the first video signal processed by the first video processing unit and the second video signal processed by the second video processing unit;
    a generation step of generating a magnification-change video signal, which is a video signal in which the display magnification of the predetermined area of a specific video is changed in accordance with specific magnification setting data, the specific video being the video corresponding to the specific video signal and the specific magnification setting data being the setting data, among the first magnification setting data and the second magnification setting data, that corresponds to the specific video; and
    an output step of outputting the magnification-change video signal generated in the generation step to a display unit capable of displaying the video with the display magnification changed,
    wherein the storage unit stores a plurality of different first magnification setting data and a plurality of different second magnification setting data,
    the display method further comprises an acquisition step of acquiring predetermined magnification setting data from whichever of the plurality of first magnification setting data and the plurality of second magnification setting data corresponds to the specific magnification setting data, and
    the generation step generates the magnification-change video signal in accordance with the predetermined magnification setting data acquired in the acquisition step.
  15. A display method executed by a display control device, the display control device comprising:
    a first video processing unit that has a first input terminal to which a first video signal indicating a first video is input, and that processes the first video signal input to the first input terminal;
    a second video processing unit that has a second input terminal to which a second video signal indicating a second video and having a format different from that of the first video signal is input, and that processes the second video signal input to the second input terminal; and
    a storage unit that stores first magnification setting data, which is setting data for changing a display magnification of a predetermined area of the first video, and second magnification setting data, which is setting data for changing a display magnification of a predetermined area of the second video,
    the display method comprising:
    a selection step of selecting, as a specific video signal, one of the first video signal processed by the first video processing unit and the second video signal processed by the second video processing unit;
    a generation step of generating a magnification-change video signal, which is a video signal in which the display magnification of the predetermined area of a specific video is changed in accordance with specific magnification setting data, the specific video being the video corresponding to the specific video signal and the specific magnification setting data being the setting data, among the first magnification setting data and the second magnification setting data, that corresponds to the specific video; and
    an output step of outputting the magnification-change video signal generated in the generation step to a display unit capable of displaying the video with the display magnification changed,
    wherein the storage unit stores the first magnification setting data and the second magnification setting data each in association with data indicating whether the predetermined area is to be mirror-inverted, and
    the generation step generates the magnification-change video signal in which the predetermined area is mirror-inverted when the specific magnification setting data is associated with data indicating that the predetermined area is to be mirror-inverted.
JP2014187124A 2014-09-12 2014-09-12 Display device, head-mounted display, display control device, display program, and display method Active JP6137092B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2014187124A JP6137092B2 (en) 2014-09-12 2014-09-12 Display device, head-mounted display, display control device, display program, and display method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014187124A JP6137092B2 (en) 2014-09-12 2014-09-12 Display device, head-mounted display, display control device, display program, and display method
PCT/JP2015/075561 WO2016039365A1 (en) 2014-09-12 2015-09-09 Display device, head-mounted display, display control device, display program, and display method

Publications (2)

Publication Number Publication Date
JP2016061809A JP2016061809A (en) 2016-04-25
JP6137092B2 true JP6137092B2 (en) 2017-05-31

Family

ID=55459109

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2014187124A Active JP6137092B2 (en) 2014-09-12 2014-09-12 Display device, head-mounted display, display control device, display program, and display method

Country Status (2)

Country Link
JP (1) JP6137092B2 (en)
WO (1) WO2016039365A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003308059A (en) * 2002-04-18 2003-10-31 Canon Inc Image display processing apparatus, image display processing method, program and storage medium
JP2007248589A (en) * 2006-03-14 2007-09-27 Pioneer Electronic Corp Video display device and method
JP2007251821A (en) * 2006-03-17 2007-09-27 Ricoh Co Ltd Apparatus for image processing and for displaying using it
JP5594258B2 (en) * 2011-08-23 2014-09-24 ブラザー工業株式会社 Head mounted display
JP2013232744A (en) * 2012-04-27 2013-11-14 Bi2−Vision株式会社 Display system, display adjustment system, display adjustment method and program
JP5895792B2 (en) * 2012-09-28 2016-03-30 ブラザー工業株式会社 Work assistance system and program

Also Published As

Publication number Publication date
WO2016039365A1 (en) 2016-03-17
JP2016061809A (en) 2016-04-25

Similar Documents

Publication Publication Date Title
AU2017232179B2 (en) Display system and method
US7193584B2 (en) Wearable display apparatus
US9595243B2 (en) Image processing apparatus and image processing method
TWI534476B (en) Head-mounted display
US8957948B2 (en) Geometric calibration of head-worn multi-camera eye tracking system
US7610558B2 (en) Information processing apparatus and method
US20130249778A1 (en) Head-mounted display
US9046686B2 (en) Head-mount type display device
US20060050070A1 (en) Information processing apparatus and method for presenting image combined with virtual image
JP4373286B2 (en) Head-mounted display device
US8760470B2 (en) Mixed reality presentation system
US20110267321A1 (en) Head mounted display and drive method thereof
EP2434458A2 (en) Image processing program, apparatus, system, and method for mixed reality
JP5884502B2 (en) Head mounted display
US9191658B2 (en) Head-mounted display and position gap adjustment method
EP2006827A9 (en) Image display device
KR100943392B1 (en) Apparatus for displaying three-dimensional image and method for controlling location of display in the apparatus
CN103815866A (en) Visual function testing method and control device
JP5884576B2 (en) Head-mounted display device and method for controlling head-mounted display device
TW200803458A (en) Image adjustment apparatus and method for head-mounted display
US9530249B2 (en) Computer-readable storage medium having image processing program stored therein, image processing apparatus, image processing system, and image processing method
JP2010164814A (en) Head mounted display
JPH086708A (en) Display device
CA2991642A1 (en) Collimating fiber scanner design with inward pointing angles in virtual/augmented reality system
JP5073013B2 (en) Display control program, display control device, display control method, and display control system

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20160314

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20161122

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20170120

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20170404

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20170417

R150 Certificate of patent or registration of utility model

Ref document number: 6137092

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150