WO2022239206A1 - Image display device and image display system - Google Patents
Image display device and image display system
- Publication number
- WO2022239206A1 (PCT/JP2021/018292)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- distance
- camera
- bright
- distance image
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/64—Constructional details of receivers, e.g. cabinets or dust covers
Definitions
- the present invention relates to an image display device and an image display system that support visibility in dark places.
- Augmented reality technology that adds AR (Augmented Reality) objects created by CG (Computer Graphics) to a real space or background image is used in content such as games and in maintenance work.
- A user experiences augmented reality by viewing a content image synthesized on a display surface using a head-mounted display (hereinafter also referred to as an HMD (Head Mounted Display)), a portable information terminal, or the like.
- Patent Document 1 discloses an HMD equipped with a camera that captures an image of the background and a distance measuring device that measures the distance to a physical object in the background. Furthermore, a technique is disclosed in which a process of extracting a physical object from an image obtained by a camera is executed, and an AR object is added in association with the physical object.
- Patent Document 2 discloses an example of a distance measuring sensor (LiDAR: Light Detection and Ranging).
- Patent Document 1, however, does not consider use in dark places. For example, in dark places where cameras struggle to capture images, such as places where the lights are off due to a power failure, construction or maintenance sites at night where lighting cannot be used out of consideration for the surrounding environment, and unlit buildings, the user's forward visibility is poor and it is difficult to use this technology.
- An image display device for displaying a visible image to be visually recognized by a user comprises: a camera that captures the scene in front of the user and obtains a camera image; a distance measuring sensor that obtains data representing the distance to each position of a physical object included in the camera's field of view; an illuminance sensor that obtains data representing the brightness of the user's location; a generation device that, based on the data obtained by the distance measuring sensor, generates a distance image corresponding to the field of view in which each pixel represents the distance to each position; and a determination device that, based on the data obtained by the illuminance sensor, determines whether the user's location is a bright place or a dark place.
- The device further comprises a storage device that, when the determination device determines that the place is bright, stores the camera image obtained by the camera and the distance image obtained by the generation device as an image set comprising a bright place camera image and a bright place distance image;
- a recognition device that recognizes the distance image obtained by the generation device as a dark place distance image when the determination device determines that the place is dark;
- a search device that identifies the bright place distance image corresponding to the recognized dark place distance image by comparing the bright place distance images stored by the storage device with the dark place distance image recognized by the recognition device;
- a determination device that determines the visible image to be visually recognized by the user based on the bright place camera image included in the same image set as the bright place distance image identified by the search device; and a display that displays the visible image.
- FIG. 1 is an external view of an HMD according to Embodiment 1.
- FIG. 2 is a block diagram showing the hardware configuration of the HMD according to Embodiment 1.
- FIG. 3 is a functional block diagram showing the functional configuration of the HMD according to Embodiment 1.
- FIG. 4 is a processing flow diagram of the dark place guide program executed by the HMD according to Embodiment 1.
- FIG. 5 is a processing flow diagram of image set storage/management by the HMD according to Embodiment 1.
- FIGS. 6A and 6B are diagrams showing examples of images obtained by the HMD according to Embodiment 1.
- FIG. 7 is a diagram showing a first configuration example of an HMD system according to Embodiment 2.
- FIG. 8 is a diagram showing a second configuration example of the HMD system according to Embodiment 2.
- FIG. 9 is a block diagram showing the hardware configuration of an image storage service server.
- FIG. 10 is a diagram showing an example of the data structure of an image set.
- FIG. 11 is a functional block diagram showing the functional configuration of an HMD according to Embodiment 3.
- FIG. 12 is a flow diagram of the process of determining a visible image according to Embodiment 3.
- FIG. 13 is a diagram for explaining a first example of processing for determining a visible image by the HMD according to Embodiment 3.
- FIG. 14 is a diagram for explaining a second example of processing for determining a visible image by the HMD according to Embodiment 3.
- FIG. 15 is a diagram for explaining a third example of processing for determining a visible image by the HMD according to Embodiment 3.
- FIG. 16 is a flowchart of processing for comparison between distance images and match determination according to Embodiment 4.
- FIG. 17 is a diagram for explaining translation processing.
- FIG. 18 is a diagram for explaining scaling processing.
- FIG. 19 is a diagram for explaining combination processing using a plurality of distance images.
- FIG. 20 is an external view of an HMD according to Embodiment 5.
- FIG. 21 is a functional block diagram of an HMD according to Embodiment 5.
- FIG. 22 is a flowchart of generation/editing processing of a visible image according to Embodiment 5.
- FIGS. 23A and 23B are diagrams showing examples of a camera image and a visible image obtained in Embodiment 5.
- FIG. 24 is a diagram showing a first example of the appearance of an HMD according to Embodiment 6.
- FIG. 25 is a diagram showing a second example of the appearance of the HMD according to Embodiment 6.
- FIG. 26 is a diagram showing a gesture operation area formed by a finger of the user.
- FIG. 27 is a diagram showing an example of a user interface screen on the HMD.
- FIG. 28 is an external view of an HMD according to Embodiment 8.
- FIG. 29 is a diagram showing an example of the angle of view of the camera of a portable information terminal and the measurement range of the distance measuring sensor.
- FIG. 30 is a diagram showing an example of a state in which a dark place distance image is displayed on a portable information terminal.
- FIG. 31 is a diagram showing an example of a state in which a bright place camera image is displayed on the portable information terminal.
- Here, the "bright place distance image corresponding to the dark place distance image" or "another bright place distance image corresponding to the bright place distance image" means images that substantially coincide with each other, apart from permissible deviations.
- "Substantially match" means that deviations of the images in the linear direction, the rotational direction, and the scaling direction, as well as differences in color and in shading (brightness), are allowed within a range that poses no practical problem. For example, if the degree of matching or similarity between the images is greater than or equal to a certain threshold, they are determined to match.
- Embodiment 1: An HMD according to Embodiment 1 of the present invention will be described.
- The HMD of the first embodiment obtains a bright place camera image from the camera and, corresponding to this camera image, a bright place distance image based on the output of the ranging sensor, and saves and accumulates these images as a set.
- In a dark place, a dark place distance image is obtained based on the output of the distance measuring sensor, and a bright place distance image corresponding to (substantially matching) the dark place distance image is searched for and identified. Then, based on the bright place camera image included in the same set as that bright place distance image, a visible image to be visually recognized by the user is determined and displayed. Accordingly, in a dark place the user can view an image of the same place obtained in the past under bright conditions, improving the user's visibility.
- FIG. 1 is an external view of an HMD according to Embodiment 1.
- the HMD 1 according to this embodiment includes a camera 11, a distance sensor 12, a right-eye projector 13, a left-eye projector 14, an image screen (image display surface) 15, a nose pad 16, a controller 17, a microphone 18, a speaker 19, and frame housings 20a to 20c.
- The user wears the HMD 1 on his or her face using the frame housings 20a and 20b and the nose pad 16.
- the right-eye projector 13, left-eye projector 14, and image screen 15 constitute the display device of the HMD 1. Note that the right-eye projector 13 and the left-eye projector 14 are also referred to as projectors 13 and 14 hereinafter.
- the image screen 15 is a transflective screen that transmits light from the front of the user.
- the image screen 15 may be a non-transmissive display that does not transmit light.
- the camera 11 is attached so as to capture an image of the real space in front of the user.
- the camera 11 is a so-called digital camera that captures an image of its own field of view and obtains image data corresponding to the field of view.
- an image represented by this image data is called a camera image.
- the camera 11 repeats imaging at predetermined timings. Note that this imaging may be performed manually or automatically.
- The image is captured at a constant frame rate, which may be adjusted as appropriate according to the processing speed and memory capacity of the controller 17.
- As the frame rate, for example, about 1 to 30 fps (frames per second) is conceivable.
- the ranging sensor 12 measures distances from the sensor itself to physical objects corresponding to respective positions in the field of view of the camera 11, and obtains distance data representing those distances.
- This is substantially equivalent to the ranging sensor 12 obtaining distance data representing the distance from the user to the physical object corresponding to each position.
- The distance measuring sensor 12 is capable of distance measurement even in a dark place, and is composed of, for example, a sensor that, like the aforementioned LiDAR, emits infrared light and receives the reflected light from a physical object.
- the projectors 13 and 14 project CG images, camera images, etc. onto the image screen 15 and superimpose them on the background visible through the image screen 15 .
- the projectors 13 and 14 project the image for the left eye and the image for the right eye generated in consideration of the parallax on the image screen 15, respectively. This makes it possible to stereoscopically display a CG image as if it were at a predetermined distance in the real space.
- the controller 17 takes in image data obtained by the camera 11 and distance data obtained by the distance measuring sensor 12, and supplies these data to an internal memory or processor.
- The controller 17 also incorporates a group of sensors such as a GPS (Global Positioning System) sensor, an illuminance sensor, an acceleration sensor, a gyro sensor, and an orientation (magnetic) sensor. Further, the controller 17 creates the images to be projected by the projectors 13 and 14, the sounds to be output by the speaker 19, and the like.
- Controller 17, camera 11, ranging sensor 12, microphone 18, and speaker 19 are arranged in frame housings 20a-20c. It should be noted that the arrangement locations of these shown in FIG. 1 are examples, and may not be exactly as shown.
- FIG. 2 is a block diagram showing the hardware configuration of the HMD according to the first embodiment.
- The controller 17 of the HMD 1 includes therein an internal bus 30, a GPS sensor 31, an illuminance sensor 32, an acceleration sensor 33, a gyro sensor 34, an orientation sensor 35, a processor 36, a memory 37, an image memory 38, a non-volatile storage device 39, and a communication device 40.
- the blocks 31 to 40 are connected via an internal bus 30 and operate in cooperation with each other.
- the processor 36 is composed of, for example, a CPU (Central Processing Unit) or an MPU (Micro Processing Unit).
- the memory 37 and the image memory 38 are configured by, for example, RAM (Random Access Memory), which is a semiconductor storage device.
- the non-volatile storage device 39 is composed of single or multiple non-volatile memory media.
- a non-volatile memory medium for example, a programmable ROM (Read Only Memory) can be considered.
- The programmable ROM is composed of, for example, an EEPROM (Electrically Erasable and Programmable ROM) or an FROM (Flash ROM).
- the nonvolatile storage device 39 stores a basic operation program 41 and a dark place guide program 42 as processing programs.
- A save data area 43 is allocated in the non-volatile storage device 39.
- the saved data area 43 stores data and image data necessary for executing the processing program.
- The communication device 40 includes a mobile communication device conforming to 4G (4th generation) or 5G, a wireless LAN communication device, and the like.
- the communication device 40 selects an appropriate process from among the communication processes as necessary and connects the HMD 1 to the network.
- the image data to be sent to the projectors 13 and 14 is stored in the image memory 38 and read out.
- The processor 36 expands the basic operation program 41 and the dark place guide program 42 stored in the non-volatile storage device 39 into the memory 37 and executes them, thereby controlling and managing the basic operation and realizing the dark place guide function.
- <Functional block configuration of HMD> FIG. 3 is a functional block diagram showing the functional configuration of the HMD according to Embodiment 1.
- Each functional block shown in FIG. 3 is realized by the processor 36 expanding the dark place guide program into the memory 37 and executing it, in cooperation with the various sensors and devices provided in the HMD 1.
- As shown in FIG. 3, the HMD 1 according to the present embodiment includes, as functional blocks, a position information acquisition device 51, a time information acquisition device 52, an orientation information acquisition device 53, a posture information acquisition device 54, a camera image acquisition device 55, a distance image generation device (generation device) 56, a bright/dark place determination device (determination device) 57, a bright place image storage management device (storage device) 58, a bright place image storage device 59, a dark place distance image recognition device (recognition device) 60, a comparative image narrowing-down device 61, a bright place image search device 62, a visible image determination device 63, and a visible image display device (display device) 64.
- the position information acquisition device 51 acquires the coordinate data obtained by the GPS sensor 31, and based on the data, generates position information representing the location of the user.
- the time information acquisition device 52 acquires clock data including date and time from the electronic clock of the processor 36 and generates time information representing the date and time when the image was obtained.
- The orientation information acquisition device 53 acquires the orientation data obtained by the orientation sensor 35 and, based on the data, generates orientation information representing the horizontal orientation of the user's face, that is, the direction the user is facing.
- the posture information acquisition device 54 acquires data obtained by the acceleration sensor 33 and the gyro sensor 34, and based on the data, generates posture information representing the orientation of the user's face in the vertical direction.
- The camera image acquisition device 55 acquires the image data obtained by the camera 11 and performs image processing such as noise removal, interpolation, and size adjustment on the data as necessary to generate a camera image.
- the camera image acquisition device 55 continuously and repeatedly acquires image data at time intervals to generate camera images in time series.
- the distance image generation device 56 acquires the distance data obtained by the distance measuring sensor 12 and generates a distance image based on the data.
- a range image is an image having an area corresponding to the field of view area of the camera image.
- the distance image is an image in which the distance from the distance measuring sensor 12 to the physical object corresponding to each position in the visual field area is reflected in the pixel value (color or gray scale) of the pixels corresponding to each position.
- When the bright/dark place determination device 57, described below, determines that the location where the user is present is a bright place, the distance image generation device 56 generates a distance image corresponding to each generated camera image.
- When it is determined that the user is in a dark place, the distance image generation device 56 generates a distance image regardless of the presence or absence of a camera image.
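As a rough sketch of how a distance image might be generated from the ranging sensor's output, the following maps per-position distances to grayscale pixel values. The 8-bit depth, the 50 m sensor limit, and the nearer-is-brighter convention are illustrative assumptions, not taken from the text:

```python
import numpy as np

def make_distance_image(distance_data, max_range_m=50.0):
    """Map per-position distances (meters) over the camera's field of view
    to an 8-bit grayscale distance image. Nearer surfaces become brighter;
    max_range_m is an assumed sensor limit."""
    d = np.clip(np.asarray(distance_data, dtype=float), 0.0, max_range_m)
    # Invert so that closer objects get larger pixel values.
    return ((1.0 - d / max_range_m) * 255).astype(np.uint8)

# A 2x2 toy field of view: distances in meters.
depth = [[1.0, 25.0], [50.0, 0.0]]
img = make_distance_image(depth)
```

Each pixel of `img` now encodes the distance to the corresponding position, which is all the subsequent matching steps require.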
- the bright/dark place determination device 57 acquires the illuminance data obtained by the illuminance sensor 32 and determines whether the user's location is bright or dark based on the data.
- the bright/dark place determination device 57 determines the bright/dark place by, for example, threshold determination of the illuminance represented by the illuminance data.
- the determination of bright and dark places may be performed based on the brightness of the acquired camera image, or may be performed using both the illuminance data and the brightness of the camera image.
- determination of bright and dark places may be performed using position information and time information as an auxiliary.
- For example, when the illuminance represented by the illuminance data falls between the bright place and dark place thresholds, the bright/dark place determination device 57 refers to the position information and the time information. Based on these pieces of information, it detects the user's location and the date and time, and determines whether or not it is after an evening time set in advance for each location. If it is before the evening time, the place is determined to be a bright place; if it is after the evening time, a dark place.
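The threshold test with the time-of-day fallback described above can be sketched as follows; the lux thresholds and the 18:00 evening time are illustrative assumptions, since the text gives no concrete values:

```python
from datetime import datetime, time

LUX_BRIGHT = 100.0  # assumed bright-place threshold (not specified in the text)
LUX_DARK = 10.0     # assumed dark-place threshold

def is_bright_place(lux, now=None, evening=time(18, 0)):
    """Decide bright vs. dark from an illuminance reading, falling back to
    the preset evening time when the reading is ambiguous."""
    if lux >= LUX_BRIGHT:
        return True
    if lux <= LUX_DARK:
        return False
    # Ambiguous reading: bright before the evening time, dark after it.
    now = now or datetime.now()
    return now.time() < evening

assert is_bright_place(500.0)       # clearly bright
assert not is_bright_place(1.0)     # clearly dark
```

In a fuller version the fallback could also consult the camera image brightness, as the text permits.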
- The bright place image storage management device 58 saves the camera image and the distance image obtained when the bright/dark place determination device 57 determines that the user's location is a bright place, as an image set comprising a bright place camera image and a bright place distance image.
- The bright place image storage management device 58 also associates the position information, time information, orientation information, and posture information at the time the image set was obtained with the image set as metadata, and saves them.
- The storage destination of the image set and metadata is the bright place image storage device 59, which will be described later.
- The bright place image storage device 59 stores the image sets and metadata saved by the bright place image storage management device 58. That is, in the bright place image storage device 59, image sets are sequentially saved and accumulated for each location where the user was and for each orientation and posture of the user.
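One way to picture the stored record is a small data structure holding the image pair plus its metadata. The field names and types here are illustrative assumptions; the text only specifies which pieces of metadata accompany each set:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ImageSet:
    """A bright place camera image and distance image stored together with
    the metadata the storage management device associates with the pair."""
    camera_image: bytes    # bright place camera image
    distance_image: bytes  # bright place distance image
    position: tuple        # (latitude, longitude) from the GPS sensor
    timestamp: datetime    # when the pair was obtained
    heading_deg: float     # horizontal face orientation
    pitch_deg: float       # vertical face orientation (posture)

# The bright place image storage device, modeled as a simple list.
store = []
store.append(ImageSet(b"...", b"...", (35.0, 139.0),
                      datetime(2021, 5, 14, 10, 0), 90.0, 0.0))
```

Accumulating one such record per capture gives the per-location, per-orientation history the search step later draws on.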
- the dark place distance image recognition device 60 recognizes the distance image obtained when the bright place/dark place determination device 57 determines that the place where the user is located is a dark place as a dark place distance image.
- The comparative image narrowing-down device 61 narrows down the bright place distance images to be compared with the dark place distance image from the bright place distance images stored in the bright place image storage device 59. The narrowing down is performed using the associated metadata. That is, based on the position information, time information, orientation information, and posture information at the time the recognized dark place distance image was obtained, bright place distance images whose location, date, orientation, and posture approximate those of the user within a predetermined range are selected. In the present embodiment the narrowing down uses position information, time information, orientation information, and posture information, but a subset of these may be used instead. Alternatively, the narrowing down by the comparative image narrowing-down device 61 may itself be omitted.
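A minimal sketch of this metadata-based narrowing, using only position and orientation (the tolerances and the dict-based record shape are illustrative assumptions):

```python
def narrow_candidates(store, position, heading_deg,
                      max_pos_diff=0.001, max_heading_diff=30.0):
    """Keep only stored image sets whose recorded position and heading are
    within assumed tolerances of the current dark-place reading."""
    out = []
    for s in store:
        d_lat = abs(s["position"][0] - position[0])
        d_lon = abs(s["position"][1] - position[1])
        d_head = abs(s["heading_deg"] - heading_deg) % 360
        d_head = min(d_head, 360 - d_head)  # shortest angular difference
        if d_lat <= max_pos_diff and d_lon <= max_pos_diff \
                and d_head <= max_heading_diff:
            out.append(s)
    return out

store = [
    {"position": (35.0000, 139.0000), "heading_deg": 90.0},
    {"position": (35.1000, 139.0000), "heading_deg": 90.0},  # too far away
]
hits = narrow_candidates(store, (35.0002, 139.0001), 100.0)
```

Restricting the candidate list this way keeps the per-image comparison that follows cheap, which matters on a wearable device.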
- the bright place image retrieval device 62 sequentially reads bright place distance images to be compared with the recognized dark place distance images, and compares the dark place distance images with the read bright place distance images. Then, based on the result of this comparison, a bright place distance image corresponding to the dark place distance image, that is, a bright place distance image that matches within the allowable range is searched and specified.
- the comparison between the dark place distance image and the light place distance image includes translation, rotation (inclination) movement, enlargement/reduction, pixel value adjustment, and the like for at least one of the distance images.
- For the comparison, the degree of matching between the recognized dark place distance image and the read bright place distance image is calculated; if the degree of matching is greater than or equal to a threshold, the images are judged to match, and if it is less than the threshold, they are judged not to match.
- For example, the differences in the pixel value levels of corresponding pixels or pixel groups, such as the differences in average pixel values, are obtained, and the magnitudes of these differences are comprehensively evaluated to calculate the degree of matching, which is then compared against the threshold.
- As the degree of matching, for example, a value obtained by multiplying the variance or deviation of the pixel value level differences at each position or region by a negative coefficient, or the reciprocal of the variance or deviation, is conceivable.
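The reciprocal-of-variance option named above can be sketched directly; the threshold value and the epsilon guard are illustrative assumptions:

```python
import numpy as np

def matching_degree(dark_img, bright_img):
    """Degree of matching as the reciprocal of the variance of the per-pixel
    level differences (one option the text names). Higher means more alike;
    a uniform brightness offset yields zero variance and a very high degree,
    consistent with shading differences being tolerated."""
    diff = np.asarray(dark_img, dtype=float) - np.asarray(bright_img, dtype=float)
    var = np.var(diff)
    return 1.0 / (var + 1e-9)  # epsilon avoids division by zero on a perfect match

def images_match(dark_img, bright_img, threshold=0.01):
    # The threshold is an illustrative value, not specified by the text.
    return matching_degree(dark_img, bright_img) >= threshold

a = [[10, 20], [30, 40]]
b = [[12, 22], [32, 42]]  # same structure, uniform offset -> a match
```

A production version would first align the images (translation, rotation, scaling) as described above before computing the degree.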
- the matching determination is performed excluding the different regions and the like, which will be described later.
- artificial intelligence may be used to determine whether or not there is a match within the allowable range.
- the artificial intelligence may be trained to match images representing objects that may exist in real space.
- the visible image determination device 63 determines a visible image to be visually recognized by the user based on the result of comparison between distance images by the bright image search device 62 .
- When a bright place distance image that substantially matches the dark place distance image is identified, the visible image determination device 63 determines the visible image based on the bright place camera image included in the same image set as that bright place distance image. If several matching distance images are identified, the newest one or the brightest one may be chosen, or the user may choose. On the other hand, if no bright place distance image substantially matching the dark place distance image is identified, the visible image is determined based on the dark place distance image.
- The visible image may be the bright place camera image itself, or may be the dark place distance image itself.
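The decision logic just described, preferring the bright place camera image from a matching set and falling back to the dark place distance image, can be sketched as follows. The newest-first tie-break is only one of the options the text allows, and the (timestamp, image) pair shape is an illustrative assumption:

```python
def decide_visible_image(matches, dark_distance_image):
    """Pick the visible image to present to the user.

    matches: list of (timestamp, bright_camera_image) pairs whose distance
    images substantially matched the dark place distance image. If any exist,
    return the newest bright place camera image; otherwise fall back to the
    dark place distance image itself."""
    if matches:
        matches.sort(key=lambda m: m[0], reverse=True)  # newest first
        return matches[0][1]
    return dark_distance_image

assert decide_visible_image([], "dark_img") == "dark_img"
assert decide_visible_image([(1, "old"), (2, "new")], "dark_img") == "new"
```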
- the visual image display device 64 projects and displays the image on the image screen 15 using the projectors 13 and 14 so that the user can visually recognize the determined visual image.
- When the user is moving, it may be difficult to display the bright place camera image at the same position as a moving image without delay while detecting the position. In such a case, a still image may be displayed according to the user's movement position.
- FIG. 4 is a processing flow diagram of the dark place guide program by the HMD according to the first embodiment.
- In step S1, a process of acquiring a camera image is performed.
- Specifically, the image data obtained by the camera 11 is acquired by the camera image acquisition device 55.
- Acquisition of the camera image may be performed in synchronization with the timing of camera imaging, or an image obtained by imaging at an arbitrary timing while continuously performing image imaging may be acquired.
- When image capturing is performed continuously, it is performed at a frame rate of, for example, about 10 to 60 fps (frames per second).
- In step S2, a process of acquiring distance data is performed. Specifically, the distance image generation device 56 acquires distance data representing the distance from the distance measuring sensor to the physical objects included in the field of view of the camera 11.
- In step S3, a process of generating a distance image is performed. Specifically, the distance image generation device 56 generates a distance image corresponding to the visual field area of the camera 11 based on the acquired distance data.
- In step S4, processing is performed to determine whether the place is a bright place or a dark place.
- the bright/dark place determination device 57 acquires the illuminance data from the illuminance sensor 32 and determines whether the user's place is bright or dark based on the illuminance data. Note that this determination may be performed based on the brightness of the acquired camera image, as described above, or may be performed using position information, time information, and the like as an auxiliary.
- When it is determined that the place is bright (S4, Yes), the process proceeds to step S5; when the place is dark (S4, No), the process proceeds to step S7.
- In step S5, a process of acquiring metadata is performed. Specifically, the bright place image storage management device 58 acquires time information, position information, orientation information, and posture information as metadata to be associated with the image set.
- In step S6, processing for saving, updating, and managing the image set is performed.
- Specifically, the bright place image storage management device 58 sets the obtained camera image and distance image as the bright place camera image and the bright place distance image, associates the image set including these images with the acquired metadata, and stores them in the bright place image storage device 59. After that, the process proceeds to step S17.
- An image set whose acquisition date and time is earlier than the current time by a certain time (first time) or more, that is, an old image set, may be erased (deleted) from the bright place image storage device 59.
- For example, one month to one year can be considered as the certain time.
- The details of the image set storage/management processing in step S6 will be described later.
- In step S7, a process of recognizing the dark place distance image is performed. Specifically, when the dark place distance image recognition device 60 receives the determination that the place where the user is located is a dark place, it recognizes the obtained distance image as a dark place distance image.
- In step S8, a process of narrowing down the bright place distance images to be compared is performed.
- Specifically, the comparative image narrowing-down device 61 narrows down the bright place distance images to be compared with the recognized dark place distance image from among the past bright place distance images stored in the bright place image storage device 59. The narrowing down of the bright place distance images is performed using the associated metadata. That is, based on the positional information obtained when the dark place distance image was acquired, bright place distance images whose locations match or are similar to the user's location within a predetermined range are searched for and selected.
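The metadata-based narrowing of step S8 can be sketched as a simple position filter. The flat (x, y) coordinates in metres, the field names, and the 10 m default range are illustrative assumptions.

```python
import math

def narrow_by_position(image_sets, current_pos, max_range_m=10.0):
    """Select stored bright place image sets whose recorded position lies
    within max_range_m of the user's current position.

    image_sets: list of dicts, each carrying a 'position' (x, y) metadata
    entry; current_pos: the user's (x, y) position.
    """
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return [s for s in image_sets if dist(s["position"], current_pos) <= max_range_m]
```

The surviving image sets are then read out one by one in step S9 for comparison.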
- In step S9, a process of reading out a bright place distance image is performed. Specifically, the bright place image search device 62 reads one bright place distance image from the narrowed-down bright place distance images.
- In step S10, a process of comparing the dark place distance image and the bright place distance image is performed.
- Specifically, the bright place image search device 62 compares the recognized dark place distance image and the read bright place distance image, and calculates an evaluation value reflecting the degree of matching between the distance images.
- When the bright place distance image is not saved and only the camera image is stored, features of the dark place distance image and features of the bright place camera image may be extracted and compared.
- In step S11, a process is performed to determine whether the compared distance images have a corresponding relationship, that is, whether they substantially match. Specifically, the bright place image search device 62 determines whether or not the recognized dark place distance image substantially matches the read bright place distance image by threshold determination of the calculated evaluation value. In this determination, if it is determined that they match (S11, Yes), the process proceeds to step S12. On the other hand, if it is determined that they do not match (S11, No), the process proceeds to step S13.
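The evaluation value and threshold determination of steps S10 and S11 can be sketched as follows. The mean absolute difference is used here only as a stand-in metric; the specification does not fix the evaluation function or the threshold value.

```python
def evaluation_value(dist_img_a, dist_img_b):
    """Mean absolute difference between two equal-sized distance images,
    each given as a 2-D list of distances (an assumed metric)."""
    total, count = 0.0, 0
    for row_a, row_b in zip(dist_img_a, dist_img_b):
        for a, b in zip(row_a, row_b):
            total += abs(a - b)
            count += 1
    return total / count

def substantially_match(dist_img_a, dist_img_b, threshold=0.1):
    """Threshold determination of step S11: smaller value = better match."""
    return evaluation_value(dist_img_a, dist_img_b) <= threshold
```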
- In step S12, processing is performed to use the bright place camera image as the basis of the visible image.
- Specifically, the visible image determination device 63 sets, as the basis of the visible image to be visually recognized by the user, the bright place camera image included in the same image set as the bright place distance image determined to match in step S11. After that, the process proceeds to step S15.
- In step S13, a process is performed to determine whether all the comparisons for the narrowed-down bright place distance images have been completed. Specifically, the bright place image search device 62 determines whether any bright place distance image remains to be compared among the narrowed-down bright place distance images. If it is determined that all comparisons have been completed (S13, Yes), the process proceeds to step S14. On the other hand, if it is determined that all the comparisons have not been completed (S13, No), the process returns to step S9, and the next bright place distance image to be compared is read.
- In step S14, processing is performed to use the dark place distance image as the basis of the visible image.
- Specifically, the visible image determination device 63 sets the recognized dark place distance image as the basis of the visible image to be visually recognized by the user.
- In step S15, a process of generating and editing a visible image is performed.
- Specifically, the visible image determination device 63 generates and determines a visible image based on the bright place camera image or dark place distance image set as the basis.
- At this time, editing is performed, such as pasting a CG image or an AR object image onto the image set as the basis, or emphasizing a part of the image set as the basis.
- In step S16, processing for displaying the visible image is performed.
- Specifically, the visible image display device 64 projects and displays the determined visible image on the image screen 15 so that the user can visually recognize it.
- In step S17, processing is performed to determine whether or not to continue the dark place guide program. For example, when the user moves from a dark place to a bright place, when the environment changes so that execution of the dark place guide program is deemed unnecessary, when the user inputs a command to stop or end the dark place guide program, or when an internal processing error or communication error occurs, it is determined not to continue (S17, No), and the dark place guide program ends. On the other hand, if there is no reason to end the process, it is determined to continue (S17, Yes), the process returns to step S1, and the dark place guide program continues.
- FIG. 5 is a processing flow diagram of image set storage/management by the HMD according to the first embodiment.
- In step S61, a process of narrowing down the past image sets to be compared is performed.
- Specifically, based on the metadata acquired at the current time, the bright place image storage management device 58 narrows down the comparison targets from among the past bright place distance images stored in the bright place image storage device 59 to those bright place distance images whose imaging location, user direction (orientation), and user posture match the current ones within a predetermined range. Note that this narrowing down may be performed based only on the imaging location, without considering the direction and posture at the time of imaging.
- In step S62, a process of reading one past bright place distance image is performed.
- Specifically, the bright place image storage management device 58 reads out one of the past bright place distance images narrowed down in step S61.
- At this time, the images may be read in order starting from the one whose location and direction, recognized from the position information included in the metadata, are closest to the current location and direction.
- In step S63, a process of comparing the current bright place distance image and the past bright place distance image is performed. Specifically, the bright place image storage management device 58 compares the currently acquired bright place distance image with the past bright place distance image read in step S62.
- In step S64, a process is performed to determine whether the compared bright place distance images are the same. Specifically, the bright place image storage management device 58 determines whether the currently acquired bright place distance image and the past bright place distance image read out in step S62 match within an allowable range, that is, whether they are substantially the same. If the distance images are completely different from each other or are shifted by a predetermined level or more, it is determined that they are not the same. In this determination, if it is determined that they are the same (S64, Yes), the process proceeds to step S65. On the other hand, if it is determined that they are not the same (S64, No), the process returns to step S62, reads out the bright place distance image to be compared next, and continues the processing.
- In step S65, a process of recognizing the recording date and time of the past bright place distance image is performed. Specifically, the bright place image storage management device 58 reads and recognizes the recording date and time included in the metadata of the past bright place distance image read in step S62.
- In step S66, a process of determining whether or not the read past bright place distance image is old is performed. Specifically, the bright place image storage management device 58 determines whether or not the recording date and time recognized in step S65 is earlier than the current time by a predetermined period or more, that is, whether or not it is old. In this determination, if it is determined to be old (S66, Yes), the process proceeds to step S67. On the other hand, if it is determined that it is not old (S66, No), the process proceeds to step S68.
- In step S67, a process of deleting the past image set is performed. Specifically, the bright place image storage management device 58 deletes the image set containing the past bright place distance image read in step S62. Then, the process proceeds to step S70.
- In step S68, a process is performed to determine whether the recognized recording date and time is substantially the same as the current date and time. Specifically, the bright place image storage management device 58 determines whether the time difference between the recording date and time recognized in step S65 and the current date and time is within a preset, relatively short period of time, that is, whether the dates and times are substantially the same. The preset time can be, for example, one hour to one day, but is not limited to this. In this determination, if it is determined that the dates and times are substantially the same (S68, Yes), the process proceeds to step S69. On the other hand, if it is determined that they are not the same (S68, No), the process proceeds to step S70.
- In step S69, a process of overwrite saving or appending the recording date and time is performed.
- Specifically, the bright place image storage management device 58 overwrites the image set containing the bright place distance image read out in step S62 with a new image set composed of the currently acquired bright place distance image and bright place camera image.
- Alternatively, only the recording date and time, which is metadata, may be updated to the current date and time for the image set including the past bright place distance image read in step S62. In this case, since there is no substantial difference in the image itself, there is no problem even if only the recording date and time is updated, and the processing is thereby simplified.
- After step S69, the process proceeds to step S70.
- In step S70, processing is performed to determine whether any past bright place distance image to be compared still remains. Specifically, the bright place image storage management device 58 determines whether the comparison of all the past bright place distance images narrowed down in step S61 has been finished, so that no past bright place distance image to be compared remains. In this determination, if it is determined that none remains (S70, Yes), the process proceeds to step S71. On the other hand, if it is determined that one remains (S70, No), the process returns to step S62, reads the next past bright place distance image, and continues the processing.
- In step S71, processing is performed to determine whether the image set has been overwritten and saved. Specifically, the bright place image storage management device 58 determines whether or not the process of step S69, that is, the overwrite saving of the currently acquired image set or the appending of the recording date and time, has already been performed. In this determination, if it is determined that overwrite saving or the like has already been performed (S71, Yes), the image set storage/management processing is terminated. On the other hand, if it is determined that overwrite saving or the like has not been performed (S71, No), the process proceeds to step S72.
- In step S72, processing for newly saving the image set is performed. Specifically, the bright place image storage management device 58 newly stores the acquired image set in the bright place image storage device 59 in association with the currently obtained metadata. Then, the image set saving/management process ends.
- In this way, a past image set that is substantially the same as the acquired image set is searched for and identified. Then, among the identified image sets, old ones are deleted, those that are not old but whose recording date and time differ by a certain time or more are left as they are, and those whose recording date and time are substantially the same are overwritten and saved, or have their recording date and time appended, that is, updated.
- When the recording date and time differ from those of any past image set, or when at least part of the images differ by a certain level or more, the acquired image set is newly saved. As a result, a required image set can be held for a required period of time, an unlimited increase in the storage capacity used for image sets can be suppressed, and the storage capacity can be properly maintained.
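The save/update/delete policy of steps S61 to S72 can be condensed into the following sketch. The data layout, the 180-day "old" limit, and the one-hour "same date and time" window are illustrative assumptions; the specification only bounds these periods loosely (one month to one year, and one hour to one day, respectively).

```python
from datetime import datetime, timedelta

OLD_LIMIT = timedelta(days=180)   # assumed "first time": delete older matching sets
SAME_LIMIT = timedelta(hours=1)   # assumed "substantially same date and time" window

def manage_image_sets(stored, new_set, same_image, now):
    """stored: list of dicts with keys 'image' and 'recorded' (datetime).
    same_image(a, b) -> True when two distance images substantially match.
    Returns the updated list of stored image sets."""
    overwritten = False
    kept = []
    for s in stored:
        if not same_image(s["image"], new_set["image"]):
            kept.append(s)                              # unrelated set: keep
        elif now - s["recorded"] >= OLD_LIMIT:
            pass                                        # S67: delete old matching set
        elif now - s["recorded"] <= SAME_LIMIT:
            kept.append({**new_set, "recorded": now})   # S69: overwrite and save
            overwritten = True
        else:
            kept.append(s)                              # not old, not recent: leave as is
    if not overwritten:
        kept.append({**new_set, "recorded": now})       # S72: newly save
    return kept
```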
- FIG. 6 is a diagram illustrating an example of images obtained by the HMD according to the first embodiment. FIG. 6 shows an example of a camera image, a distance image, and a visible image obtained in bright and dark places.
- In FIG. 6, the upper stage shows images in a bright place recorded in the past, and the middle stage shows images in a dark place at the present time. On the left side are the camera images, and on the right side are the distance images.
- In practice, the distance image is shown in different colors depending on the distance, but here it is shown in black and white (gray scale).
- Reference numeral 80A denotes a clear bright place camera image, and 81A denotes a bright place distance image corresponding to the bright place camera image 80A.
- Reference numeral 82B denotes a dark place camera image obtained at the present time, which is unclear because the surroundings are dark.
- Reference numeral 83B denotes a dark place distance image corresponding to the dark place camera image 82B and obtained almost simultaneously with it; it is obtained with a sharpness comparable to that of the bright place distance image 81A.
- In this case, the bright place camera image whose distance image matches is determined as the visible image 84B to be visually recognized by the user. Then, the visible image is displayed on the display surface of the HMD. This allows the user to check the situation at the site even in a dark place.
- Alternatively, the contour of an object may be extracted from the dark place distance image 83B and compared with the contour of the object extracted from the bright place distance image 81A or the bright place camera image 80A, whereby it can be confirmed that the images represent the field of view from the same position. Even in such a case, by displaying the bright place camera image on the HMD, the user can check the situation at the site in a dark place.
- In this way, by displaying the previously saved camera image of the same field of view taken in a bright place, it becomes possible to help the user see ahead. For example, at a work site that is dark and has poor visibility, by providing a clear camera image obtained in a bright place instead of a camera image taken in the dark, the user can confirm the presence of objects such as obstacles and can work more safely.
- In the embodiment, the following processing is performed when a corresponding image, that is, a substantially matching bright place distance image, is already saved. That is, if the acquisition date and time of the already saved corresponding image is within a certain period of the acquisition date and time of the image to be saved, the corresponding image is not overwritten and saved; only its recording date and time is updated to the current date and time. If the acquisition date and time of the corresponding image is earlier than the acquisition date and time of the image to be stored by a predetermined period or more, the corresponding image is deleted. By doing so, it is possible to simplify and speed up the processing, reduce energy consumption, and use the storage area efficiently.
- In the embodiment, the location where the user is currently located is specified by GPS or the like, and a search is performed on the bright place distance images narrowed down to that location. As a result, the image search time can be shortened.
- In the embodiment, a transflective screen that transmits light from in front of the user is used as the image screen, which is the image display surface.
- In the embodiment, the dark place distance image and the bright place distance image are compared to determine whether they match.
- Alternatively, the match determination may be performed by comparing the dark place distance image and the bright place camera image.
- In that case, the outline of an object may be extracted from the dark place distance image, the outline of the object may be extracted from the bright place camera image, and the two outlines may be compared to determine whether they match. This makes it possible to provide the user with an image with good visibility even when a bright place camera image is stored without a bright place distance image.
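The outline-based match determination described above can be sketched as follows. A simple neighbouring-pixel edge map and a Jaccard overlap stand in for a real contour extractor and matcher; both are illustrative assumptions, not techniques fixed by the specification.

```python
def edge_map(img, step=1.0):
    """Mark pixels whose right or lower neighbour differs by >= step.
    img: 2-D list of gray levels or distances."""
    h, w = len(img), len(img[0])
    edges = set()
    for y in range(h):
        for x in range(w):
            if x + 1 < w and abs(img[y][x] - img[y][x + 1]) >= step:
                edges.add((y, x))
            if y + 1 < h and abs(img[y][x] - img[y + 1][x]) >= step:
                edges.add((y, x))
    return edges

def outline_similarity(img_a, img_b, step=1.0):
    """Jaccard overlap of the two edge maps, in [0, 1]."""
    ea, eb = edge_map(img_a, step), edge_map(img_b, step)
    if not ea and not eb:
        return 1.0
    return len(ea & eb) / len(ea | eb)
```

A dark place distance image and a bright place camera image of the same field of view would then yield a high similarity even though their pixel values are of different kinds.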
- the algorithm of the dark place guide program can be simplified, and the development cost can be reduced.
- When the user is indoors or the like and GPS position information cannot be obtained, position information and direction data obtained from network access points, UWB (Ultra Wide Band), and the like, included in the metadata of the image set, may be used. That is, the search may be performed preferentially on bright place distance images having metadata whose position information and direction information are substantially the same as the position information obtained from an access point or the like and the direction information detected by the direction sensor at the user's location. As a result, the search time for the bright place distance image can be shortened even when GPS position information cannot be used.
- In Embodiment 1, when the initial position of the user can be determined by GPS or a wireless communication access point, it is also possible thereafter to specify the location of the user using a method such as pedestrian dead reckoning (PDR) and to search for a bright place distance image corresponding to the line-of-sight direction from that location. As a result, the dark place guide program can be executed continuously even when the user moves from outdoors to indoors.
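Pedestrian dead reckoning as mentioned above can be sketched, under the simplifying assumptions of a flat floor and a fixed stride length, as one position update per detected step; the stride value and heading convention below are illustrative assumptions.

```python
import math

def pdr_update(pos, heading_deg, stride_m=0.7):
    """Advance an (x, y) position estimate by one detected step along
    heading_deg (0 degrees = +y axis, increasing clockwise)."""
    rad = math.radians(heading_deg)
    return (pos[0] + stride_m * math.sin(rad),
            pos[1] + stride_m * math.cos(rad))
```

In practice the step events and heading would come from the acceleration and direction sensors of the HMD.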
- In Embodiment 1, the specified bright place camera image is used as the visible image as it is, but a processed image may be used as the visible image.
- For example, the color of the specified bright place camera image may be converted to black-and-white or sepia, or rendered in rich colors such as those used in CG or animation.
- Alternatively, a virtual reality space corresponding to the user's location, the direction the user is facing, and the user's posture may be displayed, and a CG image artificially created based on the specified bright place camera image may be superimposed on it.
- Embodiment 2 is an HMD system in which an HMD and a server are connected via a network.
- an image storage service including a dark place guide function is provided to the user by cooperation between the HMD and the server.
- FIG. 7A is a diagram showing a first configuration example of the HMD system according to Embodiment 2. FIG. 7A shows the configuration of an HMD system 100 in which a single HMD and a server are connected via a network.
- the HMD system 100 includes an HMD 1, a user 70, wireless communication signals 71a and 71b using electromagnetic waves or the like, an access point 72, a network 73, and an image storage service server (server) 74.
- the network 73 is, for example, a wide area communication network, specifically the Internet, Ethernet, an industrial communication network, or the like.
- FIG. 8 is a block diagram showing the hardware configuration of the image storage service server.
- The image storage service server 74 has an internal bus 740, a network I/F (interface) 741, a processor 742, a memory 743, and a storage 744.
- The storage 744 stores a basic operation program 745 and is provided with an image data storage area 746.
- the storage 744 is composed of, for example, a hard disk or a semiconductor memory.
- An image set including range images from HMD 1 is stored in storage 744 via network I/F 741, processor 742, and the like.
- The basic operation program 745 is executed using the processor 742 and the memory 743, and management processing for multiple users is performed.
- a user 70 wears the HMD 1 on the head and looks ahead.
- The HMD 1 is connected to a network 73 via communication signals 71a and 71b and an access point 72, and an image storage service server 74 is connected to the network 73.
- The HMD 1 executes the dark place guide program, but the image sets and metadata of the bright place distance images and bright place camera images are stored in the image data storage area 746 of the image storage service server 74, not in the HMD 1, and are read out as needed. Alternatively, a necessary group of image sets corresponding to the location is downloaded in advance from the image storage service server 74 to the HMD 1 and used. This is the image storage service.
- This HMD system comprises an image storage service server 74 connected to the network 73, and the HMD 1 connected to the network 73 and communicating with the image storage service server 74.
- The HMD 1 also includes a camera that captures the front of the user 70 to obtain a camera image, a distance measurement sensor that obtains data representing the distance from the user 70 to a physical object corresponding to each position in the field of view of the camera, an illuminance sensor that obtains data representing the brightness of the location of the user 70, and a display device for displaying a visible image for the user 70 to visually recognize.
- the image storage service server 74 and the HMD 1 cooperate with each other to execute the following various processes.
- With this configuration, the image sets and metadata, which tend to be large, can be stored in the image storage service server 74 instead of inside the HMD 1, thereby reducing the storage capacity required of the HMD 1.
- In addition, processing with a large load can be performed on the image storage service server 74 side instead of the HMD 1 side, and only the result of the processing can be sent to the HMD 1, thereby reducing the required processing specifications of the HMD 1 and realizing high-speed processing.
- FIG. 7B is a diagram showing a second configuration example of the HMD system according to the second embodiment.
- FIG. 7B shows the configuration of an HMD system 101 in which multiple HMDs and servers are simultaneously connected via a network.
- the example shown in FIG. 7B is an example in which there are users wearing HMDs at a plurality of locations. There may be multiple users at each of multiple locations. A plurality of users share the image storage service provided by the image storage service server 74 .
- In FIG. 7B, 1a and 1b are HMDs, 70a and 70b are users, 71c, 71d, and 71e are communication signals, 72 is an access point, 73 is a network, and 74 is an image storage service server.
- The user 70a is at Site abc (75a) and uses the image storage service, and the user 70b is at Site aaa (75b) and uses the image storage service.
- Each user saves the bright place camera images and bright place distance images obtained in bright light, together with metadata, in the image storage service server 74 as image sets corresponding to each position and direction.
- The stored image sets can be accessed by the respective users from various locations to recall the most recent required image set.
- As described above, this system comprises an image storage service server (server) 74 connected to a network 73, and a plurality of HMDs (mobile image display devices) 1a and 1b connected to the network 73 and communicating with the image storage service server 74.
- Each of the plurality of HMDs 1a and 1b includes a camera that captures images in front of the users 70a and 70b to obtain camera images, a distance measurement sensor that obtains data representing the distances from the HMDs 1a and 1b to physical objects corresponding to respective positions in the camera's field of view, an illuminance sensor that obtains data representing the brightness of the location where the users 70a and 70b are present, a display device for displaying a visible image to be visually recognized by the users 70a and 70b, and a position information acquisition device for obtaining position information of the HMDs 1a and 1b.
- The first determination process determines whether the user is in a bright place or a dark place.
- the image storage service server 74 and one HMD out of the plurality of HMDs 1a and 1b cooperate with each other to execute the following various processes.
- image sets are accumulated one after another by a plurality of users, and the accumulated image sets are shared.
- Although the connection between the HMD and the network is wireless in this embodiment, it may be wired.
- Example of image set data structure: an example of the data structure of the image set is shown below.
- FIG. 9 is a diagram showing an example of the data structure of an image set. As shown in FIG. 9, an image set including a camera image and a range image and its metadata are associated with each other to form a data set T10.
- The data set T10 has an image storage service ID T11 and position information (including site information, if the site can be specified) T12a and T12b.
- the position information is GPS coordinates obtained from a GPS sensor, and the site information is the name of the site, building, etc. specified from the GPS coordinates.
- The image set data is thus sorted by site or position information.
- the data set T10 has date and time T13a and T13b when the image set was acquired.
- Data set T10 may also include orientation data representing the orientation the user is facing.
- The data set T10 further includes camera image data T14a and T14b and distance image data T15a and T15b. A camera image and a distance image are managed as a pair of data.
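One record of the data set T10 can be sketched as the following structure. The field names and types are illustrative assumptions; the specification fixes only the kinds of data (service ID, position/site information, date and time, camera image, distance image, and optional direction data).

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional, Tuple

@dataclass
class ImagePair:
    camera_image: List[List[int]]        # T14a/T14b: camera image data
    distance_image: List[List[float]]    # T15a/T15b: distance image data

@dataclass
class ImageSetRecord:
    service_id: str                      # T11: image storage service ID
    position: Tuple[float, float]        # T12a/T12b: GPS coordinates
    site: Optional[str]                  # site/building name, if identified
    recorded: datetime                   # T13a/T13b: acquisition date and time
    direction_deg: Optional[float]       # optional orientation data
    images: ImagePair                    # camera/distance pair
```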
- the site is shown as an indoor building, but outdoor location information detected from GPS or the like may also be used.
- In addition, the degree of location matching may be estimated by comparing the distance image with the stored camera image. For example, the comparison may be made using contour components of the camera image and the distance image.
- According to the second embodiment, it is possible to realize an HMD system having the same effects as the HMD of the first embodiment, and to reduce the implementation cost of the HMD by storing the image data on the server.
- Furthermore, since images saved by a plurality of users can be shared and used by the plurality of users, a synergistic effect is produced, and the range of places where this system can be used, and the range of its users, can be expanded.
- Embodiment 3 is an HMD that detects a difference area between a dark place distance image and the corresponding bright place distance image and determines the visible image according to the detection result. That is, when there is a difference area between the current dark place distance image and the bright place distance image corresponding to it, the HMD according to the present embodiment finds out which distance image is the cause of the difference, executes processing according to the result, and determines a more suitable visible image.
- FIG. 10 is a functional block diagram showing a functional configuration of the HMD according to Embodiment 3.
- The HMD 1c according to the third embodiment is based on the functional configuration of the HMD 1 according to the first embodiment, and further includes a different area detection device (detection device) 65, a difference factor determination device (determination device) 66, and a bright place distance image selection device (selection device) 67.
- the different area detection device 65 detects a different area between the current dark place distance image and the specified past bright place distance image.
- The difference factor determination device 66 compares and analyzes both distance images, and determines whether the cause of the difference lies on the dark place distance image side or on the bright place distance image side.
- The bright place distance image selection device 67 determines whether or not a bright place distance image that corresponds to the current dark place distance image and whose acquisition time is further back in the past than the previously specified bright place distance image is stored. If so, it selects and reads out such a bright place distance image as a further past bright place distance image.
- The visible image determination device 63 edits and determines the visible image according to the presence or absence of a difference between the dark place distance image and the bright place distance image, the cause of the difference when there is one, the presence or absence of a further past bright place distance image, and the like.
- the processing flow of the dark place guide program according to the third embodiment is basically the same as that of the first embodiment, but differs only in the process of determining the visible image. Therefore, only the flow of the process of determining the visible image will be described here, and the description of other flows will be omitted.
- FIGS. 11A and 11B are flow diagrams of the process of determining a visible image according to the third embodiment. This flow chart corresponds to step S12 in the processing flow according to the first embodiment shown in FIG.
- In step S121, a process of comparing the bright place distance image and the dark place distance image is performed. Specifically, the different area detection device 65 compares the bright place distance image and the dark place distance image.
- In step S122, processing is performed to determine whether there is a difference between the compared distance images based on the comparison result in step S121. Specifically, the different area detection device 65 determines whether or not there is a difference between the past bright place distance image and the current dark place distance image based on the above comparison result. Here, if it is determined that there is no difference, that is, that the two distance images substantially match over the entire image area (S122, No), the process proceeds to step S123. On the other hand, if it is determined that there is a difference (S122, Yes), the process proceeds to step S124.
- In step S123, processing is performed to determine the bright place camera image as the visible image as it is.
- Specifically, the visible image determination device 63 determines the bright place camera image as the visible image. This corresponds to the determination example of the visible image described with reference to FIG. This completes the process of determining the visible image.
- In step S124, a process of analyzing the characteristics of the different regions and recognizing the cause of their occurrence is performed. Specifically, the difference factor determination device 66 analyzes the features of the different regions in both distance images. Then, based on the analysis result, it determines whether the cause of the different area lies on the dark place distance image side or on the bright place distance image side.
- For example, when discontinuous distance data appears in the area of the dark place distance image, it can be determined that a sudden change occurred on the dark place distance image side. Conversely, when discontinuous distance data appears in the area of the bright place distance image, it can be determined that the sudden change occurred on the bright place distance image side.
- If the object included in the different area is a person, a movable object, an artificial object, or the like, it can be considered that the difference may have been caused by its movement.
- In step S125, a process is performed to determine whether or not the cause of the difference lies in the dark place distance image.
- Specifically, the difference factor discriminating device 66 discriminates, based on the above analysis result, which of the dark place distance image and the bright place distance image contains the cause of the difference. If it is determined that the cause lies in the dark place distance image (S125, Yes), the process proceeds to step S126. On the other hand, if it is determined that the cause lies in the bright place distance image (S125, No), the process proceeds to step S127.
- In step S126, processing is performed to obtain a visible image by inserting a mask image or an AR object into the different area of the bright place camera image.
- Specifically, the visible image determination device 63 obtains a visible image by performing enhancement processing on the region corresponding to the different region in the specified bright place camera image. For example, a mask image or an AR object is inserted into the corresponding area of the bright place camera image. If an object included in the different area can be recognized, an AR object corresponding to that object is inserted. If the object cannot be recognized, or if no such AR object is prepared, enhancement processing such as superimposing a mask image on the corresponding area of the bright place camera image is performed to obtain a visible image. This completes the process of determining the visible image.
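- The insertion of a mask image or AR object in step S126 amounts to replacing, pixel by pixel, the different area of the base image with an overlay. A minimal sketch, assuming images of equal size represented as 2-D lists (`insert_overlay` is an illustrative name, not a device of the embodiment):

```python
def insert_overlay(base_image, overlay, region):
    """Return a copy of `base_image` where the pixels listed in `region`
    are replaced by the corresponding pixels of `overlay` (a mask image
    or a rendered AR object of the same size)."""
    out = [row[:] for row in base_image]  # copy each row of the base
    for (x, y) in region:
        out[y][x] = overlay[y][x]
    return out

bright_camera = [["sky", "sky"], ["road", "road"]]
mask          = [["MASK", "MASK"], ["MASK", "MASK"]]
visible = insert_overlay(bright_camera, mask, {(0, 1)})
```

- The same routine can serve for steps S128 and S133, where the overlay is the corresponding portion of the dark place distance image or of a further past bright place camera image instead of a mask.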
- FIG. 12 is a diagram for explaining a first example of processing for determining a visible image by the HMD according to the third embodiment.
- FIG. 12 shows an example in which a visible image is determined by superimposing a mask image or an AR object on a bright place camera image. This is the case where the difference between the distance images is caused by the current dark place distance image.
- Since analysis of the characteristics of the different area 87 shows that it is a person, an AR object 88B of a person is added to the corresponding area 87C of the bright place camera image 80B. According to this embodiment, the details of the difference can be conveyed to the user.
- In step S127, processing is performed to determine whether there is a bright place distance image that corresponds to the current dark place distance image and that was acquired at a point further back in the past.
- Specifically, the bright place distance image selection device 67 determines whether or not a bright place distance image that corresponds to the current dark place distance image and whose acquisition time is further in the past than that of the previously specified bright place distance image is saved. If it is determined that no such past bright place distance image exists (S127, No), the process proceeds to step S128. On the other hand, if it is determined that such a further past bright place distance image exists (S127, Yes), the process proceeds to step S129.
- In step S128, processing is performed to determine a visible image by inserting the image portion of the different area of the dark place distance image into the different area of the bright place camera image.
- Specifically, the visible image determination device 63 acquires the past bright place camera image, and the image obtained by inserting or pasting the image portion of the different area of the dark place distance image into its different area is determined as the visible image.
- FIG. 13A is a diagram for explaining a second example of processing for determining a visible image by the HMD according to the third embodiment.
- FIG. 13A shows an example in which a visible image is obtained by pasting a part of the dark place distance image onto the bright place camera image.
- In FIG. 13A, 81C is a stored bright place distance image, 80C is a stored bright place camera image, 83C is a dark place distance image, and 84C is the visible image to be visually recognized by the user.
- In this example, a region RA of the dark place distance image 83C is a different region, and a person is present in the region corresponding to that different region in the bright place distance image 81C.
- If the bright place camera image 80C is applied to the entire area as the visible image, the user will be misled into thinking that a person who is not actually there is present.
- Therefore, the image portion of the different area RA in the dark place distance image 83C is superimposed on the corresponding area of the bright place camera image 80C to obtain the visible image 84C.
- In step S129, a process of reading out a bright place distance image further back in the past is performed.
- Specifically, the bright place distance image selection device 67 reads a bright place distance image that corresponds to the current dark place distance image and goes further back in the past.
- In step S130, a process of comparing and analyzing the dark place distance image and the further past bright place distance image is performed. Specifically, the visible image determination device 63 compares the current dark place distance image with the read further past bright place distance image, and performs the analysis necessary to determine whether there is a difference between these two distance images.
- In step S131, it is determined whether there is a difference between the dark place distance image and the further past bright place distance image. Specifically, the visible image determination device 63 makes this determination based on the comparison and analysis result in step S130. If it is determined that there is a difference (S131, Yes), the process proceeds to step S128. On the other hand, if it is determined that there is no difference (S131, No), the process proceeds to step S132.
- In step S132, a process of reading the further past bright place camera image is performed. Specifically, the visible image determination device 63 reads out, from the bright place image storage device 59, the further past bright place camera image included in the same image set as the further past bright place distance image.
- In step S133, processing is performed to determine a visible image by inserting the image portion of the different area of the further past bright place camera image into the different area of the bright place camera image.
- Specifically, the visible image determination device 63 inserts or pastes the image portion of the different region of the further past bright place camera image into the specified different region of the bright place camera image, and the resulting image is determined as the visible image.
- FIG. 13B is a diagram for explaining a third example of processing for determining a visible image by the HMD according to the third embodiment.
- FIG. 13B shows an example in which a visible image is determined by pasting a part of a further past bright place camera image onto the bright place camera image.
- The situation is the same as in the example shown in FIG. 13A, but here the visible image is determined with reference to a further past bright place camera image 80D stored at a time earlier than the bright place camera image 80C. That is, an image portion corresponding to the region RB of the further past bright place camera image 80D is inserted into the bright place camera image 80C to obtain the visible image 84D.
- Here, the further past bright place camera image 80D is a camera image captured when no person was present.
- The image portion of the region RB, which corresponds to the region RA, in the further past bright place camera image 80D is superimposed on the corresponding region of the bright place camera image 80C to obtain the visible image 84D.
- Although the entire further past bright place camera image 80D may be used as the visible image as it is, a clearer image can be obtained by adopting the clearer of the stored camera images as the whole image and inserting another past bright place camera image only into the partially different areas.
- Alternatively, the user may select a bright place image from a desired time period and use it as the whole area image. For example, an evening image may be used as the whole area image.
- In the above description, a person is used as an example of an object in a different area, but the object is not limited to a person.
- Living things such as animals, mobile objects such as automobiles, bicycles, motorcycles, trolleys, and carts, newly installed equipment, placed members, and the like can be considered.
- Objects that do not move include buildings and road signs.
- Machine learning techniques may be used to discriminate the objects in different areas.
- A CG image, a sign, or text data may be displayed as an alternative.
- According to the third embodiment, it is possible to determine a visible image in accordance with how the different area arose, realizing high performance in generating the visible image for the dark place guide function.
- Embodiment 4 is an HMD that includes processes such as parallel movement, rotational movement, and scaling (enlargement/reduction) of distance images in the process of comparison between distance images and determination of match. Note that the hardware configuration of the HMD according to the fourth embodiment is the same as that of the HMD according to the first embodiment, so description thereof will be omitted.
- the processing flow of the dark place guide program according to the fourth embodiment is basically the same as that of the first embodiment, but differs only in the processing of comparison between distance images and determination of match. Therefore, only the processing flow for comparison between distance images and match determination will be described here, and description of other flows will be omitted.
- FIG. 14 is a flow diagram of the process of comparing distance images and determining a match according to the fourth embodiment. This flow corresponds to steps S9 to S11 and S13, surrounded by dashed lines, in the overall processing flow of FIG. 4.
- FIGS. 15A to 15C are diagrams for explaining an example of processing used for comparison between distance images and match determination according to the fourth embodiment.
- In step S21, a process of extracting features of the dark place distance image is performed. Specifically, the bright place image search device 62 extracts the features of the recognized dark place distance image.
- In step S22, a process of acquiring a bright place distance image is performed. Specifically, the bright place image search device 62 reads out one of the narrowed-down bright place distance images.
- In step S23, a process of extracting features of the bright place distance image is performed. Specifically, the bright place image search device 62 extracts the features of the read bright place distance image.
- In step S24, a process of comparing the extracted features and calculating the movement amount and scaling factor of the distance image is performed.
- Specifically, the bright place image search device 62 compares the features of the dark place distance image with those of the bright place distance image, focuses on identical features, and calculates the movement amount and the scaling factor of the distance image.
- In step S25, a process of moving and scaling the dark place distance image is performed. Specifically, the bright place image search device 62 moves and scales the dark place distance image using the calculated movement amount and scaling factor.
- In steps S24 and S25, when the distance from the shooting position changes due to scaling, the distance images can be compared more easily by changing the color indicating the perspective of the distance image according to the scaling magnification.
- In step S26, a process of comparing the distance images is performed. Specifically, the bright place image search device 62 compares the moved and scaled dark place distance image with the bright place distance image.
- In step S28, a process of determining whether the distance images match is performed. Specifically, the bright place image search device 62 determines whether or not the dark place distance image and the bright place distance image match based on the above comparison result. If it is determined that they match (S28, Yes), the process proceeds to step S29. On the other hand, if it is determined that they do not match (S28, No), the process proceeds to step S30.
- In step S29, processing is performed to determine whether the matching area is sufficient. Specifically, the bright place image search device 62 determines whether or not the proportion of the entire area over which the recognized dark place distance image and the read bright place distance image match exceeds a specified value. If it is determined to exceed the specified value (S29, Yes), the process proceeds to step S12. On the other hand, if it is determined to be equal to or less than the specified value (S29, No), the process proceeds to step S30.
- In step S30, a process is performed to determine whether comparisons with all of the narrowed-down bright place distance images have been completed. Specifically, the bright place image search device 62 determines whether or not any bright place distance image remains to be compared. If it is determined that all comparisons have been completed, that is, that there is no bright place distance image left to compare (S30, Yes), the process proceeds to step S14. On the other hand, if it is determined that comparisons remain (S30, No), the process returns to step S22, and the next bright place distance image to be compared is read.
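- Steps S24 to S29 can be sketched as follows, restricted for brevity to integer translation (no rotation or scaling) and a fixed relative depth tolerance; the names `shift` and `match_coverage` and the 90% coverage threshold are assumptions for illustration only.

```python
def shift(img, dx, dy):
    """Translate a distance image by (dx, dy); uncovered cells are None."""
    h, w = len(img), len(img[0])
    out = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sx, sy = x - dx, y - dy
            if 0 <= sx < w and 0 <= sy < h:
                out[y][x] = img[sy][sx]
    return out

def match_coverage(a, b, tol=0.05):
    """Fraction of mutually valid pixels whose depths agree within the
    relative tolerance `tol` (used for the S29 sufficiency test)."""
    matched = total = 0
    for row_a, row_b in zip(a, b):
        for va, vb in zip(row_a, row_b):
            if va is None or vb is None:
                continue
            total += 1
            if abs(va - vb) <= tol * max(abs(va), abs(vb)):
                matched += 1
    return matched / total if total else 0.0

dark   = [[1.0, 2.0, 3.0],
          [1.0, 2.0, 3.0],
          [1.0, 2.0, 3.0]]
bright = [[9.0, 1.0, 2.0],   # same scene, offset by one pixel
          [9.0, 1.0, 2.0],
          [9.0, 1.0, 2.0]]
aligned = shift(dark, 1, 0)          # S25: move the dark place distance image
coverage = match_coverage(aligned, bright)
sufficient = coverage > 0.9          # S29: is the matching area large enough?
```

- In this toy data the one-pixel shift makes every overlapping depth agree, so the coverage is 1.0 and the S29 test passes.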
- FIG. 15A is a diagram for explaining translation processing in the fourth embodiment.
- FIG. 15B is a diagram for explaining scaling processing according to the fourth embodiment.
- FIG. 15C is a diagram for explaining combination processing using a plurality of distance images according to the fourth embodiment.
- The thick frame indicated by 81F is the dark place distance image, and the thin frame indicated by 83F is the bright place distance image.
- After translation or scaling, any blank area may be filled with, for example, the dark place distance image or another past camera image.
- FIG. 15C shows a case of comparing a plurality of bright place distance images 83J, 83K, and 83L with the dark place distance image 81F.
- Although this example is not described in the above flow, such processing may be incorporated.
- When the dark place distance image 81F partially matches each of the bright place distance images 83J, 83K, and 83L, the bright place camera images corresponding to the matching regions are cut out and combined to generate a visible image.
- At this time, the bright place distance images 83J, 83K, and 83L may be superimposed so that the clearer image is prioritized, that is, the clearer image is positioned higher (in front).
- Alternatively, the images may be superimposed so that the image with the newer acquisition date and time is prioritized, that is, positioned higher.
- In this example, the sharpness decreases in the order bright place distance image 83J, bright place distance image 83K, bright place distance image 83L, and the images are arranged so as to overlap in this order.
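- The layered combination of FIG. 15C can be sketched as follows: each matching region contributes a cut-out of its bright place camera image, and the layers are painted from lowest to highest priority so that the sharpest (or newest) image ends up in front. The function name and the toy 2x2 images are assumptions for illustration.

```python
def composite_by_priority(layers, width, height):
    """Combine image layers; `layers` is ordered highest priority first
    (e.g. sharpest, per FIG. 15C). Lower-priority layers are painted
    first so that higher-priority pixels end up on top (in front)."""
    out = [[None] * width for _ in range(height)]
    for region, image in reversed(layers):
        for (x, y) in region:
            out[y][x] = image[y][x]
    return out

# Three partial matches; the sharpest layer "J" overlaps "K" at (1, 0).
img_j = [["J", "J"], ["J", "J"]]
img_k = [["K", "K"], ["K", "K"]]
img_l = [["L", "L"], ["L", "L"]]
layers = [({(0, 0), (1, 0)}, img_j),
          ({(1, 0), (0, 1)}, img_k),
          ({(1, 1)},         img_l)]
visible = composite_by_priority(layers, 2, 2)
```

- Ordering `layers` by acquisition date instead of sharpness yields the newest-first variant described above with no other change.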
- The object to be subjected to processing such as parallel movement, rotational movement, and scaling may be the dark place distance image. When a wide-angle image such as one from a 360-degree camera is used at the time of shooting, the image in the front direction of the user's HMD can be cut out at a predetermined angle of view based on the position and direction of the HMD and saved. Furthermore, when recording a plurality of captured images, the directions of the captured images at the same position may be matched in advance, and images having substantially the same position and direction may be recorded in association with each other.
- According to the fourth embodiment, it is possible to easily find a bright place image that substantially matches the dark place distance image.
- Embodiment 5 is an HMD that performs processing suitable for a case where the user is in a dark place but a partial area in front of the user is illuminated by a light source.
- FIG. 16 is an external view of the HMD according to Embodiment 5. The same numbers are assigned to the same components as those of the HMD 1 shown in FIG. 1, and overlapping descriptions are omitted.
- the HMD 1d shown in FIG. 16 has a configuration in which a headlight 22 as an auxiliary light source is added to the HMD 1 shown in FIG.
- FIG. 17 is a functional block diagram of the HMD 1d according to the fifth embodiment.
- the same blocks as those in the functional block diagram shown in FIG. 3 are assigned the same numbers, and overlapping descriptions are omitted.
- the functional block diagram shown in FIG. 17 has a configuration in which an irradiation area detection device (detection device) 68 is added to the functional block diagram shown in FIG.
- The irradiation area detection device 68 detects, in the dark place camera image obtained by the camera, a spot irradiation area, which is a partial area brightened by illumination from the light source.
- The visible image determination device 63 pastes the image portion of the spot irradiation area of the dark place camera image onto the area corresponding to the detected spot irradiation area in the bright place camera image or dark place distance image determined as the basis of the visible image. This determines the visible image.
- the irradiation area detection device 68 is realized by the processor 36 executing the dark place guide program 42 using the memory 37 and the like.
- Embodiment 5 can be similarly applied even in these cases.
- FIG. 18 is a flowchart of visible image generation/editing processing according to the fifth embodiment. This processing flow corresponds to step S15 in the overall processing flow according to the first embodiment shown in FIG. 4, and is a modification of the first embodiment.
- In step S31, processing is performed to determine whether there is a spot irradiation area. Specifically, the visible image determination device 63 determines whether or not there is a spot irradiation area in the obtained dark place camera image. If it is determined that there is a spot irradiation area (S31, Yes), the process proceeds to step S32. On the other hand, if it is determined that there is no spot irradiation area (S31, No), the process proceeds to step S34.
- In step S32, a process of cutting out the image portion of the spot irradiation area is performed. Specifically, the visible image determination device 63 cuts out, from the dark place camera image, the image portion of the spot irradiation area illuminated by a light source such as the headlight.
- In step S33, a process of synthesizing the clipped image portion is performed.
- Specifically, the visible image determination device 63 determines the visible image by pasting the clipped image portion onto the area corresponding to the spot irradiation area in the image set as the basis of the visible image. At this time, the pasting position and size of the clipped image portion are adjusted so that the base image and the clipped image portion connect smoothly and look natural. After that, the visible image generation/editing process ends, and the process proceeds to step S16 shown in FIG. 4.
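- The processing of steps S31 to S33 can be sketched as follows, with luminance modeled as normalized values and the detection reduced to simple thresholding (the 0.5 threshold and the function names are assumptions; the actual irradiation area detection device 68 is not limited to this method):

```python
def detect_spot_region(dark_luminance, threshold=0.5):
    """S31: treat pixels brighter than `threshold` (normalized luminance)
    as the area illuminated by the headlight."""
    return {(x, y)
            for y, row in enumerate(dark_luminance)
            for x, v in enumerate(row)
            if v > threshold}

def paste_spot(base_image, dark_image, spot):
    """S32/S33: cut the spot area out of the dark place camera image and
    paste it onto the image chosen as the basis of the visible image."""
    out = [row[:] for row in base_image]
    for (x, y) in spot:
        out[y][x] = dark_image[y][x]
    return out

dark_lum = [[0.1, 0.9], [0.1, 0.1]]        # headlight hits pixel (1, 0)
base     = [["b00", "b10"], ["b01", "b11"]]
dark_rgb = [["d00", "d10"], ["d01", "d11"]]
spot = detect_spot_region(dark_lum)
visible = paste_spot(base, dark_rgb, spot)
```

- A real implementation would additionally smooth the seam between the pasted portion and the base image, as described for step S33.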
- step S34 the visible image is determined by the method according to the first to fourth embodiments. After that, the visible image generation/editing process ends, and the process proceeds to step S16 shown in FIG.
- FIGS. 19A and 19B are diagrams showing examples of a camera image and a visible image obtained in Embodiment 5.
- In FIG. 19, 82G indicates a dark place camera image, and 84G indicates a visible image.
- In the dark place camera image 82G, the spot irradiation area 89 illuminated by the headlight 22 shows the situation ahead.
- The visible image 84G is an image obtained by pasting the image portion of the spot irradiation area 89 in the dark place camera image onto the area corresponding to the spot irradiation area 89 in the bright place camera image.
- Further, by directing the headlight 22 toward the real space area corresponding to the mask display area, the user can directly confirm, as a camera image, what is present in the different area.
- In this example, a single spot area is pasted and displayed, but if the user changes the spot irradiation direction to move the irradiation area, an image of a wider irradiation area can be obtained. An image of such a wide spot irradiation area may also be pasted.
- According to the fifth embodiment, when there is a difference in front of the user in a dark place that is not captured in the past bright place camera image, the difference can be confirmed in the visible image by illuminating the difference area with the light source. This is particularly effective when the display surface of the visible image does not transmit light, as will be described later.
- Embodiment 6 is an HMD that can be operated by a user's gesture.
- FIGS. 20A and 20B are diagrams showing a first example of the appearance of the HMD according to Embodiment 6.
- The HMD 1e shown in FIGS. 20A and 20B includes a ranging sensor 23 suitable for short-range ranging (hereinafter, the short-range ranging sensor 23) and a ranging sensor 24 suitable for middle- or long-range ranging (hereinafter, the medium-to-long-range ranging sensor 24).
- These ranging sensors are connected to a controller 17, and the controller 17 acquires distance data obtained by these ranging sensors.
- the medium-to-long distance ranging sensor 24 is for obtaining distance data for the real space in front of the user's line of sight.
- the short-range distance measuring sensor 23 is for detecting gestures made by the user's fingers.
- A stored data area in the controller 17 holds a table in which distance data corresponding to gesture types and operation types are associated with each other.
- The controller 17 refers to the table based on the distance data obtained by the short-range distance measuring sensor 23, recognizes the type of the user's gesture, and accepts the operation corresponding to the gesture.
- As the operations, for example, turning the dark place guide program on and off, switching the displayed visible image, and the like are conceivable.
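- The table lookup described above can be sketched as follows; the gesture names and the operations assigned to them are purely illustrative assumptions, not those of the embodiment:

```python
# Table associating recognized gesture types with operation types,
# as held in the controller 17's stored data area (entries invented
# for this sketch).
GESTURE_TABLE = {
    "swipe_left":  "guide_program_off",
    "swipe_right": "guide_program_on",
    "tap":         "switch_visible_image",
}

def handle_gesture(gesture):
    """Look up the operation for a recognized gesture; unknown gestures
    are ignored (None)."""
    return GESTURE_TABLE.get(gesture)
```

- The gesture string itself would come from classifying the distance data of the short-range ranging sensor 23, as described for the second example below.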
- the medium-to-long-range distance measuring sensor 24 is installed at the end in the width direction of the upper frame of the lens (display) of the HMD 1e.
- the short-range distance measuring sensor 23 is installed in the center of the upper frame.
- the distance measurement center axis direction 24c of the medium-to-long distance distance measurement sensor 24 and the distance measurement center axis direction 23c of the distance measurement sensor 23 for short distance are substantially the same.
- the user puts out his or her finger in front of the user to perform a gesture operation.
- the short distance ranging sensor 23 obtains distance data corresponding to the user's gesture operation.
- Further, the short-range distance measuring sensor 23 has its distance measurement central axis direction 23c set at a predetermined angle (for example, 30°). As a result, the user can perform a gesture operation with the fingers while confirming it in a natural line-of-sight direction, enabling a more natural operation with less stress.
- the arrangement of the short-range distance measuring sensor 23 and the medium-to-long-range distance measuring sensor 24 is not limited to the above example, and can be conceived in various ways according to the design concept, specifications, or the like.
- FIGS. 21A and 21B are diagrams showing a second example of the appearance of the HMD according to Embodiment 6.
- FIG. 21C is a diagram showing a gesture operation area by the user's fingers.
- FIGS. 21A and 21B show a modification in which the placement of the short-range distance measuring sensor is changed from the first example.
- the HMD 1f is an example in which the short-range distance measuring sensor 23 is arranged in a portion of the frame housing 20 shaped like a temple of eyeglasses.
- The distance measurement center axis direction 24c of the medium-to-long-range distance measuring sensor 24 is the same as in the previous example, but the distance measurement center axis direction 23c of the short-range distance measuring sensor 23 extends in a direction toward the side of the user's head. In this case, the user performs a gesture operation with the fingers at the side of the head.
- a gesture operation space area is defined by an xyz coordinate system, and the gesture operation space area is subdivided into a plurality of partial areas.
- The short-range ranging sensor 23 obtains ranging values in each subdivided partial area, and the controller 17 detects the gesture operation based on the distribution of the ranging values or their change over time.
- As examples of gesture operations, moving a finger in the x direction may be assigned to an instruction to move the display screen back and forth, moving it in the y direction to an instruction to move the screen up and down, and moving it in the z direction to scaling of the display screen.
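- The assignment of axes to operations can be sketched as follows: a fingertip track measured by the short-range ranging sensor is classified by the axis with the largest net displacement (the function name, the labels, and the sample coordinates are assumptions for illustration):

```python
def classify_gesture(samples):
    """Classify a fingertip track (list of (x, y, z) positions over time)
    by its dominant axis of motion: x -> back/forth, y -> up/down,
    z -> scaling, following the assignment described in the text."""
    if len(samples) < 2:
        return None  # not enough data to detect a motion
    start, end = samples[0], samples[-1]
    net = [abs(e - s) for s, e in zip(start, end)]
    labels = ["move_back_forth", "move_up_down", "scale"]
    return labels[net.index(max(net))]

# A finger moving mostly along z is read as a scaling instruction.
track = [(0.0, 0.0, 0.10), (0.01, 0.0, 0.15), (0.02, 0.01, 0.25)]
```

- A practical detector would also debounce noise and consider the whole distribution of ranging values, as the text notes, rather than only the endpoints.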
- According to the second example, the gesture operation with the user's fingers can be performed near the side of the head, so the gesture does not obstruct the field of view.
- According to the sixth embodiment, the dark place guide program can be turned on and off, the displayed image can be switched, and so on by the user's gesture operations, realizing an image display device that is easier to use.
- In particular, the user can operate the image display device without handling a button-type or touch-type operation unit that requires relatively delicate operations.
- Embodiment 7 is an example of an HMD configured to display a user interface screen.
- The controller 17 of the HMD controls the display device to display a user interface screen on the display surface, and accepts predetermined operations such as the gesture operations shown in the sixth embodiment.
- FIG. 22 is a diagram showing an example of a user interface screen on the HMD.
- When the controller 17 of the HMD detects, via the illuminance sensor, that the user's field of vision has become dark, it controls the display device to display a screen display switching selection screen 90.
- a selection screen as shown in FIG. 22 is displayed so that the user can select whether to switch the display screen to the bright image mode.
- In addition, a selection screen for choosing among several screen display methods is displayed. For example, a selection screen as shown in the lower part of FIG. 22 is displayed so that the user can select an image to be displayed from among "daytime image", "partial image interpolation", and "arranged image".
- Here, the "daytime image" is a bright image obtained by imaging in a bright place, and "partial image interpolation" is an image in which, when a part of the field of view is dark, only that dark part is interpolated with an image captured in a bright place.
- As the "arranged image", the displayed image may be processed by image processing. For example, an image converted to black and white or sepia may be displayed, or an image rendered in rich colors such as those used in CG or animation may be displayed.
- According to the seventh embodiment, it is possible to switch images according to the user's intentions and preferences.
- Embodiments 1 to 7 are examples of using, as the image display device, an HMD that includes an image display surface having optical transparency, through which light from in front of the user passes.
- In contrast, Embodiment 8 is an example of using, as the image display device, an HMD that has an image display surface that does not allow light from in front of the user to pass through.
- FIG. 23 is an external view of the HMD according to Embodiment 8, used as the image display device of Embodiments 1 to 7. It shows an example of a goggle-type HMD having a display surface that is non-transmissive to light from the front of the user, used in a video see-through mode in which images captured by the camera are displayed on the left and right displays.
- the HMD 1z shown in FIG. 23 includes a housing 20d molded to cover the field of view of the user, and fixtures 20e for fixing the housing 20d so as to cover the field of view of the user.
- a left-eye display 25 and a right-eye display 26 are provided on the surface of the housing 20d that covers the user's eyes.
- The housing 20d also includes a right camera 11a, a left camera 11b, a distance sensor 12, a controller 17, a microphone 18, a speaker 19, a battery 21, a headlight 22, and a short-range distance measuring sensor 23.
- Although the illuminance sensor is built into the controller 17, it may instead be provided on the top of the housing 20d.
- the left-eye display 25 and right-eye display 26 are configured by, for example, a liquid crystal panel, an organic EL (Electro Luminescence) panel, or a plasma display.
- these left-eye and right-eye displays 25 and 26 may each be configured by an image projection device using a projector.
- The user wears the HMD 1z on the head and sees the image displayed on the left-eye display with the left eye and the image displayed on the right-eye display with the right eye, so that the front camera image, the distance image, the visible image, and the like can be viewed stereoscopically.
- According to the eighth embodiment, a more robust face-fitting HMD can be used as the image display device, and the user's eyes can be protected from dangerous obstacles, gases, or liquids that may exist in the field.
- the degree of freedom in designing the image display device increases.
- Embodiments 1 to 8 are examples using an HMD as an image display device.
- the ninth embodiment is an example using a portable information terminal as an image display device.
- As the portable information terminal, for example, a smartphone, a tablet terminal device, or a notebook computer can be considered.
- Among portable information terminals, many models are equipped with a camera and a ranging sensor on the side (back side) opposite to the side where the display screen is installed.
- the ranging sensor may be retrofitted.
- Such a portable information terminal can operate in the same manner as the HMD described above.
- FIG. 24A is a diagram showing an example of the angle of view of the camera of the portable information terminal and the measurement range of the ranging sensor.
- FIG. 24B is a diagram showing an example of a state in which a dark place distance image is displayed on the portable information terminal.
- FIG. 24C is a diagram showing an example of a state in which a bright place camera image is displayed on the portable information terminal.
- the portable information terminal 1s is configured such that, for example, the angle of view of the camera and the measurement range of the ranging sensor are defined on the xy coordinate axes indicated by the dashed lines in FIG. 24A. Further, the portable information terminal 1s can display a dark place distance image on the display screen as shown in FIG. 24B, or display the bright place camera image corresponding to the dark place distance image as shown in FIG. 24C.
- in this way, a highly versatile and widely available portable information terminal can be used to perform the same operations as the HMD in the first to eighth embodiments. As a result, the development cost of the image display device, and the cost of the device itself, can be reduced.
- in addition, the user base can be expanded to those who already own a portable information terminal, and image sets can be accumulated efficiently at any location.
- the user can also obtain a visual image, such as a bright place camera image, at any place.
- the present invention is not limited to the above-described embodiments, and includes various modifications.
- the same effect can be obtained by replacing the generation of the distance image by the distance sensor with the generation of the infrared camera image by the infrared camera.
- the above-described embodiments are described in detail for easy understanding of the present invention, and are not necessarily limited to those having all the described configurations.
- part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
- numerical values, messages, etc. contained in texts and drawings are only examples, and the effects of the present invention are not impaired even if different ones are used.
- each of the above configurations, functions, processing units, processing means, and the like may be realized by hardware, for example, by designing a part or all of them using an integrated circuit.
- each of the above configurations, functions, and the like may be realized by software by having a processor such as an MPU or CPU interpret and execute a program for realizing each function.
- the scope of functions implemented by software is not limited, and hardware and software may be used together.
- Information such as programs, tables, and files that implement each function can be stored in a recording device such as a memory, a hard disk, an SSD (Solid State Drive), or a recording medium such as an IC card, SD card, or DVD.
- Appendix 1: An image display device that displays a visual image to be viewed by a user, comprising: a camera that images the area in front of the user to obtain a camera image; a ranging sensor that obtains data representing the distance from the user to each position on a real object included in the field of view of the camera; an illuminance sensor that obtains data representing the brightness of the location where the user is; a generation device that generates, based on the data obtained by the ranging sensor, a distance image that corresponds to the field of view and in which each pixel represents the distance to each position; a determination device that determines, based on the data obtained by the illuminance sensor, whether the location of the user is a bright place or a dark place; a storage device that, when the determination device determines a bright place, stores the camera image obtained by the camera and the distance image obtained by the generation device as an image set including a bright place camera image and a bright place distance image; a recognition device that, when the determination device determines a dark place, recognizes the distance image obtained by the generation device as a dark place distance image; a search device that identifies the bright place distance image corresponding to the recognized dark place distance image by comparing the stored bright place distance images with it; a decision device that decides the visual image based on the bright place camera image included in the same image set as the identified bright place distance image; and a display device that displays the visual image.
- Appendix 2: A program for causing a computer to function as the generation device, determination device, storage device, recognition device, search device, and decision device in the image display device according to Appendix 1.
- Appendix 3: A computer-readable recording medium on which is recorded a program for causing a computer to function as the generation device, determination device, storage device, recognition device, search device, and decision device in the image display device according to Appendix 1.
- Appendix 4: In the image display device according to Appendix 1, the process of comparing the dark place distance image and the bright place distance image applies at least one of parallel translation, rotation, and scaling to at least one of the dark place distance image and the bright place distance image.
- Appendix 5: In the image display device according to Appendix 1, a smartphone, a tablet terminal device, or a notebook computer is used as the image display device.
- Appendix 6: In the image display device according to Appendix 1, the camera also serves as the illuminance sensor.
- Appendix 7: In the image display device according to Appendix 1, the search device applies at least one of parallel translation, rotation, and scaling to at least one of the recognized dark place distance image and the identified bright place distance image, and compares them.
- Appendix 8: In the image display device according to Appendix 1, the display device includes a screen onto which the visual image is projected.
- Appendix 9: In the image display device according to Appendix 1, the display device includes a display that shows the visual image.
- Appendix 10: In the image display device according to Appendix 1, the search device identifies, as the bright place distance image corresponding to the dark place distance image, a bright place distance image whose degree of similarity or degree of match with the dark place distance image is equal to or greater than a threshold.
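The threshold test of Appendix 10 can be sketched as follows. This is a minimal illustration under assumptions of my own: the patent does not fix a similarity measure, so the metric here (fraction of pixels whose distances agree within a tolerance) and the function names are purely illustrative.

```python
# Sketch of Appendix 10: select a stored bright place distance image whose
# degree of match with the dark place distance image meets a threshold.
# Distance images are simplified to flat lists of per-pixel distances; the
# match metric and tolerance are illustrative assumptions.

def degree_of_match(dark, bright, tol=0.1):
    """Fraction of pixel positions whose distances agree within `tol`."""
    hits = sum(1 for d, b in zip(dark, bright) if abs(d - b) <= tol)
    return hits / len(dark)

def find_matching_bright_image(dark, bright_images, threshold=0.9):
    """Return the index of the first stored bright place distance image whose
    degree of match with `dark` is >= threshold, or None if no image passes."""
    for i, bright in enumerate(bright_images):
        if degree_of_match(dark, bright) >= threshold:
            return i
    return None
```

A real implementation would operate on 2-D depth maps and likely combine this with the alignment search of Appendix 7, but the accept/reject structure is the same.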
- Appendix 11: In the image display device according to Appendix 1, the search device identifies, as the bright place distance image corresponding to the dark place distance image, a bright place distance image recognized by artificial intelligence as matching, approximating, or resembling the dark place distance image.
- Appendix 12: The image display device according to Appendix 1 further comprises a time information acquisition device that acquires time information including at least a date, and the storage device executes: a process of extracting, from the stored bright place distance images, a bright place distance image corresponding to the bright place distance image to be stored; a process of, when saving the image set, saving it in association with the time information acquired by the time information acquisition device; and a process of overwriting the image set containing the bright place distance image extracted by the extraction process, or a process of erasing the image set containing the extracted bright place distance image when the time represented by the time information associated with the extracted bright place distance image is earlier, by a first time or more, than the time represented by the time information associated with the bright place distance image to be stored.
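The overwrite/erase bookkeeping of Appendix 12 can be sketched as follows. The record layout, the `matches` predicate, and the one-week choice of "first time" are illustrative assumptions, not values taken from the patent.

```python
# Sketch of Appendix 12's storage management: when a newly captured bright
# place image set corresponds to one already stored, either overwrite the old
# set, or erase it if it is older than the new one by at least `first_time`.
# The dict layout and the default first_time are illustrative assumptions.
from datetime import datetime, timedelta

def store_image_set(stored, new_set, matches, first_time=timedelta(days=7)):
    """stored: list of dicts {camera, distance, time}; new_set: same shape.
    `matches(a, b)` decides whether two bright place distance images correspond."""
    for i, old in enumerate(stored):
        if matches(old["distance"], new_set["distance"]):
            if new_set["time"] - old["time"] >= first_time:
                del stored[i]            # stale corresponding set: erase it
                stored.append(new_set)
            else:
                stored[i] = new_set      # recent corresponding set: overwrite
            return stored
    stored.append(new_set)               # no corresponding set: simply save
    return stored
```

Either branch leaves exactly one up-to-date image set per scene, which is the point of the appendix: accumulated sets stay current without unbounded growth.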
- Appendix 13: In the image display device according to Appendix 12, the extraction process extracts, as the bright place distance image corresponding to the bright place distance image to be stored, a bright place distance image whose degree of similarity or degree of match with the bright place distance image to be stored is equal to or greater than a threshold.
- Appendix 14: In the image display device according to Appendix 12, the extraction process extracts, as the bright place distance image corresponding to the bright place distance image to be stored, a bright place distance image recognized by artificial intelligence as matching, approximating, or resembling the bright place distance image to be stored.
Time information acquisition device, 53... Azimuth information acquisition device, 54... Posture information acquisition device, 55... Camera image acquisition device, 56... Distance image generation device, 57... Bright place/dark place determination device, 58... Bright place image storage management device, 59... Bright place image storage device, 60... Dark place distance image recognition device, 61... Comparison image narrowing device, 62... Bright place image search device, 63... Visual image decision device, 64... Visual image display device, 65... Difference area detection device, 66... Difference factor determination device, 67... Bright place distance image selection device, 68... Irradiation area detection device, 70... User, 72... Access point, 73... Network, 74... Image storage service server, 100, 101... HMD system.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
Description
An HMD according to Embodiment 1 of the present invention will be described.
In a bright place, the HMD of Embodiment 1 obtains a bright place camera image from the camera and a corresponding bright place distance image based on the output of the ranging sensor, and saves and accumulates these images as a set. In a dark place, on the other hand, it obtains a dark place distance image based on the output of the ranging sensor, and searches for and identifies the bright place distance image that corresponds to (substantially matches) this dark place distance image. It then decides and displays, based on the bright place camera image included in the same set as that bright place distance image, the visual image to be viewed by the user. In this way, in a dark place, the user can be shown an image of the same location obtained in the past under bright conditions, which improves the user's visibility.
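The save-in-bright/search-in-dark loop just described can be sketched as follows. This is a minimal illustration and not the patent's implementation: the class name, the lux threshold, and the mean-absolute-difference similarity metric are all assumptions introduced for the sketch.

```python
# Minimal sketch of the Embodiment 1 workflow: accumulate image sets in
# bright places, and in dark places look up the best-matching bright place
# distance image and show its paired camera image.
# All names, thresholds, and the similarity metric are illustrative.

LUX_THRESHOLD = 50.0  # assumed brightness cutoff for "bright place", in lux

def similarity(a, b):
    """Toy similarity between two distance images (flat lists of distances):
    1 / (1 + mean absolute difference)."""
    mad = sum(abs(x - y) for x, y in zip(a, b)) / len(a)
    return 1.0 / (1.0 + mad)

class DarkPlaceGuide:
    def __init__(self, match_threshold=0.5):
        self.image_sets = []  # accumulated (bright_camera_image, bright_distance_image)
        self.match_threshold = match_threshold

    def step(self, lux, camera_image, distance_image):
        if lux >= LUX_THRESHOLD:
            # Bright place: save the camera image and distance image as a set.
            self.image_sets.append((camera_image, distance_image))
            return camera_image  # in a bright place, show the camera image itself
        # Dark place: find the stored bright place distance image that best
        # matches the current dark place distance image.
        best = max(self.image_sets,
                   key=lambda s: similarity(s[1], distance_image),
                   default=None)
        if best and similarity(best[1], distance_image) >= self.match_threshold:
            return best[0]  # show the bright place camera image of the same set
        return None  # no sufficiently similar bright place set accumulated yet
```

The threshold-based accept step corresponds to the match determination discussed later (Appendix 10); the later embodiments refine the comparison and the composition of the visual image.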
FIG. 1 is an external view of the HMD according to Embodiment 1. As shown in FIG. 1, the HMD 1 according to this embodiment includes a camera 11, a ranging sensor 12, a right-eye projector 13, a left-eye projector 14, an image screen (image display surface) 15, a nose pad 16, a controller 17, a microphone 18, a speaker 19, and frame housings 20a to 20c. The user wears the HMD 1 on his or her face with the frame housings 20a and 20b and the nose pad 16.
FIG. 3 is a functional block diagram showing the functional configuration of the HMD according to Embodiment 1. Each functional block shown in FIG. 3 is realized by the processor 36 loading the dark place guide program into the memory 37 and executing it, in cooperation with the various sensors and devices of the HMD 1.
The dark place guide program processing will now be described.
Here, the flow of the image set storage and management processing in step S6 above will be described in detail.
FIG. 6 is a diagram showing examples of images obtained by the HMD according to Embodiment 1. FIG. 6 shows examples of camera images and distance images obtained in a bright place and in a dark place, and of a visual image. In FIG. 6, the upper row shows images previously recorded in a bright place, and the middle row shows current images in a dark place. The left column shows camera images and the right column shows distance images. A distance image is generally rendered in different colors according to distance, but is shown here in pseudo grayscale.
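Rendering a distance image in pseudo grayscale, as in the depiction of FIG. 6, amounts to mapping each pixel's distance onto an intensity range. A minimal sketch follows; mapping near distances to bright pixels is an arbitrary convention chosen for illustration, not something the patent specifies.

```python
# Sketch of rendering a distance image as 8-bit grayscale, as in the
# pseudo-grayscale depiction of FIG. 6. Distance images are simplified to
# flat lists; near = bright, far = dark is an illustrative convention.

def to_grayscale(distances):
    """Map a flat list of distances to 0-255 intensities (near=255, far=0)."""
    lo, hi = min(distances), max(distances)
    if hi == lo:
        return [255] * len(distances)  # flat scene: uniform intensity
    return [round(255 * (hi - d) / (hi - lo)) for d in distances]
```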
Embodiment 2 of the present invention will be described.
Embodiment 2 is an HMD system in which an HMD and a server are connected via a network. In this embodiment, the HMD and the server cooperate to provide the user with an image storage service that includes the dark place guide function.
The configuration of the HMD system according to Embodiment 2 will be described.
FIG. 8 is a block diagram showing the hardware configuration of the image storage service server. As shown in FIG. 8, the image storage service server 74 includes an internal bus 740, a network I/F (interface) 741, a processor 742, a memory 743, and storage 744. The storage 744 stores a basic operation program 745 and provides an image data storage area 746. The storage 744 is composed of, for example, a hard disk or a semiconductor memory.
The user 70 wears the HMD 1 on the head and looks forward. The HMD 1 is connected to a network 73 via communication signals 71a and 71b and an access point 72, and an image storage service server 74 is connected to the network 73.
Here, an example of the data structure of an image set is shown.
Embodiment 3 of the present invention will be described.
Embodiment 3 is an HMD that detects a difference region between a dark place distance image and the corresponding bright place distance image, and decides the visual image according to the detection result. That is, when there is a difference region between the current dark place distance image and the corresponding bright place distance image, the HMD according to this embodiment determines which of the two distance images caused the difference, executes processing according to the result, and decides a more appropriate visual image.
FIG. 10 is a functional block diagram showing the functional configuration of the HMD according to Embodiment 3.
The processing flow of the dark place guide program in Embodiment 3 is basically the same as in Embodiment 1 and differs only in the visual image decision processing. Therefore, only the flow of that visual image decision processing is described here, and descriptions of the other flows are omitted.
Embodiment 4 of the present invention will be described.
Embodiment 4 is an HMD in which the comparison and match determination between distance images includes processing such as parallel translation, rotation, and scaling of the distance images. Since the hardware configuration of the HMD according to Embodiment 4 is the same as that of Embodiment 1, its description is omitted.
The processing flow of the dark place guide program in Embodiment 4 is basically the same as in Embodiment 1 and differs only in the comparison and match determination between distance images. Therefore, only the flow of that comparison and match determination is described here, and descriptions of the other flows are omitted.
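The aligned comparison of Embodiment 4 can be sketched, for the translation case only, as a search over pixel shifts; a full implementation would also search rotations and scale factors. The function names, the search range, and the error metric are illustrative assumptions.

```python
# Sketch of Embodiment 4's match determination: compare a dark place distance
# image against a bright place distance image while searching over parallel
# translations (rotation and scaling would be searched analogously).
# Images are simplified to 1-D lists of per-pixel distances.

def shift_error(dark, bright, dx):
    """Mean absolute distance error after shifting `dark` by dx pixels."""
    pairs = [(dark[i], bright[i + dx]) for i in range(len(dark))
             if 0 <= i + dx < len(bright)]
    return sum(abs(d - b) for d, b in pairs) / len(pairs)

def best_translation(dark, bright, max_shift=2):
    """Return (dx, error) minimizing the error over shifts in [-max_shift, max_shift]."""
    candidates = [(dx, shift_error(dark, bright, dx))
                  for dx in range(-max_shift, max_shift + 1)]
    return min(candidates, key=lambda c: c[1])
```

The point of the embodiment is that a small head movement between the bright place capture and the dark place capture should not defeat the match; searching over a small family of rigid transforms absorbs that movement before the similarity threshold is applied.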
Embodiment 5 of the present invention will be described.
Embodiment 5 is an HMD that performs processing suited to cases where the place the user is in is dark but part of the area in front of the user is illuminated by a light source.
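The Embodiment 5 behavior, inserting the lit part of the current dark place camera image into the stored bright place camera image, can be sketched per pixel. Representing images as flat pixel lists and the irradiation region as a boolean mask is an illustrative simplification of my own.

```python
# Sketch of Embodiment 5: in the displayed visual image, pixels inside the
# light source's irradiation region come from the current dark place camera
# image (which is actually visible there), while the rest come from the
# stored bright place camera image. The flat-list/mask layout is illustrative.

def compose_visual_image(bright_camera, dark_camera, lit_mask):
    """Per-pixel composite: take the dark place pixel where lit, else bright."""
    return [dark if lit else bright
            for bright, dark, lit in zip(bright_camera, dark_camera, lit_mask)]
```

This keeps the displayed image consistent with present reality inside the illuminated region while still guiding the user through the unlit remainder.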
Embodiment 6 of the present invention will be described.
Embodiment 6 is an HMD that can be operated by the user's gestures.
Embodiment 7 of the present invention will be described.
Embodiment 7 is an example of an HMD configured to display a user interface screen.
Embodiment 8 of the present invention will be described.
Embodiments 1 to 7 were examples using, as the image display device, an HMD with a light-transmissive image display surface that passes light from in front of the user. This embodiment is an example using, as the image display device, an HMD with a non-transmissive image display surface that does not pass light from in front of the user.
Embodiment 9 of the present invention will be described.
Embodiments 1 to 8 were examples using an HMD as the image display device. Embodiment 9 is an example using a portable information terminal as the image display device. As the portable information terminal, for example, a smartphone, a tablet terminal device, or a notebook computer can be considered.
An image display device that displays a visual image to be viewed by a user, comprising:
a camera that images the area in front of the user to obtain a camera image;
a ranging sensor that obtains data representing the distance from the user to each position on a real object included in the field of view of the camera;
an illuminance sensor that obtains data representing the brightness of the location where the user is;
a generation device that generates, based on the data obtained by the ranging sensor, a distance image that corresponds to the field of view and in which each pixel represents the distance to each position;
a determination device that determines, based on the data obtained by the illuminance sensor, whether the location of the user is a bright place or a dark place;
a storage device that, when the determination device determines a bright place, stores the camera image obtained by the camera and the distance image obtained by the generation device as an image set including a bright place camera image and a bright place distance image;
a recognition device that, when the determination device determines a dark place, recognizes the distance image obtained by the generation device as a dark place distance image;
a search device that identifies the bright place distance image corresponding to the recognized dark place distance image by comparing the bright place distance images stored by the storage device with the dark place distance image recognized by the recognition device;
a decision device that decides the visual image to be viewed by the user based on the bright place camera image included in the same image set as the bright place distance image identified by the search device; and
a display device that displays the visual image.
A program for causing a computer to function as the generation device, determination device, storage device, recognition device, search device, and decision device in the image display device according to [Appendix 1].
A computer-readable recording medium on which is recorded a program for causing a computer to function as the generation device, determination device, storage device, recognition device, search device, and decision device in the image display device according to [Appendix 1].
In the image display device according to Appendix 1, the process of comparing the dark place distance image and the bright place distance image applies at least one of parallel translation, rotation, and scaling to at least one of the dark place distance image and the bright place distance image.
In the image display device according to Appendix 1,
a smartphone, a tablet terminal device, or a notebook computer is used as the image display device.
In the image display device according to Appendix 1,
the camera also serves as the illuminance sensor.
In the image display device according to Appendix 1,
the search device applies at least one of parallel translation, rotation, and scaling to at least one of the recognized dark place distance image and the identified bright place distance image, and compares them.
In the image display device according to Appendix 1,
the display device includes a screen onto which the visual image is projected.
In the image display device according to Appendix 1,
the display device includes a display that shows the visual image.
In the image display device according to Appendix 1,
the search device identifies, as the bright place distance image corresponding to the dark place distance image, a bright place distance image whose degree of similarity or degree of match with the dark place distance image is equal to or greater than a threshold.
In the image display device according to Appendix 1,
the search device identifies, as the bright place distance image corresponding to the dark place distance image, a bright place distance image recognized by artificial intelligence as matching, approximating, or resembling the dark place distance image.
The image display device according to Appendix 1 further comprises
a time information acquisition device, wherein
the time information acquisition device acquires time information including at least a date, and
the storage device executes:
a process of extracting, from the stored bright place distance images, a bright place distance image corresponding to the bright place distance image to be stored by the storage device;
a process of, when saving the image set, saving it in association with the time information acquired by the time information acquisition device; and,
when saving the image set, a process of overwriting the image set containing the bright place distance image extracted by the extraction process, or a process of erasing the image set containing the extracted bright place distance image when the time represented by the time information associated with the extracted bright place distance image is earlier, by a first time or more, than the time represented by the time information associated with the bright place distance image to be stored.
In the image display device according to Appendix 12,
the extraction process extracts, as the bright place distance image corresponding to the bright place distance image to be stored, a bright place distance image whose degree of similarity or degree of match with the bright place distance image to be stored is equal to or greater than a threshold.
In the image display device according to Appendix 12,
the extraction process extracts, as the bright place distance image corresponding to the bright place distance image to be stored, a bright place distance image recognized by artificial intelligence as matching, approximating, or resembling the bright place distance image to be stored.
Claims (16)
- An image display device that displays a visual image to be viewed by a user, comprising:
a camera that images the area in front of the user to obtain a camera image;
a ranging sensor that obtains data representing the distance from the user to each position on a real object included in the field of view of the camera;
an illuminance sensor that obtains data representing the brightness of the location where the user is;
a generation device that generates, based on the data obtained by the ranging sensor, a distance image that corresponds to the field of view and in which each pixel represents the distance to each position;
a determination device that determines, based on the data obtained by the illuminance sensor, whether the location of the user is a bright place or a dark place;
a storage device that, when the determination device determines a bright place, stores the camera image obtained by the camera and the distance image obtained by the generation device as an image set including a bright place camera image and a bright place distance image;
a recognition device that, when the determination device determines a dark place, recognizes the distance image obtained by the generation device as a dark place distance image;
a search device that identifies the bright place distance image corresponding to the recognized dark place distance image by comparing the bright place distance images stored by the storage device with the dark place distance image recognized by the recognition device;
a decision device that decides the visual image to be viewed by the user based on the bright place camera image included in the same image set as the bright place distance image identified by the search device; and
a display device that displays the visual image. - In the image display device according to claim 1,
comprising a position information acquisition device and a narrowing device, wherein
the position information acquisition device acquires position information of the user,
the storage device, when saving the image set, saves the image set in association with the position information acquired by the position information acquisition device, and
the narrowing device narrows down the bright place distance images to be compared by the search device based on the position information associated with the image sets. - In the image display device according to claim 1,
comprising a time information acquisition device, wherein
the time information acquisition device acquires time information including at least a date, and
the storage device executes:
an extraction process of extracting, from the stored bright place distance images, a bright place distance image corresponding to the bright place distance image to be stored;
a saving process of, when saving the image set, saving it in association with the time information acquired by the time information acquisition device; and,
when saving the image set, an overwrite process of overwriting the image set containing the bright place distance image extracted by the extraction process, or an erasure process of erasing the image set containing the extracted bright place distance image when the time represented by the time information associated with the extracted bright place distance image is earlier, by a first time or more, than the time represented by the time information associated with the bright place distance image to be stored. - In the image display device according to claim 1,
comprising a detection device, wherein
the detection device detects a difference region between the recognized dark place distance image and the identified bright place distance image, and
the decision device decides the visual image by executing an insertion process of inserting the image corresponding to the difference region in the recognized dark place distance image into the region corresponding to the difference region in the bright place camera image included in the same image set as the identified bright place distance image. - In the image display device according to claim 4,
comprising a discrimination device, wherein
the discrimination device discriminates whether the cause of the difference region lies in the recognized dark place distance image or in the identified bright place distance image, and
the decision device executes the insertion process when the discrimination device discriminates that the cause lies in the identified bright place distance image. - In the image display device according to claim 1,
comprising a detection device and a selection device, wherein
the detection device detects a difference region between the recognized dark place distance image and the identified bright place distance image,
the selection device selects, by comparing the bright place distance images stored by the storage device with the recognized dark place distance image, a bright place distance image that corresponds to the recognized dark place distance image and differs from the identified bright place distance image, and
the decision device decides the visual image by executing an insertion process of inserting, into the region corresponding to the detected difference region in the bright place camera image included in the same image set as the identified bright place distance image, the image corresponding to the detected difference region in the bright place camera image included in the same image set as the bright place distance image selected by the selection device. - In the image display device according to claim 1,
comprising a detection device and a discrimination device, wherein
the detection device detects a difference region between the recognized dark place distance image and the identified bright place distance image,
the discrimination device discriminates whether the cause of the difference region detected by the detection device lies in the recognized dark place distance image or in the identified bright place distance image, and
the decision device decides the visual image by executing, when the discrimination device discriminates that the cause lies in the recognized dark place distance image, an enhancement process of enhancing the region corresponding to the detected difference region in the bright place camera image included in the same image set as the identified bright place distance image. - In the image display device according to claim 1,
comprising a detection device and a discrimination device, wherein
the detection device detects a difference region between the recognized dark place distance image and the identified bright place distance image,
the discrimination device discriminates whether the cause of the difference region detected by the detection device lies in the recognized dark place distance image or in the identified bright place distance image, and
the decision device decides the visual image by executing, when the discrimination device discriminates that the cause lies in the recognized dark place distance image, an insertion process of inserting an artificial image corresponding to the features of the difference region into the region corresponding to the detected difference region in the bright place camera image included in the same image set as the identified bright place distance image. - In the image display device according to claim 1,
wherein the search device identifies a plurality of bright place distance images, and
the decision device decides the visual image based on the plurality of bright place distance images identified by the search device. - In the image display device according to claim 1,
comprising a light source and a sensing device, wherein
the light source illuminates part of the field of view,
the sensing device senses the irradiation region of the light source in the dark place camera image obtained by the camera in a dark place, and
the decision device decides the visual image by executing an insertion process of inserting, into the region corresponding to the irradiation region sensed by the sensing device in the bright place camera image included in the same image set as the identified bright place distance image, the image corresponding to the sensed irradiation region in the dark place camera image included in the same image set as the recognized dark place distance image. - In the image display device according to claim 1,
comprising a head-mounted display, wherein
the head-mounted display has the camera, the ranging sensor, the illuminance sensor, and the display device. - An image display system comprising a server connected to a network and a portable image display device that is connected to the network and communicates with the server, wherein
the image display device comprises:
a camera that images the area in front of a user to obtain a camera image;
a ranging sensor that obtains data representing the distance from the user to each position on a real object included in the field of view of the camera;
an illuminance sensor that obtains data representing the brightness of the location where the user is; and
a display device that displays a visual image to be viewed by the user, and,
through cooperation between the server and the image display device, the system executes:
a generation process of generating, based on the data obtained by the ranging sensor, a distance image that corresponds to the field of view and in which each pixel represents the distance to each position;
a determination process of determining, based on the data obtained by the illuminance sensor, whether the location of the user is a bright place or a dark place;
a storage process of, when the determination process determines a bright place, storing the camera image obtained by the camera and the distance image obtained by the generation process as an image set of a bright place camera image and a bright place distance image;
a recognition process of, when the determination process determines a dark place, recognizing the distance image obtained by the generation process as a dark place distance image;
a search process of identifying the bright place distance image corresponding to the recognized dark place distance image by comparing the bright place distance images stored by the storage process with the dark place distance image recognized by the recognition process; and
a decision process of deciding the visual image based on the bright place camera image included in the same image set as the bright place distance image identified by the search process. - An image display system comprising a server connected to a network and a plurality of portable image display devices that are connected to the network and communicate with the server, wherein
each of the plurality of image display devices comprises:
a camera that images the area in front of a user to obtain a camera image;
a ranging sensor that obtains data representing the distance from the user to each position on a real object included in the field of view of the camera;
an illuminance sensor that obtains data representing the brightness of the location where the user is;
a display device that displays a visual image to be viewed by the user; and
a position information acquisition device that acquires position information of the user,
in each of the plurality of image display devices, through cooperation between the server and the image display device, the system executes:
a first generation process of generating, based on the data obtained by the ranging sensor, a distance image that corresponds to the field of view and in which each pixel represents the distance to each position;
a first determination process of determining, based on the data obtained by the illuminance sensor, whether the location of the user is a bright place or a dark place; and
a storage process of, when the first determination process determines a bright place, forming the camera image obtained by the camera and the distance image obtained by the first generation process into an image set including a bright place camera image and a bright place distance image, and storing the image set in the server in association with the position information acquired by the position information acquisition device, and,
through cooperation between the server and one of the plurality of image display devices, the system executes:
a second generation process of generating, based on the data obtained by the ranging sensor, a distance image that corresponds to the field of view and represents the distance to each position;
a second determination process of determining, based on the data obtained by the illuminance sensor, whether the location of the user is a bright place or a dark place;
a recognition process of, when the second determination process determines a dark place, recognizing the distance image obtained by the second generation process as a dark place distance image;
an identification process of identifying, among the bright place distance images stored in the server, those whose associated position information represents a position corresponding to the position represented by the position information acquired by the position information acquisition device of the one image display device;
a search process of identifying the bright place distance image corresponding to the recognized dark place distance image by comparing the bright place distance images identified by the identification process with the dark place distance image recognized by the recognition process; and
a decision process of reading, from the server, the bright place camera image included in the same image set as the bright place distance image identified by the search process, and deciding, based on the read bright place camera image, the visual image to be displayed by the one image display device. - In the image display system according to claim 12,
the image display device has a head-mounted display shape, and
the display device covers part or all of the user's field of view and has an image display surface that is transmissive to light from in front of the user. - In the image display system according to claim 12,
the image display device has a head-mounted display shape, and
the display device covers part or all of the user's field of view and has an image display surface that is non-transmissive to light from in front of the user. - A portable image display device that communicates with a server connected to a network, wherein
the image display device comprises:
a camera that images the area in front of a user to obtain a camera image;
a ranging sensor that obtains data representing the distance from the user to each position on a real object included in the field of view of the camera;
an illuminance sensor that obtains data representing the brightness of the location where the user is; and
a display device that displays a visual image to be viewed by the user, and,
through cooperation between the server and the image display device, comprises:
a generation device that generates, based on the data obtained by the ranging sensor, a distance image that corresponds to the field of view and in which each pixel represents the distance to each position;
a determination device that determines, based on the data obtained by the illuminance sensor, whether the location of the user is a bright place or a dark place;
a communication device that, when the determination device determines a bright place, outputs the camera image obtained by the camera and the distance image obtained by the generation device for storage in the server as an image set of a bright place camera image and a bright place distance image; and
a recognition device that, when the determination device determines a dark place, recognizes the distance image obtained by the generation device as a dark place distance image, wherein
the bright place distance image corresponding to the recognized dark place distance image is identified by comparing the bright place distance images stored in the server with the recognized dark place distance image,
the communication device receives the visual image processed based on the bright place camera image included in the same image set as the identified bright place distance image, and
the image display device displays the received visual image.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202180098170.6A CN117296307A (zh) | 2021-05-13 | 2021-05-13 | Image display device and image display system
JP2023520699A JPWO2022239206A1 (ja) | 2021-05-13 | 2021-05-13 | |
PCT/JP2021/018292 WO2022239206A1 (ja) | 2021-05-13 | 2021-05-13 | Image display device and image display system
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/018292 WO2022239206A1 (ja) | 2021-05-13 | 2021-05-13 | Image display device and image display system
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022239206A1 true WO2022239206A1 (ja) | 2022-11-17 |
Family
ID=84028058
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/018292 WO2022239206A1 (ja) | 2021-05-13 | 2021-05-13 | 画像表示装置および画像表示システム |
Country Status (3)
Country | Link |
---|---|
JP (1) | JPWO2022239206A1 (ja) |
CN (1) | CN117296307A (ja) |
WO (1) | WO2022239206A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2001155153A (ja) * | 1999-11-26 | 2001-06-08 | Komatsu Ltd | Distance image generation device and method |
- JP2018147151A (ja) * | 2017-03-03 | 2018-09-20 | Kddi株式会社 | Terminal device, control method therefor, and program |
- WO2020115784A1 (ja) * | 2018-12-03 | 2020-06-11 | マクセル株式会社 | Augmented reality display device and augmented reality display method |
-
2021
- 2021-05-13 JP JP2023520699A patent/JPWO2022239206A1/ja active Pending
- 2021-05-13 CN CN202180098170.6A patent/CN117296307A/zh active Pending
- 2021-05-13 WO PCT/JP2021/018292 patent/WO2022239206A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
JPWO2022239206A1 (ja) | 2022-11-17 |
CN117296307A (zh) | 2023-12-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10514758B2 (en) | Visibility improvement method based on eye tracking, machine-readable storage medium and electronic device | |
US10068369B2 (en) | Method and apparatus for selectively integrating sensory content | |
- CN106415671B (zh) | Method and system for presenting digital information related to a real object | |
US9122053B2 (en) | Realistic occlusion for a head mounted augmented reality display | |
- JP5267660B2 (ja) | Image processing device, image processing program, and image processing method | |
EP3767432A1 (en) | Information processing device, information processing method, and recording medium | |
US20180164983A1 (en) | Display system, display apparatus, control method for display apparatus | |
- JPWO2019044536A1 (ja) | Information processing device, information processing method, program, and mobile object | |
- JP6693223B2 (ja) | Information processing device, information processing method, and program | |
US20150193977A1 (en) | Self-Describing Three-Dimensional (3D) Object Recognition and Control Descriptors for Augmented Reality Interfaces | |
- KR101944607B1 (ko) | System and method for acquiring distance information in guide signs for determining a vehicle's position | |
- CN110546026A (zh) | Adjustment device, display system, and adjustment method | |
- WO2018134897A1 (ja) | Position and orientation detection device, AR display device, position and orientation detection method, and AR display method | |
- KR20200040716A (ko) | Visibility improvement method using eye tracking, storage medium, and electronic device | |
- WO2022239206A1 (ja) | Image display device and image display system | |
US20190114502A1 (en) | Information processing device, information processing method, and program | |
- KR101935853B1 (ko) | Night vision system using lidar and radar | |
- KR20190069633A (ko) | Electronic device and text providing method thereof | |
- KR20190071781A (ko) | Night vision system displaying thermal energy information and control method thereof | |
- CN116249576A (zh) | System and method for dynamically processing images | |
EP3510440B1 (en) | Electronic device and operation method thereof | |
US20230004214A1 (en) | Electronic apparatus and controlling method thereof | |
- KR20190070951A (ko) | Night vision system | |
- JP2023167915A (ja) | Information processing device, information processing method, and program | |
- KR20240030881A (ko) | Virtual content output method and electronic device supporting the same | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21941938 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023520699 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202180098170.6 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21941938 Country of ref document: EP Kind code of ref document: A1 |