WO2003098942A1 - Information processing apparatus, information processing system, and dialogist displaying method - Google Patents
- Publication number
- WO2003098942A1 (PCT/JP2003/006155)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- information processing
- image display
- processing apparatus
- display means
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/142—Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
- H04N7/144—Constructional details of the terminal equipment, e.g. arrangements of the camera and the display camera and display on the same optical axis, e.g. optically multiplexing the camera and display for eye to eye contact
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/111—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
- H04N13/307—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using fly-eye lenses, e.g. arrangements of circular lenses
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/324—Colour aspects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/398—Synchronisation thereof; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/142—Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
- H04N2007/145—Handheld terminals
Definitions
- the present invention relates to an information processing apparatus such as a mobile communication terminal capable of displaying an image of a communication partner.
- more particularly, the present invention relates to an information processing apparatus, an information processing system, and an interlocutor displaying method that use a mobile communication terminal or the like to display the communication partner with a line of sight directed toward the terminal user.
- mobile communication terminals such as mobile phones have become extremely popular. In addition to voice communication, these mobile communication terminals now offer various functions such as sending and receiving e-mail, accessing the Internet, and receiving still and moving images captured by the other party's camera. In recent years, not only mobile phones but also two-way video communication systems, such as video conference systems and videophones that connect remote locations over a video and voice communication network, have become widespread.
- Some mobile phones and two-way video communication systems are configured so that an image of a communication partner is displayed on an image display device arranged near a camera that images a user's face and the like.
- a user of such a mobile phone or a two-way video communication system usually proceeds with a conversation while watching an image such as a face of a call partner displayed on an image display device.
- even if the image display unit is as small as about 50 mm square, when the imaging device is arranged in the left or right areas or the upper or lower areas of the image display unit, the line of sight of the captured face image does not face the imaging device as long as the user looks at the other party's image on the display. As a result, in such a device, a face image whose line of sight does not match is displayed on the image display unit. The user therefore converses with an image whose gaze does not meet his or her own, which gives an unavoidably unnatural impression and robs the dialogue of its sense of presence.
- FIG. 27A, FIG. 27B, FIG. 28A, and FIG. 28B are diagrams showing examples in which the viewing directions do not match on the image display unit of the terminal.
- in these examples, the camera is located 65 mm from the center of the image display unit, the user's face is positioned at a distance of about 25 cm, and the user looks at the center of the image display unit.
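As a rough sanity check (an illustration added here, not part of the patent text), the angular gaze offset implied by these numbers follows from simple geometry: a camera displaced 65 mm from the point the user is looking at, viewed from about 250 mm, subtends an angle of roughly fifteen degrees, which is well above the threshold at which gaze misalignment is perceived.

```python
import math

camera_offset_mm = 65.0       # camera displacement from the display centre (from the text)
viewing_distance_mm = 250.0   # approximate distance of the user's face (from the text)

# Angle between the user's gaze (aimed at the display centre) and the camera axis
offset_angle = math.degrees(math.atan(camera_offset_mm / viewing_distance_mm))
print(f"gaze offset: {offset_angle:.1f} degrees")
```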
- FIGS. 27A and 27B schematically show the image displayed on the image display unit when the camera is placed in either the left or the right area of the image display unit, which is about 50 mm square.
- FIG. 27A is the camera image when the camera is arranged on the left side of the image display unit, and FIG. 27B is the camera image when the camera is arranged on the right side. In both cases the eyes in the camera image are not directed from the image display unit toward the user, so the dialogue screen appears unnatural.
- FIGS. 28A and 28B schematically show the image displayed on the image display section when the camera is arranged in either the upper or the lower area of the image display section of about 50 mm square.
- FIG. 28A shows the camera image when the camera is placed above the image display section, and FIG. 28B the camera image when the camera is placed below it. In this case as well, the camera image does not look directly at the user from the image display unit, resulting in an unnatural dialogue screen.
- in one known technique, a substantially flat minute half mirror is arranged on the surface of the image display unit, and the image displayed on the image display unit is viewed by the user through the half mirror,
- while the user's image is captured by an imaging camera that receives the light reflected at the surface of the minute half mirror.
- an image display unit having a structure capable of transmitting light is provided, and a camera is arranged behind the image display unit for a user.
- the image display unit repeats the display state and the transmission state in a time-division manner, and when the image display unit is in the transmission state, an image of the user is taken.
- a required video signal is sent to the image display unit during a period different from the transmission state, and an image of the other party is displayed.
- it thus becomes possible to perform display with matched lines of sight.
- a display / imaging device as described in Japanese Patent Application Laid-Open No. 4-166790 is known.
- in this device, minute holes are provided over the entire surface of the image display unit, one end of an optical fiber faces each minute hole, and the other end of each optical fiber is connected to the camera.
- in this display and imaging device, since the positional relationship between the fiber ends facing the holes and the image display unit does not deviate, display with matched lines of sight is possible.
- the conventional techniques described above are realized by aligning the optical axis of the image display unit with the optical axis of the imaging device.
- Japanese Patent Application Laid-Open No. H10-75432 discloses a housing, placed on a table, provided with an imaging unit composed of cameras and an image display unit.
- A three-dimensional videophone is disclosed that is configured using a glasses-free split-type three-dimensional liquid crystal display element, with cameras arranged at the left and right positions of the image display unit.
- The publication states that by selectively combining and fusing the images captured by the two cameras provided at the left and right of the image display unit, a pseudo-stereoscopic frontal face image can be obtained and a conversation with matched eyes can be performed.
- in each of these prior arts, however, the imaging device must be arranged in a particular positional relationship with the image display unit itself.
- the present invention has been made in view of such circumstances, and aims to provide an information processing apparatus, an information processing system, and an interlocutor displaying method that realize a conversation with naturally matched lines of sight in a portable information processing apparatus such as a mobile communication terminal.
- the present invention also aims to provide an information processing apparatus, an information processing system, and an interlocutor displaying method that realize a conversation with naturally matched lines of sight while avoiding the problems caused by an excessively large parallax between the two captured images.
Disclosure of the invention
- An information processing apparatus that achieves the above object is a portable information processing apparatus for conducting a dialogue with video, comprising image display means for displaying a required image in accordance with an image signal,
- and imaging means provided on each of the left and right sides of the image display means.
- the right imaging means captures the user as seen from the right front,
- and the left imaging means captures the user as seen from the left front.
- the image captured from the left side and the image captured from the right side are displayed together as the required image on the image display means of, for example, the terminal on the other side.
- since the viewer of the image display means sees both images at the same time, the left and right gaze discrepancies are perceived in a mutually corrected form. A display with matched lines of sight can therefore be realized without specially aligning the optical axis of the imaging means with the optical axis of the image display unit.
- another information processing apparatus that achieves the above object is a portable information processing apparatus for conducting a dialogue with video, comprising a portable housing, image display means mounted on a surface of the housing for displaying a required image in accordance with an image signal, and imaging means provided on the housing surface on each of the left and right sides of the image display means.
- since this information processing apparatus is mounted in a portable housing and the imaging means on the left and right sides are formed on the housing surface, the apparatus can take a thin structure as a whole,
- the information processing device can be configured with a small and lightweight housing.
- a further information processing apparatus that achieves the above object is a portable information processing apparatus for conducting a dialogue with video, comprising image display means in which a plurality of pixels performing display based on a left-eye signal and a plurality of pixels performing display based on a right-eye signal are mixed, and imaging means provided on each of the left and right sides of the image display means.
- by providing the imaging means on each of the left and right sides of the image display means, the imaging means can be mounted without specially adjusting its position relative to the image display means, for example without an overlapping positional relationship, so a small and thin structure is realized as a whole. Further, since the image display means mixes a plurality of pixels performing display based on the left-eye signal with a plurality of pixels performing display based on the right-eye signal, the user can proceed with the conversation with matched gaze without special equipment such as polarized glasses.
- each of the three forms of information processing apparatus according to the present invention preferably further includes image processing means for generating new images in which the parallax is interpolated, based on the two images captured by the imaging means, the two new images generated by the image processing means being displayed on the display surface of the image display means.
- with such an information processing apparatus, problems in displaying images due to an excessively large parallax between the two images captured by the imaging means can be avoided.
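The parallax interpolation described above, detailed later in connection with Figure 26, amounts to warping each pixel by a fraction of its disparity so that the result looks as if it were taken by a virtual camera between the two real ones. A minimal sketch (illustrative only; the function name, the forward-warping strategy, and the handling of occlusions are assumptions, not the patent's specification):

```python
import numpy as np

def warp_to_midpoint(left, disparity, alpha=0.5):
    """Forward-warp the left image by a fraction of its per-pixel disparity.

    `disparity[y, x]` is the horizontal shift (in pixels) from a left-image
    pixel to its corresponding point in the right image.  With alpha=0.5
    each pixel moves half-way toward that corresponding point, approximating
    a view from a virtual camera midway between the two real cameras
    (occlusions and holes are ignored in this sketch).
    """
    h, w = left.shape[:2]
    out = np.zeros_like(left)
    for y in range(h):
        for x in range(w):
            nx = int(round(x - alpha * disparity[y, x]))
            if 0 <= nx < w:
                out[y, nx] = left[y, x]
    return out
```

A symmetric warp of the right image by `-alpha * disparity`, blended with this one, would reduce the holes left by occluded regions.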
- a plurality of portable information processing terminals that are configured to include means for performing a dialogue with a video are provided, and the information processing terminals can communicate with each other.
- since imaging means are provided on each of the left and right sides, the imaging means on the right captures the user as seen from the right front, and the imaging means on the left captures the user as seen from the left front.
- each information processing terminal preferably has image processing means for generating new images in which the parallax is interpolated based on the two images obtained by the imaging means, the two new images generated by the image processing means being displayed on the display surface of the image display means of the partner's information processing terminal.
- with such an information processing system, display problems due to an excessively large parallax between the two captured images can be avoided; the stereoscopic display for matching eyes with the communication partner can therefore be optimized, yielding images that are easy to see and natural.
- the interlocutor displaying method according to the present invention uses a pair of imaging means provided on the left and right sides of the image display means of a portable terminal.
- a video of a user is captured by a pair of imaging means, and a signal relating to a left image and a signal relating to a right image of the user are obtained.
- These signals are sent to the image display means of the communication partner's terminal, where the pixels of the left image and the pixels of the right image are displayed in mixed form, for example, so that the user can be displayed with his or her line of sight coinciding with the partner's viewpoint.
- preferably, the interlocutor displaying method includes an image processing step of generating new images in which the parallax is interpolated based on the two images captured in the video capturing step, the two new images generated in the image processing step being displayed on the display surface of the image display means.
- with such an interlocutor displaying method, display problems due to an excessively large parallax between the two captured images can be avoided, the stereoscopic display for matching lines of sight with the partner can be optimized, and the images become easier to view and more natural.
- still another information processing apparatus that achieves the above object is an information processing apparatus for conducting a dialogue with video, comprising image display means for displaying a required image in accordance with an image signal, imaging means provided on each of the left and right sides of the image display means, and image processing means for generating new images in which the parallax is interpolated based on the two images obtained by the imaging means,
- the display surface of the image display means displaying the two new images generated by the image processing means.
- imaging means are provided on both left and right sides of the image display means, and a new parallax-interpolated image is obtained based on the two images obtained by the imaging means.
- yet another information processing apparatus that achieves the above object is an information processing apparatus for conducting a dialogue with video, comprising image display means mounted on a surface of a housing for displaying a required image in accordance with an image signal,
- imaging means provided on the housing surface on each of the left and right sides of the image display means,
- and image processing means for generating new images in which the parallax is interpolated based on the two images obtained by the imaging means, the two new images generated by the image processing means being displayed on the display surface of the image display means.
- Such an information processing apparatus is mounted in a portable housing with the left and right imaging means formed on the housing surface, so it can take a thin structure as a whole and the information processing device can be configured with a small and lightweight housing.
- A further information processing apparatus that achieves the above object is an information processing apparatus for conducting a dialogue with video, comprising image display means in which a plurality of pixels performing display based on a left-eye signal and a plurality of pixels performing display based on a right-eye signal are mixed,
- imaging means provided on each of the left and right sides of the image display means, and image processing means for generating new images in which the parallax is interpolated based on the two images captured by the imaging means, the two new images generated by the image processing means being displayed on the display surface of the image display means.
- the imaging means can be mounted without specially adjusting its position relative to the image display means, for example without a positional relationship in which they are superimposed on each other.
- since the image display means includes a plurality of pixels performing display based on the left-eye signal and a plurality of pixels performing display based on the right-eye signal, and displays newly generated images in which the parallax of the two captured images has been interpolated,
- the stereoscopic display for matching lines of sight with the partner is optimized without special equipment such as polarized glasses, making it possible to proceed with the conversation while keeping eye contact with the partner under a more natural and easier-to-see image.
- the information processing system includes, in addition to providing image display means capable of displaying an image including a face part of a call partner, image pickup means on each of the left and right sides of the image display means.
- the system comprises a plurality of information processing terminals for conducting a dialogue with video, each terminal having image processing means for generating new parallax-interpolated images based on the two images captured by the imaging means;
- when the terminals communicate with each other, the two new images generated by the image processing means are displayed
- on the display surface of the image display means of the partner's information processing terminal.
- since imaging means are provided on each of the left and right sides, the right imaging means captures the user as seen from the right front,
- and the left imaging means captures the user as seen from the left front.
- the image taken from the left side and the image taken from the right side are displayed together by the image display means of the information processing terminal on the other side.
- since the image display means displays newly generated images in which the parallax of the two images obtained by the imaging means has been interpolated, the stereoscopic display for matching gaze with the partner
- can be optimized, and an image that is extremely easy to see and whose gaze direction is natural can be obtained.
- the interlocutor displaying method may further comprise an image processing step of generating new images with the parallax interpolated based on the two images captured in the video capturing step, the image display means of the partner's terminal displaying the two new images generated in the image processing step so that the user's line of sight coincides with the partner's viewpoint.
- a user's video is captured by a pair of imaging means, and a new image in which parallax is interpolated based on the captured two images is generated.
- the signal concerning the left image and the signal concerning the right image of the user are obtained.
- the three-dimensional display for matching lines of sight with the partner is thereby optimized, making it possible to display the user with his or her line of sight coinciding with the partner's viewpoint.
- FIG. 1 is a schematic diagram of an example of the information processing device of the present invention.
- 2A to 2C are schematic diagrams illustrating the principle of an example of the information processing device of the present invention.
- FIG. 3 is a block diagram showing a schematic circuit configuration of an example of the information processing apparatus of the present invention.
- FIG. 4 is a diagram showing a use state of a system using the information processing device of the present invention.
- FIG. 5 is an explanatory diagram (X direction) of an emission angle of a light beam in the image display unit of the information processing device of the present invention.
- FIG. 6 is an explanatory diagram (y-direction) of an emission angle of a light ray in the image display unit of the information processing device of the present invention.
- FIG. 7 is a diagram illustrating a mounting position of a camera on an image display unit of an example of the information processing apparatus of the present invention.
- FIG. 8 is a schematic diagram illustrating a configuration of a pixel of an image display unit of an example of the information processing device of the present invention.
- FIGS. 9A to 9E are diagrams illustrating an example of an array pattern of left-eye pixels (L) and right-eye pixels (R) of an image display unit of an example of the information processing apparatus according to the present invention.
- FIG. 10 is a schematic diagram showing a configuration near a pixel of an image display unit of an example of the information processing apparatus of the present invention.
- FIG. 11 is a cross-sectional view of an image display unit of an example of the information processing apparatus according to the present invention in the case where the axes of the light emitting element and the microlens are not shifted.
- FIG. 12 is a cross-sectional view of an image display unit of an example of the information processing apparatus of the present invention in the case where the axes of the light emitting element and the microlens are misaligned.
- FIGS. 13A and 13B are simulated diagrams showing the spread of light rays by the light emitting element and the microlens in an example of the information processing apparatus of the present invention;
- FIG. 13A shows the spread of the light beam when the axes of the light emitting element and the microlens are aligned,
- and FIG. 13B shows the spread of the light beam when the axes are misaligned.
- FIGS. 14A to 14C are graphs showing optical calculation results on the size and the positional relationship between the microlens and the light-emitting element as an example of the information processing apparatus according to the present invention.
- FIG. 14A is a graph showing the result of an optical calculation of the element diameter and the spread angle of the light beam.
- FIG. 14B is a graph showing the result of calculating the position of the lens in the optical axis direction and the spread of the light beam.
- FIG. 14C is a graph showing the calculation result of the in-plane position of the light emitting element and the emission angle of the light ray.
- FIGS. 15A and 15B are diagrams showing an example of the microlens used in the information processing apparatus of the present invention;
- FIG. 15A is a front view
- and FIG. 15B is a cross-sectional view.
- FIG. 16 is a conceptual diagram for explaining image processing as a function of an example of the information processing apparatus of the present invention, illustrating the process of finding, within an image, the region closest to a template.
- FIG. 17 is a diagram showing parameters for parallax.
- FIGS. 18A and 18B are diagrams showing an image display section of another example of the information processing apparatus of the present invention
- FIG. 18A is a cross-sectional view showing the vicinity of a light emitting element and a cone type micro lens.
- FIG. 18B is a perspective view of the cone-type microlens.
- FIG. 19 is a schematic view illustrating parameters for explaining a relationship between light emission and an emission angle when a microlens or the like is used.
- FIG. 20 is a schematic diagram illustrating the relationship between the ratio of the partial area to the whole area and the emission angle when a microlens or the like is used.
- FIG. 21 is a cross-sectional view showing an image display unit of still another example of the information processing device of the present invention.
- FIGS. 22A and 22B are schematic diagrams showing an experiment in which the impression given by images with matched gaze directions was examined;
- FIG. 22A is a diagram for the case where a stereoscopic image is obtained, and FIG. 22B for the case where a monaural image is obtained.
- FIGS. 23A and 23B are graphs showing the results of the experiments of FIGS. 22A and 22B; FIG. 23A is the diagram for the stereoscopic-image case and FIG. 23B for the monaural-image case.
- FIG. 24 is a conceptual diagram illustrating the concept of interpolating parallax in the information processing apparatus of the present invention, explaining how, based on two images captured by two cameras provided on the left and right sides of the image display unit,
- a new image with smaller parallax is generated as if captured by two virtual cameras placed at a smaller interval than the two real cameras.
- FIG. 25 is a conceptual diagram illustrating image processing as a function of an example of the information processing apparatus according to the present invention,
- explaining how corresponding points are obtained between two images captured by the two cameras provided on the left and right sides of the image display unit.
- FIG. 26 is a diagram schematically explaining how, by shifting an image by half the pixel displacement representing the parallax,
- an image equivalent to one captured from the midpoint of the two cameras that took the two images is generated.
- FIGS. 27A and 27B are diagrams schematically showing the display screen when a camera is arranged on one of the left and right sides of a screen of a predetermined size;
- FIG. 27A is for the camera on the left side
- and FIG. 27B for the camera on the right side.
- FIGS. 28A and 28B are diagrams schematically showing the display screen when a camera is arranged on one of the upper and lower sides of a screen of a predetermined size;
- FIG. 28A is for the camera above the screen and FIG. 28B for the camera below it.
- the mobile communication terminal 10 as an information processing apparatus has a housing 12 of a size that the user can hold in one hand, and is used for video dialogue with the longitudinal direction of the housing 12 held vertically. An image display section 11 measuring roughly 10 mm to 100 mm square is provided on the front side of the housing 12 so that its display surface is exposed to the outside. On the left and right sides of the image display section 11, a left camera 13L and a right camera 13R are provided as imaging sections.
- a small monitor 14 for monitoring the image of the user's own face is provided, and beside the small monitor 14
- an indicator 17 is provided that notifies the user, by flashing or the like, when the user's face has departed significantly from the imaging region.
- buttons 15 and a key input unit 16 are formed below the small monitor 14. In the mobile communication terminal 10, operating the buttons 15 and the key input section 16 enables dialing of telephone numbers, input of e-mail text, function control of the mobile communication terminal 10, and the like.
- the left camera 13L and the right camera 13R as imaging units are disposed on the left and right sides of the image display unit 11, respectively.
- the left camera 13L is provided so as to image the user of the mobile communication terminal 10 from slightly to the left of front,
- and the right camera 13R so as to image the user from slightly to the right of front.
- the left camera 13 L and the right camera 13 R are each composed of, for example, an optical system in which an M ⁇ S (Metal-Oxide Semiconductor) image sensor or a CCD (Charge Coupled Devices) image sensor and a lens are combined, Specifically, in the mobile communication terminal 10, the left camera 13 L and the right camera For example, a small and lightweight solid-state imaging device is used as the 13R.
- MOS Metal-Oxide Semiconductor
- CCD Charge Coupled Devices
- the image display unit 11 is configured by mixing a plurality of pixels performing display based on the left-eye signal and a plurality of pixels performing display based on the right-eye signal.
- the image display section 11 has emission means for independently emitting light to both eyes of the user.
- the emission means includes a light emitting unit that generates the required light based on a left-eye signal or a right-eye signal, and an emission angle control unit that controls the light from this light emitting unit so that it is emitted in a predetermined angular direction. The light emitting unit is composed of a plurality of arranged light emitting diodes, and the emission angle control unit is composed of a microlens array in which a plurality of microlenses are arranged in a matrix. Note that the structure of the emission means described here is merely an example.
- the emission angle control unit can also take various structures as described later. A method of mixing a plurality of pixels performing display based on the left-eye signal and a plurality of pixels performing display based on the right-eye signal will also be described later.
- the mobile communication terminal 10 shown as an embodiment of the present invention is a device that can compensate for this easily perceived left-right shift in the line of sight, realizing realistic communication.
- 2A to 2C are illustrations that briefly illustrate the concept.
- FIG. 2A is an image from the left camera 13L. Since the left camera 13L is located to the left of center, the user's line of sight appears shifted from the right.
- FIG. 2B is an image from the right camera 13R. Since the right camera 13R is located to the right of center, the user's line of sight appears shifted from the left.
- FIG. 2C shows the display obtained when an image is displayed using the image display unit 11, in which a plurality of pixels performing display based on the left-eye signal and a plurality of pixels performing display based on the right-eye signal are mixed, and which has a mechanism for emitting light independently toward each of the user's eyes. That is, the mobile communication terminal 10 emits light toward the user's left eye from each pixel of the image display unit 11 to which the left-eye signal is exclusively sent, so that this light reaches the left eye. Similarly, it emits light toward the user's right eye from each pixel to which the right-eye signal is exclusively sent, so that this light reaches the right eye.
- the user thus sees a different image with each eye, but the user's cerebrum fuses the two, and as a result, as shown in FIG. 2C, the user perceives a single image in which the interlocutor's line of sight is aligned.
- the image may be a still image or a moving image.
- the mobile communication terminal 10 can transmit and receive real-time video if sufficient transmission and reception bandwidth is available, but it may also be configured to transmit and receive compressed or thinned-out image data.
- FIG. 3 is a block diagram showing a circuit configuration example of the mobile communication terminal 10.
- the control circuit 21 includes a CPU (Central Processing Unit), a video memory, and a required image information processing circuit, and the control circuit 21 also performs signal processing for preventing camera shake as described later.
- the data processed by the control circuit 21 can be transmitted by the transmission / reception unit 28, and the data received by the transmission / reception unit 28 is transmitted to the control circuit 21.
- in the key input section 27, for example, a numeric keypad and function setting buttons are arranged, and a jog dial, an input pad, and other extended-function devices may further be arranged.
- a signal from the left camera 24L and a signal from the right camera 24R are independently supplied to the control circuit 21.
- the signal from the left camera 24L includes video data captured from slightly forward-left of the user, and the signal from the right camera 24R includes video data captured from slightly forward-right of the user.
- signals from the left camera 24L and the right camera 24R are sent to the transmission/reception unit 28 via the control circuit 21, transmitted from the transmission/reception unit 28, and received by another mobile communication terminal. The mobile communication terminal 10 can thus carry out a call by transmitting and receiving such data. In particular, in the present embodiment, the interlocutor can be seen as shown in FIG. 2C, so a natural call can be made with the lines of sight aligned.
- the signals from the left camera 24L and the right camera 24R are not only sent to the other portable communication terminal but can also be shown on the terminal's own small monitor screen: the signals from the left camera 24L and the right camera 24R are sent to the minor monitor 25 via the control circuit 21 and the driver 26.
- this allows the user of the portable communication terminal 10 to see his or her own face during a call on a relatively small screen such as the minor monitor 14 shown in FIG. 1. If the user talks without checking his or her own face, then depending on the angle at which the mobile communication terminal 10 is held, the left camera 24L and the right camera 24R may fail to capture the face at their center, and the face may fall outside the cameras' imaging range.
- the minor monitor 25 is a device that prompts such visual confirmation. In the mobile communication terminal 10, the signals from the left camera 24L and the right camera 24R sent to the control circuit 21 may further undergo steadying image processing such as camera-shake prevention, and if an appropriate face image cannot be displayed, the indicator 29 may be made to blink, or this condition may be displayed on the screen of the minor monitor 25 or the main monitor 22.
- the signal from the other party is sent from the transmission / reception unit 28 to the control circuit 21, divided into a left-eye signal LE and a right-eye signal RE, and output from the control circuit 21.
- the left-eye signal LE is sent to the left-eye driver 23 L for driving the left-eye pixel
- the right-eye signal RE is sent to the right-eye driver 23R for driving the right-eye pixels.
- the main monitor 22, which displays the other party's face, is driven by signals from the left-eye driver 23L and the right-eye driver 23R. As described above, on the main monitor 22, left-eye pixels and right-eye pixels are mixed; for example, even lines are for the right eye and odd lines are for the left eye.
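- the line-by-line mixing described above (even lines for the right eye, odd lines for the left) can be sketched as follows; this is an illustrative sketch, not the terminal's actual driver code, and the function name is hypothetical:

```python
import numpy as np

def interleave_line_by_line(left_img: np.ndarray, right_img: np.ndarray) -> np.ndarray:
    """Mix two equal-sized images so that even rows carry the right-eye
    signal RE and odd rows carry the left-eye signal LE, as described for
    the main monitor 22."""
    if left_img.shape != right_img.shape:
        raise ValueError("left and right images must have the same shape")
    mixed = np.empty_like(left_img)
    mixed[0::2] = right_img[0::2]  # even lines (0, 2, ...) go to the right eye
    mixed[1::2] = left_img[1::2]   # odd lines (1, 3, ...) go to the left eye
    return mixed
```

the same slicing pattern can be reused for the vertical-line and checkered assignments described later.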
- the block diagram shown in FIG. 3 merely shows an example of the mobile communication terminal 10, and other circuit configurations are possible.
- in the mobile communication terminal 10, only one of the left-eye signal LE and the right-eye signal RE may be sent to both drivers 23L and 23R by a button operation or the like. The signals for the main monitor 22 and the minor monitor 25 may likewise be switched in accordance with a button operation, and it is also possible to display both at once. In the mobile communication terminal 10, not only the image but also the sound may simultaneously be stereo. Further, left-eye and right-eye signal reproduction may be performed only when necessary, the screen normally being displayed with a signal that does not distinguish between left and right.
- FIG. 4 is a diagram schematically showing a state in which a call is made using two mobile communication terminals 35 and 38 having the same configuration as the above-described mobile communication terminal 10.
- the mobile communication terminal 35 carried by the user 33 is equipped with an image display section 36 and a pair of cameras 37L and 37R.
- the image display section 36 shows the face of the other party 34, and by the principle described above with reference to FIGS. 2A to 2C, the face image is a naturally shaped image whose line of sight is aligned.
- the portable communication terminal 38 held by the other party 34 has an image display unit 39 and a pair of cameras 40L and 40R.
- the image display section 39 shows the face of the user 33, and similarly, by the principle described above with reference to FIGS. 2A to 2C, the face image is a naturally shaped image whose line of sight is aligned.
- FIG. 5 is a diagram when the image display unit of the mobile communication terminal shown as the present embodiment is viewed from above.
- a region b1 indicates the range in which light emitted or reflected from the pixel at the left end of the display surface of the image display unit is strong enough to be seen.
- the region b2 is the corresponding range of light emitted or reflected from the pixel at the right end of the display surface of the image display unit.
- the angle θ changes continuously between the two ends so that the regions substantially overlap each other at a distance L1 from the display surface of the image display unit.
- the distance L1 is a dimension normally assumed as the distance at which the display surface is viewed.
- this distance L1 is set to 250 mm, which is called the distance of distinct vision for a person.
- the distance L2 is a dimension assumed in further consideration of viewing with an outstretched arm.
- this distance L2 is assumed to be 400 mm.
- black dots b3 and b4 indicate the positions of the left eye EL1 and the right eye ER1, respectively, of a viewer at the distance L1 from the display surface of the mobile communication terminal. Further, black dots b3' and b4' indicate the positions of the two eyes at the distance L2.
- the light in the regions indicated by b1 and b2 does not enter the right eye and can be seen only by the left eye.
- if the emission angle is reversed left and right, pixels that can be seen only by the right eye can likewise be set. Therefore, by distributing these left-eye and right-eye images line by line or pixel by pixel over the entire set of display pixels, stereoscopic vision becomes possible.
- the distance D is the horizontal size of the display surface of the image display unit.
- the width of the image display section of a portable device is about 20 mm to 80 mm owing to the requirement that the device be holdable in one hand.
- D is 40 mm.
- the values are designed here so that the light spread is 10°, so that the light for the two eyes mixes as little as possible, leaves no gap, and reaches the eyes sufficiently.
- with this setting, the distance at which part of the display surface theoretically becomes invisible is, geometrically, 202 mm at the shortest and 657 mm at the longest.
- the divergence angle of the rays may be larger than 10° as long as the light can still be separated between the two eyes.
- if this spread is made large, however, the exit angle of the light rays must be increased, which makes the optical design difficult.
- moreover, since the present invention assumes an individual user, there is the additional advantage that not spreading the light more than necessary serves privacy protection and reduces energy consumption.
- the same relationship holds for the black dots b3' and b4' at the distance L2.
- this can be further generalized by the following equations.
- when the distance Dxmax from the center to the edge of the display surface of the image display unit is shorter than the distance ER1, which is half the interocular distance, the size of the display surface of the image display unit is small.
- the distance L1 is a dimension normally assumed as a distance for viewing the display surface, for example the distance of distinct vision of a person.
- the distance L2 is a dimension assumed in further consideration of viewing with an outstretched arm.
- the distance Dx is the distance in the x direction (horizontal direction)
- the distance Dy is the distance in the y direction (vertical direction). In other words, the calculation can be performed assuming that the pixel whose emission angular direction is to be found is located at (Dx, Dy).
- when the distance Dxmax from the center to the end of the display surface of the image display unit is longer than the distance ER1, which is half the interocular distance, the size of the display surface of the image display unit is large.
- the exit angle of the light beam is set as shown in the following equation (3).
- the emission direction can be determined by calculating it for each pixel using the above equations (1) to (3). For example, when the right-eye and left-eye pixels are assigned line by line, several points may be extracted on a line and calculated using equations (1) to (3), and for each point between the extracted points the exit-angle data may be set by a method such as linear interpolation.
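- equations (1) to (3) themselves are not reproduced in this excerpt; as an illustrative stand-in, the exit angle of a pixel follows from plane geometry (pixel offset, eye position, viewing distance), and intermediate pixels can be filled in by linear interpolation as suggested above. The function names and the 32.5 mm half-interocular value used in the example are assumptions:

```python
import math

def exit_angle_deg(pixel_x_mm: float, eye_x_mm: float,
                   viewing_distance_mm: float = 250.0) -> float:
    """Angle from the display normal (z axis) at which a pixel at
    horizontal offset pixel_x_mm must emit so that its ray reaches an eye
    at horizontal position eye_x_mm, viewing_distance_mm away (default:
    the 250 mm distance of distinct vision, L1)."""
    return math.degrees(math.atan2(eye_x_mm - pixel_x_mm, viewing_distance_mm))

def interpolate_angle(x0_mm: float, angle0: float,
                      x1_mm: float, angle1: float, x_mm: float) -> float:
    """Linearly interpolate the exit angle for a pixel lying between two
    sampled pixel positions, as suggested for the points between the
    extraction points."""
    t = (x_mm - x0_mm) / (x1_mm - x0_mm)
    return angle0 + t * (angle1 - angle0)
```

for example, a pixel at the display center aiming at an eye 32.5 mm off-axis needs an exit angle of roughly 7.4° at L1 = 250 mm.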
- the image display section 31 has a structure in which pixels are arranged in a matrix, and its outer shape is substantially rectangular. At least one camera is provided on each of the left and right sides, but more cameras may be provided, and cameras of different types may be provided; for example, one of the left and right cameras may be a normal camera, and the other a relatively simplified camera combined with it for aligning the line of sight.
- the positions of the cameras include positions within a predetermined range from the left and right ends of the substantially rectangular image display unit 31 in the horizontal direction and from the upper and lower ends of those left and right edges; for example, in FIG. 7, the cameras can be placed in the areas indicated by the regions 32L and 32R.
- the regions 32L and 32R shown in FIG. 7 each include a strip-shaped region of width H2 extending horizontally beyond the left or right end of the image display unit 31, regions within, for example, a radius r of the corners of the substantially rectangular image display unit 31, and, in the height direction, a region extending above the substantially rectangular image display unit 31 by a distance H1.
- the width H2 is not particularly limited; for example, like the radius r of about 20 mm, the width H2 may also be about 20 mm. The camera positions described above are preferably fixed to the image display unit 31, since each camera is provided with its own optical system (not shown); however, the cameras themselves may be structured so that they can be freely moved in and out of the side of the image display unit 31, and both or one of the pair of cameras may be attachable at the time of imaging. A glass or plastic lens is attached to the front end of each camera, and it can be covered with a cover when not in use to prevent scratches.
- FIG. 8 is an enlarged view of a display pixel, and a region 51 indicated by a substantially square block corresponds to one pixel.
- four light-emitting elements 52R, 52G, 52B, and 52G are arranged in the area 51 of each pixel so as to occupy positions like the four corner pips of a die.
- the light emitting elements 52 R, 52 G, 52 B, 52 G are composed of semiconductor light emitting elements such as light emitting diodes.
- the light-emitting element 52R is an element that emits red light
- the light-emitting element 52G is an element that emits green light
- the light-emitting element 52B is an element that emits blue light.
- since the light-emitting elements 52G that emit green light are more easily resolved by the human eye than elements of the other emission colors, arranging the green light-emitting elements 52G densely makes it possible to give a uniform impression.
- a transmissive display element having color filters, such as a color liquid crystal display, may be used as the element constituting the image display unit, or some kind of reflective display element may be used.
- in order to configure an image display unit capable of the required stereoscopic viewing by emitting light beams to the left and right eyes using pixel regions 51 as described above, the images need only be distributed line by line or pixel by pixel.
- FIGS. 9A to 9E are diagrams showing examples of left and right video distribution patterns for realizing stereoscopic vision.
- “L” is a pixel that generates light according to data for the left eye, and emits a light beam toward the left eye.
- “R” is a pixel that generates light based on data for the right eye, and emits a light beam toward the right eye.
- FIG. 9A shows a pattern in which pixels for the left eye indicated by "L” and pixels for the right eye indicated by "R” are alternately arranged for each horizontal line.
- FIG. 9B shows a pattern in which pixels for the left eye indicated by "L” and pixels for the right eye indicated by “R” are alternately arranged for each vertical line.
- FIG. 9C shows a checkered pattern in which left-eye pixels and right-eye pixels appear alternately, one pixel at a time, along a horizontal line, and in the next horizontal line a similar pattern appears shifted by one pixel in the horizontal direction.
- FIG. 9D shows a case where a checkered pattern is formed for each pixel size of 2 ⁇ 2 pixels, similar to FIG. 9C.
- FIG. 9E shows a pattern in which two left-eye pixels and two right-eye pixels appear alternately every two pixels along one horizontal line, and in the next horizontal line a similar pattern appears shifted horizontally by one pixel.
- the patterns shown in FIGS. 9A to 9E are examples, and other patterns can be formed. The entire surface can be formed with the same pattern, but, for example, different patterns may be arranged at the center of the image display area and near its periphery. Also, instead of assigning the left-eye pixels and right-eye pixels in a regular pattern, the wiring for the left-eye and right-eye pixels may be assigned in an irregular pattern, although the wiring becomes complicated.
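- the regular distribution patterns of FIGS. 9A to 9D can be generated as boolean masks; the following is a minimal sketch (function name assumed, with True marking left-eye pixels):

```python
import numpy as np

def lr_pattern(rows: int, cols: int, mode: str = "h_lines",
               block: int = 1) -> np.ndarray:
    """Boolean mask assigning each pixel to the left eye (True) or the
    right eye (False).

    mode "h_lines": alternate horizontal lines (FIG. 9A),
    mode "v_lines": alternate vertical lines (FIG. 9B),
    mode "checker": checkerboard of block x block cells
                    (FIG. 9C with block=1, FIG. 9D with block=2)."""
    r = np.arange(rows)[:, None] // block  # cell row index, shape (rows, 1)
    c = np.arange(cols)[None, :] // block  # cell column index, shape (1, cols)
    if mode == "h_lines":
        return np.broadcast_to(r % 2 == 0, (rows, cols))
    if mode == "v_lines":
        return np.broadcast_to(c % 2 == 0, (rows, cols))
    if mode == "checker":
        return (r + c) % 2 == 0
    raise ValueError(f"unknown mode: {mode}")
```

an irregular assignment, as mentioned above, would simply replace the mask with a fixed pseudo-random boolean array wired accordingly.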
- next, an example using microlenses, which are minute lenses, will be described.
- FIG. 10 is a diagram showing a state in which four light-emitting elements 63R, 63G, 63B, and 63G, composed of semiconductor light-emitting elements such as light-emitting diodes or semiconductor lasers, are arranged for one pixel 61.
- the light-emitting element 63R is an element that emits red light
- the light emitting element 63 G is an element that emits green light
- the light emitting element 63 B is an element that emits blue light.
- since the green light-emitting elements 63G are more easily resolved by the human eye than the other light-emitting elements, arranging them densely makes it possible to give a uniform impression.
- microlenses 62 made of a spherical transparent body are arranged on the surface side of the image display section, which is the light emission side of such light emitting elements 63G, 63B, 63R.
- the microlens 62 is an emission angle control unit that controls the light from the light-emitting elements 63G, 63B, and 63R so as to emit it in a predetermined angular direction toward the left eye or the right eye, and is formed of a transparent synthetic resin such as PMMA (polymethyl methacrylate) or of glass.
- the shape of the microlens is not limited to a sphere; it may be conical, pyramidal, or rectangular.
- since each microlens 62 has the function of controlling the light emission angle, an opening directed toward one of the user's two eyes is provided for each light-emitting element in a shielding plate or the like for that purpose.
- a method of shifting the position of the microlens 62 from the optical axis of light from the light emitting elements 63G, 63B, 63R is also possible.
- FIGS. 11 and 12 are schematic cross-sectional views for explaining an example in which the emission direction of the light beam is controlled by the position of the microlens 62.
- FIG. 11 shows an example in which the axes of the light-emitting element 63G and the microlens 62 coincide with each other.
- FIG. 12 shows an example in which the axes of the light emitting element 63G and the microlens 62 are misaligned.
- the size of the light-emitting element 63G is about 30 μm, a representative value for a light-emitting element 63G that emits green light.
- the z-axis direction is the normal direction of the display surface of the image display unit, and here, the z-axis direction is the light emission direction.
- the light-emitting element 63G is, for example, a GaN-based light-emitting diode, and a blue light-emitting diode can also be constituted by a GaN-based semiconductor or the like. Further, the light-emitting element 63R that emits red light may be composed of a GaAs-based compound semiconductor or the like.
- each light-emitting element 63G is bonded onto a support substrate 65, and the elements are arranged in a matrix at a pitch of about 300 μm.
- a molded holding member 66 which functions as a holding member for the microlenses 62 and also functions as a shielding plate for restricting light from the light emitting element to an appropriate angle, is provided on the support substrate 65.
- the molded holding member 66 has an opening corresponding to the position of each light-emitting element; from the opening, the diameter widens in a substantially frustoconical shape, and a microlens 62 is fitted into the end opposite the light-emitting element. The molded holding member 66 and the microlenses 62, and the molded holding member 66 and the support substrate 65, are bonded and fixed to each other.
- the microlenses 62 are connected and held by a holding portion 64 that holds the lens at the maximum diameter portion.
- the diameter φLENS of each microlens 62 is set here to about 300 μm. With this configuration, a gap of distance d is left between the bottom of the microlens 62 fitted in the opening of the molded holding member 66 and the light-emitting element 63G, and light passes through this gap and is introduced into the microlens 62.
- FIG. 12 shows an example in which the axes of the light emitting element 63G and the microlens 62 are shifted as described above.
- the light-emitting element 63G is arranged at a position shifted by a distance Δy from a line passing through the center of the microlens 62 and parallel to the z axis. When the light-emitting element 63G is arranged at such a position shifted by the distance Δy, the light emitted from the light-emitting element 63G is bent by the misalignment with the axis of the microlens 62. By setting such a positional relationship, light can be emitted in the direction toward the right eye or the left eye.
- methods of shifting the axes of the light-emitting element 63G and the microlens 62 relative to each other include shifting the position of the light-emitting element 63G on the support substrate 65, shifting the position of the microlens 62 itself, and shifting the position of the microlens 62 by shifting the position of the opening of the molded holding member 66.
- here, the method of shifting the position of the microlens 62 is adopted, and the microlens is fixed so that the center of the opening of the molded holding member 66 and the axis of the microlens 62 do not coincide.
- in FIG. 12 only the displacement in the y direction is shown, but in order to emit light in the directions toward the right and left eyes, the deviation may include not only the y direction but also the z direction and the x direction.
- FIG. 13A simulates the spread of light rays when the axes of the light-emitting element 63G and the microlens 62 coincide.
- FIG. 13B simulates the spread of light rays when the axes of the light-emitting element 63G and the microlens 62 are shifted. That is, FIG. 13A shows the relationship between the light-emitting element and the microlens in the structure of FIG. 11, and FIG. 13B shows that relationship in the structure of FIG. 12.
- in FIG. 13A, the rays spread centered on the z axis, which is the normal direction of the display surface of the image display section.
- when the axes of the light-emitting element 63G and the microlens 62 are misaligned, an angle is given to the light emission direction as shown in FIG. 13B, and the rays are emitted slightly obliquely upward toward one side.
- in this simulation, the material of the microlens is PMMA, its diameter is 300 μm, the diameter of the light-emitting element is 30 μm, and the distance between the light-emitting element and the microlens is set to 50 μm.
- since the refractive index of the lens material changes according to the wavelength, the following table shows the relationship, and the calculation is performed using the data shown in the table.
- optical calculations as shown in FIGS. 14A to 14C are performed.
- the graph shown in FIG. 14A shows the result of optically calculating the relationship between the diameter of the light-emitting element and the spread angle of the light beam.
- the conditions were the same as in the case shown in FIG. 13A: the material of the microlens was PMMA, its diameter 300 μm, and the distance between the light-emitting element and the microlens 50 μm.
- only the size of the light-emitting element is changed.
- from the results, the diameter of the light-emitting element is preferably about 30 μm. Since the comparison between the size of the microlens and that of the light-emitting element is relative, the size ratio between the microlens and the light-emitting element is preferably set, for example, from 30:1 to 5:1.
- the graph shown in Fig. 14B shows the result of calculating the effect of the distance between the light emitting element and the microlens on the spread angle of the light beam.
- the conditions are otherwise the same as in the case of FIG. 14A: the material of the microlens is PMMA, its diameter 300 μm, and the diameter of the light-emitting element 30 μm, while the distance d between the light-emitting element and the microlens is varied. From these results it can be seen that, to set the spread angle of the light beam to 10°, the distance between the light-emitting element and the microlens is preferably about 50 μm.
- the upper and lower angles mean the range of θ of the light-spread region shown in the figure, and the center is a plot of the center of those angles. From the explanation of FIG. 5, the exit angle must change from 0.43° to 19.57° depending on the position on the display surface in order to distribute images to the left and right eyes. From the graph shown in FIG. 14C, it can be seen that to satisfy this condition the distance Δy should be changed linearly from approximately 0 to 35 μm. Such a linear relationship can be given approximately by the following equation (4).
- Δy: relative position of the light-emitting element in the horizontal direction with respect to the lens
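- equation (4) itself is not reproduced in this excerpt, but the text states that Δy varies approximately linearly from 0 to 35 μm as the required exit angle goes from 0.43° to 19.57°. A sketch of that linear map (function name assumed):

```python
def lens_offset_um(angle_deg: float,
                   angle_min: float = 0.43, angle_max: float = 19.57,
                   dy_max_um: float = 35.0) -> float:
    """Approximate lens-to-element horizontal offset dy (micrometres)
    needed to obtain a given exit angle, assuming the linear relation
    described in the text: 0 um at 0.43 deg up to 35 um at 19.57 deg."""
    t = (angle_deg - angle_min) / (angle_max - angle_min)
    return t * dy_max_um
```

each pixel's required exit angle (from equations (1) to (3)) would be fed into this map to obtain its lens offset.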
- the advantages of using a microlens include, as described above, high light-use efficiency and low power consumption, as well as reduced reflection of external light entering obliquely, giving high contrast and high image quality. In addition, the lens effect increases the apparent size of the pixels, which reduces the apparent pixel spacing; for example, it prevents the pixels from being perceived as visually separated, so that a continuous image is obtained with a relatively small number of pixels.
- next, an example of a microlens will be described with reference to FIGS. 15A and 15B.
- FIG. 15A is a front view of the microlens
- FIG. 15B is a cross-sectional view of the microlens.
- the microlenses shown in FIGS. 15A and 15B have a structure in which individual microlenses formed of substantially transparent spheres are held at their maximum-diameter portions by a substantially flat holding member and are arranged densely. The diameter of each microlens is, for example, about 300 μm. An image display section can be formed by bonding such individual microlenses, while held on the substantially flat holding member, to an array of light-emitting elements; since this makes positioning of the individual microlenses unnecessary, the manufacturing cost of the mobile communication terminal can be reduced. Next, as an example of image processing applied to the present embodiment, a method of keeping the eye position stable will be described.
- the cameras 13L and 13R that capture the face are installed on both sides of the image display section 11, and the minor monitor 14 displays the user's own face for confirmation, so the user can adjust to some extent so that the face stays within the imaging range of the cameras. However, when the terminal is hand-held, the positional relationship between the displayed image of the interlocutor's face and the cameras usually fluctuates greatly. The shift in the direction of the line of sight does not become extreme as long as the interlocutor's face is largely contained on the display surface; nevertheless, to match the lines of sight more closely and reduce screen blur, it is preferable to provide a function for stabilizing the eye position.
- the method of stabilizing the eye position is to provide a margin in the area imaged by the cameras 13L and 13R and to image an area one size larger than the face. On the display side, the position of the interlocutor's eyes is then adjusted by image processing so that the eyes lie on the line connecting the cameras 13L and 13R and their center approaches the midpoint between the cameras, before being displayed.
- as a method of detecting the position of the eyes from the face image, a widely known image recognition method can be used.
- here, a method based on correlation detection will be described.
- FIG. 16 is a conceptual diagram illustrating the process of finding the region of an image that most closely matches a template. This correlation detection uses the correlation value represented by the following equation (5), in which the correlation coefficient c_ij is the covariance of the image and the template divided by the product of their standard deviations.
- the coordinate value (i, j) at which the correlation coefficient c_ij becomes maximum gives the matching position.
- g is a template image; images of standard eyes, a nose, eyebrows, a mouth, etc. are registered in memory in advance.
- f is the target display image.
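- a minimal sketch of the correlation detection of equation (5): the normalized cross-correlation of the template g over the display image f, whose argmax gives the matching position. This is a plain reference implementation for illustration, not optimized code:

```python
import numpy as np

def correlation_map(f: np.ndarray, g: np.ndarray) -> np.ndarray:
    """Normalized cross-correlation c[i, j] of template g slid over
    image f; the (i, j) maximizing c gives the matching position."""
    th, tw = g.shape
    gz = g - g.mean()                      # zero-mean template
    g_norm = np.sqrt((gz ** 2).sum())
    out = np.full((f.shape[0] - th + 1, f.shape[1] - tw + 1), -1.0)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            w = f[i:i + th, j:j + tw]      # image window under the template
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * g_norm
            if denom > 0:                  # skip flat (zero-variance) windows
                out[i, j] = (wz * gz).sum() / denom
    return out
```

in practice a library routine such as OpenCV's template matching would be used instead of the explicit double loop.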
- FIG. 17 illustrates parameters related to parallax, and the calculation is performed using the following equation (6).
- Absolute parallax convergence angle: y f , y f
- a template of a face part of appropriate size can be used. The input screen is searched using the template, and the position of the eyes in the face is found as the position with the largest correlation value. The screen is then translated, rotated, enlarged, or reduced so that the eye positions overlap as much as possible in the left and right images and their center approaches the position on the display screen corresponding to the center between the cameras. In doing so, appropriate display is possible by converting the image with a matrix of the kind shown in the above equation (7). Next, another example of the structure of the image display unit will be described with reference to FIGS. 18A and 18B.
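- the translate/rotate/scale adjustment referred to via equation (7) can be expressed as a single homogeneous similarity matrix; since the exact form of equation (7) is not reproduced in this excerpt, the following is a standard stand-in (function names assumed):

```python
import numpy as np

def similarity_matrix(scale: float, angle_rad: float,
                      tx: float, ty: float) -> np.ndarray:
    """3x3 homogeneous matrix combining scaling, rotation, and
    translation, used to move the detected eye positions toward the
    point on the display corresponding to the center between the cameras."""
    c = scale * np.cos(angle_rad)
    s = scale * np.sin(angle_rad)
    return np.array([[c, -s, tx],
                     [s,  c, ty],
                     [0.0, 0.0, 1.0]])

def apply_to_point(m: np.ndarray, x: float, y: float):
    """Apply the homogeneous transform m to the point (x, y)."""
    v = m @ np.array([x, y, 1.0])
    return float(v[0]), float(v[1])
```

the matrix would be estimated from the detected eye coordinates in the left and right images and then applied to every pixel (or via a warping routine) before display.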
- FIG. 18A and FIG. 18B show an example of an image display unit using cone-type lenses.
- FIG. 18A is a cross-sectional view of the image display unit.
- FIG. 18B is a perspective view of a lens.
- light-emitting elements 74 and 75 such as light-emitting diodes are arranged on a support substrate 77.
- the light-emitting element 74 is arranged at a substantially central portion of a gap 76 formed by the molded holding member 73, so that the direction of its emitted light beam is substantially the z direction. The light-emitting element 75 is arranged offset in the y direction from the substantially central portion of the gap 76, so that the molded holding member 73 shifts the direction of its light beam in the y direction.
- the molded holding member 73 is a member formed by molding a required synthetic resin. It functions as a holding member for the cone-shaped microlenses 71 and 72, and also functions as a shield plate that restricts the light from the light-emitting elements to an appropriate angle.
- the microlenses 71 and 72 have cone shapes. Their large-diameter bottom ends face the light-emitting elements 74 and 75 via the gap 76, and their narrowed small-diameter end faces 78 and 79 are arranged on the display surface side.
- since the axial direction of the microlenses 71 and 72 coincides with the light-emission direction, the desired emission angle can be obtained by inclining the microlenses 71 and 72 in the direction in which the light rays should be emitted. In this manner, the light from the light-emitting elements 74 and 75 can be collected and emitted from the end faces 78 and 79.
- FIG. 19 shows the sphere area S subtended by the angle θ from the origin and illustrates the related parameters.
- r is the radius of the virtual sphere, and h gives the diameter of the circle where the rays at angle θ intersect the virtual sphere.
- S is the sphere area cut off by the cone corresponding to the angle θ, and S_ratio is the ratio of the sphere area S to the total area of the sphere.
- FIG. 20 shows the angle θ calculated from the equations shown on the left side of FIG. 19 and the corresponding sphere area ratio S_ratio. In particular, when the angle θ is 10° (0.174 rad), the sphere area ratio S_ratio is an extremely small value of 0.001920265.
- this means that if the angle of the emitted light beam is narrowed down to within 10°, an amount of light of about 1/263 is sufficient. It suggests that a good image can be obtained by controlling the angle of the emitted light beam, without increasing the amount of light, and it also means that the contrast can be increased when the light-emitting elements are driven with the same power.
- as a result, a clear, gaze-matched interactive screen can be displayed.
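These figures can be checked against the standard spherical-cap area formula S = 2πr²(1 − cos θ). Assuming the 10° beam corresponds to a half-angle of 5° measured from the lens axis, the cap area relative to the forward hemisphere is 1 − cos 5° ≈ 1/263, which matches the light-amount figure in the text (the half-angle reading of the FIG. 19 geometry is our assumption):

```python
import math

def cap_to_hemisphere_ratio(half_angle_deg):
    """Area of the spherical cap of the given half-angle, as a fraction of the
    forward hemisphere (cap: 2*pi*r^2*(1 - cos t); hemisphere: 2*pi*r^2)."""
    t = math.radians(half_angle_deg)
    return 1.0 - math.cos(t)

# a 10-degree emission cone (5-degree half-angle) uses ~1/263 of the hemisphere
print(round(1.0 / cap_to_hemisphere_ratio(5.0)))  # → 263
```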
- FIG. 21 is a cross-sectional view illustrating still another example of the structure of the image display unit.
- light emitting elements 83 and 84 such as light emitting diodes are mounted on a substrate 82, and these light emitting elements 83 and 84 are arranged in a matrix.
- a transparent substrate 81 on which micro-diffraction plates 85L and 85R are formed is attached to the front side of the substrate 82.
- the micro-diffraction plates 85L and 85R have the function of bending, by the diffraction phenomenon, the light beams emitted from the light-emitting elements 83 and 84, respectively, and sending them toward the user.
- the micro-diffraction plate 85L controls the exit angle of the light beam so that it reaches the left eye of the user, and the micro-diffraction plate 85R controls the exit angle of the light beam so that it reaches the right eye of the user.
- the two images are then synthesized in the user's cerebrum, and as a result, as shown in FIG. 2C, a screen in which the eyes meet is displayed, enabling a natural call with matched gaze.
- in the image display unit, the plurality of pixels performing display based on the left-eye signal and the plurality of pixels performing display based on the right-eye signal may be mixed spatially, as described above. Alternatively, they may be mixed temporally, by switching between display based on the left-eye signal and display based on the right-eye signal in a time-division manner.
- FIG. 22A and FIG. 22B are diagrams showing the setup of an experiment for examining the impression of gaze direction and gaze coincidence.
- FIG. 22A is a diagram for the case where images are captured by the left and right cameras.
- FIG. 22B is a diagram for the case where a camera is assumed to be at the center of the virtual screen.
- in FIG. 22A, the cameras are on both sides of the virtual screen, corresponding to the structure of the mobile communication terminal described as the present embodiment.
- the distance between the pair of cameras is 65 mm, the same as the distance between both eyes.
- seven gazing points are set in the vertical direction and seven in the horizontal direction.
- the results of examining the impression at that time are shown in FIGS. 23A and 23B.
- the distance between the gazing points is 12.5 mm, and this distance is equivalent to an angle of 2.56 °.
- the subject looked at the point of gaze at a position 280 mm from the virtual screen.
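The stated angular spacing follows directly from the experimental geometry: gazing points 12.5 mm apart viewed from 280 mm subtend arctan(12.5/280) ≈ 2.56°.

```python
import math

# gazing points are 12.5 mm apart and are viewed from 280 mm away
angle_deg = math.degrees(math.atan(12.5 / 280.0))
print(round(angle_deg, 2))  # → 2.56
```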
- the mobile communication terminal described as the present embodiment has a camera on each of the left and right sides of the image display unit, so that it is possible to interact with the other party while the lines of sight coincide, and thus to hold a dialogue with a sense of presence.
- higher light use efficiency enables lower power consumption, and high contrast images can be viewed even in bright outdoor environments.
- a mobile communication terminal has a structure in which imaging devices are arranged on both left and right sides of a display screen, so that it can be downsized and is extremely useful as a portable device.
- cameras are provided on each of the left and right sides of the image display unit, and three-dimensional display for matching the line of sight with the other party is performed based on the two images captured by the cameras.
- in this case, the parallax is as small as if the images had been obtained by two virtual cameras VR and VL provided at a smaller interval than the two cameras RR and RL, so the parallax can be set to an appropriate size, realizing an extremely easy-to-see, natural image display.
- strictly, the disparity is defined as the relative disparity, i.e., the difference in convergence angle, as shown in the above equation (6); here, however, for simplicity, the displacement between corresponding points of the two images captured by the two cameras provided on the left and right sides of the image display unit is treated as a number of pixels.
- an image recognition method based on correlation detection can be used to find corresponding points between the two images. That is, in the mobile communication terminal, as shown in FIG. 25, a predetermined area, such as the contour of the face against the background or the positions of the eyes and nose, is taken from the image captured by the camera provided on the left side of the image display unit.
- specifically, the control circuit 21 extracts a 2u × 2v area centered on the coordinate value (i, j) from the image L as a template image g. It then varies p and q and searches the image R for the 2u × 2v region centered on (i + p, j + q), finding the region where the correlation coefficient γ_pq is maximum.
- since the search range of the target image f corresponding to the template image g can be limited in advance according to the position of the template image g in the image L, the processing can be performed efficiently. Note that this correlation detection uses the correlation-value calculation formula expressed by the following equation (8):

γ_pq = Σ_x Σ_y (f(i+p+x, j+q+y) − f̄)(g(i+x, j+y) − ḡ) / √( Σ_x Σ_y (f(i+p+x, j+q+y) − f̄)² · Σ_x Σ_y (g(i+x, j+y) − ḡ)² ) … (8)

- here ḡ indicates the average value of the template image g, and f̄ indicates the average value of the target image f over the compared region.
- the (p, q) at which γ_pq is maximum indicates the displacement, in number of pixels, of the target image f with respect to the template image g, and corresponds to the parallax. Therefore, in the mobile communication terminal, an image of arbitrary parallax can be generated from one of the images R and L by adjusting the amount of this shift.
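The search for the (p, q) maximizing γ_pq can be sketched as follows (an illustrative implementation with names of our choosing; the fixed square search range is a simplification):

```python
import math

def mean(block):
    return sum(map(sum, block)) / (len(block) * len(block[0]))

def gamma(f_block, g_block):
    """Normalized correlation coefficient between two equally sized blocks."""
    fm, gm = mean(f_block), mean(g_block)
    num = df = dg = 0.0
    for fr, gr in zip(f_block, g_block):
        for f, g in zip(fr, gr):
            num += (f - fm) * (g - gm)
            df += (f - fm) ** 2
            dg += (g - gm) ** 2
    return num / math.sqrt(df * dg) if df and dg else 0.0

def disparity(img_l, img_r, i, j, u, v, search):
    """Displacement (p, q) of image R relative to the 2u x 2v template taken
    from image L at center (i, j), found by maximizing gamma_pq."""
    g_block = [row[j - v:j + v] for row in img_l[i - u:i + u]]
    best, best_g = (0, 0), -2.0
    for p in range(-search, search + 1):
        for q in range(-search, search + 1):
            f_block = [row[j + q - v:j + q + v]
                       for row in img_r[i + p - u:i + p + u]]
            g_val = gamma(f_block, g_block)
            if g_val > best_g:
                best, best_g = (p, q), g_val
    return best
```

As noted in the text, limiting `search` according to the template's position in image L keeps the processing efficient.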
- for example, if a new image is generated by moving the image L by half the displacement amount indicating the parallax, i.e., by (p/2, q/2) pixels, an image equivalent to one captured from the midpoint of the two cameras that captured the images R and L, that is, an image of the subject viewed from the front, can be generated, as shown in FIG. 26.
- in the mobile communication terminal, if there is no pixel to be drawn at a position after the pixels are moved, the position is filled with a pixel obtained by interpolating from neighboring pixels, such as those to the left and right, or those to the left, right, above, and below; pixel deficiency can thus be avoided. Also, when there is a large parallax between the images R and L, a hidden portion, so-called occlusion, that appears in only one image and not in the other may prevent an appropriate correspondence point from being found. However, such occlusion also occurs when people view the natural world, and at a comparable degree it causes hardly any discomfort.
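The half-shift synthesis and hole interpolation described above can be sketched as follows, assuming a purely horizontal per-pixel disparity map (a simplification of the patent's block-based shift):

```python
def synthesize_center(img_l, disp):
    """Move each pixel of L right by half its (horizontal) disparity, then fill
    holes by averaging the nearest drawn neighbors on the same row."""
    h, w = len(img_l), len(img_l[0])
    out = [[None] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            nc = c + disp[r][c] // 2
            if 0 <= nc < w:
                out[r][nc] = img_l[r][c]
    for r in range(h):
        for c in range(w):
            if out[r][c] is None:  # hole left by the shift
                left = next((out[r][k] for k in range(c - 1, -1, -1)
                             if out[r][k] is not None), None)
                right = next((out[r][k] for k in range(c + 1, w)
                              if out[r][k] is not None), None)
                vals = [v for v in (left, right) if v is not None]
                out[r][c] = sum(vals) / len(vals) if vals else 0
    return out
```

A full implementation would also resolve the occlusion case, where pixels from several source positions compete for one destination.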
- the process of calculating the shift amount, in number of pixels, of the target image f corresponding to the template image g is preferably performed over the entire image range, and two new images are generated by moving the two images R and L by predetermined numbers of pixels based on the obtained shift amounts.
- in this way, the mobile communication terminal can generate two new images with small parallax, as if they had been obtained by two virtual cameras provided at a smaller interval than the cameras that captured the images R and L.
- in the mobile communication terminal, by displaying these two newly generated images on the image display unit, the user sees two images with a reduced effective camera interval and reduced parallax, and perceives them as a very easy-to-view, high-quality stereoscopic image.
- in other words, the parallax obtained from the two images R and L is reduced and set to an appropriate arbitrary ratio, so that the displacement between the images of the subject seen by the left and right eyes is reduced, providing a more easily viewable three-dimensional display.
- furthermore, parallax interpolation technology can be used to increase the number of viewpoints. That is, in the mobile communication terminal, normally only two viewpoint images can be obtained from the two cameras, but based on these two images, interpolated images can be generated in which, for example, the parallax becomes 1/2, yielding images of four viewpoints. Also, by interpolating so that the parallax is reduced by predetermined amounts, a multi-viewpoint image can be generated, and by appropriately displaying the obtained images stereoscopically using a lenticular lens or the like, the so-called flicking phenomenon, in which the image changes abruptly with a change in viewpoint position, can be reduced, enabling higher-quality stereoscopic display.
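The multi-viewpoint idea can be sketched by scaling a per-pixel disparity by successive fractions, one virtual viewpoint per fraction (an illustration assuming a purely horizontal per-pixel disparity map, with names of our choosing):

```python
def interpolate_views(img_l, disp, fractions):
    """Generate one view per fraction t: each pixel of L is shifted by t times
    its disparity (rounded), simulating virtual cameras spaced between the two
    real ones. Positions left empty stay at 0 in this simplified sketch."""
    h, w = len(img_l), len(img_l[0])
    views = []
    for t in fractions:
        view = [[0] * w for _ in range(h)]
        for r in range(h):
            for c in range(w):
                nc = c + int(round(t * disp[r][c]))
                if 0 <= nc < w:
                    view[r][nc] = img_l[r][c]
        views.append(view)
    return views
```

For example, fractions (0, 1/2, 1) give the original view, an intermediate view with half the parallax, and a view at the full shift.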
- to find corresponding points, any method other than correlation detection can also be applied. For example, the Sum of Absolute Differences (SAD) method, which uses the difference between the brightness values of the two images, can be used, and the so-called Sum of Squared Differences (SSD) method can also be used. It goes without saying, however, that the above-described correlation detection method, because it normalizes the two images, gives the most accurate result.
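The SAD and SSD costs mentioned here can be sketched as follows; they are minimized over (p, q) rather than maximized like the correlation coefficient, and, unlike the normalized correlation, they are sensitive to a brightness offset between the two images:

```python
def sad(a, b):
    """Sum of absolute differences between two equally sized blocks (lower is better)."""
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def ssd(a, b):
    """Sum of squared differences between two equally sized blocks (lower is better)."""
    return sum((x - y) ** 2 for ra, rb in zip(a, b) for x, y in zip(ra, rb))
```

This sensitivity to brightness differences between the left and right cameras is why the text calls the normalized correlation the most accurate of the three.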
- the process of interpolating parallax based on the two images obtained by the cameras provided on the left and right sides of the image display unit may be performed by the mobile communication terminal on the transmitting side, or by the one on the receiving side.
- as described above, the mobile communication terminal described as this embodiment interpolates parallax based on the two images obtained by the cameras provided on the left and right sides of the image display unit, thereby optimizing the stereoscopic display that matches the line of sight with the other party and making the image natural and easy to see. It therefore avoids both a situation in which the parallax becomes so large that the image cannot be fused and appears double, and a situation in which the fused image is hard to see and fatigues the user, providing extremely excellent convenience.
- note that the technique of interpolating parallax based on two images obtained by cameras provided on the left and right sides of the image display unit is not limited to mobile communication terminals, and can be applied to any information processing apparatus that performs such display.

Industrial applicability
- according to the present invention, it is possible to hold a conversation on a mobile communication terminal while matching the line of sight with the communication partner, achieving a realistic conversation.
- the higher light use efficiency enables lower power consumption, and high contrast images can be viewed even in bright outdoor environments.
- the structure in which imaging devices are arranged on both sides of the display screen enables miniaturization, which is extremely useful for a portable information processing device.
- according to the present invention, by generating new images in which the parallax is interpolated based on the two images obtained by the imaging means, the stereoscopic display for matching the line of sight with the communication partner can be optimized and the image can be made more natural and easy to see. This avoids both a situation in which the parallax becomes so large that the image cannot be fused and appears double, and a situation in which the fused image is hard to see and fatigues the user, providing extremely excellent convenience.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/515,288 US7528879B2 (en) | 2002-05-21 | 2003-05-16 | Information processing apparatus, information processing system, and conversation partner display method while presenting a picture |
EP03725814A EP1507419A4 (en) | 2002-05-21 | 2003-05-16 | Information processing apparatus, information processing system, and dialogist displaying method |
KR1020047018744A KR100976002B1 (en) | 2002-05-21 | 2003-05-16 | Information processing apparatus, information processing system, and dialogist displaying method |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2002-146181 | 2002-05-21 | ||
JP2002146181 | 2002-05-21 | ||
JP2002358567A JP2004048644A (en) | 2002-05-21 | 2002-12-10 | Information processor, information processing system and interlocutor display method |
JP2002-358567 | 2002-12-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2003098942A1 true WO2003098942A1 (en) | 2003-11-27 |
Family
ID=29552322
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2003/006155 WO2003098942A1 (en) | 2002-05-21 | 2003-05-16 | Information processing apparatus, information processing system, and dialogist displaying method |
Country Status (7)
Country | Link |
---|---|
US (1) | US7528879B2 (en) |
EP (1) | EP1507419A4 (en) |
JP (1) | JP2004048644A (en) |
KR (1) | KR100976002B1 (en) |
CN (1) | CN1672431A (en) |
TW (1) | TWI223561B (en) |
WO (1) | WO2003098942A1 (en) |
CN114911445A (en) * | 2022-05-16 | 2022-08-16 | 歌尔股份有限公司 | Display control method of virtual reality device, and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09121370A (en) * | 1995-08-24 | 1997-05-06 | Matsushita Electric Ind Co Ltd | Stereoscopic television device |
JPH1075432A (en) * | 1996-08-29 | 1998-03-17 | Sanyo Electric Co Ltd | Stereoscopic video telephone set |
JPH10108152A (en) * | 1996-09-27 | 1998-04-24 | Sanyo Electric Co Ltd | Portable information terminal |
JPH10221644A (en) * | 1997-02-05 | 1998-08-21 | Canon Inc | Stereoscopic picture display device |
JPH1175173A (en) * | 1995-02-16 | 1999-03-16 | Sumitomo Electric Ind Ltd | Bidirectional interactive system having mechanism to secure coincidence of lines of sight between dialogists via transmission means |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5359362A (en) | 1993-03-30 | 1994-10-25 | Nec Usa, Inc. | Videoconference system using a virtual camera image |
JPH0750855A (en) * | 1993-08-06 | 1995-02-21 | Sharp Corp | Picture transmitter |
US6259470B1 (en) * | 1997-12-18 | 2001-07-10 | Intel Corporation | Image capture system having virtual camera |
- 2002
  - 2002-12-10 JP JP2002358567A patent/JP2004048644A/en active Pending
- 2003
  - 2003-05-06 TW TW092112349A patent/TWI223561B/en not_active IP Right Cessation
  - 2003-05-16 WO PCT/JP2003/006155 patent/WO2003098942A1/en active Application Filing
  - 2003-05-16 KR KR1020047018744A patent/KR100976002B1/en not_active IP Right Cessation
  - 2003-05-16 CN CNA038174871A patent/CN1672431A/en active Pending
  - 2003-05-16 US US10/515,288 patent/US7528879B2/en not_active Expired - Fee Related
  - 2003-05-16 EP EP03725814A patent/EP1507419A4/en not_active Withdrawn
Non-Patent Citations (1)
Title |
---|
See also references of EP1507419A4 * |
Also Published As
Publication number | Publication date |
---|---|
JP2004048644A (en) | 2004-02-12 |
US7528879B2 (en) | 2009-05-05 |
KR20040106575A (en) | 2004-12-17 |
CN1672431A (en) | 2005-09-21 |
KR100976002B1 (en) | 2010-08-17 |
US20050175257A1 (en) | 2005-08-11 |
TW200307460A (en) | 2003-12-01 |
TWI223561B (en) | 2004-11-01 |
EP1507419A1 (en) | 2005-02-16 |
EP1507419A4 (en) | 2005-11-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2003098942A1 (en) | Information processing apparatus, information processing system, and dialogist displaying method | |
US10750210B2 (en) | Three-dimensional telepresence system | |
US7224382B2 (en) | Immersive imaging system | |
US20070182812A1 (en) | Panoramic image-based virtual reality/telepresence audio-visual system and method | |
WO2011108139A1 (en) | Teleconference system | |
TWI692976B (en) | Video communication device and method for connecting video communication to other device | |
TW201943259A (en) | Window system based on video communication | |
US10645340B2 (en) | Video communication device and method for video communication | |
TWI710247B (en) | Video communication device and method for connecting video communication to other device | |
JP6157077B2 (en) | Display device with camera | |
JP2008228170A (en) | Image display device and television telephone device | |
JPH1075432A (en) | Stereoscopic video telephone set | |
JP5963637B2 (en) | Display device with imaging device | |
Tsuchiya et al. | An optical design for avatar-user co-axial viewpoint telepresence | |
JP2002027419A (en) | Image terminal device and communication system using the same | |
JP2001147401A (en) | Stereoscopic image pickup device | |
TWI700933B (en) | Video communication device and method for connecting video communication to other device | |
JP3139100B2 (en) | Multipoint image communication terminal device and multipoint interactive system | |
JPH06225298A (en) | Terminal device for visual communication | |
JP2010283550A (en) | Communication system, and communication device | |
US20240004215A1 (en) | Full lightfield with monocular and stereoscopic depth control via monocular-to-binocular hybridization | |
JPH06253303A (en) | Photographic display device | |
KR20090008750A (en) | 3 dimension image-based terminal and stereo scope applied to the same | |
US20210051310A1 (en) | The 3d wieving and recording method for smartphones | |
JPH0530503A (en) | Image display/pickup device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1
Designated state(s): CN KR US
|
AL | Designated countries for regional patents |
Kind code of ref document: A1
Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 1020047018744
Country of ref document: KR
Ref document number: 10515288
Country of ref document: US
|
WWE | Wipo information: entry into national phase |
Ref document number: 2003725814
Country of ref document: EP
|
WWP | Wipo information: published in national office |
Ref document number: 1020047018744
Country of ref document: KR
|
WWE | Wipo information: entry into national phase |
Ref document number: 20038174871
Country of ref document: CN
|
WWP | Wipo information: published in national office |
Ref document number: 2003725814
Country of ref document: EP