WO2021237430A1 - Method of controlling image creating display device, and image creating display device - Google Patents

Method of controlling image creating display device, and image creating display device

Info

Publication number
WO2021237430A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
user
display device
eye position
eye
Prior art date
Application number
PCT/CN2020/092194
Other languages
English (en)
French (fr)
Inventor
Hirotake Cho
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd. filed Critical Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to PCT/CN2020/092194 priority Critical patent/WO2021237430A1/en
Priority to CN202080098614.1A priority patent/CN115280369A/zh
Publication of WO2021237430A1 publication Critical patent/WO2021237430A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • G06T15/20Perspective computation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2215/00Indexing scheme for image rendering
    • G06T2215/16Using real world measurements to influence rendering

Definitions

  • the present disclosure relates to a method of controlling an image creating display device and the image creating display device.
  • image creating display devices which have both a camera module and a display module.
  • Typical examples of such image creating display devices are portable electrical devices like smartphones, mobile phones, laptop computers, desktop computers, tablet computers and so on.
  • Projection systems are also a kind of image creating display devices.
  • AR Augmented Reality
  • an image displayed on a display screen is not coordinated with a user perspective, which is located on the opposite side of the portable electrical device from its background, for example. Therefore, a border between the image displayed on the display screen and the background around the display module looks discontinuous. That is, the image displayed on the display screen and the background of the portable electrical device do not form a seamless image. Moreover, even if the user moves his/her face, the image displayed on the display screen is not adjusted, and thus the unnaturalness of the image on the display screen worsens for the user. As a result, a sense of immersion of the user is impaired.
  • the other image creating display devices also have the same problems.
  • the present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure provides a method of controlling an image creating display device and an image creating display device implementing such a method.
  • a method of controlling an image creating display device may include:
  • the user perspective image is an image which matches a view through a projection area on a projection plane from the eye position of the user with a background around the projection area from the eye position of the user;
  • the capturing the first image and the second image may include capturing the first image by a first camera module of the camera module and the second image by a second camera module of the camera module.
  • the calculating the eye position of the user may include deciding which of the eyes of the user should be the eye position of the user by the processing circuitry.
  • the calculating the eye position of the user may include calculating an eye depth by the processing circuitry, wherein the eye depth is a depth of the eye position of the user from the second camera module.
  • the calculating the eye depth may include:
  • the calculating the eye depth may include estimating the eye depth based on the eyes and/or the face of the user in the second image by the processing circuitry.
  • the first image may have a first coordinate system and the second image may have a second coordinate system
  • the calculating the eye position of the user may include calculating a coordinate of the eye position of the user in the second coordinate system by the processing circuitry.
  • the creating the user perspective image may include transforming the coordinate of the eye position of the user in the second coordinate system into a coordinate of the eye position of the user in the first coordinate system.
  • the creating the user perspective image may include creating the user perspective image based on the coordinate of the eye position of the user in the first coordinate system and the first image.
  • the method may further include adding additional information to the user perspective image by overlapping the additional information on the user perspective image.
  • the capturing the first image and the second image may include capturing the first image and the second image by a spherical camera of the camera module.
  • the image creating display device may be a portable electrical device
  • the display module may be a display installed in the portable electrical device
  • the projection area on the projection plane may be defined by a display screen of the display.
  • the image creating display device may be a projection system
  • the display module is a projector to project the user perspective image on a projection object
  • the projection area on the projection plane is defined by the projection object.
  • the capturing the first image and the second image may include capturing the first image by a first camera module of the camera module and the second image by a second camera module of the camera module.
  • the calculating the eye position of the user may include calculating an eye depth by the processing circuitry, wherein the eye depth is a depth of the eye position of the user from the second camera module.
  • the calculating the eye depth may include:
  • a method of controlling an image creating display device may include:
  • the user perspective image is an image which matches a view through a projection area on a projection plane from the eye position of the user with a background around the projection area from the eye position of the user;
  • an image creating display device may include:
  • a camera module configured to capture a first image in a first side of the image creating display device and a second image in a second side of the image creating display device, wherein the first side is opposite to the second side and a user of the image creating display device is located in the second side;
  • processing circuitry configured to:
  • the user perspective image is an image which matches a view through a projection area on a projection plane from the eye position of the user with a background around the projection area from the eye position of the user;
  • a display module configured to display the user perspective image in the projection area on the projection plane.
  • an image creating display device may include:
  • a camera module configured to capture a first image in a first side of the image creating display device and a second image in a second side of the image creating display device, wherein the first side is opposite to the second side and a user of the image creating display device is located in the second side;
  • processing circuitry configured to:
  • the user perspective image is an image which matches a view through a projection area on a projection plane from the eye position of the user with a background around the projection area from the eye position of the user;
  • a display module configured to display the additional information in the projection area on the projection plane.
  • FIG. 1 illustrates a plan view of a first side of a portable electrical device according to an embodiment of the present disclosure
  • FIG. 2 illustrates a plan view of a second side of the portable electrical device according to the present embodiment of the present disclosure
  • FIG. 3 illustrates a block diagram of the portable electrical device according to the present embodiment of the present disclosure
  • FIG. 4 illustrates a relative position of the portable electrical device, a user of the portable electrical device, and objects to be captured by the portable electrical device in the present embodiment of the present disclosure
  • FIG. 5 illustrates the portable electrical device and the objects and the display of the display module which displays an image in a conventional manner
  • FIG. 6 illustrates the portable electrical device according to the present embodiment and the objects and the display of the display module which displays the user perspective image
  • FIG. 7 illustrates one example of the portable electrical device according to the present embodiment (the first side);
  • FIG. 8 illustrates one example of the portable electrical device according to the present embodiment (the second side);
  • FIG. 9 illustrates another example of the portable electrical device according to the present embodiment (the first side);
  • FIG. 10 illustrates another example of the portable electrical device according to the present embodiment (the second side);
  • FIG. 11 illustrates still another example of the portable electrical device according to the present embodiment (the first side);
  • FIG. 12 illustrates still another example of the portable electrical device according to the present embodiment (the second side);
  • FIG. 13 illustrates still yet another example of the portable electrical device according to the present embodiment (the first side);
  • FIG. 14 illustrates still yet another example of the portable electrical device according to the present embodiment (the second side);
  • FIG. 15 illustrates a window frame and a background of the window frame
  • FIG. 16 illustrates the projection system which is installed on the window frame
  • FIG. 17 illustrates a first side of a projection system according to the present embodiment
  • FIG. 18 illustrates a second side of the projection system according to the present embodiment
  • FIG. 19 illustrates another example of the projection system according to the present embodiment (the second side);
  • FIG. 20 and FIG. 21 illustrate a flowchart of a user perspective image creating process according to the present embodiment
  • FIG. 22 and FIG. 23 illustrate an example to estimate an eye depth based on the second image
  • FIG. 24 illustrates a relative position of a sub camera and an eye position of the eyes of the user according to the present embodiment
  • FIG. 25 illustrates a relative position of a first main camera and the eye position of the eyes of the user according to the present embodiment
  • FIG. 26 illustrates an explanatory drawing of creating the user perspective image according to the present embodiment.
  • FIG. 27 illustrates an example of the user perspective image displayed on the display of the display module of the portable electrical device according to the present embodiment.
  • FIG. 1 illustrates a plan view of a first side of a portable electrical device 10 according to an embodiment of the present disclosure
  • FIG. 2 illustrates a plan view of a second side of the portable electrical device 10 according to the embodiment of the present disclosure.
  • the second side may be referred to as a user side of the portable electrical device 10
  • the first side may be referred to as an opposite side of the portable electrical device 10 and the opposite side is opposite to the user side of the portable electrical device 10.
  • FIG. 1 shows a smartphone as one example of the portable electrical device and the portable electrical device is one example of an image creating display device.
  • the portable electrical device 10 can be a smartphone, a mobile phone, a laptop computer, a desktop computer, a tablet computer, a personal digital assistant, and so on. That is, the present embodiment can be applied to any kinds of portable electrical devices, i.e., image creating display devices, which are capable of creating and displaying an image.
  • the portable electrical device 10 may include a display 20 and a camera assembly 30.
  • the camera assembly 30 includes a first main camera 32, a second main camera 34 and a sub camera 36.
  • the first main camera 32 and the second main camera 34 can capture an image in a first side of the portable electrical device 10 and the sub camera 36 can capture an image in the second side of the portable electrical device 10. Therefore, the first main camera 32 and the second main camera 34 are so-called out-cameras whereas the sub camera 36 is a so-called in-camera.
  • the image captured by the first main camera 32 and the second main camera 34 is also referred to as a first image and the image captured by the sub camera 36 is also referred to as a second image.
  • the portable electrical device 10 may have less than three cameras or more than three cameras.
  • the portable electrical device 10 may have two, four, five, and so on, cameras.
  • the portable electrical device 10 includes a first range sensor 40 and a second range sensor 42.
  • the first range sensor 40 measures a distance from the first main camera 32 and/or the second main camera 34 to objects in the first side.
  • the second range sensor 42 measures a distance from the sub camera 36 to objects in the second side.
  • each of the first range sensor 40 and the second range sensor 42 measures the distance by emitting a laser beam and detecting reflections of the emitted laser beam which are reflected by surfaces of the objects.
  • each of the first range sensor 40 and the second range sensor 42 may measure the distance by emitting ultrasonic waves and detecting reflections of the emitted ultrasonic waves which are reflected by the surfaces of the objects.
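
Purely as an illustration (the disclosure gives no formulas), both range sensor variants follow the same time-of-flight principle: distance is propagation speed times half the round-trip time. The constant, timing value, and function name below are assumptions, not taken from the disclosure.

```python
# Illustrative time-of-flight sketch: a range sensor such as the first
# range sensor 40 emits a pulse, times its reflection, and computes
# distance = speed * round_trip_time / 2.
SPEED_OF_LIGHT_M_S = 299_792_458.0  # for the ultrasonic variant, use the
                                    # speed of sound (~343 m/s) instead

def tof_distance(round_trip_seconds: float,
                 speed_m_s: float = SPEED_OF_LIGHT_M_S) -> float:
    """Distance to the reflecting surface from a round-trip pulse time."""
    return speed_m_s * round_trip_seconds / 2.0

# A laser pulse returning after ~6.67 ns corresponds to a surface ~1 m away
print(tof_distance(6.67e-9))  # ~1.0
```
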
  • the first range sensor 40 can measure the distance between the first main camera 32 and/or the second main camera 34 and the objects in the first side to generate depth information on the first image.
  • the second range sensor 42 can measure the distance between the sub camera 36 and the objects in the second side to generate depth information on the second image.
  • the first range sensor 40 and the second range sensor 42 are optional. Therefore, the portable electrical device 10 does not necessarily have the first range sensor 40, and the portable electrical device 10 does not necessarily have the second range sensor 42. In other words, the portable electrical device 10 may have no range sensors, or the portable electrical device 10 may have at least one of the first range sensor 40 and the second range sensor 42.
  • FIG. 3 illustrates a block diagram of a configuration of the portable electrical device 10 according to the present embodiment.
  • the portable electrical device 10 has a first camera module 50, a first range sensor module 52, a second camera module 54, a second range sensor module 56, and an image signal processor 58.
  • the first camera module 50 includes the first main camera 32 and the second main camera 34 in FIG. 1.
  • the first range sensor module 52 includes the first range sensor 40 in FIG. 1. Therefore, the first camera module 50 and the first range sensor module 52 are modules for capturing the first image and generating the depth information in the first side of the portable electrical device 10.
  • the second camera module 54 includes the sub camera 36 in FIG. 2 and the second range sensor module 56 includes the second range sensor 42 in FIG. 2. Therefore, the second camera module 54 and the second range sensor module 56 are modules for capturing the second image and generating the depth information in the second side of the portable electrical device 10.
  • the image signal processor 58 controls the first camera module 50, the first range sensor module 52, the second camera module 54, and the second range sensor module 56. Also, the image signal processor 58 processes camera image data acquired from the first camera module 50 and the second camera module 54.
  • the portable electrical device 10 has a global navigation satellite system (GNSS) module 60, a wireless communication module 62, a CODEC 64, a speaker 66, a microphone 68, a display module 70, an input module 72, an inertial measurement unit (IMU) 74, a processor 76, and a memory 78.
  • the GNSS module 60 measures the current position of the portable electrical device 10.
  • the wireless communication module 62 performs wireless communications with the Internet, public wireless communication networks and so on.
  • the wireless communication module 62 may adopt any communication standard or protocol, including but not limited to GSM (Global System for Mobile communication), CDMA (Code Division Multiple Access), LTE (Long Term Evolution), LTE-Advanced, and 5th generation (5G).
  • the wireless communication module 62 may include an antenna and a RF (radio frequency) circuit.
  • the CODEC 64 bidirectionally performs encoding and decoding, by using a predetermined encoding/decoding method.
  • the speaker 66 outputs sound in accordance with sound data decoded by the CODEC 64.
  • the microphone 68 outputs sound data to the CODEC 64 based on input sound.
  • the display module 70 has the display 20 in FIG. 2 and displays various information.
  • the display 20 of the display module 70 displays images captured by the first camera module 50 and/or the second camera module 54. That is, the display 20 of the display module 70 can display the first image captured by the first camera module 50 and can display the second image captured by the second camera module 54.
  • the term “image” indicates both a still image and a moving image.
  • the display module 70 may be a projector to project the images captured by the first camera module 50 and/or the second camera module 54.
  • There are various kinds of image display units, and any kind of image display unit that can display the images captured by the first camera module 50 and/or the second camera module 54 can be employed in the display module 70.
  • the input module 72 receives a user’s input.
  • the display module 70 also has a touch panel system which detects a position on the display 20 touched by the user and a movement of the position on the display 20.
  • the IMU 74 detects the angular velocity and the acceleration of the portable electrical device 10.
  • the processor 76 controls the global navigation satellite system (GNSS) module 60, the wireless communication module 62, the CODEC 64, the speaker 66, the microphone 68, the display module 70, the input module 72, and the IMU 74. In other words, the processor 76 controls the entire portable electrical device 10. That is, the processor 76 executes a variety of processes so that a variety of functions of the portable electrical device 10 are implemented.
  • the memory 78 stores a program and data required for the image signal processor 58 to control the first camera module 50, the first range sensor module 52, the second camera module 54 and the second range sensor module 56. In addition, the memory 78 stores an acquired image data from the image signal processor 58 and a program and data required for the processor 76 to control the portable electrical device 10.
  • the memory 78 includes a computer readable storage medium having a computer program stored thereon, wherein, when executed by the processor 76, the computer program implements the methods explained below for controlling the portable electrical device 10.
  • FIG. 4 illustrates a relative position of the portable electrical device 10, a user U1 of the portable electrical device 10, and objects O1 and O2 to be captured by the portable electrical device 10.
  • FIG. 5 illustrates the portable electrical device 10 and the objects O1 and O2 and the display 20 of the display module 70 which displays an image in a conventional manner.
  • FIG. 6 illustrates the portable electrical device 10 according to the present embodiment and the objects O1 and O2 and the display 20 of the display module 70 which displays the user perspective image according to the present embodiment.
  • the first camera module 50 can capture the first image including the objects O1 and O2 in the first side of the portable electrical device 10.
  • the second camera module 54 can capture the second image including the user U1 in the second side of the portable electrical device 10. The user U1 cannot see a part of the objects O1 and O2 directly because the portable electrical device 10 is an obstacle blocking the view of the objects O1 and O2.
  • the user U1 can see the objects O1 and O2 via the first image on the display 20.
  • an eye position of eyes of the user U1 is not considered in the first image displayed on the display 20. Therefore, there is a gap between the first image on the display 20 and a background of the portable electrical device 10. That is, an edge of the first image on the display 20 does not coincide with the background around the display 20 of the portable electrical device 10.
  • the portable electrical device 10 calculates the eye position of the eyes of the user U1 based on the second image captured by the second camera module 54 and creates the user perspective image based on the calculated eye position of the user U1 and the first image. That is, the edge of the user perspective image displayed on the display screen of the display 20 coincides with the background around the display 20 of the portable electrical device 10 when the user U1 sees the display 20 and the background of the display 20.
  • the display screen of the display 20 is a projection plane of the user perspective image. That is, a projection area on the projection plane of the user perspective image is defined by the display screen of the display 20.
  • FIG. 7 and FIG. 8 illustrate one example of the portable electrical device 10 according to the present embodiment.
  • FIG. 7 shows the first side of the portable electrical device 10 according to this example of the present embodiment
  • FIG. 8 shows the second side of the portable electrical device 10 according to this example of the present embodiment.
  • the first main camera 32 of the first camera module 50 and the first range sensor 40 of the first range sensor module 52 are provided in the first side of the portable electrical device in the same manner as FIG. 1 and FIG. 2.
  • positions of the first main camera 32 and the first range sensor 40 are different from those in FIG. 1 and FIG. 2. That is, the first main camera 32 and the first range sensor 40 are shifted to an upper right corner of the portable electrical device 10 when seen from the user side.
  • the sub camera 36 of the second camera module 54 and the second range sensor 42 of the second range sensor module 56 as well as the display 20 are provided in the second side of the portable electrical device 10.
  • FIG. 9 and FIG. 10 illustrate another example of the portable electrical device 10 according to the present embodiment.
  • FIG. 9 shows the first side of the portable electrical device 10 according to this example of the present embodiment
  • FIG. 10 shows the second side of the portable electrical device 10 according to this example of the present embodiment.
  • the first main camera 32 of the first camera module 50, the first range sensor 40 of the first range sensor module 52, the sub camera 36 of the second camera module 54, and the second range sensor 42 of the second range sensor module 56 are provided in a pop-up unit 44.
  • the pop-up unit 44 is stored in a main body of the portable electrical device 10. That is, the pop-up unit 44 is inside the main body of the portable electrical device 10.
  • the pop-up unit 44 pops up from the main body of the portable electrical device 10.
  • the first main camera 32 of the first camera module 50, the first range sensor 40 of the first range sensor module 52, the sub camera 36 of the second camera module 54, and the second range sensor 42 of the second range sensor module 56 are exposed.
  • the first main camera 32 of the first camera module 50 and the first range sensor 40 of the first range sensor module 52 are provided in the first side of the pop-up unit 44 of the portable electrical device 10. Also, the sub camera 36 of the second camera module 54 and the second range sensor 42 of the second range sensor module 56 are provided in the second side of the pop-up unit 44 of the portable electrical device 10.
  • FIG. 11 and FIG. 12 illustrate another example of the portable electrical device 10 according to the present embodiment.
  • FIG. 11 shows the first side of the portable electrical device 10 according to this example of the present embodiment
  • FIG. 12 shows the second side of the portable electrical device 10 according to this example of the present embodiment.
  • the portable electrical device 10 is a laptop computer.
  • the first main camera 32 of the first camera module 50 and the first range sensor 40 of the first range sensor module 52 are provided in the first side of the portable electrical device. That is, the first main camera 32 and the first range sensor 40 are provided at a central upper portion of a back side of the display module 70 of the laptop computer.
  • the sub camera 36 of the second camera module 54 and the second range sensor 42 of the second range sensor module 56 as well as the display 20 are provided in the second side of the portable electrical device 10. That is, the sub camera 36 and the second range sensor 42 are provided at a central upper portion of a frame of the display 20 of the display module 70.
  • FIG. 13 and FIG. 14 illustrate another example of the portable electrical device 10 according to the present embodiment.
  • FIG. 13 shows the first side of the portable electrical device 10 according to this example of the present embodiment
  • FIG. 14 shows the second side of the portable electrical device 10 according to this example of the present embodiment.
  • the portable electrical device 10 is a desktop computer whose computer system is incorporated in a casing of the display module 70 of the desktop computer.
  • the first main camera 32 of the first camera module 50 and the first range sensor 40 of the first range sensor module 52 are provided in the first side of the portable electrical device. That is, the first main camera 32 and the first range sensor 40 are provided at a central upper portion of a back side of the display module 70 of the desktop computer.
  • the sub camera 36 of the second camera module 54 and the second range sensor 42 of the second range sensor module 56 as well as the display 20 are provided in the second side of the portable electrical device 10. That is, the sub camera 36 and the second range sensor 42 are provided at a central upper portion of a frame of the display 20 of the display module 70 of the desktop computer.
  • the portable electrical device 10 may be a display or a television set which is provided with necessary processing circuitry therein.
  • the display or the television set may be a wall-hanging type which the user can use for watching television programs.
  • the display or the television set can also display a view from outside a window, or a view in a foreign country where the first main camera 32 of the first camera module 50 and the first range sensor 40 of the first range sensor module 52 are connected via a network.
  • the image creating display device may be a projection system.
  • Hereinafter, a case in which the present embodiment of the present disclosure is applied to the projection system will be explained.
  • FIG. 15 illustrates a window frame 80 and a background of the window frame 80
  • FIG. 16 illustrates the projection system 90 which is installed on the window frame 80.
  • the window frame 80 is located in front of a user U2. Therefore, the user U2, who is located in the second side of the projection system 90, cannot see the whole view in the first side of the projection system 90. That is, the window frame 80 is an obstacle that prevents the user U2 from seeing the whole view in the first side of the projection system 90.
  • the projection system 90 is installed on the window frame 80.
  • the projection system 90 has the first camera module 50, the second camera module 54 and the display module 70.
  • the first camera module 50 can capture the first image in the first side of the projection system 90 and the second camera module 54 can capture the second image in the second side of the projection system 90.
  • the display module 70 includes a projector 92 to project the user perspective image on the window frame 80. That is, in the present embodiment, a surface in the second side of the window frame 80 is the projection plane to project the user perspective image. In other words, the projection area of the projection plane is defined by the window frame 80.
  • necessary elements such as the first range sensor module 52, the second range sensor module 56, the image signal processor 58, the processor 76, and the memory 78 may be accommodated in a casing of the display module 70 (the projector 92).
  • the display module 70 constitutes a main unit which accommodates the necessary elements for the projection system 90.
  • the first range sensor module 52 and the second range sensor module 56 may be omitted.
  • the user perspective image can be created by an approximation process even if the distance between the first camera module 50 and the objects is unknown.
  • the position of the user in the second side can also be estimated based on the second image captured by the second camera module 54.
  • FIG. 17 and FIG. 18 illustrate another example of the projection system 90 according to the present embodiment.
  • FIG. 17 illustrates the first side of the projection system 90 according to this example of the present embodiment
  • FIG. 18 illustrates the second side of the projection system 90 according to this example of the present embodiment.
  • the projection system 90 has the first main camera 32 of the first camera module 50 and the first range sensor 40 of the first range sensor module 52 in the first side. Therefore, the first main camera 32 can capture the first image in the first side, and the first range sensor 40 can measure the distance from the first main camera 32 to the objects in the first side.
  • the projection system 90 has the sub camera 36 of the second camera module 54 and the second range sensor 42 of the second range sensor module 56 in the second side. Therefore, the sub camera 36 can capture the second image in the second side, and the second range sensor 42 can measure the distance from the sub camera 36 to the user in the second side.
  • the first range sensor 40 and the second range sensor 42 are optional. Therefore, the first range sensor 40 and the second range sensor 42 can be omitted in the projection system 90.
  • the projector 92 of the display module 70 projects the user perspective image in the projection area on the projection plane.
  • the projection area on the projection plane may be formed by a projection display screen, a transparent display screen, just the air, a semitransparent display screen, the human retina for a retinal projection system, and so on.
  • FIG. 19 illustrates another example of the projection system 90 according to the present embodiment.
  • FIG. 19 illustrates the second side of the projection system 90 according to the present embodiment.
  • the projection system 90 has the first main camera 32 of the first camera module 50 and the first range sensor 40 of the first range sensor module 52 in the first side. Therefore, the first main camera 32 can capture the first image in the first side, and the first range sensor 40 can measure the distance from the first main camera 32 to the objects in the first side.
  • the projection system 90 has the sub camera 36 of the second camera module 54 and the second range sensor 42 of the second range sensor module 56 in the second side. Therefore, the sub camera 36 can capture the second image in the second side, and the second range sensor 42 can measure the distance from the sub camera 36 to the user in the second side.
  • the first range sensor 40 and the second range sensor 42 are optional. Therefore, the first range sensor 40 and the second range sensor 42 can be omitted in the projection system 90.
  • the projector 92 of the display module 70 projects the user perspective image in the projection area on the projection plane.
  • the projection plane may be formed by a projection display screen, a transparent display screen, just the air, a semitransparent display screen, the human retina for a retinal projection system, and so on.
  • the projector 92 of the display module 70 projects the user perspective image on the projection display screen 94.
  • the first main camera 32 is not fixed at a specific position.
  • the first range sensor 40 is not fixed at a specific position, either. Therefore, the first main camera 32 and the first range sensor 40 can be installed in any other places.
  • the first main camera 32 and the first range sensor 40 may be installed at any place and connected via a network.
  • FIG. 20 and FIG. 21 illustrate a flowchart of the user perspective image creating process according to the present embodiment.
  • the user perspective image creating process is executed by, for example, the processor 76 in order to generate the user perspective image.
  • the processor 76 may collaborate with the image signal processor 58 to generate the user perspective image. Therefore, the processor 76 and the image signal processor 58 may be used together for creating the user perspective image.
  • the image signal processor 58 and/or the processor 76 constitute processing circuitry. However, circuits other than the image signal processor 58 and the processor 76 may also constitute the processing circuitry.
  • the user perspective image creating process is repeatedly and regularly executed by the image creating display device. Therefore, if the face of the user, i.e., the eye position of the user, moves, the user perspective image displayed on the display screen also moves in accordance with the movement of the eye position of the user.
  • a program to implement the user perspective image creating process is stored in the memory 78, and the processor 76 reads out and executes the program to implement the user perspective image creating process; a simplified sketch of this loop follows.
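
Before walking through the steps, a compact sketch of the overall control flow may help. Every helper name on `device` below is a hypothetical placeholder standing in for the steps of FIG. 20 and FIG. 21, not an API from the disclosure.

```python
# A minimal sketch of the repeatedly executed user perspective image
# creating process; each helper is a hypothetical stand-in for one step.
def user_perspective_loop(device) -> None:
    while device.is_running():
        first_img, second_img = device.capture_images()             # Step S10
        face = device.detect_face(second_img)                       # Step S12
        eye_px = device.select_eye_position(face)                   # Steps S14-S20
        eye_user = device.eye_position_3d(eye_px, second_img)       # Steps S30-S36
        eye_env = device.to_first_coordinate_system(eye_user)       # Step S38
        frame = device.render_user_perspective(eye_env, first_img)  # Step S40
        frame = device.overlay_additional_info(frame)               # Step S42
        device.display(frame)                                       # Step S44
```
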
  • the processor 76 of the image creating display device obtains the first image in the first side and the second image in the second side (Step S10) .
  • the processor 76 instructs the image signal processor 58 to capture the first image and the second image, and the image signal processor 58 controls the first camera module 50 to capture the first image in the first side and the second camera module 54 to capture the second image in the second side.
  • the image signal processor 58 controls the first range sensor module 52 to obtain the depth information on the first image in the first side.
  • the image signal processor 58 controls the second range sensor module 56 to obtain the depth information on the second image in the second side.
  • the depth information on the first image may be generated based on a parallax of a plurality of captured images in the first side.
  • likewise, the depth information on the second image may be generated based on a parallax of a plurality of captured images in the second side; a sketch of the underlying stereo relation follows.
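
As an illustration of generating depth from parallax, here is a minimal sketch assuming a rectified stereo pair (for example, the first main camera 32 and the second main camera 34); the focal length and baseline figures are made-up examples, not values from the disclosure.

```python
import numpy as np

def depth_from_disparity(disparity_px: np.ndarray,
                         focal_length_px: float,
                         baseline_m: float) -> np.ndarray:
    """Rectified stereo relation: depth = f * B / d (metres)."""
    depth = np.full(disparity_px.shape, np.inf, dtype=np.float64)
    valid = disparity_px > 0  # zero disparity corresponds to infinity
    depth[valid] = focal_length_px * baseline_m / disparity_px[valid]
    return depth

# Example: a 3 px disparity with f = 1400 px and a 12 mm baseline -> ~5.6 m
print(depth_from_disparity(np.array([[3.0]]), 1400.0, 0.012))
```
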
  • the spherical camera can capture the first image in the first side and the second image in the second side simultaneously. That is, the spherical camera can capture the images around the spherical camera at 360 degrees simultaneously.
  • the processor 76 of the image creating display device calculates a coordinate of the face of the user on the second image (Step S12) .
  • the image creating display device recognizes the face of the user in the second image by an image recognition process based on the second image.
  • the processor 76 of the image creating display device calculates a coordinate of the eye position of the user on the second image (Step S14) .
  • the image creating display device recognizes the right eye and the left eye of the user in the second image by the image recognition process based on the second image.
  • a coordinate of a center of a pupil of the right eye may be regarded as the coordinate of the right eye of the user
  • a coordinate of a center of a pupil of the left eye may be regarded as the coordinate of the left eye of the user.
  • the processor 76 of the image creating display device decides which of the eyes should be a user perspective (Step S16). If the right eye is the user perspective, the coordinate of the right eye is regarded as the coordinate of the eye position and the right eye is subjected to the image processing hereinafter (Step S18). On the other hand, if the left eye is the user perspective, the coordinate of the left eye is regarded as the coordinate of the eye position and the left eye is subjected to the image processing hereinafter (Step S20).
  • the coordinate of the eye position of the user is the user perspective from which the user sees the image creating display device and the background thereof.
  • the image creating display device may make the user select which of eyes should be the user perspective. That is, the image creating display device may display a selection menu screen on the display 20 to make the user select the right eye or the left eye as the user perspective. Alternatively, the image creating display device may prompt the user to close the right eye or the left eye while capturing the second image, and then the coordinate of the closed eye is regarded as the user perspective.
  • the image creating display device may decide the user perspective based on a relative position of the face of the user to the display 20. For example, when the display 20 is located in a left side with respect to the face of the user, the coordinate of the left eye is regarded as the user perspective.
  • the image creating display device may decide the user perspective based on whether the user is looking at the display 20 with a short eye focus or looking at a distant place with a long eye focus.
  • the user perspective may be the right eye of the user, which is not a dominant eye for the user.
  • the user perspective may be the left eye of the user, which is a dominant eye for the user.
  • the user has to register his/her dominant eye in the image creating display device in advance.
  • the image creating display device may detect a position of the display 20 with respect to the position of the face of the user based on the second image to decide the user perspective. For example, the image creating display device recognizes the position of the face of the user and the display 20 based on the second image, and then decides whether the display 20 is located in a right side with respect to the face of the user or the display 20 is located in a left side with respect to the face of the user. For example, when the display 20 is located in the left side with respect to the face of the user, the coordinate of the left eye is regarded as the user perspective.
  • the image creating display device may calculate an intermediate point between the right eye and the left eye of the user to decide the user perspective. For example, the image creating display device may calculate the middle point between the right eye and the left eye of the user based on the second image. In this example, the coordinate of the intermediate point calculated based on the second image is regarded as the user perspective. Alternatively, if it is difficult to detect the positions of the right eye and the left eye based on the second image, the intermediate point may be estimated based on the position of the face of the user in the second image.
  • the steps S16, S18 and S20 can be omitted.
  • both the coordinate of the right eye and the coordinate of the left eye are the eye positions of the user in order to create the user perspective images for the right eye and the left eye of the user.
  • the three-dimensional image can be realized by a special three-dimensional display with special glasses or without any glasses, i.e., with the naked eye.
  • the processor 76 of the image creating display device judges whether the depth information on the second image is available (Step S30) . That is, if the depth information on the second image has been obtained in the step S10, the depth information on the second image is available for the image creating display device. On the other hand, if the depth information on the second image has not been obtained in the step S10, the depth information on the second image is not available for the image creating display device.
  • If the depth information on the second image is available (Step S30: Yes), the processor 76 of the image creating display device, for example, calculates an eye depth of the user based on the depth information on the second image (Step S32). That is, the image creating display device can calculate the coordinate of the eye position which has been decided as the user perspective in the steps S16, S18 and S20.
  • If the depth information on the second image is not available (Step S30: No), the processor 76 of the image creating display device, for example, needs to estimate the eye depth based on the second image (Step S34).
  • FIG. 22 and FIG. 23 illustrate an example to estimate the eye depth based on the second image.
  • FIG. 22 illustrates the second image in which the user U3 is close to the sub camera 36 of the second camera module 54
  • FIG. 23 illustrates the second image in which the user U4 is far from the sub camera 36 of the second camera module 54.
  • the image creating display device may estimate the eye depth based on a size of a face of the user. That is, as shown in FIG. 22, the size of the face F3 of the user U3 on the second image is relatively large. On the other hand, as shown in FIG. 23, the size of the face F4 of the user U4 on the second image is relatively small. Therefore, the image creating display device can estimate the eye depth of the user based on the size of the face of the user on the second image.
  • the image creating display device may estimate the eye depth based on a distance between the eyes of the user. That is, as shown in FIG. 22, the distance D3 between the eyes of the user U3 on the second image is relatively long. On the other hand, as shown in FIG. 23, the distance D4 between the eyes of the user U4 on the second image is relatively short. For example, a statistical average of the distance between the right eye and the left eye of Asian people is known. By using the statistical average of the eye distance, the eye depth can be estimated precisely enough when an Asian user is captured in the second image. Therefore, the image creating display device can estimate the eye depth of the user based on the distance between the right eye and the left eye of the user on the second image.
  • the image creating display device may estimate the eye depth of the user based on both the size of the face of the user and the distance between the eyes of the user on the second image. That is, the two techniques mentioned above to estimate the eye depth of the user may be combined. In other words, the eye depth may be estimated based on the eyes and/or the face of the user in the second image; a sketch of the distance-based estimation follows.
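
A minimal sketch of the distance-between-the-eyes technique under a pinhole camera model; the focal length and the statistical average inter-pupillary distance (~63 mm) are assumed illustrative values, not figures from the disclosure.

```python
def estimate_eye_depth(eye_distance_px: float,
                       focal_length_px: float = 1000.0,
                       average_ipd_m: float = 0.063) -> float:
    """Pinhole model: depth = f * real_size / apparent_size.

    A statistical average inter-pupillary distance (an assumed value)
    stands in for the user's real eye distance.
    """
    if eye_distance_px <= 0.0:
        raise ValueError("inter-eye distance must be positive")
    return focal_length_px * average_ipd_m / eye_distance_px

# Pupils 180 px apart -> the face is roughly 0.35 m from the sub camera 36
print(estimate_eye_depth(180.0))
```
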
  • the processor 76 of the image creating display device calculates a coordinate of the eye position of the user in a second coordinate system (Step S36) .
  • the first image in the first side has a first coordinate system and the second image in the second side has the second coordinate system. Therefore, in the step S36, the image creating display device calculates the coordinate of the eye position in the second coordinate system based on the eye depth and the second image.
  • FIG. 24 illustrates a relative position of the sub camera 36 and the eye position of the eyes of the user.
  • the position of the right eye of the user is the eye position of the user.
  • the coordinate of the eye position of the user in the second coordinate system is calculated based on the eye depth obtained in the step S32 or S34 and the second image obtained in the step S10.
  • the coordinate of the eye position in the second coordinate system is indicated by the vector e_user-side.
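
As an illustration of Step S36, the eye's pixel coordinate and the eye depth can be back-projected into the vector e_user-side with a standard pinhole model; the intrinsic parameters below are hypothetical.

```python
import numpy as np

def backproject(u: float, v: float, depth_m: float,
                fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Back-project pixel (u, v) at the given depth into camera space."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])  # e_user-side in the second system

# Eye detected at pixel (860, 540), 0.35 m deep, with assumed intrinsics
e_user_side = backproject(860.0, 540.0, 0.35, 1000.0, 1000.0, 960.0, 540.0)
print(e_user_side)  # [-0.035  0.     0.35 ]
```
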
  • the processor 76 of the image creating display device calculates a coordinate of the eye position in the first coordinate system (Step S38). That is, the image creating display device transforms the three-dimensional coordinate of the eye position of the user in the second coordinate system into a three-dimensional coordinate of the eye position of the user in the first coordinate system. Therefore, a coordinate transformation from the second coordinate system to the first coordinate system is performed.
  • a coordinate transformation matrix for the coordinate transformation may be previously defined.
  • the coordinate transformation matrix can be defined by a calibration by capturing the same object at the same position by both the first camera module 50 and the second camera module 54.
  • the coordinate transformation matrix can dynamically be defined by capturing the same object at the same position by both the first camera module 50 and the second camera module 54.
  • FIG. 25 illustrates the eye position in the first coordinate system after the coordinate transformation is executed by the image creating display device according to the present embodiment.
  • the coordinate transformation matrix includes a rotation matrix and a translation vector which are previously obtained by the calibration.
  • the eye position in the first coordinate system is indicated by the vector e_env-side.
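
As an illustration of Step S38, the transformation is a rigid motion built from the calibrated rotation matrix and translation vector; the numeric values below are stand-ins for real calibration results, not values from the disclosure.

```python
import numpy as np

# Assumed calibration result: the out-facing and in-facing cameras point
# in roughly opposite directions, so R is close to a 180-degree rotation
# about the vertical axis; t is a small offset between camera centres.
R = np.array([[-1.0, 0.0,  0.0],
              [ 0.0, 1.0,  0.0],
              [ 0.0, 0.0, -1.0]])
t = np.array([0.0, 0.01, 0.008])

def to_env_side(e_user_side: np.ndarray) -> np.ndarray:
    """e_env-side = R @ e_user-side + t (second -> first system)."""
    return R @ e_user_side + t

print(to_env_side(np.array([-0.035, 0.0, 0.35])))  # [ 0.035  0.01  -0.342]
```
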
  • the processor 76 of the image creating display device creates the user perspective image based on the three-dimensional coordinate of the eye position of the user in the first coordinate system and the first image captured by the first camera module in the first side (Step S40). That is, the image creating display device creates the user perspective image based on the eye position of the user and the first image.
  • FIG. 26 is an explanatory drawing of creating the user perspective image according to the present embodiment.
  • the three-dimensional coordinate of the object in the first coordinate system is expressed by P, and the position of P corresponds to a position P_camera on the projection plane of the first image.
  • the line between the eye position and the position of P intersects the display at P_display.
  • by mapping each P_camera to its corresponding P_display, the user perspective image can be created; the sketch following this discussion illustrates the construction.
  • a complementing process and/or a black-painting process is needed for parts of the display that the first image does not cover.
  • for such parts, information on a past frame for the display, or image data stored in the image creating display device or a server computer, can be used.
  • the depth information of the first image is not necessarily needed.
  • the user perspective image can be created by a relative position of the eye position of the user, the position and the size of the projection area on the projection plane of the first image, and the position and size of the projection area on the projection plane of the second image. That is, the user perspective image can be created without the depth information on the first image.
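
A hedged sketch of the Step S40 construction: a ray from the eye position through one display pixel is extended to the scene, and the hit point P is projected back into the first image to find P_camera. To stay self-contained, it assumes the scene lies on a single plane at an assumed depth, in the spirit of the approximation mentioned above when the true depth is unknown; all geometry and intrinsics are made-up values.

```python
import numpy as np

def sample_first_image(eye_env: np.ndarray, display_px_world: np.ndarray,
                       scene_depth_m: float, fx: float,
                       cx: float, cy: float) -> tuple[float, float]:
    """Return the first-image pixel (P_camera) seen from the eye through
    one display pixel (P_display), assuming a planar scene at z = depth."""
    ray = display_px_world - eye_env
    s = (scene_depth_m - eye_env[2]) / ray[2]  # stretch ray to scene plane
    p = eye_env + s * ray                      # 3D point P, first system
    u = fx * p[0] / p[2] + cx                  # pinhole projection of P
    v = fx * p[1] / p[2] + cy
    return u, v

# Eye 0.35 m behind a display pixel 2 cm right of the camera axis,
# scene assumed 2 m away, with an assumed focal length of 1000 px
print(sample_first_image(np.array([0.0, 0.0, -0.35]),
                         np.array([0.02, 0.0, 0.0]), 2.0,
                         1000.0, 960.0, 540.0))  # (~1027.1, 540.0)
```
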
  • the processor 76 of the image creating display device adds additional information to the user perspective image by overlapping the additional information on the user perspective image (Step S42).
  • There are various kinds of additional information to be overlapped on the user perspective image.
  • a generated specific object may be overlapped on the user perspective image.
  • an explanation written in any kinds of characters may be overlapped on the user perspective image.
  • the processor 76 of the image creating display device analyzes the user perspective image and then generates the explanation on the user perspective image based on the result of the analysis of the user perspective image.
  • the step S42 can be omitted if the additional information is not required for the user.
  • in that case, the additional information is not added to the user perspective image.
  • the processor 76 of the image creating display device displays the user perspective image in the projection area on the projection plane by the display module 70 (Step S44).
  • the projection area on the projection plane is within the display 20 of the portable electrical device 10, the window frame 80 of the projection system 90, or the like.
  • FIG. 27 illustrates an example of the user perspective image displayed on the display 20 of the display module 70 of the portable electrical device 10.
  • the user perspective image matches the background around the projection area of the display 20.
  • the image creating display device can create the user perspective image which matches a view through a projection area on the projection plane from the eye position of the user with the background around the projection area seen from the eye position of the user.
  • it is possible for the processor 76 of the image creating display device to display the additional information but not to display the user perspective image.
  • the image creating display device does not need to display the user perspective image for the user because the user can see the view through the transparent display screen or the semi-transparent display screen. Therefore, in this case, it is sufficient to display only the additional information in the projection area on the projection plane for the user.
  • After the process of the step S44, the user perspective image creating process of the present embodiment shown in FIG. 20 and FIG. 21 has been completed, and the process from the step S10 is performed again.
  • the image creating display device can create the user perspective image which matches the view through the projection area on the projection plane from the eye position of the user with a background around the projection area from the eye position of the user.
  • the edge of the user perspective image displayed by the display module 70 coincides with the background around the edge of the user perspective image.
  • a sense of immersion of the user is improved and the quality of a virtual reality experience is enhanced.
  • first and second are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features.
  • the feature defined with “first” and “second” may comprise one or more of this feature.
  • “a plurality of” means two or more, unless specified otherwise.
  • the terms “mounted”, “connected”, “coupled” and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements, which can be understood by those skilled in the art according to specific situations.
  • a structure in which a first feature is “on” or “below” a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are contacted via an additional feature formed therebetween.
  • a first feature “on”, “above” or “on top of” a second feature may include an embodiment in which the first feature is right or obliquely “on”, “above” or “on top of” the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature “below”, “under” or “on bottom of” a second feature may include an embodiment in which the first feature is right or obliquely “below”, “under” or “on bottom of” the second feature, or just means that the first feature is at a height lower than that of the second feature.
  • Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
  • the logic and/or step described in other manners herein or shown in the flow chart, for example, a particular sequence table of executable instructions for realizing the logical function may be specifically achieved in any computer readable medium to be used by the instruction execution system, device or equipment (such as the system based on computers, the system comprising processors or other systems capable of obtaining the instruction from the instruction execution system, device and equipment and executing the instruction) , or to be used in combination with the instruction execution system, device and equipment.
  • the computer readable medium may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment.
  • examples of the computer readable medium include but are not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device), a random access memory (RAM), a read only memory (ROM), an erasable programmable read-only memory (EPROM or a flash memory), an optical fiber device, and a portable compact disk read-only memory (CDROM).
  • the computer readable medium may even be a paper or other appropriate medium capable of printing programs thereon, this is because, for example, the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs in an electric manner, and then the programs may be stored in the computer memories.
  • each part of the present disclosure may be realized by the hardware, software, firmware or their combination.
  • a plurality of steps or methods may be realized by the software or firmware stored in the memory and executed by the appropriate instruction execution system.
  • the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA), a field programmable gate array (FPGA), etc.
  • each function cell of the embodiments of the present disclosure may be integrated in a processing module, or these cells may be separate physical existence, or two or more cells are integrated in a processing module.
  • the integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
  • the storage medium mentioned above may be read-only memories, magnetic disks, CD, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Controls And Circuits For Display Device (AREA)
PCT/CN2020/092194 2020-05-26 2020-05-26 Method of controlling image creating display device, and image creating display device WO2021237430A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/092194 WO2021237430A1 (en) 2020-05-26 2020-05-26 Method of controlling image creating display device, and image creating display device
CN202080098614.1A CN115280369A (zh) Control method of image creating display device, and image creating display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/092194 WO2021237430A1 (en) 2020-05-26 2020-05-26 Method of controlling image creating display device, and image creating display device

Publications (1)

Publication Number Publication Date
WO2021237430A1 true WO2021237430A1 (en) 2021-12-02

Family

ID=78745122

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/092194 WO2021237430A1 (en) 2020-05-26 2020-05-26 Method of controlling image creating display device, and image creating display device

Country Status (2)

Country Link
CN (1) CN115280369A (zh)
WO (1) WO2021237430A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140168056A1 (en) * 2012-12-19 2014-06-19 Qualcomm Incorporated Enabling augmented reality using eye gaze tracking
CN104883598A (zh) * 2015-06-24 2015-09-02 三星电子(中国)研发中心 一种画面显示设备以及一种显示画面调整方法
CN104918036A (zh) * 2014-03-12 2015-09-16 联想(北京)有限公司 增强现实显示装置及方法
US20170046877A1 (en) * 2015-08-14 2017-02-16 Argis Technologies, LLC Augmented visualization system for hidden structures
CN109255838A (zh) * 2017-07-14 2019-01-22 北京行云时空科技有限公司 避免增强现实显示设备观看重影的方法及设备
CN109725728A (zh) * 2018-12-29 2019-05-07 三星电子(中国)研发中心 一种ar设备的显示修正方法和装置

Also Published As

Publication number Publication date
CN115280369A (zh) 2022-11-01

Similar Documents

Publication Publication Date Title
US11636653B2 (en) Method and apparatus for synthesizing virtual and real objects
KR102150013B1 (ko) Beamforming method and apparatus for acoustic signal
KR101979669B1 (ko) Method for correcting a user's gaze in an image, machine-readable storage medium, and communication terminal
US9594945B2 (en) Method and apparatus for protecting eyesight
US10554928B2 (en) Telepresence device
CN109977847B (zh) Image generation method and apparatus, electronic device, and storage medium
JP6109413B2 (ja) Image display method, image display device, terminal, program, and recording medium
US20190147606A1 (en) Apparatus and method of five dimensional (5d) video stabilization with camera and gyroscope fusion
EP3825960A1 (en) Method and device for obtaining localization information
US11089265B2 (en) Telepresence devices operation methods
CN110706339B (zh) Three-dimensional face reconstruction method and apparatus, electronic device, and storage medium
CN110569708A (zh) Text detection method and apparatus, electronic device, and storage medium
CN112927271A (zh) Image processing method, image processing apparatus, storage medium, and electronic device
CN110152293B (zh) Positioning method and apparatus for a manipulated object, and positioning method and apparatus for a game object
US20220245839A1 (en) Image registration, fusion and shielding detection methods and apparatuses, and electronic device
CN112308103B (zh) Method and apparatus for generating training samples
US11240487B2 (en) Method of stereo image display and related device
CN113870213A (zh) Image display method and apparatus, storage medium, and electronic device
WO2021237430A1 (en) Method of controlling image creating display device, and image creating display device
WO2020210937A1 (en) Systems and methods for interpolative three-dimensional imaging within the viewing zone of a display
US20190318449A1 (en) Imaging method and apparatus for virtual reality device, and virtual reality device
CN112270737A (zh) Texture mapping method and apparatus, electronic device, and storage medium
CN114093020A (zh) Motion capture method and apparatus, electronic device, and storage medium
CN116681746B (zh) Method and apparatus for determining a depth image
WO2022016331A1 (en) Method of compensating tof depth map and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20938328

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20938328

Country of ref document: EP

Kind code of ref document: A1