WO2018168825A1 - Image processing device and electronic apparatus - Google Patents

Image processing device and electronic apparatus

Info

Publication number
WO2018168825A1
Authority
WO
WIPO (PCT)
Prior art keywords
subject
image
displayed
display unit
frame
Prior art date
Application number
PCT/JP2018/009651
Other languages
English (en)
Japanese (ja)
Inventor
英範 栗林
Original Assignee
株式会社ニコン
Priority date
Filing date
Publication date
Application filed by 株式会社ニコン
Publication of WO2018168825A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules

Definitions

  • the present invention relates to an image processing apparatus and an electronic apparatus.
  • A camera that cuts out and displays or records a part of an image photographed by an ultra-wide-angle camera is known (for example, Patent Document 1).
  • According to a first aspect, an image processing apparatus comprises an input unit and a frame generation unit. The input unit inputs first image data used to display a first image obtained by imaging a third subject, and second image data, captured after the first image, used to display a second image obtained by imaging a fourth subject and a fifth subject. Each is a part of an image obtained by capturing a first subject and a second subject, and is image data with which, by repeating the control of moving the part of the image displayed on a display unit in a first direction and displaying a portion of the image not displayed on the display unit, the second subject is displayed after the first subject is displayed, and the first subject is then displayed on the display unit again. The frame generation unit is configured to generate a second frame such that the left-right or up-down positional relationship between the fourth subject and the fifth subject in a second partial image is maintained.
  • According to another aspect, an image processing apparatus comprises: an input unit that inputs the first image data and the second image data described above; a moving image generation unit that generates a moving image including at least a first frame, displayed on the display unit and including the third subject, and a second frame displayed on the display unit; and an image generation unit that generates the second frame in which the fourth subject and the fifth subject are arranged based on the position of the third subject in the first image.
  • According to another aspect, an electronic device comprises: an imaging unit that captures a first image in which a first subject is captured and then captures a second image in which a second subject and a third subject are captured; a control unit that, by repeating the control of moving the part of the first image displayed on the display unit in a first direction and displaying a portion of the first image not displayed on the display unit, displays the first subject and then displays the first subject on the display unit again, and that, by repeating the same control on the second image, displays the third subject after the second subject is displayed and then displays the second subject on the display unit again; a moving image generation unit that generates a moving image including at least a first frame, displayed on the display unit and including the first subject, and a second frame, displayed on the display unit and including the second subject and the third subject; and a frame generation unit that generates the second frame so that it includes a partial image at a position corresponding to a first partial image including the first subject in the first image, and so that, in a second partial image including the second subject and the third subject in the second image, the left-right or up-down positional relationship between the second subject and the third subject is maintained.
  • According to another aspect, an image processing device comprises: an input unit that inputs a first omnidirectional image including a first subject imaged by an imaging unit, and a second omnidirectional image, imaged after the first omnidirectional image, including a second subject and a third subject; a moving image generation unit that generates a moving image including at least a first frame, displayed on a display unit and including the first subject, and a second frame, displayed on the display unit and including the second subject and the third subject; and a frame generation unit that generates the second frame so that it includes, at a position corresponding to a first partial image including the first subject in the first omnidirectional image, a second partial image including the second subject and the third subject, and so that the left-right or up-down positional relationship between the second subject and the third subject in the second partial image is maintained.
  • According to another aspect, an image processing device comprises: an input unit that inputs a first omnidirectional image including a first subject imaged by an imaging unit, and a second omnidirectional image, imaged after the first omnidirectional image, including a second subject and a third subject; a moving image generation unit that generates a moving image including at least a first frame, displayed on a display unit and including the first subject, and a second frame displayed on the display unit; and a frame generation unit configured to generate the second frame in which the second subject and the third subject are arranged based on the position of the first subject in the first omnidirectional image.
  • FIG. 1: Block diagram schematically showing the configuration of the image processing system. FIG. 2: Block diagram schematically showing the configuration of the imaging device. FIG. 3: Schematic diagram of the imaging range and omnidirectional image of the imaging unit.
  • FIG. 1 is a block diagram schematically showing the configuration of the image processing system 1.
  • the image processing system 1 includes an imaging device 2, an image processing device 3, and a playback device 4.
  • the imaging device 2 is an electronic device such as a digital camera, a smartphone, or a tablet terminal.
  • the image processing device 3 is an electronic device such as a digital camera, a smartphone, a tablet terminal, or a personal computer.
  • the playback device 4 is an electronic device such as a digital camera, a smartphone, a tablet terminal, a personal computer, a digital photo frame, or a head mounted display.
  • the imaging device 2 has a still image imaging function and a moving image imaging function.
  • the still image capturing function is a function for capturing an omnidirectional image (described later).
  • the moving image capturing function is a function that repeatedly captures an omnidirectional image and creates an omnidirectional moving image in which each frame is an omnidirectional image.
  • From the omnidirectional moving image created by the imaging device 2, the image processing device 3 creates a two-dimensional moving image (described later) in which each frame is a two-dimensional image having a smaller angle of view than the omnidirectional image.
  • the playback device 4 plays back (displays) omnidirectional images and two-dimensional moving images.
  • FIG. 2 is a block diagram schematically illustrating the configuration of the imaging device 2.
  • the imaging device 2 includes an imaging unit 20, a first imaging optical system 21, a second imaging optical system 22, and a storage unit 23.
  • the imaging unit 20 includes a first imaging element 201 and a second imaging element 202.
  • the first imaging optical system 21 and the second imaging optical system 22 are so-called fisheye lenses.
  • the first imaging optical system 21 forms a subject image in the hemisphere range on the imaging surface of the first imaging element 201.
  • the first image sensor 201 is configured to be able to image a range of 360 degrees in the horizontal direction and 180 degrees in the vertical direction.
  • the imaging range of the first imaging element 201 is referred to as a first hemisphere.
  • the second imaging optical system 22 forms a subject image in a hemisphere range different from the first hemisphere on the imaging surface of the second imaging element 202.
  • the second image sensor 202 is configured to be able to image a range of 360 degrees in the horizontal direction and 180 degrees in the vertical direction.
  • the imaging range of the second imaging element 202 is referred to as a second hemisphere.
  • the first hemisphere and the second hemisphere make up the whole sphere. That is, the imaging unit 20 images the 360-degree range of the celestial sphere in the horizontal direction and 360 degrees in the vertical direction by the first imaging element 201 and the second imaging element 202.
  • an image obtained by capturing an entire celestial sphere having an angle of view of 360 degrees in the horizontal direction and 360 degrees in the vertical direction is referred to as an omnidirectional image.
  • the storage unit 23 stores a single omnidirectional image captured by the imaging unit 20 in the storage medium 51 (for example, a memory card).
  • the storage unit 23 stores the omnidirectional moving image including a plurality of omnidirectional images repeatedly captured by the imaging unit 20 in the storage medium 51.
  • each frame of the omnidirectional video is an omnidirectional image.
  • the storage medium 51 can be inserted into and removed from the imaging apparatus 2, but the imaging apparatus 2 may incorporate the storage medium 51.
  • FIG. 3A is a schematic diagram of an imaging range of the imaging unit 20.
  • the imaging unit 20 images the range of the omnidirectional sphere 60 shown in FIG. 3A with the installation position (camera position) of the imaging unit 20 as the origin O.
  • FIG. 3B is a schematic view illustrating an omnidirectional image captured by the imaging unit 20.
  • the omnidirectional image 61 illustrated in FIG. 3B includes a first hemisphere image 62 imaged by the first image sensor 201 and a second hemisphere image 63 imaged by the second image sensor 202.
  • the first hemisphere image 62 includes a circular image 64 formed by the first imaging optical system 21.
  • the second hemisphere image 63 includes a circular image 65 formed by the second imaging optical system 22.
  • An image with an arbitrary angle of view can be obtained by cutting out and deforming a part of the omnidirectional image 61 illustrated in FIG. 3B.
  • For example, a part of the omnidirectional image, such as the region 67 in FIG. 3B, may be cut out and transformed into a rectangle.
  • Similarly, in order to obtain an image of the area 68 shown shaded in FIG. 3A, the region 69 in FIG. 3B may be cut out and transformed into a rectangle.
  • An image 70 obtained in this way is illustrated in FIG. 3C.
  • The image 70 is a horizontally long panoramic image. Note that the left end 71 and the right end 72 of the image 70 illustrated in FIG. 3C are actually continuous with each other.
  • the image 70 illustrated in FIG. 3C is an all-round image obtained by imaging a 360 ° range around the imaging unit 20.
  • the all-round image 70 includes a path 600 that goes around the surface of the omnidirectional sphere 60.
  • the path 600 is the circumference of a circle centered on the origin O and having the same diameter as the omnidirectional sphere 60. Since the origin O is the center of the omnidirectional sphere 60, the circle coincides with the circumference of the cross section of the omnidirectional sphere 60 by a plane passing through the center of the omnidirectional sphere 60.
  • the length of line segment AB can be set arbitrarily. For example, by setting the point A to the so-called north pole and setting the point B to the so-called south pole, the range imaged in the omnidirectional image 61 and the range imaged in the image 70 that is the all-round image match. That is, the all-round image may be said to be a projection (mapping) of the omnidirectional image 61 onto a two-dimensional image.
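  • As an illustration of this projection, the correspondence between a direction on the omnidirectional sphere 60 and a pixel of the all-round image can be sketched as follows. This is a minimal sketch assuming an equirectangular-style layout; the patent does not prescribe a particular projection, and the function and parameter names are hypothetical.

```python
def direction_to_pixel(azimuth_deg, elevation_deg, width, height):
    """Map a viewing direction from the origin O to a pixel of a
    width x height all-round image (equirectangular layout assumed).

    azimuth_deg:   0..360 degrees along the horizontal path 600
    elevation_deg: +90 at point A ("north pole") .. -90 at point B ("south pole")
    """
    x = int((azimuth_deg % 360.0) / 360.0 * width) % width
    y = int((90.0 - elevation_deg) / 180.0 * height)
    return x, min(y, height - 1)

# A subject on the equator, a quarter turn around the sphere, lands a
# quarter of the way across the image and halfway down.
print(direction_to_pixel(90.0, 0.0, 3840, 1920))  # -> (960, 960)
```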
  • the image 70 illustrated in FIG. 3C is an all-round image obtained by imaging a 360 ° range in the horizontal direction of the imaging unit 20. Therefore, the all-round image 70 includes a path 600 corresponding to a so-called equator.
  • the all-round image is not limited to the horizontal direction of the imaging unit 20, and may be an image obtained by imaging a 360 ° range in all directions of the imaging unit 20. For example, an image obtained by imaging a 360 degree range around the imaging unit 20 along the meridian of the celestial sphere 60 can be used.
  • In the following description, the omnidirectional image is depicted as an all-round image obtained by capturing a 360-degree range in the horizontal direction, as illustrated in FIG. 3C. That is, unless otherwise specified, an omnidirectional image drawn like the all-round image 70 of FIG. 3C is actually an image capturing the range of the celestial sphere 60 shown in FIG. 3A.
  • the imaging unit 20 may have a larger number of imaging elements instead of the first imaging element 201 and the second imaging element 202.
  • That is, instead of combining two image sensors that each capture a hemispherical range, the range of the omnidirectional sphere 60 may be imaged by combining a larger number of image sensors that each capture a narrower range.
  • the imaging ranges of the individual imaging elements may partially overlap each other. For example, the imaging range of the first imaging element 201 and the imaging range of the second imaging element 202 may partially overlap.
  • Similarly, instead of the first imaging optical system 21 and the second imaging optical system 22, the imaging device 2 may have a larger number of imaging optical systems that each form a subject image of a range narrower than a hemisphere.
  • the imaging unit 20 may have a single imaging element instead of the first imaging element 201 and the second imaging element 202.
  • For example, by directing the light from the first imaging optical system 21 and the light from the second imaging optical system 22 to a single imaging element with mirrors or the like, the circular image 64 and the circular image 65 can be captured by a single imaging element. In this way, the number of imaging elements, and hence the cost of the imaging unit 20, can be reduced.
  • FIG. 4A is a block diagram schematically showing the configuration of the image processing apparatus 3.
  • the image processing device 3 includes an image generation unit 30, an input unit 31, and an output unit 32.
  • the input unit 31 reads out the omnidirectional moving image from the storage medium 51 in which the omnidirectional moving image is stored, and inputs it to the image generation unit 30.
  • the image generation unit 30 executes a two-dimensional moving image creation process to be described later on the input omnidirectional moving image.
  • the two-dimensional video creation process is a process for creating a two-dimensional video from the omnidirectional video. That is, the image generation unit 30 creates a two-dimensional video from the input omnidirectional video.
  • a two-dimensional moving image is a moving image in which each frame is composed of images with a narrower angle of view than the omnidirectional image.
  • A two-dimensional moving image has the same content as a moving image captured by placing a video camera having a typical angle of view of about 25 to 50 degrees at the origin O in FIG. 3A.
  • the output unit 32 stores the two-dimensional moving image created by the image generation unit 30 in the storage medium 52.
  • The storage medium 52 may be the same storage medium as the storage medium 51 in which the imaging device 2 stores the omnidirectional moving image, or may be a different storage medium. In FIG. 4A, the storage medium 51 and the storage medium 52 are shown outside the image processing apparatus 3; however, the image processing apparatus 3 may incorporate one or both of the storage medium 51 and the storage medium 52.
  • The storage medium 51 and the storage medium 52 may also be connected to the image processing apparatus via a wired or wireless network. Further, instead of using the storage medium 51, the omnidirectional moving image may be input directly from the imaging device 2 via the network.
  • Each frame of a two-dimensional moving image created from an omnidirectional moving image may contain not only one image with a narrower angle of view than the omnidirectional image, but also two or more such images.
  • FIG. 4B is a block diagram schematically showing the configuration of the playback device 4.
  • the playback device 4 includes a display unit 40, an input unit 41, a control unit 42, and an operation unit 43.
  • the input unit 41 reads out the omnidirectional image from the storage medium 51 in which the omnidirectional image is stored and inputs it to the control unit 42.
  • the input unit 41 reads the two-dimensional moving image from the storage medium 52 in which the two-dimensional moving image is stored and inputs the two-dimensional moving image to the control unit 42.
  • the control unit 42 performs control to display the input omnidirectional image or two-dimensional moving image on the display unit 40.
  • the display unit 40 has a display screen configured by, for example, a liquid crystal panel.
  • the display unit 40 displays the omnidirectional image or the two-dimensional moving image on the display screen based on the control by the control unit 42.
  • the storage medium 51 and the storage medium 52 are provided outside the playback device 4.
  • the playback device 4 may incorporate one or both of the storage medium 51 and the storage medium 52.
  • The display unit 40 is, for example, a liquid crystal display of a smartphone, a liquid crystal display of a tablet terminal, or a head mounted display. If the entire image area of the omnidirectional image were displayed on the display unit 40 at once, a 360-degree range would be shown on the two-dimensional display, making it difficult for the user to view.
  • Therefore, a method is known in which a part of the omnidirectional image having an angle of view of 360 degrees is cut out and displayed (reproduced) on the two-dimensional flat display screen (display unit 40).
  • In the present embodiment as well, a part of the omnidirectional image is displayed on the display unit 40 for reproduction.
  • the operation unit 43 is an operation member to which an operation by a user is input.
  • the operation unit 43 is a touch sensor that is superimposed on the display screen of the display unit 40.
  • the operation unit 43 detects the position where the user's finger or the like has touched the display screen and transmits the detected position to the control unit 42. That is, the operation unit 43 detects a user's touch operation and inputs it to the control unit 42.
  • The touch operation includes, for example, a scroll operation in which the user touches a position on the display screen with a finger or the like, slides it in any of the up, down, left, or right directions while maintaining contact, and then releases it from the display screen.
  • A scroll operation in which the finger or the like is moved in the left direction is called a scroll operation in the left direction.
  • the scrolling operation is an operation of moving the image displayed on the display unit 40 in an arbitrary direction on the display unit 40.
  • the operation unit 43 may be an operation member different from the touch sensor.
  • For example, when the display unit 40 is a head mounted display, the operation unit 43 may be a sensor that detects the displacement (direction, position, etc.) of the head mounted display in accordance with the movement of the user's head.
  • the image displayed on the display unit 40 moves by an amount corresponding to the displacement of the head mounted display. For example, by performing an operation of shaking the neck to the left, the image displayed on the display unit 40 is moved to the right and displayed.
  • the operation member used for the operation unit 43 is not limited to the above-described one as long as the image displayed on the display unit 40 is moved in an arbitrary direction on the display unit 40.
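  • As a sketch of how such operation members could be mapped to a movement of the displayed image (the patent does not specify an implementation; the function names and the proportional mapping are assumptions):

```python
def offset_from_touch_scroll(dx_pixels):
    # A leftward slide (negative dx) moves the displayed image leftward
    # by the same number of pixels.
    return dx_pixels

def offset_from_head_motion(yaw_delta_deg, image_width):
    # Shaking the head to the left (negative yaw) moves the displayed
    # image to the right by a proportional number of pixels.
    return int(-yaw_delta_deg / 360.0 * image_width)
```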
  • FIG. 5 is an explanatory diagram of the omnidirectional image reproduction process.
  • FIG. 5A, FIG. 5C, and FIG. 5E are diagrams illustrating the omnidirectional image 73 to be reproduced.
  • the omnidirectional image 73 is an image in which the subject 74 and the subject 75 are captured.
  • FIGS. 5B, 5D, and 5F are diagrams illustrating the display screen of the display unit 40 that reproduces the omnidirectional image 73.
  • The control unit 42 cuts out a partial range 76 from the omnidirectional image 73 shown in FIG. 5A and displays it on the display unit 40, as shown in FIG. 5B. In FIG. 5B, the display unit 40 displays the range 76 including the subject 74.
  • As shown in FIGS. 5C and 5D, the control unit 42 then performs control to move the part of the omnidirectional image 73 displayed on the display unit 40 leftward and to display a portion of the omnidirectional image 73 that was not displayed on the display unit 40 at the time of FIG. 5B.
  • control unit 42 replaces the image displayed on the display unit 40 with a part of the omnidirectional image 73 displayed on the display unit 40 with another part on the right side of the omnidirectional image 73. .
  • the control unit 42 once erases a part of the omnidirectional image 73 currently displayed on the display unit 40, and changes the range 76 shown in FIG. 5A to the range 77 shown in FIG. 5C. Then, a new part of the omnidirectional image 73 corresponding to the range 77 is displayed on the display unit 40. At this time, it appears to the user that the omnidirectional image 73 has moved leftward by a distance 78.
  • the distance 78 can be measured in units of pixels constituting the display unit 40.
  • For example, with the minimum scroll operation in the left direction of the screen, the omnidirectional image 73 moves to the left by one pixel on the display unit 40.
  • Thus, the distance 78 can be defined in units of pixels.
  • The control unit 42 repeats the above-described control. As a result, as shown in FIGS. 5E and 5F, the control unit 42 cuts out a range 79 including the subject 75 and displays it on the display unit 40. As described above, since the right end and the left end of the omnidirectional image 73 are continuous, when the user further repeats the scroll operation in the left direction of the screen, the control unit 42 comes to display the subject 74 on the display unit 40 again. That is, the display content of the display unit 40 returns to that shown in FIGS. 5A and 5B.
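  • The wrap-around behavior described above (the left and right ends of the omnidirectional image 73 are continuous, so repeated leftward scrolling eventually shows the subject 74 again) can be sketched with modular arithmetic; a minimal sketch assuming the all-round image is a NumPy array:

```python
import numpy as np

def visible_window(allround, left_edge, view_width):
    """Return the part of the all-round image shown on the display unit,
    wrapping past the right end back to the left end."""
    height, width = allround.shape[:2]
    cols = (left_edge + np.arange(view_width)) % width  # wrap at 360 degrees
    return allround[:, cols]

# Each leftward scroll advances the window; once the offset has advanced
# by the full image width, the same range (and subject) is shown again.
image = np.zeros((1920, 3840, 3), dtype=np.uint8)
offset = 0
for _ in range(5):
    shown = visible_window(image, offset, 960)
    offset = (offset + 40) % 3840  # the image moved 40 pixels per scroll
```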
  • In other words, the omnidirectional image 73 is an image in which the subject 74 and the subject 75 are captured, and is image data with which, by repeating the control of moving the part of the omnidirectional image 73 displayed on the display unit 40 leftward and displaying a portion of the omnidirectional image 73 not displayed on the display unit 40, the subject 75 is displayed after the subject 74 is displayed, and the subject 74 is then displayed on the display unit 40 again.
  • The distance that the displayed image moves while the scroll operation in the left direction of the screen is repeated, from when the subject 74 is displayed until the subject 75 is displayed on the display unit 40, is the distance 80 (FIG. 5E) and is referred to as the distance in the left direction from the subject 74 to the subject 75.
  • Similarly, the distance 81 (FIG. 5E) is referred to as the distance in the left direction from the subject 75 to the subject 74.
  • The same applies to directions other than the left direction. For example, if the subject 75 is positioned above the subject 74 rather than to its right, the scroll operation is repeated not in the horizontal direction of the screen but, for example, in the upward direction of the screen.
  • In that case, the subject 74 disappears from the screen after being displayed, the subject 75 is then displayed, and the subject 74 is displayed again. That is, any two subjects captured in the omnidirectional image 73 can be displayed as described above by keeping the direction of the scroll operation constant.
  • the playback device 4 of the present embodiment cuts out a part of the omnidirectional image having an angle of view of 360 degrees in the vertical direction and the horizontal direction, and plays it back on the two-dimensional plane display screen.
  • the omnidirectional image has been described as a still image.
  • An omnidirectional moving image in which each frame is an omnidirectional image can be reproduced by the same processing.
  • The description is exactly the same as above; the only difference is that each frame (omnidirectional image) constituting the omnidirectional moving image changes with time.
  • the main subject can be displayed on the display unit 40 and visually recognized by the user performing a scrolling operation in an arbitrary direction.
  • However, during reproduction of a moving image, each frame (omnidirectional image) constituting the omnidirectional moving image is displayed on the display unit 40 for only a very short time, so it is difficult to display the portion of the frame that is not shown on the display unit 40.
  • Moreover, the result of a scroll operation is reflected on the display unit 40 only from the next frame onward. As a result, for example, the user may miss a scene in which a main subject is moving in a portion that is not currently reproduced on the display screen.
  • The user may not even notice that the omnidirectional moving image contains a main subject he or she has not seen. Moreover, it is necessary to perform the scroll operation described above and adjust the display position every time the moving image is viewed, which is troublesome. Furthermore, when two noteworthy main subjects are moving at different locations, the moving image must be reproduced multiple times in order to see both of them. Thus, reproduction of an omnidirectional moving image places a heavy burden on the user. Therefore, the image processing system 1 according to the present embodiment solves these problems by automatically creating a two-dimensional moving image focusing on appropriate subjects from the omnidirectional moving image and reproducing that two-dimensional moving image.
  • the playback process (display process) of the two-dimensional video by the playback device 4 will be described.
  • the two-dimensional moving image is composed of a plurality of two-dimensional images arranged in time series. Each two-dimensional image constituting the two-dimensional moving image is called a frame.
  • the control unit 42 reproduces the two-dimensional moving image by displaying the plurality of frames on the display unit 40 in order.
  • the input of the omnidirectional video from the imaging device 2 to the image processing device 3 may be performed by a method that does not use the storage medium 51.
  • the imaging device 2 and the image processing device 3 may be electrically connected by a communication cable, and the omnidirectional video may be input to the image processing device 3 by data communication.
  • the imaging device 2 and the image processing device 3 may exchange omnidirectional moving images by wireless communication via radio waves. The same applies to the input of the omnidirectional image from the imaging device 2 to the playback device 4 and the input of the two-dimensional moving image from the image processing device 3 to the playback device 4.
  • the two-dimensional video creation process executed by the image generation unit 30 will be described.
  • the image generation unit 30 creates a 2D moving image from the omnidirectional moving image by executing a 2D moving image creation process.
  • the two-dimensional moving image creating process is a process for identifying a main subject from the omnidirectional image and creating a two-dimensional moving image including the identified main subject.
  • the 2D video creation process includes a subject identification process and a 2D image creation process.
  • the subject specifying process is a process for specifying the main subject from the omnidirectional image included in the omnidirectional video.
  • the two-dimensional image creation process is a process for creating a two-dimensional image including the main subject specified by the subject specifying process from the omnidirectional image.
  • the subject specifying process and the two-dimensional image creation process will be described in order.
  • The image generation unit 30 specifies a main subject from each frame included in one omnidirectional moving image using a known technique such as face recognition or pattern matching. For example, when the main subject is a person, a face included in the omnidirectional image is detected by a technique for recognizing a human face, and the whole body of the person corresponding to that face can be specified from its orientation, position, color, and so on. Note that "specifying the main subject" means recognizing (detecting) the positions and shapes of the various subjects appearing in the omnidirectional image and selecting the main subject from among them. For example, when the main subject is a person and three or more persons are detected from the omnidirectional image, the image generation unit 30 specifies all of those persons as main subjects.
  • The recognition of the main subject can be determined based on various factors (parameters) such as the size and saliency of the subject in the image. The determination can also be made based on the movement of the subject by using a plurality of temporally continuous images instead of a single image. By quantifying these parameters and using threshold processing, a subject whose value is equal to or greater than a predetermined threshold can be set as a main subject; with threshold processing, a plurality of subjects may be recognized as main subjects. There may be one or more main subjects. Since an omnidirectional image captures a 360-degree range, a plurality of subjects are more likely to be recognized as main subjects than when shooting with a normal camera.
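  • As an illustration only, a subject specifying process based on face detection plus threshold processing could look like the following sketch. The OpenCV cascade detector and the size-based score are assumptions; the patent equally allows pattern matching, saliency, or motion as the parameters.

```python
import cv2

def identify_main_subjects(omni_image, min_score=0.5):
    """Detect candidate subjects (faces here) and keep those whose score
    is at or above a threshold, so several main subjects may remain."""
    gray = cv2.cvtColor(omni_image, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    img_h, img_w = gray.shape
    subjects = []
    for (x, y, w, h) in faces:
        score = (w * h) / float(img_w * img_h) * 1000.0  # size-based score (assumed)
        if score >= min_score:
            subjects.append({"box": (x, y, w, h), "score": score})
    return subjects
```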
  • the two-dimensional image creation process is a process for creating a two-dimensional image including a main subject from each frame of the omnidirectional video.
  • the image processing system 1 automatically creates a two-dimensional moving image including a main subject from an omnidirectional moving image. Each image constituting the two-dimensional moving image is called a frame.
  • the two-dimensional image creation process is a process for creating a two-dimensional image (frame) including the main subject specified by the subject specifying process from the omnidirectional image. In the two-dimensional image creation process, if there is one main subject, a frame including one main subject is generated, and if there are two main subjects, a frame including two main subjects is generated.
  • FIGS. 7A to 7C are diagrams illustrating two-dimensional images (frames) generated by the two-dimensional image creation process when two main subjects are recognized.
  • the first subject 201 and the second subject 202 are subjects that are recognized as main subjects.
  • the two-dimensional image 610 illustrated in FIG. 7A is a two-dimensional image (frame) obtained by cutting out a partial image (view angle) including the first subject 201 and the second subject 202 from the omnidirectional image. .
  • Alternatively, as shown in FIG. 7B, a partial image containing the first subject 201 and a partial image containing the second subject 202 may each be cut out from the omnidirectional image, and the two cut-out partial images may be pasted together vertically or horizontally to create a two-dimensional image 611.
  • Further, as shown in FIG. 7C, a two-dimensional image 612 may be created by superimposing a partial image 613, obtained by extracting the first subject 201 from the omnidirectional image, on an image obtained by cutting out a wide range including the second subject 202 from the omnidirectional image.
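  • The three layouts of FIGS. 7A to 7C can be sketched as simple array operations (a sketch only; the crop boxes would come from the subject specifying process, and the helper names are illustrative):

```python
import numpy as np

def crop(img, box):
    x, y, w, h = box
    return img[y:y + h, x:x + w]

def frame_single_view(omni, box):
    # FIG. 7A: one cut-out whose angle of view contains both subjects.
    return crop(omni, box)

def frame_side_by_side(omni, box1, box2):
    # FIG. 7B: two cut-outs pasted together left and right
    # (the crop heights must match for hstack).
    return np.hstack([crop(omni, box1), crop(omni, box2)])

def frame_superimposed(omni, wide_box, inset_box, margin=16):
    # FIG. 7C: a partial image of the first subject overlaid on a wide
    # cut-out containing the second subject (inset must fit inside).
    base = crop(omni, wide_box).copy()
    inset = crop(omni, inset_box)
    base[margin:margin + inset.shape[0], margin:margin + inset.shape[1]] = inset
    return base
```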
  • A problem in the two-dimensional image creation process will be described using an example in which a volleyball game is imaged with the imaging device 2.
  • FIG. 6 is an explanatory diagram of the two-dimensional image creation process.
  • FIG. 6A is a top view of a volleyball court.
  • The imaging device 2 is installed at the center of the court 200.
  • On the left side of the court 200 is a person who is a main subject (hereinafter referred to as the first subject 201).
  • On the right side of the court 200 is a person who is another main subject (hereinafter referred to as the second subject 202). That is, this is a case where two main subjects are recognized.
  • FIG. 6B shows the arrangement of the first subject 201 and the second subject 202 in a three-dimensional space centered on the imaging device 2.
  • The image generation unit 30 cuts out a two-dimensional image (frame) including the first subject 201 and the second subject 202 from the omnidirectional image, as shown in FIG. 7A. If the image generation unit 30 generates the two-dimensional image (frame) so as to include the path 204 in FIG. 6B, the generated two-dimensional image 610 has the first subject 201 arranged on the right side and the second subject 202 arranged on the left side. On the other hand, if the image generation unit 30 generates the two-dimensional image (frame) so as to include the path 209 in FIG. 6B, the generated two-dimensional image 610 has the first subject 201 arranged on the left side and the second subject 202 arranged on the right side.
  • In other words, at least two arrangements of the first subject 201 and the second subject 202 are possible: one in which the first subject 201 is on the left side and the second subject 202 is on the right side, and one in which the first subject 201 is on the right side and the second subject 202 is on the left side.
  • FIG. 7A shows the path 204 shown in FIG. 6B for easy understanding.
  • the image generation unit 30 needs to generate a two-dimensional image (frame) in which a plurality of main subjects (two main subjects) are appropriately arranged.
  • the image generation unit 30 creates a two-dimensional image including main subjects of both the first subject 201 and the second subject 202 in the two-dimensional image creation process.
  • the image generation unit 30 determines the angle of view including the first subject 201 and the second subject 202 as the angle of view of the two-dimensional image using the positions of the first subject 201 and the second subject 202 specified by the subject specifying process. .
  • For example, the image generation unit 30 determines the angle of view 203, which includes the first subject 201 and the second subject 202, as the angle of view of the two-dimensional image.
  • the image generation unit 30 creates a two-dimensional image by cutting out the content corresponding to the angle of view 203 from the omnidirectional image and transforming it into a rectangle.
  • For example, in FIG. 6B, there are many angles of view that include the first subject 201 and the second subject 202, such as the angle of view 203 and the angle of view 205.
  • From among these many angles of view, the image generation unit 30 selects the angle of view 203, which includes the shortest path 204 connecting the first subject 201 and the second subject 202 in three-dimensional space and also includes the first subject 201 and the second subject 202.
  • Note that "including the shortest path 204" can also be regarded as "including a third subject, different from the first subject 201 and the second subject 202, that exists on the shortest path 204".
  • a method of specifying “the shortest path connecting the first subject 201 and the second subject 202 in the omnidirectional image” will be described.
  • When the omnidirectional sphere 60 is cut by a plane passing through the center of the omnidirectional sphere 60 and through the first subject 201 and the second subject 202, a part of the circumference of the cross section of the omnidirectional sphere 60 is the shortest path connecting the first subject 201 and the second subject 202.
  • In FIG. 6B, when the first subject 201 and the second subject 202 are regarded as points, the circumference of this cross section of the celestial sphere 60 can be said to be the combination of the path 204 and the path 209.
  • The shorter of the path 204 and the path 209 is the shortest path. That is, the shortest path connecting the first subject 201 and the second subject 202 in the omnidirectional image is the path 204. Except when the first subject 201 and the second subject 202 are located at diametrically opposite points of the omnidirectional sphere 60, the shortest path can be uniquely identified.
  • Specifically, the image generation unit 30 calculates the shortest path between the first subject 201 and the second subject 202 as follows. In the all-round image 206 shown in FIG. 6C, the image generation unit 30 arranges the second subject 202 to the right of the first subject 201 (or prepares the all-round image 206 with the second subject 202 arranged in the right direction of the first subject 201). As a result, the straight line connecting the first subject 201 and the second subject 202 in the all-round image 206 coincides with the circumference of the cross section obtained when the omnidirectional sphere 60 is cut by the plane passing through the center of the omnidirectional sphere 60 and through the first subject 201 and the second subject 202.
  • the image generation unit 30 compares the path 209 from the second subject 202 to the first subject 201 in the left direction with the shortest path 204 from the first subject 201 to the second subject 202 in the left direction. For comparison, the image generation unit 30 calculates a distance 208 (hereinafter referred to as a first distance 208) to the second subject 202 in the left direction of the first subject 201. Similarly, the image generation unit 30 calculates a distance 207 to the first subject 201 in the left direction of the second subject 202 (hereinafter referred to as a second distance 207). The distance can be calculated by counting the pixels constituting the omnidirectional image (circumferential image) 206. The image generation unit 30 compares the first distance 208 and the second distance 207. In the example of FIG. 6, the second distance 207 is longer than the first distance 208.
  • The image generation unit 30 compares the first distance 208 and the second distance 207 and, in this example, determines that the second distance 207 is longer than the first distance 208. Therefore, the image generation unit 30 generates a two-dimensional image (frame) in which the first subject 201 is placed on the right and the second subject 202 is placed on the left. On the other hand, when the first distance 208 is longer than the second distance 207, the image generation unit 30 conversely generates a two-dimensional image (frame) in which the first subject 201 is placed on the left and the second subject 202 is placed on the right.
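  • On the all-round image, this comparison reduces to modular arithmetic on the horizontal pixel coordinates of the two subjects. A minimal sketch, assuming (as in FIG. 5) that a leftward scroll of d pixels reveals the content d pixels to the right, with wrap-around; the function names are illustrative:

```python
def leftward_distance(x_from, x_to, width):
    # Pixels the image must move under repeated leftward scrolling from
    # when the subject at x_from is shown until the subject at x_to is
    # shown, wrapping around the continuous left/right ends.
    return (x_to - x_from) % width

def arrange_subjects(x1, x2, width):
    """Return the (left, right) placement of subject 1 and subject 2.

    first_distance corresponds to the first distance 208 (subject 1 to
    subject 2); second_distance to the second distance 207 (subject 2 to
    subject 1)."""
    first_distance = leftward_distance(x1, x2, width)
    second_distance = leftward_distance(x2, x1, width)
    if second_distance > first_distance:
        return ("subject 2", "subject 1")  # subject 1 right, subject 2 left
    return ("subject 1", "subject 2")      # subject 1 left, subject 2 right
```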
  • Alternatively, the image generation unit 30 may generate a two-dimensional image (frame) as follows.
  • When the image generation unit 30 compares the first distance 208 and the second distance 207 and determines that the second distance 207 is longer than the first distance 208, the image generation unit 30 determines the angle of view so as to include the shortest path 204 together with the first subject 201 and the second subject 202.
  • The image generation unit 30 then generates a two-dimensional image (frame) in which the first subject 201 is placed on the right and the second subject 202 is placed on the left.
  • The first distance 208 described above corresponds to the distance that the image displayed on the display unit 40 moves, when the user repeats the leftward scroll operation in the omnidirectional image reproduction process by the reproduction device 4 described above, from when the first subject 201 is displayed on the display unit 40 until the second subject 202 is displayed on the display unit 40.
  • Similarly, the second distance 207 described above corresponds to the distance that the image displayed on the display unit 40 moves from when the second subject 202 is displayed on the display unit 40 until the first subject 201 is displayed on the display unit 40.
  • That is, when the distance that the displayed image moves from when the first subject 201 is displayed on the display unit 40 until the second subject 202 is displayed is shorter than the distance it moves from when the second subject 202 is displayed until the first subject 201 is displayed, the image generation unit 30 generates, from the omnidirectional image, a two-dimensional image that includes the first subject 201 and the second subject 202 and in which the second subject 202 is arranged on the left side (first direction side) of the first subject 201.
  • The meaning of "the left direction side (first direction side) of the first subject 201" will be described in detail with reference to FIG. 7D.
  • In FIG. 7D, the position 201a of the first subject 201 and the position 202a of the second subject 202 are represented by points for the sake of simplicity.
  • Consider a vector 615 whose start point is the position 201a of the first subject 201 and whose end point is the position 202a of the second subject 202.
  • The vector 615 can be decomposed into a component 616 in the left direction (lateral direction) and a component 617 in the direction orthogonal to the left direction (vertical direction).
  • "Arranging the second subject 202 on the left side (first direction side) of the first subject 201" means that this vector 615 has a positive component in the left direction (first direction). The state of the component 617 in the direction orthogonal to the left direction (first direction) does not matter.
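  • Stated in standard notation (a restatement of the text above, with the unit vector in the left (first) direction written as a symbol):

```latex
\vec{v}_{615} = p_{202a} - p_{201a}, \qquad
\vec{v}_{615} = \underbrace{(\vec{v}_{615} \cdot \hat{u}_L)\,\hat{u}_L}_{\text{component 616}}
  + \underbrace{\vec{v}_{615} - (\vec{v}_{615} \cdot \hat{u}_L)\,\hat{u}_L}_{\text{component 617}}

\text{subject 202 lies on the left (first direction) side of subject 201}
  \iff \vec{v}_{615} \cdot \hat{u}_L > 0
```

The orthogonal component 617 places no constraint on the arrangement.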
  • The generation of a two-dimensional image (frame) by the image generation unit 30 was described in FIG. 6 using the first distance 208 and the second distance 207, but it can also be described using angles in three-dimensional space instead of distances.
  • In FIG. 6B, consider the angle formed by the vector from the origin O to the first subject 201 and the vector from the origin O to the second subject 202.
  • Two angles can be considered as the angle formed by these two vectors: an acute angle and an obtuse angle.
  • The acute angle corresponds to the shortest path 204 and the angle of view 203, and the obtuse angle corresponds to the path 209 and the angle of view 205.
  • the image generation unit 30 determines the angle of view so that the angle formed by these two vectors is minimized and the first subject 201 and the second subject 202 are included.
  • the image generation unit 30 generates a two-dimensional image (frame) so that the first subject 201 is placed on the right and the second subject 202 is placed on the left.
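  • In standard notation, with $\vec{a}$ and $\vec{b}$ the vectors from the origin O to the first subject 201 and to the second subject 202, the angle used for this determination is

```latex
\theta = \arccos\!\left( \frac{\vec{a} \cdot \vec{b}}{\lVert \vec{a} \rVert\, \lVert \vec{b} \rVert} \right),
\qquad 0 \le \theta \le \pi
```

The smaller angle $\theta$ corresponds to the shortest path 204 and the angle of view 203, and the remaining angle $2\pi - \theta$ corresponds to the path 209 and the angle of view 205.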
  • the image generation unit 30 generates (creates) a two-dimensional image (frame) by the processing described above.
  • the image generation unit 30 generates (creates) a two-dimensional moving image including these two-dimensional images and stores the generated two-dimensional moving image in the storage medium 52.
  • FIG. 8 is a flowchart of the two-dimensional video creation process.
  • the image generation unit 30 performs subject identification processing for each frame included in the omnidirectional video. Thereby, the main subject is specified for each frame.
  • In step S30, the image generation unit 30 selects one frame included in the omnidirectional moving image.
  • The image generation unit 30 then acquires the number of main subjects specified in the selected frame. If there is one main subject (step S30: YES), the process proceeds to step S35; if there are two main subjects, the process proceeds to step S40.
  • step S35 the image generating unit 30 creates a two-dimensional image (frame) including the main subject based on the omnidirectional image (frame).
  • step S40 the image generation unit 30 calculates the first distance in the selected frame. That is, the image generation unit 30 calculates the distance from the first subject to the second subject in the first direction, with one of the two main subjects being the first subject and the other being the second subject.
  • step S50 the image generation unit 30 calculates the second distance in the selected frame. That is, the image generation unit 30 calculates the distance from the second subject to the first subject in the first direction.
  • Here, the first direction is the direction such that, when a part of a frame included in the omnidirectional moving image is displayed on the display unit 40 and the user repeats a scroll operation in that fixed direction, the first subject 201 disappears from the display unit 40 after being displayed, the second subject 202 is then displayed, and the first subject 201 is displayed on the display unit 40 again.
  • In step S60, the image generation unit 30 determines whether or not the first distance calculated in step S40 is longer than the second distance calculated in step S50.
  • If the first distance is not longer than the second distance, the image generation unit 30 advances the process to step S70; otherwise, it advances the process to step S80.
  • In step S70, the image generation unit 30 creates, based on the omnidirectional image (frame) selected in step S30, a two-dimensional image in which the second subject is arranged on the first direction side of the first subject.
  • In step S80, the image generation unit 30 creates, based on the frame selected in step S30, a two-dimensional image in which the first subject is arranged on the first direction side of the second subject.
  • step S90 the image generation unit 30 determines whether an unselected frame remains in the omnidirectional video. If an unselected frame remains, the image generation unit 30 advances the process to step S30. On the other hand, if all the frames have already been selected, the image generation unit 30 advances the process to step S100.
  • In step S100, the image generation unit 30 controls the output unit 32 to store the two-dimensional moving image including the two-dimensional images created in steps S70 and S80 in the storage medium 52.
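  • Put together, the flow of FIG. 8 can be sketched as follows, reusing the illustrative helpers from the earlier sketches (identify_main_subjects, leftward_distance); make_single_subject_frame and make_two_subject_frame are likewise hypothetical stand-ins for the two-dimensional image creation of steps S35, S70, and S80:

```python
def create_two_dimensional_video(omni_frames, width):
    video = []
    for omni in omni_frames:                        # select one frame (step S30)
        subjects = identify_main_subjects(omni)     # subject specifying process
        if len(subjects) == 1:                      # one main subject
            video.append(make_single_subject_frame(omni, subjects[0]))  # S35
        elif len(subjects) >= 2:
            x1 = subjects[0]["box"][0]
            x2 = subjects[1]["box"][0]
            d1 = leftward_distance(x1, x2, width)   # first distance  (S40)
            d2 = leftward_distance(x2, x1, width)   # second distance (S50)
            if d1 <= d2:                            # comparison (S60)
                # second subject on the first direction side (S70)
                video.append(make_two_subject_frame(omni,
                                                    left=subjects[1],
                                                    right=subjects[0]))
            else:
                # first subject on the first direction side (S80)
                video.append(make_two_subject_frame(omni,
                                                    left=subjects[0],
                                                    right=subjects[1]))
    return video                                    # stored via output unit (S100)
```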
  • a two-dimensional image suitable for viewing can be automatically generated from an omnidirectional image.
  • a single device may have two or more of the imaging unit 20, the image generation unit 30, and the display unit 40.
  • the imaging device 2 may include the image generation unit 30 in addition to the imaging unit 20.
  • the imaging device 2 also serves as the image processing device 3. Therefore, the image processing apparatus 3 may not be included in the image processing system 1.
  • the image processing apparatus 3 may include a display unit 40 in addition to the image generation unit 30.
  • the image processing device 3 also serves as the playback device 4. Therefore, the playback device 4 may not be included in the image processing system 1.
  • the imaging device 2 may include an image generation unit 30 and a display unit 40 in addition to the imaging unit 20. In this case, the imaging device 2 serves as the image processing device 3 and the playback device 4. That is, the imaging device 2 provides a function equivalent to that of the image processing system 1 alone.
  • FIG. 10 is a block diagram schematically showing an electronic apparatus 1000 that combines the image processing device 3 and the playback device 4.
  • the electronic device 1000 is, for example, a smartphone or a tablet terminal.
  • the electronic device 1000 includes an image generation unit 30, an input unit 31, an output unit 32, a display unit 40, a control unit 42, and an operation unit 43.
  • The electronic device 1000 can generate a two-dimensional moving image, reproduce the generated two-dimensional moving image on the display unit 40, store the generated two-dimensional moving image in the storage medium 52, and reproduce an omnidirectional image (omnidirectional moving image) on the display unit 40. The operation of each part of the electronic device 1000 is the same as in the first embodiment, and a description thereof is omitted.
  • The creation of the two-dimensional moving image by the image generation unit 30 may be performed in real time in parallel with the creation of the omnidirectional moving image by the imaging unit 20, or may be started after the creation of the omnidirectional moving image is completed.
  • Likewise, the display of the two-dimensional moving image by the display unit 40 may be performed in real time in parallel with the creation of the two-dimensional moving image by the image generation unit 30, or may be started after the creation of the two-dimensional moving image is completed.
  • In the description above, the imaging unit 20 images the omnidirectional sphere, that is, a 360-degree range around the imaging unit 20. However, the imaging unit 20 may be capable of capturing only a range narrower than the omnidirectional sphere in the vertical direction and/or the horizontal direction.
  • For example, the imaging unit 20 may be configured to be able to image a hemisphere.
  • The imaging unit 20 may even be able to capture only an area narrower than a hemisphere; for example, only the range 68 indicated by shading in FIG. 3A may be imaged.
  • When the angle of view of the imaging unit 20 is narrower than the omnidirectional sphere, each frame of the two-dimensional moving image is accordingly composed of an image with a narrower angle of view.
  • the all-round image does not necessarily have to be an image obtained by imaging the entire range of 360 degrees.
  • an image obtained by imaging a range of about 300 degrees can be handled as an all-round image in which the left and right ends are connected.
  • Even in this case, by repeating the control of moving the part of the image displayed on the display unit 40 in the first direction and displaying the part of the image not displayed on the display unit 40, the second subject 202 is displayed after the first subject 201 included in the image is displayed, and the first subject 201 is then displayed on the display unit 40 again.
  • For an image in which a part of the celestial sphere is missing, the missing part can be treated as continuous, so that moving the part of the image displayed on the display unit 40 in the first direction likewise causes the second subject 202 to be displayed after the first subject 201 included in the image is displayed, and the first subject 201 to be displayed on the display unit 40 again. Such an image can therefore be treated as an omnidirectional image.
  • FIG. 9A is a schematic diagram showing an example of an omnidirectional image.
  • The image 620 and the image 621, each corresponding to a hemisphere, are images obtained by imaging a range smaller than 360 degrees, and a part of each hemisphere is missing.
  • In this case, the image generation unit 30 and the control unit 42 can handle the side EF and the side GF as being continuous. That is, the image shown in FIG. 9A can be treated as an omnidirectional image.
  • FIG. 9B is a schematic diagram showing an example of an all-round image.
  • FIG. 9B illustrates an image 622, an image 623, and an image 624 obtained by capturing a discontinuous range in the horizontal direction. These three images are images obtained by capturing a range smaller than 360 degrees in the horizontal direction, and a part of 360 degrees is missing.
  • Even in such a case, the image generation unit 30 and the control unit 42 can treat the side AB, the side CD, and the side EF as continuous. Specifically, the control unit 42 repeats the control of moving the image 622 in the horizontal left direction while a part of the image 622 is displayed on the display unit 40, and displaying a portion of the image 622 that is not displayed on the display unit 40.
  • In this sense, the image 622, the image 623, and the image 624 shown in FIG. 9B constitute an all-round image: if appropriate subjects included in these images are taken as the first subject 201 and the second subject 202, then by repeating the control of moving the displayed part of the image in the first direction and displaying the part of the image not displayed on the display unit 40, the second subject 202 is displayed after the first subject 201 included in the image is displayed, and the first subject 201 is displayed again.
  • Note that the continuity of the image content does not matter here. For example, when an image including the side CD is displayed on the display unit 40, the user may not perceive the image content on the left side of the side CD and the image content on the right side of the side CD as continuous. However, what is important is not the continuity of the image content but the continuity of the image itself: the image on the left side of the side CD and the image on the right side of the side CD need only be handled as continuous.
  • the image 622 can be said to be an all-round image by continuously handling the sides AB and CD.
  • the side CD and the side EF are handled in succession so that the image 623 can be said to be an all-round image. All the images are handled in the same manner, so that a part of the image displayed on the display unit 40 is moved in the first direction, and a control for displaying a portion of the image not displayed on the display unit 40 is repeated.
  • the second subject 202 is displayed after the first subject 201 included in the image is displayed, and the first subject 201 is displayed again on the display unit 40. ).
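  • A minimal Python sketch of this wrap-around display control, assuming an equirectangular all-round image stored as a NumPy array whose left and right edges are treated as continuous (function and variable names are hypothetical, not part of the original disclosure):

```python
import numpy as np

def scroll_view(image, offset, view_width):
    """Return the part of an all-round image visible in the viewport.

    Columns are indexed modulo the image width, so the left and right
    edges (e.g. side AB and side CD) behave as if continuous.
    """
    h, w = image.shape[:2]
    cols = np.arange(offset, offset + view_width) % w
    return image[:, cols]

# Repeating the control: moving the displayed part in the first
# direction eventually shows every subject and then the first one again.
image = np.zeros((90, 360, 3), dtype=np.uint8)   # toy 360-degree image
offset = 0
for _ in range(360):                              # one full revolution
    frame = scroll_view(image, offset, view_width=100)
    offset = (offset + 1) % image.shape[1]
```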
  • Two-dimensional image generation by the image generation unit 30 is possible even for an all-round or omnidirectional image that is not actually continuous (the image 622, the image 623, the image 624).
  • The processing itself is no different from the above-described embodiment. That is, how to generate a two-dimensional image (frame) containing the first subject 201 and the second subject 202 from the all-round image (the image 622, the image 623, the image 624) can be judged in the same manner as in the above-described embodiment.
  • For example, compare the distance the displayed image moves while scrolling leftward from when the first subject 201 is displayed on the display unit 40 until the second subject 202 is displayed, with the distance it moves while scrolling leftward from the second subject 202 until the first subject 201 is displayed again. If the former distance is longer than the latter, the two-dimensional image may be created so that the second subject 202 is arranged to the left of the first subject 201 (this decision is sketched below).
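  • The distance comparison above can be written down directly; a minimal sketch, assuming the horizontal positions of the two subjects are given as pixel columns on the all-round image (names are hypothetical):

```python
def arrangement(x_first, x_second, width):
    """Decide where to place the second subject relative to the first.

    d_forward: distance scrolled from the first subject until the second
    subject appears; d_backward: distance scrolled from the second
    subject until the first subject appears again.
    """
    d_forward = (x_second - x_first) % width
    d_backward = (x_first - x_second) % width
    # If the forward distance is longer, the second subject is closer on
    # the other side, so it is arranged to the left of the first subject.
    return "second_on_left" if d_forward > d_backward else "second_on_right"
```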
  • The two-dimensional image (frame) including the first subject 201 and the second subject 202 may also be created by a method other than those shown in FIGS. 7A to 7C.
  • For example, a two-dimensional image in which the space between the first subject 201 and the second subject 202 is compressed may be created using a technique such as seam carving.
  • Alternatively, a two-dimensional image may be created by thinning out or reducing subjects located between the first subject 201 and the second subject 202 (a simplified sketch of this idea follows).
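  • Seam carving itself is a content-aware technique; as a simpler hypothetical stand-in for the "thinning or reducing" alternative, the sketch below merely decimates the columns between the two subjects (assuming x_left < x_right and a NumPy image layout):

```python
import numpy as np

def compress_between(image, x_left, x_right, keep_every=2):
    """Shrink the band strictly between two subjects by column decimation.

    A crude stand-in for seam carving: instead of removing low-energy
    seams, keep only every `keep_every`-th column of the middle band.
    """
    left = image[:, :x_left]
    middle = image[:, x_left:x_right:keep_every]   # thinned band
    right = image[:, x_right:]
    return np.concatenate([left, middle, right], axis=1)
```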
  • The image generation unit 30 may execute the subject specifying process only for some frames instead of for all frames. For example, the image generation unit 30 identifies the main subject every 30 frames: in the 1st frame, the 31st frame, the 61st frame, and so on. The subject specifying process is not executed for the 29 frames between the 1st frame and the 31st frame.
  • When the frame rate of the omnidirectional video is 60 fps, 30 frames correspond to 0.5 seconds. Over a period of about 0.5 seconds, the position of the main subject is expected to change very little, so its position in those 29 frames can easily be estimated from its position in the 1st frame and its position in the 31st frame.
  • Since the subject specifying process is executed for only some of the omnidirectional images (the main subject is specified from only some of them), the amount of calculation necessary for executing the two-dimensional video creation process can be reduced (a sketch of this keyframe scheme follows).
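  • A sketch of this keyframe-and-interpolation scheme; `detect_main_subject` is a hypothetical stand-in for the subject specifying process, positions are treated as scalars or NumPy arrays, and wrap-around at the image seam is ignored for simplicity:

```python
def subject_positions(frames, detect_main_subject, step=30):
    """Run the expensive subject specifying process only on keyframes
    (frames 0, 30, 60, ...) and linearly interpolate in between."""
    positions = {}
    keyframes = range(0, len(frames), step)
    for k in keyframes:
        positions[k] = detect_main_subject(frames[k])
    for k in keyframes:
        if k + step >= len(frames):
            break                      # frames after the last keyframe get no estimate
        x0, x1 = positions[k], positions[k + step]
        for i in range(1, step):
            t = i / step
            positions[k + i] = (1 - t) * x0 + t * x1   # linear estimate
    return positions
```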
  • In the second embodiment, the content of the two-dimensional moving image creation process executed by the image generation unit 30 differs from that of the first embodiment. Points not mentioned in the present embodiment are the same as in the first embodiment; that is, everything described in the first embodiment is incorporated into the second embodiment.
  • the image processing system according to the second embodiment will be described focusing on differences from the image processing system according to the first embodiment.
  • the image generation unit 30 executes subject specifying processing for each frame, as in the first embodiment.
  • For each main subject specified in this way, the image generation unit 30 executes direction specifying processing that specifies the direction in which the main subject faces within the frame.
  • When the main subject is a person, the direction of the main subject is the direction of the person's face in the image.
  • the image generation unit 30 executes a well-known face recognition process to recognize the face of the main subject and the face direction.
  • the image generation unit 30 specifies the face orientation of the main subject in the image as the orientation of the main subject.
  • First, the orientation of the main subject in three-dimensional space is determined. For example, the direction in which the nose is facing is set as the direction of the main subject; specifically, the direction of the vector starting from the center of the face and ending at the tip of the nose can be used as the direction of the main subject.
  • a method for determining the orientation of the main subject in the three-dimensional space will be described later.
  • Next, this vector is projected onto the image (that is, onto the imaging surface).
  • The projected vector (projection vector) on the two-dimensional image in which the main subject is captured gives the direction of the main subject in the image (this projection step is sketched below).
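  • A sketch of the projection step under a simplifying orthographic assumption (a real implementation would project through the camera model of the imaging unit 20; the 3D points are assumed to come from the face recognition process):

```python
import numpy as np

def orientation_in_image(face_center_3d, nose_tip_3d):
    """Project the 3D face-direction vector onto the image plane.

    The vector starts at the center of the face and ends at the tip of
    the nose; under an orthographic assumption, projection simply drops
    the depth (z) component.
    """
    v = np.asarray(nose_tip_3d, dtype=float) - np.asarray(face_center_3d, dtype=float)
    proj = v[:2]                      # keep x and y, drop depth
    norm = np.linalg.norm(proj)
    return proj / norm if norm > 0 else proj
```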
  • FIG. 11 is an explanatory diagram of a two-dimensional image creation process.
  • the omnidirectional image 300 shown in FIG. 11 includes a first subject 301 and a second subject 302 that are main subjects.
  • In FIG. 11, after the first subject 301 is displayed on the display unit 40, the displayed image moves until the second subject 302 is displayed on the display unit 40.
  • The distance the displayed image moves (the rightward distance from the first subject 301 to the second subject 302) is longer than the rightward distance from the second subject 302 to the first subject 301. Therefore, if a two-dimensional image were created by the same processing as in the first embodiment, a two-dimensional image in which the second subject 302 is arranged on the left side of the first subject 301 would be created.
  • Here, the image generation unit 30 can arrange the first subject 201 and the second subject 202 in at least two ways (the first subject 201 on the left and the second subject 202 on the right, or the first subject 201 on the right and the second subject 202 on the left).
  • When the two-dimensional image (or two-dimensional moving image) generated by the image generation unit 30 is reproduced, the ease of viewing for the user varies greatly depending on the arrangement method.
  • the image generation unit 30 needs to generate a two-dimensional image (frame) in which a plurality of main subjects (two main subjects) are appropriately arranged.
  • In the present embodiment, the image generation unit 30 creates a two-dimensional image in which the first subject 301 faces the second subject 302 in the image. In the image illustrated in FIG. 11, the first subject 301 faces rightward on the page.
  • Arranging the subjects in this way yields an image (video) that does not give the user a sense of incongruity.
  • Accordingly, the image generation unit 30 creates the two-dimensional image so that the first subject 301 is arranged on the left side of the second subject 302.
  • As described above, the projection vector gives the orientation of the main subject in the image; the vector shown in FIG. 11 is the projection vector of the first subject 301.
  • Take the first subject 301 as the origin, and take the X axis in the direction from the first subject 301 toward the second subject 302.
  • If the X-axis component of the projection vector of the first subject 301 is positive, it can be determined that the first subject 301 faces the second subject 302; if it is negative, it can be determined that the first subject 301 does not face the second subject 302 (this sign test is sketched below).
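  • The sign test, written out (positions and the projection vector are assumed to be 2D image coordinates; names are hypothetical):

```python
import numpy as np

def faces_toward(pos_first, pos_second, proj_vec):
    """True if the first subject faces the second subject.

    The X axis is taken from the first subject toward the second; the
    first subject faces the second when the X component of its
    projection vector is positive.
    """
    axis = np.asarray(pos_second, dtype=float) - np.asarray(pos_first, dtype=float)
    axis /= np.linalg.norm(axis)      # assumes the two positions differ
    return float(np.dot(proj_vec, axis)) > 0.0
```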
  • When the first distance, which the displayed image moves from when the first subject 301 is displayed on the display unit 40 until the second subject 302 is displayed, is longer than the second distance, which the displayed image moves from when the second subject 302 is displayed until the first subject 301 is displayed, and the first subject 301 does not face the second subject 302, a two-dimensional image that includes the first subject 301 and the second subject 302 and in which the first subject 301 is arranged on the first-direction side of the second subject 302 is generated from the omnidirectional image.
  • Although the direction of the person's nose has been described as an example, the direction of the face of the subject (person) may be used instead.
  • The direction of the face may be the direction in which the eyes are facing, or the normal direction of a plane obtained by modeling the face as a plane.
  • the orientation of the person's body may be adopted as the orientation of the subject instead of the orientation of the person's face.
  • the chest may be modeled as a plane, and the normal direction of the plane may be used as the body orientation.
  • Any definition may be used as long as the direction of the subject in three-dimensional space can be uniquely determined; the orientation may be defined appropriately according to the type of subject.
  • When the main subject is a vehicle or another moving body, the traveling direction (movement direction) of the vehicle may be used as the direction of the main subject.
  • When the main subject is a building, the direction of the entrance at the front of the building may be used as the direction of the main subject.
  • The image generation unit 30 can acquire the orientation of the main subject in three-dimensional space by image analysis of the omnidirectional image, from the output of a sensor provided separately from the imaging unit 20, or by distance measurement using the imaging unit 20. If a vector indicating the orientation of the main subject in three-dimensional space can be acquired, the projection vector can be obtained by projecting that vector. Then, as described above, the image generation unit 30 can calculate the orientation of the main subject in the image based on the projection vector.
  • a two-dimensional image suitable for viewing can be automatically generated from an omnidirectional image.
  • a single device may have two or more of the imaging unit 20, the image generation unit 30, and the display unit 40.
  • the imaging device 2 may include the image generation unit 30 in addition to the imaging unit 20.
  • the imaging device 2 also serves as the image processing device 3. Therefore, the image processing apparatus 3 may not be included in the image processing system 1.
  • the image processing apparatus 3 may include a display unit 40 in addition to the image generation unit 30.
  • the image processing device 3 also serves as the playback device 4. Therefore, the playback device 4 may not be included in the image processing system 1.
  • the imaging device 2 may include an image generation unit 30 and a display unit 40 in addition to the imaging unit 20. In this case, the imaging device 2 serves as the image processing device 3 and the playback device 4. That is, the imaging device 2 provides a function equivalent to that of the image processing system 1 alone.
  • FIG. 10 is a block diagram schematically showing an electronic apparatus 1000 that combines the image processing device 3 and the playback device 4.
  • the electronic device 1000 is, for example, a smartphone or a tablet terminal.
  • the electronic device 1000 includes an image generation unit 30, an input unit 31, an output unit 32, a display unit 40, a control unit 42, and an operation unit 43.
  • The electronic device 1000 can create a two-dimensional video, reproduce the created two-dimensional video on the display unit 40, store the created two-dimensional video in the storage medium 52, and reproduce the omnidirectional image on the display unit 40. Since the operation of each part of the electronic device 1000 is the same as in the first embodiment, its description is omitted.
  • The creation of the two-dimensional video by the image generation unit 30 may be performed in real time in parallel with the creation of the omnidirectional video by the imaging unit 20, or may be started after the creation of the omnidirectional video is completed.
  • Likewise, the display of the two-dimensional moving image by the display unit 40 may be performed in real time in parallel with the creation of the two-dimensional moving image by the image generation unit 30, or may be started after the creation of the two-dimensional moving image is completed.
  • In the third embodiment, the content of the two-dimensional moving image creation process executed by the image generation unit 30 differs from that of the first embodiment. Points not mentioned in the present embodiment are the same as in the first embodiment; that is, everything described in the first embodiment is incorporated into the third embodiment.
  • an image processing system according to the third embodiment will be described focusing on differences from the image processing system according to the first embodiment. Similar to the first embodiment, the image generation unit 30 of the present embodiment performs subject identification processing. Since the content is the same as that of the first embodiment, the description is omitted.
  • FIGS. 12 and 13 are explanatory diagrams of the two-dimensional image creation process.
  • FIGS. 12A and 13A are top views of a volleyball court.
  • The imaging device 2 is installed at the center of the court 400.
  • FIG. 12A shows the state of the court 400 at time t1, and FIG. 12B shows the first frame (hereinafter referred to as the first omnidirectional image 500) captured at time t1.
  • At time t1, the image generation unit 30 recognizes a subject 403, which is a person (hereinafter referred to as the third subject 403), as the main subject. Accordingly, in step S35 of the flowchart described above, the image generation unit 30 creates a two-dimensional image (frame) including the third subject 403.
  • FIG. 13A shows the state of the court 400 at time t2, which is after time t1, and FIG. 13B shows the 31st frame (hereinafter referred to as the second omnidirectional image 510) captured at time t2.
  • On the court 400 there are the third subject 403, which was the main subject at time t1, and two persons who are the main subjects at time t2: a main subject 404 (hereinafter referred to as the fourth subject 404) and a main subject 405 (hereinafter referred to as the fifth subject 405).
  • the image generation unit 30 specifies the fourth subject 404 and the fifth subject 405 as two main subjects.
  • the third subject 403 is not specified as the main subject at time t2.
  • Here, the positional relationship among the imaging device 2, the fourth subject 404, and the fifth subject 405 is the same as in the example shown in FIG. 11. That is, in FIG. 13B, when the control of moving the part of the second omnidirectional image 510 displayed on the display unit 40 leftward is repeated, the distance the displayed image moves from when the fourth subject 404 is displayed until the fifth subject 405 is displayed (the rightward distance from the fourth subject 404 to the fifth subject 405, hereinafter referred to as the first distance) is longer than the rightward distance from the fifth subject 405 to the fourth subject 404 (hereinafter referred to as the second distance). Therefore, if a two-dimensional image were created from the second omnidirectional image 510 by the same processing as in the first embodiment, a two-dimensional image in which the fifth subject 405 is arranged on the left side of the fourth subject 404 would be created.
  • In the present embodiment, however, the image generation unit 30 generates the two-dimensional image while respecting the position of the main subject in the temporally preceding frame, that is, in the first omnidirectional image 500 captured at time t1 illustrated in FIG. 12. Specifically, the image generation unit 30 specifies the angle of view 401, which is the angle of view including the main subject (the third subject 403) at time t1. It then specifies, in the second omnidirectional image 510 captured at time t2, the partial image 511 at the position corresponding to the angle of view 401. The image generation unit 30 then creates the two-dimensional image by arranging the fourth subject 404 and the fifth subject 405 based on the positional relationship among the partial image 511, the fourth subject 404, and the fifth subject 405 in the second omnidirectional image 510.
  • The two-dimensional image creation process by the image generation unit 30 will now be described in detail.
  • First, the image generation unit 30 assumes, in the second omnidirectional image, the partial image 511 at the position corresponding to the partial image including the third subject 403 in the first omnidirectional image (that is, the partial image corresponding to the angle of view 401 in the second omnidirectional image; hereinafter referred to as the first partial image 511).
  • Next, the image generation unit 30 assumes a partial image in the second omnidirectional image that includes the fourth subject 404 and the fifth subject 405 and that contains the first partial image 511 between the fourth subject 404 and the fifth subject 405 (hereinafter referred to as the second partial image).
  • The image generation unit 30 then creates a two-dimensional image that maintains the left-right positional relationship between the fourth subject 404 and the fifth subject 405 as they appear in the second partial image (this ordering rule is sketched below).
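  • A sketch of this ordering rule with hypothetical names: directions are azimuth angles in degrees as seen from the imaging device 2, `theta_ref` is the direction of the angle of view 401 at time t1, and the two subjects are ordered by their signed offset from that reference, which preserves their left-right relationship in the second partial image:

```python
def order_subjects(theta_ref, theta_a, theta_b):
    """Order two subjects left-to-right as seen from a reference direction.

    The signed offset lies in [-180, 180): negative means the subject is
    to the left of the reference direction, positive to the right.
    """
    def signed_offset(theta):
        return (theta - theta_ref + 180.0) % 360.0 - 180.0

    subjects = [(signed_offset(theta_a), "subject_a"),
                (signed_offset(theta_b), "subject_b")]
    subjects.sort()                   # leftmost subject first
    return [name for _, name in subjects]

# Example matching FIG. 13: viewed along the angle of view 401, the
# fourth subject is on the left and the fifth subject on the right.
assert order_subjects(0.0, -30.0, 40.0) == ["subject_a", "subject_b"]
```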
  • As described above, the image generation unit 30 generates a two-dimensional image (frame) including the third subject 403 from the first omnidirectional image 500 captured at time t1. That is, in the video obtained by reproducing the two-dimensional moving image, the user views the direction corresponding to the angle of view 401 for a certain period.
  • the main subject is changed from the third subject 403 to the fourth subject 404 and the fifth subject 405.
  • the image generation unit 30 generates a two-dimensional image (frame) including the fourth subject 404 and the fifth subject 405 from the second omnidirectional image 510 captured at time t2.
  • At this time, the fourth subject 404 and the fifth subject 405 are arranged based on the position of the third subject 403 (that is, the direction corresponding to the angle of view 401), because doing so makes the arrangement of the subjects in three-dimensional space easy to understand. That is, in FIG. 13A, when the direction of the angle of view 401 is viewed from the imaging device 2, the fourth subject 404 is on the left and the fifth subject 405 is on the right.
  • Because the angle of view is determined with respect to the angle of view of the temporally preceding frame, confusion during viewing due to a sudden scene change can be avoided. If a direction completely different from that of the previous frame were suddenly reproduced, it could become difficult for the viewer to know which part of the omnidirectional image is being cut out.
  • Since the image generation unit 30 of the present embodiment determines the angle of view while respecting the angle of view of the preceding frame, it can create a two-dimensional moving image that makes the space easy to understand and is suitable for viewing.
  • Here, "the left-right positional relationship is maintained" means that the positional relationship in the left-right direction is maintained, ignoring the vertical positional relationship. In other words, no matter how much the vertical positional relationship changes, the left-right positional relationship is considered maintained as long as the order in the left-right direction is preserved (a trivial check of this definition follows).
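  • As a trivial check of this definition (x coordinates only; names are hypothetical):

```python
def left_right_maintained(x_a_before, x_b_before, x_a_after, x_b_after):
    """True if the left-right order of two subjects is unchanged.

    Vertical positions are deliberately ignored: only the order of the
    horizontal (x) coordinates matters.
    """
    return (x_a_before < x_b_before) == (x_a_after < x_b_after)
```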
  • a two-dimensional image suitable for viewing can be automatically generated from an omnidirectional image.
  • a single device may have two or more of the imaging unit 20, the image generation unit 30, and the display unit 40.
  • the imaging device 2 may include the image generation unit 30 in addition to the imaging unit 20.
  • the imaging device 2 also serves as the image processing device 3. Therefore, the image processing apparatus 3 may not be included in the image processing system 1.
  • the image processing apparatus 3 may include a display unit 40 in addition to the image generation unit 30.
  • the image processing device 3 also serves as the playback device 4. Therefore, the playback device 4 may not be included in the image processing system 1.
  • the imaging device 2 may include an image generation unit 30 and a display unit 40 in addition to the imaging unit 20. In this case, the imaging device 2 serves as the image processing device 3 and the playback device 4. That is, the imaging device 2 provides a function equivalent to that of the image processing system 1 alone.
  • FIG. 10 is a block diagram schematically showing an electronic apparatus 1000 that combines the image processing device 3 and the playback device 4.
  • the electronic device 1000 is, for example, a smartphone or a tablet terminal.
  • the electronic device 1000 includes an image generation unit 30, an input unit 31, an output unit 32, a display unit 40, a control unit 42, and an operation unit 43.
  • The electronic device 1000 can create a two-dimensional video, reproduce the created two-dimensional video on the display unit 40, store the created two-dimensional video in the storage medium 52, and reproduce the omnidirectional image on the display unit 40. Since the operation of each part of the electronic device 1000 is the same as in the first embodiment, its description is omitted.
  • The creation of the two-dimensional video by the image generation unit 30 may be performed in real time in parallel with the creation of the omnidirectional video by the imaging unit 20, or may be started after the creation of the omnidirectional video is completed.
  • Likewise, the display of the two-dimensional moving image by the display unit 40 may be performed in real time in parallel with the creation of the two-dimensional moving image by the image generation unit 30, or may be started after the creation of the two-dimensional moving image is completed.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

An image processing device is provided that includes an input unit, a moving image generation unit, and a frame generation unit. The input unit inputs: first image data, which is image data of a part of an image capturing a first subject and a second subject and is used to display on a display unit the first subject, followed by the second subject, and then the first subject again, by repeating a control that moves the part of the image displayed on the display unit in a first direction so as to display parts of the image not displayed on the display unit, and which is used to display a first image capturing a third subject; and second image data, which is used to display a second image that is captured after the first image and that captures a fourth subject and a fifth subject. The moving image generation unit generates a moving image including at least a first frame that includes the third subject and is displayed on the display unit, and a second frame that includes the fourth subject and the fifth subject and is displayed on the display unit. The frame generation unit generates the second frame in a manner that maintains the left-right or vertical positional relationship between the fourth subject and the fifth subject in a second partial image, the second partial image including the fourth subject and the fifth subject and including, between the fourth subject and the fifth subject, the partial image at the position in the second image corresponding to a first partial image that includes the third subject in the first image.
PCT/JP2018/009651 2017-03-14 2018-03-13 Image processing device and electronic equipment WO2018168825A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-048863 2017-03-14
JP2017048863A 2017-03-14 Image processing device and electronic equipment

Publications (1)

Publication Number Publication Date
WO2018168825A1 true WO2018168825A1 (fr) 2018-09-20

Family

ID=63523784

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/009651 WO2018168825A1 (fr) Image processing device and electronic equipment

Country Status (2)

Country Link
JP (1) JP2020077896A (fr)
WO (1) WO2018168825A1 (fr)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006211105A (ja) * 2005-01-26 2006-08-10 Konica Minolta Holdings Inc Image generation device and system
JP2013218432A (ja) * 2012-04-05 2013-10-24 Dainippon Printing Co Ltd Image processing device, image processing method, image processing program, and recording medium
JP2016027704A (ja) * 2014-07-04 2016-02-18 Panasonic IP Management Co., Ltd. Imaging device
WO2017169369A1 (fr) * 2016-03-31 2017-10-05 Sony Corporation Information processing device, information processing method, and program

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020061691A (ja) 2018-10-11 2020-04-16 Rakuten, Inc. Camera assembly
US10951794B2 (en) 2018-10-11 2021-03-16 Rakuten, Inc. Camera assembly

Also Published As

Publication number Publication date
JP2020077896A (ja) 2020-05-21

Similar Documents

Publication Publication Date Title
EP3007038B1 (fr) Interaction with a three-dimensional video image
JP4878083B2 (ja) Image compositing apparatus and method, and program
JP5659305B2 (ja) Image generation device and image generation method
JP5659304B2 (ja) Image generation device and image generation method
JP5769813B2 (ja) Image generation device and image generation method
JP3847753B2 (ja) Image processing device, image processing method, recording medium, computer program, and semiconductor device
JP5865388B2 (ja) Image generation device and image generation method
US8388146B2 (en) Anamorphic projection device
JP2023017920A (ja) Image processing device
KR20170031733A (ko) Techniques for adjusting the perspective of a captured image for display
US10839601B2 (en) Information processing device, information processing method, and program
JP7182920B2 (ja) Image processing device, image processing method, and program
JP5477777B2 (ja) Image acquisition device
JP2016224173A (ja) Control device and control method
WO2018168825A1 (fr) Image processing device and electronic equipment
JP4689548B2 (ja) Image processing device, image processing method, recording medium, computer program, and semiconductor device
JP2011108028A (ja) Image playback device, imaging device, and image playback method
WO2018168824A1 (fr) Image processing device and electronic equipment
US20170155892A1 (en) Wearable stereoscopic camera system for 3d virtual reality imaging and networked area learning
JPWO2018083757A1 (ja) Image providing device, image providing method, program, and non-transitory computer-readable information recording medium
KR102596487B1 (ko) Display control system, method, and computer-readable recording medium
TWM520772U Panoramic image recording and immersive viewing system
JP6197849B2 (ja) Image display device and program
JP4767331B2 (ja) Image processing device, image processing method, recording medium, computer program, and semiconductor device
KR20240045052A (ko) Electronic device and operation method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18768704

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18768704

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP