WO2014091736A1 - Display device for panoramically expanded image - Google Patents

Display device for panoramically expanded image

Info

Publication number
WO2014091736A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
passage
panoramic
camera
omnidirectional
Prior art date
Application number
PCT/JP2013/007209
Other languages
French (fr)
Japanese (ja)
Inventor
偉志 渡邊
藤井 博文
藤松 健
森村 淳
Original Assignee
パナソニック株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パナソニック株式会社
Publication of WO2014091736A1 publication Critical patent/WO2014091736A1/en

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture

Definitions

  • the present invention relates to a panoramic unfolded image display device that unfolds and displays an image captured by an omnidirectional camera.
  • the panoramic developed image capturing device described in Patent Document 1 includes: imaging means in which a circular image formed by a fisheye lens is focused on a light-receiving surface on which a plurality of light-receiving elements are arranged, generating captured-image data containing the circular image; a memory that stores the position or size of an inner circle boundary and an outer circle boundary defining the ring-shaped range used to generate a 360-degree panoramic developed image from the captured image; user update means for updating, based on a user operation, the position or size of at least one of the inner and outer circle boundaries stored in the memory; and panoramic developed image generating means for generating the 360-degree panoramic developed image from the image within the ring-shaped range sandwiched between the inner and outer circle boundaries in the captured image generated by the imaging means.
  • FIG. 16 is a diagram illustrating an example of a camera image 100 taken by an omnidirectional camera and a panoramic development image 101 obtained by panoramic development of the camera image 100.
  • FIG. 17 shows a camera image 100 ((a) in the figure) obtained by photographing the inside of a convenience store with an omnidirectional camera, a panoramic developed image 101 ((b) in the figure) obtained by panoramically developing the camera image 100, and the arrangement of the actual articles in the convenience store ((c) in the figure).
  • the panoramic developed image 101 is obtained by panoramically developing the camera image 100 based on the development reference line 110 shown in the same figure.
  • the flow line 120-1 in the panoramic developed image 101 corresponds to the passage 130-1 in the convenience store, the flow line 120-2 corresponds to the passage 130-2, and the flow line 120-3 corresponds to the passage 130-3.
  • with a fixed development reference line 110, a passage may be divided in the panoramic developed image 101, as with the passage 130-2 in FIG. 17 (the flow line 120-2 is divided), which makes it difficult to understand the situation of the place where the omnidirectional camera is installed.
  • the present invention has been made in view of such circumstances, and its object is to provide a panoramic developed image display device that can obtain, from a camera image taken by an omnidirectional camera, a panoramic developed image from which the situation of the camera installation location can be easily understood.
  • the panoramic developed image display device of the present invention includes image input means for inputting an omnidirectional image, passage acquisition means for obtaining a passage of an object photographed in the omnidirectional image, panorama development means for panoramically developing the omnidirectional image by cutting it along a cutting line running in the circumferential direction from a predetermined center position, and display means for displaying the panoramically developed image, wherein the cutting line is set so that the number of its intersections with the passages is smaller than a predetermined value.
  • since the cutting line (development reference line) for panoramically developing the omnidirectional image is set so that the number of its intersections with the passages is smaller than a predetermined value, the passages are not divided when the omnidirectional image is panoramically developed, and a panoramic developed image from which the situation of the camera installation location can be easily understood can be obtained.
  • the cutting line is set so that the number of passages that it crosses at a right angle is larger than a predetermined value.
  • the omnidirectional image is an image taken by an omnidirectional camera, and the predetermined center position is the position in the image photographed directly below the omnidirectional camera.
  • the information on the passage is generated from the flow line by obtaining the flow line of the object in the omnidirectional image.
  • the device has congestion degree calculating means for calculating a degree of congestion for each passage, and the cutting line is set so as to cross passages whose degree of congestion is less than a predetermined value.
  • the degree of congestion of each passage is analyzed and the cutting line is set so that passages with a high degree of congestion are given priority and are not interrupted, so busy passages can be seen clearly.
  • the congestion degree calculating means calculates the congestion degree for each time zone, and the cutting line is set for each time zone according to the congestion degree for each time zone.
  • the panoramic developed image output device of the present invention includes image input means for inputting an omnidirectional image, passage acquisition means for obtaining a passage of an object photographed in the omnidirectional image, and panorama development means for panoramically developing the omnidirectional image by cutting it along a cutting line running in the circumferential direction from a predetermined center position and outputting the result, wherein the cutting line is set so that the number of its intersections with the passages is smaller than a predetermined value.
  • the panoramic developed image display method of the present invention includes an image input step of inputting an omnidirectional image, a passage acquisition step of obtaining a passage of an object photographed in the omnidirectional image, a panorama development step of panoramically developing the omnidirectional image by cutting it along a cutting line running in the circumferential direction from a predetermined center position, and a display step of displaying the panoramically developed image, wherein the cutting line is set so that the number of its intersections with the passages is smaller than a predetermined value.
  • FIG. 1 is a block diagram showing a schematic configuration of a panoramic developed image display apparatus according to Embodiment 1 of the present invention.
  • FIG. 2 is a diagram showing an example of a camera image when the omnidirectional camera is on a passage, and FIGS. 3(a) to 3(c) are diagrams showing an example of a camera image when the omnidirectional camera is not on a passage.
  • FIG. 4 is a flowchart showing the operation of the panoramic developed image display apparatus of FIG. 1, and FIGS. 5 and 6 are diagrams showing development reference lines determined by its development reference position determination unit.
  • FIG. 8 is a block diagram showing a schematic configuration of a panoramic developed image display apparatus according to Embodiment 2 of the present invention.
  • FIG. 11 is a block diagram showing a schematic configuration of a panoramic developed image display apparatus according to Embodiment 3 of the present invention.
  • FIG. 13 is a block diagram showing a schematic configuration of a panoramic developed image display apparatus according to Embodiment 4 of the present invention.
  • FIG. 14 is a flowchart showing the operation of the panoramic developed image display apparatus of FIG. 13, and FIG. 15 is a diagram showing the per-passage congestion degree average table stored in its congestion degree information history storage unit, together with the passages on the camera image at 10:00-11:00 and 12:00-13:00 in that table.
  • FIG. 16 is a diagram showing an example of an original image captured by an omnidirectional camera and a panoramic developed image obtained by panoramically developing the original image.
  • FIG. 1 is a block diagram showing a schematic configuration of a panoramic developed image display apparatus according to Embodiment 1 of the present invention.
  • the panoramic developed image display device 1 according to the present embodiment includes an omnidirectional camera image input unit (image input means) 10, a passage information input unit (passage acquisition means) 11, a camera center position input unit 12, a development reference position determination unit 13, an omnidirectional camera image panorama development unit (panorama development means) 14, and a display unit (display means) 15.
  • the omnidirectional camera image input unit 10 inputs an image of an omnidirectional camera (not shown) (hereinafter referred to as “camera image”).
  • the passage information input unit 11 inputs information about a passage in the building (hereinafter referred to as “passage information”) from the camera image. For example, if the building is a convenience store and an omnidirectional camera is installed in the convenience store, the passage information in the convenience store is input. In the present embodiment, the passage information is manually input by the user himself / herself.
  • the camera center position input unit 12 inputs the camera center position in the camera image, i.e. the position photographed directly below the omnidirectional camera. The user inputs the camera center position manually.
  • the development reference position determination unit 13 sets a development reference position for panoramic development of the camera image from the camera center position input by the camera center position input unit 12 and the path information input by the path information input unit 11.
  • the development reference position is set in a direction orthogonal to the passage if the omnidirectional camera is on a passage and not at an intersection of passages; if the omnidirectional camera is not on a passage, all azimuths around the camera position are scanned and the development reference position is set so that its line of intersection with the passages is minimal and perpendicular.
  • FIG. 2 is a diagram illustrating an example of the camera image 100 when the omnidirectional camera is on the passage 130.
  • the development reference line 110 indicating the development reference position is set in a direction orthogonal to the passage 130 at a location other than the intersection 160 of the passage.
  • FIGS. 3A to 3C are views showing an example of the camera image 100 when the omnidirectional camera is not on the passage 130.
  • the development reference line 110 indicating the development reference position is set so that the line of intersection with the passage 130 is the smallest and perpendicular.
  • the omnidirectional camera image panorama development unit 14 panoramically develops the camera image by cutting it based on the development reference position set by the development reference position determination unit 13. That is, the camera image is panoramically developed by cutting it along the development reference line running in the circumferential direction from the center position, which is the camera position.
  • the display unit 15 displays the image panorama developed by the omnidirectional camera image panorama developing unit 14.
  • FIG. 4 is a flowchart showing the operation of the panoramic developed image display device 1 according to the present embodiment.
  • the omnidirectional camera image input unit 10 first inputs a camera image (step S1).
  • the camera center position input unit 12 inputs the camera center position designated by the user for the camera image (step S2).
  • the passage information input unit 11 inputs passage information from an image of one frame (step S3).
  • the development reference position determination unit 13 determines whether the camera center position exists on a passage (step S4); if it determines that it does (that is, if "Yes" is determined), it determines whether the camera center position exists at a passage intersection (step S5).
  • the development reference position determination unit 13 sets the development reference position in a direction orthogonal to the passage when it determines that the camera center position does not exist at a passage intersection (that is, when "No" is determined) (step S6).
  • FIG. 5 is a diagram showing a development reference line 110 on the development reference position when the camera center position is on a passage other than the passage intersection.
  • when the development reference position is set, the omnidirectional camera image panorama development unit 14 panoramically develops the camera image through 360 degrees based on the development reference position and displays it on the display unit 15 (step S7). After the panoramic developed image for one frame of the camera image has been displayed, the omnidirectional camera image panorama development unit 14 determines whether capture of the camera images has finished (step S8); if it has not finished (that is, if "No" is determined), the camera image of the next frame is captured (step S9) and is panoramically developed and displayed on the display unit 15. When it determines that capture of the camera images has finished (that is, when "Yes" is determined), it ends this process.
  • on the other hand, if the development reference position determination unit 13 determines in step S4 that the camera center position does not exist on a passage (that is, determines "No"), or determines in step S5 that the camera center position exists at a passage intersection (that is, determines "Yes"), it sets a straight line (that is, a development reference line) from the camera center position toward the edge of the camera image while shifting the angle in increments of α (step S10). By shifting the angle by α from the camera center position toward the camera image edge, the development reference position determination unit 13 sets the development reference position at which the number of intersections between the development reference line and the passages is minimal and the number of right-angle intersections is maximal (step S11). FIG. 6 shows the development reference line 110 when the camera center position is not on a passage and the angle has been shifted by α. After the development reference position is set, the process proceeds to step S7.
  • FIGS. 7(a) and 7(b) show a comparison between the case where the development reference position is set automatically (the present invention) and the case where it is fixed (conventional): (a) shows the camera image 100 and the panoramic developed image 101 when the development reference position is fixed, and (b) shows them when the development reference position is set automatically. As can be seen from the figure, the passage is divided when the development reference position is fixed, but is not divided when the development reference position is set automatically.
  • the panoramic developed image display device 1 sets the development reference line for panoramically developing the camera image so that the number of intersections with the passages is smaller than a predetermined value; the passages are therefore not divided, and a panoramic developed image from which the situation of the camera installation location can be easily understood can be obtained.
  • FIG. 8 is a block diagram showing a schematic configuration of a panoramic developed image display apparatus according to Embodiment 2 of the present invention.
  • the panoramic developed image display device 2 has a function of estimating passage positions by analyzing flow lines (movement trajectories of persons) using a person tracking technique, and includes an omnidirectional camera image input unit 10, a camera center position input unit 12, an omnidirectional camera image panorama development unit 14, a display unit 15, a person flow line information acquisition unit 16, a person flow line information history storage unit 17, a passage position estimation unit 18, and a development reference position determination unit 19.
  • the person flow line information acquisition unit 16 acquires the person flow line information by obtaining the flow line of the person in the camera image.
  • the person flow line information acquisition unit 16 acquires and outputs the person flow line information for each frame.
  • the person flow line information history storage unit 17 accumulates the person flow line information acquired for each frame by the person flow line information acquisition unit 16.
  • the passage position estimation unit 18 estimates the passage position from the person flow line information for each frame accumulated in the person flow line information history storage unit 17.
  • FIGS. 9(a) and 9(b) are diagrams illustrating an example of N frames of person flow line information accumulated in the person flow line information history storage unit 17 and the passage information estimated from those N frames of person flow line information (N: a natural number); (a) is a camera image in which the flow lines 120 have been accumulated for N frames, and (b) is the estimation result of the passage information.
  • the development reference position determination unit 19 determines a development reference position for panoramic development of the camera image from the camera center position input by the camera center position input unit 12 and the passage position estimated by the passage position estimation unit 18.
  • the omnidirectional camera image panorama developing unit 14 develops a panorama by cutting the camera image based on the development reference position set by the development reference position determining unit 19.
  • the display unit 15 displays the image panorama developed by the omnidirectional camera image panorama developing unit 14.
  • FIG. 10 is a flowchart showing the operation of the panoramic developed image display device 2 according to the present embodiment.
  • the omnidirectional camera image input unit 10 first inputs a camera image (step S20).
  • the camera center position input unit 12 inputs the camera center position designated by the user for the camera image (step S21).
  • the human flow line information acquisition unit 16 acquires human flow line information from one frame image (step S22).
  • the person flow line information history storage unit 17 accumulates the person flow line information acquired by the person flow line information acquisition unit 16 (step S23).
  • the passage position estimation unit 18 integrates the person flow line information of the past N frames (step S24) and then estimates the passage information based on the integrated person flow line information (step S25).
  • the development reference position determination unit 19 sets a development reference position for panoramic development of the camera image (step S26). Since this process (Process A) is the same as Steps S4 to S6 and Steps S10 and S11 of FIG. 4 described above, description thereof is omitted.
  • the omnidirectional camera image panorama development unit 14 develops the camera image and displays it on the display unit 15 (step S27). Note that the one-frame image used by the omnidirectional camera image panorama developing unit 14 is the same as the one-frame image first captured by the human flow line information acquiring unit 16.
  • the omnidirectional camera image panorama development unit 14 determines whether capture of the camera images has finished (step S28); if it has not finished (that is, if "No" is determined), the camera image of the next frame is captured (step S29) and the process from step S22 is repeated, with the camera image panoramically developed and displayed on the display unit 15. When it determines that capture of the camera images has finished (that is, when "Yes" is determined), it ends this process.
  • the panoramic developed image display device 2 acquires the person flow line information from the camera image and estimates the passage information from the acquired person flow line information, so the passage information does not have to be entered manually and operability is improved.
  • FIG. 11 is a block diagram showing a schematic configuration of a panoramic developed image display apparatus according to Embodiment 3 of the present invention.
  • the panoramic developed image display device 3 has, in addition to the functions of the panoramic developed image display device 2 described above, a function of analyzing the degree of congestion of the passages from the estimated passage positions and giving priority to passages with a high degree of congestion, and includes an omnidirectional camera image input unit 10, a camera center position input unit 12, an omnidirectional camera image panorama development unit 14, a display unit 15, a person flow line information acquisition unit 16, a person flow line information history storage unit 17, a passage position estimation unit 18, a passage congestion degree calculation unit (congestion degree calculation means) 20, and a development reference position determination unit 21.
  • the degree of congestion is, for example, the ratio of the number of people who have actually passed to the maximum number of people passing in a unit time for each passage, expressed as a percentage.
  • FIG. 12 is a flowchart showing the operation of the panoramic developed image display device 3 according to the present embodiment.
  • the omnidirectional camera image input unit 10 first inputs a camera image (step S30).
  • the camera center position input unit 12 inputs the camera center position designated by the user for the camera image (step S31).
  • the human flow line information acquisition unit 16 acquires human flow line information from one frame image (step S32).
  • the person flow line information history storage unit 17 accumulates the person flow line information acquired by the person flow line information acquisition unit 16 (step S33).
  • the passage position estimation unit 18 integrates the person flow line information of the past N frames (step S34) and then estimates the passage information based on the integrated person flow line information (step S35).
  • the passage congestion degree calculation unit 20 calculates the degree of congestion for each passage (step S36). After the degree of congestion has been calculated for each passage, the development reference position determination unit 21 determines whether the camera center position exists on a passage (step S37); if it determines that it does (that is, if "Yes" is determined), it determines whether the camera center position exists at a passage intersection (step S38).
  • if the development reference position determination unit 21 determines that the camera center position does not exist at a passage intersection (that is, if "No" is determined), it sets the development reference position in a direction orthogonal to the passage (step S39).
  • when the development reference position is set, the omnidirectional camera image panorama development unit 14 panoramically develops the camera image based on the development reference position set by the development reference position determination unit 21 and displays it on the display unit 15 (step S40). After the panoramic developed image for one frame of the camera image has been displayed, the omnidirectional camera image panorama development unit 14 determines whether capture of the camera images has finished (step S41); if it has not finished (that is, if "No" is determined), the camera image of the next frame is captured (step S42) and the process returns to step S32. When it determines that capture of the camera images has finished (that is, when "Yes" is determined), it ends this process.
  • on the other hand, if the development reference position determination unit 21 determines in step S37 that the camera center position does not exist on a passage (that is, determines "No"), or determines in step S38 that the camera center position exists at a passage intersection (that is, determines "Yes"), it sets a straight line (that is, a development reference line) from the camera center position toward the image edge while shifting the angle in increments of α (step S43). By shifting the angle by α from the camera center position toward the image edge, the development reference position determination unit 21 sets the development reference position at which the number of intersections between the straight line and the passages is minimal, the number of right-angle intersections is maximal, and the average degree of congestion of the crossed passages is minimal (step S44). After the development reference position is set, the process proceeds to step S40, and thereafter the processing of steps S40 to S42 described above is performed.
  • the panoramic developed image display device 3 analyzes the degree of congestion for each passage and sets the development reference position so that passages with a high degree of congestion are given priority and are not interrupted, so congested passages can be seen clearly.
  • FIG. 13 is a block diagram showing a schematic configuration of a panoramic developed image display apparatus according to Embodiment 4 of the present invention.
  • the panoramic developed image display device 4 according to the present embodiment has, in addition to the functions of the panoramic developed image display device 3 described above, a function of changing the development reference position for each time zone depending on the degree of congestion, and includes an omnidirectional camera image input unit 10, a camera center position input unit 12, an omnidirectional camera image panorama development unit 14, a display unit 15, a person flow line information acquisition unit 16, a person flow line information history storage unit 17, a passage position estimation unit 18, a passage congestion degree calculation unit 20, a development reference position determination unit 21, and a congestion degree information history storage unit 22.
  • FIG. 14 is a flowchart showing the operation of the panoramic developed image display device 4 according to the present embodiment.
  • the omnidirectional camera image input unit 10 inputs a camera image (step S50).
  • the camera center position input unit 12 inputs the camera center position designated by the user for the camera image (step S51).
  • the human flow line information acquisition unit 16 acquires human flow line information from an image of one frame (step S52).
  • the person flow line information history storage unit 17 accumulates the person flow line information acquired by the person flow line information acquisition unit 16 (step S53).
  • the passage position estimation unit 18 integrates the person flow line information of the past N frames (step S54) and then estimates the passage information based on the integrated person flow line information (step S55).
  • the passage congestion degree calculation unit 20 calculates the degree of congestion for each passage (step S56), and further acquires time information (step S57). Then, the congestion degree average table for each passage stored in the congestion degree information history storage unit 22 is updated (step S58).
  • FIG. 15 is a diagram illustrating an example of the per-passage congestion degree average table and the passages on the camera image at 10:00-11:00 and 12:00-13:00 in the congestion degree average table.
  • in FIG. 15, each passage 130 is drawn with a thickness corresponding to its degree of congestion; the thicker the line, the higher the degree of congestion.
  • after the congestion degree average table is updated, the development reference position determination unit 21 determines whether the development position is to be updated (step S59); if it determines that it is (that is, if "Yes" is determined), it refers to the congestion degree average table stored in the congestion degree information history storage unit 22 and sets the degree of congestion for each passage (step S60).
  • a development reference position is then set (step S61). Since this process (Process B) is the same as steps S37 to S39 and steps S43 and S44 of FIG. 12 described above, its description is omitted.
  • the omnidirectional camera image panorama development unit 14 develops the camera image and displays it on the display unit 15 (step S62).
  • the one-frame image used by the omnidirectional camera image panorama developing unit 14 is the same as the one-frame image first captured by the human flow line information acquiring unit 16.
  • the omnidirectional camera image panorama development unit 14 determines whether capture of the camera images has finished (step S63); if it has not finished (that is, if "No" is determined), the camera image of the next frame is captured (step S64) and the process returns to step S52.
  • the omnidirectional camera image panorama developing unit 14 ends this process when it is determined that the capture of the camera image has been completed (that is, when it is determined “Yes”).
  • if the development reference position determination unit 21 determines in step S59 that the development position is not to be updated (that is, if "No" is determined), the previous development reference position is retained (step S65), and the process proceeds to step S62.
  • the panoramic developed image display device 4 changes the development reference position for each time zone depending on the degree of congestion of each passage, so the congested passages in each time zone can be seen clearly.
  • FIG. 18 is a block diagram showing a schematic configuration of the fifth embodiment of the present invention.
  • the fifth embodiment includes an omnidirectional camera 181, an input unit 182, a processing unit 183, and a display unit 184.
  • the omnidirectional camera 181 may be a camera that inputs an omnidirectional image
  • the input unit 182 may be an input device such as a keyboard
  • the processing unit 183 may be a personal computer
  • the display unit 184 may be a liquid crystal display.
  • Embodiment 5 is a more specific implementation means of Embodiment 1 of the present invention.
  • the omnidirectional camera image input unit 10 of the first embodiment corresponds to the omnidirectional camera 181, the camera center position input unit 12 and the passage information input unit 11 correspond to the input unit 182, the development reference position determination unit 13 and the omnidirectional camera image panorama development unit 14 are implemented in the processing unit 183, and the display unit 15 corresponds to the display unit 184.
  • FIG. 19 is a block diagram showing a schematic configuration of the sixth embodiment of the present invention.
  • in the sixth embodiment, the functions other than the display unit 15 are implemented, with some modifications, inside the omnidirectional camera itself.
  • the same components as those in FIG. 8 showing the second embodiment are given the same numbers.
  • the omnidirectional camera 191 incorporates an omnidirectional camera image input unit 10, a camera center position holding unit 192, a person flow line information acquisition unit 16, a person flow line information history storage unit 17, a passage position estimation unit 18, a development reference position determination unit 19, and an omnidirectional camera image panorama development unit 14.
  • the camera center position holding unit 192 holds in advance the camera center position described in the first embodiment.
  • Other functions are the same as those of the second embodiment, and the panoramic image developed by the omnidirectional camera image panorama developing unit 14 is output from the omnidirectional camera 191.
  • the output panoramic image may be displayed on a liquid crystal display or the like connected to the omnidirectional camera 191, or may be displayed on a liquid crystal display at a location away from the omnidirectional camera 191 via a network, or may be stored as an image. It may be stored in a functioning storage device.
  • the present invention has the effect that a panoramic developed image from which the situation of the camera installation location can be easily understood can be obtained from a camera image photographed by an omnidirectional camera, and is applicable to panoramic developed image display devices that develop and display images captured by an omnidirectional camera.
  • Reference numerals: 10 Omnidirectional camera image input unit; 11 Passage information input unit; 12 Camera center position input unit; 13 Development reference position determination unit; 14 Omnidirectional camera image panorama development unit; 15 Display unit; 16 Person flow line information acquisition unit; 17 Person flow line information history storage unit; 18 Passage position estimation unit; 19, 21 Development reference position determination unit; 20 Passage congestion degree calculation unit; 22 Congestion degree information history storage unit; 100 Camera image; 101 Panoramic developed image; 110 Development reference line; 130 Passage; 150 Camera position; 160 Intersection; 181 Omnidirectional camera; 182 Input unit; 183 Processing unit; 184 Display unit; 191 Omnidirectional camera; 192 Camera center position holding unit

Abstract

The present invention is provided with: an omnidirectional-camera-image input unit (10) that inputs an image from an omnidirectional camera; a path information input unit (11) that obtains the path of a subject captured in the image from the omnidirectional camera; a camera central-position input unit (12) that inputs the central position of the omnidirectional camera; an expansion reference position determining unit (13) that, from the inputted camera central position and path information, sets an expansion reference position for panoramically expanding the image from the camera; an omnidirectional-camera-image panoramic expansion unit (14) that panoramically expands the image from the camera by cutting the image on the basis of the set expansion reference position; and a display unit (15) that displays the panoramically expanded image.

Description

Panoramic developed image display device
The present invention relates to a panoramic developed image display device that panoramically develops and displays an image captured by an omnidirectional camera.
As the panoramic developed image display device described above, the one described in Patent Document 1, for example, is known. The panoramic developed image capturing device described in Patent Document 1 includes: imaging means in which a circular image formed by a fisheye lens is focused on a light-receiving surface on which a plurality of light-receiving elements are arranged, generating captured-image data containing the circular image; a memory that stores the position or size of an inner circle boundary and an outer circle boundary defining the ring-shaped range used to generate a 360-degree panoramic developed image from the captured image; user update means for updating, based on a user operation, the position or size of at least one of the inner and outer circle boundaries stored in the memory; and panoramic developed image generating means for generating the 360-degree panoramic developed image from the image within the ring-shaped range sandwiched between the inner and outer circle boundaries in the captured image generated by the imaging means.
FIG. 16 is a diagram illustrating an example of a camera image 100 taken by an omnidirectional camera and a panoramic developed image 101 obtained by panoramically developing the camera image 100.
Patent Document 1: Japanese Unexamined Patent Application Publication No. 2008-028606
However, when a camera image taken by an omnidirectional camera is panoramically developed, depending on the installation location of the omnidirectional camera and on the reference position used for panoramic development of the camera image (hereinafter referred to as the "development reference position"), there is a problem that it becomes difficult to easily understand the situation of the place where the omnidirectional camera is installed (that is, it becomes difficult to relate the panoramic developed image to the real world).
This problem will be explained with an example. FIG. 17 shows a camera image 100 ((a) in the figure) obtained by photographing the inside of a convenience store with an omnidirectional camera, a panoramic developed image 101 ((b) in the figure) obtained by panoramically developing the camera image 100, and the arrangement of the actual articles in the convenience store ((c) in the figure). The panoramic developed image 101 is obtained by panoramically developing the camera image 100 based on the development reference line 110 shown in the figure. The flow line 120-1 in the panoramic developed image 101 corresponds to the passage 130-1 in the convenience store, the flow line 120-2 corresponds to the passage 130-2, and the flow line 120-3 corresponds to the passage 130-3. With the fixed development reference line 110, a passage may be divided in the panoramic developed image 101, as with the passage 130-2 in FIG. 17 (the flow line 120-2 is divided), which makes it difficult to understand the situation of the place where the omnidirectional camera is installed.
The present invention has been made in view of such circumstances, and its object is to provide a panoramic developed image display device that can obtain, from a camera image taken by an omnidirectional camera, a panoramic developed image from which the situation of the camera installation location can be easily understood.
The panoramic developed image display device of the present invention includes image input means for inputting an omnidirectional image, passage acquisition means for obtaining a passage of an object photographed in the omnidirectional image, panorama development means for panoramically developing the omnidirectional image by cutting it along a cutting line running in the circumferential direction from a predetermined center position, and display means for displaying the panoramically developed image, wherein the cutting line is set so that the number of its intersections with the passages is smaller than a predetermined value.
According to the above configuration, the cutting line (development reference line) for panoramically developing the omnidirectional image is set so that the number of its intersections with the passages is smaller than a predetermined value, so the passages are not divided when the omnidirectional image is panoramically developed, and a panoramic developed image from which the situation of the camera installation location can be easily understood can be obtained.
In the above configuration, the cutting line is set so that the number of passages that it crosses at a right angle is larger than a predetermined value.
In the above configuration, the omnidirectional image is an image taken by an omnidirectional camera, and the predetermined center position is the position in the image photographed directly below the omnidirectional camera.
In the above configuration, the information on the passages is generated by obtaining the flow lines of objects in the omnidirectional image and deriving the passages from those flow lines.
In the above configuration, the device has congestion degree calculating means for calculating a degree of congestion for each passage, and the cutting line is set so as to cross passages whose degree of congestion is less than a predetermined value.
According to the above configuration, the degree of congestion of each passage is analyzed and the cutting line is set so that passages with a high degree of congestion are given priority and are not interrupted, so busy passages can be seen clearly.
In the above configuration, the congestion degree calculating means calculates the degree of congestion for each time zone, and the cutting line is set for each time zone according to the degree of congestion for that time zone.
According to the above configuration, the cutting line is changed for each time zone depending on the degree of congestion of each passage, so the congested passages in each time zone can be seen clearly.
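As an illustration of how such a per-passage, per-time-zone degree of congestion could be tabulated, the following Python sketch counts observed passers per passage and hour and expresses each count as a percentage of an assumed maximum capacity, in the spirit of the definition given in the detailed description (the ratio of people who actually passed to the maximum number that can pass per unit time, expressed as a percentage). The event format, passage identifiers, and capacity figures are illustrative assumptions, not values from the patent.

```python
from collections import defaultdict

def congestion_table(pass_events, capacity_per_hour):
    """Build a {(passage_id, hour): congestion %} table.

    pass_events       : iterable of (passage_id, hour) tuples, one per person
                        observed passing (hypothetical input format).
    capacity_per_hour : dict passage_id -> assumed maximum number of people
                        that can pass in one hour.
    """
    counts = defaultdict(int)
    for passage_id, hour in pass_events:
        counts[(passage_id, hour)] += 1

    table = {}
    for (passage_id, hour), n in counts.items():
        cap = capacity_per_hour[passage_id]
        table[(passage_id, hour)] = 100.0 * n / cap  # percentage, as in the text
    return table

# Example: passage "130-2" sees 45 passers between 10:00 and 11:00,
# against an assumed capacity of 60 people/hour -> 75% congestion.
events = [("130-2", 10)] * 45 + [("130-1", 10)] * 12
print(congestion_table(events, {"130-1": 60, "130-2": 60}))
```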
The panoramic developed image output device of the present invention includes image input means for inputting an omnidirectional image, passage acquisition means for obtaining a passage of an object photographed in the omnidirectional image, and panorama development means for panoramically developing the omnidirectional image by cutting it along a cutting line running in the circumferential direction from a predetermined center position and outputting the result, wherein the cutting line is set so that the number of its intersections with the passages is smaller than a predetermined value.
The panoramic developed image display method of the present invention includes an image input step of inputting an omnidirectional image, a passage acquisition step of obtaining a passage of an object photographed in the omnidirectional image, a panorama development step of panoramically developing the omnidirectional image by cutting it along a cutting line running in the circumferential direction from a predetermined center position, and a display step of displaying the panoramically developed image, wherein the cutting line is set so that the number of its intersections with the passages is smaller than a predetermined value.
According to the present invention, a panoramic developed image from which the situation of the camera installation location can be easily understood can be obtained from a camera image taken by an omnidirectional camera.
FIG. 1 is a block diagram showing a schematic configuration of a panoramic developed image display apparatus according to Embodiment 1 of the present invention.
FIG. 2 is a diagram showing an example of a camera image when the omnidirectional camera is on a passage.
FIGS. 3(a) to 3(c) are diagrams showing an example of a camera image when the omnidirectional camera is not on a passage.
FIG. 4 is a flowchart showing the operation of the panoramic developed image display apparatus of FIG. 1.
FIG. 5 is a diagram showing a development reference line determined by the development reference position determination unit of the panoramic developed image display apparatus of FIG. 1.
FIG. 6 is a diagram showing a development reference line determined by the development reference position determination unit of the panoramic developed image display apparatus of FIG. 1.
FIGS. 7(a) and 7(b) are diagrams showing a comparison between the case where the development reference position is set automatically and the case where it is fixed.
FIG. 8 is a block diagram showing a schematic configuration of a panoramic developed image display apparatus according to Embodiment 2 of the present invention.
FIGS. 9(a) and 9(b) are diagrams showing an example of N frames of person flow line information accumulated in the person flow line information history storage unit of the panoramic developed image display apparatus of FIG. 8 and of the passage information estimated by the passage position estimation unit.
FIG. 10 is a flowchart showing the operation of the panoramic developed image display apparatus of FIG. 8.
FIG. 11 is a block diagram showing a schematic configuration of a panoramic developed image display apparatus according to Embodiment 3 of the present invention.
FIG. 12 is a flowchart showing the operation of the panoramic developed image display apparatus of FIG. 11.
FIG. 13 is a block diagram showing a schematic configuration of a panoramic developed image display apparatus according to Embodiment 4 of the present invention.
FIG. 14 is a flowchart showing the operation of the panoramic developed image display apparatus of FIG. 13.
FIG. 15 is a diagram showing an example of the per-passage congestion degree average table stored in the congestion degree information history storage unit of the panoramic developed image display apparatus of FIG. 13, and the passages on the camera image at 10:00-11:00 and 12:00-13:00 in that table.
FIG. 16 is a diagram showing an example of an original image captured by an omnidirectional camera and a panoramic developed image obtained by panoramically developing the original image.
FIGS. 17(a) to 17(c) are diagrams showing an original image obtained by photographing the inside of a convenience store with an omnidirectional camera, a panoramic developed image obtained by panoramically developing the original image, and the arrangement of the actual articles in the convenience store.
FIG. 18 is a block diagram showing a schematic configuration of Embodiment 5 of the present invention.
FIG. 19 is a block diagram showing a schematic configuration of Embodiment 6 of the present invention.
Hereinafter, preferred embodiments for carrying out the present invention will be described in detail with reference to the drawings.
(Embodiment 1)
FIG. 1 is a block diagram showing a schematic configuration of the panoramic developed image display apparatus according to Embodiment 1 of the present invention. As shown in the figure, the panoramic developed image display device 1 according to the present embodiment includes an omnidirectional camera image input unit (image input means) 10, a passage information input unit (passage acquisition means) 11, a camera center position input unit 12, a development reference position determination unit 13, an omnidirectional camera image panorama development unit (panorama development means) 14, and a display unit (display means) 15.
The omnidirectional camera image input unit 10 inputs an image from an omnidirectional camera (not shown) (hereinafter referred to as a "camera image"). The passage information input unit 11 inputs information about the passages in the building (hereinafter referred to as "passage information") for the camera image. For example, if the building is a convenience store and an omnidirectional camera is installed in it, the passage information of that convenience store is input. In the present embodiment, the passage information is input manually by the user. The camera center position input unit 12 inputs the camera center position in the camera image, i.e. the position photographed directly below the omnidirectional camera; the user also inputs this position manually. The development reference position determination unit 13 sets the development reference position for panoramically developing the camera image from the camera center position input by the camera center position input unit 12 and the passage information input by the passage information input unit 11. In this case, the development reference position is set in a direction orthogonal to the passage if the omnidirectional camera is on a passage and not at an intersection of passages; if the omnidirectional camera is not on a passage, all azimuths around the camera position are scanned and the development reference position is set so that its line of intersection with the passages is minimal and perpendicular.
FIG. 2 is a diagram showing an example of the camera image 100 when the omnidirectional camera is on a passage 130. As shown in the figure, when the camera position 150 is on the passage 130, the development reference line 110 indicating the development reference position is set in a direction orthogonal to the passage 130 at a location other than the intersection 160 of the passages. FIGS. 3(a) to 3(c) are diagrams showing an example of the camera image 100 when the omnidirectional camera is not on a passage 130. As shown in these figures, when the camera position 150 is not on a passage 130, the development reference line 110 indicating the development reference position is set so that its line of intersection with the passages 130 is minimal and perpendicular.
Returning to FIG. 1, the omnidirectional camera image panorama development unit 14 panoramically develops the camera image by cutting it based on the development reference position set by the development reference position determination unit 13; that is, the camera image is panoramically developed by cutting it along the development reference line running in the circumferential direction from the center position, which is the camera position. The display unit 15 displays the image panoramically developed by the omnidirectional camera image panorama development unit 14.
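To make the cutting-and-developing step concrete, the following NumPy sketch unwraps the ring-shaped region of the omnidirectional image into a rectangular panorama, starting the sweep at the chosen development reference angle so that only objects lying on that cut line are split. It assumes a simple circular fisheye geometry, nearest-neighbour sampling, and hypothetical radii and image coordinates; the patent does not prescribe a particular projection model or interpolation method.

```python
import numpy as np

def unwrap_panorama(img, center, cut_angle_rad, r_inner, r_outer, out_w=1024, out_h=256):
    """Unwrap the ring between r_inner and r_outer into a rectangular panorama.

    img           : H x W x C omnidirectional camera image (NumPy array).
    center        : (cx, cy) image coordinates directly below the camera.
    cut_angle_rad : development reference angle; the panorama starts (and ends)
                    here, so anything crossing this ray is what gets split.
    """
    cx, cy = center
    # Horizontal axis of the panorama = azimuth, swept one full turn from the cut line.
    theta = cut_angle_rad + 2.0 * np.pi * np.arange(out_w) / out_w
    # Vertical axis = radius, swept from the outer boundary to the inner boundary.
    radius = np.linspace(r_outer, r_inner, out_h)

    rr, tt = np.meshgrid(radius, theta, indexing="ij")       # out_h x out_w grids
    xs = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, img.shape[1] - 1)
    ys = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, img.shape[0] - 1)
    return img[ys, xs]                                        # nearest-neighbour sampling

# Usage sketch with hypothetical values: cut at 30 degrees so no passage is divided.
# panorama = unwrap_panorama(camera_image, (640, 480), np.deg2rad(30), 50, 450)
```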
Next, the operation of the panoramic developed image display device 1 according to the present embodiment will be described.
FIG. 4 is a flowchart showing the operation of the panoramic developed image display device 1 according to the present embodiment. In the figure, the omnidirectional camera image input unit 10 first inputs a camera image (step S1). Next, the camera center position input unit 12 inputs the camera center position designated by the user for the camera image (step S2). At the same time as the camera center position is input, the passage information input unit 11 inputs the passage information for one frame of the image (step S3). After the camera center position and the passage information have been input, the development reference position determination unit 13 determines whether the camera center position is on a passage (step S4); if it determines that it is (that is, if "Yes" is determined), it determines whether the camera center position is at a passage intersection (step S5).
If the development reference position determination unit 13 determines that the camera center position is not at a passage intersection (that is, if "No" is determined), it sets the development reference position in a direction orthogonal to the passage (step S6). FIG. 5 is a diagram showing the development reference line 110 at the development reference position when the camera center position is on a passage other than at a passage intersection. When the development reference position has been set, the omnidirectional camera image panorama development unit 14 panoramically develops the camera image through 360 degrees based on the development reference position and displays it on the display unit 15 (step S7). After the panoramic developed image for one frame of the camera image has been displayed, the omnidirectional camera image panorama development unit 14 determines whether capture of the camera images has finished (step S8); if it has not finished (that is, if "No" is determined), the camera image of the next frame is captured (step S9) and is panoramically developed and displayed on the display unit 15. When it determines that capture of the camera images has finished (that is, when "Yes" is determined), it ends this process.
On the other hand, if the development reference position determination unit 13 determines in step S4 that the camera center position is not on a passage (that is, determines "No"), or determines in step S5 that the camera center position is at a passage intersection (that is, determines "Yes"), it sets a straight line (that is, a development reference line) from the camera center position toward the edge of the camera image while shifting the angle in increments of α (step S10). By shifting the angle by α from the camera center position toward the camera image edge, the development reference position determination unit 13 sets the development reference position at which the number of intersections between the development reference line and the passages is minimal and the number of right-angle intersections is maximal (step S11). FIG. 6 is a diagram showing the development reference line 110 when the camera center position is not on a passage and the angle has been shifted by α. Returning to FIG. 4, after the development reference position has been set, the process proceeds to step S7, and thereafter the processing of steps S7 to S9 described above is performed.
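The angle scan of steps S10 and S11 can be pictured with the Python sketch below: candidate cut angles are tried in increments of α, each candidate ray from the camera center is scored by how many passage segments it crosses and how close those crossings are to right angles, and the best-scoring angle is kept as the development reference line. Modelling passages as straight line segments in image coordinates, the step size, and the scoring rule are illustrative assumptions rather than details specified in the patent.

```python
import math

def ray_segment_crossing(origin, angle, seg):
    """Return the crossing angle (radians) if the ray from `origin` at `angle`
    crosses segment seg = ((x1, y1), (x2, y2)); otherwise return None."""
    ox, oy = origin
    (x1, y1), (x2, y2) = seg
    dx, dy = math.cos(angle), math.sin(angle)
    ex, ey = x2 - x1, y2 - y1
    denom = dx * ey - dy * ex
    if abs(denom) < 1e-9:                       # parallel: treat as no crossing
        return None
    # Solve origin + t*(dx, dy) == (x1, y1) + u*(ex, ey)
    t = ((x1 - ox) * ey - (y1 - oy) * ex) / denom
    u = ((x1 - ox) * dy - (y1 - oy) * dx) / denom
    if t < 0 or not (0.0 <= u <= 1.0):
        return None
    # Acute angle between ray and segment (pi/2 means a right-angle crossing).
    return math.atan2(abs(dx * ey - dy * ex), abs(dx * ex + dy * ey))

def best_cut_angle(camera_center, passage_segments, alpha_deg=1.0):
    """Scan angles in alpha_deg steps; prefer fewest crossings, then most perpendicular."""
    best = None
    for k in range(int(round(360.0 / alpha_deg))):
        ang = math.radians(k * alpha_deg)
        crossings = [ray_segment_crossing(camera_center, ang, s) for s in passage_segments]
        crossings = [c for c in crossings if c is not None]
        # Perpendicularity score: how close each unavoidable crossing is to 90 degrees.
        perp = sum(c / (math.pi / 2) for c in crossings)
        score = (len(crossings), -perp)         # fewer crossings first, then more perpendicular
        if best is None or score < best[0]:
            best = (score, ang)
    return best[1]

# Usage sketch with hypothetical passage segments in image coordinates:
# segs = [((100, 300), (500, 300)), ((320, 50), (320, 450))]
# cut_angle = best_cut_angle((320, 240), segs)
```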
FIGS. 7(a) and 7(b) are diagrams showing a comparison between the case where the development reference position is set automatically (the present invention) and the case where the development reference position is fixed (conventional): (a) shows the camera image 100 and the panoramic developed image 101 when the development reference position is fixed, and (b) shows the camera image 100 and the panoramic developed image 101 when the development reference position is set automatically. As can be seen from the figure, the passage is divided when the development reference position is fixed, whereas the passage is not divided when the development reference position is set automatically.
As described above, the panoramic developed image display device 1 according to the present embodiment sets the development reference line for panoramic development of the camera image so that the number of intersections with the passages is smaller than a predetermined value. The passages are therefore not divided, and a panoramic developed image from which the situation of the camera installation location can be easily understood can be obtained.
(Embodiment 2)
FIG. 8 is a block diagram showing a schematic configuration of a panoramic developed image display device according to Embodiment 2 of the present invention. In the figure, the same reference numerals are given to the parts common to FIG. 1 described above. The panoramic developed image display device 2 according to the present embodiment has a function of estimating passage positions by analyzing flow lines (movement trajectories of persons) using a person tracking technique, and comprises an omnidirectional camera image input unit 10, a camera center position input unit 12, an omnidirectional camera image panorama development unit 14, a display unit 15, a person flow line information acquisition unit 16, a person flow line information history storage unit 17, a passage position estimation unit 18, and a development reference position determination unit 19.
The person flow line information acquisition unit 16 acquires person flow line information by obtaining the flow lines of persons in the camera image, and acquires and outputs the person flow line information for each frame. The person flow line information history storage unit 17 accumulates the person flow line information acquired for each frame by the person flow line information acquisition unit 16. The passage position estimation unit 18 estimates the passage positions from the per-frame person flow line information accumulated in the person flow line information history storage unit 17.
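One simple, purely illustrative way to realize the passage estimation performed by the passage position estimation unit 18 is to accumulate the per-frame person positions into an occupancy map and keep the cells that were visited often enough; the grid resolution and the min_visits threshold below are assumptions made for the example, not values from the disclosure.

```python
import numpy as np

def accumulate_flow_lines(frames_of_tracks, shape):
    """Accumulate per-frame person positions (x, y) into an occupancy map."""
    occupancy = np.zeros(shape, dtype=np.int32)
    for tracks in frames_of_tracks:          # one list of (x, y) per frame
        for x, y in tracks:
            occupancy[int(y), int(x)] += 1
    return occupancy

def estimate_passages(occupancy, min_visits=5):
    """Mark as passage every cell visited at least min_visits times."""
    return occupancy >= min_visits
```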
FIGS. 9(a) and 9(b) are diagrams illustrating an example of N frames of person flow line information accumulated in the person flow line information history storage unit 17 and of the passage information estimated from those N frames (N: a natural number): (a) is a camera image in which flow lines 120 have been accumulated for N frames, and (b) is the estimation result of the passage information.
Returning to FIG. 8, the development reference position determination unit 19 sets a development reference position for panoramic development of the camera image from the camera center position input by the camera center position input unit 12 and the passage positions estimated by the passage position estimation unit 18. The omnidirectional camera image panorama development unit 14 performs panoramic development by cutting the camera image based on the development reference position set by the development reference position determination unit 19. The display unit 15 displays the image panoramically developed by the omnidirectional camera image panorama development unit 14.
Next, the operation of the panoramic developed image display device 2 according to the present embodiment will be described.
FIG. 10 is a flowchart showing the operation of the panoramic developed image display device 2 according to the present embodiment. In the figure, the omnidirectional camera image input unit 10 first inputs a camera image (step S20). Next, the camera center position input unit 12 inputs the camera center position designated by the user for the camera image (step S21). Simultaneously with the camera center position input, the person flow line information acquisition unit 16 acquires person flow line information from an image of one frame (step S22). Next, the person flow line information history storage unit 17 accumulates the person flow line information acquired by the person flow line information acquisition unit 16 (step S23). When N frames of person flow line information have been accumulated in the person flow line information history storage unit 17, the passage position estimation unit 18 integrates the person flow line information of the past N frames (step S24) and then estimates the passage information based on the integrated person flow line information (step S25).
After the passage information has been estimated by the passage position estimation unit 18, the development reference position determination unit 19 sets a development reference position for panoramic development of the camera image (step S26). Since this processing (processing A) is the same as steps S4 to S6 and steps S10 and S11 of FIG. 4 described above, its description is omitted. After the development reference position has been set, the omnidirectional camera image panorama development unit 14 panoramically develops the camera image and displays it on the display unit 15 (step S27). Note that the one-frame image used by the omnidirectional camera image panorama development unit 14 is the same as the one-frame image first captured by the person flow line information acquisition unit 16. After the panoramic developed image for one frame of the camera image has been displayed, the omnidirectional camera image panorama development unit 14 determines whether the capture of camera images has been completed (step S28). If it determines that the capture has not been completed (i.e., if "No" is determined), it captures the camera image of the next frame (step S29), panoramically develops that camera image, and displays it on the display unit 15. When the omnidirectional camera image panorama development unit 14 determines that the capture of camera images has been completed (i.e., when "Yes" is determined), it ends this processing. When the person flow line information is to be updated every frame, the camera image of the next frame is acquired and the processing from step S22 onward is repeated.
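The overall loop of steps S20 to S29 might be glued together as in the sketch below, which reuses the helpers sketched earlier in this section. The frame source, the display callback show, the person detector detect_people, and the helper passage_map_to_segments (which would convert the binary passage map into line segments) are hypothetical components supplied by the caller; the disclosure does not fix any of them, and the radii and frame count are likewise assumptions.

```python
import math

def run_embodiment2(frames, show, center, detect_people, passage_map_to_segments,
                    n_frames=100, r_in=40, r_out=400, alpha=math.radians(1.0)):
    """Illustrative main loop for Embodiment 2 (steps S20 to S29). `frames` is any
    iterable of images, `show` a display callback, `detect_people` a person tracker
    returning (x, y) positions per frame, and `passage_map_to_segments` a helper
    turning the binary passage map into line segments -- all caller-supplied."""
    history = []
    ref_angle = 0.0
    for img in frames:
        history.append(detect_people(img))                      # step S22
        if len(history) == n_frames:                             # steps S23 to S25
            occupancy = accumulate_flow_lines(history, img.shape[:2])
            segments = passage_map_to_segments(estimate_passages(occupancy))
            ref_angle = choose_reference_angle(center, segments, alpha)  # step S26
        pano = panorama_unwrap(img, center, r_in, r_out, ref_angle)      # step S27
        show(pano)
```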
As described above, the panoramic developed image display device 2 according to the present embodiment acquires the person flow line information in the camera image and estimates the passage information from the acquired person flow line information. Since the user is relieved of the operation of inputting the passage information, operability is improved.
(Embodiment 3)
FIG. 11 is a block diagram showing a schematic configuration of a panoramic developed image display device according to Embodiment 3 of the present invention. In the figure, the same reference numerals are given to the parts common to FIGS. 1 and 8 described above. In addition to the functions of the panoramic developed image display device 2 described above, the panoramic developed image display device 3 according to the present embodiment has a function of analyzing the degree of congestion of each passage from the estimated passage positions and setting the development reference position so that passages with a high degree of congestion are given priority and are not interrupted. It comprises an omnidirectional camera image input unit 10, a camera center position input unit 12, an omnidirectional camera image panorama development unit 14, a display unit 15, a person flow line information acquisition unit 16, a person flow line information history storage unit 17, a passage position estimation unit 18, a passage congestion degree calculation unit (congestion degree calculation means) 20, and a development reference position determination unit 21.
The degree of congestion is, for example, the ratio, expressed as a percentage, of the number of persons who actually passed through a passage to the maximum number of persons who can pass through it per unit time. By calculating the degree of congestion for each passage, the development reference position can be set and displayed so that passages with a high degree of congestion are given priority and are not interrupted. Since the degree of congestion is described in detail in prior art documents (for example, JP 2009-110152 A and JP 2009-110054 A), its description is omitted in this specification.
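As a purely numerical illustration of this definition (the cited documents, not this sketch, define the actual calculation): if a passage can carry at most 40 people per unit time and 14 actually passed, the degree of congestion is 14 / 40 = 35%.

```python
def congestion_degree(actual_count, max_count_per_unit_time):
    """Degree of congestion as a percentage of the passage's capacity
    for the unit time (0 when the capacity is unknown or zero)."""
    if max_count_per_unit_time <= 0:
        return 0.0
    return 100.0 * actual_count / max_count_per_unit_time

# Example: 14 people passed in a minute through a passage rated for 40 per minute
assert abs(congestion_degree(14, 40) - 35.0) < 1e-9
```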
Next, the operation of the panoramic developed image display device 3 according to the present embodiment will be described.
FIG. 12 is a flowchart showing the operation of the panoramic developed image display device 3 according to the present embodiment. In the figure, the omnidirectional camera image input unit 10 first inputs a camera image (step S30). Next, the camera center position input unit 12 inputs the camera center position designated by the user for the camera image (step S31). Simultaneously with the camera center position input, the person flow line information acquisition unit 16 acquires person flow line information from an image of one frame (step S32). Then, the person flow line information history storage unit 17 accumulates the person flow line information acquired by the person flow line information acquisition unit 16 (step S33). When N frames of person flow line information have been accumulated in the person flow line information history storage unit 17, the passage position estimation unit 18 integrates the person flow line information of the past N frames (step S34) and then estimates the passage information based on the integrated person flow line information (step S35).
After the passage information has been estimated by the passage position estimation unit 18, the passage congestion degree calculation unit 20 calculates the degree of congestion for each passage (step S36). After the degree of congestion has been calculated for each passage, the development reference position determination unit 21 determines whether the camera center position exists on a passage (step S37). If it determines that the camera center position exists on a passage (i.e., if "Yes" is determined), it then determines whether the camera center position exists at a passage intersection (step S38).
When the development reference position determination unit 21 determines that the camera center position does not exist at a passage intersection (i.e., when "No" is determined), it sets the development reference position in a direction orthogonal to the passage (step S39). When the development reference position has been set, the omnidirectional camera image panorama development unit 14 panoramically develops the camera image based on the development reference position set by the development reference position determination unit 21 and displays it on the display unit 15 (step S40). After the panoramic developed image for one frame of the camera image has been displayed, the omnidirectional camera image panorama development unit 14 determines whether the capture of camera images has been completed (step S41). If it determines that the capture has not been completed (i.e., if "No" is determined), it captures the camera image of the next frame (step S42) and returns to step S32. When the omnidirectional camera image panorama development unit 14 determines that the capture of camera images has been completed (i.e., when "Yes" is determined), it ends this processing.
On the other hand, when the development reference position determination unit 21 determines in step S37 that the camera center position does not exist on a passage (i.e., determines "No"), or determines in step S38 that the camera center position exists at a passage intersection (i.e., determines "Yes"), it sets straight lines (i.e., candidate development reference lines) from the camera center position toward the edge of the image while shifting the angle by α at a time (step S43). By shifting the angle by α from the camera center position toward the edge of the image, the development reference position determination unit 21 sets the development reference position so that the number of intersections between the straight line and the passages is minimized, the number of intersections at right angles is maximized, and the average degree of congestion of the crossed passages is minimized (step S44). After the development reference position has been set, the process proceeds to step S40. Thereafter, the processing of steps S40 to S42 described above is performed.
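One plausible reading of step S44, again for illustration only, ranks each candidate angle lexicographically by (number of crossings, negative number of right-angle crossings, average congestion of the crossed passages); whether the three criteria are combined this way or weighted differently is not fixed by the disclosure. The sketch reuses count_crossings from the earlier example.

```python
import math

def choose_reference_angle_with_congestion(center, passages, congestion,
                                           alpha=math.radians(1)):
    """Steps S43-S44: `passages` is a list of segments and `congestion` a parallel
    list of per-passage congestion degrees. Candidates are ranked by fewest
    crossings, then most right-angle crossings, then lowest average congestion
    of the crossed passages."""
    best = None
    angle = 0.0
    while angle < 2 * math.pi:
        crossings = right_angles = 0
        crossed = []
        for seg, cong in zip(passages, congestion):
            c, r = count_crossings(center, angle, [seg])
            if c:
                crossings += 1
                right_angles += r
                crossed.append(cong)
        avg_cong = sum(crossed) / len(crossed) if crossed else 0.0
        key = (crossings, -right_angles, avg_cong)
        if best is None or key < best[0]:
            best = (key, angle)
        angle += alpha
    return best[1]
```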
As described above, the panoramic developed image display device 3 according to the present embodiment analyzes the degree of congestion of each passage and sets the development reference position so that passages with a high degree of congestion are given priority and are not interrupted, so that congested passages can be clearly seen.
(Embodiment 4)
FIG. 13 is a block diagram showing a schematic configuration of a panoramic developed image display device according to Embodiment 4 of the present invention. In the figure, the same reference numerals are given to the parts common to FIGS. 1, 8, and 11 described above. In addition to the functions of the panoramic developed image display device 3 described above, the panoramic developed image display device 4 according to the present embodiment has a function of changing the development reference position for each time zone according to the degree of congestion. It comprises an omnidirectional camera image input unit 10, a camera center position input unit 12, an omnidirectional camera image panorama development unit 14, a display unit 15, a person flow line information acquisition unit 16, a person flow line information history storage unit 17, a passage position estimation unit 18, a passage congestion degree calculation unit 20, a development reference position determination unit 21, and a congestion degree information history storage unit 22.
Next, the operation of the panoramic developed image display device 4 according to the present embodiment will be described.
FIG. 14 is a flowchart showing the operation of the panoramic developed image display device 4 according to the present embodiment. In the figure, the omnidirectional camera image input unit 10 first inputs a camera image (step S50). Next, the camera center position input unit 12 inputs the camera center position designated by the user for the camera image (step S51). Simultaneously with the camera center position input, the person flow line information acquisition unit 16 acquires person flow line information from an image of one frame (step S52). Then, the person flow line information history storage unit 17 accumulates the person flow line information acquired by the person flow line information acquisition unit 16 (step S53). When N frames of person flow line information have been accumulated in the person flow line information history storage unit 17, the passage position estimation unit 18 integrates the person flow line information of the past N frames (step S54) and then estimates the passage information based on the integrated person flow line information (step S55).
After the passage information has been estimated, the passage congestion degree calculation unit 20 calculates the degree of congestion for each passage (step S56) and acquires time information (step S57). Then, the congestion degree average table for each passage stored in the congestion degree information history storage unit 22 is updated (step S58). FIG. 15 is a diagram illustrating an example of the congestion degree average table for each passage and of the passages on the camera image for the 10:00-11:00 and 12:00-13:00 entries of the table. Each passage 130 is drawn with a thickness corresponding to its degree of congestion; the thicker the line, the higher the degree of congestion.
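The congestion degree average table kept by the congestion degree information history storage unit 22 could be represented, for example, as running averages bucketed by hour of day, as in the illustrative data structure below; the hourly bucketing and the running-average scheme are assumptions made for the example, not requirements of the disclosure.

```python
from collections import defaultdict
from datetime import datetime

class CongestionHistory:
    """Running per-passage congestion averages, bucketed by hour of day.
    An illustrative data structure only."""
    def __init__(self):
        self._sum = defaultdict(float)    # (hour, passage_id) -> summed degrees
        self._cnt = defaultdict(int)      # (hour, passage_id) -> sample count

    def update(self, passage_id, degree, when=None):
        when = when or datetime.now()
        key = (when.hour, passage_id)     # e.g. the 10:00-11:00 band is hour 10
        self._sum[key] += degree
        self._cnt[key] += 1

    def average(self, passage_id, hour):
        key = (hour, passage_id)
        return self._sum[key] / self._cnt[key] if self._cnt[key] else 0.0
```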
Returning to FIG. 14, after the congestion degree average table for each passage has been updated, the development reference position determination unit 21 determines whether the development position is to be updated (step S59). If it determines that the development position is to be updated (i.e., if "Yes" is determined), it refers to the congestion degree average table stored in the congestion degree information history storage unit 22 and sets the degree of congestion for each passage (step S60). Next, it sets the development reference position (step S61). Since this processing (processing B) is the same as steps S37 to S39 and steps S43 and S44 of FIG. 12 described above, its description is omitted. After the development reference position has been set, the omnidirectional camera image panorama development unit 14 panoramically develops the camera image and displays it on the display unit 15 (step S62). Note that the one-frame image used by the omnidirectional camera image panorama development unit 14 is the same as the one-frame image first captured by the person flow line information acquisition unit 16. After the panoramic developed image for one frame of the camera image has been displayed, the omnidirectional camera image panorama development unit 14 determines whether the capture of camera images has been completed (step S63). If it determines that the capture has not been completed (i.e., if "No" is determined), it captures the camera image of the next frame (step S64) and returns to step S52. When the omnidirectional camera image panorama development unit 14 determines that the capture of camera images has been completed (i.e., when "Yes" is determined), it ends this processing.
On the other hand, when the development reference position determination unit 21 determines in step S59 that the development position is not to be updated (i.e., determines "No"), it retains the previous development reference position (step S65) and proceeds to the processing of step S62.
As described above, the panoramic developed image display device 4 according to the present embodiment changes the development reference position for each time zone according to the degree of congestion of each passage, so that the passages that are congested in each time zone can be clearly seen.
(Embodiment 5)
FIG. 18 is a block diagram showing a schematic configuration of Embodiment 5 of the present invention. In the figure, Embodiment 5 comprises an omnidirectional camera 181, an input unit 182, a processing unit 183, and a display unit 184. The omnidirectional camera 181 is a camera that inputs an omnidirectional image; the input unit 182 may be an input device such as a keyboard, the processing unit 183 may be a personal computer, and the display unit 184 may be a liquid crystal display or the like.
Embodiment 5 is a more specific implementation of Embodiment 1 of the present invention. In relation to Embodiment 1, the omnidirectional camera image input unit 10 of Embodiment 1 corresponds to the omnidirectional camera 181, the camera center position input unit 12 and the passage information input unit 11 correspond to the input unit 182, the development reference position determination unit 13 and the omnidirectional camera image panorama development unit 14 are implemented in the processing unit 183, and the display unit 15 corresponds to the display unit 184.
(Embodiment 6)
FIG. 19 is a block diagram showing a schematic configuration of Embodiment 6 of the present invention. Embodiment 6 realizes, within the omnidirectional camera itself, the functions of Embodiment 2 other than the display unit 15, with the exception of some functions.
In FIG. 19, the same reference numerals are given to the same components as in FIG. 8, which shows Embodiment 2. In FIG. 19, the omnidirectional camera 191 incorporates the omnidirectional camera image input unit 10, a camera center position holding unit 192, the person flow line information acquisition unit 16, the person flow line information history storage unit 17, the passage position estimation unit 18, the development reference position determination unit 19, and the omnidirectional camera image panorama development unit 14.
In Embodiment 6, the camera center position described in Embodiment 1 is held in advance. The other functions are the same as in Embodiment 2, and the omnidirectional camera 191 outputs the panoramic image developed by the omnidirectional camera image panorama development unit 14.
The output panoramic image may be displayed on a liquid crystal display or the like connected to the omnidirectional camera 191, displayed via a network on a liquid crystal display located away from the omnidirectional camera 191, or stored in a storage device having an image storage function.
Although the present invention has been described in detail and with reference to specific embodiments, it will be apparent to those skilled in the art that various changes and modifications can be made without departing from the spirit and scope of the invention.
This application is based on a Japanese patent application (Japanese Patent Application No. 2012-269256) filed on December 10, 2012, the contents of which are incorporated herein by reference.
The present invention has the effect that a panoramic developed image from which the situation of the camera installation location can be easily understood can be obtained from a camera image captured by an omnidirectional camera, and it can be applied to any device capable of displaying a panoramic developed image from a camera image.
1, 2, 3, 4  Panoramic developed image display device
10  Omnidirectional camera image input unit
11  Passage information input unit
12  Camera center position input unit
13  Development reference position determination unit
14  Omnidirectional camera image panorama development unit
15  Display unit
16  Person flow line information acquisition unit
17  Person flow line information history storage unit
18  Passage position estimation unit
19, 21  Development reference position determination unit
20  Passage congestion degree calculation unit
22  Congestion degree information history storage unit
100  Camera image
101  Panoramic developed image
110  Development reference line
130  Passage
150  Camera position
160  Intersection
181  Omnidirectional camera
182  Input unit
183  Processing unit
184  Display unit
191  Omnidirectional camera
192  Camera center position holding unit

Claims (8)

1.  A panoramic developed image display device comprising:
    image input means for inputting an omnidirectional image;
    passage acquisition means for obtaining passages of objects photographed in the omnidirectional image;
    panoramic development means for performing panoramic development by cutting the omnidirectional image along a cutting line extending from a predetermined center position toward the circumference; and
    display means for displaying the panoramically developed image,
    wherein the cutting line is set so that the number of intersections with the passages is smaller than a predetermined value.
2.  The panoramic developed image display device according to claim 1, wherein the cutting line is set so that the number of the passages that are at right angles to the cutting line is larger than a predetermined value.
3.  The panoramic developed image display device according to claim 1 or 2, wherein the omnidirectional image is an image captured by an omnidirectional camera, and the predetermined center position is the position in the image at which the point directly below the omnidirectional camera is photographed.
4.  The panoramic developed image display device according to any one of claims 1 to 3, wherein the passage information is generated from flow lines obtained for objects in the omnidirectional image.
5.  The panoramic developed image display device according to any one of claims 1 to 4, further comprising congestion degree calculation means for calculating a degree of congestion for each passage,
    wherein the cutting line is set so as to cross passages whose degree of congestion is lower than a predetermined value.
6.  The panoramic developed image display device according to claim 5, wherein the congestion degree calculation means calculates a degree of congestion for each time zone, and the cutting line is set for each time zone according to the degree of congestion for that time zone.
7.  A panoramic developed image output device comprising:
    image input means for inputting an omnidirectional image;
    passage acquisition means for obtaining passages of objects photographed in the omnidirectional image; and
    panoramic development means for performing panoramic development by cutting the omnidirectional image along a cutting line extending from a predetermined center position toward the circumference and outputting the developed image,
    wherein the cutting line is set so that the number of intersections with the passages is smaller than a predetermined value.
8.  A panoramic developed image display method comprising:
    an image input step of inputting an omnidirectional image;
    a passage acquisition step of obtaining passages of objects photographed in the omnidirectional image;
    a panoramic development step of performing panoramic development by cutting the omnidirectional image along a cutting line extending from a predetermined center position toward the circumference; and
    a display step of displaying the panoramically developed image,
    wherein the cutting line is set so that the number of intersections with the passages is smaller than a predetermined value.
PCT/JP2013/007209 2012-12-10 2013-12-06 Display device for panoramically expanded image WO2014091736A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012269256 2012-12-10
JP2012-269256 2012-12-10

Publications (1)

Publication Number Publication Date
WO2014091736A1 true WO2014091736A1 (en) 2014-06-19

Family

ID=50934044

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/007209 WO2014091736A1 (en) 2012-12-10 2013-12-06 Display device for panoramically expanded image

Country Status (1)

Country Link
WO (1) WO2014091736A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006244069A (en) * 2005-03-02 2006-09-14 Hitachi Software Eng Co Ltd Optimum route search device and method in station yard
JP2008028606A (en) * 2006-07-20 2008-02-07 Opt Kk Imaging device and imaging system for panoramically expanded image
JP2011244116A (en) * 2010-05-17 2011-12-01 Panasonic Corp Panoramic expanded image photography system and method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115499594A (en) * 2022-09-30 2022-12-20 如你所视(北京)科技有限公司 Panoramic image generation method and computer-readable storage medium
CN115499594B (en) * 2022-09-30 2023-06-30 如你所视(北京)科技有限公司 Panoramic image generation method and computer-readable storage medium

Legal Events

Date Code Title Description
121  Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 13862944; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase (Ref country code: DE)
122  Ep: pct application non-entry in european phase (Ref document number: 13862944; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase (Ref country code: JP)