WO2023286569A1 - Congestion estimating system - Google Patents

Congestion estimating system

Info

Publication number
WO2023286569A1
WO2023286569A1 (PCT/JP2022/025346)
Authority
WO
WIPO (PCT)
Prior art keywords
congestion
image
congestion degree
imaging
unit
Application number
PCT/JP2022/025346
Other languages
French (fr)
Japanese (ja)
Inventor
隆太郎 久田
幹生 岩村
亮介 倉地
雅之 横尾
文紀 新清
けい 石黒
健吾 松本
Original Assignee
株式会社NTTドコモ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by 株式会社NTTドコモ
Priority to JP2023535209A
Publication of WO2023286569A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes

Definitions

  • the present invention relates to a congestion level estimation system for estimating the congestion level of an estimation target location.
  • Conventionally, it has been proposed to detect a person appearing in an image captured by an imaging device such as a camera and to estimate the degree of congestion at the place where the image was captured (see Patent Document 1, for example).
  • In the conventional technique, the degree of congestion is estimated using an image captured by an imaging device fixedly installed in advance at the place where the degree of congestion is to be estimated. For this reason, an imaging device must be provided at a fixed location in advance for every place whose degree of congestion is to be estimated, and as the number of such locations increases, large costs and labor are required.
  • An embodiment of the present invention has been made in view of the above, and aims to provide a congestion degree estimation system that can more easily and appropriately estimate the congestion degree of an estimation target location.
  • A congestion degree estimation system according to an embodiment of the present invention includes: an image acquisition unit that acquires an image used for estimating a congestion degree; an imaging information acquisition unit that acquires, based on the image acquired by the image acquisition unit, imaging information indicating the position and imaging direction, at the time of imaging, of the imaging device that captured the image; a people detection unit that detects, from the image acquired by the image acquisition unit, the number of people appearing in the image; and a congestion degree estimation unit that identifies, from the imaging information acquired by the imaging information acquisition unit, a location for which the congestion degree is to be estimated, and estimates the congestion degree of the identified location from the number of people detected by the people detection unit.
  • In this congestion degree estimation system, a place whose congestion level is to be estimated is specified from the acquired image, and the congestion level is estimated for the specified place. Therefore, the congestion degree can be estimated using an image captured by an imaging device other than one fixedly provided in advance, for example, an image captured by an imaging device carried by a user at the location where the congestion level is to be estimated.
  • In addition, since the imaging information indicating the position and imaging direction at the time of imaging is acquired based on the image, and the location for which the degree of congestion is to be estimated is specified from that information, the location can be specified appropriately. Therefore, according to the congestion degree estimation system of one embodiment of the present invention, the congestion degree of the estimation target location can be estimated more easily and appropriately.
  • FIG. 1 shows a congestion degree estimation system 10 according to this embodiment.
  • the congestion degree estimation system 10 is a system (apparatus) for estimating the congestion degree of a predetermined place.
  • the degree of congestion indicates the degree of human congestion (how many people are in the location) at the location to be estimated.
  • the congestion degree estimation system 10 estimates the congestion degree in buildings such as commercial facilities. In this case, the inside of the building is divided into a plurality of areas (sections, measurement points), and each of the plurality of areas is set as a place to be estimated.
  • Each of the plurality of regions may have, for example, a mesh-like shape.
  • Each of the plurality of regions corresponds, for example, to individual sales floors within the commercial facility.
  • the congestion degree estimation system 10 manages and outputs information indicating the estimated congestion degree.
  • Information indicating the degree of congestion is output, for example, so that it can be referred to by a person who has visited the place to be estimated.
  • By referring to the estimated degree of congestion, it is possible to grasp which parts of the commercial facility are crowded and which parts are vacant. Note that the location for which the degree of congestion is to be estimated is not limited to the inside of a building such as the above-mentioned commercial facility.
  • the congestion degree estimation system 10 is configured by a computer such as a PC (personal computer) or a server device.
  • the congestion degree estimation system 10 may be configured by a plurality of computers, that is, computer systems.
  • the congestion degree estimation system 10 has a communication function, and can exchange information with other devices.
  • the congestion degree estimation system 10 estimates the degree of congestion using images (camera images) captured by the terminal 20 at locations targeted for estimation of the degree of congestion.
  • the terminal 20 is a device that is carried and used by a user, and is, for example, a mobile communication terminal such as a smart phone.
  • The terminal 20 may be a device worn by the user, such as AR (Augmented Reality) glasses (in this case, the terminal 20 is worn on the user's head, specifically over the eyes).
  • the terminal 20 is equipped with an imaging device.
  • the imaging device may be an existing one, for example, a camera normally provided in a smartphone or the like.
  • the terminal 20 obtains an image by capturing an image using the image capturing function at the place whose congestion degree is to be estimated. For example, the terminal 20 captures an image at a location whose congestion degree is to be estimated at regular intervals (for example, every few minutes). Also, the image obtained by imaging may be a moving image.
  • the terminal 20 has a communication function and can exchange information with the congestion degree estimation system 10 .
  • the communication function may be an existing one.
  • the terminal 20 transmits the captured image to the congestion degree estimation system 10 .
  • the transmission is performed, for example, each time imaging is performed.
  • The congestion degree estimation system 10 includes an image acquisition unit 11, an imaging information acquisition unit 12, a people detection unit 13, and a congestion degree estimation unit 14.
  • the image acquisition unit 11 is a functional unit that acquires an image used for estimating the degree of congestion.
  • the image acquisition unit 11 receives an image transmitted from the terminal 20 and acquires the image.
  • the image acquisition unit 11 may acquire an image by a method other than the above.
  • the image acquisition unit 11 outputs the acquired image to the imaging information acquisition unit 12 and the number detection unit 13 .
  • The imaging information acquisition unit 12 is a functional unit that acquires, based on the image, imaging information indicating the position and imaging direction, at the time of imaging, of the imaging device (that is, the terminal 20) that captured the image. The imaging information acquisition unit 12 may acquire information indicating a three-dimensional position in the building as the information indicating the position at the time of imaging.
  • the terminal 20 is a device worn by the user, such as AR glasses, and the imaging device provided in the terminal 20 captures the line-of-sight direction of the user wearing the terminal 20, the above-mentioned imaging direction is the line-of-sight direction of the user. becomes.
  • The information indicating the position of the terminal 20 at the time of imaging is, for example, information indicating coordinates (a three-dimensional position) in a three-dimensional coordinate system defined in advance for the building, such as a commercial facility, whose degree of congestion is to be estimated.
  • the information indicating the imaging direction is, for example, information indicating an absolute angle (azimuth) with respect to the north direction.
  • the imaging information to be acquired may be information other than the above as long as it is acquired based on an image and indicates the position and imaging direction of the terminal 20 at the time of imaging.
  • the imaging information acquisition unit 12 acquires imaging information as follows.
  • the imaging information acquisition unit 12 inputs an image from the image acquisition unit 11 .
  • the imaging information acquisition unit 12 acquires imaging information based on the image input from the image acquisition unit 11 using existing technology.
  • Specifically, the imaging information acquisition unit 12 stores in advance information indicating positions and imaging directions in association with image-related information such as feature points.
  • the imaging information acquisition unit 12 extracts information related to the input image, such as feature points of the image input from the image acquisition unit 11 .
  • the imaging information acquisition unit 12 collates the extracted feature points and the like with information related to the image such as the stored feature points.
  • The imaging information acquisition unit 12 then calculates and acquires the imaging information for the image input from the image acquisition unit 11, based on the stored correspondence information and the collation result. In this manner, the imaging information acquisition unit 12 may acquire the imaging information through a self-localization function based on the image.
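The self-localization step described above can be sketched as follows. This is an illustrative simplification, not the embodiment's concrete method: the stored map is reduced to whole-image feature vectors paired with known capture poses, and a query descriptor is matched by cosine similarity. The database contents, vectors, and function names are all invented for illustration; a real system would match local feature points and recover the pose geometrically.

```python
import math

# Hypothetical pre-registered "map": each entry pairs an image descriptor
# (here a plain feature vector) with the pose at which it was captured.
KEYFRAME_DB = [
    {"desc": [0.9, 0.1, 0.0], "position": (2.0, 5.0, 1.5), "azimuth_deg": 90.0},
    {"desc": [0.1, 0.8, 0.3], "position": (10.0, 3.0, 1.5), "azimuth_deg": 180.0},
    {"desc": [0.2, 0.2, 0.9], "position": (6.0, 12.0, 4.5), "azimuth_deg": 0.0},
]

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def estimate_imaging_info(query_desc):
    """Return the (position, azimuth) of the best-matching stored keyframe."""
    best = max(KEYFRAME_DB, key=lambda kf: cosine_similarity(query_desc, kf["desc"]))
    return best["position"], best["azimuth_deg"]

position, azimuth = estimate_imaging_info([0.95, 0.05, 0.0])
print(position, azimuth)  # closest to the first stored keyframe
```

In a deployment, the collation would be delegated to a visual positioning service (as noted below for external servers), with the same interface of image in, pose out.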
  • the imaging information acquisition unit 12 may acquire imaging information using an external function of the congestion degree estimation system 10 .
  • the imaging information acquisition unit 12 may transmit an image to an external server (external service) that calculates imaging information from the image, and acquire imaging information calculated from the image by the external server.
  • An external server may be, for example, one provided by GCA or Immersal.
  • the imaging information acquisition unit 12 may acquire imaging information based on an image by a method other than the above.
  • the imaging information acquisition unit 12 outputs the acquired imaging information to the congestion degree estimation unit 14 .
  • the number detection unit 13 is a functional unit that detects the number of people in the image acquired by the image acquisition unit 11 .
  • the number detection unit 13 receives an image from the image acquisition unit 11 .
  • the number-of-people detection unit 13 detects the number of people in the image input from the image acquisition unit 11 by using an existing image recognition technique or the like.
  • the number detection unit 13 outputs information indicating the detected number of people to the congestion degree estimation unit 14 . Note that the imaging information acquired by the imaging information acquisition unit 12 and the number of people detected by the people detection unit 13 from the same image are associated with each other.
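The counting performed by the people detection unit can be a thin wrapper around any existing detector. The sketch below assumes a hypothetical detector that outputs (label, confidence) pairs; the labels, threshold, and example detections are invented for illustration.

```python
# Hypothetical detector output: a list of (class_label, confidence) pairs
# produced by any off-the-shelf person detector. The people detection unit
# only needs the count of confident "person" detections, which it keeps
# associated with the imaging information derived from the same image.

def count_people(detections, min_confidence=0.5):
    """Count detections labeled 'person' above a confidence threshold."""
    return sum(
        1 for label, conf in detections
        if label == "person" and conf >= min_confidence
    )

detections = [("person", 0.92), ("person", 0.40), ("bag", 0.88), ("person", 0.75)]
print(count_people(detections))  # 2 confident person detections
```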
  • The congestion degree estimation unit 14 is a functional unit that identifies, from the imaging information acquired by the imaging information acquisition unit 12, a location whose congestion degree is to be estimated, and estimates the congestion degree of the identified location from the number of people detected by the people detection unit 13. The congestion degree estimation unit 14 may identify, as the location whose congestion degree is to be estimated, any one of the plurality of areas into which the building is divided. The congestion degree estimation unit 14 may also estimate the congestion degree of the same place from the numbers of people detected by the people detection unit 13 from each of a plurality of images.
  • the congestion degree estimation unit 14 receives imaging information from the imaging information acquisition unit 12 .
  • The congestion degree estimation unit 14 also receives, from the people detection unit 13, information indicating the detected number of people.
  • the congestion degree estimating unit 14 estimates the congestion degree as follows, using imaging information obtained from the same image and information indicating the number of people.
  • the congestion degree estimation unit 14 identifies, from the imaging information, the position appearing in the image related to the imaging information.
  • When the terminal 20 is a device worn by the user, such as AR glasses, and the imaging device provided in the terminal 20 captures images in the line-of-sight direction of the user wearing the terminal 20, the position appearing in the image corresponds to a position within the user's field of view.
  • Specifically, the congestion degree estimation unit 14 identifies, as the position appearing in the image, a position that is a preset distance (for example, several meters) away from the imaging position of the terminal 20 indicated by the imaging information, in the imaging direction indicated by the imaging information.
  • The congestion degree estimation unit 14 stores in advance information indicating the position of each area whose congestion degree is to be estimated, in the same coordinate system as the position of the terminal 20 at the time of imaging indicated by the imaging information.
  • the congestion degree estimating unit 14 determines an area including the identified position based on the pre-stored position of each area whose congestion degree is to be estimated.
  • The congestion degree estimation unit 14 uses the area determined to include the identified position as the area whose congestion degree is to be estimated from the number of people detected by the people detection unit 13.
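The location-identification procedure above (project a preset distance in the imaging direction, then find the area containing the projected position) can be sketched as follows, under assumed conventions: positions are (x, y, z) in the building's coordinate system, north is the +y axis, and azimuth is measured clockwise from north. The areas are modeled as axis-aligned boxes, and the area table and its names are invented for illustration; real floor plans may use arbitrary polygons.

```python
import math

# Illustrative area table: each area of the building is an axis-aligned box
# (min corner, max corner) in the building's 3D coordinate system.
AREAS = {
    "food_court_1F": {"min": (0.0, 0.0, 0.0), "max": (20.0, 15.0, 3.0)},
    "apparel_2F":    {"min": (0.0, 0.0, 3.0), "max": (20.0, 15.0, 6.0)},
}

def project_viewpoint(position, azimuth_deg, distance=3.0):
    """Position a preset distance ahead of the camera in the imaging direction."""
    x, y, z = position
    rad = math.radians(azimuth_deg)  # azimuth: clockwise from north (+y)
    return (x + distance * math.sin(rad), y + distance * math.cos(rad), z)

def identify_area(position, azimuth_deg, distance=3.0):
    """Return the name of the area containing the projected position, or None."""
    px, py, pz = project_viewpoint(position, azimuth_deg, distance)
    for name, box in AREAS.items():
        (x0, y0, z0), (x1, y1, z1) = box["min"], box["max"]
        if x0 <= px <= x1 and y0 <= py <= y1 and z0 <= pz <= z1:
            return name
    return None
```

Because the position is three-dimensional, the same (x, y) footprint on different floors resolves to different areas, which is the multi-story case discussed later.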
  • the congestion degree estimation unit 14 may use the number of people detected by the number detection unit 13 as the congestion degree of the area.
  • the congestion degree estimating unit 14 may store a formula or condition for estimating the congestion degree from the number of people, and estimate the congestion degree of the area based on the formula or condition.
  • The estimated degree of congestion may be, for example, a numerical value indicating the degree of congestion, or a qualitative level of congestion (for example, whether the congestion is large, medium, or small).
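A minimal sketch of such a condition, mapping a detected head count to a qualitative level; the thresholds are arbitrary assumptions, since the description leaves the concrete formula or conditions to the implementer:

```python
# Illustrative condition for turning a people count into a qualitative
# congestion level. The cutoffs (5 and 20) are assumed values.
def congestion_level(num_people):
    if num_people >= 20:
        return "large"
    if num_people >= 5:
        return "medium"
    return "small"

print(congestion_level(3), congestion_level(12), congestion_level(30))
```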
  • the congestion degree estimation unit 14 may store map data indicating the position of the area and associate the estimated congestion degree with the map. That is, the congestion degree estimation unit 14 may plot the congestion degree such as the number of people on the map.
  • When the congestion degree has already been estimated for the area identified from the imaging information as the area whose congestion degree is to be estimated, that is, when information indicating the number of people already exists for that area (pieces of information indicating the number of people overlap), the congestion degree estimation unit 14 may estimate the congestion degree of the area from both the existing information and the new information indicating the number of people. For example, the congestion degree estimation unit 14 may use the average of these numbers as the congestion degree. In this manner, the congestion degree estimation unit 14 may estimate the congestion degree by combining the numbers of people detected by the people detection unit 13 from each of a plurality of images of the same location.
  • The congestion degree estimation unit 14 may reset the congestion degree of each area at regular intervals (for example, every few minutes), and calculate a new congestion degree using only the information indicating the numbers of people acquired within the current interval.
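The combination of averaging overlapping counts and periodic resetting might be organized as follows; the class name and structure are assumptions, and the wall-clock timer that would trigger the reset is omitted:

```python
from collections import defaultdict

# Sketch of combining overlapping observations: people counts reported for
# the same area within one interval are averaged, and discarded when a new
# estimation interval begins.
class CongestionEstimator:
    def __init__(self):
        self.counts = defaultdict(list)  # area name -> counts in this interval

    def add_observation(self, area, num_people):
        self.counts[area].append(num_people)

    def congestion(self, area):
        samples = self.counts.get(area)
        if not samples:
            return None  # no observation for this area yet
        return sum(samples) / len(samples)  # average of overlapping counts

    def reset_interval(self):
        self.counts.clear()  # start a fresh estimation interval

est = CongestionEstimator()
est.add_observation("food_court", 8)
est.add_observation("food_court", 12)
print(est.congestion("food_court"))  # 10.0
```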
  • the congestion degree estimation unit 14 outputs information indicating the estimated congestion degree for each area.
  • the congestion degree estimation unit 14 may transmit information indicating the estimated congestion degree for each region to the terminal 20 .
  • the congestion degree estimation unit 14 may transmit information indicating the estimated congestion degree of each area to a display device such as a digital signage installed in a building such as a commercial facility where the congestion degree is estimated. By displaying information indicating the degree of congestion for each area on these devices that have received the information, a person in the building can grasp the degree of congestion for each area.
  • the above transmission may be performed upon request or at fixed time intervals.
  • the congestion degree estimation unit 14 may output information indicating the estimated congestion degree of each region to an output destination other than the above by a method other than the above.
  • the functions of the congestion degree estimation system 10 according to the present embodiment have been described above.
  • the image acquisition unit 11 acquires an image used for estimating the degree of congestion (S01).
  • the imaging information acquisition unit 12 acquires imaging information indicating the position and imaging direction of the image of the terminal 20 that captured the image (S02).
  • Also, the number of people appearing in the image is detected from the image by the people detection unit 13 (S03). Note that the imaging information acquisition processing (S02) by the imaging information acquisition unit 12 and the people detection processing (S03) by the people detection unit 13 can be performed independently of each other, and therefore need not be performed in the order shown in the figure.
  • the congestion degree estimation unit 14 identifies a location whose congestion degree is to be estimated from the imaging information (S04). Subsequently, the congestion level estimation unit 14 estimates the congestion level of the specified place from the detected number of people (S05). Subsequently, information indicating the estimation result of the congestion degree is output by the congestion degree estimation unit 14 (S06). The above is the processing executed by the congestion degree estimation system 10 according to the present embodiment.
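The overall flow S01 to S06 can be sketched end to end with each functional unit stubbed out; all names and return values below are placeholders invented for illustration:

```python
# End-to-end sketch of steps S01-S06; each stub stands in for the
# corresponding functional unit described above.
def acquire_image():                       # S01: image acquisition unit 11
    return "image-bytes"

def acquire_imaging_info(image):           # S02: imaging information unit 12
    return {"position": (5.0, 5.0, 1.5), "azimuth_deg": 90.0}

def detect_people(image):                  # S03: people detection unit 13
    return 7

def identify_location(info):               # S04: congestion estimation unit 14
    return "food_court_1F"

def estimate_congestion(num_people):       # S05
    return num_people                      # e.g. the raw count as the degree

def run_once():
    image = acquire_image()                # S01
    info = acquire_imaging_info(image)     # S02 and S03 are independent
    people = detect_people(image)          # S03
    area = identify_location(info)         # S04
    degree = estimate_congestion(people)   # S05
    return {"area": area, "congestion": degree}  # S06: output

print(run_once())
```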
  • In the present embodiment, a place whose congestion level is to be estimated is specified, and the congestion level is estimated for the specified place. Therefore, an image captured by an imaging device other than one fixedly provided in advance, for example, an image captured by the terminal 20 carried by the user at the place where the congestion level is to be estimated, as in the present embodiment, can be used to estimate the congestion degree.
  • imaging information indicating the position and imaging direction at the time of imaging of the image is obtained based on the image, and the location for which the degree of congestion is to be estimated is specified. Therefore, for example, even indoors where the position cannot be accurately estimated by GPS (Global Positioning System) or the like, the location can be appropriately identified. Therefore, according to the present embodiment, it is possible to more easily and appropriately estimate the congestion degree of the estimation target location.
  • Information indicating a three-dimensional position in the building may be acquired as the information indicating the position at the time of imaging, and any one of the plurality of areas into which the building is divided may be identified as the target for estimating the degree of congestion. According to this configuration, it is possible to appropriately estimate the degree of congestion for each area even in a building such as a multi-story commercial facility. However, it is not always necessary to adopt the above configuration in estimating the degree of congestion.
  • the degree of congestion may be estimated from the number of people detected from each of a plurality of images for the same location. For example, as described above, the number of people may be averaged. According to this configuration, it is possible to estimate the degree of congestion with high accuracy. However, the degree of congestion for the same place may be estimated from one image.
  • the congestion degree estimation system 10 may be provided in another device such as the terminal 20 .
  • the congestion degree estimation system may include the other device.
  • Each functional block may be implemented using one device that is physically or logically coupled, or using two or more devices that are physically or logically separated and connected directly or indirectly (for example, by wire or wirelessly).
  • a functional block may be implemented by combining software in the one device or the plurality of devices.
  • Functions include, but are not limited to, judging, determining, deciding, calculating, computing, processing, deriving, investigating, searching, confirming, receiving, transmitting, outputting, accessing, resolving, selecting, choosing, establishing, comparing, assuming, expecting, considering, broadcasting, notifying, communicating, forwarding, configuring, reconfiguring, allocating, mapping, and assigning.
  • a functional block (component) responsible for transmission is called a transmitting unit or transmitter.
  • the implementation method is not particularly limited.
  • the congestion degree estimation system 10 may function as a computer that performs information processing of the present disclosure.
  • FIG. 3 is a diagram illustrating an example of a hardware configuration of congestion degree estimation system 10 according to an embodiment of the present disclosure.
  • the congestion degree estimation system 10 described above may be physically configured as a computer device including a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a bus 1007, and the like.
  • the hardware configuration of the terminal 20 may be the one described here.
  • the term "apparatus” can be read as a circuit, device, unit, or the like.
  • the hardware configuration of the congestion degree estimation system 10 may be configured to include one or more of each device shown in the figure, or may be configured without some of the devices.
  • Each function in the congestion degree estimation system 10 is realized by loading predetermined software (a program) onto hardware such as the processor 1001 and the memory 1002, with the processor 1001 performing computation and controlling communication by the communication device 1004 and at least one of the reading and writing of data in the memory 1002 and the storage 1003.
  • the processor 1001 for example, operates an operating system and controls the entire computer.
  • the processor 1001 may be configured by a central processing unit (CPU) including an interface with peripheral devices, a control device, an arithmetic device, registers, and the like.
  • each function in congestion degree estimation system 10 described above may be implemented by processor 1001 .
  • the processor 1001 reads programs (program codes), software modules, data, etc. from at least one of the storage 1003 and the communication device 1004 to the memory 1002, and executes various processes according to them.
  • As the program, a program that causes a computer to execute at least part of the operations described in the above embodiment is used.
  • each function in congestion degree estimation system 10 may be implemented by a control program stored in memory 1002 and running on processor 1001 .
  • The processor 1001 may be implemented by one or more chips.
  • the program may be transmitted from a network via an electric communication line.
  • The memory 1002 is a computer-readable recording medium, and may be composed of at least one of, for example, ROM (Read Only Memory), EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), and RAM (Random Access Memory).
  • the memory 1002 may also be called a register, cache, main memory (main storage device), or the like.
  • the memory 1002 can store executable programs (program code), software modules, etc. for performing information processing according to an embodiment of the present disclosure.
  • The storage 1003 is a computer-readable recording medium, and may be composed of at least one of, for example, an optical disc such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, digital versatile disc, or Blu-ray disc), a smart card, flash memory (for example, a card, stick, or key drive), a floppy disk, a magnetic strip, and the like.
  • Storage 1003 may also be called an auxiliary storage device.
  • the storage medium included in the congestion degree estimation system 10 may be, for example, a database including at least one of the memory 1002 and the storage 1003, a server, or other suitable medium.
  • the communication device 1004 is hardware (transmitting/receiving device) for communicating between computers via at least one of a wired network and a wireless network, and is also called a network device, a network controller, a network card, a communication module, or the like.
  • the input device 1005 is an input device (for example, keyboard, mouse, microphone, switch, button, sensor, etc.) that receives input from the outside.
  • the output device 1006 is an output device (eg, display, speaker, LED lamp, etc.) that outputs to the outside. Note that the input device 1005 and the output device 1006 may be integrated (for example, a touch panel).
  • Each device such as the processor 1001 and the memory 1002 is connected by a bus 1007 for communicating information.
  • the bus 1007 may be configured using a single bus, or may be configured using different buses between devices.
  • The congestion degree estimation system 10 may include hardware such as a microprocessor, a digital signal processor (DSP), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array), and part or all of each functional block may be realized by such hardware.
  • processor 1001 may be implemented using at least one of these pieces of hardware.
  • Input/output information may be stored in a specific location (for example, memory) or managed using a management table. Input/output information and the like can be overwritten, updated, or appended. The output information and the like may be deleted. The entered information and the like may be transmitted to another device.
  • The determination may be made by a value represented by one bit (0 or 1), by a Boolean value (true or false), or by numerical comparison (for example, comparison with a predetermined value).
  • Notification of predetermined information is not limited to being performed explicitly, and may be performed implicitly (for example, by not notifying the predetermined information).
  • Software, whether referred to as software, firmware, middleware, microcode, hardware description language, or by any other name, should be interpreted broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executable files, threads of execution, procedures, functions, and the like.
  • software, instructions, information, etc. may be transmitted and received via a transmission medium.
  • For example, when software is transmitted from a website, server, or other remote source using at least one of wired technology (such as coaxial cable, fiber optic cable, twisted pair, or digital subscriber line (DSL)) and wireless technology (such as infrared or microwave), these wired and/or wireless technologies are included within the definition of a transmission medium.
  • The terms "system" and "network" used in this disclosure are used interchangeably.
  • Information, parameters, and the like described in the present disclosure may be expressed using absolute values, using relative values from a predetermined value, or using other corresponding information.
  • The terms "determining" and "deciding" used in this disclosure may encompass a wide variety of actions.
  • "Determining" and "deciding" can include, for example, judging, calculating, computing, processing, deriving, investigating, looking up, searching, or inquiring (for example, looking up in a table, database, or other data structure) and ascertaining being regarded as "determining" or "deciding".
  • "Determining" and "deciding" can include receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, and accessing (for example, accessing data in memory) being regarded as "determining" or "deciding".
  • "Determining" and "deciding" can include resolving, selecting, choosing, establishing, comparing, and the like being regarded as "determining" or "deciding".
  • That is, "determining" and "deciding" can include some action being regarded as "determining" or "deciding".
  • "Determining (deciding)" may be read as "assuming", "expecting", "considering", or the like.
  • The terms "connected" and "coupled", and all variations thereof, mean any direct or indirect connection or coupling between two or more elements, and can include the presence of one or more intermediate elements between two elements that are "connected" or "coupled" to each other. The coupling or connection between elements may be physical, logical, or a combination thereof. For example, "connection" may be read as "access".
  • When used in this disclosure, two elements can be considered to be "connected" or "coupled" to each other by using at least one of one or more wires, cables, and printed electrical connections and, as some non-limiting and non-exhaustive examples, by using electromagnetic energy having wavelengths in the radio frequency, microwave, and optical (both visible and invisible) regions.
  • any reference to elements using the "first,” “second,” etc. designations used in this disclosure does not generally limit the quantity or order of those elements. These designations may be used in this disclosure as a convenient method of distinguishing between two or more elements. Thus, reference to a first and second element does not imply that only two elements can be employed or that the first element must precede the second element in any way.
  • The statement "A and B are different" may mean "A and B are different from each other". The term may also mean that "A and B are each different from C".
  • Terms such as "separated" and "coupled" may be interpreted in the same manner as "different".

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The objective of the present invention is to estimate congestion in an estimation target location more simply and appropriately. A congestion estimating system 10 comprises: an image acquiring unit 11 for acquiring an image to be used to estimate congestion; an imaging information acquiring unit 12 for acquiring, on the basis of the acquired image, imaging information indicating a position and imaging direction, at the time the image was captured, of an imaging device that captured the image; a number-of-persons detecting unit 13 for detecting, from the image, the number of persons appearing in the image; and a congestion estimating unit 14 for determining, from the acquired imaging information, a location that is to serve as a congestion estimation target, and estimating the congestion from the detected number of persons, for the determined location.

Description

Congestion estimation system
The present invention relates to a congestion degree estimation system for estimating the degree of congestion of an estimation target location.
Conventionally, it has been proposed to detect people appearing in an image captured by an imaging device such as a camera and to estimate the degree of congestion of the imaged location (see, for example, Patent Document 1).
JP 2018-148422 A
In the conventional method disclosed in Patent Document 1, the degree of congestion is estimated using an image captured by an imaging device fixedly installed in advance at the location whose degree of congestion is to be estimated. This means an imaging device must be fixedly installed in advance at every such location, so as the number of estimation target locations increases, considerable cost and labor are required.
An embodiment of the present invention has been made in view of the above, and an object thereof is to provide a congestion degree estimation system that can estimate the degree of congestion of an estimation target location more simply and appropriately.
To achieve the above object, a congestion degree estimation system according to an embodiment of the present invention includes: an image acquisition unit that acquires an image used for estimating a degree of congestion; an imaging information acquisition unit that acquires, based on the image acquired by the image acquisition unit, imaging information indicating the position and imaging direction, at the time the image was captured, of the imaging device that captured the image; a people detection unit that detects, from the image acquired by the image acquisition unit, the number of people appearing in the image; and a congestion degree estimation unit that identifies, from the imaging information acquired by the imaging information acquisition unit, a location whose degree of congestion is to be estimated, and estimates the degree of congestion of the identified location from the number of people detected by the people detection unit.
In the congestion degree estimation system according to an embodiment of the present invention, a location whose degree of congestion is to be estimated is identified from the acquired image, and the degree of congestion is estimated for the identified location. The degree of congestion can therefore be estimated using an image captured by an imaging device other than one fixedly installed in advance, for example, an image captured by an imaging device carried by a user at the estimation target location. In addition, because imaging information indicating the position and imaging direction at the time the image was captured is acquired based on the image itself, the estimation target location is identified appropriately. Thus, the congestion degree estimation system according to an embodiment of the present invention can estimate the degree of congestion of an estimation target location more simply and appropriately.
According to an embodiment of the present invention, the degree of congestion of an estimation target location can be estimated more simply and appropriately.
FIG. 1 is a diagram showing the configuration of a congestion degree estimation system according to an embodiment of the present invention. FIG. 2 is a flowchart showing the processing executed by the congestion degree estimation system according to the embodiment of the present invention. FIG. 3 is a diagram showing the hardware configuration of the congestion degree estimation system according to the embodiment of the present invention.
Hereinafter, an embodiment of the congestion degree estimation system according to the present invention will be described in detail with reference to the drawings. In the description of the drawings, the same elements are denoted by the same reference numerals, and duplicate descriptions are omitted.
FIG. 1 shows a congestion degree estimation system 10 according to the present embodiment. The congestion degree estimation system 10 is a system (apparatus) that estimates the degree of congestion of a predetermined location. The degree of congestion indicates how crowded the estimation target location is with people, that is, how many people are there. For example, the congestion degree estimation system 10 estimates the degree of congestion inside a building such as a commercial facility. In this case, the interior of the building is divided into a plurality of areas (sections, measurement points), and each of the areas serves as an estimation target location. Each of the areas may, for example, be a mesh-shaped cell, and may correspond to an individual sales floor within the commercial facility. The congestion degree estimation system 10 manages information indicating the estimated degree of congestion and outputs it, for example, so that it can be referred to by people visiting the estimation target location. By referring to the estimated degrees of congestion, one can grasp which parts of the commercial facility are crowded and which parts are vacant. Note that the estimation target location is not limited to the interior of a building such as the commercial facility described above.
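For illustration only (this is not part of the disclosure), the mesh division described above can be sketched as a simple mapping from a position in the building's coordinate system to a cell identifier. The function name and the 5 m cell size are assumptions:

```python
def region_id(x, y, floor, cell=5.0):
    """Map a position (metres in the building's coordinate system)
    to a mesh-cell identifier of the form (floor, column, row).
    `cell` is the assumed edge length of one mesh region."""
    return (floor, int(x // cell), int(y // cell))
```

A finer or coarser mesh only changes `cell`; a mapping from cell identifiers to sales-floor names could be layered on top.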
The congestion degree estimation system 10 is configured by a computer such as a PC (personal computer) or a server apparatus. The congestion degree estimation system 10 may also be configured by a plurality of computers, that is, a computer system. The congestion degree estimation system 10 has a communication function and can exchange information with other apparatuses.
The congestion degree estimation system 10 estimates the degree of congestion using images (camera video) captured by a terminal 20 at the estimation target location. The terminal 20 is an apparatus carried and used by a user, for example, a mobile communication terminal such as a smartphone. Alternatively, the terminal 20 may be an apparatus worn by the user, for example, AR (Augmented Reality) glasses (in this case, the terminal 20 is worn on the user's head, specifically over the eyes). Note that although only one terminal 20 is depicted in FIG. 2, a plurality of terminals 20 used by a plurality of users may be used for estimating the degree of congestion.
The terminal 20 includes an imaging device. The imaging device may be an existing one, for example, a camera normally provided in a smartphone or the like. At the estimation target location, the terminal 20 obtains an image by capturing with its imaging function. Imaging by the terminal 20 at the estimation target location is performed, for example, at regular intervals (for example, every few minutes). The image obtained by imaging may be a moving image.
The terminal 20 has a communication function and can exchange information with the congestion degree estimation system 10. The communication function may be an existing one. The terminal 20 transmits the captured image to the congestion degree estimation system 10, for example, each time imaging is performed.
Next, the functions of the congestion degree estimation system 10 according to the present embodiment will be described. As shown in FIG. 1, the congestion degree estimation system 10 includes an image acquisition unit 11, an imaging information acquisition unit 12, a people detection unit 13, and a congestion degree estimation unit 14.
The image acquisition unit 11 is a functional unit that acquires an image used for estimating the degree of congestion. For example, the image acquisition unit 11 acquires an image by receiving the image transmitted from the terminal 20. The image acquisition unit 11 may also acquire images by methods other than the above. The image acquisition unit 11 outputs the acquired image to the imaging information acquisition unit 12 and the people detection unit 13.
The imaging information acquisition unit 12 is a functional unit that acquires, based on the image acquired by the image acquisition unit 11, imaging information indicating the position and imaging direction, at the time the image was captured, of the imaging device (that is, the terminal 20) that captured the image. The imaging information acquisition unit 12 may acquire information indicating a three-dimensional position within the building as the information indicating the position at the time of imaging. When the terminal 20 is an apparatus worn by the user, such as AR glasses, and the imaging device of the terminal 20 captures the line-of-sight direction of the user wearing the terminal 20, the imaging direction corresponds to the user's line-of-sight direction.
Of the acquired imaging information, the information indicating the position of the terminal 20 at the time of imaging is, for example, information indicating coordinates (a three-dimensional position) in a three-dimensional coordinate system defined in advance within the building, such as a commercial facility, whose degree of congestion is estimated. The information indicating the imaging direction is, for example, information indicating an absolute angle (azimuth) relative to north. However, the acquired imaging information may be information other than the above, as long as it is acquired based on the image and indicates the position and imaging direction of the terminal 20 at the time of imaging.
For example, the imaging information acquisition unit 12 acquires the imaging information as follows. The imaging information acquisition unit 12 receives the image from the image acquisition unit 11 and acquires the imaging information based on that image using existing technology. For example, the imaging information acquisition unit 12 stores in advance information indicating positions and directions in association with image-related information such as feature points. The imaging information acquisition unit 12 extracts information related to the input image, such as its feature points, and collates the extracted feature points and the like with the stored image-related information. From the stored association information and the collation result, the imaging information acquisition unit 12 calculates and acquires the imaging information for the input image. In this manner, the imaging information acquisition unit 12 may acquire the imaging information through a self-localization function based on the image.
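As a highly simplified illustration of the collation step above (not part of the disclosure; real visual localization systems use robust local descriptors and geometric verification), the following sketch returns the stored pose whose registered descriptor is nearest to the query descriptor. The descriptor vectors, pose values, and all names are assumptions:

```python
import math

# Hypothetical pre-stored map: each entry pairs a feature descriptor
# (here a plain vector) with the pose at which it was registered.
STORED_MAP = [
    ((0.9, 0.1, 0.0), {"pos": (10.0, 5.0, 1.5), "heading_deg": 90.0}),
    ((0.0, 0.8, 0.2), {"pos": (22.0, 5.0, 1.5), "heading_deg": 180.0}),
]

def estimate_pose(descriptor):
    """Return the pose associated with the stored descriptor that is
    closest (Euclidean distance) to the query descriptor."""
    best = min(STORED_MAP, key=lambda entry: math.dist(entry[0], descriptor))
    return best[1]
```

In practice many feature points per image would be matched and the pose computed geometrically from the correspondences, rather than copied from a single nearest entry.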
The imaging information acquisition unit 12 may also acquire the imaging information using a function external to the congestion degree estimation system 10. For example, the imaging information acquisition unit 12 may transmit the image to an external server (external service) that calculates imaging information from images, and acquire the imaging information calculated from the image by the external server. The external server may, for example, be one provided by GCA or Immersal.
The imaging information acquisition unit 12 may also acquire the imaging information based on the image by methods other than the above. The imaging information acquisition unit 12 outputs the acquired imaging information to the congestion degree estimation unit 14.
The people detection unit 13 is a functional unit that detects, from the image acquired by the image acquisition unit 11, the number of people appearing in the image. The people detection unit 13 receives the image from the image acquisition unit 11 and detects the number of people appearing in it using an existing technique such as image recognition. The people detection unit 13 outputs information indicating the detected number of people to the congestion degree estimation unit 14. Note that the imaging information acquired by the imaging information acquisition unit 12 and the number of people detected by the people detection unit 13 from the same image are associated with each other.
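For illustration only, once an object detector has produced labelled detections, the counting step described above reduces to filtering for the "person" class. The detection format and the 0.5 confidence threshold below are assumptions, not part of the disclosure:

```python
def count_people(detections, min_score=0.5):
    """Count detections labelled 'person' at or above a confidence
    threshold. `detections` is assumed to be a list of (label, score)
    pairs produced by some object detector; the format is illustrative."""
    return sum(1 for label, score in detections
               if label == "person" and score >= min_score)
```

The threshold trades off missed people against false positives and would be tuned for the detector actually used.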
The congestion degree estimation unit 14 is a functional unit that identifies, from the imaging information acquired by the imaging information acquisition unit 12, a location whose degree of congestion is to be estimated, and estimates the degree of congestion of the identified location from the number of people detected by the people detection unit 13. The congestion degree estimation unit 14 may identify, as the estimation target location, one of the areas into which the building is divided. The congestion degree estimation unit 14 may also estimate the degree of congestion of the same location from the numbers of people detected by the people detection unit 13 from each of a plurality of images.
The congestion degree estimation unit 14 receives the imaging information from the imaging information acquisition unit 12 and the information indicating the detected number of people from the people detection unit 13. For example, using the imaging information and the number-of-people information obtained from the same image, the congestion degree estimation unit 14 estimates the degree of congestion as follows.
From the imaging information, the congestion degree estimation unit 14 identifies the position captured in the image associated with the imaging information. When the terminal 20 is an apparatus worn by the user, such as AR glasses, and the imaging device of the terminal 20 captures the user's line-of-sight direction, the position captured in the image corresponds to a position within the user's field of view. For example, the congestion degree estimation unit 14 identifies, as the position captured in the image, the position reached by advancing a preset distance (for example, several meters) in the imaging direction indicated by the imaging information from the position of the terminal 20 at the time of imaging.
The congestion degree estimation unit 14 stores in advance information indicating the position of each estimation target area in the same coordinate system as the position of the terminal 20 at the time of imaging indicated by the imaging information. Based on the stored positions of the areas, the congestion degree estimation unit 14 determines which area contains the identified position, and treats that area as the area whose degree of congestion is estimated from the number of people detected by the people detection unit 13. The congestion degree estimation unit 14 may use the detected number of people itself as the degree of congestion of the area. Alternatively, the congestion degree estimation unit 14 may store a formula or condition for estimating the degree of congestion from the number of people, and estimate the degree of congestion of the area based on that formula or condition. The estimated degree of congestion may be, for example, a numerical value indicating the degree of congestion, or a qualitative degree of congestion (for example, high, medium, or low).
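For illustration only, the position projection and the count-to-level mapping described above can be sketched as follows. The 3 m projection distance, the heading convention (degrees clockwise from north, with north along the +y axis), and the level thresholds are assumptions, not values given in the disclosure:

```python
import math

def observed_point(pos, heading_deg, distance=3.0):
    """Advance `distance` metres from `pos` (x, y) in the compass
    heading (degrees clockwise from north = +y axis); this approximates
    the position captured in the image."""
    rad = math.radians(heading_deg)
    return (pos[0] + distance * math.sin(rad),
            pos[1] + distance * math.cos(rad))

def congestion_level(count):
    """Qualitative congestion degree from a head count (thresholds assumed)."""
    if count >= 10:
        return "high"
    if count >= 4:
        return "medium"
    return "low"
```

The projected point would then be tested against the stored area positions to find the containing area, which receives the detected count.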
The congestion degree estimation unit 14 may store map data indicating the positions of the areas and associate the estimated degrees of congestion with the map, that is, plot the degrees of congestion, such as the numbers of people, on the map.
When a degree of congestion has already been estimated for the area identified from the imaging information, that is, when number-of-people information already exists for that area (the number-of-people information overlaps), the congestion degree estimation unit 14 may estimate the degree of congestion of the area from both the existing and the new number-of-people information, for example, by taking the average of those numbers. In this manner, the congestion degree estimation unit 14 may estimate the degree of congestion of the same location by adjusting the numbers of people detected by the people detection unit 13 from each of a plurality of images.
To keep the provided degree of congestion fresh, the congestion degree estimation unit 14 may reset the degree of congestion of each area at regular intervals (for example, every few minutes), or may newly calculate the degree of congestion using only the number-of-people information acquired within a certain time period.
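For illustration only, the averaging of overlapping counts and the time-window freshness policy described above can be combined in one small structure. The class and method names and the 300-second window are assumptions, not part of the disclosure:

```python
import time

class RegionCongestion:
    """Keep timestamped head counts per area; the congestion value is
    the average of counts observed within the last `window` seconds."""

    def __init__(self, window=300.0):
        self.window = window
        self.samples = {}  # area id -> list of (timestamp, count)

    def add(self, area, count, now=None):
        """Record one detected head count for an area."""
        now = time.time() if now is None else now
        self.samples.setdefault(area, []).append((now, count))

    def level(self, area, now=None):
        """Average of recent counts for the area, or None if no
        observation falls within the window (stale data is ignored)."""
        now = time.time() if now is None else now
        recent = [c for t, c in self.samples.get(area, [])
                  if now - t <= self.window]
        return sum(recent) / len(recent) if recent else None
```

Averaging within a sliding window implements both behaviours at once: overlapping observations are smoothed, and observations older than the window simply stop contributing.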
The congestion degree estimation unit 14 outputs information indicating the estimated degree of congestion of each area. For example, the congestion degree estimation unit 14 may transmit the information to the terminal 20, or to a display device, such as digital signage, installed in the building, such as a commercial facility, whose degree of congestion was estimated. By displaying the information on these receiving devices, people in the building can grasp the degree of congestion of each area. The transmission may be performed on request or at regular intervals. The congestion degree estimation unit 14 may also output the information indicating the estimated degree of congestion of each area to other output destinations by other methods. The above are the functions of the congestion degree estimation system 10 according to the present embodiment.
Next, the processing executed by the congestion degree estimation system 10 according to the present embodiment (the operation method of the congestion degree estimation system 10) will be described with reference to the flowchart of FIG. 2. In this processing, the image acquisition unit 11 acquires an image used for estimating the degree of congestion (S01). Subsequently, based on the image, the imaging information acquisition unit 12 acquires imaging information indicating the position and imaging direction, at the time of imaging, of the terminal 20 that captured the image (S02). The people detection unit 13 detects the number of people appearing in the image (S03). Note that since the acquisition of the imaging information by the imaging information acquisition unit 12 (S02) and the detection of the number of people by the people detection unit 13 (S03) can be performed independently of each other, they need not be performed in the order shown in FIG. 2.
Subsequently, the congestion degree estimation unit 14 identifies, from the imaging information, the location whose degree of congestion is to be estimated (S04), estimates the degree of congestion of the identified location from the detected number of people (S05), and outputs information indicating the estimation result (S06). The above is the processing executed by the congestion degree estimation system 10 according to the present embodiment.
In the present embodiment, the location whose degree of congestion is to be estimated is identified from the acquired image, and the degree of congestion is estimated for the identified location. Therefore, the degree of congestion can be estimated using an image captured by an imaging device other than one fixedly installed in advance, for example, an image captured by the terminal 20 carried by a user at the estimation target location, as in the present embodiment. In addition, since the imaging information indicating the position and imaging direction at the time of imaging is acquired based on the image, the estimation target location can be identified appropriately even indoors, where GPS (Global Positioning System) or the like cannot estimate the position accurately. Thus, according to the present embodiment, the degree of congestion of the estimation target location can be estimated more simply and appropriately.
As in the embodiment described above, information indicating a three-dimensional position within the building may be acquired as the information indicating the position at the time of imaging, and one of the areas into which the building is divided may be identified as the estimation target location. With this configuration, the degree of congestion of each area can be estimated appropriately even inside a building such as a multi-story commercial facility. However, this configuration is not essential for estimating the degree of congestion.
Also as in the embodiment described above, the degree of congestion of the same location may be estimated from the numbers of people detected from each of a plurality of images, for example, by taking the average of those numbers as described above. This configuration makes it possible to estimate the degree of congestion with high accuracy. However, the degree of congestion of the same location may also be estimated from a single image.
Some of the functional units described above as being provided in the congestion degree estimation system 10 may instead be provided in another apparatus such as the terminal 20. In that case, the congestion degree estimation system may be regarded as including that other apparatus.
The block diagram used in the description of the above embodiment shows blocks in units of functions. These functional blocks (components) are realized by any combination of at least one of hardware and software, and the method of realizing each functional block is not particularly limited. That is, each functional block may be realized using one apparatus that is physically or logically coupled, or using two or more apparatuses that are physically or logically separated and connected directly or indirectly (for example, by wire or wirelessly). A functional block may also be realized by combining software with the one apparatus or the plurality of apparatuses.
Functions include, but are not limited to, judging, determining, calculating, computing, processing, deriving, investigating, looking up, ascertaining, receiving, transmitting, outputting, accessing, resolving, selecting, choosing, establishing, comparing, assuming, expecting, considering, broadcasting, notifying, communicating, forwarding, configuring, reconfiguring, allocating, mapping, and assigning. For example, a functional block (component) that performs transmission is called a transmitting unit or a transmitter. In any case, as noted above, the method of realization is not particularly limited.
For example, the congestion degree estimation system 10 according to an embodiment of the present disclosure may function as a computer that performs the information processing of the present disclosure. FIG. 3 is a diagram showing an example of the hardware configuration of the congestion degree estimation system 10 according to an embodiment of the present disclosure. The congestion degree estimation system 10 described above may be physically configured as a computer apparatus including a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a bus 1007, and the like. The hardware configuration of the terminal 20 may also be as described here.
In the following description, the term "apparatus" may be read as a circuit, a device, a unit, or the like. The hardware configuration of the congestion degree estimation system 10 may include one or more of each of the devices shown in the figure, or may omit some of them.
Each function of the congestion degree estimation system 10 is realized by loading predetermined software (a program) onto hardware such as the processor 1001 and the memory 1002, whereby the processor 1001 performs computation and controls communication by the communication device 1004 and/or reading and writing of data in the memory 1002 and the storage 1003.
The processor 1001, for example, runs an operating system and controls the entire computer. The processor 1001 may be configured as a central processing unit (CPU) including an interface with peripheral devices, a control device, an arithmetic device, registers, and the like. For example, each function of the congestion degree estimation system 10 described above may be realized by the processor 1001.
The processor 1001 also reads a program (program code), software modules, data, and the like from the storage 1003 and/or the communication device 1004 into the memory 1002 and executes various kinds of processing in accordance with them. As the program, a program that causes a computer to execute at least part of the operations described in the above embodiments is used. For example, each function of the congestion degree estimation system 10 may be realized by a control program stored in the memory 1002 and running on the processor 1001. Although the various kinds of processing described above have been explained as being executed by a single processor 1001, they may be executed simultaneously or sequentially by two or more processors 1001. The processor 1001 may be implemented by one or more chips. The program may be transmitted from a network via a telecommunication line.
The memory 1002 is a computer-readable recording medium and may be composed of at least one of, for example, a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), an EEPROM (Electrically Erasable Programmable ROM), and a RAM (Random Access Memory). The memory 1002 may also be called a register, a cache, a main memory (main storage device), or the like. The memory 1002 can store executable programs (program code), software modules, and the like for carrying out the information processing according to an embodiment of the present disclosure.
The storage 1003 is a computer-readable recording medium and may be composed of at least one of, for example, an optical disc such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory (for example, a card, a stick, or a key drive), a floppy (registered trademark) disk, and a magnetic strip. The storage 1003 may also be called an auxiliary storage device. The storage medium of the congestion degree estimation system 10 may be, for example, a database, a server, or other appropriate medium including the memory 1002 and/or the storage 1003.
The communication device 1004 is hardware (a transmitting/receiving device) for communication between computers via a wired and/or wireless network, and is also called, for example, a network device, a network controller, a network card, or a communication module.
The input device 1005 is an input device (for example, a keyboard, a mouse, a microphone, a switch, a button, or a sensor) that accepts input from the outside. The output device 1006 is an output device (for example, a display, a speaker, or an LED lamp) that performs output to the outside. The input device 1005 and the output device 1006 may be integrated (for example, as a touch panel).
The devices such as the processor 1001 and the memory 1002 are connected by the bus 1007 for communicating information. The bus 1007 may be configured as a single bus, or as different buses between different pairs of devices.
The congestion degree estimation system 10 may also include hardware such as a microprocessor, a digital signal processor (DSP), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array), and part or all of each functional block may be realized by such hardware. For example, the processor 1001 may be implemented using at least one of these pieces of hardware.
The order of the processing procedures, sequences, flowcharts, and the like of each aspect/embodiment described in the present disclosure may be rearranged as long as no contradiction arises. For example, the methods described in the present disclosure present the elements of the various steps in an exemplary order and are not limited to the specific order presented.
Input and output information and the like may be stored in a specific location (for example, a memory) or managed using a management table. Input and output information and the like may be overwritten, updated, or appended. Output information and the like may be deleted. Input information and the like may be transmitted to another device.
A determination may be made by a value represented by one bit (0 or 1), by a Boolean value (true or false), or by a numerical comparison (for example, a comparison with a predetermined value).
Each aspect/embodiment described in the present disclosure may be used alone, may be used in combination, or may be switched as execution proceeds. Notification of predetermined information (for example, notification of "being X") is not limited to being performed explicitly and may be performed implicitly (for example, by not notifying the predetermined information).
Although the present disclosure has been described in detail above, it is clear to those skilled in the art that the present disclosure is not limited to the embodiments described herein. The present disclosure can be practiced in modified and altered forms without departing from the spirit and scope of the present disclosure as defined by the claims. Accordingly, the description of the present disclosure is intended to be illustrative and has no restrictive meaning with respect to the present disclosure.
Software, whether referred to as software, firmware, middleware, microcode, a hardware description language, or by any other name, should be interpreted broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executable files, execution threads, procedures, functions, and the like.
Software, instructions, information, and the like may also be transmitted and received via a transmission medium. For example, when software is transmitted from a website, a server, or another remote source using wired technology (coaxial cable, optical fiber cable, twisted pair, digital subscriber line (DSL), etc.) and/or wireless technology (infrared, microwave, etc.), that wired and/or wireless technology is included within the definition of a transmission medium.
The terms "system" and "network" used in the present disclosure are used interchangeably.
The information, parameters, and the like described in the present disclosure may be expressed using absolute values, using relative values from a predetermined value, or using other corresponding information.
The terms "determining" and "deciding" used in the present disclosure may encompass a wide variety of operations. "Determining" and "deciding" may include, for example, regarding judging, calculating, computing, processing, deriving, investigating, looking up, searching, or inquiring (for example, looking up in a table, a database, or another data structure), or ascertaining, as having "determined" or "decided". "Determining" and "deciding" may also include regarding receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, or accessing (for example, accessing data in a memory) as having "determined" or "decided". "Determining" and "deciding" may further include regarding resolving, selecting, choosing, establishing, comparing, and the like as having "determined" or "decided". In other words, "determining" and "deciding" may include regarding some operation as having "determined" or "decided". "Determining (deciding)" may also be read as "assuming", "expecting", "considering", and the like.
The terms "connected" and "coupled", and any variation of them, mean any direct or indirect connection or coupling between two or more elements, and can include the presence of one or more intermediate elements between two elements that are "connected" or "coupled" to each other. The coupling or connection between elements may be physical, logical, or a combination of these. For example, "connection" may be read as "access". As used in the present disclosure, two elements can be considered to be "connected" or "coupled" to each other using one or more electrical wires, cables, and/or printed electrical connections, and, as some non-limiting and non-exhaustive examples, using electromagnetic energy having wavelengths in the radio frequency region, the microwave region, and the light (both visible and invisible) region.
The phrase "based on" used in the present disclosure does not mean "based only on" unless otherwise specified. In other words, the phrase "based on" means both "based only on" and "based at least on".
Any reference to elements using designations such as "first" and "second" used in the present disclosure does not generally limit the quantity or order of those elements. These designations may be used in the present disclosure as a convenient way of distinguishing between two or more elements. Thus, a reference to first and second elements does not mean that only two elements can be employed or that the first element must precede the second element in some way.
Where "include", "including", and variations of them are used in the present disclosure, these terms are intended to be inclusive, like the term "comprising". Furthermore, the term "or" used in the present disclosure is intended not to be an exclusive OR.
In the present disclosure, when articles are added by translation, such as "a", "an", and "the" in English, the present disclosure may include the case where the nouns following these articles are plural.
In the present disclosure, the phrase "A and B are different" may mean "A and B are different from each other". The phrase may also mean "A and B are each different from C". Terms such as "separated" and "coupled" may be interpreted in the same way as "different".
REFERENCE SIGNS LIST: 10: congestion degree estimation system; 11: image acquisition unit; 12: imaging information acquisition unit; 13: people detection unit; 14: congestion degree estimation unit; 20: terminal; 1001: processor; 1002: memory; 1003: storage; 1004: communication device; 1005: input device; 1006: output device; 1007: bus.

Claims (3)

  1.  A congestion degree estimation system comprising:
      an image acquisition unit that acquires an image used for estimating a degree of congestion;
      an imaging information acquisition unit that acquires, based on the image acquired by the image acquisition unit, imaging information indicating a position and an imaging direction, at the time of imaging, of an imaging device that captured the image;
      a people detection unit that detects, from the image acquired by the image acquisition unit, the number of people appearing in the image; and
      a congestion degree estimation unit that identifies, from the imaging information acquired by the imaging information acquisition unit, a location whose degree of congestion is to be estimated, and estimates, for the identified location, the degree of congestion from the number of people detected by the people detection unit.
  2.  The congestion degree estimation system according to claim 1, wherein the imaging information acquisition unit acquires, as the information indicating the position at the time of imaging, information indicating a three-dimensional position in a building, and the congestion degree estimation unit identifies, as the location whose degree of congestion is to be estimated, one of a plurality of regions into which the building is divided.
  3.  The congestion degree estimation system according to claim 1 or 2, wherein the congestion degree estimation unit estimates, for the same location, the degree of congestion from the numbers of people detected by the people detection unit from each of a plurality of images.
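The four units recited in claim 1 can be sketched as a simple pipeline. The people detector and the image-based position/direction estimator below are stub placeholders, since the disclosure does not fix any particular detection or localization algorithm; the image representation is likewise an assumption for illustration:

```python
# Hypothetical end-to-end sketch of the claimed pipeline. The detector and
# the imaging-information estimator are stubs standing in for real models.
def detect_people(image) -> int:
    return image["people"]            # stub: a real system would run a person detector

def estimate_imaging_info(image) -> dict:
    return image["pose"]              # stub: position and direction inferred from the image

def identify_location(pose) -> str:
    return f"floor{pose['floor']}"    # map the imaging position to a region

def estimate(images):
    """Group per-image people counts by location and average them."""
    by_location = {}
    for img in images:
        loc = identify_location(estimate_imaging_info(img))
        by_location.setdefault(loc, []).append(detect_people(img))
    return {loc: sum(c) / len(c) for loc, c in by_location.items()}

imgs = [{"people": 4, "pose": {"floor": 1}},
        {"people": 6, "pose": {"floor": 1}}]
print(estimate(imgs))  # {'floor1': 5.0}
```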
PCT/JP2022/025346 2021-07-15 2022-06-24 Congestion estimating system WO2023286569A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023535209A JPWO2023286569A1 (en) 2021-07-15 2022-06-24

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021116976 2021-07-15
JP2021-116976 2021-07-15

Publications (1)

Publication Number Publication Date
WO2023286569A1 true WO2023286569A1 (en) 2023-01-19

Family

ID=84920021

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/025346 WO2023286569A1 (en) 2021-07-15 2022-06-24 Congestion estimating system

Country Status (2)

Country Link
JP (1) JPWO2023286569A1 (en)
WO (1) WO2023286569A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017038450A1 (en) * 2015-09-02 2017-03-09 日本電気株式会社 Monitoring system, monitoring network construction method, and program
WO2017159060A1 (en) * 2016-03-18 2017-09-21 日本電気株式会社 Information processing device, control method, and program
JP2019145022A (en) * 2018-02-23 2019-08-29 パナソニックIpマネジメント株式会社 Store information providing system, server, store information providing method, and program


Also Published As

Publication number Publication date
JPWO2023286569A1 (en) 2023-01-19


Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 22841913; Country of ref document: EP; Kind code of ref document: A1)
WWE WIPO information: entry into national phase (Ref document number: 2023535209; Country of ref document: JP)
NENP Non-entry into the national phase (Ref country code: DE)