WO2017072158A1 - System and method for determining the location and occupancy of workspaces - Google Patents


Info

Publication number
WO2017072158A1
Authority
WO
WIPO (PCT)
Prior art keywords
occupancy
area
image
workspace
processor
Prior art date
Application number
PCT/EP2016/075764
Other languages
English (en)
Inventor
Ashish Vijay Pandharipande
David Ricardo CAICEDO FERNANDEZ
Original Assignee
Philips Lighting Holding B.V.
Priority date
Filing date
Publication date
Application filed by Philips Lighting Holding B.V. filed Critical Philips Lighting Holding B.V.
Publication of WO2017072158A1


Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 15/00 - Systems controlled by a computer
    • G05B 15/02 - Systems controlled by a computer electric
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/50 - Context or environment of the image
    • G06V 20/52 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • H - ELECTRICITY
    • H05 - ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B - ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B 47/00 - Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B 47/10 - Controlling the light source
    • H05B 47/105 - Controlling the light source in response to determined parameters
    • H05B 47/115 - Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B 47/125 - Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using cameras
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 - Program-control systems
    • G05B 2219/20 - Pc systems
    • G05B 2219/26 - Pc applications
    • G05B 2219/2642 - Domotique, domestic, home control, automation, smart house
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B 20/00 - Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B 20/40 - Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • the present invention relates generally to the field of occupancy detection, and more particularly to an occupancy detection system and a corresponding method suitable for determining the number of people within a workspace area.
  • occupancy detection can be used to automatically control lighting and ventilation systems, or heating, ventilation and air conditioning (HVAC) systems. Occupancy detectors are used to maximize correct, efficient and timely delivery of light and air in the environment.
  • a main concern of occupancy detectors for lighting control in such presence or occupancy controlled systems is to ensure that lighting is promptly switched on when a person enters a given environment.
  • Cheap and efficient solutions to deliver on this goal consist of passive infrared (PIR) sensors, radar or sonar. These are able to quickly detect movement in the environment.
  • this type of system also does not enable a count of the number of people in an area to be provided.
  • the image processing involves analyzing an edge pattern of the general shape of a human head, using it to search the grey scale and hue of a captured image, and detecting a human head based on detection of a strong edge gradient around the edge pattern.
  • WO 2015/063479 discloses a system for controlling room lighting which includes occupancy detection. It uses presence detection and motion detection to work out where workstations are located, so that lighting can then be controlled accordingly.
  • a system for identifying workspaces in an area and for determining the occupancy of the workspaces comprising: a vision sensor arrangement for capturing an image or images of the area over time;
  • an image processor adapted to determine occupancy information in the area
  • an occupancy processor adapted to analyze the occupancy information over time, to:
  • the image processor is provided at the respective vision sensor such that the vision sensor and the image processor form a vision sensor module, and the occupancy processor comprises a central processor.
  • This system makes use of occupancy information based on captured images over time.
  • the information is used both to detect the presence of an occupant and to derive the location of workspaces based on historical information. In this way, the system does not need to be programmed with the workspace locations, but can instead learn where they are.
  • a workspace may be identified where there is quasi-static movement, by which is meant that there is local movement within that region of the image, but the occupancy in the region stays static for prolonged times. This corresponds to a user sitting at a desk but moving by small amounts.
  • the occupancy information for example comprises information about an occupant in the form of one or more attributes such as location, speed, orientation etc.
  • each region for example corresponds in real space to a dimension of a person (e.g. when viewed from above, the area of a person's head).
  • the different regions are regions where a person may be seated.
  • a person may occupy several such regions (i.e. the regions are smaller than the size of a person), or a person may occupy a space smaller than a region.
  • the amount of data is still much less than for a high resolution pixelated image.
  • the regions form a low resolution block pixel image. This also means that the block pixel image conveys less sensitive information, such as the identity of people, or the content of documents they may be carrying.
  • when the image processor and the vision sensor are provided as a module, image processing is applied to the images at the source to remove some or all content with privacy implications.
  • the vision sensor module outputs information which has less personally sensitive content, and this is then made available to and used by the central processor.
  • the vision sensor module may be designed not to have a raw image output.
  • the vision sensor module is designed so that no personally sensitive information can be output.
  • the occupancy processor can be distributed for example such that the identifying of workspaces can be done on a different device than the identifying of current occupancy. For example, a central device identifies where the workspaces are and the local processor receives information on these regions and then determines if they are free.
  • the image processor is for example adapted to generate an image which comprises a block pixel image of the area, each block pixel representing an area region and providing occupancy information in respect of that area region, and wherein the occupancy processor (36) is adapted to identify workspace regions, based on the historical occupancy information which includes historical occupancy movement information within the area regions.
  • the area regions define coarse locations where a user may be present, and movement within the regions is used to identify whether there is small movement within the region or larger movement across or through the region. When the region in which movement is detected changes, this indicates movement across a larger area rather than movement within a single region. In this way, a user may be tracked.
  • the processor is for example adapted to determine a level of change of the images over time within each region, and to derive the occupancy information based on the level of change.
  • the current or recent change level information may for example relate to movement over a previous number of seconds, minutes or hours, whereas the historical information for example relates to previous days, weeks or even months.
  • the processor may also be adapted to classify movements and movement velocities for the detected occupants. For example, there may be static, dynamic or quasi-dynamic events.
  • the vision sensor arrangement may comprise a plurality of vision sensors.
  • the processor is for example adapted to provide a metric for each determined occupant based on the current or recent level of movement.
  • the processor generates a block map of movement information. It does not need to provide any personally sensitive information about any particular occupants of the area.
  • the processor is for example adapted to identify quasi-static occupancy of a region and moving occupancy, and is adapted to identify workspace regions based on historical quasi-static occupancy identifications.
  • the system may comprise RF sensors for sensing signals from portable devices of an occupant of the area to assist in the location of workspace regions.
  • the system may further comprise a request processor which is adapted to receive a request for identification of a vacant workspace and to allocate a non-occupied workspace region. If there are multiple non-occupied workspace regions, they may be allocated based on proximity, or based on the level of already existing crowding. The allocation is conveyed to the user, and the system records that the allocation has been made (so that the same workspace will not be allocated immediately afterwards). Examples in accordance with another aspect provide a lighting system, comprising:
  • the lighting controller is adapted to control the lighting in dependence on the determined occupancy.
  • a shared infrastructure may be used.
  • the lighting system may for example make use of a network, and the vision sensors and processor of the occupancy system may communicate using the same network.
  • the vision sensors may for example be mounted at the luminaires.
  • the lighting controller may also control the lighting in response to changes in the workspace locations.
  • Examples in accordance with another aspect of the invention provide a method for identifying workspaces in an area and for determining the occupancy of the workspaces, comprising:
  • an image processor which is provided at the vision sensor, such that the vision sensor and the image processor form a vision sensor module, to determine occupancy information in respect of the area;
  • the image processor may be used to generate an image which comprises a block pixel image of the area, each block pixel representing an area region and providing occupancy information, and using the occupancy processor (36) to identify workspace regions, based on the historical occupancy information which includes historical occupancy movement information within the area regions.
  • Quasi-static occupancy of a region and moving occupancy may be identified, and workspace regions may be identified based on historical quasi-static occupancy identifications.
  • the method may comprise sensing signals from portable devices of an occupant of the area and using the sensed signals to assist in the location of workspace regions.
  • the invention may be implemented at least in part in software.
  • Figure 1 shows a sequence of images to explain how the images are processed
  • Figure 2 shows how an occupancy request is processed
  • Figure 3 shows a system for identifying workspaces in an area and for determining the occupancy of the workspaces
  • Figure 4 shows a method for identifying workspaces in an area and for determining the occupancy of the workspaces
  • Figure 5 shows a computer for implementing the methods.
  • the invention provides a system and method for identifying workspaces in an area and for determining the occupancy of the workspaces.
  • a vision sensor arrangement is used for capturing an image or images of the area over time. For the images, occupancy information over time in the area is analyzed so that workspace regions can be automatically recognized based on the historical occupancy information. The current occupancy of the workspace regions can be determined based on current or recent occupancy information.
  • Figure 1 shows three images of a workspace, as well as a simplified block representation of those images.
  • the image is of a workspace with multiple workspace locations, i.e. seats at a desk.
  • Figure 1 shows three sequential images.
  • there is a moving person 10 and a seated person 12.
  • the seated person can be defined as quasi-static. This means they are moving, for example their head and arms locally, but they are globally static, i.e. they physically occupy a fixed region of space which is not significantly larger than their own volume.
  • the right image shows a block pixel-by-block pixel score matrix that a vision sensor generates and sends to a central processing unit.
  • the data is sent with a time stamp and with the identity of the particular vision sensor which captured the image.
  • the image formed is termed below a "block pixel image".
  • the squares 14a and 14b represent block pixel regions of the block pixel image where movement is detected, and the amount of movement may also be encoded into the block pixel.
  • the block pixels comprise image regions, each corresponding to an area region, i.e. a region of the area being monitored. The vision sensor allocates these regions to the images and determines the occupancy over time in each region, based on movement in the area being monitored. For example, locations can be identified with certain temporal characteristics: a small variance over time is indicative of workplaces, in comparison to a large variance over time that is indicative of movement in hallways.
  • the block pixel image may for instance be a low resolution score matrix with each element encoding a value which is indicative of a size of observed movement.
  • This movement information essentially provides occupancy information. For example, thresholds can be applied both in terms of the amount of movement within a region and the amount of time during which there is movement within the region.
  • the block pixel image is thus one way to encode discrete locations of the occupants at a particular time. Another way is to transmit location coordinates for a detected occupant.
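As an illustration, a block pixel score matrix of the kind described above could be produced from two successive frames by simple frame differencing. This is a minimal sketch under stated assumptions, not the patent's implementation: the function name, the block size and the pixel-change threshold are all illustrative choices.

```python
def block_score_matrix(prev_frame, curr_frame, block=32, thresh=15):
    """Reduce a frame difference to a coarse block-pixel score matrix.

    prev_frame / curr_frame are 2D lists of grayscale pixel values.
    Each output element counts the pixels in one block x block region
    whose absolute change exceeds `thresh`, so only the amount of
    movement per region leaves the sensor module -- no raw image
    content needs to be transmitted.
    """
    h, w = len(curr_frame), len(curr_frame[0])
    rows, cols = h // block, w // block
    scores = [[0] * cols for _ in range(rows)]
    for y in range(rows * block):
        for x in range(cols * block):
            if abs(curr_frame[y][x] - prev_frame[y][x]) > thresh:
                scores[y // block][x // block] += 1
    return scores
```

Only the coarse per-region scores would be output, which is consistent with the privacy constraint described above: the raw frames never leave the vision sensor module.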
  • a metric for the size of observed movement may be the variance between the location of an occupant over time.
  • a large variance may relate to a person walking through a region or across multiple regions, whereas a small variance over a long time may relate to a person sitting in a region but moving by a small amount.
  • quasi-static presence may be identified. This may be done by marking those blocks whose variances fall within a prescribed range.
  • Var(x) = (1/N) · Σ_k (x(k) - mean(x))²
  • where: mean(x) = (1/N) · Σ_k x(k).
  • the variance may be calculated at different locations. If the sensor reports locations at a high reporting frequency (e.g. 1 s), the variance may be computed at the backend. If the sensor reports at a lower frequency (e.g. 1 min), the sensor may then report location centroids as well as the variance.
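A minimal sketch of this variance-based classification follows, assuming the sensor (or backend) holds a list of reported centroid coordinates for one track. The quasi-static variance range used here is an illustrative assumption, not a value given in the patent.

```python
def location_variance(xs):
    """Variance of a sequence of 1-D occupant coordinates,
    i.e. Var(x) = (1/N) * sum_k (x(k) - mean(x))**2."""
    n = len(xs)
    mean = sum(xs) / n
    return sum((x - mean) ** 2 for x in xs) / n

def classify(xs, ys, quasi_static_range=(0.0, 0.25)):
    """Label a track 'static', 'quasi-static' or 'dynamic' from the
    variance of its reported (x, y) centroids.  A small but non-zero
    variance corresponds to a person seated at a desk and moving
    locally; a large variance to a person walking through."""
    v = location_variance(xs) + location_variance(ys)
    lo, hi = quasi_static_range
    if v == 0:
        return "static"
    if lo < v <= hi:
        return "quasi-static"
    return "dynamic"
```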
  • a vision sensor provides much richer data than a PIR sensor but privacy is maintained by forming a block pixel image which only encodes levels of movement within regions.
  • Detecting the presence of an occupant may be carried out in various ways. Essentially, changes in images over time represent movement so that regions where there is movement can be identified and determined as containing an occupant. The movement of an occupant can then be tracked based on how the location of detected movement evolves over time.
  • Detecting movement may thus be based on changes in raw images over time, with analysis only on a region-by-region basis. More sophisticated image processing techniques may however be used (for example edge detection, head detection etc.) which may then be converted into the more coarse regional occupancy information to be used in the system of the invention.
  • occupants may be identified based on image processing techniques, and again they may then be tracked. The velocity as well as position of a detected occupant can be tracked. By tracking a person moving between locations (block- pixels) over time, the type of movement can be classified. The person being tracked is not however identified.
  • Person tracking can be used to obtain velocity information. If velocity is not derived in this way, the system may also estimate the type of movement from the reported locations (block-pixels) and how these change over time. This could be based on associating locations to a particular person to maintain a continuous flow.
  • the image regions may be defined in a static predefined way - so that the image is divided into a fixed set of regions.
  • the regions may instead be somewhat dynamic, for example becoming defined by the detected occupancy over time. In this way, a region may be made to match more accurately the position of a workspace.
  • Figure 1B shows a later image.
  • the person 10 has moved so that the movement in the region 14a has dropped to zero and there is movement in region 14c.
  • the block pixel image shows the newly used workspace as 14d.
  • the block pixel images are analyzed, and block pixels which represent quasi-static presence are identified. This may be based on a number of observations of quasi-static presence within a pre-defined time window.
  • the region 14b is identified as a workspace, as shown by the dotted surround 18. Over time, the region 14d will also become recognized as a workspace, if it is used over time.
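The marking of block pixels as workspaces after repeated quasi-static observations within a time window might be sketched as follows; the observation-count threshold is an assumed parameter rather than a value from the patent.

```python
from collections import Counter

def identify_workspaces(quasi_static_observations, min_count=10):
    """Mark block-pixel coordinates as workspaces when quasi-static
    presence has been observed there at least `min_count` times in the
    analysed history window.  `quasi_static_observations` is a list of
    (row, col) block coordinates, one entry per observation."""
    counts = Counter(quasi_static_observations)
    return {cell for cell, n in counts.items() if n >= min_count}
```

Run over weeks of history, such a count naturally ignores corridors (movement passes through but rarely dwells) while accumulating evidence at desks.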
  • the block pixel data is able to be used to identify workspace locations.
  • the vision sensors in this example only transmit location information of observed presence in the form of a block pixel-by-block pixel score matrix, and in particular the images indicate pixels with quasi-static presence. In this way, no personal information is collected.
  • By analyzing the movement within the area regions (i.e. the block pixels) over time and the movement between block pixels over time (i.e. one block pixel stops showing movement and an adjacent one shows movement), workspace regions can be identified, the movement of people can be tracked, and the occupancy of those identified workspace regions can be monitored.
  • a single metric conveyed by each of the block pixels enables all of this information to be derived. This metric is the amount of movement over a preceding time period within the corresponding area region (where any movement is indicative of occupancy).
  • the sensing region of a vision sensor may either cover multiple workspaces, a workspace in part, or no workspace at all.
  • the vision sensors can only send a limited amount of information to the central processing unit on a rate-limited communication channel.
  • the information elements sent by individual vision sensors conform to privacy and low communication rate constraints.
  • a request for identifying vacant workspaces is sent to the system.
  • the system checks the vacancy of previously identified workspaces and sends such information to the querying entity.
  • the image 20 is a current or recent block pixel image of movements identified as quasi-static.
  • the image 22 shows where workspaces have been identified over a much longer time in the manner explained above. Four such workspaces are identified. As represented by the image 24, there are two unoccupied workspaces that are identified.
  • the current or recent change level information shown in image 20 may for example relate to movement over a previous number of seconds, minutes or hours, whereas the historical information used to form the workspace information represented by image 22 for example relates to previous days, weeks or even months.
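Combining the long-term workspace map (image 22) with the current quasi-static occupancy map (image 20) to obtain the vacancy map (image 24) then reduces to set operations. This sketch assumes both maps are represented as sets of (row, col) block coordinates.

```python
def vacancy_report(workspaces, current_quasi_static):
    """Split learned workspaces into occupied and vacant ones, given
    the block pixels currently showing quasi-static presence.  Blocks
    with movement outside any workspace (e.g. people walking past)
    simply do not intersect the workspace map."""
    occupied = workspaces & current_quasi_static
    vacant = workspaces - current_quasi_static
    return occupied, vacant
```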
  • the block pixel image means the amount of data that needs to be transmitted to and processed by the central processor is reduced.
  • Each region for example corresponds in real space to a dimension of the same order of magnitude as the space occupied by a person (e.g. when viewed from above, the area of a person's head).
  • the different regions are regions where a person may be seated.
  • a person may occupy several such regions (i.e. they are smaller than the size of a person), or they may occupy a space smaller than corresponds to a region.
  • the amount of data is still much less than for a high resolution pixelated image.
  • Figure 3 shows the system for identifying workspaces in an area and for determining the occupancy of the workspaces.
  • a vision sensor arrangement is shown comprising two vision sensors, i.e. cameras 30, 32 for capturing an image or images of the area 34 over time.
  • Each vision sensor has a local image processor 30a, 32a adapted to allocate regions to the images as described above to form the block pixel image, and to determine a level of change of the images over time in each region.
  • Output data 30b, 32b is provided in the form of change level information. This output information has no personally sensitive content.
  • the vision sensor module may be designed not to have a raw image output. In this way, the vision sensor module is designed so that no personally sensitive information can be output.
  • a central processor 36, which may be considered to be an occupancy processor, analyzes the occupancy information over time.
  • the system does not need to be programmed with the workspace locations, but it can instead learn where they are.
  • the amount of data that needs to be transmitted to and processed by the occupancy processor 36 is low.
  • the occupancy processor 36 can be distributed for example such that the identifying of workspaces can be done on a different device than the identifying of current occupancy. For example, a central device identifies where the workspaces are and a more local processor receives information on these regions and then determines if they are free.
  • There may be more than two vision sensors. They do not need to be directed accurately as a result of the learning capability. They may have overlapping or non-overlapping fields of view. Since the system does not need to be programmed with the workspace locations, it may be that a field of view of one vision sensor covers no workspace regions, or one, or many.
  • the system may comprise sensors for sensing signals from portable devices of an occupant of the area to assist in the location of workspace regions by correlating the sensed data with the vision sensor data.
  • Received signal strength indication (RSSI) measurements at multiple receivers, located for example at luminaires, may be used to position a user's mobile device. Static/quasi-static positions are then filtered and spatio-temporally correlated with vision sensor pixel positions to identify potential workspaces.
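In simplified, purely spatial form, correlating RF position fixes with vision sensor block pixels might look like the sketch below. The function name and the metres-per-block calibration are assumptions, and the full scheme described above would additionally match timestamps (the spatio-temporal part).

```python
def confirm_workspaces(rf_points, vision_cells, block_size=1.0):
    """Spatially correlate quasi-static RF position fixes with
    quasi-static vision block pixels: a block is confirmed as a
    potential workspace when at least one RF fix falls inside it.

    rf_points    -- list of (x, y) positions in metres
    vision_cells -- set of (row, col) block-pixel coordinates
    block_size   -- assumed metres per block pixel
    """
    confirmed = set()
    for (x, y) in rf_points:
        cell = (int(y // block_size), int(x // block_size))
        if cell in vision_cells:
            confirmed.add(cell)
    return confirmed
```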
  • the sensor data may be based on RF signals or other electromagnetic signals or indeed other types of signal such as acoustic signals.
  • the vision sensors may be mounted at luminaires of a lighting system, so that a shared infrastructure may be used.
  • the lighting system may be a networked system, and the vision sensors and processors of the occupancy system may communicate using the same network.
  • Figure 4 shows a method for identifying workspaces in an area and for determining the occupancy of the workspaces.
  • In step 40, an image or images of the area are captured over time.
  • In step 42, regions are allocated to the images so that a block pixel layout is defined.
  • In step 44, occupancy information over time is determined within each region.
  • This occupancy information may take various forms, but it aims to identify regions where there is a person who is locally moving but globally relatively static, termed quasi-static above. For example, this may arise when there is almost continuous detection of occupancy within the area over a time period, but the region is surrounded by regions which do not include occupants or motion.
  • Occupancy information may take the form of information relating to the change of captured images over time, based on a change in the captured image corresponding to movement at that location. The amount of movement may be required to fall within thresholds to correspond to this quasi-static condition. For example, for a system that does not track a user, a change in the image content below a threshold may simply be caused by changing ambient light conditions.
  • In step 46, output data is provided in the form of a block pixel image which encodes change level information.
  • This output data includes a timestamp and also identification of which vision sensor the image is for.
  • In step 48, the occupancy information is analyzed over time in the central processor.
  • In step 50, workspace regions are identified, based on the historical occupancy information.
  • In step 52, current occupancy of the workspace regions is determined in response to an enquiry from a user of the system.
  • the current occupancy is based on current or recent occupancy information.
  • the enquiry for example is based on a worker requesting a vacant workspace when entering an office.
  • the request may be made using a mobile phone or by interacting with a terminal at the entrance to the building, or at the elevator on each floor of a building.
  • the system then has a request processor (which may in fact simply be a function of the central processor) which receives the request.
  • In step 54, a workspace which the user may occupy is identified for allocation to the user. This identification is carried out by the request processor.
  • the allocated workspace may be the nearest workspace to the user when they made the enquiry (for example based on the location of their mobile phone, or the terminal they used to log the enquiry). It may be set to be on the same floor as the enquiry location. Alternatively, a workspace in a currently least crowded area may be selected.
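A proximity-based allocation of the kind described, including the record that an allocation has been made so the same workspace is not handed out twice, could be sketched as follows; the Euclidean distance metric in block coordinates is an illustrative choice.

```python
def allocate_workspace(vacant, user_cell, allocated=None):
    """Allocate the vacant workspace nearest to the user's location
    (e.g. the terminal used for the enquiry), skipping workspaces
    already allocated but not yet occupied.

    vacant    -- set of (row, col) block coordinates of vacant workspaces
    user_cell -- (row, col) block coordinate of the requesting user
    allocated -- mutable set recording allocations already made
    """
    if allocated is None:
        allocated = set()
    candidates = [c for c in vacant if c not in allocated]
    if not candidates:
        return None  # nothing free right now
    best = min(candidates,
               key=lambda c: (c[0] - user_cell[0]) ** 2
                             + (c[1] - user_cell[1]) ** 2)
    allocated.add(best)  # record so it is not handed out again immediately
    return best
```

Replacing the distance key with, say, a crowding score per area would implement the "least crowded" policy mentioned above instead.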
  • the user may have the option to choose his or her preferences for selection between any available workspaces, based on these parameters.
  • Other parameters for example may include the proximity to a window, or heating or cooling systems, or even a preferred temperature.
  • the system is thus of interest for offices that use hot-desking, where users are not allocated fixed workspaces, but occupy them on a supply and demand basis.
  • Figure 5 illustrates an example of a computer 60 for implementing the processors described above.
  • the computer 60 may include one or more processors 61, memory 62, and one or more I/O devices 63 that are communicatively coupled via a local interface (not shown).
  • the local interface can be, for example but not limited to, one or more buses or other wired or wireless connections, as is known in the art.
  • the local interface may have additional elements, such as controllers, buffers (caches), drivers, repeaters, and receivers, to enable communications. Further, the local interface may include address, control, and/or data connections to enable appropriate communications among the aforementioned components.
  • the processor 61 is a hardware device for executing software that can be stored in the memory 62.
  • the processor 61 can be virtually any custom made or commercially available processor, such as a central processing unit (CPU), a digital signal processor (DSP), or an auxiliary processor among several processors associated with the computer 60, and the processor 61 may be a semiconductor based microprocessor (in the form of a microchip) or a microprocessor.
  • the memory 62 can include any one or combination of volatile memory elements (e.g., random access memory (RAM), such as dynamic random access memory (DRAM), static random access memory (SRAM), etc.) and non-volatile memory elements (e.g., ROM, erasable programmable read only memory (EPROM), electronically erasable programmable read only memory (EEPROM), programmable read only memory (PROM), tape, compact disc read only memory (CD-ROM), disk, diskette, cartridge, cassette or the like, etc.).
  • the memory 62 may incorporate electronic, magnetic, optical, and/or other types of storage media.
  • the software in the memory 62 may include one or more separate programs, each of which comprises an ordered listing of executable instructions for implementing logical functions.
  • the software in the memory 62 includes a suitable operating system (O/S) 64, compiler 65, source code 66, and one or more applications 67 in accordance with exemplary embodiments.
  • the application 67 comprises numerous functional components such as computational units, logic, functional units, processes, operations, virtual entities, and/or modules.
  • the operating system 64 controls the execution of computer programs, and provides scheduling, input-output control, file and data management, memory management, and communication control and related services.
  • Application 67 may be a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed.
  • If the application 67 is a source program, then the program is usually translated via a compiler (such as the compiler 65), assembler, interpreter, or the like, which may or may not be included within the memory 62, so as to operate properly in connection with the operating system 64.
  • the application 67 can be written in an object oriented programming language, which has classes of data and methods, or a procedural programming language, which has routines, subroutines, and/or functions, for example but not limited to, C, C++, C#, Pascal, BASIC, API calls, HTML, XHTML, XML, ASP scripts, JavaScript, FORTRAN, COBOL, Perl, Java, ADA, .NET, and the like.
  • the I/O devices 63 may include input devices such as, for example but not limited to, a mouse, keyboard, scanner, microphone, camera, etc. Furthermore, the I/O devices 63 may also include output devices, for example but not limited to, a printer, display, etc. Finally, the I/O devices 63 may further include devices that communicate both inputs and outputs, for instance but not limited to, a NIC or modulator/demodulator (for accessing remote devices, other files, devices, systems, or a network), a radio frequency (RF) or other transceiver, a telephonic interface, a bridge, a router, etc. The I/O devices 63 also include components for communicating over various networks, such as the Internet or intranet.
  • when the computer 60 is in operation, the processor 61 is configured to execute software stored within the memory 62, to communicate data to and from the memory 62, and to generally control operations of the computer 60 pursuant to the software.
  • the application 67 and the operating system 64 are read, in whole or in part, by the processor 61, perhaps buffered within the processor 61, and then executed.
  • a computer readable medium may be an electronic, magnetic, optical, or other physical device or means that can contain or store a computer program for use by or in connection with a computer related system or method.
  • the invention relates to privacy of the sensed image, and one aspect is the division of processing tasks as explained above. Another aspect is the use of the block pixel image.
  • a second aspect relates to the image processing approach. This aspect provides a system for identifying workspaces in an area, for determining the occupancy of the workspaces and for allocating workspaces, comprising:
  • a vision sensor arrangement (30,32) for capturing an image or images of the area over time, wherein the image or images each comprise a block pixel image of the area, each block pixel representing an area region;
  • a processing arrangement (36, 30a,32a) adapted to:
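The block pixel image described above serves the privacy aspect: the vision sensor reduces each captured frame to a coarse grid in which each block pixel summarizes an area region, so no individual is identifiable in the data passed to the processing arrangement. A minimal sketch of this step is shown below; the function name, block size, and use of a mean over each block are illustrative assumptions, not taken from the publication.

```python
import numpy as np

def to_block_pixel_image(frame, block_size=16):
    """Reduce a grayscale frame to a coarse block-pixel image.

    Each block pixel holds the mean intensity of one block_size x block_size
    region of the frame, so individual occupants are not identifiable in the
    result (illustrative sketch; the actual reduction is not specified here).
    """
    h, w = frame.shape
    # Trim the frame so it divides evenly into whole blocks.
    h_t, w_t = h - h % block_size, w - w % block_size
    trimmed = frame[:h_t, :w_t]
    blocks = trimmed.reshape(h_t // block_size, block_size,
                             w_t // block_size, block_size)
    return blocks.mean(axis=(1, 3))

# Example: a 64x64 frame becomes a 4x4 block-pixel image.
frame = np.random.rand(64, 64)
coarse = to_block_pixel_image(frame, block_size=16)
print(coarse.shape)  # (4, 4)
```

Averaging (rather than, say, keeping the maximum) is one simple choice that discards fine spatial detail while preserving per-region activity levels for the occupancy analysis.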

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a system and method for identifying workspaces in an area and for determining the occupancy of the workspaces. A vision sensor arrangement is used to capture an image or images of the area over time. For the images, occupancy information over time in the area is analyzed so that workspace regions can be recognized automatically on the basis of the historical occupancy information. The current occupancy of the workspace regions can be determined on the basis of current or recent occupancy information.
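The two stages described in the abstract, recognizing workspace regions from historical occupancy information and then checking their current occupancy, can be sketched as follows. All function names, thresholds, and the use of per-block activity maps are illustrative assumptions, not details taken from the publication.

```python
import numpy as np

def occupancy_frequency(block_frames, threshold=0.1):
    """Fraction of frames in which each block pixel showed activity.

    block_frames: sequence of block-pixel activity maps (same shape),
    e.g. frame-difference energy per area region. A block counts as
    occupied in a frame when its value exceeds the (assumed) threshold.
    """
    stack = np.stack(block_frames)
    return (stack > threshold).mean(axis=0)

def workspace_regions(freq_map, min_frequency=0.3):
    """Blocks occupied often enough historically are treated as workspaces."""
    return freq_map >= min_frequency

def current_occupancy(recent_frames, regions, threshold=0.1):
    """A workspace region is currently occupied if any recent frame shows activity there."""
    recent = np.stack(recent_frames).max(axis=0) > threshold
    return recent & regions

# Toy history: the top-left area region is repeatedly active over time.
history = [np.array([[0.5, 0.0],
                     [0.0, 0.0]]) for _ in range(8)]
regions = workspace_regions(occupancy_frequency(history))
now = current_occupancy([np.array([[0.2, 0.9],
                                   [0.0, 0.0]])], regions)
print(regions)  # only the top-left block is a workspace region
print(now)      # and it is currently occupied
```

Note how the transient activity in the top-right block of the recent frame is ignored: it never appeared in the historical occupancy information, so it is not a recognized workspace region.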
PCT/EP2016/075764 2015-10-30 2016-10-26 Système et procédé de détermination de l'emplacement et de l'occupation d'espaces de travail WO2017072158A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP15192233 2015-10-30
EP15192233.3 2015-10-30

Publications (1)

Publication Number Publication Date
WO2017072158A1 true WO2017072158A1 (fr) 2017-05-04

Family

ID=54477848

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/075764 WO2017072158A1 (fr) 2015-10-30 2016-10-26 Système et procédé de détermination de l'emplacement et de l'occupation d'espaces de travail

Country Status (1)

Country Link
WO (1) WO2017072158A1 (fr)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019076732A1 (fr) * 2017-10-17 2019-04-25 Signify Holding B.V. Étalonnage de capteur d'occupation et estimation d'occupation
WO2020104254A1 (fr) * 2018-11-20 2020-05-28 Signify Holding B.V. Système de comptage de personnes doté de régions de détection agrégées
CN111344733A (zh) * 2017-11-13 2020-06-26 苏伊士集团 用于处理异构数据以确定时间和空间中的流入的设备和方法
WO2020190894A1 (fr) * 2019-03-15 2020-09-24 VergeSense, Inc. Détection d'arrivée des capteurs optiques alimentés par batterie
WO2022015470A1 (fr) 2020-07-17 2022-01-20 Philip Markowitz Système et procédé de suivi de temps amélioré par vidéo
US11375164B2 (en) 2017-05-05 2022-06-28 VergeSense, Inc. Method for monitoring occupancy in a work area
US11563922B2 (en) 2017-05-05 2023-01-24 VergeSense, Inc. Method for monitoring occupancy in a work area
US11563901B2 (en) 2017-11-14 2023-01-24 VergeSense, Inc. Method for commissioning a network of optical sensors across a floor space
US11620808B2 (en) 2019-09-25 2023-04-04 VergeSense, Inc. Method for detecting human occupancy and activity in a work area
EP4172857A4 (fr) * 2020-07-17 2023-11-29 Philip Markowitz Système et procédé de suivi de temps amélioré par vidéo

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4308983A1 (de) * 1993-03-20 1994-09-22 Abb Patent Gmbh Sensorschalter mit einem auf Bewegung reagierenden Sensor
US20080008360A1 (en) * 2005-11-05 2008-01-10 Ram Pattikonda System and method for counting people
US20130113932A1 (en) * 2006-05-24 2013-05-09 Objectvideo, Inc. Video imagery-based sensor
US20140093130A1 (en) * 2011-06-09 2014-04-03 Utah State University Research Foundation Systems and Methods For Sensing Occupancy
WO2015063479A1 (fr) 2013-10-29 2015-05-07 C.P. Electronics Limited Appareil de régulation d'une charge électrique

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4308983A1 (de) * 1993-03-20 1994-09-22 Abb Patent Gmbh Sensorschalter mit einem auf Bewegung reagierenden Sensor
US20080008360A1 (en) * 2005-11-05 2008-01-10 Ram Pattikonda System and method for counting people
US8228382B2 (en) 2005-11-05 2012-07-24 Ram Pattikonda System and method for counting people
US20130113932A1 (en) * 2006-05-24 2013-05-09 Objectvideo, Inc. Video imagery-based sensor
US20140093130A1 (en) * 2011-06-09 2014-04-03 Utah State University Research Foundation Systems and Methods For Sensing Occupancy
WO2015063479A1 (fr) 2013-10-29 2015-05-07 C.P. Electronics Limited Appareil de régulation d'une charge électrique

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11563922B2 (en) 2017-05-05 2023-01-24 VergeSense, Inc. Method for monitoring occupancy in a work area
US11375164B2 (en) 2017-05-05 2022-06-28 VergeSense, Inc. Method for monitoring occupancy in a work area
CN111213073A (zh) * 2017-10-17 2020-05-29 昕诺飞控股有限公司 占用传感器校准和占用估计
WO2019076732A1 (fr) * 2017-10-17 2019-04-25 Signify Holding B.V. Étalonnage de capteur d'occupation et estimation d'occupation
CN111213073B (zh) * 2017-10-17 2024-03-12 昕诺飞控股有限公司 占用传感器校准和占用估计
EP3698609B1 (fr) * 2017-10-17 2021-02-17 Signify Holding B.V. Étalonnage d'un détecteur d'occupation et estimation d'occupation
US11184968B2 (en) 2017-10-17 2021-11-23 Signify Holding B.V. Occupancy sensor calibration and occupancy estimation
CN111344733A (zh) * 2017-11-13 2020-06-26 苏伊士集团 用于处理异构数据以确定时间和空间中的流入的设备和方法
CN111344733B (zh) * 2017-11-13 2023-09-26 苏伊士国际公司 用于处理异构数据以确定时间和空间中的流入的设备和方法
US11563901B2 (en) 2017-11-14 2023-01-24 VergeSense, Inc. Method for commissioning a network of optical sensors across a floor space
WO2020104254A1 (fr) * 2018-11-20 2020-05-28 Signify Holding B.V. Système de comptage de personnes doté de régions de détection agrégées
WO2020190894A1 (fr) * 2019-03-15 2020-09-24 VergeSense, Inc. Détection d'arrivée des capteurs optiques alimentés par batterie
US11532163B2 (en) 2019-03-15 2022-12-20 VergeSense, Inc. Arrival detection for battery-powered optical sensors
EP3938975A4 (fr) * 2019-03-15 2022-12-14 Vergesense, Inc. Détection d'arrivée des capteurs optiques alimentés par batterie
AU2020241843B2 (en) * 2019-03-15 2023-09-28 VergeSense, Inc. Arrival detection for battery-powered optical sensors
US10915759B2 (en) 2019-03-15 2021-02-09 VergeSense, Inc. Arrival detection for battery-powered optical sensors
US11620808B2 (en) 2019-09-25 2023-04-04 VergeSense, Inc. Method for detecting human occupancy and activity in a work area
WO2022015470A1 (fr) 2020-07-17 2022-01-20 Philip Markowitz Système et procédé de suivi de temps amélioré par vidéo
EP4172857A4 (fr) * 2020-07-17 2023-11-29 Philip Markowitz Système et procédé de suivi de temps amélioré par vidéo

Similar Documents

Publication Publication Date Title
WO2017072158A1 (fr) Système et procédé de détermination de l'emplacement et de l'occupation d'espaces de travail
CN111247593B (zh) 使用实时定位系统和下一代测序在健康护理机构中预测、预防和控制感染传播
US10824878B2 (en) Method and arrangement for receiving data about site traffic derived from imaging processing
US10430528B2 (en) Method and system for managing space configurations
US10748024B2 (en) Method and system for detecting a person in an image based on location in the image
US11184968B2 (en) Occupancy sensor calibration and occupancy estimation
WO2018087762A1 (fr) Procédé et système de gestion automatique de ressources liées à l'espace
US10026003B2 (en) Method and arrangement for receiving data about site traffic derived from imaging processing
US20220293278A1 (en) Connected contact tracing
US20120194342A1 (en) Method and System for the Acquisition, Transmission and assessment of Remote Sensor Data for Trend Analysis, Prediction and Remediation
JP2023531504A (ja) 適応型作業空間レイアウトおよび使用状況の最適化のためのシステムおよび方法
WO2019018645A1 (fr) Gestion de préférences pondérées d'environnement intérieur
US20200388039A1 (en) Method and system for detecting occupant interactions
US7302369B2 (en) Traffic and geometry modeling with sensor networks
GB2514230A (en) In-room probability estimating apparatus, method therefor and program
JP2019109655A (ja) 建物利用状態管理システムおよび方法
CN113853158B (zh) 步行功能评价装置、步行功能评价系统、步行功能评价方法、记录介质及认知功能评价装置
CN109727417A (zh) 控制视频处理单元以促进检测新来者的方法和控制器
WO2020104254A1 (fr) Système de comptage de personnes doté de régions de détection agrégées
CN112237013A (zh) 对于智能建筑物的满意度测量
CN118541589A (zh) 用于占用者检测的传感器融合方案
JP7476028B2 (ja) 監視情報処理装置、方法およびプログラム
US11087615B2 (en) Video/sensor based system for protecting artwork against touch incidents
CN112312825B (zh) 一种过敏原警告系统和方法
KR101371869B1 (ko) 스테레오 카메라를 이용하는 재실 인원 계수 장치와 그 방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16787834

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16787834

Country of ref document: EP

Kind code of ref document: A1