WO2001059736A9 - System and method of facilities and operations monitoring and remote management support - Google Patents

System and method of facilities and operations monitoring and remote management support

Info

Publication number
WO2001059736A9
Authority
WO
WIPO (PCT)
Prior art keywords
set forth
operable
processor
captured images
quantitative data
Prior art date
Application number
PCT/US2001/004496
Other languages
French (fr)
Other versions
WO2001059736A3 (en)
WO2001059736A2 (en)
Inventor
Philip R Mckee
Bobby C Wong
Original Assignee
Digibot Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Digibot Inc filed Critical Digibot Inc
Priority to AU2001236932A priority Critical patent/AU2001236932A1/en
Publication of WO2001059736A2 publication Critical patent/WO2001059736A2/en
Publication of WO2001059736A3 publication Critical patent/WO2001059736A3/en
Publication of WO2001059736A9 publication Critical patent/WO2001059736A9/en


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B23/00 Testing or monitoring of control systems or parts thereof
    • G05B23/02 Electric testing or monitoring
    • G05B23/0205 Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults
    • G05B23/0218 Electric testing or monitoring by means of a monitoring system capable of detecting and responding to faults characterised by the fault detection method dealing with either existing or incipient faults
    • G05B23/0224 Process history based detection method, e.g. whereby history implies the availability of large amounts of data
    • G05B23/0227 Qualitative history assessment, whereby the type of data acted upon, e.g. waveforms, images or patterns, is not relevant, e.g. rule based assessment; if-then decisions
    • G05B23/0235 Qualitative history assessment, whereby the type of data acted upon, e.g. waveforms, images or patterns, is not relevant, e.g. rule based assessment; if-then decisions based on a comparison with predetermined threshold or range, e.g. "classical methods", carried out during normal operation; threshold adaptation or choice; when or how to compare with the threshold
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/254 Analysis of motion involving subtraction of images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30232 Surveillance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30242 Counting objects in image

Definitions

  • This invention is related in general to the field of computer systems and software. More particularly, the invention is related to a system and method of exception-based facilities and operations monitoring and remote management support.
  • The raw material input is inventoried and tracked as it moves through the factory.
  • Each step of the manufacturing and assembly process is monitored carefully so that variances exceeding a predetermined threshold can be flagged and corrected immediately.
  • Such careful control and monitoring of the entire process results in fewer failures and errors, even though many of the manufacturing processes are themselves highly automated and thus not particularly labor dependent.
  • Kitchens are, in many ways, no different from factories or assembly lines. In a kitchen, there is also raw material input that undergoes a well-defined process. At the end of the process, finished goods are produced.
  • the "made-to-order" labor-intensive operation of today's commercial kitchens in the food service industry is typically not carefully monitored and managed. Therefore, the food product "manufactured" by a restaurant can be highly variable depending on the experience and training of its employees. With a tight labor market and the typical high turnover rate in the food service industry, employee training and employee performance monitoring and management support are a top priority.
  • a customer's impression and evaluation of a visit to a restaurant is not only dependent upon the manufactured product, but is also highly dependent on the level of service delivered by the restaurant employees.
  • customers factor in how long the wait was for a table, how long they had to wait for the waitperson to greet them and take their orders, how long they had to wait for the waitperson to deliver food to the table, how long it took for the waitperson to respond to requests
  • the operations of many other types of facilities can also benefit from better control, better monitoring, and better training of its employees.
  • the operations of these facilities are typically highly dependent on its employees to perform a service for the customers, particularly when the formation of the customers' total experience and impressions is highly dependent upon the performance of the employees.
  • Examples of this type of service-oriented businesses include restaurants, hotels, daycare centers, automotive repair shops, drug stores, health clubs, banking institutions, hair salons, and car washes.
  • Other businesses relying on its labor force to a lesser degree, such as retail merchandise outlets, grocery stores, and supermarkets, will also benefit from better training, management, monitoring and management support of its employees.
  • A system and method to monitor the operations and performance of multiple facilities and to provide management support thereto are provided which eliminate or substantially reduce the disadvantages associated with prior operations.
  • An exception-based remote monitoring of the facilities and operations and management support are provided.
  • A system of monitoring the operations of one or more facilities and enabling management support thereto includes at least one camera installed in the facility operable to capture images.
  • A processor is in communication with the at least one camera and operable to receive the captured images.
  • The processor is operable to derive quantitative data from the captured images, analyze all the quantitative data, compare them with predetermined thresholds, and generate an alert in response to any threshold being exceeded.
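  • As a minimal, hypothetical sketch of this exception-based comparison (the metric names, threshold values, and alert structure below are illustrative assumptions, not taken from the patent), the processor logic could be written as follows:

        # Hypothetical sketch of exception-based threshold checking.
        # Metric names and threshold values are illustrative only.
        THRESHOLDS = {
            "people_waiting": 8,           # derived from captured images
            "fryer_temperature_f": 375,    # quantitative data from a sensor
        }

        def check_exceptions(measurements):
            """Compare each measurement against its threshold and return any alerts."""
            alerts = []
            for name, value in measurements.items():
                limit = THRESHOLDS.get(name)
                if limit is not None and value > limit:
                    alerts.append({"metric": name, "value": value, "threshold": limit})
            return alerts

        print(check_exceptions({"people_waiting": 11, "fryer_temperature_f": 360}))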
  • A system of monitoring the operations of one or more food service establishments and enabling management support thereto includes at least one sensor operable to gather quantitative data.
  • The system further includes at least one camera installed in the food service establishment operable to capture images.
  • A restaurant processor is in communication with the at least one sensor and operable to receive the quantitative data.
  • The restaurant processor is also in communication with the at least one camera and operable to receive the captured images.
  • The restaurant processor is operable to compare the quantitative data with predetermined thresholds, and to generate an alert in response to any threshold being exceeded.
  • A central computer is in communication with the restaurant processor and operable to receive at least a subset of the data received by the restaurant processor and generated alerts.
  • The central computer is able to display the received data and alerts.
  • A method of monitoring the operations of a facility and providing management support thereto includes the steps of capturing images using at least one camera installed in strategic locations in the facility, processing the captured images and deriving quantitative data therefrom, comparing the quantitative data with predetermined thresholds, and generating an alert in response to any threshold being exceeded.
  • The captured images, quantitative data, and alert are displayed to a user.
  • A method of remotely monitoring at least one restaurant facility and providing management support thereto includes the steps of receiving quantitative data determined by at least one sensor installed at the restaurant facility, and capturing images using at least one camera installed in strategic locations in the restaurant facility. The quantitative data are then compared with predetermined thresholds. An alert is generated in response to any threshold being exceeded. At least a subset of the quantitative data, captured images and generated alert are displayed to a remote user.
  • A method of counting items in captured images includes the steps of capturing a reference image, storing the captured reference image in a reference image array, capturing a target image, and storing the captured target image in a target image array.
  • The target image array is compared with the reference image array and a change image array representing a difference therebetween is generated. Pixel groups in the change image array are then identified and counted.
  • In yet another embodiment of the present invention, a temperature sensor includes an infrared thermometer having a field of measurement, and a laser coupled to the infrared thermometer.
  • The laser is operable to generate a laser beam visible to the human eye and is aimed substantially at the center of the infrared thermometer field of measurement.
  • A camera is coupled to the laser and has a field of view substantially centered with the laser beam. Actuating motors are operable to simultaneously effect the displacement of the laser beam, infrared thermometer field of measurement, and camera field of view, so that the laser beam remains centered in the infrared thermometer field of measurement and camera field of view.
  • A technical advantage of the present invention is that critical data is made available and presented to senior management in such a way that senior management is able to apply their knowledge and experience in real time to improve the operations and earnings of the enterprise. This is accomplished by the provision of hierarchically presented exception-based remote monitoring of facilities and operations. Quantitative data are gathered and compared with predetermined thresholds so that exceptions can be noted. Qualitative data such as video images are captured and analyzed to derive quantitative values therefrom, said derived quantitative values also being compared against expected or normal thresholds or ranges. For example, the number of people waiting to be serviced can be determined by analyzing the captured images. The timing of when food is delivered to a table by the wait staff can also be determined from the captured images.
  • The transformation of qualitative data to quantitative data, so that certain performance values can be assigned to judge the quality of the service, is an important aspect of the present invention.
  • The exception information which results from the comparison of the quantitative values to the expected or normal threshold ranges can then be hierarchically presented to a remote user such as a senior manager, within the context of an alert condition, along with a "one click" initiation of a videoconference or teleconference so that said senior manager can provide management support to the exception condition.
  • A senior manager is better equipped to oversee the day-to-day operations of his/her many facilities and to troubleshoot when problems arise.
  • The system is operable to archive all or a subset of the quantitative data and video images when a service breach is detected.
  • The stored data may be analyzed and studied at a later time to troubleshoot problems and to support or deny an allegation of employee rudeness, inattentiveness, wrongdoing, etc.
  • The use of non-intrusive "bots" to collect qualitative as well as quantitative data ensures quick installation time and minimal disruption to the operations of the facilities.
  • The use of wireless communications protocols also obviates the need to install cables and wiring to each bot. Because the data are transmitted over the Internet to the senior manager, no special hardware or software is required to access the data and video images.
  • FIGURE 1 is a block diagram of an embodiment of a system of remote exception-based facilities and operations monitoring constructed according to the teachings of the present invention
  • FIGURE 2 is a hierarchical diagram illustrating the levels of monitoring
  • FIGURE 3 is a block diagram of an embodiment of a system of remote food service operations monitoring constructed according to the teachings of the present invention
  • FIGURE 4 is a more detailed block diagram of kitchen equipment monitoring constructed according to an embodiment of the present invention.
  • FIGURE 5 is a more detailed block diagram of video monitoring constructed according to an embodiment of the present invention
  • FIGURE 6 is a simplified flowchart of a fast food restaurant process
  • FIGURE 7 is a flowchart of an embodiment of an order entry process according to an embodiment of the present invention
  • FIGURE 8 is a flowchart of an embodiment of a food preparation process according to an embodiment of the present invention
  • FIGURE 9 is a flowchart of an embodiment of a food service process according to an embodiment of the present invention.
  • FIGURE 10 is an exemplary graphical screen layout providing data summaries of a plurality of sites and a separate window providing detailed data of a particular site according to the teachings of the present invention
  • FIGURE 11 is a flow chart of an exemplary image quantitative analysis process according to the teachings of the present invention
  • FIGURES 12A-12C are illustrations representing a reference image array, a target image array and a change image array generated and used in an exemplary image quantitative analysis process according to the teachings of the present invention
  • FIGURE 13 is a flowchart of an embodiment of a basic item/people counting process according to the teachings of the present invention.
  • FIGURE 14 is a perspective view of an embodiment of a temperature sensor according to the teachings of the present invention.
  • FIGURE 1 is a block diagram of an embodiment of a system 10 of remote exception-based facilities and operations monitoring constructed according to the teachings of the present invention.
  • The operations of any facility can be remotely monitored.
  • Any business that is service-oriented would benefit from the more efficient monitoring, management, and training enabled by the present invention.
  • Restaurants, hotels, daycare centers, automotive repair shops, drug stores, health clubs, banking institutions, hair salons, car washes and other facilities in which employees perform tangible, measurable tasks are good candidates. The most benefit is derived when there are many facilities under the management of a senior executive, and when the business is highly dependent on the performance of its employees.
  • System 10 includes a central computer 12 with a user interface, such as a graphical user interface or a web browser application for displaying data, including sensor measurements, images, video streams, audio streams, analysis data, data summary reports, graphical information, and other information.
  • Central computer 12 may be any personal computer, computing platform, workstation, or processor that is capable of processing and analyzing data, and storing and accessing data in a database 14.
  • A video camera 13 is coupled to central computer 12 for videoconferencing.
  • Central computer 12 is also equipped for connection to a communications network such as the Internet 16 for communicating with one or more monitoring systems 20 each located at a different facility.
  • Central computer 12 may incorporate a modem, a cable modem, or have a T1 connection, ISDN (integrated services digital network) connection, or a similar means of accessing the Internet 16.
  • Each monitoring system 20 includes at least one processor 22 in communication with a number of distributed data collecting "bots" 24.
  • Processor 22 may be any device which is capable of collecting data from bots 24, processing the data, and communicating the data to a web server (not explicitly shown).
  • Processor 22 may be a personal computer, central processing unit, computing platform, workstation, processor, or any similar device.
  • Processor 22 communicates with bots 24 via wireless communications, so that no cable or wiring has to be installed and routed in the facility.
  • The 15-XXXX series wireless observation systems manufactured by COP Security® may be used to transmit and receive the collected data signals and video signals.
  • All that is required to install bots 24 is to provide power and to physically mount the hardware.
  • A subset of bots 24 may be coupled to processor 22 via cables or wires, if desirable or necessary.
  • Data collecting bots 24 may include embedded processors, temperature sensors, level sensors, smoke detectors, carbon monoxide detectors, panic buttons, current sensors, security devices, perimeter detection devices, motion sensors, video cameras, digital cameras and other devices that gather a variety of data related to the performance and operations of a facility.
  • Bots 24 may collect quantitative data, such as temperature or electrical current usage, or qualitative data, such as video images.
  • The data collected by bots 24 are continuously or periodically transmitted to processor 22 via a transmitter/receiver 23 coupled to processor 22.
  • Processor 22 is further coupled to a monitor or display 21 on which data collected by bots 24, analysis data, and user remote control operators are displayed. Collected data and data derived from analysis may be stored in database or memory 36. The data may be stored for at least two reasons.
  • The system may store the data of the past hour, for example, including quantitative data and video images. This data is continuously overwritten so that what is stored is always the data for the last predetermined period.
  • A circular cache may be used for this purpose.
  • The second type of data storage occurs as a response to an exception. When a predetermined threshold or range is exceeded, the processor may respond by storing all or a subset of the data for a predetermined period of time.
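  • As a minimal sketch of such a rolling cache (the sample rate, window size, and data fields below are assumptions made for illustration), the continuously overwritten store and the exception-triggered snapshot could look like this:

        from collections import deque
        import time

        # Hypothetical rolling cache: only the most recent samples are kept, so the
        # last predetermined period is always available and older data is overwritten.
        class RollingCache:
            def __init__(self, max_samples=3600):        # e.g. one sample per second for an hour
                self.buffer = deque(maxlen=max_samples)  # oldest entries are dropped automatically

            def record(self, sample):
                self.buffer.append((time.time(), sample))

            def snapshot(self):
                """Copy the cached window, e.g. when a threshold breach is detected."""
                return list(self.buffer)

        cache = RollingCache(max_samples=10)
        for reading in range(25):
            cache.record({"oven_temp_f": 340 + reading})
        archived = cache.snapshot()   # on an exception this would be written to non-volatile storage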
  • Processor 22 may be coupled to a web server (not explicitly shown) on which web pages used to contain the data are stored. Processor 22 is operable to upload the collected data and analysis data to the web server. The web pages and the dynamic data are downloadable from the web server to central computer 12 via the Internet 16. Alternatively, the web pages and any static data may be stored locally on central computer 12 and the dynamic data and analysis data are transmitted directly from processors 22 to central computer 12 via the Internet. Software code, applets and additional applications may be used to enhance the transmission speed and/or dynamic presentation of the data as known in the art.
  • FIGURE 2 is a hierarchical diagram illustrating the levels of access and monitoring. An executive or senior level manager may have executive access 30 to all facilities or sites 36 under his or her oversight.
  • Sites 38-43 are typically grouped into a number of regions.
  • Regional managers may have regional access 33-35 only to the sites that are within the region(s) under their supervision.
  • Regional managers as well as site managers may have regional access 32 as well as site access 36 to facility data from all regions and sites.
  • A user's personal identifier and password determine the level of access he/she has.
  • The capability to view system-wide data encourages cross-organizational communication and comparison. For example, the site manager of a less successful facility may access the operational data of a more successful facility to obtain insight and better understanding of its operations. Therefore, system 10 is a learning and management support tool as well as a monitoring tool.
  • System 10 further allows the senior manager to conduct videoconferencing with any or all of his/her regional and site managers while viewing data summaries or detailed data on all the facilities. In this manner, the senior manager is able to efficiently apply his/her many years of experience to the operations of all the facilities.
  • FIGURE 3 is a block diagram of an embodiment of a system 50 as specifically applied to remote food service operations monitoring according to the teachings of the present invention.
  • System 50 includes a central computer 52 with a user interface, such as a graphical user interface for displaying data, such as sensor measurements, images, video streams, audio streams and other information.
  • Central computer 52 may be any personal computer, computing platform, workstation, or processor that is capable of processing and analyzing data, and storing and accessing data in a database 54.
  • A video camera 53 is coupled to central computer 52 for videoconferencing.
  • Central computer 52 is also equipped for connection to the Internet, a telecommunications network, or a computer network 56 for communicating with restaurant monitoring systems 60 and 80.
  • Each restaurant monitoring system 60 includes a restaurant processor 62 in wireless communications 64 with a number of distributed bots, which may include embedded processors, sensors, and other devices that gather a variety of data about the performance and operations of a food service facility.
  • The collected data and analysis thereof provide insight into the operations of the facilities that is usually not apparent or available, or is apparent only to those who have trained eyes and years of experience.
  • One or more point-of-sale (POS) processors 66 may be used by the restaurant employees to input customer food orders and determine the transaction amount. Typically, such POS processors 66 incorporate a keypad with buttons representing or labeled with names of food products that the restaurant serves. Similar devices, including wait staff order entry processors 72 typically used in establishments that use a wait staff to serve the food to seated customers, are used to enter ordered food items into system 50. Some POS processors 66 may be linked to portable order entry devices, such as a personal digital assistant (PDA) 76 and pagers 75, used by the wait staff via wireless communications. The food order, timestamp, and transaction amount of the order for each customer are conveyed to restaurant processor 62 via wireless communications. Not shown explicitly are security devices, such as smoke detectors, carbon monoxide detectors, perimeter alarms, etc. which are also in communications with processor 62 to relay any breach of associated parameters.
  • Well-known wireless local loop technology may be used for the transmission of data between the bots and the restaurant processor. Not shown explicitly are wireless transmitters and receivers that enable the transmission and receipt of data. Typically, frequencies in the 900 to 2500 MHz range are used to minimize interference.
  • System 50 may employ any suitable communications protocol between restaurant processor 62 and the data gathering bots and equipment, and between restaurant processor 62 and central computer 52. Although each restaurant processor is shown located at each food service establishment, the restaurant processors need not be installed on the premises of the establishment, if the means of communications between the restaurant processor and the data gathering bots and equipment so permits.
  • Bots 68, including sensors monitoring various kitchen equipment, are also in wireless communication with restaurant processor 62.
  • The temperature of kitchen equipment may be measured by strategically positioned temperature sensors 70.
  • The use of non-intrusive external temperature sensors is preferred to ensure minimal disruption of kitchen operations when the kitchen equipment is not already equipped to measure and transmit this data.
  • An infrared thermometer with a laser marker is used to pinpoint an exact measurement region.
  • An infrared thermometer with laser marker, model HHLM-2, manufactured by Omega, may be used for this purpose.
  • The measured temperature reading is output as a voltage level, which is converted to a frequency by a voltage-to-frequency converter (not explicitly shown) and transmitted by the wireless transmitter.
  • The received frequency signal is converted back to a voltage level by a frequency-to-voltage converter (not explicitly shown).
  • The temperature represented by the voltage level is then provided as output.
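  • The voltage-to-frequency encoding amounts to a pair of linear conversions, sketched below; the scale factors (volts per degree and hertz per volt) are assumptions introduced for illustration and are not specified in the text:

        # Illustrative linear conversions only; the converter scale factors
        # are assumptions, not values given in the patent.
        VOLTS_PER_DEG_F = 0.01     # assumed thermometer output scale
        HZ_PER_VOLT = 1000.0       # assumed voltage-to-frequency converter gain

        def temperature_to_frequency(temp_f):
            return temp_f * VOLTS_PER_DEG_F * HZ_PER_VOLT

        def frequency_to_temperature(freq_hz):
            return freq_hz / HZ_PER_VOLT / VOLTS_PER_DEG_F

        freq = temperature_to_frequency(350.0)    # encoded for the wireless link
        print(frequency_to_temperature(freq))     # recovered at the receiver: 350.0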
  • The non-intrusive temperature sensor 340 includes an infrared thermometer and laser marker device 342 combined with a video camera 344.
  • The center of the field of view 346 of the video camera is adjusted and aligned to coincide with the laser beam 348.
  • The center of a field of measurement 350 of the infrared thermometer is also adjusted and aligned with the laser beam 348.
  • Video camera 344 and infrared thermometer and laser marker device 342 are mounted together as a single integral unit to stepping motors 352 and driver 354 which may be used to control the uniform pan and tilt movements of the video camera and infrared sensor/laser.
  • The video image of the video camera is transmitted to the processor via video/audio transmitter/receiver 356 and then to the central computer for display to the user. Therefore, the user is able to see and determine the exact measurement region of the temperature sensor.
  • The user may remotely control the movement of the video camera to point the infrared thermometer, modifying or fine-tuning its aim and thus changing the measurement region.
  • The user control signals are transmitted back to the stepper motors to effect the requested camera movement.
  • The user is able to receive constant feedback on the aim of the infrared thermometer by viewing the video image.
  • The video image may be converted to an active pixel map which enables the user to control the camera movement by clicking on the displayed video image.
  • The camera, and thus the infrared thermometer, is centered about the pixel on which the user clicked.
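  • A minimal sketch of this click-to-recenter behavior is shown below; the image resolution, camera field of view, and stepper motor resolution are assumed values, not parameters given in the patent:

        # Hypothetical conversion of a clicked pixel into pan/tilt stepper commands
        # that re-center the camera (and therefore the infrared thermometer).
        IMAGE_WIDTH, IMAGE_HEIGHT = 640, 480
        FOV_H_DEG, FOV_V_DEG = 40.0, 30.0    # assumed camera field of view
        STEPS_PER_DEG = 10                   # assumed stepper motor resolution

        def click_to_steps(x, y):
            """Return (pan_steps, tilt_steps) that move the clicked pixel to the image center."""
            pan_deg = (x - IMAGE_WIDTH / 2) / IMAGE_WIDTH * FOV_H_DEG
            tilt_deg = (y - IMAGE_HEIGHT / 2) / IMAGE_HEIGHT * FOV_V_DEG
            return round(pan_deg * STEPS_PER_DEG), round(tilt_deg * STEPS_PER_DEG)

        print(click_to_steps(480, 120))   # e.g. pan right and tilt toward the clicked point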
  • A communication channel previously intended to transmit the audio signal is used to transmit the temperature measurement. Therefore, in this preferred embodiment, no extra wireless transmitter and receiver hardware is needed to convey the temperature data along with the video signals to the processor.
  • Infrared temperature sensor 340 is operable to detect and transmit the operating temperature of kitchen equipment either continuously or when the doors are opened. For example, a temperature reading may be taken when a kitchen staff opens the oven to remove cooked food. The operating temperatures of ovens 90, stoves, grills 94, fryers 92, refrigerators 100, freezers 102, food holding areas 98, etc. may be measured in this manner and transmitted to the restaurant processor. On the other hand, some kitchen equipment may already incorporate temperature and other types of sensors. For example, some ovens 90 may be equipped with processors and sensors to determine and transmit data such as operating temperature, settings (food type and/or temperature setting) for cooking the food, and food preparation time.
  • Other cooking equipment such as fryers 92 and grills 94 may also include temperature sensors and time keeping devices to determine and transmit temperature and time data to restaurant processor 62.
  • Hot food and cold food holding areas 96 and 98 may also include temperature sensors to detect the temperature of the prepared food held there or the ambient temperature surrounding the food, and/or the temperature of the counter surface where the food is placed.
  • Other food preparation equipment, such as a soft serve ice cream machine, may also include temperature sensors and level sensors that are operable to determine and transmit ice cream mix temperatures and levels to restaurant processor 62.
  • The operation and performance of additional kitchen equipment such as refrigerators 100 and freezers 102 may also be monitored by detecting and transmitting temperature measurements to restaurant processor 62.
  • Energy consumption may also be measured and conveyed to restaurant processor 62 to determine the operating efficiency of the equipment.
  • An inductive current sensor may be used to measure current flow in the power lines of the equipment. The energy consumption measurements may be timestamped to correlate with operating hours and recognize peak usage hours. Processor 62 may poll each bot periodically to receive the data.
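  • A rough sketch of such a polling loop appears below; the sensor interface, line voltage, and polling period are assumptions introduced only for illustration:

        import time

        LINE_VOLTAGE = 120.0      # assumed supply voltage
        POLL_INTERVAL_S = 60      # assumed polling period

        def read_current_amps():
            return 12.5           # placeholder for the inductive current sensor reading

        def poll_once(log):
            """Read the current sensor, timestamp the reading, and log estimated power draw."""
            amps = read_current_amps()
            log.append({"t": time.time(), "watts": amps * LINE_VOLTAGE})

        log = []
        poll_once(log)            # the processor would repeat this every POLL_INTERVAL_S seconds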
  • FIGURE 5 is a more detailed block diagram of a video monitoring scheme according to an embodiment of the present invention.
  • Video and/or still cameras 110-119 are installed at strategic locations in and around the food service facility. In most instances, the cameras are positioned overhead, but the cameras may also be placed at eye-level, for example, and pointed across the target area to obtain a three-dimensional image of the target area.
  • A subset of these cameras may be infrared cameras that operate in the infrared region of the spectrum.
  • The captured video images not only serve as surveillance tools, but can also be analyzed to derive quantitative data that can be used to measure the quality of performance by the employees.
  • Selected images at a reduced frame rate may be archived for later analysis.
  • One or more cameras 110 may be located in the restaurant lobby or waiting area to monitor the activities there.
  • Other video cameras 118 may be installed in the dining area.
  • Using image analysis software, as further described below, the number of people waiting in food service lines, waiting to be seated, and seated at tables may be determined.
  • The captured images may be timestamped so that the time associated with certain activities may be determined. For example, an analysis of a captured image of table 21 at 18:32:33 may reveal that no one was seated at the table. An analysis of an image of the same table captured at 18:39:28 may reveal that three customers are seated at the table. An analysis of an image captured at 18:48:32 may show that an additional person arrived at the table and left shortly after.
  • The system may possess sufficient knowledge and artificial intelligence to assume that this person was a server who took food orders from the customers.
  • An analysis of an image captured at 18:59:44 may reveal that round flat objects resembling plates were placed on the table.
  • The system may determine from this information that it took at least nine minutes after the customers were seated for the server to take the food orders, and that it took at least eleven minutes for the food to arrive at the table. Such metrics for measuring the performance of the restaurant staff are therefore computed and used to compare with permissible ranges. Details of the image analysis process are described below by referring to FIGURES 12A-12C.
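  • As a hedged sketch of how those timestamped observations translate into service metrics (the event names, dates, and permissible ranges below are illustrative assumptions), the calculation could be expressed as:

        from datetime import datetime, timedelta

        # Timestamped image-analysis events for one table, using the times from the example above.
        events = {
            "customers_seated": datetime(2001, 2, 9, 18, 39, 28),
            "server_visit":     datetime(2001, 2, 9, 18, 48, 32),
            "food_delivered":   datetime(2001, 2, 9, 18, 59, 44),
        }

        order_wait = events["server_visit"] - events["customers_seated"]   # about nine minutes
        food_wait = events["food_delivered"] - events["server_visit"]      # about eleven minutes

        if order_wait > timedelta(minutes=5):      # assumed permissible range
            print("alert: order-taking wait exceeded threshold", order_wait)
        if food_wait > timedelta(minutes=15):      # assumed permissible range
            print("alert: food delivery wait exceeded threshold", food_wait)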
  • One or more cameras 112 may also be installed in the kitchen to monitor the activities of the kitchen staff and other indicators of performance. For example, the cameras may be able to capture whether the floor and food preparation surfaces are kept clean. Food holding area cameras 114 may be used to capture images of the prepared food. Image processing software may be used to analyze the shade and color of the prepared food, the presence or absence of grill marks, and the general overall presentation of the food, for example.
  • Additional cameras 116 may be installed to monitor the number of cars waiting to place food orders at the drive-through window. Furthermore, exterior cameras may be used to capture images of cars entering the premises and then leaving because the drive-through line is too long. Traffic conditions of adjacent roadways may also be monitored, analyzed and compared with the norm. Dining room cameras 118 may also be used to survey the activity in the dining area. Also, the overall cleanliness of the dining area, including indications of how quickly tables are bussed, may be captured.
  • One or more additional video cameras 119 are also used for videoconferencing with the senior manager or with other personnel. These videoconferencing cameras are preferably located in an office or an area with minimal disturbance. Videoconferencing cameras 119 may be automatically activated when a service breach has occurred, such as when the temperature of a prepared food item is below the required temperature threshold. Videoconferencing may also be initiated when appropriate controls on the display screen are activated, such as when a videoconferencing button is clicked.
  • All still and video cameras communicate video and optionally audio data to restaurant processor 62 via wireless transmission.
  • A sophisticated system may allow a senior manager using central computer 52 to enter camera movement commands (pan and tilt) to point the camera in specific directions and to zoom in and out in real time.
  • The video image is converted to an active pixel map which allows a user to center the image, and thereby point the camera, by clicking on the video display itself.
  • A manager may monitor the activities of the restaurant in real time as if performing a virtual visit and survey of the facilities.
  • A senior manager or marketing executive may be able to view or otherwise videoconference with customers to gauge reaction to new food product offerings, food quality and quantity, and to receive immediate customer feedback.
  • FIGURE 6 is a simplified flowchart of a restaurant process 120 serving as an exemplary process in which the system and method of the present invention are applicable.
  • The first step of the process is taking food orders from the customer.
  • The food orders are input into POS processor 66, as shown in block 122. This allows the logging of food orders into system 50, which may be used for a number of purposes to be described below in conjunction with FIGURE 7.
  • The food is prepared, as shown in block 124. In some restaurant operations, certain foods may be prepared at peak dining times in anticipation of high demand for various reasons, such as warm comfort foods during cold wintry weather.
  • The food is served to the customer, as shown in block 126.
  • The process ends in block 128. With minor modifications, this process may be adapted to other types of restaurant operations.
  • An additional step of seating the customers at a table may be added prior to the order entry step.
  • Steps of presenting the bill to the customer and receiving payment may also be added.
  • Each of these steps in the process presents opportunities for performance measurement and monitoring.
  • The wait staff may simply log each action in his or her portable order entry device so that each action receives a timestamp.
  • FIGURE 7 is a flowchart of an embodiment of a process 130 for order entry and to begin service according to an embodiment of the present invention.
  • An indication of the amount of time customers are required to wait to be seated may be determined, as shown in block 131. This step is optional for counter service establishments.
  • Image analysis software, as described in detail below, is used to count the number of people waiting in the waiting area of the restaurant. A correlation for a wait time can then be drawn from the people count. If the wait time or the number of people waiting for a table exceeds a predetermined threshold, an alert is generated, as shown in blocks 132 and 133. In block 134, the amount of time a customer is required to wait to place an order is determined.
  • An image of the length of a line or the number of customers waiting in line at the counter may be captured by a video camera and analyzed by the image analysis software.
  • The order placement wait time may be determined by first ascertaining an approximate time the customers were seated at the table, and an approximate time the server approached the table briefly to take the order. By using image analysis software, the timing of such activities may be identified. Please refer to the discussion below for details on the image analysis software.
  • The time that the customer is required to wait to place the order is then compared to a predetermined threshold, as shown in block 135. If the threshold is exceeded, then an alert is generated, as shown in block 136.
  • If the system is coupled to POS processors, the system then receives the food order, which is entered by the server, as shown in block 138.
  • The order is timestamped or otherwise associated with a time value, as shown in block 140.
  • The order entry process ends in block 142. It will be apparent from the discussion below that the image analysis software may be resident on central computer 52 or restaurant processor 62.
  • An exemplary process 200 of analyzing the captured images according to the present invention is shown.
  • An image is captured by a video or still camera located in areas where customers may be waiting to be served.
  • Infrared cameras may also be used.
  • The camera is located overhead, pointing downward.
  • Shapes may be delineated and recognized, as shown in block 204.
  • The delineated shapes are then compared with relevant shapes stored in memory, as shown in block 206.
  • The relevant shapes are shapes that may represent customers' heads and plates of food.
  • The infrared spectrum cameras may be used to obtain thermal imaging of the scene.
  • The thermal image may be used as an aid to better analyze the captured video or still image.
  • The thermal image may alternatively be the sole image analyzed to detect the presence of people.
  • The outline of each person may be more accurately determined, as shown in block 208.
  • The qualitative image data is translated to a quantitative value that can be used in comparison against preset standard thresholds and ranges.
  • The comparison provides an evaluation against the norm such that exceptions may be noted and flagged.
  • The process ends in block 212.
  • The resulting number of waiting customers or a determination of wait time to place the order is compared with respective predetermined thresholds or predetermined ranges.
  • An alert may be issued (blocks 133 and 136) so that remedial action may be taken.
  • Remedial action may include a wide range of responses depending on the type and seriousness of the breach.
  • The senior manager may immediately communicate with the restaurant manager via teleconferencing, videoconferencing, or some other form of communication.
  • System 50 may automatically generate an electronic mail message, a page, a voice mail, or some other form of notification to the restaurant manager and/or the personnel causing the alert.
  • System 50 receives the order at POS processor 66, as shown in block 138, at which time a timestamp may be determined, as shown in block 140. The process ends in block 142.
  • FIGURE 8 is a flowchart of an embodiment of a food preparation process 150 according to an embodiment of the present invention.
  • Food preparation start time may be timestamped or determined from the time the food order was taken.
  • Kitchen equipment temperature, food type settings, and cooking time are measured or determined, as shown in block 154.
  • The kitchen equipment operating status may also be measured by other types of sensors, as shown in block 156. For example, if an oven is set at 350 degrees, the actual temperature inside the oven is measured to determine whether the oven is operating properly. Such metrics are compared with predetermined thresholds in block 158. If the threshold is exceeded, then it is flagged or an alert is sent to the restaurant processor and/or the central computer, where it is observable by the local manager and the senior manager.
  • The managers may then take suitable actions in real time or at a later time as deemed appropriate.
  • The food preparation end time is also noted, as shown in block 162.
  • The total food preparation time is then compared with a predetermined threshold, as shown in block 164.
  • An alert is issued if the threshold is exceeded, as shown in block 165, and suitable action may be taken.
  • The food temperature is measured, as shown in block 166.
  • This measurement is compared with a predetermined threshold, as shown in block 168, and exceptions are noted and actions may be taken when appropriate, as shown in block 169.
  • The process ends in block 170.
  • FIGURE 9 is a flowchart of an embodiment of a food service process 180 according to an embodiment of the present invention.
  • The food temperature is measured by the temperature bot and compared with a predetermined threshold or range.
  • An alert is issued so that appropriate action may be taken when the threshold is exceeded or the value falls outside of the range.
  • The time is noted. This timestamp is compared with the food preparation end time to determine how long the food idled at the holding area before it was served.
  • This elapsed time is compared with a predetermined threshold or range, as shown in block 190. If the threshold is exceeded or the value falls outside of the range, then an alert is issued so that a suitable response may be made.
  • The process ends in block 192.
  • The system may store data for at least two reasons.
  • The system may store the data of the past hour, for example, including quantitative data and video images. This data is continuously overwritten so that what is stored is always the data for the last predetermined period.
  • A circular cache may be used for this purpose.
  • The second type of data storage occurs as a response to an exception.
  • Selected quantitative and qualitative data may be stored in a cache or some form of non-volatile memory.
  • The exception-based data are not overwritten unless specific instructions are given to do so.
  • The type and quantity of data stored may depend on the type of breach that occurred.
  • The video images may be stored at a reduced frame rate to conserve space, if necessary.
  • FIGURES 12A-12C are illustrations representing a reference image array, a target image array and a change image array generated and used in an exemplary image quantitative analysis process according to the teachings of the present invention. References will also be made to FIGURE 13, which is a flowchart of the basic people counting process 300. At some point each day when there are no people in the dining area and waiting area of a restaurant, a video image is captured of the "empty space", as shown in block 302.
  • This image is stored as the reference image against which future images will be compared. More specifically, the video image is captured in pixel format (typically 8 bit or 16 bit) and stored in a two-dimensional array, as shown in block 304. The data is preferably stored in pixel format. This stored pixel data forms a reference image array 230. In the example shown in FIGURE 12A, reference image array 230 contains the two-dimensional pixel data representing three tables 231-232 and the accompanying chairs 234-245 in a dining area of the restaurant. Preferably, to increase the accuracy of the people counting process, each video camera covers a small grouping of tables, so that multiple video cameras are needed to cover the entire foodservice area.
  • When a customer count is needed, such as at opening and every five minutes thereafter, another video snapshot of the same scene is taken using the same camera, as shown in block 306.
  • The information is preferably stored in the same format as the reference image array, as shown in block 308.
  • The resultant two-dimensional pixel data is a target image array 250.
  • Target image array 250 also includes pixel representations of people (seated 251-255 and standing 256 and 257), a service tray 260, and plates of food 261.
  • The stored target image array 250 is then compared to reference image array 230, as shown in block 310. For those areas where the pixels are the same in both arrays, the system determines that there are no new objects or people. When there are differences between the two arrays, the system may assume that people or objects are now present where there were none in the reference image array.
  • A change image array 270 is generated by subtracting reference image array 230 from target image array 250 on a pixel-by-pixel basis, leaving only what has changed.
  • Change image array 270 is stored, as shown in block 312.
  • The next step is to reduce the pixels in change image array 270 into pixel groups, as shown in block 314. In its simplest form, this is accomplished by systematically examining each pixel in change image array 270.
  • A pixel group is defined as multiple adjacent changed pixels surrounded by unchanged pixels. Each pixel group is suspected by the system to represent a person, plates of food, or new objects brought into the area, for example. It should be noted that changes in pixel values due to a difference in lighting, time of day, weather conditions, etc., must be distinguished from changes caused by people or objects.
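  • As a minimal sketch of this subtraction-and-grouping step (the change threshold and the simple grayscale arrays used here are assumptions; the patent does not specify them), the change image array and its pixel groups could be computed as follows, with the group sizes later reused by the size-based counting method described below:

        from collections import deque

        CHANGE_THRESHOLD = 30   # assumed intensity difference treated as "changed"

        def change_array(reference, target):
            """Pixel-by-pixel subtraction of the reference image from the target image."""
            return [[1 if abs(t - r) > CHANGE_THRESHOLD else 0
                     for r, t in zip(ref_row, tgt_row)]
                    for ref_row, tgt_row in zip(reference, target)]

        def pixel_group_sizes(change):
            """Return the size of each connected group of adjacent changed pixels."""
            rows, cols = len(change), len(change[0])
            seen = [[False] * cols for _ in range(rows)]
            sizes = []
            for y in range(rows):
                for x in range(cols):
                    if change[y][x] and not seen[y][x]:
                        # flood-fill one group of adjacent changed pixels
                        size, queue = 0, deque([(y, x)])
                        seen[y][x] = True
                        while queue:
                            cy, cx = queue.popleft()
                            size += 1
                            for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                                if 0 <= ny < rows and 0 <= nx < cols and change[ny][nx] and not seen[ny][nx]:
                                    seen[ny][nx] = True
                                    queue.append((ny, nx))
                        sizes.append(size)
            return sizes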
  • The first option utilizes the fact that people tend to move frequently, but most other significant objects in a foodservice area do not.
  • The process of generating a change image array is repeated, at relatively brief intervals, for a predetermined number of times. For example, a change image array may be generated for images captured every ten seconds for six repetitions. Thus, six sequential change image arrays are generated in 60 seconds. The sequential change image arrays are then compared to each other. If, in each of the six sequential change image arrays, a pixel group of a required size has substantively changed, this particular pixel group may be counted as a person.
  • With this approach, the original reference image array 230 becomes less critical over time. For example, if a pixel group remains unchanged over several sequential change image arrays, then changes during another set of sequential change image arrays, and then reverts back to the original pixel values, the system may assume something or someone passed between the camera and a stationary background. If the system assumes that no inanimate object can move without human assistance, then the movement can be counted as a person.
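  • A hedged sketch of this motion-based confirmation appears below; the candidate regions would come from the pixel groups identified earlier, the repetition count follows the ten-second, six-repetition example above, and all other details are assumptions:

        REPEATS = 6   # e.g. one change image array every ten seconds for a minute

        def region_changes(change_arrays, region):
            """True if the region contains changed pixels in every sequential change array."""
            (y0, y1), (x0, x1) = region
            return all(
                any(arr[y][x] for y in range(y0, y1) for x in range(x0, x1))
                for arr in change_arrays
            )

        def count_moving_people(change_arrays, candidate_regions):
            """Count candidate regions that keep changing across all sequential arrays."""
            assert len(change_arrays) == REPEATS
            return sum(1 for region in candidate_regions
                       if region_changes(change_arrays, region))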
  • The resultant people count value or the associated change image array is timestamped to provide a time reference for these values.
  • The timestamp of the people count may be used to determine quantitative data associated with time. For example, the time span from when the customers are seated at a table to when their food arrives, or the average wait time for a table, can be determined.
  • Another people counting method utilizes the fact that, in general, the largest thing to appear in a foodservice area (i.e. a dining room, a bar or a waiting area) is a person. Plates, forks, and other items may enter and leave the scene, but they are much smaller than people. Therefore, one way of counting people is to simply count all the pixel groups which are larger than a certain predetermined size, as shown in block 316, and the process ends in block 318.
  • The size comparison may be done by comparing the number of pixels in each pixel group to a predetermined number of pixels.
  • Another people counting method is to compare the pixel groups to reference pixel groups which represent two-dimensional shapes of people. For example, representative pixel samples of several people with different color hair and hairstyles may be obtained from several possible angles and distances. The video samples are stored as two-dimensional "people-like" pixel groups. The system can then compare the pixel groups in the change image array to the "people-like" pixel groups for a match. When there is a match, the pixel group in the change image array is counted as a person.
  • Another method for distinguishing a person pixel group is to remove pixel groups that do not resemble a person. For example, those pixel groups that resemble basic shapes of plates and service trays can be distinguished and eliminated. Furthermore, a thermal image of the same scene taken at the same time may be analyzed to eliminate non-human pixel groups. The system then assumes that the remaining pixel groups are people and counts them as such.
  • A one-foot ruler can be placed in the lower portion of the screen, which would represent close proximity to the camera located near the ceiling, and also in the upper portion of the screen, which would represent further distance away from the camera.
  • Scaling vectors can then be drawn between the two rulers to indicate the relative scale of items located in the lower portion of the screen as compared to items located in the upper portion of the screen, and stored for future reference during the analysis process.
  • If the camera is at a fixed location but is capable of panning and zooming under electronic control, a variation of this method can be used to determine scale as the camera is pointed in different directions or zoomed to different views. This variation utilizes scaling vectors taken from various camera panning positions and zoom levels.
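  • The scaling vectors amount to interpolating the expected pixel size of an object between the two rulers; a minimal sketch is given below, with the image height and ruler pixel lengths as assumed example values:

        IMAGE_HEIGHT = 480
        RULER_PIXELS_BOTTOM = 120   # one-foot ruler near the bottom of the frame (close to the camera)
        RULER_PIXELS_TOP = 60       # one-foot ruler near the top of the frame (far from the camera)

        def pixels_per_foot(row):
            """Linearly interpolate the scale between the top and bottom rulers."""
            fraction = row / (IMAGE_HEIGHT - 1)   # 0.0 at the top of the frame, 1.0 at the bottom
            return RULER_PIXELS_TOP + fraction * (RULER_PIXELS_BOTTOM - RULER_PIXELS_TOP)

        # A pixel group low in the frame must contain more pixels than one near the top
        # to represent an object of the same physical size.
        print(pixels_per_foot(0), pixels_per_foot(IMAGE_HEIGHT - 1))   # 60.0 and 120.0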
  • Redundancies can also be used to reduce false measurements. For example, counting the number of people using each of the four alternate measures noted above may yield four different values, but an average of the count values from the four methods may come statistically closer to the actual count. Repetitive readings can also be used to reduce false measurements. For example, capturing and counting a different target array every 3 seconds for 15 seconds, and averaging those counts, may yield more accurate data than a single count.
  • A video camera may be positioned above doorways through which customers enter or exit the facility.
  • An ongoing count of people inside the restaurant can thus be established and maintained. If the count of people inside the restaurant is compared to the number of people counted using the counting methods, the data produced by these methods can be either confirmed or disputed.
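  • An illustrative running occupancy count of this kind is sketched below; how each doorway crossing and its direction are detected is left out and assumed to come from the doorway camera analysis:

        class OccupancyCounter:
            """Running count of people inside, maintained from doorway entry/exit events."""
            def __init__(self):
                self.inside = 0

            def person_entered(self):
                self.inside += 1

            def person_exited(self):
                self.inside = max(0, self.inside - 1)

        counter = OccupancyCounter()
        counter.person_entered()
        counter.person_entered()
        counter.person_exited()
        print(counter.inside)   # cross-checked against the image-based people counts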
  • A variation of the methods discussed above involves correlating the change image arrays generated from images captured from multiple cameras located in the facility to track each person as they move about the subject area.
  • This method operates in a similar way to the cellular phone system in that when a person moves from the surveillance area of one camera into the surveillance area of another, a "handoff" occurs.
  • This variation tracks each moving pixel group as it moves through the foodservice area.
  • The presence of the restaurant employees invariably affects the accuracy of these counting methods.
  • The value of interest is the number of customers, not employees.
  • The employees may wear badges which can be clearly identified in an image array.
  • The payroll clock, which knows how many employees are on duty, may be tied into the system to provide an employee count at a given time.
  • The number of employees can be estimated.
  • The restaurant business is cyclical based on days of the week and the time of day. At a given time, a given number of employees or waitpersons are required to be on duty. This estimate may be provided to the system to account for the number of employees.
  • FIGURE 10 is a screen shot of an exemplary user interface 170 displaying data summaries and detailed data.
  • The user interface may be composed as web pages using hypertext markup language (HTML) and its many extensions (dynamic HTML, extensible markup language (XML), cascading style sheets (CSS), etc.).
  • The web pages and data are retrieved and displayed by a web browser application running on the central computer and also the restaurant processor.
  • The display preferably includes a table 172 containing data summaries of all facilities of interest to the user.
  • A regional manager's login (provision of a user identifier and a password, for example) may provide access only to data from facilities in his or her region.
  • A senior manager's login may provide access to data from facilities in all the regions under his or her control.
  • The data summary of each facility or unit occupies a row in the table.
  • The revenue generated at each facility is shown. This data is derived from summing the POS sales figures received from the facility. A total revenue amount from sales in all the facilities may also be displayed. The revenue amounts are compared with planned projections, and the differences and the percentage variances are displayed. Critical, major and minor breaches of various categories are also displayed. When a breach in a facility occurs, the data summary table displays the breach by category, criticality and the number of occurrences. To capture the attention of a user, visual and/or audible alarms may be initiated.
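  • The projection comparison itself is simple arithmetic; a small illustrative sketch (with hypothetical figures) is:

        def revenue_variance(actual, projected):
            """Return the difference from projection and the percentage variance."""
            difference = actual - projected
            percent = 100.0 * difference / projected if projected else 0.0
            return difference, percent

        print(revenue_variance(actual=9200.0, projected=10000.0))   # (-800.0, -8.0)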
  • A window 174 may open automatically or on demand to display detailed data of the facility experiencing the breach. The user may resize and position the table and window 174 to his or her liking. Window 174 contains more detailed qualitative and quantitative data of a particular facility. This detailed data window may be opened at any time for any facility. Quantitative data may be displayed in a page format 178 with tabs 176 to facilitate access. Each page may display a category of data, such as temperature and statistics, or each page may provide data on a certain process, such as food preparation activities and data or dining room activities and data.
  • The site statistics page or window may display data such as the number of customers currently waiting to be served and the maximum number of customers that were waiting to be served at some point in time; the number of customers currently in the dining area, the maximum number of customers in the dining area at some point in time, the total number of customers today, the total customer number compared to the norm, the yearly total number of customers, and the yearly customer number compared to the norm.
  • The site statistics page may also include the number of customers that are being served, and the total number of customers that have been served today. Statistics on the customer service time may also be provided, such as the average time, minimum and maximum time, and a comparison against the norm. The number of customers who requested take-out service may be available from the POS processors, so that today's total compared against the norm, and the yearly total compared to the norm, can be displayed in the site statistics page. These data types are merely provided as examples, as the options are limitless.
  • Qualitative data such as video images 180 and 183-185 may also be displayed.
  • video image 180 provides dynamic streaming video and optional audio taken by a camera location selected in an image window 182.
  • Additional image windows 183-185 may be used to display images captured by currently non- selected cameras.
  • the images in windows 183-185 may be static images or updated at a far lower rate than selected image window 180, depending on available bandwidth.
  • Clicking on the control icons displayed in window 186 can remotely control the pan, tilt, and zoom camera movements. Alternatively, the control icons are not required if the user is able to click on the selected camera image itself to re-center the image or cause the video camera angle to be displaced in a certain direction.
  • the user may select a different camera to be displayed in window 180, which in effect swaps the image currently displayed in dynamic window 180 with an image displayed in one of the static windows 183- 185. Additionally, the system may automatically select a camera installed at the location of a breach when it is detected.
  • the senior manager is notified of a problem in a facility by exception-based reporting when the quantitative and qualitative data fall outside of a preferred range. Therefore, the senior manager or other users are not required to continuously view and monitor the video images and other data of all the sites in order to evaluate their performance. Detailed information on particular sites can be accessed easily on the same screen. A response to a threshold breach can be carried out immediately to remedy the breach. The senior manager may use this opportunity to refine the training of his or her employees by immediately being in contact with the employee or the local manager by videoconference or teleconference.
  • the quantitative data and qualitative data can be archived for later analysis and study, such as trending, historic, forecasting, etc.
  • the captured images may be studied to determine the whys and hows of certain events. For example, a restaurant's June earnings are down substantially, which is not in line with projections and historic performance, and no obvious explanation is apparent.
  • a study and analysis of images captured by the exterior cameras may reveal that traffic on the road fronting the facility is nearly non-existent, leading to the realization that road access to the restaurant has been blocked off due to construction.
  • a study of the captured images may reveal that the restaurant has been operating with fewer tables, which were originally removed because of a shortage of wait staff.
  • the archived image data and quantitative data may also be used to support or deny an allegation of employee rudeness, inattentiveness, wrongdoing, etc.
  • because the system is preferably web-based, special software is not needed to access the data.
  • the senior manager may immediately respond by communicating with the staff via teleconferencing or videoconferencing.
  • the present invention virtually transports a senior manager to all the sites of his/her operations and facilities simultaneously, automatically monitors the operational data of each site and flags exceptions, and makes available the experience and knowledge of the senior manager to the staff at each site.
  • the benefits derived from employing the system and method of the present invention are manifold.
  • a long wait time for each phase of the dining experience (wait time for a table, wait time to place an order, wait time to be served, wait time to get the bill, etc.) can be closely monitored and improved if necessary. These parameters are determined by analyzing captured video or still images.
  • the proper manner in which the food is prepared (preparation temperature, presentation, etc.) can be ensured and improved if necessary.
  • Proper food preparation temperature is especially crucial for certain foodstuffs, such as chicken, beef, pork and eggs, for example. Optimal operations in these areas would particularly ensure greater customer satisfaction, higher return business, and higher revenues.
  • the operation and efficiency of the equipment may be closely monitored, including refrigerators, freezers, ovens, grills, fryers, etc.
  • Other security and safety devices such as smoke detectors, security alarms, carbon monoxide detectors, and employee panic buttons, can also be closely monitored on an exception basis.
  • External cameras can be used to capture images of vehicles entering the premises, the number of vehicles parked in the parking lot, traffic conditions on adjacent roadways, and weather conditions. These captured images can be analyzed to provide quantitative data and archive data for later analysis.
  • a particularly beneficial aspect of the present invention is the ability to allow a senior manager or owner to leverage his/her many years of experience and knowledge to optimize the operations and revenue of all of his/her facilities.
  • the present invention provides an efficient data conduit from the facilities to the senior manager and an efficient experience and knowledge conduit from the senior manager to the facilities.
  • the senior manager can better apply his experience and skills to optimize the operations or trouble-shoot.
  • the benefit comes from immediate remedy of parameter breaches, immediate feedback and correction on poor performance, immediate feedback and recognition on superior performance, and immediate opportunities to refine the training of the employees.
  • data related to all aspects of the facility can be processed, analyzed, archived, and reviewed.
  • the operations of a facility can be remotely monitored and supported by senior personnel. Even with an inexperienced staff, the remote monitoring, real-time training and communication with the senior manager allow the operations to be run more smoothly. Therefore, the knowledge and experience of the senior manager can be effectively leveraged to carefully monitor, control and optimize the operations of several facilities.
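The revenue comparison described in the data summary item above can be illustrated with a short calculation. The following Python sketch is not part of the original disclosure; the function names and the 5/10/20 percent severity cut-offs are assumed values chosen only to show how a percentage variance against plan might be classified as a minor, major or critical breach.

```python
# Illustrative sketch: sum POS sales, compare against a planned projection,
# and classify the shortfall as a critical, major, or minor breach.
# The threshold percentages are assumed values, not taken from the disclosure.

def revenue_variance(pos_sales, planned):
    """Return (total, difference, percent variance) for one facility."""
    total = sum(pos_sales)
    difference = total - planned
    percent = (difference / planned) * 100.0 if planned else 0.0
    return total, difference, percent

def classify_breach(percent_variance):
    """Map a negative variance to an assumed severity category."""
    shortfall = -percent_variance
    if shortfall >= 20.0:
        return "critical"
    if shortfall >= 10.0:
        return "major"
    if shortfall >= 5.0:
        return "minor"
    return None

if __name__ == "__main__":
    sales = [412.50, 389.25, 501.10]        # POS figures received from one facility
    total, diff, pct = revenue_variance(sales, planned=1500.00)
    print(round(total, 2), round(diff, 2), round(pct, 1), classify_breach(pct))
```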

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Alarm Systems (AREA)
  • Emergency Alarm Devices (AREA)

Abstract

The present invention provides a system and method of monitoring the operations of one or more labor-sensitive facilities and providing management support thereto. The system includes at least one sensor operable to receive quantitative data associated with the operations of the facilities, and at least one camera installed in the facility operable to capture images. Further, a processor is in communications with the at least one sensor and operable to receive the quantitative data. The processor is also in communications with the at least one camera and operable to receive the captured images. The processor is operable to derive quantitative data from the captured images, analyze the quantitative data, compare them with predetermined thresholds, and generate an alert in response to any threshold being exceeded. A central computer is in communications with the processor and is operable to receive at least a subset of the data received by the processor and any alert. The central computer is also operable to display to an off-site user the received data and alert, and to enable communication therebetween.

Description

SYSTEM AND METHOD OF
FACILITIES AND OPERATIONS MONITORING AND
REMOTE MANAGEMENT SUPPORT
TECHNICAL FIELD OF THE INVENTION
This invention is related in general to the field of computer systems and software. More particularly, the invention is related to a system and method of exception-based facilities and operations monitoring and remote management support.
BACKGROUND OF THE INVENTION
Today's manufacturing facilities, such as automotive assembly plants, are highly automated, controlled and monitored. The raw material input is inventoried and tracked as it moves through the factory. Each step of the manufacturing and assembly process is monitored carefully so that variances exceeding a predetermined threshold can be flagged and corrected immediately. Such careful control and monitoring of the entire process results in fewer failures and errors, even though many of the manufacturing processes are themselves highly automated and thus not particularly labor dependent.
Kitchens are, in many ways, no different from factories or assembly lines. In a kitchen, there is also raw material input that undergoes a well-defined process. At the end of the process, finished goods are produced. However, the "made-to-order" labor-intensive operation of today's commercial kitchens in the food service industry is typically not carefully monitored and managed. Therefore, the food product "manufactured" by a restaurant can be highly variable depending on the experience and training of its employees. With a tight labor market and the typical high turnover rate in the food service industry, employee training and employee performance monitoring and management support are a top priority.
Moreover, a customer's impression and evaluation of a visit to a restaurant is not only dependent upon the manufactured product, but is also highly dependent on the level of service delivered by the restaurant employees. As part of the evaluation and overall impression of the restaurant, customers factor in how long the wait was for a table, how long they had to wait for the waitperson to greet them and take their orders, how long they had to wait for the waitperson to deliver food to the table, how long it took for the waitperson to respond to requests (beverage refills, more bread, etc.), and how long they had to wait for the waitperson to deliver the check to the table, etc. Above all, it is important that the kitchen staff prepare food following proper procedures to ensure food safety and consistent quality.
The typical organizational structure in which a senior manager oversees a large number of restaurants makes it essentially impossible for the senior manager to ensure proper employee training, and to closely monitor and support each facility. As a result, the senior manager is rarely on the premises of the restaurants and has very little information on the day-to-day operations of the facilities. Even when a visit is made, the senior manager does not get a true picture of the day-to-day operations because the employees are often nervous and act differently than normal. Therefore, the senior manager's many years of experience and knowledge cannot be effectively and efficiently applied to ensure optimal operations or to trouble-shoot when problems arise.
The operations of many other types of facilities can also benefit from better control, better monitoring, and better training of their employees. The operations of these facilities are typically highly dependent on their employees to perform a service for the customers, particularly when the formation of the customers' total experience and impressions is highly dependent upon the performance of the employees. Examples of this type of service-oriented business include restaurants, hotels, daycare centers, automotive repair shops, drug stores, health clubs, banking institutions, hair salons, and car washes. Other businesses relying on their labor force to a lesser degree, such as retail merchandise outlets, grocery stores, and supermarkets, will also benefit from better training, management, monitoring and management support of their employees.
It may be seen that in all of these service-oriented businesses, the labor force typically suffers from a high turnover rate, and in many cases, little formal training. In general, the senior managers of such businesses receive only periodic and second-hand information reported by lower level managers, who in turn may have received the data from even lower level supervisors. The result is an undesirable disconnect between the senior manager and the day-to-day business. Therefore, the senior manager is not equipped to efficiently apply his/her experience and knowledge to the day-to-day operations of the facilities and to trouble-shoot when problems arise.
SUMMARY OF THE INVENTION
Accordingly, there is a need for a system and a method to monitor the operations and performance of multiple facilities and to provide management support thereto. In accordance with the present invention, a system and method of remote facilities and operations monitoring and providing management support are provided which eliminate or substantially reduce the disadvantages associated with prior operations. In particular, an exception-based remote monitoring of the facilities and operations and management support are provided.
In one embodiment of the invention, a system of monitoring the operations of one or more facilities and enabling management support thereto is provided. The system includes at least one camera installed in the facility operable to capture images. A processor is in communication with the at least one camera and operable to receive the captured images. The processor is operable to derive quantitative data from the captured images, analyze all the quantitative data, compare them with predetermined thresholds, and generate an alert in response to any threshold being exceeded.
In another embodiment of the invention, a system of monitoring the operations of one or more food service establishments and enabling management support thereto includes at least one sensor operable to gather quantitative data. The system further includes at least one camera installed in the food service establishment operable to capture images. A restaurant processor is in communication with the at least one sensor and operable to receive the quantitative data. The restaurant processor is also in communication with the at least one camera and operable to receive the captured images. The restaurant processor is operable to compare the quantitative data with predetermined thresholds, and to generate an alert in response to any threshold being exceeded. A central computer is in communication with the restaurant processor and operable to receive at least a subset of the data received by the restaurant processor and generated alerts. The central computer is operable to display the received data and alerts.
In yet another embodiment of the present invention, a method of monitoring the operations of a facility and providing management support thereto includes the steps of capturing images using at least one camera installed in strategic locations in the facility, processing the captured images and deriving quantitative data therefrom, comparing the quantitative data with predetermined thresholds, and generating an alert in response to any threshold being exceeded. The captured images, quantitative data, and alert are displayed to a user.
In another embodiment of the present invention, a method of remotely monitoring at least one restaurant facility and providing management support thereto includes the steps of receiving quantitative data determined by at least one sensor installed at the restaurant facility, and capturing images using at least one camera installed in strategic locations in the restaurant facility. The quantitative data are then compared with predetermined thresholds. An alert is generated in response to any threshold being exceeded. At least a subset of the quantitative data, captured images and generated alert are displayed to a remote user.
In yet another embodiment of the present invention, a method of counting items in captured images includes the steps of capturing a reference image, storing the captured reference image in a reference image array, capturing a target image, and storing the captured target image in a target image array. The target image array is compared with the reference image array and a change image array representing a difference therebetween is generated. Pixel groups in the change image array are then identified and counted.
In yet another embodiment of the present invention, a temperature sensor includes an infrared thermometer having a field of measurement, and a laser coupled to the infrared thermometer. The laser is operable to generate a laser beam visible to the human eye and is aimed substantially at the center of the infrared thermometer field of measurement. A camera is coupled to the laser and has a field of view substantially centered with the laser beam. Actuating motors are operable to simultaneously effect the displacement of the laser beam, infrared thermometer field of measurement, and camera field of view, so that the laser beam remains centered in the infrared thermometer field of measurement and camera field of view.
A technical advantage of the present invention is that critical data is made available and presented to senior management in such a way that senior management is able to apply their knowledge and experience in real time to improve the operations and earnings of the enterprise. This is accomplished by the provision of hierarchically presented exception-based remote monitoring of facilities and operations. Quantitative data are gathered and compared with predetermined thresholds so that exceptions can be noted. Qualitative data such as video images are captured and analyzed to derive quantitative values therefrom, said derived quantitative values also being compared against expected or normal thresholds or ranges. For example, the number of people waiting to be serviced can be determined by analyzing the captured images. The timing of when food is delivered to a table by the wait staff can also be determined from the captured images. The transformation of qualitative data to quantitative data so that certain performance values can be assigned to judge the quality of the service is an important aspect of the present invention. The exception information which results from the comparison of the quantitative values to the expected or normal threshold ranges can then be hierarchically presented to a remote user such as a senior manager, within the context of an alert condition, along with a "one click" initiation of a videoconference or teleconference so that said senior manager can provide management support to the exception condition. Using the present invention, a senior manager is better equipped to oversee the day-to-day operations of his/her many facilities and to troubleshoot when problems arise. Further, the system is operable to archive all or a subset of the quantitative data and video images when a service breach is detected. The stored data may be analyzed and studied at a later time to troubleshoot problems and to support or deny an allegation of employee rudeness, inattentiveness, wrongdoing, etc.
Furthermore, the use of non-intrusive "bots" to collect qualitative as well as quantitative data ensures quick installation time and minimal disruption to the operations of the facilities. The use of wireless communications protocols also obviates the need to install cables and wiring to each bot. Because the data are transmitted over the Internet to the senior manager, no special hardware or software is required to access the data and video images.
Further, because the employees are not required to make procedural changes, there is no additional training requirement. The table set forth below near the end of the detailed description summarizes the contemplated benefits and advantages associated with the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding of the present invention, reference may be made to the accompanying drawings, in which:
FIGURE 1 is a block diagram of an embodiment of a system of remote exception-based facilities and operations monitoring constructed according to the teachings of the present invention;
FIGURE 2 is a hierarchical diagram illustrating the levels of monitoring;
FIGURE 3 is a block diagram of an embodiment of a system of remote food service operations monitoring constructed according to the teachings of the present invention;
FIGURE 4 is a more detailed block diagram of kitchen equipment monitoring constructed according to an embodiment of the present invention;
FIGURE 5 is a more detailed block diagram of video monitoring constructed according to an embodiment of the present invention;
FIGURE 6 is a simplified flowchart of a fast food restaurant process;
FIGURE 7 is a flowchart of an embodiment of an order entry process according to an embodiment of the present invention;
FIGURE 8 is a flowchart of an embodiment of a food preparation process according to an embodiment of the present invention;
FIGURE 9 is a flowchart of an embodiment of a food service process according to an embodiment of the present invention;
FIGURE 10 is an exemplary graphical screen layout providing data summaries of a plurality of sites and a separate window providing detailed data of a particular site according to the teachings of the present invention;
FIGURE 11 is a flow chart of an exemplary image quantitative analysis process according to the teachings of the present invention;
FIGURES 12A-12C are illustrations representing a reference image array, a target image array and a change image array generated and used in an exemplary image quantitative analysis process according to the teachings of the present invention;
FIGURE 13 is a flowchart of an embodiment of a basic item/people counting process according to the teachings of the present invention; and
FIGURE 14 is a perspective view of an embodiment of a temperature sensor according to the teachings of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
FIGURE 1 is a block diagram of an embodiment of a system 10 of remote exception-based facilities and operations monitoring constructed according to the teachings of the present invention. Using the present invention, the operations of any facility can be remotely monitored. In particular, any business that is service- oriented would benefit from more efficient monitoring, management, and training enabled by the present invention. For example, restaurants, hotels, daycare centers, automotive repair shops, drug stores, health clubs, banking institutions, hair salons, car washes and other facilities in which employees perform tangible measurable tasks are good candidates. The most benefit is derived when there are many facilities under the management of a senior executive, and when the business is highly dependent on the performance of its employees .
System 10 includes a central computer 12 with a user interface, such as a graphical user interface or a web browser application for displaying data, including sensor measurements, images, video streams, audio streams, analysis data, data summary reports, graphical information, and other information. Central computer 12 may be any personal computer, computing platform, workstation, or processor that is capable of processing and analyzing data, and storing and accessing data in a database 14. A video camera 13 is coupled to central computer 12 for videoconferencing. Central computer 12 is also equipped for connection to a communications network such as the Internet 16 for communicating with one or more monitoring systems 20 each located at a different facility. Central computer 12 may incorporate a modem, a cable modem, or have a T1 connection, ISDN (integrated services digital network) connection, or a similar means of accessing the Internet 16. Each monitoring system 20 includes at least one processor 22 in communication with a number of distributed data collecting "bots" 24. Processor 22 may be any device which is capable of collecting data from bots 24, processing the data, and communicating the data to a web server (not explicitly shown). Processor 22 may be a personal computer, central processing unit, computing platform, workstation, processor, or any similar device. Preferably, processor 22 communicates with bots 24 via wireless communications, so that no cable or wiring has to be installed and routed in the facility. For example, the 15-XXXX series wireless observation systems manufactured by COP Security® may be used to transmit and receive the collected data signals and video signals. Preferably, all that is required to install bots 24 is to provide power and the physical mounting of the hardware. However, a subset of bots 24 may be coupled to processor 22 via cables or wires, if desirable or necessary.
Data collecting bots 24 may include embedded processors, temperature sensors, level sensors, smoke detectors, carbon monoxide detectors, panic buttons, current sensors, security devices, perimeter detection devices, motion sensors, video cameras, digital cameras and other devices that gather a variety of data related to the performance and operations of a facility. Bots 24 may collect quantitative data, such as temperature or electrical current usage, or qualitative data, such as video images. The data collected by bots 24 are continuously or periodically transmitted to processor 22 via a transmitter/receiver 23 coupled to processor 22. Processor 22 is further coupled to a monitor or display 21 on which data collected by bots 24, analysis data, and user remote control operators are displayed. Collected data and data derived from analysis may be stored in database or memory 36. The data may be stored for at least two reasons. The system may store the data of the past hour, for example, including quantitative data and video images. This data is continuously overwritten so that what is stored is always the data for the last predetermined period. A circular cache may be used for this purpose. The second type of data storage occurs as a response to an exception. When a predetermined threshold or range is exceeded, the processor may respond by storing all or a subset of the data for a predetermined period of time.
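As a rough illustration of the two storage behaviours described above, the following sketch (not part of the original disclosure) models a circular cache that always holds the most recent readings together with an exception archive that is written only when a threshold is breached. The class name, cache size and breach test are assumptions.

```python
# A minimal sketch of the two storage behaviours described above: a circular
# cache that always holds the most recent observations, and an archive that is
# written only when an exception (threshold breach) occurs.

from collections import deque
import time

class MonitoringStore:
    def __init__(self, cache_size=3600):
        # Circular cache: the oldest entries are overwritten automatically.
        self.cache = deque(maxlen=cache_size)
        self.exception_archive = []            # not overwritten once written

    def record(self, reading, threshold):
        entry = {"time": time.time(), "reading": reading}
        self.cache.append(entry)
        if reading > threshold:                # exception: preserve recent context
            self.exception_archive.append(list(self.cache))

store = MonitoringStore(cache_size=10)
for value in [3.0, 3.2, 9.9, 3.1]:            # 9.9 exceeds the assumed threshold
    store.record(value, threshold=5.0)
print(len(store.cache), len(store.exception_archive))
```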
Processor 22 may be coupled to a web server (not explicitly shown) on which web pages used to contain the data are stored. Processor 22 is operable to upload the collected data and analysis data to the web server. The web pages and the dynamic data are downloadable from the web server to central computer 12 via the Internet 16. Alternatively, the web pages and any static data may be stored locally on central computer 12 and the dynamic data and analysis data are transmitted directly from processors 22 to central computer 12 via the Internet. Software code, applets and additional applications may be used to enhance the transmission speed and/or dynamic presentation of the data as known in the art.
FIGURE 2 is a hierarchical diagram illustrating the levels of access and monitoring. An executive or senior level manager may have executive access 30 to all facilities or sites 36 under his or her oversight. Sites 38-43 are typically grouped into a number of regions. Regional managers may have regional access 33-35 only to the sites that are within the region(s) under his or her supervision. Alternatively, regional managers as well as site managers may have regional access 32 as well as site access 36 to facility data from all regions and sites. A user's personal identifier and password determine the level of access he/she has. Moreover, the capability to view system-wide data encourages cross-organizational communication and comparison. For example, the site manager of a less successful facility may access the operational data of a more successful facility to obtain insight and better understanding of its operations. Therefore, system 10 is a learning and management support tool as well as a monitoring tool. System 10 further allows the senior manager to conduct videoconferencing with any or all of his/her regional and site managers while viewing data summaries or detailed data on all the facilities. In this manner, the senior manager is able to efficiently apply his/her many years of experience to the operations of all the facilities.
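The login-based access levels of FIGURE 2 might be modelled, purely for illustration, as a mapping from a user identifier to the set of sites that user may view. The role names, site identifiers and credential store below are invented for the sketch.

```python
# Illustrative sketch of hierarchical access: a login is mapped to the set of
# sites whose data it may view. Identifiers and credentials are invented.

ACCESS = {
    "executive":   {"38", "39", "40", "41", "42", "43"},   # all sites
    "region_east": {"38", "39", "40"},
    "region_west": {"41", "42", "43"},
    "site_41":     {"41"},
}

def visible_sites(user_id, password, credentials):
    """Return the sites a user may monitor, or an empty set if login fails."""
    if credentials.get(user_id) != password:
        return set()
    return ACCESS.get(user_id, set())

creds = {"region_east": "secret"}             # assumed credential store
print(visible_sites("region_east", "secret", creds))
```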
FIGURE 3 is a block diagram of an embodiment of a system 50 as specifically applied to remote food service operations monitoring according to the teachings of the present invention. System 50 includes a central computer 52 with a user interface, such as a graphical user interface for displaying data, such as sensor measurements, images, video streams, audio streams and other information. Central computer 52 may be any personal computer, computing platform, workstation, or processor that is capable of processing and analyzing data, and storing and accessing data in a database 54. A video camera 53 is coupled to central computer for videoconferencing. Central computer 52 is also equipped for connection to the Internet, a telecommunications network, or a computer network 56 for communicating with restaurant monitoring systems 60 and 80. Each restaurant monitoring system 60 includes a restaurant processor 62 in wireless communications 64 with a number of distributed bots, which may include embedded processors, sensors, and other devices that gather a variety of data about the performance and operations of a food service facility. The collected data and analysis thereof provide insight into the operations of the facilities that is usually not apparent or available, or is apparent only to those who have trained eyes and years of experience.
One or more point-of-sale (POS) processors 66 may be used by the restaurant employees to input customer food orders and determine the transaction amount. Typically, such POS processors 66 incorporate a keypad with buttons representing or labeled with names of food products that the restaurant serves. Similar devices, including wait staff order entry processors 72 typically used in establishments that use a wait staff to serve the food to seated customers, are used to enter ordered food items into system 50. Some POS processors 66 may be linked to portable order entry devices, such as a personal digital assistant (PDA) 76 and pagers 75, used by the wait staff via wireless communications. The food order, timestamp, and transaction amount of the order for each customer are conveyed to restaurant processor 62 via wireless communications. Not shown explicitly are security devices, such as smoke detectors, carbon monoxide detectors, perimeter alarms, etc. which are also in communications with processor 62 to relay any breach of associated parameters.
Well-known wireless local loop technology may be used for the transmission of data between the bots and the restaurant processor. Not shown explicitly are wireless transmitters and receivers that enable the transmission and receipt of data. Typically, frequencies in the 900 to 2500 MHz range are used to minimize interference. System 50 may employ any suitable communications protocol between restaurant processor 62 and the data gathering bots and equipment, and between restaurant processor 62 and central computer 52. Although each restaurant processor is shown located at each food service establishment, the restaurant processors need not be installed on the premises of the establishment, if the means of communications between the restaurant processor and the data gathering bots and equipment so permits.
Referring also to FIGURE 4, bots 68 including sensors monitoring various kitchen equipment are also in wireless communication with restaurant processor 62. The temperature of kitchen equipment may be measured by strategically positioned temperature sensors 70. The use of non-intrusive external temperature sensors is preferred to ensure minimal disruption of kitchen operations when the kitchen equipment is not already equipped to measure and transmit this data. Preferably, an infrared thermometer with a laser marker is used to pinpoint an exact measurement region. For example, an infrared thermometer with laser marker, model HHLM-2, manufactured by Omega may be used for this purpose. The measured temperature reading is output as a voltage level, which is converted to a frequency by a voltage-to-frequency converter (not explicitly shown) and transmitted by the wireless transmitter. At the processor, the received frequency signal is converted to a voltage level by a frequency-to-voltage converter (not explicitly shown). The temperature represented by the voltage level is then provided as output. In a preferred embodiment, the non-intrusive temperature sensor 340, as shown in FIGURE 14, includes an infrared thermometer and laser marker device 342 combined with a video camera 344. The center of the field of view 346 of the video camera is adjusted and aligned to coincide with the laser beam 348. The center of a field of measurement 350 of the infrared thermometer is also adjusted and aligned with the laser beam 348. Video camera 344 and infrared thermometer and laser marker device 342 are mounted together as a single integral unit to stepping motors 352 and driver 354, which may be used to control the uniform pan and tilt movements of the video camera and infrared sensor/laser. The video image of the video camera is transmitted to the processor via video/audio transmitter/receiver 356 and then to the central computer for display to the user. Therefore, the user is able to see and determine the exact measurement region of the temperature sensor. The user may remotely control the movement of the video camera to point the infrared thermometer, to modify or fine-tune the aim thereof, and thus change the measurement region. The user control signals are transmitted back to the stepper motors to effect the requested camera movement. The user is able to receive constant feedback on the aim of the infrared thermometer by viewing the video image. As known in the art, the video image may be converted to an active pixel map which enables the user to control the camera movement by clicking on the displayed video image. The camera and thus the infrared thermometer are centered about the pixel the user clicked on. In a preferred embodiment, a communication channel previously intended to transmit the audio signal is used to transmit the temperature measurement. Therefore, in this preferred embodiment, no extra wireless transmitter and receiver hardware is needed to convey the temperature data along with the video signals to the processor.
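The telemetry path described above (temperature to voltage, voltage to frequency for wireless transmission, and the reverse conversion at the processor) can be sketched as follows. The scale factors are assumed round numbers and are not taken from the cited Omega hardware.

```python
# Simplified sketch of the telemetry path: the thermometer's voltage output is
# converted to a frequency for wireless transmission and converted back at the
# processor. The scale factors below are assumed values for illustration only.

MV_PER_DEG_C = 10.0      # assumed thermometer output scale (millivolts per deg C)
HZ_PER_MV = 100.0        # assumed voltage-to-frequency converter scale

def encode(temp_c):
    """Temperature -> voltage -> frequency (transmitter side)."""
    millivolts = temp_c * MV_PER_DEG_C
    return millivolts * HZ_PER_MV

def decode(freq_hz):
    """Frequency -> voltage -> temperature (processor side)."""
    millivolts = freq_hz / HZ_PER_MV
    return millivolts / MV_PER_DEG_C

received = encode(176.5)                  # e.g. a grill surface reading in deg C
print(round(decode(received), 1))         # recovered temperature at the processor
```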
Infrared temperature sensor 340 is operable to detect and transmit the operating temperature of kitchen equipment either continuously or when the doors are opened. For example, a temperature reading may be taken when a kitchen staff opens the oven to remove cooked food. The operating temperatures of ovens 90, stoves, grills 94, fryers 92, refrigerators 100, freezers 102, food holding areas 98, etc. may be measured in this manner and transmitted to the restaurant processor. On the other hand, some kitchen equipment may already incorporate temperature and other types of sensors. For example, some ovens 90 may be equipped with processors and sensors to determine and transmit data such as operating temperature, settings (food type and/or temperature setting) for cooking the food, and food preparation time. Similarly, other cooking equipment such as fryers 92 and grills 94 may also include temperature sensors and time keeping devices to determine and transmit temperature and time data to restaurant processor 62. In addition, hot food and cold food holding areas 96 and 98 may also include temperature sensors to detect the temperature of the prepared food held there or the ambient temperature surrounding the food, and/or the temperature of the counter surface where the food is placed. Other food preparation equipment such as a soft serve ice cream machine also may include temperature sensors and level sensors that are operable to determine and transmit ice cream mix temperatures and levels to restaurant processor 62. The operation and performance of additional kitchen equipment such as refrigerators 100 and freezers 102 may also be monitored by detecting and transmitting temperature measurements to restaurant processor 62. In all equipment, energy consumption may also be measured and conveyed to restaurant processor 62 to determine the operating efficiency of the equipment. For example, an inductive current sensor may be used to measure current flow in the power lines of the equipment. The energy consumption measurements may be timestamped to correlate with operating hours and recognize peak usage hours. Processor 62 may poll each bot periodically to receive the data.
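By way of illustration, timestamped current measurements from an inductive sensor could be correlated with operating hours in the manner sketched below; the line voltage, sample set and one-sample-per-hour assumption are illustrative only.

```python
# Illustrative sketch of correlating timestamped current measurements with
# operating hours to recognize peak usage. Readings and line voltage are assumed.

from collections import defaultdict

LINE_VOLTAGE = 230.0                       # assumed supply voltage

def peak_hours(readings):
    """readings: list of (hour_of_day, amps). Return hours sorted by energy."""
    energy_by_hour = defaultdict(float)
    for hour, amps in readings:
        # Each sample is assumed to represent one full hour of operation (kWh).
        energy_by_hour[hour] += amps * LINE_VOLTAGE / 1000.0
    return sorted(energy_by_hour.items(), key=lambda kv: kv[1], reverse=True)

samples = [(11, 12.0), (12, 18.5), (13, 17.0), (15, 6.5)]   # inductive sensor data
print(peak_hours(samples))                 # highest-consumption hours first
```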
FIGURE 5 is a more detailed block diagram of a video monitoring scheme according to an embodiment of the present invention. Video and/or still cameras 110-119 are installed at strategic locations in and around the food service facility. In most instances, the cameras are positioned overhead, but the cameras may also be placed at eye-level, for example, and pointed across the target area to obtain a three-dimensional image of the target area. Optionally, a subset of these cameras may be infrared cameras that operate in the infrared region of the spectrum. The captured video images not only serve as surveillance tools, but can also be analyzed to derive quantitative data that can be used to measure the quality of performance by the employees. Furthermore, selected images at a reduced frame rate may be archived for later analysis.
For example, one or more cameras 110 may be located in the restaurant lobby or waiting area to monitor the activities there. Other video cameras 118 may be installed in the dining area. By using image analysis software as further described below, the number of people waiting in food service lines, waiting to be seated, and seated at tables may be determined. The captured images may be timestamped so that the time associated with certain activities may be determined. For example, an analysis of a captured image of table 21 at 18:32:33 may reveal that no one was seated at the table. An analysis of an image of the same table captured at 18:39:28 may reveal that three customers are seated at the table. An analysis of an image captured at 18:48:32 may show that an additional person arrived at the table and left shortly after. The system may possess sufficient knowledge and artificial intelligence to assume that this person was a server who took food orders from the customers. An analysis of an image captured at 18:59:44 may reveal that round flat objects resembling plates were placed on the table. The system may determine from this information that it took at least nine minutes after the customers were seated for the server to take the food orders, and that it took at least eleven minutes for the food to arrive at the table. Therefore, such metrics for measuring the performance of the restaurant staff are computed and compared with permissible ranges. Details of the image analysis process are described below by referring to FIGURES 12A-12C.
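A minimal sketch of how the timestamped observations of table 21 above could be turned into service metrics and checked against permissible ranges is given below; the event names and the five- and fifteen-minute thresholds are assumptions.

```python
# Illustrative sketch: turn timestamped image-analysis events for one table into
# service metrics and compare them with assumed permissible thresholds.

from datetime import datetime

def minutes_between(t1, t2):
    fmt = "%H:%M:%S"
    return (datetime.strptime(t2, fmt) - datetime.strptime(t1, fmt)).seconds / 60.0

events = {                                   # derived from captured images of table 21
    "seated":       "18:39:28",
    "order_taken":  "18:48:32",
    "food_arrived": "18:59:44",
}

order_wait = minutes_between(events["seated"], events["order_taken"])
food_wait = minutes_between(events["order_taken"], events["food_arrived"])

# Assumed permissible limits; breaches would be flagged as exceptions.
if order_wait > 5.0:
    print(f"order wait {order_wait:.0f} min exceeds threshold")
if food_wait > 15.0:
    print(f"food wait {food_wait:.0f} min exceeds threshold")
```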
One or more cameras 112 may also be installed in the kitchen to monitor the activities of the kitchen staff and other indicators of performance. For example, the cameras may be able to capture whether the floor and food preparation surfaces are kept clean. Food holding area cameras 114 may be used to capture images of the prepared food. Image processing software may be used to analyze the shade and color of the prepared food, the presence or absence of grill marks, and the general overall presentation of the food, for example.
If the food service facility includes drive through service, additional cameras 116 may be installed to monitor the number of cars waiting to place food orders at the drive through window. Furthermore, exterior cameras may be used to capture images of cars entering the premises and then leaving because the drive through line is too long. Traffic conditions of adjacent roadways may also be monitored, analyzed and compared with the norm. Dining room cameras 118 may also be used to survey the activity in the dining area. The overall cleanliness of the dining area, including indications of how quickly tables are bussed, may also be captured.
One or more additional video cameras 119 are also used for videoconferencing with the senior manager or with other personnel. These videoconferencing cameras are preferably located in an office or an area with minimal disturbance. Videoconferencing cameras 119 may be automatically activated when a service breach has occurred, such as when the temperature of a prepared food item is below the required temperature threshold. Videoconferencing may also be initiated when appropriate controls on the display screen are activated, such as when a videoconferencing button is clicked.
All still and video cameras communicate video and optionally audio data to restaurant processor 22 via wireless transmission. In addition, a sophisticated system may allow a senior manager using central computer 52 to enter camera movement commands (pan and tilt) to point the camera in specific directions and to zoom in and out in real-time. Preferably, the video image is converted to an active pixel map which allows a user to center the image and thereby point the camera by clicking on the video display itself. In this manner, a manager may monitor the activities of the restaurant in real-time as if performing a virtual visit and survey of the facilities. As an aid to marketing, a senior manager or marketing executive may be able to view or otherwise videoconference with customers to gauge reaction to new food product offerings, food quality and quantity, and to receive immediate customer feedback.
FIGURE 6 is a simplified flowchart of a restaurant process 120 serving as an exemplary process in which the system and method of the present invention are applicable. The first step of the process is taking food orders from the customer. The food orders are input into POS processor 66, as shown in block 122. This allows the logging of food orders into system 50, which may be used for a number of purposes to be described below in conjunction with FIGURE 7. Next, the food is prepared, as shown in block 124. In some restaurant operations, certain foods may be prepared ahead of peak dining times in anticipation of high demand due to various reasons, such as warm comfort foods during cold wintry weather. Finally, the food is served to the customer, as shown in block 126. The process ends in block 128. With minor modifications, this process may be adjusted to other types of restaurant operations. For example, an additional step of seating the customers at a table may be added prior to the order entry step. Steps of presenting the bill to the customer and receiving payment may also be added. Each of these steps in the process presents opportunities for performance measurement and monitoring. The wait staff may simply log each action in his or her portable order entry device so that each action receives a timestamp.
FIGURE 7 is a flowchart of an embodiment of a process 130 for order entry and to begin service according to an embodiment of the present invention. Using captured images, an indication of the amount of time customers are required to wait to be seated may be determined, as shown in block 131. This step is optional for counter service establishments. Imaging analysis software, as described in detail below, is used to count the number of people waiting in the waiting area of the restaurant. A correlation for a wait time can then be drawn from the people count. If the wait time or the number of people waiting for a table exceeds a predetermined threshold, an alert is generated, as shown in blocks 132 and 133. In block 134, the amount of time a customer is required to wait to place an order is determined. For example, an image of the length of a line or the number of customers waiting in line at the counter may be captured by a video camera and analyzed by the image analysis software. In a "sit-down" restaurant, the order placement wait time may be determined by first ascertaining an approximate time the customers were seated at the table, and an approximate time the server approached the table briefly to take the order. By using image analysis software, the timing of such activities may be identified. Please refer to the discussion below for details on the image analysis software. The time that the customer is required to wait to place the order is then compared to a predetermined threshold, as shown in block 135. If the threshold is exceeded, then an alert is generated, as shown in block 136. If the system is coupled to POS processors, the system then receives the food order, which is entered by the server, as shown in block 138. The order is timestamped or otherwise associated with a time value, as shown in block 140. The order entry process ends in block 142. It will be apparent from the discussion below that the image analysis software may be resident on central computer 52 or restaurant processor 62.
Referring to FIGURE 11, an exemplary process 200 of analyzing the captured images according to the present invention is shown. In block 202, an image is captured by a video or still camera located in areas where customers may be waiting to be served. As described above, infrared cameras may also be used. Preferably, the camera is located overhead and pointing downward. By detecting the boundaries between shades of color or gray scale tonal values, shapes may be delineated and recognized, as shown in block 204. The delineated shapes are then compared with relevant shapes stored in memory, as shown in block 206. For example, the relevant shapes are shapes that may represent customers' heads and plates of food. The infrared spectrum cameras may be used to obtain thermal imaging of the scene. The thermal image may be used as an aid to better analyze the captured video or still image. The thermal image may alternatively be the sole image analyzed to detect the presence of people. Using the thermal image, the outline of each person may be more accurately determined, as shown in block 208. In block 210, by counting the recognized outline of each person, the number of people waiting in line can be determined. Therefore, the qualitative image data is translated to a quantitative value that can be used in comparison against preset standard thresholds and ranges. The comparison provides an evaluation against the norm such that exceptions may be noted and flagged. The process ends in block 212. The resulting number of waiting customers or a determination of wait time to place the order is compared with respective predetermined thresholds or predetermined ranges. If the thresholds are exceeded or if the data falls outside of the predetermined ranges, as determined in blocks 134 and 135 (FIGURE 7), then an alert may be issued (blocks 133 and 136) so that remedial action may be taken. Remedial action may include a wide range of responses depending on the type and seriousness of the breach. For example, the senior manager may immediately communicate with the restaurant manager via teleconferencing, videoconferencing, or some other form of communication. In response to certain alerts, system 50 may automatically generate an electronic mail message, a page, voice mail, or some other form of notification to the restaurant manager and/or the personnel causing the alert. In other instances, violations of the threshold may be logged and video images associated with the violation may be cataloged and stored for later statistical analysis and evaluation without immediate corrective action. System 50 then receives the order at POS processor 66, as shown in block 138, at which time a timestamp may be determined, as shown in block 140. The process ends in block 142.
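One possible form of the graduated remedial responses described above is sketched below. The severity levels, the mapping from severity to notification channel, and the placeholder action names are assumptions rather than part of the disclosed system.

```python
# Illustrative sketch of dispatching an alert according to the seriousness of
# the breach. Severity levels, channels and message format are assumed; the
# returned action names are placeholders, not real API calls.

def dispatch_alert(breach):
    """breach: dict with 'metric', 'value', 'threshold', 'severity'."""
    message = (f"{breach['metric']} = {breach['value']} "
               f"(threshold {breach['threshold']})")
    if breach["severity"] == "critical":
        # Prompt an immediate videoconference with the local manager.
        return ("open_videoconference", message)
    if breach["severity"] == "major":
        # Notify the restaurant manager by e-mail or page.
        return ("send_page", message)
    # Minor breaches are logged with the associated images for later review.
    return ("log_for_analysis", message)

print(dispatch_alert({"metric": "waiting customers", "value": 14,
                      "threshold": 10, "severity": "major"}))
```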
FIGURE 8 is a flowchart of an embodiment of a food preparation process 150 according to an embodiment of the present invention. In block 152, food preparation start time may be timestamped or determined from the time the food order was taken. Kitchen equipment temperature, food type settings, and cooking time are measured or determined, as shown in block 154. The kitchen equipment operating status may also be measured by other types of sensors, as shown in block 156. For example, if an oven is set at 350 degrees, the actual temperature inside the oven is measured to determine whether the oven is operating properly. Such metrics are compared with predetermined thresholds in block 158. If the threshold is exceeded, then it is flagged or an alert is sent to the restaurant processor and/or the central computer, where it is observable by the local manager and the senior manager. The managers may then take suitable actions in real-time or at a later time as deemed appropriate. Next, the food preparation end time is also noted, as shown in block 162. The total food preparation time is then compared with a predetermined threshold, as shown in block 164. Similarly, an alert is issued if the threshold is exceeded, as shown in block 165, and suitable action may be taken. When the food is taken to the food holding areas, the food temperature is measured, as shown in block 166. This measurement is compared with a predetermined threshold, as shown in block 168, and exceptions are noted and actions may be taken when appropriate, as shown in block 169. The process ends in block 170.
FIGURE 9 is a flowchart of an embodiment of a food service process 180 according to an embodiment of the present invention. In blocks 182 and 184, the food temperature is measured by the temperature bot and compared with a predetermined threshold or range. An alert is issued so that appropriate action may be taken when the threshold is exceeded or the value falls outside of the range. It is preferable to continuously evaluate the food temperature up to the time the food is taken from the holding area and served to ensure that the customer receives food at the optimum temperature. When the food is served, the time is noted. This timestamp is compared with the food preparation end time to determine how long the food idled at the holding area before it was served. This elapsed time is compared with a predetermined threshold or range, as shown in block 190. If the threshold is exceeded or the value falls outside of the range, then an alert is issued so that a suitable response may be made. The process ends in block 192.
The system may store data for at least two reasons. The system may store the data of the past hour, for example, including quantitative data and video images. This data is continuously overwritten so that what is stored is always the data for the last predetermined period. A circular cache may be used for this purpose. The second type of data storage occurs as a response to an exception. At any time a breach occurs, selected quantitative and qualitative data may be stored in a cache or some form of non-volatile memory. The exception-based data are not overwritten unless specific instructions are given to do so. The type and quantity of data stored may depend on the type of breach that occurred. The video images may be stored at a reduced frame rate to conserve space, if necessary. The archived data can be studied and analyzed to provide insight into the breach and perhaps provide solutions to avoid similar breaches in the future. In addition, the cached data, especially the stored video images, may be used to provide evidence for subsequent legal actions concerning the operations of the facilities and employee activities.
FIGURES 12A-12C are illustrations representing a reference image array, a target image array and a change image array generated and used in an exemplary image quantitative analysis process according to the teachings of the present invention. References will also be made to FIGURE 13, which is a flowchart of the basic people counting process 300. At some point each day when there are no people in the dining area and waiting area of a restaurant, a video image is captured of the "empty space", as shown in block 302. This image is stored as the reference image against which future images will be compared. More specifically, the video image is captured in pixel format (typically 8 bit or 16 bit) and stored in a two-dimensional array, as shown in block 304. The data is preferably stored in pixel format. This stored pixel data forms a reference image array 230. In the example shown in FIGURE 12A, reference image array 230 contains the two-dimensional pixel data representing three tables 231-232 and the accompanying chairs 234-245 in a dining area of the restaurant. Preferably, to increase the accuracy of the people counting process, each video camera covers a small grouping of tables, so that multiple video cameras are needed to cover the entire foodservice area.
When a customer count is needed, such as at opening and every five minutes thereafter, another video snapshot of the same scene is taken using the same camera, as shown in block 306. The information is preferably stored in the same format as the reference image array, as shown in block 308. The resultant two-dimensional pixel data is a target image array 250. As shown, target image array 250 also includes pixel representations of people (seated 251-255 and standing 256 and 257), a service tray 260, and plates of food 261. The stored target image array 250 is then compared to reference image array 230, as shown in block 310. For those areas where the pixels are the same in both arrays, the system determines that there are no new objects or people. When there are differences between the two arrays, the system may assume that people or objects are now present where there were none in the reference image array.
There are several methods for segregating the pixels which have changed. A change image array 270 is generated by subtracting reference image array 230 from target image array 250 on a pixel-by-pixel basis, leaving only what has changed. Change image array 270 is stored, as shown in block 312. The next step is to reduce the pixels in change image array 270 into pixel groups, as shown in block 314. In its simplest form, this is accomplished by systematically examining each pixel in change image array 270. A pixel group is defined as multiple adjacent changed pixels surrounded by unchanged pixels. Each pixel group is suspected by the system to represent a person, plates of food, or new objects brought into the area, for example. It should be noted that changes in pixel values due to a difference in lighting, time-of-day, weather conditions, etc. should be discounted. For example, if the sun goes behind a cloud, a table may appear somewhat darker than it did moments before, so the pixel values in the target image array may be somewhat different than the reference image array. Changes of this type should not be translated to the presence of people. Accordingly, changes which are not substantially different may be ignored both for the purpose of creating the change image array, and also for determining pixel groups. For example, tonal changes in the pixel value not exceeding a predetermined range may be interpreted as changes caused by lighting changes and not caused by any real change in the scene.
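The subtraction, tolerance and pixel-grouping steps described above can be sketched with small integer arrays standing in for camera frames. The tolerance value, array contents and minimum group size below are assumptions; the grouping uses a simple 4-connected flood fill.

```python
# A minimal sketch of the change image array and pixel-group steps described
# above, using toy integer arrays in place of real camera frames.

TOLERANCE = 10          # tonal changes within this range are treated as lighting

def change_array(reference, target):
    """Pixel-by-pixel difference, suppressing changes within the tolerance."""
    rows, cols = len(reference), len(reference[0])
    return [[0 if abs(target[r][c] - reference[r][c]) <= TOLERANCE else 1
             for c in range(cols)] for r in range(rows)]

def pixel_groups(change):
    """Group adjacent changed pixels (4-connectivity) and return group sizes."""
    rows, cols = len(change), len(change[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if change[r][c] and not seen[r][c]:
                stack, size = [(r, c)], 0
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and change[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(size)
    return sizes

reference = [[50, 50, 50, 50]] * 4                       # empty dining area
target = [[50, 200, 200, 50],                            # a new object has appeared
          [50, 200, 200, 50],
          [50, 50, 50, 50],
          [50, 55, 50, 50]]                              # 55: lighting change only
groups = pixel_groups(change_array(reference, target))
MIN_PERSON_PIXELS = 3                                    # assumed size threshold
print(sum(1 for s in groups if s >= MIN_PERSON_PIXELS))  # crude people count
```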
There are several methods to determine the number of people in the image from the pixel groups. The first option utilizes the fact that people tend to move frequently, but most other significant objects in a foodservice area do not. Once pixel groups have been identified, as shown in block 314, the process of generating a change image array is repeated, at relatively brief intervals, for a predetermined number of times. For example, a change image array may be generated for images captured every ten seconds for six repetitions. Thus, six sequential change image arrays are generated in 60 seconds. The sequential change image arrays are then compared to each other. If, in each of the six sequential change image arrays, a pixel group of a required size has substantively changed, this particular pixel group may be counted as a person. Conversely, if a pixel group has not changed in each of the six change image arrays, it is likely that this particular pixel group does not represent a person. Using this method, the original reference image array 230 becomes less critical over time. For example, if a pixel group remains unchanged over several sequential change image arrays, then, for another set of sequential change image arrays, the same pixel group does change, and then the same pixel group reverts back to the original pixel values, the system may assume something or someone passed between the camera and a stationary background. If the system assumes that no inanimate object can move without human assistance, then the movement can be counted as a person.
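A toy version of this movement-based test is sketched below: a candidate region is counted as a person only if its pixels differ substantively between every pair of consecutive change image arrays. The region, the number of intervals and the 25 percent "substantive change" fraction are assumed values.

```python
# Illustrative sketch of the movement-based test: a region is counted as a
# person only if it changes substantively between consecutive change arrays.

def region_pixels(change, region):
    """Extract the pixels inside a candidate pixel group's bounding box."""
    (r0, r1), (c0, c1) = region
    return [row[c0:c1] for row in change[r0:r1]]

def is_moving_person(sequential_changes, region, min_changed_fraction=0.25):
    """True if the region changes substantively between every pair of frames."""
    for earlier, later in zip(sequential_changes, sequential_changes[1:]):
        a, b = region_pixels(earlier, region), region_pixels(later, region)
        cells = sum(len(row) for row in a)
        differing = sum(x != y for ra, rb in zip(a, b) for x, y in zip(ra, rb))
        if differing / cells < min_changed_fraction:
            return False        # stationary between two intervals: likely not a person
    return True

# Six toy change arrays captured at ten-second intervals (2x4 grids).
frames = [[[1, 1, 0, 0], [1, 0, 0, 0]],
          [[0, 1, 1, 0], [0, 1, 0, 0]],
          [[0, 0, 1, 1], [0, 0, 1, 0]],
          [[0, 1, 1, 0], [0, 1, 0, 0]],
          [[1, 1, 0, 0], [1, 0, 0, 0]],
          [[0, 1, 1, 0], [0, 1, 0, 0]]]
print(is_moving_person(frames, region=((0, 2), (0, 4))))
```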
The resultant people count value or the associated change image array is timestamped to provide a time reference for these values. The timestamp of the people count may be used to determine quantitative data associated with time. For example, the time span from when the customers are seated at a table to when their food arrives or the average wait time for a table can be determined.
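As a small worked example of deriving such time spans, with entirely hypothetical timestamps, the wait for a table can be computed directly from two timestamped count events:

```python
from datetime import datetime

# Hypothetical timestamps attached to two count events for the same party:
# first counted in the waiting area, later counted at a table.
counted_waiting_at = datetime(2001, 2, 12, 12, 0)
counted_seated_at = datetime(2001, 2, 12, 12, 14)

table_wait = counted_seated_at - counted_waiting_at
print("wait time for a table:", table_wait)  # prints 0:14:00
```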
Another people counting method utilizes the fact that, in general, the largest thing to appear in a foodservice area (i.e., a dining room, a bar, or a waiting area) is a person. Plates, forks, and other items may enter and leave the scene, but they are much smaller than people. Therefore, one way of counting people is to simply count all the pixel groups which are larger than a certain predetermined size, as shown in block 316, and the process ends in block 318. The size comparison may be done by comparing the number of pixels in each pixel group to a predetermined number of pixels.
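A minimal sketch of this size test, reusing the pixel groups produced by the earlier sketch; the pixel threshold is an assumed value that would depend on the camera installation:

```python
MIN_PERSON_PIXELS = 400  # assumed threshold; tuned to camera placement and resolution


def count_people_by_size(groups):
    """Count every pixel group at least as large as the person-sized threshold."""
    return sum(1 for group in groups if len(group) >= MIN_PERSON_PIXELS)
```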
Another people counting method is to compare the pixel groups to reference pixel groups which represent two-dimensional shapes of people. For example, representative pixel samples of several people with different hair colors and hairstyles may be obtained from several possible angles and distances. The video samples are stored as two-dimensional "people-like" pixel groups. The pixel groups in the change image array can then be compared to the stored "people-like" pixel groups for a match. When there is a match, the pixel group in the change image array is counted as a person.
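One greatly simplified stand-in for this shape comparison reduces each stored "people-like" sample and each candidate group to a bounding-box aspect ratio and fill ratio. The signature and tolerance below are assumptions for illustration only; the comparison described above could equally be carried out pixel by pixel against the stored samples.

```python
def shape_signature(group):
    """Reduce a pixel group to (height/width aspect ratio, fill ratio)."""
    ys = [y for y, _ in group]
    xs = [x for _, x in group]
    height = max(ys) - min(ys) + 1
    width = max(xs) - min(xs) + 1
    return height / width, len(group) / float(height * width)


def matches_person_template(group, templates, tolerance=0.25):
    """True if the group's signature is close to any stored 'people-like' signature."""
    aspect, fill = shape_signature(group)
    return any(abs(aspect - a) <= tolerance and abs(fill - f) <= tolerance
               for a, f in templates)


def count_people_by_template(groups, templates):
    return sum(1 for g in groups if matches_person_template(g, templates))
```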
Another method for distinguishing a person pixel group is to remove pixel groups that do not resemble a person. For example, those pixel groups that resemble the basic shapes of plates and service trays can be distinguished and eliminated. Furthermore, a thermal image of the same scene taken at the same time may be analyzed to eliminate non-human pixel groups. The system then assumes that the remaining pixel groups are people and counts them as such.
It should be noted that there are several variations and/or enhancements that may be used to gain a higher degree of accuracy. For example, objects closer to the camera appear larger than identical objects further away from the camera. Accordingly, objects in one portion of the scene may appear larger than identical objects in another portion of the scene. If a camera is located near the ceiling in a corner of a room, for example, objects in the lower portion of the captured image would appear larger than identical objects in the upper portion of the image. The people counting process may apply this knowledge to achieve higher accuracy. Furthermore, if the camera is installed in a fixed location, viewing a room of a fixed size, it is relatively simple to determine how far away from the camera various portions of the screen represent. For example, a one-foot ruler can be placed in the lower portion of the screen, which would represent close proximity to the camera located near the ceiling, and also in the upper portion of the screen, which would represent a greater distance away from the camera. Scaling vectors can then be drawn between the two rulers to indicate the relative scale of items located in the lower portion of the screen as compared to items located in the upper portion of the screen, and stored for future reference during the analysis process. If the camera is at a fixed location, but capable of panning and zooming under electronic control, a variation of this method can be used to determine scale as the camera is pointed in different directions or zoomed to different views. This variation utilizes scaling vectors taken at various camera panning positions and zoom levels. These processes enhance the counting methods described above to the extent that size is an important component in determining whether or not a pixel group should be counted as a person.
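A sketch of this position-dependent scaling, assuming a single linear scaling vector between a one-foot calibration near the bottom of the frame and one near the top; all constants and the assumed person footprint are illustrative:

```python
PIXELS_PER_FOOT_BOTTOM = 60.0  # one-foot ruler measured low in the frame (near the camera)
PIXELS_PER_FOOT_TOP = 15.0     # one-foot ruler measured high in the frame (far from the camera)


def expected_person_pixels(row, frame_height, person_area_sq_ft=4.0):
    """Scale the person-size threshold by how far down the frame a group sits."""
    frac = row / max(frame_height - 1, 1)  # 0.0 at the top of the frame, 1.0 at the bottom
    pixels_per_foot = PIXELS_PER_FOOT_TOP + frac * (PIXELS_PER_FOOT_BOTTOM - PIXELS_PER_FOOT_TOP)
    return person_area_sq_ft * pixels_per_foot * pixels_per_foot
```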
Redundancies can also be used to reduce false measurements. For example, counting the number of people using each of the four alternate measures noted above may yield four different values, but an average of the count values from the four methods may come statistically closer to the actual count. Repetitive readings can also be used to reduce false measurements. For example, capturing and counting a different target array every 3 seconds for 15 seconds, and averaging those counts, may yield more accurate data than a single count.
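For example, fusing the counts from the alternate methods, or from repeated readings, could be as simple as the following sketch:

```python
def fused_count(counts):
    """Average a list of individual count values and round to a whole number."""
    return round(sum(counts) / len(counts))

# e.g. fused_count([6, 7, 6, 8]) returns 7 for four methods that disagree slightly
```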
As a further aid to improving the accuracy of these methods, a video camera may be positioned above doorways through which customers enter or exit the facility. Using the sequential change array method, an ongoing count of the people inside the restaurant can be established and maintained. If the count of people inside the restaurant is compared to the number of people counted using the counting methods described above, the data produced by those methods can be either confirmed or disputed.
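A rough sketch of this doorway cross-check follows; the crossing-event interface is a hypothetical simplification of the sequential change array method applied to the doorway camera.

```python
class DoorwayCounter:
    """Running count of people inside, maintained from doorway camera crossings."""

    def __init__(self):
        self.inside = 0

    def record_crossing(self, direction):
        """Register one person crossing the doorway, either "in" or "out"."""
        if direction == "in":
            self.inside += 1
        elif direction == "out":
            self.inside = max(self.inside - 1, 0)

    def agrees_with(self, room_count, tolerance=2):
        """Confirm or dispute a dining-room count against the doorway total."""
        return abs(self.inside - room_count) <= tolerance
```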
A variation of the methods discussed above involves correlating the change image arrays generated from images captured by multiple cameras located in the facility to track each person as he or she moves about the subject area. This method operates in a similar way to a cellular phone system in that, when a person moves from the surveillance area of one camera into the surveillance area of another, a "handoff" occurs. This variation tracks each moving pixel group as it moves through the foodservice area.
The presence of restaurant employees invariably affects the accuracy of these counting methods. Typically, the value of interest is the number of customers, not employees. There are several ways of estimating how many employees are on site so that the count values can be adjusted. The employees may wear badges which can be clearly identified in an image array. The payroll clock, which knows how many employees are on duty, may be tied into the system to provide an employee count at a given time. Furthermore, the number of employees can be estimated: the restaurant business is cyclical based on the day of the week and the time of day, and at a given time a given number of employees or waitpersons are required to be on duty. This estimate may be provided to the system to account for the number of employees.
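The adjustment itself is then straightforward; in this sketch the employee figure is assumed to come from the payroll clock or a schedule-based estimate as described above:

```python
def customer_count(total_people_counted, employees_on_duty):
    """Subtract the known or estimated employees from the raw people count."""
    return max(total_people_counted - employees_on_duty, 0)
```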
FIGURE 10 is a screen shot of an exemplary user interface 170 displaying data summaries and detailed data. The user interface may be composed as web pages using hypertext markup language (HTML) and its many extensions (dynamic HTML, extensible markup language (XML), cascading style sheets (CSS), etc.). The web pages and data are retrieved and displayed by a web browser application running on the central computer and also on the restaurant processor. The display preferably includes a table 172 containing data summaries of all facilities of interest to the user. For example, a regional manager's login (provision of a user identifier and a password, for example) may only provide access to data from facilities under his or her management, while a senior manager's login may provide access to data from facilities in all the regions under his or her control. As described above, it may be beneficial to allow local managers to access data from other facilities to provide insight into the successes and failures of the operation.
As shown in table 172, the data summary of each facility or unit occupies a row in the table. In the first data column, the revenue generated at each facility is shown. This data is derived from summing the POS sales figures received from the facility. A total revenue amount from sales in all the facilities may also be displayed. The revenue amounts are compared with planned projections, and the differences and the percentage variances are displayed. Critical, major and minor breaches of various categories are also displayed. When a breach in a facility occurs, the data summary table displays the breach by category, criticality and the number of occurrences. To capture the attention of a user, visual and/or audible alarms may be initiated. For example, a row in the table containing a breach may be highlighted with a different color or flashed alternately with a contrasting color. An audible alarm may also sound. A window 174 may open automatically or on demand to display detailed data of the facility experiencing the breach. The user may resize and position the table and window 174 to his or her liking. Window 174 contains more detailed qualitative and quantitative data of a particular facility. This detailed data window may be opened at any time for any facility. Quantitative data may be displayed in a page format 178 with tabs 176 to facilitate access. Each page may display a category of data, such as temperature and statistics, or each page may provide data on a certain process, such as food preparation activities and data or dining room activities and data. For example, the site statistics page or window may display data such as the number of customers currently waiting to be served and the maximum number of customers that were waiting to be served at some point in time; the number of customers currently in the dining area, the maximum number of customers in the dining area at some point in time, the total number of customers today, the total customer number compared to the norm, the yearly total number of customers, and the yearly customer number compared to the norm. Further, the site statistics page may also include the number of customers currently being served and the total number of customers that have been served today. Statistics on the customer service time may also be provided, such as the average time, the minimum and maximum times, and a comparison against the norm. In addition, the number of customers who requested take-out service may be available from the POS processors, so that today's total compared against the norm and the yearly total compared to the norm can be displayed in the site statistics page. These data types are merely provided as examples, as the options are limitless.

Qualitative data, such as video images 180 and 183-185, may also be displayed in window 174. As an example, video image 180 provides dynamic streaming video and optional audio taken by a camera location selected in an image window 182. Additional image windows 183-185 may be used to display images captured by currently non-selected cameras. The images in windows 183-185 may be static images or updated at a rate far lower than that of the selected image window 180, depending on available bandwidth. Clicking on the control icons displayed in window 186 remotely controls the pan, tilt, and zoom camera movements. Alternatively, the control icons are not required if the user is able to click on the selected camera image itself to re-center the image or cause the video camera angle to be displaced in a certain direction. At any time, the user may select a different camera to be displayed in window 180, which in effect swaps the image currently displayed in dynamic window 180 with an image displayed in one of the static windows 183-185. Additionally, the system may automatically select a camera installed at the location of a breach when it is detected.
Many benefits stem from the ability to remotely quantify the qualitative data and to evaluate the performance of the facility based on an analysis of the quantitative data. First, the senior manager is notified of a problem in a facility by exception-based reporting when the quantitative and qualitative data fall outside of a preferred range. Therefore, the senior manager or other users are not required to continuously view and monitor the video images and other data of all the sites in order to evaluate their performance. Detailed information on particular sites can be accessed easily on the same screen. Responses to breaches of the thresholds can be carried out immediately to remedy the breach. The senior manager may use this opportunity to refine the training of his or her employees by immediately being in contact with the employee or the local manager by videoconference or teleconference.
Second, the quantitative data and qualitative data can be archived for later analysis and study, such as trending, historical review, forecasting, etc. Furthermore, the captured images may be studied to determine the whys and hows of certain events. For example, a restaurant's June earnings are down substantially and are not in line with projections and historical performance, yet no obvious explanation is apparent. A study and analysis of the images captured by the exterior cameras may reveal that traffic on the road fronting the facility is nearly non-existent, leading to the realization that road access to the restaurant has been blocked off due to construction. On the other hand, a study of the captured images may reveal that the restaurant has been operating with fewer tables, which were originally removed because of a shortage of wait staff. Furthermore, the archived image data and quantitative data may also be used to support or deny an allegation of employee rudeness, inattentiveness, wrongdoing, etc.
Third, because the system is preferably web-based, special software is not needed to access the data. In the case of serious breaches, the senior manager may immediately respond by communicating with the staff via teleconferencing or videoconferencing. In this manner, the present invention virtually transports a senior manager to all the sites of his/her operations and facilities simultaneously, automatically monitors the operational data of each site and flags exceptions, and makes available the experience and knowledge of the senior manager to the staff at each site.
The benefits derived from employing the system and method of the present invention are manifold. On customer service, a long wait time for each phase of the dining experience (wait time for a table, wait time to place an order, wait time to be served, wait time to get the bill, etc.) can be closely monitored and improved if necessary. These parameters are determined by analyzing captured video or still images. On food preparation, the proper manner in which the food is prepared (preparation temperature, presentation, etc.) can be ensured and improved if necessary. Proper food preparation temperature is especially crucial for certain foodstuffs, such as chicken, beef, pork and eggs, for example. Optimal operations in these areas would particularly ensure greater customer satisfaction, higher return business, and higher revenues. Further, the operation and efficiency of the equipment may be closely monitored, including refrigerators, freezers, ovens, grills, fryers, etc. Other security and safety devices, such as smoke detectors, security alarms, carbon monoxide detectors, and employee panic buttons, can also be closely monitored on an exception basis. External cameras can be used to capture images of vehicles entering the premises, the number of vehicles parked in the parking lot, traffic conditions on adjacent roadways, and weather conditions. These captured images can be analyzed to provide quantitative data and to archive data for later analysis.
A particularly beneficial aspect of the present invention is the ability to allow a senior manager or owner to leverage his or her many years of experience and knowledge to optimize the operations and revenue of all of his or her facilities. The present invention provides an efficient data conduit from the facilities to the senior manager and an efficient experience and knowledge conduit from the senior manager to the facilities. Armed with specific knowledge about the operations of his or her facilities, the senior manager can better apply his or her experience and skills to optimize the operations or troubleshoot. The benefit comes from immediate remedy of parameter breaches, immediate feedback and correction on poor performance, immediate feedback and recognition on superior performance, and immediate opportunities to refine the training of the employees. Furthermore, data related to all aspects of the facility can be processed, analyzed, archived, and reviewed. Senior, regional and local managers may also be able to view and analyze the quantitative and qualitative data of other facilities in order to achieve an understanding of why some operations are more successful than others. By analyzing the data of successful facilities and modeling after them, the management and staff of less successful facilities may take corrective actions to improve their performance.
Other add-ons which might otherwise not be economically viable, such as on-line reservations or on-line placement of carry-out orders, may now be economically feasible.
Constructed and operating in the manner described above, the operations of a facility can be remotely monitored and supported by senior personnel. Even with an inexperienced staff, the remote monitoring, real-time training and communication with the senior manager allow the operations to be run more smoothly. Therefore, the knowledge and experience of the senior manager can be effectively leveraged to carefully monitor, control and optimize the operations of several facilities.
The following table summarizes the types of qualitative and quantitative data that can be collected by the system and method of the present invention, and the direct benefits and advantages derived therefrom:
(The table is reproduced in the original publication as images: Figure imgf000038_0001 through Figure imgf000041_0001.)
Although several embodiments of the present invention and its advantages have been described in detail, it should be understood that mutations, changes, substitutions, transformations, modifications, variations, and alterations can be made therein without departing from the teachings of the present invention, the spirit and scope of the invention being set forth by the appended claims.

Claims

WHAT IS CLAIMED IS:
1. A system of monitoring the operations of a facility, comprising: at least one camera strategically installed in the facility operable to capture images; and a processor in communication with the at least one camera and operable to receive the captured images, the processor being further operable to process the captured images to derive quantitative data therefrom, to analyze the quantitative data by comparing with predetermined thresholds, and generating an alert in response to any threshold being exceeded.
2. The system, as set forth in claim 1, wherein the processor is operable to process the captured images to determine count values of the number of people in the captured images.
3. The system, as set forth in claim 1, wherein the processor is operable to timestamp the determined people count values.
4. The system, as set forth in claim 1, further comprising at least one sensor strategically located in the facility to collect quantitative data, the processor being operable to communicate with the at least one sensor to receive the quantitative data, to analyze the quantitative data by comparing with predetermined thresholds, and to generate an alert in response to any threshold being exceeded.
5. The system, as set forth in claim 4, wherein the at least one sensor measures the room temperature and communicates the room temperature to the processor.
6. The system, as set forth in claim 4, wherein the at least one sensor measures the operating temperature of kitchen equipment and communicates the operating temperature to the processor.
7. The system, as set forth in claim 4, wherein the at least one sensor measures the temperature of prepared food and communicates the food temperature to the processor.
8. The system, as set forth in claim 4, wherein the at least one sensor measures the level of a liquid and communicates the liquid level to the processor.
9. The system, as set forth in claim 4, wherein the at least one sensor measures the energy consumption of equipment and communicates the energy consumption to the processor.
10. The system, as set forth in claim 4, wherein the at least one sensor detects the presence of air-borne irregularities and communicates this data to the processor.
11. The system, as set forth in claim 4, wherein the at least one sensor detects a security breach and communicates this data to the processor.
12. The system, as set forth in claim 1, further comprising at least one point-of-sale processor operable to generate a transaction amount for each transaction and a time value with each transaction, the processor being in communication with the at least one point-of-sale processor to receive the transaction amount and time value.
13. The system, as set forth in claim 1, further comprising a central computer in communication with the processor and operable to receive at least a subset of the quantitative data and captured images received by the processor and any alert generated thereby, the central computer being operable to display to a user the received quantitative data in summary mode and in detailed mode as well as the captured images and alert.
14. The system, as set forth in claim 4, further comprising a central computer in communication with the processor and operable to receive at least a subset of the quantitative data and captured images received by the processor and any alert generated thereby, the central computer being operable to display to a user the received quantitative data in summary mode and in detailed mode as well as the captured images and alert.
15. The system, as set forth in claim 14, wherein the central computer is in communication with a plurality of processors associated with a plurality of food service establishments, each restaurant processor being in communication with at least one point-of-sale processor operable to receive customer food orders, determine a transactional amount, and associated timestamps, at least one kitchen equipment sensor operable to measure the temperature values of the kitchen equipment, at least one camera operable to capture images, each restaurant processor being further operable to analyze the operations of each food service establishment, comparing it with predetermined thresholds, and generating an alert in response to exceeding any threshold.
16. The system, as set forth in claim 14, wherein the central computer is in communication with a plurality of processors of a plurality of retail outlets, each processor being in communication with at least one point- of-sale processor operable to determine transaction sales amounts and associated timestamps, at least one sensor operable to determine quantitative data associated with the retail outlet, at least one camera operable to capture images, each processor being further operable to analyze the operations of each retail outlet, comparing it with predetermined thresholds, and generating an alert in response to exceeding any threshold.
17. The system, as set forth in claim 14, wherein the central computer is in communication with a plurality of processors of a plurality of supermarkets, each processor being in communications with at least one point-of-sale processor operable to determine transaction sales amounts and associated timestamps, at least one sensor operable to determine quantitative data associated with the operations of a supermarket, at least one camera operable to capture images, each processor being further operable to analyze the operations of the supermarket, comparing it with predetermined thresholds, and generating an alert in response to exceeding any threshold.
18. The system, as set forth in claim 14, wherein the processor communicates with the central computer over the Internet.
19. The system, as set forth in claim 14, wherein the processor communicates with the central computer over the Internet and the data is displayed in the form of web pages by a web browser application.
20. The system, as set forth in claim 1, wherein the processor is operable to display a summary of the quantitative data in response to an alert being generated, and further operable to display detailed quantitative data and captured images in response to receiving a user demand.
21. The system, as set forth in claim 14, wherein the central computer is operable to display a summary including quantitative data associated with a plurality of facilities received from the plurality of processors, with the quantitative data associated with an alert generated by a particular processor highlighted, and further operable to simultaneously display detailed quantitative data and the captured images of the facility associated with the generated alert in response to receiving a user demand.
22. The system, as set forth in claim 21, wherein the central computer is operable to display a primary sequence of captured images and at least a secondary sequence of captured images of a selected facility, the primary sequence of captured images being displayed more prominently than the at least one secondary sequence of captured images.
23. The system, as set forth in claim 15, wherein the central computer is operable to display a summary including quantitative data associated with a plurality of facilities received from the plurality of processors, the quantitative data including total daily and yearly sales figures to-date computed from the transactional amounts determined by the point-of-sale processors of the plurality of facilities, the quantitative data further including the temperature values, with the quantitative data associated with an alert generated by a particular processor at a particular facility highlighted, and further operable to simultaneously display detailed quantitative data and the captured images of the facility associated with the generated alert in response to receiving a user demand.
24. The system, as set forth in claim 1, wherein the processor is operable to store at least a most current subset of the quantitative data and captured images collected in a predetermined time period.
25. The system, as set forth in claim 1, wherein the processor is operable to store at least a subset of the quantitative data and captured images in response to a generated alert.
26. The system, as set forth in claim 21, wherein each of the plurality of processors is operable to store at least a most current subset of the quantitative data and captured images associated with the plurality of facilities and collected in a predetermined time period.
27. The system, as set forth in claim 21, wherein each of the plurality of the processors is operable to store at least a subset of the quantitative data and captured images in response to a generated alert.
28. The system, as set forth in claim 13, further comprising a video camera coupled to the central computer for videoconferencing with a user at the facility.
29. A system of monitoring the operations of a food service establishment, comprising: at least one sensor operable to gather quantitative data; at least one camera installed in the food service establishment operable to capture images; a restaurant processor in communication with the at least one sensor and operable to receive the quantitative data, the restaurant processor further in communication with the at least one camera and operable to receive the captured images, the restaurant processor being operable to compare the quantitative data against predetermined thresholds, and generating an alert in response to any threshold being exceeded, the restaurant processor being operable to display the captured images, quantitative data, and alerts; and a central computer in communication with the restaurant processor and operable to receive at least a subset of the data received by the restaurant processor and alerts, the central computer being operable to display to a user the received data and alerts.
30. The system, as set forth in claim 29, further comprising at least one point-of-sale processor operable to receive customer food orders, determine transaction amounts and associate each food order with a time value, the restaurant processor being in communication with the point-of-sale processor and receiving data therefrom.
31. The system, as set forth in claim 29, wherein the central computer is operable to display the captured images along with the received data on one screen.
32. The system, as set forth in claim 29, wherein the central computer is in communication with a plurality of restaurant processors of a plurality of food service establishments, each restaurant processor being in communication with at least one point-of-sale processor and being operable to receive customer food orders and associated timestamps, the restaurant processor also in communication with at least one sensor operable to measure a property associated with a piece of kitchen equipment and being operable to receive the measured property, the restaurant processor also in communication with at least one camera and being operable to receive the captured images, each restaurant processor further being operable to analyze the operations of each food service establishment, comparing it with predetermined thresholds, and generating an alert if any threshold is exceeded.
33. The system, as set forth in claim 29, wherein the restaurant processor is operable to receive a temperature setting of at least one oven in the food service establishment, the restaurant processor being operable to compare the temperature setting with the temperature value of the oven, and generating an alert in response to the difference therebetween exceeding at least one predetermined threshold.
34. The system, as set forth in claim 29, wherein the restaurant processor is operable to receive a food cooking setting of at least one oven in the food service establishment, the restaurant processor being operable to compare the food cooking setting with the temperature value of the oven, and generating an alert in response to the difference therebetween exceeding at least one predetermined threshold.
35. The system, as set forth in claim 29, wherein the restaurant processor is operable to receive a temperature value of a hot food holding area, the restaurant processor being operable to compare the temperature value with at least one predetermined threshold, and generating an alert in response to the temperature value exceeding the at least one predetermined threshold.
36. The system, as set forth in claim 29, wherein the restaurant processor is operable to receive a temperature value of a cold food holding area, the restaurant processor being operable to compare the temperature value with at least one predetermined threshold, and generating an alert in response to the temperature value exceeding the at least one predetermined threshold.
37. The system, as set forth in claim 29, wherein the restaurant processor is operable to receive a temperature value associated with at least one piece of food preparation equipment in the food service establishment, compare the temperature value with at least one predetermined threshold, and generating an alert in response to the temperature value exceeding the at least one predetermined threshold.
38. The system, as set forth in claim 29, wherein the at least one camera comprises at least one camera installed in an area of the food service establishment operable to capture images of customers entering the establishment and waiting to be served.
39. The system, as set forth in claim 38, wherein the restaurant processor further comprises means for analyzing the captured images of the waiting area and determining a wait time of the customers.
40. The system, as set forth in claim 29, wherein the at least one camera comprises at least one camera installed in a food holding area operable to capture images of prepared food.
41. The system, as set forth in claim 29, wherein the at least one camera comprises at least one camera installed in a food preparation area of the food service establishment operable to capture images of kitchen staff and activities therein.
42. The system, as set forth in claim 41, wherein the restaurant processor further comprises means for analyzing the captured images of the prepared food and determining a quality measurement of the prepared food.
43. The system, as set forth in claim 29, wherein the restaurant processor further compares the timestamps of customer food orders and determines whether the time for filling each customer order exceeds at least one predetermined threshold.
44. The system, as set forth in claim 30, wherein the restaurant processor receives timestamps associated with food order entry, food preparation, and food service, determines a time duration thereof, and compares the time duration thereof with at least one predetermined threshold.
45. The system, as set forth in claim 29, wherein the restaurant processor communicates with the central computer via the Internet .
46. The system, as set forth in claim 29, wherein the restaurant processor communicates with the at least one sensor and the at least one camera via wireless communications .
47. The system, as set forth in claim 29, wherein the processor is operable to display a summary of the quantitative data in response to an alert being generated, and further operable to display detailed quantitative data and captured images in response to receiving a user demand.
48. The system, as set forth in claim 29, wherein the central computer is operable to display a summary including quantitative data associated with a plurality of food service establishments received from the plurality of restaurant processors, with the quantitative data associated with an alert generated by a particular restaurant processor highlighted, and further operable to simultaneously display detailed quantitative data and the captured images of the food service establishment associated with the generated alert in response to receiving a user demand.
49. The system, as set forth in claim 29, wherein the central computer is operable to display a primary sequence of captured images and at least a secondary sequence of captured images of a selected food service establishment, the primary sequence of captured images being displayed more prominently than the at least one secondary sequence of captured images.
50. The system, as set forth in claim 30, wherein the central computer is operable to display a summary including quantitative data associated with the plurality of food service establishments received from the plurality of restaurant processors, the quantitative data including total daily and yearly sales figures to-date computed from the transactional amounts determined by the point-of-sale processors of the plurality of food service establishments, the quantitative data also including data gathered by the at least one sensor, with the quantitative data associated with an alert generated by a particular restaurant processor at a particular food service establishment highlighted, and further operable to simultaneously display detailed quantitative data and the captured images of the food service establishment associated with the generated alert in response to receiving a user demand.
51. The system, as set forth in claim 29, wherein the restaurant processor is operable to store at least a most current subset of the quantitative data and captured images collected in a predetermined time period.
52. The system, as set forth in claim 29, wherein the restaurant processor is operable to store at least a subset of the quantitative data and captured images in response to a generated alert.
53. The system, as set forth in claim 32, wherein each of the plurality of restaurant processors is operable to store at least a most current subset of the quantitative data and captured images associated with the plurality of food service establishments and collected in a predetermined time period.
54. The system, as set forth in claim 32, wherein each of the plurality of the restaurant processors is operable to store at least a subset of the quantitative data and captured images in response to a generated alert.
55. The system, as set forth in claim 29, further comprising a video camera coupled to the central computer for videoconferencing with a user at the food service establishment.
56. A method of monitoring the operations of a facility, comprising: capturing images using at least one camera installed in strategic locations in the facility; processing the captured images and deriving quantitative data therefrom; comparing the quantitative data with predetermined thresholds, and generating an alert in response to any threshold being exceeded; displaying the captured images and the generated alert.
57. The method, as set forth in claim 56, wherein processing the captured images comprises: capturing a reference image; storing the captured reference image in a reference image array; capturing a target image; storing the captured target image in a target image array; comparing the target image array with the reference image array and generating a change image array representing a difference therebetween; identifying pixel groups in the change image array; and counting a number of pixel groups in the change image array .
58. The method, as set forth in claim 57, further comprising: comparing each identified pixel group to stored pixel groups representing people; and counting only those pixel groups which closely resemble the stored pixel groups.
59. The method, as set forth in claim 57, further comprising: comparing each identified pixel group to a predetermined number of pixels corresponding to the size of a person; and counting only those pixel groups which have a number of pixels greater than or equal to the predetermined pixel number.
60. The method, as set forth in claim 59, further comprising scaling the predetermined number of pixels corresponding to the size of a person by the location of the pixel group in the change image array.
61. The method, as set forth in claim 57, further comprising: capturing a second target image; storing the captured second target image in a second target image array; comparing the second target image array with the reference image array and generating a second change image array representing a difference therebetween; identifying pixel groups in the second change image array; counting the number of identified pixel groups in the second change image array; repeating the above steps over a predetermined period for a predetermined number of times, and generating a plurality of change image arrays and corresponding number of pixel groups; and averaging the number of pixel groups from each change image array and generating a count value.
62. The method, as set forth in claim 57, further comprising : capturing a second target image; storing the captured second target image in a second target image array; comparing the second target image array with the reference image array and generating a second change image array representing a difference therebetween; identifying pixel groups in the second change image array; repeating the above steps over a predetermined period for a predetermined number of times, and generating a sequence of change image arrays and corresponding pixel groups ; comparing the pixel groups of the sequence of change image arrays to determine substantial change due to movement ; and counting those pixel groups having substantial change due to movement .
63. The method, as set forth in claim 57, further comprising: transmitting the received data, captured images, and alert to a central computer; and displaying to a user the received data, captured images, and alert.
64. The method, as set forth in claim 57, further comprising: capturing images using a plurality of cameras installed in strategic locations in a plurality of facilities; processing the captured images from each facility and deriving quantitative data therefrom; comparing the quantitative data with predetermined thresholds, and generating an alert in response to any threshold being exceeded; transmitting the received data, captured images, and alert to a central computer; and displaying to a user the received data, captured images, and alert associated with the plurality of facilities.
65. The method, as set forth in claim 56, further comprising: measuring a quantity associated with the operations of the facility using a sensor; comparing the measured quantity with a predetermined threshold; and generating an alert in response to the predetermined threshold being exceeded.
66. The method, as set forth in claim 56, further comprising: measuring an operating property of a piece of equipment in the facility; comparing the measured operating property with a predetermined threshold; and generating an alert in response to the predetermined threshold being exceeded.
67. The method, as set forth in claim 56, further comprising: measuring an operating temperature of at least one piece of kitchen equipment in a food service facility; comparing the measured operating temperature with a predetermined threshold; and generating an alert in response to the predetermined threshold being exceeded.
68. The method, as set forth in claim 56, further comprising measuring temperature values associated with food being stored, prepared, and temporarily held by at least one piece of kitchen equipment installed at the food service establishment.
69. The method, as set forth in claim 68, further comprising: measuring a temperature value of a hot food holding area; comparing the temperature value with at least one predetermined threshold; and generating an alert in response to the temperature value exceeding the at least one predetermined threshold.
70. The method, as set forth in claim 68, further comprising: measuring a temperature value of a cold food holding area; comparing the temperature value with at least one predetermined threshold; and generating an alert in response to the temperature value exceeding the at least one predetermined threshold.
71. The method, as set forth in claim 56, wherein capturing images comprises: capturing images of customers entering a food service facility into a wait area and waiting to be served; and capturing images of activities in a dining area of the food service facility.
72. The method, as set forth in claim 71, further comprising analyzing the captured images of the waiting area and determining a wait time of the customers.
73. The method, as set forth in claim 56, wherein capturing images comprises capturing images of prepared food in a food holding area.
74. The method, as set forth in claim 73, further comprising analyzing the captured images of the prepared food and determining a quality measurement of the prepared food.
75. The method, as set forth in claim 56, further comprising receiving customer orders using at least one point-of-sale processor associated with a food service facility and associating each food order to a time value.
76. The method, as set forth in claim 75, further comprising comparing the time values of customer food orders and determining whether the time for filling each customer order exceeds at least one predetermined threshold.
77. The method, as set forth in claim 75, further comprising: receiving time stamps associated with food order entry, food preparation, and food service; determining a time duration thereof; and comparing the time duration thereof with at least one predetermined threshold.
78. The method, as set forth in claim 56, wherein displaying the captured images, quantitative data and alert comprises displaying a summary of the quantitative data in response to an alert being generated, and further displaying detailed quantitative data and captured images in response to receiving a user demand.
79. The method, as set forth in claim 56, further comprising : capturing images using a plurality of cameras installed in a plurality of facilities; processing the captured images and deriving quantitative data therefrom; measuring and collecting quantitative data associated with the operations of the plurality of facilities; comparing the quantitative data with predetermined thresholds, and generating an alert in response to any threshold being exceeded; displaying a summary including quantitative data associated with the plurality of facilities, highlighting the quantitative data associated with the generated alert at a particular facility; and simultaneously displaying detailed quantitative data and the captured images of the particular facility associated with the generated alert in response to receiving a user demand.
80. The method, as set forth in claim 79, wherein displaying the captured images comprises displaying a primary sequence of captured images and at least a secondary sequence of captured images of the particular facility, the primary sequence of captured images being displayed more prominently than the at least one secondary sequence of captured images.
81. The method, as set forth in claim 75, further comprising: receiving transactional amounts associated with the food orders associated with a plurality of food service facilities; displaying a summary including total daily and yearly sales figures to-date computed from the transactional amounts, the quantitative data further including the temperature values, highlighting the quantitative data associated with a generated alert associated with a particular food service facility; and displaying detailed transactional amounts, sales figures, temperature values, and captured images of the food service facility associated with the generated alert in response to receiving a user demand.
82. The method, as set forth in claim 56, further comprising storing at least a most current subset of the quantitative data and captured images collected in a predetermined time period.
83. The method, as set forth in claim 56, further comprising storing at least a subset of the quantitative data and captured images in response to a generated alert.
84. The method, as set forth in claim 63, wherein transmitting the data comprises transmitting the received data, captured images, and alert to the central computer over the Internet.
85. The method, as set forth in claim 63, further comprising initiating a videoconferencing session between the user and a user at the facility.
86. A method of remotely monitoring at least one restaurant facility, comprising: receiving quantitative data determined by at least one sensor installed at the restaurant facility; capturing images using at least one camera installed in strategic locations in the restaurant facility; comparing the quantitative data to a predetermined threshold; generating an alert in response to any threshold being exceeded; and displaying at least a subset of the quantitative data, captured images, and generated alert to a remote user.
87. The method, as set forth in claim 86, further comprising: transmitting, over the Internet, the received quantitative data, captured images, and any alert to a central computer; and displaying, on the central computer, the received data, captured images, and alert.
88. The method, as set forth in claim 86, further comprising: receiving a temperature measurement from a sensor; comparing the temperature measurement with predetermined thresholds; and generating an alert in response to the temperature measurement exceeding at least one predetermined threshold.
89. The method, as set forth in claim 86, further comprising: receiving a fluid level measurement from a sensor; comparing the fluid level measurement with predetermined thresholds; and generating an alert in response to the fluid level measurement exceeding at least one predetermined threshold.
90. The method, as set forth in claim 86, further comprising: analyzing the captured images; and determining the number of customers waiting to be served.
91. The method, as set forth in claim 86, further comprising: analyzing the captured images; and determining the quality of prepared food in a food holding area.
92. The method, as set forth in claim 86, further comprising: analyzing the captured images; and determining the amount of wait time for customers to be seated at a table.
93. The method, as set forth in claim 86, further comprising displaying a summary of the quantitative data, and visually highlighting any data exceeding the predetermined threshold.
94. The method, as set forth in claim 86, further comprising automatically displaying detailed data of a restaurant facility at which a predetermined threshold is exceeded.
95. The method, as set forth in claim 86, further comprising automatically initiating an audiovisual communication channel between a restaurant facility at which a predetermined threshold is exceeded and a user viewing the displayed data.
96. The method, as set forth in claim 86, further comprising displaying a summary of the quantitative data in response to an alert being generated, and further displaying detailed quantitative data and captured images in response to receiving a user demand.
97. The method, as set forth in claim 86, further comprising : displaying a summary including quantitative data associated with a plurality of facilities, highlighting the quantitative data associated with the generated alert at a particular facility; and displaying detailed quantitative data and the captured images of the particular facility associated with the generated alert in response to receiving a user demand .
98. The method, as set forth in claim 97, wherein displaying the captured images comprises displaying a primary sequence of captured images and at least a secondary sequence of captured images of the particular facility, the primary sequence of captured images being displayed more prominently than the at least one secondary sequence of captured images .
99. The method, as set forth in claim 86, further comprising: receiving transactional amounts of food orders at a plurality of facilities; displaying a summary including total daily and yearly sales figures to-date computed from the transactional amounts, the quantitative data, and highlighting the quantitative data associated with a generated alert associated with a particular facility; and displaying detailed transactional amounts, sales figures, temperature values, and captured images of the restaurant facility associated with the generated alert in response to receiving a user demand.
100. The method, as set forth in claim 86, further comprising storing at least a most current subset of the quantitative data and captured images collected in a predetermined time period.
101. The method, as set forth in claim 86, further comprising storing at least a subset of the quantitative data and captured images in response to a generated alert.
102. The method, as set forth in claim 86, further comprising conducting a videoconferencing session between the remote user and a user at the at least one restaurant facility.
103. A method of counting items in captured images, comprising : capturing a reference image; storing the captured reference image in a reference image array; capturing a target image; storing the captured target image in a target image array; comparing the target image array with the reference image array and generating a change image array representing a difference therebetween; identifying pixel groups in the change image array; and counting a number of pixel groups in the change image array.
104. The method, as set forth in claim 103, further comprising: comparing each identified pixel group to stored pixel groups representing people; and counting only those pixel groups which closely resemble the stored pixel groups.
105. The method, as set forth in claim 103, further comprising: comparing each identified pixel group to a predetermined number of pixels corresponding to the size of a person; and counting only those pixel groups which have a number of pixels greater than or equal to the predetermined pixel number.
106. The method, as set forth in claim 105, further comprising scaling the predetermined number of pixels corresponding to the size of a person by the location of the pixel group in the change image array.
107. The method, as set forth in claim 103, further comprising: capturing a second target image; storing the captured second target image in a second target image array; comparing the second target image array with the reference image array and generating a second change image array representing a difference therebetween; identifying pixel groups in the second change image array; counting the number of identified pixel groups in the second change image array; repeating the above steps over a predetermined period for a predetermined number of times, and generating a plurality of change image arrays and corresponding number of pixel groups; and averaging the number of pixel groups from each change image array and generating a count value.
108. The method, as set forth in claim 103, further comprising: capturing a second target image; storing the captured second target image in a second target image array; comparing the second target image array with the reference image array and generating a second change image array representing a difference therebetween; identifying pixel groups in the second change image array; repeating the above steps over a predetermined period for a predetermined number of times, and generating a sequence of change image arrays and corresponding pixel groups; comparing the pixel groups of the sequence of change image arrays to determine substantial change due to movement ; and counting those pixel groups having substantial change due to movement .
109. The method, as set forth in claim 103, wherein the method counts people.
110. The method, as set forth in claim 103, wherein the method counts vehicles.
111. A temperature sensor, comprising: an infrared thermometer having a field of measurement; a laser coupled to the infrared thermometer and operable to generate a laser beam visible to the human eye and aimed substantially at a center of the infrared thermometer field of measurement; a camera coupled to the laser and having a field of view substantially centered with the laser beam; and actuating motors operable to simultaneously effect displacement of the laser beam, infrared thermometer field of measurement and camera field of view so that the laser beam remains centered in the infrared thermometer field of measurement and camera field of view.
112. The temperature sensor, as set forth in claim 111, further comprising: a display monitor operable to receive at least one captured image taken by the camera and display the captured image to a user; and graphical controls displayed by the display monitor operable to enable the user to enter movement commands to the actuating motors.
113. The temperature sensor, as set forth in claim 111, wherein the camera is a video camera.
114. The temperature sensor, as set forth in claim 111, wherein the actuating motors comprise a first stepping motor operable to effect a pan movement of the camera, and a second stepping motor operable to effect a tilt movement of the camera.
115. The temperature sensor, as set forth in claim 111, further comprising a transmitter coupled to the infrared thermometer and the camera, the transmitter operable to transmit the captured images and a temperature measurement over a video channel and an audio channel, respectively.
PCT/US2001/004496 2000-02-10 2001-02-12 System and method of facilities and operations monitoring and remote management support WO2001059736A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2001236932A AU2001236932A1 (en) 2000-02-10 2001-02-12 System and method of facilities and operations monitoring and remote management support

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US50189100A 2000-02-10 2000-02-10
US09/501,891 2000-02-10

Publications (3)

Publication Number Publication Date
WO2001059736A2 WO2001059736A2 (en) 2001-08-16
WO2001059736A3 WO2001059736A3 (en) 2002-03-21
WO2001059736A9 true WO2001059736A9 (en) 2002-10-24

Family

ID=23995439

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/004496 WO2001059736A2 (en) 2000-02-10 2001-02-12 System and method of facilities and operations monitoring and remote management support

Country Status (2)

Country Link
AU (1) AU2001236932A1 (en)
WO (1) WO2001059736A2 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102752578B (en) * 2012-06-05 2015-03-11 深圳市粮食集团有限公司 Unusual monitoring system and method for grain storage
US20140121807A1 (en) 2012-10-29 2014-05-01 Elwha Llc Food Supply Chain Automation Farm Tracking System and Method
US20140122186A1 (en) * 2012-10-31 2014-05-01 Pumpernickel Associates, Llc Use of video to manage process quality
US9257150B2 (en) 2013-09-20 2016-02-09 Panera, Llc Techniques for analyzing operations of one or more restaurants
CA2923690A1 (en) * 2013-09-20 2015-03-26 Pumpernickel Associates, Llc Techniques for analyzing operations of one or more restaurants
US10019686B2 (en) 2013-09-20 2018-07-10 Panera, Llc Systems and methods for analyzing restaurant operations
US9798987B2 (en) 2013-09-20 2017-10-24 Panera, Llc Systems and methods for analyzing restaurant operations
US20190164104A1 (en) * 2017-11-25 2019-05-30 Ruptub Solutions Private Limited Method and system for quality control of a facility based on feedback from multiple sources
US20200334767A1 (en) * 2019-04-16 2020-10-22 Hm Electronics, Inc. Systems and methods for using information obtained through time-limited events among quick service restaurants
CN114708557B (en) * 2022-04-19 2023-01-24 国网湖北省电力有限公司黄石供电公司 Electric power construction monitoring method and system based on air-ground communication

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2142500B (en) * 1983-06-17 1986-08-20 Atomic Energy Authority Uk Monitoring of dangerous environments
US4626992A (en) * 1984-05-21 1986-12-02 Motion Analysis Systems, Inc. Water quality early warning system
JPH07106839B2 (en) * 1989-03-20 1995-11-15 株式会社日立製作所 Elevator control system
US5153722A (en) * 1991-01-14 1992-10-06 Donmar Ltd. Fire detection system
US7304662B1 (en) * 1996-07-10 2007-12-04 Visilinx Inc. Video surveillance system and method
US5812060A (en) * 1997-05-01 1998-09-22 Darling International, Inc. Automated waste cooking grease recycling tank
US5900801A (en) * 1998-02-27 1999-05-04 Food Safety Solutions Corp. Integral master system for monitoring food service requirements for compliance at a plurality of food service establishments
JPH11266487A (en) * 1998-03-18 1999-09-28 Toshiba Corp Intelligent remote supervisory system and recording medium

Also Published As

Publication number Publication date
WO2001059736A3 (en) 2002-03-21
AU2001236932A1 (en) 2001-08-20
WO2001059736A2 (en) 2001-08-16

Similar Documents

Publication Publication Date Title
US7817182B2 (en) Internet surveillance system and method
US8438175B2 (en) Systems, methods and articles for video analysis reporting
US8818841B2 (en) Methods and apparatus to monitor in-store media and consumer traffic related to retail environments
JP5866559B2 (en) Computer system and method for managing in-store aisles
US7474330B2 (en) System and method for integrating and characterizing data from multiple electronic systems
JP5958723B2 (en) System and method for queue management
US20070268121A1 (en) On-line portal system and method for management of devices and services
US20050149382A1 (en) Method for administering a survey, collecting, analyzing and presenting customer satisfaction feedback
US20070168202A1 (en) Restaurant drive-through monitoring system
WO2001059736A9 (en) System and method of facilities and operations monitoring and remote management support
US11741425B2 (en) Operating system for brick and mortar retail
CA2416221A1 (en) Target rater
WO2018012389A1 (en) Facility operation support device, user terminal device, and facility operation support method
CN115053243A (en) Food safety performance management model
CN117593085A (en) Unmanned vending control system and method
JP6288567B2 (en) Facility operation support apparatus and facility operation support method
JP2018010373A (en) Facility management support device and facility management support method
CN118587813B (en) Automatic vending machine management system
JP3227922U (en) Store management support system
CN118587813A (en) Automatic vending machine management system
CN115862190A (en) Market management system based on passenger flow volume detection
JP2003281343A (en) Device, method and system for supporting service

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ CZ DE DE DK DK DM DZ EE EE ES FI FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AG AL AM AT AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ CZ DE DE DK DK DM DZ EE EE ES FI FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

AK Designated states

Kind code of ref document: C2

Designated state(s): AE AG AL AM AT AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ CZ DE DE DK DK DM DZ EE EE ES FI FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: C2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

COP Corrected version of pamphlet

Free format text: PAGES 1/9-9/9, DRAWINGS, REPLACED BY NEW PAGES 1/8-8/8; DUE TO LATE TRANSMITTAL BY THE RECEIVING OFFICE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 69(1) EPC

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase in:

Ref country code: JP