CN111131750A - Scheduling method, device and system


Info

Publication number
CN111131750A
Authority
CN
China
Prior art keywords
area
server
conference
video
scheduling
Legal status
Pending
Application number
CN201911311639.0A
Other languages
Chinese (zh)
Inventor
毛国峰
康明
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN201911311639.0A
Publication of CN111131750A
Priority to PCT/CN2020/109888 (published as WO2021120652A1)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working
    • H04N 7/15 Conference systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Abstract

The embodiments of this application provide a scheduling method, device, and system, relating to the field of electronic technology. The scheme is as follows: when the vehicle enters a new area, the cameras in the new area are automatically added to the conference, and devices in the new area such as the mobile phones of on-site security personnel can also be added, so that the video streams of the new area's cameras can be played; in addition, the cameras in the old area can be automatically deleted from the conference, which ensures that conference members are updated and replaced in real time and reduces unnecessary resource overhead.

Description

Scheduling method, device and system
Technical Field
The embodiment of the application relates to the technical field of electronics, in particular to a scheduling method, device and system.
Background
In a security service scenario, the video surveillance system and the command center operate independently and cannot be effectively linked: the command center cannot browse, in real time, the pictures of the cameras around a fleet, and those pictures cannot be delivered in time to the communication devices used by the relevant security personnel. Typically, as a leaders' fleet travels, a command-center dispatcher has to add the cameras in each new area, add the mobile phones of the on-site security personnel in the new area, and add the mobile phones, tablet computers, and other devices of the leaders in the new area, while deleting the cameras in the old area, the mobile phones of the on-site security personnel in the old area, and the mobile phones, tablet computers, and other devices of the leaders in the old area. Clearly, the dispatcher's operations are complex, the workload is heavy, the dispatcher must be familiar with the situation of every area along the route, errors occur easily, efficiency is low, and timeliness is poor.
Disclosure of Invention
The embodiments of this application provide a scheduling method, device, and system that can automatically schedule terminal devices, thereby enabling rapid command, improving command and scheduling efficiency, reducing errors, and achieving higher accuracy.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
In one aspect, the present technical solution provides a scheduling system including a scheduling server and an application server. The scheduling server is configured to send first information to the application server, where the first information is used to notify that the target object has entered the first area. The application server is configured to, in response to the first information, send a first message to the scheduling server, where the first message is used to indicate that at least one first conference member is added to the conference managed by the scheduling system, the first conference member is a terminal device in the first area, and the first area is a preset area.
Wherein the first message may comprise identification information of at least one first conference member.
In this scheme, the scheduling system can schedule terminal devices automatically. Compared with the manual scheduling by dispatchers in the prior art, this enables rapid command, improves command and scheduling efficiency, is less prone to errors, and achieves higher accuracy.
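Purely for illustration, the following minimal sketch (Python; all class, field, and device names are assumptions and not taken from the patent) shows how an application server might respond to the first information with a message listing the conference members preconfigured for the first area:

    # Hypothetical sketch of the application-server side of the first-information /
    # first-message exchange. AreaPlan contents and Message fields are illustrative only.
    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class Message:
        kind: str                  # e.g. "ADD_MEMBERS"
        member_ids: List[str]      # identification information of the first conference members

    class ApplicationServer:
        def __init__(self, area_plan: Dict[str, List[str]]):
            # area_plan maps a preset area ID to the terminal devices in that area,
            # configured in advance as part of the command scheme.
            self.area_plan = area_plan

        def on_first_information(self, area_id: str) -> Message:
            """Called when the scheduling server reports that the target object has
            entered the (preset) first area; returns the first message."""
            members = self.area_plan.get(area_id, [])
            return Message(kind="ADD_MEMBERS", member_ids=members)

    # Example: area_01 is preconfigured with a camera and two handheld devices.
    app = ApplicationServer({"area_01": ["cam-01", "handset-guard-01", "tablet-leader-01"]})
    print(app.on_first_information("area_01").member_ids)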
In a possible implementation manner, the system further includes a video monitoring server, where the video monitoring server is configured to receive second information from the first terminal, the first terminal is a video monitoring terminal, the second information includes an image acquired by the first terminal, and the image includes a picture of the target object; alternatively, the second information is used to indicate: the target object is located in the first area; the video monitoring server is also used for sending the first information to the scheduling server.
That is to say, the video monitoring server receives the second information of the video monitoring terminal, generates notification information for indicating that the target object enters the first area, and reports the notification information, so that the scheduling server can automatically add the terminal device in the preset area to the created conference via the application server, thereby implementing automatic scheduling of the terminal device.
In another possible implementation manner, the scheduling server is further configured to receive second information from the first terminal, where the second information is used to indicate a position of the target object.
That is, the scheduling server may receive second information indicating a location of the target object from the first terminal, thereby initiating an operation of automatically adding terminal devices in the first area according to the second information indicating the location of the target object.
In another possible implementation, the system further includes a video conference server. The scheduling server is also used for responding to the first message and sending a conference member adding message to the video conference server. The video conference server is configured to send a conference call message to the dispatch server, the conference call message being used to instruct the conference to call at least one first conference member. The scheduling server is further used for sending a second message to the first conference member, wherein the second message is used for indicating the conference to call the first conference member; receiving a call response message from a first conference member; and sending a third message to the video conference server, wherein the third message is used for indicating that the first conference member is added into the conference.
That is to say, the scheduling system can automatically add the common members in the terminal device in the first area to the conference, so that the common members can communicate with each other in the conference, and the scheduling server can receive the information reported by the common members and can issue the information to the common members, thereby facilitating the fusion and intercommunication among the interiors of the common members and between the scheduling server and the common members.
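As a rough illustration of the call flow described above (a sketch only, not the patent's actual interfaces; every class and method name here is a stand-in), the scheduling-server side of adding an ordinary conference member could look like this:

    # Hypothetical sketch: add ordinary (non-camera) first conference members.
    from dataclasses import dataclass

    @dataclass
    class CallResponse:
        accepted: bool

    class TerminalStub:
        def call_into_conference(self) -> CallResponse:    # second message + call response
            return CallResponse(accepted=True)

    class VideoConferenceServerStub:
        def add_conference_members(self, member_ids):       # add-conference-member message
            return list(member_ids)                         # conference call message: members to call
        def confirm_member_joined(self, member_id):         # third message: member joins the conference
            print(f"{member_id} joined the conference")

    class SchedulingServer:
        def __init__(self, vcs, terminals):
            self.vcs, self.terminals = vcs, terminals
        def add_members(self, member_ids):
            for member_id in self.vcs.add_conference_members(member_ids):
                if self.terminals[member_id].call_into_conference().accepted:
                    self.vcs.confirm_member_joined(member_id)

    vcs = VideoConferenceServerStub()
    server = SchedulingServer(vcs, {"handset-guard-01": TerminalStub()})
    server.add_members(["handset-guard-01"])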
In another possible implementation manner, the system further comprises a video conference server and a video monitoring server, and the first conference member is a first video monitoring device. The scheduling server is also used for responding to the first message and sending a conference member adding message to the video conference server. The video conference server is configured to send a conference call message to the dispatch server, the conference call message being used to instruct the conference to call at least one first conference member. The scheduling server is further configured to send a fourth message to the video monitoring server, where the fourth message is used to instruct the conference to call the first video monitoring device. The video monitoring server is also used for sending a return video request message to the first video monitoring device; receiving a response message of a returned video request from the first video monitoring device; and sending a response message of the message four to the scheduling server. And the scheduling server is also used for responding to the response message of the message four and sending a message five to the video conference server, wherein the message five is used for indicating that the first video monitoring equipment is added into the conference.
That is to say, the scheduling system can automatically add video monitoring members in the terminal equipment in the first area to the conference, and fusion of video monitoring images and intelligent command scheduling is realized, so that the scheduling server can realize real-time browsing and historical browsing of camera images of the scene around the fleet, and convenience is provided for command scheduling.
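The camera case can be sketched in a similarly simplified way. The sketch below (hypothetical stubs, with the initial add-member exchange with the video conference server omitted) focuses on the return-video request relayed by the video monitoring server:

    # Hypothetical sketch of a video monitoring device (camera) joining the conference.
    class CameraStub:
        def start_return_video(self):           # return-video request -> response
            return True

    class VideoMonitoringServerStub:
        def __init__(self, cameras):
            self.cameras = cameras
        def call_camera(self, camera_id):       # fourth message from the scheduling server
            return self.cameras[camera_id].start_return_video()   # response to the fourth message

    class SchedulingServerStub:
        def __init__(self, vms, conference_members):
            self.vms, self.conference_members = vms, conference_members
        def add_camera(self, camera_id):
            if self.vms.call_camera(camera_id):
                self.conference_members.append(camera_id)   # fifth message: camera joins the conference

    members = []
    scheduler = SchedulingServerStub(VideoMonitoringServerStub({"cam-01": CameraStub()}), members)
    scheduler.add_camera("cam-01")
    print(members)   # ['cam-01']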
In another possible implementation manner, the video conference server is further configured to receive image data from the first video monitoring device; and sending the image data of the first video monitoring device to a scheduling server. The scheduling server is further used for sending the image data of the first video monitoring device to other conference members.
That is to say, after the first video monitoring device is added to the conference, the scheduling method may further push the image of the first video monitoring device to other conference members in the conference, for example, to conference members based on public network access, such as a mobile phone, a tablet computer, or the like, so that related personnel holding other conference member devices can quickly know the latest situation, and the fusion and intercommunication among the video monitoring system, the scheduling server, and the public network system is realized.
In another possible implementation manner, the video monitoring server is specifically configured to determine that the target object enters the first area from the second area; wherein the second area is a preset area; the first information is used for indicating that the target object enters the first area from the second area. The application server is further configured to send a message six to the scheduling server in response to the first information, where the message six is used to indicate that at least one second conference member is deleted, and the second conference member is a terminal device in the second area.
That is to say, after the video monitoring server determines that the target object enters the first area from the second area based on the second information reported by the video terminal, the scheduling method may further automatically delete the terminal device in the second area, may ensure that the conference members are updated and replaced in real time, avoids redundancy of the conference members, is beneficial to automatic scheduling management, and may also reduce unnecessary resource overhead.
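A minimal sketch of this area-transition handling on the application-server side (hypothetical names, assuming the member lists of each preset area are configured in advance) might be:

    # Hypothetical sketch: when the target object moves from the second (old) area to
    # the first (new) area, compute which members to add and which to delete, i.e. the
    # contents of the first message and of message six described above.
    from typing import Dict, List, Tuple

    def on_area_transition(area_plan: Dict[str, List[str]],
                           old_area: str, new_area: str) -> Tuple[List[str], List[str]]:
        new_members = set(area_plan.get(new_area, []))
        old_members = set(area_plan.get(old_area, []))
        # Assumption made for this sketch: a device listed in both areas stays in the conference.
        return sorted(new_members - old_members), sorted(old_members - new_members)

    plan = {"area_01": ["cam-01", "handset-01"], "area_02": ["cam-02", "handset-01"]}
    print(on_area_transition(plan, old_area="area_01", new_area="area_02"))
    # (['cam-02'], ['cam-01'])  -- handset-01 appears in both areas, so it is kept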
In another possible implementation manner, the scheduling server is specifically configured to determine that the target object enters the first area from the second area; wherein the second area is a preset area; the first information is used for indicating that the target object enters the first area from the second area. The application server is further configured to send a message six to the scheduling server in response to the first information, where the message six is used to indicate that at least one second conference member is deleted, and the second conference member is a terminal device in the second area.
That is, the scheduling server may determine that the target object enters the first area from the second area based on the second information indicating the location of the target object. Then, the scheduling method can automatically delete the terminal equipment in the second area, can ensure that the conference members are updated and switched in real time, avoids the redundancy of the conference members, is beneficial to automatic scheduling management, and can also reduce unnecessary resource overhead.
In another possible implementation manner, the second conference member is a second video monitoring device; the scheduling server is further configured to send a message seven to the video surveillance server in response to the message six, where the message seven is used to indicate that the second video surveillance device is deleted. The video monitoring server is also used for sending a video return stopping request message to the second video monitoring equipment; receiving a return video response stopping message from the second video monitoring device; and sending a response message of the message seven to the scheduling server. And the scheduling server is also used for responding to the response message of the message seven and sending a message of deleting the conference members to the video conference server.
That is, after the video monitoring server determines that the target object enters the first area from the second area, the scheduling method may automatically delete the second video monitoring device in the second area, so that the scheduling server does not need to receive unnecessary video monitoring image information any more, which is beneficial to automatic scheduling management, and may also reduce unnecessary resource overhead related to the second video monitoring device.
In another possible implementation manner, the scheduling server is further configured to send third information to the application server, where the third information is used to indicate that the target object has entered a third area; the third area is a non-preset area. The application server is further configured to send a message eight to the scheduling server in response to the third information, where the message eight is used to prompt the user to perform a manual operation.
That is, in an automatic scheduling scenario, if the abnormal situation occurs in which the target object deviates from the preset traveling route, the system needs to adjust in time and start manual scheduling.
On the other hand, the technical scheme of the application provides a scheduling method applied to an application server in a scheduling system. The method includes: the application server receives first information from the scheduling server, where the first information is used to notify that a target object has entered a first area, the first area being a preset area; and in response to the first information, the application server sends a first message to the scheduling server, where the first message is used to indicate that at least one first conference member is added to the conference managed by the scheduling system, the first conference member being a terminal device in the first area.
Wherein the first message comprises identification information of at least one first conference member.
In this scheme, the scheduling system can schedule terminal devices automatically. Compared with the manual scheduling by dispatchers in the prior art, this enables rapid command, improves command and scheduling efficiency, is less prone to errors, and achieves higher accuracy.
In one possible implementation manner, after the application server sends the first message to the scheduling server in response to the first information, the method further includes: responding to the first information, the application server sends a message six to the scheduling server, wherein the message six is used for indicating that at least one second conference member is deleted, and the second conference member is terminal equipment in a second area; and the target object enters the first area from the second area, and the second area is a preset area.
That is, when the target object enters the first area from the second area, the application server may send a message indicating that at least one second conference member is automatically deleted to the scheduling server, so as to ensure that the conference members are updated and switched in real time, avoid redundancy of the conference members, facilitate automatic scheduling management, and reduce unnecessary resource overhead.
In another possible implementation manner, after the application server receives the first information from the scheduling server, the method further includes: the application server receives third information from the scheduling server, where the third information is used to indicate that the target object has entered a third area, the third area being a non-preset area; and in response to the third information, the application server sends a message eight to the scheduling server, where the message eight is used to prompt the user to perform a manual operation.
That is, in an automatic scheduling scenario, if the abnormal situation occurs in which the target object deviates from the preset traveling route, the system needs to adjust in time and start manual scheduling.
On the other hand, the technical solution of the present application provides an application server applied to a scheduling system, including: one or more processors; a memory; a plurality of application programs; and one or more computer programs; where the one or more computer programs are stored in the memory and comprise instructions which, when executed by the application server, cause the application server to perform the following steps: receiving first information from a scheduling server, where the first information is used to notify that a target object has entered a first area, the first area being a preset area; and in response to the first information, sending a first message to the scheduling server, where the first message is used to indicate that at least one first conference member is added to the conference managed by the scheduling system, the first conference member being a terminal device in the first area.
In one possible implementation, after sending the first message to the scheduling server in response to the first information, the application server further performs the following steps: in response to the first information, sending a message six to the scheduling server, where the message six is used to indicate that at least one second conference member is deleted, the second conference member being a terminal device in a second area; the target object enters the first area from the second area, and the second area is a preset area.
In another possible implementation manner, after receiving the first information from the scheduling server, the application server further performs the following steps: receiving third information from a scheduling server, wherein the third information is used for indicating that the target object enters a third area, and the third area is a non-preset area; and responding to the third information, and sending a message eight to the scheduling server, wherein the message eight is used for prompting the user for manual operation.
In another possible implementation, the first message comprises identification information of the at least one first conference member.
On the other hand, the technical scheme of the application provides an embodiment of a video scheduling method, which comprises the following steps: a scheduling platform obtains a first area where a target object is located; the scheduling platform queries, according to the first area, a first camera corresponding to the first area; and the scheduling platform instructs the video stream of the first camera to be pushed to a video processing device.
In this scheme, the scheduling system can schedule terminal devices automatically. Compared with the manual scheduling by dispatchers in the prior art, this enables rapid command, improves command and scheduling efficiency, is less prone to errors, and achieves higher accuracy.
In one possible implementation, the video processing device may play the received video stream. Optionally, when the target object leaves the first area and enters a third area, the method further includes: the scheduling platform obtains the third area; the scheduling platform queries, according to the third area, a second camera corresponding to the third area; and the scheduling platform instructs the video stream of the second camera to be pushed to the video processing device.
In another possible implementation manner, querying, according to the first area, a first camera corresponding to the first area specifically includes: acquiring a video processing equipment ID corresponding to the first area information by inquiring a corresponding relation between an area and the video processing equipment ID, wherein the video processing equipment comprises the first camera; or obtaining a video processing device group ID corresponding to the first region information by inquiring the corresponding relation between the region and the video processing device group ID, wherein the video processing device group comprises the first camera.
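For example (illustrative only; the tables and IDs below are hypothetical), the two correspondences described above could be kept as simple lookup tables:

    # Hypothetical lookup tables: area -> video processing device ID, and
    # area -> video processing device group ID. The first camera is one of the
    # video processing devices / one member of the device group.
    AREA_TO_DEVICE = {"area_01": "cam-01", "area_02": "cam-02"}
    AREA_TO_GROUP = {"area_01": "group-A", "area_02": "group-B"}
    GROUP_MEMBERS = {"group-A": ["cam-01", "cam-01b"], "group-B": ["cam-02"]}

    def query_first_camera(area_id: str, by_group: bool = False):
        """Return the camera(s) corresponding to the area, either directly by
        device ID or via the device group ID."""
        if by_group:
            return GROUP_MEMBERS[AREA_TO_GROUP[area_id]]
        return [AREA_TO_DEVICE[area_id]]

    print(query_first_camera("area_01"))                 # ['cam-01']
    print(query_first_camera("area_01", by_group=True))  # ['cam-01', 'cam-01b']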
In another possible implementation manner, pushing the video stream of the camera to the video processing device specifically includes: adding the camera into a conference, adding the video processing equipment into the conference through the video processing equipment ID, and pushing the video stream of the camera to the video processing equipment through the conference; or adding the camera into a conference, adding all the video processing devices in the video processing device group into the conference through the video processing device group ID, and pushing the video stream of the camera to the video processing device group through the conference.
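A minimal sketch of this push step (hypothetical conference API, not the patent's actual interfaces) could look like:

    # Hypothetical sketch: push a camera's video stream to the playback devices
    # by putting both sides into the same conference.
    class Conference:
        def __init__(self):
            self.members = set()
        def add(self, member_id: str):
            self.members.add(member_id)
        def push_stream(self, source_id: str):
            # In a real system the media plane would forward the stream; here we
            # simply report which conference members would receive it.
            targets = self.members - {source_id}
            print(f"stream of {source_id} pushed to {sorted(targets)}")

    GROUP_MEMBERS = {"group-A": ["station-01", "tablet-01"]}   # illustrative device group

    conf = Conference()
    conf.add("cam-01")                       # add the camera to the conference
    for device_id in GROUP_MEMBERS["group-A"]:
        conf.add(device_id)                  # add every device in the group via the group ID
    conf.push_stream("cam-01")               # push the camera's stream through the conference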
In another possible implementation manner, the method further includes: when the target object is located in a first area, querying first video processing equipment corresponding to the first area, wherein the first video processing equipment is used for playing a video stream of the first camera; and when the target object is located in a third area, querying a second video processing device corresponding to the third area, wherein the second video processing device is used for playing a video stream of the second camera.
In another aspect, a technical solution of the present application provides an embodiment of a video scheduling apparatus, where the video scheduling apparatus includes:
the position obtaining module is used for obtaining first area information of the target object; the query module is used for querying, according to the first area information, the camera corresponding to the first area; and the pushing module is used for instructing the video stream of the camera to be pushed to the video processing device. In this scheme, the scheduling system can schedule terminal devices automatically; compared with the manual scheduling by dispatchers in the prior art, this enables rapid command, improves command and scheduling efficiency, is less prone to errors, and achieves higher accuracy.
On the other hand, the present technical solution provides an application server, including: one or more processors; a memory; a plurality of application programs; and one or more computer programs; wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions. The instructions, when executed by the processor, cause the application server to perform the scheduling method in any one of the possible implementations of any of the above aspects.
On the other hand, the technical solution further provides an application server, including an interface and one or more processors. The interface is used to receive first information from a scheduling server, where the first information is used to notify that the target object has entered the first area, the first area being a preset area. The processor is configured to run a computer program to cause the application server to perform the following step: in response to the first information, sending a first message to the scheduling server, where the first message is used to indicate that at least one first conference member is added to the conference managed by the scheduling system, the first conference member being a terminal device in the first area.
In another aspect, the present disclosure provides a computer-readable storage medium, which includes computer instructions, and when the computer instructions are executed on an electronic device, the electronic device is caused to execute a scheduling method in any one of the possible implementations of any one of the foregoing aspects.
In another aspect, the present disclosure provides a computer program product including instructions, which, when run on an electronic device, causes the electronic device to execute a scheduling method in any one of the above possible implementations.
Drawings
Fig. 1 is a schematic structural diagram of a scheduling system according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of a computer system according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram illustrating an architecture of a scheduling system according to an embodiment of the present application;
fig. 4A is a flowchart of a scheduling method according to an embodiment of the present application;
fig. 4B is a flowchart of another scheduling method provided in the embodiment of the present application;
fig. 5A is a flowchart of creating a conference according to an embodiment of the present application;
fig. 5B is a flowchart of another process for creating a conference according to an embodiment of the present application;
fig. 6 is a flowchart of another scheduling method provided in an embodiment of the present application;
fig. 7A is a flowchart of another scheduling method provided in the embodiment of the present application;
fig. 7B is a flowchart of another scheduling method provided in the embodiment of the present application;
fig. 8A is a flowchart of another scheduling method provided in the embodiment of the present application;
fig. 8B is a flowchart of another scheduling method provided in the embodiment of the present application;
fig. 9 is a flowchart of another scheduling method provided in an embodiment of the present application;
fig. 10 is a flowchart of another scheduling method provided in an embodiment of the present application;
fig. 11 is a flowchart of another scheduling method provided in an embodiment of the present application;
fig. 12 is a flowchart of another scheduling method provided in an embodiment of the present application;
fig. 13 is a flowchart of another scheduling method provided in an embodiment of the present application;
fig. 14 is a flowchart of another scheduling method provided in an embodiment of the present application;
fig. 15A is a flowchart of another scheduling method provided in the embodiment of the present application;
fig. 15B is a flowchart of another scheduling method provided in the embodiment of the present application;
fig. 16 is a flowchart of another scheduling method provided in an embodiment of the present application;
fig. 17 is a flowchart of another scheduling method provided in an embodiment of the present application;
fig. 18 is a schematic view of a structure of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the description of the embodiments herein, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" herein merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, in the description of the embodiments of the present application, "a plurality" means two or more.
In the prior art, in a security service scenario, as the fleet of an important protected object travels, a command-center dispatcher needs to add the cameras in each new area, add the mobile phones of the on-site security personnel in the new area, and add the mobile phones, tablet computers, and other devices of the leaders in the new area, while deleting the cameras in the old area, the mobile phones of the on-site security personnel in the old area, and the mobile phones, tablet computers, and other devices of other personnel in the old area. Therefore, the dispatcher's operations are complex, the workload is heavy, the dispatcher must be familiar with the situation of every area along the route, errors occur easily, efficiency is low, and timeliness is poor.
The embodiment of the application provides a scheduling system. In a security service scenario, as a leaders' fleet travels, the cameras in each new area, the mobile phones of the on-site security personnel in the new area, and the mobile phones, tablet computers, and other devices of the leaders in the new area can be added automatically; and the cameras in the old area, the mobile phones of the on-site security personnel in the old area, and the mobile phones, tablet computers, and other devices of the leaders in the old area can be deleted automatically. Therefore, compared with the manual scheduling scheme in the prior art, the scheduling system provided by the embodiment of the application can schedule terminal devices automatically, which is faster and more accurate than manual scheduling by a dispatcher.
The embodiment of the application provides a scheduling method, device, and system, relating to the field of electronic technology. The scheme is as follows: when a vehicle enters a new area, the cameras belonging to the new area are queried so that they can be automatically added to the conference; video processing devices in the new area, such as the mobile stations of on-site security personnel, can also be added, so that the video streams of the new area's cameras can be played on those devices; in addition, the cameras in the old area can be automatically deleted from the conference, which ensures that conference members are updated and replaced in real time and reduces unnecessary resource overhead.
Exemplarily, fig. 1 shows a schematic structural diagram of a scheduling system provided in an embodiment of the present application. Referring to fig. 1, a scheduling system 100 may include an application server 101, a scheduling server 102, a video conference server 103, and a video surveillance server 104. The application server 101, the scheduling server 102, the video conference server 103, and the video monitoring server 104 may each include one or more sub-servers. The dispatch server 102 may also be referred to as an intelligent director server.
Among other things, the application server 101 may be used to manage and present services. In an automated dispatch scenario, the application server 101 may create an automated intelligent command scheme that may include fleet information, route information for fleet travel, abnormal situation handling schemes, and the like.
Illustratively, the fleet information may include primary vehicle information, secondary vehicle information, and other vehicle information. The primary vehicle, the secondary vehicle, and the other vehicles may each include one or more vehicles. For example, the primary vehicle information may include the license plate of the primary vehicle, the vehicle type, the vehicle style, the vehicle color, a photograph of the vehicle, the principal members riding in it, and the like. Similarly, the secondary vehicle information may include the license plate of the secondary vehicle, the model of the vehicle, the color of the vehicle, a photograph of the vehicle, the principal members riding in it, and the like. The other vehicle information may include the license plates of the other vehicles, the models of the vehicles, the styles of the vehicles, the colors of the vehicles, photographs of the vehicles, the principal members riding in them, and the like. It is to be understood that the fleet information is not limited to the above examples and may include other information related to the fleet vehicles; the embodiments of the present application do not limit the type of fleet information.
For example, the route information of the fleet travel may include a plurality of preset travel route information. The plurality of pieces of preset traveling line information may be divided into main traveling line information and standby traveling line information. In general, the primary travel line may include one or more travel lines, and the alternate travel line may also include one or more travel lines. The preset traveling route information is used for indicating the track information of the motorcade in the traveling process. For example, the route information for the traveling of the fleet includes one piece of main traveling route information and two pieces of alternate traveling route information, the main traveling route may be shown as the following route 1, and the two alternate traveling routes may be shown as the following routes 2 and 3, respectively:
region 01 → region 02 → region 03 → region 04 → region 05 (line 1)
Region 01 → region 12 → region 13 → region 14 → region 15 (line 2)
Region 11 → region 12 → region 13 → region 14 → region 15 (line 3)
The automatic intelligent command scheme can further comprise area information related in a preset travelling line. The area information may include, among others, an area ID, area basic information, a camera list of the area, a user list of the area, and the like. Wherein the area ID may be used to identify the area. The area basic information may include information of an area location name, longitude and latitude locations of the area, main roads of the area, main buildings of the area, and the like. The user list of the area may include a private network cluster user list of the area, a public network user list of the area, and the like.
The private network cluster user list of an area is a user list included in the area in a private communication network constructed to satisfy scheduling management. Illustratively, the private network cluster user list for an area may include a list of various devices accessed in the area in a private communication network established for a worker such as a field commander, a field policeman, or the like. For example, the private network cluster user list of the area may include the private network stations of the field commander, the private network stations of the field policeman, etc. in the area, and the private network stations, the private network interphones, etc. of other field workers.
The public network cluster user list of the area is a user list included in a wireless trunking system built on a public network. A public network cluster has very wide coverage, is not limited by distance, and is more convenient for remote scheduling; it can achieve the effect of one call receiving multiple responses and of multiple people talking at the same time. The public network cluster user list of the area may include lists of devices such as the mobile stations and interphones of the public network cluster in the area.
The public network user list of the area refers to a device list of the area accessed through a public operator network. For example, the list of public network users for a region may include a list of devices such as a cell phone, tablet, or personal computer for a field worker or leader.
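As a purely illustrative configuration sketch (the field names, place name, and coordinates below are assumptions, not taken from the patent), the preset travel routes and the per-area information described above might be represented as:

    # Hypothetical data layout for the automatic intelligent command scheme:
    # preset travel routes plus the information recorded for each area.
    ROUTES = {
        "main":    ["area_01", "area_02", "area_03", "area_04", "area_05"],   # line 1
        "backup1": ["area_01", "area_12", "area_13", "area_14", "area_15"],   # line 2
        "backup2": ["area_11", "area_12", "area_13", "area_14", "area_15"],   # line 3
    }

    AREAS = {
        "area_01": {
            "basic_info": {"name": "North Square", "lat_lon": (40.0, 116.3)},  # illustrative values
            "cameras": ["cam-01", "cam-01b"],
            "users": {
                "private_network_cluster": ["station-guard-01"],
                "public_network_cluster": ["station-guard-02"],
                "public_network": ["phone-leader-01", "tablet-leader-01"],
            },
        },
        # ... one entry per area along the preset routes
    }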
The abnormal situation handling scheme may include a handling scheme for the fleet deviating from the preset traveling route, a handling scheme for an emergency dangerous situation, and the like. For example, the dispatch system may initiate manual command when it detects that the fleet enters a non-preset area.
The dispatch server 102 may be used to implement control and management of automated dispatch. The video conference server 103 may be configured to implement video conference capabilities between various terminals, such as various private network cluster users, public network users, and cameras. The video monitoring server 104 may be used to implement access to a camera, and may implement functions of storing, browsing, analyzing, retrieving, and alarming video of the camera.
In the scheduling system 100 shown in fig. 1, the application server may be configured to send a conference creation request to the scheduling server, and then the scheduling server 102 may be configured to send the conference creation request to the video conference server 103, and the video conference server 103 may create a conference after receiving the conference creation request.
In some embodiments, after the conference is created, the video surveillance server 104 may receive information from the video surveillance terminal indicating that the target object is within the preset area one. Wherein the target object may be a primary object in the travel of the fleet; for example, the target object may be a primary vehicle or a primary person in a fleet of vehicles, or the like. The first preset area is an area included in route information of the fleet traveling in the automatic intelligent command scheme, that is, information of the terminal device in the first preset area is stored in the application server 101 in advance. According to the information, the video monitoring server 104 may obtain notification information that the target object enters the first preset area. The video surveillance server 104 may then send the notification information to the scheduling server 102, and the scheduling server 102 sends the notification information to the application server 101. The application server 101 may send a message instructing to automatically add at least one first conference member, which is a terminal device in the first preset area, to the scheduling server 102 in response to the notification information.
In other embodiments, after the conference is created, the scheduling server 102 may receive information from the first terminal, the information indicating that the target object is within the first preset area. Wherein the target object may be a primary object in the travel of the fleet; for example, the target object may be a primary vehicle or a primary person in a fleet of vehicles, or the like. Then, the scheduling server 102 may transmit the notification information to the application server 101. The application server 101 may send a message instructing to automatically add at least one first conference member, which is a terminal device in the first preset area, to the scheduling server 102 in response to the notification information.
Therefore, the scheduling server 102 can automatically add terminal devices in the first preset area, including the cameras, the on-site commander devices, the related leader devices, and the like in the first preset area. In addition, in response to the notification information, the application server 101 may further send a message to the scheduling server 102, where the message indicates that at least one second conference member is to be automatically deleted, where the second conference member is a terminal device in a second preset area where the target object is located before entering the first preset area. Therefore, the scheduling server 102 may automatically delete terminal devices in the second preset area, including the cameras, the on-site commander devices, the relevant leader devices, and the like in the second preset area.
Illustratively, each of the application server 101, the scheduling server 102, the video conference server 103, and the video monitoring server 104 in fig. 1 may be implemented by the computer system 200 shown in fig. 2. The computer system 200 includes at least one processor 201, communication lines 202, memory 203, and at least one communication interface 204.
The processor 201 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the solution of this application.
The communication link 202 may include a path for transmitting information between the aforementioned components.
The communication interface 204 may be any device, such as a transceiver, for communicating with other devices or communication networks, such as an ethernet, a Radio Access Network (RAN), a Wireless Local Area Network (WLAN), etc.
The memory 203 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that may store static information and instructions, a RAM or other type of dynamic storage device that may store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disk storage, optical disk storage (including compact disc, laser disc, optical disc, digital versatile disc, blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be self-contained and coupled to the processor via a bus. The memory may also be integral to the processor.
The memory 203 is used for storing application program codes for executing the scheme of the application, and the processor 201 controls the execution. The processor 201 is configured to execute the application program code stored in the memory 203 to control the computer system 200 to implement the scheduling method provided by the following embodiments of the present application. Optionally, the computer-executable instructions in the embodiments of the present application may also be referred to as application program codes, which are not specifically limited in the embodiments of the present application.
In particular implementations, processor 201 may include one or more CPUs, such as CPU0 and CPU1 in fig. 2, each of which may support multiple virtual CPUs, also referred to as VCPUs, as an embodiment.
In particular implementations, computer system 200 may include multiple processors, such as processor 201 and processor 207 in FIG. 2, for example, as an embodiment. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
In particular implementations, computer system 200 may also include an output device 205 and an input device 206, as one embodiment. The output device 205 is in communication with the processor 201 and may display information in a variety of ways. For example, the output device 205 may be a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display device, a Cathode Ray Tube (CRT) display device, a projector (projector), or the like. The input device 206 is in communication with the processor 201 and can accept user input in a variety of ways. For example, the input device 206 may be a mouse, a keyboard, a touch screen device, or a sensing device, among others.
The computer system 200 may be a general purpose communication device or a special purpose communication device. The embodiment of the present application does not limit the type of the computer system 200. In a specific implementation, the computer system 200 may be a desktop computer, a laptop computer, a web server, a Personal Digital Assistant (PDA), a mobile phone, a tablet computer, a wireless terminal device, an embedded device, or a device with a similar structure as in fig. 2. The various components of computer system 200 may be deployed simultaneously on the same computer device or on different computer devices located in a distributed system.
As yet another example, FIG. 3 shows an architecture diagram of a scheduling system 300. Referring to fig. 3, the scheduling system 300 may be divided into four layers, which are a service application layer, a core control layer, an access layer, and an access device layer.
The service application layer may include an application server 301, and the application server 301 may be used to manage and present services.
The core control layer includes a scheduling server 302 and a video surveillance server 303. The dispatch server 302 may also be referred to as an intelligent director server. The dispatch server 302 may be used to implement control and management of automatic dispatch. The video monitoring server 303 may be used to access a camera, and may implement functions of storing, browsing, analyzing, retrieving, and alarming a video of the camera. Dispatch server 302 may include a broadband dispatch sub-server 3021. The broadband scheduling sub-server 3021 may interact with the video monitoring server 303 and the video conference server 306, and the command center scheduling console may implement real-time browsing and historical browsing of image data of the camera through the broadband scheduling sub-server 3021, and may send the image data of the camera to the mobile station of the on-site security personnel, the mobile phone of the leader, the tablet computer, and other devices through the broadband scheduling sub-server 3021, so as to implement the video conference and video distribution functions. The broadband scheduling sub-server 3021 may also interface with the video monitoring server 303, for example, interface with the video monitoring server 303 through a video gateway, so as to implement browsing and intercommunication of image data of the camera.
The access stratum may include narrowband trunking gateway 304, voice gateway 305, and video conference server 306. Wherein the narrowband cluster gateway 304 is used to establish a connection between the narrowband cluster device and the broadband scheduling servant 3021. The voice gateway 305 is used to establish a connection between the voice communication device and the broadband dispatch sub-server 3021. The video conference server 306 is used to establish connections between various video terminals and the broadband scheduling subserver 3021 and the video surveillance server 303. Therefore, through the connection of the access layer, the broadband scheduling sub-server 3021 may communicate with various communication devices and monitoring devices, so that the command center scheduling console may implement data sharing and scheduling for various communication devices and monitoring devices through the broadband scheduling sub-server 3021.
The access device layer may include various access devices, such as various communication devices and monitoring devices. For example, referring to fig. 3, the access device layer may include a mobile police device 307, a broadband cluster device 308, a narrowband cluster device 309, a voice communication device 310, a video terminal 311, a stationary camera 312, and a mobile camera 313.
The mobile police device 307 is a police device that accesses via the operator's public network and may include, for example, a mobile phone, tablet, or notebook carried by a police officer. The broadband cluster device 308 and the narrowband cluster device 309 refer to devices in a broadband trunking system and a narrowband trunking system, respectively. For example, the broadband cluster device 308 may be a broadband cluster handset, and the narrowband cluster device 309 may be a narrowband cluster handset. The difference between the two is that the bandwidth range used for narrowband cluster communication is smaller than that used for broadband cluster communication: a narrowband cluster generally provides a lower data rate, while a broadband cluster can provide a higher data rate and can therefore carry more data. For example, a narrowband cluster may carry data, voice, and similar information, whereas a broadband cluster can also carry images and videos in addition to data and voice. The voice communication device 310 may be a device that communicates by voice; for example, it may include an office telephone. The fixed camera 312 is a camera with a fixed position, for example a camera fixedly installed in a first preset area. The mobile camera 313 is a camera whose position can move, for example a mobile deployment-control ball camera, a vehicle-mounted camera, or an unmanned aerial vehicle. The mobile police device 307 and the broadband cluster device 308 may access the broadband scheduling sub-server 3021 directly, the narrowband cluster device 309 may access it through the narrowband cluster gateway 304, and the voice communication device 310 may access it through the voice gateway 305.
Video terminals 311 may include all terminals that are capable of accessing the conference via video conference server 306. That is, the video terminals 311 may include different types of terminal devices such as the mobile police device 307, the broadband cluster device 308, the narrowband cluster device 309, and the voice communication device 310. Video terminal 311 may join the conference through video conference server 306 and access broadband dispatch servlet 3021; thus, the access devices in the access device layer, including the mobile police device 307, the broadband cluster device 308, the narrowband cluster device 309 and the voice communication device 310, may all join the conference and access the broadband scheduling sub-server 3021, so that the sharing, the consultation and the distribution of the information of these access devices may be realized through the broadband scheduling sub-server 3021.
For convenience of understanding, the following embodiments of the present application will specifically describe a scheduling method provided by the embodiments of the present application with reference to the accompanying drawings by taking a scheduling system as an example of a system having the structure shown in fig. 1 and 3.
Referring to fig. 4A, the scheduling method may include:
411. The scheduling server sends first information to the application server.
The first information is used to notify that the target object has entered the first area.
412. In response to the first information, the application server sends a message 1 to the scheduling server.
The message 1 is used for indicating that at least one first conference member is added to a conference managed by the scheduling system, wherein the first conference member is a terminal device in a first area, and the first area is a preset area.
Illustratively, the message 1 may comprise identification information of the at least one first conference member. Since the preset area is an area included in the route information of the fleet's travel in the automatic intelligent command scheme, the information of the terminal devices in the preset area is stored in the application server in advance. Thus, after the scheduling server sends the first information to the application server, the application server may, in response to the first information, send a message 1 including identification information of the at least one first conference member to the scheduling server, to instruct the scheduling server to automatically add the at least one first conference member to the conference according to that identification information.
In the scenario described in steps 411-412, the scheduling server may send, to the application server, first information notifying that the target object has entered the first area. Then, in response to the first information, the application server may send a message 1 to the scheduling server, where the message 1 is used to instruct that at least one first conference member is added to the conference managed by the scheduling system, the first conference member being a terminal device in the first area and the first area being a preset area. In this way, the scheduling method can schedule terminal devices automatically; compared with the manual scheduling by dispatchers in the prior art, it enables rapid command, improves command and scheduling efficiency, is less prone to errors, and achieves higher accuracy.
In an embodiment, based on the captured monitoring image data, the video monitoring terminal may send second information to the video monitoring server, where the second information includes an image collected by the first terminal, and the image includes a picture of the target object. In response to the second information, the video monitoring server sends, to the scheduling server, notification information notifying that the target object has entered the first preset area; in response to the notification information, the scheduling server may initiate the operation of adding the terminal devices in the first preset area.
Referring to fig. 4B, the scheduling method provided in this embodiment may include:
401. The application server sends a conference creation request to the scheduling server.
402. The scheduling server sends a conference creation request to the video conference server.
403. The video conference server sends a conference creation response to the scheduling server.
And after receiving the conference creation request sent by the scheduling server, the video conference server creates a conference and sends a conference creation response to the scheduling server.
The conference creation response can include return information used to identify the created conference. For example, the return information may include a conference ID, an access number, a password, and the like.
For example, in an automatic intelligent command scenario, the conference may be created before the fleet sets out, so that the terminal devices of the current preset area can be automatically added to the created conference as the fleet travels through that area, and the terminal devices of preset areas that have already been passed can be automatically deleted from the conference.
Referring to fig. 5A, the application server may include a software development kit (SDK) for presenting the specific functions of the application server. The application server may also have one or more corresponding client devices. A client device may also be referred to as a dispatching console, and a dispatcher can log in to the application server through the dispatching console to perform scheduling. The scheduling server may include a traffic control center (BCC), a module for performing traffic control. The video conference server may include a service management center (SMC), a service controller (SC), and a multi-point control unit (MCU). The MCU can be used for video switching, audio mixing, related processing, terminal access, signaling interaction, and the like, and is the media stream processing center of the video system. The SMC is a module for unified management of services, and the SC is a module for service control.
In the process of creating a conference, referring to fig. 5A, the process of creating a conference may specifically include:
501. In response to a preset operation of the user, the dispatching desk sends a conference creation request to the SDK of the application server.
The user may be a dispatcher, and the preset operation may be an operation by which the user initiates a conference through the dispatching desk. For example, the preset operation may be the user clicking a "conference creation" button in the display interface of the client device with a mouse. Alternatively, the preset operation may be a voice instruction given to the client device, for example the user speaking "initiate a conference" into the microphone of the client device. It can be understood that the preset operation is not limited to the above examples, and the embodiments of the present application do not limit the form of the preset operation.
The conference creation request may include information on the type of conference, the conference members, and whether to record or not. The conference type may include an audio conference, a video conference, and the like, among others. The conference members may include dispatcher terminals, other member terminals, and the like.
502. The SDK of the application server sends a conference creation request to the BCC through a Session Initiation Protocol (SIP) message channel.
Step 502 is a specific form of step 401 shown in fig. 4B.
503. In response to the conference creation request, the BCC invokes the create-conference interface ScheduleConf() of the SMC.
Wherein, step 503 is a specific form of step 402 shown in fig. 4B.
504. The SMC sends the conference creation message to the MCU, so that the MCU creates the conference.
505. After the MCU successfully creates the conference, the SMC sends a conference creation response ScheduleConfResponse() to the BCC.
The conference creation response may include return information that can be used to identify the created conference. For example, the return information may include a conference ID, an access number, a password (passcode), and the like. The conference ID is used to indicate the address of the conference, the access number is used to indicate the number for accessing the conference, and the passcode is used to indicate the password for accessing the conference.
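For ease of understanding, the return information might be represented as a small data structure such as the following Python sketch. The field names are illustrative assumptions only and do not limit the form of the return information.

```python
from dataclasses import dataclass

@dataclass
class ConferenceCreateResponse:
    """Illustrative return information identifying a created conference."""
    conference_id: str   # indicates the address of the conference
    access_number: str   # number dialed to access the conference
    passcode: str        # password required to access the conference
```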
Step 505 is a specific form of step 403 shown in fig. 4B.
506. The BCC sends the return information to the SDK.
507. The SDK returns the conference creation result information to the dispatching desk.
508. The SDK calls an audio and video call interface of the BCC so that the dispatching desk actively calls into the conference.
The calling mode used when the dispatching desk actively calls into the conference is the Dial-in mode, which indicates an active call into the conference. The called number of the dispatching desk may be formed from the access number and the passcode (for example, access number*passcode). It can be understood that the called number of the dispatching desk may also take other forms; the embodiments of the present application do not limit the form of the called number of the dispatching desk.
509. The BCC sends a call-in message to the MCU.
510. In response to the call-in message, the MCU returns a call-in acknowledgement response to the BCC.
511. The BCC returns the call-in acknowledgement response to the SDK.
Through the conference creation flow of steps 501 to 511, the conference is successfully created and the dispatching desk is joined to the conference.
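For ease of understanding, the following Python sketch condenses the flow of steps 501 to 511 into a few cooperating objects. All class, method and field names (for example schedule_conf and dial_in) are illustrative assumptions and are not the interfaces actually used by the components mentioned in the embodiments.

```python
import uuid

class MCU:
    def create_conference(self, conf_request):
        # Step 504: the MCU actually creates the conference resources.
        return {"id": str(uuid.uuid4()), "access_number": "80001", "passcode": "123456"}

class SMC:
    def __init__(self, mcu):
        self.mcu = mcu
    def schedule_conf(self, conf_request):
        conf = self.mcu.create_conference(conf_request)         # step 504
        return conf                                             # step 505: response to the BCC

class BCC:
    def __init__(self, smc):
        self.smc = smc
    def create_conference(self, conf_request):
        return self.smc.schedule_conf(conf_request)             # step 503
    def dial_in(self, called_number):
        # Steps 509 to 511, collapsed: admit the dispatching desk into the conference.
        return {"called_number": called_number, "joined": True}

class ApplicationServerSDK:
    def __init__(self, bcc):
        self.bcc = bcc
    def on_dispatch_desk_request(self, conf_request):
        return_info = self.bcc.create_conference(conf_request)  # steps 502 and 506
        # Step 508: dial the dispatching desk into the conference (Dial-in mode),
        # assuming the called number takes the form access number*passcode.
        self.bcc.dial_in(f'{return_info["access_number"]}*{return_info["passcode"]}')
        return return_info                                      # step 507

sdk = ApplicationServerSDK(BCC(SMC(MCU())))
print(sdk.on_dispatch_desk_request({"type": "video", "members": ["dispatching_desk"]}))
```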
In some embodiments, there may be other members (i.e., other end devices) other than the dispatch station at the time the conference is created, and these other members may need to be joined to the conference. Thus, after the MCU receives the create conference message, i.e., after step 504, referring to fig. 5B, the create conference flow may further include:
521. The MCU sends a call-other-members message to the SC.
The calling mode used when the MCU calls the other members may be the Dial-out mode, which indicates that the conference calls out to the other members.
522. The SC sends the call other member information to the BCC.
523. The BCC calls the terminal equipment.
Wherein, the terminal device of the BCC call is the terminal device of other members. For example, the terminal device called by BCC may be a mobile phone, a tablet computer, etc.
524. The BCC receives the call response of the terminal device, so that the terminal device enters the conference.
After the conference is created, the video monitoring server reports notification information to the scheduling server according to the information from the video monitoring terminal, so that the scheduling server can automatically add the terminal devices in the preset area to the created conference through the application server. A preset area is an area included in the route information of the fleet's travel in the automatic intelligent command scheme; that is, information about the terminal devices in each preset area is stored in the application server in advance. Therefore, after step 403, referring to fig. 4B, the scheduling method may further include:
404. The video monitoring server receives the information two from the first terminal.
The first terminal may be a video monitoring terminal. Illustratively, the first terminal may be a video monitoring device in the first area, for example a camera in the first area, and such a device may aggregate the video streams of a plurality of terminals (e.g., cameras). Alternatively, the first terminal may be a mobile video monitoring device, for example a mobile camera or a drone with a camera function.
In some implementations, the second information may include an image captured by the first terminal, the image including a picture of the target object. The target object may be a primary object in the travel of the fleet. For example, the target object may be a primary vehicle or a primary person in a fleet of vehicles, or the like. The embodiment of the present application does not limit the type of the target object. The captured images may include video and/or pictures. For example, the first terminal may be a camera in the first area, and the second information may include a video image captured by the camera, where the video image includes a main vehicle in the fleet of vehicles.
In other implementations, the information two may be used to indicate that the target object is located in the first area. For example, the information two may be information obtained after the first terminal recognizes, from the acquired image, that the target object is located in the first area. For example, the first terminal may be a camera in the first area that has intelligent recognition capability, which may include license plate recognition, vehicle recognition, or face recognition, so that the camera can use this capability to recognize the target object of the fleet in the acquired image and then send the recognition result to the video monitoring server. The embodiments of the present application do not limit the type of intelligent recognition capability.
405. The video monitoring server sends the information one to the scheduling server.
The information one is used to notify that the target object has entered the first area.
Illustratively, when the information two includes an image acquired by the first terminal, the video monitoring server may use its intelligent recognition capability, which may include license plate recognition, vehicle recognition, or face recognition, to detect from that image that the target object has entered the first area, and may then send the information one to the scheduling server. The embodiments of the present application do not limit the type of intelligent recognition capability.
For example, when the information two is used to indicate that the target object is located in the first area, the information one may be the information two.
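For ease of understanding, the following Python sketch shows one way the video monitoring server might perform step 405 when the information two contains a camera image: a hypothetical license plate recognizer is applied to the image and, if the primary vehicle's plate is found, information one is sent to the scheduling server. The target plate, the helper functions and the message fields are illustrative assumptions.

```python
# Illustrative sketch of step 405. recognize_plates() and
# send_to_scheduling_server() are assumed helpers supplied by the caller.

TARGET_PLATE = "A12345"   # assumed licence plate of the primary vehicle

def handle_information_two(info_two, recognize_plates, send_to_scheduling_server):
    plates = recognize_plates(info_two["image"])       # e.g. license plate recognition
    if TARGET_PLATE in plates:
        info_one = {
            "event": "target_entered_area",
            "area_id": info_two["area_id"],            # the first area reported by the camera
        }
        send_to_scheduling_server(info_one)            # step 405
        return True
    return False
```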
406. The scheduling server sends the information one to the application server.
407. In response to the information one, the application server sends a message 1 to the scheduling server.
The message 1 is used for indicating that at least one first conference member is added to a conference managed by the scheduling system, wherein the first conference member is a terminal device in a first area, and the first area is a preset area.
Message 1 may illustratively comprise identification information of at least one first conference member. Since the preset area is an area included in route information of the traveling of the fleet in the automatic intelligent command scheme, information of the terminal device in the preset area is stored in the application server in advance. Thus, after the scheduling server sends the information one to the application server, the application server may send a message 1 including identification information of the at least one first conference member to the scheduling server in response to the information one to instruct the scheduling server to automatically add the at least one first conference member to the conference according to the identification information of the at least one first conference member.
It can be seen that steps 406 and 407 in fig. 4B are the same as steps 411 and 412 in fig. 4A, respectively.
In the solution described in steps 401 to 407, the application server sends a conference creation request to the scheduling server, and the scheduling server sends the conference creation request to the video conference server so that the video conference server creates the conference. The video monitoring server then receives, from the video monitoring terminal, information used to detect the fleet's position, generates notification information indicating the fleet's position, and reports the notification information to the scheduling server. The scheduling server can determine from the notification information that the fleet has entered the preset area, and can therefore automatically add the terminal devices in the preset area to the created conference via the application server. This scheduling method thus schedules the terminal devices automatically; compared with manual scheduling by a dispatcher in the prior art, it enables rapid command, improves command and scheduling efficiency, is less error-prone, and is more accurate.
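For ease of understanding, the following Python sketch shows how the application server's step 407 might look: the pre-stored area-to-member mapping is consulted and a message 1 carrying the identification information of the first conference members is returned. The mapping contents and the message fields are illustrative assumptions.

```python
# Illustrative sketch of step 407: the application server answers information one
# with message 1 listing the terminal devices pre-stored for the first area.

PRESET_AREA_MEMBERS = {
    "area_1": ["camera_11", "phone_guard_12", "tablet_leader_13"],
    "area_2": ["camera_21", "phone_guard_22"],
}

def handle_information_one(info_one):
    members = PRESET_AREA_MEMBERS.get(info_one["area_id"], [])
    if not members:
        return None               # not a preset area: no automatic scheduling
    return {                      # message 1
        "action": "add_conference_members",
        "member_ids": members,
    }

print(handle_information_one({"event": "target_entered_area", "area_id": "area_1"}))
```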
In another embodiment, the scheduling server receives information from the terminal indicating that the target object is within the first preset area, so that in response to the information, the scheduling server may initiate an operation of adding terminal devices in the first preset area.
Similar to the scheduling method shown in fig. 4B, the scheduling method provided in this embodiment may also include steps 401 to 403. After step 403, referring to fig. 6, the scheduling method provided in this embodiment may further include:
601. the scheduling server receives the second information from the first terminal.
Wherein the information two can be used to indicate the position of the target object.
The first terminal is capable of detecting a position of the target object. For example, the second information may be location information of the first terminal. Illustratively, the first terminal may be co-located with the target object or may move in synchronization with the target object. For example, when the target object is a main vehicle in the fleet of vehicles, the first terminal may be a terminal device on the main vehicle, for example, the first terminal may be a mobile phone, a tablet computer, a vehicle-mounted camera, or the like on the main vehicle; alternatively, the first terminal may be an unmanned aerial vehicle or other device that moves synchronously with the primary vehicle, so that the position information of the first terminal may represent the position information of the target object. The embodiment of the present application does not limit the type of the first terminal.
That is, the scheduling server may receive from the first terminal the information two indicating the position of the target object, and may accordingly initiate the operation of automatically adding the terminal devices in the first area.
602. And the scheduling server sends the information one to the application server.
Wherein the information one is used for indicating the target object to enter the first area.
603. In response to the information one, the application server sends a message 1 to the scheduling server.
The message 1 is used for indicating that at least one first conference member is added, and the first conference member is terminal equipment in a first area; the first area is a preset area.
In the solution described in steps 401 to 403 and steps 601 to 603, the scheduling server sends a conference creation request to the video conference server so that the video conference server creates the conference. The scheduling server may receive from the first terminal the information two indicating the position of the target object, and accordingly send to the application server the information one indicating that the target object has entered the first area. In response to the information one, the application server sends message 1 to the scheduling server to instruct that the terminal devices in the first area be added automatically, so that these terminal devices are automatically added to the created conference. This scheduling method thus schedules the terminal devices automatically; compared with manual scheduling by a dispatcher in the prior art, it enables rapid command, is less error-prone, and is more accurate.
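For ease of understanding, the following Python sketch shows one way the scheduling server might carry out steps 601 to 602 from a reported position. Rectangular area boundaries are an assumption made purely for illustration; the embodiments do not prescribe the shape of a preset area, and all names and coordinates are illustrative.

```python
# Illustrative sketch of steps 601 to 602: decide from the reported position
# whether the target object has entered a preset area and, if so, send
# information one to the application server.

PRESET_AREAS = {
    # area_id: (min_lon, min_lat, max_lon, max_lat), assumed rectangular bounds
    "area_1": (116.30, 39.90, 116.35, 39.95),
    "area_2": (116.35, 39.95, 116.40, 40.00),
}

def area_of(position):
    lon, lat = position
    for area_id, (min_lon, min_lat, max_lon, max_lat) in PRESET_AREAS.items():
        if min_lon <= lon <= max_lon and min_lat <= lat <= max_lat:
            return area_id
    return None            # outside every preset area

def handle_position_information(info_two, send_to_application_server):
    area_id = area_of(info_two["position"])
    if area_id is not None:
        # Step 602: information one, notifying that the target entered the area.
        send_to_application_server({"event": "target_entered_area",
                                    "area_id": area_id})
```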
After step 407 or step 603, the scheduling method provided by the embodiment of the present application may automatically add conference members to the conference.
In one embodiment, automatically adding conference members to the conference may include automatically adding common members, where a common member may be a terminal device other than a camera, such as a mobile phone or a tablet computer. Referring to fig. 7A, after step 407 or step 603, the scheduling method provided in the embodiment of the present application may further include:
701. in response to message 1, the scheduling server sends an add conference member message to the video conference server.
702. The video conference server sends a conference call message to the scheduling server.
The conference call message is used for indicating a conference to call at least one first conference member, and the first conference member is terminal equipment in a first area; the first area is a preset area.
703. The scheduling server sends message 2 to the first conference member.
Wherein the message 2 is used to instruct the conference to call the first conference member.
704. The scheduling server receives a call response message from the first conference member.
705. the scheduling server sends a message 3 to the video conference server.
Wherein the message 3 is used to indicate that the first conference member is joined to the conference.
Through the solution described in steps 701 to 705, the common members among the terminal devices in the first area can be automatically added to the conference, so that the common members can communicate with each other in the conference, and the scheduling server can both receive information reported by the common members and deliver information to them, which facilitates interconnection and intercommunication among the common members and between the scheduling server and the common members.
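For ease of understanding, the following Python sketch condenses steps 701 to 705 into one function on the scheduling server side. The injected objects, method names and message fields are illustrative assumptions; the real signalling runs between the scheduling server, the video conference server and the terminal devices as described above.

```python
# Illustrative sketch of steps 701 to 705 for adding common members
# (for example mobile phones or tablet computers) to the conference.

def add_common_members(message_1, video_conference_server, call_terminal):
    # Step 701: ask the video conference server to add the conference members.
    conference_call = video_conference_server.add_conference_members(
        message_1["member_ids"])
    joined = []
    for member_id in conference_call["member_ids"]:
        # Step 703: message 2, the conference calls the member's terminal device.
        answered = call_terminal(member_id)
        if answered:                                   # step 704: call response
            # Step 705: message 3, join the answering member to the conference.
            video_conference_server.join_member(member_id)
            joined.append(member_id)
    return joined
```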
Illustratively, referring to fig. 7B, the process of automatically adding a common member to the conference, which is shown in steps 701 to 705, may specifically include:
711. the SDK of the application server passes message 1 to the BCC via the SIP message channel.
The message 1 is used for indicating that at least one first conference member is added, and the first conference member is terminal equipment in a first area; the first area is a preset area.
712. The BCC calls the add conference member interface AddSiteInScheduledConf () of the SMC.
713. And the SMC sends the message of adding the conference member to the MCU.
714. The MCU sends a message of calling the conference members to the SC.
715. The SC sends a call conference member message to the BCC.
716. The BCC calls the terminal equipment.
Wherein the terminal device is a terminal device in at least one first conference member.
717. The BCC receives the call response of the terminal device, which makes the terminal device enter the conference.
718. In response to the call answer of the terminal device, the BCC sends a join MCU conference message to the SC.
And the MCU conference adding message is used for indicating that the terminal equipment is added into the MCU conference.
719. And the SC sends a conference message of joining the MCU to the MCU.
In another embodiment, automatically adding conference members to the conference may include automatically adding video monitoring members, where a video monitoring member is a terminal device that can be used for video monitoring, for example a camera, including fixed cameras and mobile cameras. Referring to fig. 8A, after step 407 or step 603, the scheduling method provided in the embodiment of the present application may further include:
801. in response to message 1, the scheduling server sends an add conference member message to the video conference server.
802. The video conference server sends a conference call message to the scheduling server.
Wherein the conference call message is for instructing the conference to call at least one first conference member.
803. The scheduling server sends a message 4 to the video surveillance server.
Wherein the message 4 is used to instruct the conference to call the first video surveillance device.
804. And the video monitoring server sends a return video request message to the first video monitoring device.
805. The video monitoring server receives a response message of the returned video request from the first video monitoring device.
806. The video surveillance server sends a response message to the scheduling server for message 4.
807. In response to the response message of message 4, the scheduling server sends message 5 to the video conference server.
Wherein the message 5 is used to indicate that the first video surveillance device is to be joined to the conference.
Through the solution described in steps 801 to 807, the video monitoring members among the terminal devices in the first area can be automatically added to the conference, integrating the video monitoring pictures with intelligent command and dispatch, so that the scheduling server can browse, in real time and historically, the camera images of the scene around the fleet, which facilitates command and dispatch.
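For ease of understanding, the following Python sketch condenses steps 801 to 807 into one function: the conference calls the camera, the video monitoring server is asked to return (backhaul) the camera's video, and on success the camera is joined to the conference. All object and method names are illustrative assumptions.

```python
# Illustrative sketch of steps 801 to 807 for adding a video monitoring member
# (a camera) to the conference.

def add_video_monitoring_members(message_1, video_conference_server,
                                 video_monitoring_server):
    # Steps 801 and 802: the conference is instructed to call the camera(s).
    conference_call = video_conference_server.add_conference_members(
        message_1["member_ids"])
    joined = []
    for camera_id in conference_call["member_ids"]:
        # Steps 803 to 806: message 4 asks the video monitoring server to have
        # the camera return its video stream; a response to message 4 follows.
        ok = video_monitoring_server.request_video_backhaul(camera_id)
        if ok:
            # Step 807: message 5, join the camera to the conference.
            video_conference_server.join_member(camera_id)
            joined.append(camera_id)
    return joined
```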
For example, referring to fig. 8B, the process of automatically adding video monitoring members to the conference shown in steps 801 to 807 may specifically include:
811. the SDK of the application server passes message 1 to the BCC via the SIP message channel.
The message 1 is used for indicating that at least one first conference member is added, and the first conference member is a first video monitoring device in a first area; the first area is a preset area.
812. The BCC calls the add conference member interface AddSiteInScheduledConf () of the SMC.
813. And the SMC sends the message of adding the conference member to the MCU.
814. The MCU sends a message of calling the conference members to the SC.
815. The SC sends a call conference member message to the BCC.
It can be seen that steps 811-815 in FIG. 8B are the same as steps 711-715 in FIG. 7B.
816. The BCC determines, according to the called number of the conference member, that the conference member is a video monitoring member.
817. The BCC sends a video backhaul request to the video gateway.
The video gateway can be used to interconnect the video monitoring system and the scheduling server and to transfer information between the two. Illustratively, the video gateway may be a Virtual Gateway (VGW), an eGW650, or the like. The embodiments of the present application do not limit the type of the video gateway.
Illustratively, the format in which the BCC sends the video backhaul request to the video gateway may be: Call-Info: &lt;call number&gt;;type=videoupload;fmt=D1;camera=0;user_confirm=0;mute=0;emergency_callInd=1;priority=5. The embodiments of the present application do not limit the format of the video backhaul request.
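For ease of understanding, the following Python sketch builds a request header in the format quoted above. The parameter names follow that example; whether every parameter is mandatory is not specified here and is assumed.

```python
# Illustrative sketch: build a video backhaul request header in the
# Call-Info style shown above.

def build_video_backhaul_request(call_number):
    params = {
        "type": "videoupload",
        "fmt": "D1",
        "camera": 0,
        "user_confirm": 0,
        "mute": 0,
        "emergency_callInd": 1,
        "priority": 5,
    }
    param_str = ";".join(f"{key}={value}" for key, value in params.items())
    return f"Call-Info: <{call_number}>;{param_str}"

print(build_video_backhaul_request("80001*123456"))
```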
After receiving the video return request, the video gateway sends the video return request to the video monitoring server, and then the video monitoring server may perform the operations shown in steps 804 to 805, receive a response message of the video return request (hereinafter referred to as a video response message), and send the video response message to the video gateway.
818. The video gateway sends a video response message to the BCC.
819. In response to the video answer message, the BCC sends a call answer message to the SC.
820. The SC transmits a call response message to the MCU.
Referring to fig. 9, after step 807, the scheduling method may further include:
901. the video conference server receives image data from the first video surveillance device.
902. And the video conference server sends the image data of the first video monitoring device to the scheduling server.
903. And the scheduling server sends the image data of the first video monitoring device to other conference members.
That is to say, after the first video monitoring device is added to the conference, the scheduling method may further push the images of the first video monitoring device to the other conference members in the conference, for example to conference members accessed via the public network such as mobile phones or tablet computers, so that the relevant personnel holding those devices can quickly learn the latest situation, achieving interconnection and intercommunication among the video monitoring system, the scheduling server and the public network system.
In some embodiments, after automatically adding conference members in the first region, the scheduling method may also automatically delete at least one conference member in the second region that the target object passed through before entering the first region.
In some implementations, referring to fig. 10, after the video monitoring server receives the second information from the first terminal, that is, after step 404 shown in fig. 4B, the scheduling method may further include:
1001. the video monitoring server determines that the target object enters the first area from the second area.
Wherein, the second area is a preset area. That is, the second area is an area in the preset travel route pre-stored by the application server, so that the information of the terminal device in the second area is already pre-stored in the application server.
After step 1001, the scheduling system may perform steps 405-406 shown in fig. 4B, so that the scheduling server may send information one to the application server, at which time the information one is used to indicate that the target object enters the first area from the second area.
1002. In response to the information one, the application server sends a message 6 to the scheduling server.
Wherein the message 6 is used to indicate to delete at least one second conference member, and the second conference member is a terminal device in the second area.
That is to say, after the video monitoring server determines, based on the information two reported by the video terminal, that the target object has entered the first area from the second area, the scheduling method can also automatically delete the terminal devices in the second area. This ensures that the conference members are updated and replaced in real time, avoids redundant conference members, facilitates automatic scheduling management, and reduces unnecessary resource overhead.
In still other implementations, referring to fig. 11, after the scheduling server receives the second information from the first terminal, that is, after step 601 shown in fig. 6, the scheduling method may further include:
1101. the scheduling server determines that the target object enters the first area from the second area.
Wherein the second area is a preset area; that is, the second area is an area in the preset travel route pre-stored by the application server, so that the information of the terminal device in the second area is already pre-stored in the application server.
After step 1101, step 602 shown in fig. 6 may be performed, so that the scheduling server may send information one to the application server, at which time the information one is used to instruct the target object to enter the first area from the second area.
1102. In response to the information one, the application server sends a message 6 to the scheduling server.
Wherein the message 6 is used to indicate to delete at least one second conference member, and the second conference member is a terminal device in the second area.
That is to say, after the scheduling server determines, based on the information two reported by the first terminal, that the target object has entered the first area from the second area, the scheduling method can also automatically delete the terminal devices in the second area. This ensures that the conference members are updated and replaced in real time, avoids redundant conference members, facilitates automatic scheduling management, and reduces unnecessary resource overhead.
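For ease of understanding, the following Python sketch shows how the application server might react to an area transition: when the target object enters the first area from the second area, it returns both a message 1 (add the first area's members) and a message 6 (delete the second area's members). The mapping contents, message fields and area names are illustrative assumptions.

```python
# Illustrative sketch of the add-and-delete behaviour on an area transition.

PRESET_AREA_MEMBERS = {
    "area_1": ["camera_11", "phone_guard_12"],
    "area_2": ["camera_21", "phone_guard_22"],
}

def handle_area_transition(previous_area, new_area):
    messages = []
    if new_area in PRESET_AREA_MEMBERS:
        messages.append({"action": "add_conference_members",      # message 1
                         "member_ids": PRESET_AREA_MEMBERS[new_area]})
    if previous_area in PRESET_AREA_MEMBERS and previous_area != new_area:
        messages.append({"action": "delete_conference_members",   # message 6
                         "member_ids": PRESET_AREA_MEMBERS[previous_area]})
    return messages

print(handle_area_transition("area_2", "area_1"))
```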
After step 1002 or step 1102, the scheduling method provided by the embodiment of the present application may also automatically delete the conference member. For example, the scheduling method may automatically delete a common member; or, the scheduling method can automatically delete the video monitoring member; or, the scheduling method may automatically delete the common member and the video monitoring member.
When the scheduling method can automatically delete the video surveillance member, referring to fig. 12, after step 1002 or step 1102, the scheduling method may further include:
1201. in response to message 6, the scheduling server sends message 7 to the video surveillance server.
Wherein the message 7 is used to indicate the deletion of the second video surveillance device.
1202. And the video monitoring server sends a video return stopping request message to the second video monitoring device.
1203. And the video monitoring server receives the return stopping video response message from the second video monitoring device.
1204. The video surveillance server sends a response message to the scheduling server for message 7.
1205. In response to the response message of message 7, the scheduling server sends a delete conference member message to the video conference server.
That is, after the video monitoring server determines that the target object has entered the first area from the second area, the scheduling method may automatically delete the second video monitoring device in the second area, so that the scheduling server no longer needs to receive unnecessary video monitoring images. This facilitates automatic scheduling management and also reduces the unnecessary resource overhead associated with the second video monitoring device.
In some embodiments, the target object may travel into a third area, which may be a non-preset area. After the target object enters a non-preset area, the information about the terminal devices in that area is not stored in the application server, so the automatic scheduling that automatically adds terminal devices to the conference can no longer be performed, and manual scheduling needs to be started. That is, in an automatic scheduling scenario, if the abnormal situation occurs that the target object deviates from the preset travel route, adjustments need to be made in time and manual scheduling started.
In one implementation, referring to fig. 13, after step 403 shown in fig. 4B, the scheduling method may further include:
1301. The video monitoring server receives the information three from the second terminal.
The second terminal is a video monitoring device in the third area, or a mobile video monitoring device. The information three is used to indicate that the target object is within the third area, and the third area is a non-preset area.
Illustratively, the information three may include an image captured by the second terminal, the image including a picture of the target object, and the image may include a video and/or a picture. For example, the second terminal may be a camera in the third area, and the third information may include a video image captured by the camera, where the video image includes a main vehicle in the fleet of vehicles.
Illustratively, the information three may include information obtained by the second terminal after processing the image acquired by the second terminal. For example, the second terminal may be a camera in the third area, and the camera in the third area may have a recognition capability (e.g., a license plate recognition capability, a vehicle recognition capability, a face recognition capability, or the like), so that the camera in the third area may recognize the target object in the vehicle fleet based on the captured image by using the recognition capability, and then send the recognized information to the video monitoring server. The embodiment of the application does not limit the type of the intelligent identification capability.
1302. And the video monitoring server sends information IV to the scheduling server.
Wherein the information four is used to indicate that the target object enters the third area.
Illustratively, when the information three includes an image acquired by the second terminal, the video monitoring server may use its intelligent recognition capability, which may include license plate recognition, vehicle recognition, or face recognition, to detect from that image that the target object has entered the third area, and may then send the information four to the scheduling server. The embodiments of the present application do not limit the type of intelligent recognition capability.
For example, when the information three includes information obtained by processing the image acquired by the second terminal, the information four may be the information three.
1303. And the scheduling server sends the information IV to the application server.
1304. In response to message four, the application server sends a message 8 to the scheduling server.
Wherein the message 8 is used to prompt the user for a manual operation.
That is, when the video monitoring device or the video monitoring server detects, based on the image captured by the video monitoring device, that the target object has entered the non-preset third area, the scheduling server may, in response to the notification information indicating that the target object has entered the third area, prompt a user manual operation so as to start manual scheduling.
In another implementation, referring to fig. 14, after step 403 shown in fig. 4B, the scheduling method may further include:
1401. the scheduling server receives information three from the second terminal.
The information three is used to indicate that the target object is in the third area; the second terminal is a communication device in the third area; and the third area is a non-preset area.
In some embodiments, information three may be used to indicate the location of the target object.
1402. And the dispatching server sends information four to the application server.
Wherein the information four is used to indicate that the target object enters the third area.
1403. In response to message four, the application server sends a message 8 to the scheduling server.
Wherein the message 8 is used to prompt the user for a manual operation.
That is, when the scheduling server detects, based on the information three reported by the second terminal, that the target object has entered the non-preset third area, the scheduling server may prompt a user manual operation so as to start manual scheduling.
It can be understood that, after manual scheduling is started, as the target object travels on, a terminal device or mobile camera in a new area may detect that the target object has reached that new area, the new area being a preset area. At this time, the scheduling server may receive information that the target object has entered the new area and, in response to that information, resume automatic scheduling.
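For ease of understanding, the following Python sketch summarizes the decision made in steps 1301 to 1304 and 1401 to 1403: a notification for a preset area leads to automatic scheduling (message 1), while a notification for a non-preset area leads to a prompt for manual operation (message 8). All names and message fields are illustrative assumptions.

```python
# Illustrative sketch: choose between automatic and manual scheduling
# depending on whether the reported area is a preset area.

PRESET_AREAS = {"area_1", "area_2"}

def handle_area_notification(area_id, area_members):
    if area_id in PRESET_AREAS:
        return {"action": "add_conference_members",    # message 1: automatic scheduling
                "member_ids": area_members.get(area_id, [])}
    return {"action": "prompt_manual_scheduling"}      # message 8: prompt manual operation

print(handle_area_notification("area_3", {"area_1": ["camera_11"]}))
```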
In summary, the present application discloses a scheduling method, which is applied to a scheduling server, and in an embodiment, as shown in fig. 15A, the method may include the following steps 1501 to 1506:
1501. the scheduling server receives a conference creation request from the application server.
1502. The scheduling server sends a conference creation request to the video conference server.
1503. The scheduling server receives a conference creation response from the video conference server.
1504. The scheduling server receives the first information from the video monitoring server.
Wherein the information one is used for informing the target object to enter the first area.
1505. And the scheduling server sends the information one to the application server.
Wherein the information one is used for informing the target object to enter the first area.
1506. The dispatch server receives message 1 from the application server.
The message 1 is used for indicating that at least one first conference member is added, wherein the first conference member is a terminal device in a first area, and the first area is a preset area.
In the scheduling method, for example, the operation of the scheduling server may refer to the operation of the scheduling server in the embodiment shown in fig. 4A and 4B and the related text description, which is not repeated herein.
In another embodiment, as shown in FIG. 15B, the method may include steps 1511-1516:
1511. the scheduling server receives a conference creation request from the application server.
1512. The scheduling server sends a conference creation request to the video conference server.
1513. The scheduling server receives a conference creation response from the video conference server.
1514. The scheduling server receives the second information from the first terminal.
The information two is used to indicate that the target object is within the first area.
1515. And the scheduling server sends the information one to the application server.
Wherein the information one is used for indicating the target object to enter the first area.
1516. The dispatch server receives message 1 from the application server.
The message 1 is used for indicating that at least one first conference member is added, and the first conference member is terminal equipment in a first area; the first area is a preset area.
In the scheduling method, for example, the operation of the scheduling server may refer to the operation of the scheduling server in the embodiment shown in fig. 6 and the related text descriptions, which are not described herein again.
In addition, the embodiment of the present application further discloses a scheduling method, which is applied to a video monitoring server, and as shown in fig. 16, the method includes the following steps 1601 to 1602:
1601. and the video monitoring server receives the information II from the first terminal.
The information two is used to indicate that the target object is in the first area; the first terminal is a video monitoring terminal, and the first area is a preset area.
1602. And the video monitoring server sends the first information to the scheduling server.
Wherein the information one is used for informing the target object to enter the first area.
In the scheduling method, for example, the operation of the video monitoring server may refer to the operation of the video monitoring server in the embodiment shown in fig. 4B and the related text description, which is not repeated herein.
In addition, the embodiment of the present application further discloses a scheduling method, which is applied to an application server, and as shown in fig. 17, the method includes the following steps 1701 to 1702:
1701. the application server receives the information one from the scheduling server.
The information one is used for informing the target object of entering the first area; the first area is a preset area.
1702. In response to the information one, the application server sends a message 1 to the scheduling server.
The message 1 is used for indicating that at least one first conference member is added, and the first conference member is a terminal device in a first area.
In the scheduling method, for example, the operation of the application server may refer to the operation of the application server in the embodiment shown in fig. 4A, fig. 4B, or fig. 6 and the related text descriptions, which are not repeated herein.
In addition, it can be understood that the scheduling method provided by the embodiments of the present application can be applied not only to major security scenarios but also to scenarios such as emergency command and sporting events. As intelligent technology gradually improves, Internet big data and intelligent technologies can be applied to the scheduling method, effectively improving command efficiency and helping to discover abnormal or dangerous situations as early as possible.
It will be appreciated that, in order to implement the above functions, the electronic device comprises corresponding hardware and/or software modules for performing each function. In conjunction with the exemplary algorithm steps described in the embodiments disclosed herein, the present application can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In this embodiment, the electronic device may be divided into functional modules according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware. It should be noted that the division of the modules in this embodiment is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
In the case where each functional module is divided with corresponding functions, fig. 18 shows an electronic apparatus 1800, and the electronic apparatus 1800 may include a transmitting unit 1801, a receiving unit 1802, and a processing unit 1803.
In some embodiments, the electronic device 1800 may be the video surveillance server, and each unit in the electronic device 1800 may be configured to execute each step of the video surveillance server, so as to implement the scheduling method provided in this embodiment of the present application.
In other embodiments, the electronic device 1800 may be the scheduling server, and each unit in the electronic device 1800 may be configured to execute each step of the scheduling server, so as to implement the scheduling method provided in this embodiment of the application.
In other embodiments, the electronic device 1800 may be the above-mentioned application server, and each unit in the electronic device 1800 may be configured to execute each step of the above-mentioned application server, so as to implement the scheduling method provided in this embodiment of the application.
Where an integrated unit is employed, the electronic device 1800 may include a processing module, a memory module, and a communication module. The processing module may be configured to control and manage the actions of the electronic device 1800, for example to support the electronic device 1800 in executing the steps performed by the transmitting unit 1801, the receiving unit 1802, and the processing unit 1803. The memory module may be used to support the electronic device 1800 in storing program code, data, and the like. The communication module may be used to support communication between the electronic device 1800 and other devices.
By introducing the foregoing embodiments, it can be seen that the present invention further provides a video scheduling method, including: (1) the scheduling platform obtains a first area where the target object is located; (2) the scheduling platform queries an application server (or local storage) according to the first area, where the application server records the video processing device ID corresponding to the first area information, or records the ID of a video processing device group corresponding to the first area information, the video processing device group comprising video processing devices; the cameras that can be queried in this way are therefore one or more, and they include the first camera; (3) the scheduling platform instructs the video monitoring device (or the first camera) to push the video stream of the first camera to the video processing device; (4) the video processing device is, for example, a video playing device, which can play the received video stream, or a video analysis device, which can analyze the received video stream. In this embodiment, the scheduling platform may be the scheduling server, or a combination of the scheduling server with other devices such as the application server.
It should be noted that, in this embodiment (and the foregoing embodiments), when the target object (e.g., a vehicle) enters the next area from one area, the foregoing processes of querying the area, querying the camera, and pushing the video stream may be repeated, so as to push the video stream of the camera of the next area to the corresponding video processing device. For example, when the target object enters a third area from the first area, the video stream of a second camera located in the third area can be pushed to the corresponding video processing device by performing steps similar to (1) through (4).
Further, in this embodiment (and the foregoing embodiments), the video processing devices to which the stream is pushed may be of two types. One type is a common video processing device, for example a video playing device managed by a global administrator, or a video playing device such as the aforementioned leader's tablet; such a video playing device (a large-screen terminal, split-screen television, tablet, mobile station, computer, and so on) does not belong to a specific area, so it can play the video streams of the third area and the first area in parallel (or process them in other management ways, such as editing, image analysis, or video analysis). The other type of video processing device belongs to a specific area (for example, a video playing device managed by an area manager), such as a video playing device associated only with the first area, or only with the third area; such video playing devices only have permission to play (or otherwise manage, for example edit, image-analyze, or video-analyze) the video streams of the cameras in their associated area, and do not play the video streams of cameras in other areas. A video processing device belonging to a specific area (for example, the mobile phone of an area manager) can obtain permission to play the video stream of the corresponding area by, for example, joining the conference after the target object enters that area.
There are various methods for pushing the video stream. For example, the video monitoring device and the video processing device can be added to the same conference, so that the two devices, as conference members, can push content; the foregoing embodiments have described this conference mode in detail, and a brief example is given here: add the camera to the conference, add the video processing device to the conference through the video processing device ID, and push the camera's video stream to the video processing device through the conference; alternatively, add the camera to the conference, add all the video processing devices in a video processing device group to the conference through the video processing device group ID, and push the camera's video stream to the video processing device group through the conference. The scheduling platform may itself be a member of the conference, for example a conference member that has chairman rights over the conference.
It is specifically noted that the embodiments of the present invention achieve the effect of automatically pushing the video stream, and the pushing approach is not limited to a conference. For example, a dedicated communication channel may be established directly between the video monitoring device and the video processing device, and multicast pushing may be performed over that channel.
Querying, according to the first area, the first camera corresponding to the first area may specifically include: obtaining the video processing device ID corresponding to the first area information by querying the correspondence between areas and video processing device IDs, where the video processing devices include the first camera; in this way each video processing device joins the conference as an individual conference member. Alternatively, it may include obtaining the video processing device group ID corresponding to the first area information by querying the correspondence between areas and video processing device group IDs, where the video processing device group includes the first camera; in this way the video processing device group joins the conference as a whole. It can be understood that querying the second camera corresponding to the third area according to the third area may include a similar process, which is not repeated here.
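For ease of understanding, the following Python sketch shows the two query variants described above: looking up individual video processing device IDs for the first area, or looking up a device-group ID whose whole group then joins the conference. The table contents and names are illustrative assumptions.

```python
# Illustrative sketch of querying the camera(s) corresponding to the first area.

AREA_TO_DEVICE_IDS = {"area_1": ["camera_1", "camera_1b"]}       # area -> device IDs
AREA_TO_GROUP_ID = {"area_1": "group_area_1"}                    # area -> device-group ID
GROUP_MEMBERS = {"group_area_1": ["camera_1", "camera_1b"]}      # group ID -> devices

def query_devices_by_area(area_id, use_groups=False):
    if use_groups:
        # The whole video processing device group joins the conference as one unit.
        group_id = AREA_TO_GROUP_ID.get(area_id)
        return GROUP_MEMBERS.get(group_id, [])
    # Each video processing device joins the conference as an individual member.
    return AREA_TO_DEVICE_IDS.get(area_id, [])

print(query_devices_by_area("area_1"))
print(query_devices_by_area("area_1", use_groups=True))
```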
Correspondingly, an embodiment of the present invention further provides a video scheduling apparatus, which may execute the corresponding method, where the video scheduling apparatus may include: the position obtaining module is used for obtaining first area information of the target object; the query module is used for querying the camera corresponding to the first area according to the first area information; and the pushing module is used for indicating to push the video stream of the camera to the video processing equipment.
Accordingly, embodiments of the present invention also provide a computer program product containing instructions which, when run on a computer, cause the computer to perform the methods described in the preceding method embodiments.
The processing module may be a processor or a controller, which may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the present disclosure. The processor may also be a combination that implements computing functions, for example a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor. The storage module may be a memory. The communication module may specifically be a radio frequency circuit, a Bluetooth chip, a Wi-Fi chip, or another device that interacts with other electronic devices.
Embodiments of the present application further provide a computer-readable storage medium, where computer instructions are stored, and when the computer instructions are executed on an electronic device, the electronic device is caused to execute the above related method steps to implement the scheduling method in the above embodiments.
Embodiments of the present application further provide a computer program product, which when running on a computer, causes the computer to execute the above related steps to implement the scheduling method executed by the electronic device in the above embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component or a module, and may include a processor and a memory connected to each other; the memory is used for storing computer execution instructions, and when the device runs, the processor can execute the computer execution instructions stored in the memory, so that the chip can execute the scheduling method executed by the electronic device in the above-mentioned method embodiments.
The electronic device, the computer-readable storage medium, the computer program product, or the chip provided in this embodiment are all configured to execute the corresponding method provided above, so that the beneficial effects achieved by the electronic device, the computer-readable storage medium, the computer program product, or the chip may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
Through the description of the above embodiments, it can be understood that, for convenience and brevity of description, only the division of the above functional modules is exemplified; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
The apparatus and method disclosed in the several embodiments provided in this application can be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially or partially contributed to by the prior art, or all or part of the technical solutions may be embodied in the form of a software product, where the software product is stored in a storage medium and includes several instructions to enable a device (which may be a single chip, a chip, or the like) or a processor (processor) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (19)

1. A scheduling system, comprising: a scheduling server and an application server;
the scheduling server is used for sending first information to the application server, wherein the first information is used for informing a target object to enter a first area;
the application server is configured to send, in response to the first information, a first message to the scheduling server, where the first message is used to instruct that at least one first conference member is added to a conference managed by the scheduling system, where: the first conference member is terminal equipment in the first area, and the first area is a preset area.
2. The scheduling system of claim 1 further comprising a video surveillance server configured to receive the second information from the first terminal, wherein:
the second information comprises an image collected by the first terminal, the image comprises a picture of the target object, and the first terminal is a video monitoring terminal; or, the second information is used for indicating: the target object is located in the first area;
the video monitoring server is further configured to send the first information to the scheduling server.
3. The scheduling system of claim 1 wherein the scheduling server is further configured to receive second information from the first terminal, the second information indicating the location of the target object.
4. The scheduling system of any one of claims 1-3, wherein the scheduling system further comprises a video conference server and a video monitoring server, and the first conference member is a first video monitoring device;
the scheduling server is further configured to send, in response to the first message, a conference member addition message to the video conference server;
the video conference server is configured to send a conference call message to the scheduling server, wherein the conference call message is used to instruct the conference to call the at least one first conference member;
the scheduling server is further configured to send a second message to the video monitoring server, wherein the second message is used to instruct the conference to call the first video monitoring device;
the video monitoring server is further configured to send a video return request message to the first video monitoring device;
receive a response message to the video return request from the first video monitoring device;
and send a response message to the second message to the scheduling server;
and the scheduling server is further configured to send, in response to the response message to the second message, a third message to the video conference server, wherein the third message is used to instruct that the first video monitoring device be added to the conference.
5. The scheduling system of claim 4, wherein
the video conference server is further configured to receive image data from the first video monitoring device;
and send the image data of the first video monitoring device to the scheduling server;
and the scheduling server is further configured to send the image data of the first video monitoring device to the other conference members.
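Claims 4 and 5 describe a multi-step exchange: the scheduling server asks the video conference server to add the member, the conference server asks the conference to call it, the scheduling server forwards that to the video monitoring server, and only after the camera confirms the video return request is it joined to the conference and its media fanned out. The sketch below is a hedged illustration of the scheduling server's side of that exchange; all collaborator objects and method names are invented for the example.

```python
# Hedged sketch of the claim 4-5 message flow from the scheduling server's
# perspective; conference_server and monitoring_server are collaborator stubs
# with the invented methods shown below.
class SchedulingServer:
    def __init__(self, conference_server, monitoring_server):
        self.conference_server = conference_server
        self.monitoring_server = monitoring_server

    def on_first_message(self, conference_id, camera_id):
        # Step 1: send the conference member addition message to the video
        # conference server.
        self.conference_server.add_member(conference_id, camera_id)

    def on_conference_call(self, conference_id, camera_id):
        # Step 2: the conference call message comes back; for a camera it is
        # translated into the "second message" toward the video monitoring
        # server, asking it to call the device.
        self.monitoring_server.request_video_return(conference_id, camera_id)

    def on_video_return_confirmed(self, conference_id, camera_id):
        # Step 3: the monitoring server reports that the camera accepted the
        # video return request, so the "third message" joins it to the conference.
        self.conference_server.join(conference_id, camera_id)

    def on_image_data(self, frame, other_members):
        # Step 4 (claim 5): media arriving via the conference server is
        # forwarded to the remaining conference members.
        for member in other_members:
            member.play(frame)
```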
6. The scheduling system of any one of claims 2, 4 and 5, wherein
the video monitoring server is specifically configured to determine that the target object enters the first area from a second area, wherein the second area is a preset area;
and the application server is further configured to send, in response to the first information, a fourth message to the scheduling server, wherein the fourth message is used to instruct that at least one second conference member be deleted, and the second conference member is a terminal device in the second area.
7. The scheduling system of claim 1 or 3, wherein
the scheduling server is specifically configured to determine that the target object enters the first area from a second area, wherein the second area is a preset area;
and the application server is further configured to send, in response to the first information, a fourth message to the scheduling server, wherein the fourth message is used to instruct that at least one second conference member be deleted, and the second conference member is a terminal device in the second area.
8. A scheduling method applied to an application server in a scheduling system, the method comprising:
receiving, by the application server, first information from a scheduling server, wherein the first information is used to notify that a target object has entered a first area, and the first area is a preset area; and
sending, by the application server in response to the first information, a first message to the scheduling server, wherein the first message is used to instruct that at least one first conference member be added to a conference managed by the scheduling system, and the first conference member is a terminal device in the first area.
9. The method of claim 8, wherein after the application server sends the first message to the scheduling server in response to the first information, the method further comprises:
sending, by the application server in response to the first information, a fourth message to the scheduling server, wherein the fourth message is used to instruct that at least one second conference member be deleted, the second conference member is a terminal device in a second area, the target object enters the first area from the second area, and the second area is a preset area.
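Claims 8 and 9 together amount to a small handler on the application server: on an area change, add the members configured for the area that was entered and delete the members of the area that was left. A minimal sketch follows, assuming a members_in_area() lookup and plain dict-shaped messages; both are invented for the example and are not terminology from the patent.

```python
# Sketch of the method of claims 8-9; the message types and the lookup
# function are assumptions made for illustration.
def on_area_change(send_to_scheduler, members_in_area, conference_id,
                   entered_area, left_area=None):
    # First message: add every terminal device configured for the entered area.
    send_to_scheduler({"type": "ADD_MEMBERS",
                       "conference": conference_id,
                       "members": members_in_area(entered_area)})
    # Fourth message: drop the devices of the area just left, so the conference
    # only keeps members that are still relevant to the target object.
    if left_area is not None:
        send_to_scheduler({"type": "DELETE_MEMBERS",
                           "conference": conference_id,
                           "members": members_in_area(left_area)})
```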
10. An application server, comprising: one or more processors and an interface;
wherein the interface is configured to receive first information from a scheduling server, the first information is used to notify that a target object has entered a first area, and the first area is a preset area;
and the processor is configured to run a computer program, so that the application server performs the following step:
in response to the first information, sending a first message to the scheduling server, wherein the first message is used to instruct that at least one first conference member be added to a conference managed by the scheduling system, and the first conference member is a terminal device in the first area.
11. The application server of claim 10, wherein after sending the first message to the scheduling server in response to the first information, the application server further performs the following step:
in response to the first information, sending a fourth message to the scheduling server, wherein the fourth message is used to instruct that at least one second conference member be deleted, the second conference member is a terminal device in a second area, the target object enters the first area from the second area, and the second area is a preset area.
12. A computer-readable storage medium comprising computer instructions which, when executed on an electronic device, cause the electronic device to perform the scheduling method of any one of claims 8-9.
13. A video scheduling method, comprising:
when a target object is located in a first area, obtaining, by a scheduling platform, the first area where the target object is located; querying, according to the first area, a first camera corresponding to the first area; and instructing to push a video stream of the first camera to a video processing device; and
when the target object leaves the first area and enters a third area, the method further comprises: obtaining, by the scheduling platform, the third area; querying, according to the third area, a second camera corresponding to the third area; and instructing to push a video stream of the second camera to the video processing device.
14. The video scheduling method of claim 13, further comprising:
playing, by the video processing device, the received video stream.
15. The video scheduling method according to claim 13, wherein querying, according to the first area, a first camera corresponding to the first area specifically includes:
obtaining a video processing device ID corresponding to first area information by querying a correspondence between areas and video processing device IDs, wherein the video processing device comprises the first camera; or
obtaining a video processing device group ID corresponding to the first area information by querying a correspondence between areas and video processing device group IDs, wherein the video processing device group comprises the first camera.
16. The video scheduling method according to claim 15, wherein pushing the video stream of the camera to the video processing device specifically includes:
adding the camera to a conference, adding the video processing device to the conference by using the video processing device ID, and pushing the video stream of the camera to the video processing device through the conference; or
adding the camera to a conference, adding all video processing devices in the video processing device group to the conference by using the video processing device group ID, and pushing the video stream of the camera to the video processing device group through the conference.
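One plausible reading of claims 15 and 16 is a table lookup followed by a conference-based fan-out: the area resolves to a device ID or a device group ID, everything involved is joined to a conference, and the camera's stream is pushed through that conference. The sketch below assumes simple in-memory lookup tables and a toy Conference class; it is illustrative only and not the patented implementation.

```python
# Illustrative only: the two lookup tables and the Conference helper are
# invented; claims 15-16 only require the correspondence and the push.
AREA_TO_DEVICE_ID = {"area-1": "device-101"}
AREA_TO_GROUP = {"area-1": ["device-101", "device-102"]}

class Conference:
    def __init__(self):
        self.members = set()

    def add(self, *device_ids):
        self.members.update(device_ids)

    def push(self, camera_id, stream):
        # Pushing "through the conference" means every joined device receives
        # the camera's stream; here we only report where it would go.
        for device in sorted(self.members - {camera_id}):
            print(f"push {stream} from {camera_id} to {device}")

def push_camera_for_area(area_id, camera_id, stream, use_group=False):
    conference = Conference()
    conference.add(camera_id)
    if use_group:
        conference.add(*AREA_TO_GROUP[area_id])      # group-ID variant of claim 16
    else:
        conference.add(AREA_TO_DEVICE_ID[area_id])   # device-ID variant of claim 16
    conference.push(camera_id, stream)
```

For example, push_camera_for_area("area-1", "camera-1", "stream-1", use_group=True) would report a push to both devices of the group, whereas the device-ID variant reaches only the single device mapped to the area.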
17. The video scheduling method of claim 13, further comprising:
when the target object is located in the first area, querying a first video processing device corresponding to the first area, wherein the first video processing device is configured to play the video stream of the first camera; and
when the target object is located in the third area, querying a second video processing device corresponding to the third area, wherein the second video processing device is configured to play the video stream of the second camera.
18. A video scheduling apparatus, comprising:
a position obtaining module, configured to obtain first area information of a target object when the target object enters a first area;
a query module, configured to query, according to the first area information, a camera corresponding to the first area; and
a pushing module, configured to instruct to push a video stream of the camera to a video processing device;
wherein, when the target object leaves the first area and enters a third area:
the position obtaining module is further configured to obtain the third area;
the query module is further configured to query a second camera corresponding to the third area;
and the pushing module is further configured to instruct to push a video stream of the second camera to the video processing device.
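The apparatus of claim 18 is described purely in terms of three modules. As a hedged illustration of that decomposition, the modules could be wired together as below; every class name, method name, and back-end hinted at in the comments is an assumption made for the sketch rather than part of the claimed apparatus.

```python
# Skeleton of the claim 18 module split; all concrete names are assumptions.
class PositionObtainingModule:
    def current_area(self, target_object):
        raise NotImplementedError  # e.g. GPS report, gate sensor, image recognition

class QueryModule:
    def __init__(self, area_to_cameras):
        self.area_to_cameras = area_to_cameras

    def cameras_for(self, area_info):
        return self.area_to_cameras.get(area_info, [])

class PushingModule:
    def instruct_push(self, camera_ids, video_device):
        for camera in camera_ids:
            print(f"instruct push: {camera} -> {video_device}")

class VideoSchedulingApparatus:
    def __init__(self, position, query, pushing):
        self.position = position
        self.query = query
        self.pushing = pushing

    def on_target_moved(self, target_object, video_device):
        area_info = self.position.current_area(target_object)
        self.pushing.instruct_push(self.query.cameras_for(area_info), video_device)
```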
19. A computer-readable storage medium comprising computer instructions which, when executed on an electronic device, cause the electronic device to perform the scheduling method of any one of claims 13-17.
CN201911311639.0A 2019-12-18 2019-12-18 Scheduling method, device and system Pending CN111131750A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911311639.0A CN111131750A (en) 2019-12-18 2019-12-18 Scheduling method, device and system
PCT/CN2020/109888 WO2021120652A1 (en) 2019-12-18 2020-08-18 Dispatch method, device, and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911311639.0A CN111131750A (en) 2019-12-18 2019-12-18 Scheduling method, device and system

Publications (1)

Publication Number Publication Date
CN111131750A true CN111131750A (en) 2020-05-08

Family

ID=70499770

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911311639.0A Pending CN111131750A (en) 2019-12-18 2019-12-18 Scheduling method, device and system

Country Status (2)

Country Link
CN (1) CN111131750A (en)
WO (1) WO2021120652A1 (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6271752B1 (en) * 1998-10-02 2001-08-07 Lucent Technologies, Inc. Intelligent multi-access system
IL201131A (en) * 2009-09-23 2014-08-31 Verint Systems Ltd Systems and methods for location-based multimedia monitoring
CN102263933B (en) * 2010-05-25 2013-04-10 浙江宇视科技有限公司 Implement method and device for intelligent monitor
CN103873816B (en) * 2012-12-10 2018-03-27 中兴通讯股份有限公司 Video frequency monitoring method and device
CN104811654B (en) * 2014-01-26 2018-01-16 杭州华为企业通信技术有限公司 A kind of monitoring method based on Internet of Things, apparatus and system
CN106470331B (en) * 2015-08-17 2019-04-19 杭州海康威视数字技术股份有限公司 A kind of monitoring method, monitor camera and monitoring system
CN109660710A (en) * 2017-10-10 2019-04-19 中兴通讯股份有限公司 Camera control method, terminal and storage medium based on video conference
CN111131750A (en) * 2019-12-18 2020-05-08 华为技术有限公司 Scheduling method, device and system

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102724482A (en) * 2012-06-18 2012-10-10 西安电子科技大学 Intelligent visual sensor network moving target relay tracking system based on GPS (global positioning system) and GIS (geographic information system)
CN202750170U (en) * 2012-08-21 2013-02-20 北京盈想东方科技发展有限公司 Integration visual command scheduling system
US20170150103A1 (en) * 2013-03-15 2017-05-25 Sony Interactive Entertainment America Llc Real time virtual reality leveraging web cams and ip cams and web cam and ip cam networks
CN104038743A (en) * 2014-06-23 2014-09-10 浙江工业大学 Video monitoring method fusing position tracking and video monitoring system fusing position tracking
CN104125433A (en) * 2014-07-30 2014-10-29 西安冉科信息技术有限公司 Moving object video surveillance method based on multi-PTZ (pan-tilt-zoom)-camera linkage structure
CN104349118A (en) * 2014-11-26 2015-02-11 苏州科达科技股份有限公司 Gateway for video conference system to schedule video monitoring system and processing method thereof
CN105187846A (en) * 2015-07-30 2015-12-23 上海互韦涵信息技术有限公司 Personal video real-time live broadcast method and system for runway
CN106899826A (en) * 2015-12-19 2017-06-27 西安成远网络科技有限公司 A kind of police car Vehicular video monitoring system
CN110087039A (en) * 2019-04-30 2019-08-02 苏州科达科技股份有限公司 Monitoring method, device, equipment, system and storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021120652A1 (en) * 2019-12-18 2021-06-24 华为技术有限公司 Dispatch method, device, and system
CN114338253A (en) * 2021-12-29 2022-04-12 上海洛轲智能科技有限公司 Vehicle-mounted conference management method and device and vehicle
CN114338253B (en) * 2021-12-29 2024-03-08 上海洛轲智能科技有限公司 Vehicle-mounted conference management method and device and vehicle
CN114821886A (en) * 2022-06-23 2022-07-29 深圳市普渡科技有限公司 Scheduling server, scheduling robot and reminding system

Also Published As

Publication number Publication date
WO2021120652A1 (en) 2021-06-24

Similar Documents

Publication Publication Date Title
US11425333B2 (en) Video management system (VMS) with embedded push to talk (PTT) control
CN108769240B (en) Intelligent dispatching command system and method
US11902342B2 (en) Incident communications network with dynamic asset marshaling and a mobile interoperability workstation
WO2021120652A1 (en) Dispatch method, device, and system
EP2679029B1 (en) Dynamic asset marshalling within an incident communications network
CN110445773B (en) Fire control commander regulation and control system based on thing networking
CN102917201B (en) A kind of Logistic Scheduling method and system based on Internet of Things and cloud computing
CN202773002U (en) Integrated visualized command and dispatch platform
CN110648121A (en) Alarm condition information linkage processing system, method and device and computer equipment
CN104410605A (en) Scheduling terminal, scheduling console, scheduling method and communication scheduling command method
CN107547821A (en) A kind of integrated command dispatching system and method based on PGIS
CN113727320A (en) Communication-converged emergency command scheduling system and method
CN111461556A (en) Staring control platform, method and equipment for railway electric service operation and storage medium
CN102404694B (en) Multimedia clustering dispatching and commanding system, mobile terminals and voice conversation method
CN204145538U (en) A kind of multimedia cluster dispatching communication system
CN115914539B (en) Method and system for scheduling audio and video equipment resources
CN112954254B (en) Emergency command fusion communication platform
KR102300124B1 (en) video surveillance system by use of core VMS and edge VMS in mobile edge computing
CN104966146A (en) Airport operation command system
CN113554542A (en) Emergency command and dispatching platform for emergent public health events of 5G mobile flow dispatching instrument
JP2017050615A (en) Inter-system cooperation device
CN106921946B (en) Dispatching desk
TWI781397B (en) Mobile communication device with automatic group changing function and its operation method
TWI745952B (en) Method for performing wireless broadcasting and mobile communication device
CN117764800A (en) Rail transit regional passenger flow monitoring and analyzing system based on multimode traffic

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20200508