CN113766279A - Information processing method, server and mobile terminal - Google Patents

Information processing method, server and mobile terminal

Info

Publication number
CN113766279A
CN113766279A
Authority
CN
China
Prior art keywords
data
server
mobile terminal
file
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110977698.2A
Other languages
Chinese (zh)
Inventor
贾正锋
贾军营
杨海波
徐光磊
葛文臣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenyang Fengchi Software Co ltd
Original Assignee
Shenyang Fengchi Software Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenyang Fengchi Software Co ltd filed Critical Shenyang Fengchi Software Co ltd
Priority to CN202110977698.2A priority Critical patent/CN113766279A/en
Publication of CN113766279A publication Critical patent/CN113766279A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/239Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60Network streaming of media packets
    • H04L65/75Media network packet handling
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/231Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers, prioritizing data for deletion
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234336Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by media transcoding, e.g. video is transformed into a slideshow of still pictures or audio is converted into text

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention provides an information processing method, a server and a mobile terminal. The mobile terminal can acquire field environment data and upload it, together with audio data and video data, to the server; the server synchronizes and stores the three kinds of data based on a video frame synchronization technology, and then transcodes and merges them to form a video file capable of displaying the field environment information, so that a user can retrieve the corresponding video file and view the field environment information in synchrony with playback. According to the invention, the field environment data is transmitted over WebRTC communication and the three kinds of data are synchronized based on the video frame synchronization technology, so that when abnormal conditions such as extreme weather occur, the degree to which the field scene can be restored is improved and the scene can be reconstructed accurately, thereby raising the utilization value of the audio and video data and reflecting the field conditions more clearly and effectively.

Description

Information processing method, server and mobile terminal
Technical Field
The invention relates to the field of WebRTC communication, in particular to an information processing method, a server and a mobile terminal.
Background
In video monitoring storage and retrieval systems for environments such as railways and traffic, data transmission mostly supports only audio data and video data. Stored surveillance audio and video provide sound and image records for daily work, but are not sufficient to capture information such as the environment, geographic position, and longitude and latitude of the current scene. For example, when unexpected conditions such as extreme weather, abnormal environmental indexes or traffic accidents occur, stored on-site audio and video data alone cannot accurately restore the on-site situation; storing environmental information such as temperature, humidity, and longitude and latitude can greatly increase the value of the audio and video data, explain the problems clearly and accurately, and effectively improve efficiency.
Therefore, how to devise a method that can transmit on-site environment information alongside the audio and video data has become a problem to be solved.
Disclosure of Invention
In order to solve the problems in the prior art that the on-site situation cannot be well restored from audio and video data alone, and that the environment and abnormal conditions of the current scene cannot be accurately known, a first aspect of the invention provides a server-based information processing method, which is used for a server.
The second aspect of the present invention also provides an information processing method based on a mobile terminal, which is used for the mobile terminal.
The third aspect of the present invention also provides a server supporting WebRTC communication.
The fourth aspect of the present invention also provides a mobile terminal supporting WebRTC communication.
In view of the above, a first aspect of the present invention provides a server-based information processing method, where WebRTC communication is supported between a server and a mobile terminal, and the server-based information processing method specifically includes: receiving multi-track data uploaded by a mobile terminal through WebRTC communication, wherein the multi-track data at least comprises field audio data, field video data and field environment data, and the field environment data is non-audio/video data; storing the received multi-track data into at least one basic file according to a division rule, naming the basic files in sequence according to a preset naming rule, and storing the basic files to a preset position; according to time, carrying out synchronous processing and associated storage on the field audio data, the field video data and the field environment data in each basic file to form an intermediate synchronous file; and performing preset processing on the at least one intermediate synchronous file and then storing the intermediate synchronous file to form at least one final video file, wherein the preset processing comprises transcoding processing.
The server-based information processing method is used for video monitoring of railway traffic, stations, and tracks. Specifically, in use, the server may be initialized, establish a communication connection and parameter configuration with the mobile terminal, and synchronously calibrate the mobile terminal's clock; it may then receive the audio and video data and the field environment data uploaded by the mobile terminal according to on-site conditions, where the field environment data may include temperature, humidity, longitude and latitude, terrain conditions, and the like. By receiving the environment data uploaded by the mobile terminal, the on-site situation can be restored further than audio and video alone allow, so that problems can be explained more clearly and effectively. After the multi-track data is received, it is named and stored to form a video basic file; the multi-track data in the video basic file is synchronized and stored in association to form a video intermediate file; finally, the video intermediate file undergoes preset processing to form the final video file. The preset processing at least includes transcoding and, if the audio data, the video data and the field environment data in the video intermediate file belong to the same time period, also includes merging.
In this technical scheme, first, WebRTC communication provides a standardized media transmission mode, with explicit specifications for the encoding format, transmission mode, and negotiation process of audio and video; in principle, all terminals supporting WebRTC interoperate without obstacle. That is, WebRTC communication provides a good foundation for encoding and transmitting audio and video, with better support for operability and compatibility. The server supports three data modes and synchronizes the three kinds of data based on a video frame synchronization technology, which enables it to receive multi-track data. Meanwhile, because the received data is multi-track, field environment data can be added on top of the existing audio and video data. By adding field environment data, that is, by storing environmental information such as temperature, humidity, and longitude and latitude, the degree to which the field scene can be restored is improved when unexpected conditions such as extreme weather, abnormal environmental indexes or traffic accidents occur, so that the scene can be accurately restored, the utilization value of the audio and video data is raised, and the field conditions are reflected more clearly and effectively.
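As a concrete illustration of the time-based synchronization described above, the following minimal Python sketch aligns non-video samples (audio chunks or environment readings) to the nearest video frame timestamp. It is a stand-in under stated assumptions, not the patent's actual implementation; the function name and data layout are invented for illustration.

```python
import bisect

def align_to_frames(frame_ts, samples):
    """Assign each non-video sample (an (timestamp, payload) pair) to the
    nearest video frame timestamp - a simple stand-in for the patent's
    video-frame-synchronization step. frame_ts must be sorted ascending."""
    aligned = {t: [] for t in frame_ts}
    for ts, payload in samples:
        i = bisect.bisect_left(frame_ts, ts)
        # pick the closer of the two neighbouring frames
        if i == 0:
            nearest = frame_ts[0]
        elif i == len(frame_ts):
            nearest = frame_ts[-1]
        else:
            before, after = frame_ts[i - 1], frame_ts[i]
            nearest = before if ts - before <= after - ts else after
        aligned[nearest].append(payload)
    return aligned
```

With 25 fps video the frame timestamps are 40 ms apart, so an environment reading taken at 0.05 s would be filed under the 0.04 s frame.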
WebRTC (Web Real-Time Communication) in the present application refers to a technology that enables web browsers to conduct real-time voice or video conversations.
In addition, the information processing method based on the server in the above technical solution provided by the present invention may further have the following additional technical features:
in the above technical solution, the multi-track data has attribute information, and the attribute information includes a source of the data and a forming time period of the data, and the server-based information processing method further includes: adding description information to each final video file according to the attribute information, the division rule, the preset naming rule and the preset position, wherein the description information comprises: and at least one of the size, name, source, time period and file storage path of the final video file.
In this technical scheme, when transcoding is performed to form a final video file, the storage format of the final video file is determined according to the encoding type of each media stream, and description information is generated from information such as the start and end times of the monitoring video, the corresponding mobile terminal ID, and the file storage path. The description information may be at least one of the size of the final video file, its name, its source mobile terminal, the time period to which it belongs, and the file storage path. The user can search for video files in the external service system corresponding to the server according to the description information. By generating description information, each final video file gains associated information that identifies it, making management more convenient, while the user can retrieve the associated monitoring video according to the description information.
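The description record outlined above might be assembled as in the following sketch; all field names (`name`, `size`, `source`, `period`, `path`) are illustrative assumptions, not taken from the patent.

```python
def build_description(final_name, size_bytes, terminal_id,
                      start_time, end_time, storage_path):
    """Assemble the description record the server attaches to each final
    video file: size, name, source terminal, time period, storage path."""
    return {
        "name": final_name,
        "size": size_bytes,
        "source": terminal_id,
        "period": (start_time, end_time),
        "path": storage_path,
    }
```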
In any of the above technical solutions, a retrieval or viewing instruction is generated according to the acquired user input information; when a retrieval or viewing instruction is received, finding a final video file matched with the retrieval or viewing instruction; and synchronously displaying the matched final video files.
In the technical scheme, a user can input description information in an external service system, and further the description information is converted into a retrieval or viewing instruction in a server, and the corresponding video file can be found and displayed according to the retrieval or viewing instruction. It should be noted that, since the final video file is subjected to the synchronization processing and is stored in association with each other according to the time sequence, the on-site environment information at the corresponding time can be synchronously displayed while the audio/video data is played.
The step of synchronously displaying the matched final video files includes: displaying synchronously according to the association between the field audio data, the field video data, and the field environment data, that is, synchronously by time.
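Matching a retrieval or viewing instruction against the stored description records could look like the following sketch; the query-by-field-equality semantics is an assumption made for illustration.

```python
def match_final_files(descriptions, **query):
    """Return the final video files whose description record matches every
    field of the retrieval or viewing instruction."""
    return [d for d in descriptions
            if all(d.get(k) == v for k, v in query.items())]
```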
In any of the above technical solutions, the step of performing preset processing on at least one intermediate synchronization file includes: transcoding the data in the at least one intermediate synchronization file to form a transcoded file; the audio data and the video data in the transcoding file are synchronously synthesized to form first synthesized data.
In this technical scheme, the intermediate synchronization file formed after synchronization needs to be transcoded; after transcoding, whether the audio data and the video data are in the same time period is judged, and if so, they are merged to form the first composite data.
In any of the above technical solutions, the step of performing preset processing on at least one intermediate synchronization file further includes: and synchronously synthesizing the first synthetic data and the field environment data in the transcoding file to form second synthetic data.
In the technical scheme, the field environment data is stored according to the received time, and after the audio data and the video data in the same time period are combined, the field environment data can also be combined according to the time to form second composite data.
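The two-stage synthesis above — audio and video merged first, then the field environment data for the same time period — can be sketched as follows. The dictionary layout and the period representation are assumptions for illustration only.

```python
def synthesize(audio, video, env):
    """Two-stage merge from the patent: combine audio and video that share
    a time period into first composite data, then attach the field
    environment readings falling inside that period as second composite data."""
    if audio["period"] != video["period"]:
        raise ValueError("audio and video are not in the same time period")
    start, end = audio["period"]
    first = {"period": (start, end),
             "streams": [audio["data"], video["data"]]}   # first composite
    second = dict(first)
    second["env"] = [e for e in env if start <= e["t"] < end]
    return second
```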
In any of the above technical solutions, the step of storing the received multi-track data into at least one basic file according to the partition rule includes: storing the multi-track data received in the current time period once every preset time length to form a basic file; and when a data acquisition stopping instruction is detected, directly storing the multi-track data which is received in the current time period into a basic file.
In this technical scheme, when the duration of the received data exceeds the preset duration, the data is cut off at the preset duration, named, and saved to form a basic file, and the remainder continues to be received. For example, the preset duration may be set to thirty minutes: when the received data exceeds thirty minutes, it is cut off at the thirty-minute mark, named, and saved; the subsequent part continues to be received, and when it again exceeds thirty minutes the above steps are repeated. In this way, the duration of every stored file is uniformly less than or equal to the preset duration, which avoids files of widely varying lengths and facilitates uniform management. The recording mode of the mobile terminal can be set to a continuous recording mode, or optionally to a time-sharing recording mode. When an instruction to stop data acquisition is detected, data acquisition and reception stop, and the multi-track data received so far is saved as a basic file.
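The division rule above (cut at a preset duration; whatever remains when acquisition stops is stored as-is) can be illustrated with a small Python sketch; the (timestamp, payload) data layout is an assumption.

```python
def divide_into_base_files(samples, preset_seconds):
    """Cut the received multi-track stream into base files of at most
    preset_seconds each (the patent's division rule). The final, possibly
    shorter, chunk corresponds to the data left when acquisition stops."""
    files = []
    for ts, payload in sorted(samples, key=lambda s: s[0]):
        idx = int(ts // preset_seconds)     # which chunk this sample falls in
        while len(files) <= idx:
            files.append([])
        files[idx].append((ts, payload))
    return [f for f in files if f]
```

With a thirty-minute preset (1800 s), samples at 0 s and 100 s land in the first base file and a sample at 1900 s starts the second.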
In any of the above technical solutions, the network state of the mobile terminal is monitored, and when a preset abnormal condition is monitored at the mobile terminal, a data acquisition stop instruction is generated, and the mobile terminal stops data acquisition processing.
In this technical scheme, while maintaining the network connection with the mobile terminal, the server monitors the terminal's working state; when a preset abnormal condition is found, such as network disconnection or equipment failure, a stop-acquisition instruction is generated to stop the mobile terminal's data acquisition. By detecting the network state of the mobile terminal, the received file is saved whenever an emergency occurs at the mobile terminal, avoiding data loss and improving transmission reliability.
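A minimal sketch of the abnormal-condition check that produces the stop-acquisition instruction might look as follows; the state names and command format are assumptions, not part of the patent.

```python
ABNORMAL_STATES = {"disconnected", "device_failure"}  # preset abnormal conditions

def check_terminal(state):
    """Return a stop-acquisition instruction when the monitored terminal
    state matches a preset abnormal condition, otherwise None."""
    if state in ABNORMAL_STATES:
        return {"command": "stop_acquisition", "reason": state}
    return None
```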
A second aspect of the present invention provides an information processing method based on a mobile terminal, which is used for the mobile terminal and includes: performing WebRTC communication connection with a server; and uploading the collected multi-track data to a server through WebRTC communication, wherein the multi-track data at least comprises field audio data, field video data and field environment data, and the field environment data is non-audio/video data.
According to this technical scheme, the mobile-terminal-based information processing method is used for the mobile terminal. First, a WebRTC communication connection is established with the server, and steps such as initialization and parameter configuration are performed; data acquisition then begins, and on-site audio, video, and field environment data are collected and uploaded to the server. Establishing the WebRTC connection between the mobile terminal and the server, together with initialization and network configuration, builds a reliable communication channel and correspondence between the two; meanwhile, WebRTC communication provides the basis for the subsequent transcoding and storage of the audio, video, and field environment data collected by the mobile terminal, with better support for operability and compatibility.
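One hedged way the mobile terminal might package a multi-track upload — for example over a WebRTC data channel travelling alongside the media tracks — is sketched below; the JSON layout and field names are assumptions, not part of the patent.

```python
import json

def build_upload(terminal_id, ts, audio_chunk, video_chunk, env):
    """Package one multi-track sample (audio, video, field environment) as a
    single JSON message tagged with the terminal ID and capture timestamp."""
    return json.dumps({
        "terminal": terminal_id,
        "ts": ts,  # capture time, seconds since session start
        "tracks": {"audio": audio_chunk, "video": video_chunk, "env": env},
    })
```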
A third aspect of the present invention provides a server comprising: the first communication unit is used for carrying out WebRTC communication with the mobile terminal so as to receive the multi-track data uploaded by the mobile terminal; the database is used for storing data and files; the human-computer interaction unit is used for acquiring input information of a user and displaying the information; and the processing unit comprises a storage unit and a processor, wherein the storage unit stores a computer program, and the processor is used for realizing the steps of the server-based information processing method provided by any one of the technical schemes when executing the computer program.
According to the server provided by the technical scheme of the invention, the first communication unit can send and receive the event notification with the mobile terminal; the database can be used for storing audio, video and field environment data transmitted from the mobile terminal; the man-machine interaction unit can acquire and display information input by a user; and the processing unit can be used for executing synchronization and preset processing and storing the video files formed after the processing. Meanwhile, since the server further includes the processing unit, and the processor of the processing unit can implement the steps of the server-based information processing method provided in any of the above technical solutions when executing the computer program on the storage unit, the server has all the technical effects of the server-based information processing method, and details are not described herein.
Further, the server further comprises: and the signaling receiving unit is connected with the mobile terminal signaling sending unit and used for receiving the event notification from the mobile terminal. That is, in the present application, there are two communication units between the server and the mobile terminal, one is used for receiving multi-track data, and the other is used exclusively for receiving signaling. Of course, the signaling receiving unit and the first communication unit may also be combined together.
A fourth aspect of the present invention provides a mobile terminal, comprising: the audio and video acquisition unit is used for acquiring the field audio data and the field video data of the position of the mobile terminal; the environment data acquisition group is used for acquiring field environment data of the position of the mobile terminal, and the field environment data comprises at least one of field temperature data, humidity data and longitude and latitude; the second communication unit is used for carrying out WebRTC communication with the server; and the processing unit is used for controlling the audio and video acquisition unit and the environment data acquisition group to work and controlling the second communication unit to upload multi-track data to the server, wherein the multi-track data comprises field audio data, field video data and field environment data.
According to the mobile terminal provided by the technical scheme of the invention, the audio and video acquisition unit and the environmental data acquisition group are used for acquiring multi-track data; the second communication unit is used for transmitting the multi-track data to the server and receiving event notification from the server; a processing unit for responding to the received event notification. The processing unit is also used for controlling the second communication unit to upload multi-track data to the server, wherein the multi-track data comprises the field audio data, the field video data and the field environment data.
Further, the environmental data collection group includes at least one of a position detection unit, a temperature detection unit, a humidity detection unit, and a meteorological data acquisition device. The meteorological acquisition device is used to obtain weather data at any moment, such as wind direction, cloud cover, rainfall, visibility, and haze index.
Furthermore, the mobile terminal also includes a data coding unit, connected to the environmental data acquisition group, for summarizing and encoding the field environment data acquired by the plurality of acquisition devices. The environment data acquisition group comprises a plurality of acquisition devices connected to the data coding unit for acquiring field environment data; the encoded field environment data is then transmitted through the communication unit.
The acquisition device can be a sensor arranged on the mobile terminal, or an independently arranged sensor separate from the mobile terminal; in the latter case, the peripheral sensor can exchange data with the mobile terminal in a wired or wireless manner.
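The data coding unit's job of summarizing readings from several acquisition devices into one encoded field-environment record could be sketched as follows; JSON as the encoding format is an assumption made for illustration.

```python
import json

def encode_environment(readings):
    """Summarize (device, value) readings from the acquisition group into
    one encoded field-environment record ready for transmission."""
    record = {}
    for device, value in readings:
        record[device] = value
    return json.dumps(record, sort_keys=True)
```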
Additional aspects and advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 shows a flow chart diagram of a server-based information processing method proposed by an embodiment of the present invention;
fig. 2 is a flowchart illustrating a mobile-side based information processing method according to an embodiment of the present invention;
fig. 3 is a flowchart illustrating an information processing method based on the system of the server and the mobile terminal according to an embodiment of the present invention;
FIG. 4 shows a block diagram of a server proposed by an embodiment of the present invention;
fig. 5 shows a block diagram of a WebRTC communication-based mobile terminal of the present invention;
fig. 6 is a block diagram showing a system of a server and a mobile terminal according to an embodiment of the present invention.
Wherein, the correspondence between the reference numbers and the part names in fig. 4 to 6 is:
the system comprises a 400 server, a 410 first communication unit, a 420 database, a 430 human-computer interaction unit, a 440 processing unit, a 442 storage unit, a 444 processor, a 500 mobile terminal, a 510 audio/video acquisition unit, a 520 environment data acquisition group, a 530 second communication unit, a 540 processing unit, a 600 server, a 610 signaling receiving unit, a 620 data receiving unit, a 630 data storage unit, a 640 data synchronization unit, a 650 media transcoding unit, a 660 storage unit, a 670 host configuration unit, a 700 mobile terminal, a 710 signaling sending unit, a 720 data sending unit, a 730 field environment data acquisition group, a 740 host configuration unit and an 800 external service system.
Detailed Description
In order that the above objects, features and advantages of the present invention can be more clearly understood, a more particular description of the invention will be rendered by reference to the appended drawings. It should be noted that the embodiments and features of the embodiments of the present application may be combined with each other without conflict.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, however, the present invention may be practiced in other ways than those specifically described herein, and therefore the scope of the present invention is not limited by the specific embodiments disclosed below.
An embodiment of the first aspect of the present invention provides a server-based information processing method, as shown in fig. 1, the method including the following steps:
s102, receiving multi-track data uploaded by a mobile terminal through WebRTC communication, wherein the multi-track data at least comprises field audio data, field video data and field environment data, and the field environment data is non-audio/video data;
s104, storing the received multi-track data into at least one basic file according to a division rule, naming the basic files in sequence according to a preset naming rule, and storing the basic files to a preset position;
s106, synchronizing and storing the field audio data, the field video data and the field environment data in each basic file according to time to form an intermediate synchronous file;
and S108, performing preset processing on the at least one intermediate synchronous file and then storing the intermediate synchronous file to form at least one final video file, wherein the preset processing comprises transcoding processing.
The server-based information processing method is used for video monitoring of railway traffic, stations, and tracks. Specifically, in use, the server may be initialized, establish a communication connection and parameter configuration with the mobile terminal, and synchronously calibrate the mobile terminal's clock; it may then receive the audio and video data and the field environment data uploaded by the mobile terminal according to on-site conditions, where the field environment data may include temperature, humidity, longitude and latitude, terrain conditions, and the like. By receiving the environment data uploaded by the mobile terminal, the on-site situation can be restored further than audio and video alone allow, so that problems can be explained more clearly and effectively. After the multi-track data is received, it is named and stored to form a video basic file; the multi-track data in the video basic file is synchronized and stored in association to form a video intermediate file; finally, the video intermediate file undergoes preset processing to form the final video file. The preset processing at least includes transcoding and, if the audio data, the video data and the field environment data in the video intermediate file belong to the same time period, also includes merging. First, WebRTC communication provides a standardized media transmission mode, with explicit specifications for the encoding format, transmission mode, and negotiation process of audio and video; in principle, all terminals supporting WebRTC interoperate without obstacle. That is, WebRTC communication provides a good foundation for encoding and transmitting audio and video, with better support for operability and compatibility.
The server supports three data modes and synchronizes the three kinds of data based on a video frame synchronization technology, which enables it to receive multi-track data. Meanwhile, because the received data is multi-track, field environment data can be added on top of the existing audio and video data. By adding field environment data, that is, by storing environmental information such as temperature, humidity, and longitude and latitude, the degree to which the field scene can be restored is improved when unexpected conditions such as extreme weather, abnormal environmental indexes or traffic accidents occur, so that the scene can be accurately restored, the utilization value of the audio and video data is raised, and the field conditions are reflected more clearly and effectively.
In any of the above embodiments, the multi-track data has attribute information that includes the source of the data and the time period in which the data was formed, and the server-based information processing method further includes: adding description information to each final video file according to the attribute information, the division rule, the preset naming rule and the preset position, where the description information comprises at least one of the size, name, source, time period and file storage path of the final video file.
In this embodiment, when transcoding is performed to form the final video file, the storage format of the final video file is determined according to the encoding type of each media stream, and description information is generated from information such as the start time and end time of the monitoring video, the ID of the corresponding mobile terminal, and the file storage path. The description information may be at least one of the size of the final video file, its name, the source mobile terminal, the time period to which it belongs, and the file storage path. The user can search for video files in the external service system corresponding to the server according to the description information. By generating description information, each final video file has associated information that identifies it, which makes management more convenient; at the same time, a user can enter description information in the external service system, which is converted into a retrieval or viewing instruction in the server, and the corresponding video file can be found and displayed according to that instruction. It should be noted that, because the final video file has already been synchronized, merged, and so on, the on-site environment information for the corresponding time can be displayed synchronously while the audio and video data is played.
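As an illustration of how such a description record might be assembled from the attribute information, the function below builds one record per final video file. The field names, the filename pattern and the `.mp4` container are hypothetical; the patent only lists the categories of information the description comprises.

```python
def build_description(attrs, base_path, size_bytes):
    """Derive the description record for one final video file.

    `attrs` carries the multi-track attribute information: the source
    terminal ID and the recording time period (start, end). All key and
    filename conventions here are illustrative, not from the patent.
    """
    start, end = attrs["period"]
    name = f"{attrs['terminal_id']}_{start}_{end}.mp4"
    return {
        "size": size_bytes,                 # file size
        "name": name,                       # generated by the naming rule
        "source": attrs["terminal_id"],     # uploading mobile terminal
        "period": attrs["period"],          # start/end of the recording
        "path": f"{base_path}/{name}",      # preset storage position
    }
```

An external service system could then index these records so that any single field (source terminal, time period, name) is enough to locate the file.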
In any of the above embodiments, a retrieval or viewing instruction is generated according to the acquired user input information; when the retrieval or viewing instruction is received, the final video file matching it is found; and the matched final video file is displayed synchronously.
In this embodiment, the user may enter description information in the external service system, which is converted into a retrieval or viewing instruction in the server; according to that instruction, the corresponding video file can be found and displayed. It should be noted that, because the final video file has already been synchronized, merged, and so on, the on-site environment information for the corresponding time can be displayed synchronously while the audio and video data is played.
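The matching step can be pictured as filtering the stored description records against the fields supplied in the instruction. This is a minimal sketch assuming the description records are the dicts produced above; the patent does not prescribe a query mechanism.

```python
def find_matching(files, query):
    """Return the final video files whose description matches every
    field of the retrieval/viewing instruction.

    `files` is a list of description dicts; `query` maps description
    fields (e.g. "source", "period") to the values the user entered.
    """
    return [f for f in files
            if all(f.get(k) == v for k, v in query.items())]
```

A query on a single field such as the source terminal returns every file from that terminal, while adding the time period narrows the result to one recording.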
In any of the above embodiments, the step of performing preset processing on at least one intermediate synchronization file includes: transcoding the data in the at least one intermediate synchronization file to form a transcoded file; and synchronously synthesizing the audio data and the video data in the transcoded file to form first synthesized data.
In this embodiment, the intermediate synchronization file formed by the synchronization process needs to be transcoded. After transcoding, it is determined whether the audio data and the video data belong to the same time period; if so, they are merged to form the first synthesized data.
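The "same time period" test can be read as an interval-overlap check on the two streams' recording periods. The sketch below assumes each segment carries a `(start, end)` period in milliseconds; the representation is illustrative.

```python
def same_period(a, b):
    """Two half-open intervals (start, end) overlap iff each starts
    before the other ends."""
    return a[0] < b[1] and b[0] < a[1]

def merge_if_same_period(audio_seg, video_seg):
    """Form the first synthesized data only when the transcoded audio
    and video cover the same time period; otherwise return None and
    keep the streams separate."""
    if same_period(audio_seg["period"], video_seg["period"]):
        start = min(audio_seg["period"][0], video_seg["period"][0])
        end = max(audio_seg["period"][1], video_seg["period"][1])
        return {"period": (start, end), "tracks": ["audio", "video"]}
    return None
```

In a real pipeline the merge itself would be a container mux (e.g. interleaving the transcoded streams), but the gating condition is exactly this overlap test.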
In any of the above embodiments, the step of performing preset processing on at least one intermediate synchronization file further includes: synchronously synthesizing the first synthesized data and the field environment data in the transcoded file to form second synthesized data.
In this embodiment, the intermediate synchronization file formed by the synchronization process needs to be transcoded. After transcoding, it is determined whether the audio data and the video data belong to the same time period; if so, they are merged, and the field environment data is then synthesized with the result according to the time at which it was received, forming the second synthesized data.
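Forming the second synthesized data then amounts to attaching, by timestamp, the environment samples that fall inside the merged segment's period. A minimal sketch, assuming the first synthesized data is the dict from the previous step and environment samples are `(timestamp, reading)` pairs:

```python
def attach_environment(first_synth, env_samples):
    """Form the second synthesized data: keep only the environment
    samples whose timestamps fall inside the merged segment's period
    and attach them to the segment."""
    start, end = first_synth["period"]
    inside = [(t, s) for t, s in env_samples if start <= t <= end]
    return {**first_synth, "environment": inside}
```

Samples recorded outside the segment (e.g. during a gap between recordings) are simply not attached, so playback only ever shows environment readings for the time actually covered by the video.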
In any of the above embodiments, the step of storing the received multi-track data into at least one basic file according to the division rule includes: storing the multi-track data received in the current time period once every preset duration to form one basic file; and, when a stop-data-acquisition instruction is detected, directly storing the multi-track data already received in the current time period into one basic file.
In this embodiment, when the duration of the received data exceeds the preset duration, the data is cut at the preset duration and the segment is named and saved to form a basic file, while the remainder continues to be received. For example, the preset duration may be set to thirty minutes: when the received data exceeds thirty minutes, it is cut at the thirty-minute mark, named and saved, the subsequent part continues to be received, and when it again exceeds thirty minutes the above steps are repeated. In this way, the duration of every stored file is uniformly less than or equal to the preset duration, which avoids files of widely varying lengths and facilitates uniform management. The recording mode of the mobile terminal can be set to a continuous recording mode or, optionally, a time-sharing recording mode. When an instruction to stop data acquisition is detected, acquisition and reception stop, and the multi-track data received so far is stored as a basic file.
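The cutting rule described above can be sketched as a simple segmenter over a stream of timestamped packets: cut whenever the elapsed time since the current file's first packet reaches the preset duration, and flush whatever remains when acquisition stops. The packet representation is illustrative.

```python
def split_into_basic_files(packets, preset_ms=30 * 60 * 1000):
    """Cut a stream of (timestamp_ms, payload) packets into basic files
    of at most `preset_ms` duration, measured from each file's first
    packet. The trailing partial file is kept (flush on stop)."""
    files, current, origin = [], [], None
    for ts, payload in packets:
        if origin is None:
            origin = ts
        if ts - origin >= preset_ms:   # cut at the preset duration
            files.append(current)
            current, origin = [], ts
        current.append((ts, payload))
    if current:                        # stop-acquisition flush
        files.append(current)
    return files
```

With the default of thirty minutes, a two-hour recording yields four basic files, and a recording interrupted mid-segment yields a final shorter file, matching the "less than or equal to the preset duration" guarantee.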
In any of the above embodiments, the network state of the mobile terminal is monitored, and when a preset abnormal condition of the mobile terminal is monitored, a data acquisition stopping instruction is generated, so that the mobile terminal stops data acquisition processing.
In this embodiment, after the network connection with the mobile terminal is established, the working state of the mobile terminal is monitored. When a preset abnormal condition occurs at the mobile terminal, such as a network disconnection or an equipment failure, a stop-data-acquisition instruction is generated so that the mobile terminal stops data acquisition. By monitoring the network state of the mobile terminal, the data already received is saved whenever an emergency occurs at the terminal, which avoids data loss and improves transmission reliability.
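A minimal watchdog for this behavior checks the monitored state and, on any preset abnormal condition, emits a stop instruction that also triggers flushing the current basic file. The state fields and instruction shape are hypothetical.

```python
def watchdog(terminal_state):
    """Generate a stop-acquisition instruction when a preset abnormal
    condition (network loss or device fault) is observed; `flush: True`
    indicates the currently received data must be saved so nothing is
    lost. Returns None while the terminal is healthy."""
    abnormal = (not terminal_state["network_up"]) or terminal_state["device_fault"]
    if abnormal:
        return {"command": "stop_acquisition", "flush": True}
    return None
```

In practice this check would run on a timer or on WebRTC connection-state callbacks, but the decision logic is just this predicate over the monitored state.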
An embodiment of the second aspect of the present invention provides an information processing method based on a mobile terminal, as shown in fig. 2, the method includes the following steps:
S201, performing a WebRTC communication connection with a server;
S202, uploading the collected multi-track data to the server through WebRTC communication, wherein the multi-track data at least comprises field audio data, field video data and field environment data, and the field environment data is non-audio/video data.
In this embodiment, the mobile terminal first establishes a WebRTC communication connection with the server and performs initialization, parameter configuration, and so on, and then starts data acquisition. The mobile terminal then collects the on-site audio, video and field environment data and uploads them to the server. Establishing the WebRTC connection, with initialization and network configuration, builds a reliable communication channel and correspondence between the mobile terminal and the server; at the same time, WebRTC communication provides a basis for the subsequent transcoding and storage of the audio, video and field environment data collected by the mobile terminal, with better support for operability and compatibility.
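On the terminal side, the environment track could travel as small JSON messages alongside the WebRTC media tracks (for example over a data channel). The serializer below is a sketch of one such message; the field names and transport choice are assumptions, not part of the disclosure.

```python
import json
import time

def environment_message(terminal_id, temp_c, humidity_pct, lat, lon):
    """Serialize one field-environment sample as a JSON message.

    The server-side timestamp calibration described earlier lets these
    terminal-stamped samples be aligned with the media tracks later.
    All field names here are illustrative.
    """
    return json.dumps({
        "terminal": terminal_id,
        "ts_ms": int(time.time() * 1000),   # terminal clock, calibrated
        "temperature": temp_c,
        "humidity": humidity_pct,
        "position": {"lat": lat, "lon": lon},
    })
```

Because the sample is plain non-audio/video data, it can be stored verbatim by the server and later attached to the synchronized video by its `ts_ms` value.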
In actual use, the server and the mobile terminal work cooperatively. The specific steps of the information processing method are described below with the server and the mobile terminal treated as one system.
As shown in fig. 3, the information processing method when the server and the mobile terminal are used as a system specifically includes the following steps:
S301, the server establishes a connection with the mobile terminal and performs preparation work such as system initialization, network parameter configuration and time calibration;
S302, the mobile terminal collects on-site multi-track data and uploads it to the server, and the server receives the multi-track data uploaded by the mobile terminal;
S303, judging whether the duration of the audio and video data in the multi-track data is greater than the preset duration; if not, proceeding to S304, and if so, proceeding to S305;
S304, the server forms a basic file from the received multi-track data and names and stores it according to the receiving address, the port and the mobile terminal ID;
S305, the part of the multi-track data that fills the preset duration forms a basic file, which is named and stored; the part exceeding the preset duration continues to be stored in segments of the preset duration, forming a plurality of basic files;
S306, performing video frame synchronization on the audio, video and site environment data in the basic file to form a video intermediate file, and transcoding the video intermediate file;
S307, judging whether the transcoded video intermediate file contains audio and video files in the same time period; if so, proceeding to S308, otherwise proceeding to S309;
S308, merging the audio and video files;
S309, forming the final video file and adding description information to it according to the file size, name, source, time period and file storage path.
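The flow of steps S302 through S308 can be sketched end to end, with the real media operations reduced to placeholders: the stream is cut into basic files at the preset duration, each file's tracks are grouped by timestamp as a stand-in for frame synchronization, and tracks sharing a time period end up in the same merged record. The packet layout and placeholder logic are illustrative only.

```python
def process_upload(packets, preset_ms=1_800_000):
    """Sketch of S302-S308. `packets` is a time-ordered stream of
    (timestamp_ms, track_name, payload) tuples covering all tracks."""
    # S303-S305: cut the stream at the preset duration
    files, current, origin = [], [], None
    for ts, track, payload in packets:
        if origin is None:
            origin = ts
        if ts - origin >= preset_ms:
            files.append(current)
            current, origin = [], ts
        current.append((ts, track, payload))
    if current:
        files.append(current)

    finals = []
    for f in files:
        # S306: group by timestamp, a minimal stand-in for frame sync
        synced = {}
        for ts, track, payload in f:
            synced.setdefault(ts, {})[track] = payload
        # S307-S308: tracks recorded in the same period land in one record
        finals.append({"period": (f[0][0], f[-1][0]), "frames": synced})
    return finals
```

Step S309 would then run the description-information builder over each element of the returned list, using the period and source carried along with the data.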
In this embodiment, the data transmitted between the mobile terminal and the server comprises multi-track data, so that when monitoring sites such as traffic and railways, field environment data can be added on top of the existing audio and video data. By adding field environment data, that is, storing environment information such as temperature, humidity, longitude and latitude, the degree to which the scene can be restored is improved when unexpected conditions such as extreme weather, abnormal environmental indexes or traffic accidents occur, so that the site can be accurately restored. When transcoding is performed to form the final video file, the storage format of the final video file is determined according to the encoding type of each media stream, and description information is generated from information such as the start time and end time of the monitoring video, the corresponding mobile terminal ID, and the file storage path. The description information may be at least one of the size of the final video file, its name, the source mobile terminal, the time period to which it belongs, and the file storage path. The user can search for video files in the external service system corresponding to the server according to the description information. By generating description information, each final video file has associated information that identifies it, which makes management more convenient; at the same time, a user can enter description information in the external service system, which is converted into a retrieval or viewing instruction in the server, and the corresponding video file can be found and displayed according to that instruction.
After the audio data and video data in the same time period are merged, the site environment data can also be merged into the result according to time, forming the second synthesized data. When the duration of the received data exceeds the preset duration, the data is cut at the preset duration, named and saved to form a basic file, and the remainder continues to be received. In this way, the duration of every stored file is uniformly less than or equal to the preset duration, which avoids files of widely varying lengths and facilitates uniform management.
In any of the above embodiments, the network state of the mobile terminal is monitored, and when a preset abnormal condition of the mobile terminal is monitored, a data acquisition stopping instruction is generated, so that the mobile terminal stops data acquisition processing.
In this embodiment, after the network connection with the mobile terminal is established, the working state of the mobile terminal is monitored. When a preset abnormal condition occurs at the mobile terminal, such as a network disconnection or an equipment failure, a stop-data-acquisition instruction is generated so that the mobile terminal stops data acquisition. By monitoring the network state of the mobile terminal, the data already received is saved whenever an emergency occurs at the terminal, which avoids data loss and improves transmission reliability.
An embodiment of the third aspect of the present invention provides a server 400, where the server 400 supports WebRTC communication. As shown in fig. 4, the server specifically includes: a first communication unit 410, configured to perform WebRTC communication with the mobile terminal to receive multi-track data uploaded by the mobile terminal; a database 420 for storing data and files; a human-computer interaction unit 430 for acquiring a user's input information and displaying information; and a processing unit 440 comprising a storage unit 442 and a processor 444, wherein the storage unit 442 stores a computer program and the processor 444, when executing the computer program, can implement the steps of the server-based information processing method provided in any of the above embodiments.
According to the server 400 provided by this embodiment of the present invention, the first communication unit 410 can exchange event notifications with the mobile terminal; the database 420 can store the audio, video and field environment data transmitted from the mobile terminal; the human-computer interaction unit 430 can acquire and display information input by a user; and the processing unit 440 can perform the synchronization and preset processing and store the processed video files. Moreover, since the server includes the processing unit, and the processor of the processing unit can implement the steps of any of the above technical solutions when executing the computer program on the storage unit, the server has all the technical effects of the server-based information processing method, which are not repeated here.
Further, the server also comprises a signaling receiving unit, connected to the signaling sending unit of the mobile terminal and used for receiving event notifications from the mobile terminal. That is, in the present application, there are two communication units between the server and the mobile terminal: one is used for receiving multi-track data, and the other is dedicated to receiving signaling. Of course, the signaling receiving unit and the first communication unit may also be combined.
An embodiment of the fourth aspect of the present invention provides a mobile terminal 500, as shown in fig. 5. The mobile terminal 500 supports WebRTC communication and specifically includes: an audio/video acquisition unit 510, configured to acquire the field audio data and field video data of the mobile terminal's location; an environment data acquisition group 520, configured to acquire the field environment data of the mobile terminal's location, where the field environment data comprises at least one of field temperature data, humidity data, and longitude and latitude; a second communication unit 530, configured to perform WebRTC communication with the server; and a processing unit 540, configured to control the audio/video acquisition unit and the environment data acquisition group to operate and to control the second communication unit 530 to upload the multi-track data to the server, where the multi-track data includes the field audio data, field video data and field environment data.
According to the mobile terminal 500 provided by the embodiment of the invention, the audio/video acquisition unit 510 and the environmental data acquisition group 520 are used for acquiring multi-track data; the second communication unit 530 is used for transmitting the multi-track data to the server and can receive the event notification from the server; and the processing unit 540 is configured to respond to the received event notification and control the second communication unit 530 to upload multi-track data to the server, where the multi-track data includes the live audio data, the live video data, and the live environment data.
The whole monitoring system composed of the mobile terminal and the server provided by the embodiment of the present application is further described with reference to fig. 6.
As shown in fig. 6, the entire monitoring system includes a server 600 supporting WebRTC communication and a mobile terminal 700. The server includes: a signaling receiving unit 610, connected to the signaling sending unit 710 of the mobile terminal and configured to receive event notifications from the mobile terminal 700; a data receiving unit 620, connected to the data sending unit 720 of the mobile terminal and configured to receive the audio data, video data and field environment data from the mobile terminal 700; a data storage unit 630, connected to the data receiving unit 620 and configured to store the video files; a data synchronization unit 640, connected to the data storage unit 630 and configured to synchronize the basic files; a media transcoding unit 650, connected to the data synchronization unit 640 and configured to transcode the video intermediate file and merge the audio and video files according to the time information; a storage unit 660, connected to the media transcoding unit 650 and responsible for storing the final video file and its description information; and a host configuration unit 670, connected to the signaling receiving unit 610, the data receiving unit 620 and the data storage unit 630, configured to provide the configuration parameters of the server.
The mobile terminal 700 further includes: a signaling sending unit 710, connected to the signaling receiving unit 610 of the server and configured to send event notifications to the server 600; a data sending unit 720, connected to the data receiving unit 620 of the server and configured to send the collected audio data, video data and field environment data; a field environment data acquisition group 730, configured to acquire, in real time, the field environment data of the environment where the mobile terminal 700 is located; and a host configuration unit 670, connected to the signaling sending unit 710 and the data sending unit 720, configured to provide the configuration parameters of the mobile terminal 700. The mobile terminal 700 may be a law enforcement recorder, a mobile phone, a webcam, or the like. Of course, the mobile terminal 700 further includes a data encoding unit for encoding the collected data, which is then uploaded to the server through the data sending unit 720.
In this embodiment, a set of complete structural diagrams of the server 600 and the mobile terminal 700 is provided, meanwhile, a user may input information of a video file to be queried in the external service system 800 according to the description information, and the server 600 may retrieve and display the corresponding video file according to the information input by the user.
In this specification, the term "plurality" means two or more unless explicitly defined otherwise. The terms "mounted," "connected," "fixed," and the like are to be construed broadly, and for example, "connected" may be a fixed connection, a removable connection, or an integral connection; "coupled" may be direct or indirect through an intermediary. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
In the description of the present specification, the description of the terms "one embodiment," "some embodiments," or the like, means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The above is only an example of the present invention, and is not intended to limit the present invention, and it is obvious to those skilled in the art that various modifications and variations can be made in the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A server-based information processing method for a server, the server-based information processing method comprising:
receiving multi-track data uploaded by a mobile terminal through WebRTC communication, wherein the multi-track data at least comprises field audio data, field video data and field environment data, and the field environment data is non-audio/video data;
storing the received multi-track data into at least one basic file according to a division rule, naming the basic files in sequence according to a preset naming rule, and storing the basic files to a preset position;
according to time, the field audio data, the field video data and the field environment data in each basic file are synchronously processed and stored in a correlated mode to form an intermediate synchronous file;
and performing preset processing on at least one intermediate synchronous file and then storing the intermediate synchronous file to form at least one final video file, wherein the preset processing comprises transcoding processing.
2. The server-based information processing method according to claim 1, wherein the multi-track data has attribute information including a source of the data, a formation period of the data, the server-based information processing method further comprising:
adding description information to each final video file according to the attribute information, the division rule, the preset naming rule and the preset position, wherein the description information comprises at least one of the size, name, source, time period and file storage path of the final video file.
3. The server-based information processing method according to claim 1, further comprising:
generating a retrieval or viewing instruction according to the acquired user input information;
when the retrieval or viewing instruction is received, the final video file matched with the retrieval or viewing instruction is found;
and synchronously displaying the matched final video file.
4. The server-based information processing method according to claim 1, wherein the step of performing a preset process on at least one of the intermediate synchronization files comprises:
transcoding data in at least one intermediate synchronization file to form a transcoded file;
and synchronously synthesizing the audio data and the video data in the transcoding file to form first synthesized data.
5. The server-based information processing method according to claim 1, wherein the step of storing the received multi-track data into at least one elementary file according to a division rule comprises:
storing the multi-track data received in the current time period once every other preset time length to form a basic file;
and when a data acquisition stopping instruction is detected, directly storing the multi-track data which has been received in the current time period into one basic file.
6. The server-based information processing method according to claim 1, further comprising:
and monitoring the network state of the mobile terminal, generating a data acquisition stopping instruction when the mobile terminal is monitored to have a preset abnormal condition, and stopping data acquisition processing of the mobile terminal.
7. An information processing method based on a mobile terminal, which is used for the mobile terminal, is characterized by comprising the following steps:
performing WebRTC communication connection with a server;
and uploading the collected multi-track data to the server through WebRTC communication, wherein the multi-track data at least comprises field audio data, field video data and field environment data, and the field environment data is non-audio/video data.
8. A server, comprising:
the first communication unit is used for carrying out WebRTC communication with a mobile terminal so as to receive multi-track data uploaded by the mobile terminal;
the database is used for storing data and files;
the human-computer interaction unit is used for acquiring input information of a user and displaying the information;
processing unit comprising a storage unit having stored therein a computer program and a processor for implementing the steps of the server-based information processing method according to any one of claims 1 to 6 when executing the computer program.
9. A mobile terminal, comprising:
the audio and video acquisition unit is used for acquiring the field audio data and the field video data of the position of the mobile terminal;
the environment data acquisition group is used for acquiring field environment data of the position of the mobile terminal, and the field environment data comprises at least one of field temperature data, humidity data and longitude and latitude;
the second communication unit is used for carrying out WebRTC communication with the server;
and the processing unit is used for controlling the audio and video acquisition unit and the environment data acquisition group to work and controlling the second communication unit to upload multi-track data to a server, wherein the multi-track data comprises the field audio data, the field video data and the field environment data.
10. The mobile terminal according to claim 9,
the mobile terminal also comprises a data coding unit which is connected with the environmental data acquisition group and used for summarizing and coding the field environmental data acquired by the acquisition devices;
the environment data acquisition group comprises a plurality of acquisition devices, and the acquisition devices are connected with the data coding unit and used for acquiring the field environment data.
CN202110977698.2A 2021-08-24 2021-08-24 Information processing method, server and mobile terminal Pending CN113766279A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110977698.2A CN113766279A (en) 2021-08-24 2021-08-24 Information processing method, server and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110977698.2A CN113766279A (en) 2021-08-24 2021-08-24 Information processing method, server and mobile terminal

Publications (1)

Publication Number Publication Date
CN113766279A true CN113766279A (en) 2021-12-07

Family

ID=78791056

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110977698.2A Pending CN113766279A (en) 2021-08-24 2021-08-24 Information processing method, server and mobile terminal

Country Status (1)

Country Link
CN (1) CN113766279A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150022666A1 (en) * 2013-07-22 2015-01-22 Intellivision Technologies Corp. System and method for scalable video cloud services
CN105933664A (en) * 2016-06-01 2016-09-07 胡渐佳 System and method for synchronously displaying environment information in video
CN106097225A (en) * 2016-06-17 2016-11-09 北京华风创新网络技术有限公司 Weather information timely dissemination method and system based on mobile terminal
CN106507172A (en) * 2016-11-30 2017-03-15 微鲸科技有限公司 Information coding method, coding/decoding method and device
WO2017118320A1 (en) * 2016-01-04 2017-07-13 努比亚技术有限公司 Mobile terminal, video processing apparatus and method thereof
CN107067159A (en) * 2017-03-09 2017-08-18 深圳华博高科光电技术有限公司 Smart city management and dispatching plateform system
WO2019024919A1 (en) * 2017-08-03 2019-02-07 腾讯科技(深圳)有限公司 Video transcoding method and apparatus, server, and readable storage medium
CN111629175A (en) * 2020-04-13 2020-09-04 中国能源建设集团广东省电力设计研究院有限公司 Video environment monitoring system of transformer substation
CN112905734A (en) * 2020-12-01 2021-06-04 厦门卫星定位应用股份有限公司 Data storage method, device, server and computer readable storage medium


Similar Documents

Publication Publication Date Title
US20220215748A1 (en) Automated camera response in a surveillance architecture
CN201601788U (en) Audio-video remote real-time vehicle monitoring system based on 3G mobile communication network
CN1254972C (en) Intelligent video content monitoring system based on IP network
CN101022540A (en) Video monitoring system and method under server/customer end constitution
CN103853143A (en) Long-distance wireless monitoring network system applied in power transmission line of power supply system
CN102801978A (en) System for video data acquisition and management based on cloud storage
CN103079050A (en) Vehicle-mounted networked audio/video monitoring system
CN201436810U (en) Mobile video monitoring system
CN102984498A (en) Integrated monitor management method and system for achieving data and image two-way linkage
CN115499460A (en) Converged communication system based on emergency command center
CN103559808A (en) Offshore sea ship traffic monitoring and pre-warning system based on 3G
CN103442205A (en) Monitoring system commonly used for power distribution cabinet
CN2724334Y (en) Cell phone video frequency image monitor
CN110855948B (en) Rail transit construction safety monitoring system
CN103647940A (en) Intelligent monitoring mobile phone, remote video monitoring system and monitoring method thereof
CN104113725A (en) Mobile monitoring alarm method and mobile monitoring alarm device
CN202713535U (en) Video and audio monitoring network system
CN113766279A (en) Information processing method, server and mobile terminal
CN201360312Y (en) Monitoring system based on embedded Web video server
CN112601052A (en) Video resource integration system applied to internal sharing platform
WO2017169149A1 (en) Surveillance camera system and surveillance method
CN101378496A (en) Highgrade integration management system for monitoring remote video dynamically
CN202197369U (en) Private network fire fighting remote image monitoring system based on CDMA1X network
CN111901560A (en) Remote detection video acquisition and monitoring system for static load test
CN203057346U (en) Public place emergent event handling system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination