CN101272397A - Method for acquiring addressable stream media based on ASF data amalgamation technology - Google Patents

Method for acquiring addressable stream media based on ASF data amalgamation technology

Info

Publication number
CN101272397A
CN101272397A CN200810024763A CNA2008100247634A
Authority
CN
China
Prior art keywords
information
video
audio
asf
spatial orientation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2008100247634A
Other languages
Chinese (zh)
Other versions
CN101272397B (en)
Inventor
闾国年
丰江帆
刘学军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changshu Zijin Intellectual Property Service Co ltd
Original Assignee
Nanjing Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Normal University filed Critical Nanjing Normal University
Priority to CN2008100247634A priority Critical patent/CN101272397B/en
Publication of CN101272397A publication Critical patent/CN101272397A/en
Application granted granted Critical
Publication of CN101272397B publication Critical patent/CN101272397B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention discloses a method for acquiring locatable streaming media based on ASF data fusion technology. The steps are: audio, video and spatial orientation information are acquired and transmitted to a computer on which the Microsoft WME development kit is installed; an ASF streaming media file is used as the container for locatable digital video acquisition and encoding, and a script command object is added in real time to the header object of the ASF file to store the spatial orientation information; the video and audio information is compression-encoded in MPEG-4 format and stored in the ASF data object; using the internal time axis of the ASF file and the time-domain constraint relationships between the video, audio and spatial orientation information, reference points on the internal time axis are selected to synchronize the video, audio and spatial orientation information, so that the three are fused automatically in real time and a locatable streaming media file is formed. With the method of the invention, the spatial orientation information is added to the ASF container and recorded there, and no audio channel needs to be occupied, so the audio information of the locatable video is preserved.

Description

Method for acquiring locatable streaming media based on ASF data fusion technology
Technical field
The present invention relates to a method for integrating video/audio information with spatial information to acquire locatable streaming media, and specifically to a method for acquiring locatable streaming media based on ASF data fusion technology.
Background technology
In recent years, multimedia technology has begun to attract attention in the GIS field. As is well known, the geographic space described by video has a strong sense of reality and provides easily understood geospatial information; it lets people use information in a more natural way, without complex geometric modeling, and has therefore become a new direction of development in the GIS field. However, video itself does not carry the position and attribute information of ground objects, so it can usually only be stored and retrieved as a kind of attribute information of spatial entities, and it is difficult to establish a mapping between the georeferencing system of GIS and the video image space. This severely limits its applications.
In practice, some scholars have attempted to integrate video information with spatial information. Berry proposed a video map system in the article "Capture 'Where' and 'When' on Video-Based GIS" (Geoworld, 2000(9): 26-27), in which accurate position and time data are recorded in one audio channel of the video, together with a scheme for field data acquisition, indoor processing and practical application. "Video-image-based GIS for existing railway lines" (Tang Bing, Zhou Meiyu, Railway Computer Application, 2001, 10(11)) proposed a technical scheme and implementation for integrating railway video images with spatial information, establishing a mapping between mileage and video, and in particular discussed how to perform mileage correction; but because no linear reference frame was established, the computation scheme is complex. In the VideoGIS project led by Navarrete and Blat (VideoGIS: segmenting and indexing video based on geographic information, 5th AGILE Conference on Geographic Information Science, Palma, Spain, April 25th-27th, 2002), video images and spatial information are combined by building a geographic index of video clips, generating hypervideo that can be retrieved in a geographic environment. Hwang et al. proposed using MPEG-7 metadata for location-dependent services (Hwang T H, Choi K H, Joo I H, Lee J H. MPEG-7 metadata for video-based GIS applications. IGARSS'03 Proceedings, Toulouse, France, 21-25 July 2003); Joo et al. started from the concept of video image metadata to realize cross-referencing between maps and video images in support of interactive GIS-video operation (Joo I H, Hwang T H, Choi K H. Generation of video metadata supporting video-GIS integration. ICIP'04 Proceedings, Singapore, Oct 24-27, 2004). Kong Yunfeng proposed a basic data model for a highway video GIS, integrating geographic position (XY), highway mileage (M), and video time or frame (T) data (Kong Yunfeng, Design and implementation of a highway video GIS, Highway, 2007(1)). In recent years, Beijing Century Lotoo Digital Technology Co., Ltd. has independently developed a hardware/software system for video acquisition plus wireless transmission of GPS information (http://www.lotoo.cn.c-ps.net, 2008-4-24), used in wireless monitoring fields such as integrated urban law enforcement, police patrol and the digital city; the supervision center can not only monitor and manage the positions of law enforcement officers through the GPS information, but can also learn the field situation through the video.
In summary, existing schemes for integrating video information with spatial orientation information mainly fall into two categories: (1) storing the spatial orientation information in an audio channel of the video file; (2) recording the association between the spatial information and the video frame time as metadata in a separate file, and using it to relate the spatial information to the video file. Although both schemes can establish an association between video information and spatial orientation information, they have the following shortcomings. 1. In the first scheme, expensive professional equipment (a GPS satellite receiver, modem and professional video recorder) must be used during video acquisition to modulate spatial orientation information such as spatial position (latitude and longitude coordinates) and bearing into audio in real time and store it in a sound channel of the video data; this increases the encoding burden of the acquisition terminal, requires dedicated software for post-processing to demodulate the spatial information stored in the audio channel, is highly specialized and complicated to operate, and in addition the video file loses its audio information. 2. Although the second scheme is simpler to implement than the first, it requires an external index file to establish the time-domain constraint relationship between the video, audio and spatial orientation information, and the separate storage of video and spatial information is unfavorable for management. 3. Although Lotoo's system combines spatial orientation information with video information well, its hardware/software system must be purchased, its main users are government regulators, and it has not been popularized for general use.
Microsoft released the Advanced Streaming Format (ASF) in 1997; its purpose is to provide a foundation for multimedia interoperability on an industrial scale. Microsoft defines ASF as a unified container file format for synchronized multimedia. It is a data format that can effectively organize multimedia information such as audio, video, images and control command scripts, and synchronization among the media is achieved through a common internal time axis of the ASF file. Logically, an ASF file consists of three top-level objects: the header object (Header Object), the data object (Data Object) and the index object (Index Object); its internal structure is shown in Fig. 1. The header object is mandatory and must be placed at the beginning of the ASF file; it provides global information about the multimedia data stored in the data object, and the script command object within it plays a key role in synchronizing the ASF file with other media. The data object is also mandatory, generally follows immediately after the header object, and contains all the multimedia data of the ASF file. The index object is optional; it contains a time-based index of the multimedia data embedded in the ASF file, is very useful for random, timeline-based playback, and must be placed after all other objects. The ASF file synchronizes the various media with a time-axis-based synchronization model. Owing to the design purpose of ASF and the characteristics of the file format, synchronization among the multimedia inside an ASF file is handled well, so different types of media information can be integrated into an ASF file and synchronized through the unified internal time axis of the file. ASF data fusion technology thus provides the technical foundation for the present invention.
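For orientation only, the logical layout described above can be sketched as the following simplified C# types (a conceptual sketch, not the binary layout defined by the ASF specification; all type and member names are invented for this illustration):
using System;
using System.Collections.Generic;
// Conceptual sketch of the logical ASF layout used by this method, not the real binary format.
class AsfFile
{
    public HeaderObject Header = new HeaderObject(); // mandatory, placed at the start of the file
    public DataObject Data = new DataObject();       // mandatory, holds all multimedia packets
    public IndexObject Index = null;                 // optional, time-based index for random playback
}
class HeaderObject
{
    // Each script command pairs a time on the shared timeline with a command string;
    // in this method the command string carries the spatial orientation (GPS) information.
    public List<(TimeSpan Time, string Type, string Command)> ScriptCommands
        = new List<(TimeSpan, string, string)>();
}
class DataObject  { /* time-stamped audio/video packets on the shared timeline */ }
class IndexObject { /* mapping from presentation time to packet position */ }
class AsfSketchDemo
{
    static void Main()
    {
        var asf = new AsfFile();
        // Record one GPS fix five seconds into the stream, in a user-defined text format.
        asf.Header.ScriptCommands.Add((TimeSpan.FromSeconds(5), "GPS", "31.974347;118.806228"));
        Console.WriteLine(asf.Header.ScriptCommands.Count);
    }
}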
Summary of the invention
The technical problem to be solved by the present invention is, in order to overcome the shortcomings of prior-art methods that add spatial orientation information to video by real-time audio encoding and therefore occupy an audio channel, and with popular adoption in mind, to propose a method for acquiring locatable streaming media based on ASF data fusion technology. The method fuses video information, audio information and spatial orientation information automatically in real time, forming a truly locatable streaming media file.
The basic idea of the present invention is as follows: video/audio information is acquired with a digital video camera, spatial orientation information is acquired with a GPS receiver, and ASF data fusion technology is then used to merge the spatial orientation information and the audio/video information into one streaming media file. With this scheme, only an ordinary digital video camera (or webcam), a notebook computer and a GPS satellite receiver are needed to add spatial orientation information to the captured video. The main implementation principle is: an ASF file is used as the container for locatable digital video acquisition and encoding, and a script command object is added in real time to the header object of the ASF file to store the spatial orientation information received by the GPS satellite receiver. The video information and audio information are compression-encoded in MPEG-4 format. Finally, the video information, audio information and spatial orientation information are organized inside the ASF container. Using the internal time axis of the ASF file and the time-domain constraint relationships between the video, audio and spatial orientation information, reference points are chosen on the internal time axis to synchronize the video, audio and spatial orientation information, realizing real-time automatic fusion of the three and forming a truly locatable streaming media file.
The basic steps of acquiring locatable streaming media based on ASF data fusion technology according to the present invention are:
Step 1, information acquisition: digital camera equipment (for example, an ordinary digital video camera or webcam) is used to acquire the video/audio information, and a GPS receiver is used to acquire the spatial orientation information;
Step 2, ASF-based information fusion: the video/audio information and spatial orientation information are transferred, through the external device interfaces of the computer, to a computer on which the Microsoft WME (Windows Media Encoder) development kit is installed; the ASF streaming media file of the development kit is used as the container for locatable digital video acquisition and encoding, and a script command object is added in real time to the header object of the ASF file to store the spatial orientation information; the video information and audio information are compression-encoded in MPEG-4 format and stored in the ASF data object; using the time-domain constraint relationships between the internal time axis of the ASF file and the video, audio and spatial orientation information, reference points are chosen on the internal time axis to synchronize the video, audio and spatial orientation information, realizing real-time automatic fusion of the three and forming a truly locatable streaming media file.
The concrete steps of the ASF-based information fusion of step 2 are:
(1) Start the main thread, and use the IWMEncSourceGroupCollection interface of the WME development kit to create a multimedia source group, which is equivalent to creating a container; then use the IWMEncSource interface to create three objects: the video information object, the audio information object and the spatial orientation information script object;
(2) When the main thread starts, create a new thread to obtain the GPS positioning information in real time (a minimal sketch of such a thread is given after this list);
(3) The encoding program captures the video information, audio information and GPS positioning information collected by the digital camera equipment and the GPS receiver; the video and audio bit rates are set according to the user's application needs, the WMEncoder interface is then used to compression-encode the video/audio in MPEG-4 format, and the GPS positioning information, returned by the GPS receiver in its standard format, is recorded in a user-defined text format;
(4) The video/audio compression results and the GPS information are stored respectively in the three objects mentioned above (video information, audio information and spatial orientation information script); synchronization among the three objects is completed automatically inside the development kit, thus forming a locatable video ASF file.
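The following is a minimal sketch of the GPS acquisition thread of step (2), assuming the GPS receiver appears to the computer as a serial port; the port name, baud rate and sentence filtering are illustrative assumptions rather than part of the invention:
using System;
using System.IO.Ports;
using System.Threading;
class GpsReader
{
    // "COM5" and 4800 baud are illustrative; the actual port and rate depend on the receiver.
    private readonly SerialPort port = new SerialPort("COM5", 4800);
    public volatile string LatestSentence = "";   // most recent NMEA-0183 sentence received
    public void Start()
    {
        port.Open();
        new Thread(ReadLoop) { IsBackground = true }.Start();
    }
    private void ReadLoop()
    {
        while (port.IsOpen)
        {
            string line = port.ReadLine();        // one NMEA-0183 sentence, e.g. "$GPRMC,..."
            if (line.StartsWith("$GPRMC") || line.StartsWith("$GPGGA"))
                LatestSentence = line;            // the encoding loop packages this into the script object
        }
    }
    static void Main()
    {
        var reader = new GpsReader();
        reader.Start();
        Thread.Sleep(3000);                       // allow a few sentences to arrive
        Console.WriteLine(reader.LatestSentence);
    }
}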
The function of the IWMEncSourceGroupCollection interface in (1) is to hold an enumerable collection of source groups. A source group contains the synchronized media streams to be encoded; it must contain an audio stream, and may additionally contain a video stream or a script stream. The user can create multiple source groups, but only one source group can be encoded at any one time.
The function of the IWMEncSource interface in (1) is to manage a specific media stream obtained from an external device and to add it to the source group as an object.
The function of the WMEncoder interface in (3) is to create or destroy an encoding process.
The standard format returned by the GPS receiver in (3) is NMEA-0183. The NMEA protocol was formulated by the National Marine Electronics Association (NMEA) to establish a unified RTCM (Radio Technical Commission for Maritime Services) standard across different GPS navigation devices; it has three formats, 0180, 0182 and 0183, of which 0183 is currently the most widely used GPS data format.
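As an illustration of parsing such sentences, the sketch below converts the latitude/longitude fields of an NMEA-0183 $GPRMC sentence to decimal degrees; the sample sentence and the field handling are illustrative only:
using System;
using System.Globalization;
static class NmeaExample
{
    // Convert an NMEA "ddmm.mmmm"/"dddmm.mmmm" field plus hemisphere letter to decimal degrees.
    static double ToDegrees(string field, string hemisphere)
    {
        double value = double.Parse(field, CultureInfo.InvariantCulture);
        double degrees = Math.Floor(value / 100.0);
        double minutes = value - degrees * 100.0;
        double result = degrees + minutes / 60.0;
        return (hemisphere == "S" || hemisphere == "W") ? -result : result;
    }
    static void Main()
    {
        // Illustrative $GPRMC sentence: time, status, latitude, N/S, longitude, E/W, speed, course, date, ...
        string sentence = "$GPRMC,024813.640,A,3158.4608,N,11848.3737,E,10.05,324.27,050508,,,A*55";
        string[] f = sentence.Split(',');
        double lat = ToDegrees(f[3], f[4]);       // about 31.974 degrees north
        double lon = ToDegrees(f[5], f[6]);       // about 118.806 degrees east
        Console.WriteLine($"lat={lat:F6}, lon={lon:F6}");
    }
}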
Locating the locatable streaming media on an electronic map: a player that supports ASF streaming media is used to play the locatable streaming media; while the video is played, the GPS positioning information of the corresponding moment is extracted and the position where the locatable streaming media was shot is displayed on the electronic map.
The ASF streaming media player is a multimedia player capable of playing the ASF format, such as Windows Media Player.
In the method of the invention, the spatial orientation information is added to the ASF container and recorded in a user-defined format by adding a script command object to the header object of the ASF file, without occupying an audio channel, so the audio information of the locatable video is preserved. A unified time axis is used to realize real-time fusion of the video information, audio information and spatial orientation information, forming a truly locatable video stream. No audio channel is occupied, the spatial orientation information is embedded automatically and losslessly, re-encoding of the spatial information is avoided, and the video gains a locating capability.
Description of drawings
Fig. 1 is the internal structure of an ASF file;
Fig. 2 is a schematic flowchart of the method of the present invention for acquiring locatable streaming media based on ASF data fusion technology;
Fig. 3 is a schematic diagram of the fusion of spatial orientation information with video/audio information;
Fig. 4 is a flowchart of playing a locatable video ASF file.
Embodiment
The present invention is described in further detail below with reference to the drawings and an embodiment.
Embodiment:
Step 1: equipment preparation. Prepare a Sony UX17 portable notebook and a Rikaline GPS-6033 Bluetooth GPS satellite receiver. The Sony UX17 notebook has a built-in camera, built-in recording equipment and a Bluetooth communication function; the GPS receiver also has a Bluetooth communication function.
Step 2: locatable streaming media fusion.
1) Run the ASF video encoding program on the Sony portable notebook, initialize the camera and recording equipment, enable the Bluetooth function on the notebook, and turn on the Bluetooth GPS receiver.
2) The ASF video encoding program starts to run; with specific reference to Fig. 3, the steps are as follows:
(1) Start the main thread, and use the IWMEncSourceGroupCollection interface of the WME development kit to create a multimedia source group, which is equivalent to creating a container; then use the IWMEncSource interface to create three objects: the video information object, the audio information object and the spatial orientation information script object.
(2) When the main thread starts, create a new thread to obtain the GPS positioning information in real time.
(3) The encoding program captures the video information, audio information and GPS positioning information collected by the camera, recording equipment and GPS receiver. The video and audio bit rates are set according to the user's application needs (in this example the video bit rate is 300 kbps and the audio bit rate is 48 kbps); the WMEncoder interface is then used to compression-encode the video/audio in MPEG-4 format, and the GPS positioning information, returned by the Rikaline GPS-6033 Bluetooth GPS satellite receiver in the NMEA-0183 standard format, is recorded in a user-defined text format.
(4) The video/audio compression results and the GPS information text are stored respectively in the three objects mentioned above (video information, audio information and spatial orientation information script). Synchronization among the three objects is completed automatically inside the development kit, thus forming a locatable streaming media ASF file.
The main pseudocode of the program for steps (1)-(4) above is as follows (the quoted strings are descriptive placeholders for the actual devices and inputs):
IWMEncSourceGroupCollection SrcGrpColl;
// declare a source group collection named SrcGrpColl through IWMEncSourceGroupCollection
WMEncoder Encoder = new WMEncoder();
// instantiate an encoder named Encoder
SrcGrp = SrcGrpColl.Add("SourceGroup");
// add a source group instance SrcGrp to the collection SrcGrpColl
SrcVid = SrcGrp.AddSource("video capture device");
SrcVid.SetInput("video information obtained from the video capture device");
// obtain a video data source named SrcVid
SrcAud = SrcGrp.AddSource("audio capture device");
SrcAud.SetInput("audio information obtained from the audio capture device");
// obtain an audio data source named SrcAud
SrcScript = SrcGrp.AddSource("Bluetooth GPS device connected by serial communication");
SrcScript.SetInput("GPS data obtained through serial communication");
// obtain a script data source named SrcScript
Encoder.Start();
// fuse the video, audio and spatial orientation information
The user-defined format in (4) refers to the following: the GPS information includes parameters such as time, longitude and latitude, geodetic height, moving speed, moving direction, number of satellites, satellite numbers, signal condition of each satellite, and the elevation angle and azimuth of each satellite; all of these can be obtained by parsing the NMEA-0183 sentences (Cao Tingting, Gao Yu, Application of the NMEA-0183 protocol in GPS, Electronics Engineer, 2006, 32(10): 8-11). Different users need different parameters, so the user can filter out the useful parameter information and store it in the script object in a self-defined format (for example with the parameters separated by commas or semicolons). When the GPS parameters are extracted later, they are parsed according to the same user-defined format.
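The sketch below illustrates one possible user-defined format of this kind, packing a selected subset of parameters into a semicolon-separated command string and parsing it back; the chosen fields and their order are assumptions made only for illustration:
using System;
using System.Globalization;
static class GpsScriptFormat
{
    // Pack the parameters selected by the user into one semicolon-separated script command.
    // The field set and order (time;lat;lon;height;speed;course) are an illustrative choice.
    public static string Pack(DateTime utc, double lat, double lon, double height, double speed, double course)
        => string.Join(";", utc.ToString("HHmmss"),
                            lat.ToString("F6", CultureInfo.InvariantCulture),
                            lon.ToString("F6", CultureInfo.InvariantCulture),
                            height.ToString("F1", CultureInfo.InvariantCulture),
                            speed.ToString("F1", CultureInfo.InvariantCulture),
                            course.ToString("F1", CultureInfo.InvariantCulture));
    // Parse the same format back when the script command is extracted during playback.
    public static (double lat, double lon) Unpack(string command)
    {
        string[] f = command.Split(';');
        return (double.Parse(f[1], CultureInfo.InvariantCulture),
                double.Parse(f[2], CultureInfo.InvariantCulture));
    }
    static void Main()
    {
        string cmd = Pack(DateTime.UtcNow, 31.974347, 118.806228, 12.0, 18.6, 324.3);
        var (lat, lon) = Unpack(cmd);
        Console.WriteLine($"{cmd} -> {lat:F6}, {lon:F6}");
    }
}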
Locating the locatable streaming media on an electronic map: with specific reference to Fig. 4, the steps are as follows:
(1) Open the locatable streaming media ASF file with the Windows Media Player and play it. (2) Check whether the current playback time of the streaming media is consistent with the trigger time of a command in the script command object of the ASF file; if it is, extract the GPS information of that moment from the script command object, obtain the current position coordinates by parsing the GPS format, and display them on the corresponding electronic map. (3) If it is not, check whether the end of the file has been reached; if so, go to (4) and stop playing; if not, continue playing the video and return to (2). (4) End the playback of the file.
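The following is a minimal sketch of step (2) of playback, assuming the player delivers each triggered script command to a callback (for example, the ScriptCommand event of the Windows Media Player control); the command type "GPS", the field order and the map-update call are illustrative assumptions:
using System;
using System.Globalization;
class PlaybackSync
{
    // Called whenever the player reports a triggered script command while the locatable ASF file plays.
    // How this is wired up depends on the player; the Windows Media Player control, for example,
    // raises a ScriptCommand event that supplies the command type and command text.
    public void OnScriptCommand(string type, string command)
    {
        if (type != "GPS") return;                // command type "GPS" is an assumption of this sketch
        string[] f = command.Split(';');          // user-defined format: time;lat;lon;height;speed;course
        double lat = double.Parse(f[1], CultureInfo.InvariantCulture);
        double lon = double.Parse(f[2], CultureInfo.InvariantCulture);
        ShowOnMap(lat, lon);
    }
    // Placeholder: a real implementation would move a marker on the electronic map.
    private void ShowOnMap(double lat, double lon)
        => Console.WriteLine($"Current shooting position: {lat:F6}, {lon:F6}");
    static void Main()
    {
        // Simulate the player delivering one script command during playback.
        new PlaybackSync().OnScriptCommand("GPS", "024813;31.974347;118.806228;12.0;18.6;324.3");
    }
}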

Claims (3)

1. A method for acquiring locatable streaming media based on ASF data fusion technology, comprising the steps of:
Step 1, information acquisition: digital camera equipment is used to acquire the video/audio information, and a GPS receiver is used to acquire the spatial orientation information;
Step 2, ASF-based information fusion: the video/audio information and spatial orientation information are transferred, through the external device interfaces of the computer, to a computer on which the Microsoft WME development kit is installed; the ASF streaming media file of the development kit is used as the container for locatable digital video acquisition and encoding, and a script command object is added in real time to the header object of the ASF file to store the spatial orientation information; the video information and audio information are compression-encoded in MPEG-4 format and stored in the ASF data object; using the time-domain constraint relationships between the internal time axis of the ASF file and the video, audio and spatial orientation information, reference points are chosen on the internal time axis to synchronize the video, audio and spatial orientation information, realizing real-time automatic fusion of the three and forming a locatable streaming media file.
2. The method for acquiring locatable streaming media based on ASF data fusion technology according to claim 1, characterized in that the concrete steps of the ASF-based information fusion of step 2 are:
(a) Start the main thread, and use the IWMEncSourceGroupCollection interface of the WME development kit to create a multimedia source group, which is equivalent to creating a container; then use the IWMEncSource interface to create three objects: the video information object, the audio information object and the spatial orientation information script object;
(b) When the main thread starts, create a new thread to obtain the GPS positioning information in real time;
(c) The encoding program captures the video information, audio information and GPS positioning information collected by the digital camera equipment and the GPS receiver; the video and audio bit rates are set according to the user's application needs, the WMEncoder interface is then used to compression-encode the video/audio in MPEG-4 format, and the GPS positioning information, returned by the GPS receiver in its standard format, is recorded in a user-defined text format;
(d) The video/audio compression results and the GPS information are stored respectively in the three objects mentioned above (video information, audio information and spatial orientation information script); synchronization among the three objects is completed automatically inside the development kit, thus forming a locatable streaming media file.
3. The method for acquiring locatable streaming media based on ASF data fusion technology according to claim 1 or 2, characterized in that the fusion of the video, audio and spatial orientation information is realized without losing the audio information.
CN2008100247634A 2008-05-05 2008-05-05 Method for acquiring addressable stream media based on ASF data amalgamation technology Expired - Fee Related CN101272397B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2008100247634A CN101272397B (en) 2008-05-05 2008-05-05 Method for acquiring addressable stream media based on ASF data amalgamation technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2008100247634A CN101272397B (en) 2008-05-05 2008-05-05 Method for acquiring addressable stream media based on ASF data amalgamation technology

Publications (2)

Publication Number Publication Date
CN101272397A true CN101272397A (en) 2008-09-24
CN101272397B CN101272397B (en) 2010-11-10

Family

ID=40006082

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2008100247634A Expired - Fee Related CN101272397B (en) 2008-05-05 2008-05-05 Method for acquiring addressable stream media based on ASF data amalgamation technology

Country Status (1)

Country Link
CN (1) CN101272397B (en)


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101547360B (en) * 2009-05-08 2010-11-10 南京师范大学 Localizable video file format and method for collecting data of formatted file
CN102256154A (en) * 2011-07-28 2011-11-23 中国科学院自动化研究所 Method and system for positioning and playing three-dimensional panoramic video
CN105282110A (en) * 2014-07-09 2016-01-27 北京合众思壮科技股份有限公司 Mobile phone streaming media transmission method fusing GPS positioning information and system thereof
CN106998476A (en) * 2017-04-06 2017-08-01 南京三宝弘正视觉科技有限公司 A kind of video inspection method and device based on GIS-Geographic Information System
CN106998476B (en) * 2017-04-06 2020-06-30 南京三宝弘正视觉科技有限公司 Video viewing method and device based on geographic information system
CN107452409A (en) * 2017-08-16 2017-12-08 柳州桂通科技股份有限公司 information recording method, device, system, storage medium and processor
CN107452409B (en) * 2017-08-16 2024-04-26 柳州桂通科技股份有限公司 Information recording method, apparatus, system, storage medium, and processor
CN107766571A (en) * 2017-11-08 2018-03-06 北京大学 The search method and device of a kind of multimedia resource
CN108389281A (en) * 2018-03-17 2018-08-10 广东容祺智能科技有限公司 A kind of unmanned plane cruising inspection system with voice record function
CN111754654A (en) * 2020-05-27 2020-10-09 广州亚美智造科技有限公司 Vehicle-mounted video storage method, reading method, storage device and reading device

Also Published As

Publication number Publication date
CN101272397B (en) 2010-11-10

Similar Documents

Publication Publication Date Title
CN101272397B (en) Method for acquiring addressable stream media based on ASF data amalgamation technology
CN101547360B (en) Localizable video file format and method for collecting data of formatted file
US20180322197A1 (en) Video data creation and management system
CN101867730B (en) Multimedia integration method based on user trajectory
CN102890699B (en) The GEOGRAPHICAL INDICATION of audio recording
CN101694669B (en) Pace note making method, device thereof, pace note making and sharing system
CN102884400B (en) Messaging device, information processing system and program
CN102289520A (en) Traffic video retrieval system and realization method thereof
CN107810531A (en) Data handling system
WO2012115593A1 (en) Apparatus, system, and method for annotation of media files with sensor data
US20150155009A1 (en) Method and apparatus for media capture device position estimate- assisted splicing of media
CN102680992A (en) Method for utilizing video files containing global positioning system (GPS) information to synchronously determine movement track
CN102033874A (en) Management system for recording and playing travel information back in real time and implementation device thereof
CN105675003A (en) Route generation and sharing method and device, route point adding method and device as well as route navigation method and device
CN108139227A (en) For video-graphics, selection and synchronous location based service tool
WO2019028393A1 (en) Methods and systems for detecting and analyzing a region of interest from multiple points of view
CN109471141A (en) A kind of method of mobile phone record daily life and motion profile
CN204733253U (en) A kind of real-time synchronization incorporates barometer, locating information to the video recording system in video
CN106534688A (en) Watermarked photo acquisition method and mobile terminal
TW201242368A (en) Object track tracing system of intellegent image monitoring system
CN103747230A (en) Dynamic positioning video electronic map projection system and method
CN104426937A (en) Method for locating content recorded by other equipment through mobile phone and cloud computation
CN204741517U (en) Real -time synchronization integrates into program recording system of longitude and latitude coordinate information to video in
CN102654848B (en) A kind of method and device realized mobile terminal and kept a diary automatically
Xiu et al. Information management and target searching in massive urban video based on video-GIS

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
ASS Succession or assignment of patent right

Owner name: CHANGSHU ZIJIN INTELLECTUAL PROPERTY SERVICE CO.,

Free format text: FORMER OWNER: NANJING NORMAL UNIVERSITY

Effective date: 20121211

C41 Transfer of patent application or patent right or utility model
COR Change of bibliographic data

Free format text: CORRECT: ADDRESS; FROM: 210046 NANJING, JIANGSU PROVINCE TO: 215500 SUZHOU, JIANGSU PROVINCE

TR01 Transfer of patent right

Effective date of registration: 20121211

Address after: 215500 Changshou City South East Economic Development Zone, Jiangsu, Jin Road, No. 8

Patentee after: Changshu Zijin Intellectual Property Service Co.,Ltd.

Address before: Xianlin new town Yuen Road Qixia District Nanjing city Jiangsu province 210046 No. 1

Patentee before: Nanjing Normal University

CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20101110