CN104331929B - Scene of a crime restoring method based on video map and augmented reality - Google Patents

Scene of a crime restoring method based on video map and augmented reality

Info

Publication number
CN104331929B
Authority
CN
China
Prior art keywords
video
information
map
city
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410594135.5A
Other languages
Chinese (zh)
Other versions
CN104331929A (en)
Inventor
修文群
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN201410594135.5A priority Critical patent/CN104331929B/en
Publication of CN104331929A publication Critical patent/CN104331929A/en
Application granted granted Critical
Publication of CN104331929B publication Critical patent/CN104331929B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a plurality of remote sources

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Architecture (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to the field of urban safety management and discloses a crime scene reconstruction method based on a video map and augmented reality. The method comprises: establishing a coordinate-referenced city video surveillance network based on spatial position information, obtaining video that carries position information, and associating each video surveillance device with a data server; when a video target appears, obtaining the geographic coordinate values of the target in different frames of the city video surveillance network to form the target's motion trajectory; projecting the video and the video target onto an electronic map to form a three-dimensional city video map; building virtual objects from the real scene of the video and the video target on the city video map and fusing the virtual objects with that real scene; and displaying the augmented-reality information to the user, interacting with the user, and transforming the augmentation information accordingly. By combining the video map with augmented reality, the invention reproduces a crime scene intuitively, realistically and effectively, providing a sound basis for solving cases.

Description

Crime scene reconstruction method based on video map and augmented reality
Technical field
The present invention relates to the technical field of urban safety management, and more particularly to a crime scene reconstruction method based on a video map and augmented reality.
Background technology
Reconstructing a crime scene by technical means is of great significance for public security case handling, court evidence collection, and training and teaching. However, the methods currently used, such as conventional filming, photography and sketching, have clear limitations: they cannot intuitively express the spatial relationships between objects and require extensive written annotation; they cannot reflect the three-dimensional reality of the scene; they cannot respond to interactive control; and they are of little help in simulating a crime in which persons and scene must be reproduced together.
In the prior art, video data can be captured by video surveillance; through video mapping and coordinate referencing, surveillance video can be projected onto an electronic map according to its spatial position to form a video map, which supports interactive query by video content and spatial position as well as tracking and locating of target objects.
In addition, augmented reality combines computer graphics, image processing and machine vision; through display devices, graphics hardware, sensors, trackers and interaction tools, it augments a real scene with virtual images, video and text, fusing the real world with the virtual world so that the user can interact in real time as naturally as in the real environment.
Therefore, combining the video surveillance network with augmented reality makes it possible to show the real appearance and spatial relationships of a crime scene intuitively and to reconstruct the scene.
Summary of the invention
The present invention aims to overcome the deficiencies of existing crime scene reconstruction techniques, which cannot intuitively present the real appearance of a scene or the spatial relationships between objects, by providing a crime scene reconstruction method based on a video map and augmented reality.
To achieve the above object, the present invention adopts the following technical scheme. The crime scene reconstruction method based on a video map and augmented reality comprises the following steps:
S1. Based on spatial position information, establish a coordinate-referenced city video surveillance network, obtain video containing position information, and associate each video surveillance device with a data server;
S2. Obtain the geographic coordinate values of a video target in different frames of the city video surveillance network to form the motion trajectory of the video target;
S3. Based on the spatial position information, project the video and the video target onto an electronic map, and apply angle transformation and vertical stretching to the video and the video target to form a three-dimensional city video map;
S4. According to the real scene of the video and the video target on the city video map, build virtual objects containing augmentation information and fuse the virtual objects with the real scene on the city video map to perform augmented reality;
S5. Display the augmented-reality information to the user, interact with the user, and transform the augmentation information according to the interaction information.
In some embodiments, step S1 comprises the following steps:
S11. Obtain by surveying the geographic coordinates of each video surveillance device in the city and, taking each video surveillance device as the center and according to its monitoring range, establish the spatial coordinate system of the video surveillance network;
S12. Select feature points within the monitoring range of each video surveillance device and survey them to obtain the position information of other ground features, forming the coordinate-referenced video surveillance network;
S13. Associate each video surveillance device with the server and obtain video containing geographic location information.
In some embodiments, in step S11, surveying instruments such as a three-dimensional laser scanner, level, handheld GPS unit, steel tape, stopwatch and theodolite are used to survey the geographic coordinates of each video surveillance device.
In some embodiments, step S2 comprises:
S21. Obtain the geographic coordinate values of the video target from the pixel coordinates of the video target;
S22. Track the video target through the city video surveillance network to form the motion trajectory of the video target.
In some embodiments, step S3 comprises:
S31. Decompose the video containing geographic location information acquired by the video surveillance network and the video of the video target into a series of consecutive still-frame sets;
S32. Select key position points in the consecutive still-frame sets, and select video key frames along the decomposition path in the decomposed video;
S33. In street-view mode, with coordinates centered on the shooting point, project the shooting orientation of the video and the geographic position coordinates of the video in the spatial coordinate system of the surveillance network into spherical coordinates, generating a video stream space along the movement path;
S34. Display and query the video in the video stream space on the electronic map through the display screen as continuous, dynamic video.
In some embodiments, step S4 comprises:
S41. Obtain the video data of the real scene on the city map and, according to the video content of the real scene, build the augmentation information;
S42. Establish the transformation between the true geographic coordinates of the video and the virtual space coordinate system, fuse the virtual objects into the real scene of the video data at the corresponding positions, and display the fused information of the virtual objects and the real scene through a display device.
In some embodiments, step S5 comprises:
S51. Display the augmented-reality information to the user through a helmet-mounted display, stereoscopic glasses, a projection display, a mobile display or a computer monitor;
S52. Interact with the user through gaze tracking, clicking, moving and rotating the virtual objects, and gesture recognition, and transform the augmentation information according to the interaction information.
The beneficial effects of the present invention are as follows: by combining the video map with augmented reality, the crime scene is reproduced intuitively, realistically and effectively; multi-angle observation and local magnification are realized, and the spatial relationships between objects at the scene are expressed more intuitively, thereby providing a sound basis for solving cases.
Brief description of the drawings
Fig. 1 is a flow chart of the crime scene reconstruction method based on a video map and augmented reality according to the present invention;
Fig. 2 is a flow chart of a specific embodiment of the crime scene reconstruction method based on a video map and augmented reality according to the present invention.
Detailed description of the embodiments
In order to make the purpose, technical scheme and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described here are intended only to explain the present invention, not to limit it.
The idea of the present invention is as follows: first, an electronic map of the crime scene is established, and the video and the video target are projected into the electronic map; accurate registration is performed through control points, and the surveillance video is subjected to angle transformation and vertical stretching to give it a three-dimensional appearance. Then the positioned video is registered and augmented by the augmented reality module, and text information and operation buttons are added, so that the behavior of the offender and the victim is displayed intuitively.
Referring to Fig. 1, which is a flow chart of the crime scene reconstruction method based on a video map and augmented reality according to the present invention, the method is realized through the following steps. Step S1 is performed first: based on spatial position information, a coordinate-referenced city video surveillance network is established, video containing position information is obtained, and each video surveillance device is associated with a data server. Step S2 is performed: the geographic coordinate values of a video target in different frames of the city video surveillance network are obtained to form the motion trajectory of the video target. Step S3 is performed: based on the spatial position information, the video and the video target are projected onto an electronic map, and the video and the video target are subjected to angle transformation and vertical stretching to form a three-dimensional city video map. Step S4 is performed: according to the real scene of the video and the video target on the city video map, virtual objects containing augmentation information are built, and the virtual objects are fused with the real scene on the city video map to perform augmented reality. Step S5 is performed: the augmented-reality information is displayed to the user, the user interacts with it, and the augmentation information is transformed according to the interaction information.
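For orientation, the following outline shows how the five steps S1 to S5 could be organized in software. It is a minimal sketch only: the patent describes a workflow rather than an API, so every function name and data structure below (build_surveillance_network, track_target, and so on) is hypothetical and the bodies are placeholders.

```python
# Illustrative outline of steps S1-S5; all names are hypothetical, not part of the patent.

def build_surveillance_network(cameras, data_server):
    """S1: survey camera coordinates, register feature points, and associate each
    camera with the data server, returning a coordinate-referenced network."""
    return {"cameras": cameras, "server": data_server}

def track_target(network, target_id):
    """S2: collect the target's per-frame geographic coordinates into a trajectory."""
    return []

def build_city_video_map(network, trajectory, electronic_map):
    """S3: project video and target onto the electronic map with angle transformation
    and vertical stretching, producing the three-dimensional city video map."""
    return {"map": electronic_map, "trajectory": trajectory}

def fuse_augmentation(video_map):
    """S4: build virtual objects carrying augmentation information and fuse them
    with the real scene on the city video map."""
    return {"scene": video_map, "virtual_objects": []}

def present_and_interact(ar_scene, display):
    """S5: show the augmented scene on the chosen display and update the
    augmentation information in response to user interaction."""
    pass

def reconstruct_scene(cameras, data_server, electronic_map, target_id, display):
    network = build_surveillance_network(cameras, data_server)
    trajectory = track_target(network, target_id)
    video_map = build_city_video_map(network, trajectory, electronic_map)
    ar_scene = fuse_augmentation(video_map)
    present_and_interact(ar_scene, display)
```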
Referring to Fig. 2, which is a flow chart of a specific embodiment of the crime scene reconstruction method based on a video map and augmented reality according to the present invention, the detailed process is as follows. Step S11 is performed: the geographic coordinates of each video surveillance device in the city are obtained by surveying, and the spatial coordinate system of the video surveillance network is established centered on each device according to its monitoring range. Preferably, surveying instruments such as a three-dimensional laser scanner, level, handheld GPS unit, steel tape, stopwatch and theodolite are used to survey the geographic coordinates of each video surveillance device.
Step S12 is performed: feature points within the monitoring range of each video surveillance device are selected and surveyed, and the position information of other ground features is obtained, forming the coordinate-referenced video surveillance network. The geographic coordinates of the feature points are obtained by surveying; the imaging array of the images collected by the video surveillance device is subjected to projective transformation and coordinate conversion, and, from the correspondence between the pixels of the feature points and their actual geographic coordinates, the actual geographic coordinates corresponding to the other pixels in the imaging array are obtained, forming the coordinate-referenced video surveillance network. Step S13 is performed: each video surveillance device is associated with the server for unified management, and video containing geographic location information is obtained. The position-tagged video data acquired by each video surveillance device is stored in a unified video database; each device is connected to the data server, which manages all devices and controls their monitoring state.
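For a roughly planar ground surface, the correspondence described above between feature-point pixels and their surveyed ground coordinates can be modeled as a projective transformation (homography) estimated from at least four point pairs, after which every other pixel of the imaging array can be assigned an approximate ground coordinate. The patent does not prescribe an algorithm or library; the sketch below is one plausible realization using OpenCV and NumPy, with made-up example coordinates.

```python
import numpy as np
import cv2

# Surveyed feature points: image pixel coordinates paired with their surveyed ground
# coordinates (here: local projected map coordinates in meters, example values only).
pixel_pts = np.array([[102, 540], [860, 512], [640, 220], [310, 260]], dtype=np.float32)
ground_pts = np.array([[412.0, 301.0], [430.0, 299.0], [425.0, 350.0], [410.0, 347.0]],
                      dtype=np.float32)

# Estimate the pixel-to-ground homography from the point correspondences (least squares).
H, _ = cv2.findHomography(pixel_pts, ground_pts)

def pixel_to_geo(u, v, H):
    """Map an image pixel (u, v) to ground-plane coordinates via homography H."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Any other pixel of the imaging array now receives an approximate ground coordinate.
x, y = pixel_to_geo(500, 400, H)
print(f"pixel (500, 400) -> ground ({x:.1f}, {y:.1f})")
```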
Step S21 is performed: the geographic coordinate values of the video target are obtained from its pixel coordinates. Because the correspondence between the pixel coordinates of each device's imaging array and the true geographic coordinates has already been obtained in step S12, once the pixel coordinates of the video target are known, its true geographic position coordinates can be derived from that correspondence. Step S22 is performed: the video target is tracked through the city video surveillance network to form its motion trajectory. After the true geographic position coordinates of the video target are obtained, the data server retrieves the monitoring pictures of the video surveillance devices corresponding to those coordinates, and the target is tracked to form its motion trajectory.
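Building on the pixel-to-ground mapping, the motion trajectory can be formed by converting the target's pixel position in each frame to ground coordinates and ordering the results in time, handing over between cameras as the target moves. How the target is detected in each frame is outside the scope of the patent text, so the sketch below simply assumes per-frame detections are available; the Detection type and the to_ground mapping are illustrative names.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class Detection:
    camera_id: str     # which surveillance device produced the frame
    timestamp: float   # frame time in seconds
    u: float           # target pixel column
    v: float           # target pixel row

def build_trajectory(
    detections: List[Detection],
    to_ground: Dict[str, Callable[[float, float], Tuple[float, float]]],
) -> List[Tuple[float, float, float]]:
    """Convert per-frame pixel detections into a time-ordered ground trajectory.

    to_ground maps camera_id to that camera's pixel-to-ground function
    (for example, the homography-based pixel_to_geo above, bound to that camera's H).
    """
    trajectory = []
    for det in sorted(detections, key=lambda d: d.timestamp):
        x, y = to_ground[det.camera_id](det.u, det.v)   # S21: per-frame ground coordinate
        trajectory.append((det.timestamp, x, y))        # S22: accumulate the motion trajectory
    return trajectory
```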
Step S31 is performed: the video data containing geographic location information acquired by the video surveillance network and the video of the video target are decomposed into a series of consecutive still-frame sets. Step S32 is performed: key position points are selected, and video key frames are selected along the decomposition path in the decomposed video. Step S33 is performed: in street-view mode, with coordinates centered on the shooting point, the shooting orientation of the video and the geographic position coordinates of the video in the spatial coordinate system of the surveillance network are projected into spherical coordinates, generating a video stream space along the movement path. Preferably, the video is subjected to angle transformation and vertical stretching to form a three-dimensional city video map. Step S34 is performed: the video in the video stream space is displayed and queried on the electronic map through the display screen as continuous, dynamic video, and interactive query, tracking and positioning can be carried out according to the spatial position of the video data. This completes the construction of the coordinate-referenced city video map and the visual tracking of the video target, realizing a holographic, dynamic and realistic presentation of the spatial environment and the video target.
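One straightforward reading of the street-view style spherical projection is to treat each shooting point as the center of a unit sphere and convert the capture azimuth and elevation of each key frame into a direction on that sphere, so the key frames can be strung along the movement path as a video stream space. The patent gives no formulas, so the conversion below is a generic sketch under that interpretation; field names such as "azimuth" and "elevation" are illustrative.

```python
import math

def capture_direction(azimuth_deg, elevation_deg):
    """Convert a camera's capture azimuth/elevation (degrees) into a unit direction
    vector in a local east-north-up frame centered on the shooting point."""
    az = math.radians(azimuth_deg)   # 0 = north, increasing clockwise
    el = math.radians(elevation_deg)
    east = math.cos(el) * math.sin(az)
    north = math.cos(el) * math.cos(az)
    up = math.sin(el)
    return (east, north, up)

def place_key_frames(key_frames):
    """Attach a spherical viewing direction to each key frame of the video stream space.

    key_frames: list of dicts with 'geo' (shooting-point coordinates along the
    movement path), 'azimuth' and 'elevation' of the capture orientation.
    """
    placed = []
    for kf in key_frames:
        placed.append({
            "geo": kf["geo"],
            "direction": capture_direction(kf["azimuth"], kf["elevation"]),
        })
    return placed
```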
Step S41 is performed: the video data of the real scene on the city map is obtained, and virtual objects containing augmentation information are built according to the video content of the real scene. The virtual objects containing augmentation information are built by computer; the augmentation information includes text annotations for the locations of interest, video processing modes (such as repeated playback, variable-speed playback, reversal, freeze and zoom, measurement, and view-angle translation), and marker buttons, and it may be a virtual object coexisting with the real environment or non-geometric information about a real object. Step S42 is performed: the transformation between the true geographic coordinates of the video and the virtual space coordinate system is established, the virtual objects are fused into the real scene of the video data at the corresponding positions, and the fused information of the virtual objects and the real scene is displayed through the display device.
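The transformation between the true geographic coordinates and the virtual space coordinate system is stated but not formalized in the patent. A simple, commonly used choice is a local east-north-up frame anchored at a scene origin, which is enough to place a virtual annotation next to a point of the real scene. The sketch below assumes projected map coordinates in meters and illustrative values; the axis convention is an arbitrary choice, not something the patent specifies.

```python
def geo_to_virtual(geo_xy, geo_alt, origin_xy, origin_alt):
    """Map projected geographic coordinates (meters) to virtual-scene coordinates.

    The virtual frame is a local east-north-up frame anchored at the scene origin:
    x = east offset, y = up offset, z = negative north offset (a common right-handed
    graphics convention, chosen here purely for illustration).
    """
    east = geo_xy[0] - origin_xy[0]
    north = geo_xy[1] - origin_xy[1]
    up = geo_alt - origin_alt
    return (east, up, -north)

# Example: anchor a virtual annotation (text label, playback button, etc.) 1.7 m above
# the target's first trajectory point. All coordinate values are illustrative.
scene_origin = (400.0, 290.0)
label_position = geo_to_virtual((412.0, 301.0), 1.7, scene_origin, 0.0)
print(label_position)
```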
Step S51 is performed: the augmented-reality information is displayed to the user. The fused data is shown to the user through an observation display device such as a helmet-mounted display, stereoscopic glasses, a projection display, a mobile display or a computer monitor, and the data generated by fusing the real scene with the virtual objects is stored. Step S52 is performed: the system interacts with the user and transforms the augmentation information, the interaction being achieved through registration and tracking. The interaction with the user includes the following aspects:
According to the change of the user's field of view detected in real time, the correspondence between the virtual space coordinate system and the true geographic coordinate system is rebuilt; the spatial coordinate conversion can be realized through the electronic three-dimensional map registration system.
According to the user's gaze direction, corresponding virtual objects are added at the mapped positions in the projection plane, and this information is displayed in real time at the correct positions on the screen. Preferably, infrared camera technology and corneal reflection technology are used to track eye movement and compute the gaze direction; a sketch of mapping the gaze onto the display plane follows these interaction aspects.
Virtual objects are manipulated according to user operations such as clicking, moving and rotating. Preferably, the user's operations on the virtual objects are realized through virtual reality hardware and software programming, the virtual reality hardware including data gloves and the like.
Interaction is also provided through gesture recognition and through the video processing modes, for example by generating a button menu on a marker.
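To place augmentation information where the user is looking, the gaze direction obtained from the eye tracker can be intersected with the plane of the display. The patent only names infrared imaging and corneal reflection for gaze estimation, so the geometry below is a generic, assumed sketch of that final mapping step; all vectors and values are illustrative.

```python
def gaze_to_screen(eye_pos, gaze_dir, screen_origin, screen_normal):
    """Intersect a gaze ray with the (planar) display surface.

    eye_pos, gaze_dir, screen_origin and screen_normal are 3-vectors in the viewer's
    coordinate frame; returns the intersection point, or None if the gaze is parallel
    to the screen plane or pointing away from it.
    """
    denom = sum(g * n for g, n in zip(gaze_dir, screen_normal))
    if abs(denom) < 1e-9:
        return None
    diff = [s - e for s, e in zip(screen_origin, eye_pos)]
    t = sum(d * n for d, n in zip(diff, screen_normal)) / denom
    if t <= 0:
        return None
    return tuple(e + t * g for e, g in zip(eye_pos, gaze_dir))

# Example: eye 0.6 m in front of a screen lying in the plane z = 0,
# gaze directed slightly left and down.
print(gaze_to_screen((0.0, 0.0, 0.6), (-0.1, -0.05, -1.0), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)))
```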
The user can perform positioned playback of the video and the augmentation information through devices such as an AR helmet or glasses, so that the user seems to stand in the crime scene while watching the behavior of the offender and the victim in the video. During observation, when the user's gaze shifts from one scene to another, the system rebuilds the relation between the virtual coordinates and the true coordinates, and the augmentation information of the virtual objects changes accordingly, so that the augmentation information of the other scene is displayed. The user can click virtual buttons as needed to perform operations on the video such as repeated playback, variable-speed playback, reversal, freeze and zoom, measurement, and view-angle translation.
Through this embodiment, a virtual scene is constructed; by reproducing the crime scene realistically, multi-angle observation and local magnification are realized, and text information and video information are superimposed on the locations of interest. This solves problems such as the impossibility of preserving the information of a real scene and the lack of intuitiveness of photographs, makes it easy to study the scene repeatedly, and provides a basis for solving cases; it can also serve as teaching material for police academies and for the training of new officers.
The embodiments of the present invention described above do not limit the scope of the present invention. Any other corresponding changes and modifications made according to the technical concept of the present invention shall be included within the protection scope of the claims of the present invention.

Claims (5)

1. A crime scene reconstruction method based on a video map and augmented reality, characterized by comprising the following steps:
S1. Based on spatial position information, establishing a coordinate-referenced city video surveillance network, obtaining video containing position information, and associating each video surveillance device with a data server;
S2. Obtaining the geographic coordinate values of a video target in different frames of the city video surveillance network to form the motion trajectory of the video target;
S3. Based on the spatial position information, projecting the video and the video target onto an electronic map, and applying angle transformation and vertical stretching to the video and the video target to form a three-dimensional city video map;
S4. According to the real scene of the video and the video target on the city video map, building virtual objects containing augmentation information, and fusing the virtual objects with the real scene on the city video map to perform augmented reality;
wherein the augmentation information includes information such as text annotations for the locations of interest, video processing modes and marker buttons, or a virtual object coexisting with the real environment, or non-geometric information about a real object;
S5. Displaying the augmented-reality information to a user, interacting with the user, and transforming the augmentation information according to the interaction information;
wherein step S1 comprises:
S11. Obtaining by surveying the geographic coordinates of each video surveillance device in the city and, taking each video surveillance device as the center and according to its monitoring range, establishing the spatial coordinate system of the video surveillance network;
S12. Selecting feature points within the monitoring range of each video surveillance device and surveying them to obtain the position information of other ground features, forming the coordinate-referenced video surveillance network;
wherein the geographic coordinates of the feature points are obtained by surveying the feature points; the imaging array of the images collected by the video surveillance device is subjected to projective transformation and coordinate conversion, and, from the correspondence between the pixels of the feature points and their actual geographic coordinates, the actual geographic coordinates corresponding to the other pixels in the imaging array are obtained, forming the coordinate-referenced video surveillance network;
S13. Associating each video surveillance device with the server and obtaining video containing geographic location information;
wherein step S2 comprises:
S21. Obtaining the geographic coordinate values of the video target from the pixel coordinates of the video target;
S22. Tracking the video target through the city video surveillance network to form the motion trajectory of the video target;
wherein, after the true geographic position coordinates of the video target are obtained, the data server retrieves the monitoring pictures of the video surveillance devices corresponding to those position coordinates, and the video target is tracked to form its motion trajectory.
2. The crime scene reconstruction method based on a video map and augmented reality of claim 1, characterized in that, in step S11, surveying instruments such as a three-dimensional laser scanner, level, handheld GPS unit, steel tape, stopwatch and theodolite are used to survey the geographic coordinates of each video surveillance device.
3. The crime scene reconstruction method based on a video map and augmented reality of claim 1, characterized in that step S3 comprises:
S31. Decomposing the video containing geographic location information acquired by the video surveillance network and the video of the video target into a series of consecutive still-frame sets;
S32. Selecting key position points in the consecutive still-frame sets, and selecting video key frames along the decomposition path in the decomposed video;
S33. In street-view mode, with coordinates centered on the shooting point, projecting the shooting orientation of the video and the geographic position coordinates of the video in the spatial coordinate system of the surveillance network into spherical coordinates, generating a video stream space along the movement path;
S34. Displaying and querying the video in the video stream space on the electronic map through the display screen as continuous, dynamic video.
4. The crime scene reconstruction method based on a video map and augmented reality of claim 1, characterized in that step S4 comprises:
S41. Obtaining the video data of the real scene on the city map and, according to the video content of the real scene, building the virtual objects containing augmentation information;
S42. Establishing the transformation between the true geographic coordinates of the video and the virtual space coordinate system, fusing the virtual objects into the real scene of the video data at the corresponding positions, and displaying through a display device the information in which the virtual objects and the real scene are fused.
5. The crime scene reconstruction method based on a video map and augmented reality of claim 1, characterized in that step S5 comprises the following steps:
S51. Displaying the augmented-reality information to the user through a helmet-mounted display, stereoscopic glasses, a projection display, a mobile display or a computer monitor;
S52. Interacting with the user through gaze tracking, clicking, moving and rotating the virtual objects, and gesture recognition, and transforming the augmentation information according to the interaction information.
CN201410594135.5A 2014-10-29 2014-10-29 Scene of a crime restoring method based on video map and augmented reality Active CN104331929B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410594135.5A CN104331929B (en) 2014-10-29 2014-10-29 Scene of a crime restoring method based on video map and augmented reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410594135.5A CN104331929B (en) 2014-10-29 2014-10-29 Scene of a crime restoring method based on video map and augmented reality

Publications (2)

Publication Number Publication Date
CN104331929A CN104331929A (en) 2015-02-04
CN104331929B true CN104331929B (en) 2018-02-02

Family

ID=52406649

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410594135.5A Active CN104331929B (en) 2014-10-29 2014-10-29 Scene of a crime restoring method based on video map and augmented reality

Country Status (1)

Country Link
CN (1) CN104331929B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111541876A (en) * 2020-05-18 2020-08-14 上海未高科技有限公司 Method for realizing high-altitude cloud anti-AR technology

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105468666B (en) * 2015-08-11 2019-09-17 中国科学院软件研究所 A kind of video content visual analysis method based on map metaphor
CN105100501B (en) * 2015-08-27 2018-05-18 黑龙江科技大学 A kind of mobile phone computing system based on Internet of Things
CN105261041A (en) * 2015-10-19 2016-01-20 联想(北京)有限公司 Information processing method and electronic device
CN105933655A (en) * 2016-05-13 2016-09-07 深圳先进技术研究院 Video and WIFI mixed positioning method and system
CN105847756B (en) * 2016-05-13 2018-12-14 深圳先进技术研究院 Video identification tracking location system based on the dotted fitting in position
CN106127858B (en) * 2016-06-24 2020-06-23 联想(北京)有限公司 Information processing method and electronic equipment
CN106412524A (en) * 2016-11-09 2017-02-15 四川诚品电子商务有限公司 Remote case trial system
CN108616718B (en) * 2016-12-13 2021-02-26 杭州海康威视系统技术有限公司 Monitoring display method, device and system
CN106777119A (en) * 2016-12-16 2017-05-31 福建福光股份有限公司 Online synchronous traveling method
CN108320331B (en) * 2017-01-17 2021-10-22 上海掌门科技有限公司 Method and equipment for generating augmented reality video information of user scene
CN107197200A (en) * 2017-05-22 2017-09-22 北斗羲和城市空间科技(北京)有限公司 It is a kind of to realize the method and device that monitor video is shown
CN108932051B (en) * 2017-05-24 2022-12-16 腾讯科技(北京)有限公司 Augmented reality image processing method, apparatus and storage medium
CN107784693B (en) * 2017-09-22 2021-06-04 西安点云生物科技有限公司 Information processing method and device
CN108259827B (en) * 2018-01-10 2020-07-07 中科创达软件股份有限公司 Method, device, AR equipment and system for realizing security
CN108712362B (en) * 2018-03-15 2021-03-16 高新兴科技集团股份有限公司 Video map engine system
CN110515452B (en) * 2018-05-22 2022-02-22 腾讯科技(深圳)有限公司 Image processing method, image processing device, storage medium and computer equipment
CN108874911B (en) * 2018-05-28 2019-06-04 广西师范学院 Suspect's position predicting method based on regional environment Yu crime dramas data
CN108846378A (en) * 2018-07-03 2018-11-20 百度在线网络技术(北京)有限公司 Sign Language Recognition processing method and processing device
IT201800010914A1 (en) 2018-12-10 2020-06-10 Legalgenetics S R L INTERACTIVE DIDACTIC SYSTEM AND METHOD OF DIGITAL REPRODUCTION AND ANALYSIS OF ANY CRIME SCENE
CN110009561B (en) * 2019-04-10 2023-04-18 南京财经大学 Method and system for mapping surveillance video target to three-dimensional geographic scene model
CN110191087A (en) * 2019-04-19 2019-08-30 特斯联(北京)科技有限公司 A kind of monitoring device connection control method, equipment and server
CN110191424B (en) * 2019-05-16 2021-06-15 武汉数矿科技有限公司 Specific suspect track generation method and apparatus
CN111309967B (en) * 2020-01-23 2023-12-01 北斗伏羲信息技术有限公司 Video space information query method based on grid coding
CN111931830B (en) * 2020-07-27 2023-12-29 泰瑞数创科技(北京)股份有限公司 Video fusion processing method and device, electronic equipment and storage medium
CN113157088B (en) * 2021-03-18 2022-04-08 浙江通鹏智能科技有限公司 Criminal investigation virtual scene-based data processing method and device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101246600A (en) * 2008-03-03 2008-08-20 北京航空航天大学 Method for real-time generating reinforced reality surroundings by spherical surface panoramic camera
CN101339654A (en) * 2007-07-04 2009-01-07 北京威亚视讯科技有限公司 Reinforced real environment three-dimensional registering method and system based on mark point

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140004335A (en) * 2012-07-02 2014-01-13 한국전자통신연구원 User interface device for projection computer and method for interfacing using the same

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101339654A (en) * 2007-07-04 2009-01-07 北京威亚视讯科技有限公司 Reinforced real environment three-dimensional registering method and system based on mark point
CN101246600A (en) * 2008-03-03 2008-08-20 北京航空航天大学 Method for real-time generating reinforced reality surroundings by spherical surface panoramic camera

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
A touring machine: Prototyping 3D mobile augmented reality systems for exploring the urban environment; Feiner S. et al.; Proceedings of the International Symposium on Wearable Computers; Cambridge, UK: IEEE; 1997; pp. 74-81 *
Design and implementation of augmented reality navigation software based on the Android platform; Zeng Hao; China Master's Theses Full-text Database, Information Science and Technology; 15 July 2013, No. 07; Chapter 2 pp. 6-9, Section 3.1 p. 16, pp. 19-23, Figs. 3.5 and 3.6 *
Research on virtual-real registration technology in augmented reality; Ming Delie et al.; Journal of Image and Graphics; 31 December 2003; Vol. 8, No. 05; pp. 557-561 *
Real-time 3D tracking for augmented reality; Dong Zilong; China Doctoral Dissertations Full-text Database, Information Science and Technology; 15 August 2011, No. 08; pp. 15-18 and 30-35 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111541876A (en) * 2020-05-18 2020-08-14 上海未高科技有限公司 Method for realizing high-altitude cloud anti-AR technology

Also Published As

Publication number Publication date
CN104331929A (en) 2015-02-04

Similar Documents

Publication Publication Date Title
CN104331929B (en) Scene of a crime restoring method based on video map and augmented reality
US11120628B2 (en) Systems and methods for augmented reality representations of networks
AU2015265416B2 (en) Method and system for image georegistration
US20190147619A1 (en) Method and system for image georegistration
CN101833896B (en) Geographic information guide method and system based on augment reality
CN102981616B (en) The recognition methods of object and system and computer in augmented reality
JP4870546B2 (en) CV tag video display device with layer generation / selection function
CN108304075B (en) Method and device for performing man-machine interaction on augmented reality device
CN107016704A (en) A kind of virtual reality implementation method based on augmented reality
CN106355153A (en) Virtual object display method, device and system based on augmented reality
CN105212418A (en) Augmented reality intelligent helmet based on infrared night viewing function is developed
JP2013507677A (en) Display method of virtual information in real environment image
CN104486586A (en) Disaster lifesaving simulated training method and system based on video map
CN104501797B (en) A kind of air navigation aid based on augmented reality IP maps
CN114372107A (en) GIS-based method and system for visualizing homeland improvement and ecological restoration data
CN104501798A (en) Network object positioning and tracking method based on augmented reality IP map
US10614308B2 (en) Augmentations based on positioning accuracy or confidence
WO2020136633A1 (en) Methods and systems for camera 3d pose determination
Yang et al. Survey on tracking and registration technology for mobile augmented reality
CN112862976B (en) Data processing method and device and electronic equipment
JP2011209622A (en) Device and method for providing information, and program
Zheng et al. [Retracted] Rendering and Optimization Algorithm of Digital City’s 3D Artistic Landscape Based on Virtual Reality
CN107687853A (en) New localization method based on augmented reality
Ravi et al. A study of object recognition and tracking techniques for augmented reality applications
Wu et al. Cognition-based augmented reality visualization of the geospatial data

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant