CN112788264A - Recording and video searching system and method thereof - Google Patents

Recording and video searching system and method thereof

Info

Publication number
CN112788264A
Authority
CN
China
Prior art keywords
video
audio
recording
pen
intelligent
Prior art date
2019-11-07
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911080787.6A
Other languages
Chinese (zh)
Inventor
蔡俊光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wenchang Technology Co ltd
Original Assignee
Wenchang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2019-11-07
Filing date
2019-11-07
Publication date
2021-05-11
Application filed by Wenchang Technology Co ltd
Priority to CN201911080787.6A
Publication of CN112788264A
Legal status: Pending (current)


Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/76 - Television signal recording
    • H04N5/91 - Television signal processing therefor
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 - Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 - Assembly of content; Generation of multimedia applications
    • H04N21/854 - Content authoring
    • H04N21/8547 - Content authoring involving timestamps for synchronizing content

Abstract

The invention discloses a sound and video recording search system and a method thereof. The user only needs to point the intelligent pen at the keyword to be retrieved in the handwritten brief notes, and the search and playback unit can quickly open the audio and/or video (the detailed record) corresponding to that keyword, thereby shortening the search time, greatly improving work efficiency and ensuring search accuracy.

Description

Recording and video searching system and method thereof
Technical Field
The invention relates to the technical field of audio and video search, and in particular to a sound and video recording search system and a sound and video recording search method.
Background
On-site event recording refers to a recorder noting down the circumstances and specific content of an event such as a meeting, press conference or classroom session. Such records fall into brief notes and detailed records. A brief note captures only the keywords, i.e. the important or main points of the event. Traditional manual note-taking can generally produce only brief notes; to obtain a quick and effective detailed record, the audio and video of the event participants must be captured with equipment such as a video camera, a microphone or a voice recorder.
However, when the recorder later needs to locate, starting from a brief note, the corresponding detailed record within the recorded audio and video, the search can only be carried out slowly by scrubbing forward or backward through the recording, which wastes time and greatly reduces work efficiency.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and to provide a sound and video recording search system with high search accuracy and short search time that can greatly improve work efficiency.
To achieve this purpose, the technical solution provided by the invention is as follows:
A sound and video recording search system comprises an intelligent pen, paper, a sound and video recording unit, a time synchronization unit and a search and playback unit;
the intelligent pen is provided with a pen main body, a recognition camera, a processor and a memory; the recognition camera and the memory are respectively connected with the processor, and the recognition camera, the memory and the processor are all arranged on the pen main body; the pen main body is used for writing notes in cooperation with the paper, the recognition camera is used for capturing an image of the brief note written by the user, the processor recognizes the writing position of the user based on the captured image and records a timestamp of the moment of writing, and the memory is used for storing the position information recognized by the processor and the corresponding position timestamp and generating a position information file;
the sound and video recording unit is used for recording the audio and/or video of an event together with the corresponding recording timestamp information and generating an audio and/or video file;
the time synchronization unit is in data connection with the intelligent pen and the sound and video recording unit respectively, receives one or more position information files from the intelligent pen and one or more audio and/or video files from the sound and video recording unit, creates a first index table to store the position information of each position information file and the corresponding position timestamp, and creates a second index table to store the audio and/or video file names and the recording timestamp information of the audio and/or video files;
the search and playback unit is in data connection with the intelligent pen and the time synchronization unit respectively, receives search target position information from the intelligent pen, searches the first index table for position information matching the search target position information, and acquires the corresponding position timestamp; it then searches the second index table for the matching recording timestamp information according to the acquired position timestamp and obtains the corresponding audio and/or video file name; finally it selects the corresponding audio and/or video file according to the acquired file name and starts playing the selected audio and/or video file from the acquired position timestamp.
Further, both the timestamp recorded at the moment of writing and the recording timestamp information are based on UTC time.
Further, the recording timestamp information stored in the second index table comprises a start timestamp and an end timestamp of the audio and/or video file; recording timestamp information is said to match the acquired position timestamp when the acquired position timestamp lies between that start timestamp and that end timestamp.
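As a concrete illustration of the two index tables and the matching rule described above, the following minimal Python sketch is offered; the row layout (dictionaries with position, timestamp, file, start and end fields) and the function names are assumptions made only for illustration and are not part of the claimed system.

```python
# First index table: one row per brief note, as recorded by the intelligent pen.
# Second index table: one row per audio and/or video file, as produced by the recording unit.
# All timestamps are assumed here to be UTC times expressed in seconds.

def find_position_timestamp(first_index, target_position):
    """Return the position timestamp of the row whose position matches the search target."""
    for row in first_index:
        if row["position"] == target_position:
            return row["timestamp"]
    return None

def find_recording(second_index, position_timestamp):
    """Return the file row whose [start, end] recording interval contains the timestamp."""
    for row in second_index:
        if row["start"] <= position_timestamp <= row["end"]:
            return row
    return None
```

Under these assumptions, the playback offset into the selected file is simply the difference between the acquired position timestamp and the file's start timestamp.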
Further, the paper is printed with OID codes, and the OID codes correspond to position coordinates.
Further, the intelligent pen also comprises a wireless transmission module, which is in wireless data connection with the time synchronization unit and the search and playback unit.
Furthermore, the sound and video recording unit, the time synchronization unit and the search and playback unit may all be arranged on one intelligent terminal.
Furthermore, the time synchronization unit and the search and playback unit may both be arranged on a first intelligent terminal, with the sound and video recording unit arranged on a second intelligent terminal or on a digital audio/video recorder independent of the first intelligent terminal.
Furthermore, the time synchronization unit may be arranged on a cloud server, with the sound and video recording unit and the search and playback unit arranged on an intelligent terminal.
Furthermore, the time synchronization unit may be arranged on a cloud server, the search and playback unit on a first intelligent terminal, and the sound and video recording unit on a second intelligent terminal or on a digital audio/video recorder independent of the first intelligent terminal.
To achieve the above purpose, the present invention further provides a method for the sound and video recording search system, comprising the following steps:
S1: before the event to be recorded begins, initializing the intelligent pen and the sound and video recording unit and synchronizing their system time;
S2: when the event to be recorded starts, starting the note-taking function of the intelligent pen and the audio and/or video recording function of the sound and video recording unit;
S3: during the event to be recorded, whenever the intelligent pen writes a brief note on the paper, the recognition camera captures an image of the brief note written by the user, the processor recognizes the writing position of the user based on the captured image and records a timestamp of the moment of writing, and the recognized position information and the corresponding position timestamp are stored in a position information file in the memory;
S4: when the event to be recorded is finished, stopping the note-taking function of the intelligent pen and the audio and/or video recording function of the sound and video recording unit;
S5: connecting the intelligent pen and the sound and video recording unit to the time synchronization unit by data connection respectively; the time synchronization unit receives one or more position information files from the intelligent pen and one or more audio and/or video files from the sound and video recording unit, creates a first index table to store the position information of the one or more position information files and the corresponding position timestamps, and creates a second index table to store the audio and/or video file names of the one or more audio and/or video files and the recording timestamp information (a sketch of this indexing pass is given after step S9);
S6: when the user needs to listen to or watch a certain section of the audio and/or video file of the recorded event, pointing the tip of the intelligent pen at the corresponding brief note written on the paper; the brief-note position information is recognized through the cooperation of the recognition camera and the processor and is sent to the search and playback unit;
S7: the search and playback unit receives the brief-note position information, searches the first index table for position information matching it, and acquires the position timestamp corresponding to the brief-note position information;
S8: searching the second index table for the matching recording timestamp information according to the position timestamp acquired in step S7, and acquiring the corresponding audio and/or video file name;
S9: selecting the corresponding audio and/or video file according to the audio and/or video file name acquired in step S8, and playing the selected audio and/or video file from the acquired position timestamp.
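Step S5 amounts to an indexing pass over the files received from the intelligent pen and from the sound and video recording unit. The sketch below assumes, purely for illustration, that each position information file yields (position, timestamp) pairs and that each audio and/or video file exposes its recording start and end timestamps; none of these formats are prescribed by the method itself.

```python
def build_first_index(position_files):
    """First index table: every (position, position timestamp) pair from the received files."""
    first_index = []
    for entries in position_files:               # one iterable of (position, timestamp) per file
        for position, timestamp in entries:
            first_index.append({"position": position, "timestamp": timestamp})
    return first_index

def build_second_index(av_files):
    """Second index table: audio/video file name plus its recording start and end timestamps."""
    return [{"file": name, "start": start, "end": end}
            for name, (start, end) in av_files.items()]
```

Steps S7 to S9 then reduce to the two lookups sketched earlier, followed by opening the selected file and seeking to the acquired position timestamp.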
Further, synchronizing the system time of the intelligent pen and the sound and video recording unit in step S1 specifically means synchronizing the intelligent pen and the sound and video recording unit with the current UTC time respectively.
Further, when the writing position of the user is recognized in step S3, the processor extracts the OID code corresponding to the brief note in the image and decodes it, thereby obtaining the position corresponding to the OID code, i.e. the position at which the user wrote.
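The OID lookup described in this step can be pictured as a simple mapping from the decoded dot-pattern code to a page coordinate. The code table below is a purely hypothetical example; the real OID encoding printed on the paper is not specified here.

```python
# Hypothetical OID code table: each printed OID code maps to a (page, x, y) coordinate.
OID_TABLE = {
    0x1A2B: (1, 12.5, 40.0),   # example values only
    0x1A2C: (1, 30.0, 40.0),
}

def decode_position(oid_code):
    """Map an OID code extracted from the camera image to the writing position, if known."""
    return OID_TABLE.get(oid_code)
```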
Further, when the time synchronization unit is arranged on the cloud server, step S5 further comprises:
S5-1) the user logs in to the cloud server;
S5-2) the intelligent pen and the sound and video recording unit are connected to the time synchronization unit by data connection respectively, and the time synchronization unit receives the position information file(s) from the intelligent pen and the audio and/or video file(s) from the sound and video recording unit;
S5-3) the received position information file(s) and audio and/or video file(s) are stored in the user's own user space on the cloud server;
S5-4) a first index table storing the position information and corresponding position timestamps of the position information file(s) is created in the user space, and a second index table storing the audio and/or video file names and the recording timestamp information of the audio and/or video file(s) is created.
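One possible shape of the per-user storage created in steps S5-1 to S5-4 is sketched below. The directory layout, the JSON index files and the function name are illustrative assumptions only; the method does not prescribe how the user space or the index tables are persisted.

```python
import json
from pathlib import Path

def store_and_index(user_space: Path, position_entries, av_files):
    """Write both index tables into the logged-in user's space on the cloud server."""
    # (the received audio and/or video files themselves would be saved alongside; omitted here)
    user_space.mkdir(parents=True, exist_ok=True)
    first_index = [{"position": p, "timestamp": t} for p, t in position_entries]
    second_index = [{"file": name, "start": s, "end": e} for name, (s, e) in av_files.items()]
    (user_space / "first_index.json").write_text(json.dumps(first_index))
    (user_space / "second_index.json").write_text(json.dumps(second_index))
```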
Further, step S6 further comprises:
S6-1) when the user needs to listen to or watch a certain section of the audio and/or video file of the recorded event, the user logs in to the cloud server;
S6-2) the tip of the intelligent pen is pointed at the corresponding brief note written on the paper, the brief-note position information is recognized through the cooperation of the recognition camera and the processor, and the recognized brief-note position information is sent to the search and playback unit;
S6-3) the search and playback unit receives the brief-note position information, searches the user's first index table in the user space of the cloud server for position information matching it, and acquires the corresponding position timestamp;
S6-4) the matching recording timestamp information is searched for in the user's second index table in the user space of the cloud server according to the position timestamp acquired in step S6-3, and the corresponding audio and/or video file name is acquired;
S6-5) the corresponding audio and/or video file in the user's user space on the cloud server is selected according to the audio and/or video file name acquired in step S6-4;
S6-6) the audio and/or video file selected in step S6-5 is streamed to the search and playback unit;
S6-7) the search and playback unit plays the selected audio and/or video file from the acquired position timestamp.
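The exchange in steps S6-1 to S6-7 might look roughly like the client-side outline below. The session object and its methods (lookup_position, lookup_recording, open_stream) are hypothetical stand-ins for whatever interface the cloud server exposes; they are used only to show the order of operations.

```python
def search_and_play(session, brief_note_position):
    """Client-side outline of steps S6-1 to S6-7 (hypothetical cloud interface)."""
    # S6-1: the user is assumed to have logged in and obtained `session` beforehand.
    ts = session.lookup_position(brief_note_position)   # S6-3: query the user's first index table
    rec = session.lookup_recording(ts)                  # S6-4: query the user's second index table
    stream = session.open_stream(rec["file"])           # S6-5 / S6-6: stream the selected file
    stream.seek(ts - rec["start"])                      # S6-7: start playback at the note's moment
    return stream
```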
Compared with the prior art, the present solution has the following advantage:
the user only needs to point the intelligent pen at the keyword (the brief note) to be retrieved, and the search and playback unit can quickly open the audio and/or video (the detailed record) corresponding to that keyword, thereby shortening the search time, greatly improving work efficiency and ensuring search accuracy.
Drawings
Fig. 1 is a schematic diagram of the sound and video recording search system according to embodiment 1 of the present invention;
Fig. 2 is a schematic diagram of the sound and video recording search system according to embodiment 2 of the present invention;
Fig. 3 is a schematic diagram of the sound and video recording search system according to embodiment 3 of the present invention;
Fig. 4 is a schematic diagram of the sound and video recording search system according to embodiment 4 of the present invention.
Detailed Description
The invention will be further described below with reference to specific embodiments:
Embodiment 1
As shown in fig. 1, the sound and video recording search system of the present embodiment comprises an intelligent pen 1, paper 2, a sound and video recording unit 3, a time synchronization unit 4 and a search and playback unit 5.
As shown in fig. 2, the intelligent pen 1 is provided with a pen main body 11, a recognition camera 12, a processor 13 and a memory 14; the recognition camera 12 and the memory 14 are respectively connected with the processor 13, and all three are arranged on the pen main body 11; the pen main body 11 is used for writing notes in cooperation with the paper 2, the recognition camera 12 is used for capturing an image of the brief note written by the user, the processor 13 recognizes the writing position of the user based on the captured image and records a timestamp of the moment of writing, and the memory 14 is used for storing the position information recognized by the processor 13 and the corresponding position timestamp and generating a position information file.
The sound and video recording unit 3 is used for recording the audio and/or video of an event (e.g. a meeting or a class) and the corresponding recording timestamp information, and for generating audio and/or video files.
The time synchronization unit 4 is in data connection with the intelligent pen 1 and the sound and video recording unit 3 respectively, receives one or more position information files from the intelligent pen 1 and one or more audio and/or video files from the sound and video recording unit 3, creates a first index table to store the position information of each position information file and the corresponding position timestamp, and creates a second index table to store the audio and/or video file names and the recording timestamp information of the audio and/or video files.
The search and playback unit 5 is in data connection with the intelligent pen 1 and the time synchronization unit 4 respectively, receives search target position information from the intelligent pen 1, searches the first index table for position information matching the search target position information, and acquires the corresponding position timestamp; it then searches the second index table for the matching recording timestamp information according to the acquired position timestamp and obtains the corresponding audio and/or video file name; finally it selects the corresponding audio and/or video file according to the acquired file name and starts playing the selected audio and/or video file from the acquired position timestamp.
In this embodiment, both the timestamp recorded at the moment of writing and the recording timestamp information are based on UTC time. The recording timestamp information stored in the second index table comprises a start timestamp and an end timestamp of each audio and/or video file; recording timestamp information is said to match the acquired position timestamp when the acquired position timestamp lies between that start timestamp and that end timestamp. For example, if the acquired position timestamp is 10:22 a.m. on 1 October 2019, and a piece of recording timestamp information has a start timestamp of 9:22 a.m. on 1 October 2019 and an end timestamp of 11:22 a.m. on 1 October 2019, then that recording timestamp information matches the acquired position timestamp.
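Using the example just given, the containment test can be checked directly with Python's datetime module; the choice of UTC and the exact clock times are taken from the paragraph above.

```python
from datetime import datetime, timezone

written   = datetime(2019, 10, 1, 10, 22, tzinfo=timezone.utc)   # acquired position timestamp
rec_start = datetime(2019, 10, 1,  9, 22, tzinfo=timezone.utc)   # recording start timestamp
rec_end   = datetime(2019, 10, 1, 11, 22, tzinfo=timezone.utc)   # recording end timestamp

assert rec_start <= written <= rec_end    # this recording matches the acquired timestamp
print(written - rec_start)                # 1:00:00 -> play the file from one hour in
```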
In addition, in the present embodiment, the paper 2 is printed with OID codes, and the OID codes correspond to position coordinates. The paper 2 may take the form of a notebook, each page of which is a sheet of paper 2 printed with unique OID codes. The intelligent pen 1 may further comprise a wireless transmission module 15, which is in wireless data connection (e.g. a Bluetooth connection) with the time synchronization unit 4 and the search and playback unit 5. The sound and video recording unit 3, the time synchronization unit 4 and the search and playback unit 5 are all arranged on an intelligent terminal, which may be a smartphone or a tablet computer.
The working principle of this embodiment is as follows:
S1: before the event to be recorded (e.g. a conference) begins, the intelligent pen 1 and the sound and video recording unit 3 are initialized and their system time is synchronized; specifically, the intelligent pen 1 and the sound and video recording unit 3 are each synchronized with the current UTC time.
S2: when the event to be recorded starts, the note-taking function of the intelligent pen 1 and the audio and/or video recording function of the sound and video recording unit 3 are started;
S3: during the recorded event, when the user hears something noteworthy (for example a part concerning the company's policy direction for the year) and writes a brief note (such as "policy direction of the company this year") on the paper 2, the recognition camera 12 of the intelligent pen 1 captures an image of the user writing the brief note "policy direction of the company this year" and transmits the captured image to the processor 13; the processor 13 recognizes the position of the brief note written by the user based on the captured image, records a timestamp of the moment of writing, and stores the recognized position information and the corresponding position timestamp in the position information file in the memory 14. Specifically, when recognizing the writing position of the user in this step, the processor 13 extracts the OID code corresponding to the brief note "policy direction of the company this year" in the image and decodes it, thereby obtaining the position corresponding to the OID code, i.e. the position at which the user wrote (a sketch of this pen-side bookkeeping is given after step S9);
S4: when the event to be recorded is finished, the note-taking function of the intelligent pen 1 and the audio and/or video recording function of the sound and video recording unit 3 are stopped;
S5: the intelligent pen 1 and the sound and video recording unit 3 are connected to the time synchronization unit 4 by data connection respectively; the time synchronization unit 4 receives one or more position information files from the intelligent pen 1 and one or more audio and/or video files from the sound and video recording unit 3, creates a first index table to store the position information of each position information file and the corresponding position timestamp, and creates a second index table to store the audio and/or video file name and the recording timestamp information of each audio and/or video file;
S6: when the user needs to retrieve the detailed record concerning the company's policy direction for the year, the intelligent pen 1 is pointed at the corresponding brief note "policy direction of the company this year" written on the paper 2, the position information of that brief note is recognized through the cooperation of the recognition camera 12 and the processor 13 (the recognition principle is the same as in step S3), and the recognized brief-note position information is sent to the search and playback unit 5;
S7: the search and playback unit 5 receives the brief-note position information, searches the first index table for position information matching it, and acquires the position timestamp corresponding to the brief-note position information;
S8: the matching recording timestamp information is searched for in the second index table according to the position timestamp acquired in step S7, and the corresponding audio and/or video file name is acquired;
S9: the corresponding audio and/or video file is selected according to the audio and/or video file name acquired in step S8, and the selected audio and/or video file is played from the acquired position timestamp.
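As noted in step S3, the pen-side bookkeeping reduces to appending one (position, timestamp) entry to the position information file each time a brief note is written. The following minimal sketch assumes a dictionary-based image representation and an OID table like the one shown earlier; both are stand-ins for the real camera and processor pipeline.

```python
import time

OID_TABLE = {0x1A2B: (2, 55.0, 120.0)}        # hypothetical OID code -> (page, x, y) coordinate

def decode_oid(image):
    """Stand-in for the recognition camera / processor pipeline: extract the OID code."""
    return image["oid_code"]                   # assumed image representation for this sketch

position_info_file = []                        # stand-in for the file kept in the memory 14

def on_brief_note_written(image):
    """Step S3: record where and when the user wrote a brief note such as "policy direction"."""
    entry = {
        "position": OID_TABLE.get(decode_oid(image)),   # writing position decoded from the OID code
        "timestamp": time.time(),                       # UTC seconds at the moment of writing
    }
    position_info_file.append(entry)
    return entry
```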
In this embodiment, the user only needs to point the intelligent pen 1 at the keyword (the brief note) to be retrieved, and the search and playback unit 5 can quickly open the audio and/or video (the detailed record) corresponding to that keyword, thereby shortening the search time, greatly improving work efficiency and ensuring search accuracy.
Embodiment 2
This embodiment is basically the same as embodiment 1 except for the following features; the common features are not described again here.
In this embodiment, the time synchronization unit 4 and the search and playback unit 5 are both arranged on a first intelligent terminal 6a, and the sound and video recording unit 3 is arranged on a second intelligent terminal or on a digital audio/video recorder 7a independent of the first intelligent terminal 6a.
Embodiment 3
This embodiment is basically the same as embodiment 1 except for the following features; the common features are not described again here.
In this embodiment, the time synchronization unit 4 is arranged on a cloud server 8, and the sound and video recording unit 3 and the search and playback unit 5 are arranged on an intelligent terminal 6b.
Step S5 further comprises the steps of:
S5-1) the user logs in to the cloud server;
S5-2) the intelligent pen 1 and the sound and video recording unit 3 are connected to the time synchronization unit 4 by data connection respectively, and the time synchronization unit 4 receives one or more position information files from the intelligent pen 1 and one or more audio and/or video files from the sound and video recording unit 3;
S5-3) the received position information file(s) and audio and/or video file(s) are stored in the user's own user space on the cloud server;
S5-4) a first index table storing the position information and corresponding position timestamps of the position information file(s) is created in the user space, and a second index table storing the audio and/or video file names and the recording timestamp information of the audio and/or video file(s) is created.
Step S6 further comprises the steps of:
S6-1) when the user needs to listen to or watch a certain section of the audio and/or video file of the recorded event, the user logs in to the cloud server;
S6-2) the intelligent pen 1 is pointed at the corresponding brief note written on the paper 2, the brief-note position information is recognized through the cooperation of the recognition camera 12 and the processor 13, and the recognized brief-note position information is sent to the search and playback unit 5;
S6-3) the search and playback unit 5 receives the brief-note position information, searches the user's first index table in the user space of the cloud server for position information matching it, and acquires the corresponding position timestamp;
S6-4) the matching recording timestamp information is searched for in the user's second index table in the user space of the cloud server according to the position timestamp acquired in step S6-3, and the corresponding audio and/or video file name is acquired;
S6-5) the corresponding audio and/or video file in the user's user space on the cloud server is selected according to the audio and/or video file name acquired in step S6-4;
S6-6) the audio and/or video file selected in step S6-5 is streamed to the search and playback unit 5;
S6-7) the search and playback unit 5 starts playing the selected audio and/or video file from the acquired position timestamp.
Embodiment 4
This embodiment is basically the same as embodiment 3 except for the following features; the common features are not described again here.
In this embodiment, the time synchronization unit 4 is arranged on a cloud server 8, the search and playback unit 5 is arranged on a first intelligent terminal 6c, and the sound and video recording unit 3 is arranged on a second intelligent terminal or on a digital audio/video recorder 7b independent of the first intelligent terminal 6c.
The above-described embodiments are merely preferred embodiments of the present invention, and the scope of the present invention is not limited thereto; variations made according to the shape and principle of the present invention shall therefore fall within the scope of protection of the present invention.

Claims (7)

1. A conference recording and video searching system, characterized by comprising an intelligent pen (1), paper (2), an intelligent terminal (3) and a digital camera (4) with a wireless transmission function;
the intelligent pen (1) is provided with a pen main body, a recognition camera, a processor, a memory and a wireless transmission module; the recognition camera, the memory and the wireless transmission module are respectively connected with the processor and are all arranged on the pen main body; the pen main body is used for writing notes in cooperation with the paper (2), the recognition camera is used for capturing an image of the brief note written by the user, the processor recognizes the writing position of the user based on the captured image and records a timestamp of the moment of writing, the memory is used for storing the position information recognized by the processor and the corresponding timestamp, and the wireless transmission module is used for wireless connection with the intelligent terminal (3);
the digital camera (4) is used for recording the audio and video of the conference and transmitting the recorded audio and video data to the intelligent terminal (3);
the intelligent terminal (3) is wirelessly connected with the intelligent pen (1) and the digital camera (4) respectively; through an application program on the intelligent terminal, the note-taking function of the intelligent pen (1) and the audio and video recording function of the digital camera (4) are started simultaneously, so that the data streams of the two are synchronized; and the corresponding content can be found in the audio and video recorded by the digital camera (4) according to the timestamp corresponding to the content to be located.
2. The conference recording and video searching system according to claim 1, wherein the paper (2) is printed with OID codes, and the OID codes correspond to position coordinates.
3. The conference recording and video searching system according to claim 1, wherein the intelligent terminal (3) is a smartphone.
4. The conference recording and video searching system according to claim 1, wherein the wireless transmission module is a Bluetooth wireless transmission module.
5. A method for the system according to claim 1, comprising the following steps:
S1, wirelessly connecting the intelligent terminal with the intelligent pen and the digital camera respectively;
S2, simultaneously starting the note-taking function of the intelligent pen and the audio and video recording function of the digital camera through an application program on the intelligent terminal, and synchronizing the data streams of the two;
S3, while the note-taking function of the intelligent pen and the audio and video recording function of the digital camera are running, when the user hears something in the conference worth noting down, writing a brief note on the paper with the intelligent pen and recording the corresponding timestamp at the moment of writing; synchronously, the digital camera transmits the recorded audio and video data to the intelligent terminal in real time, and the intelligent terminal stores the audio and video data;
S4, after the conference is finished, i.e. after the note-taking function of the intelligent pen and the audio and video recording function of the digital camera have been stopped, the intelligent terminal integrates the data streams of the intelligent pen and the digital camera;
S5, when the user needs to listen to or watch back a certain section of the recorded audio and video, pointing the tip of the intelligent pen at the corresponding brief note written on the paper, and recognizing the position of the brief note through the cooperation of the recognition camera and the processor, so as to find in the memory the timestamp corresponding to the moment when the brief note at that position was written;
S6, the intelligent pen sends the timestamp found in step S5 to the intelligent terminal through its wireless transmission module;
S7, the intelligent terminal opens the audio and video recording and plays it starting from the timestamp sent by the intelligent pen; the content played is exactly the conference audio and video recording that needs to be found.
6. The method according to claim 5, wherein writing a brief note on the paper with the intelligent pen and recording the corresponding timestamp at the moment of writing in step S3 comprises the following steps:
A) the pen main body of the intelligent pen writes a brief note on the paper;
B) the recognition camera captures an image of the user's writing and transmits the captured image to the processor;
C) the processor recognizes the writing position of the user based on the captured image and records a timestamp of the moment of writing;
D) the memory stores the position information recognized by the processor and the corresponding timestamp.
7. The method according to claim 6, wherein when the writing position of the user is recognized in step C), the processor extracts the OID code corresponding to the brief note in the image and decodes it, thereby obtaining the position corresponding to the OID code, i.e. the position at which the user wrote.
CN201911080787.6A 2019-11-07 2019-11-07 Recording and video searching system and method thereof Pending CN112788264A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911080787.6A CN112788264A (en) 2019-11-07 2019-11-07 Recording and video searching system and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911080787.6A CN112788264A (en) 2019-11-07 2019-11-07 Recording and video searching system and method thereof

Publications (1)

Publication Number Publication Date
CN112788264A true CN112788264A (en) 2021-05-11

Family

ID=75747963

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911080787.6A Pending CN112788264A (en) 2019-11-07 2019-11-07 Recording and video searching system and method thereof

Country Status (1)

Country Link
CN (1) CN112788264A (en)


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4924387A (en) * 1988-06-20 1990-05-08 Jeppesen John C Computerized court reporting system
US7647555B1 (en) * 2000-04-13 2010-01-12 Fuji Xerox Co., Ltd. System and method for video access from notes or summaries
CN1855117A (en) * 2005-01-12 2006-11-01 跳蛙企业股份有限公司 Interactive apparatus with recording and playback capability usable with encoded writing medium
US20090251440A1 (en) * 2008-04-03 2009-10-08 Livescribe, Inc. Audio Bookmarking
CN103019575A (en) * 2011-09-22 2013-04-03 汉王科技股份有限公司 Mobile terminal and information processing method thereof
CN103186583A (en) * 2011-12-29 2013-07-03 汉王科技股份有限公司 Mobile terminal-based information recording and retrieval method and device
CN103219025A (en) * 2012-01-18 2013-07-24 台均科技(深圳)有限公司 Audio frequency recording method and device
KR20130104087A * 2012-03-12 2013-09-25 한국전자통신연구원 The method to play the personalized indexing on the real-time recording and method to provide the personalized indexing on the real-time recording of the video conference
CN103065659A (en) * 2012-12-06 2013-04-24 广东欧珀移动通信有限公司 Multi-media recording method
CN104156079A (en) * 2013-05-14 2014-11-19 广州杰赛科技股份有限公司 Learning path recording method and system

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112801833A (en) * 2021-01-29 2021-05-14 深圳市鹰硕教育服务有限公司 Examination system, examination method and examination server based on intelligent pen

Similar Documents

Publication Publication Date Title
US9507776B2 (en) Annotation system for creating and retrieving media and methods relating to same
US8249434B2 (en) Contents playing method and apparatus with play starting position control
US6687671B2 (en) Method and apparatus for automatic collection and summarization of meeting information
EP1536638A1 (en) Metadata preparing device, preparing method therefor and retrieving device
CN111523293A (en) Method and device for assisting user in information input in live broadcast teaching
US20160179225A1 (en) Paper Strip Presentation of Grouped Content
US20080064438A1 (en) Place Name Picture Annotation on Camera Phones
US20050281437A1 (en) Talking paper
JP3895892B2 (en) Multimedia information collection management device and storage medium storing program
JP2009526302A (en) Method and system for tagging digital data
JP2008282397A (en) Method for creating annotated transcript of presentation, information processing system, and computer program
US20170300746A1 (en) Organizing Written Notes Using Contextual Data
US20150116272A1 (en) Tagging of Written Notes Captured by a Smart Pen
JP2022020703A (en) Handwriting device and speech and handwriting communication system
CN104506624B (en) A kind of social information management system and management method
CN112788264A (en) Recording and video searching system and method thereof
CN101437115A (en) Digital camera and method for setting image name
JP2002374481A (en) File name setting system
WO2007058268A1 (en) Associating device
CN101110907A (en) Image display system
CN103186583B (en) A kind of information record based on mobile terminal and search method and device
JP2001318898A (en) Device and method for exchanging name card, and recording medium
JP4649944B2 (en) Moving image processing apparatus, moving image processing method, and program
KR101783872B1 (en) Video Search System and Method thereof
US7873637B2 (en) Automatically imparting an index by using various kinds of control signals

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination