CN114116763A - Information processing method, information processing device, electronic equipment and storage medium


Info

Publication number
CN114116763A
CN114116763A (application CN202111408932.6A)
Authority
CN
China
Prior art keywords
database
test paper
identification information
answer
question
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111408932.6A
Other languages
Chinese (zh)
Inventor
王军
秦瑞雄
王思梦
周剑一
熊逸
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Construction Bank Corp
Original Assignee
China Construction Bank Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Construction Bank Corp
Priority to CN202111408932.6A
Publication of CN114116763A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/24 Querying
    • G06F 16/242 Query formulation
    • G06F 16/2423 Interactive query statement specification based on a database schema
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/24 Querying
    • G06F 16/245 Query processing
    • G06F 16/2455 Query execution
    • G06F 16/24552 Database cache management
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/24 Querying
    • G06F 16/248 Presentation of query results
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 Services
    • G06Q 50/20 Education
    • G06Q 50/205 Education administration or guidance

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Educational Technology (AREA)
  • Educational Administration (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • General Business, Economics & Management (AREA)
  • Mathematical Physics (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

The present disclosure provides an information processing method that can be applied to the field of computer technology. The information processing method is applied to a first database used for caching, and comprises the following steps: acquiring an answer request, wherein the answer request carries user information and test paper identification information; in response to the answer request, when the questions associated with the test paper identification information are not found in the first database, obtaining the questions associated with the test paper identification information from a second database that stores questions, according to the test paper identification information, so as to cache them in the first database; obtaining target questions from the questions cached in the first database that are associated with the test paper identification information, according to a paper-grouping strategy; and sending a target test paper including the target questions to an answering end according to the user information. The present disclosure also provides an information processing apparatus, a device, a storage medium, and a program product.

Description

Information processing method, information processing device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an information processing method, apparatus, device, medium, and program product.
Background
At present, with the development of the online education industry, more and more users are willing to pursue online education and learning through computers, smart phones, and the like. In order to verify a user's learning results, online learning platforms generally check them by setting up unit tests and course-completion tests.
In carrying out the inventive concept of the present disclosure, the inventors found that at least the following problems exist in the related art: when the number of examination questions is large or a large number of users take examinations simultaneously, the database overhead is increased, so that the test paper loading speed is low, and the user experience is poor.
Disclosure of Invention
In view of the above, the present disclosure provides an information processing method, apparatus, device, medium, and program product that improve test paper loading speed.
According to a first aspect of the present disclosure, there is provided an information processing method applied to a first database used for caching, wherein the first database is in communication connection with a second database, the second database stores questions for assembling test papers, and the first database caches some of the questions obtained from the second database, the method including:
acquiring an answer request, wherein the answer request carries user information and test paper identification information;
in response to the answer request, when the question associated with the test paper identification information is not searched in the first database, obtaining the question associated with the test paper identification information from the second database according to the test paper identification information so as to cache the question associated with the test paper identification information into the first database;
acquiring a target question from the questions which are cached in the first database and are associated with the test paper identification information according to a paper grouping strategy; and
and sending the target test paper comprising the target question to an answering end according to the user information.
According to an embodiment of the present disclosure, the method further includes:
receiving a first storage request which is sent by the answering end and used for storing first answer data;
responding to the first storage request, and acquiring the first answer data;
and returning a first status code to the answering end when the first answer data is successfully stored in the first database.
According to an embodiment of the present disclosure, the method further includes:
when the first answer data is not stored in the first database, returning a second status code to the answering end and marking the first answer data;
receiving a second storage request which is sent by the answering end and used for storing second answer data;
responding to the second storage request, and acquiring the second answer data and the marked first answer data;
and returning the first status code to the answering end when the second answer data and the marked first answer data are successfully stored in the first database.
According to an embodiment of the present disclosure, the above method further comprises:
sending the target test paper to a server so as to store the target test paper to the server;
when the user quits the answering midway and answers again, acquiring the target test paper from the server according to the identification information of the target test paper and acquiring the answering data corresponding to the target test paper from the first database;
and assembling the target test paper and the answer data corresponding to the target test paper into the target test paper containing the answer data.
According to an embodiment of the present disclosure, the answer data is stored in the first database in a hash storage manner.
According to an embodiment of the present disclosure, the obtaining a question associated with the test paper identification information so as to cache the question associated with the test paper identification information in the first database includes:
acquiring a question associated with the test paper identification information;
dividing the questions related to the test paper into different categories according to the question types;
and caching the questions of different categories at different positions in the first database respectively.
According to an embodiment of the present disclosure, the method further includes:
and clearing the questions which are stored in the first database and are associated with the test paper identification information after a preset time period is exceeded.
According to an embodiment of the present disclosure, the paper-grouping strategy includes: a random paper-grouping strategy and a fixed paper-grouping strategy;
the random paper-grouping strategy comprises randomly acquiring, from the questions associated with the test paper identification information, the questions of a target question type in the target test paper;
the fixed paper-grouping strategy comprises acquiring fixed questions from the questions associated with the test paper identification information.
A second aspect of the present disclosure provides an information processing apparatus applied to a first database used for caching, wherein the first database is in communication connection with a second database, the second database stores questions for assembling test papers, and the first database caches some of the questions obtained from the second database, the apparatus including:
the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring an answer request, and the answer request carries user information and test paper identification information;
a second obtaining module, configured to, in response to the answer request, obtain, from the second database according to the test paper identification information, a question associated with the test paper identification information when the question associated with the test paper identification information is not queried in the first database, so as to cache the question associated with the test paper identification information in the first database;
a third obtaining module, configured to obtain a target question from questions, which are cached in the first database and associated with the test paper identification information, according to a paper grouping policy; and
and the first sending module is used for sending the target test paper containing the target question to an answering end according to the user information.
A third aspect of the present disclosure provides an electronic device, comprising: one or more processors; a memory for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the information processing method.
The fourth aspect of the present disclosure also provides a computer-readable storage medium having stored thereon executable instructions that, when executed by a processor, cause the processor to perform the above-mentioned information processing method.
The fifth aspect of the present disclosure also provides a computer program product comprising a computer program which, when executed by a processor, implements the above-described information processing method.
According to the embodiments of the present disclosure, when the questions associated with the test paper identification information are not found in the first database according to the test paper identification information, the questions associated with the test paper identification information are queried from the second database that stores questions and cached in the first database; target questions are then obtained from the cached questions in the first database according to the paper-grouping strategy, and the target test paper including the target questions is sent to the answering end according to the user information. Because the questions associated with the test paper identification information are cached in the first database, even when a large number of users take examinations, the target questions can be determined by accessing only the first database, without frequently accessing the second database. This reduces the number of queries to the second database, avoids slow operation on very large data sets, and increases the page loading speed. The problems in the related art that the test paper loads slowly and the user experience is poor when the number of test questions is large or a large number of users answer simultaneously are thereby at least partially solved.
Drawings
The foregoing and other objects, features and advantages of the disclosure will be apparent from the following description of embodiments of the disclosure, which proceeds with reference to the accompanying drawings, in which:
fig. 1 schematically illustrates an application scenario diagram of an information processing method, apparatus, device, medium, and program product according to embodiments of the present disclosure;
FIG. 2 schematically shows a flow chart of an information processing method according to an embodiment of the present disclosure;
fig. 3 schematically shows a flowchart of a method of submitting answer data by an answering end;
FIG. 4 is a flowchart schematically illustrating a method for generating a target test paper including answer data when a user answers again after quitting the test paper halfway;
fig. 5 schematically shows a block diagram of the structure of an information processing apparatus according to an embodiment of the present disclosure;
fig. 6 schematically shows a block diagram of the structure of an information processing apparatus according to another embodiment of the present disclosure; and
fig. 7 schematically shows a block diagram of an electronic device adapted to implement an information processing method according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.).
With the development of the online education industry, more and more users are willing to pursue online education and learning through a computer or a mobile phone, and online learning platforms generally check a user's learning results by setting up unit tests and course-completion tests.
In carrying out the inventive concept of the present disclosure, the inventors found that at least the following problems exist in the related art: when the number of examination questions is large or a large number of users take examinations simultaneously, the database overhead is increased, so that the test paper loading speed is low, and the user experience is poor.
An embodiment of the present disclosure provides an information processing method applied to a first database for caching, including: acquiring an answer request, wherein the answer request carries user information and test paper identification information; in response to the answer request, when the question associated with the test paper identification information is not searched in the first database, obtaining the question associated with the test paper identification information from the second database according to the test paper identification information so as to cache the question associated with the test paper identification information into the first database; acquiring a target question from the questions which are cached in the first database and are associated with the test paper identification information according to a paper grouping strategy; and sending the target test paper comprising the target question to an answering end according to the user information.
It should be noted that the method and apparatus for information processing according to the embodiments of the present disclosure may be used in the field of education and the field of computer technology, and may also be used in any technical field other than the field of education and the field of computer technology.
In the technical solution of the present disclosure, the acquisition, storage, and use of the personal information of the users involved all comply with the provisions of relevant laws and regulations, necessary confidentiality measures are taken, and public order and good morals are not violated.
Fig. 1 schematically shows an application scenario diagram of an information processing method, apparatus, device, medium, and program product according to an embodiment of the present disclosure.
As shown in fig. 1, the application scenario 100 according to this embodiment may include a network, a terminal device, and a server. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. The terminal devices 101, 102, 103 may have installed thereon various communication client applications, such as a learning-type application, a shopping-type application, a web browser application, a search-type application, an instant messaging tool, a mailbox client, social platform software, and the like (by way of example only).
The terminal devices 101, 102, 103 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like.
The server 105 may be a server providing various services, such as a background management server (for example only) providing support for websites browsed by users using the terminal devices 101, 102, 103. The background management server may analyze and perform other processing on the received data such as the user request, and feed back a processing result (e.g., a webpage, information, or data obtained or generated according to the user request) to the terminal device.
It should be noted that the information processing method provided by the embodiment of the present disclosure may be generally executed by the server 105. Accordingly, the information processing apparatus provided by the embodiment of the present disclosure may be generally provided in the server 105. The information processing method provided by the embodiment of the present disclosure may also be executed by a server or a server cluster that is different from the server 105 and is capable of communicating with the terminal devices 101, 102, 103 and/or the server 105. Accordingly, the information processing apparatus provided in the embodiment of the present disclosure may also be provided in a server or a server cluster different from the server 105 and capable of communicating with the terminal devices 101, 102, 103 and/or the server 105.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
The information processing method of the disclosed embodiment will be described in detail below with fig. 2 to 4 based on the scenario described in fig. 1.
Fig. 2 schematically shows a flow chart of an information processing method according to an embodiment of the present disclosure.
As shown in fig. 2, the information processing method of the embodiment includes operations S201 to S204, and the information processing method may be performed by a terminal device or a server.
In operation S201, an answer request is obtained, where the answer request carries user information and test paper identification information.
According to an embodiment of the present disclosure, the user information may be, for example, location information of the user, test type information, test number information, and the like. The test paper identification information may include, for example, the type of test paper, the difficulty of the test paper, and the like.
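As an illustration only (the patent does not fix a concrete schema, so every field name below is an assumption), such an answer request could be modeled like this:

```python
from dataclasses import dataclass

@dataclass
class AnswerRequest:
    # Illustrative fields only; names and types are assumptions, not part of the patent.
    user_id: str            # identifies the answering user
    region: str             # location information of the user
    exam_type: str          # test type information
    exam_no: str            # test number information
    paper_id: str           # test paper identification information
    paper_difficulty: str   # e.g. difficulty of the test paper
```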
In operation S202, in response to the answer request, in a case that the question associated with the test paper identification information is not queried in the first database, the question associated with the test paper identification information is obtained from the second database storing the question according to the test paper identification information, so that the question associated with the test paper identification information is cached in the first database.
According to the embodiment of the disclosure, the second database is used for storing various types of examination questions, and the quantity of the stored questions is larger than that of the questions stored in the first database.
According to an embodiment of the present disclosure, the first database may include, for example, a non-relational database, such as a Redis database (Redis is a key-value storage system).
A Key-value database is a database that stores data in Key-value pairs.
According to an embodiment of the present disclosure, the second database may include, for example, a relational database, such as an Oracle database (Oracle is a database management system). The second database storing questions is a database that stores a large number of questions, for example, a question bank.
Specifically, the second database in the embodiment of the present disclosure may be a relational database based on disk reads and writes, may store data split across tables, and can hold a large amount of data such as various examination questions. The first database may be a non-relational database that runs in server memory and therefore reads data quickly.
According to the embodiment of the present disclosure, a large amount of question data is stored in the second database in split tables, and the questions associated with a test paper are then stored in the first database so that many users can search them there. Because the first database is a memory-based non-relational database, its read speed is higher than that of the disk-based second database, which effectively speeds up searches when many users look up questions at the same time and improves the paper-grouping speed. At the same time, the frequency with which users access the second database is reduced, lowering the number of queries to the relational database.
According to the embodiment of the present disclosure, the questions associated with the test paper identification information are first queried from the first database. If they are already cached in the first database, the target questions are obtained from them directly; if they are not cached in the first database, the questions associated with the test paper identification information are obtained from the second database in which questions are stored, and the obtained questions are cached in the first database.
According to the embodiment of the present disclosure, for example, an IP address, port, account, password, and the like may be configured in advance for the first database to form a first configuration file, and likewise for the second database to form a second configuration file. When the method of the embodiment of the present disclosure is executed, the first configuration file and the second configuration file are first obtained to establish network connections to the first database and the second database, so that both databases can be accessed at any time. When a test paper needs to be queried, according to a preset logic, the questions associated with the test paper identification information are queried from the first database first; if they are not found there, they are queried from the second database, and the questions associated with the test paper identification information queried from the second database are then cached in the first database, realizing the data transfer between the second database and the first database.
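A minimal sketch of this lookup-then-cache flow, assuming the first database is Redis (accessed through the redis-py client) and hiding the relational second database behind a placeholder loader; the key layout, connection parameters, and the `load_questions_from_second_db` helper are illustrative assumptions, not the patent's implementation:

```python
import json
import redis

# Connection details would come from the first configuration file described above;
# the values here are placeholders.
cache = redis.Redis(host="127.0.0.1", port=6379, password="***",
                    db=0, decode_responses=True)

def load_questions_from_second_db(paper_id):
    """Placeholder for querying the relational second database (e.g. Oracle)
    using the second configuration file; returns a list of question dicts."""
    raise NotImplementedError

def get_questions(paper_id):
    key = f"paper:{paper_id}:questions"
    if not cache.exists(key):
        # Not cached yet: read from the second database and cache into the first.
        questions = load_questions_from_second_db(paper_id)
        if questions:
            cache.rpush(key, *(json.dumps(q) for q in questions))
    return [json.loads(q) for q in cache.lrange(key, 0, -1)]
```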
In operation S203, a target topic is obtained from topics associated with the test paper identification information cached in the first database according to the paper grouping policy.
According to an embodiment of the present disclosure, the paper-grouping strategy includes a random paper-grouping strategy and a fixed paper-grouping strategy. The random paper-grouping strategy may be used alone, the fixed paper-grouping strategy may be used alone, or the two may be used in combination.
According to the embodiment of the present disclosure, for example, if the questions associated with the test paper identification information comprise 300 questions, then 100, 200, 300, or any other number of them can be obtained. Obtaining the target questions may mean, for example, taking 100 of the 300 questions as the target questions; the 100 questions may be chosen at random, or a fixed set of 100 questions may be used.
In operation S204, a target test paper including a target question is transmitted to the answering terminal according to the user information.
According to the embodiment of the present disclosure, the target test paper including the target questions is sent to the answering end according to the user information, which means that the user information is associated with the target questions, making it convenient for the user to query the target test paper again.
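One possible way to keep that association, reusing the `cache` client from the sketch above; the key layout is an assumption for illustration:

```python
import json

def bind_target_paper_to_user(user_id, paper_id, target_question_ids):
    # Record which target questions were issued to this user, so the same
    # target test paper can be looked up again for this user later.
    cache.set(f"user:{user_id}:paper:{paper_id}", json.dumps(target_question_ids))
```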
According to the embodiment of the present disclosure, when the questions associated with the test paper identification information are not found in the first database according to the test paper identification information, the questions associated with the test paper identification information are queried from the second database that stores questions and cached in the first database; target questions are then obtained from the cached questions in the first database according to the paper-grouping strategy, and the target test paper including the target questions is sent to the answering end according to the user information. Because the questions associated with the test paper identification information are cached in the first database, even when a large number of users take examinations, the target questions can be determined by accessing only the first database, without frequently accessing the second database. This reduces the number of queries to the second database, avoids slow operation on very large data sets, and increases the page loading speed. The problems in the related art that the test paper loads slowly and the user experience is poor when the number of test questions is large or a large number of users take examinations simultaneously are thereby at least partially solved.
According to an embodiment of the present disclosure, the information processing method further includes:
receiving a first storage request sent by the answering end and used for storing first answer data;
responding to the first storage request, and acquiring the first answer data;
and returning a first status code to the answering end when the first answer data is successfully stored in the first database.
According to an embodiment of the present disclosure, the first answer data may include, for example, one-question answer data, and may also include multiple-question answer data.
According to an embodiment of the present disclosure, the information processing method further includes:
when the first answer data is not stored in the first database, returning a second status code to the answering end and marking the first answer data; receiving a second storage request which is sent by the answering end and used for storing second answer data;
responding to the second storage request, and acquiring the second answer data and the marked first answer data;
and returning the first status code to the answering end when the second answer data and the marked first answer data are successfully stored in the first database.
According to an embodiment of the present disclosure, the first status code may include, for example, information indicating that the answer data was stored successfully. For example, if the preset number "1" indicates that answer data was stored successfully, the first status code may be "1", and when the first answer data is successfully stored in the first database, the first status code "1" is returned to the answering end.
According to an embodiment of the present disclosure, the second status code may include, for example, information indicating that storing the answer data failed. For example, if the preset number "2" indicates that storing answer data failed, the second status code may be "2", and if the first answer data is not stored in the first database, the second status code "2" is returned to the answering end.
According to the embodiment of the present disclosure, "first" and "second" in the first storage request and the second storage request merely indicate the order in which the answering end sends requests for storing answer data; that is, the second storage request is initiated after the first storage request. The first answer data and the second answer data are both answer data: the first answer data is the answer data corresponding to the first storage request, and the second answer data is the answer data corresponding to the second storage request.
According to the embodiment of the present disclosure, the method further includes: after answering is finished at the answering end, sending all the answer data on the target test paper to the first database and storing it in the first database.
For example, the answering end may send a storage request for storing answer data once every time an answer is made for one question, or send a storage request for storing answer data once after multiple questions are made, and at this time, the answer data for multiple questions are all sent to the first database.
According to the embodiment of the present disclosure, when a storage request is sent to the first database and the status code returned by the first database is the preset status code, the answer data has been successfully stored in the first database. When the returned status code is not the preset status code, the answer data has not been stored in the first database and is marked. The next time the answering end sends a storage request, the new answer data and the previously marked answer data are sent to the first database together, so that the answer data can still be stored in the first database and none of it is lost, which ensures the accuracy of the answer score. At the same time, when storage fails, only the answer data that failed to be stored is marked and no storage-failure window is popped up, which improves the user experience.
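The marking-and-retry flow might look like the following sketch, which collapses the answering end and the first-database service into one process for brevity, reuses the `cache` client from the earlier sketch, and uses "1"/"2" as the example status codes from above; the hash key layout and error handling are assumptions:

```python
import redis

STATUS_OK = "1"     # example first status code: answer data stored successfully
STATUS_FAIL = "2"   # example second status code: storage failed

pending = []        # answer data "marked" after a failed store

def handle_storage_request(user_id, paper_id, answers):
    """Store the newly submitted answers together with any previously marked ones."""
    batch = pending + answers
    try:
        for a in batch:
            # One hash per user and paper; field = question id, value = the answer.
            cache.hset(f"answers:{user_id}:{paper_id}", a["question_id"], a["answer"])
    except redis.RedisError:
        pending.extend(answers)   # mark the unsaved answer data for the next request
        return STATUS_FAIL
    pending.clear()               # storage succeeded, so the marking is removed
    return STATUS_OK
```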
Fig. 3 schematically shows a flowchart of a method for submitting answer data by an answering terminal.
As shown in fig. 3, the method includes operations S301 to S314.
In operation S301, the answering end starts answering.
In operation S302, the first database receives a first storage request for storing first answer data sent by the answering terminal.
In operation S303, the first database acquires the first answer data in response to the first storage request, and returns a status code to the answer end.
In operation S304, it is determined whether the status code is a preset status code, and in case that the status code is the preset status code, operation S305 is performed; in case the status code is not the preset status code, operation S306 is performed.
In operation S305, the first answer data is successfully stored in the first database.
In operation S306, the first answer data is marked.
In operation S307, the first database receives a second storage request for storing second answer data sent by the answering terminal.
In operation S308, the first database acquires the second answer data and the marked first answer data in response to the second storage request, and returns a status code to the answer terminal. According to the embodiment of the present disclosure, before performing operation S308, it may be determined whether marked first answer data exists, if the marked first answer data exists, operation S308 is performed, and if the marked first answer data does not exist, only the second answer data is obtained.
In operation S309, it is determined whether the status code is a preset status code, and in case that the status code is the preset status code, operation S310 is performed; in case the status code is not the preset status code, operation S311 is performed.
In operation S310, the second answer data and the marked first answer data are successfully stored in the first database, and the marking of the first answer data is removed.
In operation S311, the second answer data is marked.
In operation S312, the answering end finishes answering and the user clicks to submit the paper.
In operation S313, all answer data on the target test paper is obtained and stored in the first database, so that all answer data can be stored in the first database of the server.
In operation S314, the answer is ended.
According to an embodiment of the present disclosure, the above method further comprises:
sending the target test paper to a server so as to store the target test paper to the server;
when the user quits answering halfway and answers again, acquiring a target test paper from the server according to the identification information of the target test paper and acquiring answer data corresponding to the target test paper from the first database;
and assembling the target test paper and the answer data corresponding to the target test paper into the target test paper containing the answer data.
According to the embodiment of the present disclosure, the server may include, for example, a file server such as a NAS (Network Attached Storage); NAS is a technology that consolidates distributed, independent data into a large, centrally managed data store so that it can be accessed by different hosts and application servers.
According to the embodiment of the present disclosure, persisting the target test paper to the server makes it convenient for the user to query the data a second time; for example, a user who quits midway and needs to answer again can query the target test paper directly from the server. In addition, the target test paper is persisted on the server rather than in a large field of a relational database (such as the second database), which reduces the table-space occupation in the relational database, allows the relational database to store more data, and can improve the efficiency of querying table data.
According to the embodiment of the present disclosure, the answer data is persisted to the first database in real time, which solves the performance problem of persisting answer data when a large number of users answer at the same time; in addition, when a user quits midway, the answer data can conveniently be retrieved from the first database.
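A sketch of how the resumed target test paper could be reassembled, again reusing the `cache` client from the earlier sketch; `fetch_paper_from_file_server` is a placeholder for reading the persisted paper from the file server, not a real API:

```python
def fetch_paper_from_file_server(paper_id):
    """Placeholder for reading the persisted target test paper (a dict with a
    'questions' list) from the file server, e.g. a NAS share."""
    raise NotImplementedError

def resume_paper(user_id, paper_id):
    paper = fetch_paper_from_file_server(paper_id)
    # The answer data was persisted to the first database as a hash in real time.
    answers = cache.hgetall(f"answers:{user_id}:{paper_id}")
    for question in paper["questions"]:
        question["answer"] = answers.get(str(question["question_id"]), "")
    return paper   # the target test paper containing the answer data
```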
FIG. 4 is a flowchart schematically illustrating a method for generating a target test paper including answer data when a user answers again after quitting the test paper halfway.
As shown in fig. 4, the method includes operations S401 to S404.
In operation S401, a target test paper is acquired from a server;
in operation S402, acquiring answer data corresponding to a target test paper from a first database;
in operation S403, the target test paper acquired in operation S401 and the answer data acquired in operation S402 are assembled to form a target test paper containing the answer data.
In operation S404, the target test paper including the answer data in operation S403 is displayed to the answering terminal so that the user can continue to answer.
According to the embodiment of the disclosure, the answer data is stored in the first database in a hash storage manner.
According to the embodiment of the present disclosure, the answer data is stored in the first database (such as a Redis database), where data is read from memory, which supports high concurrency. Meanwhile, the answer data is stored in a hash storage manner, so a map structure is produced when it is read back and each lookup has O(1) time complexity, making reads convenient and loading fast.
The time complexity of an algorithm is a function that describes how its running time grows with the input size. A time complexity of O(1) means the required time is constant.
According to an embodiment of the present disclosure, obtaining a question associated with the test paper identification information, so as to cache the question associated with the test paper identification information in a first database includes:
acquiring a question associated with the test paper identification information;
dividing the questions related to the test paper into different categories according to the question types;
and caching the questions of different categories at different positions in the first database respectively.
According to the embodiment of the present disclosure, for example, the questions associated with the test paper may include single-choice questions, multiple-choice questions, true/false questions, fill-in-the-blank questions, and the like, and each category is stored separately. On the one hand this makes it convenient to store a large amount of data; on the other hand it also speeds up obtaining the questions, and therefore speeds up loading the test paper.
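Splitting the cache by question type could be sketched as follows, as a refinement of the earlier single-key sketch; the type names and key layout are illustrative assumptions:

```python
import json

QUESTION_TYPES = ("single_choice", "multiple_choice", "true_false", "fill_blank")

def cache_questions_by_type(paper_id, questions):
    # Each question type gets its own key ("position") in the first database,
    # so a paper-grouping strategy can draw from one type without scanning all.
    for q in questions:
        cache.rpush(f"paper:{paper_id}:{q['type']}", json.dumps(q))
```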
According to an embodiment of the present disclosure, the method further includes:
and clearing the questions which are stored in the first database and are associated with the test paper identification information after the preset time period is exceeded.
According to an embodiment of the present disclosure, the paper-grouping strategy includes: a random paper-grouping strategy and a fixed paper-grouping strategy;
the random paper-grouping strategy comprises randomly acquiring, from the questions associated with the test paper identification information, the questions of a target question type in the target test paper;
the fixed paper-grouping strategy comprises acquiring fixed questions from the questions associated with the test paper identification information.
According to an embodiment of the present disclosure, for example, acquiring the target questions using the random paper-grouping strategy includes acquiring, from the second database according to the test paper identification information, 300 questions associated with the test paper identification information, where the 300 questions include 100 single-choice questions, 100 multiple-choice questions, and 100 true/false questions. The single-choice questions of the target test paper are then drawn at random from the 100 single-choice questions, the multiple-choice questions from the 100 multiple-choice questions, and the true/false questions from the 100 true/false questions. When each user taking the test obtains target questions with the random paper-grouping strategy, the target questions of each user are different.
According to the embodiment of the disclosure, the number of questions associated with the test paper identification information is more than the number of questions in the target test paper, so that different users can extract different target questions conveniently.
According to the embodiment of the present disclosure, for example, acquiring the target questions using the fixed paper-grouping strategy includes acquiring, from the second database according to the test paper identification information, 300 questions associated with the test paper identification information, and then taking a fixed set of 100 of those 300 questions. Each user taking the test uses the same 100 acquired questions as the target questions, that is, the target questions of each user are identical.
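Both strategies could be expressed with one selection routine, as in the sketch below; the per-type counts and the reuse of the type-split keys and `cache` client from the earlier sketches are assumptions:

```python
import json
import random

def group_paper(paper_id, counts, fixed=False):
    """counts maps a question type to how many questions the target paper needs,
    e.g. {"single_choice": 40, "multiple_choice": 30, "true_false": 30}."""
    target = []
    for qtype, n in counts.items():
        # The cached pool per type is assumed to hold at least n questions,
        # consistent with the bank being larger than the target paper.
        pool = [json.loads(q)
                for q in cache.lrange(f"paper:{paper_id}:{qtype}", 0, -1)]
        # Fixed strategy: every user gets the same first n questions.
        # Random strategy: each user gets a different random sample of n questions.
        target.extend(pool[:n] if fixed else random.sample(pool, n))
    return target
```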
Based on the information processing method, the disclosure also provides an information processing device. The apparatus will be described in detail below with reference to fig. 5.
Fig. 5 schematically shows a block diagram of the structure of an information processing apparatus according to an embodiment of the present disclosure.
As shown in fig. 5, the information processing apparatus 500 of this embodiment includes a first obtaining module 510, a second obtaining module 520, a third obtaining module 530, and a first sending module 540.
The first obtaining module 510 is configured to obtain an answer request, where the answer request carries user information and test paper identification information. In an embodiment, the first obtaining module 510 may be configured to perform the operation S201 described above, which is not described herein again.
The second obtaining module 520 is configured to, in response to the answer request, obtain, from the second database storing questions according to the test paper identification information, a question associated with the test paper identification information when the question associated with the test paper identification information is not queried in the first database, so as to cache the question associated with the test paper identification information in the first database. In an embodiment, the second obtaining module 520 may be configured to perform the operation S202 described above, which is not described herein again.
The third obtaining module 530 is configured to obtain a target topic from topics cached in the first database and associated with the test paper identification information according to the paper grouping policy. In an embodiment, the third obtaining module 530 may be configured to perform the operation S203 described above, which is not described herein again.
The first sending module 540 is configured to send a target test paper including a target question to the answering end according to the user information. In an embodiment, the first sending module 540 may be configured to perform the operation S204 described above, and is not described herein again.
Fig. 6 schematically shows a block diagram of the structure of an information processing apparatus according to another embodiment of the present disclosure.
As shown in fig. 6, the information processing apparatus 500 of this embodiment further includes a first receiving module 550, a fourth obtaining module 560, and a first returning module 570.
The first receiving module 550 is configured to receive a first storage request sent by the answering end and used for storing first answer data.
The fourth obtaining module 560 is configured to obtain the first answer data in response to the first storage request.
The first returning module 570 is configured to return the first status code to the answering end when the first answer data is successfully stored in the first database.
According to an embodiment of the present disclosure, the information processing apparatus 500 of this embodiment further includes a marking module, a second receiving module, a fifth obtaining module, and a second returning module.
And the marking module is used for returning the second state code to the answering end and marking the first answer data under the condition that the first answer data is not stored in the first database.
And the second receiving module is used for receiving a second storage request which is sent by the answering end and used for storing second answering data.
And the fifth acquisition module is used for responding to the second storage request and acquiring the second answer data and the marked first answer data.
And the second returning module is used for returning the first state code to the answering end under the condition that the second answering data and the marked first answering data are successfully stored in the first database.
According to an embodiment of the present disclosure, the information processing apparatus 500 of this embodiment further includes a second sending module, a sixth obtaining module, and an assembling module.
And the second sending module is used for sending the target test paper to the server so as to store the target test paper to the server.
And the sixth acquisition module is used for acquiring the target test paper from the server according to the identification information of the target test paper and acquiring the answer data corresponding to the target test paper from the first database under the condition that the user quits the answer in midway and answers again.
And the assembling module is used for assembling the target test paper and the answer data corresponding to the target test paper into the target test paper containing the answer data.
According to the embodiment of the disclosure, the answer data is stored in the first database in a hash storage manner.
According to an embodiment of the present disclosure, the second obtaining module includes an obtaining unit, a dividing unit, and a caching unit.
The obtaining unit is used to obtain the questions associated with the test paper identification information.
The dividing unit is used to divide the questions associated with the test paper into different categories according to question type.
The caching unit is used to cache the questions of different categories at different positions in the first database respectively.
According to an embodiment of the present disclosure, the information processing apparatus 500 of this embodiment further includes a clearing module.
And the clearing module is used for clearing the question which is stored in the first database and is associated with the test paper identification information after the preset time period is exceeded.
According to an embodiment of the present disclosure, the paper-grouping strategy includes: a random paper-grouping strategy and a fixed paper-grouping strategy.
According to an embodiment of the present disclosure, the random paper-grouping strategy includes randomly acquiring, from the questions associated with the test paper identification information, the questions of a target question type in the target test paper.
According to an embodiment of the present disclosure, the fixed paper-grouping strategy includes acquiring fixed questions from the questions associated with the test paper identification information.
According to the embodiment of the present disclosure, any plurality of the first obtaining module 510, the second obtaining module 520, the third obtaining module 530, the first sending module 540, the first receiving module 550, the fourth obtaining module 560, and the first returning module 570 may be combined into one module to be implemented, or any one of them may be split into a plurality of modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of the other modules and implemented in one module. According to the embodiment of the present disclosure, at least one of the first obtaining module 510, the second obtaining module 520, the third obtaining module 530, the first sending module 540, the first receiving module 550, the fourth obtaining module 560 and the first returning module 570 may be at least partially implemented as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented by hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or implemented by any one of three implementation manners of software, hardware and firmware, or by a suitable combination of any several of them. Alternatively, at least one of the first obtaining module 510, the second obtaining module 520, the third obtaining module 530, the first sending module 540, the first receiving module 550, the fourth obtaining module 560 and the first returning module 570 may be at least partially implemented as a computer program module, which may perform a corresponding function when executed.
Fig. 7 schematically shows a block diagram of an electronic device adapted to implement an information processing method according to an embodiment of the present disclosure.
As shown in fig. 7, an electronic device 700 according to an embodiment of the present disclosure includes a processor 701, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)702 or a program loaded from a storage section 708 into a Random Access Memory (RAM) 703. The processor 701 may include, for example, a general purpose microprocessor (e.g., a CPU), an instruction set processor and/or associated chipset, and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), among others. The processor 701 may also include on-board memory for caching purposes. The processor 701 may comprise a single processing unit or a plurality of processing units for performing the different actions of the method flows according to embodiments of the present disclosure.
In the RAM 703, various programs and data necessary for the operation of the electronic apparatus 700 are stored. The processor 701, the ROM 702, and the RAM 703 are connected to each other by a bus 704. The processor 701 performs various operations of the method flows according to the embodiments of the present disclosure by executing programs in the ROM 702 and/or the RAM 703. Note that the above-described programs may also be stored in one or more memories other than the ROM 702 and the RAM 703. The processor 701 may also perform various operations of the method flows according to embodiments of the present disclosure by executing programs stored in the one or more memories described above.
Electronic device 700 may also include input/output (I/O) interface 705, which input/output (I/O) interface 705 is also connected to bus 704, according to an embodiment of the present disclosure. The electronic device 700 may also include one or more of the following components connected to the I/O interface 705: an input portion 706 including a keyboard, a mouse, and the like; an output section 707 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 708 including a hard disk and the like; and a communication section 709 including a network interface card such as a LAN card, a modem, or the like. The communication section 709 performs communication processing via a network such as the internet. A drive 710 is also connected to the I/O interface 705 as needed. A removable medium 711 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 710 as necessary, so that a computer program read out therefrom is mounted into the storage section 708 as necessary.
The present disclosure also provides a computer-readable storage medium, which may be contained in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the disclosure.
According to embodiments of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, for example, but is not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. For example, according to embodiments of the present disclosure, a computer-readable storage medium may include the ROM 702 and/or the RAM 703 and/or one or more memories other than the ROM 702 and the RAM 703 described above.
Embodiments of the present disclosure also include a computer program product comprising a computer program containing program code for performing the method illustrated in the flowchart. When the computer program product runs in a computer system, the program code causes the computer system to implement the information processing method provided by the embodiments of the present disclosure.
The computer program performs the above-described functions defined in the system/apparatus of the embodiments of the present disclosure when executed by the processor 701. The systems, apparatuses, modules, units, etc. described above may be implemented by computer program modules according to embodiments of the present disclosure.
In one embodiment, the computer program may be hosted on a tangible storage medium such as an optical storage device, a magnetic storage device, or the like. In another embodiment, the computer program may also be transmitted in the form of a signal on a network medium, distributed, downloaded and installed via the communication section 709, and/or installed from the removable medium 711. The computer program containing program code may be transmitted using any suitable network medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
In accordance with embodiments of the present disclosure, the program code for carrying out the computer programs provided by the embodiments of the present disclosure may be written in any combination of one or more programming languages; in particular, these computer programs may be implemented using high level procedural and/or object oriented programming languages, and/or assembly/machine languages. The programming languages include, but are not limited to, Java, C++, Python, the C language, and the like. The program code may execute entirely on the user computing device, partly on the user device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that various combinations and/or sub-combinations of the features recited in the various embodiments and/or claims of the present disclosure can be made, even if such combinations or sub-combinations are not expressly recited in the present disclosure. In particular, various combinations and/or sub-combinations of the features recited in the various embodiments and/or claims of the present disclosure may be made without departing from the spirit or teaching of the present disclosure. All such combinations and/or sub-combinations fall within the scope of the present disclosure.
The embodiments of the present disclosure have been described above. However, these examples are for illustrative purposes only and are not intended to limit the scope of the present disclosure. Although the embodiments are described separately above, this does not mean that the measures in the embodiments cannot be used in advantageous combination. The scope of the disclosure is defined by the appended claims and equivalents thereof. Various alternatives and modifications can be devised by those skilled in the art without departing from the scope of the present disclosure, and such alternatives and modifications are intended to be within the scope of the present disclosure.

Claims (12)

1. An information processing method applied to a first database used for caching, wherein the first database is in communication connection with a second database, the second database stores questions for assembling test papers, and some of the questions obtained from the second database are cached in the first database, the method comprising:
acquiring an answer request, wherein the answer request carries user information and test paper identification information;
in response to the answer request, under the condition that the question associated with the test paper identification information is not found in the first database, acquiring the question associated with the test paper identification information from the second database according to the test paper identification information, so as to cache the question associated with the test paper identification information in the first database;
acquiring a target question from the questions which are cached in the first database and are associated with the test paper identification information according to a paper grouping strategy; and
sending the target test paper comprising the target question to an answering end according to the user information.
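For orientation only, the following is a minimal Python sketch (not part of the claims) of the cache-aside flow described in claim 1: on a cache miss the questions for the requested test paper are pulled from the second database into the first database, a paper grouping strategy then selects the target questions from the cached questions, and the assembled target test paper is returned for the answering end. The dictionary layout, field names and the sample policy are hypothetical illustrations rather than the claimed implementation.

import random

# In-memory stand-ins for the two databases; a real deployment would use a
# cache service and a persistent question bank instead.
second_db = {
    "paper-001": [
        {"id": "q1", "type": "single_choice", "text": "..."},
        {"id": "q2", "type": "single_choice", "text": "..."},
        {"id": "q3", "type": "true_false", "text": "..."},
    ],
}
first_db = {}  # questions already cached from the second database


def handle_answer_request(user_info, paper_id, grouping_policy):
    # Cache miss: fetch the questions associated with the test paper
    # identification information and cache them in the first database.
    if paper_id not in first_db:
        first_db[paper_id] = second_db[paper_id]
    # Select the target questions from the cached questions according to the
    # paper grouping strategy and assemble the target test paper.
    target_questions = grouping_policy(first_db[paper_id])
    return {"user": user_info, "paper_id": paper_id, "questions": target_questions}


def sample_policy(questions, n=2):
    # A trivial random grouping policy used only for this sketch.
    return random.sample(questions, k=min(n, len(questions)))


paper = handle_answer_request({"user_id": "u42"}, "paper-001", sample_policy)
print(len(paper["questions"]))  # 2
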
2. The method of claim 1, further comprising:
receiving a first storage request which is sent by the answering end and is used for storing first answer data;
responding to the first storage request, and acquiring the first answer data;
and under the condition that the first answer data is successfully stored in the first database, returning a first status code to the answering end.
3. The method of claim 2, further comprising:
under the condition that the first answer data fails to be stored in the first database, returning a second status code to the answering end, and marking the first answer data;
receiving a second storage request which is sent by the answering end and is used for storing second answer data;
responding to the second storage request, and acquiring the second answer data and the marked first answer data;
and under the condition that the second answer data and the marked first answer data are successfully stored in the first database, returning the first status code to the answering end.
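As an informal reading of claims 2 and 3 (not part of the claims), the sketch below shows the status-code flow: a successful write of answer data to the first database returns a first status code, a failed write returns a second status code and marks the data, and the marked data is re-stored together with the answer data of the next storage request. The concrete codes, field names and the injected failure flag are hypothetical.

STATUS_OK = "1"      # hypothetical first status code
STATUS_RETRY = "2"   # hypothetical second status code

answer_store = {}    # stands in for answer data held in the first database
marked = []          # answer data marked after a failed write


def try_write(record):
    # Simulate a write to the first database; a real system would catch the
    # cache client's exception rather than check an injected "fail" flag.
    if record.get("fail"):
        return False
    answer_store[record["question_id"]] = record["answer"]
    return True


def store_answers(records):
    # Re-submit any previously marked answer data together with the new data.
    to_write = marked + list(records)
    failed = [r for r in to_write if not try_write(r)]
    marked.clear()
    marked.extend(failed)  # mark whatever still did not persist
    return STATUS_OK if not failed else STATUS_RETRY


rec = {"question_id": "q1", "answer": "A", "fail": True}
print(store_answers([rec]))                                    # "2": write failed, data marked
rec.pop("fail")                                                # the transient failure clears
print(store_answers([{"question_id": "q2", "answer": "B"}]))   # "1": marked and new data stored
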
4. The method of claim 2, further comprising:
sending the target test paper to a server, so as to store the target test paper on the server;
when a user exits midway through answering and then resumes answering, acquiring the target test paper from the server according to identification information of the target test paper, and acquiring answer data corresponding to the target test paper from the first database; and
assembling the target test paper and the answer data corresponding to the target test paper into a target test paper containing the answer data.
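As a non-authoritative illustration of claim 4, the sketch below resumes an interrupted answering session by recovering the generated target test paper from a server-side store by its identification information and merging in the answer data kept in the first database; the plain dictionaries and key names are placeholders for whatever storage the real system uses.

paper_server = {}   # target test papers stored on the server, keyed by paper id
answer_cache = {}   # answer data in the first database, keyed by paper id


def resume_session(paper_id):
    paper = paper_server[paper_id]              # recover the generated paper
    answers = answer_cache.get(paper_id, {})    # recover any saved answers
    # Assemble a target test paper that already contains the saved answers.
    return [{**q, "answer": answers.get(q["id"])} for q in paper["questions"]]


paper_server["paper-001"] = {"questions": [{"id": "q1", "text": "..."}]}
answer_cache["paper-001"] = {"q1": "A"}
print(resume_session("paper-001"))  # [{'id': 'q1', 'text': '...', 'answer': 'A'}]
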
5. The method according to any one of claims 2 to 4, wherein the answer data is stored in the first database in a hash storage manner.
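Claim 5 only states that the answer data is stored in a hash storage manner; one plausible reading, sketched below with the redis-py client and assuming the first database is a Redis-like store reachable locally, keeps one hash per user and test paper with one field per question. The key layout and field values are hypothetical.

import redis

# Assumes a Redis instance is running locally; adjust host/port as needed.
r = redis.Redis(host="localhost", port=6379, decode_responses=True)

key = "answers:u42:paper-001"                    # hypothetical key layout
r.hset(key, mapping={"q1": "A", "q3": "true"})   # store answer data as hash fields
saved = r.hgetall(key)                           # read all saved answers back
print(saved)                                     # {'q1': 'A', 'q3': 'true'}
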
6. The method of claim 1, wherein the acquiring the question associated with the test paper identification information so as to cache the question associated with the test paper identification information in the first database comprises:
acquiring the questions associated with the test paper identification information;
dividing the questions associated with the test paper identification information into different categories according to question types; and
caching the questions of the different categories at different locations in the first database respectively.
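A minimal sketch (not the claimed implementation) of claim 6: the questions fetched for a test paper are grouped by question type and each group is cached under its own key, i.e. at a different location in the first database. The key naming scheme is a hypothetical example.

from collections import defaultdict


def cache_by_type(first_db, paper_id, questions):
    groups = defaultdict(list)
    for q in questions:
        groups[q["type"]].append(q)   # e.g. single_choice, true_false
    for q_type, items in groups.items():
        # One cache location per question category.
        first_db[f"paper:{paper_id}:{q_type}"] = items


cache = {}
cache_by_type(cache, "paper-001", [
    {"id": "q1", "type": "single_choice"},
    {"id": "q3", "type": "true_false"},
])
print(sorted(cache))  # ['paper:paper-001:single_choice', 'paper:paper-001:true_false']
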
7. The method of claim 1, further comprising:
clearing the questions which are stored in the first database and are associated with the test paper identification information after a preset time period has elapsed.
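If the first database is a Redis-like cache (an assumption, not stated in the claim), the clearing of claim 7 can be expressed as a time-to-live on the cached keys; the 24-hour period and the key name below are arbitrary examples.

import redis

r = redis.Redis(host="localhost", port=6379)
# Expire the cached questions for this paper after a preset period (in seconds).
r.expire("paper:paper-001:single_choice", 24 * 60 * 60)
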
8. The method of claim 1, wherein the paper grouping strategy comprises a random paper grouping strategy and a fixed paper grouping strategy;
wherein the random paper grouping strategy comprises randomly acquiring, from the questions associated with the test paper identification information, a question of a target question type in the target test paper; and
the fixed paper grouping strategy comprises acquiring a fixed question from the questions associated with the test paper identification information.
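The two strategies of claim 8 can be read, informally, as the two selection functions sketched below: the random strategy samples questions of a target question type from the cached questions, while the fixed strategy returns a predetermined list of questions. The question counts and identifiers are hypothetical.

import random


def random_grouping(cached_questions, target_type, n):
    # Randomly pick n questions of the target question type.
    candidates = [q for q in cached_questions if q["type"] == target_type]
    return random.sample(candidates, k=min(n, len(candidates)))


def fixed_grouping(cached_questions, fixed_ids):
    # Return the predetermined questions in the given order.
    by_id = {q["id"]: q for q in cached_questions}
    return [by_id[qid] for qid in fixed_ids if qid in by_id]
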
9. An information processing apparatus applied to a first database used for caching, wherein the first database is in communication connection with a second database, the second database stores questions for assembling test papers, and some of the questions obtained from the second database are cached in the first database, the apparatus comprising:
a first obtaining module, configured to obtain an answer request, wherein the answer request carries user information and test paper identification information;
a second obtaining module, configured to, in response to the answer request and under the condition that the question associated with the test paper identification information is not found in the first database, obtain the question associated with the test paper identification information from the second database according to the test paper identification information, so as to cache the question associated with the test paper identification information in the first database;
a third obtaining module, configured to obtain a target question from questions cached in the first database and associated with the test paper identification information according to a paper grouping policy; and
a first sending module, configured to send the target test paper comprising the target question to an answering end according to the user information.
10. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the method of any of claims 1-8.
11. A computer readable storage medium having stored thereon executable instructions which, when executed by a processor, cause the processor to perform the method of any one of claims 1 to 8.
12. A computer program product comprising a computer program which, when executed by a processor, implements a method according to any one of claims 1 to 8.
CN202111408932.6A 2021-11-24 2021-11-24 Information processing method, information processing device, electronic equipment and storage medium Pending CN114116763A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111408932.6A CN114116763A (en) 2021-11-24 2021-11-24 Information processing method, information processing device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111408932.6A CN114116763A (en) 2021-11-24 2021-11-24 Information processing method, information processing device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114116763A (en) 2022-03-01

Family

ID=80372448

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111408932.6A Pending CN114116763A (en) 2021-11-24 2021-11-24 Information processing method, information processing device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114116763A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115269644A (en) * 2022-06-09 2022-11-01 知学云(北京)科技股份有限公司 High-performance online examination method, system and equipment

Similar Documents

Publication Publication Date Title
CN108153798B (en) Page information processing method, device and system
US20190251087A1 (en) Method and apparatus for providing aggregate result of question-and-answer information
US20200034164A1 (en) System for utilizing one or more data sources to generate a customized interface
US20170344745A1 (en) System for utilizing one or more data sources to generate a customized set of operations
CN110888696A (en) Page display method and system, computer system and computer readable medium
US8856365B2 (en) Computer-implemented method, computer system and computer readable medium
CN107239701B (en) Method and device for identifying malicious website
CN107038194B (en) Page jump method and device
CN111125107A (en) Data processing method, device, electronic equipment and medium
CN109359237A (en) It is a kind of for search for boarding program method and apparatus
US20190147540A1 (en) Method and apparatus for outputting information
EP3188051A1 (en) Systems and methods for search template generation
US11687715B2 (en) Summary generation method and apparatus
CN114116763A (en) Information processing method, information processing device, electronic equipment and storage medium
CN111859077A (en) Data processing method, device, system and computer readable storage medium
CN113132400B (en) Business processing method, device, computer system and storage medium
US10021210B1 (en) Providing faster data access using multiple caching servers
CN110110184B (en) Information inquiry method, system, computer system and storage medium
CN103379022B (en) A kind of instant communication method based on Internet map search and system
US9940364B2 (en) Obtaining desired web content from a mobile device
CN113495498B (en) Simulation method, simulator, device and medium for hardware device
CN114443663A (en) Data table processing method, device, equipment and medium
US20160140188A1 (en) Systems, methods, and computer-readable media for searching tabular data
CN114003659A (en) Data synchronization method, data synchronization device, electronic equipment, storage medium and program product
CN109522211B (en) Interface parameter transmission method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination