CN113408275A - Word learning method, device and system and computing equipment - Google Patents

Word learning method, device and system and computing equipment

Info

Publication number
CN113408275A
Authority
CN
China
Prior art keywords
word
learning
words
target
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010182902.7A
Other languages
Chinese (zh)
Other versions
CN113408275B (en)
Inventor
王捷
丁彦
薛颜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Beiwan Information Technology Co ltd
Original Assignee
Nanjing Beiwan Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Beiwan Information Technology Co ltd filed Critical Nanjing Beiwan Information Technology Co ltd
Priority to CN202010182902.7A
Publication of CN113408275A
Application granted
Publication of CN113408275B
Legal status: Active

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00 Teaching not covered by other main groups of this subclass
    • G09B19/06 Foreign languages
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00 Electrically-operated educational appliances
    • G09B5/02 Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip

Abstract

An embodiment of the invention discloses a word learning method comprising the following steps: acquiring a target word to be learned currently in a word set, wherein the target word is determined based on the learning progress of a user on the word set and the learning sequence of each word in the word set; displaying the acquired target word; and displaying an associated example sentence corresponding to the target word, wherein the associated example sentence contains at least the target word and other words in the word set. Embodiments of the invention also disclose a corresponding word learning device, word learning system and computing device.

Description

Word learning method, device and system and computing equipment
Technical Field
The invention relates to the technical field of language learning, and in particular to a word learning method, device, system and computing device.
Background
A language is composed of a large number of words; words are therefore the basis of a language, and word learning is an important part of language learning. Most current word learning schemes assist in memorizing words based on the Ebbinghaus forgetting curve.
The Ebbinghaus forgetting curve, discovered through the research of the German psychologist Hermann Ebbinghaus, describes the pattern by which the human brain forgets newly learned material. The curve indicates that forgetting during learning follows a regular pattern: the forgetting process is fast at first and then slows down. Therefore, based on the forgetting curve, review of words can be scheduled at the points where memory begins to fade, so as to obtain a better memorization effect.
However, Ebbinghaus's experiments used meaningless letter combinations that had no relationship to one another, whereas in actual word learning many words are related: once a word has been learned, other words closely related to it can be learned more easily. Arranging learning and review strictly according to the Ebbinghaus forgetting curve therefore yields low learning efficiency.
Therefore, a more advanced word learning scheme is desired.
Disclosure of Invention
To this end, embodiments of the present invention provide a word learning method, apparatus, system and computing device in an effort to solve or at least alleviate the above-identified problems.
According to an aspect of an embodiment of the present invention, there is provided a word learning method including: acquiring a target word to be learned currently in a word set, wherein the target word is determined based on the learning progress of a user on the word set and the learning sequence of each word in the word set; displaying the acquired target word; and displaying an associated illustrative sentence corresponding to the target word, wherein the associated illustrative sentence at least comprises the target word and other words in the word set.
Optionally, in the method according to the embodiment of the present invention, the step of displaying the associated example sentence of the target word includes: in response to a marking operation of the user on the target word, displaying the associated example sentence corresponding to the target word, wherein the marking operation indicates the user's degree of familiarity with the target word.
According to another aspect of an embodiment of the present invention, there is provided a word learning method including: receiving a word set selected by a user; acquiring a plurality of associated example sentences corresponding to the word set, wherein the associated example sentences comprise a plurality of words in the word set; and configuring the learning sequence of each word in the word set based on the word set and the plurality of associated example sentences, and determining the associated example sentence corresponding to each word.
Optionally, in the method according to the embodiment of the present invention, the step of obtaining a plurality of associated example sentences corresponding to a word set includes: and acquiring the associated example sentences from corresponding sources based on the types of the word sets.
According to another aspect of an embodiment of the present invention, there is provided a word learning apparatus including: a communication module adapted to acquire a target word to be learned currently in a word set, the target word being determined based on the learning progress of a user on the word set and the learning sequence of each word in the word set; and a display module adapted to display the acquired target word, and further adapted to display an associated example sentence corresponding to the target word, wherein the associated example sentence contains at least the target word and other words in the word set.
According to another aspect of an embodiment of the present invention, there is provided a word learning apparatus including: a communication module adapted to receive a set of words selected by a user; the example sentence acquisition module is suitable for acquiring a plurality of associated example sentences corresponding to the word set, and the associated example sentences comprise a plurality of words in the word set; and the sequence configuration module is suitable for configuring the learning sequence of each word in the word set based on the word set and the plurality of associated example sentences and determining the associated example sentences corresponding to the words.
According to another aspect of an embodiment of the present invention, there is provided a word learning system including: a client on which a word learning device according to an embodiment of the present invention resides; and a server on which resides a word learning apparatus according to an embodiment of the present invention.
According to still another aspect of an embodiment of the present invention, there is provided a computing device including: one or more processors; a memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for performing the word learning method according to an embodiment of the present invention.
According to the word learning scheme provided by the embodiment of the invention, displaying the associated example sentence of the target word enables the user to learn other words associated with the target word (for example, to review recently learned words and preview words yet to be learned), which improves the user's memorization of the words and the user's learning efficiency. The learning sequence of the words is configured based on maximal cliques, so that the number of words the user can learn through the associated example sentences is maximized as far as possible, allowing the user to master more words in a short time.
The foregoing is only an overview of the technical solutions of the embodiments of the present invention. In order that the technical means of the embodiments may be more clearly understood and implemented according to the content of the specification, and in order that the above and other objects, features and advantages of the embodiments may be more readily apparent, detailed descriptions of embodiments of the invention are provided below.
Drawings
To the accomplishment of the foregoing and related ends, certain illustrative aspects are described herein in connection with the following description and the annexed drawings, which are indicative of various ways in which the principles disclosed herein may be practiced, and all aspects and equivalents thereof are intended to be within the scope of the claimed subject matter. The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description read in conjunction with the accompanying drawings. Throughout this disclosure, like reference numerals generally refer to like parts or elements.
FIG. 1 shows a schematic diagram of a word learning system 100 according to one embodiment of the invention;
FIG. 2 shows a schematic diagram of a computing device 200, according to one embodiment of the invention;
FIG. 3 illustrates an interaction flow diagram of a word learning method 300 according to one embodiment of the invention;
FIG. 4 shows a schematic diagram of a word learning sequence according to one embodiment of the invention;
FIGS. 5A-5D illustrate schematic diagrams of a plurality of graphical user interfaces according to one embodiment of the present invention;
FIG. 6 illustrates an interaction flow diagram of a word learning method 600 according to one embodiment of the invention;
FIG. 7 shows a schematic diagram of a word learning apparatus 700 according to one embodiment of the invention; and
FIG. 8 shows a schematic diagram of a word learning apparatus 800 according to one embodiment of the invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
FIG. 1 shows a schematic diagram of a word learning system 100 according to one embodiment of the invention. The word learning system 100 may assist the user in word memory and learning. As shown in fig. 1, the word learning system 100 may include a client 120 and a server 140. In other embodiments, the word learning system 100 may include different and/or additional modules.
The client 120 may receive user input; for example, a graphical user interface may be provided for the user to select a set of words to learn and to set the number of new words to learn per day. The server 140 may schedule words for the user to learn daily based on the user's input and communicate with the client 120 via the network 160, for example sending the client the user's words to learn today and the entry data for those words. Network 160 may include wired and/or wireless communication paths.
According to an embodiment of the present invention, each of the components (client, server, etc.) in the word learning system 100 described above may be implemented by a computing device 200 as described below.
FIG. 2 shows a schematic diagram of a computing device 200, according to one embodiment of the invention. As shown in FIG. 2, in a basic configuration 202, a computing device 200 typically includes a system memory 206 and one or more processors 204. A memory bus 208 may be used for communication between the processor 204 and the system memory 206.
Depending on the desired configuration, the processor 204 may be any type of processor, including but not limited to: a microprocessor (μ P), a microcontroller (μ C), a Digital Signal Processor (DSP), or any combination thereof. The processor 204 may include one or more levels of cache, such as a level one cache 210 and a level two cache 212, a processor core 214, and registers 216. Example processor cores 214 may include Arithmetic Logic Units (ALUs), Floating Point Units (FPUs), digital signal processing cores (DSP cores), or any combination thereof. The example memory controller 218 may be used with the processor 204, or in some implementations the memory controller 218 may be an internal part of the processor 204.
Depending on the desired configuration, system memory 206 may be any type of memory, including but not limited to: volatile memory (such as RAM), non-volatile memory (such as ROM, flash memory, etc.), or any combination thereof. System memory 206 may include an operating system 220, one or more applications 222, and program data 224. In some implementations, the application 222 can be arranged to execute instructions on the operating system with the program data 224 by the one or more processors 204.
Computing device 200 may also include an interface bus 240 that facilitates communication from various interface devices (e.g., output devices 242, peripheral interfaces 244, and communication devices 246) to the basic configuration 202 via the bus/interface controller 230. The example output device 242 includes a graphics processing unit 248 and an audio processing unit 250. They may be configured to facilitate communication with various external devices, such as a display or speakers, via one or more a/V ports 252. Example peripheral interfaces 244 can include a serial interface controller 254 and a parallel interface controller 256, which can be configured to facilitate communications with external devices such as input devices (e.g., keyboard, mouse, pen, voice input device, touch input device) or other peripherals (e.g., printer, scanner, etc.) via one or more I/O ports 258. An example communication device 246 may include a network controller 260, which may be arranged to facilitate communications with one or more other computing devices 262 over a network communication link via one or more communication ports 264.
A network communication link may be one example of a communication medium. Communication media may typically be embodied by computer readable instructions, data structures or program modules in a modulated data signal, such as a carrier wave or other transport mechanism, and may include any information delivery media. A "modulated data signal" may be a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of non-limiting example, communication media may include wired media such as a wired network or direct-wired connection, and various wireless media such as acoustic, Radio Frequency (RF), microwave, Infrared (IR) or other wireless media. The term computer readable media as used herein may include both storage media and communication media.
Computing device 200 may be implemented as a server, such as a database server, an application server, a WEB server, and the like, or as a personal computer including desktop and notebook computer configurations. Of course, computing device 200 may also be implemented as at least a portion of a small-sized portable (or mobile) electronic device.
In embodiments according to the invention, the computing device 200 may be implemented as the word learning apparatus 700 and/or 800 and configured to perform the word learning methods 300 and/or 600 according to embodiments of the invention. Where the application 222 of the computing device 200 contains a plurality of instructions for performing the word learning method 300/600 according to embodiments of the present invention, the program data 224 may also store configuration data for the word learning system 100, and the like.
FIG. 3 illustrates an interaction flow diagram of a word learning method 300 according to one embodiment of the invention. The word learning method 300 is suitable for execution in the word learning system 100.
As shown in fig. 3, the word learning method 300 begins at step S310. In step S310, the client 120 may request the server 140 to obtain a target word to be learned currently in the word set. In some embodiments, the acquisition of the target word currently to be learned in the set of words may be requested from the server 140 in response to the user initiating learning of the set of words or ending learning of the last word in the set of words.
In step S320, the server 140 may determine the target word according to the learning sequence of the words in the word set and the user's learning progress on the word set. For example, the server 140 may store a learning order for each word in the word set, all of the words being arranged in this learning order to form a word learning sequence. The server 140 may also store the user's learning progress for the word set, indicating which word in the word learning sequence the user has learned up to. The server 140 then selects, as the target word, the word whose learning order immediately follows the word indicated by the learning progress.
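The selection logic of step S320 can be sketched as follows. This is a minimal illustration under assumed data structures (a list sorted by learning order and an integer progress index), not the patent's actual implementation; the function name is hypothetical.

```python
def next_target_word(learning_sequence, progress_index):
    """Return the word immediately after the user's current progress.

    learning_sequence: list of words sorted by configured learning order.
    progress_index: index of the last word the user finished learning,
                    or -1 if the user has not started the word set.
    """
    next_index = progress_index + 1
    if next_index >= len(learning_sequence):
        return None  # the whole word set has been learned
    return learning_sequence[next_index]

# Matching the FIG. 4 example: word D has been learned, so E comes next.
sequence = ["A", "B", "C", "D", "E", "F"]
print(next_target_word(sequence, sequence.index("D")))  # E
```

The server would hold one such sequence and progress index per (user, word set) pair.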
FIG. 4 shows a schematic diagram of a word learning sequence, according to one embodiment of the invention. In fig. 4, the words are arranged from top to bottom in the configured learning order, with words nearer the top learned earlier. The arrow indicates that word D in the word learning sequence has been learned, so that word E, located below word D, can be selected as the target word.
The server 140 may transmit the target word to the client 120 so that the client 120 displays the acquired target word via a graphical user interface in step S330. Optionally, the server 140 may send the target word together with its entry data to the client 120, the entry data including the paraphrase, phonetic symbols, reference frequency, example sentences, etc. of the target word.
In some embodiments, the graphical user interface displaying the target word may include markup buttons via which the client 120 receives a user's markup operations for the displayed target word. The tagging operation may indicate how well the user is cognizant of the target word, such as recognizing the word, not recognizing the word, or otherwise obscuring the recognition of the word.
In step S340, the server 140 may further send the associated example sentence corresponding to the target word (described in detail later) to the client 120, so that the client 120 displays the associated example sentence in step S350; the associated example sentence includes other words in the word set in addition to the target word. In this way, through the associated example sentence, the user can learn other words in the word set in an associative manner while learning the target word, improving learning efficiency. Optionally, the server 140 may send to the client 120 the relevant data of the associated example sentence of the target word, the other words in the word set contained in the associated example sentence, and the relevant data of those other words. The relevant data may include at least one of the word's entry data, learning time and learning state.
The learning state of a word may be either learned or to-be-learned. For a word whose learning state is learned, the server 140 stores the last actual learning time or actual review time of the word. For a word whose learning state is to-be-learned, the server 140 determines an expected learning time for the word based on the new-word learning amount per unit time set by the user and the learning order of each word in the word set.
In some embodiments, the associated illustrative sentences of the target word may be displayed in response to a marking operation of the target word by the user. For example, if the marking operation indicates that the user does not recognize or vaguely recognizes the target word, the associated illustrative sentence for the target word may be displayed. If the marking operation indicates that the user recognizes the target word, the associated example sentence of the target word may or may not be displayed.
In the associated example sentence, the target word may be emphasized, for example highlighted or underlined, to help the user locate it. In addition to the associated example sentence itself, the paraphrase of the target word in that sentence, as well as the source, paraphrase and pronunciation of the associated example sentence, may also be displayed. Optionally, the other words in the word set included in the associated example sentence may be obtained, and those words and/or their learning states may be displayed.
In some embodiments, the client 120 may also receive a user selection operation (e.g., a click operation) of the displayed other words and display, in response to the selection operation, learning times of the other words and/or term data of the other words.
FIGS. 5A-5D respectively illustrate a plurality of graphical user interfaces in accordance with one embodiment of the present invention. As shown in FIG. 5A, the graphical user interface 510 displays the target word acknowledge, and the bottom of the interface displays the marking buttons "I know it" and "Give me a hint". Clicking the "I know it" button indicates that the user recognizes the target word. Clicking the "Give me a hint" button leads to the graphical user interface 520 shown in FIG. 5B, whose bottom displays the marking buttons "Remembered" and "Didn't remember". Clicking "Didn't remember" indicates that the user does not recognize the target word and leads to the graphical user interface 530 shown in FIG. 5C; clicking "Remembered" indicates that the user vaguely recognizes the target word and also leads to the graphical user interface 530. The graphical user interface 530 displays the associated example sentence containing the target word, the paraphrase of the target word in that sentence, the reference frequency of the target word, and the paraphrase, pronunciation and source of the associated example sentence. The target word acknowledge is highlighted within the associated example sentence.
Below the associated example sentence, the graphical user interface 530 also displays other words in the word set that appear in the associated example sentence (demonstrate, per, invocable, recipient) together with the learning status of those words. When the user clicks on one of these words, its learning time ("this word was learned 2 days ago") and its entry data may be displayed in the graphical user interface 540 shown in FIG. 5D.
The user may also click the "Next" button displayed at the bottom of the graphical user interface 540 to end learning of the currently displayed target word acknowledge, after which the next target word to be learned is obtained and displayed.
According to one embodiment of the present invention, in response to the user finishing learning the currently displayed target word, the server 140 may configure the learning state of the target word as learned and record the actual learning time of the target word. Meanwhile, the server 140 may also update the user's learning progress for the word set, so that the next target word to be learned can be determined based on the updated learning progress and the learning order of the words. The user finishing learning the currently displayed target word is indicated by the user clicking a completion button on the graphical user interface displaying the target word (e.g., the "Next" button in graphical user interface 530) or by the user exiting the learning of the word set.
According to an embodiment of the present invention, the server 140 may determine, based on the learning sequence of each word in the word set, the user's learning progress on the word set, and the new-word learning amount per unit time set by the user: the plurality of target words to be learned in the current unit time and their data, the associated example sentences corresponding to those target words and their data, and the other words in the word set included in those associated example sentences and their data. The server 140 may then send all of these to the client 120, and the client 120 displays each target word and its associated example sentence in the learning order of the target words.
It should be noted that the word learning method according to an embodiment of the present invention is directed to the learning of new words that have not yet been learned, and does not concern review of words that have already been learned. The "target word" referred to throughout is a new word that the user has not learned. Unlike the review of learned words, learning a target word refers to learning it for the first time, and the learning order of the words in a word set refers to the order in which they are first learned.
The following describes in detail the determination of the learning order of each word in the word set and the associated example sentence corresponding to each word with reference to fig. 6.
FIG. 6 illustrates an interaction flow diagram of a word learning method 600 according to one embodiment of the invention. The word learning method 600 is suitable for execution in the word learning system 100.
As shown in fig. 6, the word learning method 600 begins at step S610. In step S610, the client 120 may receive a set of words selected by the user and/or a new word learning amount per unit time set by the user.
The client 120 may provide a user interface for the user to select a set of words that the user wants to learn, such as an IELTS/TOEFL word set, a CET-4/CET-6 word set, a word set drawn from famous works or movies, or another suitable word set. The client 120 may also provide a user interface for the user to enter a new-word learning amount per unit time, i.e., the number of new words to be learned per unit time, such as a new-word learning amount per day or per week.
In step S620, the client 120 sends the word set selected by the user and/or the new word learning amount per unit time input by the user to the server 140, and the server 140 configures the learning sequence and the associated example sentence of each word in the word set for the user.
In step S630, the server 140 may obtain a plurality of associated example sentences corresponding to the word set. An associated example sentence contains at least two words, i.e., a plurality of words, of the word set. In some embodiments, the associated example sentences may be obtained from a corresponding source based on the type of the word set. For example, for an exam-type word set, the associated example sentences can be obtained from past real exam questions corresponding to the word set; for a movie/television-drama-type word set, the associated example sentences can be obtained from the script or subtitle text of the corresponding movie or television drama.
Then, in step S640, the server 140 may configure the learning sequence of each word in the word set and determine the associated example sentence corresponding to each word based on the word set selected by the user and the plurality of associated example sentences corresponding to the word set.
Specifically, a word relationship graph may be constructed based on the word set and the associated example sentences corresponding to the word set. The word relationship graph takes the words in the word set as nodes and the associated example sentences containing a word as the attributes of that word's node; an edge between two nodes indicates that there exists an associated example sentence containing the words corresponding to both nodes.
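The graph construction just described can be sketched roughly as follows. The tokenization, punctuation handling and data layout are illustrative assumptions; node attributes are the sets of sentences containing each word, and an edge exists wherever two words share a sentence.

```python
from itertools import combinations

def build_word_graph(word_set, sentences):
    """Build a word relationship graph as (node attributes, edge set)."""
    # Naive tokenization, assumed for illustration only.
    tokenized = {s: set(s.lower().replace(",", " ").replace(".", " ").split())
                 for s in sentences}
    # Node attributes: the associated example sentences containing each word.
    attrs = {w: {s for s, tokens in tokenized.items() if w in tokens}
             for w in word_set}
    # An edge joins two words that co-occur in at least one sentence.
    edges = {frozenset((u, v))
             for u, v in combinations(word_set, 2)
             if attrs[u] & attrs[v]}
    return attrs, edges

words = {"demonstrate", "per", "acknowledge"}
attrs, edges = build_word_graph(
    words, ["They demonstrate and acknowledge the result per request."])
print(len(edges))  # all three words co-occur in one sentence, so 3 edges
```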
After the word relationship graph is constructed, a plurality of maximal cliques satisfying an attribute constraint condition can be obtained from the word relationship graph. The attribute constraint indicates that the nodes of a maximal clique must have at least one common attribute, that is, the words corresponding to the nodes of the maximal clique must appear together in the same associated example sentence.
For example, assume that the word relationship graph contains a maximal clique Q1 = (r1, r2, r3) and a maximal clique Q2 = (r1, r2, r4). Associated example sentences S1 and S2 both include the word corresponding to node r1, i.e., node r1 has attribute values S1 and S2. Associated example sentence S2 includes the word corresponding to node r2, i.e., node r2 has attribute value S2. Associated example sentences S1 and S3 both include the word corresponding to node r3, i.e., node r3 has attribute values S1 and S3. Associated example sentences S2 and S3 both include the word corresponding to node r4, i.e., node r4 has attribute values S2 and S3. Clearly, the words corresponding to the nodes of maximal clique Q1 do not all appear in the same associated example sentence, so its nodes have no common attribute and Q1 does not satisfy the attribute constraint. The words corresponding to the nodes of maximal clique Q2 all appear in the same associated example sentence S2, so every node has the common attribute S2 and Q2 satisfies the attribute constraint.
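The attribute constraint of the worked example can be expressed as a simple check: a maximal clique qualifies only if the attribute sets of all its nodes have a non-empty intersection. The r/S names below follow the notation of the example; the function name is illustrative.

```python
def satisfies_attribute_constraint(clique, attrs):
    """Return (ok, common) where common is the set of shared sentences."""
    common = set.intersection(*(attrs[r] for r in clique))
    return len(common) > 0, common

# Attribute values from the example in the text.
attrs = {"r1": {"S1", "S2"}, "r2": {"S2"},
         "r3": {"S1", "S3"}, "r4": {"S2", "S3"}}
print(satisfies_attribute_constraint(("r1", "r2", "r3"), attrs))  # (False, set())
print(satisfies_attribute_constraint(("r1", "r2", "r4"), attrs))  # (True, {'S2'})
```

As in the text, Q1 fails the constraint (no sentence contains all three words) while Q2 passes with common attribute S2.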
The learning sequence of each word in the word set can then be configured based on the obtained plurality of maximal cliques, and the associated example sentence corresponding to each word determined.
Preferably, according to an embodiment of the present invention, in order to make the proportion of words in the word set that the user learns through associated example sentences within a predetermined period of time (for example, a period of time after the user registers) as high as possible, the words belonging to maximal cliques with more nodes may be assigned earlier positions in the learning order, and the words belonging to maximal cliques with fewer nodes later positions. That is, the words included in a maximal clique with many nodes may be learned before the words included in a maximal clique with few nodes.
For example, the obtained maximal cliques may be traversed in descending order of node count, with the words included in earlier-traversed maximal cliques assigned earlier learning positions and the words included in later-traversed maximal cliques assigned later ones. Within each traversed maximal clique, the learning order of its words may be configured randomly: one word in the clique is randomly selected and placed first among the clique's words, another is randomly selected and placed second, and so on.
Before a learning position is assigned to a word in the traversed maximal clique, it may be checked whether the word already has one (a previously traversed maximal clique may also include the word); if so, the word is skipped, and otherwise a learning position is assigned to it. At the same time, the associated example sentence corresponding to the common attribute of the nodes of the traversed maximal clique (which includes the word) is determined to be the associated example sentence corresponding to the word. In this way, each associated example sentence links together the words of the word set that it contains, making word learning easier and more effective.
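The traversal just described — larger cliques first, random order within a clique, skipping already-configured words, and recording the clique's common example sentence for each newly configured word — can be sketched as follows. This is an illustration, not the patent's implementation; the names and the data layout are assumptions:

```python
import random

def configure_learning_order(maximal_cliques, attrs, seed=0):
    """Assign a global learning order and an associated example sentence
    to each word, traversing maximal cliques from largest to smallest.

    maximal_cliques: cliques (lists of words) that already satisfy the
    attribute constraint.  attrs maps each word to the set of associated
    example sentence ids containing it.
    """
    rng = random.Random(seed)
    order, sentence_for = [], {}
    for clique in sorted(maximal_cliques, key=len, reverse=True):
        # The attribute constraint guarantees a non-empty intersection;
        # pick one common sentence deterministically.
        common_sentence = sorted(set.intersection(*(attrs[w] for w in clique)))[0]
        words = list(clique)
        rng.shuffle(words)  # random order within the clique
        for word in words:
            if word in sentence_for:  # already configured by an earlier clique
                continue
            order.append(word)
            sentence_for[word] = common_sentence
    return order, sentence_for
```

For example, with cliques [["a", "b", "c"], ["a", "d"]], the three words of the larger clique come first in some random order, and "d" (the only new word in the smaller clique) comes last, with "a" skipped on the second pass.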
It should be noted that the above is only one specific example of configuring the learning order; based on it, those skilled in the art can devise other ways of configuring the learning order, all of which fall within the scope of the present invention.
After the learning order of the words in the word set is configured, the predicted learning time of each word can be determined based on that order and the per-unit-time new-word learning amount set by the user, and the learning state of each word can be configured as to-be-learned.
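Given a learning order and a daily new-word quota, the predicted learning time reduces to integer division of a word's position by the quota. A sketch under that assumption (the start date, quota, and function name are illustrative):

```python
from datetime import date, timedelta

def predicted_learning_dates(ordered_words, words_per_day, start=date(2020, 3, 16)):
    """Predict when each word will be studied: the word at position i in
    the learning order falls on day i // words_per_day after the start."""
    return {
        word: start + timedelta(days=i // words_per_day)
        for i, word in enumerate(ordered_words)
    }

# Five words, two new words per day: days 0, 0, 1, 1, 2 after the start.
schedule = predicted_learning_dates(["w1", "w2", "w3", "w4", "w5"], words_per_day=2)
```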
FIG. 7 shows a schematic diagram of a word learning apparatus 700 according to one embodiment of the invention. As shown in fig. 7, the word learning apparatus 700 resides in the client 120 and may include a communication module 710 and a display module 720.
The communication module 710 is adapted to acquire the target word currently to be learned in a word set, the target word being determined based on the user's learning progress on the word set and the learning order of each word in the word set. The display module 720 is coupled to the communication module 710 and is adapted to display the acquired target word and to display an associated example sentence corresponding to the target word, where the associated example sentence includes at least the target word and other words in the word set.
The word learning apparatus 700 may further comprise an interaction module 730 (not shown in fig. 7) adapted to receive user input, such as the user's selection of a word set or setting of the per-unit-time new-word learning amount.
For the detailed processing logic and implementation process of the modules in the word learning apparatus 700, reference may be made to the foregoing description of the word learning methods 300 and 600 in conjunction with fig. 1 to 6, and details are not repeated here.
FIG. 8 shows a schematic diagram of a word learning apparatus 800 according to one embodiment of the invention. As shown in fig. 8, the word learning apparatus 800 resides in the server 140 and may include a communication module 810, an illustrative sentence acquisition module 820, and a sequence configuration module 830.
The communication module 810 is adapted to receive a user selected set of words. The example sentence obtaining module 820 is coupled to the communication module 810 and is adapted to obtain a plurality of associated example sentences corresponding to the word set, where the associated example sentences include a plurality of words in the word set. The sequence configuration module 830 is coupled to the example sentence acquisition module 820, and is adapted to configure the learning sequence of each word in the word set based on a plurality of associated example sentences corresponding to the word set and the word set, and determine the associated example sentence corresponding to each word.
For the detailed processing logic and implementation process of the modules in the word learning apparatus 800, reference may be made to the foregoing description of the word learning methods 300 and 600 in conjunction with fig. 1 to 6, and details are not repeated here.
In summary, according to the word learning scheme of the embodiments of the present invention, displaying the associated example sentence of a target word lets the user also learn the other words associated with it (for example, reviewing a recently learned word or previewing a word yet to be learned), which helps improve the user's retention of words and learning efficiency. Configuring the learning order of words based on maximal cliques maximizes the number of words the user can learn through associated example sentences, so that the user can master more words in a short time.
The various techniques described herein may be implemented in connection with hardware or software or, alternatively, with a combination of both. Thus, the methods and apparatus of embodiments of the present invention, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as removable hard drives, USB flash drives, floppy disks, CD-ROMs, or any other machine-readable storage medium, wherein, when the program is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing embodiments of the invention.
In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. Wherein the memory is configured to store program code; the processor is configured to perform the methods of embodiments of the present invention according to instructions in the program code stored in the memory.
By way of example, and not limitation, readable media may comprise readable storage media and communication media. Readable storage media store information such as computer readable instructions, data structures, program modules or other data. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. Combinations of any of the above are also included within the scope of readable media.
The present invention may further comprise:

A6. The method of A5, wherein the entry data includes at least one of: word paraphrases, reference frequencies, pronunciations, example sentences, and example sentence paraphrases.

B8. The method of B7, wherein the step of acquiring a plurality of associated example sentences corresponding to the word set comprises: acquiring the associated example sentences from corresponding sources based on the type of the word set.

B9. The method of B7, wherein the step of configuring the learning order of each word in the word set based on the word set and the plurality of associated example sentences and determining the associated example sentence corresponding to each word comprises: constructing a word relationship graph based on the word set and the plurality of associated example sentences, the word relationship graph taking the words in the word set as nodes and the associated example sentences containing a word as the attributes of its node, an edge between two nodes indicating that there exists an associated example sentence that includes the words corresponding to both nodes; acquiring a plurality of maximal cliques in the word relationship graph that satisfy an attribute constraint, the attribute constraint indicating that the nodes of a maximal clique have at least one common attribute; and configuring the learning order of each word in the word set based on the acquired maximal cliques, and determining the associated example sentence corresponding to each word.

B10. The method of B9, wherein the learning order of the words included in a maximal clique with a larger number of nodes precedes the learning order of the words included in a maximal clique with a smaller number of nodes.

B11. The method of B10, wherein the step of configuring the learning order of the words in the word set based on the acquired maximal cliques comprises: traversing the maximal cliques in descending order of node count; and configuring earlier learning positions for the words included in earlier-traversed maximal cliques and later learning positions for the words included in later-traversed maximal cliques.

B12. The method of B11, wherein the step of determining the associated example sentence corresponding to each word based on the acquired maximal cliques comprises: for each word in a traversed maximal clique, determining the associated example sentence corresponding to the common attribute of the nodes of the traversed maximal clique as the associated example sentence corresponding to the word.

B13. The method of B7, further comprising: determining the predicted learning time of each word in the word set based on the learning order of each word in the word set and the per-unit-time new-word learning amount set by a user.
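The word relationship graph described in claim B9 can be sketched as follows, assuming the associated example sentences are given as sets of words keyed by sentence id (this representation, like the names, is an assumption for illustration):

```python
from itertools import combinations

def build_word_relationship_graph(word_set, example_sentences):
    """Nodes are the words of the word set; a node's attributes are the
    ids of the associated example sentences containing its word; an edge
    links two nodes whose words co-occur in at least one sentence.

    example_sentences: mapping of sentence id -> set of words it contains.
    """
    attrs = {w: set() for w in word_set}
    for sid, words in example_sentences.items():
        for w in words & set(word_set):
            attrs[w].add(sid)
    edges = {
        (a, b)
        for a, b in combinations(sorted(word_set), 2)
        if attrs[a] & attrs[b]  # some sentence contains both words
    }
    return attrs, edges

attrs, edges = build_word_relationship_graph(
    ["cat", "dog", "fish"],
    {"S1": {"cat", "dog"}, "S2": {"dog", "fish", "bird"}},
)
```

Here "bird" is ignored because it is not in the word set, and "cat" and "fish" get no edge because no single sentence contains both.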
In the description provided herein, algorithms and displays are not inherently related to any particular computer, virtual system, or other apparatus. Various general purpose systems may also be used with examples of embodiments of the invention. The required structure for constructing such a system will be apparent from the description above. In addition, embodiments of the present invention are not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of embodiments of the present invention as described herein, and specific languages are described above to disclose embodiments of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the embodiments are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or components of the devices in the examples disclosed herein may be arranged in a device as described in this embodiment or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
Furthermore, some of the above embodiments are described herein as a method or combination of elements of a method that can be performed by a processor of a computer system or by other means for performing the functions described above. A processor having the necessary instructions for carrying out the method or method elements described above thus forms a means for carrying out the method or method elements. Furthermore, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is used to implement the functions performed by the elements for the purpose of carrying out the invention.
As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
While embodiments of the invention have been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the embodiments of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive embodiments. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present embodiments are disclosed by way of illustration and not limitation, the scope of embodiments of the invention being defined by the appended claims.

Claims (10)

1. A word learning method, comprising:
acquiring a target word to be learned currently in a word set, wherein the target word is determined based on the learning progress of a user on the word set and the learning sequence of each word in the word set;
displaying the acquired target word; and
displaying an associated example sentence corresponding to the target word, wherein the associated example sentence at least includes the target word and other words in the word set.
2. The method of claim 1, wherein the step of displaying the associated example sentence corresponding to the target word comprises:
in response to a marking operation of the user on the target word, displaying the associated example sentence corresponding to the target word, wherein the marking operation indicates the user's degree of familiarity with the target word.
3. The method of claim 1 or 2, wherein the step of displaying the associated example sentence corresponding to the target word comprises:
highlighting the target word in the associated example sentence.
4. The method of any of claims 1-3, further comprising:
acquiring other words in the word set contained in the associated example sentence;
displaying the other words and/or the learning states of the other words.
5. The method of claim 4, further comprising:
in response to a selection operation of the user on the other words, displaying the learning time and/or entry data of the other words.
6. A word learning method, comprising:
receiving a word set selected by a user;
acquiring a plurality of associated example sentences corresponding to the word set, wherein the associated example sentences comprise a plurality of words in the word set; and
configuring the learning order of each word in the word set based on the word set and the plurality of associated example sentences, and determining the associated example sentence corresponding to each word.
7. A word learning apparatus comprising:
a communication module adapted to acquire a target word currently to be learned in a word set, the target word being determined based on the user's learning progress on the word set and the learning order of each word in the word set; and
a display module adapted to display the acquired target word and further adapted to display an associated example sentence corresponding to the target word, wherein the associated example sentence at least includes the target word and other words in the word set.
8. A word learning apparatus comprising:
a communication module adapted to receive a set of words selected by a user;
an example sentence acquisition module adapted to acquire a plurality of associated example sentences corresponding to the word set, the associated example sentences including a plurality of words in the word set; and
a sequence configuration module adapted to configure the learning order of each word in the word set based on the word set and the plurality of associated example sentences and to determine the associated example sentence corresponding to each word.
9. A word learning system comprising:
a client on which resides the word learning apparatus of claim 7; and
a server on which resides the word learning apparatus of claim 8.
10. A computing device, comprising:
one or more processors; and
a memory;
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for performing any of the word learning methods of claims 1-6.
CN202010182902.7A 2020-03-16 2020-03-16 Word learning method, device, system and computing equipment Active CN113408275B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010182902.7A CN113408275B (en) 2020-03-16 2020-03-16 Word learning method, device, system and computing equipment

Publications (2)

Publication Number Publication Date
CN113408275A true CN113408275A (en) 2021-09-17
CN113408275B CN113408275B (en) 2023-10-20

Family

ID=77676593

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010182902.7A Active CN113408275B (en) 2020-03-16 2020-03-16 Word learning method, device, system and computing equipment

Country Status (1)

Country Link
CN (1) CN113408275B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101142231B1 (en) * 2011-06-15 2012-05-07 한민석 Vocabulary learning apparatus and method thereof
JP2012118640A (en) * 2010-11-30 2012-06-21 Casio Comput Co Ltd Example sentence book preparation device and example sentence book preparation program
KR20140101548A (en) * 2013-02-12 2014-08-20 주홍찬 Apparatus and method for learning word by using link example sentence.
KR20140142552A (en) * 2013-06-04 2014-12-12 이상현 Example Sentence Providing System, Terminal and Method based on Studied Words
JP2015045904A (en) * 2013-08-27 2015-03-12 株式会社リコー Information processing device and method
US20190080626A1 (en) * 2017-09-14 2019-03-14 International Business Machines Corporation Facilitating vocabulary expansion


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XU Fenfen, "Design and Implementation of an Android-Based Word Learning System", China Master's Theses Full-text Database, no. 2013, pages 1-4 *

Also Published As

Publication number Publication date
CN113408275B (en) 2023-10-20


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant