COMPUTER-BASED METHOD, SYSTEM AND
INTERNET SITE FOR ATTAINING A TARGET RECALL
ABILITY FOR A SET OF KNOWLEDGE
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to computer-based education systems generally, and more specifically, to a computer-based method, system and internet site for attaining a target recall ability for a set of knowledge.
2. Background Art
Modern adaptive education systems can be generally classified into three categories. First are intelligent tutoring systems in which an educator defines the source of a particular student's deficiency in a given subject, and then adapts the education system to eliminate or reduce the deficiency source. For example, suppose that a student repeatedly demonstrates an inability to accurately multiply numbers together. An intelligent tutoring system might analyze the student's multiplication process and identify that the student repeatedly fails to accurately "carry" a value when necessary. In response to this deficiency, the educator might adapt the education system to present the student with remedial study concerning when and how to "carry" numbers in multiplication.
A second category of adaptive education systems is decision science. A decision science system is a way to define a solution or effect a result based on a narrowing hierarchy of alternatives. Generally, each subsequent level in the hierarchy is dependent upon and smaller than the prior one. For example, suppose that a doctor was attempting to diagnose a patient's condition. Before his diagnosis, the patient may have any of an unlimited number of ailments. The doctor might begin his diagnosis with an analysis of the patient's obvious symptoms such as congestion or cough. Based on these obvious symptoms, the doctor can narrow down the
selection of all possible ailments to, perhaps, 40 alternatives. Next, the doctor might ask the patient questions concerning symptoms that are relevant only to the 40 alternatives such as stomach pain, indigestion, or cramps. Like before, the number of alternatives might be further reduced to 5. Last, the doctor might prescribe a blood test relevant only to the 5 alternatives narrowing the possible ailments to one.
A third category of adaptive education systems is memory modeling. In memory modeling, memory decay is in some manner applied to a student's educational system. For example, suppose that a student is provided with a lesson followed by a series of tests spaced out over time. In assessing the student's overall understanding of the lesson content, a conventional memory modeling education system might weight the earlier test performances more heavily than the later ones based on the presumption that the student's level of recall falls as time goes on. Such a graduated weighting of test performances prevents the student from being strongly penalized for poor test performance when it is more likely that the student has forgotten the lesson content.
What conventional memory modeling systems lack is a flexible user-oriented method or system for allowing the student to define for himself a target level of memory strength for information that the user wishes to "know." Conventional systems cannot provide this service because conventional memory modeling systems can neither assess nor affect a user-defined level of memory strength for a given item of information.
For example, suppose a young medical student has aspirations of someday traveling in France. This student has at least two educational goals: to "know" medicine and to "know" French. The two goals are distinguished, however, because for this student the necessity of knowing medicine is immediate and comprises far greater detail than the student's desire to have novice knowledge of French in the far future. Conventional memory modeling educational systems cannot accommodate this student's unique educational needs. What is needed is a system or method through which the student can select the general French content
and specific medical content that she desires to "know," define the level at which she wishes to "know" each piece of selected content, define when she needs to know the information by, and be provided with a review system in which the user's unique target level of knowledge for each piece of selected content is efficiently attained and maintained.
The following references summarily disclose embodiments of computer-related adaptive education systems.
The U.S. Patent to Hitchcock et al., 5,823,781, discloses a method for training a user on any number of computer software applications. The method provides an important technical advantage by reducing or eliminating wasted training. This invention first diagnoses the proficiency level of each user in one or more computer software applications and then provides a training plan specifically tailored to each user's proficiency level to raise the user's proficiency level in each application. Additionally, the method can store and analyze data representing the amount and effectiveness of a particular user's training.
The U.S. Patent to Sloane et al., 5,813,863, discloses a multimedia method for interactive behavior modification. The method includes a selectable progression of modules including dynamic introductory modules leading to a combination of educational modules. The primary learning modules include an interactive/contextual adventure, local information, topical encyclopedia, and subject matter quizzes. In addition, while the user is navigating the foregoing modules, a tracking module tracks the user's decisions and other characteristics and alters the program content accordingly.
The U.S. Patent to Bloom et al., 5,597,312, discloses a method for computer-based tutoring. The method comprises selecting a teaching parameter, generating a student model, and monitoring a student task based upon the teaching parameter and the student model. The method further comprises generating an updated student model based upon the student's response to the task, and repeating the task generation and monitoring processes.
The U.S. Patent to Bro, 5,377,258, discloses a computer-aided system for motivational telephone counseling. The system comprises a client database, a means for reinforcing predetermined client behavior with motivational messages or questions to the client, and a computer having a means for transmitting the messages or questions over a telephone network such that the messages and questions may be posted and accessed at different times.
The U.S. Patent to Nichols et al., 5,987,443, discloses a learning system that provides the user with a simulated environment presenting a business opportunity that the student must understand and solve optimally. Mistakes are noted and remedial educational material is presented to dynamically build the necessary skills that a user requires for success in the business endeavor. The system utilizes an artificial intelligence engine driving individualized and dynamic feedback used to simulate the real-world environment and interactions. A business model provides support for the realistic activities and allows the user to experience real-world consequences for their actions and decisions.
The U.S. Patent to Peterson et al., 5,957,699, discloses a system for remote teaching over a computer network. The system presents various types of stimuli to remote students and records the students' individual response data that correspond to the stimuli. In addition, the system forms an evaluation of each individual student's response data. The evaluation correlates the student's response data to pre-determined correct response data. Further, the teaching system modifies its own behavior according to each student's evaluation, thereby tailoring the behavior of the teaching process to the cognitive abilities of each individual student.
The U.S. Patent to Ho et al., 5,779,486, discloses a method and system for assessing a student's understanding in a subject and, based on that understanding, generating individually-tailored tests for enhancing the student's understanding in the subject.
A first component of the invented system includes a score generator. In one preferred embodiment, the score generator generates an overall score based
on the student's latest and prior-to-latest test results for a particular group of questions. In another preferred embodiment, the score generator calculates an overall score that reflects the student's degree of forgetfulness as a function of time for the particular group of questions. In one calculation embodiment, the scoring involves a first-order weighted combination of test scores. If the weight for the latest score is more than the weight for the prior-to-the-latest overall score, the calculation has the effect of putting more emphasis on the latest test score. In another calculation embodiment, the prior-to-the-latest overall score is multiplied by a weight raised to a power. The power depends on the time between the latest test and the prior-to-the-latest test.
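The two scoring schemes summarized above can be sketched as follows. All constants, names, and the per-day decay form here are hypothetical illustrations for clarity, not values taken from the Ho et al. patent:

```python
def overall_score(latest, prior_overall, w=0.7, gap_days=0.0, decay=0.99):
    """Sketch of the two scoring embodiments described above.

    First-order weighted combination: w * latest + (1 - w) * prior.
    Time-dependent variant: the prior-to-the-latest overall score is first
    multiplied by a weight (decay) raised to a power that depends on the
    time between the latest and prior-to-the-latest tests (gap_days).
    All parameter values are hypothetical placeholders.
    """
    prior = prior_overall * (decay ** gap_days)
    return w * latest + (1 - w) * prior
```

With `gap_days=0` the calculation reduces to the plain weighted combination; a larger gap discounts the prior score, reflecting the student's forgetfulness over time.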
After a score is generated, it is stored and later analyzed by a second component of the invented system, a recommendation generator. The recommendation generator generates recommendations for the student's future learning based on the student's overall score. The recommendation generator applies a set of pre-defined subject-specific analysis, relationship, or pre-requisite rules to the student's overall score to recommend a learning path that may best enhance the student's understanding in a subject or series of subjects.
A third component of the invented system includes a question generator. For each recommendation produced by the recommendation generator, the question generator generates a number of questions. In one preferred embodiment, the question generator avoids producing duplicative questions. The question generator only considers questions previously generated within a certain time frame, such as within one year of the latest test, because the student is unlikely to remember questions from beyond that time frame.
The non-patent literature entitled "A Simple Model for Adaptive
Courseware Navigation" (http://wwwis.wm.me.nl/irifwet97/proceedings/da_silva_2_full.html) describes a web-based "adaptive hypermedia system." The system combines computer assisted instruction, an intelligent tutoring system, and hypermedia to build a unique instruction model tailored for a particular student's educational ability and goals.
The non-patent literature entitled "Adaptive Learning" (http://www.adaptivelearning.com/default.html) describes web-based adaptive learning generally.
The non-patent literature entitled "ATS - Adaptive Teaching System" (http://www.ercim.org/publication/Ercim_News/enw34/specht.html) describes web-based adaptive learning generally.
The non-patent literature entitled "Intelligent Development Tool for Adaptive Courseware on WWW" (http://www.csa.ru/Inst/gorb_dep/artific/IA/ calisce98.htm) describes the architecture of a web-based Intelligent Distance Learning Environment.
The non-patent literature entitled "User Modeling and Adaptive Navigation Support in WWW-based Tutoring Systems" (http://www.psychologie.uni-trier.de:8000/projects/ELM/Papers/UM97-WEBER.html) discusses the necessity of student profiling in an individualized WWW-based tutoring system, what the goals of student profiling are, and the necessity of adaptive navigation support. Next, an adaptive, knowledge-based tutoring system is introduced followed by an empirical study of results.
The non-patent literature entitled "ATS - Adaptive Teaching System" (http://www.rand.org/hot/mcarthur/Papers/role.html) describes web-based intelligent tutoring systems generally.
SUMMARY OF THE INVENTION
A preferred embodiment of the present invention comprises a computer-based method and system for attaining or maintaining a user-defined target recall ability for one or more sets of knowledge. In accord with this embodiment, the system receives input selecting a set of knowledge to review and defining a target recall ability for the selected set of knowledge. In response, the system presents a review of informational items included within the knowledge set. Based
on the time of and performance on the previous review, the target recall ability and a memory strength assessment algorithm, the system calculates a subsequent time to review the selected set of knowledge. Preferably, the review cycle is repeated until the user-defined target recall ability is attained by the user.
Alternatively or additionally, a user of the present invention maintains a target recall ability for a selected knowledge set over an extended period of time or lifetime.
Additionally, the system presents a review schedule forecast and is configured to receive user input modifying the review schedule. Similarly, the system is configured to receive user input modifying the target recall ability for a knowledge set.
In accord with the preferred embodiment, the memory assessment algorithm comprises a memory strength algorithm and a memory decay algorithm.
Another aspect of the present invention comprises providing a user with an indication of his or her memory strength with respect to review items.
Preferably, the present invention is implemented on the World Wide Web. However, the present invention may "stand alone," for example as a software application running on a single computer device.
Commonly, the user will review a plurality of information items associated with a given knowledge set. Information items comprise one or more of text, graphics, movies (including animation) and audio clips. Information items are categorized into groups and sub-groups related by concept, author, or content. For example, the item group "Mathematics" may contain the item sub-groups "Calculus, Geometry, Algebra, and Trigonometry."
Preferably, the user may purchase information items, create information items or share information items with other users.
Preferably, each information item and group of information items is defined by a plurality of field labels and associated field values.
The above objects and other objects, features, and advantages of the present invention are readily apparent from the following detailed description of the best mode for carrying out the invention when taken in connection with the accompanying drawings and Appendix (a partial business plan for the present invention).
BRIEF DESCRIPTION OF THE DRAWINGS
FIGURE 1 is an overview block diagram of a system for implementing the present invention;
FIGURE 2 is an example review item for use in implementing the present invention;
FIGURE 3 is an example memory strength model for use in implementing the present invention;
FIGURE 4 is an example memory decay model for use in implementing the present invention;
FIGURE 5 is a block flow diagram illustrating a review cycle in accord with the present invention;
FIGURE 6 is a block flow diagram illustrating content acquisition in accord with the present invention;
FIGURE 7 is a block flow diagram illustrating content organization in accord with the present invention;
FIGURE 8 is an overview block flow diagram of a review schedule forecast system for use in implementing the present invention;
FIGURE 9 is a block flow diagram illustrating a review schedule forecasting method in accord with the present invention;
FIGURE 10 is a block diagram of a system for user review administration in accord with the present invention;
FIGURE 11 is a block diagram illustrating a system for data management for use in implementing the invention;
FIGURE 12 is a block flow diagram illustrating data flow in accord with the present invention;
FIGURE 13 is an example graphical user interface (GUI) displaying learning content acquisition;
FIGURE 14 is an example GUI displaying a question regarding an item of information;
FIGURE 15 is an example GUI displaying an answer regarding the item of information;
FIGURE 16 is an example GUI displaying a study tip regarding the item of information;
FIGURE 17 is an example GUI displaying additional information regarding the item of information;
FIGURE 18 is an example GUI displaying content organization and recall ability management;
FIGURE 19 is an example GUI displaying review scheduling; and
FIGURE 20 is an example GUI displaying review progress and statistics.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT(S)
For convenience, the best mode for carrying out the invention is divided into the following principal sections: Review Cycle, Content Acquisition, Content Organization, Review Schedule Forecasting, Recall Management, Performance Feedback, User Administration Options, Content Definition, and Data Management. In addition, data values and data field labels are represented in SMALLCAPS font for clarity.
Review Cycle
Figure 1 is an overview of the environment in which the present invention operates. The system aspect of the present invention comprises a server computer 104 operably serving at least one client computer device 108.
Server computer 104 includes any computer-related machine or group of machines capable of input, output, process, and storage of information. Examples of server computer devices include, but are not limited to, a network server system, a personal computer system, and a mobile computer system (e.g., personal data assistant, notebook computer, hand-held computer device, etc.).
The client computer 108 includes any computer-related machine or group of machines capable of input, output, process, display, and storage of information. Examples of client computer devices include, but are not limited to, a personal computer system and a mobile computer system (e.g., personal data assistant, notebook computer, hand-held computer device, etc.).
Computer network 148 between the server 104 and client computer 108 includes any technology providing operable communication between the two devices. Examples include, but are not limited to, hardwired communications such as a local area network, the Internet, and wireless communications such as satellite, radio frequency or infrared communication. Preferably, the present invention will operate as a World Wide Web site on the Internet. Quite possibly, however, the server computer device and the client computer device could be the same computer device. Additionally, the operable connection between the client and server computers may be continuous or based on one or more separate connections.
In accord with the preferred embodiment, the server computer device contains at least one item of information (the "item") as represented in block 112, a memory strength model as represented in block 116, and a memory decay model as represented in block 120. The item 112, the memory strength model 116, and the memory decay model 120 are discussed in more detail below in Figures 2, 3, and 4, respectively.
As generally shown by the flow of Figure 1, the review cycle begins as the user 124 at the client computer device 108 presents input 128 to the server computer device 104 selecting the item 112 for review and defining a target recall ability 132 for that item. In response to the user-defined target recall ability 132, the memory strength model 116 and the memory decay model 120, the server computer device calculates a date after which the item is due for review as represented in block 136.
Upon logging into the server computer 104, the user receives a lesson of all information items due for review at that time, as represented in block 140.
Following the user's review of the item, the date of, the time of, and the user's performance on the review (the "review information") is transmitted from the client computer device 108 to the server computer device 104 as shown in block 144.
To complete the review cycle, the server computer 104 calculates another review date 136 based on the review information 144, the memory strength model 116, the memory decay model 120, and the user's target recall ability 132.
Preferably, the review cycle is repeated to attain or maintain the user- defined target recall ability for the item. At any time, however, the user may discontinue the review process.
Referring to Figure 2, an example information or knowledge item is illustrated. Preferably, knowledge items comprise a prompt 204, a response 208 and reference information 212 such as a mnemonic, study tip, or link to such information. Referring to Figure 14, an example GUI is shown displaying a knowledge item prompt 181. Referring to Figure 15, an example GUI is shown displaying a response 191. Referring to Figure 16, an example GUI is shown displaying reference information 275 for the review item. Referring to Figure 17, an example GUI is shown displaying additional information 240 concerning the review item.
Referring to Figure 3, an example memory strength model is shown.
The memory strength model is a mathematical function 312 defining a relationship between the amount of memory strength ("MEMORY STRENGTH") 304 that a user has demonstrated in past reviews and the time period 308 the user should wait until the next review ("DAYS AND TIME TO NEXT REVIEW").
Generally, as a user's memory strength with respect to an item increases, the amount of time that user can wait until the next review of the item is increased. Conversely, as a user's memory strength with respect to an item decreases, the time that the user can wait until the next review is decreased.
Mathematically, the memory strength model can be represented by several functions. Possible functions include, but are not limited to, a geometric
function and an exponential function. Equation 3.1 is a preferred geometric function supporting the memory strength model:
Days and Time to Next Review = Strength C1 × (Strength C2 + New Strength)^β

Eqn. 3.1

where the values STRENGTH C1, STRENGTH C2, NEW STRENGTH, and β are defined and discussed in more detail below.
Equation 3.2 is an example exponential function supporting the memory strength model.
Days and Time to Next Review = Strength C1 × β^(New Strength)

Eqn. 3.2

where the values STRENGTH C1, β, and NEW STRENGTH are defined and discussed in more detail below. Notably, the constants in Equation 3.2 are not necessarily equivalent to those of Equation 3.1.
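The two candidate scheduling functions can be sketched as follows. The constants here are hypothetical placeholders, since the pre-defined Table 4 values are not reproduced in this text:

```python
# Hypothetical constants; the pre-defined Table 4 values are not given here.
STRENGTH_C1 = 1.0
STRENGTH_C2 = 0.0
BETA = 2.0

def next_interval_geometric(new_strength, c1=STRENGTH_C1, c2=STRENGTH_C2, beta=BETA):
    """Eqn. 3.1: Days and Time to Next Review = C1 * (C2 + New Strength) ** beta."""
    return c1 * (c2 + new_strength) ** beta

def next_interval_exponential(new_strength, c1=STRENGTH_C1, beta=BETA):
    """Eqn. 3.2: Days and Time to Next Review = C1 * beta ** New Strength."""
    return c1 * beta ** new_strength
```

With these placeholder constants, a memory strength of 3 yields an interval of 9 days under the geometric function and 8 days under the exponential one; in both cases the interval grows as memory strength grows, matching the behavior described above.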
Referring to Figure 4, an example memory decay model is shown. The memory decay model is a mathematical function 404 defining a relationship between a normalized value of the value DAYS AND TIME TO NEXT REVIEW 408 defined by the memory strength model shown in Figure 3 and the user's current recall ability 412 ("PERCENT CHANCE OF RECALL") for the item of information being reviewed.
Generally, as the normalized value of DAYS AND TIME TO NEXT
REVIEW approaches "1", the likelihood that the user will recall the item decreases. Conversely, as the normalized value of DAYS AND TIME TO NEXT REVIEW approaches "0", the likelihood that the user will recall the item increases.
Mathematically, the memory decay model can be represented by several functions. Possible functions include, but are not limited to, a geometric function and an exponential function. Equation 4.1 is a preferred geometric function for defining the memory decay model.
Percent Chance of Recall = Decay C3 × (φ × Review Date Adjustment Factor)^α

Eqn. 4.1

where

φ = (Computer Device Date and Time − Date and Time of Last Review) / (Days and Time to Next Review)

Eqn. 4.1a

and

Review Date Adjustment Factor = e^[(ln(Target Recall Ability) − ln(Decay C3)) / α]

Eqn. 4.1b
The values contained in Equation 4.1, Equation 4.1a, and Equation 4. lb are defined and discussed in more detail below.
Equation 4.2 is an exemplary exponential function for defining the memory decay model.
Percent Chance of Recall = Decay C3 × e^(−α × (φ × Review Date Adjustment Factor))

Eqn. 4.2

where

φ = (Current Date and Time − Date and Time of Last Review) / (Days and Time to Next Review)

Eqn. 4.2a

and

Review Date Adjustment Factor = ln(% R Target) / ln(% R Standard)

Eqn. 4.2b
The values contained in Equation 4.2, Equation 4.2a, and Equation 4.2b are defined and discussed in more detail below.
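The geometric decay model (Eqns. 4.1 and 4.1b) can be sketched as follows; the exponential variant is analogous. DECAY_C3 and ALPHA are hypothetical placeholders for the Table 1 values. Note that the adjustment factor of Eqn. 4.1b is constructed so that recall equals the target exactly when φ = 1, that is, when the review occurs exactly on schedule:

```python
import math

# Hypothetical constants; the pre-defined Table 1 values are not given here.
DECAY_C3 = 1.0
ALPHA = -0.5  # negative, so recall falls as the elapsed fraction phi grows

def review_date_adjustment(target_recall, c3=DECAY_C3, alpha=ALPHA):
    """Eqn. 4.1b: e ** ((ln(target) - ln(C3)) / alpha)."""
    return math.exp((math.log(target_recall) - math.log(c3)) / alpha)

def percent_chance_of_recall(phi, adjustment, c3=DECAY_C3, alpha=ALPHA):
    """Eqn. 4.1: C3 * (phi * adjustment) ** alpha."""
    return c3 * (phi * adjustment) ** alpha
```

Reviewing early (φ < 1) yields a recall chance above the target; reviewing exactly on schedule (φ = 1) yields the target itself, for any value of α.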
Referring to the block flow diagram shown in Figure 5, a detailed illustration of the review cycle is disclosed. Referring to blocks 508 and 526, the manner in which these review cycle tasks are performed using the geometric scheduling algorithm is distinguished from the manner in which the same review cycle tasks are performed using the exponential scheduling algorithm. Referring to blocks 504, 512, 516, 518, 522, and 530, however, the manner in which the review cycle tasks are performed using the geometric scheduling algorithm is similar to the manner in which the tasks are performed using the exponential scheduling algorithm.
As shown in block 504, the review cycle begins with the user defining the value TARGET RECALL ABILITY for the item. Refer to the discussion of Figure 22 below for a detailed description of TARGET RECALL ABILITY definition.
Next, the REVIEW DATE ADJUSTMENT FACTOR is calculated as shown in block 508. Preferably, the REVIEW DATE ADJUSTMENT FACTOR is calculated by the server computer (not shown) according to the user's memory decay profile shown in Figure 4 and defined by Equation 4. lb or 4.2b depending on whether the exponential or geometric scheduling algorithm is implemented.
Table 1 includes, but is not limited to, pre-defined DECAY C3 and α values for Equations 4.1 and 4.2.
Table 1
Referring to block 512, the value DATE AND TIME OF NEXT REVIEW is calculated according to Equation 5.1:
Date and Time of Next Review = Date and Time of Last Review + (Days and Time to Next Review × Review Date Adjustment Factor)
Eqn. 5.1
where the values DATE AND TIME OF LAST REVIEW and DAYS AND TIME TO NEXT REVIEW are pre-defined for the user's initial review or, for subsequent reviews, are based on previous review results.
Table 2 includes, but is not limited to, exemplary pre-defined values for the DATE AND TIME OF LAST REVIEW and DAYS AND TIME TO NEXT REVIEW.
Table 2
Preferably, the server computer device 104 or the client computer device 108 shown in Figure 1 calculates the value DATE AND TIME OF NEXT REVIEW.
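Equation 5.1 amounts to adding the adjusted interval to the date of the last review, which can be sketched as:

```python
from datetime import datetime, timedelta

def date_and_time_of_next_review(last_review, days_to_next, adjustment_factor):
    """Eqn. 5.1: next review = last review + (interval * adjustment factor)."""
    return last_review + timedelta(days=days_to_next * adjustment_factor)
```

For example, a 4-day scheduled interval with an adjustment factor of 0.5 schedules the next review 2 days after the last one.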
As illustrated in block 516, the user reviews the item on or near the DATE AND TIME OF NEXT REVIEW calculated in Equation 5.1.
After the user reviews the item, the value φ is calculated as shown in block 518. The value φ is the ratio of the actual time lapsed between the last and current review to the scheduled time lapse between the last and current review, unadjusted by the value REVIEW DATE ADJUSTMENT FACTOR.
The value φ is calculated according to Equation 5.2:
φ = (Computer Device Date and Time Stamp − Date and Time of Last Review) / (Days and Time to Next Review)

Eqn. 5.2
where the value COMPUTER DEVICE DATE AND TIME STAMP is the date and time the review actually took place, the value DATE AND TIME OF LAST REVIEW is the date and time that the user reviewed the item previously, (or default value from Table 2) and the value DAYS AND TIME TO NEXT REVIEW is the scheduled time interval between the previous and current reviews defined by Equation 3.1 or 3.2.
Preferably, the server computer device 104 or the client computer device 108 shown in Figure 1 calculates the value φ.
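As a sketch, φ is simply the ratio of the actual elapsed time to the scheduled interval, measured in days:

```python
from datetime import datetime

def phi(review_timestamp, last_review, days_to_next):
    """Eqn. 5.2: actual elapsed time over scheduled interval, in days."""
    elapsed_days = (review_timestamp - last_review).total_seconds() / 86400.0
    return elapsed_days / days_to_next
```

A value of φ = 1 means the review occurred exactly on schedule; φ < 1 means it occurred early, and φ > 1 means it occurred late.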
Next, the value NEW STRENGTH is defined as shown in block 522. Qualitatively, the value NEW STRENGTH represents the user's current memory strength with respect to the reviewed item. For example, if a user reviews an item on the scheduled review date, and reviews the item correctly, then his memory strength with respect to that item is increased. Conversely, if a user reviews an item on the scheduled review date, and reviews the item incorrectly, then his memory
strength with respect to that item is decreased. However, if a user reviews an item before the scheduled review date, and reviews the item incorrectly, a larger decrease in memory strength occurs than would have occurred had the user reviewed the item incorrectly on the scheduled review date. Similarly, if a user reviews an item of content after the scheduled review date, and reviews the item correctly, a larger increase in strength occurs than would have occurred had the user reviewed the content correctly on the scheduled review date.
Table 3 defines preferred definitions of the value NEW STRENGTH:
Table 3
where the value φ is defined by Equation 5.2, the value STRENGTH is defined by the value of NEW STRENGTH defined in a previous review, the condition Correct is satisfied if the user reviewed the content correctly, and the condition Incorrect is satisfied if the user reviewed the content incorrectly.
For users reviewing content for the first time, the value STRENGTH is a pre-defined default value. A preferred default value of STRENGTH includes, but is not limited to, "1".
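Table 3 itself is not reproduced in this text, so the exact update formulas are unknown; the following sketch is one hypothetical rule set consistent with the qualitative behavior described above (late correct reviews gain more strength, early incorrect reviews lose more):

```python
def update_strength(strength, phi, correct):
    """Hypothetical NEW STRENGTH rule (the Table 3 values are not reproduced).

    Correct reviews add phi, so reviewing late (phi > 1) gains more than
    reviewing on time; incorrect reviews scale strength down by phi / 2,
    so reviewing early (phi < 1) loses more than reviewing on time.
    """
    if correct:
        return strength + phi
    return max(1.0, strength * phi / 2.0)  # floor at the default strength of 1
```

With this rule, a correct review on schedule (φ = 1) raises strength by one unit, while an incorrect review on schedule halves it, never falling below the pre-defined default of 1.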
Preferably, the server computer 104 or the client computer 108 shown in Figure 1 calculates the value NEW STRENGTH.
Next, the value DAYS AND TIME TO NEXT REVIEW is calculated as shown in block 526. Preferably, the server computer 104 shown in Figure 1 calculates the value DAYS AND TIME TO NEXT REVIEW according to the user's memory strength model shown in Figure 3 and defined by Equation 3.1 or 3.2
depending on whether the geometric or exponential scheduling algorithm is implemented.
Table 4 includes, but is not limited to, pre-defined STRENGTH C1, STRENGTH C2, and β values for Equations 3.1 and 3.2 according to the selected scheduling algorithm.
Table 4
At any time during the review process, the user may re-define the TARGET RECALL ABILITY by selecting a different REVIEW MODE as shown in block 530. Refer to the discussion of Figure 22 below for a detailed description of how a user re-defines the value TARGET RECALL ABILITY.
As shown by the general flow of Figure 5, the review process repeats to best attain or maintain the user-defined TARGET RECALL ABILITY.
Content Acquisition
Preferably, the server computer 104 shown in Figure 1 contains a plurality of informational items for review. It is also preferred that the items are grouped. Grouping criteria include, but are not limited to, the general concept shared by group item members and the author of the group.
Referring to the block diagram shown in Figure 6, a preferred embodiment of content acquisition is shown.
As illustrated by block 604, the user may acquire content by purchasing the content from other users 608 or from the server computer device 612.
As illustrated by block 616, the user may create content to review. Preferably, the server or client computer device will support content creation.
As illustrated by block 620, the user may share content with other users 622. Content may be packaged for export to or import from the server computer device or other users' client computer devices. Preferably, only content that is user-created may be exported wholesale (to guard against copyright infringement). It is also preferable that only user-created data, and not copyrighted pre-packaged content, be exported. For example, user-created alternative mnemonics can be exported.
Referring to Figure 13, an exemplary user interface displaying user content acquisition options is shown. As shown, the user may search for content 190, create content 192, and select content by feature 193 or publisher 194.
Content Organization
Referring to the block diagram shown in Figure 7, a preferred embodiment of content organization is shown.
According to the preferred embodiment, acquired content is organized into a parent-child group hierarchy 704. The hierarchy may be stored and maintained on the server computer device 708, the client computer device 712, or any other computer-related device capable of data input, output, process and storage.
As shown in block 704, acquired groups are organized within a parent-child group hierarchy. Parent groups 716a-716n topically distinguish child group content. For example, the parent group "Mathematics" contains child groups concerning mathematics (e.g., Calculus, Trigonometry, Geometry, etc.), and the parent group "Physical Sciences" contains child groups concerning Physical Science (e.g., Earth Science, Chemistry, Biology, etc.). To organize acquired content, the user may be shown the organization hierarchy having the individual groups listed according to their group display name. To add a new group, the user chooses the parent group to which the newly acquired content will belong.
Additionally, the child groups may contain sub-child groups 720a-720n. For example, the child group "Calculus" may contain the sub-child groups Calculus I, Calculus II, and Calculus III.
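The parent-child group hierarchy described above can be sketched as a simple linked structure. This is a minimal illustration only; the class and attribute names below are assumptions, not identifiers taken from the specification.

```python
# Minimal sketch of the parent-child group hierarchy of block 704.
# Names are illustrative, not taken from the specification.

class Group:
    def __init__(self, display_name, parent=None):
        self.display_name = display_name
        self.parent = parent
        self.children = []
        if parent is not None:
            parent.children.append(self)

    def path(self):
        """Return the display names from the root parent group down to this group."""
        names = []
        node = self
        while node is not None:
            names.append(node.display_name)
            node = node.parent
        return list(reversed(names))

# Build the example hierarchy from the text.
mathematics = Group("Mathematics")
calculus = Group("Calculus", parent=mathematics)
calc_ii = Group("Calculus II", parent=calculus)

print(calc_ii.path())  # ['Mathematics', 'Calculus', 'Calculus II']
```

A user adding newly acquired content would choose an existing `Group` as the parent, exactly as the text describes choosing the parent group for new content.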
Review Schedule Forecasting
Referring to Figure 8, an overview of review schedule forecasting is shown. As shown by block 750, the user at the client computer device 754 may present input to the server computer device 758 requesting a review schedule forecast 762. In response to this input, the server computer device will present the user at the client computer device 754 with a review forecast 762 extending, for example but not limited to, seven (7) days following the user's review schedule forecast request.
As shown generally in block 768, the review schedule forecast outlines the total number of items the user is scheduled to review on each scheduled review day falling within the duration of the review forecast as shown in block 772, and the total estimated daily review time as shown in block 774.
Preferably, the user 778 can schedule a vacation day or "day off" from the review cycle. In the event that a day within the forecast is a user-defined vacation, the items scheduled to be reviewed on the vacation day are moved to their respective next non-vacation day.
Refer to Figure 19 for an exemplary user interface displaying a user's review schedule forecast. Blocks 231 and 232 demonstrate the user's ability to schedule a vacation day.
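The vacation-day rescheduling described above can be sketched as follows. This is a minimal illustration assuming each item stores a scheduled review date and the user supplies a set of vacation dates; the function and variable names are hypothetical.

```python
import datetime

# Sketch: push a review date forward until it no longer falls on a
# user-defined vacation day. Names are illustrative.

def next_non_vacation_day(scheduled, vacations):
    """Return the first non-vacation day on or after the scheduled date."""
    day = scheduled
    while day in vacations:
        day += datetime.timedelta(days=1)
    return day

vacations = {datetime.date(2024, 7, 4)}
print(next_non_vacation_day(datetime.date(2024, 7, 4), vacations))
# 2024-07-05
```

Consecutive vacation days are handled naturally: the loop keeps advancing until it clears the entire vacation span.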
Referring to Figure 9, a detailed algorithm for defining the review schedule is shown. The algorithm begins with the first day to be forecast as shown in block 450. For each item whose scheduled review date is equal to that forecast day, the value TOTAL NUMBER OF ITEMS scheduled to be reviewed on that day is incremented as shown in block 458. In addition, the value TOTAL REVIEW TIME for that day is incremented in proportion to the average review time for each item forecast for that day's review as shown in block 462.
Next, the condition RANDOM RESULT is tested for each item as shown in block 468. The condition RANDOM RESULT is satisfied if a server-generated random value RANDOM NUMBER between 0 and 1 is greater than the TARGET RECALL ABILITY.
If the condition RANDOM RESULT is "Yes", then the STRENGTH for the tested item is increased by "1" as shown in block 472. Conversely, if the condition RANDOM RESULT is "No", then the STRENGTH for the tested item is decreased by "1" as shown in block 476.
As shown in block 480, the value DAYS AND TIME TO NEXT REVIEW is calculated next. In accord with the geometric review scheduling algorithm, the value DAYS AND TIME TO NEXT REVIEW is calculated according to Equation 3.1. In accord with the exponential review scheduling algorithm, the value DAYS AND TIME TO NEXT REVIEW is calculated according to Equation 3.2.
After the value DAYS AND TIME TO NEXT REVIEW is calculated for the item, the item is added to the appropriate forecast day as shown in block 484. As shown by arrow 498, the scheduling process is repeated for each item whose date and time of next review falls on the forecast day. When all such items for a particular day are forecast, the forecast process increments to the next day to be forecast as shown in block 450. As shown by arrow 499, the scheduling process is repeated for each day falling within the forecast.
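The forecasting loop of Figure 9 can be sketched as follows. Because Equations 3.1 and 3.2 are not reproduced in this passage, the `next_interval_days` helper below is a stand-in geometric schedule used purely for illustration; all names, parameters, and the average-review-time constant are assumptions.

```python
import random

# Sketch of the Figure 9 forecasting loop: for each forecast day, count the
# items due, accumulate estimated review time, simulate the RANDOM RESULT
# test against TARGET RECALL ABILITY, update STRENGTH, and reschedule.

TARGET_RECALL_ABILITY = 0.9

def next_interval_days(strength, base=1.0, ratio=2.0):
    """Placeholder for Eqn. 3.1: interval grows geometrically with STRENGTH."""
    return base * ratio ** max(strength, 0)

def forecast(items, days=7, avg_review_time=0.5):
    """items: dicts with 'strength' and 'due_day' (0 = first forecast day)."""
    schedule = []
    for day in range(days):
        due = [it for it in items if it["due_day"] == day]
        total_items = len(due)
        total_time = total_items * avg_review_time  # estimated minutes
        for it in due:
            # RANDOM RESULT: satisfied when the random draw exceeds
            # TARGET RECALL ABILITY (blocks 468-476 as described above).
            if random.random() > TARGET_RECALL_ABILITY:
                it["strength"] += 1
            else:
                it["strength"] -= 1
            # Reschedule the item onto a later forecast day (block 484).
            it["due_day"] = day + max(1, int(next_interval_days(it["strength"])))
        schedule.append({"day": day, "total_items": total_items,
                         "total_time": total_time})
    return schedule
```

The outer loop corresponds to arrow 499 (advance to the next forecast day) and the inner loop to arrow 498 (process every item due that day); the returned schedule carries the per-day totals of blocks 772 and 774.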
Last, the forecast review schedule, including the total number of items to be reviewed each day and each day's total estimated review time, is presented to the user at the client computer device as shown in block 496.
Preferably, items will be presented to the user for review in order from high to low based on the value Epsilon. Qualitatively, Epsilon is the ratio of the time actually waited since the last review to the planned time, adjusted for the value REVIEW MODE. Quantitatively, Epsilon is calculated according to Equation 11.1.
Epsilon = (Current Date and Time - Date and Time of Last Review) / (Days and Time to Next Review x Review Date Adjustment Factor)
Eqn. 11.1
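The Epsilon-based ordering can be sketched directly from Eqn. 11.1. The field names and sample dates below are illustrative; the most "overdue" item (largest Epsilon) is presented first.

```python
import datetime

# Sketch of review ordering by Epsilon (Eqn. 11.1): the ratio of time
# actually waited since the last review to the planned review interval.

def epsilon(now, last_review, days_to_next_review, review_date_adjustment=1.0):
    waited = (now - last_review).total_seconds() / 86400.0   # days waited
    planned = days_to_next_review * review_date_adjustment   # days planned
    return waited / planned

now = datetime.datetime(2024, 1, 10)
items = [
    {"name": "a", "last": datetime.datetime(2024, 1, 1), "interval": 3.0},
    {"name": "b", "last": datetime.datetime(2024, 1, 8), "interval": 4.0},
]
# Present items high-to-low by Epsilon.
items.sort(key=lambda it: epsilon(now, it["last"], it["interval"]), reverse=True)
print([it["name"] for it in items])  # ['a', 'b']
```

Item "a" has waited 9 days against a 3-day plan (Epsilon = 3.0), while item "b" has waited 2 days against a 4-day plan (Epsilon = 0.5), so "a" is reviewed first.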
Recall Management
Preferably, a user can define and re-define the value TARGET RECALL ABILITY for each review item or group of review items to match the user's desired level of knowledge for each item or item group. To define this value, the user may be presented with a graphical interface defining a plurality of REVIEW MODES.
Exemplary REVIEW MODES might include, but are not limited to, STANDARD, LONG-TERM, and CRAM. The STANDARD REVIEW MODE is best suited for a user attempting to review information at an average rate. The LONG-TERM REVIEW MODE is best suited for a user attempting to review information over an extended period of time such as a lifetime. The CRAM REVIEW MODE is best suited for a user attempting to review information at an accelerated pace in preparation for an exam. Table 5 lists three exemplary REVIEW MODES and their corresponding TARGET RECALL ABILITIES.
Table 5
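The mapping from REVIEW MODE to TARGET RECALL ABILITY can be sketched as a simple lookup. The percentage values below are illustrative placeholders only; they are not the values of Table 5, which is not reproduced in this passage.

```python
# Sketch: a REVIEW MODE selects a TARGET RECALL ABILITY.
# The numeric values are illustrative, NOT taken from Table 5.

REVIEW_MODES = {
    "STANDARD": 0.90,   # illustrative value
    "LONG-TERM": 0.70,  # illustrative value
    "CRAM": 0.99,       # illustrative value
}

def target_recall_ability(mode):
    """Look up the target recall ability for a review mode."""
    return REVIEW_MODES[mode]

print(target_recall_ability("CRAM"))  # 0.99
```

Re-defining an item group's REVIEW MODE, as the user does in the recall-management examples below, amounts to changing which entry of this mapping governs the group's scheduling.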
Optionally, a user may schedule a test date for the item. Refer to Figure 10 for a detailed description of test scheduling. When a test is scheduled, the REVIEW MODE will switch to CRAM. Preferably, the REVIEW MODE will switch to STANDARD when a test is canceled.
Referring to Figure 18, an exemplary user interface displaying a preferred embodiment of recall ability management is shown. In this figure, four REVIEW MODES are graphically shown: "Cram" 850, "Learn" 852, "Maintain" 854, and "Life-Long Learning" 856.
In accord with the preferred embodiment, a user has the ability to organize review items or item groups according to the REVIEW MODE of his or her personal choice. For example, a high-school student user might begin a review of French by placing the acquired French I content under the "Learn" REVIEW MODE 852. Shortly before her French I final exam, the student might transfer the content from the "Learn" category 852 to the "Cram" category 850, thereby re-defining the REVIEW MODE and corresponding TARGET RECALL ABILITY associated with the French I content. The practical effect of this change is to increase the frequency of French I review in preparation for the exam. After the student completes the course, she may wish to transfer the French I content, or a portion of the French I content, to the category "Life-Long Learning" 856. The student might choose to do so in hopes of maintaining recollection of French I content over the course of her lifetime. Accordingly, the TARGET RECALL ABILITY associated with the French I content may be reduced commensurate with the "Life-Long Learning" REVIEW MODE, substantially reducing the frequency of scheduled French I reviews.
Preferably, items in the "Life-Long Learning" category 856 will be identified as such by the user control button 183 shown in Figure 14. Accordingly, a user may operate the user control button 183 to move items into and out of the "Life-Long Learning" category.
Performance Feedback
Preferably, the server computer device will present the user at the client computer device with feedback information regarding review performance and progress. User statistical information includes, but is not limited to, the user's memory strength with respect to each review item or group of items, the total number of items scheduled for review, the average review time for each item or item group, the total number of active items, and a forecast review load including the forecast number of items to be reviewed and their average review time.
Referring to Figure 20, an exemplary user interface displaying user statistical information is shown.
For each item or item group shown in block 360, the user may be presented with graphical feedback illustrating the number of questions asked for each item group as shown in block 364. Additionally, the user may be presented with graphical feedback illustrating the user's memory strength in each selected category as shown in block 368, and each review item as shown in block 182 of Figure 14.
User Administration Options
Referring to the block diagram shown in Figure 10, a preferred embodiment of the user's review administration options is shown.
According to the preferred embodiment, the user 850 at the client computer device 852 may communicate with the server computer device 854 to administer the user's review schedule 856, administer the review content 858, administer the user's account information 860, and schedule newly acquired items for review 862.
Review schedule administration 856 includes, but is not limited to, scheduling a test, canceling a test, scheduling a review vacation day, and canceling a review vacation day.
Review content administration 858 includes, but is not limited to, selecting or creating lesson builders. Lesson builders allow a user to manage what and how much information is presented for review. Preferably, the user can select from default lesson builders defined by the server computer device, or the user can create his or her own lesson builder. Default lesson builders include, but are not limited to, STANDARD DAILY, THIRTY MINUTE, and TEST. The STANDARD DAILY lesson builder includes all items whose DATE AND TIME OF NEXT REVIEW is less than or equal to the present date. The THIRTY MINUTE lesson builder includes all standard daily items such that the total time to review is less than 30 minutes. The TEST lesson builder includes all items for which a test has been scheduled and whose DATE AND TIME OF NEXT REVIEW is less than or equal to the present date.
Criteria for creating lesson builders include, but are not limited to, the following:
- content group - the user can select groups of content for review
- "due for review" status - the user can select content items that are due for review
- total number of content items - the user can request that the total number of items to review does not exceed a user-specified limit
- total review time - the user can request that the total estimated review time for the items does not exceed a user-specified limit
Preferably, as the user builds his or her lesson, a running estimate of total items for review and total time for review shall be displayed to the user for reference.
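A lesson builder applying these criteria can be sketched as a filter over the user's items. The field names, the function signature, and the return shape (lesson plus the running totals displayed to the user) are illustrative assumptions.

```python
# Sketch of a lesson builder: filter by content group and due-for-review
# status, then cap the item count and total estimated review time.
# Field and parameter names are illustrative.

def build_lesson(items, today, groups=None, max_items=None, max_minutes=None):
    """Return (lesson, total items, total estimated minutes)."""
    lesson, total_minutes = [], 0.0
    for it in items:
        if groups is not None and it["group"] not in groups:
            continue  # criterion: content group
        if it["due"] > today:
            continue  # criterion: "due for review" status
        if max_items is not None and len(lesson) >= max_items:
            break     # criterion: total number of content items
        if max_minutes is not None and total_minutes + it["avg_time"] > max_minutes:
            break     # criterion: total review time
        lesson.append(it)
        total_minutes += it["avg_time"]
    # The running totals are what the interface would display as the
    # lesson is built.
    return lesson, len(lesson), total_minutes

items = [
    {"group": "French", "due": 1, "avg_time": 0.5},
    {"group": "Math",   "due": 1, "avg_time": 0.5},
    {"group": "French", "due": 5, "avg_time": 0.5},
]
print(build_lesson(items, today=2, groups={"French"})[1:])  # (1, 0.5)
```

The THIRTY MINUTE default lesson builder described above corresponds to calling this filter with `max_minutes=30` and no group restriction.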
User account administration 860 includes, but is not limited to, maintaining the user's name, contact information, system identification, password, billing status, native language, and services subscribed to.
Schedule Newly- Acquired Items for Review 862 includes, but is not limited to, scheduling all items for immediate review, scheduling a fixed number of items to be introduced daily or weekly, or selecting a pre-defined or user-defined curriculum that defines the scheduling of items.
Content Definition
In accord with a preferred embodiment of the present invention, each knowledge item and knowledge item group is defined by a series of attributes. Knowledge item attributes include but are not limited to: USERID, ITEMID, CHAPTERID, GROUPID, AUTHORID, PROMPT, RESPONSE, REFERENCE DATA, STRENGTH, DATE AND TIME OF NEXT REVIEW, DAYS AND TIME UNTIL NEXT REVIEW, TOTAL TIMES REVIEWED, TOTAL TIME SPENT REVIEWING, TOTAL TIMES CORRECT, ACTIVE STATUS, FLAG, LIFE LONG LEARNING STATUS, and DATE ACTIVATED.
Preferably, knowledge item attributes are updated following each item review. The update may be performed by the server computer device, the client computer device, or both.
Knowledge item group attributes include but are not limited to: USERID, GROUPID, PARENTGROUPID, AUTHORID, AUTHORGROUPNAME, GROUP DISPLAY NAME, TEST DATE, REVIEW MODE, STRENGTH_C1, STRENGTH_C2, DECAY_C3, ALPHA, BETA, COST, ICON, TIME PERIOD FOR ACTIVATING NEW ITEMS, and NUMBER ADDED EACH PERIOD.
Preferably, knowledge item group attributes are updated following each item review. The update may be performed by the server computer device, the client computer device, or both.
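A subset of the knowledge item attributes listed above can be sketched as a record type, with the post-review update performed as a method. The types, defaults, and the reduced attribute set are assumptions for illustration; the specification's full attribute list is longer.

```python
from dataclasses import dataclass, field
import datetime

# Sketch: a subset of the knowledge item attributes as a record type.
# Types and defaults are assumptions.

@dataclass
class KnowledgeItem:
    item_id: int
    group_id: int
    prompt: str
    response: str
    strength: int = 0
    total_times_reviewed: int = 0
    total_times_correct: int = 0
    active_status: bool = True
    date_activated: datetime.date = field(default_factory=datetime.date.today)

    def record_review(self, correct: bool):
        """Update per-review counters following each item review."""
        self.total_times_reviewed += 1
        if correct:
            self.total_times_correct += 1

item = KnowledgeItem(1, 10, "bonjour", "hello")
item.record_review(correct=True)
print(item.total_times_reviewed, item.total_times_correct)  # 1 1
```

Whether `record_review` runs on the server computer device, the client computer device, or both is an implementation choice, consistent with the update flexibility described above.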
Data Management
Referring to Figure 11, a preferred embodiment of system data tables is shown. Preferably, the system data tables are maintained on the server computer, the client computer, or both. Each table contains at least one data field label with a corresponding data field value (not shown). The data field labels associated with each data table are described below.
The table USER INFO 270 includes but is not limited to the following data fields: SYSTEM-SPECIFIED USER ID, USER IDENTIFICATION INFORMATION, USER ADDRESS, NATIVE LANGUAGE, BILLING STATUS, AND ACTIVE STATUS.
The table USER ACCOUNTING DATA 272 includes but is not limited to the following data fields: SYSTEM-SPECIFIED USER ID, TOTAL NUMBER OF ITEMS, ACTIVE STATUS, AND AMOUNT BILLED (the amount of money that the user has been billed for the review service).
The table AUTHOR INFO 274 includes but is not limited to the following data fields: AUTHOR IDENTIFICATION AND AUTHOR CONTACT INFORMATION.
The table USER GROUP DATA 276 includes but is not limited to the following data fields: USERID, GROUPID, PARENTGROUPID, AUTHORID, AUTHORGROUPNAME, GROUP DISPLAY NAME, TEST DATE, REVIEW MODE, STRENGTH_C1, STRENGTH_C2, DECAY_C3, ALPHA, BETA, COST, ICON, TIME PERIOD FOR ACTIVATING NEW ITEMS, and NUMBER ADDED EACH PERIOD.
The table GROUP HISTORICAL DATA 278 includes but is not limited to the following data fields: the SYSTEM SPECIFIED USER ID, the GROUP AUTHOR NAME, and HISTORICAL DATA including but not limited to the number of correct and incorrect reviews for the group, and the value φ for each review.
The table PURCHASED GROUP LIST 280 includes but is not limited to the following data fields: SYSTEM SPECIFIED USER ID, GROUP AUTHOR NAME, GROUP AUTHOR INFORMATION, PURCHASE DATE AND MARKETING CODE.
The table CONCEPTS 282 includes but is not limited to the following data fields: the GROUP AUTHOR NAME, GROUP AUTHOR INFORMATION, and AUTHOR ITEM VERSION.
The table REVIEW MODE 284 includes but is not limited to the following data fields: REVIEW MODE and TARGET RECALL PERCENTAGE.
The table ITEM CORE 286 includes but is not limited to the following data fields: ITEMID, CHAPTERID, GROUPID, AUTHORID, PROMPT, RESPONSE, and REFERENCE DATA.
The table ITEM EXTENDED 288 includes but is not limited to the following data fields: USERID, STRENGTH, DATE AND TIME OF NEXT REVIEW, DAYS AND TIME UNTIL NEXT REVIEW, TOTAL TIMES REVIEWED, TOTAL TIME SPENT REVIEWING, TOTAL TIMES CORRECT, ACTIVE STATUS, FLAG, LIFE LONG LEARNING STATUS, DATE ACTIVATED, STORAGE STATUS.
The table SCHEDULED ITEMS 290 includes but is not limited to the following data fields: the SYSTEM SPECIFIED USER ID, GROUP AUTHOR NAME, GROUP AUTHOR INFORMATION, AUTHOR ITEM VERSION, and the DATE ADDED (the date that the item will be added to the user's list of items to be reviewed).
The table SYSTEM PARAMETERS 292 includes but is not limited to the data fields and corresponding data field values shown in Table 6:
Table 6
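The split between ITEM CORE (shared item content) and ITEM EXTENDED (per-user review state) described above can be sketched as two relational tables joined on ITEMID. The column subsets, types, and values below are assumptions for illustration.

```python
import sqlite3

# Sketch of the ITEM CORE / ITEM EXTENDED split: shared content in one
# table, per-user review state in another, joined on the item identifier.
# Column subsets and types are assumptions.

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE item_core (
    item_id   INTEGER PRIMARY KEY,
    group_id  INTEGER,
    prompt    TEXT,
    response  TEXT
);
CREATE TABLE item_extended (
    user_id   INTEGER,
    item_id   INTEGER REFERENCES item_core(item_id),
    strength  INTEGER,
    date_of_next_review TEXT
);
""")
conn.execute("INSERT INTO item_core VALUES (1, 10, 'bonjour', 'hello')")
conn.execute("INSERT INTO item_extended VALUES (7, 1, 2, '2024-01-10')")
row = conn.execute("""
    SELECT c.prompt, e.strength
    FROM item_core c JOIN item_extended e ON c.item_id = e.item_id
    WHERE e.user_id = 7
""").fetchone()
print(row)  # ('bonjour', 2)
```

Keeping authored content separate from per-user state lets many users share one ITEM CORE row while each carries an individual ITEM EXTENDED record, matching the table layout described above; the tables may reside on the server computer, the client computer, or both.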
Referring to the block diagram shown in Figure 14, a preferred embodiment of data flow is shown.
As shown in block 150, server computer device 152 presents an output 154 to an input 156 of the client computer device 158. Output 154 contains, but is not limited to, a content item or content group, together with the corresponding item or group information.
As shown in block 160, client computer device 158 presents an output 162 to an input 164 of the server computer device 152. Output 162 contains, but is not limited to, review information and item edit information.
The review information includes, but is not limited to, the USER IDENTIFICATION, ITEM AUTHOR, ITEM VERSION, AUTHOR GROUP NAME, DATE AND TIME OF REVIEW, REVIEW PERFORMANCE, ELAPSED REVIEW TIME, and Φ.
The item edit information includes, but is not limited to, the USER IDENTIFICATION, ITEM AUTHOR, ITEM VERSION, AUTHOR GROUP NAME, the NUMBER OF EDITS, and for each edit, the TYPE OF EDIT, the NAME OF THE EDITED FIELD, and the NEW FIELD VALUE.
While embodiments of the invention have been illustrated and described, it is not intended that these embodiments illustrate and describe all possible forms of the invention. Rather, the words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the invention.