

Featured Guest Speaker

J. Olin Campbell,
President, Performance Mentor
Research Associate Professor,
Vanderbilt University, Nashville, TN

Posted: May 16, 1998

Asynchronous Learning Networks:
Evaluating Anytime/Anywhere Learning


An Asynchronous Learning Network (ALN) is a people network for learning that is largely asynchronous. It combines self-study with substantial, rapid, asynchronous interactivity with others. In ALNs learners use computer and communications technologies to work with remote learning resources, including coaches and other learners, but without the requirement to be online at the same time.

By this definition, a Web-based workshop that requires frequent online conferencing and collaboration with others is an ALN. So is a text- or computer-based training course that requires learners to use email to discuss assignments with each other and with the coach. An ALN may also include a proctored examination at a specified time and place, occasional synchronous chat or lab sessions for learners in close proximity, or an in-person kickoff meeting.

By this definition, distance education based primarily on a synchronous audio or video presentation or conference is not ALN, because it requires learners and instructors to be available at the same time. A videotaped course, a mail-based correspondence course, or stand-alone computer-based training is not ALN either, because these do not include substantial and rapid interactivity with others, even though the learner might mail in a paper or test and receive a reply days later.

My work includes development and evaluation of computer- and Web-based tools for learning and performance support. While much of the work is in technical areas, I also apply the learning strategies to developing interpersonal coaching skills. One study found that methods combining computer, video, and instructor support produced significantly higher learner performance with one-third the instructor time, compared with traditional classroom delivery [1].

I will share with you here some of our current thinking, based on a Web workshop we are offering in which over 1000 people have enrolled. Related information, along with a sizable body of current ALN knowledge and tools, is readily available on the ALN Web site at http://www.aln.org.

ALN includes strategies where learners are separated by time and space but joined by common interests and electronic communication with each other and with coaches. Computers, random-access audio and video, and the World Wide Web afford the opportunity to use learning strategies such as individualized tutorials, group projects whose members are at a distance, and simulations. A worldwide course might allow learners and coaches to collaborate using Web pages and a conferencing system. Required postings and projects can be submitted electronically.

ALN can be helpful for working adults who need to develop and maintain knowledge and skills. As described below, it may also improve learning, decrease learning time, or decrease costs. However, there are considerable problems. Users may not realize promised results, as when low course completion rates cancel out improved performance. These larger issues must be addressed by looking at the impact of learning programs as a whole.

Design Strategies

The overall strategy is first to provide challenges and projects that are authentic for a given content area and learner ability, and then to offer learning opportunities, mentoring, and collaboration with others to help learners meet the challenge. Here are some additional strategies:

  • Organize the course by assignments with their due dates. Make the assignments frequent (e.g., one per week). Specify what must be accomplished in each assignment and who may work on it (e.g., individuals or a team). Say in what form it is to be submitted (e.g., emailed text, reply to a topic in a computer conferencing system, URL for a Web page, computer file as an email attachment, paper), and the criteria for completion. Put all assignments in one place (e.g., on the course syllabus).
  • Make explicit the criteria for success: Provide sample test problems with some exemplar solutions, and project scoring criteria with good examples from prior years (with approval from their authors). For situations where learners will apply principles to solve a complex problem, provide a set of scoring criteria with examples of how the principles are applied in practice.
  • Get learners actively engaged in learning via exercises, simulations, required postings to online conferences, and projects. This involvement is one of the keys to learner investment of time and to course completion. It changes the focus from a lecture for passing information, to learners constantly seeking the information they need to solve problems -- as they must on their jobs, in their families, and in communities.
  • Form cohorts of learners who progress together, rather than having individuals begin and end on their own. Individual self-study provides flexibility, but it removes the regular due dates and peer discussion of assignments that help improve completion rates.
  • Assess learner performance frequently and provide a grade, certification, or other indicator of value. This provides both feedback and motivation. The goal is to increase learner engagement with the course, which basically means their spending time on it. Provide frequent assignments with feedback, and develop the expectation that learners can succeed by getting their questions answered by another learner or the course coach.
  • Evaluate the learning environment to improve it (formative evaluation). Include at least an evaluation of learner reaction (the typical end of course rating) and of learner performance in the study program (e.g., scores on criterion measures that can be reliably graded). If possible use a comparison group (e.g., individuals in the previous study program), and follow up by tracking performance of course graduates in an organization where they use what they learned (e.g., ratings of supervisors, peers, and subordinates on the job).

Evaluation Strategies

There are many sources for evaluation methods [2, 3, 4, 5, 6, 7, 8]. Kirkpatrick [9] and Phillips [10] describe several levels of evaluation:

  1. Learner reaction to the methods, materials, and media of the learning environment (e.g., course evaluations)
  2. Learner performance in the learning environment (e.g., exams, projects, portfolios)
  3. Learner performance on the job (e.g., supervisor ratings)
  4. Benefit of the learning for an organization (e.g., increased sales, decreased costs to facilitate a course)
  5. Return on investment in terms of cost and organization impact (e.g., a $30,000 safety course saves $120,000 in accident costs; a short worked example follows this list)
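
As a minimal illustration of level 5, here is a short Python sketch that works through the hypothetical figures in the example above; the numbers are illustrative, not measured results:

# ROI sketch using the hypothetical figures from level 5 above:
# a $30,000 safety course that avoids $120,000 in accident costs.
program_cost = 30_000          # total cost to develop and deliver the course
benefit = 120_000              # accident costs avoided after training

net_benefit = benefit - program_cost
roi_percent = net_benefit / program_cost * 100
benefit_cost_ratio = benefit / program_cost

print(f"Net benefit: ${net_benefit:,}")                   # $90,000
print(f"ROI: {roi_percent:.0f}%")                         # 300%
print(f"Benefit/cost ratio: {benefit_cost_ratio:.1f}:1")  # 4.0:1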

Most evaluations of distance learning currently use self-reports and qualitative evaluations. For example, learners may be asked to rate how much they liked specific aspects of the ALN, and trained observers may record what happens while learners are working. A few evaluations track grades or other course performance measures, then ask the instructor to say how students in this year’s course compare to students in prior years (an opinion). Fewer still use data from a comparison group (e.g., another section in this year’s course). Only a very few people attempt experimental evaluations of ALN [1].
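
As a rough sketch of what a minimal experimental comparison might look like, the following Python fragment compares final-exam scores for a hypothetical ALN section and a comparison classroom section. The scores are invented for illustration and are not data from any study cited here.

# Hypothetical sketch: comparing final-exam scores for an ALN section
# against a comparison (classroom) section. All scores are invented.
from scipy import stats

aln_scores = [82, 75, 91, 68, 88, 79, 85, 90, 73, 84]
classroom_scores = [78, 70, 85, 65, 80, 74, 77, 83, 69, 76]

# Welch's t-test (does not assume equal variances between the groups)
t_stat, p_value = stats.ttest_ind(aln_scores, classroom_scores, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value suggests the difference in mean scores is unlikely to be
# due to chance alone; random assignment to sections is what lets us
# attribute the difference to the learning environment itself.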

Evaluation may initially use self-reports and anecdotal information to rapidly and inexpensively gather information about the effects of an ALN program. End-of-course rating sheets are an example of self-reported opinions. Anecdotes and problem descriptions about what works and what doesn’t can be very helpful in a formative evaluation.

Another method is a qualitative or descriptive evaluation. Descriptions and analyses may be especially important to understand how to scale up a program. You may gain more insight into intervention effects using a qualitative evaluation than an experimental evaluation. However, initial promising results may lead you to use an experiment to test the links in a suspected chain of causes and effects.

Evaluating Costs and Benefits for Customers and Providers

We want to identify the factors that decrease cost and improve learning and organizational impact. We attempt to identify factors associated with lower cost for ALN than for classroom-based courses, where the level of learning is equivalent or better. Cost to providers includes direct, indirect, and subsidized expenses (e.g., from state governments to their colleges, and grants to individuals). Cost for learners includes their expenses for the course, travel, housing, meals, and related materials. The value of learner time may be included as a cost if a company, subsidy, or scholarship pays for it.

Costs are of two types: development/maintenance costs and incremental costs per learner. The former are concentrated in the design and authoring phase. The latter are incurred for coaching support, equipment operation and maintenance, and related items having to do with supporting learners. ALN costs are often higher at the front end than for typical lecture courses, but these costs may be distributed among a larger group of learners.

The economic key is to spread development and revision costs over large numbers of learners, and to drive out ongoing costs. This can be accomplished by incorporating much of the learning support in tutorials, simulations, and online responses to frequently asked questions (FAQs), and by using very capable but low-cost coaches (e.g., working professionals in the field who enjoy coaching).
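
As a minimal sketch of this cost structure, the Python fragment below shows how average cost per learner falls as a fixed development cost is spread over more learners. All figures are invented for illustration.

# Hypothetical cost model: a fixed development/maintenance cost spread over
# a cohort, plus an incremental (coaching, support) cost per learner.
def cost_per_learner(development_cost, incremental_cost, num_learners):
    """Average cost per learner for a given cohort size."""
    return development_cost / num_learners + incremental_cost

development_cost = 20_000   # one-time design and authoring (invented figure)
incremental_cost = 150      # coaching, grading, and support per learner (invented)

for n in (25, 100, 500, 2000):
    print(f"{n:5d} learners: ${cost_per_learner(development_cost, incremental_cost, n):,.0f} per learner")
# 25 learners:   $950 per learner
# 100 learners:  $350 per learner
# 500 learners:  $190 per learner
# 2000 learners: $160 per learner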

To evaluate effectiveness, we identify factors associated with higher learner performance (e.g., the number of people passing a certification test on the first try, or final exam grades) and better organization impacts (e.g., lower accident costs after a safety training program, or increased enrollments for a school). In an academic environment it is difficult to assign a monetary value to performance gains. However, from a larger view -- for example, that of a state supporting higher education -- complex evaluation is justified to investigate the effect of improved capabilities in the labor force on attracting new business and decreasing welfare costs. At the corporate level, it is quite feasible to attach monetary value to improved performance outcomes by those who have completed a performance program. See Phillips [10] for strategies and examples.

The goal here is to cut cost for equivalent or better learning. In rough terms, a 15-week academic course with about 45 contact hours that already exists in lecture format can be converted for Web delivery for about $15K - $20K by an academic team. This does not include expenditures for equipment. That is an exceedingly low figure in comparison to development of a new training program for business and professional use, which can cost $35K per contact hour (over $1.5 million for a comparable 45 contact hours).

Development costs for new courses vary with the application. An academic course is likely to be priced lower than a professional course where complex technical skills and simulations are involved (note, however, that every lecture at a top university may cost as much as a Broadway play). We do not yet have much experience with costs for new courses. There is a tradeoff between comprehensive automated interactions (e.g., automatically graded simulations) and personnel costs to support the course. At one end is computer-based instruction that provides information and multiple-choice questions, with no interaction with others. At the other is an individual tutorial with extensive simulations.

Evaluation Results

Evaluations of asynchronous computer-based learning and interactive video indicate that these methods can decrease learning time or improve learner performance by approximately 30 percent [11, 12, 13, 14]. In two experimental studies [1], we found that, even for complex interpersonal skills, a combination of classroom and collaborative computer tutorials can decrease the facilitator time required by two-thirds, with equivalent or better performance. This can decrease cost per learner.

Where learners are paid, their salaries are usually the most expensive part of training. Decreasing learning time can produce large savings.
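
Here is a rough, hypothetical sketch of those savings, assuming 100 paid learners, a 40-hour course, a $50/hour loaded labor rate, and the roughly 30 percent time reduction cited above; every figure is an assumption for illustration only.

# Hypothetical sketch: salary savings from shorter learning time.
learners = 100            # number of paid learners (assumed)
course_hours = 40         # original course length in hours (assumed)
loaded_rate = 50          # fully loaded labor cost per learner-hour (assumed)
time_reduction = 0.30     # ~30 percent reduction in learning time, as cited above

hours_saved = learners * course_hours * time_reduction
salary_savings = hours_saved * loaded_rate
print(f"Hours saved: {hours_saved:,.0f}")        # 1,200 hours
print(f"Salary savings: ${salary_savings:,.0f}") # $60,000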

Value of Evaluation

Assessment of learning can guide and direct instruction [4], so learning is focused on what is most important. This is especially important with Advanced Placement courses in high school and certification training classes for professionals. This focusing function of assessments can contribute to improved learning efficiency.

We have conducted a preliminary analysis of three ALN studies at Vanderbilt that used experimental designs. These studies indicate that learners who used an electronic laboratory simulator on their own time, in lieu of physical labs, were able to solve problems on physical equipment as fast as or faster than those who took physical labs. In addition, scores on written lab tests were equivalent between the two groups. Use of the simulations permitted Vanderbilt to drop all night and weekend labs in the course, and thus to reassign a Teaching Assistant to work with students at night, when they were doing their homework and had questions.

Mayadas [15] reports on several important ALN studies:

  • Silsbee & Draeger [16] at Cornell University had students work asynchronously, with homework and quizzes turned in at prescribed times. Silsbee or a graduate assistant was available through email or a Web-based system. On the basis of homework, quizzes, and post-semester discussion with students on the subject matter, Silsbee judged that the asynchronous cohort had learned as much as students in other classes.
  • At Drexel, Charles Monsanto reports (in a personal communication to Frank Mayadas [15]) that homework and examination results and student surveys lead to the conclusion that ALN learning is equivalent to that of face-to-face classes. All students felt that seeing the ideas of others was useful, 67% felt they had more communication with fellow students, 97% felt they had more access to the facilitator than in conventional classes, and 95% said they would take another ALN course. Half, however, said that they missed classroom lectures.
  • At Northern Virginia Community College, Sener [17] reports on a study of 250 students across four courses. Of the ALN students who turned in at least one assignment, 65% earned an A, B, or C grade, while 57.5% of those in the on-campus course received these grades. Retention rates are reported to be comparable as well, and students are said to have been very positive in their comments and survey responses.
  • At New Jersey Institute of Technology, Hiltz [18] reports that 50 percent of ALN students received grades of A or B, compared with 31 percent of students in traditional classes. Students said on surveys that ALN improved their learning. (A sketch after this list shows how such grade comparisons might be tested statistically.)
  • At UIUC, Oakley [19] reports that ALN sections of ECE270 had better results and lower dropout rates than those doing paper and pencil homework and face-to-face interactions. This held when the student/faculty ratio was increased by 50%. These ALN students attended regular lectures but turned in homework and interacted with others via computer network.
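
As a minimal sketch of how grade comparisons like those above might be tested statistically, the following Python fragment runs a two-proportion z-test. The counts are hypothetical -- the summaries above report percentages rather than the exact group sizes needed -- so this only illustrates the method, not the published results.

# Hypothetical two-proportion z-test, in the spirit of the NVCC and NJIT
# comparisons above. The counts are invented; only the proportions echo
# the reported figures (e.g., 65% vs. 57.5% earning A/B/C grades).
import math

def two_proportion_ztest(success1, n1, success2, n2):
    """z statistic and two-sided p-value for the difference of two proportions."""
    p1, p2 = success1 / n1, success2 / n2
    pooled = (success1 + success2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# e.g., 78 of 120 ALN students vs. 75 of 130 on-campus students with A/B/C grades
z, p = two_proportion_ztest(78, 120, 75, 130)
print(f"z = {z:.2f}, p = {p:.3f}")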

Those who are just learning the language in which instruction is presented, or who are shy, will often take a more active role in ALN discussions than in classroom discussions. ALN may equalize dominance in online discussions because it gives all participants an equal opportunity. It also puts the focus on the content, not the sender. This may contribute to lowered learner anxiety and a greater sense of empowerment. These issues can be investigated further.

Conclusion

ALN programs have been evaluated with positive results. It is now possible to extend these evaluations to comparisons of performance gains, to impact on organizations, and to cost/benefit analysis. The most important impact will likely come through formative evaluations that provide feedback to course designers and sponsors about the relative effectiveness and cost/benefit ratios of various learning strategies.

Note on the ALN Web: Anytime, Anywhere Learning Networks

The Sloan Foundation has provided funds for us to maintain a central clearinghouse of Web-accessible ALN information. The objectives of the ALN Web are to:

  1. Provide a focal point for information interchange among researchers and practitioners in the field of asynchronous learning networks.
  2. Provide a scholarly reviewed on-line journal that captures the archival knowledge of the field.

The ALN Web contains the Journal of Asynchronous Learning Networks (JALN), the ALN Magazine, news posting areas, workshops, interactive discussions, and other features directly or indirectly related to ALN. If you are involved, or are contemplating involvement, in leading-edge learning and development programs, you will find a wealth of resources here.

This article is adapted from a presentation at the Society for Applied Learning Technology Orlando Multimedia conference [20].

References

  1. Campbell, J. O., Lison, C. A., Borsook, T. K., Hoover, J. A., Arnold, P., "Using Computer And Video Technologies To Develop Interpersonal Skills," Computers in Human Behavior, 11, 2, 1995, pp. 223-239.
  2. Babbie, E. R., The Practice of Social Research (6th Ed.), Belmont, CA: Wadsworth, 1992.
  3. Campbell D. T., Stanley, J. C., "Experimental and Quasi-Experimental Designs For Research On Teaching," In N. L. Gage (Ed.), Handbook Of Research On Teaching. Chicago: Rand McNally, 1963.
  4. Campbell, J. O., "The Opportunities Of A Changing Technology For Performance Assessment," In What We Can Learn From Performance Assessment For The Professions: 1992 Invitational Conference, Princeton, N.J.: Educational Testing Service, 1992, pp. 69-76.
  5. Dick, W., King, D., "Formative Evaluation In The Performance Context," Performance and Instruction, 33, 9, Oct. 1994, pp. 3-8.
  6. Phillips, J. J., Handbook of Training Evaluation and Measurement Methods (2nd Ed.). Houston: Gulf Publishing, 1991.
  7. Walker, N., A Primer on Evaluating Your Engineering Education Research Project, Atlanta: Georgia Institute of Technology, SUCCEED Engineering Education Coalition, http://succeed.engr.vt.edu/walker.html, 1996.
  8. Weiss, C. H., Evaluation Research: Methods for Assessing Program Effectiveness, Englewood Cliffs, New Jersey: Prentice-Hall, 1972.
  9. Kirkpatrick, D. L., Evaluating Training Programs: The Four Levels. San Francisco: Berrett-Koehler, 1994.
  10. Phillips, J. J., In Action: Measuring Return on Investment (Vol. 1), Alexandria, VA: American Society for Training and Development, 1994.
  11. Orlansky, J., String, J., Cost Effectiveness Of Computer-Based Instruction In Military Training (IDA Paper P-1375), Alexandria, VA: Institute for Defense Analyses, 1977.
  12. Kulik, J., Kulik, C., "Computer-Based Instruction: What 200 Evaluations Say," In M. Simonson, S. Zvacek (Eds.), Proceedings of Selected Research Paper Presentations at the 1987 Convention of the Association for Educational Communications and Technology, Ames, IA: Iowa State University, 1987, pp. 18-24.
  13. Fletcher, J. D., Effectiveness And Cost Of Interactive Videodisc Instruction In Defense Training And Education (IDA Paper P-2372), Alexandria, VA: Institute for Defense Analyses, 1990.
  14. Cronin, M. W., Cronin, K. A., "Recent Empirical Studies Of The Pedagogical Effects Of Interactive Video Instruction In ‘Soft Skill’ Areas," Journal of Computing in Higher Education, 3, 2, 1992, pp. 53-85.
  15. Mayadas, A. F., "Asynchronous Learning Networks: A Sloan Foundation Perspective," Journal of Asynchronous Learning Networks, 1, 1, 1997, pp. 1-16.
  16. Silsbee, R. H. & Draeger, J., Exploring the Solid State: Guided Simulations for Teaching, Cambridge, U.K.: Cambridge University Press, 1997.
  17. Sener, J., "Developing a Distance Education Engineering Program for Home-Based Learners: Lessons Learned," J. Instructional Delivery Systems (in press).
  18. Hiltz, S. R. The Virtual Classroom: Learning without Limits via Computer Networks, Norwood, N. J.: Ablex, 1994.
  19. Oakley, B., "A Virtual Classroom Approach to Learning Circuit Analysis," IEEE Trans. on Education, 39, 1996, pp. 287-296.
  20. Campbell, J. O., Bourne, J. R., "Designing ALN Systems," Orlando Multimedia ‘97, Warrenton, VA: Society for Applied Learning Technology, 1997.

J. Olin Campbell, President of Performance Mentor (615) 665-9105, and
Research Associate Professor at The ALN Web Group, Vanderbilt University, 615-322-0110.
campbejo@ctrvax.vanderbilt.edu, http://www.aln.org/alnweb/


Would you like to offer some thoughts or add to the dialog? Responses of general interest may be posted below. IMPORTANT: Make sure the subject line of your message contains: Comment on Guest Speaker 5/98.
========= Reply =========================
From: (Rick Dove) Date: Fri, 22 May 1998
Olin - There is a lot of new thinking about "multiple intelligences" and the belief that people have a variety of different learning styles. I know that some formal education environments and some corporate continuing education programs are experimenting with different ways to reach different people in a learning activity. Everything I have heard so far makes heavy use of face-to-face interaction to discern when a different learning style must be accommodated with a different delivery approach. Have any of the ALN programs you are aware of addressed this "custom delivery" concept?

========= Reply =========================
From: "J. Olin Campbell"
campbejo@ctrvax.Vanderbilt.Edu Date: Sat, 23 May 1998
There are several items here:
- Learning Styles
- Discerning Learning Styles
- ALN "Custom Delivery" programs

I have found two ways to accommodate different learning styles:
a. Include a range of activities that encompass several styles. For example, have some projects where folks construct their own problems, and some activities that use direct instruction and practice.
b. Discern, or allow the learner to select, a style, and provide custom-tailored instruction for that style.

I bid on and helped guide a large project where we tried approach b. Not surprisingly, it cost about twice what developing for one style, or a mixture of styles, would have. We had to develop the same content in two basically independent modes. The mode we thought would be most engaging (video, including conversations with a cartoon character who knew all about the inner workings of the system the technicians were learning) was dropped by almost every learner after the first five minutes. The "drill and kill" page-turner mode was by far the most favored.

My guess, as I went through the program, is that the technical content was intense and difficult to absorb when you had to stop the video and find the correct spot to jump back to. It is much easier to just step back a few "pages" to find what you need. One might argue that technicians just want step-by-step instruction, but in this case the mechanics of managing the video for technical content seemed to be the major factor.

I think it is difficult to separate learning styles from the mechanics of a learning system, and also more expensive to duplicate content. Thus I favor approach "a": including a range of styles that do not duplicate content.

========= Reply =========================
From: DrBobdove DrBobdove@aol.com Date: Sat, 23 May 1998
Thanks for the link to the ALN. What a great site and organization. You are inspiring me to get my own site up and running.

========= Reply =========================
From: campbejo@ctrvax.vanderbilt.edu (Olin Campbell) Date: Mon, 25 May 1998
Thanks, Dr. Bob Dove, for your comment about the ALN Web. The ALN Web is supported by the Sloan Foundation, HP, and Microsoft. Please contribute -- an article for the magazine or journal, comments in ALNTalk, or just a note in the guestbook. We'll all benefit.

========= Reply =========================
From: elhansen@csulb.edu (Rick Hansen) Date: Mon, 25 May 1998
During the spring semester of 1998, I used a class management website, called CourseInfo, as a teaching/learning tool. My experience with it was very positive, and I wanted to share that and also hear from others about their experiences.

The course that I taught using CourseInfo was an MBA capstone course. I teach this course on campus in a traditional classroom. The class meets once a week, on Tuesday evening, for three hours. There is no distance learning component. It is a traditional seminar, with about twenty students in the class. Students typically organize themselves into about five teams. I do not use a textbook. Instead, the reading consists primarily of journal articles and one or two supplementary books. I make extensive use of Harvard Business School cases, both for discussion and for written assignments.

For each class, all five of the student teams prepare to present the discussion case. Then, in class, I select one team at random to actually present the case, and a second team, also at random, to critique the presentation. There follows a general discussion of the case and the assigned reading material. The mid-term exam is a team take-home case. Also, each student writes a four-part paper, called an Individual Project, during the semester. In this paper, they analyze their current work situation and develop a plan for their company and for themselves.

CourseInfo is a class management set of webpages. It appears to work with any browser, including AOL's, although Netscape Communicator and Internet Explorer 4.0 appeared to be the ones most commonly used by my students. The site consists of a number of different online modules, and I used it in a number of different ways to support the student learning experience. The announcement feature on the homepage was a good place for general information and as a backup to general email. Of course, there's the standard page to set up the syllabus and course description. I noticed practical advantages to being able to communicate with individual students, a group of students, or all of my students at any time, without having to wait for the next class meeting.

I had students upload all of the papers that they submitted, and I graded them and sent them back. Because we were not tied to submitting paper, I offered students the option of turning in drafts of their papers early (via CourseInfo) so that I could give them preliminary feedback. For any given assignment, about half of the students took me up on my offer. It definitely improved the quality of their learning experience, judging from the improved quality of the papers. This early feedback also gave them more control over their grades. For me, grading a paper the second time was much faster than grading one from scratch, so I'm not sure that I put in much, if any, extra time doing this for my students.

A related benefit was my ability to manage the students' workload. For example, this class met once a week, on Tuesday nights. But because students were submitting papers by uploading them rather than turning in hardcopy, I wasn't limited to having Tuesday night due dates. Students had the ability to submit papers at any time. In practice, they frequently opted to have papers due on Sunday evenings or first thing Monday morning.

CourseInfo's communication module was probably the most-used feature. Each team had a homepage, and they used its chatroom feature to hold online team meetings to work on their case preparations. This was pretty important, since students were all commuting, and arranging off-campus meeting sites was generally inconvenient.

The discussion board was the most active single feature, so far as I could tell. Because there were twenty students in the class, classroom discussions generally left several people out… the shy ones. Initially, I decided to remedy this by using the discussion board as an extension of the classroom discussion. Thus students who were not as extroverted as others were able to participate actively in 'classroom' discussions. To energize the discussion board, I would usually seed it with a few cryptic comments of my own about the next class's reading assignment and what that might have to do with the assigned case. This made the discussion very relevant for the upcoming case presentations. As you can imagine, the discussion board took off. It became the 'hot' spot on CourseInfo. In addition to accomplishing its intended purpose of including shy students in the discussion, it emerged as an ongoing daily dialogue centered around course topics. By the third week of the semester, the quality of the written and oral case presentations shot up, and it stayed higher than I've ever seen it for the next twelve weeks. This, in turn, energized the in-class discussions.

The grading module allowed students to use their password to access their grades at any time. This had a number of advantages over other approaches. First, it kept the students much more informed about where they stood with regard to their grades. This saved me time answering individual questions. Again, it further reduced the amount of paper with which I had to deal. Instead of walking into class carrying thick folders full of student papers and grading materials, I would just bring the night's discussion case… nothing more.

Putting this all together, the total impact of using CourseInfo was greater than the sum of its parts. At least for this class, CourseInfo radically changed the nature of the learning experience. Students were used to having a one-night a week class 'meeting,' with maybe one outside team meeting every two weeks to prepare assignments. The addition of CourseInfo metamorphosed this more traditional experience into an ongoing, interactive community of learners… literally active on a daily basis.

I personally had more fun teaching this course than I've had in a long time. Comments, thoughts?

========= Reply =========================
From: "J. Olin Campbell" campbejo@ctrvax.Vanderbilt.Edu Date: Wed, 27 May 1998
A strong testimonial and presentation of your methods and learner actions. Thanks for sharing your experiences. Any sense of amount of learner hours spent on the capstone course, in relation to hours spent in the same class in prior years or learners' other classes?

========= Reply =========================
From: elhansen@csulb.edu (Rick Hansen) Date: Thu, 28 May 1998
Olin, in response to your question, I think the average student spent about 2 hours outside of class for every hour in class. That's probably a little more, but not much more, than for the other section of this class, which was taught by another instructor in the more traditional once-a-week seminar format. But I'm also pretty sure the additional time has more to do with me and the way I design a course than with the format. I almost always get hammered by my students in the part of the course evaluation that asks about workload.

========= Reply =========================
From: barnes@eac.cc.az.us (Cindy Barnes) Date: Wed, 3 Jun 1998
I am wondering if a student can receive a Pell Grant in this type of learning situation. Thank you in advance for any information.

========= Reply =========================
From: "J. Olin Campbell" campbejo@ctrvax.Vanderbilt.Edu Date: Wed, 03 Jun 1998
Pick an institution and ask if you can get a Pell grant to study there. There's nothing I am aware of specifically about ALN and Pell grants -- an institution that offers an accredited program for which Pell grants are appropriate can offer it in a variety of formats, including ALN. However, there is much reluctance among faculty and some administrators about "selling our good name" on courses they may view as inferior. The central issue is how well people learn.

We and many others are investigating whether there is anything about ALN that makes it better or worse than a classroom learning environment. Early data suggests equivalent performance, higher dropouts, and more interaction between learners and faculty using ALN. I believe one solution to this problem is to focus on assessments of outcomes (e.g., learners' knowledge, skill, and attitude), rather than the inputs (e.g., peer reviews of faculty, facilities, and curriculum for accreditation).

Certification that focuses on what learners can do is becoming standard for information technology (IT) professionals. How you develop yourself is up to you. You can pay for private tutorials, read a book, take a class . . .

With certifications we need not fear new approaches to learning. There is competition among providers to deliver the most learning for the money, and learners can choose a method and provider that suits them. In such an environment, Pell-type grants might be awarded backward: as you are certified in each small area, you get paid, regardless of what approach you or a provider used. If you can learn on your own instead of paying a provider, you pocket the money. If you need just a little help, you split it with a provider.

A problem with this approach is that learners need a way to make intelligent choices about providers. They need a consumer guide that rates providers along several dimensions (e.g., how many of their graduates pass the certification exam on the first or second try, how expensive each provider is, how much tutorial support they provide).

There is much to learning and personal development that is difficult to assess directly. Part of education is confronting new ideas, developing friendships and contacts, examining yourself and the world around you. Some providers, such as small colleges, might choose to focus on these areas, and import courses developed by world experts from other organizations.

========= Reply =========================
From: gdnag@nitpu.wb.nic.in (gurudas nag), Date: Fri, 4 Dec 1998
very good

========= Reply =========================
From: battige@juno.com  (S. Battige) Date: Mon, 15 Feb 1999
Do you believe that distance learning models would be effective for K-6? What enhancements or modifications would you perceive as necessary?

========= Reply =========================
From: "J. Olin Campbell" olin.campbell@vanderbilt.edu Date: Mon, 15 Feb 1999
Regarding distance learning for K-6: yes, in some instances, with support from parents or teachers, distance learning makes a lot of sense. For example, at Stanford we ran a K-3 reading curriculum at a distance that doubled the learning rate of the students. Some students liked it so much that they beat on the door to get into the learning room at their school during their lunch hour (and while the aide was trying to eat her own lunch). Young learners need supervision and basic support that can be provided by adults who are not experts but learn themselves while they aid the younger students. Moms and their children can learn to read together in this way. This multiplies the number of students one aide can support, and multiplies the number of aides who can, in a natural way, learn for themselves.

