Monday, February 16, 2009

Mark's thoughts on Milrad et al. 2002

Milrad et al. clearly lay out their goal in this paper: to show that "...technology can be effectively used in distributed learning environments to support learning in and about complex systems....To achieve this goal, learning theory (socio-constructivism), methodology (system dynamics) and technology (collaborative tele-learning) should be suitably integrated (Spector & Anderson, 2000). We call this integration Model Facilitated Learning (MFL) (Spector & Davidsen, 2000)." (from pages 2-3). Apparently this has all been published elsewhere (citations above); this paper just gives a somewhat more concrete explanation of MFL with specific examples.

Milrad et al. talk quite a bit about all the other theories/methods of the past, and how they've integrated them into MFL: situated/problem-based learning, cognitive flexibility theory (CFT), and instructional design methods per "elaboration theory" and "cognitive apprenticeships." HUH??!!! Luckily, page 4 and beyond gives some background on these. "Situated learning" says that learning "occurs in the context of activities that typically involve a problem, others, and a culture." SO, MFL applies technology to CFT, allowing collaboration in context-dependent situations, where the learning objectives are first concretely shown, then increasing complexity is added and inquiries are collected/solved to allow the learner to construct a model of the concept. In other words, they do a "coupling of system dynamics with collaborative and distributed technologies."

MFL is further boasted about (p6) because it suggests a sequence of learner challenges, from 1) challenging learners to standardize the behavior of a complex system, up to 6) challenging them to diversify and generalize to new problem situations. And, just as Deniz mentioned last week, MFL "advocates learning WITH models...to introduce learners to a new domain...and to promote learning simpler procedures" (using causal loop diagrams, for instance). Then, more advanced learners transition from learning with models to LEARNING BY MODELING (p9-10). To do this (still with MFL), the learners 1) must realize there is a system behavior occurring (underlying connections happening); and 2) use graduated complexity (let learners fill in missing info on a partial model, have them construct a simple model, then a complex model (or link simple models), then have them reach a goal/conclusion through from-scratch modeling).
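To make the "learning by modeling" idea a bit more concrete, here is a minimal sketch (my own, not from the paper) of the kind of first simple system dynamics model a learner might build: a single stock with one inflow and one outflow, stepped forward in Python. The acid-load scenario, function name, and all numbers are made up for illustration.

```python
# Hypothetical stock-and-flow model: a lake's acid load as one stock,
# with constant acid deposition as the inflow and proportional natural
# buffering as the outflow. Simple Euler time-stepping.

def simulate_acid_load(deposition_rate, buffering_fraction, steps, initial_load=0.0):
    """Step the stock forward: load(t+1) = load(t) + inflow - outflow."""
    load = initial_load
    history = [load]
    for _ in range(steps):
        inflow = deposition_rate             # acid added each step
        outflow = buffering_fraction * load  # fraction neutralized each step
        load = load + inflow - outflow
        history.append(load)
    return history

# The stock settles toward deposition_rate / buffering_fraction.
trajectory = simulate_acid_load(deposition_rate=10.0, buffering_fraction=0.2, steps=100)
print(round(trajectory[-1], 2))
```

Even a toy model like this shows the graduated-complexity idea: a learner could first be given the loop with `outflow` missing, then asked to add it, then asked to link two such stocks (e.g., lake and soil) before modeling from scratch.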

It is nice that on p.11 a concrete EXAMPLE of MFL, using problem orientation, inquiry experimentation, and policy development in regard to acid rain/water quality, is shown. I wish there were a bit more detail, though (especially since it is in my area of "ecology!"). They argue that using MFL (structured, building, collaborative model-based learning), they meet all the "requirements" needed to allow learner growth. BUT, it doesn't appear this was ever tested in this paper (e.g., a lecture on the same material, or non-collaborative simulations VS. MFL, to compare learner outcomes). Perhaps this next 2008 paper will show some!

3 comments:

Victor said...

I appreciate Milrad and his colleagues putting together this instructional approach. Actually, our study in Shawnee also follows a similar approach.

However, I think Mark is right that we need some empirical evidence to see the effect of MFL. One major question we may need to ask is the assessment issue. There are three components in the framework: problem orientation, inquiry-exploration, and policy development. How should we evaluate them?

MWalvoord said...

I definitely want to hear more about your Shawnee study, which you have previously mentioned. (Partly because I got my undergraduate degree at OBU in Shawnee!) The whole area of assessment for any "learning by modeling" confuses me, and I look forward to seeing some examples/frameworks. In the MFL setup, it seems Problem Orientation would be easiest to assess (factual questions or ranking questions to see if they understand the problems). Inquiry is more difficult, because there are any number of questions/inquiries that could be asked to lead to an "experiment" or model-building. Policy development might be moderate in ease, IF new scenarios could be postulated/plugged in to students' models to see if they "work" (produce a solution that makes sense). Still, these latter two could be painful if the class was larger than 50-60 students (if assessing/grading by hand)...

Deniz said...

Mark, I am really enjoying your comments on the articles. It seems that you are evolving in your comprehension of the issues and reflecting appropriately on how to utilize them in science despite the lack of theoretical background. I will provide an overview, in class, of different theoretical perspectives and how they are related to MBL. Meanwhile, I put a link to the TIP database in D2L (same place as the readings), which provides a nice overview of different theories.