Ioannis Karaliotas
H801 Examinable Component
October 2000

Investigation of Learner Performed Assessment of Learning Processes in 3rd Generation ODE:
A Pilot Study

Table of Contents
  • Context
  • Aims & Objectives
  • Rationale
  • Specific Research Questions
  • Research Approach and Strategy
  • Methodology, Instruments & Timetable
  • Dissemination of Findings
  • Project Timetable
  • Resources needed
  • Role in the Research
  • References
  • Appendix


    Context   [ TOC ]

    At the dawn of the 21st century, and well into the era of almost infinite, freely accessible information resources, assessment of possession of information rather than of mastery of ongoing processes appears to be the norm in educational settings ( Cramer, 1994 ). Current testing practices neglect the assessment of more complex thinking processes.

    "Many of the tests we use are unable to measure what should be the hallmark of a thinking curriculum: the cultivation of students' ability to apply skills and knowledge to real-world problems. Testing practices may, in fact, interfere with the kind of higher-order skills that are desired" ( Resnick, 1987 ). Perhaps more importantly, assessment of learning in education usually reflects rigidly unchanged power relations which still put the knowledge norms of teacher and institution at centre stage ( Lemke & Dressner, 1995 ). Consequently, this power differentiation diminishes the possibility of a 'learning relationship' between teachers and students, and of the much desired 'learning organisation' in which metaphors are developed and reappear throughout the elaboration of concepts; opportunities for passion and exchange seldom occur.

    Current assessment practices dominating education at large, which seem to perpetuate 'possession of information' trends and teacher/institution domination by imposing ever more rigorous testing regimes (e.g. national standards testing schemes in the USA, exam-centred curricula in Europe), raise a number of questions which may serve as stimuli for relevant investigations:

  • Is recall the most important thought process, more important than the ability to classify, hypothesise, generalise, value, synthesise, or engage in any other thought process?
  • Is there no difference in the human brain’s ability to handle disorganised, random information and organised information?
  • Are extrinsic rewards for better academic performance more important than intrinsic rewards?
  • Are present test instruments sufficiently sophisticated to measure the qualities of intellect and character we seek as outcomes of education?
  • Are objective criteria for determining what should be taught necessary? Should whatever “educated” people happen to know determine the content of instruction?
  • Don’t learners need to understand how they have been programmed by culture and experience to select, organize, integrate and relate knowledge?
  • Could one set of standards be equally appropriate for all, as if learners did not differ significantly in cultural, societal, or psychological respect?
  • Will this generation’s answers speak adequately to the next generation’s questions?
  • Are educational problems attacked at their roots by unleashing market forces (competition, merit pay, ranking, etc.) in education?
    (adapted from Brady, 2000 )

    Aims and Objectives   [ TOC ]

    With the above in mind, and with the view that open and distance learning, a potentially innovative educational practice, may serve as experimental ground for educational change ( McClintock, 1992; Thorpe, 1995; Mason, 1998 ), I am proposing research in the form of a pilot study which would investigate learners’ assessment of the process of their learning in 3rd generation ODE.

    The aim is to examine the impact of learner-performed informal assessment on generating and supporting deeper learning. It is envisaged that this pilot study may result in further investigations into how such processes can best be facilitated in ODE environments, contributing to the design of improved online course structures for dialectic learning. To reach this aim, the following objectives will be sought:

  • location of attributes and patterns of Learner Performed Assessment (LPA) occurring informally in collaborative learning settings
  • identification of correlations between Higher Order Thinking (HOT) and LPA in context
  • verification of the observed functionality of LPA against learners’ experiences

    The pilot study is proposed to be conducted with final year students participating in the MA in ODE programme offered by the Institute of Educational Technology (IET) of the Open University, UK.

    Rationale   [ TOC ]

    If the purpose of learning is understanding, the process of assessment should be more than mere evaluation: it ought to be a substantive contribution to learning. Contemporary literature widely acknowledges that, in our changing times, disciplines re-examine the very nature of their content, and that expanded understandings of learning accompany these shifting perspectives. Learning comes to be viewed as an active process wherein individuals construct meaning based on prior experiences. These changes dictate teaching/learning and assessment practices that reflect the dynamic nature of knowledge. The need for assessment that challenges learners to solve problems, make connections, explore assumptions, elaborate, and apply nonalgorithmic (higher order) thinking has been emphasised in the literature ( Mokhtari et al, 1996; Wiggins, 1989 ).

    As Robin Mason argues in her article Models of Online Courses :

    "Current assessment procedures in higher education are long overdue for a rethink. They are particularly ill suited to the digital age in which using information is more important than remembering it, and where reusing material should be viewed as a skill to be encouraged, not as academic plagiarism to be despised. [.....] There are certainly educational benefits to be had from a re-thinking of assessment where online access is possible. We would welcome opportunities in the faculties to move towards more integrated, more learner-oriented and more collaborative tmas [tutor marked assignments] and even exams!" ( Mason, 1998 ).

    For many years, the main goal of higher education was to make students knowledgeable within a certain domain; building a basic knowledge store was the core issue. It is now clear that the main goal of higher education is shifting towards supporting students to develop into 'reflective practitioners' who are able to reflect critically upon their own professional practice ( Schon, 1987; Falchikov & Boud, 1989; Thorpe, 1995; Kwan & Leung, 1996 ).

    In this spirit, the value of alternative, innovative, non-traditional assessment, which is authentic and focuses on process as well as product, in improving the teaching and learning process has been evinced in the literature. In terms of stakeholders’ attitudes, the study of the effect of innovative assessment practices on student learning in the Impact of Assessment project at Northumbria reports enthusiasm amongst both staff and students for assessments which aimed to allow more flexibility or autonomy, better feedback, or a linking of academic knowledge with real-world experience ( McDowell, 1995; Sambell et al, 1997 ).

    Processes of learner performed assessment (self-, peer and co-assessment) have been investigated over the last decade as potentially effective practices capable of scaffolding deeper/higher-order learning ( Boud et al, 1999; Morgan, 1997; Entwistle and Ramsden, 1983 ). A review of the literature based on the analysis of 63 studies concerning self-, peer or co-assessment of students in higher education concludes the following:

    “Overall, it can be concluded that research reports positive findings concerning the use of self-assessment in educational practice. Students who engage in self-assessment tend to score most highly on tests. Self-assessment, used in most cases to promote the learning of skills and abilities, leads to more reflection on one's own work, a higher standard of outcomes, responsibility for one's own learning and increasing understanding of problem-solving. The accuracy of the self-assessment improves over time. This accuracy is enhanced when teachers give feedback on students' self-assessment.” The same review “suggests that the use of a combination of different new assessment forms encourages students to become more responsible and reflective” ( Dochy et al, 1999 ).

    However, processes of learner performed assessment have apparently been investigated solely from the point of view of teaching, and in relation to teacher/institution performed evaluation and grading of learning outcomes. My intention, although in line with the core guidelines of the research mentioned above, is to investigate assessment processes that do not lead to grading: processes generated by learners under objectives that learners have decided for themselves or negotiated with their teachers, in order to find out how these processes may support higher order learning in context.

    Social interaction for monitoring the learning process, and higher order skills such as creativity, productive thinking, divergent thinking, reflective thinking and critical thinking, are unlikely to occur naturally in settings where others rigidly control the external rewards and manipulate and shape the behaviours of learners. Developing and revealing new ideas is risky, and most high-level thinking involves new ideas; it is not safe to take such risks when someone else controls the rewards.

    Not surprisingly then, collaborative environments of open and distance learning, where a higher degree of negotiation between learners and teachers/institution is facilitated through the adoption of constructivist course structure and networked computer conferencing, seem to favour the occurrence of such learner generated assessment processes ( Appendix 1 & 2 ).

    De Corte (1990) refers to the design of such enabling learning environments. They are characterised by the view that learning means actively constructing knowledge and skills on the basis of prior knowledge, embedded in contexts that are authentic and offer ample opportunities for social interaction. Such learning environments preclude competitiveness and promote learner collaboration. Collaborative learning has been reported to include:

  • the development of the cognitive and metacognitive skills of reasoning and reflection ( Jonassen et al, 1993 )
  • individual learning that occurs as a result of a group process involving "some agreement on common goals and values, and the pooling of individual competencies for the benefit of the group or community as a whole" ( Kaye, 1992 )

    My choice of the subject of the proposed inquiry is based on the evidence presented above and on the experience I have gained as a learner-participant in such settings. The focus of this pilot study, the ‘MA in Open and Distance Education', is a postgraduate degree programme which was presented for the first time in 1997.

    According to Mason (1998) : "the MA[ODE] is founded very squarely in a Western, ‘constructivist learning’ understanding of higher education”. It comprises three courses/modules (H801, H802 & H804) which address the needs of professionals engaged in open and distance learning. The programme provides an innovative environment for conversational, synergetic learning in the context of distance learning ( Pask, 1975; Laurillard, 1993 ), by utilising the new communication technologies of e-mail, computer conferencing and the World Wide Web, in addition to more traditional printed course materials. The students and tutors are all networked, enabling study support and active participation in lively peer exchange which generates its own discourse.

    My participation in the MAODE programme since 1998 - now covering the third and final year (H801/2000) - lends an ethnographic nuance to the proposed inquiry. Further, the selection of students in their final MA year as the sample for the inquiry will enable a critical survey of accumulated experience, and is expected to contribute to the reliability of results.

    Specific Research Questions   [ TOC ]

    Questions to cover:
    1. What are learners’ perceptions of informal self- and peer assessment appearing in online conferences?
    2. When and how do these processes occur, and in which forms do they appear?
    3. What are the perceived benefits of these peer generated processes?
    4. Which skills/abilities are developed through such practice?

    Corresponding objectives:
  • To ascertain the level of recognition and acknowledgement of the investigated processes.
  • To identify patterns and attributes, and possible relations to course structure.
  • To record learner conceptions of and attitudes toward assumed expectations.
  • To identify the level of impact on specific learning domains.

    Research Approach and Strategy   [ TOC ]

    The object of the proposed inquiry, its purpose, and the setting in which it is to be conducted all point to the adoption of a constructivist approach within the naturalistic paradigm ( Lincoln & Guba, 1985; Guba & Lincoln, 1989 ).

    The naturalistic [constructivist] approach to this study accords with the natural setting (the course learning environment) in which it is pursued; the employment of the human instrument (student MA cohort <-> researcher); the qualitative methods deemed appropriate for the inquiry (participant observation, less-structured interviews); the use of tacit knowledge derived from tenure in the setting; and purposive convenience sampling (as proposed earlier) with extreme-case refinement (interviews).

    The intended research strategy is the one that seems to best fit the needs of a case study acting as a pilot for further investigations. The proposed inquiry is essentially descriptive, being of an ethnographic, exploratory rather than explanatory, nature. Observation of both the live online environment and parallel in-house research will keep an open channel for the hermeneutic circle, dealing with the interplay of data collection and analysis. Participant observation could be effectively performed in the role of tutor or contracted researcher.

    A further review of relevant literature is proposed to run in parallel with the rest of the research activities, to fit the needs of the hermeneutic circle.

    Methodology, Instruments & Timetable   [ TOC ]

    For the implementation of the proposed methodology, access to the online MA environment for participant observation needs to be provided.

    The inquiry will follow the running cycle of the MA courses (February to October). Review of literature in the areas of deep learning, HOT, metacognition, social skills for learning, collaborative peer assessment would commence in February and continue through the research period.

    A range of qualitative methods will be employed (Figure 1).

    Figure 1: Methods, Instruments & Timetable

    Ethnographic participant observation of the H80* learning environment will be continuous throughout the project, covering:

  • changes/alternative emerging patterns (both institution and learner initiatives) while courses are running
  • monitoring of relevant in-house research running in parallel

    A questionnaire containing exploratory questions will be administered early in the year, in March (Figure 2).

    Draft Questionnaire :

    1. During your MA studies, how active would you say you have been in exchanging messages with your peers? (electronic conference and/or email)

    (Use of Likert scale: Extremely active – Very active - ... )

    2. How helpful to your learning did you find exchanging messages with your peers, either in electronic conference or by email?

    (Use of Likert scale: Extremely helpful – Very helpful - ... )

    3. Which aspects of studying do you feel you were helped with by peer exchange? (tick as many as apply)

    (Checklist+ Open ended)

  • making sense of course content
  • making sense of course process
  • dealing with technical aspects of computer conferencing
  • dealing with social/cultural aspects of computer conferencing
  • reflecting on personal views
  • monitoring your own learning
  • other (please elaborate in the space below)

    4. Would you regard course-related peer exchanges as a form of informal assessment that helps in monitoring each other’s learning advances?

    Yes   No    Don’t know

    Figure 2: Draft Questionnaire

    The quasi-quantitative analysis of the questionnaire will serve two purposes:

  • to identify areas for observation and to contribute to the construction of the interview agenda
  • to filter the sample down to a minimum selection (10%) for telephone interviews, with selection based on extreme cases (Likert-scale questions)

    An open-ended telephone interview schedule with a minimal agenda based on previous findings will be arranged for August, for the filtered sample (10%). It is expected to help verify observed behaviour against learners’ experiences.
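    The extreme-case filtering step described above can be sketched in code. This is a minimal illustration only, not part of the proposal's instruments: the 1-5 Likert coding, the field names and the selection of equal numbers from both ends of the ranking are assumptions made for the purpose of the example.

```python
# Sketch: select "extreme case" respondents for follow-up interviews.
# Assumes each Likert answer is coded 1 (lowest) to 5 (highest); respondents
# whose mean score falls at either end of the ranking are selected.

def likert_mean(answers):
    """Mean of a respondent's Likert-scale answers (coded 1-5)."""
    return sum(answers) / len(answers)

def select_extreme_cases(respondents, fraction=0.10):
    """Return roughly `fraction` of respondents, drawn from both ends of
    the ranking by mean Likert score (half lowest, half highest)."""
    ranked = sorted(respondents, key=lambda r: likert_mean(r["likert"]))
    k = max(1, round(len(ranked) * fraction / 2))
    return ranked[:k] + ranked[-k:]

# Hypothetical sample: respondent id and coded answers to Q1 and Q2.
sample = [
    {"id": "s01", "likert": [5, 5]},
    {"id": "s02", "likert": [3, 4]},
    {"id": "s03", "likert": [1, 2]},
    {"id": "s04", "likert": [4, 4]},
    {"id": "s05", "likert": [2, 1]},
]

chosen = select_extreme_cases(sample, fraction=0.4)
print([r["id"] for r in chosen])  # -> ['s03', 's01']
```

    In a real cohort the 10% cut-off of the proposal would apply; with only five hypothetical respondents above, a larger fraction is used so that one case from each end of the scale is returned.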

    Dissemination of Findings   [ TOC ]

    If the proposed study is accepted as part of a PhD research programme, there will be an internal report to course teams regarding general findings.

  • Possible recommendations for tutor staff development and course design would be included.
  • A report of the study and its findings may be submitted to electronic Journals or be proposed for online discussion on IFETS or ITForum.
  • Copies of the original report, in English and Greek, will be uploaded to my personal WEB site for public scrutiny.

    Project Timetable   [ TOC ]

    Research planning and preparation:
  • prepare learner questionnaire
  • literature search (continuous throughout)
  • online conference content (OCC) collection and analysis (continuous throughout)
  • disseminate questionnaire (March); collect and analyse responses
  • maintain computer conferencing (CC) contact

    Research analysis and report:
  • correlate and store data from different sources
  • set up learner telephone interview appointments
  • conduct interviews (August); tape transcription and analysis
  • draft summary of results so far
  • write final report

    Resources needed   [ TOC ]

    Excluding staff/contract time for the researcher, resources required will be kept to a minimum, since the research project can be carried out online, with the possible exception of 10 telephone interviews. For learners with PC audio capability, interviews can be held online, bringing the cost even lower. Internal university funds are expected to be available for resources at this level, subject to acceptance by the relevant committee. The researcher’s time can be counted in with part-time tutor fees, where applicable (Table 1).

    Table 1: Researcher’s hours (700 total)

  • 300 hrs (9/week): participant observation - downloading messages (online), analysis of messages (offline)
  • 20 hrs: designing questionnaire and interview questions (includes literature search for existing diagnostic instruments)
  • 10 hrs: 10 interviews by telephone/web
  • 80 hrs: main literature search - analysis techniques; published work on alternative assessment and OD course structure
  • 190 hrs: audio tapes and transcripts, other documents, interpretation of results, addressing aims and objectives
  • 100 hrs: writing up results; final report

    Additional cost: telephone charges (from Greece). The author will provide equipment and consumables.

    Role in the Research   [ TOC ]

    The author will be the sole implementor of the proposed study.

    Duties will involve review of relevant literature, design and preparation of instruments; participant observation through the project; analysis of data (eBBS or First Class conference content); conducting interviews over the phone/Web; writing of results.

    References   [ TOC ]

    Boud, D., Cohen, R. & Sampson, J. (1999). Peer learning and assessment. Assessment & Evaluation in Higher Education, 24(4), p. 413.

    Cramer, S. F. (1994). Assessing Effectiveness in the Collaborative Classroom. In K. Bosworth & S. J. Hamilton (Eds.), Collaborative learning: underlying processes and effective techniques. San Francisco: Jossey-Bass.

    De Corte, E. (1990). Learning with New Information Technologies in Schools: Perspectives from the Psychology of Learning and Instruction. Journal of Computer Assisted Learning, 6, 2, 69-87.

    Dochy, F., Segers, M. & Sluijsmans, D. (1999). The use of self-, peer and co-assessment in higher education: a review. Studies in Higher Education, 24(3), pp. 331-350.

    Falchikov, N. & Boud, D. (1989) Student self-assessment in higher education: a meta-analysis, Review of Educational Research, 59, pp. 395-430.

    Guba, E. & Lincoln, Y. (1989). Fourth Generation Evaluation. London: Sage.

    Hilgard, E. (1956). Theories of learning. Appleton-Century-Croft.

    Jonassen, D, Mayes, T., McAleese, R. (1993): A Manifesto for a Constructivist Approach to Technology in Higher Education. In Duffy et al (eds), Designing Constructivist Learning Environments. Springer-Verlag.

    Kaye, A. (ed) (1992). Collaborative Learning through Computer Conferencing. Springer Verlag

    Kwan, K. & Leung, R. (1996) Tutor versus peer group assessment of student performance in a simulation training exercise, Assessment and Evaluation in Higher Education, 21, pp. 205-214.

    Laurillard, D. (1993). Rethinking University Teaching: a framework for the effective use of educational technology, Routledge: London.

    Lemke, R. & Dressner, R. (1995). Asynchronous Learning Networks. Paper presented to Teaching Strategies for Distance Learning, 11th Annual Conference on Teaching and Learning, Madison, Wisconsin. 117-120.

    Lincoln, Y., & Guba, E. (1985). Naturalistic inquiry. Beverly Hills, CA: Sage.

    Mason, R. (1998). Models of Online Courses. ALN Magazine Volume 2, Issue 2 - October 1998.

    McClintock, R. 1992. Power and Pedagogy: Transforming Education through Information Technology. Cumulative Curriculum Project Publication #2, New York Institute for Learning Technologies.

    McDowell, L. (1995) The impact of innovative assessment on student learning, Innovation in Education and Training International, 32(4), pp. 302-313.

    Mokhtari, K., Yellin, D., Bull, K., & Montgomery, D. (1996). Portfolio assessment in teacher education: Impact on preservice teachers' knowledge and attitudes. Journal of Teacher Education, 47(4), 245-252.

    Pask, G. (1975). Conversation, Cognition, and Learning. New York: Elsevier.

    Resnick, L. (1987). Education and learning to think. Washington, D. C.: National Academy Press.

    Rowntree, D. (1987). Assessing Students: How Shall We Know Them?, Kogan Page, London

    Sambell, K. & McDowell, L. (1998) The value of self and peer assessment to the developing lifelong learner, in: Proceedings 5th Improving Student Learning Symposium, September 1997, pp. 56-66.

    Sambell, K., McDowell, L. & Brown, S. (1997) "But is it fair?": an exploratory study of student perceptions of the consequential validity of assessment, Studies in Educational Evaluation, 23(4), pp. 349-371.

    Schön, D. A., Sanyal, B. & Mitchell, W. J. (Eds.) (1999). High Technology and Low-income Communities: prospects for the positive use of advanced information technology. London: MIT Press.

    Schon, D.A. (1987). Educating the Reflective Practitioner: towards a new design for teaching and learning in the professions (San Francisco, CA, Jossey-Bass).

    Thorpe, M. (1995). Reflective Learning in Distance Education. European Journal of Psychology of Education, Vol. X(2), 153-167.

    Wiggins, G. (1989). A true test: Toward more authentic and equitable assessment. Phi Delta Kappan, 70(9), 703-713

    Appendix   [ TOC ]

    1. A collection of peer messages exchanged online in the context of H801/2000.

    Digest of All Messages Below Message # 633

    Msg #633 of 759 posted 9/20/00 by arturo-e

    Structuring and Dr. Frankenstein

    Dear Jon and Gordon,

    I have noticed that structuring in course design tends to be associated with behaviourism. From comments made to my outline in the refining proposal area, I noticed that just the fact I mentioned I liked to determine what degree of structure is adequate was a sign that I was on the same team with Dr. Frankenstein and Dr. Skinner.

    Were not structure and dialogue qualitative variables of distance under Moore's independent study model? Of course weak-behaviourism uses structure but it also uses dialogue. Constructivism uses structure as well.

    Have a nice day.

    Msg #638 of 759 posted 9/20/00 by yannis-k

    Structure for learning

    Dear Arturo,

    I sympathise with you wholeheartedly. My only concern is that we ought to be aware of qualitatively different structures: structure that facilitates teaching as opposed to structure which enables learning.

    Such a distinction may even help us go beyond the notion of transactional distance, where structure is not necessarily the opposite end of the continuum (structure - dialogue) but stands as a complement to the dialectic process.

    Although phenomenography touches on this issue, attributes are still to be defined - my view is that playful synergetic activities which generate and scaffold dialectic exchanges might be the key to such structure for learning.

    Msg #647 of 759 posted 9/20/00 by arturo-e

    Baseline structure, teaching and learning structures

    Dear Yannis,

    Thank you for replying to my message in a bottle. What you say is absolutely fascinating. See, I am trying to make a difference in my research design between distance (Moore) and its qualitative variables structure and dialogue, and instructional design and delivery methods. However, I feel pretty much alone.

    If you could elaborate more on what you say it would be very helpful. It will also help me if you can point out some references.

    I do agree with you that a learning structure such as the one you envision is separate from or is beyond the transactional distance model.

    You say that phenomenography touches this issue. I will explore it. A new universe of knowledge has been opened up for me. Thank you again.


    Msg #649 of 759 posted 9/20/00 by gordeon-mca

    Redundant comment - but maybe not

    I note the full reply given by Yannis. My own comment lies in the possibility that "structure" as referred to by Moore does not equate in meaning to the use of the term by behaviourists, but it's a long way back in H801 so don't take my word for it.
    Phenomenography, eh? Good luck!

    Msg #657 of 759 posted 9/21/00 by yannis-k

    Structure for Learning -- attempt to elaborate

    Dear Arturo,

    Thanks for your thoughtful response encouraging me to elaborate on initial thoughts. I must say that you put me in a difficult spot here, having to defend a, largely, personal view with little evidence as yet - mostly a gut feeling. So, my attempt to elaborate will involve the posing of questions rather than the quoting of affirmable, well supported statements.

    Structure for learning is the central object of inquiry in my own research proposal "to investigate the role of assessment procedures in assessing the process of learning in 3rd generation ODL". I find that looking at structure from the assessment perspective offers a better view of the whole teaching/learning system in context and facilitates understanding of inter-relations between its variables.

    Relevant questions posed:
    - What is the purpose of the assessment system in context?
    - What are the differences between process assessment and the assessing (testing) of outcomes?
    - If process assessment procedures are used as means of scaffolding understanding as opposed to testing for grades, what are the skills and abilities (domain-independent cognitive, metacognitive, social) that need to be assessed in evaluating process so that they can be enabled/developed?
    - Can these be learnt/developed, or are they just acquired and, therefore, there is no reason for them to be taught (there is evidence that the latter does not stand - msg #252,105 )?
    - How can they be learnt/taught?

    Now, it's my feeling that the last two questions are central to the 'structure for learning' issue. 'Naturalistic' research, and particularly phenomenography (mostly of Scandinavian & Australian origin), with 'its' notion of deep learning, can help because it examines stakeholders' experience and, therefore, makes more salient certain variations which otherwise might not have been noticed.

    I have had relevant references included in: (phenomenography) (naturalistic)

    There is also extensive reference to the philosophical model which, in my view, can be utilised in designing structure for learning (H804 project report):

    Latest in Phenomenography: (The New Phenomenography) (Phenomenography Crossroads)

    The work of our late Alistair Morgan -
    and the work of OU, UK on Mathetics:

    Msg #698 of 759 posted 9/24/00 by arturo-e

    Impressed


    Thank you very much Yannis for a wonderful insight into structuring.

    I am afraid your questions are too much for me at this stage of my studies. Can metacognition be taught? What a question! What structures support metacognition? In what specific context? Too many variables for me.

    In teaching languages I am convinced that more structure is somehow possible and better, so I believe that there is another dimension here: what is being taught. What is being taught also tends to impose its own dynamic. The problem is how we can overcome this issue and teach languages or humanistic subjects for anybody, regardless of the current perpetuation of teaching models.

    Thank you for the references. I decided not to include the issue of structuring in my study for the moment, because just defining it would take a research project of its own. It would be very valuable but not practical for my evaluation at this time. So once I get this EC out of my mind I will start investigating these issues.






    2. An outline of part of the message thread, including the messages shown above.

    Outline of replies to message #516

    #516 Group B: Course Design issues Sunday, September 10, 2000, b.e.crooks


    [ TOP ]

    Ioannis Karaliotas © 2000