Sunday, August 22, 2010

Taster Exercise

Cisco Networking Case Study
Alex and I have put our thoughts together in response to the 'taster' exercise, and our findings are below.

What are the key issues?
1. There doesn’t appear to be an equal blend of online and face-to-face instruction. It is very heavily computer-based training with limited face-to-face input or feedback from a facilitator. ‘Lectures’ implies this is a large group activity – where is the opportunity for the student to connect directly with the facilitator?

2. Access to computer labs is limited to when rooms are free, and the assessment structure, involving on-campus tests and examinations, may not suit the availability of all students.

3. Feedback is computer-based rather than coming from a facilitator; providing feedback could be an opportunity for the facilitator to work directly with their students.

How can the problem be solved? What strategies can be used?
There doesn’t seem to be any mention of a class discussion board or common workspace. Both of us feel that the best way to communicate with other students and the facilitator is through a discussion forum, allowing students to clarify their understanding and discuss ideas, and allowing the facilitator to check progress. In a course with a large group of students, discussion forums and encouraging peer support are great tools for motivating learners.
We also feel there is potential for activities and assessment tasks to be restructured into group activities where the facilitator could have more ‘face-time’ with the learners.

How can the evaluation process assist in finding a solution? What type of evaluation is appropriate in this situation?
Evaluation can assist by actually finding out what would work for the learners in this situation. What do they like about how the course is structured? What do they find difficult about it? How might they change it themselves? What tools do they find really useful?

A feedback form or online survey where learners are encouraged to share their ideas in words, rather than via rating scales or multiple-choice questions, would be appropriate. Lisa suggested Survey Monkey as a great tool for teachers that enables users to create their own web-based surveys (some cost is involved for institutions). Another valuable tool may be for the facilitator to reflect on the programme they delivered in a structured manner. A pre- and post-evaluation of what they felt would be successful vs. what was actually successful might be quite revealing.


  1. Lisa, you and Alex have identified some important factors from an educational perspective. It is interesting that you have both picked up on the need for online communication to supplement the f2f lectures and the online and computer-based activities.

    I agree that the evaluation can be used to find out what "would work" as well as what is working. Some good suggestions - did you get a chance to see what the evaluator discovered in his report in the Exemplars section?

  2. I always find it a bit challenging to group or theme comments from learners into meaningful actions after post-learning evaluations. Do you have any tricks you can share with me?