Can Educational Research Inform Educational Practice?
pp. 86-95
In this chapter, Elliot Eisner writes about the disconnect between educational researchers and educational practitioners. Although he is a former head of the American Educational Research Association and a professor at Stanford, Eisner feels that researchers aren’t really taken seriously by those working in the field.
Duh? In what sand dune have these researchers been hiding their heads? As an ESL teacher, I would have to agree with Eisner. Although we may read some of the research coming out in our various educational fields, we don’t really use it to inform our practice. Instead, we, as educators, do what we must in order to meet our objectives (if, as Eisner says, we even bother to create them) and to deliver the content of our programs. According to Eisner’s research, a “typical” response is that research findings function in the background as a frame of reference: “After all, research in education does not provide the kind of prescriptions that are employed, say, in medical practice as a result of research in medicine. The use of research in education is more heuristic; it provides a framework that we can use to make decisions, not a set of rules to be followed slavishly.”
Eisner tries to define this disconnect by breaking it down into two specific problems. The first concerns the use of educational research as a framework to inform practice. Basically, he states that being aware of the research does not necessarily mean the practitioner is going to make it part of the practice. Whether the educator actually uses the results of the research to aid in the design of the curriculum remains to be seen. How the research is interpreted by the teacher and used within the system is an individual choice, and its expression is portrayed in different ways.
The second problem is whether the research actually improves practice; instead, it tends to be more of an influence over how the curriculum is delivered. Eisner does allow that “some educational practices have changed as a result of educational research” (p 89).
What is often forgotten is the learner. How has the image of the learner changed? In what ways is that image different from the one held in the past? Eisner suggests that research must have had an influence on the way we envision the learner, but then goes on to say that research usually follows changes within the practice. First comes change, and then comes the research into the effects of those changes. This is reminiscent of a dog chasing its tail. Educational research and educational practice are caught in this inextricable dance, going round and round, a Viennese waltz of sorts. Even when the dancing stops, the choreographer continues to create new steps (okay, so I’m watching So You Think You Can Dance Canada as I write this). My point, and Eisner’s, is that research and practice really haven’t caught up with each other.
To conclude, Eisner states that for research to truly inform practice, a language must be allowed to develop that can accurately express the research and make the connection to practice. Research must become intimate with those who are in the field and then speak to all aspects of the practice, not just the curriculum, the image of the student, or the subject matter. Research must be all-inclusive and be more than a make-work project for researchers seeking to maintain or achieve tenure. Eisner doesn’t write to be critical but rather to be constructive in his observations and to express an optimism and hope for the future of educational research. “What is pessimistic is a failure or unwillingness to recognize our condition—to look at our professional world through glasses that allow us to see only what we wish. That would be pessimistic.” (p 94)
5 comments:
Nice summary, Vincent.
I would have to agree with you that there is so often an obstructive disconnect between theory/research and practice. Ultimately, as we discussed in class Oct. 6th, the ubiquitous tension in education between “ivory tower theories” and “in the field practices” is really a false dichotomy. There is no practice uninformed by one set of theoretical suppositions or another.
I recently learned more about the tension between theory and practice via an online resource from Ottawa U’s Brad Cousins and his work on program evaluation. Cousins pointed out that, in the past, secondary school districts and administrators were often hesitant to embrace program evaluations performed by outside experts. Reports might or might not have been read before “being put on a shelf,” he lamented. An ironic reluctance, given that schools certainly have no qualms about evaluating their students! To overcome this reticence, Cousins and his team adopted a kind of interactive, collaborative program evaluation model called PARTICIPATORY EVALUATION. Here, the evaluative team works in partnership with the teachers, managers, etc. who are actually delivering the programs to help verify their efficacy. I think this is an excellent model. It provides a sense of ownership and credibility to all the stakeholders involved. Teachers and administrators become engaged in a process in which they and their students have a vital stake; program evaluators minimize the chance of their work being politely ignored. As Cousins said, this effectively lessens the “ivory tower syndrome” that outside academics often encounter when trying to introduce new ideas into daily practice.
Cousins and his research team employed participatory evaluation in a Manitoba case study. Some 40 high schools, which had developed new and innovative programs to engage at-risk students, were seeking evaluative measures to determine the success of their programs. Cousins’ methodology was a hybrid of quantitative (questionnaire) data collection and qualitative research. The top four schools out of the 40 were then researched in depth to determine what made them successful: What was their dedication to program evaluation? What were their capacities for program evaluation? What measures were in place to effect organizational learning? Lastly, did the school demonstrate a corporate culture of openness and readiness to change, and if so, how? I daresay this final question was pivotal, since the answers to the other inquiries flow from it.
Ultimately, arguably the only constant in life is change. School districts, departments, ministries of education etc. unduly wed to past strategies - impervious to the gaze of constructive, critical research review - are hardly doing their students any favours.
Norm and Vincent,
Why is there such distrust between the ivory towers and the ground-level workers in education? Shouldn't they all be there for the students and the programs? It is frightening and sad that administrators don't want to see evaluations of themselves. Doesn't that make them less accountable to everyone? As you say, Norm, there must be credibility to the program; otherwise, how can taxpayers, employees, and the students themselves be seen as having any worth or value? If teachers or the school "vibe" are laissez-faire or disinterested, how can they possibly hope to engage students?
Hi Norm and Jacquie…
I don’t think the dichotomy that exists is as much a fear of being evaluated as it is a fear of the “imposter syndrome”, of the education system being found out. What I mean is this: there is a huge failure within the system, and it is affecting students tremendously, so much so that even parents’ and society’s expectations are changing. The educational yardstick of learning used in the past has changed, and what students are expected to know is no longer held to rigorous standards of accountability. Today, education is all about empowering the student, raising their self-esteem (I AM GREAT!), teaching them social skills (whatever happened to Miss Betty’s school of etiquette?), and, generally, just passing them through.
Well, if this is what the education system (the Ontario curriculum) deems important, why wouldn’t they be fearful of accountability? In most instances, education in this province and country is a sham. I don’t think of it as education but rather as a free ride to a diploma. B. Cousins may be working on evaluations of curriculum, but are the primary stakeholders going to do anything useful with the results? That remains to be seen. In the meantime, the gap between research and practice persists. How do we minimize it? We can’t very well read every bit of research that comes out, but perhaps we can at least stay informed in our particular areas of interest. Perhaps by staying in touch, we will eventually use what we read to inform our practice in ways that will benefit our students.
Vincent,
Doesn't, or shouldn't, that research be trickling down from the administrators or during in-services? In nursing, there are dedicated nurses who do research, and those findings are disseminated to all nurses as Best Practice Guidelines. These are reviewed every few years, and then everyone (hospitals, nursing homes, etc.) adopts the new research. It is the only way to stay accountable, up to date, and responsible to the stakeholders, which, as in education, is society at large.
Jacquie...
When it comes to public health care, it is my opinion that the nursing industry has to be more accountable than those in the teaching profession. This high level of accountability is one of the things that helps keep nurses up to date with the latest research and industry news. Unlike teachers, nurses actually use the research that is available to them and should be viewed as something of a leader in using research to inform practice. The question then becomes: how can this be translated so that teachers will follow the same example?