* (1500)
The Acting Chairperson (Mrs. Shirley Render): Would the Committee of Supply come to order, please. This section of the Committee of Supply has been dealing with the Estimates of the Department of Education and Training.
Would the minister's staff please enter the Chamber. We are on Resolution 16.2. School Programs (c) Assessment and Evaluation (1) Salaries and Employee Benefits $2,988,400. Shall the item pass?
Ms. Jean Friesen (Wolseley): Madam Chair, I understand that we are discussing School Programs, 16.2(c). The minister's staff are here, and I wanted to ask some questions about a number of issues here. I think the one that perhaps the minister has received a great deal of correspondence on deals with assessment and deals, particularly, with the Grade 3 assessment.
The people in Alternative Education in Winnipeg, in particular, but not exclusively in Winnipeg, I believe, have written a number of letters. I think I have had copies of close to 50 of them that have gone to the minister to protest the establishment of the Grade 3 math exam. The concerns of these parents I have raised in Question Period. I have also raised them with the minister before at committee last year, and the parents themselves also made presentations on this.
In my concerns, what I have tried to emphasize to the minister is that the Alternative Education program is one, I think, we should all be proud of. This is a program where, as I have often said, parents line up at three and four in the morning in order to ensure that their children receive a place. This is the program where parents play a vital and extensive role in the classroom.
This is a program with a particular philosophy, where parents play a part not just as part of a school advisory committee, not just as in partnership with the teacher over matters concerning their own child, but where they are concerned about the development of co-operative learning and their concern is for the whole class as well as for their own child. They believe strongly in assessment. They are very satisfied with the kind of assessment which they have received in the past. They are very satisfied, indeed enthusiastic is the word they use frequently, about the kind of assessment and partnership that they have with their child's teacher. They believe that Winnipeg No. 1, in particular, has given very strong support to the Alternative Education program, and indeed that is true.
It is an education program which depends upon co-operative learning, which does not depend upon the kind of encouragement of standings of students, first in the class to the bottom in the class, which does not believe in drawing the lines of "pass and fail" in the way that one test on one day does. But they believe very strongly in assessment and in continuous assessment. They are enthusiastic about the kind of assessment they have received in the past. They have registered in a particular program, offered by their school division, which offers and sustains a particular kind of philosophy. That philosophy is shared by a thousand families across the province. About 12 schools are involved in this. It is a relatively small program enthusiastically supported by parents actively involved in the classroom.
When I asked the minister about this in Question Period, my question was one which, in a way, was stimulated by one of her predecessors, Clayton Manness. Manness always said you listen to parents. The minister says the same thing. Every Minister of Education in this particular Tory government wants to make that same kind of claim. They want to choose, it seems to me, the parents to whom they will listen. The minister claims that parents have written to her supporting these Grade 3 math exams, and I do not doubt that some have. Here we have parents representing over a thousand families who have said, no, this is not for my child, this is not the kind of education that I subscribe to.
I ask the minister particularly to understand that our enthusiastic support for the teachers, for the other parents and for the children in these classrooms is something which we hold very dear. We see the Grade 3 math exam as something which is counterproductive, is not conducive to the kind of learning which we want to encourage in the classroom.
They are not saying they want to have their philosophy imposed upon others, but they are saying we chose this, we chose a program offered by a regular school division, and we chose it for particular reasons, and we now find that the minister is imposing one particular educational philosophy on a classroom which is not set up for that.
They do not want other parents necessarily to follow their particular philosophy, but they do want the government to do as it has said it is doing, and that is listening to parents, but basically what the government is doing is choosing which parents it will listen to. It is applying a very rigid philosophy of standard tests right across the province in all four core areas every year. It is going to be a $15-million project annually by the time the government has everything in place. The parents in this particular program are very concerned about that cost, but their most important point is that this is not the education for which they signed up. This is not the philosophy they want in their Grade 3 classroom.
* (1510)
Now, where are the letters that the minister has in support of the Grade 3 testing? As I said, I do not doubt that she has some, but what she also has is many letters saying, this is not for us, and asking, essentially, to be excluded from the Grade 3 test, not to be asked to withhold their children from school. Many of them, I think, in their letters to the minister have said, look, this is where this may be leading, and we do not want to do that; we want to have the kind of education system which our school division has offered, for which we have signed up, which we fully support; we want the support of the minister. That is what they are asking for.
It is always interesting that Clayton Manness would ask, well, who has the more important role, the state or the family? I must admit it was a question which often took me a minute to consider. It was very interesting to see his perspective on that. Sometimes he would use it to support issues of home education, but he also, I think, did respect the role of the family. That is the opposite of what the minister is doing here, it seems to me; she is saying that the state is more important than the family.
What these parents are requesting is not something which is going to negate the whole process of testing at the Grade 3 level. The absence of 500 or 600 children is not going to negate the statistical viability of the tests, because after all we know that the minister has said that 20 percent absence from tests, such as we had in the snowstorm this year and the snowstorm last year, does not negate the statistical viability of these tests. So it is obviously not the numbers which concern the minister. It is, it seems to me and, I think, to these parents, an issue where the government is saying, (a) we are going to listen to some parents and not others; and, (b) the state knows better.
I want to ask the minister, if she could--I have, I think, tried to summarize the concerns of the parents both on a philosophical basis as well as on the issue of cost. I wonder if the minister would like to respond to that and to try and give us a sense of why it is so supremely important to this minister to ignore these parents, to throw their views out of the window, and to say that the state will be supreme and these tests must be held in this classroom every year?
Hon. Linda McIntosh (Minister of Education and Training): Madam Chairman, I should say, first of all, just to make sure that the record shows a really clear balance as to what actually happens here. From listening to the member's comments, one would think that students are being subjected to a test for which they have to pay special attention for the whole year in daily discussions in class, et cetera, for which the learning program has to alter, for which students get marks back, and for which there is widespread opposition.
I think it is correct for the record to show what has actually happened, because the member's comments--and I do not think she is intending to do it--leave an impression that is not quite accurate in terms of what is really happening out there. I should say for starters that the member indicated that she thought I would have received about 1,000 letters, or she said there were 1,000 parents involved. Essentially, I have received letters from some of the Grade 3 parents in two schools, nowhere near 100 letters--statistically, a very, very small number of letters.
(Mr. Marcel Laurendeau, Chairperson, in the Chair)
So I have received letters from the parents of Grade 3 students at Laura Secord and letters from some of the Grade 3 parents in other schools. They are vastly outnumbered by the indications from the majority of parents and taxpayers supporting exams. So when the member says that I have received a great deal of correspondence opposing Grade 3 standards assessment, and that perhaps I have had some supporting it, the very use of the words "a great deal of correspondence" leads one to think that it is a great deal instead of a relatively small number, and saying that she does not doubt that maybe I have had some people indicating support implies that there has not been a lot of support; in fact, those two thoughts need to be reversed.
I have received perhaps 50 or 60 letters on this topic from parents of Grade 3 students in two schools that are alternative schools, and they are form letters--form letters that someone has typed up and that other people have filled in. I am not doubting their real desire not to have their children assessed, but the assessment need not have any impact whatsoever other than the few hours that we ask them to sit and write the exam, because, as the member said here, these parents do not believe in the first-in-the-class to the bottom-of-the-class philosophy. Indeed, students would have no way of knowing whether they are first in class or bottom of the class. We do not look at it that way.
Unless the parents ask for the results, children never get the results. These are results given out only if the parents want them or the teachers wish to share them, and given that in Laura Secord School none of the parents believe in the assessment, they do not have to worry. All we want to do is make sure the children are able to compute to a certain level. The member said they do not believe in pass or fail. That is totally irrelevant for their purposes because these are diagnostic tests, and, as I said, the teachers do not need to give the results to the parents. If the parents do not want to know the children's marks, and in this case they would not, then the teacher just does not give them out. The teacher does not have to give them out at all unless the parents want them.
Let us clearly understand what is happening here. The alternative program is in no way at all impacted by the fact that the Department of Education wishes to assess children at the end of Grade 3 to see if they can compute and do problem solving to a certain level. Those pieces of information are for diagnostic purposes only so that we can in turn say that our children in Manitoba have achieved a certain level of learning, so that teachers themselves can take that information and, if there are gaps showing in the learning, can prepare for the next year's teaching to try to address those gaps or to build upon strengths that are identified. So there should be no impact whatsoever.
All we do is say that at the end of June, we wish the children to pause for a couple of hours while we do an assessment of their ability to compute and problem-solve. We wish to do that for all students as part of our responsibility. When we guarantee what we will deliver in education to the children of Manitoba, we wish to assess it to satisfy ourselves that the learning has in fact taken place as promised, given that we go out and collect $1 billion from the taxpayers of Manitoba to pay for the education system in any one given year. That is where the many thousands of letters that I do receive on the topic of assessments come in. When you put these letters into the equation, I would have 50 to 60--let us really stretch it and say 100--parents in the alternative program opposing their children writing Grade 3 exams, and several thousand--I have not calculated or counted them up--people saying that, as part of their accountability as citizens of Manitoba and as part of the government's accountability, when they pay a billion dollars to ensure a well-turned-out, well-educated populace, they want some guarantee or some proof that their money is going to produce a well-educated populace.
Hence, we assess every third year: Grade 3, Grade 6, Grade 9 and Grade 12, and at the Grade 3 level it is simply a diagnostic assessment. Teachers just teach what they teach. We go in and say, here is a test; we want to know how your students do at computing and problem solving. That gives us the ability to say we are honouring our obligation to the taxpayer, we are able to indicate that children have achieved a certain level of learning that will enable them to go on to the next level of learning, and we are being accountable to the public, as the public has asked us to be, with some exceptions occurring here and there. In terms of responding to the majority of people, the vast majority do want some indication that there is a standard being met. But this in no way interferes with the alternative program, except that students will lose some three hours of instructional time while the school pauses to administer it.
* (1520)
The children do not have to be told ahead of time that they are going to be given a test that is in contravention of all that the parents have struggled, in terms of philosophy or whatever, to achieve for them, because that is not accurate. We have found that where students become upset about testing, it is generally because adults have told them they have something to be worried about. In schools where it is accepted as just a normal part of the way in which things happen, there is very little, if any, anxiety. Here, the alternative program teaches as it always has, and the results are given back to the school division. The school does not have to share them with the parents, and the parents in this case would not want them.
The parents, as the member said, do not want these Grade 3 exams because they do not believe in the first-in-class to the bottom-of-the-class philosophy, nor do they believe in the pass-and-fail philosophy, and that is fine. The parents do not have to receive the information. It is purely diagnostic; it is not a pass or fail. These marks do not count towards a student's final grade. It is purely for teachers to be able to prepare for the next year by assessing what students have learned, and for the department, as well, to ascertain that across the province certain understandings have, in fact, been learned by the students of Manitoba. The member asked what the purpose of the testing program in Manitoba is, and the purpose of the standards tests is to give an accurate, balanced and well-rounded profile of student growth and achievement.
Most parents in most places would like that, but if parents do not want it, teachers do not have to provide it. In fact, in some areas teachers have chosen not to provide it, and parents have written to the department in far greater numbers than the people at Laura Secord saying they want access to that information. They want to have that profile to help them better understand their children's progress so that they can help at home et cetera, et cetera.
It also says that standards testing will be complemented at the school level by tools and procedures such as portfolios, demonstrations, exhibitions and teacher observations--all of those things that are part and parcel of alternative programming. We say to schools, share this information with parents, but if parents do not want it to be shared with them, then certainly schools do not have to share it. We are saying to schools, where parents want that information, we encourage you to share it; even then they do not have to. Certainly if parents do not want the information, then schools should not have to provide it to them. Parents do not have to use the information, but teachers use these tests for decision making as to what perhaps should be in next year's teaching, et cetera. That is one of the purposes of the tests.
The present course of action complements the implementation of new curricula and supports best practices of teaching as they relate to specific subject areas. Optimal preparation for provincial testing is achieved by sustained use of provincially mandated curricula. This enhances the accountability of the education system to students, parents and the community.
We have a commitment to the delivery of curriculum-congruent examinations. All of our examinations, if you go through them question by question in relation to the curriculum, are all curriculum-congruent. So it does not require any teaching to a test. We do not encourage teaching to a test. We encourage the teaching of good, curricular material. Since the good, curricular material contains the type of material that we would like students to know, the testing is like a literacy test.
The language arts testing is basically a literacy test. You cannot study for it. You cannot prepare for it. You have either acquired the knowledge and learned how to apply it or you have not. So in the alternative program, nobody should ever have to say that they are preparing and teaching students to a test. That is the fallacy in the member's argument, and it may be the fallacy in the information that the parents in the alternative program have been provided, because in reality this test does not interfere in any way, shape or form with the alternative program, aside from the fact that we remove the children for a few hours to do the assessment. The parents, if they want to know nothing more about it than that, need know nothing more about it than that. If they do not want it recognized, for their children's purposes, as a testing instrument, they do not have to. We just want the information. It will show us that there is the accountability we have promised the ratepayers.
Part of education renewal, provincial testing seeks to ensure that effective educational strategies are used consistently and appropriately across the system and that all students have the same opportunity to achieve success at school. Educational renewal represents the government's commitment to revitalize the public education system for the current and future generations of students.
The purpose of provincial testing is to determine levels of performance based on pre-established criteria for particular grade levels. The classroom teacher, however, remains the primary assessor of students. Student assessment is a continuous, systematic, and comprehensive process designed to determine the extent to which student learning outcomes have been achieved. This assessment process involves careful planning, systematic implementation, and comprehensive analysis, interpretation, and reporting of results. It is an integral part of teaching and learning.
I think that those looking at assessment should be very, very careful not to fall into the trap that I suspect the member opposite may have fallen into, or that some parents may have been led into: believing that, because there is a standards assessment at the end of every third year, somehow no other form of assessment takes place. That is absolutely, totally, and horrifically wrong. Assessment takes place all the time. In the alternative program, as the member herself indicated, assessment takes place all the time.
As a province, we are not saying that this alternative program is a state-run program. The alternative program is clearly there as a choice for parents, because we believe in choice for parents. It is an approved program: the school board has approved it, and the Department of Education has blessed it as a program that is there for students--an alternative program, parents' choice--and we support that. So this is clearly a case of parents being able to make a choice, and somehow the member has come to believe otherwise--I think she has maybe been given incorrect information. She may wish to take a look at how the alternative program works and see what the actual impact of taking three hours to write an assessment is. It simply takes away three hours of instructional time, and it need do nothing more than that, unless people want to create an impression that it does do more, make that a self-fulfilling prophecy and turn it into a big political football. That can happen, and I suspect that maybe it has happened.
* (1530)
The results of provincial testing are reported by schools and school divisions as a supplement to other reporting information, including teachers' anecdotal comments about student growth and achievement. Manitoba Education and Training will provide interpretive comments with the student performance data to assist schools using the test results, but it is the local responsibility of schools to place the test results in a context that ensures the best interests of students are addressed. Test results provide valuable data and information that enable all those working with the child to make adjustments to address deficits and build on assets, leading to greater success.
I have just been passed a note from my staff saying that the Grade 3 exam is a maximum of two hours, and I recognize that. I am saying that probably with the--you have to pause and so on--you probably have a half hour both days where you are either getting things set up or taken down. The actual writing is two hours. You probably lose a half hour in getting out the papers and putting away your books, et cetera. So we are talking 40 hours and 20 minutes and 40 hours and 20 minutes, two days, and so--pardon? Oh, 40 minutes, I am sorry. Well, then if it were 40 hours I would share the member's concern. I meant 40 minutes, I am sorry.
Winnipeg No. 1 has always stated that Laura Secord School and its Alternative Education program comply with the curriculum. So if Winnipeg No. 1 says that the Alternative Education program at Laura Secord and other schools complies with our curriculum, and our exams test skills and abilities that are curriculum-congruent, then there is no need for those teachers to teach to a test. They just need to teach, because if they are teaching the curriculum and we come in just to check to see whether those skills have actually been learned, there is nothing new that the teachers have to do. They can carry on with their particular style of teaching, and they need do nothing differently, because we do not in any way point to or require a particular approach to grouping children for instruction or to teaching methodology. We do not say, here is the curriculum and you must teach it this way. We say, here is the curriculum, and whatever method they are using to teach it is fine. Students will, we hope, end up learning the curriculum, and when we go to assess it, if the curriculum has been taught, that should show in the assessment.
I also want to indicate that the Grade 3 mathematics standards test does not conflict with the philosophy of learning in alternative programs. The way in which they teach and the way in which they learn is in no conflict at all with our coming in and asking for approximately three hours to assess progress. As one form of assessment, the standards test complements the variety of classroom assessment and evaluation practices the teachers are expected to use, or choose to use, to guide and support a range of instructional strategies that accommodate diverse learning styles.
We have never said you can teach only one way. All we want to know, in terms of our accountability to taxpayers for the money we collect from them, is that at the end of the day we are turning out a well-educated populace, one that can sustain for the future the world in which the taxpayers who pay for it live, so that we have a sustainable society of educated people.
Many thousands more parents have asked for that accountability than the fewer than one hundred who have asked us not to assess the students simply because the students have had a different teaching style. We have no problem with the teaching style, none whatsoever, and we are not asking them to change it. We are not asking them to do anything differently, but we need to support, and we do support, a wide range of instructional strategies, because children do not all learn the same way. As a result, students and parents are provided with an accurate, balanced and well-rounded profile of student progress and development.
The Grade 3 mathematics standards tests do not count as part of the final grade. They are diagnostic. The test results are to be used by classroom teachers to enhance classroom instruction and improve student learning opportunities and experiences. Most parents want the results. We have said to school divisions, we encourage you to release the results, but if parents do not want them, they do not have to have them. We found out very loudly and clearly this year that, province-wide, parents were demanding the release of the results. They absolutely insisted on it. That was a far, far greater request from the public than that of the, by comparison, relatively small number who do not wish their children to participate in assessment at Grade 3.
The Grade 3 mathematics standards test reflects the learning outcomes, that is, the grade- and subject-specific knowledge and skills that students are expected to learn. These learning outcomes help to ensure that teachers across the province have consistent, high expectations for all students regardless of gender, race or social standing. I think that is important, because it gives all students the same opportunity and the same right.
Teachers in alternative programs have the opportunity to supplement curriculum outcomes with program-specific learning outcomes.
I just want to harken back to one thing I said, when I said that parents were demanding the results. They were demanding the results; they wanted their own student's mark, though I would not think that in the alternative school they would want their student's mark. The school did not have to give them out. As a result of public demand, we eventually published not individual marks but the school-by-school marks; individual student marks, first of all, schools do not have to give out, period.
Most of them do, because most parents are demanding them, but where parents are not interested in the assessment process, I would imagine they would not want the marks, and the school does not have to give them the marks. So for all intents and purposes, the two hours spent writing the exam and the half hour spent clearing away the books and getting out the exam papers could be just another exercise--children, today we are going to have an exercise in doing this and this and this--or we are going to have a guest speaker, or we are going to have some other activity. It need be no more significant for those children than that, unless people want to make a big issue of it, which I think in the current atmosphere may have happened.
The parents may feel in some way that the teaching and learning has changed. That is what I found in those 60 letters that I got. Those form letters were designed to imply that the teaching and learning experience was going to be different because at the end of the year there would be an assessment. That is not true. The teaching and learning experience did not change and need not change in any way, shape or form, because it is all based on a curriculum anyhow, and the alternative schools teach that same curriculum. They just do it with a different methodology. This test does not test the methodology. It just assesses how much the children have been able to absorb and apply in their everyday lives.
* (1540)
We know--and we ran an election on this; I think there was a lot of discussion about this in the election--school effectiveness research indicates that schools with high expectations, a clear mission and frequent monitoring of students' progress, which is what alternative schools do, can promote a variety of positive student outcomes, including achievement and behaviour. That is also what assessments do. Standards assessments bring high expectations, a clear mission and frequent monitoring, every third year, of the student body. High expectations of success, for example, help to reinforce an academic emphasis and a positive learning environment.
Standards tests provide a variety of benefits to students, teachers and the public. Standards tests help to reduce inconsistencies in student learning and performance expectations, assessment and evaluation practices among schools and school divisions. Alternative schools can continue with their ongoing assessment. The member opposite indicates that they do constant assessment, and they can continue to do that. Nothing stands in their way. It is a fallacy and a false argument to say--[interjection] Pardon? Two minutes? We have got time limits here? I did not know that. I better hurry up, because I still have a lot I want to say. We can do it in a series of questions, I suppose. How much time do we have? [interjection] Okay.
Test results and, in particular, the professional development available to teachers participating in the test development and marking process help to strengthen teaching and learning opportunities, and we have talked about the valuable professional development experience of marking before.
Schools are required to administer the Grade 3 mathematics standards test, and students are entitled to receive regular assessment and evaluation of their academic performance and achievement.
I will stop because I do not want to end up putting the chairman in the position where he has to stop me, but I just want to conclude with this one sentence. It is a fallacy, it is not the truth, to think that the alternative program has to change its methods, its way of doing things, one little bit because we come in at the end of the year and ask for a couple of hours to assess students' abilities to compute and problem-solve at a relatively high level that is the standard for the province.
Ms. Friesen: Well, it seems to me that there are a number of fallacies in that response.
One is to say that high expectations, a clear mission and monitoring can only be achieved by a standards test. That is not the case at all, and I think that the minister might want to recognize that. Nobody here is opposing assessment. What parents of children in alternative programs across the city are concerned about is that they do have, in fact, a different approach to the delivery of the curriculum. Yes, the minister is saying that the curriculum is the province's, but what I am beginning to wonder is whether, in fact, the minister is misunderstanding the alternative education program. Yes, the curriculum is the same. The methods of teaching may be different, but what is most important is that the timing is different.
When the minister comes in and says we know better than these parents, the views of these parents do not count, we will test these children for these concepts at this age, then I think that we might begin to recognize that there is some misunderstanding on the minister's part about the way in which curriculum is taught in the alternative classrooms.
I want to read some of the letters that we have received from these parents, and I know that the minister has read them. I am not going to use any names, but I do want to express the sense of these parents and their concern that their views should count. We are parents, they say; we do not want this; we are pleased with our daughter's education; we are now standing up with our daughter because we feel that the quality of her education and that of her younger sister is at stake.
A multi-age, Mr. Chairman--and this is the part that I think the minister might want to think about. What these parents are saying is that a multi-age classroom is not set up to teach the curriculum in one year. It is set up to teach the curriculum over a number of years. In this case in the one-to-three classroom, or later on in a four-to-seven classroom. The same curriculum may not be delivered at the same time, because what the alternative curriculum does is to take the interests of the class, that co-operative sense of the collective--
Mr. Chairperson: Order, please. The honourable minister, on a point of order.
Mrs. McIntosh: No, Mr. Chair. I was just asking if I could obtain a copy of the document from which the member was reading. I thought the page maybe could get one for us. I did not want to interrupt the proceedings.
Ms. Friesen: Mr. Chairman, yes, there is one that has been sent to the minister already, but certainly I will check them.
Mr. Chairperson: Order, please. On the minister's request, I am going to take it under advisement. I am not sure at this time if you have to table it. Because you are reading from the letter, I believe there might be a ruling that you have to table it, but I would like to research that first, and I will get back to the committee in just a little while.
Mrs. McIntosh: I would appreciate having it in enough time to be able to have this discussion on it. That is the problem. If I get the information tomorrow then the--if it is a letter that has already been sent to me, then the member should have no trouble. I do not mind it being tabled with the name blocked out. I just want the content for discussion.
Mr. Chairperson: I will make a decision in a few minutes here I believe. Are you prepared to table it now?
Ms. Friesen: I will block out the names, yes.
Mr. Chairperson: Okay. The honourable member will get the names blocked out and then we will have it tabled.
Ms. Friesen: Do you want to take a recess?
Mr. Chairperson: The committee will recess for five minutes.
The committee recessed at 3:47 p.m.
The committee resumed at 3:53 p.m.
Mr. Chairperson: The committee will come to order.
Ms. Friesen: Before we recessed I was discussing the difficulty that the minister might be having with understanding the way in which alternative education is taught. Using a letter that I think expresses it very well, the teaching of curriculum in the alternative classroom is a cyclical process. It is not one where the students are taught in a lockstep manner; rather, the interests of the classroom, the project-based method, is what is followed.
By the end of a certain period, most of the curriculum is taught, but it may not be taught in the order in which a test requires it to be tested. And so that is the concern of parents, that the multi-age classroom, the co-operative framework do not lend themselves to the testing that the minister wishes to impose. That is one issue. I think the second issue is that the parents have said very clearly that they do not want this, and it still continues to puzzle me why the minister wishes to impose this Grade 3 testing over the wishes of the parents.
The minister really has two arguments in response to that. One is that she has thousands of letters, and I think I am quoting accurately, supporting assessments. I wonder if the minister would tell us how we can see the evidence for that and whether, in fact, that evidence supports the testing of children whose parents clearly do not want it. Are those parents ones who want to impose this upon others? My sense is that they are not, that Manitobans are generally a very tolerant group of people and that they would not support the imposition on a group of parents like this, who very clearly do not want the testing at the Grade 3 level, and they are very specific. I am going to be a Doubting Thomas on this and say, I have not seen these thousands of letters. In fact, I do not think any of them have been copied to me.
Now, that does not mean they do not exist, but I would like the minister to indicate to us where they exist, how we can have evidence of her argument that she has received thousands of letters on this, and whether, in fact, those letters would support the examining of children at the Grade 3 level whose parents clearly do not wish it.
The second argument the minister made was that she fought an election on this, and I think she is on better ground there. She argues that the election was fought, the evidence is clear that the government won, and that assessment at all levels was part of that election. So I think that is a stronger argument, but I am interested in the minister's thousands of letters which she claims to have received, and I think the parents whose perspective I am trying to put here would also be interested in that.
I do believe that there is across Manitoba, not just in the case of alternative parents, some very serious concerns about the costs of these tests, even from those parents who do support testing in a variety of areas. Some parents I think support testing at the Grade 12 level, some would support it at other areas, some would support it in some subjects, but not others. I think there was a range of opinion on tests, but I do understand that the cost of these tests, my sense is close to $15 million when it is up and running in four subjects at four different levels, is something which is of serious concern to parents.
On the Grade 3 tests, I think the minister will find in areas away from alternative education that there have been concerns that this is being marked at the very highest expense level, that is, the centralized marking in Winnipeg with the cross checks and everything that is dealt with in other tests. Certainly opinions have been expressed to me and I expect to the minister as well that the Grade 3 tests, if they are diagnostic, if they are as the minister claims not disturbing in the classroom, then why can they not be marked by the classroom teacher? The issue of costs seems to me most easily addressed by the minister at that level, but, again, we do seem to have a very rigid approach to testing right across the province and one which seems to me, in this case, in the case of the alternative parents, not to take account of the parents' wishes.
I keep coming back to that because it is a government which says it listens to parents. Now, in the Dakota forum that it had, there were some parents--as the minutes of that forum some five or six years ago now show, I think--some parents were very enthusiastic about testing. Some parents were enthusiastic about testing at different levels. I do not know that those minutes carry enthusiastic support for testing all students at the Grade 3 level in several subjects. I think at the senior high level there is a greater interest in testing across the province, but, again, it is not entirely enthusiastic or fully supportive as the minister perhaps wishes to say.
The third forum that the government held I attended, and I do not recall testing being part of the discussion. That forum at Yellowquill School dealt with parent advisory councils, with the role of parents in education, the partnerships, the very important partnerships that they play. The first Parents' Forum I do not believe had a formal report, so it is difficult to evaluate what the full sense of that was.
So I am puzzled by what kind of evidence we can look at, other than the election, to show that thousands of parents are supporting the assessment at the Grade 3 level and, in particular, would insist upon the assessment in the alternative schools over the express wishes of the parents.
Now, the minister says that it will have no impact on education, that it is merely--I think the comparisons she used were ones where, it is like having a guest lecturer; it is like having a guest into the classroom; it is like a particular kind of activity.
(Mrs. Shirley Render, Acting Chairperson, in the Chair)
But I think however much parents try to ensure that their children face a test with equanimity, that they are calm, well-prepared, understand that it is diagnostic, I think there is an atmosphere amongst children in a school, a peer atmosphere, which makes it more difficult for parents and which perhaps becomes the overriding atmosphere. This is not something necessarily that parents and teachers have control over. It does happen. I think the minister has been a teacher. She understands that. The day of the test; preparing for the test; how did you do on that question; what did you do on that one?
* (1600)
All of those things become part of the atmosphere of a school and in particular of a classroom, and that is what these parents are saying. The multi-age classroom, the co-operative framework that they have chosen, the very active partnership that they have entered into with the school are not conducive to the kind of atmosphere which inevitably in any school accompanies a test. Of course, there are not alternative schools in the sense that the minister wants to talk about them, although perhaps Harrow School at one point would have considered itself a fully alternative school--I am not sure it does at the moment; it has other programs--but most of the alternative programs run in schools which are dual or triple track, and so the atmosphere of the test extends throughout the school and throughout the playground.
So, again, I come back to the sense of here are perhaps 500 or 600 children whose parents are expressing in a number of ways that they do not think that this is appropriate. The numbers must not affect in the minister's mind the statistical viability of this test because she is prepared to accept 20 percent not writing at other levels, and she said that that was acceptable in Alberta.
Indeed, I checked up on that because I was surprised. I actually thought that 20 percent not writing would affect the statistical viability, but I did call Alberta and they said, no, that is the way we work it, and we have always worked it, and we do not have a plan B, and we are prepared to allow that number of students not to take part.
So given that and given that the minister has said that she does follow the Alberta example on that, although I understand this coming year there will be a plan B for the Grade 12 winter exams, I wonder again why these parents, small in number, but very, very enthusiastic about the kind of education they have received, are to have their children tested at the Grade 3 level over their express wishes. Why is it that the wishes of these parents do not count?
Mrs. McIntosh: It is unfortunate. I do not want to be arguing semantics, but I do think it is important that the member try to change her habit of taking what has been said and just twisting it a little bit to make it sound like something different.
For example, the member indicated, when the minister says we as government know better than those parents do, that is not of course what I said. When the member says a number of things, when the member says the minister claims she has had correspondence, there is an emotional evaluation put upon the word that I think she understands very well as do I. You can take words, and you can say "claims" and "alleges," and they are not inaccurate, but they leave a connotation that encourages the listeners to suspend belief. The member knows that; she is an intelligent, well-educated woman. I know that as well. So the general public may not recognize the subtlety of the game that is being played here, but the articulate listener will, and it belittles the lofty motives for which we are gathered in the House to take words and put emotional connotations on them in that way.
The member, for example, indicated that I said there were two reasons for having tests, one being that I have received thousands of letters or I claim I have received thousands of letters, and the other reason being that we fought an election. Well, I did not say those were the reasons for having tests. I encourage her to search Hansard. I was responding to a question she asked as to what support we had for the tests. That is not a reason for having a test. I stated clearly the reason for having the test. That is based upon the research, the extensive research that is there, and my answer was very, very clear that there is a tremendous amount of research indicating that where you have proper assessment, standards assessment of this type with high expectations, that students will learn better, and the teaching and learning experience will be improved. Those were the reasons.
The fact that we have received thousands of letters and thousands of votes was in answer to the question on what support do you have. So when the member says back to me that I have said the two reasons we are having tests are because of thousands of letters and thousands of votes, she again leaves an impression for which she has already had an answer, that we have not taken research on learning; strong, solid educational research on learning and assessment, and the vital role that assessment plays as part of the learning experience into account. Of course, that is misleading to anybody listening.
I also made clear and distinct references to best teaching practices. Those are all in my answer. I invite her to read Hansard and she will see that I did not give thousands of letters and thousands of votes for the reason for having exams, and I repeated it several times because I have learned I have to with this member and still it did not seem to sink in. Thousands of letters and thousands of votes showed support for the rationale, the rationale being the extensive research that has been done on this topic.
I may draw the member's attention to what some of that specific research is. I made reference in my earlier answer to best practice in teaching. Best practice in teaching reading, and there are all kinds of items in here, many of which may be utilized in the alternative system, which I do understand and I thought I indicated to the member that I did understand how that system worked, but I will go through that again in a minute to show how her concern about the timing at which things are learned again bears no relevance to them having to change anything in order to have a standards exam. I will explain that in a minute.
First, in terms of best practice in teaching reading, you will have reading aloud; time for independent reading; choosing your own reading materials; exposing children to a wide and rich range of literature; teacher modelling; primary instructional emphasis on comprehension; using strategies that activate prior knowledge; how students make and test predictions; structured help during reading; providing after-reading applications; social, collaborative activities with a lot of discussion and interaction; grouping by interests or book choices; silent reading followed by discussion; teaching skills in the context of whole and meaningful literature and writing, before and after reading; use of reading in content fields, historical novels in social studies, for example; evaluation that focuses on holistic, higher-order thinking processes; measuring the success of the reading program by students' reading habits, attitudes and comprehension; exclusive stress on whole-class or reading-group activities; teacher selection of reading materials for individuals or for groups; student selection. There are so many. The list is extremely lengthy. You could continue on.
You could go into the primary instructional emphasis on reading subskills, such as phonics, word analysis, syllabication, or teaching reading as a single one-step act or solitary seat work. You can group according to reading level. You can group in different ways. You can do a whole series of things. Some of the things I first named are on the increase; the second set are on the decrease, things that are no longer considered to be things that you want people to do. You do not maybe want them teaching isolated skills in phonics workbooks or drills with little or no chance to write, and so on.
In terms of research again, take a look at the selected bibliography. There is a book called Becoming a Nation of Readers: The Report of the Commission on Reading, from Washington, D.C., by Anderson, Hiebert, Scott and Wilkinson. There is a book by Roger Farr, The Teaching and Learning of Basic Academic Skills in Schools. There is a book called Putting it all Together: Solving the Reading Assessment Puzzle, in Reading Teacher. There is New Policy Guidelines for Reading: Connecting Research and Practice, by Jerome Harste, National Council of Teachers of English. There is the book The Struggle to Continue: Progressive Reading Instruction in the United States, by Patrick Shannon; Reading Process and Practice by Constance Weaver. There is a selected bibliography on the teaching of writing, and again you can go through a lengthy list of things in terms of teaching writing that show the number of things that are being done and the number of things that are decreasing in terms of being done.
* (1610)
We are writing across the curriculum as a tool for learning now, more and more. We are writing for real audiences more and more, and so on, and there are selected bibliographies in Best Practice in Teaching Writing. Again, all of these lead up to the emphasis on assessment being a way, in terms of evaluation, to really measure and to continue the progress by being able to build upon known skills and foundations in a very clear, concise and consistent way.
We have a list of books for Best Practice in Teaching Writing and Assessment and so on, Writing and Response: Theory, Practice and Research. You had asked for research to support Grade 3 tests when parents do not want it. We are breaking new ground, so research neither supports nor defeats it in that sense, but we design our tests at all levels to comply with the principles of fair assessment which are endorsed right across North America.
I can provide the member with a list of 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, I can go on with dozens of books on how to assess reading and writing in terms of research. In terms of mathematics we have an increase in terms of best practices in teaching mathematics with evaluation, having assessment as an integral part of teaching, focusing on a broad range of mathematical tasks and taking a holistic view of mathematics, developing problem situations that require applications of a number of mathematical ideas, using multiple assessment techniques including written, oral and demonstration formats.
When we look at the increase in teaching practices in terms of problem solving, communication, reasoning, mathematical connections, numbers, operations, computations, geometry, measurement, statistics, probability, patterns and functions, all of those items again have a selective bibliography on best practices, and I know the member agrees with best practices and all of these books: AIMS Education Foundation; Teaching and Learning Mathematics in the 1990s; Curriculum and Evaluation Standards for School Mathematics; Math Wise: Teaching Mathematics in Kindergarten to Grade Eight; Professional Standards for Teaching Mathematics. You know, this one here happens to be by the National Council of Teachers of Mathematics in the United States of America: Helping Children Learn Mathematics; Mathematics Assessment: Myths, Models, Good Questions and Practical Suggestions, again the National Council of Teachers of Mathematics; Family Math; New Directions for Elementary School Mathematics, even the term that we use, New Directions, again not by us but by Trafton; Living and Learning Mathematics, Heidi Mills and Timothy O'Keefe, and so on and so on and so on.
The list is extremely long. I have a list, a long, legal page here of book titles just on that one topic alone, and I could do it for every subject area, because we have all the research documents listed here, and I am going to come back to some of the research statements that support what we are doing here, but I just wanted to address before I did that some of the charges the member made, and I will come back and hopefully will have enough time to give her all the rest of this information on the research that has been done and the fair student assessment practices that I know the member supports, upon which all our testing is based.
The member has said three or four times that I have ignored the desires of these parents, and how can I, and I guess I do not feel I have ignored their wishes, because their concern was that their alternative program in some way would be changed, and it need not be; it does not have to be. The fact that they have been somehow told that it would be, because someone is pausing to assess their children's progress, is really sad. I wish they had not felt that. We could offer to go to those schools and work with any teachers who may as yet be unable to see how our assessment and their curriculum in no substantive way constrain their philosophy or their school's philosophy. I would be pleased to do that if in fact they asked me to do that, but they did not.
Nor did the schools incidentally ask for an exemption on those exams, but we do have to take a look at all of this research, which is the reason, not the support, but the reason for having proper assessment of this nature being done right across North America, not just in Manitoba, but right across North America. It is a fact in a democracy that people do have a right to their views and they can and should indicate support at election time for the party that espouses their views and they do that. My obligation in the end is to bring in those policies that we have indicated we believe in, that research shows are good and for which there also appears to be support. That is not the reason, the support is not the reason. The support is the support and I hope the member is able to understand that.
But our tests do support the curriculum, and the curriculum is being taught in the alternative process. The member says the fact that we are doing an assessment requires a lock-step approach, and that, of course, is utter nonsense. That is total and absolute nonsense, that to have an assessment test that would go in and measure a certain standard of literacy, mathematical literacy, to say that that requires a lock-step approach to teaching is absolute nonsense.
There is a good basic curriculum prepared by teachers who understand reading and writing, and that good basic curriculum is followed in the alternative program. The diagnostic purposes to which the tests are put provide teachers and parents, if they want the information, with solid, valid information in a nonjudgmental way. It is sound educational practice to say here is where your child is at this moment. Here is a photograph of your child's progress.
The standards tests are not unlike putting a bar on a high jump. There will be pegs where you can set the bar, and the children can jump over it. Some will jump it at a lower or a higher level, but there is a range. It might start at two feet and go up to four feet or whatever, and some children may be jumping over it at the two-foot level. That is very good for teachers to know. Janie is jumping at the two-foot level. That is good for me to know. Now I can work with her to get her to the three-foot level. It might also be very useful for the teacher to know, goodness, gracious, Janie is jumping at the four-foot level; it is time for us to move on to the next level.
All of these are things that the teacher may well be able to determine with her own testing or his own testing methodology. Is it not good to have it confirmed by a standards test? What teacher would not want to have their own assessment confirmed with a standards test? I know of no teacher who, doing his or her own assessment and accurately indicating where the child is, would not welcome a verification of their diagnostic assessment. It is always comforting to know that you have assessed properly, that you have an accurate photograph of the child, and it is also very helpful, if you have not assessed correctly, to get some other assessment that would help. I mean, constant improvement for the child is what this is all about. I think if any principal felt that students would be harmed by writing the test or the program destroyed by writing the test, the principals would have asked for an exemption, because we do have exemptions.
* (1620)
But I guess what I find interesting is that I do understand fully the nature of teaching the alternative program, and I do not concur that by teaching in a different way or manner or over a different time period, that students should not be assessed at a particular point in time. Any student in an alternative program who is of a certain chronological age associated with Grade 3 can or should write a test associated with Grade 3 for diagnostic purposes to measure the progress and find out where he or she is at that point. If the child is not able to perform all of the functions associated with Grade 3, then the teacher and the parent, if the parent wishes, can assess their performance and confirm that which they already knew, assess the information, look at motivation, look at capability, look at is this where we expected this child to be now.
This is a help. A child may well in an alternative program be well ahead of the standard. This helps the teacher, again, get a good measurement that the child has moved ahead at his or her own pace and has arrived at a particular stage in learning that is beyond the Grade 3 standard that can be measured on a Grade 3 standards test. That is a confirmation to the teacher of their own diagnostic performance, and that is helpful. What teacher would not want that? If parents do not want to hear it, they do not have to. The Grade 3 assessment provides one more piece of information that the classroom teacher can use in order to make decisions about teaching and learning, and, as I say, it can confirm what they hopefully have determined in their own work.
I should also indicate that the member, I think in her remarks, left the distinct implication that some students in the alternative Grade 3 age grouping might actually be at the end of Grade 4 or Grade 5 or some other level, or below that, depending upon where their learning is. But in fact, I think the member, if she understands fully how the alternative program works--and she says she does, so we assume she does; the member claims that she understands fully this process--then she knows as well that the Winnipeg School Division makes every effort to ensure that the students do not get too far out of line with the congruency in other schools, because they know that children will move from time to time. They want to make certain, as children move into and out of the alternative program, that if they transferred to another part of the province or another province, they will not be out of kilter. So Winnipeg No. 1 has indicated to us that they try to keep it fairly congruent with the grades in other schools. It may not be exactly the same because they are able to move back and forth, but they do try to keep it fairly similar.
Our tests assume that the curriculum has been taught. In turn, that assumes best practice, which is anything but the lockstep the member referred to. Best practice is not lockstep, and that is what we use in our schools. The alternative program may not use it the same way other schools do, but I am sure even in the alternative program they pay attention to best practices. Again, I have a lot of material here and I could just spend the next half hour reading all of the book titles into the record, but I do not want to miss some of this other opportunity. I guess that is all I will say on that, but I will come back now to some of the information that I think would be useful.
John Bishop at Cornell University did a reanalysis of data collected as part of the second International Assessment of Educational Progress, the IAEP; the member may have heard of it. He found that holding the social-class background of students constant, just the same socioeconomic background, students from Canadian provinces with examination systems, regardless of how they taught but where they had examination systems, were substantially better prepared in mathematics and in science than students from provinces lacking such exams.
He also found that there is no evidence that external exams caused any of the undesirable effects that opponents of external exams have predicted. Reading for fun went up, not down as predicted. Mathematics teachers decreased their emphasis on low-level skills, computation, et cetera. Science teachers arranged for students to do more, not fewer, experiments, and if the member wants to read that particular reference, I refer her to John Bishop's work "The Impact of Curriculum-Based External Examinations on School Priorities and Student Learning," which is from Cornell University, and it is a fairly well-respected university.
A lot of myths are put forward by people who oppose exams--and I know the member says that they support exams and assessment, but I have never heard one word yet to indicate support. Everything that has ever been said from the opposition about standards and exams leads me to believe that while they say they support exams and assessment, they have never found one detail about them that they have supported. I have been told by certain friends of mine who happen to be teachers that the NDP is being strongly supported by certain members of the union, because the NDP has indicated that they will get rid of standards exams if they ever come to power. I believe that, based on everything I have heard the member imply about the wretched impact of exams on students, the terrible things they do to them and the way in which nobody wants them except people who vote Tory. What we are doing is based on research. It is based upon knowledge and understanding that I would hope the member would ultimately come to embrace.
This is not going to do anything to the alternative program regardless of the timing with which they learn or anything else. All we want to do is take a snapshot of children at the end of June in Grade 3, a two-hour snapshot, so that we can in clear conscience answer to the public, who hold us accountable to do an assessment, to ascertain for sure that children have learned to compute and problem solve. They give us a billion dollars every year to do that; they want the accountability, so we can give that to them. But the reason the testing is done is because all the research shows that, which I have indicated to the member in some detail.
The Principles for Fair Student Assessment Practices for Education in Canada set out a developmental process: developing and selecting methods for assessment. My staff has just handed me an interesting thing. These are some of the comments from students received during the Grade 3 math pilot last year. I will not read the name of the teacher; it has been blacked out, or the name of the student has been blacked out, as the member did with her letter. The member submitted a letter from some parents saying they did not want their children to write the Grade 3 assessment exam, because they learn their math with a different methodology than most other schools.
Here are just a few. These are from students: I thought this test was easy. I mean, come on, 14 divided by seven. That is too easy. I think I performed well on the test. It, the test, was long but not by much. That is one student.
Here is another student. We can table these. I think the names are gone. Oh, here is a name still showing here: When I wrote the math test I felt excited because I never wrote one before and it was exciting. Some of the questions were hard and some were easy but most of all I had a really good challenge. I think I did well on the test. Well, I hope I did. The test was sort of difficult to me but I managed okay. It was exciting.
When I first thought of taking this test--this is another letter--I told my parents, no, I do not want to go to school, it is the Grade 3 math test. Then I remembered that in this Grade 3 math test this is just the stuff we learned from the beginning of Grade 3, but I wondered if it would be more difficult than the regular math sheets we have, but I found out when we did the Grade 3 math test that it really was stuff we had already learned and that it was really easy.
* (1630)
This is--names are here so--Dear such and such, my teacher, I think there should be more questions because this test did not show all my thinking. I can think of a lot more than what is just on this test. I thought this was a fun test. I think the instructions should be more clear. Thank you for making up this test, but I think the questions should be harder so I can show all my thinking.
Here is another letter. I think that the article with the example in a newspaper about the combinations with tee shirts and jeans, our whole class said that was really easy. I personally think it was pretty easy but some adults might have had a bit of trouble on that question. If some adults did have trouble on the question, here is what I did to explain it to them. And then she has, or he has, a little thing that shows how he explained it to the adults who could not do it.
Another student, Grade 3: I thought the question you put in the newspaper was very easy. I was surprised to know that some adults had trouble solving the question. Here is how it is done, and then again the answer. Then another letter: Before the exam I was nervous, but after the exam I felt better. Part B was way easier. When we finished paragraph 4 I was really anxious about flipping the page to part B. I could see through the last page of part A and I could see a car. Then I got curious to know what kind of math would be on part B, so I turned it over. I think I did a pretty good job, but I--okay, I will come back, Madam Speaker. The member had asked me about anxiety for students. The central marking I will do in the next answer.
Ms. Friesen: Most interesting comments from the students; I am very glad to hear them. The minister has talked about students' comments. She has also talked about teachers' comments and how teachers should welcome, and many of them do, some confirmation of their own assessments. The minister offered a bibliography of support for standard testing. My sense is that, apart from the particular one that she mentioned from Cornell University, many of the books on that list encourage many forms of testing, many forms of assessment, and so the evidence for that, perhaps, is not quite as striking as the minister would like to indicate.
But the issue that I have indicated and suggested is that it is the parents. The minister wants to talk about students. She wants to talk about teachers and that is all well and good, but the issue is that these are parents who have asked that their children be excused. This is a government which claims that it has listened to parents, and I am puzzled by the way in which these parents are not being listened to.
I think we have been around that one for a number of rounds now. Let us look a little bit further at the Grade 3 test. The minister says that it is a diagnostic test, and so I am interested in knowing what is planned for the release or the publication of the information. The minister has said that parents need not ask for it, and that is quite correct. Does the minister intend to release these results school by school as she has for other exams, or are the Grade 3 tests, as diagnostic tests, to be dealt with in a different way, or is the minister leaving it to the divisions to release as they see fit?
I am also interested in the way in which diagnostic tests are to be used. A diagnosis is an indication of a certain level of ability or of inability. What does the minister have--it is the minister's test, it is the province-wide test--what proposals, what handbooks, what directions, what guidance, what professional development, what assistance is the minister offering for classrooms and for teachers to deal with the results of this diagnosis?
Mrs. McIntosh: I will try to go a little faster. I have noted down the member's question. I will finish up my answer to the other quickly and then come back and answer this one. I think it is important that I read just a few more of these students' letters, because I think they do indicate the impact on students, and I think it is important for the record for them to be here.
I mean, we get letters--like, I remember the show, was it Perry Como that used to say, we get letters, we get stacks and stacks of letters?--on a whole wide variety of topics all the time and most delightful of all, of course, are the letters from children. Here is a student, anyhow, that said that she was curious to know what was going to be on the test. This student says, I do not think the test was too long. I wish it was one day long. It was not difficult reading the questions. Some were hard to understand what they meant. I do not think the questions are all easy; some are hard, but some of the questions were easy. This is Grade 3, so the language may not be Grade 12 level. Some were like the ones we do in class and I had fun.
This is another student. I felt I probably did well on the test. The test was kind of hard and kind of easy. The test was not too long. Reading the test was easy to read. The questions were easy. They were the same as the ones we do in school.
This letter is about the Grade 3 math test that you are in charge of. This math test was fun, but I think you should put more questions in it. I also think you should put it into three parts: part A, part B, and part C. I liked all the questions. Thank you for thinking up the test. Thank you again for reading my letter.
Those kinds of letters are interesting because, first of all, they say that with the right psychological approach students can look at these tests and actually find them fun. Where teachers say, tomorrow you are going to be assessed and we are going to see how much you have learned about your math, and take that kind of positive approach, students generally will take it too. Where the student is told by whomever--parents, teachers, system, media, whatever--next week you are going to write a test, this test will determine whether or not you know the material, it is going to be very scary and very big, and some big change in life will occur if you do not do well on this exam, then the students will come in absolutely terrified.
(Mr. Chairperson in the Chair)
On a diagnostic test, there is no reason for them to be fed that line. I think this also shows that students rose to the challenge; the questions that expanded their minds they considered to be fun, and they wanted more challenge. It also shows that some questions were easy and some questions were hard, which shows that we were able to reach, and to assess, both those students who move way beyond the provincial standard and those who are still underneath it, because the range of abilities is there. This is not a pass-fail test.
In terms of fair assessment, I think this needs to go into the record as well because these are the types of things we are looking at when we are using tests. If any of these are contradictory to the alternative program, the member might wish to tell me which ones are in contradiction with the alternative way of teaching, because I do not believe they are.
* (1640)
First of all, the Principles for Fair Student Assessment Practices for Education in Canada, which is what we use, says that those developing assessments should: define the assessment method; indicate what it is intended to measure and how it will be used; warn users against common misuses of the assessment method and describe the process by which the method was developed, including a description of the theoretical basis and the rationale for the selection of content and procedures, et cetera; provide evidence that the assessment method yields results that satisfy the intended purpose; investigate the performance of students with special needs and students from different backgrounds, and report evidence of the consistency and validity of the results produced by the assessment method for these groups; provide potential users with directions, answer sheets, et cetera; review printed assessment methods and related materials for content or language generally perceived to be sensitive, offensive or misleading; describe the specialized skills and training needed to administer an assessment method correctly, and the specialized knowledge needed to make valid interpretations of scores; limit sales of restricted assessment materials to people who are qualified; provide for periodic review and revision of content and norms; provide evidence of the comparability of different forms of an instrument where the forms are intended to be interchangeable; provide evidence that an assessment method translated into a second language is valid, et cetera; advertise an assessment method in a way that states what it can be used for--
Mr. Chairperson: Order, please. The honourable member for Wolseley, on a point of order.
Ms. Friesen: Perhaps I should tell the minister that I have the Principles for Fair Student Assessment Practices in front of me, and perhaps I should remind the minister of what the question was: it dealt with the Grade 3 exams and the use of diagnostic tests, the publication of results, and the way in which the department was intending to provide assistance to teachers dealing with the results of those diagnoses.
Mr. Chairperson: Order, please. The honourable member did not have a point of order. It might have been clarification.
Mrs. McIntosh: I appreciate the member's point. She had asked a series of questions before this one. I indicated I would like to finish the answers to those questions, and I took note of the question she asked last and will answer it next.
If she has the Principles for Fair Student Assessment Practices for Education in Canada, then perhaps she would not mind if I just tabled mine for the record, because while she has them, all of this is going on the record, and she left some questions that had implications inherent in the question; you know, Mr. Chairman, how you can be damned by the question sometimes?
I would like to table this to let anybody reading Hansard see that the implications in her earlier questions--that we are doing this in lock-step, that we are doing it with no reason, et cetera--are not borne out. I would like those back, by the way, because I do not have any other copies after you have taken a copy for the record.
I thank you, Mr. Chairman, and I will proceed then to the answer. She asked about central marking, she asked about release of marks, and she asked about diagnosis. First of all, in her preamble to her question this time, she said that the bibliography supports many kinds of assessment, and that is exactly my point. The bibliography supports many kinds of assessment, including standards assessment, and specifically standards assessment. Each of those documents supports standards assessment, but obviously other kinds as well, and we have said repeatedly here today that assessment is ongoing. It occurs and should be occurring with every teaching experience.
I have information on central marking that my staff has provided me, and I have some more research on best instructional practice which, ironically enough, comes from a Stanley Knowles professor at Brandon University: research showing that higher levels of student achievement are inextricably linked to the systemic implementation of best instructional practices, which include assessment and testing. I think that is quite interesting.
There is a whole series on acceleration, alignment, assessment, CBL (computer-based learning), co-operation, co-operative learning, criteria, expectations, feedback, learning styles, mastery, meta-cognition, motivation, orderly environment, peers, peer tutoring, remediation, school ethos, study skills, time on task, et cetera. I will not go through the whole bibliography here, although it is yet another very, very lengthy one.
Mr. Chairman, what I have tabled with you is not the bibliography. What I have tabled is the Best Assessment Practices. The bibliography--anybody reading this who would like the bibliography can contact my office. It is extremely extensive, very long, can fill up many, many, many bookshelves.
I want to try to take these in order. The Grade 3 diagnostic test is a diagnostic test. Will we release the information school by school? Yes. We will not be releasing individual marks. We will not do that, but the overall photograph of the province is useful information. It is part of the accountability. It was not our intention in the first instance to release those, but Brandon School Division, I think, made the decision for the people of Manitoba by its actions. It sparked such interest.
By trying to hide the marks or ignore the marks, Brandon School Division sparked such all-pervasive, all-pervading interest across the province that we just felt that we had to release the marks in response to that public cry, but not student by student. The public pays for the schools. They may be interested in every three years receiving a snapshot of certain grades just to get a handle on the type of learning that has been absorbed and understood and can be applied, but not with individual students.
Every provincial standards test will have a report that goes to school with interpretive results indicating strengths and weaknesses, and a technical report also is developed that goes to divisions and schools. That is for internal use. That is not for the broader public. Teachers and principals can then take a look at their diagnosis and see if treatment is required.
The member made reference to a medical diagnosis, and the same principle does apply. We might issue, from the Department of Health, for example, a statement saying that the public in Manitoba is generally healthy. We have also, as we have done recently, issued a public statement on a diagnosis saying we have a real problem with diabetes in our aboriginal community. Now, we do not say which patients have diabetes, but we will say this particular community has a problem with diabetes, and therefore we need to take corrective measures. We need to have, you know, programs on nutrition, et cetera. That is a diagnosis. It is made public in a generic way, but not with the names of individual patients.
It is to let the public know the state of health care in Manitoba, the particular problems we are facing, the triumphs we have had where we say measles has all but been eradicated, or measles has had a resurgence. We publish the picture for the people to know because we take many millions--for Health we take a billion and a half from them to run the health system. So it gives them an ability to get a sense of what their dollars are providing in terms of how students are able to absorb and apply information in a relevant way for becoming meaningful members of society. The public depends upon them to become meaningful members of society, because we know that without a well-educated populace, society begins to decline.
* (1650)
This will also help parents mix and match there as they go into schools of choice. Perhaps they would like to go to an alternative school where the standards tests may show that all students have, in fact, gone way beyond the Grade 3 level and are actually doing Grade 4 work. It may cause people to say, I am going to get in that long lineup to go to the alternative school, or it may cause them to say, you know, my son has had a terribly hard time keeping up to the Grade 3 standard, and I see that at this particular alternative program the students are still working at the Grade 2 level, reinforcing and reinforcing before going on to the next, and that is what my child needs. So therefore maybe I would like to go there. So it may have that impact, but basically it is to inform the public.
The individual diagnostic results that go to the schools are there for professional development in the schools. Regional sessions will be conducted by the department and by divisions to assist in the interpretation of results, and all of the workshops that are presented in math and English language arts incorporate assessment issues, both provincial and local.
My staff is planning a fall session called an executive seminar for senior school division officials, with one workshop topic being present-day assessment methods, to assist in a clear understanding that our tests are not accurately described as standardized tests or standard tests but are in fact standards tests. That is something very different. The nature and purpose of standards testing is not the same as the others.
I just want to indicate that schools will receive, with this, student profiles. They will get student profiles in terms of strands, levels and aggregates. They will get the interpretive guide to help them in using the profiles. They will get a school summary and a provincial summary, and the divisions will receive a divisional summary as well. Interpretive comments will also be prepared to aid in the overall interpretation of results. As I said, we are going to have the regional workshops.
All of our standards tests should be used for decision making regarding instruction at the classroom level. That is what they are intended for. That is the main focus. It is to inform, for accountability purposes; it is to diagnose, for continued learning. It does not need to contradict what is already happening in the classroom. It is an enhancement, and it harmonizes very well with it. The information gives a teacher one more tool to make decisions about individuals or groups within the classroom regarding emphasis on instruction. So the tests are for formative purposes, to help form the next level of learning.
The member asked about local versus central marking. There has been a tremendous amount of research done on this as well. The member probably is aware of this, but she did ask the question, so for the record I will indicate that one could do regional marking and you would get marks. One could do divisional marking and you would get marks, and you could do individual classroom marking and you would get marks. But the further you get away from a central marking system the less consistency you are going to have in terms of comparing apples to apples.
We had indicated that we could additionally try having a standards test marked locally. That raises a number of questions and issues surrounding the reliability of scoring. The main question is: Can teachers in 40 or 50 divisional marking sites mark an examination in the same way, consistently, for all students? So we used the June 1996 pilot test on the Grade 3 mathematics standards to compare central versus local marking.
Schools participating in the pilot were asked to double mark 20 percent of their test papers at the local level. These double-marked papers were sent to the department to be re-marked by a central team. The results of the central marking were compared to the results of the local marking and surveys were then conducted to obtain feedback from markers, co-ordinators and administrators.
An analysis of the results of central versus local marking has shown that discrepancies in marking occurred from division to division. A student's paper marked in one division could receive a significantly different mark from the marking team in another division. Of course, that is the problem that universities and employers have had for decades in Manitoba since the ending of departmental exams, which were 100 percent pass/fail, and they were scary stuff. A student would get 80 percent in one division, which did not mean the same as 80 percent in another division; hence universities and employers have not been able to accurately assess whether students really did, each, know 80 percent of the work, because some divisions marked more easily than others and some taught a different curricular content.
Mr. Chairperson: Five minutes.
The committee recessed at 4:56 p.m.
The committee resumed at 5:08 p.m.
Mrs. McIntosh: I indicated that when we analyzed the results of central versus local marking, we saw discrepancies occurring between divisions--not within a division but from one division to another. Finalizing the scoring key can only be completed after a large sample of student papers has been examined, and the key must be finalized by the group responsible for marking. If it is done locally, variations may occur in the key from one marking site to another. Staff involved in the Grade 3 mathematics pilot indicate that it is more efficient to train teachers to mark centrally. Standards tests require nontraditional types of questions and must be scored in new ways using rubrics. Although many teachers do have experience working with rubrics, many at this point do not, so training is required for teachers whether the tests are marked locally or centrally. We have found that the accuracy of marking centrally is much greater than when you break it up.
* (1710)
The 1996 SAIP experience in the scoring of the 1993 mathematics tests in three different sites revealed a number of inconsistencies in marking that had to be rectified subsequently. The next SAIP mathematics test will be marked in one central location. The conclusion, then, was that the 1996 pilot on central versus local marking indicated that at one extreme one division marked 20 percent lower than central marking, and at the other extreme a division marked almost 10 percent higher. Other divisions were between these two extremes. By marking centrally, the children are not penalized. We recognize that it is certainly more convenient to mark locally, and probably less expensive, so we continue to look for ways to get the same accuracy with local marking that we get centrally. As yet we have not determined or discovered a method that will give us the same degree of consistency and accuracy, so unless, or until, we are able to find a way that is as accurate, we will likely remain with central marking as the most accurate and consistent.
Aside from the locus of marking, and to repeat a point from an earlier question--and this will conclude my answer, Mr. Chairman, for this particular series of questions--New Directions states that standards testing will be complemented at the school level by tools and procedures such as portfolios, demonstrations, exhibitions, teacher observations, et cetera. These will be developed to give students and parents an accurate, balanced and well-rounded profile of student growth and achievement. Standards testing must therefore never exceed in value the important assessment work being conducted on an ongoing basis at the classroom level.
Manitoba Education and Training has recommended very strongly to the field that careful consideration be given to the weight attached to any one assessment activity, and that the greater the variety of assessment opportunities taken, and the greater the variety of tools used, the greater the chance students will have of revealing what they know and what they can do. I just wanted to emphasize that to show that the standards testing is not the only and absolute way of testing. It is one of a variety of assessment methods, and we believe and encourage a wide variety of assessment tools on an ongoing basis to maximize opportunities for students.
I think I have covered the points raised in the last question, and if I have not the member can perhaps indicate that when she is asking her next series of questions.
Ms. Friesen: There were a number of things that I think were interesting in the minister's response. One of them--I do not know if she wanted to leave this on the record--is that she seemed to imply that students who were anxious about tests may have been, and I am quoting, "fed a line." I do not know who she thought was feeding them a line, whether it was parents or teachers, but it seems to me an unusual interpretation of student concerns about exams.
I asked the minister about the impact of the diagnosis and what the next step was. She gave me a number of responses, and some of those are very interesting and very helpful. But I had already earlier asked the minister to provide, as through freedom of information, which we did receive, the interpretations of the Grade 12, or 40S and 40G, math, English and French summary reports. My sense of those summary reports was that they contained a lot of the mathematical scores--the means, the medians, the deviations, et cetera--and that is helpful for some people, but the written response or the written evaluation of the exams, at least in the ones that we received as being public, seemed to me very limited in its information for parents and teachers.
I was interested that the minister said that there was a second type of evaluation which went to schools. Now we were talking at that time in the context of only the Grade 3 diagnostic exams, so one part of my question is: Is there the same kind of larger interpretive report on the senior exams and presumably on the Grades 6 and 9 and 12 exams, as they come up, that is going to schools? The public one that I got, really the conclusion was a page long. It contained generalizations which I am sure are helpful to teachers but really did not take things very far. For example, teachers should try to integrate mathematics with other subjects so that students could obtain a better understanding of the usefulness of mathematics in solving daily problems. Students need to communicate their mathematical ideas more clearly and effectively. When solving word problems, a final statement answering the problems should be made by the students. Students should understand that mathematics is a language and needs to be communicated properly.
Now I can certainly understand that that is one of the conclusions from the mathematics exams, but from a parent's perspective, or the larger citizen's perspective here, it seems to me that that is not the kind of detail that really helps us understand where teachers should go next, where students should go next, and what the overall impact of the exam has been. So I am interested to ask the minister a second question here, which is: Is there going to be a larger evaluative report that goes to the schools? Is there anything from that evaluation which leads back to curriculum--the whole substance of examination and curriculum being a cyclical process, so that if you are going to have exams and you learn something from them, that leads to some specific changes, additions, subtractions, perhaps, in the curriculum? How is that process being addressed? For example, in the 40S and 40G mathematics, English and French summary reports, in specific terms the summary report says that the lowest results for the English mathematics 40G examination were in trigonometry, and the lowest results on French mathematics were on consumer mathematics. That is a very specific criticism of, I would think, the curriculum, the teaching and the students' understanding. So what is the next step in those two areas? I am just using it as an example of how these exams are being used to develop or adapt the curriculum, or to instruct teachers in areas that need strengthening, and how that instruction is to be given.
* (1720)
I am also concerned--and the minister has made a number of references and, indeed, was reading from the Principles for Fair Student Assessment Practices for Education in Canada. She was reading from the section which deals with exams which are external to the classroom. One of those principles, No. 5, is among the principles accepted by educators across Canada, and I think that is the purpose of using this particular document: it is one that has received wide support across Canada at all levels of education. The fifth principle that it elaborates is, and I quote, that a good exam or an appropriate exam should investigate the performance of students with special needs and students from different backgrounds. It should report the evidence of the consistency and validity of the results produced by the assessment method for these groups. Now, I did not see any assessment of that in the report that was made public. Is the issue of different backgrounds and of special needs students--I take those as two different categories--addressed in the second and larger report that is made to the school divisions and to the individual schools?
Mrs. McIntosh: Mr. Chairman, I will begin just by responding to a bit of the preamble. The member indicated that I had said that where examinations were not made into a big deal, but rather were just an ordinary part of learning, students were comfortable, but where they were made into a big deal--I think she said I had indicated students were "fed a line"--they could become quite agitated, and she did not know who would feed the line. I believe, if she checks Hansard when this comes out, she will see I had indicated that, in my opinion from observation, and in the opinion of a lot of people from observation, there are many, many schools where assessment is seen as an integral part of learning. It is customary. It is frequent. It is not considered unusual, and students tend to take it as just a matter of course. Therefore, they have a comfort level that enables some of these students, whose letters I just read, to actually feel the test is fun.
I believe I indicated fairly clearly in my answer that where students are, as the member says, "fed a line," or made to believe that the test will be traumatic, hard, scary, negative, have serious consequences for them, et cetera, they could be given this information or have this attitude of fear instilled in them by, as I said, the media, by homes, by schools or by general public commentary.
We know we have school divisions whose boards will put out information about the damaging aspects of exams. I have been to a division where that is the feeling, that exams are not good and exams will harm the children's psyche, lower their self-esteem, damage them psychologically, et cetera, et cetera. That then is given to the parent councils as information by the school division. The parents then become very fearful. The children then become very fearful, and the media will then report parents fearful of exams and so on. That is what I meant by that. That happens, and it has happened.
I can walk into divisions, and you can just tell. You walk into a certain division that believes in assessment and sees it as a normal part of the learning process. There is a comfort level that is quite extraordinary. You walk into divisions that see it as a fearful thing that say, well, you know, we are going to use these tests for merit pay or whatever other fearful connotation they can put to it, and the whole system right down to the student in the classroom is affected. So that is clarification on that for the member.
Is there a larger report that is going to go to schools? The member is making reference to the fact that in our public summary report issued after the January Grade 12 exams, we did not include any contextual information. We simply said things such as: students did well in calculation, generally speaking, but not as well in problem solving. We did not reference it contextually to say, in division A, the reason they did not do as well in problem solving is because of this, this and this. That, we felt, is information that the school division could and should address, in terms of laying down its school plans, those kinds of particular contextual things.
The next report, however, probably will have a bit more contextual material. We have noted several things in broad, sweeping principles. The one I have just mentioned, students had done well in calculation but not generally well in problem solving. We noted that 13-year-old girls are now performing as well as 13-year-old boys in science, but 16-year-old girls still lag behind 16-year-old boys, that type of thing.
Contextually why, that we have left to the divisions to explain. You know, why did we see these particular results in our division? What are our school plans for addressing any concerns that we might have identified as things that need addressing? But still, in the end, schools themselves will have the broader set of indicators of student performance, and, therefore, they have to, themselves, inform their public of much more contextual information because they are the ones who have it. We can give the overall picture; they can provide the detail. We will give a little more detail, but we still want them to be the ones controlling the detailed information since they are the ones living it.
The member questioned a previous reference to fair assessment and special needs. Every reasonable effort is made to enable students to demonstrate learning in relation to the objectives or the expected learning outcomes set out in the curriculum of the course or the subject area that is being examined or tested. Students with learning disabilities, cognitive disabilities and physical disabilities--visual impairments, hearing loss, that type of thing--may be granted one or more of the following adaptations, provided the adaptations do not alter the validity of the examination or the test.
Adaptations that may be allowed for one subject area--for example, mathematics--may not be allowed for another subject area such as English language arts. But the kinds of adaptations we are talking about could include the use of a word processor, a Braille writing device, typewriter, specially printed assessment instruments, large print or Braille versions. We may allow additional writing time for people who have a physical disability with hand movements and so on. We allow breaks during which the student is supervised if there is a problem with sitting still for too long. We will allow alternative settings outside the classroom with continuous supervision. We can adjust the setting of the examination or the test. We can have other subject-specific adaptations as approved by the Assessment and Evaluation Unit of the School Programs Division or the Direction des services de soutien en éducation of the Bureau de l'éducation française.
I do not know where the other--but a survey of Grade 3 teachers was conducted in connection with the Grade 3 mathematics standards test pilot to determine the number and types of students who might have to be exempted or require special accommodations when the standards tests are written in June of '97. Teachers were provided with draft criteria as part of the survey. Preliminary figures indicate that in the opinion of classroom teachers, about 5 percent of students should be exempted from writing, and the major factors addressed are emotional or psychological; learning disabilities; physical disabilities; language difficulty; and multihandicaps.
Teachers also indicate that about 8 percent of students would require special accommodations in the writing of standards tests, and those main accommodations would be allowing more time, reading the test to the student, other things such as Braille. For French Immersion that figure was 10 percent, Mr. Chairman.
The examination and standards testing program provides provincial information with respect to areas of the curriculum in which students have difficulty or in which they are strong. The curriculum development process has taken information from local as well as national assessments, and development committees take all of this information, as well as research into best practices, when the curriculum development process is underway.
One example of this is the emphasis that we now have on problem solving across the curriculum. Professional development activities could be tailored to divisional or regional needs, and the department has worked very closely with the regions in the implementation of curriculum and will continue to do so. Some Manitoba school divisions have taken local and provincial information to provide their community with a comprehensive set of information and to let their publics know what their priorities for continuous improvement are.
* (1730)
I mentioned we have workshops and sessions with local school divisions to go through the details of their own profiles to assist them, and it may help if I refer to, without going through all the detail, the provincial examination development process from our own provincial documents, page 1 and page 2. The provincial examination process outlines the development of pilot tests, pilot test administration, pilot test marking, provision of pilot test, notification process, examination administration, et cetera, et cetera, et cetera and that is in our New Directions document for the member's referral.
But we do go out to the divisions and go through more detailed information on their own divisional profile with them, and that is not public, but it is intensive analysis for the divisions to assist them. Did I miss anything? I think that addresses the points in the most recent question.
Ms. Friesen: The questions I was asking were coming from the evaluation of the examinations. My suggestion was that the ones that have been made public so far are useful but of limited use to the general public, and so I had asked the minister about additional ones. In so doing, I was taking the document which the minister had introduced into this discussion, which establishes fair student assessment practices across Canada, and took two--and I am going to add another one now--elements of evaluation for the exams.
It seems to me that what it says is that the examining body should investigate the performance, not the needs. The minister in her response spoke about the needs of students who are to write, how many needed Braille, how many students there would be in this category, but Fair Student Assessment Practices says that your evaluation should investigate the performance of students with special needs. So I am not sure the minister understood the question.
The second part of it was that it should investigate the performance of students from different backgrounds, and the minister's response on that was that was up to the division. Now, that seems to me rather difficult for a division to have enough information to understand where their results fit with those of others. If we are evaluating people from different backgrounds, surely it is only the province which has the full range of information that will enable a fair evaluation of that test and its effect and impact on students from different backgrounds. How can one division do that? How can it place itself in comparison to others when it does not have the same range of information that has been made available to every division?
In addition, the minister did suggest that this was the way she was proceeding, and yet the actual details of the procedure seem to me to be quite different from the Principles for Fair Student Assessment.
In addition, I wanted to ask the minister about the sixth element of Principles for Fair Student Assessment Practices, and that is No. 6 which says that the examining body should provide potential users with representative samples or complete copies of questions or tasks, directions, answer sheets, score reports, guidelines for interpretation and manuals. Now, I read that out in specific detail because I am concerned about the mathematics exams at the Grade 12 level and the concerns that were expressed by parents and teachers in Brandon about the level of information which was provided.
The evaluation in the report that was done by the Brandon University professors in the mathematics department indicated the difficulties that they believe occurred in that exam, and I am quoting from their report. They say: We believe that this shift in emphasis could be the major problem, the shift from 40 percent to 60 percent for the short and long answers.
The minister and I have exchanged questions and answers on this in Question Period as well, but in their evaluation, the professors from Brandon pose a number of alternatives, I guess for preparation and performance, and they argue that students in one scenario are likely to study from old exams. I wondered if the minister had anticipated this, that this is how students would study and whether she was prepared to make exams public so that the strategy is available to everyone. I believe I did indicate to her that British Columbia does this on a regular basis, and she said last year that that was something she would look at.
The professors make the point that the students' performance on 1997 exams depends on what students are told to expect and how they study for it. That is where it seems to me the principle No. 6 in the Principles for Fair Student Assessment Practices is a very important principle, that potential users have representative samples or complete copies of questions or tasks, directions, answer sheets, score reports, guidelines for interpretations and manuals. I talked to the Alberta Department of Education, or at least our staff did, and this seems to me to be the practice in Alberta where this kind of provincial standards assessment has been in place for a long time.
So I would like to know from the minister whether, in fact, those samples, complete copies of questions, task directions, answer sheets, score reports, guidelines for interpretations and manuals were distributed as part of, in this case let us say, the math exam. Were they distributed for other exams? What level of information is provided? Can the minister, for example, table the representative samples, the complete copies of questions, the task directions, answer sheets, score reports, guidelines for interpretations and manuals that were sent out to each school division in advance of the 1997 tests?
* (1740)
Mrs. McIntosh: Mr. Chairman, the series of questions are basically on one topic, but there were two. The member had indicated that in my previous answer she had asked why our public information was not that detailed, and I had provided that answer to that question. If I heard her question correctly, I really thought that she had said the information that we put out was not detailed, and she had asked why. I had responded that we gave the generic overview that it was up to the school divisions to provide the detail if they wished that released. So it was in that context that I was answering her question.
She had also asked the question about the performance of students with special needs and students from different backgrounds. That was a pretty specific question, and in providing the answer to it, I spoke to the circumstances under which such students might be provided special compensation or exemption. But, if she was looking for different information on that, on investigating the performance of students with special needs and students from different backgrounds, I can indicate that we do expect to receive some information back when the special needs review is complete that may assist us in a variety of ways with decisions on these types of students, but in the meantime the answer I provided was in response to a definite question that she had asked.
The member refers to correspondence from the Brandon University professors, who were provided information from the Brandon School Division and not from any other source. I say that in light of the fact that if you read the correspondence given to the Brandon professors from Brandon School Division, you may find some information not there. The Brandon professors provided an answer, which was to use old exams, et cetera. We do not have an abundance of old exams at this point in standards assessment, because the standards tests are relatively new. But that is not a bad idea.
As for the questions being done in class: if you are following the curriculum, then the chances are that students have already had a lot of questions in class of the type they would have seen on the exam. The professor, Grant Woods, at the University of Manitoba, who is a mathematics professor there, volunteered to me, and apparently we have correspondence from him that you might be interested in seeing--Brandon, of course, did not submit his perspective--Professor Woods volunteered to me when I was out at the university not long ago that the mathematics Senior 4 examination that was written in January was absolutely, exactly the kind of exam that would measure the type of material students needed to be able to take his courses at the university. I understand we now have a piece of correspondence from him reiterating that: that that is the type of exam that is required if students are to be able to perform at a level satisfactory for him to begin teaching at the university level at which he teaches.
The member did not ask any questions about the suitability of the exam. From that, I am assuming that the argument put forward by Brandon, that the exam was not curriculum-congruent or did not address relevant information, is now gone, because it is clearly known and understood that the exam was curriculum-congruent, that it did contain material that students who had taken the course should have known or should have been able to answer, had they absorbed and been able to apply the information. Those arguments are gone.
What the member is now asking us is: What material did we give these students in Brandon or anyplace else to enable them to know what was going to be on the exam in a generic sense? We did not give old exams, as I indicated, partly because we really do not have a lot of old exams at this point, although certainly, in the year before, we did provide samples of the kinds of questions that were going to be on the exam and distributed those from the department, and I think we correctly assumed that most divisions this year would use those same samples again or develop their own samples, as most divisions did, from the samples we had sent them.
Just to indicate what we did do: the exam difficulty and design, and I will repeat that, because I think it is important, the exam difficulty and the exam design were communicated to all schools in the fall of 1996. Along with that, they received the information that long-answer questions had the poorest results in the June '96, 40S mathematics examinations, and that students required more practice in answering long-answer questions. That is consistent with the press release that we had put out, because this was not just communicated to divisions; it was also a press release in the spring of '96. It was not just a press release, but it was reported in the media fairly widely. I was also on radio talking about this, on more than one radio station. So this was well publicized, sent directly to them, and publicized in the media as well. They also received notice that a significant portion of the marks, 60 percent, for the January '97 exam would be based on long-answer questions.
We provided this information to help schools predict more precisely the format and difficulty of the 1997 examination. We widely distributed this information in the fall of '96 and talked about it publicly, and I myself had discussions with school divisions on it throughout the fall term, in a generic way, you know, attending meetings of teachers and trustees, et cetera. In talking about exams, I would frequently say, for example, you received notice that problem solving was a problem and that we are asking you this fall to work harder on problem solving and trying to improve those techniques, because you are going to have more questions on it in the next exam. So, I mean, those kinds of comments were widely made throughout the fall.
The levels of difficulty for questions on the examination were determined and assigned by the mathematics 40S exam development committee and approved by the department. All 40S provincial examination questions were derived from the objectives of core, compulsory modules outlined in Mathematics 304, 301, 300, curriculum guide 1989. So they were not new in that sense. Each question on the mathematics 40S January 1997 provincial examination was matched to the mathematics 40S curriculum.
The review and screening procedures that were followed were, I think, pretty rigorous and designed to make sure that anybody who had learned the material in the curriculum, regardless of whether or not they prepared as indicated under No. 6, would be prepared for the exam. The mathematics exam questions were selected on the basis of the table of examination specifications related to the curriculum guide. Three examination forms were developed by the committee. An external review committee independently examined and modified the mathematics questions. All three forms of the mathematics examination were piloted in English and French throughout the province in 16 schools. We used about 600 students in the pilot. The pilot examinations were marked by the mathematics teachers, and individual student results were forwarded to the schools that participated in the pilot.
The SAS system was used to conduct an item analysis of the multiple choice questions and a distribution of marks for the long answers. The development committee reviewed the statistics and modified the questions in each form. They selected one of the forms for the first semester and a second form for the second semester. After the statistics for the first semester examination were available, the Mathematics Development Committee met to review the results, statistics and comments. It is only after all of that was done that, based on that information, the development committee modified the examination questions on the second form and recommended this exam.
* (1750)
So the process for preparing students was quite thorough. The letter that we sent to all teachers of mathematics 40S was sent by Norman Mayer from our department, who is here with us today from Assessment and Evaluation. It was sent out from the Assessment and Evaluation Unit at the beginning of October 1996, and it indicated, amongst other things: this is a reminder that the province-wide mathematics 40S three-hour examination for the first semester will be administered between the hours of this and this on this and this date, which was Wednesday, January 22, 1997.
An additional 15 minutes prior to the examination will be required to prepare and assist students in completing the information required in the answer sheets. You will be receiving a copy of the administration manual from supervising teachers before the examination date so that you may familiarize yourself with the detailed processes to follow before, during and after the examination. If you have not received the manual prior to the examination date, please check with your principal. You are reminded that this examination is based on the mathematics 40S curriculum and information contained in the examination specifications enclosed with this package.
So they had examination specifications included with the package and a reminder that it is based on the mathematics 40S curriculum.
It then went on to say: The examination consists of multiple-choice-select and written-response-supply questions in the approximate ratio of 40 percent to 60 percent, respectively. Experience has shown that students tend not to perform as well on the written-response sections of assessments in examinations as they do in the multiple-choice section. Students should be encouraged to answer written response questions--and that is underlined--in a concise and complete manner. Students may not use notes, textbooks or dictionaries during the exam. Nonprogrammable, nongraphing calculators are required to write this examination. Calculators must be checked at least two days in advance of the examination to ensure that the student's scientific calculator does not include programming or graphing features. The examination counts for 30 percent of each student's final grade, and marks will be reported to your school as a score out of 30.
Mr. Chairperson: The hour being five to six, committee rise.
Call in the Speaker.
Mr. Deputy Speaker (Marcel Laurendeau): The House will come to order.
Mr. Gerry McAlpine (Sturgeon Creek): Mr. Deputy Speaker, I would like to make some committee changes and rescind the committee change that was made in Public Utilities and Natural Resources earlier today.
I move, seconded by the honourable member for Pembina (Mr. Dyck), that we rescind the committee change made earlier today of the member for Arthur-Virden (Mr. Downey) for the member for Brandon West (Mr. McCrae).
Motion agreed to.
Mr. McAlpine: I move, seconded by the honourable member for Pembina (Mr. Dyck), that the composition of the Standing Committee of Public Utilities and Natural Resources be amended as follows: the honourable member for Emerson (Mr. Penner) for the honourable member for Gimli (Mr. Helwer); the honourable member for Morris (Mr. Pitura) for the honourable member for Turtle Mountain (Mr. Tweed); the honourable member for Arthur-Virden (Mr. Downey) for the honourable member for Pembina (Mr. Dyck).
Motion agreed to.
Mr. McAlpine: I move, Mr. Deputy Speaker, seconded by the honourable member for Pembina (Mr. Dyck), that the composition of the Standing Committee on Law Amendments be amended as follows: The honourable member for St. Vital (Mrs. Render) for the honourable member for Gladstone (Mr. Rocan).
Motion agreed to.
Mr. Deputy Speaker: The hour being six o'clock, this House is now adjourned and stands adjourned until 1:30 p.m. tomorrow (Tuesday).