In yesterday’s column (“The AIM Puzzle: The Problem of Identification”), we presented some of the concerns about the AIM identification process as laid out by people who spoke with the Vanguard, as well as what was laid out by Tobin White and Scott Carrell in their respective reports. However, some have argued that this is neither a full nor an accurate picture of the AIM identification process or of what is going on.
A key component missing from the analysis presented yesterday is the search and serve process. The Vanguard was referred to the GATE Master Plan, which unfortunately was last updated in 2008. Since that time, budget cuts have resulted in the loss of the GATE counselor who helped evaluate students for identification purposes.
The main instrument that the district uses is the OLSAT (Otis-Lennon School Ability Test), which attempts to assess “examinees’ ability to cope with school learning tasks, to suggest their possible placement for school learning functions, and to evaluate their achievement in relation to the talents they bring to school learning situations.”
If a student does not qualify based on the minimum OLSAT score at the 96th percentile, there is a set of specific criteria for triggering the Search and Serve process for re-screening. Yesterday, we wrote that the district is not “enforcing the rule that retesting for the TONI [Test of Nonverbal Intelligence] occur with students within five standard errors of the 96th percentile.”
But if you read the GATE Master Plan, you see that this is only one criterion for retesting. As was pointed out to the Vanguard, if you believe that the OLSAT is problematic for low-SES (socioeconomic status) kids and kids with other learning impairments, then you wouldn’t want to rely simply on near-misses to fill the program.
Instead, several assessments trigger the Search and Serve process. These include risk factors such as socioeconomic status, language, health, and designated special education, as well as work sample assessments and parent or teacher indicators of gifted characteristics.
The point was made to the Vanguard that one reason there is a higher rate of acceptance on retesting than in the general population is that, while the OLSAT is administered to every student, the TONI and other retesting methods are administered only to those students who are believed to have been missed by the OLSAT.
In other words, Tobin White noted that just 3 percent of students score at the 99th percentile through the OLSAT, while 28 percent do so through the TONI. He found that students administered the TONI were six times more likely to qualify than those taking only the OLSAT, and nine times more likely to score in the 99th percentile.
However, Tobin White may not be making an apples-to-apples comparison if the OLSAT is administered to everyone but the TONI is administered only to those believed to be strong candidates for the program.
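To make that selection effect concrete, here is a purely illustrative sketch. The population size, cutoff, noise level, and referral rule below are invented assumptions for illustration, not district figures or the district’s actual method; the point is only that giving a second test solely to students already flagged as likely high-ability near-misses yields a much higher pass rate on retesting even when the second test is no easier than the first.

```python
# Purely illustrative simulation with made-up numbers (not district data).
# Both "tests" measure the same latent ability with the same noise, so
# neither test is easier; only the referral rule differs.
import random

random.seed(0)
N = 10_000          # simulated student population (assumption)
CUTOFF = 0.96       # hypothetical qualification threshold (assumption)
NOISE = 0.05        # measurement noise shared by both tests (assumption)

def noisy_score(ability):
    """Observed score = latent ability plus test noise, clipped to [0, 1]."""
    return min(max(ability + random.gauss(0, NOISE), 0.0), 1.0)

abilities = [random.random() for _ in range(N)]      # latent ability
first_scores = [noisy_score(a) for a in abilities]   # everyone takes test 1

# Referral step: retest only students who missed the cutoff but look like
# strong candidates (high latent ability stands in here for whatever
# judgment a search-and-serve style referral would make).
referred = [a for a, s in zip(abilities, first_scores)
            if s < CUTOFF and a > 0.85]
second_scores = [noisy_score(a) for a in referred]   # referred take test 2

pass_rate_all = sum(s >= CUTOFF for s in first_scores) / N
pass_rate_retest = sum(s >= CUTOFF for s in second_scores) / max(len(referred), 1)
print(f"pass rate, whole population on test 1: {pass_rate_all:.1%}")
print(f"pass rate, referred students on test 2: {pass_rate_retest:.1%}")
```

On these invented numbers, the referred group passes at several times the population rate purely because of who was referred, not because the second test is easier, which is exactly the apples-to-apples caveat above.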
Under the Master Plan, the WISC (Wechsler Intelligence Scale for Children) Verbal is administered to those students “who appear highly verbal and who meet criteria (as laid out above).” The TONI “is administered to those students who appear nonverbally/spatially advanced, who have limited English, who benefit from more structured test environment without time constraints, and who meet the criteria (as laid out above).” The Structure of Intellect Learning Abilities Test (SOI-LA) or the Slosson Intelligence Test (SIT) “may be administered as an alternative for those students who have taken the above named tests within a 12 month period and who meet the criteria (as laid out above).”
Finally, “Administration of re-screening methods will be available in the student’s primary language if the primary language is English or Spanish; it may be available in other languages whenever appropriate for the student and feasible for the district.”
One person the Vanguard talked to noted that one year the TONI was replaced with the Slosson, but the results were identical. That person also told the Vanguard they were looking for ways to expand the options for testing and retesting.
Yesterday, we noted from the Tobin White report that 331 of the 492 retested through TONI “had no risk factors at all.” This number is in dispute, but the Vanguard would like to see documentation to verify or invalidate this claim.
The Master Plan is admittedly outdated; the Vanguard was told that the AIM Advisory Committee was precluded from updating it. However, at least based on the Master Plan, the claim that retesting is only for those who score within five standard errors of the qualification score appears false, and there appear to be good reasons not to limit it that way.
That being the case, the graphics presented show a changing GATE/AIM population over the last decade, but perhaps not a larger one.
We will also further examine the claim that the TONI is being misused outside of those with specific disabilities or risk factors.
The Vanguard’s purpose here is not to assert a stance on GATE/AIM, but rather to evaluate the competing claims and, as much as possible, to move from the realm of opinion into hard data that can be used to assess the critical issues facing the AIM program.
—David M. Greenwald reporting
What you end up with is a mixed ability group that requires differentiated instruction.
We don’t know how the students were selected for retesting. What we also don’t know is how many search and serve students who are identified end up in AIM classes.
The board could ask the GATE coordinator.
but the board majority thinks the gate coordinator is the problem.
“What you end up with is a mixed ability group that requires differentiated instruction.”
not necessarily, you’re buying into the argument that the olsat is objective rather than subjective.
“We don’t know how the students were selected for retesting. What we also don’t know is how many search and serve students who are identified end up in AIM classes.”
so as i read it, you’re advocating change when you don’t know the core issues here.
“The board could ask the GATE coordinator.”
Or, in the interests of transparency, the GATE coordinator could speak out publicly on this issue since we technically do not know whether or not the board has made such an enquiry.
Or, alternatively, isn’t this information public? One should be able to ask to see the data behind all these assertions and graphs.
I would ask the Vanguard to obtain the data behind these assertions/studies/graphs and post them as part of their analysis.
We’ve heard (read) from people who have served on the AIM committee that there didn’t seem to be much information or any explanation on how or why students are selected during this “search and serve” process for retesting. There is a long list of “risk factors” that could apply to a large number of children while leaving out others. One commenter questioned why a child with divorced or remarried parents would be considered to have a risk factor and be eligible for retesting, but not a child whose parents fight every night, for example. There has been public comment from teachers who don’t understand why one child is retested over another, or why one child is identified as GATE over another who is not, considering behavior and performance in their classroom. We’ve heard that there are low numbers of underrepresented minorities who enroll in AIM classes, even though identified as GATE, but we haven’t been able to get the numbers for the ethnicity of AIM students actually enrolled in AIM classrooms. The District may have the data, but as far as I can tell it is nowhere to be found on the District’s website.
It seems to me that the AIM Advisory Committee could keep a watchful eye on this data. We shouldn’t have this data available only through one staff member.
ryankelly: We’ve heard that there are low numbers of underrepresented minorities who enroll in AIM classes, even though identified as GATE, but we haven’t been able to get the numbers for the ethnicity of AIM students actually enrolled in AIM classrooms.
It may not be as current as you would like (only 2011), but here.
I looked at this but the data is divided up by school and doesn’t show the data by program, i.e. AIM.
My link didn’t record as I thought it would. It is also possible to get data districtwide. For 2011, n=842 for students enrolled in GATE in Davis JUSD, according to this site. It shows the breakdown by race/ethnicity.
GREAT, David. This terrific column really fills out your previous column but I have a couple of quibbles with that one. Math acceleration is not the main attraction for all the kids and all the “momster” parents who want AIM. The appeal was not to be held back while others who aren’t ready for the next step or next lesson struggle. It mortifies gifted kids to see the trouble their classmates have and the opposite is also true. Some children I have known can and do move ahead on their own, especially in math, but that doesn’t mean they don’t need instruction from their teacher. And they really do need peers. For one thing, in an AIM class you may be the best kid in one subject but there will always be someone better than you are in something else. If you don’t get that lesson, you can’t live well in the world.
Don’s response to the previous column is so right: GATE students are also humanists, and everyone needs high levels of literature, history and other subjects like political science, economics and public speaking to be managers who look at business and research with vision instead of just dollar signs. Time for foreign languages is necessary for us to be citizens of the world, even those of us who think we are the only whole world. For that, a thorough grounding in language arts and writing is essential before doing it in another language.
I am grateful that you have looked into the Tobin report and found it wanting. When the GATE Master Plan was updated in 2008, it was one of only three in the state updated for five years, indicating that the program was so outstanding statewide that it was good to go that long without that level of oversight. It should have been updated earlier than this (2013), but that wasn’t where the interest of the District staff lay. So the AIM AC, which was useful (and required) in getting the 2008 Master Plan update, was sidelined, and the coordinator, who was the one pushing for the new Master Plan, was over-managed. I think the staff used the lack of funding of the pittance the program cost the district for 20% or more of its charges as an excuse for not moving ahead, or were just waiting for the Board majority to change on this issue.
There’s been some discussion of the Carrell report as well, but I wish it had been discussed more fully. One at a time?
ryankelly: What you end up with is a mixed ability group that requires differentiated instruction.
Yes, but not in the sense that I think you mean. The AIM classrooms are already differentiated because every gifted kid isn’t gifted in everything to the same extent and AIM teachers have to and do differentiate within self-contained classrooms every day. That’s part of their GATE training — at their personal expense for a year-long certification program. So why add the children with very high ability to the children with average ability and over-achievers and expect one teacher to do this in a class of 30+ students? Someone has to look at the training for differentiation before this District kills off its teachers for the people who don’t know but just hate the program. The 20% currently served in AIM enable teachers to use differentiation in a much smaller (but still huge) range of individual needs. And the rest (what one critic calls the 70% whom he worries are left to wallow) need better differentiation but it’s not as if the AIM students don’t need it, it’s just that they mostly get it. It’s not something that the 10% who opt not to go into AIM or are shut out by the lottery always get from their teachers. (But a few figure out how to do it by themselves. When their parents do, it’s called enrichment)
We don’t know how the students were selected for retesting.
No, you don’t know.
What we also don’t know is how many search and serve students who are identified end up in AIM classes.
Again? Somebody knows. Ask her, if you can still find her. It’s not released information because, in many cases, there are so few search and serve students in any one class that information on racial identity, disability, or family situation would lead to a privacy violation. The reasons may be as simple as transportation problems, or the fear among parents of search and serve children that their child would shine better in a non-self-contained classroom, though there’s substantial evidence that African-American and Latino students are particularly successful in self-contained gifted classrooms. Under-achieving students often perk up when they’re finally with their peers. Kids with learning problems in one or some areas can be encouraged, sometimes for the first time, to apply their other high cognitive skills. People who think Da Vinci is the only place one can learn to deal with all sorts of people productively are wrong.
Tia, unless the new coordinator has already been hired (and I’d be “shocked, shocked” if so, and if there were anyone with the qualifications the old one had in the area of search and serve), what interest would the one who does know have in protecting the transparency the District did not offer her before kicking her arm crutches out from under her? And the new one will no doubt have “such a lot to do to get up to snuff” in the next two weeks!
https://davisvanguard.org/2015/08/board-receives-update-on-aim-in-advance-of-big-september-17-meeting/
“Tia, unless the new coordinator has already been hired “
I have purposefully made no comment whatsoever about the previous coordinator, because I have no knowledge of her besides what is posted here on the Vanguard. What I do know is that there is a strong contingent of GATE/AIM supporters who believe that she has been an invaluable resource. I also know that members of the board have stated that they cannot comment because this is a personnel issue. Then starts the second-guessing about motivation from the supporters. So I would like to add a different perspective, from someone who is neither for nor against this action, based on a lack of knowledge about this individual, but with a background in personnel issues as an assistant chief.
Recently there were some succession issues with our Sacramento group. There were some ardent supporters of one individual for promotion; however, what they did not know (and could not be informed of, based on personnel confidentiality issues) was that this individual did not qualify for the position because of a previous issue regarding the creation of a “hostile work environment” and an accusation of “harassment.” Another individual was selected, but the reason was never made known to the group at large, to protect the first candidate’s privacy.
There was a fair amount of speculation, and accusations of favoritism on the part of the administrative team, that were unwarranted but which managed to generate a lot of controversy and hard feelings. I remain committed to the idea that speculation and “guessing” about motivations is not a productive activity, while proposing positive alternatives in areas in which public input is appropriate is a much better strategy.
well said.
Madhavi Sunder requested and reviewed Deanne Quinn’s personnel file. None of the other board members did that. The administration recommended continuing Deanne’s contract. So while you have provided a useful general insight, I don’t consider it pertinent to this situation.
Don
You may be right. I simply do not have the specific information, nor will I ever, to know whether or not the principle involved might be relevant. There are, for example, other ways of acquiring knowledge. In our case, I did not review the personnel file in question because I had direct knowledge of the situation.
it seems to me a personnel file is not what you want anyway. what you need is data from the students and that’s not coming from a personnel file.
I think this article points out exactly why the board is doing what it is doing. We have just enough information to know there are red flags flying concerning the AIM program and the search and serve process, but not enough information to definitively argue the issues. You have a credibility problem when you have a selective program in which three fourths of the students fail to qualify (until retesting)…when your selection criterion is 96%, but you actually select to 91%…when a significant number of the students who qualified on retesting test far below even the 80th percentile, as low as 4%. But even if we had every detail, people would continue to disagree. I think the board realized this and, instead of engaging in tit for tat, directed the district to come back with a well-defined, clear program.
It’s amazing how much guessing we all have to do about the motivations, guiding principles, and intentions of the board majority, considering they are sitting right there with microphones and all.
Don
I have a feeling that I may be misunderstanding your point.
“It’s amazing how much guessing we all have to do about the motivations, guiding principles, and intentions of the board majority”
I am not sure why anyone feels that they have to “guess” about motivations. It seems almost a sport in our town to try to “guess” about motives or turn what someone has said into something else instead of taking them at their own words.
We have no words.
they said comparatively little
I guess that I see a decision to request more information and suggestions from staff as “words” until they are better prepared to make a decision. During my 10 years on our administrative team at work, we frequently had to defer decisions. It usually was not particularly useful for the members of the department not on the administrative team to be “guessing about the motivations…” prior to our being prepared, on the basis of all the information received, to make a decision. What was useful was for people to put forth their own constructive ideas on how best to proceed.
I see these as very similar situations. Am I missing something here about your interpretation ?
Absolutely agree with you, Tia. I think the board has been very clear…they made an evaluation, they concluded there were problems, and they directed the district to bring back its best fix for those problems (in September). This process began more than 6 months ago, and became part of the board meeting agenda at least 5 months ago. Whether you agree with what the board is doing or not, they have made the process very clear.
they made an evaluation perhaps on faulty data and assumptions if this article is correct – and no one has refuted it.
The numbers are what they are. When three fourths of those “selected” for a selective program fail to qualify, you have a problem. You can argue what the cause is, but you still have a problem. The board (wisely) made no assumptions, but asked the district to use their expertise to suggest a solution.
Here’s a podcast that describes an old strategy that would very likely get more students scoring on the OLSAT at GATE-identification levels, and have a more representative demographic mix: Early Lessons.