This morning's news brings the exciting headlines Education Technology Isn't Helping and Study: No benefit going high-tech for math and science, prompted by a new study released today by the US Department of Education.

Duh - this is old news; there have been decades of research showing that drilling kids does nothing, even if you pretty it up with fancy names and graphics.

But our language for this stuff is so limited. The headlines SHOULD read, "Bad Educational Practice Proved Ineffective, Again!" But no, it gets called "educational software" or "educational technology", and immediately gets tied to EETT funding. The sweeping conclusion seems obvious, although the Washington Post gets it sort of right: Software's Benefits on Tests In Doubt: Study Says Tools Don't Raise....

OK, if I thought test scores mattered, I might care about that.

But here's what I care about.

Now, every time we talk about kids doing interesting stuff that involves a computer, we'll get hit with this. Podcasting, programming, blogging, collaboration, projects, kids making games, exploring virtual worlds, GIS, Google Earth? What are you thinking, haven't you heard? Educational Technology Doesn't Work.

Here's what's worse:
1. These publishers are getting off scot-free. Why is the USDOE not publishing the actual evaluations of the individual software products? Isn't this public information? Withholding them allows the publishers to hide behind the report and continue to claim that their individual studies are valid.

2. The apologists will shortly come out. "It's just bad implementation." "Teachers need more support." Does this make it better? C'mon, people, let's speak the truth and make meaningful distinctions between educational software that pretends to replace teachers and technology that gives students agency and supports a learning community.

Argh. I have to work harder.


Comment by Steve Hargadon on April 5, 2007 at 1:02pm
You are really eloquent and insightful on this topic. I'm really glad you posted this.
Comment by nlowell on April 5, 2007 at 1:31pm
I'm not convinced that "harder" is the right answer.

Working harder at stopping the tide with a teaspoon is only gonna get you wetter.

The system is the problem and it's being driven by politics and momentum.
Comment by Sylvia Martinez on April 5, 2007 at 2:59pm
Well... harder, smarter, louder... something! I refuse to believe that standardized testing and dumb practice are irreversible trends. Momentum can change, tides always turn.
Comment by Alice Mercer on April 5, 2007 at 3:50pm
Thank you Sylvia, can I share your response with my staff? We spent $$$ on a "standards"-aligned "quiz" program a year and a half ago based on a staff vote (one other teacher and I were the only ones who saw how useless it would be). I'd like to educate them on the futility of these programs to raise test scores (which was why that program was purchased).
Comment by Steve Hargadon on April 5, 2007 at 4:04pm
Wes Fryer just posted on this, too. I totally respect Wes, but I think, Sylvia, you did a better job addressing the issue. However, worth looking at:
Comment by Sylvia Martinez on April 5, 2007 at 4:06pm
Sure. And Wes Fryer just posted on his blog some links to research that makes better distinctions between the kinds of software and the pedagogy supporting their use.
Comment by Chris Lehmann on April 6, 2007 at 4:08pm
If I had to guess, I'm guessing that many of the packages "measured" were curriculum specific packages. As Sylvia mentions, let's see what happens if they were to measure schools that were using the tools for research, production, collaboration, communication and presentation.

As long as we use these tools to merely recreate the same pedagogy that created the factory / banking model of education, we *won't* see any change, not in test scores, not in engagement, not in learning.

We finally have the tools to let us achieve the promise of Dewey, let's start using them that way.

(And yes, I'm preaching to the choir.) :)
Comment by Sylvia Martinez on April 6, 2007 at 4:18pm
Yeah, you are right. I'm compiling a list as we speak from the full report. But I bet you could guess quite a bit of the list off the top of your head - big publishers, big products, big promises.

PS I'm a big fan!
Comment by Carolyn Foote on April 7, 2007 at 7:19am
I posted about this on my blog as well.

I was frustrated by how they buried the names of the companies in the article (last paragraph). And I agree, the "summary" is somewhat meaningless because perhaps LeapFrog's software did quite well, while Pearson's was completely ineffective. You can't really make any judgment about it without knowing more.

I believe the article said that the basis for the companies participating in the study was that the individual results wouldn't be released. That also seems questionable.

The whole "measurement" movement is based on so many faulty uses of statistical data in the first place that it shouldn't surprise me. It's just dismaying to know the distinctions and have it enter the mainstream news the way that it will, with such broad brush strokes.
Comment by Sylvia Martinez on April 7, 2007 at 9:47am
The articles were actually written a day before the study was released, so there must have been a press release announcing the results. The news reports were simply going from the press release, which had incomplete information. As willing as I am to condemn the press for missing the story, this was not their fault.

This page explains how the products were chosen. They go on to say that in year two they may single out the individual products. I'm not sure I see a complete promise to do that; it's worded sort of elliptically.

The study does list the names of the companies and software products. I'll put them in a new blog post, it's kind of long for a comment.
