Chuck Thacker is one of the ‘grey beards’ of the computer industry, having performed some splendid engineering and research at both Xerox PARC and Digital’s Systems Research Center in Palo Alto. CNET has an interview with him here.
A couple of pieces of the interview struck me:
I’d like to get your thoughts on the quality of computer scientists coming up through the ranks these days.
Thacker: I am actually quite disturbed by this trend. I worked on a project to try to figure out how to actually use computers more effectively in education in the lower grades because a lot of people now work on improving university-level education and maybe high school. That’s not where the problem is. The problem is in the very first exposure of a kid to education. It’s a hard slog because the education market is worse than the medical market in terms of fragmentation.
I wonder what the fragmentation is? Does he mean in terms of educational approaches? Hardware suppliers? Software suppliers? Too many points of contact for purchasers in the education market?
As a computer scientist, what do you see as the next big challenge, the next big hurdle for computer science?
Thacker:
So we need to look around and see what to do…The tradeoff is that it looks like the manufacturers are going to want to increase the number of processors on one silicon chip rather than increasing the complexity of a single processor. The problem with that is that we don’t know how to program it. We just aren’t very good at concurrence. One of the things that I say to my academic friends is that–to some extent–this is your fault because we hire a lot of computer science graduates with bachelor’s degrees. They have not learned anything about parallel programming.
The challenge of parallelism is undoubtedly one of the biggest hurdles the software industry needs to face. At university I worked with Transputers and Occam, but that approach didn’t succeed, partly because it was still too complicated (and partly because of the general failings of British technology companies at the time). At Be we had a much lower processor count, but we still struggled to explain multi-threaded programming and concurrency to developers. Now, with multi-core CPUs, the problem is becoming even more pressing, and it’s certainly something we’re challenged by at my current employer.
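To make the difficulty concrete, here is a minimal sketch of the classic trap, written in Go (whose channels descend from the same CSP model that Occam was built on). The first counter shares a variable across workers with no synchronization and loses updates; the second gives each worker private state and merges results over a channel. The names and counts are purely illustrative.

```go
package main

import (
	"fmt"
	"sync"
)

// brokenCounter increments a shared integer from 100 goroutines with no
// synchronization. The increment is a read-modify-write, so updates are
// lost and the result is usually well under the expected 100000.
func brokenCounter() int {
	counter := 0
	var wg sync.WaitGroup
	for i := 0; i < 100; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := 0; j < 1000; j++ {
				counter++ // data race: unsynchronized shared write
			}
		}()
	}
	wg.Wait()
	return counter
}

// channelCounter gives each worker a private count and merges the results
// over a channel, so no memory is shared between workers. This is the
// CSP style that Occam used and that Go's channels inherit.
func channelCounter() int {
	results := make(chan int, 100)
	for i := 0; i < 100; i++ {
		go func() {
			local := 0
			for j := 0; j < 1000; j++ {
				local++
			}
			results <- local // communicate instead of sharing memory
		}()
	}
	total := 0
	for i := 0; i < 100; i++ {
		total += <-results
	}
	return total
}

func main() {
	fmt.Println("racy total:   ", brokenCounter())  // nondeterministic
	fmt.Println("channel total:", channelCounter()) // always 100000
}
```

The subtlety is that the broken version often works fine in testing and only fails under load, which is exactly why handing this kind of code to graduates with no parallel-programming background goes so badly.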
There needs to be continued research into concurrency, with a genuine focus both on the abstract science and on the practicalities of using the technology in today’s software engineering projects. It may well be that Google has the most to gain from this and can in turn devote considerable (hopefully public) research to the area. Google’s data centers now contain many thousands of CPU cores all essentially working in parallel; admittedly on the relatively constrained problem of search, but it’s a beginning.
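Search is a good illustration of why that parallelism is tractable: the work splits into independent shards whose results are cheap to merge. Here is a toy fan-out/fan-in sketch, again in Go; the corpus, shard count, and query are invented for illustration, and this is in no way a description of Google’s actual infrastructure.

```go
package main

import (
	"fmt"
	"strings"
	"sync"
)

// searchShard scans one partition of the corpus for the query and sends
// any hits to the results channel. Each shard is independent, which is
// what lets search scale across cores on one machine or across machines
// in a data center.
func searchShard(shard []string, query string, results chan<- string, wg *sync.WaitGroup) {
	defer wg.Done()
	for _, doc := range shard {
		if strings.Contains(doc, query) {
			results <- doc
		}
	}
}

func main() {
	corpus := []string{
		"the quick brown fox",
		"concurrency is not parallelism",
		"multi-core CPUs are here",
		"parallel search scales out",
	}

	const shards = 2 // one worker per shard; scale this up with the corpus
	results := make(chan string, len(corpus))
	var wg sync.WaitGroup

	// Fan out: split the corpus and hand each piece to its own worker.
	chunk := (len(corpus) + shards - 1) / shards
	for start := 0; start < len(corpus); start += chunk {
		end := start + chunk
		if end > len(corpus) {
			end = len(corpus)
		}
		wg.Add(1)
		go searchShard(corpus[start:end], "parallel", results, &wg)
	}

	// Close the results channel once every worker is done, so the
	// fan-in loop below terminates.
	go func() {
		wg.Wait()
		close(results)
	}()

	// Fan in: merge hits from all shards.
	for doc := range results {
		fmt.Println("hit:", doc)
	}
}
```

The merge step here is trivial (just print the hits); a real engine would rank and deduplicate, but the shape of the computation, an independent map followed by a small reduce, is the same.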