I meant to post sooner after having attended but simply ran out of time. The Wilson Lecture caught me by surprise - all I really knew was that the lecturer was coming from MIT and was talking about technology in the developing world. In my mind, the combination of "MIT" and "technology" meant something super high tech and shiny. It didn't occur to me that technology could mean any kind of innovation, even if it involves turning rusty pieces of sheet metal into a tool to help prepare corn. Much of what Amy Smith, the lecturer, advocated was teaching people how to invent things for themselves. It took the old saying "Give a man a fish and you feed him for a day. Teach a man to fish and you feed him for a lifetime" a step further: if you teach that man how to teach himself, he can eat whatever he wants for a lifetime.
I thought this had interesting implications for computer science. Something that has always bothered me about computer science is how steep the learning curve is and how expensive the technology is. For example, last week my hard drive failed. I have only a basic idea of what a hard drive is and no idea of how it actually functions (I should probably take 240...). I spent several hours on the phone with a tech-savvy friend back home who walked me through diagnosing it and trying to recover it. I'm sad to report that it's been deemed a lost cause and I'm now looking at purchasing a $60 replacement. After investing a large amount of time and energy into creating a collection of digital data, I then lost all of it and now lack both the expertise and the equipment to recover it - and I know more about computer science than the average person.
The point I'm making is that computer science is inherently inaccessible. This kind of technology, which requires expensive parts and expertise, is never going to be meaningful in the day-to-day lives of the majority of people on the planet, and I'm not sure how I feel about that. I think this will change in the future, as technology becomes cheaper and people become more technologically literate, and I'm excited to see what happens when it does, because the lecture got me thinking about what people could invent if we had a whole world brainstorming ideas for programs and technologies.
I like the idea of a world where people could make their own computers by hand to fit their own needs, then write the specific programs they need. I don't know whether this is the direction the computer science world is headed in because, frankly, it doesn't sound very lucrative for the current industry, but I would like to think it is. This fantasy of mine ties in nicely with the TUIs. If people were educated in ways to make their own computers, they would surely move away from the monitor, mouse, keyboard, and GUI setup. Again, I don't know if this will ever happen, but I'd like to see computer science move in this direction.