We finally have everything in our project done!
Ruth and I spent most of Saturday evening in the lab again, debugging the suit and filming the video (with the help of Ruth's AWESOME friend). The Phidget code had gotten messed up, so we still had to debug and clean that up, and then various connectivity and data issues kept coming up (the accelerometer would often think it was horizontal when it wasn't, for example). We ended up getting it all to work for the video, though.
Since Kat hadn't been able to make this work session, she offered to splice the video together and get our website up, so that should be done now, too.
Now that we're done, I have time to reflect on what I've gotten out of doing this project. What I think I've learned most is that hardware is quite annoying to work with, but also quite rewarding to figure out. Every step of the way SOMETHING wasn't working - whether it was the liquid level sensor giving gibberish feedback, the Wii nunchuck sending output at a different rate, sensor wires that wouldn't stay connected, Arduino boards that weren't getting enough power, or an LED light that had come ungrounded - something or other didn't want to function. However, the moment of figuring out why something isn't working is really rewarding. It means that all the energy you put into figuring it out wasn't for nothing.
All of this came with the territory of working with data generated in the real world. Real-world data (in this sense) is pretty much unique to TUIs and comes with its own host of problems. It's not consistent and is subject to being manipulated by things like gravity and short-term movements. We continuously had to be conscious of the fact that data was only meaningful if it was true for more than the split second during which the sensor happened to collect it.
Lastly, I learned that I need to brush up on my coding skills. There were many times when I thought "I used to know how to do this - and now I can't remember!" It was really frustrating and a situation I don't really want to be in again. I hope to brush up on my coding over break.
As an ending thought, I think this project (and this class in general) has inspired me to think about how I can bring computing into the real world. To me, this represents a shift in how I was thinking about the course at the beginning of the semester (when we started this project). When I was first introduced to this topic, I thought about it in terms of bringing the real world into computing, or how the real world can enhance the experience of using the computer. While I think that this is important to keep in mind, I think it's fundamentally different from how I'm thinking about the subject now. Now I tend to think about how computers can be used to enhance the real world. Computers are becoming increasingly powerful and subtle. The possibilities of what they can do are becoming less and less limited. I think it's far more inspiring to think about how these machines can fit into the rich and vibrant world we live in than to try to fit our world into traditional ideas about machinery.
Breaking away from the keyboard and mouse...
exploring tangible user interfaces at Wellesley College
Tuesday, December 13, 2011
Finishing the suit
A lot of things ended up coming together at the last moment.
Kat had ordered the suit, and we had almost all the sensors, so we knew what we were working with, but because of a miscommunication, we didn't have an accelerometer. Consuelo said that it would actually probably work if we hacked the accelerometer in a Wii nunchuck and offered one for us to use (which we REALLY appreciated).
Ruth and I decided that the project probably wouldn't be done on time unless we spent most of the day Saturday working on it. Kat was busy so she couldn't make it, except to drop by a couple of times.
We ended up getting a lot done. Ruth worked on hooking up the light sensor, vibration actuator, and the LEDs, while I figured out how to hack the nunchuck. Hacking the nunchuck proved to be simpler than I thought. After extensive internet research (the most difficult part of the process), I found out that it was really just a matter of stripping the main cord, which would reveal four inner wires: one was input (power), one was ground, and two were output. Hooking them up was pretty easy, too.
After stripping the main cord
Wii nunchuck connected to Arduino board
The really hard part actually came after that. The Wii nunchuck was giving us data which appeared to respond to movement, but it came through as gibberish. After several hours of troubleshooting, I figured out that the Wii nunchuck sends its output at a different rate than the one we had been using. All we had to do was change the rate at which the serial output was read.
We still kept running into silly hardware problems for the rest of the evening. Ruth had figured out how to wire in our button, vibration actuator, LEDs, and light sensor while I had been troubleshooting the Wii nunchuck, but we still had to figure out the thresholds for the new liquid level sensor and program for the accelerometer in the Wii. Ruth decided to take on the liquid level sensor, while I started to make sense of the Wii data. The problem I kept running into was that the computer would stop recognizing the Arduino board after just a couple of minutes - not enough time to gather any meaningful data. I kept trying different things - reinstalling drivers, re-plugging the USB cable, restarting the Arduino software - but the problem persisted. I finally gave up at 3 in the morning after hours of troubleshooting.
Luckily, Ruth's boyfriend ended up joining her in the lab to keep her company after I left and discovered that it was just a matter of the Arduino board not getting enough power. Within an hour, Ruth was able to figure out the Wii's data.
But, with the presentation on Tuesday, we still had a lot to do. The three of us ended up spending most of Monday night wrapping things up. Kat had meant to write the Phidget program to record when people enter the dining hall but, perhaps because of a misunderstanding, had only found sample code that printed out information when a tag was within range of a reader, and had changed the message to let the person holding the tag know they were in the dining hall. The problem with this was that the computer had no memory of the tag once it was out of range (about three inches from the reader), and we needed something that knew when you were in the dining hall and when you had left - even if you weren't within range of the reader.
I ended up spending the better part of Monday night trying to code that, which turned out to be much more complicated than anticipated. Unfortunately, I had forgotten most of the details of what I had learned in CS230, so my night involved a lot of looking things up to refresh my memory. The sample code that Kat found ended up being a really good basis for what I needed to write, since all the event handlers were already dealt with - all I had to do was modify the event handler that managed what happened when an RFID tag was within range of the reader. What I ended up doing was creating a new class that represented a tag passing the reader. When a tag passed the reader, the program would create a new instance of the class, then store it in an array. First, however, it would check to see if an object with the same RFID tag number was already in the array. If it was, then the program would know that the person with that tag was LEAVING the dining hall, and could remove the object from the array. If the tag wasn't already in the array, the program would add it.
The program would also beep when someone entered the dining hall, alerting people that a new person had arrived, and it kept a count of how many people were in the dining hall, printing the total whenever someone arrived so they would know how many friends they could expect to find.
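The entry/exit logic above can be sketched in a few lines. This is a plain Python stand-in for the Phidget event-handler code, not our actual program; the class and method names (`DiningHallTracker`, `tag_passed`) are my own:

```python
class DiningHallTracker:
    """Toggle a tag's presence each time it passes the RFID reader."""

    def __init__(self):
        self.present = []  # tag numbers currently inside the dining hall

    def tag_passed(self, tag):
        if tag in self.present:
            # Second pass by the reader: this person is leaving.
            self.present.remove(tag)
            return "left"
        # First pass: this person is arriving -- this is where we'd beep
        # and print the current head count.
        self.present.append(tag)
        return "arrived ({} inside)".format(len(self.present))
```

For example, the same tag passing the reader twice registers first as an arrival and then as a departure, which is exactly the "no memory" problem the sample code couldn't handle on its own.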
If there were a way to communicate between the suit and the Phidgets, the Phidget software could easily keep track of how many times a tag passes an RFID reader, and the suit could use that data to give the user meaningful feedback.
We also got to solder some things together. Most of the sensors and actuators were intended for the LilyPad, so Kat and I had to solder wires to the sensors to make the connections more secure. I had taken a jewelry-making course a couple of years ago and had quite a bit of experience using a soldering iron, so I ended up doing most of the soldering.
Meanwhile, Ruth finalized a lot of the sensor thresholds and got everything working well in conjunction, and Kat made a box to hold everything.
The next day we still had quite a bit to do before we were ready for our presentation: everything still had to be installed in the suit! Because of another miscommunication, Kat had thought we didn't need super long wires connecting the many inputs and outputs to the board, since we were using a different strategy. The result was that we had soldered shorter wires to the sensors and actuators. In reality, we really needed those longer wires to reach the various parts of the suit. We frantically twisted wires together and pushed them into the various parts of the suit, stuffing everything into the box that would contain it all (which ended up being a smidge too short, but worked out). We managed to get everything up and running in time to run through our presentation and deliver it on time!
By the way, I got another "what's it like to be in a class with so many men" comment on the way to NY for Thanksgiving.
Monday, December 12, 2011
Modified direction
After our presentation on November 4th, we had to adjust the direction of our project a little bit. Since we had the water bottle aspect 90% figured out and just needed a sensor, Orit agreed to help us get a new sensor. We still had to implement the exercise monitor, sleep monitor, food intake monitor and social monitor, though.
After talking to Orit, we decided not to use a heart rate sensor. It was a lot of money for the lab to spend on risky technology, and there were definitely ways around what we were trying to achieve with the heart sensor. A simple accelerometer could do most of what we wanted: it could register types of movement to see if you're running, and could detect which direction you're facing to see if you're lying down. This data, in conjunction with light levels (we'd already ordered a light level sensor), could let us know if the user is sleeping.
We also decided to use Phidgets to deal with the food intake and social aspects. In an ideal scenario, we'd be able to install a small, powerful RFID reader in every suit, able to register if a friend is nearby. Since these are usually quite expensive, we decided to work the social aspect into the food intake aspect of the project. In our scenario, each dining hall will be outfitted with an RFID reader. When someone enters the dining hall, a beep will go off, letting people within the dining hall know that a new friend has arrived. Ideally, we would also have some sort of communication between the Phidgets installed in the dining hall and the suits, so the suit could receive data about how frequently the user has been to the dining hall. This, again, uses expensive and inconsistent technology, so we've decided not to implement this aspect.
As an afterthought, we've also decided to install a button in the suit, allowing the user to turn the feedback off so that it doesn't communicate too much information about their lifestyle in potentially embarrassing situations, like class.
We still have a lot of coding to do, but it's mostly all coming together.
Getting the water bottle to work
One weekend, Ruth and I spent pretty much all Sunday trying to implement the water bottle part of our suit. We'd agreed on getting a pouch-style water bottle several days earlier, and I had gone to get one that looks pretty much like this. The idea is that we would connect it to a tube, which would let you keep it attached to you while you drink water.
While this was a great idea, a slight problem occurred to me on Saturday night: this kind of water bottle works through suction - the complete lack of air within the bottle is what allows the water to reach the tube at the top. Because of this, the water level within the pouch can't ever actually decrease, making it incompatible with our water level sensor. Even if we turned it upside down, there would still be no way for air to get in and allow the water level to go down. What we needed was a pouch (so we could install the liquid level sensor) that could connect to a tube, let air in, and stay watertight.
What we found was that this didn't exist. However, I managed to rig something together during a frantic second trip to REI. Luckily Nalgene makes a pouch water bottle, as well. While the lids don't let air in, they are the same size as Camelbak water bottle lids, which DO let air in and suck water up via a straw. So what we ended up doing was getting a Nalgene water pouch, using a Camelbak lid, and then connecting it with the Platypus tube we had from before.
Meanwhile, Ruth began to actually experiment with how to set up an Arduino board and breadboard. Basically, the breadboard lets you extend the input and output pins on the Arduino board itself, giving you space to use things like resistors (which decrease the current) and the several different kinds of input and output that we'll eventually need.
On Monday evening we went back to the lab to actually program for it. We needed a way to continuously keep track of someone drinking water, while also making sure that they'd drunk enough in the last 24 hours. I had come up with an implementation (maybe a little complicated) that involved creating a new class describing every cup of water drunk. When the water level had decreased by a cup, a new waterCup object would be created with a time stamp of when it was made, and then stored in an array. The array would have to contain at least 8 waterCup objects (people are supposed to drink 8 cups of water a day), or a pink light would turn on telling the user to drink more water. The main method would continuously check the array, removing any waterCup objects past the 24-hour mark and making sure at least 8 remained.
This, however, was going to be very difficult to code, and I hadn't yet figured out how to deal with the fact that people might refill the water bottle mid-cup. While Ruth and I were talking the problem through, she came up with a really elegant solution. The program keeps track of the total amount you drink in a variable. Every five minutes, the program tests the water level to see if it has changed since the last check. If it has changed in a negative direction (if the water level has decreased), it adds that amount to the total you've drunk. Every six hours, the program subtracts 2 cups from that total. If the variable representing the total amount drunk sinks below an arbitrary number (I think we ended up going with 0), the program turns a pink light on and the blue light off, letting the user know they need to drink more water. This keeps the user pacing their water intake throughout the day, and also allows them to bank water intake (since, for instance, you can't drink water while you're sleeping).
We needed to come up with a way of making sure that a change in water level wasn't the result of a momentary shift in posture or something of the sort, so I suggested we also test the water level a couple of seconds after the initial test, to verify the initial reading.
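Ruth's pacing scheme can be sketched in a few lines. This is plain Python standing in for the Arduino code, and the function names and units are mine, not ours from the time:

```python
CUP = 1.0  # one cup, in whatever units the sensor reading maps to

def update_balance(balance, previous_level, current_level):
    """Five-minute check: credit any decrease in the water level."""
    if current_level < previous_level:
        balance += previous_level - current_level
    # Increases (a refill) are deliberately ignored.
    return balance

def six_hour_tick(balance):
    """Every six hours, two cups of intake come due."""
    return balance - 2 * CUP

def needs_water(balance, threshold=0.0):
    """Pink light on when the running balance sinks below the threshold."""
    return balance < threshold
```

The nice property is that refills fall out for free: a level *increase* simply isn't credited, so only actual drinking moves the balance up, while the six-hour debit keeps pressure on the user to pace their intake.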
While Ruth and I worked on implementing the code, Kat came in and attached the liquid level sensor to the interior of the water bottle. While Ruth continued to add details to the code, I began testing the thresholds for the water level, trying to figure out what counted as a cup, where the bottom of the water bottle was, and how sensitive the sensor was. The sensor was giving fairly reliable feedback, but did not seem to acknowledge the bottom quarter or so of the water bottle, giving it all the same reading.
As I was trying to figure out why it was doing this, the liquid level sensor stopped working entirely, giving us gibberish. We still don't know why it did this, but suspect that a membrane which reads the air pressure at the top of the sensor became damaged in some way. We hooked our board up to a flex sensor, and called it a night, since there was very little we could do about the sensor.
Saturday, December 10, 2011
Direction for Project
We made a lot of progress on our project, as far as actually knowing what we're doing. While we were initially thinking of using Phidgets, and even went so far as to order a sensor, we ended up deciding that it would be a bad idea for a couple of reasons. First, from our research it looks as though Phidgets have to be connected to a computer to work, or use a complicated wifi set-up, which would add a whole other layer of work to our project. Second, Phidgets has a somewhat limited selection of sensors. Especially since we were planning on using a heart sensor, it seemed like Arduino might be a better choice. Arduino is much more flexible as far as connecting to a wide range of sensors goes, and can be programmed to function without a computer.
After deciding to use Arduino, Ruth and I did quite a bit of research about the Arduino LilyPad. What we eventually decided, however, after consulting with Consuelo and Orit, was that the LilyPad, though beautiful and easy to disguise within fabric, is not really powerful enough for our project. There aren't enough input/output options, the power would be limited, and the connections between different elements would be unstable. Instead, we are using a normal Arduino board, of which the lab has two.
Making this decision allowed us to start thinking about specifics as far as our project goes. We looked into different sensors and figured out which ones would work for our project. The task we've been charged with tackling first is the water drinking task, since it will probably be the hardest to implement. While we were working with the Phidgets, we thought that measuring the changing weight of the water bottle might be a good way to record how much water someone is drinking, but there are a lot of ways this could easily become complicated. What if the weight of the water bottle isn't on the weight sensor? How do we build a structure that includes the weight sensor in a body suit? We were basically all a little nervous about how this would work.
When we decided to switch to Arduino, we found out that there is actually a liquid level sensor available for it. It gives input in a range from around 150 to 1000, corresponding to where the top of the water hits the sensor. Installing this inside a water bottle will let us record the exact level of water in the bottle at any given moment. We can then program for when the water level decreases or increases, and what that means for someone's water intake throughout the day.
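Turning that raw 150-1000 range into something meaningful is just a calibration and a linear map. A hedged sketch of the idea, where the endpoint constants and the bottle capacity are illustrative placeholders rather than our actual calibrated values:

```python
SENSOR_EMPTY = 150   # raw reading with no water on the sensor (illustrative)
SENSOR_FULL = 1000   # raw reading with the bottle full (illustrative)
BOTTLE_CUPS = 4      # capacity of the pouch in cups (assumed)

def reading_to_cups(reading):
    """Linearly map a raw sensor reading to cups of water remaining."""
    # Clamp out-of-range noise to the calibrated endpoints.
    reading = max(SENSOR_EMPTY, min(SENSOR_FULL, reading))
    fraction = (reading - SENSOR_EMPTY) / (SENSOR_FULL - SENSOR_EMPTY)
    return fraction * BOTTLE_CUPS
```

With a mapping like this in place, "the level dropped by a cup" becomes a simple comparison between two converted readings.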
Friday, October 21, 2011
Women in Computer Science
I'm going to write this post while it's fresh in my head.
Last night I watched a screening of the movie Miss Representation. The film presents the popular media's portrayal of women as a backlash against the progress women have made in society over the last century, then explains how this portrayal is not just negative, but also very dangerous for all the freedoms women have recently gained. The movie discussed the idea that women still take second stage in popular media. Whether it's being the ditsy sidekick to the serious male news anchor, or the lovesick woman looking for a man in the latest romantic comedy, women are presented as existing for men. This hurts women in several ways. At a political level, it limits the way we listen to female politicians. When the media focuses on a politician's appearance instead of the content of what she is saying, it's hard to respect her. This leads to fewer women in politics. At a day-to-day level, this kind of media portrayal tells women that, above all, we are valued by what we have to offer men. This is, again, quite limiting, and it leads to young women failing to pursue important careers and ambitions because, whatever they are, they aren't as important as appealing to men. It then becomes cyclical - with fewer women in important, powerful positions, there are fewer voices to advocate for women in the world at large.
Though the movie focused on politics, since that part of society has a unique ability to shape how empowered certain groups will be in the future, I really believe its argument applies to any male-dominated field. Over the summer I played in an Ultimate tournament with a team I didn't know too well and got a ride to the fields with a male student from Carleton College named Alex. It was a long trip, and we ended up talking a lot about school. Early on in the conversation I mentioned that I had taken several classes in the computer science department and was hoping to minor in it. He reacted by asking me if it was difficult being in classes with so many men, before remembering that I went to a women's college.
My only experiences with the computer science field have been at Wellesley where, obviously, my peers are mostly female. To me, a computer science student is a woman, just because that's what I've been exposed to. But Alex's comment reminded me that the rest of the world still imagines a computer science student as male and, furthermore, imagines computer science as a masculine field. I often wonder what it will be like to leave Wellesley with a background in computer science and enter a world where people say "Wow! What's it like to be part of such a male-dominated field?"
That being said, what I think Miss Representation inspired me to do, even more than before, is to deliberately choose to work in male-dominated fields. Computer science is fascinating and interesting to work on, and I'm in a fairly lucky position where I can take classes without having to question whether I, as a woman, should study something more "feminine." I think it's important that every woman have the opportunity to study what she finds fascinating, and I hope that by studying computer science, I'm helping the world get just a little bit closer to achieving that goal.
Wilson Lecture Response
I meant to post sooner after attending but simply ran out of time. The Wilson Lecture caught me by surprise - all I really knew was that the lecturer was coming from MIT and would be talking about technology in the developing world. In my mind, the combination of "MIT" and "technology" meant something super high-tech and shiny. It didn't occur to me that technology could mean any kind of innovation, even rusty pieces of sheet metal made into a tool for preparing corn. Much of what Amy Smith, the lecturer, advocated was teaching people how to invent things for themselves. It took the old saying "Give a man a fish and you feed him for a day; teach a man to fish and you feed him for a lifetime" a step further: if you teach that man how to teach himself, he can eat whatever he wants for a lifetime.
I thought this had interesting implications for computer science. Something that has always bothered me about computer science is how steep the learning curve is and how expensive the technology is. For example, last week my hard drive failed. I have only a basic idea of what a hard drive is and no idea how it actually functions (I should probably take 240...). I spent several hours on the phone with a tech-savvy friend back home who walked me through diagnosing it and trying to recover it. I'm sad to report that it's been deemed a lost cause, and I'm now looking at purchasing a $60 replacement. After investing a large amount of time and energy into creating a collection of digital data, I lost all of it and now lack both the expertise and the equipment to recover it - and I know more about computer science than the average person.
The point I'm making is that computer science is inherently inaccessible. This kind of technology, which requires expensive parts and expertise, is never going to be meaningful in the day-to-day lives of the majority of people on the planet, and I'm not sure how I feel about that. I think this will change in the future, as technology becomes cheaper and people become more technologically literate, and I'm excited to see what happens when it does, because the lecture got me thinking about what people could invent if we had a whole world brainstorming ideas for programs and technologies.
I like the idea of a world where people could make their own computers by hand to fit their own needs, then write the specific programs they need. I don't know whether this is the direction the computer science world is headed in because, frankly, it doesn't sound very lucrative for the current industry, but I would like to think it is. This fantasy of mine ties in nicely with TUIs. If people were educated in ways to make their own computers, they would surely move away from the monitor, mouse, keyboard, and GUI set-up. Again, I don't know if this will ever happen, but I'd like to see computer science move in this direction.