Thursday, March 12, 2009

Math

I'd nearly forgotten this blog existed.

I have a lot of things going on, and a lot on my mind lately. But I came here to post a specific thought, and here it is:

It blows my mind sometimes how much work has been done before me. Take math, for example. All those formulas and theorems that I've spent hours trying to learn over the past few years: someone had to think of them. And then, once they'd thought of them, they had to prove they were right. I know this may seem like an obvious assertion, but it's just... staggering, if you really think about it. Especially considering that not all of this stuff is exactly straightforward.

It's not a coincidence that math classes build on each other, either. Each theorem is built upon previous theorems, and upon our understanding of them, following facts until they reach a conclusion. But if we didn't have that previous work to rely upon, we'd be nowhere. You can't do calculus if you don't know algebra. You can't do algebra if you can't do arithmetic.

What if we hadn't had that basis to build upon? What if, say, Newton never published his thoughts on calculus? Or Euclid never published his geometry? Where would we be today? I know most of the thoughts that these particular mathematicians presented weren't completely unique, and the ideas probably would have shown up eventually. But it's still an interesting thought.

Now let's look at computer science. Have you ever thought about how much code goes into making your computer work? Even if you ignore the hardware side of things, there's the bootloader, the BIOS, the operating system, the individual device drivers... we rely on all of these things to make our applications work, the same way we rely on arithmetic to make calculus work.
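
To make that concrete, here's a little Python sketch of what even a trivial file read leans on. The comments are my rough mental picture of the layers involved, not a precise trace:

```python
import os

path = "hello.txt"  # a throwaway file, just for illustration

# Even these innocent-looking lines ride on the whole stack beneath them.
with open(path, "w") as f:       # Python runtime -> C library -> operating system
    f.write("hello, world\n")    # buffered by the runtime before it ever nears a disk

with open(path) as f:
    data = f.read()              # read() eventually becomes a system call into the
                                 # kernel, which calls the filesystem code, which
                                 # calls the disk driver, which talks to the hardware

print(data.strip())              # even printing goes through the OS's terminal layer
os.remove(path)                  # cleanup, again via a system call
```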

Except computer science faces some issues that math doesn't. First off, computer scientists almost never prove that their code works. Doing so is extremely difficult and time-consuming, so pretty much the only people who bother are the ones writing code whose failure would cause significant human or financial loss. That means, basically, that no one has *proven* that anything on your computer works. We just know that it works under most testable circumstances, and that's usually good enough. Except when it isn't.
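
Here's a toy Python example of what "works under most testable circumstances" means. The function and the tests are made up for illustration, but the moral is real: a pile of passing tests is not a proof.

```python
import random

def mean(xs):
    """Average of a list of numbers."""
    return sum(xs) / len(xs)

# Thousands of randomized tests, checking that the mean always lands
# between the smallest and largest element. They all pass.
for _ in range(5000):
    xs = [random.randint(-1000, 1000) for _ in range(random.randint(1, 50))]
    assert min(xs) <= mean(xs) <= max(xs)
print("5000 tests passed")

# But the tests only ever generated non-empty lists. The one case they
# never tried breaks the function anyway:
try:
    mean([])
except ZeroDivisionError:
    print("mean([]) crashes -- the tests showed nothing about this case")
```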

Second, programmers face a lot of issues with licensing and copyright law. Remember when I asked you to imagine that Newton had never done his work on calculus? Well, what if you knew he had done it, but you weren't allowed to use his work in your own proofs? You'd have to reinvent Newton's theorems just to prove your own. Sounds pretty ridiculous, but that's basically what's going on in the software world right now. And that's why open source software is so important. We need that background, those blocks to build upon. We can't really make progress unless we've seen what came before.

I think this is getting a little off topic now, but the other thing that's been on my mind lately is how much computer science is being used in other disciplines. I think math people are used to seeing math used everywhere, and to an extent we're all used to seeing computers everywhere. But it's neat to think that just 40 years or so ago, mathematicians (and college students) didn't have calculators. If they wanted the value of sin or cos at some non-standard angle, they had to look it up in a table. Or think about the largest prime number found to date: it was found using distributed computing, with one enormous computation split across thousands of volunteers' machines.
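
For the curious: the record-setting primes of recent years are Mersenne primes, numbers of the form 2^p - 1, found by the GIMPS distributed computing project. The core of the test each volunteer machine runs is surprisingly short. Here's a minimal Python sketch; the real project runs it on numbers millions of digits long, with far more optimized arithmetic:

```python
def lucas_lehmer(p):
    """Lucas-Lehmer test: is the Mersenne number 2**p - 1 prime?
    Assumes p is an odd prime (p = 2 is a special case: 2**2 - 1 = 3 is prime)."""
    m = 2**p - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

# Which small exponents give Mersenne primes?
print([p for p in (3, 5, 7, 11, 13, 17, 19, 23) if lucas_lehmer(p)])
# -> [3, 5, 7, 13, 17, 19]   (2**11 - 1 = 2047 = 23 * 89, so 11 drops out)
```

The "distributed" part is simply that each participating machine takes different candidate exponents and grinds through that loop on its own.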

I don't understand why there are so few girls in computer science, when its applications are so diverse! People think that if you're a computer science major, you're going to end up in a cubicle somewhere, churning out code and rarely interacting with another human being. There are lots of jobs out there like that, but there are so many more possibilities! The internet put the world at our fingertips, and now we get the chance to shape it.