Hopefully by now you've at least built up a little more of an appreciation for what's going on inside your computer, even if you're still not really sure what it all means. I haven't said much about what programming is like in practice, but I hope I've hit on most of the big things that you can't get by googling "[language name] tutorial." To finish out the week I want to touch on some recent trends in programming and technology in general, what they mean, and how I feel about them. My hope is that you'll learn something, or at least have a better idea of what a programmer might think about things you're undoubtedly aware of (or it'll at least be an opportunity for me to vent a little about technology nowadays...).
The Cloud

This is one of my least favorite terms in technology today. The situation is straightforward enough: if you create services that let people do things wherever they want, you'll need their files, so you may as well store them instead of asking for them every time you need them. I don't mind the idea, and it's certainly useful for the consumer. This is obviously something that really took off, and companies figured out their own ways to handle the infrastructure costs (normally by building huge computers that essentially run simulations of dozens of other computers, which has its own complications). Maybe they show you advertisements, maybe they sell your data to other companies. That's business.
And I would be fine with all of this if they just did it and shut up about it (like every other website that has to handle user data). But instead, they tout "cloud storage" or "cloud synchronization" like they've just invented some amazing new technique. Unfortunately, this pattern perpetuates itself: the average consumer has now heard the term 500 times but doesn't know enough to simply assume that most new technologies will have it. Marketing is frustrating.
But that's not even what bothers me most. Cloud computing as an idea encompasses a great deal more than just data storage. Projects like folding@home use cloud technology in the true sense: distributed computing over a network. While most "cloud" services are really just simple data housing and sharing setups, folding@home has a goal and a true computing model. Folding@home simulates molecular interactions (primarily protein folding) by splitting up the work into pieces and sending them to people's computers. People then use the free folding@home program they've installed to perform complicated calculations with their computers' resources. Once they have finished the work, they send the new data back to the server and get assigned more. Noble intentions aside, I consider this a far more impressive and worthy-of-the-term example of cloud computing than most of what claims it today.
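That split-compute-return loop can be sketched in a few lines of Python. This is just a toy simulation of the idea, not folding@home's actual protocol: the function names and the stand-in "computation" are all made up, and the real system also handles networking, failed clients, and redundant work assignment.

```python
# Toy sketch of the folding@home-style work distribution model:
# a coordinator splits a big job into small work units, volunteer
# "clients" each compute one unit locally, and the coordinator
# combines the results. Names and computation are illustrative.

def split_into_work_units(data, unit_size):
    """Server side: break a large job into independent chunks."""
    return [data[i:i + unit_size] for i in range(0, len(data), unit_size)]

def client_compute(work_unit):
    """Client side: do the expensive computation locally.
    Here it's just a stand-in calculation (sum of squares)."""
    return sum(x * x for x in work_unit)

def coordinator(data, unit_size=4):
    """Hand out units, gather results, and combine them."""
    units = split_into_work_units(data, unit_size)
    # In reality these run on many volunteer machines in parallel;
    # here we just loop over them on one machine.
    results = [client_compute(u) for u in units]
    return sum(results)

print(coordinator(list(range(10))))  # same answer as doing it all in one place
```

The key property is that each work unit is independent, so it doesn't matter which computer handles it or in what order the results come back.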
Mobile

Tablets, smartphones, kitschy laptops, oh my. Mobile technology goes hand-in-hand with cloud storage (actually, it pretty much necessitates it). Maybe that's why I hate mobile too.
The pervasiveness of small, shitty mobile apps seems to have somehow convinced my generation that making "the next big thing" basically just requires 5 minutes of inspiration and a friend who's a programmer. But just like making a prototype for an invention or starting a business takes time, effort, and usually a loan or investor of some sort, getting your "app" made without coding it yourself is going to require you to hire at least one person (and programmers generally aren't all that cheap) and figure out how you're going to market and profit from your idea.
Some Other Thoughts
I'd like to close by listing off a few other things that I think about from time to time as a programmer:
- Programmers and other computer people do not know every program you use. If you know someone who is nice enough to help you with your computer issues, understand that they are probably just using general troubleshooting techniques (see this xkcd for an example)
- Things like Windows 8's Metro interface and Apple's App Store in OS X Lion are great examples of current trends (in these cases mobile) convincing companies to try to bridge all of their platforms. I think they are misunderstanding their users and, at least in Microsoft's case, not even really trying to make quality products (anymore) but instead just catering to what they perceive to be their average user
- In case you didn't know, there is an operating system called Linux, which is basically a free replacement for Windows or OS X used by programmers and other more technical people (which is not to say that programmers don't use the others or that non-technical people can't use Linux; it's just an available option that tends to be more involved but also more customizable)
- I very strongly believe that computer science is not only important enough to be taught in grade schools but should be given the same level of importance as math and the other sciences
- Learning to program is, in a way, like rewiring your brain to solve problems more efficiently. When I took my first CSE class it literally changed my life. What I learned in that class inspired me to learn more and eventually add a minor. Soon after that computer science was my major and everything else dropped a notch or two. It is an invaluable gift and a hobby that lasts a lifetime
- Many companies (especially the big ones) use Java or C# as their primary language (the two are pretty much the same, even if that's a somewhat controversial claim in some circles), at least when recruiting. Ultimately this is just risk mitigation: it maximizes the number of qualified candidates. But programming is not just about making something that works. It's about making something that works even when other things break, something that can be added to by someone else and reused for other purposes. These large companies get their work done, but at the cost of far too many projects spent redesigning or reinventing their existing systems
- There is a great deal of software available for free, often with the source code included. This lets programmers make changes and redistribute them to everyone else, creating a huge community of hackers (in the traditional sense, really just meaning something to the effect of freelance programmers) with an ever-growing corpus of code, all for the common good
- Legal issues aside, many companies are hesitant to use much of this free software, often due to a lack of official support. When Microsoft releases a new product or a new version of an old one, you can pay them a fee to help you if and when you have issues. With software developed by the greater hacker community, you're often stuck googling. Rather than going through the trouble of finding and hiring experts in this software, many companies go the route of just paying Microsoft or some other vendor both for the product and for the support
I absolutely love what I do, and I don't think I'll ever want to do anything else. What I do is still work, though, even when you account for how nerdy I am. But it's work that gives me so much to think about, so much to get better at, so much more to learn. It's its own world: a new way of thinking and looking at things, my primary source of analogy (which doesn't help me out with too many people, honestly). It's a world that feels so real and so concrete, even with all the abstractions and magic that happen in between. It's a world that is now a part of me. And I hope that, even if only in some very small way, I have helped open that world up to you.