Coding in College vs. The Real World
Before student loans began eating graduates alive, college students mostly worried about choosing a major enjoyable enough to turn into a fulfilling career. As computer programming grew into a lucrative career path, many students migrated to technical degrees, hoping to get a leg up on coders without formal education. When the economy crashed and took the job market with it, students like me chose technical degrees because graduates in those fields had the most job opportunities. With STEM degrees only growing in popularity, it’s important for both graduates and employers to understand the differences between college-taught and experience-taught programming.
Starting at the Bottom vs. Ain’t Nobody Got Time for That
In a college classroom, I had a whole semester to learn whatever language the class focused on. My professors spent time teaching us the most basic things — basic as in 15 minutes discussing how to declare a variable, or whole test sections on the appropriate time to use functions or subroutines. This isn’t necessarily a bad thing; we go to college to learn, and to learn thoroughly. I walked out of my first programming class — Introduction to Programming in Java — with a solid foundation not only in the language itself, but in data storage, functions, and OOP: skills that later served me well far beyond Java programming.
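Those fundamentals look tiny in isolation. Here’s a minimal sketch in the Java that class used — the class and method names are my own illustration, not anything from the course:

```java
// Illustrative sketch of first-semester basics: declaring a typed
// variable and writing a small function (a static method in Java).
public class IntroBasics {
    // A variable declaration -- the kind of thing a professor
    // might spend fifteen minutes on.
    static int semesterWeeks = 15;

    // A simple function: takes an input, returns a result.
    static int doubleIt(int n) {
        return n * 2;
    }

    public static void main(String[] args) {
        System.out.println(doubleIt(semesterWeeks)); // prints 30
    }
}
```

Trivial, yes — but variables, functions, and types are exactly the pieces that transfer to every language after the first one.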
In real-world programming, neither you nor your client or employer has time for you to learn every language you use from bottom to top, especially not before you start building. Once you understand basic programming logic, it’s time to start applying it to multiple languages for multiple purposes. Things like syntax and how to declare a variable in a particular language need to be learned on the fly, because you should be focusing mostly on functionality. The comfortable feeling of knowing everything you need to know before embarking on a new app is a luxury that only college-level or highly experienced programmers have.
Writing All of the Basics vs. Using Other People’s Code
While we’re discussing learning programming languages from the ground up: in my college coding experience, the focus was on being able to do everything yourself. A function to make it do this, make it do that, do everything to make sure it’s secure, and make sure it’s fast. I can build a database management system from scratch using VB.NET. Is that useful? Sure. Educational and thorough? Absolutely. Valuable? Not to many, given that the same system has been built a hundred times over, with far more functionality than I could ever dream of building in a reasonable amount of time. The truth is that programming courses in college don’t typically teach you how to utilize other people’s or organizations’ code; they focus on a particular language or paradigm.
To write almost any app with purely your own code and logic is, at best, inefficient. Thousands of programmers before us have blazed trails and developed reusable code for programmers like us to use to build bigger and better things. They took care of things like the data models and big scary functions so we could easily integrate them into our own work. You might be able to write your own user profile system, inventory system, or subscription handling application, which is awesome (and cheaper!), but that doesn’t mean it’s the fastest, most secure, or best way to do things. Learning how to integrate other applications, frameworks, and libraries into your own code is imperative to successful development.
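Hashing is a good small illustration of the point. Rather than hand-rolling the "big scary" cryptographic function yourself, the JDK already ships a vetted one in `java.security.MessageDigest`; only the helper name below is my own invention:

```java
import java.math.BigInteger;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

// Reusing battle-tested code instead of writing it yourself: the
// JDK's MessageDigest does the hard cryptographic work, and our
// code only wires it up.
public class ReuseSketch {
    // Hypothetical helper name; MessageDigest itself is real JDK API.
    static String sha256Hex(String input) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        byte[] digest = md.digest(input.getBytes(StandardCharsets.UTF_8));
        // Format the 32-byte digest as zero-padded lowercase hex.
        return String.format("%064x", new BigInteger(1, digest));
    }

    public static void main(String[] args) throws Exception {
        System.out.println(sha256Hex("hello"));
    }
}
```

Those few lines stand in for code someone else already wrote, reviewed, and hardened — and the same logic applies to profile systems, inventory, and subscription handling.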
Literally Writing Code vs. Writing Code
In my classes, we spent hours in front of our laptops, watching and copying while our professors projected their screens on the big whiteboard up front. Then, on tests, we wrote code with pencil and paper. Instead of the multiple-choice tests with eight options per question, like in Intro to Java (after which I promptly called my mom to let her know in advance that I’d probably be failing out of my second attempted major), many of my tests consisted of literally writing my code by hand, with no IDE to remind me what I meant to type or that I’d forgotten a closing parenthesis. Guess what I’ve never done since then.
Yeah, penciling code down to the semicolons. Don’t get me wrong, I love getting the chance to work with a pencil or pen instead of the keyboard when I’m sketching data models or even writing some plain English before I get down to business. I know handwriting my code probably taught me attention to detail, but of all the useless things I resented having to do in college, writing out code by hand sits close to the top of the list.
Word Problems vs. World Problems
Every test, assignment, and question came fully loaded with everything necessary to solve the problem. The numbers, relationships, and functions were clearly laid out, and professors almost never gave out a problem they hadn’t already taught a solution to. Basically, as a programmer, you knew there was no missing information and that you definitely had (or at least had been taught) the skills to solve the problem.
The business problems that need to be solved with real world code don’t always include everything you need to know. In fact, they usually don’t. It’s up to a business or development team to decipher everything the app will need, and chances are they won’t know exactly how to write every bit of code straight off the bat. You might have to make a few educated guesses or decide the best way to do something yourself. Making decisions quickly as you face new problems is a huge part of successful development, and at some point, you’ll probably have to start moving forward before you feel ready. Go with it and do your best, it’s the fastest way to learn.
Paying to Code vs. Getting Paid to Code
This one is the most painful for a college student. Unless you’re working on a development team while you’re in school, you’re likely watching some of your peers doing exactly the same things you do, except they’re earning money. And they’re doing it without a college degree, like the one you’re chipping away at. Most developers know that you don’t need a degree to get a job as a programmer, and it can be hard to curb the FOMO when you see the grind paying off sooner for others. Students, rest assured that your work is not worthless — the value of your degree comes from long hours of studying and testing, and your employers and clients will appreciate your knowledge.
I wouldn’t have any of my technical skills if it weren’t for the dedicated professors in the BIT Department at Virginia Tech, and I’m thankful every day to have been a part of the growing program there. Do not walk away from this post with the idea that university-level programming is not useful: the depth of knowledge that comes from studying the history, the structure, and the minute details of particular languages with a teacher for months cannot be rivaled. It could be argued that learning the principles behind well-commented code and object-oriented programming is just as important as the languages themselves. These are things that college classrooms teach well. Even so, the newly graduated developer should expect a learning curve when it comes time to write code with the entire internet’s selection of languages and environments.