If you've been programming for a while, a question that has most likely crossed your mind is this:
With all that being said, I want to discuss the topic of defining the tipping point for developers, which is essentially the point at which a developer goes from beginner to pro. Since this topic is a bit abstract, it's not possible to point to a specific moment in time and say:
There's not a sentinel moment when programming mastery occurs. It's different for every individual.
I remember when I was originally learning programming. Understanding the syntax and context did not come easy for me. It seemed like I spent 99% of my time looking things up and copying and pasting code from others just to get my programs running.
Needless to say, my confidence as a programmer was very low in the beginning. I was plagued by nagging doubts, such as:
If you're a new developer maybe some of this sounds familiar to you, or maybe it doesn't and I simply lacked confidence. Either way, I trudged along, trying everything I could think of to improve as a developer:
So, what did the trick and pushed me over the edge to become a professional developer? None of those things… and all those things. I persevered through project after project and I consumed every training resource I could find. And slowly something amazing started to happen:
"Everything started to make sense."
Even though it was a while ago, I still remember the moment my first development tipping point happened. I was sitting in front of my computer in a coffee shop and working on a web application.
As great as that was, I still had very far to go. I remember the next moment when I felt like I reached another key milestone. Even though my confidence as a developer had increased, the thought of anyone seeing my code still scared me. However, I had started to build my freelance business, and a client (who was also a developer) asked me to do a pair programming session with him.
So, what was my secret to getting over the hump and going from a beginner to a professional developer? Unfortunately, there is no easy-to-follow recipe. However, there is a process that is guaranteed to work. And the process isn't specific to becoming a programmer, it's the same whether you want to be a developer or a professional athlete… it's hard and smart work.
In his book Outliers, Malcolm Gladwell gives countless case studies of what it takes for individuals to achieve mastery in a specific field. The key comes down to how dedicated an individual is to a specific skill. The book postulates that it takes around 10,000 hours of practice for an individual to become a true master of whatever they're pursuing.
Before tackling this question, let's take a step back and discuss the topic of prodigies. Because whenever someone thinks that a certain group of individuals are born with superhuman-like talent, they're essentially saying that these special people are prodigies.
But are prodigies real? Let's take a look at one of the most famous prodigies of all time: Mozart. At the age of 5, Mozart was playing concert-grade music for the royal family. Surely, this would qualify Mozart as a prodigy, right?
In his book Peak, the psychologist Anders Ericsson dedicates a full chapter to debunking the concept of prodigies. And in each case, he illustrates that the individuals achieved their respective levels of success through massive amounts of work.
Extending the Mozart case study, let's discuss how this applies to developers. Whenever we see a skilled coder it's easy to think that they were born with the innate ability to build applications and that learning new languages and frameworks comes easy to them.
In Chapter 1, Discovering the Tipping Point for Developers, I discussed the tipping point for developers. The longer I teach and the more I work on my own coding skills, the more I'm convinced that the key to excellence is as straightforward as focused practice.
If you want to become a skilled developer badly enough, and you're willing to:
Before I end this chapter, I want to address a subtle issue that explains why we, as humans, love the idea of prodigies.
I guess I wasn't born to be a developer.
I wish I had talent like XYZ programmer, everything seems to come so easy to him.
If you catch yourself with thoughts like these, remind yourself that prodigies aren't real.
"This is Jordan, he's just here to be smart."
I'm going to get off my soapbox for a moment and discuss the life of Steve Prefontaine. If you've never heard of him before, Prefontaine was one of the world's greatest runners during his time. Before tragically dying in a car accident at 24 years old, he had already broken American records in seven different track events.
However, he was famous for getting furious at people who credited his success to talent. He said that his success had literally nothing to do with talent. In fact, he said he wasn't born with any innate ability as a runner. Instead, he credited 100% of his success to his legendary work ethic.
As developers, it's important to take the same approach that Prefontaine took. If you fall into the trap of thinking that only geniuses can become good coders, it will most likely lead to quitting when tasks become challenging. This is because our minds are constantly searching for ways to work less. And if you believe that being a genius is a requirement for development, you will have a built-in excuse for faltering on your developer learning journey.
In a comprehensive educational study published in Scientific American (http://www.scientificamerican.com/article/the-secret-to-raising-smart-kids1/), kids were broken into two groups and taken through some academic assignments. Both groups scored around the same on the assignments. One of the groups was praised by their parents and teachers with compliments centered around how smart and talented the kids were, while the other group was praised for the effort they put in.
So why did two groups of students have such different results when, by all appearances, the students had the same level of actual skill?
So, instead of taking the mindset that you need to be a genius to become a developer, take the mindset that the best developers are also the hardest-working developers. With this approach, your potential for skill is limitless. You are 100% in control of how good you will become as a coder. And your success will be determined entirely by how hard (and how smart) you are willing to work.
Let's take a case study example: understanding how logarithms work. Logarithms are used throughout the fields of mathematics and computer science; however, unless you use them regularly it's easy to get rusty on them:
- The first thing I'll do is take a piece of paper, write Logarithm in the center, and circle it.
- Next, I'll go to a comprehensive post on the topic, such as one on Wikipedia. When reading the first sentence, I come across a few terms that are a bit fuzzy:
- Inverse operation
- Exponentiation
I will stop reading the logarithm article and go and read those two articles until I feel comfortable with what they represent. After I feel good about those two items, I write them as their own circles that connect to the Logarithm circle. I will also add any examples that will help me understand what the terms mean if necessary.
- Next, I'll go back to the original Logarithm post and keep going through the article repeating this process until the entire page is filled with a mind map that explains each component that makes up logarithms and how they work together. This may include base case examples, such as:
64 = 2^6 is the same as log2(64) = 6
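When the topic lends itself to code, I like to verify the base cases as I build the mind map. Here's a quick sketch in Python using the standard math module:

```python
import math

# A logarithm is the inverse operation of exponentiation:
# since 2**6 == 64, the base-2 logarithm of 64 is 6.
assert 2 ** 6 == 64
print(math.log2(64))  # 6.0

# The same relationship holds for any base b > 0, b != 1:
b, x = 3, 81
exponent = math.log(x, b)  # log base 3 of 81, roughly 4.0
print(b ** exponent)       # round-trips back to roughly 81
```

Checking each term this way reinforces the connection between the Exponentiation and Inverse operation circles on the mind map.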
If this seems like a dead simple approach to studying…it is. The goal of studying is to learn a topic, and one of the easiest ways to understand a complex subject is to break it into easy-to-comprehend components. For example, if you're trying to understand an advanced computer science algorithm from scratch, you may feel a little intimidated.
This type of approach to studying doesn't work because our minds don't function like computers. A computer can take in information and then spit it back out. However, our minds are more relational in nature.
However, because your brain hasn't been properly introduced to the concepts, it will eventually eject the information, viewing it as useless since it's not related to the rest of your view of the world.
Whenever I'm teaching a new programming concept to students, I try to draw a fitting analogy to a real-world concept. This process is called reification and I view it as one of my most important tasks as a teacher.
What if instead of trying to memorize key terms about the MVC pattern you focused on drawing a real-world analogy to the process? My favorite way to understand this type of architecture is comparing it to a restaurant:
- Model: The model is the chef in the kitchen. In the same way that a chef prepares the meal for customers, the model works directly with the data for the application.
- Controller: The controller works like a restaurant waiter. In an application, the controller's role is based on taking requests and managing communication between the model and the view. This is much like a waiter who takes customer orders, communicates with the chef, and eventually brings the food out to the table.
- View: The view is like the table that a customer is sitting at. It doesn't do much besides provide a platform for placing the food on. This is exactly like how the view should operate in an application. If built properly, a view should simply be a place where data is shown to users.
Do you see what we just did? We learned about the MVC design pattern in a way that our minds can actually comprehend. I could fall out of bed and recite back the role of each component of the MVC architecture, not because I spent countless hours trying to memorize them, but because I connected the concept to my real-world experiences.
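For readers who think better in code, here's a toy sketch of the restaurant analogy in Python. The class names match the pattern, but the menu data and method names are purely illustrative and not taken from any particular framework:

```python
class Model:
    """The chef: owns and prepares the data."""
    def __init__(self):
        self._menu = {"espresso": 3.00, "latte": 4.50}

    def get_price(self, item):
        return self._menu.get(item)


class View:
    """The table: only displays what it is given."""
    @staticmethod
    def render(item, price):
        return f"{item}: ${price:.2f}"


class Controller:
    """The waiter: routes each request between model and view."""
    def __init__(self, model, view):
        self.model, self.view = model, view

    def order(self, item):
        price = self.model.get_price(item)
        if price is None:
            return f"Sorry, we don't serve {item}."
        return self.view.render(item, price)


app = Controller(Model(), View())
print(app.order("latte"))  # latte: $4.50
```

Notice that the view never touches the menu and the model never formats output; all communication flows through the controller, just like the waiter in the analogy.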
Over the years I've concluded that if studying is easy…I'm doing it wrong. I used to follow a study pattern of:
How is it damaging? If you have followed this type of study system you know one thing: it takes time. This time spent reading and memorizing could have been used in countless other ways that would have proven more effective in the long run. And when it comes to studying, time is one of the most valuable assets that you have, so wasting it is akin to an educational felony.
In addition to the process of reification, there are a number of other study strategies that research is showing to be more effective than traditional study practices. In their book Make It Stick, cognitive psychologists Brown, Roediger, and McDaniel give the following recommendations for studying:
- When learning from a textbook, use the key terms from the back of each chapter to test yourself.
- List out key terms and use each one in a paragraph; this will test to see if you understand a concept outside of the realm of how the textbook or instructor supplied it.
- While reading new material, convert the main concepts into a series of questions and then go back and answer the questions when you're done reading the chapter.
- Rephrase the main ideas in your own words as you go through the material.
- Relate the main concepts to your own experiences, much like the reification process we've already discussed.
- Look for examples of key concepts outside of the text. When I'm learning a new programming language, I never rely on a single source. If I come across a concept that doesn't make sense, I'll usually review 2-3 other sources that provide alternative explanations of what I'm attempting to learn.
However, there are millions of individuals with Michael Phelps's height and wingspan who watch him at the Olympics from their couches every four years. There is no magical swimming gene that Phelps was born with. Instead, the secret to his success can be found in his disciplined commitment to a practice called deep work. Muscle Prodigy (https://www.muscleprodigy.com/michael-phelps-workout-and-diet/) research claims:
"Phelps swims minimum 80,000 meters a week, which is nearly 50 miles. He practices twice a day, sometimes more if he's training at altitude. Phelps trains for around five to six hours a day at six days a week."
If Malcolm Gladwell's 10,000-hour rule is even close to being accurate, Michael Phelps surpassed this benchmark years ago.
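A quick back-of-the-envelope calculation using the training figures quoted above shows why. (The weeks-per-year figure is my own assumption, not from the quote.)

```python
# Rough check of the 10,000-hour benchmark against the quoted
# training load: "five to six hours a day at six days a week".
hours_per_day = 5.5    # midpoint of "five to six hours a day"
days_per_week = 6
weeks_per_year = 50    # assumption: roughly two weeks off per year

hours_per_year = hours_per_day * days_per_week * weeks_per_year
print(hours_per_year)            # 1650.0 hours of training per year
print(10_000 / hours_per_year)   # just over 6 years to pass 10,000 hours
```

At that pace, an athlete crosses the 10,000-hour mark in roughly six years of training.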
In case you're wondering how this applies to coding, don't worry, I haven't forgotten that this is a book for developers.
As you go through these chapters, you may discover that one of my favorite books is Deep Work by Cal Newport. (The fact that I've referenced the book a few dozen times may have given it away.) So, what exactly is deep work? A dead simple explanation of deep work is:
"Deep work is the ability to focus without distraction on a cognitively demanding task."
Whether you believe that swimming is cognitively demanding or not, I believe that Phelps's example is fitting. If you have ever attempted to train with the level of intensity that Phelps does, you can attest to the mental toll that training takes. So essentially, deep work can be simplified by saying that it has the following characteristics:
Let's dissect the definition of deep work and build a practical strategy for how it can be implemented from a developer perspective. Let's imagine that you want to learn about the computer science topic of asymptotic analysis. If you've never heard of asymptotic analysis, don't worry, you can trust me that it qualifies as a challenging topic.
Let's start with the fact that deep work is an action. With that in mind, you will need to set a clearly defined time slot. If you have never practiced deep work studying before, I'd recommend limiting the slot to around two hours. As you'll discover, deep work is a draining task. For our example, let's imagine that you have designated 9 AM to 11 AM as the time when you're going to study asymptotic analysis.
With your time slot set, now it's time to remove any and all potential distractions. Let me be 100% explicit; this means:
Now that you have dedicated a few hours to studying asymptotic analysis and have removed all your distractions, it's finally time to get down to business. If you think that now you can simply start reading a few Wikipedia posts, I'm sorry, that won't earn you a deep work badge.
For deep work to be truly effective, it has to be difficult. If I was learning about asymptotic analysis for the first time and wanted to practice deep work while studying it, I'd take the following approach:
- I'd begin by reading a number of online resources on the subject.
- Next I'd watch an online lecture while taking notes.
- I would then find practice exercises where I would attempt to figure out problems from scratch.
- Next, I would write a blog post or record myself teaching the concept.
- Lastly, I would have another student or instructor review my teaching and exercises to ensure that I understood the concept properly.
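To give a concrete flavor of the practice-exercise step, here's the kind of from-scratch problem I mean: instrumenting two search algorithms to count their steps, so the difference in growth rates becomes visible. (This comparison is my own illustration, not an exercise from a specific text.)

```python
# Asymptotic analysis in miniature: count the steps two search
# algorithms take on the same sorted list, then compare growth rates.

def linear_search_steps(items, target):
    """O(n): in the worst case we inspect every element."""
    steps = 0
    for value in items:
        steps += 1
        if value == target:
            break
    return steps

def binary_search_steps(items, target):
    """O(log n): each comparison halves the remaining range."""
    lo, hi, steps = 0, len(items) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            break
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

data = list(range(1_000_000))
target = data[-1]  # worst case for the linear scan
print(linear_search_steps(data, target))  # 1000000
print(binary_search_steps(data, target))  # about 20 (roughly log2 of a million)
```

Working a problem like this from scratch, rather than just reading the Big-O definitions, is what makes the session qualify as deep work.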
When I mentioned earlier how you should limit your deep work sessions to around 2 hours, I don't mean that you can understand any topic in that period of time. Some complex topics may take days, weeks, months, or years to properly understand. So, it is completely fine to spend a number of sessions working through the same concept. If you are going to do this, I recommend that you make notes for what you were doing when you stopped. This will allow you to pick up right where you left off.
In this chapter, I'm going to discuss the concept of task switching costs. Task switching, commonly referred to as multitasking, can be detrimental to your performance as a developer and can even lead to errors in your projects. Our world has changed dramatically over the past decade, whether for good or bad is not a topic we'll discuss in this chapter. However, one thing is sure: we are constantly bombarded with distractions.
As I was researching this chapter, I received over a dozen emails, 7 Snapchat messages, 30 notifications on Instagram, 7 Twitter notifications, 5 Skype instant messages, and surprisingly only 9 text messages. If you were counting, that's around 72 various notifications that were pushed to me in the past two hours. Beyond that, I researched this chapter at a coffee shop filled with potential distractions.
So exactly how bad are distractions? Research from Gloria Mark (https://www.fastcompany.com/944128/worker-interrupted-cost-task-switching), a professor in the Department of Informatics at UC Irvine, shows that it takes, on average, 23 minutes and 15 seconds to get fully back on task after being distracted. That's a very, very bad thing when it comes to productivity, and I've seen it myself. I've lost track of how many times I'll be in the middle of a development project and receive an email on a completely unrelated matter; instead of ignoring it and continuing to work, I'll read it and then spend time working on another task before returning to the project.
This may not sound like a major issue, except that when I come back to the project, I don't pick up where I left off. Instead, I have to re-familiarize myself with what I was working on at the moment I was distracted. If the problem was complex, it may take me even longer than 23 minutes to get back in the zone and working on the project.
So, in a world filled with emails and social media distractions, how can anyone get any real work done? After reading Cal Newport's book Deep Work, I started to put together some practical ways that I can work efficiently and still stay in touch with the world.
- If I'm working on a project, I set aside a specific amount of time that morning. For example, if I'm working on Project X for 2 hours, I will put it on my calendar and say that from 9 AM to 11 AM I'm working on Project X.
- I remove any and all negative distractions during that time. That means I'll usually put my phone on Airplane mode so I don't receive any social media notifications. Notice how I said negative distractions? I made this distinction because the same research report from UC Irvine revealed that not all distractions are bad. If the distraction is related to the task that you're working on, it can actually be beneficial. For example, if I'm working on the routing engine for a web application and the client messages me to discuss the application, what they say may actually influence the work that I'm doing or give me an idea on how to refine it. That's a good distraction, and it's why I typically keep my email and instant messenger on while I'm working. However, if I see that the Skype message or email is coming from another client or is completely unrelated, I'll simply ignore it. I do know many Deep Work proponents who would say that 100% of your distractions have to be eliminated; however, that's not always practical.
- Have a clear conclusion for whatever you are studying or working on. If you don't establish an end for the task, your mind is going to be prone to wander in the same way that a runner without a finish line won't be able to effectively compete in a race. The research around task switching costs also reveals that even planned distractions are harmful, so if you are planning on working for 2 hours straight on a project, don't plan any breaks in the middle of the task. Maintain your focus throughout the allotted time and then you'll be free to relax afterwards.
This will include a practical walkthrough on:
In graduate school, I performed extensive research on the topic of task switching costs. While studying task switching, I came across the topic of willpower limits and how they relate to performance. Essentially, the study of willpower limits says that individuals have a limited amount of decision-making power each day.
If that sounds weird to you, don't worry, I had a hard time with the concept right away too. So, let's go through a typical day for a developer. What are some decisions that you make each day?
Notice how none of those items are related to development at all. In fact, those are all common decisions you have to make each morning before you even get to work. If you actually count the number of decisions that you make each day, you'd discover the number is probably in the hundreds or even thousands. If you include subconscious decisions, such as staying in your lane while driving, the number is most likely in the millions every day!
Hopefully, I've helped you see all of the decisions that we make daily. So why do willpower limits matter when it comes to making decisions? Mainly because without willpower the quality of our decisions will suffer dramatically.
Imagine yourself without willpower for a second. With no willpower, you:
So, with all of that in mind, is there really a limit to the amount of willpower you have each day? I recently went through The Willpower Instinct by Dr. Kelly McGonigal (no relation to Professor McGonagall that I'm aware of). In the book, McGonigal presents research and countless case studies that clearly show we do indeed have a limit to our daily willpower.
With all of this in mind, the concept of saving up our willpower reserves seems like a pretty important concept. Let's go back to the water bottle analogy. If you were in a desert and had a limited supply of water, what would you do? I think the obvious answer is that you would only use the water when it was needed.
What's a practical way of doing this? Let's walk through a simple but practical example.
If you watch my show on CronDose you may have noticed something… You get a gold star if you noticed that for the last 13 weeks (14 weeks if you include this week) I've worn the same shirt. Please note, it's not the same exact shirt. When I decided to experiment with the one outfit concept I purchased eight identical shirts.
Does wearing the same outfit each day really help improve my performance? I can't scientifically say one way or the other. Most likely it has a negligible effect. However, it has a much more powerful benefit than simply removing my morning outfit decision. Each day when I put this shirt on it reminds me that I have a limit to my willpower and that I need to use it wisely. And having that mindset does make a difference.
As a side note, the idea of wearing the same outfit is not an original idea. Steve Jobs, President Obama, and Mark Zuckerberg all have a similar ritual and that's where I got the idea from. If some of the most successful individuals in the world make it a priority to remove any and all unnecessary decisions, I thought it would be a good idea to try out.
With that in mind, I think the topic of cramming versus consistent study habits should be beneficial since the way that we study is just as important as the volume of how much we study. Most of us have been in the situation where we put off studying for too long and before we know it an exam is upon us that we have to cram for. If you can remember back to the last time that you crammed for an exam or project, how much of what you studied can you remember today?
For me, I would procrastinate studying because staring at the list of books I had to read was intimidating, mainly because I didn't set any practical goals for studying. If you stare at a Discrete Mathematics textbook and tell yourself to study, it's natural to want to put it off; however, if you set small goals, you're less likely to do so.
With that in mind, I'll put down a note such as read 3 pages of my Information Retrieval textbook, and 3 pages doesn't sound nearly as scary as the vague "just study" mindset. The interesting result of making small, manageable goals for studying is that not only does it help curb procrastination, but I will also typically read much more than the 3 pages. There have been plenty of times when I set a goal of a few pages of a book and ended up reading a few chapters.
Let's analyze a few key statistics with regard to reading.
However, research from the Rype Academy (http://rypeapp.com/blog/5-easy-ways-to-read-more-books-and-double-your-knowledge/) shows that CEOs such as Elon Musk, Mark Cuban, and Peter Thiel read around 60 books a year! That's 5 books each month.
So why do some of the most successful individuals in the world take the time to go through so many books? At a high level it may seem excessive, but if you truly believe that knowledge is power, wouldn't it make sense to dedicate whatever time is needed to attain more knowledge?
If you look at reading as a form of linear learning, then yes, reading would be a waste of time. Linear learning would be a 1-to-1 transfer of knowledge: if it took the author of a book 10 years to research a topic and it took me 10 years to go through the book, that would be pretty pointless.
However, I look at reading like it's compounded learning. What is compounded learning? Good question! Compounded learning is the process of taking the knowledge from an individual, but not having to spend the same amount of time that it took that individual to research the topic.
For example, imagine that you read a book on How to Become a Better Developer. The author of the book had to spend years researching the topic (assuming that it was a well-written, well-researched book). However, if you go through the book in a few weeks, that means you were able to gain years' worth of knowledge in a few weeks!
Research (http://blogs.plos.org/neurotribes/2011/06/02/practical-tips-on-writing-a-book-from-22-brilliant-authors/) shows that top authors will spend a minimum of two years researching a book. And that research time doesn't take into account the fact that authors draw on their entire lifespans to write a book. All of this means that each time you read a book it's as if you were able to gain a lifetime's worth of experiences and wisdom from the author.
It's one thing to say that reading is important; it's another thing entirely to go through a large number of books on a regular basis. With that in mind I've developed my own reading system. This system also takes into account a number of complaints that I've heard others say about reading.
First and foremost, I schedule a set amount of time each day for reading. Usually, this equals around 1-2 hours; however, on weekends this number can double. At any given point in time, I'm usually going through a dozen books, ranging from mind/skill hacking to technical programming books.
I'm not sure where the stigma of audio books came from. However, with my travel schedule, I've discovered that audio books are an invaluable tool in my learning arsenal. Obviously, you can't go through programming books via Audible. However, you can go through skill and business-based books. And I personally have hundreds of books in my Audible account, many of which I've gone through multiple times. In fact, many of the books I've discussed and quoted from were books I listened to rather than read.
A common pattern I see with students learning how to code is:
When it comes to hitting a learning plateau, it's important to look at the potential root causes for why it's occurring. It's been my experience that no two plateaus are the same. And until you've diagnosed why you're not learning, you won't be able to move on to your next level of skill.
Before I continue I want to reiterate something: you will never reach a point where your level of skill is maxed out. If you're a professional athlete, your body's performance will naturally decline as you get older. But when it comes to concepts such as understanding development, if you continue to dedicate yourself and you're willing to listen to experts, your skill will never reach a peak.
Over the years I have witnessed a few key reasons why individuals (and myself) run into skill plateaus.
When a student lacks access to proper information, learning becomes a more arduous process. Imagine a talented high school developer who has been relying on a teacher with limited skill. In cases like this, the student will need to find additional resources, such as online courses, to help teach her concepts she's never been taught before.
During a phase of the learning cycle when best practices are the focus, students may feel like they are hitting a learning plateau. I remember when I was first learning about test-driven development. The concept seemed counterintuitive. I would spend 2-3 times the amount of time on a feature. And this became incredibly frustrating. It felt like I wasn't learning anything new because my new knowledge wasn't affecting anything on the screen.
In my experience, the main cause of students hitting a skill plateau is when they stop challenging themselves. If you remember back to when you were first learning development, it seemed like your knowledge was skyrocketing each day.
This approach is less taxing mentally. However, it has the nasty side effect of limiting how we improve. Whenever I feel like I'm getting into a rut, I will look at popular websites and I'll start to put together a list of features that I want to learn how to build. From that point, I can put a plan together for what concepts I need to learn in order to implement them.
One of my favorite illustrations of getting past skill plateaus was made by the calligrapher, Jamin Brown:
But also notice that the key to overcoming a plateau is called the Frustration Zone. I think that's a great name for it. Learning complex topics is not easy. As you've probably heard countless times, "if it were easy, everyone would do it".
Becoming a developer can be one of the most rewarding experiences that someone can have. And part of what makes learning how to code so fulfilling is how many challenges you'll need to overcome to succeed.
The following graph shows the standard learning curve. It was generated by a big data algorithm that analyzed the learning patterns of individuals across a number of industries. The curve is smooth because it averages the learning process across many people.
Later in this chapter, we'll take a look at what the learning curve looks like for a single person. Over the years I've had the privilege of teaching students how to become developers, and I've witnessed this learning curve play out again and again. In this chapter, I want to examine the three stages that all developers go through. Additionally, I'll discuss how long it takes to move from one stage to the next. The three stages are:
Let's start off by taking a look at the liftoff stage. This is an exciting time for new students. During this time students are immersed in learning skills that they've never seen before:
Because all the topics that students learn during this stage are new, their expertise skyrockets. I like to call this the liftoff stage because it's easy to visualize a new student's expertise as a rocket ship soaring to places it has never been before. During this time, a student will learn how to:
After the exciting liftoff stage of the developer learning curve, aspiring developers will enter the twilight zone:
- While in this stage, many of the core concepts and commands haven't cemented themselves in a student's long-term memory. This results in them having to constantly look up documentation, query Stack Overflow, and things like that.
- During this time, the novelty of simply having an application work has worn off. Now students are asked to perform advanced tasks such as:
- Working with legacy applications
- Debugging defects
- Improving performance
- Building features that they don't have a step-by-step tutorial for
- Additionally, while working through the twilight zone, students are expected to start implementing best practices. In the liftoff stage, the primary goal was simply to get applications functional.
During this next phase, students start learning how to build applications that can be used in real-world scenarios. This means that a student may spend five times longer building an application with identical features to something they created during the liftoff stage.
There is good news, though: if a student persists through the twilight zone of learning, they will enter The Zone of the developer learning curve:
For example, I recently started working with the Scala programming language. I've been able to pick up on how to build applications in Scala dramatically faster than when I started learning C or PHP a decade ago. This is because I have a decade of knowledge in the development space that allows me to frame the new concepts. When I read the documentation and see what it says about data types, I don't have to wonder what a data type is. Instead I can skip ahead to learning the syntax.
Throughout this chapter you may have noticed that the developer learning curve was smooth. However, that's not reality. The reason why the curve was smooth was because it averaged out the learning path of a large number of individuals. When it comes to a single student, the learning curve looks more like the following graph:
There are ups and downs throughout the learning cycle. As a student, you may decide to switch programming languages after a few years (like I did when I switched from PHP to Ruby around 5 years ago).
If you're like me, when you learn a new topic the first thing you'll do is either move on to the next topic or repeat the concept as quickly as humanly possible. For example, when I learn a new Ruby or Scala method, I'll usually jump right into using it in as many different situations as possible. However, I've discovered that this may not be the best approach because it's very short-sighted.
When it comes to learning how to code, one of the most challenging requirements is moving knowledge from our short-term memory to our long-term memory.
Remember the last time you learned a programming technique. Do you remember how easy it felt when you repeated what the instructor taught? The syntax seemed straightforward and it probably seemed like there was no way you would forget how to implement the feature. But after a few days, if you try to rebuild the component, is it easy or hard?
So, if our default mindset is to forget what we've learned after a few days (or a few minutes), how can we learn anything? This is where our brain's default programming comes into play and where we can hack the way that we learn.
Throughout my learning path, I'm going through a number of books and video series. As I follow along with the guides, as soon as I learn a new topic I completely stop. I'll stand up, write the new component on one of my whiteboards, and actually work through the program by hand.
I didn't learn this technique from another developer. Instead, I heard about how one of the most successful classical music institutions in the world, the Meadowmount School of Music in New York, taught students new music compositions. As a game, the school gives out portions of the sheet music. So, where most schools will give each student the full song, Meadowmount splits the music up into pieces.
From that point, the students trade note cards and then focus on learning another piece of the song. They continue with trading cards until each student has been able to work through the entire set of cards.
So, the next time you are learning a coding concept, take a step back. Instead of simply copying what the instructor is teaching, write it down on a piece of paper. Walk through exactly what is happening in a program.
I spoke to you in Chapter 2, Are Developers Born or Made? – Debunking the Myth of Prodigies about the notion that prodigies and savants are a myth. But if this is the case, how can expert developers analyze programs so quickly? To answer this question, we need to go back to Fake Ancient Greece.
I said Fake Ancient Greece because my favorite illustration of mental models was discovered alongside one of the greatest forgeries in modern art history.
In Malcolm Gladwell's book Blink, he tells the story of the Greek Kouros. In 1985, the Getty Museum purchased a Greek statue called the Kouros for over $9 million. Initially, the museum was hesitant to purchase the statue because there was a significant fear that the sculpture was a fake. Kouros pieces were so incredibly rare that the chances a legitimate, well-cared-for piece had been discovered were slim to none.
They simply knew that something was not quite right. Their suspicions turned out to be correct, and the Kouros was eventually proved to be a hoax. But how were these individuals able to do what countless scientific studies could not? It all comes down to mental models.
In preparation for this chapter, I was discussing the topic of mental models with a friend and was surprised when she looked at me, confused. After informing me that she'd never heard of mental models, I decided to add this section to explain what mental models are. And after that we'll get into how we can build them to learn development.
This scenario is the ugly and all-too-common face of procrastination that programmers are forced to fight constantly. If this situation sounds familiar, you're in good company. But if you want to become a professional developer, you'll need to implement a system for hacking procrastination. And that's what we're going to walk through in this chapter.
Before we walk through a system for hacking procrastination, we first need to dive into the root causes for this negative habit. Everyone is unique, but over the years I've seen procrastination is typically caused by three thought patterns:
To overcome procrastination and get back on track we'll need to address each one of these issues. Because if you let any of these mindsets control the way your mind operates, you will never be able to reach your potential.
I called this chapter hacking procrastination because I think that hacking is the most appropriate term for what needs to happen to achieve success. Developers hack applications to build features or fix bugs. In the same way, we need to hack our thought patterns so that our brains function properly.
Starting off the list of the causes for procrastination is perfectionism. Have you ever watched a baby trying to stand up for the first time? Babies, who haven't learned that failure is a bad thing, will spend countless hours trying to stand up.
Next on the list is hacking the fear of success. If you've overcome the trap of perfectionism, congratulations. However, I've seen just as many developers get stuck due to the fear of success as the fear of failure.
For example, when I first learned how to build a connection to a database, I put the book down and didn't pick it up until weeks later. Learning the database concept opened up a scary new world of topics I had to learn next. All of a sudden, I had to understand:
Last on the list for hacking procrastination is creating a practical plan. When I recognize that I'm procrastinating I now tell myself to look at my plan of attack. Usually I'll discover that my plan is too general.
For example, if I'm building a payroll application, I may have an item on my to-do list that says: Build reporting engine. That's a scary feature! That's the type of item that will stick on my to-do list for weeks without me taking any action.
In The War of Art, Pressfield compares procrastination with being an alcoholic. If you're like me, when I first heard this comparison I was skeptical. I had a hard time connecting myself pushing off writing a blog post until tomorrow with an alcoholic passed out on the sidewalk in front of a bar.
After going through this cycle of procrastination for years I finally did recognize the pattern. And Pressfield was right, procrastinating on tasks has the same root cause as being an alcoholic. Alcoholics are willing to trade long-term joy for short-term happiness. By this I mean that an alcoholic will risk their health, career, and family, all for the sake of the feeling that a drink will give them at that moment.
I've spent much of this chapter describing the causes of procrastination. However, I don't want to describe a problem without giving a solution. Therefore, I will conclude by saying that the best way I've discovered to fight procrastination is by taking baby steps.
One of the greatest strengths of the Pomodoro Technique is how easy it is to implement. The process that I follow is:
- Each morning I pick out the tasks that I want to accomplish that day.
- I then decide how long each task will take. The Pomodoro Technique works on a point system: each time you work through a 25-minute task, you earn a point.
- Typically, I try to earn 10 Pomodoro points each day. This means that if I have 3 tasks that I know will take an hour each, I will earn 6 points for those tasks. And it means that I have 4 additional 25-minute slots available for the rest of the day.
Did you notice how I kept saying 25-minute time slots? There is a reason for the odd number. The Pomodoro Technique places a high priority on taking scheduled breaks. After completing each 25-minute task, you take a 5-minute break. During this free time, you can do anything you want. You can get on social media, you can take a walk around the block, or anything that you want to do. Just make sure that your break does not exceed 5 minutes.
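The point math above can be sketched in a few lines of Ruby. This is purely my own illustration of the arithmetic, not part of any official Pomodoro tooling; the method names and the 30-minute work-plus-break cycle are assumptions that simply match the "1 hour = 2 points" math described:

```ruby
WORK_MINUTES  = 25
BREAK_MINUTES = 5
CYCLE_MINUTES = WORK_MINUTES + BREAK_MINUTES  # one full 30-minute work/break cycle
DAILY_GOAL    = 10                            # target Pomodoro points per day

# Points earned for a task, counting each complete work/break cycle as one point.
def points_for(task_minutes)
  task_minutes / CYCLE_MINUTES
end

# How many 25-minute slots remain after the planned tasks.
def remaining_slots(tasks_minutes)
  DAILY_GOAL - tasks_minutes.sum { |minutes| points_for(minutes) }
end

points_for(60)                # => 2 points for a one-hour task
remaining_slots([60, 60, 60]) # => 4 free slots left in the day
```

Running the numbers this way matches the example above: three one-hour tasks earn six points, leaving four 25-minute slots toward the daily goal of ten.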
Have you ever tried dieting before? When I was younger I struggled with my weight and to help fix it, I tried a number of intense diets. This included nutrition strategies such as dramatically decreasing calories, or killing off carbs. However, I noticed that I'd stay true to the diet for a few weeks or even a few months, but eventually I would fall back into poor eating habits.
In the same way when I was younger I fell into the same pattern with working on tasks. I'd get excited about working on a project or learning a new programming language. And I would spend countless hours working on what I wanted to accomplish.
Additionally, after I have finished my work for the day and have earned my 10 Pomodoro points, I feel a sense of accomplishment that I never felt before. And after work, I don't feel guilty spending time with my family and friends, because I know that I completed every task that I set out to work on that day.
So how can you implement the program? There are a few ways. To start off, you can simply use the timer on your phone and then count up each of the tasks/points that you achieved each day. That's how I started off working with the Pomodoro Technique.
Additionally, there are a number of smartphone apps that have Pomodoro timers and even allow for creating a task list that you can use as a pick list for your tasks each day. I like these types of apps because they also give you historical analytics so you can see how many tasks you've completed each day. The Pomodoro focus app (https://itunes.apple.com/us/app/pomodoro-time-focus-timer/id973134470?mt=12) is my personal favorite (and it's free).
Actually, the opposite was true. Instead, Dr. Gelfond cared enough about us that he imparted to us the secret weapon to mastery: making mistakes. Wait, making mistakes is the opposite of what our mind tells us to do, right? Making mistakes is embarrassing. Mistakes tell the world that we don't understand a concept. However, making mistakes also provides a number of powerful benefits that anyone interested in learning should be aware of.
First and foremost, when you make mistakes, especially publicly, you're going to feel like you're taking memory steroids. How so? When I think back to Dr. Gelfond's class I still remember every mistake I made when I was called in front of the class. The memories generated by making mistakes are so vivid that they can be recalled, even years later like mine. Now obviously simply remembering the mistakes by themselves would be pointless.
However, in addition to remembering what I did wrong, more importantly I remember what I had to do to correct my mistakes. It's been over three years since I took that class, but I can still remember each of the key concepts he taught us. And I can tell you from experience that I can't say the same about every class I've taken.
Another benefit to making mistakes is that they force you to learn. No one likes being wrong. So, assuming that you have a passion for knowledge, you can use the memory of making mistakes to help motivate you to learn a concept properly.
At first glance, this may seem like a daunting task. And many aspiring developers have given up on their learning journey because it seems like an insurmountable challenge.
Before I go into the memorization system I have used over the years, it's important to say that repetition is the key to memorizing large amounts of information. None of the techniques I will give you are going to help if you don't take the time to work through them consistently.
With that being said, it's important to know that, by itself, repetition is a slow and naive memory training technique. As a development student, imagine that I handed you a list of a few hundred method names and told you to memorize them. If you were to simply stare at the sheet of paper and try to memorize the names, how do you think you'd do? If you're like me and the majority of the world, probably not very well.
In the first memory technique, we're going to walk through visual mental mapping. Our minds are incredible at memorization. However, at the same time, our minds are also picky with how they store information. Let's run a quick test. If I show you 15 random digits, such as:
However, what if I showed you the pictures of 15 celebrities? Now if I give you the same test as with the numbers, do you think you'd do a better job remembering the list of celebrities or the random numbers? Assuming you know who the celebrities are, you'd be able to repeat back a significantly larger number of celebrities than numbers.
With this knowledge in mind we can apply the same principles for memorizing anything.
Because our brains are efficient machines they naturally sort information based on priority. You are most likely aware that you have short-term and long-term memory. This concept is the reason why you can instantly remember your second-grade teacher's name decades later, but may forget a new acquaintance's name 30 seconds after hearing it.
So, when it comes to implementing the visual mental mapping technique, we're essentially tricking our brain into thinking that it needs to move a piece of information into long-term memory. In this process, we associate a visual image with the term that we want to memorize. A key prerequisite for this to work is that the visualization needs to be relevant to the term (or the behavior of the term).
So, in this example I have an image filled with decorations. And on top of the image, I have some text that is underlined. And it's sitting on the decorated fireplace mantle. By creating this visual image, I've mapped:
Sticking with our celebrity theme, imagine that you wanted to go to a private, VIP party in Hollywood. If you just try to show up, the bouncer at the door most likely won't let you in. However, if you're friends with Brad Pitt and you walk in together, you won't have any issues attending the party.
As you drive down the street to work, your brain captures millions of data points, such as street signs, people walking, and so on. If your brain didn't guard against useless information entering your long-term memory bank, all of this information would be treated with the same priority as your parents' names. Obviously, this wouldn't be a good idea!
So visual mental mapping seems like a great idea. However, the idea of creating thousands of visualizations isn't very practical, which is why, when I'm learning a new programming language, I also focus on picking up on patterns.
border
border-bottom
border-bottom-color
border-bottom-style
border-bottom-width
border-color
border-left
border-left-color
border-left-style
border-left-width
border-radius
border-right
border-right-color
border-right-style
border-right-width
border-style
border-top
border-top-color
border-top-style
border-top-width
border-width
Additionally, you may also notice that each side also has a set of options for color, style, and width. So practically, if you know that these elements are all available to the border set of elements, this list can be shrunk down to 5 items:
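The compression described above can be sketched in Ruby: rather than memorizing all 21 border properties, you remember a handful of generating rules and reconstruct the rest on demand. The variable names below are my own illustration:

```ruby
# Reconstructing the full CSS border property list from a few rules:
# four sides crossed with three attributes, plus the shorthand forms
# and the one outlier (border-radius).
SIDES      = %w[top right bottom left]
ATTRIBUTES = %w[color style width]

properties  = ["border", "border-radius"]
properties += ATTRIBUTES.map { |attr| "border-#{attr}" }  # border-color, border-style, border-width
properties += SIDES.map { |side| "border-#{side}" }       # border-top, border-right, ...
properties += SIDES.product(ATTRIBUTES).map { |side, attr| "border-#{side}-#{attr}" }

properties.size # => 21, recalled from a handful of rules
```

That's the pattern at work: two small sets and three combination rules regenerate the entire 21-item list.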
On a final note, I want to dispel a common fallacy. As a developer, you don't have to memorize every class and method to build a project.
Ruby is the only language that I really build applications in, which means that I've been forced to become proficient in a number of languages I didn't have much experience working with, sometimes in a very short period of time. And over the years I've developed a system for learning a new language or framework, and that's what I'm going to walk through in this chapter.
When I'm learning a new programming language I follow these steps:
- Watch a full tutorial series on the language. When I'm watching I don't try to follow along, I simply watch what the instructor does in the demos so I can get a high-level view of the language syntax and flow.
- Create a hello world application. I'll incorporate a few basics, such as running a loop, creating and instantiating a class, and any other high-level concepts I remember from the tutorial.
- Pick out a sorting algorithm and implement it in the language. It's fine if the sorting algorithm is a basic one, such as selection sort or bubble sort. Sorting algorithms force you to use data structures, loops, variables, and functions. Combining each of these elements will give you a good handle on how the language works.
- Go through an advanced tutorial on the language and this time follow along and build the programs with the instructor.
- Go through coding interview questions for the language. Being able to confidently answer these questions will give you a good idea if you have a solid understanding of the language.
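As one example of the sorting step above, here's what a basic bubble sort might look like in Ruby. It's a minimal sketch, but it exercises arrays, loops, conditionals, and method definitions in a single small program:

```ruby
# A simple bubble sort: repeatedly sweep the array, swapping adjacent
# out-of-order pairs, until a full pass makes no swaps.
def bubble_sort(list)
  sorted = list.dup  # work on a copy so the caller's array isn't mutated
  loop do
    swapped = false
    (sorted.length - 1).times do |i|
      if sorted[i] > sorted[i + 1]
        sorted[i], sorted[i + 1] = sorted[i + 1], sorted[i]
        swapped = true
      end
    end
    break unless swapped  # a pass with no swaps means the array is sorted
  end
  sorted
end

bubble_sort([5, 3, 8, 1]) # => [1, 3, 5, 8]
```

Porting a small program like this into a new language quickly surfaces how that language handles collections, iteration, and assignment.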
I've used these five steps for a number of languages, and I can tell you that once you've become proficient in a single language, you'll find it much easier to pick up new ones, since most languages share quite a few core concepts; all you'll need to do is learn the differences in syntax.
During this time, I spent quite a bit of time meeting with Dr. Richard Watson. And during one of our meetings I brought up the issues I was having. His first question was based around how I was taking notes for the course.
Finding out that I was taking notes wrong was great. But it wouldn't have been too useful without learning an alternative approach. So, Dr. Watson asked me to try a different type of note-taking technique.
I started following this reverse note-taking process years ago and I still use it today. Through this time, I've noticed a number of key benefits to this approach.
First and foremost, knowing that I will have to recite back the key components of the lecture forces me to maintain an increased level of focus. This is the opposite of how I used to take notes. My old way of taking notes would often distract me from the concepts being discussed: I would hear a concept that I felt was important, take my focus away from the speaker, and concentrate on writing down the topic.
Another benefit of reverse note-taking is that it forces me to think of the lecture as a unified story instead of a series of facts. Let's go back to our illustration of Napoleon's battle at Waterloo. If you listened to a lecture about the battle and took notes during the class, you'd probably write down things like the following:
Lastly, the reverse note-taking approach makes it easier to review lecture material than my old style of note-taking did. Previously, I would rarely listen to a lecture recording. Even when I intended to, other priorities always seemed to override the task. I mainly attribute this failure to the fact that I, for some reason, trusted my notes.