Ten Reasons to Learn C (not C++)

I've heard pros and cons on both sides of this topic. Read the article and tell me what you guys think. Here is a basic rundown:

  1. C is lower level than other programming languages
  2. Device drivers and operating systems are written almost exclusively in C
  3. What if you ever want to get a job programming micro controllers?
  4. C programs are typically smaller and faster than equivalent programs written in other languages.
  5. If you have learned C, you can learn any modern programming language.
  6. Because C has been around for many years, it has a large community and collective code base.
  7. C is the language of the Open Source community.
  8. C is the only language that teaches you what pointers really are.
  9. C is still the most commonly required language for programming jobs.
  10. Anything that has a microprocessor in it has support for C.

I understand their arguments, well at least some of them. But, to me, most of these can just as easily be read as arguments for C++. I personally think people should start off with C++. I do believe, however, that when used right, C outperforms C++ in speed and on many of the points above. But the amount of time and code needed for that efficiency sometimes isn't worth it.

C++ was developed from C to be easier to use and, in some cases, more efficient. That second claim has its issues, as I'm sure everybody will agree. But these days, with the hardware on the market and the time constraints of a production environment, squeezing out that last bit of C-level efficiency just doesn't seem worth it to me.

I've had the experience of working in a business environment with C. My employer insisted that C was more efficient, faster, all the good things. But coming from C++, there was a bit of a learning curve for me. Was that learning curve worth it? I don't believe so. If the system had been written in C++, things would have been simpler, the structures would have been better defined, and production would have been faster. Come on, we were developing applications for a Xeon server. Do those milliseconds really count at that level? I don't think so, but that's just me.

Let me know what you guys think.