History of Programming

A fun and brief history of programming

"Programming isn't about what you know; it's about what you can figure out.” - Chris Pine

The world we see today wouldn't be where it is without computer technology; it lays the foundation for the modern world. Let's learn how people throughout history imagined and invented new possibilities with programming and coding, and how past geniuses used their skills to turn their ideas into reality, something we benefit from on a daily basis, thanks to them. We'll highlight a few big names and milestones in programming. Traces of programmable machines go back as early as 60 AD, but I'll dive into only the most important parts that led to where we are now. Hold on, it will be a fun ride🎢

Ada Lovelace

Historians recognise Ada Lovelace as the first computer programmer. Ada, born in 1815, studied mathematics, which at the time was highly unusual for a woman. After meeting Charles Babbage, the inventor of the Analytical Engine, Ada was entranced. In 1843, she published an English translation of a French article about the machine, adding her own extensive notes, which included what came to be recognised as the first algorithm intended to be processed by a machine.

Turing Time

This is interesting. Remember when Benedict Cumberbatch (Doctor Strange himself) played Alan Turing in the movie The Imitation Game? If you haven't already seen it, I highly recommend it, as it is a fascinating story. But relevant to this narrative is the part Turing played in the development of coding. Close your eyes and take yourself back to the black-and-white wartime era. In the 1920s, the German military combined coding and electricity, communicating via secret coded messages on the Enigma machine. Alan Turing famously cracked the code, reportedly helping end the war two years earlier than predicted. Even before the war, in 1936, Turing had described a more flexible machine. Unlike its predecessors, Turing's theoretical computer could complete more than one task, thanks to its ability to read arbitrary instructions encoded as symbols on a tape. Read more about the Turing Test.
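To give a feel for what such a machine does, here's a minimal sketch of a Turing-machine simulator in Python. The rule-table format is my own illustrative assumption, not Turing's original notation: the machine scans a tape one symbol at a time and rewrites symbols according to a fixed rule table, here simply flipping every bit.

```python
def run_turing_machine(tape, rules, state="start", pos=0):
    """Apply rules of the form (state, symbol) -> (write, move, next_state)."""
    tape = list(tape)
    while state != "halt":
        symbol = tape[pos] if pos < len(tape) else "_"  # "_" is the blank symbol
        write, move, state = rules[(state, symbol)]
        if pos < len(tape):
            tape[pos] = write
        else:
            tape.append(write)
        pos += 1 if move == "R" else -1  # move the read/write head
    return "".join(tape).rstrip("_")

# Rule table: flip 0 <-> 1, halt when the blank past the end of the tape is reached.
flip_rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("1011", flip_rules))  # -> 0100
```

The key idea is that the rule table, not the hardware, defines the task: swap in a different table and the same machine computes something else entirely.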

80s - Golden Age

Let's fast forward a bit. Many consider the 1980s the golden age of computer and technology history. 1983 saw the conception of C++, a language in consistent use today, while 1987 was the year Perl debuted, a language currently used by IMDb and Amazon, amongst others. By 1989, Tim Berners-Lee had invented the World Wide Web, which has arguably had the biggest impact upon our modern working lives. As part of that invention, Berners-Lee devised HTML, URLs and HTTP, familiar to anyone who's ever used the internet. You now know how important the 80s were.

90s

Here we see some big leaps. The 1990s welcomed some of today's biggest and most recognisable names in coding: think Python, Java, and JavaScript. Without these, we wouldn't have social media, Android/iOS apps, or streaming services (no Netflix and chill for you🤓).

Conclusion

We have been led all the way from the Analytical Engine to what we see now all around us. We have come a long way and advanced so much. It only makes us wonder: where will this lead? One thing is for sure: while ever-evolving technology is inevitable, coding will remain at the centre of every development. It has been a fun ride so far, but it will be a wild ride in the future. So fasten your seatbelt: you should either get involved and shape the future (yes, really!!!🦾) or appreciate the work done by some of the finest minds of our generations. Okay, now what? I would like to ask: what are the biggest changes you think are going to take place, and which are you looking forward to most?

Thanks for reading my blog. If you enjoyed reading, please consider following me on Twitter👇.