We are all now in what’s called the “big data era,” and we’ve been here for quite some time. Once upon a time, we were only just starting to piece together dialogue. When one group of people had learned this dialogue, it was up to them to pass it on to the next group, and so on. However, as more people began to fill the Earth, more information was learned and gathered, making it too difficult to pass on in the form of dialogue. Instead, we needed to codify this information in order to share it all.
Sharing and codifying this learned knowledge into writing would have been quite a shift, technologically, for our species. Another big change came when we moved from what were once simple calculations to the complex mathematics we have today. Coding, however, is still relatively new in comparison and didn’t come into play until the mid-1940s, when people like Grace Hopper worked on the Harvard Mark I computer. It emerged more through necessity than anything else. People figured that if they could find a way to codify instructions to a machine, telling it what steps to take, any manual operation could be eliminated, saving businesses time and money.
Then along came algorithms. Algorithms are very different from code. Code is a set of instructions for the computer: a computation expressed on a specific platform in a specific programming language. An algorithm, on the other hand, is a series of steps that describes a way of solving a problem and that meets two criteria: it must be correct, and it must terminate. Algorithms have been around much longer than coding and were recorded as being used as far back as 820 AD by the Muslim mathematician Al-Khwarizmi. They’re a finite number of calculations that will yield a result once carried out. Because coding is a way of getting instructions directly to a computer, it’s well suited to implementing algorithms.
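To make the distinction concrete, here’s a minimal sketch in Python using Euclid’s greatest-common-divisor procedure (our illustrative example; the article doesn’t name one). The algorithm is the language-independent series of steps; the code below is just one implementation of it.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite series of steps that is both
    correct and guaranteed to terminate.

    The *algorithm* is the idea: repeatedly replace the pair (a, b)
    with (b, a mod b) until the remainder is zero. The *code* is one
    particular implementation of that idea, in one language.
    """
    while b != 0:
        a, b = b, a % b  # each step strictly shrinks b, so the loop must end
    return a

print(gcd(48, 18))  # 6
```

The same algorithm could just as easily be implemented in C, Fortran, or on paper; the steps, their correctness, and their termination don’t depend on the language.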
The way code performs can be heavily impacted by how the algorithm behind it is implemented; in many areas, algorithmic improvements have delivered greater performance gains than hardware improvements. In 2010, a federal report showed how algorithmic improvements have resulted in significant performance increases in areas including logistics, natural language processing, and speech recognition. Because we’re now in the “big data” era, we need to think big in order to cope with the vast amount of data coming in. Instead of writing code to search our data for a specific pattern given a set of parameters, as traditional coding does, with big data we look for the patterns that match the data. But now there is so much data that even the patterns are hard to recognize. So again, programmers have had to take another step back.
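As a toy illustration of that kind of algorithmic gain (our example, not one from the report), compare two ways of finding an item in a sorted list: a linear scan that inspects every element versus a binary search that halves the search space at each step. On large data, the better algorithm wins by a margin no hardware upgrade could match.

```python
import bisect
import timeit

data = list(range(10_000_000))  # a large, sorted dataset
target = 9_999_999              # worst case for the linear scan

def linear_search(xs, x):
    # O(n): check items one by one
    for i, v in enumerate(xs):
        if v == x:
            return i
    return -1

def binary_search(xs, x):
    # O(log n): halve the remaining search space at each step
    i = bisect.bisect_left(xs, x)
    return i if i < len(xs) and xs[i] == x else -1

# Same hardware, same data; only the algorithm differs.
print("linear:", timeit.timeit(lambda: linear_search(data, target), number=1))
print("binary:", timeit.timeit(lambda: binary_search(data, target), number=1))
```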
Now another step has been added to the equation, one that finds patterns humans can’t see, such as light at wavelengths outside our vision, or regularities buried in data beyond a certain volume. Data over that volume is what’s known as big data. This new algorithmic step now searches for those patterns and will also create the code needed to act on them. Pedro Domingos explains this well in his book, “The Master Algorithm.” Here he describes how “learner algorithms” are used to create new algorithms that can write the code needed to carry out their desired task. “The Industrial Revolution automated manual work and the Information Revolution did the same for mental work, but machine learning automates automation itself. Without it, programmers become the bottleneck holding up progress. With it, the pace of progress picks up,” says Domingos.
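As a minimal sketch of that idea, assuming scikit-learn as the library (the book doesn’t prescribe one, and the feature names below are hypothetical), a decision-tree learner takes labeled examples and emits a new decision procedure: rules no programmer hand-wrote.

```python
# A learner algorithm in miniature: data goes in, a new algorithm comes out.
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical toy data: [sunlight_hours, rainfall_mm] -> did the crop thrive?
X = [[8, 10], [9, 5], [7, 20], [2, 90], [3, 80], [1, 100]]
y = [1, 1, 1, 0, 0, 0]

learner = DecisionTreeClassifier(max_depth=2)
model = learner.fit(X, y)  # the learning step: examples in, decision rules out

# The induced rules, printed as human-readable pseudocode --
# a decision procedure the machine wrote, not a programmer:
print(export_text(model, feature_names=["sunlight", "rainfall"]))

print(model.predict([[6, 30]]))  # apply the learned algorithm to new data
```

The fitted tree is itself the “new algorithm” Domingos describes: change the training data and the learner writes a different one, with no human editing the rules.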