We like to hold fast to the myth of the individual creative genius as the source of the world’s most impactful scientific revolutions and disruptive innovations. But it’s worth recalling how Isaac Newton deferred to his rival Robert Hooke: “If I have seen further than others, it was by standing on the shoulders of giants.”
This is how the French painter Nicolas Poussin represents Cedalion guiding the blind Orion, a mythological pair associated with each generation’s progress over its predecessors.
Fast Forward Labs clients frequently ask why the algorithms we explore in our reports have suddenly become so important. While the answers are complex, we devote part of each report to the history of a given technique, suggesting what changes had to take place to render each innovation possible and practical.
Take deep learning. A basic form of the neural networks that underlie deep learning algorithms has been around since the 1950s. In 1958, Frank Rosenblatt told the New York Times that his Perceptron would be the beginning of computers that could walk, talk, see, write, reproduce themselves, and be conscious of their existence. As fifty years have passed without generating sentient machines - and in spite of fears of a so-called “intelligence explosion” that dominate discussions in Effective Altruism circles - we know these statements were exaggerated given the state of the technology at the time. But three recent developments have caused resurgent interest in and applicability of neural nets.
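For a sense of just how simple Rosenblatt’s original idea was, here is a minimal sketch of the perceptron learning rule, trained on the logical AND function. The dataset, learning rate, and epoch count are illustrative choices of ours, not details from Rosenblatt’s work:

```python
import numpy as np

# A single perceptron learning the AND function via
# Rosenblatt's update rule (illustrative toy example).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # inputs
y = np.array([0, 0, 0, 1])                      # AND labels

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate

for _ in range(20):                          # a few passes over the data
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0    # step activation
        error = target - pred
        w += lr * error * xi                 # perceptron update rule
        b += lr * error

print([1 if xi @ w + b > 0 else 0 for xi in X])  # → [0, 0, 0, 1]
```

A single perceptron can only learn linearly separable functions like AND; overcoming that limitation is part of what kept the field waiting for the developments below.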
First, there’s been strong progress in our theoretical understanding of the networks themselves. Second, GPU (Graphics Processing Unit) computation has become affordable. GPUs were primarily developed for video gaming and similar applications, but they are also optimized for exactly the kind of processing that neural networks require. Finally, large image and video datasets are now available. This, more than anything, has motivated and enabled significant progress for both research and industry applications.
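The GPU connection is easy to see once you notice that the forward pass of a neural network layer is essentially a large matrix multiplication plus a nonlinearity, precisely the dense linear algebra GPUs parallelize well. A rough sketch with arbitrary, illustrative shapes:

```python
import numpy as np

# One fully connected layer: a matrix multiply, a bias add,
# and a ReLU nonlinearity. All shapes here are illustrative.
rng = np.random.default_rng(0)
batch = rng.random((64, 784))      # 64 examples, 784 features each
weights = rng.random((784, 128))   # layer with 128 hidden units
bias = rng.random(128)

activations = np.maximum(0, batch @ weights + bias)  # ReLU layer
print(activations.shape)  # (64, 128)
```

Stacking many such layers multiplies the arithmetic involved, which is why cheap, massively parallel GPU hardware changed what was practical to train.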
In a recent interview with Inverse, Fast Forward Labs Research Analyst Jessica Graves similarly described how shifts in the economics of data storage have made big data so important. She explains:
“Storage capabilities leaped over time. If you think in terms of the web, there’s a negligible hardware cost difference per user between storing data about 10 thousand users or 10 million users. I just read a piece about the development of tracking cookies in the ‘90s which made it easier to get a better sense of unique visitor information on websites. It’s cheaper than ever to store information about what consumers are doing on all parts of the web, down to how many seconds a user spent hovering over a candid celebrity photo.”
What’s your take? Why is deep learning just now gaining traction? And what material changes do we hope will occur to render the next big shift in AI possible?