Lately, artificial intelligence has become the hot topic in Silicon Valley and the broader tech scene. To those of us connected with that scene, it feels like incredible momentum is building around the subject, with all sorts of companies building A.I. into the core of their business. There has been a rise in A.I.-related university courses, which are sending a wave of extremely bright new talent into the job market. And this is not a simple case of confirmation bias: interest in the subject has been on the rise since mid-2014.
The noise around the subject will only increase, and to the layman it is all very confusing. Depending on what you read, it's easy to believe that we're headed for an apocalyptic Skynet-style obliteration at the hands of cold, calculating supercomputers, or that we're all going to live forever as purely digital entities in some kind of cloud-based artificial world. In other words, either The Terminator or The Matrix is imminently going to become disturbingly prophetic.
When I jumped on the A.I. bandwagon at the end of 2014, I knew very little about it. Although I have been involved in web technologies for more than two decades, I hold an English Literature degree and am more engaged with the business and creative possibilities of technology than the science behind it. I was drawn to A.I. because of its positive potential, but when I read warnings from the likes of Stephen Hawking about the apocalyptic dangers lurking in our future, I naturally became as concerned as anybody else would.
So I did what I normally do when something worries me: I started researching it so that I could understand it. More than a year's worth of constant reading, talking, listening, watching, tinkering and studying has led me to a pretty solid understanding of what it all means, and I want to spend the next few paragraphs sharing that knowledge, in the hope of enlightening anybody else who is curious but naively afraid of this amazing new world.
The first thing I discovered was that artificial intelligence, as an industry term, has actually been around since 1956, and has had multiple booms and busts in that period. In the 1960s the A.I. industry was basking in a golden era of research, with Western governments, universities and big businesses throwing enormous amounts of money at the sector in the hope of building a brave new world. But in the mid-seventies, when it became apparent that A.I. was not delivering on its promise, the industry bubble burst and the funding dried up. In the 1980s, as computers became more mainstream, another A.I. boom emerged, with similar amounts of mind-boggling investment poured into various enterprises. But, again, the sector failed to deliver and the inevitable bust followed.
To understand why these booms failed to stick, you first need to understand what artificial intelligence actually is. The short answer (and believe me, there are very long answers out there) is that A.I. is a number of different overlapping technologies which broadly deal with the challenge of using data to make a decision about something. It incorporates a variety of different disciplines and technologies (Big Data or the Internet of Things, anyone?) but the most important one is a concept called machine learning.
Machine learning basically involves feeding computers large amounts of data and letting them analyse that data to extract patterns from which they can draw conclusions. You have probably seen this in action with face recognition technology (on Facebook, say, or modern digital cameras and smartphones), where the computer can identify and frame human faces in photographs. To do this, the computers reference an enormous library of photos of people's faces and have learned to spot the characteristics of a human face from shapes and colors averaged out over a dataset of millions of different examples. The process is essentially the same for any application of machine learning, from fraud detection (analysing purchasing patterns from credit card purchase histories) to generative art (analysing patterns in paintings and randomly generating pictures using those learned patterns).
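To make that loop concrete, here is a deliberately tiny sketch of the idea in Python: learn a pattern from labelled examples, then use it to judge new, unseen data. The fraud-detection "features" (purchase amount and purchases per hour) and the nearest-centroid method are my own illustrative choices, not anything a real bank necessarily uses; production systems learn far subtler patterns from far more data.

```python
from statistics import mean

# Labelled training examples: (amount, purchases_per_hour) -> label.
# These numbers are invented purely for illustration.
training_data = [
    ((12.50, 1), "legitimate"),
    ((40.00, 2), "legitimate"),
    ((25.00, 1), "legitimate"),
    ((900.00, 9), "fraud"),
    ((1500.00, 12), "fraud"),
    ((700.00, 8), "fraud"),
]

def learn_centroids(examples):
    """'Learning': average each class's features into a single centroid."""
    by_label = {}
    for features, label in examples:
        by_label.setdefault(label, []).append(features)
    return {
        label: tuple(mean(f[i] for f in feats) for i in range(len(feats[0])))
        for label, feats in by_label.items()
    }

def classify(centroids, features):
    """Predict the label whose learned centroid is closest (Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(centroids[label], features))

centroids = learn_centroids(training_data)
print(classify(centroids, (30.00, 1)))     # a small, infrequent purchase
print(classify(centroids, (1200.00, 10)))  # a large, rapid burst of purchases
```

The point is the shape of the process, not the sophistication: the "knowledge" lives entirely in the averaged patterns extracted from data, which is exactly how the face-recognition and generative-art examples above work, just at vastly greater scale.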