Following on from our successful Salon, co-hosted with the Innovation Centres, we will be publishing a series of blogs focussing on the technology behind automation and its social impact, rounding off with what automation means for the future of how we learn and how we work.
We will start by looking at the technology behind automation, and what better place to begin than in the weeks after Uber launched its first self-driving taxis in San Francisco, albeit with a ‘safety human’ ready to take over in an emergency.
AI is probably the most important technology driving automation. It has long been hailed as the next ‘big thing’ but has struggled to live up to the hype. In the past few years, though, that has changed.
We now have software that can outperform humans at a whole host of tasks. Earlier this year Google’s AlphaGo system defeated Lee Sedol, one of the world’s best players of Go, a board game so complex that computers had not been expected to master it for at least another decade.
It’s not just board games that computers can play. They can translate texts, enable robots to stack shelves and move items around warehouses, diagnose illnesses, drive cars and search mountains of legal documents. They can even produce some forms of journalism, such as market reports and sports summaries.
Automation will also enable the public sector to make huge changes to how it designs services, allowing it finally to take a preventative approach rather than reacting when things go wrong. Take, for example, Glasgow City Council, which is using data from its systems to identify families that are at risk of in-work poverty and not claiming free school meals and clothing grants. The council has proactively targeted those families by sending them a cheque in the post. In future, it is looking to remove the application process for support altogether, because it already collects that information.
So what has changed to allow this leap in capability? An AI technique called ‘deep learning’ allows systems to learn and improve by analysing lots of examples rather than being explicitly programmed. The technology is already being used to block spam e-mails, suggest e-mail replies, recognise voice commands and detect credit card fraud.
Early AI research used artificial neural networks (ANNs). Their developers were inspired by the structure of the human brain, with its neurons, or brain cells, and tried to replicate the process in software. In a biological brain, each neuron can be triggered by other neurons whose outputs feed into it, and its own output can then trigger other neurons in turn. A simple ANN has an input layer of neurons where data is fed into the network, one or more hidden layers where it is processed, and an output layer where results come out.
Whilst these early networks could complete simple tasks such as recognising handwriting, they struggled with more complex ones. In the past decade, new techniques, including a tweak to the activation function, have made training deep networks possible.
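To make the layered structure described above concrete, here is a minimal sketch of a forward pass through a tiny network, using NumPy. It assumes the ReLU activation function as a stand-in for the activation tweak mentioned; the layer sizes and weights are arbitrary illustrative values, not anything from a real trained system.

```python
import numpy as np

def relu(x):
    # ReLU activation: passes positive values through, zeroes out the rest.
    # A simple activation function of the kind that helped make training
    # deep networks practical.
    return np.maximum(0, x)

def forward(inputs, w_hidden, w_output):
    # Each layer multiplies its inputs by a weight matrix and applies the
    # activation function, mimicking neurons triggering one another.
    hidden = relu(inputs @ w_hidden)      # input layer -> hidden layer
    return relu(hidden @ w_output)        # hidden layer -> output layer

# Hypothetical example: 3 input neurons -> 4 hidden neurons -> 1 output.
rng = np.random.default_rng(0)
w_hidden = rng.normal(size=(3, 4))
w_output = rng.normal(size=(4, 1))
result = forward(np.array([1.0, 0.5, -0.2]), w_hidden, w_output)
```

In a real system the weights would be learned from data rather than drawn at random; the point here is only the shape of the computation, layer by layer.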
All this takes a lot of number-crunching power, which became readily available when several AI research groups realised in 2009 that graphical processing units (GPUs), the specialised chips used in video-games consoles to generate graphics, were also well suited to running deep-learning algorithms. They found that GPUs could speed up their deep-learning systems nearly a hundredfold.
What makes this approach useful is that there is no need for a human expert to draw up a list of rules, or for a programmer to implement them in code; the system learns directly from the labelled data.
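The idea of learning from labelled examples rather than hand-written rules can be illustrated with one of the simplest possible learners, a perceptron. This is a deliberately basic stand-in for the deep-learning systems discussed above, and the data below is a toy set invented for the example: the label is 1 when the first feature dominates.

```python
def train_perceptron(examples, labels, epochs=20, lr=0.1):
    # No human writes classification rules here: the weights are adjusted
    # automatically whenever the model misclassifies a labelled example.
    weights = [0.0] * len(examples[0])
    bias = 0.0
    for _ in range(epochs):
        for x, label in zip(examples, labels):
            prediction = predict(weights, bias, x)
            error = label - prediction
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

def predict(weights, bias, x):
    # Output 1 if the weighted sum of the features crosses the threshold.
    return 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0

# Toy labelled data: each example is (feature_1, feature_2) with a 0/1 label.
examples = [(1.0, 0.0), (0.9, 0.2), (0.1, 1.0), (0.0, 0.8)]
labels = [1, 1, 0, 0]
weights, bias = train_perceptron(examples, labels)
```

After training, `predict(weights, bias, (1.0, 0.1))` classifies a new, unseen example using only what was extracted from the labelled data, which is the essence of the approach, even though real deep-learning systems use far larger networks and data sets.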
Over the past decade, there has been an explosion in the data sets available to developers for training their AI systems. With the Internet of Things (IoT), the numbers are staggering: in 1999 there were 90 million connected devices; by 2013, 9 billion; and by 2015, 1 trillion. Each of these devices creates data that can be used, alongside data produced by the internet, mobile phones, finance systems and a host of other sources, to make AI smarter.
We’ve all heard many times before that ‘the robots are coming to take our jobs’, but technological change has always created more jobs than it has destroyed. Is automation any different? Find out in our next blog, where we ask whether it is the software, rather than the hardware, that is coming for our jobs.
This Salon would not have happened without the support of the Innovation Centres through their partnership with TEDxGlasgow. If you are interested in hosting an event with TEDxGlasgow, please check out our Partnerships page for more information.