Computers have been dumb for 50 years
Computers have changed everything. From the mainframes of the 70s to the smartphones of today, the economy, education and civic society have seen remarkable transformation. Yet all of this change was achieved with an approach that hasn’t moved on in 50 years.
Computers are dumb. When you write code, be it for an app or to send a rocket to the moon, you start with a blank slate. You tell the computer, line by line, what colour the text should be, what question to ask next and how to deal with an error.
This all has to be done in painstaking detail. When we say, “colour of text”, we mean a precise value from 4,294,967,296 possibilities. When you ask a question, you have to consider all possible valid answers. When you need to deal with an error, you have to specify precisely what sort of error, when it might occur and how to deal with it.
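To make that “precise value” point concrete: a common 32-bit colour packs red, green, blue and alpha channels into a single number, and the programmer has to spell out every byte. A minimal sketch (the channel values here are arbitrary):

```python
# One 32-bit colour: four 8-bit channels, so 2**32 distinct values in total.
red, green, blue, alpha = 0x33, 0x66, 0x99, 0xFF  # arbitrary example values

# Pack the channels into a single 32-bit ARGB integer.
colour = (alpha << 24) | (red << 16) | (green << 8) | blue
assert colour == 0xFF336699
assert 2 ** 32 == 4_294_967_296  # the number of possibilities mentioned above
```

Every such detail, down to the byte, is the programmer’s responsibility in conventional code.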
Finally, when you’re done with all of that hard work, the code you write can only do the job you designed it to do. A program you write to play chess couldn’t possibly play checkers.
Code that writes code
Artificial intelligence turns this on its head. By mimicking mental functions such as learning, a short and simple program that’s fed lots of data can do incredible things.
Self-driving cars are a great example. Without using AI, writing a program that can help a car drive would require millions of lines of code. Lines that state precise rules for what should be done when you see a red light, how to deal with somebody cutting you up and what to do in an emergency.
Modern cars are high-tech and sensor rich. By recording and pooling this data, manufacturers are able to create a database detailing hundreds of thousands of hours of safe driving.
When we come to have a car drive itself, we have it compare what its sensors say with this historical data. By doing so it can determine both what is happening now and what the safest course of action should be.
For example, when the car approaches a junction and the cars around it slow down, the historical data shows that in the same situation the vast majority of safe drivers also slowed down. Therefore, our programme should instruct the car to slow down.
Rather than writing millions of rules ourselves, the programme looks first at what safe driving has been historically, then compares it to what the car can sense to determine rules of its own. That’s the fundamental promise of AI – code that writes code.
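The compare-with-historical-data idea above can be sketched as a toy nearest-neighbour classifier. This is a minimal illustration of the principle, not how real self-driving systems work; the sensor readings and labels are invented:

```python
# Toy sketch: deriving a driving rule from historical data instead of
# hand-writing it. Each record is (own_speed, avg_speed_of_nearby_cars)
# labelled with what a safe driver actually did. All numbers are invented.
historical = [
    ((30, 28), "maintain"),
    ((30, 12), "slow down"),   # traffic ahead slowing: safe drivers slowed
    ((50, 48), "maintain"),
    ((50, 15), "slow down"),
    ((20, 5),  "slow down"),
    ((40, 41), "maintain"),
]

def safest_action(own_speed, nearby_speed, k=3):
    """Return the action taken in the k most similar historical situations."""
    def distance(record):
        (s, n), _ = record
        return (s - own_speed) ** 2 + (n - nearby_speed) ** 2

    nearest = sorted(historical, key=distance)[:k]
    actions = [action for _, action in nearest]
    return max(set(actions), key=actions.count)  # majority vote

# Approaching a junction at 30 mph while surrounding cars drop to 10 mph:
print(safest_action(30, 10))  # prints: slow down
```

No rule about junctions was ever written; the “slow down” behaviour emerges from the recorded examples, which is the sense in which the data writes the rules.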
Previously impossible, better than before
That basic idea can be used in many other contexts. By comparing a kidney scan with millions of others, you can tell if a kidney is healthy. Today, spam filters, credit card fraud detection, Amazon recommendations, loan applications, Google searches, firewalls and even terrorist watch-lists are all supported by this type of AI to some extent.
These are great examples of AI helping us do new things, or do existing things much better. DeepMind can spot things even experienced doctors can’t. An army of financial analysts could never deal with the volume of fraud that automated systems already handle.
The good news is that, as with all technology, AI is becoming much more accessible to businesses of all sizes. Today there are countless low-cost, ready-to-use smart services any business can hook into.
Why it matters
AI is a fundamental re-think at the heart of the technology industry. Rather than writing endless lines of ever more complex rules, we can now produce code that writes these rules for us. It will allow us to take on previously unsolvable problems and revisit old problems with far better results. And it is increasingly relevant and accessible to SMEs.
Terms like ‘game changing’ and ‘paradigm shift’ are used too often in technology. AI is such a fundamental shift that it will fuel a second computer revolution. AI could well be our best chance at beating cancer. If that’s not game changing or a paradigm shift, I don’t know what is.
Bradley Stacey is Digital Technical Head at Bray Leino. Follow him on Twitter.