With AI being the buzzword of the moment, Apple is further along the road than you might, at first, think
Apple has never been a follower of trends, rather an innovator. I think that could well be what we’re seeing with their current development of Artificial Intelligence.
Billions, and I do mean billions, are being spent on R&D by Apple in this area right now. The company's financial reports show that over $7 billion was poured into research & development…in the last quarter ALONE!
Whilst all the current chat and focus of attention at the Californian HQ is on the AR/VR headset, that R&D spend proves that they are busy working on future projects, ideas, and technologies.
To think that Apple, one of the leading tech companies in the world, if not the leading one, is not looking at current trends just doesn't wash with me. Maybe it's because they are simply doing things their way.
A reader of mine recently commented that Apple doesn’t need to be first here – just the best.
All in the name
While others chase the AI headlines, Apple has been quietly beavering away in the background with what they call Machine Learning (ML). Every Apple-designed processor since 2017's A11 Bionic chip has had a Neural Engine – a part of the chip dedicated solely to processing ML algorithms. It is ML that has been at the core of many of Apple's most recent features.
Live Text blew us away when it was first launched – the fact that you could select text from a photo or image, then copy & paste it, was groundbreaking. So too was the ability to search your entire photo library with a single word – tree, food, cat, etc. All of this is only possible because of Apple's investment in ML.
Whether AI and ML are the same thing, or whether ML is merely a branch-off from AI, is a separate debate. What is unquestionably true, though, is that they occupy the same territory, and they are ever so close in functionality.
Somewhere deep inside ChatGPT, Bard, and Bing there is some machine learning going on – even last year's hit, DALL-E, uses it to produce its images. Within their DNA, they rely on data. The more we use them, the more data they are fed, and the better they become. We truly live in an age where data and algorithms are king.
The best matters
Apart from developing the Neural Engines in their Bionic chips, and the R&D money being spent, Apple has also made sure to have the finest people behind the ML side of their business as well.
John Giannandrea joined Apple in 2018. Who's he? Well, he is the Senior Vice President of Machine Learning and AI Strategy, and reports directly to CEO Tim Cook.
Giannandrea oversees the strategy for artificial intelligence and machine learning across the company and the development of Core ML and Siri technologies.
He is a graduate of the UK's University of Strathclyde in Glasgow, Scotland, where he earned a Bachelor of Science with Honours in Computer Science, and was later awarded a Doctorate Honoris Causa.
Early in his career, he co-founded two companies, before spending eight years at Google, from where he joined Apple. The fact that Giannandrea is so highly thought of in the industry, and reports directly to Cook, once again demonstrates that Apple has far from taken their eye off the ball, and shows the importance they are placing on this area.
Whilst researching this story, I also found that Apple has a public website where they openly discuss and highlight their current papers and thinking on ML.
They also regularly contribute to open-source machine learning projects. I even discovered there is a framework already in place, Core ML, specifically designed to help developers integrate machine learning into their projects more easily.
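For the developers among you, here is a rough idea of what using Core ML looks like in practice. This is only a minimal sketch: it uses Apple's Vision framework on top of Core ML, and assumes a hypothetical image-classifier model named `MyClassifier` has been added to the Xcode project (Xcode generates the `MyClassifier` class from the model file for you).

```swift
import CoreML
import Vision

// A minimal sketch of running a Core ML image classifier via Vision.
// "MyClassifier" is a hypothetical .mlmodel bundled with the app.
func classify(_ image: CGImage) throws {
    // Wrap the compiled Core ML model for use with Vision.
    let coreMLModel = try MyClassifier(configuration: MLModelConfiguration()).model
    let model = try VNCoreMLModel(for: coreMLModel)

    // Build a request whose completion handler receives the predictions.
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        // Report the best label and its confidence.
        print("\(top.identifier): \(top.confidence)")
    }

    // On A-series and Apple silicon chips, Core ML can schedule this
    // work on the Neural Engine automatically.
    try VNImageRequestHandler(cgImage: image).perform([request])
}
```

The point is less the specific API calls than how little code sits between an app and the Neural Engine – the developer never has to target the dedicated hardware directly.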
So, all in all, I think Apple are pretty invested in the future of AI or ML – whichever name you want to hang on it.
Unfair to compare
Siri is often the butt of jokes when it comes to voice assistants.
Recently, it has been criticised and compared to some of the chatbots doing the rounds. In all honesty, not only are they different creatures, but the chatbots have been far from perfect in some of their responses.
When Google recently held a press event for its soon-to-be-released chatbot, Bard, it made some factual errors when asked about the James Webb Space Telescope.
My gut feeling is that Apple is sitting back and letting some of the other players take the hits. Also, Apple's goals are different from those of Microsoft or Google.
Search could soon be AI-driven, and we could be witnessing a change in our habits. Soon, when we need to know, for instance, what the most expensive bottle of wine is, we could turn to chatbots ahead of search engines.
As such, the other companies have far more to lose. Google and YouTube are both, essentially, massive search engines, and those search engines generate massive income from traffic, data, and advertisers. Clearly, if we start turning to chatbots instead, then as search-based enterprises they need to be the first to adapt and adopt. Billions are at risk for them.
When Apple feels they have created a bot, or an ML system, capable of flowing, contextual conversation, that is when they will pounce. The benefits of being able to converse naturally with Siri are obvious, and far-reaching throughout their ecosystem.
I mentioned that Apple regularly contributes to open-source projects. One project they are working closely with is Stable Diffusion, the art-based AI image model. In December, Apple announced that Stable Diffusion was now optimised for both Core ML and Apple silicon. This close collaboration suggests that Apple is working toward an even deeper integration of ML into their devices.
If Apple can make their on-device searches quicker and more accurate, that would bring them a more direct benefit than joining the chatbot war with the likes of Google & Microsoft.
As you can tell, much of Apple's background development goes into their chips' Neural Engines, machine learning, and artificial intelligence – none of which makes for headline-grabbing, sexy news stories.
But what Apple is doing here is looking organically at what suits them and their devices best. I'm sure, deep down, they probably know that Siri is a little limited right now, but rather than rushing out half-baked improvements, they will be mining a pool of information and data in the background, ready to make wholesale, quantum changes.
With the billions they are spending, and the resources they have directed to it, Apple are well aware of what the future holds in store.
Assuming that the headset does get released this year, the better its machine learning-led responses are by then, the better the device's reception at launch is likely to be.
First is not best – as my reader said, best is best.