Google’s AI Lead on Piercing the Generative AI Hype at The AI Summit London
The astonishing capabilities of ChatGPT and other generative AI models have often been described – breathlessly at times – as magical by their users.
Laurence Moroney, AI advocacy lead at Google, wants to bring everyone down to earth. “If we can’t break out of the hype overdrive as an industry – this is where things go to die – we could be faced with another AI winter,” he said during a session at AI Summit London.
Moroney cited the famous Gartner Hype Cycle for Emerging Technologies to pinpoint where generative AI is today: at the Peak of Inflated Expectations.
On the hype cycle trajectory, the stage that follows the Peak of Inflated Expectations is the Trough of Disillusionment, when the sky-high expectations for a new technology collide with reality. But if the technology can power through, it reaches the Slope of Enlightenment, when its benefits to the enterprise are better understood and solidified. The last stage, the Plateau of Productivity, is reached when the technology achieves mainstream adoption.
“A lot of technologies, if they can't get out of this Peak of Inflated Expectations, go to die,” Moroney said.
Remember Google Glass? Moroney said that when the voice-controlled AR glasses were introduced, the hype around them took off. The smart glasses hit the market in 2014 at a price of $1,499. As excitement increased, so did societal fears – which is what’s happening with generative AI today.
Back then, “people were getting beaten up in bars in San Francisco because they thought they were filming people with their Google Glass,” he said. “Movie theaters banned it because they thought you can actually use Google Glass to copy a movie.”
The truth is, Google Glass could only record video for about 10 minutes. Also, “if you try to take pictures or anything, it lights up,” he said, making it hard to secretly photograph someone. However, these facts “got lost in this Peak of Inflated Expectations, and as a result was never able to get out of that.”
Google Glass became mired in the Trough of Disillusionment and never escaped. Three months ago, Google discontinued all iterations of Google Glass, which never gained any market traction in its nearly decade-long life.
Generative AI is now at the Peak of Inflated Expectations, and with it comes societal fears such as widespread job losses and even existential risks.
Moroney said his job is to help generative AI pass through the Trough of Disillusionment toward the Plateau of Productivity. “We’re using the product, we’ve kicked the tires, we understand the limitations.”
To get to the productivity stage, it is important to educate people about what the technology is and is not, Moroney said.
Avoiding Another AI Winter
AI and machine learning flip traditional programming on its head. Traditional programming is rules-based: developers write explicit rules that turn inputs into an output. In machine learning, developers start with the desired output and a dataset, and the machine comes up with the rules.
For example, let’s say you’re trying to program a smartwatch to track the user’s fitness. In a rules-based approach, you could write a rule that says if the watch is traveling at less than four miles an hour, the person wearing it is likely walking. Then you write more rules to determine if a person is running or riding a bike. But there are many more scenarios in which people could be traveling slower but not walking, such as when playing golf.
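The rules-based approach Moroney describes can be sketched in a few lines of Python. The thresholds and activity names here are hypothetical, chosen only to illustrate how quickly hand-written rules run out of road:

```python
# Rules-based activity detection: a human writes every rule by hand.
# Thresholds are made up for illustration.
def classify_activity(speed_mph: float) -> str:
    if speed_mph < 4:
        return "walking"   # but a golfer also moves this slowly without walking
    elif speed_mph < 8:
        return "running"
    elif speed_mph < 20:
        return "biking"
    return "unknown"

print(classify_activity(3.0))   # walking
print(classify_activity(6.0))   # running
print(classify_activity(12.0))  # biking
```

Every edge case – golf, rowing, pushing a stroller uphill – demands yet another hand-written rule, which is exactly the combinatorial explosion Moroney calls infeasible.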
There would be myriad possibilities and it would be “really, really infeasible” for a human developer to write rules for every single one, Moroney said.
But “what if instead of us trying to figure out the rules, we have a computer figure out the rules by telling it the answers?” he posited. “We get people to wear devices like these and tell them that they're walking and tell them that they're running. You do that every time you use it. And then we get a computer to determine the patterns between these things.”
The AI model would sort through a multiplicity of factors to find patterns, too numerous and nuanced for people to do manually.
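The machine-learning inversion Moroney describes – label the answers, let the computer find the pattern – can be illustrated with a toy nearest-centroid classifier. The features (speed, step cadence) and all numbers are invented for this sketch; a real model would learn from far more signals and samples:

```python
# ML flavor of the same problem: instead of writing rules, we supply
# labeled examples and let the computer derive the pattern.
# Features are (speed in mph, step cadence); all values are made up.
from collections import defaultdict
import math

labeled_samples = [
    ((3.0, 110.0), "walking"),
    ((3.4, 118.0), "walking"),
    ((6.5, 165.0), "running"),
    ((7.2, 172.0), "running"),
    ((12.0, 80.0), "biking"),   # fast, yet low step cadence
    ((14.5, 85.0), "biking"),
]

# "Training": average the feature vectors for each activity label.
sums = defaultdict(lambda: [0.0, 0.0, 0])
for (speed, cadence), label in labeled_samples:
    sums[label][0] += speed
    sums[label][1] += cadence
    sums[label][2] += 1
centroids = {lbl: (s / n, c / n) for lbl, (s, c, n) in sums.items()}

def classify(speed: float, cadence: float) -> str:
    # Predict the label whose learned centroid is nearest the new sample.
    return min(centroids, key=lambda lbl: math.dist((speed, cadence), centroids[lbl]))

print(classify(3.2, 112.0))   # walking
print(classify(13.0, 82.0))   # biking
```

No human wrote a speed threshold here; the boundaries fall out of the labeled data, which is the inversion Moroney is pointing at.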
Generative AI is an iteration of this process. In the fitness example, data on walking, biking and running is represented numerically and labeled by activity. Generative AI applies the same logic through Transformers, a revolutionary technique Google developed: after finding patterns in sequences of words, the model can predict the sequence of words that follows.
For example, given the words “If you’re happy and you know it,” the sequence most likely to follow is “clap your hands.”
“So a Transformer, all it’s doing is it’s learning that pattern,” he said. And “if you get enough data and you start breaking down that data − those words − into sequences, what these large models start to do is … generalize” that this sequence is usually followed by that sequence.
“You've given (the language model) a prompt to tell me how to do something, it will then generate the words that follow that prompt,” Moroney explained. “It has figured out the rules that determine how one set of words will match in the next set of words. That's what a Transformer is all about.”
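The pattern-learning idea can be illustrated – far more crudely than a real Transformer, which attends over whole contexts rather than single words – with a word-level next-word counter trained on Moroney’s own example:

```python
# Toy sketch of "learn which words follow which": for each word, count
# which word most often comes next in the training text, then generate
# a continuation from a prompt. A real Transformer learns vastly richer
# patterns, but the framing (given context, predict what comes next) is
# the same.
from collections import Counter, defaultdict

training_text = (
    "if you're happy and you know it clap your hands "
    "if you're happy and you know it clap your hands "
    "if you're happy and you know it then your face will surely show it"
)

follows = defaultdict(Counter)
words = training_text.split()
for prev, nxt in zip(words, words[1:]):
    follows[prev][nxt] += 1

def predict_next(word: str) -> str:
    # The most frequently observed successor of this word.
    return follows[word].most_common(1)[0][0]

# Generate a continuation one predicted word at a time.
word = "happy"
out = [word]
for _ in range(6):
    word = predict_next(word)
    out.append(word)
print(" ".join(out))  # → happy and you know it clap your
```

Scale the counts up to billions of parameters over trillions of tokens, and this “predict what follows” framing is how a prompt turns into generated prose.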