Is AI More Intelligent Than A 1-Year-Old? Can AI Be Trained Using AI?
Artificial Intelligence's Limitations
The answer to both questions is, for now, no. That raises the question of how much of the hype surrounding AI is actually justified. Andy Kessler, a columnist for the Wall Street Journal, has made a noteworthy observation: AI is incapable of teaching itself new skills.
OpenAI recently raised a record-breaking $6.6 billion in venture capital funding, valuing the company at $157 billion. Despite this, the company is expected to lose $5 billion this year and projects losses totaling $44 billion by 2029.
Hype Versus Reality
Despite the hype and grandiose claims made by companies like OpenAI, there are several realities that dampen the enthusiasm. For instance, Moravec's paradox holds that AI is still less intelligent than a baby. The paradox, articulated by robotics researcher Hans Moravec in 1988, observes that it is relatively easy to program computers to perform at adult levels on intelligence tests or games like checkers, but far harder to give them the perception and mobility skills of a one-year-old.
Apple's AI researchers seem to agree with this assessment, noting that current large language models (LLMs) are not capable of genuine logical reasoning. Instead, they attempt to mimic the reasoning steps observed in their training data.
Challenges Faced by AI
Another challenge for AI is the linguistic apocalypse paradox, which holds that an LLM's apparent intelligence is borrowed from the human reasoning embedded in the words and sentences it is trained on. Large language models need ever more human-written text as input to keep improving, yet some researchers estimate that the supply of publicly available written text will be exhausted sometime between 2026 and 2032.
Furthermore, AI models cannot simply be trained on AI-generated prose: doing so recursively leads to model collapse, in which each generation of model loses information from the original data until the output degenerates into gibberish.
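The mechanism behind model collapse can be sketched with a toy experiment. In the hypothetical simulation below, each "model" is just a Gaussian fitted to the previous generation's synthetic data, and each generation under-samples rare tail values, one commonly cited failure mode of generative models. The setup is purely illustrative, not a claim about any real LLM's training pipeline:

```python
import random
import statistics

def degrade(generations=10, n=10_000, tail_cutoff=2.0, seed=0):
    """Toy model-collapse simulation.

    Each generation fits a Gaussian (mean, std) to its data, then
    produces the next generation's data by sampling from that fit
    while rejecting values beyond `tail_cutoff` standard deviations,
    mimicking a generative model that under-represents rare events.
    Returns the fitted standard deviation at each generation.
    """
    rng = random.Random(seed)
    data = [rng.gauss(0.0, 1.0) for _ in range(n)]  # "real" data
    stds = []
    for _ in range(generations):
        mu = statistics.fmean(data)
        sigma = statistics.pstdev(data)
        stds.append(sigma)
        # Next generation trains only on the model's own output,
        # which never reproduces the tails of the distribution.
        data = []
        while len(data) < n:
            x = rng.gauss(mu, sigma)
            if abs(x - mu) <= tail_cutoff * sigma:
                data.append(x)
    return stds

stds = degrade()
```

Each round of training-on-output shrinks the fitted spread, so the diversity of the original data is progressively forgotten, which is the qualitative behavior researchers describe as collapse.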
The scaling paradox suggests that large language models follow power-law curves: increasing model size, dataset size, or computation does boost performance, but each successive doubling buys a smaller improvement than the last.
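Those diminishing returns fall directly out of the power-law form. The sketch below uses a Chinchilla-style loss curve, L(N, D) = E + A/N^α + B/D^β, with purely illustrative constants (not a fitted law), and shows that each doubling of model size reduces the loss by less than the doubling before it:

```python
def power_law_loss(n_params, n_tokens,
                   E=1.7, A=400.0, B=410.0, alpha=0.34, beta=0.28):
    """Chinchilla-style scaling curve with illustrative constants:
    irreducible loss E plus power-law terms in parameters and data."""
    return E + A / n_params**alpha + B / n_tokens**beta

# Double the parameter count repeatedly while holding data fixed.
sizes = [1e9 * 2**k for k in range(5)]          # 1B .. 16B params
losses = [power_law_loss(n, 1e12) for n in sizes]
gains = [losses[i] - losses[i + 1] for i in range(len(losses) - 1)]
```

Because the parameter term scales as N^(-α), every doubling multiplies the remaining improvement by the same factor 2^(-α) < 1, so the gains list shrinks geometrically: performance keeps improving, but at an ever higher price per increment.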
Economic Implications
The spending paradox is another major concern. Data centers currently have enormous demand for graphics processing units to power AI training, but venture capitalist David Cahn of Sequoia Capital wonders whether this is sustainable. He estimates that the AI industry needs to generate $600 billion in revenue to pay back the AI infrastructure spending to date.
Goldman Sachs’s head of research has questioned whether the spending on AI is yielding enough benefits. Nobel laureate and Massachusetts Institute of Technology economist Daron Acemoglu thinks AI can perform only 5% of jobs and warns that a lot of money is going to get wasted.
Dot-Com Bust Comparison
The current situation with AI is reminiscent of the dot-com bust. Gemstar-TV Guide International Inc., once expected to hold the key to the future of interactive television, saw its stock fall hard after the crash.
Bottom Line
While AI has the potential to transform our lives for the better, its journey is not a straight path upward. The hype surrounding AI must be weighed carefully against the reality of its capabilities and limitations, and the lessons of the dot-com bust should serve as a cautionary tale. What are your thoughts on this matter? Share this article with your friends and let us know your views. You can also sign up for the Daily Briefing, delivered every day at 6 p.m.