AIGC is at an inflection point: what's next for real-world adoption?
Original source: AGI Innovation Lab
According to a report released earlier this year by Bloomberg Intelligence analysts, the generative AI market could grow at roughly 42% a year over the next decade, driven first by demand for the infrastructure needed to train AI systems, and later by demand for devices that run AI models, for advertising, and for other services. The release of consumer-focused AI tools such as ChatGPT and Google's Bard is expected to fuel a decade-long boom that grows the AIGC market from $40 billion in revenue last year to an estimated $1.3 trillion by 2032.
Generative AI (AIGC) is gaining wider adoption, especially in business.
For example, recently, Walmart announced that it would roll out an AIGC app to 50,000 non-store associates. According to Axios, the app combines Walmart data with third-party large language models (LLMs) to help employees complete a range of tasks, from speeding up the drafting process to acting as a creative partner to summarizing large documents and more.
Such deployments help drive demand for the graphics processing units (GPUs) needed to train powerful deep learning models. GPUs are specialized processors that execute programming instructions in parallel, rather than sequentially as a traditional central processing unit (CPU) does.
According to the Wall Street Journal, training these models "could cost companies billions due to the massive amounts of data they need to ingest and analyze." These are the models that power chatbot applications such as ChatGPT and Bard.
01. Riding the wave of generative AI
The AIGC trend has given major GPU supplier Nvidia a powerful boost: the company reported eye-popping earnings for its most recent quarter. These are boom times, at least for Nvidia, as nearly every big tech company is trying to get its hands on high-end AI GPUs.
Erin Griffith writes in the New York Times that startups and investors are taking extraordinary measures to get their hands on these chips: "What tech companies are desperate for this year isn't money, engineering talent, hype or even profits; it's GPUs."
Ben Thompson calls it "Nvidia on top of the mountain" in this week's Stratechery newsletter. The momentum was further fueled by the announcement of a partnership between Google and Nvidia that will see Google's cloud customers gain greater access to technology powered by Nvidia GPUs. All of this points to the current scarcity of these chips in the face of surging demand.
Do current demands mark the culmination of a new generation of AI, or perhaps herald the beginning of the next wave of developments?
02. How generative technologies are shaping the future of computing
Nvidia CEO Jensen Huang said on the company's recent earnings call that this demand signals the dawn of "accelerated computing." He added that it would be wise for companies to "shift capital investment away from general-purpose computing and focus on generative AI and accelerated computing."
General-purpose computing refers to CPUs designed to handle a wide variety of tasks, from spreadsheets to relational databases to ERP. Nvidia's view is that CPUs are now legacy infrastructure, and that developers should optimize their code for GPUs, which can perform many tasks more efficiently than traditional CPUs.
GPUs can perform many calculations simultaneously, making them ideal for tasks such as machine learning (ML) that perform millions of calculations in parallel. GPUs are also particularly good at certain types of mathematical calculations, such as linear algebra and matrix manipulation tasks, which are the basis of deep learning and artificial intelligence.
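To make that concrete, here is a minimal Python sketch of the kind of workload described above: a single large matrix multiplication, the core operation of a neural-network layer, in which millions of independent multiply-adds can run in parallel. It assumes PyTorch is installed; the sizes and timing code are illustrative only.

```python
# Illustrative sketch: one large matrix multiplication, the core operation
# of a neural-network layer. PyTorch is assumed to be installed; the code
# falls back to the CPU if no GPU is available.
import time
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

# (batch x features) @ (features x hidden units): millions of independent
# multiply-adds, exactly the shape of work GPUs parallelize well.
x = torch.randn(4096, 4096, device=device)
w = torch.randn(4096, 4096, device=device)

start = time.time()
y = x @ w
if device == "cuda":
    torch.cuda.synchronize()  # GPU kernels run asynchronously; wait before timing
print(f"{device}: 4096x4096 matmul took {time.time() - start:.4f}s")
```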
03. GPUs are of little benefit to some types of software
However, other categories of software, including most existing business applications, are optimized to run on CPUs and benefit little from the parallel instruction execution of GPUs.
Thompson seems to hold a similar view: "My interpretation of Huang's point is that all of these GPUs will be used for many of the same activities that currently run on CPUs; this is certainly the optimistic view for Nvidia, because it means that any excess capacity built out in pursuit of generative AI will be filled by current cloud computing workloads."
He continued: "That being said, I doubt it: both humans and companies are lazy, and CPU-based applications are not only easier to develop, but are mostly already built. I have a hard time seeing which companies would take the time and effort to port something that already runs on the CPU over to the GPU."
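A hypothetical sketch of the kind of everyday business logic Thompson has in mind is below: each step depends on the result of the previous one and is dominated by branching rather than arithmetic, so there is little independent work for a GPU to speed up. The function and data are invented purely for illustration.

```python
# Hypothetical example: a running account balance with per-transaction
# branching. Each iteration depends on the previous result, so the loop
# is inherently sequential and gains little from GPU-style parallelism.
def apply_transactions(opening_balance, transactions):
    balance = opening_balance
    for txn in transactions:                        # must run in order
        if txn["type"] == "debit" and balance < txn["amount"]:
            raise ValueError("insufficient funds")  # control flow, not math
        balance += txn["amount"] if txn["type"] == "credit" else -txn["amount"]
    return balance

print(apply_transactions(100.0, [
    {"type": "credit", "amount": 50.0},
    {"type": "debit", "amount": 30.0},
]))  # -> 120.0
```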
04. History repeats itself
InfoWorld's Matt Asay reminds us that we've seen this before: when machine learning first emerged, data scientists applied it to everything, even when there were simpler tools. As data scientist Noah Lorang once pointed out, "Only a small subset of business problems are best solved by machine learning; most just need good data and an understanding of what it means."
The point is, accelerated computing and GPUs don't meet all software needs.
Nvidia had a strong quarter, driven by the current rush to develop a new generation of AI applications. The company is naturally enthusiastic about this. However, as we have seen from the recent Gartner Emerging Technology Hype Cycle, this new generation of AI is having a moment and is at the peak of inflated expectations.
Singularity University and XPRIZE founder Peter Diamandis said these expectations are about seeing future potential without any negative consequences. "At that point, the hype begins to generate unfounded excitement and inflated expectations."
05. Current Limitations
At this point, we may soon reach the limits of the current AIGC craze. As venture capitalists Paul Kedrosky and Eric Norlin of SK Ventures wrote on their firm's Substack: "Our view is that we are at the tail end of the first wave of AI based on large language models. That wave began in 2017 with the release of the [Google] Transformer paper ('Attention Is All You Need'), and it will conclude sometime in the next year or two as people run up against its various limitations."
These limitations include "a tendency to hallucinate, insufficient training data in a narrow domain, training corpora from years ago that are out of date, or a myriad of other reasons." They add: "We are already at the tail end of the current AI wave."
To be clear, Kedrosky and Norlin don't think AI has hit a dead end. Instead, they argue that substantial technological improvements are needed to achieve anything better than "so-so automation" and limited productivity gains. They believe the next wave will include new models, more open source, and especially "ubiquitous/cheap GPUs", which, if correct, may not bode well for Nvidia but would benefit those who need the technology.
As Fortune notes, Amazon has made clear its intention to directly challenge Nvidia's dominance in chipmaking. They're not alone, as many startups are vying for market share - as are chip giants including AMD. Challenging a dominant incumbent is extremely difficult. At least in this case, expanding the sources of these chips and reducing the price of scarce technology will be key to developing and spreading the AIGC wave of innovation.
06. The next wave of AI
Despite the limitations of current-generation models and applications, the future of AIGC is bright. There are likely several reasons behind this promise, but perhaps the most important is the generational worker shortage across the economy, which will continue to drive demand for higher levels of automation.
Although AI and automation have historically been viewed as separate, this view is changing with the emergence of AIGC. The technology is increasingly becoming a driver of automation and productivity. Mike Knoop, co-founder of workflow company Zapier, mentioned this phenomenon in a recent Eye on AI podcast, saying: "Artificial intelligence and automation are collapsing into the same thing."
McKinsey certainly believes so. "AIGC is poised to unleash the next wave of productivity," the firm said in a recent report. It is not alone: Goldman Sachs, for example, says a new generation of artificial intelligence could boost global GDP by 7%.
Whether or not we are at the pinnacle of the current generation of AI, it is clearly an area that will continue to evolve and spark debate across the enterprise. As great as the challenges are, so are the opportunities—especially in a world hungry for innovation and efficiency. The race for GPU dominance is just a snapshot in this unfolding narrative, the prologue to a future chapter in artificial intelligence and computing.