This thread cracks me up.
We Bogleheads like to trust the market to do the right thing (overall, long term) and then like to second-guess certain trends, especially in response to misleading analysis from unreliable sources.
Much of the growth and innovation in the US market is in tech. Those companies are very large and compete with one another to provide unique, deep, and needed services to the population. Each of those large tech companies has decided that (newly developed at scale) AI models provide a breakthrough way to improve the services they provide to customers. And each of them is convinced enough of that to invest billions or tens of billions of their dev budgets to incorporate AI models into their core services.
But not to worry, it sounds like 'Tulip mania' to someone, who writes a blog post to that effect. LOL.
Nvidia does effectively have a monopoly on high-performance AI chips. Their products are, technically, mind-blowing. I use them to accelerate my compute-intensive tasks at work, and have to look at the nitty gritty of how they work. The innovation and implementation (and scale) are just remarkable, a 'breakout' approach after the effective exhaustion of Moore's Law a decade ago.
For example, a used, $500 V100-class (7-year-old) Nvidia GPU card, applied to scientific computation, is >100x faster than a 12-year-old CPU-based Linux cluster that cost over $50k. 100x faster for 1% of the price. One is the size of an appliance and needs dedicated power and cooling; the other you can hold in your hand and install in your desktop. People are porting their code from one platform to the other as fast as their nerdy fingers can type.
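To make the back-of-envelope math explicit (using only the rough figures from the comparison above, which are estimates from my own experience, not formal benchmarks):

```python
# Rough price/performance comparison from the figures above.
# All inputs are ballpark estimates, not measured benchmarks.
cluster_cost = 50_000   # 12-year-old CPU cluster, USD
gpu_cost = 500          # used V100-class GPU card, USD
speedup = 100           # GPU throughput relative to the cluster (rough claim)

price_ratio = gpu_cost / cluster_cost          # 1% of the price
perf_per_dollar_gain = speedup / price_ratio   # combined gain

print(f"price ratio: {price_ratio:.0%}")             # price ratio: 1%
print(f"perf per dollar: {perf_per_dollar_gain:,.0f}x")  # perf per dollar: 10,000x
```

In other words, 100x the speed at 1% of the price works out to roughly a 10,000x improvement in performance per dollar.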
Nvidia developed a solution to efficient multi-processor computing, and has built it at scale. Whether you believe in 'AI' or not is immaterial... there are many different 'AI' algorithms for different tasks, and many other compute-heavy modeling tasks that are not AI.
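The core pattern behind that kind of multi-processor computing is mapping one function over many independent pieces of data at once. A minimal CPU-side sketch of the idea (a GPU runs thousands of such tasks concurrently; this toy uses a handful of threads, and `kernel` is just a hypothetical stand-in for a compute-heavy task):

```python
from concurrent.futures import ThreadPoolExecutor

def kernel(x):
    # Stand-in for an independent, compute-heavy task.
    # On a GPU, thousands of these run at the same time.
    return x * x

# Map the same function over many inputs in parallel.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(kernel, range(8)))

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The point is the shape of the workload, not this particular code: when tasks are independent like this, throughput scales with the number of processors you can throw at them, which is exactly what GPU architectures exploit.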
And Nvidia's architecture is simply the future of large-scale computing. The whole world is jumping on it, after a decade of stagnation in compute performance/price growth.
Statistics: Posted by just frank — Sat Jun 22, 2024 5:20 am — Replies 62 — Views 8187