The AI technology we’re using isn’t “new”: the core idea is several decades old, with only minor updates since then. We’re just using more parallel processing and bigger datasets to brute-force the “advances”. So, no, it’s not actually that new.
We need a big breakthrough in the technology for it to actually get anywhere. Without that breakthrough, the bubble will burst once the hype dies down.
The landmark paper that ushered in the current boom in generative AI, “Attention Is All You Need” (Vaswani et al., 2017), is less than a decade old (and attention itself as a mechanism dates to 2014), so I’m not sure where you’re getting the idea that the core idea is “decades” old. Unless you’re taking the core idea to mean neural networks, or digital computing?
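For anyone unfamiliar with what that 2017 paper actually introduced, here’s a minimal NumPy sketch of scaled dot-product attention, the mechanism the paper is named after: softmax(QKᵀ/√d_k)V. This is a toy illustration of the formula, not a production implementation:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention (Vaswani et al., 2017):
    softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output is a weighted sum of the values

# Toy example: 3 tokens with 4-dimensional embeddings, self-attention (Q = K = V)
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4)
```

The point of the example: every token attends to every other token in one matrix multiply, which is what makes the architecture so parallelizable on GPUs and why it scaled the way it did.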
I just don’t get this. There hasn’t been some huge leap in processing power over the past few years, but there has been one in generative AI. Parallel processing, on the other hand, has been around for decades.
I just don’t know how someone can look at this and think there hasn’t been a big step forward in AI, and instead claim it’s all processing power. I think it’s pretty obvious that there has been a huge leap in the generative AI world.
Also, I’ve been incorporating it more and more. It boggles my mind that someone would look at this and see a passing fad.