Hello Peter,
I hope you are doing well. I just caught the Bill Dally interview of Yann LeCun at GTC 2025. Yann started off early by saying "I'm not so interested in LLMs anymore. They're kind of the last thing..."
He clearly stated, as you have said all along, that LLMs will not get us to AGI, or what he prefers to call Advanced Machine Intelligence.
Well, I certainly thought of you when I heard the interview.
https://www.youtube.com/watch?v=eyrDM3A_YFc
Best Regards,
Michael
Yes, indeed - the consensus is shifting. Slowly.
Best Regards,
Peter
" ... and we need just a few millions words to master language and reasoning. Not tens of trillions."
I'm pretty sure I don't know millions of words, or even go looking for them in the language/reasoning toolbox, but then I'm pretty sure I haven't mastered these things either.
I'm not sure I've consumed that many words either. And the combinations of the same words we use daily vary little, unless someone is trying to sound clever. When we have to look up a word or remind ourselves of its obscure meaning, we rarely use it again unless a simpler alternative is unavailable.
In general, I have found that the deeper aspects of reality, where thoughts and words cease to exist, allow for a greater understanding of our experience than can be expressed through language alone.
Why are we pursuing AGI as a species when we already possess the most sophisticated machinery for this purpose?
Why don't we solve the problems we need to solve ourselves, and develop future technology in support of human endeavors?
It seems to me that the drive behind AGI development is to replace humans altogether.