Artificial Intelligence: ChatGPT on the line!
The number of neurons in the human cortex is on the order of 1 followed by 11 zeros (100 billion).
The number of neurons in GPT-3, the language model behind OpenAI's ChatGPT, is on the order of 1 followed by 8 zeros (100 million).
Moreover, this difference of three orders of magnitude grows if we look at the connections between neurons, which in the brain are far more numerous and diffuse than in GPT-3.
Even allowing for the gaps in our knowledge of how the human brain works, the comparison is striking: "only" a few orders of magnitude!
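The back-of-the-envelope comparison above can be sketched as a quick calculation. Note that the neuron counts are the round figures quoted in the text, not exact measurements:

```python
import math

# Round figures from the text (orders of magnitude, not exact counts)
human_cortex_neurons = 10**11  # ~100 billion neurons
gpt3_neurons = 10**8           # ~100 million, the figure quoted above

ratio = human_cortex_neurons // gpt3_neurons
orders_of_magnitude = round(math.log10(ratio))

print(f"brain/GPT-3 ratio: {ratio}x")          # 1000x
print(f"orders of magnitude: {orders_of_magnitude}")  # 3
```

The three orders of magnitude are simply the difference of the exponents (11 − 8), which is why counting connections rather than neurons would widen the gap further.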
Without getting into the endless discussions that flood the web about the conversational experience, the ethical aspects, or the implications of the internet becoming saturated with artificially generated content, I wonder:
What level of neural complexity, in number of neurons, will we be able to reach in the next 20 years?
In only four months of existence, ChatGPT has had a remarkable impact on public opinion. Will that impact translate into real activity? In which industries will it become a transformative force, or will it remain mere opinion?
It is too early to tell, but it seems clear that many people with very different profiles are thinking about how to apply it, and some are already applying it to their work. Beyond GPT-3, which is still a proof of concept, AI is here to stay.
A growing list of applications is already a reality. Although perhaps with less notoriety than ChatGPT, AI is already part of the toolbox of IT and systems professionals.
Even the consensus on what counts as AI, at least at the application level, is dynamic: what was considered AI 10 years ago is no longer considered AI today. It seems safe to assume this will still be the case 10 years from now.
Finally, I wonder about the controversial possibility that synthetic content could be used to train new models, whose output trains further models, and so on.
But in a way, isn't this how we humans learn and generate knowledge? We consume information and content from others to generate our own, which in turn is used by others.
Obviously, there are two major differences: the scale, and the way we generate our knowledge and content. ChatGPT has been trained on a huge amount of content.
It could be said that it is simply responding with something it has read, something stored in its memory, although that would oversimplify how ChatGPT actually works.
Proponents and opponents fill blogs and forums with arguments in both directions. Great challenges lie ahead: technical, economic, and social issues that will present us with both threats and opportunities. Humans will have to decide, at least for the time being... how to deal with them.
Below is a summary of a recent interview with Sam Altman (CEO of OpenAI) on the outreach channel Dot CSV (@dotCSV).
I liked his level-headedness and his vision; he seems to be someone with well-grounded ideas: