It seems it has (rudimentarily) learned concepts general enough to amount to logic itself. That is general intelligence. Now hook it up to reinforcement circuitry and make it even larger, and it will mark the end of life as we know it.
GPT-3 has 175 billion parameters, while the human brain has roughly 100 trillion synapses, so GPT-3 sits at about 0.175% of that count. NN model capacity currently has a 3.4-month doubling time.[1] Closing that ~570x gap takes roughly 9-10 doublings, putting us in a similar ballpark in 2-3 years.
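The arithmetic behind that estimate can be checked directly; a minimal sketch, assuming the 175B/100T figures and the 3.4-month doubling time cited above:

```python
import math

gpt3_params = 175e9       # GPT-3 parameter count
brain_synapses = 100e12   # rough human synapse count
doubling_months = 3.4     # capacity doubling time from the OpenAI post

ratio = brain_synapses / gpt3_params      # size gap, ~571x
doublings = math.log2(ratio)              # doublings needed to close it, ~9.2
years = doublings * doubling_months / 12  # calendar time at that rate, ~2.6

print(round(ratio), round(doublings, 1), round(years, 1))  # → 571 9.2 2.6
```

So the "2-3 years" figure follows mechanically from the quoted doubling time, granting the 1-synapse-per-parameter assumption discussed below.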
Is there any specific reasoning behind equating 1 synapse to 1 NN parameter? It seems a bit simplistic; a synapse probably has more computational ability than a single parameter.
Real neurons have many other trainable parameters and a lot more computational structure, so this is of course a simplifying assumption. But it is not entirely baseless either: ANNs can in theory approximate any function, which may suggest that synaptic weights do the heavy lifting in biological brains (since what more than generality do you need?).
Though biological brains are likely overly complicated due to evolutionary baggage. There are hydrocephalus cases with much-reduced brain matter but still high IQ.[1] The recurrent laryngeal nerve in giraffes is about 4.6 metres (15 ft) long because it loops down and back up the neck, since it could not be rewired more directly during evolution.[2] Our pristine mathematical models and low-noise computational environments are likely superior to evolved wetware hacks.
Also, if anything, brains are hyper-optimized for many things (judging by the many specialized sub-units). I'd bet we are essentially not unsupervised: the sub-units of the brain are fine-tuned for many tasks and hyper-optimized to use all their resources incredibly efficiently (the memory optimization must be intense). Not that generative models won't get close in some general way relatively soon, but I could see human brains being another 10-1000x more powerful than your ballpark pretty easily.
[1] https://openai.com/blog/ai-and-compute/