assuming you see an image every 400 ms, which is a good approximation given blinking and the activation time of neural pathways.
At that rate, a billion images works out to roughly 12.7 years of nonstop viewing.
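A quick back-of-the-envelope check of that figure, assuming exactly one image every 400 ms and 365-day years:

```python
# Verify the "~12 years" claim: a billion images at 400 ms each.
seconds_per_image = 0.4
total_images = 1e9

total_seconds = total_images * seconds_per_image   # 4e8 seconds
years = total_seconds / (365 * 24 * 3600)          # seconds per year
print(f"{years:.1f} years")                        # -> 12.7 years of nonstop viewing
```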
there have been systems that learned to generalize after seeing a couple of examples rather than thousands (e.g., digit recognition)
a child can see just one animal and label it as a monkey; an algorithm could probably do the same with more algorithmic machinery, but we are not there yet
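One plausible form of that extra machinery is one-shot classification by nearest neighbor in a learned embedding space. The sketch below is purely illustrative and not from any real system: `embed`, `one_shot_classify`, and the data are made up, and a fixed random projection stands in for a learned feature extractor.

```python
import numpy as np

# Hypothetical one-shot classifier: label a new input by the class whose
# single support example is closest in embedding space.
rng = np.random.default_rng(0)
projection = rng.normal(size=(64, 16))  # stand-in for a learned embedding

def embed(x: np.ndarray) -> np.ndarray:
    """Map a raw input vector to a feature embedding."""
    return x @ projection

def one_shot_classify(support: dict[str, np.ndarray], query: np.ndarray) -> str:
    """Return the label whose one support example is nearest to the query."""
    q = embed(query)
    return min(support, key=lambda label: np.linalg.norm(embed(support[label]) - q))

# One labeled example per class, e.g. one image of a monkey and one of a dog.
support = {"monkey": rng.normal(size=64), "dog": rng.normal(size=64)}
query = support["monkey"] + 0.1 * rng.normal(size=64)  # slightly perturbed monkey
print(one_shot_classify(support, query))  # -> "monkey"
```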