@abucci@anthony.buc.ci Interestingly, the Wikipedia article on GPT-3 describes it as:

Generative Pre-trained Transformer 3 (GPT-3) is an autoregressive language model released in 2020 that uses deep learning to produce human-like text. Given an initial text as prompt, it will produce text that continues the prompt.

Which is even more confusing to me, mostly because it doesn't mention a neural network at all. Basically I was (on my short-lived holiday) doing some R&D on neural networks, evolutionary algorithms, and other reading 😅
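The "autoregressive" part of that definition just means the model repeatedly predicts the next token given everything generated so far, then feeds its own output back in as input. A toy sketch of that loop (a bigram word model, nothing like GPT-3's transformer, and the corpus here is made up):

```python
import random

# Tiny made-up corpus for illustration only.
corpus = "the cat sat on the mat the cat ate the food".split()

# Build a bigram table: for each word, the words observed to follow it.
follows = {}
for prev, nxt in zip(corpus, corpus[1:]):
    follows.setdefault(prev, []).append(nxt)

def continue_prompt(prompt, n_tokens):
    """Autoregressively extend the prompt one token at a time:
    each new token is sampled conditioned on the text so far."""
    tokens = prompt.split()
    for _ in range(n_tokens):
        candidates = follows.get(tokens[-1])
        if not candidates:  # no known continuation, stop early
            break
        tokens.append(random.choice(candidates))
    return " ".join(tokens)

print(continue_prompt("the cat", 4))
```

The neural network in GPT-3 replaces the bigram lookup table with a learned next-token probability distribution, but the generate-one-token-then-repeat structure is the same.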
