In addition, GPT-2 outperforms other language models trained on specific domains (such as Wikipedia, news, or books) without needing those domain-specific training datasets. The current public model shown here uses only 345 million parameters; the "full" model (which has over 4x as many parameters) is being withheld from the public because of its "potential for abuse".
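As a minimal sketch of what working with that public 345-million-parameter checkpoint looks like, the snippet below loads it and samples a continuation. It assumes the Hugging Face transformers library, which distributes this checkpoint under the name "gpt2-medium"; the prompt and sampling settings are illustrative choices, not anything specific to this page.

```python
# Minimal sketch: sample text from the public 345M-parameter GPT-2.
# Assumes the Hugging Face transformers library, which publishes
# this checkpoint as "gpt2-medium".
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2-medium")
model = GPT2LMHeadModel.from_pretrained("gpt2-medium")

prompt = "In a shocking finding, scientists discovered"  # example prompt
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Sample a continuation; top-k sampling keeps output varied but coherent.
output = model.generate(
    input_ids,
    max_length=60,
    do_sample=True,
    top_k=40,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Running the same code with the model name "gpt2" instead would use the smaller 124M-parameter release; the withheld "full" model (roughly 1.5 billion parameters) was not available at the time this was written.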