Thread
Tweet 1
A thing I've noticed about GPT: if you prompt it in a way that leads it to say something wrong or contradict itself, it'll never correct itself, even if you point out the error and push.
---
Tweet 2
Though often, when it makes an error, a slightly different prompting path would have given a correct result.
---
Tweet 3
Much like people tbh
---