...
Yeah, that's the thing with AI: it's impressive and all, but it's still a bit rough around the edges. It's a bit like an autistic kid: they might know the answer, but they don't know how to express it in the way you're expecting.
In this case we can see the AI isn't adding anything interesting to the conversation. I think it just focused on the math and not on the psychological aspect.
Even in terms of pure math, ChatGPT can be wrong sometimes. I remember asking it to calculate the probability of winning for the UK's Premium Bonds, but the answer it gave me didn't seem quite right, so I started to quiz it a little more, and eventually it just admitted to being wrong and apologised.
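
For anyone wanting to sanity-check that kind of answer themselves, here's a rough Python sketch of the sort of calculation I was after. The odds figure and holding size are placeholders rather than NS&I's official numbers, and it treats each £1 bond as an independent trial, which is only an approximation of how the draw actually works.

```python
# Rough sketch: chance of winning at least one Premium Bonds prize.
# Assumptions (not official figures): each GBP 1 bond wins some prize
# in a monthly draw with probability 1/ODDS, independently of the others.

ODDS = 22_000      # assumed odds per GBP 1 bond per month (illustrative)
holding = 1_000    # hypothetical number of GBP 1 bonds held
months = 12

# Probability of winning nothing in one monthly draw across the whole holding
p_no_win_month = (1 - 1 / ODDS) ** holding

# Probability of at least one prize in a month, and over a year of draws
p_win_month = 1 - p_no_win_month
p_win_year = 1 - p_no_win_month ** months

print(f"P(at least one prize in a month): {p_win_month:.2%}")
print(f"P(at least one prize in a year):  {p_win_year:.2%}")
```

Even a quick back-of-the-envelope script like that is enough to spot when the model's figure is off by an order of magnitude.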

Moral of the story: be careful when using it to develop any strategies involving real money, as it's just not there yet in terms of reliability.