I understand that some game theory uses this definition of rationality, but I find it to be a misnomer. It's really just financial optimization. If a psychological need is valued more than a fixed monetary amount, it's not illogical to prefer the psychological satisfaction.
What if you consider that the utility measure already accounts for all these psychological factors? Then the numbers in the matrix don't represent raw dollars, but an accurate estimation[1] of how happy it would make you, in relative terms, to pick that option, all things considered. In that case, "rationality" is an accurate enough name for optimizing that, isn't it?
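To make the point concrete, here is a toy sketch (with entirely made-up numbers) of how the same choice looks under the two readings: maximizing raw dollars versus maximizing a utility that already folds in the psychological payoff. The option names and values are hypothetical, purely for illustration.

```python
def best_option(payoffs):
    """Return the option with the highest scalar payoff."""
    return max(payoffs, key=payoffs.get)

# Raw monetary outcomes of two hypothetical options.
dollars = {"take_the_money": 100, "keep_your_dignity": 0}

# Utilities that already price in psychological factors
# (made-up relative numbers, as the comment describes).
utility = {"take_the_money": 40, "keep_your_dignity": 70}

print(best_option(dollars))   # -> take_the_money
print(best_option(utility))   # -> keep_your_dignity
```

Both runs are "rational" in the optimization sense; they just optimize different scalars. The disagreement in the thread is about which scalar the matrix is taken to contain.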
That is formally equivalent to calling the utility "money" and assuming, again for the sake of mathematical simplification, that money is all you care about.
[1]: Don't ask me how you would go about attributing a scalar value to the hairy mess of actual human preferences and biases, though. Game theory doesn't even try to do this.