Complexity of signing is O(1).
That is plain wrong. The more bits you have to sign, the longer signing takes. The question is: how long, as a function of the input length?
We would have O(n) if Sign(32_bytes) == Sign(64_bytes) / 2, i.e. if doubling the input doubled the time. But in practice Sign(32_bytes) == Sign(3200_bytes) / 2: a hundred times more input takes only twice as long, so it's more like O(1). Right?
Sorry, still wrong.
It looks more like O(log n), but I don't think it is, because the whole input has to be read, and reading the input alone already takes O(n).
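That also explains the measurement above. Here is a toy cost model in C for a hash-then-sign scheme (the constants, macro names and function name are mine, purely illustrative): a fixed per-signature cost dominates small inputs, so Sign(32 bytes) can come out at roughly half of Sign(3200 bytes) even though the total cost is linear.

```c
#include <stdio.h>

/* Toy cost model for a hash-then-sign scheme. All constants are
 * hypothetical, chosen only to reproduce the kind of measurement
 * discussed above. */
#define FIXED_COST_US    500.0  /* hypothetical fixed cost: key setup, signing the digest */
#define PER_BYTE_COST_US 0.16   /* hypothetical per-byte cost: hashing the message        */

/* Modelled signing time in microseconds for an n-byte message. */
static double sign_cost_us(double n_bytes)
{
    return FIXED_COST_US + PER_BYTE_COST_US * n_bytes;
}

int main(void)
{
    const double sizes[] = { 32.0, 3200.0, 320000.0, 32000000.0 };
    for (int i = 0; i < 4; i++) {
        double n = sizes[i];
        printf("Sign(%.0f bytes) ~ %.0f us (%.1fx the 32-byte time)\n",
               n, sign_cost_us(n), sign_cost_us(n) / sign_cost_us(32.0));
    }
    /* The 3200-byte message is only ~2x the 32-byte one because the
     * fixed cost dominates, but the ratio keeps growing with n:
     * asymptotically the model is O(n). */
    return 0;
}
```

Under this model Sign(3200 bytes) is only about twice Sign(32 bytes), which matches the measurement, yet the per-byte term makes the scheme linear once the input is large enough to swamp the fixed cost.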
If we count the time to read the input, then we should count header-processing time too. What about a compromise at O(log log n)?

Well. No.

Complexity isn't something you can negotiate. It's like a mathematical proof.

For example, if you want to XOR all bytes of the input into the same variable, you need O(n), because you have to read every byte of the input. As simple as that.
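A minimal sketch of that example in C (the function name is mine, purely illustrative):

```c
#include <stddef.h>
#include <stdint.h>

/* XOR-fold every byte of the input into one accumulator.
 * The loop body is O(1), but it runs once per byte, so the
 * function as a whole is O(n): every byte must be read. */
uint8_t xor_all_bytes(const uint8_t *input, size_t len)
{
    uint8_t acc = 0;
    for (size_t i = 0; i < len; i++)
        acc ^= input[i];
    return acc;
}
```

However cheap the loop body is, it executes once per byte, so no constant-factor trick gets you below linear time.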
While we are making compromises, let's just assume Qubic is safe. I'd say it's 66% safe rather than 51%, agreed? I mean, 66% is even easier to remember.