One of the non-obvious reasons Bitcoin works is that transaction fees are based on the size (in bytes) of the transaction. And since Bitcoin Script has no loops, the CPU time needed to verify a transaction is bounded by the transaction's size.
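To make the size/CPU link concrete, here is a minimal sketch in Python (the constants and function names are made up for illustration, not Bitcoin's actual fee code): with no loops, each opcode executes at most once, so both the fee and the worst-case verification work grow linearly with transaction size.

```python
# Hypothetical sketch (illustrative numbers, not Bitcoin's real fee logic):
# with a loop-free script language, fee and worst-case verification work
# both scale linearly with the serialized size of the transaction.

FEERATE = 10  # satoshis per byte; made-up illustrative rate

def fee(tx_bytes: bytes) -> int:
    """Fee is proportional to the transaction's size in bytes."""
    return len(tx_bytes) * FEERATE

def max_verification_steps(tx_bytes: bytes) -> int:
    """With no loops, every opcode runs at most once, so execution
    steps are O(transaction size); one step per byte is a crude bound."""
    return len(tx_bytes)

tx = b"\x00" * 250  # a 250-byte transaction
print(fee(tx), max_verification_steps(tx))  # cost and work grow together
```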
If the CPU usage needed to verify a transaction is NOT bounded by the transaction's size, then you open yourself up to denial-of-service attacks. Like:
...
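To illustrate the class of attack (the concrete example is elided above), here is a purely hypothetical sketch: a toy script language with a backward-jump opcode, where a two-instruction script pays a near-zero size-based fee yet consumes unbounded CPU on every node that verifies it. All names here (run, JUMP_START) are invented for illustration.

```python
# Purely hypothetical toy script language: a backward jump lets a
# two-opcode script loop forever, so verification cost is unbounded
# while the size-based fee is near zero.

def run(script: list[str]) -> None:
    """Naive interpreter with no step limit (the vulnerable design)."""
    pc = 0
    while pc < len(script):
        if script[pc] == "JUMP_START":
            pc = 0   # backward jump: execution never terminates
        else:
            pc += 1  # every other opcode just falls through

malicious = ["NOP", "JUMP_START"]  # two opcodes, near-zero fee to relay
# run(malicious)  # would hang every node that tried to validate it
```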
This is precisely why a Turing-complete scripting language is a bad idea in the context of cryptocurrencies: it is highly non-trivial to put bounds in place to prevent or mitigate such exploits.
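For contrast, the usual mitigation in Turing-complete systems such as Ethereum is gas metering: every execution step is charged against a prepaid budget. Here is a hedged sketch of the idea, reusing the hypothetical toy interpreter from above; pricing each opcode so the metering itself can't be gamed is exactly the non-trivial part.

```python
# Hedged sketch of the standard mitigation (gas metering): every step
# burns prepaid budget, so even a looping script can only consume the
# CPU it paid for. Names here are hypothetical.

class OutOfGas(Exception):
    pass

def run_metered(script: list[str], gas: int) -> None:
    """The toy interpreter from above, with a per-step charge."""
    pc = 0
    while pc < len(script):
        if gas <= 0:
            raise OutOfGas("step budget exhausted; verification aborted")
        gas -= 1
        if script[pc] == "JUMP_START":
            pc = 0
        else:
            pc += 1

# The looping script from above now fails fast instead of hanging nodes.
try:
    run_metered(["NOP", "JUMP_START"], gas=1000)
except OutOfGas as exc:
    print(exc)
```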