Don't you think this combined transaction will occupy a comparable amount of block space?
No. Two separate transactions take more space than a single combined transaction in which you cut out some data in the middle and prove in the signatures that this was done correctly.
Also, amounts can be combined as well, so miners have an incentive to join transactions: they collect the same fees while using less block space than before joining.
More than that: if you can join any two matching transactions, then you can join N transactions as well, and then the only limits are things like the maximum standard transaction size.
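To get a feel for the saving, here is a rough back-of-the-envelope sketch. The vbyte figures are approximate, commonly cited estimates for simple P2WPKH transactions; real sizes depend on script types and signature encodings, so treat the numbers as illustrative only:

```python
# Rough sketch of the block-space saving from transaction cut-through.
# The constants below are approximate vbyte estimates for P2WPKH
# transactions, not exact consensus-level sizes.

TX_OVERHEAD = 11   # version, locktime, in/out counts (approximate)
INPUT_SIZE = 68    # one P2WPKH input, including witness (approximate)
OUTPUT_SIZE = 31   # one P2WPKH output (approximate)

def tx_vbytes(n_inputs, n_outputs):
    """Approximate virtual size of a simple transaction."""
    return TX_OVERHEAD + n_inputs * INPUT_SIZE + n_outputs * OUTPUT_SIZE

# Two separate payments: Alice -> Bob, then Bob -> Charlie,
# each with one input, one payment output, and one change output.
separate = tx_vbytes(1, 2) + tx_vbytes(1, 2)

# Cut-through: the intermediate Bob output and the input spending it
# are removed, leaving one Alice -> Charlie transaction (we keep Bob's
# change output, since he may still receive some residual amount).
combined = tx_vbytes(1, 2) + OUTPUT_SIZE

print(f"separate: {separate} vbytes")   # 282 vbytes
print(f"combined: {combined} vbytes")   # 172 vbytes
print(f"saving:   {separate - combined} vbytes")
```

Under these assumptions the cut-through version saves roughly 40% of the space, and the saving grows with the length of the chain being collapsed.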
It might carry the same computational and network-data cost.
There are more computations needed, but they are temporary. Once the joined transactions are deeply confirmed, new nodes won't need that data during the Initial Blockchain Download.
You still need to propagate and verify the transaction signed by Bob.
Only as long as this transaction is unconfirmed. Once it is batched into other transactions, Bob can just keep some SPV-like proof, and every other node can forget about it.
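The "SPV-like proof" mentioned above would essentially be a Merkle branch linking a transaction hash to a block header. A minimal, simplified sketch (plain SHA-256 instead of Bitcoin's double-SHA-256 over the real serialized transaction, and no real header chain):

```python
# Simplified sketch of an SPV-style Merkle proof: a list of sibling
# hashes that lets anyone re-derive a block's Merkle root from a single
# transaction hash. Illustrative only, not Bitcoin's exact format.

import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def _next_level(level):
    if len(level) % 2:               # odd count: duplicate last hash
        level = level + [level[-1]]
    return [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]

def merkle_root(leaves):
    level = leaves[:]
    while len(level) > 1:
        level = _next_level(level)
    return level[0]

def merkle_branch(leaves, index):
    """Collect sibling hashes needed to re-derive the root from leaf `index`."""
    branch, level = [], leaves[:]
    while len(level) > 1:
        if len(level) % 2:
            level = level + [level[-1]]
        sibling = index ^ 1
        branch.append((level[sibling], sibling < index))  # (hash, is_left)
        level = _next_level(level)
        index //= 2
    return branch

def verify(leaf, branch, root):
    acc = leaf
    for sibling, is_left in branch:
        acc = h(sibling + acc) if is_left else h(acc + sibling)
    return acc == root

txs = [h(bytes([i])) for i in range(4)]   # four dummy transaction hashes
proof = merkle_branch(txs, 2)
print(verify(txs[2], proof, merkle_root(txs)))  # True
```

The proof is logarithmic in the number of transactions, which is why Bob can keep it cheaply while full nodes discard the spent intermediate data.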
It's a hidden cost.
Which do you think is better: a higher cost while mining recent blocks and a lower cost during the Initial Blockchain Download, or the opposite situation, where each and every full node has to verify non-batched transactions over and over again?
Very often you have non-batched transactions within the scope of the same block. That means the final outcome of the block is "consumed N inputs, and made M outputs", but it is not expressed in the simplest way.
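This "net outcome" of a block can be sketched concretely. In the toy model below (illustrative names, not a real node API), outputs that are created and spent within the same block cancel out, so a chained Alice->Bob->Charlie payment nets down to a single consumed coin and a single created coin:

```python
# Sketch: computing the net effect of a block containing non-batched
# chained transactions. Each transaction is a pair (inputs, outputs)
# of coin identifiers; this is a toy model, not a real UTXO set.

def net_effect(transactions):
    created = []    # outputs still unspent at the end of the block
    consumed = []   # pre-existing coins the block consumed
    for inputs, outputs in transactions:
        for coin in inputs:
            if coin in created:
                created.remove(coin)   # spent in-block: cancels out
            else:
                consumed.append(coin)  # spends a pre-existing coin
        created.extend(outputs)
    return consumed, created

# Alice pays Bob, then Bob pays Charlie, in the same block:
block = [
    (["alice_coin"], ["bob_coin"]),
    (["bob_coin"], ["charlie_coin"]),
]
print(net_effect(block))  # (['alice_coin'], ['charlie_coin'])
```

The block's net effect is "consumed 1 input, made 1 output", even though the block as serialized carries two full transactions, which is exactly the redundancy the batching argument targets.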
2) It would be great if you supported your claims with proper mathematical calculations.
3) Don't forget about network data propagation delays.
4) Remember that blockchain security relies on full nodes that perform all verifications in full against all protocol rules. Full nodes never rely on any SPV-like proofs. There is no compromise here about what is better; there are no "options" to think about.
5) Finally: could you estimate what percentage of transactions occur in chains "Alice->Bob->Charlie" within short time intervals, compared to stand-alone "Alice to Bob" transactions? If transactions that could be "batched" are very rare, how can you get any non-marginal efficiency improvement here?