1. Is it right to assume the computational cost to encode the data is more or less the same as with the current serialization format?
Pretty close. The encoder takes a little extra computation because it matches scripts against common templates.
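For a concrete sense of what that matching involves, here is a minimal sketch in Python (the template ids and the surrounding encoding are hypothetical, not the actual format; the patterns are just the standard P2PKH/P2SH/P2WPKH/P2WSH scripts):

```python
# Sketch: classify a scriptPubKey against a few common templates so the encoder
# can replace the full script with a template id plus the variable payload.
# Template names and the fallback encoding are illustrative only.

def match_script_template(script: bytes):
    # P2PKH: OP_DUP OP_HASH160 <20-byte hash> OP_EQUALVERIFY OP_CHECKSIG
    if (len(script) == 25 and script[:3] == bytes.fromhex("76a914")
            and script[23:] == bytes.fromhex("88ac")):
        return ("p2pkh", script[3:23])
    # P2SH: OP_HASH160 <20-byte hash> OP_EQUAL
    if (len(script) == 23 and script[:2] == bytes.fromhex("a914")
            and script[22:] == bytes.fromhex("87")):
        return ("p2sh", script[2:22])
    # P2WPKH: OP_0 <20-byte hash>
    if len(script) == 22 and script[:2] == bytes.fromhex("0014"):
        return ("p2wpkh", script[2:])
    # P2WSH: OP_0 <32-byte hash>
    if len(script) == 34 and script[:2] == bytes.fromhex("0020"):
        return ("p2wsh", script[2:])
    # No template matched: fall back to carrying the raw script.
    return ("raw", script)
```

Each output only needs a length check and a couple of prefix/suffix comparisons, which is why the extra encode cost stays small.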
2. Is there a less rough estimate of how much "more computationally expensive" decoding is? 20%? 50%? Twice as much?
IMO a 15% reduction in total ongoing bandwidth usage is a big deal for a public node with a hundred connections.
Your benchmark should probably be relative to validation costs rather than to the current format; the current format is essentially 'free' to decode, and most of the time spent decoding it likely goes to allocating memory for it.
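A rough sketch of how that framing could look (decode_compacted and validate_transaction are hypothetical placeholders for the real decoder and for script/signature validation, not actual APIs):

```python
import time

def decode_compacted(blob: bytes):
    ...  # placeholder: substitute the real compacted-format decoder

def validate_transaction(blob: bytes):
    ...  # placeholder: substitute real script/signature validation

def bench(func, inputs, repeat=5):
    # Best-of-N wall-clock time over the whole corpus.
    best = float("inf")
    for _ in range(repeat):
        start = time.perf_counter()
        for item in inputs:
            func(item)
        best = min(best, time.perf_counter() - start)
    return best

def report(compacted_txs, raw_txs):
    decode_time = bench(decode_compacted, compacted_txs)
    validate_time = bench(validate_transaction, raw_txs)
    print(f"decoding costs {100.0 * decode_time / validate_time:.1f}% of validation")
```

Reporting "decode is X% of validation cost" makes it obvious whether the overhead matters in practice, whereas "N times slower than the current decode" exaggerates a cost that is tiny in absolute terms.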
I think the big tradeoff, other than just software complexity/review, is that a node which has stored blocks in compacted form will have to burn CPU time decoding them for peers that don't support the compacted form.
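Roughly the serving path in question (the function names and the supports_compacted_blocks flag are made up for illustration):

```python
def load_compacted_block(block_hash: bytes) -> bytes:
    ...  # placeholder: read the compacted block from disk

def decode_to_legacy(compacted: bytes) -> bytes:
    ...  # placeholder: re-serialize into the legacy network format

def serve_block(peer, block_hash: bytes) -> None:
    compacted = load_compacted_block(block_hash)    # cheap disk read either way
    if peer.supports_compacted_blocks:
        peer.send(compacted)                        # stored bytes go out as-is
    else:
        peer.send(decode_to_legacy(compacted))      # extra CPU on every legacy request
```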
For relay of loose transactions, I struggle to see any downside (again other than the review burden).