Unfortunately, I don't believe there is time to properly test and incorporate Merkle tree pruning before we need to increase the block limit, but that should be a focus going forward, along with using invertible Bloom filters.
http://www.coindesk.com/juniper-research-bitcoin-transactions-double-2017/
A couple of years, according to these researchers. Double our current average and we still aren't at 1MB.
It is the upper-limit peaks that are problematic, and we should prepare for the future before the next traffic increase, which would also constrain other decentralized apps like Lighthouse. We are already seeing full blocks from time to time, so we should be concerned now.
To which full block(s) are you referring? What height?
Or by "full" are you referring to the lower than 1MB max block sizes used in some pools?
If you are just making this up to escalate urgency and fake a crisis, then why do that?
No. He is talking about the 970-980 kB blocks, which are not rare these days. While the average is around 300 kB, that means we are at roughly 1/3 capacity. And I personally think 1/3 is the point where we should seriously consider our options for scalability. Given how split people are on the subject right now, it is all the more urgent to start building consensus.
But on the other hand there is no hurry and no crisis. The extra transactions can just go through altcoins if Bitcoin chooses the 1MB block + higher fees 'solution'.
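The capacity arithmetic above can be sketched out. A minimal illustration, using the rough figures from this thread (the 1 MB consensus limit and a ~300 kB average block; both numbers are ballpark, not measured):

```python
# Rough block-capacity arithmetic from the discussion above.
# Figures are illustrative estimates, not measured values.

MAX_BLOCK_BYTES = 1_000_000   # current consensus limit (~1 MB)
avg_block_bytes = 300_000     # rough current average (~300 kB)

utilization = avg_block_bytes / MAX_BLOCK_BYTES
print(f"current utilization: {utilization:.0%}")  # ~30%, i.e. about 1/3

# Even if the average doubled (per the Juniper projection linked
# earlier), it would still sit under the 1 MB limit:
doubled = 2 * avg_block_bytes
print(f"doubled average: {doubled / 1_000_000:.1f} MB")  # 0.6 MB < 1 MB
```

The point of the sketch is just that average load leaves headroom even under a doubling, while the near-full peak blocks discussed below are a separate concern from the average.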

They aren't rare, but they are not the norm. Most blocks aren't near full.
F2Pool recently solved a few that were close, above 900 kB, with a good number of free transactions included.
https://blockchain.info/block/0000000000000000175b44859017a5148c48ecba7a67f14012232e9bb6b47a73
https://blockchain.info/block/00000000000000000f9597aed448ce8429c550a65f896b66760381d0c364901e
and then there is this 7 kB block in between:
https://blockchain.info/block/000000000000000014efb22561313ebe3c27780808b5d8939ebc1a850badf9da
There are also a lot of blocks under 200 kB, so we could squeeze out more Tx/s with a minimum block size too, but that would not be a good thing to do.