It would severely slow down keyboard input. The keyboard driver is usually kept small so it can run and process character input quickly, but now it would have to do a round of encryption for every single keystroke. And since each keystroke is just a single byte, you also waste time padding it out to the cipher's full block size just so the cipher can process the input at all.
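To get a feel for how lopsided that is, here is a minimal sketch (not real driver code) of per-keystroke padding, assuming a 16-byte block cipher such as AES; `cipher_encrypt_block` is a hypothetical stand-in for whatever cipher routine the driver would actually call:

```c
#include <stdio.h>
#include <stdint.h>

#define BLOCK_SIZE 16  /* typical block size, e.g. AES */

/* Hypothetical stand-in for a real block cipher; a real driver would
 * call into an actual crypto routine here. */
static void cipher_encrypt_block(uint8_t block[BLOCK_SIZE]) {
    (void)block;  /* encryption omitted in this sketch */
}

int main(void) {
    uint8_t key_char = 'a';            /* the single byte the user typed */
    uint8_t block[BLOCK_SIZE] = {0};

    /* Pad the 1-byte keystroke out to a full cipher block:
     * 15 of the 16 bytes carry no keystroke data at all. */
    block[0] = key_char;

    cipher_encrypt_block(block);

    printf("payload: 1 byte, ciphertext: %d bytes (%.0f%% overhead)\n",
           BLOCK_SIZE, (BLOCK_SIZE - 1) * 100.0 / 1);
    return 0;
}
```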
This would push keyboard input latency from a few milliseconds to several hundred milliseconds. The delay would be noticeable, as if the system were lagging. And that's before even accounting for decryption time.
Besides, the keystroke is exposed in plain form the moment the device interrupt fires (i.e. when you press a key): the interrupt handler reads the raw character from the device into memory, so the unencrypted value can still be obtained by copying it somewhere else in memory before the encryption even starts.
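Here's a simplified, user-space sketch of that window, not real kernel code; `inb_stub` stands in for an x86 port read such as `inb(0x60)`, the PS/2 keyboard data port:

```c
#include <stdint.h>
#include <stdio.h>

/* Stand-in for an x86 I/O port read (e.g. inb(0x60), the PS/2 keyboard
 * data port). In a real driver this is a privileged instruction. */
static uint8_t inb_stub(uint16_t port) {
    (void)port;
    return 0x1E;  /* example: PS/2 set-1 scancode for 'A' */
}

static uint8_t scancode_buffer[256];
static int buffer_pos = 0;

/* Simplified keyboard interrupt handler: the raw scancode sits in
 * ordinary memory, in the clear, before any encryption could run. */
static void keyboard_irq_handler(void) {
    uint8_t scancode = inb_stub(0x60);          /* plaintext read here  */
    scancode_buffer[buffer_pos++] = scancode;   /* and stored here      */
    /* ...only after this point could the driver start encrypting... */
}

int main(void) {
    keyboard_irq_handler();
    printf("raw scancode captured before encryption: 0x%02X\n",
           scancode_buffer[0]);
    return 0;
}
```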
Not to mention that the keyboard is itself used as an entropy source (keystroke timings feed the system's entropy pool), so encrypting everything you type works against you: the encryption would consume entropy faster than your typing supplies it, and you would end up falling back on a pseudorandom number generator for the encryption instead.
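For context, this is roughly how keystrokes contribute entropy; a minimal sketch of the general idea, not how any particular kernel does it, and the XOR/rotate mixer is a trivial stand-in for a real mixing primitive:

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>

#define POOL_WORDS 8

static uint32_t entropy_pool[POOL_WORDS];
static int pool_index = 0;

/* Trivial stand-in for a real mixing primitive; kernels use a proper
 * mixing/hash function, only the general idea matters here. */
static void mix_into_pool(uint32_t sample) {
    entropy_pool[pool_index] ^= sample;
    entropy_pool[pool_index] = (entropy_pool[pool_index] << 7) |
                               (entropy_pool[pool_index] >> 25);
    pool_index = (pool_index + 1) % POOL_WORDS;
}

/* Called on each keystroke: the hard-to-predict *timing* of the
 * interrupt is what contributes randomness, not the key value itself. */
static void on_keystroke(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    mix_into_pool((uint32_t)ts.tv_nsec);
}

int main(void) {
    on_keystroke();
    printf("pool word 0 after one keystroke: 0x%08X\n", entropy_pool[0]);
    return 0;
}
```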