I assume you meant "difficulty += .001 * (600 - block.genTimeInSeconds)" instead?
(...)
randomFractionFactor=.001. This isn't necessarily .001; it's just a factor to scale the change to a certain percentage of the current difficulty.
The difficulty factor should be a function of the hash target, not the other way around. We shouldn't try to reach consensus on the hash target using floating-point numbers: rounding behavior depends on compiler flags and/or architecture, and you don't want different nodes to have a different view of what the hash target is.
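To illustrate what I mean (just a sketch in Python, with made-up names; the constant is the difficulty-1 target that Bitcoin's compact value 0x1d00ffff expands to): the consensus value is the integer target, and "difficulty" is only a display-level ratio derived from it afterwards.

```python
# Sketch only: the consensus-critical value is the integer hash target;
# "difficulty" is just a human-readable ratio computed from it afterwards.

# Difficulty-1 target (compact 0x1d00ffff expanded), assumed here as a given constant.
MAX_TARGET = 0x00000000FFFF0000000000000000000000000000000000000000000000000000

def difficulty_from_target(target: int) -> float:
    # Floating point is acceptable here because nothing consensus-critical
    # depends on this number; it is only shown to humans.
    return MAX_TARGET / target

print(difficulty_from_target(MAX_TARGET))  # 1.0 at the minimum difficulty
```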
Also, you can't trust block.genTimeInSeconds: the timestamp inside the block can be forged by the node that generated it. Nodes can't fall back on their own clocks to measure how long the block took to generate either, because then every node would have a different view of what the hash target should be.
IMO, we should do the calculation based on the timestamps of the latest set of 2016 blocks, similarly to how it is done today, taking into account that not all block timestamps may be accurate.
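Something along these lines (a rough sketch, not a spec; the names are made up and the 4x clamp and constants mirror what I understand today's retarget rule to do):

```python
# Rough sketch of a retarget rule in the spirit of today's: use only the
# timestamps already committed in the last 2016 blocks, clamp the measured
# timespan, and stick to integer arithmetic so every node gets the same target.

RETARGET_INTERVAL = 2016                      # blocks per adjustment window
TARGET_SPACING = 600                          # desired seconds per block
TARGET_TIMESPAN = RETARGET_INTERVAL * TARGET_SPACING
MAX_TARGET = 0x00000000FFFF0000000000000000000000000000000000000000000000000000

def next_target(old_target: int, first_timestamp: int, last_timestamp: int) -> int:
    actual_timespan = last_timestamp - first_timestamp
    # Clamp so inaccurate or forged timestamps can move the target by at
    # most a factor of 4 per retarget.
    actual_timespan = max(TARGET_TIMESPAN // 4,
                          min(actual_timespan, TARGET_TIMESPAN * 4))
    # Integer multiply-then-divide: identical result on every node.
    new_target = old_target * actual_timespan // TARGET_TIMESPAN
    return min(new_target, MAX_TARGET)        # never easier than difficulty 1
```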
So the bottom line is that I disagree with the actual implementation you are proposing, but I agree with the general idea.