I have 1x Gigabyte GTX 1060 6GB (Micron memory):
~300 sol/s at ~96 W @ 62°C
and an EVGA 1070 8GB (Micron memory too):
~430-440 sol/s at ~120 W @ 58°C
I have a GeForce 1070 Windforce OC. It gets about the same hashrate as your EVGA 1070, but when I checked with GPU-Z, it draws much more power than yours. The 1070's TDP is 150 W, so how do you get it down to 120 W? Did you undervolt it? If so, would you mind sharing your experience? Thanks!
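(For anyone else reading: on a Linux rig, the simplest way to get a 1070 down around 120 W is not a true undervolt but a driver-level power cap via `nvidia-smi`. The 120 W figure and GPU index below are just examples, not what the poster above actually used; on Windows, voltage/frequency curve editing is usually done in MSI Afterburner instead. A minimal sketch, shown as a dry run that only prints the commands:)

```shell
# Power-capping sketch, assuming Linux + the proprietary NVIDIA driver.
# This sets a board power limit; it is NOT a real undervolt (no voltage
# curve editing), but in practice it forces the card to lower clocks/volts.

# Enable persistence mode so settings survive between runs (needs root).
CMD_PERSIST="nvidia-smi -pm 1"

# Cap GPU 0 to 120 W. The value must be within the board's supported
# range -- check it first with: nvidia-smi -q -d POWER
CMD_LIMIT="nvidia-smi -i 0 -pl 120"

# Dry run: print the commands instead of executing them.
# On a real rig, drop the 'echo' and run them with sudo.
echo "$CMD_PERSIST"
echo "$CMD_LIMIT"
```

The trade-off is a few percent of hashrate for a large drop in watts, which usually improves sol/s-per-watt; the exact sweet spot varies per card and memory vendor.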