I'm upgrading my home server (Gigabit G4/400) to a G5 DP 1.8 GHz. The G4/400 is on all the time, and the G5 will be, too (of course, the G4 will be going to my parents' place, so there'll still only be one machine on all the time).
So, how much more power does the G5 DP/1.8 draw versus the G4/400? What kind of effect do you think it'll have on my utility bill?
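For anyone who wants to ballpark an answer: here's a minimal sketch of the math, assuming idle draws of roughly 60 W for the G4/400 and 140 W for the G5 DP 1.8 (both guesses, not measurements) and a $0.10/kWh electricity rate:

```python
# Rough estimate of the extra electricity cost of running the G5 24/7.
# The wattages and the rate below are assumptions, not measured values.
G4_IDLE_WATTS = 60    # assumed typical idle draw of a Gigabit Ethernet G4/400
G5_IDLE_WATTS = 140   # assumed typical idle draw of a G5 DP 1.8 GHz
RATE_PER_KWH = 0.10   # assumed electricity rate in $/kWh

extra_watts = G5_IDLE_WATTS - G4_IDLE_WATTS
extra_kwh_per_month = extra_watts / 1000 * 24 * 30   # W -> kWh over 30 days
extra_cost_per_month = extra_kwh_per_month * RATE_PER_KWH

print(f"Extra draw: {extra_watts} W")
print(f"Extra energy: {extra_kwh_per_month:.1f} kWh/month")
print(f"Extra cost: ${extra_cost_per_month:.2f}/month")
```

With those numbers it works out to about 58 kWh and under $6 a month, but swap in your real measured wattages (a Kill A Watt meter helps) and local rate to get a meaningful figure.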