My understanding is that the fan will only draw as many amps as it needs, so when selecting a power supply you only need to make sure its voltage is no higher than what your device needs and that it can supply at least as many amps as you need.
Voltage isn't really the determining factor here... it's amperage. It's the AMPS that push the voltage through the wire: the higher the amps, the more power you will have. E.g. 240 V at 100 A puts out 24,000 VA, while 120 V at 200 A puts out 24,000 VA as well. I have used these types of circuits before at work... "electrical maintenance," blah...
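The VA figures quoted above are just volts times amps. A quick sketch to check the arithmetic (function name is mine, for illustration only):

```python
def apparent_power_va(volts, amps):
    """Apparent power in volt-amperes: VA = V * I."""
    return volts * amps

# Both example circuits from the post come out to the same apparent power:
print(apparent_power_va(240, 100))  # 24000
print(apparent_power_va(120, 200))  # 24000
```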
I was thinking about hooking up an Allen-Bradley PLC system to my tank so I can micromanage it and basically leave the maintenance to the birds!
Not too sure about PC fans... but you might want to make sure you're not trying to run a DC fan off an AC adapter. You will smoke it... period. Hope this helps.
I think you have it reversed. I'm pretty sure voltage is the force, i.e. the potential difference between two points.
I have an ATI Powermodule and I've noticed that increasing the voltage increases noise. But doesn't the decibel rating of the fan matter more? I have these 20 dB fans from China that are running at 12 V and they are pretty quiet.
FYI I'm cooling some LEDs with this fan.
Also note that a 12 V plug might only be 12 volts when loaded with the components it was made for; it wouldn't surprise me if you read 15 V on it with a voltmeter, so your fan might run a bit on the fast/noisy side.
And here's what I used to teach my students: compare electricity to water. Current is the amount of water that's moving; voltage is the pressure difference that allows that water to flow.
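To put numbers on the analogy: by Ohm's law, the current (flow) is what results when a voltage (pressure difference) is applied across a resistance. A minimal sketch, with the function name and the 80-ohm example load being my own illustration:

```python
def current_amps(volts, ohms):
    """Ohm's law: I = V / R -- pressure difference drives the flow."""
    return volts / ohms

# A 12 V fan that behaves like an 80-ohm load draws 0.15 A:
print(current_amps(12, 80))  # 0.15
```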