It all really boils down to efficiency. As kcress mentioned above, you can pretty much calculate the amount of heat the driver has to dissipate if you know the power it's delivering and its efficiency: the waste heat is just the input power minus the output power. Efficiency of these drivers can vary considerably depending on operating conditions - for instance, if a driver is rated for up to 24V of output and you're only running three LEDs on it, it might be more or less efficient (probably way less) than if you were running 23.9V of LEDs on it.
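Here's a quick back-of-the-envelope sketch of that calculation. The voltages, currents, and efficiency figures below are made-up illustrations, not specs for any particular driver:

```python
# Rough driver-heat estimate: waste heat = input power - output power.
# All numbers here are hypothetical examples, not real driver specs.

def driver_heat_watts(led_voltage, led_current, efficiency):
    """Watts the driver itself must dissipate as heat."""
    p_out = led_voltage * led_current  # power delivered to the LEDs
    p_in = p_out / efficiency          # power drawn from the supply
    return p_in - p_out                # the difference ends up as heat

# Three LEDs (~9.6V string) on a driver that's only ~75% efficient there:
print(driver_heat_watts(9.6, 0.7, 0.75))   # ~2.2 W of heat in the driver

# A 23.9V string on the same driver, now running at ~90% efficiency:
print(driver_heat_watts(23.9, 0.7, 0.90))  # ~1.9 W, despite 2.5x the output power
```

Note that in this (hypothetical) comparison the driver runs cooler pushing more power, simply because it's operating closer to its sweet spot.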
With respect to dimming, it depends on how the dimming is implemented. If the output is simply switched (i.e., in response to a PWM signal), then the driver is only actually on X% of the time, so it isn't producing heat the other (100-X)% of the time - the average heat scales roughly with the duty cycle.
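To put a number on that (the full-on heat figure is carried over from the sketch above, and is still just an illustration):

```python
# Average heat under PWM dimming: the driver only dissipates while it's on,
# so average heat is the full-on waste heat scaled by the duty cycle.

def avg_pwm_heat(heat_on_watts, duty):
    return heat_on_watts * duty

print(avg_pwm_heat(2.2, 0.25))  # dimmed to 25% on-time -> ~0.55 W average
```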
Since the commercial drivers are all essentially black boxes with respect to the above two factors, it's hard to predict results beyond the anecdotal evidence from other users. With the various DIY drivers I've built, on the other hand, I can tell you EXACTLY under what conditions they'll run hotter or cooler.
Also, of course, ambient conditions matter. Throwing a 1" square buckpuck into a 2" square project box, versus having it out in the open in somewhat drafty conditions, will definitely make a difference in the perceived heat.