Depending on the resistivity of the copper wire versus the tin-lead mix in the solder, you may end up with more current going through the solder than through the actual wire.
Even if the resistivity values of the copper and solder are similar (I'm pretty sure they're at least in the same order of magnitude), you'll still end up with the same amount of current running through the copper wire and the solder.
You don't know what you're talking about. Copper is very conductive compared to solder.
Pure copper has a resistivity of 0.0172µΩ⋅m while 63% tin/37% lead solder has a resistivity of 0.145µΩ⋅m. (http://alasir.com/reference/solder_alloys/) That's almost an order of magnitude difference.
Let's say you had the absurd case of 1/2 copper and 1/2 solder (equal lengths and equal cross-sections). I = I_cu + I_solder, and V = I_cu * R_cu = I_solder * R_solder, so I_cu / I_solder = R_solder / R_cu = 0.145 / 0.0172 ≈ 8.4 => roughly 8x more current going through the copper than the solder.
You would need solder with roughly 8x the cross-sectional area of the copper in order to get "more current going through the solder than the actual wire."
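The current-divider arithmetic above can be sketched as a quick check; this is a minimal sketch assuming equal lengths and equal cross-sections for the two parallel paths, using the resistivity figures quoted upthread:

```python
# Current split between two parallel conductors of equal length and equal
# cross-sectional area (the "1/2 copper, 1/2 solder" case discussed above).
# Resistivities in micro-ohm-metres, from the figures quoted upthread.
RHO_CU = 0.0172      # pure copper, uOhm*m
RHO_SOLDER = 0.145   # Sn63/Pb37 solder, uOhm*m

# For equal geometry, resistance is proportional to resistivity, and
# currents in a parallel pair divide inversely with resistance:
#   I_cu / I_solder = R_solder / R_cu = rho_solder / rho_cu
ratio = RHO_SOLDER / RHO_CU
print(f"copper carries {ratio:.1f}x the current of the solder")  # 8.4x

# Fraction of the total current carried by each path:
frac_cu = ratio / (1 + ratio)
frac_solder = 1 / (1 + ratio)
print(f"copper: {frac_cu:.0%}, solder: {frac_solder:.0%}")  # 89%, 11%
```

So with even a 50/50 geometry, the solder carries only about 11% of the total current.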
Well hey, I guess I hit it on the head with them being at least within an order of magnitude, eh?
You forgot to take into consideration the thermal conductivity of solder. Although there's only about 1/8th as much current going through the solder as through the wire, solder has a much, much lower thermal conductivity than copper. After a bit of googling, copper's thermal conductivity is 401 W/(m·K) and bismuth solder's is 19 W/(m·K). Although there's less current going through that solder, it's so much more thermally conductive it'll probably see its temperature increasing faster than that of the copper.
That certainly doesn't help when solder's melting point tends to be around 140C and copper's is around 1000C. Also, while you may have said a 50/50 mix of solder and copper is extreme, if the OP was awful at soldering and was using something like 16 gauge wire, it's not so ridiculous that there may be a 50/50 mix of solder to wire; hell, that might even be a bit low.
The numbers you yourself reported show that copper is much more thermally conductive than solder, and not the other way around.
Thermal conductivity describes how quickly the heat conducts through the wire. Since the wire is uniformly heated, this is a minor detail (assuming that the wire's thermal conductivity is considerably higher than the electrical insulation around the wire). Instead, you need to look for the heat transfer coefficient of insulated wire.
The melting point of Sn63Pb37 is 183C, not 140C. 183 is not "around 140."
If the wire were hot enough to melt the solder in a copper+solder combination then it would be well more than hot enough to melt the plastic insulation around just the copper wire itself, which is typically rated for only 90C.
Um, what? You'll dissipate more heat at 110v than at 220v (high school physics -- resistive loss is I^2 * R, and for a given wattage the current is inversely proportional to the voltage, so the wiring loss falls with the square of the voltage -- hence high-voltage power transmission).
In any event, 220v is no big deal. It's household current almost everywhere except the US. (A few countries use 110V, Australia is supposedly 240v.)
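The 110v-vs-220v point can be sketched with a quick calculation. The 1500 W load and 0.1 Ω wire resistance below are illustrative assumptions, not figures from the thread:

```python
# Resistive loss in the supply wiring when delivering the same power at
# 110 V vs 220 V. The load wattage and wire resistance are assumed
# illustrative values, not from the thread.
P_LOAD = 1500.0   # watts delivered to the load
R_WIRE = 0.1      # ohms of wire resistance (assumed)

for volts in (110.0, 220.0):
    current = P_LOAD / volts          # I = P / V
    loss = current ** 2 * R_WIRE      # P_loss = I^2 * R
    print(f"{volts:.0f} V: I = {current:.2f} A, wire loss = {loss:.1f} W")

# Doubling the voltage halves the current, so the I^2*R loss drops 4x.
```

Same delivered power, a quarter of the wiring loss at the higher voltage.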
The heat dissipated by the two metals would be identical if their resistances were the same, of course. I'm not even considering the difference in amperage between a 110V and 220V connection, as I'm pretty sure that's not what the parent parent post was in reference to.
I was under the impression that the issue at hand was that using solder for high-load wires was bad because of solder's extremely low melting point. I meant to point out that even though the solder is just holding the wires together, there is still a current running through it, and if the resistivity of the solder is lower than that of the copper, the chances of the solder melting would be quite a bit higher.
I honestly don't know if that's why the parent parent post mentioned that soldering high-load wires is bad. I just extrapolated issues that someone might have when using solder on high-load wires, and relayed them in my reply.
Yeah, I was puzzled by the post too, but solder is in no way going to be lower resistance than copper, and if your wiring is heating up to the melting point of solder, you have other issues.
> Even if the resistivity values of the copper and solder are similar (I'm pretty sure they're at least in the same order of magnitude), you'll still end up with the same amount of current running through the copper wire and the solder.
And what's the problem with that? The resistance of the solder will be minimal (as you say yourself, pretty much equivalent to copper), so the voltage drop and heat production should both be low? Am I missing something?
I mean, I'm pretty sure there's solder connecting the copper wire to the PCB of any PSU in my house, all running at 230V.
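A rough R = ρL/A estimate backs this up. The joint dimensions and current below are illustrative assumptions, not figures from the thread:

```python
# Ballpark resistance, voltage drop, and dissipation of a small solder
# joint, using R = rho * L / A. The joint geometry and the 10 A current
# are assumed illustrative values, not from the thread.
RHO_SOLDER = 0.145e-6   # Sn63/Pb37 resistivity, ohm*m (quoted upthread)
LENGTH = 2e-3           # 2 mm joint length (assumed)
AREA = 1e-6             # 1 mm^2 cross-section (assumed)
CURRENT = 10.0          # amps (assumed)

r_joint = RHO_SOLDER * LENGTH / AREA   # ohms
v_drop = CURRENT * r_joint             # volts
p_heat = CURRENT ** 2 * r_joint        # watts

print(f"R = {r_joint * 1e3:.2f} mOhm, "
      f"drop = {v_drop * 1e3:.1f} mV, "
      f"heat = {p_heat * 1e3:.0f} mW")
```

Under those assumptions the joint dissipates on the order of tens of milliwatts at 10 A, which is nowhere near enough to melt anything.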