Out of curiosity, what's the problem with that? I assume he'd have the copper twisted together to make a good connection, with the solder just holding it in place instead of acting as the conductor.
Household wiring must always use screw fittings, wire nuts, etc. Solder isn't generally approved even on DC, unless it's inside an approved housing, and then only when done in certain ways. Household wiring is all the construction codes usually bother to cover, and they apply to 120/240VAC up to the switch, socket or outlet. Lamps, stereos, etc. are covered by code in some areas, like LA County, but by Underwriters Laboratories approval in most others in the US. A "UL" label (or equivalent) on the power cord is required for sale in most larger US communities.
120VAC can be attached by solder to an approved type of glass or phenolic circuit board inside the correct kind of enclosure. UL generally will not approve soldered power-line connections just about anywhere else.
This is something to keep in mind when doing any DIY project. Get a copy of the UL construction-code book. You'll find that all connections should be inside a non-meltable (fire-retardant) enclosure, via screw terminals, wire nuts, or compression fittings of some kind. Heat and humidity can quickly cause electrolytic corrosion on any soldered joint, eventually causing enough resistance for it to get hot and melt down.
Solder is great for some low-voltage dc purposes. When a connection might get subjected to heat (as in a house fire, resistive connection, etc.) the mischief that running solder can cause is just too great for it to be a good idea, so codes do not approve it in most areas. [Sorta like teflon tape in high-pressure gas lines. :-)] Over time, electrolysis slowly destroys soldered copper connections that aren't hermetically sealed, too. Some fungi accelerate that.
> Household wiring must always use screw fittings, wire nuts, etc.
This is for reasons of mechanical strength. Naive soldered connections can be pulled apart with your bare hands, and will not stand up to being snagged, tripped over, used as an acrobatic toy by a toddler, etc. This is especially true for the single-strand cable used in building wiring, which can apply a lot of leverage to a soldered joint.
Electrolysis and fungi? No, solder is chemically very stable. It's used to great effect to join copper water pipes, where the large surface area and built-in strain reliefs allow solder to be strong enough.
depending on the resistivity of the copper wire versus the lead-tin mix in the solder, there may end up being more current going through the solder than the actual wire.
Even if the resistivity values of the copper and solder are similar (I'm pretty sure they're at least in the same order of magnitude), you'll still end up with the same amount of current running through the copper wire and the solder.
You don't know what you're talking about. Copper is very conductive compared to solder.
Pure copper has a resistivity of 0.0172 µΩ⋅m, while 63% tin / 37% lead solder has a resistivity of 0.145 µΩ⋅m. (http://alasir.com/reference/solder_alloys/) That's almost an order of magnitude difference.
Let's say you had the absurd case of a joint that's 1/2 copper and 1/2 solder. I = I_cu + I_solder, and V = I_cu * R_cu = I_solder * R_solder, so I_cu / I_solder = R_solder / R_cu ≈ 8: roughly 8x more current goes through the copper than the solder.
You would need 8x more solder than copper in order to get "more current going through the solder than the actual wire."
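The parallel-path arithmetic above can be sketched in a few lines of Python. The resistivity figures are the ones quoted above; the equal-geometry assumption (same length and cross-section for both paths, i.e. the "absurd 1/2 and 1/2" case) is mine:

```python
# Current split between two parallel conductors of identical geometry.
# With equal length and cross-section, each path's conductance is
# proportional to 1/resistivity, so current divides as 1/rho.

RHO_CU = 0.0172e-6      # ohm*m, pure copper (figure from the thread)
RHO_SOLDER = 0.145e-6   # ohm*m, Sn63/Pb37 solder (figure from the thread)

def current_split(rho_a, rho_b):
    """Fraction of total current carried by each of two parallel
    conductors with identical geometry."""
    g_a, g_b = 1 / rho_a, 1 / rho_b   # conductances, up to a shared factor
    total = g_a + g_b
    return g_a / total, g_b / total

cu_frac, solder_frac = current_split(RHO_CU, RHO_SOLDER)
print(f"copper carries {cu_frac:.1%}, solder {solder_frac:.1%}")
print(f"ratio: {cu_frac / solder_frac:.1f}x more current in the copper")
```

Even in that worst case, the solder only carries about 11% of the current.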
Well hey, I guess I hit it on the head with them being at least within an order of magnitude, eh?
You forgot to take into consideration the thermal conductivity of solder. Although there's only 1/8th the current going through the solder than that of the wire, solder has a much much lower thermal conductivity than copper. After a bit of googling, copper's thermal conductivity is 401W/(mK) and bismuth solder is 19W/(mK). Although there's less current going through that solder, it's so much more thermally conductive it'll probably see its temperature increasing faster than that of the copper.
That certainly doesn't help when solder's melting point tends to be around 140C and copper's is 1000C. Also, while you may have said a 50/50 mix of solder and copper is extreme, if the OP was awful at soldering and was using something like 16 gauge wire, it's not so ridiculous that there may be a 50/50 mix of solder to wire; hell, that might even be a bit low.
The numbers you yourself reported show that copper is much more thermally conductive than solder, and not the other way around.
Thermal conductivity describes how quickly the heat conducts through the wire. Since the wire is uniformly heated, this is a minor detail (assuming that the wire's thermal conductivity is considerably higher than the electrical insulation around the wire). Instead, you need to look for the heat transfer coefficient of insulated wire.
The melting point of Sn63Pb37 is 183C, not 140C. 183 is not "around 140."
If the wire were hot enough to melt the solder in a copper+solder combination then it would be well more than hot enough to melt the plastic insulation around just the copper wire itself, which is typically rated for only 90C.
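To put rough numbers on the heat-transfer point, here's a back-of-envelope Python sketch of the steady-state temperature rise of a current-carrying wire: heat generated per metre balances heat convected away through the surface. The wire gauge, outer diameter, and convection coefficient are all assumptions for illustration, not code-book figures:

```python
import math

# All values are illustrative assumptions, not code-book figures.
I = 15.0            # A, a typical branch-circuit load
R_PER_M = 8.3e-3    # ohm/m, roughly 14 AWG copper
D_OUTER = 3.0e-3    # m, outer diameter including insulation
H = 10.0            # W/(m^2*K), rough natural-convection coefficient

p_per_m = I**2 * R_PER_M              # I^2*R heat generated per metre
area_per_m = math.pi * D_OUTER        # outer surface area per metre
delta_t = p_per_m / (H * area_per_m)  # steady state: generation = loss
print(f"{p_per_m:.2f} W/m dissipated -> ~{delta_t:.0f} K rise above ambient")
```

With these assumed numbers the wire sits a couple of dozen kelvin above ambient, nowhere near any solder's melting point, which is consistent with the insulation-rating argument above.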
Um, what? You'll dissipate more heat at 110v than 220v (high school physics -- heat dissipation is I^2*R, and for a given wattage the current is inversely proportional to the voltage, so the loss goes as the inverse square of the voltage -- hence high-voltage power transmission).
In any event, 220v is no big deal. It's household current almost everywhere except the US. (A few countries use 110V, Australia is supposedly 240v.)
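The I^2*R argument can be spelled out numerically. A quick Python sketch, assuming a hypothetical 1500W appliance and 0.5 ohms of wiring resistance (both numbers are mine, chosen for illustration):

```python
# Same appliance power, same wiring resistance, two supply voltages.
# Wiring loss is I^2*R, and I = P/V, so the loss scales as 1/V^2.

P_LOAD = 1500.0   # W drawn by the appliance (assumed)
R_WIRE = 0.5      # ohms of wiring resistance (assumed)

losses = {}
for v in (110.0, 220.0):
    i = P_LOAD / v             # current needed for the given wattage
    losses[v] = i * i * R_WIRE  # heat dissipated in the wiring
    print(f"{v:.0f} V: I = {i:.2f} A, wiring loss = {losses[v]:.1f} W")
```

Doubling the voltage halves the current and quarters the wiring loss, which is the whole point of high-voltage transmission.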
The heat dissipated by the two metals would be identical if their resistances were the same, of course. I'm not even considering the difference in amperage between a 110V and 220V connection, as I'm pretty sure that's not what the parent parent post was in reference to.
I was under the impression that the issue at hand was that using solder for high-load wires was bad because of solder's extremely low melting point. I meant to point out that even though the solder is just holding the wires together, there is still a current running through it, and if the resistivity of the solder is lower than that of the copper, the chances of the solder melting would be quite a bit higher.
I honestly don't know if that's why the parent parent post mentioned that soldering high-load wires is bad. I just extrapolated issues that someone might have when using solder on high-load wires, and relayed them in my reply.
Yeah, I was puzzled by the post too, but solder is in no way going to be lower resistance than copper, and if your wiring is heating up to the melting point of solder, you have other issues.
> Even if the resistivity values of the copper and solder are similar (I'm pretty sure they're at least in the same order of magnitude), you'll still end up with the same amount of current running through the copper wire and the solder.
And what's the problem with that? The resistance of the solder will be minimal (as you say yourself, pretty much equivalent to copper), so the voltage drop and heat production should both be low? Am I missing something?
I mean, I'm pretty sure there's solder connecting the copper wire to the PCB of any PSU in my house, all running at 230V.