It partly depends on which SWG you have.
SWGs like the AquaRite determine salinity from the production characteristics of the cell and the water temperature. For that type of SWG, it's the chloride that matters.
However, SWGs that have a separate salinity sensor, like the AutoPilot or Jandy AquaPure, determine salinity based on conductivity, which is affected by all ions.
The AquaPure and AutoPilot can be recalibrated to match the test, if necessary.
It also depends on which salt test you're using. Test strips and the Taylor k-1766 measure chloride, whereas a meter uses conductivity.
To be safe, add less than you think you'll need, then retest to see how much the addition actually raised the level.
As long as you use sodium chloride, everything should be fine. There's no benefit to using potassium chloride.
In addition to the extra cost, you could potentially have other problems. For example, although the AquaPure measures salinity using conductivity, it also assesses the production characteristics of the cell and compares them to the expected characteristics based on the calculated salinity.
Since the production characteristics are determined by chloride, there will be a mismatch, which could potentially cause the unit to halt production and display an error code indicating high or low current.
In that case you could probably resolve the issue by recalibrating the salt to match the measured chloride level.
For example, if you had an AquaPure, a starting salt level of 200 ppm, and wanted the correct chloride level for 3400 ppm salt, you would add about 480 lb of sodium chloride or about 615 lb of potassium chloride. However, the salinity reading would be about 4300 ppm, which would cause a high salt error and possibly a low current error. Then you could recalibrate the salt to 3400 ppm.
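As a rough sketch of the arithmetic above, here's how the salt quantities work out. The 18,000-gallon pool volume is an assumption chosen to reproduce the numbers in the example; the molar-mass ratio is why potassium chloride requires more weight to deliver the same chloride.

```python
# Rough sketch: pounds of salt needed to raise pool salinity.
# The 18,000 gal pool volume is an assumed figure for illustration.

LB_PER_GAL = 8.34    # approximate weight of a gallon of water, lb
NACL_MOLAR = 58.44   # molar mass of sodium chloride, g/mol
KCL_MOLAR = 74.55    # molar mass of potassium chloride, g/mol

def nacl_needed_lb(gallons, current_ppm, target_ppm):
    """Pounds of NaCl to raise salinity from current_ppm to target_ppm."""
    water_lb = gallons * LB_PER_GAL
    return (target_ppm - current_ppm) / 1e6 * water_lb

gallons = 18_000                                   # assumed pool size
nacl_lb = nacl_needed_lb(gallons, 200, 3400)       # ~480 lb

# KCl delivering the same moles of chloride weighs more,
# in proportion to the molar masses.
kcl_lb = nacl_lb * KCL_MOLAR / NACL_MOLAR          # ~613 lb

print(round(nacl_lb), round(kcl_lb))
```

Note the sketch computes the chloride-equivalent dose; it does not model the conductivity reading, which is what drives the inflated ~4300 ppm number on a conductivity-based unit.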