Nice explanation of the difference between Shannon and Boltzmann entropy.
Thermodynamic entropy vs information entropy – Advanced Physics Forums
The information entropy is the log of the number of accessible states, and is dimensionless.
The thermodynamic entropy is the information entropy (taken with the natural log) times the Boltzmann constant (kB = 1.38×10^-23 J/K), so the thermodynamic entropy has units of energy per kelvin.
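The conversion described above can be sketched in a few lines of Python. This is just an illustration of the relation S = kB·ln(W) for W equally likely states; the function names are my own.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact 2019 SI value)

def thermo_entropy(num_states: int) -> float:
    """Thermodynamic entropy S = k_B * ln(W), in J/K."""
    return K_B * math.log(num_states)

def info_entropy_bits(num_states: int) -> float:
    """Information entropy log2(W), dimensionless (in bits)."""
    return math.log2(num_states)

# One bit of information (W = 2 states) corresponds to
# k_B * ln(2) ~ 9.57e-24 J/K of thermodynamic entropy,
# which is the Landauer bound per erased bit.
print(thermo_entropy(2))     # ~9.57e-24 (J/K)
print(info_entropy_bits(2))  # 1.0 (bit)
```

Note the base of the logarithm only changes the units (bits vs. nats); multiplying the natural-log form by kB gives J/K.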
It is worth noting that present-day computers process so little information compared to the number of equivalent thermodynamic states accessible to them that the information entropy of the device is insignificant compared to the thermal entropy. I believe this means that present-day computers operate nowhere near the thermodynamic limits of computation, so in a certain sense the equivalence of the two forms of entropy is irrelevant, except that it does allow one to place theoretical limits on the process of computation.
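A rough back-of-envelope calculation makes the scale gap concrete. The storage size and thermal-entropy figure below are assumptions of mine chosen purely for illustration (1 TB of stored bits; thermal entropy of order 1 J/K for a few grams of solid at room temperature), not measured values for any real device.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

# Assumed: a device holding 1 TB, i.e. about 8e12 bits.
bits = 8e12
info_entropy = bits * K_B * math.log(2)  # J/K contributed by the stored bits

# Assumed order of magnitude: thermal entropy of a few grams
# of solid near room temperature is roughly 1 J/K.
thermal_entropy = 1.0

print(info_entropy)                     # ~7.7e-11 J/K
print(thermal_entropy / info_entropy)   # thermal entropy larger by ~1e10
```

Even with generous assumptions, the information entropy is some ten orders of magnitude below the thermal entropy, which is the point made above.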