# The "decimal" data type should be used when you require a
# high level of precision, since it can accurately store data
# up to 28 digits after the decimal point (128-bit precision).
Nothing in the XML Schema or XPath specifications fixes the precision of
xs:decimal, beyond a statement that at least 18 decimal digits must be
supported. I've no idea where the 128 bits came from, nor the 28 digits.
The likely source is C#, whose decimal type is a 128-bit value with 28-29
significant digits:
http://www.yoda.arachsys.com/csharp/decimal.html
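As an illustrative sketch (in Python rather than C#, since the point is language-independent): Python's decimal module happens to default to 28 significant digits, the same figure as C#'s decimal, and it shows that the 28 digits are significant digits overall, not digits after the decimal point as the quoted post claims.

```python
from decimal import Decimal, getcontext

# Python's decimal context defaults to 28 significant digits of
# precision -- the same figure C#'s 128-bit decimal offers.
print(getcontext().prec)  # 28 by default

# The precision counts *significant* digits, not digits after the
# decimal point; the integer part consumes precision too.
d = Decimal(1) / Decimal(3)
print(d)  # 0.3333333333333333333333333333 (28 threes)
```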
In general, though, this is not a fault of C# or of the poster; the mistake
is easy to make with most numeric types backed by binary representations.
It is a trap for users when an implementation backs a type defined by
restriction of an exact rational type with an inexact floating-point type.
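A minimal sketch of that trap, using Python's decimal module as a stand-in for an exact decimal type: binary floating point cannot represent most decimal fractions exactly, so a user who expects exact decimal arithmetic gets surprising results.

```python
from decimal import Decimal

# 0.1 has no exact binary representation, so binary floats
# accumulate error where an exact decimal type would not.
print(0.1 + 0.2 == 0.3)                                    # False
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))   # True
```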