I thought this was an interesting result. I put two Pressure Plug (BMP085) sensors in a 1.5 L glass canning jar and placed it in a cabinet outdoors, so the temperature drifts slowly up and down over the day. Below is a PDF plot of pressure vs. temperature from both sensors, covering 60 hours of data. The temperature (horizontal axis) comes from a separate single sensor (LM35 into an Analog Plug, with milli-degree resolution). In theory this plot should be a perfectly straight line for both sensors: the jar is sealed at constant volume, so by the ideal gas law pressure should be proportional to absolute temperature. But in fact we see a small jog in the data about every 2 degrees C. Both sensors show these jogs, at almost but not quite the same temperatures and pressures. Are we seeing some kind of numerical round-off issue inside the function that returns pressure in calibrated units (Pa)?
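To illustrate the kind of artifact I mean, here is a toy simulation (not the actual Bosch compensation code; the shift width and correction scale are made-up assumptions). If an intermediate term in the integer compensation math is right-shifted, that term only updates once every few degrees C, and the pressure-vs-temperature line picks up periodic jogs much like the ones in the plot:

```python
# Toy model of truncation in integer temperature compensation.
# NOT the BMP085 datasheet algorithm -- shift=5 and the 2 Pa/count
# scale are arbitrary choices, just to make the effect visible.

def ideal_pressure_pa(t_c, p0_pa=101325.0, t0_c=20.0):
    """Ideal-gas pressure of a sealed constant-volume jar:
    proportional to absolute temperature."""
    return p0_pa * (t_c + 273.15) / (t0_c + 273.15)

def compensated_pressure_pa(t_c, shift=5):
    """Hypothetical compensation: temperature in 0.1 degC integer units
    is right-shifted, so the coarse term is constant within each
    2**shift * 0.1 degC bin (3.2 degC for shift=5).  The residual ramps
    up within a bin and snaps back to zero at the bin edge -- a jog."""
    t_decideg = int(round(t_c * 10))          # 0.1 degC integer units
    coarse = (t_decideg >> shift) << shift    # truncated intermediate term
    correction = (t_decideg - coarse) * 2.0   # residual, scaled to Pa (toy)
    return ideal_pressure_pa(t_c) - correction

# Error relative to the ideal line, just below and just above a bin
# edge at 22.4 degC: the reported pressure jumps by tens of Pa even
# though the true pressure changed smoothly.
err_below = compensated_pressure_pa(22.3) - ideal_pressure_pa(22.3)
err_above = compensated_pressure_pa(22.4) - ideal_pressure_pa(22.4)
```

With these toy numbers the jump at each bin edge is a fixed fraction of an hPa, repeating every 3.2 degrees C, which is qualitatively the "sticky barometer" pattern; the real BMP085 math has several such shifted intermediate terms, so the period and amplitude would differ.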
Note, I should add that this is just a curiosity of mine, because I am interested in precision measurements. It is as if you had an analog barometer with a slightly sticky indicator: a dead zone about 0.2 hPa wide in pressure every time the temperature changes by about 2 deg C. Weather stations often report barometric pressure only to integer hPa, so in that application you would never notice this very small effect.