Measurement resolution

Mar 4, 2012 at 10:47 AM

First, thanks for making this library available; it's saved me some time.

My question concerns the resolution of the measurements returned.  I believe you're throwing away potential precision in the line that converts the raw bytes from the sensor into Celsius: you do an integer divide by 16 and then convert to float, which discards the bottom 4 bits of the temperature value, so you'll only ever get an integer Celsius value.  This would be better:

ds.Celcius = (float)temp / 16.0f;
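For what it's worth, here's a minimal sketch of the conversion I'd expect, assuming the scratchpad bytes arrive LSB-first and the raw value is 16-bit two's complement (the helper name is mine, not your library's):

```c
#include <stdint.h>

/* Hypothetical helper: combine the two temperature bytes from the
 * scratchpad (LSB first) into a signed 16-bit raw value, then scale
 * by 1/16 degC per LSB so the 4 fractional bits are preserved. */
static float raw_to_celsius(uint8_t lsb, uint8_t msb)
{
    int16_t raw = (int16_t)(((uint16_t)msb << 8) | lsb);  /* two's complement */
    return (float)raw / 16.0f;  /* float divide keeps the fraction */
}
```

With the datasheet's example codes, 0x07D0 comes out as +125 and 0x0191 as +25.0625, which is what the integer divide was throwing away.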

When I make this change, I start to see fractional values, but for the most part they are limited to half-degree increments.  This corresponds to the 9-bit conversion mode of the sensor (DS18B20).  However, I've checked and my sensors are in 12-bit mode (configuration register reads 0xFF).
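As a sanity check on the mode, the resolution can be decoded from the R1:R0 bits (bits 6:5) of the configuration byte; the unused low bits read back as 1s, which is why the register value looks like all-ones.  A small hypothetical helper (name is mine):

```c
#include <stdint.h>

/* Hypothetical helper: decode the DS18B20 configuration register.
 * R1:R0 live in bits 6:5; 00 = 9-bit ... 11 = 12-bit. */
static int config_to_resolution_bits(uint8_t cfg)
{
    return 9 + ((cfg >> 5) & 0x03);
}
```

Decoding 0xFF this way gives 12 bits, consistent with what I'm seeing in the register.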

After that, it's all rather puzzling.  Very occasionally I see more precise readings (corresponding to 12-bit resolution, i.e. 1/16 of a degree).  I also can't see where your code times the conversion.  The datasheet isn't terribly clear, but it suggests that after the convert command (0x44) is issued, the conversion takes a time that depends on the resolution, and at the default 12-bit resolution this is 750ms.  It also suggests that reading the sensor during this period will return 0, then 1 when the conversion is complete.  Your code, however, assumes it will read 0 immediately if connected, and then sends the Read Scratchpad command (0xBE) and waits - for 500ms by default.
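From the datasheet's tCONV table, the worst-case conversion time halves with each bit of resolution dropped (750ms at 12 bits, 93.75ms at 9 bits).  A sketch of that relationship, plus the polling approach I'd have expected instead of a fixed delay (names are mine, not your library's):

```c
/* Hypothetical helper: worst-case conversion time in ms for a given
 * resolution, per the datasheet's tCONV spec (750 ms at 12 bits,
 * halving for each bit of resolution dropped). */
static float max_conversion_ms(int resolution_bits)
{
    return 750.0f / (float)(1 << (12 - resolution_bits));
}

/* Rather than a fixed 500ms wait, one could poll the bus after
 * issuing 0x44: read time slots return 0 while converting and 1
 * once done.  Hardware-dependent sketch only, with a made-up API:
 *
 *   while (ds_read_bit() == 0) { delay_ms(10); }
 */
```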

Your code clearly works, but I can't square it with the datasheet, and I can't figure out how to get the increased resolution.  Any clues, please?