Bar30 Pressure Sensor - Datasheet doubts

Hello BR.

I am developing an ROV and I’m using the Bar30 Pressure Sensor. I’m having some issues measuring shallow depths with adequate accuracy. I’m also experiencing the same drift problem that other users describe in other posts.

I have some questions about the pressure sensor datasheet; I think that resolving them may help me.

First, in the pressure and temperature calculation chart there are some notes at the bottom which say:
“min and max have to be defined” for the variables dT, OFF and SENS.
I don’t understand this. Can I define min and max for those variables? How do I do that? Would that help with the absolute accuracy? I think that your library doesn’t define them. Does it have something to do with the selected OSR?

Another question: in the pressure output characteristics, what does note (1), “With autozero at one pressure point”, mean for the absolute accuracy?

Thank you in advance for your help!

Matías Robador

It’s not clear, but I think the comment about ‘min and max have to be defined’ means that the min and max values provided for that step of the calculation are constrained by the min and max values of the calibration constants defined above. These aren’t parameters that you can change.

Again, the note about ‘autozero’ is not clear, but I think it means that they are ‘taring’ the sensor measurements at some arbitrary pressure value within the specified range.

You should zero/tare the sensor in your own application (just apply an offset for the current air pressure), as our libraries don’t handle that.

  • Which library of ours are you using?
  • What range exactly is ‘low depths’ and what accuracy do you require?
  • Are you seeing inaccuracies greater than those specified in the datasheet?


Hi Jacob. Thanks for your answer.

I am using your MS5837 Library Master, which I downloaded from the MS5837 Library Master repository.

Right now I am working in a range of depths from approximately 0 to 1.5 m. In later tests it could go up to a maximum depth of 3 or 5 meters. For my application I require at most ~ +/- 15 cm (~ 15 mbar) of accuracy. (I think this range is shallow compared to what you are familiar with.)

I’m not seeing inaccuracies greater than 50 mbar (as specified in the datasheet for this operating range). Actually, the problem that I have is drift of the measurement. Today I measured atmospheric pressure for almost 90 minutes. It started at 925 mbar and ended at 906 mbar, a total decrease of 19 mbar ~ 19 cm, which is an average drift of about -0.2 mbar/min. It is similar to what happened to Trevor in this post: Drifting absolute value, but (fortunately) it is not as bad as this one: Reading decrease with time

In one of those posts, Adam explained that before the Bar30 you made some custom “Bar02” sensors, which only read depths up to about 20 m but are more accurate with much less drift. I think that one would work better for me. Is there a way to solve the drift problem? If there is no solution, is there a chance to exchange my sensor for that one? (Note: I live in Bariloche, Argentina.)

I await your answer.


Hello again guys.

I wanted to tell you that I finally solved the drift problem I had with the sensor. It was solved by powering the sensor at 3.0 V. I don’t know why you recommend powering it at 5 V. The manufacturer’s datasheet [Datasheet sensor] says that the supply voltage must be between -0.3 V and +4 V, and also that the pressure reading error is smallest at 3 V.
I now observe errors consistent with what the datasheet specifies.

I hope this information will be useful to you.



We recommend 5 V because our Bar30 sensor has additional circuitry to accept 5 V and convert it to 3.3 V. The devices we interface with provide 5 V, so that is our standard recommendation, but it will also work well at 3.3 V.

I’m not sure why you wouldn’t see the same results when powering it at 5 V versus 3 V. Can you confirm that the readings drift at 5 V but not at 3 V?