A couple of years ago I researched, designed, and built a proof-of-concept prototype of an automated alkalinity monitor for a class I was taking in introductory machine learning for chemical engineering applications. For a while, I was thinking about trying to bring the design to market, but I've been tied up with other aspects of life, so I thought I'd just share it with everyone. The design concept is based on the tracer-monitored titration technique introduced by Martz et al. 2006. I think this approach has several distinct advantages over existing alkalinity monitor designs:
1) No probe calibration necessary because there are no pH probes
2) No pump calibration because titrant volume is measured spectrophotometrically
3) Precision pumps are not needed which reduces cost and may reduce maintenance (piezoelectric small-volume pumps seem pretty attractive for this application)
4) Single inexpensive reagent
5) Demonstrated superior accuracy using minimal training data
This prototype lacks the ancillary hardware that would make the system fully automated (e.g. a pump to sample the water and flush the flow cell, a magnetic stirrer for the flow cell, etc.), but it does demonstrate that the technique itself works quite well. The design uses a standard acid titrant (0.02N H2SO4) spiked with an indicator dye (bromocresol green), i.e. the tracer. The titration is conducted in a cuvette (or a flow cell, if automated) and is monitored by a multi-channel DIY spectrophotometer. The spectrophotometer is designed to be fairly inexpensive and consists of an RGB LED emitter and a spectral sensor with 6 bands across the visible spectrum (both obtained from SparkFun). Together, the emitter and sensor can make 18 individual measurements for each discrete titrant volume added (3 emission bands x 6 sensor bands = 18).

The software supporting this hardware uses machine learning techniques (i.e. a fancy term for regression analyses) and is trained on the spectral response of samples with known alkalinity during titration. To do this training, I took samples of my tank water and adjusted the alkalinity so that I had many individual samples with different alkalinities within the range of plausible values. This was my final project for the class and I didn't have a lot of time to do the training, so I ended up only recording the spectral response of 11 individual samples. From that very small dataset, I performed a leave-one-out cross-validation (train on 10, measure on 1) to demonstrate that it is possible to get accuracy as good as +/-2.6% with an appropriate machine learning algorithm. I expect that to improve significantly with a larger training set.
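For anyone curious what the leave-one-out scheme looks like in practice, here's a minimal sketch in Python. The data here is synthetic, standing in for the real spectral recordings, and the feature layout (18 channels flattened across a number of titrant steps) and the Ridge regressor are just illustrative assumptions, not my actual pipeline:

```python
# Sketch of leave-one-out cross-validation for alkalinity regression.
# Synthetic data only: each "sample" is a flattened vector of the
# 18-channel spectral response recorded at several titrant steps.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

n_samples, n_steps, n_channels = 11, 20, 18
n_features = n_steps * n_channels

# Hypothetical "true" alkalinities spanning a plausible reef range (dKH).
true_alk = np.linspace(6.0, 12.0, n_samples)

# Fake spectral responses that depend (noisily) on alkalinity.
X = np.stack([
    np.tanh(np.linspace(0.0, 1.0, n_features) * a / 12.0)
    + rng.normal(0.0, 0.01, n_features)
    for a in true_alk
])

# Leave-one-out: train on 10 samples, predict the held-out one.
errors = []
for i in range(n_samples):
    mask = np.arange(n_samples) != i
    model = Ridge(alpha=1.0).fit(X[mask], true_alk[mask])
    pred = model.predict(X[i:i + 1])[0]
    errors.append(abs(pred - true_alk[i]) / true_alk[i])

print(f"mean relative error: {100 * np.mean(errors):.2f}%")
```

With real titration spectra you'd replace the synthetic `X` with the recorded channel readings, and the reported relative error is what I quoted as the +/-% accuracy above.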
I've attached my paper if anyone is interested. Please don't fault me for how the paper was written. I had to comply with many specific requirements for the assignment, so there are a bunch of rather uninteresting results included.