Article
Dennis J Dietzen, Connor J Blair, Stephen M Roper. Raising the Dead Volume: Analysis of Microsamples Diluted and Corrected with Near Infrared Tracer. J Appl Lab Med 2023; 8(5): 931-9.
Guest
Dr. Dennis Dietzen from Washington University School of Medicine and St. Louis Children’s Hospital.
Transcript
Randye Kaye:
Hello, and welcome to this edition of JALM Talk, from The Journal of Applied Laboratory Medicine, a publication of the Association for Diagnostics & Laboratory Medicine. I’m your host, Randye Kaye.
Over the past several decades, automation has transformed the workflow of the clinical laboratory. Robotic sampling and transfer of clinical laboratory specimens is now ubiquitous, particularly in high volume laboratories. However, while automation offers numerous benefits, such as increased throughput and reduced analytic errors, it requires relatively large volumes of blood specimens. This is particularly problematic when labs must process samples from neonatal and pediatric patients. Microtainer blood tubes are commonly used for these patient populations but small tubes typically cannot be handled by modern automation lines and therefore must be processed manually.
The September 2023 issue of JALM features a study that explores raising the dead volume. That is, increasing the specimen volume in the tube after the blood is drawn by adding a biochemically inert diluent. The authors’ novel approach could enable automated processing of small volume specimens using current technologies. Today, we are joined by the article’s first and corresponding author, Dr. Dennis Dietzen.
Dr. Dietzen is a Professor of Pathology and Pediatrics at Washington University School of Medicine and he is the Medical Director of Laboratories at St. Louis Children’s Hospital. Welcome, Dr. Dietzen. Firstly, what stimulated your group’s idea to explore diluting small volume specimens for use with laboratory automation?
Dennis Dietzen:
Yeah. That’s a good question. I think this sort of thing is the culmination of about 20 years of frustration with the marketplace and how we have to handle tiny specimens. Most of the drivers of the market of laboratory automation is the adult market, frankly. So, the average birth weight of a child in the United States is about three kilograms and they have maybe a few hundred of milliliters of blood onboard. So, if we pull 5 to 10 ml of blood from those kids multiple times a day, there’s not much left, and then my blood bankers get mad at me because we have to transfuse kids too often.
So there was that, combined with the fact that sample volumes are not really the problem in our world. The real problem is the dead volume: how much volume do you have to have in the tube for a probe to access it? And dead volumes are a couple of orders of magnitude higher than sample volumes. For 20 years, this has bothered me, and this is kind of a one-shot solution to the problem. And I think the other thing that really prompted this was Theranos, which was a bad experience for a lot of people. There was fraud there, and there was deceit there, but it drew so much attention and so much investment by marketing, you know, lots of information from very small volumes of blood. I think we should listen to that part of the Theranos experience.
So, I think all of those things, the pent-up frustration and the recent experience, kind of led me to think about better ways to do this.
Randye Kaye:
Yeah. Well, they definitely uncovered a need. They just didn’t provide the solution that we all hoped for. So, thank you.
Dennis Dietzen:
Exactly, that’s what we’re here for.
Randye Kaye:
Exactly. So, how is this particular approach to dilution different from conventional dilution studies that are already performed in clinical laboratories?
Dennis Dietzen:
Yeah, the approach is pretty simple when you think about it, though it’s a little more involved than a conventional dilution. When we do typical dilution studies in the laboratory, we have to know two things: the volume of the specimen that we’re dealing with and the exact amount of diluent that we add to it, and then we can correct the results mathematically. The point of this approach is that we don’t need to know, we should not need to know, the exact volume of specimen that we’re dealing with. The way that we build the diluent, we can build the means to figure that volume out into the construction of the diluent itself, okay?
This would be a little easier if tubes were always filled to the same volume, but in kids, we get a variety of sample volumes. Typically, samples are drawn via a butterfly into a syringe, and the contents of the syringe are distributed to multiple tube types without much attention to how much volume is delivered to each tube. So, we don’t know how much is in there. If we could figure out a way to back-calculate the volume in that tube without actually having to measure it, we would be in much better shape.
We referred to this approach, or I referred to this approach, as a “blind dilution” sort of approach, but it’s not really blind at the end of the day. That’s kind of how we formulated the approach to get at this problem.
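In outline, that back-calculation follows standard tracer-dilution algebra. The sketch below uses illustrative notation, not the paper’s own formulas:

```latex
% General tracer-dilution algebra (notation is illustrative, not the paper's).
% A diluent carrying tracer with neat absorbance A_d is added in volume V_d
% to a specimen of unknown volume V_s. The tracer absorbance measured in the
% mixture is A_m = A_d \cdot V_d / (V_s + V_d), so 1 - A_m/A_d = V_s/(V_s + V_d).
\[
  C_0 \;=\; C_m \cdot \frac{V_s + V_d}{V_s}
      \;=\; \frac{C_m}{\,1 - A_m/A_d\,}
      \;=\; C_m \cdot \frac{A_d}{A_d - A_m}
\]
% The undiluted analyte concentration C_0 is thus recovered from the diluted
% measurement C_m using only tracer absorbances -- V_s never has to be measured.
```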
Randye Kaye:
All right, thank you. Now, your diluent included a tracer for the mathematical correction of the diluted results. What were the key considerations in selecting this optical tracer and designing the diluent?
Dennis Dietzen:
Yeah. So, the diluent had to do a number of things. It had to be pretty much invisible except for the tracer. The tracer was key, and we actually got really lucky: the first one that we tried worked out reasonably well. The first quality the tracer had to have was a very high extinction coefficient. That would allow us to add very small quantities of it and still be able to find it, which would decrease its capacity to interfere with the tests we were going to run.
The second quality is that it had to be detectable using an optical system outside the realm of normal chemistry assays. If we put it right in the middle of the visible spectrum, it was going to get in the way of a number of measurements, just like typical interferents do. For example, hemoglobin absorbs around 500 to 550 nanometers and gets in the way. So, if we picked a tracer that absorbed at that same spot, we wouldn’t be achieving anything. The tracer that we picked is a near-IR tracer that has its maximal absorbance above 800 nanometers, so we would stay away from those things.
The other components of the diluent we didn’t spend a lot of time optimizing because, frankly, we didn’t think they were that important. We put a little bit of buffer in there just to make sure the pH didn’t alter what we were trying to do, and we put a tiny amount of detergent in it too, to try to mimic the surface tension of a plasma specimen, so that it would be handled accurately by the automated chemistry equipment we were using.
And finally, those ingredients couldn’t be salts. They couldn’t be sodium salts, potassium salts, or chloride salts, because those would add to the endogenous quantity of the things we were trying to measure in these specimens. We got lucky, I think, that it turned out to be simple. At the beginning, it seemed like it might be a tough hill to climb, but that’s how we approached building it.
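To see why a high extinction coefficient lets a trace quantity suffice, here is a back-of-the-envelope Beer-Lambert calculation; the numbers are illustrative and are not the study’s tracer or instrument values:

```python
# Beer-Lambert law: A = epsilon * l * c
# Illustrative values only -- not the tracer or optics from the study.
EPSILON = 2.0e5   # L/(mol*cm), a plausible magnitude for a near-IR dye
PATH_CM = 1.0     # optical path length in cm

def absorbance(conc_mol_per_l: float) -> float:
    """Predicted absorbance of the tracer at a given molar concentration."""
    return EPSILON * PATH_CM * conc_mol_per_l

# A tracer concentration of only 5 micromolar already yields A ~ 1.0,
# which is easily measured -- so very little tracer is needed, minimizing
# any chance of it interfering with the chemistries themselves.
print(absorbance(5e-6))  # -> 1.0
```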
Randye Kaye:
Okay, it’s good to get lucky on occasion. But you also --
Dennis Dietzen:
It doesn’t happen to us very often. But this time, it did.
Randye Kaye:
Well, it’s good when it happens. Of course, you tested your approach and you used ten routine chemistry assays. So, what kinds of assays did you include and why?
Dennis Dietzen:
Yeah. So, 10 was just kind of a nice round number that we thought we could get away with. We wanted to test a variety of assay formats to make sure that the tracer didn’t interfere either with the optical measurements or with the chemistries themselves. So, we picked a variety. We picked sodium because it’s an electrode-based approach, and we thought there was minimal chance that the tracer would interfere there; it certainly did not. Then we had a couple of assays, total protein and total calcium, that just use direct dye binding to the constituent we’re trying to measure. Then we used a couple of enzyme-mediated metabolite assays: glucose, which uses hexokinase and a dehydrogenase to generate signal, and cholesterol, which uses an esterase and an oxidase.
And finally, we wanted to test what the impact would be on enzyme activity measurements. For those, we picked creatine kinase and ALT as the two enzyme assays. So, the 10 assays that we picked covered a broad range of assay formats and a broad dynamic range of compounds, so we could interrogate the process pretty thoroughly.
Randye Kaye:
All right, thank you. So, can you tell me about the major experimental design components that you built into the study?
Dennis Dietzen:
It was a pretty simple approach, but we did a couple of things just to make sure we weren’t fooling ourselves. One of the things we had to do was include an offline spectrophotometer, because the chemistry instrument we used didn’t have a filter that would allow us to measure the absorbance of the tracer, so we had to build an offline approach to the volume correction.
We corrected the volume dilution using the tracer data, but we also built in a blind approach: Dr. Roper, who’s one of the authors on the paper, actually distributed the samples and recorded the volume he distributed and the amount he diluted them by.
So, I was blinded to all of this. He was blinded to the results of the studies, and the technician who did all the measurements was blinded to the math involved in doing those dilutions. So, we ended up doing the correction two different ways. We did the actual chemical correction using the absorbance from the tracer dilution, and then we did a mathematical correction from the recorded volumes, just to make sure we weren’t fooling ourselves and to see whether our tracer dilution technique matched our mathematical dilution correction.
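A minimal sketch of that two-way check, with hypothetical function names and numbers (the study’s tracer measurement actually ran on an offline spectrophotometer):

```python
# Compare the tracer-based correction against the "ground truth"
# correction from the recorded (blinded) volumes.
# All names and numbers here are illustrative, not from the paper.

def tracer_correction(a_diluent: float, a_mixture: float) -> float:
    """Dilution correction factor from near-IR tracer absorbance."""
    return a_diluent / (a_diluent - a_mixture)

def volume_correction(v_specimen_ul: float, v_diluent_ul: float) -> float:
    """Dilution correction factor from the recorded pipetted volumes."""
    return (v_specimen_ul + v_diluent_ul) / v_specimen_ul

# Example: a 100 uL specimen raised with 200 uL of traced diluent.
measured_glucose = 30.0  # mg/dL, result on the diluted specimen
by_tracer = measured_glucose * tracer_correction(0.90, 0.60)
by_volume = measured_glucose * volume_correction(100.0, 200.0)

# Recovery of the tracer-corrected vs. volume-corrected result;
# values near 100% mean the blind tracer approach matches the math.
print(f"tracer: {by_tracer:.1f}, volume: {by_volume:.1f}, "
      f"recovery: {100 * by_tracer / by_volume:.0f}%")
```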
Randye Kaye:
Wow, very thorough. So, these are really encouraging results but are there any limitations to this technique?
Dennis Dietzen:
There certainly are, and we anticipated most of them. This was more of a proof-of-concept study than anything else. When you start to dilute specimens, you start to move their concentrations into the lower end of the dynamic range of these assays, and we know very well that imprecision gets a little worse at those low ends. So, we actually measured that and dealt with it. Part of the error involved in our corrections is certainly secondary to some of that imprecision.
We tried to use assays that had broad dynamic ranges for that purpose. For example, we didn’t use the serum sodium application on the instrument; we used the urine sodium application, because it had a broader dynamic range and allowed us to measure things at lower concentrations. Some analytes are certainly out of reach of this approach given the way that chemistry assays are built today. Creatinine in kids is at the low end of the dynamic range of creatinine assays, so a lot of dilution takes it out of the dynamic range for us. We could not really touch immunoassays using this approach either.
Good examples are TSH and troponin. The detection limits of those two analytes are really critical, so if we dilute them any lower than they already are, we defeat the purpose of having those highly sensitive assays. This approach is really not applicable to those assays at this point. I think the other main limitation here is that we did an awful lot of manual offline sample manipulation. This is designed for utility in a very automated, robotic sort of environment, but to do the proof-of-principle study, we ended up doing a lot of manual manipulation, and I think a lot of the variability in recovery that we saw is certainly due to that manual sample manipulation.
So, are these limitations surmountable? Absolutely. A lot of the routine chemistry assays that we do today have dynamic ranges poised to measure what is ambient in a plasma specimen, but we can move that dynamic range around a little bit to accommodate a reasonable amount of dilution, a modest amount of dilution, shall we say. These limitations are just there because of the way we do things and the way we’ve always done things. Many of them are not insurmountable.
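As a rough illustration of that dynamic-range limitation, consider a hypothetical low pediatric creatinine against a typical assay floor; the numbers below are illustrative, not taken from the study:

```python
# Hypothetical illustration of the dynamic-range limitation.
# Values are illustrative, not taken from the study.
creatinine = 0.30          # mg/dL, plausible pediatric plasma level
assay_lower_limit = 0.20   # mg/dL, a typical lower limit of quantitation

for dilution_factor in (1, 2, 3):
    diluted = creatinine / dilution_factor
    status = "quantifiable" if diluted >= assay_lower_limit else "below LLOQ"
    print(f"{dilution_factor}x dilution -> {diluted:.2f} mg/dL ({status})")

# Even a 2x dilution drops the result below the measurable range, which is
# why analytes like pediatric creatinine (and highly sensitive immunoassays
# such as TSH or troponin) are out of reach without re-tuned assay ranges.
```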
Randye Kaye:
All right, thank you. So, what happens next? Where does the project go from here?
Dennis Dietzen:
That’s a good question. I don’t know all the answers to that. I think there’s – we’re interested in finding some partners to maybe see if this has, can we build an instrument like this and can it penetrate the market? Can it disrupt the market, I think is what we’re thinking. Are there other options out there that will help us do this?
Clearly, there’s momentum in the current diagnostic marketplace, or inertia, I think, is a better word; it’s a very mature marketplace. The same players generate the next generation of instruments all the time, and the robotics and sample processing equipment don’t really change that much. And the beauty of this approach, I think, is that it doesn’t require a quantum leap in technology. It’s compatible with current sample handling, current liquid handling equipment, current centrifugation processes, and current tube manipulation processes.
So, it would not take an extraordinary leap, in my mind, to do this. We’re looking into some mechanisms now to try to build a prototype, to see what it could look like and what mechanical, engineering, and software hurdles we might run into. That’s our next step. The main purpose of this study was just to demonstrate that we could do it, and now I think we’re in kind of uncharted territory, at least for me, about how best to apply it and how to find the right partners, to see if this sort of technology has legs and might solve the crisis around generating as much data as we possibly can from very tiny sample volumes.
Randye Kaye:
All right. Well, it sounds like a really good start. Thank you so much for joining me today.
Dennis Dietzen:
Thank you, appreciate it.
Randye Kaye:
That was Dr. Dennis Dietzen from Washington University, describing the JALM article, “Raising the Dead Volume: Analysis of Microsamples Diluted and Corrected with Near Infrared Tracer.” Thanks for tuning in to this episode of JALM Talk. See you next time and don’t forget to submit something for us to talk about.