A Practical Review of Adaptive Platform Trials
Traditional randomized controlled trials (RCTs) can provide rigorous evidence but are often slow and resource-intensive, requiring separate trials for each intervention. Adaptive platform trials (APTs) have been promoted as a solution, offering a framework that tests multiple therapies under a single protocol, with arms added or dropped as evidence accumulates. However, their advantages come with trade-offs that warrant scrutiny. In this review, we critically appraise 3 landmark APTs. The I-SPY2 trial accelerated Phase II oncology research by utilizing Bayesian adaptive randomization and surrogate endpoints; however, much of its efficiency stemmed from relying on intermediate outcomes, which may not reliably predict survival. RECOVERY demonstrated the power of scale on a pragmatic UK-wide platform, but its success reflected health system infrastructure, political leadership, and the unique circumstances of the COVID-19 pandemic as much as its design. REMAP-CAP, a perpetual platform trial for pneumonia, rapidly switched to pandemic mode in 2020 and tested COVID-19 therapies using Bayesian models and response-adaptive randomization (RAR); however, the RAR amplified random noise in some domains, exposing patients to interventions later shown to be ineffective. A recent systematic review confirmed wide heterogeneity in APTs and suboptimal reporting. APTs are not inherently better than classical RCTs. Gains in speed may depend on less rigorous endpoints, complex adaptive methods, or streamlined oversight, each of which introduces new risks of error. As APTs spread to new fields such as transfusion medicine, clinicians and researchers must learn to recognize both the potential benefits and the pitfalls of this design.
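To make the response-adaptive randomization (RAR) concern concrete, the sketch below is an illustrative Thompson-sampling allocation with Beta-Bernoulli posteriors; it is not code from I-SPY2, RECOVERY, or REMAP-CAP, and the arm counts are hypothetical. It shows how a small chance imbalance early in a trial can already pull allocation probabilities well away from equal randomization, the kind of noise amplification the review cautions about.

```python
# Illustrative sketch (not from any cited trial): response-adaptive randomization
# via Thompson sampling with Beta-Bernoulli posteriors. Early, noisy successes
# shift allocation probabilities away from 50:50.
import random

def thompson_allocation(successes, failures, n_draws=10_000):
    """Estimate the probability that each arm receives the next patient."""
    n_arms = len(successes)
    wins = [0] * n_arms
    for _ in range(n_draws):
        samples = [random.betavariate(1 + successes[i], 1 + failures[i])
                   for i in range(n_arms)]
        wins[samples.index(max(samples))] += 1
    return [w / n_draws for w in wins]

# Two arms after only 10 patients each: a chance imbalance of 6/10 vs 4/10
# responders already skews allocation markedly toward the "lucky" arm.
print(thompson_allocation(successes=[6, 4], failures=[4, 6]))
```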
Gene Therapies for Hemoglobinopathies: Efficacy, Cell Collection & Transfusion Support
Sickle cell disease (SCD) and transfusion-dependent β-thalassemia (TDT) are complex disorders, often resulting in lifelong morbidity and reduced life expectancy. Allogeneic hematopoietic stem cell transplant (HSCT) is curative, with matched-related donor (MRD) transplant having the highest success. MRD availability is limited for both disorders, and HSCT carries the risk of transplant-related complications, such as graft-versus-host disease (GVHD) and graft failure. Gene therapy (GT) offers an alternative curative option by modifying autologous hematopoietic stem and progenitor cells (HSPCs), making the treatment available to all patients while eliminating the risk of GVHD. The U.S. Food and Drug Administration (FDA) has approved GTs for both SCD and TDT: lovotibeglogene autotemcel (Lyfgenia) and exagamglogene autotemcel (Casgevy) in 2023 for SCD, and betibeglogene autotemcel (Zynteglo) in 2022 and exagamglogene autotemcel (Casgevy) in 2024 for TDT. This article appraises the studies on which the FDA approvals were based, with comments on transfusion and stem cell collection regimens. These latter aspects highlight variability in practice and the need for additional studies to optimize pretransfusion regimens and the collection process for successful GT.
Dried Plasma: Where Are We and Where Next?
Traumatic hemorrhage is a major cause of morbidity and mortality in both military and civilian settings. Early hemostatic resuscitation, including red cell and plasma administration, is a mainstay of treatment. Prehospital blood transfusion benefits trauma patients, but delivery is logistically challenging. Dried plasma, which can be stored at ambient temperature and reconstituted rapidly without specialist equipment, offers a pragmatic solution to the logistical barriers to prehospital transfusion. This review outlines the mechanisms of action of plasma and approaches to evaluating its efficacy, summarizes advances in drying technologies and their effects on plasma quality, and critically appraises 4 published studies.
Monitoring Mitochondrial Oxygen Tension: mitoPO₂ as Physiologic Transfusion Trigger?
Hemoglobin-based red blood cell (RBC) transfusion triggers are inadequate for personalized transfusion decisions because they are population-based and therefore unable to identify the individual patients who will benefit from RBC transfusion. Physiological transfusion triggers are therefore sought to provide tools for a more individualized approach. Since mitochondria are the ultimate destination of oxygen, it seems reasonable to suggest that measuring oxygen at the mitochondrial level might provide insight into the need for RBC transfusion. Mitochondrial oxygen tension (mitoPO₂) is a novel clinical parameter that can be measured with an optical technology. This narrative review provides a brief introduction to mitoPO₂ monitoring and uses 5 recent studies to explore the potential of mitoPO₂ as a tool for assessing the need for transfusion and/or monitoring the effect of transfusion. A mathematical model shows ideal behavior of mitoPO₂ around the critical hematocrit, and 4 recent clinical studies show that mitoPO₂ is an independent parameter that can be used in transfusion-related studies. Further investigation into the potential role of mitoPO₂ in transfusion medicine is needed.
ABO matching for platelet transfusions for prevention or treatment of bleeding: A systematic review with meta-analysis
Transfusion of ABO-identical platelets is recommended in national guidelines, though transfusion of ABO non-identical platelets has been widely adopted to ensure availability and reduce wastage. When ABO non-identical platelets are required, there is a lack of consensus on whether to prioritise major or minor compatibility. We conducted a systematic review and meta-analysis (PROSPERO CRD42023450792) of randomised and non-randomised studies to assess whether outcomes differ between ABO-identical and non-identical (major, minor, or bi-directional mismatch) platelet transfusions. From 4177 potential references, 18 studies met our criteria: 3 randomised controlled trials (RCTs), 8 prospective and 7 retrospective observational studies. Where data were available, the evidence was of very low certainty as to whether ABO-identical and non-identical platelet transfusions differed for clinically significant (WHO grade 2+ and 3+) bleeding, mortality, acute transfusion reactions, or platelet refractoriness. Platelet increments were the most frequently reported outcomes. Overall, there was a paucity of evidence on clinical outcomes, including bleeding risk, for ABO-identical compared with non-identical transfusion. We make recommendations for designing and reporting future platelet ABO matching studies based on our observations in this review. Future studies should consider the effect of repeated exposure to ABO-identical or non-identical transfusions and known confounders.
Toward Personalized Transfusion Strategies: The Emerging Role of Wearable Biosensors in Chronic Anemia Management
Chronic transfusion-dependent anemia presents ongoing challenges in optimizing symptom control and functional outcomes, particularly in an older, often comorbid patient group. Conventional transfusion strategies based on fixed hemoglobin thresholds may inadequately address the individual variability in oxygen delivery needs and symptom burden. Wearable biosensor technologies enable continuous monitoring of physiological parameters such as heart rate, respiratory rate, and physical activity in real-world settings. These tools offer the potential to detect early deterioration, support more responsive, patient-centered transfusion decisions, and improve hemovigilance. This review evaluates current evidence on the feasibility, acceptability, and clinical relevance of biosensor use in transfusion medicine. Findings from recent pilot studies demonstrate high data quality, favorable tolerability, and preliminary indications of physiological response following transfusion. However, the clinical utility of biosensor-guided transfusion strategies remains unproven, with key challenges including data interpretation, workflow integration, and validation of clinically meaningful endpoints. As the field moves toward personalized supportive care, biosensors may offer a novel means to optimize transfusion timing, preserve functional capacity, and enhance quality of life.
Efficacy of Granulocyte Transfusions in Treating Neutropenic Infections: A Systematic Review and Meta-Analysis of Intervention Studies
The therapeutic value of granulocyte transfusions (GTX) remains debated. We conducted a systematic review and meta-analysis of intervention studies evaluating GTX efficacy in treating neutropenic infections. MEDLINE, EMBASE, and Cochrane Central were searched from inception to March 2025 to identify interventional studies evaluating the efficacy of GTX for neutropenic infections. Studies were qualitatively summarized. Summary risk ratios (RR) with 95% confidence intervals (CIs) were estimated for randomized controlled trials (RCTs) and non-randomized controlled trials (NRCTs) using random-effects models. Certainty of evidence was evaluated using GRADE. There were 110 studies meeting inclusion criteria: 16 RCTs, 14 NRCTs, and 80 uncontrolled trials. The most frequent underlying disease was leukemia, and the most frequently reported pathogen was Candida. In RCTs, GTX showed no significant all-cause mortality reduction over standard of care in pediatric/adult patients or neonates, both associations with low certainty of evidence. In contrast, prospective NRCTs including pediatric/adult patients showed that GTX led to lower all-cause mortality (RR 0.40; 95% CI: 0.23-0.68; I² = 64%), particularly among recipients of high-dose GTX (≥1 × 10¹⁰ cells/transfusion), with very low certainty of evidence. The results support a dose-response relationship and highlight heterogeneity in patients, treatment settings, and infections. This work recommends carefully designed future RCTs with strict patient stratification.
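As a rough illustration of the random-effects pooling named above (not the review's actual data or code), the following sketch computes a DerSimonian-Laird pooled risk ratio and I² from hypothetical study-level log risk ratios and their variances.

```python
# Illustrative sketch (hypothetical data): DerSimonian-Laird random-effects
# pooling of study-level risk ratios, the general approach named in the abstract.
import math

def pool_log_rr(log_rrs, variances):
    w = [1 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))
    df = len(log_rrs) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                       # between-study variance
    w_re = [1 / (v + tau2) for v in variances]          # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, log_rrs)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return math.exp(pooled), math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se), i2

# Three hypothetical non-randomized studies (log RR, variance of log RR):
rr, lo, hi, i2 = pool_log_rr([-1.1, -0.7, -0.4], [0.10, 0.08, 0.12])
print(f"Pooled RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f}), I^2 = {i2:.0f}%")
```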
Artificial Intelligence and Machine Learning in Transfusion Practice: An Analytical Assessment
Transfusion medicine is vital to healthcare and affects clinical outcomes, patient safety, and system resilience while addressing challenges such as blood shortages, donor variability, and rising costs. The integration of artificial intelligence (AI) and machine learning (ML) presents new opportunities to improve clinical decision-making and operational effectiveness in this field. This structured narrative review identified and evaluated studies applying AI and ML in transfusion medicine. A search of PubMed and Scopus for articles published between January 2018 and April 2025 yielded 565 publications. Studies were included if they applied AI or ML techniques, focused on transfusion management or decision support, and were evaluated using electronic health records or expert review. Four exemplar studies were selected, each representing a distinct AI paradigm: supervised, unsupervised, reinforcement, and generative learning. These studies were critically appraised for methodological rigor, clinical relevance, and potential for implementation in practice. The reviewed studies reflected a clear shift from traditional analytic methods toward more advanced computational approaches to improve prediction accuracy, optimize resource allocation, and support clinical decision-making. Three overarching themes emerged: the need to balance model complexity with interpretability and clinical feasibility; the impact of data quality and preprocessing on model performance and fairness; and the barriers to broader applicability and cross-institutional deployment. As technological barriers continue to decline, future challenges will increasingly center on privacy regulations, infrastructure constraints, and aligning model complexity with practical utility. Thoughtful integration of these considerations through scalable, clinical-grade, and transparent solutions will be critical in realizing the full potential of AI and ML in transfusion medicine.
Intravenous Iron Therapy Versus Blood Transfusion for Iron Deficiency Anemia: A Systematic Review
This systematic review aimed to assess and compare the effect of blood transfusion and intravenous iron therapy on hemoglobin levels based on clinical trials. A search was conducted on 25 September 2024 using the PubMed, Cochrane, and Embase databases to identify studies comparing intravenous iron with blood transfusion in patients with iron deficiency anemia (hemoglobin <12 g/dL for women and <13 g/dL for men). The selected outcome was change in hemoglobin levels. The quality of the trials was assessed using the Cochrane Risk of Bias Tool and the Newcastle-Ottawa Quality Assessment Scale. We included 5 studies (3 randomized controlled trials, 1 observational study, and 1 retrospective study) comprising a total of 154,539 patients. Patient populations were heterogeneous, encompassing surgical patients, patients with hip fracture, and pregnant women. Due to heterogeneity among the included studies, hemoglobin levels were reported at varying follow-up intervals. At 3 weeks or later after initial treatment, 3 studies reported significantly higher hemoglobin levels (0.7 g/dL to 1.4 g/dL higher) in the intravenous iron group compared with the blood transfusion group; the remaining 2 studies found similar hemoglobin levels. Less than 3 weeks after initial treatment, 2 studies reported significantly higher hemoglobin levels in the blood transfusion group compared with the intravenous iron group. Our findings indicate that blood transfusion is more effective in achieving a rapid increase in hemoglobin levels shortly after therapy initiation, although this advantage diminishes relatively quickly. In contrast, intravenous iron appears to produce a more gradual but longer-lasting increase in hemoglobin levels. However, our findings are limited by the small number of trials and the questionable methodological quality of the included studies, resulting in a high risk of bias. Further investigation is warranted.
Optimizing Buffy Coat Pooling: Enhancing Platelet Yield Through Platelet Count-Based Sorting
Optimizing platelet yield in pooled buffy coat (BC)-derived platelet products by sorting them according to donor whole blood (WB) platelet counts is a promising approach to increase the production of double-dose platelet concentrates while reducing variability among products. We aimed to assess the benefit of sorting according to an HTML report generated by an in-house-developed preselection algorithm. In this study, we compared two approaches of BC pooling to produce pathogen-inactivated double-dose platelet concentrates. The platelet count of donor WB was measured using a hematology analyzer prior to pooling. In one group, six BCs were randomly assigned to pools; in the other group, six BCs were sorted by the platelet counts of WB samples prior to pooling according to the preselection algorithm, which selects six BCs for each pool such that the yields of the resulting products are similar. All BCs were included, as no minimum or maximum platelet count entry criteria were used. Yield and divisibility rate of both approaches were compared using the Wilcoxon rank-sum test to assess the impact of sorting by platelet count. Sorting BCs according to our algorithm resulted in significantly higher median platelet concentrations (×10³/µL), rising from 1247 (interquartile range [IQR] 1207-1349) when randomly assigned to 1307 (IQR 1237-1381) when sorted (P = .0434). Correspondingly, yields per platelet unit (×10¹¹/unit) increased significantly from 4.6 (IQR 4.3-4.8) when randomly assigned to 4.8 (IQR 4.5-5.1) when sorted (P = .0191). The proportion of divisible pathogen-inactivated platelet units increased from 72.1% to 89.5%. For both approaches, all units were above the minimum threshold (>2.0 × 10¹¹/unit); no maximum threshold was defined. Assigning BCs to pools according to the preselection algorithm enables the production of platelet concentrates with increased yields and leads to more standardized products.
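The in-house preselection algorithm itself is not described in detail in the abstract, so the sketch below shows only one generic way such platelet-count-based sorting could work: a greedy heuristic that places the highest-count buffy coats first, each into the currently lowest-sum open pool of six, so that pool totals end up similar. All counts are hypothetical.

```python
# Illustrative sketch only: a generic greedy heuristic for balancing pools of
# 6 buffy coats by donor whole-blood platelet count. This is not the published
# in-house algorithm, just one way to equalize expected pool yields.
def balance_pools(platelet_counts, pool_size=6):
    n_pools = len(platelet_counts) // pool_size
    pools = [[] for _ in range(n_pools)]
    # Highest counts first, each placed into the currently "poorest" open pool.
    for count in sorted(platelet_counts, reverse=True):
        open_pools = [p for p in pools if len(p) < pool_size]
        target = min(open_pools, key=sum)
        target.append(count)
    return pools

# Example: 12 hypothetical donor WB platelet counts (x10^3/uL), two pools of 6.
counts = [310, 180, 250, 220, 205, 290, 175, 260, 240, 195, 230, 270]
for pool in balance_pools(counts):
    print(pool, "sum =", sum(pool))
```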
Quantifying Harms Associated With Red Cell ABO Incompatible Blood Transfusion: A Systematic Review of the UK SHOT Literature
ABO-incompatible (ABOi) red blood cell (RBC) transfusions can lead to severe clinical consequences, including patient death. Electronic systems, such as Bedside Electronic Transfusion Checks (BETC), have been developed to lower the risk of these serious incidents occurring due to errors in patient identification at the bedside; however, the benefits for patients have not yet been fully quantified. To address this gap, we aimed to quantify the harms (ie, morbidity and mortality) associated with ABOi RBC transfusions in the UK, enabling us to better understand the benefits of BETC in preventing these events. Twenty-seven years of published UK hemovigilance data from cases submitted to Serious Hazards of Transfusion (SHOT), covering reports from 1996 to 2023, were reviewed using systematic review methodology by 2 independent reviewers. Data were collated into a Microsoft Excel database and analyzed to determine the number of reports of ABOi RBC transfusion and the rate of mortality/morbidity associated with these events. Morbidity was defined as hemolytic transfusion reaction (acute and delayed), any organ injury, extended length of hospital stay, the requirement for mechanical ventilation, ITU admission (including critical care units), and any other adverse events as reported in each case. Over 27 years (1996-2023), 55.3 million RBC units were issued in the UK, with 368 ABOi transfusions, equating to 0.67 per 100,000 transfusions. Clinical errors accounted for 53.3% of the observed ABOi transfusions (0.36 per 100,000), primarily occurring during administration (0.16 per 100,000), blood collection (0.10 per 100,000), and sample collection (0.07 per 100,000). Laboratory errors accounted for 13.6% of the observed ABOi transfusions (0.09 per 100,000), predominantly as a consequence of errors in pretransfusion testing (0.06 per 100,000). Mortality among the observed ABOi transfusions was 6.3% (0.04 per 100,000), with major morbidity at 23.9% (0.16 per 100,000), including ITU admissions (0.03 per 100,000) and hemolytic reactions (0.05 per 100,000). While ABOi RBC transfusions have become rare in the UK, they are associated with significant short-term morbidity and mortality. Early SHOT reports lacked standardization and provide limited data on patient outcomes. When patient outcomes were reported, they were limited to short-term outcomes immediately following the ABOi transfusion. No data were reported on longer-term patient outcomes, limiting the ability to assess long-term harm. Enhancing hemovigilance practices is essential to reducing ABOi risks. National hemovigilance schemes worldwide need to harmonize and standardize the collection of short-term and long-term outcome data for ABOi RBC transfusions so that the risk and burden of these events on patients can be better understood.
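As a quick check of the headline arithmetic, the snippet below reproduces the reported rate of 0.67 ABOi transfusions per 100,000 issued RBC units from the counts given in the abstract (368 events over 55.3 million units).

```python
# Reproducing the per-100,000 rate reported in the abstract from its own counts.
abo_incompatible = 368
units_issued = 55_300_000

rate_per_100k = abo_incompatible / units_issued * 100_000
print(f"{rate_per_100k:.2f} ABOi transfusions per 100,000 units issued")  # ~0.67
```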
Ultra-Restrictive Transfusion Thresholds in Critically Ill Adults: Perhaps Not for Everyone
Combinations of Non-di(2-ethylhexyl) Phthalate Collection Sets, Storage Bags and Additive Solutions for Red Blood Cells
For decades, di(2-ethylhexyl) phthalate (DEHP) has been the primary plasticizer used to make polyvinyl chloride (PVC) blood bags flexible. DEHP leaches into the blood product, stabilizing red blood cell (RBC) membranes and preventing excessive hemolysis. Despite being classified as a substance of very high concern due to its potential endocrine-disrupting and carcinogenic effects, DEHP has continued to be used in red cell concentrate (RCC) storage bags, as alternatives have often led to reduced RCC quality during storage. However, under the European Medical Device Regulation, the use of DEHP in medical devices is restricted to below 0.1% by weight as of July 2030. As a result, the effects of several alternative plasticizers, such as di(2-ethylhexyl) terephthalate (DEHT), 1,2-cyclohexane dicarboxylic acid diisononyl ester (DINCH), and n-butyryl-tri-n-hexyl citrate (BTHC), on RCC quality have recently been investigated in combination with different storage solutions. Although previous studies using these new combinations showed variable results, these alternatives remain the most promising options, with current data demonstrating reduced leaching and lower toxicity compared with DEHP. This review highlights key publications on the transition away from DEHP-PVC blood bag systems for RCC storage, demonstrating that several non-DEHP alternatives are viable replacement options, particularly when combined with next-generation storage solutions. Future studies are required to assess the frequency of adverse events and the occurrence of handling issues such as leakage, and to evaluate practical performance and clinical efficacy through post-transfusion recovery and increment studies.
Analytical Review: Neutrophil Extracellular Traps and Antiphospholipid Syndrome
Antiphospholipid syndrome (APS) is an autoimmune prothrombotic disorder defined by the presence of one or more antiphospholipid antibodies (aPL) in conjunction with clinical manifestations such as thrombosis and/or obstetrical complications. One of the notable recent developments in APS research is the identification of a contributory role for neutrophil extracellular traps (NETs) in its pathogenesis, establishing a mechanistic link between thrombosis, inflammation, and complement activation. NETs, composed of decondensed chromatin and neutrophil-derived granule proteins, are released in response to various infectious and sterile triggers. In individuals with APS, elevated NET levels and the presence of anti-NET antibodies have been observed, aligning with thrombotic events and enhanced complement system activation. Studies support an emerging model in which neutrophils in APS are primed to form NETs as a central mechanism in the development of thrombosis. This review explores multiple mechanisms linking NETs and thrombosis in APS, including: the contribution of aPL to enhanced leukocyte adhesion and the induction of NETosis via P-selectin glycoprotein ligand-1 (PSGL-1) and the transcription factor KLF2; cyclic AMP and the adenosine A2A receptor on the neutrophil surface as negative regulators of NETosis and thrombus formation in APS; and NET-mediated resistance to activated protein C leading to hypercoagulability, amongst others. Intervening in NET-related pathways represents a promising therapeutic strategy to mitigate thrombotic risk in APS, underscoring the need for ongoing investigation into neutrophil-mediated mechanisms in this autoimmune disorder.
Ultra-Restrictive Transfusion Thresholds in Critically Ill Adults: Are We Ready for the Next Step?
Anemia is almost universal in critically ill patients, with 25% receiving blood transfusions as clinicians aim to prevent insufficient oxygen delivery. The current 'restrictive' hemoglobin (Hb) threshold of 7 g/dL for the nonbleeding critically ill population is supported by several landmark transfusion trials. While some trials have investigated lower transfusion thresholds, these were not conducted in this specific population. Transfusion is associated with various risks, including transfusion-associated circulatory overload, transfusion-related acute lung injury, and hemolytic reactions. Moreover, transfusion products are scarce and expensive, as they are produced from voluntary blood donations. Therefore, it is essential to limit blood transfusion to when it is absolutely necessary. Research indicates that several patient categories tolerate Hb levels lower than 7 g/dL. For instance, studies on acute hemodilution in healthy volunteers have shown that lower Hb levels do not lead to organ ischemia. Similarly, studies involving patients who refuse transfusions often report Hb levels down to 5 g/dL or less. These lower Hb levels appear to have limited impact on mortality- and morbidity-related outcomes. In patients with severe burns or hematological disorders, Hb levels below 7 g/dL are not associated with significant adverse outcomes. These findings suggest that the transfusion threshold for critically ill patients could potentially be lowered, as Hb levels under 7 g/dL do not inherently lead to increased mortality or morbidity. An individualized approach to deciding whether or not to transfuse might be best. This shift in transfusion practice could help reduce costs and minimize the risks associated with blood transfusions.
ADAMTS13 Testing During Clinical Remission of Immune Thrombotic Thrombocytopenic Purpura: A Critical Review
Immune thrombotic thrombocytopenic purpura (iTTP), an autoimmune disorder characterised by thrombocytopenia and microangiopathic haemolytic anaemia, is associated with significant morbidity. The diagnosis is made when ADAMTS13 activity is <10% in conjunction with supporting clinical features. Treatment includes plasma exchange with immunosuppressive and anti-von Willebrand factor therapies. While the diagnosis and management of acute iTTP are well established, our understanding of optimal monitoring during clinical remission remains incomplete. Clinical relapse of iTTP occurs most commonly within the first year of remission; however, there is little consensus on the frequency of ADAMTS13 monitoring during clinical remission or on when to intervene if there is ongoing deficiency. By selecting studies that performed ADAMTS13 activity testing during clinical remission of iTTP, we critically analyse the current research on ADAMTS13 monitoring during clinical remission and suggest areas for further research, with a focus on clinically important outcomes.
The Utility of a Critical Antibody Titer in Anti-K Alloimmunized Pregnancies: A Systematic Review and Meta-Analysis of Diagnostic Test Accuracy
Anti-Kell (anti-K) alloimmunization is a known cause of severe hemolytic disease of the fetus and newborn (HDFN), yet the utility of a critical maternal antibody titer in guiding clinical management remains debated. We conducted a systematic review and meta-analysis to evaluate the diagnostic accuracy of a maternal anti-K titer threshold of ≥8 for predicting the need for intrauterine intervention due to severe anti-K-mediated HDFN. In parallel, we characterized all reported cases of severe HDFN occurring in the setting of low maternal anti-K titers (<8). Studies were excluded if they lacked reported titers, did not include K-positive or K-unknown fetuses, failed to report fetal outcomes, or included interventions that could lower maternal alloantibody levels. Studies that assessed all alloimmunized patients meeting inclusion criteria were incorporated into a diagnostic test accuracy (DTA) meta-analysis; all eligible studies were included in a qualitative synthesis. Fifty-four studies, comprising 582 fetuses, met inclusion criteria. Of these, 6 studies (350 fetuses) were included in the DTA analysis, which demonstrated a pooled sensitivity of 97.0% (95% CI, 88.7%-99.2%) and specificity of 33.1% (95% CI, 27.9%-38.8%) for an anti-K titer ≥8. Among fetuses affected by severe HDFN, 98.6% (204/207) were associated with maternal anti-K titers ≥8. These findings suggest that severe disease is uncommon in the setting of low anti-K titers and support the use of a critical titer threshold to inform antenatal surveillance. Reevaluation of current clinical guidelines may be warranted in light of these data.
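For readers less familiar with diagnostic test accuracy metrics, the sketch below shows how sensitivity and specificity of the titer ≥8 threshold are derived from a 2×2 table of titer result versus need for intrauterine intervention. The counts used here are hypothetical, chosen only to roughly echo the pooled estimates, and are not the data from the 6 DTA studies.

```python
# Illustrative sketch (hypothetical 2x2 counts): deriving sensitivity and
# specificity of the anti-K titer >= 8 threshold from a 2x2 table.
def diagnostic_accuracy(tp, fn, fp, tn):
    sensitivity = tp / (tp + fn)   # severe HDFN correctly flagged by titer >= 8
    specificity = tn / (tn + fp)   # unaffected pregnancies correctly below threshold
    return sensitivity, specificity

# Hypothetical counts that roughly reproduce the pooled 97% / 33% estimates.
sens, spec = diagnostic_accuracy(tp=96, fn=3, fp=140, tn=70)
print(f"Sensitivity {sens:.1%}, specificity {spec:.1%}")
```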
Identifying Modifiers of CAR T-Cell Therapeutic Efficacy and Safety: A Systematic Review and Individual Patient Data Meta-Analysis
CAR T-cell therapy is effective in relapsed/refractory hematologic malignancies, but its use has been tempered by heterogeneity in response and safety outcomes. We performed an individual patient data meta-analysis (IPDMA) of CAR T-cell therapy in patients with hematologic malignancies to explore whether patient-level factors modify therapeutic efficacy and safety. We searched MEDLINE, Embase, and Cochrane CENTRAL for relevant trials. IPD were collected and pooled from each included trial, and the prevalence of outcomes among strata of potential modifiers was explored. Our primary outcome was complete response; the secondary outcomes were cytokine release syndrome (CRS) and immune effector cell-associated neurotoxicity syndrome (ICANS). We identified 89 trials comprising 2,331 patients for the IPDMA. The complete response proportion ranged from 25% to 75% depending on cancer type. Decreased complete response was seen in those who received bridging therapy compared with those who did not (34% vs 58%; RR: 0.55; 95% CI: 0.30-0.98), as well as with autologous cell sources compared with allogeneic sources (53% vs 67%; RR: 0.61; 95% CI: 0.43-0.87). Compared to CAR T-cell therapies targeting CD19 alone, therapies that combine CD19 targeting with additional targets such as CD20, CD22, CD30, CD33, LeY, NKG2D, or BCMA were associated with higher complete response rates (72% vs 58%; RR: 1.69; 95% CI: 1.15-2.50). Autologous cell sources demonstrated an increased risk of ICANS relative to allogeneic sources (24% vs 3%; RR: 10.48; 95% CI: 1.87-58.57). Safety and efficacy of CAR T-cell therapy within specific cancer types were also affected by modifiers including bridging therapy, CAR T-cell source, CAR T-cell target, sex, age, number of cell infusions, co-stimulatory domain, and dose.
Platelet Additive Solutions and Pathogen Reduction: Impact on Transfusion Safety, Patient Management and Platelet Supply
Since 1998, leukoreduction has been used in France for all platelet concentrates (PCs), both apheresis-derived PCs (APCs) and pooled whole blood-derived buffy-coat PCs (BCPCs). Platelet additive solutions (PAS), introduced in 2005, accounted for over 80% of the platelet supply from 2011 to 2017. The Intercept pathogen reduction technology (PR), started in a pilot study in 2007, was generalized in 2018. Between 2007 and 2021, the use of BCPCs increased steadily from 23% to 70% of the supply. Objectives: To analyze the impact of these modifications on adverse transfusion reactions (ATRs), patient management, and blood transfusion organization. Results: The overall incidence of ATRs /10 PCs is significantly lower with PAS- and PR-PCs than with PCs in plasma (PL), with the decreasing hierarchy PL > PAS > PR. PAS- and PR-PCs lead to significantly lower incidences of allergic ATRs and of alloimmunization to RBC antigens (RC-AI). The incidence of transfusion-transmitted bacterial infection (TTBI) is significantly reduced, by 95%, with PR-PCs. APC-related ATR incidence is significantly higher than that of BCPCs for allergy (+233%), TTBI (+100%), APTR (+75%), Major-ABO-II (+65%), HLA/HPA-AI (+38%), FNHTR (+22%), and life-threatening ATRs (+106%). A single diagnosis is significantly less associated with APCs: RC-AI (-47%). The generalization of PR-PCs, which have a lower platelet content than PAS- and PL-PCs, is associated with a significant 9% decrease in the ATR incidence per PC, a 13% increase in the number of PCs transfused per patient, and a nonsignificant 3% increase in the ATR incidence per patient. The percentage of outdated PCs declined significantly from 3.7% to 1.7%.
