Genomics has undergone a transformation so rapid that even seasoned scientists remark on its velocity. Sequencing costs that seemed impossibly low a decade ago have fallen further still. Spatial and single-cell technologies now populate not just top-tier institutes but mid-size research hospitals. Physicians increasingly expect molecular information at the time of diagnosis, not months later. The scientific imagination has expanded accordingly: whole-genome sequencing for newborn screening, single-cell atlases of entire organs, and real-time tumor profiling during therapy are now routine.
Yet, amid this acceleration, a curious bottleneck has emerged. Sequencing, once the rate-limiting step, is often the easiest part. The hard parts lie upstream and downstream: preparing pristine libraries and extracting coherent meaning from vast data streams. Genomic science today depends on more than instrument speed. It depends on choreography—precisely timed incubations, bead binding and elution steps, pipetting of microliter volumes, and hundreds of discrete actions performed without error. Downstream, bioinformatic interpretation creates its own demands for standardization, transparency, and reproducibility.
The pace of change only compounds the complexity. “New kits, sequencers, and methods are constantly emerging,” said Michelle Hiscutt, PhD, market specialist manager of the European life science and diagnostic division at Agilent. “Just as a lab validates one workflow, the next breakthrough is already making waves.” Automation once served as a stabilizing force; today it must be both stabilizing and adaptable. Rather than freezing a protocol in place, modern automation must absorb novelty.
Across interviews with leaders from Agilent, Integrated DNA Technologies (IDT), Opentrons Labworks, Illumina, MGI Tech, 10x Genomics, and SPT Labtech, a consistent narrative has emerged. Automation is no longer simply mechanization. It is becoming software-definable, modular, sensor-aware, and increasingly intertwined with artificial intelligence (AI). At the same time, it must remain dependable, affordable, and deployable beyond a handful of capital-rich research centers. The field is moving toward a new ideal: automation not just as efficiency, but as infrastructure for equitable precision medicine.

Modularity for a moving target
Agilent has leaned into this new paradigm. The company’s Bravo NGS Workstation has long been regarded as one of the more flexible automated liquid handlers for genomic workflows. That flexibility now reads like a prescient design decision. “We empower labs to innovate without being constrained by rigid systems,” Hiscutt said. In practice, that means workflows that accommodate new chemistries without mechanical overhauls, new analysis tools without software rewrites, and new scientific questions without months-long revalidation cycles.
Agilent’s automation of its SureSelect Max library prep kit illustrates the approach. This method is central to comprehensive cancer genomic profiling, detecting point mutations, indels, fusions, and copy-number alterations. Automating such a protocol requires more than scripting pipetting commands. It requires safeguarding the subtle kinetics that underlie hybrid capture and ensuring that every step functions across labs with varied experience levels. “Automation isn’t just about reducing hands-on time,” Hiscutt explained. “It’s about preserving the integrity of the assay and making sure every lab can achieve consistent results.”
Development was iterative and collaborative. Agilent worked hand-in-glove with application specialists and pilot labs to adjust parameters, reduce manual touch points, and maintain performance under the realities of everyday lab environments. The result is an automated workflow that mirrors manual performance while reducing waste and consumable usage. Sustainability, often a secondary consideration in automation, has become a design criterion.
Looking forward, Hiscutt sees the next frontier at the interface of automation and analytics. “Integrating AI-driven bioinformatics with automated sample prep will be transformative,” she said. In her view, modular automation platforms must serve not only as instruments of precision but as launchpads for AI-augmented workflows that learn and adapt alongside science.
Walk-away reliability as democratization

If modularity addresses the pace of innovation, walk-away reliability addresses its distribution. Verity Johnson, PhD, vice president of gene reading solutions at IDT, speaks directly to the unevenness of laboratory resources. “Automating wet-bench sample processing for genomics labs is deceptively complex,” she said. Although some institutions have dedicated automation engineers, many smaller or newer labs rely on multitasking scientists who cannot stand vigil over liquid handlers.
IDT’s work to establish robust, walk-away-capable next-generation sequencing (NGS) workflows on Hamilton and Beckman Coulter systems is therefore more than technical—it is cultural and geographic. It aims to place high-fidelity genomics within reach of labs without specialized staff. Key improvements came from rethinking common steps: reducing bead cleanups, refining consumable configurations, and tuning temperatures and transfer speeds. But the underappreciated challenge was reagent stability. Manual workflows allow researchers to retrieve refrigerated reagents at precise intervals; automation requires reagents to sit unrefrigerated through multi-hour protocols. “We rigorously evaluated each step to determine which reagents could tolerate ambient conditions,” Johnson said.

The payoff is practical. Labs gain back labor hours. Assays become more reproducible. And researchers can focus on interpretation and hypothesis-generation rather than mechanical vigilance. In oncology, this matters profoundly: the difference between an actionable sequencing report and an inconclusive one can rest on the stability of a reagent and the precision of a pipetting stroke.
IDT frames this not as luxury but necessity. “Our goal is to deliver turnkey solutions that empower labs of all sizes to generate and interpret genomic data at scale,” Johnson said. Walk-away automation becomes democratizing infrastructure by bringing genomic rigor to more bench tops, in more cities, and in more health systems.
A laboratory that evolves itself
Where Agilent and IDT highlight reproducibility and reach, Fran Slater, PhD, senior director of strategic marketing at Opentrons Labworks, zeroes in on adaptability. “Techniques and methods are in a constant state of evolution,” she noted. For many automation systems, every inventive graduate student or new reagent kit introduces friction. Opentrons’ mission is to turn that friction into flexibility.

Opentrons’ Flex platform allows labs to reconfigure routines rapidly, but the most striking development is OpentronsAI. “Researchers have created or updated nearly 20,000 protocols in the past year,” Slater said. The implication is remarkable: automation is becoming verbally programmable. Instead of writing formal code, scientists describe the desired steps in natural language, and the AI translates that intent into executable Python. Protocol writing, once a niche skill, becomes broadly accessible.
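The generated protocols target the Opentrons Python Protocol API. For flavor, here is a minimal hand-written sketch of what such a Flex protocol looks like; the labware, deck slots, and volumes are illustrative choices, not OpentronsAI output.

```python
from opentrons import protocol_api

metadata = {"protocolName": "Illustrative buffer transfer", "author": "example"}
requirements = {"robotType": "Flex", "apiLevel": "2.19"}

def run(protocol: protocol_api.ProtocolContext):
    # Load labware onto named Flex deck slots (choices are illustrative).
    tips = protocol.load_labware("opentrons_flex_96_tiprack_200ul", "D1")
    plate = protocol.load_labware("nest_96_wellplate_200ul_flat", "D2")
    reservoir = protocol.load_labware("nest_12_reservoir_15ml", "D3")
    pipette = protocol.load_instrument(
        "flex_1channel_1000", "left", tip_racks=[tips]
    )

    # Transfer 50 µL of buffer from the reservoir to the first plate column.
    pipette.transfer(50, reservoir["A1"], plate.columns()[0])
```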
Still, freedom demands safeguards. Ensuring that AI-generated scripts execute flawlessly is an ongoing engineering challenge. Opentrons addresses this by layering testing, simulation, and feedback loops that draw on thousands of user interactions. Slater envisions a future in which automation systems can propose modifications, detect suboptimal performance, and optimize workflows based on sensor input and historical data. Yet she repeatedly emphasized accessibility. “These advances must be accessible to a wide range of labs, not just the most well-funded,” she said. The future lab, in her telling, is both adaptive and inclusive.
Standardization that adapts to diversity
At Illumina, vice president of marketing Richard Shippy focuses on variation not in technology but in people and environments. “Clinical research labs working with saliva, buccal, fresh/frozen tissue, or FFPE (formalin-fixed paraffin-embedded) samples often have very different throughput requirements and staffing models,” he said. No single automation system currently serves every scenario. Illumina’s strategy is to co-develop workflows across multiple third-party platforms and ensure that, irrespective of whether labs run compact bench-top systems or industrial robots, library prep feeds seamlessly into sequencing and analysis.

The company builds in barcoding, real-time checks, and automated pass/fail logic so that laboratories can trust chain-of-custody and quality control (QC) at every step. Close collaboration with early adopters ensures that edge-case failures are identified before broad rollout. Looking ahead, Shippy anticipates more sensor-rich instruments, tighter integration between prep and analysis, and what he calls “workflow apps”—analytically verified digital protocols that remain version-locked for regulatory stability even as software evolves. Illumina is committed to making its multiomics solutions “as seamless, scalable, and reliable as any other laboratory assay,” he said.
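That pass/fail logic is, at its core, rule-based decision-making keyed to sample barcodes. The sketch below is a generic illustration of the pattern, not Illumina code; the thresholds are hypothetical stand-ins for assay-specific acceptance criteria.

```python
from dataclasses import dataclass

@dataclass
class LibraryQC:
    barcode: str               # sample barcode, for chain-of-custody
    concentration_ng_ul: float
    mean_fragment_bp: int

# Hypothetical acceptance criteria; real thresholds are assay-specific.
MIN_CONC_NG_UL = 2.0
FRAGMENT_BP_RANGE = (300, 600)

def passes_qc(lib: LibraryQC) -> bool:
    """Automated pass/fail decision recorded against the sample barcode."""
    size_ok = FRAGMENT_BP_RANGE[0] <= lib.mean_fragment_bp <= FRAGMENT_BP_RANGE[1]
    return lib.concentration_ng_ul >= MIN_CONC_NG_UL and size_ok

# Only passing libraries advance to sequencing.
batch = [LibraryQC("BC01", 3.1, 420), LibraryQC("BC02", 0.8, 950)]
to_sequence = [lib.barcode for lib in batch if passes_qc(lib)]
```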
Sequencing goes modular
Yang Meng, PhD, senior vice president at MGI Tech, argues that truly democratizing genomics means challenging a foundational assumption: that sequencers must be monolithic. “The core challenge today is systemic complexity,” he said. Closed instruments, proprietary consumables, and siloed workflows constrain scalability. MGI’s αBrick module breaks from this model. It treats sequencing not as a box but as a capability unit: a snap-in sequencing module analogous to a processor in a computer.

“This marks our shift to delivering plug-and-play intelligent sequencing capabilities,” Meng said. Achieving modular sequencing required reconsidering hardware interfaces, data standards, and fluidics architecture. By open-sourcing the αBrick standard, MGI hopes to spur an ecosystem in which instrument makers, software developers, and clinical labs co-develop specialized solutions.
Meng envisions sequencing embedded everywhere, from major cancer centers to community hospitals, from research core labs to regional diagnostic centers. “Sequencing should be available at any medical institution in need,” he said. It is an ambitious vision, but in a world where genomics impacts oncology, infectious disease, neonatal care, and pharmacogenomics, it may be a necessary one.
The “last mile” of analysis
Even flawless automation fails if analysts cannot extract meaning from data. “Researchers are now generating single-cell data on billions of cells each year across a wide range of species and tissue types,” said Michael Schnall-Levin, PhD, chief technology officer of 10x Genomics. While there are common early steps in analysis independent of experimental design, every biological question ultimately requires customized approaches and code. This creates what Schnall-Levin calls a “last mile” problem, where the breadth of applications makes it difficult to fully automate workflows while still maintaining the flexibility needed for discovery. Moreover, single-cell and spatial datasets are not just large; they are conceptually rich and multiscale. Interpreting them requires statistics, biology, and often creativity.
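Those common early steps are themselves well standardized. Here is a minimal sketch of what they typically look like, using the open-source Scanpy toolkit rather than 10x’s own software, with an illustrative file name and conventional default parameters:

```python
import scanpy as sc

# Load a filtered gene-by-cell count matrix (file name is illustrative).
adata = sc.read_10x_h5("sample_filtered_feature_bc_matrix.h5")

# Common early steps, largely independent of the biological question:
sc.pp.filter_cells(adata, min_genes=200)      # drop empty or low-quality cells
sc.pp.filter_genes(adata, min_cells=3)        # drop rarely detected genes
sc.pp.normalize_total(adata, target_sum=1e4)  # correct for sequencing depth
sc.pp.log1p(adata)                            # variance-stabilizing transform
sc.pp.highly_variable_genes(adata, n_top_genes=2000)
sc.pp.pca(adata)                              # linear dimensionality reduction
sc.pp.neighbors(adata)                        # cell-cell similarity graph
sc.tl.leiden(adata)                           # unsupervised clustering
sc.tl.umap(adata)                             # 2D embedding for inspection

# The question-specific "last mile" (marker interpretation, condition
# comparisons, spatial or trajectory analysis) begins after this point.
```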

To lower the barrier, 10x has made tools like Cell Ranger accessible through natural-language interfaces linked to large language models (LLMs). “This means the ‘last mile’ problem can be addressed by LLM tools that can allow a scientist to ask questions in natural language without writing code,” Schnall-Levin said. Years of groundwork to modularize code and ensure reproducibility preceded this integration. “Then it was mainly about engineering the connections correctly to allow these software components to interact well with Anthropic’s Claude for Life Sciences,” he added.
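One common pattern for such integrations is to expose each pipeline step as a tool that an LLM agent can call. Below is a hypothetical sketch pairing an Anthropic-style tool definition with the real `cellranger count` command-line interface; the wrapper and schema are illustrative, not 10x’s actual implementation.

```python
import subprocess

# Hypothetical tool definition (Anthropic-style schema) describing one
# pipeline step so an LLM agent can invoke it from natural language.
CELLRANGER_COUNT_TOOL = {
    "name": "cellranger_count",
    "description": "Quantify gene expression from 10x FASTQs with Cell Ranger.",
    "input_schema": {
        "type": "object",
        "properties": {
            "run_id": {"type": "string"},
            "transcriptome": {"type": "string", "description": "Reference path"},
            "fastq_dir": {"type": "string"},
        },
        "required": ["run_id", "transcriptome", "fastq_dir"],
    },
}

def cellranger_count(run_id: str, transcriptome: str, fastq_dir: str) -> str:
    """Handler the agent framework dispatches to when the tool is called."""
    subprocess.run(
        [
            "cellranger", "count",
            f"--id={run_id}",
            f"--transcriptome={transcriptome}",
            f"--fastqs={fastq_dir}",
        ],
        check=True,
    )
    return f"Outputs written to {run_id}/outs"
```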
In the future, Schnall-Levin imagines AI as not only running pipelines, but also synthesizing literature, contextualizing findings, and recommending next steps.

Precision as foundation, miniaturization as frontier
SPT Labtech approaches automation from first principles: chemistry comes first. “Small errors in liquid handling or timing can cascade into big differences downstream,” said Paul Lomax, head of genomics. In clinical genomics, mistakes carry not just scientific cost but human consequence.

The company’s firefly®+ platform automates hybrid-capture and target-enrichment workflows, long regarded as extremely challenging for automation. Historically, scientists performing these assays relied on dexterity and intuition developed over years at the bench—watching beads behave and feeling the timing of washes. Converting that expertise into automation required intimate collaboration with assay developers and validation against real clinical and oncology samples.
Close attention to the firefly design means that a protocol downloaded from the SPT community will run on any unit in any lab, a level of machine-to-machine reproducibility that contrasts sharply with the fine-tuning often required to achieve consistency on other robotic platforms. “The result was a set of protocols that deliver the same assay performance as the manual gold standard while enabling true walk-away operation,” Lomax said.
The SPT Labtech ethos is not only to match the manual gold standard but also to eliminate variability from person to person or robot to robot. Precision is the foundation and miniaturization is the frontier. Automation must not simply be equivalent; it must be trustworthy.

SPT Labtech is also pushing reaction miniaturization, shrinking volumes to reduce cost and environmental impact. But miniaturization is not simply about smaller volumes. It requires greater precision, because errors scale non-linearly as volumes shrink. Inline optical and fluidic checks ensure that reactions proceed correctly, and dispensing technology has been engineered for microliter and sub-microliter reliability. Lomax sees the future as seamless: sample in, data out, with traceability, auditability, and QC built in. “Automation should enable access to clinical-grade genomics, not add risk or complexity,” he said. As genomics moves into mainstream clinical medicine, SPT Labtech’s precision-first philosophy may prove foundational.
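The non-linear scaling is easy to see with a back-of-the-envelope calculation: a fixed absolute dispensing error consumes a growing fraction of the reaction as volumes shrink. The figures below are purely illustrative, not instrument specifications.

```python
# Illustrative only: the ±0.2 µL error is a made-up figure, not an
# SPT Labtech specification. Relative error grows as 1/volume.
ABS_ERROR_UL = 0.2

for volume_ul in (50.0, 20.0, 5.0, 2.0, 0.5):
    rel_error_pct = 100 * ABS_ERROR_UL / volume_ul
    print(f"{volume_ul:>5.1f} µL reaction -> {rel_error_pct:5.1f}% relative error")
```

Halving a 50 µL reaction barely moves the needle, but below a few microliters the same absolute error becomes a double-digit share of the reaction, which is why sub-microliter dispensing demands inline verification.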
Toward intelligent, inclusive automation
Across these conversations, certain motifs recur. Oncology is the crucible where automation is tested first. Rare disease genomics follows, as does infectious disease sequencing and reproductive health. But all voices insist that automation must not only accelerate science—it must broaden who can do it. Agilent emphasizes modular platforms that evolve with technology. IDT stresses turnkey reliability for labs without specialist staff. Opentrons champions programmable automation powered by accessible AI. Illumina’s sequencing and software solutions are used in every field genomics touches, making it crucially important to design for reproducibility across diverse settings. MGI advocates open ecosystems for sequencing. 10x democratizes analysis literacy. SPT Labtech anchors the movement in precision and clinical trust.

Taken together, their visions describe a laboratory ecosystem where workflows assemble like software, robots anticipate failure modes, sequencing modules plug into broader systems, and AI helps interpret both experimental design and biological meaning. Crucially, companies want these capabilities distributed globally, not locked behind academic prestige or capital access.
Automation, in other words, is not a narrow engineering project. It is a structural transformation. Its success will be measured not merely in throughput, but in fairness; not only in precision, but in participation; not just in how much data we generate, but in how many lives are improved by it.
The era of manual pipelines built anew for each experiment is fading. In its place is an era where laboratories are intelligent systems: adaptable, rigorous, self-monitoring, and profoundly interconnected. The challenge now is not whether this transformation will occur, but whether we will design it thoughtfully enough to ensure that its power benefits all.
Mike May, PhD, is a freelance writer and editor with more than 30 years of experience. He earned an MS in biological engineering from the University of Connecticut and a PhD in neurobiology and behavior from Cornell University. He worked as an associate editor at American Scientist, and he is the author of more than 1,000 articles for clients that include GEN, Nature, Science, Scientific American, and many others. In addition, he served as the editorial director of many publications, including several Nature Outlooks and Scientific American Worldview.
