How AI Is Reshaping Lab Design
Artificial intelligence (AI) is no longer a future consideration in laboratory planning. Rather, it is actively reshaping how facilities are programmed, designed, and operated. For lab project teams and end users, the shift is not simply about integrating new technologies, but about rethinking the fundamental relationship between people, processes, and space.
Matthew Malone, science and technology practice leader at Perkins&Will’s New York and Philadelphia studios, tells Lab Design News how AI is not replacing traditional lab environments, but redistributing priorities within them.
Market reset driving smarter space planning
Despite concerns about rising lab vacancies in some markets, Malone emphasizes that the issue is less about AI disrupting space needs and more about timing and economics. “The current rise in lab vacancy is less about a fundamental mismatch between AI-enabled workflows and existing lab inventory, and more about broader macroeconomic forces reshaping the market,” he says.
Many facilities delivering today were conceived under vastly different conditions. “Many projects delivering vacant space today were conceived in a vastly different economic context, and are too far down the road to redirect in a cost-effective manner,” Malone says. This lag between design, construction, and occupancy underscores a key lesson for today’s projects: flexibility is no longer optional, but essential.
AI and automation are fundamentally altering the spatial composition of labs. Traditional bench-centric layouts are giving way to a more diversified ecosystem of wet labs, dry labs, automation zones, and office environments.
“We will see a shift in the ratio of wet labs to dry or computational labs and pure office environments,” Malone says, noting that future facilities could trend toward “potentially 33 percent each for office, dry/computational, and wet, or even 25 percent each for office, dry lab, automation, and wet lab.”
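For a rough sense of scale, the sketch below translates those ratios into areas for a hypothetical 100,000-square-foot program; the total square footage and the even splits are illustrative assumptions, not figures from any project Malone describes.

```python
# Illustrative only: splits a hypothetical 100,000 sq ft program across
# the space mixes Malone describes; the total area and rounding are assumptions.

def allocate(total_sf: int, ratios: dict[str, float]) -> dict[str, int]:
    """Split a total program area across space types by ratio."""
    return {space: round(total_sf * share) for space, share in ratios.items()}

three_way = {"office": 1 / 3, "dry/computational": 1 / 3, "wet lab": 1 / 3}
four_way = {"office": 0.25, "dry lab": 0.25, "automation": 0.25, "wet lab": 0.25}

for mix in (three_way, four_way):
    print(allocate(100_000, mix))
```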
This redistribution reflects the growing importance of computational work and data analysis. It also introduces new design tensions. Automation systems thrive in open, flexible environments, while data scientists require quieter, more controlled settings.
“AI aspects are typically managed by data analytics employees, who often want a more quiet or controlled office environment,” Malone says. At the same time, collaboration between disciplines remains critical, requiring thoughtfully designed hybrid spaces where computational and wet lab teams intersect.
Designing for automation and continuous operation
Automation is doing more than increasing throughput—it’s changing how labs function over time. Many AI-enabled labs operate continuously, requiring infrastructure and layouts that support 24/7 workflows.
“Robotics and high-throughput systems often require open ballroom environments with overhead service distribution and plug-and-play flexibility,” Malone says. These environments support rapid reconfiguration while minimizing downtime.
Importantly, automation also influences where and how people interact with the lab. Malone suggests that highly automated zones may become more machine-centric and less hospitable to human occupancy.
“I believe automation labs will function best as controlled, stand-alone environments with limited need for daylight, high lighting levels, or prolonged human occupancy,” he says.
This creates an opportunity to rethink adjacencies. By relocating human-centered activities—collaboration, meetings, informal exchanges—outside of automation-heavy zones, designers can improve both safety and knowledge sharing.
Infrastructure is the new backbone of lab design
As labs become more data-intensive, infrastructure is emerging as one of the most critical design drivers. Power, cooling, and digital connectivity are no longer secondary considerations—they are central to performance.
“Computing power needs will drive some variations in infrastructure from what is typically seen in traditional wet-lab environments,” Malone says. “AI will require 100 GbE or more if possible (10 GbE is a standard, but it is low).”
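To put those numbers in perspective, a quick, hedged calculation shows why the jump matters: assuming a hypothetical 50 TB training dataset and roughly 70 percent effective link utilization (both illustrative figures, not from Malone), moving the data drops from roughly 16 hours on 10 GbE to under two hours on 100 GbE.

```python
# Rough illustration of why link speed matters for AI-scale datasets.
# Assumes a hypothetical 50 TB dataset and ~70% effective link utilization;
# both figures are illustrative, not from the article.

def transfer_hours(dataset_tb: float, link_gbps: float, efficiency: float = 0.7) -> float:
    """Estimate hours to move a dataset over a network link."""
    dataset_bits = dataset_tb * 8e12               # terabytes -> bits
    effective_bps = link_gbps * 1e9 * efficiency   # usable bits per second
    return dataset_bits / effective_bps / 3600

for speed_gbps in (10, 100):
    print(f"{speed_gbps} GbE: ~{transfer_hours(50, speed_gbps):.1f} hours")
```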
This shift has cascading implications:
Higher power density to support compute-intensive processes
Increased cooling demand, often favoring chilled water systems over air-based approaches
Enhanced backup power systems to maintain continuous operations
Robust digital infrastructure for data flow, storage, and security
“Cooling suggests an increase in chilled water capacity. Power must also consider increases in demand load for backup power. A valuable benefit of increased automation is off-hour operations, making the need for uninterrupted power to support the machines, and the related cooling, more acute,” Malone says. “In addition, monitoring as part of a 24/7 operation will likely increase. This, along with the increased computing power, will drive one or two things: a need to increase data flow rate and a need for a more secure or controlled network for data flow, data storage, latency, and security/IP protection. Some AI or ‘in-silico’ organizations have the collection and licensing of their data as a business model, so keeping it secure is directly tied to revenue and sustained research.”
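As a back-of-envelope illustration of how those demands stack up, the sketch below sizes chilled water and backup power for a hypothetical 200 kW of on-site compute; the load, cooling overhead, and growth margin are assumptions based on standard rules of thumb, not project figures.

```python
# Back-of-envelope sizing sketch for on-site compute, assuming a hypothetical
# 200 kW IT load; conversion factors are common rules of thumb, not article data.

KW_PER_TON = 3.517  # 1 ton of refrigeration = 3.517 kW of heat rejection

def chilled_water_tons(it_load_kw: float) -> float:
    """Nearly all IT power becomes heat the chilled water plant must absorb."""
    return it_load_kw / KW_PER_TON

def backup_power_kw(it_load_kw: float, cooling_overhead: float = 0.35,
                    growth_margin: float = 1.2) -> float:
    """Generator/UPS sizing: IT load plus cooling overhead plus a growth margin."""
    return it_load_kw * (1 + cooling_overhead) * growth_margin

it_kw = 200
print(f"Cooling: ~{chilled_water_tons(it_kw):.0f} tons of chilled water")
print(f"Backup:  ~{backup_power_kw(it_kw):.0f} kW beyond life-safety loads")
```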
At the same time, the nature of lab data is evolving. AI-driven research relies on both internally generated experimental data and vast external datasets.
“AI driven labs also operate on two distinct types of data,” Malone says, including “wet lab generated training data” and “sequencing and databank information drawn from more than 80 open sources and core facilities.”
To support this, labs may increasingly incorporate on-site IT infrastructure such as micro data centers or blade farms—effectively blurring the line between laboratory and data facility.
Rethinking environmental strategies
One of the more surprising impacts of AI-driven lab design is its potential to improve sustainability—albeit through unconventional means. By reducing reliance on energy-intensive wet lab space and increasing computational zones, facilities may shift their energy profile.
“This evolution presents a significant opportunity,” Malone says. Facilities can “improve efficiency and reduce environmental impact by increasing their electrical demand, but trading airflow for water-based cooling.”
This represents a fundamental change in how lab sustainability is approached. Instead of focusing solely on reducing energy use, designers are optimizing how energy is consumed—moving from high air-change environments to more efficient cooling strategies.
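The physics behind that trade is straightforward: per unit of flow, water carries orders of magnitude more heat than air, which is why shifting cooling duty from air handling to chilled water pays off. The sketch below uses textbook fluid properties and an assumed 10 K temperature rise, both illustrative rather than drawn from the article.

```python
# A physics-level sketch of why trading airflow for water-based cooling saves
# energy: water carries far more heat per unit of flow than air.
# Fluid properties are textbook values; the 10 K temperature rise is assumed.

def heat_removed_kw(flow_m3_per_s: float, density_kg_m3: float,
                    cp_kj_per_kg_k: float, delta_t_k: float = 10.0) -> float:
    """Heat carried away by a fluid stream: q = m_dot * cp * dT."""
    mass_flow = flow_m3_per_s * density_kg_m3          # kg/s
    return mass_flow * cp_kj_per_kg_k * delta_t_k      # kJ/s == kW

air = heat_removed_kw(1.0, density_kg_m3=1.2, cp_kj_per_kg_k=1.006)
water = heat_removed_kw(1.0, density_kg_m3=998.0, cp_kj_per_kg_k=4.186)
print(f"1 m3/s of air removes   ~{air:.0f} kW")
print(f"1 m3/s of water removes ~{water:.0f} kW")
```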
For developers and owners navigating uncertain market conditions, AI presents both a challenge and an opportunity. Existing lab buildings may require repositioning to remain competitive.
“The first step is a careful reassessment of infrastructure through the lens of new space ratios and performance expectations,” Malone advises.
In many cases, existing systems can be adapted rather than replaced. For example, chilled beam systems—once considered niche—may become highly valuable in supporting increased cooling demands.
“Reducing wet lab intensity may allow for recalibration of air systems, maximizing existing capital investments by diverting cooling to data,” he says. This kind of targeted retrofit strategy can help owners align their assets with emerging tenant needs without overinvesting.
Designing for the unknown
Perhaps the most important takeaway for today’s lab projects is the need to design for change. AI technologies—and the workflows they enable—are evolving rapidly, making long-term adaptability a top priority.
“Long-term adaptability will depend on several key decisions made today,” Malone says. Among them:
Planning for a diverse mix of space types
Investing in scalable power and data infrastructure
Incorporating flexible distribution systems, such as overhead service carriers or utility bulkheads
Prioritizing bandwidth and backup power capacity
“Maximizing data bandwidth (flow and capacity) and ensuring robust backup power capacity beyond minimum life safety requirements will determine whether facilities remain competitive,” he adds.
AI isn’t just another tool in the lab—it’s a catalyst for a broader shift in how space is used, how infrastructure decisions are made, and how people engage with the built environment. Success will come from designing facilities that are not only technologically advanced, but also inherently flexible. The labs of the future won’t be defined by any single technology, but by their ability to evolve alongside the science happening within them.
