
Ethylene cracker efficiency erodes quietly through shifting feed conditions, furnace degradation, and run-cycle progression, and the margin gap between average and optimized performance now determines which assets stay competitive. This article traces where losses build across feedstock execution, furnace thermal transfer, coking behavior, and downstream separation loads; why static APC captures less value as conditions drift from its design basis; and how AI optimization trained on plant operating history coordinates severity, firing, and fractionation trade-offs as one connected system.
Inside an ethylene cracker, small efficiency losses build quietly through shifting feed conditions, furnace limits, and run-cycle progression. Those losses carry far more economic weight than they did in tighter market cycles. Global utilization has fallen to around 80%, and nearly 24% of capacity now faces potential closure. In that environment, the gap in ethylene cracker efficiency between an average performer and a well-optimized one can determine which assets stay competitive.
Feedstock economics set the floor, but operating discipline determines how much margin a plant actually captures. That distinction matters because feedstock positions are often fixed by geography and long-term contracts. The production optimization strategies that matter most are the ones operations leaders can actually influence day to day.
Margin losses in ethylene crackers accumulate across feedstock execution, furnace degradation, and fragmented control. The gap between current performance and recoverable value widens as utilization softens.
Here's where those losses build, why traditional control can't close the gap alone, and where the recoverable value sits.
The feedstock decision drives cracker economics more than almost any other variable. The gap between ethane and naphtha pricing has widened considerably during recent market cycles, reaching roughly $400 per metric ton at points since 2021. That cost gap explains why feed flexibility matters, but it also explains why flexibility alone isn't enough.
Naphtha-based operations produce co-products like propylene and butadiene that can diversify revenue when those markets cooperate. Ethane-fed crackers run lighter and simpler but carry more exposure to a single product and less flexibility when ethylene demand softens. The economics are never one-dimensional, and they shift faster than most operating models anticipate.
A feed that looked optimal based on last month's pricing may not be the right call this week. That's one reason feedstock optimization keeps moving up the priority list for operations teams managing multi-feed crackers.
For that reason, multi-feedstock flexibility matters more than it used to. Crackers designed to process naphtha, propane, butane, and LPG can shift in response to pricing dynamics. Each feed change demands a different furnace severity, steam-to-hydrocarbon ratio, and separation configuration.
A lighter feed like ethane requires higher cracking severity to achieve target conversion, while heavier feeds produce a wider slate of co-products that shifts the load on downstream fractionation. Getting those settings wrong, even slightly, erodes the economic advantage that multi-feed capability was supposed to provide.
A suboptimal severity setting doesn't just affect ethylene yield in the furnace; it ripples through fractionation energy, product purity, and compression load downstream.
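The shape of that trade-off can be sketched with a toy model. Every curve and coefficient below is hypothetical, invented purely for illustration: the point is that the severity which maximizes ethylene yield inside the furnace is not the severity that maximizes plant-level margin once downstream compression of extra light ends is charged against it.

```python
# Toy illustration: furnace-local vs plant-level severity choice.
# All numbers are hypothetical, chosen only to show the shape of the
# trade-off, not to represent any real cracker.

def ethylene_yield(severity):
    """Hypothetical yield curve: rises with severity, then flattens."""
    return 0.30 + 0.25 * severity - 0.10 * severity**2  # fraction of feed

def light_ends(severity):
    """Hypothetical methane/hydrogen make: grows steeply with severity."""
    return 0.05 + 0.12 * severity**2  # fraction of feed

def net_margin(severity, ethylene_value=1.0, compression_cost=1.5):
    """Margin per unit feed after charging downstream compression."""
    return (ethylene_value * ethylene_yield(severity)
            - compression_cost * light_ends(severity))

severities = [s / 100 for s in range(0, 101)]  # normalized 0..1 scale
best_yield = max(severities, key=ethylene_yield)
best_margin = max(severities, key=net_margin)

print(f"severity maximizing furnace yield: {best_yield:.2f}")
print(f"severity maximizing plant margin:  {best_margin:.2f}")
```

With these invented curves, the margin-optimal severity lands well below the yield-optimal one, which is the ripple effect in miniature: a furnace-local objective and a plant-level objective give different answers.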
Ethylene cracker efficiency erodes gradually across the run cycle, even when headline plant performance looks acceptable. The visible constraints usually show up first in furnaces, run length, and the interaction between upstream and downstream units.
Furnace thermal transfer is one of the main operating constraints. As tube metallurgy ages and coke deposits accumulate, heat transfer degrades progressively. Convection section fouling makes this worse over the run cycle, sometimes unevenly across furnace passes. Plants often respond by backing away from the edge of the operating window.
That protects equipment but also gives up yield and throughput that a healthier, energy-efficient furnace could sustain. When multiple furnaces in a bank age at different rates, the constraint on one furnace can force conservative operation across the entire bank, even when several furnaces still have headroom.
Coking behavior also creates uneven performance. When one coil or pass deteriorates faster than the rest, the weakest performer can end up driving the decoking schedule for the full furnace. That shortens run length and compresses available throughput across the bank. The loss is not always dramatic on a single day. It often appears as a series of smaller compromises that accumulate across the cycle.
Each run gets a little shorter, and decoking frequency climbs beyond what the original plan assumed. Over a year, those extra decoke cycles add up to real lost production time and undercut the olefins plant optimization gains that operations teams are working toward.
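A back-of-envelope calculation makes the annual cost concrete. The run lengths and decoke duration below are hypothetical, not figures from any plant, but the arithmetic is the same for real numbers:

```python
# Hypothetical numbers: how a shorter run length between decokes
# translates into lost production days over a year.
DAYS_PER_YEAR = 365
DECOKE_DAYS = 2.0  # assumed downtime per decoke cycle

def online_days_per_year(run_length_days):
    """Days on-line per year for a given run length between decokes."""
    cycle = run_length_days + DECOKE_DAYS      # produce, then decoke
    cycles_per_year = DAYS_PER_YEAR / cycle
    return cycles_per_year * run_length_days

planned = online_days_per_year(60)  # run length the original plan assumed
actual = online_days_per_year(50)   # runs shortened by uneven coking

print(f"planned on-line days: {planned:.1f}")
print(f"actual on-line days:  {actual:.1f}")
print(f"production days lost: {planned - actual:.1f}")
```

Even this modest ten-day loss of run length costs on the order of two production days a year per furnace in the sketch, before counting the decoke fuel and the throughput compression across the bank.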
Even when furnaces get most of the attention, downstream fractionation and compression loads remain tightly linked to upstream operating decisions. A change that looks favorable in the furnace can compromise energy optimization or separation balance downstream, which is why local optimization often leaves broader value unrecovered.
Higher cracking severity in the furnace, for example, produces more methane and hydrogen that must be separated and compressed before ethylene reaches the cold box. Plants rarely optimize those interactions as one system, and the cumulative cost of that fragmented approach is difficult to see from any single unit's metrics.
The operators managing furnaces and the operators managing fractionation are often making good local decisions that add up to a suboptimal plant-level outcome.
Advanced process control (APC) has delivered real value in ethylene operations, particularly in stabilizing coil outlet temperatures and managing fuel flow. But the architecture behind it also puts a ceiling on how much value APC captures when conditions keep changing.
Traditional APC holds a target while adjusting around it. It's less effective at continuously rebalancing operating choices as the process moves through feed transitions, run-cycle progression, or grade changes. Static targets don't follow a moving process as closely as operators often need, and the gap between the controller's design basis and current operating conditions tends to widen over the cycle. The models behind traditional APC are typically linearized around a narrow operating region, which works well when conditions stay close to that region but loses accuracy as the process drifts.
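That accuracy loss is easy to see in a small sketch. The yield curve below is invented for illustration: a first-order (tangent-line) model identified at the design point tracks the true curve nearby but drifts increasingly as the process moves away from it.

```python
# Illustration of why a model linearized around one operating point
# loses accuracy as the process drifts. The "true" curve is hypothetical.

def true_yield(severity):
    """Hypothetical nonlinear yield response (peaks near severity 0.6)."""
    return 0.55 - 0.4 * (severity - 0.6) ** 2

DESIGN = 0.6  # operating point the linear model was identified around

def linear_model(severity, h=1e-6):
    """First-order (tangent) approximation of true_yield at DESIGN."""
    slope = (true_yield(DESIGN + h) - true_yield(DESIGN - h)) / (2 * h)
    return true_yield(DESIGN) + slope * (severity - DESIGN)

error_near = abs(true_yield(0.65) - linear_model(0.65))  # close to basis
error_far = abs(true_yield(0.90) - linear_model(0.90))   # after drift

print(f"model error near design basis: {error_near:.4f}")
print(f"model error far from design:   {error_far:.4f}")
```

In this toy case the error grows more than an order of magnitude between the two points, which is the gap a controller built on the linearized model inherits as feed and run-cycle conditions move.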
The usual result is a more conservative operating posture: plants trade yield for run length, and experienced operators compensate around the controller by adjusting setpoints manually based on pattern recognition. That works, but it doesn't scale across shifts or units, and the plantwide process control infrastructure captures less economic value the farther the process moves from the controller's original basis.
Operators have years of decision-making experience behind their responses, and no AI system replaces that judgment. But static linear models were designed for a narrower problem: holding key variables near a target, not continuously finding the best economic operating point across an entire unit.
Most operations teams know the resulting tension well. Stability and run length take priority, and some yield and energy opportunity gets left behind.
AI optimization approaches the cracker differently. Dynamic process models trained on actual plant operating history learn relationships as they exist on that unit, not only as they were expected at design conditions. A reinforcement learning controller can then adjust multiple setpoints together as conditions evolve.
Furnace optimization is often the highest-value starting point. Instead of treating control loops independently, AI optimization can coordinate changes across the furnace bank and reflect how local decisions affect the rest of the unit. The wider view is what recovers those losses, because many come from interactions, not single variables.
Adjusting severity on one furnace while accounting for its neighbors' coking state, the downstream fractionation load, and the current feed composition produces a different answer than optimizing each in isolation. The same system perspective applies to yield and energy.
Setpoint optimization across furnaces, fractionation, and compression happens at the same time rather than unit by unit.
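A minimal sketch shows why coordinated moves can beat independent ones. The objective function below is invented, with an interaction term standing in for the coupling between furnace severity and a downstream fractionation setpoint; none of the coefficients come from a real plant:

```python
from itertools import product

# Hypothetical plant-level objective with an interaction term between
# two setpoints; coefficients are illustrative only.
def margin(severity, reflux):
    return (0.6 * severity + 0.5 * reflux
            - 0.4 * severity**2 - 0.4 * reflux**2
            - 0.5 * severity * reflux)  # coupling between the units

grid = [g / 100 for g in range(0, 101)]  # normalized 0..1 setpoints

# Independent optimization: each "unit" tunes its own variable while
# assuming the other stays at a nominal value of 0.5.
sev_indep = max(grid, key=lambda s: margin(s, 0.5))
ref_indep = max(grid, key=lambda r: margin(0.5, r))

# Coordinated optimization: search both setpoints jointly.
sev_joint, ref_joint = max(product(grid, grid), key=lambda p: margin(*p))

print(f"independent choice: {margin(sev_indep, ref_indep):.4f}")
print(f"coordinated choice: {margin(sev_joint, ref_joint):.4f}")
```

Because the interaction term makes each unit's best setting depend on the other's, the two locally optimal answers combine into a worse plant-level result than the jointly optimized pair, which is the fragmentation cost described above in miniature.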
Many implementations build value by starting in advisory mode. The system recommends setpoint changes, and operators decide whether to accept them. That approach lets teams compare trade-offs before committing, improve cross-shift consistency, and track how process behavior changes over time.
Some plants may choose to stay in advisory mode because it supports operator judgment, builds trust gradually, and delivers useful decision support on its own. Advisory mode also gives planning and economics teams a practical way to run scenarios, evaluate changing constraints, and challenge LP assumptions built on older models without immediately moving to automation.
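The advisory workflow itself is simple to picture: the optimizer proposes, the operator disposes, and every decision is logged for later review. The structures below are a hypothetical sketch of that loop, not any vendor's actual API; all names and fields are invented.

```python
from dataclasses import dataclass, field

# Sketch of an advisory-mode workflow. Names and fields are
# illustrative, not a real product interface.

@dataclass
class Recommendation:
    tag: str         # setpoint identifier, e.g. a coil outlet temperature
    current: float
    proposed: float
    rationale: str   # why the model suggests the move

@dataclass
class AdvisoryLog:
    entries: list = field(default_factory=list)

    def record(self, rec: Recommendation, accepted: bool):
        """Log the recommendation together with the operator's decision."""
        self.entries.append((rec, accepted))

    def acceptance_rate(self) -> float:
        """Share of recommendations operators chose to apply."""
        if not self.entries:
            return 0.0
        return sum(a for _, a in self.entries) / len(self.entries)

log = AdvisoryLog()
log.record(Recommendation("COT_F101", 843.0, 845.5,
                          "feed lighter than basis"), accepted=True)
log.record(Recommendation("REFLUX_C201", 0.82, 0.78,
                          "reduce reboiler duty"), accepted=False)
print(f"acceptance rate: {log.acceptance_rate():.0%}")
```

Tracking acceptance over time is one concrete way teams measure the trust-building the next paragraphs describe: a rising acceptance rate signals that operators see their own judgment reflected in the recommendations.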
The trust-building period is where real adoption happens. When experienced operators recognize their own decision patterns reflected in the model's behavior, the system becomes more practical to use.
It supports judgment instead of competing with it, and the shared model gives operations and engineering a common reference for how the plant actually behaves today instead of falling back on design-basis assumptions that may no longer apply.
For process industry leaders seeking more margin from existing ethylene cracker assets as the gap between viability and closure keeps narrowing, Imubit's Closed Loop AI Optimization solution is built for this operating reality. The platform learns from a plant's own historical data, builds dynamic process models specific to that unit, and writes optimal setpoints directly to the distributed control system in real time. Plants can begin in advisory mode and progress toward closed loop operation as confidence builds.
Get a Plant Assessment to discover how AI optimization can recover efficiency and margin from your existing ethylene cracker assets.
Feedstock variability increases the value of AI optimization because each composition change shifts the best balance of severity, steam ratio, firing patterns, and downstream separation load. Static control models tuned to a narrower basis struggle as those conditions move through feed transitions and run-cycle progression. In plants that already have multi-feed flexibility, AI optimization lets teams adjust continuously instead of relying on fixed targets that fit only part of the run. That continuous coordination is what makes a self-optimizing petrochemical plant possible.
AI optimization can support longer run lengths by changing how plants respond as furnace conditions evolve, especially when individual coils or passes begin to deteriorate faster than the rest. By reflecting behavior across the furnace bank instead of only isolated loops, the model gives operators an earlier read on severity, firing, and throughput trade-offs. Teams get a clearer view of the relationship between decoking timing, available throughput, and plant operations at the unit level.
The main difference is that AI optimization adapts to a moving process rather than holding static targets around a narrower design basis. Traditional APC still delivers value, but its linear models have limits in a nonlinear, time-varying process like steam cracking, especially when feed conditions and run-cycle progression keep changing. AI optimization uses models built from existing plant data and coordinates variables across furnaces and downstream units as one connected system.