Before “Engineering” Existed

For most of human history, the people who built things, solved technical problems, and created innovations were simply craftspeople, artisans, and inventors. They learned through apprenticeships, owned their own workshops, and directly profited from their creations. When James Watt improved the steam engine, he partnered with manufacturer Matthew Boulton to sell it. When Eli Whitney invented the cotton gin, he attempted to manufacture and license it himself (with mixed success, admittedly). The person who invented something was typically the person who figured out how to make it, sell it, and profit from it.

This wasn’t romanticized entrepreneurship - it was simply how technical work was organized. If you could design and build something useful, you controlled your work and reaped the rewards. The concept of separating the “thinking” from the “business” hadn’t yet crystallized.

The Birth of the Professional Engineer

The engineer as a distinct occupation emerged in the 18th and 19th centuries as technical problems became too complex for traditional craftwork. Civil engineers designed canals and railways. Mechanical engineers developed industrial machinery. Electrical engineers harnessed a force that had barely been understood a century prior.

Engineering became a profession - with formal education, professional societies, and licensing. The first engineering schools appeared in France in the 1700s, and by the second half of the 1800s, institutions like MIT and the Colorado School of Mines were training engineers in America. Engineers gained prestige and authority. They were professionals, like doctors and lawyers, with specialized knowledge that commanded respect.

Many engineers of this era still operated with significant autonomy. Thomas Edison didn’t just invent - he ran companies. George Westinghouse was both engineer and industrialist. Alexander Graham Bell co-founded Bell Telephone. The engineer-entrepreneur was common, perhaps even the norm among the most successful.

But as companies grew larger and more complex, a division of labor emerged. Someone had to manage the workers, handle the finances, negotiate with suppliers, and navigate regulations. Initially, these roles often still fell to engineers themselves, or to founders with engineering backgrounds. But gradually, a managerial class began to develop.

The Rise of “Scientific Management”

In the early 1900s, Frederick Taylor introduced “scientific management” - the idea that work could be optimized through systematic study and standardization. While Taylor was himself an engineer, his methods treated workers (including technical workers) as variables to be optimized rather than professionals to be respected. The engineer’s judgment increasingly became subordinate to the system, the process, the manager’s oversight.

This era also saw the rise of large corporations with layers of management. Engineers became employees in bureaucratic hierarchies. They might still do impressive technical work, but they did it at the direction of others, on timelines set by others, with success defined by others. The engineer was now “managed.”

Still, through the mid-20th century, engineers maintained significant status. The aerospace and defense industries of the Cold War era treated engineers as strategic assets. Corporate research labs like Bell Labs, Xerox PARC, and IBM Research gave engineers remarkable freedom to explore and innovate. Engineers led major projects and often ascended to executive leadership. The CEO of a manufacturing or technology company was likely to have an engineering background.

The MBA Revolution

The 1960s and 70s saw the rapid expansion of business schools and the MBA degree. Business administration became a discipline unto itself, increasingly disconnected from the industries being administered. A new philosophy emerged: managers didn’t need to understand the technical details of what their companies did. They needed to understand finance, marketing, and strategy. Management was a portable skill.

This created a class of professional managers who moved between industries with little domain expertise. Companies began to be run by people who had never designed a product, operated a factory, or understood the technical challenges their engineers faced daily. The engineer’s perspective became one input among many, and often not the most valued one.

Shareholder Primacy and the Financialization of Everything

The real inflection point came in the 1980s and 90s with the rise of shareholder value ideology. Milton Friedman had argued in 1970 that a company’s only social responsibility was to increase profits for shareholders. By the 1980s, this philosophy dominated. Jack Welch at General Electric became the model: ruthless cost-cutting, relentless focus on quarterly earnings, mass layoffs to boost stock prices.

Engineering became a cost center to be minimized rather than a capability to be cultivated. Long-term R&D investments fell out of favor. Why spend years developing new technology when you could boost this quarter’s earnings with layoffs and stock buybacks? Engineers were increasingly seen as interchangeable resources - line items on a spreadsheet.

Offshoring accelerated. If engineers in another country cost less, the financialized logic was simple: move the work there. The fact that you might be dismantling decades of accumulated expertise and institutional knowledge was irrelevant to the quarterly earnings call.

California Energy Crisis

The California energy crisis of 2000-2001 demonstrated what happens when essential infrastructure is treated as a financial playground rather than a system to be engineered and managed. Deregulation had opened electricity markets to trading, and companies like Enron discovered they could manipulate supply and create artificial scarcity to drive up prices. They had names for their schemes: “Fat Boy,” “Death Star,” “Get Shorty.”

Engineers who actually understood the grid watched in frustration as traders created rolling blackouts not because there wasn’t enough power, but because creating scarcity was profitable. The people who knew how to keep the lights on were subordinate to people who knew how to game the market. When the scheme eventually collapsed, Enron executives walked away wealthy while California taxpayers and ratepayers bore the costs.

San Diego Home Price Bubble

In the early 2000s, San Diego home prices detached from any rational relationship to local incomes. A home that an engineer could have afforded in the 1990s now required exotic financing: interest-only loans, adjustable-rate mortgages with teaser rates, stated-income loans that didn’t verify what borrowers claimed to earn. The Bay Area had always been expensive, but now San Diego was following the same trajectory.

This wasn’t driven by a sudden shortage of housing or surge in high-paying jobs. It was driven by the financialization of housing - the treatment of homes as investment vehicles - and by loan originators who profited from volume regardless of whether borrowers could actually afford the loans long-term. The people selling the loans and bundling them into securities made their money upfront. By the time the system collapsed, they’d already cashed out.

Great Financial Crisis

The inevitable collapse came in 2008. The Great Financial Crisis revealed that the sophisticated financial engineering that was supposed to optimize everything had instead created systemic fragility. The mortgage-backed securities, the collateralized debt obligations, the credit default swaps - all the complex instruments that were supposed to distribute and manage risk - turned out to be a house of cards built on the assumption that housing prices would never fall.

Engineers and other workers paid the price through mass layoffs, decimated retirement accounts, and lost homes. Meanwhile, the architects of the crisis - the executives who had profited enormously from the bubble - largely escaped consequences. Many received bailouts. Some got bonuses. The system that had treated engineers as disposable resources proved very protective of the people at the top.

The crisis should have prompted a rethinking of shareholder primacy and short-term financial extraction. Instead, once the immediate panic subsided, the same patterns resumed.

The Tech Era

Tech founders were often engineers themselves. But as tech companies matured, they followed the same patterns: MBAs took over leadership, financial engineering became as important as software engineering, and technical workers became resources to be “optimized.” The difference was mainly aesthetic - they got free snacks and a foosball table while still being managed by people who didn’t understand what they did.

The 2010s and 2020s saw this reach new extremes. Private equity firms bought engineering companies and extracted value through debt loading and cost cutting. Tech companies conducted mass layoffs while posting record profits, viewing engineers as a variable expense to be adjusted for “efficiency.” The language of “human capital” became standard - engineers weren’t professionals, they were capital to be allocated.

Meanwhile, executive compensation exploded while engineering salaries stagnated relative to company value creation. A CEO who laid off thousands of engineers might receive a bonus worth more than those engineers would collectively earn in their lifetimes.

COVID

Then 2020 brought COVID-19, and the working world went through a disorienting series of whiplashes. First came the furloughs and layoffs as companies panicked. Then came the sudden shift to remote work - something many companies had insisted was impossible right up until it became mandatory overnight.

As the economy lurched back to life, something unexpected happened: a scramble for talent. For a couple of years, engineers found themselves in demand. Changing jobs could mean 20%, 30%, even 50% salary increases. Remote work meant geographic arbitrage - you could earn Silicon Valley money while living anywhere. It seemed, briefly, like leverage had shifted back toward workers.

It didn’t last. By 2023, the layoffs were back, justified by vague appeals to “efficiency” and “rightsizing” even as companies posted strong profits. The remote work that had been fine during the pandemic was suddenly a problem that required return-to-office mandates - often transparently designed to induce voluntary attrition without the optics of layoffs.

Inflation Reduction Act

In August 2022, Congress passed and President Biden signed the Inflation Reduction Act - the largest climate and energy investment in U.S. history. The IRA included nearly $400 billion in funding for clean energy projects, tax credits for renewable energy and energy efficiency, incentives for electric vehicles and heat pumps, support for grid modernization, and investments in domestic manufacturing of clean energy technology.

For energy engineers, this appeared transformative. Suddenly there was massive federal funding for the kinds of projects energy engineers work on. Energy audits, building retrofits, solar and storage installations, EV charging infrastructure, grid upgrades, industrial decarbonization - all of it backed by substantial federal incentives. Companies, utilities, and government agencies needed engineers to design, implement, and manage these projects. The pipeline of work looked strong for years to come.

The AI Era

Generative AI tools like ChatGPT, released in late 2022, have demonstrated capabilities that seemed impossible just a few years earlier. The technology is advancing rapidly, with new models and applications emerging constantly. What this means for engineers - and specifically for energy engineers - remains unclear.

Some see AI as a tool that will augment engineering work, handling routine tasks and allowing engineers to focus on higher-level problem-solving and innovation. Code generation, documentation, preliminary designs, data analysis - AI could handle the tedious parts and free engineers to do more creative and impactful work. In the energy sector, AI could optimize grid operations, improve building energy modeling, accelerate equipment diagnostics, and help navigate the complexity of decarbonization.

Others see AI as the latest justification for treating engineers as a cost to be minimized. If AI can do some of what engineers do, why not reduce headcount? Never mind that AI systems require engineering judgment to deploy effectively, or that they make mistakes that require expertise to catch, or that the most valuable engineering work involves understanding messy real-world constraints that AI doesn’t grasp. The financial logic is simple: if you can claim AI will make engineers more “efficient,” you can justify layoffs.

What’s certain is that AI, like every previous technological shift, will be shaped by the same forces that have shaped engineering work for decades. The question isn’t just what AI can do - it’s who controls it, who profits from it, and whether engineers have any say in how it’s deployed. We’re living through the opening chapter of this story, and the ending hasn’t been written yet.

The Second Trump Administration

In January 2025, Donald Trump began his second term as president with a dramatically different energy policy. The new administration moved quickly to undermine or dismantle much of the IRA’s implementation - pausing funding disbursements, reinterpreting eligibility requirements, and signaling that clean energy incentives would not be a priority. Fossil fuel development was back at the center of federal energy policy.

For energy engineers, this created immediate uncertainty. Projects that had been planned around IRA funding were suddenly in limbo. Companies that had been hiring to meet expected demand started freezing positions or scaling back. The clear signal from the federal government was that the energy transition was no longer a priority - or at least, not one the government would support.

The whiplash was stark. In the span of less than three years, energy engineers went from a surging market driven by historic federal investment to uncertainty about whether that investment would continue.

Data Center Power Demand

Regardless of political shifts in energy policy, one trend has emerged with startling clarity: AI requires massive amounts of electrical power. Training and running large AI models demands enormous data centers filled with GPUs that consume electricity at unprecedented scales. As AI became a national security and economic priority, the race to build data center capacity accelerated.

For energy engineers, this created a new and lucrative niche. Data center construction engineering offers high-paying roles because every kilowatt that isn’t wasted as heat is monetized to run more GPUs. Efficiency isn’t just an environmental goal - it’s directly tied to computing capacity and revenue. Optimizing cooling systems, power distribution, and energy recovery became critical competitive advantages.
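The link between efficiency and compute capacity can be sketched with the industry-standard PUE metric (power usage effectiveness: total facility power divided by IT power). A minimal illustration - the 10 MW utility feed and the PUE values below are hypothetical numbers chosen for the example, not figures from any real facility:

```python
# Sketch: with a fixed utility feed, lowering PUE (total facility power / IT power)
# frees up capacity for more GPUs. All numbers here are illustrative assumptions.

def it_power_available(feed_kw: float, pue: float) -> float:
    """IT (compute) power supportable by a fixed facility feed at a given PUE."""
    return feed_kw / pue

FEED_KW = 10_000  # assumed 10 MW utility interconnection

before = it_power_available(FEED_KW, pue=1.5)  # older cooling/distribution design
after = it_power_available(FEED_KW, pue=1.2)   # improved cooling and power delivery

print(f"IT power at PUE 1.5: {before:.0f} kW")
print(f"IT power at PUE 1.2: {after:.0f} kW")
print(f"Freed for compute:   {after - before:.0f} kW")
```

Under these assumptions, cutting PUE from 1.5 to 1.2 frees roughly 1.7 MW for additional GPUs on the same grid connection - which is why cooling and distribution engineering translates directly into revenue.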

There’s also potential work on the demand side - helping utilities and grid operators free up capacity for data centers by improving efficiency elsewhere or managing load more intelligently. And there’s the generation side: connecting new power sources to the grid specifically to serve data center loads, whether that’s natural gas, renewables, nuclear, or some combination.

However, this opportunity comes with geographic constraints. San Diego, with some of the highest electricity prices in the nation, is unlikely to see major AI data center development. The economics simply don’t work when power costs are that high and AI companies can build in regions with cheaper electricity. For San Diego-based energy engineers, the data center boom might mean opportunities elsewhere, but probably not in the local market.

The broader point remains: AI’s energy demands are real and growing, creating engineering work regardless of federal energy policy. Whether that work expands or contracts depends on factors beyond any individual engineer’s control - electricity prices, grid capacity, corporate strategy, and yes, national priorities around AI development.


See Also

Company Culture