We love developing brand new ideas and discovering fresh insights from old concepts.

Our team is constantly exploring, learning and creating.

We celebrate vigorous debate, frequent experimentation, and the investigation of new topics. We’re particularly interested in ideas that leverage our engineering, business, and commercialization experience in large industrial fields.

We’re currently working in clean energy, wireless communications, and aerospace, but we’re constantly evaluating opportunities in similar or related industries.

See below for our latest ideas and commentary.

Diving into the Houston Chronicle’s coverage of NET Power

The Houston Chronicle recently covered the NET Power Demonstration Plant. We came away with a few comments and reactions.

8 Rivers was pleased to see NET Power covered by James Osborne in the Houston Chronicle in July: Does Carbon Capture Have Life Yet?

The piece does a great job of introducing readers to supercritical CO2 in general and NET Power’s technology in particular. It’s also exciting to see coverage in the Houston region, where NET Power’s demonstration plant is currently being built.

We did observe several areas in the article where we think further commentary might be useful.

1. NET Power’s development timeline is much shorter than Mr. Osborne suggests.

“Even the most ardent proponents of supercritical carbon dioxide say if the technology is proven, commercial application is at least a decade off.”

Other technologies in the field might be moving along this timeline, but NET Power is already actively working on commercial deployment, which should occur in less than half the time suggested by Mr. Osborne. The company is in commercial plant development discussions with numerous utilities, power generators, and oil & gas companies in the US and around the world; some of those entities have already identified sites and specific projects for NET Power plants. Once NET Power’s demonstration plant comes online in 2017, the company will quickly collect the data required to move forward on its first commercial plant. Moving along this development pathway, the technology should be operating commercially in 2020.

2. The article suggests that sufficiently low-cost oxygen production is a technical hurdle NET Power must overcome, which is not the case.

“The technical hurdles begin with finding a cheap way to produce pure oxygen.”

Mr. Osborne is correct in identifying that oxygen is expensive to produce and has been the Achilles’ heel of many systems that utilize oxy-combustion. In NET Power’s case, though, the technology utilizes well-proven (in fact, OLD) oxygen production technologies, so, from a purely technical standpoint, this is a completely proven part of the system. In addition, NET Power includes both the capital cost and energy requirements of oxygen production in all of its cost and performance numbers. Unlike other oxy-combustion processes, NET Power applies oxygen production within an entirely new power cycle, the Allam Cycle, whose sufficiently high gross efficiency and low capital cost enable the technology to absorb the cost and performance impact of oxygen production while remaining competitive with conventional combined cycle systems. Exactly how the Allam Cycle accomplishes this deserves its own post; we’ll follow up with that soon.

3. CO2 utilization and storage is also raised as a key hurdle, and NET Power believes this question can be readily addressed.

“Then, there is the matter of what to do with carbon dioxide once captured.”

Mr. Osborne goes on to rightly point out three primary approaches for utilization and disposal of CO2: enhanced oil recovery (EOR), which is a large market available today (and we can include enhanced coal bed methane recovery (ECBMR) in this category); underground sequestration, which has tremendous potential storage capacity and is the subject of important work by the U.S. Department of Energy and other agencies around the world; and algae feedstock, which is one of many potential future uses for CO2. Each of these areas deserves its own post, but we will touch on them briefly here.

EOR is a large opportunity that exists today. ECBMR is also a highly viable option, though it has been practiced less. 8 Rivers has estimated how much CO2 would be consumed, and stored underground in the process, in recovering the resources accessible by these two methods. A high-level analysis suggests that these opportunities have the capacity to store ALL of the CO2 produced by ALL of the IEA-projected fossil fuel capacity builds around the world from today until 2040. This topic definitely warrants an expanded discussion at a later date.

Underground sequestration, such as in saline formations, also has tremendous storage capacity. The IPCC Special Report on Carbon Dioxide Capture and Storage estimates that, at the low end, accessible global saline formations are capable of storing at least 1,000 gigatons of CO2!

Lastly, we believe CO2 utilization technologies will become a big opportunity. As with any process, the economic viability of many of these options is a function of feedstock cost, and the cost of CO2 from current CCS technologies is simply too high. NET Power’s low-cost CO2 will transform this landscape and enable an entire CO2 economy to develop. The National Energy Technology Laboratory has a great graphic that outlines most of these potential uses:

In a short space, Mr. Osborne has covered some of the most important subjects around CCS and supercritical CO2, and we appreciate his focus on these important topics. We hope we were able to provide a bit more detail in a few key areas, and we’ll look to expand on several of these items in the future.

A Novel Approach to an Energy Future

The World Economic Forum calls for “investing in ‘no regrets’ areas that have a positive business case, and so will be palatable in almost any economic climate.” The Future of Electricity, Steve Bolze and Ignacio Galan, Chairs

NET Power is a true “no regrets” technology that allows the world to meet all climate targets without having to pay more for electricity.

NET Power:

  • makes electricity from natural gas
  • costs the same as, or less than, electricity from existing natural gas power plants
  • generates electricity at high efficiency (59% LHV)
  • will capture substantially all of the CO2 and non-CO2 atmospheric emissions without any additional cost
  • captures the CO2 at pipeline purity and pressure, ready for various industrial uses
  • can operate without water (at a small reduction in efficiency)

In a world evolving towards cleaner energy, NET Power is more relevant and necessary than ever before. Using the most up-to-date IEA projections, a total addressable market of 1500 GW exists through 2040. The IPCC has made clear that carbon capture and storage (CCS) is essential in getting us to our climate targets. In partnership with renewables, NET Power allows the planet to keep power prices low, maintain grid stability/flexibility, support a growing industrial load AND meet proposed climate targets.

Unlike any other technology, NET Power is economically competitive without the need for CO2 revenues. When combined with CO2 sales, NET Power is less expensive (and cleaner) than any other fossil fuel alternative.

NET Power is a true “no regrets” option.

The NET Power Story

In 2008, Miles Palmer and I created 8 Rivers Capital, a technology development firm devoted to addressing large problems in sustainable ways. Shortly after the company was formed, Rodney Allam and Jeremy Fetvedt joined to develop the technology at the core of NET Power (the Allam Cycle).

Assembling the partners

In 2011, NET Power attracted the attention of Toshiba Power Systems Co., one of the world’s most experienced manufacturers of ultra-supercritical steam turbines. Soon thereafter, Chicago Bridge & Iron and Exelon joined to help design, build and operate the finished product.

Breaking Ground

In March 2016, ground was broken on NET Power’s demonstration plant, located 30 minutes southeast of Houston in La Porte, TX. Commissioning is planned to begin in early 2017.

NET Power has received significant levels of industry interest from the largest utilities in North America. Moreover, oil and gas companies see NET Power as a synergistic way to achieve low carbon fuel standards. Numerous environmental and climate change organizations are embracing NET Power as one of the best options in an “all-of-the-above” approach to energy production.

Full Scale in 2020

The first full-scale NET Power plant, at 295 MWe, is planned for mid-2020.

The NET Power Demonstration Plant under construction

The Allam Cycle is Simple

Oxy-combustion is great chemistry

Most combustion, such as in airplane engines and automobiles, uses the oxygen in air as an oxidant. Air is 78% nitrogen and 21% oxygen, plus other trace gases. Since air is mostly nitrogen, culprit greenhouse gases produced in combustion – such as CO2 – must be separated from the nitrogen in order to store or reuse them. This process is incredibly costly and energy intensive. One ingenious solution that avoids separating greenhouse gases from nitrogen is to burn a fuel in pure oxygen, in a process known as oxy-combustion. Oxy-combustion has been known for over a hundred years, but its economics have remained challenging due to the high cost of separating oxygen from air. However, as a means of capturing CO2, it is alluringly simple: when methane fuel is burned in pure oxygen, the resulting by-products are easily separated: largely carbon dioxide as a gas, and water as a liquid.
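A quick stoichiometry sketch (ours, not from the article; the molar masses are standard values) shows why the by-products separate so cleanly:

```python
# Stoichiometry of methane oxy-combustion: CH4 + 2 O2 -> CO2 + 2 H2O
# Approximate molar masses in g/mol.
M_CH4, M_O2, M_CO2, M_H2O = 16.04, 32.00, 44.01, 18.02

# Per kilogram of methane burned:
kg_O2_needed = 2 * M_O2 / M_CH4   # oxygen consumed
kg_CO2_made = M_CO2 / M_CH4       # carbon dioxide produced (stays a gas)
kg_H2O_made = 2 * M_H2O / M_CH4   # water produced (condenses to a liquid)

print(f"Per 1 kg CH4: {kg_O2_needed:.2f} kg O2 in, "
      f"{kg_CO2_made:.2f} kg CO2 + {kg_H2O_made:.2f} kg H2O out")
```

With no nitrogen in the mix, simply condensing out the water leaves a nearly pure CO2 stream.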

By keeping the heat, oxy-combustion is viable

Oxy-combustion has always had one big problem: oxygen is expensive to produce.

The Allam Cycle solves this problem by eliminating the use of a comparatively inefficient steam cycle, which dumps massive amounts of heat through the water phase change. Instead, the Allam Cycle retains that heat within the system and uses the energy to aid air separation and to heat the flow after recompression. Thus, 94% of the mass entering the turbine is already hot (720° C, or 1330° F), maintained by massive heat recuperation. This means that the remaining 6% of combustible mass (fuel and O2) only has to supply the additional energy needed to raise the flow to the required turbine inlet temperature (1150° C, or 2100° F).
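The balance described above can be sanity-checked with a constant-specific-heat sketch. Only the two temperatures and the 94%/6% mass split come from the text; the specific heat, the fuel-to-oxygen mass ratio, and methane’s heating value are round-number assumptions of ours:

```python
# Rough energy balance per 1 kg of turbine inlet mass.
cp_flow = 1.2e3                   # J/(kg*K), ASSUMED mean specific heat of the CO2-rich flow
T_recup, T_inlet = 720.0, 1150.0  # deg C, from the text

# Heat the combustor must add to lift the whole flow to turbine inlet temperature:
q_needed = cp_flow * (T_inlet - T_recup)  # J per kg of total flow

# Of the 6% combustible mass (CH4 + O2 at roughly 1:4 by mass), ~1/5 is fuel:
fuel_fraction = 0.06 / 5.0
lhv_ch4 = 50e6                    # J/kg, ASSUMED lower heating value of methane
q_supplied = fuel_fraction * lhv_ch4

print(f"Heat needed:   {q_needed/1e3:.0f} kJ per kg of flow")
print(f"Heat supplied: {q_supplied/1e3:.0f} kJ per kg of flow")
```

The two figures land in the same ballpark, consistent with the claim that the small combustible fraction can supply the final temperature rise on its own.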

The result is simple

By focusing on a single recuperated cycle, Allam Cycle plants avoid the complexity of “combined cycle” plants. Furthermore, every single component is readily available, with the exception of the turbine, which is being designed, engineered, and built by Toshiba, the world’s leading manufacturer of advanced ultra-supercritical turbines.

The basic Oxy-combustion Allam Cycle

Our future cannot tolerate failure

The World Economic Forum established the “Future of Electricity” platform to help advise stakeholders around the world on sustainable ways by which electricity needs can be met. Many technologies hold promise. All come with trade-offs, and not all trade-offs have the same perceived value propositions.

No Regrets.

The 2015 “Future of Electricity” report, issued under the leadership of chairs Steve Bolze and Ignacio Galan, called for

“investing in ‘no regrets’ areas that have a positive business case, and so will be palatable in almost any economic climate.”


While renewables are key to our future, they cannot do all of the lifting. Heavy reliance on renewables yields diminishing returns, with costs that may impede their acceptance. Even in high-renewable-penetration scenarios, gas build-out remains largely unchanged because renewables require baseload support. Estimates show that a buildout of 50% renewables with 50% Natural Gas Combined Cycle (NGCC) baseload support doubles the cost of power. A 173% renewables and 15% NGCC buildout (renewables overbuilt, with associated storage, to cover intermittency) results in a cost of power nearly 6x today’s levels. The cost of power has previously been linked to trends in economic growth and unemployment, with high power costs having dramatic impacts on both.

As a result, lowering the carbon emissions of fossil fuels has been seen as key to mitigating climate change. Switching from coal to natural gas helped slow the growth of CO2 emissions, but carbon capture and sequestration (CCS), the process of capturing and then storing or reusing carbon dioxide emissions, is needed to make real headway.

CCS had a slow start

The IPCC Fifth Assessment models demonstrate that CCS is the single most critical tool to achieving climate targets. Scenarios without CCS result in the greatest number of failures to meet climate goals, and projected costs for electricity soar.

Over the last decade, CCS systems have proved uneconomic in the absence of a government-imposed price on carbon or a high commercial price for CO2. While government intervention failed to create the right circumstances, CO2 could be sold commercially to oil companies for use in enhanced oil recovery (EOR). Because of the expense of operating CCS systems, however, oil needed to be above $75 per barrel to support the required CO2 price.

As the price of oil fell, some felt that the push for CCS should be relaxed and that renewables and nuclear should pick up the missing capacity.

Cellphones and CCS

Those who do not believe in the progress of technology are often surprised, much as McKinsey was when, in the 1980s, it projected that the total market for cellphones in 2000 would be 900,000 (by 2000, 900,000 new cell subscriptions were being sold every 3 days).

It would be a mistake not to recognize that current carbon capture technologies are analogous to the “brick phone”: valuable early leaders that bring in the developers of the next generation of technology. We saw a similar phenomenon happen in solar.

NET Power changes everything

By standing on the shoulders of those who came before, the developers of the Allam Cycle have worked out how to make CCS affordable. NET Power generates electricity at the same price as existing combined cycle gas turbines, and it captures all of the CO2 at no extra cost. This redefines the game of carbon capture.

The IEA projects 705 GW of gas capacity additions from 2015 to 2025, and an additional 748 GW from 2026 to 2040. Were all this capacity supplied by NET Power, it would equate to deploying nearly 5,000 NET Power turbines, providing for the capture of 3,900 million tons of CO2 every year. If planned coal buildout were averted (or coupled with CCS), an additional 12,300 million tons of CO2 would be diverted from the atmosphere every year.
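These figures can be roughly reproduced with back-of-the-envelope arithmetic. The GW projections and the 295 MWe plant size come from the text; the capacity factor and CO2 intensity are our own illustrative assumptions:

```python
# Back-of-the-envelope check on the deployment figures.
gw_2015_2025 = 705
gw_2026_2040 = 748
total_gw = gw_2015_2025 + gw_2026_2040   # ~1453 GW, the "nearly 1500 GW" market

plant_mwe = 295
n_plants = total_gw * 1000 / plant_mwe   # ~4925, i.e. "nearly 5,000" turbines

capacity_factor = 0.85                   # ASSUMED baseload capacity factor
intensity_t_per_mwh = 0.36               # ASSUMED t CO2 per MWh for gas generation
mwh_per_year = total_gw * 1000 * 8760 * capacity_factor
co2_mt_per_year = mwh_per_year * intensity_t_per_mwh / 1e6   # million tonnes

print(f"{total_gw} GW ~ {n_plants:.0f} plants, ~{co2_mt_per_year:.0f} Mt CO2/yr captured")
```

Under these assumptions the captured-CO2 figure lands right around the 3,900 million tons per year cited above.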

NET Power is the most synergistic fossil fuel fired baseload support to pair with renewables, creating a truly resilient, cost-effective, carbon-free electrical grid.

Continued buildout of renewables requires ongoing baseload support by NGCC, resulting in escalating costs of power. At the highest ends of renewable penetration, in order “to achieve CO2 reductions on par with balanced portfolios, intermittent renewable systems must be built much larger, to between 154 and 195% capacity levels.” The 173% point refers to this scenario. NET Power represents a completely new frontier for power production. Data obtained from: Brick, S., and Thernstrom, S., Renewables and decarbonization: Studies of California, Wisconsin, and Germany, The Electricity Journal, 2016, 29, 6-12.

Information obtained from: IPCC Fifth Assessment; IEA Technology Perspectives 2014; U.S. Deep Decarbonization Pathways; ERP 2015; WWF 2014; Jacobson et al. 2014; Brick et al. 2016

The Bandwidth Bottleneck

As the use of technology improves and broadens, bandwidth demand continues to outpace its deployment. What will the future hold if our communication channels continue to serve as our bottleneck?

Communications sits at an impasse: our computing demands continue to outpace the capacity of our networks. The problem is rooted in a fundamental disconnect between the rate at which bandwidth grows and the rate at which computing power grows. Decades of historical data show that internet bandwidth increases roughly with Nielsen’s Law, which predicts that bandwidth grows at a rate of 50% annually in both the wireless and wired worlds. Contrast this with Moore’s Law, under which computing power, a proxy for demand, grows at around 60% annually. At these rates, bandwidth will increase by a factor of 57X over the next 10 years, close to half the projected increase in computing power of 100X.
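The compounding behind those projections is easy to check directly:

```python
# Compound the two annual growth rates over a decade.
years = 10
bandwidth_growth = 1.50   # Nielsen's Law: ~50% per year
compute_growth = 1.60     # Moore's-Law-style demand proxy: ~60% per year

bandwidth_factor = bandwidth_growth ** years   # ~58x over 10 years
compute_factor = compute_growth ** years       # ~110x over 10 years

print(f"Bandwidth: {bandwidth_factor:.0f}x, Compute: {compute_factor:.0f}x, "
      f"ratio: {bandwidth_factor / compute_factor:.2f}")
```

The ratio comes out at roughly one half, and the gap itself keeps compounding every year the two rates stay apart.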

This imbalance between bandwidth supply and demand has put compounding strain on networks. Decades ago, wireline industry leaders and engineers saw the strain and sought a better, broader solution. They realized that they had to move away from sending radio frequencies over copper and coax and instead send light signals through glass, a technology we now call fiber optics. Today, all of the backbone networks of the internet run on fiber optics, which serves as a platform that continuously feeds the insatiable demand for bandwidth in that part of the network.

Bandwidth will increase by a factor of 57X in the next 10 years, close to half of the projected increase in computing power of 100X.

Wireless technology development parallels that of the wired network, as technology has evolved to fit more data through the air. Since 2000, wide-area (cellular) signals have moved through multiple standards utilizing different parts of the spectrum and increasingly complex modulation and encoding techniques: 3G, HSDPA, WiMAX, HSPA, LTE, and now LTE Advanced. In this evolution, theoretical single-user speeds have jumped from 0.1 Mbps at the turn of the century to roughly 50 Mbps today with LTE.
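The implied growth rate of those single-user speeds lines up with Nielsen’s Law. The two endpoint speeds come from the text; the 16-year span (2000 to the time of writing) is our assumption:

```python
# Compound annual growth rate implied by the cellular speed endpoints.
speed_2000, speed_now = 0.1, 50.0   # Mbps, from the text
years = 16                          # ASSUMED span, 2000 to ~2016

cagr = (speed_now / speed_2000) ** (1 / years) - 1
print(f"Implied growth: {cagr:.0%} per year")
```

The result sits just under 50% per year, consistent with Nielsen’s Law holding in the wireless world as well.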

One way to make up the difference between Moore’s Law and Nielsen’s Law in the wireless world is to move up the fundamentally limited radio frequency spectrum. Historically, the jumps from radio broadcast to TV broadcast to cellular wavelengths were leaps in search of more bandwidth to support new applications, accompanied by the requisite political and regulatory wrangling. Nearly all of this spectrum is currently allocated to a specific use: some unlicensed bands, like 2.4 GHz, are becoming increasingly crowded, while others, like the 700 MHz band, have been auctioned to large cell carriers for more than $19 billion.

Another way to make up the difference between Moore’s Law and Nielsen’s Law is to remove the prohibitive cost of upgrading networks. In the wired “last mile”, there is an understandable unwillingness to replace the extensive cable infrastructure with new fiber optic lines because of the significant cost. In spite of this, billions are spent yearly laying the short stretches of fiber that connect end-users to broadband access. If this cost were lower, gigabit service could become a reality for many more people much more quickly.

Current rates of development fail to support the demand that is driven by enormous amounts of computing power.

Unfortunately, the physical limits of today’s infrastructure fail to support the increasing demand. Fiber is too expensive to take to every household; even optimistic projections have gigabit speeds failing to reach even 50% of homes over the next decade. In wireless, the move to 5G promises to take us to 100 Mbps, but even that promise is fraught with risk, in particular the scarcity of licensed spectrum and the over-crowding of unlicensed spectrum. What is needed is a revolution in the way our data is transferred, just as happened in the wireline world decades ago.

If the present is any indication, the global desire for internet connectivity is far from satiated and will require innovative technology leaps to serve current and future worldwide demand. Current communication technology operates on tapped-out spectrum under an aging, complicated regulatory system. What if communications entered a new frontier of connectivity? 8 Rivers Networks addresses these problems by offering high-speed, wireless mesh networks that can operate at the distances encountered in the “last mile”. It avoids the regulatory issues of the radio frequency spectrum by communicating through rapid pulses of harmless infrared light, providing an easy-to-deploy platform that gets infrastructure providers the bandwidth they sorely need. The low cost of endpoint devices extends access to a greater swath of the population, providing the leap necessary to eliminate the bandwidth bottleneck.