Energy and Environment

Warnings: Return of The Long Emergency

by Limbic on June 25, 2017


James Kunstler’s 2005 book “The Long Emergency” made a huge impression on me when I read it in 2006. In fact, it was one of the reasons I found myself pursuing a career in cloud computing in 2007. Partly thanks to this book and a former boss from British Telecom, my business partner and I were convinced that peak oil and climate change would create a huge demand for energy-efficient, carbon-neutral compute resources, and that cloud computing was the future.

The Long Emergency was primarily concerned with America’s oil addiction and ill-preparedness for what looked at the time to be the coming energy (oil) shock, but it also examined other threats to civilization:

  • Climate Change
  • Infectious diseases (microbial resistance)
  • Water scarcity
  • Habitat destruction
  • Economic instability
  • Political extremism
  • War

Every one of those is still an enormous threat.

A new book by national security veteran Richard Clarke and R.P. Eddy called “Warnings: Finding Cassandras to Stop Catastrophes” updates The Long Emergency with some new features of the threat landscape.

The book starts off by asking how we can reliably spot Cassandras – people who correctly predicted disasters but were not heeded – so that we can prevent future disasters.

They examine recent disasters – 9/11, the Challenger space shuttle disaster, Hurricane Katrina – and then the people who predicted these events, looking for patterns. They come up with a set of stable characteristics that allow us to score people on their Cassandra Quotient.
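Purely to illustrate the idea – the characteristics and weights below are my own placeholders, not the authors’ actual rubric – a Cassandra Quotient score might be computed like this:

```python
# Hypothetical sketch of a "Cassandra Quotient" style score.
# The characteristic names and weights are illustrative placeholders,
# not the rubric Clarke and Eddy actually use in the book.

def cassandra_quotient(ratings, weights):
    """Weighted average of 0-10 ratings for each characteristic."""
    total_weight = sum(weights.values())
    return sum(ratings[name] * w for name, w in weights.items()) / total_weight

weights = {
    "technical_expertise": 3.0,    # domain credibility of the person warning
    "data_driven": 3.0,            # warning grounded in evidence, not intuition
    "critics_attack_person": 2.0,  # detractors target the messenger, not the data
    "persistence": 1.0,            # keeps warning despite personal cost
    "specificity": 1.0,            # concrete, falsifiable prediction
}

candidate = {
    "technical_expertise": 9,
    "data_driven": 9,
    "critics_attack_person": 8,
    "persistence": 8,
    "specificity": 7,
}

print(f"Cassandra Quotient: {cassandra_quotient(candidate, weights):.1f} / 10")
```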

The second part of the book looks at current threats, and their doomsayers, to see if any have a high Cassandra Quotient and thus should be heeded.

The threats are:

  • Artificial Intelligence
  • Pandemic Disease
  • Sea-Level Rise
  • Nuclear Ice Age
  • The Internet of Everything
  • Meteor Strike
  • Gene Editing (CRISPR)

The bad news is that they all have high Cassandra Quotients and the scenarios in the book are plausible, science-backed and terrifying.

Artificial Intelligence as a threat has been on my radar for a year or so, thanks to Elon Musk, Bill Gates, Stephen Hawking and Sam Harris warning of the risks of intelligent machines that can design and build ever more intelligent machines.

Pandemic Disease has worried me since reading The Long Emergency, but I had thought global awareness had improved, especially since the world took the 2011 flu scare, Ebola and Zika seriously. Unfortunately, we are – as a planet – woefully ill-prepared for a global pandemic. A high-fatality airborne flu could kill billions.

Sea-Level Rise genuinely surprised me, especially since the Cassandra in question – James Hansen – predicted the melting and ice-shelf break-offs we see in the Arctic today…30 years ago. I even googled how high my home is above sea level after being convinced we could see a 7m rise within my lifetime.
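Incidentally, that check can be scripted rather than googled. A minimal sketch, assuming the public Open-Elevation API is reachable (the coordinates are placeholders):

```python
# Minimal sketch: compare a location's elevation with an assumed 7 m sea-level rise.
# Assumes the public Open-Elevation API (api.open-elevation.com) is available;
# the coordinates below are placeholders.
import requests

LAT, LON = 44.82, 20.46   # placeholder coordinates
RISE_M = 7.0              # the rise discussed above

resp = requests.get(
    "https://api.open-elevation.com/api/v1/lookup",
    params={"locations": f"{LAT},{LON}"},
    timeout=10,
)
resp.raise_for_status()
elevation = resp.json()["results"][0]["elevation"]

print(f"Elevation: {elevation:.0f} m above sea level")
print("Above the assumed rise" if elevation > RISE_M else "At or below the assumed rise")
```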

As a child of the ’70s and ’80s, nuclear horror is deeply embedded in my psyche, but I thought the risk of a Nuclear Ice Age was pretty low. It turns out you do not need a large-scale nuclear exchange between the US and Russia to cause global climate chaos. A limited exchange between India and Pakistan could be sufficient to kill billions through global starvation. I was also surprised to learn that Pakistan moves its nuclear arsenal around to thwart attacks by Indian commandos in the event of a war. This raises the risk of terrorists intercepting one of these weapons on the move and using it for nuclear terrorism.

The book does a good job of examining the incredible fragility of our interconnected IT systems in the chapter on The Internet of Everything. As an IT professional I know just how fragile these systems really are, and we are right to fear the dire consequences of a serious cyber war.

I do not really think about Meteor Strikes, as there is little we can do about them and they are now part of popular culture.

The final worry in the book is about Gene Editing, especially CRISPR. CRISPR has absolutely marvelous potential, but it also has many people worried. Daniel Suarez even has a new book on the topic called “Change Agent”. CRISPR could be the mother of all second-order effects. Take “off-target events”, for example:

Another serious concern arises from what are known as off-target events. After its discovery, researchers found that the CRISPR/Cas9 complex sometimes bonds to and cuts the target DNA at unintended locations. Particularly when dealing with human cells, they found that sometimes as many as five nucleotides were mismatched between the guide and target DNA. What might the consequences be if a DNA segment is improperly cut and put back together? What sorts of effects could this cause, both immediately and further down the road for heritable traits? Experimenting with plants or mouse bacteria in a controlled laboratory environment is one thing, but what is the acceptable level of error if and when researchers begin experimenting with a tool that cuts up a person’s DNA? If an error is in fact made, is there any potential way to fix the mistake?
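To make “off-target events” concrete, here is a toy sketch – not a real bioinformatics pipeline, and the sequences are made up – that scans a stretch of DNA for sites matching a 20-nucleotide guide with up to five mismatches, the kind of near-match the passage above worries about:

```python
# Toy illustration of CRISPR off-target matching: find sites in a DNA sequence
# that differ from a 20-nt guide by no more than a few nucleotides.
# It ignores PAM sites, strand orientation and real scoring models; it only
# shows why near-matches ("off-target events") are hard to avoid.

def mismatches(a, b):
    """Hamming distance between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

def off_target_sites(dna, guide, max_mismatch=5):
    """Yield (position, site, mismatch_count) for every near-match to the guide."""
    k = len(guide)
    for i in range(len(dna) - k + 1):
        site = dna[i:i + k]
        mm = mismatches(site, guide)
        if mm <= max_mismatch:
            yield i, site, mm

guide = "GACGTTACCAGTTGACCTGA"  # made-up 20-nt guide sequence
dna = "TTGACGTAACCAGTTGTCCTGACCGACGTTACCAGTTGACCTGATT"  # made-up target region

for pos, site, mm in off_target_sites(dna, guide):
    label = "intended target" if mm == 0 else f"{mm} mismatches (off-target risk)"
    print(f"pos {pos:3d}  {site}  {label}")
```

The exact target is found, but so is a nearby sequence that differs by only a couple of nucleotides – exactly the kind of site the Cas9 complex can bind and cut by mistake.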

So we have planet-scale problems in need of ingenious solutions. Instead of feeling paralysis or resignation, we should accept Peter Thiel’s challenge to find the big breakthroughs – 0 to 1, intensive progress:

Progress comes in two flavors: horizontal/extensive and vertical/intensive. Horizontal or extensive progress basically means copying things that work. In one word, it means simply “globalization.” Consider what China will be like in 50 years. The safe bet is it will be a lot like the United States is now. Cities will be copied, cars will be copied, and rail systems will be copied. Maybe some steps will be skipped. But it’s copying all the same.

Vertical or intensive progress, by contrast, means doing new things. The single word for this is “technology.” Intensive progress involves going from 0 to 1 (not simply the 1 to n of globalization). We see much of our vertical progress come from places like California, and specifically Silicon Valley. But there is every reason to question whether we have enough of it. Indeed, most people seem to focus almost entirely on globalization instead of technology; speaking of “developed” versus “developing nations” is implicitly bearish about technology because it implies some convergence to the “developed” status quo. As a society, we seem to believe in a sort of technological end of history, almost by default.

It’s worth noting that globalization and technology do have some interplay; we shouldn’t falsely dichotomize them. Consider resource constraints as a 1 to n subproblem. Maybe not everyone can have a car because that would be environmentally catastrophic. If 1 to n is so blocked, only 0 to 1 solutions can help. Technological development is thus crucially important, even if all we really care about is globalization.

…Maybe we focus so much on going from 1 to n because that’s easier to do. There’s little doubt that going from 0 to 1 is qualitatively different, and almost always harder, than copying something n times. And even trying to achieve vertical, 0 to 1 progress presents the challenge of exceptionalism; any founder or inventor doing something new must wonder: am I sane? Or am I crazy?

From Blake Masters’ notes.

 

 


[Another post from the draft folder from June 2009]

Cannot recall what this was supposed to be, but the tectonics of events is an intellectual theme of mine. It refers to people, places and things getting caught – crushed – between the big movements of history, geography and physics.


Steve O’Donnell discusses McKinsey’s recent prediction that datacenter carbon footprint will quadruple by 2020, exceeding air travel’s.

CIOs and CTOs are in denial about this, but it is true, and it is one of the biggest challenges facing the industry.

Data Center emissions will quadruple by 2020 matching the volume of air travel. | The Hot Aisle


If only we would let ourselves be dominated

by Limbic on October 12, 2008

In a thoughtful post entitled “Thoughts on the Financial Crisis“, Tim O’Reilly quotes a Rilke poem:

I can tell by the way the trees beat, after
so many dull days, on my worried windowpanes
that a storm is coming…What we choose to fight is so tiny!
What fights us is so great!
If only we would let ourselves be dominated
as things do by some immense storm,
we would become strong too, and not need names.

He explained the quote like this:

There are a lot of people bloviating about the financial crisis. It’s outside of our area of expertise, so there didn’t seem to be a lot of urgency to add to the hot air. Even professional economists and financial experts disagree on where this is going. I’ve been reading a lot, and sharing the best links via my twitter feed, but frankly, I’m feeling that we’re in the middle of a wave that no one completely understands. Meanwhile, I did in fact spend my NY Web Expo talk on the idea that “I sense a storm coming” (Rilke quote), and the idea that companies and individuals need robust strategies (ones that can work even in uncertain times), with one robust strategy being to “work on stuff that matters.”

In a letter to his own employees where he elaborates on this, he passes on some great advice that we can all heed:

Many of you have no doubt been alarmed by the developments of the last couple of weeks in financial markets……robust strategies are ones you’d adopt in good times and in bad…we probably end up with more robust strategies if we assume the worst rather than the best.

We could be in for a long, rough time in the economy. I’m not going to say otherwise.

But I also want to point out that rough times are often the best times for creativity, opportunity and change.

…And if you look at history, you see that this has always and everywhere been true. It’s not an accident that economist Joseph Schumpeter talked about the “creative destruction” inherent in capitalism. Great problems are also great opportunities for those who know how to solve them. And looking ahead, I can see great opportunities.

The energy crisis (both global warming and the oil price shock) is helping people to focus on how technology can transform the energy sector. The financial crisis has demonstrated just how out-of-whack an unregulated, proprietary, black-box approach can get. This will lead to an emphasis on regulation, but I hope, above all, on transparency. This is of course analogous to what happened with open source software. Meanwhile, the mobile revolution will continue, regardless of the state of the economy. If it can prosper in Africa, it can prosper even in an American downturn. And all the stuff we’re exploring with Make: new materials, new approaches to manufacturing, and the “open source” approach applied to hardware, will take us in unexpected directions. And all of these areas can benefit from what we do best: capturing and spreading the knowledge of innovators.

We don’t know yet how problems in the overall economy will affect our business. But what we can do now are the things we ought to be doing anyway:

  • Work on stuff that matters: Assuming that the world does go to hell in a handbasket, what would we still want to be working on? What will people need to know? (Chances are good that they need to know these things in a world where we all continue to muddle along as well.)
  • Exert visionary leadership in our markets. In tough times, people look for inspiration and vision. The big ideas we care about will still matter, perhaps even more when people are looking for a way forward. (Remember how Web 2.0 gave hope and a story line to an industry struggling its way out of the dotcom bust.)
  • Be prudent in what we spend money on. Get rid of the “nice to do” things, and focus on the “must do” things to accelerate them.

These are all things we should be doing every day anyway. Sometimes, though, a crisis can provide an unexpected gift, a reminder that nobody promised us tomorrow, so we need to make what we do today count.


Intel: Servers Do Fine With Outside Air

by Limbic on September 18, 2008

This might come as a shock, but one of the biggest expenses in Data Center operations – air conditioning, responsible for up to 50% of all power requirements – might be based on a myth:

Do servers really need a cool, sterile environment to be reliable? New research from Intel suggests that in favorable climates, servers may perform well with almost no management of the environment, creating huge savings in power and cooling with negligible equipment failure.

Intel’s findings are detailed in a new white paper reviewing a proof-of-concept using outside air to cool servers in the data center – a technique known as air-side economization. Intel conducted a 10-month test to evaluate the impact of using only outside air to cool a high-density data center, even as temperatures ranged between 64 and 92 degrees and the servers were covered with dust.

Intel’s result: “We observed no consistent increase in server failure rates as a result of the greater variation in temperature and humidity, and the decrease in air quality,” Intel’s Don Atwood and John Miner write in their white paper. “This suggests that existing assumptions…”

[From Intel: Servers Do Fine With Outside Air « Data Center Knowledge]

It will be very interesting to see what Steve O’Donnell at The Hot Aisle says about this.
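To put that “up to 50% of all power” figure in perspective, here is a back-of-the-envelope sketch – the IT load, PUE values and electricity price are my own illustrative assumptions, not numbers from the Intel paper – of what air-side economization does to a facility’s annual energy bill:

```python
# Back-of-the-envelope sketch of air-side economization savings.
# The IT load, PUE figures and electricity price are illustrative assumptions,
# not numbers from the Intel white paper quoted above.

IT_LOAD_KW = 1000.0     # assumed IT equipment load
PUE_CHILLED = 2.0       # conventional facility: cooling/overhead roughly equals IT load
PUE_ECONOMIZED = 1.3    # mostly outside-air cooling in a favorable climate
PRICE_PER_KWH = 0.10    # assumed electricity price, USD
HOURS_PER_YEAR = 8760

def annual_cost(pue):
    """Total facility energy cost per year for a given PUE."""
    return IT_LOAD_KW * pue * HOURS_PER_YEAR * PRICE_PER_KWH

conventional = annual_cost(PUE_CHILLED)
economized = annual_cost(PUE_ECONOMIZED)
saving = conventional - economized

print(f"Conventional cooling:    ${conventional:,.0f}/year")
print(f"Air-side economization:  ${economized:,.0f}/year")
print(f"Saving:                  ${saving:,.0f}/year ({saving / conventional:.0%})")
```

At a PUE of 2.0, cooling and other overhead consume as much power as the servers themselves – which is where the “up to 50%” figure comes from.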


Interesting developments in Data Center design. Data Centers and their efficiency – both data and electrical – are rapidly becoming the number one priority for IT design specialists:

Typically, computers are connected by a network architecture that consists of a “tree” of routing and switching elements regulated by specialized equipment, with expensive, non-commodity switches at the top of the hierarchy. But even with the highest-end IP switches and routers, the networks can only support a small fraction of the combined bandwidth available to end hosts. This limits the overall cluster size, while still incurring considerable costs. Application design is further complicated by non-uniform bandwidth among datacenter nodes, which limits overall system performance.

The UC San Diego researchers envision creating a datacenter that will have scalable interconnection bandwidth, making it possible for an arbitrary host in the datacenter to communicate with any other host in the network at the full bandwidth of its local network interface. Their approach requires no modifications to the end-host network interface, operating system or applications, and is fully backward compatible with Ethernet, IP and TCP. Ideally, the datacenter would also use inexpensive, off-the-shelf Ethernet switches as the basis for large-scale datacenter networks, thereby replacing high-end switches in much the way that commodity personal computers have displaced supercomputers for high-end computing environments.

[From HPCwire: Computer Scientists Propose New Way to Build Datacenters]
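The architecture being described is essentially a fat-tree built from commodity k-port switches: arrange them into edge, aggregation and core layers and you can connect k^3/4 hosts at full bisection bandwidth. A quick sketch of that arithmetic, with the port counts and link speed as illustrative assumptions:

```python
# Sketch of the capacity arithmetic for a k-ary fat-tree of commodity switches,
# the approach the UC San Diego work appears to describe. Port counts and link
# speed below are illustrative assumptions.

def fat_tree_capacity(k, link_gbps=1.0):
    """Hosts, switch count and bisection bandwidth for a k-ary fat-tree."""
    if k % 2:
        raise ValueError("k (ports per switch) must be even")
    hosts = k ** 3 // 4                     # k pods * (k/2 edge switches) * (k/2 hosts)
    switches = 5 * k ** 2 // 4              # k^2 pod switches + (k/2)^2 core switches
    bisection_gbps = hosts * link_gbps / 2  # half the hosts can talk to the other
                                            # half at full line rate
    return hosts, switches, bisection_gbps

for k in (24, 48):  # e.g. commodity 24- and 48-port Ethernet switches
    hosts, switches, bisection = fat_tree_capacity(k)
    print(f"{k}-port switches: {hosts:,} hosts, {switches:,} switches, "
          f"{bisection:,.0f} Gbps bisection bandwidth")
```

With 48-port gigabit switches that works out to over 27,000 hosts at full bisection bandwidth, using nothing more exotic than the commodity switches themselves.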
