Eyeing a post-Christmas pile of pile, a longtime Alaskan might be inclined to contemplate the ethics of the mega outdoor-gear brands – Patagonia, The North Face, REI, etc. – pandering to consumers with must-have gear while preaching environmental responsibility to the world.
This behavior has always appeared fundamentally hypocritical. How many pile jackets and vests does one customer really need anyway? Or waterproof-breathable wind shells or backpacks or snow pants or overalls or hats or gloves or hiking boots/trail shoes or, or, or?
The Tarahumara Indians from the Copper Canyons region of northwestern Mexico can make a perfectly serviceable pair of trail sandals from an old automobile tire, slip them on their feet, and then walk just about anyone reading this into the ground.
They don’t need fancy $250 adidas Terrex Agravic Tech Pro trail running shoes to get from point A to point B. They can recycle someone else’s automotive discard into perfectly serviceable footwear.
And the Tarahumara aren’t alone. All around the globe, people with little improvise with what they can scavenge to make life more comfortable while most in the U.S. keep buying and buying and buying long past the time they’ve reached mere comfort.
It seems so, well, profligate and wasteful.
So much so that a decade ago, the Worldwatch Institute made news when it proclaimed consumerism the major threat to the environment.
And yet it is this constant, unrelenting desire for better “stuff,” for lack of a more descriptive word, that has brought our species to where it is today.
Without the constant drive for the new and better, we’d still be shivering in caves in Africa, eating wild plants and the uncooked flesh of whatever small fish or animals we could kill with our hands, and hoping large, carnivorous animals would not find us because of our minimal abilities to defend ourselves.
Fingernails and teeth aren’t very effective weapons against a lion with claws and fangs. A rock is a little better, but a club beats a rock, a spear bests a club, and a rifle chambered in .458 Magnum changes everything.
The roots of our consumerism surely go back to this beginning. It’s not hard to imagine the first human who figured out how to actually make a fire trading that knowledge to others for something – food, power, a better sleeping place in the cave, who knows.
And from there, away we went.
Pretty soon another someone stumbled upon the idea of lashing a rock to the end of a stick to make a club and everyone who saw the club in action wanted one of their own.
Some historians, most notably Margaret MacMillan, have argued that there was something of a dark side to this march of progress – that technological change, and people’s desire for it, was largely wrapped in conflict through the millennia.
The theory is that the globe’s inherently tribal Homo sapiens were constantly in search of better tools and weapons with which to wage war against each other, and that this drove the technological changes that brought us to where we are today.
Whether war started or merely accelerated the process is hard to say, but there is no denying warfare is woven into the fabric of technological advances through history and especially of the century just past.
Stainless steel, sonar, air traffic control, the zipper and the sanitary napkin were but a few of the seemingly non-war-related inventions to come out of World War I. And World War II opened the floodgates on technological change:
Radar, microwaves, aircraft as we know them today, nuclear fission to harness the atom for power, and a whole lot more.
“Had it not been for the Manhattan Project (to build this country’s first nuclear bomb), it is doubtful today’s Internet would exist – and not only because the Internet’s origins were to ensure that a decentralized computer network could survive an atomic attack,” writes Peter Suciu at Tech World News. “But the (Manhattan) project was also the catalyst for the development of computers.”
The fears of atomic attack were, as everyone knows, generated by WWII’s almost immediate morphing into a more-than-40-year “Cold War” between the U.S. and the now-defunct Soviet Union, a standoff that revolved around comparatively small armed conflicts fought in proxy states and one huge, public, image-driven battle to break the bond to the planet and launch the first explorers into space.
As President John F. Kennedy told the nation in 1962, “this generation does not intend to founder in the backwash of the coming age of space. We mean to be a part of it. We mean to lead it. For the eyes of the world now look into space, to the moon and to the planets beyond, and we have vowed that we shall not see it governed by a hostile flag of conquest, but by a banner of freedom and peace. We have vowed that we shall not see space filled with weapons of mass destruction, but with instruments of knowledge and understanding.”
Kennedy went on to describe the technological hurdles standing in the way of that 240,000-mile journey, starting with the need for “a giant rocket more than 300 feet tall, the length of this football field, made of new metal alloys, some of which have not yet been invented, capable of standing heat and stresses several times more than have ever been experienced, fitted together with a precision better than the finest watch, carrying all the equipment needed for propulsion, guidance, control, communications, food and survival, on an untried mission, to an unknown celestial body, and then return it safely…earth, re-entering the atmosphere at speeds of over 25,000 miles per hour, causing heat about half that of the temperature of the sun….”
It was crazy to think of putting a man on the moon in 1962. Only six months earlier, the U.S. had managed to put an astronaut into orbit in a one-man capsule, with flight controllers fearing astronaut John Glenn’s journey might end badly, as had the flights of many of the animals the country had earlier rocketed into space.
Not to mention the country had no moon ship. The Apollo program, aimed at designing a three-man capsule and lunar landing module to take a team of astronauts to the moon and back, had just begun.
Still, the U.S. would manage to put a man on the moon by 1969, and the race to get there powered an avalanche of technological changes with which the bureaucratically constrained Union of Soviet Socialist Republics (USSR) couldn’t compete.
Mikhail Gorbachev, the last leader of the USSR, tried to fix that by reforming a top-down, government-run economy but couldn’t.
Gorbachev, as the History website outlines the story, believed “that private initiative would lead to innovation, so individuals and cooperatives were allowed to own businesses for the first time since the 1920s. Workers were given the right to strike for better wages and conditions. Gorbachev also encouraged foreign investment in Soviet enterprises.
“However, these reforms were slow to bear fruit. Perestroika (economic restructuring) had torpedoed the ‘command economy’ that had kept the Soviet state afloat, but the market economy took time to mature. In his farewell address, Gorbachev summed up the problem: ‘The old system collapsed before the new one had time to begin working.’ Rationing, shortages and endless queuing for scarce goods seemed to be the only results of Gorbachev’s policies. As a result, people grew more and more frustrated with his government.”
Eastern European nations that had been occupied by Soviet forces since the end of WWII soon revolted against their occupiers, and the rest is history. The USSR collapsed and then reorganized into various nation-states, Russia being the largest and most powerful among them.
The U.S. had a large consumer-driven economy well before that happened, but an even bigger one after.
Meanwhile, consumer spending grew from 59 percent of GDP to the 68 percent of today. A significant part of what the country had been spending on war shifted to individuals spending on stuff.
Some of that spending has no doubt been wasteful, but waste is a very subjective construct. What’s art to one person is a waste of paint and canvas to another.
And some of the spending, and the change it drove, was clearly not wasteful. Think of what the air in American cities would look like today if we were all driving the automobiles of the 1950s instead of the newer, better, more efficient, cleaner-burning vehicles of the present.
Or, better yet, imagine the shitty mess if the almost 331 million people living in the country today were still getting around by horse or horse-and-buggy, as many did before the Model T Ford sparked a revolution in transportation near the start of the 20th century, when the nation was home to fewer than 100 million.
Then again, maybe it wouldn’t have been so bad. Maybe it would have meant the construction of more electric trolleys in the nation’s cities, a lot less urban sprawl and more people continuing to walk to work.
All of which would have been good only if the development of petrochemical fertilizers and the internal combustion engine had continued unabated, because without the fertilizer to grow more crops and the machinery to harvest them, there would have been no Green Revolution, and the planet likely would have faced widespread starvation long ago.
Progress almost always comes with tradeoffs, but we keep progressing because of that desire for better stuff. It’s a human drive that is hard to avoid, even for those who want to avoid it.
Some of us, maybe many of us, might well be hardwired to shop. Blame nature, where the new is constantly replacing the old, evolving along the way.
Natural evolution – except maybe for the pandemic virus SARS-CoV-2 mutating almost daily before our eyes – is hard to see, but it is happening constantly.
So is technological evolution, and it is obvious all around us.
We’ve come a long way from witch doctors to CRISPR gene editing, from crap-excreting horses to hydrogen trucks whose only waste product is water, from inefficient incandescent light bulbs to high-efficiency LEDs that can produce the same amount of light with 75 to 80 percent less energy, and from the risky business of variolation to the development of a messenger RNA (mRNA) vaccine that appears capable of priming our immune systems to fend off the new pandemic pathogen.
One would have to argue that technology is, on balance, our friend. But it’s also a little like alcohol. It has a dark side, and can, for some, prove addictive.
It can encourage people to buy so many things they really, really, really do not need that they end up in financial trouble, and subject others to the danger of being smothered to death by an avalanche of pile if they carelessly open the wrong closet.
Oh, the dangers of life in Alaska.