● Authority, LSD & hippies: the advent of the personal computer


Yes, Alan Turing is “the father” of modern computer science.

Without him, no modern algorithms, no contemporary concepts of computation.

But what about the personal computer? I’m talking about the one I’m using right now to type these words.

And the web — how you’re reading these words.

So, yes. We do stand on the shoulders of giants.

But the people who made the personal computer possible were science-fiction-loving, long-haired hippies.

As Stewart Brand — founder of the Whole Earth Catalog — put it in a 1995 article for Time Magazine:

‘Ask not what your country can do for you. Do it yourself,’ we said, happily perverting J.F.K.’s Inaugural exhortation. Our ethic of self-reliance came partly from science fiction. We all read Robert Heinlein’s epic Stranger in a Strange Land as well as his libertarian screed-novel, The Moon Is a Harsh Mistress. Hippies and nerds alike reveled in Heinlein’s contempt for centralized authority. To this day, computer scientists and technicians are almost universally science-fiction fans. And ever since the 1950s, for reasons that are unclear to me, science fiction has been almost universally libertarian in outlook.

So there is the answer to the perennial question, “why are all computer science geeks Star Wars fans?”

Vintage science-fiction books carried an anti-authoritarian slant that appealed to young people in the sixties and seventies.

First, then: the rejection of authority.

In his 1984 book, Hackers: Heroes of the Computer Revolution, Steven Levy enshrined the hacker ethic — the political beliefs that motivated the creation and promotion of personal computers by said hippies:

  1. “Access to computers should be unlimited and total.”
  2. “All information should be free.”
  3. “Mistrust authority – promote decentralization.”
  4. “You can create art and beauty on a computer.”
  5. “Computers can change your life for the better.”

So the idea is that the books brought the original enthusiasm.

What brought the vision?

Can we go as far as arguing that LSD — “turn on, tune in, drop out”, a phrase popularised by Timothy Leary — is what enabled these people to create machines that would set them free? If not, where did the inspiration for the abstractions computers require come from?

For instance, do all the people using IBM’s Lotus software know that the brand goes back to Lotus 1-2-3, a spreadsheet program created by Mitch Kapor, a former transcendental meditation teacher (hence the name Lotus)?

My point is that the invention of personal computers has a political origin, unlike the invention of the lightbulb — but please let me know if I’m wrong.

The people responsible for the promotion of computers rejected the ideas of authority that led to the horrors of the 20th century. Aided by drugs and the mild climate of California, their open minds teemed with new ideas.

Somewhere along the road, though, it went a bit sour.

Initially, Steve Jobs was a different creature from the one popular culture knows today:

In the 1960s and early ’70s, the first generation of hackers emerged in university computer-science departments. They transformed mainframes into virtual personal computers, using a technique called time sharing that provided widespread access to computers. Then in the late ’70s, the second generation invented and manufactured the personal computer. These nonacademic hackers were hard-core counterculture types – like Steve Jobs, a Beatle-haired hippie who had dropped out of Reed College, and Steve Wozniak, a Hewlett-Packard engineer. Before their success with Apple, both Steves developed and sold “blue boxes,” outlaw devices for making free telephone calls. Their contemporary and early collaborator, Lee Felsenstein, who designed the first portable computer, known as the Osborne 1, was a New Left radical who wrote for the renowned underground paper the Berkeley Barb.

When Brand wrote this article in 1995, he could not have known that by 2016 scores of people would despise Steve Jobs for his wrongdoings — ranging from how he treated his own daughter, to the closed nature of the Apple ecosystem, to the despicable way the iPhone is manufactured by Foxconn.

I understand the betrayal computer scientists may feel today.

Jobs took their ideas, applied a healthy dose of human-centred design and marketed them to death.

Brand concludes his piece (again, written in 1995) with a hopeful vision that unfortunately — as iOS & Android, Facebook and the broader rise of social media cement our entrance into the 21st century — is not unfolding as he intended:

Our generation proved in cyberspace that where self-reliance leads, resilience follows, and where generosity leads, prosperity follows. If that dynamic continues, and everything so far suggests that it will, then the information age will bear the distinctive mark of the countercultural ’60s well into the new millennium.

The Internet-equipped smartphone created a new frontier for computer science. Everyone has a computer in their pocket, and people can see their loved ones’ faces from across the globe in the blink of an eye with software like Skype. Closed distribution models (Google Play and the App Store) remain hurdles.

But!

But the Internet enables anyone to learn new skills — do you want to learn how to code? How to fix an oven? How to write in Portuguese? — and then to sell those skills by means of products or services. If you’re not into making software, you can use it to advance yourself or your business. And if you are into making software, it is up to you and me to make it useful — politically and economically.

People like Aaron Swartz, whom Brand could not have known of, are the ones behind new, public technologies like RSS and other open standards. They are part of the fourth generation of hackers he mentioned in his Time article.

We have the tools and the information. Our cleverness will help us cut through the chaotic noise to get at the delightful juice of the signal.

Next time you boot your computer, maybe you’ll think of its history, maybe you won’t.

Now though, you can’t say you didn’t know.

P.S. Original pieces (categorised in “Commentary”) will now be preceded by a black circle, ●.

Two eras of the internet: pull and push

1. All beliefs in whatever realm are theories at some level. (Stephen Schneider)

2. Do not condemn the judgment of another because it differs from your own. You may both be wrong. (Dandemis)

3. Read not to contradict and confute; nor to believe and take for granted; nor to find talk and discourse; but to weigh and consider. (Francis Bacon)

4. Never fall in love with your hypothesis. (Peter Medawar)

5. It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories instead of theories to suit facts. (Arthur Conan Doyle)

6. A theory should not attempt to explain all the facts, because some of the facts are wrong. (Francis Crick)

7. The thing that doesn’t fit is the thing that is most interesting. (Richard Feynman)

8. To kill an error is as good a service as, and sometimes even better than, the establishing of a new truth or fact. (Charles Darwin)

9. It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so. (Mark Twain)

10. Ignorance is preferable to error; and he is less remote from the truth who believes nothing, than he who believes what is wrong. (Thomas Jefferson)

11. All truth passes through three stages. First, it is ridiculed, second, it is violently opposed, and third, it is accepted as self-evident. (Arthur Schopenhauer)

Prospero’s Precepts – 11 rules for critical thinking from history’s great minds. (via explore-blog)

The history of pasta

The history of Coca-Cola and why we took cocaine out of it

When cocaine and alcohol meet inside a person, they create a third unique drug called cocaethylene. Cocaethylene works like cocaine, but with more euphoria.

So in 1863, when Parisian chemist Angelo Mariani combined coca and wine and started selling it, a butterfly did flap its wings. His Vin Mariani became extremely popular. Jules Verne, Alexandre Dumas, and Arthur Conan Doyle were among the literary figures said to have used it, and the chief rabbi of France said, “Praise be to Mariani’s wine!”

Pope Leo XIII reportedly carried a flask of it regularly and gave Mariani a medal.

Thereafter, some guy called John Stith Pemberton decided to make his own version of Mariani’s wine and called it French Wine Coca. His concoction, however, became illegal, not because of the cocaine but because of the alcohol, when Georgia passed a prohibition law.

Pemberton took the wine out and replaced it with sugar syrup, and in 1886 Coca-Cola was born. For thirteen years, from 1886 to 1899, Coca-Cola was very popular among “intellectual” white males. In 1899, Coca-Cola began to be sold in glass bottles, which made it accessible to a bigger market.

Remember, there was still cocaine inside it, just sugar syrup instead of wine. So why did they take the cocaine out?

Middle-class whites worried that soft drinks were contributing to what they saw as exploding cocaine use among African-Americans. Southern newspapers reported that “negro cocaine fiends” were raping white women, the police powerless to stop them. By 1903, [then-manager of Coca-Cola Asa Griggs] Candler had bowed to white fears (and a wave of anti-narcotics legislation), removing the cocaine and adding more sugar and caffeine.