Thirty years in “new media,” Part II: The veteran strikes back

A reader of part one of this blog might ask, “Did you really spend thirty years in new media?”

The answer is yes, and I was into computers long before that.

In 1968, as a teenage page at the Toronto Public Library system, I was part of a summer experiment in the multimedia of the day, as libraries dipped their toes into the water of the new era beyond books.  We made a student science fiction film and as part of the project we filmed 1968-epoch computers being installed at the Ontario Science Centre, then under construction.

As an editorial assistant at CBC Radio News from 1977 to 1979, I used a very primitive computer system assigned to its then internal wire service. By primitive, I mean its memory was the equivalent of an amoeba’s compared to a human’s. You had to type a story, perfectly, on a green CRT screen, because there was no memory to save your work. When the story was ready, you pushed Enter and it was dumped to punch tape, then sent over a regular teletype circuit.

Born of British parents in a then British colony, and thus a dual citizen, I arrived in London in December 1980, following the track of other generations of young Canadians. London was the place to advance a career, and it did that for me, creating a media geek rather than a foreign correspondent. So began my 30 years in “new media.”

Another aim in going to London was to do research for a couple of planned books.

Over Christmas I worked in a crazy pub, the Duke of Kendal, and then in January 1981, after registering as a researcher at the British Library, I landed a job in the mail room of French Travel Service, an independent company affiliated with SNCF that offered package and independent rail tours to France. The job paid the rent and let me do my research at the British Library. There was one unexpected bonus: FTS was one of the British travel companies experimenting with the UK-developed Prestel videotex system. Although I had nothing to do with the Prestel reservation system, it fascinated me, and I found myself looking over people’s shoulders as they operated it.

Lesson 1: IT should always be the servant, never the master. Know your hardware and software.

The computer chap at FTS (there was no “IT” in 1981) was a tall man with a black beard, in an area, London Victoria, of mostly clean-shaven business types. The computer reservation system was a mainframe in a clean room on one side of the small office. The man was incredibly arrogant. In every conversation I overheard with the managers and their secretaries, all shorter in stature (he never lowered himself to speak to me), he would tower over them and begin: “You don’t know much about computers, but…” And he would get his way.

In retrospect, it was probably then that I decided I had to know more about computers. Perhaps because I was an avid reader of science fiction and guessed that computers would be a big part of the future, a year later, back in Toronto, I would take a basic computer course (programming punch cards) and gain a basic understanding of the hardware and software I was using. If you know the basics of the system you are using, you will not only avoid being intimidated by the IT personnel, you will also know enough, as someone working in the media, to tweak the system and be creative.

After a couple of months, having wrapped up the research at the British Library, I answered an ad for someone with computer experience (rare in 1981) at Universal News Services, the UK public relations wire (later part of the PR Newswire empire). UNS was also experimenting with the British videotex system, Prestel. Rather than sending out news releases by teletype, the releases would be easily available to newspaper editors outside of London on a TV screen, the information retrieved from a central mainframe computer.

It wasn’t exactly a leap into the future. Given the strength of the National Graphical Association (one of the unions later broken by Rupert Murdoch), I would type the stories on a typewriter, and then an NGA member would enter them into the computer, just as they would send out a news release by teletype.

Lesson 2: What goes around comes around I. There ain’t no such thing as a free lunch.

UNS promised the newspapers a “free” service, meaning it wasn’t charging for what today would be called page views. (Some Prestel service providers did charge, and soon found they had few clients, an indication of the shape of things to come.) British Telecom, however, was still charging for both the phone lines to the Prestel mainframe and metered usage. Newspaper clients didn’t understand the difference between what today would be called bandwidth and the actual content, so UNS constantly got letters of complaint from newspaper editors, just as today someone, perhaps a teenager with a mobile phone in 2011, spends time with a free app and doesn’t know about the bandwidth charges.

Lesson 3: What goes around comes around II. Life in 140 characters.

There wasn’t much you could say with the limited Prestel system, but one venerable news organization adjusted very well, creating short snippets of news, which is why I blogged in March 2009 that the Economist invented the tweet without knowing it.

After a few months at UNS, I was invited to lunch at the Canadian High Commission in London, which was recruiting Brits working in Prestel to come to Canada and work on the competing, Canadian-developed Telidon system. After a little wine, some good food and persuasion from the diplomatic corps, I decided to head home. A few months later I was back in Toronto.

My first job was with the Southam Infomart project. Southam was then the largest Canadian newspaper chain, and how it ran Infomart was probably the first example of how a large media corporation could completely screw up a new media project. (Knight Ridder was running its own experiment in the US, shut down at about the same time. I have no knowledge of how KR ran its videotex project, but from the few online comments I have seen, it appears KR did not make the horrendous mistakes Southam did.)

I was there just a few months before a series of layoffs; the project was failing, and failing quickly. After a couple of months of unemployment I was hired by the CBC’s parallel teletext experiment, Project Iris.

Lesson 4: Engineers know nothing about content. Neither does the sales force.

Although Southam was a content company, a newspaper chain with a storied and respected history in Canada, it abandoned management of its first new media experiment to the techies, in this case a group of former IBM middle managers (who kept telling us, the content staff, “This is what we did at IBM”). The other key figures were the sales staff, who somehow convinced Sears to put its soon-to-be-released 1982 catalogue on the system, despite the fact that the graphics were primitive. So the majority of the company’s effort went into an early experiment in e-commerce. Only there was no audience for the service; there were no sets in homes. Bell was planning to offer the service, but even then we asked who would take it (although we were optimistic it would take off), and I had to wonder what they were thinking. At least in the UK the Economist created readable content for Prestel. The news content at Infomart didn’t even come from Southam; it was a raw feed picked up from the Broadcast News wire, with the headers left in and no index so a viewer could find stories.

As for CBC Project Iris, it too was managed by engineers, since the funding came from an agreement between the Department of Communications and CBC Engineering headquarters in Montreal. Unlike Southam, the Mother Corp did not cede editorial control to the engineers, so there was a small but very real newsroom repurposing CBC content for the service, which did have an audience: 200 test homes. Later we also had an American audience, since CBS was also testing teletext and one of its test sites was WIVB in Buffalo, with 50 test homes, which meant each audience (if it wanted) could see the other’s feed. So the CBC project continued long after the Southam project died, until it was killed by Brian Mulroney’s budget cuts.

So thirty years later, what goes around comes around. Media and content organizations are still often under the thumb of engineering departments, except now the engineers are outside vendors, whether it is Google’s arcane search algorithms or page and layout designs created for the web, tablets or phones by software engineers with no background whatsoever in content.

Then there is Steve Jobs, until recently the CEO, but still the godfather, of Apple, giving desperate media companies offers they cannot refuse, demanding that they charge for content on the iPad so Apple can get its 30 per cent cut, content that Apple says it can censor at will. Of course, there were dozens of tablets at the 2011 Consumer Electronics Show, but the question is how many of those tablets will survive the evolutionary competition, and whether one tablet succeeds by giving the media companies a way of saying no to the godfather from Apple.

Lesson 5: Apps, brought to you by the butterfly effect.

In physics, chaos theory is summed up by the phrase “sensitive dependence on initial conditions” (or: if a butterfly flaps its wings in one part of the world, it triggers a hurricane across the world). In the days of videotex, no North American homes had sets, so the companies experimenting with the technology had to make some money, and they came up with the idea of putting videotex sets in malls as a sort of electronic guidebook. Among the best commercial clients for videotex in the early days were restaurants. The content could be produced easily: menus were mostly text, and restaurant pages did not really need the photographic-quality graphics that made the Sears catalogue project a failure. So the idea was a guide to the restaurants in a large mall, or perhaps even a neighbourhood.
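As an aside for the technically inclined, that “sensitive dependence” phrase can be made concrete in a few lines of Python. This is my own sketch using the logistic map, a standard textbook example of chaos, and has nothing to do with any videotex-era system; it just shows how a nudge of one part in a billion ends up dominating the outcome.

```python
# The logistic map x -> r*x*(1-x) with r = 4 is a chaotic regime:
# two starting points that differ by a billionth quickly diverge.
# (Function and parameter names are my own, purely illustrative.)

def trajectory(x0: float, r: float = 4.0, steps: int = 60) -> list[float]:
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

def max_divergence(x0: float, nudge: float = 1e-9, steps: int = 60) -> float:
    """Largest gap between two trajectories that start almost together."""
    a = trajectory(x0, steps=steps)
    b = trajectory(x0 + nudge, steps=steps)
    return max(abs(p - q) for p, q in zip(a, b))

if __name__ == "__main__":
    # A nudge of one part in a billion grows into wholesale divergence.
    print(max_divergence(0.2))
```

Run it and the butterfly flaps its wings: the two trajectories, indistinguishable at the start, end up nowhere near each other.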

How do you make it easy for people to use the system? The engineers came up with a brilliant solution.  Touch screens.

The problem was that in the period 1980-1984, touch screens in malls and offices were a total, utter, complete and costly failure. Why? Because idiots, whether they were teenagers or adults who hadn’t grown up, were constantly stubbing out lit cigarettes on the touch-sensitive part of the screen. A single cigarette could destroy a computer system costing thousands of dollars. The videotex booths disappeared from malls almost as quickly as they had appeared.

So think about this. Over the past 30 years, smoking has been banned indoors, in malls and in offices, because of the proven connection between cancer and second-hand smoke. With little historical memory of the videotex failure, it is perhaps a lucky coincidence that second-generation, PC-based touch screens began to appear in government and corporate offices at about the same time as the smoking bans. The success of large touch screen systems allowed the development of apps on smaller smartphones and tablets.

Smoking bans not only made the air cleaner and saved lives from second-hand smoke; they also brought you the apps at your fingertips on your Android phone or your iPad.

One last note: today there are smartphone apps that use GPS to find nearby restaurants and their menus. The concept was right, just 30 years too early.

So when you’re developing a technological innovation, remember that success or failure may depend on something that has absolutely nothing to do with how fast your hardware is or how good your code is. It may depend on something like a bunch of executives lying at a congressional hearing in Washington about the addictive properties of nicotine.

In North America, most of the videotex and teletext projects in both the United States and Canada died between the fall of 1984 and the spring of 1985. The official reason was budget cuts, whether the project was in the public sector or the private sector. The main reason, of course, was that the growth of the personal computer had made videotex obsolete, while multichannel cable television was quickly becoming highly profitable, especially thanks to carriage fees, and teletext was just not worth developing.

Lesson 6: Experts are often blind to the world around them.

Over the past 30 years, companies and governments have often been blindsided by an “unexpected” technological development. The latest example, of course, is WikiLeaks, which, in retrospect, could have been foreseen as a by-product of putting all records in electronic form.

The videotex and teletext systems began development in the UK (Prestel)  and Canada (Telidon) in the mid 1970s.   

The statement attributed to Thomas J. Watson of IBM, that the world would only need five computers, is an urban myth. In the 1950s and 1960s, IBM was concentrating on large, expensive mainframe machines to be used by universities and corporations. It was clear that a machine renting for between $12,000 and $18,000 a month (in 1953 dollars) would be totally inaccessible to the general public.

Even by the 1960s, there was growing public interest in computers, and there were visionaries who began looking for a way to involve the public, create a market, give access to information and even make a profit. The solution was videotex. The computer keyboard had already been developed. Add some memory, make the keyboard a little smarter, connect it to a TV set (already in every home) and then by phone line to a mainframe computer (usually IBM for videotex and DEC for teletext), and lo and behold, the public would be introduced to the world of personal computing.

So when I first became interested in videotex in London in the winter of 1981, and when I returned to Canada in the fall of 1981, I was told by the companies I worked for on both sides of the Atlantic, and by other people in the industry at meetings, that all the experts believed it would take 20 years of slow but steady improvement of the keyboard-phoneline-mainframe system before there was a viable personal computer system.

In 20-20 hindsight, with a little Monday-morning quarterbacking, the failure of videotex was certain. Steve Wozniak had introduced the first Apple II personal computer in June 1977, followed by the Apple II Plus in June 1979. I had actually considered buying an Apple II Plus in that summer of 1979 before I headed for London (it was too expensive, especially for an impoverished freelancer). Even as I was working in videotex, IBM, the maker of the mainframes used by some of the videotex systems, was already developing the personal computer. In August 1981, as I resigned from UNS and went for a two-week vacation in Greece, IBM launched the first IBM personal computer. There were competitors: the Atari and Commodore systems, the Tandy TRS-80 (the “Trash 80,” which many techie journalists of the era fell in love with) and CP/M machines like the Osborne I bought in 1983, while I was still working at CBC Project Iris. The introduction of the IBM PC XT in March 1983 (I saw it at a trade show in Toronto that month), with its amazing 10-megabyte internal hard drive, arguably the first truly consumer-friendly PC, meant videotex was doomed.

As I said, what goes around comes around. It’s thirty years later and what, apart from the tablet, was hot at the Consumer Electronics Show this year?


One big item was a real old-fashioned idea, obsolete for more than a quarter of a century: connecting your television set to a computer system and giving it a keyboard. Of course, it is now a high-definition set, and one of the reasons to connect to the Internet is to download movies, but the system also gives the user complete access to the World Wide Web. If one of those experts from 1981 had been caught in a time warp and suddenly reappeared in a living room at Christmas 2011, where the family gathers around to watch a downloaded movie on an HD set and check their e-mail at the same time, that expert, with no knowledge of what had happened in the previous three decades, would have thought the prophecy had proven true. (And given that the telecoms want to charge more for all that bandwidth to download a movie, that too might bring back memories for our time traveller.)

After Project Iris was killed by Brian Mulroney’s budget cuts, I kept my connection with developing tech through my new Osborne; I wrote my first book, King of the Mob, on that four-inch screen. In October 1988, I joined CTV News as a writer on the CTV National News.

Lesson 7: Beware of software executives bearing gifts.

At CTV in those days, 1988 to 1994, the TV news writing software was awkward and primitive compared to the expanding and consumer-friendly software being created for the growing PC and Mac markets. A company named Columbine had created mainframe-based software for tracking commercial sales and placement, and it threw in the news writing software as an added inducement for bean-counting corporate executives to buy the commercial tracking system. While Columbine may have had some expertise in tracking commercials, the news writing software was a mash-up.

Add-on software is, in most cases, a very bad deal.

Exactly the same situation exists with Novell GroupWise, which is certainly not the best e-mail client in the world; but because it is added to Novell’s networking software, which seems to work well, many companies force their employees to use GroupWise, even though there are much better products on the market. Why would any company in its right mind spend all that extra money licensing GroupWise per workstation, on top of all the money it pays for Novell’s networking software, when better products such as Thunderbird are available? Not to mention Gmail: during the CBC lockout, we created a duplicate of the CBC GroupWise system using Gmail, at no cost (and it worked better).

Lesson 8: Managers should always consult the people who actually use the hardware or software.

I can’t count the number of times that media managers, based on talking to consultants, fast-talking software salespeople and sometimes even IT people, have imposed software and/or hardware on staff without asking them whether it actually works for what the company wants to do with it. One of the few times staff were consulted was at CTV News, when management brought us in to see what they thought was a great piece of TV news writing software to replace the much-hated Columbine. It was a good piece of software, but as the salespeople enthusiastically ran through its features, my techie alarm bells started ringing, and I began asking questions: how would the lineup editor and the producer communicate if one was at the main desk and one in the control room, and how would the writers work with the lineup editor? What management didn’t realize, until I, the user and techie guy, began asking those questions, was that the vendor was presenting software that was really good for a small local station (the vendor’s client base in the US) but totally inadequate for a network news operation. They didn’t buy that software.

In the fall of 1993, I began co-writing Researching on the Internet, the first book on the subject. It was a rather exciting time to be writing that kind of book, just as Mosaic and later Netscape opened up the World Wide Web. It was also the time that both the PC and the Mac were taking off, with hundreds of small new companies in fierce competition with each other just to survive.

Lesson 9: Software vendors will always promise you the moon, the stars, and a galaxy far, far away.

Software salespeople haven’t changed in a quarter of a century. They promised you the moon with a 10-megabyte-hard-drive PC in 1983, and now, in 2011, with mobile phones at genius level compared to the computers that actually sent NASA to the moon, they promise you the stars. Whether it’s 1983 or 2011, the software guy who comes to your office or greets you at a trade show (even these days, it is still usually a guy) wears a company polo shirt and nicely faded blue jeans, sounds more like a California surfer dude than a geek, has a big smile, is so good-looking that he may also be registered with Central Casting, and loves his tech so much that he really believes his product is the greatest thing since the invention of the silicon chip and COBOL (look it up on Wikipedia).

Caveat emptor.   That’s Latin for “let the buyer beware,” which  leaves one wondering, given that the Romans were such good engineers, if there were  tech trade shows in the Coliseum when the gladiators had a day off.

The surfer-dude salesman’s supervisor also wears the company polo shirt but sports dress pants, is in his late 30s, maybe wears glasses, sounds more like a professor and is geekier than his sales staff. He was probably the good-looking kid at a trade show long, long ago and far, far away, and stopped going to the gym when he was promoted or married, or both. His role, of course, is like that of the boss in an auto dealership, the sales manager offering you “the deal” the salesperson can’t. If you were wearing a media badge, that usually meant the software was free. For anyone else, the manager has visions of the ten-thousand-workstation contract. The pitch is always the same, whether it is 1983 with the first PC, the multitude of tablets at CES 2011 or the new, new thing at whatever trade show is hot in 2021: our software is the greatest thing since the creation of the universe. After a while, to the jaded veteran, it all sounds exactly the same.

There is one lesson that holds true, for hardware or software, in 1983, 2011 or 2021: never buy Version 1.0. Never! (At least in the beginning, in 1983, Version 1.0 was usually stable, if incomplete, with minimal features. These days, with the rush to market and the pressure for profit, Version 1.0 is actually closer to Beta 0.56 Build 1066.)

Lesson 10: One of the great failures of the mainstream media was its lousy coverage of the software industry.

Again, with 20-20 hindsight, it is easy to see that an early indication of the coming failure of the mainstream media was not its adoption, or failure to adopt, new technological innovations, but its failure to cover the software industry the way it was then covering the police beat, city hall, and provincial, state and federal governments.

When I was asked to write Researching on the Internet, I had already been following tech for a decade, and I knew everything was changing at high speed. The solution was not to create a software manual, impossible in any case, because unlike Version x.x of a piece of software, the web wasn’t static. My idea for the book (especially since it was written in a time of transition) was to give readers some basic principles so that they could work with the web as long as possible. The idea was right: Researching on the Internet stayed in print and kept selling (and making me a profit; the book “earned out”) long after the actual software it described had been replaced by new versions.

So with that in mind, when I approached software companies, my questions were similar to those I had often asked as a reporter, of police, of city hall, of the big industries in town and in the locker room. Software companies traditionally kept their developments secret so as not to reveal them to competitors, which is perfectly understandable. The problem was that most software companies were used to uncritical coverage as they announced their latest products. They were not expecting even the mildest critical question of the kind that even a local sports reporter, perhaps too close to the home team, might have asked a hockey coach about his plans (or lack of them) for the coming season.

I remember meeting an executive of one then prominent software company who turned pale at some of my pretty innocuous questions and quickly palmed me off to a PR person, who simply repeated how good the products were and showed me to the door. (It turned out that the company’s financial position was not as good as it claimed, and the company was later sold.)

One area generally ignored by both the mainstream press and the computer press (the latter dependent on advertising from software companies) was softcide. Softcide was a common practice during the boom of the 1990s in which one company, with deeper pockets, bought a company with a better product and then killed that product, so that the next so-called “upgrade” left angry customers with the inferior product while support for the better, now orphaned, software was abandoned. The business press was even worse, usually caring only about the stock price and not the actual management of these companies.

It was only when some of those outraged customers, computer writers, former employees and sometimes current (and anonymous) employees branching out on their own began blogging with inside scoops on the software industry that the mainstream media caught up (and even today the MSM is too often dependent on those bloggers).

In 1994, I returned to CBC, where I would work as a TV lineup editor, then as a web writer and producer, and later as a photo editor. I watched as online news started as a hole-in-the-wall closet-office experiment, then became a small team working and changing on the go, until, like all other online news operations, it was finally folded into the corporate machine.


Lesson 11: Team should mean team.

Team has become a cliched buzzword. Software companies and your ISP sign off their messages as “the X Team,” and spammers take advantage of the team cliche. (I have received auto spam from the “ team,” not bad for a one-man operation.) At the same time, television news, using the same cliched buzzword, promises “full team coverage,” as does every other TV station in town. Not to mention the newspapers.

One has to wonder why the executives, whether in software or the media, are so blinkered that they actually believe the public pays attention to this constantly repeated nonsense.

A good newsroom has always been a team, going back 150-odd years or more to the first major newspapers. Software, with its often millions of lines of code, is also a team effort.

In many cases, bean-counting management, applying cost-benefit analysis, has undermined team efforts in both industries, with staff cuts, ignored morale problems and bureaucratic headaches, all while maintaining a message track about team effort.

Like all cliches, like all message tracks, the team analogy is based on truth. In the 30 years I have worked in new or online media, the system worked best when the IT staff were present in the actual newsroom, rather than on another floor or even in another city. In a couple of cases, it was a single person working with us in developing projects. In another case, the IT staff (programmers, network administrators and hardware geeks) were crammed into a small office with the news staff, because there was no room for them anywhere else.

In all three cases, the majority of the IT staff saw what we were trying to accomplish and worked their butts off to make sure the system they had created did what it was supposed to, especially when there were problems getting stories up on the web during breaking news and those miracle workers created instant workarounds.

Unfortunately, once the IT people had their own office, they soon lost interest in what the newsroom needed, and their aim became fulfilling the IT department’s priorities and the demands of IT culture. It got even worse when bean-counting management consolidated IT network and technical support in a call centre in a city hundreds of miles away, staffed by people who never had any concept of what the media staff were trying to do. (At least the call centre was in Canada, not Bangalore or Kuala Lumpur.)

IT culture at its best can be creative; at its worst it is a bureaucratic nightmare. Unless there is a symbiotic relationship with the actual production staff, when the IT culture is separate from the newsroom culture, the system breaks down. It’s as if the journalists are the leopards and the IT staff the lions, or the journalists the orcas and the IT staff the sharks: similar creatures in a similar environment, but with different and often competing goals.

The worst case of IT disconnect came in 2001. At one major news organization, the IT staff had scheduled a network upgrade for September 13, 2001. The idiots were so blind that the upgrade went ahead regardless of the events two days earlier, on September 11, and the entire system slowed to a crawl. The IT honchos were rather put out by the escalating complaints, starting with front-line news staff and rising to senior news management, when the upgrade didn’t work properly.

The journalism programs at Columbia (the Tow Center for Digital Journalism) and New York University are currently working on programs and curricula that will create “journo-programmers.”

(See also Nieman Lab’s Boston Hack Day Challenge and Educating the journo-programmer.)

I was one of the first journo-programmers myself. After I returned to Canada from London, I took a programming course at York University. It being 1981, I programmed using punch cards. The course was invaluable: because I always had a basic understanding of how computers worked, I was always able to adapt to new innovations.

There’s one problem, however, with what Columbia and NYU are attempting. There is no mirror-image curriculum in which IT people are trained as programmer-journos (or programmer-doctors, programmer-cops or programmer-millwrights, etc.). While it is good that a young journo-programmer knows the basics of code and/or how to run a server, it is not going to do that young man or woman much good when they come up against corporate IT and its priorities. The journo-programmer may know what he or she is talking about, but if history is any guide, in most cases they will be ignored by IT.

Many corporate IT people still believe that anyone who calls to report a problem is the cliched dummy who uses the CD drive tray as a coffee mug holder and knows nothing about the system. I say many because my geek colleagues and I always made it a point to find out who the better and more responsive IT people were and, when possible, went directly to them.

We always joked that the best training for dealing with corporate IT was watching M*A*S*H. Unfortunately, in too many cases, the better IT people soon left: because media IT salaries were low compared to other sectors, because other companies recognized their talent and hired them away, because they couldn’t stand the stultifying bureaucracy, or because they were fired by bosses who didn’t want employees smarter than they were.
I have always thought that at any company, no matter what the product or service, all IT staff should be required, as a condition of employment, to start at the bottom, at least for a month, and work in their company’s main product or service line. However, that dream for the working staff (and perhaps nightmare for the IT staff) will likely never happen.

Throughout my career (and this is a good reason to have journo-programmers), if we could avoid working with the IT people on the other floor, we did our own workarounds.

Of course, if the news staff and IT were truly a team, there wouldn’t be these kinds of problems.

It soon became apparent to the news organizations that were early on the web that they had to quickly expand their staff beyond the pioneer geeks.

That’s when the broom-closet IT staff created the first template systems, which grew into in-house and, later, outside vendor-supplied Content Management Systems. Those Content Management Systems meant that a whole generation of journalists working on the web never actually had to understand the nuts and bolts of how the system worked. They simply showed up for work and wrote their copy, or uploaded their photos and video, in a system that to them was not much different from the typewriter of an earlier generation. (That is, if the system actually worked. Again, senior management was too often seduced by the promises of software vendors and bought expensive CMSes that were not suitable for newspaper, TV or magazine media.)
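For readers who never saw one, the core idea of those first broom-closet template systems was tiny: drop a story’s headline and copy into the site’s boilerplate HTML. This is a minimal sketch of that idea only, not any actual newsroom’s system; the function and field names are my own invention for illustration.

```python
# Illustrative sketch of an early-1990s-style newsroom template system.
# A reporter supplies headline, body and date; the script wraps them in
# the site's boilerplate HTML page. Names here are hypothetical.
from string import Template

PAGE = Template("""<html>
<head><title>$headline</title></head>
<body>
<h1>$headline</h1>
<p>$body</p>
<p><i>Filed $date</i></p>
</body>
</html>""")

def render_story(headline, body, date):
    """Fill the site template with one story's copy."""
    return PAGE.substitute(headline=headline, body=body, date=date)

if __name__ == "__main__":
    print(render_story("Netscape IPO sets record",
                       "Shares soared on the first day of trading.",
                       "Aug. 9, 1995"))
```

Everything beyond this, workflow, approvals, photo handling, archives, is what turned such scripts into the in-house and then vendor-supplied CMSes described above.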

Lesson  12.  Be aware of the innovation cycle and be prepared for it.

As everyone who works in the media knows, the business is mired in a deep crisis, and that crisis is getting worse as innovations seem to appear almost every day, with corporate news executives flopping around like fish out of water in their efforts to catch up.

After about a decade of relative stability, from the late 1990s to the late 2000s, following the introduction and then the maturity of the World Wide Web, the past few years have brought Facebook, then Twitter, then the smartphone, then Foursquare, then the iPad and now Quora.
This is reflected in the Twitter hashtag #futureofnews. I quickly noticed something about those posting on #futureofnews (I admit this is unscientific and anecdotal, but perhaps someone looking for a PhD dissertation can quantify it).

There is, as far as I can tell, an age-related reverse bell curve among those posting on #futureofnews or #journalism and discussing the survival of the news media. The majority of posters are either in their 50s and 60s or in their 20s, students and young journos.

On the older side are people I met at the Computer Assisted Reporting conferences in the heady days of the early 1990s, or who appeared on the early CAR and online news lists, like Dan Gillmor, Steve Yelvington, Danny Sullivan and Steve Outing; gurus from then and now like Don Tapscott; and slightly later pioneers like Jim Brady (@jimbradysp) and Jeff Jarvis. On the rising side of the reverse bell curve are the younger posters, people in their 20s, like Adam Westbrook and Cody Brown.

Why is that? News management these days might like to believe that anyone over 40 is obsolete as far as new media technological innovation is concerned.

Not so. My contemporaries, call us the over-the-hill gang or the geeks from Cocoon if you wish, were part of an innovation cycle, where we had to adapt to something new every day. While there are people in their 30s and 40s on #futureofnews, they are usually not the most frequent posters. Most of them came into journalism in the relatively stable and mature period of the World Wide Web, from approximately 1996 to 2006.

It is the generation from 18 to 28 that faces the greatest challenges. It is a time of economic crisis for all of society, and even more so for the news media, at a moment when technological innovation is proceeding at warp speed. (After all, the previous generation, my generation, faced innovation at a time of prosperity and, apart from a couple of downturns, economic stability.)

That is why the new generation of journalists and journalists-to-be are the most frequent posters on #futureofnews, and that is where the most productive feedback and mentoring occurs between the previous generation that faced an innovation cycle and the current one.

I am not optimistic that the current (mostly aging) corporate news management can adapt to the economic downturn, the increasing pace of technological innovation and, for the West, especially a United States too long comfortable at the top of the heap, growing international competition.

If even a few executives come to realize that we are in a period of evolving media (as I discussed in part one of this blog), some of the better media will likely survive.

As for the long term survival of  traditional journalism that tells the world both what it wants to know and also what it needs to know,  it is likely that, if anyone saves the craft and the profession, it will be someone who right now is 19 or 21 or even 28, who will discover the key to future success.

If they want the help of an old veteran, I’ll be glad, grasshopper, to tell them more tall tales of punch cards and four-inch screens and hand-coding HTML news stories. The world is different, but as I have said, what goes around comes around, so I write in the hope that the Tao of News will give them some ideas on how to be flexible and adaptable in the face of the latest new, new thing, and how to deal with bean-counting managers and corporate IT call centres, so they can do what’s really important: cover the news.


In the beginning: Why the media couldn’t charge for content.


If only, if only, my colleagues say, if only the news media had started charging for content when they launched their first websites.

If only the media had charged, then none of the current problems of free content would have happened, the public would know that content costs money and the newspapers and TV stations would have a second, strong income stream and all would be well. There would be lots of good, high paying jobs and the money to do real journalism rather than celebrity silliness.


If only……


So now there is a search for scapegoats. Media managers who have shown that they are completely incompetent in running traditional print and broadcast are an easy and obvious target.

Others blame the tech community and a misunderstanding based on the truncated quotation of Stewart Brand: “information wants to be free.”

Then came Wired editor Chris Anderson’s nasty tract, Free. The main flaw in “free” is the assumption that the concept can transfer outside the tech and science fiction communities. For creative individuals and most of the media, unlike commodity (or atom) based corporations, “free” usually doesn’t work, an inconvenience that the advocates of “free” constantly ignore. What is left is basically a schoolyard bully taunt: “So there, free is the future, so take your medicine and work for nothing.”

Most of the people who ask the question and provide answers were not around in the earliest days of online news media. So that is why there is a belief that if somehow the media had charged in the early days, today all would be well.

Yes, there was one day, and just one day, when, if the media had got its act together, it could have started charging for online news: September 1, 1993. The trouble was that no major media were on the Internet in a big way come that September.


I was present at the creation of online media


I was working in “online media” long before the launch of the World Wide Web, back in the days of videotex and teletext from 1981-1985.

The Internet played a role in my science fiction short story Wait Till Next Year, published in Analog in November, 1988 (although I got some of the tech details wrong).

I got my first Internet account in August 1993. Note I am a very early adopter and got in just before the Internet tsunami a month later in September 1993.


I co-wrote the first book on Researching on the Internet, published in the fall of 1995. So I was researching the state of the internet, the web, and the media at the first moments of news on the web.

I was the third employee assigned to CBC News Online, April 1, 1996.

The cold, hard fact is that the web evolved with free content. It had little to do with Stewart Brand. So when the media first ventured onto the web, they had to play by the rules of the time. Those rules appeared to say, “commerce on the Internet is a no-no.”

The Genesis of the media on the Internet



In the beginning (in 1968-1969), the US Department of Defense created ARPANET.

And DOD saw that it was good.

DOD said let the military and the scientists communicate.

And the military and the scientists communicated.

And DOD saw that it was good. The American taxpayer was getting a good return for their money.

But then there was darkness on the face of ARPANET.

DOD saw that too many people had access to the ARPANET and most of the users didn’t have security clearances.

DOD said in 1983, we will create a separate MILNET and give the scholars ARPANET.

In 1984, the National Science Foundation created NSFNET.

And DOD and NSF saw that it was good.

Thus TCP/IP spread to universities around the world.
And the scholars saw that it was good.

The techs improved a system called UUCP and created protocols for e-mail, ftp and newsgroups.

And the techs saw that UUCP was good and said GNU, thus, this protocol shall be free to all.

The campus deans said let us have more access to ARPANET, NSFNET,TCP/IP and UUCP NET via private sector telecoms who can do the wiring.

Verily the private sector telecoms wired the universities and the laboratories and created dial up for scholars in their homes.

The telecoms reaped great profits of gold and silver and precious things from those wires.

And DOD and NSF and the scholars and the techs and the telecoms saw that it was good.

NSF decreed that NSFNET and ARPANET shall be free from commerce, for it was the will of the community that the networks are for education and the spread of human knowledge.
And so NSF said we shall cast out UUCP NET because it can be used for commerce (but we will still use the free software they developed).

And thus UUCP NET was cast out.

The telecoms and the nations of the world far from North America agreed that this networked system was good and created their own networks.

And they all saw that it was good.

Thus it came to pass that the universities which had journalism schools gave their students access to what was now known as the Internet.

And lo and behold it appeared to be free (although their accounts were paid for, in part, by tuition fees). The students were taught that the Internet was educational and thus should be free for all.

At the same time their elders in journalism who loved tech were using another system called CompuServe (which the elders had to pay for with their credit cards).
The journalism students and j-professors came on to CompuServe and said, “Behold, I bring you tidings of great joy. There is this wonderful thing called the Internet and it is free.”

It came to pass that Tim Berners-Lee at CERN created the World Wide Web.
And all saw that the World Wide Web was good.

So the professors, and the students and the reporters and the editors, all of whom loved tech, all rejoiced when they saw the World Wide Web. For they thought they had found the perfect way to deliver the news.
Out of a whirlwind came Netscape.

At first only the techies loved Netscape.

Then Netscape said, we shall have an IPO.

In the year of our Lord 1995, on the ninth day of August, the IPO came to pass, and it was wonderful and the Netscape stock set a record on Wall Street.

So Netscape became front page news and was high on the evening newscasts.

The media barons and all priests and scribes of the news temples saw that much gold and silver was going to Netscape and asked “What is going on?”

So the barons and the priests and the scribes summoned those of their followers who were techies and said “Tell us, what is this Internet? What is this World Wide Web? Why is Wall Street giving gold and silver and precious things to Netscape?”
The techie reporters and editors said to the barons, the priests and the scribes, this is the Internet, this is the Web.

The techie followers showed the barons, the priests and the scribes their personal websites. The techie editors showed the barons, priests and scribes the under-the-table news sites they had created. They told the exalted ones: this World Wide Web is perfect for delivering news; you can have text, you can have pictures, you can have audio and you can even have video.

The barons and the priests and scribes decreed to their techie followers and editors: “Thou shalt build websites for our news operations.”

So the techie news people and the tech techies laboured mightily and created websites. They presented the websites to the barons, priests and scribes.

The barons, priests and scribes looked at the websites and saw that they were good. So they told the news people and techies that they had done a great service and would be rewarded from the gold and silver that would come from this new World Wide Web (although the barons, scribes and priests, like all their kind, were lying and did not intend to really reward their followers).

The techie news people and the tech techies trembled and quaked but bravely told the barons, priests and scribes, “No, oh exalted ones, that is forbidden. It has been decreed from on high that there will be no commerce on the Internet.” And they were sore afraid.

The barons, priests and scribes said to themselves, “What the fuck is going on?”

So that’s the story.
From creation to evolution

There are two key points.

First, as is well known, the Internet did evolve from military, scientific and university communications systems, which were, on the surface, free, although, of course, largely paid for by the American taxpayer and university endowments.

The culture of free exchange of information is the basis of scholarship, but it is, of course, paid for behind the scenes by government, foundation and endowment funding. Thus the culture of free information sat at the core of Internet use at the time the media first began to be interested in putting news on the web.


Second, in the early 1990s, before the rise of the independent Internet Service Providers and the expansion of services by the telecoms, large and small, the main communication network for the Internet in North America was the NSF Backbone, the high speed Internet communications network run by the U.S. National Science Foundation, which as part of its policy, forbade the use of the backbone for commercial purposes.

Thus, in theory, and as the conventional wisdom held, no one using the Internet for commercial purposes, which would have included charging for news, could use the main North American Internet communications backbone.

But in reality the situation was a lot greyer, not so black and white.

I kept all my research material from 1993-1994, when I was writing Researching on the Internet (material I recently donated to the York University Computer Museum).

Here is what a couple of the leading books of the time said (books which most libraries, I suspect, discarded long ago, and so are no longer available to those in the media who lament “if only”).

The Internet Companion: A Beginner’s Guide to Global Networking, by Tracy LaQuey with Jeanne C. Ryer (Addison-Wesley, May 1993), put it this way:

Probably the best known and most widely applied is NSFNET’s Acceptable Use Policy, which basically states the transmission of “commercial” information or traffic is not allowed across the NSFNET backbone, whereas all information in support of academic and research activities is acceptable.

It sounds somewhat complicated, but you need to remember the original Internet began as a US government-funded experiment and no one expected it to become the widespread, heavily used production network it is today.

It’s going to take a while for commercialization and privatization of these networks to occur. The Internet as a whole continues to move to support--or at least allow access to--more and more commercial activity. We may have to deal with some conflicting policies while the process evolves, but at some point in the Internet future, free enterprise will likely prevail and commercial activity will have a defined place, making the whole issue moot. In the meantime, if you’re planning to use the Internet for commercial reasons, make sure the networks you’re using support your kind of activity.


Another book, from just a little later, Kevin M. Savetz’s Your Internet Consultant: The FAQs of Life Online (Sams, 1994):

Commercial activity isn’t allowed on the Internet? It’s purely an academic and educational network, right?

People who advertise and sell stuff on the net should be flogged, right?

Yes and no. As mentioned earlier in this book, the Internet is composed of a variety of different networks. Each network has its own set of rules, called acceptable use policies.

Certain networks (particularly the National Science Foundation network, the NSFnet) have strict acceptable use policies that ban most types of commercial use.

On the other hand, another backbone network within the Internet world has been finding considerable interest among commercial Internet users--the Commercial Internet Exchange (CIX). The acceptable use policies of CIX are much more broad, and advertising and selling are both within its purview. So although commercial activity isn’t allowed on certain parts of the Internet, it is allowed on others.

People who advertise on the Internet should only be flogged for heinous violations of Internet culture, such as sending unsolicited junk e‑mail or posting commercial messages to Usenet groups that aren’t supposed to be used for commercial messages.

In the same book, another writer, Michael Strangelove, answered a question that was key for the media in retrospect, and somewhat prescient as well:

Is advertising allowed on the Internet?

…many people see the Internet as a noncommercial, academic, purely technical environment. Not so: today about fifty per cent of the Internet is populated by commercial users. The commercial Internet is the fastest growing part of cyberspace.

Businesses are discovering that they can advertise to the Internet community at a fraction of the cost of traditional methods. With tens of millions of electronic mail users out there in cyberspace today, Internet advertising is an intriguing opportunity not to be overlooked. When the turn of the century rolls around and there are one hundred million consumers on the Internet, we may see many ad agencies and advertising-supported magazines go under as businesses learn to communicate directly with consumers in cyberspace.


Those were print books aimed at the newbie Internet user.

But it also means that if the media had had the foresight to get on the Internet in the earliest years of the 1990s, they would have had to become part of the proposed Commercial Internet Exchange.

But in 1991, ’92 and ’93, online in a newsroom was confined to what was called, in many American (and Canadian) newsrooms, the “geek in the corner.”

The situation was already changing even as those books went to press.

Here is how Wikipedia explained the changes.

The interest in commercial use of the Internet became a hotly debated topic. Although commercial use was forbidden, the exact definition of commercial use could be unclear and subjective. UUCPNet and the X.25 IPSS had no such restrictions, which would eventually see the official barring of UUCPNet use of ARPANET and NSFNet connections. Some UUCP links still remained connecting to these networks however, as administrators cast a blind eye to their operation….

In 1992, Congress allowed commercial activity on NSFNet with the Scientific and Advanced-Technology Act, 42 U.S.C. § 1862(g), permitting NSFNet to interconnect with commercial networks.[31] This caused controversy amongst university users, who were outraged at the idea of noneducational use of their network


So, the US Congress had opened up the Internet to commercial activities in that country.


The geeks, bearing content


Most of the media were still clueless and didn’t jump at the opportunity, even as they ran Sunday feature stories on the geeks or closing items on the evening news about this thing called “the Internet.”

It is likely that the vast majority of executives with their eyes on Wall Street and paying consultants pushing 1970s media models had no idea that they employed a “geek in the corner,” much less what the geek was doing.

Apart from the tech companies, the growing hardware and software giants plus the small-office startups and computer science grad students with big ideas, who made up most of Strangelove’s “commercial activity,” the private sector around the world was slow to take up the challenge.


The CBC, as Canada’s public broadcaster, had, at least in those days, a mandate to experiment and innovate. So in 1993 CBC began an experiment working toward streaming radio on the Internet, in cooperation with the Communications Research Council. But as an experiment, and coming from a public broadcaster, there was no thought of charging for the service. (The history of those early days shows the kinds of problems that executives faced. And it was a lot harder for the private sector, which was expected to make money, and harder still now, in the era of bean-counting consultants and their talk of profit centres.)

When business executives finally realized that the Internet was open to commerce, the news media was one of the first industries to make a major effort to post its material, most of it repurposed, on the World Wide Web. The move was most often driven by managers and employees still around from the videotex and teletext days, who saw that web-based news might succeed where the 1980s projects had failed. Usually these experiments were not sanctioned by head office, and the money came from a little creative budgeting.

That meant the content had to be free, right from the beginning.


There’s one factor that today’s audience-metrics-obsessed media bean counters have never considered when they say “if only”: their all-important audience. The audience for online news in the mid-1990s was made up of Internet and Web early adopters, and most had adopted the culture of free information. In those early days, no media was willing to make an investment in online content that was actually worth paying for. Most of the news was repurposed from existing print, radio or television, which was readily available (for a price, of course).

So when the first media pioneers ventured on to the Internet in the mid-1990s (including CNN, NBC, the CBC where I worked, the Raleigh News and Observer, The Globe and Mail and The Toronto Star and others) the media was caught in an evolutionary feedback mechanism.

To attract the early adopter audience, the news had to be free. The audience that might have paid was not yet online (although the richer business types were using proprietary electronic services, which meant they didn’t need to pay for web content either; that pre-web willingness to pay for business information is why the Wall Street Journal paywall has worked while others failed).

Where was the money to come from? The early click through rates for the first banner ads (which many in the audience actually objected to) were dismal.

Development of good websites cost time and money, and the media was already facing the culture of free. I predicted trouble for newspapers when I was interviewed by Craig Saila for the Ryerson Review of Journalism in fall 1996, in an article published in spring 1997 (registration required; also available on Craig Saila’s site).

The headline pretty much sums up the attitude the students of the time had to media management which was failing to adapt to the fast changing environment.

Looks like the students were right. The Ryerson article was just about the media that had had the courage to venture on to the web by the fall of 1996.

Most of the news media were latecomers and took almost a decade to catch up in page views with the early starters. The latecomers couldn’t charge for their content because 95% of those early online services, their competitors, were free. Neither group was putting that much money into real web content.

If only


There was one day when all the media could have made sure they could charge for content: September 1, 1993.

For it was in September 1993 that the Internet (not yet the web) took an evolutionary leap from a government, military and academic information network and communication system to one used by the public.

In September 1993, America Online, then the largest paid service, opened a gateway to Usenet, the “newsgroups” of the Internet, for its subscribers. It was a moment that those who then thought the Internet was their exclusive domain remember with horror, called by some the tsunami, or, as Wikipedia describes it, the beginning of the “Eternal September,” when their private party ended.

Yes, there were a few news organizations with a presence on CompuServe or America Online on September 1, 1993, but far too few, and the content was far too thin.

If the media wanted to charge for content, then after September 1993, when thousands of AOL subscribers ventured onto the adolescent Internet of the time and embraced a culture where they expected free content, it was already too late.


A tectonic collision occurred that September, the leading edge of one continent collided with another.

Invasive species penetrated the long-balanced media ecosystem and disrupted it beyond imagination. So will evolutionary forces work? Will the news media adapt to the new environment?



