A reader of this blog might ask, “Did you really spend thirty years in new media?”
The answer is yes, and I was into computers long before that.
In 1968, as a teenage page in the Toronto Public Library system, I was part of a summer experiment in the multimedia of the day, as libraries dipped their toes into the water of the new era beyond books. We made a student science fiction film, and as part of the project we filmed 1968-era computers being installed at the Ontario Science Centre, then under construction.
As an editorial assistant at CBC Radio News from 1977 to 1979, I used a very primitive computer system assigned to its then internal wire service. By primitive, I mean its memory was the equivalent of an amoeba's compared to a human's. You had to type a story, perfectly, on a green CRT screen, because there was no memory to save your work. When the story was ready, you pushed Enter and it was dumped to punch tape, then sent over a regular teletype circuit.
I arrived in London in December of 1980. Born of British parents in a then British colony, and thus a dual citizen, I was following the track of earlier generations of young Canadians: London was the place to advance a career. London did that for me, creating a media geek rather than a foreign correspondent. So began my 30 years in “new media.”
Another aim in going to London was to do research for a couple of planned books.
Over Christmas I worked in a crazy pub, the Duke of Kendal, and then in January 1981, after registering as a researcher at the British Library, I landed a job in the mail room of French Travel Service, an independent rail tour operator affiliated with SNCF, offering package and independent rail tours to France. The job paid the rent and let me do my research at the British Library. There was one unexpected bonus: FTS was one of the British travel companies experimenting with the UK-developed Prestel videotex system. Although I had nothing to do with the Prestel reservation system, it fascinated me, and I found myself looking over people's shoulders as they operated it.
Lesson 1: IT should always be the servant, never the master. Know your hardware and software.
The computer chap at FTS (there was no “IT” in 1981) was a tall man with a black beard in an area, London Victoria, of mostly clean-shaven business types. The computer reservation system was a mainframe in a clean room on one side of the small office. The man was incredibly arrogant. He began every conversation I overheard with the managers and their secretaries, all shorter in stature (he never lowered himself to speak to me), by towering over them and saying: “You don't know much about computers, but…” And he would get his way.
In retrospect, it was probably then that I decided I had to know more about computers. Because I was an avid reader of science fiction and guessed that computers would be a big part of the future, a year later, back in Toronto, I took a basic computer course at York University (programming punch cards), which gave me a basic understanding of all the hardware and software I was using. It is not just that if you know the basics of the system you are using, you will not be intimidated by the IT personnel; you will also know enough, as someone working in the media, to tweak the system and be creative.
After a couple of months, having wrapped up my research at the British Library, I answered an ad for someone with computer experience (rare in 1981) at Universal News Services, the UK public relations wire (later part of the PR Newswire empire). UNS was also experimenting with the British videotex system, Prestel. Rather than sending out news releases by teletype, the releases would be easily available to newspaper editors outside of London on a TV screen, the information retrieved from a central mainframe computer.
It wasn't exactly a leap into the future. Given the strength of the National Graphical Association (one of the unions later broken by Rupert Murdoch), I would type the stories on a typewriter, and then an NGA member would enter them into the computer, just as they would send out a news release by teletype.
Lesson 2: What goes around comes around I. There ain't no such thing as a free lunch.
UNS promised the newspapers a “free” service, meaning it wasn't charging for what today would be called page views. (Some Prestel service providers did charge and soon found they had few clients, an indication of the shape of things to come.) But British Telecom was still charging for both the phone lines that went to the Prestel mainframe and metered usage. Newspaper clients didn't understand the difference between what today would be called bandwidth and the actual content, so UNS constantly got letters of complaint from newspaper editors who did not understand that difference, just like someone today, perhaps a teenager with a mobile phone in 2011, who spends time with a free app and doesn't know about bandwidth charges.
Lesson 3: What goes around comes around II. Life in 140 characters.
There wasn't much you could say within the limits of the Prestel system, but one venerable news organization adjusted very well, creating short snippets of news. Which is why I blogged in March 2009 that the Economist had invented the tweet without knowing it.
After a few months at UNS, I was invited to lunch at the Canadian High Commission in London, which was recruiting Brits working on Prestel to come to Canada and work on the competing, Canadian-developed Telidon system. After a little wine, some good food and persuasion from the diplomatic corps, I decided to head home. A few months later I was back in Toronto.
My first job was with the Southam Infomart project. Southam was then the largest Canadian newspaper chain. How Southam ran Infomart was probably the first example of how a large media corporation can completely screw up a project. (Knight Ridder was running its own experiment in the US, and its project was shut down at about the same time. I have no direct knowledge of how KR ran its videotex project, but from the few online comments I have seen, it appears KR did not make the horrendous mistakes Southam did.)
I was there just a few months before a series of layoffs began; the project was failing, and failing quickly. After a couple of months of unemployment I was hired by the CBC's parallel teletext experiment, Project Iris.
Lesson 4: Engineers know nothing about content. Neither does the sales force.
Although Southam was a content company, a newspaper chain with a storied and respected history in Canada, it abandoned management of its first new media experiment to the techies, in this case a group of former IBM middle managers (who kept telling us, the content staff, “This is what we did at IBM”). The other key figures were the sales staff, who somehow convinced Sears to put its soon-to-be-released 1982 catalogue on the system, despite the fact that the graphics were primitive. So the majority of the company's effort was an early experiment in e-commerce. Only there was no audience for the service; there were no sets in homes. Bell was planning to offer the service, but even then we asked who would take it (although we were optimistic it would take off). Even then I had to wonder: what were they thinking? At least in the UK the Economist created readable content for Prestel. The news content at Infomart didn't even come from Southam; they picked up a raw feed from the Broadcast News wire, without stripping the headers and with no index so a viewer could find stories.
As for CBC Project Iris, it too was managed by engineers, since the funding came from an agreement between the Department of Communications and CBC Engineering headquarters in Montreal. Unlike Southam, the Mother Corp did not cede editorial control to the engineers, so there was a small but very real newsroom repurposing CBC content for the service, which did have an audience: 200 test homes. Later we also had an American audience, since CBS was also testing teletext and one of its test sites was WIVB in Buffalo, with 50 test homes, which meant each audience (if it wanted) could see the other's feed. So the CBC project continued long after the Southam project died, until it was killed by Brian Mulroney's budget cuts.
So thirty years later, what goes around comes around. Media and content organizations are still often under the thumb of engineering departments, only now they are outside vendors and engineers, whether it is Google's arcane search algorithms or page and layout designs created for the web, tablets or phones by software engineers with no background whatsoever in content.
Then there is Steve Jobs, until recently the CEO, but still the godfather, of Apple, giving desperate media companies offers they cannot refuse: demanding that they charge for content on the iPad so Apple can get its 30 per cent cut, content that Apple says it can censor at will. Of course, there were dozens of tablets at the 2011 Consumer Electronics Show, but the question is how many of those tablets will survive the evolutionary competition, and whether one of them succeeds by giving the media companies a way of saying no to the godfather from Apple.
Lesson 5: Apps, brought to you by the butterfly effect.
In physics, chaos theory is summed up by the phrase “sensitive dependence on initial conditions” (or: if a butterfly flaps its wings in one part of the world, it triggers a hurricane across the world). In the days of videotex, there were no homes with sets in North America, so the companies experimenting with the technology had to make some money somehow. They came up with the idea of putting videotex sets in malls as a sort of electronic guidebook. Among the best commercial clients for videotex in the early days were restaurants. The content could be produced easily: menus were mostly text, and restaurant pages did not really need the photographic-quality graphics that made the Sears catalogue project a failure. So the idea was to have a guide to the restaurants in a large mall or perhaps even a neighbourhood.
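For readers who want “sensitive dependence on initial conditions” made concrete, here is a small sketch of my own (a standard textbook demonstration, nothing to do with videotex): the logistic map, where two starting points that differ by one part in a billion soon produce completely different trajectories.

```python
def logistic_orbit(x0, r=4.0, steps=40):
    """Iterate the logistic map x -> r*x*(1-x) and return the trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two starting points differing by one part in a billion...
a = logistic_orbit(0.400000000)
b = logistic_orbit(0.400000001)

# ...are indistinguishable at first, then end up nowhere near each other.
print(abs(a[1] - b[1]))   # tiny
print(max(abs(x - y) for x, y in zip(a, b)))  # large
```

The tiny initial difference roughly doubles at every step until it is as large as the whole range of the map, which is the butterfly effect in one line of arithmetic.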
How do you make it easy for people to use the system? The engineers came up with a brilliant solution. Touch screens.
The problem was that in the period 1980-1984, touch screens in malls and offices were a total, utter, complete and costly failure. Why? Because idiots, whether they were teenagers or adults who hadn't grown up, were constantly stubbing lit cigarettes out on the touch-sensitive part of the screen. A single cigarette could destroy a computer system costing thousands of dollars. The videotex booths disappeared from malls almost as quickly as they had appeared.
So think about this. Over the past 30 years, smoking has been banned indoors, in malls and in offices, because of the proven connection between cancer and second-hand smoke. With little historical memory of the videotex failure, it is perhaps a lucky coincidence that second-generation, PC-based touch screens began to appear in government and corporate offices at about the same time as the smoking bans. The success of large touch-screen systems allowed the development of apps on smaller smart phones and tablets.
Smoking bans not only made the air cleaner and saved lives from second-hand smoke; they likely also brought you the apps you finger on your Android phone or your iPad.
One last note: today there are apps for your smart phone that use the GPS interface to find nearby restaurants and their menus. The concept was right, just 30 years too early.
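The modern version of that 30-year-old restaurant-guide idea can be sketched in a few lines. Assuming a hypothetical list of restaurants with latitude and longitude (the names and coordinates below are invented for illustration), a “nearby” search is just a great-circle distance calculation:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on Earth, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical restaurant data; a real app would pull this from a database.
restaurants = [
    ("Mall Food Court", 43.6544, -79.3807),
    ("Harbourfront Bistro", 43.6387, -79.3816),
    ("Airport Diner", 43.6777, -79.6248),
]

def nearby(lat, lon, places, radius_km=3.0):
    """Return names of places within radius_km of (lat, lon), nearest first."""
    hits = [(haversine_km(lat, lon, plat, plon), name)
            for name, plat, plon in places]
    return [name for d, name in sorted(hits) if d <= radius_km]

# A phone near downtown Toronto finds the two close spots, not the airport.
print(nearby(43.6532, -79.3832, restaurants))
```

What took a mainframe, a mall kiosk and a touch screen in 1982 is now a distance formula and a phone's GPS fix.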
So when you're developing a technological innovation, remember that success or failure may depend on something that has absolutely nothing to do with how fast your hardware is or how good your code is. It may depend on something like a bunch of executives lying at a congressional hearing in Washington about the addictive properties of nicotine.
In North America, most of the videotex and teletext projects in both the United States and Canada died between the fall of 1984 and the spring of 1985. The official reason was budget cuts, whether the project was in the public sector or the private sector. The real reason, of course, was that the growth of the personal computer had made videotex obsolete, while multichannel cable television was quickly becoming highly profitable, especially through carriage fees on cable channels, so teletext was just not worth developing.
Lesson 6: Experts are often blind to the world around them.
Over the past 30 years, companies and governments have often been blindsided by an “unexpected” technological development. The latest example, of course, is WikiLeaks, which, in retrospect, could have been foreseen as a by-product of putting all records in electronic form.
The videotex and teletext systems began development in the UK (Prestel) and Canada (Telidon) in the mid 1970s.
The statement attributed to Thomas J. Watson of IBM, that the world would only need five computers, is an urban myth. In the 1950s and 1960s, IBM was concentrating on large, expensive mainframe machines to be used by universities and corporations. It was clear that a machine that rented for between $12,000 and $18,000 a month (in 1953 dollars) would be totally inaccessible to the general public.
Even by the 1960s, there was growing public interest in computers, and there were visionaries who began looking for a way to involve the public, create a market, give access to information and even make a profit. The solution was videotex. The computer keyboard had already been developed. Add some memory, make the keyboard a little smarter, connect it to a TV set (already in every home) and then by phone line to a mainframe computer (usually IBM for videotex and DEC for teletext), and lo and behold, the public would be introduced to the world of personal computing.
So when I first became interested in videotex in London in the winter of 1981, and when I returned to Canada in the fall of 1981, I was told by the companies I worked for on both sides of the Atlantic, and by other people in the industry at meetings, that all the experts believed it would take 20 years of slow but steady improvement of the keyboard-phone-line-mainframe system before there was a viable personal computer system.
In 20-20 hindsight, Monday-morning quarterbacking, the failure of videotex was certain. Steve Wozniak had introduced the first Apple II personal computer in June 1977, followed by the Apple II Plus in June 1979. I had actually considered buying an Apple II Plus that summer of 1979, before I headed for London (it was too expensive, especially for an impoverished freelancer). While I was working in videotex, IBM, the maker of the mainframes used by some of the videotex systems, was already developing the personal computer. In August 1981, as I resigned from UNS and went for a two-week vacation in Greece, IBM launched its first personal computer. There were competitors: the Atari and Commodore systems; the Tandy TRS-80, the “Trash 80” that many techy journalists of the era fell in love with; and CP/M machines like the Osborne I bought in 1983, while I was still working at CBC Project Iris. The introduction of the IBM PC XT in March 1983 (I saw it at a trade show in Toronto that month), with its amazing 10-megabyte internal hard drive, the first truly consumer-friendly PC, meant videotex was doomed.
As I said, what goes around comes around. It’s thirty years later and what, apart from the tablet, was hot at the Consumer Electronics Show this year?
One big item was a really old-fashioned idea, obsolete for more than a quarter of a century: connecting your television set to a computer system and giving it a keyboard. Of course, it is a high-definition set, and one of the reasons to connect to the Internet is to download movies, but the system also gives the user complete access to the World Wide Web. If one of those experts from 1981 had been caught in a time warp and suddenly reappeared in a living room at Christmas 2011, where the family gathers around to watch a downloaded movie on an HD set and check their e-mail at the same time, that expert, with no knowledge of what had happened in the previous three decades, would have thought the prophecy had proven true. (And given that the telecoms want to charge more for all that bandwidth to download a movie, that too might bring back memories for our time traveller.)
After Project Iris was killed by Brian Mulroney's cuts, I kept my connection with developing tech through my new Osborne. I wrote my first book, King of the Mob, on that four-inch screen. In October 1988, I joined CTV News as a writer on the CTV National News.
Lesson 7: Beware of software executives bearing gifts.
At CTV in that period, 1988 to 1994, the TV news writing software was awkward and primitive compared to the expanding, consumer-friendly software being created for the growing PC and Mac markets. A company named Columbine had created mainframe-based software for tracking commercial sales and placement. The company threw in the news writing software as an added inducement for bean-counting corporate executives to buy the commercial tracking system. While Columbine may have had some expertise in tracking commercials, the news writing software was a mash-up.
Add-on software is, in most cases, a very bad deal.
It is exactly the same situation with Novell GroupWise, which is certainly not the best e-mail client in the world; but because it is bundled with Novell's networking software, which seems to work well, many companies force their employees to use GroupWise even though there are much better products on the market. Why would any company in its right mind spend all that extra money licensing GroupWise per workstation, on top of all the money it pays for Novell's networking software, when better products such as Thunderbird are available, not to mention Gmail? During the CBC lockout, we created a duplicate of the CBC GroupWise system using Gmail, at no cost (and it worked better).
Lesson 8: Managers should always consult the people who actually use the hardware or software.
I can't count the number of times that media managers, on the advice of consultants, fast-talking software salespeople and sometimes even IT people, have imposed software and/or hardware on staff without asking them whether it actually works for what the company wants to do with it. One of the few times staff were consulted was at CTV News, when management brought us in to see what they thought was a great piece of TV news writing software to replace the much-hated Columbine. It was a good piece of software, but as the salespeople enthusiastically ran through its features, my techy alarm bells started ringing, and I began asking questions: how would the lineup editor and the producer communicate if one was at the main desk and one in the control room, and how would the writers work with the lineup editor? What management didn't realize, until I, the user and techy guy, began asking the questions, was that the vendor was presenting software that was really good for a small local station (the vendor's client base in the US) but totally inadequate for a network news operation. They didn't buy that software.
In the fall of 1993, I began co-writing the first book on researching on the Internet. It was a rather exciting time to be writing that kind of book, just as Mosaic and later Netscape opened up the World Wide Web. It was also the time that both the PC and the Mac were taking off, with hundreds of small new companies in fierce competition with each other just to survive.
Lesson 9: Software vendors will always promise you the moon, the stars, and a galaxy far, far away.
Software salespeople haven't changed in a quarter of a century. They promised you the moon with a 10-megabyte hard drive PC in 1983, and now in 2011, with mobile phones at genius level compared to the computers that actually sent NASA to the moon, they promise you the stars. Whether it's 1983 or 2011, the software guy who comes to your office or greets you at a trade show (even these days, it is still usually a guy) is wearing a company polo shirt and nicely faded blue jeans, sounds more like a California surfer dude than a geek, has a big smile, is so good-looking that he may also be registered with Central Casting, and so really loves his tech that he believes his product is the greatest thing since the invention of the silicon chip and COBOL (look it up on Wikipedia).
Caveat emptor. That’s Latin for “let the buyer beware,” which leaves one wondering, given that the Romans were such good engineers, if there were tech trade shows in the Coliseum when the gladiators had a day off.
The surfer dude salesman's supervisor also wears the company polo shirt but sports dress pants, is in his late 30s, maybe wears glasses, sounds more like a professor and is geekier than his sales staff. He was probably the good-looking kid at a trade show long, long ago and far, far away, and stopped going to the gym when he was promoted or married or both. His role, of course, is like that of the boss in an auto dealership, the sales manager offering you “the deal” the salesperson can't. If you were wearing a media badge, that usually meant the software was free. For anyone else, the manager has visions of a ten-thousand-workstation contract. The pitch is always the same, whether it is 1983 with the first PC, the multitude of tablets at CES 2011 or the new, new thing at whatever trade show is hot in 2021: our software is the greatest thing since the creation of the universe. After a while, to the jaded veteran, it all sounds exactly the same.
There is one lesson that holds true, for hardware or software, in 1983, in 2011 or in 2021. Never buy Version 1.0. Never! (At least in the beginning, in 1983, Version 1.0 was usually stable, if incomplete, with minimal features. These days, with the rush to market and the pressure for profit, Version 1.0 is actually closer to Beta 0.56 Build 1066.)
Lesson 10: One of the great failures of the mainstream media was its lousy coverage of the software industry.
Again, with 20-20 hindsight, it is easy to see that an early indication of the coming failure of the mainstream media was not its adoption or failure to adopt new technological innovations, but its failure to cover the software industry the way it was then covering the police beat, city hall, and provincial, state and federal governments.
When I was asked to write Researching on the Internet, I had already been following tech for a decade. I knew everything was changing at high speed. The solution was not to create a software manual, impossible in any case because, unlike Version x.x of a piece of software, the web wasn't static. My idea for the book (especially since it was written in a time of transition) was to give the reader some basic principles so that they could work with the web for as long as possible. The idea was right, because Researching on the Internet stayed in print and kept selling (and making me a profit; the book “earned out”) long after the actual software it described had been replaced by new versions.
So with that in mind, when I approached software companies, my questions were similar to those I had often asked as a reporter: of police, of city hall, of the big industries in town and in the locker room. Software companies traditionally kept their developments secret so as not to reveal them to competitors, which is perfectly understandable. The problem was that most software companies were used to uncritical coverage as they announced their latest products. They were not expecting even the mildest critical question, of the kind even a local sports reporter who was perhaps too close to the home team might have asked a hockey coach about his plans (or lack of them) for the coming season.
I remember meeting an executive of one then-prominent software company who turned pale at some of my pretty innocuous questions and quickly palmed me off to a PR person, who simply repeated how good their products were and showed me to the door. (It later turned out that the company's financial position was not as good as it claimed, and it was eventually sold.)
One area that was generally ignored by both the mainstream and the computer press (the latter dependent on advertising from software companies) was softcide. Softcide was a common practice during the boom of the 1990s: a company with deeper pockets bought a company with a better product, then killed that product, so that the next so-called “upgrade” left angry customers being offered the inferior product while support for the better, now orphaned software was abandoned. The business press was even worse, usually caring only about the stock price and not the actual management of these companies.
It was only when some of those outraged customers, computer writers, former employees and sometimes current (and anonymous) employees, branching out on their own, began blogging with inside scoops on the software industry that the mainstream media caught up (and even today the MSM is too often dependent on those bloggers).
In 1994, I returned to CBC, where I would work as a TV lineup editor, then web writer and producer, and later photo editor. I watched as online news started as a hole-in-the-wall closet-office experiment, then became a small team working and changing on the go, until, like all other online news operations, it was finally folded into the corporate machine.
Lesson 11: Team should mean team.
Team has become a cliched buzzword. Software companies and your ISP sign off their messages with “the X Team,” and spammers take advantage of the team cliche. (I have received auto-spam from the “robinrowland.com team,” not bad for a one-man operation.) At the same time, television news, using the same cliched buzzword, promises “full team coverage,” as does every other TV station in town. Not to mention the newspapers.
One has to wonder why the executives, whether in software or the media, are so blinkered that they actually believe that the public pays attention to this constantly repeated nonsense.
A good newsroom has always been a team, going back 150-odd years or more to the first major newspapers. Software, with its often millions of lines of code, is also a team effort.
In many cases, bean-counting management, applying cost-benefit analysis, has undermined team efforts in both industries with staff cuts, ignored morale problems and bureaucratic headaches, all while maintaining a message track of “team effort.”
Like all cliches, like all message tracks, the team analogy is based on truth. In the 30 years that I have worked in new or online media, the system worked best when the IT staff were present in the actual newsroom, rather than on another floor or even in another city. In a couple of cases, it was a single person who worked with us on developing projects. In another case, the IT staff, programmers, network administrators and hardware geeks were crammed into a small office with the news staff, because there was no room for them anywhere else.
In all three cases, the majority of the IT staff saw what we were trying to accomplish and worked their butts off to make sure the system they had created did what it was supposed to, especially when there were problems getting stories up on the web during breaking news and the miracle workers created instant workarounds.
Unfortunately, when the IT people eventually got their own office, they soon lost interest in what the newsroom needed, and their aim became fulfilling the IT department's priorities and the demands of IT culture. It got even worse when bean-counting management consolidated IT network and technical support in a call centre in a city hundreds of miles away, staffed with people who never had any concept of what the media staff were trying to do. (At least the call centre was in Canada, not Bangalore or Kuala Lumpur.)
IT culture at its best can be creative; at its worst it is a bureaucratic nightmare. Unless there is a symbiotic relationship with the actual productive staff, when the IT culture is separate from the newsroom culture, the system breaks down. It's as if the journalists are the leopards and the IT staff the lions, or the journalists the orcas and the IT staff the sharks: similar creatures in a similar environment, but with different and often competing goals.
The worst case of IT disconnect came in 2001. At one major news organization, the IT staff had scheduled a network upgrade for September 13, 2001. The idiots were so blind that the upgrade went ahead regardless of the events of two days earlier, on September 11, and the entire system slowed to a crawl. The IT honchos were rather put out at the escalating calls of complaint, starting with front-line news staff and rising to senior news management, when the network upgrade didn't work properly.
The journalism programs at Columbia (the Tow Center for Digital Journalism) and New York University are currently working on programs and curricula that will create “journo-programmers.”
I was one of the first journo-programmers myself. After I returned to Canada from London, I took a programming course at York University. It being 1981, I programmed using punch cards. The course was invaluable: because I always had a basic understanding of how computers worked, I was always able to adapt to new innovations.
There's one problem, however, with what Columbia and NYU are attempting. There is no mirror-image curriculum where the IT people are trained as programmer-journos (or programmer-doctors, programmer-cops or programmer-millwrights, etc.). While it is good that a young journo-programmer knows the basics of code and/or how to run a server, it is not going to do that young man or woman much good when they come up against corporate IT and its priorities. The journo-programmer may know what he or she is talking about, but if history is any guide, in most cases they will be ignored by IT.
Many corporate IT people still believe that anyone who calls to report a problem is the cliched dummy who puts a coffee mug in the CD drive tray and knows nothing about the system. I say many because my geek colleagues and I always made it a point to find out who the better and more responsive IT people were and, when possible, went directly to them.
We always joked that the best training for dealing with corporate IT was watching M*A*S*H. Unfortunately, in too many cases, those better IT people soon left: because media IT salaries were low compared to other areas, because other companies recognized their talent and hired them away, because they couldn't stand the stultifying bureaucracy, or because they were fired by bosses who didn't want employees smarter than they were.
I have always thought that at any company, no matter what the product or service, all IT staff should be required, as a condition of employment, to start at the bottom, at least for a month, and work in the company's main product or service line. However, that dream for the working staff (and perhaps a nightmare for the IT staff) will likely never happen.
Throughout my career, and this is a good reason to have journo-programmers, if we could avoid working with the IT people on the other floor, we did our own workarounds.
Of course, if the news staff and the IT were truly a team, then there wouldn’t be these kinds of problems.
It soon became apparent at those news organizations that were early on the web that they had to quickly expand their staff beyond the pioneer geeks.
That's when the broom-closet IT staff created the first template systems, which grew into in-house and later outside-vendor-supplied Content Management Systems. Those Content Management Systems meant a whole generation of journalists working on the web never actually had to understand the nuts and bolts of how the system worked. They simply showed up for work and wrote their copy or uploaded their photos and video in a system that, to them, was not too different from the typewriter of an earlier generation. (That is, if the system actually worked. Again, senior management was too often seduced by the promises of software vendors and bought expensive CMSs that were not suitable for the news, TV or magazine media.)
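The core idea behind those early newsroom template systems fits in a toy sketch (my illustration, not any real CMS): the story fields live apart from a fixed page layout, so the journalist only ever touches the content while the template controls the presentation.

```python
# A fixed page layout: the "design" half of a primitive template system.
TEMPLATE = """<html>
<head><title>{headline}</title></head>
<body>
<h1>{headline}</h1>
<p class="byline">{byline}</p>
<p>{body}</p>
</body>
</html>"""

def render_story(headline, byline, body):
    """Pour the story fields into the fixed page template."""
    return TEMPLATE.format(headline=headline, byline=byline, body=body)

# The journalist supplies only content; the layout is untouched.
page = render_story(
    headline="Videotex booths vanish from malls",
    byline="Staff writer",
    body="Touch screens and lit cigarettes proved a costly mix.",
)
print(page)
```

Swap the layout string and every story on the site changes its look at once, which is the whole appeal, and the whole danger, of handing the template to someone outside the newsroom.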
Lesson 12. Be aware of the innovation cycle and be prepared for it.
As everyone who works in the media knows, the business is mired in a deep crisis, and that crisis is getting worse as innovations appear almost daily, with corporate news executives flopping around like fish out of water in their efforts to catch up.
After about a decade of relative stability, from the late 1990s to the late 2000s, following the introduction and then the maturity of the World Wide Web, the past few years have brought Facebook, then Twitter, then the smartphone, then Foursquare, then the iPad and now Quora.
This is reflected in the Twitter hashtag #futureofnews. I quickly noticed something about those posting on #futureofnews (I admit this is unscientific and anecdotal, but perhaps someone looking for a PhD dissertation can quantify it).
There is, as far as I can tell, an age-related reverse bell curve among those posting on #futureofnews or #journalism and discussing the survival of the news media. The majority of posters are either in their 50s and 60s or in their 20s, students and young journos.
On the older side are people I met at the Computer-Assisted Reporting conferences in the heady days of the early 1990s, or who appeared on the early CAR and online-news lists, like Dan Gillmor, Steve Yelvington, Danny Sullivan and Steve Outing; gurus from then and now like Don Tapscott; and other, slightly later pioneers like Jim Brady (@jimbradysp) and Jeff Jarvis. On the rising side of the reverse bell curve are the younger posters, people in their 20s, like Adam Westbrook and Cody Brown.
Why is that? News management these days might like to believe that anyone over 40 is obsolete as far as new media technological innovation is concerned.
Not so. My contemporaries, call us the over-the-hill gang or the geeks from Cocoon, if you wish, were part of an innovation cycle, one where we had to adapt to something new every day. While there are people in their 30s and 40s on #futureofnews, they are usually not the most frequent posters. Most of them came into journalism during the relatively stable and mature period of the World Wide Web, from approximately 1996 to 2006.
It is the generation from 18 to 28 that faces the greatest challenges. It is a time of economic crisis for all of society, and even more so for the news media, at a moment when technological innovation is proceeding at warp speed. (After all, the previous generation, my generation, faced innovation at a time of prosperity and, apart from a couple of downturns, economic stability.)
That is why the new generation of journalists and journalists-to-be are the most frequent posters on #futureofnews, and that is where the most productive feedback and mentoring occur between the previous generation that faced an innovation cycle and the current one.
I am not optimistic that the current (mostly aging) corporate news management can adapt to the economic downturn, the increasing pace of technological innovation and, for the West, especially a United States too long comfortable at the top of the heap, growing international competition.
If even a few executives come to realize that we are in a period of evolving media (as I discussed in part one of this blog), some of the better media outlets will likely survive.
As for the long-term survival of traditional journalism, the kind that tells the world both what it wants to know and what it needs to know, it is likely that, if anyone saves the craft and the profession, it will be someone who right now is 19 or 21 or even 28 who will discover the key to future success.
If they want the help of an old veteran, I’ll be glad, grasshopper, to tell them more tall tales of punch cards, four-inch screens and hand-coding HTML news stories. The world is different, but, as I have said, what goes around comes around, so I write in the hope that the Tao of News will give them some ideas on how to be flexible and adaptable in the face of the latest new, new thing, and how to deal with bean-counting managers and corporate IT call centres, so they can do what’s really important: cover the news.