Wednesday, December 28, 2011

Only the Brave

Steve:

I've been in this storytelling business longer than I care to remember, and I've seen many comrades along the way come and go.  You could credit this to a million individual stories that seem to have nothing in common.  Business downturns.  Failed books.  Lagging series.  Crooked agents.  Bankrupt publishers.  Money.  Health problems.  Family problems.  Lack of sales.  Imploding genres.  Failed magazines.  And on, and on, and on.

But mostly all those stories come down to one story:  Somebody had, for one or more of a huge variety of reasons, a failure.

And they gave up.

Now, you could say the reason they gave up was failure.  But you would, nine times out of ten, be wrong.  I know this from experience.  What stopped them was fear of failure.  Fear of failing again.

I look at this myself, and I'm not always happy at what I see, and where I am as a writer.  I've published a lot of books and stories.  I'm a national best-seller twice over.  But I'm not a household name.  I'm comfortable, but not rich, and not as secure as I'd like to be.  I've sure not written as much as I'd like to have written, and certainly not as much as I could have written.

And I've failed.  A lot.

But the difference between me and those other folks is that, while I've often spent far too long lying in the mud, looking up at the sky and feeling sorry for myself, I've always eventually gotten up, dusted myself off, and trooped on.  And every time I did, it was with the full knowledge that I was going to fail, again.

Two things prompted this post.

First of all, my friend Dean Wesley Smith just did a post in his "The New World of Publishing" series that touches on this subject from a somewhat different perspective, and in a much broader publishing context.  But I'm going to keep this simple and direct (which, ironically, I think broadens the application of what I'm saying to a whole range of endeavors beyond writing and publishing).

The second thing was a list on THR, the Hollywood Reporter website, of "2011's Biggest Rule Breakers."  What interested me most was entry #9, George Clooney.  Here's what it said:

Nominated in actor, writer, producer and director Golden Globe categories for his work in The Descendants and The Ides of March, Clooney still admits he's "afraid of failure."

Clooney told THR, "I failed so many times, I have a much better understanding of the journey. It's how you handle the down part [that counts]."


That's what I'm here to talk about, Clooney's "understanding of the journey."  Clooney knows, like me, that failure is inevitable.  He also knows the sad truth that all of us who have made the journey have discovered: there is no magic point, no level of success, at which you're immune to failure.  Remember that Clooney is a man who, having already achieved Hollywood stardom, headlined the movie that tanked the monster (a fair chunk of a billion dollars at that point) Batman franchise!

That one film could have been a perfect career killer, and for many an actor, it would have been.  But here's the thing: when you say the name "George Clooney" to a random person on the street, a lot of things may pop into their mind.  But it's highly unlikely that one of those things will be, "oh, that guy who killed the Batman franchise."  In fact, you probably were reminded of this fact only after I mentioned it, and most likely, the memory created a momentary feeling of surprise.  Probably you thought something like, "oh, he was in that turkey, wasn't he?"  Then you probably chuckled and started thinking in terms of Clooney's many accomplishments, nominations, and awards since.

How did this massive failure turn into a relative footnote in a distinguished movie career?

Well, the first answer is that George Clooney didn't give up.  And as he says himself, it isn't because he has no fear of failure.  He does.  But he's learned to deal with that fear and keep moving.  He's learned that to let a failure stop you, even for a while, is to give it power, to make it bigger.  The thing about failure is that it isn't just the kind or degree of failure that makes it significant, it's the position of the failure in the narrative of your career.

It's like punctuation.  The most powerful punctuation in an English sentence is always at the end of the sentence.  The most powerful failure is the one at the end of your career.  The next most powerful punctuation is the kind in the middle of the sentence, between words, the kind that represents a pause.  Commas are important, but we brush past them without any conscious notice most times.

Other punctuation marks, like the semicolon, the ellipsis, and the em dash, represent longer or more significant breaks or transitions.  But again, we move past them.  The most powerful punctuation marks, the ones that define the entire nature of the sentence, are at the end.  The exclamation mark!  The period.  And...the question mark?

Failure is like a generic punctuation mark.  Once it's happened, you can't remove it, but you do have some power to define which mark it will be.  If you let it stop you for a bit, slow you down, throw you off, then it becomes one of those mid-sentence marks.  The faster you move beyond it, the longer the narrative that follows it, the more likely it is to become a comma, passed over, significant but barely noticed.

Let that failure hang through inaction, and you give it power.  I admit, my own career is a mess of semicolons and em dashes.

And woe, if you let it be the end of you, or even the apparent end of you.  Think of it: "Batman and Robin, the movie that killed George Clooney's career!"  Let that happen, let it stop you too long, and the termination mark will stick, even if you don't.  If you move on beyond that career-killing mark, you will always be known as "that person who made (or at least attempted) a comeback."  That's a very risky label to be carrying around.

What can we learn from this?

1. Not stopping is your greatest power over failure, your most effective means of damage control, and your surest path to recovery.

2. By not stopping, you take control of the narrative, which really isn't set in concrete till that final mark.

3. By not stopping, even if you drop off the radar and people forget you exist, when they do discover you again, it will be apparent that you didn't stop just because they stopped paying attention.  You continued to write (or act, or direct, or paint, or whatever).  The failure to notice your good work and value becomes theirs, not yours.  Even if failure means you can't immediately work in the same place, or at the same commercial level as you did before, it is the continued forward motion in your field that counts.

4. Even though you think the narrative is about you, you are not the entire narrative, and the rest of that narrative will continue to evolve without you, possibly to your advantage.  While Clooney continued to work (often on less commercial films), fending off the terminal mark, the Batman franchise didn't actually die.  After a pause, it was rebooted to even greater commercial and critical success with Batman Begins.  Batman and Robin turned into an em dash for the franchise instead of an exclamation point, which in turn lessened its impact on the narrative of Clooney's career.

5. Dwelling on past failure is pointless.  Learn what you can from it, and move on.  Obsessing about it further will only slow you down, or worse, stop you.  Move on, as soon as you can, as fast as you can.  No matter how massive the failure, it will be diminished by time and distance.  You have no power over time, but you can stretch the narrative of your career away from it, limited only by the speed with which you can create new work.

6. Dwelling on future failure is even more pointless.  It will happen.  But take comfort, you are not alone.  We all go through it, and we will all go through it again.  It's not to be welcomed, of course, but it can be managed.  If you are smart, if you know what to do, if you know how to handle it, and if you don't give up, you are in control.

Period.

If you've found this post interesting or useful, please share the link and consider making a small donation using the button below.  It will encourage us to do more posts like it.  Or better yet, purchase one of the many books written by J. Steven York and Christina F. York (who also writes mystery as Christy Evans and Christy Fifield).  Ebooks are available on Kindle, Nook, Smashwords.com and most other major ebook vendors.  Visit our Amazon Store.

Saturday, November 19, 2011

The (Apparently) Forgotten History of the Personal Computer

Photo: Bluedisk
The recent death of Apple co-founder and former CEO Steve Jobs generated a huge amount of press coverage, much of it very bad.  In particular, my sampling of this coverage made it obvious that many, if not most, of the people charged with writing about computers, technology, and their associated industries, have not a clue about the history of personal computers.

At every turn, people got it wrong.  Jobs's (many) contributions to the computer industry were misstated, distorted, or exaggerated.  Perhaps most egregiously, a post on a major newspaper's site credited Jobs with "inventing the personal computer," a statement so wrong, and so wrongheaded, on so many levels that my brain hurts just trying to count them all.

It's not surprising, really.  Most of this stuff happened before many of these journalists were born.  Many of them can't even remember a time when personal computers didn't come down to the "Macintosh and everybody else who runs Windows, except a few hippies and nerds who run Linux."  I, on the other hand, am a bit older, and was there pretty much at the beginning of it, as a hobbyist, a salesperson, and later, as a writer for computer magazines (and computer software companies).

Fact is, Apple had a lot of history before the Macintosh came along, and the Macintosh, as influential as it was, was for most of its history a niche product that never came close to dominating the personal computer business.  And that business existed before Apple did.  Apple was an early player in the personal computer business, but it was a late bloomer that, despite its niche success, was usually overshadowed by one competitor or another, several of which will probably surprise you.

Let's test your knowledge of computer history.  (Are you out there, computer/technology journalists?)  Here's a list of nine important, game-changing, ground-breaking computers, all of them predating the Macintosh.  Do you recognize them and can you state why each of them is significant to computer history?  We'll give you a few minutes here...

. MITS Altair 8800
. TRS-80 Model 1
. TRS-80 Model 100
. Commodore 64
. Osborne 1
. Compaq Portable
. HP 110
. Xerox Star
. Apple Lisa

Da-da-da-da
Da-da-da
Dada-dada
Da
Dadadadada


Da-da-da-da
Da-da-da
Dunt. Da-dunt-dunt
Dum.  Dum. Dum.
Dadum!


Time's up! Keyboards down everyone!  No?  Not ringing any bells?  Only ringing a few bells?  Well, then perhaps your understanding of the history of the North American computer business isn't as complete as you think it is.  A lot more could be written about these computers, their history, and significance, but here's a quick rundown.

. MITS Altair 8800 (1975)
The Altair was the first widely known (it gained fame through hobby magazines like Popular Electronics) and widely available (it was sold by mail order) commercial microcomputer.  It's pretty much the spark that ignited the whole personal computer revolution.

The maker expected to sell a few hundred, but soon had thousands of orders.  The Altair was also the subject of the first computer "clone," the copy-cat IMSAI 8080.  Among its many contributions to computing, the Altair also introduced the S-100 bus, a standard system by which accessory cards and peripherals could be used interchangeably between various makes and types of computers.  It's also significant as the hardware on which the very first Microsoft product ran, a version of the BASIC computer programming language written by Microsoft founders Bill Gates and Paul Allen.

. TRS-80 Model 1 (1977)
Photo: Flominator
Radio Shack is best known today, if it's known at all, as that place in the strip mall where you buy obscure cords, adapters, and accessories for your electronic devices.  And so it's hard to imagine that Radio Shack was once not only a major player in the personal computer business, but for a brief period actually dominated it, especially the business part of the market.

In the mid-1970s, Radio Shack was already a hang-out for electronics hobbyists, ham radio operators, and home-audio geeks, and was riding high on the popularity of the CB-radio craze, when they made the bold decision to go into the then brand-new microcomputer business.  Now, three major players entered the market in 1977, all significant in their way: Apple, with its Apple II (yes, there was an Apple I, a kit computer, but few were made, and it's a bit of a footnote); Commodore, then a major maker of pocket calculators, with its PET computer; and Radio Shack, with its Model 1 (then known simply as the "TRS-80," the other designation only being added when additional models joined the line a bit later).

But the huge advantage Radio Shack had over its competitors was in its retail outlets.  While the number of dedicated computer stores in the country was probably no more than a few dozen, there were thousands of Radio Shack stores spread through America's heartland and small towns.  And every single one of those stores was shipped a single TRS-80 computer.

Again, the level of interest and demand was underestimated.  While personal computers had been around for two years at this point, the TRS-80 was the first personal computer most Americans would ever see, much less touch or use.  Radio Shack was a known and trusted name, and walking into an established store and buying a computer in a box was much less a leap than going to a fly-by-night computer store in a major city, or ordering some unknown quantity by mail.

Moreover, the TRS-80 entry model was under $500, monitor included.  Still a lot of money in those days, but far less than the $1298 entry Apple II (with no monitor).

Radio Shack's dominance of the market wouldn't last long.  Commodore would soon dominate the home computer market, and Apple would finally take off as a business computer (largely because it ran the first, and for a while only, spreadsheet program, VisiCalc).  Shortly thereafter, IBM would crush everyone in the business side of things, and break Radio Shack's back in the computer business.  Apple would soldier on in the educational and high-end home market, and eventually make a slow comeback with the Macintosh, but Radio Shack never recovered.  Still, in bringing the computer to Main Street, Radio Shack made everything that followed possible.

. TRS-80 Model 100 (1983)
Photo: NapoliRoma
While Radio Shack's dominance of computers was brief, it still had a few significant contributions to make to the history of computing.  Though little remembered today, the TRS-80 Model 100 is likely the most significant.  The Model 100 was the first successful and practical laptop computer.  (Technically, it was preceded to the market slightly by the similar-looking Epson HX-20, but that machine, hampered by a tiny screen and a lack of built-in software, never enjoyed the success of the Radio Shack competitor introduced later the same year.)

Innovative and forward-thinking, the Model 100 has a few specifications that remain impressive even today (especially battery life).  It was rugged and capable, weighed just over three pounds, and came equipped with an LCD screen, a variety of built-in software (including communications, basic word processing, and a spreadsheet), a decent keyboard, instant-on operation, and a modem for long-distance communication.  It was capable of running 20 hours (and holding its memory contents for up to 30 days) on a set of four standard AA batteries.  It became especially popular with journalists, scientists, engineers, and others who needed a computer that could be used in the field, sometimes beyond the reach of civilization.

Actually, the hardware was designed by the Japanese company Kyocera and the software by Microsoft, and other versions were sold under the Kyocera, NEC, and Olivetti brands, but it was Radio Shack and its retail presence that ushered the machine to success.  Microsoft founder Bill Gates has said it was the last Microsoft product in which he did significant programming, and one of his favorites.  Interestingly, it was probably the first successful consumer computer (certainly one of the first) to use the now almost-universal "cut" and "paste" metaphor.

. Commodore 64 (1982)
Photo: Evan-Amos
It's easy to overlook the Commodore 64.  At the time, many computer enthusiasts (myself included) dismissed it as a "toy computer," and it mainly found its place in homes, not in offices or labs.  But it was a powerful computer for the time, with more memory than many of its competitors (especially when compared with their base models) and built around an improved version of the processor used in the better-respected Apple II family.

The Commodore 64 was produced continuously for over a decade, and has been produced in some form as recently as 2004.  Over its lifetime, at least 12.5 million units (there are questionable claims in excess of 22 million) were sold, and it is probably still the best-selling single computer model of all time.  Through several years of the mid-80s, the C64 dominated computer sales numbers.  In 1983, in terms of unit sales, it had over 40% of the computer market (by comparison, the Apple II never exceeded about 16% of the market).  Moreover, it held more than a quarter of the computer market for four years running, and kept a double-digit share of the market for a fifth year.  For a generation of aspiring programmers, scientists, engineers, hackers, and game designers, "computer" meant "Commodore 64."

So why no respect?  In an era when computers were still primarily sold in specialty and electronics stores, the Commodore was a mass-market computer, sold in discount and department stores, often at discounted prices.  In 1983 Commodore introduced a $100 "trade-in rebate" that was sensationally successful, and resulted in a huge shake-out in both the computer and video-game business, driving out or bankrupting several companies.

And history is written by the victors.  While Commodore dwarfed Apple's sales for much of its existence as a computer company, it was dealt a huge blow by the standardization around the IBM PC platform, and was ultimately killed by the market failure of its very capable Macintosh competitor, the Commodore Amiga.

. Osborne 1 (1981)
Photo: Bilby
Everything about early computers was big.  Big disk drives, big boards with huge numbers of parts, big monitors with big glass picture tubes, and big power supplies to drive it all.  As a result, the last thing you'd call a computer in those days was mobile.  The Osborne 1 changed all that.  The Osborne was the first commercially successful portable computer.  It wasn't a laptop, or anything remotely like one.  It weighed 24 pounds!  The category it created was popularly known as the "luggable."

The Osborne resembled a small suitcase, or a portable sewing machine.  One end of the case unlatched to reveal two floppy disk drives and a grand 5-inch CRT screen!  The lid of the case contained the keyboard.  By modern standards it was huge.  It was ugly, the screen was a joke, but it was portable, and that was a breakthrough.

The Osborne also broke ground in one other important area: it was the first computer to come with "bundled" software.  At the time, most computers shipped with no software at all.  Even the operating system was extra.  The Osborne came with the popular CP/M operating system, the business-standard WordStar word processor, and at various times in its run, accounting software, database, spreadsheet, and even games.  The bundle of software was worth almost as much as the machine's $1795 price.

Despite its initial success, Osborne sales plummeted when they prematurely announced an improved model, which allowed more capable competitors to move into the "luggables" market.  But the practice of bundling software with computers continues to this day.

. Compaq Portable (1983)
Photo: Tiziano Garuti
Speaking of "luggables," the Compaq portable was another example of the class.  It was even more expensive ($3590) and heavier (28 pounds), ut it was one with an important difference; one that would change the history of the computer business.

In 1981 IBM legitimized the personal computer with the introduction of the IBM PC.  The PC was an expensive, and in many ways unremarkable (other than its 16-bit microprocessor), computer, but the IBM name suddenly made it conceivable for businesses, especially in the Fortune 500, to buy computers.  It seemed inevitable that IBM would rule the high-end computer business for the foreseeable future.

But IBM made two critical mistakes.  They built the PC from off-the-shelf parts, and they licensed their operating system from Microsoft without securing exclusive rights to it.  The Compaq Portable may not have been the first cloned computer, but it was the one that broke IBM's potential monopoly on the business market.  Even more brilliantly, Compaq put it in a luggable package, something that IBM didn't offer.  All those offices that were finding their new IBM PCs indispensable also needed a Compaq Portable for road trips.  With its foot in the door, Compaq moved into the desktop market, and dozens of clone-building competitors followed.  A diversified future for personal computers was assured.

. HP 110 (1984, with mentions of the GRiD Compass, Dulmont Magnum, Sharp PC-5000, and TRS-80 Model 200)
Photo: Oldcomputers.net
Of all the computers on this list, the HP 110's place in history is perhaps the most questionable, thus the other computers I'm listing above.  All of them preceded the 110 to the marketplace by at least a bit, and all had some version of the now-familiar "clamshell" laptop design, but of the bunch, only the Sharp ran Microsoft's MS-DOS operating system.  And it deviated from being a modern laptop in many ways: its 8-line screen was too small to function as even a marginally useful MS-DOS computer, it had a built-in printer, and it used the interesting, but ultimately dead-end, technology of bubble memory.

The HP 110 put all the pieces together in a portable, battery-powered package.  The initial model had drawbacks.  It had a non-standard screen that displayed only 16 lines instead of the IBM norm of 24 (big enough to be useful, but a drawback until it was replaced in a later upgrade), the disk drive was an external accessory, and the printer port used a non-standard proprietary interface.  But I admit to being biased on this one.  I tested a prototype (when it was still using its internal code name of "Nomad") for a computer magazine, and I'll never forget the feeling of wonder when I was able to pop this thing open and use it while riding a Seattle city bus.  It was life-changing, and the birth of the modern laptop did change the way we looked at computers forever.

. Xerox Star (1981)
Photo: Al Lemos, via Wikipedia
One of the major things Steve Jobs was falsely credited with inventing is the GUI, the Graphical User Interface seen on pretty much all modern computers, be it in the form of the Macintosh operating system, Linux shells, or Windows (and in a modified form, in mobile operating systems like Apple's iOS and Google's Android).  We're talking about the use of a pointing device (conventionally, a mouse), a fully bit-mapped screen, and visual devices like desktops, icons, pull-down menus, overlapping program windows, etc.

But Jobs didn't invent these ideas, and they didn't originate with the Macintosh.  In fact, as we'll see, the Mac wasn't even the first Apple computer to use them!  Nor did Apple or Jobs bring the concepts to market first.  Credit goes to Xerox.  Yes, the copier and laser-printer people.

In the late 1970s, the Xerox Palo Alto Research Center (PARC) was a major incubator of new ideas in computing.  Ethernet, the foundation of most modern wired computer networks, was developed there during this same period (and in fact, the Star came equipped with it, another innovation to its credit).  Around this time, Xerox produced the Alto, a workstation that pioneered the whole GUI concept.  Though about 1500 were built for internal use and for distribution to government and university labs, it wasn't a commercial product.  That honor went to the Xerox Star (AKA the Xerox 8010 Information System).

Now, at $16,000, the Star was hardly a personal computer.  And it wasn't even intended to work as a stand-alone system.  Ideally an installation would have several workstations, a server, and a printer, running at least $50,000 to $100,000.  But I include it on the list because it's directly responsible for everything that came after it.

Xerox didn't really know how to market the Star and wasn't much interested in the computer business, and perhaps because of its high price and lackluster marketing, the Star was not a huge success.  The company was distracted by anti-trust actions and the success of its laser-printer and copier products.

The Star would fall by the wayside, largely forgotten.  But its legacy lives on.  See, there was this little company called Apple...


. Apple Lisa (1983)
Photo: German Wikipedia
As I said, the GUI at Apple didn't start with the 1984 introduction of the Macintosh.  But if Xerox originated it, how did it end up at Apple at all?  Here's where Steve Jobs does take a role.  He was one of several outsiders given a tour of Xerox's PARC in 1979.  Among the things he was shown was the GUI, and he came away excited about the possibilities, many of which were incorporated into the Lisa.

The Lisa was, without doubt, a computer ahead of its time.  It was powered by a 32-bit Motorola 68000 processor and a large amount (for the time) of memory.  It had a fully bit-mapped display, a mouse, and an interface that would look familiar to most modern computer users.  Its memory, file system, and some technical aspects of its operating system were in many ways more advanced than those of the Macintosh models that immediately followed it.  And unlike the early Macs, it was expandable and upgradable, with internal card slots and an interface for an external (and later internal) hard drive.

But the computer was not a success.  It was hampered by its lack of compatibility with the "business standard" IBM PC, its unproven design, and most especially by its high (initially, nearly $10,000) price tag.  And despite impressive hardware for the time, the demands of the Lisa's complex software and operating system slowed the machine to a crawl, making it seem sluggish.  The product went through several design revisions and serious price cuts during its three years of existence, but it never took off.  Steve Jobs had long since become enchanted with the Macintosh design, (correctly) thinking that it had greater commercial potential, and had focused his efforts there.

Some of the last Lisas were converted to run Macintosh software and sold as "Macintosh XLs," and it's rumored that many simply ended up in landfills.  But in introducing the GUI computer to the true personal computer marketplace, and to its eventual champion, Apple, the Lisa deserves to be recognized for its place in history.

The Lisa was followed into the marketplace by the Macintosh, which immediately took the computer world by storm!  Not.  That's another myth.  The Macintosh made a disappointing launch, and attracted plenty of lookers, but not so many buyers.  While interest was high from the very beginning, the lack of compatibility turned away business buyers, the price (about $2000, later raised, a rarity in the computer business, to about $2500) was too high for otherwise interested home buyers, and some were turned off by its small, monochrome screen.

But the computer found niche markets (especially desktop publishing and the educational market), Apple did well with the LaserWriter printer (many of which ended up attached to IBM-compatible computers or to mixed-computer networks), and the Macintosh was given time to find its legs.  But despite all the press that would lead you to believe otherwise, Apple has never been dominant in the computer industry, rarely having over 10% of the market during the Macintosh era, and briefly peaking at about 16% of the market with the Apple II.  It's currently at just under 13%, which may be a Macintosh-era high, but that still leaves it the number-three computer company.

Unit sales and profits, of course, are two different things, and Apple has done well for itself.  That doesn't make it dominant in the industry, though (the iPod was probably their first truly market-dominating product, and Apple looks to be a strong, possibly dominant player in the post-PC era).  But Apple popularized such concepts as GUIs, desktop publishing, and networks, and provided a steadfast alternative that helped keep Microsoft competitive and prevented their having a monopoly position.

In these accomplishments Apple's place in personal computer history is assured.  It just isn't quite the place that a lot of people seem to believe.


Friday, October 28, 2011

Publishing is "Moneyball"

Steve:

Chris and I belatedly saw the movie "Moneyball" a couple days ago, and let me just say that if you're a writer, you definitely should see this if it's still playing in your area (or available on video by the time you read this).

If you haven't seen this, but have heard of it, probably what you "know" about it is that it's a baseball movie.  Well, sorta.  It's a movie, based on a book, set in the baseball industry. (And if you had any illusions that baseball, or any other major "sport," is actually a sport and not an industry, this movie should cure you of that fallacy.)  This is a movie about business, about problem solving, and about management.

If that sounds dry and boring, it isn't.  It's fascinating, in part because it is a baseball movie.  Not that I'm a sports fan.  I'm not (though I come closer to liking baseball than pretty much any traditional sport I can name).  It's fascinating because Moneyball has some great acting, good characters, snappy dialogue (courtesy of writer Aaron Sorkin, who brought you "The West Wing" and "The Social Network," among other great things), and a compelling story to present its lessons in business.

Yes, it's an enjoyable movie for anyone with half-a-brain, an interest in baseball, or both, but why should writers care about Moneyball?  Because Moneyball is about a very old industry in transition.  It's about an industry bound by tradition and "conventional wisdom" which may not be so wise.  It's about how all that falls apart when the economics don't work any more, when the fat-cats are fat and simply assume that the little-guys will continue to play the game as always, no matter how bad the deal for them gets.  It's about how, when those traditions are challenged by new ideas, the "experts" are frequently the last to know.

If you think this sounds like the state of book publishing over the last few years, then you would be right, and this movie is just full of lessons and metaphors for understanding what's going on there.

On the face of it, Moneyball is about the turn-of-the-century Oakland A's, a once-mighty team laid low by money.  The team's new owners simply didn't have the money to buy players the way the major teams did.  That wasn't fair, but that was the game as it existed then.  Everybody bought players from the same pool, using the same methods, and so teams with deep pockets got good players (by the agreed-upon standards of good) and won most of the games.

By the behind-the-scenes rules of baseball at the time, the A's were in a tailspin with no way out.  They wouldn't win games.  They couldn't win games.  And that should have been the end of that.

It's easy to think that an industry as old, as traditional, as hidebound as major-league baseball is inherently stable, but it's not.  When you create an inequity, when you sit on top of the people who actually are the heart of your industry, that stability is an illusion.  Even if the people on the bottom seem totally boxed in by the system, even if it seems the status quo can't possibly be upset, it takes only one minor change in the system to send it all into chaos, perhaps even to topple the straw fort that you've built for yourself.  And moreover, it is nearly impossible for the people sitting on top of the heap to see this change from their perspective until it's too late.

In the case of Moneyball, the change comes when a few people running a baseball team realize that since they can't afford star players, they will need to find a way to win games without them.  They think they've found that way, and they put it into motion, but of course, everyone thinks they're insane, until they start winning game after game.  And then of course they -- still think they're insane.  It's just luck.  It will all be over soon.  This streak they're on, it will end pretty much any second now.  This can't be happening, ergo it isn't.  Yup, that's publishing all right.

So, what are a few lessons we can learn about publishing from Moneyball?

Conventional wisdom may not be so wise - Many of my favorite scenes in Moneyball revolve around the scouts, the old guys who are charged with going out to the minors, to schools, and to sandlots looking for future superstars.  There's a great scene where they sit around a table, combined centuries of baseball experience and wisdom among them, spouting increasingly bizarre nonsense about why a potential player choice is, or isn't, major-league material.  They don't have a "good face," or a strong enough jawline, or they have the wrong body type.  And my favorite among favorites is the "ugly girlfriend" rule.  A player is rejected because his girlfriend is judged not pretty enough.  "It shows a lack of confidence," announces one geezer with absolute authority.

It's so easy to transpose baseball scouts into literary agents, or book editors, since these are the people the traditional publishing industry trusts to choose potential best-sellers for them.  Again, you have people with a fantastic amount of knowledge and experience often making dunderheaded decisions for absurd reasons.

Why?  Because, despite all that knowledge and experience, nobody really knows what's going to be the next "Hunt for Red October," or "Harry Potter," or "Twilight."  But everyone likes to believe that they know, or at least pretend to others that they do.  And there's not as much incentive to figure it out as you might think, because as long as everyone is playing by the same rules, failure is easily swept under the rug.  On the other hand, any success is quickly capitalized on, and whoever brings home the prize (even if it's by blind chance) gets to be the hero for a while.  If the prize is big enough, they can coast on it for the rest of their career.  They may even get to add some bit of nonsense to the body of conventional wisdom.  ("Nobody buys books that have goats in them.  Even one goat reference will kill a book!")

This is simply human nature.  Science shows us that people in industries faced with uncertainty and lack of control are the ones most likely to resort to superstition or supernatural belief systems.  That's why gamblers and sailors tend to be particularly superstitious.  It's why the truism "there are no atheists in foxholes" may actually be true.  It's why there are agents and editors out there questioning the worth of your manuscript because you used the wrong color of paperclip.

Yes, writer, you've been rejected because your manuscript has an ugly girlfriend.

You don't need home-runs to win - The winning strategy developed in Moneyball is simply to throw out the whole star system in favor of a mathematical approach to winning games.  Stars don't win games, especially when they require disproportionate resources to acquire and keep.  Home runs don't win games.  Getting on base, and doing so consistently is what wins games.  Players that do that (especially if they have other "flaws," like an ugly girlfriend, or a weak jaw) may not attract attention or cost much to sign, and you can afford lots of these gems-in-the-rough instead of one or two "stars."

This applies to publishing as well.  Traditional publishing, especially in the last ten years, has increasingly been about the best-selling book.  Folks in traditional publishing will tell you this must be so.  Increasingly retailers are stocking fewer books and stocking them for shorter amounts of time.  Only "big" books get ordered.  Only "big" books get reviewed (because there are fewer newspapers and magazines doing that) in order to become the "big" books that stores will actually order and stock.

A little success is not enough any more.  It's more common than not these days to drop authors and book series that are profitable and have respectable and climbing book sales simply because the numbers aren't big enough, and they aren't climbing fast enough.  If it's not a hit, cancel it and throw something else at the wall on the off chance it will be a hit.  And if it doesn't pop the way you hope, dump it and start again.

Things are being dumped every day that could be earning an author a decent living and making the publisher a small-but-steady profit, because that's the way the game is played.  And as long as the traditional publishing chain and traditional retail are the only ways to put books in front of the reader, writers had no choice but to play along.

But that's changed.  There are options.  Self-published ebooks.  Self-published print-on-demand books.  Smaller presses using new technology.  The "because, because, because" of traditional publishing may still be true for them, but it doesn't have to be true for you, the writer (or for you, the reader either).

For most of publishing history, even successful writers have lived big check to big check.  We're always waiting for the next big advance, the next big royalty check, the next big option money, and slowly starving a lot of the rest of the time.  By tradition, those big checks (and best-seller lists, major reviews, and awards) are how we judge our success and measure our worth.  But in fact, we don't need them.  What we need is a steady, dependable income over the lifetime of a career, and this new world is far more likely to offer it to us than traditional publishing ever was.  And in fact, the new model can do this without ever having a break-out success.  Readability and consistency win the day, not a plaque on the wall, a notation on a list in a newspaper that nobody reads any more, or a big check that never comes again.  That's the new success.

It is more important to ask the right questions than to get the right answers - The mistake the old guard made in "Moneyball" was to ignore what the A's were doing because, by their yardstick, the A's were obviously going to fail.  They were asking the wrong questions.  "Are they getting star players?"  No.  "Without star players, are they going to get lots of spectacular plays and home runs?"  No.  Ergo, they are going to lose.

The right questions in this case were, "Can we redistribute our player budget to get a better team, rather than a few better players on a team?"  Yes.  "Can these players consistently get on base, even if it's only to first, and even if it's in an undramatic fashion?" Yes.  "Do they have specialized strengths that we can deploy strategically to our advantage on the field?" Yes.  "Can we live with their weaknesses if these are balanced out in other ways?"  Yes.  "Will we ultimately win games?"  Yes!

By traditional publishing standards, it's very hard for indie publishers to win.  Nearly impossible in fact.  But that's because traditional publishing is asking the wrong questions: "Can you get into bookstores and major retailers?"  Not so much, but given the costs, the returns, and the lack of support there, maybe I don't really need to.

"Will you have the force of a major publicity department behind you?" - Oh, you mean the publicity effort that you put behind a small percentage of your titles?  In any case, I need those only if I need velocity of sales, a big spike to sell through the stores I'm not in anyway.  I can drift along for years selling my ebooks and print-on-demand books through online-retailers.  I don't care as much about how many books I sell in the next six weeks as a do the numbers I'll sell in the next six years.

"Will you get reviewed in major print publications?" - Probably not, but again, other than massaging my ego, those reviews serve mainly to get me into the stores I'm not getting into in any case, and to create sales velocity that I just explained I don't need for my business model.

"Will you get on best-seller lists?" - How much do those pay again?

"Will you sell as many books?"  -  Maybe but if not, I'm still making seven times more per sale, so if I sell 1/7th as many books, I'm ahead.

"Can you make as good and dependable a living as you can with traditional publishing?"  - Uh, yeah, but only if I don't make an even better one!

If you enjoyed this post or found it useful, you can contribute to our good and dependable living goals by buying one of our indie ebooks.  Search for "Tsunami Ridge Publishing" on any major ebook seller to find our titles!  Or just click on the "donate" button below and send us a little something directly.  Thanks!


Oh, and after watching the movie, Chris immediately ordered the ebook of "Moneyball," and finds it to be even more entertaining and informative!
 

Friday, September 9, 2011

Genre is in the Eye of the Beholder

Steve here:

The other day I was listening to a story on NPR about the unlikely (to those of us in the U.S.) love of country music in the West Indies.  The story is plenty interesting on its own merits, but one bit of an interview with Jamaican writer Colin Channer really jumped out at me.

He tries to explain why his people never shut country music out.  He explains that when he was growing up, there were only two radio stations, and to them, only two kinds of music: "local" and "foreign."  They didn't recognize the various genres of American music, and simply picked the stuff that they liked (a lot of which turned out to be country).

That's a fantastic summation of how irrelevant seemingly solidly defined genre categories can be, and this applies to publishing just as well as it does to music.  Genres are handles for marketing and organizational purposes, not something handed down from above on a stone tablet.

But because of that long-time bastion of popular reading, the bookstore (and the publishers' catalogs from which those stores took their stock), they've taken on far more substance and importance than they deserve.  They've become not only labels to define what a book is, but also labels to define what it isn't.  Each genre has become its own walled city-state, with its own leaders, its own awards, its own rules, and its own keepers of the faith.

Some are more strongly defended than others. (Perhaps none more bitterly than science fiction, where purists will still point at Star Wars, with its space-ships, aliens, and ray guns, and sneer, "that's not science fiction!")  It's no surprise then, given human nature, that some of these genres have become ghettoized, isolated both from without and within.

But I don't think this is healthy for literature.  I don't think it's healthy at all, especially in that it leads to the impression that if a book doesn't fit neatly into a familiar genre category, it doesn't exist at all.  It's difficult to sell such a book to traditional publishers because "sales doesn't know how to sell this."  ("This is a good book," apparently, never occurs to them.)  And even if it gets sold, and even if stores buy it ("we don't know how to sell this"), then it may go into a limbo where it is shelved in one (or sometimes multiple) departments where it is an uncomfortable fit, and where even readers actively seeking it will have trouble finding it.

As a recent example, a few years ago my wife Chris wrote a pair of novels using characters from the J.J. Abrams spy/fantasy/adventure/family-drama TV series, ALIAS.  When we traveled, we'd often drop by bookstores looking for shelved books to sign, and depending on which chain and/or store we visited, the books could be found shelved variously in mystery, thriller, entertainment, young adult, fantasy, or general fiction.

On the other hand, there are publishers and writers with literary pretensions very nervous about being shelved as, or even described as, genre books, even when by every definition of the word they are mystery or fantasy or science fiction.

The most chilling aspect of this genre isolationism is that it enforces the idea that the various genres define all the stories that are possible (or at least worth) telling, ignoring the fact that the genres are like polka dots on a pillowcase, covering less area than the white areas that surround them.  Countless potential stories go unsold, unwritten, unconsidered, simply because they don't fit in someone's clearly defined circle.

There are also sociopolitical aspects to some genres.  In the United States, many (if not most) people of African ancestry shun country music because of its cultural association with southern racism and the legacy of slavery.  But in the West Indies, there were no such cultural associations, and those of African ancestry simply judged the music on its own merits.

You might think this doesn't apply to fiction genres, but it does more often than you think.  Many genres and sub-genres are strongly associated with women, to the extent of being minimized and ghettoized from the outside.  Romance is the obvious example, but the cozy mystery sub-genre also comes to mind.  Women like it, ergo it must be "trash," not "real literature," and of no possible interest to male readers.  Not that these genres don't produce fine books, and not that men can't (and don't) enjoy these works.  But the stigma associated with them drives many readers away, and often makes those that partake secretive about their reading habits.

Likewise, there is still a race line in books.  Books overtly written by black authors, and especially those obviously featuring black protagonists, are often seen as "black" books, to be reviewed and celebrated perhaps, but mainly of interest to black readers (or to white liberals who are often more interested in displaying the unread book as a symbol of their openness than in actually reading it).  And the flip side of this is that a book with an overtly black protagonist, especially if it deals with matters of race, can suffer a kind of reverse discrimination if it is written by a white author.

And I've already talked about the form of literary snobbery that considers true literature above and separate from all forms of "common" genre.  This is an extension of intellectual elitism, which is in turn an extension of the English and European class system.

But story does not know of class and culture.  Story is story.  Later in the NPR interview, writer Channer says, "I think a good story is a good story.  And Kenny Rogers is a good storyteller."

And ultimately, that's all that matters.  Genre should exist to help us find books we want to read, not to hide those books from us behind arbitrary walls.


If you've found this post interesting, informative, or useful, please share the link elsewhere, and consider leaving something in our tip jar below.  Doing so will encourage us to do more posts like it in the future.


Tuesday, August 16, 2011

Free Ebooks! (Our poor excuse for World Science Fiction Convention Promotion)

 As we were getting organized for our Reno Worldcon trip this week, I had plans for the steps we'd take to promote our work during the convention, including special business cards and give-away coupons for free ebooks.

But unfortunately, family emergencies have pushed all our priorities around, and last week I decided this was one of many things I just had to let go.  We made the decision to come to Worldcon, and hopefully we'll stay for all our scheduled programming appearances, but life is kind of rocky at the moment.

So, in lieu of that, here are some links and Smashwords coupons for a couple of selected free ebooks, so that we can just announce our webpage URL at the panels and maybe some folks will find their way here.

Meanwhile, those of you who are stumbling in from Twitter, Google+, Facebook, or the interwebs at large are also welcome to enjoy the freebies.  Coupons are good through Sept. 25th.  Smashwords supports download formats for all major reading devices, but if you prefer, you can purchase direct to your device (for the regular 99 cents each, sorry) on Kindle, Nook, Kobo, iBooks, and other major ebook outlets.

Thanks for reading, and we hope to see you at Worldcon!

                              - Steve and Chris (Christy)

The Unwinding of Liberty Brass, A Clockwork Cowboy Story
by J. Steven York

"J. Steven York's Clockwork Cowboy stories aren't just 'weird Westerns.' They're quite touching, too. Yes, Liberty Brass is a metal man with a busted 'governor.' But he's got as much heart as any other hero you'll find riding the range." 
-- Steve Hockensmith, author of Pride and Prejudice and Zombies: Dawn of the Dreadfuls and the Holmes on the Range mystery series


Throughout the West, tales are told of a legendary Clockwork Cowboy, a restless mechanical wanderer who rode a clockwork horse, and whose bullets never missed. Some called him traitor, or monster, or murderer, but some called him hero. Some said he never stood idly by when the strong preyed upon the weak, and no bad men, mechanical or flesh, were safe while he wandered the plains.

But every story has a beginning...

Some say the Clockwork Cowboy was actually a Confederate Artilleryman called Liberty Brass. But for Liberty Brass, the trail seems ready to end almost before it begins. In a half-destroyed barn near the Gettysburg battlefield, on a dark, rainy night, two clockwork men, both damaged in war, meet. In the hours that follow, a terrible secret is revealed, a fateful judgment is made, and only one can survive to see the morning sun...



Coupon Code for free download: SF89E (Enter at checkout.  Expires Sept. 25th, 2011)  DOWNLOAD


A Day at the Unicorn Races
by
Christina F. York
(AKA Berkley Prime Crime authors Christy Fifield and Christy Evans)

Bubbles lives her dream as a successful unicorn jockey. The upside? Fame, fortune, and a job she loves. Downside? Enforced celibacy. Unicorns, after all, can only be ridden by virgins. So what's a girl to do when she falls in love? 


Coupon Code for free download: CP92P (Enter at checkout.  Expires Sept. 25th, 2011)  DOWNLOAD

Saturday, July 30, 2011

Bridganomics: Build a River, and They Will Build a Bridge (Or how companies destroy their business and create their successors)


I've been watching with a mix of grim fascination and alarm as both cable and phone companies, for somewhat different reasons, have been doing away with unlimited data plans, "throttling" heavy users, and finding other ways of either preventing heavy data usage, or making it prohibitively expensive.

This bothers me because I think it's bad for most everybody.  It's bad for consumers because -- hey -- paying more, getting less.  It's bad for innovation, since many new internet products depend on plentiful and cheap bandwidth.  It's bad for the economy, since the internet is where consumer business is done these days.  It's the engine of the economy, and internet providers literally want to throttle it back.  It's bad for my country (the U.S. of A.) because we already lag behind most of the rest of the world in terms of cellular and internet service, and further restrictions push us more in the direction of becoming a second-rate (maybe third-rate) technological and economic power.

In fact, it would seem that it's bad for everyone except cable and cell phone companies.

But the one thing about this that gives me hope is that I know this last statement is absolutely not true.  These attempts to limit data usage will help them in the very short term, by increasing revenue and reducing the cost and necessity of network upgrades.  But in the long term it will hurt them.  In fact, in the long term, it will quite probably destroy them.  And moreover, it's going to help create the very companies and technologies that are going to plow them under.

How do I know this?  Well, it isn't because I'm an economist.  I've never had a class in economics in my life.  But I've been a keen observer of technology and the businesses that go with it for a lot of years now, and I happen to have lived long enough, and seen enough transitions (both of technologies and of the dominant companies in those technologies), to see certain clear patterns develop.  And one of those patterns I call "Bridganomics."  Simply stated, it means that if you build a river, and consumers want to cross it, then someone else will come along and build a bridge.  Making the river deeper, faster, or wider doesn't help, and often only increases the demand for the bridge.

Now, like I said, no economics education here, so there's a fair chance I'm only reinventing the wheel and applying a new name to a well-known economic principle.  But even if it is well known to economists, it clearly is not well known to those running American businesses, or if it is, they're simply choosing to ignore it.  There's simply no other explanation for the way they keep shooting themselves in the foot over and over.

So I'm giving it a sexy, marketable, name, the kind that could go on the cover of a New York Times best-selling book (if only I had an economics degree and a lot of questionable friends in high places) in the hopes that it might catch on.

Fundamentally, bridganomics means that in the marketplace, you can't build an impermeable barrier in the way of any consumer desire or trend.  Any attempt to do so will be only temporarily successful at best, and will fuel the creation of bypass services, companies, or technologies that will render your business model (and possibly your business) obsolete.  And it doesn't matter how dominant your company may seem, or how firm a grip you have on your monopoly; that dominance, that control, is only another part of the restriction on the market.  In bridganomics, we call this the "river," but if it's easier for your mind to wrap around, think of it as a fence, or a wall, or a trench.  Same thing.

Examples?  I've got plenty.

The most applicable to the current internet provider situation is the AT&T breakup of the 1980s.  Most people think of this in terms of the breakup of the phone company itself, and of access to a lower-cost and more competitive marketplace for phones, phone services, and accessories such as answering machines.  That's true, but there was a smaller, yet ultimately more important, aspect of it dealing with data access and the creation of the internet as-we-know-it-today, and I was on the front-lines of that battle.

Along about the time of the antitrust action that broke up AT&T, the telephone modem came along: a device that allowed computers to trade data remotely over phone lines.  Actually, modems had been around for quite a while, but what was happening then was that modems were finding their way into the hands of consumers, who were hooking them up to their residential phone lines and finding new ways to use them.

Keep in mind that there was no publicly accessible internet back then.  If you wanted to share data between two computers, they had to call each other directly, using modems over a telephone line, and trade data.  Or they had to call an intermediary computer or computer network, again with a telephone line and modem, that would act as a middle-man and pass the data along.

The only alternatives in those days were physical delivery of a floppy disk (no thumb-drives, and CDs were in their infancy), or putting both computers in one place and connecting them with a cable.

Yet, despite these limitations, consumer services started showing up.

Initially, there were computer bulletin boards: small and simple store-and-forward messaging and discussion services.  Often these ran on a single personal computer and phone line.  One user would call in to read and post messages while on the line.  Anyone trying to connect while they were on would get a busy signal and have to try back later.  Eventually they would finish and hang up, opening the phone line and the host computer for the next user.

You may be shaking your head at the crudity of it all, and the difficulty of use.  And I haven't even mentioned that a computer could easily cost you $3000 in 1980s dollars, unadjusted for inflation, and the modem would cost you $200 to $300 more (that's a loaded iPad with app-money left over, kids), or that your current internet connection is almost certainly over a thousand (possibly several thousand) times faster than those old modems.  Why would anyone possibly use such a thing?

Simple answer: Because there was nothing like it that was better.

Not that we didn't want better, even from the very beginning.  We wanted multitasking host computers that would eliminate the busy signals.  We always wanted faster modems.  And over time, we got those things.  Bridganomics applies to natural rivers as well as ones created by misguided CEOs.  The desire was there, and the limitations of the technology caused bridges to be built.  Multitasking operating systems for PCs.  Faster modems. Mainframe-based dial-up information services like CompuServe, GEnie, and AOL.  Dial-up internet.  Broadband.  The-Internet-as-We-Know-It.

From that dial-up 300-baud trickle, a million companies and a million fortunes were made.  Without it, there is no Amazon, no Google, no FaceBook, no Twitter, no Dot Coms, and ultimately no iPhone or iPad or Android.

So what was AT&T's role in this economic revolution?

They tried to stop it.

To their corporate eyes, those pesky phone modems were, at best, an annoyance, and at worst, a threat.  They worked on ways to prohibit them, or simply price them out of existence.

There were proposals to put filters on residential lines that would simply render modems inoperative.  There were plans to charge residential customers for local calls by the minute, or to cap usage.

There were plans to require modem users to install a separate and much more expensive business line for data calls, on the theory that there was no legitimate use for a modem other than business.

Some of these plans even went into effect in various localities and among the various "baby-Bells" that came from the breakup of AT&T.  Most met with protest and outrage, and none of them ever gained traction.

Various arguments for these restrictions and pricing models were used, many of which will be familiar to those who have been following the current "open internet" debate.

The phone system was designed for voice, not data.  Simply because it's possible to use it for other things (data) does not mean it should be.

Our multiplexers, used to compress voice traffic over long-distance lines, are designed for voice, and won't work as well for modems.  Our network capacity will be overwhelmed.

We will be forced to make expensive upgrades to our network to accommodate this new activity, and someone will have to pay for it.

Only a tiny percentage of our customers use modems and will be significantly impacted by these new rates and terms.  Why should all be charged more to pay for the needs of a few?  The rates for our "average" customer will actually go down under our new plans!

Didn't I read all this in a Verizon Wireless press release just a couple weeks ago?

We all know how this went down for AT&T.  Even as they were being broken up and losing their dominance over the voice telephone market, they had a golden opportunity to build a bridge to a new world of digital communications and data services.  Instead, it happened in spite of them.  For a few years, their networks carried most data traffic, despite their objections and foot-dragging.  Modems got faster and faster, and as prices for phone services dropped, people started ordering more phone lines to support their modems, fax-machines, and increasingly connected families.

But this was all short-term gain.  The phone system was still the river, not the bridge.  By the time phone companies woke up and started rolling out their long-promised DSL broadband services, it was too late.  They'd been out-performed and under-priced by cable and fiber-optic companies.  Some phone companies are fighting to gain back that market, but they're trying to recover something they could have owned if they'd been the bridge, and not the river.

This mistake didn't destroy the phone companies (or at least, it hasn't yet).  They had a more diverse business model, and were able to enter new areas such as cell-phone service and providing infrastructure for the internet, and so were able to survive.  But they gave away one of the biggest business opportunities of all time in a simple moment of greed and ignorance.

This sort of thing happens over and over again.  When businesses and industries build rivers instead of bridges, they kill the golden goose.  For example, through high rates and poor service, dial-up ISPs (with plenty of help from the phone companies themselves) gave way to broadband providers.

When hotels and motels started gouging business travelers through high room-phone charges, it helped fuel the establishment and growth of cell phones.

When hotels and motels again started gouging customers with high in-room internet charges, they fueled the development of wireless internet services.

When Blockbuster developed a dominance of video rentals and started taking its customers for granted with poor service and high late fees, it opened the door for Netflix to slip in with an entirely new business model.

When the music industry kept gouging consumers with ever-higher prices on ever-cheaper-to-produce product, enacted draconian anti-piracy measures, and showed general contempt for their customers, they fueled first mass-scale music piracy, and then lower-priced (and, for them, lower-profit) music download services such as iTunes and Amazon.

The pattern is pretty clear.  So, how does one go about building a bridge?  How does one avoid building a river?  Some rules of thumb:

Build a Bridge

Follow consumer desire, don't resist it, even if that desire seems to be contrary to your immediate benefit.

When people want things, make it easier for them to find them.  (Google)

When people want things, make it easier for them to get them.  (Amazon, Netflix, iTunes)

When people want to meet and gather, give them a place to do so.  (FaceBook, Twitter)

When people want to do things, make it easier and more fun for them to do them.  (Apple)


Build a River

Take your dominance of an industry, technology, or market category for granted.

Take your customer base for granted.

Create bad-will through poor customer service.

When you sense consumer desire outside your current business model, attempt to squelch it, block it, or price it out of existence.

Raise prices indiscriminately.  Consumers will tolerate high prices so long as they judge them to be fair.  A customer perception that your pricing structure is unfair immediately transforms even a bridge into a river.

Respond to competition not by competing with it, but by eliminating it through buy-outs, protective laws, and unfair trade practices.

And finally, there's the one way to build a bridge and a river at the same time:


Build a Bridge and a River

Build a bridge over yourself:  While attempting to hold onto your current business model and core technology, build the Next-Great-Thing that will move beyond them.

The clear example here is Apple, which, while it had never achieved dominance in the desktop computer market, had established itself as the clear, premium alternative to leader (and river) Microsoft, and there it could have been content to coast for years, if not forever.  Instead, Apple built the iPhone (creating the smart-phone market) and the iPad (creating the pad market, and a clear alternative to the desktop computer for many of its most common uses).  They're still the clear premium alternative to Microsoft in the PC market, but they've created a whole new market in which they are dominant, and their competitors (old and new) are all playing catch-up.  That's smart business.

That's Bridganomics.  Ignore it at your peril.

The lessons to be learned here are simple and obvious, and apply to a range of businesses and technologies, from space flight, to fast food, to the print-publishing industry to which I am intimately connected.

And if wireless internet is going to be restricted and over-priced for very long, then the river is there, and somebody will bridge it (and is probably already hard at work doing it).

Cell phone companies, you've got a very limited window to reverse course on this, maybe a year or two, tops.  What will replace you?  I don't know.  Maybe a distributed-frequency satellite system like Lightsquared.  Maybe tennis shoes with wireless routers in the heel.  Maybe a plan to distribute data through drinking-water pipes.  Maybe just a better business model using the same old technologies (a la discount airlines).

Who knows?  But if the river is there, and the people certainly want to cross it, then it will be crossed.

AT&T Wireless?  Verizon Wireless?  Sprint Nextel?  T-Mobile?  The rest of you guys?  Let me know how this works out for you...

If you found this post useful or informative, please let us know by making a donation of your choice through the button below.  This will encourage us to take time out of our busy writing schedule to do more of them.