The Apple Dividend: End Of An Era?

Last week, Apple declared that it would pay its shareholders a dividend for the first time since 1995.

The market’s reaction? It paused for a moment. Then it yawned. It might have even groaned a bit. The value of Apple’s stock barely budged, beginning and ending the week trading at $599 per share.

Huh? Why did a successful company’s announcement of a new dividend payment policy generate so little enthusiasm? And why might Apple’s declaration mark the end of an era in the internet technology sector?

The Best Alternative Use

The conventional wisdom holds that dividend payments are financial gratuities that companies pay shareholders to thank them for their loyalty. That’s why US Airways named its frequent flyer program Dividend Miles: to convey the optimistic message that its “dividend” point awards are expressions of appreciation for its customers.

But dividend payment decisions may convey relatively pessimistic messages as well, signals that tend to worry equity investors in high growth firms. Namely, dividend payments may be perceived as desperation measures, payments that are authorized when firms have run out of attractive growth opportunities in which to invest their funds.

Tobacco companies, for instance, often make sizable and stable dividend payments to their investors. Why? Because they tend to earn significant cash profits, but are unable to find investment opportunities that are more attractive to their shareholders than the prospect of simply receiving payments. Philip Morris (of Marlboro fame) once tried diversifying into the food business by purchasing Kraft, and RJ Reynolds (of Winston fame) once dabbled in the development of a smokeless cigarette. But neither firm was able to turn those opportunities into enduring successes.

So, based on the stock market’s reaction last week, it appears that investors may have interpreted Apple’s dividend decision as a pessimistic message regarding its future growth prospects. And interestingly, the lingering memory of Steve Jobs may have influenced their opinions.

Bracketing An Era

It’s no coincidence that Apple’s most recent dividend payment (prior to last week’s corporate announcement) occurred in 1995, one year before Steve Jobs returned from exile with Apple’s purchase of NeXT to turn around the company’s fortunes. The newly announced dividend will be paid later this year, approximately one full year after Jobs passed the mantle of leadership to his successor Tim Cook.

Indeed, these two dividend transactions serve to bracket the transformational era of Steve Jobs’ second stint at Apple’s helm. Although the firm first decided to suspend its dividend payments in the 1990s because of extreme financial difficulties, very few investors ever pressured it to pay dividends during the early years of the 21st century. And that made perfect sense; after all, why would investors ask a company to pay dividends to its shareholders while it needed the cash to invent and introduce iPods and iPhones and iPads to millions of new customers around the world?

At the time of his death, Jobs was rumored to be focusing intensively on television as Apple’s next great development opportunity. And at the time, in characteristic fashion, Jobs was refusing to consider any plan to authorize dividend payments. But under Tim Cook, the firm has struggled to find a path to break into the television industry, and the firm’s hoard of unspent cash has climbed to $100 billion. Thus, with no preferable alternative use of the cash readily available for investment purposes, Cook decided to pull the proverbial trigger and authorize the dividend payment.

The Microsoft Analogy

When was the last time a high flying technology firm faced a similar decision? On January 16, 2003, Microsoft decided to declare a dividend in order to deal with its own $40 billion hoard of unspent cash. But just one year later, on February 4, 2004, Harvard University student Mark Zuckerberg launched thefacebook.com from his dormitory room, and Microsoft has been scrambling to compete in the social media era ever since.

Would Microsoft have been better served by keeping the funds that it paid out as dividends, and developing or acquiring its own social media network to compete with Facebook? Arguably, it has attempted to execute similar strategies in a couple of industry sectors, by developing the search engine Bing to challenge Google, and by purchasing the communication service Skype to compete with various voice and text messaging firms. But these competitive strategies have failed to restore Microsoft to its former position of industry dominance.

Indeed, in retrospect, Microsoft’s 2003 announcement appears to have marked the end of an era dominated by desktop computers and their standalone software programs, and the beginning of an era of mobile devices and their cloud-connected services. With newly emerging firms like Zynga and Foursquare grabbing the attention of today’s technology investors, it’s possible that we’ll eventually look back at last week’s Apple announcement as the end of an era as well.

Steve Jobs: Contrarian

Much praise has been lavished — and deservedly so — on the life and legacy of Steve Jobs in the days following his untimely demise at the age of 56. Although comparisons to historic figures like Thomas Edison and Henry Ford may be a bit strained, we can all certainly agree that Jobs’ emphasis on product design and quality helped transform the consumer technology industry.

Absent from the initial wave of obituaries, though, was a focus on the contrarian approach that Jobs employed throughout his storied career. Time and again, Jobs made decisions that left the pundits scratching their heads in confusion, decisions that nevertheless led to eventual success.

Some of those decisions represented conscious choices to repudiate fundamental principles of modern business theory. Others represented the implementation of highly risky tactics that are seldom successful in the contemporary economy, but that Jobs nevertheless managed to implement effectively.

Repudiating The Academics

Several years ago, Apple hired Dean Joel Podolny away from the Yale School of Management to manage Apple University, the firm’s internal training function. Dean Podolny was also assigned the task of creating a series of written case studies, for use in Apple’s training programs, that captured the essential principles and theories that Jobs employed during his tenure.

The cases themselves might be difficult to integrate into a traditional university curriculum, given Apple’s propensity to repudiate various fundamental tenets of traditional MBA lesson plans. Consider the principle of product obsolescence, for instance: firms are generally advised to extend the life cycles of their products, not to consciously hasten their obsolescence.

But Jobs continually developed new products that cannibalized existing Apple lines. Sales of iPod music players, for example, plummeted once Apple incorporated their core functions into the iPhone. And the iPad didn’t simply take business away from other laptop and netbook computer manufacturers; it apparently drained sales from the MacBook line as well.

Some professors might protest that Apple was simply combining complementary functions in new packages, in the manner that consumer product manufacturers sell soap and shampoo in toiletry travel packages, or spoons and forks in cutlery sets. But at the time that Apple combined its mobile music player with its new telephone, for instance, the pair of functions resided in entirely different industries.

Sony had not originally contemplated the placement of a telephone in its Walkman; likewise, Motorola had never attempted to play music through its Razr. The integration of music by the iPhone, and its resulting cannibalization of the iPod line, was thus a truly groundbreaking decision.

Rolling The Dice

Other decisions authorized by Jobs were not necessarily repudiations of classic business theories per se, and yet they represented highly uncertain “rolls of the dice” that paid off for Apple. Indeed, they were reflections of a corporate culture that embraced entrepreneurial risk-taking at the highest level.

Apple’s decision to rehire Jobs in 1996 after firing him in 1985, for example, represented an astonishing about-face by the firm’s Board of Directors. Although it is not unprecedented for corporate founders to return to the helms of their organizations after having retired or resigned to pursue other endeavors, the rehiring of an ousted founder was undoubtedly a risky choice for the firm.

Then, shortly after his return to Apple’s helm, Jobs reached out to Microsoft and secured a direct $150 million capital infusion. Such equity investments are likewise not unprecedented, but the manner in which Jobs introduced and then defended the transaction startled the public. At the 1997 Macworld Expo, Bill Gates himself unexpectedly appeared “live” on an immense view screen, looming over the audience in a manner that reminded some viewers of Big Brother’s presence in Apple’s seminal 1984 Super Bowl ad.

Furthermore, throughout his tenure at Apple, Jobs repeatedly took the risk of striving for product simplicity in an industry that continued (and still continues) to grow increasingly complex. Although some simple designs, such as the minimalist Power Mac G4 Cube, failed in the marketplace, others, such as the single-button iPod, iPhone, and iPad, succeeded wildly. That’s why, for instance, many psychologists now recommend iPads for individuals with autism, who can readily master the devices’ simple commands.

The Test Of Time

Ultimately, though, the most impressive accomplishment of Steve Jobs’ career may be his success in maintaining Apple’s position at the forefront of technological innovation for an astounding 35 years. During that time, numerous competitors have risen and fallen, including Xerox, Wang, Compaq, and Yahoo. None was able to maintain a tradition of creative leadership that stretched from the mainframe-focused year of 1976 to the cloud computing era of 2011.

Indeed, the sheer longevity of Apple’s reign may represent the greatest legacy of a man in an industry where life cycles are measured in months and years, not decades. And now the attention of the technology community will turn to Tim Cook, Jobs’ successor, to observe whether he will be able to maintain Apple’s track record of accomplishment.

Netbooks: Is Apple Leaping the Uncanny Valley?

Fans of Brad Pitt’s virtuoso performance as the man who ages backwards in The Curious Case of Benjamin Button might wish to pause a moment before lauding him for the quality of his work.

That’s not to denigrate his performance, of course; many critics believe that it was superb. But fans might wish to confirm that they’re praising the right performer.

Truth be told, much of what appears to be a Pitt performance should actually be credited to the software programmers who grafted Pitt’s eyes and face onto digitally manipulated images of a decrepit old man and a newborn infant.

Many believe that the software programmers stole the show by leaping across the uncanny valley. And in a different way, Apple might now be planning to do the same in the technology industry.

The Uncanny Valley

Japanese robotics inventor Masahiro Mori coined the phrase “uncanny valley” back in the 1970s. He was referring to the fact that people tend to be fond of robots that don’t look very human, and tend to like them more as their appearances become more human, but only up to a point. Sooner or later, the robots become so humanoid that people suddenly notice that their eyes are glassy and their skin is unnaturally white.

Then, suddenly, they say that the robots are creepy. Spooky. Chilling. And they feel repulsed by them.

Robot designers, of course, have no intention of quitting when they reach the uncanny valley; instead, they continue to improve the lifelike realism of their creations, hoping to keep working until the robots can no longer be differentiated from humans. But humanity need not worry; designers haven’t reached that stage of robotic realism … at least not yet!

Visual entertainment media companies have long wrestled with the same problem. For instance, the special effects in the original 1933 film King Kong were primitive by modern standards, but were nevertheless embraced for their “remarkable” effectiveness. Interestingly, though, the far more technologically sophisticated 2005 film version was panned by some for its “horrible” effects.

Hollywood knew that it would be a “challenge” to convince the audience to accept Brad Pitt in his roles as the decrepit old man and newborn infant versions of Benjamin Button. But it had nothing to fear; its software engineers had succeeded so well at rendering these images in a lifelike manner that the audience barely remembered that Pitt is currently in the prime of his life.

The Curious Case of Apple’s Netbook Strategy

So how is the uncanny valley phenomenon related to netbooks, those little devices that look like miniature laptops and mimic their functions, but that don’t possess hard drives and thus lack many of the features and capabilities of “real” computers?

Just a few days ago, Apple’s COO Tim Cook ridiculed the netbook, saying that “it’s a stretch to call them a personal computer,” in much the same way that moviegoers once thought it was a stretch to call the 2005 version of King Kong a real ape. Interestingly, though, Apple once created one of the world’s first tiny handheld computers. Its Newton was a cute but grossly underpowered version of a laptop computer; it was embraced by a small group of devotees and still maintains a cult following today.

Although Apple discontinued the Newton over ten years ago, today’s netbooks are becoming more similar to laptop computers with every passing day. Their screens are growing larger, their keyboards are becoming roomier, and their operating systems are evolving into better approximations of full-fledged Windows environments. It might have been reasonable to assume that Apple’s historical fondness for the Newton and current success with the iPhone would translate into a desire to manufacture netbooks, but Cook’s outright hostility makes one wonder whether Apple has fallen into an uncanny valley of its own.

True Revulsion, or a Head Fake?

During most of the 1990s, Apple promoted the Newton as a “personal digital assistant” capable of accessing email, holding electronic files, and recognizing handwriting. But it was clearly no match for the power and functionality of a laptop computer.

And yet in many ways, the spirit of the Newton lives on in the Apple iPhone. Furthermore, the netbook lines of competing technology companies appear to be stealing large chunks of business from high end mobile telephones and low end laptop computers. So what is one to make of Tim Cook’s ostensible revulsion for the netbook concept?

One point of view is to take him at his word, and to assume that consumers will soon fall into his uncanny valley as well. If you subscribe to this belief, you are implicitly assuming that netbook designers will never be able to travel that proverbial final mile and overcome the product’s inherent weaknesses to make netbooks fully interchangeable with low end laptop computers.

On the other hand, an alternative point of view is to assume that Cook is manipulating his competitors with a clever head fake. Perhaps Apple is hard at work, developing its own competitive line of netbooks, and it is simply biding its time until consumers reach a point where they are ready to leap over the uncanny valley.

Are you buying Apple stock or selling it short? Before you make your investment decision, you might be wise to decide for yourself whether you subscribe to one point of view or the other, and whether you believe the netbook will ever become a true mass market device. The answers to those questions might well determine the future of the computer industry.