Steve Jobs is not an idiot
I keep thinking back to 1989. Apple had just introduced the Macintosh IIcx. This was way back in System 6.x days. A long, long time ago. But why did that year matter? Well, Apple was way way way ahead of the rest of the industry. I remember being in a computer science class back then where they forced us all to use DOS. In the journalism department we had just gotten brand new Mac IIcx’s. I think that’s one reason I went into journalism rather than trying to please my dad and become an engineer or a computer scientist.
Anyway, back then I thought Apple was going to take over the world. Apple’s equipment was just so brilliantly designed. They had the best printer, the best network, the best GUI, the best applications. Remember, back then Microsoft’s apps on Macs were WAY ahead of Microsoft’s apps on DOS and Windows was still a joke.
So why didn’t Apple win?
Well, go back to Rich Cameron’s classroom and look again. He wrote a ton of HyperCard applications for his journalism classes. That’s how we learned how to cover press conferences and all sorts of other things. Many of his tests were done in HyperCard too.
But Apple didn’t realize the power of developers. They ignored HyperCard. Never really improved it. Never gave developers really great tools. I remember meeting software developers who worked on Apple applications and they were always complaining about how hard Apple’s tools were to use, or how many rules they had to follow to make sure their apps were “Apple compliant.”
Many people think Apple didn’t win because Apple didn’t go Microsoft’s route of licensing the OS to clone manufacturers. I’m not so sure about that.
Look at what Microsoft did for developers between 1990 and 1995 and you’ll see that THAT was a huge reason Microsoft became dominant with Windows 95. I remember that when Visual Basic came out, lots of Apple developers looked over at it and said “that’s what HyperCard should have become.”
In 1989 Apple was in charge. By 1995 Apple was a second-rate company and by 1999 people were thinking that Apple was going to disappear. Of course we all know the rest of the story, right? Steve Jobs.
So, why do I say that Steve Jobs is not an idiot?
Because he’s had to learn the lesson of 1989. Give developers tools to build apps easily and extend your product or else they, and the market, will go somewhere else.
Anyway, right now Apple is acting a lot like the Apple of 1989. Apple is miles ahead with its iPhone. It’s pretty. The folks I’ve talked to who’ve had their hands on one say it pushes the experience of using a cell phone ahead by a mile, far beyond, say, my little Nokia N95 that’s sitting next to me right now.
But, why is Steve Jobs telling iPhone developers to pound sand? Dave Winer posits that Apple isn’t opening up the iPhone because they don’t have to.
Oh, but 1989 reminds us that choosing to remain unfriendly to developers will work for a while, but long term will doom you to second-rate status.
Steve Jobs isn’t an idiot.
So, what do I think will happen? Oh, I can see the Steve Jobs keynote in 2008 right now. “We’ve sold eight million iPhones, more than we expected” and “remember how I said iPhone apps needed to be done with JavaScript and HTML? Well, we heard from all of you that you wanted to play games on Pogo.com so we added Flash. And we’ve been working on our own iPhone applications for more than a year now and we’re sharing the developer tools we use internally.”
Go back to 1989. What if Apple HAD invested in developer tools? What if Apple, instead of Microsoft, had released Visual Basic? What if Apple, instead of Microsoft, had taken the “consumer coolness” that they had in the Apple II line and made it so that a geek working inside some big company could make a business justification to use Macs instead of Windows machines? (Hint: a big part of that is how easy it is to make business applications).
Maybe Apple is happy with its 5% market share, but I doubt it. Steve Jobs is not an idiot.
Watch him open up the iPhone next year. Until then, at least, Dori Smith should have a job (she’s one of the world’s experts on JavaScript and is out looking).
Or, do you think Apple will keep the iPhone closed and tell developers to pound sand forever?
Steve Jobs is not an idiot.
Font smoothing, anti-aliasing, and sub-pixel rendering
Apple and Microsoft have always disagreed about how to display fonts on computer displays. Today, both companies use sub-pixel rendering to coax sharper-looking fonts out of typical low-resolution screens. Where they differ is in philosophy.
- Apple generally believes that the goal of the algorithm should be to preserve the design of the typeface as much as possible, even at the cost of a little bit of blurriness.
- Microsoft generally believes that the shape of each letter should be hammered into pixel boundaries to prevent blur and improve readability, even at the cost of not being true to the typeface.
Now that Safari for Windows is available, which goes to great trouble to use Apple's rendering algorithms, you can actually compare the philosophies side-by-side on the very same monitor and see what I mean. I think you'll notice the difference. Apple's fonts are indeed fuzzy, with blurry edges, but at small font sizes, there seems to be much more variation between different font families, because their rendering is truer to what the font would look like if it were printed at high resolution.
(Note: To see the following illustration correctly, you need to have an LCD monitor with pixels arranged in R,G,B order, like mine. Otherwise it's going to look different and wrong.)
The difference originates from Apple's legacy in desktop publishing and graphic design. The nice thing about the Apple algorithm is that you can lay out a page of text for print, and on screen, you get a nice approximation of the finished product. This is especially significant when you consider how dark a block of text looks. Microsoft's mechanism of hammering fonts into pixels means that they don't really mind using thinner lines to eliminate blurry edges, even though this makes the entire paragraph lighter than it would be in print.
The advantage of Microsoft's method is that it works better for on-screen reading. Microsoft pragmatically decided that the design of the typeface is not so holy, and that sharp on-screen text that's comfortable to read is more important than the typeface designer's idea of how light or dark an entire block of text should feel. Indeed Microsoft actually designed font faces for on-screen reading, like Georgia and Verdana, around the pixel boundaries; these are beautiful on screen but don't have much character in print.
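To make the mechanics a little more concrete, here is a minimal, purely illustrative sketch of the core idea behind sub-pixel rendering on an RGB-striped LCD. It is not Apple's or Microsoft's actual algorithm (both do far more, including hinting, gamma correction, and smarter fringe filtering), and the function name and toy filter are assumptions for the example: glyph coverage is sampled at three times the horizontal resolution, lightly filtered, and each group of three samples is mapped onto the red, green, and blue stripes of one physical pixel.

```python
# Toy sketch of sub-pixel rendering on an RGB-striped LCD.
# NOT Apple's or Microsoft's real algorithm -- just the core idea:
# sample glyph coverage at 3x horizontal resolution, filter lightly,
# then map each group of three samples onto the R, G, and B stripes
# of one physical pixel.

def subpixel_render_row(coverage_3x):
    """coverage_3x: glyph coverage samples (0.0..1.0) at 3x horizontal resolution.
    Returns one (r, g, b) tuple per physical pixel; 255 is the white
    background, lower values mean more ink on that subpixel."""
    # Pad so the 3-tap filter has neighbors at the edges.
    padded = [0.0] + list(coverage_3x) + [0.0]

    # Simple 3-tap box filter spreads each sample onto adjacent subpixels,
    # taming the color fringes a hard 1:1 mapping would produce.
    filtered = [
        (padded[i - 1] + padded[i] + padded[i + 1]) / 3.0
        for i in range(1, len(padded) - 1)
    ]

    # Group filtered samples in threes: one (R, G, B) triple per pixel.
    pixels = []
    for i in range(0, len(filtered) - 2, 3):
        r, g, b = filtered[i], filtered[i + 1], filtered[i + 2]
        # Dark text on a white background: more coverage -> darker channel.
        pixels.append(tuple(int(round(255 * (1.0 - c))) for c in (r, g, b)))
    return pixels

# One row of a hypothetical vertical stem, two subpixels wide at 3x resolution:
row = [0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0]
print(subpixel_render_row(row))
```

The philosophical split described above happens before this step: a Microsoft-style renderer would first hint the stem so its coverage snaps cleanly onto pixel boundaries, while an Apple-style renderer would leave the fractional coverage alone to preserve the stroke's true weight, accepting the softer edge.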
Typically, Apple chose the stylish route, putting art above practicality, because Steve Jobs has taste, while Microsoft chose the comfortable route, the measurably pragmatic way of doing things that completely lacks in panache. To put it another way, if Apple were Target, Microsoft would be Wal-Mart.
Now, on to the question of what people prefer. Jeff Atwood's post from yesterday comparing the two font technologies side-by-side generated rather predictable heat: Apple users liked Apple's system, while Windows users liked Microsoft's system. This is not just standard fanboyism; it reflects the fact that when you ask people to choose a style or design that they prefer, unless they are trained, they will generally choose the one that looks most familiar. In most matters of taste, when you do preference surveys, you'll find that most people don't really know what to choose, and will opt for the one that seems most familiar. This goes for anything from silverware (people pick out the patterns that match the silverware they had growing up) to typefaces to graphic design: unless people are trained to know what to look for, they're going to pick the one that is most familiar.
Which is why Apple engineers probably feel like they're doing a huge service to the Windows community, bringing their "superior" font rendering technology to the heathens, and it explains why Windows users are generally going to think that Safari's font rendering is blurry and strange and they don't know why, they just don't like it. Actually they're thinking... "Whoa! That's different. I don't like different. Why don't I like these fonts? Oh, when I look closer, they look blurry. That must be why."
(Via Joel on Software.)
1864 baseball with Conan O’Brien
Hilarity Ensues
Powell: Close Guantanamo Now, Restore Habeas
This morning on NBC’s Meet the Press, Gen. Colin Powell strongly condemned the U.S. prison at Guantanamo Bay, calling it “a major problem for America’s perception” and charging, “if it was up to me, I would close Guantanamo — not tomorrow, this afternoon.”
He also called for an end to the military commission system the Bush administration has created to try Guantanamo detainees. “I would simply move them to the United States and put them into our federal legal system,” Powell said. He scoffed at criticism that the detainees would have access to lawyers and the writ of habeas corpus: “So what? Let them. Isn’t that what our system’s all about?”
“[E]very morning I pick up a paper and some authoritarian figure, some person somewhere, is using Guantanamo to hide their own misdeeds,” Powell said. “[W]e have shaken the belief that the world had in America’s justice system by keeping a place like Guantanamo open… We don’t need it, and it’s causing us far more damage than any good we get for it.”
Powell also sounded off on conservatives, including Vice President Cheney, who oppose diplomacy with Syria and Iran, calling their view “short-sighted.” Powell endorsed direct talks “not to solve a particular problem or crisis of the moment or the day, but just to have dialogue with people who are involved in this region in so many ways.”
(Via Think Progress.)
WWDC Monday at 10:00AM PDT / 1:00PM EDT, set your alarms
Filed under: Announcements
Monday morning El Jobso takes the stage to discuss Apple products for the first time since the iPhone unveiling in January (no, we don't really count D where he chatted with Mossberg and sat down with Gates). As always, you know where to turn for the whole spread, including real-time blow-by-blow coverage, live photography, and only the largest ring of iBookies taking bets on what Steve's gonna announce with Leopard and the iPhone.
Go here and bookmark this page; it's where the action happens Monday morning.
7:00AM - Hawaii
10:00AM - Pacific
11:00AM - Mountain
12:00PM - Central
1:00PM - Eastern
5:00PM - GMT
6:00PM - London
7:00PM - Paris
2:00AM - Tokyo (June 12th)
P.S. - Feel free to leave the usual timezones / predictions / wish lists / "STEVE I LOVE / HATE YOU!"s in comments.
(Via Engadget.)
Wikipedia’s Real Problem: Nerd Bias
There's been plenty of debate over the past couple of years about the merits of Wikipedia, generally focusing on how "trustworthy" the site is because of its anonymous contributors and lack of professional editorial review. But SomethingAwful has cut to the heart of Wikipedia's problems: its apparent nerd bias (via TechCrunch). The site, rather amusingly, compared the length of articles on related topics, such as modern warfare and lightsaber combat, or Buzz Aldrin and Jean-Luc Picard, concluding that the "nerdy" topics were more thoroughly written. Of course, many of the topics the article highlights reflect more of a pop culture bias (such as Aristotle vs. Oprah), and the sheer length of an article isn't a comprehensive test of its quality. The underlying point, though, is that people contribute in areas they're passionate about, and in which they have some knowledge. While on the face of it this piece would appear to give more ammo to Wikipedia's critics, perhaps the point to take away is that the site can serve as a useful reference in areas that tap the knowledge of its contributors, and that the community is capable of creating comprehensive reference works. While the SomethingAwful piece oversimplifies and overstates the gap in quality between the supposedly nerdy and non-nerdy topics, the challenge for Wikipedia is to keep growing the community so that the level of knowledge being shared across the board continues to rise.
(Via Techdirt.)



