Boosters of the tablet form-factor love to tell us that we’ve entered the post-PC era. For some reason, the successful PC and laptop form-factors that we’ve evolved over 30+ years will instantly be swept into irrelevance.
Yeah, I don’t buy that theory either. Tablets are fine for media consumption, casual gaming, or some light couch-based Web surfing. But when it comes to serious knowledge work — the kind that powers an economy — you need a keyboard and a big screen. You need a PC.
So what will these new PCs be like, Richi? Glad you asked.
But first, a disclaimer: Although this is an HP-sponsored site, I have no knowledge of HP's unannounced plans; this article is educated speculation. See also the box to the right, particularly the phrase “Articles…do not necessarily represent the views and opinions of HP.”
Here are nine ways that mainstream PCs are changing…
1. Power consumption
Reducing a PC’s power consumption isn’t just for laptops any more. The days of a desktop PC costing $1.50 per day in electricity are long gone.
In the future, energy is only going to get more expensive for organizations and consumers. That fact, plus the increased focus on CO2 emissions, means that PC vendors (such as HP) will continue to reduce the power requirements of their products.
Several technical factors are leading to this reduced power use, many of which I discuss throughout this article. Here’s one of them…
2. Mobile processors
In Intel’s latest roadmap, power usage drops to under one watt within three years. That’s the projected power use of the Atom-based Airmont system-on-a-chip (SoC) design, which is promised to use a 14 nm, 3D manufacturing process. Within 12 months, Intel should have made substantial progress towards this goal with the interim SoC product line, codenamed Silvermont.
Atom SoC roadmap overview (source: Intel)
Intel’s aim is to break ARM’s stranglehold on low-power mobile systems. If Intel gets its way, Windows 8 on ARM will be a short-lived dead end (just as Windows NT’s ports to MIPS, Alpha, PowerPC, and Clipper were).
AMD is also continually improving its “Fusion APU” designs. 2012 should bring the low-power products codenamed Wichita and Hondo.
3. Laptop batteries
Battery technology continues to improve, in part thanks to cross-fertilization with technology used in electric vehicles.
For several years, the standard way to build a laptop battery was to combine several lithium-ion cells. While better than earlier cell chemistries, those batteries suffered from limited energy storage, self-discharge problems, and noticeably shorter battery life after fewer than 500 charge/discharge cycles.
Today's state of the art laptop batteries are based on lithium-ion polymer and lithium iron phosphate chemistries. Looking ahead to what’s in the R&D labs, the next generations of batteries may be based on thin-film lithium-ion, lithium sulfur, or potassium-ion chemistries.
Each new generation of battery will be lighter and smaller for the same amount of charge, or will store more charge for the same size and weight. In other words, each generation has a higher energy density.
The new batteries will offer a longer useful life, allowing more charge/discharge cycles before they need replacing. They’ll also suffer from lower self-discharge, retaining their charge for longer when switched off.
In addition, the underlying battery cells are moving away from the standard, AA-style “18650” cylindrical design. This gives manufacturers more packaging flexibility, letting them fit batteries into awkward spaces and waste less space between cells.
This, plus the ongoing efforts to reduce power use, is leading to what some are calling Ultrabooks: incredibly thin and light laptops sold at “volume” or “mainstream” price points, such as the one shown in this concept image, courtesy of Intel:
A concept Ultrabook (source: Intel)
Of course, this picture shamelessly apes Apple’s promotional image for the MacBook Air (MBA). The intention is to highlight the concept of a laptop that’s thinner than an MBA, and doesn’t command the inflated price premium of Apple gear.
4. Solid state drives
An SSD (solid state drive) is a large flash memory drive, usually packaged as a drop-in replacement for a regular hard drive — in the same form factor and with the same interfaces.
As I wrote recently, solid state drives are ready for prime time. They’re not cheap, but think of their price as an investment that offers real returns. SSDs use less power than conventional, spinning hard drives; and they’re amazingly fast.
Even the slowest SSD gives you far better real-world performance than the fastest conventional hard drive. This not only means better user productivity, but also higher-quality work.
But typical SSDs have far less capacity than conventional hard drives, and that lack of internal storage space can be a problem for many users. Newer PCs solve this by including hardware that caches information on an SSD. This capability will soon become mainstream.
Here’s how this works: you store your information on a conventional hard drive, as usual, but the hardware also keeps a copy of frequently used data on an SSD. When you access that information again, it comes from the cache, which is perhaps 100 times as fast as the hard drive and some 20–50 times as big as the in-memory disk cache. And, because the cache is stored on an SSD, it survives a reboot.
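The caching idea can be sketched in a few lines of Python. This is a minimal illustration of the general read-cache pattern only, not Intel’s actual (proprietary) Smart Response algorithm; the block numbering, capacity, and simple FIFO eviction rule are all invented for the example.

```python
# Minimal sketch of an SSD read cache sitting in front of a slow disk.
# Illustrative only: real caching hardware uses smarter eviction and
# also caches writes.

class CachedDisk:
    def __init__(self, disk, cache_capacity):
        self.disk = disk              # dict: block number -> data (slow tier)
        self.capacity = cache_capacity
        self.cache = {}               # blocks mirrored on the "SSD" (fast tier)
        self.hits = 0
        self.misses = 0

    def read(self, block):
        if block in self.cache:       # fast path: served from the SSD cache
            self.hits += 1
            return self.cache[block]
        self.misses += 1
        data = self.disk[block]       # slow path: the spinning hard drive
        if len(self.cache) >= self.capacity:
            # Evict the oldest-inserted entry (simple FIFO policy).
            self.cache.pop(next(iter(self.cache)))
        self.cache[block] = data      # promote frequently used data
        return data

disk = CachedDisk({n: f"block-{n}" for n in range(100)}, cache_capacity=10)
disk.read(7)                   # miss: read from disk, copy to cache
disk.read(7)                   # hit: served from the cache
print(disk.hits, disk.misses)  # → 1 1
```

The second read of block 7 never touches the slow tier, which is the whole point: frequently used data migrates to the fast tier automatically.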
The first mainstream version of this technology is available with Intel’s latest chipsets: the Z68 series, with its Smart Response Technology. This caches reads and writes on any SATA SSD of up to 64 GB, although you’ll see the benefit with much smaller drives: e.g., 20 GB.
Of course, as SSD prices come down, the need for internal spinning storage will go away, but it’ll be some time before SSD prices drop to the 5–10¢ per GB that hard drives cost today!
5. SSDs replaced by on-board flash
In future, we’ll see more PCs come with some NAND flash memory, either directly soldered to the PC’s motherboard, or connected via a PCI Express card.
A PCI Express SSD: A high-end device today, but mainstream in future (source: Fusion-io)
This is a good way of supporting a cache, rather than using a packaged SSD connected via SATA; it’s less expensive and adds less performance overhead.
6. Multiple monitors
The productivity benefits of having two, three, or more displays are often discussed and debated. A pair of studies from the University of Utah found that the advantages exist; however, detractors like to point out that the studies were sponsored by a display manufacturer, so they may be biased.
Whatever the truth behind the productivity studies, it’s clear to me that my three-monitor setup is good for my productivity. And few knowledge workers who’ve experienced the benefits of multiple monitors are likely to agree to go back to a single one.
Some say that it’s not so much the number of monitors that counts, but the number of pixels displayed on them. However, there’s currently a large price gap between 20-24 inch 1920×1080 displays and anything bigger, with more pixels. The reason is that 1920×1080 is the so-called Full HD television resolution, so there’s an economy of scale advantage in sharing LCD panel production between PC displays and small HDTVs.
For roughly half the price of one single 2560×1600, 30-inch IPS display, you can buy three high-quality, LED-backlit, 23-inch HD displays. Three such displays, arranged side-by-side in portrait orientation, give a total of 3240×1920 resolution: more than 50% more pixels than on the 30 inch display.
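The pixel arithmetic above is easy to sanity-check; the resolutions are the ones quoted in the text.

```python
# Three 1920x1080 panels rotated to portrait (1080 wide by 1920 tall),
# placed side by side, versus a single 2560x1600, 30-inch display.
three_portrait = 3 * 1080 * 1920   # combined desktop: 3240 x 1920
single_30_inch = 2560 * 1600

print(three_portrait)   # 6220800 pixels
print(single_30_inch)   # 4096000 pixels

# Relative advantage of the three-display setup:
print(three_portrait / single_30_inch - 1)  # ≈ 0.52, i.e. >50% more pixels
```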
We’re even seeing dual-screen laptops take tentative steps onto the market, although few uses for them seem to justify permanently carrying around an extra screen, compared with occasionally carrying an add-on screen that’s connected and powered via USB 3.0 or Thunderbolt.
What’s that? You’d like to know what Thunderbolt is? I’m happy to oblige…
Thunderbolt is a new, high-speed connection standard for devices such as displays and external storage. It’s essentially a backwards-compatible enhancement of the DisplayPort standard that adds a PCI Express lane.
Thunderbolt is “daisychainable” — that is to say, up to seven devices can be connected to a single PC’s Thunderbolt port, by connecting them in series, from one device to the next. The daisychain starts from what looks like a Mini DisplayPort connector on the PC, then typically goes to a monitor, then another wire goes to the next device, and so on.
In a year or two, all mainstream PCs will come equipped with a Thunderbolt port. The initial version of Thunderbolt runs at 10 Gbits/second in each direction: twice as fast as USB 3.0. Future versions are predicted to reach 100 Gb/s.
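The bandwidth claim is easy to sanity-check. The raw signaling rates below come from the text and the USB 3.0 specification; the 50 GB file transfer is my own invented example, and it ignores protocol overhead and real-world device limits.

```python
# Back-of-envelope comparison of raw signaling rates.
thunderbolt_gbps = 10   # per channel, per direction (initial version)
usb3_gbps = 5           # USB 3.0 "SuperSpeed" raw rate

print(thunderbolt_gbps / usb3_gbps)  # 2.0 -- "twice as fast as USB 3.0"

# Best-case time to move a hypothetical 50 GB file (50 GB = 400 gigabits):
file_gigabits = 50 * 8
print(file_gigabits / thunderbolt_gbps)  # 40.0 seconds
```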
7. Thin is in (again!)
I’ve lost count of the number of times “next year” has been predicted to be the year when thin-client computing takes over the corporate desktop: from the early days of X Window terminals, via Sun’s and Oracle’s various efforts to convert the world to Unix- or Java-based “network computers,” to more recent initiatives from Microsoft and its partners around Windows terminal servers.
Today’s Bright Young Thing proclaiming that you can never be too thin is Google. The idea here is to use a computer with a dramatically cut-down operating system, Chrome OS, that’s designed to run only one app: the Chrome Web browser.
However, you’re not limited to browsing the Web, as Google and others are producing apps that run inside Chrome. These apps use new Web technologies such as HTML5, which allow a far greater level of interactivity than was previously possible.
For an example of what’s possible, take a look at the Google+ circle management user interface:
Also, some of these applications can work offline, without an Internet connection. For example, the next version of the Gmail and Google Docs Web clients will use HTML5 local storage. (A previous version used a proprietary local storage scheme, called Gears.)
8. Graphics acceleration: Not just for gaming
Animation in user interfaces is becoming more important. Desktop users’ expectations are rising, based on the user experience of new smartphone and tablet platforms. It turns out that subtle interface animations actually help users discover new capabilities.
That’s why improved 2D and even 3D graphics acceleration will be important in future PCs, but without drawing hundreds of watts of electrical power. On-board graphics hardware from Intel and AMD is becoming extremely powerful, especially when paired with enough fast memory.
9. Multi-touch
The success of multi-touch Windows 7 PCs from HP and others is only the beginning. As with animation, multitouch capabilities in new smartphone and tablet platforms are causing users’ expectations to rise on the desktop.
Microsoft is at pains to point out that the new Windows 8 touch capabilities aren’t just for tablets. The old shell, which dates back to Windows 95, is now considered “legacy,” whereas the new, multitouch-enabled shell is known internally as the “Modern Shell.”
In the so-called post-PC world, it’s evolution, not revolution. We’ll see lighter, greener, more productive PCs. However, PC users will also see the benefit of cross-pollination from the smartphone and tablet worlds.