Sunday, December 21, 2008

The death of PC gaming and the rise of the netbook

WARNING: The following post may contain technical terms that you will have to Google.  If you think your computer is run by magic faeries and a pinch of gold dust, you might want to skip this one.

I love games.  I play both computer games (Windows PC) and video games (Wii and PS3), and I have ever since my family got its first computer (1996 or thereabouts).  However, there is a definite trend in the games industry away from computers and towards consoles.  To me, this seems a bit odd: most people already own computers, many computers are as powerful as, if not more powerful than, current-generation consoles (especially the Wii), and wouldn't you think it would be cheaper and easier to consolidate one's games into one system, the computer?

Console game sales far outpace sales of computer games.  The Entertainment Software Association, or ESA, quotes statistics that support this claim: in 2007, console game sales accounted for $8.64 billion, while computer games accounted for a comparatively measly $910 million.

I think that there are a few reasons for this decline in PC gaming vis-à-vis console gaming.  The first is that dreaded and sick thing known as "Recommended System Requirements."

I remember trying to convince my parents to let me get a console when I was 12.  My main argument centered around the statement "there are no system requirements!  It always runs!"  This is my biggest beef with computer games, and I think it may have led to the downfall of the platform.  Console games, besides the fairly obvious "this game runs on this system" sticker on the front cover, have no system requirements.  It's either on such-and-such a system (and at full quality), or it's not.  

PC games, in contrast, come with a rather cryptic message on the bottom of the box listing the "minimum requirements" (i.e., whether or not you can load the menu screen) and the "recommended requirements" (i.e., whether or not you can actually run the game).  You have to know what kind of processor you have, how much RAM you have, what video card you have, and what speeds they all run at.  You have to actually know how your computer works!  Oh noes!
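Just to show how much homework the PC asks of you, here is roughly what "knowing your specs" amounts to, as a minimal sketch of my own in Python (nothing a game box actually ships with); the RAM query assumes a POSIX system, and the function name is mine:

```python
import os
import platform

def system_specs():
    """Gather the basic specs a PC game box asks about: CPU, core count, RAM."""
    specs = {
        "processor": platform.processor() or platform.machine(),
        "cores": os.cpu_count(),
    }
    # Total RAM via POSIX sysconf; os.sysconf doesn't exist on Windows,
    # hence the broad guard.
    try:
        page_size = os.sysconf("SC_PAGE_SIZE")
        page_count = os.sysconf("SC_PHYS_PAGES")
        specs["ram_gb"] = round(page_size * page_count / 10**9, 1)
    except (AttributeError, ValueError, OSError):
        specs["ram_gb"] = None
    return specs

if __name__ == "__main__":
    for key, value in system_specs().items():
        print(f"{key}: {value}")
```

Note that this still tells you nothing about your video card, which is exactly the problem: the single most important number for gaming is also the hardest one to look up.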

In my experience with computer games, I have countless times bought a game, taken it home, and installed it, only to find that it either doesn't run at all or runs at such low quality as to be virtually unplayable.

Console games?  No problem.  Go to the store, buy the game that says "I run on your console!", pop the disc in, and bingo!, you're playing within five minutes.  No long installs (well, not until recently, anyway), no system requirements, no pain.

The second reason that I believe computer games are on their way out has to do with hardware.  Video game consoles have pretty much the same hardware, no matter what version of a platform you buy (yes, there are occasional updates, but for the most part they consist of nothing more than a new DVD drive, or a slightly faster processor).  PCs, on the other hand, are constantly being updated.  New graphics cards (arguably the single most important factor in how a computer runs a game) come out almost monthly; Intel and AMD (the two major processor manufacturers) release new processors regularly; new standards of RAM (random access memory, the stuff that stores data temporarily, as opposed to the hard drive, which stores things permanently) come out yearly.  The only thing that doesn't seem to change constantly in a computer is the hard drive.

Like all Apple products, PCs are obsolete almost the moment you buy them.  Because there is no standard hardware configuration - no benchmark PC - games can, and do, vary over the entire spectrum of system requirements.  The prime example of this is a game called Crysis (pronounced like "crisis").  Gamers love to poke fun at Crysis (in the way that school kids poke fun at the bully, but are secretly afraid and awed by him).  Crysis is consistently described as about two years ahead of current hardware and continues, a year after its release, to be the gold standard of extreme graphics on the PC, an amazing feat.

However, I have never seen a PC that could run Crysis at the highest settings.  They do exist, but they cost $5000 and up.

Who wants to constantly have to burn money on a computer to make sure that it can run the latest games?  Not I.  And, it seems, not the 38% of American households that own a video games console.

The third reason that "hardcore" PC games are going to die (which finally explains the title of this post) is that the world is transitioning away from big, powerful computers to small, portable, less powerful computers.  The "netbook" is a term coined fairly recently for the new category of computers with low-power processors and screens under 12" across.

I am typing this on a laptop, and although it's a huge 17" desktop replacement, the very fact that I own a laptop and not a more powerful desktop is an admission that I value portability more than power.

It is no coincidence that a "niche product" like the netbook exploded into the mainstream in the biggest year ever for video game console sales.  This is the point at which computers and games go their separate ways.  Video game consoles have evolved, and will continue to evolve, into sophisticated multimedia centers with games at their cores, while computers will evolve into more portable devices centered around interaction via the internet.

Indeed, some netbooks already blur the line between the internet and a desktop environment.  Google recently announced a project, called Native Client, that would run native x86 code (the kind of programs you normally run directly on your computer) inside your browser, an environment presently limited to things like Flash and JavaScript.

Some may argue that games like World of Warcraft, which have an exclusively PC (and Mac) audience and are extremely popular, disprove this theory.  As much as I am inclined to dismiss these people as n00bs, they have a point.  WoW and other Massively Multiplayer Online games (MMOs) have a tremendous audience on the computer.  However, new MMOs, like BioWare's The Old Republic, a Star Wars-themed MMO that has received a huge amount of hype, will be available on consoles as well as computers.  As internet connection speeds increase and MMOs become more hardware-intensive, the limiting factor on the graphics of MMOs like WoW will cease to be connection bandwidth and will instead become the video hardware of the machines they are played on.

Computers and consoles are headed in fundamentally different directions, and only one can take gaming with it.  At the moment, it would appear that consoles have it mostly wrapped up.  Of course, what does it matter to Microsoft and Sony, the companies that make both our computers and our consoles, if we have to buy one of each?

Friday, December 5, 2008

Metricate me, Cap'n!

To avoid any awkward confusion upfront, "metrication" is defined not as "rewriting something in such a way that it is indecipherable to Americans," but as "the act, process, or result of establishing the metric system as the standard system of measurement."

The United States has officially recognized and endorsed the use of the metric system (officially the International System of Units - Le Système International d'Unités - or SI) since 1866.  However, it is one of only three countries in the world that have not adopted it as their primary system of measurement (the other two are Liberia - a former US colony - and Myanmar).

This is not a post about how great the metric system is (very great), or why the metric system is better than the conventional system (it just is), or even how stupid the US is for refusing to adopt such a common sense series of units (quite stupid).  No, this post is none of these things because all of these things have been written about ad nauseam.

Essentially, the United States has refused to switch to the metric system because of a myriad of political and cultural reasons.  It is the only developed country in the world that has continued to use conventional units (with the quasi-exception of the UK), and most people tend to believe that metric units will continue to be used only in academia and technical fields like robotics and engineering.  However, I believe that the US could, and will, switch much faster, and much sooner, than is presently predicted.  And all because of the Internet.

North America has the highest percentage of Internet penetration in the world (73%); the United States alone has about 220 million Internet users.  Internet culture has blended so much with American culture that it is not uncommon to hear Internet expressions like "lol" or "1337" (pronounced "leet" and short for "elite" for all you non-1337 h4x0rz out there) used in everyday verbal conversations.  The Internet also uses metric.

Think about it: the hard drive in the computer you are reading this on is measured in gigabytes (giga being the SI prefix for "one billion").  My Internet connection is measured in Mbps - megabits per second (mega being the SI prefix for "one million").  The resolution of the photos you uploaded to Facebook the other day is measured in megapixels.  You use the metric system every day on a computer and on the Internet.
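To make the point concrete (a sketch of my own, not anything from a standards body), those decimal SI prefixes can be written out in a few lines of Python; note that operating systems often report sizes in binary multiples (2^30 bytes) instead, which is part of why a "250 GB" drive shows up smaller once it's plugged in:

```python
def format_si(num_bytes):
    """Format a byte count with decimal SI prefixes: kilo = 10^3, mega = 10^6, giga = 10^9."""
    for prefix, power in [("G", 9), ("M", 6), ("k", 3)]:
        if num_bytes >= 10**power:
            return f"{num_bytes / 10**power:.1f} {prefix}B"
    return f"{num_bytes} B"

# A "250 GB" hard drive, as the sticker counts it:
print(format_si(250 * 10**9))   # 250.0 GB
# An 8 Mbps connection works the same way, just counted in bits rather than bytes.
```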

The international nature of the Internet also contributes to the metric influence.  Since so many (metric) countries are represented on the Internet, and since the Internet hosts content from all of them, it is inevitable that if one spends enough time on the Internet, one will encounter the metric system.

It is this subtle infiltration of America by the metric system that I believe will ultimately lead to a United States in line (literally) with the rest of the world.  90% of US residents aged 18-29 use the Internet, and so the metric system has finally learned what the Catholic Church has long known - "get 'em while they're young."  I foresee a kind of Glorious Revolution in which the metric system is finally introduced, in policy, as the primary system of measurement for the United States by the maturing Internet generation of Americans - those born in 1992 (the birth of the World Wide Web) and later.

Gone will be the Carter-era pamphlets on "metrication" and a "metric future," to be replaced by...well, nothing.  We don't need propaganda to convince us to use the metric system; we already use it voluntarily with our computers and the Internet.  The Internet has brought the world together, and has (recently) begun the process of standardizing a compendium of knowledge and experience (including a system of measurement) that transcends national boundaries.  As high technology becomes more and more integrated with our culture, the metric system, the measurement scheme of high tech, will become integrated as well.

Fear not the revolution, for it has already come.

P.S. Commenter Dr. Detroit makes a very good point, and I am reproducing part of his comment here: 
"There were many starts and fits in the direction of the metric system in the US since the fateful year of 1866 when it became legal throughout the land. A toxic combination of business lobbies (it's too expensive!), undereducated patriots (it's un-Amerikun!), and sheer inertia (imperial works, why bother?) has killed off any serious attempts at conversion several times. Although people may know their kilo-, mega-, and giga- prefixes thanks to PC's and the Internet, we still live in an America of 21-inch monitors, 3.5-inch hard drive bays, and hard-drive densities measured in bits/square inch." 
It is, sadly, true that high technology, because much of it is developed in America, is subject to the awkward dual use of metric and customary units.  I would concede that many high-tech devices are still measured in inches and other imperial units.  However, it is conceivable that we are in the first stages of a metric transformation that could be graphed as a parabola - that is, a transformation that starts slowly but builds upon itself to become a huge and significant force within a short period of time.  As the generations that never learned the metric system age (or, as my mom so eloquently put it to me, "when I die"), the metric system may gain ground at an ever-accelerating rate (hence the parabola comparison).  Already we are seeing small glimmers of hope, as computer manufacturers have completely rejected the idea of using fractions of an inch in screen-size measurements in favor of decimals (i.e., .1, .2, .3, etc.), as the SI does.