Wednesday, November 27, 2013

User Interface: Windows 8 vs Classic vs ???

Disclaimer: I am not really in a position to talk about the success and failure of Windows 8's UI in terms of actual use.  Were this a discussion of the ins and outs of using it, I would be out of my depth, as I haven't used it beyond some brief time with the preview builds.  Fortunately, that is not the discussion at hand.

Starting from my first editorial on this blog (and continuing later), I've espoused improvements to modern UI, which has grown... not stale, that would be the wrong word.  But the UI scheme that has functioned since the early 90s has in general not scaled with the capabilities of modern hardware, and perhaps more significantly, it has not matured even as home users, interested industries, and non-technical industries found more and more uses for it.  Arguably the last great leap forward in desktop design was the Windows Vista/7 taskbar, and that's not much of a recommendation.

Which isn't fair, truly; there are smaller advances in UI design happening in Linux (and other Unix-style systems) all the time.  And, more importantly, there is experimentation--and competition.  The K Desktop Environment (KDE) started in 1996 and has evolved greatly; as a UI, it is arguably more feature-complete than Windows.  Xfce also started in '96; GNOME started in '97; there are others, some short-lived, others well-loved.  Even lately, new UI is appearing; since 2010, Canonical (who make the Ubuntu Linux distribution) has been developing their own desktop, called Unity, essentially from scratch.

And in general, these UIs are at least slightly (and at best wildly) more powerful than Windows's, specifically because Windows was never really designed for users.  It was sane, maintainable, and above all, predictable; when you ship millions of Windows licenses to businesses, you want to be sure that nobody is surprised by the product: not the users, and not the people who have to maintain the desktops either.  Every advance of the user interface enables users; that is not a practice Microsoft is versed in, as Windows 8's UI makes clear from its philosophy, let alone its design.  What part of the transition gives users more options than they had before?

At the same time, I am not content with the advances being made in UI on Linux.  When I said, above, that the "last great leap forward" was the Vista/7 taskbar, I meant it; I truly can't think of anything else.  Perhaps it is my bias; I'll be the first to admit I have one.  That bias, generally, is that I think I have a better idea, and it bothers me that nobody is coming close; in fact, there is barely a footstep taken in its direction.

That idea, and I've mentioned it before, is modularity.

Put another way, the most irresponsible component of Windows 8's UI is its dependence on snowflake code.  Window design has been fairly standard for a while, but when Microsoft decided to make the new UI based on full-screen applications, they couldn't throw out existing UI design wholesale.  Instead, they decided their new UI was a "special snowflake," and changed the whole world to make sure it didn't melt.  Windows 8 includes entire brand-new subsystems whose whole point and purpose is to bolt one particular mode of operation onto an OS that otherwise wouldn't support it.

The obvious and opposite design methodology would be to embrace the fullscreen app mode they desired while providing hooks to add new desktop modes, should the fancy strike them.  Far from being a difficult task to generalize, the potential of such a system immediately comes to mind, along with several other desktop modes (a sketch of the idea follows the list):

* Fullscreen app desktop (such as Windows 8 Metro)
* Windowed desktop (with and without widgets in the background, foreground, or sides)
* Widget-only desktop (similar to the OS X Dashboard)
* Launcher desktop (icons and widgets without app windows)
* Null desktop (Cannot receive application windows; only has the wallpaper, or perhaps something else.  For isolated monitors, small tertiary displays, trackpads, etc.)
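To make that concrete, here is a minimal sketch of the mode abstraction, written in Python for brevity.  Everything in it is hypothetical (the Surface and DesktopMode names, the methods, all of it); no shipping window manager exposes this API.  It only illustrates the shape of the design: each mode is a small, self-contained policy, and adding a new one means adding a subclass, not a subsystem.

    # Hypothetical sketch only -- no real window manager exposes this API.
    from abc import ABC, abstractmethod
    from dataclasses import dataclass


    @dataclass
    class Surface:
        """Any rectangle the compositor can draw on: a monitor, a keyboard
        display, a trackpad screen, and so on."""
        name: str
        width: int
        height: int


    class DesktopMode(ABC):
        """One policy for what a surface shows and what it will accept."""

        @abstractmethod
        def accepts_app_windows(self) -> bool:
            """May ordinary application windows land on this desktop?"""

        @abstractmethod
        def render(self, surface: Surface) -> None:
            """Draw this desktop's contents onto the surface."""


    class FullscreenAppDesktop(DesktopMode):   # e.g. Windows 8's Metro
        def accepts_app_windows(self) -> bool:
            return True                        # one app at a time, maximized

        def render(self, surface: Surface) -> None:
            print(f"[{surface.name}] foreground app, fullscreen")


    class WindowedDesktop(DesktopMode):        # the classic desktop
        def accepts_app_windows(self) -> bool:
            return True

        def render(self, surface: Surface) -> None:
            print(f"[{surface.name}] overlapping windows, plus widgets")


    class WidgetDesktop(DesktopMode):          # similar to the OS X Dashboard
        def accepts_app_windows(self) -> bool:
            return False                       # widgets only

        def render(self, surface: Surface) -> None:
            print(f"[{surface.name}] widget dashboard")


    class LauncherDesktop(DesktopMode):        # icons and widgets, no windows
        def accepts_app_windows(self) -> bool:
            return False

        def render(self, surface: Surface) -> None:
            print(f"[{surface.name}] launcher icons and widgets")


    class NullDesktop(DesktopMode):            # wallpaper only
        def accepts_app_windows(self) -> bool:
            return False

        def render(self, surface: Surface) -> None:
            print(f"[{surface.name}] wallpaper, nothing else")

Note that nothing about the fullscreen mode is special here; Metro, in this scheme, would be one subclass among five.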

Okay, that's interesting; but something more interesting is born out of it.  Once you have, in general, different desktop modes, a new idea is suddenly dropped in your lap:

Different monitors in a multiple-monitor machine can become qualitatively different devices.

Multi-monitor support has been a constant disappointment for decades.  Somewhere in the 2000s I purchased a small USB touchscreen monitor.  Such potential!  You could put all your widgets there, fill it with shortcuts, use it to control the rest of your system.  But the sad fact was that it was tied to (in my case) the Windows desktop, and could never be anything but an extended desktop.  It interacted with the rest of the desktop in all the old ways, even when you didn't want it to: windows dragged around would appear there, the screen would flicker and go dark whenever a fullscreen app took control, icons moved around if another monitor changed resolutions, etc., etc.

Imagine instead a separate desktop.  It's not a hard thing to envision.  Window managers have been reluctant to embrace it, because when the only type of desktop is "windows, with or without widgets, and desktop icons," it's rather redundant.  But a separate desktop for widgets, or launchers, or one dedicated to a fullscreen app--this would make small touchscreen monitors interesting.  And not just monitors; there have been attempts to integrate such touchscreens into keyboards before, and gamepads, and other peripherals.  There was even a keyboard with every key made out of its own micro display!  (It was quite expensive.)  But in all cases the device manufacturer had to cheat.  The operating system didn't really support the functionality, so they had to do it themselves.  Anyone who wanted to repurpose the technology and use it for their own ends also had to work with the device maker; so that wonderful little display on your keyboard never really could have its day in the sun, except briefly, as enthusiasts jumped on it to see what it could really do.  Then, as they discovered the time investment it took, they slowly trickled away.
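Under that same hypothetical scheme, supporting these devices stops requiring vendor heroics.  The compositor binds a desktop mode to each display surface and simply refuses to route application windows to surfaces whose mode doesn't accept them.  Continuing the sketch above (again, every name is invented for illustration):

    class Compositor:
        """Maps each display surface to its own desktop mode."""

        def __init__(self) -> None:
            self._desktops: dict[str, tuple[Surface, DesktopMode]] = {}

        def assign(self, surface: Surface, mode: DesktopMode) -> None:
            # Rebinding swaps one surface's mode without disturbing the rest.
            self._desktops[surface.name] = (surface, mode)

        def place_window(self, title: str) -> None:
            # Route a new application window to the first desktop that accepts
            # windows; widget, launcher, and null desktops are never candidates.
            for surface, mode in self._desktops.values():
                if mode.accepts_app_windows():
                    print(f"'{title}' placed on {surface.name}")
                    return
            raise RuntimeError("no desktop here accepts application windows")

        def redraw(self) -> None:
            for surface, mode in self._desktops.values():
                mode.render(surface)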

But when that display on your keyboard is just a tiny monitor, when it can be taken over by a full-screen app (by the keyboard manufacturer) to display information, or partitioned into a widget dashboard or launcher, or just left blank as a separate desktop with a pretty wallpaper, then it isn't just about what enthusiasts can make it do.  Suddenly, the user interface enables the user.  Suddenly, the other tools you have work with it.  Tools including, and it's not to be underestimated, your average everyday software development tools.  There are suddenly no hoops to jump through; make a fullscreen app, assign it to your keyboard.  Make a widget, put it on the extra monitor.
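In terms of the sketch above, that workflow might be no more than this; still hypothetical, but notice how little of it is device-specific:

    comp = Compositor()
    comp.assign(Surface("primary", 2560, 1440), WindowedDesktop())
    comp.assign(Surface("usb-touchscreen", 800, 480), WidgetDesktop())
    comp.assign(Surface("keyboard-display", 320, 240), NullDesktop())

    # The keyboard maker ships a fullscreen status app?  Swap the mode:
    comp.assign(Surface("keyboard-display", 320, 240), FullscreenAppDesktop())

    comp.place_window("text editor")  # lands on "primary", the first desktop taking windows
    comp.redraw()

The keyboard's display is just another Surface, and the manufacturer's takeover app is just a fullscreen desktop assigned to it; swap the assignment and it's a widget dashboard instead.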

This is where user interface should be, and it's almost embarrassing that nobody is close.  But Windows 8 isn't almost embarrassing; it's the genuine article.  Microsoft stepped away from enabling users because they hoped it would be better for their business.  They had a single ideal--what we already have, everywhere--and managed to twist it into a mockery of itself.

An idealist, no matter the ideology, rises to a new challenge.  Windows 8 didn't rise to the challenge of merging mobile and desktop UI.  Microsoft faltered and fell, and not on the technical merits of its programmers.  When the company decided to bet its business on the new UI, the businessmen took their pound of flesh, and the ideologues lost heart.  It stopped being a convenient addition as soon as it had to succeed; in order to succeed, it needed to be pushed, to be central, to be marketable.

When an application faces such pressures, the quality goes down, and people shrug and go on with life.  When it gets really bad, competitors spring up and the application dies on the vine.  When an operating system's quality goes down, there is no shrugging and going on.  It affects everything you do on the computer, and becomes unavoidable.

Getting the UI right is not mandatory, as long as you're willing to fall out of favor and be forgotten.  But I rather imagine that Microsoft wasn't expecting that to be the bargain that they were making when they started to skimp on its development.
