Wednesday, November 27, 2013

User Interface: Windows 8 vs Classic vs ???

Disclaimer: I am not really in a position to talk about the success or failure of Windows 8's UI in terms of actual use.  Were this a discussion of the ins and outs of using it, I would be out of my depth, as I haven't used it beyond some slight time with the preview builds.  Fortunately, that is not the discussion at hand.

Starting from my first editorial on this blog (and continuing later), I've espoused improvements to the modern UI, which has grown... not stale, that would be the wrong word.  But the UI scheme that has served since the early 90s has, in general, not scaled with the capabilities of modern hardware, and perhaps more significantly, it has not matured even as home users, interested industries, and non-technical industries found more and more uses for it.  Arguably the last great leap forward in desktop design was the Windows Vista/7 taskbar, and that's not much of a recommendation.

Which isn't fair, truly; there are smaller advances in UI design happening in Linux (and other Unix-style systems) all the time.  And, more importantly, there is experimentation--and competition.  The K Desktop Environment (KDE) started in 1996 and has evolved greatly; as a UI, it is arguably more feature-complete than Windows.  Xfce also started in '96; GNOME started in '97; there are others, some short-lived, others well-loved.  Even lately, new UI is appearing; since 2010, Canonical (who make the Ubuntu distribution of Linux software) has been developing its own desktop, called Unity, essentially from scratch.

And in general, these UIs are at least slightly (and at best wildly) more powerful than Windows, specifically because Windows was never really designed for users.  It was sane, maintainable, and above all, predictable; when you ship millions of Windows licenses to businesses, you want to be sure that nobody is surprised by the product: not the users, and not the people who have to maintain the desktops, either.  Every advance in a user interface enables users, and that is not a practice Microsoft is versed in, as Windows 8's UI makes clear from its philosophy, let alone its design.  What part of the transition gives users more options than they had before?

At the same time, I am not content with the advances being made in UI on Linux.  When I said, above, that the "last great leap forward" came with Vista, I truly couldn't think of anything else.  Perhaps it is my bias; I'll be the first to admit I have one, and that bias is, generally, that I think I have a better idea, and it bothers me that nobody is coming close; in fact, barely a footstep has been taken in its direction.

That idea, and I've mentioned it before, is modularity.

Put another way, the most irresponsible component of Windows 8's UI is its dependence on snowflake code.  Window design has been fairly standard for a while, but when Microsoft decided to base the new UI on full-screen applications, they couldn't throw out existing UI design wholesale.  Instead, they decided their new UI was a "special snowflake," and changed the whole world to make sure it didn't melt.  Windows 8 includes brand-new subsystems whose entire point and purpose is bolting one particular mode of operation onto an OS that otherwise wouldn't support it.

The obvious and opposite design methodology would be to embrace the fullscreen app mode they wanted while providing hooks for adding new desktop modes whenever the fancy struck.  Far from being difficult to generalize, the potential of such a system, and its methods, come immediately to mind, along with several other desktop modes (a rough sketch of what such hooks might look like follows the list):

* Fullscreen app desktop (such as Windows 8 Metro)
* Windowed desktop (with and without widgets in the background, foreground, or sides)
* Widget-only desktop (similar to OSX Dashboard)
* Launcher desktop (icons and widgets without app windows)
* Null desktop (cannot receive application windows; only has the wallpaper, or perhaps something else.  For isolated monitors, small tertiary displays, trackpads, etc.)
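To make the modularity concrete, here is a minimal sketch, in Python, of what a pluggable desktop-mode interface might look like.  It is purely hypothetical; every class and method name is invented for illustration.  The point is only that the window manager talks to one small set of hooks, and each mode from the list above is just another implementation of them.

```python
from abc import ABC, abstractmethod

class DesktopMode(ABC):
    """One way of using a display surface; the window manager only
    talks to these hooks, so new modes can be added without touching it."""

    @abstractmethod
    def accepts_app_windows(self) -> bool:
        """May ordinary application windows be placed on this desktop?"""

    @abstractmethod
    def describe(self) -> str:
        """Human-readable summary, e.g. for a settings dialog."""

class FullscreenAppDesktop(DesktopMode):   # like Windows 8 Metro
    def accepts_app_windows(self): return True
    def describe(self): return "one app at a time, always fullscreen"

class WindowedDesktop(DesktopMode):        # the classic desktop
    def accepts_app_windows(self): return True
    def describe(self): return "overlapping windows, optional widgets"

class WidgetDesktop(DesktopMode):          # like OSX Dashboard
    def accepts_app_windows(self): return False
    def describe(self): return "widgets only"

class LauncherDesktop(DesktopMode):
    def accepts_app_windows(self): return False
    def describe(self): return "icons and widgets, no app windows"

class NullDesktop(DesktopMode):
    def accepts_app_windows(self): return False
    def describe(self): return "wallpaper only"

# The window manager keeps a registry; third parties register new modes here.
MODE_REGISTRY = {cls.__name__: cls for cls in
                 (FullscreenAppDesktop, WindowedDesktop, WidgetDesktop,
                  LauncherDesktop, NullDesktop)}
```

The particular shape doesn't matter; what matters is that once a desktop mode is just another object behind a common interface, adding a sixth or seventh mode is no longer a rewrite of the operating system.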

Okay, that's interesting; but something more interesting is born out of it.  Once you have different desktop modes in general, a new idea is dropped in your lap:

Different monitors in a multiple-monitor machine can become qualitatively different devices.

Multi-monitor computing has been a constant disappointment for decades.  Somewhere in the 2000s I purchased a small USB touchscreen monitor.  Such potential!  You could put all your widgets there, fill it with shortcuts, use it to control the rest of your system.  But the sad fact was that it was tied to (in my case) the Windows desktop, and could never be anything but an extended desktop.  It interacted with the rest of the desktop in all the old ways, even when you didn't want it to; windows dragged around would appear there, the screen would flicker and go dark whenever a fullscreen app took control, icons moved around if another monitor changed resolution, etc, etc.

Imagine instead a separate desktop.  It's not a hard thing to envision.  Window managers have been reluctant to embrace it, because when the only type of desktop is "windows, with or without widgets, and desktop icons," it's rather redundant.  But a separate desktop for widgets, or launchers, or one dedicated to a fullscreen app--this would make small touchscreen monitors interesting.  And not just monitors; there have been attempts to integrate such touchscreens into keyboards before, and gamepads, and other peripherals.  There was even a keyboard with every key made out of its own micro display!  (It was quite expensive.)  But in all cases the device manufacturer had to cheat.  The operating system didn't really support the functionality, so they had to do it themselves.  Anyone who wanted to co-opt that technology and use it for their own purposes also had to work with the device maker; so that wonderful little display on your keyboard never really could have its day in the sun, except briefly, as enthusiasts jumped on it to see what it could really do.  Then, as they discovered the time investment it took, they slowly trickled away.

But when that display on your keyboard is just a tiny monitor, when it can be taken over by a full-screen app (by the keyboard manufacturer) to display information, or partitioned into a widget dashboard or launcher, or just left blank as a separate desktop with a pretty wallpaper, then it isn't just about what enthusiasts can make it do.  Suddenly, the user interface enables the user.  Suddenly, the other tools you have work with it.  Tools including, and it's not to be underestimated, your average everyday software development tools.  There are suddenly no hoops to jump through; make a fullscreen app, assign it to your keyboard.  Make a widget, put it on the extra monitor.
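Carrying the earlier sketch one step further (again, every name here is hypothetical, not any real window manager's API), per-device assignment is just a mapping from displays to modes; the keyboard's little screen and the spare touchscreen stop being special cases and become ordinary desktops.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Display:
    """Any surface the OS can draw on: a monitor, a keyboard strip, a gamepad screen."""
    name: str
    width: int
    height: int

# Hypothetical assignment table the window manager would consult when
# deciding where windows, widgets, and fullscreen apps are allowed to go.
assignments = {
    Display("main-monitor", 2560, 1440):  "WindowedDesktop",
    Display("usb-touchscreen", 800, 480): "LauncherDesktop",
    Display("keyboard-strip", 320, 96):   "FullscreenAppDesktop",  # e.g. the vendor's status app
}

for display, mode in assignments.items():
    print(f"{display.name}: {mode}")
```

Nothing about the keyboard is special in this picture: assign it a fullscreen app and the vendor's status display works, assign it a launcher and it becomes a macro pad, assign it the null desktop and it's just a pretty wallpaper.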

This is where user interfaces should be, and it's almost embarrassing that nobody is close.  But Windows 8 isn't merely almost embarrassing.  Microsoft stepped away from enabling users because they hoped it would be better for their business.  They had a single ideal--what we already have, everywhere--and managed to twist it into a mockery of itself.

An idealist, no matter the ideology, rises to a new challenge when faced with one.  Windows 8 didn't rise to the challenge of merging mobile and desktop UI.  Microsoft faltered and fell, and not on the technical merits of its programmers.  When they decided to bet the business on it, the businessmen took their pound of flesh, and the ideologues lost their heart.  It stopped being a convenient addition as soon as it had to succeed; in order to succeed, it needed to be pushed, to be central, to be marketable.

When an application faces such pressures, the quality goes down, and people shrug and go on with life.  When it gets really bad, competitors spring up and the application dies on the vine.  When an operating system's quality goes down, there is no shrugging and going on.  It affects everything you do on the computer, and becomes unavoidable.

Getting the UI right is not mandatory, as long as you're willing to fall out of favor and be forgotten.  But I rather imagine that Microsoft wasn't expecting that to be the bargain that they were making when they started to skimp on its development.

Wednesday, June 5, 2013

Net Dreams

The BYOND engine (a game creation platform, far easier to use than normal coding, but with many of the drawbacks that that entails) is, by all accounts, in somewhat dire straits.  The central servers tend to lose money, and the engine itself, while it's updated occasionally, really gets no love in terms of features; it suffers, I imagine, from a lack of adequate coding support: people who would do the things nobody really wants to do, but that need doing.

The Space Station 13 project of which I am a tertiary member (a fork of /tg/station, originally built off an old fork of goonstation, itself built on some prior project now abandoned) runs on it, and I have put together minor projects in it myself, enough to really admire the simplicity of the engine.  These projects are diverse--action, strategy--but in each case I could sit down in an afternoon and have a framework, and over days or weeks have a minimum viable time-waster.  In short, I like it; but now, with the money straits they're in, they're making panicky choices to try to drive membership, and I worry that it'll only further distance them from both users and coders, producing the worst possible outcome for all three.

Not that users care--the final product is the thing, and will always be the thing.  Coders want to produce the final product, and anything that gets in the way drives them elsewhere.  And BYOND, like many types of middleware, can only monetize (as distinct from making money) by interrupting the flow somewhere.  Their latest experiment is throwing preroll ads at you when you connect to a server, presumably to pay for bandwidth costs--but our server, private and unlisted, makes no use of their matchmaking system, so we use their servers (whose bandwidth the preroll ads are supposedly paying for) barely if at all.  The suggestion, of course, is that we pay for membership instead of having this horse crap forced on us.

Needless to say this is the opposite of progress.

I am sympathetic.  I am.  I am unemployed and would like to make a living in software.  To produce something great and get a revenue stream out of it is an idyllic dream.  Maybe, if I produced a real hit, or dozens of smaller hits, I could live off of it.  Ah, wouldn't that be nice!

And maybe, when you open a diner, a rich person will like your cooking.  Maybe they'll give you a million dollar salary for the rest of your life.  So, why not learn cooking!  If you do, maybe you'll never have to cook in a diner for the rest of your life.

To be honest, where I stand, I would never want to retire.  Ah, say the old farts, the idealism of youth.  And yup, you got me, that's exactly what it is.  Naivete, if you will.  But for the same reason, and in the same way, I would never want to die; inevitably, of course, I will, but that statement hides a deeper, more frustrating truth: before I die, I will age.  I will decay.  And living in my own skin will become less and less bearable.  More importantly, that decay is a failure of my body; it is the status quo, but it is my body failing to live up to the demands of time and wear.

Abstract things like businesses are not subject to the same sort of failure, for one simple reason: they can be rebuilt indefinitely.  In practice, the people in charge dictate the form of the company; your own failure or success as a leader, and your own evolution, shape the company around you.  Even if the company's leader disappears, it need not fail until it decides to die; it inevitably will, but only because of our own mortality.

I digress; it is a habit of mine.

But the point, circuitously reached, is this: not all bullets are silver, and not all guns are golden.  Some things should live that will not pay for themselves.  Revenue--ah!  The bane of business, the heart of capitalism.  If you are not driving revenue, you are not Capitalist.  "You, petty programmer, petty engineer, your words are pretty, your designs are elegant, but unless it is backed by green, you are meaningless to me!"

Well, shit on you, that's a terrible way to run a country.  Or, to mangle a quote about democracy, "It's the worst form of [society] except for all the others we've ever tried."  Some things are infrastructure, which may fit neither in the purview of the common good (which is ruled, typically, by government), nor in the purview of self-interest (which is ruled by free industry); in this time of trendiness we tend to forget it, but it's true; things like telecoms, transport, and operating systems really do not belong in either of those two camps.  They require adequate competition (ruling out government), and do not adequately return capital investment (ruling out free market forces).  That's why telecoms the world over vary between grossly overpriced and entirely underfunded; it is not simply a matter of doing it, but of finding a way to make it get done.

And infrastructure, whatever else you say about it, needs to get done.  The Romans knew it, as has every city designer since antiquity; if the people need water, they need water, and if they need roads, they need roads.  Nobody today thinks about electric power as an infinite profit generator, and that's a healthy outlook; because it became infrastructure, cheap and ubiquitous, we almost can't live without it, especially in cities; if all power in the world stopped flowing, society would regress centuries overnight.

I did say digressions are a habit of mine, didn't I?

BYOND is, well, a trivial example of infrastructure, to be sure.  It's a language, some file formats, an editor, a compiler, and server software; it is many things, but in the end it is just entertainment.

Still, I wish it would survive, and without the cynicism-laden choices that they feel they have to make.  To be frank, if I had a dream for BYOND, it would be for them to be picked up by Google; they have the server know-how, the compiler know-how, the bytecode know-how (BYOND uses bytecode for its server software, as of course does Android), and many other things that a small-time project like BYOND could never REALLY hope for.  I thought of it when comparing, in my head, Google Go (a programming language) to BYOND script; the former is so much more capable, yet compiles instantly; even setting that aside as a benchmark, they must know so much from experiments with such things, experimentation that a cheap or free tool could never come to understand, especially when the project heads have to work for a living--not just to pay for their own lives, but also to pay for server costs.

And BYOND does need work.  After many years, the programmers only just got around to multi-threaded compilation--and even then, it only keeps the UI from freezing while it works!  There are other examples; the map editor is inelegant in countless ways, and underpowered in several more.  Their design sense, while not terrible (although their latest redesign of the pager leads me to doubt this statement), is not the best.  Some parts of their UI toolkit are, while I suppose somewhat robust, definitely overcomplicated and underpowered.  The games themselves, and the server software, are both single-threaded; the bytecode is trivially optimized at best, or not optimized at all.  It is, as we say, not professional work.

And what can one do?  I can't fix their problems for them.  I'm an unemployed, idealistic, naive programmer.  I want to live forever, to work forever, to learn and grow forever, but as I am now, I am just your average lost soul.  Desirous of place... of many things.  But for now, just lost.

And dreaming net dreams, some my own, some not.