Tuesday, December 29, 2009

On dealing with people

I consider this a basic summation of what I believe when it comes to people.

This is what is:
Their Nature
Their Meaning & Path
Their Expectations
Their Hopes & Dreams
Their Quality
Their Health (of Heart and Mind)

If they are someone you care about, you must find out what they are, and if you cannot, you cannot trust them. When you know what they are:
If they are not in health, you never ignore them, no matter the trouble.
If they are of the best quality, you never abandon them, no matter the cost.
You never ask them to betray their nature, no matter the utility in doing so.
You never demand they forget their own meaning, no matter your own troubles.
You never spit on their dreams, no matter how big or small.
You meet their expectations as much as your own nature and path allow you, no matter what your dreams or expectations are.
You expect no less of them, no matter their nature, path, expectations, or dreams. Some allowance is made for health, but even then, you gauge their quality by the things they do, and the things they don't, compared to what they can.

Friday, December 11, 2009

MOS/DCA/M4U/VSA

My word, I haven't used this for much. It's a pity, as I'd like to.

There is, beyond the slightest shadow of a doubt, enormous potential for revolution in OS design in this world. I won't tell you I have "The Next Thing", as a promise like that can only be made after it has already been accomplished. But I have Pretty Cool Ideas, and I know they could do something pretty snazzy as far as OS design goes. I could MAKE them work, and I WOULD, if I had the right people on my side.

The basic principle behind pretty much all of it is that equivalent things are not treated equivalently in different aspects of computing. Two pieces of computing hardware (CPU, GPU, USB, HDD, CD/DVD, etc.) in a given computer don't communicate as equals; two computers with different operating systems don't communicate as equals; a computer does not treat all inputs and outputs generically, but as one master set plus extras; and different applications editing the same data do not share the same session data.

To someone who does not know what I plan, every one of those things is not only true but natural. Indeed, I expect there is no particularly obvious way to do any one of those things, and little enough reason why they should all be done under the same banner. However, if we allow ourselves the luxury of daydreaming about what has not yet been built, we see a computer that:

* Has multiple discrete components, all of which have standardized high-speed connectors, are hot-pluggable, have no exposed internals (in hardware or in software drivers), and are easy to replace and upgrade.
* Can support an indeterminate number of processors, inputs, outputs, users, etc., configurably and dynamically.
* Can store application sessions in a way that they can be restored not only by the same application, but by any application with equivalent functionality, assuming both parties conform to a session standard (a sketch follows this list).
* Can transfer both application sessions and user (GUI) sessions between devices running the same or compatible operating systems, even if those devices have drastically different hardware capabilities (including I/O) and run different software.
* Is capable of using the running user session on a mobile device to log on to foreign hardware and use it for display and computation transparently, without leaving user settings or data behind on that hardware, and without necessarily exposing it to user software which may be buggy or insecure.
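
To make the session-standard bullet concrete, here is a minimal sketch, in Python for brevity, of what a conforming session record might look like. Everything in it--the "text-editing/1.0" standard name, the field layout--is invented for illustration; no such specification exists yet, which is rather the point.

    import json
    import time

    def save_session(path, documents, focused_uri):
        """Serialize an editing session so that any editor conforming to
        the (hypothetical) standard can resume it, not just the one that
        wrote it."""
        session = {
            "session-standard": "text-editing/1.0",  # invented standard name
            "saved-at": time.time(),
            "focused-document": focused_uri,
            "documents": [
                {
                    "uri": doc["uri"],                  # where the data itself lives
                    "cursor": doc["cursor"],            # (line, column)
                    "selection": doc.get("selection"),  # None if nothing selected
                    "view": {"top-line": doc["top_line"]},
                }
                for doc in documents
            ],
            # Application-specific extras go in a namespaced side channel,
            # so a different conforming editor can safely ignore them.
            "vendor-extensions": {},
        }
        with open(path, "w") as f:
            json.dump(session, f, indent=2)

    save_session("session.json",
                 [{"uri": "file:///home/me/notes.txt",
                   "cursor": [12, 4], "top_line": 1}],
                 focused_uri="file:///home/me/notes.txt")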

I would love to head into that future by forging a really well-designed specification; I'm just having trouble keeping on track all by myself. That's not to say that having the specifications will by itself guarantee a finished product, but if we have them, and they're good enough, we should be able to find a place for them.

Monday, November 16, 2009

Fanfare for the Common Man

It didn't occur to me until I downloaded it off iTunes that the music I associate with the NASA space program, and Apollo in particular, is called "Fanfare for the Common Man."

Or rather, it didn't occur to me until the first time I listened to it, and realized that it was that same music. There was a moment of connection in my mind and I checked the title to make sure, but it was no mistake.

I think it's more than appropriate. The people who were the first to see the earth itself from space, with their own eyes, weren't soldiers or conquerors, politicians or rich men. The government placed the risk, and the reward, squarely on the shoulders of scholars... and on the shoulders of men who trusted those scholars with their lives.

Godspeed, mankind. May you never stop rising above those who came before you.

Thursday, October 22, 2009

Experimental Software

The thought occurs to me that, when it comes to software licensing issues, there exists an analogue which really should have been explored long ago, but which I do not recall any conversation about. In particular: software is trusted with all of our data, even data on which lives can depend, and yet the concept of "aircraft-grade" doesn't apply to software in these cases at all.

For people who aren't aware, the building of aircraft is a fairly well-regulated industry; I hope I don't have to suggest why. Not only are all the parts supposed to be aircraft-grade, but there is a large body of tests an aircraft must pass before it is declared worthy of FAA certification. If an aircraft is NOT certified safe by the FAA, then you know before you get in it that what you have is unregulated, and it may be no safer than the worst mechanic or part in it. Certified aircraft, such as airliners, are tested for reliability, and as long as they are kept up in accordance with FAA rules, you can be fairly sure they are worthy of your trust. (They have a pretty good record with this, too, in spite of the occasional story of an airliner crashing--a story made louder and more noteworthy precisely because it happens so rarely.)

I cannot think of a single good reason why software that covers critical sections of computing should not be required by law to be tested by an independent agency for its stability. Ideally this would replace the EULA's liability-waiving clauses entirely, as the company would in fact be required to account for the faults in its software. In contrast, if a piece of software has not undergone such testing, because it is an amateur effort or a work in progress, it should be clearly marked EXPERIMENTAL--with the understanding that if you use it, you are not allowed to blame the maker.

Ideally, there would be other classifications as well, some with stability testing, some without. Anything that touches user data MUST have some kind of classification, whether certified or experimental; users themselves will come to recognize the difference between certified and experimental software, making certification desirable for companies, while at the same time the fact of testing and the requirement of legal responsibility will dissuade con artists from abusing the labels.

Would Microsoft or Apple approve of such a scheme? Maybe, but not easily. Microsoft in particular is probably not set up in such a way that any part of its OS, much less the whole thing, could be completely certified. However, if a top-notch OS were certified, and especially if the other OSes were not, then I'm sure its makers would be more than happy to hold that over everyone's heads with glee.

What should be certified? A quick list.
  • Kernel (sans drivers, but including driver architecture)
  • Registry (for Windows)
  • User Interface (sans plugins)
  • File system
  • Network stack
  • OS initialization daemon (the process which starts applications at boot time)
  • Password/Identity managers
  • Network security
  • Any server process (ssh, ftp, http, samba, filesharing, remote file storage, remote desktop, etc)
  • Commercial file editors
  • File compression tools (zip, rar, tar/tgz, 7z, etc)
Possible useful certifications:
  • Certified User Application (will not lose user data)
  • Experimental User Application (registered but not guaranteed)
  • Not Classified: User Application (Not registered; use caution)
  • Certified OS component (will not crash and lose data or damage hardware)
  • Experimental OS Component (registered but not guaranteed)
  • Not Classified: OS Component (If you aren't a developer, you just shouldn't)
  • Certified Isolated Application (Does not use user data; games and graphics programs)
  • Experimental Isolated Application (...)
  • NC: Isolated Application (...)
  • Certified Secure User/Network Application (Data is securely stored or transmitted)
  • Experimental User/Network Application (...)
  • NC: User/Network (...)
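
To make the scheme concrete, here is a purely hypothetical sketch, in Python, of the kind of classification manifest a piece of software might ship with, and the check an OS could perform on it. The agency name, certificate ID format, and fields are all invented; no such certification body exists.

    # Purely hypothetical: the agency, certificate ID, and field names
    # below are invented for illustration only.
    CLASSIFICATION_MANIFEST = {
        "product": "ExampleEdit 2.1",
        "classification": "Certified User Application",
        "guarantee": "will not lose user data",
        "certifying-agency": "Independent Software Reliability Board",
        "certificate-id": "ISRB-2009-004217",
        "expires": "2011-10-22",
    }

    def warn_if_uncertified(manifest):
        """Sketch of an OS-side check: surface the classification to the
        user before the application is allowed to touch user data."""
        cls = manifest.get("classification", "Not Classified")
        if not cls.startswith("Certified"):
            print("Warning: {0} is '{1}' -- use caution with important data."
                  .format(manifest.get("product", "this software"), cls))

    warn_if_uncertified(CLASSIFICATION_MANIFEST)  # certified: prints nothing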

Thursday, October 15, 2009

Franken Bill and Governance in General

I'm not a political commentator. I hate talking about politics. I hope never to have to discuss them again. I am also not a lawyer. I'm just a blogger.

As you may have heard, 30 Senators from various states recently set their balls gently but firmly on the chopping block and voted against a bill that prevents government contractors from "restrict[ing] their employees from taking workplace sexual assault, battery and discrimination cases to court". From what I can tell, the details of the situation fully bear out that first moment of outrage one feels from the summary. If this was anything but a thinly disguised resignation letter, then the American government should be taken out back and shot.

I'm not being facetious. Senators serve six-year terms, and I doubt many, or even any, of those 30 senators are fresh out of 2008. In the last six years--in the last twelve, or even eighteen or more--how many laws have been passed? Although many have grown cynical of government recently, there is no way to say this except plainly: those senators show no respect for the weight of their actions on 300 million people.

The concept of democracy, and indeed of government itself, is simple--in order for a common peace to be established, some amount of power must exist to regulate inconsistencies and especially violence. However, power by its nature can create those same inconsistencies if misused. This is where democracy parts ways with monarchy--democracy was designed so that it would never need to be true that the people of a nation could not trust their leader, their governance, or their courts. Unfortunately, it IS true. That it is true now suggests it may have been true at points in the past as well, and many of the laws from those points remain on the books. With this in mind, if I could, I would vote no confidence in America's existing structure of law and governance until a thorough review and restructuring has left the country in the position it was intended to occupy at its founding: one where people need never exist under the rule of an agency they cannot trust.

Admittedly and unfortunately, this is not possible, not least because the entire federal government, from the Senate to the federal mail service and the public school systems, would have to be put on hold. However, I do not believe this negates the thrust of my message: for the people of America, and indeed the world, to truly believe in their political leaders again, the system must be designed so that those who would betray its trust are not allowed to participate. With that done, the existing body of law should be reconstructed in such a way that it can be understood plainly and cannot be abused by means of details, whether those details are abusable on purpose or by unfortunate mistake.

Should it be done? In the next ten years, no. Possibly not for the rest of the life of the United States. Acts of desperation cannot have the clarity necessary to truly escape the itching tendrils of corruption. It can be done, but it may take decades' worth of retrospection and deep thought, and those thoughts may have to be kept completely separate from the national dialogues.

So I suppose the best idea would be to come up with a theoretical body of law which would serve any country that tried to implement it, but with the understanding by all nations that until it is complete and refined to the same or a higher degree than the existing system, there will be no need to worry about implementing it. I personally would love to see such a thing; although I hate talking about politics and despise partisan discussion, the idea that it could possibly all work out, even if it doesn't happen in our generation, is an ideal beyond my reckoning.

Tuesday, September 8, 2009

The mythical wrist-computer

What ever happened to the forearm-mounted computer?

Now, I'm not as big a sci-fi buff as many, so I can't say it's a staple, but the idea of a computer scarcely more cumbersome than an armored bracer no longer needs an introduction in the genre. There are many sizes and flavors with many kinds of I/O, but the tech required to make it practical has always been kind of brushed over.

I'm holding my iPod touch in my hand. Including processor, battery, and storage, it's hardly half an inch thick. Assuming that half of that is the touchscreen display would be generous--so, if you had two matching clamshells (probably slightly curved), one or both a touchscreen, with the rest of the hardware on the other half of the arm, it would be barely thicker, if at all, than one iTouch. Scale up the dimensions of the screen just a smidge and you have something almost large enough for an actual keyboard, plus a second screen for data--or, like the iTouch and iPhone, turn it on its head, or any which way, to get the computing experience you desire. (You'd have to make the processor and battery marginally better to accommodate its needs, but the iTouch runs off a 500MHz processor--it's not like we don't have the means to make better.)

I'm sure the position would be awkward--a weight on a rounded surface like your forearm makes torque a big issue, as I (like others) notice with my watch every day (although for a well-fitting watch it isn't a problem--and I'm sure a properly fitted watch band up front, plus a support strap in back, might handle the torque well enough). Also, the forearm display probably isn't ideal for either typing or watching--but you'd get used to it, and even if you didn't, this is mobile computing--there aren't many solutions that have gotten it right in ALL of digital history.

Still, the geek in me feels that the body-mounted computer industry has been severely lacking for far too long. And let's face it--if you can mount a display like that on your forearm, you can make it JUST the display--and carry a laptop in a backpack with a flexible cable running down your arm. There aren't many solutions, really--or any--that would let you compute like that while standing or walking, especially without threatening imminent and trivially executed device theft. I wouldn't walk down many dark alleys with it, but short of being ordered to take it off at knifepoint, it remains personally attached.

Thursday, May 21, 2009

The Real Thing

I think I know what bothers me about... a lot of things. People who aren't professional; liars; and in general, the entire culture of people who simply seem to not take the world seriously.

Not surprisingly, the problem is in fact that they don't take the world seriously.

By not taking the world seriously, I mean that they seem to view it as a cheap toy--something for which there is a trick, a funny little thing that makes it do what you want, and that's it. The people aren't real people; the problems aren't real problems. It's just a lot of pretty lights, exciting times to be had, and a bunch of squares that are looking to screw up your daily contact high.

Similarly, when it comes to getting out of miserable times, the same people--and I am one at times--view the world like a puzzle, and a cheap one, where whatever the answer is, there is one, and it's something fairly simple. Play with things, and don't worry about doing things you don't want to--the world is only a toy, after all, and it can't make you do unhappy things. And people who accept that you "have to do unhappy things" still seem to want to relapse into this view that the world isn't really complicated--just cruel.

The world is something to take seriously. It can kick your butt while you're looking for the off switch. It will look at you funny when you twist it left and right looking for the "trick". The world will never be a $5 game at Wal-Mart. It's a real thing, and the only way to face it successfully is to be a real thing in response--alive, looking, learning, not becoming a cheap mockery of mankind by settling into patterns that end up leading nowhere.

Where the idea that we can be fake comes from, I don't know. But wherever it's found, it should be unceremoniously smacked upside the head.

Sunday, May 10, 2009

Fun with physics

Fun fact:

The earth is so large that if it were hollow, and two lights set two miles apart on its surface were pointed at the center, a human eye at the center could not possibly tell them apart.

(Assumptions: the human eye's angular resolution is 0.03 degrees; the radius of the earth is 6378.1 km. The minimum separation needed to resolve the two lights would be just over two miles--about 3 1/3 km.)
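
For the curious, the arithmetic is a one-liner; here it is as a quick Python check of the numbers above.

    import math

    EYE_RESOLUTION_DEG = 0.03  # assumed angular resolution of the human eye
    EARTH_RADIUS_KM = 6378.1   # the eye sits at the center, one radius away

    # Smallest surface separation resolvable from the center:
    # arc length = radius * angle (in radians).
    min_sep_km = EARTH_RADIUS_KM * math.radians(EYE_RESOLUTION_DEG)
    print("%.2f km (%.2f miles)" % (min_sep_km, min_sep_km / 1.609344))
    # -> 3.34 km (2.08 miles); two lights two miles apart fall just under it.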

Sunday, March 15, 2009

Theory: Obsession

Note: This theory is not based on specific evidence or scientific study.

It certainly seems tenable to me, speaking both mechanically and as an observer of people (however little of that I may do), that there should be a correlation between obsession and what we think about, and in particular do, while exhausted or when we should be sleeping. I will define obsession here as an irrational reappearance of a particular thought or topic in connection with unrelated events (perhaps a better word for psychologists would be fixation, but I'll stay with my terminology, if only because I am not a psychologist). I have two arguments in favor of this viewpoint, neither of them deductive nor scientific--in the case of the first, for lack of sufficient or sufficiently detailed information to qualify as such. Both are arguments for more research, not arguments that the premise itself is valid.

The first argument is based on what I think I know about the function of the brain at night. We know that the brain does a great deal of memory integration and other organizational and maintenance work during sleep. Supposing this work starts earlier, when the brain begins to assume (based on internal factors) that sleep is coming soon, it makes a logical kind of sense that if more stimuli (whether external or internal) are presented, then with some probability those stimuli will begin to be mentally associated with whatever memories, topics, or semantics the brain is currently integrating. That is to say, if you played baseball today and studied fluid mechanics at bedtime, you would be more likely to question the fluid mechanics behind a baseball pitch in the coming days than if you had studied before dinner and gone straight to bed when you got tired--in the sleepy studier's case, his current thoughts about airflow may have been misfiled as part of his thoughts about baseball, because his brain was categorizing his baseball-related memories from the day at the time he chose to study.

The second argument is not an argument at all, but rather an intuition pump. I will examine a number of cultures, however shallowly I may have observed them, and point out possible correlations which might be indicative of this relation. Examining my own (geek) subculture, I see two trends which may or may not be in correlation: people who spend their nights partaking in their hobby, and people who have become extraordinarily single-minded about such hobbies, such that their first thought about any particular topic or event may be related to the hobby--for instance, on seeing a cat stuck in a tree, a comic book geek's first thought might be of Superman or another superhero rather than the fire department. Perhaps the best-known example of this is the internet geek who instantly associates any attractive female with, we can only assume, the untouchable, airbrushed ladies of the internet who exist only to be looked upon by you, whatever they may be doing in picture or video to others. (This is not meant as judgmental--keep in mind that I am proposing a reason for this instant and injurious mental reflex, pointing out that it may not exist in every case, but that it explains such a case when and if it appears.)

A second subculture might be what could best be termed the 'chatterbox' subculture--people who, whenever you see them, will continue speaking about anything that comes to mind until everyone within hearing range has bled to death through their ears. Supposing only that these people are those who habitually had late-night phone calls with each other, and that therefore they developed the nightly habit of talking about anything that came to mind, it is therefore possible that this habit could become a part of their daily ethic, if this imperative to talk becomes bound to the memory of everything they did during the daytime due to this unintentional contamination. This would, interestingly, be an example of such contamination not in explicit thought, but in compulsive behavior.

Similarly, we can ascribe similar compulsions to people who stay up late doing other things--reading, studying or memorizing, watching TV, drinking, or having sex, for example. Although a trivial counterexample of each case is likely (someone who does these things at night a few times but is not adversely affected), I would propose that in every case, someone performing rote actions at night or before bed is more likely to become obsessed with them than if they exerted the same effort during waking hours, such as in the middle of the afternoon.

This theory also has interesting ramifications for the study of, for example, depression. According to my proposal, people who are habitually depressive at bedtime are likely to become depressive more permanently, unless something else happens to break the association and/or create a new one. Similarly, people (such as myself) who consider the evening to be time during which to studiously avoid thought of work or anything not enjoyable may find themselves having trouble focusing on their work or other hard tasks even during working hours.

It also seems capable of explaining additional effects, such as the use of sleep deprivation as an act of torture or reprogramming, Stockholm syndrome, lovesickness (which is just another term for my definition of obsession, when the fixation is upon a person rather than an act or idea), etc.

This hypothesis also seems testable and falsifiable, especially using the phrasing of the hypothetical consequences laid out previously, and I would be fascinated to know the results of such a study.

Multiple monitors as display devices

As near as I can tell (without having done any research in particular, a bias I readily admit), current approaches to single-computer multiple-video output come in four general flavors: multiple terminals, extended desktop, display clones, and custom solutions (the last being reserved for, for example, USB devices such as the Pertelian display). I think most people who have a second monitor get the same itchy feeling at least once--what else can we do with it? There's no good language to describe, or tool implementing, a properly flexible solution (IHNRTS*).

The idea has been itching in my mind for a while that the same windowing system should be able to run different display managers on different displays (where "the desktop" is one example of a display manager, as is any full-screen application, particularly one that changes the display resolution, such as a full-screen game, video, or presentation). This effect can be mimicked with the extended desktop to a limited degree, but the extended desktop is limited for one very important reason: there is exactly one meaningful way of switching input focus from one monitor to the next, and it depends on the extended-desktop metaphor, which breaks whenever an application other than the desktop takes exclusive control of the input.

It's been technically possible to connect two or more keyboards or mice to the same computer for ages, but there has never been a good reason--unless you want to run a mainframe with multiple sessions, the clutter is meaningless and adds to frustration and confusion more than anything else. There is little reason not to have only one full-sized keyboard--you will only be doing one thing that needs full access to it at a time, unless you're a mythical prodigy typing on four keyboards with your hands and feet and mousing with your knees and elbows, or are doing independent work in more than one context, in which case more than one computer is likely a viable solution.

Imagine, however, that you have a second--or third--monitor which doesn't have the capabilities of a full desktop, and is merely a display to which you send tool windows and other informative applications so that they do not clutter your workspace. When running an extended desktop, it is fine to use your primary mouse between the monitors to click options and rearrange windows a bit--but run a full-screen game on your primary monitor and you'll see, among other things, that your primary input is now captured and you have no feasible way to interact with the secondary display without the dreaded resolution switch back to the desktop, wasting precious seconds and interrupting whatever internal context the game may have. Surely a second mouse could have a use here, but as it stands, the primary display manager captures all input, and the tertiary display manager on your other monitor has no independent ability to strip the primary display of the input focus of even redundant input devices.

Thus we get to the proposed solution: a meta-display manager behind the desktop, which is responsible for determining which display managers have control over which output devices, and which manages input devices configurably--including switching the primary input devices between contexts, assigning a secondary input device to the non-primary display, and changing input and output mappings when a program gains or loses control of one or more resources.
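
To pin the idea down, here is a rough Python sketch of the core bookkeeping such a meta-display manager might do. Every class name and device name is invented for illustration; this is a data model for the routing logic, not a real windowing-system API.

    from dataclasses import dataclass, field

    @dataclass
    class DisplayManager:
        name: str                                    # "desktop", "game", "toolbar"...
        outputs: list = field(default_factory=list)  # display device IDs held
        inputs: list = field(default_factory=list)   # input device IDs held

    class MetaDisplayManager:
        """Owns all I/O devices and leases them to display managers."""
        def __init__(self):
            self.managers = {}

        def register(self, dm):
            self.managers[dm.name] = dm

        def grant_exclusive(self, name, output, input_dev):
            """Give one manager exclusive use of an output and an input
            device, revoking them from any previous holder while leaving
            every other device assignment untouched."""
            for dm in self.managers.values():
                if output in dm.outputs:
                    dm.outputs.remove(output)
                if input_dev in dm.inputs:
                    dm.inputs.remove(input_dev)
            self.managers[name].outputs.append(output)
            self.managers[name].inputs.append(input_dev)

    mdm = MetaDisplayManager()
    for name in ("desktop", "game", "toolbar"):
        mdm.register(DisplayManager(name))
    mdm.grant_exclusive("game", "monitor-1", "mouse-1")     # game goes fullscreen
    mdm.grant_exclusive("toolbar", "monitor-2", "mouse-2")  # toolbar stays live

The point of grant_exclusive is that the game can seize monitor 1 and the primary mouse without disturbing the toolbar display's claim on monitor 2 and the secondary mouse--exactly the scenario that defeats the extended desktop today.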

Unfortunately, it may require new APIs as well. Imagine, for example, that you wanted your instant messaging program to have a tool window on a tertiary display while its main window remains on the desktop. Even assuming the same window-drawing capabilities are granted to the display manager on the tertiary display, how do you programmatically determine or specify which output device the new tool should be displayed on? If the tool-window display manager is not part of the operating system, or otherwise not part of the basic windowing library, then any windowing operations will have to be done through its own set of libraries instead of directly through the windowing system. Although this creates overhead, these tool windows will most likely not be graphically taxing or real-time, so a slight performance hit is not likely to be problematic.
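
On the application side, the missing piece is simply a way to name a target display when requesting a window. The sketch below is an invented API surface--no current windowing library offers anything like it--showing how the instant-messaging example might look.

    # Invented application-side API: it only illustrates the new surface
    # such a system would need, and matches nothing that exists today.
    class WindowRequest:
        def __init__(self, title, preferred_display, fallback="desktop"):
            self.title = title
            self.preferred_display = preferred_display
            self.fallback = fallback

    def place_window(request, available_displays):
        """Resolve a request against the displays the meta-display manager
        currently exposes, falling back to the desktop if the preferred
        display is absent or has been seized by another context."""
        if request.preferred_display in available_displays:
            return request.preferred_display
        return request.fallback

    # The IM client asks for its buddy list on the tertiary display:
    req = WindowRequest("Buddy List", preferred_display="tertiary-lcd")
    print(place_window(req, {"desktop", "tertiary-lcd"}))  # -> tertiary-lcd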

As currently envisioned, the only principal problem this solves is maintaining user interactivity across multiple contexts when one context takes full control of normal user I/O, although it also offers an API which unifies control of non-primary display devices in a way that is not fundamentally different from the APIs used to manipulate primary displays. Whether more can come of this solution is a question best answered in the process of its implementation.

Regardless, it is an idea to be taken seriously, and a potential that could be fascinatingly useful.

(* I Have Not Researched This Statement)

Disclaimer

The entries in this blog, whether philosophy, theory, or design planning, are my own thoughts and work.  In any cases where I seem to be expounding upon said thoughts and work, I'd like to be known for my work, and as part of that, I'd dearly appreciate being credited for any use of it.

This blog may contain material which makes assertions or suggestions that I as the author am not qualified to make. Such arguments should be considered on their merits, and only within their proper contexts, and the author should not be considered an authoritative or verifiable source on such matters.

This blog may contain material which may be offensive to certain religions, philosophies, or other systems of taste or ethics.  In such cases I would be interested in knowing why it offends you, but I do not consider myself obliged to defer to you, nor to your religion, philosophy, etc.

This blog is copyright 2009-2013 Vincent Van Laak. All rights reserved, except as stated above or at the time of posting.