About Me

I'm just someone struggling against my own inertia to be creative. My current favorite book is "Oh, the Places You'll Go!" by Dr. Seuss.

Saturday, January 24, 2009

JSON, XML and the relational model, part 4

An XML element is an unordered set of attribute names and their associated values.

<div class="title" id="heading" lang="en"> </div>


A JSON object is an unordered set of attribute names and their associated values.

{"tagName":"div", "className":"title", "id":"heading", 
"lang":"en"}


An XML element may contain an ordered tuple of "nodes", which may be plain text nodes or other XML elements.

<div class="title" id="heading" lang="en">
The Grand Adventure of <i>Lucious Swan:</i> The return of elemental qualities.
</div>


A JSON property may contain an ordered tuple of values, which may be primitives or other objects.


{"tagName":"div", "className":"title", "id":"heading",
"lang":"en",
   childNodes:[
"The Grand Adventure of ",
{tagName:"i", childNodes:["Lucious Swan:"]},
" The return of elemental qualities."
]
}


And that's basically all there is to it. This may appear somewhat bulkier than other XML-to-JSON translations. However, it preserves the unique, unordered quality of XML attributes and the ordered, non-unique quality of XML node collections. As a result, it's much simpler to implement readers and writers for this format, because there are fewer exceptions or other special conditions to account for. This is more or less a direct translation of the semantics of XML into JSON.
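To show how little machinery a "writer" needs, here's a rough sketch (toXml is just a name I made up for this post, and entity escaping is glossed over). tagName and childNodes are treated specially; every other property becomes an attribute.

function toXml(node) {
  if (typeof node === "string") {
    return node; // a text node (entity escaping omitted for brevity)
  }
  // every property other than tagName and childNodes is an attribute
  var attributes = "";
  for (var name in node) {
    if (name === "tagName" || name === "childNodes") continue;
    attributes += " " + (name === "className" ? "class" : name) + '="' + node[name] + '"';
  }
  // child nodes are serialized in order
  var children = (node.childNodes || []).map(toXml).join("");
  return "<" + node.tagName + attributes + ">" + children + "</" + node.tagName + ">";
}

toXml({"tagName":"div", "className":"title", "childNodes":["hello"]});
// '<div class="title">hello</div>'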

As an additional bonus, code written against this style of structure works exactly the same directly against a browser DOM representation of an XML document, since this is essentially a stripped-down subset of the browser DOM, using the same attribute names. Most server-side XML parsers produce essentially the same structure as well.
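Here's the matching "reader" sketch, building the structure from a live DOM element (domToJson is another made-up name; namespaces, comments and the like are glossed over):

function domToJson(element) {
  var result = { tagName: element.tagName.toLowerCase() };

  // attributes: an unordered set of name/value pairs
  for (var i = 0; i < element.attributes.length; i++) {
    var attr = element.attributes[i];
    // "class" becomes "className", matching the DOM property name
    result[attr.name === "class" ? "className" : attr.name] = attr.value;
  }

  // child nodes: an ordered collection of text or nested elements
  var children = [];
  for (var j = 0; j < element.childNodes.length; j++) {
    var node = element.childNodes[j];
    if (node.nodeType === 3) { // Node.TEXT_NODE
      children.push(node.nodeValue);
    } else if (node.nodeType === 1) { // Node.ELEMENT_NODE
      children.push(domToJson(node));
    }
  }
  if (children.length > 0) {
    result.childNodes = children;
  }
  return result;
}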

A translation of arbitrary JSON into XML using the same principles, however, is not so easy...

Thursday, January 22, 2009

Where does the logic go?

There's a trend in data-centric applications: more and more of the constraints and logic are being moved out of the database software and into the application code.

The trend results from the fact that the software technology industry is populated and driven largely by humans, and is thus subject to fads and irrational behavior. Understanding what's going on today requires a bit of perspective on the history of databases and their parallel development with programming languages.

SQL is the IE6 of the database language world. It breaks many of the rules of the relational model; in other words, it's a little bit like a calculator that performs multiplication incorrectly and doesn't have a minus operator. SQL is not complete enough to be a real solution. It was never developed beyond the prototype stage, and was never meant to be used in industrial settings. But then it was naively adopted by Oracle, which turned out to be a "killer app"; SQL became the industry standard instead of its technically superior competitors, and the rest is history. SQL's syntax is based on a set of command-line tabular data processing tools, and on COBOL. The result, full of bugs, inconsistencies, and a mishmash of proprietary versions and features with no grounding in math or logic, is a situation where it really is unclear what goes where.

As for the recent proliferation of ORMs: they are misguided, ill-thought-out attempts to patch over the obvious deficiencies of SQL. Database triggers and stored procedures are another misfeature trying to patch over SQL's problems.

If history had played out in a logical and orderly way, the answer to this question would be simple: just follow the rules of the relational model and everything will work itself out. Unfortunately, the rules of the relational model don't fit cleanly into the current crop of SQL-based DBMSs, so some application-level fiddling, or triggers, or whatever other stupid patch becomes necessary, and which stupid hack you use ends up being a matter of subjective opinion rather than reasoned argument.

So the real answer is to follow the relational model as closely as you can, and then fudge the rest of the way. Put the logic in the application if you're the only one using the db and you need to keep all your source code in a version repository. If multiple applications are likely to use the database, make the DB as bulletproof and self-sufficient as it can be. The main goal here is to ensure that the data remains consistent.

Wednesday, January 21, 2009

What will the future of programming look like?

It will likely be a formal specification language. Instead of indicating the "how", as we do today, we'll specify the "what". So instead of saying step by step how to implement a certain algorithm, we'll specify what the requirements for our program are, and the compiler will work out the most efficient algorithm automatically.
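As a toy illustration (no such compiler exists, and this code is just mine), compare the "how" and the "what" for something as simple as sorting:

// the "how": a step-by-step recipe
function insertionSort(xs) {
  var out = [];
  for (var i = 0; i < xs.length; i++) {
    var j = 0;
    while (j < out.length && out[j] <= xs[i]) j++; // find the insertion point
    out.splice(j, 0, xs[i]);
  }
  return out;
}

// the "what": a specification of sorting, with no algorithm in it.
// a hypothetical spec-language compiler would start from this alone
// and derive something like the function above.
function isSortedPermutationOf(ys, xs) {
  var ordered = true;
  for (var i = 1; i < ys.length; i++) {
    if (ys[i - 1] > ys[i]) ordered = false;
  }
  var canonical = function (zs) { return zs.slice().sort().join(","); };
  return ordered && canonical(ys) === canonical(xs);
}

isSortedPermutationOf(insertionSort([3, 1, 2]), [3, 1, 2]); // true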

The spec language may or may not be text based. I do not believe we will ever overcome the problem of linguistic ambiguity, even with a computer of equal or greater intelligence than a human's. Humans still misinterpret each other. Computers will always take what we say painfully literally. This is an unavoidable, though not necessarily intuitively obvious, result of various inviolable premises we hold about the operation of a computer. But higher level, more intuitive ways to state a problem unambiguously do exist.

I don't think that the Star Trek depiction of people programming holodecks is entirely far off. It will seem more obvious and intuitive, but computers will still make catastrophic errors based on the ambiguity of natural language.

The reason humans seem able to understand each other easily, in a way that computers can't, comes down to shared culture, our shared understanding that we don't always say what we mean (sympathy), and our ability to (if only occasionally) continuously clarify and disambiguate our meaning. Our ability to communicate with each other is largely the result of the common shape of our bodies and perceptions, and of our ability to imagine ourselves as other people (an ability we can see more clearly when we look at those who partially lack it, such as autistic people and people with Asperger's). This enables us to understand, in an extremely intimate way, why someone else may be making a specific pattern of noises with their vocal mechanisms and gesturing in a particular way with their faces and bodies.

We only derive meaning from these patterns of behavior by imagining what we would be thinking if we were doing those things. Computers lack human bodies, vocal mechanisms, and faces, and consequently any ability to sympathise with a human. Any attempt to make a computer truly understand us without making the computer into a human itself will be largely fruitless.

Despite all that, human comprehension doesn't work quite as well as most of us imagine it does. Consider the challenge each of us has, as programmers, in determining the shape of the program a client requires. It doesn't happen instantaneously. A successful program is the result of continuous revision over the course of many weeks, months, and years. That revision process would be very challenging (read: impossible) to replace with an automatic one. We can better automate the repetitive work, but we will never be able to artificially generate a perfect sympathy for our intent, artistic vision, and personal and ethical needs.

Banking on a future AI is a bet I wouldn't make, even if it were possible. We should think just as hard about what we would lose in such a proposition as we do about what we would gain.

Thursday, January 15, 2009

New paradigms in programming

In programming, I think we need more than just a new input method; there needs to be a new metaphor to go with it. It's a three-tiered thing: Model-Metaphor-Interface.

I've been thinking more and more lately that language is a poor metaphor for representing a computation. Language is something we use for communication. You could look at a program as a communication to a computer, and simultaneously a communication to other programmers. But there are other ways to communicate than just the written word. I'm working on a list here; let me know if I'm missing anything.

Methods of Communication
* Speech
* Body language
* Gestures
* Facial expressions
* Sign language
* Painting/drawing
* Dials, buttons, sliders, pointing, dragging (GUI)

Another metaphor for programming is building. Here are some possible ways of building functional things that could form the basis for a programming interface.

Methods of building functional things
* Gears, springs, and other mechanics
* Paper folding/cutting/gluing
* Patch cords
* Electronic circuits
* Hinges, ball bearings, wheels
* Fountains, valves, and pipes
* Archimedes machines: pulleys, levers, screws
* Lego

Yet another way of specifying a computation is by definition.

Methods of Definition
* Constraints
* Categorization
* Set theory
* Properties
* Symptoms
* Logic tables
* Rules
* Railroads

But keep in mind why we're doing this. There are obviously some weaknesses in the way programming languages work now (otherwise we wouldn't want to make new languages), so let's keep them in mind while we're designing our new languages.

Problems with current languages

The interface is hidden.

The APIs are hidden.

Side effects are a huge cause of bugs: any part of a program can affect any other part.
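A tiny made-up illustration: with shared mutable state, any distant caller can change what a function does, while the pure version rules that out by taking everything it depends on as a parameter.

var taxRate = 0.25; // global and mutable: anything, anywhere, can change it

function priceWithTaxImpure(price) {
  return price * (1 + taxRate); // the result depends on hidden shared state
}

// the pure version: everything that affects the result is a parameter,
// so no distant part of the program can change what this function does
function priceWithTax(price, rate) {
  return price * (1 + rate);
}

taxRate = 0.5; // some faraway module "helpfully" updates the global...
priceWithTaxImpure(100); // 150 -- surprise
priceWithTax(100, 0.25); // 125, always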

Refactoring: sometimes you find that you're repeating yourself, so you need an easy way to factor out the repetition into a macro, or a function, or some other metaphor. This is largely done by hand (or semi-automatically in Java) through a massive text manipulation effort. Is there a new metaphor that would make such a thing look utterly silly?

You need an easy way to define your own building blocks, or "words", or idioms, to use to build more complex structures. Your own tools, your own parts of the environment. A lot of languages don't let you do this in a first-class way.

Compilers punish the programmer severely for the slightest mistake.

Variables lack a sense of time: there's no way to query the history of all the values a variable has been set to in the past. In other words, can we have a programming language where we can "rewind" the progress of our program? The fact that a variable can change, frequently to unexpected values, is another source of bugs. This is the other half of the side-effects problem.
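Here's a toy sketch of what an answer could look like (not a feature of any existing language, just an illustration): a "variable" that records every value it has ever held, so its history can be queried and the program rewound.

// a variable with a sense of time
function VersionedCell(initial) {
  this.history = [initial];
}

VersionedCell.prototype.set = function (value) {
  this.history.push(value);
};

VersionedCell.prototype.get = function () {
  return this.history[this.history.length - 1];
};

// the value as of the nth assignment: the "rewind" operation
VersionedCell.prototype.atVersion = function (n) {
  return this.history[n];
};

var counter = new VersionedCell(0);
counter.set(5);
counter.set(42);
counter.get();        // 42
counter.history;      // [0, 5, 42] -- the variable's whole life
counter.atVersion(1); // 5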

Most programming languages have a fairly steep learning curve.

Making references to library or widget X throughout your code largely marries you to that library, making it difficult to switch to a similar, equivalent library without a lot of refactoring. This is largely due to the fact that libraries have names, and in order to use a library, we hardcode the name of that library and its methods throughout our code. Is there a better way?
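The usual partial answer (a sketch with made-up names follows) is to depend on a shape you own and name the concrete library in exactly one place. It doesn't make the problem disappear; it just concentrates it.

// the single place that names a concrete implementation; swapping
// libraries means rewriting this factory, not every call site
function makeConsoleLogger() {
  return {
    info: function (message) { console.log(message); }
  };
}

// call sites are married to the logger's shape, not to a library name
function greet(logger, name) {
  logger.info("hello, " + name);
}

greet(makeConsoleLogger(), "world"); // hello, world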

Poor parallelism: multithreading leads to bugs, race conditions, and deadlocks. Is there a better approach to parallelism that makes such bugs impossible?

Think beyond the computer screen, people. Maybe the keyboard is the most efficient interface for entering complex relationships and symbols. Are you sure? There are more alternatives than just a mouse, a touch screen, or a tablet. There are zillions of ways of interacting with a computer; we have just all settled on one or two rather ordinary ones.

Sunday, January 11, 2009

Of computers, calculators and interfaces


There is some controversy over my downvoting in this question. I will explain my reasoning by quoting Jef Raskin, the late usability expert and originator of the Macintosh project at Apple.

Calculator or Computer? It's true, many of us keep a calculator beside our computers. Why do you need this simple-minded device when you have a whole computer in front of you? You need it because you have to go through contortions worthy of a circus sideshow in order to do simple arithmetic with the computer. There you are, tapping away at your word processor, when you want to do a division: 375 packages of Phumoxx cost $248.93; what is the price for one package? On my computer, I have to open up a calculator window. To do this, I move my hand from the keyboard to the mouse, which I use to do a click-and-drag to open the calculator. Transferring my hands back to the keyboard, I type in the numbers I need or tediously cut and paste them from my document. Then I have to press a few more keys and finally copy the results from the calculator window into my document. Sometimes, the calculator window opens right on top of the very numbers I need, just to add insult to injury. In that case, I must use the mouse to move the calculator window out of the way before proceeding. It is much faster to grab the pocket calculator.


Do not think that just because you are a programmer, you are immune to usability issues. Using a software calculator can be a significant drain on your time, and when you're a professional software developer, time is some serious money, yo.

Jef Raskin went on to propose a solution for a better software calculator. I did not downvote the answers that I thought were similar enough to Raskin's solution, such as using Spotlight. If you're using TextMate, it has a feature which is almost EXACTLY the solution that Raskin proposed: you highlight some text which represents a mathematical operation, and you press the "calculate" key on the keyboard. Since most keyboards sadly lack a "calculate" key, in TextMate you use Ctrl+Shift+C instead. All text editors should have this feature. It is sad if they don't.

Also, his son Aza made a program called Enso. If you have Enso installed, you can highlight a calculation (anywhere in Windows), hold down the Caps Lock key, and type "calc", and it will perform the calculation, replacing the selected text with the result.
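To make that highlight-and-calculate mechanic concrete, here's a toy sketch. It's entirely my own illustration; this is not how TextMate or Enso actually implement it.

function calculateSelection(text, start, end) {
  var expression = text.slice(start, end);
  // allow only digits, whitespace and basic arithmetic through,
  // since evaluation would otherwise run arbitrary code
  if (!/^[\d\s+\-*/().]+$/.test(expression)) {
    return text; // not a calculation; leave the document alone
  }
  var result = Function("return (" + expression + ");")();
  return text.slice(0, start) + result + text.slice(end);
}

var doc = "375 packages of Phumoxx cost $248.93; unit price: 248.93/375";
var sel = doc.indexOf("248.93/375");
calculateSelection(doc, sel, sel + "248.93/375".length);
// the selection is replaced by its value, roughly 0.6638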

The quote continues:

Using an experienced computer and calculator operator as my test subject, with his word processing program open before him, I measured the total time it took for him to pick up a calculator, turn it on, do a simple addition, and return his hands to the keyboard to resume typing. It took about 7 seconds. I then measured the time it took for him to use the built-in calculator. He had to move the cursor to the menu bar at the top of the screen, find the calculator program, open the calculator, enter the sum, and then click back in the word processor so that he could resume typing. This took about 16 seconds.



Comments:

I could say the same thing for an actual calculator. I have to take my hand off the keyboard/mouse, find it, get it out, remove its cover, turn it on, etc. It is a non-argument. Also, that isn't how I open or use the Windows calculator. My hand never touches the mouse. – Simucal (Jan 7 at 4:10)

In Vista: Windows Key-"calc", enter. Proceed to enter your calculations using the numpad, never removing my hand from the keyboard. – Simucal (Jan 7 at 4:11)

It's a non-argument if you only decided ahead of time that the quote is wrong, and thus only bothered to read until you thought you had enough material to attempt to discredit it. It's a much better argument if you read the whole thing and actually spend a minute or two thinking about it. – Breton (Jan 7 at 4:18)
But this is my own fault for pissing you guys off before I posted the quote. Read it on its own merits, not because you think I'm an asshole. – Breton (Jan 7 at 4:19)

@Breton, I read your whole post. What you fail to realize is that Jef Raskin's solution is a solution to a problem that doesn't exist. Launching a calculator from the keyboard is so stupidly simple that it's a non-issue. Enso-style highlight/calculating is nice, but I don't see it as a killer feature. – Simucal (Jan 7 at 5:24)

You're focusing on the launching, and ignoring all the other issues that Jef Raskin brings up, such as windows obstructing your view, application switching, and the manual task of copying and pasting the result, which is actually more significant than just typing in the result from a visible display. – Breton (Jan 7 at 5:41)

This ignorance is what has led me to believe that you didn't, and you still haven't properly read the whole thing. – Breton (Jan 7 at 5:42)

Can you remember the hotkey combination for your calculator, and then press it? Can you do that faster than you can pick up an object off your desk? Are you sure? Perception is a tricky thing when it comes to keyboard shortcuts. Try it with a friend and a stopwatch. – Breton (Jan 7 at 5:47)

+1 for a thought-out response that I think is wrong. I can't think of anything that a handheld calculator can provide that a software one can't. Also, I just did the test described at the end: 3 sec using the Windows calculator. – BCS (Jan 7 at 7:21)

Re the other issues: Alt-Tab or enough screen space gets you past the calc-over-the-data issue. All else aside, the OP asked for a *programmer's* calculator; anything that isn't simple math (a.k.a. anything that would be special to programming) I wouldn't use a calculator (either kind) for anyway. – BCS (Jan 7 at 7:27)

@BCS software can do anything. I can't think of anything a hardware calculator can do that a software one can't, aside from being a physical object (which has its advantages). But it depends on the calculator. As I said before, I didn't vote down software solutions that I thought were decent enough. YMMV. – Breton (Jan 7 at 7:49)

And if worst comes to worst, we're programmers. If we needed something specific out of a software calculator, surely we'd make our own? – Breton (Jan 7 at 7:49)

You downvoted a whole slew of people because you were convinced by some theoretical article that there is only ONE TRUE WAY? And now you're surprised that people don't like you? Oh, and btw, that ONE TRUE WAY isn't in my environment so...what? I'm out of luck? You must be very popular at parties... – mdbritt (Jan 7 at 16:51)

I wouldn't count an article based on a set of facts and measurements as theoretical. Make your own conclusion from the premise, but you can't make up your own premise. I'm not surprised that people downvoted me. People don't like to be criticised. I'm not particularly popular at parties. I don't mind. – Breton (Jan 7 at 21:43)

@Breton: I'm not one to criticize the late Raskin unless necessary, but understand WHEN this piece was written, vs. today's world. Many modern programmers have the ability to launch their calculator quickly, have it not obscure the text beneath, and be done. (ctd next comment) – John Rudy (2 days ago)

I can do all of that on Vista way more quickly than I can even FIND my real desk calculator. :) Yes, the OP did want a physical calculator -- his choice -- but that doesn't make sense to most others here, and those opinions are perfectly valid. (As is yours; many of us simply disagree with it.) – John Rudy (2 days ago)

It was written in 1999-2000. Computer interfaces have not changed significantly since then (aside from the introduction of OS X). Quite a lot of people still have calculators next to their computers. All the commenters seem deaf to the phrase "I didn't vote down all the software solutions". Oh well. – Breton (yesterday)

@Breton: I read that part, but understand even the difference between 10 years ago and now. Modern large screens and massive display real estate are a major change for most of us. In 1999, I felt lucky to have 1024x768 -- at work now I have 2560x1024 on two monitors ... No obscuring via soft calc. – John Rudy (yesterday)

And I keep one on my desk, too. But I can get to the software one WAY faster. What I'd LIKE is a good programmer-oriented sidebar gadget calculator -- completely solves all the issues Raskin mentioned by being always visible. – John Rudy (yesterday)

Most of us actually don't have high-res screens like yours. But even if we do, I think it's a mistake to count more resolution as "more real estate" without taking physical dimensions into account. More space doesn't solve the problems. Maybe it's a matter of taste, but if you're used to tasting shit... – Breton (yesterday)

The point is that "more screen real estate" is just a band-aid. The problem is with the concept of an application window. I could go into more detail, or you could just read the book. – Breton (yesterday)

The concept of the application window? Now I'm curious for the more detail. If you have something like a Vista Gadget -- always visible -- it obscures nothing, and all that is required is a focus switch. Using a handheld calculator is fundamentally the same -- move hand, switch focus, go back. – John Rudy (11 hours ago)

(And why is screen real estate a band-aid? I get it if you're at 1280x768 at 13" -- as I am on the MacBook I'm typing this on -- but in the Windows world, it seems like 96dpi is moderately standard. Yes, I'm cross-platform.) – John Rudy (11 hours ago)

There's a big difference between switching "focus" on a computer and switching your mind's focus onto a different object. One requires the operation of a mouse/keyboard, and the other doesn't. In Raskin's THE, the app window is applauded as a huge step forward, but fundamentally unable to support... – Breton (2 hours ago)

...the number of simultaneous tasks, and the complexity of those tasks, that we have nowadays. The problem with app windows is that they represent a modal interface. Windows cause the meaning of specific gestures to change arbitrarily. This is harmful to habit forming. It's also the reason vi is so hard. – Breton (2 hours ago)

Switching focus from the computer to a calculator is a natural kind of gesture that we've evolved to do. Gestures against the calculator always lead to the same result. We can form habits around the operation of the calculator. Using a software calculator means that we're pressing a set of keys to... – Breton (2 hours ago)

...change the meaning of another set of keys, then making our calculation, then using a third set of keys to change the meaning of the second set back. We can't form habits around that second set of keys, because gestures against them don't always mean the same thing. – Breton (2 hours ago)

While habits are possible to form under such conditions, it's difficult because our expectations are often violated. The situation is worsened by the fact that the first set of keys and the third set of keys probably also change their meaning quite frequently. – Breton (2 hours ago)

For this reason you really want your calculator built into whatever IDE or text editor you're using, if you're going to go with a software solution. This reduces the number of mode changes you have to make in order to do a calculation. Gestures retain their meaning. – Breton (2 hours ago)

I'll throw in one more thing. Modes also have the problem of communicating to the user which mode they are in, and when they have switched modes. This communication usually fails because the user is focused on the task at hand, and not on the mode indicator. – Breton (2 hours ago)

In Raskin's solution, the number keys retain their meaning: they cause a set of numbers/operators to appear in sequence in your text document. Highlighting some text and pressing the "Calculate" key is a single gesture which always retains its meaning. No modes. – Breton (an hour ago)

Raskin's THE also has a chapter on what he thinks is wrong with modern programming environments, and how to fix it. If you're a programmer, it's well worth the read just for that chapter. – Breton (an hour ago)