
This post is from a very good friend of mine, Billy Hollis. He’s got some interesting food for thought and I think you’ll enjoy his perspective!

For over 15 years now, our industry has been struggling with a crucial tradeoff. We can get broad reach via standards, or we can get the best possible user experience with applications that take advantage of particular devices or platforms.

It's a stereotype that people in software development tend to be code- and technology-centric rather than user-centric. Ideally, we would like "one best way" to write our applications, with that "one best way" optimized to allow us to produce reliable, scalable applications as quickly as possible.

However, I don't think we're anywhere close to a universal "one best way". Until the day that web/cloud bandwidth is roughly equivalent to local bandwidth, that can't happen. Until the day that standards balance the need for security and the need for device access, that can't happen. Which means we won't be seeing it any time soon.

So, until that day, our industry must be open to the idea that the way we expose our applications to users must vary with circumstances and application requirements. Sometimes we are OK with a straightforward UI and we need broad reach, so we use web-standards technologies (e.g., social media). Other times we must have strong integration with devices and the best user experience we can design and create (e.g., clinical records management for healthcare).

Most applications will fall somewhere in the middle, and then we must make hard decisions on what aspects of our application are most important.

Unfortunately, as the people developing the software, we have a tendency to choose what's easiest *for us*. For example, it's usually much easier for us to produce a centralized application. Our deployment story is simplified, and our maintenance path is reasonably clear.

So, in making our tradeoff decisions, we should always remember this: users outnumber developers. Our applications and systems exist for them to get something done.

Large, diversified, distributed groups of users benefit from standards-based development. They get to use the software with whatever device or system they happen to have. They don't have to bend their lives to our software, and that's good for them. Software for such groups usually has a fairly small core feature set, and we can use standards-based UI technologies to give a more-than-adequate user experience.

Smaller, focused, often professional groups of users benefit from applications that make them as productive as possible. That may mean any of the following:

- Intelligent management of complex task-based workflow
- Heads-down data entry
- Management of rich data, including graphics and numbers that require context setting
- Visualization of data and analysis features for decision support
- Interface to various devices that supply information to be managed
- Security needs to comply with regulations such as HIPAA

A tangible example may help clarify. At the beginning of the .NET era, I did some light consulting with a local startup working on clinical records software. They decided it would be browser based, because everything they read by various experts in the industry told them that standards-based development was so important that it basically overrode almost all other considerations.

I was the only person advising them to strongly consider a client-based user interface. I know doctors. They demand usability, clarity, and responsiveness in the software they use. That's completely understandable, given the responsibilities they have.

The company devoted six years and perhaps ten million dollars to trying to make that browser-based software function as a clinical records system. At the end of that time, they threw in the towel on it. The doctors simply refused to use it. It worked, in the sense that it managed all the information the doctors needed. It just was not usable and productive enough to replace paper charts.

After six years, they began shifting to Windows Forms wrapping the browser to get more control over the interface. They had just begun to shift to WPF, but the money ran out. 

From what I've seen in HTML5, it still is not ready to give the kind of user experience this sort of application ideally needs. Among other limitations, it lacks the client-based state management needed to build in enough local intelligence. You could certainly build a better clinical records system in HTML5 than in HTML4, but I still believe it would fall well short of what's possible with other, client-based technologies.

In the last year or so, I've done consulting for the following scenarios:

- Petroleum management software that must run on a local machine, because it is used at oil wells in the middle of nowhere and manages complex data sets with thousands of measurements

- Medical management software that handles complex images and videos, with annotations, dictation, workflow, interface to devices, and strong security

- Kiosk software with touch for manipulation of 3D images

- Retail software that must run regardless of network connectivity, must be highly productive, must have a touch option, and must directly interface to devices such as credit card readers and cash drawers

- Home healthcare software that must allow users to travel to homes in a wide area, with no assurance of connectivity, and work with rich databases that must be on the local machine for complete availability

All of these scenarios require capabilities that would stretch HTML5 to the limit, if they could be accomplished at all.
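As a rough illustration of the kind of direct device access these scenarios call for, here is a minimal sketch of sending an "open" command to a cash drawer on a serial port from .NET client code. The port name and command bytes are hypothetical placeholders; the real values come from the device vendor's documentation. The point is that a browser-hosted application of that era had no standards-based way to do even this much.

```csharp
// Minimal sketch: direct device access from .NET client code.
// "COM3" and the command bytes below are placeholders; the actual port
// and "open drawer" sequence come from the device vendor's documentation.
using System;
using System.IO.Ports;

class CashDrawerSketch
{
    static void Main()
    {
        using (var port = new SerialPort("COM3", 9600, Parity.None, 8, StopBits.One))
        {
            port.Open();

            // Many drawers respond to a short escape sequence; the exact bytes vary by vendor.
            byte[] openCommand = { 0x1B, 0x70, 0x00, 0x19, 0xFA };
            port.Write(openCommand, 0, openCommand.Length);

            Console.WriteLine("Open command sent to the cash drawer.");
        }
    }
}
```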

I go into such depth to try to establish a basic concept that ought not to be that controversial: for the foreseeable future, our industry needs technology to create rich, productive applications with interfaces that run on client machines and that, in many cases, require access to local resources and devices on those machines.

That doesn't mean I regard HTML5 as worthless or irrelevant. Far from it: I abhor how miserably bad the user experience is on many websites, even those from major corporations. Anything that gives UI designers better options to create more usable and pleasant sites is A-OK with me.

However, it pains me to see a recurring attitude among many developers, especially among enterprise-level thought leaders, which is this: whatever works for me is what everyone should use.

I see it in evangelism around process X, Y, and Z, in open source evangelism, and in standards evangelism. All of those areas are valuable - but not universally. Not for every team and every application.

"One size fits all" doesn't even work very well for hats. I hate those adjustable baseball caps, and you don't see Major League baseball players using them.

One of my guiding principles is that any time someone promotes a "one size fits all" concept or technology, I pretty much pigeonhole that person as lacking experience or perspective, since "one size fits all" doesn't work for anything in life.

So it is with HTML5. As I said, I'm in favor of it. There are applications where it's a clear choice, and I hope to see it promote more and better UI design thinking among the software development community, which has historically been delinquent in that area.

But can the HTML5 advocates please give some respect to the alternatives that are needed for other circumstances and other applications? Can we stop with the "HTML5 is taking over, so X is dead" discussions?

We have a generation of users coming up that grew up on iPods and iPhones. Their usability standard is molded by apps on the iPad.

We developers need to step up and take on the challenge of doing better. Sometimes that will mean using the advanced capabilities of HTML5 to create better web applications. Other times, it will mean stretching our imagination to the limit to come up with innovative ways to leverage the capabilities of powerful client platforms.

For that range of needs, we'll need a range of UI technologies. We'll need HTML. We'll need Silverlight. We'll need WPF. We'll need other technologies for other client platforms, such as Objective-C.

But most of all, we'll need the attitude that we will do what's best for our users, not simply what's best or most comfortable for us.

Posted on Saturday, April 9, 2011 3:34 PM
