Category Archives: technology

The danger of software

My wife has bought a Mac. And yes, in my house, I am PC and she is Mac.

One of the dangers of the Mac is that it makes digital video editing too easy.

Why is that a problem?

Because now everyone thinks they are a movie director! Having tools this easy allows anyone to make a movie.

So…

Over the last twenty minutes, while I was watching the Olympics and blogging, she was crafting this video (complete with audio soundtrack) …

On a PC this would have been nightmarishly hard to do… but then again anything that is worthwhile should be hard to do!

Fun with Symantec

I am currently running Vista on my new shiny tablet.

And for the most part this has been a wonderful experience. I find the user interface to be significantly more elegant. The system as a whole seems to hang together better than XP did.

However, I did run into a problem with my VPN. Now it turns out, for reasons that are a complete mystery to me, that Norton 360 and my Cisco VPN will not work together. After a lot of pain and frustration I decided to call Symantec today and ask for help.

First the helpful person on the phone told me to wait a sec while she logged on to my computer. To which I responded:

How do I disable that feature?

After a lot of time on the phone, the helpful person told me to:

  1. Add a rule that allows any piece of software to use any port and any protocol to go in and out of my computer.
  2. Trust every interface.

I was in a rush, and just relieved I could use my VPN and have some kind of antivirus software, so I didn’t pay too much attention to what she told me to do.

But later on this evening I thought…

Hmmm…

Now, I may not be a network expert, but steps 1 and 2 are the moral equivalent of:

Turn off the firewall.
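
To make the equivalence concrete, here is a toy sketch of how a rule-based firewall decides whether to let traffic through. This is my own illustration, not Norton's actual rule engine, and the names and fields are made up. A single rule that matches any program, any port, any protocol, in both directions, allows everything, which is functionally the same as having no firewall; and as far as I can tell, trusting every interface just removes the last place where such a rule could have been scoped down.

```python
# Toy model of a rule-based firewall -- my own illustration, not Norton's engine.
from dataclasses import dataclass

@dataclass
class Rule:
    program: str | None = None    # None means "any program"
    port: int | None = None       # None means "any port"
    protocol: str | None = None   # None means "any protocol"
    direction: str | None = None  # None means "inbound and outbound"
    action: str = "allow"

def decide(packet: dict, rules: list[Rule]) -> str:
    """Return the action of the first matching rule; block if nothing matches."""
    for r in rules:
        if ((r.program is None or r.program == packet["program"]) and
                (r.port is None or r.port == packet["port"]) and
                (r.protocol is None or r.protocol == packet["protocol"]) and
                (r.direction is None or r.direction == packet["direction"])):
            return r.action
    return "block"

# The rule the helpdesk had me add: any software, any port, any protocol, both ways.
allow_everything = [Rule()]

# Every conceivable packet gets through -- indistinguishable from "firewall off".
print(decide({"program": "evil.exe", "port": 31337,
              "protocol": "tcp", "direction": "in"}, allow_everything))  # prints "allow"
```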

Symantec, rather than telling me honestly:

Dude, the product you spent $79 on doesn't actually work with this other piece of software, so go to advanced options and turn off the firewall.

had their helpful helpdesk person walk me through a convoluted process to disable the firewall without explicitly disabling it.

Blech.

Archive, damn you! Archive!

After reading about how the folks in Redmond cannot get Outlook .ost files to work if they are larger than 2 GB, and recognizing the challenges of not using cached mode, I decided, foolish boy that I am, to archive my email.

So, of course, when I did the archive, I foolishly assumed that the damnable software would do a "cut-and-paste", not a "copy-and-paste".

Darn.

It was "copy-and-paste". So after enduring the pain that is the archive process, I must now endure the pain that is the "select-thousands-of-files-and-delete-them" process, including the always fun "re-sync-your-local-folder-with-the-Exchange-server" spasm.

I am going to go to sleep now.

Maybe if I am lucky I will be able to use my email client tomorrow.

www.librarything.com

Bernadette remarked that the web site www.LibraryThing.com does in fact allow me to export my data.

Since I can now do that, I think I will actually give the place a whirl.

My first impressions are remarkably positive.

I like the use of tags to organize information, rather than the use of fixed columns.
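
As a toy illustration of what I mean (my own sketch, nothing to do with how LibraryThing actually stores its data): with fixed columns every book has to fill the same slots whether they apply or not, while with tags each book carries only the labels that fit it, and the same book can sit under several of them at once.

```python
# Toy comparison of fixed columns versus tags -- my sketch, not LibraryThing's data model.

# Fixed columns: every book must fill the same slots, relevant or not.
fixed_columns = {
    "title": "The Echo Maker",
    "author": "Richard Powers",
    "genre": "fiction",   # forced to pick exactly one
    "location": "",       # unused column still has to exist
}

# Tags: each book carries only the labels that actually apply,
# and can carry as many of them as I like.
tagged = {
    "title": "The Echo Maker",
    "author": "Richard Powers",
    "tags": {"fiction", "neuroscience", "lent-out"},
}

# Organizing the library then becomes simple filtering on tags.
library = [tagged]
lent_out = [b["title"] for b in library if "lent-out" in b["tags"]]
print(lent_out)  # ['The Echo Maker']
```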

I like the fact that the web site can search Amazon.com or the Library of Congress to fill in information about books.

For example, I typed in “the echo maker” in the “add books” section of the web site.

And then the web site very quickly gave me the option to select from a set of books that matched “the echo maker”.

I picked the one by Richard Powers, and, just like that, my library now had all of the information about that book.

All it took was 15 key presses and one mouse click!

Update (Feb 17, 2008): corrected the web site address, which I had mistyped.

Who owns my data?

After I wrote about my Access adventures, Michael Rubin recommended a very interesting web site as an alternative to rolling my own application.

While I was looking at the site, I could not help but notice the word Beta.

Here’s my anxiety: it will take a lot of time and effort to enter all of the data into this web site. And this web site is a small venture by a small team in Portland.

What happens if this small web site goes out of business? Do I have to re-enter all of my data all over again?

I really want the ability to have a hard copy of the data that is independent of the web site, so that if the web site goes down, I can move my data to some other provider.

And as much as it pains me, I feel compelled to agree with the folks at Data Portability. I own the data, not the company that keeps a record of it. I don't have a problem with them profiting from my data, but damn it, why won't they give me a copy?

Math, Engineering and the field known as Computer Science

When I started my career many moons ago at SGI, I discovered that there was a set of really smart folks who viewed software development not as an engineering discipline but as something more akin to developing mathematics or writing poetry.

There is a certain intellectual appeal to such a notion. If software is indeed poetry, or beautiful, then computer scientists are not merely engineers building devices using well-defined rules; we are artists creating something that has enduring value.

More to the point, it means that the software itself has intrinsic value beyond that of the particular product that it ships with.

So as software craftsmen, our job is not merely to satisfy the immediate customer requirements but to develop the perfect set of software: software that has enduring value and also happens to meet the customer requirements.

For some reason, this never really worked for me. At some point, software is good enough, not perfect, and we need to move on to the next problem.

What appeals to me about software development is the part where the end product is used by someone to solve a real problem. I want to engineer, which means to make tradeoffs, a solution that people want to buy.

I am not interested in understanding what can be developed within the constraints of computability.

And in many ways, I am beginning to think that the software-as-mathematics crowd tends to have a dim view of software as engineering, because they are not engineers and don't see beauty in engineering.

Which puts me in a different camp from the software-is-beautiful camp. I am in the camp that views the pursuit of software beauty, as an end unto itself, as a waste of time.

Now let's be clear: it's not that I think software cannot be perfect or beautiful; I do. Nor is it that I think there is no distinction between beautiful software and butt-ugly software; there is. And thanks to the discipline that those great engineers instilled in me at SGI, I think I was actually able to approach beautiful code. It's just unclear to me how the pursuit of this perfection got me any closer to my revenue goals.

I find beauty in products that exceed customer expectations, that are cheaper to develop than was expected and are easy to evolve at a low cost. I view the underlying software as a means to that end, but not the end in and of itself. And yes, I do understand that sometimes it's elegant, beautiful software that makes that possible.

I think the art of engineering is to understand where to invest your time and energy to get meaningful product differentiation and where to just live with problems. And I think it's an art, because you never really know if you pulled it off until someone, somewhere opens their wallet and forks some money over to you because they want your product: not your software, your product.

Which brings me to the title of my blog. I think that there is a tension in computer science between engineering and mathematics. I think that there is a class of computer scientists who think of themselves as mathematicians building fundamental abstractions. And I also think that there is another class of computer scientists who think of themselves as engineers who try to deliver differentiated products that exceed customer demands with imperfect software.

And I think that between the two camps there can be no reconciliation.

The quest for software perfection

When I started my career as a software engineer at SGI in 1996, I had the privilege of working with a great engineer.

This engineer and I had very different perspectives on software. I viewed software as a means to an end, as a vehicle to deliver the features that were asked of me. The perfection of the software was immaterial; what was material was how fast you could deliver those features. In fact, sloppy, disorganized, poorly structured code was okay as long as it worked. What mattered was the function, not the form.

He, on the other hand, felt that software was like poetry. That it had its own intrinsic beauty and that its beauty was an end in and of itself.

That's not to say that he did not care about the outcome and the product. He was always passionate about delivering value to customers. He just felt that the elegant solution was always better than the quick solution.

Being young, and he being great, I was convinced that elegance was worth the price in time and effort.

I’m not sure I still agree with him.

My career is littered with software systems that are no longer in production. SGI’s kernel was EOL’ed last year. NetCache was sold off to Bluecoat. Most of the code I wrote for DFM has been re-written as more and different requirements came into existence.

And he would say that is natural and normal and a reflection of the natural process of things.

And I wonder.

Was it really worthwhile to strive to create the perfect solution given the market pressures? Would I have been better off to just get the job done in the most expedient way possible?

Ultimately, I think the answer boils down to an engineering tradeoff. The perfect solution makes sense if you understand the requirements and the requirements are stable. But if the requirements change, then your attempt to create perfection has to be balanced against expediency and need.

Although I can appreciate a beautiful piece of code, I somehow am more inspired by a system that is easily adapted: a system whose core abstractions, although imprecise, are in the right general area and allow for substantial independent directions of innovation.

Let me try this differently.

I think it's far more valuable to know what the core set of abstractions should be, and their general properties, than to specify them completely. Instead of trying to perfect them in isolation, one should expose them to the real world and then learn. And if the abstractions are correct, over time they will become precisely defined and perhaps, at some point, perfect, as they no longer need to evolve.

But I suspect that I will have long since moved onto the next set of imperfect abstractions.

And in retrospect, that engineer always remarked that it's much easier to replace an elegant, easily understood solution than a complex, baroque, over- or under-engineered hack that was expedient.

gPhone: Google’s capitulation?

The recent announcement by Google around the gPhone has been portrayed by the press as a game-changing move: that a new free OS customized for cell phones somehow, once again, changes everything, and that, more to the point, this was Google's plan all along.

I disagree. In fact, I believe Google had grander ambitions, that those ambitions proved too costly, and that the recent announcement was an admission that those plans were shelved. It was Google signaling to the cell phone carriers that it was going to play by their rules.

Hence the blog title: Google's capitulation?

Here’s what I think:

Google’s management team correctly observed that the future of search was search on the cell phone. And that location based search on a cell phone was going to be a tremendous revenue opportunity.

The challenge was that the current cell phone carriers act as tax men. You can’t sell a service without putting the service on a cell phone that the carrier sells. And the cell phone carriers themselves had ambitions on how exactly those advertising services were going to be delivered.

But why would the carriers be any different from Microsoft and Yahoo in their ability to compete with Google in search?

Unlike Microsoft and Yahoo, the cell phone carriers, thanks to their connection to customers through Yellowbook and the fact that they sell phone numbers, have the sales force and the business processes to create a real alternative local search advertising market.

So if you’re Google, and you have more money than God, you think outside of the box. If the problem is that the cell phone carriers control access, you need to create a new network that does not have the cell phone carriers acting as the gatekeepers.

To do that Google needed three pieces:

  1. A network that could carry phone calls that was not owned by the cell phone carriers.
  2. A set of devices that would connect to that network.
  3. A set of compelling services that would cause people to select that network.

So what was the plan?

Let's look at them in reverse order. For (3), Google was building its own applications, and then buying startups that offered innovative cell phone services. For (2), Google was working on an OS and a reference platform. And for (1), Google had a three-pronged strategy. The first prong was to build Metro WiFi like they did in Mountain View and San Francisco. The second was to bid on the wireless spectrum and either build the network or lure someone else to build it. The third was to create a regulatory environment that would allow other virtual carriers to build their own networks.

I believe that the cost of (1) became prohibitive along two dimensions. The first was the sheer dollar cost of building the network. The second was that while Google was building out its competing network, the existing cell phone carriers would treat Google and its software as enemy number one; in many ways, the Google move might force the carriers to embrace Microsoft and Yahoo. The potential loss of revenue while the network was being built out, plus the cost of the network itself, just made the strategy impractical.

Confronted with this reality, Google scaled back its ambitions and, like a researcher who has failed to prove something significant, looked for the pieces of that strategy that were still valuable and tried to salvage some value from them.

And that's what the gPhone announcement is about. Unlike every Google announcement in the past, this one was vapor: nothing real, no product, just a statement that the grand cell phone strategy now amounted to releasing a free OS to cell phone carriers.

With that announcement, Google was signaling to the cell phone carriers that its plan was to play by their rules. Like Microsoft, BlackBerry, Nokia, Palm and Apple, it was going to release an OS that the handset providers could port to their devices and that the cell phone carriers could certify, and Google would continue to be a software provider inside those walled gardens.

Google was no longer planning to build an open, unwalled garden.

Google capitulated to the existing market reality. Perhaps we are seeing the limits to even their ambitions?

Live Writer

It worked.

I was able to go from download to post faster than with any other tool I have used. It's actually surreal how easy it was to get the tool up and running.

One place where other tools fell apart was pictures…

Let’s add a picture

[photo: Ellery Lake]

They win.

MS Live Writer is now the current champ of web blogging tools!