Wednesday, August 25, 2010

How Hardware Storage Increases Have Changed Software

Three weeks ago I bought a 750GB external drive for around $100, including tax. Three days ago I saw an even better deal for the same size. The low cost of storage is really quite astounding if you look back in time.

5 years ago... Well, my current computer, which I am about to swap out this Friday, has an 80GB drive.

15 years ago... My work-issued computer was a 486DX-32MHz with a 120MB drive. Woohoo, or as they say in the Bronx, "fuuugettaboutit."

I remember back in the late nineties building web content and worrying about what impact adding a 3KB or 4KB graphic would have. Today I am having a hard time keeping up with the fact that I can add video and graphics and build gnarly dynamic apps without sweating a major load about disk space. We can thank the invention of perpendicular recording back in 2005 for these advances in disk space.

Further, memory is cheap as well. Today we can buy four times the memory for the same price we paid three or four years ago.

And I do not want to forget the low-level infrastructure developers for their advances and optimizations in web server and RDBMS software.

Development has always been a field dominated by young bucks. But unlike us old geezers, they can't remember back to the prehistoric era when we were dealing with portable storage media that maxed out at 1.44MB. A hard drive of 40MB was the norm, and you were envied if you had 80MB. And 512KB of memory was standard (like 2GB is today).

I am not just waxing nostalgic here. What I am trying to say is that back in that era, the hardware we had forced us to make technical design decisions geared toward eking out performance in the applications we developed. And, as a footnote to this, my technological forefathers would chuckle at me and say, "Sonny, you don't know what lack of resources is. We programmed on punch cards and had to run and debug our programs when we could actually get a time slot on the mainframe, which took up an area the size of your house and had 1/1000 of the resources that you get today on that there iPod of yers."

My point is that I grew up in an era when resources were at such a premium that we had to seriously consider optimization in our application development. Even today I still develop with this mindset, although now we can often just throw hardware at a problem.

But I do think performance is a key issue an application developer needs to factor into their code. Okay, programming has changed for many of us. I remember having to design databases where a table packed everything into a row of 2KB or less. Writing apps in C, one would need to write queues and stacks and ponder which variable deserved the register keyword, declaring a pointer against a pool of memory the size of a thimble of water compared to today's 4-person hot tub.

But there are still many things today's developer can do to optimize business apps. For one, minimize disk access. I see programs all the time that invoke multiple queries or stored procedures when only one is needed. Also, there is nothing wrong with using integers as foreign keys in database design. And quitting a loop once you have what you need can be a time saver.
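To make that concrete, here is a minimal PHP sketch of both habits: one aggregate query instead of a round trip per row, and bailing out of a loop early. The database connection, table, and column names are made up purely for illustration.

<?php
// Illustrative only: the DSN, table, and column names are hypothetical.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'appuser', 'secret');

// One round trip instead of one query per customer -- let the database aggregate.
$rows = $pdo->query(
    'SELECT customer_id, SUM(total) AS lifetime_total
       FROM orders
      GROUP BY customer_id'
)->fetchAll(PDO::FETCH_ASSOC);

// Quit the loop as soon as we have what we need.
$firstBigSpender = null;
foreach ($rows as $row) {
    if ($row['lifetime_total'] > 10000) {
        $firstBigSpender = $row['customer_id'];
        break; // no point scanning the rest
    }
}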

A lot of optimization, of course, depends on matters like how many concurrent users your app will have to contend with. The intended use of the app is another prime consideration. There are many more.

I would be the first to admit that if you are writing an app that will only have a few users, saving 10ms of access time through varying your coding practices is not worth anyone's time.

I remember many years ago a peer looking at a Perl program I had written and giving me an obtuse (in my opinion) way of doing the same thing. Well, first off, my program was a command-line batch program that was going to be run maybe once a week at most by one programmer on their PC (it was a code generator; I won't bore you with the details). But I was curious. So I went back to my desk and created two scenarios: one coded my way, one the way I thought was obtuse. I ran them in loops executing 1 million times, using a hi-res timer to benchmark performance. And the dude was right on: his was faster, by about 15 milliseconds total under my test scenario.
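For anyone who wants to run the same kind of head-to-head test, here is a rough sketch in PHP (the original was Perl, and the two "ways" below are stand-ins, not the actual code from the story):

<?php
// Rough benchmarking sketch -- the two functions are placeholders for the competing styles.
function myWay($s)    { return implode(',', array_reverse(explode(',', $s))); }
function theirWay($s) { return join(',', array_reverse(preg_split('/,/', $s))); }

$input = 'a,b,c,d,e,f,g,h';
$iterations = 1000000;

$start = microtime(true);                       // hi-res timer
for ($i = 0; $i < $iterations; $i++) { myWay($input); }
$mine = microtime(true) - $start;

$start = microtime(true);
for ($i = 0; $i < $iterations; $i++) { theirWay($input); }
$theirs = microtime(true) - $start;

printf("mine: %.3fs  theirs: %.3fs  difference: %.1fms\n",
    $mine, $theirs, ($mine - $theirs) * 1000);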

For my intended use it made zero sense to change the code to a coding convention that seemed very foreign to my way of thinking. Plus, I thought, hey, with time savings like that the single end user of my app would not even have time to go out for a beer and get back. We must always be considering the needs of our customer.

On the other end of the spectrum, during the era when client-server apps were in vogue, I saw some fat-client apps written that would create and dump (errr... I mean deliver) 3-5MB of data to the client in one fell swoop, taking a couple of minutes to load. Hmmmm, I remember thinking, the customer will really dig that. And this story is from a time frame when 16MB was a standard physical memory footprint.

Today is such a cool time to be computing in. One can consider true multimedia solutions in a design and not have to worry about needing to sell the customer a Cray to host the app. The emergence of mobile devices (even with their evil proprietary languages) is way cool as well. Streaming video still impresses me.

And I know that within ten years someone will figure out how to mainstream a monitor you can see in your glasses and a virtual full-sized keyboard to type on, both tied to your handheld. I can just picture someone on the bus coming home from work (that would not BE ME!!!) madly typing away some flame email on a keyboard none of us can see. Man, I can't wait for that day :-)

Saturday, August 21, 2010

Developing a Web Development Framework - Walking the Walk

(Actually written on 7/31/2010)

Software architecture and design is, and has always been, important to me. A big part of this is implementation. It is easy to code a hack. We all do it. Time constraints, money, whatever the reason, making a hack is easy to do.

After one does enough gigs, one should come to the realization that doing the same work over and over again can be simplified or organized in a way that makes life easy, shortens the development cycle, and is easy to manage and maintain. With my alchemy web development framework this is the realization I am trying to act on.
My design goal regarding implementation really comes down to two main objectives.

1. Be able to quickly implement alchemy anywhere.
2. Easily update an existing alchemy implementation without breaking anything.

Alchemy is in alpha, but it is already becoming a monster. The application framework alone is seventy directories and over 550 files. To make matters more interesting, I host the framework on different POSIX servers, but my development environment is Windows.

I have just created a SourceForge project to manage the builds. Right now I have started to move the content and documentation into my developergeekresources.com site. I am having to implement alchemy there as well in order to offer up the various templates and toy-code examples.

Ultimately I want the alchemy framework to replace my four active websites. Replacing an existing legacy site is easier said than done. I want to make adding new content to them even easier than it currently is.

If you cannot effectively use your own stuff, why would you expect anyone else to use it? Right now what I have is still rough in certain areas. The only way to really see if you are on the right path is to use your own stuff. Too often software has become a modern factory where the people producing the software never have to use it. Most do not even have to implement it, so they can be totally insensitive to that discipline as well. A buddy of mine likes to say "fish or cut bait." I like to rephrase it as "cut bait, then go fishing."

I must admit I am so busy that my inclination and nature is to just slam something in and get back to work. Not such a good strategy if you are trying to build an entire framework. You gotta always be thinking about shit like "what will that change do to all the existing stuff?" and "how is that going to be implemented in a way that is extensible a year from now, to include all the cool new stuff I or someone else will have conjured up that I have yet to even think about or consider?"

Thursday, August 12, 2010

Open Source and Google

The other day I was building a portion of an app that dynamically generated charts based on the results of a query. It worked like a champ in my local development environment, where the server and the browsers I use both reside on my laptop.

Then I uploaded it to my test server, and the chart did not update without my having to refresh the browser. Of course, the browser was caching the image: even though the data was changing, the file name remained the same. What to do?

Get the end users to lower their expectations?
Call it a WAD and modify the requirements?
Read a manual until I went into a coma?

No

I Googled: php image not refreshing   ...

... resulting in a link on page one that took me to the solution. (And for that, they deserve a link.)
http://forums.powweb.com/showthread.php?t=60177
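For anyone hitting the same wall, the general class of fix is cache-busting: have the image script send no-cache headers, tack a changing query string onto the image URL, or both. A minimal sketch of what I mean (not necessarily the exact code from that thread; the file name chart.php is just for illustration):

<?php
// chart.php -- illustrative sketch; the actual chart drawing goes where noted.
// Tell the browser not to reuse a stale cached copy of the image.
header('Content-Type: image/png');
header('Cache-Control: no-cache, no-store, must-revalidate');
header('Expires: 0');
// ... build and output the chart PNG here ...

And/or, in the page that embeds the chart, make each request look new to the browser:

<img src="chart.php?nocache=<?php echo time(); ?>" alt="sales chart" />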


Generally, I can find an answer to my questions through Google in only five or ten minutes. One of the most profound and unheralded innovations in the discipline of computer science is the marriage of search engines and open source software.

Sure beats the days of being dead in the water for days or weeks, getting bounced around and listening to muzak on support lines, and then having to come up with convoluted workarounds because you could not find a solution.