Software Development Overview. The Sandwich Shop (II of III)

In our previous post we talked briefly about failure in software development projects and about our thinking, and how we likened it all to a sandwich shop (to best describe network traffic).  This is the second of our three articles.

This series of stories is about how we arrived at our current thinking (here in our team) on how we now do the db work.  The series so far is outlined below, with a link back to our older post, then on to Part II.

I – I would like a sandwich! [Process all business logic as close as you can to the data {Project Sandwich Shop}]

II – The Office would like a sandwich! [The client asks for a specific rolled up order]

III – You sort the order at home, errr how?

——————————

Part II – The Sandwich Shop, large orders

Last month we covered how the sandwich shop grew (scaled up) to deal with large orders.  You have seen these large premises in your town: they grow big, and it's busy!  So with everyone crossing from the office (the client machine) via the roads (the Internet) to get their sandwich (the db), there is a lot of traffic, and it is now busy and slow.

In this example, in order to improve, the sandwich shop does the easy and safe thing: it scales up, with all sorts of hardware (NAS, web farms, clusters etc.).  Older SQL servers, when busy, would just dump you a whole cursor's worth of data to speed things up.  We likened this to a shop pre-preparing whole meal sets that morning to take the sting out of the first rush for sandwiches.  So now the classic model is in place: we have heavy traffic and a well-oiled machine, churning out product for everyone who comes over.  The problem is, there is a lot of product being churned out.  With all this hardware in place it is easy to make lots and lots of sandwiches, so your order is ready the moment you walk in for a single cheddar sandwich!  The rest is thrown away after you get your cheddar sandwich.  OK, you see the paradigm now: the db churns through a lot of work in order to hit the spot for the small, very specific orders that make it through the front door.
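To make the "whole tray versus one cheddar sandwich" point concrete, here is a minimal sketch in Python using the built-in sqlite3 module (the `sandwiches` table and its contents are made up for illustration).  It contrasts dragging every row to the client and filtering there, with asking the database for exactly the row you want:

```python
import sqlite3

# In-memory demo database with a made-up "sandwiches" table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sandwiches (id INTEGER PRIMARY KEY, filling TEXT)")
conn.executemany(
    "INSERT INTO sandwiches (filling) VALUES (?)",
    [("cheddar",), ("ham",), ("tuna",), ("egg",), ("brie",)],
)

# The "whole tray" approach: ship every row to the client,
# then pick out the one sandwich you actually wanted.
all_rows = conn.execute("SELECT * FROM sandwiches").fetchall()
cheddar_from_tray = [r for r in all_rows if r[1] == "cheddar"]

# The targeted approach: let the database do the filtering,
# so only the matching row crosses the "road".
one_row = conn.execute(
    "SELECT * FROM sandwiches WHERE filling = ?", ("cheddar",)
).fetchall()

print(len(all_rows), len(one_row))  # 5 rows shipped vs 1
```

Both end up with the same cheddar sandwich; the difference is how much product crossed the road to get it there.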

So where to now?  How can we improve this?  Is more scaling the answer?  Is virtualisation the way forward?  Let's look at what you could do in this instance if you were running the number one premium sandwich store in the whole city!

  1. Add hardware?  Open another store right next to this one?  Good idea, but assume you have already done it.  A third would bring negligible returns, as the traffic is already clogged up.
  2. Virtualise?  Buy out the floor upstairs and run in parallel, producing more sandwiches?  Again, this is not going to help much.  It will drive down costs and optimise hardware, or in this case real estate costs (see that, sneaked that word in for you USA peeps; here we call it property :P   )
  3. Optimise: bring your running costs down, take more covers, handle smarter orders, be ready for the rush.  This is a must and an ongoing job: make your SQL cost less to process.  Run smarter filters on the data, pre-cache the most frequently used data, etc.  This never stops, but imagine you have done a lot of this already…
  4. Send out bigger orders: more food to the businesses, delivered to their door.  Not bad, but in this analogy the wasted food doesn't cost anything, whereas in modern times, where traffic bandwidth is at a premium, dumping loads of data in response to a client's simple query sounds inexpensive but it slows everything down.
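Point 3, pre-caching the most frequently requested data, can be sketched very simply.  This is a hypothetical example (the function and table names are ours, not from any real system), using Python's standard `functools.lru_cache` in front of a sqlite3 lookup so that repeat orders never reach the "kitchen":

```python
import sqlite3
from functools import lru_cache

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sandwiches (filling TEXT PRIMARY KEY, price INTEGER)")
conn.executemany("INSERT INTO sandwiches VALUES (?, ?)",
                 [("cheddar", 3), ("ham", 4)])

db_hits = {"count": 0}  # track how often we really query the database

@lru_cache(maxsize=128)
def get_price(filling):
    # Only a cache miss reaches the database; repeat orders for the
    # same filling are served from memory, so the kitchen does the
    # work once per filling, not once per customer.
    db_hits["count"] += 1
    row = conn.execute(
        "SELECT price FROM sandwiches WHERE filling = ?", (filling,)
    ).fetchone()
    return row[0] if row else None

print(get_price("cheddar"), get_price("cheddar"), db_hits["count"])  # 3 3 1
```

In a real system the cache sits closer to the client (or at an app tier) and needs an invalidation story when prices change, but the shape of the saving is the same: the second rush costs you nothing at the counter.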

The issue we are not addressing is the bandwidth you have; in this case you can't physically invest in the road infrastructure, just as the cabling for your traffic is already in place.

Nope, the answer, *we think*, is to be very clever.  Well, not really, but not many people do this, so the exclusivity of minimal practitioners makes it smart by default. :P

Next instalment: what we are doing to improve future db work…
