Doing more with less – how would Alexander the Great have tackled today’s big data challenge?

This is a guest blogpost by Partha Sen, CEO and co-founder of Fuzzy Logix.

Ever heard of the legend of Alexander the Great and the Gordian Knot? According to the myth, Alexander was confronted on his travels with an ox cart tied with a knot so fiendishly difficult that no man could untangle it. Alexander’s solution was a simple one. He took his sword and cut the knot and freed the cart. Problem solved.

Fast forward to today and we have our very own Gordian knot type challenge: regardless of your business analytics environment, you are likely under increasing pressure to deliver more for less. Your business is demanding faster insights, more efficiency and more simplicity, while at the same time asking you to spend less and deliver a faster return.

So, is there an answer to such a seemingly intractable problem? Is it just a question of trying ever harder to keep pace with the demands of doing more with less, an approach likely to prove just as ineffectual as the efforts of those who tried to untie the Gordian knot by hand? Or is there a different way to look at the challenge and, rather like Alexander with his sword, remove it in one fell swoop? To answer, let’s take a step back for a second and explore what our own ‘Gordian knot’ looks like…

If your analytics is based on SAS, you will probably have to significantly increase your investment each year to perform the same analytics on double the volume of data. It’s expensive and time-consuming to manage all that duplicate storage, datacentre space and network infrastructure for moving data, as well as the need for ever more processing power. Even with all of that, the performance and the time-to-insights will not live up to your business requirements.

Traditional approaches to analytics such as SAS force you to move data from where it is stored to separate analytics servers, because the data lives in one place and the models run in another, and then feed the results back into the database; a minimal sketch of this flow follows the list below. This results in huge pain points, including:

  • Expensive hardware
  • Slow insight delivery, as two-thirds of the time is often spent moving the data
  • Sub-par models – the memory constraints of the analytics servers mean models must be built on only the sample that fits into memory, not the entire dataset
  • Outdated analysis – in several industry verticals, the underlying database can change rapidly while the snapshot moved into memory on the analytics servers stays frozen.
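To make that data-movement pattern concrete, here is a deliberately simplified Python sketch of the extract-then-model flow. The table and column names are made up for illustration, and SQLite stands in for whatever warehouse driver and analytics stack you actually use; treat it as a rough picture of the pattern, not a recipe.

    # Simplified, illustrative sketch of the traditional "move the data to the analytics" pattern.
    # Table and column names are hypothetical; sqlite3 stands in for your warehouse's driver.
    import sqlite3
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    conn = sqlite3.connect("warehouse.db")

    # Step 1: pull an extract across the network. This is often the slowest step, and the
    # extract has to fit into the analytics server's memory, so in practice it is a sample.
    df = pd.read_sql("SELECT customer_id, age, tenure, churned FROM claims_features", conn)

    # Step 2: fit the model locally, on the extract rather than on the full data.
    model = LogisticRegression().fit(df[["age", "tenure"]], df["churned"])

    # Step 3: score locally and ship the results back into the database.
    df["score"] = model.predict_proba(df[["age", "tenure"]])[:, 1]
    df[["customer_id", "score"]].to_sql("churn_scores", conn, if_exists="replace", index=False)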

And there’s another problem: sampling and stale snapshots erode data quality, and the better the data quality, the more confidence users will have in the outputs they produce, lowering the risk in the outcomes and increasing efficiency. The old ‘garbage in, garbage out’ adage is true, as is its inverse: when outputs are reliable, guesswork and risk in decision making can be mitigated.

Many organizations, faced with this challenge, just try to run ever faster in the hope of meeting the needs of the business. It is a fruitless approach though, doomed to eventual failure. Instead, like Alexander and his sword, they need to appreciate that the objective of achieving scalability at speed for their analytics at a lower cost requires a very different approach.

Let me explain. At Fuzzy Logix, we turned the problem on its head with our in-database analytics approach. We move the analytics to the data, as opposed to moving data to the analytics, and eliminate the need for separate analytics servers. We leverage the full parallelism of today’s massively parallel processing (MPP) databases. With DB Lytix, data scientists are able to build models using very large amounts of data and many variables. No more sampling, no more waiting for data to move from some other place to your analytics server. These models run 10X to 100X faster than traditional analytics, bringing you expedited insights at a fraction of the cost. And because DB Lytix is a simple, software-only install alongside your existing SAS analytics, you can reap the rewards of faster analytics without making major infrastructure changes.
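To see what ‘moving the analytics to the data’ looks like in practice, here is an equally simplified sketch of the in-database pattern. The scoring function below is a made-up placeholder, not the actual DB Lytix API, and SQLite again stands in for an MPP warehouse; the point is simply that the work is expressed in SQL and runs where the data lives, so only small results ever cross the network.

    # Simplified, illustrative sketch of the in-database pattern: the computation is pushed
    # into the database as SQL, and the full dataset never travels to an analytics server.
    # sqlite3 is a stand-in for an MPP warehouse, and fl_churn_score is a hypothetical
    # placeholder for an in-database analytics function (not the actual DB Lytix API).
    import sqlite3

    conn = sqlite3.connect("warehouse.db")

    # Register a trivial stand-in so the sketch runs against SQLite; in a real deployment
    # this function would be supplied by the in-database analytics library itself.
    conn.create_function("fl_churn_score", 2, lambda age, tenure: 0.0)

    # The heavy lifting happens inside the database, next to the data.
    conn.execute("""
        CREATE TABLE churn_scores AS
        SELECT customer_id,
               fl_churn_score(age, tenure) AS score
        FROM claims_features
    """)
    conn.commit()

    # Only small, aggregated results leave the database.
    print(conn.execute("SELECT COUNT(*) FROM churn_scores").fetchone())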

Ok, enough of the marketing speak, how about the reality? Here’s a real-world example: at a large US health insurer, moving the data out of the database to the SAS servers meant breaking the work into 25 jobs and assembling the results, a process that took over six weeks! Using in-database analytics allowed the customer to work on the entire dataset at once and finish the analytics in less than 10 minutes.

So, how are you going to solve the intractable challenge of doing more with less? Are you going to continue doing what you have always done and hope for success? Or are you going to change tack and adopt the type of approach I’ve outlined above? To use a cliché, I think it’s a ‘no brainer’. But don’t take my word for it, because if there’s one thing that history and Alexander the Great have taught us, it’s that seemingly impossible tasks require bold and different solutions.
