September 2012 Archives

What big data did next

bridgwatera | No Comments

Ask a database vendor what they want to talk about next and odds on they'll say "big data and analytics" pretty quickly. Ask a middleware or infrastructure software vendor the same question and you'll probably get the same answer.

Actually, if you ask a cloud hosting provider, a storage specialist, a Business Intelligence vendor or a data warehousing company what matters most right now, they will all probably tell you ...

... big data and analytics is set to be a game changer.

But should we grace these industry-wide machinations with our credence?

Technology analyst firm IDC forecasts that the world's data will grow 50 times by 2020 and, if we are honest with ourselves, that may be an underestimation.

Oracle's OpenWorld and JavaOne kick off today in San Francisco. After Larry Ellison's keynote (the other VPs, aka "veeps", get to tag-team keynotes before and after him), we get to the first general sessions. Guess what the first one is called?

11:45 a.m., Database General Session, Moscone Hall D: Big Data: What's Next for Oracle Database? Andrew Mendelsohn, Senior Vice President, Database Server Technologies, Oracle.

It's conference season, it's big data analysis season. The two cannot be separated.

Last week we had TIBCO's TUCON symposium and here again we heard discussion centred around companies' needs to "transform their businesses using the intelligence buried in big data" as they address the "storage and access challenges around the volume, variety and velocity of big data" on a day-to-day basis.

You could almost write a standard script for this sector of the IT industry right now. We're running out of words to say the same thing.

TIBCO for its part did have a solid product announcement in this space in the shape of its Spotfire 5.0 data discovery and analytics platform. The new release is said to include a "completely re-architected in-memory engine" to help "visualise and interact" with massive amounts of data.

Spotfire 5 executes complex calculations in-database with Teradata, Oracle and Microsoft SQL Server; it also has the ability to visually explore data residing in Microsoft SQL Server Analysis Services cubes.

"Spotfire 5 is capable of handling in-memory data volumes orders of magnitude greater than the previous version of the Spotfire analytics platform, giving business users the ability to not only discover trends, patterns and outliers incredibly quickly, but also the freedom to explore much larger data sets," said Lars Bauerle, vice president of product strategy at TIBCO Spotfire.

"Spotfire 5 also extends beyond in-memory analytics with a new visual and interactive interface to directly query databases and cubes, harnessing the power of these external analytic engines to run calculations in-database, where the data reside and then visualise the results. These capabilities are exactly what enterprises are looking for to be able to discover the value hidden in their data," he added.

Who will benefit from big data analysis?

These analytics tools are thought to be of most use in high-tech manufacturing, consumer packaged goods, retail, life sciences and financial services. The concept here is simple: better information allows organisations (and doctors, when it comes to life sciences) to make more informed decisions.

This is what big data did (and is doing) next, and the real-world human benefits to medicine may yet prove to be the most remarkable aspect.

The facts of (software development) life

bridgwatera | No Comments

The software application development industry is governed by facts. That's just how it is; guesswork does not lend itself well to software engineering and this carved-in-stone truism will not change over time.

Fact #1: software testing firms like to use project failure scaremongering tactics to sell the benefits of their products.

Fact #2: a lot of software projects "fail", or at least fail to deliver the full functionality promised in the required and allotted time at the cost provisioned.

Fact #3: software process improvement and systems development company Quantitative Software Management (QSM) estimates that 70% of software developers have worked on a project that they knew from the very beginning would fail.

Fact #4: this kind of suggestion leads perfectly into setting the scene to discuss a new cloud-based software project estimation tool designed to prevent project failure.

Fact #5: QSM's new SLIM-WebServices project estimation intelligence product now works on-premise and also in a Software-as-a-Service delivery format to accommodate cloud deployments.


"Project intelligence is infinitely more valuable when shared," explains Larry Putnam, Jr., co-CEO for QSM. "With SLIM-WebServices, we're empowering more people in more positions at more places in an enterprise - improving visibility, transparency and informed decision making. Ultimately, this is a company's best defence against cost overruns, schedule slippages, and failed implementations."

Fact #6: QSM's SLIM suite draws upon the firm's proprietary database of over 10,000 projects, and seven of the top 10 systems integrators in the world rely on SLIM intelligence.

Fact #7: advanced algorithmic analysis for project estimation, tracking and benchmarking is a good way of improving any project's chance of success, whether you decide to sell that concept with colourful developer failure claims or not.
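
QSM's SLIM tooling traces back to Larry Putnam's software equation, which relates delivered size, effort and schedule. A minimal sketch follows (the size and productivity parameter values are purely illustrative); it shows the model's best-known consequence, that compressing the schedule inflates effort dramatically.

```python
def putnam_effort(size, productivity, schedule_years):
    """Effort (person-years) implied by the Putnam software equation:
    size = productivity * effort**(1/3) * schedule_years**(4/3)."""
    return (size / (productivity * schedule_years ** (4 / 3))) ** 3

# Effort scales with schedule**-4, so shaving six months off a
# two-year plan roughly triples the person-years required.
relaxed = putnam_effort(100_000, productivity=20_000, schedule_years=2.0)
crunched = putnam_effort(100_000, productivity=20_000, schedule_years=1.5)
```

This non-linear trade-off is precisely the kind of "knew it would fail from day one" arithmetic that estimation tools are sold on.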

TIBCO: how do we build the 'social enterprise'?

bridgwatera | No Comments

The TUCON TIBCO User Conference is staged this week in Las Vegas at the extremely new and shiny Aria resort. In keeping with the technology streams being championed and evangelised by many large-scale IT vendors right now, TIBCO is working to make the social enterprise an accepted business trend.

What is the social enterprise?

Put simply, the social enterprise is Facebook in the workplace.

But there are some crucial differences to take on board here. Ram Menon, TIBCO's president of social computing, explains that the social enterprise necessitates a new connection to business systems so that they can be brought into the "social enterprise conversation" in any firm.

Menon says that the social enterprise has been growing fast and that his firm's interconnectivity technologies are now helping to facilitate a second stage of growth in this space.

Two stages of growth for the social enterprise

For TIBCO's Menon (clearly a cricket fan) there was a first "innings" where the social enterprise was really just trying to copy Facebook. Now though, a second innings sees us actually trying to build systems that will benefit real worker productivity.

Facebook is really just made up of people and perhaps products if we also include the brands (such as Starbucks and/or various music bands or shops) that have built up a presence and a "fan base" across the social network.

The social enterprise's mechanics on the other hand go further than simply connecting people, Menon says there are at least five major elements (or objects) to now connect:

1. people
2. machines
3. business processes
4. locations
5. other enterprise infrastructure elements

In practical terms this means that a food production plant could complete a job to bake some pink fairy cakes and post data to the social enterprise space "reporting" this fact. Employees could be discussing pink fairy cake production challenges in terms of ingredient pricing coming in from partners. There might be some level of "gamification" here, where workers are challenged to interact with and show their influence upon the social collaboration going on around the work tasks being executed.
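
Sketched in code (a toy model, nothing like tibbr's actual API), the idea is that all five object types post into one stream that can then be filtered by subject:

```python
# A toy activity stream; the names and structure are illustrative.
OBJECT_TYPES = {"person", "machine", "process", "location", "infrastructure"}
stream = []

def post(actor_type, actor, message, subject):
    """Any of the five connected object types can post an event."""
    assert actor_type in OBJECT_TYPES
    stream.append({"actor_type": actor_type, "actor": actor,
                   "message": message, "subject": subject})

post("machine", "oven-7", "batch of 500 pink fairy cakes complete", "#fairy-cakes")
post("person", "alice", "partner ingredient prices up 4%", "#fairy-cakes")
post("process", "qa-check", "batch passed inspection", "#fairy-cakes")

# Subject-based filtering, the approach the article ascribes to tibbr.
fairy_cake_feed = [e for e in stream if e["subject"] == "#fairy-cakes"]
```

The point of the sketch is that machines and processes become first-class participants in the conversation, not just people.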

TIBCO works in this space with its tibbr product. tibbr uses what has been called a "subject-based approach" to filtering and organising information channelled through the social enterprise. The software is built with enterprise-grade security, governance and compliance so that it can be installed behind the company firewall.

It also offers microblogging: short messages, updates, documents and links shared with co-workers live on your tibbr wall -- and yes, TIBCO calls it a "wall" too.

So will social enterprise networks make a difference?

As previously reported on TechTarget by James Furbush, "Enterprise social networks can save organisations billions of dollars per year in productivity time spent managing email, communicating internally and searching for information, according to a recent report by McKinsey & Company."

TIBCO's Menon recently quoted a September 2011 report from Forrester Research which reported that, "Despite significant and ongoing investment in enterprise social technologies, their roughly seven-year lifespan within enterprises has yielded a maximum of 12 percent adoption within the overall workforce."

So we're not quite there yet in terms of deployment. But the technology is here and it works; perhaps what we need now are more use case analyses showing how well the social enterprise has really impacted firms' bottom lines.

TIBCO hoops big data on Californian basketball courts

bridgwatera | No Comments

The TUCON TIBCO User Conference is staged this week in Las Vegas. As is customary at these customer-centric, developer, user and partner-friendly events, a nice case study presentation always goes down well during the opening keynote sessions.

As the self-styled "enterprise infrastructure" company, TIBCO's customers appear to have (arguably) made big data analysis work sexier than most.

NOTE: TIBCO stands for The Information Bus COmpany

The Golden State Warriors are an American professional basketball team based in Oakland, California, playing in the National Basketball Association (NBA).

President and chief of operations for the Golden State Warriors Rick Welts described his team and "company's" work to use big data to analyse player behaviour on court.

Welts said that his coaches wanted to know if a player scores most often after one, two or three bounces of the ball -- or at what minute of the game, or under what kind of defence and attack positions on the court -- and that this task required the interpretation of a massive amount of big data once the players' moves were all digitised and fed into an analysis engine.
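
Once the moves are digitised, the core of such an analysis can be plain aggregation. A toy sketch with invented shot data (real systems crunch far more dimensions, per minute, per defence, per court position):

```python
from collections import defaultdict

# Hypothetical digitised shot events: (player, dribbles_before_shot, made).
shots = [
    ("curry", 0, True), ("curry", 0, True), ("curry", 2, False),
    ("curry", 1, True), ("curry", 2, True), ("curry", 3, False),
    ("thompson", 0, True), ("thompson", 1, False), ("thompson", 0, True),
]

def shooting_pct_by_dribbles(events):
    """Shooting percentage grouped by dribbles taken before the shot."""
    made = defaultdict(int)
    taken = defaultdict(int)
    for _, dribbles, hit in events:
        taken[dribbles] += 1
        made[dribbles] += hit  # True counts as 1
    return {d: made[d] / taken[d] for d in taken}

pct = shooting_pct_by_dribbles(shots)
```

With the invented sample above, catch-and-shoot (zero dribbles) comes out best, which is exactly the kind of pattern a coach would act on.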

TIBCO founder and CEO Vivek Ranadivé (who now happens to own the basketball team) also plays a part in this story as he was profiled in Forbes magazine explaining how he couldn't understand why, on defence, most teams retreated to defend their home basket -- instead of pressing and challenging passes.

For non-basketball fans, the alternative is a tough pressing defence, keeping the ball as close as possible to the hoop where your team scores points.

"Applying data visualisation tools, intense analytics to study team performance and patterns of other players is just part of the ways Spotfire can make data into a competitive advantage. Seeing trends in shot selection, location of plays on the court or patterns of hits, misses, rebounds and other statistics can help the coaches select the right plays with the best chance of success," writes David Wallace of the Spotfire Blogging Team.

NOTE: TIBCO Spotfire is a data discovery and analytics tool for what is described as "extreme" data volumes and "complex" analytical challenges. Spotfire 5.0 has this week been integrated with partner Teradata's Enterprise Data Warehouse technology.

The big data story for the Golden State Warriors goes further. The team's use of big data analysis will now extend to tracking fans as they order their tickets, as we move to a near future where paper ticketing is gone.


Salesforce Heroku drives Java in a single click

bridgwatera | No Comments

Salesforce.com stages its Dreamforce 2012 conference and exhibition this week in the shadow of downtown San Francisco's hip vibe and bracing ocean-facing climate.

The company has drawn what is claimed to be 70,000 attendees and over 350 cloud-centric companies together under one roof for this event. On a stage soon to be occupied by arch-rival Larry Ellison for Oracle OpenWorld in two weeks' time, Salesforce CEO Marc Benioff is this week preaching his social enterprise gospel to a messianic army of employees and paying attendees.

Benioff asserts that his firm was born in the virtualisation space and his famous 'No Software' tagline still buys him media miles. "We achieved our market position by being born on cloud, but we are being reborn 'social'," he states in a book due for publication soon.

This transition to social virtual enterprise spaces is where Salesforce envisions all companies should be. The firm's collaborative Chatter tool works much like Facebook or email, but is designed to be open and collaborative rather than static. It works through "open groups", which employees can create and use as discussion forums so that the data within them can be analysed at a wider level across the business.

Salesforce for developers

The company appears to have a healthy programmer-facing approach and it runs an extremely busy developer channel at the Dreamforce event.

Key among the technologies receiving an update here this week is Heroku, a cloud Platform-as-a-Service (PaaS) that resides under the Salesforce.com main brand.

Heroku Enterprise for Java is a newly launched service designed to build and run Java applications in the cloud. The firm's big claim with this launch is that Heroku Enterprise for Java allows programmers to create mission-critical Java applications "in minutes instead of months", as well as move their apps to a continuous delivery model.

"Java is the most widely adopted language in the enterprise, with millions of Java developers building and maintaining Java applications worldwide.  Traditionally, creating these applications has required piecing together both a range of development and runtime infrastructure tools--such as source code control systems, continuous integration servers, testing and staging environments, load balancers, application server clusters, databases and in-memory caching systems," said the company, in a press statement.

According to Salesforce.com, "This painstaking process typically extends application building and deployment by months, taking developer attention away from their core focus of app development. With Heroku Enterprise for Java, for the first time, enterprise developers can get a complete Java solution in a single package, provisioned with a single click."

Heroku COO Oren Teich suggests that enterprise developers have been looking for a better way to easily create innovative applications without the hassle of building out a back-end infrastructure. He now says that with Heroku Enterprise for Java, developers get all the benefits of developing in Java along with the ease of using an open, cloud platform "in a single click" at any time.

Cloud's next battleground: upload/download

bridgwatera | No Comments

It's conference season in the technology industry. Or, if you prefer, it's symposium, convention, congregation and convocation or confabulation season.

It's true, I do end up looking for synonyms for the word "conference" when I am dashing between various different developer events and you have my permission to borrow any of those and use them at your leisure.

This season marks the first time I will be trying to leave my laptop in my room and work from an iPad with a full Bluetooth Apple keyboard.

So I am prepped up with my free five gigabytes of storage in Apple iCloud, Microsoft SkyDrive and Ubuntu One. I'm hoping to be able to use as much virtual space and processing power as possible, basically.

With this reliance on cloud storage, the question of upload and download speed starts to come to mind more urgently than it has done recently.

As such, I have my eye on companies like Box, a firm currently pushing to make upload and download speeds way faster by establishing new network endpoints around the globe and opening up its network intelligence software to developers.


Box is about to launch Box Accelerator, an enterprise-grade global data transfer network that claims to be able to produce a 10x boost in upload speeds.

How fast is 10x fast?

Neutral third-party testing provider Neustar found that Box had the lowest average upload time across all locations tested, and was 2.7 times faster than the closest competitor globally.

"The goal of Box Accelerator is to enable the fastest possible uploading of content to the cloud for enterprises. Box Accelerator takes advantages of a network of infrastructure that we manage ourselves and utilises cloud services like Amazon EC2," said Aaron Levie, the self-styled lead magician and CEO of Box.

... and for developers?

"Next, we plan to extend this technology to our API, allowing third-party developers to build applications on Box that deal with large files and content. Without having to worry about infrastructure and building a global footprint, developers will be able to get the benefits of Box Accelerator in their application within minutes. We think this will be extremely powerful for partners developing services in healthcare, media, manufacturing, science and research fields, where one of the primary challenges is moving around large amounts of data throughout the world," said Levie.

I won't be using Box Accelerator just yet of course, but hopefully the cloud services to which I have subscribed (which, let's face it, are all free) will keep me uploading and downloading quickly enough to suit my needs. This, though, is a key consideration for enterprises now moving forward with virtualisation, particularly in mobile.

Intel: 15 billion online toasters by 2015

bridgwatera | No Comments

Despite a lowering of its financial forecast for the rest of 2012, Intel continues its mission to be seen as 'more than just a chipmaker' this month with events staged around the firm's IDF Intel Developer Forum in San Francisco.

New initiative launches include the firm's Intelligent Systems Framework, a set of connectivity and interoperability technologies designed to help us on the path towards the Internet of Things.

NOTE: The Internet of Things is a term commonly used to describe the growth of online connected intelligent devices in the form of everything from smartphones to kiosks and onward to televisions, cars, sensors/cameras and yes, even microwave ovens, fridges and toasters.

Intel predicts that over 15 billion devices will be connected to the Internet by 2015 and one third of these connected devices will be intelligent systems.

Scalable just got urgent

Actually, analyst firm IDC predicts the above and Intel agrees with it, but whatever... we're getting massively more connected, so we need to provision for "scalable computing platforms" as a necessity.

Intel warns that today, the process for developing connected devices involves the use of proprietary components from a variety of manufacturers. "Often missing are the security and manageability features needed to protect and manage the network of devices that connects to each other and the cloud, generating massive amounts of data," said the company, in a press statement.


The Intel Intelligent Systems Framework attempts to establish a set of recipes to reduce the development time for hardware and software integration for intelligent systems. The framework also seeks to address fragmentation in today's market by creating a standardised and open platform for the ecosystem that is actively building solutions.

Next steps will see us "unlocking data" from legacy environments where it is not necessarily analysed at the moment.

What will this give us?

If we achieve this connectivity of devices in the Internet of Things properly then manufacturing systems will become more self-aware and empowered by data relating to every machine's performance - and yes, your toaster will tell you when it needs replacing.

Visual Studio 2012 heralds start of Windows 8 development

bridgwatera | No Comments

Microsoft's core software application development IDE (integrated development environment) Visual Studio 2012 was officially released along with the .NET Framework 4.5 this week.

This final release will be heralded by some as the official start of software development for the Windows 8 platform.

Microsoft developers, third-party MSDN (Microsoft Developer Network) programmers and others will of course have already been using beta release tools, getting much of their first-stage software well advanced.

But this official release is marked out in many senses by virtue of being a more significant "re-engineering" for the new Windows 8 platform than previous Visual Studio releases, which (in comparison) were largely extensions, augmentations and refinements of the existing IDE.

Put simply, Windows 8 looks different, so Visual Studio 2012 looks different.


A full technical review is available on programmer website Dr. Dobb's. The site confirms that Visual Studio comes in multiple versions: Ultimate, Premium, Professional, Test Professional and Express.

According to the above news and review, "The last of these is a free (as in beer) edition that contains a bare set of tools and a stripped-down IDE. Microsoft says that it plans to run a continuous cycle of product updates on a year-round basis. The already planned Visual Studio 2012 Update 1 will offer new support of Agile teams, continuous quality enablement, SharePoint, and Windows development. Clearly then, the momentum of the general product release has already carried forward into this additional development."

Microsoft released an official statement:

"Today more than ever before, developers have the opportunity to create modern client apps connected to cloud services that make information more easily accessible to users, on any device, at any time," said S. Somasegar, Corporate Vice President of the Developer Division at Microsoft. "The release of Visual Studio 2012 and .NET Framework 4.5 delivers our most comprehensive and streamlined set of tools yet, providing the centerpiece for an integrated development experience targeting the latest and greatest platforms from Microsoft."

What makes a modern app?

Microsoft's notes on what it feels constitutes the core characteristics of a modern app are interesting. The firm rests its view around three key criteria which it says describe the way software now takes its form.

· Applications are user-centric: They provide a customer-centric experience and are accessible from any device.

· Applications are social: They are integrated with a user's identity and connected to peers, colleagues and friends.

· Applications are data-centric: They integrate data with a customer's tasks at hand, enabling decision-making within context.

Where NASA and Instagram get open source databases

bridgwatera | No Comments

The PostgreSQL Global Development Group has announced the PostgreSQL 9.2 open source database with native JSON support, covering indexes, replication and performance improvements.

NOTE: JSON (JavaScript Object Notation) is a lightweight, text-based data-interchange format that is widely agreed to be "easy for humans to read and write" and equally easy for machines to "parse and generate". It is based on a subset of the JavaScript programming language and is said to use conventions familiar to programmers of the C family of languages.
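
A quick illustration of why JSON travels well between humans and machines, using Python's standard library (the stock record itself is invented):

```python
import json

# A document of the kind PostgreSQL 9.2's native json support can store.
text = '{"symbol": "TIBX", "price": 29.47, "tags": ["big data", "analytics"]}'

record = json.loads(text)        # easy for machines to parse...
round_trip = json.dumps(record)  # ...and to generate
```

The same text is readable at a glance by a developer and round-trips losslessly through a program, which is the whole appeal of the format.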

But what does "performance improvement" really mean with this kind of technology?

Vendors of all kinds love to use the term "performance" time and time again, so what makes an open source next-generation database operate and, well, perform, so well in this case?

How it works...
The answer appears to lie in PostgreSQL 9.2's ability to scale linearly across 64 processor cores, sharing out the processing burden. This separation of workloads, plus its 'index-only' scans and reductions in CPU power consumption, is what actually speeds up this product.

NASA, Instagram and HP can't be wrong? Can they?

Organisations including the U.S. Federal Aviation Administration, Instagram and NASA run applications on PostgreSQL, and HP has adopted it too, to power its HP-UX/Itanium solutions.

Improvements in vertical scalability are also said to increase PostgreSQL's ability to efficiently utilise hardware resources on larger servers.

So just how fast is fast here?

Numerically, this means:

* Up to 350,000 read queries per second
* Index-only scans for data warehousing queries
* Up to 14,000 data writes per second
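
An index-only scan answers a query entirely from the index, without touching the table itself. PostgreSQL 9.2 is the news here, but the same idea can be demonstrated with SQLite's covering indexes, which ship in Python's standard library:

```python
import sqlite3

# When an index contains every column a query needs, the planner can
# skip the table entirely -- the idea behind 9.2's index-only scans.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
db.execute("CREATE INDEX idx_sensor_value ON readings (sensor, value)")
db.executemany("INSERT INTO readings VALUES (?, ?)",
               [("a", 1.0), ("a", 2.0), ("b", 3.0)])

# Both sensor and value live in the index, so the plan reads only it.
plan = db.execute(
    "EXPLAIN QUERY PLAN SELECT value FROM readings WHERE sensor = 'a'"
).fetchall()
plan_text = " ".join(str(row[-1]) for row in plan)
```

Skipping the table heap is what turns data-warehouse-style queries from disk-bound into index-bound work.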

NOTE: PostgreSQL is an open source object-relational database system. It has more than 15 years of active development and runs on all major operating systems, including Linux, UNIX (AIX, BSD, HP-UX, SGI IRIX, Mac OS X, Solaris, Tru64) and Windows.

How to unlock and drive a car with a 3G iPad

bridgwatera | No Comments

Having been on something of a personal test drive with the latest Vodafone 3G connection offering for iPad recently, I have been trying to adopt the "mobile connected life culture" with great gusto.

Being able to instantly add apps to my device (whether I am in a WiFi zone or not) has obvious advantages.

During a visit to London by my American in-laws last week, I suddenly found myself with a need to guide four people around town for a couple of days and be able to act as a perfect tour guide.

My 3G connection via Vodafone performed well and I was able to use a variety of websites to answer the typical American tourist's questions, such as, "How long has the Palace of Westminster been here and what exactly did Oliver Cromwell do to warrant getting his statue erected outside?"

Four people in a SMART car?

But then I came unstuck. With four of us to ferry about, my SMART car started to look like something of a problem. Luckily I decided to Tweet while on the hoof and got a recommendation for the Zipcar service.

Zipcar's 11,000 self-service vehicles are available in an "on-demand" supply arrangement and parked around strategically placed reserved parking spots in neighborhoods in Bristol, London, Cambridge, Maidstone and Oxford as well as cities in the USA, Canada, Spain and Austria.

The company is reported to target densely populated city markets with a well-developed "middle class" layer and also university campuses and their surrounding communities.

According to Zipcar, "Car sharing seems like a simple enough idea, but there's a reason that Zipcar has become the leader for cars on demand--we took a simple concept to new heights. It's not just about less cars, less congestion and less pollution (though we're not complaining), it's about understanding why those things are a problem, and finding sustainable solutions."

So how well does it work?

I got registered with a quick call to a London-based call centre staffed, it appeared, by native English speakers; if that observation is politically incorrect then I apologise.

I was then automatically routed into a conference call with the DVLA to confirm my details, all within less than 10 minutes. They located a car for me and booked me in as I signed in to a downloaded iPad app (no BlackBerry support, unfortunately) and wandered off to find my vehicle.

As a central London resident one often expects convenience; my car was in fact parked three streets away and there were others almost as close.


When I got to the "car club only" parking bay, the car had in fact been left in the wrong place by the previous driver -- so a quick call to the call centre and they found the car via GPS and directed me to it, whereupon I made the car's horn honk to confirm its location and then unlocked the doors with my iPad (the keys are kept in a special chamber in the glove compartment).

This is the only flaw I can find in Zipcar's service: the firm depends on drivers calling in to say if they have had to park the car outside of its pick-up location.

How much better would it be if the Zipcar system had just that extra piece of software integration built into it somewhere, to ensure that the location data being calculated could be used to the full? The official line, though, appears to be "we rely on our members", and the firm has been growing since its inception in 2000, so perhaps I was slightly unlucky.

Is this on demand cloud style car delivery?

In many ways it indeed is. This could teach us even more about not only how we now programme our software applications but also how we consume all goods and services.

Intel robotic advertising signs "read" human behaviour

bridgwatera | No Comments

Electronic advertising and information sign screens are an increasingly prevalent feature of airports, train stations and many other public places.

As these devices (known as digital signage) now start to proliferate, a question arises.

QUESTION: If electronic signs themselves were capable of analysing the humans passing in front of them to tailor the information they present... would this be a helpful development that would make our lives easier, or would this appear to be a 'Big Brother' style invasion of our personal space as yet another camera puts our everyday behaviour under surveillance?

A quick straw poll on Twitter came up with a mixed bag of "it depends" when the above question was posed -- and one or two users even said that they thought this development was "inevitable" at some point.

Inevitable indeed: Intel has already produced this technology, and the company has this week released its Digital Signage Evaluation Kit-12 (DSEK-12), designed to help streamline and encourage the commercial use of digital signage around us.

This technology is obviously targeted at software developers working in retail and travel, but also at healthcare as a key vertical.

The technical details

The DSEK-12 features technologies from Intel, Kontron and Microsoft in a pre-loaded and validated system. The kit includes an OPS-compliant Kontron media player KOPS800 based on 3rd generation Intel Core processors as well as a 180-day evaluation copy of Windows Embedded POSReady 7.

"We believe the global market for digital signage will reach 10 million media players and a corresponding 22 million digital signs by 2015. The new Intel allows companies to spend more time crafting engaging content and less time dealing with software integration," said Jose Avalos, director of Visual Retail at Intel.

Here's the fascinating part

The DSEK-12 comes pre-loaded with Intel Audience Impression Metrics, a software solution capable of "anonymously monitoring" viewer metrics, such as gender, age bracket and length of attention all in real-time.

Intel's demonstration video shows that these signs will now be able to register when a teenage girl is passing by and change the image on display to an advertisement for a ladies' shoe shop. Equally, the sign will know if it's an older man and switch the ad to golf clubs. The signs are pre-loaded with a massive amount of data on human size and shape, so by using the real-time processing power these systems come with, they can change their information appropriately.
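
Reduced to a sketch (an invented rule table, not Intel's actual AIM software), the ad-switching logic is a lookup keyed on the anonymous viewer metrics the kit reports:

```python
# Illustrative only: map (gender, age bracket) metrics to an ad slot,
# falling back to neutral content when no rule matches.
AD_RULES = {
    ("female", "13-17"): "shoe-shop",
    ("male", "55+"): "golf-clubs",
}
DEFAULT_AD = "station-map"

def choose_ad(gender, age_bracket):
    """Pick the ad to display for the viewer currently in front of the sign."""
    return AD_RULES.get((gender, age_bracket), DEFAULT_AD)
```

Real deployments would weight dwell time and audience mix as well, but the structure, metrics in and content choice out, is the same.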

The programming potential appears to be almost limitless, and the imagination and creativity of the software application developers (and Intel engineers) now using this technology should shape its onward development.


What makes financial software worth the money?

bridgwatera | No Comments

Like most verticals, software application development for the financial market is characterised by a number of key criteria and data behaviour types that make it "unique" (although we use the term carefully) in terms of form and function.

Proof point of the week comes from Progress Software, which has partnered with Mootwin, a "context-aware" mobile application specialist, to enhance the stock-alert features in Mootwin's banking and finance-oriented solution.

Mootwin will use the Progress Apama Complex Event Processing (CEP) platform to monitor and analyse the securities markets as well as enhance the real-time capabilities of the stock-alert feature within its trading application.

Complex events & triggering events

Mootwin will use the Apama CEP platform to correlate multiple data and "triggering events" in financial markets.

So -- multiple data types within complex events, analysed by massively powerful processing engines directed at looking for further triggering events. This is, in a sense, financial software development in one sentence.

According to a press statement from Progress, "Subscribers of the Mootwin mobile application and service will be able to select a variety of analytical options and graphical displays that best suits them. Based on individual preferences, users can also set up personalised notifications that alert them to trading opportunities or events, almost instantly."

What makes financial software worth the money then?

This software has to be able to act and react upon data and "events" (complex ones) in real time and provide context-aware analysis of the data in hand.

Where Progress is (arguably) most interesting here is in its data management technology, which "captures and replays event streams", allowing financial analysts to do what they do, whatever that doing actually is.

NOTE: Complex event processing (CEP) is the use of technology to predict high-level events likely to result from specific sets of low-level factors. CEP identifies and analyzes cause-and-effect relationships among events in real time, allowing personnel to proactively take effective actions in response to specific scenarios. CEP is an evolving paradigm originally conceived in the 1990s by Dr. David Luckham at Stanford University.
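
A minimal sketch of the CEP idea (not Apama itself): low-level ticks stream in, and a high-level alert fires when a pattern completes within a window. All event shapes here are invented for illustration.

```python
# Detect a "momentum" pattern: a price rise followed, within `window`
# ticks, by a volume spike on the same symbol.
def detect(events, window=3):
    alerts = []
    rises = {}  # symbol -> index of the most recent price-rise event
    for i, e in enumerate(events):
        if e["type"] == "price_rise":
            rises[e["symbol"]] = i
        elif e["type"] == "volume_spike":
            j = rises.get(e["symbol"])
            if j is not None and i - j <= window:
                alerts.append((e["symbol"], j, i))
    return alerts

ticks = [
    {"type": "price_rise", "symbol": "TIBX"},
    {"type": "volume_spike", "symbol": "PRGS"},
    {"type": "volume_spike", "symbol": "TIBX"},
]
alerts = detect(ticks)
```

The cause-and-effect correlation across a stream, rather than a query over stored rows, is what distinguishes CEP from ordinary database work.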

About this Archive

This page is an archive of entries from September 2012 listed from newest to oldest.

August 2012 is the previous archive.

October 2012 is the next archive.

Find recent content on the main index or look in the archives to find all content.